Description:
Position requires the resource to work with the business team to understand data requirements and to work with source systems
Requires extracting data from source systems and loading it into Hadoop (Spark)
Requires investigating and resolving data and system issues, and providing support
Proven experience in designing and building a data lake on Hadoop using the latest technologies
Experience in building an integration layer to integrate data from multiple systems into Hadoop
Experience in architecture and solution design for analytics on Hadoop
Experience in building pipelines to ingest data into Hadoop
Working experience in dealing with data in multiple formats (CSV, TXT, XML, JSON)
10+ years of experience in building DW/ETL applications
5+ years of experience in working with the Hadoop ecosystem
Working experience with Hive, Spark, Python, and Hadoop libraries
Experience in building a data lake in Hadoop is a must
Experience in building ETL and data aggregations
Experience with Sqoop, Flume, and Spark is required
Excellent SQL knowledge is required
Experience in extracting data from RDBMS and SAS applications using APIs
Must-have skills:
10+ years of experience building DW/ETL applications / 5+ years of experience working with the Hadoop ecosystem / Knowledge of Hive, Spark, Python, and Hadoop libraries / Experience building a data lake in Hadoop / Experience with Sqoop, Flume, and Spark / Excellent SQL knowledge
Bachelor's degree in Computer Science