Description

Position 1: Senior Data Engineer with GCP
Job Location: Sunnyvale, CA – Day 1 Onsite – California candidates only
Job Type: Contract, Long Term
Visa Acceptable: USC, Green Card, H1B, and EAD (passport number required for client submission for H1B and EAD candidates)
Experience: 12+ Years

Looking for a Senior Data Engineer with Spark, Scala, and GCP experience.
Must-Have Skills:
Spark – 8+ years of experience
Scala – 8+ years of experience
GCP – 5+ years of experience
Hive – 8+ years of experience
SQL – 8+ years of experience
ETL Process / Data Pipeline – 8+ years of experience


Responsibilities:
As a Senior Data Engineer, you will:
·      Design and develop big data applications using the latest open source technologies.
·      Work comfortably in an offshore, managed-outcome delivery model.
·      Develop logical and physical data models for big data platforms.
·      Automate workflows using Apache Airflow.
·      Create data pipelines using Apache Hive, Apache Spark, Scala, Apache Kafka.
·      Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support.
·      Learn our business domain and technology infrastructure quickly and share your knowledge freely and actively with others in the team.
·      Mentor junior engineers on the team.
·      Lead daily standups and design reviews.
·      Groom and prioritize the backlog using JIRA.
·      Act as the point of contact for your assigned business domain.
Requirements:
·      8+ years of hands-on experience with developing data warehouse solutions and data products.
·      4+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, Scala, and Airflow or a comparable workflow orchestration solution.
·      4+ years of experience with GCP, including GCS, Dataproc, and BigQuery.
·      2+ years of hands-on experience in modeling (Erwin) and designing schema for data lakes or for RDBMS platforms.
·      Experience with programming languages: Python, Java, Scala, etc.
·      Experience with scripting languages: Perl, Shell, etc.
·      Practical experience working with, processing, and managing large data sets (multi-TB/PB scale).
·      Exposure to test-driven development and automated testing frameworks.
·      Background in Scrum/Agile development methodologies.
The most successful candidates will also have experience in the following:
·      GitFlow
·      Atlassian products – Bitbucket, JIRA, Confluence, etc.
·      Continuous Integration tools such as Bamboo, Jenkins, or TFS

Education

ANY GRADUATE