Description

Responsibilities:
As a Senior Data Engineer, you will:
• Design and develop big data applications using the latest open-source technologies.
• Work effectively in an offshore, managed-outcome delivery model.
• Develop logical and physical data models for big data platforms.
• Automate workflows using Apache Airflow.
• Create data pipelines using Apache Hive, Apache Spark, and Apache Kafka.
• Provide ongoing maintenance and enhancements to existing systems, and participate in rotational on-call support.
• Quickly learn our business domain and technology infrastructure, and actively share your knowledge with the team.
• Mentor junior engineers on the team.
• Lead daily standups and design reviews.
• Groom and prioritize the backlog using JIRA.
• Act as the point of contact for your assigned business domain.

Requirements:

GCP Experience:
• 2+ years of recent GCP experience.
• Experience building data pipelines in GCP.
• Experience with GCP Dataproc, GCS, and BigQuery.

• 5+ years of hands-on experience developing data warehouse solutions and data products.
• 5+ years of hands-on experience developing distributed data processing platforms with Hadoop, Hive, or Spark, plus Airflow or another workflow orchestration solution.
• 2+ years of hands-on experience modeling and designing schemas for data lakes or RDBMS platforms.
• Experience with programming languages: Python, Java, Scala, etc.
• Experience with scripting languages: Perl, Shell, etc.
• Practical experience working with, processing, and managing large datasets (multi-TB/PB scale).
• Exposure to test-driven development and automated testing frameworks.
• Background in Scrum/Agile development methodologies.
• Capable of delivering on multiple competing priorities with little supervision.
• Excellent verbal and written communication skills.
• Bachelor's degree in Computer Science or equivalent experience.

Education

Bachelor's degree