Description

Responsibilities
·         Software Development: Design, build, and maintain robust and scalable software applications.
·         Data Pipeline Development: Design, build, and maintain robust and scalable data pipelines on AWS, ensuring the efficient and timely transfer of data from various sources to our data warehouse.
·         Data Integration: Implement and manage third-party data integration tools like Fivetran to streamline data collection and transformation processes.
·         Data Quality Assurance: Implement data quality checks and validation processes to ensure the accuracy and reliability of data throughout the pipeline.
·         Performance Optimization: Continuously monitor and optimize data pipelines for improved speed, efficiency, and cost-effectiveness.
·         Security and Compliance: Implement data security best practices and ensure compliance with data protection regulations, such as GDPR, as applicable.
·         Documentation: Create and maintain comprehensive documentation for data pipelines, workflows, and procedures to facilitate knowledge sharing and collaboration within the team.
·         Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand their data requirements and provide support in delivering relevant datasets.
·         Technology Evaluation: Stay updated on emerging data engineering technologies and best practices, making recommendations for improvements and optimizations.
Qualifications
·         Proven experience as a Software Engineer, with a minimum of 9 years spanning software engineering, data pipelines, data integration, and database management.
·         Strong expertise in AWS services, including S3, Glue, Redshift, and other relevant data-related services.
·         Hands-on experience working with data integration tools, preferably Fivetran.
·         Hands-on experience working with orchestration services, preferably Airflow.
·         Proficiency with database and data warehouse platforms, such as Amazon Aurora and Snowflake.
·         Solid programming skills in languages like Python, Java, or Scala.
·         Knowledge of data modeling, ELT processes, and data warehousing concepts.
·         Familiarity with data quality and data governance principles.
·         Strong problem-solving skills and an ability to work independently and as part of a team.
·         Excellent communication skills to collaborate with cross-functional teams and articulate technical concepts to non-technical stakeholders.
·         AWS or other relevant certifications are a plus.
·         Experience working with distributed teams; a self-starter who is proactive in communication.

Education

Any Graduate