Description

About You – experience, education, skills, and accomplishments  

  • Bachelor's degree in computer science, mechanical engineering, or a related field, or at least 4 years of equivalent relevant experience
  • 8+ years of experience with database design, data pipelines, and development
  • Working knowledge of Snowflake and relational databases such as PostgreSQL or MySQL
  • Experience working with Spark, PySpark, Hive, Airflow, and AWS (or another cloud platform)
  • Proficient in writing SQL queries, stored procedures, and views
  • Passion for learning and a desire to grow!

 

It would be great if you also had...

  • Familiarity with Talend, Tableau, or Snowflake would be an added advantage.
  • Experience in building big data platforms.
  • Understanding of healthcare data.

 

What will you be doing in this role? 

 

As a member of the Data Engineering team, you'll step into a key role on an expanding group, building our data platforms, data pipelines, and data transformation capabilities.

  • Define and implement our cloud data platform strategy, making a meaningful impact on our customers while working in our high-energy, innovative, fast-paced Agile culture.
  • Drive rapid prototyping and development with Product and Technical teams to build and scale high-value medical data capabilities.
  • Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using the Apache suite (Airflow, Spark), SQL, Python, ETL, and AWS big data technologies.
  • Create and support batch and real-time data pipelines, with ongoing data monitoring and validation, built on AWS, Snowflake, and Apache technologies for medical data from many different sources.
  • Conduct functional and non-functional testing, including writing test scenarios and test scripts.
  • Evaluate existing applications to update and add new features to meet business requirements.

 

Product you will be developing:

Big Data Platforms

Education

Bachelor's degree in computer science, mechanical engineering, or a related field