The Role
If you have a passion for working with data using multiple emerging technologies on the cloud, this might be the right opportunity for you! The Cloud Data Engineer will work as part of a core team building solutions for the Asset Management data analytics platform. This will involve designing and developing solutions for a variety of data lake needs, using Snowflake as the data store for structured/semi-structured data and AWS S3 for unstructured data.
If you are a highly motivated, expert data engineer with a strong agile mindset who is looking for new challenges, we have an exciting opportunity for you to join our fast-paced and highly collaborative group. This role is involved in the full end-to-end process: planning, design, development, quality assurance, and implementation of solutions.
This position can be based in Boston, Merrimack, Raleigh, or Westlake.
The Expertise And Skills You Bring
- Expertise in SQL: identifying patterns and trends in data, recommending and defining data requirements, and implementing data quality checks to ensure accuracy and completeness.
- You enjoy learning new technologies, analyzing data, and identifying data patterns and trends, and you can independently resolve technical challenges.
- Experience in processing and exposing data using AWS technologies like EC2, S3, Lambda, API Gateway, Load Balancers, Auto Scaling, etc.
- Expertise in building data ingestion tools using technologies like Python to extract data from relational databases, web scraping, and external APIs.
- Experience in Snowflake or any other cloud-based MPP/columnar database.
- Experience in CI/CD release automation and deployment (Jenkins, Concourse, CloudFormation etc.)
- Experience with and a good understanding of databases (Oracle, Netezza) and ETL tools
- Experience in scheduling tools like Autosys, Control-M, Airflow etc.
- Excellent programming skills in SQL, PL/SQL, Python, shell etc.
- Exposure to Big Data technologies (Hadoop, Spark, Hive, Presto, etc.)
- Exposure to streaming services like Kafka is an advantage
- Good understanding of overall AWS security services like KMS, IAM, Security Groups, etc.
- At least 6 years of software development experience with at least 3 years working on Cloud/Big Data technologies
- BS in Computer Science or related degree, or equivalent experience
Special Instructions
Locations: BOS, Westlake, NC. Top skills required, in the following order: 1) data analysis and expert SQL; 2) Python; 3) Snowflake; 4) AWS.