Job duties and responsibilities include:
- Evaluating cost and time trade-offs to minimize the effort required to provide the necessary data for different ML models.
- Defining the enterprise-level data modernization and transformation roadmap and cloud (AWS and GCP) adoption strategy for customers.
- Architecting and designing a holistic, enterprise-level cloud data strategy and platform covering the end-to-end data lifecycle, aligned with business objectives.
- Designing and streamlining enterprise data architectures and pipelines, and architecting solutions that are intuitive, interoperable, extensible, and scalable.
- Developing best practices, thought leadership, and points of view related to cloud technologies.
- Providing robust technical solutions for on-premises data lake and data warehouse architectures, designs, and implementations; hybrid cloud data architectures; and on-premises-to-cloud migrations.
- Leading geographically distributed teams of data engineers and providing guidance on implementing data analytics solutions.
- Providing design solutions in the areas of data virtualization, data catalogs, metadata management, data ingestion, data visualization, data governance, security, MDM, and data quality management frameworks.
- Developing platform architecture for incremental and batch data processing to ingest data into the analytics platform.
- Creating and following a Lambda architecture for the data analytics platform to streamline the architecture process.
- Advising researchers and providing solutions on AWS and Google Cloud to build proofs of concept quickly, deploy their solutions, and validate use cases.
- Utilizing Agile Scrum methodology to help manage and organize a team of developers, with regular code review sessions.
- Exploring new ideas in AI/ML and data analytics to identify and improve energy efficiency in connected lighting and smart cities.
- Demonstrating data analytics insights to business teams and working with the product team on field implementation.
- Building recommendation models using machine learning approaches.
- This position does not supervise any other personnel.
Requirements:
Master’s degree in Computer Science, Computer or Software Engineering, or any related IT or Engineering field of study, plus at least three (3) years of experience in the job offered or in any related position(s).
In lieu of the above-stated primary education and experience requirements, employer will accept a Bachelor’s degree in Computer Science, Computer or Software Engineering, or any related IT or Engineering field of study, plus at least five (5) years of post-degree, progressively responsible experience in any related position(s).
Qualified applicants must also have demonstrable skill, knowledge, experience, and proficiency in the following:
- AWS (S3, Redshift, Glue, Lambda, Athena, Redshift Spectrum).
- Google Cloud (BigQuery, Dataproc, Dataflow, Cloud Composer, Cloud Functions, Pub/Sub).
- Big data (Spark, Python, Hive, Airflow, Snowflake).
No travel or telecommuting. Position is project-based at various unanticipated work sites within the U.S., and relocation may be required at the end of each long-term project.