Key Considerations:
Data integration and data warehouse experience; Ascend or any one ETL tool; GCP, Python, DBT, GCP Kubernetes, Composer, Google BigQuery, Airflow DAGs
- Experience in Google Cloud Platform (GBQ, GCS, Kubernetes Pod operator and image creation, Composer, DAG creation); a sample Pod operator task is sketched after this list
- Expert in Python programming
- Experience with the Ascend tool
- Create dataflows, data services, and read and write connections for the various sources
- Configure failure notifications (Ascend webhook notifications, scheduling in Ascend)
- Use BigQuery MERGE statements to perform upserts (see the MERGE sketch after this list)
- Build custom read and write connectors using the Python framework (a generic connector sketch also follows this list)
- Good understanding of error logging
- Experience in Airflow (DAG creation, configuring Airflow variables, scheduling); see the DAG sketch after this list
- Experience in DBT to create data lineage in GCP (optional)
- Worked in a DevSecOps (CI/CD) environment
- Experienced with containerized execution and code containerization
- Interact with platform leads and third-party application leads to understand and analyze the various interfaces and data integration needs, and create technical user stories
- Contribute to technical design and estimation for the user stories as applicable, and review with the EAC team and Product Owner
- Conduct and participate in code review sessions for the team's deliverables
- Guide team members and participate in development and unit testing
- Work with business users on UAT and assist the QA team with defect fixes
- Help the team troubleshoot critical issues in development, staging, or production
- Retail & Manufacturing domain expertise
- Qualifications: MTech, MS, BE, BTech, or MCA
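
For the GCP item above (Kubernetes Pod operator and image creation in Composer), the following is a minimal sketch of triggering a custom container image from a Composer DAG. The project, repository, image name, and namespace are placeholders, and the exact import path depends on the installed cncf.kubernetes provider version.

```python
# Minimal sketch: run a custom container image from a Composer-managed DAG.
# Image URI and namespace are placeholders; the import path may differ by
# cncf.kubernetes provider version (newer versions use operators.pod).
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
    KubernetesPodOperator,
)

with DAG(
    dag_id="pod_operator_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,   # triggered manually
    catchup=False,
) as dag:
    run_container = KubernetesPodOperator(
        task_id="run_custom_image",
        name="run-custom-image",
        namespace="composer-user-workloads",  # placeholder namespace
        image="us-docker.pkg.dev/my-project/my-repo/my-image:latest",  # placeholder image
        cmds=["python", "main.py"],
        get_logs=True,
    )
```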
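For the BigQuery MERGE item, here is a minimal sketch of running an upsert from Python with the google-cloud-bigquery client. The dataset, table, and column names are placeholders.

```python
# Minimal sketch: upsert a staging table into a target table with MERGE.
# Table and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses the project from the active GCP credentials

merge_sql = """
MERGE `my_dataset.orders` AS target
USING `my_dataset.orders_stage` AS source
ON target.order_id = source.order_id
WHEN MATCHED THEN
  UPDATE SET target.status = source.status,
             target.updated_at = source.updated_at
WHEN NOT MATCHED THEN
  INSERT (order_id, status, updated_at)
  VALUES (source.order_id, source.status, source.updated_at)
"""

job = client.query(merge_sql)  # runs the MERGE as a query job
job.result()                   # block until the upsert completes
print(f"Rows affected: {job.num_dml_affected_rows}")
```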
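For the custom read/write connector item, the sketch below only illustrates the general shape of a paginated read connector in Python. The BaseReadConnector interface and the REST endpoint are hypothetical and are not the actual Ascend connector framework.

```python
# Generic, hypothetical sketch of a read connector that pulls records
# from a paginated REST source. Interface and endpoint are illustrative only.
from typing import Iterator

import requests


class BaseReadConnector:
    """Hypothetical base class: subclasses yield records from a source."""

    def read(self) -> Iterator[dict]:
        raise NotImplementedError


class RestApiReadConnector(BaseReadConnector):
    def __init__(self, base_url: str, page_size: int = 100):
        self.base_url = base_url
        self.page_size = page_size

    def read(self) -> Iterator[dict]:
        page = 1
        while True:
            resp = requests.get(
                self.base_url,
                params={"page": page, "per_page": self.page_size},
                timeout=30,
            )
            resp.raise_for_status()
            rows = resp.json()
            if not rows:
                break
            yield from rows  # emit each record downstream
            page += 1
```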
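For the Airflow item, here is a minimal DAG sketch covering DAG creation, reading an Airflow variable, and scheduling. The variable name env_name, the callable, and the schedule are illustrative.

```python
# Minimal sketch of an Airflow DAG: one PythonOperator task that reads an
# Airflow Variable, scheduled daily. Names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.models import Variable
from airflow.operators.python import PythonOperator


def print_environment():
    # Variables are configured in the Airflow UI (Admin -> Variables) or via CLI
    env = Variable.get("env_name", default_var="dev")
    print(f"Running in environment: {env}")


with DAG(
    dag_id="sample_integration_dag",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",  # run daily at 06:00
    catchup=False,
) as dag:
    show_env = PythonOperator(
        task_id="show_environment",
        python_callable=print_environment,
    )
```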