Job Title: Azure Technical Consultant
Location: Coimbatore
Experience: 6-9 Years
Work Duration: Contract (3.5 Months)
Skills Required: DevOps, Azure Data Factory, Spark, Databricks.
Job Summary:
As an Azure Technical Consultant, you will be responsible for providing technical expertise and guidance in designing, implementing, and optimizing solutions on the Microsoft Azure platform. Your primary role will involve working closely with clients to understand their business requirements and translating them into effective cloud-based solutions. You will collaborate with cross-functional teams, including architects, developers, and business stakeholders, to design and deploy Azure infrastructure, services, and applications. You will be responsible for ensuring that the solutions are scalable, secure, and aligned with best practices. Additionally, you will play a key role in cloud migration initiatives, implementing automation, monitoring, and performance tuning, while ensuring compliance with security and regulatory standards. You will also contribute to the continuous improvement of cloud architectures and practices within the organization.
Skills Required:
- Databricks, Spark, Azure Data Factory, DevOps.
- Candidates must have strong hands-on experience and expertise in Azure and Databricks.
Key Responsibilities:
- Provide a mutually agreed-upon Architecture for the Data Lake and Data Migration.
- Review and harden the Azure Landing Zone.
- Provide automated deployment of services (ADF, Synapse, Databricks, Log Analytics, Storage/Data Lake) in accordance with the Architecture Diagram to be finalized.
- Create full refresh and incremental, resilient data pipelines as agreed in the Architecture, using ADF, Azure Databricks, Event Hub, Kafka, and Synapse as required.
- Create additional full refresh and incremental ADF/Synapse/Databricks/Spark pipelines as needed (see the PySpark sketch after this list).
- Support and provide Data Governance using Purview/erwin Intelligence, ADF/Synapse pipelines, and Power BI dashboards.
- Create a Data Lake environment with an initial schema and load the data for the following data domains, with regular ingestion of the data identified:
  - Clinical Data (MVP)
  - Claims
  - Provider
  - Members
  - Eligibility
- Develop 3 patterns and the necessary infrastructure to support the business in consuming data from the curated zone. These 3 patterns will include:
  - Power BI
  - Data Science access from the exploratory Subscription
  - APIM access for interoperability
- Schedule and set up alerts and metrics using the Log Analytics Workspace (see the Log Analytics query sketch after this list).
- Provide data quality and code quality pipelines for the observed data sets.
- Design and implement the security paradigm for the entire estate: role-based access control (RBAC), encryption, and network setup for Private and/or Service Endpoints, to GEHA standards.
- Provide full design documentation of the new platform and future roadmaps to be shared with clients/auditors if needed for validation or awareness. This will be in the form of as-built documentation.
- Provide templates for further development and deployment.
- Conduct a session to review the Control Environment and summarize additional recommended controls (if required).
- Conduct knowledge transfer to the GEHA team.
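For illustration of the incremental pipeline work referenced in the list above, the following is a minimal PySpark/Delta Lake sketch of a full-refresh versus incremental (merge/upsert) load pattern on Databricks. The storage path, table name, watermark column, and business key are hypothetical placeholders, not part of the Architecture to be finalized for this engagement.

```python
# Minimal sketch of full-refresh vs. incremental (merge/upsert) loads on Databricks.
# All names below (paths, tables, columns, keys) are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

RAW_PATH = "abfss://raw@examplestorage.dfs.core.windows.net/claims/"  # assumed landing path
TARGET_TABLE = "curated.claims"                                       # assumed curated-zone table

def full_refresh():
    """Reload the entire curated table from the raw zone."""
    df = spark.read.format("parquet").load(RAW_PATH)
    df.write.format("delta").mode("overwrite").saveAsTable(TARGET_TABLE)

def incremental_load(last_watermark: str):
    """Upsert only records newer than the last processed watermark."""
    updates = (
        spark.read.format("parquet").load(RAW_PATH)
        .filter(F.col("modified_date") > F.lit(last_watermark))  # assumed watermark column
    )
    target = DeltaTable.forName(spark, TARGET_TABLE)
    (
        target.alias("t")
        .merge(updates.alias("s"), "t.claim_id = s.claim_id")  # assumed business key
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
```

In practice, an ADF or Synapse pipeline would typically orchestrate a notebook like this and persist the watermark between runs.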
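As a companion to the alerting and metrics bullet above, this is a minimal sketch of querying a Log Analytics Workspace from Python, assuming the azure-monitor-query and azure-identity packages. The workspace ID and the KQL query (including the ADFActivityRun table schema) are placeholders; the alert rules themselves would normally be provisioned through ARM/Bicep templates or the Azure portal as part of the automated deployment.

```python
# Minimal sketch: pulling ADF pipeline failure counts from a Log Analytics Workspace.
# The workspace ID and KQL query below are placeholders, not values from this engagement.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

WORKSPACE_ID = "<log-analytics-workspace-id>"  # placeholder

client = LogsQueryClient(DefaultAzureCredential())

# Example KQL: count failed ADF activity runs over the last day (table schema assumed).
query = """
ADFActivityRun
| where Status == 'Failed'
| summarize FailedRuns = count() by PipelineName
"""

response = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(days=1))

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(dict(zip(table.columns, row)))
else:
    # Partial results: inspect the error surfaced by the service.
    print("Query did not fully succeed:", response.partial_error)
```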