Job Description: SQL/Synapse
- You will need good experience in building solutions using a variety of open-source tools and Microsoft Azure services, and a proven track record of delivering high-quality work to tight deadlines.
- Your main responsibilities will be:
- Helping to design and implement highly performant data ingestion pipelines from multiple sources using Apache Spark, Azure Databricks, Dataverse, and Azure Cosmos DB.
- Delivering and presenting proofs of concept of key technology components to project stakeholders.
- Working with other members of the project team to support delivery of additional project components (API interfaces, DevOps, ARM, Azure PowerShell).
- Evaluating the performance and applicability of multiple tools against customer requirements.
- Applying knowledge of T-SQL and designing data models.
- Designing workload management according to customer needs.
- Understanding the services available in Azure and how they integrate with one another.
- Applying knowledge of SQL data warehousing, MPP architecture, and serverless architecture.
- Troubleshooting queries and recommending optimal query plans.
- Working critical situations (CritSits) and providing rapid support to customers.
- Understanding networking concepts, including private networks, Azure DNS, load balancers, Azure Private Link, and Azure Active Directory.
- Working within an Agile delivery/support methodology to deliver proofs of concept and production implementations in iterative sprints.
Qualifications
You will have:
- Strong knowledge of Data Management principles
- Experience in building ETL / data warehouse transformation processes
- Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably Databricks).
- Microsoft Azure AZ-900 and DP-900 certifications.
- Hands-on experience designing and delivering solutions using the Azure Synapse Analytics platform, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
- Experience troubleshooting with Kusto Explorer, analyzing logs, and providing resolutions.
- Experience analyzing network traces and diagnostic data, and troubleshooting performance issues.
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Chef, Puppet, or Terraform.