Title: Azure Data Engineer
Location: 100% Remote (local candidates to Cincinnati a plus)
Duration: 12+ Months
Interviews: MS Teams
Note To Vendors
- 1-year contract initially, with option to renew. DIEM is a 5-6 year project.
- Top Skills: Azure Data Factory/Data Lake, Databricks, Python, Spark, SQL, Kafka, Event Hubs
- What is not in the job description that is important for vendors to know: Highly visible position, accountable for delivery on a new data integration platform.
- What is the timeframe you want to onboard the new person: ASAP
- What is the interview process for this job: No video screen. Panel interview with engineering team.
Job Description:
Accountable for developing and delivering technological responses to targeted business outcomes. Analyze, design, and develop enterprise data and information architecture deliverables, treating data as an asset for the enterprise. Understand and follow reusable standards, design patterns, guidelines, and configurations to deliver valuable data and information across the enterprise, including direct collaboration with The Sister Company where needed. Demonstrate the company's core values of respect, honesty, integrity, diversity, inclusion, and safety. Requires expertise in the Azure Data Platform stack: Azure Data Lake, Data Factory, and Databricks.
Minimum Position Qualifications
- 4+ years' experience in data development principles, including end-to-end design patterns.
- 4+ years' proven track record of delivering large-scale, high-quality operational or analytical data systems.
- 4+ years of applicable experience building complex data solutions successfully delivered to customers.
- Experience in at least two of the following technical disciplines: data warehousing, big data management, analytics development, data science, application programming interfaces (APIs), data integration, cloud, servers and storage, and database management.
- Excellent oral/written communication skills
Key Responsibilities:
- Utilize enterprise standards for data domains and data solutions, focusing on simplified integration and streamlined operational and analytical uses.
- Ensure there is clarity between ongoing projects, escalating when necessary, including direct collaboration with The Sister Company.
- Leverage innovative new technologies and approaches to renovate, extend, and transform the existing core data assets, including SQL-based, NoSQL-based, and Cloud-based data platforms.
- Define high-level migration plans to address the gaps between the current and future state.
- Contribute to the development of cost/benefit analysis for leadership to shape sound architectural decisions.
- Analyze technology environments to detect critical deficiencies and recommend solutions for improvement.
- Promote the reuse of data assets, including the management of the data catalog for reference.
- Draft architectural diagrams, interface specifications, and other design documents.
Skills: Azure Data Engineer, Azure Data Factory/Data Lake, Databricks, Python, Spark, SQL, Kafka, Event Hubs