Description


Responsibilities

Focuses on technical leadership, defining patterns and operational guidelines for their vertical(s)
Independently scopes, designs, and delivers solutions for large, complex challenges
Provides oversight, coaching and guidance through code and design reviews
Designs for scale and reliability with the future in mind; conducts critical R&D where needed
Successfully plans and delivers complex, long-term projects spanning multiple teams or systems, including those with external dependencies
Identifies problems that need to be solved and advocates for their prioritization
Owns one or more large, mission-critical systems at Gemini, or multiple complex team-level projects, overseeing all aspects from design and implementation through operation
Collaborates with coworkers across the org to document and design how systems work and interact
Leads and coordinates large initiatives across domains, even outside their core expertise
Designs, architects, and implements best-in-class data warehousing and reporting solutions
Builds real-time data and reporting solutions
Develops new systems and tools that enable teams to consume and understand data more intuitively

Minimum Qualifications

10+ years of experience in data engineering with data warehouse technologies
10+ years of experience in custom ETL design, implementation, and maintenance
10+ years of experience with schema design and dimensional data modeling
Experience building real-time data solutions and processes
Advanced skills with Python and SQL are a must
Experience and expertise with Databricks, Spark, Hadoop, etc.
Experience with one or more MPP databases (Redshift, BigQuery, Snowflake, etc.)
Experience with one or more ETL tools (Informatica, Pentaho, SSIS, Alooma, etc.)
Strong computer science fundamentals including data structures and algorithms
Strong software engineering skills in any server-side language, preferably Python
Experience working collaboratively across teams and departments
Strong technical and business communication skills

Preferred Qualifications

Experience with Kafka, HDFS, Hive, cloud computing, machine learning, LLMs, NLP, and web development is a plus
NoSQL experience is a plus
Deep knowledge of Apache Airflow
Expert-level experience implementing complex, enterprise-wide data transformation and processing solutions
Experience with continuous integration and deployment (CI/CD)
Knowledge of and experience with financial markets, banking, or exchanges
Web development skills with HTML, CSS, or JavaScript

It Pays to Work Here

The Compensation & Benefits Package for This Role Includes

Competitive starting salary
A discretionary annual bonus
Long-term incentive in the form of a new hire equity grant
Comprehensive health plans
401(k) with company matching
Paid parental leave
Flexible time off

Education

Any Graduate