Description

The Department: Analytics

Data and analytics are central to all of our business functions and drive many of our most important decisions at Gemini. The Analytics team is responsible for the data architecture, data engineering, business intelligence, machine learning, and data governance functions that shape the way data is stored and leveraged across Gemini. Analytics Engineers, Data Engineers, and Machine Learning Engineers make up the Analytics team and are responsible for building the primary decision support system that delivers continuous value by enabling individuals and functional groups to make data-driven decisions through reliable data processes, data products, and advanced analytics capabilities. The projects executed by the team cover a wide range of topics, including user acquisition and the customer journey, cryptocurrency performance, generative AI, product analytics, order book analytics, risk analytics, automated and scalable blockchain-based reconciliation systems, predictive modeling, and anomaly and fraud detection.

The Role: Analytics Engineer

As a member of our data team, you'll deliver high-quality work while solving challenges that affect all or part of the team's data architecture. You'll stay current with advances in the big data space and provide solutions for large-scale applications that align with the team's long-term goals. You'll resolve complex problems by identifying root causes, documenting solutions, and building with operational excellence (data auditing, validation, automation, maintainability) in mind. Communicating your insights to leaders across the organization is paramount to success.

Responsibilities:

Design, architect and implement best-in-class Data Warehousing and reporting solutions
Partner with product managers, data engineers and other functions to drive insights for the firm
Design, automate, build, and launch scalable, efficient and reliable data pipelines into production using Python
Design, build and enhance dimensional models for Data Warehouse and BI solutions
Perform analysis to derive actionable insights for key stakeholders
Perform root cause analysis and resolve production and data issues
Create test plans, test scripts and perform data validation
Tune SQL queries, reports and ETL pipelines
Build and maintain data dictionary and process documentation
Research new tools and technologies to improve existing processes
Minimum Qualifications:

4+ years of experience developing BI applications (Tableau, Looker, Power BI, etc.)
Advanced skills with Python and SQL are a must
4+ years of experience in custom ETL design, implementation and maintenance
4+ years of experience with schema design and dimensional data modeling
Experience making sense of data and developing actionable insights
Experience with one or more MPP databases (Redshift, BigQuery, Snowflake, etc.)
Experience with one or more ETL frameworks (custom, dbt, Databricks, etc.)
Experience with statistics, machine learning and generative AI
Strong software engineering skills in any server-side language, preferably Python
Experience working collaboratively across different teams and departments
Strong technical and business communication skills
Preferred Qualifications:

Proficient in developing LookML
Proficient in Python
Knowledge and experience of financial markets, banking or exchanges

Education

Any Graduate