Position Overview
The primary responsibility of the Data Engineer II – Enterprise Analytics is to assist in designing, developing, and deploying data-driven solutions in support of the Enterprise Analytics data strategy and goals. The Data Engineer II – Enterprise Analytics is responsible for creating reliable ETLs and scalable data pipelines to support the Analytics and BI environment (including modeling and machine learning, visualizations, reports, cubes, and applications). The Data Engineer II – Enterprise Analytics also participates in data modeling and the development of data models and data marts by interpreting the business logic required to turn complex ideas into sustainable, value-adding processes. All duties are to be performed in accordance with departmental and employer policies, practices, and procedures.
Essential Duties & Responsibilities
Collaborate with Enterprise Analytics BI Analysts, Data Scientists, and other business stakeholders to understand business problems and build/automate data structures ingested by analytics products (e.g., reports, dashboards, cubes).
Create BI solutions, including dashboards (Power BI) and reports (SSRS, Excel, and Cognos).
Develop logic for KPIs requested by business leadership.
Troubleshoot existing and create new ETLs, SSIS packages, SQL stored procedures and jobs.
Assist in star- and snowflake-schema modeling, creating dimensional models and ETL processes.
Write efficient SQL code for use in data pipelines and data processing.
Drive data quality processes like data profiling, data cleansing, etc.
Develop best practices and approaches to support continuous process automation for data ingestion and data pipelines.
Use innovative problem solving and critical thinking approaches to troubleshoot challenging data obstacles.
Test, optimize, troubleshoot, and fine-tune queries for maximum efficiency.
Maintain existing and create new Microsoft Power Platform solutions (including but not limited to Power Apps, Power Automate flows, and Power BI) according to business needs.
Perform QA and UAT processes to foster an agile development cycle.
Create documentation on table design, mapping out steps and the underlying logic within data marts to facilitate data adoption with minimal guidance from Enterprise Analytics management.
Identify areas of improvement not just in owned work, but also in other areas of the business.
Mentor and train junior Data Engineers on best practices and query and ETL optimization techniques.
Create and maintain daily, weekly, monthly, and quarterly reports and dashboards.
Consolidate fractured enterprise reporting into a standardized product for easy visualization and cross-departmental understanding.
Create reporting structures that accurately link cross-departmental data, allowing for on-demand delivery of ad hoc reports.
Safety is an essential function of this job.
Consistent and regular attendance is an essential function of this job.
Perform other related duties as assigned.
Company Standards of Conduct
All Team Members are expected to conduct and carry themselves in a professional manner at all times. Team Members are required to observe the Company’s standards, work requirements, and rules of conduct.
Minimum Qualifications
Proof of authorization/eligibility to work in the United States.
Bachelor’s degree in Computer Science, Information Systems, Engineering, Analytics, or related field is required.
Master’s degree in a related discipline is preferred.
Must be able to obtain and maintain a Nevada Gaming Control Board registration and any other certification or license, as required by law or policy.
2+ years of experience in building data pipelines and ETL processes is required.
2+ years of experience creating visualizations and reports (Power BI, Tableau, MicroStrategy, Google Analytics) is required.
2+ years of experience in writing advanced SQL, data mining, and working with traditional relational databases (tables, views, window functions, scalar and aggregate functions, primary/foreign keys, indexes, DML/DDL statements, joins, and unions) and/or distributed systems (Hadoop, BigQuery) is required.
1+ years of experience with programming/scripting languages such as Python, R, or BigQuery is required.
Experience with Microsoft Power Platform (Power Apps, Power Automate, Power BI, etc.), Microsoft Azure, Google Cloud Platform, or RPA tools is preferred.
Excellent understanding of data types, data structures and database systems and their specific use cases is required.
Strong understanding of data modeling principles, including dimensional modeling and data normalization, is required.
Extensive knowledge of Microsoft Excel (formulas, data wrangling, VBA macros, graphs, and pivot tables) is required.
Excellent critical-thinking and problem-solving skills, with the ability to develop creative solutions, are required.
Physical Requirements
Must be able to:
Lift or carry 10 pounds, unassisted, in the performance of specific tasks, as assigned.
Physically access all areas of the property and drive areas with or without a reasonable accommodation.
Maintain composure under pressure and consistently meet deadlines with internal and external customers and contacts.
Interact appropriately and effectively with guests, management, other team members, and outside contacts.
Walk, stand, stretch, bend, and kneel for prolonged periods of time.
Work in a fast-paced and busy environment.
Work indoors and be exposed to various environmental factors such as, but not limited to, CRT, noise, dust, and cigarette smoke.