Description

Mid-level Data Engineer (BigQuery, Python, Airflow)

The Mid-level Data Engineer should possess a deep sense of curiosity, a passion for building smart data pipelines, data structures, and data products, and the ability to communicate about those structures and tools throughout the Paramount Streaming organization. The candidate will use skills in reverse engineering, analytics, and creative, experimental solutions to devise data and BI solutions. This engineer supports data pipeline development, including pipelines that feed machine learning algorithms from disparate data sources. The ideal candidate will work closely with BI, Research, Engineering, Marketing, Finance, and Product teams to implement data-driven plans that drive the business, and will convey knowledge of data structures and tools throughout the Paramount Digital Media organization. This candidate is expected to lead projects from inception to completion and to help mentor junior members of the team on best practices and approaches around data.

Your Day-to-Day:
● Work with large volumes of traffic data and user behaviors to build pipelines that enhance raw data.
● Break down and communicate highly complex data problems as simple, feasible solutions.
● Extract patterns from large datasets and transform data into an informational advantage.
● Find answers to business questions through hands-on exploration of datasets using Jupyter, SQL, dashboards, statistical analysis, and data visualizations.
● Partner with the internal product and business intelligence teams to determine the best approach to data ingestion, structure, and storage, then work with the team to ensure these are implemented correctly.
● Contribute ideas on how to make our data more effective, and work with other members of engineering, BI teams, and business units to implement changes.
● Develop technical solutions on an ongoing basis while creating and maintaining documentation and, at times, training impacted teams.
● Collaborate early with the team on internal initiatives to create strategies that improve company processes.
● Improve efficiency by staying current on the latest technologies and trends and introducing team members to them.
● Develop prototypes to prove out strategies for data pipelines and products.
● Mentor members of the team and department on best practices and approaches.
● Lead initiatives to improve the quality and effectiveness of our data, working with other members of engineering, BI teams, and business units to implement changes.

Qualifications:

What you bring to the team. You have:
● A Bachelor's degree and 5+ years of work experience in Data Engineering and Analytics fields, or consulting roles with a focus on digital analytics implementations.
● 3+ years of experience with large-scale data warehouse management systems such as BigQuery, with an advanced understanding of warehouse cost management and query optimization.
● Proficiency in Python.
● Experience with Apache Airflow or equivalent pipeline orchestration tools.
● Experience with data modeling of performant table structures.
● The ability to write SQL for common types of analysis and transformations.
● Strong problem-solving and creative-thinking skills.
● A track record of developing ongoing technical solutions while creating and maintaining documentation and, at times, training impacted teams.
● Experience developing solutions to business requirements via hands-on discovery and exploration of data.
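To give a flavor of the pipeline and ETL skills described above, here is a minimal sketch of a transform step in Python. This is a hypothetical illustration, not code from the role itself: the event records and their fields (`user_id`, `event`) are invented, and a real pipeline would read from and write to a warehouse such as BigQuery rather than in-memory lists.

```python
from collections import Counter

# Hypothetical raw traffic events, as a pipeline stage might receive them
# after extraction from a source table (field names assumed for illustration).
raw_events = [
    {"user_id": "u1", "event": "play"},
    {"user_id": "u1", "event": "pause"},
    {"user_id": "u2", "event": "play"},
]

def transform(events):
    """Aggregate raw events into per-user event counts (the 'T' in ETL)."""
    counts = Counter((e["user_id"], e["event"]) for e in events)
    # Reshape into load-ready rows, e.g. for a warehouse table.
    return [
        {"user_id": uid, "event": ev, "count": n}
        for (uid, ev), n in sorted(counts.items())
    ]

rows = transform(raw_events)
print(rows)
```

In an orchestrated setting, a step like `transform` would typically run as one task in an Airflow DAG, downstream of the extraction task and upstream of the load task.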
● Exceptional written and verbal communication skills, including the ability to explain technical concepts to non-technical audiences and to translate business requirements into data solutions.
● Strong experience with ETL and ELT.
● Experience building and deploying applications on the GCP and AWS cloud platforms.
● The ability to influence and apply data standards, policies, and procedures.
● The ability to build strong commitment within the team to support the appropriate team priorities.
● A habit of staying current with new and evolving technologies via formal training and self-directed education.

You might also have (nice-to-haves):
● Experience with Snowflake, Redshift, and other AWS technologies.
● Experience with Docker and container deployment.
● Experience with marketing tools such as Kochava, Braze, Branch, and Salesforce Marketing Cloud.
● Experience with exploratory data analysis using tools such as IPython/Jupyter notebooks, pandas, and matplotlib.
● Familiarity with Hadoop pipelines using Spark and Kafka.
● Familiarity with Git.
● Familiarity with Adobe Analytics (Omniture) or Google Analytics.
● Experience with digital marketing strategy, including site, video, social media, SEM, SEO, and display advertising.
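As a small taste of the hands-on exploratory data analysis mentioned above, here is a minimal sketch using only the Python standard library; in practice this kind of work would more likely happen in a Jupyter notebook with pandas and matplotlib. The session-length figures are invented purely for illustration.

```python
import statistics

# Hypothetical streaming session lengths in minutes (invented for illustration).
session_minutes = [12.0, 3.5, 45.0, 7.2, 30.1, 9.9, 22.4]

# Basic summary statistics, a typical first step when exploring a new dataset.
mean = statistics.mean(session_minutes)
median = statistics.median(session_minutes)
stdev = statistics.stdev(session_minutes)

print(f"mean={mean:.1f}  median={median:.1f}  stdev={stdev:.1f}")
```

Comparing the mean and median like this is a quick way to spot skew (here the mean exceeds the median, suggesting a long right tail of lengthy sessions) before moving on to visualizations or deeper statistical analysis.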

Education

ANY GRADUATE