Description

About You – experience, education, skills, and accomplishments

  • Bachelor’s Degree or equivalent
  • At least 7 years of relevant experience
  • At least 5 years in Software Development: Demonstrated experience in software development, with a focus on Big Data technologies.
  • At least 3 years in Distributed Data Processing: Proven experience building scalable distributed data processing solutions.
  • At least 3 years in Database Design: Expertise in database design and development, with a strong focus on data model design.
  • Strong Proficiency with Apache Spark and Airflow: Extensive hands-on experience with these technologies, leveraging them for data processing and orchestration.
  • Python Proficiency: Advanced proficiency in Python for data processing and building services.
  • Experience with Databricks and Snowflake: Practical experience with these platforms, including their use in cloud-based data pipelines.
  • Familiarity with Delta Lake or Apache Iceberg: Experience working with these table formats to decouple storage from processing engines.
  • Cloud-Based Solutions Expertise: Proven experience in designing and implementing cloud-based data pipelines, with specific expertise in AWS services such as S3, RDS, EMR, and AWS Glue.
  • CI/CD Best Practices: Strong understanding and application of CI/CD principles.


It would be great if you also had . . .

  • Knowledge of Additional Technologies: Familiarity with Cassandra, Hadoop, Apache Hive, Jupyter notebooks, and BI tools such as Tableau and Power BI.
  • Experience with PL/SQL and Oracle GoldenGate: Additional experience in these areas is advantageous.


What will you be doing in this role?

  • Provide Technical Leadership: Offer strategic guidance on technology choices, comparing different solutions to meet business requirements while considering cost control and performance optimization.
  • Communicate Effectively: Exhibit excellent communication skills, with the ability to clearly articulate complex technical concepts to both technical and non-technical stakeholders.
  • Design and Maintain Data Solutions: Develop and maintain the overall solution architecture for the Data Platform, demonstrating deep expertise in integration architecture and design across multiple platforms at an enterprise scale.
  • Enforce Best Practices: Implement and enforce best practices in Big Data management, from software selection to architecture design and implementation processes.
  • Drive Continuous Improvement: Contribute to the continuous enhancement of support and delivery functions by staying informed about technology trends and making recommendations for improving application services.
  • Lead Technical Investigations: Conduct technical investigations and proofs of concept, both individually and as part of a team, including hands-on coding to provide technical recommendations.
  • Knowledge Sharing: Actively share knowledge and best practices within the team, fostering a culture of continuous learning and improvement.

Education

Bachelor’s Degree