Description

About You – experience, education, skills, and accomplishments:

  • At least 6 years’ experience in software development
  • At least 6 years’ experience building massively scalable distributed data processing solutions
  • At least 6 years’ experience in database design & development
  • At least 4 years of experience with Apache Spark & Elasticsearch
  • Hands-on experience with the broader stack: Spark, Elasticsearch, Cassandra, Hadoop, Apache Hive, Snowflake, Jupyter notebooks & Databricks
  • At least 4 years of experience with PostgreSQL (9+) or Oracle (11g+), including AWS RDS
  • At least 4 years of experience with AWS technologies & tools such as AWS Glue & Lambda
  • Building data pipelines & ETL jobs using cloud-native technologies & design patterns

It would be great if you also had:

  • Experience in designing resilient systems & creating disaster recovery plans
  • Working in Master Data Management & designing CMSes, or evaluating 3rd-party CMS products
  • Working in Agile Scrum or Kanban teams & deploying solutions using Continuous Delivery best practices
  • Using automated database migration tools, with strong opinions on version control best practices for SQL scripts

What will you be doing in this role?

  • Provide technical thought leadership, comparing different technologies to meet business requirements and cost-control drivers.
  • Work with Business and IT groups to design and deliver a data lake platform.
  • Produce & maintain the overall solution design for the entire Data Lake Platform.
  • Execute the data strategy and help design and architect solutions on the platform.
  • Enforce technical best practices for Big Data management and solutions, from software selection to technical architectures and implementation processes. 
  • Document and publish best practices, guidelines, and training information. 
  • Ensure all functional solutions and components of the Data Lake platform service are designed and implemented to consistently meet SLAs.
  • Contribute to the continuous improvement of the support & delivery functions by maintaining awareness of technology developments and making appropriate recommendations to enhance application services.
  • Focus on data quality throughout the ETL & data pipelines, driving improvements to data management processes, data storage, and data security to meet the needs of the business and customers.

Education

Any Graduate