• 6-8 years of IT experience focused on testing and automation using Java and Groovy.
• Experience with REST Assured concepts for API test automation (see the REST Assured sketch after this list).
• Experience with Databricks (cloud and on-premises), Structured Streaming, Delta Lake concepts, and Delta Live Tables required
• Experience with Spark programming in Scala and Java
• Delta Lake concepts such as time travel, schema evolution, and optimization (see the Delta Lake sketch after this list)
• Experience leading and architecting enterprise-wide test automation initiatives, specifically for ETL testing across multiple test phases, data migration, and transformation.
• Understanding of streaming data pipelines and how they differ from batch systems
• Understanding of ETL and ELT, and of ETL/ELT tools such as AWS Database Migration Service (DMS), etc.
• Familiarity and/or expertise with Great Expectations or other data quality/data validation frameworks a bonus
• Architecture experience in an AWS environment a bonus
• Familiarity with AWS Lambda, specifically how to push and pull data and how to use AWS tools to inspect data when processing massive datasets at scale, a bonus
• Experience with GitLab and CloudWatch, and the ability to write and maintain GitLab configurations supporting CI/CD pipelines
• Experience configuring and optimizing AWS Lambda functions, and experience with S3 (see the Lambda/S3 sketch after this list)
• Familiarity with Schema Registry and message formats such as Avro, ORC, etc.
• Bachelor's Degree
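To illustrate the REST Assured requirement above, here is a minimal JUnit 5 test sketch. The base URI, endpoint path, and expected response fields are hypothetical placeholders for whatever service is under test.

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import org.junit.jupiter.api.Test;

public class OrderApiTest {

    // Hypothetical service endpoint, used for illustration only.
    private static final String BASE_URI = "https://api.example.com";

    @Test
    public void getOrderReturnsExpectedStatusAndBody() {
        given()
            .baseUri(BASE_URI)
            .header("Accept", "application/json")
        .when()
            .get("/orders/{id}", 42)          // hypothetical resource path
        .then()
            .statusCode(200)
            .body("status", equalTo("SHIPPED")); // hypothetical response field
    }
}
```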
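The Delta Lake time travel and schema evolution bullets can be sketched with Spark's Java API as below. The S3 paths and version number are assumptions for illustration, not references to any real table.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class DeltaTimeTravelSketch {

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("delta-time-travel-sketch")
                .getOrCreate();

        // Hypothetical Delta table location.
        String tablePath = "s3://example-bucket/delta/orders";

        // Time travel: read the table as it existed at an earlier version.
        Dataset<Row> previousVersion = spark.read()
                .format("delta")
                .option("versionAsOf", 5)
                .load(tablePath);

        // Current version, for a simple row-count comparison between versions.
        Dataset<Row> currentVersion = spark.read()
                .format("delta")
                .load(tablePath);

        System.out.printf("rows then=%d, rows now=%d%n",
                previousVersion.count(), currentVersion.count());

        // Schema evolution: append an incoming batch (hypothetical staging
        // location) and let Delta merge any new columns into the table schema.
        Dataset<Row> incoming = spark.read()
                .parquet("s3://example-bucket/staging/orders-incremental");
        incoming.write()
                .format("delta")
                .mode(SaveMode.Append)
                .option("mergeSchema", "true")
                .save(tablePath);
    }
}
```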
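As a sketch of the Lambda/S3 bullets, the handler below pulls the object that triggered an S3 event using the AWS SDK for Java v2. The class name and deployment wiring are assumptions; only the SDK and Lambda event APIs shown are real.

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;

import software.amazon.awssdk.core.ResponseBytes;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

public class S3PullHandler implements RequestHandler<S3Event, String> {

    // Created once per container so the client is reused across invocations.
    private final S3Client s3 = S3Client.create();

    @Override
    public String handleRequest(S3Event event, Context context) {
        // Each record identifies one object that was created or modified.
        var record = event.getRecords().get(0);
        String bucket = record.getS3().getBucket().getName();
        String key = record.getS3().getObject().getKey();

        GetObjectRequest request = GetObjectRequest.builder()
                .bucket(bucket)
                .key(key)
                .build();

        // Pull the object into memory; fine for small files, stream for large ones.
        ResponseBytes<GetObjectResponse> bytes = s3.getObjectAsBytes(request);
        context.getLogger().log("Read " + bytes.asByteArray().length
                + " bytes from s3://" + bucket + "/" + key);

        return "ok";
    }
}
```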