Description

Top 3 Must-Haves

AWS

Kafka (or another data processing skill)

Python

Required: AWS, Java, Python, EMR, Flink; stream-processing application development experience using the Kafka Streams and/or Flink Java APIs; Redshift, Lake Formation; experience with Postgres, MySQL, DocumentDB, Glue Catalog, Terraform, and Concourse

Nice to have: ECS/EKS, and at least one AWS certification (Cloud Practitioner, Developer, or Data Specialty).

Roles & Responsibilities

Hands-on experience with Java, EMR, Flink, Kafka, and AWS services such as S3, Lambda, and Athena

Extensive experience with AWS data services such as Glue Catalog, Lake Formation, Redshift, DynamoDB, and Aurora

Required tools and languages: Python, Spark, PySpark, and Pandas

Infrastructure-as-Code technology: Terraform and/or CloudFormation

Experience with DevOps pipelines (CI/CD): Bitbucket, Concourse

Experience with RDBMS platforms; strong proficiency with MySQL and Postgres

Deep knowledge of IAM roles and policies

Experience using AWS monitoring services such as CloudWatch, CloudTrail, and CloudWatch Events

Experience with Kafka/messaging, preferably Confluent Kafka

Experience with event-driven architecture

Job Summary

Works with the team on key technical aspects and is responsible for product tech delivery

Engages in the design, build, test, and deployment of components

Where applicable, collaborates with Lead Developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead)

Understands requirements and use cases to outline technical scope, and leads delivery of the technical solution

Confirms required developers and skillsets specific to the product

Provides leadership, direction, peer review and accountability to developers on the product (key responsibility)

Works closely with the Product Owner to align on delivery goals and timing

Assists the Product Owner with prioritizing and managing the team backlog

Collaborates with Data and Solution Architects on key technical decisions, and on the architecture and design needed to deliver the requirements and functionality

Education

Any graduate