Job Description:
What you need
Minimum of 5 years of consulting or client service delivery experience on Amazon Web Services (AWS)
Minimum of 10 years of experience in big data, database, and data warehouse architecture and delivery
Extensive hands-on experience implementing data migration and data processing using AWS services: VPC/Security Groups, EC2, S3, Auto Scaling, CloudFormation, Lake Formation, DMS, Kinesis, Kafka, NiFi, CDC processing, Redshift, Snowflake, RDS, Aurora, Neptune, DynamoDB, CloudTrail, CloudWatch, Docker, Lambda, Spark, Glue, SageMaker, AI/ML, API Gateway, etc.
Familiarity with the technology stack available in the industry for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
Bachelor's Degree or an equivalent amount of work experience (12 years), or an Associate's Degree with 6 years of work experience.
Certifications
AWS Certified Solutions Architect - Associate
Nice to Have:
AWS Certified Solutions Architect - Professional
AWS Certified Big Data - Specialty
AWS Certified Machine Learning - Specialty
Experience with DevOps on the AWS platform
Experience developing and deploying ETL solutions on AWS
Strong in Java, C#, Spark, PySpark, and Unix shell/Perl scripting
Familiarity with the technology stack available in the industry for metadata management: data governance, data quality, MDM, lineage, data catalogs, etc.
Multi-cloud experience a plus: AWS, Azure, Google Cloud
Professional Skill Requirements
Proven ability to build, manage, and foster a team-oriented environment
Proven ability to work creatively and analytically in a problem-solving environment
Desire to work in an information systems environment
Excellent communication (written and oral) and interpersonal skills
Excellent leadership and management skills