Required
8+ years' experience working within a large-scale, enterprise-level environment.
8+ years' proficiency with Python for data processing, including libraries such as Pandas, NumPy, PySpark, pyodbc, pymssql, Requests, Boto3, simple-salesforce, and json.
4+ years' strong SQL skills (query writing, performance tuning, stored procedures, triggers, schema design) and knowledge of one or more RDBMS such as MSSQL/MySQL.
2+ years' strong AWS skills, including AWS Data Exchange, Athena, CloudFormation, Lambda, S3, the AWS Console, IAM, STS, EC2, and EMR.
2+ years' experience with data warehouse technologies such as Snowflake, Spark, Databricks, or Informatica.
2+ years' experience with ETL tools such as Apache Airflow, AWS Glue, Azure Data Factory, Talend, or Alteryx.
1+ year's experience with Hadoop and Hive.
Excellent verbal communication skills.
Knowledge of DevOps and Git for agile planning and code repository management.
Bachelor's degree.