Awareness of information security measures such as acceptable use of information assets, malware protection, and password security
Understand and report security risks and how they impact the confidentiality, integrity, and availability of information assets
Understand how data is stored, processed, or transmitted from a data privacy and protection standpoint
Preferred Qualifications
7+ years of experience in a Data Engineer role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
Working knowledge of ETL technologies such as Talend, Apache NiFi, or AWS Glue
Experience with relational SQL and NoSQL databases
Experience with big data tools: Hadoop, Spark, Kafka, etc. (Nice to have)
Advanced Alteryx Designer experience (mandatory; this requirement will not be relaxed)
Tableau Dashboarding
AWS (familiarity with Lambda, EC2, AMI)
Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (Nice to have)
Experience with cloud services: EMR, RDS, Redshift, or Snowflake
Experience with stream-processing systems: Storm, Spark Streaming, etc. (Nice to have)
Experience with object-oriented/functional scripting languages: Python, Java, Scala, etc.
Responsibilities
Work with Project Managers, Senior Architects, and other team members from Bounteous and client teams to evaluate data systems and project requirements
In cooperation with platform developers, develop scalable, fault-tolerant Extract, Transform, Load (ETL) and integration systems for various data platforms that meet security, logging, fault-tolerance, and alerting requirements
Work on data migration projects
Effectively communicate data requirements of various data platforms to team members
Evaluate and document existing data ecosystems and platform capabilities
Configure CI/CD pipelines
Implement proposed architecture and assist in infrastructure setup