Description

  • Experience developing, delivering, and/or supporting data engineering, advanced analytics, or business intelligence solutions
  • Experience developing ETL/ELT processes using Apache NiFi and Snowflake
  • Significant experience with big data processing and/or developing applications and data pipelines via Hadoop, YARN, Hive, Spark, Pig, Sqoop, MapReduce, HBase, Flume, etc.
  • Data engineering and analytics on Google Cloud Platform using BigQuery, Cloud Storage, Cloud SQL, Cloud Pub/Sub, Cloud Dataflow, Cloud Composer, etc., or an equivalent cloud platform
  • Familiarity with software architecture (data structures, data schemas, etc.)
  • Strong working knowledge of databases (Oracle, MSSQL, etc.), including both SQL and NoSQL
  • Strong mathematics background and strong analytical, problem-solving, and organizational skills
  • Strong communication skills (written, verbal, and presentation)
  • Experience working in a global, multi-functional environment
  • Minimum of 2 years’ experience in any of the following: at least one high-level, object-oriented language (e.g., Java, Python, Perl, or Scala); at least one web programming language (PHP, MySQL, Python, Perl, JavaScript, etc.); and one or more data extraction tools (Apache NiFi, Informatica, Talend, etc.)
  • Software development experience in languages such as Python, Java, or Scala
  • Ability to travel as needed

Education

ANY GRADUATE