Job Description

Seeking a Sr. Data Engineer for my client in San Francisco, CA.

Duration: Long-term contract

-----------------------------------------------------------------

The ideal candidate must have the skills below.

• Scale up data processing flows to meet rapid data growth.

• Evolve data models and schemas based on business and engineering needs.

• Implement systems that track data quality and consistency.

• Develop tools that support self-service data pipeline management (ETL).

• Tune SQL and MapReduce jobs to improve data processing performance.

Experience required:

• 5+ years of relevant professional experience.

• Experience with the Hadoop (or similar) ecosystem (MapReduce, YARN, HDFS, Hive, Spark, Presto, Pig, HBase, Parquet).

• Proficiency in at least one SQL database (MySQL, PostgreSQL, SQL Server, Oracle).

• Good understanding of SQL engines and the ability to conduct advanced performance tuning.

• Strong skills in a scripting language (Python, Ruby, Bash).

• 1+ years of experience with workflow management tools (Airflow, Oozie, Azkaban, UC4).

Key Skills

MySQL, PostgreSQL, SQL Server, Oracle, MapReduce, YARN, HDFS, Hive, Spark, Presto, Pig, HBase, Parquet

  • Experience: 5+ years
  • Openings: 2
  • Category: Sr. Data Engineer
  • Tenure: Contract - Corp-to-Corp