Job Description

    •    Master's or Bachelor's degree in Computer Science, Information Technology, or another relevant field.
    •    Minimum of 5 years of mandatory experience in advanced data & analytics architecture, ETL, and data engineering solutions using the following skills, tools, and technologies:
    •    AWS Data & Analytics Services: Athena, Glue, DynamoDB, Redshift, Kinesis, Lambda
    •    Databricks Lakehouse Platform
    •    PySpark, Spark SQL, Spark Streaming
    •    Experience with any NoSQL database
    •    3+ years of coding experience with a modern programming or scripting language (Python).
    •    Expert-level skills in writing and optimizing SQL.
    •    Experience operating very large data warehouses, data lakes, or data platforms.
    •    Efficiency in handling data: tracking data lineage, ensuring data quality, and improving data discoverability.
    •    Data modelling: star schemas, derived fields, and measures.
    •    Experience working in Agile delivery.
    •    Experience with full software development life cycle, including coding standards, code reviews, source control management, build processes, and testing.
    •    Excellent business and communication skills for working with business owners to understand data requirements.

Education

Any Graduate