Description

Responsibilities:
Design, build, and operationalize large-scale enterprise data solutions in Hadoop, Postgres, and Snowflake
Demonstrate an outstanding understanding of AWS cloud services, especially in the data engineering and analytics space
Analyze, re-architect, and re-platform on-premises big data platforms
Craft and develop solution designs for data acquisition/ingestion of multifaceted data sets, internal/external data integrations, and data warehouses/marts
Collaborate with business partners, product owners, functional specialists, business analysts, IT architects, and developers to develop solution designs that adhere to architecture standards
Ensure solutions adhere to enterprise data governance and design standards
Act as a point of contact for delivery teams to resolve architectural, technical, and solution-related challenges efficiently
Design and develop ETL pipelines to ingest data into Hadoop from different data sources (files, mainframe, relational sources, NoSQL, etc.) using Informatica BDM
Advocate for the importance of data catalogs, data governance, and data quality practices
Exhibit outstanding problem-solving skills
Work in an Agile delivery framework to evolve data models and solution designs, delivering value incrementally
Be a self-starter with experience working in a fast-paced Agile development environment
Provide strong mentoring and coaching to junior team members, leading by example
Be outcome-focused, with strong decision-making and critical-thinking skills to challenge the status quo, improving delivery pace and performance and striving for efficiency

What you'll bring
Education: Bachelor's degree in computer engineering or computer science
Experience: Over 8 years of experience designing solutions for data lakes, data integrations, and data warehouses/marts
Technical Skills: Proficiency with data technologies and tools such as Hadoop, PostgreSQL, and Informatica
ETL Expertise: Experience with the various execution modes in BDM, including Spark, Hive, and Native; extensive knowledge of and experience in ETL using the Informatica product suite
Cloud Data Lake Design: Familiarity with cloud data lake design, preferably using AWS technologies such as S3, EMR, Redshift, and Data Catalog
Data Governance: Proven experience implementing data governance principles and practices
Reporting/Analytics Tools: Understanding of tools such as Qlik Sense, SAP BusinessObjects, SAS, and Dataiku
Agile Methodology: Familiarity with Agile software development practices
Communication Skills: Excellent verbal and written communication abilities
Insurance Knowledge: Experience with Guidewire ClaimCenter is an asset, along with the ability to understand the complex business processes that drive technical systems
