Position Description:
*Our Data Factory Platform covers all business processes and technical components involved in ingesting a wide range of enterprise data into the Global Data Insight & Analytics Data Factory (Data Lake) and transforming that data into consumable data sets in support of analytics.
*This role will be part of our Ingest Pattern team, which owns end-to-end responsibility for designing, building, and supporting ingestion solutions that allow ingesting teams to quickly build pipelines that bring data into the Data Factory Platform.
*These ingestion solutions include a framework and code templates that offer no/low-code options for developers and operators to deploy data pipelines to our Google Cloud Platform (GCP) based Data Factory environment.
*The Data Factory ingestion patterns include a mix of in-house developed components leveraging native GCP services and 3rd-party data products such as Apache Airflow.
Skills Required:
*Required technical skills, with 4-5 years of experience in:
*Java/Kotlin with Spring Boot, REST APIs
*LDAP, Spring Security
*DB migration using Liquibase/Flyway
*Git and Tekton
*Google Cloud Platform (GCP)
*Development in Cloud Run/Cloud Functions
*Cloud Pub/Sub and Kafka
*BigQuery development/operations
*Cloud Logging/Monitoring principles/operations
*CI/CD on Google Cloud
*Terraform Development
*Knowledge of Dataflow with streaming
Skills Preferred:
*Knowledge of Python is a plus
*Knowledge of Angular is a plus
*Software quality tools: Checkmarx, FOSSA, 42C, and Cycode
*Bachelor's degree in Computer Science