Introduction
This position is a contract hire opportunity with a hybrid work arrangement, expected to last one year. The selected resource must reside in the Jacksonville, FL area, as they will be required to be onsite from time to time for meetings.

Summary
This position is for a Data Engineer working across cloud data services, mainly Azure & AWS, along with Python and SSIS, to support data pipeline development & migration efforts as well as Power BI dashboard/report development. This includes both development and application maintenance efforts. The individual must be a self-starter, capable of writing complex SQL queries, debugging Python and SQL routines and packages, and modifying code as needed.

Job Description:
*Design, develop, document & maintain complex data pipelines using AWS data services such as EMR, Step Functions, SNS, and Event Scheduler to source data from multiple sources including RDBMS, Parquet files (on AWS S3 buckets), Excel, and text files.
*Design, develop, document & maintain complex data pipelines using Azure data services such as Logic Apps, ADF, ADLS Gen2, Azure Databricks, Azure Synapse workspace, Event Hubs, and Service Bus to source data from multiple sources including RDBMS, Parquet files (ADLS Gen2, S3 buckets, GCP, on-prem), Excel, and text files.
*Deploy & automate Azure and AWS cloud services through CI/CD pipelines using IaC (Infrastructure as Code).
*Design & develop web solutions using ReactJS, Redux & NodeJS.
*Evaluate, debug, and modify existing complex Python/PySpark code & SSIS packages in accordance with business requirements.
*Perform data analysis and test/debug software solutions.
*Analyze existing ETL jobs, develop ELT pipelines & data warehouse applications, or formulate logic for moderately complex new systems and devise moderately complex algorithms.
*Design and extract data from different data sources such as the Guidewire application or other on-prem databases.
*Apply effective programming security practices.
*Practice current development methods/techniques, including CI/CD pipelines using AWS CodeCommit & AWS CodePipeline and Azure DevOps, and establish development standards (coding, documentation, and testing standards) to ensure the quality and maintainability of automated solutions.
*Design & develop data pipelines that source data from AWS/Azure cloud or on-prem systems into the Snowflake platform.
*Follow established change management code migration processes.

SKILLS AND ABILITIES
*Experience in data ingestion, data transformation, and data loading
*Experience in Azure data platforms such as Azure Databricks, Azure Synapse, Azure Data Factory, ADLS Gen2, etc. (REQUIRED)
*Experience in AWS data platforms such as AWS EMR, Step Functions, Event Scheduler, SNS, S3, CodeCommit, and CodePipeline (REQUIRED)
*Experience in the Snowflake platform
*Working knowledge of Python, PySpark, or equivalent
*Demonstrated broad problem-solving skills, including experience identifying and using code libraries and open-source forums
*Good written and verbal communication skills
*4+ years of data ingestion and transformation experience using AWS data services such as EMR, Step Functions, Event Scheduler, and SNS, plus Python & SQL Server SSIS (any combination)
*2+ years of experience in the Snowflake platform
*8+ years of ETL experience, including designing star/snowflake schemas
*2+ years of experience with Azure data platforms (ADF, Synapse workspace, Databricks, Event Hubs, Service Bus, Logic Apps, ADLS Gen2) and Python
*2+ years of experience with web development technologies such as React.JS, Node.JS, HTML, and JavaScript
*Proficiency with HTML, CSS, JavaScript/jQuery, local storage, and cross-browser compatibility
*Preferred: any experience working with the Guidewire application
*Any experience in Power BI report development will be a plus
Education: Any graduate