Description

Job Title: Senior Data Engineer

Location: Bangalore

Experience: 5-8 Years

Skill Set: Snowflake, Python, SQL, Data Engineering

Job Summary:

We are seeking a highly skilled Senior Data Engineer with expertise in data engineering, data pipelines, and cloud-based technologies. The ideal candidate will be responsible for designing, developing, and maintaining scalable and efficient data infrastructure to support data-driven decision-making across the organization. This role requires a deep understanding of data architecture, strong problem-solving skills, and the ability to collaborate with cross-functional teams to deliver high-quality data solutions. Experience with tools like Apache Spark, Hadoop, SQL, and cloud platforms such as AWS, Azure, or Google Cloud is essential.

About Us:

This position is being recruited by Smartwork IT Services, a leading recruitment and product-based company. In addition to staffing solutions, Smartwork IT Services is involved in developing cutting-edge products like SWITS ATS (Applicant Tracking System) and SWITS HRMS (Human Resource Management Services). We focus on delivering exceptional value through innovative solutions and top-tier talent acquisition.

Key Responsibilities:

  • Act as an experienced data management specialist responsible for developing, overseeing, organizing, storing, and analyzing data and data systems
  • Participate in all aspects of the software development lifecycle for Snowflake solutions, including planning, requirements, development, testing, and quality assurance
  • Work in tandem with our engineering team to identify and implement optimal solutions
  • Ensure platform performance, uptime, and scale, maintaining high standards for code quality and thoughtful design
  • Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures
  • Manage deliverables in fast-paced environments

Required Skills:

  • At least 5-8 years of experience designing and developing data solutions in enterprise environments
  • 2+ years of experience on the Snowflake platform
  • Strong hands-on SQL and Python development skills
  • Experience designing and developing data warehouses in Snowflake
  • A minimum of three years' experience developing production-ready data ingestion and processing pipelines using Spark and Scala
  • Strong hands-on experience with orchestration tools, e.g., Airflow, Informatica, Automic
  • Good understanding of metadata and data lineage
  • Hands-on knowledge of SQL analytical functions
  • Strong knowledge of and hands-on experience in shell scripting and JavaScript
  • Demonstrated experience with software engineering practices, including CI/CD, automated testing, and performance engineering
  • Good understanding of and exposure to Git, Confluence, and Jira
  • Good problem-solving and troubleshooting skills
  • Team player, collaborative approach and excellent communication skills

Education

Any Graduate