Description

What You'll Do

As an experienced data engineer with expertise in technologies such as PySpark, Python, Scala, Java, SQL, Databricks, RESTful Web Services, and Microservices Architecture, your responsibilities will include:
Collaborate with cross-functional teams to develop and implement data solutions.
Assist in managing architecture, platforms, and data solutions.
Create, maintain, and optimize data pipelines using PySpark, Python, Scala, Java, SQL, Databricks, and AWS.
Work within the Databricks platform to develop and execute data processes.
Contribute to the creation and maintenance of ETL processes using Python and PySpark for seamless data movement and transformation.
Identify and resolve data issues within the data pipeline and enterprise data warehouse.

What You Know

2-3 years of experience in data engineering.
Proficient in designing and implementing data pipelines to ingest, transform, and move data from various sources (databases, APIs, sensors) to analytics platforms.
Strong working knowledge of Python and good understanding of PySpark/Scala/Java.
Experience in developing and maintaining data infrastructure using cloud platforms like AWS, Azure, or Google Cloud Platform.
Sound knowledge of DevOps and CI/CD pipeline automation.
Proficient in troubleshooting data pipeline issues and identifying potential data anomalies.
Good understanding of data engineering concepts and best practices.
Strong knowledge of database management systems (SQL/NoSQL).
Familiar with Agile methodology. 
People-oriented with effective interpersonal skills.
Eagerness to learn new concepts and apply acquired skills.
Adaptability to tackle new challenges.
Enthusiastic and passionate about the work.
Good communication skills.
The ability to articulate and present learned concepts effectively.
A positive attitude and the flexibility to work on any project.
Strong problem-solving skills and meticulous attention to detail.

Good To Have

Experience with batch and streaming processing techniques to ensure efficient and timely data delivery.
Knowledge of microservices.
Automation of data workflows and tasks to minimize manual intervention and ensure data consistency.
Design and implementation of data quality checks and monitoring systems to ensure data integrity and accuracy.
Knowledge of data visualization tools (e.g., Tableau, Power BI).
Understanding of ETL processes and data modeling.
Knowledge of production log analysis tools such as Splunk.

Education

A 4-year bachelor's degree in Computer Science, Information Systems, or a related field, or equivalent or higher qualifications.

Benefits

In addition to competitive salaries and benefits packages, Nisum India offers its employees some unique and fun extras:
Continuous Learning - Year-round training sessions are offered as part of skill enhancement certifications, sponsored by the company on an as-needed basis. We support our team to excel in their field.
Parental Medical Insurance - Nisum believes our team is the heart of our business, and we want to take care of what matters most to them. We offer opt-in parental medical insurance in addition to our medical benefits.
Activities - From the Nisum Premier League's cricket tournaments to hosted hackathons, Nisum employees can participate in a variety of team-building activities, such as skits and dance performances, in addition to festival celebrations.
Free Meals - Free snacks and dinner are provided on a daily basis, in addition to subsidized lunch.