Key Responsibilities
● Design, build, and maintain efficient and reliable data pipelines to process large volumes of data.
● Implement ETL (Extract, Transform, Load) processes to integrate data from various sources.
● Manage and optimize databases and data warehouses to ensure high performance and availability.
● Implement data storage solutions that are secure, scalable, and cost-effective.
● Ensure data quality by implementing data validation, cleansing, and monitoring processes.
● Develop and enforce data governance policies to maintain data integrity and compliance.
● Work closely with data scientists, analysts, and other stakeholders to understand their data requirements and provide the support they need.
● Communicate technical concepts and solutions effectively to both technical and non-technical team members.
Tool and Technology Utilization
● Utilize data engineering tools and technologies such as SQL, Python, Hadoop, Spark, and cloud platforms (e.g., AWS, Azure, Google Cloud).
● Stay up to date with the latest advancements in data engineering and big data technologies.
Qualifications
● Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
● 3-5 years of experience in data engineering or a related field.
● Proven experience with data pipeline development, database management, and data integration.
● Experience working with large datasets.
● Proficiency in SQL and programming languages such as Python or Java.
● Strong experience with big data technologies (e.g., Hadoop, Spark).
● Familiarity with cloud computing platforms (e.g., AWS, Azure, Google Cloud).
● Experience with data warehousing solutions (e.g., Redshift, Snowflake).
● Strong understanding of data security and privacy principles.
● Ability to adapt to a fast-paced and dynamic work environment.
Preferred Qualifications
● Experience with real-time data processing technologies (e.g., Kafka, Flink).
● Knowledge of DevOps practices and tools (e.g., Docker, Kubernetes, CI/CD).
● Familiarity with data modeling and schema design.
● Experience with project management tools (e.g., JIRA, Trello).