Description

We are seeking a skilled Data Developer to join our team. The ideal candidate will have expertise in designing, implementing, and maintaining ETL/ELT processes to support data integration, migration, and transformation. This role requires proficiency with both cloud-based and on-premises solutions, as well as a strong understanding of database management systems, data warehousing concepts, Python scripting, dbt (data build tool), and spatial data sources.

Responsibilities:
1. Design, develop, and maintain ETL/ELT processes to extract data from various sources, transform it according to business requirements, and load it into target databases/data warehouses.
2. Collaborate with stakeholders to understand data integration needs and translate them into ETL requirements.
3. Implement data integration solutions using both cloud-based platforms (such as AWS, Azure, or Google Cloud) and on-premises tools (such as Apache NiFi, Talend, Informatica, or custom scripts).
4. Optimize ETL/ELT workflows for performance, scalability, and reliability, accounting for factors such as data volume, load frequency, and complexity.
5. Ensure data quality and integrity throughout the ETL/ELT process by implementing data validation and error-handling mechanisms.
6. Monitor ETL/ELT jobs, troubleshoot and debug issues, and tune performance as needed.
7. Maintain comprehensive, up-to-date documentation of ETL/ELT processes, data mappings, and system configurations.
8. Stay current with industry trends and best practices in data engineering.

MUST HAVE SKILLS (Most Important):
1. Bachelor’s degree in Computer Science, Information Technology, or a related field.
2. Proven experience as an ETL Developer or in a similar role, with a strong understanding of data integration concepts and methodologies.
3. Proficiency in SQL and experience working with relational databases (e.g., MySQL, PostgreSQL, SQL Server, Oracle).
4. Hands-on experience with ETL/ELT tools and frameworks, both cloud-based (e.g., AWS Glue, Azure Data Factory, Google Cloud Dataflow) and on-premises (e.g., Apache NiFi, Talend, Informatica).
5. Proficiency in the Python programming language for scripting, automation, and data manipulation tasks.
6. Familiarity with spatial data sources and geospatial analysis tools (e.g., PostGIS, GeoPandas, ArcGIS) for handling location-based data.
7. Strong understanding of GIS (Geographic Information Systems) concepts and spatial data formats (e.g., shapefiles, GeoJSON).
8. Familiarity with data warehousing concepts, dimensional modeling, and data integration patterns.
9. Knowledge of cloud computing platforms and services, including storage, compute, and networking.
10. Strong analytical and problem-solving skills, with the ability to troubleshoot complex data pipeline issues.
11. Excellent communication and collaboration skills, with the ability to work effectively in a team environment and interact with stakeholders at various levels of the organization.

DESIRED SKILLS:
1. Experience with big data technologies (e.g., Hadoop, Spark, ClickHouse), NoSQL databases (e.g., MongoDB, Cassandra), and traditional RDBMSs (e.g., PostgreSQL).
2. Certification in cloud computing (e.g., AWS Certified Developer, Azure Developer Associate).
3. Experience with version control systems (e.g., Git) and CI/CD pipelines.
4. Knowledge of data governance principles and regulatory requirements (e.g., GDPR, HIPAA).

EDUCATION/CERTIFICATIONS:
Bachelor’s degree in Computer Science, Information Technology, or a related field is required. Cloud computing certifications (e.g., AWS Certified Developer, Azure Developer Associate) are desired.
