Responsibilities:
Collaborate with cross-functional teams to design, develop, and deliver data solutions in the wireless space using cloud data platforms.
Manage day-to-day development activities for new data solutions and troubleshoot existing implementations.
Lead technical discussions and resolve technical issues in partnership with product owners and technical leads.
Apply best practices in data integration for data quality and automation.
Collaborate with product vendors to identify and manage open product issues.
Solve complex data integration problems.
Develop and maintain code for data ingestion and curation.
Work with business analysts to understand business requirements and use cases.
Technical Requirements:
Minimum of 5 years of experience delivering data solutions on various data warehousing, big data, and cloud data platforms.
3+ years of experience working with distributed data technologies (e.g., Spark, Kafka) to build efficient, large-scale data pipelines.
Strong software engineering experience with proficiency in at least one of the following programming languages: Python, Scala, Java, or equivalent, ideally applied in a Spark environment.
Experience in building data ingestion pipelines, both real-time and batch, using best practices.
Familiarity with cloud computing platforms such as Amazon Web Services (AWS) and Google Cloud Platform (GCP).
Experience transforming and integrating data in Amazon Redshift or Snowflake.
Proficiency in writing SQL and stored procedures (e.g., PL/SQL or warehouse-native scripting) to ingest data into cloud data warehouses.
Experience supporting and working with cross-functional teams in a dynamic environment.
Familiarity with relational SQL and NoSQL databases, including Postgres and MongoDB.
Experience with scheduling tools, preferably Control-M, Airflow, or AWS Step Functions.
Strong interpersonal, analytical, problem-solving, influencing, prioritization, decision-making, and conflict resolution skills.
Excellent written and verbal communication skills.
Education: Bachelor's degree in any discipline.