Job Description:
Experience building end-to-end architectures for Data Lakes, Data Warehouses, and Data Marts, including DWH, Data Integration, Cloud, Architecture, Design, and Data Modelling
Experience working with structured and unstructured data, with extensive knowledge of ETL and Data Warehousing concepts, strategies, and methodologies. Hands-on experience with PySpark is a must.
Experience in data ingestion, preparation, integration, and operationalization techniques to optimally address data requirements using relational data processing technologies such as MS SQL Server, Delta Lake, and Spark SQL
Experience with orchestration tools and GitHub, and the ability to own end-to-end development, including coding, testing, debugging, and deployment.
Must be team-oriented with strong communication, collaboration, prioritization, and adaptability skills, and able to perform both as an individual contributor and as a Lead.
Any Graduate