Description

Role: Senior Solutions Architect
Experience Level: 12 Years and above
Work location: Mumbai, Bangalore & Trivandrum
What You’ll Do

You will work as a Solutions Architect within the healthcare domain, designing and delivering big data pipelines for structured and unstructured data running across multiple geographies, helping healthcare organizations achieve their business goals through data warehousing, data lakes, data marts, data ingestion tools & technologies, cloud services & DevOps. You will work with architects from other specialties such as cloud engineering, software engineering, and ML engineering to create products, platforms, solutions, and web applications that cater to the latest trends in the healthcare industry, such as digital diagnosis, software as a medical device, and AI marketplaces, among others.

Role & Responsibilities

  • Work with a team of architects, engineers, project managers, and customer stakeholders to solve big data problems by developing utilities for migration, storage, and processing in the cloud
  • Design and build cloud migration strategies for cloud and on-premises data architectures, applications, pipelines, etc.
  • Diagnose and troubleshoot complex distributed data systems problems and develop solutions with a significant impact at massive scale
  • Build ingestion tools and processing jobs that handle several terabytes to petabytes of data per day
  • Design and develop next-gen storage and compute solutions for several large customers in healthcare & life sciences space
  • Contribute to proposals and RFPs, providing effort estimates, solution designs, etc.
  • Communicate with a wide set of teams, including Infrastructure, Network, Engineering, DevOps, SiteOps teams, Security and cloud customers
  • Build advanced tooling for automation, testing, monitoring, administration, and data operations across multiple cloud clusters
  • Maintain a strong understanding of data modeling and governance

Must-Have

12+ years of hands-on experience in data architectures, cloud, data structures, distributed systems, Hadoop and Spark, and SQL & NoSQL databases

  • Strong software development skills in at least one of: Python, PySpark, Java, C/C++, Scala, or SQL
  • Experience in building and deploying cloud-based solutions at scale using tools such as Databricks, Snowflake, etc.
  • Experience in developing large scale enterprise big data solutions (migration, storage, processing) to accommodate structured, semi-structured, unstructured data.
  • Experience with medical imaging formats such as DICOM, WSI, etc., and health data formats such as HL7, FHIR, etc.
  • Experience in building and supporting large-scale systems in a production environment
  • Design and development of ETL and ELT pipelines
  • Requirement gathering and understanding of the problem statement
  • End-to-end ownership of the entire delivery of the project
  • Design and documentation of the solution
  • Knowledge of RDBMS & NoSQL databases
  • Experience in any of the following Cloud Platforms: AWS, GCP, or Azure
  • Experience in any of the following Big Data Distributions: Apache Hadoop, CDH, HDP, EMR, Google Dataproc, or HDInsight
  • Experience in one or more Distributed Processing Frameworks: MapReduce, Apache Spark, Apache Storm, or Apache Flink
  • Database/Warehouse: Experience in Hive, HBase, and at least one cloud-native service
  • Experience in any of the following Orchestration Frameworks: Airflow, Oozie, Apache NiFi, or Google Dataflow
  • Experience in any of the following Message/Event Solutions: Kafka, Kinesis, or Cloud Pub/Sub
  • Exposure to at least one reporting tool (Power BI, Tableau, or Looker)

Good To Have

  • Experience in leading multiple large projects, as well as a deep understanding of Agile development
  • Hands-on experience with container orchestration (Kubernetes or Swarm)
  • Experience working with Terraform
  • Exposure to DevOps or DevSecOps methodologies
  • Exposure to Machine learning & AI initiatives

Education

Any Graduate