Job Description

Remote role
Banking/Investment domain experience in a recent project is required.
Top Skills Required (NOTE: the resume and LinkedIn profile must reflect these required skills):

  • PySpark, ADF, ADLS, Delta tables, SparkSQL, etc.
  • In addition to the above, must have architecture experience designing solutions and frameworks from the ground up

RESUMES MUST HAVE CLEAR ELABORATION ON ALL OF THESE:
  • Azure Databricks
  • Azure Data Factory
  • Python & SQL
  • Architecture work with Azure
The client wants all of this clear on the resume. We watched how they review: they hit Ctrl+F to search for those skills. Then, in the interview, they really focus on these skills and drill down with questions.


Data Architect, Manager
The client is looking for a Data Architect to join our team of bright thinkers and
doers. You will team with top-notch technologists to enable real business outcomes for
our enterprise clients by translating their needs into transformative solutions that provide
valuable insight. Working with the latest data technologies in the industry, you will be
instrumental in helping the world's most established brands evolve for a more digital
future.

Your Impact:
• Play a key role in delivering data-driven interactive experiences to our clients
• Work closely with our clients in understanding their needs and translating
them to technology solutions
• Provide expertise as a technical resource to solve complex business issues
that translate into data integration and database systems designs
• Solve problems to resolve issues and remove barriers throughout the
lifecycle of client engagements
• Ensure all deliverables are high quality by setting development standards,
adhering to those standards and participating in code reviews
• Participate in integrated validation and analysis sessions of components and
subsystems on production servers
• Mentor, support and manage team members

Your Skills & Experience:
• Demonstrable experience with enterprise-level data platforms, including
implementation of end-to-end data pipelines
• Good communication skills and willingness to work as part of a team
• Hands-on experience with at least one of the leading public cloud data
platforms (Amazon Web Services, Azure or Google Cloud)
• Experience with column-oriented database technologies (e.g. BigQuery,
Redshift, Vertica), NoSQL database technologies (e.g. DynamoDB, Bigtable,
Cosmos DB) and traditional database systems (e.g. SQL Server, Oracle,
MySQL)
• Experience in architecting data pipelines and solutions for both streaming and
batch integrations using tools/frameworks like Glue ETL, Lambda, Google
Cloud DataFlow, Azure Data Factory, Spark, Spark Streaming, etc.
• Ability to handle multiple responsibilities simultaneously, both leading
and contributing to tasks hands-on
• Understanding of data modeling, warehouse design and fact/dimension
concepts

Set Yourself Apart With:
• Certifications in any of the major cloud services (AWS, GCP or Azure)
• Experience working with code repositories and continuous integration
• Understanding of development and project methodologies
• Willingness to travel

Please send the skill matrix along with the submission.

  • Data Architecture
  • Azure
  • Databricks
  • PySpark
  • ADF
  • ADLS
  • Delta tables
  • SparkSQL
  • Python
  • SQL
  • Architecture experience designing solutions and frameworks from the ground up.

Education

Any Graduate