Job Description:
Technical Skillset (Mandatory)
Big Data Tools: PySpark
Visualization Tools: SAP Business Objects, Tableau
Databases: Relational SQL and NoSQL databases, including Snowflake, Postgres, and Cassandra
AWS Cloud Services: EC2, S3, EMR, RDS, Lambda, DMS etc.
Object Oriented Languages: Python
Technical Skillset (Optional)
Data Pipeline & Integration: Informatica Cloud (IICS), PowerCenter
Stream Processing Systems: Spark Streaming, etc.
Role and Responsibilities
a. Data-oriented personality, great communication skills, and an excellent eye for detail.
b. Acquire data from primary or secondary data sources and maintain databases/data systems.
c. Interpret data, analyze results using statistical techniques and provide ongoing reports.
d. Own and maintain all dashboards, ongoing reports, and ad hoc requests from the organization.
e. Work with management to prioritize business and information needs.
f. Partner with members of the Business Intelligence team to provide required access/structure to data.
g. Build relationships with business intelligence partners to understand their data needs and execute on documented user requirements.
Project Specific requirement (if any):
a. Data-oriented personality, good communication skills, and an excellent eye for detail.
Relevant Experience Required
a. 8+ years of experience in data, consulting, analytics, or a related function involving quantitative data analysis to solve problems
b. 5+ years of experience in Business Objects, Tableau, and other visualization tools
c. Technical expertise in data models, database design and development, data mining, and segmentation techniques
d. Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
e. Proficiency in query languages such as SQL, Hive, and Pig.
f. Good at writing queries, report writing, and presenting findings.
Any Graduate