Thursday 21 July 2016

Data Architect (6+ years) @ Bangalore

Looking for a Data Architect with a minimum of 6 years of expertise in Big Data technologies and global data warehousing for a multinational digital-marketing client, ranked #240 on the Inc. 500 list of fastest-growing companies in the U.S.

Position: Data Architect

We are looking for a Big Data Architect who will work on collecting, storing, processing, and analyzing huge sets of data. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.

Skill requirements:
  • Minimum 4 years of hands-on implementation experience in Big Data Technologies.
  • Minimum 3 years of hands-on experience in leading large-scale global data warehousing and analytics projects.
  • Track record of implementing AWS services in a variety of distributed computing and enterprise environments
  • Proficiency with the Hadoop ecosystem: MapReduce, HDFS, Spark
  • Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala
  • Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, VoltDB, and Redshift
  • Experience with various messaging systems, such as Kafka, RabbitMQ, and SQS
  • Good understanding of Lambda Architecture, along with its advantages and drawbacks
  • Experience with large-scale, distributed systems and ETL tools.
  • Proficient understanding of distributed computing principles
  • Proven experience in Data Modelling.
  • Hands-on experience with related/complementary open source software platforms and languages (e.g. Java, Scala, Linux, Apache, Perl/Python/PHP)
  • Experience in Agile development methodology
Key Responsibilities:
  • Design and build scalable infrastructure and platforms to collect, process, and analyze very large amounts of data (structured and unstructured), including streaming real-time data (an illustrative sketch follows this list).
  • Implement the full life cycle of a Hadoop solution, including requirements analysis, platform selection, technical architecture design, application design and development, testing, and deployment.
  • Exhibit strong technical team leadership, mentorship, and collaboration in running Big Data POCs/pilots.
  • Implement ETL processes
  • Monitor performance and advise on any necessary infrastructure changes
  • Perform analysis of vast data stores and uncover insights
  • Collaborate with engineering teams, product managers, and data scientists in implementing the best architecture to support their needs
  • Provide technical leadership, mentor and guide teams, and help resolve technical challenges.
  • Drive innovation initiatives across engineering by publishing white papers, presenting innovation proposals, conducting POCs, and introducing new techniques to improve product quality and coverage across the platform.
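For illustration only, below is a minimal sketch of the kind of real-time ingest pipeline referenced in the responsibilities above, assuming Spark Structured Streaming reading from Kafka and landing data on HDFS; the topic name, broker address, and paths are hypothetical placeholders, not details of this role.

    // Illustrative sketch only. Assumes the spark-sql-kafka connector is on the classpath;
    // topic, broker, and HDFS paths below are made-up placeholders.
    import org.apache.spark.sql.SparkSession

    object ClickstreamIngest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("clickstream-ingest")
          .getOrCreate()

        // Read the raw event stream from a hypothetical Kafka topic "clickstream".
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker-1:9092")
          .option("subscribe", "clickstream")
          .load()
          .selectExpr("CAST(value AS STRING) AS json", "timestamp")

        // Land the raw events on HDFS as Parquet; downstream batch jobs
        // (e.g. Hive or Spark SQL) can build curated warehouse tables from this zone.
        val query = events.writeStream
          .format("parquet")
          .option("path", "hdfs:///data/raw/clickstream")
          .option("checkpointLocation", "hdfs:///checkpoints/clickstream")
          .start()

        query.awaitTermination()
      }
    }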
Personal Attributes:
  • Candidate must possess good written and oral communication skills
  • Ability to multi-task under pressure and work independently with minimal supervision.
  • Dedicated, hardworking, and reliable in delivering on commitments.
  • Must exhibit strong problem-solving skills
  • Should be able to collaborate with teams on various decisions
Desired profile:
  • Years of experience: 6-11 years
  • B.E/B.Tech or M.Tech/MS
  • A background in CS or CS-IT is an added advantage
Job Location: Bangalore
Package: 30 Lakhs
Share your profile at kiran@madhees.com

http://madhees.com/
