Senior AWS Data Engineer || Immediate Joiners

Big Data | Full Time | Bangalore, Pune, Gurgaon, Navi Mumbai | Mar 31, 2023

Senior AWS Data Engineer – Experience: 7+ years in Data Engineering

About Bizmetric

Bizmetric is a fast-paced organization marking exponential growth every quarter. As a leading data analytics solutions provider, Bizmetric is on a mission to simplify business decision-making through value-added insights and advanced data analytics. With our headquarters in Houston, US, and operations spanning India and Mexico, Bizmetric is committed to expanding both in advanced technologies and across geographies.

At Bizmetric, we have firmly established our presence in the US, UK, Australia, and the Middle East Markets, demonstrating our commitment to delivering innovative solutions on a global scale.
Bizmetric has strong expertise in Oracle applications and Advanced Data Analytics, and the team comprises experts with 12+ years of experience in the required skill sets. We have delivered solutions for complex business scenarios across Finance, SCM, Procurement, and HR functions. Bizmetric has vast experience serving a wide range of industries, including Manufacturing, Retail, Oil & Gas, Logistics, Education & Research Institutes, and Life Sciences.

Roles & Responsibilities

  • Design and build large-scale enterprise data solutions and applications using one or more AWS data and analytics services, such as EMR, Lambda, Glue, DynamoDB, Redshift, and Kinesis.
  • Analyze, re-architect, and re-platform on-premises data warehouses to data platforms on the AWS cloud using AWS or third-party services.
  • Must have experience designing and building production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
  • MUST have HANDS-ON experience with Hadoop tools/technologies such as Spark (strong Spark skills required), MapReduce, Hive, and HDFS.
  • HANDS-ON expertise and excellent understanding of big data toolsets such as Sqoop, Spark Streaming, Kafka, and NiFi.
  • Good working knowledge of NoSQL databases (MongoDB, HBase, Cassandra).
  • Implemented complex projects handling considerable data volumes (TB/PB scale) with high complexity in production environments.
  • Hortonworks (HDPCA/HDPCD/HDPCD-Spark) or Cloudera certification is an added advantage.

Required Skills

  • Bachelor’s degree or higher in a quantitative/technical field (e.g., Computer Science, Statistics, Engineering) and software development experience, with proven hands-on experience in Big Data technologies.
  • Experience developing and architecting environments in the Hadoop ecosystem using HDP and HDF.
  • Demonstrated strength in data modelling and ETL development.
  • Experience in designing and implementing an enterprise data lake.
  • Experience in Big Data Management and Big Data Governance.
  • Some experience with Kubernetes, Docker containers, etc.

Benefits to Work With Us

  1. Certifications
  2. Flexibility
  3. 5 working days
  4. Medical Coverage of Rs.5 Lakhs
  5. Company Outings/Company Events
  6. Fun Fridays
  7. Tech Thursdays
  8. Opportunity to work on multiple technologies

If you are interested, send your updated CV with CTC details to