Job Description
  • Be an integral part of large-scale client business development and delivery engagements by understanding the business requirements.
  • Work hands-on with Dataflow/Apache Beam and real-time data streaming.
  • Engineer ingestion and processing pipelines on GCP using Python, Java, BigQuery, and Composer.
  • Automate repeatable tasks into a framework that can be reused in other parts of the project.
  • Handle data quality, governance, and reconciliation during the development phases.
  • Communicate with internal and external customers, and continue to develop communication and client-facing skills.
  • Understand and contribute to all agile ceremonies to ensure efficient delivery.
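
The "reusable framework" responsibility above could, for instance, take the shape of a small registry of named pipeline steps. A minimal sketch in plain Python; every name here (`STEPS`, `drop_nulls`, `dedupe`, `run_pipeline`) is hypothetical and for illustration only, not part of any actual client codebase:

```python
from typing import Any, Callable, Dict, List

# Hypothetical registry of reusable pipeline steps: each step maps a
# batch of records to a new batch, so steps compose across projects.
STEPS: Dict[str, Callable[[List[dict]], List[dict]]] = {}

def step(name: str):
    """Register a reusable transformation under a name."""
    def decorator(fn):
        STEPS[name] = fn
        return fn
    return decorator

@step("drop_nulls")
def drop_nulls(records):
    # Discard records with any null field (a simple data-quality gate).
    return [r for r in records if all(v is not None for v in r.values())]

@step("dedupe")
def dedupe(records):
    # Keep the first record seen per id (a simple reconciliation step).
    seen, out = set(), []
    for r in records:
        if r["id"] not in seen:
            seen.add(r["id"])
            out.append(r)
    return out

def run_pipeline(records: List[dict], step_names: List[str]) -> List[dict]:
    """Apply registered steps in order; reusable across pipelines."""
    for name in step_names:
        records = STEPS[name](records)
    return records
```

For example, `run_pipeline(rows, ["drop_nulls", "dedupe"])` runs both quality gates in sequence; adding a new reusable step is one decorated function, with no change to the runner.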

Qualification & Experience:

  • A bachelor's degree in Computer Science or a related field.
  • Minimum 5 years of experience in software development.
  • Minimum 3 years of technology experience in data engineering projects.
  • Minimum 3 years of experience with GCP.
  • Minimum 3 years of experience in Python programming.
  • Minimum 3 years of experience in SQL/PL-SQL scripting.
  • Minimum 3 years of experience in data warehousing / ETL.
  • Ability to build streaming/batch solutions.
  • Exposure to project-management and collaboration tools such as JIRA, Confluence, and Git.
  • Ability to define, create, test, and execute operational procedures.
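
The streaming/batch bullet above rests on concepts such as fixed-window aggregation. A toy sketch in plain Python with no Beam dependency; the function name and event shape are illustrative assumptions, and real engines (Dataflow/Beam) layer watermarks and late-data handling on top of this same grouping idea:

```python
from collections import defaultdict
from typing import Dict, Iterable, Tuple

def fixed_window_counts(
    events: Iterable[Tuple[int, str]], window_secs: int = 60
) -> Dict[Tuple[int, str], int]:
    """Toy fixed-window aggregation: count events per key per window.

    `events` are (timestamp_secs, key) pairs. Each event is assigned to
    the window starting at the largest multiple of `window_secs` not
    exceeding its timestamp.
    """
    counts: Dict[Tuple[int, str], int] = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)
```

So events at t=5 and t=59 land in the [0, 60) window, while an event at t=61 starts a new [60, 120) window.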

Must have skills:

  • Strong understanding of real-time streaming concepts.
  • Strong problem-solving and analytical skills.
  • Good communication skills.
  • Understanding of message queues such as Kafka, RabbitMQ, and Pub/Sub.
  • Understanding of fast data-caching systems such as Redis/Memorystore.
  • Roughly 3+ years of GCP experience.
  • Hands-on experience with Dataflow/Apache Beam, including custom templates.
  • Understanding of Composer.
  • Good experience with BigQuery and Pub/Sub.
  • Good hands-on experience with Python.
  • Hands-on experience with modular Java development involving design patterns such as Factory and Reflection.
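
The Factory bullet above refers to the pattern sketched below. It is shown in Python for consistency with the rest of the posting (the role itself asks for the Java equivalent), and the class names (`Sink`, `BigQuerySink`, `PubSubSink`, `SinkFactory`) are made up for illustration:

```python
# Illustration of the Factory pattern: a registry maps sink names to
# classes, so new sinks can be added without touching the code that
# creates them. All classes here are hypothetical examples.

class Sink:
    def write(self, record: dict) -> str:
        raise NotImplementedError

class BigQuerySink(Sink):
    def write(self, record):
        return f"bigquery:{record['id']}"

class PubSubSink(Sink):
    def write(self, record):
        return f"pubsub:{record['id']}"

class SinkFactory:
    _registry = {"bigquery": BigQuerySink, "pubsub": PubSubSink}

    @classmethod
    def create(cls, kind: str) -> Sink:
        try:
            return cls._registry[kind]()
        except KeyError:
            raise ValueError(f"unknown sink: {kind}")
```

Calling code depends only on `SinkFactory.create("bigquery")` and the `Sink` interface, never on the concrete classes; in Java the same shape is typically built with an interface, a `Map<String, Supplier<Sink>>`, or reflection over class names.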

Good to have skills:

  • GCP Professional Data Engineer certification is an added advantage.
  • Understanding of Terraform scripts.
  • Understanding of DevOps pipelines.
  • Identity and Access Management, and authentication protocols.
  • Google Drive APIs, OneDrive APIs.

Location

  • Hyderabad (Client location)

Role: Data Platform Engineer

Industry Type: IT Services & Consulting

Department: Engineering - Software & QA

Employment Type: Full Time, Permanent

Role Category: Software Development

Education

UG: Any Graduate

PG: Any Postgraduate

Key Skills

  • Computer science
  • Access management
  • GCP
  • Project management
  • Reconciliation
  • Agile
  • PL/SQL
  • Data quality
  • Apache
  • Python

Salary

Not Disclosed

Job Overview

  • Job Posted: 1 year ago
  • Job Type: Full Time
  • Job Role: Developer
  • Education: Graduated
  • Experience: 5+ Years
  • Location: Karnataka, India