3Point

Data Engineer (Scalable Pipelines & Analytics)

Job Location: Pune

Job Description

We are looking for a Data Engineer with strong experience in the manufacturing domain to build and manage scalable data pipelines, industrial data integrations, and analytics-driven applications.

Key Responsibilities

  • Build and maintain data pipelines using Apache NiFi / MiNiFi and PySpark.
  • Work with industrial data sources: OPC-UA, MQTT, Modbus, SCADA, MES.
  • Develop data models and pipelines for OEE, predictive maintenance, and energy analytics.
  • Write and optimize SQL on PostgreSQL / TimescaleDB / Apache Doris.
  • Implement transformations in dbt and real-time streaming with Apache Kafka.
  • Deploy services with Docker and Kubernetes in Linux environments.
  • Ensure data lineage, mapping, and governance.

Skills Required

  • Data Engineering: Apache NiFi, dbt, Apache Kafka, OLTP systems
  • Databases: PostgreSQL, TimescaleDB, Apache Doris
  • Programming: Python
  • Industrial Protocols: OPC-UA, MQTT, Modbus
  • DevOps: Linux, Git, Docker/K8s

Share CVs to pooja.c@3point.in

Accepted file types: .pdf, .doc, .docx