Job Description - Operations Engineer
Overview
The Operations & Engineering team is seeking an Operations Engineer to support
operational workflows, data integrity, and client deliverables across client-facing
platforms. Reporting to the Director of Operations & Analytics, the Operations Engineer
will focus on monitoring, managing, and optimizing ingestion and post-ingestion
pipelines to ensure SLA compliance and seamless client experiences.
The ideal candidate has a strong technical foundation in operations, business analysis,
SQL, or data engineering, excellent problem-solving skills, and the ability to collaborate
across engineering, product, and client success teams. This role is critical to maintaining
operational excellence, mitigating risks, and supporting continuous improvement in a
fast-paced environment.
Roles & Responsibilities
Monitor operational workflows and system processes (data ingestion,
post-ingestion, and UI delivery) to ensure smooth and uninterrupted service.
Monitor the health of our applications and troubleshoot by following provided
runbooks or documentation.
Investigate and understand root causes for issues in business operations; develop
and implement corrective actions.
Log issues, delays, and anomalies accurately and escalate critical problems per
internal protocols.
Respond to client queries promptly and professionally, ensuring resolution within
defined timelines.
Work with clients and internal teams to provide timely solutions while ensuring
compliance with company standards and regulations.
Deliver outputs accurately and within agreed SLAs, maintaining high reliability
standards.
Collaborate with Engineering, Product, and Client Success teams to resolve
incidents, align on expectations, and drive process improvements.
Maintain detailed records of incidents, client interactions, and operational metrics
for reporting, audits, and analysis.
Identify recurring issues and contribute to the design of preventive measures and
long-term solutions.
Use SQL, shell scripting, and Python programming as part of technical
operations support.
Support automation and workflow improvements to increase operational
efficiency.
Required Qualifications
4+ years of experience in operations, data engineering, or production support
roles.
Proven hands-on experience querying and analyzing data, with advanced Excel and
MySQL skills.
Familiarity with Linux basics and networking fundamentals.
Experience using cloud technologies such as AWS.
Strong analytical and troubleshooting skills.
Familiarity with workflow orchestration tools (e.g., Airflow), cloud infrastructure,
and container orchestration (e.g., Kubernetes).
Strong communication, documentation, and cross-team collaboration skills.
Ability to work in a fast-paced, client-focused environment.
Proactive, detail-oriented, and committed to continuous improvement.
Preferred Qualifications
Experience working in SaaS, energy, or data-driven industries.
Experience with Datadog or similar health monitoring systems.
Experience with PySpark, data management, data integration, ETL, data quality and
controls, data analysis, reporting, and testing.
Familiarity with UI-based client platforms and Excel API integrations.
Exposure to monitoring and alerting tools, including custom scripting.
Exposure to Git.
Experience using Agile methodologies.
Experience working with Jira or other issue-tracking systems.
Location and work mode:
Work mode - Hybrid.
Candidate location - Pune.