Columbus, OH, USA
Cloud ETL Software Engineer III

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. 

As a Software Engineer III at JPMorgan Chase within the Corporate Technology Finance and Risk Warehouse SRE Team, you will solve complex and broad business problems with simple and straightforward solutions. Through code and cloud infrastructure, you will configure, maintain, monitor, and optimize applications and their associated infrastructure, independently decomposing and iteratively improving existing solutions.

Job responsibilities

- Guides and assists others in building appropriate level designs and gaining consensus from peers where appropriate
- Collaborates with other software engineers and teams to design and implement deployment approaches using automated continuous integration and continuous delivery (CI/CD) pipelines
- Collaborates with other software engineers and teams to design, develop, test, and implement availability, reliability, and scalability solutions in their applications
- Implements infrastructure, configuration, and network as code for the applications and platforms in your remit
- Collaborates with technical experts, key stakeholders, and team members to resolve complex problems
- Understands service level indicators and utilizes service level objectives to proactively resolve issues before they impact customers (see the SLI/SLO sketch after this list)
- Supports the adoption of site reliability engineering best practices within the team
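For illustration only, here is a minimal sketch of the kind of SLI/SLO check referenced above, written in Python against the Prometheus HTTP query API. The endpoint URL, the metric names, and the 99.9% availability objective are assumptions made for this example, not details from the posting.

```python
"""Minimal sketch of an availability SLI/SLO check via the Prometheus HTTP API.

The endpoint URL, metric names, and the 99.9% objective are illustrative
assumptions only.
"""
import requests

PROMETHEUS_URL = "http://prometheus.example.internal:9090"  # assumed endpoint
SLO_TARGET = 0.999                                          # assumed objective

# Availability SLI: ratio of non-5xx requests over the last hour.
# 'http_requests_total' and its 'code' label are hypothetical metric names.
QUERY = (
    'sum(rate(http_requests_total{code!~"5.."}[1h]))'
    ' / sum(rate(http_requests_total[1h]))'
)

def current_sli() -> float:
    """Run an instant query and return the SLI as a float."""
    resp = requests.get(
        f"{PROMETHEUS_URL}/api/v1/query", params={"query": QUERY}, timeout=10
    )
    resp.raise_for_status()
    result = resp.json()["data"]["result"]
    return float(result[0]["value"][1]) if result else 1.0

if __name__ == "__main__":
    sli = current_sli()
    status = "OK" if sli >= SLO_TARGET else "BURNING ERROR BUDGET"
    print(f"availability SLI={sli:.5f} target={SLO_TARGET} -> {status}")
```

In practice such a check would more likely be expressed as recording and alerting rules surfaced in dashboards (e.g., Grafana) rather than a standalone script; the sketch only shows the shape of the SLI-versus-SLO comparison.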

 

Required qualifications, capabilities, and skills

- Formal training or certification on software engineering concepts and 3+ years applied experience
- Strong analysis, research, investigation, and evaluation skills, with a structured approach to problem solving
- Specialized ETL knowledge in Spark (a minimal PySpark sketch follows this list)
- Experience with monitoring and observability tools, including Dynatrace, OpenTelemetry (OTEL), Prometheus, Datadog, and Grafana, particularly in dashboard development
- Proficiency in at least one programming language such as Python, Java/Spring Boot, Scala, and/or .NET
- Working knowledge of Kubernetes, Docker, or other container technologies
- Experience managing and developing/deploying on cloud (private or public)
- Knowledge of Git, Bitbucket, Jenkins, SONAR, Splunk, Maven, AIM, and continuous delivery tools
- UNIX file management and administration, and solid shell scripting experience
- Production working knowledge of Databricks and Apache Airflow on AWS
- Willingness to provide weekend support
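To make the Spark ETL expectation concrete, the following is a minimal PySpark extract-transform-load sketch. The S3 paths, column names, and filter condition are hypothetical; a real Databricks or spark-submit job would supply its own.

```python
"""Minimal PySpark ETL sketch: read, transform, write.

Paths, column names, and the filter condition below are illustrative
assumptions, not details from this posting.
"""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw trade data from a (hypothetical) S3 location.
raw = spark.read.parquet("s3a://example-bucket/raw/trades/")

# Transform: keep settled trades and derive a notional column.
settled = (
    raw.filter(F.col("status") == "SETTLED")
       .withColumn("notional", F.col("price") * F.col("quantity"))
)

# Load: write the curated dataset partitioned by business date.
(settled.write
        .mode("overwrite")
        .partitionBy("business_date")
        .parquet("s3a://example-bucket/curated/trades/"))

spark.stop()
```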

 

Preferred qualifications, capabilities, and skills

- Experience developing/deploying and running Ab Initio (ETL tool) on a public cloud such as AWS
- AWS and/or Databricks certification
- Experience developing and running data pipelines using PySpark (an Airflow scheduling sketch follows this list)
- Support/development experience with Oracle (v9i/10/11/19c) running on Exadata, ANSI SQL, and PL/SQL stored procedures
- Working knowledge of Control-M/AutoSys scheduling packages
- Knowledge of or experience in Hadoop environment administration, release deployments to Hive/HBase, supervising Hadoop jobs, and performing cluster coordination services
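As a concrete illustration of running PySpark pipelines on a scheduler, below is a minimal Apache Airflow DAG sketch (assuming Airflow 2.4+) that submits a nightly Spark job. The DAG id, schedule, script path, and spark-submit arguments are hypothetical.

```python
"""Minimal Airflow DAG sketch that schedules a nightly PySpark job.

The DAG id, schedule, script path, and spark-submit arguments are
illustrative assumptions only.
"""
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_nightly_trades_etl",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",                  # run daily at 02:00
    catchup=False,
    tags=["etl", "pyspark"],
) as dag:
    # Submit the (hypothetical) PySpark ETL script to the cluster.
    run_etl = BashOperator(
        task_id="run_trades_etl",
        bash_command=(
            "spark-submit --master yarn --deploy-mode cluster "
            "/opt/jobs/trades_etl.py --run-date {{ ds }}"
        ),
    )
```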