At JPMorgan Chase, you’ll have the opportunity to advance your career and push the boundaries of data engineering by developing innovative, secure solutions that transform financial services.
Join us as a Lead Data Engineer on the Corporate Data Technology Team, where you will become an integral part of an agile team dedicated to enhancing, building, and delivering trusted, market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you will deliver critical technology solutions across multiple technical areas and business functions in support of the firm’s business objectives.
Job Responsibilities
- Execute creative software and data solutions, including design, development, and technical troubleshooting, thinking beyond routine or conventional approaches to build solutions and break down technical problems.
- Develop secure, high-quality production code and data pipelines, and review and debug code written by others.
- Identify opportunities to eliminate or automate the remediation of recurring issues to improve the operational stability of software applications and systems.
- Lead evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture.
- Work with business stakeholders to understand requirements and design appropriate solutions, producing architecture and design artifacts for complex applications.
- Implement robust monitoring and alerting systems to proactively identify and address data ingestion issues, optimizing performance and throughput.
- Implement data quality checks and validation processes to ensure the accuracy and reliability of data (a minimal sketch of this kind of check follows this list).
- Design and implement scalable data frameworks to manage end-to-end data pipelines for workforce data analytics.
- Share and develop best practices with Platform and Architecture teams to improve the data pipeline framework and modernize the workforce data analytics platform.
- Gather, analyze, and synthesize large, diverse data sets to continuously improve capabilities and user experiences, leveraging data-driven insights.
- Lead and contribute to software engineering communities of practice and events that explore new and emerging technologies, fostering a culture of diversity, opportunity, inclusion, and respect.
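To make the data-quality responsibility above concrete, here is a minimal, illustrative PySpark sketch of a batch validation step. It is not part of the role description: the dataset path and the names employee_events, event_id, and event_ts are hypothetical placeholders.

```python
# Minimal data-quality sketch: validate a batch before it moves downstream.
# Path, dataset, and column names below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical input location for a workforce-events dataset.
df = spark.read.parquet("s3://example-bucket/employee_events/")

# Rule 1: the key column must be non-null and unique.
null_keys = df.filter(F.col("event_id").isNull()).count()
dup_keys = (
    df.groupBy("event_id").count().filter(F.col("count") > 1).count()
)

# Rule 2: event timestamps must not be in the future.
future_rows = df.filter(F.col("event_ts") > F.current_timestamp()).count()

failures = {
    "null_keys": null_keys,
    "duplicate_keys": dup_keys,
    "future_rows": future_rows,
}
bad = {name: count for name, count in failures.items() if count > 0}
if bad:
    # In a real pipeline this would raise an alert and quarantine the batch.
    raise ValueError(f"Data-quality checks failed: {bad}")
```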
Required Qualifications, Capabilities, and Skills
- Formal training or certification in software engineering concepts and 5+ years of applied experience.
- 7+ years of experience in data engineering, including design, application development, testing, and operational stability.
- Advanced proficiency in data processing frameworks and tools, including Parquet, Iceberg, PySpark, Glue, Lambda, Databricks, and AWS data services (EMR, Athena, Redshift).
- Proficiency in programming languages such as Python, Java, or Scala for data processing and application development.
- Extensive hands-on experience with Databricks, including architecting and managing large-scale data pipelines, optimizing Spark jobs, implementing security and governance controls, and automating workflows; proven ability to leverage Databricks for advanced analytics, data lakehouse architectures, and seamless integration with AWS or other cloud services.
- Proficiency in automation and continuous delivery methods, using CI/CD pipelines with tools such as Git/Bitbucket, Jenkins, or Spinnaker for automated deployment and version control.
- Hands-on practical experience delivering system design, application development, testing, and operational stability, with an advanced understanding of agile methodologies, application resiliency, and security.
- Demonstrated proficiency in software applications and technical processes within technical disciplines such as cloud, artificial intelligence, machine learning, and mobile.
- Experience leveraging AI agents and tools, such as GitHub Copilot, to enhance productivity, code quality, and problem-solving in engineering tasks.
- Experience with scheduling tools such as Autosys or Airflow to automate and manage job scheduling for efficient workflow execution (see the sketch after this list).
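As a point of reference for the scheduling requirement above, the following is a minimal Airflow sketch of a scheduled two-step pipeline (Airflow 2.4+). The DAG id, schedule, and task bodies are hypothetical examples, not a description of the team’s actual workflows.

```python
# Minimal Airflow sketch: a daily two-task pipeline (Airflow 2.4+).
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull source data")


def load():
    print("write to the lakehouse")


with DAG(
    dag_id="workforce_data_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run extract before load.
    t_extract >> t_load
```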
Preferred Qualifications, Capabilities, and Skills
- AWS certifications.
- Databricks certifications.
- Proficiency in relational databases (Oracle or SQL Server).
- Skilled in writing Oracle SQL queries using DML, DDL, and PL/SQL (an illustrative sketch follows).
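For illustration of the Oracle SQL skill above, here is a small sketch using the python-oracledb driver. The connection details and the employees table and its columns are hypothetical placeholders.

```python
# Illustrative Oracle query via python-oracledb; all names are placeholders.
import datetime

import oracledb

conn = oracledb.connect(
    user="app_user", password="secret", dsn="dbhost/ORCLPDB1"
)
cutoff = datetime.date(2024, 1, 1)
with conn.cursor() as cur:
    # Aggregate query with a bind variable (:cutoff).
    cur.execute(
        """
        SELECT department_id, COUNT(*) AS headcount
          FROM employees
         WHERE hire_date >= :cutoff
         GROUP BY department_id
        """,
        cutoff=cutoff,
    )
    for department_id, headcount in cur:
        print(department_id, headcount)
conn.close()
```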