We have an exciting and rewarding opportunity for you to take your career to the next level.
As a Software Engineer III - AWS/Data at JPMorgan Chase within the Corporate Sector - Consumer and Community Banking Risk team, you are part of an agile team that works to enhance, design, and deliver the data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. As an emerging member of a data engineering team, you execute data solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.
Job responsibilities
• Organizes, updates, and maintains gathered data that will aid in making the data actionable
• Demonstrates basic knowledge of the data system components to determine controls needed to ensure secure data access
• Makes custom configuration changes in one to two tools to generate a product at the request of the business or customer
• Builds and develops automation tools
• Troubleshoots priority incidents, facilitates blameless post-mortems, and ensures permanent closure of incidents
• Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
• Formal training or certification on Data Engineering concepts and 3+ years of applied experience
• Demonstrated ability to work independently, with strong ownership, collaboration, and communication skills
• Experience in data lifecycle and data management functions
• Experience with Spark, Shell/Perl scripting, and Python or Java
• Hands-on experience with AWS frameworks such as ECS, EKS, and EMR
• Significant experience with statistical data analysis and the ability to determine the appropriate tools to perform the analysis
• Basic knowledge of data system components to determine controls needed
• Experience with SRE operations and principles
Preferred qualifications, capabilities, and skills
• Advanced knowledge of SQL (e.g., joins and aggregations), Python, and Spark (see the SQL/Spark sketch after this list)
• Experience with workflow automation tools such as Control-M and Apache Airflow (a minimal Airflow sketch also follows this list)
• Experience in maintaining and optimizing cloud-based infrastructure for enhanced performance and reliability
• Experience with monitoring tools such as Grafana and Dynatrace
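
To illustrate the kind of SQL join and aggregation work referenced above, here is a minimal PySpark sketch. The table and column names (transactions, accounts, amount, segment) are hypothetical examples, not part of the role's actual systems.

# A minimal sketch, assuming PySpark is installed; data and schema are illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("risk-data-example").getOrCreate()

# Hypothetical input tables.
transactions = spark.createDataFrame(
    [(1, 101, 250.0), (2, 101, 75.5), (3, 102, 1200.0)],
    ["txn_id", "account_id", "amount"],
)
accounts = spark.createDataFrame(
    [(101, "retail"), (102, "commercial")],
    ["account_id", "segment"],
)

transactions.createOrReplaceTempView("transactions")
accounts.createOrReplaceTempView("accounts")

# Join transactions to accounts and aggregate spend per segment.
summary = spark.sql("""
    SELECT a.segment,
           COUNT(*)      AS txn_count,
           SUM(t.amount) AS total_amount
    FROM transactions t
    JOIN accounts a ON t.account_id = a.account_id
    GROUP BY a.segment
""")
summary.show()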
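
For the workflow-automation qualification, the following is a minimal Apache Airflow sketch, assuming Airflow 2.4+; the DAG id, schedule, and task callables are hypothetical placeholders only.

# A minimal Airflow DAG sketch; names and schedule are assumptions for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder for a data-collection step.
    print("extracting source data")


def load():
    # Placeholder for a storage/load step.
    print("loading curated data")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task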