New York, NY, United States
Software Engineer III - ETL/ELT Pipelines / Python / PySpark / AWS

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. 

As a Software Engineer III - ETL/ELT Pipelines / Python / PySpark / AWS at JPMorganChase within the Asset and Wealth Management Technology Team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.

Job responsibilities

 

- Design and implement scalable data solutions that align with business objectives and technology strategies, applying technical troubleshooting and the ability to think beyond routine or conventional approaches to build and support solutions or break down technical problems
- Design, develop, and optimize robust ETL/ELT pipelines using SQL, Python, and PySpark for large-scale, complex data environments
- Develop and support secure, high-quality production code, and review and debug code written by others
- Support data migration and modernization initiatives, transitioning legacy systems to cloud-based data warehouses
- Identify opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems
- Collaborate with cross-functional teams to understand data requirements and translate them into technical specifications
- Monitor and tune ETL processes for efficiency, resilience, and scalability, including alerting for data quality issues
- Work closely with stakeholders to identify opportunities for data-driven improvements and efficiencies
- Document data flows, logic, and transformation rules to maintain transparency and facilitate knowledge sharing across teams
- Stay current on emerging ETL and data engineering technologies and industry trends to drive innovation

Required qualifications, capabilities, and skills

 

- Formal training or certification in software engineering with 3+ years of applied experience
- Proficiency in coding in one or more languages, including Python
- Strong hands-on coding proficiency in Python, PySpark, Apache Spark, and SQL, along with AWS cloud services such as EMR, S3, Athena, and Redshift
- Hands-on experience with AWS cloud and data lake platforms such as Snowflake and Databricks
- Proven experience in ETL/ELT pipeline development and large-scale data processing with SQL
- Practical experience implementing data validation, cleansing, transformation, and reconciliation processes to ensure high-quality, trustworthy datasets
- Experience with cloud-based data warehouse migration and modernization
- Proficiency in automation and continuous delivery methods, and understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
- Excellent problem-solving and troubleshooting skills, with the ability to optimize performance and troubleshoot complex data pipelines
- Strong communication and documentation abilities
- Ability to collaborate effectively with business and technical stakeholders

Preferred qualifications, capabilities, and skills

 

- Knowledge of Apache Iceberg
- Knowledge of the financial services industry and IT systems