Own your opportunity to manage the network that makes mission success possible. Make an impact by using your skills to deliver “One GDIT Network” for our clients.
JOB DESCRIPTION:
Seize your opportunity to make a personal impact supporting the Case Management Modernization (CMM) Program. The CMM program is an initiative to support the Administrative Office of the US Courts (AO) in developing a modern cloud-based solution to support all 204+ federal courts across the United States.
GDIT is your place to make meaningful contributions to challenging projects and grow a rewarding career. The Senior Data Engineer/Developer will work as part of the CMM Enterprise Data Warehouse (EDW) team to deploy a secure cloud-native EDW platform and support statutory and operational reporting, data cataloging, and other analytical objectives.
The Senior Data Engineer/Developer is responsible for designing, building, testing, and maintaining scalable data engineering components and platform services that power the EDW. This role delivers high-quality, secure, and performant data pipelines, transformations, and integrations aligned with federal standards and EDW SOW objectives for modernization, analytics, and operational excellence.
RESPONSIBILITIES:
- Develop a solution to integrate case management business data from the case management system into the Snowflake EDW on a near real-time basis
- Produce and maintain dimensional data models that form the basis for the data marts to satisfy the requirements of the data products
- Develop and maintain downstream data marts and their corresponding ETL/ELT processes (on a near real-time basis or overnight batch, depending on the individual use case requirements) and their associated reports and dashboards
- Develop and maintain comprehensive documentation, including STTMs, data models (logical and physical), design documents, testing documents, and other related documents
- Develop, deploy, and maintain the data presentation and data reports for each of the data products using SAP Business Objects, Tableau, and/or other presentation tool(s) selected by the customer
- Design and develop data ingestion, ETL/ELT pipelines, and transformation logic for near real-time and batch workloads
- Implement robust, reusable data services supporting analytics, reporting, and downstream data marts
- Collaborate with architects to implement logical and physical data models in Snowflake
- Develop and maintain high-quality, testable code using secure coding standards and best practices
- Integrate data pipelines with cloud services, messaging, and storage components
- Implement data quality checks, validations, and error-handling mechanisms
- Optimize pipeline performance, scalability, and cost efficiency
- Support CI/CD-enabled deployments, including automated testing and promotion across environments
- Participate in code reviews, design reviews, and sprint ceremonies
- Support incident resolution and root cause analysis for data pipeline failures
- Produce and maintain technical documentation, runbooks, and workflows
- Ensure deliverables meet federal security, governance, and audit requirements
- Operate within an Agile federal delivery environment across multiple scrum teams
- Collaborate closely with architects, DBAs, business analysts, and QA personnel
- Remain accountable for code quality, performance, and delivery timelines
- Maintain audit-ready documentation and technical artifacts
REQUIRED EXPERIENCE & QUALIFICATIONS:
- Bachelor's degree in Computer Science, Computer Programming, Computer Engineering, or a relevant computer-based major, strongly preferred
- 8 years of IT-related experience, with 7+ years' experience specifically in developing IT and cloud infrastructures
- Experience in software engineering and design architectures
- Experience with and understanding of best practices regarding system security measures
- Experience building and migrating software and IT services to align with strategic business needs and goals
- Experience conducting research on advanced technologies to determine how IT can support business needs leveraging software, hardware, or infrastructure
- Experience with AWS data and compute services
- Experience with Airflow or equivalent orchestration tools
- Knowledge of cloud messaging and storage services
- Proven track record in software and data engineering roles
- Hands-on experience building enterprise-scale data pipelines
- Strong proficiency with SQL and data transformation techniques
- Experience developing in cloud-based data platforms
- Familiarity with Agile/Scrum delivery environments
- Experience working in regulated or compliance-driven environments
- Experience supporting federal EDW or analytics programs
- Hands-on experience with Snowflake
- Proficiency with ETL/ELT frameworks
- Experience with streaming or near real-time data ingestion
- Familiarity with data governance, metadata, and classification standards
- Experience mentoring junior developers

CERTIFICATIONS (Preferred):
- Snowflake SnowPro Core
- AWS Certified Data Analytics – Specialty
- Relevant programming or data engineering certifications
GDIT IS YOUR PLACE:
At GDIT, the mission is our purpose, and our people are at the center of everything we do.
- Growth: AI-powered career tool that identifies career steps and learning opportunities
- Support: An internal mobility team focused on helping you achieve your career goals
- Rewards: Comprehensive benefits and wellness packages, 401K with company match, and competitive pay and paid time off
- Community: Award-winning culture of innovation and a military-friendly workplace
OWN YOUR OPPORTUNITY:
Explore a career in data science and engineering at GDIT and you’ll find endless opportunities to grow alongside colleagues who share your determination for solving complex data challenges.