Staff Data Engineer (f/m/d) - Kraków (onsite) OR Poland (remote)
Danaher Corporation
Bring more to life.
At Danaher, our work saves lives. And each of us plays a part. Fueled by our culture of continuous improvement, we turn ideas into impact – innovating at the speed of life.
Our 63,000+ associates work across the globe at more than 15 unique businesses within life sciences, diagnostics, and biotechnology.
Are you ready to accelerate your potential and make a real difference? At Danaher, you can build an incredible career at a leading science and technology company, where we’re committed to hiring and developing from within. You’ll thrive in a culture of belonging where you and your unique viewpoint matter.
Learn about the Danaher Business System, which makes everything possible.
The Staff Data Engineer (f/m/d) is responsible for designing, building, and scaling Danaher’s data platform. The primary objective of this role is to build data assets using data & analytics strategies that align with business priorities and drive measurable business outcomes.
This position reports to the Sr. Director – Data Engineering and is part of Core Technology & Planning.
This is a Danaher Corporate role, hosted by our Cytiva operating company in Kraków, and will be an on-site role (Kraków) or remote (Poland).
In this role, you will have the opportunity to:
+ Design, build and maintain trusted, scalable, secure, compliant, and reusable data products in Snowflake that support AI/ML models, executive dashboards, and cross-OpCo analytics.
+ Lead design and implementation of complex data initiatives spanning multiple teams.
+ Act as a technical owner for key data domains.
+ Set and reinforce best practices for data modelling, pipeline design, testing, and observability.
+ Drive continuous improvement by implementing automated data quality checks, scaling ingestion/onboarding for M&A integrations, and maintaining uptime, optimization, and IT General Controls.
The essential requirements of the job include:
+ Minimum 10 years' experience in enterprise Data & Analytics.
+ Experience designing and building complex Snowflake-based pipelines leveraging Matillion and dbt.
+ Strong understanding of CI/CD and DevOps practices.
+ Experience building and managing complex ETL/ELT workflows that extract, transform, and load data for downstream analytics and reporting.
+ Deep expertise with modern data technologies (Snowflake, BigQuery, Databricks, Azure Fabric/Synapse, Airflow, Spark, etc.).
+ Hands-on experience with cloud data ecosystems (AWS, Azure) designing high-volume, high-availability data pipelines.
+ Effective communication skills and ability to influence cross-functional stakeholders.
Travel, Motor Vehicle Record & Physical/Environment Requirements:
+ Ability to travel to Washington, DC and global Danaher sites, up to 10%.
It would be a plus if you also possess previous experience in:
+ Cloud data architecture certifications.
+ Knowledge of data governance, metadata management, and data quality frameworks.
+ Building Fabric/Databricks lakehouse solutions leveraging PySpark Notebooks.
+ Building Power BI Semantic Models and Dashboards.
Join our winning team today. Together, we’ll accelerate the real-life impact of tomorrow’s science and technology. We partner with customers across the globe to help them solve their most complex challenges, architecting solutions that bring the power of science to life.
For more information, visit www.danaher.com.