Database Engineer
Insight Global
Job Description
Insight Global is looking to hire a Data Engineer for a retail client based in Vancouver. This is a hybrid position requiring three days per week onsite in Downtown Vancouver. You will join a team that works heavily with customer data, owning the customer data pipelines that gather data from multiple sources and consolidate it for different use cases.

As a mid-level Data Engineer, you will bring a high level of technical knowledge as well as the ability to share that knowledge with your co-workers. You will help form the core of the engineering practice by contributing to all areas of development and operations, from pre-production to production. You will set an example of what good engineering looks like and help others around you refine their skills. You will be part of a day-to-day production release team and may perform on-call support functions as needed. A DevOps mindset is key to success in this role, as engineers are commonly part of full DevOps teams that “own” all parts of software development, release pipelines, production monitoring, security, and support. Other duties and responsibilities include:
Build, modernize, and maintain data pipelines using Azure, Databricks, Snowflake, and GCP.
Create and publish secure Snowflake views to the Enterprise Data Exchange.
Migrate pipelines to Delta Live Tables (DLT) and Unity Catalog.
Deploy pipelines using existing CI/CD frameworks.
Ensure compliance with PII masking/encryption requirements.
Use ADF/Airflow/Fivetran for orchestration; SQL & Python for development.
Support streaming workloads (Kafka, EventHub, Spark Streaming).
Participate in DevOps: improve CI/CD, monitor production, handle failures, join on-call rotations.
Collaborate with global engineering teams as part of an Agile, DevOps, SRE-aligned culture.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Skills and Requirements
5+ years of database engineering experience building out and deploying pipelines, ideally working with customer data
Strong experience with Azure (deployments, configurations, Storage Accounts)
Hands-on experience with Azure Data Factory, Azure Databricks, Snowflake, DBT/DLT, and Medallion Architecture
Strong Python (especially PySpark) for building and optimizing pipelines across GCP/Azure/Snowflake/Databricks
Experience with CI/CD concepts & tooling (Azure DevOps, Repos, pipeline deployments)
Experience working in an Agile environment
Strong communication skills, both written and verbal
Experience with Fivetran, Feedonomics, or similar marketing technology tools
Experience with Terraform or ARM templates
Experience with Unity Catalog or similar governance tools
Experience with PII masking/encryption standards
Background with Snowflake secure views and enterprise data sharing