Toronto, ON, M5R 1A6, CAN
9 days ago
Big Data Engineer
Job Description

The Big Data Engineer is responsible for architecting, developing, and maintaining enterprise-scale data platforms that support analytics, operational reporting, and machine learning initiatives. This role requires deep technical expertise in distributed systems, cloud platforms, and data modeling, combined with strong communication and leadership capabilities.

Responsibilities

- Design and implement large-scale ETL/ELT pipelines using Python, Spark, and distributed processing frameworks
- Develop and maintain big data infrastructure leveraging Hadoop, Spark, Kafka, and Kafka Streams
- Architect cloud-native data solutions on AWS or Azure, including serverless components
- Build and optimize Snowflake data warehouses using dimensional modeling best practices
- Conduct data discovery and source analysis to support new integrations and transformations
- Model complex datasets using normalized, denormalized, star, and snowflake schemas
- Integrate external systems through RESTful APIs and automated ingestion frameworks
- Implement DevOps practices including CI/CD, containerization, and orchestration
- Develop real-time streaming applications using Kafka, Storm, Kinesis, or Pub/Sub
- Operationalize machine learning models in collaboration with data science teams
- Optimize performance across queries, pipelines, and distributed workloads
- Troubleshoot and resolve complex data issues across upstream and downstream systems
- Maintain comprehensive documentation for data pipelines, models, and integrations

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances.

If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Skills and Requirements

Qualifications

- Expertise in big data technologies (Hadoop, Spark, Kafka)
- Advanced SQL proficiency for investigation and optimization
- Strong Python development skills
- Experience with cloud platforms (AWS or Azure)
- Knowledge of Snowflake and dimensional modeling
- Experience with DevOps tooling (CI/CD, Docker, Kubernetes)
- Familiarity with ML frameworks (TensorFlow, PyTorch)
- Strong debugging and analytical skills

Behavioral Competencies

- Leadership and mentorship
- Effective communication with technical and non-technical audiences
- Problem-solving and critical thinking
- Ownership and accountability
- Adaptability and continuous learning