Data Engineer (with MLOps Exposure)

Gore Mutual Insurance
Cambridge, ON
Posted 30 days ago
Job Details:
Full-time
Experienced

Next Horizon is here. Fueled by investments in talent and technology, our bold strategy to transform is nearly complete.

At Gore Mutual, we've always set ourselves apart as a modern mutual that does good. Now, we're proudly building on that legacy to transform our company—and our industry—for the better.

Our path forward sharpens our focus on business performance, driven by leading technology, innovation and an agile, high-performing culture. With Gore Mutual and Beneva announcing their intent to merge in 2026, we'll be uniting two well-established, financially strong, and trusted brands to become the strongest mutual insurer in Canada, ensuring Canadians have purpose-driven insurance options for generations to come. Come join us.

As a Data Engineer at Gore Mutual Insurance, you will bring a strong technical background in software engineering or computer science and play a pivotal role in designing, building, and maintaining our data platform. Your work will make data accessible, accurate, and accountable, enabling data-driven decision-making across the organization. You will collaborate closely with cross-functional teams to ensure our data processes are robust and scalable.

What will you be doing in this role?

Design and Implement Data Processes

  • Design and implement robust data infrastructure, tooling, workflows, and models that power the data platform.
  • Build and maintain enterprise data assets to support business reporting and analytical modelling for all business stakeholders.
  • Ensure integration of required tools, monitor the health of the data platform, and maintain CI/CD pipelines to enforce standards. Identify opportunities for code optimization and operational efficiencies.

Design and Implement Data Governance Processes

  • Automate day-to-day data operations, pipeline monitoring, and data integration with external systems.
  • Ensure that security protocols following best practices are in place to protect against potential security threats.
  • Create and maintain frameworks for metadata, data tagging, and data lineage.
  • Implement data quality and monitoring frameworks to ensure data reliability (a brief illustrative sketch follows this list).
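
For illustration only, a minimal PySpark sketch of the kind of data quality check such a framework might automate; the table name and the 5% null-rate threshold are hypothetical, not part of the role:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()

    # Hypothetical curated table; a real framework would iterate over many tables.
    df = spark.read.table("silver.policies")
    total = df.count()

    # Count nulls per column in a single pass over the data.
    null_counts = df.select(
        [F.count(F.when(F.col(c).isNull(), c)).alias(c) for c in df.columns]
    ).first().asDict()

    # Flag any column whose null rate exceeds a 5% threshold.
    failures = {c: n / total for c, n in null_counts.items() if total and n / total > 0.05}
    if failures:
        # In a production framework this would raise an alert or fail the pipeline run.
        raise ValueError(f"Data quality check failed: {failures}")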

Test and Optimize Data Platform

  • Optimize data pipelines to ensure efficient data flow.
  • Ensure that the data extracted from sources is accurate, complete, and usable. This might involve checking for missing values, inconsistent formats, or anomalies that could indicate errors.
  • Test the efficiency and speed of data pipelines and databases. This can help identify bottlenecks and optimize performance.
  • Verify that different components of the data infrastructure work together as expected. This includes checking that data flows correctly from sources to databases, and from databases to applications (see the sketch after this list).
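
For illustration only, a minimal PySpark sketch of one such end-to-end check, reconciling row counts between an extracted file and its target table; the paths, table name, and load-date filter are hypothetical:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("pipeline-reconciliation").getOrCreate()

    # Hypothetical source extract and target table; a real pipeline would parameterize these.
    source_df = spark.read.parquet("/mnt/raw/claims/2024-06-01/")
    target_df = spark.read.table("bronze.claims").filter("load_date = '2024-06-01'")

    source_count = source_df.count()
    target_count = target_df.count()

    # Every record extracted from the source should land in the target table.
    if source_count != target_count:
        raise ValueError(
            f"Row-count mismatch: source={source_count}, target={target_count}"
        )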

What will you need to succeed in this role?

  • Bachelor's or Master's degree in Computer Science, Data Engineering, Software Engineering or a related field.
  • A minimum of 4-5 years of relevant experience in data engineering, data system development, or related roles.
  • Strong understanding of data structures, modern data modeling, and software architecture.
  • Good knowledge of Microsoft Azure services (DevOps, Databricks, SQL Server, Event Hub, Web Apps, Data Factory, Azure Storage, Key Vault, etc.).
  • 2+ years of experience in ML engineering and MLOps, including defining and enforcing MLOps best practices such as versioning, governance, and monitoring. Proven track record of deploying and maintaining ML models in production at scale.
  • Experience building automated and reusable pipelines for ML model training, evaluation, retraining, and deployment. Ability to monitor ML model usage, latency, cost, and drift, and to optimize model serving for low latency and high throughput.
  • Experience with ML orchestration, including developing pipelines that automate feature engineering, predictive model training and testing, prompt engineering, fine-tuning, and RAG workflows. Strong Python skills and familiarity with LLM frameworks (LangChain, LlamaIndex), Azure OpenAI models and agentic AI, CI/CD, YAML pipelines, and MLflow (a brief MLflow sketch follows this list).
  • Experience with software design patterns and test-driven development (TDD)
  • Proficiency in Python, including a strong grasp of Object Oriented and Functional programming paradigms.
  • Solid understanding of Spark concepts and distributed systems, including data transformations, RDDs, DataFrames, and Spark SQL.
  • Strong SQL skills and expertise in database management and performance tuning.
  • Experience with data lakehouse and Medallion architectures.
  • Strong problem-solving and critical-thinking abilities.
  • Strong communication and collaboration skills.
  • Experience with version control systems (Git) and CI/CD practices. Familiarity with data governance principles and metadata management practices.
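
For illustration only, a minimal sketch of the kind of model versioning workflow MLflow supports; the experiment name, metric, and registered model name are hypothetical:

    import mlflow
    import mlflow.sklearn
    from mlflow.models import infer_signature
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical experiment; in practice this would live in a shared Databricks workspace.
    mlflow.set_experiment("claims-risk-model")

    X, y = make_classification(n_samples=500, n_features=10, random_state=42)
    model = RandomForestClassifier(n_estimators=100, random_state=42)

    with mlflow.start_run():
        model.fit(X, y)
        mlflow.log_param("n_estimators", 100)
        mlflow.log_metric("train_accuracy", model.score(X, y))
        # Registering the model creates a new version that can be governed and monitored.
        mlflow.sklearn.log_model(
            model,
            artifact_path="model",
            registered_model_name="claims-risk-model",
            signature=infer_signature(X, model.predict(X)),
        )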

Bonus points for

  • Azure certifications (Microsoft Certified: Azure Data Engineer Associate).
  • Databricks certifications (Databricks Certified Data Engineer Professional).
  • Experience with modern data transformation tools (dbt, Dataform, or similar) for building scalable analytics workflows.
  • Proficiency in dimensional modeling techniques such as star schema and snowflake schema.

#LI-Hybrid

Gore Mutual Insurance is committed to providing accommodations for people with disabilities during all phases of the recruiting process, including the application process. If you require accommodation because of a disability, we will work with you to meet your needs. If you are selected for an interview and require accommodation, please advise the HR representative who will consult with you to determine an appropriate accommodation.
