
Business Intelligence Engineer (Power BI Developer)

OMERS Administration Corporation
Full-time
On-site
Toronto, Canada

Choose a workplace that empowers your impact.

Join a global workplace where employees thrive. One that embraces diversity of thought, expertise and experience. A place where you can personalize your employee journey to be, and deliver, your best.

We are a purpose-driven, dynamic and sustainable pension plan. An industry-leading global investor with teams in Toronto, London, New York, Singapore, Sydney and other major cities across North America and Europe. We embody the values of our 600,000+ members, placing their best interests at the heart of everything we do.

Join us to accelerate your growth & development, prioritize wellness, build connections, and support the communities where we live and work.

Don't just work anywhere. Come build tomorrow together with us.

Know someone at OMERS or Oxford Properties? Great! If you're referred, have them submit your name through Workday first. Then, watch for a unique link in your email to apply.

We are seeking a highly skilled and motivated BI Engineer (Power BI Developer) to join our BI Platform team in Toronto. The ideal candidate will have at least 5 years of hands-on experience in data analysis, leveraging tools such as Power BI, Microsoft Fabric Notebooks (Python), Lakehouse architecture, data warehousing concepts, and Gen2 Dataflows to deliver advanced analytics, develop insightful dashboards, and create impactful reports that drive data-informed decisions. In addition to expertise with these tools, the ideal candidate will have experience using GitHub CI/CD pipelines to promote Power BI reports, as well as with ALM Toolkit and Tabular Editor.

This is a fantastic opportunity for a BI Engineer to work with cutting-edge data analytical tools while being part of a dynamic BI Platform team that fosters collaboration, encourages innovation, and supports professional growth.

As a member of this team, you will be responsible for:

Data Collection & Integration

  • Ingest data from databases, APIs, and external systems into the Microsoft Fabric Lakehouse using Gen2 Dataflows (Power Query) for standard connectors and Fabric Notebooks (Python) for API orchestration and complex logic.

  • Design and maintain medallion (bronze/silver/gold) layers in the Lakehouse with Delta/Parquet storage; model conformed dimensions and facts per data warehousing best practices.

  • Implement incremental refresh/CDC where applicable and wire up Power BI semantic models (Import/Direct Lake) to curated Lakehouse/DWH tables for high-performance analytics.

  • Ensure lineage and discoverability by publishing certified datasets and documenting data contracts and dependencies across dataflows, notebooks, and models.
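The bronze-to-silver flow described above can be sketched as follows. This is a minimal illustration only: in Fabric it would run in a Notebook against Delta tables in the Lakehouse (with the payload coming from a real API call), and the source name, field names, and schema here are hypothetical.

```python
import json
from datetime import datetime, timezone

# Hypothetical raw API payload; in a Fabric Notebook this would come from an
# HTTP call, and bronze/silver would be Delta tables rather than Python objects.
raw_payload = '[{"member_id": "1001", "amount": "250.00", "ts": "2024-01-05"}]'

def land_bronze(payload: str) -> dict:
    """Land the raw payload untouched, stamped with ingestion metadata."""
    return {
        "ingested_at": datetime.now(timezone.utc).isoformat(),
        "source": "contributions_api",  # hypothetical source name
        "body": payload,
    }

def promote_silver(bronze_row: dict) -> list:
    """Parse and type the bronze payload into conformed silver records."""
    records = json.loads(bronze_row["body"])
    return [
        {
            "member_id": int(r["member_id"]),
            "amount": float(r["amount"]),
            "ts": r["ts"],
        }
        for r in records
    ]

bronze = land_bronze(raw_payload)
silver = promote_silver(bronze)
print(silver)  # [{'member_id': 1001, 'amount': 250.0, 'ts': '2024-01-05'}]
```

Keeping the bronze layer untouched preserves lineage: the silver logic can be replayed against the raw payload whenever cleaning rules change.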

Data Cleaning and Preprocessing

  • Clean and preprocess raw data to remove errors, inconsistencies, or duplicate entries.

  • Transform data into a structured format suitable for analysis, applying necessary formatting or aggregation techniques.

  • Validate data to ensure its accuracy and integrity before performing analysis.
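As a rough sketch of the clean, deduplicate, and validate steps above (field names and rules are illustrative, not an actual schema):

```python
# Raw input with formatting noise, a duplicate, and an invalid entry.
raw_rows = [
    {"member_id": "1001", "plan": " primary ", "amount": "250.00"},
    {"member_id": "1001", "plan": " primary ", "amount": "250.00"},  # duplicate
    {"member_id": "1002", "plan": "PRIMARY", "amount": "300"},
    {"member_id": "1003", "plan": "primary", "amount": "-5"},        # invalid
]

def clean(row: dict) -> dict:
    """Standardize formatting: trim whitespace, normalize case, type amounts."""
    return {
        "member_id": int(row["member_id"]),
        "plan": row["plan"].strip().lower(),
        "amount": float(row["amount"]),
    }

def is_valid(row: dict) -> bool:
    """Basic integrity check before the row reaches analysis."""
    return row["amount"] > 0

seen, curated = set(), []
for row in map(clean, raw_rows):
    key = (row["member_id"], row["plan"], row["amount"])
    if key in seen or not is_valid(row):
        continue  # drop duplicates and invalid entries
    seen.add(key)
    curated.append(row)

print(len(curated))  # 2 (one duplicate and one invalid row removed)
```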

Data Analysis & Interpretation

  • Analyze data to identify trends, patterns, and relationships that can provide insights.

  • Use statistical methods and predictive models to address business problems and support decision-making.

  • Perform descriptive and exploratory analysis to understand key metrics and KPIs.

Reporting & Visualization

  • Create reports, dashboards, and visualizations to communicate data insights clearly and effectively using Power BI.

  • Automate recurring reports to improve efficiency and ensure timely delivery of insights.

  • Present findings to stakeholders with clear, actionable insights through reports and presentations.

Collaboration & Communication

  • Partner with business, product, and engineering teams to translate requirements into Power BI semantic models, curated datasets, and well-defined KPIs.

  • Use GitHub CI/CD pipelines to version and promote Power BI reports, datasets, dataflows, and notebooks; participate in PR reviews and enforce branching standards.

  • Leverage ALM Toolkit for model diff/deployment and Tabular Editor for scripting, calculation groups, and documentation to keep environments consistent across dev/test/prod.

  • Present insights and trade-offs clearly to non-technical stakeholders; maintain concise runbooks and wikis for self-serve analytics and operational support.

Continuous Learning and Improvement

  • Identify opportunities to improve existing data processes and analytical techniques.

  • Keep up with the latest trends and best practices in data analysis, visualization, and tools.

  • Experiment with new methods and tools to enhance the quality and speed of data analysis.

Data Governance & Quality Assurance

  • Ensure data security, privacy, and compliance with regulations during the collection and analysis process.

  • Implement processes for quality control to ensure the accuracy and reliability of data.

  • Monitor and audit data sources for consistency and potential issues over time.

Ad-Hoc SQL Query Development & Analysis

  • Develop SQL queries and conduct ad-hoc analysis to address immediate data requests and provide urgent data insights.
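An ad-hoc request of this kind might look like the following, sketched here against an in-memory SQLite table for portability. In practice the query would run against the warehouse or Lakehouse SQL endpoint, and the table and column names are hypothetical.

```python
import sqlite3

# Stand-in for a warehouse table, populated with sample rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contributions (member_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO contributions VALUES (?, ?)",
    [(1001, 250.0), (1001, 100.0), (1002, 300.0)],
)

# Immediate request: total contributions per member, largest first.
query = """
    SELECT member_id, SUM(amount) AS total
    FROM contributions
    GROUP BY member_id
    ORDER BY total DESC
"""
rows = conn.execute(query).fetchall()
print(rows)  # [(1001, 350.0), (1002, 300.0)]
```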

Documentation & Knowledge Sharing

  • Document analysis methods, assumptions, and results for transparency and reproducibility.

  • Share insights and learnings with the broader team, contributing to a knowledge-sharing culture.

To succeed in this role, you have:

  • Bachelor's degree in Computer Engineering, Computer Science, Information Technology, or a related field (or equivalent experience).

  • 2-5 years of hands-on experience in data analysis, Power BI development, and BI engineering.

  • Strong proficiency in Power BI, including Power Query, Power BI Service, Power BI Desktop, Power Platform, and Tabular Editor, as well as Lakehouse/Notebook experience.

  • Expertise in DAX (Data Analysis Expressions) for creating measures, calculated columns and complex DAX expressions.

  • Must have experience building semantic data models in Power BI and using GitHub CI/CD pipelines to promote and version-control Power BI reports.

  • Experience working with relational databases and data warehousing concepts.

  • Proficiency in SQL and Python for querying and transforming data.

  • Experience with or knowledge of Microsoft Fabric preferred.

  • A team player and motivated self-starter.

  • Strong problem-solving and critical-thinking skills.

  • Experience with Azure Data Services (Azure SQL Database, Azure Synapse).

  • Excellent communication and presentation skills.

  • Attention to detail and a commitment to accuracy.

  • Ability to manage multiple projects and meet deadlines.

We believe that time together in the office is important for OMERS and Oxford, the strength of our employees, and the work we do for our pension members. In delivering on our pension promise and keeping us connected to our work and each other, our flexible hybrid work guideline requires teams to come into the office 4 days per week.

As one of Canada's largest defined benefit pension plans, our people-first culture is at its best when our workforce reflects the communities where we live and work, and the members we proudly serve.

From hire to retire, we are an equal opportunity employer committed to an inclusive, barrier-free recruitment and selection process that extends all the way through your employee experience. This sense of belonging and connection is cultivated up, down and across our global organization thanks to our vast network of Employee Resource Groups with executive leader sponsorship, our Purpose@Work committee and employee recognition programs.