If it were up to OCAD U Associate Professor Dr. Alexis Morris, farm data would be something you could see, explore and learn from. That is the goal of a new mixed reality research project he has completed with his team.

The Mixed Reality Agriculture Project is an early test of a new digital tool that helps farmers better understand and use their farm data. It brings together information about soil, crops, and the environment and shows it in an interactive, easy-to-understand way. By using technologies like AI and immersive visuals, the tool is designed to help farmers make better decisions, work more efficiently, and reduce environmental impact. 

OCAD U caught up with Dr. Alexis Morris to find out more about this collaborative project. 

Tell us more about your Mixed Reality Agriculture Project.

The project had three main goals: to build and test a working prototype, to help farmers better understand soil and how farming affects the environment, and to find better ways to use farm data to support, inform, and even educate farmers and other stakeholders. 

The work happened in three stages. First, the team worked with an advisory group to identify farmers’ needs and decide what the system should do. Next, we built an initial version of the tool and shared it with advisors for feedback. Finally, we tested and refined the system with experts and stakeholders to create a more polished and practical prototype. 

What inspired you to explore this particular topic?

Farm data are often trapped in spreadsheets or specialized monitors that only a few people on a farm ever see, and experts have often acquired a lived experience of the field that yields knowledge that is hard to unpack, visualize, and share. I’ve always been fascinated by the ways immersive interface design can make invisible information visible and actionable, so translating soil maps and yield layers into something you can literally walk around felt like a natural challenge. 

Mixed reality lets us merge empirical data and everyday experience in a way a flat screen never could, opening the door to more intuitive and democratic decision‑making on farms. Additionally, as AI capabilities advanced during the course of this project, it was a natural opportunity to explore how AI language assistants can also be merged into an overall immersive decision‑support system.

What drew you into this field of study? Was there a turning point or defining moment?

My background is in smart systems and immersion: basically the cross-section of using sensing technologies and AI to understand environmental data and human contexts on the one hand, and using immersive 3D visuals to help people understand complex information on the other. This project gave us a use case where we could explore this domain through a design-science lens, and we were able to access real yield and soil data, letting us reimagine how farmers act on their data through new interfaces to go beyond “gut feel” and gain real-time insights. 

The defining moments came when we integrated our mixed reality frameworks with our AI language models (like ChatGPT) and were able to speak verbally to the data, query rules and regulations, and even query the geospatial information in natural language, all while interacting with it on a visual map interface hovering over the table in 3D. That convinced me that design research could bridge the gap between sophisticated ag data and practical on‑farm choices, especially now that hardware is becoming more prevalent, wearable and wireless. Increasing this connection between humans and their data environments remains an exciting direction.
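
To give a rough, hypothetical sense of what that kind of wiring can look like, the sketch below pairs a natural-language question with a compact text summary of a geospatial layer before handing both to a language model. The class names, data fields and the generic `llm` callable are illustrative assumptions, not the project’s actual code.

```python
# Hypothetical sketch only: pairing a natural-language question about a field
# with a compact summary of a geospatial layer, then asking a language model.
# FieldCell, summarize_field and the `llm` callable are illustrative names.

from dataclasses import dataclass
from statistics import mean


@dataclass
class FieldCell:
    lat: float
    lon: float
    soil_organic_matter_pct: float
    yield_t_per_ha: float


def summarize_field(cells: list[FieldCell]) -> str:
    """Condense the spatial layer into a short text context for the model."""
    return (
        f"{len(cells)} sampled cells; "
        f"mean organic matter {mean(c.soil_organic_matter_pct for c in cells):.1f}%; "
        f"mean yield {mean(c.yield_t_per_ha for c in cells):.1f} t/ha"
    )


def answer_question(question: str, cells: list[FieldCell], llm) -> str:
    """`llm` is any callable that takes a prompt string and returns text."""
    prompt = (
        "You are a farm decision-support assistant.\n"
        f"Field summary: {summarize_field(cells)}\n"
        f"Question: {question}\n"
        "Answer concisely."
    )
    return llm(prompt)
```

In an immersive setting, the returned answer could then be spoken aloud or rendered beside the hovering 3D map rather than printed to a console.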

What problem or question was your research trying to solve, and why was this important?

Farmers face a flood of fragmented data, including soil tests, satellite imagery, and weather alerts, yet the tools to synthesize that information are often siloed or difficult to use. Our question was: could an AI‑assisted mixed reality interface integrate those layers and present them in a format that supports faster, more confident decisions? Answering this matters because better‑timed fertilizer, smarter cover‑crop choices, and reduced runoff all translate into economic gains for growers and environmental benefits for communities. 

At a more human-factors level, this makes the knowledge layer of the farming community more seamless, enabling fluid access between farmers, members of their teams, other stakeholders such as regulators, and their overall data ecosystem.
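
For a rough, hypothetical illustration of what integrating those fragmented layers might look like underneath a visual interface, the sketch below joins soil, imagery and weather inputs on a shared grid-cell id. The field names and grid model are assumptions for the example, not the project’s implementation.

```python
# Hypothetical sketch only: fusing soil tests, a satellite vegetation index and
# weather alerts into one record per grid cell, so a single object can drive
# both the 3D visualization and simple decision rules. All names are assumed.

from dataclasses import dataclass
from typing import Optional


@dataclass
class CellRecord:
    cell_id: str
    soil_p_ppm: Optional[float] = None   # phosphorus from a soil-test layer
    ndvi: Optional[float] = None         # vegetation index from satellite imagery
    frost_alert: bool = False            # flagged by a weather feed


def merge_layers(soil: dict[str, float],
                 imagery: dict[str, float],
                 frost_cells: set[str]) -> list[CellRecord]:
    """Join each data layer on a shared grid-cell id."""
    cell_ids = set(soil) | set(imagery) | frost_cells
    return [
        CellRecord(
            cell_id=cid,
            soil_p_ppm=soil.get(cid),
            ndvi=imagery.get(cid),
            frost_alert=cid in frost_cells,
        )
        for cid in sorted(cell_ids)
    ]
```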

How do you see the research findings contributing to your field or affecting people’s lives in the real world?

The prototype shows that conversational AI and spatial visualization can be designed to lower the barrier to using precision‑ag data, potentially saving inputs and reducing environmental impact. For the design community, it offers a case study in blending extended‑reality interaction with trustworthy AI explanations. For farmers, it hints at a future where field insights are literally in view, whether at the kitchen table or out by the fenceline, supporting climate‑smart, data‑driven practices.

Were you working with any collaborators, institutions, or funding bodies on this project?

Yes. The project was funded and supported by the Ontario Ministry of Agriculture, Food and Agribusiness (OMAFA). We also worked with an Ontario demonstration farm that provided the field data used in the prototype, and an advisory group of specialists in precision agriculture and soil science. The development team included OCAD U Digital Futures graduate students and interdisciplinary researchers from our ACE Lab.