Topographic forest map with contour lines, data points, stream network, and trails around Raven Lake

Last Updated on May 14, 2026 by Jaspreet Kaur

Climate change is reshaping ecosystems everywhere, and scientists are trying to understand how plants and animals will adapt. One central problem in conservation science is predicting where species live now and where they might survive later. To tackle it, ecologists use species distribution models, which combine climate data with records of where species have been observed to estimate where they might live.

These models inform policies, wildlife protection plans, and climate adaptation strategies. Yet many of them share a major weakness: uncertainty. Scientists often don’t know how accurate the predictions are, especially when data are incomplete or the climate is changing rapidly.

A new study from Université de Montréal researcher Timothée Poisot aims to fix this issue. He introduced a machine-learning method that better measures uncertainty in biodiversity predictions.

Limits of Models

Species distribution models forecast how environmental conditions affect species. Nature is complex and data is often incomplete. Some species are rarely seen, while others live in poorly studied areas.

Because of these gaps, predictions can be unreliable. Climate conditions may change in ways that make it hard for models trained on past data to predict future outcomes accurately.

According to Poisot, the problem gets even worse when these predictions influence policy or conservation investments. Governments and organizations may spend millions protecting habitats based on forecasts that could be wrong.

To improve confidence in these models, Poisot adapted a method from artificial intelligence called conformal prediction. Instead of producing a single answer, this technique helps scientists estimate how uncertain each prediction is.
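The core idea can be illustrated with a short sketch. The example below is a simplified split conformal classifier, not the study's actual code; all data, names, and the nonconformity score are illustrative assumptions. It calibrates a presence/absence model on held-out observations, then reports which labels remain plausible at each new location:

```python
import numpy as np

def conformal_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction for binary presence/absence.

    cal_probs: model probabilities of presence on a calibration set
    cal_labels: true 0/1 labels for that set
    test_probs: model probabilities for new locations
    Returns one prediction set per location: a subset of {0, 1} that
    contains the true label with probability >= 1 - alpha on average.
    """
    # Nonconformity score: one minus the probability the model
    # assigned to the true class.
    scores = np.where(cal_labels == 1, 1 - cal_probs, cal_probs)
    n = len(scores)
    # Finite-sample-corrected quantile of the calibration scores.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, level, method="higher")
    sets = []
    for p in test_probs:
        s = set()
        if 1 - p <= q:   # "presence" is still plausible
            s.add(1)
        if p <= q:       # "absence" is still plausible
            s.add(0)
        sets.append(s)
    return sets
```

A set containing both labels flags a location where the model cannot confidently decide; a singleton set is a confident call. This is how uncertainty becomes an explicit output rather than a hidden flaw.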

Bigfoot as a Test

Rather than relying on simulated ecological data, Poisot turned to an unusual dataset: Bigfoot sightings. Bigfoot, also known as Sasquatch, is a creature said to inhabit forests in North America, particularly in the Pacific Northwest.

The researcher had a practical reason for choosing Bigfoot data: simulated data is often too clean and unrealistic compared to real-world observations.

Bigfoot enthusiasts have collected databases of sighting reports. These records include locations, dates, and environmental conditions, making them useful for testing prediction models.

Poisot said the Bigfoot dataset provides a good example of uncertain observational data. By using it, researchers could evaluate how well the conformal prediction method works under realistic conditions.

The study showed that the approach could successfully map uncertainty across regions while producing more transparent biodiversity predictions.

Mapping Uncertainty

One big advantage of the approach is its flexibility. Scientists and policymakers can adjust how much uncertainty they are willing to tolerate.

For example, detecting a species early may require accepting higher uncertainty. On the other hand, protecting a rare species through expensive conservation programs may demand stronger confidence.
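In split conformal prediction, that tolerance is a single knob, conventionally called alpha. The sketch below (with made-up calibration scores; the `threshold` helper is hypothetical, not from the study) shows how a stricter error tolerance raises the calibration threshold, which in turn widens the prediction sets:

```python
import numpy as np

def threshold(scores, alpha):
    """Finite-sample conformal quantile of calibration scores.
    A smaller alpha (stricter confidence requirement) yields a larger
    threshold, so more labels pass it and prediction sets get wider."""
    n = len(scores)
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    return np.quantile(scores, level, method="higher")

# Illustrative nonconformity scores from a calibration set.
scores = np.linspace(0.05, 0.95, 19)

# Early-detection screening: accept a 20% error rate -> looser threshold.
loose = threshold(scores, alpha=0.20)

# Expensive conservation program: accept only 5% -> stricter threshold.
strict = threshold(scores, alpha=0.05)
```

The stricter setting always produces a threshold at least as large as the looser one, trading sharper predictions for stronger coverage guarantees.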

The method also allows uncertainty to be projected into the future. This is important for climate change research, where environmental conditions are expected to shift over time.

Future of Conservation

The findings could have implications for biodiversity protection and environmental planning. By quantifying uncertainty, scientists can help policymakers make more informed decisions.

The study also highlights how machine-learning techniques can strengthen ecological research. As climate change accelerates, tools that combine AI with ecological science may become essential.

Although Bigfoot itself remains unproven, it has helped researchers develop a framework for studying real species and habitats.

Poisot’s work demonstrates that even unconventional datasets can reveal scientific insights. It offers conservation scientists a way to evaluate confidence in their predictions, helping improve future biodiversity strategies.

Read the press release here

