AI is transforming materials science, weather forecasting & biodiversity—
discoveries beyond medicine that are reshaping our planet’s future.
The Day Science Stopped Having Borders
Imagine you are standing in front of three doors.
Behind Door Number One: a machine that can design new battery materials from scratch — materials that would take human chemists centuries to discover through trial and error.
Behind Door Number Two: a weather oracle that can peer ten days into the future, predict the path of a hurricane with nine days’ notice, and do all of this in under one minute on hardware you could buy off the shelf.
Behind Door Number Three: an enzyme — a molecular machine conjured partly by artificial intelligence — that eats plastic in hours, solving one of humanity’s ugliest environmental embarrassments.
The twist? All three doors are already open. The machines exist. The discoveries are real. And what they have in common is the restless, pattern-hungry mind of artificial intelligence — a technology that, it turns out, doesn’t care which scientific department it works in.
Welcome to Post 6 of our AI in Science & Medicine series — where we leave the clinic and the pharmaceutical lab behind and step into a much, much bigger laboratory. One that encompasses the periodic table, the atmosphere, the ocean floor, and the microbial world teeming under your feet. Science has never been confined to medicine, and neither should our conversation about AI.
Buckle up. This one goes everywhere.
Chapter One: The Alchemist’s Dream, Now Digital
For centuries, the alchemist’s fantasy was transformation — turning base metals into gold, coaxing matter to bend to human will. They failed, of course. But they were onto something philosophically profound: the conviction that the physical world, at its core, is a puzzle with solutions, if only you could find the right combinations.
Modern materials science is, in many ways, alchemy’s honest descendant. Instead of mystical transmutation, researchers work with crystal structures, electron configurations, and quantum mechanics to discover materials that can store more energy, conduct electricity more efficiently, or withstand temperatures that would reduce ordinary matter to vapor. The problem has always been the sheer immensity of the search space.
Consider: scientists estimate there are roughly 10⁶⁰ possible molecular compounds that could theoretically exist. That is a number so large that calling it “astronomical” is itself an understatement — the observable universe contains only about 10⁸⁰ atoms. Exploring even a fraction of that space through traditional laboratory synthesis, one compound at a time, would require more lifetimes than humanity has had.
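A back-of-the-envelope sketch makes the combinatorics concrete. Assume, purely for illustration, that a candidate material combines four distinct elements chosen from roughly 100 practical ones, with small integer stoichiometries; even that crude slice of the space runs into the tens of billions:

```python
from math import comb

# Purely illustrative assumptions: choose 4 distinct elements out of ~100
# practical ones, and let each appear 1-8 times in the formula unit.
elements = 100
element_choices = comb(elements, 4)   # which 4 elements to combine
stoichiometries = 8 ** 4              # how many atoms of each

candidates = element_choices * stoichiometries
print(f"{candidates:,} four-element candidates under these toy assumptions")
```

And that ignores crystal structure entirely: the same formula can crystallize in many distinct atomic arrangements, which is exactly the dimension models like GNoME explore.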
Then, in November 2023, DeepMind dropped a scientific thunderclap. Their model, GNoME — short for Graph Networks for Materials Exploration — predicted the existence and stability of 2.2 million new crystal structures. Of those, 380,000 were identified as stable enough for real-world synthesis. To put that in perspective: the previous decade of computational materials discovery had yielded roughly 28,000 new stable materials. GNoME matched that and then obliterated it, generating what researchers described as the equivalent of nearly 800 years’ worth of knowledge (DeepMind, 2023).
| ⚡ By the Numbers: GNoME’s Discovery 2.2 million new crystal structures predicted • 380,000 confirmed stable • 52,000 new graphene-like layered compounds (potential superconductors) • 528 new lithium-ion conductors for next-generation batteries — all generated in one research cycle. |
Among the haul: 52,000 new layered compounds similar to graphene — the material that launched a thousand research programs when it was discovered in 2004 — and 528 new lithium-ion conductors, directly relevant to the batteries powering everything from electric vehicles to the devices you’re reading this on.
Crucially, DeepMind’s collaborators at Berkeley Lab demonstrated that a robotic autonomous laboratory could then synthesize 41 of these predicted materials in rapid succession — closing the loop between digital prediction and physical reality (DeepMind, 2023). The atom has met its match: an AI that can sketch its portrait before it even exists.
“It is in this collaboration between people and algorithms that incredible scientific progress lies over the next few decades.”
— Demis Hassabis, CEO and Co-founder, Google DeepMind (Financial Times, 2017)
But GNoME is just the beginning of AI’s materials adventure. At Tohoku University and Fujitsu, researchers used the AI platform Fujitsu Kozuchi to automatically decode the superconductivity mechanism of cesium vanadium antimonide — a potential high-temperature superconductor — by analyzing data from the NanoTerasu Synchrotron Light Source, one of the world’s most advanced light sources (Fujita et al., 2025). The approach, published in Scientific Reports, used AI-driven causal discovery to extract relationships from experimental data that would have taken researchers years to untangle manually.
And then there is the dream that haunts every materials scientist: the room-temperature superconductor. A material that conducts electricity with zero resistance at everyday temperatures would, at a stroke, revolutionize power grids, make magnetic levitation trains economically viable, and transform computing hardware. Startups including Periodic Labs (founded by DeepMind’s former materials discovery lead Dogus Cubuk) and Lila Sciences are actively deploying AI to hunt for it, using generative models to propose novel quantum materials rather than just screening existing ones (MIT Technology Review, 2025). They have not found it yet. But they are, for the first time in decades, narrowing the search.
Meanwhile, AI is also rewriting the story of one of the most urgent environmental challenges of our time: plastic pollution. Polyethylene terephthalate, or PET, makes up 12% of the world’s solid waste. It surrounds us in bottles, packaging, and containers — and in natural environments, it can take centuries to degrade.
In 2022, researchers at the University of Texas at Austin used a machine learning model to engineer a dramatically improved version of a plastic-eating enzyme called PETase. The result — FAST-PETase (Functional, Active, Stable, and Tolerant PETase) — can break down PET plastic waste in days, operating at temperatures below 50°C, making it cheap enough for industrial scale. The ML model identified five critical mutations that transformed a sluggish natural enzyme into a molecular wrecking ball. The team proved it could almost completely degrade 51 different post-consumer plastic products within a week and demonstrated a closed-loop recycling process — breaking plastic down and resynthesizing it into new PET (Lu et al., 2022).
“This work really demonstrates the power of bringing together different disciplines, from synthetic biology to chemical engineering to artificial intelligence.”
— Andrew Ellington, Professor of Molecular Biosciences, University of Texas at Austin; lead researcher, FAST-PETase project (GEN, 2022)
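To make the idea of ML-guided enzyme engineering concrete, here is a heavily simplified sketch. The sequence fragment, the scoring heuristic, and every score below are invented stand-ins; the actual study used a structure-based deep learning model to flag problematic residues in wild-type PETase, not this toy brute-force search:

```python
# Toy sketch of ML-guided protein engineering: enumerate single-point
# mutations, score them with a surrogate model, and keep the best few.
WILD_TYPE = "MNFPRASRLM"  # illustrative fragment, NOT the real PETase sequence
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def surrogate_score(seq: str) -> float:
    """Stand-in for a learned stability/activity predictor (invented heuristic)."""
    hydrophilic = set("DEKNQRST")
    # Reward hydrophilic residues at even positions -- purely for illustration.
    return sum(1.0 for i, aa in enumerate(seq) if i % 2 == 0 and aa in hydrophilic)

def best_single_mutants(seq: str, top_k: int = 5):
    """Enumerate all single-point mutants and return the top_k by score."""
    candidates = []
    for pos in range(len(seq)):
        for aa in AMINO_ACIDS:
            if aa == seq[pos]:
                continue  # skip the wild-type residue itself
            mutant = seq[:pos] + aa + seq[pos + 1:]
            candidates.append((surrogate_score(mutant), pos, aa))
    candidates.sort(reverse=True)  # highest score first
    return candidates[:top_k]

top = best_single_mutants(WILD_TYPE)
for score, pos, aa in top:
    print(f"{WILD_TYPE[pos]}{pos + 1}{aa}: score {score:.1f}")
```

The real payoff of a learned model is that it replaces this exhaustive loop with a predictor that generalizes from structure, which is how a handful of high-value mutations can be found in a search space far too large to enumerate.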
The philosophical implication is quietly staggering. Nature needed decades of bacterial evolution to stumble upon a plastic-eating enzyme; AI-accelerated engineering replicated and vastly improved it in a single research cycle. The scale of acceleration is not merely incremental. It is categorical.
Chapter Two: Teaching the Weather to Tell the Truth
There is a reason weather forecasters have historically been the butt of jokes. Predicting the chaotic, turbulent, perpetually restless atmosphere of a planet is, to put it gently, hard. Numerical weather prediction — the traditional approach, where massive supercomputers crunch physics equations representing atmospheric dynamics — has been refined over decades. It is genuinely impressive. It is also extraordinarily expensive in computational terms, requiring tens of thousands of processors running for hours to generate a 10-day global forecast.
In November 2023, Google DeepMind introduced GraphCast, a machine learning model trained on nearly 40 years of atmospheric reanalysis data. The results were, by any honest measure, remarkable: GraphCast outperformed the industry gold standard — the European Centre for Medium-Range Weather Forecasts’ High-Resolution Forecast system — on 90% of 1,380 verification targets. When analysis was restricted to the troposphere, the layer of atmosphere where virtually all weather events happen, GraphCast beat the traditional system on 99.7% of test variables (Lam et al., 2023).
The speed difference was almost insulting. Traditional numerical systems: hours, on supercomputer clusters with tens of thousands of processors. GraphCast: under one minute, on a single Google TPU machine — and approximately 1,000 times more energy-efficient (World Economic Forum, 2023).
The real-world test came during Hurricane Lee in September 2023. GraphCast accurately predicted the storm’s Nova Scotia landfall nine days in advance. Conventional forecasting systems only pinpointed that destination six days out — a three-day improvement that, for evacuation planning and emergency response, could represent thousands of lives saved.
| Metric | GraphCast result |
| --- | --- |
| Forecast accuracy vs. industry standard | 90% of 1,380 metrics outperformed |
| Tropospheric accuracy advantage | 99.7% of test variables |
| Forecast generation time | < 1 minute (vs. hours) |
| Energy efficiency gain | ~1,000× more efficient |
| Hurricane Lee prediction lead time | 9 days (vs. 6 days traditional) |
| Training data span | ~40 years of atmospheric reanalysis data |
DeepMind did not stop there. In December 2024, they published GenCast in Nature — a probabilistic ensemble model that generates 50 or more possible weather scenarios simultaneously, rather than a single deterministic forecast (Price et al., 2024). GenCast outperformed the top operational ensemble system on 97.2% of 1,320 test combinations. It generates each ensemble forecast in 8 minutes on a single Google Cloud TPU chip, compared to hours on supercomputers with tens of thousands of processors — and it delivers better predictions of extreme weather events including heat waves, strong winds, and cyclones.
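The ensemble idea itself is simple to state in code. In this minimal sketch, each of 50 “members” is just a single simulated wind-speed number rather than a full global weather state, and both the distribution and the damage threshold are arbitrary example values:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical illustration: 50 ensemble members, each forecasting peak
# wind speed (m/s) at one location. Real ensemble members are complete
# global forecasts; here each member is just one number.
members = rng.normal(loc=22.0, scale=4.0, size=50)

ensemble_mean = members.mean()

# Probabilistic output: the fraction of members exceeding a threshold
# serves as an estimate of the probability of the extreme event.
THRESHOLD = 25.0  # m/s, an arbitrary example value
p_extreme = (members > THRESHOLD).mean()

print(f"Ensemble mean wind: {ensemble_mean:.1f} m/s")
print(f"P(wind > {THRESHOLD} m/s) = {p_extreme:.2f}")
```

This is why an ensemble beats a single deterministic run for emergency planning: instead of one answer that is either right or wrong, decision-makers get a probability they can weigh against the cost of acting.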
NOAA took notice. In late 2025, the agency deployed a new suite of AI-driven global weather models, AIGFS and AIGEFS, built partly on GraphCast’s foundations and fine-tuned with NOAA’s own atmospheric data. The AIGEFS, a 31-member AI ensemble, achieves forecast skill comparable to the operational GEFS while requiring only 9% of the computing resources (NOAA, 2025). The operational gold standard is quietly, deliberately being replaced.
The climate science implications extend far beyond forecasting. AI-powered systems are now being used to model long-term climate trends, identify tipping points in complex Earth systems, and guide decisions about renewable energy deployment. The same pattern-recognition capabilities that identify a hurricane’s track can trace atmospheric river systems associated with catastrophic flooding, or predict the onset of dangerous heat waves with greater precision and earlier warning than ever before.
Chapter Three: Nature’s Data, Finally Decoded
Every square kilometer of Earth’s surface is, in a sense, a laboratory — teeming with organisms interacting in ways that science has barely begun to catalogue. The biodiversity crisis is, partly, a knowledge crisis: we do not know what we have, so we struggle to protect it. Estimates suggest that the majority of the world’s species have not yet been formally described by science.
AI is beginning to change this with remarkable speed. Consider BirdNET, a deep learning system developed by the Cornell Lab of Ornithology and Chemnitz University of Technology: it can now identify approximately 3,000 of the most common bird species worldwide from audio recordings alone (Kahl et al., 2021). Deploy that tool on a network of autonomous recording units scattered through a forest, and you have effectively deployed a distributed biodiversity monitoring system capable of running continuously without human presence — tracking not just species occurrence but ecosystem health, breeding seasons, and responses to climate change over time.
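Downstream of the classifier, the monitoring logic is mostly bookkeeping. A hedged sketch, with invented unit names, species, and confidence scores standing in for real classifier output:

```python
from collections import Counter

# Hypothetical detections from three autonomous recording units, in the
# shape an audio classifier might emit: (unit, species, confidence).
# All values below are invented for illustration.
detections = [
    ("unit-01", "Turdus merula", 0.91),
    ("unit-01", "Erithacus rubecula", 0.44),
    ("unit-02", "Turdus merula", 0.87),
    ("unit-03", "Cyanistes caeruleus", 0.79),
    ("unit-03", "Turdus merula", 0.30),
]

CONFIDENCE_CUTOFF = 0.5  # discard low-confidence calls

confident = [(u, s) for u, s, c in detections if c >= CONFIDENCE_CUTOFF]
counts = Counter(species for _, species in confident)
richness = len(counts)  # number of distinct species confidently detected

print(f"Species richness: {richness}")
print(counts.most_common())
```

Run continuously across hundreds of units, this kind of tally becomes a time series of species occurrence, which is what turns raw detections into an ecosystem-health signal.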
Research published in Trends in Ecology & Evolution in December 2024 outlined an international horizon scan of AI applications for conservation, with 21 key applications identified by conservation scientists and AI experts (Pollock et al., 2024). These include using AI to identify previously unknown “dark diversity” — species that should theoretically exist in an area based on habitat suitability but whose absence is itself ecologically meaningful — and deploying multimodal models that combine image, audio, DNA sequence, and text data to build richer biodiversity maps than any single data type could yield.
The Amazon is perhaps the most dramatic theatre of AI-enabled conservation. Project Guacamaya, working with Microsoft’s AI for Good Lab, uses solar-powered microphones, satellite imagery, camera traps, and bioacoustic analysis to monitor real-time soundscapes across tropical forest landscapes, protecting biodiversity and flagging threats like illegal logging and poaching (World Economic Forum, 2025). The forest itself, in a sense, is learning to speak — and AI is learning to listen.
Coral reefs present a similarly urgent case. Covering less than 0.1% of the ocean floor but supporting an estimated 25% of all marine species, they are collapsing under the combined pressure of warming, acidification, and coastal development. AI systems trained on sonar recordings and underwater video are now being used to assess reef health from acoustic signatures — the clicks, crunches, and biological noise of a thriving reef sound measurably different from a dying one — enabling conservationists to monitor vast expanses of ocean that no team of human divers could ever survey (Williams et al., 2022).
Then there are wildfires. In Canada, TELUS is integrating connected technologies into post-wildfire forest restoration, using AI to guide replanting decisions. Pano AI’s platform combines sensor networks with predictive modeling to identify early-stage wildfires before they become catastrophic, protecting both ecosystems and human communities (World Economic Forum, 2025). In a world where climate change is making fire seasons longer, hotter, and more unpredictable, early AI-powered detection is not a luxury — it is an emergency infrastructure.
| 🌿 The Biodiversity Knowledge Gap Research published in Nature Reviews Biodiversity (2025) identified seven critical shortfalls in global biodiversity data. AI applications are being developed to address all seven, including mapping species distributions, tracking population changes, identifying ecological functions, monitoring threats, and predicting extinction risk. The same deep learning architectures accelerating drug discovery are now being turned on the tree of life. |
Chapter Four: The Microbes Are Running the Show (And AI Is Finally Paying Attention)
Here is a fact that should recalibrate your sense of biological scale: the human body contains roughly as many microbial cells as human cells. The microbiome — the vast, largely unmapped ecosystem of bacteria, fungi, viruses, and archaea living in and on us and everywhere in the environment — runs metabolic processes, modulates immune systems, produces compounds, and degrades waste in ways that science is only beginning to systematically understand.
AI is now being applied to decode this microbial world at unprecedented speed. Deep learning models trained on environmental DNA sequences are being used to identify novel enzymes with industrial applications — the FAST-PETase story is one example, but it is far from the only one. A January 2025 paper in The ISME Journal described how researchers mined metagenomic data from hydrothermal sediments in the Guaymas Basin — one of the most extreme environments on Earth — and discovered a novel archaeal PETase enzyme (GuaPA), the first enzyme from the Archaea domain capable of degrading PET plastic (Acosta et al., 2025). The universe of plastic-eating biology, it turns out, is much larger than anyone suspected, and AI-assisted metagenomic mining is the tool revealing it.
In synthetic biology more broadly, the 2024 Nobel Prize in Chemistry — awarded in part to David Baker of the University of Washington for computationally designed proteins — validated a decade of work demonstrating that AI can not only predict the structures of existing proteins (as AlphaFold does) but actively design entirely new ones with specified functions. AI-assisted “biofoundries” — automated labs where robotic systems design, synthesize, test, and iterate on biological constructs with AI guidance — are compressing the design-build-test-learn cycles of synthetic biology from months to days (Bloomsbury Intelligence and Security Institute, 2025).
The neuroscience frontier is also lighting up. Brain-computer interfaces — devices that translate neural signals into digital commands — are moving from science fiction to clinical practice faster than almost any other field anticipated. In August 2024, China’s NEO system became the country’s first brain-computer interface product to enter the Innovative Medical Devices Special Review Procedure; early recipients with spinal cord injuries reportedly began ambulatory recovery within 72 hours of implantation. AI is central to the signal processing required to make this work: distinguishing meaningful neural activity from noise in real time requires machine learning architectures running at biological speeds (PMC, 2025).
Chapter Five: The Cartographer’s Dilemma — Who Owns the Map?
Every story about scientific progress eventually confronts a harder question. Not “can we do this?” but “who gets to do this, and for whom?”
The same AI tools discovering new materials, predicting weather patterns, and mapping biodiversity are largely being developed by a small number of large technology companies — DeepMind, Microsoft, Google, and their close academic partners — operating primarily in wealthy industrialized nations. The computational infrastructure required to train foundation models for materials science or climate prediction is staggeringly expensive, placing it beyond the reach of most research institutions in the Global South.
This creates what researchers are calling an “AI colonialism” risk in conservation: the possibility that AI tools are deployed in biodiversity-rich but economically poor regions by organizations from wealthier countries, extracting data and insights without meaningful benefit flowing back to local communities or governments (Pollock et al., 2024). The forests of the Amazon, the reefs of Southeast Asia, the savannas of sub-Saharan Africa — these are the regions where biodiversity monitoring is most urgently needed and where AI deployment is most likely to be controlled by outside institutions.
There is a related question about what happens to scientific labor as AI accelerates discovery. When a single model can propose 2.2 million new crystal structures in a single research cycle — work that would have required an unimaginable army of chemists working for centuries — what happens to the humans who traditionally did that exploratory work? The question is not merely economic. Science, as a human endeavor, is partly about the disciplined practice of attention: the years of careful observation that build not just knowledge but judgment, intuition, and the capacity to ask the next question.
The 2024 Nobel Prize in Chemistry encapsulated this tension perfectly. Demis Hassabis and John Jumper of DeepMind and David Baker of the University of Washington were honored — justly — for breakthroughs in protein structure prediction and design. But the Nobel Committee was essentially awarding a prize for building a tool that has now largely automated what was previously one of the most painstaking intellectual endeavors in biology. The tool is extraordinary. The question of what happens to the practitioners it displaces, and whether the benefits are equitably distributed, is not answered by the prize citation.
Perhaps the most honest framing is this: AI in science is an extraordinary cartographer. It can map the territory of the possible — crystal structures, protein folds, species distributions, climate futures — faster and more comprehensively than any human expedition could. But maps do not determine who travels, who benefits from what is found, or who decides what to do with it. Those decisions remain irreducibly human, irreducibly political, and irreducibly urgent.
“Labs around the world, including my own, are using his AI tools to tackle rare genetic diseases, antibiotic resistance, and even climate-driven challenges in agriculture.”
— Time 100 profile of Demis Hassabis, on the real-world reach of DeepMind’s tools (Time, 2025)
Key Takeaways
1. Materials science has entered a new era: AI models like GNoME have discovered 2.2 million new crystal structures — the equivalent of 800 years of human scientific labor — opening pathways to next-generation batteries, superconductors, and clean energy materials.
2. Weather forecasting is being transformed: GraphCast and GenCast outperform gold-standard systems on 90%+ of verification targets, running in minutes rather than hours, and are already being deployed operationally by NOAA.
3. Biodiversity monitoring is going digital at scale: AI tools like BirdNET identify thousands of species from sound, while Amazon monitoring platforms use bioacoustic AI to protect ecosystems in real time.
4. AI-designed enzymes are solving environmental crises: FAST-PETase, engineered with machine learning, can degrade plastic waste in days — a capability that took nature decades to evolve.
5. The equity question is real and urgent: the benefits of AI in science are concentrated in wealthy institutions; global governance frameworks are urgently needed to ensure equitable access and benefit-sharing.
Glossary of Key Terms
- Graph Neural Network (GNN): A machine learning architecture that represents data as graphs — nodes connected by edges — particularly effective for modeling molecular structures and crystal lattices where relationships between atoms define material properties.
- Biofoundry: An automated laboratory infrastructure that combines robotics, AI-guided design, and high-throughput testing to accelerate the synthetic biology design-build-test-learn cycle.
- Metagenomics: The study of genetic material recovered directly from environmental samples — soil, water, sediment — without first culturing organisms in a lab. AI enables rapid analysis of the massive datasets this generates.
- PETase: An enzyme capable of breaking down polyethylene terephthalate (PET) plastic. FAST-PETase is an AI-engineered variant with dramatically improved speed and stability.
- Ensemble Forecast: A weather prediction approach that generates multiple possible future weather scenarios simultaneously, providing a probability distribution of outcomes rather than a single forecast.
- Inverse Design: Rather than predicting properties of a given material, AI performs inverse design by starting with desired properties and working backward to propose materials or molecular structures that should exhibit them.
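The graph neural network entry above is easier to grasp with a toy example. This sketch runs one message-passing step over a hypothetical 4-atom “molecule”; it shows the bare mechanism, not any particular GNoME or GraphCast architecture:

```python
import numpy as np

# Adjacency matrix for a toy 4-atom molecule: A[i, j] = 1 if atoms i and j
# are bonded (undirected edges). Atom 1 is bonded to atoms 0, 2, and 3.
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [0, 1, 0, 0],
], dtype=float)

# One feature per node (e.g. a scalar atomic descriptor, invented values).
H = np.array([[1.0], [2.0], [3.0], [4.0]])

W = np.array([[0.5]])  # a learnable weight, fixed here for the sketch

def message_passing_step(A, H, W):
    """Average each node's neighbour features, then apply a linear map + ReLU."""
    degree = A.sum(axis=1, keepdims=True)            # neighbours per node
    neighbour_mean = (A @ H) / np.maximum(degree, 1)  # aggregate messages
    return np.maximum(0.0, neighbour_mean @ W)        # transform + ReLU

H1 = message_passing_step(A, H, W)
print(H1.ravel())
```

Stacking several such steps lets information propagate across the whole graph, which is how relationships between distant atoms end up encoded in each node’s features.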
Reference List
- Acosta, D. J., Barth, D. R., Bondy, J., Appler, K. E., De Anda, V., Ngo, P. H. T., Alper, H. S., Baker, B. J., Marcotte, E. M., & Ellington, A. D. (2025). Plastic degradation by enzymes from uncultured deep sea microorganisms. The ISME Journal, 19(1), wraf068. https://doi.org/10.1093/ismejo/wraf068
- Bloomsbury Intelligence and Security Institute. (2025, November 12). AI and synthetic biology: The new frontier of promise and power. https://bisi.org.uk/reports/ai-and-synthetic-biology-the-new-frontier-of-promise-and-power
- DeepMind. (2023, November). Millions of new materials discovered with deep learning. Google DeepMind Blog. https://deepmind.google/blog/millions-of-new-materials-discovered-with-deep-learning/
- Fujita, K., et al. (2025). Extracting causality from spectroscopy. Scientific Reports. https://doi.org/10.1038/s41598-025-29687-8
- GEN (Genetic Engineering & Biotechnology News). (2022). Fast and efficient plastic-degrading enzyme developed using AI. https://www.genengnews.com/news/fast-and-efficient-plastic-degrading-enzyme-developed-using-ai/
- Hassabis, D. (2017, March). [Quote]. Financial Times. Cited in AIIFI. (2025). 9 Demis Hassabis quotes: DeepMind CEO predicts AGI in 5–10 years. https://www.aiifi.ai/post/demis-hassabis-quotes
- Kahl, S., Wood, C. M., Eibl, M., & Klinck, H. (2021). BirdNET: A deep learning solution for avian diversity monitoring. Ecological Informatics, 61, 101236. https://doi.org/10.1016/j.ecoinf.2021.101236
- Lam, R., Sanchez-Gonzalez, A., Willson, M., Wirnsberger, P., Fortunato, M., Alet, F., Ravuri, S., Ewalds, T., Eaton-Rosen, Z., Hu, W., Merose, A., Hoyer, S., Holland, G., Vinyals, O., Stott, J., Pritzel, A., Mohamed, S., & Battaglia, P. (2023). Learning skillful medium-range global weather forecasting. Science, 382, 1416–1421. https://doi.org/10.1126/science.adi2336
- Lu, H., Diaz, D. J., Czarnecki, N. J., Zhu, C., Kim, W., Shroff, R., Acosta, D. J., Alexander, B. R., Cole, H. O., Zhang, Y., Lynd, N. A., Ellington, A. D., & Alper, H. S. (2022). Machine learning-aided engineering of hydrolases for PET depolymerization. Nature, 604, 662–667. https://doi.org/10.1038/s41586-022-04599-z
- MIT Technology Review. (2025, December 15). AI materials discovery now needs to move into the real world. https://www.technologyreview.com/2025/12/15/1129210/ai-materials-science-discovery-startups-investment/
- NOAA. (2025). NOAA deploys new generation of AI-driven global weather models. https://www.noaa.gov/news-release/noaa-deploys-new-generation-of-ai-driven-global-weather-models
- Pollock, L. J., et al. (2024). The potential for AI to revolutionize conservation: A horizon scan. Trends in Ecology & Evolution. https://doi.org/10.1016/j.tree.2024.08.015
- Price, I., Sanchez-Gonzalez, A., Yang, F., Stott, J., Holland, G., Lam, R., Bouqueau, O., Bromberg, J., Peters, J., Ewalds, T., & Battaglia, P. (2024). Probabilistic weather forecasting with machine learning. Nature, 637, 84–90. https://doi.org/10.1038/s41586-024-08252-9
- Time Magazine. (2025). Demis Hassabis: The 100 Most Influential People of 2025. https://time.com/collections/time100-ai-2024/7012767/demis-hassabis/
- World Economic Forum. (2023, December). AI can now outperform conventional weather forecasting — in under a minute, too. https://www.weforum.org/stories/2023/12/ai-weather-forecasting-climate-crisis/
- World Economic Forum. (2025, October). Responsible use of AI for nature protection and preservation. https://www.weforum.org/stories/2025/10/ai-companies-protect-restore-nature/
Additional Reading
1. Lam, R., et al. (2023). Learning skillful medium-range global weather forecasting. Science, 382, 1416–1421. — The original GraphCast paper, freely accessible via Science journal.
2. Pollock, L. J., et al. (2024). The potential for AI to revolutionize conservation: A horizon scan. Trends in Ecology & Evolution. — Essential reading for anyone interested in the intersection of AI and biodiversity conservation.
3. Price, I., et al. (2024). Probabilistic weather forecasting with machine learning. Nature, 637, 84–90. — The GenCast paper, advancing ensemble AI weather forecasting.
4. Ma, Y., Gao, Y., Wang, L., et al. (2025). Accelerating materials discovery through active learning: Methods, challenges and opportunities. The Innovation Informatics, 1, 100013. — A comprehensive technical review of how active learning is changing materials science.
5. Pollock, L. J., et al. (2025). Harnessing artificial intelligence to fill global shortfalls in biodiversity knowledge. Nature Reviews Biodiversity. — A key review mapping AI’s potential to close the seven identified gaps in what we know about life on Earth.
Additional Resources
1. Google DeepMind — Materials Science Research: deepmind.google — Follow ongoing GNoME, AlphaFold, and Genesis project updates from the lab driving many of the breakthroughs discussed in this post.
2. NOAA AI Weather Forecasting: noaa.gov — The official home of NOAA’s AI-enhanced global weather models, including public documentation of the AIGFS and AIGEFS deployment.
3. Cornell Lab of Ornithology — BirdNET: birdnet.cornell.edu — The public interface for the AI bird identification tool; also a window into how bioacoustic AI is being democratized for citizen scientists.
4. The Materials Project: materialsproject.org — An open-access database of computed information on known and predicted materials, serving as the foundational dataset for many AI-driven materials discovery efforts including GNoME.
5. Global Forest Watch: globalforestwatch.org — Uses satellite imagery and AI analytics to provide real-time monitoring of forest cover change worldwide, a practical example of AI-powered environmental surveillance at planetary scale.