
In today’s AI news, DeepSeek said in a social media post that the daily releases it is planning for its Open Source Week would provide visibility into “these humble building blocks in our online service that have been documented, deployed and battle-tested in production. As part of the open-source community, we believe that every line shared becomes collective momentum that accelerates the journey.”

In other advancements, Together AI, an AI cloud platform that enables companies to train and deploy artificial intelligence models, has raised $305 million in Series B funding in a round led by General Catalyst, more than doubling its valuation to $3.3 billion from $1.25 billion last March. The funding comes amid growing demand for computing power to run advanced open-source models.

In personal and professional development, if you’re curious about how to integrate AI smartly into your business, here are some friendly tips to get you started while keeping things safe and effective. The key is strategic integration with safeguards in place: use AI’s strengths without losing your own.

Then, search startup Genspark has raised $100 million in a Series A funding round, valuing the startup at $530 million, according to a source familiar with the matter, as the race to use artificial intelligence to disrupt Google’s stranglehold on the search engine market heats up. The Palo Alto-based company currently has over 2 million monthly active users, and the round was led by a group of U.S. and Singapore-based investors.

The story also looks at what it’s like to compete with Google, and what the future of search could look like. Then, as AI scales from the cloud to the very edges of our devices, the potential for transformative innovation grows exponentially. In this Imagination In Action session at Davos, Daniel Newman, CEO of The Futurum Group, moderates an expert panel that includes Åsa Tamsons, Executive VP, Ericsson; Gill Pratt, CEO, Toyota Research, and Chief Scientist, Toyota; Kinuko Masaki, CEO, VoiceBrain; Cyril Perducat, CTO, Rockwell Automation; and Alexander Amini, CSO, Liquid AI.

An interesting glimpse into the adventurous world of neutrino research in Antarctica!


At McMurdo, Karle must wait for the weather to permit the final leg of the trip. “It is not uncommon to spend several days in McMurdo,” he says. (Karle’s record is 10.) When it’s time, he takes a 3.5-hour flight on a ski-equipped LC-130 aircraft to reach the South Pole. Anyone or anything else that goes to the South Pole must take a similarly tedious route.

There’s a reason scientists have endured the challenges of the climate, the commute and the cost for over half a century—since members of the US Navy completed the original Amundsen–Scott South Pole Station in 1957. Despite all the trouble it takes to get there, the South Pole is an unparalleled environment for scientific research, from climate science and glaciology to particle physics and astrophysics.

This sentiment was echoed by the Particle Physics Project Prioritization Panel in its 2023 report, a decadal plan for the future of particle physics research in the United States. Under its recommendation to “Construct a portfolio of major projects that collectively study nearly all fundamental constituents of our universe and their interactions,” the report prioritized support for five specific projects—two of which are located at the South Pole: cosmic microwave background experiment CMB-S4, the top priority, and neutrino experiment IceCube-Gen2, recommended fifth. Because of the high scientific priority of these projects, the report also urged maintenance of the South Pole site.

Glaciers separate from the continental ice sheets in Greenland and Antarctica covered a global area of approximately 706,000 km2 around the year 2000 (ref. 19), with an estimated total volume of 158,170 ± 41,030 km3, equivalent to a potential sea-level rise of 324 ± 84 mm (ref. 20). Glaciers are integral components of Earth’s climate and hydrologic system (ref. 1). Hence, glacier monitoring is essential for understanding and assessing ongoing changes (refs. 21,22), providing a basis for impact (refs. 2–10) and modelling (refs. 11–13) studies, and helping to track progress on limiting climate change (ref. 23).

The four main observation methods used to derive glacier mass changes are glaciological measurements, digital elevation model (DEM) differencing, altimetry and gravimetry. Additional concepts include hybrid approaches that combine different observation methods. In situ glaciological measurements have been carried out at about 500 unevenly distributed glaciers (ref. 24), representing less than 1% of Earth’s glaciers (ref. 19). Glaciological time series provide seasonal-to-annual variability of glacier mass changes (ref. 25). Although these are generally well correlated regionally, long-term trends of individual glaciers might not always be representative of a given region.

Spaceborne observations complement in situ measurements, allowing for glacier monitoring at global scale over recent decades. Several optical and radar sensors allow the derivation of DEMs, which reflect the glacier surface topography. Repeat mapping and calculation of DEM differences provide multi-annual trends in elevation and volume changes (ref. 26) for all glaciers in the world (ref. 27). Similarly, laser and radar altimetry determine elevation changes along linear tracks, which can be extrapolated to calculate regional estimates of glacier elevation and volume change (ref. 28). Unlike DEM differencing, altimetry provides spatially sparse observations but has a high (that is, monthly to annual) temporal resolution (ref. 26).

DEM differencing and altimetry require converting glacier volume to mass changes using density assumptions (ref. 29). Satellite gravimetry estimates regional glacier mass changes at monthly resolution by measuring changes in Earth’s gravitational field after correcting for solid Earth and hydrological effects (refs. 30,31). Although satellite gravimetry provides high temporal resolution and direct estimates of mass, it has a spatial resolution of a few hundred kilometres, which is several orders of magnitude lower than DEM differencing or altimetry (ref. 26).
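The volume-to-mass conversion mentioned above is simple arithmetic once a density is assumed. The sketch below is a generic illustration of that step, not code from the GlaMBIE study: the default density of 850 kg per cubic metre is a commonly used conversion value, and the glacier numbers in the example are made up.

```python
# Illustrative sketch: converting a glacier-wide elevation change
# (from DEM differencing) into a mass change via a density assumption.

def volume_to_mass_change(elevation_change_m, area_km2, density_kg_m3=850.0):
    """Return the mass change in gigatonnes (Gt)."""
    # m * km^2 -> km^3 (divide by 1000 because 1 km = 1000 m)
    volume_change_km3 = elevation_change_m * area_km2 / 1000.0
    # 1 km^3 = 1e9 m^3; mass in kg = volume * density; 1 Gt = 1e12 kg
    return volume_change_km3 * 1e9 * density_kg_m3 / 1e12

# Example: an average thinning of 1 m over a 1,000 km^2 region
# corresponds to a loss of 0.85 Gt under this density assumption.
loss_gt = volume_to_mass_change(-1.0, 1000.0)
```

The density assumption is the main source of uncertainty in this step, which is why gravimetry, which measures mass directly, serves as a useful independent check.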

The heterogeneity of these observation methods in terms of spatial, temporal and observational characteristics, the diversity of approaches within a given method, and the lack of homogenization challenged past assessments of glacier mass changes. In the Intergovernmental Panel on Climate Change (IPCC)’s Sixth Assessment Report (AR6)16, for example, glacier mass changes for the period from 2000 to 2019 relied on DEM differencing from a limited number of global27 and regional studies16. Results from a combination of glaciological and DEM differencing25 as well as from gravimetry30 were used for comparison only. The report calculated regional estimates over a specific baseline period (2000–2019) and as mean mass-change rates based on selected studies per region, which only partly considered the strengths and limitations of the different observation methods.

The spread of reported results—many outside uncertainty margins—and recent updates from different observation methods afford an opportunity to assess regional and global glacier mass loss with a community-led effort. Within the Glacier Mass Balance Intercomparison Exercise (GlaMBIE; https://glambie.org), we collected, homogenized and combined regional results from the observation methods described above to yield a global assessment towards the upcoming IPCC reports of the seventh assessment cycle. At the same time, GlaMBIE provides insights into regional trends and interannual variabilities, quantifies the differences among observation methods, tracks observations within the range of projections, and delivers a refined observational baseline for future impact and modelling studies.

Particles in high-energy nuclear collisions move in a pattern known as Lévy walks, a type of motion found across many scientific fields.

Named after mathematician Paul Lévy, Lévy walks (or, in some cases, Lévy flights) describe a type of random movement seen in nature and various scientific processes. This pattern appears in diverse phenomena, from how predators search for food to economic fluctuations, microbiology, chemical reactions, and even climate dynamics.
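The defining feature of a Lévy walk is easy to see in simulation: directions are uniformly random, but step lengths follow a heavy-tailed power law, so many short steps are punctuated by rare long jumps. The sketch below is a generic illustration of that sampling idea (using inverse-transform sampling of a Pareto distribution), not code tied to the nuclear-collision analysis; the exponent and seed are arbitrary choices.

```python
import math
import random

def levy_walk(n_steps, alpha=1.5, seed=0):
    """Simulate a 2D Levy walk path: uniform random directions with
    power-law step lengths l >= 1 drawn so that P(l > L) = L**(-alpha)."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    path = [(x, y)]
    for _ in range(n_steps):
        # Inverse-transform sampling of a Pareto step length (heavy tail)
        step = (1.0 - rng.random()) ** (-1.0 / alpha)
        theta = rng.uniform(0.0, 2.0 * math.pi)
        x += step * math.cos(theta)
        y += step * math.sin(theta)
        path.append((x, y))
    return path
```

With an exponent in this range the step-length variance diverges, which is what makes the mean squared displacement grow faster than in ordinary Brownian diffusion.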

Lévy walks in high-energy nuclear collisions.

A study led by Professor Ginestra Bianconi from Queen Mary University of London, in collaboration with international researchers, has unveiled a transformative framework for understanding complex systems.

Published in Nature Physics, this paper establishes the new field of higher-order topological dynamics, revealing how the hidden geometry of networks shapes everything from brain activity to artificial intelligence.

“Complex systems like the brain, climate, and next-generation artificial intelligence rely on interactions that extend beyond simple pairwise relationships. Our study reveals the critical role of higher-order networks, structures that capture multi-body interactions, in shaping the dynamics of such systems,” said Professor Bianconi.

Astronomers have mapped the 3D structure of an exoplanet’s atmosphere for the first time, revealing powerful winds that transport elements like iron and titanium. Using all four telescope units of the European Southern Observatory’s Very Large Telescope (ESO’s VLT), researchers uncovered complex weather patterns shaping the planet’s skies. This breakthrough paves the way for more detailed studies of atmospheric composition and climate on distant worlds.

MIT researchers developed a new approach for assessing predictions with a spatial dimension, like forecasting weather or mapping air pollution.

Imagine relying on a weather app to predict next week’s temperature. How do you know you can trust its forecast? Scientists use statistical and physical models to make predictions about everything from weather to air pollution. But checking whether these models are truly reliable is trickier than it seems, especially when the locations where we have validation data don’t match the locations where we want predictions. Traditional validation methods struggle with this problem, failing to provide consistent accuracy in real-world scenarios.

In this work, researchers introduce a new validation approach designed to improve trust in spatial predictions. They define a key requirement: as more validation data becomes available, the accuracy of the validation method should improve indefinitely. They show that existing methods don’t always meet this standard. Instead, they propose an approach inspired by previous work on handling differences in data distributions (known as “covariate shift”) but adapted for spatial prediction. Their method not only meets their strict validation requirement but also outperforms existing techniques in both simulations and real-world data.
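The classic covariate-shift remedy the researchers build on is importance weighting: validation points that resemble the target locations count more toward the error estimate. The sketch below illustrates that general idea only; it is not the MIT team’s actual method, and every function and number in it is hypothetical.

```python
# Generic illustration of importance-weighted validation under covariate
# shift (not the MIT method): reweight held-out points by an estimated
# density ratio p_target(x) / p_validation(x).

def weighted_validation_error(val_points, predict, truth, density_ratio):
    """Weighted mean squared error over the validation points."""
    num = sum(density_ratio(x) * (predict(x) - truth(x)) ** 2 for x in val_points)
    den = sum(density_ratio(x) for x in val_points)
    return num / den

# Toy 1D example: the model's error grows with x, and target locations
# concentrate at large x, so the weighted estimate exceeds the naive one.
val_points = [0.1 * i for i in range(11)]          # validation sites on [0, 1]
truth = lambda x: x                                 # hypothetical ground truth
predict = lambda x: 1.5 * x                         # hypothetical biased model
ratio = lambda x: x                                 # weight toward large x

weighted = weighted_validation_error(val_points, predict, truth, ratio)
naive = sum((predict(x) - truth(x)) ** 2 for x in val_points) / len(val_points)
```

The hard part in practice, and a focus of work in this area, is estimating the density ratio itself from spatial data rather than assuming it is known.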

By refining how we validate predictive models, this work helps ensure that critical forecasts—like air pollution levels or extreme weather events—can be trusted with greater confidence.


A new evaluation method assesses the accuracy of spatial prediction techniques, outperforming traditional methods. This could help scientists make better predictions in areas like weather forecasting, climate research, public health, and ecological management.

Extreme precipitation events in Antarctica, dominated by snowfall due to sub-zero temperatures, also include rainfall, according to new research.

BAS scientists studying atmospheric rivers—narrow bands of concentrated moisture in the atmosphere or “rivers in the sky”—have discovered that these phenomena bring not only snow but also rain to parts of Antarctica, even during the continent’s cold winter months. The results are published in the journal The Cryosphere.

Using cutting-edge regional climate models (RCMs) at a resolution of just one kilometer, the researchers explored how atmospheric rivers interact with Antarctica’s rugged terrain to deliver significant precipitation to key areas, including the Thwaites and Pine Island Ice Shelves in West Antarctica. These are areas known for their ongoing retreat and contribution to global sea-level rise.

What types of new plastics can be developed with enhanced recycling capabilities? This is what a recent study published in Nature hopes to address, as a team of researchers at Cornell University has developed an enhanced type of thermoset, built from a type of polymer that is often difficult to recycle and therefore typically ends up burned, releasing emissions into the atmosphere, or dumped in landfills and waterways, where plastic waste damages marine ecosystems. This study has the potential to help scientists, engineers, policymakers, and the public better understand new recycling methods that can be used to both help the environment and mitigate the impacts of climate change.

For the study, the researchers used a bio-sourced material known as dihydrofuran (DHF) to design and build a new thermoset polymer that maintains its robustness while still being safely recyclable through heat and environmental degradation. Compared to traditional thermosets, the DHF thermosets can still be used for a myriad of commercial applications, including footwear, electronics, and garden hoses, just to name a few.

“We’ve spent 100 years trying to make polymers that last forever, and we’ve realized that’s not actually a good thing,” said Dr. Brett Fors, who is a professor of physical chemistry at Cornell University and a co-author on the study. “Now we’re making polymers that don’t last forever, that can environmentally degrade.”