A how-to for ethical geoengineering research

Over the Northern Hemisphere's summer, global temperatures hovered near 1.5° C above pre-industrial levels, and the catastrophic weather events that ensued provided a preview of what may well be the new normal before mid-century. And the warming won't stop there; on our current emissions trajectory, that temperature increase will roughly double by the time the century is out and keep climbing beyond its end.

This frightening trajectory and its results have led many people to argue that some form of geoengineering is necessary. If we know the effects of that much warming will be catastrophic, why not try canceling some of it out? Unfortunately, the list of "why nots" includes the fact that we don't know how well some of these techniques work or fully understand their unintended consequences. This means more research is required before we put them into practice.

But how do we do that research if there's the risk of unintended consequences? To help guide the process, the American Geophysical Union (AGU) has just released guidelines for ensuring that geoengineering research is conducted ethically.


With four more years like 2023, carbon emissions will blow past 1.5° limit

On Thursday, the United Nations Environment Programme (UNEP) released a report on what it terms the "emissions gap"—the difference between where we're heading and where we'd need to be to achieve the goals set out in the Paris Agreement. It makes for some pretty grim reading. Given last year's greenhouse gas emissions, we can afford fewer than four similar years before we would exceed the total emissions compatible with limiting the planet's warming to 1.5° C above pre-industrial conditions. Following existing policies out to the turn of the century would leave us facing over 3° C of warming.
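The report's headline arithmetic is simple division: a remaining carbon budget split by the current annual emissions rate. A minimal sketch, using round, report-scale figures (roughly 200 GtCO2 of remaining budget for 1.5° C and roughly 57 GtCO2e of annual emissions—both illustrative approximations, not the report's exact values):

```python
# Sketch of the report's carbon-budget arithmetic. The figures are
# round approximations chosen for illustration, not the report's values.

def years_remaining(budget_gt: float, annual_emissions_gt: float) -> float:
    """Years of constant annual emissions before the budget is exhausted."""
    return budget_gt / annual_emissions_gt

# ~200 GtCO2 of remaining budget for 1.5 degrees C, ~57 GtCO2e emitted per year
years = years_remaining(200.0, 57.0)
print(f"~{years:.1f} years of current emissions left")  # ~3.5, i.e. fewer than four
```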

The report ascribes this situation to two distinct emissions gaps: between the goals of the Paris Agreement and what countries have pledged to do and between their pledges and the policies they've actually put in place. There are some reasons to think that rapid progress could be made—the six largest greenhouse gas emitters accounted for nearly two-thirds of the global emissions, so it wouldn't take many policy changes to make a big difference. And the report suggests increased deployment of wind and solar could handle over a quarter of the needed emissions reductions.

But so far, progress has been far too limited to cut into global emissions.


De-extinction company provides a progress report on thylacine efforts

Colossal, the company founded to try to restore the mammoth to the Arctic tundra, has also decided to tackle two other species that went extinct relatively recently: the dodo and the thylacine. Because of significant differences in biology, not the least of which is the generation time of proboscideans, these other efforts may reach many critical milestones well in advance of the work on mammoths.

Late last week, Colossal released a progress report on the work involved in resurrecting the thylacine, also known as the Tasmanian tiger, which went extinct when the last known survivor died in a zoo in 1936. Marsupial biology has some features that may make de-extinction somewhat easier, but we have far less sophisticated ways of manipulating it compared to the technology we've developed for working with the stem cells and reproduction of placental mammals. But, based on these new announcements, the technology available for working with marsupials is expanding rapidly.

Cane toad resistance

Colossal has branched out from its original de-extinction mission to include efforts to keep species from ever needing its services. In the case of marsupial predators, the de-extinction effort is incorporating work that will benefit existing marsupial predators: generating resistance to the toxins found on the cane toad, an invasive species that has spread widely across Australia.


Simple voltage pulse can restore capacity to Li-Si batteries

If you're using a large battery for a specialized purpose—say grid-scale storage or an electric vehicle—then it's possible to tweak the battery chemistry, provide a little bit of excess capacity, and carefully manage its charging and discharging so that it enjoys a long life span. But for consumer electronics, the batteries are smaller, the need for light weight dictates the chemistry, and the demand for quick charging can be higher. So most batteries in our gadgets start to see serious degradation after just a couple of years of use.

A big contributor to that is internal fragmentation of the electrode materials. This leaves some of the electrode material disconnected from the battery's charge-handling system, essentially stranding the material inside the battery and trapping some of the lithium uselessly. Now, researchers have found that, for at least one battery chemistry, it's possible to partially reverse this decay, boosting the remaining capacity of the battery by up to 30 percent.

The only problem is that not many batteries use the specific chemistry tested here. But it does show how understanding what's going on inside batteries can provide us with ways to extend their lifespan.


Amazon joins Google in investing in small modular nuclear power

On Tuesday, Google announced that it had made a power purchase agreement for electricity generated by a small modular nuclear reactor design that hasn't even received regulatory approval yet. Today, it's Amazon's turn. The company's Amazon Web Services (AWS) group has announced three different investments, including one targeting a different startup that has its own design for small, modular nuclear reactors—one that has not yet received regulatory approval.

Unlike Google's deal, which is a commitment to purchase power should the reactors ever be completed, Amazon will lay out some money upfront as part of the agreements. We'll take a look at the deals and technology that Amazon is backing before analyzing why companies are taking a risk on unproven technologies.

Money for utilities and a startup

Two of Amazon's deals are with utilities that serve areas where it already has a significant data center footprint. One of these is Energy Northwest, which is an energy supplier that sends power to utilities in the Pacific Northwest. Amazon is putting up the money for Energy Northwest to study the feasibility of adding small modular reactors to its Columbia Generating Station, which currently houses a single, large reactor. In return, Amazon will get the right to purchase power from an initial installation of four small modular reactors. The site could potentially support additional reactors, which Energy Northwest would be able to use to meet demands from other users.


People think they already know everything they need to make decisions

The world is full of people who have excessive confidence in their own abilities. This is famously captured by the Dunning-Kruger effect, in which people who lack expertise in something necessarily lack the knowledge needed to recognize their own limits. Now, a different set of researchers has come out with what might be viewed as a corollary to Dunning-Kruger: people have a strong tendency to believe that they always have enough data to make an informed decision—regardless of what information they actually have.

The work, done by Hunter Gehlbach, Carly Robinson, and Angus Fletcher, is based on an experiment in which they intentionally gave people only partial, biased information, finding that people never seemed to consider they might only have a partial picture. "Because people assume they have adequate information, they enter judgment and decision-making processes with less humility and more confidence than they might if they were worrying whether they knew the whole story or not," they write. The good news? When given the full picture, most people are willing to change their opinions.

Ignorant but confident

The basic setup of the experiment is very straightforward. The researchers developed a scenario where an ongoing water shortage was forcing a school district to consider closing one of its schools and merging its students into another existing school. They then wrote an article that described the situation and contained seven different pieces of information: three that favored merging, three that disfavored it, and one that was neutral. Just over half of the control group that read the full article favored merging the two schools.


Climate change boosted Milton’s landfall strength from Category 2 to 3

As attempts to clean up after Hurricane Milton are beginning, scientists at the World Weather Attribution project have taken a quick look at whether climate change contributed to its destructive power. While the analysis is limited by the fact that not all the meteorological data is even available yet, by several measures, climate change made aspects of Milton significantly more likely.

This isn't a huge surprise, given that Milton traveled across the same exceptionally warm Gulf of Mexico that Helene had recently transited. But the analysis does produce one striking result: Milton would have been a Category 2 storm at landfall if climate change weren't boosting its strength.

From the oceans to the skies

Hurricanes strengthen while over warm ocean waters, and climate change has been slowly cranking up the heat content of the oceans. But it's important to recognize that the slow warming is an average, and that can include some localized extreme events. This year has seen lots of ocean temperature records set in the Atlantic basin, and that seems to be true in the Gulf of Mexico as well. The researchers note that a different rapid analysis released earlier this week showed that the ocean temperatures—which had boosted Milton to a Category 5 storm during its time in the Gulf—were between 400 and 800 times more likely to exist thanks to climate change.


Rapid analysis finds climate change’s fingerprint on Hurricane Helene

Hurricane Helene crossed the Gulf of Mexico at a time when sea surface temperatures were at record highs and then barreled into a region where heavy rains had left the ground saturated. The result was historic, catastrophic flooding.

One key question is how soon we might expect history to repeat itself. Our rapidly warming planet has tilted the odds in favor of some extreme weather events in a way that means we can expect some events that had been extremely rare to start occurring with some regularity. Our first stab at understanding climate change's influence on Helene was released on Wednesday, and it suggests that rainfall of the sort experienced by the Carolinas may now be a once-in-70-year event, which could have implications for how we rebuild some of the communities shattered by the rain.

Rapid attribution

The quick analysis was done by the World Weather Attribution project, which has developed peer-reviewed methods of looking for the fingerprints of climate change in major weather events. In general, this involves identifying the key weather patterns that produced the event and then exploring their frequency using climate models run with and without the carbon dioxide we've added to the atmosphere.
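The core of that comparison can be sketched as a probability ratio between two model ensembles. Everything below is invented toy data—the distributions and threshold stand in for real model output—but the bookkeeping mirrors the approach: count how often the extreme is exceeded with and without the added warming.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy model ensembles of an extreme-rainfall metric (mm): a
# "counterfactual" world without added CO2 and a "factual" warmed world.
# The Gumbel parameters are invented for the sketch.
counterfactual = rng.gumbel(loc=100.0, scale=20.0, size=100_000)
factual = rng.gumbel(loc=115.0, scale=22.0, size=100_000)

threshold = 200.0  # the observed extreme we want to attribute

p_factual = (factual > threshold).mean()
p_counterfactual = (counterfactual > threshold).mean()

probability_ratio = p_factual / p_counterfactual  # "N times more likely"
return_period = 1.0 / p_factual                   # expected years between events

print(f"PR ~ {probability_ratio:.1f}; return period ~ {return_period:.0f} years")
```

Statements like "a once-in-70-year event" correspond to the return period; statements like "400 to 800 times more likely" correspond to the probability ratio.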


Google identifies low noise “phase transition” in its quantum processor

Back in 2019, Google made waves by claiming it had achieved what has been called "quantum supremacy"—the ability of a quantum computer to perform operations that would take a wildly impractical amount of time to simulate on standard computing hardware. That claim proved to be controversial, in that the operations were little more than a benchmark that involved getting the quantum computer to behave like a quantum computer; separately, improved ideas about how to perform the simulation on a supercomputer cut the time required down significantly.

But Google is back with a new exploration of the benchmark, described in a paper published in Nature on Wednesday. The company uses the benchmark to identify what it calls a phase transition in the performance of its quantum processor, and to pin down conditions where the processor can operate with low noise. Taking advantage of that, the researchers again show that, even when classical hardware is given every potential advantage, a supercomputer would take a dozen years to simulate the benchmark.

Cross entropy benchmarking

The benchmark in question relies on the performance of what are called random quantum circuits: a set of operations is performed on qubits, and the state of the system evolves over time, so that the output depends heavily on the stochastic nature of measurement outcomes in quantum mechanics. Each qubit will have a probability of producing one of two results, but unless that probability is one, there's no way of knowing which of the results you'll actually get. As a result, the output of the operations will be a string of truly random bits.
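The "cross entropy" part scores how often a device's sampled bitstrings land on outcomes the ideal circuit makes likely; the linear cross-entropy fidelity is conventionally written as F = 2^n·⟨p(x)⟩ − 1. A toy sketch, with a random state vector standing in for a random circuit's output (the real benchmark scores samples from actual hardware):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10                     # qubits
dim = 2 ** n

# Stand-in for a random circuit's final state: a Haar-like random state vector.
psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)
probs = np.abs(psi) ** 2   # ideal probability of each of the 2^n bitstrings

def xeb_fidelity(samples: np.ndarray) -> float:
    """Linear cross-entropy benchmark: 2^n * <p(x)> - 1 over sampled bitstrings."""
    return dim * probs[samples].mean() - 1.0

shots = 50_000
ideal_samples = rng.choice(dim, size=shots, p=probs)  # a noiseless device
noisy_samples = rng.integers(0, dim, size=shots)      # a fully depolarized one

print(f"ideal: {xeb_fidelity(ideal_samples):.2f}")  # close to 1
print(f"noisy: {xeb_fidelity(noisy_samples):.2f}")  # close to 0
```

A perfect device scores near 1 because it preferentially emits the bitstrings the ideal distribution favors; pure noise emits uniform bitstrings and scores near 0.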


Protein structure and design software gets the Chemistry Nobel

On Wednesday, the Nobel Committee announced that it had awarded the Nobel Prize in chemistry to researchers who pioneered major breakthroughs in computational chemistry. These include two researchers at Google's DeepMind in acknowledgment of their role in developing AI software that could take a raw protein sequence and use it to predict the three-dimensional structure the protein would adopt in cells. Separately, the University of Washington's David Baker was honored for developing software that could design entirely new proteins with specific structures.

The award makes for a bit of a theme for this year, as yesterday's Physics prize honored AI developments. In that case, the connection to physics seemed a bit tenuous, but here, there should be little question that the developments solved major problems in biochemistry.

Understanding protein structure

DeepMind, represented by Demis Hassabis and John Jumper, had developed AIs that managed to master games as diverse as chess and StarCraft. But it was always working on more significant problems in parallel, and in 2020, it surprised many people by announcing that it had tackled one of the biggest computational challenges in existence: the prediction of protein structures.


Medicine Nobel goes to previously unknown way of controlling genes

On Monday, the Nobel Committee announced that two US researchers, Victor Ambros and Gary Ruvkun, will receive the prize in Physiology or Medicine for their discovery of a previously unknown mechanism for controlling the activity of genes. They discovered the first of what is now known to be a large collection of microRNAs: short RNAs (21-23 bases long) that bind to and alter the behavior of protein-coding RNAs. While first discovered in a roundworm, microRNAs have since been found to play key roles in the development of most complex life.
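Target recognition by microRNAs depends largely on complementarity between a short "seed" region (roughly bases 2-8 of the microRNA) and sites in a target mRNA. A toy sketch of how such sites are found computationally—the sequences here are invented for the example, not real lin-4 or target sequences:

```python
# Invented sequences for illustration only -- not real microRNA data.

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna: str) -> str:
    return "".join(COMPLEMENT[base] for base in reversed(rna))

def seed_sites(mirna: str, target_mrna: str) -> list[int]:
    """Positions in the target that pair with the miRNA seed (bases 2-8)."""
    seed = mirna[1:8]                    # bases 2-8 (0-indexed slice)
    site = reverse_complement(seed)      # the sequence the seed pairs with
    return [i for i in range(len(target_mrna) - len(site) + 1)
            if target_mrna[i:i + len(site)] == site]

mirna = "UACGUACGUACGUACGUACGU"          # a made-up 21-base microRNA
site = reverse_complement(mirna[1:8])
target = "AAGG" + site + "UUACG" + site + "AA"   # two planted seed sites

print(seed_sites(mirna, target))  # [4, 16]
```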

The story behind the discovery is typical of a lot of the progress in the biological sciences: genetics helps identify a gene important for the development of one species, and then evolutionary conservation reveals its widespread significance.

In the worm

Ambros and Ruvkun started on the path to discovery while post-doctoral fellows in the lab of earlier Nobel winner Robert Horvitz, who won for his role in developing the roundworm C. elegans as an experimental genetic organism. As part of the early genetic screens, people had identified a variety of mutations that caused developmental problems for specific lineages of cells. These lin mutations included lin-4, which Ambros was characterizing. Animals carrying it lacked a number of specialized cell types, as well as the physical structures that depended on them.


Ants learned to farm fungi during a mass extinction

We tend to think of agriculture as a human innovation. But insects beat us to it by millions of years. Various ant species cooperate with fungi, creating a home for them, providing them with nutrients, and harvesting them as food. This reaches the peak of sophistication in the leafcutter ants, which cut foliage and return it to feed their fungi, which in turn form specialized growths that are harvested for food. But other ant species cooperate with fungi—in some cases strains of fungus that are also found growing in their environment.

Genetic studies have shown that these symbiotic relationships are highly specific—a given ant species will often cooperate with just a single strain of fungus. A number of genes that appear to have evolved rapidly in response to strains of fungi take part in this cooperative relationship. But it has been less clear how the cooperation originally came about, partly because we don't have a good picture of what the undomesticated relatives of these fungi look like.

Now, a large international team of researchers has done a study that traces the relationships among a large collection of both fungi and ants, providing a clearer picture of how this form of agriculture evolved. And the history this study reveals suggests that the cooperation between ants and their crops began after the mass extinction that killed the dinosaurs, when little beyond fungi could thrive.


For the first time since 1882, UK will have no coal-fired power plants

On Monday, the UK will see the closure of its last operational coal power plant, Ratcliffe-on-Soar, which has been operating since 1968. The closure of the plant, which had a capacity of 2,000 megawatts, will bring an end to the history of the country's coal-fired power generation, which started with the opening of the first coal power station in 1882. Coal played a central part in the UK's power system in the interim, in some years providing over 90 percent of its total electricity.

But a number of factors combined to place coal in a long-term decline: the growth of natural gas-powered plants and renewables, pollution controls, carbon pricing, and a government goal to hit net-zero greenhouse gas emissions by 2050.

From boom to bust

It's difficult to overstate the importance of coal to the UK grid. It was providing over 90 percent of the UK's electricity as recently as 1956. The total amount of power generated continued to climb well after that, reaching a peak of 212 terawatt hours of production by 1980. And the construction of new coal plants was under consideration as recently as the late 2000s; according to Carbon Brief's excellent timeline of coal use in the UK, continuing coal with carbon capture was among the options considered.

Black hole jet appears to boost rate of nova explosions

The intense electromagnetic environment near a black hole can accelerate particles to a large fraction of the speed of light and send them speeding along jets that extend from each of the object's poles. In the case of the supermassive black holes found at the center of galaxies, these jets are truly colossal, blasting material not just out of the galaxy, but possibly out of the galaxy's entire neighborhood.

But this week, scientists described how the jets may be doing some strange things inside a galaxy as well. A study of the galaxy M87 showed that nova explosions appear to be occurring at an unusually high frequency in the neighborhood of one of the jets from the galaxy's central black hole. But there's no obvious mechanism to explain why this might happen, and there's no sign that it's happening along the jet that's traveling in the opposite direction.

Determining whether this effect is real, and whether we can come up with an explanation for it, will take further observations.


IBM opens its quantum-computing stack to third parties

As we described earlier this year, operating a quantum computer will require a significant investment in classical computing resources, given the number of measurement and control operations that need to be executed and interpreted. That means that operating a quantum computer will also require a software stack to control and interpret the flow of information from the quantum side.

But software also gets involved well before anything gets executed. While it's possible to execute algorithms on quantum hardware by defining the full set of commands sent to the hardware, most users are going to want to focus on algorithm development, rather than the details of controlling any single piece of quantum hardware. "If everyone's got to get down and know what the noise is, [use] performance management tools, they've got to know how to compile a quantum circuit through hardware, you've got to become an expert in too much to be able to do the algorithm discovery," said IBM's Jay Gambetta. So, part of the software stack that companies are developing to control their quantum hardware includes software that converts abstract representations of quantum algorithms into the series of commands needed to execute them.
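One core job of such a stack is basis translation: rewriting abstract gates into whatever native gates the hardware actually implements. A self-contained toy version of that idea, using the well-known decomposition of a Hadamard into RZ and √X gates and verifying it numerically (this illustrates the concept; it is not IBM's actual transpiler code):

```python
import numpy as np

def rz(theta: float) -> np.ndarray:
    """Rotation about Z, a typical 'native' gate on superconducting hardware."""
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

SX = 0.5 * np.array([[1 + 1j, 1 - 1j],
                     [1 - 1j, 1 + 1j]])        # sqrt(X), the other native gate
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # the abstract Hadamard gate

# Basis translation: H compiles to the native sequence RZ(pi/2) SX RZ(pi/2),
# which matches H up to an unobservable global phase.
compiled = rz(np.pi / 2) @ SX @ rz(np.pi / 2)

def equal_up_to_phase(a: np.ndarray, b: np.ndarray) -> bool:
    phase = np.vdot(a.ravel(), b.ravel())
    phase /= abs(phase)
    return np.allclose(a * phase, b)

print(equal_up_to_phase(compiled, H))  # True
```

A real transpiler chains many such rewrites with routing and optimization passes; the point is that a user writes `H` and the hardware only ever sees its native gates.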

IBM's version of this software is called Qiskit (although it was made open source and has since been adopted by other companies). Recently, IBM made a couple of announcements regarding Qiskit, both benchmarking it in comparison to other software stacks and opening it up to third-party modules. We'll take a look at what software stacks do before getting into the details of what's new.


Radiation should be able to deflect asteroids as large as 4 km across

Sandia National Labs' Z machine in action. (Credit: Randy Montoya)

The old joke about the dinosaurs going extinct because they didn't have a space program may be overselling the need for one. It turns out you can probably divert some of the more threatening asteroids with nothing more than the products of a nuclear weapons program. But it doesn't work the way you probably think it does.

Obviously, nuclear weapons are great at destroying things, so why not asteroids? That won't work because a lot of the damage that nukes generate comes from the blast wave as it propagates through the atmosphere. And the environment around asteroids is notably short on atmosphere, so blast waves won't happen. But you can still use a nuclear weapon's radiation to vaporize part of the asteroid's surface, creating a very temporary, very hot atmosphere on one side of the asteroid. This should create enough pressure to deflect the asteroid's orbit, potentially causing it to fly safely past Earth.

But will it work? Some scientists at Sandia National Lab have decided to tackle a very cool question with one of the cooler bits of hardware on Earth: the Z machine, which can create a pulse of X-rays bright enough to vaporize rock. They estimate that a nuclear weapon can probably impart enough force to deflect asteroids as large as 4 kilometers across.
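The scale of the problem can be sketched with back-of-envelope numbers: how small a velocity change, applied years in advance, shifts an asteroid's arrival point by one Earth radius, and how much momentum that implies for a 4 km body. All figures below are illustrative assumptions (a crude straight-line drift model and a guessed rocky density), not values from the study:

```python
import math

# Back-of-envelope deflection sketch. The "drift" model (displacement =
# delta-v * lead time) ignores orbital dynamics; it's only meant to show
# why centimeters per second can be enough.

EARTH_RADIUS_M = 6.371e6
SECONDS_PER_YEAR = 3.156e7

def required_delta_v(lead_time_years: float) -> float:
    """Velocity change (m/s) whose drift over the lead time spans one Earth radius."""
    return EARTH_RADIUS_M / (lead_time_years * SECONDS_PER_YEAR)

def asteroid_mass(diameter_m: float, density_kg_m3: float = 2000.0) -> float:
    """Mass of a spherical asteroid; 2000 kg/m^3 is an assumed rocky density."""
    r = diameter_m / 2
    return density_kg_m3 * (4 / 3) * math.pi * r ** 3

dv = required_delta_v(10.0)      # ~2 cm/s with a 10-year lead time
m = asteroid_mass(4000.0)        # the ~4 km upper size from the estimate
impulse = m * dv                 # momentum the vaporized surface must supply

print(f"delta-v ~ {dv*100:.1f} cm/s, mass ~ {m:.1e} kg, impulse ~ {impulse:.1e} kg m/s")
```

The takeaway: even a multi-trillion kg·m/s impulse corresponds to a tiny velocity change, which is why ablating a thin surface layer can plausibly do the job given enough warning.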


A record of the Earth’s temperature covering half a billion years

The cycle of building and breaking up of supercontinents seems to drive long-term climate trends. (Credit: Walter Myers/Stocktrek Images)

Global temperature records go back less than two centuries. But that doesn't mean we have no idea what the world was doing before we started building thermometers. There are various things—tree rings, isotope ratios, and more—that register temperatures in the past. Using these temperature proxies, we've managed to reconstruct thousands of years of our planet's climate.

But going back further is difficult. Fewer proxies get preserved over longer times, and samples get rarer. By the time we go back past a million years, it's difficult to find enough proxies from around the globe and the same time period to reconstruct a global temperature. There are a few exceptions, like the Paleocene-Eocene Thermal Maximum (PETM), a burst of sudden warming about 55 million years ago, but few events that old are nearly as well understood.

Now, researchers have used a combination of proxy records and climate models to reconstruct the Earth's climate for the last half-billion years, providing a global record of temperatures stretching all the way back to near the Cambrian explosion of complex life. The record shows that, with one apparent exception, carbon dioxide and global temperatures have been tightly linked. Which is somewhat surprising, given the other changes the Earth has experienced over this time.
