Simple voltage pulse can restore capacity to Li-Si batteries

If you're using a large battery for a specialized purpose—say grid-scale storage or an electric vehicle—then it's possible to tweak the battery chemistry, provide a little bit of excess capacity, and carefully manage its charging and discharging so that it enjoys a long life span. But for consumer electronics, the batteries are smaller, the need for light weight dictates the chemistry, and the demand for quick charging can be higher. So most batteries in our gadgets start to see serious degradation after just a couple of years of use.

A big contributor to that degradation is internal fragmentation of the electrode materials. This leaves some of the electrode material disconnected from the battery's charge-handling system, essentially stranding the material inside the battery and trapping some of the lithium uselessly. Now, researchers have found that, for at least one battery chemistry, it's possible to partially reverse some of this decay, boosting the remaining capacity of the battery by up to 30 percent.

The only problem is that not many batteries use the specific chemistry tested here. But it does show how understanding what's going on inside batteries can provide us with ways to extend their lifespan.

© da-kuk

Amazon joins Google in investing in small modular nuclear power

On Tuesday, Google announced that it had made a power purchase agreement for electricity generated by a small modular nuclear reactor design that hasn't even received regulatory approval yet. Today, it's Amazon's turn. The company's Amazon Web Services (AWS) group has announced three different investments, including one targeting a different startup that has its own design for small, modular nuclear reactors—one that has not yet received regulatory approval.

Unlike Google's deal, which is a commitment to purchase power should the reactors ever be completed, Amazon will lay out some money upfront as part of the agreements. We'll take a look at the deals and technology that Amazon is backing before analyzing why companies are taking a risk on unproven technologies.

Money for utilities and a startup

Two of Amazon's deals are with utilities that serve areas where it already has a significant data center footprint. One of these is Energy Northwest, which is an energy supplier that sends power to utilities in the Pacific Northwest. Amazon is putting up the money for Energy Northwest to study the feasibility of adding small modular reactors to its Columbia Generating Station, which currently houses a single, large reactor. In return, Amazon will get the right to purchase power from an initial installation of four small modular reactors. The site could potentially support additional reactors, which Energy Northwest would be able to use to meet demands from other users.

© X-energy

People think they already know everything they need to make decisions

The world is full of people who have excessive confidence in their own abilities. This is famously captured by the Dunning-Kruger effect, which describes how people who lack expertise in something will necessarily lack the knowledge needed to recognize their own limits. Now, a different set of researchers has come out with what might be viewed as a corollary to Dunning-Kruger: People have a strong tendency to believe that they always have enough data to make an informed decision—regardless of what information they actually have.

The work, done by Hunter Gehlbach, Carly Robinson, and Angus Fletcher, is based on an experiment in which they intentionally gave people only partial, biased information, finding that people never seemed to consider they might only have a partial picture. "Because people assume they have adequate information, they enter judgment and decision-making processes with less humility and more confidence than they might if they were worrying whether they knew the whole story or not," they write. The good news? When given the full picture, most people are willing to change their opinions.

Ignorant but confident

The basic setup of the experiment is very straightforward. The researchers developed a scenario where an ongoing water shortage was forcing a school district to consider closing one of its schools and merging its students into another existing school. They then wrote an article that described the situation and contained seven different pieces of information: three that favored merging, three that disfavored it, and one that was neutral. Just over half of the control group that read the full article favored merging the two schools.

© LUDOVIC MARIN

Climate change boosted Milton’s landfall strength from Category 2 to 3

As attempts to clean up after Hurricane Milton are beginning, scientists at the World Weather Attribution project have taken a quick look at whether climate change contributed to its destructive power. While the analysis is limited by the fact that not all the meteorological data is even available yet, by several measures, climate change made aspects of Milton significantly more likely.

This isn't a huge surprise, given that Milton traveled across the same exceptionally warm Gulf of Mexico that Helene had recently transited. But the analysis does produce one striking result: Milton would have been a Category 2 storm at landfall if climate change weren't boosting its strength.

From the oceans to the skies

Hurricanes strengthen while over warm ocean waters, and climate change has been slowly cranking up the heat content of the oceans. But it's important to recognize that the slow warming is an average, and that can include some localized extreme events. This year has seen lots of ocean temperature records set in the Atlantic basin, and that seems to be true in the Gulf of Mexico as well. The researchers note that a different rapid analysis released earlier this week showed that the ocean temperatures—which had boosted Milton to a Category 5 storm during its time in the Gulf—were between 400 and 800 times more likely to exist thanks to climate change.

© Frank Ramspott

Rapid analysis finds climate change’s fingerprint on Hurricane Helene

Hurricane Helene crossed the Gulf of Mexico at a time when sea surface temperatures were at record highs and then barreled into a region where heavy rains had left the ground saturated. The result was historic, catastrophic flooding.

One key question is how soon we might expect history to repeat itself. Our rapidly warming planet has tilted the odds in favor of some extreme weather events in a way that means we can expect some events that had been extremely rare to start occurring with some regularity. Our first stab at understanding climate change's influence on Helene was released on Wednesday, and it suggests that rainfall of the sort experienced by the Carolinas may now be a once-in-70-year event, which could have implications for how we rebuild some of the communities shattered by the rain.

Rapid attribution

The quick analysis was done by the World Weather Attribution project, which has developed peer-reviewed methods of looking for the fingerprints of climate change in major weather events. In general, this involves identifying the key weather patterns that produced the event and then exploring their frequency using climate models run with and without the carbon dioxide we've added to the atmosphere.
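
As a rough illustration of that logic (not the World Weather Attribution group's actual code), the sketch below compares how often a rainfall extreme crosses a threshold in synthetic "with added CO2" versus "without added CO2" model output. Every distribution and number here is invented for the example; the real analyses use large ensembles of physically based climate model runs.

```python
# Minimal sketch of an attribution-style probability ratio, using made-up numbers.
# Each sample stands in for one season's maximum 3-day rainfall from a model run.
import numpy as np

rng = np.random.default_rng(42)
threshold_mm = 300  # hypothetical rainfall total defining a "Helene-like" event

factual = rng.gumbel(loc=150, scale=40, size=100_000)         # runs with today's CO2
counterfactual = rng.gumbel(loc=130, scale=40, size=100_000)  # runs without added CO2

p_factual = (factual >= threshold_mm).mean()
p_counterfactual = (counterfactual >= threshold_mm).mean()

print(f"return period in today's climate: ~{1 / p_factual:.0f} years")
print(f"probability ratio vs. pre-industrial: {p_factual / p_counterfactual:.1f}x")
```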

© Frank Ramspott

Google identifies low noise “phase transition” in its quantum processor

Back in 2019, Google made waves by claiming it had achieved what has been called "quantum supremacy"—the ability of a quantum computer to perform operations that would take a wildly impractical amount of time to simulate on standard computing hardware. That claim proved to be controversial, in that the operations were little more than a benchmark that involved getting the quantum computer to behave like a quantum computer; separately, improved ideas about how to perform the simulation on a supercomputer cut the time required down significantly.

But Google is back with a new exploration of the benchmark, described in a paper published in Nature on Wednesday. The company uses the benchmark to identify what it calls a phase transition in the performance of its quantum processor, and from that, the conditions where the processor can operate with low noise. Taking advantage of those conditions, the team again shows that, even giving classical hardware every potential advantage, it would take a supercomputer a dozen years to reproduce the processor's output.

Cross entropy benchmarking

The benchmark in question relies on what are called random quantum circuits: a set of operations is performed on qubits, and the state of the system is allowed to evolve over time, so that the output depends heavily on the stochastic nature of measurement outcomes in quantum mechanics. Each qubit will have a probability of producing one of two results, but unless that probability is one, there's no way of knowing which of the results you'll actually get. As a result, the output of the operations will be a string of truly random bits.
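
The "cross entropy" part refers to how those random bits get scored: average the classically computed ideal probabilities of the bitstrings the processor actually produced, multiply by 2^n, and subtract one. A noise-free processor scores near 1, while a processor whose output has degraded to uniform noise scores near 0. Here's a minimal sketch of that calculation, with an exponential (Porter-Thomas-style) distribution standing in for a simulated random circuit; that distribution is an illustrative assumption, not Google's data.

```python
# Minimal sketch of linear cross-entropy benchmarking (XEB) fidelity.
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 12
dim = 2 ** n_qubits

# Stand-in for the classically simulated output probabilities of one random circuit.
ideal_probs = rng.exponential(size=dim)
ideal_probs /= ideal_probs.sum()

def xeb_fidelity(probs, sampled_bitstrings):
    """Linear XEB: F = 2^n * <p_ideal(observed bitstring)> - 1."""
    return dim * probs[sampled_bitstrings].mean() - 1

# A low-noise processor samples from the ideal distribution -> fidelity near 1.
good_samples = rng.choice(dim, size=50_000, p=ideal_probs)
# A noise-dominated processor produces uniform bitstrings -> fidelity near 0.
noisy_samples = rng.integers(dim, size=50_000)

print(f"low-noise estimate:       {xeb_fidelity(ideal_probs, good_samples):.3f}")
print(f"noise-dominated estimate: {xeb_fidelity(ideal_probs, noisy_samples):.3f}")
```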

© Google

Protein structure and design software gets the Chemistry Nobel

On Wednesday, the Nobel Committee announced that it had awarded the Nobel Prize in Chemistry to researchers who pioneered major breakthroughs in computational chemistry. These include two researchers at Google's DeepMind in acknowledgment of their role in developing AI software that could take a raw protein sequence and use it to predict the three-dimensional structure the protein would adopt in cells. Separately, the University of Washington's David Baker was honored for developing software that could design entirely new proteins with specific structures.

The award makes for a bit of a theme for this year, as yesterday's Physics prize honored AI developments. In that case, the connection to physics seemed a bit tenuous, but here, there should be little question that the developments solved major problems in biochemistry.

Understanding protein structure

DeepMind, represented by Demis Hassabis and John Jumper, had developed AIs that managed to master games as diverse as chess and StarCraft. But it was always working on more significant problems in parallel, and in 2020, it surprised many people by announcing that it had tackled one of the biggest computational challenges in existence: the prediction of protein structures.

© Johan Jarnestad/The Royal Swedish Academy of Science

Medicine Nobel goes to previously unknown way of controlling genes

On Monday, the Nobel Committee announced that two US researchers, Victor Ambros and Gary Ruvkun, will receive the prize in Physiology or Medicine for their discovery of a previously unknown mechanism for controlling the activity of genes. They discovered the first of what is now known to be a large collection of microRNAs, short (21-23 bases long) RNAs that bind to and alter the behavior of protein-coding RNAs. While first identified in a roundworm, they've since been found to play key roles in the development of most complex life.

The story behind the discovery is typical of a lot of the progress in the biological sciences: genetics helps identify a gene important for the development of one species, and then evolutionary conservation reveals its widespread significance.

In the worm

Ambros and Ruvkun started on the path to discovery while post-doctoral fellows in the lab of earlier Nobel winner Robert Horvitz, who won for his role in developing the roundworm C. elegans as an experimental genetic organism. As part of the early genetic screens, people had identified a variety of mutations that caused developmental problems for specific lineages of cells. These lin mutations included lin-4, which Ambros was characterizing. Animals carrying the lin-4 mutation lacked a number of specialized cell types, as well as the physical structures that depended on them.

© HeitiPaves

Ants learned to farm fungi during a mass extinction

We tend to think of agriculture as a human innovation. But insects beat us to it by millions of years. Various ant species cooperate with fungi, creating a home for them, providing them with nutrients, and harvesting them as food. This reaches the peak of sophistication in the leafcutter ants, which cut foliage and return it to feed their fungi, which in turn form specialized growths that are harvested for food. But other ant species cooperate with fungi—in some cases strains of fungus that are also found growing in their environment.

Genetic studies have shown that these symbiotic relationships are highly specific—a given ant species will often cooperate with just a single strain of fungus. This cooperative relationship involves a number of genes that appear to have evolved rapidly in response to the fungal strains. But it has been less clear how the cooperation originally came about, partly because we don't have a good picture of what the undomesticated relatives of these fungi look like.

Now, a large international team of researchers has done a study that traces the relationships among a large collection of both fungi and ants, providing a clearer picture of how this form of agriculture evolved. And the history this study reveals suggests that the cooperation between ants and their crops began after the mass extinction that killed the dinosaurs, when little beyond fungi could thrive.

© pxhidalgo

For the first time since 1882, UK will have no coal-fired power plants

On Monday, the UK will see the closure of its last operational coal power plant, Ratcliffe-on-Soar, which has been operating since 1968. The closure of the plant, which had a capacity of 2,000 megawatts, will bring an end to the history of the country's coal use, which started with the opening of the first coal-fired power station in 1882. Coal played a central part in the UK's power system in the interim, in some years providing over 90 percent of its total electricity.

But a number of factors combined to place coal in a long-term decline: the growth of natural gas-powered plants and renewables, pollution controls, carbon pricing, and a government goal to hit net-zero greenhouse gas emissions by 2050.

From boom to bust

It's difficult to overstate the importance of coal to the UK grid. It was providing over 90 percent of the UK's electricity as recently as 1956. The total amount of power generated continued to climb well after that, reaching a peak of 212 terawatt hours of production by 1980. And the construction of new coal plants was under consideration as recently as the late 2000s. According to the organization Carbon Brief's excellent timeline of coal use in the UK, continuing to burn coal with carbon capture was also considered.

© Ashley Cooper

Black hole jet appears to boost rate of nova explosions

The intense electromagnetic environment near a black hole can accelerate particles to a large fraction of the speed of light and send the speeding particles along jets that extend from each of the object's poles. In the case of the supermassive black holes found in the center of galaxies, these jets are truly colossal, blasting material not just out of the galaxy, but possibly out of the galaxy's entire neighborhood.

But this week, scientists have described how the jets may be doing some strange things inside a galaxy as well. A study of the galaxy M87 showed that nova explosions appear to be occurring at an unusually high frequency in the neighborhood of one of the jets from the galaxy's central black hole. But there's no obvious mechanism to explain why this might happen, and there's no sign that it's happening at the jet that's traveling in the opposite direction.

Sorting out whether this effect is real, and whether we can come up with an explanation for it, may take further observations.

© NASA and the Hubble Heritage Team (STScI/AURA)

IBM opens its quantum-computing stack to third parties

As we described earlier this year, operating a quantum computer will require a significant investment in classical computing resources, given the number of measurements and control operations that need to be executed and interpreted. That means a quantum computer will also need a software stack to control and interpret the flow of information from the quantum side.

But software also gets involved well before anything gets executed. While it's possible to execute algorithms on quantum hardware by defining the full set of commands sent to the hardware, most users are going to want to focus on algorithm development, rather than the details of controlling any single piece of quantum hardware. "If everyone's got to get down and know what the noise is, [use] performance management tools, they've got to know how to compile a quantum circuit through hardware, you've got to become an expert in too much to be able to do the algorithm discovery," said IBM's Jay Gambetta. So, part of the software stack that companies are developing to control their quantum hardware includes software that converts abstract representations of quantum algorithms into the series of commands needed to execute them.

IBM's version of this software is called Qiskit (although it was made open source and has since been adopted by other companies). Recently, IBM made a couple of announcements regarding Qiskit, both benchmarking it in comparison to other software stacks and opening it up to third-party modules. We'll take a look at what software stacks do before getting into the details of what's new.
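
As a small, simplified illustration of that translation step, Qiskit's transpile() lowers an abstract circuit into an equivalent one built only from a restricted set of basis gates. The gate set below is just an assumed example rather than any particular IBM processor's native set.

```python
# Sketch: compile an abstract circuit down to an assumed set of hardware basis gates.
from qiskit import QuantumCircuit, transpile

# Abstract algorithm: prepare and measure a Bell pair.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# Rewrite it using only gates a hypothetical processor natively supports.
lowered = transpile(qc, basis_gates=["rz", "sx", "x", "cx"], optimization_level=3)
print(lowered)
```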

© IBM

Radiation should be able to deflect asteroids as large as 4 km across

Sandia National Labs' Z machine in action. (credit: Randy Montoya)

The old joke about the dinosaurs going extinct because they didn't have a space program may be overselling the need for one. It turns out you can probably divert some of the more threatening asteroids with nothing more than the products of a nuclear weapons program. But it doesn't work the way you probably think it does.

Obviously, nuclear weapons are great at destroying things, so why not asteroids? That won't work because a lot of the damage that nukes generate comes from the blast wave as it propagates through the atmosphere. And the environment around asteroids is notably short on atmosphere, so blast waves won't happen. But you can still use a nuclear weapon's radiation to vaporize part of the asteroid's surface, creating a very temporary, very hot atmosphere on one side of the asteroid. This should create enough pressure to deflect the asteroid's orbit, potentially causing it to fly safely past Earth.
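
To get a feel for why a brief push can be enough, here's a back-of-the-envelope sketch (not taken from the Sandia study): ignoring orbital mechanics, a velocity change of a couple of centimeters per second, applied a decade before a predicted impact, drifts the asteroid by roughly an Earth radius. The warning time, asteroid size, and density below are all assumed values for illustration.

```python
# Rough order-of-magnitude estimate; all input values are assumptions.
import math

R_EARTH_M = 6.371e6
SECONDS_PER_YEAR = 3.156e7

warning_years = 10
# Velocity change that accumulates into a one-Earth-radius displacement.
dv = R_EARTH_M / (warning_years * SECONDS_PER_YEAR)
print(f"required velocity change: {dv * 100:.1f} cm/s")

# Impulse the vaporized surface layer must deliver to a hypothetical 500-m-wide
# asteroid with a density of 2,000 kg/m^3.
radius_m, density_kg_m3 = 250.0, 2000.0
mass_kg = density_kg_m3 * (4 / 3) * math.pi * radius_m**3
print(f"required impulse: {mass_kg * dv:.2e} kg*m/s")
```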

But will it work? Some scientists at Sandia National Lab have decided to tackle a very cool question with one of the cooler bits of hardware on Earth: the Z machine, which can create a pulse of X-rays bright enough to vaporize rock. They estimate that a nuclear weapon can probably impart enough force to deflect asteroids as large as 4 kilometers across.

A record of the Earth’s temperature covering half a billion years

The cycle of building and breaking up of supercontinents seems to drive long-term climate trends. (credit: Walter Myers/Stocktrek Images)

Global temperature records go back less than two centuries. But that doesn't mean we have no idea what the world was doing before we started building thermometers. There are various things—tree rings, isotope ratios, and more—that register temperatures in the past. Using these temperature proxies, we've managed to reconstruct thousands of years of our planet's climate.

But going back further is difficult. Fewer proxies get preserved over longer times, and samples get rarer. By the time we go back past a million years, it's difficult to find enough proxies from around the globe and the same time period to reconstruct a global temperature. There are a few exceptions, like the Paleocene-Eocene Thermal Maximum (PETM), a burst of sudden warming about 55 million years ago, but few events that old are nearly as well understood.

Now, researchers have used a combination of proxy records and climate models to reconstruct the Earth's climate for the last half-billion years, providing a global record of temperatures stretching all the way back to near the Cambrian explosion of complex life. The record shows that, with one apparent exception, carbon dioxide and global temperatures have been tightly linked. Which is somewhat surprising, given the other changes the Earth has experienced over this time.

Researchers spot largest black hole jets ever discovered

Artist's conception of a dark matter filament containing a galaxy with large jets. (Caltech noted that some details of this image were created using AI.) (credit: Martijn Oei (Caltech) / Dylan Nelson (IllustrisTNG Collaboration))

The supermassive black holes that sit at the center of galaxies aren't just decorative. The intense radiation they emit when feeding helps drive away gas and dust that would otherwise form stars, providing feedback that limits the growth of the galaxy. But their influence may extend beyond the galaxy they inhabit. Many black holes produce jets and, in the case of supermassive versions, these jets can eject material entirely out of the galaxy.

Now, researchers are getting a clearer picture of just how far outside of the galaxy their influence can reach. A new study describes the largest-ever jets observed, extending across a total distance of 23 million light-years (seven megaparsecs). At those distances, the jets could easily send material into other galaxies and across the cosmic web of dark matter that structures the Universe.

Extreme jets

Jets are formed in the complex environment near a black hole. Infalling material is heated and ionized, creating electromagnetic fields that act as a natural particle accelerator. This creates jets of particles that travel at a substantial fraction of the speed of light. These will ultimately slam into nearby material, creating shockwaves that heat and accelerate that material, too. Over time, this leads to large-scale, coordinated outflows of material, with the scale of the jet being proportional to a combination of the size of the black hole and the amount of material it is feeding on.

Court clears researchers of defamation for identifying manipulated data

Harvard Business School was targeted by a faculty member's lawsuit. (credit: APCortizasJr)

Earlier this year, we got a look at something unusual: the results of an internal investigation conducted by Harvard Business School that concluded one of its star faculty members had committed research misconduct. Normally, these reports are kept confidential, leaving questions regarding the methods and extent of data manipulations.

But in this case, the report became public because the researcher had filed a lawsuit that alleged defamation on the part of the team of data detectives that had first identified potential cases of fabricated data, as well as Harvard Business School itself. Now, the court has ruled on motions to dismiss the case. While the suit against Harvard will go on, the court has ruled that evidence-backed conclusions regarding fabricated data cannot constitute defamation—which is probably a very good thing for science.

Data and defamation

The researchers who had been sued, Uri Simonsohn, Leif Nelson, and Joe Simmons, run a blog called Data Colada where, among other things, they note cases of suspicious-looking data in the behavioral sciences. As we detailed in our earlier coverage, they published a series of blog posts describing an apparent case of fabricated data in four different papers published by the high-profile researcher Francesca Gino, a professor at Harvard Business School.

Old Easter Island genomes show no sign of a population collapse

(credit: Jarcosa)

Rapa Nui, often referred to as Easter Island, is one of the most remote populated islands in the world. It's so distant that Europeans didn't stumble onto it until centuries after they had started exploring the Pacific. When they arrived, though, they found that the relatively small island supported a population of thousands, one that had built imposing monumental statues called moai. Arguments over how this population got there and what happened once it did have gone on ever since.

Some of these arguments, such as the idea that the island's indigenous people had traveled there from South America, have since been put to rest. Genomes from people native to the island show that its original population was part of the Polynesian expansion across the Pacific. But others, such as the role of ecological collapse in limiting the island's population and altering its culture, continue to be debated.

Researchers have now obtained genome sequences from the remains of 15 Rapa Nui natives who predate European contact. These indicate that the population of the island appears to have grown slowly and steadily, without any sign of a bottleneck that could be associated with an ecological collapse. And roughly 10 percent of the genomes appear to have a Native American source that likely dates from roughly the same time that the island was settled.
