
New congressional report: “COVID-19 most likely emerged from a laboratory”

11 December 2024 at 16:45

Recently, Congress' Select Subcommittee on the Coronavirus Pandemic released its final report. The gist is about what you'd expect from a Republican-run committee: it trashes a lot of Biden-era policies and state-level responses while praising a number of Trump's decisions. But what's perhaps most striking is how it tackles a variety of scientific topics, including many where there's a large, complicated body of evidence.

Notably, this includes conclusions about the origin of the pandemic, which the report describes as "most likely" emerging from a lab rather than being the product of zoonotic transfer from an animal species to humans. The latter explanation is favored by many scientists.

The conclusions themselves aren't especially interesting; they're expected from a report with partisan aims. But the method used to reach those conclusions is often striking: The Republican majority systematically shifts the standard of evidence needed to reach a conclusion. For conclusions the report's authors favor, they'll happily accept evidence from computer models or arguments from an editorial in the popular press; for conclusions they disfavor, they demand double-blind controlled clinical trials.

Read full article

Comments

© Grace Cary

Google gets an error-corrected quantum bit to be stable for an hour

9 December 2024 at 18:25

On Monday, Nature released a paper from Google's quantum computing team that provides a key demonstration of the potential of quantum error correction. Thanks to an improved processor, Google's team found that increasing the number of hardware qubits dedicated to an error-corrected logical qubit led to an exponential increase in performance. By the time the entire 105-qubit processor was dedicated to hosting a single error-corrected qubit, the system was stable for an average of an hour.

In fact, Google told Ars that errors on this single logical qubit were rare enough that it was difficult to study them. The work provides a significant validation that quantum error correction is likely to be capable of supporting the execution of complex algorithms that might require hours to execute.
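
The scaling behind that result can be sketched with a toy model of surface-code error suppression, the approach Google uses. All of the numbers below are invented for illustration rather than taken from the paper; the point is just that the logical error rate falls by a roughly constant factor each time the code distance grows by two, so the logical qubit's lifetime climbs exponentially while the qubit cost grows only quadratically.

```python
# Illustrative sketch of surface-code error suppression, not Google's actual
# numbers: the logical error per cycle shrinks exponentially as the code
# distance (and hence the number of physical qubits) grows.

def logical_error_per_cycle(distance, eps_at_d3=3e-3, Lambda=2.0):
    """Standard suppression model: error falls by ~Lambda for each +2 in distance."""
    return eps_at_d3 / Lambda ** ((distance - 3) / 2)

def physical_qubits(distance):
    """Rough qubit cost of one surface-code logical qubit (data + measure qubits)."""
    return 2 * distance ** 2 - 1

CYCLE_TIME_US = 1.0  # assumed ~1 microsecond per error-correction cycle

for d in (3, 5, 7):
    eps = logical_error_per_cycle(d)
    lifetime_s = CYCLE_TIME_US * 1e-6 / eps  # (mean cycles to failure) * cycle time
    print(f"d={d}: ~{physical_qubits(d)} qubits, "
          f"error/cycle={eps:.1e}, mean lifetime ~{lifetime_s:.4f} s")
```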

A new fab

Google is making a number of announcements in association with the paper's release (an earlier version of the paper has been up on the arXiv since August). One of those is that the company is committed enough to its quantum computing efforts that it has built its own fabrication facility for its superconducting processors.

Read full article

Comments

© Google

US to start nationwide testing for H5N1 flu virus in milk supply

6 December 2024 at 21:18

On Friday, the US Department of Agriculture (USDA) announced that it would begin a nationwide testing program for the presence of the H5N1 flu virus, also known as the bird flu. Testing will focus on milk sampled at dairy processing facilities before pasteurization (pasteurization inactivates the virus), but the order launching the program will require anyone involved with milk production prior to that point to provide samples to the USDA on request. That includes "any entity responsible for a dairy farm, bulk milk transporter, bulk milk transfer station, or dairy processing facility."

The ultimate goal is to identify individual herds where the virus is circulating and then use the agency's existing powers to do contact tracing and restrict the movement of cattle, with the aim of eventually eliminating the virus from US herds.

A bovine disease vector

At the time of publication, the CDC had identified 58 cases of humans infected by the H5N1 flu virus, over half of them in California. All but two have come about through contact with agriculture, either cattle (35 cases) or poultry (21). The virus's genetic material has also appeared in the milk supply. Pasteurization should eliminate any intact, infectious virus, but raw milk skips that step, and the virus's presence in raw milk has already led to at least one recall. And we know the virus can spread to other species if they drink milk from infected cows.

Read full article

Comments

© mikedabell

Study: Warming has accelerated due to the Earth absorbing more sunlight

5 December 2024 at 20:15

2023 was always going to be a hot year, given that warmer El Niño conditions were superimposed on the long-term trend of climate change driven by our greenhouse gas emissions. But it's not clear anybody was expecting the striking string of hot months that allowed the year to easily eclipse any previous year on record. As the warmth has continued at record levels even after the El Niño faded, it's an event that seems to demand an explanation.

On Thursday, a group of German scientists—Helge Goessling, Thomas Rackow, and Thomas Jung—released a paper that attempts to provide one. They present data that suggests the Earth is absorbing more incoming sunlight than it has in the past, largely due to reduced cloud cover.
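
Stripped of the cloud feedbacks the paper actually wrestles with, the underlying logic is simple planetary energy balance: a lower albedo means more absorbed sunlight and a warmer equilibrium. Here's a minimal zero-dimensional sketch with textbook constants; it illustrates the direction of the effect and is not the authors' calculation.

```python
# Minimal zero-dimensional energy-balance sketch (my own illustration, not the
# paper's model): equilibrium emission temperature for a given planetary albedo.
# Absorbed solar, S*(1 - albedo)/4, must balance emitted sigma*T^4.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1361.0    # solar constant, W m^-2

def emission_temperature(albedo):
    return (SOLAR * (1 - albedo) / (4 * SIGMA)) ** 0.25

t_ref = emission_temperature(0.30)   # canonical planetary albedo
t_low = emission_temperature(0.29)   # slightly darker planet (e.g., fewer low clouds)
print(f"albedo 0.30 -> {t_ref:.1f} K, albedo 0.29 -> {t_low:.1f} K, "
      f"warming {t_low - t_ref:+.2f} K")
```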

Balancing the numbers on radiation

Years with strong El Niño conditions tend to break records, but the 2023 El Niño was relatively mild. The phenomenon's effects are also felt most directly in the tropical Pacific, yet ocean temperatures set records in the Atlantic and contributed to a massive retreat of sea ice near Antarctica. So there are clearly limits to what can be attributed to El Niño. Other influences that have been considered include the injection of water vapor into the stratosphere by the Hunga Tonga eruption and a reduction in sulfur emissions due to new rules governing international shipping. 2023 also corresponds to a peak in the most recent solar cycle.

Read full article

Comments

© NASA

Google’s DeepMind tackles weather forecasting, with great performance

4 December 2024 at 17:06

By some measures, AI systems are now competitive with traditional computing methods for generating weather forecasts. Because their training penalizes errors, however, the forecasts tend to get "blurry": as you move further ahead in time, the models hedge toward an average of the possible outcomes rather than committing to specific predictions that are more likely to be wrong. As a result, you start to see things like storm tracks broadening and the storms themselves losing clearly defined edges.
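
That blurring is a direct consequence of optimizing a single forecast against an uncertain future. The toy example below (entirely invented, not DeepMind's setup) shows that when a storm is equally likely to end up in one of two places, the error-minimizing single prediction is the smeared-out midpoint, while sampling possible outcomes, as a generative model does, keeps each forecast sharp.

```python
import numpy as np

# Toy illustration of why error-penalized training blurs forecasts: if a storm
# is equally likely to end up at x=-1 or x=+1, the single prediction that
# minimizes mean squared error is the average, x=0 -- a "storm" smeared between
# the two real possibilities. Sampling outcomes instead keeps forecasts sharp.

rng = np.random.default_rng(0)
outcomes = rng.choice([-1.0, 1.0], size=10_000)   # equally likely storm positions

candidates = np.linspace(-1.5, 1.5, 301)
mse = [(np.mean((outcomes - c) ** 2), c) for c in candidates]
best_mse, best_point = min(mse)
print(f"MSE-optimal single forecast: x={best_point:+.2f} (the blurry midpoint)")

samples = rng.choice(outcomes, size=5)            # an ensemble of sampled forecasts
print("Sampled (sharp) ensemble members:", samples)
```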

But using AI is still extremely tempting because the alternative, a computational atmospheric circulation model, is hugely compute-intensive. That approach is highly successful, though, with the ensemble model from the European Centre for Medium-Range Weather Forecasts considered the best in class.

In a paper being released today, Google's DeepMind claims its new AI system manages to outperform the European model on forecasts out to at least a week and often beyond. DeepMind's system, called GenCast, merges some computational approaches used by atmospheric scientists with a diffusion model, commonly used in generative AI. The result is a system that maintains high resolution while cutting the computational cost significantly.

Read full article

Comments

Qubit that makes most errors obvious now available to customers

20 November 2024 at 20:58

We're nearing the end of the year, when there is typically a flood of announcements regarding quantum computers, in part because some companies want to live up to promised schedules. Most of these involve evolutionary improvements on previous generations of hardware. But this year, we have something new: the first company to reach the market with a new qubit technology.

The technology is called a dual-rail qubit, and it is intended to make the most common form of error trivially easy to detect in hardware, thus making error correction far more efficient. And, while tech giant Amazon has been experimenting with them, a startup called Quantum Circuits is the first to give the public access to dual-rail qubits via a cloud service.
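
The general idea can be captured in a toy simulation. In a dual-rail qubit, the logical state is a single photon shared between two modes, so losing the photon, the dominant hardware error, leaves both rails empty: a state outside the codespace that a simple occupancy check can flag as an erasure at a known location. The sketch below uses made-up error rates and is not Quantum Circuits' implementation.

```python
import random

# Toy Monte Carlo of the dual-rail idea: the qubit is one photon shared between
# two modes ("rails"). Losing the photon empties both rails, and an "is a
# photon still there?" check flags that as an erasure; rarer phase errors stay
# in the codespace and go unflagged. Error rates below are assumptions.

P_LOSS = 0.02       # assumed per-step chance of losing the photon (dominant error)
P_PHASE = 0.002     # assumed per-step chance of an unflagged phase error
STEPS = 100_000

random.seed(1)
flagged, silent = 0, 0
for _ in range(STEPS):
    if random.random() < P_LOSS:
        flagged += 1        # photon gone: occupancy check detects an erasure
    elif random.random() < P_PHASE:
        silent += 1         # phase flip: stays in the codespace, not flagged

total = flagged + silent
print(f"{total} errors, {flagged / total:.1%} flagged as erasures, "
      f"{silent / total:.1%} silent")
```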

While the tech is interesting on its own, it also provides us with a window into how the field as a whole is thinking about getting error-corrected quantum computing to work.

Read full article

Comments

© Quantum Circuits

Microsoft and Atom Computing combine for quantum error correction demo

19 November 2024 at 21:00

In September, Microsoft made an unusual combination of announcements. It demonstrated progress with quantum error correction, something that will be needed for the technology to move much beyond the interesting demo phase, using hardware from a quantum computing startup called Quantinuum. At the same time, however, the company also announced that it was forming a partnership with a different startup, Atom Computing, which uses a different technology to make qubits available for computations.

Given that, it was probably inevitable that the folks in Redmond, Washington, would want to show that similar error correction techniques would also work with Atom Computing's hardware. It didn't take long, as the two companies are releasing a draft manuscript describing their work on error correction today. The paper serves both as a good summary of where things currently stand in the world of error correction and as a good look at some of the distinct features of computation using neutral atoms.

Atoms and errors

While we have various technologies for storing and manipulating bits of quantum information, none of them can be operated error-free. At present, errors make it difficult to complete even the simplest computations that are clearly beyond the capabilities of classical computers. More sophisticated algorithms would inevitably encounter an error before they could be completed, a situation that would remain true even if we could somehow improve the hardware error rates of qubits by a factor of 1,000, something we're unlikely to ever manage.
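
Some quick arithmetic shows why better hardware alone doesn't solve this: the chance of finishing N operations without an error is (1 - p)^N, which collapses once N approaches 1/p. The rates below are round illustrative figures, not measured values.

```python
import math

# Back-of-the-envelope illustration (not from the paper) of why lower error
# rates alone can't rescue long algorithms: P(no error in N ops) = (1 - p)^N.

def success_probability(per_op_error, n_ops):
    return math.exp(n_ops * math.log1p(-per_op_error))

for p in (1e-3, 1e-6):                    # today's rough error rate, and 1,000x better
    for n_ops in (1e4, 1e6, 1e9):         # short demos vs. genuinely useful algorithms
        print(f"p={p:.0e}, {n_ops:.0e} ops -> "
              f"P(no error) = {success_probability(p, int(n_ops)):.3g}")
```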

Read full article

Comments

© Atom Computing

IBM boosts the amount of computation you can get done on quantum hardware

13 November 2024 at 22:42

There's a general consensus that we won't be able to consistently perform sophisticated quantum calculations without the development of error-corrected quantum computing, which is unlikely to arrive until the end of the decade. It's still an open question, however, whether we could perform limited but useful calculations at an earlier point. IBM is one of the companies that's betting the answer is yes, and on Wednesday, it announced a series of developments aimed at making that possible.

On their own, none of the changes being announced are revolutionary. But collectively, changes across the hardware and software stacks have produced much more efficient and less error-prone operations. The net result is a system that supports the most complicated calculations yet on IBM's hardware, leaving the company optimistic that its users will find some calculations where quantum hardware provides an advantage.

Better hardware and software

IBM's early efforts in the quantum computing space saw it ramp up the qubit count rapidly, making it one of the first companies to reach the 1,000-qubit mark. However, each of those qubits had an error rate that ensured any algorithm trying to use all of them in a single calculation would inevitably hit an error. Since then, the company's focus has been on improving the performance of smaller processors. Wednesday's announcement was based on the introduction of the second version of its Heron processor, which has 156 qubits (up from 133 in the first revision). That's still more than classical computers can simulate, provided the processor can operate with sufficiently low error rates.
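
The "beyond classical simulation" claim mostly comes down to memory: a brute-force simulation has to track 2^n complex amplitudes, which for 156 qubits is an absurd amount of storage (though specialized methods can do better for circuits with limited entanglement). The arithmetic, as a quick sanity check rather than anything from IBM:

```python
# Quick arithmetic on why a 156-qubit state can't be tracked by brute force:
# a full state vector needs 2**n complex amplitudes. This is only the naive
# bound; clever methods can do better for weakly entangled circuits.

N_QUBITS = 156
BYTES_PER_AMPLITUDE = 16          # one complex number at double precision

amplitudes = 2 ** N_QUBITS
memory_bytes = amplitudes * BYTES_PER_AMPLITUDE
print(f"{amplitudes:.2e} amplitudes, ~{memory_bytes:.2e} bytes "
      f"(~{memory_bytes / 1e21:.1e} zettabytes) just to store one state")
```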

Read full article

Comments

© IBM Research

What did the snowball Earth look like?

13 November 2024 at 17:25

By now, it has been firmly established that the Earth went through a series of global glaciations around 600 million to 700 million years ago, shortly before complex animal life exploded in the Cambrian. Climate models have confirmed that, once enough of the dark ocean surface is covered by reflective ice, a cooling feedback kicks in that turns the entire planet into an icehouse. And we've found glacial material that was deposited off the coasts in the tropics.

We have an extremely incomplete picture of what these snowball periods looked like, and Antarctic terrain provides different models for what an icehouse continent might look like. But now, researchers have found deposits that they argue were formed beneath a massive ice sheet that was being melted from below by volcanic activity. And, although the deposits are currently in Colorado's Front Range, at the time they resided much closer to the equator.

In the icehouse

Glacial deposits can be difficult to identify in deep time. Massive sheets of ice will scour the terrain down to bare rock, leaving behind loosely consolidated bits of rubble that can easily be swept away after the ice is gone. We can spot when that rubble shows up in ocean deposits to confirm there were glaciers along the coast, but rubble can be difficult to find on land.

Read full article

Comments

© MARK GARLICK/SCIENCE PHOTO LIBRARY

Researchers spot black hole feeding at 40x its theoretical limit

4 November 2024 at 21:21

How did supermassive black holes end up at the center of every galaxy? A while back, it wasn't that hard to explain: That's where the highest concentration of matter is, and the black holes had billions of years to feed on it. But as we've looked ever deeper into the Universe's history, we keep finding supermassive black holes at earlier and earlier times, which shortens the timeline available for their formation. Rather than making a leisurely meal of nearby matter, these black holes have gorged themselves in a feeding frenzy.

With the advent of the Webb Space Telescope, the problem has pushed up against theoretical limits. The matter falling into a black hole generates radiation, with faster feeding meaning more radiation. And that radiation can drive off nearby matter, choking off the black hole's food supply. That sets a limit on how fast black holes can grow unless matter is somehow fed directly into them. The Webb has been used to identify early supermassive black holes that would have needed to push against that limit for their entire existence.
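
The limit in question is the Eddington limit, where the outward push of radiation on infalling gas balances gravity. A minimal sketch of the numbers, assuming a 10 percent radiative efficiency (an assumption of mine, not a figure from the paper):

```python
import math

# Rough numbers behind the theoretical limit (the Eddington limit), as a sanity
# check rather than the authors' calculation: the luminosity at which radiation
# pressure balances gravity, and the accretion rate that implies.

G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
M_P = 1.673e-27       # proton mass, kg
C = 2.998e8           # speed of light, m/s
SIGMA_T = 6.652e-29   # Thomson scattering cross-section, m^2
M_SUN = 1.989e30      # solar mass, kg
YEAR = 3.156e7        # seconds per year

def eddington_luminosity(mass_kg):
    return 4 * math.pi * G * mass_kg * M_P * C / SIGMA_T

def eddington_accretion_rate(mass_kg, efficiency=0.1):   # efficiency is assumed
    return eddington_luminosity(mass_kg) / (efficiency * C ** 2)

mass = 1e6 * M_SUN    # a modest supermassive black hole
rate = eddington_accretion_rate(mass) * YEAR / M_SUN
print(f"Eddington rate for 1e6 solar masses: ~{rate:.2f} solar masses/year; "
      f"feeding at 40x that would be ~{40 * rate:.1f} solar masses/year")
```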

But the Webb may have just identified a solution to the dilemma as well. It has spotted a black hole that appears to have been feeding at 40 times the theoretical limit for millions of years, allowing growth at a pace sufficient to build a supermassive black hole.

Read full article

Comments

© NOIRLab/NSF/AURA/J. da Silva/M. Zamani

RFK Jr. claims Trump promised to put him in charge of NIH, CDC, and more

31 October 2024 at 22:20

Earlier this week, Robert F. Kennedy Jr. used a Zoom call to tell his supporters that Donald Trump had promised him "control" of the Department of Health and Human Services (HHS), the federal agency that includes the Centers for Disease Control and Prevention, the Food and Drug Administration, and the National Institutes of Health, as well as of the separate Department of Agriculture. Given Kennedy's support for debunked anti-vaccine nonsense, this represents a potential public health nightmare.

A few days later, Howard Lutnick, a co-chair of Trump's transition team, appeared on CNN to deny that RFK Jr. would be put in charge of HHS. But he followed that with a long rant in which he echoed Kennedy's spurious claims about vaccines. This provides yet another indication of how deeply anti-vaccine activism has become enmeshed with Republican politics, to the point where the outcome for public health may be just as bad even if Kennedy isn't appointed.

Trump as Kennedy’s route to power

Kennedy has a long history of misinformation regarding health, with a special focus on vaccines. This includes the extensively debunked suggestion that there is a correlation between vaccinations and autism incidence, and it extends to a general skepticism about vaccine safety. That's mixed with conspiracy theories regarding collusion between federal regulators and pharmaceutical companies.

Read full article

Comments

© Anna Moneymaker / Staff

A how-to for ethical geoengineering research

26 October 2024 at 11:21

Over the Northern Hemisphere's summer, the world's temperatures hovered near 1.5° C above pre-industrial levels, and the catastrophic weather events that ensued provided a preview of what's likely to become the new normal before mid-century. And the warming won't stop there; on our current emissions trajectory, that temperature increase will double by the time the century is out, with more warming to follow beyond its end.

This frightening trajectory and its results have led many people to argue that some form of geoengineering is necessary. If we know the effects of that much warming will be catastrophic, why not try canceling some of it out? Unfortunately, the list of "why nots" includes the fact that we don't know how well some of these techniques work or fully understand their unintended consequences. This means more research is required before we put them into practice.

But how do we do that research if there's the risk of unintended consequences? To help guide the process, the American Geophysical Union (AGU) has just released guidelines for ensuring that geoengineering research is conducted ethically.

Read full article

Comments

© Handout / Getty Images

With four more years like 2023, carbon emissions will blow past 1.5° limit

24 October 2024 at 20:23

On Thursday, the United Nations Environment Programme (UNEP) released a report on what it terms the "emissions gap": the difference between where we're heading and where we'd need to be to achieve the goals set out in the Paris Agreement. It makes for some pretty grim reading. Given last year's greenhouse gas emissions, we can afford fewer than four similar years before we exceed the total emissions compatible with limiting the planet's warming to 1.5° C above pre-industrial conditions. Following existing policies out to the end of the century would leave us facing over 3° C of warming.
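
The "fewer than four similar years" framing is just the remaining carbon budget divided by a year's worth of emissions. A quick sketch with round, illustrative figures of the sort the report works with, not its exact numbers:

```python
# Simple division behind the "fewer than four similar years" framing.
# The figures below are rough, illustrative values, not quoted from the report.

REMAINING_BUDGET_GT = 200    # approximate remaining CO2 budget for 1.5 C, gigatonnes
ANNUAL_EMISSIONS_GT = 57     # approximate global emissions in 2023, gigatonnes CO2e

years_left = REMAINING_BUDGET_GT / ANNUAL_EMISSIONS_GT
print(f"~{years_left:.1f} years of 2023-level emissions before the 1.5 C budget is gone")
```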

The report ascribes this situation to two distinct emissions gaps: one between the goals of the Paris Agreement and what countries have pledged to do, and another between those pledges and the policies they've actually put in place. There are some reasons to think rapid progress could be made; the six largest greenhouse gas emitters accounted for nearly two-thirds of global emissions, so it wouldn't take many policy changes to make a big difference. And the report suggests that increased deployment of wind and solar could handle over a quarter of the needed emissions reductions.

But so far, progress has been far too limited to cut into global emissions.

Read full article

Comments

© Mario Tama

De-extinction company provides a progress report on thylacine efforts

22 October 2024 at 23:13

Colossal, the company founded to try to restore the mammoth to the Arctic tundra, has also decided to tackle other species that went extinct relatively recently: the dodo and the thylacine. Because of significant differences in biology, not least the long generation time of proboscideans, these other efforts may reach many critical milestones well in advance of the work on mammoths.

Late last week, Colossal released a progress report on the work involved in resurrecting the thylacine, also known as the Tasmanian tiger, which went extinct when the last known survivor died in a zoo in 1936. Marsupial biology has some features that may make de-extinction somewhat easier, but we have far less sophisticated ways of manipulating it compared to the technology we've developed for working with the stem cells and reproduction of placental mammals. But, based on these new announcements, the technology available for working with marsupials is expanding rapidly.

Cane toad resistance

Colossal has branched out from its original de-extinction mission to include efforts to keep species from ever needing its services. In the case of marsupial predators, the de-extinction effort is incorporating work that will benefit existing marsupial predators: generating resistance to the toxins found on the cane toad, an invasive species that has spread widely across Australia.

Read full article

Comments

© Universal History Archive

Simple voltage pulse can restore capacity to Li-Si batteries

18 October 2024 at 11:15

If you're using a large battery for a specialized purpose—say grid-scale storage or an electric vehicle—then it's possible to tweak the battery chemistry, provide a little bit of excess capacity, and carefully manage its charging and discharging so that it enjoys a long life span. But for consumer electronics, the batteries are smaller, the need for light weight dictates the chemistry, and the demand for quick charging can be higher. So most batteries in our gadgets start to see serious degradation after just a couple of years of use.

A big contributor to that is internal fragmentation of the electrode materials. This leaves some of the electrode material disconnected from the battery's charge-handling system, essentially stranding it inside the battery and uselessly trapping some of the lithium. Now, researchers have found that, for at least one battery chemistry, it's possible to partially reverse this decay, boosting the remaining capacity of the battery by up to 30 percent.

The only problem is that not many batteries use the specific chemistry tested here. But it does show how understanding what's going on inside batteries can provide us with ways to extend their lifespan.

Read full article

Comments

© da-kuk

Amazon joins Google in investing in small modular nuclear power

16 October 2024 at 16:47

On Tuesday, Google announced that it had made a power purchase agreement for electricity generated by a small modular nuclear reactor design that hasn't even received regulatory approval yet. Today, it's Amazon's turn. The company's Amazon Web Services (AWS) group has announced three different investments, including one targeting a different startup with its own design for small modular reactors, a design that likewise has not yet received regulatory approval.

Unlike Google's deal, which is a commitment to purchase power should the reactors ever be completed, Amazon will lay out some money upfront as part of the agreements. We'll take a look at the deals and technology that Amazon is backing before analyzing why companies are taking a risk on unproven technologies.

Money for utilities and a startup

Two of Amazon's deals are with utilities that serve areas where it already has a significant data center footprint. One of these is Energy Northwest, an energy supplier that sends power to utilities in the Pacific Northwest. Amazon is putting up the money for Energy Northwest to study the feasibility of adding small modular reactors to its Columbia Generating Station, which currently houses a single large reactor. In return, Amazon will get the right to purchase power from an initial installation of four small modular reactors. The site could potentially support additional reactors, which Energy Northwest could use to meet demand from other customers.

Read full article

Comments

© X-energy

People think they already know everything they need to make decisions

14 October 2024 at 18:56

The world is full of people who have excessive confidence in their own abilities. This is famously captured by the Dunning-Kruger effect, which describes how people who lack expertise in something necessarily lack the knowledge needed to recognize their own limits. Now, a different set of researchers has come out with what might be viewed as a corollary to Dunning-Kruger: People have a strong tendency to believe that they always have enough data to make an informed decision, regardless of what information they actually have.

The work, done by Hunter Gehlbach, Carly Robinson, and Angus Fletcher, is based on an experiment in which they intentionally gave people only partial, biased information, finding that people never seemed to consider they might only have a partial picture. "Because people assume they have adequate information, they enter judgment and decision-making processes with less humility and more confidence than they might if they were worrying whether they knew the whole story or not," they write. The good news? When given the full picture, most people are willing to change their opinions.

Ignorant but confident

The basic setup of the experiment is very straightforward. The researchers developed a scenario where an ongoing water shortage was forcing a school district to consider closing one of its schools and merging its students into another existing school. They then wrote an article that described the situation and contained seven different pieces of information: three that favored merging, three that disfavored it, and one that was neutral. Just over half of the control group that read the full article favored merging the two schools.

Read full article

Comments

© LUDOVIC MARIN

Climate change boosted Milton’s landfall strength from Category 2 to 3

11 October 2024 at 20:05

As attempts to clean up after Hurricane Milton are beginning, scientists at the World Weather Attribution project have taken a quick look at whether climate change contributed to its destructive power. While the analysis is limited by the fact that not all the meteorological data is even available yet, by several measures, climate change made aspects of Milton significantly more likely.

This isn't a huge surprise, given that Milton traveled across the same exceptionally warm Gulf of Mexico that Helene had recently transited. But the analysis does produce one striking result: Milton would have been a Category 2 storm at landfall if climate change weren't boosting its strength.

From the oceans to the skies

Hurricanes strengthen while over warm ocean waters, and climate change has been slowly cranking up the heat content of the oceans. But it's important to recognize that the slow warming is an average, and it can include some localized extreme events. This year has seen lots of ocean temperature records set in the Atlantic basin, and that seems to be true in the Gulf of Mexico as well. The researchers note that a different rapid analysis released earlier this week showed that the ocean temperatures that boosted Milton to a Category 5 storm during its time in the Gulf were made between 400 and 800 times more likely by climate change.

Read full article

Comments

© Frank Ramspott

Rapid analysis finds climate change’s fingerprint on Hurricane Helene

9 October 2024 at 21:50

Hurricane Helene crossed the Gulf of Mexico at a time when sea surface temperatures were at record highs and then barreled into a region where heavy rains had left the ground saturated. The result was historic, catastrophic flooding.

One key question is how soon we might expect history to repeat itself. Our rapidly warming planet has tilted the odds in favor of some extreme weather events in a way that means we can expect some events that had been extremely rare to start occurring with some regularity. Our first stab at understanding climate change's influence on Helene was released on Wednesday, and it suggests that rainfall of the sort experienced by the Carolinas may now be a once-in-70-year event, which could have implications for how we rebuild some of the communities shattered by the rain.

Rapid attribution

The quick analysis was done by the World Weather Attribution project, which has developed peer-reviewed methods of looking for the fingerprints of climate change in major weather events. In general, this involves identifying the key weather patterns that produced the event and then exploring their frequency using climate models run with and without the carbon dioxide we've added to the atmosphere.
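
In outline, the attribution statement is a probability ratio: how much more often does an event at least as extreme as the observed one occur in modeled worlds with our emissions than in modeled worlds without them? The toy calculation below uses invented distributions purely to show the arithmetic; it is not the World Weather Attribution workflow.

```python
import numpy as np

# Toy version of the probability-ratio logic: simulate a rainfall index in a
# "factual" world with warming and a "counterfactual" world without it, then
# compare how often each exceeds the observed extreme. Distributions invented.

rng = np.random.default_rng(42)
counterfactual = rng.gumbel(loc=100.0, scale=15.0, size=200_000)  # no added CO2
factual = rng.gumbel(loc=110.0, scale=15.0, size=200_000)         # warmed world

OBSERVED = 160.0  # the event being attributed, in the same arbitrary units

p_factual = np.mean(factual >= OBSERVED)
p_counter = np.mean(counterfactual >= OBSERVED)
print(f"P(event | warming) = {p_factual:.4f}, "
      f"P(event | no warming) = {p_counter:.4f}, "
      f"probability ratio ~ {p_factual / p_counter:.1f}x")
```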

Read full article

Comments

© Frank Ramspott
