All's not as it appears, this tale has many twists -
but if I wasn't here documenting the story
would that mean that the plot did not exist?

— Peter Hammill

Universe Today

Space and astronomy news

Does the Rise of AI Explain the Great Silence in the Universe?

Mon, 04/08/2024 - 3:18pm

Artificial Intelligence is making its presence felt in thousands of different ways. It helps scientists make sense of vast troves of data; it helps detect financial fraud; it drives our cars; it feeds us music suggestions; its chatbots drive us crazy. And it’s only getting started.

Are we capable of understanding how quickly AI will continue to develop? And if the answer is no, does that constitute the Great Filter?

The Fermi Paradox is the discrepancy between the apparent high likelihood of advanced civilizations existing and the total lack of evidence that they do exist. Many solutions have been proposed for why the discrepancy exists. One of the ideas is the “Great Filter.”

The Great Filter is a hypothesized event or situation that prevents intelligent life from becoming interplanetary and interstellar and even leads to its demise. Think climate change, nuclear war, asteroid strikes, supernova explosions, plagues, or any number of other things from the rogue’s gallery of cataclysmic events.

Or how about the rapid development of AI?

A new paper in Acta Astronautica explores the idea that Artificial Intelligence becomes Artificial Super Intelligence (ASI) and that ASI is the Great Filter. The paper’s title is “Is Artificial Intelligence the Great Filter that makes advanced technical civilizations rare in the universe?” The author is Michael Garrett from the Department of Physics and Astronomy at the University of Manchester.

“Without practical regulation, there is every reason to believe that AI could represent a major threat to the future course of not only our technical civilization but all technical civilizations.”

Michael Garrett, University of Manchester

Some think the Great Filter prevents technological species like ours from becoming multi-planetary. That’s bad because a species is at greater risk of extinction or stagnation with only one home. According to Garrett, a species is in a race against time without a backup planet. “It is proposed that such a filter emerges before these civilizations can develop a stable, multi-planetary existence, suggesting the typical longevity (L) of a technical civilization is less than 200 years,” Garrett writes.

If true, that could explain why we detect no technosignatures or other evidence of ETIs (Extraterrestrial Intelligences). What does that tell us about our own technological trajectory? If we face a 200-year constraint, and if it’s because of ASI, where does that leave us? Garrett underscores the “…critical need to quickly establish regulatory frameworks for AI development on Earth and the advancement of a multi-planetary society to mitigate against such existential threats.”
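Garrett’s sub-200-year longevity is easiest to appreciate through the Drake equation, where the lifetime L of a communicating civilization multiplies every other factor. A minimal sketch, assuming placeholder values for the other six factors (none of these numbers come from the paper):

```python
# Illustrative Drake-equation sketch of how civilization longevity L drives
# the expected number N of detectable civilizations. The factor values below
# are placeholder assumptions, not figures from Garrett's paper.

def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* * f_p * n_e * f_l * f_i * f_c * L (Drake, 1961)."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# One common set of illustrative guesses for the first six factors:
factors = dict(R_star=1.0, f_p=1.0, n_e=0.2, f_l=0.5, f_i=0.1, f_c=0.1)

n_short = drake(**factors, L=200)        # a 200-year technical lifetime
n_long = drake(**factors, L=1_000_000)   # a million-year lifetime

print(f"N with L = 200 yr:       {n_short:.1f}")
print(f"N with L = 1,000,000 yr: {n_long:.0f}")
```

With those (entirely debatable) inputs, an L of 200 years leaves a fraction of one detectable civilization in the whole galaxy, while a million-year L leaves a thousand – which is why a short L can “explain” the silence.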

An image of our beautiful Earth taken by the Galileo spacecraft in 1990. Do we need a backup home? Credit: NASA/JPL

Many scientists and other thinkers say we’re on the cusp of enormous transformation. AI is just beginning to transform how we do things; much of the transformation is behind the scenes. AI seems poised to eliminate jobs for millions, and when paired with robotics, the transformation seems almost unlimited. That’s a fairly obvious concern.

But there are deeper, more systemic concerns. Who writes the algorithms? Will AI discriminate somehow? Almost certainly. Will competing algorithms undermine powerful democratic societies? Will open societies remain open? Will ASI start making decisions for us, and who will be accountable if it does?

This is an expanding tree of branching questions with no clear terminus.

Stephen Hawking (RIP) famously warned that AI could end humanity if it begins to evolve independently. “I fear that AI may replace humans altogether. If people design computer viruses, someone will design AI that improves and replicates itself. This will be a new form of life that outperforms humans,” he told Wired magazine in 2017. Once AI can outperform humans, it becomes ASI.

Stephen Hawking was a major proponent for colonizing other worlds, mainly to ensure humanity does not go extinct. In later years, Hawking recognized that AI could be an extinction-level threat. Credit: educatinghumanity.com

Hawking may be one of the most recognizable voices to issue warnings about AI, but he’s far from the only one. The media is full of discussions and warnings, alongside articles about the work AI does for us. The most alarming warnings say that ASI could go rogue. Some people dismiss that as science fiction, but not Garrett.

“Concerns about Artificial Superintelligence (ASI) eventually going rogue is considered a major issue – combatting this possibility over the next few years is a growing research pursuit for leaders in the field,” Garrett writes.

If AI provided no benefits, the issue would be much easier. But it provides all kinds of benefits, from improved medical imaging and diagnosis to safer transportation systems. The trick for governments is to allow benefits to flourish while limiting damage. “This is especially the case in areas such as national security and defence, where responsible and ethical development should be paramount,” writes Garrett.

News reports like this might seem impossibly naive in a few years or decades.

The problem is that we and our governments are unprepared. There’s never been anything like AI, and no matter how we try to conceptualize it and understand its trajectory, we’re left wanting. And if we’re in this position, so would any other biological species that develops AI. The advent of AI and then ASI could be universal, making it a candidate for the Great Filter.

This is the risk ASI poses in concrete terms: It could no longer need the biological life that created it. “Upon reaching a technological singularity, ASI systems will quickly surpass biological intelligence and evolve at a pace that completely outstrips traditional oversight mechanisms, leading to unforeseen and unintended consequences that are unlikely to be aligned with biological interests or ethics,” Garrett explains.

How could ASI relieve itself of the pesky biological life that corrals it? It could engineer a deadly virus, it could inhibit agricultural food production and distribution, it could force a nuclear power plant to melt down, and it could start wars. We don’t really know because it’s all uncharted territory. Hundreds of years ago, cartographers would draw monsters on the unexplored regions of the world, and that’s kind of what we’re doing now.

This is a portion of the Carta Marina map from the year 1539. It shows monsters lurking in the unknown waters off of Scandinavia. Are the fears of ASI kind of like this? Or could ASI be the Great Filter? Image Credit: By Olaus Magnus – http://www.npm.ac.uk/rsdas/projects/carta_marina/carta_marina_small.jpg, Public Domain, https://commons.wikimedia.org/w/index.php?curid=558827

If this all sounds forlorn and unavoidable, Garrett says it’s not.

His analysis so far is based on ASI and humans occupying the same space. But if we can attain multi-planetary status, the outlook changes. “For example, a multi-planetary biological species could take advantage of independent experiences on different planets, diversifying their survival strategies and possibly avoiding the single-point failure that a planetary-bound civilization faces,” Garrett writes.

If we can distribute the risk across multiple planets around multiple stars, we can buffer ourselves against the worst possible outcomes of ASI. “This distributed model of existence increases the resilience of a biological civilization to AI-induced catastrophes by creating redundancy,” he writes.

If one of the planets or outposts that future humans occupy fails to survive the ASI technological singularity, others may survive. And they would learn from it.

Artist’s illustration of a SpaceX Starship landing on Mars. If we can become a multi-planetary species, the threat of ASI is diminished. Credit: SpaceX

Multi-planetary status might even do more than just survive ASI. It could help us master it. Garrett imagines situations where we can experiment more thoroughly with AI while keeping it contained. Imagine AI on an isolated asteroid or dwarf planet, doing our bidding without access to the resources required to escape its prison. “It allows for isolated environments where the effects of advanced AI can be studied without the immediate risk of global annihilation,” Garrett writes.

But here’s the conundrum. AI development is proceeding at an accelerating pace, while our attempts to become multi-planetary aren’t. “The disparity between the rapid advancement of AI and the slower progress in space technology is stark,” Garrett writes.

The difference is that AI is computational and informational, while space travel faces multiple physical obstacles that we don’t yet know how to overcome. Our own biological nature restrains space travel, but no such obstacle restrains AI. “While AI can theoretically improve its own capabilities almost without physical constraints,” Garrett writes, “space travel must contend with energy limitations, material science boundaries, and the harsh realities of the space environment.”

For now, AI operates within the constraints we set. But that may not always be the case. We don’t know when AI might become ASI or even if it can. But we can’t ignore the possibility. That leads to two intertwined conclusions.

If Garrett is correct, humanity must work more diligently on space travel. It can seem far-fetched, but knowledgeable people know it’s true: Earth will not be habitable forever. Humanity will perish here, by our own hand or nature’s, if we don’t expand into space. Garrett’s 200-year estimate just puts an exclamation point on it. A renewed emphasis on reaching the Moon and Mars offers some hope.

The Artemis program is a renewed effort to establish a presence on the Moon. After that, we could visit Mars. Are these our first steps to becoming a multi-planetary civilization? Image Credit: NASA

The second conclusion concerns legislating and governing AI, a difficult task in a world where psychopaths can gain control of entire nations and are bent on waging war. “While industry stakeholders, policymakers, individual experts, and their governments already warn that regulation is necessary, establishing a regulatory framework that can be globally acceptable is going to be challenging,” Garrett writes. Challenging barely describes it. Humanity’s internecine squabbling makes it all even more unmanageable. Also, no matter how quickly we develop guidelines, ASI might change even more quickly.

“Without practical regulation, there is every reason to believe that AI could represent a major threat to the future course of not only our technical civilization but all technical civilizations,” Garrett writes.

This is the United Nations General Assembly. Are we united enough to constrain AI? Image Credit: By Patrick Gruban, cropped and downsampled by Pine – originally posted to Flickr as UN General Assembly, CC BY-SA 2.0, https://commons.wikimedia.org/w/index.php?curid=4806869

Many of humanity’s hopes and dreams crystallize around the Fermi Paradox and the Great Filter. Are there other civilizations? Are we in the same situation as other ETIs? Will our species leave Earth? Will we navigate the many difficulties that face us? Will we survive?

If we do, it might come down to what can seem boring and workaday: wrangling over legislation.

“The persistence of intelligent and conscious life in the universe could hinge on the timely and effective implementation of such international regulatory measures and technological endeavours,” Garrett writes.

The post Does the Rise of AI Explain the Great Silence in the Universe? appeared first on Universe Today.

Categories: Astronomy

If We Want to Visit More Asteroids, We Need to Let the Spacecraft Think for Themselves

Mon, 04/08/2024 - 12:22pm

Missions to asteroids have been on a tear recently. Rosetta, OSIRIS-REx, and Hayabusa2 have all visited small bodies and, in some cases, successfully returned samples to Earth. But as humanity starts reaching out to asteroids, it will run into a significant technical problem – bandwidth. There are tens of thousands of asteroids in our vicinity, some of which could potentially be dangerous. If we launched a mission to collect the necessary data about each of them, our interplanetary communication and control infrastructure would be quickly overwhelmed. So why not let our robotic ambassadors think for themselves? That’s the idea behind a new paper from researchers at the Federal University of São Paulo and Brazil’s National Institute for Space Research.

The paper primarily focuses on the control problem of what to do when a spacecraft approaches a new asteroid. Current missions take months to approach and require consistent feedback from ground teams to ensure the spacecraft understands the parameters of the asteroid it’s approaching – especially its standard gravitational parameter (the product of the gravitational constant and the asteroid’s mass).

Some missions have seen more success with that than others – for example, Philae, the lander that went along with Rosetta, had trouble when it bounced off the surface of comet 67P/Churyumov-Gerasimenko. As the authors point out, part of the problem was a massive discrepancy between the comet’s actual shape and the shape telescopes had observed before Rosetta arrived.

Fraser discusses the possibility of capturing an asteroid.

Even more successful missions, such as OSIRIS-REx, take months of lead-up time to complete relatively trivial maneuvers in the context of the millions of kilometers of their overall journeys. For example, it took 20 days for OSIRIS-REx to perform multiple flybys at 7 km above the asteroid’s surface before mission control deemed it safe to enter a stable orbit.

One of the significant constraints mission controllers were watching was whether they could accurately calculate the gravitational parameter of the asteroid they were visiting. An asteroid’s gravity is notoriously difficult to determine from far away, and its miscalculation contributed to Philae’s problems. So, what can an autonomous control scheme do to solve these problems?

Simply put, it can allow the spacecraft to decide for itself what to do when approaching its target. With a well-defined control scheme, the likelihood of a spacecraft failure due to some unforeseen circumstance is relatively minimal. It could dramatically decrease the time missions spend on approach and reduce the communication bandwidth required back to mission control on Earth.
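The key quantity the spacecraft must pin down on approach can be sketched with Kepler’s third law: once onboard sensors track a roughly circular orbit of radius a and period T, the asteroid’s standard gravitational parameter μ = GM follows directly. A back-of-envelope illustration – the orbit numbers are assumptions loosely inspired by OSIRIS-REx at Bennu, not values from the paper:

```python
import math

def mu_from_orbit(a_m, T_s):
    """Standard gravitational parameter (m^3/s^2) of the central body,
    from Kepler's third law: T^2 = 4*pi^2 * a^3 / mu."""
    return 4.0 * math.pi ** 2 * a_m ** 3 / T_s ** 2

# Suppose onboard tracking shows a circular orbit of radius 2000 m with a
# period of about 2.9 days (roughly what OSIRIS-REx measured at Bennu):
mu = mu_from_orbit(2000.0, 2.9 * 86400.0)
print(f"estimated mu ~ {mu:.2f} m^3/s^2")  # Bennu's published GM is ~4.9
```

The same one-liner recovers the Sun’s gravitational parameter from Earth’s orbit, which is a handy sanity check on the formula.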

One use case for quick asteroid missions – mining them, as Fraser discusses here.

Such a scheme would also require only four relatively ubiquitous, inexpensive sensors to operate effectively – a LiDAR (similar to those found on autonomous cars), two optical cameras for depth perception, and an inertial measurement unit (IMU) that measures parameters like orientation, acceleration, and magnetic field. 

The paper spends plenty of time detailing the complex math that would go into the control schema – some of which involve statistical calculations similar to basic learning models. The authors also run trials on two potential asteroid targets of interest to see how the system would perform.

One is already well understood. Bennu was the target of the OSIRIS-REx mission and, therefore, is well-characterized as asteroids go. According to the paper, with the new control system, a spacecraft could enter a 2000 m orbit within a day of approaching from hundreds of kilometers away, then enter an 800 m orbit the next day. This is compared to the months of preparatory work the actual OSIRIS-REx mission had to accomplish. And it can be completed with minimal thrust and, more importantly, minimal fuel – a precious commodity on deep-space missions.
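To see why “minimal thrust and fuel” is plausible around a body as small as Bennu, it helps to work out the orbital speeds involved. A rough sketch using Bennu’s published gravitational parameter of about 4.89 m³/s²; the Hohmann-style two-impulse transfer below is my simplification for illustration, not the control law from the paper:

```python
import math

MU_BENNU = 4.89  # m^3/s^2, published GM of Bennu

def v_circ(mu, r):
    """Circular orbital speed (m/s) at radius r (m)."""
    return math.sqrt(mu / r)

def period(mu, r):
    """Orbital period (s) of a circular orbit at radius r (m)."""
    return 2.0 * math.pi * math.sqrt(r ** 3 / mu)

def hohmann_dv(mu, r1, r2):
    """Total delta-v (m/s) for a two-impulse transfer between circular orbits."""
    a_t = 0.5 * (r1 + r2)  # semi-major axis of the transfer ellipse
    dv1 = abs(math.sqrt(mu * (2.0 / r1 - 1.0 / a_t)) - v_circ(mu, r1))
    dv2 = abs(v_circ(mu, r2) - math.sqrt(mu * (2.0 / r2 - 1.0 / a_t)))
    return dv1 + dv2

print(f"2000 m orbit: v = {100 * v_circ(MU_BENNU, 2000.0):.1f} cm/s, "
      f"T = {period(MU_BENNU, 2000.0) / 86400.0:.1f} days")
print(f"2000 m -> 800 m transfer: dv = {100 * hohmann_dv(MU_BENNU, 2000.0, 800.0):.1f} cm/s")
```

Orbital motion at 2000 m is a stately ~5 cm/s with a period of nearly three days, and dropping to the 800 m orbit costs only a few cm/s of delta-v – tiny maneuvers by interplanetary standards.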

Asteroid defense is another important use case for quick asteroid missions – as Isaac Arthur discusses in this video.
Credit – Isaac Arthur

Another demonstration mission is one to Eros, the second-largest near-Earth asteroid. It has a unique shape for an asteroid, as it is relatively elongated, which could pose an interesting challenge for automated systems like those described in the paper. Controlling a spacecraft with the new schema for a rendezvous with Eros doesn’t have all the same advantages as it does for a more conventionally shaped asteroid like Bennu. For example, the mission has a much higher thrust requirement and fuel consumption. However, the schema still shortens the mission time and the bandwidth required to operate it.

Autonomous systems are becoming increasingly popular on Earth and in space, and papers like this one push forward our thinking about what is possible. If all that’s required to eliminate months of painstaking manual technical work is to add a few sensors and implement a new control algorithm, it’s likely that one of the various agencies and companies planning to rendezvous with an asteroid in the near future will adopt that plan.

Learn More:
Negri et al. – Autonomous Rapid Exploration in Close-Proximity of an Asteroid
UT – Miniaturized Jumping Robots Could Study An Asteroid’s Gravity
UT – How to Make Asteroid Landings Safer
UT – A Spacecraft Could use Gravity to Prevent a Dangerous Asteroid Impact

Lead Image:
Artist’s conception of the Lucy mission to the Trojan asteroids.
Credit – NASA


Testing a Probe that Could Drill into an Ice World

Mon, 04/08/2024 - 12:12pm

I remember reading about an audacious mission concept to drill through the surface ice of Europa, drop in a submersible and explore the depths below. Now that concept may be taking a step closer to reality, with researchers working on technology to do just that. Worlds like Europa are high on the list for exploration due to their potential to harbour life. If technology like the SLUSH probe (Search for Life Using Submersible Head) works, then we are well on the way to realising that dream.

The search for life has always captivated the mind. Think about the diversity of life on Earth and it is easy to see why we typically envisage creatures that rely upon sunlight, food and drink. But on Earth, life has found a way in the most inhospitable of environments, even at the very bottom of the ocean. The Mariana Trench is deeper than Mount Everest is tall, and anything that lives there has to cope with cold water, crushingly high pressure and no sunlight. It seems quite alien, but even here life thrives – such as the deep-sea crustacean Hirondellea gigas (catchy name).

Location of the Mariana Trench. Credit: Wikipedia Commons/Kmusser

Europa, one of the moons of Jupiter, has an icy crust, but this covers a global ocean of liquid water. The conditions deep down in Europa’s ocean might not be so very different from those at the bottom of the Mariana Trench, so it is here that a glimmer of hope exists for finding other life in the Solar System. Should it exist, getting to it is the tricky bit. And it’s not just Europa: Enceladus and even Mars may have water underneath ice shelves. Layers of ice up to a kilometre thick might exist, so technology like SLUSH has been developed to overcome them.

Natural color image of Europa obtained by NASA’s Juno spacecraft. (Credit: NASA/JPL-Caltech/SwRI/MSSS/Kevin M. Gill)

The technology is not entirely new, since melt probes like SLUSH have been tested before. The idea is beautifully simple. The thermo-mechanical probe uses a drilling mechanism to break through the ice and a heated head to partially melt the ice chips, forming slush that is transported behind the probe as it descends.

The probe, which looks rather like a lightsabre, is then able to transmit data from the subsurface water back to the lander. A tether system handles the data transmission, using conductive microfilaments and a fibre-optic cable. Intriguingly, and perhaps even cunningly, should the fibre cable break (a possibility due to tidal stresses in the ice), the microfilaments will work as an antenna that the lander can tune into to resume data transmission. The tether is coiled up and housed inside spools, which are left behind in the ice as each spool is emptied. I must confess my immediate thought here was ‘litter’! I accept we have to leave probes behind in order to explore, but surely we can do it without leaving litter! However, there is a reason for this too: as the spools are deployed, they act as receivers and transmitters, allowing the radio signals to travel through the ice.

Honeybee Robotics, the company working on the device, has created prototypes. The first was stand-alone, had no data transmission capability, and demonstrated the drilling and slushing technology in an ice tower in Honeybee’s walk-in freezer. While this was underway, the tether communication technology was being tested too, with the first version called the Salmon Probe. This was taken to Devon Island in the Arctic, where the unspooling method is being put through its paces. The first attempts, back in 2022, saw the probe achieve depths of 1.8 m.

A further probe, called the Dolphin Probe, was capable of reaching depths of about 100 m, but sea-ice limitations meant it could only get to a depth of 2 m! Thus far, all the probes have performed well. Honeybee are now working on the Narwhal Probe, which will have more measuring equipment on board and a deployable tether and spool, and will be far closer to the finished product. If all goes to plan, it will profile the ice on Devon Island to a depth of 100 m. This is still quite short of the kilometre-thick ice expected on Europa, but it is most definitely fantastic progress toward exploring the cold watery depths of alien worlds.

Source : SLUSH: AN ICE DRILLING PROBE TO ACCESS OCEAN WORLDS


What Could We Build With Lunar Regolith?

Mon, 04/08/2024 - 10:56am

It has often been likened to talcum powder. The ultra-fine lunar surface material known as regolith is crushed volcanic rock. For visitors to the surface of the Moon it can be a health hazard, causing wear and tear on astronauts and their equipment, but it also has potential: the fine material may be suitable for building roads, landing pads and shelters. Researchers are now working to analyse its suitability for a number of different applications.

Back in the summer of 1969, Armstrong and Aldrin became the first visitors from Earth to set foot on the Moon. Now, 55 years on, their footprints are still there. The lack of weathering and the fine powdery material have held the footprints in perfect shape since the day they were made. Once we establish lunar bases and even holidays to the Moon – and I believe this will happen – those footprints will likely still be there.

There are many challenges to setting up permanent bases on the Moon, not least of which is getting all the material there. I’ve been embarking on a fairly substantial home renovation over recent years, and even getting bags of cement and blocks to the site has proved a challenge. Whilst I live in South Norfolk in the UK (which isn’t the easiest place to get to, I accept), the Moon is far harder to reach. Transporting all the necessary materials over a quarter of a million kilometres of empty space is not going to be easy, so teams of engineers and scientists are looking at what materials can be sourced on site instead of transported from Earth.

The fine regolith has been getting a lot of attention for this very purpose, and to that end, mineralogist Steven Jacobsen from Northwestern University has been funded by NASA’s Marshall Space Flight Center to see what it can be used for. In addition, NASA has partnered with ICON Technology, a robotics firm, to explore lunar building technologies using resources found on the Moon. A key challenge with lunar regolith, though, is that samples can vary considerably depending on where they are collected. Jacobsen is trying to understand this variation in order to maximise the material’s construction potential.

ICON were awarded the $57.2 million grant back in November 2022 to develop lunar construction methods. Work had already begun on space-based construction with ICON’s Project Olympus. This didn’t just focus on the Moon, though; Mars was also part of the vision to create construction techniques that could work wherever they were employed.

Artist’s concept for a lunar base using construction robots and a form of 3D printing called contour-crafting.

3D printing may play a part in the lunar construction approach. It is already being used by ICON and others like them to build houses here on Earth. Employing 3D technology on the Moon using raw lunar material could be one solution. 

One of the first priorities would be to establish a suitable permanent landing area on the Moon. Without it, every time a lander arrives, the fine regolith will get kicked up and disturbed and may very well play havoc with other equipment in the vicinity. The particles can be quite sharp too so it may be quite abrasive on equipment. 

Source : Examining lunar soil for moon-based construction


The World's Largest Digital Camera is Complete. It Will Go Into the Vera Rubin Observatory

Sun, 04/07/2024 - 3:43pm

The Vera C. Rubin Observatory, formerly the Large Synoptic Survey Telescope (LSST), was formally proposed in 2001 to create an astronomical facility that could conduct deep-sky surveys using the latest technology. This includes a wide-field reflecting telescope with an 8.4-meter (~27.5-foot) primary mirror that relies on a novel three-mirror design (the Simonyi Survey Telescope) and a 3.2-gigapixel Charge-Coupled Device (CCD) imaging camera (the LSST Camera). Once complete, Rubin will perform a 10-year survey of the southern sky known as the Legacy Survey of Space and Time (LSST).

While construction on the observatory itself did not begin until 2015, work began on the telescope’s digital camera and primary mirror much sooner (in 2004 and 2007, respectively). After two decades of work, scientists and engineers at the Department of Energy’s (DOE) SLAC National Accelerator Laboratory and their collaborators announced the completion of the LSST Camera – the largest digital camera ever constructed. Once mounted on the Simonyi Survey Telescope, this camera will help researchers observe our Universe in unprecedented detail.

The Vera C. Rubin Observatory is jointly funded by the U.S. National Science Foundation (NSF) and the U.S. Department of Energy (DOE) and is cooperatively operated by NSF NOIRLab and SLAC. When Rubin begins its ten-year survey (scheduled for August 2025), it will help address some of the most pressing and enduring questions in astronomy and cosmology. These include understanding the nature of Dark Matter and Dark Energy, creating an inventory of the Solar System, mapping the Milky Way, and exploring the transient optical sky (i.e., objects that vary in location and brightness).

A schematic of the LSST Camera. Note the size comparison; the camera will be the size of a small SUV. Credit: Vera Rubin Observatory/DOE

The LSST Camera will assist these efforts by gathering an estimated 5,000 terabytes of new raw images and data annually. “With the completion of the unique LSST Camera at SLAC and its imminent integration with the rest of Rubin Observatory systems in Chile, we will soon start producing the greatest movie of all time and the most informative map of the night sky ever assembled,” said Željko Ivezić, an astronomy professor at the University of Washington and the Director of Rubin Observatory Construction, in a NOIRLab press release.
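That 5,000 TB/year figure is easy to sanity-check against the camera’s 3.2 gigapixels. A back-of-envelope sketch – the 2 bytes per pixel and the per-night framing are my assumptions, not official Rubin numbers:

```python
# Back-of-envelope check on the quoted ~5,000 TB of raw data per year,
# assuming a 3.2-gigapixel sensor at 2 bytes (16 bits) per pixel.
# Exposures-per-night framing is an assumption for illustration.

PIXELS = 3.2e9
BYTES_PER_PIXEL = 2

frame_tb = PIXELS * BYTES_PER_PIXEL / 1e12   # ~0.0064 TB per raw frame
nightly_tb = 5000.0 / 365.0                  # ~13.7 TB per observing night
frames_per_night = nightly_tb / frame_tb     # raw frames needed to fill it

print(f"one raw frame ~ {frame_tb * 1000:.1f} GB")
print(f"~{frames_per_night:.0f} raw frames/night would fill 5,000 TB/yr")
```

At ~6.4 GB per uncompressed 16-bit frame, 5,000 TB/year corresponds to roughly two thousand raw frames per night – the right order of magnitude for a survey telescope taking back-to-back short exposures all night.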

Measuring 1.65 x 3 meters (5.5 x 9.8 ft), with a front lens over 1.5 m (5 ft) across, the camera is about the size of a small SUV and weighs almost 2800 kg (6200 lbs). Its large-aperture, wide-field optical imaging capabilities can capture light from the near-ultraviolet (near-UV) to the near-infrared (NIR), or 0.3–1 micrometers (μm). But the camera’s greatest attribute is its ability to capture unprecedented detail over an unprecedented field of view. This will allow the Rubin Observatory to map the positions and measure the brightness of billions of stars, galaxies, and transient objects, creating a robust catalog that will fuel research for years.

Said Kathy Turner, the program manager for the DOE’s Cosmic Frontier Program, these images will help astronomers unlock the secrets of the Universe:

“And those secrets are increasingly important to reveal. More than ever before, expanding our understanding of fundamental physics requires looking farther out into the Universe. With the LSST Camera at its core, Rubin Observatory will delve deeper than ever before into the cosmos and help answer some of the hardest, most important questions in physics today.”

In particular, astronomers are looking forward to using the LSST Camera to search for signs of weak gravitational lensing. This phenomenon occurs when massive galaxies alter the curvature of spacetime around them, causing light from more distant background galaxies to become redirected and amplified. This technique allows astronomers to study the distribution of mass in the Universe and how this has changed over time. This is vital to determining the presence and influence of Dark Matter, the mysterious and invisible matter that makes up 85% of the total mass in the Universe.
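The angular scale of that lensing can be estimated from the Einstein radius of a point mass, θ_E = √(4GM/c² · D_ls/(D_l·D_s)). Weak-lensing surveys actually measure far subtler shape distortions well outside this radius, but θ_E sets the characteristic scale. A sketch with illustrative round numbers for the lens mass and distances (my assumptions, not survey values):

```python
import math

# Physical constants and unit conversions:
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
GPC = 3.086e25     # gigaparsec, m

def einstein_radius_arcsec(M_kg, D_l, D_s, D_ls):
    """Point-mass Einstein radius in arcseconds.
    D_l, D_s, D_ls: lens, source, and lens-source distances (m)."""
    theta = math.sqrt(4.0 * G * M_kg / C**2 * D_ls / (D_l * D_s))  # radians
    return theta / 4.848e-6  # radians -> arcseconds

# A 10^14 solar-mass cluster at 1 Gpc lensing a source at 2 Gpc:
theta = einstein_radius_arcsec(1e14 * M_SUN, 1 * GPC, 2 * GPC, 1 * GPC)
print(f"theta_E ~ {theta:.0f} arcsec")
```

A massive cluster bends background light on a scale of tens of arcseconds, which is why its statistical imprint on galaxy shapes is detectable with a camera that measures billions of galaxies precisely.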

Similarly, scientists also want to study the distribution of galaxies and how those have changed over time, enabling them to identify Dark Matter clusters and supernovae, which may help improve our understanding of Dark Matter and Dark Energy alike. Within our Solar System, astronomers will use the LSST Camera to create a more thorough census of small objects, including asteroids, planetoids, and Near-Earth Objects (NEO) that could pose a collision risk someday. It will also catalog the dozen or so interstellar objects (ISOs) that enter our Solar System every year.

This is an especially exciting prospect for scientists who hope to conduct rendezvous missions in the near future that will allow us to study them up close. Now that the LSST Camera is complete and has finished being tested at SLAC, it will be shipped to Cerro Pachón in Chile (where the Vera C. Rubin Observatory is being constructed) and integrated with the Simonyi Survey Telescope later this year. Said Bob Blum, Director for Operations for Vera C. Rubin Observatory:

“Rubin Observatory Operations is very excited to see this major milestone about to be completed by the construction team. Combined with the progress of coating the primary mirror, this brings us confidently and much closer to starting the Legacy Survey of Space and Time. It is happening.”

The LSST Camera was made possible thanks to the expertise and technology contributed by international partners. These include the Brookhaven National Laboratory, which built the camera’s digital sensor array; the Lawrence Livermore National Laboratory and its industrial partners, who designed and built the lenses; and the National Institute of Nuclear and Particle Physics in France, which built the camera’s filter exchange system and contributed to the sensor and electronics design.

Further Reading: NOIRLab


The First Atmospheric Rainbow on an Exoplanet?

Sat, 04/06/2024 - 11:12am

When light strikes the atmosphere, all sorts of interesting things can happen. Water vapor can split sunlight into a rainbow arc of colors, crepuscular rays can stream through gaps in clouds like the light from heaven, and halos and sundogs can appear due to sunlight reflecting off ice crystals. And then there is the glory effect, which can create a colorful, almost saint-like halo around objects.

Like rainbows, glories are seen when facing away from the light source. They are often confused with circular rainbows because of their similarity, but glories are a distinct effect. Rainbows are caused by the refraction of light through water droplets, while glories are caused by the wave interference of light scattered back toward the observer. Because of this, a glory is most apparent when the water droplets of a cloud or fog are small and uniform in size. The appearance of a glory thus gives us information about the atmosphere. We have assumed that some distant exoplanets would experience glories similar to those on Earth, but now astronomers have found the first evidence of one.
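The refraction side of that contrast can be made concrete with geometric optics. This sketch (an illustration, not from the article) numerically finds the minimum-deviation angle for a ray refracting through a spherical water droplet with one internal reflection, which is what sets the familiar 42-degree primary rainbow. A glory, by contrast, requires full wave (Mie) scattering theory and cannot be derived this way:

```python
import math

def rainbow_angle_deg(n=1.333, steps=100000):
    """Angle of the primary rainbow from the antisolar point, via geometric optics.

    For a ray hitting a spherical droplet at incidence i, refracting to angle r
    (Snell's law), reflecting once internally, and exiting, the total deviation
    is D = 180 + 2i - 4r degrees. The rainbow forms at the minimum deviation,
    found here by a simple scan over incidence angles.
    """
    min_dev = 360.0
    for k in range(1, steps):
        i = math.radians(90 * k / steps)
        r = math.asin(math.sin(i) / n)
        dev = 180 + 2 * math.degrees(i) - 4 * math.degrees(r)
        min_dev = min(min_dev, dev)
    return 180 - min_dev   # angle measured from the antisolar point

# For water (n = 1.333) this recovers the familiar ~42 degrees.
angle = rainbow_angle_deg()
```

The refractive index is the only physical input, which is why every water rainbow sits at the same angle, while the size of a glory depends on droplet radius.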

A solar glory seen from an airplane. Credit: Brocken Inaglory

The observations come from the Characterising ExOplanet Satellite (Cheops), supplemented by data from other observatories, of an exoplanet known as WASP-76b. It’s not the kind of exoplanet where you’d expect a glory to appear. WASP-76b is not a temperate Earth-like world with a humid atmosphere, but a hellish hot Jupiter with a dayside temperature of about 2,500 Kelvin. Because of this, the team wasn’t looking for extraterrestrial glories but rather studying the odd asymmetry of the planet’s atmosphere.

WASP-76b orbits its star at a tenth of Mercury’s distance from the Sun. At such a close distance, the world is likely tidally locked, with one side forever boiling under its sun’s heat and the other side always in shadow. No such planet exists in our Solar System, so astronomers are eager to study how this would affect the atmosphere of such a world. Previous studies have shown that the atmosphere is not symmetrical. The star-facing side is puffed up by the immense heat, while the atmosphere of the dark side is denser.

For three years, the team observed WASP-76b as it passed in front of and behind its star, capturing data on the transition between the light and dark sides. They found that at the planet’s eastern terminator (the boundary between the light and dark sides) there was a surprising increase in light. This extra glow could be caused by a glory effect. It will take more observations to confirm the effect, but if verified, it will be the first glory observed beyond our Solar System. Currently, glories have only been observed on Earth and Venus.

The presence of a glory on WASP-76b would mean that spherical droplets must have been present in the atmosphere for at least three years. This means either they are stable within the atmosphere, or they are constantly replenished. One possibility is that the glory is caused by iron droplets that rain from the sky on the cooler side of the planet. Even if this particular effect is not confirmed, the ability of modern telescopes to capture this data suggests that we will soon be able to study many subtle effects of exoplanet atmospheres.

Reference: Demangeon, O. D. S., et al. “Asymmetry in the atmosphere of the ultra-hot Jupiter WASP-76 b.” Astronomy & Astrophysics 684 (2024): A27.

The post The First Atmospheric Rainbow on an Exoplanet? appeared first on Universe Today.

Categories: Astronomy

Roman Will Learn the Ages of Hundreds of Thousands of Stars

Sat, 04/06/2024 - 11:03am

Astronomers routinely provide the ages of the stars they study. But the methods for measuring those ages aren’t entirely precise; determining the age of a distant star is a difficult task.

The Nancy Grace Roman Space Telescope should make some progress.

Stars like our Sun settle into their main sequence lives of fusion and change very little for billions of years. It’s like watching middle-aged adults go about their business during their working lives. They get up, drive to work, sit at a desk, then drive home.

But what can change over time is their rotation rate. The Sun now rotates about once a month. When it was first formed, it rotated more rapidly.

But over time, the rotation of the Sun, and of stars with the Sun’s mass or less, slows down. The slowdown is caused by interactions between the star’s magnetic fields and the stellar wind, the stream of high-energy protons and electrons emitted by stars. Over time, these interactions reduce a star’s angular momentum, and its rotation slows. The phenomenon is called “magnetic braking,” and it depends on the strength of a star’s magnetic fields.

When the Sun rotates, the magnetic field lines rotate with it. The combination is almost like a solid object. Ionized material from the solar wind will be carried along the field lines and, at some point, will escape the magnetic field lines altogether. That reduces the Sun’s angular momentum. Image Credit: Sebman81 / NASA Goddard Laboratory for Atmospheres / NikoLang, derivative work by Aza, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=8258519

The more rapidly a star initially spins, the stronger its magnetic fields. That means they slow down faster. After about one billion years of life, stars of the same age and mass will spin at the same rate. Once astronomers know a star’s mass and rotation rate, they can estimate its age. Knowing stars’ ages is critical in research. It makes everything astronomers do more accurate, including piecing together the Milky Way’s history.
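The rotation-age link can be sketched with the classic Skumanich spin-down law, in which rotation period grows roughly as the square root of age. The toy estimator below is an illustration anchored to the Sun, not the astronomers' actual method, which also folds in stellar mass and color:

```python
def gyro_age_gyr(period_days, p_sun=25.4, t_sun=4.6):
    """Rough stellar age (Gyr) from rotation period via the Skumanich relation.

    Skumanich (1972) found P proportional to sqrt(t) for Sun-like stars, so
    scaling from the Sun (P ~ 25.4 days at 4.6 Gyr) gives
    t ~ t_sun * (P / P_sun)^2. This ignores the mass dependence that real
    gyrochronology calibrations include.
    """
    return t_sun * (period_days / p_sun) ** 2

# A Sun-like star rotating once every 18 days comes out at roughly 2.3 Gyr
age = gyro_age_gyr(18.0)
```

The quadratic scaling is why a rotation period, once measured, is such a sensitive age indicator for Sun-like stars.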

The problem is that measuring rotation rates is challenging. One method is to observe spots on stars’ surfaces and watch as they come into and out of view. All stars have star spots, though their characteristics vary quite a bit. In fact, stars can have dozens of spots, and the spots change locations. Therein lies the difficulty. It’s extremely difficult to figure out the periodicity when dozens of spots change locations on the star’s surface.

This is where the Nancy Grace Roman Space Telescope (the Roman) comes in. It’s scheduled for launch in May 2027 to begin its five-year mission. It’s a wide-field infrared survey telescope with multiple science objectives. One of its main programs is the Galactic Bulge Time Domain Survey. That effort will gather detailed information on hundreds of millions of stars in the Milky Way’s galactic bulge.

This is a simulated image of what the Roman Space Telescope will see when it surveys the Milky Way’s galactic bulge. The telescope will observe hundreds of millions of stars in the region. Image Credit: Matthew Penny (Louisiana State University)

The Roman will generate an enormous amount of data. Much of it will be measurements of how the brightness of hundreds of thousands of stars changes. But untangling those measurements and figuring out what those changes in brightness mean for stellar rotation requires help from AI.

Astronomers at the University of Florida are developing AI to extract stellar rotation periods from all that data.

Zachary Claytor is a postdoc at the University of Florida and the AI project’s science principal investigator. Their AI is a convolutional neural network, a type well-suited to analyzing images and widely used in image classification and medical image analysis, among other applications.

AI needs to be trained before it can do its job. In this case, Claytor and his associates wrote a computer program to generate simulated stellar light curves for the AI to process and learn from.

“This program lets the user set a number of variables, like the star’s rotation rate, the number of spots, and spot lifetimes. Then it will calculate how spots emerge, evolve, and decay as the star rotates and convert that spot evolution to a light curve – what we would measure from a distance,” explained Claytor.
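As a rough illustration of the kind of training data such a program might produce, here is a heavily simplified toy version. All parameter names and simplifications here are our own, and unlike the real code described above, this sketch does not evolve or decay the spots over time:

```python
import math
import random

def simulated_light_curve(period, n_spots, spot_depth=0.01,
                          t_max=90.0, dt=0.02, seed=0):
    """Toy spotted-star light curve with user-set rotation rate and spot count.

    Each spot sits at a fixed longitude; its dimming is modulated by the
    star's rotation, contributing only while it is on the hemisphere facing
    the observer. Flux is normalized to 1 for an unspotted star.
    """
    rng = random.Random(seed)
    longitudes = [rng.uniform(0, 2 * math.pi) for _ in range(n_spots)]
    times, flux = [], []
    t = 0.0
    while t < t_max:
        phase = 2 * math.pi * t / period
        dimming = 0.0
        for lon in longitudes:
            cos_vis = math.cos(phase - lon)   # projection toward the observer
            if cos_vis > 0:                   # spot is on the visible hemisphere
                dimming += spot_depth * cos_vis
        times.append(t)
        flux.append(1.0 - dimming)
        t += dt
    return times, flux

times, flux = simulated_light_curve(period=12.0, n_spots=5)
```

Dozens of migrating, evolving spots turn this clean periodic signal into the messy curves the neural network must learn to decode.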

Claytor and his co-researchers have already tested their AI on data from NASA’s TESS, the Transiting Exoplanet Survey Satellite. The longer a star’s rotation period is, the more difficult it is to measure. But the team’s AI demonstrated that it could successfully determine these periods in TESS data.

The Roman’s Galactic Bulge Time Domain Survey is still being designed. So astronomers can use this AI-based effort to help design the survey.

“We can test which things matter and what we can pull out of the Roman data depending on different survey strategies. So when we actually get the data, we’ll already have a plan,” said Jamie Tayar, assistant professor of astronomy at the University of Florida and the program’s principal investigator.

“We have a lot of the tools already, and we think they can be adapted to Roman,” she added.

Artist’s impression of the Nancy Grace Roman Space Telescope, named after NASA’s first Chief of Astronomy. When launched later this decade, the telescope will measure the rotational periods of hundreds of thousands of stars and, with the help of AI, will determine their ages. Credits: NASA

Measuring stellar ages is difficult, yet age is a key factor in understanding any star. Astronomers use various methods to measure ages, including evolutionary models, a star’s membership in a cluster of similarly-aged stars, and even the presence of a protoplanetary disk. But no single method can measure every star’s age, and each method has its own drawbacks.

If the Roman can break through this barrier and accurately measure stellar rotation rates, astronomers should have a leg-up in understanding stellar ages. But there’s still one problem: magnetic braking.

This method relies on a solid understanding of how magnetic braking works over time. But astronomers may not understand it as thoroughly as they’d like. For instance, research from 2016 showed that magnetic braking might not slow down older stars as much as thought. That research found unexpectedly rapid rotation rates in stars more evolved than our Sun.

Somehow, astronomers will figure this all out. The Roman Space Telescope should help, as its vast trove of data is bound to lead to some unexpected conclusions. One way or another, with the help of the Roman Space Telescope, the ESA’s Gaia mission, and others, astronomers will untangle the problem of measuring everything about stars, including their ages.

The post Roman Will Learn the Ages of Hundreds of Thousands of Stars appeared first on Universe Today.

Categories: Astronomy

Webb Sees a Galaxy Awash in Star Formation

Fri, 04/05/2024 - 8:07pm

Since it began operations in July 2022, the James Webb Space Telescope (JWST) has fulfilled many scientific objectives. In addition to probing the depths of the Universe in search of galaxies that formed shortly after the Big Bang, it has also provided the clearest and most detailed images of nearby galaxies. In the process, Webb has provided new insight into the processes through which galaxies form and evolve over billions of years. This includes galaxies like Messier 82 (M82), a “starburst galaxy” located about 12 million light-years away in the constellation Ursa Major.

Also known as the “Cigar Galaxy” because of its distinctive shape, M82 is a rather compact galaxy with a very high star formation rate, roughly five times that of the Milky Way, which is why the core region of M82 is over 100 times as bright as the Milky Way’s. Combined with the gas and dust that naturally obscures visible light, this makes examining M82’s core region difficult. Using the extreme sensitivity of Webb‘s Near-Infrared Camera (NIRCam), a team led by the University of Maryland observed the central region of this starburst galaxy to examine the physical conditions that give rise to new stars.

The team was led by Alberto Bolatto, an astronomy professor at the University of Maryland and a researcher with the Joint Space-Science Institute (JSSI). He was joined by researchers from NASA’s Jet Propulsion Laboratory, NASA Ames, the European Space Agency (ESA), the Space Telescope Science Institute (STScI), the ARC Centre of Excellence for All Sky Astrophysics in 3 Dimensions (ASTRO 3D), the Max-Planck-Institut für Astronomie (MPIA), National Radio Astronomy Observatory (NRAO), the Infrared Processing and Analysis Center (IPAC-Caltech) and multiple universities, institutes, and observatories. Their findings are described in a paper accepted for publication in The Astrophysical Journal.

Annotated image of the starburst galaxy Messier 82 captured by Hubble (left) and Webb’s NIRCam (right). Credit: NASA/ESA/CSA/STScI/Alberto Bolatto (UMD)

Their observations were part of a Cycle 1 General Observations (GO) project – for which Bolatto is the Principal Investigator (PI) – that used NIRCam data to examine the “prototypical starbursts” NGC 253 and M82 and their “cool” galactic winds. Such galaxies remain a source of fascination for astronomers because of what they can reveal about the birth of new stars in the early Universe. Starbursts are galaxies that experience rapid and efficient star formation, a phase that most galaxies went through during the early history of the Universe (ca. 10 billion years ago). Studying early galaxies in this phase is challenging due to the distances involved.

Fortunately, starburst galaxies like NGC 253 and M82 are relatively close to the Milky Way. While these galaxies have been observed before, Webb’s extreme sensitivity in the near-infrared spectrum provided the most detailed look to date. Moreover, the NIRCam observations were made using an instrument mode that prevented the galaxy’s intense brightness from overwhelming the instrument. The resulting images revealed details that have been historically obscured, such as dark brown tendrils of heavy dust that contained concentrations of iron (visible in the image as green specks).

These consist largely of supernova remnants, while small patches of red are clouds of molecular hydrogen lit up by young stars nearby. Said Rebecca Levy, second author of the study at the University of Arizona in Tucson, in a NASA press release, “This image shows the power of Webb. Every single white dot in this image is either a star or a star cluster. We can start to distinguish all of these tiny point sources, which enables us to acquire an accurate count of all the star clusters in this galaxy.”

Another key detail captured in the images is the “galactic wind” rushing out from the core, which was visible at longer infrared wavelengths. This wind is caused by the rapid rate of star formation and subsequent supernovae and has a significant influence on the surrounding environment. Studying this wind was a major objective of the project (GO 1701), which aimed to investigate how these winds interact with cold and hot material. By observing the central region of M82, the team was able to examine where the wind originates and the impact it has on surrounding material.

The Cigar Galaxy (M82), a starburst galaxy with high star production. Credit: NASA, ESA, and the Hubble Heritage Team (STScI/AURA)

The team was surprised by the way Webb’s NIRCam was able to trace the structure of the galactic wind via emission spectra from very small dust grains known as polycyclic aromatic hydrocarbons (PAHs) – a chemical produced when coal, wood, gasoline, and tobacco are burned. These emissions highlighted the galactic wind’s fine structure, which appeared as red filaments flowing from above and below the galaxy’s disk. Another surprise was the structure of the PAH emission, which was similar to that of the hot ionized gas in the wind. As Bolatto explained:

“M82 has garnered a variety of observations over the years because it can be considered as the prototypical starburst galaxy. Both NASA’s Spitzer and Hubble space telescopes have observed this target. With Webb’s size and resolution, we can look at this star-forming galaxy and see all of this beautiful, new detail. It was unexpected to see the PAH emission resemble ionized gas. PAHs are not supposed to live very long when exposed to such a strong radiation field, so perhaps they are being replenished all the time. It challenges our theories and shows us that further investigation is required.”

The team hopes to further investigate the questions these findings raise using Webb data, which will include spectroscopic observations made using the Near-infrared Spectrograph (NIRSpec) and large-scale images of the galaxy and wind. This data will help astronomers obtain accurate ages for the star clusters and determine how long each phase of star formation lasts in starburst galaxies. As always, this information could shed light on how similar phases took place in the early Universe, helping shape galaxies like ours. As Bolatto summarized:

“Webb’s observation of M82, a target closer to us, is a reminder that the telescope excels at studying galaxies at all distances. In addition to looking at young, high-redshift galaxies, we can look at targets closer to home to gather insight into the processes that are happening here – events that also occurred in the early universe.”

Further Reading: Webb Space Telescope, MPIA

The post Webb Sees a Galaxy Awash in Star Formation appeared first on Universe Today.

Categories: Astronomy

The Stellar Demolition Derby in the Centre of the Galaxy

Fri, 04/05/2024 - 4:27pm

The region near the Milky Way’s centre is dominated by the supermassive black hole that resides there. Sagittarius A*’s overwhelming gravity creates a chaotic region where tightly packed, high-speed stars crash into one another like cars in a demolition derby.

These collisions and glancing blows change the stars forever. Some become strange, stripped-down, low-mass stars, while others gain new life.

The Milky Way’s supermassive black hole (SMBH) is called Sagittarius A* (Sgr. A*). Sgr. A* is about four million times more massive than the Sun. With that much mass, the much smaller stars nearby are easily affected by the black hole’s powerful gravity and are accelerated to rapid velocities.

In the inner 0.1 parsec, or about one-third of a light-year, stars travel thousands of kilometres per second. Outside that region, the pace is much more sedate. Stars beyond 0.1 parsec travel at hundreds of km/s.
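Those speeds follow directly from Kepler's laws. A quick illustrative estimate (assuming circular orbits and ignoring the nuclear star cluster's own mass) shows how orbital velocity climbs from hundreds to thousands of km/s as stars plunge toward Sgr. A*:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
PARSEC = 3.086e16  # parsec, m

def circular_speed_kms(r_parsec, m_bh_msun=4.0e6):
    """Circular orbital speed around Sgr A* at radius r, in km/s.

    Treats the 4-million-solar-mass black hole as the only mass
    (v = sqrt(GM/r)); the surrounding star cluster's mass, ignored here,
    adds to this at larger radii.
    """
    r = r_parsec * PARSEC
    return math.sqrt(G * m_bh_msun * M_SUN / r) / 1000

# Speeds climb steeply toward the black hole:
v_outer = circular_speed_kms(0.1)    # ~415 km/s: "hundreds of km/s"
v_inner = circular_speed_kms(0.001)  # ~4150 km/s: "thousands of km/s"
```

Stars on eccentric orbits move faster still at their closest approach, which is how the most extreme observed velocities arise.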

But it’s not only the speed that drives the collisions. The region is also tightly packed with stars into what astronomers call a nuclear star cluster (NSC). The combination of high speed and high stellar density creates a region where stars are bound to collide.

“They whack into each other and keep going.”

Sanaea Rose, Department of Physics and Astronomy, UCLA

New research led by Northwestern University simulated stars orbiting Sgr. A* to understand the interactions and collisions and their results. It’s titled “Stellar Collisions in the Galactic Center: Massive Stars, Collision Remnants, and Missing Red Giants.” The lead author is Sanaea C. Rose from UCLA’s Department of Physics and Astronomy. The research was also recently presented at the American Physical Society’s April meeting.

The researchers simulated a population of 1,000 stars embedded in the NSC. The stars ranged from 0.5 to 100 solar masses, but in practice, the upper limit was about 30 solar masses due to the initial mass function. Other characteristics, like orbital eccentricities, were varied to ensure that the sample caught stars at different distances from Sgr. A*. That’s necessary to build a solid understanding of the stellar collisions.

“The region around the central black hole is dense with stars moving at extremely high speeds,” said lead author Rose. “It’s a bit like running through an incredibly crowded subway station in New York City during rush hour. If you aren’t colliding with other people, then you are passing very closely by them. For stars, these near collisions still cause them to interact gravitationally. We wanted to explore what these collisions and interactions mean for the stellar population and characterize their outcomes.”

“Stars, which are under the influence of a supermassive black hole in a very crowded region, are unlike anything we will ever see in our own solar neighbourhood.”

Sanaea Rose, Department of Physics and Astronomy, UCLA

The stellar density in the inner 0.1 parsecs is nothing like our Solar System’s neighbourhood. The nearest star to our Sun is the low-mass Proxima Centauri. It’s just over four light-years away. It’s like having no neighbours at all.

But in the NSC, things are way different.

The Milky Way galaxy hosts a supermassive black hole (Sgr A*, shown in the inset on the right) embedded in the Nuclear Star Cluster (NSC) at the center, highlighted and enlarged in the middle panel. The image on the right shows the stellar density in the NSC. Image Credit: Zhuo Chen

“The closest star to our sun is about four light-years away,” Rose explained. “Within that same distance near the supermassive black hole, there are more than a million stars. It’s an incredibly crowded neighbourhood. On top of that, the supermassive black hole has a really strong gravitational pull. As they orbit the black hole, stars can move at thousands of kilometres per second.”

In a stellar density that high, collisions are inevitable. The rate of collisions is more severe the closer stars are to the SMBH. In their research, Rose and her colleagues simulated the region to determine the collisions’ effect on individual stars and the stellar population.

The simulations showed that head-on collisions are rare, so stars are seldom destroyed outright. Most collisions are glancing blows, in which stars can be stripped of their outer layers before continuing along their trajectories.

“They whack into each other and keep going,” Rose said. “They just graze each other as though they are exchanging a very violent high-five. This causes the stars to eject some material and lose their outer layers. Depending on how fast they are moving and how much they overlap when they collide, they might lose quite a bit of their outer layers. These destructive collisions result in a population of strange, stripped down, low-mass stars.”

These stars end up migrating away from the SMBH. The authors say that there is likely a population of these low-mass stars spread throughout the galactic centre (GC). They also say that the ejected mass from these grazing collisions could produce the gas and dust features other researchers have observed in the GC, like X7, and G objects like G3 and G2.

X7 is an elongated gas and dust structure in the galactic centre. The researchers suggest it could be made of mass stripped from stars during collisions between fast-moving stars near Sgr. A*. G3 and G2 are objects that resemble clouds of gas and dust but also have properties of stellar objects. Image Credit: Ciurlo et al. 2023.

Outside the 0.1-parsec region, the stars are slower. As a result, collisions between stars aren’t as energetic or destructive. Instead of creating a population of stripped-down stars, collisions allow the stars to merge, creating more massive stars. Multiple mergers are possible, creating stars more massive than our Sun.

“A few stars win the collision lottery,” Rose said. “Through collisions and mergers, these stars collect more hydrogen. Although they were formed from an older population, they masquerade as rejuvenated, young-looking stars. They are like zombie stars; they eat their neighbours.”

But after they gain that mass, they hasten their own demise. They become like young, massive stars that consume their fuel quickly.

This artist’s illustration shows a massive star orbiting Sagittarius A*. Post-collision, some stars gain mass and end up shortening their lives. Image Credit: University of Cologne

“They die very quickly,” Rose said. “Massive stars are sort of like giant, gas-guzzling cars. They start with a lot of hydrogen, but they burn through it very, very fast.”

Another puzzling thing about this inner region is the lack of red giants. “Observations of the GC indicate a deficit of RGs within about 0.3 pc of the SMBH,” the authors write, referencing other research. Their results could explain it. “We consider whether main-sequence stellar collisions may help explain this observational puzzle,” they write. “We find that within ~ 0.01 pc of the SMBH, stellar collisions destroy most low-mass stars before they can evolve off the main sequence. Thus, we expect a lack of RGs in this region.”

The region around the Milky Way’s SMBH is chaotic. Even disregarding the black hole itself and its swirling accretion disk and tortured magnetic fields, the stars that dance to its tune live chaotic lives. The simulations show that most stars in the GC will experience direct collisions with other stars. But their chaotic lives could shed light on how the entire region evolved. And since the region resists astronomers’ attempts to observe it, simulations like this are their next best tool.

“It’s an environment unlike any other,” Rose said. “Stars, which are under the influence of a supermassive black hole in a very crowded region, are unlike anything we will ever see in our own solar neighbourhood. But if we can learn about these stellar populations, then we might be able to learn something new about how the galactic center was assembled. At the very least, it certainly provides a point of contrast for the neighbourhood where we live.”

Note: these results are based on a pair of published papers.

The post The Stellar Demolition Derby in the Centre of the Galaxy appeared first on Universe Today.

Categories: Astronomy

A New Map Shows the Universe’s Dark Energy May Be Evolving

Fri, 04/05/2024 - 1:19pm

At the Kitt Peak National Observatory in Arizona, an instrument with 5,000 tiny robotic eyes scans the night sky. Every 20 minutes, the instrument and the telescope it’s attached to observe a new set of 5,000 galaxies. The instrument is called DESI, the Dark Energy Spectroscopic Instrument, and once it’s completed its five-year mission, it’ll have created the largest 3D map of the Universe ever made.

But scientists are getting access to DESI’s first data release and it suggests that dark energy may be evolving.

DESI is the most powerful multi-object survey spectrograph in the world, according to their website. It’s gathering the spectra for tens of millions of galaxies and quasars. The goal is a 3D map of the Universe that extends out to 11 billion light-years. That map will help explain how dark energy has driven the Universe’s expansion.

DESI began in 2021 and is a five-year mission. The first year of data has been released, and scientists with the project say that DESI has successfully measured the expansion of the Universe over the last 11 billion years with extreme precision.

“The DESI team has set a new standard for studies of large-scale structure in the Universe.”

Pat McCarthy, NOIRLab Director

DESI collects light from 5,000 objects at once with its 5,000 robotic eyes. It observes a new set of 5,000 objects every 20 minutes, which means it observes 100,000 objects—galaxies and quasars—each night, given the right observing conditions.

This image shows Stu Harris working on assembling the focal plane for the Dark Energy Spectroscopic Instrument (DESI) at Lawrence Berkeley National Laboratory in 2017 in Berkeley, Calif. Ten petals, each containing 500 robotic positioners that are used to gather light from targeted galaxies, form the complete focal plane. DESI is attached to the 4-meter Mayall Telescope at Kitt Peak National Observatory. Image Credit: DESI/NSF NOIRlab

DESI’s data creates a map of the large-scale structure of the Universe. The map will help scientists unravel the history of the Universe’s expansion and the role dark energy plays. We don’t know what dark energy is, but we know some force is causing the Universe’s expansion to accelerate.

“The DESI instrument has transformed the Mayall Telescope into the world’s premier cosmic cartography machine,” said Pat McCarthy, Director of NOIRLab, the organization behind DESI. “The DESI team has set a new standard for studies of large-scale structure in the Universe. These first-year data are only the beginning of DESI’s quest to unravel the expansion history of the Universe, and they hint at the extraordinary science to come.”

DESI measures dark energy by relying on baryonic acoustic oscillations (BAO). Baryonic matter is “normal” matter: atoms and everything made of atoms. The acoustic oscillations are density fluctuations in normal matter that date back to the Universe’s beginnings. BAO are the imprint of those fluctuations, or pressure waves, that moved through the Universe when it was all hot, dense plasma.

As the Universe cooled and expanded, the density waves froze their ripples in place, and where density was high, galaxies eventually formed. The ripple pattern of the BAO is visible in the DESI leading image. It shows strands of galaxies, or galaxy filaments, clustered together. They’re separated by voids where density is much lower.

The deeper DESI looks, the fainter the galaxies are. They don’t provide enough light to detect the BAO. That’s where quasars come in. Quasars are extremely bright galaxy cores, and the light from distant quasars creates a shadow of the BAO pattern. As the light travels through space, it interacts with and gets absorbed by clouds of matter. That lets astronomers map dense pockets of matter, but it took over 450,000 quasars. That’s the most quasars ever observed in a survey like this.

Because the BAO pattern is gathered in such detail and across such vast distances, it can act as a cosmic ruler. By combining the measurements of nearby galaxies and distant quasars, astronomers can measure the ripples across different periods of the Universe’s history. That allows them to see how dark energy has stretched the scale over time.
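The "cosmic ruler" idea can be sketched in a few lines. Assuming the standard sound-horizon length of about 147 megaparsecs (a value from CMB fits, not stated in the article), the apparent angular size of the BAO feature translates directly into a comoving distance:

```python
import math

def comoving_distance_mpc(bao_angle_deg, r_drag_mpc=147.0):
    """BAO 'standard ruler': comoving distance from the apparent angular
    size of the BAO feature in the galaxy clustering pattern.

    The sound horizon at the drag epoch (~147 Mpc, from CMB fits) sets the
    ruler's length; a feature of known length subtending angle theta lies
    at comoving distance D ~ r_d / theta (small-angle approximation).
    """
    theta = math.radians(bao_angle_deg)
    return r_drag_mpc / theta

# The smaller the BAO feature appears on the sky, the farther the galaxies:
d_near = comoving_distance_mpc(4.0)  # ~2100 Mpc
d_far = comoving_distance_mpc(1.5)   # ~5600 Mpc
```

Comparing such distances across cosmic epochs is what lets DESI reconstruct the expansion history, and hence the behavior of dark energy.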

It’s all aimed at understanding the expansion of the Universe.

In its first roughly 50,000 years, the Universe was dominated by radiation; the Cosmic Microwave Background is a relic of that hot early era. For billions of years afterward, matter dominated the Universe. It was still expanding, but the expansion was slowing because of the gravitational pull of matter. In the last five billion years or so, the expansion has accelerated again, and we give the name dark energy to the force behind that acceleration.

So far, DESI’s data supports cosmologists’ best model of the Universe. But there are some twists.

“We’re incredibly proud of the data, which have produced world-leading cosmology results,” said DESI director and LBNL scientist Michael Levi. “So far, we’re seeing basic agreement with our best model of the Universe, but we’re also seeing some potentially interesting differences that could indicate dark energy is evolving with time.”

Levi is referring to Lambda Cold Dark Matter (Lambda CDM), also known as the standard model of Big Bang cosmology. Lambda CDM includes cold dark matter, a weakly interacting type of matter, and dark energy. They both shape how the Universe expands, but in opposite ways. Dark energy accelerates the expansion, while regular matter and dark matter slow it down. The Universe evolves based on the contributions from all three. Lambda CDM does a good job of describing what other experiments and observations find. It also assumes that dark energy is constant and spread evenly throughout the Universe.
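The difference between constant and evolving dark energy can be made concrete with the flat-universe Friedmann expansion rate. This sketch (parameter values are illustrative, not DESI's published fits) uses the common Chevallier-Polarski-Linder parameterization, in which Lambda CDM is the special case w0 = -1, wa = 0:

```python
import math

def hubble_ratio(z, omega_m=0.315, w0=-1.0, wa=0.0):
    """E(z) = H(z)/H0 for a flat universe with CPL dark energy.

    The equation of state is w(a) = w0 + wa * (1 - a). Constant dark
    energy (the Lambda CDM assumption) corresponds to w0 = -1, wa = 0;
    'evolving dark energy' fits let both parameters float.
    """
    a = 1.0 / (1.0 + z)
    omega_de = 1.0 - omega_m
    # Dark-energy density evolution for CPL: rho_de(a) / rho_de(today)
    de_evol = a ** (-3 * (1 + w0 + wa)) * math.exp(-3 * wa * (1 - a))
    return math.sqrt(omega_m * (1 + z) ** 3 + omega_de * de_evol)

# With w0 = -1, wa = 0 the dark-energy term stays constant with redshift:
e_lcdm = hubble_ratio(1.0)                        # Lambda CDM
e_evolving = hubble_ratio(1.0, w0=-0.9, wa=-0.5)  # an evolving alternative
```

Small differences between these curves at different redshifts are exactly what BAO distance measurements are sensitive to.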

This data is just the first release, so confirmation of dark energy evolution must wait. By the time DESI has completed its five-year run, it will have mapped over three million quasars and 37 million galaxies. That massive trove of data should help scientists understand if dark energy is changing.

Whatever the eventual answer, the question is vital to understanding the Universe.

“This project is addressing some of the biggest questions in astronomy, like the nature of the mysterious dark energy that drives the expansion of the Universe,” says Chris Davis, NSF program director for NOIRLab. “The exceptional and continuing results yielded by the NSF Mayall telescope with DOE DESI will undoubtedly drive cosmology research for many years to come.”

DESI isn’t the only effort to understand dark energy. The ESA’s Euclid spacecraft is already taking its own measurements to help cosmologists answer their dark energy questions.

In a few years, DESI will have some more powerful allies in the quest to understand dark energy. The Vera Rubin Observatory and Nancy Grace Roman Space Telescope will both contribute to our understanding of the elusive dark energy. They’ll perform surveys of their own, and by combining data from all three, cosmologists are poised to generate some long-sought answers.

But for now, scientists are celebrating DESI’s first data release.

“We are delighted to see cosmology results from DESI’s first year of operations,” said Gina Rameika, associate director for High Energy Physics at the Department of Energy. “DESI continues to amaze us with its stellar performance and how it is shaping our understanding of dark energy in the Universe.”

The post A New Map Shows the Universe’s Dark Energy May Be Evolving appeared first on Universe Today.

Categories: Astronomy

Why is it so hard to drill off Earth?

Fri, 04/05/2024 - 1:05pm

Humans have been digging underground for millennia – on the Earth. It’s where we extract some of our most valuable resources that have moved society forward. For example, there wouldn’t have been a Bronze Age without tin and copper – both of which are primarily found under the ground. But when digging under the ground on celestial bodies, we’ve had a much rougher time. That is going to have to change if we ever hope to utilize the potential resources available under the surface. A paper from Dariusz Knez and Mitra Khalilidermani of the University of Krakow looks at why it’s so hard to drill in space – and what we might do about it.

In the paper, the authors detail two major categories of difficulties when drilling off-world – environmental challenges and technological challenges. Let’s dive into the environmental challenges first.

One obvious difference between Earth and most other rocky bodies we might want to drill into is the lack of an atmosphere. There are exceptions, such as Venus and Titan, but even Mars’s atmosphere is too thin to support one material fundamental to drilling here on Earth – fluids.

The ocean on Europa is a common proposed destination for an exploration mission that would require some drilling. Fraser explores how we would do it.

If you’ve ever tried drilling a hole in metal, you’ve probably used some cooling fluid. If you don’t, there is a good chance either your drill bit or your workpiece will heat up and deform to the point where you can no longer drill. To alleviate that problem, most machinists simply spray some lubricant into the drill hole and keep pressing through. A larger-scale version of this happens when construction companies drill into the ground, especially into bedrock: they use liquids to cool the spots where they’re drilling.

That isn’t possible on a celestial body with no atmosphere – at least not using traditional drilling technologies. Any liquid exposed to vacuum would immediately boil away, providing little to no cooling effect in the work area. And given that many drilling operations occur autonomously, the drill itself – typically attached to a rover or lander – has to know when to back off on its drilling process before the bits melt. That’s an added layer of complexity, and not one that many designs have yet solved.

A similar fluid problem has limited the adoption of a technology ubiquitous in Earthly drilling – hydraulics. Extreme temperature swings, such as those seen on the Moon during the day/night cycle, make it extremely difficult to provide a hydraulic fluid that doesn’t freeze during cold nights or evaporate during scorching days. As a result, the hydraulic systems found in almost every large drilling rig on Earth are of very limited use in space.

Here’s a detailed look at a drill used on Mars by Smarter Every Day.
Credit – Smarter Every Day YouTube Channel

Other problems can also crop up, such as abrasive or clingy regolith, or the lack of a magnetic field to help orient the drill. Ultimately, these environmental challenges can be overcome with the same thing humans always use to overcome them, no matter what planetary body they’re on – technology.

There are plenty of technological challenges for drilling off-world as well, though. The most obvious is the weight constraint, a crucial consideration for doing anything in space. Large drilling rigs use heavy materials, such as steel casings, to support the boreholes they drill, but these would be prohibitively expensive to launch with current technologies.

Additionally, the size of the drilling system itself limits the force of the drill – as stated in the paper, “the maximum force transmitted to the bit cannot exceed the weight of the whole drilling system.” This problem is exacerbated by the fact that typical rover drills are leveraged out on a robotic arm rather than placed directly underneath the rover, where the maximum weight could be applied. This force limitation also limits the type of material the drill can get through – it will be hard-pressed to drill through any significant boulder, for example. While redesigning rovers with drill location in mind could help, the launch weight limitation again comes into play.
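The paper's constraint is just Newton's second law: the bit can't press down harder than the whole system's local weight, F = m·g. A quick sketch of what that means on different bodies (the rover mass is an illustrative Curiosity-class figure, not from the paper):

```python
# Maximum weight-on-bit: F_max = m * g_local, per the constraint that bit
# force cannot exceed the weight of the whole drilling system.
# The 900 kg rover mass is illustrative, not taken from the paper.
GRAVITY = {"Earth": 9.81, "Mars": 3.71, "Moon": 1.62}  # surface gravity, m/s^2

def max_weight_on_bit(system_mass_kg: float, body: str) -> float:
    """Upper bound on downward bit force, in newtons, on a given body."""
    return system_mass_kg * GRAVITY[body]

rover_mass = 900.0  # kg, roughly Curiosity-class (illustrative)
for body in ("Earth", "Mars", "Moon"):
    print(f"{body}: {max_weight_on_bit(rover_mass, body):.0f} N")
```

The same system that could press with nearly 9 kN on Earth is capped at roughly a third of that on Mars and a sixth on the Moon, which is why hard boulders are such a problem off-world.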

Curiosity has a unique drilling technique, as described in this JPL video.
Credit – NASA JPL YouTube Channel

Another technological problem is the lack of power. Hydrocarbon-fueled engines power most large drilling rigs on Earth. That isn’t feasible off-world, so drilling systems must be powered by solar cells and the batteries they charge. These systems also suffer from the tyranny of the rocket equation, so they are typically limited in size, making it difficult for drilling systems to take advantage of some of the benefits entirely electric systems have over hydrocarbon-powered ones – such as higher torque.

No matter the difficulties these drilling systems face, they will be vital to the success of any future exploration program, including crewed ones. If we ever want to create lava-cave cities on the Moon or get through Enceladus’s ice sheet to the ocean within, we will need better drilling technologies and techniques. Luckily, there are plenty of design efforts underway to come up with them.

The paper details four different categories of drill designs:

  1. Surface drills – less than 10 cm depth
  2. Shallow-depth drills – less than 1m depth
  3. Medium-depth drills – between 1m and 10m depth
  4. Large-depth drills – greater than 10m depth 

For each category, the paper lists several designs at various stages of completeness. Many of them embody novel ideas about how to go about drilling, such as an “inchworm” system or ultrasonics.

CNET describes another Martian mission that used a drill – InSight.
Credit – CNET YouTube Channel

But for now, drilling off-world, and especially on asteroids and comets, which have their own gravitational challenges, remains a difficult but necessary task. As humanity becomes more experienced at it, we will undoubtedly get better at it. Given how important this process is for the grand plans of space explorers everywhere, the time when we can drill effectively into any rocky or icy body in the solar system can’t come soon enough.

Learn More:
Knez & Khalilidermani – A Review of Different Aspects of Off-Earth Drilling
UT – Drill, Baby, Drill! – How Does Curiosity ‘Do It’
UT – Cylindrical Autonomous Drilling Bot Could Reach Buried Martian Water
UT – Perseverance Drills Another Hole, and This Time the Sample is Intact

Lead Image:
Curiosity’s arm with its drill extended.
Credit – NASA/JPL/Ken Kremer/kenkremer.com/Marco Di Lorenzo

The post Why is it so hard to drill off Earth? appeared first on Universe Today.

Categories: Astronomy

Want to Start a Farm on Mars? This Rover Will Find Out if it’s Possible

Thu, 04/04/2024 - 8:22pm

Travelling to Mars has its own challenges. The distance alone makes the journey something of a mission in itself. Arrive, though, and the hard work has only just begun. Living and surviving on Mars will be perhaps humanity’s biggest challenge yet. It would be impossible to take along everything needed to survive, so instead it will be imperative to ‘live off the land’ and produce as much locally as possible. A new rover concept called AgroMars would be equipped with a number of agriculture-related experiments to study the makeup of the soil and assess its suitability for growing food.

Growing food on Mars poses a number of challenges, chiefly due to the harsh environmental conditions: low atmospheric pressure, temperature extremes, and high radiation levels. To address these, new techniques have been developed in the fields of hydroponics and aeroponics, which use nutrient-rich solutions instead of soil.

Special structures analogous to greenhouses on Earth are built, with artificial lighting and temperature and humidity control. Genetic engineering, too, has played a part, developing plants that are hardier and more capable of surviving harsh Martian conditions. As we continue to explore the Solar System, and Mars in particular, we are going to have to find ways to grow food in alien environments.

The space station’s Veggie Facility, tended here by NASA astronaut Scott Tingle, during the VEG-03 plant growth investigation, which cultivated Extra Dwarf Pak Choi, Red Russian Kale, Wasabi mustard, and Red Lettuce and harvested on-orbit samples for testing back on Earth. Credits: NASA

Enter AgroMars: a mission concept that would take a rover to Mars to explore the possibility of establishing agriculture there. The rover would have capabilities similar to the likes of Perseverance or Curiosity and would be launched to Mars on a SpaceX Falcon 9 – but that is some years off yet, as the development phase has not started. A paper by lead author M. Duarte dos Santos shapes the mission, but reality is still a little way off.

On arrival, AgroMars will use an X-ray and infrared spectrometer, high resolution cameras, pH sensors, mass spectrometers and drilling tools to collect and analyse soil samples. The samples will be assessed for mineralogical composition, soil texture, soil pH, presence of organic compounds and water retention capacity. 

To assess the Martian soil, the rover must possess advanced capabilities for collecting and analysing soil samples – more advanced than any before it. The data will then be sent to laboratories on Earth, whose responsibility it is to interpret the information. The multitude of groups involved is a wonderful reminder of how science transcends geographical borders. Working together will yield far better results and help advance our knowledge of astrobiology and agriculture on Mars.

‘Calypso’ Panorama of Spirit’s View from ‘Troy’. This full-circle view from the panoramic camera (Pancam) on NASA’s Mars Exploration Rover Spirit shows the terrain surrounding the location called “Troy,” where Spirit became embedded in soft soil during the spring of 2009. The hundreds of images combined into this view were taken beginning on the 1,906th Martian day (or sol) of Spirit’s mission on Mars (May 14, 2009) and ending on Sol 1943 (June 20, 2009). Credit: NASA/JPL-Caltech/Cornell University

This doesn’t come cheap, though. The estimated cost of the mission is in the region of $2.7 billion, which includes development, launch, and operations for the entire mission.

Of that total, $2.2 billion would cover the development and launch of the rover, with $500 million for its operation over the mission’s lifetime. Whether it – pardon the pun – gets off the ground remains to be seen, but if we are to explore and even establish a permanent base on Mars, we will have to gain a better understanding of the environment to feed and sustain future explorers.

Source : AgroMars, Space Mission Concept Study To Explore Martian Soil And Atmosphere To Search For Possibility Of Agriculture on Mars.

The post Want to Start a Farm on Mars? This Rover Will Find Out if it’s Possible appeared first on Universe Today.

Categories: Astronomy

Which Animal Has Seen the Most Total Solar Eclipses?

Thu, 04/04/2024 - 7:21pm

In a paper published on the 1st of April, author Mark Popinchalk reported on a fascinating piece of research focusing on which animal has seen the most solar eclipses. It turns out that, while we humans have seen our fair share, we are nowhere near the top of the list. According to Popinchalk, horseshoe crabs have seen a staggering 138 trillion solar eclipses across the entire species. We are hot on their heels, but it will take about 10 million years for us to catch up!

On Monday we will be treated to another total solar eclipse across many parts of the globe. An eclipse is the result of a near-perfect Sun, Moon, and Earth alignment, in which the Moon blocks sunlight from reaching parts of the Earth. When the Moon is directly between the Sun and Earth, the Sun is completely blocked from parts of the Earth, and we see a total solar eclipse. When only part of the Sun is blocked, we see a partial eclipse. As Monday’s eclipse progresses, hundreds of millions of people will witness the event unfold.
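Totality is possible because the Moon's apparent size can just cover the Sun's. A small sketch makes the coincidence concrete (radii and distances are standard mean values in kilometres, not figures from the article):

```python
# Apparent angular diameters of the Sun and Moon as seen from Earth.
# A total eclipse requires the Moon to appear at least as large as the Sun.
# Radii and distances are standard mean values in km (not from the article).
import math

def angular_diameter_deg(radius_km: float, distance_km: float) -> float:
    """Apparent angular diameter, in degrees, of a body at a given distance."""
    return math.degrees(2 * math.atan(radius_km / distance_km))

sun = angular_diameter_deg(696_000, 149_600_000)      # ~0.53 degrees
moon_perigee = angular_diameter_deg(1_737, 363_300)   # Moon at its closest
moon_apogee = angular_diameter_deg(1_737, 405_500)    # Moon at its farthest

print(f"Sun: {sun:.3f} deg, Moon: {moon_apogee:.3f}-{moon_perigee:.3f} deg")
```

Near perigee the Moon appears slightly larger than the Sun, so a well-aligned eclipse is total; near apogee it appears slightly smaller, and the same alignment produces an annular "ring of fire" instead.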

Totality and the ‘diamond ring effect,’ captured during the 2023 total solar eclipse as seen from Ah Chong Island, Australia. Credit: Eliot Herman

It goes without saying that eclipses are not human constructs, nor are they purely the domain of the human being. Eclipses have occurred for billions of years, since long before humans appeared on Earth. This means that animals witnessed eclipses for hundreds of millions of years before we were the proverbial twinkle in the eye of Mother Earth.

Across the eons in which eclipses have taken place, countless creatures have walked, flown, and swum beneath them. Popinchalk suggests even microbial activity should be considered, though it is impossible to say much about it. In the Cambrian period, a wide range of animals evolved. The challenge, however, is deciding whether an animal is actually aware of an eclipse, much less actually ‘observes’ it. There are anecdotal reports of birds going to roost during the lower light levels, but quantifying this is difficult.

Recent studies have tracked the reactions of animals during total solar eclipses at zoos in metropolitan areas. Hartstone-Rose and team tracked the responses of 17 families of animals during the 2017 eclipse and found that 13 of them behaved differently than usual, with 8 performing nighttime routines. Others, such as primates, exhibited anxiety-based behaviours much like our early ancestors might have.

Hartstone-Rose et al. observed Galapagos tortoises turning to look toward the sky during an eclipse – were they perhaps observing and contemplating the event? Studies from Lofting and Dolittle (1920) have explored animal communication, but until we can unlock its mysteries, we may never know. We cannot, however, hide from the fact that animals may well have seen eclipses; the debate is whether they really cottoned on to what was happening.

In the conclusion, Popinchalk shows how an estimated standing population of 120 million horseshoe crabs, witnessing 1.5 million eclipses over the species’ history, amounts to roughly 138 trillion total solar eclipse experiences. As for humans, taking a standing (average) population of 1 million and 320,000 eclipses gives a mere 32 billion experiences. We are lagging behind. The paper is a fascinating read – give it a try – but do remember it was published on the 1st of April; the numbers may have changed by then! It’s worthy of a winky emoji at this point.

Categories: Astronomy

The Moon Will Get its Own Time Zone

Thu, 04/04/2024 - 4:08pm

White House officials have directed NASA to begin work on establishing a standard time for the Moon, according to a report from Reuters this week. Coordinated Lunar Time (LTC) is intended to help ensure synchronization between the various lunar activities planned under the Artemis program.

Timekeeping is essential to space travel. It ensures orbital maneuvers occur correctly, it helps communications between spacecraft remain secure, and it prevents errors in positioning and mapping. Without it, in other words, lunar exploration would get very complicated.

We can blame Einstein and his theory of relativity for part of the problem. Time is experienced differently under different gravitational conditions, an effect known as time dilation.

“The same clock that we have on Earth would move at a different rate on the moon,” Kevin Coggins, NASA’s space communications and navigation chief, told Reuters.

On the Moon, clocks run faster than their Earthly counterparts by 58.7 microseconds per day. While most humans wouldn’t notice such a tiny difference, spacecraft certainly do.

Currently, spacecraft near Earth, like GPS satellites and the International Space Station, run on Coordinated Universal Time (UTC). But even in these cases, periodic corrections need to be made for time dilation; otherwise, GPS systems would lose precision and ultimately fail.
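That 58.7 microseconds per day sounds negligible, but radio navigation converts time into distance at the speed of light, so the drift compounds quickly. A back-of-envelope sketch:

```python
# Accumulated lunar clock drift from the quoted 58.7 microseconds/day, and
# the one-way radio-ranging error that drift implies if never corrected.
C = 299_792_458.0        # speed of light, m/s
RATE_US_PER_DAY = 58.7   # lunar clocks gain this much on Earth clocks daily

def offset_seconds(days: float) -> float:
    """Total clock drift, in seconds, after a given number of days."""
    return days * RATE_US_PER_DAY * 1e-6

def ranging_error_km(days: float) -> float:
    """Distance error, in km, implied by that drift for radio ranging."""
    return offset_seconds(days) * C / 1000.0

print(f"after 1 year: {offset_seconds(365):.4f} s drift, "
      f"~{ranging_error_km(365):.0f} km of ranging error")
```

After a single year the uncorrected drift is about 21 milliseconds – thousands of kilometres of apparent position error – which is why a shared lunar time standard matters for anything that navigates.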

Canadian astronaut David Saint-Jacques shows his watch, set to UTC, aboard the International Space Station. Credit: CSA.

The Apollo program’s Moon missions in the 1960s and ’70s relied on Houston time. Mission control was the astronauts’ timekeeper – though astronauts also made sidereal measurements using the stars to ensure they were on course and on time. That was enough for short-term lunar visits with only two vehicles (a command module and a lander).

But with dozens of countries and private companies vying to engage in long-term lunar exploration under the Artemis program, a shared timekeeping system is going to be vital.

“Think of the atomic clocks at the U.S. Naval Observatory (in Washington). They’re the heartbeat of the nation, synchronizing everything. You’re going to want a heartbeat on the moon,” Coggins said.

NASA will need international cooperation to bring LTC into being. UTC, the global standard for Earthly timekeeping, is managed by the International Bureau of Weights and Measures, and LTC will likely need to be brought before the same body to ensure its implementation is accepted internationally.

The White House memo proposing LTC recognized the need for international agreements to bring it to fruition. It suggested facilitating LTC through existing international bodies, but also through the Artemis Accords, a recent 36-nation agreement that outlines guidelines for cooperative space exploration.

According to the memo, plans for LTC are expected to be finalized by the end of 2026.

The post The Moon Will Get its Own Time Zone appeared first on Universe Today.

Categories: Astronomy

A New Tabletop Experiment to Search for Dark Matter

Thu, 04/04/2024 - 2:38pm

What is Dark Matter? We don’t know. At this stage of the game, scientists are busy trying to detect it and map out its presence and distribution throughout the Universe. Usually, that involves highly-engineered, sophisticated telescopes.

But a new approach involves a device so small it can sit on a kitchen table.

A collaboration between the University of Chicago and the Fermi National Accelerator Laboratory has resulted in a tabletop device called Broadband Reflector Experiment for Axion Detection or BREAD. BREAD is built to detect dark matter, and its first results are now available in a new paper.

The paper is “First Results from a Broadband Search for Dark Photon Dark Matter in the 44 to 52 µeV Range with a Coaxial Dish Antenna.” It’s published in Physical Review Letters, and the lead author is Stefan Knirck. Knirck is a Fermilab postdoctoral scholar who led the construction of the detector.

The word ‘mysterious’ barely describes dark matter. It constitutes about 85% of the matter in the Universe. It can’t be seen, but its presence is inferred from observations. Its mass holds galaxies together; without it, they would fly apart.

“We’re very confident that something is there, but there are many, many forms it could take.”

David Miller, University of Chicago

Dark matter is sometimes described as the Universe’s backbone, or the scaffolding that holds regular matter in place. Simulations like IllustrisTNG show how dark matter is distributed throughout the Universe in a network of filaments and clumps. The distribution of galaxy clusters follows the same pattern.

TNG 50, TNG 100, and TNG 300 simulated increasingly large sections of the Universe, showing how dark matter is spread throughout the Universe. Image: IllustrisTNG

Physicists still don’t know what dark matter is. But it’s there, and there are several candidates.

“We’re very confident that something is there, but there are many, many forms it could take,” said University of Chicago Associate Professor David Miller. Miller is a co-leader of the BREAD experiment.

One of the candidates is a hypothetical type of particle called an axion. If they’re real and their mass is within certain limits, they could be one of dark matter’s components.

The BREAD experiment focuses on the frequency range of 10.7–12.5 GHz. Within that range, it searches for dark photon dark matter; along with axions, dark photons are among the most promising candidates for dark matter. Dark photons are a hypothetical type of particle that physicists think might act as a force carrier for dark matter, the way photons are force carriers for electromagnetism. Axions and dark photons are linked in the search for dark matter, but a detailed explanation is beyond the scope of this article. (Watch Fraser Cain’s videos for a deeper dive.)
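The 10.7–12.5 GHz band and the 44–52 µeV mass range in the paper's title are the same interval in different units, related by the Planck relation E = hν. A quick conversion confirms the match:

```python
# Convert BREAD's search band from frequency to photon energy via E = h * nu.
H_EV_S = 4.135667696e-15  # Planck constant in eV*s (CODATA value)

def ghz_to_microev(freq_ghz: float) -> float:
    """Photon (or dark-photon mass) energy in micro-eV for a frequency in GHz."""
    return H_EV_S * freq_ghz * 1e9 * 1e6

print(f"10.7 GHz -> {ghz_to_microev(10.7):.1f} micro-eV")  # low edge of band
print(f"12.5 GHz -> {ghz_to_microev(12.5):.1f} micro-eV")  # high edge of band
```

The band edges come out near 44 and 52 µeV, exactly the range quoted in the paper's title, and Miller's "11–12 gigahertz" sensitivity corresponds to roughly 45–50 µeV.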

BREAD’s first run lasted 24 days and didn’t detect anything; if it had, it would be huge news, and we’d all hear it. But, since its effort is so focused, the lack of detection is still constructive.

“We’re very excited about what we’ve been able to do so far,” said Miller, “There are lots of practical advantages to this design, and we’ve already shown the best sensitivity to date in this 11-12 gigahertz frequency.”

Each candidate for dark matter requires a specific search. Physicists build detectors aimed at specific candidates. BREAD is a little bit different. As its name illustrates, it’s a broadband detector. It can search across a range of frequencies, though its precision suffers.

“If you think about it like a radio, the search for dark matter is like tuning the dial to search for one particular radio station, except there are a million frequencies to check through,” said Miller. “Our method is like doing a scan of 100,000 radio stations, rather than a few very thoroughly.”

This version of BREAD is a scaled-down prototype of the eventual full-scale experiment. Eventually, BREAD will sit inside a magnet, whose field will boost the chances that dark matter particles are converted into detectable photons. This first 24-day run was a proof of principle.

“This is just the first step in a series of exciting experiments we are planning.”

Andrew Sonnenschein, Fermilab

Fermilab’s Stefan Knirck with components of the BREAD detector. Eventually, BREAD will be placed inside a magnet to boost the chances that dark photons will convert to photons. Image Credit: BREAD

Though this first proof of principle run didn’t detect any dark matter, the results were still helpful. The run showed that BREAD is very sensitive in its frequency range. The researchers think they can improve the sensitivity even more.

“This is just the first step in a series of exciting experiments we are planning,” said Andrew Sonnenschein from Fermilab, who originally developed the concept behind BREAD. “We have many ideas for improving the sensitivity of our axion search.”

This schematic from the research helps explain how BREAD works. Dark photons convert to photons emitted perpendicularly from the cylinder. The signal is focused on a coaxial horn antenna, amplified using a low-noise receiver chain (right), down-converted and digitized using a custom real-time field-programmable gate array-based broadband data acquisition system (bottom). Image Credit: Knirck et al. 2024

Dark matter and what comprises it is one of the most confounding questions in science. For Miller, BREAD is more than just another science experiment. It speaks to the creativity needed to explore dark matter thoroughly and the way researchers at different institutions can work together to make progress.

“There are still so many open questions in science and an enormous space for creative new ideas for tackling those questions,” said Miller. “I think this is really a hallmark example of those kinds of creative ideas—in this case, impactful, collaborative partnerships between smaller-scale science at universities and larger-scale science at national laboratories.”

The post A New Tabletop Experiment to Search for Dark Matter appeared first on Universe Today.

Categories: Astronomy

NASA Announces Starliner’s Next Launch Attempt: May 6

Thu, 04/04/2024 - 2:13pm

Starliner, the new crewed capsule from Boeing, has been in the works for a long time. Originally unveiled in 2010, the capsule has been under development for the last 14 years, primarily utilizing NASA grants and contracts. However, Boeing itself has taken upwards of $1 billion in hits to earnings as part of the craft’s development. After all that time in the prototype stages, Starliner is finally ready for its first crewed flight – which has now officially been scheduled for May 6th.

The launch will utilize a ULA Atlas V; ULA itself is a joint venture part-owned by Boeing. Like most Atlas V launches, it will take off from Cape Canaveral in Florida, carrying two astronauts – Suni Williams and Butch Wilmore – to the International Space Station.

To make room for the capsule, the crew already aboard the ISS has some additional work to do, including moving a Dragon capsule out of the docking port on the ISS’s Harmony module, to which the Starliner will attach. Alongside moving the capsule, they will also have to complete some additional “science and cargo logistics,” according to a NASA press release.

Fraser covers Starliner’s successful test flight.

Those logistics seem to be the primary cause of a final five-day delay (from May 1st to May 6th) that Starliner will have to endure. Once at the ISS, Williams and Wilmore will spend a week helping out before using the Starliner capsule to return to Earth.

That is assuming all goes well with their flight. Starliner has had at least one spectacular failure as part of its development, though it successfully completed an uncrewed flight in May of 2022. If any astronauts are ready to ride on a new crewed capsule, it’s Williams and Wilmore. Both have been astronauts for over 20 years, and each was a trained Navy Test Pilot before joining NASA.

The capsule they will be using, known as Calypso, has already been to orbit, though not as many times as the astronauts themselves. It was used in the first orbital test flight, and while it didn’t manage to dock with the ISS, it did land successfully and wouldn’t have posed a risk to any astronauts on board.

Video from Boeing showcasing Starliner mounted atop an Atlas V.
Credit – Boeing YouTube Channel

Upon completing this test flight, NASA hopes to rely on the Starliner to provide regular crewed missions to the ISS. This would supplement the SpaceX Dragon capsule the agency already uses and cement the end of the drought in American crewed spaceflight.

Future missions include a four-person flight planned for 2025, assuming all goes well with this first one. Boeing also has a contract with NASA for five additional flights between 2026 and 2030. But first, if all goes well on May 6th, after years of work, the world will hopefully gain another crewed vehicle to help facilitate our path to the stars.

Learn More:
NASA – NASA, Boeing Update Launch Date for Starliner’s First Astronaut Flight
UT – Starliner Faces New Delays for Crewed Flights to ISS
UT – Finally! We get to See a View From Inside Boeing’s Starliner During its First Flight
UT – Starliner Needs Even More Fixes, and Probably won’t Carry Astronauts Until 2023

Lead Image:
The Boeing CST-100 Starliner spacecraft is lifted at the Vertical Integration Facility at Space Launch Complex-41 at Florida’s Cape Canaveral Space Force Station on May 4, 2022.
Photo credit: NASA/Frank Michaux

The post NASA Announces Starliner’s Next Launch Attempt: May 6 appeared first on Universe Today.

Categories: Astronomy

Perseverance Finds its Dream Rock

Thu, 04/04/2024 - 1:02pm

If there’s a Holy Grail on Mars, it’s probably a specific type of rock: A rock so important that it holds convincing clues to Mars’ ancient habitability.

Perseverance might have just found it.

If scientists could design the perfect rock for Perseverance to find, it would be one that displayed evidence of ancient water and was the type that preserves ancient organic material. The rover may have found it as it explores the Margin Unit, a geologic region on the inner edge of Jezero Crater’s rim. The Margin Unit was one of the reasons Jezero Crater was selected for Perseverance’s mission.

“To put it simply, this is the kind of rock we had hoped to find when we decided to investigate Jezero Crater.”

Ken Farley, Perseverance project scientist, Caltech.

The Margin Unit is in a narrow band along the crater’s western rim. Orbital observations showed that it’s one of the most carbonate-rich regions on the planet. “Its presence, along with the adjacent fluvial delta, made Jezero crater the most compelling landing site for the Mars 2020 [Perseverance] mission,” presenters at the 2024 Lunar and Planetary Science Conference wrote.

The Margin Unit lies near the western rim of Jezero Crater. White dots show Perseverance’s stopping points, and the blue line shows the rover’s future route. Image Credit: R.C. Wiens et al. 2024

The decision to send Perseverance to the Jezero Crater and the Margin Unit seems to be paying off. A rock dubbed Bunsen Peak caught scientists’ attention because it stands tall compared to its surroundings. One of the rock’s faces also has an interesting texture. Scientists thought the rock would allow for nice cross-sections, and since it stood vertically, there’d be less dust when working on it. Surface dust is a problem for Perseverance because it can obscure the rock’s chemistry.

The Perseverance team decided to sample it and cache the sample along with the rest of its cores for eventual return to Earth. But first, they scanned the rock’s surface with SuperCam and PIXL, the rover’s spectrometers. Then, they abraded the rock’s surface and scanned it again. The results show that Bunsen Peak is 75% carbonate grains cemented together by nearly pure silica.

This image mosaic shows the Bunsen Peak rock that has ignited scientists’ excitement. The rover abraded a circular patch to test its composition and extracted a core sample for return to Earth. The lighter surfaces are dust-covered, so Perseverance avoided those areas as the dust can obscure the rock’s chemistry from the rover’s instruments. Image Credit: NASA/JPL-Caltech/ASU/MSSS

“To put it simply, this is the kind of rock we had hoped to find when we decided to investigate Jezero Crater,” said Ken Farley, project scientist for Perseverance at Caltech in Pasadena, California. “Nearly all the minerals in the rock we just sampled were made in water; on Earth, water-deposited minerals are often good at trapping and preserving ancient organic material and biosignatures. The rock can even tell us about Mars’s climate conditions that were present when it was formed.”

This image shows the bottom of the Bunsen Peak sample core. The sample contains about 75% carbonate minerals cemented by almost pure silica. Image Credit: NASA/JPL-Caltech

Here on Earth, carbonate minerals can form directly around microbial cells. Once encapsulated, the cells can quickly fossilize and be preserved for vast stretches of time. This is what happened with stromatolites, which now constitute some of the earliest evidence of life on our planet.

These minerals are a high priority for return to Earth. This sample is number 24, named Comet Geyser, because everything gets a name when you intend to transport it to Earth from another planet.

There’s something specific that makes this sample even more intriguing: its carbonate and silica are microcrystalline, meaning they’re made of crystals so small that only microscopes can reveal them. On Earth, microcrystalline rocks like Precambrian chert hold fossilized cyanobacteria. Could the same be true of Bunsen Peak?

“The silica and parts of the carbonate appear microcrystalline, which makes them extremely good at trapping and preserving signs of microbial life that might have once lived in this environment,” said Sandra Siljeström, a Perseverance scientist from the Research Institutes of Sweden (RISE) in Stockholm. “That makes this sample great for biosignature studies if returned to Earth. Additionally, the sample might be one of the older cores collected so far by Perseverance, and that is important because Mars was at its most habitable early in its history.”


Comet Geyser is Perseverance’s third sample from the Margin Unit. There’s still more work to do, but the samples support what scientists thought about Jezero Crater before Perseverance landed there: it was once a paleolake.

“We’re still exploring the margin and gathering data, but results so far may support our hypothesis that the rocks here formed along the shores of an ancient lake,” said Briony Horgan, a Perseverance scientist from Purdue University. “The science team is also considering other ideas for the origin of the Margin Unit, as there are other ways to form carbonate and silica. But no matter how this rock formed, it is really exciting to get a sample.”

It wasn’t that long ago that we knew very little about Mars. In the absence of knowledge, imagination took over. American astronomer Percival Lowell wrote three books about canals on Mars, popularizing the idea that intelligent life existed on Mars and was engineering the planet’s surface.

Astronomers didn’t buy the idea, which turned out to be untrue. But now we know that Lowell was at least partially, though inadvertently, correct. There are no canals, but there may have been lakes.

There was no intelligent life, but there may have been simple life in those lakes. Once we get Comet Geyser and the other samples back to Earth, we may find out for sure.

The post Perseverance Finds its Dream Rock appeared first on Universe Today.

Categories: Astronomy

Start Your Engines: NASA Picks 3 Teams to Work on Lunar Terrain Vehicle

Wed, 04/03/2024 - 11:46pm

Some of the biggest names in aerospace — and the automotive industry — will play roles in putting NASA astronauts in the driver’s seat for roving around on the moon.

The space agency today selected three teams to develop the capabilities for a lunar terrain vehicle, or LTV, which astronauts could use during Artemis missions to the moon starting with Artemis 5. That mission is currently scheduled for 2029, three years after the projected date for Artemis’ first crewed lunar landing.

The teams’ leading companies may not yet be household names outside the space community: Intuitive Machines, Lunar Outpost and Venturi Astrolab. But each of those ventures has more established companies as its teammates.

Over the next 15 years, the three teams will be eligible to work on task orders amounting to a potential total value of $4.6 billion — with the aim of providing mobility technology for crewed and uncrewed moon rovers. The marquee vehicle would be a rover capable of carrying Artemis astronauts on journeys of exploration around the lunar surface, as well as taking robotic trips on its own.

“We look forward to the development of the Artemis generation lunar exploration vehicle to help us advance what we learn at the moon,” Vanessa Wyche, director of NASA’s Johnson Space Center in Houston, said today in a news release. “This vehicle will greatly increase our astronauts’ ability to explore and conduct science on the lunar surface while also serving as a science platform between crewed missions.”

In a posting to X / Twitter, NASA Administrator Bill Nelson said the LTV rover is “essential to the success of Artemis.”

After the teams conduct year-long feasibility studies, NASA plans to select one of the teams to go ahead with construction and testing of its LTV, leading up to a lunar demonstration mission in advance of Artemis 5. NASA could give the teams additional task orders to fill its needs for unpressurized rover capabilities on the moon through 2039.

Texas-based Intuitive Machines is best-known for putting a robotic lander on the lunar surface in February. A couple of its teammates — Boeing and Northrop Grumman — have moon-mission experience that goes back to the Apollo era. Michelin (the tire company) and AVL (which provides vehicle testing and simulation services) round out the Moon RACER team.

Colorado-based Lunar Outpost has already booked three rover missions for delivery to the moon by SpaceX and Intuitive Machines. Its teammates on the Lunar Dawn project include Lockheed Martin, General Motors, Goodyear Tire & Rubber and MDA Space (known for building the robotic arms on NASA’s space shuttles and the International Space Station).

California-based Astrolab made a separate deal last year with SpaceX to have its FLEX rover delivered to the moon aboard a Starship lander for a commercial mission that’s set for as soon as 2026. Astrolab’s teammates on the FLEX LTV project include Axiom Space (which is making spacesuits for Artemis moon missions) and Odyssey Space Research.

NASA said the LTV would support the Artemis program’s crewed missions to the moon’s south polar region, plus remote-controlled exploration activities as needed between those missions. “Outside those times, the provider will have the ability to use their LTV for commercial lunar surface activities unrelated to NASA missions,” the space agency said.

With regard to the financial arrangements, NASA said only that the Lunar Terrain Vehicle Services contract had a combined maximum potential value of $4.6 billion for all task-order awards. But a couple of the teams provided additional details. Intuitive Machines said it was awarded $30 million as a prime contractor to complete the initial feasibility study for Moon RACER. And Astrolab said its LTV contract could be worth up to $1.9 billion, depending on NASA’s needs.

The post Start Your Engines: NASA Picks 3 Teams to Work on Lunar Terrain Vehicle appeared first on Universe Today.

Categories: Astronomy

The Large Magellanic Cloud isn’t Very Metal

Wed, 04/03/2024 - 4:02pm

The Large Magellanic Cloud (LMC) is the Milky Way’s most massive satellite galaxy. Because it’s so easily observed, astronomers have studied it intently. They’re interested in how star formation in the LMC might have been different than in the Milky Way.

A team of researchers zeroed in on the LMC’s most metal-deficient stars to find out how different.

The LMC is about 163,000 light-years away and about 32,000 light-years across. Even though it’s that large, it’s still only 1/100th the mass of the Milky Way. It was probably a dwarf spiral galaxy before gravitational interactions with the Milky Way and the Small Magellanic Cloud warped its shape. Scientists predict it’ll probably merge with the Milky Way in about 2.4 billion years.

The LMC wasn’t always this close to the Milky Way. It formed elsewhere in the Universe, out of a different reservoir of gas than the Milky Way. The LMC’s stars preserve the environmental conditions they formed in.

The first stars to form in the Universe were the most metal-poor stars. When they formed, only the hydrogen and helium from the Big Bang were available. These stars are called Population III stars, and they’re largely hypothetical. They were massive, and many of them exploded as supernovae. These stars forged the heavier elements, called metals in astronomy, and then spread them out into space to be taken up by the next stars to form. That process continued generation by generation.

Population III stars were the Universe’s first stars. They were extremely massive, luminous stars, and many exploded as supernovae. Image Credit: DALL-E

Nobody’s ever found a Population III star because even if they were once more than hypothetical, they’d all be long gone by now. But in new research, scientists examined 10 of the LMC’s most metal-poor stars. They found one Population II star that is so metal-poor it’s similar to Population III stars.

The research is titled “Enrichment by extragalactic first stars in the Large Magellanic Cloud.” It’s published in the journal Nature Astronomy. The lead author is Anirudh Chiti from the Department of Astronomy & Astrophysics and the Kavli Institute for Cosmological Physics, both at the University of Chicago.

“This star provides a unique window into the very early element-forming process in galaxies other than our own,” said lead author Chiti. “We have built up an idea of how these stars that were chemically enriched by the first stars look like in the Milky Way, but we don’t yet know if some of these signatures are unique or if things happened similarly across other galaxies.”

The earliest Population III stars changed the Universe. By producing metals, they guaranteed the stars that followed had higher metallicities. But exactly what metals did they produce, and how much?

“We want to understand what the properties of those first stars were and what were the elements they produced,” said Chiti.

The difficult part is that nobody’s ever seen a Population III star. But by identifying an extremely metal-poor star that’s very similar to the first stars, the researchers found the next best thing. Finding nine other metal-poor stars was also helpful.

They compared the 10 LMC metal-poor stars to metal-poor stars in the Milky Way. The results show how different processes and different environments in both galaxies affected star formation and metal enrichment.

This illustration shows the Milky Way galaxy’s inner and outer halos. Old, metal-poor stars tend to inhabit the halo. (Image Credits: NASA, ESA, and A. Feild [STScI])

These metal-poor stars are difficult to find. Most of the stars in the Universe resulted from successive generations of stars; their enriched metallicity is a testament to that. Our Sun is a metal-rich Population I star, for example.

But these older, metal-poor Population II stars are out there. Since astronomers will likely never find an ancient Population III star, the Population II stars with the lowest metallicities are the next best things.

“Maybe fewer than 1 in 100,000 stars in the Milky Way is one of these second-gen stars,” Chiti said. “You really are fishing needles out of haystacks.”

But once astronomers find them, the outer layers of these rare stars hold evidence of the conditions they formed in. “In their outer layers, these stars preserve the elements near where they formed,” Chiti explained. “If you can find a very old star and get its chemical composition, you can understand what the chemical composition of the universe was like where that star formed billions of years ago.”

This figure from the study shows the ten LMC stars (blue crosses) compared to all stars within 10° of the LMC. They’re colour-coded with the [Fe/H] bar on the right; [Fe/H] expresses the ratio of iron atoms to hydrogen atoms relative to the Sun’s and is a common measure of overall metallicity. The scale on the left shows the CaHK index, based on the Ca II H & K spectral lines, another useful measure of metallicity. Image Credit: Chiti et al. 2024.

Finding such metal-poor stars in the LMC allowed astronomers to compare the star-forming conditions in that satellite galaxy to those in the Milky Way. The comparison can help astrophysicists understand how these star-forming conditions may have differed.

One of the 10 stars in the LMC stood out from the rest. It had markedly lower metallicity than the other nine. Called LMC 119, it’s 50 times more metal-deficient than the others. “Given its extremely low metallicity, this star exhibits the characteristics of a second-generation star that preserves the chemical imprints of a first-star supernova,” the authors write.
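Metallicity gaps like “50 times more metal-deficient” are usually expressed in dex, powers of ten on the logarithmic [Fe/H] scale. Here’s a minimal sketch in Python; the solar iron-to-hydrogen ratio used below is an illustrative round number, not a value from the study:

```python
import math

def feh(n_fe, n_h, n_fe_sun=3.16e-5, n_h_sun=1.0):
    # [Fe/H] in dex: the log10 of a star's iron-to-hydrogen number
    # ratio relative to the Sun's. The solar ratio used here is an
    # illustrative approximation.
    return math.log10(n_fe / n_h) - math.log10(n_fe_sun / n_h_sun)

# A star with 1/1000th the solar iron fraction sits at [Fe/H] = -3.0.
print(round(feh(3.16e-8, 1.0), 2))  # -3.0

# "50 times more metal-deficient" corresponds to a gap of
# log10(50), or about 1.7 dex, on this scale.
print(round(math.log10(50), 2))  # 1.7
```

On this scale, a star 50 times more iron-poor than its neighbours sits about 1.7 dex below them, which is why such outliers stand out so clearly in surveys.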

This figure from the research compares the atomic abundances of LMC 119 to red giant stars in the Milky Way’s halo, where older, metal-poor stars are situated. As the figure shows, LMC 119 has much lower metallicity than the Milky Way’s metal-poor stars. Image Credit: Chiti et al. 2024.

One fact stood out to the researchers when they mapped LMC 119’s elements. It had much less carbon than iron when compared to Milky Way stars. In fact, the same was true of all 10 stars in the sample. This is important because the LMC wasn’t always a satellite galaxy of the Milky Way. That association only goes back a couple of billion years or so. Its stars formed in a distant region of the high-redshift Universe.

“That was very intriguing, and it suggests that perhaps carbon enhancement of the earliest generation, as we see in the Milky Way, was not universal,” Chiti said. “We’ll have to do further studies, but it suggests there are differences from place to place.”

For Chiti and his colleagues, the conclusion is clear. “This, and other abundance differences, affirm that the extragalactic early LMC experienced diverging enrichment processes compared to the early Milky Way. Early element production, driven by the earliest stars, thus appears to proceed in an environment-dependent manner,” they write in their conclusion.

The Large and Small Magellanic Clouds are visible at the lower right-hand corner of this image of the Milky Way as seen by the European Space Agency’s Gaia satellite. Image Credit: ESA/Gaia/DPAC

Since Chiti and his fellow researchers found one very low-metallicity star in the LMC, there are probably many more among its suspected population of 20 billion stars. Chiti is leading a program to map out more stars in the southern sky and find more of these types of stars.

“This discovery suggests there should be many of these stars in the Large Magellanic Cloud if we look closely,” he said. “It’s really exciting to be opening up stellar archeology of the Large Magellanic Cloud and to be able to map out in such detail how the first stars chemically enriched the universe in different regions.”

The post The Large Magellanic Cloud isn’t Very Metal appeared first on Universe Today.

Categories: Astronomy

Could We Directly Observe Volcanoes on an Exoplanet?

Wed, 04/03/2024 - 2:54pm

After a few decades of simply finding exoplanets, humanity is starting to be able to do something more – peer into their atmospheres. The James Webb Space Telescope (JWST) has already started looking at the atmospheres of some larger exoplanets around brighter stars. But in many cases, scientists are still developing models that both explain what the planet’s atmosphere is made of and match the data. A new study from researchers at UC Riverside, NASA’s Goddard Space Flight Center, American University, and the University of Maryland looks at what one particular atmospheric process might look like on an exoplanet – volcanism.

There are a few caveats in the paper, though. First, the model itself is for an “exoEarth” – a planet equivalent to Earth circling a Sun-like star. Even JWST isn’t powerful enough to capture spectrographic data from the atmosphere of a planet this size, no matter how close it is. So, the authors make some assumptions about the next generation of large space telescopes – specifically, they refer to the LUVOIR project we’ve reported on before.

Assuming the next great space telescope can collect data as planned, it is still necessary to understand the data that comes in – in particular, what causes the dips in the spectra and what specific pattern, if any, emerges that might be related to active volcanoes.

Fraser talks about JWST’s capabilities as an exoplanet hunter.

Those volcanoes would likely be spewing out sulfur dioxide and sulfate aerosols into the atmosphere of the exoEarth. To model the introduction of those materials, the authors turned to a simulation program called the Goddard Earth Observing System Chemistry Climate Model (GEOSCCM). This model allows researchers to manipulate certain aspects of the atmosphere and watch the results over long periods.

In this particular case, the researchers modeled the effect of a volcano by injecting one of several quantities of sulfur dioxide into the atmosphere every three months for four years. They then observed the effects for some time after the volcano stopped “erupting” (i.e., when they stopped injecting sulfur dioxide into the model) so they could characterize the atmospheric composition of a planet recovering from a sustained eruption.
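The injection schedule described here can be sketched with a toy box model. To be clear, this is not GEOSCCM: the quarterly pulse size and the one-year decay timescale below are arbitrary illustrative assumptions, used only to show the qualitative buildup-then-recovery behaviour:

```python
import math

def so2_burden(months=72, pulse=1.0, tau=12.0, erupting_months=48):
    # Toy box model: inject a pulse of SO2 every 3 months for 4 years,
    # then let the burden decay exponentially with e-folding time tau.
    # Pulse size and tau are illustrative, not values from the study.
    burden, history = 0.0, []
    for m in range(months):
        if m < erupting_months and m % 3 == 0:
            burden += pulse                 # quarterly "eruption"
        burden *= math.exp(-1.0 / tau)      # monthly atmospheric loss
        history.append(burden)
    return history

history = so2_burden()
peak = max(history)
# The burden builds toward a plateau during the eruption phase, then
# decays once the injections stop -- the recovery the team modeled.
print(history[-1] < 0.2 * peak)  # True
```

Even this crude sketch reproduces the qualitative shape the researchers were after: a sustained elevated burden while the volcano is active, followed by a clear recovery signal once it quiets down.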

Three main spectral lines stood out in the researchers’ analysis. All three were related to oxygen – O2 (the breathable stuff), O3 (ozone), and good old H2O. Each of these three spectral signals underwent serious changes around the time of the eruptions, and those changes reversed once the eruptions ceased.

Fraser talks about the difficulties of directly imaging planets with Dr. Thayne Currie

One particular feature that stood out was the spectral line for ozone (O3). It continually decreased during the eruption phase, likely caused by its transformation into sulfuric acid. After the eruptions, however, the quantity of ozone in the modeled atmosphere began to creep up again, showing a resilience similar to that of our own ozone layer, which was damaged by CFC use last century.

With their expected results in hand, the researchers calculated how long a telescope like LUVOIR would have to observe a particular exoplanet to find the telltale spectral lines that would indicate active volcanism. Ozone was relatively simple, requiring only 6 hours of observation. Water vapor was trickier to quantify: depending on the variability of the signal, the required observation time could be as short as 9 hours, or detection could prove impossible altogether.

Studies like this will be crucial to the success of any future large space telescope mission, and there will be plenty of things for LUVOIR, or its equivalent, to look at when (and if) it launches. Therefore, plenty of other studies detailing what features we can expect to see will be necessary in the near future. But for now, at least we’ll know what to look for if we see volcanoes on a planet just like our own.

Learn More:
Ostberg et al – The Prospect of Detecting Volcanic Signatures on an ExoEarth Using Direct Imaging
UT – A Super-Earth (and Possible Earth-Sized) Exoplanet Found in the Habitable Zone
UT – Can JWST Tell the Difference Between an Exo-Earth and an Exo-Venus?
UT – Earth is an Exoplanet to Aliens. This is What They’d See

Lead Image:
LP 791-18 d, shown here in an artist’s concept, is an Earth-size world about 90 light-years away.
Credit: NASA’s Goddard Space Flight Center/Chris Smith (KRBwyle)

The post Could We Directly Observe Volcanoes on an Exoplanet? appeared first on Universe Today.

Categories: Astronomy