These earthly godfathers of Heaven's lights, that give a name to every fixed star, have no more profit of their shining nights than those that walk and know not what they are.

— William Shakespeare

Universe Today

Space and astronomy news

If Europa has Geysers, They’re Very Faint

Tue, 04/09/2024 - 9:20pm

In 2013, the Hubble Space Telescope spotted water vapour on Jupiter’s moon Europa. The vapour was evidence of plumes similar to the ones on Saturn’s moon Enceladus. That, and other compelling evidence, showed that the moon has an ocean. That led to speculation that the ocean could harbour life.

But the ocean is hidden beneath a thick, global layer of ice, making the plumes our only way of examining it. They’re so difficult to detect that they still haven’t been confirmed.

The lead author of the paper presenting Hubble’s 2013 evidence is Lorenz Roth of Southwest Research Institute. He said, “By far, the simplest explanation for this water vapour is that it erupted from plumes on the surface of Europa. If those plumes are connected with the subsurface water ocean we are confident exists under Europa’s crust, then this means that future investigations can directly investigate the chemical makeup of Europa’s potentially habitable environment without drilling through layers of ice. And that is tremendously exciting.”

It is, but first, scientists have to find the plumes.

“We pushed Hubble to its limits to see this very faint emission. These could be stealth plumes because they might be tenuous and difficult to observe in visible light,” said Joachim Saur of the University of Cologne, co-author of the 2013 paper.

This artist’s illustration shows plumes erupting through Europa’s icy surface. Gigantic Jupiter lurks in the background. Image Credit: NASA/ESA/K. Retherford/SWRI

Describing them as tenuous stealth plumes turned out to be prophetic.

Recently, a team of researchers went looking for the plumes. Their results appear in a presentation given at IAU Symposium 383, titled “ALMA Spectroscopy of Europa: A Search for Active Plumes.” The lead author is M.A. Cordiner from the Solar System Exploration Division at NASA’s Goddard Space Flight Center.

“The subsurface ocean of Europa is a high-priority target in the search for extraterrestrial life, but direct investigations are hindered by the presence of a thick exterior ice shell,” the authors write. The researchers used ALMA to search for molecular emissions from atmospheric plumes. They were investigating processes under the ice that could help them understand Europa’s ocean and its chemistry.

The Solar System is full of icy bodies, including comets, Kuiper Belt Objects, dwarf planets, and moons like Europa. Europa has a high density compared to other icy bodies, indicating a substantial rocky interior. Its ocean makes up about 10% of the moon and is covered by an icy shell of uncertain thickness. It could be several tens of kilometres thick. Scientists learned much of this from NASA’s Galileo mission.

In recent years, Europa and its ocean have leapt to the top of the list of targets in the search for life. The reasons aren’t obscure: liquid water is an irresistible beacon in our search for habitable places. The plumes from Europa’s ocean are our only way to study the ocean and its potential habitability.

This illustration shows what the interior of Europa might look like. Geysers might erupt through cracks and fissures in the ice. Image Credit: NASA/JPL-Caltech/Michael Carroll

Over the years, different telescopes have examined Europa, searching for more evidence of the plumes. They’ve found potential intermittent plume activity near the moon’s south pole. But confirmation of the plumes Hubble spotted in 2013 remains elusive. In 2023, the JWST examined Europa. Those observations “found no evidence for active plumes, indicating that any present-day activity must be localized and weak; robust confirmation of the initial HST plume results also remains challenging,” the authors write.

In an attempt to find the plumes, the authors employed ALMA, the Atacama Large Millimeter/submillimeter Array. They observed Europa on four separate days to cover the moon’s surface. Unfortunately, they found no plumes.

These are four ALMA images of Europa. The researchers observed the moon on four different days so they could image almost the entire surface. They found no plumes. Image Credit: Cordiner et al. 2024.

“Despite near-complete coverage of both Europa’s leading and trailing hemispheres, we find no evidence for gas phase molecular absorption or emission in our ALMA data,” the researchers write. “Using ALMA’s unique combination of high spectral/spatial resolution and sensitivity, our observations have enabled the first dedicated search for HCN, H2CO, SO2 and CH3OH in Europa’s exosphere and plumes. No evidence was found for the presence of these molecules.”

Finding no evidence doesn’t quite mean that those molecules aren’t there. Rather, it means that if they are there, their concentrations are so low they’re below the detection threshold. In some cases, those concentrations would be lower than the ones detected in Enceladus’ plumes, which have been confirmed.

One chemical in particular illustrates this point: CH3OH (methanol). “For the CH3OH abundance, on the other hand, our ALMA upper limit of < 0.86% would not have been sensitive enough to detect this molecule at the Enceladus plume abundance of 0.02%,” the authors write.

The abundance limits also reveal some interesting relationships between Europa and other icy objects in the Solar System. The researchers established an upper limit for H2CO (formaldehyde) on Europa. “Indeed, our H2CO abundance upper limit is significantly lower than measured by Cassini in the Enceladus plume, implying a possible chemical difference.”

Even though they didn’t find any plumes, the observations were still valuable. By setting detection limits, they will help guide subsequent searches. And this won’t be scientists’ final attempt at finding plumes. Anything that provides clues to Europa’s ocean is too tantalizing to ignore, and this research shows that ALMA is suited to this type of investigation.

“Our results show that ALMA is a powerful tool in the search for outgassing from icy bodies within the Solar System and that follow-up searches for other molecules at additional epochs (on Europa and other icy bodies) are justified,” the researchers conclude.

The post If Europa has Geysers, They’re Very Faint appeared first on Universe Today.

Categories: Astronomy

WISPR Team Images Turbulence within Solar Transients for the First Time

Tue, 04/09/2024 - 5:37pm

NASA’s Parker Solar Probe has been studying the Sun for the last six years. In 2021, it was hit directly by a coronal mass ejection while a mere 10 million kilometres from the solar surface. Luckily, it was gathering data and images at the time, enabling scientists to piece together an amazing video. The probe measured the interactions between the solar wind and the coronal mass ejection, giving an unprecedented view of the solar corona.

The Sun is a fascinating object and, as our local star, has been the subject of many studies. Mysteries remain, though, and it was to unravel some of these that NASA’s Parker Solar Probe was launched. It was sent on its way by a Delta IV Heavy back in 2018 and has flown seven times closer to the Sun than any spacecraft before it.

Illustration of the Parker Solar Probe spacecraft approaching the Sun. Credits: Johns Hopkins University Applied Physics Laboratory

By the time Parker finishes its seven-year mission, it will have completed 24 orbits of the Sun and flown to within 6.2 million kilometres of the visible surface. To survive there, the probe carries an 11.4 cm thick carbon composite shield to keep its components as cool as possible in searing temperatures of 1,377 degrees Celsius.
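
As a quick back-of-the-envelope check on that “seven times closer” figure, dividing the previous record perihelion by Parker’s works out to roughly seven (a rough sketch; the Helios 2 value below is an assumed reference figure, not from the article):

# Rough check of the "seven times closer" claim (illustrative, approximate figures).
helios2_perihelion_km = 43.4e6   # assumed previous record-holder, Helios 2 (approximate)
parker_perihelion_km = 6.2e6     # Parker's closest approach, from the article

ratio = helios2_perihelion_km / parker_perihelion_km
print(f"Parker flies roughly {ratio:.1f} times closer than the previous record")  # ~7.0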

Flying within the Sun’s outer atmosphere, the corona, the probe picked up turbulence inside a coronal mass ejection as it interacted with the solar wind. These events are eruptions of large amounts of highly magnetised, energetic plasma from the Sun’s corona. When directed toward Earth, they can cause magnetic and radio disturbances that disrupt everything from communications to power systems.

Image of a coronal mass ejection being discharged from the Sun. (Credit: NASA/Goddard Space Flight Center/Solar Dynamics Observatory)

Using the Wide Field Imager for Parker Solar Probe (WISPR) and its prime position inside the solar atmosphere, the team captured unprecedented footage (click on this link for the video). The science team from the US Naval Research Laboratory spotted what appeared to be turbulent eddies, so-called Kelvin-Helmholtz instabilities (KHIs), in one of the images. Turbulent eddy structures like these have been seen in the atmospheres of terrestrial planets, where strong wind shear between upper and lower cloud levels produces thin trains of crescent-shaped, wave-like clouds.

WISPR team member Evangelos Paouris, PhD, was the eagle-eyed individual who spotted the disturbance. Paouris and the team analysed the structure to verify the waves. The discovery of these rare features in the CME has opened up a whole new field of investigation.

The KHIs are the result of turbulence, which plays a key role in the movement of CMEs as they flow through the ambient solar wind. Understanding CMEs and their dynamics leads to a fuller understanding of the Sun’s corona. That doesn’t just help us understand the Sun; it also helps us understand the effect of CMEs on Earth and our space-based technology.

Source : WISPR Team Images Turbulence within Solar Transients for the First Time

The post WISPR Team Images Turbulence within Solar Transients for the First Time appeared first on Universe Today.

Categories: Astronomy

What Happens to Solar Systems When Stars Become White Dwarfs?

Tue, 04/09/2024 - 5:35pm

In several billion years, our Sun will be unrecognizable. It will swell up and become a red giant, then shrink again and become a white dwarf. The inner planets aren’t expected to survive all the mayhem these transitions unleash, but what will happen to them? What will happen to the outer planets?

Right now, our Sun is about 4.6 billion years old. It’s firmly on the main sequence, meaning it’s going about its business fusing hydrogen into helium and releasing energy. But even though it’s about 330,000 times more massive than the Earth, and much of that mass is hydrogen fuel, it will eventually run out.

In another five billion years or so, its vast reservoir of hydrogen will be depleted. As it burns through its hydrogen, the Sun will lose mass, and as it loses mass, its gravity weakens and can no longer fully counteract the outward force driven by fusion. A star is a balancing act between the outward push of fusion and the inward pull of gravity, and eventually, the Sun’s billions-of-years-long balancing act will totter.

With weakened gravity, the Sun will begin to expand and become a red giant.

This illustration shows the current-day Sun at about 4.6 billion years old. In the future, the Sun will expand and become a red giant. Image Credit: By Oona Räisänen (User:Mysid), User:Mrsanitazier. – Vectorized in Inkscape by Mysid on a JPEG by Mrsanitazier (en:Image: Sun Red Giant2.jpg). CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=2585107

The Sun will almost certainly consume Mercury and Venus when it becomes a red giant. It will expand and become about 256 times larger than it is now. The inner two planets are too close, and there’s no way they can escape the swelling star. Earth’s fate is less certain. It may be swallowed by the giant Sun, or it may not. But even if it isn’t consumed, it will lose its oceans and atmosphere and become uninhabitable.
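
A quick back-of-the-envelope calculation shows why Earth sits right on the edge: multiplying the Sun’s present radius by 256 puts the red giant’s surface at roughly Earth’s orbital distance (a rough sketch; the solar radius and AU values below are standard approximate figures):

# How far would a Sun 256 times its current size reach? (rough, illustrative numbers)
AU_KM = 1.496e8        # one astronomical unit in kilometres
R_SUN_KM = 6.96e5      # present-day solar radius in kilometres (approximate)

red_giant_radius_km = 256 * R_SUN_KM
print(f"Red giant radius: {red_giant_radius_km:.3g} km, "
      f"or {red_giant_radius_km / AU_KM:.2f} AU")   # ~1.2 AU, just past Earth's orbit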

The Sun will be a red giant for about one billion years. After that, it will undergo a series of more rapid changes, shrinking and expanding again. But the mayhem doesn’t end there.

The Sun will pulse and shed its outer layers before being reduced to a tiny remnant of what it once was: a white dwarf.

An artist’s impression of a white dwarf star. The material inside white dwarfs is tightly packed, making them extremely dense. Image credit: Mark Garlick / University of Warwick.

This will happen to the Sun, its ilk, and almost all stars that host planets. Even the long-lived red dwarfs (M-dwarfs) will eventually become white dwarfs, though their path is different.

Astronomers know the fate of planets too close to the stars undergoing these tumultuous changes. But what happens to planets further away? To their moons? To asteroids and comets?

New research published in Monthly Notices of the Royal Astronomical Society digs into the issue. The title is “Long-term variability in debris transiting white dwarfs,” and the lead author is Dr. Amornrat Aungwerojwit of Naresuan University in Thailand.

“Practically all known planet hosts will evolve eventually into white dwarfs, and large parts of the various components of their planetary systems—planets, moons, asteroids, and comets—will survive that metamorphosis,” the authors write.

There’s lots of observational evidence for this. Astronomers have detected planetary debris polluting the photospheres of white dwarfs, and they’ve also found compact debris disks around white dwarfs. Those findings show that not everything survives the main sequence to red giant to white dwarf transition.

“Previous research had shown that when asteroids, moons and planets get close to white dwarfs, the huge gravity of these stars rips these small planetary bodies into smaller and smaller pieces,” said lead author Aungwerojwit.

This Hubble Space Telescope image shows Sirius, with its white dwarf companion Sirius B to the lower left. Sirius B is the closest white dwarf to the Sun. Credit: NASA, ESA, H. Bond (STScI) and M. Barstow (University of Leicester).

In this research, the authors observed three white dwarfs over the span of 17 years. They analyzed the changes in brightness that occurred. Each of the three stars behaved differently.

When planets orbit stars, their transits are orderly and predictable. Not so with debris. The fact that the three white dwarfs showed such disorderly transits means they’re being orbited by debris. It also means the nature of that debris is changing.

“The unpredictable nature of these transits can drive astronomers crazy—one minute they are there, the next they are gone.”

Professor Boris Gaensicke, University of Warwick

As small bodies like asteroids and moons are torn into small pieces, they collide with one another until nothing’s left but dust. The dust forms clouds and disks that orbit and rotate around the white dwarfs.

Professor Boris Gaensicke of the University of Warwick is one of the study’s co-authors. “The simple fact that we can detect the debris of asteroids, maybe moons or even planets whizzing around a white dwarf every couple of hours is quite mind-blowing, but our study shows that the behaviour of these systems can evolve rapidly, in a matter of a few years,” Gaensicke said.

“While we think we are on the right path in our studies, the fate of these systems is far more complex than we could have ever imagined,” added Gaensicke.

This artist’s illustration shows the white dwarf WD J0914+1914 (Not part of this research.) A Neptune-sized planet orbits the white dwarf, and the white dwarf is drawing material away from the planet and forming a debris disk around the star. Image Credit: By ESO/M. Kornmesser – https://www.eso.org/public/images/eso1919a/, CC BY 4.0, https://commons.wikimedia.org/w/index.php?curid=84618722

During the 17 years of observations, all three white dwarfs showed variability.

The first white dwarf (ZTF J0328-1219) was steady and stable until a major catastrophic event around 2011. “This might suggest that the system underwent a large collisional event around 2011, resulting in the production of large amounts of dust occulting the white dwarf, which has since then gradually dispersed, though leaving sufficient material to account for the ongoing transit activity, which implies continued dust production,” the researchers explain.

The second white dwarf (ZTF J0923+4236) dimmed irregularly every couple of months and displayed chaotic variability on the timescale of minutes. “These long-term changes may be the result of the ongoing disruption of a planetesimal or the collision between multiple fragments, both leading to a temporarily increased dust production,” the authors explain in their paper.

The third star (WD 1145+017) showed large variations in the numbers, shapes, and depths of its transits in 2015. This “concurs with a large increase in transit activity, followed by a subsequent gradual re-brightening,” the authors explain, adding that “the overall trends seen in the brightness of WD 1145+017 are linked to varying amounts of transit activity.”

But now all those transits are gone.

“The unpredictable nature of these transits can drive astronomers crazy—one minute they are there, the next they are gone,” said Gaensicke. “And this points to the chaotic environment they are in.”

But astronomers have also found planetesimals, planets, and giant planets around white dwarfs, indicating that the stars’ transitions from main sequence to red giant don’t destroy everything. The dust and debris that astronomers see around these white dwarfs might come from asteroids or from moons pulled free from their giant planets.

“For the rest of the Solar System, some of the asteroids located between Mars and Jupiter, and maybe some of the moons of Jupiter may get dislodged and travel close enough to the eventual white dwarf to undergo the shredding process we have investigated,” said Professor Gaensicke.

When our Sun finally becomes a white dwarf, it will likely have debris around it. But the debris won’t be from Earth. One way or another, the Sun will destroy Earth during its red giant phase.

“Whether or not the Earth can just move out fast enough before the Sun can catch up and burn it is not clear, but [if it does] the Earth would [still] lose its atmosphere and ocean and not be a very nice place to live,” explained Professor Gaensicke.

The post What Happens to Solar Systems When Stars Become White Dwarfs? appeared first on Universe Today.

Categories: Astronomy

A Neutron Star Merged with a Surprisingly Light Black Hole

Tue, 04/09/2024 - 3:36pm

Galactic collisions, meteor impacts, and even stellar mergers are not uncommon events. Neutron stars colliding with black holes, however, are far rarer and have only recently been observed. The fourth LIGO-Virgo-KAGRA observing run detected gravitational waves from a collision between a black hole and a neutron star 650 million light-years away. The black hole was tiny, though, with a mass between 2.5 and 4.5 times that of the Sun.

Neutron stars and black holes have something in common: they are both the remains of a massive star that has reached the end of its life. During the main part of a star’s life, the inward pull of gravity is balanced by the outward push of the thermonuclear pressure that makes the star shine. Low-mass stars like the Sun end their lives relatively gently, but for more massive stars, gravity wins. The core collapses, compressing into either a neutron star or a black hole (depending on the progenitor star’s mass), and the star explodes as a supernova – in the blink of an eye.

In May 2023, during the fourth observing run of the LIGO-Virgo-KAGRA (Laser Interferometer Gravitational-Wave Observatory, Virgo gravitational-wave interferometer, and Kamioka Gravitational Wave Detector) network, gravitational waves were picked up from a merger event. The signal came from an object about 1.2 times the mass of the Sun and a second, more massive object. Further analysis revealed that the lighter one was likely a neutron star and the heavier one a low-mass black hole. The latter falls into the so-called “mass gap”: more massive than the most massive known neutron stars and less massive than the least massive known black holes.

Interactions between massive objects can generate gravitational waves. Before they were first detected in 2015, stellar-mass black holes were typically found through X-ray observations, while neutron stars were usually found with radio observations. Between the two lay the mass gap: a scarcity of objects between roughly three and five solar masses.

The gap has been the subject of debate among scientists, with the odd object found inside it fuelling arguments about whether it really exists. It has generally been considered to separate the neutron stars from the black holes, and objects in this mass range have been scarce. This gravitational wave discovery suggests that maybe objects in the gap are not so rare after all.

One of the challenges of detecting mass-gap objects and mergers between them is the sensitivity of the detectors. LIGO team researchers at the University of British Columbia are working hard to improve the coatings used in mirror production, and the enhanced performance of future LIGO detectors will improve detection capabilities. It’s not just optical equipment being developed; infrastructure changes, including data analysis software, are also being addressed. Improving sensitivity across the gravitational wave network is sure to yield results in future runs. For now, though, the rest of the first half of the observing run needs analysing, with 80 more candidate signals to study.

Source : New gravitational wave signal helps fill the ‘mass gap’ between neutron stars and black holes

The post A Neutron Star Merged with a Surprisingly Light Black Hole appeared first on Universe Today.

Categories: Astronomy

The Seven Most Intriguing Worlds to Search for Advanced Civilizations (So Far)

Tue, 04/09/2024 - 2:29pm

Sometimes, the easy calculations are the most interesting. A recent paper from Balázs Bradák of Kobe University in Japan is a case in point. In it, he takes an admittedly simplistic approach but comes up with seven known exoplanets that could hold the key to the biggest question of them all – are we alone?

Dr. Bradák starts with a simple premise: there is a chance that life on Earth might have started via panspermia. There is also a case that the panspermia was intentional – an advanced civilization could theoretically have sent a biological seed ship to our Solar System on purpose, to spread life here essentially from scratch.

With those admittedly very large assumptions in place, Dr. Bradák works out a few characteristics of the planets that could have been the starting point for such a civilization. First, he assumes, as much of the astrobiological community does, that for an advanced civilization to arise on a planet, that planet has to be at least partially covered in an ocean.

Sun-like stars aren’t the only potential hosts for habitable planets, as Fraser discusses here.

To meet that requirement, the planet has to be both the right size and the right temperature. The two size categories of exoplanets that Dr. Bradák originally selected were “terrestrials” (planets similar to Earth, including so-called “Super-Earths”) and “sub-Neptunes” (planets significantly larger than Earth but smaller than Neptune, the ice giant in the outer fringes of our Solar System).

Any such exoplanet also has to be in the habitable zone of its parent star. That alone dramatically narrows the potential field of planetary candidates. For simplicity’s sake, Dr. Bradák also eliminates sub-Neptunes as a potential planetary class. However, one other factor comes into play as well: age.

We know it took around 4.6 billion years for life on Earth to evolve to the point where it could send objects toward other star systems – as we have now with Voyager. Since the seeding civilization’s home planet would also have needed roughly that long to produce such a civilization, its system’s minimum age is double that figure – about 9.2 billion years.

The idea of panspermia has been around for decades, as Fraser discusses.

Dr. Bradák adds some additional arguments that lower the required age of the system, and he also assumes that another star’s planets form at roughly the same stage after the star’s birth as ours did. The distance to most of these stars is inconsequential on the scale of billions of years, so the travel time for the seed ship was discounted in this calculation.

After all that pruning, Dr. Bradák turned to NASA’s Exoplanet Archive, which currently contains 5271 known exoplanets. Of those 5271, only 7 meet the specified age, size, and habitable zone placement criteria. In other words, according to our current knowledge of exoplanets and how life evolved, only a few planets could potentially have been the starting point for an intentional panspermia campaign.

One planet in particular stands out – Kepler-452 b, which orbits a Sun-like star in an orbit similar to Earth’s. That system is only about 1,400 light-years away, relatively close by astronomical standards. If nothing else, it points to that system as a potentially interesting focal point for exoplanet surveys, including assessments of exoplanet atmospheres. However, we’ll likely have to wait for the next generation of grand telescopes.

For now, this was an interesting, though brief, speculative exercise. Astronomers are always looking for exciting things, and this paper contributes to the arguments about why it’s so important to spend time looking in detail at some of the exoplanets we already know about.

Learn More:
B. Bradák – A Bold and Hasty Speculation About Advanced Civilization-Bearing Planets Appearing in Exoplanet Databases

UT – A Super-Earth (and Possible Earth-Sized) Exoplanet Found in the Habitable Zone
UT – A New Place to Search for Habitable Planets: “The Soot Line.”
UT – Want to Find Life? Compare a Planet to its Neighbors

Lead Image:
Artist’s illustration of a habitable planet.
Credit – Wikipedia / VP8/Vorbis

The post The Seven Most Intriguing Worlds to Search for Advanced Civilizations (So Far) appeared first on Universe Today.

Categories: Astronomy

What a Swarm of Probes Can Teach Us About Proxima Centauri B

Tue, 04/09/2024 - 1:06pm

You’ve likely heard of the Breakthrough Starshot (BTS) initiative. BTS aims to send tiny, gram-scale, light-sail picospacecraft to our neighbour, Proxima Centauri b. In BTS’s scheme, lasers would propel a whole fleet of tiny probes to the potentially water-rich exoplanet.

Now another company, Space Initiatives Inc., is tackling the concept, and NASA has funded them to study it. What can we expect to learn from the effort?

Proxima b may be a close neighbour in planetary terms. But it’s in a completely different solar system, about four light-years away. That means any probes sent there must travel at relativistic speeds if we want them to arrive in a reasonable amount of time.

That’s why Space Initiatives Inc. proposes such tiny spacecraft. With their small masses, lasers can propel them directly to their destination. But it also means the mission must send a swarm of hundreds or even a thousand probes to get valuable scientific results.

This is very different from the architecture most missions follow. Most are single spacecraft, perhaps with a smaller attached probe like Huygens, which rode along with Cassini. How does using a swarm change the mission? What results can we expect?

“We anticipate our innovations would have a profound effect on space exploration.”

Thomas Eubanks, Space Inititatives Inc.

A new presentation at the 55th Lunar and Planetary Science Conference (LPSC) in Texas examined the idea. It’s titled “Scientific Return from In Situ Exploration of the Proxima b Exoplanet.” The lead author is T. Marshall Eubanks of Space Initiatives Inc., a start-up developing femtosatellites – spacecraft weighing less than 100 grams (3.5 oz) – in roughly the 50-gram class.

Probes like these can only do flybys; they’re too small and low-mass for anything else. When designing a mission like this, the first consideration is whether the probes will operate as a dispersed or coherent swarm. In a dispersed swarm, the probes reach their destination sequentially. In a coherent swarm, the probes arrive together and perform their flybys at the same time. Both architectures have their merits.

In either case, these tiny light-sail probes will be very thin. But thanks to technological advances, they can still gather high-resolution images by working together.

The image below shows 247 probes forming an array as they fly by Proxima b. Together, they have the light-collecting area of a three-meter telescope. This arrangement should enable sub-arc-second resolution at optical wavelengths, and the spectroscopy should be comparably fine.

“While both erosion by the Interstellar Medium (ISM) and image smearing will degrade imaging, we anticipate these systems will enable sub-arcsecond resolution imaging and spectroscopy of the target planet,” the authors write.
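
A rough diffraction-limit estimate shows why a three-metre effective aperture gets the swarm comfortably below an arcsecond (a simple sketch; the observing wavelength is an assumed value):

import math

# Diffraction-limited angular resolution of a 3 m synthetic aperture (illustrative).
wavelength_m = 550e-9   # assumed visible-light wavelength
aperture_m = 3.0        # effective aperture quoted for the 247-probe array

theta_rad = 1.22 * wavelength_m / aperture_m
theta_arcsec = math.degrees(theta_rad) * 3600
print(f"Diffraction limit: about {theta_arcsec:.3f} arcseconds")  # ~0.05", well under 1"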

This image from the presentation shows how the probe swarm would arrive at Proxima b. (Note that the planned swarm dispersion is much smaller than is indicated here.) Image Credit: Eubanks et al. 2024.

These tiny spacecraft could do some course correction, but not much, so getting the navigation right is critical. Unfortunately, Proxima b’s orbit is not as well understood as the orbits of the planets in our own Solar System. It all comes down to the ephemeris.

Ephemeris tables show the trajectory of planets and other objects in space. But in Proxima b’s case, the ephemeris error is potentially quite large.

Added to that is the distance. If the probes can travel at 20% of light speed, reaching the planet will take over 21 years. The authors calculate that if they can restrict Proxima b’s ephemeris error to 100,000 km and send 1,000 probes, at least one will come within 1,000 km of the planet. “Meeting this ephemeris error goal will require improved astrometry of the Proxima system,” the authors write.
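
The 21-year figure follows directly from the distance and the cruise speed (a minimal sketch, assuming the standard 4.24-light-year distance to Proxima Centauri and ignoring time spent accelerating):

# Cruise time to Proxima Centauri at 20% of the speed of light (acceleration ignored).
distance_ly = 4.24   # assumed distance to Proxima Centauri in light-years
speed_c = 0.20       # cruise speed as a fraction of c

travel_time_years = distance_ly / speed_c
print(f"Cruise time: about {travel_time_years:.1f} years")  # ~21.2 years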

The probes would perform science observations on their way to Proxima b. As they travel, the swarm would have dozens or even hundreds of opportunities to use microlensing to study stellar objects. A stellar-mass microlensing event that would take a month to observe from Earth would take the fast-moving probes only about an hour.

“It is now possible to predict lensing events for nearby stars; BTS probe observations of dozens or hundreds of predicted microlensing events by nearby stars will offer both a means of observing these systems and a novel means of interstellar navigation,” the authors explain.

The swarm would be only the third mission to leave our Solar System. The Voyager spacecraft left the heliosphere, but only incidentally. So the swarm could observe the interstellar medium (ISM) during its 20+ year journey. One of the questions we have about the local ISM concerns clouds. We have only poor data on the nature of these clouds, and scientists aren’t certain whether our Solar System is inside the Local Interstellar Cloud (LIC).

“In situ observation of the properties of these clouds will be a primary scientific goal for mission science during the long interstellar voyage,” the researchers write.

There are clouds in the ISM near our Solar System. But we don’t know much about them, including if our Solar System is in the LIC or if it’s leaving it. Image Credit: Interstellar Probe/JHUAPL

Opportunistic science during the voyage is great, but arrival at Proxima b is the meat of the mission. One day before the probes arrive, they would still be 35 AU away. At that point, the mission could begin imaging. Proxima b would still only be several pixels across, but it’s enough to see any visible moons.

“At this point, it would be worth turning some probes to face forward and begin imaging the Proxima system to search for undiscovered planets, moons and asteroids in the system, and to begin a Proxima b approach video,” the researchers explain.

Upon arrival at Proxima Centauri b, a one-meter aperture telescope 6,000 km away from the planet could attain a six-meter resolution on the surface. That’s an idealized number, as not all of the planet’s surface could be imaged at that resolution. Proxima b is also tidally locked to its star, meaning one side is in darkness. Because of that, the mission should be designed to gather low-light and infrared images of the night side. “Night-side illumination imagery might also be the most conclusive technosignature from an initial Proxima mission,” the authors write.
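
That six-meter figure is roughly what the diffraction limit allows for a one-meter aperture at that range (an idealized sketch; the wavelength is an assumed value, and image smearing and pointing errors are ignored):

# Ideal surface resolution for a 1 m aperture telescope 6,000 km from Proxima b.
wavelength_m = 800e-9   # assumed near-infrared observing wavelength
aperture_m = 1.0        # aperture quoted in the presentation
range_m = 6.0e6         # 6,000 km flyby distance

theta_rad = 1.22 * wavelength_m / aperture_m      # diffraction-limited angular resolution
resolution_m = theta_rad * range_m                # projected onto the surface
print(f"Diffraction-limited resolution: about {resolution_m:.1f} m")  # ~5.9 m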

As probes pass through Proxima b’s shadow, they could use the light from the star to perform spectroscopy. Probes passing behind Proxima b could use the Earth laser system for spectrometry, and if the probes are in a coherent swarm, they could use the lasers from pairs of probes on either side of the planet.

“Transmission spectroscopy, which for Proxima b cannot be done from Earth,” the researchers explain, “will likely provide the best means of determining the existence of a biology or even a technological society on Proxima b through the search for the spectral lines of biomarkers and technomarkers.”

As humanity’s first mission to Proxima Centauri b, the swarm would face some hurdles and uncertainties. But in a coherent swarm architecture, the mission could also be almost too successful. “A BTS mission, especially with a coherent swarm, may collect more data than can be returned to Earth,” the authors write. If the data returned has to be selected autonomously by the swarm itself, that could be more demanding than deciding what data to collect in the first place.

Scientists have many questions about Proxima Centauri b. Should the swarm ever be launched, any amount of data it returns will be valuable – even though that data will take more than four years to travel back to Earth.

An artist’s conception of a violent flare erupting from the red dwarf star Proxima Centauri. Such flares can obliterate the atmospheres of nearby planets. Credit: NRAO/S. Dagnello.

Scientists don’t know how hot the planet is. They’re not certain it even has liquid water. The planet appears to be just over one Earth mass, with a slightly larger radius, but those measurements are uncertain. Scientists are also uncertain about its composition. The star it orbits is a flare star, which means the planet could be subjected to extremely powerful bursts of radiation. That’s a lot of uncertainty.

But it’s the nearest exoplanet, the only one we could feasibly reach in a realistic amount of time. That alone makes it a desirable target.

There’s no final plan for a mission like this. It’s largely conceptual. But the technology to do it is coming along. NASA has funded a mission study, so it definitely has merit.

“Fortunately, we don’t have to wait until mid-century to make practical progress – we can explore and test swarming techniques now in a simulated environment, which is what we propose to do in this work,” said report lead author Thomas Eubanks from Space Initiatives Inc. “We anticipate our innovations would have a profound effect on space exploration, complementing existing techniques and enabling entirely new types of missions, for example, picospacecraft swarms covering all of cislunar space or instrumenting an entire planetary magnetosphere.”

Eubanks also points out how a swarm of probes could investigate interstellar objects that pass through our inner Solar System, like ‘Oumuamua.

But the main mission would be the one to Proxima Centauri b. According to Eubanks, that would happen sometime in the third quarter of this century.

The post What a Swarm of Probes Can Teach Us About Proxima Centauri B appeared first on Universe Today.

Categories: Astronomy

Measuring the Atmospheres of Other Worlds to See if There are Enough Nutrients for Life

Mon, 04/08/2024 - 6:25pm

Life on Earth depends on six critical elements: Carbon, Hydrogen, Nitrogen, Oxygen, Phosphorus, and Sulfur. These elements are referred to as CHNOPS, and along with several trace micronutrients and liquid water, they’re what life needs.

Scientists are getting a handle on detecting exoplanets that might be warm enough to have liquid water on their surfaces, habitability’s most basic signal. But now, they’re looking to up their game by finding CHNOPS in exoplanet atmospheres.

We’re only at the beginning of understanding how exoplanets could support life. To grow our understanding, we need to understand the availability of CHNOPS in planetary atmospheres.

A new paper examines the issue. It’s titled “Habitability constraints by nutrient availability in atmospheres of rocky exoplanets.” The lead author is Oliver Herbort from the Department of Astrophysics at the University of Vienna and an ARIEL post-doctoral fellow. The paper has been accepted by the International Journal of Astrobiology.

At our current technological level, we’re just beginning to examine exoplanet atmospheres. The JWST is our main tool for the task, and it’s good at it. But the JWST is busy with other tasks. In 2029, the ESA will launch ARIEL, the Atmospheric Remote-sensing Infrared Exoplanet Large survey. ARIEL will be solely focused on exoplanet atmospheres.

An artist’s impression of the ESA’s Ariel space telescope. During its four-year mission, it’ll examine 1,000 exoplanet atmospheres with the transit method. It’ll study and characterize both the compositions and thermal structures. Image Credit: ESA

In anticipation of that telescope’s mission, Herbort and his co-researchers are preparing for the results and what they mean for habitability. “The detailed understanding of the planets itself becomes important for interpreting observations, especially for the detection of biosignatures,” they write. In particular, they’re scrutinizing the idea of aerial biospheres. “We aim to understand the presence of these nutrients within atmospheres that show the presence of water cloud condensates, potentially allowing the existence of aerial biospheres.”

Our sister planet Venus has an unsurvivable surface. The extreme heat and pressure make the planet’s surface uninhabitable by any measure we can determine. But some scientists have proposed that life could exist in Venus’ atmosphere, based largely on the detection of phosphine, a possible indicator of life. This is an example of what an aerial biosphere might look like.

This artistic impression depicts Venus. Astronomers at MIT, Cardiff University, and elsewhere may have observed signs of life in the atmosphere of Venus by detecting phosphine. Subsequent research disagreed with this finding, but the issue is ongoing. Image Credits: ESO (European Southern Observatory)/M. Kornmesser & NASA/JPL/Caltech

“This concept of aerial biospheres enlarges the possibilities of potential habitability from the presence of liquid water on the surface to all planets with liquid water clouds,” the authors explain.

The authors examined the idea of aerial biospheres and how the detection of CHNOPS plays into them. They introduced the concept of nutrient availability levels in exoplanet atmospheres. In their framework, the presence of water is required regardless of other nutrient availability. “We considered any atmosphere without water condensates as uninhabitable,” they write, a nod to water’s primacy. The researchers assigned different levels of habitability based on the presence and amounts of the CHNOPS nutrients.

This table from the research illustrates the authors’ concept of atmospheric nutrient availability. As the top row shows, without water, no atmosphere is habitable. Different combinations of nutrients have different habitability potential. ‘red’ stands for redox, and ‘ox’ stands for the presence of the oxidized state of CO2, NOx, and SO2. Image Credit: Herbort et al. 2024.

To explore their framework of nutrient availability, the researchers turned to simulations. The simulated atmospheres held different levels of nutrients, and the researchers applied their concept of nutrient availability. Their results aim to understand not habitability but the chemical potential for habitability. A planet’s atmosphere can be altered drastically by life, and this research aims to understand the atmospheric potential for life.

“Our approach does not directly aim for the understanding of biosignatures and atmospheres of planets, which are inhabited, but for the conditions in which pre-biotic chemistry can occur,” they write. In their work, the minimum atmospheric concentration for a nutrient to be available is 10⁻⁹, or one ppb (part per billion).

“We find that for most atmospheres at (p_gas, T_gas) points, where liquid water is stable, CNS-bearing molecules are present at concentrations above 10⁻⁹,” they write. They also found that carbon is generally present in every simulated atmosphere and that sulphur availability increases with surface temperature. With lower surface temperatures, nitrogen (N2, NH3) is present in increasing amounts. But with higher surface temperatures, nitrogen can become depleted.
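
To make the idea of nutrient availability levels concrete, here is a toy check in the spirit of the authors’ framework (a minimal sketch, not the paper’s actual scheme: the 10⁻⁹ threshold and the water requirement come from the text above, but the molecule groupings and output wording are illustrative):

# Toy nutrient-availability check inspired by Herbort et al. (illustrative only).
THRESHOLD = 1e-9   # minimum mixing ratio (1 ppb) for a nutrient to count as available

# Illustrative carrier molecules for each element (not the paper's exact lists).
CARRIERS = {
    "C": ["CO2", "CH4", "CO"],
    "N": ["N2", "NH3"],
    "S": ["SO2", "H2S"],
    "P": ["PH3"],
}

def assess_atmosphere(mixing_ratios, has_water_condensates):
    """Report which elements clear the 1 ppb availability threshold."""
    if not has_water_condensates:
        return "uninhabitable (no liquid-water condensates)"
    available = [
        element
        for element, molecules in CARRIERS.items()
        if any(mixing_ratios.get(m, 0.0) >= THRESHOLD for m in molecules)
    ]
    return "water present; available nutrients: " + (", ".join(available) or "none")

# Example: a CO2/N2 atmosphere with trace SO2 and no detectable phosphorus carrier.
print(assess_atmosphere({"CO2": 0.95, "N2": 0.03, "SO2": 5e-8}, has_water_condensates=True))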

Phosphorus is a different matter. “The limiting element of the CHNOPS elements is phosphorus, which is mostly bound in the planetary crust,” they write. The authors point out that at times in Earth’s past, phosphorus scarcity limited the biosphere.

An aerial biosphere is an interesting idea. But it’s not the main thrust of scientists’ efforts to detect exoplanet atmospheres. Surface life is their holy grail. It should be no surprise that it still comes down to liquid water, all things considered. “Similar to previous work, our models suggest that the limiting factor for habitability at the surface of a planet is the presence of liquid water,” the authors write. In their work, when surface water was available, CNS was available in the lower atmosphere near the surface.

But surface water plays several roles in atmospheric chemistry. It can bond with some nutrients in some circumstances, making them unavailable, and in other circumstances, it can make them available.

“If water is available at the surface, the elements not present in the gas phase are stored in the crust condensates,” the authors write. Chemical weathering can then make them available as nutrients. “This provides a pathway to overcome the lack of atmospheric phosphorus and metals, which are used in enzymes that drive many biological processes.”

Artist’s impression of the surface of a hycean world. Hycean worlds are still hypothetical, with large oceans and thick hydrogen-rich atmospheres that trap heat. It’s unclear if a world with no surface can support life. Image Credit: University of Cambridge

This complicates matters on worlds covered by oceans. Pre-biotic molecules might not be available if there’s no opportunity for water and rock to interact with the atmosphere. “If indeed it can be shown that life can form in a water ocean without any exposed land, this constraint becomes weaker, and the potential for the surface habitability becomes mainly a question of water stability,” the authors write.

Some of the models are surprising because of atmospheric liquid water. “Many of the models show the presence of a liquid water zone in the atmospheres, which is detached from the surface. These regions could be of interest for the formation of life in forms of aerial biospheres,” Herbort and his colleagues write.

If there’s one thing research like this shows, it’s that planetary atmospheres are extraordinarily complex and can change dramatically over time, sometimes because of life itself. This research is one attempt to make sense of it all. Underlining the complexity is the fact that the researchers didn’t include stellar radiation in their work; including it would have made the effort unwieldy.

The habitability issue is complicated, confounded by our lack of answers to foundational questions. Does a planet’s crust have to be in contact with water and the atmosphere for the CHNOPS nutrients to be available? Earth has a temporary aerial biosphere. Can aerial biospheres be an important part of exoplanet habitability?

But beyond all the simulations and models, as powerful as they are, what scientists need most is more data. When ARIEL launches, scientists will have much more data to work with. Research like this will help scientists understand what ARIEL finds.

The post Measuring the Atmospheres of Other Worlds to See if There are Enough Nutrients for Life appeared first on Universe Today.

Categories: Astronomy

Does the Rise of AI Explain the Great Silence in the Universe?

Mon, 04/08/2024 - 3:18pm

Artificial Intelligence is making its presence felt in thousands of different ways. It helps scientists make sense of vast troves of data; it helps detect financial fraud; it drives our cars; it feeds us music suggestions; its chatbots drive us crazy. And it’s only getting started.

Are we capable of understanding how quickly AI will continue to develop? And if the answer is no, does that constitute the Great Filter?

The Fermi Paradox is the discrepancy between the apparent high likelihood of advanced civilizations existing and the total lack of evidence that they do exist. Many solutions have been proposed for why the discrepancy exists. One of the ideas is the “Great Filter.”

The Great Filter is a hypothesized event or situation that prevents intelligent life from becoming interplanetary and interstellar and even leads to its demise. Think climate change, nuclear war, asteroid strikes, supernova explosions, plagues, or any number of other things from the rogue’s gallery of cataclysmic events.

Or how about the rapid development of AI?

A new paper in Acta Astronautica explores the idea that Artificial Intelligence becomes Artificial Super Intelligence (ASI) and that ASI is the Great Filter. The paper’s title is “Is Artificial Intelligence the Great Filter that makes advanced technical civilizations rare in the universe?” The author is Michael Garrett from the Department of Physics and Astronomy at the University of Manchester.

“Without practical regulation, there is every reason to believe that AI could represent a major threat to the future course of not only our technical civilization but all technical civilizations.”

Michael Garrett, University of Manchester

Some think the Great Filter prevents technological species like ours from becoming multi-planetary. That’s bad because a species is at greater risk of extinction or stagnation with only one home. According to Garrett, a species is in a race against time without a backup planet. “It is proposed that such a filter emerges before these civilizations can develop a stable, multi-planetary existence, suggesting the typical longevity (L) of a technical civilization is less than 200 years,” Garrett writes.

If true, that can explain why we detect no technosignatures or other evidence of ETIs (Extraterrestrial Intelligences). What does that tell us about our own technological trajectory? If we face a 200-year constraint, and if it’s because of ASI, where does that leave us? Garrett underscores the “…critical need to quickly establish regulatory frameworks for AI development on Earth and the advancement of a multi-planetary society to mitigate against such existential threats.”

An image of our beautiful Earth taken by the Galileo spacecraft in 1990. Do we need a backup home? Credit: NASA/JPL

Many scientists and other thinkers say we’re on the cusp of enormous transformation. AI is just beginning to transform how we do things; much of the transformation is behind the scenes. AI seems poised to eliminate jobs for millions, and when paired with robotics, the transformation seems almost unlimited. That’s a fairly obvious concern.

But there are deeper, more systematic concerns. Who writes the algorithms? Will AI discriminate somehow? Almost certainly. Will competing algorithms undermine powerful democratic societies? Will open societies remain open? Will ASI start making decisions for us, and who will be accountable if it does?

This is an expanding tree of branching questions with no clear terminus.

Stephen Hawking (RIP) famously warned that AI could end humanity if it begins to evolve independently. “I fear that AI may replace humans altogether. If people design computer viruses, someone will design AI that improves and replicates itself. This will be a new form of life that outperforms humans,” he told Wired magazine in 2017. Once AI can outperform humans, it becomes ASI.

Stephen Hawking was a major proponent for colonizing other worlds, mainly to ensure humanity does not go extinct. In later years, Hawking recognized that AI could be an extinction-level threat. Credit: educatinghumanity.com

Hawking may be one of the most recognizable voices to issue warnings about AI, but he’s far from the only one. The media is full of discussions and warnings, alongside articles about the work AI does for us. The most alarming warnings say that ASI could go rogue. Some people dismiss that as science fiction, but not Garrett.

“Concerns about Artificial Superintelligence (ASI) eventually going rogue is considered a major issue – combatting this possibility over the next few years is a growing research pursuit for leaders in the field,” Garrett writes.

If AI provided no benefits, the issue would be much easier. But it provides all kinds of benefits, from improved medical imaging and diagnosis to safer transportation systems. The trick for governments is to allow benefits to flourish while limiting damage. “This is especially the case in areas such as national security and defence, where responsible and ethical development should be paramount,” writes Garrett.

News reports like this might seem impossibly naive in a few years or decades.

The problem is that we and our governments are unprepared. There’s never been anything like AI, and no matter how we try to conceptualize it and understand its trajectory, we’re left wanting. And if we’re in this position, so would any other biological species that develops AI. The advent of AI and then ASI could be universal, making it a candidate for the Great Filter.

This is the risk ASI poses in concrete terms: It could no longer need the biological life that created it. “Upon reaching a technological singularity, ASI systems will quickly surpass biological intelligence and evolve at a pace that completely outstrips traditional oversight mechanisms, leading to unforeseen and unintended consequences that are unlikely to be aligned with biological interests or ethics,” Garrett explains.

How could ASI relieve itself of the pesky biological life that corrals it? It could engineer a deadly virus, it could inhibit agricultural food production and distribution, it could force a nuclear power plant to melt down, and it could start wars. We don’t really know because it’s all uncharted territory. Hundreds of years ago, cartographers would draw monsters on the unexplored regions of the world, and that’s kind of what we’re doing now.

This is a portion of the Carta Marina map from the year 1539. It shows monsters lurking in the unknown waters off of Scandinavia. Are the fears of ASI kind of like this? Or could ASI be the Great Filter? Image Credit: By Olaus Magnus – http://www.npm.ac.uk/rsdas/projects/carta_marina/carta_marina_small.jpg, Public Domain, https://commons.wikimedia.org/w/index.php?curid=558827

If this all sounds forlorn and unavoidable, Garrett says it’s not.

His analysis so far is based on ASI and humans occupying the same space. But if we can attain multi-planetary status, the outlook changes. “For example, a multi-planetary biological species could take advantage of independent experiences on different planets, diversifying their survival strategies and possibly avoiding the single-point failure that a planetary-bound civilization faces,” Garrett writes.

If we can distribute the risk across multiple planets around multiple stars, we can buffer ourselves against the worst possible outcomes of ASI. “This distributed model of existence increases the resilience of a biological civilization to AI-induced catastrophes by creating redundancy,” he writes.

If one of the planets or outposts that future humans occupy fails to survive the ASI technological singularity, others may survive. And they would learn from it.

Artist’s illustration of a SpaceX Starship landing on Mars. If we can become a multi-planetary species, the threat of ASI is diminished. Credit: SpaceX

Multi-planetary status might even do more than just survive ASI. It could help us master it. Garrett imagines situations where we can experiment more thoroughly with AI while keeping it contained. Imagine AI on an isolated asteroid or dwarf planet, doing our bidding without access to the resources required to escape its prison. “It allows for isolated environments where the effects of advanced AI can be studied without the immediate risk of global annihilation,” Garrett writes.

But here’s the conundrum. AI development is proceeding at an accelerating pace, while our attempts to become multi-planetary aren’t. “The disparity between the rapid advancement of AI and the slower progress in space technology is stark,” Garrett writes.

The difference is that AI is computational and informational, but space travel contains multiple physical obstacles that we don’t yet know how to overcome. Our own biological nature restrains space travel, but no such obstacle restrains AI. “While AI can theoretically improve its own capabilities almost without physical constraints,” Garrett writes, “space travel must contend with energy limitations, material science boundaries, and the harsh realities of the space environment.”

For now, AI operates within the constraints we set. But that may not always be the case. We don’t know when AI might become ASI or even if it can. But we can’t ignore the possibility. That leads to two intertwined conclusions.

If Garrett is correct, humanity must work more diligently on space travel. It can seem far-fetched, but knowledgeable people know it’s true: Earth will not be habitable forever. Humanity will perish here by our own hand or nature’s hand if we don’t expand into space. Garrett’s 200-year estimate just puts an exclamation point on it. A renewed emphasis on reaching the Moon and Mars offers some hope.

The Artemis program is a renewed effort to establish a presence on the Moon. After that, we could visit Mars. Are these our first steps to becoming a multi-planetary civilization? Image Credit: NASA

The second conclusion concerns legislating and governing AI, a difficult task in a world where psychopaths can gain control of entire nations and are bent on waging war. “While industry stakeholders, policymakers, individual experts, and their governments already warn that regulation is necessary, establishing a regulatory framework that can be globally acceptable is going to be challenging,” Garrett writes. Challenging barely describes it. Humanity’s internecine squabbling makes it all even more unmanageable. Also, no matter how quickly we develop guidelines, ASI might change even more quickly.

“Without practical regulation, there is every reason to believe that AI could represent a major threat to the future course of not only our technical civilization but all technical civilizations,” Garrett writes.

This is the United Nations General Assembly. Are we united enough to constrain AI? Image Credit: By Patrick Gruban, cropped and downsampled by Pine – originally posted to Flickr as UN General Assembly, CC BY-SA 2.0, https://commons.wikimedia.org/w/index.php?curid=4806869

Many of humanity’s hopes and dreams crystallize around the Fermi Paradox and the Great Filter. Are there other civilizations? Are we in the same situation as other ETIs? Will our species leave Earth? Will we navigate the many difficulties that face us? Will we survive?

If we do, it might come down to what can seem boring and workaday: wrangling over legislation.

“The persistence of intelligent and conscious life in the universe could hinge on the timely and effective implementation of such international regulatory measures and technological endeavours,” Garrett writes.

The post Does the Rise of AI Explain the Great Silence in the Universe? appeared first on Universe Today.

Categories: Astronomy

If We Want to Visit More Asteroids, We Need to Let the Spacecraft Think for Themselves

Mon, 04/08/2024 - 12:22pm

Missions to asteroids have been on a tear recently. Rosetta, OSIRIS-REx, and Hayabusa2 have all visited small bodies and, in some cases, successfully returned samples to Earth. But as humanity starts reaching out to asteroids, it will run into a significant technical problem – bandwidth. There are tens of thousands of asteroids in our vicinity, some of which could potentially be dangerous. If we launched a mission to collect the necessary data about each of them, our interplanetary communication and control infrastructure would quickly be overwhelmed. So why not let our robotic ambassadors do it for themselves? That’s the idea behind a new paper from researchers at the Federal University of São Paulo and Brazil’s National Institute for Space Research.

The paper primarily focuses on the control problem of what to do when a spacecraft is approaching a new asteroid. Current missions take months to approach and require constant feedback from ground teams to ensure the spacecraft understands the parameters of the asteroid it’s approaching – especially its gravitational parameter.

Some missions have had more success with that than others. For example, Philae, the lander that accompanied Rosetta, ran into trouble when it bounced off the surface of comet 67P/Churyumov-Gerasimenko. As the authors point out, part of that trouble stemmed from a massive discrepancy between the comet’s actual shape and the shape telescopes had observed before Rosetta arrived.

Fraser discusses the possibility of capturing an asteroid.

Even more successful missions, such as OSIRIS-REx, take months of lead-up time to complete maneuvers that are relatively trivial in the context of the millions of kilometers their overall journeys cover. For example, it took 20 days for OSIRIS-REx to perform multiple flybys at 7 km above the asteroid’s surface before mission control deemed it safe to enter a stable orbit.

One of the significant constraints mission controllers were watching was whether they could accurately calculate the gravitational parameter of the asteroid they were visiting. Gravity is notoriously difficult to determine from far away, and its miscalculation contributed to the problems with Philae. So, what can a control scheme do to solve all of these problems?

Simply put, it can allow spacecraft to decide for themselves what to do when approaching their targets. With a well-defined control scheme, the likelihood of a spacecraft failure due to some unforeseen consequence is relatively minimal. It could dramatically decrease the time missions spend on approach and limit the communication bandwidth needed back to mission control on Earth.

One use case for quick asteroid missions is mining them, as Fraser discusses here.

Such a scheme would also require only four relatively ubiquitous, inexpensive sensors to operate effectively – a LiDAR (similar to those found on autonomous cars), two optical cameras for depth perception, and an inertial measurement unit (IMU) that measures parameters like orientation, acceleration, and magnetic field. 

The paper spends plenty of time detailing the complex math that would go into the control scheme – some of which involves statistical calculations similar to basic learning models. The authors also run trials on two potential asteroid targets of interest to see how the system would perform.

One is already well understood. Bennu was the target of the OSIRIS-REx mission and is therefore about as well-characterized as asteroids get. According to the paper, with the new control system, a spacecraft could enter a 2,000 m orbit within a day of approaching from hundreds of kilometers away, then move to an 800 m orbit the next day. That compares to the months of preparatory work the actual OSIRIS-REx mission had to complete. And it can be done with minimal thrust and, more importantly, minimal fuel – a precious commodity on deep-space missions.
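The paper’s own estimator is far more sophisticated, but the core idea – inferring the asteroid’s gravitational parameter µ = GM from the spacecraft’s own motion – can be illustrated with a toy least-squares fit built on the vis-viva relation. The Bennu-like value of µ below is an approximate published figure used purely for illustration; everything else is an assumption, not the authors’ algorithm.

```python
import numpy as np

# Toy sketch (not the paper's method): estimate an asteroid's gravitational
# parameter mu = GM from noisy range and speed samples, using the circular-orbit
# form of the vis-viva equation, v^2 = mu / r.

MU_TRUE = 4.89  # m^3/s^2, roughly Bennu's gravitational parameter (illustrative)

rng = np.random.default_rng(42)
radii = rng.uniform(1500.0, 3000.0, size=200)        # ranges in metres
speeds = np.sqrt(MU_TRUE / radii)                    # ideal circular-orbit speeds
speeds += rng.normal(0.0, 0.002, size=speeds.size)   # ~2 mm/s measurement noise

# v^2 = mu * (1/r) is linear in mu, so a one-parameter least-squares fit is
# mu_hat = sum(x*y) / sum(x*x) with x = 1/r and y = v^2.
x = 1.0 / radii
y = speeds**2
mu_hat = np.sum(x * y) / np.sum(x * x)

v_orbit = np.sqrt(mu_hat / 2000.0)                   # circular speed at 2,000 m
period_hours = 2 * np.pi * 2000.0 / v_orbit / 3600.0

print(f"estimated mu: {mu_hat:.2f} m^3/s^2 (true value used: {MU_TRUE})")
print(f"circular orbit at 2,000 m: ~{v_orbit*100:.0f} cm/s, period ~{period_hours:.0f} h")
```

The numbers make the challenge obvious: orbital speeds around a body like Bennu are only a few centimetres per second, so even millimetre-per-second measurement errors matter – which is part of why real missions spend weeks of flybys refining the estimate before committing to an orbit.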

Asteroid defense is another important use case for quick asteroid missions – as Isaac Arthur discusses in this video.
Credit – Isaac Arthur

Another demonstration mission is one to Eros, the second-largest near-Earth asteroid. It has an unusual shape for an asteroid – relatively elongated – which poses an interesting challenge for automated systems like those described in the paper. Controlling a spacecraft with the new scheme for a rendezvous with Eros doesn’t bring all the same advantages as it does for a more conventionally shaped asteroid like Bennu. For example, the approach has a much higher thrust requirement and fuel consumption. However, it still shortens the mission time and reduces the bandwidth required to operate it.

Autonomous systems are becoming increasingly popular on Earth and in space, and papers like this one push forward our thinking about what is possible. If all that’s required to eliminate months of painstaking manual technical work is to slap on a few sensors and implement a new control algorithm, it’s likely that one of the various agencies and companies planning to rendezvous with an asteroid in the near future will adopt the approach.

Learn More:
Negri et al. – Autonomous Rapid Exploration in Close-Proximity of an Asteroid
UT – Miniaturized Jumping Robots Could Study An Asteroid’s Gravity
UT – How to Make Asteroid Landings Safer
UT – A Spacecraft Could use Gravity to Prevent a Dangerous Asteroid Impact

Lead Image:
Artist’s conception of the Lucy mission to the Trojan asteroids.
Credit – NASA

The post If We Want to Visit More Asteroids, We Need to Let the Spacecraft Think for Themselves appeared first on Universe Today.

Categories: Astronomy

Testing a Probe that Could Drill into an Ice World

Mon, 04/08/2024 - 12:12pm

I remember reading about an audacious mission concept to drill through the surface ice of Europa, drop in a submersible and explore the depths below. Now that concept may be taking a step closer to reality, with researchers working on technology to do just that. Worlds like Europa are high on the list for exploration due to their potential to harbour life. If technology like the SLUSH probe (Search for Life Using Submersible Head) works, then we are well on the way to realising that dream.

The search for life has always captivated the mind. Think about the diversity of life on Earth and it is easy to see why we typically envisage creatures that rely upon sunlight, food and drink. But on Earth, life has found a way in the most inhospitable of environments, even at the very bottom of the ocean. The Mariana Trench is deeper than Mount Everest is tall, and anything that lives there has to cope with cold water, crushingly high pressure and no sunlight. It seems quite alien, yet even here life thrives – such as the deep-sea crustacean Hirondellea gigas. Catchy name.

Location of the Mariana Trench. Credit: Wikipedia Commons/Kmusser

Europa, one of the moons of Jupiter, has an icy crust, but that crust covers a global ocean of liquid water. The conditions deep down in Europa’s ocean might not be so very different from those at the bottom of the Mariana Trench, so it is here that a glimmer of hope exists for finding other life in the Solar System. Should it exist, getting to it is the tricky bit. And it’s not just Europa: Enceladus and even Mars may have liquid water underneath ice. Ice layers up to a kilometre thick might have to be breached, so technology like SLUSH is being developed to overcome them.

Natural color image of Europa obtained by NASA’s Juno spacecraft. (Credit: NASA/JPL-Caltech/SwRI/MSSS/Kevin M. Gill)

The technology is not entirely new; melt probes like SLUSH have been tested before. The idea is beautifully simple: the thermo-mechanical probe uses a drilling mechanism to break through the ice and then applies heat to partially melt the ice chips, forming a slush that is transported behind the probe as it descends.
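To get a rough sense of scale – and this is only a back-of-envelope illustration, not Honeybee’s actual power budget – the heat needed to turn ice cuttings into transportable slush follows from the ice’s heat capacity and latent heat of fusion. The borehole diameter, ambient temperature and melt fraction below are assumptions made purely for this sketch.

```python
import math

# Back-of-envelope heating estimate for a thermo-mechanical ice probe.
# All input values are illustrative assumptions, not SLUSH design figures.

RHO_ICE = 920.0        # kg/m^3, density of water ice
C_ICE = 2100.0         # J/(kg K), specific heat of ice
L_FUSION = 334000.0    # J/kg, latent heat of fusion

bore_diameter = 0.15   # m, assumed borehole diameter
start_temp = -170.0    # deg C, assumed ambient ice temperature (Europa-like)
melt_fraction = 0.3    # assume 30% of the cuttings are melted into slush

area = math.pi * (bore_diameter / 2) ** 2
mass_per_metre = RHO_ICE * area                             # kg of ice per metre of descent

warm_energy = mass_per_metre * C_ICE * (0.0 - start_temp)   # warm all cuttings to 0 C
melt_energy = mass_per_metre * melt_fraction * L_FUSION     # melt a fraction into slush
total_kj = (warm_energy + melt_energy) / 1000.0

print(f"ice excavated per metre: {mass_per_metre:.1f} kg")
print(f"heat needed per metre of descent: ~{total_kj:.0f} kJ")
```

Under these assumptions the probe needs several megajoules of heat per metre of descent – at a descent rate of a metre per hour, that is roughly a couple of kilowatts of continuous heating, a useful reminder of why power is one of the tightest constraints on ocean-world probes.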

The probe, which looks rather like a lightsabre, is then able to transmit data from the subsurface water back to the lander. A tether system handles the data transmission, combining conductive microfilaments with an optical fibre cable. Intriguingly, and perhaps even cunningly, should the fibre cable break (a real possibility given tidal stresses in the ice), the microfilaments will work as an antenna. The lander can then tune into them to resume data transmission. The tether is coiled up and housed inside spools that are left behind in the ice as they are emptied. I must confess my immediate thought here was ‘litter’! I accept we have to leave probes behind in order to explore, but surely we can do it without leaving litter! However, there is a reason for this too: as the spools are deployed, they act as receivers and transmitters, allowing the radio signals to travel through the ice.

The company working on the device, Honeybee Robotics, has created prototypes. The first was stand-alone, had no data transmission capability, and demonstrated the drilling and slushing technology in an ice tower in Honeybee’s walk-in freezer. While this was underway, the tether communication technology was being tested too, with the first version called the Salmon Probe. This was taken to Devon Island in the Arctic, where the unspooling method was put through its paces. The first attempts, back in 2022, saw the probe reach a depth of 1.8 m.

A further probe, called the Dolphin Probe, was designed to reach depths of about 100 m, but sea-ice limitations meant it could only get to a depth of 2 m. Thus far, all the probes have performed well. Honeybee is now working on the Narwhal Probe, which will carry more measuring equipment, a deployable tether and spool, and will be far closer to the finished product. If all goes to plan, it will profile the ice on Devon Island to a depth of 100 m. That is still well short of the kilometre-thick ice expected at Europa, but it is most definitely fantastic progress toward exploring the cold, watery depths of alien worlds.

Source: SLUSH: An Ice Drilling Probe to Access Ocean Worlds

The post Testing a Probe that Could Drill into an Ice World appeared first on Universe Today.

Categories: Astronomy

What Could We Build With Lunar Regolith?

Mon, 04/08/2024 - 10:56am

It has often been likened to talcum powder. The ultra-fine lunar surface material known as regolith is crushed rock, much of it volcanic in origin, ground down by billions of years of impacts. For visitors to the surface of the Moon it can be a health hazard, causing wear and tear on astronauts and their equipment, but it also has potential. The fine material may be suitable for building roads, landing pads and shelters. Researchers are now working to analyse its suitability for a number of different applications.

Back in the summer of 1969, Armstrong and Aldrin became the first visitors from Earth to set foot on the Moon. Now, 55 years on, their footprints are still there. The lack of weathering and the fine, powdery material have held the footprints in perfect shape since the day they were formed. Once we establish lunar bases and even holidays to the Moon – and I believe this will happen – those footprints are likely still going to be there.

There are many challenges to setting up permanent bases on the Moon, not least of which is getting all the material there. I’ve been embarking on a fairly substantial home renovation over recent years, and even getting bags of cement and blocks to the site has proved a challenge. Whilst I live in South Norfolk in the UK (which isn’t the easiest place to get to, I accept), the Moon is far harder to reach. Transporting all the necessary materials across nearly 400,000 kilometres of empty space is not going to be easy. Teams of engineers and scientists are therefore looking at what materials can be sourced on site instead of being transported from Earth.

The fine regolith has been getting a lot of attention for this very purpose, and to that end, mineralogist Steven Jacobsen from Northwestern University has been funded by NASA’s Marshall Space Flight Center to see what it can be used for. In addition, NASA has partnered with ICON Technology, a construction-technology firm, to explore lunar building techniques using resources found on the Moon. A key challenge with lunar regolith, though, is that samples can vary considerably depending on where they are collected. Jacobsen is trying to understand this variability to maximise the regolith’s construction potential.

ICON was awarded a $57.2 million contract back in November 2022 to develop lunar construction methods. Work had already begun on space-based construction through ICON’s Project Olympus. That effort doesn’t focus solely on the Moon; Mars is also part of the vision to create construction techniques that could work wherever they are employed.

Artist’s concept for a lunar base using construction robots and a form of 3D printing called contour crafting.

3D printing may well play a part in the lunar construction approach. It is already being used by ICON and others to build houses here on Earth. Employing 3D printing on the Moon with raw lunar material could be one solution.

One of the first priorities would be to establish a suitable permanent landing area on the Moon. Without one, every time a lander arrives, the fine regolith will get kicked up and disturbed and may very well play havoc with other equipment in the vicinity. The particles can be quite sharp, too, so the dust is abrasive on equipment.

Source : Examining lunar soil for moon-based construction

The post What Could We Build With Lunar Regolith? appeared first on Universe Today.

Categories: Astronomy

The World's Largest Digital Camera is Complete. It Will Go Into the Vera Rubin Observatory

Sun, 04/07/2024 - 3:43pm

The Vera C. Rubin Observatory, formerly the Large Synoptic Survey Telescope (LSST), was formally proposed in 2001 to create an astronomical facility that could conduct deep-sky surveys using the latest technology. This includes a wide-field reflecting telescope with an 8.4-meter (~27.5-foot) primary mirror that relies on a novel three-mirror design (the Simonyi Survey Telescope) and a 3,200-megapixel (3.2-gigapixel) Charge-Coupled Device (CCD) imaging camera (the LSST Camera). Once complete, Rubin will perform a 10-year survey of the southern sky known as the Legacy Survey of Space and Time (LSST).

While construction on the observatory itself did not begin until 2015, work began on the telescope’s digital cameras and primary mirror much sooner (in 2004 and 2007, respectively). After two decades of work, scientists and engineers at the Department of Energy’s (DOE) SLAC National Accelerator Laboratory and their collaborators announced the completion of the LSST Camera – the largest digital camera ever constructed. Once mounted on the Simonyi Survey Telescope, this camera will help researchers observe our Universe in unprecedented detail.

The Vera C. Rubin Observatory is jointly funded by the U.S. National Science Foundation (NSF) and the U.S. Department of Energy (DOE) and is cooperatively operated by NSF NOIRLab and SLAC. When Rubin begins its ten-year survey (scheduled for August 2025), it will help address some of the most pressing and enduring questions in astronomy and cosmology. These include understanding the nature of Dark Matter and Dark Energy, creating an inventory of the Solar System, mapping the Milky Way, and exploring the transient optical sky (i.e., objects that vary in location and brightness).

A schematic of the LSST Camera. Note the size comparison; the camera will be the size of a small SUV. Credit: Vera Rubin Observatory/DOE

The LSST Camera will assist these efforts by gathering an estimated 5,000 terabytes of new raw images and data annually. “With the completion of the unique LSST Camera at SLAC and its imminent integration with the rest of Rubin Observatory systems in Chile, we will soon start producing the greatest movie of all time and the most informative map of the night sky ever assembled,” said Željko Ivezic, an astronomy professor at the University of Washington and the Director of Rubin Observatory Construction, in a NOIRLab press release.

Measuring 1.65 x 3 meters (5.5 x 9.8 ft), with a front lens over 1.5 m (5 ft) across, the camera is about the size of a small SUV and weighs almost 2800 kg (6200 lbs). Its large-aperture, wide-field optical imaging capabilities can capture light from the near-ultraviolet (near-UV) to the near-infrared (NIR), or 0.3–1 micrometers (µm). But the camera’s greatest attribute is its ability to capture unprecedented detail over an unprecedented field of view. This will allow the Rubin Observatory to map the positions and measure the brightness of billions of stars, galaxies, and transient objects, creating a robust catalog that will fuel research for years.
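A quick back-of-envelope calculation shows where data volumes like that come from. The cadence, bit depth, and night length below are illustrative assumptions, not official survey parameters, and the estimate covers only raw science pixels.

```python
# Rough estimate of the LSST Camera's nightly raw-pixel data volume.
# Cadence, bit depth, and night length are illustrative assumptions.

pixels = 3.2e9            # 3.2-gigapixel focal plane
bytes_per_pixel = 2       # assume 16-bit raw pixels
seconds_per_visit = 34    # assume ~34 s per visit (exposure + readout + slew)
night_hours = 8           # assume an 8-hour observing night

image_gb = pixels * bytes_per_pixel / 1e9
images_per_night = night_hours * 3600 / seconds_per_visit
night_tb = images_per_night * pixels * bytes_per_pixel / 1e12

print(f"single raw image: ~{image_gb:.1f} GB")
print(f"images per night: ~{images_per_night:.0f}")
print(f"raw science pixels per night: ~{night_tb:.1f} TB")
```

A single exposure is already around 6 GB, and a night of observing produces several terabytes of raw pixels before calibration frames and processed data products are counted – which is how the annual total climbs into the thousands of terabytes.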

Said Kathy Turner, the program manager for the DOE’s Cosmic Frontier Program, these images will help astronomers unlock the secrets of the Universe:

“And those secrets are increasingly important to reveal. More than ever before, expanding our understanding of fundamental physics requires looking farther out into the Universe. With the LSST Camera at its core, Rubin Observatory will delve deeper than ever before into the cosmos and help answer some of the hardest, most important questions in physics today.”

In particular, astronomers are looking forward to using the LSST Camera to search for signs of weak gravitational lensing. This phenomenon occurs when massive galaxies alter the curvature of spacetime around them, causing light from more distant background galaxies to become redirected and amplified. This technique allows astronomers to study the distribution of mass in the Universe and how this has changed over time. This is vital to determining the presence and influence of Dark Matter, the mysterious and invisible matter that makes up 85% of the total mass in the Universe.
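The geometry behind the technique can be summarized with two standard textbook expressions (these are generic lensing relations, not anything specific to the Rubin analysis pipeline): the deflection of light passing a point mass M at impact parameter b, and the weak-lensing convergence, which compares the projected surface mass density of the lens to a critical value set by the distances to the lens, to the source, and between them.

```latex
\hat{\alpha} = \frac{4GM}{c^{2}b},
\qquad
\kappa = \frac{\Sigma}{\Sigma_{\rm cr}},
\qquad
\Sigma_{\rm cr} = \frac{c^{2}}{4\pi G}\,\frac{D_{s}}{D_{l}\,D_{ls}}
```

In the weak regime the distortions are far too small to see for any single galaxy; it is the statistical shearing of many background galaxy shapes that lets a survey like Rubin’s map the intervening mass.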

Similarly, scientists also want to study the distribution of galaxies and how it has changed over time, enabling them to identify Dark Matter clusters and supernovae, which may help improve our understanding of Dark Matter and Dark Energy alike. Within our Solar System, astronomers will use the LSST Camera to create a more thorough census of small objects, including asteroids, planetoids, and Near-Earth Objects (NEOs) that could pose a collision risk someday. It will also catalog the dozen or so interstellar objects (ISOs) thought to enter our Solar System every year.

This is an especially exciting prospect for scientists who hope to conduct rendezvous missions in the near future that will allow us to study them up close. Now that the LSST Camera is complete and has finished being tested at SLAC, it will be shipped to Cerro Pachón in Chile (where the Vera C. Rubin Observatory is being constructed) and integrated with the Simonyi Survey Telescope later this year. Said Bob Blum, Director for Operations for Vera C. Rubin Observatory:

“Rubin Observatory Operations is very excited to see this major milestone about to be completed by the construction team. Combined with the progress of coating the primary mirror, this brings us confidently and much closer to starting the Legacy Survey of Space and Time. It is happening.”

The LSST Camera was made possible thanks to the expertise and technology contributed by international partners. These include the Brookhaven National Laboratory, which built the camera’s digital sensor array; the Lawrence Livermore National Laboratory and its industrial partners, who designed and built the lenses; the National Institute of Nuclear and Particle Physics in France, which built the camera’s filter exchange system and contributed to the sensor and electronics design.

Further Reading: NOIRLab

The post The World's Largest Digital Camera is Complete. It Will Go Into the Vera Rubin Observatory appeared first on Universe Today.

Categories: Astronomy

The First Atmospheric Rainbow on an Exoplanet?

Sat, 04/06/2024 - 11:12am

When light strikes the atmosphere all sorts of interesting things can happen. Water vapor can split sunlight into a rainbow arc of colors, crepuscular rays can stream through gaps in clouds like the light from heaven, and halos and sundogs can appear as sunlight refracts through and reflects off ice crystals. And then there is the glory effect, which can create a colorful, almost saint-like halo around objects.

Like rainbows, glories are seen when facing away from the light source. They are often confused with circular rainbows because of their similarity, but glories are a distinct effect. Rainbows are caused by the refraction of light through water droplets, while glories are caused by the wave interference of light. Because of this, a glory is most apparent when the water droplets of a cloud or fog are small and uniform in size. The appearance of a glory therefore gives us information about the atmosphere. We have assumed that some distant exoplanets would experience glories similar to Earth’s, and now astronomers have found the first evidence of one.

A solar glory seen from an airplane. Credit: Brocken Inaglory

The observations come from the Characterising ExOplanet Satellite (Cheops), supported by data from other observatories, and target an exoplanet known as WASP-76b. It’s not the kind of exoplanet where you’d expect a glory to appear. WASP-76b is not a temperate Earth-like world with a humid atmosphere, but a hellish hot Jupiter with a dayside temperature of about 2,500 Kelvin. Because of this, the team wasn’t looking for extraterrestrial glories but rather studying the odd asymmetry of the planet’s atmosphere.

WASP-76b orbits its star at a tenth of the distance of Mercury from the Sun. At such a close distance the world is likely tidally locked, with one side forever boiling under its sun’s heat and the other side always in shadow. No such planet exists in our solar system, so astronomers are eager to study how this arrangement affects the atmosphere of such a world. Previous studies have shown that the atmosphere is not symmetrical: the star-facing side is puffed up by the immense heat, while the atmosphere of the dark side is denser.

For three years the team observed WASP-76b as it passed in front of and behind its star, capturing data on the transition between the light and dark sides. They found that on the planet’s eastern terminator (the boundary between the light and dark sides) there was a surprising increase in light. This extra glow could be caused by a glory effect. It will take more observations to confirm it, but if verified, it would be the first glory observed beyond our solar system. Currently, glories have only been observed on Earth and Venus.

The presence of a glory on WASP-76b would mean that spherical droplets must have been present in the atmosphere for at least three years. This means either they are stable within the atmosphere, or they are constantly replenished. One possibility is that the glory is caused by iron droplets that rain from the sky on the cooler side of the planet. Even if this particular effect is not confirmed, the ability of modern telescopes to capture this data suggests that we will soon be able to study many subtle effects of exoplanet atmospheres.

Reference: Demangeon, O. D. S., et al. “Asymmetry in the atmosphere of the ultra-hot Jupiter WASP-76 b.” Astronomy & Astrophysics 684 (2024): A27.

The post The First Atmospheric Rainbow on an Exoplanet? appeared first on Universe Today.

Categories: Astronomy

Roman Will Learn the Ages of Hundreds of Thousands of Stars

Sat, 04/06/2024 - 11:03am

Astronomers routinely provide the ages of the stars they study. But the methods of measuring ages aren’t 100% accurate. Measuring the ages of distant stars is a difficult task.

The Nancy Grace Roman Space Telescope should make some progress.

Stars like our Sun settle into their main sequence lives of fusion and change very little for billions of years. It’s like watching middle-aged adults go about their business during their working lives. They get up, drive to work, sit at a desk, then drive home.

But what can change over time is their rotation rate. The Sun now rotates about once a month. When it was first formed, it rotated more rapidly.

But over time, the rotation rates of the Sun and of stars with the same or lower mass slow down. The slowdown is caused by interactions between the star’s magnetic fields and the stellar wind, the stream of high-energy protons and electrons emitted by stars. Over time, these interactions reduce a star’s angular momentum, and its rotation slows. The phenomenon is called “magnetic braking,” and it depends on the strength of a star’s magnetic fields.

When the Sun rotates, the magnetic field lines rotate with it. The combination is almost like a solid object. Ionized material from the solar wind will be carried along the field lines and, at some point, will escape the magnetic field lines altogether. That reduces the Sun’s angular momentum. Image Credit: By Coronal_Hole_Magnetic_Field_Lines.svg: Sebman81Sun_in_X-Ray.png: NASA Goddard Laboratory for AtmospheresCelestia_sun.jpg: NikoLangderivative work: Aza (talk) – Coronal_Hole_Magnetic_Field_Lines.svgSun_in_X-Ray.pngCelestia_sun.jpg, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=8258519

The more rapidly a star initially spins, the stronger its magnetic fields, which means faster rotators slow down more quickly. After about one billion years, stars of the same age and mass will spin at roughly the same rate. Once astronomers know a star’s mass and rotation rate, they can estimate its age. Knowing stars’ ages is critical in research. It makes everything astronomers do more accurate, including piecing together the Milky Way’s history.
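A heavily simplified version of that age-from-rotation idea is the classic Skumanich scaling, in which rotation period grows roughly as the square root of age, so age scales as the square of the period. Real gyrochronology relations are calibrated against star clusters and fold in stellar mass and colour; the sketch below is only the crude scaling, anchored to the Sun, and is offered purely as an illustration.

```python
# Crude Skumanich-style age estimate: P grows roughly as sqrt(t),
# so t ~ t_sun * (P / P_sun)^2. Real gyrochronology relations also depend
# on stellar mass and colour; this is only the zeroth-order scaling.

SUN_AGE_GYR = 4.6
SUN_PERIOD_DAYS = 25.4   # approximate solar rotation period

def rough_gyro_age(period_days: float) -> float:
    """Very rough age (in Gyr) for a Sun-like star, from its rotation period."""
    return SUN_AGE_GYR * (period_days / SUN_PERIOD_DAYS) ** 2

for p in (10, 20, 25.4, 35):
    print(f"P = {p:5.1f} d  ->  age ~ {rough_gyro_age(p):4.1f} Gyr")
```

The point of the exercise is simply that a measured period translates directly into an age estimate once the scaling is calibrated – which is exactly why a huge sample of rotation periods from Roman would be so valuable.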

The problem is that measuring rotation rates is challenging. One method is to observe spots on stars’ surfaces and watch as they rotate into and out of view. All stars have star spots, though their characteristics vary quite a bit. Stars can have dozens of spots, and the spots change location over time. Therein lies the difficulty: it’s extremely hard to pin down the periodicity when dozens of spots are appearing, evolving, and shifting across a star’s surface.

This is where the Nancy Grace Roman Space Telescope (the Roman) comes in. It’s scheduled for launch in May 2027 to begin its five-year mission. It’s a wide-field infrared survey telescope with multiple science objectives. One of its main programs is the Galactic Bulge Time Domain Survey. That effort will gather detailed information on hundreds of millions of stars in the Milky Way’s galactic bulge.

This is a simulated image of what the Roman Space Telescope will see when it surveys the Milky Way’s galactic bulge. The telescope will observe hundreds of millions of stars in the region. Image Credit: Matthew Penny (Louisiana State University)

The Roman will generate an enormous amount of data. Much of it will be measurements of how the brightness of hundreds of thousands of stars changes. But untangling those measurements and figuring out what those changes in brightness mean for stellar rotation requires help from AI.

Astronomers at the University of Florida are developing AI to extract stellar rotation periods from all that data.

Zachary Claytor is a postdoc at the University of Florida and the AI project’s science principal investigator. Their AI is called a convolutional neural network. This type of AI is well-suited to analyzing images and is used in image classification and medical image analysis, among other things.

AI needs to be trained before it can do its job. In this case, Claytor and his associates wrote a computer program to generate simulated stellar light curves for the AI to process and learn from.

“This program lets the user set a number of variables, like the star’s rotation rate, the number of spots, and spot lifetimes. Then it will calculate how spots emerge, evolve, and decay as the star rotates and convert that spot evolution to a light curve – what we would measure from a distance,” explained Claytor.
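A stripped-down version of that kind of simulator – nothing like the team’s actual code, just the basic geometry of dark spots rotating into and out of view – might look something like this:

```python
import numpy as np

# Toy spot-model light curve: fixed dark spots on a rotating star dim the
# total flux whenever they face the observer. This is an illustrative sketch,
# not the simulator described in the article.

def spot_light_curve(times, period_days, spot_longitudes_deg, spot_depths):
    """Relative flux versus time for a star with fixed equatorial dark spots."""
    flux = np.ones_like(times)
    for lon, depth in zip(spot_longitudes_deg, spot_depths):
        # Rotational phase of each spot relative to the line of sight
        phase = 2 * np.pi * times / period_days + np.radians(lon)
        visible = np.cos(phase)                 # > 0 when the spot faces us
        flux -= depth * np.clip(visible, 0.0, None)
    return flux

times = np.linspace(0, 90, 2000)                # 90 days of simulated observations
flux = spot_light_curve(times, period_days=12.0,
                        spot_longitudes_deg=[0, 140, 260],
                        spot_depths=[0.010, 0.004, 0.007])
flux += np.random.default_rng(1).normal(0, 0.001, flux.size)   # photometric noise

# A periodogram (e.g. Lomb-Scargle) applied to `flux` should show strong power
# at the 12-day rotation period and its harmonics.
```

Generate enough of these curves, with spots that also emerge, drift, and decay, and you have labelled training data: the network sees only the light curve and learns to predict the rotation period that produced it.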

Claytor and his co-researchers have already tested their AI on data from NASA’s TESS, the Transiting Exoplanet Survey Satellite. The longer a star’s rotation period is, the more difficult it is to measure. But the team’s AI demonstrated that it could successfully determine these periods in TESS data.

The Roman’s Galactic Bulge Time Domain Survey is still being designed. So astronomers can use this AI-based effort to help design the survey.

“We can test which things matter and what we can pull out of the Roman data depending on different survey strategies. So when we actually get the data, we’ll already have a plan,” said Jamie Tayar, assistant professor of astronomy at the University of Florida and the program’s principal investigator.

“We have a lot of the tools already, and we think they can be adapted to Roman,” she added.

Artist’s impression of the Nancy Grace Roman Space Telescope, named after NASA’s first Chief of Astronomy. When launched later this decade, the telescope will measure the rotational periods of hundreds of thousands of stars and, with the help of AI, will determine their ages. Credits: NASA

Measuring stellar ages is difficult, yet age is a key factor in understanding any star. Astronomers use various methods to measure ages, including evolutionary models, a star’s membership in a cluster of similarly-aged stars, and even the presence of a protoplanetary disk. But no single method can measure every star’s age, and each method has its own drawbacks.

If the Roman can break through this barrier and accurately measure stellar rotation rates, astronomers should have a leg-up in understanding stellar ages. But there’s still one problem: magnetic braking.

This method relies on a solid understanding of how magnetic braking works over time. But astronomers may not understand it as thoroughly as they’d like. For instance, research from 2016 showed that magnetic braking might not slow down older stars as much as thought. That research found unexpectedly rapid rotation rates in stars more evolved than our Sun.

Somehow, astronomers will figure this all out. The Roman Space Telescope should help, as its vast trove of data is bound to lead to some unexpected conclusions. One way or another, with the help of the Roman Space Telescope, the ESA’s Gaia mission, and others, astronomers will untangle the problem of measuring everything about stars, including their ages.

The post Roman Will Learn the Ages of Hundreds of Thousands of Stars appeared first on Universe Today.

Categories: Astronomy

Webb Sees a Galaxy Awash in Star Formation

Fri, 04/05/2024 - 8:07pm

Since it began operations in July 2022, the James Webb Space Telescope (JWST) has fulfilled many scientific objectives. In addition to probing the depths of the Universe in search of galaxies that formed shortly after the Big Bang, it has also provided the clearest and most detailed images of nearby galaxies. In the process, Webb has provided new insight into the processes through which galaxies form and evolve over billions of years. This includes galaxies like Messier 82 (M82), a “starburst galaxy” located about 12 million light-years away in the constellation Ursa Major.

Also known as the “Cigar Galaxy” because of its distinctive shape, M82 is a rather compact galaxy with a very high star formation rate – roughly five times that of the Milky Way – which is why its core region is over 100 times as bright as the Milky Way’s. Combined with the gas and dust that naturally obscure visible light, this makes examining M82’s core region difficult. Using the extreme sensitivity of Webb‘s Near-Infrared Camera (NIRCam), a team led by the University of Maryland observed the central region of this starburst galaxy to examine the physical conditions that give rise to new stars.

The team was led by Alberto Bolatto, an astronomy professor at the University of Maryland and a researcher with the Joint Space-Science Institute (JSSI). He was joined by researchers from NASA’s Jet Propulsion Laboratory, NASA Ames, the European Space Agency (ESA), the Space Telescope Science Institute (STScI), the ARC Centre of Excellence for All Sky Astrophysics in 3 Dimensions (ASTRO 3D), the Max-Planck-Institut für Astronomie (MPIA), the National Radio Astronomy Observatory (NRAO), the Infrared Processing and Analysis Center (IPAC-Caltech), and multiple universities, institutes, and observatories. Their findings are described in a paper accepted for publication in The Astrophysical Journal.

Annotated image of the starburst galaxy Messier 82 captured by Hubble (left) and Webb’s NIRCam (right). Credit: NASA/ESA/CSA/STScI/Alberto Bolatto (UMD)

Their observations were part of a Cycle 1 General Observations (GO) project – for which Bolatto is the Principal Investigator (PI) – that used NIRCam data to examine the “prototypical starbursts” NGC 253 and M82 and their “cool” galactic winds. Such galaxies remain a source of fascination for astronomers because of what they can reveal about the birth of new stars in the early Universe. Starbursts are galaxies that experience rapid and efficient star formation, a phase that most galaxies went through during the early history of the Universe (ca. 10 billion years ago). Studying early galaxies in this phase is challenging due to the distances involved.

Fortunately, starburst galaxies like NGC 253 and M82 are relatively close to the Milky Way. While these galaxies have been observed before, Webb’s extreme sensitivity in the near-infrared spectrum provided the most detailed look to date. Moreover, the NIRCam observations were made using an instrument mode that prevented the galaxy’s intense brightness from overwhelming the instrument. The resulting images revealed details that have been historically obscured, such as dark brown tendrils of heavy dust that contained concentrations of iron (visible in the image as green specks).

These consist largely of supernova remnants, while small patches of red are clouds of molecular hydrogen lit up by young stars nearby. Said Rebecca Levy, second author of the study at the University of Arizona in Tucson, in a NASA press release, “This image shows the power of Webb. Every single white dot in this image is either a star or a star cluster. We can start to distinguish all of these tiny point sources, which enables us to acquire an accurate count of all the star clusters in this galaxy.”

Another key detail captured in the images is the “galactic wind” rushing out from the core, which was visible at longer infrared wavelengths. This wind is caused by the rapid rate of star formation and the subsequent supernovae, and it has a significant influence on the surrounding environment. Studying this wind was a major objective of the project (GO 1701), which aimed to investigate how these winds interact with cold and hot material. By observing the central region of M82, the team was able to examine where the wind originates and the impact it has on surrounding material.

The Cigar Galaxy (M82), a starburst galaxy with high star production. Credit: NASA, ESA, and the Hubble Heritage Team (STScI/AURA)

The team was surprised by the way Webb’s NIRCam was able to trace the structure of the galactic wind via emission spectra from very small dust grains known as polycyclic aromatic hydrocarbons (PAHs) – chemicals also produced when coal, wood, gasoline, and tobacco are burned. These emissions highlighted the galactic wind’s fine structure, which appeared as red filaments flowing from above and below the galaxy’s disk. Another surprise was the structure of the PAH emission, which was similar to that of the hot ionized gas in the wind. As Bolatto explained:

“M82 has garnered a variety of observations over the years because it can be considered as the prototypical starburst galaxy. Both NASA’s Spitzer and Hubble space telescopes have observed this target. With Webb’s size and resolution, we can look at this star-forming galaxy and see all of this beautiful, new detail. It was unexpected to see the PAH emission resemble ionized gas. PAHs are not supposed to live very long when exposed to such a strong radiation field, so perhaps they are being replenished all the time. It challenges our theories and shows us that further investigation is required.”

The team hopes to further investigate the questions these findings raise using Webb data, which will include spectroscopic observations made using the Near-Infrared Spectrograph (NIRSpec) and large-scale images of the galaxy and its wind. This data will help astronomers obtain accurate ages for the star clusters and determine how long each phase of star formation lasts in starburst galaxies. As always, this information could shed light on how similar phases took place in the early Universe, helping shape galaxies like ours. As Bolatto summarized:

“Webb’s observation of M82, a target closer to us, is a reminder that the telescope excels at studying galaxies at all distances. In addition to looking at young, high-redshift galaxies, we can look at targets closer to home to gather insight into the processes that are happening here – events that also occurred in the early universe.”

Further Reading: Webb Space Telescope, MPIA

The post Webb Sees a Galaxy Awash in Star Formation appeared first on Universe Today.

Categories: Astronomy

The Stellar Demolition Derby in the Centre of the Galaxy

Fri, 04/05/2024 - 4:27pm

The region near the Milky Way’s centre is dominated by the supermassive black hole that resides there. Sagittarius A*’s overwhelming gravity creates a chaotic region where tightly packed, high-speed stars crash into one another like cars in a demolition derby.

These collisions and glancing blows change the stars forever. Some become strange, stripped-down, low-mass stars, while others gain new life.

The Milky Way’s supermassive black hole (SMBH) is called Sagittarius A* (Sgr. A*). Sgr. A* is about four million times more massive than the Sun. With that much mass, the much smaller stars nearby are easily affected by the black hole’s powerful gravity and are accelerated to rapid velocities.

In the inner 0.1 parsec, or about one-third of a light-year, stars travel thousands of kilometres per second. Outside that region, the pace is much more sedate. Stars beyond 0.1 parsec travel at hundreds of km/s.

But it’s not only the speed that drives the collisions. The region is also tightly packed with stars, into what astronomers call a nuclear star cluster (NSC). The combination of high speed and high stellar density creates a region where stars are bound to collide.

“They whack into each other and keep going.”

Sanaea Rose, Department of Physics and Astronomy, UCLA

New research led by Northwestern University simulated stars orbiting Sgr. A* to understand the interactions and collisions and their results. It’s titled “Stellar Collisions in the Galactic Center: Massive Stars, Collision Remnants, and Missing Red Giants.” The lead author is Sanaea C. Rose from UCLA’s Department of Physics and Astronomy. The research was also recently presented at the American Physical Society’s April meeting.

The researchers simulated a population of 1,000 stars embedded in the NSC. The stars ranged from 0.5 to 100 solar masses, but in practice, the upper limit was about 30 solar masses due to the initial mass function. Other characteristics, like orbital eccentricities, were varied to ensure that the sample caught stars at different distances from Sgr. A*. That’s necessary to build a solid understanding of the stellar collisions.

“The region around the central black hole is dense with stars moving at extremely high speeds,” said lead author Rose. “It’s a bit like running through an incredibly crowded subway station in New York City during rush hour. If you aren’t colliding with other people, then you are passing very closely by them. For stars, these near collisions still cause them to interact gravitationally. We wanted to explore what these collisions and interactions mean for the stellar population and characterize their outcomes.”

“Stars, which are under the influence of a supermassive black hole in a very crowded region, are unlike anything we will ever see in our own solar neighbourhood.”

Sanaea Rose, Department of Physics and Astronomy, UCLA

The stellar density in the inner 0.1 parsecs is nothing like our Solar System’s neighbourhood. The nearest star to our Sun is the low-mass Proxima Centauri. It’s just over four light-years away. It’s like having no neighbours at all.

But in the NSC, things are way different.

The Milky Way galaxy hosts a supermassive black hole (Sgr A*, shown in the inset on the right) embedded in the Nuclear Star Cluster (NSC) at the center, highlighted and enlarged in the middle panel. The image on the right shows the stellar density in the NSC. Image Credit: Zhuo Chen

“The closest star to our sun is about four light-years away,” Rose explained. “Within that same distance near the supermassive black hole, there are more than a million stars. It’s an incredibly crowded neighbourhood. On top of that, the supermassive black hole has a really strong gravitational pull. As they orbit the black hole, stars can move at thousands of kilometres per second.”

At stellar densities that high, collisions are inevitable, and the collision rate climbs the closer stars are to the SMBH. In their research, Rose and her colleagues simulated the region to determine the collisions’ effect on individual stars and on the stellar population as a whole.
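An order-of-magnitude estimate shows why. The mean time between collisions for a star scales as 1/(n × σ × v), where n is the stellar density, σ the collision cross-section (boosted by gravitational focusing at low speeds), and v the typical relative velocity. The numbers below are rough, Sun-like illustrative values, not parameters taken from the paper.

```python
import math

# Order-of-magnitude collision time t ~ 1 / (n * sigma * v) for Sun-like stars
# near the galactic centre. Inputs are rough illustrative values, not the
# simulation parameters used in the study.

G = 6.674e-11                      # m^3 kg^-1 s^-2
M_SUN = 1.989e30                   # kg
R_SUN = 6.957e8                    # m
PC = 3.086e16                      # m
YEAR = 3.156e7                     # s

n = 1e7 / PC**3                    # assume ~10 million stars per cubic parsec
v = 1.0e6                          # assume 1,000 km/s typical relative speed

r_coll = 2 * R_SUN                 # a "collision" when centres pass within 2 R_sun
v_esc_sq = 4 * G * M_SUN / r_coll  # escape speed squared at contact (two Sun-like stars)
sigma = math.pi * r_coll**2 * (1 + v_esc_sq / v**2)   # cross-section with focusing

t_coll_gyr = 1.0 / (n * sigma * v) / YEAR / 1e9
print(f"mean time between collisions: ~{t_coll_gyr:.0f} billion years")
```

Under these assumptions the answer comes out at around ten billion years – comparable to the age of the Galaxy, which is exactly the regime where a typical star in the densest regions can expect to be hit at least once over its lifetime. Note also that at 1,000 km/s the gravitational-focusing term barely matters, which is part of why the innermost collisions are violent grazes rather than gentle mergers.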

The simulations showed that head-on collisions are rare, so stars aren’t usually destroyed outright. Instead, most encounters are glancing blows in which stars can be stripped of their outer layers before continuing along their trajectories.

“They whack into each other and keep going,” Rose said. “They just graze each other as though they are exchanging a very violent high-five. This causes the stars to eject some material and lose their outer layers. Depending on how fast they are moving and how much they overlap when they collide, they might lose quite a bit of their outer layers. These destructive collisions result in a population of strange, stripped down, low-mass stars.”

These stars end up migrating away from the SMBH. The authors say there is likely a population of these low-mass stars spread throughout the galactic centre (GC). They also say that the mass ejected in these grazing collisions could produce the gas and dust features other researchers have observed in the GC, like X7, and G objects like G3 and G2.

X7 is an elongated gas and dust structure in the galactic centre. The researchers suggest it could be made of mass stripped from stars during collisions between fast-moving stars near Sgr. A*. G3 and G2 are objects that resemble clouds of gas and dust but also have properties of stellar objects. Image Credit: Ciurlo et al. 2023.

Outside the 0.1-parsec region, the stars move more slowly, so collisions between them aren’t as energetic or destructive. Instead of creating a population of stripped-down stars, these collisions allow stars to merge, creating more massive stars. Multiple mergers are possible, producing stars considerably more massive than our Sun.

“A few stars win the collision lottery,” Rose said. “Through collisions and mergers, these stars collect more hydrogen. Although they were formed from an older population, they masquerade as rejuvenated, young-looking stars. They are like zombie stars; they eat their neighbours.”

But after they gain that mass, they hasten their own demise. They become like young, massive stars that consume their fuel quickly.

This artist’s illustration shows a massive star orbiting Sagittarius A*. Post-collision, some stars gain mass and end up shortening their lives. Image Credit: University of Cologne

“They die very quickly,” Rose said. “Massive stars are sort of like giant, gas-guzzling cars. They start with a lot of hydrogen, but they burn through it very, very fast.”

Another puzzling thing about this inner region is the lack of red giants. “Observations of the GC indicate a deficit of RGs within about 0.3 pc of the SMBH,” the authors write, referencing other research. Their results could explain it. “We consider whether main-sequence stellar collisions may help explain this observational puzzle,” they write. “We find that within ~ 0.01 pc of the SMBH, stellar collisions destroy most low-mass stars before they can evolve off the main sequence. Thus, we expect a lack of RGs in this region.”

The region around the Milky Way’s SMBH is chaotic. Even disregarding the black hole itself and its swirling accretion disk and tortured magnetic fields, the stars that dance to its tune live chaotic lives. The simulations show that most stars in the GC will experience direct collisions with other stars. But their chaotic lives could shed light on how the entire region evolved. And since the region resists astronomers’ attempts to observe it, simulations like this are their next best tool.

“It’s an environment unlike any other,” Rose said. “Stars, which are under the influence of a supermassive black hole in a very crowded region, are unlike anything we will ever see in our own solar neighbourhood. But if we can learn about these stellar populations, then we might be able to learn something new about how the galactic center was assembled. At the very least, it certainly provides a point of contrast for the neighbourhood where we live.”

Note: these results are based on a pair of published papers.

The post The Stellar Demolition Derby in the Centre of the Galaxy appeared first on Universe Today.

Categories: Astronomy

A New Map Shows the Universe’s Dark Energy May Be Evolving

Fri, 04/05/2024 - 1:19pm

At the Kitt Peak National Observatory in Arizona, an instrument with 5,000 tiny robotic eyes scans the night sky. Every 20 minutes, the instrument and the telescope it’s attached to observe a new set of 5,000 galaxies. The instrument is called DESI – the Dark Energy Spectroscopic Instrument – and once it’s completed its five-year mission, it’ll have produced the largest 3D map of the Universe ever made.

But scientists are getting access to DESI’s first data release and it suggests that dark energy may be evolving.

DESI is the most powerful multi-object survey spectrograph in the world, according to their website. It’s gathering the spectra for tens of millions of galaxies and quasars. The goal is a 3D map of the Universe that extends out to 11 billion light-years. That map will help explain how dark energy has driven the Universe’s expansion.

DESI began in 2021 and is a five-year mission. The first year of data has been released, and scientists with the project say that DESI has successfully measured the expansion of the Universe over the last 11 billion years with extreme precision.

“The DESI team has set a new standard for studies of large-scale structure in the Universe.”

Pat McCarthy, NOIRLab Director

DESI collects light from 5,000 objects at once with its 5,000 robotic eyes. It observes a new set of 5,000 objects every 20 minutes, which means it observes 100,000 objects—galaxies and quasars—each night, given the right observing conditions.

This image shows Stu Harris working on assembling the focal plane for the Dark Energy Spectroscopic Instrument (DESI) at Lawrence Berkeley National Laboratory in 2017 in Berkeley, Calif. Ten petals, each containing 500 robotic positioners that are used to gather light from targeted galaxies, form the complete focal plane. DESI is attached to the 4-meter Mayall Telescope at Kitt Peak National Observatory. Image Credit: DESI/NSF NOIRlab

DESI’s data creates a map of the large-scale structure of the Universe. The map will help scientists unravel the history of the Universe’s expansion and the role dark energy plays. We don’t know what dark energy is, but we know some force is causing the Universe’s expansion to accelerate.

“The DESI instrument has transformed the Mayall Telescope into the world’s premier cosmic cartography machine,” said Pat McCarthy, Director of NOIRLab, the organization behind DESI. “The DESI team has set a new standard for studies of large-scale structure in the Universe. These first-year data are only the beginning of DESI’s quest to unravel the expansion history of the Universe, and they hint at the extraordinary science to come.”

DESI measures dark energy by relying on baryonic acoustic oscillations (BAO). Baryonic matter is “normal” matter: atoms and everything made of atoms. The acoustic oscillations are density fluctuations in normal matter that date back to the Universe’s beginnings. BAO are the imprint of those fluctuations, or pressure waves, that moved through the Universe when it was all hot, dense plasma.

As the Universe cooled and expanded, the density waves froze their ripples in place, and where the density was high, galaxies eventually formed. The ripple pattern of the BAO is visible in DESI’s lead image, which shows strands of galaxies, or galaxy filaments, clustered together and separated by voids where the density is much lower.

The deeper DESI looks, the fainter the galaxies are. They don’t provide enough light to detect the BAO. That’s where quasars come in. Quasars are extremely bright galaxy cores, and the light from distant quasars creates a shadow of the BAO pattern. As the light travels through space, it interacts with and gets absorbed by clouds of matter. That lets astronomers map dense pockets of matter, but it took over 450,000 quasars. That’s the most quasars ever observed in a survey like this.

Because the BAO pattern is gathered in such detail and across such vast distances, it can act as a cosmic ruler. By combining the measurements of nearby galaxies and distant quasars, astronomers can measure the ripples across different periods of the Universe’s history. That allows them to see how dark energy has stretched the scale over time.
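The “cosmic ruler” idea can be sketched numerically: given a cosmology, the comoving distance to a redshift follows from integrating the Friedmann equation, and a feature of fixed comoving size – roughly 150 megaparsecs for the BAO scale – then subtends a predictable angle on the sky. The simple flat, constant-w model and the parameter values below are illustrative only, not DESI’s actual analysis.

```python
import numpy as np

# Sketch of the BAO "standard ruler": in a flat universe with matter plus a
# dark-energy component of constant equation of state w, compute the comoving
# distance to redshift z and the angle subtended by the ~150 Mpc BAO scale.
# Parameter values are illustrative, not DESI's fitted cosmology.

C_KM_S = 299792.458       # speed of light, km/s
H0 = 70.0                 # assumed Hubble constant, km/s/Mpc
OMEGA_M = 0.3             # assumed matter density parameter
R_BAO_MPC = 150.0         # approximate comoving BAO scale

def comoving_distance(z, w):
    """Comoving distance (Mpc) in a flat matter + constant-w dark energy model."""
    zs = np.linspace(0.0, z, 10_000)
    e_z = np.sqrt(OMEGA_M * (1 + zs) ** 3
                  + (1 - OMEGA_M) * (1 + zs) ** (3 * (1 + w)))
    f = 1.0 / e_z
    integral = np.sum((f[1:] + f[:-1]) / 2) * (zs[1] - zs[0])   # trapezoid rule
    return C_KM_S / H0 * integral

for w in (-0.8, -1.0, -1.2):
    d = comoving_distance(2.0, w)
    theta_deg = np.degrees(R_BAO_MPC / d)
    print(f"w = {w:+.1f}: distance to z=2 ~ {d:5.0f} Mpc, BAO angle ~ {theta_deg:.2f} deg")
```

Shifting the dark-energy equation of state by even a modest amount changes the predicted BAO angle by a few percent, and it is exactly that kind of percent-level shift, measured across many redshifts, that lets DESI test whether dark energy behaves like a simple cosmological constant.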

It’s all aimed at understanding the expansion of the Universe.

In its earliest epoch – roughly the first 50,000 years – the Universe was dominated by radiation; the Cosmic Microwave Background is a relic of that hot, early era. For billions of years afterward, matter dominated the Universe. It was still expanding, but the expansion was slowing because of the gravitational pull of matter. Over the last several billion years, though, the expansion has accelerated again, and we give the name dark energy to whatever is driving that acceleration.

So far, DESI’s data supports cosmologists’ best model of the Universe. But there are some twists.

“We’re incredibly proud of the data, which have produced world-leading cosmology results,” said DESI director and LBNL scientist Michael Levi. “So far, we’re seeing basic agreement with our best model of the Universe, but we’re also seeing some potentially interesting differences that could indicate dark energy is evolving with time.”

Levi is referring to Lambda Cold Dark Matter (Lambda CDM), also known as the standard model of Big Bang cosmology. Lambda CDM includes cold dark matter – a weakly interacting form of matter – and dark energy (the Lambda). Both shape how the Universe expands, but in opposite ways: dark energy accelerates the expansion, while regular matter and dark matter slow it down. The Universe evolves based on the contributions from all three. Lambda CDM does a good job of describing what other experiments and observations find, and it assumes that dark energy is constant and spread evenly throughout the Universe.
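When cosmologists talk about dark energy “evolving with time,” they usually mean letting its equation-of-state parameter drift with the scale factor a instead of staying fixed at the cosmological-constant value. The most common convention in the field (a general parametrization, not something specific to this data release) is:

```latex
w(a) = w_0 + w_a\,(1 - a), \qquad a = \frac{1}{1+z}

H^{2}(z) = H_0^{2}\left[\Omega_m (1+z)^{3}
  + \Omega_{\rm DE}\,(1+z)^{3(1+w_0+w_a)}\,e^{-3 w_a z/(1+z)}\right]
```

A cosmological constant corresponds to w0 = −1 and wa = 0; a statistically significant preference for anything else is what “dark energy evolving with time” would mean in practice.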

This data is just the first release, so confirmation of dark energy evolution must wait. By the time DESI has completed its five-year run, it will have mapped over three million quasars and 37 million galaxies. That massive trove of data should help scientists understand if dark energy is changing.

Whatever the eventual answer, the question is vital to understanding the Universe.

“This project is addressing some of the biggest questions in astronomy, like the nature of the mysterious dark energy that drives the expansion of the Universe,” says Chris Davis, NSF program director for NOIRLab. “The exceptional and continuing results yielded by the NSF Mayall telescope with DOE DESI will undoubtedly drive cosmology research for many years to come.”

DESI isn’t the only effort to understand dark energy. The ESA’s Euclid spacecraft is already taking its own measurements to help cosmologists answer their dark energy questions.

In a few years, DESI will have some more powerful allies in the quest to understand dark energy. The Vera Rubin Observatory and Nancy Grace Roman Space Telescope will both contribute to our understanding of the elusive dark energy. They’ll perform surveys of their own, and by combining data from all three, cosmologists are poised to generate some long-sought answers.

But for now, scientists are celebrating DESI’s first data release.

“We are delighted to see cosmology results from DESI’s first year of operations,” said Gina Rameika, associate director for High Energy Physics at the Department of Energy. “DESI continues to amaze us with its stellar performance and how it is shaping our understanding of dark energy in the Universe.”

The post A New Map Shows the Universe’s Dark Energy May Be Evolving appeared first on Universe Today.

Categories: Astronomy

Why is it so hard to drill off Earth?

Fri, 04/05/2024 - 1:05pm

Humans have been digging underground for millennia – on Earth. It’s where we extract some of our most valuable resources, the ones that have moved society forward. For example, there wouldn’t have been a Bronze Age without tin and copper, both of which are primarily found underground. But when digging into other celestial bodies, we’ve had a much rougher time. That will have to change if we ever hope to utilize the potential resources available beneath their surfaces. A paper from Dariusz Knez and Mitra Khalilidermani of the University of Krakow looks at why it’s so hard to drill in space – and what we might do about it.

In the paper, the authors detail two major categories of difficulties when drilling off-world – environmental challenges and technological challenges. Let’s dive into the environmental challenges first.

One obvious difference between Earth and most other rocky bodies we might want to drill into is the lack of an atmosphere. There are exceptions, such as Venus and Titan, but even Mars has an atmosphere so thin that it can’t support one of the fundamental materials used for drilling here on Earth – fluids.

The ocean on Europa is a common destination for proposed exploration missions that would require some drilling. Fraser explores how we would do it.

If you’ve ever tried drilling a hole in metal, you’ve probably used some cooling fluid. If you haven’t, there’s a good chance either your drill bit or your workpiece will heat up and deform to the point where you can no longer drill. To alleviate that problem, most machinists simply spray some lubricant into the drill hole and keep pressing through. A larger-scale version of this happens when construction companies drill into the ground, especially into bedrock – they use liquids to cool the spots where they’re drilling.

That isn’t possible on a celestial body with no atmosphere – at least not with traditional drilling technologies. Any liquid exposed to vacuum would immediately boil or sublimate away, providing little to no cooling effect in the work area. And given that many drilling operations occur autonomously, the drill itself – typically attached to a rover or lander – has to know when to back off before its bits overheat. That’s an added layer of complexity, and not one that many designs have solved yet.

A similar fluid problem has limited the adoption of a technology that is ubiquitous in drilling on Earth – hydraulics. Extreme temperature swings, such as those the Moon experiences over its day/night cycle, make it extremely difficult to find a hydraulic fluid that doesn’t freeze during the frigid nights or boil off during the scorching days. As a result, the hydraulic systems found in almost every large drilling rig on Earth are of very limited use in space.

Here’s a detailed look at a drill used on Mars by Smarter Every Day.
Credit – Smarter Every Day YouTube Channel

Other problems, like abrasive or clingy regolith, can also crop up, as can the lack of a magnetic field for orienting the drill. Ultimately, these environmental challenges can be overcome the same way humans always overcome them, no matter what planetary body they’re on – with technology.

There are plenty of technological challenges for drilling off-world as well. The most obvious is the weight constraint, a crucial consideration for doing anything in space. Large drilling rigs use heavy materials, such as steel casings, to support the boreholes they drill, but launching those would be prohibitively expensive with current launch technologies.

Additionally, the size of the drilling system itself limits the force of the drill – as the paper states, “the maximum force transmitted to the bit cannot exceed the weight of the whole drilling system.” The problem is exacerbated by the fact that typical rover drills are cantilevered out on a robotic arm rather than placed directly beneath the rover, where the maximum amount of weight could be applied. This force limitation also restricts the type of material the drill can get through – it will be hard-pressed to drill through any significant boulder, for example. While redesigning rovers with drill placement in mind could help, the launch weight limitation again comes into play.
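The scale of that constraint is easy to see with rough numbers: for a given vehicle mass, the maximum weight-on-bit scales directly with local gravity, so identical hardware can press down far less on the Moon or an asteroid than it would on Earth. The 900 kg rover mass below is an illustrative, roughly Curiosity-class assumption, not a figure from the paper.

```python
# Maximum weight-on-bit available to a drill that can only push with the weight
# of the vehicle carrying it. The 900 kg rover mass is an illustrative figure,
# not a value taken from the paper.

GRAVITY_M_S2 = {           # approximate surface gravity
    "Earth": 9.81,
    "Mars": 3.71,
    "Moon": 1.62,
    "Bennu (asteroid)": 8e-5,   # rough estimate for a ~250 m rubble pile
}

rover_mass_kg = 900.0

for body, g in GRAVITY_M_S2.items():
    max_force_n = rover_mass_kg * g
    print(f"{body:18s} max weight-on-bit ~ {max_force_n:9.2f} N")
```

On a small asteroid the available preload is a fraction of a newton, which is why concepts for drilling into asteroids and comets tend to rely on anchoring the spacecraft rather than on weight at all.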

Curiosity has a unique drilling technique, as described in this JPL video.
Credit – NASA JPL YouTube Channel

Another technological problem is the lack of power. Hydrocarbon-fueled engines power most large drilling rigs on Earth. That isn't feasible off Earth, so the system must be powered by solar cells and the batteries they charge. These power systems also suffer from the tyranny of the rocket equation, so they are typically quite limited in size, making it hard for drilling systems to take advantage of the benefits all-electric systems can have over hydrocarbon-powered ones, such as higher torque.
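To put rough numbers on that, here is a simple solar energy budget; every figure (array size, efficiency, daylight hours, drill draw) is an assumption chosen for illustration, not a value from the paper or any specific mission:

    # Rough daily energy budget for a solar-powered drill on Mars; all inputs assumed.
    SOLAR_FLUX_MARS = 590.0    # W/m^2 near Mars, before atmosphere and dust
    ARRAY_AREA_M2 = 3.0        # modest rover-sized array (assumption)
    ARRAY_EFFICIENCY = 0.25    # cell efficiency times dust/pointing losses (assumption)
    DAYLIGHT_HOURS = 6.0       # usable generation per sol (assumption)
    DRILL_POWER_W = 400.0      # electrical draw while cutting (assumption)

    energy_in_wh = SOLAR_FLUX_MARS * ARRAY_AREA_M2 * ARRAY_EFFICIENCY * DAYLIGHT_HOURS
    drill_minutes = energy_in_wh / DRILL_POWER_W * 60.0

    print(f"~{energy_in_wh:.0f} Wh generated per sol")          # ~2650 Wh
    print(f"~{drill_minutes:.0f} minutes of drilling per sol")  # ~400 min at most
    # ...before the rover's housekeeping, thermal control and comms take their share.

Even under these generous assumptions, only a few hours of cutting per sol are available, and a real rover has to split that energy with everything else on board.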

No matter the difficulties these drilling systems face, they will be vital to the success of any future exploration program, including crewed ones. If we ever want to build lava tube cities on the Moon or get through Enceladus's ice sheet to the ocean within, we will need better drilling technologies and techniques. Luckily, there are plenty of design efforts underway to come up with them.

The paper details four different categories of drill designs:

  1. Surface drills – less than 10 cm depth
  2. Shallow-depth drills – less than 1 m depth
  3. Medium-depth drills – between 1 m and 10 m depth
  4. Large-depth drills – greater than 10 m depth

For each category, the paper lists several designs at various stages of completeness. Many of them take novel approaches to drilling, such as "inchworm" locomotion or ultrasonic drilling.

CNET describes another Martian mission that used a drill – InSight.
Credit – CNET YouTube Channel

But for now, drilling off-world, especially on asteroids and comets with their own gravitational challenges, remains a difficult but necessary task. As humanity gains experience, we will undoubtedly get better at it. Given how important this process is to the grand plans of space explorers everywhere, the time when we can drill effectively into any rocky or icy body in the Solar System can't come soon enough.

Learn More:
Knez & Khalilidermani – A Review of Different Aspects of Off-Earth Drilling
UT – Drill, Baby, Drill! – How Does Curiosity ‘Do It’
UT – Cylindrical Autonomous Drilling Bot Could Reach Buried Martian Water
UT – Perseverance Drills Another Hole, and This Time the Sample is Intact

Lead Image:
Curiosity’s arm with its drill extended.
Credit – NASA/JPL/Ken Kremer/kenkremer.com/Marco Di Lorenzo

The post Why is it so hard to drill off Earth? appeared first on Universe Today.

Categories: Astronomy

Want to Start a Farm on Mars? This Rover Will Find Out if it’s Possible

Thu, 04/04/2024 - 8:22pm

Travelling to Mars has its own challenges; the distance alone makes the journey a mission in itself. Arrive, though, and the hard work has only just begun. Living and surviving on Mars will perhaps be humanity's biggest challenge yet. It would be impossible to take along everything needed to survive, so it will be imperative to "live off the land" and produce as much locally as possible. A new rover concept called AgroMars would be equipped with a number of agriculture-related experiments to study the makeup of the soil and assess its suitability for growing food.

Growing food on Mars poses a number of challenges, chiefly because of the harsh environmental conditions: low atmospheric pressure, temperature extremes and high radiation levels, among others. To address these, new techniques have been developed in the fields of hydroponics and aeroponics, which rely on nutrient-rich solutions instead of soil.

Special structures are built, analogous to greenhouses on Earth, with artificial lighting and temperature and humidity control. Genetic engineering has also played a part, producing plants that are hardier and more capable of surviving harsh Martian conditions. As we continue to explore the Solar System, and Mars in particular, we are going to have to find ways to grow food in alien environments.

The space station’s Veggie Facility, tended here by NASA astronaut Scott Tingle, during the VEG-03 plant growth investigation, which cultivated Extra Dwarf Pak Choi, Red Russian Kale, Wasabi mustard, and Red Lettuce and harvested on-orbit samples for testing back on Earth. Credits: NASA

Enter AgroMars, a space mission concept that would send a rover to Mars to explore the possibility of establishing agriculture there. The rover would have capabilities similar to those of Perseverance or Curiosity and would be launched to Mars on a SpaceX Falcon 9, but that is still some years off; the development phase has yet to start. The mission has been shaped in a paper by lead author M. Duarte dos Santos, though reality remains a little way off.

On arrival, AgroMars would use an X-ray and infrared spectrometer, high-resolution cameras, pH sensors, mass spectrometers and drilling tools to collect and analyse soil samples. The samples would be assessed for mineralogical composition, soil texture, pH, the presence of organic compounds and water-retention capacity.

To assess the Martian soil, the rover must possess capabilities for collecting and analysing samples that go beyond those of previous rovers. The data would then be sent to laboratories on Earth, whose responsibility it is to interpret the information. The multitude of groups involved is a wonderful reminder of how science transcends geographical borders; working together will yield far better results and help advance our knowledge of astrobiology and agriculture on Mars.

‘Calypso’ Panorama of Spirit’s View from ‘Troy’. This full-circle view from the panoramic camera (Pancam) on NASA’s Mars Exploration Rover Spirit shows the terrain surrounding the location called “Troy,” where Spirit became embedded in soft soil during the spring of 2009. The hundreds of images combined into this view were taken beginning on the 1,906th Martian day (or sol) of Spirit’s mission on Mars (May 14, 2009) and ending on Sol 1943 (June 20, 2009). Credit: NASA/JPL-Caltech/Cornell University

This doesn't come cheap, though. The total cost of the mission is estimated at around $2.7 billion: $2.2 billion for the development and launch of the rover and $500 million for its operations over the entirety of the mission. Whether it, pardon the pun, gets off the ground remains to be seen, but if we are to explore and even establish a permanent base on Mars, we will have to gain a better understanding of the environment to feed and sustain future explorers.

Source: AgroMars, Space Mission Concept Study To Explore Martian Soil And Atmosphere To Search For Possibility Of Agriculture on Mars.

The post Want to Start a Farm on Mars? This Rover Will Find Out if it’s Possible appeared first on Universe Today.

Categories: Astronomy

Which Animal Has Seen the Most Total Solar Eclipses?

Thu, 04/04/2024 - 7:21pm

In a paper published on the 1st of April, author Mark Popinchalk reported on a fascinating piece of research focusing on which animal has seen the most total solar eclipses! It turns out that, while we humans have seen our fair share, we are nowhere near the top of the list. According to Popinchalk, horseshoe crabs have racked up a staggering 138 trillion solar eclipse viewings across the entire species. We are hot on their heels, but it will be about 10 million years before we catch up!

On Monday, many parts of the globe will be treated to another total solar eclipse. An eclipse is the result of a precise Sun, Moon and Earth alignment in which the Moon blocks sunlight from reaching parts of the Earth. Where the Moon sits directly between the two and the Sun is completely blocked, observers see a total solar eclipse; where only part of the Sun is blocked, they see a partial eclipse. As Monday's eclipse progresses, hundreds of millions of people will witness the event unfold.

Totality and the ‘diamond ring effect,’ captured during the 2023 total solar eclipse as seen from Ah Chong Island, Australia. Credit: Eliot Herman

It goes without saying that eclipses are not human constructs, nor are they purely the domain of human beings. Eclipses have occurred for billions of years, since long before humans appeared on Earth, which means animals witnessed them for hundreds of millions of years before we were the proverbial twinkle in Mother Earth's eye.

Across the eons in which eclipses have taken place, countless creatures have been walking, flying and swimming around. Popinchalk suggests even microbial life should be considered, though it is impossible to say much about what microbes experience. A wide range of animals evolved during the Cambrian period. The challenge, however, is deciding whether an animal is actually aware of an eclipse, much less whether it "observes" one. There are anecdotal reports of birds going to roost in the lowered light, but quantifying such behaviour is difficult.

Recent studies have examined how animals at zoos in metropolitan areas react during total solar eclipses. Hartstone-Rose and team tracked the responses of 17 families of animals during the 2017 eclipse and found that 13 of them behaved differently than usual, with 8 performing their night-time routines. Others, such as primates, exhibited anxiety-based behaviours, much like our early ancestors did.

Hartstone-Rose et al observed Galapagos tortoises turning to look toward the sky during an eclipse. Were they perhaps observing and contemplating the event? Studies from Lofting and Dolittle (1920) have explored animal communication, but until we can unlock its mysteries we may never know. We cannot, however, hide from the fact that animals may well have seen eclipses; the debate is whether they really cottoned on to what was happening.

In the conclusion, Popinchalk shows that an estimated standing population of 120 million horseshoe crabs, having witnessed 1.5 million eclipses, adds up to some 130 trillion total solar eclipse experiences. As for humans, a standing (average) population of 1 million across 320,000 eclipses gives a mere 32 billion experiences, so we are lagging behind. The paper is a fascinating read, so give it a try, but do remember it was published on the 1st of April; the numbers may have changed since then! It's worthy of a winky emoji at this point.

Categories: Astronomy