Universe Today
The New Mars Landing Approach: How We’ll Land Large Payloads on the Red Planet
Back in 2007, I talked with Rob Manning, engineer extraordinaire at the Jet Propulsion Laboratory, and he told me something shocking. Even though he had successfully led the entry, descent, and landing (EDL) teams for three Mars rover missions, he said the prospect of landing a human mission on the Red Planet might be impossible.
But now, after nearly 20 years of work and research — as well as more successful Mars rover landings — Manning says the outlook has vastly improved.
“We’ve made huge progress since 2007,” Manning told me when we chatted a few weeks ago in 2024. “It’s interesting how it’s evolved, but the fundamental challenges we had in 2007 haven’t gone away, they’ve just morphed.”
Image of the Martian atmosphere and surface obtained by the Viking 1 orbiter in June 1976. Credit: NASA/Viking 1
The problems arise from the combination of Mars’ ultra-thin atmosphere — more than 100 times thinner than Earth’s — and the ultra-large size of spacecraft needed for human missions, likely between 20 and 100 metric tons.
“Many people immediately conclude that landing humans on Mars should be easy,” Manning said back in 2007, “since we’ve landed successfully on the Moon and we routinely land human-carrying vehicles from space to Earth. And since Mars falls between the Earth and the Moon in size and in the amount of atmosphere, then the middle ground of Mars should be easy.”
But Mars’ atmosphere provides challenges not found on Earth or the Moon. A large, heavy spacecraft streaking through Mars’ thin, volatile atmosphere has only a few minutes to slow from incoming interplanetary speeds (for example, the Perseverance rover was traveling 12,100 mph [19,500 kph] when it reached Mars) to under Mach 1, and must then quickly transition to a lander configuration to touch down gently.
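To get a feel for the squeeze, here is a rough back-of-envelope sketch in Python of the average deceleration involved; the four-minute braking window is an illustrative assumption, not mission telemetry:

```python
# A rough back-of-envelope estimate, not mission telemetry: the average
# deceleration needed to drop from entry speed to about Mach 1 during
# the few minutes of useful atmosphere.
v_entry = 19_500 / 3.6   # m/s (19,500 kph, Perseverance's entry speed)
v_mach1 = 240.0          # m/s, roughly Mach 1 near the Martian surface
t_flight = 4 * 60        # s, an assumed ~4 minutes of atmospheric braking

a_avg = (v_entry - v_mach1) / t_flight
print(f"Average deceleration: {a_avg:.0f} m/s^2 (~{a_avg / 9.81:.1f} g)")
# -> roughly 22 m/s^2, a sustained ~2 g, with peak loads running far higher
```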
Universe Today publisher Fraser Cain’s video about the challenges of landing on Mars, with more details in this article.
In 2007, the prevailing notion among EDL engineers was that while there’s too little atmosphere to land like we do on Earth, there is actually too much atmosphere on Mars to land heavy vehicles like we do on the Moon, using propulsive technology alone.
“We call it the Supersonic Transition Problem,” said Manning, again in 2007. “Unique to Mars, there is a velocity-altitude gap below Mach 5. The gap is between the delivery capability of large entry systems at Mars and the capability of super- and subsonic decelerator technologies to get below the speed of sound.”
The largest payload to land on Mars so far is the Perseverance rover, which has a mass of about 1 metric ton. Successfully landing Perseverance and its predecessor Curiosity required a complicated, Rube Goldberg-like series of maneuvers and devices such as the Sky Crane. Larger, human-rated vehicles will be coming in even faster and heavier, making them incredibly difficult to slow down.
Rob Manning, Chief Engineer for NASA’s Jet Propulsion Laboratory, and the Sky Crane for landing rovers on Mars. Credit: NASA/JPL-Caltech/Keck Institute
“So, how do you slow down to subsonic speeds,” Manning said now in 2024 as the chief engineer at JPL, “to get to speeds where traditionally we know how to fire our engines to enable touchdown? We thought bigger parachutes or supersonic decelerators like LOFTID (the Low-Earth Orbit Flight Test of an Inflatable Decelerator, tested by NASA) would allow us to maybe slow down better, but there were still issues with both those devices.”
“But there was one trick we didn’t know anything about,” Manning continued. “How about using your propulsion system and firing the engines backwards —retro propulsion — while you are flying at supersonic speeds to shed velocity? Back in 2007, we didn’t know the answer to that. We didn’t even think it was possible.”
Why not? What could go wrong?
“When you fire engines backwards as you are moving through an atmosphere, there’s a shock front that forms and it would be moving around,” Manning explained, “so it could come along and whack the vehicle and cause it to go unstable or cause damage. You’re also flying right into the plume of the rocket engine exhaust, so there could be extra friction and heating possibilities on the vehicle.”
All of this is very hard to model and there was virtually no experience doing it, as in 2007, no one had ever used propulsive technology alone to slow and then land a spacecraft back on Earth. This is mostly because our planet’s beautiful, luxuriously thick atmosphere slows a spacecraft down easily, especially with a parachute or creative flying as the space shuttle did.
“People did study it a bit, and we came to the conclusion it would be great to try it and find out whether we could fire engines backwards and see what happens,” Manning mused, adding that there wasn’t any extra funding lying around to launch a rocket just to watch it come down again.
A SpaceX Falcon-9 rocket poised to launch Dragon from Cape Canaveral. Credit: NASA
But then, SpaceX started doing tests in an attempt to land their Falcon 9 first stage boosters back on Earth to reuse them.
“SpaceX said they were going to try it,” Manning said, “And to do that they needed to slow the booster down in the supersonic phase while in Earth’s upper atmosphere. So, there’s a portion of the flight where they fire their engines backwards at supersonic speeds through a rarefied atmosphere, which is very much like conditions at Mars.”
As you can imagine, this was incredibly intriguing to EDL engineers thinking about future Mars missions.
After a few years of trial, error, and failures, on September 29, 2013, SpaceX performed the first supersonic retropropulsion (SRP) maneuver to decelerate the reentry of the first stage of their Falcon 9 rocket. While it ultimately hit the ocean and was destroyed, the SRP actually worked to slow down the booster.
NASA asked if their EDL engineers could watch and study SpaceX’s data, and SpaceX readily agreed. Beginning in 2014, NASA and SpaceX formed a three-year public-private partnership centered on SRP data analysis called the NASA Propulsive Descent Technology (PDT) project. The F9 boosters were outfitted with special instruments to collect data specifically on portions of the entry burn which fell within the range of Mach numbers and dynamic pressures expected at Mars. Additionally, there were visual and infrared imagery campaigns, flight reconstruction, and fluid dynamics analysis – all of which helped both NASA and SpaceX.
To everyone’s surprise and delight, it worked. On December 21, 2015, an F9 first stage returned and successfully landed on Landing Zone 1 at Cape Canaveral, the first-ever orbital-class rocket landing. This was a game-changing demonstration of SRP, which advanced the knowledge and tested the technology of using SRP on Mars.
View of SpaceX Falcon 9 first stage approaching Landing Zone 1 on Dec. 21, 2015. Credit: SpaceX
“Based on the analyses completed, the remaining SRP challenge is characterized as one of prudent flight systems engineering dependent on maturation of specific Mars flight systems, not technology advancement,” wrote an EDL team, detailing the results of the PDT project in a paper. In short, SpaceX’s success meant it wouldn’t require any fancy new technology or breaking the laws of physics to land large payloads on Mars.
“It turns out, we learned some new physics,” Manning said. They found that the shock front ‘bubble’ created around the vehicle by firing the engines somehow insulates the spacecraft from any buffeting, as well as from some of the heating.
EDL engineers now believe that SRP is the only Mars entry, descent and landing technology that is intrinsically scalable across a wide range of mission sizes, shedding enough velocity during atmospheric flight to enable safe landings. Alongside aerobraking, this is one of the leading means of landing heavy equipment, habitats and even humans on Mars.
But still, numerous issues remain unsolved when it comes to landing a human mission on Mars. Manning mentioned multiple unknowns: How would a big ship such as SpaceX’s Starship be steered and flown through Mars’ atmosphere? Can fins be used hypersonically, or will the plasma thermal environment melt them? The amount of debris kicked up by large engines on a human-sized ship could be fatal, especially for the engines you’d like to reuse for returning to orbit or to Earth, so how do you protect the engines and the ship? Mars can be quite windy, so what happens if you encounter wind shear or a dust storm during landing? What kind of landing legs will work for a large ship on Mars’ rocky surface? Then there are logistics problems: How will all the infrastructure get established? How will ships be refueled to return home?
“This is all going to take a lot of time, more time than people realize,” Manning said. “One of the downsides of going to Mars is that it is hard to do trial and error unless you are very patient. The next time you can try again is 26 months later because of the timing of the launch windows between our two planets. Holy buckets, what a pain that is going to be! But I think we’re going to learn a lot whenever we can try it for the first time.”
And at least the supersonic retropropulsion question has been answered.
“We’re basically doing what Buck Rogers told us to do back in the 1930s: fire your engines backwards while you’re going really fast.”
2007 article: The Mars Landing Approach: Getting Large Payloads to the Surface of the Red Planet
The post The New Mars Landing Approach: How We’ll Land Large Payloads on the Red Planet appeared first on Universe Today.
Three More “Galactic Monster” Ultra-Massive Galaxies Found
One of the surprise findings with the James Webb Space Telescope is the discovery of massive galaxies in the early Universe. The expectations were that only young, small, baby galaxies would exist within the first billion years after the Big Bang. But some of the newly found galaxies appear to be as large and as mature as galaxies that we see today.
Three more of these “monster” galaxies have now been found, and they have a similar mass to our own Milky Way. These galaxies are forming stars nearly twice as efficiently as galaxies that were formed later on in the Universe. Although they’re still within standard theories of cosmology, researchers say they demonstrate how much needs to be learned about the early Universe.
“Our findings are reshaping our understanding of galaxy formation in the early Universe,” said Dr. Mengyuan Xiao, lead author of the new study and postdoctoral researcher at the University of Geneva, in a press release.
The most widely accepted cosmological model is the Lambda Cold Dark Matter (LCDM) model which posits that the first galaxies in the Universe did not have enough time to become so massive and should have been more modestly sized.
The new findings, published in the journal Nature, were made using JWST’s spectroscopic capabilities at near-infrared wavelengths. These allow astronomers to systematically study galaxies in the very distant and early Universe, including these three massive and dust-obscured galaxies. The study was conducted as part of the telescope’s FRESCO program (First Reionization Epoch Spectroscopically Complete Observations), which uses JWST’s NIRCam/grism spectrograph to measure accurate distances and stellar masses of galaxies. The results may indicate that star formation in the early Universe was far more efficient than previously thought, challenging existing galaxy formation models.
The JWST NIRCAM operates over a wavelength range of 0.6 to 5 microns. Credit: NASA.
However, there has been some controversy as to whether these galaxies really are super-large and mature. In August, another study debated the earlier findings of “impossibly large” galaxies, saying that what was observed may have been the result of an optical illusion, as the presence of black holes in some of these early galaxies made them appear much brighter and larger than they actually were.
But this latest study was part of the new FRESCO program with JWST to systematically analyze a complete sample of galaxies within the first billion years of cosmic history to determine whether they are dominated by ionization from young stars (starburst galaxies) or by an active galactic nucleus (AGN), i.e., a black hole. The researchers say this new approach allows for precise distance estimates and reliable stellar mass measurements for the full galaxy sample.
“Our findings highlight the remarkable power of NIRCam/grism spectroscopy,” said Pascal Oesch, also from the University of Geneva, and principal investigator of the FRESCO program. “The instrument on board the space telescope allows us to identify and study the growth of galaxies over time, and to obtain a clearer picture of how stellar mass accumulates over the course of cosmic history.”
Images of six candidate massive galaxies, reported in February 2023, seen 500-700 million years after the Big Bang. One of the sources (bottom left) could contain as many stars as our present-day Milky Way, according to researchers, but it is 30 times more compact. Credit: NASA, ESA, CSA, I. Labbe (Swinburne University of Technology). Image processing: G. Brammer (Niels Bohr Institute’s Cosmic Dawn Center at the University of Copenhagen).
Researchers will certainly be making further observations of all these newly seen galaxies, which hopefully will help resolve any remaining questions about how massive these galaxies are and whether or not star formation was more rapid during the early Universe. The new observations of more of these large but young galaxies raise the question of whether the galaxies really are surprising monsters or optical illusions. Either way, the findings raise new questions about the formation process of stars and galaxies in the early Universe.
“There is still that sense of intrigue,” said Katherine Chworowsky, a graduate student at the University of Texas at Austin (UT), who led the study we reported on in August. “Not everything is fully understood. That’s what makes doing this kind of science fun, because it’d be a terribly boring field if one paper figured everything out, or there were no more questions to answer.”
Further reading:
University of Geneva
UC Santa Cruz
Nature
The post Three More “Galactic Monster” Ultra-Massive Galaxies Found appeared first on Universe Today.
James Webb Confirms Hubble’s Calculation of Hubble’s Constant
We have been spoiled over recent years with first the Hubble Space Telescope (HST) and then the James Webb Space Telescope (JWST). Both have opened our eyes to the Universe and made amazing discoveries. One subject that has received attention from both is the derivation of the Hubble Constant, a constant relating the velocity of remote galaxies and their distances. A recent paper announces that JWST has just validated the results of previous studies by the Hubble Space Telescope to accurately measure its value.
The Hubble Constant (H0) is a fundamental parameter in cosmology that defines the rate of expansion of the Universe. It relates the distances of remote galaxies to the velocities at which they are receding from us. It was first discussed by Edwin Hubble in 1929 as he observed the spectra of distant galaxies. It is measured in units of kilometres per second per megaparsec and shows how fast galaxies are moving away from us per unit of distance. The exact value of the constant has been the cause of many a scientific debate, and more recently the HST and JWST have been trying to fine-tune its value. Getting an accurate value is key to determining the age, size and fate of the Universe.
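As a minimal illustration of what the constant means, here is Hubble’s law, v = H0 × d, in a few lines of Python, using a round illustrative value of H0 rather than the figure measured in the study:

```python
# Illustration of Hubble's law v = H0 * d, plus the "Hubble time" 1/H0
# that sets the age scale of the Universe. H0 here is a round number,
# not the value measured in the study.
H0 = 70.0                 # km/s per megaparsec
KM_PER_MPC = 3.0857e19    # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7

d = 100.0  # Mpc, an example galaxy distance
print(f"Recession velocity at {d:.0f} Mpc: {H0 * d:.0f} km/s")

hubble_time_years = (KM_PER_MPC / H0) / SECONDS_PER_YEAR
print(f"Hubble time: {hubble_time_years / 1e9:.1f} billion years")
# -> 7000 km/s, and a Hubble time of ~14 billion years, matching the
#    age scale usually quoted for the Universe.
```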
Edwin Hubble
A paper recently published by a team of researchers led by Adam G. Riess from Johns Hopkins University validates the results from a previous HST study. The team used JWST to revisit the earlier results of the Cepheid/supernova distance ladder, which has been used to establish distances across the cosmos using Cepheid variable stars and Type Ia supernovae. Both objects can be likened to ‘standard candles’ whose intrinsic luminosity is very well understood. By comparing their apparent brightness as measured from Earth with that intrinsic luminosity, their distances can be calculated.
NASA’s James Webb Space Telescope has spotted a multiply-imaged supernova in a distant galaxy designated MRG-M0138. Image Credit: NASA, ESA, CSA, STScI, Justin Pierel (STScI) and Andrew Newman (Carnegie Institution for Science).
Over recent decades, a number of attempts have been made to accurately determine H0 using a multitude of different instruments and observations. The cosmic microwave background has been used along with the aforementioned studies of Cepheid variables and supernova events. These efforts have produced a range of values, a discrepancy that has become known as the ‘Hubble tension.’ The recent study using JWST hopes to fine-tune and validate previous work.
To determine H0 accurately using the Cepheid/supernova ladder, a sufficiently large sample of Cepheids and supernovae must be observed. This has been challenging, in particular the sample size of supernovae within the range of Cepheid variable stars. The team also explored other techniques for determining H0, for example using HST data on the luminosity of the brightest red giant branch stars in a galaxy, which can also serve as a standard candle, or the luminosity of certain carbon-rich stars, yet another standard candle.
This illustration shows three steps astronomers used to measure the universe’s expansion rate (Hubble constant) to an unprecedented accuracy, reducing the total uncertainty to 2.3 percent. The measurements streamline and strengthen the construction of the cosmic distance ladder, which is used to measure accurate distances to galaxies near to and far from Earth. The latest Hubble study extends the number of Cepheid variable stars analyzed to distances of up to 10 times farther across our galaxy than previous Hubble results. Credits: NASA, ESA, A. Feild (STScI), and A. Riess (STScI/JHU)
The team conclude that, when all JWST measurements are combined, including a correction for the small sample of supernova data, H0 comes out at 72.6 ± 2.0 km s⁻¹ Mpc⁻¹. This compares to the combined HST data, which determines H0 as 72.8 km s⁻¹ Mpc⁻¹. It will take more years and more studies for the sample size of supernovae from JWST to equal that from HST, but the cross-check so far suggests we are finally homing in on an accurate value for Hubble’s Constant.
The post James Webb Confirms Hubble’s Calculation of Hubble’s Constant appeared first on Universe Today.
What Should Light Sails Be Made Out Of?
The Breakthrough Starshot program aims to cross the immense distance to the nearest star in just decades. Its plan is to use a high-powered laser to propel reflective sails to relativistic speeds. The selection of sail material is key to success, as it must be lightweight while able to withstand the acceleration and radiation from the laser. A recent study explores various materials and proposes that core-shell structures—spherical particles composed of two different materials—could be a promising solution.
Breakthrough Starshot is an ambitious project to explore interstellar space by sending tiny, lightweight spacecraft to the nearest star system, Alpha Centauri. The project plans to use ground-based, high-powered lasers to accelerate reflective ‘light sails,’ enabling the spacecraft to achieve relativistic speeds and travel the 4.37 light-years in just a few decades. Each spacecraft will be equipped with tiny sensors and communication systems, and will collect data on exoplanets and other interstellar phenomena along the way. If successful, it could mark our first step toward exploring distant star systems and searching for extraterrestrial life.
This image of the sky around the bright star Alpha Centauri AB also shows the much fainter red dwarf star, Proxima Centauri, the closest star to the Solar System. The picture was created from pictures forming part of the Digitized Sky Survey 2. The blue halo around Alpha Centauri AB is an artifact of the photographic process; the star is really pale yellow in colour, like the Sun. Image Credit: Digitized Sky Survey 2. Acknowledgement: Davide De Martin/Mahdi Zamani
Traveling at relativistic speeds, velocities close to the speed of light, presents amazing possibilities but brings with it immense difficulties. At these speeds, time dilation (a phenomenon predicted by Einstein’s theory of relativity) causes time to pass more slowly for the traveler relative to observers on Earth, potentially allowing journeys to distant stars within a single human lifetime from the traveler’s perspective. This won’t be a factor for Starshot, however, as it plans to send only tiny spacecraft. Achieving such speeds, even for Starshot, requires overcoming immense energy demands, as the kinetic energy needed grows without bound as a craft approaches the speed of light. The environment at relativistic speeds also becomes particularly hazardous: collisions with particles at such high speeds could easily destroy spacecraft, and radiation exposure would intensify due to relativistic effects.
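For a sense of the numbers behind this paragraph, here is a short sketch of the Lorentz factor, Earth-frame trip time, and kinetic energy at Starshot’s target speed of 0.2c; the gram-scale craft mass is an assumption for illustration only:

```python
import math

# Relativistic quantities at Starshot's target of 0.2c. The 1-gram
# craft mass is an illustrative assumption, not a mission figure.
c = 2.998e8               # speed of light, m/s
v = 0.2 * c
gamma = 1 / math.sqrt(1 - (v / c) ** 2)
print(f"Lorentz factor at 0.2c: {gamma:.4f}")   # ~1.02: only mild time dilation

trip_years = 4.37 / 0.2   # light-years divided by fraction of c
print(f"Earth-frame trip time: {trip_years:.1f} years")

m = 1e-3                  # kg, a gram-scale craft
ke = (gamma - 1) * m * c ** 2
print(f"Kinetic energy: {ke:.2e} J")            # ~2e12 J for a single gram
```

Even at a fifth of light speed, time dilation is only a 2% effect; the energy cost, not relativity, is the dominant obstacle.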
This image shows the ACS3 being unfurled at NASA’s Langley Research Center. The solar wind is reliable but not very powerful, so it requires a large sail area to power a spacecraft effectively. The ACS3 is about 9 meters (30 ft) per side, requiring a strong, lightweight boom system. Image Credit: NASA
To complete the journey in a few decades, the spacecraft needs to be accelerated to an estimated 20% of the speed of light, bringing with it all the problems outlined above. The selection of the right material for the sails is key. In a paper recently published by Mitchell R. Whittam, Lukas Rebholz, Benedikt Zerulla and Carsten Rockstuhl from the Karlsruhe Institute of Technology in Germany, the team report the results of their search for the best material. In particular, they focus attention on so-called core-shell spheres.
The structures are based upon a matrix design which finds its origins in Mie Theory. This mathematical framework was developed by German physicist Gustav Mie in 1908 to describe how spherical particles scatter electromagnetic waves such as light. In their study, they explore the reflective properties and acceleration times of spheres made from aluminium, silicon, silicon dioxide and various combinations.
The results were promising, with a shell composed of a silicon and silicon dioxide combination yielding the best results. The work offers significant insight into the structure of materials for light sails. Whilst not a definitive outcome, it showed that core-shell spheres, a previously unexplored area of light sail physics, are a promising avenue for future experimental work.
The post What Should Light Sails Be Made Out Of? appeared first on Universe Today.
A Giant Meteorite Impact 3.26 Billion Years Ago Helped Push Life Forward
The Earth has always been bombarded with rocks from space, though there were more rocks flying around the Solar System during earlier periods of its history. A team of researchers has been studying a meteorite impact from 3.26 billion years ago, and they calculate the rock was 200 times bigger than the one that wiped out the dinosaurs. The event would have triggered tsunamis, mixing up the oceans and flushing debris from the land. The newly available organic material allowed organisms to thrive.
Meteorite impacts are a common event, and it’s not unusual to see these rocks from space whizzing through the atmosphere. Giant meteorite impacts have been an important part of Earth’s geological history. The impacts release colossal amounts of energy that can destroy life, ignite wildfires, trigger tsunamis and eject dust into the atmosphere. The Chicxulub impact around 66 million years ago is perhaps the best known; it wiped out the dinosaurs. The study of these interplanetary wanderers is imperative as we strive to protect ourselves from potential impactors that pose a threat to human life.
A bright meteor caught by one of the Global Fireball Network’s cameras from the Rancho Mirage Observatory (Eric McLaughlin) on April 7, 2019. Credit: NASA Meteorite Tracking and Recovery Network.
Impacts like these have had a massive effect on the development of Earth and its suitability for life. Geological studies of rocks from the Archean Eon have revealed 16 major impacts with impactors measuring at least 10 km in diameter. At the time of impact the effects can be devastating, but over time there can be benefits to life, although this is not well understood. In a paper published in Earth, Atmospheric and Planetary Sciences, the team led by Nadja Drabon from Harvard University explores rocks from an event 3.26 billion years ago.
Known as the S2 event, the impactor is believed to have been a carbonaceous chondrite between 37 and 58 km in diameter. It is thought to have exploded over South Africa, with debris landing in the ocean causing giant tsunamis. The impact mixed iron(II)-rich deep waters with the iron(II)-poor shallower waters. It would also have heated the waters, leading to partial evaporation of surface water and a temporary increase in erosion around coastal areas.
A three-dimensional cross-section of the hydrothermal system in the Chicxulub impact crater and its seafloor vents. The system has the potential for harboring microbial life. Illustration by Victor O. Leshyk for the Lunar and Planetary Institute.
Perhaps one of the most valuable effects of the impact was the injection of phosphorus into the atmosphere, with a positive impact on the Earth’s habitability. Study of the rock layers above the layer deposited by the S2 event reveals an increased amount of nutrients and iron, which helped microbial life to thrive.
The study has helped build a clearer understanding of how giant impacts can aid the development of life, though this of course depends on the impactor’s size, type and material, and on the conditions of the atmosphere before the event. The S2 event seems to have had quite a mixed effect on early life, in particular marine life. Overall, some forms of life were positively impacted while others seem to have experienced challenges. Marine life that relies upon sunlight to survive (the phototrophs) was affected by the darkness, while organisms living at lower depths were less influenced. The detrimental atmospheric effects would likely have been short-lived, lasting perhaps just a few years before conditions recovered, causing only a temporary impact on marine life. But the injection of phosphorus into the atmosphere would have had far more long-term beneficial effects for life.
Source : Effect of a giant meteorite impact on Paleoarchean surface environments and life
The post A Giant Meteorite Impact 3.26 Billion Years Ago Helped Push Life Forward appeared first on Universe Today.
America’s Particle Physics Plan Spans the Globe — and the Cosmos
RALEIGH, N.C. — Particle physicist Hitoshi Murayama admits that he used to worry about being known as the “most hated man” in his field of science. But the good news is that now he can joke about it.
Last year, the Berkeley professor chaired the Particle Physics Project Prioritization Panel, or P5, which drew up a list of multimillion-dollar physics experiments that should move ahead over the next 10 years. The list focused on phenomena ranging from subatomic smash-ups to cosmic inflation. At the same time, the panel also had to decide which projects would have to be left behind for budgetary reasons, which could have turned Murayama into the Dr. No of physics.
Although Murayama has some regrets about the projects that were put off, he’s satisfied with how the process turned out. Now he’s just hoping that the federal government will follow through on the P5’s top priorities.
Berkeley particle physicist Hitoshi Murayama speaks at the ScienceWriters 2024 conference in Raleigh, N.C. (Photo by Alan Boyle)
“There are five actually exciting projects we think we can do within the budget program,” Murayama said this week during a presentation at the ScienceWriters 2024 conference in Raleigh. Not all of the projects recommended for U.S. funding are totally new — and not all of them are based in the U.S. Here’s a quick rundown:
- Looking for dark matter: About 85% of all the matter in the universe is thought to exist in an invisible form that so far has been detectable only through its gravitational effect. For years, an experiment being conducted in a converted South Dakota gold mine has been looking for traces of dark matter’s interactions with a huge reservoir of liquid xenon. The experiment hasn’t yet found anything, but Murayama said the P5 panel supports the idea of boosting the reservoir from seven tons to on the order of 70 tons and intensifying the search.
- Following up on the Higgs boson: The discovery of the Higgs boson in 2012 provided the last missing piece in the Standard Model of particle physics, one of science’s most successful theories. But physicists don’t have a good grip on how the Higgs works. “You’d like to mass-produce this Higgs boson and study its properties in great detail, so we know how it got stuck and frozen into space, so that we can stay in one place,” Murayama said. That would require building a bigger particle collider, capable of smashing electrons and positrons — but the P5 panel determined that such a machine couldn’t be built in the U.S. Instead, the panel recommends supporting an “offshore Higgs factory” like the FCC-ee facility that CERN is considering, or the International Linear Collider that’s been proposed for construction in Japan.
- Studying the nature of neutrinos: The Big Bang is thought to have created equal amounts of matter and antimatter, which would theoretically annihilate each other. Fortunately for us, matter won out rather than being totally annihilated. How did it happen? “The only candidate elementary particle we know who might have done this is actually neutrinos,” Murayama said. “How do we know if that’s really the case? One thing we try to do is to look at the behavior of neutrinos by creating them in Illinois and shooting them to a location in South Dakota, because neutrinos can pass through the dirt without any problems.” The Deep Underground Neutrino Experiment is under construction, and excavation of the Long-Baseline Neutrino Facility was recently completed in South Dakota. The P5 report proposes upgrading DUNE’s capabilities.
- Getting a neutrino view of the cosmos: The P5 panel also called for a dramatic expansion of the IceCube Neutrino Observatory in Antarctica. “They managed to peer into the supermassive black hole in a nearby galaxy, and for the first time, they even took a picture of a galactic disk using neutrinos as well,” Murayama said. “So this is finally becoming a true tool to observe the universe in a different way from what we do with older telescopes.”
- Seeking signs of cosmic inflation: A widely held theory asserts that in the instant after the Big Bang, the universe inflated at a prodigious rate to “lock in” the slight perturbations that scientists see in the cosmic microwave background radiation. In 2014, astronomers claimed that an experiment at the South Pole had picked up evidence of that primordial cosmic inflation, but months later, they had to back away from those claims. The Antarctic studies are continuing, however, and the P5 panel supported an experiment known as CMB-S4 that would widen the search for evidence. “For that, we need two sites, one in Chile, another at the South Pole,” Murayama said.
In addition to the top five projects, the panel endorsed a longer-term effort to develop an advanced particle accelerator that would produce collisions between subatomic particles known as muons. Such a machine would increase the chances of finding new frontiers in physics in the 2030s, Murayama said.
“We call this a ‘muon shot,’ like a moonshot,” he said. “We don’t know quite well if we can really get there, but as you work toward it, that would end up producing so many interesting things on the way, more science and more technologies.”
Will the P5’s priorities prevail? That’s up to the U.S. Department of Energy and the National Science Foundation, which must decide what to do with the physicists’ recommendations. Success isn’t guaranteed: For example, NSF put the CMB-S4 experiment on hold in May to focus instead on upgrading aging infrastructure at its Antarctic facilities.
Looking ahead, it’s not yet clear how particle physics will fare when Donald Trump returns to the White House. For what it’s worth, the price tags for four of the projects add up to more than $2.5 billion over the course of several years. The cost of the offshore Higgs factory is certain to amount to billions more.
Murayama called attention to an issue that could affect IceCube, CMB-S4 and other Antarctic research in the nearer term. “There is a fleet of cargo airplanes that is owned by the U.S. Air Force that actually served us well over many decades,” he said. “But they were built back in the ’70s, and they’re about to retire, and right now there are no plans to replace them. Then we will lose access.”
Senate Majority Leader Chuck Schumer, D-N.Y., managed to get a $229 million appropriation for new planes into the Senate’s version of the defense budget bill for the current fiscal year, but the House still has to take action. That sets up a bit of a congressional cliffhanger for the weeks and months ahead.
“I don’t get a good sense of the priority,” Murayama confessed. “But this is supposed to be part of the defense budget, which is way bigger than the science budget — so in that part, it’s peanuts. Hopefully, it just can get in and get funded.”
For a critical perspective on the P5 wish list, check out physicist Sabine Hossenfelder’s YouTube video:
Alan Boyle is a volunteer board member for the Council for the Advancement of Science Writing, which was one of the organizers of the ScienceWriters 2024 conference.
The post America’s Particle Physics Plan Spans the Globe — and the Cosmos appeared first on Universe Today.
Millions of Phones Could Map the Earth’s Ionosphere
We are all familiar with the Earth’s atmosphere, and part of it, the ionosphere, is a layer of weakly ionized plasma extending from 50 to 1,500 km above the planet. It’s a diffuse layer, but dense enough to interfere with satellite communications and navigation systems. A team of researchers has come up with an intriguing idea: utilise millions of mobile phones, relying on their GPS antennas, to help map the ionosphere.
The ionosphere is a layer of the Earth’s atmosphere where radiation ionizes atoms and molecules. Incoming solar radiation is the primary cause, energising gases so they lose electrons and become electrically charged. The process creates a region of charged particles, or ions, known as plasma. The ionosphere is a key part of radio communications since its ionized particles reflect and refract radio waves back to Earth, facilitating long-distance communication. Its density and, perhaps surprisingly, its composition change as solar activity waxes and wanes.
A view of Earth’s atmosphere from space. Credit: NASA
In a paper recently published in Nature, a team of researchers at Google used data from over 40 million mobile phones to map conditions in the ionosphere. The concept of using crowdsourced signals is an intriguing one, and the study will help to improve satellite navigation and our understanding of the upper regions of our atmosphere. We still don’t have a full understanding of the properties of the ionosphere across regions like Africa and South America, so this study will fill significant gaps.
The ionosphere can slow down radio signals travelling to Earth from satellites, in particular from GPS and other navigation satellites. These navigation signals rely heavily upon timing with nanosecond precision, which gives systems the ability to pinpoint location with incredible accuracy. Having an accurate model of the ionosphere is key to that success, however.
NavCube, the product of a merger between the Goddard-developed SpaceCube 2.0 and Navigator GPS technologies, could play a vital role helping to demonstrate X-ray communications in space — a potential NASA first. Credit: NASA/W. Hrybyk
Using data from ground-based stations, engineers can create real-time maps of ionospheric density. To do this, data is received across two different frequencies from the same satellite and the arrivals are timed. Depending on the density of the ionosphere, the low-frequency waves are slowed down more than the high-frequency signals. Not taking this into account could put GPS and navigation systems out by 5 metres or more.
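To see why two frequencies are enough, here is a minimal sketch of the standard first-order relation (delay ≈ 40.3 × TEC / f², with TEC the total electron content along the signal path). The TEC value is illustrative, and the frequencies are the standard GPS L1/L5 bands, not details taken from the paper:

```python
# First-order ionospheric group delay: 40.3 * TEC / f^2 metres, where
# TEC is the total electron content along the path (electrons/m^2).
F_L1 = 1.57542e9  # Hz, GPS L1 frequency
F_L5 = 1.17645e9  # Hz, GPS L5 frequency

def delay_m(tec: float, freq_hz: float) -> float:
    return 40.3 * tec / freq_hz ** 2

tec = 5e17  # electrons/m^2 (~50 TECU), a plausible daytime value
print(f"L1 delay: {delay_m(tec, F_L1):.1f} m, L5 delay: {delay_m(tec, F_L5):.1f} m")

# Because the delay scales as 1/f^2, differencing the two measured ranges
# lets a receiver solve for TEC directly:
diff = delay_m(tec, F_L5) - delay_m(tec, F_L1)
tec_est = diff * (F_L1 ** 2 * F_L5 ** 2) / (40.3 * (F_L1 ** 2 - F_L5 ** 2))
print(f"Recovered TEC: {tec_est:.2e} electrons/m^2")
```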
Receiving multiple frequencies is within the capability of most mobile phones, and it is this capability that has been the focus of the study. There is, however, a degree of noise in the data received by mobile phones, but the team at Google found that combining the signals of large numbers of phones reduced the noise.
The study currently works only with Android phones. Anyone who allowed their sensor data to be shared could contribute to the study. The data has already revealed plasma in the ionosphere over South America that had not been seen before.
Source : Mapping the ionosphere with millions of phones
The post Millions of Phones Could Map the Earth’s Ionosphere appeared first on Universe Today.
Detecting Primordial Black Hole Mergers Might be Within Our Grasp
Imagine a black hole with the mass of the asteroid Ceres. It would be no larger than a bacterium and practically undetectable. But if such black holes are common in the Universe, they would affect the motions of stars and galaxies, just as we observe. Perhaps they are the source of dark matter.
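A quick sanity check on that claim: the Schwarzschild radius r_s = 2GM/c² for a Ceres-mass black hole really does come out at about a micron, as this short sketch shows:

```python
# Sanity check: Schwarzschild radius r_s = 2GM/c^2 for a Ceres-mass
# black hole.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
m_ceres = 9.4e20   # kg, approximate mass of Ceres

r_s = 2 * G * m_ceres / c ** 2
print(f"Schwarzschild radius: {r_s * 1e6:.1f} micrometres")
# -> ~1.4 micrometres, genuinely bacterium-sized
```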
Such tiny black holes could not form from dying stars, but they might have formed within the hot, dense cosmos soon after the Big Bang. For this reason, they are known as primordial black holes. We have no evidence they exist, but since they would be such a great explanation for dark matter, astronomers keep looking.
The one thing we know at this point is that most primordial black holes are ruled out by the data. Large, almost stellar mass black holes would affect the clustering of galaxies in a way we don’t observe. Tiny black holes of mountain mass or smaller would have evaporated long ago, making them useless as a dark matter candidate. But asteroid mass black holes are still possible. They aren’t likely, but they haven’t been formally excluded by the data. So a new study looks at how asteroid mass primordial black holes might be detected through gravitational waves.
The size and lifetime of primordial black holes by mass. Credit: NASA’s Goddard Space Flight Center
To account for dark matter, the smaller the primordial black hole, the more common they must be. For asteroid masses, the cosmos would need to contain a vast sea of them. Since they would cluster within galaxies, they would be common enough for some of them to merge on a regular basis. As the study points out, each of these mergers would produce a gravitational chirp similar to the ones we have observed with stellar-mass black holes. They would just have a much higher frequency and be more common.
The frequency of these primordial chirps would be too high for current observatories such as LIGO to observe, but the authors point out that some current dark matter experiments might be able to detect them. One alternative model for dark matter involves a hypothetical particle known as the axion. Axions were originally proposed to solve some issues in high-energy particle physics, and while they have fallen out of popularity there, they’ve gained some popularity in cosmology. We have made a few attempts to detect axions, but without success. In their paper, the authors show how axion experiments could be tweaked slightly to observe the chirps of primordial black hole mergers under ideal conditions.
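As a rough illustration of why LIGO cannot hear these chirps, the gravitational-wave frequency near merger scales inversely with the binary’s total mass (f ≈ c³ / (6^1.5 πGM) at the innermost stable circular orbit); the masses below are illustrative, not values from the paper:

```python
import math

# GW frequency near merger for an equal-mass binary of total mass M,
# evaluated at the innermost stable circular orbit. Masses illustrative.
G = 6.674e-11
c = 2.998e8
M_SUN = 1.989e30

def f_merger_hz(total_mass_kg: float) -> float:
    return c ** 3 / (6 ** 1.5 * math.pi * G * total_mass_kg)

print(f"Stellar binary (60 M_sun): {f_merger_hz(60 * M_SUN):.0f} Hz")  # ~73 Hz, LIGO band
print(f"Two Ceres-mass PBHs: {f_merger_hz(2 * 9.4e20):.1e} Hz")        # ~THz, far above LIGO
```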
The chances of success are pretty slim. It would be odd for primordial black holes to exist in the only allowed mass range and nowhere else, and the conditions we could observe would be pretty narrow. But it might be worth doing a search on the off chance. The nature of dark matter remains a huge mystery in astronomy, so we don’t have much to lose in trying the occasional long-shot idea.
Reference: Profumo, Stefano, et al. “The Maximal Gravitational Wave Signal from Asteroid-Mass Primordial Black Hole Mergers.” arXiv preprint arXiv:2410.15400 (2024).
The post Detecting Primordial Black Hole Mergers Might be Within Our Grasp appeared first on Universe Today.
What’s Behind the Martian Methane Mystery?
The seasonal variations of methane in the Martian atmosphere are an intriguing clue that there might be life hiding under the surface of the red planet. But we won’t know for sure until we go digging for it.
Hints of methane on Mars go back all the way to the Mariner missions of the 1970s. But in 2013 NASA’s Curiosity rover saw methane levels around it rise to several times greater than the background. A few months later it dwindled and disappeared, only to return again.
This Martian methane mystery poses an interesting challenge for scientists. On one hand, there are known chemical reactions that can take the molecules known to exist on Mars and turn them into methane. For example, liquid water interacting with magnesium- and iron-rich rocks like olivine can oxidize them, which can produce pockets of hydrogen. This hydrogen can then react with the carbon dioxide in the Martian atmosphere through the Fischer-Tropsch process to produce methane.
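In simplified, idealized form (the real mineralogy is messier than these balanced reactions suggest), the two-step chemistry described above looks like this:

```
Serpentinization (hydrogen source):       3 Fe2SiO4 + 2 H2O -> 2 Fe3O4 + 3 SiO2 + 2 H2
Methanation (Fischer-Tropsch-type step):  CO2 + 4 H2 -> CH4 + 2 H2O
```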
But while this scenario is relatively straightforward, the devil is in the details. In order for this process to work there must be liquid water underground. And some other mechanism needs to be able to remove the methane, or at least make this process cycle on and off every few months.
That opens up the possibility for life. We know of forms of life on Earth known as methanogens that do not get their energy from photosynthesis. Instead they essentially eat hydrogen and produce methane as a byproduct. The advantage of using life to explain the Martian methane mystery is that it can potentially naturally introduce seasonal variations. When conditions change under the Martian surface, for example through the warming summer months or cooling winter months, then the life can respond appropriately.
But while this hypothesis explains the seasonal variation, it doesn’t get around the fact that the Martian life would still need a source of water to live. Plus, we have absolutely no evidence for any life appearing on Mars, even in its distant past.
To date there is no clear consensus as to what is causing the seasonal variations of methane on Mars. The idea of life under the surface of the red planet remains a tantalizing possibility. The only way to answer this is to keep sending missions back to Mars and start digging.
The post What’s Behind the Martian Methane Mystery? appeared first on Universe Today.
Scientists Develop Technique to Create 3D Models of Cosmic Structures
For decades, astronomers have used powerful instruments to capture images of the cosmos in various wavelengths. This includes optical images, where visible light is observed, and images that capture non-visible radiation, ranging from the radio and infrared to the X-ray and Gamma-ray wavelengths. However, these two-dimensional images do not allow scientists to infer what the objects look like in three dimensions. Transforming these images into a 3D space could lead to a better understanding of the physics that drives our Universe.
In a recent study, an international team of researchers led by the Minnesota Institute for Astrophysics (MIfA) at the University of Minnesota announced the development of a new technique for radio astronomy. This first-ever technique reconstructs radio images into three-dimensional “Pseudo3D cubes” that allow astronomers to get a better idea of what cosmic structures look like. This technique could lead to an improved understanding of how galaxies, massive black holes, jet structures, and the Universe work.
The study was led by Lawrence Rudnick, a Professor Emeritus at the Minnesota Institute for Astrophysics. He was joined by colleagues from the Research School of Astronomy and Astrophysics at the Australian National University, the National Radio Astronomy Observatory (NRAO), the Institute for Radio Astronomy and Astrophysics at the National Autonomous University of Mexico, the Jodrell Bank Centre for Astrophysics at the University of Manchester, and the Kavli Institute for Particle Astrophysics and Cosmology.
Researchers used a new technique to transform 2D radio images into a 3D model to better understand phenomena in our Universe. Credit: Lawrence Rudnick/MeerKAT Radio Telescope
To develop their 3D modeling tool, the team looked at polarized radio light, which vibrates in a specific direction. The research team then factored in an effect called “Faraday rotation,” where the polarization of light rotates along the direction of propagation in proportion to the projection of the magnetic field. Named after Michael Faraday, this effect was the first experimental evidence that light and electromagnetism are related. In the case of radio waves, the rotation depends on how much material they have passed through.
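A minimal sketch of the relation this technique leans on: the polarization angle rotates by Δχ = RM × λ², where the rotation measure RM encodes the electron density and magnetic field along the path. The RM value below is illustrative, not taken from the study:

```python
import math

# Faraday rotation: the polarization angle rotates by RM * lambda^2,
# where RM (rad/m^2) encodes electron density and magnetic field along
# the line of sight. The RM value here is illustrative.
C = 2.998e8  # speed of light, m/s

def rotation_deg(rm_rad_m2: float, freq_hz: float) -> float:
    wavelength = C / freq_hz
    return math.degrees(rm_rad_m2 * wavelength ** 2)

for freq_ghz in (1.4, 2.8, 5.0):  # typical radio-survey bands
    print(f"{freq_ghz} GHz: {rotation_deg(50.0, freq_ghz * 1e9):6.1f} deg")
# Longer wavelengths rotate more, so comparing angles across frequencies
# reveals how much magnetized plasma the light has crossed.
```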
With this technique, the team examined various radio image samples obtained by the Australian Square Kilometer Array Pathfinder Telescope (ASKAP) and MeerKAT radio telescopes. They found they could estimate how far each part of the radio light had traveled, enabling them to create a 3D model of phenomena happening millions of light-years away. This technique also allowed the team to demonstrate, for the first time, how the line-of-sight orientation of relativistic jets can be determined.
They also examined the supermassive black hole (SMBH) at the heart of the M87 galaxy. Using their technique, the team was able to show how the ejected material interacts with cosmic winds and space weather, and also analyzed the structures of the jet’s magnetic fields in space. As Rudnick said in a recent University of Minnesota press release:
“We found that the shapes of the objects were very different from the impression that we got by just looking at them in a 2D space. Our technique has dramatically altered our understanding of these exotic objects. We may need to reconsider previous models on the physics of how these things work. There is no question in my mind that we will end up with lots of surprises in the future that some objects will not look like we thought in 2D.”
The team recommends using this technique to reevaluate all previous analyses of polarized light sources. They also hope this technique will be applied to images taken by next-generation telescopes around the world. This includes the new Square Kilometer Array (SKA-Phase2) project, which will extend the facility to about 2000 dishes, making it 50 times more sensitive and 10,000 times faster than any other radio telescope in the world.
Further Reading: UofM-CSE, MNRAS
The post Scientists Develop Technique to Create 3D Models of Cosmic Structures appeared first on Universe Today.
The Best Way to Find Planet Nine Might Be Hundreds of Tiny Telescopes
Ever since William Herschel discovered Uranus in 1781, astronomers have been eager to find new planets on the outer edge of the solar system. But after the discovery of Neptune in 1846, we’ve found no other large planets. Sure, we discovered Pluto and other dwarf planets beyond it, but nothing Earth-sized or larger. If there is some planet nine, or “Planet X” lurking out there, we have yet to find it.
But there is some tentative evidence for it. As we have found more Pluto-like bodies known as Trans-Neptunian Objects (TNOs) and even more distant bodies known as Kuiper Belt Objects (KBOs), we’ve noticed that there appears to be an odd bit of orbital clustering among them. The orientation of their orbits isn’t as randomly distributed as we’d expect, which could be caused by the small gravitational tugs of a super-Earth at the edge of the solar system. If we assume that is the solution to the orbital bias, then there could be a five Earth-mass planet orbiting ten times farther from the Sun than Neptune.
Astronomers have searched for the planet but have found nothing. This has led some to speculate that Planet X might be a primordial black hole, while more skeptical minds argue it must not exist. The evidence just isn’t that strong, and there are other possible explanations for the clustering. So a new paper argues for a new way to gather evidence of Planet X, and it’s remarkably clever.
The idea is based on a phenomenon known as occultation. This is when an asteroid or planetary body passes in front of a star. By observing the star as the object occults it, astronomers can measure things such as the orbit and shape of the body. Through an occultation, we discovered that the asteroid Chariklo has a ring system. Amateur astronomers have used occultation events to map the shapes of small asteroids.
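The size measurement itself is simple geometry: the star winks out for as long as the object’s shadow takes to sweep over the telescope. A minimal sketch, with illustrative numbers:

```python
# One occultation chord: the star disappears for as long as the object's
# shadow takes to sweep over the telescope. Numbers are illustrative.
shadow_speed_km_s = 25.0  # typical sky-plane speed for a distant TNO's shadow
duration_s = 2.0          # how long the star winked out

chord_km = shadow_speed_km_s * duration_s
print(f"Chord length: {chord_km:.0f} km")
# One telescope yields one chord; many telescopes at different offsets
# (as in the proposed array) trace out the object's full silhouette.
```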
Occultations seen from different vantage points reveal the shape of an asteroid. Credit: IOTA
The authors propose building 200 40-cm telescopes spaced 5 kilometers apart to create an occultation array 1,000 km wide. Since each telescope would have a slightly different vantage point, occultations would be seen differently by different telescopes, allowing astronomers to map the orbits and sizes of Trans-Neptunian Objects. They estimate that over the course of a 10-year study they could detect about 1,800 new TNOs. Based on simulations of TNO orbits and clustering, the authors show that such a system should find clear evidence of any 5 Earth-mass body within 800 AU of the Sun. In other words, if Planet X is out there, this study could prove it.
The whole array would only cost about $15 million U.S. dollars, which is surprisingly cheap for such a project. Even if the study failed to find Planet X, it would add to our understanding of the distant solar system and also allow us to study how sunlight can shift the orbits of small solar system bodies (https://briankoberlein.com/blog/super-breakout/).
Reference: Gomes, Daniel CH, and Gary M. Bernstein. “An automated occultation network for gravitational mapping of the trans-neptunian solar system.” arXiv preprint arXiv:2410.16348 (2024).
The post The Best Way to Find Planet Nine Might Be Hundreds of Tiny Telescopes appeared first on Universe Today.
It Takes Very Special Conditions to Create This Bizarre Stellar Spectacle
A stellar odd couple 700 light-years away is creating a chaotically beautiful display of colourful, gaseous filaments. The Hubble Space Telescope captured the pair, named R Aquarii, and their symbiotic interactions. Every 44 years, the system’s violent eruptions blast out filaments of gas at over 1.6 million kilometers per hour.
R Aquarii consists of two dramatically different types of stars: a white dwarf and a particular type of variable star.
The white dwarf is a stellar remnant. It’s what remains of a main sequence star that’s reached the end of its life of fusion. It shines only because of its remnant heat. White dwarfs are extremely dense, so even though they’re about the same size as Earth, they have a mass similar to the Sun. That means for such a small volume object, they exert a powerful gravitational pull.
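A quick check on “Earth-sized but Sun-massed,” using round constants, shows just how extreme that combination is:

```python
import math

# "Earth-sized but Sun-massed": mean density and surface gravity of such
# an object, using round physical constants.
G = 6.674e-11
M = 1.989e30   # kg, one solar mass
R = 6.371e6    # m, Earth's radius

density = M / (4 / 3 * math.pi * R ** 3)
print(f"Mean density: {density:.1e} kg/m^3 (~{density / 1e9:.1f} tonnes per cm^3)")

g_surface = G * M / R ** 2
print(f"Surface gravity: {g_surface / 9.81:.0f} times Earth's")
# -> around two tonnes per cubic centimetre, and a surface pull hundreds
#    of thousands of times stronger than Earth's
```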
The variable star is a type of red giant called a Mira-type variable. It’s the complete opposite of its companion star: rather than extremely compact and dense, the red giant is bloated and red, more than 400 times larger than the Sun. It’s a pulsating giant star that’s more at home atop Sauron’s Dark Tower than in a catalogue of stars. As it pulses, it changes temperature and luminosity. Over an approximately 390-day period, its brightness changes by a factor of 750.
That means that when the star is at its peak brightness, it’s more than 5,000 times as bright as our Sun.
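That factor-of-750 swing translates directly into astronomical magnitudes via Δm = 2.5 log₁₀(ratio), as a two-line check shows:

```python
import math

# Convert the factor-of-750 brightness swing into magnitudes:
# delta_m = 2.5 * log10(ratio).
ratio = 750
delta_mag = 2.5 * math.log10(ratio)
print(f"A factor of {ratio} = {delta_mag:.1f} magnitudes")
# -> ~7.2 magnitudes over the ~390-day cycle
```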
This image of R Aquarii is from the SPHERE planet-hunting instrument on the ESO’s Very Large Telescope in 2018. It was taken while the instrument was being tested, and astronomers were able to capture dramatic details of the turbulent stellar relationship with unprecedented clarity. This image is from the SPHERE/ZIMPOL observations of R Aquarii, and shows the binary star itself, as well as the jets of material spewing from the stellar couple. Image Credit: ESO/Schmid et al. – https://www.eso.org/public/images/eso1840a/, CC BY 4.0, https://commons.wikimedia.org/w/index.php?curid=75014181
The powerful pulsing of this massive red star is enough to be a spectacle in itself. But its relationship with its binary partner creates an even more spectacular display. As the two orbit, the dense white dwarf draws hydrogen gas away from the red giant. The hydrogen accumulates on the white dwarf until the star can’t take it anymore. Then the hydrogen explodes in nuclear fusion on the surface of the small, dense star.
The nova explosion ejects the material into space in gaseous filaments. But the region around white dwarfs is dominated by the star’s powerful magnetic fields, which can be millions of times stronger than Earth’s. The force of the nuclear explosion and the magnetic fields twist the gaseous hydrogen filaments into trails and streamers, and eventually, they loop back on themselves and form spiral patterns.
We can only see this nebula of gaseous filaments because the radiation from both stars strips electrons from the hydrogen, turning it into ionized gas. The ionized hydrogen glows brightly and creates a beautiful natural display.
The central binary star’s brightness changes over time because of the pulsing of the red giant. The gas appears red to us, but not because of the red giant. R Aquarii is in a dusty region, and the dust absorbs all the blue light, with only red reaching us.
A Hubble timelapse consisting of five images of R Aquarii from 2014 to 2023 helps bring the dynamic interplay to life.
Looking at these images, it’s easy to misunderstand the scale of the stars, the nebula, and the brightly lit filaments of ionized hydrogen. However, the material blasted into space reaches as far as 400 billion kilometers (248 billion miles). For comparison, that’s about 24 times greater than our Solar System’s diameter.
R Aquarii was first observed by German astronomer Karl Ludwig Harding in 1810, when he was a colleague of Carl Friedrich Gauss at Göttingen Observatory. It’s one of the nearest symbiotic stars and an object astronomers are very interested in observing. In the 20th century, Edwin Hubble and others studied it and recognized its complex interactions and the resulting nebula. R Aquarii and its brethren can teach astronomers a lot about stellar winds, accretion, and ionized nebulae.
The post It Takes Very Special Conditions to Create This Bizarre Stellar Spectacle appeared first on Universe Today.
A New Look at the Most Ancient Light in the Universe
In the earliest moments of the Universe, the first photons were trapped in a sea of ionized gas. They scattered randomly with the hot nuclei and electrons of the cosmic fireball, like tiny boats in a stormy sea. Then, about 370,000 years after the Big Bang, the Universe cooled enough for the photons to be free. After one last scattering, they could finally ply interstellar space. Some of them traveled across 14 billion years of space and time to reach Earth, where we see them as part of the cosmic microwave background. The remnant first light of creation.
The CMB is a central point of evidence supporting the Big Bang and the standard model of cosmology. By observing the scale of fluctuations within the CMB, we can measure things such as the shape of space, the distribution of matter and energy, and the rate of cosmic expansion. It’s that last one that has been troubling astronomers, thanks to the Hubble tension problem.
Astronomers have several ways to measure the Hubble parameter, the value of which tells us the rate of cosmic expansion. The methods generally fall into two types: those based on observations of the CMB, and those based on astrophysical phenomena such as supernovae. The problem is that these two types of methods don’t agree on the value. They even contradict each other, leading some astronomers to argue there must be something wrong with the standard model.
Polarization fluctuations within the CMB. Credit: SPT-3G Collaboration

Of the two types, the CMB method is the one with the more limited data. The best CMB observations we have come from space telescopes such as Planck, which measured fluctuations in CMB intensity. One solution to the tension problem would be to argue that the CMB observations are somehow biased. But new observations gathered by the South Pole Telescope (SPT) blow that idea out of the water.
Rather than measuring intensity fluctuations in the cosmic microwave background, the SPT observed variations in its polarization. All the CMB light we observe comes from the moment of last scattering, when each photon scattered off the ionized gas one last time before beginning the nearly 14-billion-year journey to reach us. When light is scattered, it is polarized relative to the distribution of the ionized gas. So these observations are a truly independent measure of cosmic expansion.
Different modes of CMB polarization. Credit: Sky and Telescope

One challenge in working with polarized CMB data is that as the first light traveled through space, it interacted with matter, space, and time. Not only is the light redshifted by cosmic expansion, it is gravitationally lensed by galaxies, which changes the polarization. Some of the light scatters off interstellar gas, which adds a spurious polarization signal. Even ripples of gravitational waves can affect the light’s orientation. So the team looked not just at the raw polarization of the CMB, but also at what are known as E-mode and B-mode polarization. Each of these is sensitive to different kinds of bias. For example, the E-mode is more sensitive to secondary scattering, while the B-mode is more sensitive to cosmic inflation and gravitational waves.
By combining and contrasting these polarization modes, the team was able to calculate a new value for the Hubble parameter. Since it isn’t based on intensity fluctuations, it is free of any bias in the space-based CMB observations. Based on their data, the team found a value of H₀ of 66.0–67.6 (km/s)/Mpc. This agrees with the intensity-based observations of WMAP and Planck, which found a value of 67–68 (km/s)/Mpc. In comparison, the astrophysical methods find a value of 73–75 (km/s)/Mpc.
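To put numbers on the disagreement: Hubble’s law, v = H₀ × d, converts a galaxy’s distance into a recession velocity, so the same galaxy recedes measurably faster under the locally measured H₀ than under the CMB-based one. Here is a minimal sketch using the ranges quoted above; the 100 Mpc example galaxy is an illustrative choice.

```python
# Hubble's law: recession velocity v = H0 * d.
# The H0 ranges below are the ones quoted in this article.
H0_CMB = (66.0 + 67.6) / 2    # (km/s)/Mpc, midpoint of the SPT range
H0_LOCAL = (73.0 + 75.0) / 2  # (km/s)/Mpc, midpoint of the astrophysical range

distance_mpc = 100.0          # an illustrative galaxy 100 Mpc away

v_cmb = H0_CMB * distance_mpc
v_local = H0_LOCAL * distance_mpc
print(f"CMB-based:   {v_cmb:.0f} km/s")    # ~6680 km/s
print(f"Local-based: {v_local:.0f} km/s")  # ~7400 km/s
```

The two predictions differ by roughly 11%, far more than either method’s error bars allow, and that gap is the Hubble tension.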
This study confirms that earlier CMB observations are not biased. The Hubble tension is very real, and we currently have no clear way to resolve it.
Reference: SPT-3G Collaboration. “Cosmology From CMB Lensing and Delensed EE Power Spectra Using 2019-2020 SPT-3G Polarization Data.” arXiv preprint arXiv:2411.06000 (2024).
Space Tourism: The Good, The Bad, The Meh
Space tourism is here to stay, and will likely remain a fixture of near-Earth activities for the foreseeable future. But is it worth it?
For decades, private individuals could negotiate with national space agencies to get rides to the International Space Station, but it wasn’t until the advent of private aerospace that many more opportunities opened up. Wealthy billionaires like Elon Musk, Jeff Bezos, and Richard Branson creating their own rocket companies changed the playing field. Now a private individual wanting to take a hop into space can shop around among far more options.
While Elon Musk’s SpaceX does not have a stated goal of space tourism, if you are willing to front the money you can get a ride on a Crew Dragon capsule, like Jared Isaacman recently did with his Polaris Dawn mission. On the other end of the spectrum, Richard Branson’s Virgin Galactic is explicitly designed around space tourism. They offer short sub-orbital hops for a few hundred thousand dollars each.
Space tourism certainly has several positives. For one, it generates more interest and activity in space, which generally brings positive attention to the industry. Second, in chasing a new market niche, these companies develop new technologies and approaches that can benefit the larger industry. Lastly, there’s the well-reported “overview effect,” where people finally get a view of our fragile home planet and gain a new perspective on what is important in human life.
On the other hand, it’s not as though many people get to be space tourists. Even the cheapest tickets are comparable to the cost of a home, making the experience inaccessible to all but the wealthiest people in our society. Few people get to appreciate the view or participate in this new market. In fact, space tourism can breed resentment as people come to think of space as the province of the rich and elite.
Then there’s the fact that there are precious few dollars available for rocket development and space exploration. Many might argue that those dollars would be better spent on scientific exploration or the experimental development of new technologies than on creating a new pastime for the ultra-wealthy.
Ultimately space tourism is going to be a thing whether we like it or not. It’s also not going to be a big thing. For the foreseeable future it will remain incredibly expensive, and most rocket companies are more interested in scientific and industrial pursuits in low-Earth orbit and beyond. So either way, whether it’s a good or bad thing, it’s simply not going to make a huge difference.
New Study Examines Cosmic Expansion, Leading to a New Drake Equation
In 1960, in preparation for the first SETI conference, Cornell astronomer Frank Drake formulated an equation to calculate the number of detectable extraterrestrial civilizations in our Milky Way. Rather than being a scientific principle, the equation was intended as a thought experiment that summarized the challenges SETI researchers faced. This became known as the Drake Equation, which remains foundational to the Search for Extraterrestrial Intelligence (SETI) to this day. Since then, astronomers and astrophysicists have proposed many updates and revisions for the equation.
This is motivated by ongoing research into the origins of life on Earth and the preconditions that led to its emergence. In a recent study, astrophysicists led by Durham University produced a new model for the emergence of life that focuses on the accelerating expansion of the Universe (driven by Dark Energy, the cosmological constant) and the number of stars formed. Since stars are essential to the emergence of life as we know it, this model could be used to estimate the probability of intelligent life in our Universe and beyond (i.e., in a multiverse scenario).
The study was led by Daniele Sorini, a postdoctoral research associate at Durham University’s Institute for Computational Cosmology, and was funded by a European Research Council (ERC) grant. He was joined by John Peacock, a Professor of Cosmology at the Royal Observatory and the University of Edinburgh’s Institute for Astronomy, and Lucas Lombriser of the Département de Physique Théorique, Université de Genève. The paper detailing their findings was recently published in the Monthly Notices of the Royal Astronomical Society.
The Drake Equation is a mathematical formula for the probability of finding life or advanced civilizations in the universe. Credit: University of Rochester

As noted, the Drake Equation was not intended as a tool for estimating the number of extraterrestrial intelligences (ETIs) but as a guide for how scientists should search for life in the Universe. The formula for the equation is:
N = R* × fp × ne × fl × fi × fc × L

where N is the number of civilizations in our galaxy we might be able to communicate with, R* is the average rate of star formation in our galaxy, fp is the fraction of those stars that have planets, ne is the number of planets per system that can actually support life, fl is the fraction of those planets that will develop life, fi is the fraction of those that will develop intelligent life, fc is the fraction of civilizations that will develop transmission technologies, and L is the length of time these civilizations transmit their signals into space.
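To see how the equation behaves in practice, here is a minimal sketch that simply evaluates it. Every parameter value below is an illustrative guess, not a figure from Drake or from the study discussed here.

```python
# A back-of-envelope evaluation of the Drake Equation.
# All parameter values are illustrative guesses.
R_star = 1.5     # average star formation rate in the Milky Way (stars/year)
f_p    = 0.9     # fraction of stars with planets
n_e    = 0.3     # habitable planets per planetary system
f_l    = 0.1     # fraction of habitable planets that develop life
f_i    = 0.01    # fraction of those that develop intelligence
f_c    = 0.1     # fraction of those that develop detectable technology
L      = 10_000  # years a civilization keeps transmitting

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(f"Detectable civilizations in the galaxy: N = {N:.2f}")  # ~0.41
```

With these guesses, N comes out below one, but small changes in any factor swing the result by orders of magnitude, which is why the equation works better as a framework for discussion than as a prediction.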
In the same sense, the new research does not attempt to calculate the absolute number of intelligent species in the Universe. Instead, the team presents an analytical model for cosmic star formation history to measure the impact of cosmological parameters within the most widely accepted cosmological model. This is none other than the Lambda-Cold Dark Matter (LCDM) model, where Dark Matter and Dark Energy (Lambda) account for roughly 95% of the matter-energy density of the Universe. The remaining 5%, the “ordinary” matter we see every day, is what scientists refer to as baryonic matter (aka. “luminous matter”).
In their paper, the team calculated the fraction of ordinary matter that is converted into stars over the entire history of the Universe for different Dark Energy densities. Stars are essential to life, creating heavier elements through nuclear fusion that allow for planet formation, biochemistry, and all life as we know it. Their model predicts that star formation would be most efficient in a universe that converts about 27% of its ordinary matter into stars, compared to the 23% observed in our own Universe. In short, their results suggest that, in the context of a multiverse, our Universe is an outlier.
Early Dark Energy could have caused early seeds of galaxies (depicted at left) to sprout many more bright galaxies (at right) than theory predicts. Credit: Josh Borrow/Thesan Team

These findings could have significant implications for cosmology and the ongoing debate about whether or not our Universe is “fine-tuned” for life. As Dr. Sorini explained in a Royal Astronomical Society press release:
“Understanding Dark Energy and the impact on our Universe is one of the biggest challenges in cosmology and fundamental physics. The parameters that govern our Universe, including the density of dark energy, could explain our own existence. Surprisingly, though, we found that even a significantly higher dark energy density would still be compatible with life, suggesting we may not live in the most likely of Universes.”
The new model could also provide insight into how differing densities of Dark Energy affect the formation of the Universe and the development of conditions that allow life to emerge. Dark Energy drives cosmic expansion, causing the large-scale structures of the Universe (galaxies and galaxy clusters) to move farther and farther apart. For life to develop, matter must be able to clump together into stars and planets and remain stable for billions of years, since evolution is a long-term process.
Another takeaway from this research is that star formation and the evolution of the large-scale structure of the Universe achieve a balance over time. This balance determines the optimal value of Dark Energy density needed for the emergence of life and the eventual development of intelligent life. Said Prof. Lombriser: “It will be exciting to employ the model to explore the emergence of life across different universes and see whether some fundamental questions we ask ourselves about our own Universe must be reinterpreted.”
The Drake Equation may need additional parameters, including a Lambda energy density (ld) and a multiverse (mv) parameter. Regardless, the search for life and the question of how it can arise endure, much like Frank Drake’s equation itself!
Further Reading: Royal Astronomical Society, MNRAS
Pentagon’s Latest UFO Report Identifies Hotspots for Sightings
The Pentagon office in charge of fielding UFO reports says that it has resolved 118 cases over the past year, with most of those anomalous objects turning out to be balloons. But it also says many other cases remain unresolved.
This year’s legally mandated report from the Department of Defense’s All-Domain Anomaly Resolution Office, or AARO, also identifies areas of the world that seem to be hotspots for sightings of unidentified flying objects. Such objects have been re-branded as unidentified anomalous phenomena, or UAPs.
Today’s report comes just one day after a House subcommittee hearing about UAPs, during which witnesses — and some lawmakers — voiced concerns about potential alien visitations and undisclosed efforts to gather evidence. In contrast, the Pentagon’s report for the 2023-2024 time period states that, “to date, AARO has discovered no evidence of extraterrestrial beings, activity or technology.”
“AARO has successfully resolved hundreds of cases in its holdings to commonplace objects such as balloons, birds, drones, satellites and aircraft,” the office’s director, Jon Kosloski, said in a news release. “Only a very small percentage of reports to AARO are potentially anomalous, but these are the cases that require significant time, resources and a focused scientific inquiry by AARO and its partners.”
In the past, U.S. military and intelligence officials have suggested that some UAP sightings may be attributable to intrusions by rival powers such as Russia or China. The Chinese spy balloon that was intercepted and destroyed by Air Force fighter jets last year after crossing over the U.S. serves as a prime example.
AARO’s latest report says that U.S. military aircrews provided two reports over the past year that identified flight safety concerns, and three reports described pilots being trailed or shadowed by anomalous objects. “To date, AARO has no indication or confirmation that these activities are attributable to foreign adversaries,” the report says, but the office is continuing to work with the U.S. intelligence community to investigate the cases.
“None of the reports AARO received during the reporting period indicated that observers suffered any adverse health effects,” the report says.
AARO’s reporting system was established to encourage members of the U.S. military to let the Pentagon know about UAP sightings and take the stigma out of the process. Based on the latest numbers, the strategy seems to be working. Between May 2023 and June 2024, AARO received 757 UAP reports, compared with 291 reports for the period between August 2022 and April 2023.
Here are more statistics from today’s report:
- Of the 757 reports received over the past year, 485 relate to incidents during the yearlong reporting period, and the remaining 272 reports relate to incidents occurring in the 2021-2022 time frame.
- In addition to the 118 resolved cases, another 174 cases have been queued up for closure, pending a final review and approval by AARO’s director. All those cases were attributed to prosaic objects.
- Seventy percent of the closed cases in 2023-2024 were attributed to balloons. Sixteen percent were attributed to drones, 8% to birds, 4% to satellites, and the remaining 2% to other commonplace objects.
- AARO determined that 21 cases merited further analysis, based on reported anomalous characteristics or behaviors. Those cases are being studied by AARO’s experts as well as the office’s partners in the intelligence community and the science and tech community. “AARO will provide immediate notification to Congress should AARO identify that any cases indicate or involve a breakthrough foreign adversarial aerospace capability,” the report says.
- The remaining 444 cases received over the past year lacked sufficient data for further analysis. They’ve been placed in an archive and will be revisited if additional data comes to light. AARO says it has 1,652 UAP reports in all.
- In addition to reports from the U.S. military, AARO is receiving reports of sightings by civil and commercial pilots via the Federal Aviation Administration. AARO says 392 of the 757 reports received over the past year came from the FAA.
- AARO says unidentified lights or orb-shaped objects were mentioned most frequently in the subset of UAP reports that included references to visual characteristics. Other reports mentioned cylinders, disks, triangles, squares or exotic objects such as a “green fireball” or “a jellyfish with flashing lights.”
AARO’s global map of UAP reporting hotspots highlights four broad areas: the southeastern U.S. and Gulf of Mexico; the West Coast and Pacific Northwest; the Middle East; and northeastern Asia in the vicinity of Japan and the Korean peninsula. This doesn’t mean the aliens favor those regions. Instead, AARO says the distribution favors a “continued geographic collection bias based on locations near U.S. military assets and sensors operating globally.”
AARO says it’s getting an increasing number of cases that can be traced to sightings of SpaceX’s Starlink satellites. “For example, a commercial pilot reported white flashing lights in the night sky,” the report says. “The pilot did not report an altitude or speed, and no data or imagery was recorded. AARO assessed that this sighting of flashing lights correlated with a Starlink satellite launch from Cape Canaveral, Florida, the same evening about one hour prior to the sighting.”
One of the reports received via the FAA mentioned a possible flight safety issue. “In this instance, a commercial aircrew reported a near miss with a ‘cylindrical object’ while over the Atlantic Ocean off the coast of New York,” the report says. “AARO continues its research into, and analysis of, this case.”
AARO received 18 reports from the Nuclear Regulatory Commission that related to UAP incidents near U.S. nuclear infrastructure, weapons and launch sites. NRC officials attributed all those sightings to drones. One of the incidents, in August 2023, involved the recovery of a crashed drone in the vicinity of the D.C. Cook Nuclear Power Plant in Michigan — but AARO provided no further information about the drone.
What more can be done? In today’s report, AARO says its ability to resolve cases has been constrained due to “a lack of timely and actionable sensor data.”
“AARO continues to address this challenge by working with military and technical partners to optimize sensor requirements, information-sharing processes, and the content of UAP reporting,” the report says. “AARO is also expanding engagement with foreign partners to share information and collaborate on best practices for resolving UAP cases.”
A New Way to Detect Daisy Worlds
The Daisy World model describes a hypothetical planet that self-regulates, maintaining a delicate balance involving its biogeochemical cycles, climate, and feedback loops that keep it habitable. It’s associated with the Gaia Hypothesis developed by James Lovelock. How can we detect these worlds if they’re out there?
By looking closely at information.
A Daisy World (DW) is inhabited by two types of daisies: white and black. The two have different albedos: black daisies absorb more sunlight and warm the planet, while white daisies reflect more sunlight and cool it.
As the DW’s star brightens, the planet’s temperature rises. At first, black daisies thrive because they absorb more energy. As the planet gets hotter, though, absorbing extra energy becomes a liability, and the white daisies begin to outcompete the blacks. As the white daisies spread, they reflect more sunlight and cool the planet.
The result is a delicate homeostasis where the daisies regulate the planet’s temperature and keep it in a habitable range. It can’t get too hot and it can’t get too cold. The DW model shows how life can influence a planet’s climate and create conditions favourable for its own survival.
Earth is not exactly a daisy world, but life on Earth influences the climate. The DW model simply illustrates the concept of basic climate feedback mechanisms.
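To make that feedback concrete, here is a minimal numerical sketch of the classic Watson and Lovelock (1983) Daisy World equations. The parameter values (stellar flux, albedos, growth curve, death rate) are the standard textbook ones, not values from the new research discussed below.

```python
# Minimal Daisy World sketch after Watson & Lovelock (1983).
SIGMA = 5.67e-8   # Stefan-Boltzmann constant (W m^-2 K^-4)
FLUX = 917.0      # baseline stellar flux (W m^-2)
Q = 2.06e9        # heat-transfer coefficient (K^4)
GAMMA = 0.3       # daisy death rate
ALBEDO = {"white": 0.75, "black": 0.25, "ground": 0.50}

def beta(T):
    """Growth rate peaks at 295.5 K and vanishes outside roughly 278-313 K."""
    return max(0.0, 1.0 - 0.003265 * (295.5 - T) ** 2)

def equilibrium(luminosity, steps=5000, dt=0.02):
    """Integrate daisy cover to a steady state for a given stellar luminosity."""
    a_w = a_b = 0.01  # seed populations
    for _ in range(steps):
        bare = 1.0 - a_w - a_b
        A = a_w * ALBEDO["white"] + a_b * ALBEDO["black"] + bare * ALBEDO["ground"]
        T4 = luminosity * FLUX * (1.0 - A) / SIGMA          # planetary T^4
        T_w = max(Q * (A - ALBEDO["white"]) + T4, 0.0) ** 0.25  # whites run cooler
        T_b = max(Q * (A - ALBEDO["black"]) + T4, 0.0) ** 0.25  # blacks run warmer
        a_w += dt * a_w * (bare * beta(T_w) - GAMMA)
        a_b += dt * a_b * (bare * beta(T_b) - GAMMA)
        a_w, a_b = max(a_w, 0.01), max(a_b, 0.01)           # keep seeds alive
    return T4 ** 0.25, a_w, a_b

# The planetary temperature stays nearly flat as the star brightens:
for L in (0.7, 0.9, 1.1, 1.3):
    T, w, b = equilibrium(L)
    print(f"L={L:.1f}  T={T:.1f} K  white={w:.2f}  black={b:.2f}")
```

Across that wide range of luminosities, the printed temperature stays pinned near the daisies’ preferred range even as the daisy populations shift, which is the homeostasis described above.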
The ESA’s Sentinel 2 satellite captured this image of an algae bloom in the Baltic Sea in 2015. A ship can be seen moving through it. Algae blooms interact with the climate through feedback loops. Image Credit: Copernicus Sentinel data / ESA.

In new research, scientists from the Department of Physics and Astronomy and the Department of Computer Science at the University of Rochester wanted to find ways to analyze how planetary systems like biospheres and geospheres are coupled. If there are self-regulating “Daisy Worlds” out there, how can we detect them?
The research is “Exo-Daisy World: Revisiting Gaia Theory through an Informational Architecture Perspective.” The lead author is Damian Sowinski, a research physicist and postdoctoral associate in the Department of Physics and Astronomy at the University of Rochester. The research is awaiting publication and has not yet been peer-reviewed.
The idea is to find a way to detect agnostic biosignatures on exoplanets. Conventional biosignatures are specific chemicals, like oxygen or methane, that can be byproducts of living organisms. Agnostic biosignatures are indications that life is present that don’t rely on identifying which types of organisms might be producing them. Instead, they’re overarching planetary patterns that living worlds produce.
For the authors, finding agnostic biosignatures begins with information and how it flows.
“In this study, we extend the classic Daisy World model through the lens of Semantic Information Theory (SIT), aiming to characterize the information flow between the biosphere and planetary environment—what we term the information architecture of Daisy World systems,” the authors explain.
Semantic Information Theory has been around since the mid-20th century. It attempts to define what information means in different contexts, how subjective interpretation affects that meaning, and related questions in the same vein. It has taken on a new focus as artificial intelligence and machine learning become more prevalent.
There’s a drive to understand exoplanet atmospheres and environments and to have a way to differentiate between those that may be life-supporting and those that aren’t. This is a complex problem that hinges on agnostic biosignatures.
The JWST captured this atmospheric spectrum of exoplanet K2-18 b showing the presence of methane, which can act as a biosignature. The authors say that information theory can help uncover agnostic biosignatures. Rather than specific chemicals like methane, agnostic biosignatures are patterns that can only be created by a biosphere. Image Credit: NASA, CSA, ESA, R. Crawford (STScI), J. Olmsted (STScI), Science: N. Madhusudhan (Cambridge University)

Agnostic biosignatures are complex patterns and structures that can’t be explained by non-biological processes. There’s also disequilibrium, novel energy transfer, unusual levels of organization at different scales, and cyclical or systematic changes that suggest a biological cause.
A search for agnostic biosignatures can involve complex molecules that need biological synthesis, chemical distributions that require metabolism, unexpected accumulations of specific molecules, and features in an atmosphere or on a planetary surface that require biological maintenance.
Some examples of agnostic biosignatures on Earth are methane and oxygen co-existing in the atmosphere, the ‘Red Edge‘ in Earth’s vegetation spectrum, and daily or seasonal cycles of gas emissions.
The Red Edge is a region of rapid change in vegetation reflectance in the near-infrared (NIR). It could be useful in detecting vegetation on exoplanets. Image Credit: Seager et al. 2024.

“The search for life on exoplanets requires the identification of biosignatures, which rely on life having significantly altered the spectroscopic properties of a planet. Thus, exoplanetary life searches focus not on detecting individual organisms but on identifying the collective effects of life on the planetary system—what we refer to as exo-biospheres,” the authors explain.
In short, we can’t study biosignatures without studying biospheres. In doing so, it’s critical to understand where and how an exo-biosphere reaches a “mature” state in which it exerts a strong influence on the atmosphere, hydrosphere, cryosphere, and lithosphere, collectively known as the geosphere. Once a biosphere is mature and exerts that strong influence, it’s in line with the Daisy World hypothesis.
The authors’ aim is to study how information flows between a biosphere and the planetary environment. To do this, they modelled potential conditions on M-dwarf exoplanets and derived equations that describe the co-evolution of the daisies on these worlds with their planetary environments. They created what they term an ‘information narrative’ for exo-Daisy Worlds (eDWs).
Typically, the homeostatic feedback in DWs rests on physical quantities like radiation fluxes, albedos, and plant life coverage fractions. That’s the physical narrative. However, the researchers used Semantic Information Theory to derive a complementary narrative based on how information flows. In their work, SIT focuses on correlations between an agent—the biosphere—and an environment and how those correlations benefit the agent.
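As a loose illustration of what “correlations between an agent and an environment” means in practice, here is a sketch that estimates the mutual information between two paired time series, such as the daisy cover and planetary temperature from a model run like the sketch above. The histogram-binning estimator and the synthetic data are illustrative choices of ours; the paper’s semantic-information measure goes further, asking which of these correlations actually support the biosphere’s viability.

```python
import numpy as np

# Crude stand-in for "information flow": the mutual information between
# paired samples of a biosphere variable (e.g., daisy cover) and an
# environment variable (e.g., planetary temperature).
def mutual_information(x, y, bins=16):
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()            # joint probability estimate
    px = pxy.sum(axis=1)                 # marginal over x
    py = pxy.sum(axis=0)                 # marginal over y
    independent = np.outer(px, py)
    mask = pxy > 0
    # I(X;Y) = sum p(x,y) * log2( p(x,y) / (p(x)p(y)) ), in bits
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / independent[mask])))

# Coupled series share information; independent noise shares almost none.
t = np.linspace(0, 10, 5000)
cover = 0.5 + 0.2 * np.sin(t)                                   # toy biosphere signal
temp_coupled = 290 - 10 * cover + np.random.normal(0, 0.5, t.size)
temp_random = np.random.normal(290, 2, t.size)
print(mutual_information(cover, temp_coupled))  # noticeably > 0
print(mutual_information(cover, temp_random))   # near 0
```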
Their model showed that as stellar luminosity rises, the correlations between the biosphere and its environment intensify. The correlations correspond to distinct phases of information exchange between the two. This leads to the idea of rein control, a control exerted by flora through the positive and negative differences of their albedos compared to the bare ground. This is how the biosphere exerts a regulatory influence on a planet’s climate. In their informational narrative, the planetary temperatures are more constrained “at the cooler and warmer boundaries of the bearable temperature range.”
Not all of the information that flows between the biosphere and the environment is relevant; the biosphere doesn’t use all of it, because some of it doesn’t help the biosphere maintain control. The authors say that by analyzing this flow with information theory, they can determine which information contributes to the biosphere’s viability, and when and how it does so.
The Daisy World model is instructive, but it’s a toy model. For example, it doesn’t include stochastic events like volcanic eruptions. The big question is how it relates to exo-biospheres.
The authors say their work shows the potential of approaches like SIT for understanding how exoplanets and their biospheres co-evolve, as they have on Earth. More realistic models will be needed, ones that include more of the complex network of interactions between an exoplanet’s living and non-living systems. Because a biosphere processes information in ways that non-living systems don’t, information-centric approaches can uncover agnostic biosignatures in ways that purely physical or chemical models can’t.
“As a result, the next step in our research program will involve applying SIT and other information-theoretic approaches to more complex models of coupled planetary systems,” the authors conclude.
Two Supermassive Black Holes on the Verge of a Merger
In March 2021, astronomers observed a high-energy burst of light from a distant galaxy. Assigned the name AT 2021hdr, it was initially thought to be a supernova. However, it had enough unusual features that it was flagged as potentially interesting by the Automatic Learning for the Rapid Classification of Events (ALeRCE) system. In 2022, another outburst was observed, and over time the Zwicky Transient Facility (ZTF) found a pattern of outbursts every 60–90 days. It clearly wasn’t a supernova, but it was unclear what it could be until a recent study solved the mystery.
One idea was that AT 2021hdr was a tidal disruption event (TDE), where a star strays too close to a black hole and is ripped apart. This can create periodic bursts as the stellar remnant orbits the black hole, but TDEs don’t tend to have such regular patterns. So the team considered another model, in which a massive interstellar cloud passes into the realm of a binary pair of black holes.
Simulations show how binary black holes interact with a gas cloud. Credit: F. Goicovic et al. 2016

Computer simulations show that rather than simply ripping the cloud apart, a binary black hole would churn the cloud as it consumes it. This would produce a periodic burst of light as the black holes orbit. The team observed AT 2021hdr using the Neil Gehrels Swift Observatory and found periodic oscillations of ultraviolet and X-ray light that match the transient bursts observed by ZTF. These observations match the simulations of a binary black hole.
Based on the data, the black holes have a combined mass of about 40 million Suns, and they orbit each other every 130 days. If they continue along their paths, the two black holes will merge in about 70,000 years. Without the passing cloud, we would have never noticed them.
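For a sense of scale, Kepler’s third law turns the quoted combined mass and 130-day period into an approximate orbital separation. The circular-orbit assumption and the arithmetic below are ours, not the study’s.

```python
import math

# Kepler's third law: a^3 = G * M * T^2 / (4 * pi^2), assuming a circular orbit.
G = 6.674e-11           # gravitational constant (m^3 kg^-1 s^-2)
M_SUN = 1.989e30        # solar mass (kg)
AU = 1.496e11           # astronomical unit (m)

M_total = 40e6 * M_SUN  # combined mass quoted in the study
T = 130 * 86400.0       # orbital period (s)

a = (G * M_total * T**2 / (4 * math.pi**2)) ** (1 / 3)
print(f"separation = {a:.2e} m = {a / AU:.0f} AU")  # roughly 170 AU
```

A separation of roughly 170 AU sounds enormous, but for black holes of 40 million solar masses it is a tight embrace, which is why the pair is destined to merge.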
The team plans to continue their observations of the system to further refine their model. They also plan to study how the black holes interact with their home galaxy.
Reference: L. Hernández-García, et al. “AT 2021hdr: A candidate tidal disruption of a gas cloud by a binary super massive black hole system.” Astronomy & Astrophysics 691 (2024)
Interferometry Will Be the Key to Resolving Exoplanets
When it comes to telescopes, bigger really is better. A larger telescope can see fainter objects and resolve finer detail. We have typically relied on ever-larger single-aperture telescopes, both on the ground and in space, in our attempts to distinguish exoplanets around other stars, but all that may be about to change. A new paper suggests that multiple telescopes working together as interferometers are what’s needed.
When telescopes were invented, they were single-aperture instruments. A new technique emerged in the late 1800s to combine the light from multiple instruments, achieving higher resolution than the instruments could manage on their own. The concept involves analyzing the interference pattern created when the incoming light from all the individual optical elements is combined. It is used very successfully in radio astronomy, for example at the aptly named Very Large Array. And it’s not just for radio waves: infrared and even visible-light interferometers have been developed, saving significant costs and producing results that would otherwise be unachievable with a single instrument.
Image of radio telescopes at the Karl G. Jansky Very Large Array, located in Socorro, New Mexico. (Credit: National Radio Astronomy Observatory)

One area of astronomical research is the study of exoplanets. Observing alien worlds orbiting distant stars presents a number of challenges, but the two key difficulties are that they lie at great distances and orbit bright stars. The planets are usually small and faint, making them almost (but not quite) impossible to study directly because of the brightness and proximity of their star. Some understanding of their nature can be gleaned from the transit method, which involves studying starlight as it passes through any atmosphere present to reveal its composition.
Direct imaging and study is more challenging still, requiring high resolution and sometimes a way of blocking light from the nearby star. Direct observations require angular resolutions of a few milliarcseconds or even less (the full Moon spans about 1,860,000 milliarcseconds!). The requirement depends largely on the planet’s size and its distance from Earth and from its host star. To give some idea of context, resolving a planet like Earth orbiting the Sun from a distance of just 10 light-years requires an angular resolution of 0.1 milliarcseconds. The James Webb Space Telescope has a resolution of 70 milliarcseconds, so even it will struggle.
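The arithmetic behind those numbers follows from the diffraction limit, θ ≈ 1.22 λ/D, where D is either a mirror diameter or an interferometer baseline. Here is a quick sketch; the wavelength choices are illustrative.

```python
import math

# Diffraction limit: theta = 1.22 * wavelength / D (radians),
# where D is a mirror diameter or an interferometer baseline.
MAS_PER_RAD = 180 / math.pi * 3600 * 1000  # milliarcseconds per radian

def resolution_mas(wavelength_m, aperture_m):
    """Diffraction-limited resolution of an aperture or baseline, in mas."""
    return 1.22 * wavelength_m / aperture_m * MAS_PER_RAD

def size_for(target_mas, wavelength_m):
    """Mirror diameter or baseline needed to reach a target resolution."""
    return 1.22 * wavelength_m / (target_mas / MAS_PER_RAD)

print(resolution_mas(2.0e-6, 6.5))  # ~77 mas: a JWST-sized mirror at 2 microns
print(resolution_mas(550e-9, 6.5))  # ~21 mas: the same mirror in visible light
print(size_for(0.1, 550e-9))        # ~1,400 m: the baseline needed for 0.1 mas
```

A baseline of over a kilometre at visible wavelengths is far beyond any single mirror we could build, but it is plausible for an array of separated telescopes combined interferometrically, which is exactly the paper’s point.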
This artist’s impression depicts the exomoon candidate Kepler-1625b-i, the planet it is orbiting and the star in the centre of the star system. Kepler-1625b-i is the first exomoon candidate and, if confirmed, the first moon to be found outside the Solar System. Like many exoplanets, Kepler-1625b-i was discovered using the transit method. Exomoons are difficult to find because they are smaller than their companion planets, so their transit signal is weak, and their position in the system changes with each transit because of their orbit. This requires extensive modelling and data analysis.

A paper recently authored by Amit Kumar Jha of the University of Arizona and a team of astronomers explores this very possibility. They look at using interferometry techniques to achieve the required resolutions, at using advanced imaging techniques like Quantum Binary Spatial Mode Demultiplexing to analyse the point spread function (familiar to amateur astronomical imagers), and at using quantum-based detectors.
The study draws upon radio interferometric techniques with promising results. The team showed that a multi-aperture interferometry approach using quantum-based detectors is more effective than single-aperture instruments, providing a super-resolution imaging capability that has not yet been applied in exoplanetary research. Not only would it hugely increase resolution, it’s also a very cost-effective way to observe exoplanets, and indeed other objects across the cosmos.
A New Mission To Pluto Could Answer the Questions Raised by New Horizons
Pluto may have been downgraded from full-planet status, but that doesn’t mean it doesn’t hold a special place in scientists’ hearts. There are practical and sentimental reasons for that: Pluto has tantalizing mysteries to unlock that New Horizons, the most recent spacecraft to visit the system, only added to. To pursue those mysteries, a multidisciplinary team from dozens of universities and research institutes has proposed Persephone, a mission to the Pluto system that could last 50 years.
New Horizons rocketed past the Pluto system, which is now technically considered part of the Kuiper Belt, in 2015. The mission collected data on the dwarf planet and its unique moon, Charon. Scientists have now had time to analyze the data from that mission, and it left them wanting more, particularly about some of the surface features they observed.
Persephone has four main scientific questions it is designed to answer, according to a paper published back in 2021:
1) “How has the population of the Kuiper Belt evolved?”
2) “What are the particle and magnetic field environments of the Kuiper Belt?”
3) “How have the surfaces of both Pluto and Charon changed?”
4) “What are the internal structures of Pluto and Charon?”
That last one might be the most intriguing, as the answer for Pluto’s internal structure might be that it has a subsurface ocean despite being so far away from the Sun. There is already some evidence for this, as Pluto appears to have an active surface, and an ice sheet called Sputnik Planitia could potentially be caused by a subsurface ocean. We don’t have enough data yet to prove it.
That is what Persephone is designed to provide. Unfortunately, given the unforgiving logic of orbital mechanics and current constraints on propulsion technology, any such mission would take multiple decades, even with a gravity assist from Jupiter. The mission design for Persephone calls for an operational lifetime of almost 31 years, including a 28-year cruise phase and a three-year orbital period around Pluto and Charon. It could then fly an extended mission to visit other Kuiper Belt objects to help constrain the variety of objects in that massive section of space.
That travel time could be shortened by the development of a more effective nuclear electric propulsion system, which could shave up to two years off the cruise even with a heavier payload than currently planned for Persephone. Such a system has been described but might not be available for Persephone’s planned 2031 launch on board an SLS rocket.
Fraser discusses the longevity of spacecraft, which will definitely be a consideration for any future missions to Pluto.

Whatever its propulsion system, Persephone will carry a suite of sensors that can be “brought to bear on any and every object encountered during the mission.” According to the flight plan, that would include Jupiter and its moons. These sensors include cameras, spectrometers, radar, magnetometers, and altimeters to meet the mission’s science objectives.
A critical differentiator for the mission is that it is designed as an orbiter rather than a flyby. According to the authors, much of the needed data would be impossible to collect in the short time a flyby spends with the system. An orbiter can stick around, collecting data on both Pluto and Charon, including their active surface dynamics, over the full three-year period.
This proposal is just one of many missions to the outer planets seeking further funding, and a preliminary cost estimate of $3 billion puts it in the higher range of those missions. But if it is funded in some capacity, it could answer the questions New Horizons posed, even if it would take several decades to reach them.
Learn More:
Howett et al – Persephone: A Pluto-system Orbiter and Kuiper Belt Explorer
UT – The (Dwarf) Planet Pluto
UT – NASA’s New Horizons Mission Still Threatened
UT – New Horizons is Funded Through the Decade. Enough to Explore Another Kuiper Belt Object
Lead Image:
Graphic of Pluto being visited by Persephone and all the different questions the mission could answer.
Credit – Howett et al.