Feed aggregator
Vital Atlantic Ocean current is already weakening due to melting ice
Evidence is growing that microbes in your mouth contribute to cancer
Sentinel-1C fuelled ahead of liftoff
Marking a major milestone in the preparation of Copernicus Sentinel-1C for its scheduled 3 December liftoff, experts have completed the critical and hazardous process of fuelling the satellite.
Once in orbit, Sentinel-1C will extend the Sentinel-1 mission’s legacy, delivering radar imagery to monitor Earth’s changing environment in support of a diverse range of applications and scientific research. Additionally, Sentinel-1C brings new capabilities for detecting and monitoring maritime traffic.
SpaceX launches 20 Starlink satellites from California (video, photos)
James Webb Confirms Hubble’s Calculation of Hubble’s Constant
We have been spoiled in recent years, first with the Hubble Space Telescope (HST) and then the James Webb Space Telescope (JWST). Both have opened our eyes to the Universe and made amazing discoveries. One subject that has received attention from both is the derivation of the Hubble Constant – a constant relating the velocity of remote galaxies to their distances. A recent paper announces that JWST has validated the results of previous Hubble Space Telescope studies measuring its value.
The Hubble Constant (H0) is a fundamental parameter in cosmology that defines the rate of expansion of the universe: it relates the distance of remote galaxies to the velocity at which they recede from us. It was first discussed by Edwin Hubble in 1929 as he observed the spectra of distant galaxies. It is measured in units of kilometres per second per megaparsec and shows how fast galaxies are moving away from us per unit of distance. The exact value of the constant has been the cause of many a scientific debate, and more recently the HST and JWST have been trying to fine-tune its value. Getting an accurate value is key to determining the age, size and fate of the universe.
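The relation the constant encodes is simple: recession velocity equals H0 times distance. A minimal sketch in Python (the H0 value and distance here are illustrative, not figures from the paper):

```python
def recession_velocity(distance_mpc, h0=70.0):
    """Hubble's law: recession velocity (km/s) = H0 * distance (Mpc)."""
    return h0 * distance_mpc

# A galaxy 100 Mpc away recedes at roughly 7,000 km/s for H0 = 70
print(recession_velocity(100))  # → 7000.0
```

Doubling the distance doubles the recession velocity, which is exactly what "kilometres per second per megaparsec" expresses.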
Edwin Hubble
A paper recently published by a team of researchers led by Adam G. Riess from Johns Hopkins University validates the results of a previous HST study. The team used JWST to re-examine HST's earlier results on the Cepheid/supernova distance ladder, which has been used to establish distances across the cosmos using Cepheid variable stars and Type Ia supernovae. Both objects can be likened to ‘standard candles’ whose actual brightness is very well understood. By comparing their apparent brightness as measured from Earth with their actual brightness, their intrinsic luminosity, their distances can be calculated.
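The standard-candle logic can be sketched with the textbook distance-modulus formula; the magnitudes below are illustrative values, not measurements from the paper:

```python
def distance_pc(apparent_mag, absolute_mag):
    """Distance in parsecs from the distance modulus:
    m - M = 5 * log10(d / 10 pc)  =>  d = 10**((m - M + 5) / 5)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A standard candle of intrinsic magnitude -4 that appears at
# magnitude 26 lies at roughly 10 million parsecs (10 Mpc)
print(distance_pc(26, -4))  # ≈ 1e7 pc
```

The dimmer a candle of known intrinsic brightness appears, the farther away it must be; that is the whole ladder in one line.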
NASA’s James Webb Space Telescope has spotted a multiply-imaged supernova in a distant galaxy designated MRG-M0138. Image Credit: NASA, ESA, CSA, STScI, Justin Pierel (STScI) and Andrew Newman (Carnegie Institution for Science).
Over recent decades, a number of attempts have been made to accurately determine H0 using a multitude of different instruments and observations. The cosmic microwave background has been used alongside the aforementioned studies of Cepheid variables and supernova events. These approaches yield a range of values, a discrepancy that has become known as the ‘Hubble tension.’ The recent JWST study aims to fine-tune and validate the earlier work.
To determine H0 accurately using the Cepheid/supernova ladder, a sufficiently large sample of Cepheids and supernovae must be observed. This has been challenging, in particular because of the small sample of supernovae within range of Cepheid variable stars. The team also explored other techniques for determining H0, for example using HST data on the luminosity of the brightest red-giant-branch stars in a galaxy, which can also serve as a standard candle, or the luminosity of certain carbon-rich stars.
This illustration shows three steps astronomers used to measure the universe’s expansion rate (Hubble constant) to an unprecedented accuracy, reducing the total uncertainty to 2.3 percent. The measurements streamline and strengthen the construction of the cosmic distance ladder, which is used to measure accurate distances to galaxies near to and far from Earth. The latest Hubble study extends the number of Cepheid variable stars analyzed to distances of up to 10 times farther across our galaxy than previous Hubble results. Credits: NASA, ESA, A. Feild (STScI), and A. Riess (STScI/JHU)
The team conclude that, when all JWST measurements are combined, including a correction for the small sample of supernovae, H0 comes out at 72.6 ± 2.0 km s⁻¹ Mpc⁻¹. This compares to the combined HST data, which give H0 as 72.8 km s⁻¹ Mpc⁻¹. It will take more years and more studies for the sample of supernovae from JWST to equal that from HST, but the cross-check so far suggests we are finally homing in on an accurate value for Hubble’s Constant.
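As a rough consistency check, the inverse of H0 sets the age scale of the universe, and with the JWST value it lands near the accepted ~13.8-billion-year age. This back-of-envelope sketch ignores the universe's changing expansion rate, so it is an order-of-magnitude check only:

```python
MPC_IN_KM = 3.0857e19       # kilometres in one megaparsec
SECONDS_PER_GYR = 3.1557e7 * 1e9  # seconds per year * 1e9

def hubble_time_gyr(h0_km_s_mpc):
    """Rough age scale of the universe: 1/H0, converted to gigayears."""
    seconds = MPC_IN_KM / h0_km_s_mpc  # (km/Mpc) / (km/s/Mpc) -> seconds
    return seconds / SECONDS_PER_GYR

print(round(hubble_time_gyr(72.6), 1))  # ≈ 13.5 Gyr
```

A larger H0 means a faster expansion and hence a younger universe, which is one reason the Hubble tension matters.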
The post James Webb Confirms Hubble’s Calculation of Hubble’s Constant appeared first on Universe Today.
SpaceX launches Optus-X telecom satellite from Florida in gorgeous sunset liftoff (video, photos)
What Should Light Sails Be Made Out Of?
The Breakthrough Starshot program aims to cross the immense distance to the nearest star in just decades. Its approach is to use a high-powered laser to propel a reflective sail to relativistic speeds. The selection of sail material is key to success, as it must be lightweight while withstanding the acceleration and radiation from the laser. A recent study explores various materials and proposes that core-shell structures—spherical particles composed of two different materials—could be a promising solution.
Breakthrough Starshot is an ambitious project to explore interstellar space by sending tiny, lightweight spacecraft to the nearest star system, Alpha Centauri. The project plans to use ground-based, high-powered lasers to accelerate reflective ‘light sails,’ enabling the spacecraft to achieve relativistic speeds and cross the 4.37 light-years in a couple of decades. Each spacecraft will be equipped with tiny sensors and communication systems and will collect data on exoplanets and other interstellar phenomena along the way. If successful, it could mark our first step toward exploring distant star systems and searching for extraterrestrial life.
This image of the sky around the bright star Alpha Centauri AB also shows the much fainter red dwarf star, Proxima Centauri, the closest star to the Solar System. The picture was created from images forming part of the Digitized Sky Survey 2. The blue halo around Alpha Centauri AB is an artifact of the photographic process; the star is actually pale yellow in colour, like the Sun. Image Credit: Digitized Sky Survey 2 Acknowledgement: Davide De Martin/Mahdi Zamani
Traveling at relativistic speeds, which are velocities close to the speed of light, presents amazing possibilities but brings with it immense difficulties. At these speeds, time dilation (a phenomenon predicted by Einstein’s theory of relativity) causes time to pass more slowly for the traveler relative to observers on Earth, potentially allowing journeys to distant stars within a single human lifetime from the traveler’s perspective. This won’t be a factor for Starshot, however, as the project plans to send tiny spacecraft only. Even so, achieving such speeds requires overcoming immense energy demands, since relativistic kinetic energy grows steeply with velocity, diverging as a craft approaches the speed of light. The environment at relativistic speeds also becomes particularly hazardous: collisions with particles at such high speeds could easily destroy spacecraft, and radiation exposure would intensify due to relativistic effects.
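The energy demand can be made concrete with the relativistic kinetic-energy formula; the 1-gram mass below is merely illustrative of a Starshot-class probe, not a figure from the project:

```python
def kinetic_energy_j(mass_kg, beta):
    """Relativistic kinetic energy (gamma - 1) * m * c^2, with beta = v/c."""
    c = 2.998e8  # speed of light, m/s
    gamma = 1.0 / (1.0 - beta**2) ** 0.5
    return (gamma - 1.0) * mass_kg * c**2

# Even a 1-gram probe at 20% of light speed carries ~1.9e12 J,
# comparable to hundreds of tonnes of TNT
print(kinetic_energy_j(0.001, 0.2))
```

All of that energy has to be delivered by the laser during the brief acceleration phase, which is why the beam power requirements are so extreme.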
This image shows the ACS3 being unfurled at NASA’s Langley Research Center. The solar wind is reliable but not very powerful. It requires a large sail area to power a spacecraft effectively. The ACS3 is about 9 meters (30 ft) per side, requiring a strong, lightweight boom system. Image Credit: NASA
To complete the journey in a few decades, the spacecraft needs to be accelerated to an estimated 20% of the speed of light, bringing with it all the problems outlined above. Selecting the right material for the sails is key. In a paper recently published by Mitchell R. Whittam, Lukas Rebholz, Benedikt Zerulla and Carsten Rockstuhl from the Karlsruhe Institute of Technology in Germany, the team report the results of their search for the best material. In particular, they focus attention on so-called core-shell spheres.
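For a sense of scale, the idealized acceleration of a perfectly reflective laser-driven sail follows from radiation pressure, a = 2RP/(mc). The beam power and craft mass below are illustrative round numbers, not figures from the paper:

```python
def sail_acceleration(power_w, mass_kg, reflectivity=1.0):
    """Ideal light-sail acceleration: a = 2 * R * P / (m * c).
    The factor of 2 assumes photons reflect straight back."""
    c = 2.998e8  # speed of light, m/s
    return 2.0 * reflectivity * power_w / (mass_kg * c)

# Illustrative: a 100 GW beam on a 1 g sail-plus-probe
a = sail_acceleration(100e9, 0.001)
print(a)                   # ≈ 6.7e5 m/s^2
print(0.2 * 2.998e8 / a)   # ≈ 90 s to reach 0.2c at constant power
```

The appearance of reflectivity in the numerator is why the choice of sail material, the subject of the paper, is so central: any light absorbed instead of reflected both halves the thrust and heats the sail.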
The structures are based on a matrix design with its origins in Mie theory, a mathematical framework developed by German physicist Gustav Mie in 1908 to describe how spherical particles scatter electromagnetic waves such as light. In their study, the team explore the reflective properties and acceleration times of spheres made from aluminium, silicon, silicon dioxide and various combinations of these.
The results were promising, with a shell composed of a silicon and silicon dioxide combination yielding the best performance. The work offers significant insight into the structure of materials for light sails. While not a definitive outcome, it shows that core-shell spheres, a previously unexplored area of light-sail physics, are a promising avenue for future experimental work.
The post What Should Light Sails Be Made Out Of? appeared first on Universe Today.
Why does everything look flat even though the Earth is round?
A Giant Meteorite Impact 3.26 Billion Years Ago Helped Push Life Forward
The Earth has always been bombarded with rocks from space, though far more rocks were flying around the Solar System during earlier periods of its history. A team of researchers has been studying a meteorite impact from 3.26 billion years ago. They calculate this rock was 200 times bigger than the one that wiped out the dinosaurs. The event would have triggered tsunamis, mixing up the oceans and flushing debris from the land. The newly available organic material allowed organisms to thrive.
Meteorite impacts are a common event, and it’s not unusual to see these rocks from space whizzing through the atmosphere. Giant meteorite impacts are an important part of Earth’s geological history. The impacts release colossal amounts of energy that can destroy life, create wildfires and tsunamis, and eject dust into the atmosphere. The Chicxulub impact around 66 million years ago is perhaps the best known of these and wiped out the dinosaurs. The study of these interplanetary wanderers is imperative as we strive to protect ourselves from potential impactors that pose a threat to human life.
A bright meteor caught by one of the Global Fireball Network’s cameras from the Rancho Mirage Observatory (Eric McLaughlin) on April 7, 2019. Credit: NASA Meteorite Tracking and Recovery Network.
Impacts like these have had a massive effect on the development of Earth and its suitability for life. Geological studies of rocks from the Archean Eon have revealed 16 major impacts with impactors measuring at least 10 km in diameter. At the time of impact the effects can be devastating, but over time there can be benefits to life, although this is not well understood. In a paper published in Earth, Atmospheric and Planetary Sciences, a team led by Nadja Drabon from Harvard University explores rocks from an event 3.26 billion years ago.
Known as the S2 event, the impactor is believed to have been a carbonaceous chondrite between 37 and 58 km in diameter. It is thought to have exploded over what is now South Africa, with debris landing in the ocean and causing giant tsunamis. The impact mixed iron(II)-rich deep waters with the iron(II)-poor shallower waters. It would also have heated the water, leading to partial evaporation of surface water and a temporary increase in erosion around coastal areas.
A three-dimensional cross-section of the hydrothermal system in the Chicxulub impact crater and its seafloor vents. The system has the potential for harboring microbial life. Illustration by Victor O. Leshyk for the Lunar and Planetary Institute.
Perhaps one of the most valuable effects of the impact was the injection of phosphorus into the atmosphere, with a positive impact on the Earth’s habitability for life. Study of the rock layers above the one laid down by the S2 event reveals an increased amount of nutrients and iron, which helped microbial life to thrive.
The study helps build a clearer understanding of how giant impacts can aid the development of life, though the outcome depends on the impactor’s size, type and material, and the state of the atmosphere before the event. The S2 event seems to have had a mixed effect on early life, in particular marine life: some forms were positively impacted, while others experienced challenges. Marine life that relies upon sunlight to survive (the phototrophs) was affected by the darkness, while organisms living at lower depths were less influenced. The detrimental atmospheric effects were likely short-lived, lasting perhaps just a few years before conditions recovered, causing only a temporary impact on marine life. But the injection of phosphorus into the atmosphere would have had far longer-term benefits for life.
Source : Effect of a giant meteorite impact on Paleoarchean surface environments and life
The post A Giant Meteorite Impact 3.26 Billion Years Ago Helped Push Life Forward appeared first on Universe Today.
Space-flown Choctaw Nation seeds to be planted on Earth for STEM experiment
'Lunik Heist:' A real-life CIA rocket kidnapping goes to Hollywood
America’s Particle Physics Plan Spans the Globe — and the Cosmos
RALEIGH, N.C. — Particle physicist Hitoshi Murayama admits that he used to worry about being known as the “most hated man” in his field of science. But the good news is that now he can joke about it.
Last year, the Berkeley professor chaired the Particle Physics Project Prioritization Panel, or P5, which drew up a list of multimillion-dollar physics experiments that should move ahead over the next 10 years. The list focused on phenomena ranging from subatomic smash-ups to cosmic inflation. At the same time, the panel also had to decide which projects would have to be left behind for budgetary reasons, which could have turned Murayama into the Dr. No of physics.
Although Murayama has some regrets about the projects that were put off, he’s satisfied with how the process turned out. Now he’s just hoping that the federal government will follow through on the P5’s top priorities.
Berkeley particle physicist Hitoshi Murayama speaks at the ScienceWriters 2024 conference in Raleigh, N.C. (Photo by Alan Boyle)
“There are five actually exciting projects we think we can do within the budget program,” Murayama said this week during a presentation at the ScienceWriters 2024 conference in Raleigh. Not all of the projects recommended for U.S. funding are totally new — and not all of them are based in the U.S. Here’s a quick rundown:
- Looking for dark matter: About 85% of all the matter in the universe is thought to exist in an invisible form that so far has been detectable only through its gravitational effect. For years, an experiment being conducted in a converted South Dakota gold mine has been looking for traces of dark matter’s interactions with a huge reservoir of liquid xenon. The experiment hasn’t yet found anything, but Murayama said the P5 panel supports the idea of boosting the reservoir from seven tons to on the order of 70 tons and intensifying the search.
- Following up on the Higgs boson: The discovery of the Higgs boson in 2012 provided the last missing piece in the Standard Model of particle physics, one of science’s most successful theories. But physicists don’t have a good grip on how the Higgs works. “You’d like to mass-produce this Higgs boson and study its properties in great detail, so we know how it got stuck and frozen into space, so that we can stay in one place,” Murayama said. That would require building a bigger particle collider, capable of smashing electrons and positrons — but the P5 panel determined that such a machine couldn’t be built in the U.S. Instead, the panel recommends supporting an “offshore Higgs factory” like the FCC-ee facility that CERN is considering, or the International Linear Collider that’s been proposed for construction in Japan.
- Studying the nature of neutrinos: The Big Bang is thought to have created equal amounts of matter and antimatter, which would theoretically annihilate each other. Fortunately for us, matter won out rather than being totally annihilated. How did it happen? “The only candidate elementary particle we know who might have done this is actually neutrinos,” Murayama said. “How do we know if that’s really the case? One thing we try to do is to look at the behavior of neutrinos by creating them in Illinois and shooting them to a location in South Dakota, because neutrinos can pass through the dirt without any problems.” The Deep Underground Neutrino Experiment is under construction, and excavation of the Long-Baseline Neutrino Facility was recently completed in South Dakota. The P5 report proposes upgrading DUNE’s capabilities.
- Getting a neutrino view of the cosmos: The P5 panel also called for a dramatic expansion of the IceCube Neutrino Observatory in Antarctica. “They managed to peer into the supermassive black hole in a nearby galaxy, and for the first time, they even took a picture of a galactic disk using neutrinos as well,” Murayama said. “So this is finally becoming a true tool to observe the universe in a different way from what we do with older telescopes.”
- Seeking signs of cosmic inflation: A widely held theory asserts that in the instant after the Big Bang, the universe inflated at a prodigious rate to “lock in” the slight perturbations that scientists see in the cosmic microwave background radiation. In 2014, astronomers claimed that an experiment at the South Pole had picked up evidence of that primordial cosmic inflation, but months later, they had to back away from those claims. The Antarctic studies are continuing, however, and the P5 panel supported an experiment known as CMB-S4 that would widen the search for evidence. “For that, we need two sites, one in Chile, another at the South Pole,” Murayama said.
In addition to the top five projects, the panel endorsed a longer-term effort to develop an advanced particle accelerator that would produce collisions between subatomic particles known as muons. Such a machine would increase the chances of finding new frontiers in physics in the 2030s, Murayama said.
“We call this a ‘muon shot,’ like a moonshot,” he said. “We don’t know quite well if we can really get there, but as you work toward it, that would end up producing so many interesting things on the way, more science and more technologies.”
Will the P5’s priorities prevail? That’s up to the U.S. Department of Energy and the National Science Foundation, which must decide what to do with the physicists’ recommendations. Success isn’t guaranteed: For example, NSF put the CMB-S4 experiment on hold in May to focus instead on upgrading aging infrastructure at its Antarctic facilities.
Looking ahead, it’s not yet clear how particle physics will fare when Donald Trump returns to the White House. For what it’s worth, the price tags for four of the projects add up to more than $2.5 billion over the course of several years. The cost of the offshore Higgs factory is certain to amount to billions more.
Murayama called attention to an issue that could affect IceCube, CMB-S4 and other Antarctic research in the nearer term. “There is a fleet of cargo airplanes that is owned by the U.S. Air Force that actually served us well over many decades,” he said. “But they were built back in the ’70s, and they’re about to retire, and right now there are no plans to replace them. Then we will lose access.”
Senate Majority Leader Chuck Schumer, D-N.Y., managed to get a $229 million appropriation for new planes into the Senate’s version of the defense budget bill for the current fiscal year, but the House still has to take action. That sets up a bit of a congressional cliffhanger for the weeks and months ahead.
“I don’t get a good sense of the priority,” Murayama confessed. “But this is supposed to be part of the defense budget, which is way bigger than the science budget — so in that part, it’s peanuts. Hopefully, it just can get in and get funded.”
For a critical perspective on the P5 wish list, check out physicist Sabine Hossenfelder’s YouTube video:
Alan Boyle is a volunteer board member for the Council for the Advancement of Science Writing, which was one of the organizers of the ScienceWriters 2024 conference.
The post America’s Particle Physics Plan Spans the Globe — and the Cosmos appeared first on Universe Today.
Millions of Phones Could Map the Earth’s Ionosphere
We are all familiar with the atmosphere of the Earth and part of this, the ionosphere, is a layer of weakly ionized plasma. It extends from 50 to 1,500 km above the planet. It’s a diffuse layer but sufficient to interfere with satellite communications and navigation systems too. A team of researchers have come up with an intriguing idea to utilise millions of mobile phones to help map the ionosphere by relying on their GPS antennas.
The ionosphere is a layer of the Earth’s atmosphere where radiation ionizes atoms and molecules. Incoming solar radiation is the primary cause, energising gases so that they lose electrons and become electrically charged. The process creates a region of charged particles, or ions, known as a plasma. The ionosphere is a key part of radio communications, since its ionized particles reflect and refract radio waves back to Earth, facilitating long-distance communication. Its density, and perhaps surprisingly its composition, change as solar activity waxes and wanes.
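The reflection behaviour follows from the plasma frequency of the layer: radio waves below it bounce back, waves above it pass through. A small sketch, using a typical textbook daytime electron density rather than a value from the study:

```python
import math

def plasma_frequency_hz(electron_density_m3):
    """Critical frequency of a plasma: f_p ≈ 8.98 * sqrt(n_e) Hz,
    with n_e in electrons per cubic metre. Radio waves below f_p
    are reflected back toward the ground."""
    return 8.98 * math.sqrt(electron_density_m3)

# A daytime F-layer density around 1e12 electrons/m^3 reflects
# signals below roughly 9 MHz — the basis of shortwave skywave links
print(plasma_frequency_hz(1e12) / 1e6)  # ≈ 8.98 MHz
```

GPS signals, at over 1 GHz, sit far above this critical frequency, so they pass through the layer; they are delayed and refracted rather than reflected.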
A view of Earth’s atmosphere from space. Credit: NASA
In a paper recently published in Nature, a team of researchers at Google have used data from over 40 million mobile phones to map conditions in the ionosphere. The concept of using crowdsourced signals is an intriguing one, and the study will help to improve satellite navigation and our understanding of the upper regions of our atmosphere. We still don’t have a full understanding of the properties of the ionosphere across regions like Africa and South America, so this study will fill significant gaps.
The ionosphere can slow down radio signals travelling to Earth from satellites, in particular from GPS and other navigation satellites. These navigation signals rely heavily upon timing, with nanosecond precision. That precision gives systems the ability to pinpoint location with incredible accuracy, but an accurate model of the ionosphere is key to that success.
NavCube, the product of a merger between the Goddard-developed SpaceCube 2.0 and Navigator GPS technologies, could play a vital role helping to demonstrate X-ray communications in space — a potential NASA first. Credit: NASA/W. Hrybyk
Using data from ground-based stations, engineers can create real-time maps of ionospheric density. To do this, data is received across two different frequencies from the same satellite and the arrival of each signal is timed. Depending on the density of the ionosphere, the low-frequency waves are slowed down more than the high-frequency signals. Not taking this into account could put GPS and navigation systems out by 5 metres or more.
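The standard first-order model puts the ionospheric group delay at 40.3 × TEC / f², where TEC is the total electron content along the signal path; because the delay depends on frequency, timing two frequencies lets a receiver solve for TEC and remove the error. A sketch with an illustrative moderate-daytime TEC value:

```python
def iono_delay_m(tec_el_m2, freq_hz):
    """First-order ionospheric group delay in metres: 40.3 * TEC / f^2."""
    return 40.3 * tec_el_m2 / freq_hz**2

TEC = 1e17                      # 10 TECU, an illustrative daytime value
L1, L2 = 1.57542e9, 1.22760e9   # GPS L1 and L2 carrier frequencies, Hz

d1 = iono_delay_m(TEC, L1)
d2 = iono_delay_m(TEC, L2)
# The lower frequency is delayed more; the difference reveals the TEC
print(round(d1, 2), round(d2, 2))  # ≈ 1.62 m vs 2.67 m
```

A metre-scale uncorrected delay is exactly the 5-metre-class error the article describes for ionospheric conditions a model fails to capture.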
Receiving multiple frequencies is within the capability of most mobile phones, and it is this that has been the focus of the study. There is, however, a degree of noise in the data received by mobile phones, but the team at Google found that combining the signals of large numbers of phones reduces the noise.
The study currently works only with Android phones. Anyone who allows their sensor data to be shared can contribute. The data has already revealed plasma in the ionosphere over South America that had not been seen before.
Source : Mapping the ionosphere with millions of phones
The post Millions of Phones Could Map the Earth’s Ionosphere appeared first on Universe Today.
Detecting Primordial Black Hole Mergers Might be Within Our Grasp
Imagine a black hole with the mass of the asteroid Ceres. It would be no larger than a bacterium and practically undetectable. But if such black holes are common in the Universe, they would affect the motions of stars and galaxies, just as we observe. Perhaps they are the source of dark matter.
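The bacterium-sized claim follows directly from the Schwarzschild radius, r_s = 2GM/c². A quick check using Ceres' approximate mass:

```python
def schwarzschild_radius_m(mass_kg):
    """Schwarzschild radius r_s = 2 * G * M / c^2, in metres."""
    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8    # speed of light, m/s
    return 2 * G * mass_kg / c**2

CERES_MASS_KG = 9.4e20
# A Ceres-mass black hole has a radius of about 1.4 micrometres,
# roughly the size of a bacterium
print(schwarzschild_radius_m(CERES_MASS_KG))
```

Since the radius scales linearly with mass, the whole asteroid-mass window corresponds to black holes between roughly molecular and bacterial sizes.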
Such tiny black holes could not form from dying stars, but they might have formed within the hot, dense cosmos soon after the Big Bang. For this reason, they are known as primordial black holes. We have no evidence they exist, but since they would be such a great explanation for dark matter, astronomers keep looking.
The one thing we know at this point is that most primordial black holes are ruled out by the data. Large, almost stellar mass black holes would affect the clustering of galaxies in a way we don’t observe. Tiny black holes of mountain mass or smaller would have evaporated long ago, making them useless as a dark matter candidate. But asteroid mass black holes are still possible. They aren’t likely, but they haven’t been formally excluded by the data. So a new study looks at how asteroid mass primordial black holes might be detected through gravitational waves.
The size and lifetime of primordial black holes by mass. Credit: NASA’s Goddard Space Flight Center
To account for dark matter, the smaller the primordial black hole, the more common they must be. For asteroid masses, the cosmos would need to contain a vast sea of them. Since they would cluster within galaxies, they would be common enough for some of them to merge on a regular basis. As the study points out, each of these mergers would produce a gravitational chirp similar to the ones we have observed with stellar-mass black holes. They would just have a much higher frequency and be more common.
The frequency of these primordial chirps would be too high for current observatories such as LIGO to observe, but the authors point out that some current dark matter experiments might be able to observe them. One alternative model for dark matter involves a hypothetical particle known as the axion. Axions were originally proposed to solve some issues in high-energy particle physics, and while they have fallen out of popularity in particle physics, they’ve gained some popularity in cosmology. We have made a few attempts to detect axions, but with no success so far. In their paper, the authors show how axion experiments could be tweaked slightly to observe the chirps of primordial black hole mergers in ideal conditions.
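Just how high those chirp frequencies would be can be estimated from the gravitational-wave frequency at the innermost stable circular orbit (ISCO), which scales inversely with total mass. This is a standard order-of-magnitude approximation, not a formula taken from the paper:

```python
import math

def gw_frequency_at_isco_hz(total_mass_kg):
    """Order-of-magnitude gravitational-wave frequency near merger:
    f ≈ c^3 / (6**1.5 * pi * G * M). Lighter binaries chirp higher."""
    G = 6.674e-11  # gravitational constant
    c = 2.998e8    # speed of light
    return c**3 / (6**1.5 * math.pi * G * total_mass_kg)

# Two Ceres-mass (~9.4e20 kg) black holes merge at terahertz
# frequencies, far above LIGO's ~kHz band
print(gw_frequency_at_isco_hz(2 * 9.4e20))
```

Terahertz gravitational waves are hopeless for laser interferometers, which is why the authors turn instead to tabletop axion-style detectors.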
The chances of success are pretty slim. It would be odd for primordial black holes to exist in the only allowed mass range and nowhere else, and the conditions we could observe would be pretty narrow. But it might be worth doing a search on the off chance. The nature of dark matter remains a huge mystery in astronomy, so we don’t have much to lose in trying the occasional long-shot idea.
Reference: Profumo, Stefano, et al. “The Maximal Gravitational Wave Signal from Asteroid-Mass Primordial Black Hole Mergers.” arXiv preprint arXiv:2410.15400 (2024).
The post Detecting Primordial Black Hole Mergers Might be Within Our Grasp appeared first on Universe Today.