Are black holes expected to contain the same ratio of dark matter to regular matter as the rest of the universe? I've heard that dark matter is distributed in halos around galaxies. Does that make it less likely to be ingested into a black hole?
(Short answer: No, scroll to the last point.)
- It is irrelevant to an external observer whether the matter that fell into the black hole was dark matter or baryonic, by the no hair theorem. The only properties of a black hole from our point of view are mass, electric charge and angular momentum. (But of course we don't understand quantum gravity.)
- From the point of view of matter which has fallen into the black hole, nothing special happens upon crossing the event horizon. This means that dark matter stays dark and baryonic matter stays baryonic when viewed from inside the black hole.
- There is some controversy about how much dark matter exists in the universe. This recent article, for example, indicates in the abstract that more accurate modelling of galactic rotation curves could eliminate a large percentage of the expected non-baryonic dark matter. (Note, as @pela indicated in the comments, that this author's papers have not been peer reviewed and could be suspect.) Obviously, the amount of dark matter in the universe would greatly affect the question's answer. I should note that the controversy is mostly driven by a small number of vocal scientists who appear disproportionately in the media. Following the mainstream news science sections, I get the impression that the death of dark matter is announced once a month or so.
- The formation of supermassive black holes is poorly understood. One hypothesis is that they may form by successive merger of stellar mass black holes. As there have recently been gravitational wave observations of such mergers, and as candidates for intermediate mass black holes have also been observed recently, I will assume here that this is how they form and that supermassive black holes are therefore made of roughly the same stuff as stellar mass black holes.
- Black holes lose most of their mass during the formation process. It is important to always keep in mind whether we are talking about the mass of the stellar core which collapsed to form the black hole (this is often the "mass" of a black hole that is referred to when speaking about e.g. the minimum size black hole that can form from core collapse) or the mass of the black hole as seen by a distant observer after the supernova.
- Dark matter particles cannot lose much orbital energy by interacting with other matter or by radiation, and will therefore remain in orbit around a black hole rather than falling in, unless by unlikely chance they happen to pass very close to the event horizon. This paper indicates that simulated supermassive black holes derive no more than about 10% of their mass from dark matter.
However, it must be said that some scientists suspect that dark matter is made of primordial black holes in the first place. There is also the theory of MACHOs (Massive Compact Halo Objects), which holds that dark matter is composed of large compact bodies such as black holes, but most believe this theory cannot account for the dark matter in the universe.
Dark matter is (thought to be) in halos which extend both to the centers of galaxies and outside most of the normal matter in galaxies (gas, stars, dust). So a black hole inside a galaxy could and undoubtedly will ingest some dark matter. However:
Stellar-mass black holes form from the core-collapse of a massive star. Since stars are made almost entirely of regular matter, the initially formed BH remnant would itself be made almost entirely of regular matter. Such BHs might later grow by accreting gas (e.g., from a close binary companion star), in which case they're gaining mass in the form of regular matter. There would inevitably be some dark matter swallowed by the BH as it orbited within its parent galaxy -- just as the BH would swallow some interstellar dust, for example. But it would still be overwhelmingly formed out of regular matter.
Supermassive black holes in galaxy centers would probably start out from some kind of early-universe collapse of a gas cloud or very massive star, which would again be mostly regular matter. Subsequent growth of supermassive BHs comes primarily from interstellar gas feeding an accretion disk around the BH, plus the occasional star that wanders too close -- so once again it's mostly regular matter that falls in the black hole. (The central regions of galaxies do have some dark matter, but they're dominated by regular matter. Plus, regular matter in the form of gas clouds can easily lose energy via cloud-cloud collisions and sink to the center of the galaxy, where it could feed a supermassive BH; dark matter can't do this.)
(Of course, as user25972 points out, it's largely irrelevant to outsiders like us what kind of matter goes into making a BH. A black hole formed out of dark matter would behave identically to one formed out of regular matter.)
Galaxies are born out of primordial fluctuations with an evolution probably driven by gravitation as the dominant effect. Gravitation, as a geometric concept, has the same effect on the different types of particles. Some forces other than gravitation, such as the interaction with photons, dissipative effects, magnetic fields, etc., could also have an influence and act on the involved particles differentially, but an overall trend for galaxies and clusters to have a similar composition to the general composition of the Universe is to be expected.
Our knowledge about the composition of the Universe has changed in recent times with respect to the classical view, summarized, for instance, by Schramm (1992). This new conception has been reviewed, for instance, by Turner (1999a, b). The dominant matter is considered to be cold dark matter (CDM), consisting of particles moving slowly, so that the CDM energy density is mainly due to the particles' rest mass. There is a long list of candidates for the CDM particle, with axions and neutralinos being the most attractive possibilities.
Big Bang nucleosynthesis studies have been able to accurately determine the baryon density as Ω_B = (0.019 ± 0.0012) h^-2. The cluster baryon fraction has also been accurately determined, by X-ray and Sunyaev-Zel'dovich measurements, to be f_B = (0.07 ± 0.007) h^-3/2 and, assuming that rich clusters provide a fair sample of matter in the Universe, also Ω_B/Ω_M = f_B, from which it follows that Ω_M = (0.27 ± 0.05) h^-1/2. The Universe is, however, flat, Ω = 1, with the CMB spectrum being a sensitive indicator. Therefore Ω = 1 = Ω_M + Ω_Λ, where Ω_Λ ≈ 0.7 represents the contribution of the vacuum energy, or rather, the contribution of the cosmological term Λ. With this high value of Ω_Λ the Universe should be in accelerating expansion, which has been confirmed by the study of high-redshift supernovae, which also suggest Ω_Λ ≈ 0.7 (Perlmutter, Turner and White, 1999; Perlmutter et al. 1999). The stellar or visible matter is estimated to be Ω_vis = 0.003-0.006. All these values can be rounded into a list that is easier to remember, with values compatible with the above figures, adopting H_0 = 65 km s^-1 Mpc^-1 (h = 0.65): less precise, but useful for fast exploratory calculations.
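The scalings above can be turned into exactly this kind of rough, easy-to-remember budget. A minimal sketch in Python, assuming h = 0.65 and the quoted central values (the variable names are mine, not the review's):

```python
# Rough energy budget of the Universe from the scalings quoted above.
# A sketch for fast exploratory calculations only, not precision cosmology.

h = 0.65

omega_B = 0.019 * h**-2       # baryon density from Big Bang nucleosynthesis
f_B = 0.07 * h**-1.5          # cluster baryon fraction (X-ray + Sunyaev-Zel'dovich)
omega_M = omega_B / f_B       # total matter, if clusters sample the Universe fairly
omega_Lambda = 1.0 - omega_M  # flat Universe: Omega = 1

print(f"Omega_B      ~ {omega_B:.3f}")       # ~0.045
print(f"Omega_M      ~ {omega_M:.2f}")       # ~0.34
print(f"Omega_Lambda ~ {omega_Lambda:.2f}")  # ~0.66
```

Note that omega_M here agrees with the quoted Ω_M = 0.27 h^-1/2 ≈ 0.33 for h = 0.65, as it must, since both come from the same f_B relation.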
A large cluster should have more or less this composition, including the halo of course, even if a halo could contain several baryonic concentrations or simply none. Therefore, a first direct approach to the problem suggests that halos are non-baryonic, with baryonic matter being a minor constituent.
This is also the point of view assumed by most current theoretical models (this will be considered later, in Section 4.2.2), which follow the seminal papers by Press and Schechter (1974) and White and Rees (1978). We advance the comment that, in these models, a dominant collisionless, non-dissipative cold dark matter is the main ingredient of halos, while baryons, probably simply gas, constitute the dissipative component, able to cool, concentrate, fragment and produce stars. Some gas can be retained mixed in the halo, and therefore halos would be constituted of non-baryonic matter plus small quantities of gas, the gas fraction decreasing with time, while mergers and accretion would provide increasing quantities to the visible disks and bulges. Therefore, a first approach suggests that galactic dark matter is mainly non-baryonic, which can be considered the standard description. Baryons, and therefore visible matter, may not have condensed completely within a large DM halo, and therefore the baryon/DM ratio should be similar in the largest halos and in the whole Universe, although this ratio could be different in normal galaxies.
However, other interesting possibilities have also been proposed. The galactic visible/dark matter fraction depends very much on the type of galaxy, but a typical value could be 0.1. This is also approximately the visible/baryonic matter fraction in the Universe, which has led some authors to think that the galactic dark matter is baryonic (e.g. Freeman, 1997), in which case the best candidates would be gas clouds, stellar remnants or substellar objects. The stellar remnants present some problems: white dwarfs require unjustified initial mass functions; neutron stars and black holes would have produced much more metal enrichment. We cannot account for the many different possibilities explored here. Substellar objects, like brown dwarfs, are an interesting identification of MACHOs, the compact objects producing microlensing of background stars. Alcock et al. (1993), Aubourg et al. (1993) and others have suggested that MACHOs could provide a substantial amount of the halo dark matter, as much as 50-60% for masses of about 0.25 M☉, but the results depend very much on the model assumed for the visible and dark matter components, and are still uncertain. Honma and Kan-ya (1998) argued that if the Milky Way does not have a flat rotation curve out to 50 kpc, brown dwarfs could account for the whole halo, in which case the Milky Way mass is only 1.1 × 10¹¹ M☉.
Let us then briefly comment on the possibility of dark gas clouds, as defended by Pfenniger and Combes (1994), Pfenniger, Combes and Martinet (1994) and Pfenniger (1997). They have proposed that spiral galaxies evolve from Sd to Sa, i.e. the bulge and the disk both increase while, at the same time, the M/L ratio decreases. Sd galaxies are richer in gas than Sa galaxies. It is then tempting to conclude that dark matter gradually transforms into visible matter, i.e. into stars, and that the dark matter should therefore be identified with gas. Why, then, can we not see that gas? Such a scenario could be the case if molecular clouds possessed a fractal structure from 0.01 to 100 pc. Clouds would be fragmented into smaller, denser and colder sub-clumps, with a fractal dimension of 1.6-2. Available millimeter radiotelescopes are unable to detect such very small clouds. This hypothesis would also explain Bosma's relation between dark matter and gas (Section 2.3), because dark matter would, in fact, be gas (the observable HI disk could be the observable atmosphere of the dense molecular clouds). In this case, the dark matter would have a disk distribution.
The identification of disk gas as galactic dark matter was first proposed by Valentijn (1991) and was later analyzed by González-Serrano and Valentijn (1991), Lequeux, Allen and Guilloteau (1993), Pfenniger, Combes and Martinet (1994), Gerhard and Silk (1996) and others. H₂ could be associated with dust, producing a colour dependence of the radial scale length compatible with large amounts of H₂. Recently, Valentijn and van der Werf (1999) detected rotational lines of H₂ at 28.2 and 17.0 μm in NGC 891 with ISO, compatible with the required dark matter. If confirmed, this experiment would be crucial, demonstrating that a visible baryonic disk component is responsible for the anomalous rotation curve, and showing the fragility of apparently solid theories. Confirmation in other galaxies could be difficult, as the H₂ in NGC 891 seems to be exceptionally warm (80-90 K).
A disk distribution is, indeed, the most audacious statement of this scenario. Olling (1996) deduced that the galaxy NGC 4244 has a flaring that requires a flattened halo. However, this analysis needs many theoretical assumptions; for example, the condition of vertical hydrostatic equilibrium requires further justification, particularly considering that NGC 4244 is an Scd galaxy, and vertical outflows are more important in late-type galaxies. Warps have also been used to deduce the shape of the halo. Again, Hofner and Sparke (1994) found that only one galaxy (NGC 2903) out of the five studied had a flattened halo. In this paper, a particular model of warps is assumed (Sparke and Casertano, 1988), but there are other alternatives (Binney 1991, 1992). The Sparke and Casertano model seems to fail once the response of the halo to the precession of the disk is taken into account (Nelson and Tremaine, 1995; Dubinski and Kuijken, 1995). Kuijken (1997) concludes that "perhaps the answer lies in the magnetic generation of warps" (Battaner, Florido and Sanchez-Saavedra 1990). On the other hand, if warps are a deformation of the part of the disk that is already gravitationally dominated by the halo, the deformation of the disk would be a consequence of departures from symmetry in the halo. To isolate disk perturbations embedded in a perfect unperturbed halo is unrealistic. Many other proposals have been made to study the shape of the halo, most of which are reviewed in the cited papers by Olling and in Ashman (1982), but very different shapes have been reported (see Section 3.4).
There is also the possibility that a visible halo component has already been observed (Sackett et al. 1994; Rausher et al. 1997), but due to the difficulty of working at these faint levels, this finding has yet to be confirmed.
Many other authors propose that the halo is baryonic, even if new models of galactic formation and evolution would then need to be developed (de Paolis et al. 1997). This is partly based on the fact that all the dark matter "observed" in galaxies and clusters could be accounted for by baryonic matter alone. Under the interpretation of de Paolis et al. (1995), small dense clouds of H₂ could also be identified with dark matter, and even be responsible for microlensing, but instead of being distributed in the disk, they would lie in a spherical halo.
Dark fluid theory
Farnes' new theory says that 95 percent of the cosmos is made up of a "dark fluid," and dark matter and dark energy are effectively both "symptoms" of that underlying phenomenon. It does do a good job of describing both of those, although it requires a little number-fudging of its own.
This dark fluid would need to have negative mass. That alone sounds like a sci-fi concept – how can something have a mass of -1 kg? But according to Newtonian physics it's entirely possible, albeit still hypothetical.
Something that has negative mass would have some pretty weird characteristics. For one, forces are inverted, so if you were to push a ball with negative mass it would accelerate towards your hand, instead of away from it. That also means it exhibits a kind of negative gravity, which repels other material instead of attracting it.
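The force inversion follows directly from Newton's second law, a = F/m: flip the sign of the mass and the acceleration flips too. A trivial sketch:

```python
# Toy illustration of negative inertial mass under Newton's second law,
# a = F / m (1D).  With m < 0, the acceleration points opposite the applied
# force, so pushing the ball makes it accelerate toward your hand.

def acceleration(force: float, mass: float) -> float:
    """Newton's second law: a = F / m."""
    return force / mass

push = +10.0  # N, directed away from your hand

print(acceleration(push, mass=+1.0))  # +10.0 m/s^2: accelerates away
print(acceleration(push, mass=-1.0))  # -10.0 m/s^2: accelerates toward you
```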
If the cosmos is filled with dark fluid, its negative gravity would be pushing everything away from everything else – exactly the observed phenomenon that dark energy was invented to explain. Meanwhile, it's not the gravitational pull of a dark matter halo that's holding galaxies together – it's the negative "push" of the dark fluid surrounding them. Galaxies of regular matter are basically bubbles floating in a cosmological dark fluid.
How many of you think that Dark matter is just a misconception?
I would like to hear people's thoughts on rejecting dark matter: is it some new unknown stuff, or a misconception about something we have gotten wrong?
Below is basically a historical approach to why we believe in dark matter. I will also cite this paper for the serious student who wants to read more, or who wants to check my claims against the literature.
In the early 1930s, a Dutch scientist named Jan Oort found that there are objects in galaxies moving faster than the escape velocity of those same galaxies (given the observed mass). He concluded there must be unobservable mass holding these objects in, and published his theory in 1932.
Evidence 1: Objects in galaxies often move faster than the escape velocities but don't actually escape.
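Oort's argument can be sketched with a toy calculation: compare an observed speed with the escape speed v_esc = sqrt(2GM/r) implied by the visible mass, then ask how much mass would be needed to keep the object bound. The numbers below are illustrative assumptions, not Oort's data:

```python
import math

# Sketch of the escape-velocity argument.  If an object at radius r moves
# faster than v_esc = sqrt(2*G*M/r) for the visible mass M, extra unseen
# mass is needed to bind it.  All numbers are illustrative assumptions.

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # kg
KPC = 3.086e19      # meters per kiloparsec

def escape_speed(mass_kg: float, r_m: float) -> float:
    """Escape speed v_esc = sqrt(2GM/r), in m/s."""
    return math.sqrt(2 * G * mass_kg / r_m)

def mass_to_bind(v_obs: float, r_m: float) -> float:
    """Minimum enclosed mass (kg) so that v_obs does not exceed v_esc."""
    return v_obs**2 * r_m / (2 * G)

M_visible = 1e10 * M_SUN   # assumed visible mass
r = 10 * KPC               # assumed radius
v_obs = 200e3              # assumed observed speed, m/s

print(escape_speed(M_visible, r))      # ~9.3e4 m/s -- slower than v_obs
print(mass_to_bind(v_obs, r) / M_SUN)  # ~4.6e10 solar masses needed
```

Since the observed speed exceeds the escape speed, the enclosed mass must be several times the visible mass, which is the shape of Oort's conclusion.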
Zwicky, also in the 1930s, found that galaxies have much more kinetic energy than could be explained by the observed mass and concluded there must be some unobserved mass, which he dubbed "dark matter" (coining the term).
Evidence 2: Galaxies have more kinetic energy than "normal" matter alone would allow for.
Vera Rubin then decided to study what are known as the 'rotation curves' of galaxies. The measured velocity as a function of distance from the center is very different from what is predicted from the observed matter. She concluded that something like Zwicky's proposed dark matter was needed to explain this.
Evidence 3: Galaxies rotate differently than "normal" matter alone would allow for.
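Rubin's discrepancy can be sketched numerically: the visible mass alone predicts a Keplerian circular speed v = sqrt(GM/r) that falls off with radius, while observed curves stay roughly flat. A toy comparison, treating an assumed visible mass of ~10¹¹ solar masses as a central point mass (a crude assumption used only to show the qualitative gap, not real data):

```python
import math

# Keplerian circular speed from visible mass alone, v = sqrt(G*M/r),
# versus the roughly flat rotation curves actually observed.
# M_visible and the "observed" flat speed are illustrative assumptions.

G = 6.674e-11     # m^3 kg^-1 s^-2
M_visible = 2e41  # kg, roughly 1e11 solar masses (assumed)
KPC = 3.086e19    # meters per kiloparsec

def keplerian_speed(r_kpc: float) -> float:
    """Circular speed in km/s predicted from the visible mass alone."""
    r = r_kpc * KPC
    return math.sqrt(G * M_visible / r) / 1e3

for r_kpc in (5, 10, 20, 40):
    print(f"r = {r_kpc:2d} kpc: predicted {keplerian_speed(r_kpc):5.1f} km/s, "
          f"observed ~ flat (e.g. ~220 km/s)")
```

The prediction falls as 1/sqrt(r) while the observation does not; the flat curve is what a massive extended halo would produce.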
In 1979, D. Walsh et al. were among the first to detect the gravitational lensing predicted by relativity. One problem: the amount of light that is lensed is much greater than would be expected from the known observable matter. However, if you add the exact amount of dark matter that fixes the rotation curves above, you get the exact amount of expected gravitational lensing.
Evidence 4: Galaxies bend light greater than "normal" matter alone would allow. And the "unseen" amount needed is the exact same amount that resolves 1-3 above.
By this time people were taking dark matter seriously since there were independent ways of verifying the needed mass.
MACHOs were proposed as a solution (basically normal stars that are just too faint to see from Earth), but recent surveys have ruled this out: as our sensitivity to these objects increases, we still don't see the "missing" stars that could explain the issue.
Evidence 5: Our telescopes are orders of magnitude better than in the 30s, and the better we look, the more it is confirmed that unseen "normal" matter is never going to solve the problem.
The ratio of deuterium to hydrogen in the present universe is known to be proportional to the density of the universe. The observed ratio was discovered to be inconsistent with the observed matter alone, but it was exactly what was predicted if you add the same dark matter to galaxies as the groups above did.
Evidence 6: The deuterium to hydrogen ratio is completely independent of the evidences above and yet confirms the exact same amount of "missing" mass is needed.
The cosmic microwave background's power spectrum is very sensitive to how much matter is in the universe. The power spectrum data can only be explained if the observable matter is roughly 4% of the total energy budget.
Evidence 7: Independent of all observations of stars and galaxies, light from the big bang also calls for the exact same amount of "missing" mass.
It turns out that we can quantify the "shape" of how galaxies cluster with and without dark matter. The "splotchiness" of the clustering seen in the SDSS survey matches the dark matter prediction only.
Evidence 8: Independent of how galaxies rotate, their kinetic energy, etc. is the question of how they cluster together. And observations of clustering confirm the necessity of vast amounts of intervening dark matter.
One of the most convincing recent observations was the Bullet Cluster, as described here. We saw two galaxy clusters collide: the "observed" matter actually underwent a collision, but the gravitational lensing kept moving unimpeded. This matches the belief that the majority of the mass is collisionless dark matter that felt no colliding interaction and passed right on through, bringing the bulk of the gravitational lensing with it.
Evidence 9: When galaxies merge, we can literally watch the collisionless dark matter passing through the other side via gravitational lensing.
In 2009, Penny et al. showed that dark matter is required for fast rotating galaxies to not be ripped apart by tidal forces. And of course, the required amount is the exact same as what solves every other problem above.
Evidence 10: Galaxies experience tidal forces that basic physics says should rip them apart and yet they remain stable. And the amount of unseen matter necessary to keep them stable is exactly what is needed for everything else.
There are counter-theories, but as Sean Carroll shows nicely here, the counter-theories work badly: they don't fit all the data, they are far messier and more complicated, and they continue to be falsified by new experiments.
To the contrary, Zwicky's proposed dark matter model from back in the 1930s continues to both explain and predict everything we observe flawlessly across multiple generations of scientists testing it independently. Hence dark matter is widely believed.
Evidence 11: Dark matter theories have been around for more than 80 years, and not one alternative has ever been able to explain even most of the above. Except the original theory that has predicted it all.
Conclusion: Look, I know people love to express skepticism for dark matter for a whole host of reasons but at the end of the day, the vanilla theories of dark matter have passed literally dozens of tests without fail over many many decades now. Very independent tests across different research groups and generations. So personally I think that we have officially entered a realm where it's important for everyone to be skeptical of the claim that dark matter isn't real. Or the claim that scientists don't know what they are doing.
Also be skeptical when the inevitable media article comes out month after month saying someone has "debunked" dark matter because their theory explains some rotation curve from the 1930s. Skeptical because rotation curves are one of at least a dozen independent tests, not to mention 80 years of solid predictivity.
So there you go. These are some basic reasons to take dark matter seriously.
New findings on makeup of universe may spawn research
New areas of extragalactic study may emerge from research by University of Alabama in Huntsville (UAH) astrophysicists using data from the Chandra X-ray Observatory to conclude that baryons making up all visible matter -- once thought to be missing from clusters -- are present in the expected ratios in large, luminous clusters.
The new research studied very large galaxy clusters and concludes that they indeed contain the proportion of visible matter that is being worked out as part of the Big Bang Theory. The paper was authored by graduate student David Landry with Dr. Massimiliano (Max) Bonamente, UAH associate professor of physics, Paul Giles and Ben Maughan of the University of Bristol, U.K., and Marshall Joy of NASA Marshall Space Flight Center. Dr. David Landry is now a scientist at Corvid Technologies in Huntsville, Ala.
The work may prompt new efforts to explain past research findings that some clusters have a deficit in baryons from what is expected. The universe is composed of about 75 percent dark energy and 25 percent matter. Of the portion that is matter, about 16 percent is the familiar visible matter that is all around us and the remaining 84 percent is dark matter.
"We call it dark matter because we don't know what it is made of, but it is made of some type of particles and it doesn't seem to emit visible energy," said Dr. Bonamente. Together dark energy, dark matter and ordinary baryonic matter form a pie chart of the mass of the universe, where everything has to add up to 100 percent. "We don't know what dark matter is," he said, "but we have the means to put the pie together."
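The pie-chart arithmetic can be checked directly. A quick sketch using the article's rounded percentages (75% dark energy, 25% matter, with 16% of the matter being baryonic):

```python
# The article's "pie chart" of the universe: ~75% dark energy, ~25% matter,
# and of the matter slice, ~16% baryonic and ~84% dark.  A quick check that
# everything adds up to 100% of the energy budget.

dark_energy = 0.75
matter = 0.25
baryon_frac_of_matter = 0.16

baryonic = matter * baryon_frac_of_matter           # share of the total budget
dark_matter = matter * (1 - baryon_frac_of_matter)  # share of the total budget

print(f"baryonic matter: {baryonic:.0%}")     # 4%
print(f"dark matter:     {dark_matter:.0%}")  # 21%
print(f"total:           {dark_energy + baryonic + dark_matter:.0%}")  # 100%
```

The 4% baryonic share is the same figure the CMB power-spectrum argument gives elsewhere in this page.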
While dark energy has a repulsive energy, dark matter and baryonic matter have an attractive force where "everything likes to clump together" to form stars and planets and galaxies, said Dr. Bonamente. Using x-rays, astrophysicists discovered that there is a diffuse hot plasma gas that fills the space between galaxies.
"Basically, the space between galaxies is filled with this hot plasma that is 100 million degrees in temperature," said Dr. Bonamente. Because the gas is so diffuse, it has very low heat capacity. "It is like if I posed this question to you: Which would you rather put your finger in, a boiling cup of water or a room that had been heated to 212 degrees Fahrenheit? You choose the room because the temperature inside it is more diffused than it would be in the concentrated cup of water, and so you can tolerate it."
So why doesn't the hot gas simply escape? "It is bound to the cluster by gravity," said Dr. Bonamente. "With hot gas, you can do two things. You can measure the regular matter, which is the baryon content. And two, since the hot gas is bound, you can measure how much matter it would take to hold the gas and therefore you can tell how much dark matter there is. "All of a sudden, there is something really wonderful about the hot gases," he said. "You can have your cake and eat it, too."
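The second measurement Bonamente describes is, in standard practice, a hydrostatic-equilibrium estimate: the hotter the bound gas, the more total mass is needed to hold it. A minimal sketch assuming an isothermal gas with a power-law density profile and illustrative cluster numbers (none of these values are from the paper):

```python
# Sketch of weighing a cluster with its hot gas.  For an isothermal gas in
# hydrostatic equilibrium with density rho ~ r^-a, the enclosed mass is
#   M(<r) = a * k_B * T * r / (G * mu * m_p).
# Temperature, radius, slope and mean molecular weight below are assumptions.

G = 6.674e-11     # m^3 kg^-1 s^-2
K_B = 1.381e-23   # Boltzmann constant, J/K
M_P = 1.673e-27   # proton mass, kg
M_SUN = 1.989e30  # kg
MPC = 3.086e22    # meters per megaparsec

def hydrostatic_mass(T_kelvin: float, r_m: float, slope: float = 2.0,
                     mu: float = 0.6) -> float:
    """Enclosed mass (kg) needed to hold isothermal gas at temperature T."""
    return slope * K_B * T_kelvin * r_m / (G * mu * M_P)

T = 1e8      # K -- the "100 million degree" plasma
r = 1 * MPC  # cluster-scale radius (assumed)

print(hydrostatic_mass(T, r) / M_SUN)  # ~6e14 solar masses: far more than the gas
```

The total mass that comes out is far larger than the gas mass itself, which is exactly the "two things" Bonamente mentions: the gas gives both the baryon content and, through its binding, the dark matter content.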
Theoretically, the universe should contain the same proportions of visible and dark matter regardless of where it is sampled. Using cosmic microwave radiation readings, astrophysicists have been able to do a type of forensics of the universe's past, and those findings have shown the proportions that were present at the Big Bang or shortly thereafter.
"Because it started in the Big Bang, that ratio should persist," Dr. Bonamente said. "It is like if I go to the ocean with a scoop. The scoop of water I get should have the same concentration of salt as the rest of the ocean, no matter where I get it."
But past research had indicated that some clusters were short on the expected percentage of baryons, posing the question of where they were.
"Until recently, people believed that clusters had less than 16 percent of baryons, so there were missing baryons," Dr. Bonamente said. "We said no, they are there. So, how did we find clusters with this correct ratio? We studied the most luminous ones, because they have more mass and retain more baryons."
The findings could open new areas of investigation into why the deficits in baryons were recorded in past research. Dr. Bonamente suggests one theory. "We know that some smaller clusters do have lower concentrations of baryons than the larger ones," he said. Perhaps because of weaker gravitational forces, the hot gases escaped in similar fashion as planets that have no atmosphere. "Maybe the gas can be bound but maybe a little bit can fly off if there is just not quite enough gravity."
For further studies on smaller clusters, Dr. Bonamente looks forward to the arrival of new faculty member Dr. Ming Sun, formerly at the University of Virginia, who is an expert on groups having less than 16 percent baryons.
"I am excited that Ming has decided to join our research group," says Dr. Bonamente."With him on board, UAH is poised to continue making discoveries on the makeup of the universe, and that is the most exciting question to answer that I can think of."
Something else was created during the Big Bang: dark matter. "But we can't say what form it took, because we haven't detected those particles," Bahcall told Live Science.
Dark matter can't be observed directly -- yet -- but its fingerprints are preserved in the universe's first light, or the cosmic microwave background radiation (CMB), as tiny fluctuations in radiation, Bahcall said. Scientists first proposed the existence of dark matter in the 1930s, theorizing that dark matter's unseen pull must be what held together fast-moving galaxy clusters. Decades later, in the 1970s, American astronomer Vera Rubin found more indirect evidence of dark matter in the faster-than-expected rotation rates of stars.
Based on Rubin's findings, astrophysicists calculated that dark matter -- even though it couldn't be seen or measured -- must make up a significant portion of the universe. But about 20 years ago, scientists discovered that the universe held something even stranger than dark matter: dark energy, which is thought to be significantly more abundant than either matter or dark matter.
Matter and Energy Tell Spacetime How to Be: Dark Gravity
Is gravity fundamental or emergent? Electromagnetism is one example of a fundamental force. Thermodynamics is an example of emergent, statistical behavior.
Newton saw gravity as a mysterious force acting at a distance between two objects, obeying the well-known inverse square law, and occurring in a spacetime that was inflexible, and had a single frame of reference.
Einstein looked into the nature of space and time and realized they are flexible. Yet general relativity is still a classical theory, without quantum behavior. And it presupposes a continuous fabric for space.
As John Wheeler said, "spacetime tells matter how to move; matter tells spacetime how to curve". Wheeler knew full well that not just matter, but also energy, curves spacetime.
A modest suggestion: invert Wheeler’s sentence. And then generalize it. Matter, and energy, tells spacetime how to be.
Which is more fundamental? Matter or spacetime?
Quantum theories of gravity seek to couple the known quantum fields with gravity, and it is expected that at the extremely small Planck scales, time and space both lose their continuous nature.
In physics, space and time are typically assumed as continuous backdrops.
But what if space is not fundamental at all? What if time is not fundamental? It is not difficult to conceive of time as merely an ordering of events. But space and time are to some extent interchangeable, as Einstein showed with special relativity.
So what about space? Is it just us placing rulers between objects, between masses?
Particle physicists are increasingly coming to the view that space, and time, are emergent. Not fundamental.
If emergent, from what? The concept is that particles, and quantum fields, for that matter, are entangled with one another. Their microscopic quantum states are correlated. The phenomenon of quantum entanglement has been studied in the laboratory and is well proven.
Chinese scientists have even, just last year, demonstrated quantum entanglement of photons over a satellite uplink with a total path exceeding 1200 kilometers.
Quantum entanglement thus becomes the thread Nature uses to stitch together the fabric of space. And as the degree of quantum entanglement changes the local curvature of the fabric changes. As the curvature changes, matter follows different paths. And that is gravity in action.
Newton’s laws are an approximation of general relativity for the case of small accelerations. But if space is not a continuous fabric and results from quantum entanglement, then for very small accelerations (in a sub-Newtonian range) both Newton dynamics and general relativity may be incomplete.
The connection between gravity and thermodynamics has been around for four decades, through research on black holes, and from string theory. Jacob Bekenstein and Stephen Hawking determined that a black hole possesses entropy proportional to its area divided by the gravitational constant G. This area law entropy approach can be used to derive general relativity as Ted Jacobson did in 1995.
But the supposed area-law component may be insufficient: according to Erik Verlinde's new emergent gravity hypothesis, there is also a volume-law component of the entropy, due to dark energy, that must be considered when accelerations are very low.
We have had hints about this incomplete description of gravity in the velocity measurements made at the outskirts of galaxies during the past eight decades. Higher velocities than expected are seen, reflecting higher acceleration of stars and gas than Newton (or Einstein) would predict. We can call this dark gravity.
Now this dark gravity could be due to dark matter. Or it could just be modified gravity, with extra gravity over what we expected.
It has been understood since the work of Mordehai Milgrom in the 1980s that the excess velocities that are observed are better correlated with extra acceleration than with distance from the galactic center.
Stacey McGaugh and collaborators have demonstrated a very tight correlation between the observed accelerations and the expected Newtonian acceleration, as I discussed in a prior blog here. The extra acceleration kicks in below a few times 10⁻¹⁰ meters per second per second (m/s²).
This is suspiciously close to the speed of light divided by the age of the universe, which is about 7 × 10⁻¹⁰ m/s²!
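The coincidence is easy to check with a few lines of Python (the age of the universe and McGaugh’s acceleration scale are standard round numbers, not derived here):

```python
# Order-of-magnitude check: c divided by the age of the universe,
# compared with the empirical galactic acceleration scale a0.
c = 2.998e8              # speed of light, m/s
age = 13.8e9 * 3.156e7   # age of the universe in seconds (~4.35e17 s)

a_universe = c / age     # acceleration scale set by c and the cosmic age
a0 = 1.2e-10             # McGaugh's empirical acceleration scale, m/s^2

print(f"c / age = {a_universe:.2e} m/s^2")   # ~7e-10 m/s^2
print(f"a0      = {a0:.2e} m/s^2")
```

The two scales differ by less than an order of magnitude, which is what makes the coincidence so suggestive.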
Why should that be? The mass/energy density (both mass and energy contribute to gravity) of the universe is dominated today by dark energy.
The canonical cosmological model has 70% dark energy, 25% dark matter, and 5% ordinary matter. In fact, if there is no dark matter, just dark gravity, or dark acceleration, then it could be more like a 95%/5% split between the dark energy and (ordinary) matter components.
A homogeneous universe composed only of dark energy in general relativity is known as a de Sitter (dS) universe. Our universe is, at present, basically a dS universe ‘salted’ with matter.
Then one needs to ask how gravity behaves in dark-energy-influenced domains. Now unlike ordinary matter, dark energy is highly uniformly distributed on the largest scales. It is driving an accelerated expansion of the universe (the fabric of spacetime!) and dragging the ordinary matter along with it.
But where the density of ordinary matter is high, dark energy is evacuated. An ironic thought, since dark energy is considered to be vacuum energy. But where there is lots of matter, the vacuum is pushed aside.
That general concept was what Erik Verlinde used to derive an extra acceleration formula in 2016. He modeled an emergent, entropic gravity due to ordinary matter and also due to the interplay between dark energy and ordinary matter. He modeled the dark energy as responding like an elastic medium when it is displaced within the vicinity of matter. Using this analogy with elasticity, he derived an extra acceleration as proportional to the square root of the product of the usual Newtonian acceleration and a term related to the speed of light divided by the universe’s age. This leads to a 1/r force law for the extra component since Newtonian acceleration goes as 1/r².
Verlinde’s dark gravity depends on the square root of the product of a characteristic acceleration a0 and the ordinary Newtonian (baryonic) gravity gB: the extra term is gD = √(a0 · gB).
The idea is that the elastic, dark energy medium relaxes over cosmological timescales. Matter displaces energy and entropy from this medium, and there is a back-reaction of the dark energy on matter that is expressed as a volume law entropy. Verlinde is able to show that this interplay between the matter and dark energy leads precisely to the characteristic acceleration a0 = cH/6, where H is the Hubble expansion parameter, equal to one over the age of the universe for a dS universe. This turns out to be the right value of just over 10⁻¹⁰ m/s² that matches observations.
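To see what this extra term does to a galactic rotation curve, here is a minimal sketch assuming Verlinde’s gD = √(a0 · gB) with a0 = cH/6; the baryonic mass of the galaxy and H = 1/age are assumed round numbers, not fitted values:

```python
import math

# Illustrative sketch of Verlinde's extra term added to Newtonian
# gravity for a Milky-Way-like galaxy (assumed baryonic mass).
G  = 6.674e-11            # gravitational constant, SI
c  = 2.998e8              # speed of light, m/s
H  = 1.0 / 4.35e17        # Hubble parameter ~ 1/(age) for a dS universe, 1/s
a0 = c * H / 6            # characteristic acceleration, ~1.1e-10 m/s^2
M_b = 6e10 * 1.989e30     # assumed baryonic mass, kg (~6e10 solar masses)
kpc = 3.086e19            # metres per kiloparsec

def g_newton(r):
    """Ordinary Newtonian acceleration, falls as 1/r^2."""
    return G * M_b / r**2

def g_dark(r):
    """Verlinde's extra 'dark gravity' term, falls as 1/r."""
    return math.sqrt(a0 * g_newton(r))

def v_circ(r):
    """Circular velocity from the combined acceleration."""
    return math.sqrt((g_newton(r) + g_dark(r)) * r)

for R in (5, 20, 80):     # radii in kpc
    v_tot = v_circ(R * kpc) / 1e3
    v_newt = math.sqrt(g_newton(R * kpc) * R * kpc) / 1e3
    print(f"{R:3d} kpc: total ~ {v_tot:.0f} km/s, Newtonian-only ~ {v_newt:.0f} km/s")
```

The Newtonian-only velocity falls off Keplerian-fashion at large radii, while the combined curve stays roughly flat, which is exactly the qualitative behavior seen in observed rotation curves.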
In our solar system, and indeed in the central regions of galaxies, we see gravity as the interplay of ordinary matter and other ordinary matter. We are not used to this other dance.
[Table: Domains of gravity — general relativity (strong fields: black holes, neutron stars), Newtonian dynamics, and dark gravity at very low accelerations.]
The table above summarizes three domains for gravity: general relativity, Newtonian, and dark gravity, the latter arising at very low accelerations. We are always calculating gravity incorrectly! Usually, such as in our solar system, it matters not at all. For example at the Earth’s surface gravity is 11 orders of magnitude greater than the very low acceleration domain where the extra term kicks in.
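The “11 orders of magnitude” claim is a one-liner to verify:

```python
import math

# Sanity check: how far is Earth's surface gravity above the
# low-acceleration scale where dark gravity kicks in?
g_earth = 9.81    # m/s^2
a0 = 1.2e-10      # dark-gravity acceleration scale, m/s^2

orders = math.log10(g_earth / a0)
print(f"{orders:.1f} orders of magnitude")   # ~10.9
```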
Recently, Alexander Peach, a Teaching Fellow in physics at Durham University, has taken a different angle based on Verlinde’s original, and much simpler, exposition of his emergent gravity theory in his 2010 paper. He derives an equivalent result to Verlinde’s in a way which I believe is easier to understand. He assumes that holography (the assumption that all of the entropy can be calculated as area law entropy on a spherical screen surrounding the mass) breaks down at a certain length scale. To mimic the effect of dark energy in Verlinde’s new hypothesis, Peach adds a volume law contribution to entropy which competes with the holographic area law at this certain length scale. And he ends up with the same result, an extra 1/r entropic force that should be added for correctness in very low acceleration domains.
In figure 2 (above) from Peach’s paper he discusses a test particle located beyond a critical radius r_c for which volume law entropy must also be considered. Well within r_c (shown in b), the dark energy is fully displaced by the attracting mass located at the origin and the area law entropy calculation is accurate (indicated by the shaded surface). Beyond r_c, the dark energy effect is important, the holographic screen approximation breaks down, and the volume entropy must be included in the contribution to the emergent gravitational force (shown in c). It is this volume entropy that provides the additional 1/r term for the gravitational force.
Peach makes the assumption that the bulk and boundary systems are in thermal equilibrium. The bulk is the source of volume entropy. In his thought experiment he models a single bit of information corresponding to the test particle being one Compton wavelength away from the screen, just as Verlinde initially did in his description of emergent Newtonian gravity in 2010. The Compton wavelength is equal to the wavelength a photon would have if its energy were equal to the rest mass energy of the test particle. It quantifies the limitation in measuring the position of a particle.
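The Compton wavelength used in this thought experiment follows from setting the photon energy equal to the rest mass energy, hc/λ = mc², so λ = h/(mc). A quick check (the electron is just an example particle):

```python
# Compton wavelength: the wavelength of a photon whose energy equals
# the particle's rest mass energy, h*c/lam = m*c^2  =>  lam = h/(m*c).
h = 6.626e-34      # Planck constant, J s
c = 2.998e8        # speed of light, m/s
m_e = 9.109e-31    # electron mass, kg (example particle)

lam = h / (m_e * c)
print(f"electron Compton wavelength ~ {lam:.3e} m")   # ~2.4e-12 m
```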
Then the change in boundary (screen) entropy can be related to the small displacement of the particle. Assuming thermal equilibrium and equipartition within each system and adopting the first law of thermodynamics, the extra entropic force can be determined as equal to the Newtonian formula, but with one of the r terms in the denominator replaced by the critical radius r_c.
To understand r_c: for a given system, it is the radius at which the extra gravity is equal to the Newtonian calculation; in other words, gravity is just twice as strong as would be expected at that location. In turn, this traces back to the fact that, by definition, it is the length scale beyond which the volume law term overwhelms the holographic area law.
It is thus the distance at which the Newtonian gravity alone drops to about 10⁻¹⁰ m/s², i.e. to a0, for a given system.
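For a Milky-Way-like galaxy this critical radius is easy to estimate; setting GM/r² = a0 gives r_c = √(GM/a0). The baryonic mass below is an assumed round number for illustration:

```python
import math

# Critical radius: where the Newtonian acceleration from the baryons
# alone drops to a0 (assumed Milky-Way-like baryonic mass).
G  = 6.674e-11            # gravitational constant, SI
a0 = 1.2e-10              # characteristic acceleration, m/s^2
M_b = 6e10 * 1.989e30     # assumed baryonic mass, kg

# G*M/r^2 = a0  =>  r_c = sqrt(G*M/a0)
r_c = math.sqrt(G * M_b / a0)
kpc = r_c / 3.086e19
print(f"r_c ~ {kpc:.1f} kpc")
```

The answer comes out at roughly 8 kpc, i.e. around the Sun’s distance from the galactic center, which is consistent with the excess accelerations showing up in the outskirts of galaxies rather than in their cores.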
So Peach and Verlinde use two different methods, but with consistent assumptions, to model a dark gravity term which follows a 1/r force law. And this kicks in at around 10⁻¹⁰ m/s².
The ingredients introduced by Peach’s setup may be sufficient to derive a covariant theory, which would entail a modified version of general relativity that introduces new fields, which could have novel interactions with ordinary matter. This could add more detail to the story of covariant emergent gravity already considered by Hossenfelder (2017), and allow for further phenomenological testing of emergent dark gravity. Currently, it is not clear what the extra degrees of freedom in the covariant version of Peach’s model should look like. It may be that Verlinde’s introduction of elastic variables is the only sensible option, or it could be one of several consistent choices.
With Peach’s work, physicists have taken another step in understanding and modeling dark gravity in a fashion that obviates the need for dark matter to explain our universe.
We close with another of John Wheeler’s sayings:
“The only thing harder to understand than a law of statistical origin would be a law that is not of statistical origin, for then there would be no way for it—or its progenitor principles—to come into being. On the other hand, when we view each of the laws of physics—and no laws are more magnificent in scope or better tested—as at bottom statistical in character, then we are at last able to forego the idea of a law that endures from everlasting to everlasting.”
It is a pleasure to thank Alexander Peach for his comments on, and contributions to, this article.
https://arxiv.org/abs/gr-qc/9504004 “Thermodynamics of Spacetime: The Einstein Equation of State” 1995, Ted Jacobson
https://arxiv.org/pdf/1806.10195.pdf “Emergent Dark Gravity from (Non) Holographic Screens” 2018, Alexander Peach
https://arxiv.org/pdf/1703.01415.pdf “A Covariant Version of Verlinde’s Emergent Gravity” Sabine Hossenfelder
Did LIGO Detect Dark Matter?
It has often been said, including by me, that one of the most intriguing aspects of dark matter is that it provides us with the best current evidence for physics beyond the Core Theory (general relativity plus the Standard Model of particle physics). The basis of that claim is that we have good evidence from at least two fronts — Big Bang nucleosynthesis, and perturbations in the cosmic microwave background — that the total density of matter in the universe is much greater than the density of “ordinary” matter like we find in the Standard Model.
There is one important loophole to this idea. The Core Theory includes not only the Standard Model, but also gravity. Gravitons themselves can’t be the dark matter — they’re massless particles, moving at the speed of light, while we know from its effects on galaxies that dark matter is “cold” (moving slowly compared to light). But there are massive, slowly-moving objects that are made of “pure gravity,” namely black holes. Could black holes be the dark matter?
It depends. The constraints from nucleosynthesis, for example, imply that the dark matter was not made of ordinary particles by the time the universe was a minute old. So you can’t have a universe with just regular matter and then form black-hole-dark-matter in the conventional ways (like collapsing stars) at late times. What you can do is imagine that the black holes were there from almost the start — that they’re primordial. Having primordial black holes isn’t the most natural thing in the world, but there are ways to make it happen, such as having very strong density perturbations at relatively small length scales (as opposed to the very weak density perturbations we see at universe-sized scales).
Recently, of course, black holes were in the news, when LIGO detected gravitational waves from the inspiral of two black holes of approximately 30 solar masses each. This raises an interesting question, at least if you’re clever enough to put the pieces together: could the dark matter be made of primordial black holes of around 30 solar masses, and could two of them have come together to produce the LIGO signal? (So the question is not, “Are the black holes made of dark matter?”, it’s “Is the dark matter made of black holes?”)
This idea has just been examined in a new paper by Bird et al.:
Did LIGO detect dark matter?
Simeon Bird, Ilias Cholis, Julian B. Muñoz, Yacine Ali-Haïmoud, Marc Kamionkowski, Ely D. Kovetz, Alvise Raccanelli, Adam G. Riess
We consider the possibility that the black-hole (BH) binary detected by LIGO may be a signature of dark matter. Interestingly enough, there remains a window for masses 10 M⊙ ≲ M_bh ≲ 100 M⊙ where primordial black holes (PBHs) may constitute the dark matter. If two BHs in a galactic halo pass sufficiently close, they can radiate enough energy in gravitational waves to become gravitationally bound. The bound BHs will then rapidly spiral inward due to emission of gravitational radiation and ultimately merge. Uncertainties in the rate for such events arise from our imprecise knowledge of the phase-space structure of galactic halos on the smallest scales. Still, reasonable estimates span a range that overlaps the 2–53 Gpc⁻³ yr⁻¹ rate estimated from GW150914, thus raising the possibility that LIGO has detected PBH dark matter. PBH mergers are likely to be distributed spatially more like dark matter than luminous matter and have no optical nor neutrino counterparts. They may be distinguished from mergers of BHs from more traditional astrophysical sources through the observed mass spectrum, their high ellipticities, or their stochastic gravitational wave background. Next generation experiments will be invaluable in performing these tests.
Given this intriguing idea, there are a couple of things you can do. First, of course, you’d like to check that it’s not ruled out by some other data. This turns out to be a very interesting question, as there are good limits on what masses are allowed for primordial-black-hole dark matter, from things like gravitational microlensing and the fact that sufficiently massive objects would disrupt the orbits of wide binary stars. The authors claim (and quote papers to the effect) that 30 solar masses fits snugly inside the range of values that are not ruled out by the data.
The other thing you’d like to do is figure out how many mergers like the one LIGO saw should be expected under such a scenario. Remember, LIGO seemed to get lucky by seeing such a big beautiful event right out of the gate — the thought was that most detectable signals would be from relatively puny neutron-star/neutron-star mergers, not ones from such gloriously massive black holes.
The expected rate of such mergers, under the assumption that the dark matter is made of such big black holes, isn’t easy to estimate, but the authors do their best and come up with a figure of about 5 mergers per cubic gigaparsec per year. You can then ask what the rate should be if LIGO didn’t actually get lucky, but simply observed something that is happening all the time. The answer, remarkably, is between about 2 and 53 per cubic gigaparsec per year. The numbers kind of make sense!
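One reason the rate is hard to pin down is the extreme velocity sensitivity of the gravitational-wave capture cross-section. The sketch below uses the Quinlan–Shapiro form adopted by Bird et al., σ = π (85π/6√2)^(2/7) R_s² (v/c)^(−18/7); the two relative velocities are illustrative round numbers (a big halo versus a small, cold subhalo), not values from the paper:

```python
import math

# GW capture cross-section for two ~30 solar-mass black holes
# (Quinlan-Shapiro form used by Bird et al.; velocities illustrative).
G = 6.674e-11
c = 2.998e8
M = 30 * 1.989e30                 # 30 solar masses, kg
R_s = 2 * G * M / c**2            # Schwarzschild radius, ~9e4 m

def sigma(v):
    """Capture cross-section (m^2) at relative velocity v (m/s)."""
    pref = math.pi * (85 * math.pi / (6 * math.sqrt(2)))**(2 / 7)
    return pref * R_s**2 * (v / c)**(-18 / 7)

print(f"sigma(200 km/s) = {sigma(2e5):.2e} m^2")
print(f"sigma(  2 km/s) = {sigma(2e3):.2e} m^2")
print(f"enhancement: {sigma(2e3) / sigma(2e5):.0f}x")
```

A factor of 100 in velocity changes the cross-section by roughly five orders of magnitude, which is why the smallest, coldest halos dominate the predicted merger rate and why the phase-space structure of halos matters so much.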
The scenario would be quite remarkable and significant, if it turns out to be right. Good news: we’ve found that dark matter! Bad news: hopes would dim considerably for finding new particles at energies accessible to particle accelerators. The Core Theory would turn out to be even more triumphant than we had believed.
Happily, there are ways to test the idea. If events like the ones LIGO saw came from dark-matter black holes, there would be no reason for them to be closely associated with stars. They would be distributed through space like dark matter is rather than like ordinary matter is, and we wouldn’t expect to see many visible electromagnetic counterpart events (as we might if the black holes were surrounded by gas and dust).
We shall see. It’s a popular truism, especially among gravitational-wave enthusiasts, that every time we look at the universe in a new kind of way we end up seeing something we hadn’t anticipated. If the LIGO black holes are the dark matter of the universe, that would be an understatement indeed.
Dark matter (DM) candidates must be beyond-standard-model (SM) particles, neutral and stable. Having so far escaped detection, they must have tiny interactions with SM particles; it is even possible that they interact only gravitationally.
A possible production mechanism for DM particles, taking place in the early universe, is via evaporation of primordial black holes (BHs) with masses in the broad range 10⁻⁵–10⁹ g. In this case, all particles with mass below the Hawking temperature of the BH are emitted, with weights simply given by their number of degrees of freedom (dof). It has been proposed that the particles produced via the evaporation mechanism might be responsible for the excess of baryons over anti-baryons [2, 3], for the observed dark matter abundance [4,5,6] and, if sufficiently light, also for dark radiation [5, 7,8,9]. Apart from the case of gravitino production [10, 11], the primordial BH density at formation for the range 10⁻⁵–10⁹ g is at present unconstrained, as reviewed, for example, in Ref. . However, Ref.  (see also Ref. ) derives an upper bound on the fraction of the universe collapsed into primordial BHs in this very mass range from possible backreaction gravitational waves from primordial BHs. Ref.  considers constraints on DM particles charged under a hidden gauge group.
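For orientation, the Hawking temperature and a simple lifetime estimate for BHs in this mass range can be sketched as follows. The lifetime formula below is the standard photon-only estimate τ = 5120π G² M³ / (ħ c⁴); actual lifetimes are shorter once all emitted particle species (the dof weighting above) are included:

```python
import math

# Hawking temperature and a photon-only evaporation-time estimate
# for primordial black holes (rounded constants).
hbar = 1.055e-34   # J s
c    = 2.998e8     # m/s
G    = 6.674e-11   # SI
kB   = 1.381e-23   # J/K

def T_hawking(M_kg):
    """Hawking temperature in kelvin: T = hbar c^3 / (8 pi G M kB)."""
    return hbar * c**3 / (8 * math.pi * G * M_kg * kB)

def lifetime(M_kg):
    """Photon-only evaporation time, tau = 5120 pi G^2 M^3 / (hbar c^4)."""
    return 5120 * math.pi * G**2 * M_kg**3 / (hbar * c**4)

for grams in (1e-5, 1.0, 1e9):
    M = grams * 1e-3     # grams -> kg
    print(f"M = {grams:.0e} g: T ~ {T_hawking(M):.1e} K, tau ~ {lifetime(M):.1e} s")
```

Since T scales as 1/M, even the heaviest BHs in the 10⁻⁵–10⁹ g range are hot enough to emit essentially any particle in or beyond the SM, and all of them evaporate within the first seconds of cosmic history.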
Depending on the fraction η of primordial BHs at formation with respect to radiation, there is a possibility that the universe was BH dominated before the BHs evaporated [4, 16, 17]: this situation is referred to as BH domination. The case in which the BHs evaporate before they dominate the energy content of the universe is called radiation domination.
Fujita et al.  calculated the contribution to DM by primordial BH evaporation into new particles beyond the SM: they found that a significant contribution to DM could come from stable particles that are either superheavy or light, that is, with masses in the MeV range. In the light case, DM candidates would be warm, while in the superheavy case they would be cold. Exploiting the warm DM velocity constraints available at that time, Ref.  first discussed also the lower limits on the mass of the light DM candidates, using an order-of-magnitude argument essentially based on the geometrical optics approximation for Hawking radiation. This approximation ignores the low-energy suppression in the greybody factors [19, 20], accounting quite well for the case in which the warm DM candidate has s = 0, but failing to reproduce the case of different spins. For an up-to-date presentation of this argument, see [21, 22].
A more sophisticated analysis was done by Lennon et al. . They also adopted the geometrical optics approximation, but included the redshift effect in the calculation of the momentum distribution of the emitted particles. Their result is an estimate of the number of particles that are still relativistic, with a spin dependence reintroduced a posteriori and based on greybody factors derived from the older literature [20, 23]. As a rough-and-ready criterion for successful structure formation, they impose that when the temperature of the universe drops below 1 keV (at which stage the horizon mass is about 10⁹ solar masses), less than 10% of the DM is relativistic. The result of this ingenious, but quite arbitrary, argument is that, for BH domination, warm DM candidates with s ≤ 1 are excluded, those with s = 3/2 are marginally allowed, while those with s = 2 naively survive. Summarizing, for the lower spin values (say s = 0, 1/2, 1), the order-of-magnitude results of Ref.  were confirmed by Ref. ; the latter analysis was, however, not fully conclusive for the higher spins (s = 3/2, 2).
The more recent analysis of Baldes et al.  goes a step further. As suggested in , they include the redshift effect in the momentum distribution of the emitted particles at evaporation and derive the related phase space distribution as an input for the Boltzmann code CLASS [24,25,26]. The latter allows one to extract the matter power spectrum for warm DM from primordial BHs and to compare it to the standard cold DM case by means of the transfer function. This makes it possible to constrain warm DM from primordial BHs using the structure formation bounds from Lyman-α data already derived for the well-known case of DM thermal relics. The analysis of Ref. , however, relies on the geometrical optics approximation and, in particular, provides quantitative results only for the s = 1/2 case, which agree with previous order-of-magnitude estimates [4, 21], also based on the geometrical optics approximation. The case of the higher spins could thus not be quantitatively clarified (apart from a qualitative mention of the greybody effects in appendix A of Ref. ) with respect to the results of Ref. .
Given the present lack of robust results about the fate of warm DM candidates with high spin values, we think it would be useful and timely to make a dedicated study. The aim of this work is precisely to provide a complete and updated study on the viability of warm DM candidates from the evaporation of primordial BHs.
In order to numerically account for the greybody factors associated with the different spins, we use the recently developed and publicly available code BlackHawk . We also compare the numerical results from BlackHawk with the analytical ones derived in the geometrical optics approximation. Taking into account the redshift effects as suggested in Ref. , we study the impact on structure formation by calculating the transfer function with CLASS , as suggested in Ref. . We derive the transfer function for all spin values, finding that, assuming BH domination, the scenario of warm DM from primordial BHs is excluded for all spins and for all BH masses in the range 10⁻⁵–10⁹ g. Our results for the s = 0 case agree with previous order-of-magnitude estimates [4, 21]. For radiation domination, we derive the upper limits on η (or, equivalently, on the warm DM mass) for the various warm DM spins. For the case s = 1/2 (the only one for which the comparison is possible), we find conceptual differences with respect to the results of Ref. , but substantial numerical agreement.
In this work, we consider BH evaporation as the only production mechanism. The consequences of allowing for other production mechanisms have been recently explored in refs.  and [29, 30]. For a mixed model of DM production, Ref.  proved that a primordial BH dominated period of DM creation by evaporation cannot explain the abundance observed today. For an updated analysis of the possibility that the matter–antimatter asymmetry is due to particles produced by primordial BHs evaporation, we refer the interested reader to Ref.  for GUT baryogenesis and to Ref.  for leptogenesis. DM and baryogenesis in the case of stable remnants from thermal 2-2-holes have been studied in Ref. .
The paper is organized as follows. In Sect. 2, we introduce our notation and review basic ideas about formation and evaporation of primordial BHs. In Sect. 3, we discuss the instantaneous primary spectrum for the emitted particles. In Sect. 4, we discuss the dynamics of the primordial BH abundance. Sect. 5 deals with the momentum distribution at evaporation and Sect. 6 with the calculation of the DM phase space distribution. The calculation of the DM abundance and the impact on structure formation are presented in Sects. 7 and 8, respectively. The discussion of the results and our conclusions are presented in Sect. 9.
In order to have a better control of our formulas for dimensional analysis and numerical computations, we do not use natural units.
Why does dark matter matter?
This week - the mysterious stuff that's passing through you right now and literally holds the galaxy together, but we have no idea what it is. We talk to the scientists trying to find out. Plus in the news, the 100-year-old technology that’s helping us fight infections we can’t currently treat. And evidence that wasps can size things up.
In this episode
00:60 - Phages treat antibiotic-resistant bacteria
Phages treat antibiotic-resistant bacteria with Graham Hatfull, University of Pittsburgh
A technology first pioneered about a Century ago - but then largely abandoned with the advent of antibiotics - has received a shot in the arm and saved the life of a patient at Great Ormond Street, thanks to modern technology. This is “phage therapy” - the use of viruses that kill bacteria - to fight infections. Chris Smith spoke to Graham Hatfull from the University of Pittsburgh.
Graham - The headline is we’ve used bacteriophages to treat a patient with an infection with a very nasty antibiotic resistant organism. Colleagues of ours at Great Ormond Street Hospital in London, they had patients that had cystic fibrosis, had a double lung transplant but then had suffered with very serious bacterial infections that became, essentially, untreatable because they were resistant to all of the antibiotics that they could throw at them. And so what we did was to find bacteriophages that infected the very specific bacterial strain that the patient was infected with, that was administered to the patient and we saw really great clinical outcomes and survival of the patient.
Chris - Where did you get the bacteriophage that you ultimately ended up using? How did you go and find it?
Graham - We've been studying these bacteriophages for quite a long time and so we have a library of about 15,000 individual bacteriophages, and from what we know about them we could whittle that down to a shortlist and we were able to identify three phages which worked well against this particular bacterial pathogen.
Chris - And how long did it take you to do that? Because one of the critical things with someone who is extremely unwell is that you don't have much time and if you administer antibiotics that's great because usually you can get them off-the-shelf and give them to the patient straightaway. I'm presuming that you just can't find a phage and turn it round and administer it in the same sort of timeline that you can with an antibiotic at present?
Graham - Yes. It took several months, especially because we not only had to screen amongst our favourite candidates, but we had to do some genetic engineering to take what were rather poor candidates and turn them into being effective antibacterial drugs. These kinds of infections caused by Mycobacteria tend to progress relatively slowly, so in this case we had a period of time - it was six months or so - where the patient was essentially hanging in there, and we were able to get the phages within sufficient time in order to be able to administer them with a good outcome.
Chris - And how did you manipulate the phages to get them so that they would hit the sweet spot, as it were, and take out just the right bacterium?
Graham - So one of the issues we face is that not all of the phages are lytic, they don't always kill when they infect the bacteria. What we needed to do was to go in and use genetic engineering to remove one particular gene which was causing that problem, and thus convert what was really not a very useful phage into being one that was going to be effective therapeutically.
Chris - How did you administer the phages once you'd found the ones that you wanted and you knew you'd optimised them?
Graham - There's really two routes of administration: deliver them intravenously and then some phage solution on gauze was added to both the sternal wound from the transplant as well as onto the skin nodules that appear as a sort of common manifestation of these kinds of diseases.
Chris - And how do you know that you've actually got rid of the bacteria? How do you know that there are not some hiding in there that are now resistant to all known antibiotics and your phage and could come back?
Graham - Again, that's a great question and obviously something that we worry about a lot. Rather than using one phage, we specifically made a cocktail of three phages in order to try to battle that problem of resistance. The bacteria could become resistant against one phage but then they should still be susceptible to the others that we're giving in the cocktail.
Chris - We are in what a UK Chief Medical Officer described as an "antibiotic apocalypse" situation, so do you think there's going to be a big comeback for phages then?
Graham - I think there's a real opportunity to try to find the kinds of infections that phages could really be useful to treat. There are particular types of diseases and infections where they could find a use. And one can imagine using phages in a smart way where you essentially combine them with antibiotics in order to essentially enhance the utility of the antibiotics and to try to help reduce the incidence of resistance to antibiotics.
Graham Hatfull, on how the bacteriophage, first discovered in 1915 by English researcher Frederick Twort, could be set to help us fight antibiotic-resistant infections a hundred years on. Those results were reported in Nature Medicine.
05:57 - Repairing injured lungs for transplantation
Repairing injured lungs for transplantation with Matthew Bacchetta, Vanderbilt University
Thousands of people die every year waiting on transplant lists. And lungs are in particularly short supply. Now scientists might have found a way to increase the numbers of donor organs that are suitable for transplantation. In experiments using pigs, which have lungs very similar to ours, Matthew Bacchetta from Vanderbilt University has found that if he takes injured lungs that would normally be unsuitable for transplant, and plumbs them into a potential recipient’s circulatory system for a day or so - but keeps the lungs outside the body in a special organ chamber - nourished by the healing effects of the blood supply, they recover very rapidly to a state that means they can then be moved inside the recipient. Matthew spoke to Chris Smith.
Matthew - Lungs are extremely sensitive to injury from gastric aspiration, pulmonary contusion, meaning that the lung gets bruised, or getting infected while the patient is on a ventilator so they could develop a bacterial infection like pneumonia, and those are the primary reasons that organs are deemed unacceptable for transplantation. The major thrust of what we've done here is replicate the injury that we see in humans. We used what we call a gastric aspiration model; what that basically means is that the patient has taken gastric contents, which are very acidic and caustic, into their lungs and it causes inflammation like a severe pneumonia so that the organ cannot be used. And what our system has enabled is the organ to regenerate or repair itself over time.
Chris - How have you done it?
Matthew - We first failed a lot trying different types of ex vivo systems, meaning the organ was placed outside the body separated into a sort of machine perfusion system. And after being very frustrated and failing repeatedly, we eventually sort of had the 'eureka' moment where we said we can't replicate a whole system but what we can do is attach the organ to a natural host or recipient. In other words, the organ could be attached to somebody who needs potentially a lung transplant and that body provides the whole natural system required for wound healing. What we have essentially done is plumbed this organ into the potential recipient who provides all of the critical factors in their blood that enable the organ to heal.
Chris - So where do the lungs sit then - are they in a bath next to what will be the patient? When you actually come to do this you’re gonna end up with some tubes coming out of the individual bringing the blood to and from these lungs, which are going to be outside their body next to them?
Matthew - That's correct. That's exactly what we do. They are in a specialised container - it actually looks very similar to what we would do for a patient that was on dialysis, they would have blood coming out, it would go into the dialysis machine and then that blood is returned to the patient.
Chris - Does the lung breathe as well? Are you pushing air in and out of the lungs as you do this to keep it natural and what it would be expecting were it inside the body, to create as much of a mimic of the real body environment as possible?
Matthew - It does. We connect it to a mechanical ventilator and we can actually measure the performance of the organ in real time.
Chris - So you're looking at how much oxygen is getting pushed into the blood that you're pushing through it by that set of lungs, so that gives you a marker for how well they're behaving and what the improvement is?
Matthew - That's correct, exactly. So we can monitor that over time and that gives us a benchmark to monitor the improvement process and to also let us know when we've reached a level that's normal.
Chris - Why is this better than just putting the lungs into the individual full stop? Because you're basically doing the same thing, you're sending them a blood supply, you're sending them an air supply, so why is it better doing it outside the body than just putting them in?
Matthew - Yeah, that's a great question. The major difference is that you have to subject the patient to a very invasive procedure. I have to remove their lungs, I have to put in new lungs, and we know that the lungs are damaged, that they're not really acceptable for a transplant. And then I have to support that patient with damaged lungs, which causes a profound inflammatory process, and so the body does not actually work as effectively in healing those lungs, and the patient becomes unstable because they're now relying upon an injured organ to keep them alive.
Chris - And how did the recipient fare while they had this extra set of lungs hitched up to them, and not just any old lungs, someone else's lungs, some other animal's lungs, and diseased lungs at that? Was there an obvious burden on the individual or did they cope well?
Matthew - They actually coped remarkably well. They were hemodynamically stable, meaning that their blood pressure, their heart rate, and all of the other physiologic measures that we use were normal and stable.
12:35 - Predicting Inflammatory Bowel Disease
Predicting Inflammatory Bowel Disease with Ken Smith, James Lee, University of Cambridge
Inflammatory bowel disease can cause severe pain and serious issues in those who deal with it every day. But what does it mean for those who suffer from it, and how might we improve their lives? Adam Murphy spoke to Ken Smith and James Lee from Cambridge University about a new test they've developed, which can predict the future severity for inflammatory bowel disease in those who suffer with it. But first, we heard from IBD patient Kate, who has been coping with the condition for many years.
Kate - I was diagnosed with Crohn's at 14 and there was no indication that things were going to be so severe in the long run. I was told by my consultant at the time that I would need to have some bowel removed but that there was no reason to think that that wouldn’t be the end of it for a long time. Unfortunately though, within nine months I was unwell again and for the next few years I went through a series of drugs, all of which caused quite serious side effects, but did little to stop the progress of the disease. Eventually, my large bowel became too badly damaged to save and after a few months with a feeding tube to try and get my weight up before surgery, I was told I would need to have a permanent colostomy bag.
Adam - That is Kate. As you heard, she suffers from Crohn's disease, an inflammatory bowel disease or IBD. But what is going on in the bodies of people like her? I spoke to James Lee, a gastroenterologist at the University of Cambridge.
James - IBD is an umbrella term. It stands for inflammatory bowel disease and it encompasses Crohn's disease and ulcerative colitis which are two different diseases. But essentially in both diseases what happens is your immune system gets its wires crossed, the immune system actually attacks the bowel, and the result of that is it can cause ulceration and inflammation within the bowel, and that can lead to quite nasty symptoms where you get bleeding and abdominal pain. These are incurable, lifelong diseases and one of the big problems is that some patients will get a very severe and aggressive form of the disease, other people with the same disease can actually have a very mild disease course. And so one of the biggest challenges for treating patients with ulcerative colitis and Crohn's disease right now is identifying which patients need the more aggressive treatment approach because their disease is going to be that much more aggressive, and which patients actually would do very well with relatively minimal therapy.
Adam - And how do you do that? Ken Smith, Head of the Department of Medicine at Cambridge University took me through it.
Ken - We started about 12 years ago. We were interested in working out what factors drove different long-term outcomes for patients with diseases like inflammatory bowel disease. So we started off by recruiting a lot of patients at diagnosis, measuring the expression of genes in their blood at that day, and then comparing the patterns of expression of those genes, so-called signatures, comparing that with their long-term clinical outcomes, so this study's taken many years to do.
What we found was a signature that correlated very strongly with how well people did in the long term. We then took that signature, and in a complex process, developed a test that worked on whole blood that recreated the effect of that signature allowing us to divide patients into two groups that had very different long-term outcomes.
Adam - And the signature you found, what was that?
Ken - It was a signature in things called CD8 T cells, which are a subset of white blood cell, and it essentially was a measure of something that's called T cell exhaustion. So if you have a tendency to have exhausted T cells you tend to have very good long-term outcome, whereas if you don't have exhaustion you have the opposite, you tend to have more aggressive disease course. So we do understand the biological pathways that sort of underpin this observation in this test.
Adam - And what could this mean for patients? Back to James.
James - This could really be a game changer for treating patients with IBD. At the moment, most patients receive what is ostensibly a 'one size fits all' approach to their treatment and that's because we simply haven't had good ways of identifying the patients who need the more aggressive treatment from those who don't. So, at the moment, everybody in the UK and in many other parts of the world will be started on an initial treatment. If their disease continues to flare up frequently, they'll go on to something stronger, and if it continues to flare up they'll go on to something stronger still, and that incremental increase in treatment keeps going until we finally get to the treatment they need.
For the patients who have the most aggressive disease that might not be until they get onto their fourth or fifth line treatment and, in the meantime, they've been exposed to sometimes years of persistently active disease with all the risks of the complications that go along with that. Conversely, we know that if we were able to give the most effective treatments upfront to those sorts of patients, those are the patients who stand the most to gain by getting their disease under control early.
So for a long time in inflammatory bowel disease and, for that matter, in other fields of medicine people have been looking for ways to match the right treatment to the right patient, so if you have something that enables you to personalise treatment in that way it could completely change how we treat patients in the future.
Adam - And finally, what could it mean for people like Kate?
Kate - For me, Crohn's has always been a disease that's constantly trying to gain ground. In the years after my diagnosis, failing my way through drugs of varying strength, I lost ground that I might never have had to give up if the people in charge of my care had a tool that allowed them to see a clearer picture of what I needed to stay well. It's incredible to think what a test like this could spare people.
18:05 - Can wasps make comparisons like humans?
Can wasps make comparisons like humans? with Elizabeth Tibbetts, University of Michigan
Human beings are very talented at using a cognitive skill called “transitive inference” - using information about things you know to draw conclusions about things you don’t know. For example, if you know that A is bigger than B, and B is bigger than C, you can tell that A is bigger than C without having to look at them side by side. We know humans can do this, but it’s something of an open question which other animals can do it as well. Professor Elizabeth Tibbetts from the University of Michigan has been investigating whether one of mankind’s most maligned foes, the wasp, is capable of using this advanced cognitive technique. She spoke to Ben McAllister.
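The A-bigger-than-B logic described above can be sketched in a few lines of code. This is purely an illustrative model of transitive inference, not anything from the study itself; the colour names mirror the training pairs Elizabeth describes later in the interview, and the function names are my own invention.

```python
# A minimal sketch of transitive inference: given only adjacent
# "X is better than Y" comparisons, chain them into one linear order
# and answer questions about pairs that were never seen together.

def build_ranking(trained_pairs):
    """Chain adjacent 'better than' pairs into a single linear order."""
    better = dict(trained_pairs)           # maps winner -> loser
    losers = set(better.values())
    # the item that never appears as a loser is the top of the hierarchy
    start = next(w for w in better if w not in losers)
    order = [start]
    while order[-1] in better:
        order.append(better[order[-1]])
    return order

def prefers(order, a, b):
    """Transitive inference: 'a' is better if it ranks earlier than 'b'."""
    return order.index(a) < order.index(b)

# Training pairs like those in the wasp experiment:
pairs = [("blue", "green"), ("green", "purple"), ("purple", "yellow")]
ranking = build_ranking(pairs)
print(ranking)                             # ['blue', 'green', 'purple', 'yellow']
print(prefers(ranking, "green", "yellow"))  # novel pair, never trained -> True
```

The point of the sketch is that the novel "green vs yellow" question is answerable only by combining separate pieces of trained information, which is exactly the inference the wasps were tested on.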
Elizabeth - Long ago people thought that transitive inference was based on logical reasoning and we thought only humans were capable of transitive inference and, not too surprisingly, before long we found that humans were not the only ones. It turns out that a huge range of vertebrates can do transitive inference so primates and birds and even fish.
Ben - Wow. So it lives in that bucket of things that we used to think were kind of unique to the human experience, but we are rapidly learning is becoming a much much smaller bucket?
Elizabeth - It's a very small bucket I think. There had been one study on transitive inference in a non-vertebrate and that was done in honeybees, and they found that bees couldn't do transitive inference. And so I think that wasps are way smarter than bees so I wanted to test whether wasps could do it.
Ben - And for anyone out there who isn't super fond of wasps, you heard it here first, wasps are not only scarier than bees they are indeed infinitely more cunning as well, so add that to your consideration. What did you do with this study specifically in order to figure out if wasps could use transitive inference?
Elizabeth - What we did is we trained them to a bunch of colours. So, for example, we would train them that blue was better than green, and then we would train them that green was better than purple, and then we would train them that purple was better than yellow. So they had all this information and now we asked them to make an inference, so we asked them: which do you like better, green or yellow?
Ben - Right. And they've never seen green or yellow together before?
Elizabeth - Exactly. They've never seen green or yellow together. Some of the time green has been good, some of the time green has been bad so there's nothing that should be inherently different about the stimuli.
Ben - How do you go about training a wasp that green is better than say any other colour?
Elizabeth - We train them in this tiny little maze. It has to be tiny because wasps are tiny. Some of the bottom is electrified and then some of the bottom isn't. So when we're training them that blue is better than green, blue is a safe area in the maze and green gives them a little electric shock.
Ben - How exactly do you figure out what's a little electric shock for a wasp?
Elizabeth - I would say it's trial and error. But I promise no wasps were harmed in this experiment. We want them to learn so we don't want them to be freaked out or really worried or anything, right. So we just give them enough shock so that they act a little uncomfortable so they start moving around more quickly and try to get away from it.
Ben - And so they just spend a bit of time in this maze until they eventually land on the part of it that doesn't shock them and that part corresponds to the colour that you want to train them is good?
Elizabeth - Exactly. They move around the maze and they eventually go to the part that's safe and they're like oh my gosh, it's safe and there's the colour green - green is great.
Ben - Okay. What did you do after you trained them?
Elizabeth - After we trained them we had to test them, so we put them in the middle of a box and then we tested which colour they prefer to go to.
Ben - And you had no electric stimuli or was that still present?
Elizabeth - There were colours on either end and there was no electricity to cue them, but the idea is that they've learned that green is good, then they would go to the green side. So we tested them on the colours we had originally trained them on just to confirm that they had learned what we trained them on, and then we also tested them on those new transitive pairs.
Ben - Okay. And what did you find?
Elizabeth - We found that wasps do have transitive inference. So they took all those trained pairs and they seem to kind of organise them in their mind linearly, and then use transitive inference to choose between stimuli that had never been next to each other before.
Ben - That's fascinating because, as you mentioned before, previously someone else had found that bees are incapable of doing this. Surely a bee and a wasp have pretty similar sized brains, right?
Elizabeth - Yeah. Bees and wasps both have similar sized brains and their brains are really tiny, about the size of a grain of rice. I think the difference between bees and wasps isn't really that wasps are just geniuses and bees are dumb, it's more about what the social lives of wasps and bees are like. All the workers in a bee colony are about the same, they spend their time foraging, but in a wasp colony there are all sorts of interesting dominance relationships. They have a linear dominance hierarchy where the dominant wasp does most of the reproduction and the subordinate wasps do most of the work, and so figuring out how dominant other wasps are in wasp land is incredibly important. For example, if you've beaten Jane in a fight before and you see Jane beat Susan then you can infer hey, I'm probably going to be able to beat Susan. So that kind of thing is really important for wasps and not important for bees.
Ben - I would also say probably important for humans depending on who you ask.
Elizabeth - Yeah, definitely important for humans.
Ben - That's an important thing to know. Do you think there's scope for extending this kind of reasoning to dealing with other animals or other kinds of animal cognition?
Elizabeth - I bet that many other insects are capable of transitive inference. I think we just haven't tested them yet. I think one of the messages is that animals can be really good at what's important to them. We think of humans as being like the best at everything, but lots of animals are amazing at really specific things they need to do to be successful.
Ben - Intelligence doesn't necessarily correlate just with the size of the brain but also with the tasks that need to be undertaken?
Elizabeth - Exactly. You don't need a big brain to do complicated things. Even a tiny little brain can do complicated things if the animal needs to be able to do it.
24:03 - In the Naked Scientists mailbox
In the Naked Scientists mailbox
Chris Smith and Katie Haylor opened up the Naked Scientists mailbag to see what listeners have been asking and telling us.
Katie - It turns out that our postman, as he was handing over the mail outside the office the other day, is a fan of the show, so thank you very much. Now he wants to know about gravity. What actually is gravity and what is it made out of? Ben, can you help us out with this one?
Ben - This is a great question. The answer essentially is that we don't know. Nobody really knows which is why it's a great question to be asking. We know there are four fundamental forces in nature and stay tuned to the back half of this program where you'll be hearing about them in a little more detail.
I'll talk about two specifically. Gravity is the force we're talking about here. This is something that exists between any two things that have mass, and it pulls them together. And we can compare that to another force that we sort of do know what it's made up of, which is electromagnetism; this is the force that magnets feel when they attract each other. That force, if you want to say, is made up of something: it's actually made up of these particles called photons, which are just little chunks of light. When magnets are attracting they're actually shooting little chunks of light back and forth at each other, and that's what that force is made of, if you like.
If we were to bring that analogue to gravity, we don't know if there's something like that with gravity. Some people think there is; they think there might be a particle called the graviton, although this has never been detected. Other people would say there is no such thing, that it's actually just the physical bending of space-time itself that creates effects like gravity. So short answer - we don't know. A lot of people are trying to find out - great question.
Katie - It's a big question isn't it?
Katie - But it sounds like the second half of this show may help us to try and understand some of this science.
Ben - Absolutely. A better understanding of dark matter will certainly lead to a better understanding of gravity.
Katie - So our postman actually picked a very good week to ask about gravity then?
Chris - And there you go. Thank you very much Ben for that first-class answer for our postman.
26:40 - What is dark matter?
What is dark matter? with Professor Lord Martin Rees, Cambridge University
We’re going to be delving into the mysterious stuff that makes up a massive amount of the Universe. But we can’t see it, and we haven’t the foggiest idea what it is. So how are we trying to actually find out, and how do we even know it’s there at all? Ben McAllister has been finding out.
Ben - I'd like to tell you a story. It's a story about galaxies, black holes, stars, planets, people, and everything else in the universe. We now know that all of the big stuff in the universe - people, planets, stars - is made up of a handful of different kinds of particles. These are tiny little things like atoms, which are made up of protons, neutrons and electrons. We've come to know quite a lot about those little things, and how they make up bigger things, over the last few hundred years. Collectively, astronomers call all that stuff baryonic matter, and that's kind of all there is, right?
I'm here to tell you that that's not the case. The baryonic matter - people, planets, stars - is only a very small fraction of the whole story. In today's program we're going to hear what we know about the rest of it. We're going to hear about dark matter.
To give you a taste, dark matter is this mysterious stuff that occupies the universe. It's enormous - there's five times as much of it as there is regular matter - and it's everywhere. Right now, as you listen, it's passing right through your body, and we can't see, touch, or feel it. But before we get to what it is, we have to go back a bit. Have you ever looked at the stars and wondered if there's more out there? If you have, you really aren't alone; humans have been doing it for as long as there have been humans.
Professor Lord Martin Rees, Astronomer Royal.
Martin - The idea actually emerged in the 1930s through the work of Fritz Zwicky, who was a Swiss-American astronomer, and he was studying the distribution of galaxies. Each galaxy is of course as big as our Milky Way, so he was looking at the universe on very large scales. He realised that the galaxies weren't distributed randomly, but were in clusters, and these clusters obviously seemed to be held together by gravity. But when he measured the speeds of these galaxies he found it was surprising that they weren't flying apart, because the energy corresponding to those speeds would overwhelm the gravitational force holding the cluster together if that gravity was just due to the galaxies. He inferred that there must be some extra material that bound the cluster together, and this was the first really serious evidence that there was some dark stuff in the universe over and above the gas and stars that are visible.
Ben - For decades we've been observing things like this. Strange movements of large bodies in space that can't be explained if we only consider the matter that we can see. It all comes down to gravity. Gravity is the main force that governs the way things move around in space. It's a force that exists between any two things that have mass and it pulls them together. Gravity gets stronger the more mass there is but importantly, it gets weaker the further apart the two things are.
In space, when we look at the stuff we can see, like stars, for example, we can estimate how much mass there is in the system and then, by using the laws of gravity - what we call Newtonian gravity - we can model the way we expect the mass to move. When things don't move the way we expect, say they move much faster for example, it implies that something's missing from our picture. There's some extra force making things move around faster, which points to there being some extra mass to provide that extra force.
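The inference Ben describes can be made concrete with the standard Newtonian relation for a circular orbit, v = sqrt(GM/r): measure a speed and a radius, and you can work backwards to how much mass must lie inside that orbit. The sketch below uses illustrative numbers (roughly Milky-Way-like, not figures from the programme) just to show the arithmetic.

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # one solar mass, kg

def circular_speed(mass_enclosed_kg, radius_m):
    """Newtonian speed of a circular orbit: v = sqrt(G * M / r)."""
    return math.sqrt(G * mass_enclosed_kg / radius_m)

def mass_from_speed(speed_m_s, radius_m):
    """Invert the relation: how much mass must sit inside radius r
    to make something orbit that fast?  M = v^2 * r / G."""
    return speed_m_s ** 2 * radius_m / G

# Illustrative values: a star about 2.5e20 m from a galactic centre,
# observed moving at roughly 220 km/s (typical of flat rotation curves).
r = 2.5e20
v_observed = 220e3
implied_mass = mass_from_speed(v_observed, r)
print(f"Implied enclosed mass: {implied_mass / M_SUN:.2e} solar masses")
```

If the visible stars and gas add up to less than this implied mass, the difference is exactly the "missing" matter the rotation-curve argument points to.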
Martin - If you had found, for instance, that Jupiter was going round the Sun as fast as the Earth was, you'd have had to infer that there was a lot of mysterious mass outside the Earth's orbit, but inside Jupiter's orbit. So Jupiter was feeling not just the mass of the Sun, but something extra which the Earth wasn't feeling. Something like that, on a far bigger scale of course, happened when people studied the outer parts of galaxies. They found that the material was going round faster - the outlying stars and the gas at large distances were going faster - and this implied that the stars in a galaxy were not the dominant kind of mass, and that a whole galaxy like ours was embedded in what came to be called a halo of some material which was not emitting any light but was exerting a strong gravity, and was dominating the gravitational pull in the outer parts of the galaxy.
Ben - We've arrived at the point in the story where, thanks to observations of bodies moving around in space, we're pretty sure we're surrounded by a massive amount of dark matter. Again, it's moving through your body right now, and it massively outweighs the regular matter that we understand; we just don't know what it is. We've since figured out a little more about it, but not that much more. It's a new frontier, a new region to explore. We do have some theories to explain the phenomena we see, some of which don't actually include any dark matter at all.
Martin - And there is, of course, the idea that we are wrong about gravity. And of course, all the arguments where you infer a mass from the motions of planets and stars and galaxies: that is assuming, in a sense, Newtonian gravity. So some people are proposing other ways where we wouldn't need to have dark matter at all, and we would simply have a different theory of gravity. But I think most people are against that, because first of all there's no particular reason why we should be surprised by dark matter. There's lots of scope for dark matter particles. And secondly we'd be jettisoning a lot of good data if we abandon the idea that we understood gravity. I would still bet that it's most likely that dark matter is in some kind of particles.
Ben - A number of experiments around the world propose to try and detect these particles as they pass through the earth and we'll hear more about some of those later. But why should we care about this? We can't see, touch or feel it, it's just this mysterious stuff that floats on by.
Martin - Well, we know everyone has gazed up throughout human history at the stars and wondered about them. One of the great achievements of cosmology is to understand the structure of the universe - why there are stars, why there are galaxies, why they are clustered, and the details of that. This only gives us a consistent story if we have the presence of dark matter which is, on average in the universe, five times as dense as the gas and stars that we see. And I think this success is one of the great achievements of modern science. I would say it's up there with the standard model of particle physics and the genome. When the history of science is written I think the fact that we understand cosmic evolution, and why galaxies exist, is really a very great achievement.
Ben - If that doesn't do it for you, consider this. Think about everything humans have accomplished with an understanding of just one sixth of the matter in the universe. Computers. Modern medicine and spaceflight. All of art and literature. Imagine what we could do if we could unlock the rest.