How strong is gravity? Scientists devise new way to measure.
Using a novel measurement technique, scientists in Italy have produced a fresh estimate of the gravitational constant, a figure whose precise value has long proven elusive.
Most physicists regard the gravitational constant, which describes the strength of the gravitational pull that bodies exert on one another, as a basic feature of physical reality. It's a number that, were it even slightly higher or lower, would probably result in a universe drastically different from the one we happen to inhabit.
But as for what this number actually is, the universe hasn't been particularly forthcoming. We know of no way to calculate it based on other known properties of the cosmos. The only way to arrive at it, as far as anyone knows, is to put objects near each other and measure – as carefully and precisely as possible – how they move.
This is harder than it sounds. Isaac Newton's law of universal gravitation, first published in 1687, states that every object with mass, be it a star, a planet, a marble, or even a bit of pocket lint, exerts an attractive force on any other object with mass. In our day-to-day lives, the force exerted by the Earth renders the forces exerted by all other objects insignificant.
Newton found that this force is directly proportional to the product of the two objects' masses, and inversely proportional to the square of the distance between their centers. But this direct proportion is not one-to-one; it requires another number, something mathematicians call a proportionality constant. Just as the proportion between a circle's diameter and its circumference is modulated by π, the proportion between the force of gravity and the objects' masses and separation is modulated by the gravitational constant.
Here's the mathematical formulation of Newton's law: F = Gm₁m₂/r². F is the force of gravity, m₁ is the mass of the first body, m₂ the mass of the second, and r² the square of the distance separating them. And G is the gravitational constant.
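To see how the formula works in practice, here is a minimal sketch in Python, plugging the modern accepted value of G (discussed below) and two illustrative 1-kilogram masses into Newton's law; the specific numbers are chosen for this example, not taken from the article.

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2
# Illustrative values only: two 1 kg masses held 1 meter apart.

G = 6.674e-11  # gravitational constant, in m^3 kg^-1 s^-2 (rounded modern value)

def gravitational_force(m1_kg, m2_kg, r_m):
    """Return the attractive force, in newtons, between two point masses."""
    return G * m1_kg * m2_kg / r_m**2

force = gravitational_force(1.0, 1.0, 1.0)
print(f"Force between two 1 kg masses 1 m apart: {force:.3e} N")
# About 6.7e-11 N -- a force so feeble that Earth's pull on either mass
# (about 9.8 N) swamps it by roughly eleven orders of magnitude.
```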
Newton never did quite figure out what G was. It took more than a century before British scientist Henry Cavendish, in the course of trying to determine the density of the Earth, managed to get in the ballpark.
Cavendish's experiment, conducted in 1797 and 1798, involved taking a "dumbbell" consisting of two 1.61-pound lead balls connected by a horizontal rod and hanging it from a slender wire. When he placed two 348-pound lead balls on opposite sides of the dumbbell, about 9 inches away, the dumbbell would twist on the wire ever so slightly, owing to the gravitational attraction between the heavier balls and the lighter ones. The magnitude of these twists could then be used to calculate G.
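As a rough sketch of how the twist of a torsion balance translates into a value for G, the standard textbook treatment of a Cavendish-style apparatus gives G = 2π²Lr²θ / (MT²), where L is the length of the dumbbell rod, r the separation between each heavy and light ball, θ the equilibrium twist angle, T the period of the balance's torsional oscillation, and M the mass of each heavy ball (the small-ball mass cancels out). The Python sketch below uses the masses and separation quoted above; the rod length, period, and twist angle are illustrative placeholders rather than Cavendish's recorded values.

```python
import math

# Toy Cavendish-style calculation: G = 2 * pi^2 * L * r^2 * theta / (M * T^2)
# The masses and separation follow the figures quoted in the article; the rod
# length, oscillation period, and twist angle are placeholders chosen so the
# answer lands near the modern value.

M = 348 * 0.4536   # mass of each large lead ball, kg (348 lb ~ 158 kg)
r = 9 * 0.0254     # separation between large and small balls, m (~0.23 m)
L = 1.8            # length of the dumbbell rod, m (placeholder)
T = 7 * 60         # period of the torsional oscillation, s (placeholder)
theta = 1.0e-3     # equilibrium twist angle, radians (placeholder)

G = 2 * math.pi**2 * L * r**2 * theta / (M * T**2)
print(f"Inferred G: {G:.2e} m^3 kg^-1 s^-2")  # ~6.7e-11 with these inputs
```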
G is a fantastically tiny number – roughly 0.0000000000667 cubic meters per kilogram per square second, which means that measuring it requires extremely sensitive instruments. Cavendish and his "torsion balance" performed admirably, getting within 1 percent of today's accepted figure. His precision was not exceeded until almost 100 years later, when British physicist C.V. Boys constructed a more sensitive torsion balance.
Over the past two centuries, there have been about 300 attempts to measure G, most of them using a torsion balance or a similar apparatus and then accounting for the gravitational effects of the Earth, moon, sun, and anything else that could possibly interfere with the measurement.
But unlike measurements of the universe's other fundamental constants, such as the speed of light and the mass of the electron, measurements of G have failed to converge on a single value. Instead, they have varied widely, with some of the more precise measurements in recent years being mutually incompatible. With nearly every recent measurement of Big G, as physicists call it, our uncertainty has actually increased.
The most recent measurement, described last week in the scientific journal Nature, points to a resolution. Instead of lead balls hanging in a torsion balance, a team of physicists from the University of Florence and the University of Bologna, led by Guglielmo Tino, used a 516-kilogram array of tungsten cylinders, a cloud of rubidium atoms, and a device called an atom interferometer.
To slow the rubidium atoms from their typical speed of a few hundred meters per second to a more conveniently measurable few millimeters per second, the researchers cooled them in a vacuum tube to near absolute zero. They then zapped them with laser pulses, causing the atoms to leap up and then fall back down.
With these laser pulses, the researchers were exploiting the counterintuitive quantum nature of individual atoms. At the atomic scale and smaller, particles seem to exist in contradictory states simultaneously, smeared out into something that looks like a wave, with measurable wavelengths, frequencies, and momenta. But when measured, the matter wave collapses into a single state.
The laser pulses split each rubidium atom's matter wave into two simultaneously existing quantum states, one leaping to a height of 60 centimeters and the other to 90 centimeters. When each atom's wave function collapsed, its resulting trajectory bore the signature of the difference in the heavy tungsten cylinders' gravitational tug at the two heights.
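To get a feel for how faint that signature is, here is a toy calculation that treats the 516 kilograms of tungsten as a single point mass sitting at a hypothetical height above the atoms, a drastic simplification of the real cylinder arrangement, and compares its pull at the two launch heights:

```python
# Toy model: the 516 kg tungsten source mass treated as a single point mass.
# The placement (source centered 1.2 m above the launch point) is hypothetical;
# the real experiment used carefully positioned arrangements of cylinders.

G = 6.674e-11     # m^3 kg^-1 s^-2
M_source = 516.0  # kg, the total tungsten mass quoted in the article

def pull_at(height_m, source_height_m=1.2):
    """Acceleration toward the point-mass source for an atom at height_m."""
    d = source_height_m - height_m   # distance from atom to source, m
    return G * M_source / d**2

a_low, a_high = pull_at(0.60), pull_at(0.90)
print(f"Pull at 60 cm: {a_low:.2e} m/s^2")
print(f"Pull at 90 cm: {a_high:.2e} m/s^2")
print(f"Difference:    {a_high - a_low:.2e} m/s^2")
# The accelerations are of order 1e-7 m/s^2 -- around a hundred-millionth of
# Earth's 9.8 m/s^2 -- which is why an instrument as sensitive as an atom
# interferometer is needed to pick the signal out.
```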
Their result: Big G = (6.67191 ± 0.00099) × 10⁻¹¹ cubic meters per kilogram per square second, a figure with a 0.015 percent uncertainty.
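As a quick arithmetic check, the quoted 0.015 percent follows directly from the two numbers in the result (a minimal sketch using only the figures above):

```python
value = 6.67191e-11        # reported value of G, m^3 kg^-1 s^-2
uncertainty = 0.00099e-11  # reported uncertainty

print(f"Relative uncertainty: {uncertainty / value:.3%}")  # about 0.015%
```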
This most recent attempt to get at Big G is not the world's most precise, but it does have advantages over methods that use macroscopic masses. The atom, writes Stephan Schlamminger, a physicist at the US National Institute of Standards and Technology, in a Nature commentary accompanying the team's paper, "does not require a physical connection to the laboratory and is hence not biased by stray forces that such a connection would introduce to the measurement."
Big G is woven into much of science's knowledge of our cosmos. It enters our calculations of the masses of the Earth, the moon, the sun, and all the other planets in our solar system, as well as those of our galaxy, other galaxies and their stars, and even so-called dark matter. And even though Newton's law of gravitation has been superseded in astrophysics by Albert Einstein's general theory of relativity, which describes gravitation in terms of the bending and stretching of space-time, Big G features just as prominently in Einstein's field equations as it does in Newton's law.
Despite relying on the concept of a space-time continuum, Einstein's model of gravity operates more or less along classical, deterministic lines. Quantum mechanics, by contrast, interprets the world as probabilistic. General relativity and quantum mechanics each do a terrific job of predicting how objects, large and small respectively, will behave. But physicists have yet to reconcile the two theories and link the force of gravity with the other fundamental forces that define our cosmos.
By using microscopic masses at smaller distances, scientists can examine whether the discrepancies among the torsion-balance measurements are just experimental noise, or whether they represent actual fluctuations in the force of gravity. To be sure, it's unlikely that G has been subtly wobbling on us all along, but if it has been, such a discovery could point to a far deeper, and perhaps even more unified, understanding of why objects seem to attract.