Genesis of Eden Diversity Encyclopedia


Anthrax toxin silences immune cells' alarm call

INVASIONS are much easier if you knock out your enemy's communication system. And it turns out that this is what anthrax has been doing for millions of years. The discovery explains why people infected by breathing in anthrax spores don't develop symptoms for several days. By the time they are diagnosed, it's often too late to save them. Researchers already know that macrophages, white blood cells that normally destroy bacteria, engulf spores in the lungs. But the tough spores not only survive the onslaught, they begin to multiply.

The macrophages carry their deadly cargo around the body, eventually bursting open and releasing hordes of bacteria. Normally, other macrophages nearby would spot the danger immediately and raise the alarm. But a toxin released by the bacteria somehow silences these sentinels.

Now Michael Karin's team at the University of California, San Diego, has discovered exactly how the toxin works. One component, called lethal factor, specifically targets one of the macrophage's enzymes, called MKK6, and chops part of it off. That in turn prevents the activation of another enzyme called p38 MAPK, which would normally trigger the release of chemical signals to summon help from other cells, the team reports in Science. What's more, p38 MAPK is also needed to keep active macrophages alive, by preventing the cells from self-destructing. So lethal factor not only prevents cells that have spotted the invaders from sounding the alarm, it also leads to their death.

Karin says it should be relatively easy to develop drugs that block lethal factor (see New Scientist, 27 October 2001, p 7). "By preventing this process, we should allow our own immune systems to get rid of the bacteria," he says.

Andy Coghlan

Just how much did a photon weigh?

EVEN a child knows you can't weigh a ray of light. But that may not always have been the case. Physicists have finally come up with an explanation for one of their most counter-intuitive ideas - that for a tiny fraction of a second following the big bang, light itself had mass.

The idea was born a few years ago, when Tomislav Prokopec of Heidelberg University in Germany and Ola Törnkvist of Imperial College, London, tried to explain why galaxies throughout the Universe are surrounded by magnetic fields. They suggested that the fields might be remnants of photons with mass that existed during the Universe's initial period of rapid expansion. But no one could explain how photons might acquire mass.

Most particles are thought to get their mass from an as yet undiscovered particle called the Higgs boson. The idea is that a sea of Higgs bosons fills all of space and drags on particles travelling through it, making it harder to accelerate them. Those particles affected most are the heaviest, while photons are immune and so have no mass.

Now the researchers, along with Richard Woodard at the University of Florida, have come up with a way photons could, after all, have mass (Physical Review Letters, vol 89, p 101301). They say the secret lies in the vacuum energy that permeates all of space. According to quantum theory, a vacuum is not really empty: instead it's full of pairs of particles created from nothing. Normally these particles collide and annihilate each other immediately after they form.

But in the first fraction of a second after the big bang, the Universe is thought to have exploded outwards incredibly fast, a period called inflation. For pairs of particles that can feel the pull of inflation, the rapid expansion of space would have pulled them so far apart they wouldn't have been able to annihilate each other, and would have filled space. The Higgs boson can't affect photons, but these charged particles can. It would have taken more energy than normal to create a photon amid this sea of particles. And the particles would have dragged on the photons. In effect, the photons had a mass of about a hundred-billionth of a gram each.

After inflation ended, the extra energy associated with this mass would have created magnetic fields that evolved into the fields that exist today. "It's a fascinating example of a purely quantum mechanical effect occurring on cosmic dimensions," says Woodard.

Stefan Maier
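For scale, a hedged back-of-envelope conversion (my own aside, not from the article) of that quoted photon mass into a rest energy:

    # Rest energy of a photon with the quoted inflation-era mass.
    m = 1e-11 * 1e-3          # a hundred-billionth of a gram, in kilograms
    c = 2.998e8               # speed of light, m/s
    E_joules = m * c**2
    E_eV = E_joules / 1.602e-19
    print(f"{E_joules:.1e} J ~ {E_eV:.1e} eV")   # ~9.0e2 J, ~5.6e21 eV

That is some 10^21 electronvolts per photon, which at least sits naturally alongside the extreme energies usually attributed to the inflationary era.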

Qubits spot the difference

IF ANYONE can build quantum computers big enough, they could have a valuable talent that's long been sought by artificial intelligence researchers. A new algorithm developed by physicists shows they should also be able to spot patterns in apparently random noise at undreamed-of speeds.

Today's computers process data in the form of voltages representing 1s and 0s. A quantum computer, however, exploits the bizarre phenomenon of "superposition", in which particles spin opposite ways simultaneously, to make so-called "quantum bits", or qubits. These are essentially on and off at the same time, allowing them to be used in two calculations simultaneously. As the number of qubits increases, performance speeds rise exponentially, so a 200-qubit computer would be able to run more calculations simultaneously than there are atoms in the Sun. But so far, the world's most advanced quantum computer is IBM's 7-qubit device, which is no more powerful than a pocket calculator.
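A quick sanity check of that scaling claim (my own aside, using a rough count of hydrogen atoms in the Sun):

    # n qubits hold 2**n amplitudes in superposition.
    states = 2**200                       # configurations of a 200-qubit register
    atoms_in_sun = 2.0e30 / 1.67e-27      # solar mass / hydrogen mass, roughly
    print(f"2^200 ~ {float(states):.1e}")    # ~1.6e60
    print(f"atoms ~ {atoms_in_sun:.1e}")     # ~1.2e57

So 2^200 does exceed the number of atoms in the Sun, by a factor of a thousand or so.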

You can see the chequerboard within this pattern, but a classical computer finds it tough. Massively parallel quantum computers will see it in a trice.

Researchers hope that bigger ones will be able to factor huge numbers - the basis of many secret codes - in seconds, beating standard computers by years. But aside from code-breaking, there have been surprisingly few useful applications mooted for quantum computers. Now Ralf Schützhold, a theoretical physicist at the University of British Columbia in Vancouver, has worked out a way to make quantum computers pick out a pattern from otherwise random data far faster than standard computers.

Pattern recognition is a cornerstone of AI research. While people can see at a glance that there is a chequerboard region in the dots pictured, today's computers struggle to do this. "A big product of human intelligence is recognising patterns," Schützhold says. When a classical computer tries to find patterns, it runs what's called a Fourier transform, which describes the image in terms of its frequency components. But this is a slow "serial" process: the program has to finish analysing one piece of data before moving to the next.

Schützhold's new algorithm exploits a quantum computer's ability to do many things at once to seek out parallel lines among random collections of dots. By using qubits to hold large amounts of image information in superposition, it would allow a quantum Fourier transform to examine big chunks of images in one fleeting glance, making the whole process much faster.

One application of the algorithm, suggests Seth Lloyd at the Massachusetts Institute of Technology, might be in military aerial surveillance. This could allow patterns to be found in grainy images that are light on data. That would be a big advantage if, for example, you have a plane that's flying over a battlefield and you only have a few seconds to pick out a tank.

Charles Choi
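A toy version of the classical baseline described above (my own sketch, not Schützhold's quantum algorithm): a chequerboard patch hidden in random dots shows up as a strong peak at the highest spatial frequency of a two-dimensional Fourier transform.

    import numpy as np

    rng = np.random.default_rng(0)
    img = rng.integers(0, 2, size=(64, 64)).astype(float)  # random dots
    x, y = np.meshgrid(np.arange(32), np.arange(32))
    img[16:48, 16:48] = (x + y) % 2                        # hidden chequerboard

    spectrum = np.abs(np.fft.fft2(img - img.mean()))
    # A chequerboard alternates every pixel, i.e. frequency index N/2 per axis.
    print(spectrum[32, 32] / spectrum.mean())  # the Nyquist peak stands far out

On a classical machine the transform costs of order N log N serial operations; the quantum version would act on all the superposed amplitudes at once, which is where the claimed speed-up comes from.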

Crunch time soon?

HAZEL MUIR

THE Universe might yet collapse in a devastating "big crunch". Physicists have shown that even though its growth is speeding up, it could still start to implode by the time it's only twice its current age. "A few years ago, nobody would even think seriously about the end of the world within the next 10 to 20 billion years, especially since we learned that the Universe's expansion is accelerating," says Andrei Linde of Stanford University. "Now we see it is a real possibility."

In 1998, astronomers studying distant supernovae found evidence that the expansion of the Universe is getting faster. This suggests that some kind of "dark energy" is pushing space apart. Most theories of dark energy propose that the Universe's accelerating expansion is driven by a cosmos-wide repulsive "scalar field" that has a uniform magnitude right across space. A similar energy field is thought to have made the Universe expand incredibly quickly just after the big bang, a period known as inflation. Last month, Linde won the Dirac medal for his role in developing this theory.

Scientists have assumed that the repulsion of the field will drop as the Universe grows, eventually falling to zero. Though this would slow the rate of expansion of the Universe, it would never actually stop expanding. But Linde says this assumption could be wrong. He and his colleagues have shown that according to some theories of supergravity, which try to describe gravity within the context of quantum theory, the dark energy from a scalar field will do more than simply reach zero - it will become negative and possibly even plunge as far as minus infinity. This would slow the rate of expansion of the Universe and then put it into reverse, causing space and time to collapse to a point in a big crunch. The earliest time for the Universe to start this collapse would be 10 to 20 billion years from now. Its current age is about 14 billion years. "This was the greatest surprise," says Linde. "We might be in the middle of the life cycle of the Universe, not at the beginning." His team's report is at www.arxiv.org/abs/hep-th/0208156.

With both collapse and indefinite expansion possible, we seem to be further than ever from predicting the fate of our Universe. But Linde says observations of supernovae, the leftover radiation from the big bang and galaxy distributions should help resolve the issue by pinning down the densities of dark energy and matter at different times in the past. "It was never easy to look into the future, but it is possible and we should not miss our chance," says Linde. "We may be unable to change our fate, but we surely want to know it."

England's Astronomer Royal, Martin Rees of Cambridge University, is keeping an open mind. He agrees that a future collapse is possible. "Since we have no idea what the dark energy is, such scenarios cannot be ruled out," he says. "But ultra-long-range forecasts are all exceedingly speculative."
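In outline, the mechanism rests on a standard textbook relation the article leaves implicit: the Friedmann equation for the expansion rate, with the scalar field contributing an energy density that tracks its potential,

    \left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\,(\rho_m + \rho_\phi),
    \qquad \rho_\phi = \tfrac{1}{2}\dot\phi^2 + V(\phi).

If the potential V(φ) runs negative, ρ_φ can eventually cancel the matter density ρ_m; the expansion rate then passes through zero and reverses, ending in a big crunch.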

Beyond the Fourth Dimension

Nature's constants may be changing, but nobody knows why. John D. Barrow thinks we could find the answer beyond the fourth dimension

WHY are we here? In one sense at least, it's just a cosmic accident. Our existence is possible only because a number of peculiar coincidences between the values of different constants of nature allow it. The speed of light, the strength of gravity and the charge of an electron, for example, fall within the narrow windows of opportunity that allow atoms to form and hold together. If their values were slightly different there would be no stars, no galaxies and, of course, no life.

But nobody has any idea why the fundamental constants of nature have the numerical values that they do. We can pin them down with impressive experimental accuracy, yet we know nothing of their origins. The constants combine our most precise experimental knowledge of the Universe with our most profound ignorance. We can't even be sure they are constant at all. There are hints that what we like to think are the constants of nature might be changing. If that's true, they could eventually slip out of the range that allows life to exist - and of course, there is no reason why the Universe's characteristics should facilitate our perpetual existence.

If there's one thing that the directed random walk known as progress in science has taught us, it's that we are not the centre of the Universe. It doesn't need us, and it certainly doesn't exist to serve us. Copernicus was the first to see this with his - at the time - heretical deduction that the Universe does not revolve around us. That taught us that the Universe guarantees us no special location in space. Then Darwin showed that we are not the culmination of any special design, and the geologist Charles Lyell discovered that most of history had gone by, eventfully, without us. Deeper still was the insight of Einstein, who showed how to express the laws of nature so that they look the same to all observers, no matter where they are or how they are moving. We can now express the basic laws of nature in forms that would be found by anyone investigating the Universe, from Vega to Vegas - wherever they are and however they are moving. The constants of nature provide the next great distancing of science from human idiosyncrasy.

"The constants are central to our understanding of the Universe and our place in it"

We now understand that the structure of the Universe around us is determined by a collection of unchanging characteristics. These include things like the masses of the smallest subatomic particles, the strengths of the forces of nature, and the speed of light in vacuum. They have been quantified by precision measurement: in the backs of physics books the world over their latest values are listed to large numbers of decimal places.

These characteristics generally have units which are rather anthropocentric: they rely on human scale, or perhaps the properties of our Solar System. The speed of light, c, is measured in metres per second, for example. Centimetres, metres, feet and inches are conveniently related to the scale of the human frame. Our days and years are familiar celestial units of time that derive from the timing of the Earth's orbit and spin. Nature provides more fundamental measures of mass, length and time which identify the scales at which gravity and quantum reality collide, and where our understanding of physics is incomplete (see "Universal units", p 33).

But there is no need to stop there. Having taken so many steps to remove the anthropocentric bias from science, it makes sense to get rid of units altogether. Constants like the speed of light c, Newton's gravitational constant G and Planck's quantum constant h can be combined in ways that produce pure, dimensionless numbers. The resulting constants are much more than just a combination of various properties of the Universe. They are its fundamental descriptors: the bar codes of physical reality. They are central to our understanding of the Universe and our place in it.
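For concreteness, the fundamental measures alluded to are presumably the Planck scales, built from G, ħ and c alone (standard values, not quoted in the article):

    l_P = \sqrt{\hbar G / c^3} \approx 1.6 \times 10^{-35}\ \mathrm{m}, \qquad
    t_P = \sqrt{\hbar G / c^5} \approx 5.4 \times 10^{-44}\ \mathrm{s}, \qquad
    m_P = \sqrt{\hbar c / G} \approx 2.2 \times 10^{-8}\ \mathrm{kg}.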

"This is an important number. If the fine structure constantwere much larger, we wouldn't be here to know about it"

Take the fine structure constant, or alpha, for example, which tells us the strength of electromagnetic forces and controls the nature of atoms and molecules. It is determined by a combination of the charge of the electron e, Planck's constant h, traditionally divided by 2π, and the speed of light c. Denoted by the Greek letter alpha and defined by 2πe²/hc, it is currently determined to be approximately equal to 1/137.03599976...
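A quick numerical check (my own aside; in SI conventions the Gaussian-units e²/ħc picks up a factor 1/4πε₀):

    # Verify alpha ~ 1/137.036 from CODATA values.
    from scipy.constants import c, e, epsilon_0, hbar, pi

    alpha = e**2 / (4 * pi * epsilon_0 * hbar * c)
    print(1 / alpha)   # ~137.035999...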

High and low

This is an important number. If it were much larger, atoms and molecules would be unable to exist; alpha's value affects the interaction between electrons and protons, and determines their binding energy. No stars would be able to form either, because their centres would be too cool to start self-sustaining nuclear reactions; alpha dictates the ignition temperature at which these can occur. In short, if alpha were much larger we wouldn't be here to know about it. But no one knows why alpha has this particular value. Even more mysteriously, we have seen hints that its value can change.

Over the past two years I have been part of a team, led by John Webb of the University of New South Wales in Sydney, that used new theoretical techniques to analyse the absorption of light from distant quasars by intervening dust clouds. We look at the separation between absorption lines caused by different chemical elements that depend sensitively on the value of alpha at the red shift (an astronomer's measure of historical time) where the absorption occurs. The light left these clouds between 5 and 11 billion years ago, so comparing the observed line separations with separations measured now, in the laboratory, provides a probe of whether alpha can have changed over the past 11 billion years. By using computational solutions of the equations of atomic structure we can determine the shifts in line spacings that would result from tiny changes in alpha, and find the shift in its value between then and now that best fits all the data.
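The quantity such a fit extracts is conventionally written (a standard definition, left implicit in the article):

    \frac{\Delta\alpha}{\alpha} \equiv
    \frac{\alpha(z) - \alpha_{\text{now}}}{\alpha_{\text{now}}},

so the result quoted below corresponds to Δα/α ≈ -7 × 10⁻⁶.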

The results from observations of 147 quasars over two years were a big surprise and have potentially far-reaching implications (New Scientist, 11 May, p 28). The complicated "fingerprint" of shifts matches that expected if the value of the fine structure constant 11 billion years ago was smaller than it is now by about 7 parts in a million.

In an attempt to find other observational consequences of a changing alpha, João Magueijo and Håvard Sandvik of Imperial College, London, and I have investigated theories which extend Einstein's theory of general relativity to include this possibility. It appears that alpha would only change at certain times in our cosmological history, controlled by our Universe's rate of expansion. Our theory suggests that during the first 300,000 years of the Universe's history - a period of rapid expansion - there would have been no significant change in alpha at all. After that it would have started to increase in value very slowly until about 5 billion years ago. Observations of distant supernovae suggest that, at this time, our Universe's expansion began to accelerate. This is most likely due to a "vacuum energy", described by Einstein's famous cosmological constant, that began to control the expansion of the Universe. Our analysis leads us to expect that, if that were the case, alpha would have stopped increasing then, and remained steady ever since. There are other limits on a varying alpha.

Geochemical data from a natural nuclear "reactor" that ran intermittently below the surface in the Oklo region of Gabon shows that alpha hasn't changed by more than one part in ten million in the past 1.8 billion years. The halting of the change in alpha caused by the acceleration of the Universe's expansion makes the Oklo limit compatible with the quasar observations. That's good news for all of us. Without the vacuum energy to stop this increase in alpha's value, there would come a time when atoms and stars could no longer exist. The Universe would no longer be able to contain the building blocks of complexity, and life, like all good things, would have to come to an end.

The example of alpha illustrates how important the numbers on the Universe's bar code can be. But what about the others? Can they change too? With our current technological abilities, it's difficult to say. Even if they can, it is no surprise that variations in alpha have been seen first: that is where the highest-precision observations are possible. Detecting whether there have been changes in Newton's constant of gravitation, G, is far harder because gravity is so much weaker than electricity and magnetism, so our observational probes of G are a thousand times weaker than those of alpha. They tell us only that G can't be changing faster than 1 per cent of the rate of expansion of the Universe. In the future it will be possible to probe constants like the ratio of the electron mass to the proton mass with greater sensitivity by astronomical observations at high red shift.

The familiar constants that govern atomic and gravitational phenomena are not all there is to the Universe, however. In the best current working theory of particle physics there are more than 25 other basic constants governing the masses of elementary particles and their interactions. Then there are the grander constants that pin down the structure of the Universe as a whole. We have already mentioned Einstein's cosmological constant. Its value is tiny, but nonetheless big enough to have caused the expansion of the Universe to accelerate. Until recently nobody was even sure that it existed, and we still don't know why it exists or why it takes the mysteriously small value that it does. String theory, the idea that all of matter arises from the vibration of dense tubes of energy, seems to say that it should not be able to exist at all. Like alpha, the cosmological constant might well turn out to be not quite a constant; the vacuum energy it represents might vary over billions of years, or it might one day decay away into radiation. In this latter case, the Universe would cease to accelerate in the far future and alpha might start to increase again.

From alpha to omega

And then there is omega, a number that measures the present-day deviation of our Universe's expansion rate from the "critical rate". The critical rate is the value that separates universes that will expand forever from those that will contract back to a "big crunch". Observation shows omega to be close to 1, and popular theories of the Universe, such as inflation, predict that it will be within 1 part in 100,000 of that figure. With such a small margin either way, we may never know on which side of the critical divide we lie.

Finding the complete explanation of these constants is the greatest challenge that physicists face. It might be that they are all uniquely and completely determined by some as yet unfound Theory of Everything. Or perhaps only some of them are determined in this way: the cosmological constant and omega may be bound up with the starting conditions of the Universe, while the values of other constants like alpha and G might just fall out at random as a result of processes that occur during the Universe's very early history.

The signs are that any theory of everything that explains these constants can exist only if the world has many more dimensions of space than the three that we see. These "extra" dimensions would have to behave very differently. Where are they? Perhaps they are imperceptibly small, or their influence only manifests itself through the action of gravity. If they exist, then the true, unchanging constants of nature will only emerge from a theory that embraces and explains the characteristics of all the dimensions. The three-dimensional shadows that we call our constants of nature will be neither fundamental nor necessarily constant at all. If it is the higher-dimensional constants that are the only true constants, and the extra dimensions they inhabit were to expand in size or move relative to one another, we would observe the shadows of our three-dimensional "constants", like e and G, to change at the same rate. This could have profound consequences; the properties of these extra dimensions are likely to prove crucial in our quest to understand the constants of nature. Only when we know why these constants take the values they do will we be able to say we understand the Universe.
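For reference, the standard definition the article leaves implicit compares the Universe's mean density with the critical density set by the present Hubble rate H_0:

    \Omega \equiv \frac{\rho}{\rho_c}, \qquad \rho_c = \frac{3 H_0^2}{8\pi G};

with no dark energy, Ω > 1 implies eventual recollapse and Ω < 1 perpetual expansion, with Ω = 1 marking the critical divide.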

John D. Barrow is Professor of Mathematical Sciences at Cambridge University and Director of the Millennium Mathematics Project. His new book The Constants of Nature: From alpha to omega is published by Jonathan Cape on 5 September