Superstring Theory


Superstring Theory - Why we need it, how it's progressing

Superstring Theory - In Search of the Inner Mechanics of the Universe

Throughout history, humanity has attempted to understand the nature of the universe by developing theories describing its inner mechanics. During the Renaissance and the Enlightenment, great minds such as Bacon, Copernicus, and Newton introduced important tools and methods that gave birth to science. In the hundreds of years since, theories devised to describe the universe have become increasingly accurate in their prediction of physical properties and increasingly powerful in their expanding validity over an ever-wider range of physical conditions. Today, scientists are attempting to develop the ultimate theory, one that will accurately predict the behavior of the universe under all possible conditions. Many physicists believe it is likely that when this theory is complete, it will have its roots in superstring theory.

The successes and problems of accepted theories

Before attempting to look at superstring theory, it is useful to examine the theories currently accepted by science. Today, physicists believe that there are three fundamental forces in nature: gravity, the electroweak force, and the strong force. They rely on Einstein's General Relativity to describe the effects of gravity and on the Standard Model to describe everything else. Experiments have shown that each theory is accurate in its predictions to the highest precision achievable by modern science. However, the fundamental inability of GR and the Standard Model to work together consistently at the quantum level necessitates a more complete and less arbitrary picture of the universe.

Einstein's General Relativity (hereafter abbreviated as GR) describes gravity as the apparent effect of the distortion of space-time geometry. According to the theory, the energy of objects in the universe bends space-time, changing the absolute distance between points in space-time. To someone moving through bent space-time, this leads to the perception of a gravitational force exerted by matter that is proportional to the energy of the source. An elegant description, GR has passed all experimental tests. However, it is fundamentally flawed in that it does not incorporate quantum theory.

The preponderance of experimental evidence accumulated over the past century shows that nature obeys quantum theory. Among the fundamental features of quantum theory are that energy can exist only in discrete quantities (a physicist would say that energy is quantized) and that there are inherent uncertainties in the prediction and measurement of physical properties. GR has none of these characteristics. In GR, energy is not quantized, and there is no inherent uncertainty associated with either the prediction or the measurement of the path, or worldline, of objects in space-time. In short, GR is not consistent with quantum theory.

Today, under the conditions that scientists can create or observe, gravity is weak enough compared to other forces that its own quantum characteristics are insignificant. However, under conditions that occur inside black holes or at the birth of the universe, the strength of gravity becomes comparable to the strength of the electroweak and the strong forces, the other two fundamental forces. Under those conditions, one must include the quantum nature of gravity in one's calculations to get results that make sense. Thus, if one wishes to fully unravel the mysteries of black holes or of cosmology, one must first obtain a more complete theory of gravity that is consistent with quantum theory and works well together with the Standard Model, the accepted theory for electroweak and strong interactions.

Unlike GR, the Standard Model does not treat force as the consequence of the distortion of space-time geometry. Instead, the Standard Model (hereafter abbreviated as SM) describes elementary particles moving through undistorted (a physicist would say flat) space-time. Elementary particles are objects that have no structure and possess no dimension, that is, no width, length, or height. According to the SM, these particles nonetheless have certain inherent properties that distinguish them from one another and govern their interactions. The universe is composed of two broad categories of particles distinguished from each other by a characteristic called spin. Fermions have half-integer spins (1/2, 3/2, 5/2, etc.) while Bosons have integer spins (0, 1, 2, etc.). Loosely speaking, Fermions such as the electron constitute matter while Bosons such as the photon carry force. Besides the electron and the photon, there are other types of Fermions and Bosons with their own distinguishing properties. Of course, although physicists assumed the existence of particles when developing the Standard Model, they did not arbitrarily assign properties to particles. Instead, they derived the existence of particular types of particles starting from the known properties of nature.

First, physicists required that particles obey quantum theory and special relativity (essentially the requirement that particles cannot travel faster than the speed of light in vacuum). This constrained the mathematical description of particle interactions to a class of theories called quantum field theories. Then, they required that the mathematical equations governing particle interactions obey certain known symmetries of nature. What is a symmetry? A symmetry is a change under which the behavior of a system remains the same. There are two types of symmetries relevant to quantum field theories: gauge symmetries act at each point in space-time independently, while global symmetries act on the universe as a whole. When developing the Standard Model, physicists assumed that the universe obeys SU(3) x SU(2) x U(1) gauge symmetry. The technical definition of SU(3) x SU(2) x U(1) gauge symmetry is unimportant to this discussion. What is important, and actually quite elegant, is the result that the existence of the strong and electroweak forces and the properties of the different particles are direct consequences of particle interactions that obey the chosen gauge symmetry. Today, if one asks a particle physicist what the Standard Model is, the answer would be that the Standard Model is a quantum field theory with SU(3) x SU(2) x U(1) gauge symmetry.
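To make the idea of a symmetry concrete, here is a small numerical sketch (an illustrative toy example, not part of the Standard Model itself): multiplying a quantum amplitude by a phase factor leaves every measurable probability unchanged, which is the simplest instance of a U(1) symmetry.

```python
# Toy illustration of a U(1) symmetry (assumed example for this discussion):
# rotating a quantum amplitude by a phase e^{i*theta} leaves the
# observable probability |psi|^2 exactly the same.
import cmath

psi = 0.6 + 0.8j                       # some quantum amplitude, |psi|^2 = 1
theta = 1.234                          # an arbitrary phase angle
psi_rotated = psi * cmath.exp(1j * theta)

# Both probabilities agree -- the "behavior of the system" is unchanged.
print(round(abs(psi) ** 2, 12))          # 1.0
print(round(abs(psi_rotated) ** 2, 12))  # 1.0
```

The same invariance holding independently at every point of space-time, rather than globally, is what distinguishes a gauge symmetry from a global one.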

There are two problems with the Standard Model: one is aesthetic, the other more fundamental. From an aesthetic point of view, physicists dislike the fact that, even with the requirements that the Standard Model obey quantum theory, special relativity, and SU(3) x SU(2) x U(1) symmetry, the theory is still not unique. That is, instead of narrowing the theory down to a single set of self-consistent mathematical equations, the above three constraining conditions lead to an entire range of consistent equations that differ by a host of parameters. To construct a Standard Model with predictive power in our universe, physicists had to determine its free parameters by measuring a number of universal constants experimentally. While the resulting model was astonishingly accurate, the fact that it was not unique left, and still leaves, the troubling possibility that ours may be one of many self-consistent universes that obey quantum theory and special relativity but vary in the choice of symmetry group and free parameters. This is especially disturbing since we cannot, even in principle, observe whether the other universes actually exist.

While the uniqueness problem is troubling, the divergence problem of quantum gravity shows that the Standard Model is fundamentally flawed. For many years now, physicists have tried without success to recast Einstein's General Relativity in the mode of quantum field theory by following the same path of development that led them to the Standard Model. To do this, they have constructed equations that attempt to represent gravity as a force carried by an elementary particle called a graviton. These equations are supposed to work consistently with the Standard Model when the quantum effects of gravity are significant and reduce to the equations representing General Relativity when quantum effects are insignificant. However, when one tries to use these theories in conjunction with the Standard Model, one always gets nonsensical answers. Physicists call this the divergence problem of quantum gravity.

One can trace the cause of the divergence problem of quantum gravity to the point-particle nature of the quantum field theory framework. In quantum field theories such as the Standard Model, one arrives at results by using a set of rules developed by physicists, notably the late Richard Feynman, to calculate quantum amplitudes of various contact-interactions (interactions in which particles meet one another at specific points in space-time). The quantum amplitudes allow one to analyze the effects of various interactions and arrive at a precise picture of the final outcome of interesting processes. Unfortunately, the Feynman methodology does not work for quantum gravity because contact interactions involving gravitons lead to infinite amplitudes that prove useless for analyzing interactions.

Superstring theory offers solutions

Over the years, people have proposed several approaches to resolving the problems with unifying quantum gravity and the Standard Model. One approach involves unifying electroweak and strong interactions into one overarching interaction governed by a unified force. This approach is called grand unification. Another approach, the Kaluza-Klein mechanism, suggests investigating whether the differences in particle properties in four dimensions result from the presence of extra dimensions that are too small for current experiments to detect. A third approach suggests that introducing a "supersymmetry" that relates Fermions to Bosons could help alleviate the unification problem.

The above three proposals looked like disparate approaches until string theory entered the unification fray. The key principle of string theory is the assumption that small vibrating one-dimensional objects with length and tension constitute the universe. According to string theory, if one examines an elementary particle with fine enough resolution, one will find that it is really a vibrating one-dimensional object existing as a loop or as a strand. The apparent properties of the elementary particle are then the result of the way in which the string vibrates in space-time and of the geometry of the space-time in which it vibrates.

This arrangement gives two immediate benefits. One, since strings interact along their length instead of at a single point in space-time, the quantum gravity divergence problem caused by point-particle contact interactions disappears. Two, string theories that are consistent with quantum theory and special relativity are unique. Each string theory has no dimensionless free parameters, and one does not need to arbitrarily choose the symmetry group, such as SU(3) x SU(2) x U(1).

Furthermore, the introduction of string theory as a means to unifying gravity and the Standard Model crystallizes the three other proposals regarding unification. First, string theory's picture of the universe satisfies grand unification because it replaces separate families of particles with one object, the string. Second, the only way to get consistent string theories is with the assumption that there exist extra dimensions in which the strings vibrate. Thus, consistency requires that string theory incorporate the ideas of the Kaluza-Klein mechanism. Lastly, a string theory that contains both bosons and fermions automatically incorporates supersymmetry. In fact, realistic string theories are called superstring theories for their use of supersymmetry.

Initial setback and the First Superstring Revolution

The aforementioned advantages of superstring theory were not always evident. String theory began life in 1968 with the publication of Veneziano's paper on Bosonic strings. Later, in 1971, work by Ramond, Neveu, and Schwarz added Fermions to the picture to form superstring theory. At the time, the theory was presented as a candidate for a theory of strong force interactions. However, it soon fell out of favor due to several problems. One, quantum chromodynamics, the quantum field theory approach to strong interactions, proved successful. Two, superstring theory included a massless spin-2 boson that did not belong in any theory of strong interactions. Three, it contained extra space-time dimensions that seemed out of place in a four-dimensional universe.

By the mid-1970s, these and other problems had driven all but a few dedicated researchers out of the field. The few diehards who continued working on superstring theory included Joel Scherk and John Schwarz, who, in 1974, proposed that superstring theory offered a path not towards strong force interactions, but towards the creation of a unified theory of gravity, the electroweak force, and the strong force. They showed, in their proposal, that the spin-2 boson acted as a graviton whose behavior would reduce to General Relativity at low energies. They also pointed out that, since the presence of gravity distorts space-time, it was possible that the extra dimensions present in superstring theory were folded into imperceptibly small length scales. Furthermore, strings that act as the basic constituents of a unified theory would be about the size of the Planck length, or around 10^20 (1 followed by 20 zeros) times smaller than the distances probed by strong force interactions. This would make quantum field theory a very good approximation to string theory, which would explain why quantum chromodynamics was successful in describing the strong force.
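The scale comparison above is easy to check numerically. A back-of-the-envelope sketch, using approximate values (only the order of magnitude matters here):

```python
# Rough scale comparison (approximate values assumed):
PLANCK_LENGTH = 1.6e-35       # metres, approximately
STRONG_FORCE_SCALE = 1e-15    # metres, roughly the diameter of a proton

ratio = STRONG_FORCE_SCALE / PLANCK_LENGTH
print(f"ratio ~ {ratio:.0e}")  # on the order of 10^19 to 10^20
```

At such a separation of scales, the stringy structure is invisible to strong-interaction experiments, which is why a point-particle theory like quantum chromodynamics works so well.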

Still, there were enough problems left with superstring theory that people did not pay much heed to Schwarz and Scherk's bold idea until 1984, when Michael Green and John Schwarz discovered a ten dimensional superstring theory whose properties appeared qualitatively consistent with our universe. This convinced much of the theoretical physics community that superstring theory might be a viable candidate for a unified description of all forces. The resulting explosion in the amount of attention and effort devoted to string theory is now called the First Superstring Revolution. By 1985, a total of five self-consistent ten-dimensional superstring theories existed. These were type I, type IIA, type IIB, E8 X E8 heterotic, and SO(32) heterotic theories.

Obstacles and conundrums

At the time of their discovery, the existence of five self-consistent string theories seemed like an embarrassment of riches. The five theories were seemingly unrelated to one another, yet all seemed to be qualitatively similar to our universe. No one knew which of the five was actually the correct theory, and this became an obstacle blocking the advancement towards a satisfying theory for our universe.

At the same time, people encountered another roadblock to understanding superstring theory. Namely, no one had any idea how to write down the equations of any flavor of string theory exactly. Unfortunately, without exact formulations of superstring theory, one could not make interesting numerical predictions. The best that one could do was to use an approximation method called perturbation theory to investigate the qualitative aspects of the different superstring theories. This led to many interesting results, but ultimately, an exact formulation of superstring theory was essential for creating a useful and verifiable model of the universe.

During the next decade, the five superstring theories issue and the exact formulation issue languished. In the meantime, theorists made some progress on the study of the compactification of extra dimensions.

Experimental evidence and common experience show that our universe has three dimensions of space and one dimension of time. Yet superstring theories need exactly ten dimensions to maintain self-consistency. Physicists got around this problem by postulating that the six extra dimensions are folded into length scales too small for experiments to detect. This is called compactification. Using perturbative methods, physicists investigated the qualitative features of compactification. They found that, if compactified into a particular form called a Calabi-Yau manifold, the six extra dimensions of E8 X E8 heterotic superstring theory would affect string vibrations in such a way as to give a low energy picture that resembled the Standard Model. This was a notable success. However, as was the case in many other areas of research in superstring theory, theorists could not push any further without breakthroughs in resolving either the five theories issue or the exact formulation issue. These breakthroughs came with the application of an idea called duality.

A new era begins with the Second Superstring Revolution

A duality is a hidden connection between two vastly different-looking theoretical pictures that can transform one into the other and vice versa. By the 1990s, theorists had discovered one type of duality, called T duality. In the words of John Schwarz, T duality connects two theories if "theory A compactified on a space of large volume is equivalent to theory B compactified on a space of small volume." What this means is that if two theories are T dual, then both would have the same behavior if the six hidden dimensions of space-time in one theory were folded into something with a large characteristic size while the hidden dimensions in the other theory were folded into something with a small characteristic size. T duality connected type IIA and IIB theories to each other. It also connected E8 X E8 heterotic and SO(32) heterotic theories to each other. Thus, T duality partially solved the five unique theories problem by connecting two pairs of theories to one another.
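The flavor of T duality can be illustrated with the standard textbook formula for a closed string on a circle of radius R. A sketch in string units (alpha' = 1) with oscillator contributions omitted: the mass-squared spectrum is M^2 = (n/R)^2 + (wR)^2, where n counts momentum modes and w counts windings, and swapping n with w while inverting R leaves the spectrum untouched.

```python
# Hedged sketch of T duality for a closed string on a circle
# (string units, alpha' = 1, oscillator terms omitted):
#   M^2 = (n/R)^2 + (w*R)^2
# where n = momentum mode number, w = winding mode number.
# T duality: swap n <-> w and send R -> 1/R; the spectrum is unchanged.

def mass_squared(n, w, R):
    """Momentum plus winding contribution to the closed-string mass."""
    return (n / R) ** 2 + (w * R) ** 2

R = 0.5
for n in range(4):
    for w in range(4):
        original = mass_squared(n, w, R)
        dual = mass_squared(w, n, 1 / R)   # swapped modes, inverted radius
        assert abs(original - dual) < 1e-12
```

A string on a very small circle is therefore physically indistinguishable from a string on a very large one, which is the "large volume equals small volume" statement quoted above in miniature.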

In the mid-1990s, a rapid succession of dramatic discoveries by Sen, Hull, Townsend, Seiberg, Witten, Schwarz, Polchinski, Strominger, Vafa, and Maldacena touched off the Second Superstring Revolution.

One of the discoveries was that S duality connected type I theory to SO(32) heterotic theory and type IIB theory to itself. Under S duality, the behavior of one theory at weak coupling (the interaction strength is weak), where perturbative methods work well, is equivalent to the behavior of the other theory at strong coupling (the strength of interaction is strong), where perturbative methods are invalid and exact formulations are needed. Thus, S duality not only showed that type I theory and SO(32) heterotic theory are connected, but offered a way to investigate the nonperturbative behavior of these theories. To probe strong coupling behavior in type I theory, one would just look at the perturbative behavior of SO(32) heterotic theory. To look at the strong coupling behavior of type IIB theory, one would just look at its own perturbative behavior.

Further revelations took place when the investigation of strong coupling behavior enabled by S and T duality led to the discovery that, in type IIA and E8 X E8 heterotic theories, an 11th dimension emerges at strong coupling. This was behavior that perturbative methods were incapable of finding, and its discovery has led string theorists to an eleven dimensional theory called M theory. Some string theorists believe that M theory is an overarching framework that contains the other string theories as special limiting cases. Others believe that M theory is one part of the overall picture but does not subsume other string theories. In any case, the discovery of M theory has led to important consequences, including the increasing likelihood that higher dimensional objects called P-branes will play an integral role in any exact formulation of string theory.

The promising road ahead

Superstring theory has grown much since its early days as a theory of strong interactions. In 1984, the First Superstring Revolution made superstring theory the best candidate, and only serious contender, for a theory capable of unifying gravity and the Standard Model. Ten years later, the breakthroughs of the Second Superstring Revolution gave string physicists a newfound ability to explore the nonperturbative behavior of superstrings. However, much work remains to be done before there will finally be a unified theory of physics. Today, physicists still do not know how to formulate the complete superstring theory, and M theory remains an enigmatic treasure trove of potential insights. Furthermore, even with the successful completion of the superstring/M theory framework, one might still need guiding principles and initial conditions to extract the configuration of our own universe from the possibly infinite number of solutions of that framework. No one, as of yet, knows what those principles and conditions may be. However, the sheer beauty and elegance of string theory inspires physicists forward in their quest to unravel its mysteries. It is not improbable that one day, perhaps decades from now, the study of superstring theory will lead to a satisfactory understanding of the inner mechanics of the universe.

I have this saved on my computer, but I forgot where I got it from. I find superstring theory VERY interesting... anyone got any thoughts on it?
Here is another website I have in my list: It is kind of technical, but it is very interesting. :)
ohhhh geez, I read half of it, will do the rest later.

so basically these guys are trying to consider space as a non-continuous entity, which means it is "discrete" or quantizable.

I have my doubts about it, since you can always split something in half; you might have read about the dichotomy paradox.

Space is like math: you can always have a smaller amount.
Luis G, I would like to challenge your notion that:
Space is like math, you can always have a smaller amount

Let's imagine we have two objects, say needles with finite points, moving towards one another, and they won't stop until they meet. If they keep moving closer to one another at half the remaining distance per second, then mathematically they will never touch, because the distance between them can be cut in half an infinite number of times. But theoretically and logically they must at some point come together. If we start this process in reverse, where the two finite points are touching, and move them apart the smallest distance possible so that they aren't touching, then mathematically you should still be able to cut that distance in half, regardless of its size.
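The halving process described above is easy to sketch numerically: the gap shrinks extremely fast, but after any finite number of steps it is still strictly positive.

```python
# Numerical sketch of the halving argument above: the separation keeps
# shrinking but never reaches zero in any finite number of steps.
gap = 1.0  # starting separation, arbitrary units

for second in range(60):
    gap /= 2.0  # each second, the needles close half the remaining distance

print(gap)  # 2^-60, about 8.7e-19 -- tiny, yet still strictly positive
```

Of course, real needles move at a constant speed rather than slowing down at each step, which is why they do meet: the infinite series of ever-shorter steps takes a finite total time.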
nice wording :D

Infinitesimal math is a bit more complex: we use the term "infinite" to describe amounts beyond our imagination, and we also use the term "infinitesimal" to define things that are very small.

However, such things "exist" as such, either numerically or in mass; we just use the term infinitesimal to say that they are "too small to measure," which in other words is an imprecision of the physics (can't remember the word for that).

Yes, space must be made of "discrete" particles; however, such particles must be composed of other "discrete" particles... and so on.

I see your point, I hope you see mine.
I just love that particular concept. It's one of the reasons that I think math alone is imperfect when dealing with the laws of physics. By the way, did you know that scientists believe they have actually discovered the one true atom (indivisible)? I don't believe it, though. Eventually they'll discover it has contents of its own.
that's precisely what I meant; by considering one "true atom" they are making space discrete, or quantizable.
By the way, I didn't read what LL posted. Looks pretty long-winded, and I already have a million other sources I'm reading on the subject. My response was solely to that portion of your comment.
the thought of reading that, at the moment, makes my head spin. My theory on string theory is, cotton.
But with all that, the uncertainty principle still holds sway. The moment you've got it all worked out, it changes.
The uncertainty principle merely shows us that we cannot observe accurately without affecting that which we are observing. However, one day we will have a device to observe with that does not interfere with the process itself, which will render the uncertainty principle a part of history.
HeXp£Øi± said:
The uncertainty principle merely shows us that we cannot observe accurately without affecting that which we are observing. However, one day we will have a device to observe with that does not interfere with the process itself, which will render the uncertainty principle a part of history.
Should you be watching Gandhi? :)
I haven't read that yet LL, but I have a question (if it was covered, I guess I'll find out this afternoon when I get a chance to read it):

I had read a few books and articles that discussed string theory, and it seemed pretty clear that there was something very special mathematically about the ten-dimensional heterotic version. In fact, string theory demanded that space be either 10 or (I think) 26 dimensional in order to remain logically consistent, and maintain the theory's structure.

Well, it seems that popularity is shifting to the 11 dimensional M-brane theory. The question is how is the extra dimension accounted for? If ten was the magic number, what changed to allow for 11? Is there something inherent in branes being two-dimensional instead of one that makes this possible (or requires 11)?
Where did you get that article? I just started reading it, and already I've run into some serious flaws. There are four forces, not three: gravity (the weakest of the four), the electromagnetic force, the weak force, and the strong force.
That would be like saying matter can only exist in three states: solid, liquid, and gas. Hello, plasma?
HeXp£Øi± said:
The uncertainty principle merely shows us that we cannot observe accurately without affecting that which we are observing. However, one day we will have a device to observe with that does not interfere with the process itself, which will render the uncertainty principle a part of history.

I don't believe that is possible. Einstein spent a great deal of time (and brainpower) pondering just that question, and he failed to solve it for one simple reason: uncertainty isn't just about the interaction between the measurement and the measured phenomenon, but also about the underlying nature of reality. Specifically, even if you had a measuring instrument that somehow didn't affect the particle or whatever you are measuring when you "look" at it, you still could not determine with unlimited accuracy both values of a conjugate pair (such as position and momentum). Particles really are fuzzy in that regard. It's not a mathematical trick... it's part of nature. Along those lines, the "phantom particles" or "virtual particles" in quantum physics invoked to explain the double slit experiment are not just mathematical sleight of hand... they are as real as you and I are. The nature of the universe is simply not in line with what we believe is normal or logical.
HeXp£Øi± said:
I just started reading it, and already I've run into some serious flaws. There are four forces, not three: gravity (the weakest of the four), the electromagnetic force, the weak force, and the strong force.

They didn't leave out one of the forces, they just used the term electroweak to refer to the electromagnetic and weak nuclear forces. It has been shown that these are actually different facets of a unified force that displays a broken symmetry at the energy levels we commonly deal with.

Personally, I think that's a bit misleading and confusing, since it has been shown that the strong nuclear force is also united with the electroweak force at very high energy levels, yet they still refer to it separately.
I think we will eventually have devices that will catch whatever particle is emanating from the source and multiply it a billionfold (to make it more detectable to our equipment) so that we might have a more accurate picture of the event. Now, I'm not using the word unlimited, but that's practically an impossibility given our current technology. I do, however, envision us making leaps and bounds in the next twenty years when it comes to how accurately we observe. After all, if we can actually detect & catch neutrinos....