[...]The end of the Second World War in Europe, at least as normally recounted, does not make sense, for in its standard form as learned in history books that history resembles nothing so much as a badly written finale to some melodramatic Wagnerian opera. [...]To appreciate how badly written a finale it truly is, it is best to begin at the logical place: in Berlin, far below ground, in the last weeks of the war. [...]Generaloberst Heinrici, commander of the vastly outnumbered Army Group Vistula that faces the massed armies of Marshal Zhukov poised less than sixty miles from Berlin, is pleading with his leader for more troops. The general is questioning the disposition of the forces he sees displayed on the battle map, for it is clear to him that some of Germany's finest and few remaining battle-worthy formations are far south, facing Marshal Koniev's forces in Silesia. These forces were thus, incomprehensibly, poised to make a stiff defense of Breslau and Prague, not Berlin. The general pleads for Hitler to release some of these forces and transfer them north, but to no avail. "Prague," the Fuhrer responds stubbornly, almost mystically, "is the key to winning the war." Generaloberst Heinrici's hard-pressed troops must "do without."
One may also imagine Heinrici and the other assembled generals casting a doleful glance at Norway on the situation map, where thousands of German troops are still stationed, occupying a country that had long since ceased to be of any strategic or operational value to the defense of the Reich. Why indeed did Hitler maintain so many German troops in Norway up to the very end of the war? These paradoxical German troop deployments are the first mystery of the badly written finale of the war in Europe. Both Allied and German generals would ponder it after the war, and both would write it off to Hitler's insanity, a conclusion that would become part of the "Allied Legend" of the end of the war. This interpretation does make sense, for if one assumes that Hitler was having a rare seizure of sanity when he ordered these deployments, what could he possibly have been thinking? Prague? Norway? There were no standard or conventional military reasons for the deployments. In other words, the deployments themselves attest to his complete loss of touch with military reality.
[...]But on the Allied side of the Allied Legend, things are equally peculiar. In March and April of 1945, US General George S. Patton's Third Army is literally racing across southern Bavaria, as fast as is operationally possible, making a beeline for:
(1) the huge Skoda munitions works at Pilsen, a complex all but blown off the map by Allied bombers;
(2) Prague; and
(3) a region of the Harz Mountains in Thuringia known to Germans as the Dreiecks or "Three Corners," a region encompassed by the old medieval towns and villages of Arnstadt, Jonastal, Wechmar, and Ohrdruf.
One is informed by countless history books that this maneuver was thought to be necessary by the Supreme Headquarters of the Allied Expeditionary Force (SHAEF) because of reports that the Nazis were planning to make a last stand in the "Alpine National Redoubt", a network of fortified mountains stretching from the Alps to the Harz Mountains. The Third Army's movements, so the story goes, were designed to cut off the "escape route" of Nazis fleeing the carnage of Berlin. Maps are produced in old history books, accompanied in some cases by de-classified German plans - some dating from the Weimar Republic! - for just such a redoubt. Case settled. However, there is a problem with that explanation. Allied aerial reconnaissance would likely have told Eisenhower and SHAEF that there were precious few fortified strong points in the "National Redoubt". Indeed, it would have told them that the "Redoubt" was no redoubt at all. General Patton and his divisional commanders would most certainly have been privy to at least some of this information. So why the extraordinary and almost reckless speed of his advance, an advance the post-war Allied Legend would have us believe was to cut off the escape route of Nazis fleeing Berlin - who, it turns out, weren't fleeing - to a redoubt that didn't exist? The mystery deepens.
Then, remarkably, in a strange twist of fate, General Patton himself, America's most celebrated general, dies suddenly, and, some would say, suspiciously, as a result of complications from injuries he sustained in a freak automobile accident soon after the end of the war and the beginning of the Allied military occupation.
[...]Matters are not helped by events on the other side of the world in the Pacific theater, for there American investigators would uncover similarly strange goings-on after the war ended. There, after Nagasaki, the Emperor Hirohito, overriding his ministers who wanted to continue the war, decided that Japan would surrender unconditionally. But why would Hirohito's ministers urge continuance of the war in the face of overwhelming Allied conventional arms superiority and, from their point of view, a potential rain of atomic bombs? After all, "two" bombs could just as easily have turned into twenty. One could, of course, attribute the ministers' objections to the Emperor's intentions to "proud samurai traditions" and the Japanese sense of "honor" and so on. And that would indeed be a plausible explanation. But another explanation is that Hirohito's cabinet ministers knew something. What his ministers probably knew was what American intelligence would soon discover: that the Japanese, "just prior to their surrender, had developed and successfully test fired an atomic bomb. The project had been housed in or near Konan (Japanese name for Hungnam), Korea, in the peninsula's North." It was exploded, so the story goes, one day after the American plutonium bomb, "Fat Man", exploded over Nagasaki, i.e., on August 10, 1945. The war, in other words, depending on Hirohito's decision, could have "gone nuclear". By that time, of course, it would have done Japan no good to prolong the war, since it had no viable means of delivering an atomic weapon to any worthwhile strategic American target. The Emperor stood his ministers down. These allegations constitute yet another difficulty for the Allied Legend, for where did Japan obtain the necessary uranium for its (alleged) A-bomb? And, more importantly, the technology to enrich it?
[...]Finally, a curious fact, one of those obvious things that one tends to overlook unless attention is drawn to it: the atomic bomb test that took place at the Trinity site in New Mexico was a test of America's implosion-plutonium bomb, a test needed to see if the concept would actually work. It did, and magnificently. But what is immensely significant - a fact missing from almost all mainstream literature on the subject since the end of the war - is that the uranium bomb, with its apparatus of a cannon shooting the critical mass of uranium together, the bomb that was actually first used in war, the bomb dropped on Hiroshima, was never tested. As German author Friedrich Georg notes, this tears a rather gaping hole in the Allied Legend: [...] Why was the uranium bomb of the USA, unlike the plutonium bomb, not tested prior to being hurled on Japan? Militarily this would appear to be extremely dangerous.... Did the Americans simply forget to test it, or did others already do it for them?
[...]So, what exactly did the German pilot Hans Zinsser see on that night in October 1944, as he flew his Heinkel bomber over the twilight skies of northern Germany? Something that, had he known it, would require the previous badly written Wagnerian libretto to be almost completely revised. His affidavit is contained in a military intelligence report of August 19, 1945, roll number A1007, filmed in 1973 at Maxwell Air Force Base in Alabama. Zinsser's statement is found on the last page of the report:
47. A man named ZINSSER, a Flak rocket expert, mentioned what he noticed one day: In the beginning of Oct, 1944 I flew from Ludwigslust (south of Lubeck), about 12 to 15 km from an atomic bomb test station, when I noticed a strong, bright illumination of the whole atmosphere, lasting about 2 seconds.
48. The clearly visible pressure wave escaped the approaching and following cloud formed by the explosion. This wave had a diameter of about 1 km when it became visible and the color of the cloud changed frequently. It became dotted after a short period of darkness with all sorts of light spots, which were, in contrast to normal explosions, of a pale blue color.
49. After about 10 seconds the sharp outlines of the explosion cloud disappeared, then the cloud began to take on a lighter color against the sky covered with a gray overcast. The diameter of the still visible pressure wave was at least 9000 meters while remaining visible for at least 15 seconds.
50. Personal observations of the colors of the explosion cloud found an almost blue-violet shade. During this manifestation reddish-colored rims were to be seen, changing to a dirty-like shade in very rapid succession.
51. The combustion was lightly felt from my observation plane in the form of pulling and pushing.
52. About one hour later I started with an He 111 from the A/D at Ludwigslust and flew in an easterly direction. Shortly after the start I passed through the almost complete overcast (between 3000 and 4000 meter altitude). A cloud shaped like a mushroom with turbulent, billowing sections (at about 7000 meter altitude) stood, without any seeming connections, over the spot where the explosion took place. Strong electrical disturbances and the impossibility to continue radio communication as by lightning, turned up.
53. Because of the P-38s operating in the area Wittenberg-Mersburg I had to turn to the north but observed a better visibility at the bottom of the cloud where the explosion occured (sic). Note: It does not seem very clear to me why these experiments took place in such crowded areas.
In other words, a German pilot had observed the test of a weapon having all the signatures of a nuclear bomb: electromagnetic pulse and the resulting malfunction of his radio, a mushroom cloud, continuing fire and combustion of nuclear material in the cloud, and so on. And all this on territory clearly under German control, in October of 1944, fully eight months before the first American A-bomb test in New Mexico.
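As a purely illustrative aside, the figures in Zinsser's affidavit allow a rough consistency check of scale. The sketch below (Python, an editorial back-of-envelope calculation, not from the source) computes the mean expansion speed of the pressure front implied by the two reported diameters; treating the "at least 15 seconds" of visibility as the growth interval is an assumption, since the affidavit does not say exactly when each diameter was observed.

# Back-of-envelope check of the expansion rate implied by Zinsser's figures.
# Diameters come from the affidavit; the 15-second interval is an assumption
# made here only for illustration.

initial_diameter_m = 1_000   # "a diameter of about 1 km when it became visible"
later_diameter_m = 9_000     # "at least 9000 meters"
elapsed_s = 15               # assumed interval between the two observations

radial_growth_m = (later_diameter_m - initial_diameter_m) / 2
mean_front_speed_m_s = radial_growth_m / elapsed_s

print(f"Implied mean front speed: ~{mean_front_speed_m_s:.0f} m/s "
      "(for comparison, sound at sea level travels ~340 m/s)")

On those assumptions the front averages roughly 270 meters per second; the sketch only makes the reported scale concrete, it neither confirms nor explains the event.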
[...]In a nutshell: the Allied Legend about the German failure to obtain the atom bomb because they never had a functioning reactor is simply utter scientific nonsense, because a reactor is needed only if one wants to produce plutonium. It is an unneeded, and expensive, development if one only wants to make a uranium A-bomb. Thus there is sufficient reason, given the science of bomb-making and the political and military realities of the war after America's entry, to conclude that the Germans decided to develop only a uranium bomb, since that afforded the best, most direct, and technologically least complicated route to the acquisition of a bomb.
[Source: Joseph P. Farrell - Reich of the Black Sun: Nazi Secret Weapons and the Cold War Allied Legend]
Silent Weapons for Quiet Wars
[...]In order to achieve a totally predictable economy, the low-class elements of society must be brought under total control, i.e., must be housebroken, trained, and assigned a yoke and long-term social duties from a very early age, before they have an opportunity to question the propriety of the matter. In order to achieve such conformity, the lower-class family unit must be disintegrated by a process of increasing preoccupation of the parents and the establishment of government-operated day-care centers for the occupationally orphaned children. The quality of education given to the lower class must be of the poorest sort, so that the moat of ignorance isolating the inferior class from the superior class is and remains incomprehensible to the inferior class. With such an initial handicap, even bright lower class individuals have little if any hope of extricating themselves from their assigned lot in life. This form of slavery is essential to maintain some measure of social order, peace, and tranquility for the ruling upper class.
[...]From the time a person leaves its mother's womb, its every effort is directed towards building, maintaining, and withdrawing into artificial wombs, various sorts of substitute protective devices or shells. The objective of these artificial wombs is to provide a stable environment for both stable and unstable activity; to provide a shelter for the evolutionary processes of growth and maturity - i.e., survival; to provide security for freedom and to provide defensive protection for offensive activity. This is equally true of both the general public and the elite. However, there is a definite difference in the way each of these classes go about the solution of problems.
The primary reason why the individual citizens of a country create a political structure is a subconscious wish or desire to perpetuate their own dependency relationship of childhood. Simply put, they want a human god to eliminate all risk from their life, pat them on the head, kiss their bruises, put a chicken on every dinner table, clothe their bodies, tuck them into bed at night, and tell them that everything will be alright when they wake up in the morning. This public demand is incredible, so the human god, the politician, meets incredibility with incredibility by promising the world and delivering nothing. So who is the bigger liar? the public? or the "godfather"? This public behavior is surrender born of fear, laziness, and expediency. It is the basis of the welfare state as a strategic weapon, useful against a disgusting public.
Most people want to be able to subdue and/or kill other human beings which disturb their daily lives, but they do not want to have to cope with the moral and religious issues which such an overt act on their part might raise. Therefore, they assign the dirty work to others (including their own children) so as to keep the blood off their hands. They rave about the humane treatment of animals and then sit down to a delicious hamburger from a whitewashed slaughterhouse down the street and out of sight. But even more hypocritical, they pay taxes to finance a professional association of hit men collectively called politicians, and then complain about corruption in government.
Again, most people want to be free to do the things (to explore, etc.) but they are afraid to fail. The fear of failure is manifested in irresponsibility, and especially in delegating those personal responsibilities to others where success is uncertain or carries possible or created liabilities (law) which the person is not prepared to accept. They want authority (root word - "author"), but they will not accept responsibility or liability. So they hire politicians to face reality for them.
Big Bang as Genesis
[...]Just as the Ptolemaic cosmology was yoked to the theology of medieval Catholicism, so the Big Bang is today entangled with religious and theological ideas. It is used to support those concepts, and religion in turn is marshaled in defense of modern cosmology. Once again, as four hundred years ago, some theologians attempt to define which scientific concepts are permissible and which are not. The new scientific revolution, like the Copernican revolution, is not an attack on religion as a whole, but on the entanglement of science and religion--the idea that religious authority can dictate or reject scientific doctrines, or that the evidence of science can be used to bolster religious authority.
[...]So we should not be surprised that today cosmology remains entangled with religion. From theologians to physicists to novelists, it is widely believed that the Big Bang theory supports Christian concepts of a creator. In February of 1989, for example, the front-page article of the New York Times Book Review argued that scientists and novelists were returning to God, in large part through the influence of the Big Bang. A character in John Updike's 1987 novel Roger's Version is cited as typical of the trend. The character, a computer hacker, says, "The physicists are getting things down to the ultimate details and the last thing they ever expected to be happening is happening. God is showing through, facts are facts . . . God the Creator, maker of heaven and earth. He made it, we now can see, with such incredible precision that a Swiss watch is just a bunch of little rocks by comparison."
Astrophysicist Robert Jastrow echoes the same theme in his widely noted God and the Astronomers: the Big Bang of the astronomers is simply the scientific version of Genesis, a universe created in an instant, therefore the work of a creator. These ideas are repeated in a dozen or more popular books on cosmology and fundamental physics.
Such thinking is not limited to physicists and novelists, who could perhaps be dismissed as amateur theologians. Ever since 1951, when Pope Pius XII asserted that the still-new Big Bang supports the doctrine of creation ex nihilo, Catholic theologians have used it in this way. The pope wrote in an address to the Pontifical Academy of Sciences, "In fact, it seems that present-day science, with one sweeping step back across millions of centuries, has succeeded in bearing witness to that primordial 'Fiat lux' [Let there be light] uttered at the moment when, along with matter, there burst forth from nothing a sea of light and radiation, while the particles of the chemical elements split and formed into millions of galaxies. . . . Hence, creation took place in time, therefore, there is a Creator, therefore, God exists!"
To be sure, these views are by no means unanimous within the Catholic Church. The present pope, John Paul II, is far more cautious in mixing science and religion. In his own address on the subject, he repeatedly apologized on behalf of the Church for the persecution of Galileo and reaffirmed the autonomy of religion and science. Addressing the Pontifical Academy of Sciences in 1981 he paraphrased Galileo, saying that the Bible "does not wish to teach how heaven was made but how one goes to heaven." It is therefore not up to religion, he argues, to judge one or another cosmological theory. Yet in the same address, John Paul II favorably quotes from Pius XII's earlier speech and contends that the question of the beginning of the universe is not one that can be solved by science alone--to do so requires "above all the knowledge that comes from God's revelation."
[Source: Eric J Lerner - The Big Bang Never Happened]
Cosmology and Ideology
There has always been an intimate relation between the ideas dominant in cosmology and the ideas dominant in society. It would be astonishing if that relationship had come to an end in our present enlightened times. Not that cosmologists directly derive their theories from social or political ideas--far from it. But what sounds reasonable to them cannot but be influenced by events in the world around them and what they and others think about it.
So it is certainly no coincidence that the period during which the Big Bang was in eclipse, from around 1957 to 1964, corresponds to the time of the most vigorous expansion of postwar recovery and a resurgence of confidence in progress. The Big Bang's golden age in the seventies, on the other hand, corresponds to the end of the postwar boom and a new decade of growing pessimism. In fact, the links between cosmological and social ideas were made explicit by both cosmologists and political writers of the period.
In the late sixties and the early seventies the postwar recovery ended in all the market economies. Real wages peaked in the United States and Western Europe, and somewhat later in Japan. In the Third World, per capita grain production, the best overall indicator of food supply and living standards, reached 340 kilograms per year and stopped rising, only to remain there for the next twenty years. While the food supply had increased by 50 percent from the depths of the late forties, it only recovered the levels of 1913.
As the seventies wore on, the economic problems facing the entire world, east as well as west, became more obvious. As had happened before, new markets were being saturated and increasing pressure was put on wages and living standards worldwide, as industrialists strove to maintain and increase their companies' profitability.
Almost as soon as this cessation of growth began to manifest itself, social ideas that justified the situation as inevitable started to circulate. In 1968 the Club of Rome, bringing together industrialists and academics, championed the idea of zero growth: the earth is finite, the universe is running down, it is impossible to continue the increase in living standards. The two oil crises were interpreted as warnings of the exhaustibility of finite resources-- a logic that must appear quaint to oil producers who now go to war in a struggle against a persistent glut.
Many writers used the Big Bang cosmology and the idea of universal decay to buttress the argument that consumption has to be restrained. In his 1976 book The Poverty of Power Barry Commoner begins from the cosmological premise that "the universe is constantly, irretrievably becoming less ordered than it was", and concludes that, given this overall tendency, Americans must make do with less in order to postpone the inevitable day when total disorder reigns on earth. The faltering universe of the Big Bang became a metaphor for the faltering economy--both equally inevitable processes, beyond the control of mere mortals.
Nor were cosmologists and physicists immune from the influence of such analogies. In the popular 1977 account of the Big Bang, The First Three Minutes, Nobel Prize winner Steven Weinberg concludes by contemplating the philosophical lessons of this universe, which will end either in the icy cold of final decay and infinite expansion, or in the fiery collapse to a new singularity: "It is almost irresistible for humans to believe that we have some special relation to the universe, that human life is not just a more or less farcical outcome of a chain of accidents reaching back to the first three minutes, but that we were somehow built in from the beginning. As I write this I happen to be in an airplane at 30,000 feet, flying over Wyoming en route home from San Francisco to Boston. Below, the earth looks very soft and comfortable--fluffy clouds here and there, snow turning pink as the sun sets, roads stretching straight across the country from one town to another. It is very hard to realize that this all is just a tiny part of an overwhelmingly hostile universe. It is even harder to realize that this present universe has evolved from an unspeakably unfamiliar early condition, and faces a future extinction of endless cold or intolerable heat. The more the universe seems comprehensible, the more it also seems pointless."
For Weinberg, as for others, the universe of the Big Bang is irreconcilable with human progress. The end may come billions of years from now, but in the end all that the human race accomplished in aeons will be nothing, of no consequence. Progress, then, is an illusion, as it was for Augustine sixteen hundred years ago. The only question is when it will stop--now, or at some point in the future. It is thus no surprise that the Big Bang flourished simultaneously with the social ideas, like zero growth, that deny the reality of progress, and with a growing economic crisis that, at least in the short term, had stalled that progress. Once again, cosmology justified the course of events on earth.
But there is probably no better example in this century of the interaction of social ideology and cosmology than the development of the inflationary universe in the eighties. Nineteen eighty, with the coming to power of conservative administrations in America and elsewhere, marked the end of a period of fashionable pessimism and the beginning of a decade of speculative boom. Alan Guth arrived at his idea of cosmic inflation just as the worst monetary inflation of the century was coming to a climax. He concluded that the universe is a "free lunch" just as the American economy began its own gigantic free lunch--a period of speculation which rewarded its wealthy participants while actual production stagnated.
Throughout the decade, the rise of financial speculation in Wall Street was shadowed by the rise of cosmologists' speculations in Princeton, Cambridge, and elsewhere. As Witten and his colleagues were acclaimed by the press as geniuses for theories that produced not a single valid prediction, so men like Michael Milken and Donald Trump earned not only far greater fame but also incomes that peaked, in Milken's case, at half a billion dollars per year for paper manipulations that added not a single penny to the nation's production.
In the realm of finance, fortunes were built on a tower of debt. A speculator would borrow four billion dollars to buy a company, sell it for five billion to another speculator, who would, in turn, break it up to sell it in pieces for six billion dollars--all on borrowed money. All involved reaped handsome profits and were hailed as geniuses of financial wizardry--until their indictments.
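The arithmetic of such a chain is easy to make explicit. The sketch below (Python, purely illustrative) uses the dollar figures from the paragraph above and adds a hypothetical financing cost, which is not in the source, to show how each participant can book a paper profit while nothing new is produced.

# Illustrative arithmetic for the debt-financed buyout chain described above.
# Purchase and sale prices come from the text; the 10% financing cost is a
# hypothetical assumption added here so the profits are not overstated.

deals = [
    ("first speculator", 4e9, 5e9),    # borrows $4bn, sells the company for $5bn
    ("second speculator", 5e9, 6e9),   # borrows $5bn, sells the pieces for $6bn
]

assumed_financing_rate = 0.10  # hypothetical cost of the borrowed money

for name, borrowed, sale_price in deals:
    financing_cost = borrowed * assumed_financing_rate
    paper_profit = sale_price - borrowed - financing_cost
    print(f"{name}: borrowed ${borrowed/1e9:.0f}bn, sold for ${sale_price/1e9:.0f}bn, "
          f"paper profit ~${paper_profit/1e9:.1f}bn after assumed financing")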
The result of this was an actual decline in living standards both in the U.S. and throughout the world: by the end of the eighties real family income in the U.S. had dropped by 10 percent and was at the same level as it had been twenty-five years earlier, despite the fact that most families by now had two incomes.
Obviously, the small-scale speculators of cosmology did not, in any conscious way, imitate the large-scale speculators of Wall Street. Yet, as in every other epoch, society's dominant ideas permeated cosmology. If the wealthiest members of society earned billions by mere manipulation of numbers, without building a single factory or mill, it didn't seem too strange that scientific reputations could be made with theories that have no more relation to reality. If a tower of financial speculation could be built on debt--the promise of future payment--then, similarly, a tower of cosmological speculation could be built on promises of future experimental confirmation.
There was, however, a more direct relationship between the development of the economy over the past decade and the development of cosmology and science generally. The eighties saw a slashing, particularly in the U.S., of the amount of money devoted to nonmilitary research and development and a drastic slowing of technical advance.
To a large extent, this intensified a tendency evident in the seventies and even in the sixties. Since 1960 there has not been a single major qualitative breakthrough in physical technology.
The thirty years before 1960 saw a series of fundamental developments: television in the thirties; the transistor, computer, radar, and, of course, nuclear energy in the forties; the development of space travel and the laser in the fifties. In the subsequent three decades there have been dramatic improvements in all these areas, particularly in computers, but not a single qualitatively new, functional idea. Only in biology has genetic engineering brought about a qualitative advance.
This is a profound change for modern society: not since the beginning of the industrial revolution 250 years ago has there been a similar period of three decades without major technical advances. Such technical stagnation has a deep impact on science and technology. An advancing society, which requires and thus supports fundamental work in science and technology, continually generates challenges for the pure sciences and provides the materials needed to meet those challenges. Thus the problems arising from the development of electricity and electrotechnology in the late nineteenth century led directly to the study of nuclear structure and eventually to the release of nuclear energy. When technological progress slows or ceases, that cross-fertilization of theory and experiment, thought and action, begins to wither and scientists begin to turn to sterile speculation.
The slowing of technology is, today, directly linked to the growth of financial speculation. Five billion dollars invested in buying, say, Hughes Aircraft, is five billion dollars that the buyer, General Motors, will not put into new factories or new research. To the extent that the world market appears to be saturated, as it does today, then profits are easier to make through speculation than in production. What use is new technology if new factories aren't profitable? The diversion of financial resources from technical advance has pushed thousands of scientists away from the challenges of the real world into the deserts of speculation.
Fortunately for science, even the perfection of existing technologies, such as the computer, requires a broad base of scientific research. But it is fundamental research--investigations whose findings don't seem to be immediately useful--that suffers first when technological development slows. Today those areas are clearly cosmology and particle or high-energy physics--where the link between science and technology, theory and human progress, has been broken almost completely. It is here that, as in postclassical Greece, the stagnation of society has led to the return of mathematical myths, a retreat from the problems of base matter to the serene contemplation of numbers.
Today cosmologists often pride themselves on the isolation of their work from the everyday world and from any possible application. They and their particle theorist colleagues give their hypothetical entities whimsical and comical names to flaunt their belief that their activity is, at base, an elaborate and difficult game, the "free play of the mind." In a society beset by growing crises, a world of poverty, crime, drugs, and AIDS, a world without progress, the pure realm of mathematics offers a serene cloister.
[Source: Eric J Lerner - The Big Bang Never Happened]
The Structure of Scientific Revolutions
[...] one of the things a scientific community acquires with a paradigm is a criterion for choosing problems that, while the paradigm is taken for granted, can be assumed to have solutions. To a great extent these are the only problems that the community will admit as scientific or encourage its members to undertake. Other problems, including many that had previously been standard, are rejected as metaphysical, as the concern of another discipline, or sometimes as just too problematic to be worth the time. A paradigm can, for that matter, even insulate the community from those socially important problems that are not reducible to the puzzle form, because they cannot be stated in terms of the conceptual and instrumental tools the paradigm supplies.
[...] once it has achieved the status of a paradigm, a scientific theory is declared invalid only if an alternate candidate is available to take its place. No process yet disclosed by the historical study of scientific development at all resembles the methodological stereotype of falsification by direct comparison with nature. [...] the act of judgement that leads scientists to reject a previously accepted theory is always based upon more than a comparison of that theory with the world. The decision to reject one paradigm is always simultaneously the decision to accept another, and the judgement leading to that decision involves the comparison of both paradigms with nature AND with each other.
[...] Almost always the men who achieve these fundamental inventions of a new paradigm have been either very young or very new to the field whose paradigm they change. And perhaps that point need not have been made explicit, for obviously these are men who, being little committed by prior practice to the traditional rules of normal science, are particularly likely to see that those rules no longer define a playable game and to conceive another set that can replace them.
[...] Why should a change of paradigm be called a revolution? In the face of the vast and essential differences between political and scientific development, what parallelism can justify the metaphor that finds revolutions in both? [...] Political revolutions aim to change political institutions in ways that those institutions themselves prohibit.
[...] Like the choice between competing political institutions, that between competing paradigms proves to be a choice between incompatible modes of community life. Because it has that character, the choice is not and cannot be determined merely by the evaluative procedures characteristic of normal science, for these depend in part upon a particular paradigm, and that paradigm is at issue. When paradigms enter, as they must, into a debate about paradigm choice, their role is necessarily circular. Each group uses its own paradigm to argue in that paradigm's defense. [...] whatever its force, the status of the circular argument is only that of persuasion. It cannot be made logically or even probabilistically compelling for those who refuse to step inside the circle. [...] As in political revolutions, so in paradigm choice - there is no standard higher than the assent of the relevant community.
[...] cumulative acquisition of novelty is not only rare in fact but improbable in principle. Normal research, which is cumulative, owes its success to the ability of scientists regularly to select problems that can be solved with conceptual and instrumental techniques close to those already in existence. (That is why an excessive concern with useful problems, regardless of their relation to existing knowledge and technique, can so easily inhibit scientific development.) The man who is striving to solve a problem defined by existing knowledge and technique is not, however, just looking around. He knows what he wants to achieve, and he designs his instruments and directs his thoughts accordingly. Unanticipated novelty, the new discovery, can emerge only to the extent that his anticipations about nature and his instruments prove wrong.
[...] To the extent, as significant as it is incomplete, that two scientific schools disagree about what is a problem and what a solution, they will inevitably talk through each other when debating the relative merits of their respective paradigms. In the partially circular arguments that regularly result, each paradigm will be shown to satisfy more or less the criteria that it dictates for itself and to fall short of a few of those dictated by its opponent. [...] since no paradigm ever solves the problems it defines and since no two paradigms leave all the same problems unsolved, paradigm debates always involve the question: Which problems is it more significant to have solved?
[...] What a man sees depends both upon what he looks at and also upon what his previous visual-conceptual experience has taught him to see. [...] Far more clearly than the immediate experience from which they in part derive, operations and measurements are paradigm-determined. Science does not deal in all possible laboratory manipulations. Instead, it selects those relevant to the juxtaposition of a paradigm with the immediate experience that the paradigm has partially determined.
[...] Both scientists and laymen take much of their image of creative scientific activity from an authoritative source that systematically disguises - partly for important functional reasons - the existence and significance of scientific revolutions. [...] As the source of authority, I have in mind principally text-books of science together with both the popularizations and the philosophical works modeled on them. [...] All three record the stable OUTCOME of past revolutions and thus display the bases of the current normal-scientific tradition. To fulfill their function they need not provide authentic information about the way in which those bases were first recognized and then embraced by the profession. In the case of textbooks, at least, there are even good reasons why, in these matters, they should be systematically misleading. Textbooks [...] have to be rewritten in whole or in part whenever the language, problem-structure, or standards of normal science change. In short, they have to be rewritten in the aftermath of each scientific revolution, and, once rewritten, they inevitably disguise not only the role but the very existence of the revolutions that produced them.
[...] Textbooks thus begin by truncating the scientist's sense of his discipline's history and then proceed to supply a substitute for what they have eliminated. Characteristically, textbooks of science contain just a bit of history, either in an introductory chapter or, more often, in scattered references to the great heroes of an earlier age. From such references both students and professionals come to feel like participants in a long-standing historical tradition. Yet the textbook-derived tradition in which scientists come to sense their participation is one that, in fact, never existed. For reasons that are both obvious and highly functional, science textbooks (and too many of the older histories of science) refer only to that part of the work of past scientists that can easily be viewed as contributions to the statement and solution of the texts' paradigm problems. Partly by selection and partly by distortion, the scientists of earlier ages are implicitly represented as having worked upon the same set of fixed problems and in accordance with the same set of fixed canons that the most recent revolution in scientific theory and method has made seem scientific. No wonder textbooks and the historical tradition they imply have to be rewritten after each scientific revolution. And no wonder that, as they are re-written, science once again comes to seem largely cumulative.
[...] And Max Planck, surveying his own career in his Scientific Autobiography, sadly remarked that "a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it".
[...] paradigm debates are not really about relative problem-solving ability, though for good reasons they are usually couched in those terms. Instead, the issue is which paradigm should in the future guide research on problems many of which neither competitor can yet claim to resolve completely. A decision between alternate ways of practicing science is called for, and in the circumstances that decision must be based less on past achievement than on future promise. The man who embraces a new paradigm at an early stage must often do so in defiance of the evidence provided by problem-solving. He must, that is, have faith that the new paradigm will succeed with the many large problems that confront it, knowing only that the older paradigm has failed with a few. A decision of that kind can only be made on faith.
[Source: Thomas S Kuhn - The Structure of Scientific Revolutions]
[...] once it has achieved the status of a paradigm, a scientific theory is declared invalid only if an alternate candidate is available to take its place. No process yet disclosed by the historical study of scientific development at all resembles the methodological stereotype of falsification by direct comparison with nature. [...] the act of judgement that leads scientists to reject a previously accepted theory is always based upon more than a comparison of that theory with the world. The decision to reject one paradigm is always simultaneously the decision to accept another, and the judgement leading to that decision involves the comparison of both paradigms with nature AND with each other.
[...] Almost always the men who achieve these fundamental inventions of a new paradigm have been either very young or very new to the field whose paradigm they change. And perhaps that point need not have been made explicit, for obviously these are men who, being little commited by prior practice to the traditional rules of normal science, are particularly likely to see that those rules no longer define a playable game and to conceive another set that can replace them.
[...] Why should a change of paradigm be called a revolution? In the face of the vast and essential differences between political and scientific development, what parallelism can justify the metaphor that finds revolutions in both? [...] Political revolutions aim to change political institutions in ways that those institutions themselves prohibit.
[...] Like the choice between competing political institutions, that between competing paradigms proves to be a choice between incompatible modes of community life. Because it has that character, the choice is not and cannot be determined merely by the evaluative procedures characteristic of normal science, for these depend in part upon a particular paradigm, and that paradigm is at issue. When paradigms enter, as they must, into a debate about paradigm choice, their role is necessarily circular. Each group uses its own paradigm to argue in that paradigm's defense. [...] whatever its force, the status of the circular argument is only that of persuasion. It cannot be made logically or even probabilistically compelling for those who refuse to step inside the circle. [...] As in political revolutions, so in paradigm choice - there is no standard higher than the assent of the relevant community.
[...] cumulative acquisition of novelty is not only rare in fact but improbable in principle. Normal research, which is cumulative, owes its success to the ability of scientists regularly to select problems that can be solved with conceptual and instrumental techniques close to those already in existence. (That is why an excessive concern with useful problems, regardless of their relation to existing knowledge and technique, can so easily inhibit scientific development.) The man who is striving to solve a problem defined by existing knowledge and technique is not, however, just looking around. He knows what he wants to achieve, and he designs his instruments and directs his thoughts accordingly. Unanticipated novelty, the new discovery, can emerge only to the extent that his anticipations about nature and his instruments prove wrong.
[...] To the extent, as significant as it is incomplete, that two scientific schools disagree about what is a problem and what a solution, they will inevitably talk through each other when debating the relative merits of their respective paradigms. In the partially circular arguments that regularly result, each paradigm will be shown to satisfy more or less the criteria that it dictates for itself and to fall short of a few of those dictated by its opponent. [...] since no paradigm ever solves the problems it defines and since no two paradigms leave all the same problems unsolved, paradigm debates always involve the question: Which problems is it more significant to have solved ?
[...] What a man sees depends both upon what he looks at and also upon what his previous visual-conceptual experience has taught him to see. [...] Far more clearly than the immediate experience from which they in part derive, operations and measurements are paradigm-determined. Science does not deal in all possible laboratory manipulations. Instead, it selects those relevant to the juxtaposition of a paradigm with the immediate experience that the paradigm has partially determined.
[...] Both scientists and laymen take much of their image of creative scientific activity from an authoritative source that systematically disguises - partly for important functional reasons - the existence and significance of scientific revolutions. [...] As the source of authority, I have in mind principally text-books of science together with both the popularizations and the philosophical works modeled on them. [...] All three record the stable OUTCOME of past revolutions and thus display the bases of the current normal-scientific tradition. To fulfill their function they need not provide authentic information about the way in which those bases were first recognized and then embraced by the profession. In the case of textbooks, at least, there are even good reasons why, in these matters, they should be systematically misleading. Textbooks [...] have to be rewritten in whole or in part whenever the language, problem-structure, or standards of normal science change. In short, they have to be rewritten in the aftermath of each scientific revolution, and, once rewritten, they inevitably disguise not only the role but the very existence of the revolutions that produced them.
[...] Textbooks thus begin by truncating the scientist's sense of his discipline's history and then proceed to supply a substitute for what they have eliminated. Characteristically, textbooks of science contain just a bit of history, either in an introductory chapter or, more often, in scattered references to the great heroes of an earlier age. From such references both students and professionals come to feel like participants in a long-standing historical tradition. Yet the textbook-derived tradition in which scientists come to sense their participation is one that, in fact, never existed. For reasons that are both obvious and highly functional, science textbooks (and too many of the older histories of science) refer only to that part of the work of past scientists that can easily be viewed as contributions to the statement and solution of the texts' paradigm problems. Partly by selection and partly by distortion, the scientists of earlier ages are implicitly represented as having worked upon the same set of fixed problems and in accordance with the same set of fixed canons that the most recent revolution in scientific theory and method has made seem scientific. No wonder textbooks and the historical tradition they imply have to be rewritten after each scientific revolution. And no wonder that, as they are re-written, science once again comes to seem largely cumulative.
[...] And Max Planck, surveying his own career in his Scientific Autobiography, sadly remarked that "a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it".
[...] paradigm debates are not really about relative problem-solving ability, though for good reasons they are usually couched in those terms. Instead, the issue is which paradigm should in the future guide research on problems many of which neither competitor can yet claim to resolve completely. A decision between alternate ways of practicing science is called for, and in the circumstances that decision must be based less on past achievement than on future promise. The man who embraces a new paradigm at an early stage must often do so in defiance of the evidence provided by problem-solving. He must, that is, have faith that the new paradigm will succeed with the many large problems that confront it, knowing only that the older paradigm has failed with a few. A decision of that kind can only be made on faith.
[Source: Thomas S. Kuhn - The Structure of Scientific Revolutions]
Labels:
edu
The World of Free Energy - by Peter Lindemann
The World of Free Energy
by Peter Lindemann, D.Sc.
In the late 1880s, trade journals in the electrical sciences were predicting "free electricity" in the near future. Incredible discoveries about the nature of electricity were becoming commonplace. Nikola Tesla was demonstrating "wireless lighting" and other wonders associated with high-frequency currents. There was an excitement about the future like never before.
Within 20 years, there would be automobiles, airplanes, movies, recorded music, telephones, radio, and practical cameras. The Victorian Age was giving way to something totally new. For the first time in history, common people were encouraged to envision a utopian future, filled with abundant modern transportation and communication, as well as jobs, housing and food for everyone. Disease would be conquered, and so would poverty. Life was getting better, and this time, everyone was going to get "a piece of the pie." So, what happened? In the midst of this technological explosion, where did the energy breakthroughs go? Was all of this excitement about "free electricity", which happened just before the beginning of the last century, just wishful thinking that "real science" eventually disproved?
Current State of Technology
Actually, the answer to that question is NO. In fact, the opposite is true. Spectacular energy technologies were developed right along with the other breakthroughs. Since that time, multiple methods for producing vast amounts of energy at extremely low cost have been developed. None of these technologies have made it to the "open" consumer market as an article of commerce, however. Exactly why this is true will be discussed shortly. But first, I would like to describe to you a short list of "free energy" technologies that I am currently aware of, and that are proven beyond all reasonable doubt. The common feature connecting all of these discoveries is that they use a small amount of one form of energy to control or release a large amount of a different kind of energy. Many of them tap the underlying aether in some way, a source of energy conveniently ignored by "modern" science.
1) Radiant Energy.
Nikola Tesla's Magnifying Transmitter, T. Henry Moray's Radiant Energy Device, Edwin Gray's EMA Motor, and Paul Baumann's Testatika Machine all run on Radiant Energy. This natural energy form can be gathered directly from the environment (mistakenly called "static" electricity) or extracted from ordinary electricity by the method called "fractionation." Radiant Energy can perform the same wonders as ordinary electricity, at less than 1% of the cost. It does not behave exactly like electricity, however, which has contributed to the scientific community's misunderstanding of it. The Methernitha Community in Switzerland currently has 5 or 6 working models of fuelless, self-running devices that tap this energy.
2) Permanent Magnets.
Dr. Robert Adams (NZ) has developed astounding designs of electric motors, generators and heaters that run on permanent magnets. One such device draws 100 watts of electricity from the source, generates 100 watts to recharge the source, and produces over 140 BTUs of heat in two minutes! Dr. Tom Bearden (USA) has two working models of a permanent-magnet-powered electrical transformer. It uses a 6-watt electrical input to control the path of a magnetic field coming out of a permanent magnet. By channeling the magnetic field, first to one output coil and then to a second output coil, and by doing this repeatedly and rapidly in a "Ping-Pong" fashion, the device can produce a 96-watt electrical output with no moving parts. Bearden calls his device a Motionless Electromagnetic Generator, or MEG. Jean-Louis Naudin has duplicated Bearden's device in France. The principles for this type of device were first disclosed by Frank Richardson (USA) in 1978. Troy Reed (USA) has working models of a special magnetized fan that heats up as it spins. It takes exactly the same amount of energy to spin the fan whether it is generating heat or not. Beyond these developments, multiple inventors have identified working mechanisms that produce motor torque from permanent magnets alone.
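To put the Adams figures quoted above into consistent units, here is a minimal arithmetic sketch. The BTU-to-joule conversion is the standard one; the 100-watt and 140-BTU-in-two-minutes numbers are simply the article's own claims, taken at face value.

```python
# Unit-conversion sketch for the Adams device figures quoted above.
# The BTU-to-joule conversion is standard; the input/output numbers are the
# article's claims, reproduced here only to express them in watts.

BTU_TO_JOULES = 1055.06

electrical_input_w = 100.0            # watts drawn from the source, per the text
claimed_heat_j = 140 * BTU_TO_JOULES  # "over 140 BTUs of heat" is roughly 147,700 J
duration_s = 2 * 60                   # "in two minutes"

claimed_thermal_power_w = claimed_heat_j / duration_s
print(round(claimed_thermal_power_w))                          # ~1231 W of heat claimed
print(round(claimed_thermal_power_w / electrical_input_w, 1))  # ~12.3x the electrical input
```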
3) Mechanical Heaters.
There are two classes of machines that transform a small amount of mechanical energy into a large amount of heat. The best of these purely mechanical designs are the rotating cylinder systems designed by Frenette (USA) and Perkins (USA). In these machines, one cylinder is rotated within another cylinder with about an eighth of an inch of clearance between them. The space between the cylinders is filled with a liquid such as water or oil, and it is this "working fluid" that heats up as the inner cylinder spins. Another method uses magnets mounted on a wheel to produce large eddy currents in a plate of aluminium, causing the aluminium to heat up rapidly. These magnetic heaters have been demonstrated by Muller (Canada), Adams (NZ) and Reed (USA). All of these systems can produce ten times more heat than standard methods using the same energy input.
4) Super-Efficient Electrolysis.
Water can be broken into Hydrogen and Oxygen using electricity. Standard chemistry books claim that this process requires more energy than can be recovered when the gases are recombined. This is true only under the worst case scenario. When water is hit with its own molecular resonant frequency, using a system developed by Stan Meyers (USA) and again recently by Xogen Power, Inc., it collapses into Hydrogen and Oxygen gas with very little electrical input. Also, using different electrolytes (additives that make the water conduct electricity better) changes the efficiency of the process dramatically. It is also known that certain geometric structures and surface textures work better than others do. The implication is that unlimited amounts of Hydrogen fuel can be made to drive engines (like in your car) for the cost of water. Even more amazing is the fact that a special metal alloy was patented by Freedman (USA) in 1957 that spontaneously breaks water into Hydrogen and Oxygen with no outside electrical input and without causing any chemical changes in the metal itself. This means that this special metal alloy can make Hydrogen from water for free, forever.
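For reference, the following sketch spells out the "standard chemistry book" energy balance that this paragraph refers to. The 285.8 kJ/mol figure is the textbook higher heating value of hydrogen; the 70% electrolyser efficiency is an illustrative assumption of mine, not a number from the article.

```python
# Sketch of the conventional electrolysis energy balance referred to above.
# 285.8 kJ/mol is the standard higher heating value of hydrogen (the energy
# released when H2 and O2 recombine to liquid water); the 70% efficiency is an
# assumed, illustrative figure for a conventional electrolyser, not from the article.

HHV_H2_KJ_PER_MOL = 285.8
ASSUMED_ELECTROLYSER_EFFICIENCY = 0.70  # HHV basis, illustrative only

energy_in_kj = HHV_H2_KJ_PER_MOL / ASSUMED_ELECTROLYSER_EFFICIENCY  # electricity spent per mole of H2
energy_out_kj = HHV_H2_KJ_PER_MOL                                   # heat recovered when the gases recombine

print(round(energy_in_kj, 1))   # ~408.3 kJ in, per mole of H2, by the conventional route
print(round(energy_out_kj, 1))  # 285.8 kJ back when the gases recombine
# This is the conventional deficit that the article claims resonant and catalytic methods sidestep.
```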
5) Implosion/Vortex.
All major industrial engines use the release of heat to cause expansion and pressure to produce work, like in your car engine. Nature uses the opposite process of cooling to cause suction and vacuum to produce work, like in a tornado. Viktor Schauberger (Austria) was the first to build working models of Implosion Engines in the 1930s and 1940s. Since that time, Callum Coats has published extensively on Schauberger's work in his book Living Energies, and subsequently a number of researchers have built working models of Implosion Turbine Engines. These are fuelless engines that produce mechanical work from energy accessed from a vacuum. There are also much simpler designs that use vortex motions to tap a combination of gravity and centrifugal force to produce a continuous motion in fluids.
6) Cold Fusion.
In March 1989, two chemists from the University of Utah (USA) announced that they had produced atomic fusion reactions in a simple tabletop device. The claims were "debunked" within 6 months and the public lost interest. Nevertheless, Cold Fusion is very real. Not only has excess heat production been repeatedly documented, but low-energy atomic element transmutation has also been catalogued, involving dozens of different reactions! This technology definitely can produce low-cost energy and enable scores of other important industrial processes.
7) Solar Assisted Heat Pumps.
The refrigerator in your kitchen is the only "free energy machine" you currently own. It's an electrically operated heat pump. It uses one amount of energy (electricity) to move three amounts of energy (heat). This gives it a "coefficient of performance" (COP) of about 3. Your refrigerator uses one amount of electricity to pump three amounts of heat from the inside of the refrigerator to the outside of the refrigerator. This is its typical use, but it is the worst possible way to use the technology. Here's why. A heat pump pumps heat from the "source" of heat to the "sink" or place that absorbs the heat. The "source" of heat should obviously be HOT and the "sink" for heat should obviously be COLD for this process to work the best. In your refrigerator, it's exactly the opposite. The "source" of heat is inside the box, which is COLD, and the "sink" for heat is the room-temperature air of your kitchen, which is warmer than the source. This is why the COP remains low for your kitchen refrigerator. But this is not true for all heat pumps. COPs of 8 to 10 are easily attained with solar-assisted heat pumps. In such a device, a heat pump draws heat from a solar collector and dumps the heat into a large underground absorber, which remains at 55 °F, and mechanical energy is extracted in the transfer. This process is equivalent to a steam engine that extracts mechanical energy between the boiler and the condenser, except that it uses a fluid that "boils" at a much lower temperature than water. One such system that was tested in the 1970s produced 350 hp, measured on a dynamometer, in a specially designed engine from just 100 sq. ft. of solar collector. (This is NOT the system promoted by Dennis Lee.) The amount of energy it took to run the compressor (input) was less than 20 hp, so this system produced more than 17 times more energy than it took to keep it going! It could power a small neighborhood from the roof of a hot tub gazebo, using exactly the same technology that keeps the food cold in your kitchen. Currently, there is an industrial-scale heat pump system just north of Kona, Hawaii, that generates electricity from temperature differences in ocean water.
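As a quick check on the arithmetic behind the figures above, here is a minimal sketch. The COP formula is just the standard textbook definition (heat moved per unit of work supplied); the 350 hp and 20 hp numbers are the article's own claims.

```python
# Arithmetic sketch for the heat-pump figures quoted above. The COP definition
# is the standard textbook one; the horsepower numbers are the article's claims.

HP_TO_WATTS = 745.7  # one mechanical horsepower in watts

def cop(heat_moved_w: float, work_input_w: float) -> float:
    """Coefficient of performance: heat moved per unit of work supplied."""
    return heat_moved_w / work_input_w

# Kitchen refrigerator example from the text: one part electricity moves three parts heat.
print(cop(heat_moved_w=3.0, work_input_w=1.0))  # 3.0

# The 1970s solar-assisted system as described: 350 hp claimed output, under 20 hp compressor input.
output_w = 350 * HP_TO_WATTS
input_w = 20 * HP_TO_WATTS
print(round(output_w / input_w, 1))  # 17.5 -- the "more than 17 times" ratio quoted
```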
There are dozens of other systems that I have not mentioned, many of them as viable and well tested as the ones I have just recounted. But this short list is sufficient to make my point: free energy technology is here, now. It offers the world pollution-free energy abundance for everyone, everywhere. It is now possible to stop the production of "greenhouse gases" and shut down all of the nuclear power plants. We can now desalinate unlimited amounts of seawater at an affordable price, and bring adequate fresh water to even the most remote habitats. Transportation costs and the production costs for just about everything can drop dramatically. Food can even be grown in heated greenhouses in the winter, anywhere. All of these wonderful benefits that can make life on this planet so much easier and better for everyone have been postponed for decades. Why? Whose purposes are served by this postponement?
The Opportunity
What is starting to happen is that inventors are publishing their work, instead of patenting it and keeping it secret. More and more, people are "giving away" information on these technologies in books, videos and websites. While there is still a great deal of useless information about free energy on the Internet, the availability of good information is rising rapidly. Check out the list of websites and other resources at the end of this article.
It is imperative that you begin to gather all of the information you can on real free energy systems. The reason for this is simple. Focus on what you can do now, not on how much there still is to be done. Small, private research groups are working out the details as you read this. Many are committed to publishing their results on the Internet.
If we stand up and refuse to remain ignorant and action-less, we can change the course of history. It is the aggregate of our combined action that can make a difference. Only the mass action that represents our consensus can create the world we want. Nevertheless, free energy technology is here. It is real, and it will change everything about the way we live, work and relate to each other. In the last analysis, free energy technology obsoletes greed and the fear for survival. But like all exercises of Spiritual Faith, we must first manifest the generosity and trust in our own lives.
The Source of Free Energy is INSIDE of us. It is that excitement of expressing ourselves freely. It is our Spiritually guided intuition expressing itself without distraction, intimidation or manipulation. It is our open-heartedness. Ideally, the free energy technologies underpin a just society where everyone has enough food, clothing, shelter, self-worth, and the leisure time to contemplate the higher Spiritual meanings of Life. Do we not owe it to each other, to face our fears, and take action to create this future for our children's children?
Labels:
sci-tech