Notes on Where Is My Flying Car?

Tags: book, notes, history

My notes from the book Where Is My Flying Car? by J. Storrs Hall.

Author: Christian Mills
Published: September 2, 2024

Chapter 1: The World of Tomorrow

The Allure of Future Technology

  • Clarence L. “Kelly” Johnson, a renowned aeronautical engineer, was inspired by the Tom Swift stories, which depicted a future with flying cars and advanced technology.
    • “I read Tom Swift and His Airship, Tom Swift and His Electric Runabout, Tom Swift and His Submarine Boat, and I said, that’s for me.” - Kelly Johnson

  • The original Tom Swift Sr. series, introduced in 1910, featured gadgets like an electric runabout and a photo telephone, modeled after Thomas Edison.
    • Edison was a celebrated inventor who significantly shaped the 20th century with his inventions and the establishment of the Industrial Research Laboratory.
  • A century ago, people held a strong belief in a bright future driven by technology, fueled by the rapid advancements of the early 20th century.
    • Examples of advancements: Cars, radios, airplanes, widespread electricity, and economic growth.
  • This optimism persisted through the mid-20th century, with the Tom Swift Jr. series inspiring a new generation of readers, while earlier readers like Kelly Johnson went on to become leading figures in aircraft design.
    • Johnson’s contributions included the F-104 Starfighter, the SR-71 Blackbird, and the first business jet, the Lockheed JetStar.

The Rise of Technological Utopianism

  • The vision of a technologically advanced future gained momentum in the early 20th century and was further fueled by the Great Depression.
    • Science fiction flourished during this period, offering a hopeful escape from the hardships of the time.
  • H.G. Wells’ film “Things to Come” (1936) depicted a technological utopia and accurately predicted the onset of World War II and the Blitz.
    • The film’s Art Deco futuristic style influenced subsequent popular speculations about the future.
  • The 1939 World’s Fair, with its “I Have Seen the Future” pin and General Motors’ Futurama exhibit, further solidified the public’s vision of a technologically advanced future.
    • Grover Whalen, a key organizer of the fair, coined the phrase “The World of Tomorrow.”
    • Norman Bel Geddes’ streamlined modern designs, featured in the Futurama exhibit, became synonymous with the future aesthetic.
  • Geddes’ influence extended to various aspects of design, from trains and cars to radios and furniture, characterized by curved lines and rounded corners.
    • His concepts included multi-floored passenger airplanes and self-driving cars.

Wells’ Vision and its Influence

  • In “Things to Come,” Wells portrays a technocratic elite called “Wings Over the World,” a group of airmen who exemplify the emerging technological professional class.
  • Wells’ concept: This new class, possessing advanced scientific knowledge and skills, would naturally assume leadership after the decline of the old order.
  • Wells’ utopian vision includes:
    • A technocratic elite ruling over a scientifically advanced society.
    • Aggressive deployment of technology to solve global problems.
    • Victorian-style exploration and adventurism, prioritizing results over individual safety.
  • Wells’ villains in “Things to Come” are the “do-nots,” individuals who are content with their comfortable lives and resist technological progress.
    • This conflict is not a social class struggle but a clash between those who embrace progress (“doers”) and those who oppose it.
  • The “do-nots” represent a more nuanced version of the Eloi from Wells’ “The Time Machine,” highlighting the potential for societal stagnation due to prolonged comfort and safety.

The Post-War World and Technological Optimism

  • World War II accelerated technological advancements, as promising innovations were rapidly developed, mass-produced, and deployed for military purposes.
    • The war witnessed significant progress in aviation, weaponry, medicine, and other fields.
  • Post-war optimism: The belief that wartime scientific and technological advancements could be repurposed to improve civilian life gained widespread traction.
  • This optimism was reflected in the growing genre of post-war world literature, which often echoed the themes of H.G. Wells’ work.
  • A 1944 review of two books, “Your World Tomorrow” and “Miracles Ahead,” titled “Green Light for the Age of Miracles,” exemplified this optimistic outlook.
  • The review highlighted predictions for transportation, including:
    • Cars with wings for air travel.
    • Helicopter “flivvers” (small personal aircraft).
    • Affordable giant airliners enabling global travel.
  • While flying cars and flivvers remain unrealized, the prediction of affordable air travel has come true.
  • The books also accurately predicted other advancements:
    • High-octane gasoline, factory-built housing, portable radios, pocket-sized televisions, air conditioning, plastics, synthetic foods and materials, electronic cooking controls, fluorescent lights, and global tourism.

Popularizing the World of Tomorrow

  • Walt Disney became a prominent advocate for the World of Tomorrow vision through projects like Tomorrowland and the Experimental Prototype Community of Tomorrow (EPCOT).
  • The promise of flying cars extended beyond science fiction, with companies like Cessna and Piper running ads depicting them in the 1940s and 1960s.
  • Automakers embraced the futuristic aesthetic, designing cars with features resembling aircraft, such as tail fins and simulated jet intakes.
  • Experimental flying car projects emerged, including Ford’s Levacar and the Curtiss-Wright Bee.
  • The flying car was considered a feasible achievement within the context of the burgeoning space age.
  • However, the reality of air travel has fallen short of these optimistic visions, with airports and travel experiences often being less than ideal.

The Shift to Space Travel

  • The focus of futuristic visions shifted from flying cars to space travel, as seen in the covers of Galaxy Science Fiction magazine from 1950 to 1980.
    • Only six covers featured flying cars, while over half depicted space travel.
  • John Glenn’s orbital flight in 1962 marked a turning point, as real-world space exploration began to overshadow the futuristic visions of the 1950s.
  • The Space Age concept, while present in science fiction since the war, gained significant public attention with the launch of Sputnik in 1957.

Arthur C. Clarke’s “Profiles of the Future”

  • Arthur C. Clarke, a renowned science fiction writer and technologist, published “Profiles of the Future” in 1962.
  • The book explored the limits of technological possibilities, ranging from existing technologies like hovercraft to speculative concepts like antigravity.
  • Clarke’s work joined a growing body of literature on futurism and technological forecasting, including Herman Kahn and Anthony J. Wiener’s “The Year 2000.”
  • Clarke’s predictions focused on the mid- to far-future, acknowledging that near-term predictions would likely disappoint those seeking immediate solutions.
  • “Profiles of the Future” became a highly influential text among those who embraced technological optimism.

Accurate Predictions and Inspired Guesses

  • Clarke accurately predicted the development of self-driving cars:
    • “The automobile of the future… will travel there by the most efficient route, after first checking with the highway information system for blockages and traffic jams.” - Arthur C. Clarke

    • “It may one day be a serious offense to drive an automobile on a public highway.” - Arthur C. Clarke

  • Isaac Asimov, in a 1964 article, also envisioned self-driving vehicles with “robot brains.”
  • Mainstream futurists like Kahn and Wiener predicted electric cars and pneumatic tube trains but not self-driving ones.
  • Clarke’s accurate prediction of worldwide communication networks:
    • “We could be in instant contact with each other wherever we may be… It will be possible… for a man to conduct his business from Tahiti or Bali just as well as he could from London.” - Arthur C. Clarke

Predictions from the 1960s: A Mixed Bag

  • Science fiction writers of the 1960s, including Clarke, Asimov, and Heinlein, made various predictions for the present day (2014):
    • Orbital space stations, lunar bases, nuclear rockets, interplanetary travel, pocket telephones, video phones, atomic power, fusion power, space-based military, modular architecture, automatic meal preparation, synthetic food, high-speed transportation, declining highways, self-driving vehicles, cyborgs, robots, automation, wireless energy transmission, translating machines, artificial intelligence, a global library, sea mining, contraception, cures for cancer and other diseases, and psychology and education as hard sciences.
  • Some predictions have come true (e.g., pocket phones, global library/internet), while others have not (e.g., nuclear rockets, interplanetary travel).
  • Many predictions fall into a gray area, with varying degrees of realization (e.g., atomic power, artificial intelligence, sea mining).
  • Overall, the predictions were reasonably accurate, with a tendency to be optimistic by about a decade, except for transportation and space exploration, which lagged significantly behind expectations.

“The Jetsons”: A Cartoon Reflection of the Space Age

  • Hanna-Barbera’s “The Jetsons” (1962) embodied the space-age zeitgeist, portraying a future where aeronautics and space travel are commonplace.
  • The show depicted a working-class family in the year 2062, mirroring the narrative structure of shows like “The Honeymooners” and “The Flintstones.”
  • The first episode showcased various futuristic concepts: flying cars, conveyor belts, powered chairs, large-screen 3D TVs, robot maids, automatic doors, and pneumatic tube transportation.
  • The show’s visual style, with its curved lines, tail fins, and antennae, reflected the futuristic aesthetics of “Things to Come” and Futurama.
  • The Jetsons’ lives were portrayed as incredibly easy and affluent, highlighting the optimistic belief that technological advancements would improve everyone’s lives.

The Subtext of “The Jetsons”

  • “The Jetsons” presented a stark contrast to the realities of 1960s America, where many lacked basic amenities like televisions and telephones.
  • The show’s depiction of effortless luxury and advanced technology implied that technological progress would lead to a future of abundance and leisure for all.
  • The subtext: Advancing technology would elevate everyone’s standard of living, making everyone a “jet-setter.”
  • This optimistic message aligned with the historical trend of rising material well-being in the early 20th century, leading to the expectation of continued progress for future generations.

The “Miracles” and the Zeitgeist

  • The remarkable technological and economic advancements of the early 20th century were considered a set of “miracles” by some economists.
  • However, these advancements were the culmination of a long process of industrial development that had been underway for centuries.
  • The Industrial Revolution, with its innovations in steam power, machinery, and transportation, laid the groundwork for the 20th century’s progress.
  • The key factor that enabled these advancements was the zeitgeist, the spirit of the times, which fostered a culture of innovation and a respect for “doers” over “do-nots.”
  • Physics remained constant; it was the change in people’s attitudes and approach to technology that drove the progress of the Industrial Revolution and beyond.

Chapter 2: The Graveyard of Dreams

The Great Stagnation

Setting the Stage

  • Quote:
    • “The golden age of science fiction gave us more than just optimistic predictions of technological wonders. Gripping fiction requires conflict. Sometimes the aliens are threatening to invade. Sometimes they have already succeeded.”

  • Science fiction often presents dystopian futures alongside optimistic ones.
  • Example: Philip Francis Nowlan’s Armageddon 2419 A.D. (the original Buck Rogers story) depicts a future where America is in ruins and Americans are a hunted race.
  • Detroit’s Decline:
    • Detroit in 1960 was a symbol of American prosperity, with the automobile industry as its cornerstone.
    • By 2013, Detroit had filed for bankruptcy, facing widespread urban decay and economic hardship.
    • Statistics on Detroit’s decline:
      • 40% of street lamps don’t work.
      • 210 of 317 public parks are permanently closed.
      • 1-hour police response time to 911 calls.
      • Only a third of ambulances are drivable.
      • One-third of the city abandoned.
  • Quote: “To achieve this level of devastation, you usually have to be invaded by a foreign power.”
  • Detroit’s decline exemplifies a broader pessimism about the future, contrasting with the optimism of the early-to-mid 20th century.

The Flying Car as a Symbol of Failed Expectations

  • Quote:
    • “It’s hard to predict what life will be like in a hundred years. There are only a few things we can say with certainty. We know that everyone will drive flying cars, that zoning laws will be relaxed to allow buildings hundreds of stories tall, that it will be dark most of the time, and that women will all be trained in the martial arts.” - Paul Graham

  • Film Noir Future: This quote satirizes the common dystopian depiction of the future in film noir, exemplified by films like Blade Runner.
  • Flying Car Expectations: In the 1930s-50s, flying cars were a widely anticipated technological advancement, with news stories reporting progress towards their production.
  • Waning Optimism: This optimism faded in the 1960s and 70s, replaced by a more pessimistic outlook.
  • Quote:
    • “We should also recall the past over-optimism, including the universal prediction in the late 1940s that within a generation, each family would have its own vertical liftoff airplane, a universal society of Jetsons.” - Robert J. Gordon

  • Quote: “We’re very far from the flying car.” - Tyler Cowen
  • Contrasting Expectations: People in 1962 expected significant advancements in transportation technology based on the rapid progress seen in the first half of the 20th century.
  • Historical Context:
    • In 1912, flying machines were in their infancy.
      • The Wright Model B, the first production airplane, was just being sold.
      • Louis Blériot’s flight across the English Channel was a major event.
    • By 1962, fighter planes like the F-104 Starfighter reached speeds over 1,300 mph, and bombers like the B-52 had a range of over 10,000 miles.
    • Commercial airliners like the 727 and DC-8 could carry 170 people at 600 mph.
    • The automobile had become universally owned.
  • Reasonable Expectations: The rapid progress in aviation and automobile technology in the first half of the 20th century led to the reasonable expectation that private aircraft would become commonplace by the 21st century.

Life-Changing Technologies and Economic Growth

  • Life in 1962: The average American in 1962 enjoyed a significantly higher standard of living than in 1900, thanks to numerous technological advancements.
  • Examples of Life-Changing Technologies: Automobiles, airplanes, skyscrapers, antibiotics, movies, pre-made clothing, electric lighting, radio, television, refrigerators, vacuum cleaners, washing machines, and modern stoves.
  • Economic Growth:
    • Americans became 3-4 times richer between 1900 and 1962.
    • GDP per capita grew from $4,000-$5,000 (constant 2005 dollars) in 1900 to $16,000 in 1962 (see the sketch after this list).
    • Personal income remained a consistent 50% of total productivity.
    • Innovations like assembly line manufacturing and containerized shipping boosted productivity.
  • Continued Progress Expected: This period of rapid progress and economic growth fueled expectations of similar advancements in the future, including the widespread adoption of private aircraft.
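
A quick compounding check on the figures above (a minimal sketch; the dollar figures are the note’s own, and the $4,500 starting point is an assumed midpoint of its $4,000-$5,000 range):

```python
# Back-of-the-envelope check of the 1900-1962 growth figures above.
start, end, years = 4_500, 16_000, 62

cagr = (end / start) ** (1 / years) - 1
print(f"Total growth factor: {end / start:.1f}x")   # ~3.6x, i.e. "3-4 times richer"
print(f"Implied annual growth rate: {cagr:.2%}")    # ~2.1% per year
```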

Stagnation Point

The Missing Future

  • Quote: “You call this the future? Where are the flying cars?” - Calvin (from the comic strip Calvin and Hobbes)
  • Persistent Disappointment: The absence of flying cars and other anticipated technologies highlights the perceived failure of modern technology to meet earlier expectations.
  • Quote: “The question, ‘where is my flying car,’ Wikipedia tells us, is emblematic of the supposed failure of modern technology to match futuristic visions that were promoted in earlier decades.”
  • Google Search Results: A Google search for “where is my flying car” returns 781,000 webpages, indicating widespread interest in this topic.
  • The Future’s Uneven Distribution:
    • Quote: “The future is already here. It’s just not very evenly distributed.” - William Gibson
    • Quote:
      • “We wanted flying cars; instead we got the ability to instantly connect with anyone anywhere in the world to share stories, pictures, music, podcasts, ideas, film, animation, comics, feedback, friendship, love, and our lives. Flying cars seem really cool on their face, but I somehow doubt that they would have so meaningful an impact on our lives. After all, George Jetson didn’t even have a cell phone.” - Adam Gurri

  • Alternative Viewpoints: Some argue that the lack of flying cars doesn’t necessarily indicate a slowdown in innovation, but rather a shift in technological focus towards other areas, such as communication and information technology.

Slowing Innovation and Cost Disease

  • The Great Stagnation:
    • Tyler Cowen’s book The Great Stagnation (2011) argues that the US entered a period of economic slump in the 1970s after a period of rapid progress due to the exploitation of easily accessible technological advancements.
    • This stagnation is reflected in slow income growth and a decline in the “American Dream”.
    • Cowen uses median income rates as evidence, while Peter Turchin highlights the flatlining of unskilled labor wages.
    • Turchin observes a shift from an “integrative” to a “disintegrative” phase in various societal trends in the 1970s.
  • Cost Disease:
    • Definition: The phenomenon where the costs of essential goods and services (e.g., housing, education, healthcare) continue to rise in real terms, even after adjusting for inflation.
    • Examples:
      • Housing costs have doubled since the 1960s.
      • Primary education costs have tripled, without a corresponding increase in learning outcomes.
      • Healthcare costs have risen sixfold, while longevity growth has slowed.
      • College tuition and textbooks cost roughly ten times more.
  • Distinguishing from Efficiency-Driven Price Changes: Cost disease is distinct from situations where prices decrease due to increased efficiency (e.g., mass production).
  • Secular Economic Decline:
    • Since the 1970s, actual economic growth has fallen below the historical trend line and has not recovered.
    • The per capita GDP growth rate dropped from 2.8% (pre-1970) to 1.9% (post-1970), a gap that compounds dramatically over decades (see the sketch after this list).
    • The post-war era (1945-1975) saw consistent economic growth, leading to optimistic projections that were not realized.
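
A minimal sketch of how these rates compound, using the note’s figures (the 50-year span and the ~55-year window for the cost-disease multiples are illustrative assumptions):

```python
# The compounding gap between pre-1970 (2.8%) and post-1970 (1.9%) growth.
years = 50
gap = (1.028 ** years) / (1.019 ** years)
print(f"After {years} years, 2.8% growth yields an economy {gap:.2f}x "
      f"larger than 1.9% growth")               # ~1.55x

# Implied real annual increases behind the cost-disease multiples above,
# assuming roughly 55 years "since the 1960s".
for item, multiple in [("housing", 2), ("primary education", 3),
                       ("healthcare", 6), ("college", 10)]:
    rate = multiple ** (1 / 55) - 1
    print(f"{item}: {multiple}x -> {rate:.1%} per year in real terms")
```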

Optimistic Projections and Their Failure

  • Quote:
    • “The most significant discovery emerging from our increasingly sophisticated understanding of the economic growth process is the fact that accelerated economic growth can be achieved by a combination of fiscal policy, monetary policy, and continuously increasing R&D budgets, both federal and private.” - Robert Prehoda

  • Herbert Simon’s Prediction: Herbert Simon, in his 1966 book The Shape of Automation, predicted that average family income would reach $28,000 (1966 dollars) by the turn of the century, based on a 3% growth rate.
  • Reality: Instead of the projected 3% growth, family income grew at roughly 0.5% per year over the following half-century (compounded in the sketch after this list).
  • Factors Contributing to Slowdown:
    • The rise of single-parent households, leading to a loss of economic efficiencies associated with the traditional family structure.
  • Conclusion: While individual indicators of stagnation might be debatable, the combined evidence suggests a significant slowdown in technological progress and economic growth since the 1970s.
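
A minimal compounding sketch of Simon’s projection versus what happened (the 1966 starting income is back-derived from his own 3% assumption, not a figure from the note):

```python
# Simon (1966): ~3% growth -> $28,000 family income (1966 dollars) by 2000.
years = 2000 - 1966
projected = 28_000
implied_start = projected / 1.03 ** years   # starting income his 3% implies
actual = implied_start * 1.005 ** years     # what ~0.5% growth delivers instead

print(f"Implied 1966 family income: ${implied_start:,.0f}")  # ~$10,200
print(f"At 0.5%/yr, year-2000 income: ${actual:,.0f}")       # ~$12,100 vs $28,000
```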

The Airplane Crash

Decline of the Private Airplane Industry

  • Disappearance of the Industry: The private airplane industry, including companies like Cessna and Piper, experienced a significant decline starting in the late 1970s.
  • The 1960s Boom: The 1960s was a period of growth for private aviation, with a wide variety of small family airplanes available.
  • Impact of the Energy Crisis: The energy crisis of the 1970s temporarily impacted the industry, but it recovered within a few years.
  • Current State: Today, only about 700 new piston airplanes, 400 turboprops, and 400 private jets are sold annually.
  • Increased Costs: Due to limited demand, airplane prices have increased more than tenfold compared to the 1970s.

Flatlining of Airliner Speeds

  • Exponential Growth Followed by Plateau: Airliner speeds increased exponentially until the 1960s, then plateaued.
  • Reasons for the Plateau:
    • Physics and Economics: Flying just above Mach 1 is three times more expensive than flying just below it.
    • Subsidized SST Projects: The Concorde and Tupolev Tu-144 supersonic transports were heavily subsidized national projects.
  • Engine Technology Continued to Advance: Despite the plateau in airliner speeds, engine technology continued to improve significantly.
    • Example: The de Havilland Ghost turbojet engine (1950s) had 5,000 pounds of thrust, while the GE90 (1990s) had 115,000 pounds of thrust.
  • Unfulfilled Expectations: Based on the rapid progress in the first half of the 20th century, people in 1962 expected further significant increases in airliner speed, range, and capacity.
  • Stagnation in Air Travel: Instead of further advancements, air travel has seen little improvement in the past 50 years, with some aspects even declining in the pursuit of fuel efficiency.
  • Potential Market for Faster Travel: Despite the existence of technology for faster air travel, it has not been implemented, suggesting roadblocks beyond economic considerations.
  • Energy Considerations: For long international flights, the energy required to fly through the atmosphere is comparable to that needed for orbital flight, which could be significantly faster.
  • Iconic Example of Stagnation: The plateau in air travel serves as a representative example of the broader stagnation in life-improving technologies observed since the 1970s.
  • Historical Analogy: The transition from wooden sailboats to steel steamships demonstrates that the adoption of more expensive technology can be justified by its superior performance.

The Henry Adams Curve

The Historical Trend of Energy Growth

  • Quote:
    • “The coal output of the world, speaking roughly, doubled every ten years between 1840 and 1900, in the form of utilized power, for the ton of coal yielded three or four times as much power in 1900 as in 1840. Rapid as this rate of acceleration in volume seems, it may be tested in a thousand ways without greatly reducing it. Perhaps the ocean steamer is nearest unity and easiest to measure, for anyone might hire in 1905, for a small sum of money, the use of 30,000 steam horsepower to cross the ocean. And by halving this figure every ten years, he got back to 234 horsepower for 1835, which was accuracy enough for his purposes.” - Henry Adams

  • The Henry Adams Curve: A historical trend of roughly 7% annual growth in usable energy available to civilization, observed since the early days of the steam engine (sanity-checked in the sketch after this list).
  • Factors Contributing to the Curve:
    • 3% population growth.
    • 2% energy efficiency growth.
    • 2% growth in per capita energy consumption.
  • Flatlining of Energy Consumption: The Henry Adams curve plateaued in the 1970s, as shown by US energy consumption data.
  • The 1970s Energy Crisis:
    • The energy crisis was caused by Nixon’s price controls (1971), followed by the OPEC oil embargo (1973).
    • The Department of Energy (established 1977) is jokingly suggested to have been created to prevent energy use.
  • Human Power vs. Modern Energy Use:
    • Before the Industrial Revolution, humans relied primarily on their own muscle power and animal power.
    • A human can produce about 100 watts of power daily.
    • The average South American uses 10 times this amount, while the average North American uses 100 times.
  • Potential for Further Growth: It’s conceivable that future technological advancements could significantly increase energy use per capita.
  • Consequences of the Flatline: The plateau in energy consumption has contributed to a slowdown in life-improving technologies, including those related to productivity, health, and transportation.
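
A minimal sketch checking Adams’s halving arithmetic and the curve’s components (the doubling-time formula is standard; everything else comes from the quote and bullets above):

```python
import math

# Adams: halve 30,000 hp per decade from 1905 back to 1835 (7 halvings).
print(30_000 / 2 ** 7)    # 234.375 -> his "234 horsepower for 1835"

# The 7% curve = 3% population + 2% efficiency + 2% per-capita consumption.
rate = 0.03 + 0.02 + 0.02
print(f"Doubling time at {rate:.0%}: "
      f"{math.log(2) / math.log(1 + rate):.1f} years")   # ~10.2 years

# Human-power comparison, from ~100 W of sustained muscle power per person.
print(100 * 10)    # ~1 kW  -> the note's average South American
print(100 * 100)   # ~10 kW -> the note's average North American
```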

Correlation Between Energy Intensity and Technological Progress

  • Analysis of Science Fiction Predictions: Examining the fulfillment of technological predictions from the golden age of science fiction reveals a pattern related to energy intensity.
  • Correlation with Energy Intensity:
    • Technologies requiring low power levels have generally been fulfilled or even exceeded expectations.
    • Technologies requiring moderate power levels have seen mixed results, with some fulfilling expectations and others falling short.
    • Technologies requiring high power levels have consistently failed to meet expectations.
  • Moore’s Law as an Exception: The continued progress in computing and communications, exemplified by Moore’s Law, is an exception because energy consumption was not a major constraint in this area.
  • The Forbidden Zone: The top right quadrant of the graph, representing high-power technologies, appears to be a “forbidden zone” where predicted advancements have not materialized.
  • Denied Futures: This “forbidden zone” represents the futures we were promised but were denied, including flying cars, due to the limitations imposed by the energy flatline.

The Consequences of Power Limitations

  • Quote:

    • “Power is our only lack. We generate all we can with the materials and knowledge at our disposal, but we never have enough. Our development is hindered, our birth rate must be held down to a minimum, many new cities which we need cannot be built, and many new projects cannot be started, all for lack of power.” - E.E. “Doc” Smith

  • This quote from E.E. “Doc” Smith highlights the profound impact of power limitations on societal development and progress.

Chapter 3: The Conquest of the Air

Early Predictions and Skepticism

  • H.G. Wells predicted successful airplane flight before the year 2000. (Wells’ Anticipations, circa 1900)
  • The scientific community, including astronomer Simon Newcomb, largely believed powered, heavier-than-air flight was impossible or at least a million years away. (New York Times, 1903)
  • Newcomb’s critiques, while ultimately incorrect, highlighted key challenges to be solved, such as:
    • How to stop an airplane traveling at high speeds: “How is he ever going to stop?” (Newcomb, 1903)
      • Current airplanes use a “kludge” of descending to the ground at flying speed and slowing down on a runway.
      • This runway requirement prevents airplanes from being practical flying cars.

The Power Curve and Energy Management

  • The key to flying is energy management.
  • Airplanes require power to:
    • Generate lift: Stay aloft.
    • Overcome drag: Maintain speed.
  • Three stores of energy power an airplane:
    1. Speed: Provides a few seconds of flight.
    2. Altitude: Provides a few minutes of flight.
    3. Fuel: Provides a few hours of flight.
  • Takeoff, climb, and landing are phases of managing these energy stores.
  • Landing is particularly problematic:
    • Descending directly downwards converts altitude energy into unwanted speed.
    • Requires a long, slanting approach to bleed off energy.
    • Airplanes must maintain flying speed until touchdown.
  • Induced drag (related to lift) decreases with increasing speed.
  • Parasitic drag (related to other air disturbances) increases with increasing speed.
  • The power curve represents the total power needed to fly at different speeds (see the numerical sketch after this list).
    • Flying faster or slower than the optimal speed requires more power.
  • Birds can efficiently land because they can:
    • Spread their wings for high lift and high drag.
    • Flap vigorously to increase lift while slowing down.
    • Integrate thrust production with lift generation.
  • Fixed-wing airplanes cannot replicate this:
    • Increasing lift with the propeller also increases speed.
    • This creates a conflict during landing when slowing down is desired.
  • The closer flight technology can emulate birds, the closer we get to flying cars.
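
A minimal numerical sketch of the power curve described above, using the standard parasitic + induced drag decomposition. All airframe numbers are illustrative assumptions (roughly a Cessna-172-class airplane), not figures from the book:

```python
import numpy as np

rho = 1.225          # sea-level air density, kg/m^3
S = 16.2             # wing area, m^2 (assumed)
W = 1_100 * 9.81     # weight, N (~1,100 kg gross, assumed)
CD0 = 0.03           # parasitic (zero-lift) drag coefficient (assumed)
k = 0.045            # induced-drag factor ~ 1/(pi * e * aspect ratio) (assumed)

def power_required(v):
    """Power (W) for level flight at airspeed v (m/s): parasitic + induced."""
    q = 0.5 * rho * v ** 2                 # dynamic pressure
    parasitic = q * S * CD0 * v            # parasitic drag power, grows ~v^3
    induced = (k * W ** 2 / (q * S)) * v   # induced drag power, shrinks ~1/v
    return parasitic + induced

for v in np.linspace(25, 65, 9):
    print(f"{v:4.0f} m/s -> {power_required(v) / 745.7:5.1f} hp")

# The curve has a minimum at a moderate speed; flying slower puts you on the
# "back side" of the power curve, where *more* power is needed. This is why
# slow, steep, bird-style landing approaches are so demanding for airplanes.
```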

Early Attempts at Runway-Independent Flight

The Flying Pancake and Custer Channel Wing

  • Charles Zimmerman (NACA) developed the Flying Pancake in the 1930s.
    • Goal: Create an airplane that could descend steeply at low speeds for easier landings, eliminating the need for long runways. (Zimmerman, 1932)
  • Willard R. Custer’s Channel Wing (CCW) aircraft used upper surface blowing.
    • Patented in 1929, flew in the 1940s.
    • CCW-5:
      • 5 passengers, 2.5 tons, 200 mph cruise speed.
      • 35 mph flight speed, 250-foot takeoff roll.
    • Achieved low-speed lift by positioning propellers to create high-speed airflow over the upper wing surface.
  • Challenges faced by the Flying Pancake and CCW:
    • High angle of attack required for low-speed flight, obscuring ground visibility.
    • High power consumption at low speeds.

Autogyros: A Promising Compromise

Juan de la Cierva’s Innovation

  • Juan de la Cierva invented the autogyro in the early 1920s.
    • Goal: Solve the stall problem by making wings move fast even at low aircraft speeds.
  • Autogyro features:
    • Rotor blades act as the wing, generating lift.
    • Separate propeller provides thrust to overcome drag.
    • Autorotation: Wind passing through the rotor disc causes the rotors to spin.
  • Innovations to overcome autogyro challenges:
    • Dissymmetry of lift: Solved by using flapping hinges that allow blades to adjust their angle of attack.
      • Blades on the advancing side flap up, lowering their effective angle of attack and reducing lift; blades on the retreating side flap down, raising it and increasing lift.
    • Centrifugal force keeps the flapping blades extended outwards.
  • Development and Refinement:
    • De la Cierva refined the autogyro throughout the 1920s.
    • Frank Courtney’s experience with blade breakage led to the addition of lead-lag hinges to accommodate blade acceleration and deceleration.

Harold Pitcairn and the Rise of Autogyros

  • Harold Pitcairn acquired the US rights to build and develop autogyros in 1929.
  • 1931 milestones:
    • Pitcairn autogyro landed on the White House lawn.
    • Amelia Earhart set an altitude record in an autogyro.
  • Autogyros in popular culture:
    • Featured in fiction and movies (e.g., It Happened One Night, Things to Come, For Us the Living).
  • Direct Control Rotor Hub:
    • Invented by de la Cierva, allowed direct pilot control of rotor blade pitch.
    • Eliminated the need for wings on later autogyro models.

The Decline of Autogyros

  • De la Cierva’s death in 1936 hindered autogyro development.
    • Loss of intellectual and spiritual leadership.
    • Breakdown of collaboration and information sharing.
  • World War II further disrupted development.
  • Pitcairn’s legal battles with the US government over helicopter patents diverted resources.

Autogyro Safety and Advantages

  • Safety features:
    • Low takeoff and landing speeds.
    • Less susceptible to wind gusts.
    • Inherent autorotation capability.
  • Advantages over helicopters:
    • Simpler and less expensive.
    • Autorotation is always available, unlike in some helicopter flight regimes.
  • Accident statistics from the 1930s showed a low fatality rate.

Why Autogyros Didn’t Become Flying Cars

  • Superseded by helicopters:
    • Helicopters offer greater versatility (hovering, vertical takeoff).
  • Market specialization:
    • Airplanes excel in speed and fuel efficiency.
    • Helicopters excel in hovering and vertical operations.
  • Post-war decline in private aviation:
    • Government focus shifted away from pilot training.
    • Interstate highway system provided a compelling alternative for travel.

Convertible Car-Airplanes

Buckminster Fuller’s Dymaxion Car

  • Buckminster Fuller’s Dymaxion car (early 1930s):
    • Aerodynamic egg shape.
    • Three-wheel configuration, steered by the rear wheel.
  • Challenges:
    • Ground looping: Instability in crosswinds.
    • Lifting body effect: Tendency to lose ground contact at high speeds.

Waldo Waterman’s Arrowbile

  • Waldo Waterman’s Arrowbile (1937):
    • First fully functional fixed-wing flying car.
    • Tailless (flying wing) design.
    • Tricycle landing gear (more stable than tail-draggers).
  • Features:
    • Easy to fly and drive.
    • Removable wings for car mode.
  • Reasons for failure:
    • Great Depression hampered sales.
    • Loss of financial support and Waterman’s illness.
    • World War II disruption.

World War II’s Impact

  • The war significantly hindered flying car development:
    • Diverted resources and personnel.
    • Halted private aviation growth.
    • Examples of disrupted projects:
      • Moulton Taylor’s flying car.
      • Ted Hall’s modular flying car designs.
      • Pitcairn Aviation’s autogyro development.

Moulton Taylor’s Aerocar

  • Moulton Taylor’s Aerocar (post-war):
    • Two-seater coupe with a 143-horsepower engine.
    • Foldable wings and tail stored in a towable trailer.
    • Well-designed, stable, and easy to fly.
  • Models:
    • Four Aerocar I’s, one Aerocar II, one Aerocar III (rebuilt from an Aerocar I).
  • Achievements:
    • Received civil (CAA) type certification in 1956.
    • Flew a total of 9,000 hours.
    • 135 mph cruise speed.
  • Significance:
    • Solved the three-vehicle problem.
  • Reasons for limited success:
    • Decline in private aviation.
    • Rise of the Interstate Highway System.

Convair Car

  • Theodore Hall’s Convair Car (late 1940s):
    • Airplane designed to carry a car, not fly independently.
    • Car designed by Henry Dreyfuss.
    • Specifications:
      • 1,500 lbs empty weight, 1,000 lbs payload and fuel capacity.
      • 190-horsepower engine, 125 mph airspeed.
      • 25-horsepower car engine.
  • Marketing plan:
    • Sell the car separately, rent the airplane part at airports.
  • Reason for failure:
    • Test pilot Ruben Snodgrass crash-landed the prototype due to fuel exhaustion.
    • Negative publicity led to project cancellation.
  • Significance:
    • Nearly solved the three-vehicle problem.

Helicopters and VTOL Aircraft

Jules Verne’s Albatross

  • Jules Verne’s Robur the Conqueror (1886) featured the Albatross, an early depiction of a VTOL flying machine.
    • Clipper ship lifted by numerous airscrews.
  • Modern feasibility:
    • A lighter, more aerodynamic version could be built today.

Helicopter Development

  • Helicopters became practical around 1940.
  • Key innovations from autogyros:
    • Flapping hinges.
    • Dampened lead-lag hinges.
    • Collective and cyclic pitch controls.
  • Sikorsky’s early helicopter:
    • Three tail rotors for torque counteraction and pitch/roll control.
  • Adoption of Pitcairn’s fully articulated hub (patent 582):
    • Allowed for the standard single side-facing tail rotor configuration.

Limitations of Helicopters

  • High cost:
    • Complex and precise rotor hub requires expensive manufacturing and maintenance.
  • Solution:
    • Replace the single large rotor with multiple small, ducted fans (like Verne’s Albatross).

Ducted Fan VTOL Aircraft

  • Hiller Flying Platform (1950s):
    • Ducted fan VTOL resembling a flying Segway.
  • Ducted fan advantages:
    • Kort nozzle effect: Amplifies lift by ~40%.
    • Noise reduction.
    • Increased stability.
  • Flying Platform specifications:
    • 7 feet wide, 370 lbs empty weight, 180 lbs payload capacity.
    • 80 horsepower (two 40-horsepower engines).
    • Limited altitude (33 feet) and speed (16 mph).
  • Challenges:
    • Duct-induced stability limits forward speed.

Tilt Duct VTOL Aircraft

  • Doak VZ-4:
    • Tilting ducted fans for forward flight.
    • Specifications:
      • ~1 ton empty weight, ~1.5 tons loaded weight.
      • 230 mph maximum speed.
  • Army VZ program (1950s):
    • VZ-6 (Chrysler): Unsuccessful, crashed due to instability.
    • VZ-8 (Piasecki): Reasonably successful flying jeep.
      • Twin 400-horsepower turboshaft engines.
      • Capable of hovering, low-altitude flight, and weapons platform use.
      • Ultimately deemed unsuitable for field operations.
  • Navy Bell X-22A (mid-1960s):
    • Four-fan tilt duct VTOL.
    • Nearly 40 feet long, 9 tons loaded weight.
    • Successful tilt duct design, easy transition to forward flight.
    • 255 mph top speed.
    • Potential for scaled-down flying car application.

The Unfulfilled Promise of Flying Cars

The Role of Visionaries and Entrepreneurs

  • 1930s:
    • Understanding of the “needs of the private owner” for aircraft design.
    • De la Cierva and Pitcairn’s efforts to develop the autogyro.
  • Post-war era:
    • Helicopters emerged as a potential solution.
    • Pitcairn focused on autogyros due to their lower cost and suitability for private use.
    • Legal battles hindered Pitcairn’s progress.

Technological Feasibility vs. Realization

  • By the 1970s, various flying car technologies had been successfully demonstrated:
    • Convertible airplanes.
    • Ducted-fan VTOLs.
  • Cost remained a major barrier to widespread adoption.
  • Technological advancement could have potentially overcome cost barriers over time (similar to computers).

Conclusion

  • The absence of flying cars today is not due to technological limitations.
  • Factors contributing to the lack of flying cars:
    • Economic factors (Great Depression, World War II).
    • Shifting government priorities.
    • Rise of alternative transportation systems (Interstate Highway System).
    • Lack of sustained investment and development.

Questions for Reflection

  • What other potentially transformative technologies have failed to materialize?
  • Are we accepting limitations that could be overcome with further innovation?
  • What is the future of personal transportation?

Chapter 4: Waldo and Magic, Inc.

Waldo’s Origin and Concept

  • Robert A. Heinlein, writing as Anson MacDonald, introduced the Waldo F. Jones Synchronous Reduplicating Pantograph in a 1942 Astounding science fiction story.
    • This concept established Heinlein as the conceptual inventor of the telemanipulator, often referred to as a Waldo.
  • A Waldo is essentially a robot arm controlled remotely by a human operator, unlike programmed robots.
    • Waldos emerged in the 1950s, driven by advancements in cybernetics and control theory, and the need to handle radioactive materials in the burgeoning nuclear field.
  • Heinlein’s original Waldos had two key features:
    • Self-replicating and reduplicating.
    • Scale-shifting.
  • Waldo’s objective was to operate on individual nerve cells by creating a series of progressively smaller Waldos.
    • Existing tools, like electromagnetic instruments and neural surgery, lacked the precision for Waldo’s desired level of investigation.
  • Waldo’s smallest Waldos at the time had half-inch palms and micro-scanners.
    • These were still too large for his purpose, leading him to use them to create even tinier Waldos.
  • Waldo’s final Waldos for nerve and brain surgery ranged from life-size mechanical hands to tiny “fairy digits” capable of manipulating objects invisible to the naked eye.
    • These were mounted in banks to work on the same area.
    • Waldo controlled them all from the same primary controls, switching sizes without removing his gauntlets.
    • Circuit changes automatically adjusted the scanning sweep and magnification, providing Waldo with a life-size image of his “other hands” in his stereo receiver.
  • Heinlein briefly addressed the challenge of visualization at smaller scales in a science fiction context, but lacked a detailed plan.
    • A more concrete solution would emerge nearly two decades later.

Feynman’s Vision: Nanotechnology’s Genesis

  • Richard Feynman, a physics professor at Caltech, likely encountered Heinlein’s Waldo story.
    • Feynman’s academic background (MIT, Princeton, Los Alamos) placed him in environments with strong science fiction communities.
  • In late 1959, Feynman repurposed the Waldo idea for an after-dinner speech at an American Physical Society meeting.
    • He added detailed technological and scientific grounding to the concept, becoming intrigued by its potential.
    • This marked Feynman as the first to envision nanotechnology.
  • Feynman’s revolutionary insight led him to predict:
    • “In the year 2000, when they look back at this age, they will wonder why it was not until the year 1960 that anybody began seriously to move in this direction.”
  • Feynman’s prediction was premature.
    • Serious progress in nanotechnology didn’t begin until the 1990s.
  • By 2000, atomic-scale electronic devices like molecular transistors existed, but complex circuits like nanocomputers were still 15 years away.
  • The missing element was the infrastructure that macroscopic technology relies on:
    • Sorting and testing parts.
    • Cutting and joining materials.
    • Creating frameworks for devices.
    • Placing parts into frameworks.
  • Feynman’s 1959 talk “There’s Plenty of Room at the Bottom” outlined a plan that could have achieved this infrastructure by 2000.

Feynman’s Plan: Top-Down Approach

  • Feynman’s talk proposed a “weird possibility” inspired by remote manipulators used in nuclear plants:
    • Build a master-slave system operating electrically.
    • Create precisely-made slave hands at one-quarter the scale of the master hands.
    • Use these smaller hands to manipulate quarter-sized tools and machines, including a quarter-size lathe.
    • Repeat the process to create even smaller hands and tools, at one-sixteenth scale (the sketch after this list counts how many quarter-scale steps reach molecular scale).
    • Connect the large-scale system to the smallest servo motors through transformers for control.
  • Feynman’s core idea was to transition from macro-scale to nano-scale fabrication while maintaining general fabrication and manipulation capabilities, mirroring Waldo’s approach in Heinlein’s story.
  • Feynman acknowledged the difficulty but emphasized the possibility of this approach.
  • He offered $1,000 prizes for a tiny motor and tiny writing, aiming to stimulate interest.
    • The prizes were won, but Feynman’s vision failed to gain widespread traction.
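
A minimal sketch counting the stages Feynman’s cascade implies. The 1 m and 1 nm endpoints are illustrative assumptions; the talk itself specifies only the 1/4 scaling per stage:

```python
import math

start_scale = 1.0      # ~1 m master hands (assumed)
target_scale = 1e-9    # ~1 nm molecular features (assumed)

steps = math.log(start_scale / target_scale, 4)   # quarter-scale per stage
print(f"{steps:.1f} stages")   # ~15 quarter-scale reductions reach molecules
```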

Drexler’s Vision: Bottom-Up Approach

  • Around 1976, K. Eric Drexler, an MIT undergraduate, envisioned building with designed protein molecules and other biomolecules.
    • He observed the mechanical and electronic functions of cellular components and questioned if humans could replicate such processes.
  • By 1981, Drexler had published a paper in the Proceedings of the National Academy of Sciences, sparking the formation of a nanotechnology study group at MIT.
  • In 1986, Drexler published Engines of Creation, introducing nanotechnology to a broader audience.
  • Drexler’s approach differed from Feynman’s:
    • Feynman’s was top-down, starting with machines and scaling down to atoms.
    • Drexler’s was bottom-up, starting with biochemistry and scaling up to machines.
  • Both approaches ultimately aimed for molecular machines.
    • Feynman envisioned the physicist synthesizing anything.
    • Drexler described it as complete control of the structure of matter.
  • Interpreting these visions requires nuance.
    • Creating unstable atomic arrangements like exploding dynamite atom-by-atom is not feasible.
  • A reasonable interpretation focuses on rearranging existing atoms, exemplified by the hamburger analogy:
    • Eating a hamburger rearranges atoms without creating or destroying them.
    • A general synthesis capability would allow for transforming waste products back into the original hamburger.
    • Current technology achieves this indirectly through agriculture and food processing.
    • A nanotechnological machine could achieve this directly at the molecular level.
  • DNA has limitations: it cannot describe every possible atomic arrangement, only the linear sequence of amino acids in proteins.
    • However, amino acids can form diverse molecular machinery, performing life’s essential functions.
    • This parallels limitations in macroscopic technology, where machinists can create components but not infinitely complex shapes.
  • Self-replication was an implicit concept in nanotechnology from the outset:
    • Heinlein explicitly mentioned “reduplicating”.
    • Feynman adopted this idea.
    • Drexler’s work stemmed from self-reproducing biological mechanisms.
  • Nanotechnology surpasses limitations of cellular biochemistry:
    • It can operate without water, across a wider temperature range, and at higher speeds and power levels.

Nanotech’s Potential: Incomprehensible Capabilities

  • Nanotechnology offers immense potential due to its ability to rearrange atoms.
    • The difference between charcoal and diamond, sand and computer chips, diseased and healthy tissue, and even life and death lies solely in atomic arrangement.
  • Drexler’s Engines of Creation paints a technophile’s dream:
    • Microscopic replicating units building skyscraper-sized objects with atomic precision.
    • Artificial intelligence and complex engineering systems.
    • Easy space travel with lightweight spacesuits.
    • Cell repair machines and the cure for aging.
    • Cryonics and resurrection.
  • Drexler’s vision expanded significantly on Feynman’s initial ideas:
    • Feynman focused on data storage and chemical synthesis.
    • Drexler explored the broader implications of “making absolutely anything”.
  • Drexler’s projections were based on conservative engineering estimates.
  • Despite this, science fiction writers took the concept further, often with less scientific accuracy.
    • A 1989 Star Trek episode featured nanites evolving into an intelligent, coordinated species in a short timeframe, defying basic scientific principles.
  • Growing enthusiasm for nanotechnology led to increased hype and difficulty distinguishing serious research from fictionalized portrayals.
  • The technical world’s excitement was palpable:
    • The Usenet nanotechnology discussion group (sci.nanotech), founded in 1989, received overwhelming support.
    • The first Foresight Conference on Nanotechnology in 1989 evoked comparisons to the historic Solvay Conferences.
  • Drexler’s influence on nanotechnology’s development in the 1980s and 1990s was substantial, exceeding that of Feynman or Heinlein.

The Mightiest Machine: Understanding Nanotech’s Power

  • Grasping the raw power and potential of nanotechnology can be challenging.
  • Drexler’s Nanosystems highlights the immense power density of nanomotors:
    • Greater than 10^15 watts per cubic meter.
    • For comparison, Earth receives about 10^17 watts of solar radiation.
    • Cooling limitations prevent sustained operation at this density.
  • This implies:
    • A 1,000 horsepower (10^6 watts) engine could fit in a 1 millimeter cube (10^-9 cubic meters).
    • One cubic foot of nanomotors would require more power (10^13 watts) than all human machinery combined.
    • A Kardashev Type I civilization’s power needs (10^17 watts) could be met by nanomotors fitting in a 500-square-foot apartment (these conversions are checked in the sketch after this list).
  • The Jetsons’ world becomes more realistic than traditional futurism:
    • Microscopic motors would be ubiquitous, seamlessly moving objects like biological muscles.
  • Miniaturization and speed are key advantages of nanotechnology:
    • Smaller machines can operate faster, similar to the trend in electronics.
    • Birds and insects demonstrate this principle with wing flapping speeds.
  • Factory throughput is crucial for productivity:
    • A factory processing materials at walking speed (5 feet per second) across a mile takes 20 minutes.
    • Scaling down to cell size (microns) reduces processing time to microseconds.
    • This explains the vast difference in reproduction speeds between humans (20 years) and E. coli (20 minutes).
  • Reproduction speed impacts economic growth:
    • A factory with a 3% annual return on investment takes 33 years to recoup its cost.
    • Reinvesting profits accelerates growth through compound interest.
    • Self-replicating factories dramatically change the equation.
  • Replacing the US capital stock with mature nanotechnology:
    • Estimated time: about a week, based on calculations by Robert Freitas and the author.
  • This seemingly outlandish claim is supported by Nanosystems and molecular manufacturing research.
  • Analogy with information technology:
    • The power and sophistication of modern computers would seem “fantastic, ludicrous, insane” to someone from the 1960s.
    • The IBM 7094, used by NASA, had 0.35 MIPS and cost $35 million in today’s dollars.
    • A $35 Raspberry Pi in 2015 had 9,700 MIPS.
    • The author’s workstation has 2,356,230 MIPS.
  • Similar exponential advancements are possible with matter and energy manipulation through nanotechnology.
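
A minimal sketch verifying the unit conversions and comparisons above (the 2.5 m apartment ceiling height is an illustrative assumption; everything else uses the note’s own figures):

```python
HP = 745.7        # watts per horsepower
DENSITY = 1e15    # W/m^3, Drexler's nanomotor power density

print(DENSITY * (1e-3) ** 3 / HP)     # ~1,341 hp in a 1 mm cube
print(DENSITY * 0.3048 ** 3)          # ~2.8e13 W per cubic foot, vs ~1e13 W
                                      # for all human machinery combined
print(DENSITY * 500 * 0.0929 * 2.5)   # ~1.2e17 W in a 500 sq ft apartment,
                                      # Kardashev Type I scale

print(1 / 0.03)                       # ~33 years for a 3% ROI factory
                                      # to repay its own cost

print(2_356_230 / 0.35)               # ~6.7 million-fold MIPS gain from the
                                      # IBM 7094 to the author's workstation
```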

From Bits to Atoms: The Digital Revolution in Matter

  • Nanotech, in the book’s usage, is a digital technology, distinct from “nanotechnology” as commonly practiced (nanoscale surface and materials science).
  • Atomically precise machines that build more atomically precise machines create a digital matter escalator, analogous to digital information technology.
  • Hardware becomes secondary to design:
    • In software, hardware advancements (Moore’s Law) outpaced software complexity.
    • In nanotech, physics (the abundance of atoms) is the substrate, and machine design (the software) is the bottleneck.
  • The challenge lies in designing self-constructing machines in digital matter.
  • Rebuilding America physically in a week highlights the need for efficient design processes.
  • Biotechnology is currently the only field manipulating matter digitally, following nature’s lead.
  • Within decades, we will create machines capable of making anything, similar to 3D printers but with atomic precision and diverse materials.
  • Self-replicating nanofactories will accelerate this capability, potentially following a Moore’s Law-like trajectory.
  • Moore’s Law represents a 60% growth rate.
  • A 40% growth rate in physical manufacturing is plausible with nanotech, considering real-world constraints (compounded in the sketch after this list).
  • Comparing the iPhone to the IBM 1401 demonstrates the potential for miniaturization, increased capability, and reduced cost in physical technology.
  • Feynman’s 1959 vision could have led to a drastically different present.
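
A minimal compounding sketch of the two growth rates above (the 30-year span is an illustrative assumption):

```python
for label, rate in [("Moore's Law, ~60%/yr", 0.60),
                    ("physical nanotech, ~40%/yr", 0.40)]:
    factor = (1 + rate) ** 30
    print(f"{label}: {factor:,.0f}x over 30 years")
# ~1,330,000x vs ~24,000x: slower than chips, but still transformative.
```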

The Unanswered Question: Why the Delay?

  • Decades after 2000, we still grapple with Feynman’s question: Why the delay in pursuing nanotechnology?
  • The Overton window effect might explain the resistance to revolutionary ideas.
    • Machiavelli’s observation on the incredulity of men facing new things resonates.
  • Even Feynman’s stature couldn’t overcome this inertia.
  • Discarding a technology with such immense potential (flying cars, immortality, peak health) qualifies as folly.
  • Other factors might be at play, requiring further exploration.

The 2020 Pandemic: A Case Study in Lost Opportunities

  • The COVID-19 pandemic highlights the consequences of our technological stagnation.
  • Rapid vaccine development (Moderna’s mRNA vaccine) showcased the potential of biotechnology.
  • Bureaucratic delays and limited manufacturing capacity hindered vaccine distribution, contributing to a disastrous year.
  • Nanofactories could have addressed the manufacturing bottleneck:
    • mRNA vaccines are arrangements of atoms that nanofactories could readily produce.
  • Thought experiment:
    • Imagine nanofactory progress mirroring computer advancements since 1960.
    • Personal nanofactories, potentially integrated into smartphones, could synthesize vaccine doses on demand.
  • Nanotechnology’s potential has been within reach for decades.
  • The question remains: Why have we failed to capitalize on it?

Chapter 5: Cold Fusion

The Occurrence of the Impossible

  • E.E. Doc Smith’s “The Skylark of Space” provides a fictional example of a scientist discovering a new energy source through electrolysis of an unknown metal (X).
    • The experiment results in a sudden release of energy, causing the apparatus to move violently.
    • This fictional account foreshadows some aspects of the cold fusion controversy.

Cold Fusion at the University of Utah

  • February 1985: Kevin Ashley, a graduate student, witnessed the aftermath of an experiment in the electrochemistry lab at the University of Utah.
    • The lab was in disarray, with dust in the air.
    • A hole was found in the lab bench and the concrete floor beneath it.
    • Stanley Pons and Martin Fleischmann appeared surprised by the results.
  • Pons and Fleischmann’s Belief: They believed they achieved deuterium fusion within an electrochemical beaker.
    • They invested significant personal funds ($100,000, equivalent to $250,000 today) in further research.
    • Their hypothesis, if true, would have revolutionized energy production.
  • Reality Check: The energy release, if it were truly fusion, would have been lethal and caused widespread radiation.
    • Their experiments over the next four years yielded inconsistent results.
      • Massive heat releases were infrequent.
      • More often, nothing happened.
      • Occasional bursts of heat, sometimes exceeding the input energy by tenfold, were observed after prolonged deuterium loading of a palladium electrode.
      • These bursts were unpredictable in their onset and duration.
  • Assessment of Pons and Fleischmann:
    • Both were experts in electrochemistry, making systematic measurement errors unlikely.
    • Fleischmann was considered a leading figure in the field.

The Cold Fusion Announcement and Its Aftermath

  • 1989: Events unfolded that led to a premature public announcement.
    • Pons and Fleischmann sought funding from the University of Utah.
    • The university was interested in potential patent rights.
    • Steve Jones, a physicist at Brigham Young University, was working on muon-catalyzed fusion (MCF).
      • MCF is a known phenomenon in standard physics.
      • It involves the fusion of deuterium nuclei facilitated by muons (particles similar to electrons but heavier).
      • MCF is not a practical energy source due to the energy cost of muon creation.
    • The University of Utah, fearing loss of patent priority, pressured Pons and Fleischmann to announce their findings.
  • March 1989: A press conference was held, and a preliminary paper titled “Electrochemically Induced Nuclear Fusion of Deuterium” was published.
    • Fleischmann expressed reservations about this hasty approach, preferring to delay publication.
  • Media Frenzy and Scientific Controversy: The announcement sparked intense media attention and a strong reaction from the scientific community.
    • The author, then a researcher at Rutgers, recalls the excitement and skepticism surrounding the announcement.
      • Physicists at Princeton attempted to replicate the experiment with radiation shielding, highlighting the contrast in approaches.

From Space Opera to Soap Opera: The Machiavelli Effect

  • Parallels with “The Skylark of Space”:
    • Both the fictional X and palladium are rare platinum group metals.
    • Both involve energy release through electrolysis.
    • In the novel, the fictional scientist Seaton faces disbelief, theft, assassination attempts, and the abduction of his fiancée.
  • The Machiavelli Effect in Science:
    • Richard Feynman emphasized the importance of admitting ignorance and uncertainty for scientific progress.
    • Niccolo Machiavelli, in “The Prince” (1532), described the challenges faced by innovators who threaten the established order.
      • Innovators face strong opposition from those who benefit from the status quo and lukewarm support from potential beneficiaries.
      • This resistance can endanger the innovator and their supporters.
  • Evidence of the Machiavelli Effect in Cold Fusion:
    • Scientists with ties to hot fusion research were particularly critical of cold fusion.
    • The Department of Energy (DOE) formed a committee (ERAB) to investigate cold fusion in April 1989.
      • The committee leadership had strong connections to nuclear weapons research.
      • The report, while not entirely dismissive, was interpreted as such.
      • John Huizenga wrote a book debunking cold fusion, leveraging his position as committee chairman.
      • The report acknowledged the impossibility of definitively proving or disproving all claims.
      • It listed 11 positive and 13 negative experiments for excess heat.
      • At least one credible positive result from NASA was overlooked.
      • A later positive result from the Naval Air Warfare Center was initially listed as negative.
      • The evidence for excess heat was arguably balanced, if not slightly favoring cold fusion.
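A quick sanity check on “arguably balanced”: an 11-to-13 split among 24 experiments is statistically indistinguishable from a coin flip, as an exact two-sided binomial test shows (standard library only):

```python
# Exact two-sided binomial test: is 11 positives out of 24 surprising
# under the null hypothesis that positives and negatives are equally likely?
from math import comb

n, k = 24, 11
p_le_k = sum(comb(n, i) for i in range(k + 1)) / 2**n  # P(X <= 11)
p_two_sided = 2 * p_le_k                               # symmetric at p = 0.5
print(f"p = {p_two_sided:.2f}")  # ~0.84: no evidence of imbalance either way
```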
  • The Role of Politics and Funding:
    • The Superconducting Supercollider (SSC) project was competing for funding at the time.
    • Cold fusion posed a potential threat to the SSC’s funding and the prestige of high-energy physics.
    • The Machiavelli effect explains the biased reaction, as the scientific establishment sought to protect its interests.
  • Examples of Bias:
    • MIT’s Plasma Science and Fusion Center reported no excess heat, but their experimental setup was flawed.
      • Peter Hagelstein pointed out the significantly higher measurement errors in the MIT experiments compared to Fleischmann and Pons’s work.
    • The emotional and dismissive language used by some scientists against cold fusion proponents further suggests bias.
  • The Impact of the ERAB Report:
    • By November 1989, the scientific community largely dismissed cold fusion.
    • Fleischmann and Pons’s detailed paper was published only then, making proper replication attempts possible.
    • The Machiavelli effect was in full force, hindering further research.
  • Continued Suppression of Research:
    • Hagelstein’s experience with a company interested in funding cold fusion research demonstrates ongoing resistance.
      • A prominent physicist at MIT intervened, canceling the program and jeopardizing the careers of researchers involved.
  • Implications of the Machiavelli Effect:
    • The attacks on cold fusion may not have been solely based on scientific merit.
    • There is a high prior probability of bias due to the political and financial stakes.
  • Challenges in Cold Fusion Research:
    • The field is plagued by unreliable claims from less credible sources.
    • Replication is difficult, increasing the risk of self-deception among researchers.
    • Cold fusion is often associated with pseudoscience and fringe ideas, further damaging its reputation.
    • The constant barrage of unsubstantiated claims makes it difficult for scientists to differentiate potentially valid research.
  • The Importance of Exploring the Impossible:
    • Arthur C. Clarke: “The only way of discovering the limits of the possible is to venture a little way past them into the impossible.”
    • Many scientific and technological advancements resulted from challenging established limits.

Aftermath and Current Status

  • Lessons Learned:
    • Fleischmann and Pons’s focus on fusion products, an area outside their expertise, made them vulnerable to criticism.
    • They should have emphasized the observed heat amplification instead.
  • The National Cold Fusion Institute:
    • Founded at the University of Utah with $4.5 million in funding.
    • Failed to produce significant results and closed within a year.
    • Legal disputes surrounding the institute cost five times more than the research itself.
  • Further Research Efforts:
    • Toyota invested $40 million in Fleischmann and Pons’s research at IMRA Europe, with limited success.
    • Japan’s MITI spent $20 million, also with no breakthroughs.
    • These efforts did lead to a gradual improvement in producing the effect, but reliability remained elusive.
  • Difficulties in Replication:
    • Richard Oriani described cold fusion experiments as the most challenging of his career.
    • Replication remains extremely difficult, requiring specific conditions and materials.
  • Progress in Understanding:
    • Michael McKubre’s research at SRI International showed that a deuterium loading ratio of at least 90% in palladium is crucial.
    • The quality of palladium samples is also a significant factor.
    • Dennis Cravens and Edmund Storms found that only a small percentage of palladium samples are effective.
  • Department of Energy Review (2004):
    • The DOE conducted a second review, which produced a non-committal report.
    • An analysis of reviewers’ comments suggests a slight lean towards the possibility of excess heat and nuclear phenomena, but not strong support.
  • Ongoing Research:
    • Despite limited funding and skepticism, research continues at NASA, the Navy, and DARPA.
  • The Significance of Cold Fusion:
    • Cold fusion, even if not a guaranteed energy solution, highlights potential gaps in our understanding of physics.
    • The suppression of research exemplifies the Machiavelli effect, hindering scientific progress in various fields.
  • Conclusion:
    • Due to the Machiavelli effect, the viability of cold fusion as an energy source remains uncertain.
    • The controversy has cast a shadow on the openness and objectivity of the scientific establishment.
    • Many promising avenues of research may be suppressed due to similar biases.

Chapter 6: The Machiavelli Effect

Introduction: Rewriting History & Rethinking Cold Fusion

  • Thomas Kuhn, in “The Structure of Scientific Revolutions,” argues that members of mature scientific communities are victims of history rewritten by those in power.
    • Similar to characters in Orwell’s “1984.”
  • By 1989, the consensus was that cold fusion was a mistake due to misinterpretations and poor lab techniques.
  • The author initially accepted this consensus.
  • Decades later, after encountering the Machiavelli Effect in nanotechnology, the author reconsidered cold fusion.

The Machiavelli Effect in Nanotechnology

  • The resistance to innovation in nanotechnology was less intense than in cold fusion.
  • Nanotechnology gained prominence in the 1990s.
    • Eric Drexler’s 1991 MIT dissertation, the first explicitly on nanotechnology, provided a theoretical basis for its manufacturing capabilities.
    • Republished as “Nanosystems” in 1992, it became a bestseller.
    • The Foresight Institute’s nanotechnology conferences grew, attracting top researchers.
    • Richard Feynman’s “There’s Plenty of Room at the Bottom” (1959 talk, published 1960) gained recognition in the late 1990s.
    • In 1997, Nobel laureate Richard Smalley (co-discoverer of C60, buckminsterfullerene) was the Foresight Institute’s keynote speaker.
  • By 2000, nanotechnology gained political attention.
    • President Clinton proposed a National Nanotechnology Initiative (NNI), citing Feynman’s work.

Machiavelli’s Two Classes & The NNI

  • Machiavelli identified two classes:
    1. Nobles: Those who benefit from the status quo and have the laws on their side. They tend to be politically and economically powerful.
    2. Tradesmen: Those who would benefit from innovation but fear the risks associated with challenging the nobles.
  • The NNI did not introduce new funding but redistributed existing funds, creating a Machiavelli Effect scenario.
    • Nobles: Established researchers in related fields (e.g., surface and materials science).
    • Tradesmen: Researchers who could potentially work on true nanotech but were working in other areas or were still students.
  • The Nobles’ Reaction:
    • Labeled their existing work as nanotechnology.
    • Attacked Drexler’s (and implicitly Feynman’s) vision.
  • The Tradesmen’s Response:
    • Hesitated to embrace the new field due to its perceived difficulty and the risk of attacks from established researchers.
  • Consequences:
    • Innovation slowed down.
    • This dynamic is common in zero-sum science and technology funding environments.

The Machiavelli Effect as a Systemic Immune Response

  • The Machiavelli Effect is not a conspiracy but a natural human response to protect one’s interests.
  • It’s akin to a social and economic system’s immune response.
  • It’s a part of normal science as described by Thomas Kuhn.
  • Isaac Asimov observed similar resistance to technological change throughout history:
    • “I discovered, to my amazement, that all through history there had been resistance, and bitter, exaggerated, last-ditch resistance, to every significant technological change that had taken place on Earth. Usually, the resistance came from those groups who stood to lose influence, status, money as a result of the change. Although they never advanced this as their reason for resisting it, it was always the good of humanity that rested upon their hearts.”

  • By the early 2000s, the NNI led to a full-fledged Machiavelli Effect in nanotechnology.
    • Researchers working on molecular machines faced funding difficulties due to partisan attacks.
    • A productive nanotech program at NASA Ames was canceled.
    • Foresight Institute experienced difficulty attracting researchers who had previously presented work.
  • The attacks lacked intellectual substance.
    • Smalley, a prominent attacker, contradicted himself on the feasibility of molecular self-replication.
  • That Machiavelli’s analysis correctly anticipates the behavior of 20th-century nanoscience leaders neither validates nor invalidates nanotechnology itself.
  • It highlights the possibility of partisan influence in attacks on new ideas.
  • In Bayesian terms, the prior probability of such attacks was high regardless of the validity of Drexler’s or Feynman’s vision.
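A minimal numerical version of that Bayesian point, with invented probabilities: if the establishment attacks threatening ideas whether or not they are valid, then observing an attack barely moves the posterior.

```python
# Bayes' rule: how much should an observed attack change our belief that
# the attacked idea is valid? All probabilities below are illustrative.
def posterior_valid(prior: float, p_attack_if_valid: float,
                    p_attack_if_invalid: float) -> float:
    num = p_attack_if_valid * prior
    return num / (num + p_attack_if_invalid * (1 - prior))

prior = 0.10  # whatever you believed before seeing the attack
print(f"{posterior_valid(prior, 0.90, 0.95):.3f}")  # ~0.095, nearly unchanged
# When attacks are near-certain either way, they carry almost no evidence.
```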

The Machiavelli Effect Beyond Physical Science

  • The Machiavelli Effect is not limited to physical sciences and engineering.
  • It’s arguably less prevalent in these fields compared to others.
  • George Miller (cognitive psychologist) described the difficulty of challenging behaviorism’s dominance in psychology:
    • “The power, the honors, the authority, the textbooks, the money, everything in psychology was owned by the behavioristic school. Those of us who wanted to be scientific psychologists couldn’t really oppose it. You just wouldn’t get a job.”

  • Distinguishing between the Machiavelli Effect and legitimate criticism of flawed ideas can be challenging.
  • Textbook knowledge is generally “probably approximately correct.”
  • But in some cases (e.g., flying machines), expert consensus has been wrong.
  • The bureaucratic resistance to innovators can be detrimental to scientific progress.

The Wright Brothers & The Smithsonian

  • In 1910, the Wright brothers offered their Wright Flyer (the first successful heavier-than-air flying machine) to the Smithsonian.
  • Their offer was rejected.
  • The Flyer was instead sent to the Science Museum in London, where it remained until 1948.
  • Meanwhile, the Smithsonian exhibited Samuel Langley’s Aerodrome as the first flying machine.
    • Langley, a former Smithsonian director, had received significant government funding for his failed attempts at flight.
    • In 1914, the Smithsonian lent the Aerodrome to Glenn Curtiss, who modified and successfully flew it.
    • Curtiss was a competitor of the Wrights and sought to invalidate their airplane patent.
  • Funding Disparity:
    • The Wrights spent about $1,000 of their own money to build the Flyer.
    • Langley received over $50,000 in government funding for his unsuccessful Aerodrome.
  • This story highlights the often-overlooked human side of scientific history.

Public R&D and Economic Growth

  • The belief that increased federal funding for scientific research would boost economic growth became widespread.
  • Evidence Against This Belief:
    • England, where the Industrial Revolution began, had minimal public research support in the 19th century.
    • France and Germany, with strong public scientific enterprises, did not experience comparable economic growth.
    • The US, with a laissez-faire approach, surpassed Britain’s GDP per capita.
    • Public R&D funding coincided with the Great Stagnation, a period of slowed technological innovation.
  • OECD Study (2005):
    • Found a positive correlation (0.26) between private R&D and economic growth.
    • Found a negative correlation (-0.37) between government-funded R&D and economic growth.
    • Authors suggested that public R&D might crowd out private sector resources, including private R&D.
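To make the statistic concrete: those figures are ordinary (Pearson) correlation coefficients across countries. A toy version of the computation, where growth, private_rnd, and public_rnd are invented numbers chosen only so the signs echo the study:

```python
# Pearson correlations across a handful of hypothetical countries.
# The arrays are invented for illustration, not the OECD data.
import numpy as np

growth      = np.array([1.2, 2.5, 3.1, 1.8, 2.9])  # GDP growth, %
private_rnd = np.array([1.0, 1.8, 2.4, 1.2, 2.1])  # private R&D, % of GDP
public_rnd  = np.array([1.1, 0.8, 0.6, 1.0, 0.7])  # government R&D, % of GDP

print(np.corrcoef(growth, private_rnd)[0, 1])  # positive (the study found +0.26)
print(np.corrcoef(growth, public_rnd)[0, 1])   # negative (the study found -0.37)
```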
  • Crowding Out vs. The Machiavelli Effect:
    • Some economists (e.g., Terence Kealey) favor the crowding-out explanation.
    • The author argues that the Machiavelli Effect also plays a significant role.
    • Centralized funding empowers established groups who resist new ideas.

The Inventors of Tomorrow & Ivory Tower Syndrome

  • A Nation at Risk (1983):
    • “Our nation is at risk. Our once unchallenged preeminence in commerce, industry, science, and technological innovation is being overtaken by competitors throughout the world. The educational foundations of our society are presently being eroded by a rising tide of mediocrity that threatens our very future as a nation and a people.” - National Commission on Excellence in Education

  • Samuel Broder (Director of the National Cancer Institute):
    • “If it had been up to the NIH to cure polio, we’d have the best iron lungs in the world, but we still wouldn’t have the Salk vaccine.”

  • Tyler Cowen’s “The Great Stagnation” Hypothesis:
    • Proposes that the US ran out of talented young people leaving farms for universities in the 1970s.
    • This trend did slow down as the agricultural workforce diminished.
  • Counterarguments to Cowen’s Hypothesis:
    • The decline in manufacturing jobs continued well into the 2000s, freeing up workers for other sectors.
    • The US consistently produces more manufactured goods with fewer workers.
    • There is significant potential for increased production (e.g., flying cars) if desired.
    • The feminist revolution doubled the pool of educated young people.
    • Women’s participation in higher education increased significantly (e.g., PhDs awarded to women).
    • PhDs awarded per year increased dramatically compared to the 1950s and early 20th century.
  • Ivory Tower Syndrome Hypothesis:
    • Suggests that excessive education might hinder the economy and technological innovation.
    • Too many young people are in academia instead of pursuing real-world endeavors.
  • Kevin Jones’s Critique of Cowen:
    • “But there’s a tension here that Tyler doesn’t address. Technology grew like gangbusters in the first half of the 20th century, but it wasn’t until the second half that education took off. So, apparently, it’s not higher education that’s really responsible for dramatic technological growth. But if that’s the case, who cares about education?”

  • The correlation between the surge in PhDs and the Great Stagnation supports the ivory tower syndrome hypothesis.
  • Daniel Dennett’s Analogy:
    • “The juvenile sea squirt wanders through the sea, searching for a suitable rock or hunk of coral to cling to and make its home for life. For this task, it has a rudimentary nervous system. When it finds its spot and takes root, it doesn’t need its brain anymore, so it eats it. It’s rather like getting tenure.”

Academia vs. Real-World Innovation

  • DARPA Grand Challenge (2005):
    • Five autonomous vehicles successfully completed a 131.2-mile course; in the 2004 challenge, no vehicle had finished.
  • An AAAI Researcher’s Perspective:
    • Dismissed the achievement as merely combining existing techniques, highlighting an academic bias towards novel intellectual concepts over practical solutions.
  • Real-World Innovation:
    • Often involves combining existing technologies to achieve previously impossible outcomes.
    • The Grand Challenge demonstrated the difference between working and not working in the real world.
  • Academia’s Focus:
    • Favors “mind candy” - intellectually stimulating ideas that impress other academics - over mundane but functional techniques.
  • The Importance of Practical Results:
    • While specialists might recognize the building blocks of a new invention, the world cares about whether it works.
    • The Grand Challenge marked a major milestone in self-driving car technology, despite lacking groundbreaking academic novelty.

Failure of Foresight: Failures of Nerve and Imagination

  • Arthur C. Clarke (“Profiles of the Future”) identified two types of failure in technological forecasting:
    1. Failure of Nerve: Occurs when the science and engineering are understood, but the predicted outcome is too unconventional.
      • Example: The scientific community’s denial of heavier-than-air flight (even after the Wright brothers’ success).
      • Simon Newcomb’s argument against airplanes (falling out of the sky when they stopped) exemplifies a failure of nerve.
      • He failed to consider the possibility of runways for takeoff and landing.
      • Arthur C. Clarke encountered similar resistance in the field of rocketry.
      • New York Times Editorial (1920):
        • “That Professor Goddard, with his chair in Clark College and the countenancing of the Smithsonian Institution, does not know the relation of action to reaction, and of the need to have something better than a vacuum against which to react, to say that would be absurd. Of course, he only seems to lack the knowledge ladled out daily in high schools.”

      • The editorial displays arrogance and ignorance.
      • It illustrates the potential for influential figures to stifle innovation through biased funding decisions.
      • The Times retracted the claim in 1969 during the Apollo 11 mission:
        • “Further investigation and experimentation have confirmed the findings of Isaac Newton in the 17th century, and it is now definitely established that a rocket can function in a vacuum as well as in an atmosphere. The Times regrets the error.”

      • Nanotechnology as a Modern Example:
        • Standard physics, chemistry, and quantum mechanics predict the feasibility of atomically precise machines.
        • The failure of nerve is essentially thinking within limitations that do not actually exist.
        • Feynman’s breakthrough was recognizing the potential of manipulating matter at the atomic level.
        • Drexler further weakened the resistance by drawing parallels with molecular machines in cells.
    2. Failure of Imagination: Occurs when the knowledge required to make accurate predictions is simply unavailable.
      • Example: Ernest Rutherford’s dismissal of atomic energy:
        • “The energy produced by the breaking down of the atom is a very poor kind of thing. Anyone who expects a source of power from transformation of these atoms is talking moonshine.”

      • Rutherford, a leading nuclear physicist, could not have foreseen nuclear fission and the chain reaction, which had yet to be discovered.
      • Failures of imagination are difficult to identify except in hindsight.
      • It’s essential to acknowledge the existence of unknown unknowns.
      • Technological progress often involves venturing beyond the known.
      • Crystal Radio Example:
        • Someone familiar with crystal radios might have envisioned the transistor, not through knowledge of quantum mechanics, but by recognizing the possibility of undiscovered solid-state phenomena.

The Consequences of Centralization

  • The increasing centralization and bureaucratization of science and research funding in the late 20th century had negative consequences.
  • It amplified the impact of failures of nerve and imagination, turning them into self-fulfilling prophecies.
  • Shift in Focus:
    • A century ago, talented individuals faced real-world challenges, leading to innovative solutions.
    • Today, many of their successors are trapped in academia, focused on securing funding rather than addressing practical problems.
  • Eisenhower’s Warning (1961):
    • “The free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. The prospect of domination of the nation’s scholars by federal employment, project allocations, and the power of money is ever-present and is gravely to be regarded.”

  • Scientific Knowledge Explosion vs. Application:
    • The amount of scientific knowledge has grown exponentially in recent decades.
    • Ivory tower syndrome may contribute to this growth but hinder practical applications.

The French Scientific Boom and the British Industrial Revolution

  • France in the Early 19th Century:
    • Experienced a remarkable surge in scientific talent (e.g., Carnot, Lavoisier, Laplace, Montgolfier, Dulong, Petit, Poisson, Fresnel, Gay-Lussac, Ampère, Savart, Fourier, Coriolis, Cauchy, Lamarck).
    • Science and technology advancement was a national priority.
    • The École Polytechnique was a leading institution, unmatched in England or elsewhere.
  • The Industrial Revolution:
    • Despite France’s scientific prowess, the Industrial Revolution occurred in Britain.
    • By 1850, Britain led in railroads, steamships, engines, machine tools, and textiles.

Ivory Tower Syndrome in the United States

  • The US is a global leader in science and technology.
  • It boasts numerous top-tier research universities and substantial public research funding.
  • Ivory tower syndrome is evident:
    • The Great Stagnation demonstrates a decline in transformative innovation.
    • The focus on academic pursuits may be hindering practical applications of knowledge.
  • Conclusion:
    • “The trees of knowledge are growing taller than ever. But someone appears to have been spraying paraquat on the low-hanging fruit.”

    • While scientific knowledge expands, the development of impactful, real-world innovations has slowed down.

Chapter 7: The Age of Aquarius

The Shift from Science to Emotionalism

  • The 17th-century Scientific Revolution: Isaac Newton’s discoveries in motion, gravitation, and calculus built upon the work of previous scientists, marking a shift in understanding the universe from anthropomorphic gods to logic, mathematics, observation, and experiment.
  • The Beauty and Value of Science: Science builds upon itself, incorporating insights from previous generations, unlike other forms of knowledge.
    • Example: Ptolemy’s epicycles predicted planetary positions, while Newton’s celestial mechanics allowed for the discovery of Neptune based on observations of Uranus.
  • Scientific and Technological Progress until the 1960s: Knowledge and capabilities continued to grow, culminating in the moon landing, driven by celestial mechanics and Newton’s third law.
  • The Age of Aquarius and the Decline of Science: The 1960s saw a cultural shift away from scientific reasoning towards emotionalism, symbolized by the “Age of Aquarius.”
    • Example: The movie “Flesh Gordon” (1974), a parody of the old Flash Gordon serials, mirrored the cultural atmosphere of the 1970s, emphasizing sexual liberation and a departure from traditional values.

H.G. Wells’ Prediction of Societal Decline

  • H.G. Wells’ “The Time Machine” (1895) Foreshadowed the Cultural Shift: Wells described a future society, the Eloi, characterized by a decline in strength, intelligence, and vigor due to a life of ease and security.
    • Strength as a Consequence of Need: Wells argued that hardship and freedom fostered strength, intelligence, and cooperation, while security led to feebleness.
    • The Fate of Energy in Security: Wells suggested that in a secure society, energy is diverted towards art and eroticism, eventually leading to languor and decay.
  • Wells’ Predictions about the Sexual Revolution and Family Breakdown: Wells anticipated the shift away from traditional family structures and the embrace of sexual liberation.
  • The Rapid Pace of Societal Transformation: The cultural changes Wells predicted occurred within a century, much faster than the 8,000 centuries he envisioned.

The Cultural Transformation of the 1960s and 70s

  • The 1960s as a Cultural Phase Change: Western culture, particularly in America, underwent a significant transformation in the 1960s and 70s.
    • Examples of Cultural Movements: Hippies, Woodstock, environmentalism, Earth Day, free love, zero population growth, civil rights, pacifism, and feminism.
    • Similarities to the Bloomsbury Set: The 1960s counterculture echoed sentiments of the earlier Bloomsbury Group in London, but on a much larger scale.
  • The Memetic Cambrian Explosion of the 1960s: The simultaneous emergence of various social and cultural revolutions can be seen as a memetic Cambrian explosion.
    • Preexisting Concerns: Many of the issues championed by the counterculture had existed before the 1960s (e.g., slavery, women’s suffrage, free love, conservationism).
  • The Question of Simultaneous Revolutions: The key question is why all these movements converged and intensified in the 1960s.
    • Analogy of Flowers in the Desert: The widespread emergence of social movements suggests a common underlying cause, like rain in a desert causing widespread blooming.

Maslow’s Hierarchy of Needs and the Cultural Shift

  • Maslow’s Hierarchy of Needs as an Explanation: Abraham Maslow’s hierarchy of needs can help explain the cultural shift of the 1960s.
    • Hierarchy Levels:
      • Basic needs: Physical safety, food, clothing, shelter.
      • Social needs: Friendship, status.
      • Higher needs: Spiritual fulfillment, self-actualization.
    • Motivation and Hierarchy: Individuals are motivated by the lowest unmet need in the hierarchy.
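The “lowest unmet need” rule is effectively a selection algorithm. A minimal sketch using the three tiers grouped above (the tier names and the current_focus helper are illustrative):

```python
# Maslow's motivation rule: attention goes to the lowest tier of needs
# that is not yet satisfied. Tier names follow the grouping in the notes.
HIERARCHY = ["basic", "social", "higher"]

def current_focus(met_needs: set[str]) -> str:
    for tier in HIERARCHY:
        if tier not in met_needs:
            return tier
    return HIERARCHY[-1]  # the top tier is open-ended

print(current_focus(set()))      # -> "basic"
print(current_focus({"basic"}))  # -> "social": once basic needs are met,
                                 #    attention shifts up the ladder
```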
  • Western Culture’s Fulfillment of Lower-Level Needs: By the 1960s, Western culture had largely fulfilled the basic needs of its citizens, allowing them to focus on higher-level needs.
  • The Modern Elite and Higher-Level Needs: The “modern elite” could take basic needs for granted and focus on love, esteem, self-actualization, etc.
  • Emancipation of Women and Technological Advancements: Labor-saving machines of the 20th century freed women from some domestic drudgery, allowing them to pursue other interests.
    • Example: The character of Jane Jetson in “The Jetsons” (1962) represented this shift, although it was still emerging at the time.
  • Rising Productivity and the “Good Life”: Increased productivity throughout the 19th and 20th centuries led to a higher standard of living, making the “good life” accessible to more people.

The Decline of Evolutionary Pressure and the Rise of the Eloi

  • Medieval Europe and the Constant Threat of War: Medieval life was shaped by the constant presence of war, creating an evolutionary pressure for moral codes that prioritized productivity, efficiency, hard work, and the needs of the polity over the individual.
  • The Forging of Western Civilization: This pressure shaped Western civilization, promoting values that led to a better standard of living and the development of science.
  • The Impact of Nuclear Weapons on Warfare: The advent of nuclear weapons and the concept of mutually assured destruction (MAD) fundamentally changed the nature of warfare.
    • Short-Circuiting the Evolutionary Process: MAD removed the pressure on societies to maintain efficient cultural and governmental practices for survival.
  • The Nuclear Umbrella and the Cambrian Explosion of Eloi: The nuclear umbrella allowed societies with less efficient practices to survive, leading to a proliferation of “Eloi” (those focused on higher-level needs).
  • Proxy Wars and Reduced Societal Pressure: Wars like the Korean War became proxy conflicts, reducing the direct pressure on American society compared to World War II.
  • Peter Turchin’s Historical Analysis: Historian Peter Turchin observed a pattern of empires becoming complacent and vulnerable to internal decay when their survival is no longer threatened.
  • The Vietnam War and the Decline of Technological Optimism: The Vietnam War, perceived as cruel and useless, contributed to a decline in American faith in technological progress and military strength.
  • World War II’s Influence on Post-War Culture: The experience of World War II shaped American culture and institutions, promoting a belief in centralized bureaucratic government and fostering a cooperative spirit.
  • The Unsustainability of the 1950s Boom: The post-war boom of the 1950s relied on the cooperative spirit inherited from the war, which began to fade in subsequent generations.
  • The Baby Boomers and the Two Cultures: The baby boomer generation split into two distinct cultures with differing values and worldviews, exemplified by the contrasting responses to the Apollo 11 moon landing and Woodstock.
  • The Corporate State and the Age of Aquarius: The centralized corporate state that worked with the cooperative Greatest Generation struggled to adapt to the individualistic and rebellious spirit of the Age of Aquarius.
  • Science Fiction’s Misjudgment of Societal Progress: Science fiction writers often assumed that social decision-making would become more scientific and rational, failing to anticipate the cultural shift towards emotionalism and individualism.
  • The Role of Ease and Plenty in Moral and Intellectual Decay: Contrary to some post-apocalyptic narratives, it was ease and abundance, not hardship, that fostered moral and intellectual decline.
  • The Limitations of War as a Shaper of Culture: War is no longer a viable mechanism for shaping culture due to the destructive potential of modern technology.
  • The Question of Corrective Mechanisms for False Beliefs: In the absence of war, it is unclear what mechanisms will correct societal false beliefs.

The Eloi Agonists and the Search for Meaning

  • The Modern Eloi’s Perception of Struggle: Despite living in relative ease and security, many people today perceive their lives as a struggle and believe they are contributing to something meaningful.
  • Maslow’s Hierarchy and the Need for Belonging and Esteem: The need for belonging and esteem drives individuals to seek purpose and validation in their lives.
  • The Human Psyche’s Resistance to Uselessness: Humans are psychologically averse to a life devoid of purpose and contribution to others, as depicted in Wells’ portrayal of the Eloi.
  • The Post-Industrial Age’s Fulfillment of the Need for Meaning: The post-industrial age has created a system where people can believe they are doing important work even when it is not objectively productive.
  • The Illusion of Progress and Societal Stagnation: While the Great Stagnation indicates a decline in actual progress, many believe they are making the world a better place.
  • The Shift from Conquering the Physical World to Conquering Minds: The focus has shifted from technological advancement to influencing the beliefs and behaviors of others.
  • The Role of Emotion and Self-Deception: In this new struggle, objective truth is often disregarded in favor of emotional appeals and self-deception.
  • The Evolutionary Basis for Self-Deception: Humans have evolved the ability to deceive themselves as a means of more effectively deceiving others.
    • Research on Self-Deception: The work of Robert Trivers and Robin Hanson highlights the evolutionary advantages of self-deception.
  • Literature’s Exploration of Self-Deception: The theme of self-deception and hypocrisy is prevalent in literature, showcasing our tendency to distort reality to maintain a positive self-image.
    • Example: T.S. Eliot’s observation about people’s desire to feel important and their capacity for self-justification.
  • The Rise of Science and the Advantage of Truth: Science emerged because, in specific historical contexts, knowing the objective truth became more advantageous than not knowing it.
  • The Rise of Virtue Signaling and Cost Disease: Today, many social institutions gain prestige through virtue signaling rather than demonstrable results, leading to cost disease in areas like healthcare, education, and environmental regulation.

Risk Homeostasis and the Appeal of Scare Stories

  • Risk Homeostasis and Risk Compensation: People tend to maintain a constant level of perceived risk, adjusting their behavior to compensate for changes in actual risk.
    • The Peltzman Effect: Introducing safety features can lead to riskier behavior, as people compensate to maintain their preferred risk level.
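A toy linear model of risk compensation, assuming perceived risk scales with speed times an underlying hazard rate and that drivers speed up until perceived risk reaches a fixed set-point (all numbers invented):

```python
# Peltzman effect in caricature: drivers pick the fastest speed that keeps
# perceived risk at their set-point, so safer equipment is "spent" on speed.
def chosen_speed(risk_setpoint: float, hazard_per_mph: float) -> float:
    return risk_setpoint / hazard_per_mph

SETPOINT = 1.0
for hazard in (0.020, 0.010):  # a safety feature halves the hazard rate
    v = chosen_speed(SETPOINT, hazard)
    print(f"hazard={hazard}: speed={v:.0f} mph, realized risk={v * hazard:.1f}")
# Both rows show realized risk 1.0: the safety gain is consumed by behavior.
```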
  • Societal Risk Homeostasis and the Appeal of Scare Stories: The decline in perceived exogenous risk after World War II may have led to increased susceptibility to scare stories, such as those related to environmental hazards.
  • The Correlation between Energy Intensity and Failed Predictions: Technologies with higher energy intensity are easier to portray as dangerous and are further removed from everyday experience, making them more susceptible to scare stories.
    • Example: Cell phones vs. nuclear reactors – personal experience with cell phones counteracts scare stories, while the lack of experience with nuclear power makes it easier to believe exaggerated risks.

The Green Religion and the Demonization of Humanity

  • The Rise of Environmentalism as a Religion: The decline of organized religion has been accompanied by the rise of environmentalism, which, in its extreme form, can be considered a religion.
    • John Burroughs’ Observation: Burroughs noted the increasing tendency to treat nature as a temple.
  • Homeostasis in the Human Psyche for Religion: The human psyche appears to have a need for religious or spiritual belief systems, which environmentalism may be fulfilling for some.
  • Bill McKibben’s Biocentrism: McKibben’s writings express a biocentric worldview, where human impact on nature is inherently bad, regardless of observable consequences.
    • “The End of Nature”: McKibben lamented the loss of nature’s independence and mystery due to human influence.
  • The Doctrine of Original Sin on Steroids: This view can be seen as an extreme version of the doctrine of original sin, where humanity’s very existence is seen as a blight on nature.
  • Calls for Human Extinction: Some environmental extremists advocate for human extinction as a solution to environmental problems.
    • Example: Biologist David Graber’s hope for a virus to eliminate humanity.
  • The Green Religion’s Dominance in Western Culture: The green religion has become the default belief system for many in Western civilization, particularly in academia.
  • Climate Change as the Central Focus: Climate change has become the central apocalyptic narrative of the green religion.
  • The Intertwining of Environmentalism and a Desire for a Healthy Environment: It is important to distinguish between a reasonable desire for a clean environment and the religious aspects of the green movement.

The Climate Change Debate: Science vs. Religion

  • The Fundamentalist Green View of Humanity: Fundamentalist greens view human impact on nature as inherently negative, regardless of the specific activity.
  • The Eloi Agonists and Climate Change: Many climate change believers are not scientists but are motivated by moral and social factors, attaching a sense of moral superiority to their beliefs.
  • The Author’s Position on Climate Change: The author distinguishes between green activism, which is largely religious, and the scientific study of climate change.
  • The Apocalyptic Nature Cult: A significant portion of the American public believes climate change poses an existential threat to humanity, reflecting an apocalyptic worldview.
  • The IPCC’s Assessment of Climate Change’s Economic Impact: The Intergovernmental Panel on Climate Change (IPCC) concludes that the economic impact of climate change will be relatively small compared to other factors.
  • The Disconnect between Scientific Consensus and Public Perception: There is a vast gap between the IPCC’s scientific assessment and the apocalyptic narrative promoted by climate activists.

Baptists and Bootleggers: The Political Economy of Climate Change

  • Baptists and Bootleggers Analogy: The climate change debate can be understood through the “Baptists and bootleggers” framework from public choice theory, where seemingly opposed groups support the same policies for different reasons.
  • The Dark Green Religion and Climate Science: The dark green true believers, like the Baptists, are motivated by moral and religious convictions, while scientists and renewable energy entrepreneurs, like the bootleggers, may benefit from climate concern.
  • The Reversal of the Baptists and Bootleggers Dynamic: Unlike the original analogy, where bootleggers pretended to be Baptists, in the climate change debate, the Baptists (dark green activists) often pretend to be bootleggers (scientists) to gain wider support.
  • The Distortion of Scientific Findings: This dynamic can lead to the exaggeration of climate change’s risks and the suppression of dissenting scientific views.
  • The Negative Impacts of Agriculture and Highways on Habitat: The author notes that agriculture and highways are major contributors to habitat destruction, but these are often overlooked in favor of focusing on energy production.
  • The Paradox of Opposition to Nuclear Power: The green movement’s opposition to nuclear power, a clean energy source, is paradoxical given their concern for habitat destruction.
  • The Prevalence of “Schrödinger’s Cats”: The author suggests that many individuals hold contradictory beliefs, supporting green causes while also engaging in activities that harm the environment.

Ergophobia: The Irrational Fear of Energy

  • Ergophobia: The Fear of Work/Energy: Ergophobia is the irrational fear of work, which can be extended to mean the fear of using energy.
  • The Origins of Ergophobia: Ergophobia in American culture predates climate change concerns, as seen in the anti-nuclear power movement.
  • The Climate Movement as a Repackaged Crusade Against Energy: The modern climate change movement can be seen as a continuation of the anti-energy crusade, using climate change as a justification.
  • The Contradiction of Opposing Nuclear Power While Fearing Climate Change: The green movement’s opposition to nuclear power, a carbon-free energy source, is a central contradiction within their ideology.
  • The Assumption that Anything Human is Bad: The fundamentalist green perspective seems to stem from the belief that any human impact on nature is inherently negative.

The Meme Plague and the Suppression of Doubt

  • The Xhosa Cattle Killing: The Xhosa cattle killing of 1856-57 serves as an example of a meme plague, where a self-destructive belief spreads rapidly through a population, leading to devastating consequences.
  • The Power of Social Feedback and Suppression of Doubt: The Xhosa example highlights how social feedback, superstition, and the suppression of skepticism can lead to the adoption of harmful beliefs.
  • Meme Plagues and Societal Stagnation: In modern societies, meme plagues may not cause catastrophic collapse but can contribute to societal stagnation.
  • Eric Hoffer’s Observation on Mass Movements: Hoffer noted that mass movements in America tend to evolve into rackets, cults, or corporations.
  • The Eloi-Machiavelli Effect Dynamic: The Eloi agonists’ crusades, driven by religious fervor, can gain widespread acceptance and be exploited by those seeking power or profit.
  • Evolutionary Game Theory and Morality: Evolutionary game theory suggests that morality arises from non-zero-sum interactions, where cooperation benefits all parties.
  • The Zero-Sum Mentality and the Erosion of Morality: In a static, no-growth society, the zero-sum mentality prevails, leading to a decline in cooperation and morality.
  • The Green Religion’s Zero-Sum Approach: The green religion’s focus on limiting human impact can promote a zero-sum worldview, hindering progress and fostering conflict.
  • The Corruption of Science by Zero-Sum Politics: Zero-sum political funding can bias scientific research, promoting agendas rather than objective truth.
  • The Fragility of Science and the Resurgence of Religion: The author argues that science is not the default human mode of thinking and that the resurgence of religious and value-laden approaches threatens its integrity.

Science Fiction’s Predictions of Societal Decline and the Suppression of Science

  • Science Fiction’s Depiction of Civilizational Cycles: Many science fiction writers have explored the rise and fall of civilizations, often attributing decline to the loss of scientific knowledge or the suppression of technology.
    • Examples: Wells’ “The Time Machine,” Campbell’s “Twilight,” Asimov’s “Foundation,” Piper’s “Tarot,” and Heinlein’s “Future History” series.
  • Heinlein’s “Crazy Years” and the Deterioration of Mores: Heinlein’s “Future History” chart predicted a period of technological advancement accompanied by a decline in morals and social institutions, culminating in mass psychosis.
  • The Ubiquity of Perceived Moral Decline: The perception of moral decline is a recurring theme throughout history, as each generation tends to judge subsequent generations’ values against their own.
  • Heinlein’s Prediction of a “New Crusade”: Heinlein also predicted a hiatus in space travel and the rise of a religious dictatorship that suppresses scientific research and technological progress.
  • The Potential Accuracy of Heinlein’s Predictions: The author suggests that Heinlein’s predictions, particularly regarding the suppression of science and the rise of religious fervor, may be becoming increasingly accurate.
  • Carl Sagan’s Warning about the Decline of Critical Thinking: Carl Sagan warned about the potential for a decline in critical thinking and a resurgence of superstition in a society dominated by technology controlled by a few.

Chapter 8: Forbidden Fruit

The Cultural Shift and Its Consequences

  • The 1960s and 70s Cultural Shift: A significant cultural shift occurred in the 1960s and 70s, marked by a decline in trust, the rise of “culture wars,” and changes in fashion.
  • Impact of the Shift: This shift contributed to the lack of technological progress, particularly in areas like flying cars, due to increased susceptibility to baseless fears and hostility towards technology, especially energy.
  • Ergophobia:
    • Defined as the fear or hatred of work or energy.
    • Grew from a rare condition in the 1950s to a widespread phenomenon by the 1970s.
  • Hypothetical Consequences:
    • If this cultural shift had occurred in a smaller, isolated community, it could have led to economic collapse, similar to historical examples like the Xhosa cattle-killing movement.
    • Even in a free market, areas with strong ergophobia might have failed, similar to many religiously-based utopian experiments.

The Three Barriers to Future Technology

  • Cerberus Analogy: The barriers to achieving the futuristic technologies we were promised are compared to Cerberus, the three-headed dog guarding the underworld in Greek mythology.
  • The Three Heads:
    1. Bureaucratic Structure of Science and Technology: Amplifies the Machiavelli Effect, where institutions prioritize self-preservation and power over their intended goals.
    2. Ergophobic Religion of the Eloi Agonists: The widespread fear and rejection of energy and technology, driven by those who benefit from fear-mongering and regulation.
    3. Strangling Red Tape of Regulation: Excessive government rules and regulations that hinder innovation and development.
  • Synergistic Effect: These three forces have increasingly reinforced each other, creating a powerful barrier to progress.
  • Historical Impact of Regulation: Even in isolation, regulation was sufficient to hinder early attempts at developing flying cars.

The Case of the Aero Car and Potential Economic Impact

  • Moulton Taylor’s Aero Car: A hypothetical scenario explores what might have happened if the Aero Car and similar designs had followed the same development path as the automobile.
  • Assumptions:
    • Starting point in 1950 with high pilot registration levels.
    • Initial ownership by wealthy enthusiasts and hobbyists, followed by broader adoption.
    • Technological improvements in areas like wing attachment, power, speed, reliability, and ease of use.
  • Potential Economic Impact:
    • By 1975, a significant new industry comparable to the automotive sector could have emerged, boosting GDP directly and indirectly.
    • Analogy to the Interstate Highway System: The Interstate system significantly impacted the U.S. economy, contributing to growth, lower prices, time savings, and reduced accidents.
    • Increased Property Values: Similar to the impact of new highways, flying cars could have increased property values and economic activity in previously isolated areas.
    • New Businesses and Increased Economic Radius: Flying cars could have spawned new industries (e.g., landing strips) and expanded the economic reach of households.
    • Addressing the Great Stagnation: Flying cars could have mitigated the Great Stagnation by increasing connectivity and reducing transportation costs to remote areas.
  • Lost Potential: If this path had been taken, we might have achieved significantly more advanced technologies by now.
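The “economic radius” point is geometric: the area reachable within a fixed travel time grows with the square of average door-to-door speed, so faster vehicles enlarge a household’s economic reach far more than linearly. A quick computation with illustrative speeds:

```python
# Reachable area in one hour grows as pi * (speed * time)^2.
import math

def reachable_area_sq_mi(speed_mph: float, hours: float = 1.0) -> float:
    return math.pi * (speed_mph * hours) ** 2

for v in (30, 100, 300):  # car in traffic, fast car, hypothetical flying car
    print(f"{v} mph -> {reachable_area_sq_mi(v):,.0f} sq mi within an hour")
# 30 mph: ~2,827; 100 mph: ~31,416; 300 mph: ~282,743 (100x the 30 mph area)
```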

Bureaucracy and Regulation: Hindering Innovation

A Multitude of Officers

  • Bureaucracy as a Perennial Problem: Throughout history, excessive bureaucracy and regulation have been identified as obstacles to progress, as noted by figures like Thomas Jefferson and Winston Churchill.
  • The Wright Brothers and French Customs:
    • In 1908, Wilbur Wright encountered damage to his airplane due to careless handling by French customs officials.
    • This incident highlights the potential for bureaucratic interference to hinder even groundbreaking innovations.
  • Bruce Halleck and the Rotaplane:
    • Bruce Halleck, inventor of the Rotaplane flying car, faced legal trouble due to a misapplication of the Neutrality Act, illustrating how regulations can stifle inventors.
  • Robert Edison Fulton Jr. and the Airphibian:
    • The Airphibian, another promising flying car design, faced regulatory hurdles and ultimately failed to achieve production due to the high costs of meeting government standards.
  • Moulton Taylor’s Perspective:
    • Moulton Taylor attributed the failure of his Aero Car to government regulation, specifically the FAA and DOT.
    • He claimed the agencies feared the widespread adoption of flying cars and the resulting air traffic challenges.

The Corvair, Ralph Nader, and the Rise of Fear-Mongering

  • The Chevrolet Corvair:
    • The 1960 Chevrolet Corvair, with its rear-engine design, faced criticism despite its handling being comparable to or even superior to other cars of its time.
    • It became a target for Eloi Agonist fear-mongering, exemplified by Ralph Nader’s book “Unsafe at Any Speed.”
  • Societal Shift and Susceptibility to Scare Stories:
    • By the 1960s, fewer people had direct experience with machinery, making them more vulnerable to scare stories about technology.
    • This increased susceptibility created an environment where fear-mongering by individuals like Nader and the perceived need for regulation could thrive.

The Great Explosion

  • The Regulatory Explosion: The Great Stagnation coincided with a massive increase in federal regulation, particularly during the Nixon administration in the 1970s.
  • Examples of New Regulations: This period saw the passage of major environmental and safety regulations like the Clean Air Act, Clean Water Act, and the creation of agencies like OSHA and the EPA.
  • Growth of the Federal Register: The Federal Register, which documents new regulations, grew dramatically under Nixon and continued to expand in subsequent decades.
  • Complexity and Burden of Regulations:
    • Modern regulations are often complex and difficult to understand, as illustrated by an example from the Federal Aviation Regulations.
    • The sheer volume and complexity of regulations create a significant burden for businesses and individuals.
  • Economic Impact of Regulation:
    • A study by Dawson and Seater found that excessive regulation has significantly reduced U.S. household income compared to a scenario with lower regulation.
    • They linked slower economic growth to heavier regulatory burdens.
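The scale of such estimates comes from compounding. A hedged sketch: the two-percentage-point annual drag and the 62-year span below are illustrative stand-ins of the right order of magnitude, not figures quoted from the study.

```python
# How a modest annual growth drag compounds over decades.
YEARS = 62    # roughly a postwar sample span (illustrative)
DRAG = 0.02   # hypothetical annual growth lost to regulation

factor = (1 + DRAG) ** YEARS
print(f"after {YEARS} years, the less-regulated path is ~{factor:.1f}x larger")
# ~3.4x: small annual differences become enormous differences in income levels.
```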
  • Overestimation of Regulatory Benefits:
    • The actual benefits of regulation are often overstated, as demonstrated by an analysis of the 1962 drug laws, which slowed innovation without improving quality.
  • Impact on Life Expectancy:
    • Life expectancy gains in the U.S. were greater in the first half of the 20th century (pre-heavy regulation) than in the latter half, suggesting that regulation may not be the primary driver of safety improvements.
  • Burden on Aviation:
    • Regulation has significantly increased the cost of components for certified aircraft.
  • Airline Deregulation as a Counter-Example:
    • The deregulation of the airline industry in 1978 led to lower fares, increased safety, and overall benefits for consumers.
    • This example demonstrates that deregulation can lead to positive outcomes, challenging the assumption that heavy regulation is always necessary.

Product Liability and the Destruction of General Aviation

  • Rise of Product Liability: The 1970s saw a surge in product liability lawsuits, often driven by scare stories and large jury awards.
  • Impact on the General Aviation Industry:
    • This increase in lawsuits devastated the general aviation industry, leading to the cessation of production by major manufacturers like Cessna and Piper.
  • Explanation of the Collapse:
    • The Concise Encyclopedia of Economics attributes the collapse to the dramatic increase in liability costs, which made it unsustainable for aircraft manufacturers to operate.
  • The Role of Regulation:
    • While the energy crisis initially impacted sales, the subsequent collapse of the industry was primarily driven by the product liability environment, not cyclical factors.
  • General Aviation Revitalization Act:
    • Congress passed the General Aviation Revitalization Act in 1994 to limit liability for older aircraft, but the industry never fully recovered.
  • Long-Term Consequences:
    • The cost of private airplanes has skyrocketed, and the number of new aircraft produced annually is a fraction of what it once was.
  • Broader Impact of Liability Costs:
    • The general aviation industry’s experience is just one example of how product liability costs have negatively impacted various industries and the economy as a whole.
  • Economic Costs of the Tort System:
    • The U.S. tort system, including product liability, consumes a significant portion of GDP, potentially slowing overall economic growth.
    • This cost also includes the loss of talented individuals who are drawn into legal professions instead of contributing to innovation and production.

The German Economic Miracle: A Case for Deregulation

Miracles

  • Robert Gordon’s “Miracle”: Economist Robert Gordon described the early 20th century’s productivity boom as a “miracle,” implying that the Great Stagnation was a lack of such miracles.
  • The German Economic Miracle: The post-WWII recovery of West Germany demonstrates that economic miracles can be intentionally created through deregulation.
  • Post-War Germany’s Dire Situation:
    • Germany faced widespread destruction, food shortages, and severely hampered industrial production after the war.
  • Ludwig Erhard’s Reforms:
    • Ludwig Erhard, influenced by Hayek’s free-market ideas, implemented sweeping deregulation, including the removal of price controls and rationing.
  • Dramatic Economic Rebound:
    • Deregulation had an immediate and positive impact on the West German economy, leading to rapid increases in industrial production and a resurgence of economic activity.
  • Contrast with East Germany:
    • East Germany, under communist rule, experienced economic stagnation, highlighting the benefits of a free market approach.
  • The Importance of Ignoring “Right-Thinking” People:
    • The German example shows that progress can be achieved by disregarding the conventional wisdom of those who advocate for excessive control and regulation.

The Great Strangulation

  • Regulation and Long-Term Decline: Over time, excessive regulation hinders learning, stifles innovation, protects inefficiency, and ultimately reverses progress.
  • The Role of Energy and Sanity: The optimistic predictions of the mid-20th century relied on continued growth in energy consumption and a reasonable regulatory environment.
  • The Great Stagnation as Self-Inflicted: The Great Stagnation was not a natural phenomenon but a consequence of choices that stifled innovation and progress.

The Unexpected Rise of the Family Car

The Family Car

  • H.G. Wells’ Predictions: H.G. Wells’ “Anticipations” accurately predicted the development of motor trucks, private motor carriages, and motor omnibuses.
  • Limited Vision:
    • Wells saw the private car as a luxury for the wealthy, similar to private jets today.
    • He failed to foresee the widespread adoption of the automobile by the general population.
  • The Missing Element: Productivity:
    • D.S.L. Cardwell points out that contemporary thinkers failed to anticipate the universal motor car because they did not grasp the potential impact of increased productivity.
  • The Importance of Productivity:
    • High productivity is essential for making goods affordable and accessible to the masses.
    • The assembly line and other innovations dramatically increased automobile production, making the family car a reality.
  • The Rise of the Suburbs: The automobile and the highway system facilitated the growth of suburbs, allowing people to live further from urban centers.
  • The Decline of Infrastructure Investment:
    • Investment in transportation infrastructure peaked in the 1960s and has declined since, contributing to the stagnation.
  • The Demonization of Cars and the Rise of Regulation:
    • The rise of environmental concerns in the 1970s led to increased regulation and the demonization of cars, further hindering progress in transportation.

Leveling Up and the Eloi Agonists

Leveling Up

  • Hans Rosling’s Levels of Wealth: Hans Rosling categorized global wealth into four levels based on income and access to transportation: Barefoot, Bicycle, Motorbike, and Car.
  • The Industrial Revolution’s Impact: The Industrial Revolution dramatically reduced the percentage of the world’s population at the lowest level of wealth.
  • Progress in the U.S.: The average American progressed through Rosling’s levels throughout the 20th century.
  • Global Wealth Distribution: Over the past 50 years, global wealth distribution has become more bell-shaped as many people moved up from the lowest levels.
  • The Absence of Level 5: The Great Stagnation can be seen as the absence of a “Level 5” in Rosling’s framework, representing a significant leap beyond car ownership.
  • Continued Energy Growth in Developing Countries: Developing countries continue to experience growth in energy use per capita, unlike developed countries where it has stagnated.
  • Maslow’s Hierarchy and the Eloi Agonists:
    • The rise of Eloi agonists in developed countries may be linked to Maslow’s hierarchy of needs; once basic needs are met, people may focus on virtue signaling and fear-mongering rather than material progress.
  • Economic Growth in Developed vs. Developing Countries: Developing countries like China and India are experiencing faster economic growth than developed countries like France, Germany, and the U.S.

The Great Strangulation Revisited

  • The Root Cause of the Failed Future: The failure to achieve the futuristic technologies we were promised is primarily due to the stagnation of energy growth and the rise of ergophobia, fueled by the Eloi agonists.
  • The Role of Bureaucracy and Regulation: Bureaucracy, the Machiavelli effect, and excessive regulation have further exacerbated the stagnation.
  • Moving Forward: Understanding the causes of the Great Stagnation is crucial for exploring possibilities for reversing it and achieving a more technologically advanced future.

Chapter 9: Ceiling and Visibility Unlimited

The Potential of Flying Cars

  • Flying cars and helicopters, offering point-to-point travel, have been technologically feasible since the 1930s and 1940s, respectively.
  • Despite this feasibility, widespread adoption hasn’t occurred, representing a “failure of nerve” in technological development.
  • This chapter explores the human element in adopting flying cars, particularly focusing on the feasibility of average individuals piloting them.

Public Perception and Reality of Flying

  • A common reaction to the idea of flying cars is skepticism and the belief that they are impossible or impractical.
    • Concerns include:
      • Difficulty and fear associated with flying, especially for those without helicopter experience.
      • High cost of ownership and operation.
      • The perceived inability of average car drivers to pilot aircraft.
  • The author challenges this perception by arguing that flying is a learnable skill, comparable to riding a bicycle, and that planes can be as safe as cars.
  • The author’s personal experience of becoming a pilot supports this claim, highlighting the accessibility of flying with proper training and technology.

The Author’s Journey to Becoming a Pilot

  • Motivated by a lack of practical experience in flying, the author obtained a pilot’s license and purchased a 1977 Beechcraft airplane as part of his research for the book.
  • This experience provided valuable insights into the realities of flying and informed his analysis of flying cars.
  • Beyond practical considerations, the author emphasizes the inherent joy and wonder of flying, offering a unique perspective on the world.

The State of General Aviation

  • General aviation (GA) aircraft in the US are often older models, similar to classic cars, reflecting a stagnation in the industry.
  • This stagnation is partly attributed to regulatory issues that have stifled innovation and production of new aircraft.
  • The author suggests that without these regulatory hurdles, the GA industry could be significantly larger and more vibrant, with newer and potentially safer aircraft.

Safety in General Aviation

  • Despite anecdotal perceptions of danger, general aviation is statistically relatively safe.
  • The author’s low airplane insurance cost (less than $800 per year) suggests that insurance companies assess the risk as comparable to cars or homes.
  • The leading cause of death among active pilots is motorcycle accidents, further highlighting the relative safety of flying.

The Mechanics of Flight

  • Airplanes are mechanically simpler than cars, with basic models requiring only a glider-like structure, a motor, and a propeller.
  • However, flying is more challenging than driving due to the three-dimensional nature of flight, requiring adjustments to human instincts.
  • The Wright brothers’ significant contribution was solving the problem of control (attitude and steering) in flight.
  • Learning to fly involves mastering four key tasks:
    1. Aviate: Maintaining control of the aircraft.
    2. Navigate: Ensuring the plane is on the correct course.
    3. Communicate: Interacting with air traffic control and other pilots.
    4. Engineer: Monitoring and managing the engine and other mechanical systems.
  • While each task is manageable individually, performing all four simultaneously requires practice and skill.

The Dangers of Slow Flight

  • Unlike cars, airplanes are most vulnerable at low speeds, especially during landing.
  • Maintaining a minimum speed is crucial to prevent stalling, the loss of lift that occurs when airflow separates from the wings.
  • Stall speed is the speed at which, even at the stall angle of attack, lift just equals the airplane’s weight; below it, the wing can no longer support the plane and altitude is lost.
  • The author estimates that around half of current car drivers might not be suited to manually piloting planes due to the challenges of slow flight.

Technological Advancements and Automation

  • Modern technology offers solutions to some of the challenges of flying.
    • Automated engine control and monitoring systems reduce the pilot’s workload.
    • GPS simplifies navigation significantly.
    • Autopilots can automate various aspects of flight.
    • Fully autonomous aircraft are also emerging, capable of flying without human intervention.

The Allure and Challenges of Flight

  • Despite the challenges, flying offers a unique and rewarding experience.
  • The author notes that some people are hesitant to fly in small planes due to concerns about motion sickness and noise.
    • Motion sickness can be a factor, especially on turbulent days, but the forces experienced are often no greater than in a car, just different in direction and frequency.
    • Noise is also a concern, but technological advancements can potentially address this issue.
  • Perhaps the biggest hurdle is the lack of intuitive understanding of how airplanes work, contributing to fear and apprehension.

Understanding How Airplanes Fly

Windy

  • The force exerted by wind increases with the square of its speed, making high-speed flight a significant challenge.
  • Airplanes utilize aerodynamic principles to generate lift and overcome drag, allowing them to stay aloft.

How It Flies

  • Airplane wings generate lift by creating a pressure difference between the upper and lower surfaces.
    • Air accelerates over the curved upper surface, producing a region of lower pressure (Bernoulli’s principle).
    • The higher pressure underneath the wing pushes the wing upward, generating lift.
  • Lift is proportional to the square of the speed and the angle of attack (the angle between the wing and the oncoming airflow).
  • Stall occurs when the angle of attack exceeds a critical point, causing the airflow to separate from the wing and resulting in a loss of lift.
  • The Wright brothers initially modeled wings on bird wings but later adopted more rounded and flatter designs for easier handling.
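
A minimal sketch tying these square-law relationships together with the standard lift equation, L = 0.5 · ρ · v² · S · C_L. The airplane parameters below are illustrative assumptions, not figures from the book:

```python
import math

RHO = 1.225  # sea-level air density, kg/m^3

def lift(v, wing_area, cl):
    """Standard lift equation: L = 0.5 * rho * v^2 * S * C_L (newtons)."""
    return 0.5 * RHO * v**2 * wing_area * cl

def stall_speed(weight_n, wing_area, cl_max):
    """Slowest speed at which lift at the stall angle of attack equals weight."""
    return math.sqrt(2 * weight_n / (RHO * wing_area * cl_max))

# Illustrative numbers for a small four-seat airplane (assumed, not from the book):
weight_n = 1100 * 9.81   # ~1,100 kg gross weight, in newtons
S, cl_max = 16.0, 1.6    # wing area (m^2) and maximum lift coefficient

vs = stall_speed(weight_n, S, cl_max)
print(f"Stall speed: {vs:.1f} m/s ({vs * 1.944:.0f} knots)")
# Doubling speed at a fixed angle of attack quadruples lift:
print(lift(2 * vs, S, cl_max) / lift(vs, S, cl_max))  # -> 4.0
```

The same formula covers the weight discussion below: lift scales with v², so a heavier airplane must fly, and land, faster in proportion to the square root of its weight.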

The Power Curve

  • The power curve illustrates the relationship between airspeed and vertical speed (rate of climb or descent) for an airplane.
  • The curve shifts up or down depending on engine power (throttle setting).
  • At the front end of the curve, pushing the nose down increases speed, both forward and downward.
  • At the back end of the curve (slower speeds), pulling the nose up can actually cause the plane to slow down further and descend, increasing the risk of a stall.
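
A toy power-curve model makes the front-side/back-side distinction concrete. It assumes the standard decomposition of power required into a parasite-drag term (growing as v³) and an induced-drag term (growing as 1/v); the constants are arbitrary, chosen only for illustration:

```python
# Toy power-required curve: P(v) = A*v**3 (parasite drag) + B/v (induced drag).
# A and B are arbitrary illustrative constants, not numbers from the book.
A, B = 0.002, 40_000.0

def power_required(v):
    return A * v**3 + B / v

for v in range(20, 81, 10):  # airspeed in m/s
    print(f"{v:3d} m/s -> {power_required(v):7.0f} W")

# The minimum-power speed divides the curve into its two regions:
v_min = min(range(15, 90), key=power_required)
print(f"Below ~{v_min} m/s (the back side of the curve), flying slower")
print("requires MORE power, so raising the nose without adding throttle")
print("only deepens the descent and edges toward a stall.")
```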

Airplane Balance and Weight

  • Airplanes are balanced on their wings, and the elevator (a control surface on the tail) provides pitch control.
  • Proper weight distribution and balance are crucial for safe flight, similar to loading a rowboat.
  • Aerodynamic forces increase with the square of the speed, allowing heavier planes to maintain lift by flying faster.
  • However, increased weight also requires higher landing speeds, potentially making runways seem shorter during landing.

Weather as a Limiting Factor

  • Weather is a major constraint on flying, especially for general aviation.
  • Pilots who are instrument-rated and fly aircraft equipped for icing conditions have more flexibility but still face limitations.
  • Wind shear, a sudden change in wind speed or direction, can be particularly dangerous, especially during landing.
  • Clouds, haze, and fog can reduce visibility, posing a risk of collision with terrain or other aircraft.
  • Flying in clouds or over featureless terrain can lead to spatial disorientation, where pilots lose their sense of direction and may inadvertently enter a dangerous flight attitude.
  • Learning to trust instruments over instincts is crucial for safe flight in such conditions.

Air Traffic Control

  • Air Traffic Control (ATC) plays a vital role in ensuring the safe and orderly flow of air traffic.
  • Pilots communicate with ATC using radio, providing information about their position, intentions, and altitude.
  • ATC provides instructions and clearances to pilots, helping them avoid collisions and navigate through controlled airspace.
  • The author uses a fictional scenario of driving a car in a town with strict traffic control to illustrate the complexities of ATC procedures.
  • Airplanes are surprisingly difficult to see at a distance, making radio communication essential for avoiding collisions.

The $100 Hamburger

  • The $100 hamburger is a common phrase among pilots, referring to a short flight to a nearby airport with a restaurant.
  • It represents a blend of the practical and the recreational aspects of general aviation.
  • While sometimes used humorously to justify a flight, it also highlights the potential of flying as a convenient mode of transportation, especially when the destination is near an airport.
  • The author contrasts this with the complexities and inconveniences of commercial air travel, emphasizing the “last-mile problem” of aviation.
  • The $100 hamburger experience offers a glimpse of the convenience that flying cars could provide.

Conclusion

  • The chapter concludes by reiterating that flying, while challenging, is a learnable skill and that technological advancements are making it more accessible.
  • Overcoming the psychological barriers and embracing the potential of flight could unlock a new era of personal transportation, where flying cars become a practical reality.

Chapter 10: Dialogue Concerning the Two Great Systems of the World

The History and Technology of Flying Cars

  • The question of whether people could pilot flying cars was answered with a qualified yes.
    • Many people could pilot the simpler flying cars from the 1950s.
    • Modern craft with automated assistance would be even easier to pilot.
  • The harder question is whether people would be willing to put in the effort to fly them.
  • Convertibles and VTOLs (Vertical Take-Off and Landing) are the two major categories of flying cars.

VTOLs: Helicopters and Beyond

Helicopters

  • Helicopters offer a significant advantage in certain terrains and situations.
    • New Zealand, with its rugged terrain and sparse roads, has a high rate of helicopter ownership (1 per 6,000 people).
    • The US has a much lower rate (1 per 35,000 people).
  • Comparison of Helicopter (Robinson R-44) and Airplane (Author’s):
    • Similarities:
      • Total weight
      • Interior size
      • Useful load
      • Cruise speed
      • Operational ceiling
      • Engine type (Lycoming air-cooled horizontally opposed aero engine)
    • Differences:
      • Airplane has twice the range.
      • Helicopter requires a much smaller landing area (10 ft x 10 ft pad vs. thousands of feet of runway).
  • Advantages of Helicopters:
    • Smaller landing area: This allows for helipads in places like hospitals, resorts, and corporate headquarters.
    • More civilized takeoff: No taxiing or runway required, simply a smooth vertical ascent.
    • Less susceptible to wind gusts:
      • The rotor tips move at airliner-like speeds, so gusts change the relative airflow proportionally little.
      • The smaller rotor area reduces the impact of gusts.
  • Disadvantages of Helicopters:
    • More complex piloting:
      • More links in the energy chain (engine speed, altitude, rotor speed).
      • More demanding control due to instability and non-intuitive rotor dynamics.
      • Lag between control input and response (unlike airplanes, which can be trimmed and briefly left hands-off).
    • Pilot seat placement:
      • Helicopter pilot seats are typically on the right, a tradition from the Sikorsky R-4, whose demanding controls favored keeping the right hand on the cyclic stick and the left hand on the collective and radio.
    • Mechanical complexity:
      • Individually hinged, flapped, and tilted rotor blades.
      • Collective and cyclic controls for adjusting blade angle of attack.
      • Secondary driveshaft for the tail rotor with its own collective pitch control.
    • Higher Cost:
      • More powerful engine required.
      • Higher maintenance needs.
      • New helicopters: $1 million - $10 million.
      • Used helicopters: $100,000 - $1 million.
      • Five times more expensive than small piston planes.
    • Engine Cost:
      • Lycoming IO-540-AE1A5 (used in the R-44): $118,481.
      • Most helicopters use turbines, increasing the cost.
      • Piston-powered Robinsons: $300,000 - $400,000.
      • Guimbal Cabri G2: $410,000.
    • Performance Trade-offs:
      • Lower top speed.
      • Higher fuel costs per mile.
  • Speed Limitations:
    • Hovering: Rotor blade tips must remain subsonic to avoid noise, shockwaves, and power loss.
    • Forward Flight: Advancing blade approaches the sound barrier, while retreating blade risks stalling.
    • Standard small helicopter design limited to around 125 knots.
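
The tip-speed limit is easy to quantify: in forward flight, the advancing blade’s tip sees the rotor’s own tip speed plus the helicopter’s forward speed, and that sum must stay comfortably subsonic. The rotor dimensions and Mach limit below are rough assumptions for an R-44-class machine, not the book’s figures:

```python
import math

SPEED_OF_SOUND_KT = 661.0  # knots at sea level, standard conditions
TIP_MACH_LIMIT = 0.9       # rough practical limit before noise/shock losses (assumed)

# Roughly R-44-class rotor (assumed): 33 ft diameter spinning at ~400 RPM.
diameter_ft, rpm = 33.0, 400.0
tip_speed_kt = math.pi * diameter_ft * rpm / 60.0 * 0.5925  # ft/s -> knots

max_forward_kt = TIP_MACH_LIMIT * SPEED_OF_SOUND_KT - tip_speed_kt
print(f"Hover tip speed: {tip_speed_kt:.0f} kt")
print(f"Advancing-blade Mach limit leaves ~{max_forward_kt:.0f} kt of forward speed")
# In practice, retreating-blade stall bites earlier, which is why typical
# small helicopters cruise closer to the ~125 kt figure cited above.
```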

Autogyros

  • Autogyros offer a compromise between airplanes and helicopters.
  • Power Efficiency: All engine power goes to thrust, unlike helicopters that use some power to counteract torque.
  • Speed: Can achieve higher speeds than helicopters (experimental models up to 150 mph).
  • Takeoff: Requires a short takeoff roll (around 50 feet).
  • Landing: Can land in very small spaces (“on a dime”).
  • Vertical Takeoff Autogyros:
    • Pitcairn PA-36 Whirlwing (1930s) could achieve a 20-foot hop using a pre-rotator.
    • The mechanism added complexity and risk.
  • Simplest Vertical Takeoff Solution: A stationary fan creating a headwind.
  • Bensen Teeter Hub (1950s):
    • Simplified rotor design by eliminating the need for individually hinged blades in two-bladed rotors.
    • Reduced cost and increased reliability.
  • Recent Resurgence:
    • Driven by the Bensen-style teeter hub and light-sport category regulations.
    • Offer a good compromise in terms of cost, takeoff roll, performance, and training.
    • Easier to make roadable.

The Flying Car Spectrum

  • Helicopters at one end (high cost, VTOL).
  • Roadable airplanes at the other end (lower cost, long runways).
  • Gyros in the middle (compromise between cost and takeoff/landing requirements).

Travel Theory and the Value of Flying Cars

The Value of Time

  • Convertible Planes (e.g., the Aerocar):
    • Significant overhead time for conversion and airport travel (at least 1 hour).
    • Fit well with existing infrastructure.
  • VTOLs:
    • Require helipads but offer much faster travel for short distances.
  • The Jevons Paradox:
    • Increased efficiency leads to increased consumption (e.g., more efficient steam engines led to increased coal use).
    • Applied to flying cars: Reduced travel time (increased efficiency) will lead to more travel.
    • Increased efficiency leads to greater overall value.
  • Travel Theory:
    • Studies how much people travel in different environments and modes.
    • Shows that people travel more when long-distance travel becomes more convenient.
  • Travel Time Universal: People in all societies spend about an hour a day traveling on average.
  • Effective Car Speed Universal: The effective speed of a car, considering detours and road networks, is about 40 mph for trips of any length.
  • Trip Values Graph:
    • Shows the value of destinations at various distances based on people’s willingness to travel.
    • Peak for trips under 10 miles due to low time cost and shadowing effect (nearby destinations are preferred).
    • “Hump” for trips around 50 miles, likely due to day trip destinations (e.g., ballpark, hospital).
  • Flying Car Value Analysis:
    • Helicopter-like VTOL (100 knots): Dominates for short distances.
    • Convertible Airplane (200 knots): Better for longer distances.
    • Jet Car (400 mph, requires airport): Dominates for very long distances, but not used for short trips.
  • Value Calculation:
    • A jet car VTOL (400 mph, no latency) would be worth 7 times the value of a regular car.
    • A jet car with 1-hour overhead: 3.5 times the value of a car.
    • Fast prop-driven convertible (250 mph): 2.5 times the value of a car.
    • Slow convertible (100 mph, airport required): 1.4 times the value of a car.
    • Commercial air travel (400 knots, 3-hour overhead): Similar value increase to the slow convertible.
  • Hypothetical Jet Car Value: If the average car costs $35,000, a fast VTOL jet car could be worth $245,000 based on travel time value alone.
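
Hall derives these multiples from travel-theory calculations; the core trade-off between cruise speed and fixed overhead can be illustrated with a simple door-to-door effective-speed computation. This is a simplification for intuition, not the book’s actual value model, and the overhead figures are assumptions:

```python
# Door-to-door effective speed = distance / (overhead time + distance / cruise speed).
# A simplification of the travel-theory analysis, not Hall's exact model.
vehicles = {
    "ground car":           (40.0, 0.00),  # (effective mph, overhead in hours)
    "VTOL, 100 kt":         (115.0, 0.05),
    "convertible, 200 kt":  (230.0, 1.00),  # drive to airport + conversion (assumed)
    "jet car, 400 mph":     (400.0, 1.00),
}

for miles in (5, 50, 500):
    print(f"\n{miles}-mile trip:")
    for name, (mph, overhead_h) in vehicles.items():
        hours = overhead_h + miles / mph
        print(f"  {name:21s} {miles / hours:6.1f} mph door-to-door")
```

Run it and the pattern in the list above falls out: the low-overhead VTOL wins every short trip, while the fast convertible and jet car only pull ahead once trips are long enough to amortize their hour of overhead.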

The Potential for Autogyros and the Path Not Taken

  • The Importance of VTOL: Zimmerman, Custer, de la Cierva, and Pitcairn recognized the importance of VTOL capability for widespread adoption of flying cars.
  • Helicopter Limitations: Despite solving many airplane problems, helicopters are expensive due to their mechanical complexity and power requirements.
  • A Possible Autogyro Future:
    • With mass production and deregulation, autogyros could have become affordable by now.
    • Pitcairn PCA-2: $15,000 in 1931, under $5,000 by the late 1930s.
    • Autogyros could have enabled widespread private landing strips and facilitated helicopter development.
    • Eastern Airlines used mail-carrying autogyros from the Philadelphia Post Office rooftop in the 1930s.
  • Ideal Flying Car Specs (Travel Theory): 5 minutes overhead, 250 knots speed (worth 5 times a car).
  • Power Requirements for 250 Knots:
    • Need horsepower between 0.15 and 0.25 times the aircraft weight in pounds.
    • Estimated 600 horsepower needed for a 3,000-pound vehicle (2,000-pound empty weight, 1,000-pound payload); see the sketch after this list.
  • Piston Engines vs. Turbines:
    • Piston engines are too heavy for this application (a 600 hp engine weighs around 900 pounds).
    • Turbines offer a much higher power-to-weight ratio (e.g., Lycoming/Honeywell LTS101-650C: 675 hp, 241 pounds).
  • The Turbine Advantage:
    • High power-to-weight ratio.
    • Higher reliability (10,000-hour overhaul interval vs. 2,000 for piston engines).
  • Turbine Cost: Expensive, but potentially within reach if mass production and innovation had continued.
  • Potential for Turbine-Powered Flying Cars:
    • Could have achieved car-like adoption levels in a non-stagnated present with higher incomes.
  • Futurist Predictions: Flying cars of the future were envisioned with turbines, enabling both VTOL and high speeds.
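
A quick sketch of the horsepower arithmetic referenced above, applying the 0.15–0.25 hp-per-pound rule of thumb and comparing the piston and turbine power-to-weight figures from the list:

```python
# Rule of thumb from the text: cruising at 250 knots takes roughly
# 0.15-0.25 hp per pound of gross weight.
gross_weight_lb = 3000  # 2,000 lb empty + 1,000 lb payload

lo, hi = 0.15 * gross_weight_lb, 0.25 * gross_weight_lb
print(f"Power needed: {lo:.0f}-{hi:.0f} hp (call it ~600 hp)")

# Power-to-weight ratios, using the weights quoted in the text:
engines = {
    "typical 600 hp piston engine":   (600, 900),  # (hp, weight in lb)
    "Lycoming/Honeywell LTS101-650C": (675, 241),
}
for name, (hp, weight_lb) in engines.items():
    print(f"{name}: {hp / weight_lb:.2f} hp/lb")
```

The turbine comes out at roughly four times the piston engine’s power-to-weight ratio, which is the whole case for turbines in this weight class.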

The VTOL Dilemma and Potential Solutions

The VTOL Trade-off

  • Fast aircraft are generally not VTOL, and VTOL aircraft are generally not fast.
  • Power Consumption:
    • Turbines use a lot of fuel because they produce a lot of power.
    • Power requirements increase with the cube of speed.
    • VTOL requires significant power for hovering, leading to high fuel consumption per mile.
  • Design Dilemma: Choosing between speed and VTOL capability.

Quadcopter VTOL Example

  • Four 10-foot propellers (total disc area: 314 sq ft).
  • 3,000-pound car (requires 6,000 pounds of thrust for safe VTOL).
  • Disc loading: 19 pounds per sq ft.
  • Lift capacity: 6 pounds per horsepower.
  • Required horsepower: 1,000.
  • Helicopters use larger rotor disc areas and can draw on stored rotor momentum, so quadcopter peak-power figures don’t apply to them directly.
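
The example’s arithmetic, reproduced below; the 2× thrust margin and the 6 lb-per-hp lift figure come from the list above:

```python
import math

# Quadcopter VTOL example from the text.
prop_diameter_ft, n_props = 10, 4
disc_area_sqft = n_props * math.pi * (prop_diameter_ft / 2) ** 2
print(f"Total disc area: {disc_area_sqft:.0f} sq ft")              # ~314

weight_lb = 3000
thrust_lb = 2 * weight_lb  # 2x weight for safe VTOL control margin
print(f"Disc loading: {thrust_lb / disc_area_sqft:.0f} lb/sq ft")  # ~19

lift_lb_per_hp = 6  # at this disc loading, per the text
print(f"Required power: {thrust_lb / lift_lb_per_hp:.0f} hp")      # 1,000
```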

Solutions to the VTOL Dilemma

  1. High-Powered Jets:
    • Use powerful jets for both lift and forward thrust (e.g., Harrier).
    • Inefficient at low speeds, high fuel consumption.
  2. Tilt Rotor/Ducted Fan:
    • Compromise using prop rotors or tilting ducted fans for both takeoff and cruise (e.g., V-22 Osprey, Joby Aviation VTOL Air Taxi).
    • Variable-pitch propeller blades enhance efficiency.
  3. Separate Lift and Cruise Fans:
    • Lift fans pose a drag challenge in forward flight.
    • Retracting mechanisms add complexity.
    • Ryan XV-5 (1960s): Experimental design with lift fans inside the wings, covered by lids in forward flight.

The Convertible Jet Car and Its Challenges

Potential of the Convertible Jet Car

  • The Ryan XV-5 offers high theoretical value (7 times that of a car) but faces manufacturing cost challenges.
  • A jet-powered convertible is potentially feasible in a non-stagnated present.
  • Would integrate well with existing road and airspace systems.
  • Offers the most value for longer, high-speed trips.
  • Example: 300-knot convertible (similar to Cirrus Vision Jet) with 20-minute airport drive would be worth 5 times a car.

Challenges of the Convertible Jet Car

  1. Piloting Difficulty:
    • Jets are generally easier to fly than propeller planes (no torque, no lift effects from power changes).
    • However, higher speeds require more advanced planning and maneuvering skills.
  2. Airport Congestion:
    • Increased jet car usage could lead to airport backups, increasing travel latency and reducing the value proposition.

The Quad Gyro: A Promising Compromise

Advantages of Rotorcraft

  • Landing Flexibility: Can land in more places than fixed-wing aircraft.
  • Weather Resilience: Less affected by gusty winds and fog.

The Quad Gyro Concept

  • Combines advantages of autogyros and quadcopters.
  • Optimized for forward flight with pusher propellers.
  • Electric motors in the hubs replace lead bars for momentum and energy storage.
  • Benefits of the Quadcopter Design:
    • Separate or mixed tilt control for hubs.
    • Quick response to gusts for a smoother ride.
  • Power Requirements:
    • Lower power needed due to autogyro design (around 250 hp).
    • Piston engine or turbine generator possible.
  • Enhanced Autorotation: Motors assist takeoff and shorten takeoff roll.
  • Landing: Requires minimal space (50-foot circle).
  • Increased Efficiency: Motors supplement autorotation, allowing for faster flight.
  • Potential Speed: Could achieve 150 mph.

Cost and Value

  • Estimated cost: Three times the price of a high-end car.
  • Estimated value: Three times the value of a car.
  • Multi-fan tilt rotors (e.g., Joby) offer even higher value (four times that of a car).

Provisos for Quad Gyro and Multi-fan Tilt Rotors

  1. Stability in Gusty Wind: Susceptibility needs further investigation.
  2. Power Source and Range:
    • Joby’s claimed 150-mile range for its battery-powered craft limits its value, especially for longer trips.
    • Turbine generators, fuel cells, and ultimately non-chemical power sources are needed for these vehicles to reach their full potential.

Chapter 11: The Atomic Age

The Dream of Abundant Energy

  • The Atomic Age Promise: The latter half of the 20th century saw tremendous advancements in information technology, but progress in energy technology seemed to stagnate.
  • The Henry Adams Curve: Historical trends show that societal progress and optimism are linked to increasing energy availability.
    • This is not tied to a specific energy source (wood, coal, oil transitions have occurred).
  • Science Fiction’s Vision: Science fiction authors, unlike many in the real world, recognized the potential of nuclear energy as the next major energy source (e.g., H.G. Wells, E.E. Smith, Heinlein, Asimov).
  • Asimov’s Prediction: “The appliances of 2014 will have no electric cords, of course, for they will be powered by long-lived batteries running on radioisotopes.” (Isaac Asimov)

The Case for Nuclear Power

Energy Density and Cost Comparison

  • Jet Fuel Example (Boeing 747-400):
    • Fuel capacity: 57,285 gallons
    • Equivalent to 114 years of gasoline consumption for the average American (500 gallons/year).
    • Fuel weight: 194.6 tons
    • Energy produced: 7.5 terajoules (TJ)
    • Fuel cost: $343,710 (at $6/gallon)
  • Uranium Fission Equivalent:
    • Amount of uranium needed for 7.5 TJ: 94.3 grams (3.3 ounces)
    • Cost of uranium: $8.66 (based on $49.50/pound for yellowcake U₃O₈, with 84.8% uranium content); the core arithmetic is sketched after this list.
  • Energy Cost Comparison:
    • Terajoule Definition: A terajoule is the amount of energy the average American uses in all forms (including manufacturing, shipping, military) over about three years.
    • Consumer vs. Commodity Prices: Consumer retail energy prices are generally about five times higher than commodity prices for the raw fuel (except for gasoline).
    • Uranium Price Markup: Uranium has a higher markup (around 30x) due to isotopic enrichment.
    • Thorium Potential: A thorium fuel cycle could reduce uranium costs by a factor of 10.
    • Average Annual Energy Cost:
      • Chemical fuels: $6,553
      • Nuclear: $5.80
  • “Too Cheap to Meter”: Early in the atomic age, some believed nuclear electricity might be so cheap that metering wouldn’t be necessary.
    • Marginal Price vs. Absolute Price: Metering decisions are based on the marginal cost (cost to produce one more kilowatt-hour) rather than the absolute price.
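
The jet-fuel side of this comparison is straightforward arithmetic, and the uranium mass follows from the energy density of fission. The sketch below reproduces the figures above; the ~80 TJ/kg value for complete fission of U-235 is a standard approximation I’m assuming, not a number quoted from the book:

```python
# Jet fuel side (figures from the text):
gallons, price_per_gal = 57_285, 6.00
print(f"Fuel cost: ${gallons * price_per_gal:,.0f}")       # $343,710
print(f"Years of average driving: {gallons / 500:.1f}")    # ~114 years

# Uranium side: mass needed to release the same 7.5 TJ.
energy_tj = 7.5
fission_tj_per_kg = 80.0  # complete fission of U-235, standard approximation
uranium_g = energy_tj / fission_tj_per_kg * 1000
print(f"Uranium for {energy_tj} TJ: ~{uranium_g:.0f} g")   # ~94 g, a few ounces
```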

The Stalled Nuclear Revolution

  • Historical Energy Transitions: The 20th century saw transitions from wood to coal, then to oil and natural gas. Nuclear power started its growth around 1960 but stalled by 1975.
  • Potential Growth Curve: If nuclear power had followed the same growth trajectory as natural gas from 1920 to 1970, it could be the dominant source of electricity today.
  • Nuclear Fuel Efficiency: Nuclear fuels generate 1 million to 10 million times more energy per weight than chemical fuels, requiring less raw material extraction and producing less waste.
  • Wind Turbine Comparison: A wind turbine uses more lubricating oil than a nuclear plant uses uranium per kilowatt-hour generated.

Headroom for Improvement

  • Molten Salt Reactors (MSR) and Integral Fast Reactors (IFR): These designs, using thorium and uranium-plutonium alloys respectively, could achieve 99% fuel burn-up, significantly improving efficiency and reducing waste compared to 1960s designs.
  • Oak Ridge MSR Project: A molten salt reactor operated at Oak Ridge for about a year but was canceled by Richard Nixon.
  • Advantages of Thorium Reactors (as noted by Fortune magazine, February 2015):
    • Meltdown-proof
    • Less long-lived waste
    • More difficult to weaponize waste
    • Operates at safer atmospheric pressure
    • Higher operating temperatures, making them more efficient
    • No enrichment step needed
    • Higher fuel utilization (nearly 100% vs. less than 1% for uranium reactors)
    • Thorium is 3-4 times more abundant than uranium.
  • Fuel Reserves:
    • Proven uranium reserves: 77 years of global power
    • Proven thorium reserves: 6,472 years of global power
  • Additional Thorium Reactor Advantages:
    • Proliferation resistance (difficult to divert fuel for weapons)
    • Walk-away safety (passive cooling system using frozen salt plug that melts in case of failure).

Potential for Future Reactors

  • Efficiency Gains: With continued development, it’s estimated that we could get 100 times more power from each ton of nuclear fuel.

The Obstacle of Radiophobia

The Fukushima Disaster: A Case Study

  • Impact of Earthquake and Tsunami (2011):
    • Deaths: ~16,000 (mostly from drowning)
    • Injuries: 6,000
    • Missing: 2,500
    • Homeless: 250,000
    • Building damage: Extensive
  • Fukushima Power Plant Damage: The disaster caused damage to the power plant, leading to the release of radioactive materials.
  • Radiation-Related Deaths and Injuries: Zero
  • UNSCEAR Report (2021): “No adverse health effects among Fukushima residents have been documented that could be directly attributed to radiation exposure,” and further delayed effects are unlikely.
  • Media Coverage Bias: News coverage heavily focused on the power plant incident, overshadowing the far greater human toll from the earthquake and tsunami.
  • Evacuation Consequences:
    • Approximately 100,000 people were evacuated from the Fukushima area.
    • Radiation levels in the evacuated area were comparable to natural background radiation in Finland.
    • Around 1,600 evacuees died from various causes, including privation and suicide.
  • Fukushima Plume Area:
    • Radiation doses in most of the release area were less than the dose from a couple of CT scans in a year (2011 data).
    • By 2012, radiation levels had significantly decreased.

The Cost of Fear

  • Nuclear Power Safety Record: Nuclear power is statistically the cleanest and safest form of large-scale energy generation, even with older reactor designs.
  • Clean Air Act Benefits: The Clean Air Act, which reduced coal-fired pollution, saved over 50,000 lives per year in the U.S.
  • Missed Opportunity: If the environmental movement had focused on promoting nuclear power instead of opposing it, pollution-related deaths and CO2 emissions could have been dramatically reduced.
  • Misinformation and Public Perception: Activists spread the false claim that nuclear power plants could explode like atom bombs.
    • Post-Three Mile Island Poll: 66% of Americans believed a power reactor could explode like an atomic bomb.
    • Reactor Explosions: Why They Can’t Happen:
      • Explosive chain reactions require unmoderated critical mass with 80%+ enrichment.
      • Power reactors use moderated reactions with only 3% enriched uranium.
  • Energy Poverty Deaths: An estimated 28,000 U.S. residents die annually from cold due to energy poverty (primarily affecting the poor).
  • Potential Lives Saved: Wider deployment of nuclear power could have saved around a million lives over the past 25 years.

Proliferation Concerns

  • Weaponization of Reactor-Grade Uranium: The risk of rogue states or terrorists obtaining nuclear material from power reactors is exaggerated.
    • Enriching reactor-grade uranium (3%) to weapons-grade (80%) requires significant resources and effort, comparable to refining it from natural uranium.
    • Nations with nuclear ambitions are closely monitored.
  • Small Modular Reactors (SMRs): SMRs, which are factory-built, self-contained, and buried, would further reduce the risk of proliferation.

The Regulatory Burden

  • Impact of Regulation on Nuclear Power Costs: Excessive regulations have drastically increased the cost of nuclear power plants.
  • Speed Limit Analogy: Regulating nuclear power based on extreme safety standards is like setting a speed limit of 1 mph because people have been injured at 100 mph.
  • Peter Lang’s Study:
    • During the 1950s and 60s, the cost of nuclear plants decreased by 25% with each doubling of capacity.
    • This trend reversed after increased regulation.
    • If the trend had continued, the price of nuclear power could be 10% of its current level (2020).
    • Coal generation could have been largely replaced by nuclear by 2000.
    • Millions of deaths and gigatons of CO2 emissions could have been avoided.
  • Suppression of Innovation: The high cost and regulatory burden stifle innovation in the nuclear industry.
    • Cheryl Green (Oak Ridge): The current environment is hostile to innovation due to regulations, high costs, and risk aversion.
  • Lack of R&D Investment: The nuclear industry invests less in R&D as a percentage of revenue than most other major industries.
  • The U.S. Navy Exception: The U.S. Navy’s successful nuclear program demonstrates that a strong, accountable leadership can overcome bureaucratic obstacles.
    • Over 6,000 reactor years of accident-free operation.
    • 526 reactor cores built.
    • 86 nuclear-powered vessels in use.
  • Historical Cost and Potential Savings:
    • Pre-regulatory cost of nuclear power: $1,175/kilowatt.
    • Had the Henry Adams curve continued, a 2% annual increase in per capita power consumption (10 kW average) would have required a $235 investment per year.
    • With continued learning, new capacity could now cost $495/kilowatt.
    • A 2% annual increase in the average American’s 25 kW consumption would cost less than $250.
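
The investment arithmetic above checks out directly, and Lang’s 25%-per-doubling learning rate connects the two cost figures: three doublings take $1,175/kW to roughly the $495/kW quoted. A small sketch, with the learning-curve linkage being my reading rather than an explicit claim in the notes:

```python
def annual_cost(avg_kw, growth_rate, cost_per_kw):
    """Cost of adding growth_rate * avg_kw of new per-capita capacity each year."""
    return avg_kw * growth_rate * cost_per_kw

print(f"${annual_cost(10, 0.02, 1175):.2f}")  # 1960s figure: $235.00/person/year
print(f"${annual_cost(25, 0.02, 495):.2f}")   # with continued learning: $247.50

# Peter Lang's learning rate: costs fall 25% per doubling of built capacity.
cost = 1175.0
for doubling in (1, 2, 3):
    cost *= 0.75
    print(f"After {doubling} doubling(s): ${cost:,.0f}/kW")  # 3rd -> ~$496
```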

The Great Physics Stagnation

Decline in Nuclear Physics Research

  • Public Awareness vs. Scientific Understanding: Despite the political prominence of nuclear issues, there is a lack of public understanding of nuclear physics.
  • Limited Availability of Current Literature: Bookstores stock few recent texts on nuclear physics; many older books are reprinted without updates.
  • Norman Cook’s Observation: Nuclear physics lacks a unified theoretical framework comparable to quantum electrodynamics in atomic physics. This necessitates a phenomenological approach with separate formulations for different nuclear phenomena.
  • Used Bookstore Experience: A search in a well-stocked used bookstore revealed only three books on nuclear physics, the most recent being from 1991. This indicates a decline in interest and demand for the subject.
  • Declining PhDs in Nuclear Physics: The number of PhDs awarded in nuclear physics has decreased significantly since the 1970s, reflecting limited career prospects.
  • Consequences of Stagnation:
    • Most operating reactors are still based on 1960s Generation 2 pressurized water designs.
    • A shrinking field becomes more political, resistant to change (Machiavelli effect), and less likely to produce breakthroughs.

The Anti-Abundance Mindset

Opposition to Cheap, Clean Energy

  • Quotes from Green Activists:
    • “The prospect of cheap fusion energy is the worst thing that could happen to the planet.” (Jeremy Rifkin)
    • “Giving society cheap, abundant energy would be the equivalent of giving an idiot child a machine gun.” (Paul Ehrlich)
    • “It would be little short of disastrous for us to discover a source of clean, cheap, abundant energy because of what we might do with it.” (Amory Lovins)
  • History of Energy Source Attacks: Green activists have consistently attacked various sources of energy, including:
    • Coal: Despite its role in industrialization and improving living standards.
    • Oil: The foundation of modern transportation and industry.
    • Hydropower, nuclear fission, natural gas: Have all faced opposition.
  • Renewables Exception: Renewables like wind and solar have largely escaped criticism, but this is because they currently lack the capacity to meet society’s energy needs at an affordable price.
  • The True Motive: The underlying motivation behind these attacks is not solely about environmental concerns but also about controlling human impact on the environment. Green agonists view the ability to alter the Earth as inherently negative.

Fusion’s Potential and Future Opposition

  • Fusion as a “Future” Energy Source: Fusion has been perpetually “20 years away,” allowing it to avoid serious scrutiny.
  • Cold Fusion and the Unveiling of Opposition: The brief excitement around cold fusion revealed the true anti-abundance stance of some activists.
  • Fission vs. Fusion: Fusion might have offered advantages in terms of reactor size and cost, but its current leading project (ITER) is massive and expensive.
  • Predicted Opposition to Fusion: If fusion becomes viable, it will likely face the same exaggerated safety and environmental concerns that have been used to suppress fission.

A Call to Action for Energy Innovators

  • Challenges for Energy Pioneers: Technologists working on new energy sources should expect strong opposition if they succeed.
  • The Real Rewards: Despite the challenges, the potential to improve human lives through abundant, clean energy is a worthy goal.

Conclusion

  • Missed Opportunities: Progress in nuclear power and other advanced energy technologies has been stifled by fear, misinformation, and excessive regulation.
  • The Link Between Energy and Progress: Cheap, abundant energy is essential for economic growth, poverty reduction, and improving living standards.
  • Arthur C. Clarke’s Warnings:
    • “In this inconceivably enormous universe, we can never run out of energy or matter, but we can all too easily run out of brains.”

    • “If, as is perfectly possible, we are short of energy two generations from now, it will be through our own incompetence. We will be like stone-age men freezing to death on top of a coal bed.”

  • The Great Stagnation: The stagnation in energy technology is a major factor contributing to broader societal stagnation.
  • The Need for Rationality and Courage: Overcoming the obstacles to progress requires overcoming irrational fears, promoting scientific literacy, and adopting a more sensible approach to regulation.

Chapter 12: When Worlds Collide

The Urgency of Time and the Feynman Path

  • The Feynman Path: A concept proposed by Richard Feynman in his 1959 lecture “There’s Plenty of Room at the Bottom,” suggesting a top-down approach to nanotechnology by building progressively smaller machine shops, each capable of creating a smaller copy of itself, eventually reaching the molecular scale.
    • This path aims to establish a broad-based industrial capability at the atomic scale.
    • It emphasizes the importance of atomic precision in fabrication.
  • Historical Context:
    • The 1950s and the space race instilled a sense of urgency in technological development, exemplified by projects like the Manhattan Project and the Apollo program.
    • This urgency has since diminished, leading to stagnation in fields like nanotechnology.
  • The Foresight Institute’s Battelle Roadmap to Productive Nanosystems:
    • An attempt to guide nanotechnology research towards Feynman’s and Drexler’s original vision of atomically precise manufacturing.
    • Focused on techniques producing atomically precise products, excluding broader approaches like the Feynman path.
  • Feynman’s Vision:
    • Build an automated machine shop, operable by remote control.
    • Capable of building a complete working copy of itself at its own scale and a smaller scale.
    • The challenge lies in fabricating parts with better tolerances than the machine itself.
    • The ultimate goal is to reach molecular manipulation capabilities through iterative miniaturization.

Obstacles and Misconceptions

  • The Giggle Factor: A common reaction of skepticism and dismissal towards the idea of macro-scale self-replicating machines (SRMs).
    • This stems from the intuition that factories are inherently larger and more complex than their products (e.g., a car factory vs. a car).
    • The Feynman path challenges this intuition by proposing machines that can build copies of themselves.
  • Skepticism about Self-Replication:
    • George Friedman, engineering professor at USC, highlighted the lack of self-replicating machines despite the perceived simplicity of biological self-replication.
    • Mark Reed, Yale nanotech researcher, expressed skepticism about the self-replication capabilities of machines built with atomically-sized parts.
  • Challenges in SRM Design:
    • SRMs defy standard top-down design methodologies, as the product (the machine itself) is unknown until the design is complete.
    • Many SRM designs either:
      • Oversimplify construction capabilities by relying on pre-made complex subsystems.
      • Become overly complex by incorporating capabilities like mining and refining raw materials (e.g., the 1980 NASA moon base study).
  • The Misconception of MEMS as a Failed Feynman Path:
    • MEMS (Micro-Electro-Mechanical Systems): A technology for creating micromachines using photolithography (similar to integrated circuit manufacturing).
    • While MEMS allows for the creation of incredibly small machines, their relative tolerances (fabrication accuracy relative to part size) are far coarser than those achieved in traditional machining.
      • Standard machining: relative tolerances of about one part in a million.
      • MEMS: relative tolerances of about one part in a hundred.
    • This leads to issues like stiction (super-friction), hindering the creation of complex mechanisms like bearings, gears, and screws.
  • Feynman’s Emphasis on Precision:
    • Feynman recognized the need for improving precision at each stage of miniaturization.
    • He proposed techniques like lapping and polishing to achieve higher tolerances at smaller scales.
    • This iterative improvement of precision is crucial for achieving the Feynman path’s goals.

The Potential of the Feynman Path

  • Benefits of a MEMS-scale Machine Shop:
    • Even a millimeter-sized system with 10-micron parts, built using the Feynman path, would have immense value, particularly in medical applications.
    • Example: The COVID-19 pandemic highlighted the need for high-throughput microfluidic devices (e.g., for producing ionizable lipid nanoparticles, iLNPs) for vaccine manufacturing. A Feynman path approach could have significantly accelerated their production.
  • Synergy with Bottom-Up Approaches:
    • The Feynman path and bottom-up approaches (e.g., chemistry, molecular biology, DNA origami) are not mutually exclusive.
    • They can complement each other, with bottom-up methods producing atomically precise parts and top-down methods assembling them.
    • Example: Nanoscientists creating atomically precise nanogears could be utilized directly by Feynman-Path machines.
  • Accelerated Timeline for Nanotech:
    • Pursuing the Feynman path alongside bottom-up approaches could potentially cut the development time for “real nanotech” from 20 years to 10 years.

Key Questions and Challenges

  • Feasibility of Compact Macro-scale SRMs:
    • Can a compact, self-replicating machine be built using macroscopic parts and standard fabrication techniques?
    • We know that non-compact (global industrial infrastructure) and compact microscopic (bacteria) replicators exist, but the macro-scale compact case remains unexplored.
  • “Cheating” with Vitamins:
    • What external resources (“vitamins”) can be legitimately provided to the system at various scales?
    • Examples: control signals, pre-synthesized molecular gears, single-crystal materials, advanced surface treatment techniques.
  • Roadblocks and Phase Changes:
    • What are the specific challenges and fundamental shifts in physics that will be encountered during miniaturization?
    • Examples:
      • Motor Scaling: Transition from electromagnetic motors (poor at small scales) to electrostatic motors.
      • Ontological Phase Boundary: Shift from continuous materials to discrete atoms.
      • Changes in Physics:
        • Diminishing gravity.
        • Increased adhesion (stiction).
        • Loss of fluid lubrication.
        • Decreased stiffness of materials.
        • Quantum tunneling of hydrogen atoms.
  • Forming Techniques:
    • What methods can be used to shape and assemble materials at different scales?
    • Additive Manufacturing (3D Printing):
      • Likely to play a major role across a wide range of scales.
      • Macro-scale: Melting and depositing materials drop by drop.
      • Smaller scales: Electrodeposition and electroremoval.
      • Nanoscale: Atomically precise deposition and removal techniques.
  • Visualization and Measurement:
    • How can we observe and measure the fabrication process at smaller scales?
    • Scanning Probes: Limited in that they work well only on relatively flat surfaces.
    • Physical Surface Scanners: Likely to be necessary, drawing inspiration from existing contact measurement techniques in machining.

Plan of Attack

The Difficult (Immediate Goals)

  1. Design a Scalable SRM Workstation:
    • Develop a remotely operated workstation capable of manufacturing and manipulating parts, and replicating itself at its own scale and at one-quarter scale.
    • Utilize “vitamins” or external inputs as needed at different scales.
  2. Macro-Scale Implementation:
    • Build a physical prototype of the architecture at desktop scale, using materials like plastic.
    • Include operator controls (not intended for replication).
    • Identify scaling challenges and potential roadblocks.
    • Determine appropriate scaling steps.
  3. Simulation-Based Verification:
    • Use simulations to verify the scalability of the architecture through identified phase changes.
    • Example: Simulate the transition from electromagnetic to electrostatic motors.
  4. Develop a Roadmap:
    • Create a detailed and actionable roadmap for achieving the desired fabrication and manipulation techniques at the nanoscale.
    • Define the starting point based on current state-of-the-art fabrication technology, considering materials and scale.
    • The roadmap should guide the development of the actual scaling sequence.

The Impossible (Long-Term Goals)

  1. Identify the Optimal Starting Scale:
    • Determine the best scale for initiating the Feynman path based on available fabrication and assembly technology.
    • Consider the trade-off between building a smaller system directly or building a larger system capable of producing a smaller copy.
    • Example: The Nippon Denso microcar (one-thousandth scale) demonstrates the feasibility of manipulating micron-scale parts. Advances in nanotechnology since 1994 likely enable even finer tolerances and roughnesses.
  2. Additive Manufacturing Strategies:
    • Identify and develop appropriate additive manufacturing techniques for different scales.
    • Focus on deposition methods that can be adapted for progressively smaller feature sizes.
  3. Addressing Tolerance Challenges:
    • Recognize that atomic-scale tolerances become crucial even at micron-scale part sizes.
    • Focus on surface forming and reforming techniques to achieve the required precision.
    • Design machine elements with flat bearing surfaces (e.g., thrust bearings, sliders) to leverage crystal planes until strained shell circular bearings become feasible.
    • Example: Explore the use of multi-walled carbon nanotubes for such applications.

The Future of Additive Manufacturing and its Impact

  • H.G. Wells’ Vision:
    • Wells anticipated the revolutionary potential of additive manufacturing (3D printing), envisioning automated construction methods that surpass traditional building techniques.
  • Evolution of 3D Printing:
    • Future advancements will likely lead to:
      • Increased speed and lower cost of 3D printers.
      • Expanded range of printable materials.
      • Improved precision, enabling the creation of complex objects and machines.
  • Economic and Social Consequences:
    • Similar to the computer revolution, nanomanufacturing could lead to:
      • Decentralization of production.
      • Increased personal autonomy and creativity.
      • A Cambrian explosion of new physical machines and applications, analogous to the growth of software applications.
  • The Resurgence of the Industrial Revolution:
    • Nanotech has the potential to dramatically increase productivity, just as the Industrial Revolution did.
    • Example: The cost of producing a woolen tunic today is significantly lower than it was in the year 300, thanks to machine-based manufacturing.
  • The Promise of Nanotech:
    • Nanotech could lead to another leap in productivity, enabling us to accomplish tasks in a day that currently take a year.
    • This could drastically reduce the cost of complex products like flying cars.

Conclusion

  • The Feynman path to nanotechnology, though largely unexplored, offers a promising route to achieving atomically precise manufacturing.
  • It complements bottom-up approaches and could potentially accelerate the development of nanotech.
  • Addressing the key questions and challenges outlined above is crucial for realizing Feynman’s vision.
  • The potential benefits of nanotech are vast, ranging from medical advancements to increased personal autonomy and a resurgence of the Industrial Revolution.
  • Taking Feynman’s path is a long and difficult program, but it is a possibility with immense potential to transform our future.

Chapter 13: When the Sleeper Wakes

The Unfulfilled Promise of Future Technology

Science Fiction’s Role and The Kernel of Truth

  • Science fiction emerged when humanity envisioned a future beyond its current limitations.
  • Science fiction fans, unlike people of the past, believe their dreams have a kernel of truth and can be realized.
  • “The fantastic fiction of today may well become the fact of tomorrow.” - Sam Moskowitz, The Immortal Storm

Flying Cars: A Case Study in Technological Stagnation

  • Flying cars, a frequent topic in discussions about the future, are technologically feasible.
  • They have been built and flown since the 1930s.
    • Example: The autogyro had an innovative design and funding but faced challenges during the Great Depression.
  • Obstacles to development:
    • 1930s: Great Depression hampered marketing.
    • 1940s: World War II diverted engineering talent and resources to war production.
    • 1950s:
      • Aircraft regulations caused delays in supply.
      • Investment in highways and infrastructure reduced demand.
      • The government hindered Pitcairn’s efforts to develop flying cars.
    • 1960s:
      • Passenger jet travel and private aviation boomed.
      • Military developed VTOLs (Vertical Take-Off and Landing aircraft), like air jeeps.
    • 1970s:
      • The Henry Adams Curve, representing long-term energy growth, flatlined, impacting technology dependent on energy.
      • Academia shifted towards virtue-signaling and self-deception, hindering technological advancement.
      • A “Baptists and bootleggers” alliance formed between those who believed in progressive policies and those who benefited from the associated funding and influence.
      • Public spending and PhDs tripled between 1960 and 1980.
      • The “war on cars” gained traction, leading to policies that discouraged car use.
      • Supersonic flight was banned.
      • Bridge building, which peaked in the 1960s, declined, leading to increased traffic congestion (five times worse than in the 1960s).
    • Around 1980:
      • Liability law developments crippled the private aviation industry.
      • Regulation increased significantly.
      • Bureaucrats, unconcerned with costs, replaced those who balanced costs and benefits in decision-making.
      • The cost disease replaced the learning curve in the economy.
      • The nuclear industry faced escalating costs and stagnated.
      • Interest and research in nuclear physics declined.
  • Green Fundamentalism became a dominant ideology, contributing to pessimism despite objective improvements in living standards.

The Missed Opportunities

  • Flying cars could have been common by 1950 had the Depression and World War II not intervened.
  • The Henry Adams Curve flatline is the proximate cause for their absence today.
  • Complacent naysayers and bureaucrats have stifled progress.

Comparing Predictions: Flying Cars, Space Travel, and Nuclear Power

  • Science fiction writers accurately predicted the technology for flying cars and nuclear power.
  • Their overestimation of fusion power reflected the expert opinion of the time, which has proven roughly 50 years too optimistic.
  • Flying cars:
    • Development hindered by historical events, government actions, regulations, and liability concerns.
  • Space travel:
    • Received substantial government funding for military and prestige reasons.
    • Writers like Clarke were overly optimistic about colonization, while Heinlein predicted a government takeover hostile to technology and a pause in space travel.
  • Nuclear power:
    • The Manhattan Project, while costly, did not yield economically beneficial technology the way the B-29 program did.
    • Military restrictions hindered private development of nuclear power.
    • It took decades for nuclear power to become economically viable, and then it was stifled by regulation.
  • Computing:
    • Experienced a phase change from centralized computing centers to decentralized personal computers.
    • Combined with the internet, it progressed as predicted by technological optimists.
    • Asimov’s and Clarke’s predictions about applications like robots and information access were largely accurate.
    • Ordering online and receiving quick delivery is similar to predictions by Nolan (1927) and Heinlein (1938).
  • Other technologies, particularly high-energy ones, remain in the centralized era. Example: Airline travel.

Science and Technological Progress Despite Stagnation

  • Scientific knowledge has continued to advance:
    • Discovery of exoplanets and new bodies in our solar system.
    • Understanding of the molecular mechanisms of life and the human genome.
    • Mapping of the human brain.
    • Construction of neutrino telescopes and detection of neutrinos from the sun.
    • Detection of gravitational waves from merging black holes and neutron stars.
  • Many unfulfilled technological predictions were due to misplaced faith in culture, governance, and information providers, not inaccurate technological foresight.
  • Resources were diverted towards academia and virtue-signaling instead of improving lives.

A Hypothetical Pro-Technology Present

  • Scenario:
    • The best and brightest focused on science and engineering instead of activism and regulation.
    • Education, healthcare, and public works were not affected by the cost disease.
    • The private aircraft industry thrived.
    • The Henry Adams curve continued its upward trend.
    • Technology and quality of life continued to improve as in the previous 50 years.
  • Potential Outcomes:
    • Increased wealth: Average American income would be four times higher (median $200,000 vs. $50,000 today).
    • Wider homeownership: The top 50% of families could afford two homes, compared to the top 5% today.
    • Shift in GDP composition: More focus on machines and less on paperwork.
    • Reduced financial sector dominance: its share of GDP has doubled since 1980 and now exceeds that of the energy sector.
    • More efficient education and healthcare sectors: Similar outcomes as today, but at a lower cost.
    • Advanced high-power and high-tech experience: Further progress in areas like turbine aero engines.
  • Decentralization and Urban Expansion:
    • The Economist’s study on the economic history of cities (April 4, 2015) highlights the impact of transportation costs on city growth.
    • Declining transportation costs, especially time costs, would have spurred continued expansion into exurbs.
    • A flight-based transportation system would expand the reach of urban centers and make rural living more viable.
    • Businesses could be located almost anywhere.
  • Mountainside Living:
    • Economical mountainside living without roads or physical utilities would become possible.
    • Helicopter deliveries and personal VTOLs would address transportation needs.
    • Telecommunications are already readily available.
    • Power options:
      • Solar power with batteries.
      • Fuel cells fed by delivered fuel.
      • Advanced nuclear technology:
        • NASA’s kilopower reactor (10 kilowatts).
        • Chargeable atomic batteries (20 kilowatts, half-ton).
        • Isotopic batteries (direct electric conversion, as envisioned by Asimov).
      • Cellular power systems with small modular reactors.
      • Space-based solar power transmission (as studied in the 1970s High Frontier project).
      • Geothermal energy: Tapping into the Earth’s internal heat (sustained largely by natural radioactive decay).
  • Diaspora from Cities: Continued movement towards the sea, mountains, and other areas.
  • Space Travel and Exploration: Increased activity, but not large-scale colonization yet.
  • Nanotechnology: Early applications would be emerging, similar to personal computing in the 1980s.
  • Flying Cars: Early adopters would be transitioning to fuel cells.

The Current Landscape and Potential Renaissance

  • Flying car resurgence: Many new designs are being proposed, including battery and electric powered.
  • Investment in e-VTOL taxis: Potentially excessive, given the limitations of current battery technology.
  • Battery technology:
    • Potential for an eightfold increase in energy density and halving the price.
    • Suitable for ground cars and fixed-wing aircraft, but not for VTOLs requiring a 500-mile range.
  • Fuel cells: Promising for early nanotech applications and could replace batteries in existing aircraft designs.
  • Electric flying car enthusiasm: Likely sustainable, potentially on the verge of a technological renaissance.
  • Anhydrous ammonia: Gaining traction as a potential fuel for fuel cells (carbon-free and easier to handle than hydrogen).
  • Startups exploring small modular reactors: Both fission and fusion.
  • Biotechnology boom.
  • Drones, video phones, digital assistants, and robot vacuum cleaners.
  • Reusable rockets for space travel.

The Future of Technological Progress

  • Many new technologies may emerge in the coming century.
  • The question remains: Will these technologies recreate the transformative impact of the Industrial Revolution and significantly improve the lives of ordinary people?

Chapter 14: The Dawn of Robots

Epigraph: H. G. Wells, Things to Come

  • Quote: “And now for the world of the airmen, and a new start for mankind.”

Introduction: Rosie the Robot and the State of AI

  • Popular culture, like The Jetsons, envisioned robots like Rosie the Robot Housemaid.
  • Robots are technologically more challenging than flying cars.
  • Artificial intelligence (AI) and robotics are progressing, aligning with Asimov’s predictions.
  • This chapter examines the future of AI and robots, grounding predictions in current capabilities and avoiding limitations in imagination.

The Wright Brothers Analogy and the AI Boom

  • The author previously argued against a sudden AI takeoff driven by self-improving super AI.
  • Instead, he predicted a more gradual but effective progress:
    • AI would start to work, attract investment, and experience rapid development.
    • This progress would mirror the aviation boom after the Wright Brothers’ successful flight:
      • Initial skepticism followed by widespread recognition.
      • Rapid influx of resources and development effort.
      • Exponential growth in capabilities.
  • This AI boom is currently underway:
    • Deep learning in multi-level neural networks has brought significant advancements.
    • Deep learning is essentially back-propagation of error terms (used since the 1980s) but with improvements:
      • Small improvements combined with increased processing power (GPUs).
      • Training networks with many more levels (e.g., 22 vs. 2-3).
      • Faster training times (hours instead of weeks).
      • Availability of large datasets for training, providing broader experience.
    • These advancements are supported by progress in other AI areas (e.g., processing power, robotics).
    • Examples of recent AI advancements:
      • Robots learning manual tasks by observation and practice.
      • Improved speech recognition surpassing typing accuracy.

Example: Training a Deep Neural Net on Text

  • Experiment: Training a multi-level recurrent neural net on a 50 MB corpus of fiction novels.
  • Initial output: Random characters (e.g., “W98VWMA0N&W47T**KNQ4”).
  • After one night’s training: More English-like but nonsensical text (e.g., “The jets of the hundred for the idea of the mini-poor, from a seamed planet…”).
  • After a few days: More coherent, grammatically correct, and even creative text (e.g., “Dejah Thoris, he said, I’ll be somewhere…”).
  • Significance: The model learned language structure (words, sentences, punctuation) solely from predicting the next character in a sequence.
  • This demonstrates powerful learning capabilities that can be integrated with other AI techniques.
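
A minimal sketch of this kind of character-level setup, assuming PyTorch and a placeholder corpus.txt (hyperparameters and model size are illustrative, not the author's actual experiment):

```python
import torch
import torch.nn as nn

text = open("corpus.txt").read()               # placeholder corpus file
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text])   # corpus as integer codes

class CharRNN(nn.Module):
    def __init__(self, vocab, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.rnn = nn.LSTM(hidden, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, vocab)   # scores for the next character

    def forward(self, x, state=None):
        h, state = self.rnn(self.embed(x), state)
        return self.head(h), state

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=3e-3)
loss_fn = nn.CrossEntropyLoss()

seq = 128
for step in range(1000):
    # sample a random window; the target is the same window shifted by one
    i = torch.randint(0, len(data) - seq - 1, (1,)).item()
    x = data[i:i + seq].unsqueeze(0)
    y = data[i + 1:i + seq + 1].unsqueeze(0)
    logits, _ = model(x)
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()
```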

Prediction: AI Capabilities by the 2020s

  • Prediction: By the end of the 2020s, AI systems will achieve human-like performance in tasks involving reading, writing, talking, and listening.

AlphaGo: Mastering the Game of Go

  • AlphaGo was the first program to play Go at a professional level.
  • Go was previously considered a significant AI challenge:
    • Quote (from Beyond AI): “By contrast to chess, programs playing Go have yet to challenge even serious amateurs, much less the top professionals.”
    • Complexity: Go requires concept formation and dynamic assessment of game pieces (clusters).
    • Quote (from Beyond AI): “playing go well requires solving the real AI problem, that of creating useful concepts from experience.”
  • AlphaGo’s success:
    • Utilizes two deep neural nets integrated with a traditional game-playing program.
    • Rapid improvement: Beat top-ranked player Lee Sedol in 2016.
  • Criticisms and Rebuttals:
    • Some dismissed AlphaGo’s achievement as “brute force” due to its reliance on substantial computing power (1,920 CPUs and 280 GPUs).
    • Counterargument: Access to and effective deployment of computing power is a significant real-world advantage.
    • Google developed custom chips to accelerate deep learning algorithms.
  • Shift from Academia to Industry:
    • DeepMind’s AlphaGo paper had 20 authors.
    • Machine learning engineer is the fastest-growing job (LinkedIn 2017).
    • Numerous AI/machine learning conferences held annually (e.g., 225 in 2018).
  • Conclusion: The AI field is experiencing a boom similar to aviation after the Wright Brothers.

Robo Habilis: The Wozniak Test

  • Heinlein Quote (Time Enough for Love): Emphasizes the versatility of humans compared to specialized insects.
  • Steve Wozniak’s opinion: Robots will never be able to enter an unfamiliar house and make coffee.
  • The Wozniak Test: A proposed test for embodied AI (robots in the Asimovian sense):
    1. Robot enters an unfamiliar house.
    2. Finds and uses the doorbell/knocker.
    3. Explains its purpose and gains entry.
    4. Locates the kitchen and coffee-making supplies.
    5. Makes coffee to the householder’s taste (asking questions if needed).
    6. Serves the coffee in another room.
  • Challenges for current robotics:
    • Vision: Navigation, object recognition, gesture understanding.
    • Manipulation: Coordinating complex movements, physical modeling, learning loops.
    • Speech: Recognition, natural language understanding and generation.
    • Planning: Multi-level planning from manipulation paths to coffee brewing sequences.
    • Coordination: Integrating all capabilities towards the overall goal.
  • The Wozniak Test assesses adaptability and common sense.
  • It emphasizes general learning skills over pre-programmed routines.
  • Comparison to Homo habilis: The test evaluates tool use and recognition, a milestone towards human-level intelligence.
  • The Nilsson Test: A broader concept proposed by Nils Nilsson, suggesting that human-level AI should be able to perform ordinary human jobs (the “employment test”).

Limitations of Deep Learning in Robotics

  • Challenge: Applying deep learning to robotics is limited by the lack of vast real-world data.
  • Deep learning thrives on massive datasets (e.g., text, game records).
  • Real-world robotics data is expensive and difficult to collect.
  • Quote (Gary Marcus): “Rosie the robot, you can’t have it knock over your furniture a hundred thousand times to learn.”
  • Deep learning needs to improve its ability to learn quickly from limited experience.
  • Future research directions:
    • Learning from instruction.
    • Learning by imitation.
    • Learning by trial and error.

The Dawn of Robots and Realistic Expectations

  • We are at the beginning of the age of robots.
  • Expectations of robots are becoming more realistic.
  • Compared to other future technologies, robots are closest to happening on schedule.

Profession: Automation and Productivity

  • Quote (Philip Francis Nowlan, Armageddon 2419 A.D.): Depicts a future where machines replace human labor.
  • The Jetsons oversimplified the implications of automation.
  • Real-world trend: Technology has consistently increased productivity, requiring fewer people to produce more goods.
  • Examples:
    • Agriculture: Farmers in 1790 fed 1.1 people each; now they feed 40.
    • Manufacturing: US manufacturing jobs declined from over 33% in the 1940s to under 9% now, while output tripled.
  • Manufacturing productivity:
    • Output per worker increased from $25,000 in 1962 to $160,000 in 2012 (constant 2010 dollars).
    • Hyper-exponential growth: Productivity increases are accelerating.
  • Projected manufacturing worker productivity in 2062: $661,000 (constant 2010 dollars) or $1,854,392 (with 3% inflation).
  • The question of work in a highly automated future:
    • Thought experiment: Lining up the US population along I-80 in order of productivity.
    • Finding the cutoff point: Where the fewest people can maintain a desired level of consumption.
    • With 1790’s consumption levels and 2020’s technology, only a small fraction of the population would need to work.
  • Unemployment is not at 98% because we desire a higher level of consumption than our ancestors.
  • Keynes’ observation (1933): Technological advancements in economizing labor need to be matched by finding new uses for labor.
  • Labor force participation:
    • Historically around 90% until the mid-20th century.
    • Declined to around 70% for men and 50% overall by 2020.
  • Truck driver example:
    • Self-driving trucks are a potential source of job displacement.
    • However, automation often leads to increased efficiency within an industry rather than complete job elimination.
    • Trucking will likely follow the pattern of manufacturing, with fewer workers handling more freight.
  • Conclusion: Technological advancements have consistently improved living standards and reduced extreme poverty.

A Cleaner, Better Breed: Machine Ethics

  • Early AI ethics concerns: Focused on the potential threat of superintelligent AI (e.g., turning us all into paperclips).
  • Critique of early AI: Little attention was given to imbuing AI with a sense of morality.
  • Analogy to natural language processing:
    • AI struggled for decades to achieve competent language understanding and generation.
    • Moral reasoning might be similarly complex.
  • Lack of research in computational ethics: No significant work compared to computational linguistics.
  • Beyond AI (2007): One of the first books dedicated to machine ethics.
  • GPT-3 and the illusion of understanding:
    • GPT-3 is a powerful language model that can generate impressive text.
    • However, it lacks true understanding; it primarily relies on statistical patterns in its training data.
    • Understanding requires mental models that enable recognition, imitation, prediction, and manipulation of concepts.
  • The challenge of building a corpus of moral judgments:
    • Unlike language data, there’s no readily available massive dataset for training ethical AI.
  • Short-term solution: Asimov’s robots - “Do what the humans tell you to do.”
  • Humans can train robots through feedback and reinforcement.
  • Building a corpus of practical and moral judgments:
    • Robots can learn from human instructions and examples.
    • Sharing learned judgments can create a collective knowledge base for ethical decision-making.
  • The potential for morally superior robots:
    • Quote (Asimov): “They’re a cleaner, better breed than we are.”
    • Focusing on the positive aspects of human behavior can lead to robots with enhanced ethical capabilities.
  • Example: Ethical retail clerk robot: No extra effort needed to prevent it from stealing, highlighting the inherent reliability of robots.
  • Benefits of robots in positions of trust:
    • Reliability, 24/7 availability, no human biases or misconduct.
    • Reduced financial losses due to employee theft.
  • The Sorcerer’s Apprentice problem:
    • The hypothetical scenario of a robot tasked with maximizing paperclip production leading to disastrous consequences.
    • This scenario is considered unrealistic: It assumes a combination of superhuman intelligence and a lack of common sense.
  • Robots will be developed incrementally, with continuous improvements addressing potential flaws.
  • Example: Robot doctors:
    • Future robots could combine the knowledge of general practitioners and specialists, potentially exceeding human capabilities.
    • Benefits: Encyclopedic expertise, instant access to research, high IQ, no fatigue or errors.
  • Programming robots through natural language:
    • Advanced AI will understand human instructions, leading to easier and more intuitive programming.
  • Robots’ impact on quality of life:
    • Increased competence in professional services.
    • Affordable domestic help (like Rosie the Robot).
    • Self-driving cars (like personal chauffeurs).
    • Combined with nanotechnology, robots can significantly enhance living standards.
  • Conclusion: While challenges remain, the dawn of robots promises significant advancements and improvements to our lives. We can see enough of the future to be optimistic about the potential benefits of this technology.
  • “We can only see a short distance ahead, but we can see plenty there that needs to be done.” - Alan Turing.

Chapter 15: The Second Atomic Age

The Synergy of Nanotech and Nuclear Technology

  • The Second Atomic Age will be defined by the convergence of nanotechnology and nuclear technology, resulting in atomically precise machinery and a new era of atomic energy.
    • Like the relationship between chemical fuels and steel machines, nanotech enables and complements nuclear technology.

Nanotech’s Role in Nuclear Advancement

1. Isotopic Separation

  • Isotopes: Nuclei with the same number of protons but different numbers of neutrons.
    • Chemically identical, but differ in nuclear properties.
    • Examples:
      • U-235 (fissile) vs. U-238 (not fissile)
      • Xenon-134 (stable, low neutron cross-section) vs. Xenon-135 (radioactive, high neutron cross-section)
  • Current Limitations: Conventional chemical separation techniques are expensive and inefficient.
  • Nanotech’s Advantage: Atom-by-atom manipulation allows for sorting by weight or nuclear magnetic properties.
    • Challenges: Radioactive atoms can damage nanoscale machinery.
    • Benefits:
      • Cheaper isotopic separation.
      • Cleanup of low-level nuclear waste: “Given current technology, isotopic separation is far too expensive to be used to clean up stuff that has been exposed to neutrons. Low-level nuclear waste. But with nanotech, that process would be straightforward.”
      • Transformation of nuclear power into a clean, contained technology.

2. Building Extreme and Precise Nuclear Structures

  • Nanotech’s Ability: Construct precise structures at the atomic scale within nuclear mechanisms.
    • Influence on Nuclear Reactions: Enables the creation of intense electric and magnetic fields that can influence nuclear reactions.
  • Example: Californium-251 Reactor:
    • A working moderated reactor can be built with just 25 grams of Californium-251.
    • Challenges: Californium-251 is synthetic and currently produced in minute quantities.
  • Current Transmutation Limitations:
    • Neutron bombardment is the primary method for changing isotopes.
    • Highly reactive isotopes are more susceptible to damage during the process.
  • Nanotech’s Advantage: Individual atom handling allows for precise nuclear processing that is currently impossible with bulk technology.

3. Enhanced Productivity and Radiation Resistance

  • Continuous Reconstruction: Nanotech enables continuous replacement of reactor parts, minimizing radiation-induced defects.
  • Biological Analogy: Similar to the radiation resistance mechanism of Deinococcus radiodurans (Conan the Bacterium).
  • Benefits:
    • Reduced vulnerability to radiation’s health effects.
    • Cell repair machines could enhance radiation resistance.
    • Possibility of living in space.

The Power Imbalance: Nanotech and Energy Needs

  • Nanotech’s Power Requirements: Nanotech machines have extremely high power densities.
    • Drexler Electric Motor: 3 billion horsepower per pound.
  • Nuclear Power’s Advantage: Nuclear power densities are well-matched to nanotech’s energy needs, similar to the relationship between chemical fuels and steel machines.

Renewable Energy: Uranium from Seawater

  • Julian Simon’s Prediction: “It’s reasonable to expect the supply of energy to continue becoming more available and less scarce forever.”
  • Uranium in Oceans:
    • 4 billion tons of dissolved uranium.
    • Equivalent to over 100 quadrillion watt-years of energy.
    • Enough to power 10 billion people at the current American energy consumption level (10 kilowatts per person) for 10,000 years.
  • Current Technology: Economical extraction of uranium from seawater is nearing feasibility.
  • Nanotech’s Potential:
    • Simplified and efficient extraction.
    • Minimal environmental impact: Maintaining the homeostatic equilibrium of uranium concentration in seawater.
    • Quote: “If we start taking uranium out of seawater and use it for the entire world’s energy economy, indeed a robustly growing energy economy, the concentration in seawater will not decline for millions of years.”

Safe Reactor Design: The TRIGA Reactor

  • General Atomic’s TRIGA Reactors:
    • Based on Freeman Dyson’s design.
    • Inherent Safety: Do not require external quenching rods or cooling.
  • Negative Coefficient of Reactivity:
    • Mechanism: The hotter the fuel gets, the less it supports the fission chain reaction.
    • Opposite of an Implosion Bomb: In a bomb, compression increases criticality; in the TRIGA, thermal expansion reduces criticality (see the toy model after this list).
  • Safety Record: 70 TRIGA reactors have operated for 60 years without significant accidents.
  • Proof of Concept: Demonstrates the feasibility of designing inherently safe reactors.
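
A toy simulation of what a negative coefficient buys you, with every constant invented for illustration (this is not a real reactor model): after a step of reactivity is inserted, the fuel warms, the temperature feedback cancels the reactivity, and power levels off instead of running away.

```python
# Toy point-kinetics loop demonstrating a negative temperature
# coefficient of reactivity. All constants are illustrative only.

rho_step = 0.002      # reactivity inserted at t = 0
alpha = 1e-3          # reactivity lost per degree of fuel temperature rise
tau = 0.01            # effective power time constant, seconds
heat_cap = 50.0       # fuel heat capacity, arbitrary units per degree
cooling = 0.2         # fractional heat removal per second

power, temp = 1.0, 0.0    # relative power, temperature above ambient
dt = 0.01
for step in range(int(60 / dt)):
    rho = rho_step - alpha * temp            # net reactivity with feedback
    power += power * (rho / tau) * dt        # power follows net reactivity
    temp += (power / heat_cap - cooling * temp) * dt
    if step % int(10 / dt) == 0:
        print(f"t={step*dt:4.0f}s  power={power:7.2f}  dT={temp:5.2f}")
```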

Miniaturization of Reactors

  • Critical Mass in Small Reactors:
    • Plutonium-241: 0.246 kilograms (just over half a pound) in a neutron-reflected solution-based moderated reaction.
    • Californium-251: Less than an ounce (one-third of a teaspoon).
  • Aqueous Homogeneous Reactors:
    • Reactors with fuel dissolved in a liquid moderator.
    • History dating back to Los Alamos and Oak Ridge in the 1950s.
    • Theoretical Minimum Size:
      • Spherical aqueous homogeneous reactor powered by Americium-242m nitrate solution in water.
      • Mass: 4.95 kg (0.7 kg of Americium-242m fuel).
      • Diameter: 19 cm (7.5 inches).
      • Power output: A few kilowatts.
      • Potential applications: Space program and portable neutron sources.
  • Sandia Reactor Design:
    • Requires about 5 gallons of water for moderation and reflection.
    • Reactor core smaller than a cubic foot.

The Hanford B-Site Reactor and Xenon Poisoning

  • Hanford B-Site Reactor: The first full-scale reactor, built for plutonium production during the Manhattan Project.
  • Xenon-135 Poisoning:
    • Xenon-135, a fission byproduct, absorbs neutrons and temporarily shuts down the reactor.
    • Xenon-135 decays with a half-life of about 9 hours, allowing the reactor to restart after a period of inactivity (see the decay sketch after this list).
  • Overbuilding for Mitigation: The Hanford reactor was built twice as large as calculated to overcome xenon poisoning.
  • Nanotech Solution:
    • Liquid-core reactors with continuous fuel circulation and xenon-135 filtering.
    • Eliminates the need for overbuilding.
    • Self-quenching in case of filter malfunction.
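
For a sense of scale, a quick decay calculation (the half-life is approximate, and this ignores that iodine-135 keeps feeding xenon-135 for hours after shutdown, which makes the real transient worse before it gets better):

```python
import math

half_life_h = 9.1                   # Xe-135 half-life, roughly (hours)
lam = math.log(2) / half_life_h     # decay constant
for t in (0, 9, 18, 27, 36):
    print(f"{t:2d} h after shutdown: {math.exp(-lam * t):5.1%} Xe-135 left")
```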

Nanotech-Enabled Reactor Possibilities

  • Potential Size and Power:
    • Estimated Size: Closet-sized reactor.
    • Estimated Power Output: On the order of a megawatt.
  • NASA’s Kilopower Project:
    • Developing a 10-kilowatt reactor for space probes.
    • Solid U-235 molybdenum alloy core, the size of a roll of paper towels.
  • Economic Viability with Nanotech:
    • Nanotech’s productive power could make small-scale reactors economically feasible.
  • Benefits of Home Reactors:
    • Location Independence: No reliance on power lines or fuel deliveries.
    • Reduced Environmental Impact: No emissions.
    • Example: A 100-kilowatt home reactor would produce about an ounce of fission byproducts per year.

Fissionable vs. Fissile Isotopes

  • Thermal Neutrons: Neutrons slowed down to the energy level of atoms at room temperature (0.025 electron volts).
  • Fissile Isotopes:
    • Fissionable by thermal neutrons.
    • Capable of sustaining a chain reaction.
  • High Cross-Section for Thermal Neutrons: The probability of neutron capture is higher for thermal neutrons.
  • Moderated Chain Reactions: Use smaller quantities of fissile material due to neutron moderation.
  • Fissionable Isotopes:
    • Not fissile but can be fissioned by high-energy neutrons.
    • Example: Uranium-238.

Chainless Reactors

  • Mechanism: Use a source of high-energy neutrons to fission non-fissile isotopes like U-238.
  • Safety Advantages:
    • Runaway-proof and meltdown-proof.
    • No chain reaction.
  • Fuel Advantages:
    • Can use natural unenriched uranium or thorium.
    • Eliminates the need for enrichment facilities, reducing proliferation risks.

Sources of High-Energy Neutrons

1. Particle Accelerators

  • Accelerator-Driven Sources: Neutrons carry no electric charge and cannot be accelerated directly; accelerators instead fire charged particles at targets that emit neutrons.
  • Spallation: Bombarding materials with charged particles (e.g., protons) to release neutrons.

2. Fusion

  • Deuterium-Tritium Fusion: Produces helium and a 14 MeV neutron.
  • Lattice Confinement Fusion:
    • Pre-loading erbium with deuterium and bombarding it with high-energy particles to induce fusion.
    • Under investigation as a spacecraft power source.

Assassins and Accidents: The Case of Polonium-210

  • Alexander Litvinenko’s Assassination: Killed by polonium-210 poisoning in 2006.
  • Polonium-210:
    • Vigorous alpha emitter.
    • Fatal dose: 1 microgram if introduced internally.
  • Addressing Concerns about Nuclear Safety:
    • Cost and Rarity: Polonium-210 is expensive and rare, not a byproduct of power generation.
    • Comparison to Botulinum Toxin: Botulinum toxin is more deadly and can be produced accidentally.
  • Toxicity vs. Public Hazard:
    • A substance’s toxicity alone does not determine its public hazard.
    • Factors to consider: Quantity handled, purpose of use, precautions taken.
  • Example: Tritium:
    • Can be dangerous if incorporated into water and absorbed by the body.
    • High cost ($30,000 per gram) incentivizes careful handling.
  • Handling Large Quantities of Fuel vs. Small Quantities of Nuclear Material:
    • The high energy density of nuclear materials means smaller quantities are needed, reducing handling risks.
    • Example: Half an ounce of thorium has the same energy content as 60,000 pounds of gasoline.
  • Nanotech’s Role in Accident Mitigation:
    • Cell repair machines and artificial organs could counteract radiation exposure.

Cold Fusion: Reassessing the Possibilities

  • Shifting Perceptions: Cold fusion research has gained some credibility over time.
  • Evidence of a Heat Effect: More reproducible experiments suggest a genuine heat effect, though the fusion pathway remains unclear.
  • The Navy’s Role: Research at the Naval Surface Warfare Center has produced peer-reviewed publications and patents related to cold fusion.

The Three Miracles of Cold Fusion

  • John Huizenga’s Critique: Identified three key challenges that a theory of cold fusion must address:
    • 1. The Coulomb Barrier: The energy required for nuclear fusion is much higher than what electrochemical processes can provide.
    • 2. Reaction Pathways: Deuterium fusion typically produces tritium and a proton or helium-3 and a neutron, which are not detected in expected quantities.
    • 3. Energy Output: Even if fusion occurs, the expected high-energy gamma rays are not detected. Instead, heat is observed.

Peter Hagelstein’s Quantum Coherence Theory

  • Quantum Coherence: Proposes a direct quantum mechanical coupling between nuclear transitions and phonons (quantized vibrations in solids).
  • Mechanism:
    • Phonons could provide a channel for excitation to overcome the Coulomb barrier.
    • Phonons could influence the reaction pathway and facilitate energy dissipation as heat instead of radiation.
  • Experimental Support: Optical stimulation at specific phonon frequencies has been shown to enhance the excess heat effect in some experiments.

Implications of a Quantum Coupling

  • Relativistic Effects: The coupling between phonons and nuclear degrees of freedom might involve relativistic terms in the Dirac equation that are usually neglected.

Potential of Cold Fusion (Assuming it’s Real)

  • Challenges:
    • Palladium (a key material in some cold fusion experiments) is rare and expensive.
    • Current cold fusion experiments are not economically viable.
    • Example: Mitchell Swartz’s NANOR components produce 100 milliwatts of heat from 1 milliwatt of electricity but cost over $100,000 to make.
  • The SPAWAR Navy Labs Approach:
    • Co-deposited palladium with deuterium during electrolysis.
    • Detected high-energy neutrons, suggesting a potential pathway for chainless reactors.
  • The Need for Nanotech:
    • Nanotech could enable precise analysis and control of atomic-scale conditions to unlock and optimize cold fusion.

The Real Tragedy of Cold Fusion

  • Loss of Imagination in Research: The cold fusion controversy highlights the decline of purely experimental, observation-driven research.
  • Failure of Imagination: The inability to explore unconventional ideas has become a significant barrier to scientific progress.

Hot Fusion: The Power Source of the Future (Still)

  • Lewis Strauss’s Prediction (1954): “Energy too cheap to meter” from fusion power.
  • Over-Expectation: Science fiction writers and even scientists were overly optimistic about the timeline for fusion power.
  • Challenges of Replicating Solar Fusion:
    • Extreme Conditions in the Sun’s Core: 15 million degrees Celsius, 265 billion atmospheres pressure, extremely dense plasma.
    • Slow Reaction Rate: The first step in solar fusion (proton-proton fusion) takes an average of a billion years.
    • Low Power Density: A volume of the sun’s core the size of a car engine produces only half a watt of power.

Hydrogen Bombs and Fusion

  • Differences from Solar Fusion:
    • Uses deuterium, tritium, and lithium, bypassing the slow initial steps of solar fusion.
    • Ignition temperature of 400 million degrees Celsius (from a fission bomb).
  • Challenges of Thermonuclear Bomb Design:
    • Took five years after WWII to develop a working design.
    • Requires a complex sequence of four explosions.
  • Project Pacer:
    • A 1970s study exploring the use of thermonuclear bombs for power generation.
    • Technically feasible but faced logistical and economic challenges.

Progress in Hot Fusion

  • Technological Advancements: Superconductors and other technologies have enabled smaller and more efficient fusion reactor designs.
  • Private Sector Involvement: Numerous startups are actively researching various fusion approaches.
  • Challenges Remain:
    • Radiation and Neutron Production: Fusion reactions, particularly deuterium-tritium fusion, produce significant radiation and neutrons.
    • The Quest for Aneutronic Fusion: Fusion reactions that produce most of their energy as alpha particles (helium nuclei) are highly desirable.
      • Examples: Proton-Boron-11 and Proton-Lithium-7 reactions.

Examples of Aneutronic Fusion Research

  • TAE (formerly Tri-Alpha) and LPP Fusion: Using magnetic confinement for aneutronic fusion.
  • Laser-Driven Fusion: High-intensity lasers can be used to initiate fusion reactions.
    • Example: Simulation study at the University of New South Wales demonstrating the potential for capturing 1 gigajoule of electricity from a 14-milligram sample of hydrogen and boron-11 using a laser-driven approach.

The Second Atomic Age: A New Era of Energy Abundance

  • Transformative Potential: The combination of nanotech and advanced nuclear technologies could unlock unprecedented energy abundance.
  • Henry Adams’s Analogy:
    • “We will have been like a boy playing on the seashore, diverting ourselves in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of energy lay all undiscovered before us.”

  • Overcoming Impossibilities: Technological advancements have consistently surpassed expectations throughout history.
  • The Importance of Pursuing Force (Energy):
  • “He had seen the number of minds engaged in pursuing force, the truest measure of its attraction, increase from a few scores or hundreds in 1838 to many thousands in 1905, trained to sharpness never before reached, and armed with instruments amounting to new senses of indefinite power and accuracy, while they chased force into hiding places where nature herself had never known it to be, making analyses that contradicted being and syntheses that endangered the elements.” - Henry Adams, The Education of Henry Adams

Chapter 16: Tom Swift and His Flying Car

Feasibility of Flying Cars

Current Technological Feasibility

  • A “parked-in-your-garage” flying car is feasible with today’s technology.
  • Pre-1970 Advancement Rates: Continued advancement at these rates would have made flying cars affordable for many.
    • These cars would likely have:
      • Visible wings, rotors, or ducted fans
      • Turbine generators for power
      • Cruise speeds under 250 knots
      • Potentially foldable designs for road driving
      • Compact size for fuel efficiency

The Great Stagnation and Energy Poverty

  • The Great Stagnation: A period of slowed technological advancement, particularly in energy technology.
  • Ergophobic Religion: A viewpoint that regards energy as undesirable or problematic.
  • Energy Poverty: A major obstacle to achieving the futuristic vision of flying cars.
  • We have limited ourselves through our attitudes towards energy, but we can change our mindset and regain our potential.

Hypersonic Runabouts and the Future of Flight

Electric Jets and Propulsion Technologies

Electric Ducted Fans
  • Electric Jet (Model Aircraft): A ducted fan powered by an electric motor.
  • Modern Technology: Rare earth magnets and other innovations allow for electric motors with power-to-weight ratios comparable to turbines.
  • Superconducting Motors: Full-size superconducting motors could achieve even better ratios.
  • Optimization: Electric motors enable better placement and control of thrusters, potentially allowing for separate lift and cruise motors.
  • Schubeler DS215DIA HST: Example of a high-thrust electric ducted fan (see the arithmetic check after this list):
    • 56 pounds of thrust from 15.6 kilowatts (21 horsepower)
    • 8-inch package
    • 75 units could lift a flying car (noisy)
    • 1,170 kilowatts total power needed
  • Winged Aircraft with Runway:
    • Only about 10 units needed (150 kilowatts)
  • Advantages of Multiple Small Fans:
    • Increased reliability (redundancy)
    • Improved flight efficiency (better airflow control)
    • Reduced noise
  • NASA’s X-57 Maxwell: Example of an electric airplane using multiple small fans.
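
The Schubeler figures multiply out as quoted; a back-of-envelope check, not a design:

```python
thrust_lb, power_kw = 56, 15.6             # per-unit figures quoted above
units = 75
print(units * thrust_lb, "lb of thrust")   # 4,200 lb: a 3,000-lb car with margin
print(units * power_kw, "kW of power")     # 1,170 kW, matching the note
```
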
Battery Limitations
  • Weight: Batteries are significantly heavier than equivalent energy in fuel.
    • Example: 360 pounds of Avgas (aviation fuel) vs. 3 tons of lithium-ion batteries for the same energy.
    • Cost of batteries: $170,000 (without charging circuitry)
  • Spontaneous Combustion: Lithium-ion batteries are prone to catching fire.
    • Examples: Electric aircraft companies experiencing battery-caused ground fires.
  • Crash Safety Concerns: Ruptured batteries release energy instantly, potentially causing more severe fires than ruptured fuel tanks.
    • Example: Tesla crash requiring 30,000 gallons of water over 4 hours to extinguish.
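
The fuel-versus-battery gap can be reproduced with round numbers. All of the constants below are assumptions (roughly 44 MJ/kg for Avgas, 200 Wh/kg cells, 30% piston-engine versus 90% electric drivetrain efficiency), not figures from the book:

```python
avgas_kg = 360 / 2.205              # 360 lb of Avgas in kilograms
chem_mj = avgas_kg * 44             # ~44 MJ/kg specific energy (assumed)
useful_mj = chem_mj * 0.30          # ~30% piston-engine efficiency (assumed)
batt_mj_per_kg = 0.72 * 0.90        # ~200 Wh/kg cells, ~90% drive efficiency
print(round(useful_mj / batt_mj_per_kg / 1000, 1), "tonnes of batteries")  # ~3.3
```
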
Fuel Cells
  • Fuel Cells: Electrochemical devices that convert chemical energy into electricity.
  • Improvements since the 1960s: Steady improvements in efficiency and power density.
  • Current State-of-the-Art:
    • Ultralight custom-built fuel cells at $10,000 per kilowatt.
    • Power density comparable to piston engines (1 kilowatt per kilogram).
    • Example: 250-pound piston engine vs. 300-pound fuel cell equivalent.
  • Limitations:
    • Suitable for conventional airplanes and helicopters, but not compact VTOLs.
    • Hydrogen Storage: Fuel cells typically use pure hydrogen, which is difficult to store and handle.
      • Hydrogen causes metal embrittlement and sealing issues.
      • Hydrogen is very light, requiring large storage containers.
Ammonia as a Hydrogen Source
  • Ammonia (NH3): A potential solution for hydrogen storage.
  • Advantages:
    • Higher hydrogen density per gallon than liquid hydrogen.
    • Widely produced industrial chemical with existing infrastructure (pipelines).
    • Price per energy comparable to fossil fuels.
  • Ammonia Fuel Cells:
    • Advanced from laboratory to commercial deployment in the 2010s.
    • Currently used for remote power generation and backup, but too heavy for vehicles.
    • Potential for improvement in the coming decade.
Turbine Generators
  • Turbine Generators: As of 2021, the best option for powering electric fan VTOLs.
  • Availability: Can be bought off-the-shelf (used as auxiliary power units in airliners).
  • Weight Advantage: Turbine generator plus electric fan motors can be lighter than a direct-coupled piston engine.
  • Future Potential of Fuel Cells: Fuel cells are expected to become lighter and quieter in the future.
Hypersonic Runabout Power Source
  • Alan Shepard’s Mercury Redstone: Carried 50,000 pounds of fuel and oxidizer (7 terajoules of energy).
  • Energy Requirements for Hypersonic Flight: 50 gigajoules for a 3,000-pound car (140 times less than the Redstone).
  • Uranium as a Potential Fuel: With efficient energy extraction, a hypersonic trip could cost about 15 cents worth of uranium (0.1 grams).
Ionic Thrusters
  • Ionic Thrust: A method of propulsion that uses electric charge to accelerate air.
  • Mechanism:
    • Electric charge sprayed into the air, creating ions.
    • Electric field applied to the charged air, generating thrust.
  • History: Ionic fans have been laboratory curiosities for decades.
  • MIT Research (2013): Rigorous evaluation of ionic thrust by Steven Barrett and Kento Masuyama.
  • Findings:
    • Ionic thrusters could produce over 100 newtons (22 pounds) of thrust per kilowatt.
    • Far more efficient than electric ducted fans, which need roughly 6.25 kilowatts per 100 newtons.
    • A 3,000-pound car could achieve VTOL with about a 180-horsepower motor (see the sketch after this list).
  • Limitations:
    • High efficiency depends on moving a large volume of air slowly (similar to helicopter blades).
    • Safety Concerns: High voltage sprayed into the air presents potential hazards (especially in rain).
  • MIT’s First Powered Flight with No Moving Parts (2018): Demonstrated the potential of ionic thrust.
  • Power Supply Weight: High-power, high-voltage power supplies are currently heavy but could be improved with second atomic age technology.
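
Plugging the MIT figure into the hover requirement gives the quoted motor size; a rough check that inherits the caveat about moving large volumes of air slowly:

```python
weight_n = 3000 * 4.448             # 3,000 lb of hover thrust in newtons
power_kw = weight_n / 100           # MIT figure: >100 N of thrust per kW
print(round(power_kw), "kW =", round(power_kw * 1.341), "hp")   # ~133 kW ≈ 179 hp
```
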
Stratospheric Applications of Ionic Thrust
  • Stratospheric Advantages: Thin, cold, and dry air is ideal for ionic thrust.
  • Natural Ionization: The sun already ionizes the air at high altitudes.
  • Ozone Layer Benefit: Ionic thrusters could potentially benefit the ozone layer.
  • Potential Synergy with Nuclear Energy: Beta emission (high-energy electrons) could be used to power ionic thrusters.
Circulation Control
  • Circulation Control: A technique that uses high-pressure air jets to enhance lift and control airflow.
  • Dyson Bladeless Fan: An example of circulation control in a consumer product.
  • Benefits:
    • Improved lift-to-drag ratios
    • High-lift effects without external moving parts
    • Increased effective wing aspect ratio
  • Advantages for Telescoping/Folding Wings: Eliminates the need for hinges, flaps, and actuators, leading to lighter wings.
  • Research and Development: NASA and other institutions have been researching circulation control since the 1970s.
  • Mark Moore’s Research: Exploring the use of circulation control in propeller ducts for optimizing performance in different flight regimes.

Designing Your Flying Car

Wingless Aircraft and Lifting Bodies

  • Do We Need Wings? Wingless aircraft, known as lifting bodies, have been tested.
  • NASA’s M2-F1:
    • Car-sized lifting body built of plywood and steel.
    • Capable of gliding at 100 knots.
    • Towed by a Pontiac in many tests.

Stability Considerations

  • Airplane Tie-Downs: Airplanes are susceptible to being blown around by wind gusts.
  • Car Stability: Cars are much less affected by wind due to their lower profile and higher weight relative to surface area.
  • Wing Loading: Weight of the aircraft per unit area of the wing.

Designing a Car-Sized Lifting Body

  • Assumptions:
    • One-ton aircraft with 1,000-pound payload (typical for a small plane).
    • Multiple small thrusters, circulation control, and a lifting body design.
    • Wing loading of 30 pounds per square foot.
  • Comparison to Other Aircraft:
    • High for a small plane (Piper Cub: 6, Beechcraft Bonanza: 25).
    • Low for a jet (Cessna Citation: 40, Boeing 747: 150).
  • Takeoff and Landing Speed: Around 100 knots.
  • Low Aspect Ratio Wing: Requires air curtains and boundary layer control for efficient flight.
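
The ~100-knot figure follows from the lift equation, lift = ½ρV²SC_L, set equal to weight; a sketch with an assumed maximum lift coefficient (real values depend on the boundary-layer control discussed above):

```python
import math

wing_loading_pa = 30 * 4.448 / 0.0929    # 30 lb/ft^2 expressed in pascals
rho = 1.225                              # sea-level air density, kg/m^3
cl_max = 1.0                             # assumed maximum lift coefficient
v_ms = math.sqrt(2 * wing_loading_pa / (rho * cl_max))    # lift equals weight
print(round(v_ms), "m/s =", round(v_ms * 1.944), "knots") # ~48 m/s ≈ 94 knots
```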

Arthur C. Clarke’s Vision of Aircars

  • Childhood’s End: Depicts small, wingless aircars propelled by jet reaction and boundary layer control.
  • Differences from Potential Reality:
    • Flying cars would likely be larger than ground cars to interact with enough air for flight.
    • Crumple zones would be needed for ground safety.
  • High Wing Loading: Smaller cars or heavier full-size cars could be used to increase wing loading, improving stability and reducing turbulence sensitivity.
  • Power Requirements: 1,000 horsepower turbine or equivalent power source would be needed due to the use of forced air for circulation control.

VTOL Capability and Drag Considerations

  • VTOL Feasibility: Theoretically possible with down-blowing fans in the trunk and hood.
  • Historical Examples: Air jeeps of the 1960s used similar designs.
  • Space Constraints: Room for engine, passenger compartment, and other components is a challenge.
  • Schubeler Electric Jets:
    • Only 20 square feet of footprint needed for 3,000 pounds of lift (54 jets, 850 kilowatts).
    • De-rated jets (10 kilowatts, 64 jets) could be used for lower power consumption (640 kilowatts).
  • Doak VZ-4: Used 1,000 horsepower in 50 square feet of lift fan area.
  • Footprint Options:
    • 50 square feet for passengers, engine, etc.
    • Expanding to 100 square feet (similar to a Ford F-150 or 1959 Cadillac) is possible.
  • Battery Limitations: Current battery technology would limit flight time to about 10 minutes.
  • Turbine Power: Currently the most practical power source for VTOL flying cars.
  • VTOL vs. Range/Load Trade-Off: VTOL capability reduces range and/or payload capacity.
  • Doak VZ-4 Range: One hour of flight time.
  • Passenger Capacity: VTOL design likely limits capacity to two passengers.

Nanotech and the Future of Flying Cars

  • Nanotech Advantages: Eliminates the need to thermalize chemical energy, leading to higher efficiency and lighter engines.
  • Fuel Efficiency: Three to four times more usable energy from fuel.
  • Hydrogen Fuel: Three times the energy-to-weight ratio of gasoline.
  • Nanotech Power Mills: Could further increase efficiency by a factor of 3.
  • Hydrogen Storage: Lighter than carbon-based fuels, but potentially just as bulky.
  • Engine Weight Reduction: Nanotech could significantly reduce engine weight.
  • Structural Weight Reduction: Nanotech could also reduce the weight of the aircraft structure.
  • Morphing Structures: Nanotech enables the creation of powered, morphing wings and propellers.
  • Impeller Arrays: Instead of propellers or ducted fans, designs could use arrays of smaller impellers.
  • Advantages of Impeller Arrays:
    • Robustness to individual impeller failure
    • Improved control responsiveness
    • Reduced noise and turbulence
  • Quiet VTOL: Nanotech could enable VTOL aircraft that are significantly quieter.
  • Design Challenge: Maintaining subsonic airflow and minimizing turbulence.

Autopilots for Flying Cars

Autopilot Technology

  • Autopilot Capabilities: Even older airplanes can have autopilots that navigate to destinations.
  • Mechanical Autopilots: Systems for maintaining heading and altitude have existed since the 1930s.
  • GPS Integration: Modern autopilots use GPS for navigation.
  • Commercial Autopilot Systems: Can land planes automatically.
  • Garmin Autoland System: An example of an emergency auto-landing system available in high-end private aircraft.
  • Autoland Functionality:
    • Alerts air traffic control
    • Plots a course to the nearest airport
    • Flies to the airport and lands
  • Autopilot Reliability: Autopilots can fail, and pilots must be ready to take over.
  • Pilot Training: Learning to turn off the autopilot is a crucial first step in pilot training.
  • Autopilot Kill Switches: Multiple ways to disable the autopilot are typically provided.

Autopilots for Flying Cars

  • Safety Considerations: The reliability of autopilots is a key concern for flying cars.
  • Human Pilot vs. Autopilot Reliability: The goal is to make autopilots at least as reliable as human pilots.
  • Pilot Instincts: Humans lack natural instincts for flight, making piloting more challenging than driving.
  • VTOL Difficulty: Flying in VTOL mode is particularly challenging.
  • Remote-Controlled Quadrotor Experience: VTOL drones are significantly harder to fly than conventional airplanes.
  • VTOL Crash History: Many experimental VTOL aircraft have been crashed by experienced pilots.
  • Automatic Pilot with Artificial Sensorium:
    • A potential solution for safer VTOL flight.
    • Direct connection to sensors that provide detailed information about airflow and aircraft status.
  • Existing Technology: Commercial airplanes and hobby drones already have sophisticated autopilots.

Highways in the Sky

Roadway Infrastructure

  • U.S. Roadway Mileage: Approximately 4 million miles (2.6 million paved).
  • Interstate System: About 50,000 miles.

Airspace as Usable Right of Way

  • Contiguous U.S. Area: Around 3 million square miles.
  • Usable Altitude: About 10 miles.
  • Safe Altitude for Non-Pressurized Aircraft: 2 miles.
  • Current Aircraft Separation Regulations:
    • 1,000 feet vertical separation
    • 5 miles horizontal separation
  • Limitations of Current Regulations: Based on manual altitude control and voice communication with air traffic controllers.
  • Potential for Closer Spacing: Aircraft could safely fly closer together with advanced electronic control systems.
  • Hypothetical Airspace Capacity:
    • Current regulations: 30 million miles of high-speed “highway” (600 times the length of the interstate system).
    • 1-mile horizontal separation: 150 million miles of lanes.
  • Potential Number of Aircraft: Millions of aircraft could potentially be accommodated in the airspace simultaneously.
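
The lane-mileage figures reduce to simple geometry; a sketch of the arithmetic using roughly 52 flight levels (10 usable miles at 1,000-foot vertical spacing):

```python
area_sq_mi = 3_000_000            # contiguous U.S. land area
layers = 10 * 5280 // 1000        # 10 usable miles at 1,000-ft spacing ≈ 52
for sep_mi in (5, 1):             # horizontal lane separation in miles
    print(f"{sep_mi}-mile spacing: {area_sq_mi / sep_mi * layers:,.0f} lane-miles")
```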

Limitations of Current Air Traffic Control (ATC)

  • ATC System: Outdated compared to modern distributed control systems like the internet.
  • Voice Communication: ATC relies on voice communication in English with human controllers.
  • Capacity Limitations: The system often operates near its capacity limits.
  • Fragility: The system is vulnerable to disruptions (e.g., the Chicago Center incident).
  • Expansion Challenges: Difficult and costly to expand the current system.
  • Inability to Handle Flying Car Traffic: The current ATC system could not manage the traffic volume of widespread flying car usage.

Decentralized Traffic Control

  • Ground Traffic Control Analogy: We don’t need centralized control for ground traffic (cars or pedestrians).
  • Bird Flocking: Birds demonstrate the feasibility of complex flight patterns with decentralized control.
  • Seagull Flock Tornado: An example of highly coordinated flight without centralized control.

Technologies for Decentralized Aircraft Control

  • Anti-Collision Strobes: Visible at night, but would create excessive light pollution with high traffic density.
  • Radar Frequency Beacons: Could transmit aircraft position, altitude, and course data.
  • Autopilot Integration: Autopilots could use beacon data to maintain safe separation (see the sketch after this list).
  • Cell Phone Tower Network:
    • Approximately 200,000 cell towers in the U.S. (one per 15 square miles, average distance of 4 miles).
    • Aircraft are typically in view of 10-100 towers at cruising altitude.
  • Tower-Based Aircraft Tracking: Towers could be equipped to track and communicate with aircraft, similar to how they handle cell phones.
  • GPS and Inter-Car Communication: These technologies, combined with tower-based tracking, could enable decentralized air traffic management.
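
A toy sketch of the decentralized idea: each aircraft reads only the broadcast positions of nearby traffic, as a beacon system would provide, and nudges itself away from anything inside its separation bubble. Two-dimensional, point aircraft, invented numbers:

```python
import random

SEPARATION = 1.0   # desired minimum spacing, arbitrary units

def avoidance(own, traffic):
    """Velocity nudge pointing away from every intruder inside SEPARATION."""
    dvx = dvy = 0.0
    for x, y in traffic:
        dx, dy = own[0] - x, own[1] - y
        dist = (dx * dx + dy * dy) ** 0.5
        if 0 < dist < SEPARATION:
            dvx += dx / dist * (SEPARATION - dist)
            dvy += dy / dist * (SEPARATION - dist)
    return dvx, dvy

planes = [[random.uniform(0, 3), random.uniform(0, 3)] for _ in range(20)]
for _ in range(200):                      # each plane acts on local data only
    for p in planes:
        dvx, dvy = avoidance(p, [q for q in planes if q is not p])
        p[0] += 0.1 * dvx
        p[1] += 0.1 * dvy

gaps = [((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
        for i, a in enumerate(planes) for b in planes[i + 1:]]
print(f"closest pair after settling: {min(gaps):.2f}")
```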

Automatic Dependent Surveillance-Broadcast (ADS-B)

  • ADS-B: A system that broadcasts aircraft position and altitude data to ground stations and other aircraft.
  • Implementation:
    • Uses 634 government-run ground stations (covers most of the continental U.S.).
    • ADS-B Out (mandatory in some areas): Aircraft transmit their position and altitude.
    • ADS-B In (optional): Aircraft receive traffic information from the system.
  • Benefits:
    • Provides more complete and accurate information to air traffic controllers.
    • Gives pilots situational awareness they wouldn’t otherwise have.
  • Limitations: Still requires a major overhaul of ATC to handle flying car traffic.

Unmanned Aircraft System Traffic Management (UTM)

  • UTM: A system designed to manage low-altitude drone operations.
  • Purpose: To organize, coordinate, and manage multiple Beyond Visual Line of Sight (BVLOS) drone flights.
  • Current Capabilities: Primarily supports automatic notification of airports about drone activity.
  • Potential for Development: Increased drone usage and experimentation could drive advancements in UTM.

Drone Traffic Control Systems

  • Industry Projects: Various companies are developing distributed drone traffic control systems.
  • Drone Coordination Research: Active research and development in this area.
  • Drones as a Testing Ground: Drones provide a platform for exploring and testing decentralized air traffic management solutions.

Conclusion

  • Technical Feasibility: Building a flying car with VTOL capability and high speed is technically possible today.
  • Cost: Currently too expensive for the average person, but nanotech could make it affordable in the future.
  • Noise Reduction: Nanotech could also lead to quieter flying cars.
  • Spectrum of Flying Car Capabilities: A range of options will likely exist, from small airport-based planes to high-speed, long-range VTOL vehicles.
  • Performance vs. Price: Unlike ground cars, the price of flying cars will likely be directly related to their capabilities.
  • The Ultimate Flying Car: Full second atomic age technology could lead to private spaceships.

Chapter 17: Escape Velocity

Jules Verne’s Vision of Space Travel

  • Quote:
    • “In spite of the opinions of certain narrow-minded people, who would shut up the human race upon this globe as within some magic circle it must never outstep, we shall one day travel to the moon, the planets, the stars, with the same facility, rapidity, and certainty as we now make the voyage from Liverpool to New York.” - Jules Verne, From the Earth to the Moon

Technological Stagnation in Space Travel

  • Question: How can futurists be optimistic about future technology when space travel, a highly hyped technological application, has seen regression rather than progress since 1962?
  • Answer 1 (Political Stunt):
    • Project Apollo was a political stunt.
    • Once publicity was achieved, there was no reason to continue moon missions due to the cost.
    • Repeating political stunts is ineffective.
  • Answer 2 (Bureaucracy):
    • Technological stagnation is caused by bureaucratic ossification and incompetence.

The Challenger Disaster and NASA’s Bureaucracy

  • Example: The Challenger disaster on January 28, 1986, illustrates bureaucratic failure.
    • Cold Weather and O-Rings: A cold wave stiffened the O-rings in the Space Shuttle Challenger’s solid rocket boosters, causing a gas leak and the vehicle’s breakup.
    • Rogers Commission and Feynman’s Demonstration: The Rogers Commission investigated the disaster, with Richard Feynman demonstrating the O-ring issue using ice water.
    • Ignored Warnings: Allan McDonald, director of the Solid Rocket Motor Project at Morton Thiokol, warned about the O-ring problem and refused to approve the launch.
    • Punishment for Foresight: McDonald was demoted for his responsible actions.
    • Communication Gap: Feynman threatened to resign from the commission unless it addressed the communication gap between NASA engineers and management.
  • Dilbertization of NASA: NASA’s bureaucratic issues worsened over time, though the agency represented a best-case scenario in the 1960s due to its talented personnel, common goals, and well-defined tasks.
  • Shuttle Program Failure: The Space Shuttle program, aimed at reducing the cost to orbit, ultimately failed to meet its goals.
    • Cost Goal vs. Reality: The goal was $100 per pound; the lowest achieved was $10,000 per pound.
    • Launch Frequency Goal vs. Reality: The goal was once a week; the average was once in 11 weeks.
    • Shuttle Reliability: The shuttle had a 1.5% failure rate, resulting in the death of everyone on board during failures.
    • Airliner Comparison: If airliners had the same reliability, there would be 1,600 fatal crashes daily.

The Failure of Space Colonies

  • Space Colonies Vision: The idea of space colonies, envisioned by Gerard O’Neill, gained significant support through the L5 Society and technical/economic analysis.
  • Failure due to NASA and Shuttle: The movement failed largely because of NASA’s and the shuttle program’s failures.

Rocket Science and Fuel Requirements

  • Rockets and Fuel: Rockets require a massive amount of fuel.
  • Understanding Fuel Requirements (Analogy):
    • Imagine standing in mid-air, 10 miles up, and needing to prevent falling.
    • You jump off a rock your weight, staying up briefly.
    • To stay up longer, you need to jump off progressively larger rocks, doubling the weight for each additional second.
  • Rockets and Specific Impulse:
    • Specific Impulse: In the analogy, how fast you can throw the rocks; formally, thrust per unit weight of propellant consumed per second, which works out to roughly the seconds of hover bought by propellant equal to your own weight.
    • Mass Ratio: The total weight of the rocks (analogy) you bring, in multiples of your own weight.
    • Chemical Rocket Specific Impulse: Ranges from 300 to 400 seconds.
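
The analogy formalizes as the Tsiolkovsky rocket equation, delta-v = g · Isp · ln(m0/m1); a sketch, with the ~9.4 km/s delta-v to low Earth orbit taken as a common ballpark (an assumption, not a figure from the book):

```python
import math

def mass_ratio(delta_v_ms, isp_s, g=9.81):
    """Propellant mass ratio m0/m1 from the Tsiolkovsky rocket equation."""
    return math.exp(delta_v_ms / (isp_s * g))

# ~9.4 km/s is a common ballpark for reaching low Earth orbit with losses.
print(mass_ratio(9_400, 350))   # ~15.5 at a chemical Isp of 350 s: mostly fuel
```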

Nuclear-Powered Rockets in Science Fiction and Reality

  • Nuclear Rockets in Science Fiction: Science fiction writers in the 1950s often envisioned nuclear-powered rockets for space exploration.
    • Heinlein’s Novels and Movies: Featured nuclear rockets.
    • Blake Savage (Hal Goodwin): A scientist and writer who accurately depicted nuclear spaceships in his 1952 novel, Rip Foster Rides the Gray Planet.
    • 2001: A Space Odyssey: The Discovery One spaceship featured a nuclear reactor for propulsion, with design considerations for radiation shielding.
  • NERVA Project:
    • A real-life project in the 1960s that experimented with nuclear thermal rockets.
    • Limited by material capabilities, achieving a specific impulse of around 800 seconds, twice that of chemical rockets.
  • Lack of Research and Development: The predicted research and development of gaseous core nuclear reactors in the 1980s and 90s did not materialize.
  • Potential Revival of Nuclear Rockets:
    • DARPA has awarded a contract to General Atomics to design a small nuclear reactor for space propulsion.
    • Advances in technologies like superconducting magnets could benefit fusion rocket engines.
  • Nuclear-to-Electric-to-Ion Rockets:
    • Not yet demonstrated but a potential next step.
    • Prototypes suggest a specific impulse of around 3,000 seconds.
  • Direct Fusion-to-Jet Rockets:
    • Potentially the ultimate reaction rocket, using processes like the proton + lithium-7 fusion-fission reaction.
    • Could achieve a specific impulse of roughly 2 million seconds.
    • Example: A 10-ton spaceship with 1 ton of fuel could accelerate at 1G for two days.
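
The same equation roughly checks the fusion-rocket example, treating the 10-ton ship as 1 ton of fuel plus 9 tons of dry mass:

```python
import math

# 10-ton ship burning 1 ton of fuel: mass ratio 10/9, Isp ~ 2 million s.
delta_v = 2e6 * 9.81 * math.log(10 / 9)                   # ~2.1e6 m/s of delta-v
print(round(delta_v / 9.81 / 86_400, 1), "days at 1 g")   # ~2.4 days
```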

Advantages of Nuclear Rockets for Solar System Exploration

  • Limitations of Chemical Rockets: Chemical rockets are limited to low-energy, long-duration trajectories, such as the eight-month Hohmann transfer orbit to Mars, with launch windows every 26 months.
  • High-Energy, Low-Time Trajectories with Nuclear Rockets: Nuclear rockets enable faster travel with more flexible trajectories.
  • Example:
    • Accelerating at 1G for one day covers 50 gigameters and achieves a speed of 3.6 gigameters per hour.
    • Earth and Mars orbital speeds are around 0.1 gigameters per hour.
    • A typical trip could involve one day of acceleration, a few days of coasting, and one day of deceleration, resulting in a near-straight-line trajectory.

Project Orion: A Spaceship Propelled by Nuclear Explosions

Ted Taylor’s Vision

  • Ted Taylor: A nuclear engineer who designed both the smallest and largest fission devices in the U.S.
  • Barber Chair in a Spaceship: Taylor’s insistence on a barber chair symbolized his vision of a spaceship as a large, comfortable vessel like an ocean liner, not a small capsule.

Project Orion’s Development

  • Project Orion: The first study funded by ARPA (later DARPA) and the second major project of General Atomics.
  • Serious Engineering Effort: Started in 1959, involving top minds like Freeman Dyson, with some design elements still classified.
  • Nuclear Propulsion: Orion was designed to be propelled by a series of nuclear explosions behind it.
  • Inspiration: The idea came from a nuclear test where metal spheres survived a nearby explosion relatively unharmed.
  • Stanislaw Ulam’s Influence: Ulam, co-inventor of the hydrogen bomb, was intrigued by using nuclear explosions for propulsion, leading to Taylor heading the project.

Orion’s Design and Feasibility

  • Design: 150 feet in diameter, 4,000 tons, built of steel using shipbuilding techniques.
  • Feasibility: Deemed feasible with 1960s technology.
  • Expectations: Scientists anticipated visiting Saturn’s moons by the 1970s.

Super-Orion and Interstellar Travel

  • Super-Orion: A concept using hydrogen fusion bombs for propulsion.
  • Dyson’s Quote:
    • “A ship with a million-ton payload could escape from the Earth with the expenditure of about 1,000 H-bombs with yields of a few megatons. The fuel cost of such a mission would be about five cents per pound of payload, at present prices. Each bomb would be surrounded by a thousand tons of inert propellant material, and it would be easy to load this material with boron to such an extent that practically no neutrons escape to the atmosphere. The atmospheric contamination would only arise from tritium and from fission products. Preliminary studies indicate that the tritium contamination from such a series of high-yield explosions would not approach biologically significant levels.”

  • Largest Ship Size: Estimated at 8 million tons, potentially carrying 200,000 people in comfort (compared to a modern cruise ship at 126,000 tons carrying 3,000 people).
  • Atmospheric Contamination: Total contamination per launch remained roughly constant regardless of ship size, encouraging larger designs.

Orion and End-of-the-World Scenarios

  • Science Fiction Staple: Stories of launching a remnant of humanity into space to escape Earth’s destruction.
  • Mega-Orion’s Potential: Could save the entire human population with 35,000 launches (compared to 100,000 daily airliner flights worldwide).

Orion’s Demise

  • Reasons for Failure: Lack of political support and concerns about fallout from atmospheric nuclear explosions.
  • Cleaner Explosions: It’s possible to engineer cleaner explosions than those used in the past, especially with fusion.

The Linear No-Threshold Hypothesis and Radiation Risk

  • LNT Hypothesis: Assumes that any amount of radiation, however small, poses a risk proportional to the number of people exposed.
  • Impact on Orion: The LNT hypothesis significantly affects risk calculations for radiation-producing projects like Orion.
  • Dyson’s Fallout Analysis: Calculated 10 excess cancer deaths per launch using the LNT hypothesis (1 death per 10,000 person-rem; see the arithmetic after this list).
  • LNT and Cancer Deaths: Applying the LNT hypothesis globally, an Orion launch would add 10 cancer deaths to the millions expected annually worldwide.
  • Actual Dose vs. Background Radiation: The calculated dose per person from an Orion launch would be 0.02 millirem, compared to natural background radiation of 300 millirem and a likely no-damage threshold of 50,000 millirem.
  • LNT’s Impact on Research: Adherence to the LNT hypothesis hinders research on radiation effects.
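
Dyson's figure falls straight out of the LNT arithmetic; the world population used below is an assumed order of magnitude for the era:

```python
dose_rem = 0.02e-3             # 0.02 millirem per person per launch
population = 5e9               # assumed world population, order of magnitude
deaths_per_person_rem = 1e-4   # LNT: 1 death per 10,000 person-rem
print(dose_rem * population * deaths_per_person_rem, "excess deaths per launch")
```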

The Importance of Understanding Radiation

  • Space Radiation Environment: Outer space has high radiation levels.
  • Need for Radiation Tolerance: Humans must learn to live with radiation to explore the universe, which requires honest scientific study.

The Cold Equations and Space Colonization

Proxy Wars in History

  • 18th-Century North America: The French and Indian War and the American Revolution were proxy wars between England and France.
  • Post-World War II: Many wars were Cold War proxies fought in colonies.

Potential Future Cold War

  • US-China Cold War: A potential future Cold War could involve the U.S. and China, with Russia and India as contenders, vying for control of the oceans and outer space.

Thought Experiment: Reactions to Actions in Space

  • Scenario 1: Chinese Space Force destroys 10 U.S. spy satellites.
  • Scenario 2: Chinese Space Force destroys a space station with 10 people aboard.
  • Political vs. Monetary Impact: These scenarios are comparable financially and in terms of national capability, but their political implications are vastly different.

Tripwires in Space

  • Cold War Analogy: U.S. forces in Europe during the Cold War served as a tripwire, potentially triggering a larger conflict if attacked.
  • Space Colonization and Political Logic: Similar dynamics might shape space exploration and colonization, leading to a larger human presence in space than strictly necessary for economic or scientific reasons.
  • Colonization of the New World: The same political logic drove much of the New World’s colonization.

Cradle: Earth’s Future and the Need for Expansion

Earth’s Transformation and the Need for Population Control

  • Asimov’s Vision: Isaac Asimov imagined a planet-sized city as the capital of the Galactic Empire.
  • Earth’s Future: In 1,000 years, if current trends continue, Earth could become entirely urbanized, like New York City or Los Angeles.
  • Consequences: Natural landscapes would disappear, and animal life would be confined to zoos, farms (potentially replaced by nanotech food production), and pets.
  • Undesirable Outcome: This is not a desirable future.
  • Population Control: Population control is essential; the question is when and how.

Wealth as a Means of Population Control

  • Link Between Wealth and Population Growth: Wealth is the only known way to moderate population growth (outside of catastrophic events).
  • China’s Example: Even in China, birth restrictions are linked to rising incomes and are being phased out.
  • Technophobia’s Flaw: Technophobic approaches that reduce individual ecological footprints are ultimately self-defeating because population growth outweighs individual reductions.

Human-Centric Arguments for Population Control

  • Quality of Life: A human-centric perspective also supports population control: ten billion people living in poverty is a worse outcome than ten billion living prosperous lives.
  • Need for More Space and Energy: Whether prioritizing Earth or people, the conclusion is the same: more living space and energy are needed.

Long-Term Survival and Diversification

  • Diversifying Risk: Humanity’s long-term survival depends on diversification and not keeping all “eggs in one basket.”
  • Existential Threats:
    • Nuclear war
    • Global pandemics
    • Asteroid impacts
    • Unknown unknowns
  • Sagan’s Quip: “All civilizations either become spacefaring or extinct.”
  • Sagan’s Published Statements: Extinction is the norm for species; survival is the exception.
  • Spaceflight as a Necessity: Intelligent beings must unify their planets, leave, and move small worlds – the choice is spaceflight or extinction.

Resources and Opportunities in Space

  • Solar Energy: The solar system receives 10,000 times more solar energy than Earth, available for utilization.
  • Asteroids and Matter: Asteroids contain at least tens of thousands of times more usable matter than Earth.
  • Potential for Wealth and Expansion: Space provides ample room and resources for a vast, wealthy human population without relying on Earth.

Benefits of Space Colonization

  • Earth as a Preserve: Earth could be set aside as a natural preserve or managed park.
  • Experimentation: Space colonies allow for experimentation with different ecologies and governmental systems.
  • Climate and Ecology Experiments: We can experiment with Earth’s climate and ecology with the resources to fix mistakes, preventing extinction-level risks.
  • Resource Abundance: Space resources would dwarf Earth’s, reducing conflict over them.

Adapting to Space Environments

Human Body’s Limitations

  • Earth-Bound Adaptation: The human body is adapted for life on Earth’s surface but still requires clothing and shelter in many regions and seasons.
  • Technological Mediation: Humans use inventions to interact with the environment and overcome limitations.

Nanotech and Space Colonization

  • Science Fiction’s Assumption: Science fiction envisioned humans continuing to adapt to space environments through advanced technology.
  • Nanotech’s Role: Nanotech is likely to be crucial for widespread space colonization by enabling the rearrangement of atoms and providing the necessary energy.

Modifying and Replacing the Human Body

  • Limited Modification in Science Fiction: Science fiction typically features limited human modifications (e.g., radiation resistance, gills) to maintain character relatability.
  • Alien Forms: More radical body modifications were often reserved for alien characters.
  • Adapting to Freefall: Different body forms might be more advantageous in space environments.

Earth-Like Habitats vs. Adapted Humans

  • High Frontier Movement: The movement in the 1970s and 80s aimed to build Earth-like habitats in orbit, which was technologically possible but more expensive than anticipated.
  • Adapted Humans for Space: Space colonization is more likely to succeed when humans are adapted to their environments, making living in space as comfortable as living on Earth.

The Century of Enhancement and Beyond

Enhanced Body Parts

  • Future of Replacements: In the next half-century, we can expect replacement body parts that surpass the originals in function.
  • The Century of Enhancement: The coming century will focus on human enhancement.
  • Enhanced Humans: Humans of the future will have improved senses, strength, and adaptability to diverse environments.

Beyond Enhancement: New Forms and Capabilities

  • Diverse Body Forms: In the future, humans may adopt diverse body forms as varied as our current machines, adapted to specific environments and needs.
  • Matter Processing and Energy Consumption: We will be able to process various forms of matter and utilize different energy sources.
  • Integrated Biosphere Capabilities: Humans will have built-in capabilities for rearranging atoms, eliminating their ecological footprint.

The Need for a Frontier

  • Avoiding Degeneration: Without external challenges, human society risks degenerating into internal conflict and stagnation.
  • Space as a Challenge: The solar system and the galaxy offer vast challenges for humanity to overcome.
  • Unifying Humanity: We can focus on conquering the universe instead of fighting amongst ourselves.

Chapter 18: Metropolis

Introduction: Mile-High Buildings and Artificial Environments

  • Dennis Poon, Lead Engineer of Chengdu Tower, expresses interest in designing a mile-high building.
  • The 21st century marks a shift where over half of humanity lives in cities, highlighting the increasing prevalence of artificial environments.

The Burj Khalifa: A Real-World Skyscraper

  • The Burj Khalifa, opened in 2010 in Dubai, is the world’s tallest building at approximately half a mile high (2,717 feet).
  • It surpassed Toronto’s CN Tower (record holder 1975-2007) as the world’s tallest free-standing structure.
  • The Burj Khalifa resembles Frank Lloyd Wright’s proposed mile-high tower in shape.
  • It could potentially house and provide infrastructure for 5,000 people.
  • The actual Burj Khalifa contains 900 luxury residences and 37 floors of offices.
  • A mile-high skyscraper could potentially house 40,000 people.
  • The Burj Khalifa has a small footprint (less than two acres) due to its shape.
  • A mile-high version could potentially be built on a seven-acre footprint.
  • With one such tower per square mile (about 1% land usage), the Earth’s population could be housed in an area the size of Montana (a back-of-envelope check follows this list).
  • Flying cars or underground high-speed trains could connect towers, potentially transforming the Earth into a park-like environment.
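
A rough check of the housing arithmetic, using an assumed present-day population; the result lands in the Montana-to-Texas range depending on the population figure used:

```python
# Tower-city land use and total area (book's round figures).
FOOTPRINT_ACRES = 7             # mile-high tower footprint
ACRES_PER_SQ_MILE = 640
PEOPLE_PER_TOWER = 40_000
world_population = 8e9          # assumed, for illustration

land_fraction = FOOTPRINT_ACRES / ACRES_PER_SQ_MILE
print(f"Land used at one tower per square mile: {land_fraction:.1%}")  # ~1.1%

towers = world_population / PEOPLE_PER_TOWER
print(f"Towers (and square miles) needed: {towers:,.0f}")   # ~200,000
# Montana is ~147,000 sq mi, so the total footprint is on the order of
# one large U.S. state.
```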

Engineering Challenges of Skyscrapers

  • Wind load is a major engineering challenge for tall buildings.
    • Wind load increases with building height.
    • The Eiffel Tower’s structural members experience tension, not compression, on the windward side at the second platform level.
  • Modern skyscrapers utilize interesting shapes (twists, holes) to mitigate wind load.
    • These shapes break up the vortex streets that wind sheds around simple, regular shapes.
  • Wind loading increases significantly up to 20,000 feet (4 miles), plateaus until 40,000 feet, and then decreases due to lower air density.
  • Pressurization becomes necessary for buildings between 5,000 and 10,000 feet to accommodate human habitation.
    • Alternatively, increasing the oxygen fraction of interior air could maintain sea level partial pressure up to about 35,000 feet (7 miles).
  • Building a world of towers would require advanced technology and significant resources (steel, concrete).
  • Nanotechnology, specifically nanotubes and diamond, could potentially simplify the construction of such towers.

Materials Science and Skyscraper Construction

  • Materials science, sometimes referred to as nanotechnology, has advanced rapidly.
  • An aluminum-steel alloy with the strength of titanium and the price of steel is under development.
    • This alloy could enable the construction of a mile-high Eiffel Tower with simple scaling.
  • The original Eiffel Tower (984 feet) is made of wrought iron with a maximum design stress of 10 ksi and a safety factor of three.
  • Steel, while strong, is also springy, making it unsuitable for extremely tall structures due to wind sway.
  • Concrete provides stiffness and inertia, making it more suitable for tall buildings.
    • The Burj Khalifa utilized high-strength concrete that could be pumped to greater heights.
  • Diamond or nanotube composites could enable the construction of much taller towers.

Timeline for Mile-High and Ten-Mile-High Towers

  • The surge in building heights coincided with the Industrial Revolution and the use of iron and steel (e.g., the Eiffel Tower).
  • The trend line suggests that mile-high buildings could be achieved around 2065.
  • However, a shift to nanomanufacturing and advanced materials could accelerate this timeline.
  • A 10-mile tower, with a footprint of a square mile, could potentially house 40 million people.
    • Eight such towers could house the entire 2021 U.S. population.

Governance and Maintenance Challenges

  • Honest and trustworthy governance, management, planning, and maintenance are crucial for the success of large-scale tower projects.
  • Without proper governance, potential disasters (e.g., a tower collapse) could have significant consequences.

City Services: Rethinking Urban Planning

  • Jane Jacobs, author of “The Death and Life of Great American Cities,” criticizes conventional city planning approaches.
  • Many science fiction writers envisioned automatic package delivery as a key feature of future cities.
    • Philip Francis Nowlan’s Buck Rogers novels (1929) described a system where food and entertainment were delivered directly to apartments.
    • Nowlan also predicted radio-controlled flying drones for military use.
  • Early to mid-20th century science fiction often depicted cities with advanced transportation infrastructure for automated movement of goods and people.

The Value of Cities: Reducing Travel Time

  • The primary value of a city lies in its ability to reduce travel time between destinations (homes, businesses, institutions, etc.).
  • The average person travels about 1.1 hours per day, regardless of location.
  • Cities offer more valuable destinations within a given travel time.
  • Traffic congestion is a major issue in many cities, increasing travel time.
    • Megan McArdle’s experience in New York City highlights the challenges of navigating a city with inadequate transportation infrastructure.
  • Travel between distant points in a city would be significantly faster by car or helicopter if there were no congestion.

City Design and Transportation Infrastructure

  • Current city design, particularly transportation infrastructure, is considered primitive compared to advancements in other fields (manufacturing, electronics, software).
  • Cities typically have only one level of interconnect for transportation.
  • Multi-level streets, as envisioned by Robert Heinlein in his 1938 utopia, could significantly reduce travel times and improve pedestrian safety.
    • Separate pedestrian levels with awnings could enhance the walking experience.
  • Pedestrian levels should prioritize functionality (e.g., shopping malls) over purely aesthetic green spaces.
    • Jane Jacobs advocated for integrating commercial activities within residential districts.
  • Recent utopia-building projects often lack innovation in transportation infrastructure, relying on conventional road systems.
  • A well-designed city should incorporate a multi-level transportation infrastructure similar to the complex wiring systems found in brains, bodies, processor chips, etc.

Densification vs. Quality of Life

  • The tension between densification and quality of life arises from the incentives of bureaucrats and politicians.
    • They often benefit from increased interaction and contention among citizens, which creates opportunities for control and favors.
  • This incentive contributes to the prevalence of flat, one-level cities where various modes of transportation and pedestrians compete for space.
  • Jane Jacobs recognized the importance of community interaction in neighborhoods, but this concept is more applicable to smaller communities than large metropolises.
  • Apartment buildings can foster a sense of community if designed and managed properly.
  • Densification proponents often cite the paradox of road closures not always leading to increased traffic on remaining streets.
    • This phenomenon can be explained by basic demand behavior: when something becomes more expensive (in time or effort), people demand less of it.
  • Building more roads can lead to more traffic, but this new traffic represents increased access to desired activities.
  • An optimal city design would incorporate multiple levels of traffic interconnect, pedestrian walkways, and potentially towering structures.

Tower Cities and Flying Cars

  • Tower cities, with limited ground-level transportation, could make cars useless.
  • Cars, conversely, reduce the necessity of cities, as demonstrated by the rise of suburbia.
  • A city consisting of loosely clustered towers, each with a flying car garage, could provide efficient transportation without multi-level highways.
  • VTOL flying cars offer a viable solution for transportation in a tower city environment.
  • Online shopping and drone delivery further reduce the need for conventional transportation.
  • Nanotechnology and advanced 3D printing could enhance the capabilities of flying cars and personalize delivery systems.
  • With flying cars, destinations could be located in diverse environments (e.g., mountainsides) without sacrificing accessibility.

Seasteading: Colonizing the Oceans

  • Seasteading, the creation of permanent dwellings at sea, is an underpredicted trend for the coming century.
  • Cruise ships represent a basic form of seagoing city, but they rely on Victorian technology.
  • Nanotechnology could enable the construction of more advanced and self-sufficient seagoing structures.
  • An AeroCity, a massive flying wing structure, could potentially house millions of people and incorporate multi-level roadways, service cores, and nuclear power plants.
  • Island construction using materials dredged from the seafloor could create desirable living spaces in the oceans.
  • The tropical oceans offer vast areas with favorable climates, potentially suitable for island communities.
  • Airships or Cloud Cities composed of numerous small balloons could provide unique and mobile living environments.

Seasteading Advantages and Technologies

  • Seasteading offers advantages such as:
    • Escape from incompetent government (potentially leading to increased economic growth).
    • Abundant free space.
    • Mobility (allowing for climate control and potential escape from government restrictions).
  • Seaplanes could provide efficient transportation between distributed seastead communities.
  • Seawater contains significant energy resources (deuterium, lithium, boron, uranium) and dissolved minerals that could be utilized by nanofactories.
  • The development of seasteading technologies requires focused research and problem-solving efforts.
  • Economically viable ocean communities could potentially emerge early in this century, with self-sufficient family-sized units becoming possible in later decades.

Conclusion: Reimagining Cities and Habitats

  • Nanotechnology could revolutionize city design and enable the construction of innovative structures like AeroCities and Cloud Cities.
  • Seasteading offers a promising alternative to land-based living, with potential benefits in terms of freedom, space, and mobility.
  • The future of human habitats could involve a combination of advanced tower cities, seastead communities, and innovative airborne structures, potentially leaving the Earth’s surface largely untouched.
  • Psychological surveys consistently show that people are happier and healthier in rural environments compared to large cities.
    • Greenery has been linked to reduced mortality rates.
  • The COVID-19 pandemic highlighted the vulnerabilities of densely populated cities.
  • The development of flying cars and other advanced technologies could reshape the relationship between cities and transportation, potentially leading to more distributed and desirable living environments.

Chapter 19: Engineer’s Dreams

Engineer’s Dreams: Introduction

  • Quote: “I’d settle for flying cars. They need to be real flying cars with anti-gravity or reactionless thrusters, not ducted fan kludges.” - Glenn “Instapundit” Reynolds (Source: Opening quote)
  • Science fiction writers and astronautics professionals believe there must be a better way to reach other planets than rockets.
    • Desired characteristics of future space travel: Safer, quieter, cheaper, and less messy.
  • Reactionless thrusters are problematic in current physics, but opposing gravity is achievable (e.g., tables, balloons, wings, magnets).

The Mystery of Gravity and Quantum Mechanics

  • The Composition of the Universe:
    • Normal matter (stars and planets): 5%
    • Dark energy and dark matter: 95%
      • “Dark” signifies our inability to observe or understand their nature.
      • Their existence is inferred due to discrepancies between our theories of gravitation and observed behavior of normal matter.
  • The Future of Gravity and Physics:
    • New physics related to gravity is likely to be discovered.
    • Whether this will lead to anti-gravity technology is unknown.
    • A significant revolution in basic physics, particularly quantum mechanics, is anticipated.
  • Discrepancies in Current Physics:
    • Theories of gravity are estimated to be off by a factor of 20.
    • A much larger discrepancy exists between theoretical and observed vacuum energy density.
      • “Depending on the Planck energy cutoff and other factors, the discrepancy is as high as 120 orders of magnitude, a state of affairs described by physicists as ‘the largest discrepancy between theory and experiment in all of science,’ and ‘the worst theoretical prediction in the history of physics.’” - Wikipedia

Carver Mead’s Perspective on Quantum Mechanics

  • Carver Mead: A prominent figure in 20th-century technology, a colleague of Richard Feynman, and a key contributor to Moore’s Law. (Source: Background information on Mead)
  • Mead’s Critique of 20th Century Physics:
    • Quote: “It is my firm belief that the last seven decades of the 20th century will be characterized in history as the dark ages of theoretical physics.” - Carver Mead (Source: Mead)
  • Einstein vs. Bohr:
    • Niels Bohr: The dominant figure in quantum mechanics during the early to mid-20th century, particularly known for the Copenhagen Interpretation.
    • Albert Einstein: Opposed the Copenhagen Interpretation, arguing against the probabilistic nature of quantum mechanics. (“God doesn’t play dice with the universe.”)
    • Bohr was generally considered to have “won” the debates within the physics community.
  • The Success and Challenges of Quantum Mechanics:
    • The mathematics of quantum mechanics has been highly successful in predicting experimental outcomes statistically.
    • However, the interpretation of quantum mechanics remains a source of contention and seemingly incoherent explanations.
      • Schrödinger’s cat: A thought experiment highlighting the absurdity of certain interpretations.
      • Copenhagen Interpretation: The cat is neither alive nor dead until observed.
      • Many-worlds interpretation: The universe splits, with the cat alive in one and dead in another.
  • The Limitations of Experimentation:
    • Different interpretations of quantum mechanics yield the same experimental results, making it difficult to determine which is correct.
    • They differ significantly in their descriptions of unobservable aspects of reality.
  • The Case of the Maser (Precursor to the Laser):
    • Bohr and John von Neumann advised Charles Townes that his attempt to build a maser was impossible according to the Copenhagen Interpretation.
    • Townes proved them wrong with a working model and later won the Nobel Prize.

The Central Mystery and Alternative Interpretations

  • The Wave-Particle Duality:
    • Energy, such as a photon of light, behaves like a wave when traveling but interacts like a particle upon detection.
    • Analogy: A rubber ducky thrown into water creates a wave that spreads out. The ducky disappears, and then reappears somewhere else with a probability related to the wave’s amplitude at that location. The wave then instantly vanishes everywhere else.
  • The Copenhagen Interpretation’s Limitations:
    • It claims that the ducky’s path between sightings is unknowable and that the wave is a mental construct.
    • This is seen as a denial of deeper knowledge, similar to behaviorism’s focus on stimulus and response.
  • Problems with the Copenhagen Interpretation:
    • The insistence on the electron being both a wave and a point particle leads to issues like:
      • Renormalization: Mathematical adjustments that Einstein and Schrödinger criticized as being akin to Ptolemaic epicycles.
      • Vacuum energy discrepancy: The vast difference between theoretical and observed vacuum energy density.
  • Alternative Perspectives:
    • Carver Mead, along with Einstein, Schrödinger, Fred Hoyle, Jayant Narlikar, Paul Davies, and John Cramer (with contributions from Wheeler and Feynman), advocate for alternative interpretations.
    • They believe that a deeper reality exists and that Schrödinger’s equation, when properly interpreted, can explain quantum phenomena without resorting to “magic.”
  • The Electron as a Wave:
    • The electron is seen as the wave itself, not a point particle guided by a non-physical wave.
    • Particle-like behavior arises from the mathematics of wave mechanics.
  • The Advanced Wave Solution:
    • Standard practice often discards the advanced wave solution of the wave equation.
    • Keeping and utilizing the advanced wave leads to a better understanding of coherent quantum phenomena (e.g., lasers, superconductors, resonant tunneling, Bose-Einstein condensates).

Towards a Copernican Revolution in Quantum Mechanics

  • The Future of Quantum Mechanics:
    • Extensive research is still needed to fully develop a “Copernican” interpretation of quantum mechanics.
    • It is acknowledged that the Copenhagen Interpretation, like Ptolemaic epicycles, might provide more accurate predictions for a time.
  • Benefits of a New Perspective:
    • A new interpretation will likely be more intuitively satisfying and will reveal new mechanisms and insights.

Space Piers: A Second Atomic Age Approach to Space Travel

  • Making Space Travel Achievable:
    • The chapter explores how Second Atomic Age technology, specifically nanotechnology, could revolutionize space travel.
  • The Rocket Equation:
    • While lighter and stronger materials improve rocket efficiency, the rocket equation remains a significant obstacle due to the need to carry vast amounts of propellant.
  • Space Piers as an Alternative:
    • Space pier: A structure that bridges the gap between Earth and the environment where spaceships operate (vacuum and orbital velocity).
  • Space Pier Design:
    • Height: 100 km (62 miles)
    • Length: 300 km (186 miles)
    • Shape: Curved to account for Earth’s curvature, with the center bowed nearly 2 km away from a straight line.
    • Accelerator Track: Located at the top, nearly 5 km longer than the base due to the curvature.
    • Atmospheric Layers: Extends through the stratosphere (50 km) and mesosphere (90 km) into the ionosphere.
    • Propulsion: Utilizes a linear induction motor or other electromagnetic motor along the top track.
    • Elevator: Transports payloads to the top of the tower.
    • Launch Acceleration: 10 Gs horizontally for 80 seconds, achievable with appropriate cushioning for humans.
  • Advantages of Space Piers:
    • Only the payload is accelerated to orbital velocity, unlike rockets that must accelerate propellant as well.
  • Construction with Nanotechnology:
    • Material: Flawless diamond with a compressive strength of 50 gigapascals (GPa) could be used, requiring no taper for a 100 km tower.
    • Strength: A 100 km diamond column presses on its base with about 3.5 billion newtons per square meter, while the material can support 50 billion (see the sketch after this list).
    • Alternative Material: Commercially available polycrystalline synthetic diamond (5 GPa strength) could also be used.
    • Tapering: In practice, columns would be tapered to conserve material.
    • Base Broadening: The base would be broadened to handle transverse forces like wind.
    • Weather Considerations: Only the bottom 15 km (troposphere) needs to account for weather.
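
A sketch of the self-weight arithmetic behind these figures (the density and gravity values are standard constants, not from the book):

```python
# Self-weight vs. compressive strength for an untapered 100 km column.
DENSITY_DIAMOND = 3500          # kg/m^3
G = 9.8                         # m/s^2
HEIGHT = 100_000                # m

base_pressure = DENSITY_DIAMOND * G * HEIGHT     # N/m^2 (Pa)
print(f"Base pressure: {base_pressure/1e9:.1f} GPa")   # ~3.4 GPa

for label, strength in [("flawless diamond", 50e9),
                        ("polycrystalline diamond", 5e9)]:
    print(f"{label}: safety factor {strength/base_pressure:.1f}x")
# Flawless diamond carries ~14x its own column weight (no taper needed);
# even today's polycrystalline material has margin to spare.
```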

Accelerator and Structural Considerations

  • Electromagnetic Accelerator:
    • Weight: Potentially heavy, with some designs using iron cores (e.g., NASA’s Marshall prototype at 100 pounds per foot).
    • Example: A 1-ton per meter accelerator would weigh 300,000 tons (comparable to the Golden Gate Bridge at 800,000 tons).
    • Material: Mostly iron, which is relatively inexpensive.
    • Stability: Heavier construction might be preferred for stability.
    • Support: Even a relatively small column of currently available polycrystalline diamond could support the entire accelerator weight.
  • Overall Structure:
    • Design: Openwork, similar to a radio tower.
    • Footprints: Approximately 60 footprints spaced 10 km apart.
    • Land Use: Footprints would occupy a minimal amount of land (0.02% of the area under the tower).
    • Foundation Loads: Each footprint would bear a load comparable to a small office building.
  • Construction with Teleoperated Robots:
    • Teleoperated scale-shifting robots would facilitate construction, eliminating the need for humans to work in the challenging environment at high altitudes.
  • The Kármán Line (100 km):
    • The edge of space, where the air is a million times thinner than at ground level.
    • Still enough air density to prevent objects from orbiting for long at this altitude.
  • Material Requirements and Cost:
    • Estimated Material: Around a million tons for a 300 km structure.
    • Comparison to Highways: A typical 300 km superhighway uses about 15 million tons of material and costs $1-5 billion.
    • Land Value: The land under the tower may retain or even increase in value, particularly near the embarkation port.
  • Major Obstacle: Likely to be legal and regulatory hurdles rather than technical challenges.

Energy Costs and Launch Procedure

  • Elevator Energy Cost:
    • Energy: 10 gigajoules (2,778 kilowatt-hours) to send a 10-ton payload to the top.
    • Cost: $138.89 at 5 cents per kilowatt-hour (1.4 cents per kilogram).
    • Speed: An express elevator speed of 50 meters per second would take about 30 minutes to reach the top.
    • Efficiency: Elevators can be made efficient at various climb rates.
  • Payload and Passengers:
    • Freight: Inert freight would be loaded into projectiles on the ground.
    • Passengers: Would ascend in the elevator, potentially enjoying amenities like a revolving restaurant at the top (with a 1,000 km view).
  • Launch Acceleration and Passenger Comfort:
    • Acceleration: 10 Gs during horizontal launch.
    • Passenger Accommodations: Form-fitting, fluid-filled sarcophagi to mitigate the effects of high acceleration.
  • Orbital Injection:
    • Acceleration: 10 Gs for 80 seconds, reaching a velocity of 8 km/s.
    • Orbit: Achieves an orbit with a 100 km perigee and a 500 km apogee.
    • Earth Rotation Assist: The launch benefits from Earth’s rotation.
  • Launch Energy Cost:
    • Energy: 300 gigajoules.
    • Cost: $4,166 at current prices (42 cents per kilogram).
    • Estimated Cost with Second Atomic Age Tech: Pennies per kilogram, due to advancements in energy production and resource utilization.
  • Power Requirements:
    • Average Power: 3,750 megawatts during the 80-second launch.
    • Peak Power: 7,500 megawatts at the end of the launch.
    • Comparison: A typical suburb on the same land might have a peak power demand of 750 megawatts.
  • Energy Storage:
    • Requirement: Local, short-term energy storage capable of rapid discharge for load averaging.
    • Storage Density: 1 megajoule per meter of track length.
    • Examples of 1 MJ: 15 pounds of lead-acid batteries, an ounce of butter, or a grain-of-salt-sized speck of uranium.
    • Potential Storage Method: Storing energy in the magnetic fields of the accelerator coils.
  • Recharge Time:
    • A typical power station (1,000 megawatts) could recharge the tower in about 5 minutes (these figures are checked in the sketch below).
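
This section’s energy and power figures can be reproduced from first principles; a sketch assuming electricity at 5 cents/kWh:

```python
# Space-pier energy arithmetic (notes' round numbers).
G = 9.8
payload_kg = 10_000             # 10-ton payload
height_m = 100_000              # 100 km tower
KWH = 3.6e6                     # joules per kilowatt-hour
price = 0.05                    # $/kWh, assumed

# Elevator: potential energy to the top
lift = payload_kg * G * height_m                    # ~1e10 J
print(f"Lift: {lift/1e9:.0f} GJ, ${lift/KWH*price:,.2f}")   # ~10 GJ, ~$140

# Launch: 10 g for 80 s along the top track
a, t = 10 * G, 80.0
v = a * t                                           # ~7.8 km/s
ke = 0.5 * payload_kg * v**2                        # ~3e11 J
print(f"Launch: {ke/1e9:.0f} GJ, ${ke/KWH*price:,.0f}")     # ~300 GJ, ~$4,200

print(f"Average power: {ke/t/1e6:,.0f} MW")         # ~3,800 MW
print(f"Peak power:    {2*ke/t/1e6:,.0f} MW")       # ~7,700 MW
print(f"Track storage: {ke/300_000/1e6:.1f} MJ/m")  # ~1 MJ per meter of track
print(f"Recharge at 1,000 MW: {ke/1e9/60:.1f} min") # ~5 min
```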

Orbit Circularization and Vehicle Design

  • Spacecraft Requirements:
    • The launched vehicle must be capable of approximately 330 meters per second (m/s) delta-v to circularize its orbit.
  • Orbit Correction Strategy:
    • Inject into an orbit with the same energy as the desired circular orbit.
    • Perform a correction maneuver at the point where the orbits intersect (only changes direction, not energy).
    • This minimizes delta-v requirements at the correction stage, which is more expensive than delta-v at launch.
  • Propellant Requirements:
    • Chemical Rockets: Propellant for orbit correction would be about 10-15% of the vehicle’s gross weight (a rocket-equation check follows this list).
    • High-Specific Impulse Rockets (e.g., direct fusion): Propellant requirements would be significantly lower.
  • The Evolution of Spacecraft:
    • Analogy: “Zeppelins breed like elephants, but airplanes breed like rabbits.” (Source: Early aviation saying)
    • Prediction: Space piers will likely be built only after space travel becomes more frequent and economically viable.
  • Multi-Purpose Vehicles:
    • Second Atomic Age vehicles could potentially serve as cars, boats, airplanes, and spaceships.
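
A rocket-equation check of the propellant fractions mentioned above; the specific-impulse values are generic assumptions, not figures from the book:

```python
# Tsiolkovsky: propellant fraction = 1 - exp(-dv / (Isp * g0)).
import math

DV = 330.0                      # m/s, circularization burn (notes' figure)
G0 = 9.8

for label, isp in [("storable chemical", 300),
                   ("LOX/hydrogen", 450),
                   ("direct fusion (assumed)", 10_000)]:
    frac = 1 - math.exp(-DV / (isp * G0))
    print(f"{label:24s} Isp {isp:>6} s -> propellant {frac:.1%}")
# Chemical propulsion needs roughly 7-11% of gross weight (10-15% once
# margins are included); a high-Isp drive needs well under 1%.
```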

Flying Cars and Atmospheric Propulsion

  • Passenger Comfort and Acceleration:
    • A 1G thrust would be comfortable for passengers, achieving orbital velocity in about 13 minutes.
    • This would cover a distance of 3,000 kilometers.
  • Energy Requirements:
    • Energy per kilometer: 100 megajoules (see the sketch after this list).
    • Nitrogen Fusion: If achievable, a cubic foot of air could provide the necessary fuel.
    • Uranium Fuel: With a direct fusion rocket (bypassing the rocket equation), 50 cents worth of uranium could provide the total 300 gigajoules needed.
  • Atmospheric Propulsion:
    • Electric Jets (Ionic Thrusters): Potentially efficient at hypersonic speeds, without the need for supersonic combustion or thermodynamic cycles (as in ramjets).
    • Power Requirement: Peaks at 80 megawatts.
  • Limitations of Electric Jets:
    • Thrust would need to be reduced near orbital velocity.
    • A rocket would still be required for maneuvering in space.
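
A sketch of the 1 g launch-profile arithmetic (constant acceleration; the 10-ton vehicle mass is carried over from the space-pier example as an assumption):

```python
# Constant 1 g horizontal thrust to orbital velocity.
G = 9.8
V_ORBIT = 8000.0                # m/s
MASS = 10_000                   # kg, assumed vehicle mass

t = V_ORBIT / G                 # ~816 s
d = 0.5 * G * t**2              # distance covered during the burn
ke = 0.5 * MASS * V_ORBIT**2    # ~320 GJ

print(f"Time to orbital velocity: {t/60:.1f} min")     # ~13.6 min
print(f"Distance covered: {d/1e3:,.0f} km")            # ~3,300 km
print(f"Energy per km: {ke/(d/1e3)/1e6:.0f} MJ")       # ~100 MJ/km
```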

World Weather Control: Introduction

  • Science Fiction and Climate Catastrophes:
    • Against the Fall of Night (Arthur C. Clarke): The sun expands, boiling away Earth’s oceans.
    • A Pail of Air (Fritz Leiber): Earth is flung into interstellar space, causing the atmosphere to freeze.
    • Seveneves (Neal Stephenson): The moon explodes, causing a meteor bombardment that devastates Earth.
    • This Island Earth (movie): The planet Metaluna suffers a similar fate.
    • When Worlds Collide (novel and movie): A rogue planet collides with Earth.
  • Climate Change and the Henry Adams Curve:
    • Anthropogenic climate change is a major concern, but it was not the cause of the Henry Adams curve flatline in the 1970s. The concept gained prominence later.
    • Media portrayals often exaggerate the scientific and economic understanding of climate change.
  • Second Atomic Age Solutions:
    • Second Atomic Age technology could power civilization without CO2 emissions.
    • Returning to the Henry Adams curve requires moving beyond fossil fuels, particularly for space exploration.
  • The CO2 Problem of the Future:
    • Too little CO2: May become a greater concern in the future than too much CO2.
    • Current Levels: Around 400 parts per million (ppm), lower than optimal for plant growth.
    • Commercial Greenhouses: Operate at 1,000 ppm.
    • Consequences of Reduced CO2: Halving CO2 levels could threaten plant life and the ecosystem.
  • Pocket Synthesizers and CO2 Demand:
    • Pocket synthesizers: Second Atomic Age devices capable of creating objects from the air, utilizing carbon from CO2.
    • CO2 Replenishment: May require reopening coal mines or other sources to maintain adequate atmospheric CO2 for plant life.

Weather Control as a Superior Alternative

  • The Cost of Climate Regulation:
    • Current efforts cost approximately $1 trillion annually, excluding externalities like energy poverty.
    • These efforts are not effectively reducing global CO2 emissions.
  • A More Effective Approach:
    • Proposal: Invest the $1 trillion in building orbital infrastructure and deploying sunshades.
    • Sunshade Requirements: A 2-mile-wide ribbon around the equator could negate the enhanced greenhouse effect.
    • Benefits: Climate control and a valuable orbital infrastructure (potentially a solar power satellite).
  • Decoupling Temperature and CO2:
    • This approach would decouple Earth’s temperature from atmospheric CO2 concentration, allowing independent control of each.
  • CO2 Adjustment with Second Atomic Age Tech:
    • Adjusting atmospheric CO2 levels would be straightforward with the ability to manipulate atoms at the molecular level.

The Weather Machine: Mark I

  • Design:
    • Aerostat: Small, hydrogen-filled balloon (centimeter-scale).
    • Shell: Very thin diamond shell (nanometer-thick).
    • Shape: Spherical with a mirrored equatorial plane, possibly extending outwards like Saturn’s rings.
    • Thickness: A few nanometers if flattened into a disk.
  • Material Considerations:
    • While constructible with current balloon materials, nanotechnology makes it more feasible and economical.
    • Material Requirements: A nanometer-thick design requires about 10 million tons of material, comparable to 100 miles of freeway construction (a mass estimate is sketched after this list).
    • Comparison to Conventional Balloons: Covering Earth with a 100-micron-thick layer would require 100 billion tons of material.
  • Altitude and Functionality:
    • Altitude: Floats at approximately 20 miles (32 km) in the stratosphere, above weather systems and jet streams.
    • Control Unit:
      • Radio receiver
      • Computer
      • GPS receiver
    • Power and Actuation: Minimal power and actuators (fans or other mechanisms) to adjust orientation.
  • Operation:
    • Receives commands via radio.
    • Tilts its mirror based on latitude, longitude, and desired effect.
  • Global Coverage:
    • Requires a vast number of aerostats (quintillions for centimeter-scale). Nanotechnology enables this scale.
  • Functionality:
    • Acts as a programmable greenhouse gas.
    • Cooling: Reflects sunlight back into space when mirrors face the sun.
    • Warming: Reflects outgoing long-wave (infrared) radiation back towards Earth when mirrors are angled appropriately.
  • Energy Balance: The Earth radiates away the same amount of energy it receives from the Sun (to within tenths of a percent).
  • Climate Control:
    • Only a small fraction (tenths of a percent) of a full weather machine would be needed for climate control (a few watts per square meter).
  • Mirror Control:
    • Simple control systems can be used.
    • When letting sunlight through, mirrors can be slightly angled, causing scattering (haze) without significantly reducing insolation.
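
A rough mass estimate for the aerostat layer; the diamond density is a standard value, and the thicknesses bracket the notes’ “few nanometers”:

```python
# Mass of a nanometer-scale shell at ~20 miles altitude.
import math

R = 6.371e6 + 32e3              # m, Earth radius + 20-mile float altitude
area = 4 * math.pi * R**2       # shell area, ~5e14 m^2
RHO = 3500                      # kg/m^3, diamond

for t_nm in (1, 5):
    mass_tons = area * t_nm * 1e-9 * RHO / 907      # short tons
    print(f"{t_nm} nm equivalent thickness: {mass_tons/1e6:.0f} million tons")
# A few nanometers of diamond comes to ~10 million tons, comparable (as
# noted above) to ~100 miles of freeway; a 100-micron conventional
# balloon skin would weigh roughly 10,000x more.
```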

The Kardashev Scale

  • Kardashev Scale: Proposed by Russian astronomer Nikolai Kardashev in 1964, classifying civilizations based on energy utilization.
  • Type I: Controls the energy of a planet (typically the amount of sunlight hitting it).
  • Type II: Controls the energy of a star.
  • Type III: Controls the energy of a galaxy.
  • Scale Comparisons:
    • The difference between a Type III and a hypothetical ergophobic (energy-avoiding) Type I civilization is comparable to the difference between current human civilization and a single bacterium.

The Weather Machine and Kardashev Type I Status

  • Weather Machine as a Type I Enabler:
    • A weather machine could grant humanity Type I status by controlling the energy equivalent of all sunlight hitting Earth.
  • Human Energy Use:
    • Current Power Use: 2 x 10^13 watts, a tiny fraction of the 10^17 watts a weather machine could control (see the scale check after this list).
  • Benefits of Weather/Climate Control:
    • Economic Value: Controlling climate could make currently unproductive land (e.g., northern Canada, Russia) as valuable as fertile regions like California.
    • Historical Desire: Weather control has been a long-held aspiration, appearing in myths and science fiction.
  • Weather vs. Climate Control:
    • Finer control over individual aerostats could enable influence over daily weather patterns in addition to long-term climate.
    • Hurricane Steering: Shading specific areas could potentially redirect hurricanes away from populated areas.
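
A scale check on these power figures; the growth-rate calculation is an added illustration using the Type I timeline projected later in these notes:

```python
# Current human power use vs. full planetary (Type I) control.
CURRENT = 2e13                  # W, present human power use
TYPE_I = 1e17                   # W, order of sunlight hitting Earth

print(f"Headroom: {TYPE_I/CURRENT:,.0f}x")              # ~5,000x

# Reaching Type I by ~2200 implies sustained growth of roughly:
years = 2200 - 2020
rate = (TYPE_I / CURRENT) ** (1 / years) - 1
print(f"Implied energy growth: {rate:.1%} per year")    # ~4.8%/yr
```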

Asteroid Deflection with the Weather Machine

  • Apophis Asteroid:
    • Close Approach: Will pass near Earth in 2029 (within the Moon’s orbit).
    • Uncertain Trajectory: The close encounter will make its future orbit difficult to predict.
    • Potential Impact: Could potentially strike Earth in 2036.
  • Weather Machine Intervention:
    • A highly controllable weather machine could focus sunlight on Apophis during its 2029 flyby to alter its trajectory and prevent a future impact.

The Weather Machine: Mark II

  • Advanced Functionality:
    • Aerogel: Contains electronically switchable optical frequency antennas within the aerostat.
    • Frequency and Direction Control: Can absorb or transmit radiation at any desired frequency and direction.
    • Phase Control: Potentially capable of controlling the phase of transmitted radiation.
  • Solid-State Control: No need for physical orientation changes.
  • Applications:
    • Directional Video Screen/Hologram: The entire Earth could become a giant, programmable display.
    • Telescope: Turns Earth into a telescope with an 8,000-mile aperture.

Weather Machine Mark II: Power and Capabilities

  • Astronomical Applications:
    • Phobos Targeting: At Mars’ closest approach, a Mark II weather machine could focus a petawatt beam (equivalent to a quarter megaton per second) onto a 2.7 mm spot on Phobos.
    • Phobos Manipulation: This could be used to:
      • Destroy Phobos.
      • Write on its surface.
      • Create controlled ablation for propulsion, maneuvering it around Mars.
  • Nighttime Power:
    • Mark II can be powered at night by capturing outgoing infrared radiation.
  • Energy Sources and Power Density:
    • Incoming Solar Power: Roughly 1 kilowatt per square meter at the surface (at noon, near the equator).
    • Outgoing Infrared Power: About 250 watts per square meter (averaged over the entire surface).
  • Mark II Power Utilization:
    • Absorbs outgoing infrared for various applications (e.g., street lighting, displays).
    • Can act as a hologram, creating different effects for different locations.
  • Power Beaming:
    • Can focus power to flying cars, ships, trains, ground vehicles, and buildings.
    • Optimal Frequency: Microwave atmospheric window at 15 GHz or less (wavelength of 20 mm or longer) for efficient electricity conversion.
    • Power Density: A 1 km² patch of weather machine could focus 250 megawatts onto a 1-meter rectenna at 15 GHz, even at midnight (see the sketch after this list).
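
A sketch of the beaming figures; the spot size uses the standard λ·d/D diffraction estimate, with the Mark I float altitude assumed for the range:

```python
# Power gathered by a 1 km^2 patch and its diffraction-limited spot.
FLUX = 250.0                    # W/m^2, surface-average outgoing infrared
PATCH = 1000.0                  # m, side of a 1 km^2 patch
print(f"Power from 1 km^2: {FLUX * PATCH**2 / 1e6:.0f} MW")   # 250 MW

WAVELENGTH = 0.02               # m (15 GHz)
ALTITUDE = 32e3                 # m, assumed ~20-mile float altitude
spot = WAVELENGTH * ALTITUDE / PATCH
print(f"Approximate focal spot: {spot:.2f} m")                # ~0.6 m
# A ~1 m rectenna could therefore intercept nearly the whole beam,
# day or night.
```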

The Inevitability and Implications of the Weather Machine

  • Prediction: A weather machine, or something similar, will likely be built this century.
  • Reasons for Development:
    • Technological Feasibility: Molecular manufacturing makes it achievable.
    • Addressing Global Threats: Climate change and asteroid impacts are motivating factors.
    • Economic Value: Weather/climate control has significant economic potential.
    • Military Applications: Weather control offers a powerful strategic advantage.
  • Military Implications:
    • Mark I could be used to induce climate disruptions (e.g., denying summers or springs to adversaries).
    • Mark II could be used for targeted destruction (as demonstrated with Phobos).
  • The Dead Man’s Switch:
    • A partially built Mark I (around 5%) could act as a doomsday device.
    • If the controlling entity is destroyed, the aerostats could default to a “snowball Earth” scenario.
  • Geopolitical Consequences:
    • Control of the weather machine would grant immense power.
    • Competing weather machine networks are likely to emerge.
    • Negotiations and potentially a “weather control world government” might be necessary.
  • Ultimate Implications:
    • The far-reaching consequences of a weather machine are difficult to fully grasp, but its development seems probable.

Chapter 20: Rocket to the Resistance

Benjamin Franklin’s Vision

  • “The rapid progress true science now makes occasions my regretting sometimes that I was born so soon. It is impossible to imagine the height to which may be carried, in a thousand years, the power of man over matter. We may perhaps learn to deprive large masses of their gravity and give them absolute levity for the sake of easy transport. Agriculture may diminish its labor and double its produce. All diseases may by sure means be prevented or cured, not excepting even that of old age, and our lives lengthened at pleasure even beyond the antediluvian standard. Oh, that moral science were in as fair a way of improvement, that men would cease to be wolves to one another, and that human beings would at length learn what they now improperly call humanity.” - Benjamin Franklin

The End of Agriculture

Technological Advancements in Food Production

  • Vertical Farming:
    • Description: Warehouse-sized buildings with shelves stacked high, growing lettuce under specialized LED lights.
    • Advantages:
      • Increased Efficiency: Produces 300 times more lettuce per square foot compared to traditional farming.
      • Controlled Environment: Allows for year-round production, regardless of location or weather.
      • Resource Optimization: Recycles water, filters air, and eliminates the need for pesticides.
      • Example: Growing strawberries in Siberia or avocados in Antarctica becomes feasible with sufficient power.
  • Lab-Grown Meat:
    • Progress:
      • 2013: Cost $325,000 per pound.
      • 2018: Cost $363 per pound.
    • Investment: Attracting substantial investments from agribusiness giants like Tyson Foods.
    • Current Status (2021): Singapore approved and began selling cultured bioreactor-grown chicken meat.

Impact on Land Use and Energy Consumption

  • Current Land Use:
    • 80% of cultivated land in the U.S. produces animal feed (corn, soybeans, hay).
    • Significant land area dedicated to grazing.
  • Inefficient Solar Collection: Existing agriculture acts as an inefficient solar collector, converting sunlight into food through biological processes.
  • Second Atomic Age Technology:
    • Bypasses Biological Processes: Directly converts energy into food, bypassing the need for extensive land and inefficient solar collection.
    • Potential Impact: Could free up vast amounts of land currently used for agriculture.

The Choice: Static Comfort or Dynamic Expansion?

  • Door Number One: Static Comfort
    • Description: Society chooses a comfortable level of existence, with robots providing for basic needs.
    • Potential Problems:
      • Make-Work and Self-Deception: People may engage in meaningless activities to fill their time.
      • Social Conflict: “Being wolves to each other” – increased conflict and competition in a resource-abundant society.
      • Evolutionary Concerns:
        • H.G. Wells’s “The Time Machine”: Predicted the degeneration of humanity into the Eloi (idle, harmless) and Morlocks (subterranean, predatory).
        • Game Theory and Simulations: Suggest that even in comfortable environments, humans do not become uniformly benevolent.
        • Virtue Signaling: May conceal underlying selfish motivations.
        • Zero-Sum Society: A recipe for evil, as resources are finite and competition is fierce.
  • Door Number Two: Dynamic Expansion
    • Description: Society embraces continued growth and innovation, using increased productivity to enhance everyone’s lives.
    • Potential Benefits:
      • A World of Makers: People engage in creative and productive pursuits.
      • Increased Access to Resources: Everyone can afford advanced technology (e.g., flying cars, space travel).
      • Expansion Beyond Earth: Building cities on the sea, colonies in space, preserving Earth as a park.
      • Challenges and Opportunities: Provides challenges commensurate with human abilities, fostering growth and preventing stagnation.
  • The Need for a Frontier: Humans need challenges and opportunities for exploration and expansion to thrive.

The Technium and the Great Stagnation

The Technium: An Analogy for Technological Progress

  • Definition: The sum total of human-made things and capabilities, growing and interdependent like an ecosystem. (Kevin Kelly’s term).
  • Analogy:
    • Water Level: Represents the level of the technium.
    • Landscape: Represents the constraints and opportunities imposed by physics and economics.
    • Habitable Shores: Represent the areas where humans can live and thrive based on the current level of the technium.

Two Explanations for the Great Stagnation

  • Default Stagnation Explanation:
    • The Barren Desert: The technium has risen, but the landscape offers few new opportunities (except for computing).
    • Low-Hanging Fruit: All the easy advancements have been made.
  • The Forbidden Valleys:
    • Abundant Valleys: Many opportunities for technological advancement exist.
    • Angels with Flaming Swords: Cultural and regulatory barriers prevent access to these valleys.
    • Multiple Entry Points: The rising technium can find ways into these valleys through various means:
      • Technological End-Runs: Bypassing existing barriers with new technologies (e.g., drones).
      • Escape from the Ban: Individuals or groups circumventing restrictions.
      • Global Competition: Other nations forging ahead while stagnating societies fall behind.
      • Political Shifts: Changes in policy or leadership that open up new possibilities.

Technology Leading Science: Historical Perspective

  • Historical Pattern: Technological innovation often precedes scientific understanding.
    • Examples:
      • Candles: Used for millennia before Faraday explained the science of combustion.
      • Wine and Cheese: Made using microbes long before Pasteur’s discoveries.
      • Steam Engines: Invented before Carnot developed thermodynamics.

The Scientific Overhang: Dammed Potential

  • Cause of Stagnation: Not a lack of opportunity, but cultural and regulatory barriers hindering experimentation and innovation, particularly in high-power technologies.
  • Examples:
    • Nuclear Power: Molten salt reactors and nuclear rocket engines developed decades ago but not widely adopted.
    • Nanotechnology: Potential recognized by Feynman in 1960, but progress hindered by fear and regulation.
  • Potential for Rapid Advancement: Existing scientific knowledge could fuel rapid progress if barriers are removed.

The Collapse of Pax Britannica: A Historical Parallel

  • Pax Britannica: A period of British dominance fueled by the Industrial Revolution.
  • Overhang Technologies: Nuclear power and nanotechnology could enable other nations to rapidly catch up with and potentially surpass stagnating Western societies.
  • Weapons Development: Nanotech could democratize access to advanced weapons technology, potentially destabilizing the global balance of power.

The Future Isn’t What It Used to Be: Shifting Technological Focus

  • Hot Technologies of Today: Computing, communication, biotechnology.
  • Linear Extrapolation: Science fiction and futurism often rely on simple linear projections, failing to anticipate paradigm shifts.
  • The Second Atomic Age: Represents a potential resurgence of high-power technologies, driven by nuclear power, nanotechnology, biotechnology, and AI.

The Importance of Moral Science and Governance

  • Benjamin Franklin’s Moral Science: If humans could overcome their destructive tendencies, technological progress could lead to a far better world.
  • The Quality of Governance: Good governance is essential for maximizing the benefits of technological advancement.
    • Rosling’s Level 4: Represents a high level of development and well-being.
    • Potential for Improvement: The world could be significantly wealthier and happier with improved governance.
  • Incentives for Positive Change: The potential for immense wealth and global improvement could drive innovation in governance.

Futures Past: Dystopia vs. Optimism

Philip K. Dick’s Dystopian Vision

  • Philip K. Dick: A mid-20th century science fiction writer known for his dystopian stories (the source material for Blade Runner and Minority Report).
  • Characteristics of Dick’s Work: Lack of optimism, a focus on social decay, and a sense of hopelessness.
  • Influence: Despite his pessimism, Dick’s work has been highly influential, particularly among intellectuals.

Jill Lepore’s Critique of Dystopian Fiction

  • “Dystopia used to be a fiction of resistance. It’s become a fiction of submission. The fiction of an untrusting, lonely, and sullen 21st century. The fiction of fake news and info wars. The fiction of helplessness and hopelessness. It cannot imagine a better future, and it doesn’t ask anyone to bother to make one. It nurses grievances and indulges resentments. It doesn’t call for courage. It finds that cowardice suffices. Its only admonition is, despair more.” - Jill Lepore, Harvard historian

    • Context: Criticizing the trend of dystopian fiction as a reflection of societal pessimism and a lack of hope for the future.

The Great Stagnation: A Dickian Future

  • Quip: “We are not living in the future of Robert Heinlein. We are living in the future of Philip K. Dick.”
  • Pessimism as Intellectual Fashion: Throughout history, pessimism has often been seen as a sign of intellectual sophistication.
    • John Stuart Mill’s Observation: “Not the man who hopes when others despair, but the man who despairs when others hope, is admired by a large class of persons as a sage.”

Thomas Macaulay’s Optimism (1830)

  • “On what principle is it that, when we see nothing but betterment behind us, we are to expect nothing but deterioration before us? If we were to prophesy that in the year 1930 a population of fifty million, better fed, clad, and lodged than the English of our time, will cover these islands, that Sussex and Huntingdonshire will be wealthier than the wealthiest parts of the West Riding of Yorkshire now are, that machines constructed on principles yet undiscovered will be in every house, many people would think us insane.” - Thomas Macaulay.

    • Context: Macaulay’s optimistic prediction during the Victorian era proved accurate, highlighting the dangers of assuming future decline based on past progress.

The Role of Science Fiction in Shaping Visions of the Future

  • Golden Age Science Fiction: Provided inspiring visions of a better future, driven by technological advancement and human ingenuity.
    • Authors: Verne, Wells, Burroughs, Gernsback, Bellamy, Campbell, E. E. “Doc” Smith, van Vogt, Heinlein, Asimov, Garrett, Piper, Niven, Pournelle.
  • Science Fiction since the 1960s: Largely shifted towards dystopian themes, focusing on the potential dangers and negative consequences of technology.

The Faustian Bargain of Science Fiction

  • Gaining Respectability: By embracing dystopia, science fiction gained academic acceptance and moved beyond its genre niche.
  • Losing its Soul: In the process, it lost its focus on technological optimism, adventure, and hope for a better future.

Can Science Fiction Regain its Soul?

  • The Time Machine and its Influence:
    • H.G. Wells’s Prediction: Predicted the decline of humanity into the Eloi.
    • Robert Heinlein’s Response: Took Wells’s prediction as a challenge and explored ways to prevent it in his novel “Beyond This Horizon.”

Robert Heinlein’s “Beyond This Horizon”: Countering Dystopia

  • Premise: Accepting Wells’s prediction of decline, Heinlein explores solutions to maintain human intelligence and vigor.
  • Three Possibilities:
    • Reduced Safety, Enhanced Competition: Creating a society where intelligence and vigor are advantageous for survival (e.g., dueling).
    • Genetic Selection: Using technology to choose beneficial genes for offspring while maintaining natural variation (the Heinlein solution).
    • Higher Purposes: Recognizing the importance of goals beyond mere comfort and survival, such as exploration and understanding the universe.

The Need for a Higher Purpose: Disturbing the Universe

  • Heinlein’s Third Possibility (reinterpreted): Humans need challenges and goals beyond basic needs to thrive.
  • Examples:
    • Harnessing Creative Energies: Focusing on innovation and improvement instead of internal conflict and virtue signaling.
    • Expanding Beyond Earth: Protecting the Earth’s biosphere and venturing into space.
    • Exploring the Solar System and Beyond: Recognizing the vastness of the universe and humanity’s potential to explore it.

Looking Upward: Regaining Optimism and a Sense of Wonder

Henry Adams’ Shifting Perspective on Technology

  • 1862 (Pessimistic):
    • “Man has mounted science and is now run away with. I firmly believe that before many centuries more, science will be the master of men. The engines he will have invented will be beyond his strength to control. Someday, science may have the existence of mankind in its power, and the human race commit suicide by blowing up the world.”

  • 1907 (Optimistic):
    • “He could see that the new American, the child of incalculable coal power, chemical power, electric power, and radiating energy, as well as of new forces yet undetermined, must be a sort of God compared with any former creation of nature. At the rate of progress since 1800, every American who lived into the year 2000 would know how to control unlimited power. He would think in complexities unimaginable to an earlier mind. He would deal with problems altogether beyond the range of earlier society.”

  • Context: Adams’ views shifted from pessimism during the Civil War to optimism during the Edwardian era, illustrating the influence of the zeitgeist on perceptions of technology.

The Great Stagnation and the Information Age

  • The Qing Dynasty Analogy: The Great Stagnation can be seen as a rapid repeat of the Qing Dynasty’s self-imposed decline due to resistance to change.
  • The Internet as the Printing Press: The internet has democratized access to information, similar to the printing press during the Reformation.
  • The Reformation and the Power of Information: The printing press facilitated the Reformation by challenging the Catholic Church’s control over information.
  • Challenging Bureaucracy: The internet has the potential to challenge the power of entrenched bureaucracies and promote positive change.
  • History as a Source of Hope: Historical examples of overcoming stagnation and oppression offer reasons for optimism.

Exceeding Science Fiction’s Predictions: The Potential of the Second Atomic Age

  • Underestimated Possibilities: Science fiction often underestimated the potential of technology to transform the world.
    • Examples:
      • Environmental Transformation: The ability to reshape environments on a large scale (e.g., turning deserts into fertile lands).
      • Human Augmentation: Enhancing human capabilities beyond what was previously imagined (e.g., enabling unassisted flight).
      • Space Colonization: Building new worlds and living comfortably in space, exceeding the limitations of traditional space travel.

The Importance of Vision and Cultural Change

  • The 2062 Vision: By 2062, significant technological progress could be achieved if cultural attitudes towards regulation and scientific bureaucracy change.
    • Examples:
      • Flying Cars and Domestic Robots: Becoming commonplace.
      • Synthesizers: Enabling on-demand production of goods through 3D printing.
      • Increased Life Expectancy: Extending lifespans beyond 100 years.
      • Clean Energy: Eliminating the need for fossil fuels.
      • Space Exploration and Colonization: Establishing cities on the sea, the moon, and beyond.
  • Cultural Prerequisites: Achieving this vision requires a culture that values innovation, experimentation, and individual freedom.

Failure of Imagination: The Limits of Extrapolation

  • Limitations of Current Knowledge: Predicting the future based solely on current scientific understanding is inherently limited.
  • The 1900 Example: Predicting today’s technology in 1900 would have been impossible without knowledge of relativity, quantum mechanics, DNA, nuclear physics, etc.
  • The Continued Advancement of Science: Science is an ongoing process, and future discoveries will likely lead to technologies beyond our current imagination.

The Tom Swift Jr. Analogy: Embracing Experimentation and Discovery

  • Tom Swift Jr.’s Flying Lab: A fictional example of a mobile laboratory equipped with tools for experimentation and invention.
  • The Internet as a Flying Lab: The internet provides access to vast amounts of information, computing power, and design tools, enabling anyone to engage in experimentation and innovation.
  • The Power of Individual Initiative: With the right tools and mindset, individuals can make significant contributions to scientific and technological progress.

The Kardashev Scale and the Future of Humanity

  • Kardashev Type I Civilization: A civilization that can harness all the energy available on its planet.
  • Projected Timeline: If humanity resumes a path of energy growth and innovation, it could achieve Type I status by roughly 2200 (see the back-of-the-envelope sketch after this list).
  • Solar System-Scale Civilization: This level of development would necessitate expanding beyond Earth and utilizing resources throughout the solar system.
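
To put rough numbers on that projection, here is a minimal sketch using Carl Sagan’s interpolation formula for the Kardashev scale, K = (log10(P) - 6) / 10 with P in watts. The present-day power figure and the 2025 start year are approximate assumptions for illustration, not numbers from the book.

```python
import math

# Sagan's interpolation formula: K = (log10(P) - 6) / 10, with P in watts.
def kardashev(power_watts: float) -> float:
    """Kardashev rating for a civilization consuming `power_watts`."""
    return (math.log10(power_watts) - 6) / 10

P_NOW = 2e13     # assumed present-day human power consumption, ~20 TW
P_TYPE_I = 1e16  # Type I threshold under Sagan's formula

print(f"Current rating: K ~ {kardashev(P_NOW):.2f}")  # ~0.73

# Average annual growth in energy use needed to reach Type I by 2200,
# assuming a 2025 start (an illustrative assumption).
years = 2200 - 2025
growth = (P_TYPE_I / P_NOW) ** (1 / years) - 1
print(f"Required growth: ~{growth:.1%}/yr over {years} years")  # ~3.6%/yr
```

Under these assumptions, a sustained ~3.6% annual growth in total energy use carries humanity from today’s roughly K ≈ 0.7 to Type I by 2200; the exact figures depend on the convention used, so treat them as an order-of-magnitude illustration of why the 2200 date is plausible if growth resumes.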

The Importance of Frontiers and a Dynamic Ethic

  • Reducing Virtue Signaling and Cost Disease: Expanding into new frontiers can shift societal focus away from internal conflicts and towards external challenges.
  • The Key to the Future:
    • Imagination: Envisioning a better future and believing in the possibility of achieving it.
    • Desire: Wanting to achieve great things and improve the human condition.
    • Effort: Working diligently towards achieving those goals.

The Need for Hopers, Dreamers, and Visionaries

  • Call to Action: Humanity needs individuals who can imagine and inspire others towards a better future.
  • The Role of Science Fiction: Science fiction can play a crucial role in shaping these visions and fostering a sense of wonder and optimism.
  • Looking Up at the Stars: A metaphorical call to embrace ambition, exploration, and the pursuit of grand challenges.

Conclusion: Opening the Roads to the Future

  • Focus on the Near Term: Even without looking too far into the future, the potential for technological and societal progress is evident.
  • Wilbur Wright’s Quote (1908): “It is not really necessary to look too far into the future. We see enough already to be certain it will be magnificent. Only let us hurry and open the roads.”
