All posts by Ellen Brown

A New Water Source That Could Make Drought a Thing of the Past

Lack of fresh water is now a global crisis. Water shortages mean food shortages, and hunger takes a death toll substantially exceeding that of the current Covid-19 crisis. According to the United Nations, some 800 million people are without clean water, and 40% of the world’s population is affected by drought. By one measure, almost 100 percent of the Western United States is currently in drought, setting an all-time 122-year record. Meanwhile, local “water wars” rage, with states, cities and whole countries battling each other for scarce water resources.

The ideal solution would be new water flows to add to the hydrologic cycle, and promising new scientific discoveries and technologies are holding out that possibility.

But mainstream geologists have long contended that water is a fixed, non-renewable resource, and vested interests are happy to profit from that limiting proposition. Declaring water “the new oil,” an investor class of “Water Barons” — billionaire tycoons, megabanks, mega-funds and investment powerhouses — has cornered the market by buying up water rights and water infrastructure everywhere. As Jo-Shing Yang, author of Solving Global Water Crises, wrote in a 2012 article titled “The New ‘Water Barons’: Wall Street Mega-Banks are Buying up the World’s Water”:

Facing offers of millions of dollars in cash from Goldman Sachs, JPMorgan Chase, Citigroup, UBS, and other elite banks for their utilities and other infrastructure and municipal services, cities and states will find it extremely difficult to refuse these privatization offers.

For developing countries, the World Bank has in some cases made water privatization a condition of getting a loan.

Competing Theories

Geologists say that all of the water on Earth, including the atmosphere, oceans, surface water and groundwater, participates in the natural system called the “hydrologic cycle,” a closed circuit in which water moves from the surface to the atmosphere and back again. Rainwater falls, becoming groundwater which collects in aquifers (underground layers of porous rock or sand), emerging as rivers and lakes, and evaporating into clouds to again become rain. New water called “juvenile water” may be added through volcanic activities, but this addition is considered to be negligible.

The most widely held theory is that water arrived on the planet from comets or asteroids, since any water on Earth when it was first formed would have evaporated in the intense heat of its early atmosphere. One problem with that theory is that comet water is different from Earth water: it has a higher ratio of deuterium, the heavy hydrogen isotope (with an extra neutron) found in “heavy water.” Asteroids, too, are not a good fit for Earth’s water.

A more likely theory gaining new attention is that Earth’s water comes largely from within. Minerals containing hydrogen and oxygen out-gas water vapor (H2O) under the intense pressure and heat of the lower mantle (the layer between Earth’s thin crust and its hot core). The water emerges as steam and is forced outward toward the crust by the centrifugal force of the spinning Earth; there it cools and seeps up through the fractured rock formations of the upper mantle and crust.

Studies over the past two decades have found evidence of several oceans’ worth of water locked up in rock as far down as 1,000 kilometers, challenging the assumption that water arrived from space after Earth’s formation. A study reported in January 2017 based on isotopes from meteorites and the mantle found that water is unlikely to have arrived on icy comets after Earth formed.

Another study, reported in New Scientist the same month, showed that Earth’s huge store of water may have originated via chemical reactions in the mantle rather than coming from space. The researchers ran a computer simulation of reactions between liquid hydrogen and quartz in Earth’s upper mantle. The simulation showed that water forms within quartz but then cannot escape, so the pressure builds up – to such high levels that it could induce deep earthquakes. Rather than hydrogen bonding into the quartz crystal structure, as the researchers expected, it was found to disrupt the structure by bonding with oxygen. When the rock melts under intense heat, the water is released, forming water-rich regions below Earth’s surface. The researchers said that water formed in the mantle could reach the surface in various ways — for example, via magma in the form of volcanic activity — and that water could still be being created deep inside the Earth today. If so, that means water is a renewable resource.

New Technological Solutions

The challenge is drawing this deep water to the surface, but there are many verified cases of mountaintop wells that have gushed water for decades in arid lands. This water, which could not have come from the rainwater of the conventional hydrologic cycle, is variously called “deep-seated,” “juvenile” or “primary” water. It is now being located and tapped by enterprising hydrogeologists using technological innovations like those used in other extractive industries – but without their destructive impact on the environment.

According to Mark Burr, CEO of Primary Water Technologies, these innovations include mapping techniques using GIS layering and 3-D modeling; satellite imagery and other sophisticated geophysical data collection; radiometrics; passive seismics; advanced resistivity; and even quantum physics. A video capturing one of his successful drills at Chekshani Cliffs, Utah, and the innovative techniques used to pinpoint where to drill, can be seen here.

Burr comments that locating “primary water” does not require drilling down thousands of feet. He says that globally, thousands of primary water wells have been successfully drilled; and for most of them, flowing water was tapped at less than 400 feet. It is forced up from below through fissures in the Earth. What is new are the innovative technologies now being used to pinpoint where those fissures are.

The developments, he says, mirror those in the U.S. oil and gas industry, which went from warnings of “Peak Oil” scarcity to an oil and gas glut in less than a decade. Dominated for 40 years by the foreign OPEC cartel, the oil industry was disrupted through a combination of scientific advancements (including recognition of abiotic oil and gas formations), technological innovation, and regulatory modernization. The same transformation is under way in water exploration and production.

Water Pioneers

These developments were pioneered in the U.S. by Burr’s mentors, led by Bavarian-born mining engineer and geologist Stephen Riess of San Diego. Riess drilled over 800 wells around the world before his death in 1985 and was featured in several books, including New Water for a Thirsty World (1960) by Dr. Michael Salzman, professor of economics at the University of Southern California.

Partnering with Riess until Riess’s death was Hungarian-born hydrogeologist Pal Pauer, founder of the Primary Water Institute based in Ojai, California. Pauer has also successfully located and drilled over 1,000 primary water wells worldwide, including over 500 in East Africa. One noteworthy well was drilled high on a mountaintop in Kenya at Ngu-Nyumu, captured in a short video here. The workers drilled through rock and hit water at 300 feet, pumping at 15-30 gallons per minute. The flow, now captured in a water tank, still serves hundreds of villagers who previously hauled water from heavily infested streams in jugs balanced on their heads.

Another remarkable mountaintop project overseen by Pauer involved two wells drilled at a 6,000-foot elevation in the Tehachapi Mountains in California. The drill first hit water at 35 feet. The 7-inch-diameter borehole proceeded to eject water at a rate estimated at over 800 gallons per minute. The event is captured on YouTube here.

Like California, Australia is an arid land with chronic water problems. An Australian company called Sustainable Water Solutions (SWS), a partner of Burr’s Primary Water Technologies, was featured on a local TV news program here. A video of one of SWS’s successful case studies detailing its methodologies is here.

A rival company is Australian-based AquaterreX Deep Seated Water Technology. According to its website, AquaterreX is an international enterprise employing geology, environmental and earth sciences, along with a range of proprietary methodologies, to identify and analyze geologic, hydrologic, atmospheric, and other data to locate reliable sources of Deep Seated Water with nearly 100% accuracy. Some of the company’s results are shown in a video which describes “deep-seated water” as being stored in a deeper layer of aquifers below those of the conventional hydrologic cycle.

Fresh Water Is Ubiquitous and Renewable

What these researchers call “primary water” or “deep seated water” is classified by the National Ground Water Association (NGWA) simply as a form of “groundwater,” since it is in the ground. But whatever it is called, these newly tapped flows have not been part of the hydrologic cycle for at least the last century. Testing shows this by the absence of the environmental contaminants found in hydrologic-cycle water. From the time atomic testing began in the Pacific, hydrologic water has contained traces of tritium, a radioactive isotope of hydrogen used as a fuel in thermonuclear bombs. Primary water shoots up tritium-free: clean, fresh and usually drinkable without filtration.

The latest NGWA fact sheet explicitly confirms that groundwater is a renewable resource. It states:

  • About 90 percent of our freshwater supplies lie underground, but less than 27 percent of the water Americans use comes from underground sources, which illustrates the under-utilization of groundwater.
  • Groundwater is a significant water supply source — the amount of groundwater storage dwarfs our present surface water supply.
  • Hydrologists estimate, according to the National Geographic Society, U.S. groundwater reserves to be at least 33,000 trillion gallons — equal to the amount discharged into the Gulf of Mexico by the Mississippi River in the past 200 years.
  • At any given moment, groundwater is 20 to 30 times greater than the amount in all the lakes, streams, and rivers of the United States….
  • Groundwater is a renewable resource. [Emphasis added.]

In some states, such as Texas, property owners have the right to capture the water beneath their property (called the “Rule of Capture”), but this is not true in other states. California, for example, has a complicated system of regulation requiring costly and laborious permits. Granting property owners the right to drill wells on their own property, particularly where the water has been tested and shown to be “deep” or “primary” water, could be a major step toward turning water scarcity into abundance.

According to the American Society of Civil Engineers, the U.S. needs over $500 billion in infrastructure investment just for drinking water, wastewater, stormwater and dams. But legislators at both federal and state levels have been slow to respond, chiefly due to budget constraints. One proposal is a National Infrastructure Bank (HR 3339) modeled on Franklin Roosevelt’s Reconstruction Finance Corporation (discussed in my earlier article here). When allocating funds for water usage, however, policymakers would do well to consider investing in “primary water” wells.

Tapping into local deep water sources not only can help ease pressures on debt-strapped public treasuries but can bypass the Water Barons and relieve territorial tensions over water rights. Water sovereignty is a critical prerequisite to food sovereignty and to national and regional independence. As noted in a recent Water Today article, quoting James D’Arezzo:

The fact is, we do not have to severely restrict water usage, if we leverage all the tools at our disposal. There is plenty of water available on the planet and we now know how to find it. We also have newer best practices that can make a dramatic difference in total usage…. If we start acting now, in a short time the headlines about ‘water restrictions’ and grotesque pictures of dead animals and starving children can be replaced with headlines about more food production, smarter use of water and less conflict.

• First published in ScheerPost.


Nature’s Own Fuel Could Save Us From the Greenhouse Effect and Electric Grid Failure

Hemp fuel and other biofuels could reduce carbon emissions while saving the electric grid, but they’re often overlooked in favor of more expensive, high-tech climate solutions.

On July 14, the European Union unveiled sweeping climate change and emissions targets that would, according to Gulf News, mean “the end of the internal combustion engine”:

The commission’s draft would reduce permitted emissions from new passenger cars and light commercial vehicles to zero from 2035 – effectively obliging the industry to move on to battery-electric models.

While biofuels are a less high-tech, cheaper and in many ways more effective solution to our dependence on petroleum, the United States and other countries are discussing plans similar to the EU’s, and California is already on board. But in a recent article in the Los Angeles Times and related video, Evan Halper argues that we may be trading one environmental crisis for another:

The sprint to supply automakers with heavy-duty lithium batteries is propelled by climate-conscious countries like the United States that aspire to abandon gas-powered cars and SUVs. They are racing to secure the materials needed to go electric, and the Biden administration is under pressure to fast-track mammoth extraction projects that threaten to unleash their own environmental fallout.

Extraction proposals include vacuuming the ocean floor, which would disturb marine ecosystems, and mining Native American ancestral sites and pristine federal lands. Proponents of these proposals argue that China controls most of the market for the raw material refining needed for the batteries, posing economic and security threats. But opponents say the negative environmental impact will be worse than that of the oil fracking that electric vehicles are projected to replace.

Not just the batteries but the electricity needed to run electric vehicles (EVs) poses environmental concerns. Currently, generating that electricity depends heavily on non-renewable sources. And according to a March 2021 report from the Government Accountability Office, electric vehicles are making the electrical grid more vulnerable to cyber attacks, threatening the portions of the grid that deliver electricity to homes and businesses. If that is true at current use levels, the grid clearly could not sustain the load if all the cars on the road were EVs.

Not just tribal land residents but poor households everywhere will bear the cost if the proposed emissions targets and EV mandates are implemented. According to one European think tank, “average expenses of the poorest households could increase by 44 percent for transport and by 50 percent for residential heating.” As noted in Agence France-Presse, “The recent ‘yellow vest’ protests in France demonstrated the kind of populist fury that environmental controls on motoring can provoke.”

People who can barely make ends meet cannot afford new EVs, and buying a used EV is risky. If the lithium battery fails, replacing it could cost as much as the car itself; and repairs must be done by pricey dealers. No more doing it yourself with instructions off the Internet; even your friendly auto repair shop probably won’t have the tools. Except for the high-end Tesla, auto manufacturers themselves are largely losing money on EVs, due to the high cost of the batteries and low consumer demand.

Off the Electric Grid with Clean Biofuel

Whether carbon dioxide emissions are the cause of climate change is still debated, but gasoline-fueled vehicles do pose environmental hazards. There is an alternative to gasoline that does not require sending all our combustion engine vehicles to the junkyard. This is alcohol fuel (bioethanol). Not only are greenhouse gas emissions from ethanol substantially lower than from gasoline, but as detailed in a biofuel “explainer” on the website of the Massachusetts Institute of Technology:

As we search for fuels that won’t contribute to the greenhouse effect and climate change, biofuels are a promising option because the carbon dioxide (CO2) they emit is recycled through the atmosphere. When the plants used to make biofuels grow, they absorb CO2 from the air, and it’s that same CO2 that goes back into the atmosphere when the fuels are burned. In theory, biofuels can be a “carbon neutral” or even “carbon negative” way to power cars, trucks and planes, meaning they take at least as much CO2 out of the atmosphere as they put back in.

A major promise of biofuels is that they can lower overall CO2 emissions without changing a lot of our infrastructure. They can work with existing vehicles, and they can be mass-produced from biomass in the same way as other biotechnology products, like chemicals and pharmaceuticals, which are already made on a large scale.… Most gasoline sold in the U.S. is mixed with 10% ethanol.

Biofuels can be created from any sort of organic commercial waste that is high in carbohydrates, which can be fermented into alcohol locally. Unlike the waste fryer oil and grease used to generate biodiesel, carbohydrates are supplied by plants in abundance. Methanol, the simplest form of alcohol, can be made from any biomass – anything that is or once was a plant (wood chips, agricultural waste of all kinds, animal waste, etc.). In the US, 160 million tons of trash ends up in landfills annually. Estimates are that this landfill waste could be converted to 15-16 billion gallons of methanol.

In the third in a series of national assessments calculating the potential supply of biomass in the United States, the US Energy Department concluded in 2016 that the country has the future potential to produce at least one billion dry tons of biomass resources annually without adversely affecting the environment. This amount of biomass could be used to produce enough biofuel, biopower, and bioproducts to displace approximately 30% of 2005 U.S. petroleum consumption, said the report, without negatively affecting the production of food or other agricultural products.

Energy Independence

A documentary film called Pump tells the tale of the monopolization of the auto fuel industry by the petroleum cartel, and how that monopoly can be ended with a choice of biofuels at the pump.

Henry Ford’s first car, built in 1896, ran 100% on alcohol fuel, produced by farmers using beets, apples, corn and other starchy crops in their own stills. He envisioned the family piling into the car and driving through the countryside, fueling up along the road at independent farms. But alcohol was burdened with a liquor tax, and John D. Rockefeller saw a use for the gasoline that was being discarded as a toxic waste product of the kerosene market he had cornered. In 1908, Ford accommodated Rockefeller’s gasoline fuel by building America’s first “flex-fuel” car, the Model T or “Tin Lizzie.” It could be made to run on either gasoline or ethanol by adjusting the ignition timing and air-fuel mixture. Rockefeller then blocked competition from Ford’s ethanol fuel by using his power and influence to help pass Prohibition, a Constitutional amendment banning the sale and transport of alcohol.

The petroleum monopoly was first broken in Brazil, where most cars are adapted to run on bioethanol made from sugar cane. Existing combustion engines can be converted to use this “flex fuel” with simple, inexpensive kits. The Brazilian biofuel market dates back to the oil crisis of the 1970s, when gas had to be imported and was quite expensive. With the conversion to biofuels, President Luiz Inácio Lula da Silva achieved national energy independence, giving a major boost to the struggling Brazilian economy.

The U.S. push for biofuels began in California in the 1980s, when Ford Motor Company was enlisted to design a flex-fuel car to help reduce the state’s smog problem. But again the oil industry lobbied against it, arguing that bioethanol, which in the U.S. is chiefly made from corn, competed with corn’s use as a foodstuff at a time when food shortages were a major concern.

David Blume counters that it is not a question of “food or fuel” but “food and fuel.” Most U.S. corn is grown as livestock feed, and the “distillers grains” left after the alcohol is removed are more easily digested by cows than unprocessed grain. Distillers grains have another advantage over hay as a livestock feed: their easier digestion reduces the noxious cow emissions said to be a significant source of greenhouse gases.

Fuel from a Weed: The Wide-ranging Virtues of Hemp

Opponents, however, continue to raise the “food versus fuel” objection, and they claim that biofuels from corn are not “carbon neutral” when the steps used to create them are factored in. Even the fertilizers needed to grow them may emit CO2 and other greenhouse gases. But corn is not the only biofuel option. There are plants that can grow like weeds on poor soil without fertilizers.

Industrial hemp – the non-intoxicating form of cannabis grown for fiber, cloth, oil, and many other purposes – is a prime candidate not just for fuel but for helping to save the environment. Hemp has been shown to absorb more CO2 per hectare than any forest or commercial crop, making it an ideal carbon sink. It can be grown on a wide scale on nutrient-poor soils; it grows remarkably fast with almost no fertilizer or irrigation; and it returns around 70% of the nutrients used in the growth cycle to the soil. Biofuels usually require substantially more water than fossil fuels, but hemp needs roughly half the amount needed for corn. Hemp can also be used for “bioremediation” – the restoration of soil from toxic pollution. It helps remove toxins and has been used by farmers to “cure” their fields of contaminants, including radioactive agents, metals, pesticides, crude oil, and toxins in landfills.

An analysis published in the journal Science in 2019 concluded that a worldwide tree planting program could remove two-thirds of all the CO2 emissions that have been pumped into the atmosphere by human activities. As reported in The Guardian in 2019, one trillion trees could be restored for as little as $300 billion – “by far the cheapest solution that has ever been proposed.” The chief drawback to that solution is that trees grow slowly, requiring 50 to 100 years to reach their full carbon sequestering potential. Hemp, on the other hand, is one of the fastest CO2-to-biomass conversion tools available, growing to 13 feet in 100 days. It also requires much less space per plant than trees, and it can be grown on nearly any type of soil without fertilizers.

In a 2015 book titled Cannabis vs. Climate Change, Paul von Hartmann notes that hemp is also one of the richest available sources of aromatic terpenes, which are known to slow climate change. When emitted by pine forests, terpenes help to cool the planet by bouncing energy from the sun back into space. In a mature hemp field, the temperature on a hot day can be 20 degrees cooler than in surrounding areas.

Reviving an American Staple

Hemp has many uses besides fuel. Long an American staple, its cultivation was mandated in colonial America. It has been used for centuries in pharmaceuticals, clothing and textiles; it is an excellent construction material; its fiber can be used to make paper, saving the forests; and hemp seeds provide protein equivalent by weight to beef or lamb.

The value of industrial hemp has long been known to the U.S. government, which produced an informational film in 1942 called “Hemp for Victory” to encourage farmers to grow it for the war effort. Besides covering the plant’s many industrial uses, including cloth and cordage, the film detailed the history of its use and best growing practices.

Henry Ford used hemp as a construction material for his Model T, and Porsche is now using hemp-based material in the body of its 718 Cayman GT4 Clubsport track car to reduce its weight while maintaining rigidity and safety. “Hempcrete” (concrete made from hemp mixed with lime) is a “green” building material used for construction and insulation, including for building “tiny homes.”

Hemp can replace so many environmentally damaging industries that an April 2019 article in Forbes claimed that “Industrial Hemp Is the Answer to Petrochemical Dependency.” The authors wrote:

[O]ur dependency on petrochemicals has proven hard to overcome, largely because these materials are as versatile as they are volatile. From fuel to plastics to textiles to paper to packaging to construction materials to cleaning supplies, petroleum-based products are critical to our industrial infrastructure and way of life.

… Interestingly, however, there is a naturally-occurring and increasingly-popular material that can be used to manufacture many of the same products we now make from petroleum-derived materials …. That material is hemp.

… The crop can be used to make everything from biodegradable plastic to construction materials like flooring, siding, drywall and insulation to paper to clothing to soap to biofuels made from hemp seeds and stalks.

The authors note that while hemp was widely grown until a century ago, the knowledge, facilities and equipment required to produce it efficiently are no longer commonly available, since hemp farming was banned for decades due to its association with the psychoactive version of the plant.

Fueling a Rural Renaissance

In an effort to fill that vacuum, a recent initiative in California is exploring different hemp varieties and growing techniques, in the first extensive growing trials for hemp fiber and grain in the state since the 1990s. The project is a joint effort among the World Cannabis Foundation, hemp wholesaler Hemp Traders, and Oklahoma-based processor Western Fiber. The Pennsylvania-based Rodale Institute, a nonprofit that supports research into organic farming, has also partnered on a USDA-supported research project on the use of hemp in the development of biochar (charcoal produced by firing biomass in the absence of oxygen). On July 31, the World Cannabis Foundation will host a field day and factory tour in Riverdale, California, where an old cotton gin has been converted to hemp textile manufacture. The event will also feature presentations by a panel of hemp experts.

How to decarbonize 51 billion tons of greenhouse gases annually with hemp technology and regenerative farming will also be the focus of “Beyond the Green,” a fringe festival to be held in Glasgow, Scotland, in November alongside COP26, the 2021 UN Climate Change Conference.

A 2018 article summarizing research from the University of Connecticut concluded that hemp farming could “set a great example of a self-sustainable mini ‘ecosystem’ with minimal environmental footprint.” Henry Ford’s vision was to decentralize industry, with “small [factory] plants … on every stream,” a rural renaissance fueled not with oil but with alcohol. Hemp fuel and other forms of bioethanol are renewable energy sources that can be produced anywhere, contributing to energy independence not just for families but for local communities and even for the country. And it doesn’t place the burden of addressing climate change on the middle or working classes.

• This article was first posted on ScheerPost.


How America Went From Mom-and-Pop Capitalism to Techno-Feudalism

The crisis of 2020 has created the greatest wealth gap in history. The middle class, capitalism and democracy are all under threat. What went wrong and what can be done?

In a matter of decades, the United States has gone from a largely benign form of capitalism to a neo-feudal form that has created an ever-widening gap in wealth and power. In his 2013 bestseller Capital in the Twenty-First Century, French economist Thomas Piketty declared that “the level of inequality in the US is probably higher than in any other society at any time in the past anywhere in the world.” In a 2014 podcast about the book, Bill Moyers commented:

Here’s one of its extraordinary insights: We are now really all headed into a future dominated by inherited wealth, as capital is concentrated in fewer and fewer hands, giving the very rich ever greater power over politics, government and society. Patrimonial capitalism is the name for it, and it has potentially terrifying consequences for democracy.

Paul Krugman maintained in the same podcast that the United States is becoming an oligarchy, a society of inherited wealth, “the very system our founders revolted against.” While things have only gotten worse since then thanks to the economic crisis of 2020, it’s worth retracing the history that brought us to this volatile moment.

Not the Vision of Our Founders

The sort of capitalism on which the United States was originally built has been called mom-and-pop capitalism. Families owned their own farms and small shops and competed with each other on a more or less level playing field. It was a form of capitalism that broke free of the feudalistic model and reflected the groundbreaking values set forth in the Declaration of Independence and Bill of Rights: that all men are created equal and are endowed by their Creator with certain inalienable rights, including the rights to free speech, a free press, to worship and assemble; and the right not to be deprived of life, liberty or property without due process.

It was good in theory, but there were glaring, inhumane exceptions to this idealized template, including the confiscation of the lands of indigenous populations and the slavery that then prevailed. The slaves were emancipated by the US Civil War; but while they were freed in their persons, they were not economically free. They remained entrapped in economic serfdom. Although Black and Indigenous communities have been disproportionately oppressed, poor people were all trapped in “indentured servitude” of sorts — the obligation to serve in order to pay off debts; e.g., the debts of Irish workers to pay for passage to the United States, and the debts of “sharecroppers” (two-thirds of whom were white), who had to borrow from landlords at interest for land and equipment. Today’s U.S. prison system has also been called a form of slavery, in which free or cheap labor is extracted from poor people of color.

To the creditors, economic captivity actually had certain advantages over “chattel” slavery (ownership of humans as a property right). According to an infamous document called The Hazard Circular, circulated by British banking interests among their American banking counterparts during the American Civil War:

Slavery is likely to be abolished by the war power and chattel slavery destroyed. This, I and my European friends are glad of, for slavery is but the owning of labor and carries with it the care of the laborers, while the European plan, led by England, is that capital shall control labor by controlling wages.

Slaves had to be housed, fed and cared for; “free” men housed and fed themselves. Free men could nonetheless be kept enslaved by debt, by paying them wages insufficient to meet their costs of living.

From ‘Industrial Capitalism’ to ‘Finance Capitalism’

The economy crashed in the Great Depression, and Franklin D. Roosevelt’s government revived it and rebuilt the country through a public financial institution called the Reconstruction Finance Corporation. After World War II, the US middle class thrived. Small businesses competed on a relatively level playing field similar to the mom-and-pop capitalism of the early pioneers. Meanwhile, larger corporations engaged in “industrial capitalism,” in which the goal was to produce real goods and services.

But the middle class, considered the backbone of the economy, has been progressively eroded since the 1970s. The one-two punch of the Great Recession and what the IMF has called the “Great Lockdown” has again reduced much of the population to indentured servitude, while industrial capitalism has largely been displaced by “finance capitalism,” in which money makes money for those who have it, “in their sleep.” As economist Michael Hudson explains, unearned income, not productivity, is the goal. Corporations take out cheap 1% loans, not to invest in machinery and production, but to buy back their own stock earning 8% or 9%, or to buy out smaller corporations, eliminating competition and creating monopolies. Former Greek Finance Minister Yanis Varoufakis explains that “capital” has been decoupled from productivity: businesses can make money without making profits on their products. As Kevin Cahill described the plight of people today in a book titled Who Owns the World?:

These latter day pharaohs, the planet owners, the richest 5% – allow the rest of us to pay day after day for the right to live on their planet. And as we make them richer, they buy yet more of the planet for themselves, and use their wealth and power to fight amongst themselves over what each possesses – though of course it’s actually us who have to fight and die in their wars.

The 2020 Knockout Punch 

The final blow to the middle class came in 2020. Nick Hudson, co-founder of a data analytics firm called PANDA (Pandemics, Data and Analysis), argued in an interview following his keynote address at a March 2021 investment conference:

Lockdowns are the most regressive strategy that has ever been invented. The wealthy have become much wealthier. Trillions of dollars of wealth have been transferred to wealthy people. … Not a single country did a cost/benefit analysis before imposing these measures.

Policymakers followed the recommendations of the World Health Organization, based on predictive modeling by the Imperial College London that subsequently proved to be wildly inaccurate. Such studies have since been done, at least some of which have concluded that lockdowns have no significant effects on case numbers and that the costs of lockdowns substantially outweigh the benefits, in terms not just of economic costs but of lives.

On the economic front, global lockdowns eliminated competition from small and medium-sized businesses, allowing monopolies and oligopolies to grow. “The biggest loser from all this is the middle class,” wrote Logan Kane on Seeking Alpha. By May 2020, about one in four American workers had filed for unemployment, with over 40 million jobless claims; and 200,000 more businesses closed in 2020 than the historical annual average. Meanwhile, US billionaires collectively increased their total net worth by $1.1 trillion during the last 10 months of 2020; and 46 people joined the billionaire class.

The number of “centi-billionaires” – individuals with a net worth of $100 billion or more – also grew. In the US they included:

  • Jeff Bezos, soon-to-be former CEO of Amazon, whose net worth increased from $113 billion in March 2020 to $182 billion in March 2021, up by $70 billion for the year;
  • Elon Musk, CEO of Tesla and SpaceX, whose net worth increased from $25 billion in March 2020 to $164 billion in March 2021, up by $139 billion for the year; and
  • Bill Gates, formerly CEO of Microsoft and currently considered the “global vaccine czar,” whose net worth increased to $124 billion in March 2021, up by $26 billion for the year.

Two others are almost centi-billionaires:

  • The net worth of Mark Zuckerberg, CEO of Facebook, grew from $55 billion in March 2020 to $95 billion in March 2021, up by $40 billion for the year; and
  • The net worth of Warren Buffett of Berkshire Hathaway grew from $68 billion in March 2020 to $95 billion in March 2021, up by $27.6 billion for the year.

These five individuals collectively added $300 billion to their net worth just in 2020. For perspective, that’s enough to create 300,000 millionaires, or to give $100,000 to 3 million people.

Philanthrocapitalism

The need to shield the multibillionaire class from taxes and to change their predatory corporate image has given rise to another form of capitalism, called philanthrocapitalism. Wealth is transferred to foundations or limited liability corporations that are designated as having charitable purposes but remain under the ownership and control of the donors, who can invest the funds in ways that serve their corporate interests. As noted in The Reporter Magazine of the Rochester Institute of Technology:

Essentially, what we are witnessing is the transfer of responsibility for public goods and services from democratic institutions to the wealthy, to be administered by an executive class. In the CEO society, the exercise of social responsibilities is no longer debated in terms of whether corporations should or shouldn’t be responsible for more than their own business interests. Instead, it is about how philanthropy can be used to reinforce a politico-economic system that enables such a small number of people to accumulate obscene amounts of wealth.

With $100 billion, nearly anything can be bought – not just land and resources but media and journalists, political influence and legislation, regulators, university research departments and laboratories. Jeff Bezos now owns The Washington Post. Bill Gates is not only the largest funder of the World Health Organization and the Imperial College London but the largest owner of agricultural land in the US.

And Elon Musk’s aerospace manufacturer SpaceX has effectively privatized the sky. Astronomers and stargazers complain that the thousands of satellites it has already launched, with many more in the works, are blocking their ability to see the stars.

Astronomy professor Samantha Lawler writes in a piece for The Conversation:

SpaceX has already received approval for 12,000 Starlink satellites and is seeking approval for 30,000 more. Other companies are not far behind […]

The point of the Starlink mega-constellation is to provide global internet access. It is often stated by Starlink supporters that this will provide internet access to places on the globe not currently served by other communication technologies. But currently available information shows the cost of access will be too high in nearly every location that needs internet access. Thus, Starlink will likely only provide an alternate for residents of wealthy countries who already have other ways of accessing the internet

[…] With tens of thousands of new satellites approved for launch, and no laws about orbit crowding, right-of-way or space cleanup, the stage is set for the disastrous possibility of Kessler Syndrome, a runaway cascade of debris that could destroy most satellites in orbit and prevent launches for decades…. Large corporations like SpaceX and Amazon will only respond to legislation — which is slow, especially for international legislation — and consumer pressure […] Our species has been stargazing for thousands of years, do we really want to lose access now for the profit of a few large corporations?

Public advocacy groups, such as the Cellular Phone Task Force, have also objected due to health concerns over increased electromagnetic radiation. But the people have little say over public policy these days. So concluded a study summarized in a January 2021 article in Foreign Affairs. Princeton professor and study co-author Martin Gilens wrote:

[O]rdinary citizens have virtually no influence over what their government does in the United States. … Government policy-making over the last few decades reflects the preferences … of economic elites and of organized interests.

Varoufakis calls our current economic scheme “postcapitalism” and “techno-feudalism.” As in the medieval feudal model, assets are owned by the few. He notes that the stock market and the businesses in it are essentially owned by three companies – BlackRock, Vanguard, and State Street, the giant asset managers behind most exchange-traded funds. Under the highly controversial “Great Reset” envisioned by the World Economic Forum, “you will own nothing and be happy.” By implication, everything will be owned by the techno-feudal lords.

Getting Back on Track

The capitalist model has clearly gone off the rails. How to get it back on track? One obvious option is to tax the uber-rich. As Chuck Collins, author of The Wealth Hoarders: How Billionaires Pay Millions to Hide Trillions (2021), writes in a March 2021 article:

A wealth tax would reverse more than a half-century of tax cuts for the wealthiest households. Billionaires have seen their taxes decline roughly 79 percent as a percentage of their wealth since 1980. The “effective rate” on the billionaire class—the actual percentage paid—was 23 percent in 2018, lower than for most middle-income taxpayers.

He notes that Sen. Elizabeth Warren (D-Mass.) and co-authors recently introduced legislation to levy a 2 percent annual tax on wealth starting at $50 million, rising to 3 percent on fortunes of more than $1 billion:

The tax, which would apply to fewer than 100,000 U.S. residents, would raise an estimated $3 trillion over the next decade. It would be paid entirely by multi-millionaires and billionaires who have reaped the lion’s share of wealth gains over the last four decades, including during the pandemic.
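For readers who want to check the arithmetic, here is a minimal Python sketch of the proposed two-bracket tax. The 2 percent and 3 percent rates and the $50 million and $1 billion thresholds come from the proposal as described above; treating them as marginal brackets, and the $100 billion example fortune, are assumptions for illustration.

```python
# A minimal sketch of the proposed wealth tax described above:
# 2% annually on net worth above $50 million, 3% on the portion above
# $1 billion. Treating the brackets as marginal is an assumption.

def annual_wealth_tax(net_worth: float) -> float:
    """Annual tax owed, in dollars, under the proposed brackets."""
    tax = 0.0
    if net_worth > 50_000_000:
        # 2% on wealth between $50 million and $1 billion
        tax += 0.02 * (min(net_worth, 1_000_000_000) - 50_000_000)
    if net_worth > 1_000_000_000:
        # 3% on wealth above $1 billion
        tax += 0.03 * (net_worth - 1_000_000_000)
    return tax

# A hypothetical $100 billion fortune would owe about $3 billion a year:
print(f"${annual_wealth_tax(100_000_000_000):,.0f}")  # $2,989,000,000
```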

Varoufakis contends, however, that taxing wealth won’t be enough. The corporate model itself needs an overhaul. To create a “humanist” capitalism, he says, democracy needs to be brought to the marketplace.

Politically, one adult gets one vote. But in corporate elections, votes are weighted according to financial investment: the largest investors hold the largest number of voting shares. Varoufakis argues that the proper principle for reconfiguring the ownership of corporations for a market-based society would be one employee, one share (not tradeable), one vote. On that basis, he says, we can imagine as an alternative to our post-capitalist model a market-based democratic society without capitalism.

Another proposed solution is a land value tax, restoring at least a portion of the land to the “commons.” As Michael Hudson has observed:

There is one Achilles heel in the globalists’ strategy, an option that remains open to governments. This option is a tax on the rental income – the “unearned income” – of land, natural resources and monopoly takings.

Reforming the banking system is another critical tool. Banks operated as a public utility could allocate credit for productive purposes serving the public interest. Other possibilities include enforcement of anti-monopoly legislation and patent law reform.

Perhaps, however, the flaw is in the competitive capitalist model itself. The winners will inevitably capture and exploit the losers, creating an ever-growing gap in wealth and power. Studies of natural systems have shown that cooperative models are more efficient than competitive schemes. That does not mean the sort of “cooperation” coerced through iron-fisted totalitarian control at the top. We need a set of rules that actually levels the playing field, rewards productivity, and maximizes benefit to society as a whole, while preserving the individual rights guaranteed by the U.S. Constitution.

• This article was first posted on ScheerPost.


Will 2021 Be Public Banking’s Watershed Moment?

Just over two months into the new year, 2021 has already seen a flurry of public banking activity. Sixteen new bills to form publicly-owned banks or facilitate their formation were introduced in eight U.S. states in January and February. Two bills for a state-owned bank were introduced in New Mexico, two in Massachusetts, two in New York, one each in Oregon and Hawaii, and Washington State’s Public Bank Bill was re-introduced as a “Substitution.” Bills for city-owned banks were introduced in Philadelphia and San Francisco, and bills facilitating the formation of public banks or for a feasibility study were introduced in New York, Oregon (three bills), and Hawaii.

In addition, California is expected to introduce a bill for a state-owned bank later this year, and New Jersey is moving forward with a strong commitment from its governor to implement one. At the federal level, three bills for public banking were also introduced last year: the National Infrastructure Bank Bill (HR 6422), a new Postal Banking Act (S 4614), and the Public Banking Act (HR 8721). (For details on all these bills, see the Public Banking Institute website here.)

As Oscar Abello wrote on NextCity.org in February, “2021 could be public banking’s watershed moment.… Legislators are starting to see public banks as a powerful potential tool to ensure a recovery that is more equitable than the last time.”

Why the Surge in Interest?

The devastation caused by nationwide Covid-19 lockdowns in 2020 has highlighted the inadequacies of the current financial system in serving the public, local businesses, and local governments. Nearly 10 million jobs were lost to the lockdowns, over 100,000 businesses closed permanently, and a quarter of the population remains unbanked or underbanked. Over 18 million people are receiving unemployment benefits, and moratoria on rent and home foreclosures are due to expire this spring.

Where was the Federal Reserve in all this? It poured out trillions of dollars in relief, but the funds did not trickle down to the real economy. They flooded up, dramatically increasing the wealth gap. By October 2020, the top 1% of the U.S. population held 30.4% of all household wealth, 16 times that of the bottom 50%, which held just 1.9% of all wealth.

State and local governments are also in dire straits due to the crisis. Their costs have shot up and their tax bases have shrunk. But the Fed’s “special purpose vehicles” were no help. The Municipal Liquidity Facility, ostensibly intended to relieve municipal debt burdens, lent at market interest rates plus a penalty, making borrowing at the facility so expensive that it went nearly unused; and it was discontinued in December.

The Fed’s emergency lending facilities were also of little help to local businesses. In a January 2021 Wall Street Journal article titled “Corporate Debt ‘Relief’ Is an Economic Dud,” Sheila Bair, former chair of the Federal Deposit Insurance Corporation, and Lawrence Goodman, president of the Center for Financial Stability, observed:

The creation of the corporate facilities last March marked the first time in history that the Fed would buy corporate debt… The purpose of the corporate facilities was to help companies access debt markets during the pandemic, making it possible to sustain operations and keep employees on payroll. Instead, the facilities resulted in a huge and unnecessary bailout of corporate debt issuers, underwriters and bondholders…. This created a further unfair opportunity for large corporations to get even bigger by purchasing competitors with government-subsidized credit.

…. This presents a double whammy for the young companies that have been hit hardest by the pandemic. They are the primary source of job creation and innovation, and squeezing them deprives our economy of the dynamism and creativity it needs to thrive.

In a September 2020 study for ACRE called “Cancel Wall Street,” Saqib Bhatti and Brittany Alston showed that U.S. state and local governments collectively pay $160 billion annually just in interest in the bond market, which is controlled by big private banks. For comparative purposes, $160 billion would be enough to help 13 million families avoid eviction by covering their annual rent; and $134 billion could make up the revenue shortfall suffered by every city and town in the U.S. due to the pandemic.

Financing generally accounts for half the cost of infrastructure, effectively doubling the price municipal governments pay. Local governments are extremely good credit risks; yet private, bank-affiliated rating agencies give them lower credit scores (raising their rates) than private corporations, which are 63 times more likely to default. States are not allowed to go bankrupt, and that is also true for cities in about half the states. State and local governments have a tax base to pay their debts and are not going anywhere, unlike bankrupt corporations, which simply disappear and leave their creditors holding the bag.
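A quick amortization calculation shows how interest can approach half of what a municipality pays over the life of a bond. Here is a minimal Python sketch with hypothetical figures (a $100 million project financed over 30 years at 5%; neither figure comes from the article):

```python
# A minimal sketch of how financing can roughly double an infrastructure
# project's cost: a hypothetical $100 million project funded by 30-year
# amortizing bonds at an assumed 5% rate.

principal = 100_000_000  # project cost financed (hypothetical)
rate = 0.05              # assumed annual interest rate
years = 30               # assumed bond term

# Level annual debt service on a fully amortizing loan
payment = principal * rate / (1 - (1 + rate) ** -years)
total_paid = payment * years

print(f"Annual debt service: ${payment:,.0f}")                  # ~$6.5 million
print(f"Total paid over {years} years: ${total_paid:,.0f}")     # ~$195 million
print(f"Interest share of total: {1 - principal / total_paid:.0%}")  # ~49%
```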

How Publicly-owned Banks Can Help 

Banks do not have the funding problems of local governments. In March 2020, the Federal Reserve reduced the interest rate at its discount window, encouraging all banks in good standing to borrow there at 0.25%. No stigma or strings were attached to this virtually free liquidity – no need to retain employees or to cut dividends, bonuses, or the interest rates charged to borrowers. Wall Street banks can borrow at a mere one-quarter of one percent while continuing to charge customers 15% or more on their credit cards.

Local governments extend credit to their communities through loan funds, but these “revolving funds” can lend only the capital they have. Depository banks, on the other hand, can leverage their capital, generating up to ten times their capital base in loans. For a local government with its own depository bank, that would mean up to ten times the credit to inject into the local economy, and ten times the profit to be funneled back into community needs. A public depository bank could also borrow at 0.25% from the Fed’s discount window.
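To make the leverage arithmetic concrete, here is a minimal sketch. The ten-to-one leverage ratio and the 0.25% discount-window rate come from the text; the $100 million capital base and the 4% community lending rate are hypothetical, and funding the leveraged portion entirely at the discount window is a simplifying assumption.

```python
# A minimal sketch contrasting a revolving loan fund with a public
# depository bank, per the leverage described above.

capital = 100_000_000   # public seed capital (hypothetical)
leverage = 10           # loans of up to ten times the capital base (per text)
loan_rate = 0.04        # assumed rate charged on community loans
discount_rate = 0.0025  # Fed discount window rate cited above

# A revolving fund can lend only the capital it has.
fund_loans = capital
fund_income = fund_loans * loan_rate

# A depository bank can generate up to leverage x capital in loans,
# funding the balance cheaply (assumed here at the discount window).
bank_loans = capital * leverage
bank_income = bank_loans * loan_rate - (bank_loans - capital) * discount_rate

print(f"Fund lending capacity ${fund_loans:,.0f}, income ${fund_income:,.0f}")
print(f"Bank lending capacity ${bank_loans:,.0f}, income ${bank_income:,.0f}")
# -> ten times the credit, and roughly ten times the income to funnel
#    back into community needs
```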

North Dakota Leads the Way

What a state can achieve by forming its own bank has been demonstrated in North Dakota. There the nation’s only state-owned bank was formed in 1919, when North Dakota farmers were losing their farms to big out-of-state banks. Unlike the Wall Street megabanks mandated to make as much money as possible for their shareholders, the Bank of North Dakota (BND) is mandated to serve the public interest. Yet it has had a stellar return on investment, outperforming even J.P. Morgan Chase and Goldman Sachs. In its 2019 Annual Report, the BND reported its sixteenth consecutive year of record profits, with $169 million in income, just over $7 billion in assets, and a hefty return on investment of 18.6%.

The BND maximizes its profits and its ability to serve the community by eliminating profiteering middlemen. It has no private shareholders bent on short-term profits, no high-paid executives, no need to advertise for depositors or borrowers, and no need for multiple branches. It has a massive built-in deposit base, since the state’s revenues must be deposited in the BND by law. It does not compete with North Dakota’s local banks in the retail market but instead partners with them. The local bank services and retains the customer, while the BND helps as needed with capital and liquidity. Largely due to this amicable relationship, North Dakota has nearly six times as many local financial institutions per person as the country overall.

The BND has performed particularly well in economic crises. It helped pay the state’s teachers during the Great Depression, and sold foreclosed farmland back to farmers in the 1940s. It has also helped the state recover from a litany of natural disasters.

Its emergency capabilities were demonstrated in 1997, when record flooding and fires devastated Grand Forks, North Dakota. The town and its sister city, East Grand Forks on the Minnesota side of the Red River, lay in ruins. The response of the BND was immediate and comprehensive, demonstrating a financial flexibility and public generosity that no privately-owned bank could match. The BND quickly established nearly $70 million in credit lines and launched a disaster relief loan program; worked closely with federal agencies to gain forbearance on federally-backed home loans and student loans; and reduced interest rates on existing family farm and farm operating programs. The BND obtained funds at reduced rates from the Federal Home Loan Bank and passed the savings on to flood-affected borrowers. Grand Forks was quickly rebuilt and restored, losing only 3% of its population by 2000, compared to 17% in East Grand Forks on the other side of the river.

In the 2020 crisis, North Dakota shone again, leading the nation in getting funds into the hands of workers and small businesses. Unemployment benefits were distributed in North Dakota faster than in any other state, and small businesses secured more Payroll Protection Program funds per worker than in any other state. Jeff Stein, writing in May 2020 in The Washington Post, asked:

What’s their secret? Much credit goes to the century-old Bank of North Dakota, which — even before the PPP officially rolled out — coordinated and educated local bankers in weekly conference calls and flurries of calls and emails.

According to Eric Hardmeyer, BND’s president and chief executive, BND connected the state’s small bankers with politicians and U.S. Small Business Administration officials and even bought some of their PPP loans to help spread out the cost and risk….

BND has already rolled out two local successor programs to the PPP, intended to help businesses restart and rebuild. It has also offered deferments on its $1.1 billion portfolio of student loans.

Public Banks Excel Globally in Crises

Publicly-owned banks around the world have responded quickly and efficiently to crises. As of mid-2020, public banks worldwide held nearly $49 trillion in combined assets; including other public financial institutions, the figure reached nearly $82 trillion. In a 2020 compendium of case studies titled Public Banks and Covid-19: Combatting the Pandemic with Public Finance, the editors write:

Five overarching and promising lessons stand out: public banks have the potential to respond rapidly; to fulfill their public purpose mandates; to act boldly; to mobilize their existing institutional capacity; and to build on ‘public-public’ solidarity. In short, public banks are helping us navigate the tidal wave of Covid-19 at the same time as private lenders are turning away….

Public banks have crafted unprecedented responses to allow micro-, small- and medium-sized enterprises (MSMEs), large businesses, public entities, governing authorities and households time to breathe, time to adjust and time to overcome the worst of the crisis. Typically, this meant offering liquidity with generously reduced rates of interest, preferential repayment terms and eased conditions of repayment. For the most vulnerable in society, public banks offered non-repayable grants.

The editors conclude that public banks offer a path toward democratization (giving society a meaningful say in how financial resources are used) and definancialization (moving away from speculative predatory investment practices toward financing that grows the real economy). For local governments, public banks offer a path to escape monopoly control by giant private financial institutions over public policies.

• This article was first posted on ScheerPost.


The Gamers’ Uprising Against Wall Street Has Deep Populist Roots

Wall Street may own the country, as Kansas populist leader Mary Elizabeth Lease once declared, but a new generation of “retail” stock market traders is fighting back.

A short squeeze frenzy driven by a new generation of gamers captured financial headlines in recent weeks, centered on a struggling strip mall video game store called GameStop. The Internet, along with a year off during the shutdown to study up, has given a younger generation of investors the tools to compete in the market. Gerald Celente calls it the “Youth Revolution.” A group of New York Young Republicans who protested in the snow on January 31 called it “Re-occupy Wall Street.” Others have called it Occupy Wall Street 2.0.

The populist uprising against Wall Street goes back farther, however, than the 2011 Occupy movement. In the late 19th century, the country was suffering from a depression nearly as severe as the Great Depression of the 1930s. Kansas populist leader Mary Elizabeth Lease declared in a fiery speech in 1890:

Wall Street owns the country. It is no longer a government of the people, by the people, and for the people, but a government of Wall Street, by Wall Street, and for Wall Street. The great common people of this country are slaves, and monopoly is the master.

Wall Street still owns the country. Millions of people have been forced out of work, while billionaires have doubled their money in the stock market. But a new generation of “retail” stock market traders is fighting back. (“Retail” traders are individuals trading for their own accounts as opposed to institutional investors.) Occupy Wall Street succeeded in raising awareness of the issues and putting a spotlight on the villains: the chief fault for the subprime crisis and 2008 crash lay not with the defaulting homeowners but with the banks. The Wall Street bankers, however, were not much fazed by the protests on the streets outside their windows. Not until January 2021 was Wall Street actually “occupied,” with millions of small traders landing a multibillion-dollar blow to at least a few of the mighty Wall Street hedge funds. GameStop was the most heavily shorted stock on the market, and the losing hedge funds were on the short end of the stick.

The Short Squeeze

“Short selling” works like this: an investor borrows shares from a broker middleman and immediately sells them, hoping to buy them back later at a lower price, return them to the broker, and pocket the difference. The trade, however, is quite risky. If the shorted stock keeps going up, the shorter’s potential loss is unlimited. “Covering” the short position by buying the shares back at the higher price and returning them to the lender locks in that loss; but if the shorter declines to cover, the broker will eventually demand more collateral as protection against its own potential losses.
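The arithmetic is simple but worth making explicit. Here is a minimal sketch in Python, with hypothetical share counts and prices (the $380 figure anticipates GameStop’s late-January peak, discussed below):

def short_pnl(sale_price, cover_price, shares):
    """Profit (positive) or loss (negative) on a short position."""
    return (sale_price - cover_price) * shares

# Borrow and sell 100 shares at $20, hoping to buy them back cheaper:
print(short_pnl(20.00, 4.00, 100))    # covers at $4   -> +1600.0 profit
print(short_pnl(20.00, 380.00, 100))  # covers at $380 -> -36000.0 loss
# There is no ceiling on cover_price, so the potential loss is unbounded.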

“Naked” short-selling involves selling stock without borrowing it. Naked short-selling is illegal, but the regulation is not well enforced; and in this case short-sellers had sold 40% more GameStop shares than were in existence. That made the short sellers extremely vulnerable to a “short squeeze.” WallStreetBets, a Reddit social media forum that then had 2.7 million members (a number that has since risen to about 8 million) called the shorters’ bluff and bought furiously, using a game-like trading app called “Robinhood.” The hedge funds had to buy the shares back to cover at the new price, further driving the stock up; and other players, seeing the action, jumped in. This is what is known as a “short squeeze” – driving the shorters to buy before the stock is out of reach. The WallStreetBets short squeeze drove the GameStop stock price up nearly 900% in five days, to around $380 on January 27. The 52-week low before that was $2.57.
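Those figures can be checked with back-of-envelope arithmetic; a quick sketch using only the numbers quoted above:

low, peak = 2.57, 380.00
print(f"Gain from 52-week low: {(peak - low) / low:.0%}")   # roughly 14,700%

# 40% more shares sold short than existed (short interest ~140% of the float):
shares_outstanding = 1.0                 # normalized
shares_sold_short = 1.4 * shares_outstanding
print(f"Short interest: {shares_sold_short / shares_outstanding:.0%} of shares outstanding")
# Every shorted share must eventually be bought back; with more shorted
# shares than real ones, the forced buying feeds on itself: the squeeze.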

Legally, traders cannot engage in “market manipulation,” but the term is poorly defined and the rule is poorly enforced. Hedge funds are reported to hold “idea dinners” where they get together and pitch each other on their favorite trades. As pointed out in a Peak Prosperity interview with financial researcher Jim Bianco, the big players routinely act together. Wall Street has been run as a cartel, and the house always wins; or at least it did until recently.

Bianco explained that formerly, small traders lacked the tools to compete. They lacked the knowledge base, the money to buy full shares of the higher priced stocks, and the time needed to study the market. But in the last two decades, technical advances have given the “little guys” connectivity and access to market information. And in 2020, forced lockdowns gave them the time to sit in front of their computers studying the market and playing it, while stimulus payments gave them some extra cash to play with. New game-like online trading apps such as Robinhood, billed as “democratizing” investing, waived commissions. The purchase of fractional shares was allowed, and social media groups such as WallStreetBets shared investment information and strategies with each other.

The “little guys” learned how to use the rigged stock market system to beat the big boys at their own game. By January 30, short-sellers had lost an estimated $19 billion on GameStop alone, and other heavily shorted stocks were hit as well. Hedge funds lost over $70 billion in January 2021, with GameStop jumping 1000% the last week of the month.

The Empire Strikes Back

Predictably, on January 28 the game came to a screeching halt. Robinhood restricted trades of GameStop and other shorted stocks, allowing investors to settle their trades but not make new purchases. GameStop closed down 44%. This market manipulation appeared to be so blatantly in favor of Goliath against David that congressional representatives from both sides of the aisle protested. Rather than a left/right issue or a racial issue, it was a class issue, one that has fortuitously served to align the 99% on all sides of the political spectrum.

By February 2, the GameStop stock price had plummeted to $97, and hedge fund losses dropped to $13 billion, half of what they were the previous week but still a serious hit. Some retail investors had struck it big, and used the proceeds to pay student loans or rent or mortgages; but others, those who bought or sold last, lost big. The rule is to invest no more than you can afford to lose, but some desperate gamers may have unfortunately learned that lesson the hard way.

According to the message boards, however, for other players it was not chiefly about making money on a single trade. It was an uprising, a show of force against a Wall Street behemoth that had destroyed small businesses, turned families out of their homes, and robbed young people of the “American dream.” In a February 6 post titled “This Is for You, Dad,” Matt Taibbi recounts the touching tales of young people whose parents’ lives were ruined by the 2008 crash, who were willing to risk their own money to see justice done.

In that sense it is an emotionally satisfying “Robinhood” tale, but it has a dark side. Pam and Russ Martens point out that the short squeeze was not actually a spontaneous, grassroots uprising. It was coordinated by a “sophisticated” trader in the Reddit sub-group, whose trading license forbids that sort of practice.

He could be in trouble with his employer or his licensing board, but the larger WallStreetBets group does not seem to have been guilty of “market manipulation” as usually defined. The group did not engage in fraudulent activity, such as a “pump and dump” scheme based on false advertising of a product. The Reddit group was not conspiring behind closed doors to commit fraud. They were quite open about what they were doing, and there is nothing illegal about advocating the purchase of a stock, something investor newsletters do routinely. GameStop was a beloved retail store of the young traders, and they wanted to save it from the marauding hedge funds. As Jill Carlson observed on Yahoo.com:

When hedge fund managers get together and share insights and ideas, it is all in the name of moving towards a more efficient free market. Yet, when the online mob of retail traders comes to consensus about which stock is worth buying, they are considered by some to be artificially pumping the name and manipulating the market.

And there’s the rub. The disruption of the market caused by the WallStreetBets play could be the rationale for new regulations that hurt retail investors. It could also be another excuse to shut down free speech on the Internet and social media. And as Whitney Webb points out, it was not just a battle of David versus Goliath but one of hedge fund against hedge fund. The biggest winners included some of the most notorious Wall Street institutions – BlackRock, J.P. Morgan Chase and Goldman Sachs. They saw what was happening, bought early, bought big, and sold near the top. BlackRock made a staggering 700% on its trades.

In a February 9 article, Pam and Russ Martens show that the biggest winners were high-frequency program traders. They jumped in front of the WallStreetBets trades using sophisticated algorithms feeding on data provided by Robinhood and other trading platforms. Lance Roberts argues that Wall Street won again.

The Power of the People

So what was achieved by the GameStop short squeeze? Financial analyst Wolf Richter contends:

What all this shows, for all to see, is just how massively the stock market has been manipulated, from the Fed on down. Most of these manipulations attempt to manipulate prices up. But some manipulations are aimed at pushing prices down. What the millions of people conspiring on WallStreetBets to exact their pound of flesh accomplished was nevertheless enormous: They showed for all to see just how completely broken the stock market is.

True, but they showed more than that. They showed that the 99%, if organized and acting in concert, have the tools to overwhelm the 1%. Information, communication, and collective action are key.

The American people acting together have beaten the financial elite before. In the 18th century, it was London bankers who held the British colonies in financial bondage. The American colonists broke free by issuing their own paper currencies. The colony of Pennsylvania, and later Treasury Secretary Alexander Hamilton, refined that system with publicly-owned banks that extended the credit first of the colony and then of the nation; and Abraham Lincoln’s government avoided a 30% interest rate by reviving the colonial practice of issuing debt-free, interest-free paper currency.

The 19th century movement against Wall Street largely faded after Populist and Democratic Party candidate William Jennings Bryan lost to Republican candidate William McKinley in the 1896 and 1900 presidential elections. But a remnant of the Midwest movement triumphed in 1919, when the Non-Partisan League (NPL), formed in protest against widespread foreclosures on North Dakota farms, won statewide elections and beat Wall Street at its own game by establishing a state-owned bank. The Bank of North Dakota cut out the Wall Street middlemen and partnered with local community banks; and by 2014, according to the Wall Street Journal, it was more profitable than either J.P. Morgan Chase or Goldman Sachs.

We the people need to recapture our local and national governments and regulators. If we had a popular nonpartisan government that put the public interest above the Corporatocracy, we could do remarkable things as a nation. Congress could even follow Lincoln in exercising its constitutional right to mint some very large coins, perhaps $1 trillion each, buy a controlling interest in some major corporations (Amazon? The New York Times? Some pharmaceutical or energy or telecommunications giants?), and distribute the dividends to the people. The public acting together is Goliath. E pluribus unum: strength in unity.

This article was first posted on ScheerPost.

The post The Gamers’ Uprising Against Wall Street Has Deep Populist Roots first appeared on Dissident Voice.

Tackling the Infrastructure and Unemployment Crises: The “American System” Solution

A self-funding national infrastructure bank modeled on the “American System” of Alexander Hamilton, Abraham Lincoln, and Franklin D. Roosevelt would help solve two of the country’s biggest problems.

Millions of Americans have joined the ranks of the unemployed, and government relief checks and savings are running out; meanwhile, the country still needs trillions of dollars in infrastructure. Putting the unemployed to work on those infrastructure projects seems an obvious solution, especially given that the $600 or $700 stimulus checks Congress is planning on issuing will do little to address the growing crisis. Various plans for solving the infrastructure crisis involving public-private partnerships have been proposed, but they’ll invariably result in private investors reaping the profits while the public bears the costs and liabilities. We have relied for too long on private, often global, capital, while the Chinese run circles around us building infrastructure with credit simply created on the books of their government-owned banks.

Earlier publicly-owned U.S. national banks and U.S. Treasuries pulled off similar feats, using what Sen. Henry Clay, U.S. statesman from 1806 to 1852, named the “American System” – funding national production simply with “sovereign” money and credit. They included the First (1791-1811) and Second (1816-1836) Banks of the United States, President Lincoln’s federal treasury and banking system, and President Franklin Roosevelt’s Reconstruction Finance Corporation (RFC) (1932-1957). Chester Morrill, former Secretary of the Board of Governors of the Federal Reserve, wrote of the RFC:

[I]t became apparent almost immediately, to many Congressmen and Senators, that here was a device which would enable them to provide for activities that they favored for which government funds would be required, but without any apparent increase in appropriations. . . . [T]here need be no more appropriations and its activities could be enlarged indefinitely, as they were, almost to fantastic proportions. [emphasis added]

Even the Federal Reserve with its “quantitative easing” cannot fund infrastructure without driving up federal expenditures or debt, at least without changes to the Federal Reserve Act. The Fed is not allowed to spend money directly into the economy or to lend directly to Congress. It must go through the private banking system and its “primary dealers.” The Fed can create and pay only with “reserves” credited to the reserve accounts of banks. These reserves are a completely separate system from the deposits circulating in the real producer/consumer economy; and those deposits are chiefly created by banks when they make loans. (See the Bank of England’s 2014 quarterly report here.) New liquidity gets into the real economy when banks make loans to local businesses and individuals; and in risky environments like that today, banks are not lending adequately even with massive reserves on their books.

A publicly-owned national infrastructure bank, on the other hand, would be mandated to lend into the real economy; and if the loans were of the “self funding” sort characterizing most infrastructure projects (generating fees to pay off the loans), they would be repaid, canceling out the debt by which the money was created. That is how China built 12,000 miles of high-speed rail in a decade: credit created on the books of government-owned banks was advanced to pay for workers and materials, and the loans were repaid with profits from passenger fees.

Unlike the QE pumped into financial markets, which creates asset bubbles in stocks and housing, this sort of public credit mechanism is not inflationary. Credit money advanced for productive purposes balances the circulating money supply with new goods and services in the real economy. Supply and demand rise together, keeping prices stable. China increased its money supply by nearly 1800% over 24 years (from 1996 to 2020) without driving up price inflation, by increasing GDP in step with the money supply.
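For scale, the implied average growth rate follows from that figure; a minimal sketch (the 1800% figure is the article’s, the compounding arithmetic is standard):

# Money supply up nearly 1800% over the 24 years from 1996 to 2020,
# i.e., 19 times the starting level.
growth_multiple = 19.0
years = 24
cagr = growth_multiple ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.1%}")   # about 13% per year
# The claim above is that GDP grew in step, so prices remained stable.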

HR 6422, The National Infrastructure Bank Act of 2020

A promising new bill for a national infrastructure bank modeled on the RFC and the American System, H.R. 6422, was filed by Rep. Danny Davis, D-Ill., in March. The National Infrastructure Bank of 2020 (NIB) is projected to create $4 trillion or more in bank credit money to rebuild the nation’s rusting bridges, roads, and power grid; relieve traffic congestion; and provide clean air and water, new schools and affordable housing. It will do this while generating up to 25 million union jobs paying union-level wages. The bill projects a net profit to the government of $80 billion per year, which can be used to cover infrastructure needs that are not self-funding (broken pipes, aging sewers, potholes in roads, etc.). The bill also provides for substantial investment in “disadvantaged communities,” those defined by persistent poverty.

The NIB is designed to be a true depository bank, giving it the perks of those institutions for leverage and liquidity, including the ability to borrow at the Fed’s discount window without penalty at 0.25% interest (almost interest-free). According to Alphecca Muttardy, a former macroeconomist for the International Monetary Fund and chief economist on the 2020 NIB team, the NIB will create the $4 trillion it lends simply as deposits on its books, as the Bank of England attests all depository banks do. For liquidity to cover withdrawals, the NIB can either borrow from the Fed at 0.25% or issue and sell bonds.

Modeled on its American System predecessors, the NIB will be capitalized with existing federal government debt. According to the summary on the NIB Coalition website:

The NIB would be capitalized by purchasing up to $500 billion in existing Treasury bonds held by the private sector (e.g., in pension and other savings funds), in exchange for an equivalent in shares of preferred [non-voting] stock in the NIB. The exchange would take place via a sales contract with the NIB/Federal Government that guarantees a preferred stock dividend of 2% more than private-holders currently earn on their Treasuries. The contract would form a binding obligation to provide the incremental 2%, or about $10 billion per year, from the Budget. While temporarily appearing as mandatory spending under the Budget, the $10 billion per year would ultimately be returned as a dividend paid to government, from the NIB’s earnings stream.

Since the federal government will be paying the interest on the bonds, the NIB needs to come up with only the 2% dividend to entice investors. The proposal is to make infrastructure loans at a very modest 2%, substantially lower than the rates now available to the state and local governments that create most of the nation’s infrastructure. At a 10% capital requirement, the bonds can capitalize ten times their value in loans. The return will thus be 20% on a 2% dividend outlay from the NIB, for a net return on investment of 18% less operating costs. The U.S. Treasury will also be asked to deposit Treasury bonds with the bank as an “on-call” subscriber.
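To make that return arithmetic concrete, here is a minimal sketch using the round numbers above (illustrative only; it ignores operating costs, defaults and the timing of drawdowns):

capital = 500e9              # $500 billion in Treasuries exchanged for NIB stock
capital_requirement = 0.10   # 10% capital requirement
loan_rate = 0.02             # 2% rate on infrastructure loans
dividend_rate = 0.02         # incremental 2% promised to preferred shareholders

loans = capital / capital_requirement      # $5 trillion in lending capacity
interest_income = loans * loan_rate        # $100 billion per year
dividend_outlay = capital * dividend_rate  # $10 billion per year

print(f"Lending capacity: ${loans / 1e12:.1f} trillion")
print(f"Gross return on capital: {interest_income / capital:.0%}")                    # 20%
print(f"Net return on capital: {(interest_income - dividend_outlay) / capital:.0%}")  # 18%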

The American System: Sovereign Money and Credit

U.S. precedents for funding internal improvements with “sovereign credit” – credit issued by the national government rather than borrowed from the private banking system – go back to the American colonists’ paper scrip, colonial Pennsylvania’s “land bank”, and the First U.S. Bank of Alexander Hamilton, the first U.S. Treasury Secretary. Hamilton proposed to achieve the constitutional ideal of “promoting the general welfare” by nurturing the country’s fledgling industries with federal subsidies for roads, canals, and other internal improvements; protective measures such as tariffs; and easy credit provided through a national bank. Production and the money to finance it would all be kept “in house,” without incurring debt to foreign financiers. The national bank would promote a single currency, making trade easier, and would issue loans in the form of “sovereign credit.”

Senator Henry Clay called this model the “American System” to distinguish it from the “British System” that left the market to the “invisible hand” of “free trade,” allowing big monopolies to gobble up small entrepreneurs, and foreign bankers and industrialists to exploit the country’s labor and materials. After the charter for the First US Bank expired in 1811, Congress created the Second Bank of the United States in 1816 on the American System model.

In 1836, Pres. Andrew Jackson shut down the Second U.S. Bank due to perceived corruption, leaving the country with no national currency and precipitating a recession.  “Wildcat” banks issued their own banknotes – promissory notes allegedly backed by gold. But the banks often lacked the gold necessary to redeem the notes, and the era was beset with bank runs and banking crises.

Abraham Lincoln’s economic advisor was Henry Carey, the son of Matthew Carey, a well-known printer and publisher who had been tutored by Benjamin Franklin and had tutored Henry Clay. Henry Carey proposed creating an independent national currency that was non-exportable, one that would remain at home to do the country’s own work. He advocated a currency founded on “national credit,” something he defined as “a national system based entirely on the credit of the government with the people, not liable to interference from abroad.” It would simply be a paper unit of account that tallied work performed and goods delivered.

On that model, in 1862 Abraham Lincoln issued U.S. Notes or Greenbacks directly from the U.S. Treasury, allowing Lincoln’s government not only to avoid an exorbitant debt to British bankers and win the Civil War, but to fund major economic development, including tying the country together with the transcontinental railroad – an investment that actually turned a profit for the government.

After Lincoln was assassinated in 1865, the Greenback program was discontinued; but Lincoln’s government also passed the National Bank Act of 1863, supplemented by the National Bank Act of 1864. Originally known as the National Currency Act, its stated purpose was to stabilize the banking system by eradicating the problem of notes issued by multiple banks circulating at the same time. A single banker-issued national currency was created through chartered national banks, which could issue notes backed by the U.S. Treasury in a quantity proportional to the bank’s level of capital (cash and federal bonds) deposited with the Comptroller of the Currency.

From Roosevelt’s Reconstruction Finance Corporation (1932-57) to HR 6422

The American president dealing with an economic situation most closely resembling that today, however, was Franklin D. Roosevelt. America’s 32nd president resolved massive unemployment and infrastructure problems by greatly expanding the Reconstruction Finance Corporation (RFC) set up by his predecessor Herbert Hoover. The RFC was a remarkable publicly-owned credit machine that allowed the government to finance the New Deal and World War II without turning to Congress or the taxpayers for appropriations. The RFC was not called an infrastructure bank and was not even a bank, but it served the same basic functions. It was continually enlarged and modified by Pres. Roosevelt to meet the crisis of the times until it became America’s largest corporation and the world’s largest financial organization. Its semi-independent status let it work quickly, allowing New Deal agencies to be financed as the need arose. According to Encyclopedia.com:

[T]he RFC—by far the most influential of New Deal agencies—was an institution designed to save capitalism from the ravages of the Great Depression. Through the RFC, Roosevelt and the New Deal handed over $10 billion to tens of thousands of private businesses, keeping them afloat when they would otherwise have gone under ….

A similar arrangement could save local economies from the ravages of the global shutdowns today.

The Banking Acts of 1932 provided the RFC with capital stock of $500 million and the authority to extend credit up to $1.5 billion (subsequently increased several times). The initial capital came from a stock sale to the U.S. Treasury. With those modest resources, from 1932 to 1957 the RFC loaned or invested more than $40 billion. A small part of this came from its initial capitalization. The rest was financed with bonds sold to the Treasury, some of which were then sold to the public. The RFC ended up borrowing a total of $51.3 billion from the Treasury and $3.1 billion from the public.

Thus the Treasury was the lender, not the borrower, in this arrangement. As the self-funding loans were repaid, so were the bonds that were sold to the Treasury, leaving the RFC with a net profit. The RFC was the lender for thousands of infrastructure and small business projects that revitalized the economy, and these loans produced a total net income of over $690 million on the RFC’s “normal” lending functions (omitting such things as extraordinary grants for wartime). The RFC financed roads, bridges, dams, post offices, universities, electrical power, mortgages, farms, and much more – all while generating income for the government.

HR 6422 proposes to mimic this feat. The National Infrastructure Bank of 2020 can rebuild crumbling infrastructure across America, pushing up long-term growth, not only without driving up taxes or the federal debt, but without hyperinflating the money supply or generating financial asset bubbles. The NIB has growing support across the country from labor leaders, elected officials, and grassroots organizations. It can generate real wealth in the form of upgraded infrastructure and increased employment as well as federal and local taxes and GDP, paying for itself several times over without additional outlays from the federal government. With official unemployment at nearly double what it was a year ago and an economic crisis unlike the U.S. has seen in nearly a century, the NIB can trigger the sort of “economic miracle” the country desperately needs.

This article was first posted on ScheerPost.

The post Tackling the Infrastructure and Unemployment Crises: The “American System” Solution first appeared on Dissident Voice.

Why the Fed Needs Public Banks

The Fed’s policy tools – interest rate manipulation, quantitative easing, and “Special Purpose Vehicles” – have all failed to revive local economies suffering from government-mandated shutdowns. The Fed must rely on private banks to inject credit into Main Street, and private banks are currently unable or unwilling to do it. The tools the Fed actually needs are public banks, which could and would do the job.

On November 20, US Treasury Secretary Steven Mnuchin informed Federal Reserve Chairman Jerome Powell that he would not extend five of the Special Purpose Vehicles (SPVs) set up last spring to bail out bondholders, and that he wanted the $455 billion in taxpayer money back that the Treasury had sent to the Fed to capitalize these SPVs. The next day, Powell replied that he thought it was too soon – the SPVs still served a purpose – but he agreed to return the funds. Both had good grounds for their moves, but as Wolf Richter wrote on WolfStreet.com, “You’d think something earth-shattering happened based on the media hullabaloo that ensued.”

Richter noted that the expiration date on the SPVs had already been extended; that their purpose was “to bail out and enrich bondholders, particularly junk-bond holders and speculators with huge leveraged bets”; and that their use had been “minuscule by Fed standards.” They had done their job, which was mostly to be “a jawboning tool to inflate asset prices.” Investors and speculators, confident that the Fed had their backs, had “created wondrous credit markets that are now frothing at the mouth,” making the bond speculators quite rich. However, in Mnuchin’s own words, “The people that really need support right now are not the rich corporations, it is the small businesses, it’s the people who are unemployed.” So why aren’t they getting the support? According to Richter:

Powell himself has been badgering Congress for months to provide more fiscal support to small businesses and other entities because the Fed was not well suited to do so, which was the reason the Main Street Lending Program (MSLP) never really got off the ground.

The reason the Fed is not well suited to the task is that it is not allowed to make loans directly to Main Street businesses. It must rely on banks to do it, and private banks are currently unable or unwilling to make those loans as needed. But publicly-owned banks would. Fortunately, several promising public bank bills were recently introduced in Congress that could help resolve this crisis.

But first, a look at why the Fed’s own efforts have failed.

The Fed Lacks the Tools to Inject Liquidity into the Real Economy

Congress has charged the Federal Reserve with a dual mandate: to maintain the stability of the currency (prevent inflation or deflation) and maintain full employment.  Not only are we a long way from full employment, but the stability of the currency is in question, although economists disagree on whether we are headed for massive inflation or crippling deflation. Food prices and other at-home costs are up; but away-from-home costs (gas, flights, hotels, entertainment, office apparel) are down. Food prices are up not because of “too much money chasing too few goods” (demand/pull inflation) but because of supply and production problems (cost/push inflation). In terms of “output,” we are definitely looking at deflation. An August 2020 Bloomberg article quotes economist Lacy Hunt:

[A]ccording to the figures of the Congressional Budget Office, the output gap will be a record this year and we will have a deflationary gap. In other words, potential GDP will be well above real GDP. And according to the CBO, we’re going to have a deflationary output gap through 2030.

The Fed’s monetary policies, it seems, are not working. On November 11 and 12, according to Reuters:

[T]he world’s top central bankers … tune[d] into the European Central Bank’s annual policy symposium … to figure out why monetary policy is not working as it used to and what new role they must play in a changed world – be it fighting inequality or climate change.

… Central banks’ failure to achieve their targets is beginning to challenge a key tenet of monetary theory: that inflation is always a factor of their policy and that prices rise as unemployment falls.

The Fed adopted a fixed 2% target in 2012. To achieve it, explains investment writer James Molony, they “have implemented unprecedented policies. Interest rates have been slashed, in some cases to near zero, and they have engaged in printing money in order to buy bonds and other assets, otherwise known as quantitative easing.”

Lowering the interest rate is supposed to encourage lending, which increases the circulating money supply and generates the demand necessary to prompt producers to increase GDP. But the fed funds rate, the only rate the central bank controls, is nearly at zero; and the equivalent rates in the European Union and Japan are actually in negative territory. Yet in none of these three countries has the central bank been able to reach its inflation target.

The Fed has now resorted to “average inflation targeting” – meaning it will allow inflation to run above its 2% target to make up for periods when inflation was below 2%. To turn up the economic heat, Chairman Powell has been pleading for more stimulus from Congress. If Congress issues bonds, increasing the federal debt, the Fed can buy the bonds; and the money spent into the economy will increase the money supply. But federal legislators have not been able to agree on the terms of a stimulus package.
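The mechanics of “average inflation targeting” are easier to see with numbers. A hypothetical sketch follows; the Fed has not published a fixed make-up formula, so the four-year averaging window here is an assumption for illustration:

target = 0.02
past_inflation = [0.012, 0.015, 0.017]   # three years of undershooting 2%

window = len(past_inflation) + 1
# Inflation next year that would pull the multi-year average back to target:
makeup = target * window - sum(past_inflation)
print(f"Inflation tolerated next year: {makeup:.1%}")   # 3.6%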

Why can’t the Fed do the job itself, though? In a 2002 speech on preventing deflation, former Fed Chairman Ben Bernanke argued (citing Milton Friedman) that it was relatively easy to fix a deflationary recession: just fly over the people in helicopters and drop money on them. They would then spend it on consumer goods, creating the demand necessary to prompt productivity. So where are the Fed’s helicopters?

“The Fed Doesn’t ‘Do’ Money.”

In a recent article titled “Where Is It, Chairman Powell?”, Jeffrey Snider, Head of Global Research at Alhambra Investments, questioned whether the Fed’s policies were creating inflation as alleged at all. He wrote:

After spending months deliberately hyping a “flood” of digital money printing, and then unleashing average inflation targeting making Americans believe the central bank will be wickedly irresponsible when it comes to consumer prices, the evidence portrays a very different set of circumstances. Inflationary pressures were supposed to have been visible by now, seven months and counting, when instead it is disinflation which is most evident – and it is spreading.

The problem, said Snider, is that “The Fed doesn’t do money, therefore there’s no way the Fed can have its monetary inflation.”

The Fed doesn’t “do” money? What does that mean?

As explained by Prof. Joseph Huber, chair of economic and environmental sociology at Martin Luther University of Halle-Wittenberg, Germany, we have a two-tiered money system. The only monies the central bank can create and spend are “bank reserves,” and these circulate only between banks. The central bank is not allowed to spend money directly into the economy or to lend it to local businesses. It is not even allowed to lend it directly to Congress. Rather, it must go through the private banking system. When the central bank buys assets (bonds or debt), it simply credits the reserve accounts of the banks from which the assets were bought; and banks cannot spend or lend these reserves except to each other. In an article titled “Repeat After Me: Banks Cannot And Do Not ‘Lend Out’ Reserves,” Paul Sheard, Chief Global Economist for Standard & Poor’s, explained:

Many talk as if banks can “lend out” their reserves, raising concerns that massive excess reserves created by QE could fuel runaway credit creation and inflation in the future. But banks cannot lend their reserves directly to commercial borrowers, so this concern is misplaced….

Banks don’t lend out of deposits; nor do they lend out of reserves. They lend by creating deposits. And deposits are also created by government deficits. [Emphasis added.]

The deposits circulating in the producer/consumer economy are created, not by the Fed, but by banks when they make loans. (See the Bank of England’s 2014 quarterly report here.) The central bank does create paper cash, but this money too gets into the economy only when other financial institutions buy or borrow it from the central bank in response to demand from their customers. The circulating money supply increases when banks make loans to businesses and individuals; and in risky environments like today’s, private banks are pulling back from Main Street lending, even with massive central bank reserves on their books.
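A toy double-entry sketch of that two-tier structure, as described here (hypothetical balances; real QE and bank accounting involve more parties and entries):

bank = {"reserves_at_fed": 0.0, "loans": 0.0, "deposits": 0.0}

def fed_buys_bonds_from_bank(amount):
    # QE as described above: the Fed credits the bank's reserve account.
    # No deposits are created, so nothing reaches the real economy yet.
    bank["reserves_at_fed"] += amount

def bank_makes_loan(amount):
    # Lending: the bank creates a new deposit and a matching loan asset.
    # Reserves are not "lent out"; they are untouched by this entry.
    bank["loans"] += amount
    bank["deposits"] += amount

fed_buys_bonds_from_bank(100.0)
print(bank)   # {'reserves_at_fed': 100.0, 'loans': 0.0, 'deposits': 0.0}
bank_makes_loan(40.0)
print(bank)   # {'reserves_at_fed': 100.0, 'loans': 40.0, 'deposits': 40.0}
# Circulating money grows only at the second step, the step private banks
# are declining to take in a risky environment.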

The Tools the Fed Needs to Get Liquidity into the Economy

Private banks are not following through on the Fed’s attempted money injections, but publicly-owned banks would. In countries with strong government-owned banking systems, public banks have historically increased their lending when private banks pulled back. Public banks have a mandate to stimulate their local economies; and unlike private banks, they can do it and still turn a profit, because they have lower costs. They have eliminated the parasitic profit-extracting middlemen, and they do not have to focus on short-term profits to please their shareholders. They can pour their resources into improving the long-term prospects of the economy and its infrastructure, stimulating local productivity and strengthening the tax base.

Three promising new bills are before Congress that would facilitate the establishment of a public banking system in the US.

HR 8721, “The Public Banking Act”, was introduced on Oct. 30, 2020. As described on Vox, the Act would “foster the creation of public [state and local government-owned] banks across the country by providing them a pathway to getting started, establishing an infrastructure for liquidity and credit facilities for them via the Federal Reserve, and setting up federal guidelines for them to be regulated. Essentially, it would make it easier for public banks to exist, and it would give some of them grant money to get started.”

Another bill, introduced in September by Sens. Bernie Sanders and Kirsten Gillibrand, is The Postal Banking Act, which the authors said would

  • Create $9 billion in revenue for the postal service, saving it from privatization;
  • Protect low-income or rural families and communities from predatory lending; and,
  • Reestablish postal banking to provide basic, low-cost financial services to those who cannot access banks

The third bill, HR 6422, “The National Infrastructure Bank Act of 2020,” is modeled on Franklin Roosevelt’s Reconstruction Finance Corporation, which funded the rebuilding of the US economy in the Great Depression of the 1930s. According to its advocates, HR 6422 will build or restore over $4 trillion in infrastructure and create up to 25 million union jobs, while being “revenue neutral” (not burdening the federal government’s budget). The promise of HR 6422 and the model of the “American System” that inspired it – the innovative banking systems of Alexander Hamilton, Abraham Lincoln and Franklin Roosevelt – will be the subject of another article.

This article was first posted on ScheerPost.

The post Why the Fed Needs Public Banks first appeared on Dissident Voice.

From Lockdown to Police State: The “Great Reset” Rolls Out

Mayhem in Melbourne

On August 2, lockdown measures were implemented in Melbourne, Australia, that were so draconian that Australian news commentator Alan Jones said on Sky News: “People are entitled to think there is an ‘agenda to destroy western society.’”

The gist of an August 13th article on the Melbourne lockdown is captured in the title: “Australian Police Go FULL NAZI, Smashing in Windows of Civilian Cars Just Because Passengers Wouldn’t Give Details About Where They Were Going.”

Another article with an arresting title was by Guy Burchell in the August 7th Australian National Review: “Melbourne Cops May Now Enter Homes Without a Warrant, After 11 People Die of COVID — Australia, This Is Madness, Not Democracy.” Burchell wrote that only 147 people had lost their lives to coronavirus in Victoria (the Australian state of which Melbourne is the capital), a very low death rate compared to other countries. The ramped-up lockdown measures were triggered by an uptick in cases due to expanded testing, and by 11 additional deaths, all of them in nursing homes (where lockdown measures would actually have little effect). The new rules include a six-week curfew from 8 PM to 5 AM, with residents allowed to leave home outside those curfew hours only to shop for food and essential items (one household member only), and for caregiving, work and exercise (limited to one hour).

“But the piece de resistance,” writes Burchell, “has to be that now police officers can enter homes with neither a warrant nor permission. This is an astonishing violation of civil liberties…. Deaths of this kind are not normally cause for government action, let alone the effective house arrest of an entire city.” He quoted Victoria Premier Daniel Andrews, who told Victorians, “there is literally no reason for you to leave your home and if you were to leave your home and not be found there, you will have a very difficult time convincing Victoria police that you have a lawful reason.” Burchell commented:

[U]nder this new regime you can’t even remain in your house unmolested by the cops, they can just pop ‘round anytime to make sure you haven’t had Bruce and Sheila from next door round for a couple of drinks. All over a disease that is simply not that fatal….

Last year more than 310,000 Australians were hospitalised with flu and over 900 died. By all metrics that makes flu a worse threat than COVID-19 but police weren’t granted Stasi-like powers during the flu season. Millions of people weren’t confined to their homes and threatened with AUS$5,000 fines for not having a good reason for being out of their homes.

At an August 19th press conference, Australia’s second most senior medical officer said the government would be discussing measures to coerce vaccine resisters, such as banning them from restaurants, international travel and public transport, and withholding government benefits under “No Jab No Pay.”

An August 13 article on LifeSiteNews quoted Father Glen Tattersall, a Catholic parish priest in Melbourne, who said the draconian provisions “simply cannot be justified on a scientific basis”:

We have a curfew from 8 pm to 5 am, rigorously enforced including by the use of police helicopters and search lights. Is the virus a vampire that just comes out at night? Or the wearing of masks: they must be worn everywhere outside, even in a park where you are nowhere near any other person. Why? Does the virus leap hundreds of metres through the air? This is all about inducing mass fear, and humiliating the populace by demanding external compliance.

Why the strict curfew? Curfews have been implemented recently in the US to deter violence during protests, but no violence of that sort was reported in Melbourne. What was reported, at least on social media, were planes landing in the night from the Chinese province of Guangdong carrying equipment related to 5G and the Chinese biometric social credit system, which was reportedly being installed under a blanket of secrecy.

Angelo Codevilla, professor emeritus at Boston University, concluded in an August 13th article, “We are living through a coup d’état based on the oldest of ploys: declaring emergencies, suspending law and rights, and issuing arbitrary rules of behavior to excuse taking ‘full powers’.”

Questioning the Narrative

Melbourne has gone to extremes with its lockdown measures, but it could portend things to come globally. Lockdowns were originally sold to the public as being necessary just for a couple of weeks to “flatten the curve,” to prevent hospital overcrowding from COVID-19 cases. It has now been over five months, with self-appointed vaccine czar Bill Gates intoning that we will not be able to return to “normal” until the entire global population of 7 billion people has been vaccinated. He has since backed off on the numbers, but commentators everywhere are reiterating that lockdowns are the “new normal,” which could last for years.

All this is such a radical curtailment of our civil liberties that we need to look closely at the evidence justifying it; and when we do, that evidence is weak. The isolation policies were triggered by estimates from the Imperial College London of 510,000 UK deaths and 2.2 million US deaths, more than 10 times the actual death rate from COVID-19. A Stanford University antibody study estimated that the fatality rate if infected was only about 0.1 to 0.2 percent; and in an August 4th blog post, Bill Gates himself acknowledged that the death rate was only 0.14 percent, not much higher than for the flu. But restrictive measures have gotten more onerous rather than less as the mortality figures have been revised downward.

A July 2020 UK study from Loughborough and Sheffield Universities found that government policy over the lockdown period has actually increased mortality rather than reducing it, after factoring in collateral damage including deaths from cancers and other serious diseases that are being left untreated, a dramatic increase in suicides and drug overdose, and poverty and malnourishment due to unemployment. Globally, according to UNICEF, 1.2 million child deaths are expected as a direct result of the lockdowns. A data analyst in South Africa asserts that the consequences of the country’s lockdown will lead to 29 times more deaths than from the coronavirus itself.

Countries and states that did very little to restrict their populations, including Sweden and South Dakota, have fared as well as or better overall than locked-down US states. In an August 12th article in The UK Telegraph titled “Sweden’s Success Shows the True Cost of Our Arrogant, Failed Establishment,” Allister Heath writes:

Sweden got it largely right, and the British establishment catastrophically wrong. Anders Tegnell, Stockholm’s epidemiologist-king, has pulled off a remarkable triple whammy: far fewer deaths per capita than Britain, a maintenance of basic freedoms and opportunities, including schooling, and, most strikingly, a recession less than half as severe as our own.

Not restraining the populace has allowed Sweden’s curve to taper off naturally through “herd immunity,” with daily deaths down to single digits for the last month.

The Pandemic That Wasn’t?

Also bringing the official narrative into question is the unreliability of the tests on which the lockdowns have been based. In a Wired interview, even Bill Gates acknowledged that most US test results are “garbage.” The Polymerase Chain Reaction (PCR) technology used in the nasal swab test is considered the “gold standard” for COVID-19 detection; yet the PCR test was regarded by its own inventor, Nobel prize winner Kary Mullis, as inappropriate to detect viral infection. In a detailed June 27th analysis titled “COVID-19 PCR Tests Are Scientifically Meaningless,” Torsten Engelbrecht and Konstantin Demeter conclude:

Without doubt eventual excess mortality rates are caused by the therapy and by the lockdown measures, while the “COVID-19” death statistics comprise also patients who died of a variety of diseases, redefined as COVID-19 only because of a “positive” test result whose value could not be more doubtful.

The authors discussed a January 2007 New York Times article titled “Faith in Quick Test Leads to Epidemic That Wasn’t,” describing an apparent whooping cough epidemic in a New Hampshire hospital. The epidemic was verified by preliminary PCR tests given to nearly 1,000 healthcare workers, who were subsequently furloughed. Eight months later, the “epidemic” was found to be a false alarm. Not a single case of whooping cough was confirmed by the “gold standard” test – growing pertussis bacteria in the laboratory. All of the cases found through the PCR test were false positives.
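The statistical trap in that episode is the base-rate effect: when true prevalence is low, even a mildly imperfect test produces mostly false positives. A sketch with hypothetical error rates (the article does not give the actual error rates of any assay):

tested = 1000                # e.g., a hospital's workforce
false_positive_rate = 0.03   # assumed: 3% of true negatives test positive
sensitivity = 0.95           # assumed: 95% of true cases are detected

# The New Hampshire case: nobody actually had whooping cough.
prevalence = 0.0
apparent_cases = tested * (1 - prevalence) * false_positive_rate
print(f"Apparent cases: {apparent_cases:.0f}")   # 30, every one a false positive

# Even with 1% real prevalence, most positives are still false:
prevalence = 0.01
true_found = tested * prevalence * sensitivity
false_found = tested * (1 - prevalence) * false_positive_rate
print(f"Share of positives that are false: {false_found / (true_found + false_found):.0%}")   # 76%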

Yet “test, test, test” was the message proclaimed for all countries by WHO Director General Tedros Adhanom at a media briefing on March 16, 2020, five days after the WHO officially declared COVID-19 a pandemic; and the test recommended as the gold standard was the PCR. Why, when it had already been demonstrated to be unreliable, creating false positives that gave the appearance of an epidemic when there was none? Or was that the goal – to create the appearance of a pandemic, one so vast that the global economy had to be brought to a standstill until a vaccine could be found? Recall Prof. Codevilla’s conclusion: “We are living through a coup d’état based on the oldest of ploys: declaring emergencies, suspending law and rights, and issuing arbitrary rules of behavior to excuse taking ‘full powers’.”

People desperate to get back to work will not only submit to a largely untested vaccine but will agree to surveillance measures that would have been considered a flagrant violation of their civil rights if those rights had not been overridden by a “national emergency” justifying preemption by the police powers of the state. They will agree to get “immunity passports” in order to travel and participate in group activities, and they will submit to quarantines, curfews, contact tracings, social credit scores and informing on the neighbors. The emergency must be kept going to justify these unprecedented violations of their liberties, in which decision-making is removed from elected representatives and handed to unelected bureaucrats and technocrats.

A national health crisis is also a necessary prerequisite for relief from liability for personal injuries from the drugs and other products deployed in response to the crisis. Under the 2005 Public Readiness and Emergency Preparedness Act (PREPA), in the event of a declared public health emergency, manufacturers are shielded from tort liability for injuries both from the vaccines and from invalid or invasive tests. Compensation for personal injuries is a massive expense for drug companies, and the potential profits from a product free of that downside are a gold mine for pharmaceutical companies and investors. The liabilities will be borne by the taxpayers and the victims.

All this, however, presupposes both an existing public health emergency and no effective treatment to defuse it. That helps explain the otherwise inexplicable war on hydroxychloroquine, a safe drug that has been in use and available over the counter for 65 years and has been shown to be effective in multiple studies when used early in combination with zinc and an antibiotic. A table prepared by the American Association of Physicians and Surgeons (below) found that the US has nearly 30 times as many deaths per capita as countries making early and prophylactic use of hydroxychloroquine.

[Table omitted: AAPS comparison of COVID-19 deaths per capita in countries with and without early hydroxychloroquine use.]

The latest international testing of hydroxychloroquine treatment of coronavirus shows countries that had early use of the drug had a 79% lower mortality rate than countries that banned the use of the safe malaria drug. Lowering the US mortality rate by 79% could have saved over 100,000 lives. But an effective, inexpensive COVID-19 treatment would mean the end of the alleged pandemic and the vaccine bonanza it purports to justify.

The need to maintain the appearance of a pandemic also explains the inflated reports of cases and deaths. Hospitals have been rewarded with increased fees for reclassifying cases as COVID-19.  As deaths declined in the US, the numbers of cases reported by the Centers for Disease Control were also gamed to make it appear that America was in a “second wave” of a pandemic. The reporting criterion was changed on May 18 from people who tested positive for the virus only to people who tested positive for either the virus or its antibodies. The exploding numbers thus include people who have recovered from COVID-19 as well as false positives. The Loughborough and Sheffield researchers found that when controlling for other factors affecting mortality, actual deaths due to COVID-19 are 54% to 63% lower than implied by the standard excess deaths measure.

Ushering in “The Great Reset”

Forcing compliance with global vaccine mandates is one obvious motive for maintaining the appearance of an ongoing pandemic, but what would be the motive for destroying the global economy with forced lockdowns? What is behind the “agenda to destroy Western society” suspected by Australian commentator Alan Jones?

Evidently it is this: destroying the old is necessary to usher in the new. Global economic destruction paves the way for the “Great Reset” now being promoted by the World Economic Forum, the Bill and Melinda Gates Foundation, the International Monetary Fund and other big global players.

Although cast as arising from the pandemic, the “global economic reset” is a concept that was floated as early as 2014 by Christine Lagarde, then head of the IMF, and is said to be a recharacterization of the “New World Order” discussed long before that. It was promoted as a solution to the ongoing economic crisis triggered in 2008.

The World Economic Forum – that elite group of businessmen, politicians and academics that meets in Davos, Switzerland, every January – announced in June that the Great Reset would be the theme of its 2021 Summit. Klaus Schwab, founder of the Forum, admonished:

The world must act jointly and swiftly to revamp all aspects of our societies and economies, from education to social contracts and working conditions. Every country, from the United States to China, must participate, and every industry, from oil and gas to tech, must be transformed.

No country will be allowed to opt out because it would be endangering the rest, just as no person will be allowed to escape the COVID-19 vaccine for the same reason.

Who is behind the Great Reset and what it really entails are major questions that need their own article, but suffice it to say here that to escape the trap of the globalist agenda, we need a mass awakening to what is really going on and collective resistance to it while there is still time. There are hopeful signs that this is happening, including massive protests against economic shutdowns and restrictions, particularly in Europe; a rash of lawsuits challenging the constitutionality of the lockdowns and of police power overreach; and a flood of alternative media exposés despite widespread censorship.

Life as we know it will change. We need to ensure that it changes in ways that serve the people and the productive economy, while preserving our national sovereignty and hard-won personal freedoms.

Meet BlackRock, the New Great Vampire Squid

BlackRock is a global financial giant with customers in 100 countries and its tentacles in major asset classes all over the world; and it now manages the spigots to trillions of bailout dollars from the Federal Reserve. The fate of a large portion of the country’s corporations has been put in the hands of a megalithic private entity with the private capitalist mandate to make as much money as possible for its owners and investors; and that is what it has proceeded to do.

To most people, if they are familiar with it at all, BlackRock is an asset manager that helps pension funds and retirees manage their savings through “passive” investments that track the stock market. But working behind the scenes, it is much more than that. BlackRock has been called “the most powerful institution in the financial system,” “the most powerful company in the world” and the “secret power.” It is the world’s largest asset manager and “shadow bank,” larger than the world’s largest bank (which is in China), with over $7 trillion in assets under direct management  and another $20 trillion managed through its Aladdin risk-monitoring software. BlackRock has also been called “the fourth branch of government” and “almost a shadow government”, but no part of it actually belongs to the government. Despite its size and global power, BlackRock is not even regulated as a “Systemically Important Financial Institution” under the Dodd-Frank Act, thanks to pressure from its CEO Larry Fink, who has long had “cozy” relationships with government officials.

BlackRock’s strategic importance and political weight were evident when four BlackRock executives, led by former Swiss National Bank head Philipp Hildebrand, presented a proposal at the annual meeting of central bankers in Jackson Hole, Wyoming, in August 2019 for an economic reset that was actually put into effect in March 2020. Acknowledging that central bankers were running out of ammunition for controlling the money supply and the economy, the BlackRock group argued that it was time for the central bank to abandon its long-vaunted independence and join monetary policy (the usual province of the central bank) with fiscal policy (the usual province of the legislature). They proposed that the central bank maintain a “Standing Emergency Fiscal Facility” that would be activated when interest rate manipulation was no longer working to avoid deflation. The Facility would be deployed by an “independent expert” appointed by the central bank.

The COVID-19 crisis presented the perfect opportunity to execute this proposal in the US, with BlackRock itself appointed to administer it. In March 2020, it was awarded a no-bid contract under the Coronavirus Aid, Relief, and Economic Security Act (CARES Act) to deploy a $454 billion slush fund established by the Treasury in partnership with the Federal Reserve. This fund in turn could be leveraged to provide over $4 trillion in Federal Reserve credit. While the public was distracted with protests, riots and lockdowns, BlackRock suddenly emerged from the shadows to become the “fourth branch of government,” managing the controls to the central bank’s print-on-demand fiat money. How did that happen and what are the implications?

Rising from the Shadows

BlackRock was founded in 1988 in partnership with the Blackstone Group, a multinational private equity management firm that would become notorious after the 2008-09 banking crisis for snatching up foreclosed homes at firesale prices and renting them at inflated prices. BlackRock first grew its balance sheet in the 1990s and 2000s by promoting the mortgage-backed securities (MBS) that brought down the economy in 2008. Knowing the MBS business from the inside, it was then put in charge of the Federal Reserve’s “Maiden Lane” facilities. Called “special purpose vehicles,” these were used to buy “toxic” assets (largely unmarketable MBS) from Bear Stearns and American International Group (AIG), something the Fed was not legally allowed to do itself.

BlackRock really made its fortunes, however, in “exchange traded funds” (ETFs). It gained trillions in investable assets after it acquired the iShares series of ETFs in a takeover of Barclays Global Investors in 2009. By 2020, the wildly successful iShares series included over 800 funds and $1.9 trillion in assets under management.

Exchange-traded funds are bought and sold like shares but operate as index-tracking funds, passively following specific indices such as the S&P 500, the benchmark index of America’s largest corporations and the index in which most people invest. Today the fast-growing ETF sector controls nearly half of all investments in US stocks, and it is highly concentrated. The sector is dominated by just three giant American asset managers – BlackRock, Vanguard and State Street, the “Big Three” – with BlackRock the clear global leader. By 2017, the Big Three together had become the largest shareholder in almost 90% of S&P 500 firms, including Apple, Microsoft, ExxonMobil, General Electric and Coca-Cola. BlackRock also owns major interests in nearly every mega-bank and in major media.

In March 2020, based on its expertise with the Maiden Lane facilities and its sophisticated Aladdin risk-monitoring software, BlackRock got the job of dispensing Federal Reserve funds through eleven “special purpose vehicles” (SPVs) authorized under the CARES Act. Like the Maiden Lane facilities, these vehicles were designed to allow the Fed, which is legally limited to purchasing safe federally guaranteed assets, to finance the purchase of riskier assets in the market.

BlackRock Bails Itself Out

The national lockdown left states, cities and local businesses in desperate need of federal government aid. But according to David Dayen in The American Prospect, as of May 30 (the Fed’s last monthly report), the only purchases made under the Fed’s new BlackRock-administered SPVs were ETFs, mainly owned by BlackRock itself. Between May 14 and May 20, about $1.58 billion in ETFs were bought through the Secondary Market Corporate Credit Facility (SMCCF), of which $746 million or about 47% came from BlackRock ETFs. The Fed continued to buy more ETFs after May 20, and investors piled in behind, resulting in huge inflows into BlackRock’s corporate bond ETFs.
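
For readers who want to check the arithmetic, here is a quick back-of-the-envelope calculation in Python (a minimal sketch using the figures cited above; the variable names are mine):

# Back-of-the-envelope check of the SMCCF purchase figures cited above.
smccf_etf_purchases = 1.58e9    # ~$1.58 billion in ETFs bought May 14-20
blackrock_etf_amount = 746e6    # ~$746 million of that in BlackRock's own ETFs

share = blackrock_etf_amount / smccf_etf_purchases
print(f"BlackRock share of SMCCF ETF purchases: {share:.0%}")   # ~47%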

In fact, these ETFs needed a bailout; and BlackRock used its very favorable position with the government to get one. The complicated mechanisms and risks underlying ETFs are explained in an April 3 article by business law professor Ryan Clements, who begins his post:

Exchange-Traded Funds (ETFs) are at the heart of the COVID-19 financial crisis. Over forty percent of the trading volume during the mid-March selloff was in ETFs ….

The ETFs were trading well below the value of their underlying bonds, which were dropping like a rock. Some ETFs were failing altogether. The problem was something critics had long warned of: while ETFs are very liquid, trading on demand like stocks, the assets that make up their portfolios are not. When the market drops and investors flee, the ETFs can have trouble coming up with the funds to settle up without trading at a deep discount; and that is what was happening in March.

According to a May 3 article in The National:

The sector was ultimately saved by the US Federal Reserve’s pledge on March 23 to buy investment-grade credit and certain ETFs. This provided the liquidity needed to rescue bonds that had been floundering in a market with no buyers.

Prof. Clements states that if the Fed had not stepped in, “a ‘doom loop’ could have materialized where continued selling pressure in the ETF market exacerbated a fire-sale in the underlying [bonds], and again vice-versa, in a procyclical pile-on with devastating consequences.” He observes:

There’s an unsettling form of market alchemy that takes place when illiquid, over-the-counter bonds are transformed into instantly liquid ETFs. ETF “liquidity transformation” is now being supported by the government, just like liquidity transformation in mortgage backed securities and shadow banking was supported in 2008.
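
The mechanics of that “doom loop” can be illustrated with a toy calculation. The Python sketch below uses invented numbers, not actual March 2020 data; it simply shows how an ETF trading below the value of its underlying bonds can feed on itself once forced selling pushes both prices down:

# Toy illustration of the ETF "doom loop" described above.
# All numbers are invented for illustration; they are not market data.

nav_per_share = 100.00   # stated value of the underlying bond portfolio, per share
market_price = 92.00     # price at which panicked sellers can actually exit

discount = (market_price - nav_per_share) / nav_per_share
print(f"Initial discount to NAV: {discount:.1%}")   # -8.0%

# Each round: redemptions force fire-sales of illiquid bonds (NAV falls),
# and the falling NAV invites more selling (the price falls faster still).
for round_number in range(1, 4):
    nav_per_share *= 0.95    # assumed 5% haircut per round of forced bond sales
    market_price *= 0.92     # assumed price falls faster as sellers panic
    discount = (market_price - nav_per_share) / nav_per_share
    print(f"Round {round_number}: NAV {nav_per_share:.2f}, "
          f"price {market_price:.2f}, discount {discount:.1%}")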

Working for Whom?

BlackRock got a bailout with no debate in Congress, no “penalty” interest rate of the sort imposed on states and cities borrowing in the Fed’s Municipal Liquidity Facility, no complicated paperwork or waiting in line for scarce Small Business Administration loans, no strings attached. It just quietly bailed itself out.

It might be argued that this bailout was good and necessary, since the market was saved from a disastrous “doom loop,” and so were the pension funds and the savings of millions of investors. Although BlackRock has a controlling interest in all the major corporations in the S&P 500, it professes not to “own” the funds. It just acts as a kind of “custodian” for its investors — or so it claims. But BlackRock and the other Big Three asset managers vote the corporations’ shares; so from the point of view of management, they are the owners. And as observed in a 2017 article from the University of Amsterdam titled “These Three Firms Own Corporate America,” they vote 90% of the time in favor of management. That means they tend to vote against shareholder initiatives, against labor, and against the public interest. BlackRock is not actually working for us, although we the American people have now become its largest client base.

In a 2018 review titled “Blackrock – The Company That Owns the World”, a multinational research group called Investigate Europe concluded that BlackRock “undermines competition through owning shares in competing companies, blurs boundaries between private capital and government affairs by working closely with regulators, and advocates for privatization of pension schemes in order to channel savings capital into its own funds.”

Daniela Gabor, Professor of Macroeconomics at the University of the West of England in Bristol, concluded after following a number of regulatory debates in Brussels that it was no longer the banks that wielded the financial power; it was the asset managers. She said:

We are often told that a manager is there to invest our money for our old age. But it’s much more than that. In my opinion, BlackRock reflects the renunciation of the welfare state. Its rise in power goes hand-in-hand with ongoing structural changes; in finance, but also in the nature of the social contract that unites the citizen and the state.

That these structural changes are planned and deliberate is evident in BlackRock’s August 2019 white paper laying out an economic reset that has now been implemented with BlackRock at the helm.

Public policy is made today in ways that favor the stock market, which is considered the barometer of the economy, although it has little to do with the strength of the real, productive economy. Giant pension and other investment funds largely control the stock market, and the asset managers control the funds. That effectively puts BlackRock, the largest and most influential asset manager, in the driver’s seat in controlling the economy.

As Peter Ewart notes in a May 14 article on BlackRock titled “Foxes in the Henhouse,” today the economic system “is not classical capitalism but rather state monopoly capitalism, where giant enterprises are regularly backstopped with public funds and the boundaries between the state and the financial oligarchy are virtually non-existent.”

If the corporate oligarchs are too big and strategically important to be broken up under the antitrust laws, rather than bailing them out they should be nationalized and put directly into the service of the public. At the very least, BlackRock should be regulated as a too-big-to-fail Systemically Important Financial Institution. Better yet would be to regulate it as a public utility. No private, unelected entity should have the power over the economy that BlackRock has, without a legally enforceable fiduciary duty to wield it in the public interest.

Rushing a Vaccine to Market for a Vanishing Virus

More than 100 companies are competing to be first in the race to get a COVID-19 vaccine to market. It’s a race against time, not because the death rate is climbing but because it is falling – to the point where there will soon be too few infected subjects to demonstrate a vaccine’s effectiveness.

Pascal Soriot is chief executive of AstraZeneca, a British-Swedish pharmaceutical company that is challenging biotech company Moderna, the U.S. frontrunner in the race. Soriot said on May 24, “The vaccine has to work and that’s one question, and the other question is, even if it works, we have to be able to demonstrate it. We have to run as fast as possible before the disease disappears so we can demonstrate that the vaccine is effective.”

Like other coronaviruses, the virus that causes COVID-19 is expected to mutate at least every season, raising serious questions about claims that any vaccine will work. A successful vaccine has never been developed for any of the many strains of coronaviruses, due to the nature of the virus itself; and vaccinated people can have a higher chance of serious illness and death when later exposed to another strain of the virus, a phenomenon known as “virus interference.” An earlier SARS vaccine never made it to market because the laboratory animals it was tested on contracted more serious symptoms on re-infection, and most of them died.

Researchers working with the AstraZeneca vaccine claimed success in preliminary studies because its lab monkeys all survived and formed antibodies to the virus, but data reported later showed that the animals all became infected when challenged, raising serious questions about the vaccine’s effectiveness.

Moderna has gotten fast-track approval from the FDA and managed to skip animal trials altogether before rushing to human trials. Its candidate is a “messenger RNA” vaccine, a computer-generated replica of an RNA component that carries genetic information controlling the synthesis of proteins. No mRNA vaccine has ever been approved for marketing or proven in a large-scale clinical trial. As explained in Science magazine, RNA that invades from outside the cell is the hallmark of a virus, and our immune systems have evolved ways to recognize and destroy it. To avoid that, Moderna’s mRNA vaccine sneaks into cells encapsulated in nanoparticles, which aren’t easily degraded and can cause toxic buildup in the liver.

These concerns, however, have not deterred the U.S. Department of Health and Human Services (HHS), which is proceeding at “Warp Speed” to get the new technologies tested on the American population before the virus disappears through mutation and natural herd immunity. HHS has already agreed to provide up to $1.2 billion to AstraZeneca and $483 million to Moderna to develop their experimental candidates. “As American taxpayers, we are justified in asking why,” writes William Haseltine in Forbes. Both companies have attracted billions from private investors and don’t need taxpayer money, and the government’s speculative bets are being made on unproven technologies in the early stages of testing.

The argument at one time was that the magnitude of the crisis justified the risk, but the virus is now disappearing of its own accord. The computer-modeled projection of 2.2 million U.S. deaths issued by Imperial College London (a business partner of AstraZeneca), which triggered shutdowns across the United States, was subsequently found to be “wildly” overblown. The model was described in the UK Telegraph on May 16 as “the most devastating software mistake of all time.” The engineers who reviewed the code wrote that “we would fire anyone for developing code like this” and that the question was “why our Government did not get a second opinion before swallowing Imperial’s prescription.”

The U.S. Centers for Disease Control and Prevention (CDC) has also revised its projections. Experts disagree on what the new data mean, but according to an expert at the Montreal Economic Institute, “The most likely CDC scenario now estimates that the coronavirus mortality rate for infected people is between 0.2% and 0.3%. This is a far cry from the 3.4% figure that had been put forward by the WHO at the start of the pandemic.”
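
To put the two figures side by side, here is a minimal sketch (note that the WHO number was deaths per confirmed case, while the CDC estimate is per infection, which accounts for part of the gap):

# Comparing the CDC's revised mortality estimate with the WHO's early figure.
cdc_low, cdc_high = 0.002, 0.003   # 0.2%-0.3%, CDC "most likely" scenario
who_early = 0.034                  # 3.4%, WHO figure from early in the pandemic

print(f"The WHO figure is {who_early / cdc_high:.0f}x to "
      f"{who_early / cdc_low:.0f}x the CDC's revised range")   # roughly 11x to 17x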

In other news from the CDC, on May 23 the agency reported that the antibody tests used to determine whether people have developed an immunity to the virus are too unreliable to be used.

But none of this seems to be dimming the hype and the deluge of investment money being thrown at the latest experimental vaccines. And perhaps that is the point of the exercise – to extract as much money as possible from gullible investors, including the US government, before the public discovers that the fundamentals of these stocks do not support the media hype.

Moderna: A Multibillion-Dollar “Unicorn” That Has Never Brought a Product to Market

Moderna in particular has been suspected of pumping its stock price with unreliable preliminary test data. On May 18, Moderna’s stock jumped by as much as 30%, after it issued a press release announcing positive results from a small preliminary trial of its coronavirus vaccine. After the market closed, the company announced a stock offering aimed at raising $1 billion; and on May 18 and 19, Moderna executives dumped nearly $30 million worth of stock for a profit of $25 million.

On May 19, however, the stock came back down just as fast, after STAT News questioned the company’s test results. An antibody response was reported for only eight of the 45 patients, not enough for statistical analysis. Was the response significant enough to create immunity? And what about the other 37 patients?
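
Why is a sample of eight “not enough for statistical analysis”? A rough margin-of-error calculation makes the point (a sketch using the standard normal approximation; the choice of method is mine, and the approximation is itself crude at such small sample sizes):

import math

# Rough 95% margin of error for a proportion measured in a sample of size n.
# With n = 8, even a strong-looking result carries enormous uncertainty.
def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

for n in (8, 45, 1000):
    # worst case p_hat = 0.5 maximizes the margin of error
    print(f"n = {n:4d}: +/- {margin_of_error(0.5, n):.1%}")
# n = 8 gives roughly +/- 35 percentage points; n = 45 roughly +/- 15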

Robert F. Kennedy Jr. called the results a “catastrophe” for the company. He wrote on May 20:

Three of the 15 human guinea pigs in the high dose cohort (250 mcg) suffered a “serious adverse event” within 43 days of receiving Moderna’s jab. Moderna … acknowledged that three volunteers developed Grade 3 systemic events, defined by the FDA as “Preventing daily activity and requiring medical intervention.”

Moderna allowed only exceptionally healthy volunteers to participate in the study. A vaccine with those reaction rates could cause grave injuries in 1.5 billion humans if administered to “every person on earth”.
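
Kennedy’s 1.5 billion figure follows from straightforward extrapolation, as this sketch shows (world population rounded to 7.8 billion; projecting from a 15-person cohort obviously carries huge statistical uncertainty):

# Extrapolating the high-dose cohort's reaction rate to the world population,
# as in Kennedy's estimate quoted above. A naive projection: a 15-person
# cohort is far too small to pin down the true rate.
grade3_events = 3
cohort_size = 15
world_population = 7.8e9   # approximate 2020 world population

reaction_rate = grade3_events / cohort_size   # 20%
projected = reaction_rate * world_population
print(f"Observed Grade 3 rate: {reaction_rate:.0%}")
print(f"Projected worldwide: {projected / 1e9:.2f} billion people")   # ~1.56 billion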

A volunteer named Ian Haydon buoyed the markets when he appeared on CNBC to say he felt fine after getting the vaccine. But he later revealed that after the second jab, he got chills and a fever of over 103°F (39.4°C), lost consciousness, and “felt more sick than he ever has before.”

Those were just the short-term adverse effects. Long-term systemic effects including cancer, Alzheimer’s disease, autoimmune disease, and infertility can take decades to develop. But the stage is already being set for mandatory vaccinations that will be “deployed” by the U.S. military as soon as the end of the year. HHS, in conjunction with the Department of Defense, has awarded a $138 million contract for 600 million syringes prefilled with coronavirus vaccine, individually marked with trackable RFID chips. That’s enough for two doses for nearly the entire U.S. population. One hundred million are to be supplied by year’s end.
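
The “nearly the entire U.S. population” claim checks out arithmetically (a quick sketch; the population figure is a round number):

# Checking the syringe arithmetic in the HHS/DoD contract cited above.
syringes_ordered = 600e6   # 600 million prefilled syringes
doses_per_person = 2
us_population = 330e6      # approximate 2020 U.S. population

people_covered = syringes_ordered / doses_per_person
print(f"People covered at two doses each: {people_covered / 1e6:.0f} million")
print(f"Share of U.S. population: {people_covered / us_population:.0%}")   # ~91%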

Fortunately for vaccine manufacturers and investors, they do not have to worry about the drugs’ side effects, since the National Vaccine Injury Compensation Program and the 2005 PREP Act protect them from liability for vaccine injuries. Damages are imposed instead on the US government and US taxpayers.

What Moderna could have to worry about, however, is criminal action by the Securities and Exchange Commission (SEC). By May 22, Moderna’s stock was down by 26% from its earlier high, making its 30% rise on a misleading press release look like a “pump and dump” scheme. On CNBC on May 19, former SEC lawyer Jacob Frankel said its stock offering on the heels of hyped news was the type of action that would draw scrutiny by the SEC, and that it could have a criminal component.

Why All the Hype? Moderna’s mRNA Vaccine

It wasn’t the first time Moderna’s stock had skyrocketed on a well-timed press release. On February 24, the World Health Organization warned the world to prepare for a global pandemic, sending stock markets into a dive. Most stocks collapsed, but Moderna’s shot up by nearly 30%, after it reported on February 25 that testing on humans would begin in March. Mega-investors made tens of millions of dollars in a single day, including BlackRock, the world’s largest asset manager, which made $68 million just on February 25. BlackRock was called “the fourth branch of government” after it was tasked in March with dispensing up to $4.5 trillion in Federal Reserve credit through “special purpose vehicles” established by the Treasury and the Fed.

Moderna has other friends in high places, including the Pentagon. Several years ago, Moderna received millions of dollars from the Pentagon’s Defense Advanced Research Projects Agency (DARPA), as well as from the Bill and Melinda Gates Foundation. Moderna’s stock has more than tripled this year, taking it to a market cap of over $22 billion. STAT News called it “an astonishing feat for a company that currently sells zero products.” Many of the companies actively developing COVID-19 vaccines have longer and more impressive track records. Why all the investor interest in this “unicorn” startup that went public only in 2018 and has no record of market success?

The major advantage of mRNA vaccines is the speed with which they can be deployed. Created in a lab rather than from a real virus, they can be mass-produced cost-effectively on a large scale and do not require uninterrupted cold storage. But this speed comes at the risk of major side effects. In a 2017 TED talk called “Rewriting the Genetic Code,” Moderna’s current chief medical officer Dr. Tal Zaks said, “We’re actually hacking the software of life ….”

As explained by a medical doctor writing in the UK Independent on May 20:

Moderna’s messenger RNA vaccine … uses a sequence of genetic RNA material produced in a lab that, when injected into your body, must invade your cells and hijack your cells’ protein-making machinery called ribosomes to produce the viral components that subsequently train your immune system to fight the virus. …

In many ways, the vaccine almost behaves like an RNA virus itself except that it hijacks your cells to produce the parts of the virus, like the spike protein, rather than the whole virus. Some messenger RNA vaccines are even self-amplifying…. There are unique and unknown risks to messenger RNA vaccines, including the possibility that they generate strong type I interferon responses that could lead to inflammation and autoimmune conditions.

A lab-created self-amplifying virus encapsulated in nanoparticles that evade the cell’s defenses by stealth sounds a lot like the “stealth viruses” that are classified as “bioweapons,” and that could explain DARPA’s interest in the technology. In a 2010 document titled “Biotechnology: Genetically Engineered Pathogens,” the US Air Force acknowledged that it was studying “genetically engineered pathogens that could pose serious threats to society,” including “binary biological weapons, designer genes, gene therapy as a weapon, stealth viruses, host-swapping diseases, and designer diseases.” DARPA was behind the creation of both DNA and RNA vaccines, funding their early research and development by Moderna as well as by Inovio Pharmaceuticals Inc.

In December 2017, over 1,200 emails released under open records requests revealed that the U.S. military is now the top funder behind the controversial “genetic extinction” technology known as “gene drives.” As investigative reporter Whitney Webb observed in a May 4 article, “these genetic ‘kill switches’ could also be inserted into actual humans through artificial chromosomes, which – just as they have the potential to extend life – also have the potential to cut it short.” Biowarfare is forbidden under international treaty, but the U.S. Army Medical Research Institute of Infectious Diseases at Fort Detrick says its investigations are to “protect the warfighter from biological threats” and to protect civilians from threats to public health. Even assuming that is true, are the Army’s technicians proficient enough to tinker with the human genetic code without hitting a kill switch or two by mistake?

The military is thinking about war, the pharmaceutical companies and investors are thinking about profits, the politicians are thinking about getting the country back to work, and even the regulators are bypassing proper safety tests in the rush to get the entire global population vaccinated before the virus disappears. It is left to us, the recipients of these novel untested GMO vaccines, to demand some serious vetting before the military shows up at our doors with their prefilled RFID-chipped syringes some time later this year.