War is Bad for you — And the Economy: Biden touts the Alleged Benefits of the Arsenal of Democracy
https://www.juancole.com/2024/02/alleged-benefits-democracy.html – Mon, 26 Feb 2024

( Tomdispatch.com ) – Joe Biden wants you to believe that spending money on weapons is good for the economy. That tired old myth — regularly repeated by the political leaders of both parties — could help create an even more militarized economy that could threaten our peace and prosperity for decades to come. Any short-term gains from pumping in more arms spending will be more than offset by the long-term damage caused by crowding out new industries and innovations, while vacuuming up funds needed to address other urgent national priorities.

The Biden administration’s sales pitch for the purported benefits of military outlays began in earnest last October, when the president gave a rare Oval Office address to promote a $106-billion emergency allocation that included tens of billions of dollars of weaponry for Ukraine, Israel, and Taiwan. MAGA Republicans in Congress had been blocking the funding from going forward and the White House was searching for a new argument to win them over. The president and his advisers settled on an answer that could just as easily have come out of the mouth of Donald Trump: jobs, jobs, jobs. As Joe Biden put it:

“We send Ukraine equipment sitting in our stockpiles. And when we use the money allocated by Congress, we use it to replenish our own stores… equipment that defends America and is made in America: Patriot missiles for air defense batteries made in Arizona; artillery shells manufactured in 12 states across the country — in Pennsylvania, Ohio, Texas; and so much more.”

It should be noted that two of the four states he singled out (Arizona and Pennsylvania) are swing states crucial to his reelection bid, while the other two are red states with Republican senators he’s been trying to win over to vote for another round of military aid to Ukraine.

Lest you think that Biden’s economic pitch for such aid was a one-off event, Politico reported that, in the wake of his Oval Office speech, administration officials were distributing talking points to members of Congress touting the economic benefits of such aid. Politico dubbed this approach “Bombenomics.” Lobbyists for the administration even handed out a map purporting to show how much money such assistance to Ukraine would distribute to each of the 50 states. And that, by the way, is a tactic companies like Lockheed Martin routinely use to promote the continued funding of costly, flawed weapons systems like the F-35 fighter jet. Still, it should be troubling to see the White House stooping to the same tactics.

Yes, it’s important to provide Ukraine with the necessary equipment and munitions to defend itself from Russia’s grim invasion, but the case should be made on the merits, not through exaggerated accounts about the economic impact of doing so. Otherwise, the military-industrial complex will have yet another never-ending claim on our scarce national resources.

Military Keynesianism and Cold War Fallacies

The official story about military spending and the economy starts like this: the massive buildup for World War II got America out of the Great Depression, sparked the development of key civilian technologies (from computers to the internet), and created a steady flow of well-paying manufacturing jobs that were part of the backbone of America’s industrial economy.

There is indeed a grain of truth in each of those assertions, but they all ignore one key fact: the opportunity cost of throwing endless trillions of dollars at the military means far less is invested in other crucial American needs, ranging from housing and education to public health and environmental protection. Yes, military spending did indeed help America recover from the Great Depression, but not because it was military spending. It helped because it was spending, period. Any kind of spending at the levels devoted to fighting World War II would have revived the economy. While such military spending was certainly a necessity in that era, today similar spending is more a question of (corporate) politics and priorities than of economics.

Even as Pentagon spending has soared and the defense budget heads toward an annual trillion-dollar mark, the prospects of tens of millions of Americans have plummeted. More than 140 million of us now fall into poor or low-income categories, including one out of every six children. More than 44 million of us suffer from hunger in any given year. An estimated 183,000 Americans died of poverty-related causes in 2019, more than from homicide, gun violence, diabetes, or obesity. Meanwhile, ever more Americans are living on the streets or in shelters, with the homeless population hitting a record 650,000 in 2022.

Perhaps most shockingly, the United States now has the lowest life expectancy of any industrialized country, even as the International Institute for Strategic Studies reports that it now accounts for 40% of the world’s — yes, the whole world’s! — military spending. That’s four times more than its closest rival, China. In fact, it’s more than the next 15 countries combined, many of which are U.S. allies. It’s long past time for a reckoning about what kinds of investments truly make Americans safe and economically secure — a bloated military budget or those aimed at meeting people’s basic needs.

What will it take to get Washington to invest in addressing non-military needs at the levels routinely lavished on the Pentagon? For that, we would need presidential leadership and a new, more forward-looking Congress. That’s a tough, long-term goal to reach, but well worth pursuing. If a shift in budget priorities were to be implemented in Washington, the resulting spending could, for instance, create anywhere from 9% more jobs for wind and solar energy production to three times as many jobs in education.

As for the much-touted spinoffs from military research, investing directly in civilian activities rather than relying on a spillover from Pentagon spending would produce significantly more useful technologies far more quickly. In fact, for the past few decades, the civilian sector of the economy has been far nimbler and more innovative than Pentagon-funded initiatives, so — don’t be surprised — military spinoffs have greatly diminished. Instead, the Pentagon is desperately seeking to lure high-tech companies and talent back into its orbit, a gambit which, if successful, is likely to undermine the nation’s ability to create useful products that could push the civilian sector forward. Companies and workers who might otherwise be involved in developing vaccines, producing environmentally friendly technologies, or finding new sources of green energy will instead be put to work building a new generation of deadly weapons.

Diminishing Returns

In recent years, the Pentagon budget has approached its highest level since World War II: $886 billion and counting. That’s hundreds of billions more than was spent in the peak year of the Vietnam War or at the height of the Cold War. Nonetheless, the actual number of jobs in weapons manufacturing has plummeted dramatically from three million in the mid-1980s to 1.1 million now. Of course, a million jobs is nothing to sneeze at, but the downward trend in arms-related employment is likely to continue as automation and outsourcing grow. The process of reducing arms industry jobs will be accelerated by a greater reliance on software over hardware in the development of new weapons systems that incorporate artificial intelligence. Given the focus on emerging technologies, assembly line jobs will be reduced, while the number of scientists and engineers involved in weapons-related work will only grow.

In addition, as the journalist Taylor Barnes has pointed out, the arms industry jobs that do remain are likely to pay significantly less than in the past, as unionization rates at the major contractors continue to fall precipitously, while two-tier union contracts deny incoming workers the kind of pay and benefits their predecessors enjoyed. To cite two examples: in 1971, 69% of Lockheed Martin workers were unionized, while in 2022 that number was 19%; at Northrop Grumman today, a mere 4% of its employees are unionized. The very idea that weapons production provides high-paying manufacturing jobs with good benefits is rapidly becoming a thing of the past.

More and better-paying jobs could be created by directing more spending to domestic needs, but that would require a dramatic change in the politics and composition of Congress.

The Military Is Not an “Anti-Poverty Program”

Members of Congress and the Washington elite continue to argue that the U.S. military is this country’s most effective anti-poverty program. While the pay, benefits, training, and educational funding available to members of that military have certainly helped some of them improve their lot, that’s hardly the full picture. The potential downside of military service puts the value of any financial benefits in grim perspective.

Many veterans of America’s disastrous post-9/11 wars risked their physical and mental health, not to speak of their lives, during their time in the military. Indeed, 40% of veterans of the Iraq and Afghan wars have reported service-related disabilities. Physical and mental health problems suffered by veterans range from lost limbs to traumatic brain injuries to post-traumatic stress disorder (PTSD). They have also been at greater risk of homelessness than the population as a whole. Most tragically, four times as many veterans have committed suicide as the number of military personnel killed by enemy forces in any of the U.S. wars of this century.

The toll of such disastrous conflicts on veterans is one of many reasons that war should be the exception, not the rule, in U.S. foreign policy.

And in that context, there can be little doubt that the best way to fight poverty is by doing so directly, not as a side-effect of building an increasingly militarized society. If, to get a leg up in life, people need education and training, it should be provided to civilians and veterans alike.

Tradeoffs

Federal efforts to address the problems outlined above have been hamstrung by a combination of overspending on the Pentagon and the unwillingness of Congress to more seriously tax wealthy Americans to address poverty and inequality. (After all, the wealthiest 1% of us are now cumulatively worth more than the 291 million of us in the “bottom” 90%, which represents a massive redistribution of wealth in the last half-century.)

The tradeoffs are stark. The Pentagon’s annual budget is more than 20 times the $37 billion the government now invests annually in reducing greenhouse gas emissions under the Inflation Reduction Act; spending on weapons production and research alone is more than eight times as high. The Pentagon spends more each year on one combat aircraft program — the overpriced, underperforming F-35 — than the entire budget of the Centers for Disease Control and Prevention, and a single $13 billion aircraft carrier costs more to produce than the annual budget of the Environmental Protection Agency. Similarly, in 2020, Lockheed Martin alone received $75 billion in federal contracts, more than the budgets of the State Department and the Agency for International Development combined. In other words, that one company’s annual contracts add up to the equivalent of the entire U.S. budget for diplomacy.

Simply shifting funds from the Pentagon to domestic programs wouldn’t, of course, be a magical solution to all of America’s economic problems. Just to achieve such a shift in the first place would be a major political undertaking, and the funds being shifted would have to be spent effectively. Furthermore, even cutting the Pentagon budget in half wouldn’t be enough to address all of this country’s unmet needs. That would require a comprehensive package, including not just a change in budget priorities but an increase in federal revenues and a crackdown on waste, fraud, and abuse in the outlay of government loans and grants. It would also require the kind of attention and focus now reserved for planning to fund the military.

One comprehensive plan for remaking the economy to better serve all Americans is the moral budget of the Poor People’s Campaign, a national movement of low-income people inspired by the 1968 initiative of the same name spearheaded by the Reverend Martin Luther King, Jr., before his assassination that April 4th. Its central issues are promoting racial justice, ending poverty, opposing militarism, and supporting environmental restoration. Its moral budget proposes investing more than $1.2 trillion in domestic needs, drawn from both cuts to Pentagon spending and increases in tax revenues from wealthy individuals and corporations. Achieving such a shift in American priorities is, at best, undoubtedly a long-term undertaking, but it does offer a better path forward than continuing to neglect basic needs to feed the war machine.

If current trends continue, the military economy will only keep on growing at the expense of so much else we need as a society, exacerbating inequality, stifling innovation, and perpetuating a policy of endless war. We can’t allow the illusion — and it is an illusion! — of military-fueled prosperity to blind us to the needs of tens of millions of people or to hinder our ability to envision the kind of world we want to build for future generations. The next time you hear a politician, a Pentagon bureaucrat, or a corporate functionary tell you about the economic wonders of massive military budgets, don’t buy the hype.

Via Tomdispatch.com

Our World-Historical Turning Point – Kairos – is Now, and Everything Depends on the Youths
https://www.juancole.com/2024/01/historical-turning-everything.html – Fri, 19 Jan 2024

( Tomdispatch.com ) – “All Americans owe them a debt for — if nothing else — releasing the idealism locked so long inside a nation that has not recently tasted the drama of a social upheaval. And for making us look on the young people of the country with a new respect.” That’s how Howard Zinn opened his book The New Abolitionists about the Student Nonviolent Coordinating Committee of the 1960s. Zinn pointed out a truth from the Black freedom struggles of that era and earlier: that young people were often labeled aloof and apathetic, apolitical and uncommitted — until suddenly they were at the very forefront of justice struggles for themselves and for the larger society. Connected to that truth is the reality that, in the history of social-change movements in the United States and globally, young people almost invariably find themselves in the lead.

I remember first reading The New Abolitionists in the 1990s when I was a college student and activist. I had grown weary of hearing older people complain about the inactivity of my generation and ask why we weren’t more involved in the social issues of the day. Of course, even then, such critiques came in the face of mass protests, often led by the young, against the first Iraq war (launched by President George H.W. Bush), the Republican Contract With America, and the right-wing “family values” movement. Such assertions about the apathy of youth were proffered even as young people were waging fights for marriage equality and abortion rights, pushing back against attacks on immigrants, holding mass marches like the Battle of Seattle at the 1999 World Trade Organization meeting and the protests at the Republican National Convention of 2000, and so much more.

Another quote from Zinn remains similarly etched in my mind. “Theirs,” he wrote, “was the silent generation until they spoke, the complacent generation until they marched and sang, the money-seeking generation until they gave it up for… the fight for justice in the dank and dangerous hamlets of the Black Belt.”

And if it was true that, in the 1990s and 2000s, young people were so much less complacent than was recognized at the time, it’s even truer (to the nth degree!) in the case of the Millennials and Gen Z today. Younger generations are out there leading the way toward justice in a fashion that they seldom get credit for.

Don’t Look Up

Let me suggest, as a start, that we simply chuck out the sort of generalizations about Millennials and Gen Z that pepper the media today: that those younger generations spend too much money on avocado toast and Starbucks when they should be buying real estate or paying down their student loans. Though they’re accused of doing everything through social media, it’s an under-recognized and unappreciated reality of this century that young people have been showing up in remarkable fashion, leading the way in on-the-ground movements to ensure that Black lives matter, confronting the onrushing horror of climate change as well as continued conflict and war, and defending economic justice and living wages, abortion access, LGBTQ rights, and more.

Take, for instance, the greatest social upheaval of the past five years: the uprising that followed the murders of George Floyd and Breonna Taylor, with #BlackLivesMatter protests being staged in staggering numbers of communities, many of which had never hosted such an action before. Those marches and rallies, led mainly by teenagers and young adults, may have been the broadest wave of protests in American history.

When it comes to the environmental movement, young people have been organizing campaigns for climate justice, calling for a #GreenNewDeal and #climatedefiance from Cop City to the March to End Fossil Fuels to a hunger strike in front of the White House. At the same time, they have been bird-dogging politicians on both sides of the aisle with an urgency and militancy not previously associated with climate change. Meanwhile, a surge of unionization drives, whether at Walmart, Starbucks, Amazon, or Dollar General, has largely been led by young low-wage workers of color and has increased appreciation for and recognition of workers’ rights and labor unions to a level not seen in decades. Add to that the eviction moratoriums, mutual-aid provisions, and student-debt strikes of the pandemic years, which gained ground no one had thought possible even months earlier.

And don’t forget the movement to stop gun violence that, from the March for Our Lives in Florida to the protests leading to the expulsion and subsequent reinstatement of state legislators Justin Jones and Justin Pearson in Tennessee, galvanized millions across racial and political lines. Teenagers in striking numbers are challenging this society to value their futures more than guns. And most recently, calls for a #ceasefirenow and #freepalestine have heralded the birth of a new peace movement in the wake of Hamas’s attacks on Israel and the Israeli destruction of much of Gaza. Although university presidents have been getting more media attention, Palestinian, Jewish, and Muslim students have been the ones organizing and out there, insisting that indiscriminate violence perpetrated against Palestinians, especially children, will not happen “in our name.”

From Unexpected Places

An observation Zinn made so many years ago about young people in the 1960s may have lessons for movements today: “They came out of unexpected places; they were mostly black and therefore unseen until they suddenly became the most visible people in America; they came out of Greensboro, North Carolina, and Nashville, Tennessee, and Rock Hill, South Carolina, and Atlanta, Georgia. And they were committed. To the point of jail, which is a large commitment.”

Today’s generation of activists are similarly committed and come from places as varied as Parkland, Florida, Uvalde, Texas, Buffalo, New York, and Durham, North Carolina. Below the surface, some deep stuff is brewing that could indeed continue to compel new generations of the young into action. As we approach the first quarter mark of the twenty-first century, we’re stepping firmly into a new technological era characterized by unparalleled levels of digital power. The Fourth Industrial Revolution, as elite economists and think-tankers like to call it, promises a technological revolution that, in the words of World Economic Forum founder Klaus Schwab, is likely to occur on a “scale, scope, and complexity” never before experienced. That revolution will, of course, include the integration of artificial intelligence and other labor-replacing technology into many kinds of in-person as well as remote work and is likely to involve the “deskilling” of our labor force from the point of production all the way to the market.

Residents of Detroit, once the Silicon Valley of auto manufacturing, understand this viscerally. In the first half of the twentieth century, the Ford River Rouge Plant grew into the largest, most productive factory in the world, a private city with 100,000 workers and its own municipal services. Today, the plant employs only a fraction of that number — about 10,000 people — and yet, thanks to a surge of robotic innovation, it produces even more cars than it did in the heady days of the 1930s. Consider such a shift just the tip of the spear of the kind of change “coming to a city near you,” as one veteran auto worker and union organizer once told me. All of this is impacting everything from wages to health-care plans, pensions to how workers organize. Indeed, some pushback to such revolutionary shifts in production can be seen in the labor strikes the United Auto Workers launched late in 2023.

Overall, such developments are deeply impacting young people. After all, workers are now generally making less than their parents did, even though they may produce more for the economy. Growing parts of our workforce are increasingly non-unionized, low-wage, part-time and/or contracted out, often without benefits like health care, paid sick leave, or retirement plans. And not surprisingly, such workers struggle to afford housing, childcare, and other necessities, experiencing on the whole harsher lives than the generations that preceded them.

In addition, the last 40 years have done more than just transform work and daily life for younger generations. They have conditioned so many to lose faith in government as a site for struggle and change. Instead, Americans are increasingly dependent on private, market-based solutions that extol the wealthy for their humanitarianism (even as they reap the rewards from federal policymaking and an economy rigged in their favor).

Crises upon Crises

Consider the social, political, and economic environment that’s producing the multi-layered crises faced by today’s younger generations. When compared to other advanced countries, the United States lags perilously behind in almost every important category. In this rich land, about 45 million people regularly experience hunger and food insecurity, nearly 80 million are uninsured or underinsured, close to 10 million live without housing or on the brink of homelessness, while the education system continues to score near the bottom compared to the other 37 countries in the Organization for Economic Co-operation and Development. And in all of this, young people are impacted disproportionately.

Perhaps most damning, ours is a society that has become terrifyingly tolerant of unnecessary death and suffering. Deaths by poverty are an increasingly all-American reality. Low-wage jobs that have been found to shorten lives are the norm. In 2023, researchers at the University of California, Riverside, found that poverty was the fourth-leading cause of death in this country, right after heart disease, smoking, and cancer. While life expectancy continues to rise across the industrialized world, it’s stagnated in the U.S. since the 2010s and, during the first three years of the Covid pandemic, it dropped in a way that, according to experts, was unprecedented in modern world history. That marks us as unique not just among wealthy countries, but among poorer ones as well. And again, its impact was felt above all by the young. What we call “deaths of despair” are also accelerating, although the label is misleading, since so many overdoses and suicides are caused not by some amorphous social malaise but by medical neglect and lack of access to adequate care and mental-health treatment for the under- or uninsured.

Nor are low wages, crises of legitimacy, and falling life expectancy the only significant issues facing our younger generations. Just last week, the New York Times reported that 2023 was the hottest year on record (with climate chaos worsening yearly and little chance in sight of ending our reliance on fossil fuels). Add to that the fact that anyone born in the last three decades can hardly remember a time when the United States was not in some fashion at war (whether declared or not) and pouring its taxpayer dollars into the Pentagon budget. In fact, according to the National Priorities Project, this country has spent a staggering $21 trillion on militarization since September 11, 2001, including increased border patrols, a rising police presence in our communities, and various aspects of the Global War on Terror that came home big-time. Add to all that the rise of Trumpian-style authoritarianism and attacks on our democratic system more extreme than at any time since the Civil War.

What Time Is It?

Thousands of years ago, the ancient Greeks taught that there were two ways to understand time — and the times in which we live. Chronos was quantitative time, the measured chronological time of a clock. Kairos, on the other hand, was qualitative time: the special, even transformative, time of a specific moment (and possibly of a movement). Kairos is all about opportunity. In the days of antiquity, Greek archers were trained to recognize the brief kairos moment, the opening when their arrow had the best chance of reaching its target. In the Bible (and as a biblical scholar I run into this a lot), Kairos describes a moment when the eternal breaks into history.

German-American theologian Paul Tillich introduced the modern use of kairos in describing the period between the First World War and the rise of fascism. In retrospect, he recognized the existential stakes of that transitional moment and mourned the societal failure to stem the tide of fascism in Germany, Italy, and Spain. There was a similar kairos moment in apartheid South Africa when a group of mainly Black theologians wrote a Kairos Document noting that “for very many… in South Africa, this is the KAIROS, the moment of grace and opportunity… a challenge to decisive action. It is a dangerous time because, if this opportunity is missed, and allowed to pass by, the loss… will be immeasurable.”

2024 may well be a kairos moment for us here in the United States. There’s so much at stake, so much to lose, but if Howard Zinn were with us today, I suspect he would look at the rise of bold and visionary organizing, led by generations of young leaders, and tell us that change, on a planet in deep distress, is coming soon.

Via Tomdispatch.com

We Deserve Medicare for All, But What We Get Is Medicare for Wall Street
https://www.juancole.com/2024/01/deserve-medicare-street.html – Sat, 06 Jan 2024

By Les Leopold –

Creating a sane healthcare system will depend on building a massive common movement to free our economy from Wall Street’s wealth extraction.

( Commondreams.org ) – The United States health care system—more costly than any on earth—will become ever more so as Wall Street increasingly extracts money from it.

Private equity funds own approximately 9% of all private hospitals and 30% of all proprietary for-profit hospitals, including 34% of those that serve rural populations. They’ve also bought up nursing homes and doctors’ practices and are investing more year by year. The net impact? According to two recent studies, medical costs to the government and to patients have gone up, while patients have suffered more adverse medical outcomes.

The Journal of the American Medical Association (JAMA) recently published a paper which found:

Private equity acquisition was associated with increased hospital-acquired adverse events, including falls and central line–associated bloodstream infections, along with a larger but less statistically precise increase in surgical site infections.

This should not come as a surprise. Private equity firms in general operate as follows: They raise funds from investors to purchase enterprises using as much borrowed money as possible. That debt does not fall on the private equity firm or its investors, however. Instead, all of it is placed on the books of the purchased entity. If a private equity firm borrows money and buys up a nursing home or hospital chain, the debt goes on the books of these healthcare facilities in what is called a leveraged buyout.

To service the debt, the enterprise’s management, directed by its private equity owners, must cut costs and increase cash flow. The first and easiest way to cut costs is to reduce staff and services. Of course, the quality of care then suffers. Meanwhile, the private equity firm charges the company fees in order to secure its own profits.
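The squeeze described above can be sketched as a toy calculation. Every figure below is invented for illustration (none comes from the studies this article cites), but it shows the basic mechanism: once the buyout debt sits on the facility’s books, debt service plus the owner’s fees can exceed operating income, and staff and services get cut to close the gap.

```python
# Toy sketch of leveraged-buyout economics. All figures are hypothetical,
# chosen only to show how debt placed on an acquired facility's books
# squeezes its operating budget.

def annual_debt_service(principal: float, rate: float) -> float:
    """Interest-only payment the acquired facility must now cover each year."""
    return principal * rate

# A private equity firm buys a hospital chain for $500M, financing $400M
# of the price with debt that lands on the chain's books, not the firm's.
borrowed = 400_000_000
interest_rate = 0.08

operating_income = 30_000_000   # the chain's annual income before the buyout
management_fee = 5_000_000      # annual fee the PE firm charges the chain

debt_service = annual_debt_service(borrowed, interest_rate)
shortfall = debt_service + management_fee - operating_income

print(f"Annual debt service: ${debt_service:,.0f}")      # $32,000,000
print(f"Cuts needed to break even: ${shortfall:,.0f}")   # $7,000,000
```

Under these assumed numbers, a facility that earned $30 million a year before the buyout must suddenly find $7 million in cuts just to break even, before the new owners see any profit at all.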

An even larger study of private equity and health was completed this summer and published in the British Medical Journal (BMJ). After reviewing 1,778 studies it concluded that after private equity firms purchased healthcare facilities, health outcomes deteriorated, costs to patients or payers increased, and overall quality declined.


One former executive at a private equity firm that owns an assisted-living facility near Boulder, Colorado, candidly described why the firm was refusing to hire and retain high-quality caregivers: “Their position was: We are trying to increase our profitability. Care is an ancillary part of the conversation.”

Medicare Advantage Creates Wall Street Advantages

Congress passed the Medicare Advantage program in 2003. Its proponents claimed it would encourage competition and greater efficiency in the provision of health insurance for seniors. At the time, privatization was all the rage as the Democratic and Republican parties competed to please Wall Street donors. It was argued that Medicare, which was actually much more efficient than private insurance companies, needed the iron fist of profit-making to improve its services. These new private plans (formally, Medicare Part C) were permitted to compete with traditional Medicare and its Medigap supplemental insurance.

In 2007, 19% of Medicare recipients enrolled in Medicare Advantage plans. By 2023 enrollment had risen to 51%. These heavily marketed plans are attractive because many don’t charge additional monthly premiums, and they often include dental, vision, and hearing coverage, which Medicare does not. And in some plans, other perks get thrown in, like gym memberships and preloaded over-the-counter debit cards for use in pharmacies for health items.

How is it possible for Medicare Advantage plans to do all this and still make a profit?

According to a report by the Physicians for a National Health Program, it’s very simple—they overcharge the government (that is, we the taxpayers) “by a minimum of $88 billion per year.” The report says it could be as much as $140 billion.

In addition to inflating their bills to the government, these HMO plans don’t pay doctors outside of their networks, deny or slow needed coverage to patients, and delay legitimate payments. As Dr. Kenneth Williams, CEO of Alliance HealthCare, said of Medicare Advantage plans, “They don’t want to reimburse for anything — deny, deny, deny. They are taking over Medicare and they are taking advantage of elderly patients.”

Enter Hedge Funds

With so much taxpayer money sloshing around in the system, hedge funds also are cashing in. They have bought large quantities of stock in the healthcare companies that are milking the government through their Medicare Advantage programs. They then insist that these healthcare companies initiate stock buybacks, inflating the price of their stock and the financial return to the hedge funds. Stock buybacks are a simple way to transfer corporate money to the largest stock-sellers.

(A stock buyback is when a corporation repurchases its own stock. The stock price invariably goes up because the company’s earnings are spread over a smaller number of shares. Until they were deregulated in 1982, stock buybacks were essentially outlawed because they were considered a form of stock price manipulation.)
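The arithmetic behind that price effect can be sketched in a few lines. (This is a hypothetical illustration; the company, profit, and share figures below are invented.) With total earnings held constant, retiring shares mechanically raises earnings per share, the metric that buyback-driven price gains ride on:

```python
# Hypothetical illustration: a buyback raises earnings per share (EPS)
# because the same total profit is divided among fewer outstanding shares.

def earnings_per_share(total_earnings: float, shares_outstanding: float) -> float:
    """EPS = total earnings / shares outstanding."""
    return total_earnings / shares_outstanding

earnings = 1_000_000_000       # $1 billion in annual profit (invented figure)
shares_before = 100_000_000    # 100 million shares before the buyback
shares_after = 90_000_000      # 90 million shares after repurchasing 10%

print(earnings_per_share(earnings, shares_before))  # 10.0  ($10.00 per share)
print(earnings_per_share(earnings, shares_after))   # about 11.11 per share
```

Nothing about the company's actual business has changed in this sketch; only the denominator has shrunk, which is why critics describe buybacks as a form of price manipulation rather than value creation.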

United Healthcare, for example, is the largest player in the Medicare Advantage market, accounting for 29% of all enrollments in 2023. It also has handsomely rewarded its hedge fund stock-sellers to the tune of $45 billion in stock buybacks since 2007, with a third of that coming since March 2020. Cigna, another big Medicare Advantage player, just announced a $10 billion stock buyback.

These repurchases are also extremely lucrative for United Healthcare’s top executives, who receive most of their compensation through stock incentives. CEO Andrew Witty, for example, hauled in $20.9 million in 2022 compensation, of which $16.4 million came from stock and stock option awards.

A look at the pharmaceutical industry shows where all this is heading. Between 2012 and 2021, fourteen of the largest publicly traded pharmaceutical companies spent $747 billion on stock buybacks and dividends, more than the $660 billion they spent on research and development, according to a report by economists William Lazonick and Öner Tulum. Little wonder that drug prices are astronomically high in the U.S.

And so, the gravy train is loaded and rolling, delivering our tax dollars via Medicare Advantage reimbursements to companies like United Healthcare and Big Pharma, which pass it on to Wall Street private equity firms and hedge funds.

It’s Not Just Healthcare

In researching my book, Wall Street’s War on Workers, we found that private equity firms and hedge funds are undermining the working class through leveraged buyouts and stock buybacks. When private equity moves in, mass layoffs (just like healthcare staff cuts and shortages) almost always follow so that the companies can service their debt and private equity can extract profits. When hedge funds insist on stock repurchases, mass layoffs are used to free up cash in order to buy back their shares. As a result, between 1996 and today, we estimate that more than 30 million workers have gone through mass layoffs.

Meanwhile, stock buybacks have metastasized throughout the economy. In 1982, before deregulation, only about 2% of all corporate profits went to stock buybacks. Today, it is nearly 70%.

Those of us fighting for Medicare for All, therefore, have much in common with every worker who is losing his or her job as a result of leveraged buyouts and stock buybacks. Every fight to stop a mass layoff is a fight against the same Wall Street forces that are attacking Medicare and trying to privatize it. Creating a sane healthcare system, therefore, will depend on building a massive common movement to free our economy from Wall Street’s wealth extraction.

To take the wind out of Medicare Advantage and Wall Street’s rapacious sail through our healthcare system, we don’t need more studies. It’s time to outlaw leveraged buyouts and stock buybacks.

Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.
How Turning Gaza into a Hellhole is Costing Americans Billions as Child Poverty Spikes at Home
https://www.juancole.com/2023/11/hellhole-americans-billions.html Mon, 06 Nov 2023 05:02:07 +0000

( Tomdispatch.com ) – On September 19, 2001, eight days after 9/11, as the leaders of both parties were already pounding a frenzied drumbeat of war, a diverse group of concerned Americans released a warning about the long-term consequences of a military response. Among them were veteran civil rights activists, faith leaders, and public intellectuals, including Rosa Parks, Harry Belafonte, and Palestinian-American Edward Said. Rare public opponents of the drive to war at the time, they wrote with level-headed clarity:

“We foresee that a military response would not end the terror. Rather, it would spark a cycle of escalating violence, the loss of innocent lives, and new acts of terrorism… Our best chance for preventing such devastating acts of terror is to act decisively and cooperatively as part of a community of nations within the framework of international law… and work for justice at home and abroad.”

Twenty-two years and more than two wars later, this statement reads as a tragic footnote to America’s Global War on Terror that left an entire region of the planet immiserated. It contributed to the direct and indirect deaths of close to 4.5 million people, while costing Americans almost $9 trillion and counting.

The situation is certainly different today. Still, over the last few weeks, those prophetic words, now 22 years old, have been haunting me, as the U.S. war machine kicks into ever higher gear following the horrific Hamas massacre of Israeli civilians and the brutal intensification of the decades-long Israeli siege of civilians in Gaza. Sadly, the words and actions of our nation’s leaders have revealed a staggering, even willful, historical amnesia about the disastrous repercussions of America’s twenty-first-century war-mongering.

Case in point: recently, the United States was the only nation to veto the U.N. Security Council resolution calling for “humanitarian pauses” to deliver life-saving aid to Palestinians in Gaza. Instead, all but a few members of Congress are lining up to support billions more in military aid for Israel and the further mobilization of our armed forces in the Middle East. These moves, experts say, may only accelerate wider regional conflict (something we are already seeing glimmers of vis-à-vis Iraq, Lebanon, Syria, and Yemen) at a time of increasingly profound global instability. In the last few weeks, the U.S. Navy has “assembled one of the greatest concentrations of power in the Eastern Mediterranean in 40 years,” while the Department of Defense is readying thousands of troops for possible deployment. Meanwhile, college administrators are suggesting student-reservists be prepared in case they get called up in the coming weeks.

Amid this frenzy of American bluster and brawn, the U.N. agency for Palestinian refugees reports that Gaza is “fast becoming a hell hole,” riddled with death, disease, starvation, thirst, and displacement. Hundreds of scholars of international law and conflict studies have warned that the Israeli military may already have launched a “potential genocide” of Gazans. At the same time, within Israel, citizen-militias armed by the far-right minister of national security have escalated violent attacks on Palestinians, a situation only worsened by armed Israeli settlers in the West Bank who are protected by that very military.

Finally allowing a tiny amount of aid across the Egypt-Gaza border, after shutting down all food, water, and fuel for Gaza, Israeli Defense Minister Yoav Gallant made it clear just how much power the United States wields over this unfolding humanitarian crisis. “The Americans insisted,” he reported, “and we are not in a place where we can refuse them. We rely on them for planes and military equipment. What are we supposed to do? Tell them no?”

As Gallant implied, the U.S. could use its influence not only to demand far more aid for Gazans, but to compel quite a different course of action. There should, after all, be no contradiction between condemning Hamas for its heinous slaughter in the south of Israel and denouncing Israel for its decades-old dispossession and oppression of the Palestinian people and its now-indiscriminate killing and destruction in Gaza. There need be no contradiction between decrying terrorism and demanding diplomacy over violence. In truth, the Biden administration could use every non-military tool at its disposal to pressure both Hamas and Israel to pursue an immediate ceasefire, the full release of all hostages, and whatever humanitarian assistance is now needed.

If only, rather than further militarizing the region or questioning the death toll in Gaza, the Biden administration were to focus on making this most recent and ever more ominous crisis a final turning point, not for yet more brutality, but for a long-term political solution focused on achieving real peace, human rights, and equality for everyone in the region. In this moment of grief and rage, when tensions are at a fever pitch and the wheel of history is turning around us, it’s time to demand peace above all else.

The Cruel Manipulation of the Poor

While the U.S. government refuses to use its considerable power as leverage for peace, ordinary Americans seem to know better. Unlike the days after 9/11, recent polls suggest that a majority of Americans oppose sending more weapons to Israel and support delivering humanitarian aid to Gaza, including a majority of people under the age of 44, as well as a majority of Democrats and independents and a significant minority of Republicans. While Representative Rashida Tlaib, the only Palestinian-American in Congress, was made a pariah and is in the process of being censured by some of her colleagues after her plea for a ceasefire, she actually represents the popular will of a significant portion of the public.

And that, in turn, represents a generational shift from even a decade or two ago. In the wake of this country’s disastrous wars in Afghanistan and Iraq, as well as dozens of other military conflicts globally, many Americans, especially Millennials and Gen Zers, see the U.S. military less as a defender of democracy than as a purveyor of death and chaos. Nearly second-by-second online coverage of the Israeli bombing campaign is offering Americans an unprecedented view into the collective punishment of more than two million Gazans, half of them 18 or younger. (Now, with limited Internet and communications, it’s unclear how word of what’s happening in Gaza will continue to get out.) Add to that the slow-burning pain that has marked life in the United States over the last 15 years — the Great Recession, the Covid-19 economic shock, the climate crisis, and the modern movement for racial justice — and the reasons for such a relatively widespread urge for peace become clearer.

Today, half of all Americans are either impoverished or one emergency away from economic ruin. As younger generations face what often feels like a dead-end future, there’s a growing sense among those I speak to (as well as older folks) that the government has abandoned them. At a moment when the Republicans (and some Democrats) argue that we can’t afford universal healthcare or genuine living wages, the military budget for 2023 is $858 billion and the Pentagon still maintains 750 military bases globally. Last week, without a touch of irony, Treasury Secretary Janet Yellen, who claimed last year that student debt relief would hurt the economy, insisted that the U.S. can “certainly afford two wars.”  

Millions of us tuned into President Biden’s Oval Office speech on his return from Israel, only the second of his presidency. There, he asked Congress to earmark yet another $100 billion mainly for American military aid to Israel, Ukraine, and Taiwan (a boon to the war-profiteering weapons makers whose CEOs will grow even richer thanks to those new contracts). Just a year after Congress killed the Expanded Child Tax Credit, which had cut official child poverty in half, Biden’s speech represented a further pivot away from socially beneficial policymaking and toward further strengthening of the ravenous engine of our war economy. After the speech, the Nation‘s Katrina vanden Heuvel offered this compelling instant commentary: “Biden tonight rolled out a version of twenty-first-century military Keynesianism. Let’s call his policy just that. No more Bidenomics. And it consigns the U.S. to endless militarization of foreign policy.”

A decision to organize our economy yet more around war will also mean the further militarization of domestic policy, with dire consequences for poor and low-income people. Reverend Martin Luther King, Jr., once called such steps the “cruel manipulation of the poor,” a phrase he coined as part of his denunciation of the Vietnam War in the late 1960s. King was then thinking about the American soldiers fighting and dying in Vietnam “on the side of the wealthy, and the secure, while we create a hell for the poor.”

Today, a similar “cruel manipulation” is playing out. For years, our leaders have invoked the myth of scarcity to justify inaction when it comes to widespread poverty, growing debt, and rising inequality in the United States. Now, some of them are calling for the spending of billions of dollars to functionally fund the bombardment and occupation of impoverished Gaza and a violent Israeli clampdown in the West Bank, not to speak of the possibility of a wider set of Middle Eastern wars. However, polling numbers suggest that a surprising number of Americans have seen through the fog of war and are perhaps coming to believe that our nation’s abundance should be used not as a tool of death but as a lifeline for poor and struggling people at home and abroad.

Not in Our Name

In a time of stifling darkness, one bright light over the last weeks has been the eruption of non-violent, pro-peace protests across the world. In Africa, Asia, Latin America, and Europe, hundreds of thousands of people have hit the streets to demand a ceasefire, including possibly half a million people in London. Here in the U.S., tens of thousands of Americans have followed suit in dozens of cities, from New York to Washington, D.C., Chicago to San Francisco. No less important, those protest marches have been both multi-racial and multi-generational, much like the 2020 uprisings for Breonna Taylor, George Floyd, and the countless other Black lives lost to police brutality.

Recently, close friends and colleagues sent me photos from a march in Washington where Jewish protesters demanded a ceasefire and held up signs with heartrending slogans like “Not in My Name,” “Ceasefire Now,” and “My Grief Is Not Your Weapon.” Ultimately, close to 400 people, including numerous rabbis, were arrested as they peacefully sang and prayed in a congressional office building, while David Friedman, ambassador to Israel under President Trump, hatefully tweeted: “Any American Jew attending this rally is not a Jew — yes I said it!” Representative Marjorie Taylor Greene of Georgia ludicrously claimed that they were leading an insurrection.

Two days later, my organization, the Kairos Center for Religions, Rights, and Social Justice, cosponsored a pro-peace march that drew a large crowd of Palestinians and Muslim-American families. At noon, about 500 protesters, a gorgeous, multicolored sea of humanity, participated in the Jumma call to prayer in front of the U.S. Capitol. The following week, folks co-organized a pray-in at New York Representative Hakeem Jeffries’s office, using the phrase “ceasefire is the moral choice.” Faith and movement leaders offered prayers from their various religious traditions and displayed the names of people killed so far.

On October 27th, as Israel expanded its ground invasion of Gaza, I joined thousands of people in Grand Central Station to call for a #CeasefireNow, one of the largest demonstrations in New York since this most recent conflict broke out. Protests continued all week. And on November 4th, there was a mass rally and march in Washington, D.C., to call for an end to war and support the rights of Palestinians, with hundreds of organizations bridging a diversity of views and voices to plead for peace.

Those marches were an inspiring indication of the broad coalition of Americans who desperately want to prevent genocide in Gaza and dream of lasting peace and freedom in Israel/Palestine. At the lead are Palestinians and Jews who refuse to be used as pawns and prop-pieces by military hawks. Alongside them are many Americans all too aware that, though they might not be directly affected by the nightmarish events now unfolding in the Middle East, they are still implicated in the growing violence there thanks to their tax dollars and the actions of our government. Together, we are collectively crying out: “Not in Our Name.”

Such marches undoubtedly represent the largest antiwar mobilization since the invasion of Iraq in 2003 and are weaving together diverse communities — young and old, Black, Brown, and White, Muslim, Jewish, and Christian, poor and working-class — in a way that should prove encouraging indeed for a growing peace movement. Right now, there are new alliances and relationships being forged that will undoubtedly endure for years to come.

Yes, this remains a small victory in what’s likely to prove a terrifying global crisis, but it is a victory nonetheless.

Roses Dressed in Black

The last few weeks have resurrected traumatic memories for many Jews and Palestinians globally — of the Holocaust, the Nakba, and the long history of Islamophobia, anti-Arab hate, anti-Jewish violence, and antisemitism. For many of us who are not Palestinian or Jewish, the recent mass death and violence have also triggered our own painful reckonings with the past.

I’m a descendant of Armenian genocide survivors. When I was a child growing up in Milwaukee, Wisconsin, I heard hushed tales of death marches, hunger, lack of water, barricaded roads, and harrowing escapes. Those stories remain etched into my consciousness, a mournful inheritance my dispossessed ancestors handed down.

My great-grandfather, Charles Ozun Artinian, fled his home in what is now Turkey’s Seyhan River valley after the 1909 Adana Massacre in which Ottoman militants killed 25,000 Armenian Christians. Part of his family escaped over the Caucasus Mountains into Western Europe. They then traveled halfway across the world to Argentina, because so many other nations, including the United States, had closed their borders to Armenian refugees and would only open them years later.

As he was fleeing Adana, Charles wrote a poem, one of the few surviving long-form poems from the region at the time. It begins:

“In the Seyhan valley there rises a smoke

Roses dressed in black, month of April cried

Cries of sadness and mourning were heard everywhere

Broken hearted and sad, everybody cried…”

My family taught my siblings and me that although the genocide against our people was carried out by the Ottoman Empire, it was made possible by the complicity and indifference of the international community, including the world’s richest and most powerful nations. Right now, the smoke rising over Gaza is suffocating and every additional hour the U.S. enables more bombs to fall and tanks to rumble, more roses will be, as my great-grandfather put it, dressed in black. Not only that, but with the detonation of each new American-made bomb, the conditions for the long-term freedom and safety of both Israelis and Palestinians are blasted ever more into rubble.

Let us honor the memories of our ancestors and finally learn the lesson of their many stolen lives: “Not In Our Name!,” “Peace and Justice for All!” and the pleas from Gaza, including “Ceasefire Now!,” “End the Siege,” “Protect Medical Facilities,” and “Gaza is Home!”

Via Tomdispatch.com

How America Abandoned our Poor: Confronting the Needless Scourge of Poverty
https://www.juancole.com/2023/10/abandoned-confronting-needless.html Mon, 09 Oct 2023 04:02:55 +0000

( Tomdispatch.com ) – On the island of Manhattan, where I live, skyscrapers multiply like metal weeds, a vertical invasion of seemingly unstoppable force. For more than a century, they have risen as symbols of wealth and the promise of progress for a city and a nation. In movies and TV shows, those buildings churn with activity, offices full of important people doing work of global significance. The effect is a feeling of economic vitality made real by the sheer scale of the buildings themselves.

In stark contrast to those images of bustling productivity stands an outcropping of tall towers along the southern end of Manhattan’s Central Park. Built in the last 20 years, those ultra-luxury residential complexes make up what is unofficially known as “Billionaires’ Row.” The name is apt, considering that millionaires and billionaires have flocked to those buildings to buy apartments at unimaginably high prices.

In 2021, the penthouse on the 96th floor of 432 Park Avenue was listed at an astonishing $169 million (though its Saudi owner has since slashed the offering price to a mere $130 million). No less astonishing these days, such lavish, sky-high homes often sit empty. Rather than fulfilling any functional role, many serve as nothing more than speculative investments for buyers who hope, one day, to resell them for even higher prices, avoid taxes, or launder dirty money. For some among the super-rich, flush with more money than they know what to do with, Billionaires’ Row is simply an easy place to park their wealth. 

Those empty apartments cast a shadow over a city full of people in need of affordable housing and better wages. Reaching from the southern tip of Manhattan into Brooklyn lies the most economically unequal congressional district in the country. To the north, in the Bronx, sprawls the nation’s poorest district. Just last week, the New York Times reported that, based on 2022 census data, “the wealthiest fifth of Manhattanites earned an average household income of $545,549, or more than 53 times as much as the bottom 20 percent, who earned an average of $10,259.”

In New York, where land is a finite resource and real estate determines so much, it is a cruel irony that the richest people in the world are using their capital to literally reach ever higher into the clouds, while back on earth, the average New Yorker, grimly ensconced in reality, lives paycheck to paycheck, navigating a constant storm of food, healthcare, housing, transportation and utility costs. 

Abandonment Amid Abundance

Extreme economic inequality, characterized by a small class of the very wealthy and a broad base of poor and low-income people, may be particularly evident in cities like New York, but it’s a fact of life nationwide. In September 2023, the wealth of America’s 748 billionaires rose to $5 trillion, $2.2 trillion more than in 2017, the year the Trump administration pushed its massive tax changes favoring the rich through Congress. The new 2022 census data offers a very different picture of life for the nation’s poor in those same years. In fact, the numbers are eye-popping: between 2021 and 2022 alone, the overall Supplemental Poverty Measure (SPM) rose by nearly five percentage points, while child poverty more than doubled.

The U.S. Census Bureau uses two measurements of poverty: the Official Poverty Measure (OPM) and that SPM. The OPM, it’s widely agreed, is shamefully feeble and outdated, while the Supplemental Poverty Measure casts a wider net, catching more of the nuances of impoverishment. Still, even that has its limitations, missing millions of people who flutter precariously just above the official threshold of poverty, constantly at risk of falling below it.

That said, the SPM remains a helpful barometer for this country’s attempts to address poverty. Shailly Gupta-Barnes, my colleague at the Kairos Center and a poverty policy expert, observes that, because the “SPM accounts for family income after taxes and transfers…, it shows the antipoverty effects of some of the largest federal support programs.” Considering that, it’s neither an accident, nor a fluke of the market that the SPM just skyrocketed at an historic rate.

The explanation isn’t even complicated. It’s because a number of highly effective Covid-era, anti-poverty programs were callously cut. (No matter that cases of Covid are again on the rise.) When the newest census figures were released in September 2023, Gupta-Barnes explained, “41% of Americans were poor or low-income in 2022, up significantly since 2021, mainly because of the failure to extend and expand tested anti-poverty programs including the child tax credit, stimulus checks, Medicaid expansion and more.”

The take-away from all of this seems clear enough. When the abundant resources of this society are mobilized to tackle poverty, it decreases; when we undermine those efforts, it increases. The more subtle, but equally important take-away: how we measure poverty has massive implications for how we understand human deprivation in our country. As it happens, tens of millions of people who live in regular economic peril are being made invisible by our very tools for measuring poverty. How, then, can we ever hope to address it in its entirety if we can’t even see the people suffering from its iron grip?

The View from the Bottom

In 2022, the official threshold for poverty was $13,590 per year for one person and $27,750 for a family of four — with about 38 million Americans falling below that threshold. That number alone should shock the conscience of a nation as wealthy and developed as ours. But the truth is that, from the beginning, the official poverty line has been based on an arbitrary and shallow understanding of human need.

First formulated in the 1960s, when President Lyndon Johnson’s administration introduced its War on Poverty, the Official Poverty Measure focuses primarily on access to food for its base line and doesn’t fully take into account other critical expenses like health care, housing, and transportation. It is based on an austere assessment of how much is too little for a person to meet all of his or her needs. Because of its inadequacy, millions of Americans badly in need of support have essentially been erased from the political calculus of poverty. More than half a century later, they still remain so, since the OPM has endured not only as a bureaucratic benchmark but as the authoritative reference point for poverty, influencing our conception of who is poor and, on a policy level, who actually qualifies for a range of public programs.

Since the 1960s, much has changed, even if the official poverty line has remained untouched. The food prices on which it’s based have skyrocketed beyond the rate of inflation, alongside a host of other expenses, including housing, gas, utilities, prescription medicine, college tuition, and now essential costs like internet and cell-phone plans. 

Meanwhile, over the last four decades, wage growth has essentially stagnated. Since 1973, wages for the majority of workers have risen by just 9%, while actually falling for significant numbers of lower-income people. Productivity, on the other hand, continues to grow almost exponentially.  As a result, workers are making comparatively less than their parents did, even though they may produce more for the economy.

This crisis of low pay is no accident. As a start, over the last 50 years, CEOs have taken ever bigger chunks for themselves out of their workers’ paychecks. In 1965, the average CEO made 21 times what his or her workers did. Today, that figure is 344 times more. The reason for such a dramatic polarization of wages and wealth (as so vividly on display in the current UAW strike) is a half-century of neoliberal policy-making intensely antagonistic to the poor and beneficial for the rich.

Over the decades, our economy has been completely reshaped, transforming the kinds of jobs most of us have and the ways we do them. Today, growing parts of our workforce are automated, non-unionized, low-wage, part-time and/or contracted out, often without benefits like health care, paid sick leave, or retirement plans. No one, therefore, should be surprised to learn that such an increasingly stark division of labor and money is accompanied by an unprecedented $17 trillion in personal debt. (And now, with student debt repayments beginning again on October 1st, there is even more needless suffering for those so poor that their net worth is already negative.)

In 1995, the National Academy of Sciences recommended the Supplemental Poverty Measure as a new way of assessing poverty and, in 2011, the Census Bureau began to use the SPM. But even that is insufficient. As Gupta-Barnes explains, “Although a broader and preferred measure, the SPM poverty threshold still remains an incomplete estimate of poverty. For instance, according to the SPM, a four-person household with an income of $30,000 is not poor because they fall above the designated poverty threshold. This means that many households living just above the poverty threshold aren’t counted as poor, even though they will have a hard time meeting their basic needs.”

Indeed, right above the 38 million people in official poverty, there are at least 95 million to 105 million living in a state of chronic economic precariousness, just one pay cut, health crisis, or eviction from economic ruin. In other words, today, the low-wage, laid-off, and locked out can’t easily be separated from people of every walk of life who are being economically downsized and dislocated. The old language of social science bears little resemblance to the reality we now face. When the economically “marginalized” are being discussed, it’s all too easy to imagine small bands of people living in the shadows along the edges of society. Unfortunately, the marginalized are now a near-majority of this country.

Poverty Is a Policy Choice

It’s easy to feel overwhelmed, even paralyzed, by such a reality. No one — billionaires aside — is immune from the dread-inducing gravity of the situation this country finds itself in. But here’s the strange thing: deep in the depths of such a monumental mess, it’s possible to discover genuine hope. For if our reality is human-made, as it surely is, then we also have the power to change it.

Ironically, during the pandemic years, before the poverty numbers rose dramatically again in 2022, it was possible to see a notable and noticeable reduction in the numbers of poor Americans exactly because of decisive government action. In 2021, for example, the Child Tax Credit (CTC) and the Children’s Health Insurance Program (CHIP) played leading roles in reducing child poverty to the lowest rates since the SPM was created. The protection and expansion of Medicaid and CHIP also helped mitigate food insecurity and hunger. The health policy research organization KFF estimates that enrollment in those anti-poverty programs rose from “23.3 million to nearly 95 million from February 2020 to the end of March 2023.” And millions of families were able to stay in their homes and fight unlawful evictions during the first couple of years of the pandemic thanks to federal and state eviction moratoriums.

Unfortunately, these pandemic-era programs were sold to us as only temporary, emergency measures, though they were commonsensical policies that advanced the interests of millions of people who had been poor before Covid-19 struck. Then, alongside Democrats like Joe Manchin and Kyrsten Sinema, congressional Republicans quickly rolled back some of the most striking advances, including letting the CTC expire in 2022 (and they continue to advocate for ever greater cuts).

We are now in the midst of what pundits are calling the “great unwinding,” an awkward euphemism for deliberate, brutal reductions to Medicaid expansion in dozens of states. Since April, nearly six million people, including at least 1.2 million children, have been stripped of life-saving Medicaid coverage, and estimates suggest that between 15 million and 24 million people may be disenrolled by next spring.

In harsh reality, poverty is a policy choice in at least two interrelated ways. How we choose to define poverty fundamentally shapes how we understand it, and how we govern has enormous consequences for the everyday lives of poor and low-income people. Right now, we’re getting either celebratory messages about the strength of our economy from Democrats or accusatory scapegoating from Republicans. In truth, though, the current bleak reality of poverty is the consequence of decades of neoliberal neglect and animus by both parties.

The pandemic years, sad as they have been, offered a small glimpse of what it would take to confront the needless scourge of poverty in a time of tremendous national wealth. Those investments could have been a first step in launching a full-scale assault on poverty, building off their embryonic success in the pandemic moment.

Instead, the consequences of the rollback of those programs and the threat of yet more cuts brings us to a potential turning point for the nation. Will we continue to condemn tens of millions of us to cruel and unnecessary poverty, while feeding the drive to authoritarianism or even an all-American version of fascism, or will we move swiftly and compassionately to begin lifting the load of poverty and so strengthen the very foundation of our democracy?

Tomdispatch.com

We have one more Chance to end Child Poverty in America https://www.juancole.com/2023/08/chance-poverty-america.html Sun, 13 Aug 2023 04:08:29 +0000 https://www.juancole.com/?p=213821

My story is proof that public programs can help kids escape poverty. Now lawmakers have a chance to help more kids get out.

When I was born into poverty, the deck was stacked against me in all aspects of life — from educational opportunities and health care to the future earnings I could expect. Now I’ve graduated from college and I’m poised to start my first post-college job.

What made the difference? Hard work, yes — but also public investment. Public programs helped keep me fed, healthy, and learning as I grew up. Kids growing up today deserve the same chance.

SNAP benefits allowed me to eat breakfast before school and dinner after work, and ensured that food would not be added to my list of anxieties. Free and reduced-price meal programs made sure I got lunch at school, too.

Medicaid allowed me to be a healthy student. It covered visits to my dentist, who identified two tooth infections before they spread to my brain. It made sure I could see my doctor, who diagnosed me with asthma so I could get treatment and play sports with other kids.

Funding for Advanced Placement exams at my low-income Title I school helped me prepare for college. And with my family’s income falling far short of tuition costs, the Pell Grant program allowed me to pursue a college education — and escape generational poverty.



Without this help, I wouldn’t have become the first person in my father’s family to graduate high school — nor the first person in my entire family to graduate college. Our safety net is still frayed and underfunded, and escaping poverty is still the exception. But my story shows what’s possible when we’re given equitable resources and opportunities to succeed.

One new policy could make even greater strides towards equity — if it can overcome fierce opposition from conservative lawmakers.

Rep. Rosa DeLauro (D-CT) has introduced legislation that would increase the Child Tax Credit (CTC) to $3,000 for children ages 6-17 and to $3,600 for children under 6. DeLauro’s proposal would deliver the credit in monthly installments (rather than once a year at tax time) and make it fully refundable so the lowest-income families are eligible for the full value.

This proposal is similar to the pandemic expansion of the CTC, which reduced child poverty nearly by half in the short time it was active.

The Brookings Institution found that the credit was particularly effective in conservative, high-poverty states such as West Virginia, Oklahoma, and Alabama. But when conservative lawmakers let the expansion lapse at the end of 2021, 3.7 million children were pushed back into poverty.

According to the Institute on Taxation and Economic Policy, expanding the CTC again would benefit more than 60 million children, especially those with the lowest incomes.

The CTC also has powerful implications for racial justice. The Center on Budget and Policy Priorities found that roughly half of Black children, 40 percent of Indigenous children, 40 percent of Latinx children, 17 percent of white children, and 15 percent of Asian children get less than the full credit — or no credit at all. Expanding it would narrow racial economic divides while benefiting kids of every race.

Opponents of the CTC argue that it will discourage parents from working. Most studies, however, predict that 99 percent of working parents will continue to work. The tangible benefits to children living in poverty outweigh these hypothetical, unproven concerns.

We’ve seen that expanding the Child Tax Credit works. Choosing not to expand it will harm children’s development and widen the educational disparities we already see in this country based on race and class. On an already uneven playing field, programs like this give low-income students a chance to succeed.

To give more kids a chance to escape poverty, we need to invest in the kids who need it the most. Now we have another chance to get it right.

 
Kassidy Jacobs

Kassidy Jacobs is a Next Leader at the Institute for Policy Studies.

Caution: Children at Work: The Return of Child Labor shows the Decline of the Right is Bottomless https://www.juancole.com/2023/07/caution-children-bottomless.html Sat, 08 Jul 2023 04:02:54 +0000 https://www.juancole.com/?p=213085

( Tomdispatch.com ) – An aged Native-American chieftain was visiting New York City for the first time in 1906. He was curious about the city and the city was curious about him. A magazine reporter asked the chief what most surprised him in his travels around town. “Little children working,” the visitor replied. 

Child labor might have shocked that outsider, but it was all too commonplace then across urban, industrial America (and on farms where it had been customary for centuries). In more recent times, however, it’s become a far rarer sight. Law and custom, most of us assume, drove it to near extinction. And our reaction to seeing it reappear might resemble that chief’s — shock, disbelief. 

But we’d better get used to it, since child labor is making a comeback with a vengeance. A striking number of lawmakers are making concerted efforts to weaken or repeal statutes that have long prevented (or at least seriously inhibited) the exploitation of children.

Take a breath and consider this: the number of kids at work in the U.S. increased by 37% between 2015 and 2022. Over the last two years, 14 states have introduced or enacted legislation rolling back regulations governing the number of hours children can be employed, loosening restrictions on dangerous work, and legalizing subminimum wages for youths.

Iowa now allows those as young as 14 to work in industrial laundries. At age 16, they can take jobs in roofing, construction, excavation, and demolition and can operate power-driven machinery. Fourteen-year-olds can now even work night shifts and once they hit 15 can join assembly lines. All of this was, of course, prohibited not so long ago.    

Legislators offer fatuous justifications for such incursions into long-settled practice. Working, they tell us, will get kids off their computers or video games or away from the TV. Or it will strip the government of the power to dictate what children can and can’t do, leaving parents in control — a claim already transformed into fantasy by efforts to strip away protective legislation and permit 14-year-old kids to work without formal parental permission.

In 2014, the Cato Institute, a right-wing think tank, published “A Case Against Child Labor Prohibitions,” arguing that such laws stifled opportunity for poor — and especially Black — children. The Foundation for Government Accountability, a think tank funded by a range of wealthy conservative donors including the DeVos family, has spearheaded efforts to weaken child-labor laws, and Americans for Prosperity, the billionaire Koch brothers’ foundation, has joined in.

Nor are these assaults confined to red states like Iowa or the South. California, Maine, Michigan, Minnesota, and New Hampshire, as well as Georgia and Ohio, have been targeted, too. Even New Jersey passed a law in the pandemic years temporarily raising the permissible work hours for 16- to 18-year-olds.

The blunt truth of the matter is that child labor pays, and it’s fast becoming remarkably ubiquitous. It’s an open secret that fast-food chains have employed underage kids for years and simply treat the occasional fines for doing so as a cost of doing business. Children as young as 10 have been found toiling away at such pit stops in Kentucky, with older ones working beyond the hourly limits prescribed by law. Roofers in Florida and Tennessee can now be as young as 12.

Recently, the Labor Department found more than 100 children between the ages of 13 and 17 working in meatpacking plants and slaughterhouses in Minnesota and Nebraska. And those were anything but fly-by-night operations. Companies like Tyson Foods and Packers Sanitation Services (owned by Blackstone, the world’s largest alternative asset manager) were also on the list.

At this point, virtually the entire economy is remarkably open to child labor. Garment factories and auto parts manufacturers (supplying Ford and General Motors) employ immigrant kids, some for 12-hour days. Many are compelled to drop out of school just to keep up. In a similar fashion, Hyundai and Kia supply chains depend on children working in Alabama.

As the New York Times reported last February, helping break the story of the new child labor market, underage kids, especially migrants, are working in cereal-packing plants and food-processing factories. In Vermont, “illegals” (because they’re too young to work) operate milking machines. Some children help make J. Crew shirts in Los Angeles, bake rolls for Walmart, or work producing Fruit of the Loom socks. Danger lurks. America is a notoriously unsafe place to work and the accident rate for child laborers is especially high, including a chilling inventory of shattered spines, amputations, poisonings, and disfiguring burns.  

Journalist Hannah Dreier has called it “a new economy of exploitation,” especially when it comes to migrant children. A Grand Rapids, Michigan, schoolteacher, observing the same predicament, remarked: “You’re taking children from another country and putting them almost in industrial servitude.”

The Long Ago Now

Today, we may be as stunned by this deplorable spectacle as that chief was at the turn of the twentieth century. Our ancestors, however, would not have been. For them, child labor was taken for granted. 

Hard work, moreover, had long been considered by the British upper classes (who didn’t have to do it) a spiritual tonic that would rein in the unruly impulses of the lower orders. An Elizabethan law of 1575 provided public money to employ children as “a prophylactic against vagabonds and paupers.”

At the end of the seventeenth century, the philosopher John Locke, then a celebrated champion of liberty, argued that three-year-olds should be included in the labor force. Daniel Defoe, author of Robinson Crusoe, was happy that “children after four or five years of age could every one earn their own bread.” Later, Jeremy Bentham, the father of utilitarianism, would opt for four, since otherwise society would suffer the loss of “precious years in which nothing is done! Nothing for Industry! Nothing for improvement, moral or intellectual.”

American “founding father” Alexander Hamilton’s 1791 Report on Manufactures noted that children “who would otherwise be idle” could instead become a source of cheap labor. And such claims that working at an early age warded off the social dangers of “idleness and degeneracy” remained a fixture of elite ideology well into the modern era. Indeed, they evidently remain so today.

When industrialization began in earnest during the first half of the nineteenth century, observers noted that work in the new factories (especially textile mills) was “better done by little girls of 6-12 years old.” By 1820, children accounted for 40% of the mill workers in three New England states. In that same year, children under 15 made up 23% of the manufacturing labor force and as much as 50% of the production of cotton textiles.

And such numbers would only soar after the Civil War. In fact, the children of ex-slaves were effectively re-enslaved through onerous apprenticeship arrangements. Meanwhile, in New York City and other urban centers, Italian padrones expedited the exploitation of immigrant kids while treating them brutally.  Even the then-brahmin-minded, anti-immigrant New York Times took offense: “The world has given up stealing men from the African coast, only to kidnap children from Italy.”

Between 1890 and 1910, 18% of all children between the ages of 10 and 15, about two million young people, worked, often 12 hours a day, six days a week.

Their jobs covered the waterfront — all too literally as, under the supervision of padrones, thousands of children shucked oysters and picked shrimp. Kids were also street messengers and newsies. They worked in offices and factories, banks and brothels. They were “breakers” and “trappers” in poorly ventilated coal mines, particularly dangerous and unhealthy jobs. In 1900, out of 100,000 workers in textile mills in the South, 20,000 were under the age of 12.

City orphans were shipped off to labor in the glassworks of the Midwest. Thousands of children stayed home and helped their families turn out clothing for sweatshop manufacturers. Others packed flowers in ill-ventilated tenements. One seven-year-old explained that “I like school better than home. I don’t like home. There are too many flowers.” And down on the farm, the situation was no less grim, as children as young as three worked hulling berries.

All in the Family               

Clearly, well into the twentieth century, industrial capitalism depended on the exploitation of children who were cheaper to employ, less able to resist, and until the advent of more sophisticated technologies, well suited to deal with the relatively simple machinery then in place.

Moreover, the authority exercised by the boss was in keeping with that era’s patriarchal assumptions, whether in the family or even in the largest of the overwhelmingly family-owned new industrial firms of that time like Andrew Carnegie’s steelworks. And such family capitalism gave birth to a perverse alliance of boss and underling that transformed children into miniature wage-laborers.

Meanwhile, working-class families were so severely exploited that they desperately needed the income of their children. As a result, in Philadelphia around the turn of the century, the labor of children accounted for between 28% and 33% of the household income of native-born, two-parent families. For Irish and German immigrants, the figures were 46% and 35% respectively. Not surprisingly, then, working-class parents often opposed proposals for child labor laws. As noted by Karl Marx, the worker was no longer able to support himself, so “now he sells his wife and child. He becomes a slave dealer.”  

Nonetheless, resistance began to mount. The sociologist and muckraking photographer Lewis Hine scandalized the country with heart-rending pictures of kids slaving away in factories and down in the pits of mines. (He got into such places by pretending to be a Bible salesman.) Mother Jones, the militant defender of labor organizing, led a “children’s crusade” in 1903 on behalf of 46,000 striking textile workers in Philadelphia. Two hundred child-worker delegates showed up at President Teddy Roosevelt’s Oyster Bay, Long Island, residence to protest, but the president simply passed the buck, claiming child labor was a state matter, not a federal one.

Here and there, kids tried running away. In response, owners began surrounding their factories with barbed wire or made the children work at night, when fear of the dark might keep them from fleeing. Some of the 146 workers, most of them young women, who died in the infamous Triangle Shirtwaist Factory fire of 1911 in Manhattan’s Greenwich Village — the owners of that garment factory had locked the doors, forcing the trapped workers to leap to their deaths from upper-floor windows — were as young as 15. That tragedy only added to a growing furor over child labor.

A National Child Labor Committee was formed in 1904. For years, it lobbied states to outlaw, or at least rein in, the use of child labor. Victories, however, were often distinctly pyrrhic, as the laws enacted were invariably weak, riddled with exemptions, and poorly enforced. Finally, in 1916, a federal law was passed that barred the products of child labor from interstate commerce, effectively outlawing it nationwide. In 1918, however, the Supreme Court declared it unconstitutional.

In fact, only in the 1930s, after the Great Depression hit, did conditions begin improving. Given its economic devastation, you might assume that cheap child labor would have been at a premium. However, with jobs so scarce, adults — males especially — took precedence and began doing work once relegated to children. In those same years, industrial work began incorporating ever more complex machinery that proved too difficult for younger kids. Meanwhile, the age of compulsory schooling was steadily rising, limiting yet more the available pool of child laborers. 

Most important of all, the tenor of the times changed.  The insurgent labor movement of the 1930s loathed the very idea of child labor. Unionized plants and whole industries were no-go zones for capitalists looking to exploit children. And in 1938, with the support of organized labor, President Franklin Roosevelt’s New Deal administration finally passed the Fair Labor Standards Act which, at least in theory, put an end to child labor (although it exempted the agricultural sector in which such a workforce remained commonplace).

Moreover, Roosevelt’s New Deal transformed the national zeitgeist. A sense of economic egalitarianism, a newfound respect for the working class, and a bottomless suspicion of the corporate caste made child labor seem particularly repulsive. In addition, the New Deal ushered in a long era of prosperity, including rising standards of living for millions of working people who no longer needed the labor of their children to make ends meet.

Back to the Future

It’s all the more astonishing, then, to discover that a plague once thought banished lives again. American capitalism is a global system whose networks extend virtually everywhere. Today, an estimated 152 million children are at work worldwide. Not all of them, of course, are employed directly or even indirectly by U.S. firms. But they should certainly be a reminder of how deeply retrogressive capitalism has once again become, both here at home and elsewhere across the planet.

Boasts about the power and wealth of the American economy are part of our belief system and elite rhetoric. However, life expectancy in the U.S., a basic measure of social well-being, has been relentlessly declining for years. Health care is not only unaffordable for millions, but its quality has become second-rate at best if you don’t belong to the top 1%. In a similar fashion, the country’s infrastructure has long been in decline, thanks both to its age and to decades of neglect.

Think of the United States, then, as a “developed” country now in the throes of underdevelopment, and in that context the return of child labor is deeply symptomatic. Even before the Great Recession that followed the financial implosion of 2008, standards of living had been falling, especially for millions of working people laid low by a decades-long tsunami of de-industrialization. That recession, whose effects lingered well past its official end in 2009, only further exacerbated the situation. It put added pressure on labor costs, while work became increasingly precarious, ever more stripped of benefits and union protection. Given the circumstances, why not turn to yet another source of cheap labor — children?

The most vulnerable among them come from abroad, migrants from the Global South, escaping failing economies often traceable to American economic exploitation and domination. If this country is now experiencing a border crisis — and it is — its origins lie on this side of the border.

The Covid-19 pandemic of 2020-2022 created a brief labor shortage, which became a pretext for putting kids back to work (even if the return of child labor actually predated the disease). Consider such child workers in the twenty-first century as a distinct sign of social pathology. The United States may still bully parts of the world, while endlessly showing off its military might. At home, however, it is sick.

Tomdispatch.com

States are Shrinking number of Medicaid Recipients, but Everyone would Benefit from its Expansion https://www.juancole.com/2023/06/shrinking-recipients-expansion.html Thu, 22 Jun 2023 04:08:59 +0000 https://www.juancole.com/?p=212777 Gainesville, Florida — When the pandemic-era federal “continuous coverage” requirement ended recently, states could resume the eligibility checks that can purge people from Medicaid rolls. Many states are now doing so, and hundreds of thousands of individuals and families have lost coverage. These cuts are mostly occurring for procedural reasons, not because enrollees actually lack eligibility.

Reports show that states like Florida have made it almost impossible to keep up with the paperwork required to remain on Medicaid. So far, over 250,000 Floridians have lost coverage. Some states simply aren’t interested in paying for poor people’s health care: Arkansas Gov. Sarah Huckabee Sanders has boasted of moving more of her state’s citizens off “government dependency.”

HEROIC JOB CREATORS VS. DESPICABLE MOOCHERS

The “slash and burn” social and health policies of the many states eager to drop Medicaid coverage come from the old, worn-out myth of “heroic job creators vs. despicable moochers.” This toxic and misleading ideology is based on discredited supply-side, trickle-down, voodoo economics that has consistently failed to deliver good jobs or to recognize that government aid and investment are often a crucial lifeline. The cruel 2017 GOP tax plan, for example, made it very difficult to make the kinds of investments that would realistically put large numbers of Americans back to work in quality, good-paying, secure jobs. What’s needed today is a large program of infrastructure repair and renewal, along with enormous national investments in public health programs and single-payer national health insurance.

AYN RAND AND COMPANY

The belief that low-income Americans do not deserve a helping hand derives from the corporate ideology of the wealthiest 1%, straight out of Ayn Rand, which asserts that the U.S. is a meritocracy where only the most deserving rise to the top. If you’re sick or poor, you’re on your own, and those who are more fortunate have no obligation to help; in fact, it’s immoral to demand that they help. Alarmingly, many believe the ludicrous myth that people receiving public benefits are “takers” rather than “makers” — untrue for the vast majority of working-age recipients.

In 2012, GOP presidential candidate Mitt Romney and vice-presidential candidate Paul D. Ryan asserted that 47% of all Americans are “takers”; that poverty relief will turn the safety net into a “hammock”; and that food stamps turn the inner city into a “culture of dependence” with legions of moochers, welfare “Cadillac” queens, anchor babies, and illegal immigrants, ad nauseam. According to these derisive geniuses, removing benefits is necessary to compel the unemployed to work, even if their children suffer in the process.

Corporate big-business ideology says that human society is a market and that social relations are commercial transactions with a “natural hierarchy” of winners and losers. Attempts to limit competition or change social outcomes (the estate tax, mandated health insurance, and so on) are treated as hostile to liberty and big-business interests. Unions and collective bargaining must be crushed; taxes, public-protection regulations, and public services must be minimized or eliminated. Inequality is deemed okay since it’s a reward for merit and generates wealth for the tiny 0.001%, which the false myth says trickles down to enrich everyone in the 99%. Tax and other social policies to create a more equal society are dismissed as counterproductive to the interests of the ultra-wealthy 0.001%.

MEDICAID REVIEWED

1). Medicaid is financed jointly by the federal government and the states. Early in 2023, 93 million people were enrolled in Medicaid or the Children’s Health Insurance Program, up from 71 million before the pandemic.

2). One of Medicaid’s most important dimensions is its irreplaceable role in addressing the immediate and long-term effects of public health crises, such as Covid-19, since it is the biggest single source of health care financing for dealing with critical public health threats. As we are painfully seeing, these threats may begin with an initial, recognized period of a formally declared emergency. They then can morph into events with very long-term effects felt for years or decades after. This was the case with the World Trade Center attacks, which led to an immediate surge in health care spending, followed by years of elevated spending to address the long-term health fallout triggered by the emergency itself. Think about Zika or the opioid crisis, or COVID-19, to understand the near-term/long-term nature of public health threats.



3). SNAP, formerly known as food stamps, serves 42 million Americans. In more than half of SNAP-recipient households, at least one adult is working. The average SNAP subsidy is $125 per month, or about $1.40 per meal.

4). 80% of adults receiving Medicaid live in families where someone works, and more than half are working themselves. Medicaid doesn’t trap people in poverty or pay people not to work. Temporary Assistance for Needy Families has required work as a condition of eligibility since Bill Clinton signed welfare reform in 1996. The Earned Income Tax Credit, a tax credit for low- and moderate-income workers, supports only people who work.

5). Workers apply for public benefits because they need assistance to make ends meet. Although American workers are among the most productive in the world, the bottom half of income earners have seen no income growth over the last 40 years. Since 1973, worker productivity has grown almost six times faster than wages.

6). Wage stagnation collides with the cost of housing. Most Americans are spending more than one-third of their income on housing, which is increasingly unaffordable. Eleven million renter households pay more than half their income for housing, and there is no county in America where a minimum-wage worker can afford a two-bedroom home. Still, only 1 in 4 eligible households receives any form of government housing assistance.

7). The recipients of public benefits who do not work are primarily children, the disabled, and the elderly — people who cannot or should not work. The safety net exists to rescue people during vulnerable periods, and most people who receive public benefits leave the programs within three years.

8). Many public benefits pay for themselves over time: every dollar in SNAP spending is estimated to generate more than $1.70 in economic activity, and Medicaid benefits are associated with improved work opportunities. The Earned Income Tax Credit boosts work rates, improves the health of recipient families, and has long-term educational and earnings benefits for children.

MEDICAID EXPANSION NEEDED

The COVID-19 pandemic demonstrated the urgent need to expand the Medicaid health insurance program. In too many states, political decisions by legislators to deny health insurance to thousands of their citizens have resulted in an almost nonexistent social and health safety net. Decisions in 2016-2017 by 25 states to reject the expansion of Medicaid coverage under the Affordable Care Act resulted in an estimated 7,115 to 17,104 more deaths than if all states had opted in, according to researchers at Harvard Medical School and the City University of New York. The researchers found that because those states “opted out” of the Medicaid expansion, 7.78 million people who would have gained coverage remained uninsured.

PRIVATIZED ADMINISTRATION/MANAGED REIMBURSEMENT/CARE IN MEDICAID

To worsen the situation, over the last decade many states have privatized the administration of their Medicaid programs, turning them into managed-care programs run by the for-profit health insurance industry. The companies are paid by state governments, and their profit depends on spending as little as possible on Medicaid patients. It’s hard to imagine a greater disconnect between public good and private profit: the interest of private health insurance companies lies not in the obvious social good of delivering quality health care to patients, but in having as few patients as possible treated as cheaply as possible. No better example exists of a private capitalist enterprise that feeds on human misery.

You might think we had learned the lesson of managed care, discredited in the 1990s. The term “managed care” confuses many people, but it really amounts to managed reimbursement rather than managed care. A set prospective annual payment is made by federal and state governments, as in the case of state Medicaid managed care, to cover whatever services patients will receive over the coming year. There is therefore a built-in incentive for managed-care organizations to skimp on care and pocket more profits.

ACA PROMOTES MEDICAID MANAGED CARE

Unfortunately, privatized Medicaid managed care was facilitated further by the Affordable Care Act (ACA). More than one-half of Medicaid beneficiaries are now in privatized plans, which have been enacted in many states based on the unproven theory that private plans can enable access to better coordinated care and still save money.

That theory is not just unproven, it is patently wrong, as the state of Florida discovered. In 2016, Medicaid ate up 45.9 percent of the growth in Florida’s general revenue, ballooned by the approval of a 7.7 percent increase in payments to private managed-care plans. Privatized programs have high administrative costs and built-in profits, and they do not save money or improve care. Their route to financial success is finding more ways to limit care and deny services.

Without evidence, and without disclosing their business and profit interests, private Medicaid HMOs claim they save states millions annually. Shockingly, we let many state administrations get away with this illusion, forgetting that adding a profiteering middleman to manage health care delivery always adds costs rather than lowering them. Florida’s GOP, like those of other states, lobbied hard for Medicaid managed care. Using scare tactics, it claimed that the program would deliver better care to more than 3 million Florida recipients while also saving the state money, sternly warning that the roughly $23 billion a year Medicaid bill was consuming the state budget.

WHY DO WE STILL WORSHIP AT THE ALTAR OF PRIVATIZATION IN U.S. HEALTHCARE?

John P. Geyman, M.D., former chair of the University of Washington Department of Family Medicine and one of the most published family physicians in the U.S., asks, “Why do we still worship at the altar of privatization in U.S. health care, especially for the poor and most vulnerable among us?”

Several answers stand out:

(1) there is a lot of money to be made by insurers operating health programs subsidized by state governments;

(2) exploitive privatized programs are perpetuated by well-funded “free market” think tanks, their followers in Big Business, and compliant politicians responding to industry lobbyists;

(3) regulations are inadequate to prevent gaming by insurers at patients’ expense;

(4) as a society, we still don’t seem to care when people have bad health outcomes and die because of failed health care policies.

Wendell Potter, a New York Times bestselling author and an advocate for health care and campaign finance reform, adds more on privatization and profiteering:

A). Big Insurance revenues and profits have increased by 300% and 287%, respectively, since 2012, owing to explosive growth in the insurers’ pharmacy benefit management (PBM) businesses and in the Medicare replacement plans called Medicare Advantage.

B). The for-profits now control more than 70% of the Medicare Advantage market. In 2022, Big Insurance revenues reached $1.25 trillion and profits soared to $69.3 billion: a 300% increase in revenue and a 287% increase in profits over 2012, when revenue was $412.9 billion and profits were $24 billion.

C). Big insurers’ revenues have grown dramatically over the past decade, the result of consolidation in the PBM business and of the taxpayer-supported Medicare and Medicaid programs.

D). What has changed dramatically over the decade is that the big insurers now get far more of their revenues from the pharmaceutical supply chain and from taxpayer-funded Medicare and Medicaid as they have moved aggressively into government programs. This is especially true of Humana, Centene, and Molina, which now get 85%, 88%, and 94%, respectively, of their health-plan revenues from government programs.

E). The two biggest drivers are their fast-growing pharmacy benefit managers (PBMs), the relatively new and little-known middlemen between patients and pharmaceutical manufacturers, and the privately owned and operated Medicare replacement plans marketed as Medicare Advantage.

F). The insurers have made huge strides in privatizing both Medicare and Medicaid, through Medicare Advantage in particular. More than 90% of health-plan revenues at three of the health industry companies now come from government programs, and enrollment in government-funded programs increased by 261% in 10 years.

CONCLUSION

US citizens should demand that private corporate HMOs be removed and banned from the administration of our public health insurance programs in every state. Privatization of Medicaid increases costs without any corresponding gain in quality or access to care. Private insurers maximize profits mainly by limiting benefits or by refusing to cover people with health problems. The greed of casual inhumanity is built into the business model, and the common good of citizens is ignored. Excluding the poor, aged, disabled, and mentally ill is sound business policy, since it maximizes profit. As long as insurance for health care remains so lucrative for private insurers, patients’ needs and the public interest will be disregarded by health insurance profiteers.

After we expand Medicaid to meet today’s immediate challenges, tomorrow we must insist that the federal government finance a nationwide, not-for-profit, Medicare for All system of universal, single-payer coverage, based on medical need rather than ability to pay. That would resolve the persistent problems of failed market policies and address what Martin Luther King, Jr. once observed: “Of all the forms of inequality, injustice in health care is the most shocking and inhumane.”

Can Lebanon Finally elect a President who will Lead it out of its Economic and Political Morass? https://www.juancole.com/2023/06/president-economic-political.html Tue, 13 Jun 2023 04:08:15 +0000 https://www.juancole.com/?p=212603 By Habib Badawi

Beirut (Special to Informed Comment; Feature) – Lebanon may be a small country of some four million citizens with a land area a little smaller than Connecticut’s, but it plays an outsized role in the geopolitics of the Middle East. Its presidential election on June 14th therefore has wide domestic and regional implications. The outcome will shape Lebanon’s delicate sectarian balance and the distribution of power among its religious and political factions. This will be the twelfth attempt by Parliament to elect a president; previous attempts have failed to secure two-thirds of the 128 parliamentary votes in the first round or a simple majority in subsequent rounds.

Host to as many as 1.5 million Syrian refugees, Lebanon and its government’s policies have implications for the future of its larger neighbor. The country is an economic basket case, with charges of peculation at its national bank, and it suffers from long-term infrastructural damage to its main port from the massive ammonium nitrate explosion of 2020, one of the largest non-nuclear explosions in modern history. A middle-income country only a few years ago, Lebanon has fallen so low that some 80 percent of its population now lives below the poverty line.

The government is often deadlocked by sectarian struggles. Many in Lebanon’s powerful Christian minority are tied to the West, though some factions are allied with the Shiite Hezbollah Party. About a third of the population is Shiite, many tilting toward Iran or Iraq. The country also has a large bloc of Sunnis, who identify with the wider Sunni Arab world and are open to influences from Egypt and Saudi Arabia. Social and political convention dictates that the president always be a Christian. But some presidents have been closer to Damascus and some closer to Paris and Washington; Lebanese Christians are diverse.

Former Finance Minister Jihad Azour, who served as director of the Middle East and Central Asia Department at the International Monetary Fund, has emerged as the favorite of most Christian parties. He is, however, disliked by Hezbollah and Amal, the two parties representing Lebanon’s Shiites. It is possible that Azour will nevertheless gain a swell of support.

Suleiman Frangieh, 57, head of the Marada Movement and a favorite of Hezbollah, has also traditionally enjoyed strong support from the Syrian regime. However, signs of waning political relevance suggest the influence of Damascus may be diminishing. This change points to the shifting dynamics of power within Lebanon, emphasizing the challenges faced by long-standing factions.

On the other hand, Azour’s front has garnered substantial domestic support and relies on international backing. However, critics have charged that the media is being manipulated in his favor and have suggested that it is because of external interference.  This critique adds an extra layer of complexity to the electoral landscape. It remains to be seen how these external influences will impact the outcome and shape Lebanon’s political future.

One cannot ignore the role of established influential figures and factions in Lebanon’s presidential race. The long-serving Speaker of the Lebanese Parliament, Nabih Berri, a leader of the moderate Shiite Amal Party, displays a knack for political maneuvering and for safeguarding personal interests. He can be expected to put his thumb on the scale for Frangieh. However, his influence on the election outcome must be weighed against other factions’ aspirations for a more inclusive governance model.

Considering the complexity and fluidity of Lebanon’s electoral landscape, it is always possible for a third, “surprise” nominee to emerge and have an impact on the presidential election. Lebanese politics often witness unexpected developments and the rise of new candidates who capture domestic and international attention and support. Therefore, while this analysis focuses on key players and established factions, it is critical to remain open to the possibility of a third candidate entering the race.

Lebanon’s delicate balance of power requires that the next president unite diverse communities and navigate intricate sectarian divisions. This task calls for a leader who can foster inclusive governance while addressing religious and political factions’ concerns and aspirations. Striking a balance between competing interests is essential to preventing further divisions and promoting national unity.

Lebanon has long been influenced by external actors, which adds another layer of complexity to its electoral landscape. The successful candidate must possess the diplomatic acumen to protect Lebanon’s national interests while fostering productive relationships with international partners. Balancing national sovereignty and international support will be crucial for the next president.

The presidential election outcome extends beyond politics: it has the potential to affect Lebanon’s economic recovery and structural reforms. A capable and determined president can play a pivotal role in advancing much-needed reforms and steering the country toward stability. Additionally, an independent presidency could provide an opportunity to overcome political paralysis and create a more effective government that addresses Lebanon’s economic challenges.

Lebanon’s upcoming presidential election holds tremendous significance for the country’s political future. Balancing sectarian interests, navigating external influences, and preserving national unity are the critical factors that will shape the outcome. The chosen president will need strong leadership, diplomatic finesse, and a commitment to inclusive governance to guide Lebanon through its challenges. As Lebanon strives for stability and prosperity, a transparent and fair electoral process that reflects the people’s will is paramount. Sadly, it must be admitted that the deadlocked Lebanese political system may yet again fail to produce a president, leaving the hapless country rudderless.
