A few aphoristic thoughts…

What must the experience of the world have been like for the earliest of human beings? Before the proliferation of abstract categories — like, say, ‘history’ or ‘science’ or ‘ethics’ — and before the founding of civilization?

What would it be like, for instance, to lack a system of number? Since we have all been immersed in mathematics since early childhood, it is impossible for modern man to re-conceive in their totality the thoughts and experiences of a person who lacked such a system. A man without a system of number would, for instance, also have no system of ‘time’ (which Aristotle, rightly I think, defines as ‘the numbering of motion’). ‘Time’ is a category we totally take for granted. But what if nobody had ever told us what ‘time’ is, or if we had never been introduced to the calendar system? — 

Now, of course, like the animals, early man would have an intuitive understanding of both object-differentiation — ‘this thing is distinct from that thing’ — and causality — ‘this happened because I did this’ — but does it follow from this that he had a distinct sense of ‘past’ and ‘future’? Perhaps he experienced the world as a chain of events occurring at a single point in time — always changing, but never receding into ‘the past’ nor arriving from ‘the future.’ There is matter in motion, yes — but ‘time’ is a product of the mind, not of the world.

Once we fall in love with abstract categories, we begin to filter our experiences through the implicit expectations of those categories, which necessarily limits the possible scope of our understanding. Sometimes this limiting serves a good purpose — but sometimes it robs us of our passion and blinds us to evidence pointing in the direction of new experiences and different perspectives. Would any of our modern anxieties — cravings for luxury, status, the quest for ‘identity,’ etc. — be recognizable to early man? Our awareness of time as a numbered, systematized force that rules our lives and from which we cannot escape often seems to fill us with a sort of dread. (What if we were not in possession of an abstract system of time that told us, more or less, when we would die, for instance?) But, still — how do we begin to escape from our understanding of time?

One of the most fascinating theories from Philip Bobbitt’s Terror and Consent, and one which pervades the entirety of the book, is that the West is in the early stages of a transition in the basis for the legitimacy of the liberal-democratic constitutional order. Bobbitt argues that we are witnessing the beginnings of a shift from the nation-state — which seeks to secure and improve the enshrined values of the nation, or ‘the people,’ as a whole, usually through coercion — to what Bobbitt calls the ‘market-state,’ which will seek to secure and expand the (primarily economic) choices available to the individual. As the importance of national identity decreases, the lines between the various liberal states will blur, and so will the political fault-lines. ‘Market-states’ will increasingly rely on international cooperation to secure the possibility of this political order — especially in regard to common global threats to individual choice, whether terrorism, natural disasters, or the proliferation of weapons of mass destruction. The relationship between the individual and the state, then, will be more like that of a shareholder to a corporation than that of a member of a unique people, with a unique history, to centralized representative authorities.

I suspect he is right. Yet, despite its emphasis on individual choice, the ‘market-state’ will not be libertarian in nature (nor does Bobbitt suggest anything to this effect). Democratic man cries out for freedom, meaning: choices — including the means to pursue those choices. It is not enough that the people are permitted to do something — they also want the means by which they can take part in the activities of their choice. Americans increasingly find the heavy hand of direct government intervention inappropriate. ‘Social engineering’ beyond light ‘nudging’ is out of fashion. The same forces that have weakened the foundations of the modern welfare state are those that grant social license to sexual minorities, religious minorities, drug users, participants in the counterculture, and other varieties of fun decadent identities and choices. The market-state will promise to make it possible for nearly all individuals to more-or-less determine the course of their own lives, and will, little by little, abandon the nation-state’s mission to shape newborns into good Americans, good Englishmen, good Germans, etc.

My personal prediction: the ‘postmodern,’ ‘market-state’ version of the welfare state will need to be more-or-less totally revamped — especially as technology all but guarantees that income inequality will continue to grow — replacing conditional, means-tested benefits (food stamps, disability benefits, unemployment benefits, etc.) with a federally guaranteed universal basic income, generous vouchers for education and health care, and so forth.

Considered as a whole, the constitutional order will have evolved to deal with new threats to its legitimacy — such as terrorism and income inequality. No one will be able to object to the state on the basis that the political order is depriving them of basic necessities and protections, and individuals will not reasonably have anyone but themselves to blame for the abuse or misuse of the benefits afforded to them by the state.

The pundits can’t keep Elizabeth Warren’s name off their lips. Not since the swift rise of Barack Obama has a senator skyrocketed to such prestige within the Democratic Party in such a short period of time — and, for the second time in a row, a talented senator seems to have arrived at just the right moment to keep a nascent Hillary Clinton presidential campaign from becoming little more than a coronation. Clinton’s ‘inevitability’ may be an illusion, once again.

But a careful examination of the facts of the 2008 primary campaign demonstrates that the circumstances of the 2016 race are already radically different from last time — and in a way that benefits Hillary to such an extent that she is almost certainly better-positioned to win her party’s presidential nomination than any non-incumbent since Dwight Eisenhower.

In November 2006, then-Sen. Clinton could scarcely reach 40% in public polls of registered Democrats. A USA Today-Gallup poll from Nov. 9-12, 2006, showed Clinton with only 31% support, with Barack Obama at a respectable 19% — a gap of only 12%. No 2006 poll ever showed Clinton with more than a 22% lead. Her largest lead at any point in the cycle was 34% — in September 2007. But a recent ABC-Washington Post poll of registered Democrats shows Hillary Clinton with a colossal 53% advantage over Elizabeth Warren — 64% to 11%, with Vice President Joe Biden taking 13% of the vote. She not only has more than double the support she had at this point in the 2008 cycle, but an almost 20% greater lead than she had at any point in 2008. Even Vice President Al Gore, perhaps the best example of an ‘inevitable’ presidential nominee in modern history, was up by just 26% in the autumn of 1998.

This matters significantly because Clinton only barely lost the nomination last time, after having begun from a far weaker starting point — and, for all of her gifts, Sen. Warren is not the phenom President Obama was: not only does she lack his oratorical skill, but if she were to run, she would be extremely unlikely to capture the near-unanimous support of the black community that President Obama gained shortly after his win in the Iowa caucuses — support that was absolutely vital to his success. If it were not for the unity of the black community, Barack Obama would not be president today. Warren will not be able to recapture that history-making energy. Moreover, President Obama’s 2008 victory was heavily dependent on his ability to catch Clinton asleep at the wheel in caucus states, where she totally failed to organize — a mistake she most certainly will not make twice.

Clinton’s old Achilles’ heel — her support for the Iraq War — has lost much of its potency, too, as the vote to go to war recedes further and further from the public’s memory. Nearly a decade has passed since her first presidential announcement — and her foreign policy credentials, after a largely successful run as Secretary of State, are unquestioned. She remains somewhat to the right of the foreign policy center of gravity among primary voters — but she is still thoroughly within the Democratic mainstream.

Finally, parties are always more cautious about their choice of nominee when they sense they are in a position of weakness. In large part because of President George W. Bush’s failures, the political climate of 2008 was as favorable to progressives as it had been in at least a generation: the Democratic Party was fresh off a sweeping victory in the 2006 midterms, and the country was utterly weary of Bush. It is not surprising that Democrats felt more comfortable selecting a bolder nominee than Clinton in 2008. But in this cycle, the circumstances are reversed. Like Republicans in 2008, Democrats will almost certainly prioritize ‘electability,’ and Clinton remains the party’s best chance at holding the White House.

She may not have been inevitable in 2008, but in 2016, the nomination is hers. Democrats who are not yet Ready for Hillary had better make their peace.

First, to set expectations properly: People often think, consciously or otherwise, that denying the validity of the theory of mental illness — that is: the dominant paradigm of psychiatry — is tantamount to denying the suffering of people who are called ‘mentally ill.’ Psychiatrists, of course, have both a personal and a financial interest in ensuring that people respond to critiques of their discipline with this sort of emotional revulsion. I will attempt here to summarize the core of my problems with psychiatry, much of which is influenced by Thomas Szasz’s excellent critiques. It is crucial to recognize at the beginning of any conversation like this that Szasz never denies that people labeled as mentally ill are suffering (and certainly I don’t). He denies that they are ill. Illness causes suffering (or worse) — but not all suffering is a sign of illness, and if we are to help people effectively, it is imperative to know the difference, and to construct our social systems according to our understanding of those differences.

I would ask anyone reading this to keep an open mind: It is easy to ridicule the dogmas of the past, but all of those past dogmas held themselves up as the pinnacle of human understanding, too — it is the dogmas of the present that deserve the most intense scrutiny. I am convinced that sometime in the not-so-distant future, people will view psychiatry much as they now view alchemy: as a precursor, a stepping-stone to real science. Neurology will and should eventually take over all of the legitimate (but poorly understood) brain disorders that happen to have been clustered under the ‘mental illness’ umbrella — just as happened with Parkinson’s disease and neurosyphilis in the past, real disorders like bipolar disorder and schizophrenia(s) will eventually be treated by brain specialists, not psychiatrists.

What is distinctive about psychiatry, insofar as it claims to be a medical science rather than a humanistic endeavor? It is the only field of ‘medicine’ whose members vote in a central committee on whether or not a condition qualifies as a disorder. It is the only field of ‘medicine’ that purports to be able to ‘cure’ ‘diseases’ through the use of speech — ‘psychotherapy.’ (Can we heal the flu by learning to change our perspective?) It is the only field of ‘medicine’ in which the line between illness and health is a fundamentally political question: consider the status of sexual minorities in the history of psychiatry, including the status of transgendered people today — but, more broadly, consider that the ‘mentally ill’ person is frequently less disturbed by his own thoughts and behavior than by other people’s reactions to his thoughts and behavior — that is: by society’s unwillingness to understand his condition, and its unwillingness to accommodate his problems. We should absolutely help people fit in with society — including, possibly, through the use of humanistic counseling and mind-altering drugs — but that does not necessitate deeming them ‘ill.’ (A TV can be playing an unusual channel without being broken, after all.) Finally, psychiatry is the only field of ‘medicine’ whose professionals are legally permitted to forcibly commit someone who wants to be left alone. The standard here is that one must be ‘harmful to oneself or others’ to justify this kind of action. But a cancer victim who refuses chemotherapy is undoubtedly harming himself. Shall the police forcibly restrain him and impose treatment on him?

Counseling (‘psychotherapy’) can be very helpful to people, as long as everyone involved respects each other’s boundaries, dignity, and agency. Moreover, free adults ought to be allowed to use any drug they want to relieve their suffering. But the fact that a drug has a positive effect is not evidence that it has ‘treated’ anything. If you buy ketamine from someone on the street to induce euphoria, you are called a ‘drug abuser’ participating in a ‘crime.’ If a Yale professor gives you ketamine to induce euphoria and calls it a ‘research study,’ then ketamine is a ‘medicine’ that ‘treats’ ‘depression.’ This is not ‘semantics’: it is the difference, as Szasz points out, between typhoid fever and ‘spring fever.’ If someone with ‘spring fever’ is given some amphetamine, he will no longer be listless, apathetic, or disconnected from society. Is this proof that ‘spring fever’ is a real illness one can ‘treat’ with ‘medicine’ called ‘Adderall’?

Let us consider ‘depression’ as an example. The arguments I am about to make are mine and not Szasz’s, though he would likely have agreed. Few people doubt that ‘depression’ is an ‘illness.’ Let us first consider, however, that the world is not simply objectively ‘there,’ waiting for us to correctly perceive it. The world, in fact, discloses itself in different ways to different people. (“Our temperament sets a price on every gift bestowed by fortune,” says La Rochefoucauld.) Some people are actually born with tendencies toward ‘hypomania’ — eager, active, confident, sociable, driven. Others are born with tendencies toward ‘depression’ — reluctant, passive, self-critical, isolated, apathetic. Both tendencies are foreign to most people; their center of gravity is somewhere in between these extremes. Is either class of people described above perceiving the world inaccurately? Is there not injustice as well as opportunity in the world? Is there not cruelty and fear as well as love and hope? Is there not disappointment as well as excitement? Is it a sign of ‘illness’ to fixate on one rather than the other? If someone is melancholic by nature, perhaps it would behoove them to explore perspectives that can mitigate the negative sides of this temperament — and perhaps it would even be to their benefit to take drugs, as those diagnosed as mentally ill are usually advised to do, to help take the edge off of those darker moments. Maybe this is the burden of a melancholic temperament. But where is the tangible evidence of ‘illness’? The empty theory of ‘chemical imbalance’ is finally losing favor among academics, but it is still heavily favored by advertisers (and the people). The notion of a ‘chemical imbalance’ implies that there is a ‘balance’ — which implies, as we have seen, that there is a particular ‘correct’ way of experiencing the world. What does a ‘chemical balance’ look like? How should we properly view the world? Szasz suggests that we will discover the ‘cause’ of ‘mental illnesses’ like this at the same time we discover the chemical ‘cause’ of being a Democrat, or being a Taoist. Psychiatry, in this sense, is part of a popular political trend toward homogenization and the blotting out of human differences that threaten to undermine the democratic-liberal-capitalist order.

No ‘anti-stigma’ campaign will work; psychiatrists’ insistence that they care about mitigating stigma is either naive or a facade. I am sure most psychiatrists care deeply about making sure their ‘patients’ are treated well — but stigma is an unavoidable byproduct of being a patient, an ‘ill’ person. Someone who is sick and in treatment is not ordinarily considered a full and able participant in society. It is impossible to erase the stigma attached to incomplete participation in society — there is a stigma attached to being on welfare, being disabled, being in treatment for a bodily disease, and being very old (or very young) for identical reasons. Why should we expect society to make an exception in the case of the ‘mentally ill’? People, fortunately, are actually willing to accept and help others labeled ‘mentally ill’ — but we must state what their problems are in plain English, not in pseudo-medical jargon deployed in the hope that the legitimacy usually accorded to real medicine will ‘rub off’ on psychiatry.

Ultimately, there is simply no coherent definition of ‘mental illness.’ There are bodily diseases, including diseases of the brain, and they are the concern of medical science — and there are humanistic concerns, problems in living. Psychiatry approaches problems in living (as well as a handful of real diseases) through the language of medical science. The results are about what we’d expect.

I am proud to announce that my Master’s Essay (thesis equivalent) for my graduate work at St. John’s College is complete. It is called “How Ought We to Live?: Philosophy and Its Challengers In Book I of Plato’s Republic.” I wrote it not only with an academic audience in mind but also with friends, family, and a general educated audience, for whom I hope to make Plato appear as he is: permanently relevant and always exciting. Hopefully, this essay, the longest I have written, will demonstrate why I have been so intoxicated by Socrates over the past year!

Read the full essay by clicking here; I have chosen to host the paper at academia.edu.

Here is an excerpt:

Thrasymachus continues with his display and says that he has a definition of justice that is not any of the answers he prohibited – but before telling the group what it is, he revives the question of that which is ‘owed and fitting’; in particular, he wants to know what penalty Socrates will suffer upon having his ignorance exposed.


Socrates matter-of-factly claims the suffering that is fitting for an ignorant man is simply to learn from one who knows. Thrasymachus feigns amusement, but states bluntly that what he really wants is a physical token representing his intellectual superiority: money. Socrates nonchalantly promises to pay him when he gets any, but Glaucon, ready for the conversation to move past Thrasymachus’ theatrics, tells him that he and Socrates’ other friends will pay on his behalf.


Socrates’ linking of learning and suffering might at first be taken to be ironic, but the image of education presented in the Republic reflects a grueling, often torturous process. Philosophic education demands that we radically and unsparingly examine and re-examine our most cherished beliefs – even, perhaps especially, those we have learned to love most, and which constitute the most fundamental principles that guide our daily lives. In the Allegory of the Cave, those who leave the shadows behind do not do so with joy; they are “dragged there by force.” (516A) The man confronted with his ignorance “suffers pain” from being freed from his former captivity. (515D) Education requires us to recognize that we cannot necessarily trust tradition, the law, our parents, the poets, or even our own common sense to discern the truth about the most important questions. It is doubtful that anyone could recognize this without suffering.

From a Facebook post…

I used to believe Islam needed to go through its own ‘reformation.’ It is a popular opinion — and a comforting one, too, since it is premised on the notion that History functions in the same way in all places, and thus Islam is just ‘a little behind’ the West and will soon ‘catch up.’ But this narrative is deeply problematic, because the concerns that motivated the likes of Martin Luther are not present in contemporary Islam. The Protestant Reformation was an intellectual rebellion against a powerful centralized authority in an era in which few laymen had actually read the Bible for themselves. But there is no centralized authority in Islam right now, and it is quite common for laymen — even illiterate ones — to have memorized the entire Qur’an. Islam’s central authority was dissolved about a century ago — and no imam is discouraging lay Muslims from knowing exactly what the Qur’an says. How and why, then, would Islam need something akin to the Protestant Reformation?

What people really mean when they say things like this, it seems, is that Islam direly needs some kind of movement that will plant the seeds of liberal change. But here in the West, the process of liberal change took centuries, not decades. Besides, if the likes of ISIS, al-Qaeda, Hamas, Boko Haram, etc., are not legitimately Islamic, and are therefore in rebellion against traditional interpretations of Islam, then perhaps they are the ‘sola scriptura’-oriented ‘reformers,’ interpreting Islam in a fundamentalist manner that will, over the course of centuries — again, not decades — help the Muslim world overpower the hegemonic West rather than passively accept or submit to the ideology of its conquerors. Surely the Islamists have at least as strong a claim to being legitimately Islamic as those who think the Qur’an is somehow a precursor to the American Declaration of Independence and progressive notions of universal human rights.

From a Facebook post…

Last week, infantile protesters invaded the Senate to shout at Henry Kissinger that he is a war criminal, which spurred me to think about Kissinger and his influence.

He seems to me to have been a uniquely malignant force in US foreign policy in the 20th century. Kissinger popularized a strain of thought that had usually been subterranean among American statesmen: the etymologically self-congratulating ‘realist’ ideology, which sanctifies the idol Stability in an attempt to freeze in place any aspect of the status quo that is not immediately threatening to the nation’s material interests — a goal that is neither possible nor desirable, and one that rests on a base conception of what constitutes the national interest.

In practice, realism does not and cannot distinguish between the legitimacy of the leader of a republic and that of the tyrant of a slave nation, so long as their order is ‘stable’ and so long as they have obtained a monopoly on force — which is only a parlor trick: monopolies on force can be broken.

Realism’s attractiveness lies in its recognition of the stubbornly persistent primacy of power politics on the international stage; it labors under none of the foolish liberal-internationalist illusions about the possibility of substituting process for power. But it is morally blind: It tells us to look at men like Saddam Hussein or Bashar al-Assad and pine for their continued rule, rather than for their replacement with something better. Worst of all, it subtly conditions people to plan for the short term by seducing them with the notion that an order that is stable today will necessarily still be stable ten years from now, if only we don’t do anything to break it. Why else would people earnestly believe that Saddam Hussein, whose regime mirrored Bashar al-Assad’s more closely than any other in the Middle East, would somehow have been a bulwark against the rise of ISIS?