Tuesday, August 9, 2011

Wikipedia Is My Religion

It is said that people are drawn to religion because it makes them feel that they are a part of something greater than themselves. Religion makes people feel that they belong — to a community, yes, but also to some Grand Whole. Admittedly, I spent many years feeling disconnected from this Big Picture. When the Internet and YouTube came along, I had the opportunity to express my views and reach out, to have a much more influential hand at stirring the Drink of Humanity, as it were. But I never achieved that feeling of being "a part of something greater than ourselves" until I became a Wikipedia editor a couple of years ago.

Here's how I saw Wikipedia previously: It was an uneven, sometimes reliable (but often not) collection of information managed largely by amateurs, useful for getting a general idea of a topic, but not for research or any serious purpose. Most major articles seemed organized well enough, so I figured there must be a system in place to oversee the editing process. I had heard that anyone could edit Wikipedia, but I assumed that if you submitted an edit, it went to some kind of authorities for approval, and maybe your edit would show up in the article and maybe it wouldn't.

That isn't how Wikipedia works at all. Anyone, anywhere can edit Wikipedia, and change it, right now.* You don't even need to create an account or sign in. Furthermore, there are no "authorities." There are administrators, who are volunteer editors promoted by other editors to perform certain functions, such as banning repeat vandals, and there is also a paid office staff who generally don't get involved in editing. All of the articles are managed by the community of editors, who check each other's edits on a completely equal footing. Since getting involved, I've been continually amazed by how effective this system is.

Wikipedia has a bad reputation as a serious source of information — but it should not be used for that purpose. Instead, it should be used as a gateway to information. One of the things that makes the system work so well is that any addition to the encyclopedia, at least in principle, needs to be backed up by a "reliable source," so if you're looking for a serious reference for research, start with the article and then follow the sources. Reliable sourcing doesn't always happen on Wikipedia, but with major articles that are watched by a lot of editors, as well as highly controversial articles, it almost always does (and it's getting better all the time). Take the article on 7 World Trade Center, for example. Naturally, it is a magnet for conspiracy theorists, who have been trying to tweak the facts therein for years. Without exception, though, dubious and poorly referenced edits are reverted by the community. Fringe theories, according to a key Wikipedia guideline, are not to be given "undue weight" in articles describing the mainstream position. As a result, you'll see very little "9/11 Truth" in the 7WTC article, although there is a link to the article that discusses these theories at length.

Naturally, conspiracy theorists hate Wikipedia. It represents everything they detest — the squelching of alternative ideas and opinions, by some vague assumed authority, in favor of the monolithic mainstream view. For an enthusiast of reality like myself, though, Wikipedia offers an easy way to distinguish educated, informed, scholarly views on a topic (explained in detail and thoroughly referenced) from fringe theories by a small number of not-so-scholarly folks. This is because anyone caught pushing a fringe point of view is quickly ostracized on Wikipedia. Furthermore, blatant acts of vandalism are immediately reverted; at any moment, there are dozens of editors watching the recent changes page, competing to see who can be the first to expunge the addition of the word "penis" from the Salma Hayek article, or whatever. Typically this happens within about 15 seconds.

I've been impressed by the civility of the Wikipedia community as well. Unlike YouTube commenters, who truly are the worst of the worst in Internet discussions, Wikipedia editors are overwhelmingly friendly, helpful, and impartial. If they have an opinion on a controversy, they tend not to reveal that opinion. Experienced editors I had never communicated with took me under their wing, guiding me and defending me from attackers. When I lapsed into sarcasm in one contentious discussion, another editor called me out for this behavior. In short, editing Wikipedia is for grown-ups — if you aren't one, either you become one fast, or you just go away.

Even after making just a few edits to Wikipedia, I felt transformed — and here's where the "religious" aspect comes in. To make one simple improvement to one Wikipedia article is to contribute to a massive global project. It's likely Wikipedia will be around for a very long time, and that single improvement may last well beyond your corporeal life on Earth. You will have been a part of something greater than yourself, at the same time leaving your mark on the world, making it just a little better than you found it. Isn't that the best of what religion has to offer?



* Articles on celebrities and other frequently vandalized pages tend to be protected, which means they can't be edited by users with no editing history. However, the requirements to qualify for editing these pages are minimal.

Thursday, July 28, 2011

Doubling Down On The Multiverse


Scientific American magazine is kind of like Men's Health these days, and as seen on its August 2011 cover, SciAm's version of "Get Rock Hard Abs" is anything having to do with the multiverse. The real existence of an ensemble of very different universes, much too far away to ever be seen, has become a popular speculation. But it's nothing more than an imaginative idea invented by well-meaning humans who seek explanations without having enough information — not terribly unlike the idea of "God."

First, some background: The idea of a multiverse originated in the 1950s, when a graduate student named Hugh Everett wrote one of the most influential Ph.D. dissertations ever. He argued that just as the electron cloud surrounding an atomic nucleus represents all possible positions for that electron (were we to pin down its location), so the entire universe, obeying the same laws of physics, must have a "universal wave function" that represents all possible configurations and courses of events within the universe. The idea took root in the form of a world that is constantly splitting or branching into different possibilities; when you decided to turn right at that intersection and collided with a bicyclist, there's another "branch" of the universe in which you turned left and got a traffic ticket. We can't observe how the other branches turn out, but some physicists believe that those branches are every bit as real as the branch you and I are currently experiencing.

Over the decades, the multiverse idea has evolved; currently there are four definitions of "multiverse" under consideration. A Level I multiverse proposes simply that our universe is infinitely large, and we can observe only a tiny region. Other regions have the same laws of physics but different distributions of matter. Therefore, with an infinite number of these regions, some of them must be similar to ours — including perhaps a world where everything is the same as our world, except you turned left at that intersection. In a Level II multiverse, the laws of physics vary from place to place; only our local "neighborhood" operates on the physical laws familiar to us. A Level III multiverse is the kind originally envisioned by Everett, where there is really only one local universe, but within that universe are an infinite number of branches — including a branch where you turn left, and some branches where the laws of physics started out completely different and matter never formed. Finally, a Level IV multiverse comprises the sum of any and all possible mathematical structures that may represent universes, with the mathematical structures themselves being fundamental or irreducible entities, not the universes they represent. In a Level IV multiverse, "self-aware substructures" (which are also fundamentally mathematical) arise — conscious beings like you and me.

Granted, Level III and Level IV multiverses are abstract and subtle. Maybe that's why, in popular science media, you hear so much more about Level I and Level II. For example, in the History Channel variety of science "edutainment," the Level II multiverse, with its vast array of distant "bubble" universes, all with different physical constants, has become the default go-to explanation for why our universe appears to be fine-tuned for the existence of matter. Fine-tuning has become a problem in physics in the last couple of decades, but the Level II multiverse offers an easy way out. If there are an infinite number of bubble universes, each with different physical laws, surely some of those would have laws that support the formation of matter, and eventually life. So, finding ourselves in such a "fine-tuned universe" should not be surprising at all.

Here is where I bristle. Yes, we do need an explanation for our peculiar and benevolent set of physical constants that's better than "a loving intelligent creator designed the universe to be that way." But let's not be ridiculous about it. Proponents of the Level II multiverse insist that other bubble universes must exist for no other reason than we find ourselves living in one such bubble. To me, that's an arrogant and small-minded conclusion to draw. As I've argued previously, it's like seeing MTV playing on your television, and then based on that one observation, concluding that there must be hundreds of invisible TV cables somehow entering your home, each carrying a different channel.

In The Trouble With Physics, Lee Smolin writes that one of the greatest powers of science is to protect humans from their own imaginations. We are very good at noticing patterns in the world and generating possible explanations for those patterns. But when we aren't given enough information — as is always the case — our creative imagination fills in the gaps. Whether it's the Earth supported on the back of a turtle, or a Judeo-Christian Yahweh dividing light from the darkness, our naive explanations tend to be fanciful. Medieval astronomers imagined the Sun and planets moving on rotating crystal spheres, but then Copernicus, Galileo, and Kepler showed that the crystal spheres weren't necessary. The same happened to the aether, the invisible medium of space that was believed to carry light waves, and which was vanquished by Einstein. This steady correction of fanciful human fabrications is the legacy of science. "More than anything else," Smolin writes, "[science] is a collection of crafts and practices that, over time, have been shown to be effective in unmasking error. It is our best tool in the constant struggle to overcome our built-in tendency to fool ourselves and fool others."

So we have the bubble universes, which are posited to "really" exist, despite being completely unobservable. If science is our best hope for unmasking wrong explanations, it can't help us here. Unchallenged, the bubble universe theory could persist for hundreds of years, as untested as Judeo-Christian creation before it, simply because it is an explanation, even if it isn't a scientific one. This is why it irks me to hear an authority in physics just lay it out, as verified truth, that distant, very different bubble universes really do exist out there — that this view explains everything. No need to think about the issue anymore; it's been solved. You've just gotta have faith!

I am reminded once again of a telling moment from Carl Sagan's Cosmos. Long ago, people wondered what was on the surface of Venus. Through a telescope, the planet was a white blur. It must be covered with clouds. If it's covered with clouds, it must rain a lot, and the surface must be wet and swampy. If it's swampy, there must be swamp creatures, maybe even dinosaurs. As Sagan put it, "Observation: Can't see a thing. Conclusion: Dinosaurs." The crazy, unobservable bubble universes thought to be out there are today's Venus dinosaurs. In 100 years, we will laugh at the naivete and arrogance of this colorful, if incredibly small-minded, conception. Science moves on, forever banishing the errors of the human imagination. In this case, it's a sure thing.

Sunday, July 24, 2011

Change Happens. Quit Whining About It

It seems most people would like the world to stay exactly the same as it is right now.

Whenever something changes or is about to change, people come out of the woodwork to bitch about it. A few years ago, my city voted to build a cineplex and parking garage downtown. It seemed like a good idea — we had no movie theater, parking in town was always a problem, and it would bring much-needed tax dollars to the city. But immediately, lawn signs started springing up: STOP THE MEGAPLEX! Letters to the paper argued that the "gargantuan" structure would destroy the character of downtown, increase traffic and pollution, etc. People liked the town the way it was, and they wanted it to stay like that. Period.

Of course, now everyone loves our movie theater.

A recent news broadcast profiled the coming transition of the Golden Gate Bridge to an all-electronic toll system. Toll takers were told they'd be out of a job in 18 months. Naturally, they lamented the change. "It's gonna be a ghost town around here," said one of the toll takers. (Aside from the 120,000 vehicles per day, I suppose.) The same program reported that a struggling local city needed to restructure its school district, consolidating schools and closing several of them. Angry parents pleaded with the city council — it is "so sad" for schools to close after all these years, it's stressful for a child to change schools and make new friends, etc. And it seems just about every month, a local bookstore or video store falls victim to the Internet. It's all so terribly sad. ("For years, I've been going to Borders to decide what to buy on Amazon. Where will I look at books for free now?")

Yes. Change can be sad. When it happens against our wishes, we get angry. But it's inevitable, our efforts to prevent it notwithstanding. Children grow up, despite their parents' desire to "protect their innocence" for as long as possible. Marriages split up, after many miserable years of trying to make it work. Old people die, despite modern medicine's efforts to keep even the most hopeless patients alive, so that the family can put off being sad just a little bit longer.

Obviously, losing a person or anything else of real value is sad, but I don't understand this automatic connection between change and sadness or resentment. Perhaps people feel that the world around them can and should remain unchanged as they go about their life. Maybe being disabused of that idea is why people feel loss when confronted with any change at all. They bemoan "the end of an era," because that's just what you say when something changes. Gay marriage is opposed because it "redefines marriage," which is just another way of saying it's the end of an era when only certain couples could marry. People tend to stay loyal to brands, and they scream when their brands change or go away. Even the most inconsequential changes, such as changing the name of a street, are resented. ("It'll always be Cripplegook to me, dammit!")

We should fight this instinct and actively embrace change. In the long run, change usually works out for the better. You might have to put up with the road being dug up for 18 months, but when it's over there will be a new subway. You may have had to replace your VHS collection with DVDs, but how sad is it to no longer have to rewind them? We all know someone who once lost a job or relationship, yet later described it as the best thing that could have happened, for one reason or another. Even the saddest thing of all, death, has an upside. We'd never be here today if old life didn't die off and make way for new life. (Although I've gotta say, when all those dinosaurs died it was literally the end of an era....)

The next time something changes that makes you sad or resentful, ask yourself where this emotion might be coming from. Is it because the world is moving on and you aren't ready for it? Is nostalgia playing a dominant role? Have you considered the upside and the long-term view? Beware of a stodgy perspective justifying your reaction. ("Well of course the hardware store closing is sad — they've known me for 40 years, dadgummit. I mean, go to Home Depot and try to find castor oil and Shinola. Crying shame, it is.")

The world is dynamic. It evolves and moves on, usually for the better, in one way or another. If you try to stand in the way of that, for no other reason than the fact that it's change, you're more of a problem than a solution.




Monday, June 20, 2011

Global Warming Did Not Cause That Tornado

I am not a global warming denier, although at times it can seem as if I am. I often refute claims made by global warming enthusiasts, for the simple reason that they are false. If you’ve followed my blog, by now you probably know that I value reality over bullshit, even in cases where it hurts — and I’m sorry, my fellow liberals, but we tend to spout a lot of bullshit about global warming.

Any given week, tune into one of my favorite TV shows, Real Time With Bill Maher, and you may well see a celebrity like Tim Robbins or Janeane Garofalo or Ellen Page (or, most annoyingly, Maher himself) declaring that the latest monster tornado or hurricane was “caused by global warming.” It is presented not as opinion but as fact, citing the argument that because a warmer climate puts more water in the atmosphere, weather patterns are getting more extreme; therefore, all extreme events are the result of global warming.

Wrong! False! Complete, total, and utter bullshit!

Part of the argument is correct: Warmer air results in more evaporation and transpiration by plants, increasing the water-vapor content of the atmosphere. Water vapor and heat contribute to weather events like tornadoes and hurricanes. But that is where the facts end. To take the argument further and declare that climate change is therefore “the cause” of any individual event is, dare I say it, taking a leap of faith. The claim of direct causality is an unfalsifiable hypothesis, nearly as ridiculous as attributing a tornado to God’s wrath over gay marriage (with the obvious difference that there is evidence for global warming). A better analogy would be losing several thousand dollars on a one-day stock trade, and then blaming the loss on the fact that the Dow Jones Industrial Average had been generally declining for the past year.

Here’s what can be truthfully said in these situations: Global warming is associated with a statistical increase in the gross number of (arbitrarily “extreme”) weather events, which are influenced by heat and water vapor. Similarly, a declining Dow is associated with a statistical increase in the number of declines of individual stocks over the period in which the Dow declines. That is the extent to which we can ascribe causality in these cases. To go further and attempt to single out “the cause” is a reductionist oversimplification — and if there’s one thing human brains like to do, it’s reducing and oversimplifying complex issues to the point that they’re so easy to understand, it’s downright stupid.
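To make the statistical point concrete, here is a toy sketch (not a climate model; the bell-curve distribution, the 0.3 mean shift, and the "extreme" threshold are all arbitrary assumptions). A small shift in conditions raises the count of threshold-crossing events, yet no single event can be traced back to the shift:

```python
import random

random.seed(1)

def count_extremes(mean_shift, n=100_000, threshold=3.0):
    # Draw n values from a bell curve and count how many cross an arbitrary "extreme" line.
    return sum(1 for _ in range(n) if random.gauss(mean_shift, 1.0) > threshold)

baseline = count_extremes(0.0)  # toy stand-in for a climate without warming
warmed = count_extremes(0.3)    # the same process with a slightly shifted mean

# The warmed run produces noticeably more "extreme" draws, yet nothing about any single
# draw tells you whether that particular event "was caused by" the shift.
print(baseline, warmed)
```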

When someone is advancing their progressive policy agenda, it seems effective to declare, “The tornado was caused by global warming!” It is not particularly effective to say what’s actually true: “Although the atmosphere is a complex system and we cannot attribute causality of a single event to any one factor, global warming increased the likelihood of that particular tornado in the broadest statistical sense.” But the problem is, oversimplified bullshit begets even more oversimplified bullshit. A false reductionist argument makes it all the easier for the opposing side to say, “It’s cold today! Where’s your global warming now?” Or, more subtly, “If there’s more water in the atmosphere, then why is Texas experiencing a drought? Checkmate!” In other words, no debate will get closer to the truth if one or both sides are citing falsehoods and fallacies. Just because one side thinks/knows that they are right, that doesn’t give them license to leave logic and basic truths at the door when arguing their position.

Of course, this extends to other controversies besides the effects of global warming. For example, there’s the debate over whether the Stimulus Program “saved” the U.S. economy. Liberals: “It kept us out of a second Great Depression!” Conservatives: “It was a waste of money and didn’t create the jobs it was supposed to!” Hey, guess what? Both of these positions are completely unfalsifiable. They are made-up opinions, not arguments supportable by clear evidence, and certainly not facts. Without an alternate universe that we can observe as an experimental control, it’s anyone’s guess how an alternate scenario would have actually played out. It’s like in sports, when the TV announcer says, “If the double play hadn’t cleared the bases, three would have scored on that home run, and this team would be ahead right now.” Really? And you know this how?

Albert Einstein said, “Everything should be made as simple as possible. But not one bit simpler.” Remember, people are idiots, and idiots like to simplify the world and make it easier to grasp. Then, when they think they’ve grasped it, they start spouting reductionist bullshit — and in doing so, their idiocy becomes all the more apparent.

Don’t be an idiot.

Monday, June 6, 2011

Bad Experiences Are Good Experiences

A few years ago, I came up with a way of looking at life that has helped me get through many miserable events. I've never seen it described anywhere, and if you know of any writers who have expressed a similar perspective, please tell me.

Say you're driving through the Mojave desert. What would you rather have happen to you: (1) You suffer a tire blowout, which leads to a 45-minute ordeal of changing the tire in 110-degree conditions, during which an overweight cop stops to hang out and watch for a while, offering no physical help whatsoever. Or, (2) you don't suffer the blowout, and you spend the same 45 minutes in air-conditioned luxury listening to your favorite CD.

Most people would pick (2). If you were experiencing (1) right now — as my 85-year-old uncle did last year — you'd almost certainly elect to switch over to (2) if you could. I am here to tell you, though, that (1) is the preferable experience to have. Here's why.

There's a common expression that goes, "Someday, we'll all have a good laugh about this." Meaning, this may not seem like much fun now, but in the long run it won't matter; we'll remember it as being funny. I've taken this sentiment a bit further. Consider this: Five years after your trip through the Mojave Desert, what will you recall about it? If you experienced (2), probably nothing. But if you experienced (1), you'll have a rich memory of a miserable 45 minutes. You'll have a story with which to regale your friends, complete with the colorful character of the fat, gawking cop. I dare say that on your deathbed, scenario (1) will have provided you with a slightly richer, more memorable life than scenario (2). Did the fact that you were miserable at the time have any negative bearing on your life, long-term? Of course not. A few minutes of misery enriched you for a lifetime. In the big picture of things, it was the far better experience to have.

This manner of thinking applies best to annoying but ultimately benign events, as in the Mojave example; if you developed heat stroke changing the tire and sustained brain damage, that wouldn't be good at all. Similarly, some experiences, such as suffering a family tragedy, are unquestionably negative. However, even in the worst of times, you can comfort yourself somewhat by realizing that this experience can and will enrich you, make you stronger, make you more aware, better rounded as a person. The point is to try to zoom out and imagine the big picture, and think of how the present may impact your life in some way for the better. It isn't easy, but it's something to consider when your only other option seems to be wallowing in your misery.

To me, the worst way you can spend a day is to watch mindless TV on your couch. If you spent your whole life like that, sure, you might never endure a moment of discomfort — but in the end, what would you have to look back on? Nothing!

Think about what I've said the next time you can't believe "this" is happening to you. Just try to be glad that something is happening in your life — anything at all.

Tuesday, April 26, 2011

Consciousness Is Not Mystical

Consciousness has become an increasingly popular topic of discussion, as it relates to things like religion and physics. Theists tell us that consciousness survives death and is eternal. The new age set assigns a mystical quality to human consciousness, to the point where, in books like The Secret, we are told that we can alter the course of objective events with our minds alone. A fringe element in the physics community proposes an interpretation of quantum mechanics loosely called “consciousness causes collapse,” where the presence of consciousness in some unspecified way triggers potential quantum events to become actual events. Even Robert Lanza, the brilliant originator of one of my all-time favorite theories, the biocentric universe, has teamed up with Deepak Chopra and speaks of the foundational consciousness of the universe and how one’s own personal consciousness can never die, etc.

Whatever. It’s all hooey. There is nothing mystical, or even mysterious, about consciousness. Consciousness is amazing, like the diversity of life on Earth or like the entire universe — but as I have written, just because something is astonishing does not mean it is mystical or in any way supernatural. Merely because the human mind is limited in its ability to comprehend complex things does not mean that the universe had to be designed by an intelligent God, or that biological evolution could not proceed on its own without a guiding hand, or that we humans, singled out as a species, have been given some unique gift to appreciate beauty and grandeur by the Creator that made it all happen.

Consciousness is a giant, tangled web of biological observations and self-observations, a system of information exchange and storage that goes on within an individual organism. That is all it is. Since all biological beings observe and respond to their external environment as well as their internal state (in the process called homeostasis), we can say that every living thing experiences consciousness, to some degree. Bacteria and blades of grass are conscious — not conscious like us, but conscious nonetheless. If you disagree with this statement, I’d say it’s because you buy into the ancient Western assumption that there’s something unique about human consciousness, that we exceed some kind of “consciousness threshold,” while other animals, and certainly plants, are deficient and inadequate in this regard. I find this opinion arrogant in the extreme.

The premise is that humans, with our language and our science, see the world the way it “really is,” while a dog or a deer does not. We appreciate the beauty of flowers and waterfalls and contemplate the order of things, while dogs, lacking these abilities, look for fire hydrants to pee on. They’re lovable but dumb. It’s not too surprising that the Bible instructs us — God’s chosen species — to act as the masters of the rest of the living and nonliving world; again, an arrogant position to take. We would not be here if the “lower” animals weren’t adapted to responding, with full adequacy, to their dynamic environments.

It’s certainly true that humans have an advanced consciousness, with our long, detailed memories of the past and profound visions of the future. But consciousness in the animal world is a continuum; there is no dividing line between conscious and non-conscious animals. People often say that humans are the only species that contemplate the future and their own mortality, but that isn’t completely true. When a mammal is faced with a choice, or is in a perilous situation, it is able (however crudely) to create mental images of various choices at once, along with their expected outcomes, and act accordingly. This cognitive ability offers a clear survival advantage, and generally the higher you go up the evolutionary tree, the more adept this ability becomes. Animals communicate and exchange information all the time; it may not qualify as intellectual discourse, but it is communication all the same. Among the more advanced functions, animals play and dream and experience emotion and seek out pleasure. There are, in fact, very few things that people do with their consciousness that other animals (at least other mammals) do not also do, in some crude form.

Earlier this year, on the TV show Jeopardy!, the IBM computer Watson crushed former champs Ken Jennings and Brad Rutter in a three-day competition. A critical part of Watson’s software design involved determining the confidence level for each potential response; if the confidence exceeded a certain threshold, Watson would “ring in” and answer. In other words, in addition to interacting with the external environment, it was monitoring its own potential reactions and weighing their positive/negative consequences. Folks, this is consciousness! By machine standards, a highly advanced form, in fact. True, during the taping Watson probably wasn’t contemplating an escape from Sony Pictures Studios, but it was juggling external observations and internal self-observations in order to make choices regarding how to act and thus impact the outside world. I don’t see how this is any different from, say, a lab rat deciding whether to press the lever for the electric shock or the food pellet. Or, to use a lower-intelligence example, whether a person selects Donald Trump or Sarah Palin in the GOP straw poll.
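As a rough sketch of the kind of confidence-gated decision rule described above (the candidate answers, their scores, and the 0.7 threshold are illustrative assumptions, not IBM's actual design):

```python
def ring_in(candidates, threshold=0.7):
    # candidates: list of (answer, confidence) pairs generated for one clue.
    # The system monitors its own potential responses and acts only if the best one clears the bar.
    best_answer, best_confidence = max(candidates, key=lambda pair: pair[1])
    return best_answer if best_confidence >= threshold else None

print(ring_in([("Toronto", 0.14), ("Chicago", 0.83)]))  # buzzes in with "Chicago"
print(ring_in([("Toronto", 0.31), ("Chicago", 0.42)]))  # stays silent (None)
```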

Watson the computer is incredibly complex, but still nowhere near the complexity of the human brain. However, we can make an analogy. Consider a desk calculator, able to turn inputted information into physical action (numbers displayed on the screen). It uses the same digital format of one-or-zero, yes-and-no questions and answers to do its thing that Watson uses, only on a far simpler scale. The same can be said of the relationship between an amoeba and a human: Both rely on cascading electrochemical reactions to convey internal information from here to there. Watson has features that the calculator lacks (such as hard drives); likewise, humans have memory-storing neuronal synapses not found in one-celled animals. But all of the above rely on information from the external physical world to create actions that impact the physical world in turn. Regardless of the degree of complexity, in my book that means they’re all “conscious.”

Thursday, April 21, 2011

The Case For Metaphysics

In addition to my work as a comedian and musician, I’ve collaborated on several YouTube videos exploring the theory of the biocentric universe. This is the radical proposition that the activity of our evolving biological superorganism produces the universe that we see — that a pre-existing universe of nonliving matter did not create the first living thing through chance, some ten billion years after a real and actual event we call the Big Bang. Instead, the theory says, the universe is effectively only as old as life itself. Some find this concept so outrageous, they think it must be a part of my satire, but it isn’t. I’m fascinated by this revolutionary, spectacularly godless cosmological view, in which the universe began as nothing in particular, the echoes of the Big Bang are the now-observed physical back-story for that beginning, and quasars undergo retrocausality through decoherence across billions of light years.

The basic idea is that nothing in the universe comes predefined; by default, the entire thing is but a swarm of probability, just like the “electron cloud” of an atom. We know that electrons are not little dots of electron-stuff that whizz around the atomic nucleus like tiny planets. Physics in the 20th century revealed that such electrons can be described only in terms of probability — the probability that a person, machine, etc., will find an actual electron at a specific location, if that location is checked. This is a well-known principle of quantum mechanics.
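In textbook form, that probabilistic description is the Born rule; the formula below is the conventional statement, included here only to pin down what "described in terms of probability" means:

```latex
% \psi(x) is the electron's wavefunction; |\psi(x)|^2 is the probability density of
% finding the electron near position x if that location is actually checked.
P\bigl(x \le X \le x + dx\bigr) = |\psi(x)|^2 \, dx,
\qquad
\int_{-\infty}^{\infty} |\psi(x)|^2 \, dx = 1 .
```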

That principle of probability-by-default may extend to the entire universe; for a half-century, physicists have entertained that the whole thing is a quantum system. But for the purposes of this discussion, it comes down to the following question: Are the physical properties of all particles of matter independently predefined and absolute, possessed intrinsically by each individual particle? Or, are these properties relevant only with regard to the particle’s interaction with other things, such as other particles or living observers? Is there, for example, a specific beta-radiation particle with a specific momentum and charge traveling from the far side of the Alpha Centauri star system, right now?

Most science enthusiasts would answer yes without hesitation, because that’s the view of the universe we live with. In Western scientific tradition, we assume that the workings of the physical world occur on their own in the background, regardless of whether we happen to be there to watch or know anything. Observation and measurement are merely the process of discovering what’s already in predefined existence, we believe. But are we sure this is entirely true? To paraphrase the classic zen question: If a particle is emitted by Alpha Centauri and no one is around to see it, is it really there?

For those averse to anything philosophical-like, this is where the hackles go up. When we speak of the existence or nonexistence of an unobserved object, we’re making a distinction that’s metaphysical — we’re dealing with the fundamental nature of being, something that’s outside the realm of ordinary observation and measurement. Such a conjecture seems to offer no scientific value, because it can’t be directly tested in the laboratory. As a result, there seems to be a pervasive attitude that ideas involving metaphysics have no real value to the modern world at all. On the biocentric videos, many comments can be summarized thus: “This is just philosophy. You can say all you want that things don’t exist if we aren’t around to perceive them, but that’s bullshit. You’re only changing the definition of the word ‘exist.’ Things exist whether we’re there to perceive them or not.”

This is a naive argument. One cannot dismiss a metaphysical position on the grounds that it is “just philosophy,” because whichever side of the issue you come down on, there is no escaping metaphysics. To assert that an object does possess absolute properties — qualities that exist independently of its interactions with other things — is to take a metaphysical position as well. Chew on that idea awhile. The assumption of an absolute defined particle somewhere off in the galaxy is equally “just philosophy,” and equally “bullshit,” as the idea that it’s only potentially there, not actually there. Having been brought up in the tradition of Western thought, we all carry around this assumption of absolute physical characteristics, possessed intrinsically and independently by every last microscopic object in the universe, as if assigned on the day of Creation by an omnipotent God. In physics, this assumed principle is known as realism. But the brute fact remains, there is no evidence whatsoever supporting absolute realism. None! If you’re a thinking person, you should seriously ask yourself: How scientific is it to base an entire physical worldview on a metaphysical position, a possibly flawed fundamental assumption for which there is no supporting empirical evidence?

Now, here’s where it gets really interesting. Quantum mechanics has been baffling physicists and lay people alike for 80-odd years. The findings of decades of experiments, such as delayed choice and quantum eraser, are extremely difficult to square with the traditional metaphysical foundation of absolute, pre-existing properties of matter. This is partly why there are so many quantum-mechanics interpretations; those that try the hardest to accommodate absolute realism, such as Bohmian mechanics and the transactional interpretation, are complex, bizarre, and highly controversial. But, believe it or not, every finding from every quantum mechanics experiment ever performed is consistent with the alternative metaphysical framework, where the physical properties of matter are relevant only in relation to systems capable of measuring them somehow. This concept follows quite simply from a broad generalization of Einstein’s special relativity, which showed that velocity and simultaneity are never absolute and can be described only in relation to an observer’s frame of reference. (See our video titled It’s All Relative.)

The great physicist Richard Feynman once said that philosophy is as useful to physicists as ornithology is to birds. This is the “shut up and calculate” view, in which physics is employed only to predict the behavior of physical systems — the purely empirical approach that shuns any discussion of meaning or the “true” fundamental nature of things. But are numbers and calculations all we want to get out of science? After all, the Ptolemaic model of astronomy was surprisingly accurate at predicting eclipses and other events — but the reason we adopted the later Copernican model wasn’t only to improve the accuracy of astronomical calculations. Turns out, it’s quite useful to know that the planets really do orbit the Sun, and do not orbit the Earth while moving on an intricate system of invisible circles or “epicycles,” as was once believed. The Sun-centered model provides the basis for a more fundamental and elegant explanation of what’s really going on in the relationships between the Sun, planets, and Earth.

A fundamental explanation is exactly what some physicists are seeking from the increasingly legitimate theories of observer-centered realism, which profess that observation is an active and intrinsic element in the unfolding of reality. (The biocentric universe is one such theory. Here’s another.) Experiments may soon unlock numerous mysteries that have arisen from both quantum mechanics and cosmology. For example, why did the initial conditions of the Big Bang produce a universe that appears to be fine-tuned for life? To answer this question under the standard metaphysics, we either need to appeal to an intelligent God, or invoke multiple universes combined with the anthropic principle, a conjecture that I find unsatisfactory. Neither proposition is testable, so we’re back to basing our explanations on unsupportable assertions — which, I regret to say, is not a scientific endeavor, no matter how many shows about the multiverse are broadcast on the Science Channel. (Paul Davies has chimed in on this. For an exhaustive look at how contemporary science is becoming increasingly “faith-based,” read Lee Smolin’s The Trouble With Physics.)

I know what you’re thinking: Like the multiverse, a metaphysical position cannot be directly tested in the lab, so how can it ever be a part of science? Some theories do prevail despite not being directly testable. Evolution theory, for example, is accepted primarily because a huge amount of evidence, from multiple disciplines, is fully consistent with the theory, to the point where unforeseen details like the genetic code were predicted to exist, and subsequently confirmed. This is basically why multiverse theories are accepted as well: because the conjecture of multiple universes is consistent with the real observation of a seemingly fine-tuned universe. However, quantum mechanics is far more consistent with the metaphysical position of observer-centered realism, compared to the opposing metaphysical view of absolute realism. So if the entire universe is a quantum system, we need to think about what that means for cosmology and the universe’s initial conditions, which we have long assumed to be absolute. Indeed, Stephen Hawking theorizes that the universe may not have had a unique beginning — that its initial conditions existed in quantum superposition, just like the electrons of an atom’s electron cloud. In other words, the initial conditions were not fixed and singular, assigned either by God or by chance. Instead, they are relevant only in relation to today’s universe, in which physicists calculate them from the “top down,” i.e., working backward from the present conditions that we do observe. No intelligent God, or multiverse, necessary.

Personally, I believe that observer-centered realism will be confirmed, albeit indirectly. Just within the past month it was found that molecules of DNA are able to interact with quantum systems in ways that ordinary, non-biological molecules do not. Perhaps this is the first of many discoveries pointing to the fundamental role that biology plays in physics, which will then lead to a revolution in technology and medicine. But that will never happen unless we entertain alternative metaphysical viewpoints about our place in the world as observers. If we take that leap, someday soon we might see the real benefits of interpreting empirical science through a metaphysical lens — which at last will prove that metaphysics isn’t “just philosophy” after all.

Thursday, March 17, 2011

Stuck In A Rut? Rearrange Something

Ever since I got some attention on YouTube for my satires, I’ve become less and less interested in recording music, which has long been my No. 1 passion in life. When I was just a musician on MySpace, I was excited if I got a couple dozen song listens, and now I have over 15 million video views. As someone who’s always wanted an audience, it became far more rewarding to make comedy videos that get tens of thousands of views than to record a song that may never be heard by anyone.

So my home studio began to deteriorate. It was accumulating massive amounts of dust, and even in their inactivity, somehow the cables strewn about the floor seemed to get increasingly tangled. A few days ago I finally spent a day with the vacuum and Swiffer and got everything back where it belonged.

While cleaning up, though, I realized that my studio had been set up almost the same exact way since I got out of college, even after numerous moves. Originally I had not only a receiver and CD player but a turntable, cassette deck, and DAT machine, not to mention several VCRs and a TV — so I had a large wooden “entertainment center” to house them all. But while cleaning, I realized that only the receiver and CD player were left, so why the hell do I have this huge behemoth of furniture taking over the room? In one of those rare but liberating moments of transformation, I decided it was time to say goodbye to the giant wooden cabinet. This allowed me to move the futon couch away from blocking the closet doors, which had always been a pain.

What a difference! My studio went from being a rigid, awkward layout shoehorned into a room where it didn't belong — a place where you couldn’t go in a straight line more than about five feet without hitting something — to a wide-open space highly inviting for performing and recording. And immediately I started playing music again. (I’m easing my way back with a cover song; look for a performance video on my YouTube channel in a couple of weeks.)

For no real reason, we tend to hold on to objects and habits that we’ve lived with for years. Like the “junk DNA” left in our genome from the evolutionary adaptations of our sea-worm and lungfish ancestors, which similarly accumulates over time never to be cleaned up, these objects and habits simply stay with us, by default. We unwittingly learn to work around them, somehow remaining unaware of how they encumber us. I had stored an old Mac and its associated peripherals under my studio desk, which annoyingly reduced the legroom there; this crap is now much better stored in a Hefty bag in the laundry room. Also, the studio door would never stay all the way open, so in five minutes I made a little magnetic thing that keeps it open whenever I want — finally! No more kicking the book or T-shirt as a door stop.

Until we force ourselves to examine whether a setup is really working optimally, we put up with what we’ve got. It works, we might tell ourselves, without realizing that it’s working badly. Thomas Jefferson believed that we ought to throw out the Constitution every generation and rewrite it from scratch. While that may be a bit extreme for a democracy, we can rewrite things in our lives, anytime we want.

Think about things in your home or your life that no longer serve their purpose. Is it really optimal for your sofa to be there and the table to be there? Does your Facebook page really have just the friends you want to have? How many static, worn-out things in your life are like that just because you haven’t bothered to think about changing them? How many little inconveniences can you remove from your day, just by putting aside a few minutes to notice them and then solve them forever?

You don’t need to buy a new car or move to a new house or a new city to shake things up. Just think about your surroundings, and then move some things around. It’s easy to improve your life in small but significant ways, so do it. You’ll be glad you did.

Sunday, February 20, 2011

Lose Weight, Know Death

Last fall I uploaded the video for my song “Constipation,” after being inactive on YouTube for several months, and a bunch of people commented, “You’ve gotten fat!” Turns out I had gained about ten pounds since I last weighed myself (and I wasn’t exactly skinny before). A check of the body mass index chart showed that for the first time in my life, I was in the “overweight” category. I immediately put myself on a weight-loss program. It’s been about four months, and although I am no longer crash-dieting, I’ve lost 27 pounds. It feels great, and it’s satisfying to pick up three gallon-jugs of water and realize that this is the amount of me that’s no longer “me.”

My body is now some 15% less of a body than it was when I started. I am still the same person; it’s just that about one-seventh of me has gone away. That one-seventh is now dead. It has transitioned from being a part of my living tissue, to being entirely nonliving. It is now like all ordinary matter — molecules and atoms freely wandering in the world, unconstrained by cell membranes and the processes of life, broadly scattered about the environment in the form of metabolic by-products and residual heat. This 27 pounds is not conscious; it is not experiencing anything whatsoever.
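A rough consistency check on those figures, working backward from the stated percentages (the starting weight itself isn't stated, so the range below is an inference):

```latex
% Both stated fractions point to a starting weight in roughly the same range, and three
% gallon-jugs of water are in the same ballpark as the 27 lb that was lost.
\frac{27\ \text{lb}}{0.15} = 180\ \text{lb},
\qquad
27\ \text{lb} \times 7 = 189\ \text{lb},
\qquad
3\ \text{gal} \times 8.34\ \tfrac{\text{lb}}{\text{gal}} \approx 25\ \text{lb}.
```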

I realized that since the 15% of me is now dead, that means that when I as an individual die, I will be effectively losing 100% of my weight. Death is the simple transition of living matter to nonliving matter; this can happen equally effectively to cells, organs, or an entire person. So when I die, 100% of my body will undergo this transition, and that 100% of me will feel exactly the same as the departed 15% of me feels right now: nothing.

Of course, this is an imperfect analogy; I lost little or no weight from my brain, and molecular fat within fat cells does not participate in consciousness. But this doesn’t really matter, because pretty much the same thing would happen if I lost one-seventh of my brain in a grisly accident. A decomposing chunk of brain tissue doesn’t experience consciousness, either; the lost one-seventh portion would be exactly as unconscious as the 27 pounds of fat that I’ve lost. And if one-seventh of my brain died, I don’t think part of me would go to Heaven … would it?

This is an area where I feel that even moderate people of faith are living in pre-scientific times. There is no localized “seat” of consciousness, no specific location of the soul, in the body. We all know that if we lost one-seventh of our brain tissue, our consciousness would suffer — consciousness deteriorates readily from something as simple as a high fever. The many bizarre cases written about by Oliver Sacks are proof positive that our sense of the world (including the self) is tied to the physical condition of the brain. How does the idea of an eternal soul work with a person like Terri Schiavo? Do Christians feel that she was actually fully conscious in some manner as she lay in her waking but vegetative state? Or, when she died, did her healthy consciousness reconstitute itself before going to heaven? And which consciousness was that — as it was just before she suffered brain damage at age 26, or a younger, more naïve consciousness? Do persons born with severe developmental disabilities become normal after death? Do those with minor learning disabilities, or traumatic memories, lose them before they go to Heaven? Do sufferers of obsessive-compulsive disorder learn to chill out after they die? What if certain people’s disorders or flaws actually helped them to achieve great things on Earth?

I suppose if I were a believer, I’d say something like, everyone has a perfect soul or spirit which can be trapped inside a flawed body, but which becomes free upon death. To me, though, if a person’s soul in Heaven is different from his or her waking self on Earth, then it isn’t the same person — any more than someone is the same person after they’ve been given a lobotomy, or developed Alzheimer’s.

The problem is, the idea of an eternal soul is logically incompatible with the idea of an organic body that hosts consciousness organically. There could be no self-consistent “theory of the eternal soul” that explained how that soul relates to an individual’s personality, memories, and experience on Earth. Life after death is fine as a bedtime story, but when scrutinized with any logical rigor at all, none of it makes sense.

Now if you’ll excuse me, the remaining six-sevenths of me has a life to enjoy.

Wednesday, February 9, 2011

How People Sleep At Night

You often hear the expression, “How does (so-and-so) sleep at night?” We wonder how people who do wrong things manage to live with themselves. At the height of their wrongdoing, how did Bernie Madoff, or Saddam Hussein, or Joseph Stalin sleep at night?

I'll tell you how they slept: Just fine, I'm sure.

People have the ability to shape the subjective reality that they live in — the world in which they see themselves embedded — however they see fit. Let me rephrase: All people actively shape their subjective reality, all the time. It is a part of human nature; there is no escaping it. For most people this isn’t a big deal. For others, it’s a very big deal, because it’s what lets them sleep at night.

I first came upon this idea right after the O.J. Simpson trial in the ’90s. Here was someone who had almost certainly murdered two people, but the murderer himself seemed to have no knowledge of this fact. At press conferences, incredibly, he would talk about how he planned to devote his life to finding “the real killers.” It didn’t make sense; I believed that O.J. himself believed that he was innocent. It seemed as if he had rewritten his internal history, the memories in his brain (which at some time had to be incredibly vivid), to the point where, in his mind, Nicole Simpson and Ron Goldman were killed by someone else entirely.

Bingo. That is exactly what Mr. Simpson did, although consciously, he has no idea what he did. Subconsciously editing a murder out of one’s memory is extreme; it’s difficult to believe such a feat of selective memory is even possible for a human. But this is merely an extreme case of something that happens all the time, in all of us. I started calling it the O.J. syndrome.

More recently I’ve come to call this effect the everyday Stockholm syndrome. The Stockholm syndrome is what happens when someone is captured against their will by a group, and then over time, they come to identify with that group and cooperate with it. (Patricia Hearst is the classic example.) The Stockholm syndrome is recognized as a defense mechanism for people under tremendous stress or duress — but again, it’s just an extreme example of something that routinely happens to all of us.

My friend Vivian is a good case of the everyday Stockholm syndrome. She was, and is, ideologically a liberal person — the kind who would volunteer for an environmental cause. But then she got married, her husband was hired by an oil company, and they moved to Houston. Although she is still socially and politically progressive, when it comes to climate change and energy regulation, she can rattle off all of the conservative arguments. It isn’t that she doesn’t believe them and she’s just “acting.” It probably isn’t even that she independently changed her mind on these issues, based on some enlightenment. It’s that she found herself in an internal conflict, what psychologists call cognitive dissonance, and this was the way out. Subconsciously she became motivated to experience a shift in the way she saw the world. If her subjective reality didn’t undergo this transformation, she’d literally be sleeping with the enemy, the man she loves, and that just wouldn’t do. So gradually, her subjective reality changed by just a small amount, and once that happened, the life she lives became completely fine. And she’s sleeping at night, no problem.

I noticed this myself a few years ago. I got a boatload of work animating promotional videos for Verizon Wireless. Mind you, normally I’m about as anti-corporate as anyone, vehemently so. I kind of had to grit my teeth to make these videos pimping a huge phone company and its celebrity affiliations. But after a couple of weeks, I caught myself thinking, “Verizon is actually pretty cool.” NOOOOOO! There was nothing about Verizon’s inane promotions that made me feel this way. But in the immersion of it all, I noticed a slow change in my perspective.

Nowadays I smile when someone brings up this question. “How does Glenn Beck sleep at night?” Very well, I’m sure! Many people would like to think that on his way to work, Glenn is psyching himself up for another hour of cynical lies, and on his way home he’s wondering how he could have done such a thing, perhaps pleading with God for forgiveness. But, that is a liberal fantasy! No matter what his personal ideological history might be, I guarantee you that anyone in Glenn Beck’s position, making that much money and with that much fame and influence — and all of the faithful followers constantly validating his opinions — will go to work telling himself, “I am going to be telling the truth today! People need someone like me to tell them the truth! I am doing the right thing!”

Because that’s what Glenn Beck needs to do to sleep at night.

The “everyday Stockholm syndrome” fits well with other ideas I’ve written about. Each individual’s view of the world is always filtered through a subjective reality that sees and ignores whatever aspects it desires, at times embellishing life with experiences which (apparently) aren’t a part of objective reality. Many people live a “fake life” because it’s comforting to believe that praying works and dead relatives are looking down on them from Heaven. And full-on sufferers of the “Bullshit Syndrome,” such as creationists and 9/11 Truthers, have placed themselves in a bubble so profoundly impenetrable, the evidence for their position seems to be overwhelming; meanwhile there is zero conflicting evidence, so dissenters must all be mindless zombies who will believe anything that authorities tell them. (Actually, someone who thinks the government is out to get them probably doesn’t sleep so well at night.)

All of these ideas are brought together in an excellent article that a reader recently forwarded to me — thanks, Ian.

Thursday, January 27, 2011

Decoherence: Destroyer of Weirdness

Last night I watched an episode of one of my favorite TV shows, PBS's Closer to Truth, which deals with scientific perspectives on questions of philosophy and theology. The episode was called “Why Is the Quantum So Weird?” From Scientific American magazine to popular pseudoscience books like The Secret, we are told that the world of the very small is extraordinarily strange, counterintuitive, unlike anything we can relate to in everyday life — where particles can seem to be in two places at once, go backward in time, “tunnel” through impermeable barriers, etc. We know that on very small scales, these quantum phenomena do occur, and in fact quantum mechanics establishes the theoretical basis behind everything from transistors to quantum computing. So, why is the quantum world so strange?

The episode provided an excellent run-down of quantum theory, but it didn’t provide a satisfying answer to the question. That’s because it’s not the right question. We should be asking, Why is the ordinary world not weird? Because this is a question we actually have an answer for.

Assigning a value-judgment word such as “weird” to quantum phenomena betrays how biased we humans can be. We expect things to behave the same on all scales, large and small, because that’s how a physically consistent universe should be. If a tennis ball can be in only one place at once, we assume that the same must be true of an electron. In fact, today many physicists agree that the world does behave the same on all scales; however, this behavior is most accurately described by the laws of quantum mechanics — even the behavior of the entire universe as a whole. (This is the scientific basis for the parallel universes of the famous “many worlds interpretation.”) In other words, the entire universe, on all scales, is “weird.” So why does it make sense to us? Why do we never see evidence of a tennis ball being in two places at once, or passing through a brick wall, as we do for subatomic particles?

The answer relates to something called quantum decoherence. Worked out by physicists in the 1970s and ’80s, decoherence refers to the loss of coherence, which is the property of a quantum system (such as an electron) that can give it an uncertain, blurry, smeared-out physical description. An electron in a coherent state can be in superposition, meaning that its precise location, momentum, spin, and so on are undefined or blurred out: It appears to possess many values for these things at once. (Most people learn about this bit of quantum weirdness by way of the electron cloud that surrounds an atom's nucleus, but free electrons and other particles have this property as well.) However, if that electron encounters an electron detector, the system of the electron and the detector will undergo decoherence, and the electron will appear to suddenly “snap” into one definite state. You often hear this process described as the collapse of the wavefunction, although that phrase is falling out of favor among the physics crowd.
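
For readers who like to see the shorthand, here is the standard way a two-option superposition gets written. This is a minimal illustrative sketch of my own, not something from the episode; “here” and “there” stand in for any two alternatives the particle might have.

|ψ⟩ = α|here⟩ + β|there⟩,  with |α|² + |β|² = 1

Before decoherence, the electron genuinely carries both possibilities at once. After it meets the detector, you find it “here” with probability |α|² or “there” with probability |β|², and never both.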

Decoherence causes ordinary macroscopic objects to behave differently than subatomic particles; unlike electrons or photons, they always exist in definite places and follow well-understood and predictable or classical laws of motion. To experimentally prove that decoherence is responsible, just take an object and put it into a coherent state of superposition, and keep it that way — prevent decoherence from happening. To achieve this feat, though, there’s one thing you need to do: The object must be completely removed, or decoupled, from interaction with its environment. For example, it needs to be kept incredibly cold, at a fraction of a degree above absolute zero. This is because the moment the object starts getting hit with photons of heat or light, those photons begin to “observe” the object. In doing so, they carry away enough information about the superposition that the object appears to collapse into one definite state, with astonishing speed. Decoherence ensures that anything that’s directly observed in any way at all cannot remain in a state of superposition. Even though everything in the universe obeys those “weird” laws of quantum behavior — all the time — whenever there’s any kind of observation going on, decoherence destroys that quantum weirdness. In the process, it creates a world that makes sense.
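
In the decoherence literature this is usually described with a density matrix. Here is a deliberately simplified sketch, using the same two-option state as above:

ρ = |α|² |here⟩⟨here| + |β|² |there⟩⟨there| + αβ* |here⟩⟨there| + α*β |there⟩⟨here|

The last two “interference” terms are what make the superposition observable. Interaction with the environment suppresses them, in simple models roughly like e^(−t/τ), and for a macroscopic object the decoherence time τ is so fantastically short that the object is, for all practical purposes, always found in one definite state.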

Actually, decoherence only destroys the weird aspect of nature; it doesn’t destroy the alternate states represented by a superposition, or change anything about the way quantum mechanics works. If a tennis ball in a quantum superposition undergoes decoherence, information describing those potential alternate states still technically exists in the world. It’s just that it has been irreversibly lost to the chaos of the environment, and like Humpty Dumpty, no amount of effort will be able to restore it. The “blurry” aspect of a tennis ball that has undergone decoherence is a little like the kinetic energy of a car with the brakes applied: It isn’t destroyed altogether, but only gets dissipated into the environment. Once this happens, the probability of any alternate state reappearing — for the alternate positions or momenta of all of the ball’s particles to randomly reconstitute themselves, allowing us to see a second ball — becomes vanishingly tiny.

So the next time you’re playing tennis, and you hit one definite ball back to a definite spot on the court, you can thank quantum decoherence for making it possible.

Thursday, January 6, 2011

"Susan G. Komen" Is A Cancer

I’m sure you’re familiar with “Susan G. Komen for the Cure,” the charity organization that puts pink ribbons on countless commercial products in the effort to raise awareness about breast cancer. I touched on “pink-ribbon saturation” in a previous post. Komen has attracted controversy in the past for giving grants to Planned Parenthood, and also for some of its dubious alliances with decidedly unhealthy products. Since then, Komen has come under additional fire for legally challenging charity groups that use “cure” in their names, including tiny operations such as “Cupcakes for a Cure” and “Surfing for a Cure.” Komen spends nearly one million dollars per year on its legal department, which defends some 200 trademarks, all in the effort to prevent what spokespersons call “confusion in the marketplace.” This is money that was donated in good faith specifically for the cure of cancer, and instead is siphoned off to fund the intimidation and legal strong-arming of much smaller charities devoted to curing cancer.

Are we angry yet? Komen’s partnership with KFC — which, incredibly, attempted to associate fried chicken with good health — resulted in $4.2 million, the largest corporate donation so far. And yet, in just over four years, every last penny will have evaporated in Komen’s legal department alone.
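
To make that concrete, here is the back-of-the-envelope arithmetic, using the roughly $1 million per year legal figure cited above: $4.2 million ÷ $1 million per year ≈ 4.2 years.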

I e-mailed Komen (phone: 877-465-6636) to complain about this travesty. In their form-letter reply, they stated that 84 cents of every donated dollar goes directly toward “all of these programs,” which include “community outreach and advocacy.” It is unclear how many of those 84 cents go directly toward the cure, such as funding experimental trials. And still, I have to ask, why only 84 cents? Why not 95 cents, or 98 cents? Hell, even Las Vegas slot machines pay back at a rate in the mid-to-upper 90 percent range. When you donate a buck to Komen, what isn’t spent on marketing and promotion to grow the organization goes to lawyers, administrators, and others who will never cure cancer.
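
Put those numbers side by side: Komen’s own figure leaves 100 − 84 = 16 cents of every donated dollar outside its programs, while a slot machine paying back 95 percent keeps only 5 cents of your dollar.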

Komen admits that it occasionally “asks another charity ... to consider clarifying the name,” but that it spends “far below the $1 million mentioned in one news story” on legal challenges to protect its brand. Yes, we get that lawyers also do contract work. But hey, guess what: Whether $500,000, $50,000, or $1,000, any and all of it is too much. It’s all money donated — but not spent — for the cure.

Here’s the enormous irony of it all: Komen, an entity founded to help cure cancer, has become a cancer itself. In the human body, cancer begins when a few healthy, useful cells undergo a change that causes them to grow uncontrollably into a mass. This mass then continues to grow, gobbling up more and more resources in the process (to the detriment of the system as a whole), eventually getting so large that useful organs — I’m thinking of groups like “Cupcakes for a Cure” in this analogy — get crushed. That is how cancer kills an individual, and that is why overgrown, overstaffed groups like Komen unintentionally work to destroy the reputation of important causes. It’s as if the original Susan G. Komen’s cancer lives on, decades later, having long moved on from its host’s dead body to infect all of society, albeit in a subtler, but still insidious, manner.

Many of us have horror stories about working with or volunteering for a major charity. That’s because a successful nonprofit tends to gradually morph into a self-sustaining entity whose primary function becomes growth, the assumption being that a larger and more visible charity can do more good than a small, obscure one. However, the larger an organization becomes, the less appropriate the word “nonprofit” is: Anyone on the payroll is profiting very much from the business, and the group needs to meet a quota of donations to keep up. A board of directors and a legal team are assembled. Advertising, promotion, and publicity begin. The original motivation for the mission fades into the background; ideas such as market penetration, branding, and trademarks become important, and the legal challenges commence. Meanwhile, all of this growth is increasingly justified in the name of “awareness.” At this point, do we really need to be made more aware that breast cancer exists?

Don’t get me wrong; I am all for helping those in need. I’ve given four-figure donations to the American Red Cross in response to recent disasters. However, now I’m wondering if that was the best way to donate. From now on I will seek out smaller, more direct means of giving. You know, you can give directly to a children’s hospital; you can call up a local school and ask where you can send a check for classroom materials; you can give to the public library. I could start my own charity right now and do just that with 100% of the donations. An organization with an elite board of directors and hundreds or thousands of employees on the payroll cannot.

Giving to a major, highly visible charity, as epitomized by Susan G. Komen for the Cure, is the laziest, most inefficient way to be generous. Don’t feed the cancer.