Faith, Belief, Evidence, & Fraud

Religions encourage and praise faith without evidence in their believers. Christianity is no better, nor probably any worse, than other religions that way. Cynics would argue that religions have to do this, that their authority comes from a source that can’t be proven.

I write tonight, not to blast religions or the religious, but to discuss the limits of faith as a virtue.

It’s one thing to accept or deny a belief in the absence of evidence. Is there a god? Things that were once accepted as proof of gods are now more often understood through science. That doesn’t disprove the existence of a god, but it also makes it as hard as ever to prove that there is a god.

What about a hundred other things that were once taken on faith that no longer stand unchallenged? Are men superior to women? Is one race superior to the others? Is the Earth the center of the universe? Are left-handed people evil? Exposure to other cultures has raised or validated doubts about gender superiority and racial superiority, even if some refuse to accept those arguments. Science has provided us with models that match our observations better but require us to deny that the Earth is the center of the universe, or even of our own solar system. And, thankfully for my sister and others, left-handedness is no longer seen as a sign of evil.

There are people who refuse to accept scientific evidence that contradicts beliefs codified in ancient religious texts. How old are our planet and our universe? A literal interpretation of Judeo-Christian scriptures would suggest an age of 6000 years or so; modern science suggests billions of years, not mere thousands. On one side, some suggest that the devil plants false evidence to make us doubt holy scripture. On another side, some suggest that religious scriptures aren’t meant to be scientifically literal and accurate.

Closely related is the question of evolution, particularly as opposed to creationism. Again, does one take the Bible literally or does one accept scientific evidence to provide a more nuanced view of the world than was possible 3000 or 4000 years ago?

What about vaccines? This isn’t directly a religious argument, apart from some religions that reject most or all of medical science. But some people reject the arguments that vaccines are effective and best for the communal health of society. This is a particularly vexing issue for scientists, because the origin of the argument against some types of vaccines is well known, as is the fraudulent nature of that argument. Its initial proponent was trying to sell a different form of vaccines for which he owned patent rights; his arguments against specific vaccines became the basis of an argument against all vaccines, even though it’s well known the proponent was scientifically dishonest and fraudulent.

Here the arguments get, in many views, irrational. An agency of the US government, one argument goes, was corrupted and suppressed one or more studies proving that vaccines are dangerous. Never mind that other governments around the world have refuted the arguments against vaccines; there must be some corruption somewhere to explain this. Others cling to anecdotes that blame vaccines for unexplainable illnesses, especially autism. We don’t know precisely what causes autism, so why not vaccines? The fact that no correlation has been found in large studies doesn’t dissuade the people who hear some parent’s anguished argument about how their kid was fine until they had vaccines. It is in response to this fallacy that one hears the retort, “The plural of ‘anecdote’ is not ‘data’.”

This argument has real consequences. Some people, usually due to severe illnesses, can’t be immunized. The best defense for those people is to immunize everyone around them so they aren’t exposed to the illnesses for which they can’t be immunized. This is “herd immunity,” the concept that immunizing most of a group is almost the same as immunizing the whole group. How much of the group counts as “most” is a key point; some parents believe that it’s not important to immunize their own child because everyone else’s will be, negating the risk. When enough parents make that choice, the percentage of the population immunized plummets below the threshold for minimal herd immunity, and suddenly we have outbreaks of diseases that are easily preventable.
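For the curious, the arithmetic behind that threshold is simple. Here is a minimal sketch, using the textbook simplification that a fraction of 1 − 1/R0 of the group must be immune, where R0 is the disease’s basic reproduction number; the R0 values below are rough illustrative figures, not numbers from any study discussed here:

```python
# Back-of-the-envelope herd immunity threshold. The simple standard
# model says a fraction 1 - 1/R0 of a population must be immune, where
# R0 is the average number of people one sick person infects in a
# fully susceptible population. R0 values here are illustrative only.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the group that must be immune to stop sustained spread."""
    return 1.0 - 1.0 / r0

illustrative_r0 = {"measles": 15.0, "pertussis": 14.0, "polio": 6.0}

for disease, r0 in illustrative_r0.items():
    print(f"{disease}: roughly {herd_immunity_threshold(r0):.0%} must be immune")
```

The takeaway is that for highly contagious diseases like measles, “most of the group” means well over nine people in ten, which is why even a modest number of opt-outs can break herd immunity.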

The debates about evolution or vaccines, while emotional and fervent, usually are honest, with the notable exception of the initial claim that some mercury-based vaccines would cause autism. The “debate” about climate change, on the other hand, resembles the vaccines issue in some ways, but it is corrupted by very real commercial interests.

Approximately 97% of scientists accepted by other scientists as experts in the field agree that climate change, global warming, whatever you choose to call it, is driven by human activities. There are well-debated theories on the mechanisms in modern human history that have caused and accelerated global warming. The science of those mechanisms is well understood, and so are the human activities fueling (almost literally) the warming.

Alas, some of the technologies driving the developed world’s explosive economic growth are also driving the devastating increase in the causes of global warming. Simply put, the burning of fossil fuels has been a boon for the developing world but a threat to the health of the planet. This is very analogous to how consuming tobacco products contributes to higher rates of cancer. Equally analogous, both fossil fuels and tobacco make their producers rich. Reducing demand for those products, while understandable from many perspectives, runs counter to the interests of those producers.

The tobacco industry lost their battle to prove their products innocuous. Their products haven’t been banned outright, but in the USA and several other countries, demand has been greatly reduced. The fossil fuel industries have learned from those battles and have fought tooth and nail to prevent acceptance of the role of fossil fuels in threatening our planet. They try to sow doubt that global warming is real. They try to sow doubt that humans cause it. They try to cherry-pick data to show the problem isn’t as severe as claimed. They try to attack the methodology and credibility of the scientists studying the problem and concluding that we humans are to blame.

All of this leads to this weekend’s Marches for Science, literally around the world. Most of the developed countries of the world and almost all of the developing countries accept the science of global warming. Every government on the planet accepts the benefits of vaccines. Basic scientific research is seen almost everywhere as a good thing that helps countries and cultures advance. My country, the United States of America, has the only government in the world that acts as if global warming isn’t settled science, and that is due to the self-interest-driven political activities of the fossil fuel industries.

Ours is a technology-based country. Internal combustion engines are ubiquitous, as are modern hospitals, cellular telephones, computers, plastic products, televisions, people conceived through artificial insemination, farm animals and race horses conceived through artificial insemination, and a million other elements of daily life whose origins lie in the scientific method. People who decry vaccines and people who profit from fossil fuels all casually use these products of the scientific method but deny its validity on their very narrow issues. Fossil fuel companies in particular employ scientists and engineers by the tens of thousands to help them find fossil fuels, extract them, refine them, and get them to market — but they refuse to accept the proof of the consequences of their actions.

Will the Marches for Science make a difference? I hope so, but it’s easy for me to be cynical. Will people suddenly turn against their elected representatives out of a new-found respect for science after this weekend? Or will they remain convinced that the Earth is 6000 years old and that scientists who believe in evolution and geology were duped by the Devil? Conditioned as they are against “intellectual elites,” will they ever admit that perhaps they should listen to people smarter than they are and not just people who say what they want to hear?

I’m glad the scientists are getting politically active, but I hope we figure out how to address the root of the problem: fraudulent rejections of science driven by selfishness.


Patriot Day and Anti-AntiFa Efforts

Part of me is impressed that the “Pro-Trump” rally in Berkeley, CA, was called a “Patriot Day” rally. As a child of Massachusetts, I grew up with the anniversary of the Battles of Lexington and Concord being celebrated as Patriots Day. Little did I know that only Massachusetts and its spin-off, Maine, celebrate that day. (Are Massachusetts and Virginia the only states to shrink because counties split off and became their own states? Is it just a coincidence that these are two of the four states named as Commonwealths?) (Hey, you knew this was called “Overanalysis While You Wait” when you started reading.)

Mostly, though, I’m dismayed at the cognitive dissonance on the part of some who attended the rally. The people distributing instructions on how to build signs that could easily become weapons in case counter-protesters provoked a fight were expecting to fight “Antifas.” “Antifa,” of course, is a shortening of “anti-fascist,” in the same way that “alt-right” is a shortening of “Racist Islamophobic xenophobes.” If your opponents are anti-fascism, does that make you pro-fascism? Or just shit-stirrers who look for reasons to fight?

Definitions of fascism vary, but one common element seems to be a strong, authoritarian government. This makes their reference to Patriot Day especially perverse, because the colonists in Massachusetts were against the strong central government in London. They were against the strongman King. They qualify as “antifa,” not pro-fascist.

Once again, people who like Donald Trump are shown to have flunked the most basic elements of American history.

The Annual March Rant

True or false:

Basketball conference games exist only so ESPN has content to broadcast four nights a week in January, February, and early March.

Before conference play starts, inter-conference games and November tournaments sort out in rough form which teams should be in the Top-25 or Top-40. Some teams emerge as better than expected and earn some attention; some teams disappoint early and might actually need the conference schedule to tune themselves up.

Once conference play is over, almost any team qualifies for their conference tournament. Any team that gets hot at the right time can play their way to the conference championship, earning an automatic bid for the tournament. If two teams get hot in the same tournament, well, one of them loses out, unless they somehow impress the selection committee enough to earn an at-large bid to the tournament.

This logic doesn’t apply to the smallest conferences — not small in terms of membership, but in terms of stature. There are lots of conferences that routinely send only their conference champion to the tournament. In those conferences, conference play is all about seeding for the conference tournament; you’d rather be a #1 facing off against an #8/#9 winner than a #7 facing a #10 before hoping to face a #2 (or that #9 facing an evenly matched #8 before being fed to the #1). These conferences don’t often turn up on ESPN, although there are so many other ESPN-branded channels and other sports channels, they get some exposure.

But for “mid-major” conferences, those without major football teams but otherwise well established, winning the conference tournament isn’t essential; now that there are more than thirty at-large spots, it’s common for those conferences to get at-large bids for members with good inter-conference records or, rarely, surprisingly strong conference records. If you’re a close second to a team with a good inter-conference record, maybe that rubs off on you.

Still, the difference between finishing second in the Big 12/ACC/Pacific (8? 10? 12?) and finishing sixth in the conference round-robin isn’t that bad; you still have a good chance to make the 68-team tournament with a strong inter-conference record. Your largest concern is getting games close to home compared to playing two timezones away from home and maybe having to start in Dayton for an 11-seed play-in game. Play well in the conference tournament to show you’re still strong and you should be OK.

There are more at-large bids than automatic bids. Conferences can send six teams or more to the tournament. There have been years when three of the Final Four were from the same conference, and more than one conference has done that.

Imagine if there were only thirty-two bids. Oh, dear! That’s the number of automatic bids! Why, some years either Duke or North Carolina wouldn’t get in! OK, we’d better make it thirty-six or even forty bids and have eight play-in games: four for the eight weakest conferences and four for those eight at-large teams. Now the conference tournament becomes much, much more important. Kansas bombing out of the Big 12 tournament in the first round might now cost it a bid, not merely a better seed in the tournament.

Want to make the conference schedule more than just a seeding exercise for the conference tournament? Put in a provision: if there’s a clear-cut (no tie-breaker needed!) conference champion from the conference season, they get a game against the tournament champion if they don’t also win the tournament to get that automatic bid. That means conference tournaments would have to aim for Saturday completions, with Sundays reserved for those season-champion against tournament champion grudge matches. CBS and ESPN might have many, or they might have none. That’s excitement!

So, why have I been so ruthless in cutting down the size of the NCAA tournament? Because I hate one bad game forcing a team out of the tournament. Once the field of 32 is set, after those play-in games, make the tournament double-elimination! After a thirty-game season, surely making a team lose twice before they’re out isn’t unreasonable. It makes the date of the final game uncertain, but three of the major professional leagues (and their broadcast networks!) already deal with that.  The NCAA even does that incrementally at some levels of the baseball and softball tournaments; rounds alternate between double-elimination and best-of-three. (An eventual champion could lose four games on their way to the championship — but they have to be spaced out accordingly.)

Don’t worry, “Cinderella” teams of destiny are still possible — they just have to peak a little higher sooner just to make the tournament and then be ready to beat any team twice, not just once, to get the championship.

Or, we can just admit that NCAA basketball conference games are just content for television networks, with some minor impact on seeding the conference tourneys.

NPR, Among Others, Has Lost Its Way

I am very much a child of the Sixties and Seventies. I started first grade in the fall of 1969. The war in Vietnam was on the evening news, or so I am told. Jack, Martin, and Bobby had all died during my brief life so far. The Vatican II convocation had been over for years; I only remember masses in English, with the priest facing the congregation. The Equal Rights Amendment was still a possibility. Female lectors and soon eucharistic ministers weren’t unusual, although I never served as an altar boy with a female altar servant.

It’s not that racial prejudice had simply vanished. Even as a child, I knew that there were racists in America. I might have thought they were mostly confined to the South, but I knew they existed. I also knew they were wrong. They were behind the times. They clung to outmoded, wrong beliefs. My first political memory is from the fourth grade, watching a mock debate about the ’72 Presidential race and asking what made McGovern think he could end the war in 90 days or whatever his pledge was. So, I don’t remember George Wallace being shot, but I knew as a child that he represented that segment of society that clung to the past, to white supremacy.

The environmental movement was starting. I didn’t notice as an eight-year-old when the EPA was created, but I knew that things like Earth Day and my green ecology lunchbox meant that we were trying to save the earth. There didn’t seem to be much doubt that we had to repair the environment; pollution was awful, epitomized by burning rivers (OK, only one; I was a little confused as a kid) and public service ads on TV showing fish dead in rivers due to pollution. We might have to convince greedy corporations and greedy people to do the right thing and change to prevent pollution, but there was no question that the environmental movement was at some level right and necessary.

As I grew up, things got more complicated. There were women who opposed the Equal Rights Amendment, for some perverse reason. There was opposition to school bussing to achieve public school integration — even in the North. Before I graduated from high school, Ronald Reagan was taking the contrary view that government was the problem, not the solution. He didn’t say that racism in America was bad; he just implied that government efforts to address it did more harm than good. He implied that about lots of government programs. Two years earlier, Allan Bakke had sued one of the University of California medical schools, claiming “reverse discrimination.” Some whites were starting to push back when they felt they were the ones paying the price so minorities could be given the chance to succeed.

Reagan, of course, was elected President in 1980. The ERA wasn’t ratified by enough states in the time allowed for its passage; it wasn’t brought up again, but laws about women’s rights were pursued at sub-Constitutional levels. Reagan tried to neuter the EPA, but public opinion forced him to reverse that attempt.

Before Reagan’s second term was over, I completed my education and tried to become an adult, whatever that meant. I remember watching the network evening news in the fall of ’87 about that day’s stock market crash. However, I soon acquired the habit of listening to National Public Radio affiliates. They weren’t government broadcasts, although some received a minuscule amount of public subsidies. It was enough that they weren’t owned by corporations but instead were run as non-profit efforts. If they were more liberal than “mainstream,” corporate-owned TV stations, radio stations, and newspapers, maybe it was due simply to the lack of corporate influences on their choices of news items and how they were presented. The Wall Street Journal went in my eyes from a respectable major newspaper to a paper with a specific, visible slant in favor of business over other interests. Other mainstream, respected outlets weren’t so pronounced in their biases, but I’d hear rumors that GE had killed this NBC story or Westinghouse had somehow meddled with a story on one of the stations it owned. Such accusations against NPR were less common and thus more shocking the few times they came up.

I listened to NPR during the tail end of the Reagan years and through the George H. W. Bush years. I kept listening during the Clinton administration with its violent tug-of-war with Republicans in Congress who refused to engage constructively with the administration. Some “reforms” were passed, such as sentencing guidelines and welfare “reform.” This was also the time when traditional broadcasting and journalism were starting to be augmented by “the Internet.” By the time the Clinton administration yielded to the Gore, er, George W. Bush administration, American politics were becoming sharply polarized. There were new channels and publications on the right that accused the mainstream media of being too liberal, never mind NPR or its even more leftist “public” rivals, such as Pacifica Radio.

My NPR affiliates changed as I moved, from the Binghamton, NY market to Elkhart/South Bend to Detroit to Washington, DC itself. I felt some sadness verging on anger as Bob Edwards was pushed out by NPR and soon ended up on for-profit satellite radio, for crying out loud! But, under Bush 41, Clinton, and Bush 43, NPR remained fairly consistent in its tone. It also remained fairly consistent as I bounced through Illinois during Obama’s rise to the Presidency and my moves to Austin and then back to the upper Midwest, now in Madison, Wisconsin.

Something about the rise of Donald Trump, culminating in his election to the Presidency, shook NPR, as it shook many media outlets. The mainstream media, including NPR, consistently underestimated Trump during his primary campaign and his Presidential campaign. He was such a repudiation of fifty years of concern for the downtrodden and the minorities. He was the voice for the spiritual descendants of Allan Bakke, proclaiming that they were being held back because of considerations given to minorities. Never mind that automation killed more factory jobs than affirmative action, multi-lateral trade pacts, or illegal immigration. Here was Trump proudly making statements that might have gotten him tossed out on his ass during the Sixties and Seventies and were too extreme for widespread acceptance in the Eighties and Nineties. NPR and the rest of the mainstream media kept waiting for “respectable conservatives” to figure out how to beat Trump, for Trump’s rising support levels to hit a ceiling, and for normal order among conservatives to be restored. They also struggled some with the rise of Bernie Sanders. Was the declared Independent really going to upturn the Democratic primary system and beat Hillary Clinton by running from her left? How far left was her left, anyway? Was she a moderate who was too friendly with Wall Street, or was she the progressive who had pushed for health care reform in her husband’s early years as President, only to come up short, and later had spoken truth to power at an international women’s conference in Beijing?

The Internet gave all kinds of voices ways to find their audiences. In particular, it let well-heeled corporate influences attack the mainstream media with the rise of Fox News and conservative talk radio, and that in turn led to the rise of the ultra-right, or white nationalists, or whatever label you’d like to give those to the right of the visible right. The break in the streak of forty-three straight presidencies held by white Christian men with the election of Barack Obama somehow energized those fearful of minorities. Mitt Romney spoke in 2012 of the 47% of America that would never vote for him because they benefitted too much from government largesse. His comments were quickly and loudly denounced by the mainstream, but surely they added fuel to those far-right activists who were convinced they were victims somehow — or that they could get rich convincing others that they were victims, not merely unlucky in the changing economic tides of the world.

Karl Rove was wrong in 2012 on election night, when he was so damned sure that Mitt Romney had more support than the press gave him credit for, that he was going to upset the incumbent Obama. However, in 2016, those making similar claims about Trump proved right when Trump in fact pulled off the upset against Hillary Clinton. The mainstream media immediately went into a frenzy worthy of the title of this blog, “Overanalysis While You Wait.” Had Trump won, or had Hillary lost? Was she a poor candidate, or was she a victim of a quarter-century of right-wing smear campaigns dating back to the Whitewater scandal in Arkansas? Had the FBI, deliberately or otherwise, sunk Hillary’s campaign by giving legitimacy to the alleged scandal of her e-mail server? Had Russia somehow sponsored the leaks about internal Democratic e-mails that made Clinton look less like a progressive hero and more like a political operative who’d do whatever it took to win?

NPR, among others, decided to take the tack that Trump had somehow won on the merits of his positions in the eyes of the voters. Even as Trump stacked his transition team and eventually his administration with Wall Street billionaires, NPR and others decided to find those voters who had turned out unexpectedly strongly, and possibly against their own self-interest, to vote for this populist-sounding candidate. Euphemisms like “economic anxiety” were invented as the reasons all these good American folk embraced a candidate with xenophobic, homophobic, Islamophobic, misogynistic views. Never mind that so much of what Trump had claimed from the first day of his campaign was demonstrably wrong. His supporters were treated as if their beliefs and their faith in him were rational and reasonable. Never mind that xenophobia, homophobia, misogyny, and religious discrimination were un-American and in some cases specifically prohibited by our foundational documents, including the almighty Constitution. “Economic anxiety” was presented as a powerful motivator, even as hate crimes against blacks, Jews, Muslims, homosexuals, transsexuals, and other marginalized people exposed the lie that this was somehow about “economic anxiety.”

Worse, NPR has decided, perhaps by default, to legitimize Trump’s administration despite the vast catalog of lies told by Trump himself and by his representatives, such as Sean Spicer, Kellyanne Conway, Stephen Miller, and Sebastian Gorka. They even refuse to label Trump’s untrue statements as “lies,” claiming they can’t be sure enough of his intent to call them more than untruths or mistakes. When people like Spicer and Conway repeat Trump’s claims as if they are unquestionably true, even in the face of evidence to the contrary, NPR continues to interview them and allow them air time, and it still doesn’t call them lies. This isn’t like eight years of opposition to the Obama administration. These aren’t policy arguments about whether healthcare should be universal or whether a President in the last year of his term can nominate a Supreme Court justice. The rate of self-serving lies, the number of policy changes being justified by demonstrable falsehoods, hasn’t yet caused NPR to stop treating these people as legitimate.

Journalists like to claim that their job is the pursuit and revelation of the truth. Some outlets, including the staid New York Times, have done so with enough gusto to become clear targets of Trump’s paranoid ire. If NPR has drawn Trump’s ire at all, it’s only by accident. They’ve forgotten the quote from Orwell: “Journalism is printing what someone else does not want published; everything else is public relations.”

NPR can’t be cowed by fear of criticism from the far right. They need to regain their old tone and adapt it to this age of routine deception by our incumbent administration. Challenge, challenge, and challenge some more, and for God’s sake, stop letting these liars and self-deceivers present their message directly. They aren’t NPR’s listeners, and they don’t deserve NPR’s consideration.

Rural & Urban, Conservative & Progressive, and So Forth

I don’t know if it’s always been true. It wasn’t what was talked about in Reagan’s Eighties, when the focus was on the West, not rural America. But, in this intensely polarized time, Trump’s supporters take comfort in those maps of counties that show broad red swaths of America, with a few blue enclaves in those most urban of counties.

Never mind that the population density of those blue counties dwarfs the population density of those rural red counties. “We’ve got all the land,” and given how the Constitution stipulates the Senate be made up, having lots of sparsely populated counties and states wins over having large advantages in a few densely populated counties and states.

This was on my mind on the drive back to Madison, Wisconsin, after a weekend in Chicago, ending with breakfast at one of the Chicago restaurants with a cult following, Lou Mitchell’s. Two omelette breakfasts set us back about $40; in Madison, it’d be less than $30, and there probably are lots of towns in America where, if you can get the local cafe or diner to make you an omelette, it’d be $20-25.

It’s not hard to figure out why an omelette breakfast costs more at Lou Mitchell’s than at the Pancake Cafe. Rents (or property values) on Jackson Boulevard in downtown Chicago dwarf rents or property values on Gammon Road in Madison, for starters. Your servers and bussers need more money either to pay the rent or to cover longer commutes to Lou’s than to the Pancake Cafe, if not both. Eggs, cheese, and other ingredients may cost almost the same, but the costs associated with so much commerce wanting to be in the same five square miles, and so many hundreds of thousands of people, if not millions, wanting to be within reasonable commuting distance of those companies and shops, simply drive up the cost of living in the larger urban areas. It might not be as pronounced in cities like Dallas or Kansas City, where the city and suburbs can expand in all directions, but Chicago is bounded on one side by Lake Michigan, as Boston and Los Angeles are bounded by oceans.

Some of those local effects balance out; some don’t. If a secretary in Chicago needs more income to live, her manager understands and pays accordingly, if grudgingly, due to market forces. If a secretary in Pontiac, Illinois, makes a lot less than her cousin a hundred miles up I-55, she also pays a lot less for her home and land, not to mention her groceries and her local fitness club membership.

On the other hand, some costs are fixed. Netflix doesn’t vary its charges by the local median cost of living; a $20 per month charge covers both Chicago and Pontiac, but it’s two hours’ wages in Pontiac and only an hour and change in Chicago.

This probably isn’t the sole reason Pontiac is deeply conservative and Chicago is deeply progressive, but it has to be part of it. There’s also a matter of scale. The annual budget for Chicago or for Cook County has to dwarf the annual budgets for Pontiac or for Livingston County. So, when the press starts talking about a $100 million project, people in Chicago don’t flinch as much as people in Pontiac are likely to.

States often, perhaps even usually, have flat income tax rates. If you’re making more in Chicago, you’re paying more in taxes, but not exponentially more. Are you getting more in return? You get Soldier Field and Grant Park, but you also have zillions of “neighbors” to share those with. You have more miles of Interstate and perhaps wider Interstates, but you also have more cars to fight your way through on your commute.

I wonder if what cities have that rural areas don’t is opportunity. If a large company leaves Chicago, Los Angeles, Boston, or Miami, there are hundreds of other companies in town that need bookkeepers, factory workers, and almost any other profession. If one of the major employers in Pontiac shuts down, who picks up the slack? When the state prison there came close to shutting down, maybe the guards would have been offered positions at other prisons, but none were within commuting distance. If a seed company moves away, where do its employees go? How likely is some start-up to arrive in town, giving folks a chance at joining a venture that may make them rich? Maybe a new auto factory will come to town, adding 1,000 jobs, like the Mitsubishi plant in Normal or the Subaru plant in Lafayette, Indiana, but how often does that happen, and how often has your town tried to attract one of those plants? Your town didn’t get that Mitsubishi plant or that Subaru plant, did it? You can hope for the next one, but don’t count on it, or your hopes will be crushed.

To the extent that America truly is a Christian nation, we mostly follow the same Bible. Maybe we aren’t as religious as our grandparents were, or maybe we’ve never been as religious as we’ve been told we are. But, somehow, those lessons still get translated differently in the cities than in the villages. Bibles in both areas teach that Jesus cared for the poor. Do we take that responsibility upon ourselves, or do we share it with our government? Do we tell each other how to be kind and loving to our neighbors, and even to tell each other who our neighbors are, or do we try to use the government’s voice to remind each other that our neighbors include foreigners, people with different sexual mores, people with different religions, and even people who might not like us?

In a small town, maybe you know the town manager and the city council members. You know who parties too much on the weekend or whose marriage is in trouble. Maybe it’s easy to see government as flawed when you see how flawed the people making up the local government are.

Maybe in the cities it’s easier to see government as an arena for games of us vs. them. After the Irish climbed their way up the system, to be followed by the Italians, of course the African-Americans saw an opportunity to make government work for them. They, too, could work their way onto city council and into various government departments, such as the parks department or the fire department. You might hear the mayor and the aldermen are corrupt, but when you get your alderman and eventually your mayor, well, either the corruption stops or it benefits you!

I’m sure this is all too simplistic. I’m sure there are think tanks, both progressive and conservative, devoted to understanding these dynamics, if only to use them to their own advantage. Maybe none of this is really fueled by our local population density. Maybe this is fueled by how easily corporate forces and other shadowy forces can influence our moods. The Koch Brothers, for example, aren’t small-town farmers any longer, if they ever were. Planned Parenthood doesn’t hate small towns; it just doesn’t have as many clinics in those areas.

But, for now, those wide swaths of red and those tiny, dense islands of blue are the state of political America. There have to be reasons for this, and we collectively need ways to blend them, not sharpen the divisions among them. There’s no way we can literally divide into two political structures, one for cities and one for rural areas. We are one country, and that won’t change.

Men and Cameras

One of the themes I heard intermittently growing up and in my life as a new college graduate was of men hiding behind cameras instead of being involved in whatever was going on. I don’t remember how often I heard this theme or from how many people I heard it, but it was more than one person.

So, naturally, I’ve overcompensated.

Most of my best pictures, the ones I use as screen-savers and desktop wallpaper on my computers, were taken when I was by myself. If I’m by myself, I don’t worry about someone else getting bored or impatient while I try to frame a picture or find the position from which there isn’t a branch in the way or someone walking through the scene. I can wait for the roller coaster to emerge from the artificial mountain or for the train to come around the corner. I can find the bird in the tree and try to get the best angle for the best lighting, even if I know half the time it flies away before I succeed.

This is not to say that I have lots of memories of people getting impatient with me while I’m taking pictures. In my best style, I don’t give them the chance. I’ve got some great pictures of various Disney resorts taken when my wife was sleeping in or when I was in a Disney World park while she rested back in the room. Similarly, pictures from other vacations or other sights were from when I was alone. I don’t have pictures from the National Zoo from when I was there with others; I have pictures from one weekend day when my wife was out of town and I went by myself. She was disappointed I had gone alone; she had no idea her stories about her ex fiddling with cameras all the time on family vacations had intimidated me out of taking pictures while out with her. This isn’t to say I don’t have pictures of her or her extended family from past vacations, but not nearly as many as I have from my times alone, and not as meticulously taken. Those are pictures of people in a place; the pictures I have of a place for the sake of the place tend to come from my own solitary wanderings here or there.

If someone I’m traveling with has their own camera and is taking their own pictures, this tendency is subdued. I’ll wait for them, and they’ll wait for me, or we’ll both take pictures from this overlook or of that scene.

But, yeah, in the back of my mind, I’m not going to be the guy with the camera for whom others wait and who was on the vacation but not actually engaged.

Performance or Performer?

When you think of Pink Floyd, what specifically do you think of? The Wall, some will say, or Dark Side of the Moon. Few will say Syd Barrett’s illness and death, or anything other than the music.

With Harper Lee or J. D. Salinger, you think of their single books and their reclusive images, not some party they threw or some talk show interview they gave.

With others, it’s not so clear cut.

Do you remember Prince for his musical genius or for issues peripheral to his music, such as abandoning his name for several years, or his aggressively sexual imagery in his work, or his all-female backup band when he couldn’t play all the instruments himself?

I’m not familiar with Lady Gaga’s music; I haven’t heard most contemporary music. I am, however, familiar with the phrase “meat dress,” as well as a probably-manufactured controversy about whether she was a hermaphrodite or androgynous or what.

I don’t watch much football any longer, so maybe there’s a beauty to Cam Newton’s play that I haven’t seen, but I can’t help but see his pitches for a Greek yogurt brand.

I’ve seen some basketball, so I know LeBron James can play, but I also remember a less-than-humble announcement that he was leaving Cleveland for Miami, and I still wince whenever someone refers to him as “LBJ.” Maybe no one whose lifespan overlapped that of Lyndon Baines Johnson refers to James by those initials, but letting that use go unchallenged, as James seems to, reeks of hubris.

Actions should speak louder than words. Skills and art should negate any need for self-promotion. We should be known for any skills we have that are outstanding, not merely for being willing to be outlandish until we earn our fame.

Maybe Gaga, unlike Prince, has mellowed as her art has become recognized and has let her music speak for her. Maybe James deserves credit for learning a lesson and being more reserved as he left Miami to return to Cleveland. Maybe in ten years, Newton’s leadership of his team will dwarf any endorsements he has done. Maybe, to some, it already does.

I’m painfully aware that segments of our society watch the spectacle more than the art; that’s why someone like Stefani Germanotta has to take a name like “Lady Gaga” even to get her music heard in some quarters. Wishing it weren’t true doesn’t make it so; I have no idea how many flawless performers are outshone by auto-tuned publicity seekers who will do anything to attract our attention and then convince us that we’ve found something great, regardless of merit.

In the meantime, every time someone stirs up that much fecal material in an attempt to get my attention, they’d damn well better earn my respect or quickly fade away. Shit-stirrers who feel offended that I’m not impressed and not mesmerized will have no call to yell “discrimination” or otherwise take offense. It’s true of politics; it’s true of culture wars; it’s true of everything that clamors for my attention.