Things That Go Bump in the Mind
Look for a new post every Sunday morning.

Sunday, May 26, 2013

How to be Vegan and Ambivalent about It


It's not what goes into your mouth that defiles you; you are defiled by the words that come out of your mouth. -- Matthew 15:11
It's not easy being green. -- Kermit the Frog

     This morning for breakfast I had toast and some stuff that looked pretty much like ham salad – but it didn't have any ham in it. Instead the “ham” was made up of the pulp from carrots, beets, apples, and sauerkraut. It didn't taste bad; in fact, after pretty much sticking to the vegan diet for the past year, I thought the concoction tasted pretty good. Notice the contingency of the phrase “after pretty much sticking to the vegan diet for the past year,” because I'm confident there was a time not too long ago when I would have turned my nose up at the fake ham salad. I must admit the dab of peanut butter I put with it certainly added something to the flavor. The amazing thing, however, is that it did taste pretty good, even without the peanut butter, and that's because if you stick with eating a certain way for an extended period of time, your preferences begin to adapt to whatever you are eating.
     When my wife, Ruth, started this diet (I suppose I should use the word “lifestyle” because a “diet” is something you choose for the short term, but a “lifestyle” is something you're supposedly in for the long haul), I was not going to do it. At the time she told me she was changing to a “plant-based” diet, I was on a “gas station-based” diet – practically everything I ate was brown, round, and rolling on the grill at the Speedway. Gosh, I loved that food, and I still get my cravings for it, but I'm doing better about living day-to-day without it. It took an extra six months and a movie called Forks Over Knives to convince me to try to change over to a vegan diet, and just basically eat what Ruth did. I still cheat on the “lifestyle.” I don't have the willpower to overcome 50 years of programming in just a year or so and entirely forgo meat and cheese, but I often surprise myself with how well I do it. Ruth is a devotee, however, and can navigate a buffet like a zealot. I'm not that strong; if we're at a party or some other social gathering, somehow, a tiny particle of beef or cheese finds its way onto my plate to hide beneath the broccoli.
     So, the Big Question is, of course, “Why veganism?” Why bother? I often tell people I do it because Ruth does it (and there's a certain amount of truth to that since I wouldn't have started if she hadn't gone first). I like the answer “I do it because my wife wants me to” because it lets me off the hook from explaining why a plant-based diet is a good idea and really worth the effort. You see, here's the rub: as a rhetorician, I understand when people ask questions because they are really more interested in making an argument than in getting an answer. So, not always, of course, but often enough, I get people who ask me “Why do you bother with this diet when it's so much easier to eat like everybody else?” when they don't really want to know why I do it but really just want to justify why they eat like everybody else. Honestly, I don't care why other people eat what they do. So, it's in those situations, when I'd rather not bother with the argument at all, that I just put it back on Ruth. So, I circumvent the argument; I say, “I do the vegan thing because meat and cheese make my ears bleed.” And then they go, “What?” and then I say, “You know, blah, blah, blah, blah, blah” and they give up wanting to argue with me because they figure I'm just another henpecked husband who does whatever his wife wants him to. (Again, there may be a certain amount of truth to that, but I'm not nearly as henpecked as I am unwilling to argue with people who have already made up their minds about something.)
     Ruth, by the way, goes way beyond the vegan thing and actually tries to follow a sparse “plant-based” diet that also puts strict limits on sugar, salt, oil, and other evils of processed food. I'm not so much committed to that lifestyle because living without meat and dairy is hard enough, and it has taken me more than a year to wrap my brain around how to do the vegan thing while working in town when I've forgotten to pack a lunch. Ruth would be happy to have you know that while a vegan diet is a healthier diet, it still doesn't mean it's a healthy diet if you are living off of french fries, doughnuts, and soda (which technically can all be vegan because you can get all of those things without meat, dairy, or eggs). Ruth's diet is about heart health, and eating that way really is a good way to help avoid heart attacks, strokes, diabetes, and even cancer.
     It's not that I have a death-wish, but the reason I'm not as good at avoiding “the other bad food” is that when I get too hungry my belly can get louder than my brain. If I'm hungry, my brain can yell its fool head off about cancer and diabetes, but it can't get louder than my belly singing “The Supper Song of Freedom.” (Its lyrics go a little like this: Eat what you want, Eat what you want, you may die tomorrow of a heart attack, but you're suffering right now from hankering for something greasy. Don't fear the cancer; fear the hanker.)
     Some people choose to do the vegan thing for ethical reasons. They see the needless butchering of animals as an evil humans can live without. I have no argument with these people any more than I have arguments with the people who believe humans have a right to shoot anything that moves. I am not interested in arguing either way. It's not that I don't have an opinion on the morality of killing animals for particular reasons; it's that I don't think my opinion is necessarily better than anyone else's on this, so I'm not interested in participating in an argument in which I can see validity on both sides of the aisle. If there's one seriously good idea I picked up from studying rhetoric, it's that it's really okay to stay uncommitted and agnostic when you're not persuaded by the evidence “for” or “against” something.
     So why do I follow the vegan diet? Sometimes it's because I love my wife enough to want to support her in something that she believes in so deeply (Heaven knows there are enough ideas she believes in that I can't find myself supporting); sometimes it's because there's nothing else in the house to eat; and sometimes it's because I just want to see if I can please both my brain and my belly with stuff that isn't going to kill me somewhere down the road. Perhaps when I have expired, I can come back as a vegan zombie, and while other zombies are craving brains, I can be moaning for “Grrrrains!”

      Keep thinking rhetorically, and I'll be back next week.

Sunday, May 19, 2013

Goodbye, Southern High.


    

     On Thursday, I retire from teaching public high school and tonight I will watch my last group of seniors graduate. Sometime next week, they will start the process of knocking the building down. It's not that the district is taking my retirement that seriously; they were planning on knocking the building down anyway because a new high school (right next door) will be finished this summer.
     Somehow it feels completely appropriate that they are knocking the building down the same year I'm heading off to The Great Pasture of Retirement. Both the building and I are 52, and I suppose if I were to hang around much longer, they'd be taking a bulldozer to me as well. To be blunt, my values are no longer welcome in teaching. Like the building I'm leaving behind, my wiring is out of date. I know that needs some explaining, so here goes:
     My dad was a school teacher before me (and he taught in the very same high school where I have taught for the past 30 years). Back when I was in college, I was majoring in Journalism, but I had my eye on a teaching career. My dad didn't want me to go into teaching. He was afraid I'd become disgruntled over the small salary he'd had to live off of to raise his family; he didn't want me to make the same mistake. When I broke it to him that I was getting my teaching certificate, he said, “Well, all right, but you know how much it pays. Go ahead and become a teacher, but I never want to hear you complain about how much money you're making.” That's been the deal for the last 30 years; I've never complained to him about my salary (even though for better than the past 15 years, my beloved school district was literally the lowest-paying district in the state of Ohio).
     So, I didn't become a teacher for the money (nor, for that matter, have I ever met a teacher who did). I became a teacher because I liked the respect and dignity that came with the job. I've said to several principals I've worked with over the years, “Look, I could get a better-paying job, but I need my respect. If I don't have my dignity, I might as well be a circus clown.” Now, as far as I can tell, everyone has something they are really good at; for me, it's been teaching high school English. You can ask anyone in my family – I'm a lousy plumber, a horrible mechanic, a terrible carpenter, and I couldn't dig a straight ditch to save my life – but put me in a room of snarly teenagers, and I can get them to care about Dickens' “Great Expectations,” and I can get them to feel pretty good about their ability to write. Call it a “gift” or a “calling,” but I've been blessed to work in my old high school with the people who would respond to the enthusiasm I'd bring to my lessons.
     Now, it's time to go. Call me cynical (perhaps I am), but the qualities that used to be valued in a good teacher are no longer relevant in the contemporary classroom. What used to make me a good teacher is that my students knew that I cared about them and that I did my best to make them feel welcome. Now what is being valued in teaching has nothing to do with treating students like human beings. What is now considered the most valuable skill in teaching is the ability to document what you plan on teaching, document what you teach while you're doing it, and document how you plan on reteaching the same material once you've documented that the students didn't master the material the first time you covered it. In other words, it's about faking a ton of bureaucratic paperwork so that, if the need ever arises, the district can prove that you presented the material. How you presented the material – whether through a thick, life-sucking packet of tree-killing handouts or through an engaging Socratic discussion – makes no difference whatsoever. This is to say, no one cares anymore why students didn't learn anything from your instruction; the administration only cares about the evidence in triplicate that proves you offered the instruction.
     Of course, the logic behind this thinking is madness itself. Clearly, if you test a roomful of students and 90% of the students pass the test, the instruction had to be there – where else would the 90% have learned it from? But it's no longer about common sense; teaching now is about cranking out the paperwork. Simply put, humanity is no longer relevant. I can't stay around and teach when my value as a teacher is based on my ability to document what I'm teaching and not on my ability to get my students to care about their own development as citizens and fellow human beings.
     This past January, I had a student whose stepfather shot himself in front of the family. When the girl told me what had happened, I hugged her and said, “I'm sorry to hear that you have to go through this. Don't worry about your English grade; you've got bigger things to care about right now. I've got your back; you will pass English this spring.” Was making such a promise ethical? Given the modern obsession with testing and scoring, absolutely not. Was the promise profoundly moral? Given my life's interest in preserving dignity and concern, it absolutely was. The girl in this circumstance is probably the most dramatic example of the need to protect and prize our students' humanity, but I could take you desk by desk and tell a similar story about each of my students: this one has to work till midnight in a fast food restaurant to help her parents pay their rent, this one has been pregnant since February and has no idea if she can handle college and a baby, this one can't concentrate because her boyfriend has been hurting her a lot more lately and she doesn't know how to break up with him without getting beat up for trying. Desk after desk, story after story, I know these people. My students are not merely data points on some chart being constructed in Columbus based upon their OGT scores. Nonetheless, if I were to hang around next year, 50% of my next evaluation would come directly from their standardized test scores.
     When I was writing my dissertation on the history and theory of rhetorical authority, I devised a formula for distinguishing “good” authority from “bad” authority. It's not really that complicated: “Good” authority is concerned with the dignity of the people it works with; “bad” authority is not. I called the good form of authority “pro-agentic” because it takes the agency of other people as its highest responsibility; I called the bad form of authority “pythonic” because, like a large and powerful snake, bad authority likes to constrict other people and squeeze them into seeing the world according to its own narrow point of view. The best example I can come up with for the “pythonic ethos” – that is to say, the form of authority that denies the other's humanity to achieve its own political agenda – is the current educational environment that is only willing to look at the data generated by test scores and the documents that “prove” instruction occurred to determine the worth of a classroom teacher. I think Tina Turner would say, "What's love got to do with it?"
      God help us all; kick me out and knock that building down. There's no longer room for teachers like me – we're as obsolete as blacksmiths in a Ford factory. Once I'm gone, who is going to teach students that how you treat people is more important than how you can manipulate them into doing what you want? That's a trick question, of course, because it's not on the test.
     Keep thinking rhetorically, and I'll be back next week (but I won't be a school teacher, I'll just be another Old Fart who pines long and loud about the Good Ole Days).

Sunday, May 12, 2013

Why Words Are Stronger Than Welts


    
      A Zen koan is a short parable that gives us something to think about and snaps us out of our routine mindset. One of my favorites goes like this: A neophyte monk goes to his master in the Buddhist temple where he has come to live and says, “I've been here a few months, and I've been meditating for 12 hours a day, and I don't think it's working for me. As much as I try to find the Eternal Quiet of Being, I feel – underneath it all – as though I am a bottle that is filled with gunpowder and could explode at any moment. Can you explain why I feel this way?” The master nods serenely and says, “You feel this way because everyone feels this way.”
      I don't know if everyone feels as though they were going to explode, but I certainly can relate to the idea that we all struggle and yet, we forget that everyone else has their own struggles as well. Today is Mother's Day, and my mother passed away nine years ago. Some days I miss her so much I find myself crying while sitting all alone in my truck as I'm driving to work. Other times, I go months without thinking of her at all. Like the Zen koan above, I feel my own relationship with my mother is uniquely complicated, but I suppose the truth really is that everybody's relationship with their mother is uniquely complicated.
     Although it is difficult sometimes to explain to people the benefits of a rhetorical education, one boon is the ability to use words to make subtle (but important) distinctions. Throughout my life, I don't think I ever had a moment when I didn't love my mother, but I had many stages in my life when I didn't like her very much. As a child, I had expectations that my mother was never able to meet, and it wasn't until I was well into my adulthood that I was able to understand enough of my mother's own history to comprehend that it was her own struggles with life that prevented her from being the mother I felt I deserved. It's difficult even now, with her being gone all this time, to explain how the pains of my childhood have molded the man I am today, and, furthermore, regardless of how I now wish things had been different in the past: I am who I am, she was who she was, and underneath it all is not a bottle that could explode at any moment, but the enormity of grace that comes from learning how to forgive.
     I grew up in an era in which “child abuse” existed as a matter of everyday existence, but did not exist as a recognizable classification of behavior; that is to say, during my childhood, nobody called it “child abuse,” people just referred to it as “parenting.” People who worked with my mother later in life used to tell me how kind and loving she was to them, and whenever I heard them say such things, I inevitably had a brief bout of the vertigo that comes from cognitive dissonance. Whenever people told me how kind and loving my mother was, they were completely unaware they were talking about the woman who used to beat me as a child, frequently and violently, with a wide variety of sticks and boards. My brothers and I were beaten so often by our mother that we became “connoisseurs” of beatings, and even now we can reminisce over the finer ones. “Remember when I was beaten for getting muddy at that construction site? Ah, that my friend, was a very good beating.”
     It wasn't until decades into my adulthood that I was able to wrap my mind around the idea that my mother had been beaten during her own childhood and grew up believing that not beating your children is a form of neglect, and that somehow, beating children is a way of demonstrating that you care about them. Although I was frequently beaten as a child, I resolved growing up that I was never going to beat my own children. Although beatings were a regular feature of my childhood, somehow the concept that they were an essential (even “normal”) part of life never made it into my belief system the way it had been entrenched in my mother's.
     I would be lying if I said I didn't still feel the psychic wounds of my childhood thrashings. However, I think I can honestly say that I have learned to forgive them. I live with the hope that whatever psychic wounds I may have inflicted on my own children are forgiven as well. Time will tell; like the rookie monk, we don't understand what everyone else is going through.
     For all the things I held my mother accountable for as a child (in addition to resenting her beatings, I was disgruntled over her indifference to the way I was bullied by neighborhood children), I can nonetheless feel an astounding depth of gratitude for the things my mother did right. Pretty much at the top of that list is this: my mother took me to the library. When I look back at my childhood, I can remember the public library as well as my mother's kitchen. These trips to the library were magical. The idea that we could go to a place where we could surround ourselves with books and that we could take several of them home with us astonishes me even now as I relive those feelings of being allowed to choose to read anything I wanted. I escaped into books, and somewhere during those escapes, I picked up the odd idea that words were more important than welts. And that is why I revere words and the potential they hold to achieve what violence never will. And that is perhaps the best definition I can give Rhetoric: the belief that people who achieve their ends by violence and coercion are inevitably flawed and corrupted by their faith in violence.
     On this Mother's Day, I am grateful that I learned, perhaps in the worst way possible, that words are stronger than blows, love is stronger than fear, and forgiveness is the greatest strength of all. This morning I am missing my mother enough to cry again, by myself as I type this. Underneath it all is a bottle waiting to explode, and beneath that, love and forgiveness.
     Keep thinking rhetorically and I'll be back next week.

Sunday, April 28, 2013

So Much Depends Upon a Red Wheelbarrow


     
     It's spring. Oh, I know spring has been here for about a month as far as the calendar is concerned, but real spring – actual spring – doesn't look at the calendar; it looks at the buds blooming on the fruit trees.
     The Grass has sent me official notice that the upcoming mowing season is going to be fiercely competitive. I surprised my green nemesis by opening the mowing season a week earlier than it expected, and I may have caught it off-guard in the opening match of the season, but The Grass is already determined to dominate the standings by mid-July. My neighbor, Bill, had some extra blades that happened to fit the Craftsman 21HP, and so my mount is feeling especially eager to take on the competition. I still have to put on a new filter and change the oil in order to get the Craftsman fully psyched up for the summer ahead, but the filter is already in the cab of the truck waiting for the next trip into town to meet a doppelganger at the parts store, and I've already drained the old oil out so there's no turning back on the process of transfusing new blood into the Briggs and Stratton heart of the 21HP.
     The garden has accepted its cold weather class of 2013 with loving welcome. The cabbage, cauliflower, and Brussels sprouts have been in the ground for better than a week, and yesterday, I threw caution to the wind and put peppers and tomato plants in the ground. A frost can kill those pepper and tomato plants, but the forecast for the week ahead has lows in the mid-40s, so I'm tossing the dice to see if I can get those summer bounties a few weeks earlier this year.
     The garden never looks as good as it does when it's first planted. Someday soon, a few weeds will poke up through the ground and try to bring chaos to my nice orderly rows, and by late August, the Law of Entropy will prevail over my attempts to keep the garden pretty and neat. There are humans among us who have that gift for keeping a garden as lovely at its last harvest in the fall as at its first planting in the spring, but I will never find myself among their demographics. I am too paranoid of chemical companies to use products to keep the weeds under control, and by late summer, when it's really hot outside, I become too much aware of the relatively cheap price of a can of tomatoes at Kroger to care about fighting off the pagan weed invaders that storm the territory of my civilized plants. In the first few months of each spring, I will ruthlessly hunt down and hoe out the vanguard of the heathen weed invaders, but by the time the thrust of the horde arrives, I'll be safely retreating to the air conditioning in my basement. I have a rototiller, but it hates to start almost as much as I hate to use it, so there are only so many times I'm willing to drag that beast out of the barn.
     What a relief it is to have warmer weather. We spend so much time in the winter trying to just endure the indignities of cold weather that by the time in late April or early May when we can finally leave the house without a jacket, the whole of nature is mildly intoxicating. Spring gets into the bloodstream and travels up the spinal column where it hypnotizes the brain into thinking that building a patio is a good idea. “Look at it this way, Brain,” argues the rhetoric of Spring, “All you have to do is the planning. Back and Shoulders will do all the heavy lifting; you won't have to lift anything; all you have to do is ride around inside the skull and think about how nice it's going to be when it's all finished.” About this time, Back and Shoulders start to put in requests for vacation time, but Brain is too twitterpated by Spring to take their demands very seriously. “Yes,” whispers Spring seductively, “Look at the Lowe's ad. Patio stones are on sale this weekend. You know how much you like saving money, right?”
     It's spring, and so much depends upon a red wheelbarrow glazed with rainwater. When I was in high school, I thought this William Carlos Williams poem represented everything I hated about poetry. I couldn't understand it because it didn't seem to have a point. To me as a high school freshman, this poem represented everything that was wrong with my high school education. The state could force me to go to school and listen to such drivel, but it could not compel me to like it, and I refused to like it. When I was in the 9th grade, I had no use for poetry. Poetry was too genteel to be respected by my testosterone-fueled adolescence, and I didn't want to have anything to do with it.
     Now, a lifetime later, this poem represents everything that was good about my high school education. This poem, at least as I interpret it now, is a way of saying, “So much depends upon our ability to appreciate the simple pleasures of life. Without our ability to recognize beauty in the ordinary, we are lost to our own humanity.” If there is anything that is lacking in the new, draconian, standards-or-die formulas for education, it's this message that our humanity is far more important than any score on a nationally normed evaluation. What our students need, in my humble opinion, is far more time to consider why beauty is important and far less time trying to prove that they have mastered some skill that makes them employable to corporate hacks who believe only their own money is beautiful.
     Wow, almost slipped into a rant there. Could go on about it, but hey, it's spring, and at least for now, the garden is free of weeds. Brain doesn't want to go on writing; Brain wants to plan a patio.  I will be teaching this poem this week, and I expect my sophomores will hate it until I explain why so much depends upon a red wheelbarrow.
     Keep thinking rhetorically, and I'll be back next week.

Sunday, April 21, 2013

The Dance of Meaning: Violence and Rhetoric


“It has become appallingly obvious that our technology has exceeded our humanity.” – Albert Einstein
“There are lots of causes I'm willing to die for – but not one cause I'm willing to kill for.” – Mahatma Gandhi
"I wish the entire human race had one neck, and I had my hands around it!" – Charles Panzram


     Not all messages are made up of words. Perhaps the best messages are wordless – a mother's kiss on her baby's forehead, a handshake that seals a deal, a hug at a funeral. Words have more precision, but actions have more impact. In the long run, we don't remember what people say; we remember how they made us feel; we remember their silent presence long after we've forgotten any particular thing they've said. We need words to think about the lessons that come to us across the course of our lives, but the wisdom we garner through life comes from collecting experiences – not thinking about them.
     As an English teacher, words have been my stock-in-trade. Whatever reputation I have built as a writing instructor has come through the trust I have put into the way words can shape thinking and the skill I have developed in showing others the importance of using the right words for the right occasions. As a rhetorical theorist, I have looked at words the way mechanical engineers look at materials and consider the forces necessary to bend them to produce specific results. As a musician, I have played with words and cared as much for the sounds they produce as the meanings they convey. As a reader, I have often admired the words of fellow human beings who have been able to stir passions within me for places I've never been and strangers I will never know. While a picture may be worth a thousand words, words are able to paint pictures that see through the dull materiality of this world and reveal glimpses that can only be seen by the heart. I value words; I don't always trust them, but they have nourished and sustained me well beyond the mere mortal limitations of my physical body.
     If life is a performance, then actions and words dance before us. Sometimes actions delicately lift words into the air and suspend them overhead to be admired and acknowledged before being returned gracefully to the ground. Sometimes words jaunt behind the back of actions and come leaping forward in syncopated gyrations. Sometimes actions and words compete for our attention while moving frantically before our eyes; other times, each will support the other by waiting with an outstretched hand while the other commands the spotlight. We derive meaning by recognizing their engagement with each other; confusion comes when words are out of step with actions or actions are no longer in sync with what's being said.
     Humans today are absurdly verbal and frequently hyperconscious of their verbosity. See, just reading that last sentence made you think about it. At some time in eons past, however, there must have been a time when language was defined more through behavior than through orality. People moved, and others understood one another through gesture and facial expression; critical evaluation of who we are to one another based upon what we say to one another came much, much later. As the human ability to express ideas grew through language, so did the human capacity to understand ideas. While on some tacit level, we may have always understood that some behavior was “right” or “wrong,” it took words to articulate the conditions upon which we could come to some agreement that any particular behavior was “right” or “wrong.”
     As far as I know, I was not around eons ago when people began to reify the notions of “right” behavior and “wrong” conduct. On the other hand, I was around during my childhood, and I have fuzzy memories of learning the consequences of misbehavior and the rewards of being virtuous. Some lessons came from my parents, who were not averse to beating me with a stick to convince me of the error of my ways. Some lessons came from my brothers, who were willing to throw punches to inform me of my place. I also remember one kid from a neighborhood I grew up in who had a couple of cronies hold my arms while he punched me in the gut to let me know that he was dangerous. Each of these lessons I may have been able to comprehend if I had merely been told, but the memories of the personal violence I hold in my body go way past the linguistic neurons of my brain and are buried deep within the muscles that actually bore the bruises. I wasn't there the first time in history when someone desired something that someone else had and then used violence to take it away from the other, but I was there in my childhood the first time someone decided I needed to learn something through a painful thump.
     And this is it, then: why I care so much about protecting and advancing the power of language. The ultimate lesson of my childhood was simply that the ability to hurt another person does not evoke respect for any idea; it merely induces pain and the fear of pain. Ideas that are accompanied by either violence or the threat of violence are morally corrupt. If you wrench my arm behind my back, I will loudly proclaim your superiority, but I will not believe it. If I survived my childhood with any belief intact, it is that violence is incompatible with morality. You cannot convince anyone of the “rightness” of your position and threaten to hurt them at the same time.
     This week, hundreds of miles from my backdoor, a couple of people tried to send a message by killing and injuring strangers at a sporting event. At the time of this writing, we have not heard their “explanation” of their message. The only message we heard was that they were horrible, horrible people for being willing to kill random strangers. Someday, sooner rather than later I imagine, journalists will squawk their message and try to contextualize what these people “wanted to say” with what they actually said by blowing up bystanders. It doesn't matter what their “other” message is. Violence is not rhetorical. Violence is the antithesis of rhetoric. You can disagree with me if you want to, and I promise I'll not hit you with a stick, punch you in the gut, or send shrapnel into your flesh. Because of this, I don't need to argue my moral superiority. Nonviolence is simply morally superior to violence. Always and forever.
     Keep thinking rhetorically and I'll be back next week.

Sunday, April 14, 2013

When Cogs Go to College


“Humans are only fully human when they play.” – Friedrich Schiller

     I attended a workshop this week at a local college that put high school writing teachers into conversation with their college-level counterparts. It was interesting to hear the college instructors remark that their highest concern for entering college students is their students' struggle with the inability “to think critically” about their own writing or about the writing of others. The high school teachers responded by pointing out that their job evaluations are dependent upon their students' scores on standardized tests, and “critical thinking” skills are not on the test. “Critical thinking,” by the way, is the ability to recognize alternative solutions and to observe how alternative perspectives change the validity of assertions. Students who are good at “critical thinking” often do poorly on standardized tests because they end up valuing too highly the alternatives to the “one right answer” that must be discovered when questions are put into the format of the four-response bubble sheet.
     Although the chronological line between an eighteen-year-old senior leaving high school and an eighteen-year-old freshman entering college may seem thin to students and their parents, the gap between what high school teachers are expected to teach “to” their students and what college instructors expect “of” their students is significant.
     I put the prepositions “to” and “of” in quotation marks in the previous sentence because I wanted to emphasize that if there is an important difference between high school and college, for the most part it comes down to the question of who is ultimately held responsible for the student's classroom success. In high school, the teacher is held responsible for whether the students learn what is expected of them, and in college, students are held responsible for their own learning. It may seem like a simple concept, but the ramifications of this shift – the burden of academic success moving from the shoulders of the instructor onto the shoulders of the student – are huge. Many students enter college expecting it to be a mere continuation of high school and are stunned to find out that the information that was spoon-fed to them in high school is now their own chore to collect and digest.
     The hot political buzzwords in education right now are “college readiness” and “accountability.” The argument coming from governors' offices and statehouses is that because college is so expensive, someone needs to be held accountable for the costs of remediation when students show up on college campuses unprepared to handle the rigors of academic work. Inevitably, the public school system is held to blame when students are not ready for what their college instructors expect of them. The students, themselves, are not held responsible because they are depicted as the victims of the education that was offered to them. Thus, the political machinery cranks out ever more policy that makes public K-12 education even more draconian and joyless because “clearly the students are not being made to work hard enough so let's just keep making the work harder.” The problem with this thinking, of course, is that the whip of policy keeps cracking at the horses pulling the carts while the passengers taking the free ride are not overly concerned about these changes because their responsibility pretty much ends with showing up to get on the wagon.
     As a teacher with both a Ph.D. and 30 years of classroom experience, let me share something with you – three basic ideas of education that are being ignored in the psychotic bureaucracy that is currently dictating how public schools must be run and how public school teachers must conduct their classes. I call them the Three Basic Truths of Teaching.
     Basic Truth #1: Nobody learns anything unless they see a value in it.
In an ideal world, the transition between high school and college would be like a move from the shallow end of a swimming pool into the deep end – requiring the same skills but offering more depth and greater possibilities to the swimmer. From an outside observer's perspective, the surface level looks to be the same, and from this point of view, some people might even argue there's no difference at all for the swimmer who stays on top of the water. Unfortunately, swimmers who are unprepared for the deep end of the pool find out fairly quickly there is a massive difference between being able to survive when you have your feet on solid ground and when you don't. Students who are not “college ready” have been in the water for years but have never learned to swim because they have not understood the necessity of being able to tread water without a solid footing beneath them. Frequently, students who do really well in high school end up failing out of college because they only learned as much as they needed to get by, and when that strategy just doesn't work for them at the college level, they sink under their own inability to take responsibility for their education. The solution is not to make the public school students stay in the water longer but to see that they take responsibility for learning to float while they are there.
If we really want to make students “college ready,” students are going to need to feel that what they are being taught is going to help them succeed in life and not that they are being taught esoteric nuggets of information merely because it is “very likely to show up on the test.”
     Basic Truth #2: A little fun goes a long way in motivating people to take on difficult challenges. Most people are willing to put in a good effort even when challenged with rigorous problems if they have the expectation that they can have a little fun in the process. If there is one thing that is really destroying education right now, it is the grim seriousness that now hovers over schooling like a scary dark cloud. Children are being robbed of their childhoods in the name of bureaucratic efficiency. Instead of letting kids have the time and space to enjoy life, every moment of school now is about The Grind. Teachers who engage in any activity that cannot be defended by charting it directly from The Holy Writ of National Standards are suspected of heresy. People who do not work in the modern K-12 classroom have no idea how much paperwork must now be generated to document how each and every standard is being covered. Teachers are unmotivated to make classes “interesting” because they are being asked to spend more time documenting what has been taught rather than spending their planning time thinking about how best to teach what comes next. Classrooms have become joyless instruction pods because the testing corporations who have bought the ears of the policy makers insist that teachers are not pushing their students hard enough. God forbid teachers and students actually enjoy any of the content – this is why the new Common Core Standards do as much as possible to replace literature with “informational texts” – because, you know, students might actually like storytelling. Furthermore, the values that students learn from literature (such as the importance of being kind to others or why being honest pays off in the long run) are not on the test.
     Basic Truth #3: Education that does not respect human dignity is not education; it is propaganda. All people, but I would say especially children, know when they are being treated like cogs in a great assembly line rather than as human beings. I was at a conference a few years back when, during the keynote speech, the CEO of a national organization claimed “More than two-thirds of fourth graders can no longer read at a fourth grade level.” Really? It seems to me that when two-thirds of a population cannot do something, then somebody is lying about what that population should be able to do – after all, what in the heck do we mean by “a fourth grade level” if the majority of fourth graders can't do it?
     People have feelings. People have wants. People need motivation. Numbers do not have feelings. Numbers do not have wants. Numbers do not need motivation. Teachers and students are people, and they deserve to be treated like people. What is really being lost in the great debate about what makes students “college ready” is that teaching “the standards” does absolutely no one any good if we don't recognize that we aren't really teaching “the standards”; we are teaching human beings. Right now, there is far too much pressure on teachers to “teach standards,” and not enough room is being left over for “teaching students to be students.” I've said it here before, but it needs to be repeated – as long as the testing corporations can reap more profit from student failure than from student success (by marketing the “remedial” material back to schools), the cycle of “test, fail, blame, and remediate” will only continue to get worse.
     Keep thinking rhetorically, and I'll be back next week.

Sunday, April 7, 2013

To Our New Robotic Overlords: Go Screw Yourselves




     Whether or not the title of this week's post is funny is debatable, but at least it's based upon a principle of humor that could lead a human reader to an amusing and humorous interpretation. You see, robots are machines, and since machines are manufactured by having components held together by screws, telling a robot to “go screw yourself” (especially a robot who has been put in charge of converting humans into slaves for the benefit of the system that built the robot) lends itself to the jocular ambiguity of a pun by referencing a traditional epithet that has long been applied to reprehensible leaders who would exploit their lackeys (and that epithet being, of course, that if the leader is determined “to screw” his minions, he should rather emphatically go satisfy his lascivious requirements through an exclusive and solitary onanism).
     Now here's an important question: Would a robot find the preceding paragraph funny? If you are human, you might respond to this question by saying the question itself is absurd. Because robots are incapable of experiencing human emotions and laughter is an emotional response triggered by a comic awareness unique to human sensibility, the question as to whether a machine can determine 'if something is funny or not' is meaningless. This is to say that humor is subjective; in order to decide if anything is funny, the circumstances require a human subject capable of personal and intuitive response. While people can argue over whether or not any specific joke is funny, we are likely to find near complete unanimity if we are arguing instead whether or not machines are capable of appreciating humor.
     Now a thought experiment: Suppose an eccentric billionaire came to you and offered you an insane amount of money to create a robot that would laugh at his jokes. This leads to an ethical dilemma, right? Do you create a robot that seems to laugh at the billionaire's jokes in order to become fabulously wealthy, or do you lose your chance at having all that wealth by honestly admitting to the billionaire that because machines cannot really laugh, taking his money for a laughing machine is inevitably an act of fraud?
     Some people might respond to this by saying, “If someone has more money than brains, then you should go ahead and take the money and build a robot that plays a recording of laughter when the billionaire uses a particular tone of voice that indicates he is being sarcastic. It doesn't matter that the robot has no more sense of what the billionaire is saying than any typical department store mannequin. If someone is willing to pay billions of dollars for a laughing robot, it doesn't matter whether the robot can actually laugh at what is being said; the billionaire only needs to believe it is laughing at his jokes. If a robot played a laugh track at intervals that gave the impression it was laughing at the appropriate times, it would be up to the billionaire to decide if he was being ripped off.”
     Other people would respond by saying, “Taking money for one thing and delivering a product that does something else is fraud. It doesn't matter how much money is involved. Because robots cannot interpret humor, a uniquely human ability, taking money for 'an authentic laughing robot' would be dishonest no matter how satisfied the billionaire would be with the results of a machine that would produce laughter at convincing intervals.”
     Suppose in your conversation with the billionaire, you ask him, “Why do you want a laughing robot? Why not just hire a panel of clowns to laugh at your jokes? Surely you can afford to hire people to laugh at what you say.” And the billionaire responds, “I don't want people to laugh at what I'm saying merely because I'm paying them to laugh at what I'm saying. I have tried this method in the past, and when I pay people to laugh at what I'm saying, they laugh at everything whether it's really funny or not. What I need is an objective measure of humor. I know that not everything I say is funny, but some of it is. I want a device that can tell me when I've said something funny. If I had a robot, it would be completely objective in determining whether something is funny or not because robots cannot be bribed.”
     This, then, is the crux of the problem: The human desire for “an objective measure” of “something that cannot be measured objectively” cannot take precedence over the logic of measurement. That is to say, a fundamental principle of rationality maintains that “a motive to have specific information” cannot supersede “the reasoning that can provide the information.” To be blunt, I'll put it this way: Anyone who claims they can objectively measure anything that depends upon human subjectivity is either a fraud or a fool.
     In the example above, the billionaire's access to endless wealth is irrelevant to obtaining the information he wants. Because humor is by definition “subjective,” an “objective” measure of humor cannot be had at any price. Money can buy a lot of things; it can purchase agreement, but it cannot shop for authenticity for things that cannot be authenticated. Numbers are very good at describing things that can be measured – the distance between New York and L.A., the weight of an elephant, and the average salary of postal workers in the United States. What numbers cannot tell us is qualitative information that is simply not reducible to quantification. The distance from New York to Los Angeles is 2,778 miles; but who can say authentically that “It's too far to walk”? The subjective determination that “It's too far to walk” is based on a wide variety of human motives and conditions. It wouldn't be too far to walk if you had the right motivation to walk it. Even if 99.9% of a survey of the general American population declared that “It's too far to walk,” you might walk the distance if you had the right motivation. We might use statistical data to learn that the average salary of a postal worker is $48,380. Whether you believe they don't make enough money, make too much money, or make the right amount of money, your belief can be justified by a wide variety of arguments, but your opinion cannot be quantified into the “one right answer” because there is no one right answer. Opinions are subjective. Only numbers are objective, and numbers can't have opinions.
     To recap: Objective data can provide information that can be authenticated. Opinions can be informed by objective data, but because interpretation is a subjective human response, opinions can only be justified; they can never be authenticated. Anyone who claims that they can provide an objective answer to a question that requires a subjective response is being disingenuous. If vast amounts of money are involved, it is my informed opinion that fraud on a massive scale is inevitable.
     If you are a human, you may interpret this essay as an indictment of the corporations that are currently conspiring “to grade” student writing with “intelligent software.” I would argue that while the software may be intelligent, the people who believe in the results are not. Whether those who are either developing the software or buying the software are being entirely honest about their motives for saying they believe in the results depends entirely on how much money they are being given to authenticate the results.
     Was this essay “well-written”? I don't suppose any machine could offer an opinion about this one way or the other. Software can offer information that can make predictions on the quality of writing based upon the metrics of sentence length, vocabulary usage, and grammatical conventions; software cannot “read” writing for content and recognize subordination of ideas, rhetorical fallacy, or metaphorical language. I'll believe in software that can make accurate predictions of the greed of corporate hucksters, short-sighted politicians, and budget-conscious school administrators long before I'll ever be able to accept the existence of software that can judge “good writing.”
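     To make the point concrete, here is a minimal sketch of the kind of surface counting such software does. This is my own toy illustration in Python – the function and its metrics are invented for this post and drawn from no actual grading product:

```python
import re

def surface_metrics(essay: str) -> dict:
    """Compute the surface statistics an automated essay scorer
    might lean on. Nothing here touches meaning."""
    # Naive sentence split on terminal punctuation.
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    words = re.findall(r"[A-Za-z']+", essay.lower())
    return {
        "sentence_count": len(sentences),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "vocabulary_richness": len(set(words)) / max(len(words), 1),
    }

# The same words in scrambled order produce identical scores --
# the "reader" cannot tell the poem from the word salad.
print(surface_metrics("So much depends upon a red wheelbarrow."))
print(surface_metrics("Upon so red much wheelbarrow a depends."))
```

Both of those lines print the same numbers, which is rather the point: counting is objective, but judging “good writing” is not.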
     Keep thinking rhetorically, and I will return next week.

Sunday, March 24, 2013

How to Spot a Fool


“The fool on the hill sees the sun going down, and the eyes in his head see the world spinning 'round.” – Paul McCartney



     Last year, 43 people met their fates at the hands of government employees who were commissioned to snuff out the lives of America's most notorious criminals. People, of course, like to argue about the merits of capital punishment and whether or not the government should be in the business of offing its most loathsome citizenry, but I'll save all my arguments for it and against it until another day. Today, I want to talk about “Fools,” and I have a good story about someone who was once executed on death row that illustrates the rhetorical point I want to make.
     The number of people killed by the government, with full intent and in front of reporters who witness the event in order to write about it, is down by more than half since 1999, when we ended the millennium off with a bang by launching 99 criminal explorers into the dark void of The Great Unknown. One of the first lessons they teach in a basic news reporting class is that the list of values that make a story “newsworthy” typically includes such factors as prominence, proximity, timeliness, impact, and human interest. Even when we're knocking off the nation's ne'er-do-wells on almost a weekly basis, it makes for a pretty good news story; however, back in 1966, when the country went that entire year with only being able to check off one name from its list of people on Death Row, putting someone into an electric chair to toast the soul out of them made an exceptionally good news story.
     James French was only thirty years old when the State of Oklahoma strapped him into an electric chair and sent his wind sweeping down the plain, but by then, he was ready for it. French was one of those people your parents warn you about when you are learning to drive and you're tempted to pick up hitchhikers. Contrary to traditional parental wisdom, not everyone who is meandering around this country by sticking out a thumb and taking rides from strangers will kill you – I, for one, hitchhiked from Athens, Ohio to Yellowstone National Park and back when I was a vacuous college student, and I made the whole trip without killing anyone. French, however, was one of those horror-movie types of hitchhikers who pretty much ruined it for all the nonviolent ramblers who are simply out to score a free ride. I don't know if it was just a rookie mistake or what, but French had only made it from Texas to Oklahoma when he decided to take his benefactor hostage for a while and then kill the fellow for his car.
     After French was caught and he came to the inescapable conclusion that he was going to have to live out the rest of his days in prison, French decided he would rather have the management shorten the length of his stay than prolong it. Three years into his new career as a lifer, French murdered a cellmate in order to ensure that he could get his name added to the list of people waiting for a coveted one-way ticket to ride Old Sparky to the Outer Banks of Eternity. Back in the mid-'60s, it was easier to catch a ride in a rusty pickup truck on a dusty two-lane road in Texas than it was to secure a seat in an Oklahoman electric chair.
     Perhaps it was the heat of that hot day in August of 1966 when French took his final walk that inspired him or maybe he'd been thinking about it from the moment he found out he'd won his chance to be the only guy to be killed by the government that year, but French came up with the perfect rejoinder to the reporters who were anxious to have a good quote from the prisoner when they asked him, “Do you have any last words?” French said to them, “Hey, fellas! How about this for a headline for tomorrow’s paper? ‘French Fries’!”
     This type of sardonic remark is what separates genuine fools from mere posers. Contrary to whatever you have heard, fools are not stupid. Fools recognize where they are and who they are with, and nonetheless, they say whatever's on their mind without regard for its propriety or its ramifications for themselves or others. A genuine fool isn't courageous in the face of danger; a fool is unconcerned by it.
     There are lots of ways of dividing people into two groups: the rich and the poor, the young and the old, and those who prefer chocolate over vanilla – to name a few. Separating people into the groups “the intelligent and the obtuse” doesn't really get at identifying the qualities of fools because fools are neither intelligent nor obtuse. Fools exist in a space that is not defined by their degree of knowledge or intelligence, but is distinguished, rather, by their heedless behavior. There are people who support the systems that direct their lives (call them “followers,” perhaps); there are people who fight against the systems that direct their lives (call them “rebels”); and there are the people for whom the system doesn't really come into their decision making – not because they rage against it or want to challenge its prescriptions, but because they simply don't believe the rules that apply to everyone else actually apply to them, and these are the people who merit the title “fools.”
     A rebel can be an idealist who imagines a better world in which the systemic order that controls people's lives has changed, and a rebel can be willing to accept the consequences of challenging the system that controls them. A rebel understands (or at least works under the assumption that he or she understands) the motives of the authoritarian forces that are in control, and the rebel operates to subvert the powers that are indifferent to their notions of injustice. A fool, on the other hand, is no rebel. A fool has no desire to change the system because the fool is either unaware of the system or believes the system is unaware of them. A fool isn't necessarily stupid or willing to sacrifice his or her own dignity to appease the Powers That Be; a fool lives unconcerned with the opinions of the Powers That Be because the fool is too preoccupied with living in a world defined by the fool's own epistemological boundaries.
     In many card games, the Joker is a wildcard that can replace any other card and thus act like a magical card that randomly appears for a player in need of taking a hand. In the original games based upon the Tarot decks where “The Fool” (the card that later became the Joker in modern decks) appears, “The Fool” card does not replace another card (that is to say, it does not take on the identity of another card); it merely and temporarily excuses the player from following suit. Thus, the original wildcards were not like cards with superpowers that could suddenly out-trump any other card on the table; the purpose of the original Jokers was to create a space for the player where the rules were temporarily suspended and did not apply. Thus, fools are neither people who seek to win by following the order supplied by the rules nor people who seek to change the rules for the benefit of themselves and others; fools are people who simply exist outside of the game, and they allow us to recognize how we are playing a system that they, themselves, cannot recognize. That's enough for now to think about. As we approach the first day of April, let's just remember that people do not choose to be fools; fools are merely who they are.
     Keep thinking rhetorically and I'll be back in a couple of weeks. I'll be taking next weekend off to spend time with the family and to recover from an anticipated overconsumption of chocolate bunny ears.

Sunday, March 17, 2013

My Blarney Has a First Name: Lucky


“A mind needs books like a sword needs a whetstone.” – Tyrion Lannister
“They got little baby legs that stand so low, you got to pick em up just to say hello.” – Randy Newman

A few weeks ago, a student of mine suggested that my St. Patrick's Day post should be dedicated to leprechauns. So here it is:

     Eloquence is a sort of magic, isn't it? At the heart of magic is the idea that invisible powers controlled by mere words can somehow manifest changes in the physical world. Every day we use magical words to get what we want – sometimes we get what we want by speaking into small, shamanistic, expensive, electronic devices that hurl our words at the speed of light to people many miles away, and within half an hour, they bring us a pizza.
     What's that you say? Cellphones are not magical? I disagree. Cellphones are magical; cellphones employ the wizardry of contemporary technology to send our ideas, our desires, our pains, and our insights into a world of silence. Then, magically, what we send out comes back to us – reshaped, refined, and disguised as a response that emerges from a collective intelligence that somehow has heard our intangible words and was moved by them. You do not need to understand the physics of frequencies to speak with someone on the other side of the world; you do not need to speak the magical tongue of binary code to send a text. Do you need to understand the mechanics of eloquence to make its magic work on your behalf to get you what you want? No, of course not. Beauty may be in the eye of the beholder, but magic rolls off the tongue.
     Here, then, lies the paradox of leprechauns. What is the source of the leprechaun's magic and how do we explain its limitations? If the leprechaun's magic is powerful enough to produce pots of gold, then why isn't the magic powerful enough to defend the leprechaun from occasionally falling into the clutches of mere humans who can exploit his magic for their own personal gain? The rules seem simple enough: catch the leprechaun and hold him tightly in the grip of your fingers, and he will be obligated to grant you three wishes (or at the very least, a sugar-coated breakfast cereal packed with marshmallows shaped like stars and moons). Don't look away, the folklore warns, for if the leprechaun catches you looking away, even for a brief moment, he's sanctioned to blink away and leave you holding smoke. Three wishes – no wishing for extra wishes – three as in the Christian Godhead, the bones of your elbow, the states of time, and the parts of an atom – three, that's all you get, three. There are stories that can explain where the leprechaun's gold comes from (I'll tell you below) and there are psychological explications for his diminutive size (yeah, I'll give you that too), but how do we explain how such a powerfully magical being is incapable of living other than as a fugitive? Leprechauns are always on the run; where are they running to? Where are they running from?
     Some anthropological historians tie the stories of leprechauns to the Tuatha Dé Danann, a group from Irish mythology who were driven into hiding in underground dwellings to escape the bloody swords of Gallic invaders. Over the course of many centuries, as group after group of violent invaders arrived to plunder and kill the local population (no wonder there are so many people with a temper living in “ire”land), the Tuatha Dé Danann, or “peoples of the goddess Danu,” grew literally smaller in the imaginations of the people who remained to live on top of the ground as they envisioned the nearly forgotten ancestors who had vanished beneath it. As for the “pots of gold” – throughout the many dark centuries when the constant threat of invasion hovered overhead like dark and menacing storm clouds, people who were able to accumulate a little wealth often buried their money in pots to keep it safe from being plundered by violent outsiders. When the original owners of these pots either died (from war, disease, accident, or malnutrition) or simply forgot the location of their buried treasure, the “forgotten” riches became the windfall of the forgotten people who came to be known as the luchorpán, the Old Irish term that literally meant “small body.” After all, being an underground people gave them a valid claim to anything found beneath the soil.
     And here, I'll give you some speculation. Why are the “pots of gold” to be found at the end of rainbows? My guess is that even without the Biblical influence of Genesis – in which God offers “the rainbow” as a gesture of His promise that He would henceforth eschew genocide to resolve any future disappointments He may have with the human race – the rainbow is an archetype for peace because we see them so often after the violence of a storm has passed. Once the violent invaders have left, it's safe to go dig up your pots; once the storm has passed, the rainbow will show you where you buried your wealth. It's almost as though the gold itself were shedding its light up to the sky instead of the other way around.
     Leprechauns, of course, in our current popular imagination, are always wearing green outfits – which makes sense if you are small, secretly rich, and in need of as much camouflage as you can get to keep strangers from manhandling you. The Irish countryside is lush with green flora. Oddly enough, however, the elfin mascot of Irish kitsch that we inevitably picture clad in green is more a product of 20th-century marketing than medieval folklore. Up until the late 19th century, the most common depiction of the leprechaun in poetry and prose was in red, and even this was highly dependent upon the location in Ireland where the leprechaun was believed to be living; in some locations, he was just as apt to be clad in plain brown leather.
     Here in the US, the most famous leprechaun is Lucky, the mascot for Lucky Charms. Lucky seems not to give a hoot for gold, but he's got a meth addict's mania for oddly-shaped marshmallows. Lucky Charms was the brainchild of a guy named John Holahan who in 1962 first came up with the idea of throwing marshmallows into breakfast cereal. In the history of “Great Food Ideas,” Holahan stands shoulder to shoulder with John Montagu, the 4th Earl of Sandwich, who came up with the idea of sticking meat and cheese between bread so he could eat and play cards at the same time. Holahan's original idea was that the marshmallows would represent the tiny icons that could be added to charm bracelets (which were in vogue in the early 1960s), but that idea eventually was crushed under the pagan mysticism of Lucky's occult affiliation with the Tuatha Dé Danann. According to the secret lore of General Mills, the different shapes of the marshmallows invoked a wide variety of shamanistic abilities: Shooting Stars gave consumers the power to fly, Horseshoes conveyed the power to speed things up, and Blue Moons could invoke the power of invisibility. Click here if you think I'm making this up.
     Does God exist? Of course He does. How else can we explain that Lucky Charms is no longer available in Ireland? General Mills stopped selling Lucky Charms in Ireland (and Great Britain, for that matter) sometime in the mid-1990s. There are still some diehard fans in Ireland who pay roughly $12 a box online to have it shipped to them across the wide Atlantic Ocean. General Mills most likely stopped selling the cereal in the United Kingdom because Lucky the Leprechaun became too politically incorrect to defend, and it was only a matter of time before someone called the cereal company out for using a twee character to hawk their sugar. General Mills, of course, never felt obligated to explain why they decided to pull the plug on their overseas shipments. You say you don't believe in magic? Let me introduce you to the great wizard Amazon.com.
     Keep thinking, rhetorically or otherwise, and I'll be back next week. And may you know nothing but happiness from this day forward; Happy St. Patrick's Day.

Sunday, March 10, 2013

In Praise of Corporate Education -- Encomium to Ephemera



     How many times in the course of a week do we take the mouse in hand, move the cursor over an underlined word, and click on a link without so much as a second thought? Hypertext has given us the gift of intellectual liberation and the joy of perpetual distraction. Once upon a time in the dark and declining years of the late 20th Century, people who wanted to read for information were forced to limit their attention spans to one artifact at a time – on paper, no less. In those primitive days, prior to the Age of the Internet, human brains were shackled by the inability to focus upon more than a single idea at a time. Many young people today cannot even imagine the horror of being required to think critically about a single topic in depth. Now, however, like our children, we are blessed to live in a technical paradise where every individual awareness is able to buzz around cultural ideas with the freedom, curiosity, and intellectual acumen of bees in a field of infinite flowers. So much “cognitive” honey comes from our technically-advanced ability to gather the sweet random nectar of informational blurbs that we ought to live in perpetual gratitude that – like our friends the bees – humans can sustain themselves indefinitely on sugar and need never fear any disease that could theoretically arise from the ceaseless ingestion of sweets.
     I have heard the whining of doubters and naysayers, Luddites from an obsolete era, who bemoan the “sacrifice of substance for style” and who remain fettered to their antiquated belief that knowledge without understanding is vacuous. These frumpy curmudgeons like to hide behind their rich vocabularies, their extensive life-experience, and their astute perspectives as though expertise should matter more than the popular opinions of wealthy corporations, bribed legislators, or bemused consumers. Anyone who wants to argue that the sustained contemplation of significant subjects is more important than the immediate digestion of poorly-considered sentiments has no place either in modern education or on Facebook.
     Teachers today have it so much easier than their counterparts had it in that long ago era of ten or so years ago. Back then, instructors had no choice but to rely upon a competent understanding of their disciplines; the teachers of that bygone age did not enjoy the modern luxury of a checklist of factoids that students need only memorize without having to go to all the bother of learning the context that would make them meaningful. As a society, we have come so far so quickly that it's easy to forget that it was only a few years back when teachers were charged with inspiring enthusiasm for their subjects and stimulating intellectual curiosity among their students rather than galvanizing them with fear for the next round of high-stakes tests. Today's teachers are freed from the anxiety of authentic assessment (say, by getting to know each student as an individual through their written and classroom responses), and need only worry about constructing the mounds of evidence their administrators require to demonstrate they have methodically, robotically, and tirelessly covered their checklist of generica (otherwise known as their “state standards”). Contemporary teachers have been freed even from the burden of having to like their subject matter or their students; to demonstrate success as a teacher today, practitioners need only manufacture small mountains of paperwork proving that everything that must be taught has been taught. Teachers who have been able to adapt to these current mandates can be as sympathetic as headstones as long as they can provide evidence they have been force-feeding students nothing but the isolated and disconnected details off their state-mandated checklists.
     The ubiquity of the internet allows us instant access to an endless flow of delightfully insubstantial and ill-considered postulations. Let us all be grateful, then, that there is more to life than wisdom and significance. Because consumers are trained by wildly entertaining advertisements to ignore the duplicitousness of marketers, it is more than a wonderful coincidence that corporations have taken over the curriculum now provided to public school teachers. If, as in days gone by, teachers were allowed to motivate students to go beyond the platitudes of facile compliance and to encourage them to investigate the complexities of their subjects (rather than mouth the rote material that will allow them to demonstrate the competence of their test-taking abilities), students might find themselves in the uncomfortable and scary position of actually questioning the thinking behind what they are being told. Many well-intentioned corporations are paying legislators good money to ensure that state departments of education lean on local administrators to prevent any nonconformity among their teaching staff in allowing any original or unauthorized student work to be considered as evidence of “learning.” Anyone who believes teachers should be allowed to offer their own opinions on the competence of their students should be driven out of town on the horse and buggy they came in on. The only fair way to ensure that every student is being programmed to mindlessly accept the philanthropic generosity of our corporate overseers is by not allowing teachers to value any student output that will not be covered on their standardized tests.
     In order for the corporations to maximize their profits from the production of standardized tests, they need to be able to rely upon the steady stream of income that comes from selling remediation materials to the same students who end up flunking their tests. Without these profits, corporations would not be able to be so openhanded in their support of state legislatures. Because of the campaign contributions that many legislators receive from the testing corporations, it clearly would appear as a conflict of interest if they were to divert corporate profits away from their benefactors by allowing schools to determine for themselves who should or should not graduate from high school. By taking financing from corporations to assist them in their ability to govern, legislators have an ethical obligation to see that the children of their state do not develop the ability to question the credibility of the ceaseless deluge of useless, random, and questionable information that mollifies them on their smartphones and laptops.
     Our children deserve a happy life of mindless acceptance of corporate propaganda because in the perfect democracy of the internet, all opinions are welcome and the best opinions come with coupons for inexpensive pizza. Anyone who insists that real education is difficult and that students are better off studying the complexities of academic life should take a break from being such a know-it-all and go enjoy some pictures of cats with hilariously misspelled captions. Grumpy Cat agrees with me on this one. (Oh, by the way, I submitted my retirement application this week.)
     Keep thinking rhetorically, and I'll be back next week.

Sunday, March 3, 2013

They Might Be Giants, But They Definitely Are All Dudes


Spoiler Alert: This week's column discusses Bryan Singer's film Jack the Giant Slayer. If you plan on watching the movie and you expect to be surprised by anything that happens in the film, then you might be better off skipping this post. It's not my intention to ruin this movie for anyone, but I'm not going to try to leave out the typical details readers would expect a writer to keep out of a film review. I'm not interested in writing a traditional film review; I want to discuss the rhetoric of this movie. For what it's worth, I would give the film a solid three stars out of four. It's pretty much the movie you would expect to see – lots of eye-popping visual effects, plenty of gratuitous violence, and a plot strapped into the moral conventions of a medieval fairytale like a straitjacket.


     Walking out of the movie theater last night, my wife, Ruth, was indignant at the ending of Jack the Giant Slayer. I don't think I'm exaggerating by saying she felt betrayed by the movie's ending, and it took probably five minutes in the cinema's parking lot for her to cool down. The source of her umbrage? She expected a payoff to the feminist undertones developed earlier in the film. I didn't expect any feminist message by the end of the film, and I was not disappointed – but then again, I'm a guy. I didn't expect a movie based on the archetypical notion that “the princess needs saving” to have any expectation of consciousness-raising for its audience. If I was surprised by anything, it was by how much Ruth expected to see the princess depicted as anything other than subordinate to Jack, the title character, by the time the credits started to roll at the end.
     Okay, here's what happened: A central element of the plot was that a population of evil, ugly, and hygienically-challenged giants could be controlled by anybody who wore a magical crown that had been crafted centuries before by a legendary king. During the last 15 minutes of the film, there was the inevitable struggle for the magical crown that could stop the giants from laying siege to a castle where they intended to gorge themselves on the people trapped within. (For a movie that offered precious little possibility for product-placement advertising, I think the producers missed a golden opportunity by not having the giants refer to the king's stronghold as “the White Castle” and the soldiers they intended to eat as “sliders,” but that's neither here nor there.) Anyway, after Jack The Title Character had – at the last possible moment – realized he could kill the giant who was holding the magic crown by tossing a magic bean down the monster's gullet, the farm-boy turned adventurer then rushed outside in his climactic moment of glory to make the invading horde of giants take a knee and reconsider their whole strategy of pausing to gloat before eating their adversaries. What ticked Ruth off was that it was Jack who came smugly ambling out of the castle to control the giants and not Princess Isabelle. Both Jack and Princess Isabelle had been alone together when the giant – who had been holding the magic crown – died of IWD (Invasive Weed Disease), and Ruth fervently expected that the common farmhand would turn over the crown to his princess before going outside to prevent the invading giants from commencing with their post-victory smorgasbord of human flesh.
     Rhetorically, I think I understand why Ruth had such high expectations for Jack to hand over the magic crown and let the princess end the film by being the one who saved her realm from the hungry, hungry, huge guys. And, furthermore, I think I can explain why this ending never even occurred to the filmmakers (and if it did, why they probably never gave it a second thought). Perhaps, if the film had ended with the couple walking out together – the two of them holding the magic crown high in the air between them – then the movie would have had a modern fairytale ending, but it would have violated the internal consistency of its patriarchal subtext and risked offending its primary audience. The Golden Rule of Capitalism is “Never risk offending your primary audience.” Of course, I intend to explain all of this below.
     First, here's why I think Ruth expected Princess Isabelle to save the day with the magic crown. At the very beginning of the movie, the film cut back and forth between two parents reading the same bedtime story to their children – Jack's dad reading him the story of the giants' previous defeat at the hands of magic-crown-holding King Erik, and Princess Isabelle's mother, the current queen, reading the identical story to her. By intercutting these two stories of Jack and Isabelle in the opening, it would be reasonable to expect that perhaps the film would portray the two protagonists as equals. As the film leaps ahead 10 years to show Jack as a young man traveling to the city to sell off a horse, the viewer soon encounters Isabelle traveling in disguise in the same market as Jack. The two quickly run into each other. Again, with this first encounter between the two main characters, it would not be unreasonable to expect that the rest of the film would try to maintain a balance of “his story” to “her story.” Furthermore, given the film's early depiction of young adult Isabelle's willingness to defy her father's injunction against traveling alone outside of the castle, it's not difficult to understand how Ruth (and other people expecting a more contemporary portrayal of womanhood) would read into the story that this princess is not going to be the typical traditional heroine who will need a man to save her, but rather a post-modern, feminist princess who will demonstrate her independence by seeking out her own adventures – regardless of whatever her father's patriarchal rule demands of her. Later, after Isabelle has been transported to the land of the giants by the miraculous growth of the beanstalk beneath the hut she had been trapped in, both the film audience and the other characters in the movie have it pointed out to them that, given the choice between climbing down the beanstalk to return to the safety of her father's domination and the dangers of independently exploring the territory of cannibalistic giants, Princess Isabelle opts for the risk of the giants. While in generations past this decision to go it alone in the wilderness may have been played off as a sign that a princess is not smart enough to go back down a beanstalk, in the context of this film, it is clear that she was bravely looking for her own adventure. Additionally, when this film is put into the context of other recent “fairy tale” movies, such as last summer's Snow White and the Huntsman in which Snow White fights like a ninja, it is perfectly reasonable to expect that perhaps by the time the movie ends, Princess Isabelle would end up holding the crown that controls the dreaded giants as a paragon of female empowerment.
     And here's why I think it never even occurred to the filmmakers to end the movie with Isabelle saving the day (or at the very least, sharing the day with Jack). Although the movie winked at the audience with a self-awareness of modern irony (in one scene, for example, a giant attempted to bake “pigs in a blanket” with actual hogs enveloped in flour blankets), too much attention was given to maintaining the traditions of patriarchy within the story itself. When Jack encounters Isabelle at the market for the first time, Jack takes a punch in the face to defend Isabelle from some ruffians who clearly had no idea who they were dealing with. From this moment on, it is clearly a “guy film.” One way to distinguish a “guy film” from a “chick flick” is to count the number of explosions it depicts, but another is to actually count the number of female characters. With the exception of the queen who reads to Isabelle as a child – and whose disappearance by the time Isabelle comes of age is reduced to nothing more than the screenwriters' need to explain the princess' “rebelliousness” in her insistence on going off alone – there are virtually no other female characters in the movie. Not only are all the giants filthy, rude, and violent, they are all dudes as well. All the king's knights who travel up the beanstalk to rescue the princess are men, and all the soldiers who fight off the giants at the movie's conclusion are men. If there are women shown among the crowd at the market or within the crowded castle, they are nothing more than scenery. If I really wanted to push how masculine the undertones of this film are, I'd point out that the princess's name “Isabelle” is meant to point out how pretty she is; she “is a belle.”
     As a viewer, the rhetorical message I think the filmmakers wanted to send to their primary audience of young men is that given enough courage and determination, anyone can overcome the stigma of poverty to defeat the giants of power, wealth, and influence. Early in the film, Jack is told in no uncertain terms that no matter what happens, there is no chance of a romance with the princess because he is a commoner and only the privileged nobles have any opportunity to court royalty. Given the gorgeousness of Nicholas Hoult (who does an admirable job of playing Jack), there is virtually no one in the audience who would believe that Jack wouldn't end up with the princess once he saves her from those awful, smelly, and apparently misogynistic giants. By the end of the film, not only has Jack defeated the giants, but he has overcome his humble beginnings as well, demonstrating the tired and medievally anachronistic message that there's nothing a little bravery, optimism, and hard work can't overcome – unless, of course, you are unfortunate enough to be born too big, too ugly, and too grimy for Hollywood's perfect aesthetic, in which case you deserve whatever gigantic fall to earth comes to you.
     Keep thinking rhetorically and I'll be back next week.

Sunday, February 24, 2013

The Tang of Reality


     The tapestries of our lives are so tightly woven that it's surprising to discover the seemingly disparate memories that can come from pulling upon a single thread. Thus, as I consider the beginnings of my difficulties with organized religion, it seems odd to me that some of my troubles began with Tang.
     As a child growing up in the 1960s, I was fascinated by Tang. Tang was a brightly-colored, powdered orange drink that came in a tall glass jar. Tang wasn't just powdered orange juice; it was better. All Mom had to do was spoon a little Tang into a glass, mix in some water, and - voila - there it was: the perfect beverage.
     I knew it was the perfect beverage because the commercials said so on TV; "Holy Smokes!" (as Rocky the Flying Squirrel would say to Bullwinkle when he got really excited), Tang had been created for astronauts to drink in outer space. I was 8 years old the summer Neil Armstrong became the first person to walk on the moon, and words cannot adequately express the excitement and coolness I experienced by being able to guzzle the same drink as the space heroes of Apollo 11.
     At least a quarter of a century has passed since I had my last glass of Tang, but I can still taste it: an orange flavor as it would have been replicated on the Starship Enterprise. "Warning! Warning! Will Robinson!" I could imagine the robot from Lost In Space saying upon analyzing a glass of real orange juice and finding pulp in it (few things were as repulsive to me as a child as orange juice pulp). Tang was everything a kid could want from a drink: sweet enough to make your eyes pop, tart enough to make your lips pucker, and smooth enough to swallow in a gulp.
     I remember holding a jar of Tang, unscrewing the lid, staring into the bright, orange powder, and thinking, "Science is so cool." As an adult, I don't know if Tang had really been developed by NASA scientists (picture this: a group of NASA scientists all wearing long, white lab coats at an important meeting. The chief scientist is holding a clipboard as he stands at the head of a long table. He looks at his clipboard for a moment and says, "Alpha team reports little progress in the development of an 'O' ring that can withstand the intense temperatures of reentry; Beta team is struggling still with the detachment of the second stage boosters, but, hey, good news, Omega team has perfected the process of turning orange juice into powdered sugar granulates.") or if Tang was created by someone who was a genius at marketing products to children (picture another room: this one contains the 1968 annual awards ceremony for advertisers. Standing at the dais is a man dressed in a tuxedo, and he's holding a trophy with something that looks like a small golden cow pie on top of it. "I'd like to thank the Academy," he says, "for this award, 'Best Scam on American Youth,' for the third year in a row. But I'm not going to rest on my laurels; I'm in the process of developing a plot to sell tennis shoes to teenagers for $100 a pair." A gasp goes through the crowd; "It'll never happen," someone whispers.). The point is, however, I believed Tang had been created for astronauts because that's what I had been told, and that's exactly what children do: they believe whatever they are told.
     Some folks see nothing wrong with exploiting the credulity of children, and furthermore, find it charming or amusing to tell a small child anything. I expect (though I've never read any research to confirm this) that very few people suffer any permanent damage from the discovery that their parents are in reality the ones who hide money under their pillows in exchange for lost baby teeth. Unfortunately, the problem I had in being so gullible as a child was that I tried to believe too much at one time. In church I had been led to believe that Hell lies a couple of miles beneath the surface of the ground, and Heaven floats on top of the clouds just above our heads. I had no problem believing this because first, I was a child; and second, the integrity of the people who told me these things was beyond question.
     As a child, there was no question that Heaven floated on top of the clouds within the atmosphere of this planet because I had been told so in church, and while I could imagine angels poking holes in the clouds to keep an eye on us down here below, I could not imagine the people in church lying about it. Church, as far as I could surmise as a child, was the very last place a person would want to tell a lie. Furthermore, the idea that Heaven is located in the clouds just a few miles above our heads was supported in dozens of ways. In Sunday school, I was taught the Bible story of how God had grown angry when a group of people had built a tower so tall and so close to Heaven that it trespassed on God's personal space. God not only knocked it down, but He scattered the people who built it across the entire planet and made them speak different languages so they wouldn't try it again. Also, my mother owned an oversized, illustrated Bible, and I can remember the sense of awe and wonder I felt at looking at an incredibly rendered drawing of Jacob lying at the foot of a magnificent staircase with angels ascending and descending from Heaven in the clouds. Moreover, I had no reason to doubt the ministers who assured us that after Jesus had risen from the dead, eyewitnesses had watched him ascending into Heaven. On the day of the Ascension, Christ had floated up into his home in the clouds; he had not merely grinned like Alice's Cheshire cat and disappeared.
     Every bit of religious instruction I had as a child taught me that Heaven was a real place that floated on the top of the clouds. The evidence from Bible stories, the testimony of Sunday school teachers and ministers, and the portrayals in oil paintings and other illustrations in everything from religious literature to magazine advertising convinced me of the truth of this. And yet . . . there was Tang, the preferred beverage of astronauts. I had held the amazing jar of orange powder in my hands; I had watched with my own eyes the miraculous transformation of plain water into the world's most perfect beverage. And, with every glass of Tang came the confirmation that people had traveled straight through the clouds on their way to the moon, and they never saw hide nor hair of the denizens who lived there.
     After Moses left Egypt to wander with his people lost in the wilderness for 40 years, God sustained them, I was told, with manna that fell from Heaven. I've never tasted manna; however, twenty-five years after my last glass of that tart orange drink, I can still taste the Tang.
     Keep thinking rhetorically and I'll be back next week.