Thursday, April 30, 2009

Final Reflection

Recently, I read a short paragraph that I had written at the beginning of this semester. It was my answer to the question, how would you define “composition”? Truthfully, though I like to think my understanding of the subject has advanced in the past three months, I do not want to be dishonest: it has something to do with persuasion, but beyond that, my guess is as good as anyone’s.
My favorite definition of rhetoric (composition) is given by I.A. Richards, who said – maybe speciously – that composition is “the study of misunderstandings and their remedies.” This is a lovely little sentence, and I am certain that at another time it had a good deal of relevance to a way of life far removed from our own cynical times, which, so it seems, have made “misunderstanding” – read: deliberate ignorance – a way of life, and consumption the remedy. Sadly, rhetoric has come to mean for most of us nothing other than the misleading of others for one’s own gain; I hope that in some other, more enlightened society, far in the future, we can regain the nobler meaning of rhetoric, which Andrea Lunsford defined simply as the “art, practice, and study of human communication.”
Does writing have the capacity to help the community come together? Without a doubt, many people seem to value family. “They’re my foundation,” wrote a 17-year-old high school student, Kristiana St. John, in an article entitled “Youths’ stuff of happiness may surprise parents.” The girl later commented that, though she may occasionally do something “stupid,” to be told by her mother that she was loved made her feel “very happy and blessed.” When I read these words, I was reassured in what I have always thought about communication: that it is far too vast a form to be used simply for manipulating or persuading others. Of this I am completely adamant. To insist that all communication hinges, in one form or another, upon moving another person in the direction one wants is, I think, too short-sighted. This mother wants her daughter to be as she is, no more.
Also, I have come to believe that writing and communication – particular kinds, at least – needn’t be confined by what are called the three classic forms of appeal: logos, ethos, and pathos. Let’s go to the extremes of sentiment, one on each side of our character. If we consider on the one hand tyranny and on the other passiveness, who could say that either includes an argument? If I bring the whip down on your skin, telling you, “Do as I say or the alternative is this,” what is my argument? Likewise, if you rejoined, “I had better do as he says, or I’ll get the whip,” you, the tormented, could no more be considered one who makes arguments than I, the tormenter: we simply exchange orders and meek replies of consent, which are hardly arguments at all. Now, I do not mean to imply that rhetoric is based upon legitimacy, not in the moral sense, at least. I do say, however, that an argument must rely upon some external form of reason as much as an internal one. “Do as I say or the whip”: there is the internal reason, which, I suppose, is sound, assuming I were to follow through. But what if I were bluffing, and instead were only making a joke at your expense? To say, then, that “Do as I say or the whip” carries the same meaning whether I am joking or being sadistic is not convincing: often, communication is highly deterministic – meaning it is controlled by the context in which we communicate.
I value writing. I have wanted to teach English for some time, though, to be truthful, I am skeptical about what possibilities writing will have in the future. I do not think our society values it as it deserves, and the results have been mixed. In a short time – maybe the past ten years – I have been flabbergasted by the break-neck speed at which we have moved from what was once a more literate culture to one so heavily reliant upon imagery. From entertainment to the way we get our information to how we interact, electronic media have become the dominant mode of modern life, which, to be sure, has benefits and drawbacks. Teaching will benefit from this, but it will also somehow be hurt by it, as is clear from how brief our children’s attention spans have become, what with the need to be constantly consumed with one gadget or another, a cell phone or an iPod. Recently, I heard that the Texas Tech library will soon begin to ship its books out to warehouses where, I was told, they will be stored and eventually destroyed. The reason, supposedly, is that students spend 98% of their time in the library using computers. “A glorified coffee shop” is what the Tech library would become, said an employee. I couldn’t answer.
What is composition? In my view, it is something that I hope has a future.

Thursday, April 23, 2009

Opening the Human Doors: Artist's Books and Pedagogic Theory


The reader became the book; and summer night
Was like the conscious being of the book.
-Wallace Stevens

The art of children has long interested scholars, beginning perhaps as early as the 19th century. Sir Herbert Read supposed that the academic focus upon children’s art may have begun with John Ruskin’s The Elements of Drawing, published in 1857 (Gaitskell 15), though he listed several other candidates. Among these were writings in the 1890s by the literary critic James Sully and, more convincingly, by Helga Eng, who, in the mid-1930s, was among the first to recognize the enormous expressiveness and variety in children’s art (Gaitskell 15).
Though artist’s books have been traced back as early as William Blake’s mystical poems (Klima 10), this peculiar genre of publishing – which mixes the elements of art with the printed form of the book – is mostly a phenomenon that emerged in the 1960s. Artist’s books, as we will see, are extraordinarily versatile forms of expression, though, outside a small circle of collectors and universities, they remain sadly obscure to the public.
For our purposes, we will explain why artist’s books are a novel approach to the needless frustration children face each day in learning to write.

To begin, we will review, briefly, the history of the genre, with examples throughout. Second, we will consider some views on education and art from prominent thinkers in both fields. Then we will offer a completely new and easy-to-apply example of an artist’s book, after which we will settle on our conclusion.

“…An Impermanent World…”
What, precisely, is an artist’s book? Apparently, even among those who have made them for many years, there is no consensus on what either “art” or “book” means. “There are as many definitions of an artist’s book as there are innovative expressions of its flexible form,” said Nancy Toulsey (Hubert 21). “This mercurial condition defines the nature of the artist’s book.” A less general definition of the form was offered by Clive Philpot, former library director at the Museum of Modern Art, New York, who regards artist’s books as needing a utilitarian purpose as well as an aesthetic one: “[the] book form is intrinsic to work…one way of determining this is to consider whether what is presented could be shown on the wall” (Hubert 22).
The first important revival of the artist’s book – a gathering at which those who made them arrived from all over the country to consider what they are – took place in 1973 at the Moore College of Art in Philadelphia, Pennsylvania (Klima 20). A pamphlet handed out at the modest exhibition, which featured 250 widely varied books, focused upon works from the early 1960s up to the present. “A book represents a permanent reality in an impermanent world,” wrote Lynn Lester Hershman in the exhibition’s pamphlet, “…access to its contents was controlled by the individual” (Klima 20).

Clearly, Hershman saw much more potential in the “permanence” of artist’s books than in a simply pleasing object to “be shown on a wall.” But the genre continued to grow, and has grown, perhaps even beyond the idea of permanence held by those attending the exhibition at the Moore College of Art in 1973.
In Linda Smith’s book Inside Chance (2000), the reader sees not so much a poem or an idea as a physical metaphor. Composed of eight black cardboard cubes, neatly trimmed and connected by black paper indiscernibly pasted into grooves between them, the work unfolds like a strange puzzle – the sort of thing one might expect to find in a knick-knack shop – and within can be read the poem by Alberto Rios, “Inside Chance” (Wasserman 166). Like that piece, Leah Micher Geiger’s The Reptile Brain (2003) features collage and words, though it leans more toward sculpture than does Inside Chance, which is more book-like. With four rectangular oak panels smeared with diluted white house paint (Wasserman 115), some covered with text copied out of biology books, others pasted with natural objects – leaves, feathers, reptile bones – The Reptile Brain is decidedly a more political work than Inside Chance. By reading it, the viewer is expected to consider the reptilian reflex, buried deep in our brains and gruelingly apparent in our habits.


The Great Wall of China (1991) was influenced by Shirley Sharoff’s early career teaching middle-grade students the English language in China. A sort of cut-out, covered on either side with several cleanly written poems by Lu Xun, a Chinese dissident and author, “The Great Wall” is propped up and spirals, in and out, like a maze for mice. Sharoff wrote about her experience: “As an English teacher, I assigned my Chinese students short compositions so they could practice their writing, but for me, it was a way into their lives and their memories of life in the 1970s” (Wasserman 149).
”The Great Wall”
Artist’s books by the likes of Smith, Geiger, and Sharoff have clearly pressed beyond the domain of what others may once have been comfortable calling a book. The best definition of an artist’s book, however, may yet have been offered by Ed Ruscha, who pioneered the genre with his book Twenty-Six Gasoline Stations (1963): “[the artist’s book] implies that what the book actually represents may matter less than what it dismisses” (Klima 10).



A Thousand Words
We have all at one time or another heard the old phrase “a picture tells a thousand words,” though we may seldom have thought about what it means. How is it that a picture “tells,” seeming, as it does, to lack any of the conventions – grammar, words, clauses – of spoken language?
By its very incapacity to communicate in words, the image has an added layer of depth that may be absent in writing. If it is true, as I think it is, that people are inclined by nature to self-interpretation, then it is clear that the image, though it does not speak, implies a question, which the onlooker is in turn expected to answer.
The value of art may be judged, argues Andre Malraux, not by its nearness to life but by its rarity. “If it so happens that an artist immortalizes some supreme moment, he does not do this by reproducing it, but because he subjects it to a metamorphosis” (Malraux 279). Malraux’s “supreme moment” is, for him, an autobiographical moment, which, as we will now see, is a prominent theme in many artist’s books, and could be a help to students who need inspiration for their writing.
Renee Stout’s lovely book Seven Windows is a fine example of an autobiographical artist’s book. Properly speaking, it may be better to call it the biography of an imagination: the book, shaped like a common scrapbook and filled with drawings made with rich, lucid lines of color, tells the story of Madame Ching, Stout’s childhood alter ego. The author has made a truly bizarre character: the fortune-teller, with pale cheesecloth wrapped tightly around her head, wearing a broad, sack-like blue dress, with wizened dark skin and piercing white eyes, looks like an old migrant woman from late 19th-century America, standing tiredly, as she is often portrayed, in a dark field under a frozen sun. Madame Ching, in Stout’s words, was “a mysterious fortuneteller and root worker, [and] functioned as a vehicle through which I could analyze the complexities of the self and human relationships” (Wasserman 47). We cannot guess how Stout may have embellished Madame Ching nearly 30 years after creating her. The book contains many other drawings, including a strange and beautiful watercolor of a human heart, suspended in the center of the page, surrounded by short extracts, written in blood-red ink, from what appears to be the author’s journal. A project like Seven Windows would offer students a superb opportunity to bring their lives into their writing.
During an interview in which he discussed his many artist’s books, Mathew Gellar, author of “Difficulty Swallowing: A Medical Journal,” pointed to two elements he considered to distinguish a book from a work of art (Dept. of Social and Health Services 11). The first “fact” about books: readers, unlike viewers of images, are capable of being discerning about what they read. Second, and following from the first point, the reader can take up his activity whenever he thinks it appropriate; the average man, walking through the “museum” of reality, is hemmed in by the images surrounding him. Gellar is referring mostly to the many forms of advertising in modern life, though art impresses itself on him no less. “The experience of [the artist’s book] presents an interesting combination of limitations and variables,” said Gellar at the interview’s conclusion (Dept. of Social and Health Services 11).
Considering Gellar’s opinions about books, we can point to two mistakes in his reasoning, both pertinent to the use of artist’s books as a tool for teaching composition. The first is that he does not consider that the reader can tune out material in a book as easily as a man driving through Lubbock on his way to work can disregard whatever does not get him there. The second is that Gellar appears to accept a contradictory notion about art: to him, the act of reading is always intellectual and voluntary, while looking at printed images, like advertisements, is always a visceral experience in which the person has no choice.
We will suspend our opinion of Gellar’s argument, for the time being, and keep his two points in mind, while we continue to explore the artist’s book.
Next, I will illustrate an idea for an artist’s book that I conceived myself.
Down the Rabbit-Hole
In Chris Marker’s 28-minute film “La Jetee” (“The Jetty,” 1962), the audience is hurled into the demented, savage world of the future. Paris has been completely destroyed by an atom bomb, as has the rest of the world, and the entire surface of the earth is covered with poisonous nuclear residue. The surface world is blotted with decay and death, and those who survived the Third World War migrated underground, where, as the film begins, the human population has lived for at least 30 years. Much of the film’s story concerns a strange series of time-travel experiments whose purpose is to reverse the condition that led to the explosion; one in particular concerns a man who, unable to tolerate the specter of his freedom in a new, freer world, attempts to travel into his past, where, he assumes, he can find freedom in his unhappy childhood. The movie inspired Terry Gilliam’s belated film adaptation, “12 Monkeys” (1995).
“La Jetee” is an extraordinary film, not merely for its original story but for how it was made. The audience almost never sees a moving image: the movie is composed nearly entirely of still photographs, arranged so that the actors seem to be communicating with one another. What results for the viewer is the disturbing feeling of looking at a torn-up scrapbook from the past, telling us something of dire importance.
Scene from “La Jetee”
The film is very well written; closing our eyes, we could be convinced we are being read a short story. “Only later would he remember the sight of a frozen sun, of a stage setting at the end of a pier, and of a woman’s face,” says the omniscient narrator over a stark photograph of an empty dock, with clouds looming above. “Nothing tells memory from ordinary moments; only afterwards do they claim remembrance on account of their scars” (Marker). In instances like these, the narrator’s words amplify what is shown in the photographs, sometimes subtly, and in this case quite dramatically. What we are observing is persuasive writing.

What could students learn from watching “La Jetee”? My proposal is that they do a project based on the film’s formula of telling a story through a combination of photographs and text.
The principle of the exercise is straightforward. Simply put, we assume that the best way for children to learn is to be personally engaged in their work; for that to happen, they must be offered considerable independence to shape their own works and, most importantly, to bring their own resources to their efforts. The students should not be left to fend for themselves entirely, however, and that is why the teacher, who has a better knowledge of the readings, would supply works – short stories, poems, novels – that the students would retell with photo illustrations and text. If a student, for example, chose to illustrate “The Circular Ruins,” a fantasy story by Jorge Luis Borges, he could cast himself as the wizard who, separated from society, tries to dream a boy into being, in his entirety, to accompany him in his solitude. The student could choose locations (preferably a field, or near a hollowed-out shack), make costumes, and find someone to play the boy, perhaps a little brother. And finally, the student could manipulate the camera as he or she chooses: take as many pictures as needed, make black-and-white or color images, and mix them together as they see fit. The students would be required, though, to add detail to the pictures with writing.
This project would address a number of issues about writing, not the least of which is that children often prefer imagery to writing. Ours is a visual culture, more visceral than thoughtful, and as such, children tend to consider reading and writing an obstacle rather than a means of important expression. Here is a project they can take home, that they have the freedom to take apart and refashion to their liking; what is more, they have control over their story, and control is something rarely afforded to poor children in their personal lives.
Conclusion
We have discussed the fascinating history of artist’s books, their extraordinary variety, and their potential for teaching composition in primary schools. We have seen how artist’s books could ennoble persuasive writing and make composition interesting to children. Perhaps no definition could embody anything as mutable and as strange as the artist’s book. In closing, the reader would do well to consider Ralph Waldo Emerson’s opinion of art (Gowan 407): “…that beside his privacy as an individual man, there is a greater public power on which he may draw, by unlocking, at all risks, his human doors…the mind flows into and through things hardest and highest and the metamorphosis is possible.”


Works Cited
Dept. of Social and Health Services, Belltown. What Are We Waiting For?: An Exhibition of Artist’s Books. Seattle: Real Comet Press, 1984.
Gaitskell, Charles D. Children and Their Art: Methods for the Elementary School. New York: Harcourt, Brace and Company, 1984.
Gowan, John Curtis. Development of the Creative Individual. San Diego: Robert R. Knapp, 1972.
Hubert, Renée Riese, and Judd D. Hubert. The Cutting Edge of Reading: Artist’s Books. New York: Granary Books, 1979.
Klima, Stefan. Artist’s Books: A Critical Survey of the Literature. New York: Granary Books, 1998.
La Jetee/Sans Soleil. Dir. Chris Marker. The Criterion Collection, 2007.
Malraux, Andre. The Voices of Silence. Princeton: Princeton University Press, 1978.
Wasserman, Krystyna. The Book as Art: Artist’s Books from the National Museum of Women in the Arts. New York: Princeton Architectural Press.

Sunday, April 12, 2009

Composition

If you were to ask me what I thought of Composition 3360, I would answer that I enjoyed it, for the most part.
There were the obvious troubles, namely trying to read Lanham’s entire book, “Style: An Anti-Textbook.” Not that “Style” was a bad book. On the contrary. Simply put, I could find very little time to read it, what with my schedule being so hectic and all. The thesis of the book was very interesting, though, or what I interpreted as its thesis: that discourse, in any form, is an expression of the human will to wave power over others’ heads.
Of course, I believe in this no more now than I did at the beginning of the semester, as I’m sure I argued in class more than a few times. There may be, however, a grain of truth in the idea, which I admit begrudgingly. But before going further, I would point out one flaw that I think is endemic to this argument as it is traditionally used. The flaw is this: assuming the thesis is correct, that discourse is invariably meant to exercise power over others, is such communication always meant for selfish ends? I’m speaking, evidently, about the sort of “selfishness” that is pathological; no doubt, we could name all sorts of instances from simple observation proving that selfishness is not only healthy but even desirable. That isn’t what I’m talking about here, though.
Then there was the so-called “MOO” project, which I’m not sure how I feel about, except to say it was different. What struck me most of all, after everybody had stopped their fingers from clicking the keys and stood up from their little chairs, was how relieved I felt at that instant to see people around me once more. It’d be difficult to produce a metaphor; maybe I felt like Plato’s naked man emerging from the cave to see the light, wanting to tell everyone else about it.
Overall, I enjoyed the class.

Tuesday, March 31, 2009

The Essay Idea: An Essay

Today, I mentioned something in class that I thought was interesting. No one else appeared to put much stock in it, but nevertheless, it interested me. It’s a possible idea for an essay. The idea is this: can a student who is not interested in any or most forms of academic achievement be drawn toward achievement through writing? More specifically, if he were taught to approach writing not as a dry subject with a predetermined set of rules but as a set of languages, would that student be encouraged to achieve more?
It’s a risky theory, I know, and truthfully, I’m not certain how I could prove it, or whether there are even any precedents for it in scholarship. There was Jacques Lacan, who regarded the unconscious mind as being organized much along the lines of grammar. That, however, is a very dense area to get into, not at all appropriate for a short essay.
But I feel an urge to continue with this, however foolish it seems. For a brief example, let’s take the subject of literary modernism. We won’t delve into its entire history, of course, but we can simply look at what it was. Modernism came into being as a result of industrialization, in Europe particularly, and meant, at least to its participants, a sort of challenge to what were, from their point of view, highly inhumane, monolithic societies revolving around hectic cities. Modernist literature was notable for mixing genres together: Miguel de Unamuno’s “Don Juan” in “Three Exemplary Novels” is a cross between a play and a short story; Francis Ponge’s poems in “Things” read like short essays. Modernist stories could have non-linear plots, and such writing was useful, time and again, for handling taboo subjects: Andre Gide’s “The Counterfeiters,” about a botched attempt at passing fake notes in 1920s France, features a frank portrayal of homosexuality.
My point, though, is that encouraging students to compare given specimens of writing, and affording them opportunities to mix genres freely in their own writing, would, I firmly believe, be very helpful to them. Let me say first that we wouldn’t do this sort of thing indefinitely; clearly, it’s an experiment, something reserved for certain times. But it’s creative: for one, students would learn the limitations of one genre and the advantages of another. I myself have improved my prose by learning the terseness and discipline required of poetry. Also, there are plenty of teachers who’d be more than willing to take part in this sort of experiment. Right this moment, I’m observing classrooms at Monterey High School, English classes whose fabric is torn at the seams, so bored are the students by the predictable readings. Why not add some flavor?
It’s a thought…

Sunday, March 8, 2009

The Internet: Its Potentials, Drawbacks

Without question, few things have so thoroughly changed the basis of communication in a society as the Internet. There were its obvious forerunners, namely radio and then television, each of which, when first introduced to American society, was thought by many to be a revolution in the mode of communication and culture. One of the first programs to air on TV was a minor play, “The Man with the Flower in his Mouth,” by the Nobel Prize-winning playwright Luigi Pirandello. My father has shared with me fond memories of watching “Play of the Week” on CBS; the program often featured old plays by the likes of Ibsen, Shaw, and O’Neill, as well as original works by Paddy Chayefsky, Tennessee Williams, and Rod Serling, who is better known as the creator of “The Twilight Zone.” That era, though, is clearly gone, and its demise could perhaps be explained no better than by Serling, who wrote 92 episodes of “The Twilight Zone” himself.
Serling, who let his show be cut on its third cancellation by CBS, was personally horrified by the new chairman’s tastes and by the brainless pabulum we have now come to expect of the medium, mostly game shows (The Joker’s Wild, etc.). He believed, I think correctly, that the country was being dumbed down at the behest of a stupid man who happened to be powerful. Looking back, can we doubt Serling’s incisiveness, with modern shows like “Dog: The Bounty Hunter” and “Dancing with the Stars” now topping the ratings? This raises an important point, though, namely what the public does or does not consider interesting entertainment.
Clearly, there is that lingering temptation to say, “Americans will watch any shit you put on.” Why, though? “Because they aren’t smart enough to distinguish the important matters, the ones that really mean anything.” I can’t be sure that everyone shares this view, though I do disagree with its main point: that people are not interested in world politics, and, being naturally dim, are better suited for what Columbia University professor Paul Nystrom called a “philosophy of futility.” The average Joe, on this view, is so stupid he cannot conceive his own predicament, and ought to be driven into a life of menial servility and labor, where he can relieve his pain with the miracle of consumerism. That, unfortunately, has been the bent of television throughout its history, and I see no reason why the Internet shouldn’t flow in that direction as well. In fact, it already has, to a great extent, for the reason I just mentioned.
There are differences between the Internet and television. For one, the Internet isn’t dominated (yet) by advertisers and concentrated capital, which, as one would expect and sees every day, is the decisive force behind decision-making and entertainment elsewhere. Debates continue in these circles about whether the Internet should remain free or become, as in China, managed and controlled. I myself receive e-mails from activist groups describing, in the most foreboding terms, the importance of a free and open Internet. The reasons are clear. These groups want what the other mediums have long ceased to offer, most importantly a medium where they can go to discuss important matters that affect them. That isn’t what all of the Internet is used for, obviously, but that is what many people, intelligent people, are talking about.
I can only hope that these groups will be successful with their basic goal, that the Internet will remain free, and television, radio, and print will someday realize the potential to inform and stimulate the public, as they clearly now do not.

Wednesday, February 25, 2009

The Berlin Paper

Here, I will discuss briefly “The New Rhetoric,” one of three major theories of pedagogy James Berlin discusses in his lengthy essay, “Contemporary Composition.”
The New Rhetoric, for me, was the most interesting of the three methods Berlin discussed, not least because I mostly disagree with its major tenets, but also because, of all the teaching methods I’ve been exposed to, it is perhaps the most frequently practiced. For the New Rhetorician, Berlin says, truth is “dynamic and dialectical, the result of a process involving the interaction of opposing elements.” Unlike Plato’s assumption that truth is embodied in natural “forms” which are meant to be discovered by an intricate method of philosophizing, the relation between these opposing elements is “created, not pre-existent and waiting to be discovered.” Because truth as such is undiscoverable through sense impressions alone – to be discovered, it needs to be organized and structured – truth is, in the context of communication, highly subjective and ungraspable except through the interchange of messages from one person to another.
Before I attack this argument, it may be well to state, first, that I’ll make no attempt to misrepresent Berlin by connecting him with this view, though he seems to approve of it. I myself am not totally averse to it. I agree, for instance, with the view that truth is in a constant state of flux, and that, truth being such an elusive substance, any resolution to discover a viable truth would no doubt rely on a communal effort. There are, however, different kinds of truths and different elements to those truths, each of which appears suited uniquely to its context. The New Rhetoric is a perspective that allows for little in the interpretation of the truths that are most elementary yet elusive, the spiritual truths. I disagree with the view that communication is necessarily designed for the purpose of arriving at truth. Some forms are, and others are not. I believe that human beings apprehend crucial truths through experience; racial integration, for instance, expanded the bounds of the culture’s tolerance and, I think, civilized it to a great extent, all by the simple means of familiarity.

Saturday, February 14, 2009

Simon of the Desert: A Review



On February 14, 2009, The Criterion Collection announced the release of Luis Bunuel’s film “Simon of the Desert,” significant for being the last film of the great director’s so-called “Mexican Period.” The film, which lasts only 43 minutes, features two familiar Bunuel collaborators, Claudio Brook (Simon) and Silvia Pinal (The Devil).
I watched the film on a Friday night, alone, and, expecting an outrageous, anti-clerical comedy, I wasn’t disappointed at all. I do believe, however, that unlike other “art” directors, Bunuel is usually more accessible; no other director, in my opinion, is at once so obscure and yet so appealing, for the simple reason that no one is as creatively outrageous. This trait is apparent in “Simon of the Desert,” which is loosely based on the fabled Christian ascetic Simeon Stylites, who, as legend has it, in his attempt to expiate his venial sins, lived atop a high pedestal for 36 years. The film retells Stylites’ story, with Claudio Brook in the title role, covered in a dirty wool smock, cheeks covered in a matted beard, croaking in a funny, pompous voice at his adoring, hypocritical followers.
Brook, who had an illustrious career (he starred in two other Bunuel films, “The Exterminating Angel” and “Viridiana”), is quite good; he is pompous, more self-important than noble, though he tries. When he heals the poor handless cripple, who walks off afterwards apparently without the slightest impression of the miracle, the man’s indifference seems, ironically, to mirror Simon’s own cynicism. The dirty ascetic, glaring into the windswept land, wonders to himself about the nature of this religious gesture, fantasizing about his mother holding his head in her understanding arms, comforting his sorrows. By his nature, he is weak, and we are certain that when the devil comes he will surely capitulate to temptation.
Bunuel made very good use of Silvia Pinal in their too-brief collaboration. The two had teamed before in “The Exterminating Angel” and “Viridiana.” The latter film is probably her best work, and, incidentally, was a humiliating blow to the dictatorship of Francisco Franco, who had invited Spain’s leading director, then an exile, back to his homeland to make an art movie, not expecting the hilarious, atheistic fiasco that resulted. Lacking the innocence of the naïve, religious girl of that film, Pinal here is evil itself. I thought the scenes Pinal was in were the film’s funniest, especially the bizarre ending in a New York nightclub. The devil comes to Simon three times: first, as a sexy schoolgirl singing a vulgar lullaby; then, as a bearded, cherubic messenger of God; and finally, as a ghostly, necrophilic apparition in a coffin.
The ending of the film still troubles me. Did Simon allow himself to fall under the devil’s spell and be taken to New York, or was he forced against his will? If the first proposition is true, then his fate in the nightclub – to be stranded in the future – is more appropriate. But is it, really?
The film is great. See it.

http://www.youtube.com/watch?v=gNGOsvrbvu4&feature=related - Scenes from the film.

Thursday, February 12, 2009

How to Make Love to a Humanist

Of the multitude of ideas men have created, perhaps no other has been more influential (or controversial) than humanism. In its history, some of the most intelligent and noble ideas have come from men who counted themselves in this tradition, as well as some of the most benighted and ignorant. Since I consider myself a humanist, it may be well if I clarified this mystifying term, so frightening to so many in the American public. Afterwards, I’ll explain why humanism is so crucial for understanding our world.

First, I quote one of the 20th century’s eminent philosophers, Antonio Gramsci, who, I believe, summed up the humanist credo well. Gramsci, a man imprisoned by Mussolini’s regime for his beliefs for much of his adult life, and who never saw his prison writings published, said, famously, that people must maintain a “pessimism of the intellect and optimism of the will.” What does Gramsci mean? He is advocating, simply, a sentiment, not a belief, that would best fit men to confront the problem of despair and inertia, which has dogged people the world over throughout history and has rendered many helpless, particularly through cynicism. People, he argues, must assume that the worst that could happen if they were not to act is guaranteed; at the same time, by exerting their will to change their conditions for the better, people must assume their actions will effect a positive change. In this sense, action becomes, for the philosopher, an act more moral than thoughts and values held for their own sake. So, for instance, if I studied heavily for an algebra exam and was certain that I would still fail, I would resolve to take the test anyway; the reason is that, if I didn’t take the test, my failure would be guaranteed, whereas if I took it, I would either pass the test or have the satisfaction of knowing I didn’t cower before the anticipation of taking it.
Gramsci’s idea, for me, is emblematic of the humanistic philosophy. This concept, like all humanism, is intended to give moral leverage to the individual, which, in turn, implies a greater need for, on the one hand, personal responsibility, and on the other, more obligation to one’s community. We can no longer deify the cult of reason, for the simple reason that civilization’s barbarism and vulgarity, so obvious to us all now, must figure into any serious assessment of human behavior.
However, there are plenty of instances one can point to in which people don’t act like skull-splitting, blood-drinking barbarians. The Loyalist movement in Spain, which lasted from 1936 to 1939, has long been used by anarchists and libertarian socialists as proof of the viability of anarchism, with good reason. George Orwell wrote memorably of the large communal societies that were established throughout Spain in “Homage to Catalonia.” Orwell, a British socialist fighting on behalf of the independent cause and describing the war-torn conditions around him, wrote that, though he didn’t understand everything in this society and even disliked many aspects of it, he couldn’t help but behold what the Loyalists had accomplished and admire it. He returned to the region several months after his first visit, when the entire area had been crushed, usurped by Franco and his Falangist regime.
Orwell’s experience, and Gramsci’s, are a source of deep inspiration to me. They prove that, in the midst of insurmountable chaos, sadism, and unspeakable brutality, people can somehow maintain a semblance of reason and accomplish important things that may stand as examples of moral behavior for other generations. I don’t believe humanism is strictly reserved for leftists or atheists, for much the same reason that I don’t see morality and piety as merely the apple of a right-winger’s eye. We are talking about humans and human values, after all, which leave endless space for second-guessing and hairsplitting on anything we could possibly choose to discuss. I don’t have space to attack that mammoth here. If, however, I were asked what I think all humanists share in common, I’d answer that they all agree that the common good is somehow reliant upon a balance between individual and communal security; that people, unlike property, are moral vehicles, and as such ought to be guaranteed certain rights; and that, as many telling things can be determined about someone by the work they do, the quality of work must be improved as much as possible, which would in turn, presumably, improve the general value of life in the country.

Sunday, February 8, 2009

Rhetoric Burger...Hold the Reason!

Recently, I have begun to reconsider my own beliefs, my judgments of other people, institutions, values; never, in all my mental wanderings, have I come to a satisfactory resolution of these pressing issues. I wonder, while resting on a wizened, bony tree limb in my backyard, Jimi Hendrix at my side, waiting by a fire, what the meaning of truth is. Sadly, Jimi could offer nothing to help me in my journey, though, as you would expect, these meetings have at least resulted in some great tuneage on his part.
The question is, “How will logos, pathos, and ethos benefit me in my life?” I would gladly have passed that question off to Jimi, who is, I believe, far more capable of answering it than I am. He has, however, been dead for almost 40 years, so I suppose I’ll pick up the bag…
Whenever I catch myself watching television – for whatever reason that may be – or simply listening to people talk after class, or losing control and acting a fool in the middle of a heated political debate with another student, I am continually amazed at the apparent disregard for reason and soundness in our arguments. To use myself as an example: when talking about, say, religion, I usually make no pretense of wanting to share a “civilized conversation,” and, like some flea-bitten, glassy-eyed pit bull, I leap on the poor sap I’m talking to, aiming straight for his jugular vein. Why, I can’t say. No doubt, I’m just as capable as anyone else of being logical, of breaking down another’s argument, and of showing where I disagree and why, without the need to humiliate anyone. And often I have done just that.
But – and I’m sure some of you reading this would agree – these are very contentious times in our country’s history, for a host of different reasons, and, no matter which “side” we’re on, there seems always to be a lingering instinct in us to be right, not logical. To be even more cynical, I would go so far as to say many people – the dispossessed, the jobless and unhappy – would not even care to fit ethics into their arguments; simply winning an argument would suit them fine.
One example: three years ago, I took an “American Policy” class, which was meant to inform us of the various branches of government, how wealth is distributed, etc. There were two blokes sitting behind me who, other than during this period, had never spoken to anyone else. That day, the class somehow touched on ethics, which led to the subject of homosexuality. Before anyone knew it, the class was swept into a fiery, limb-tearing harangue over the “morality” of homosexuality, with the two boys behind me leading the charge. I’ll spare you the whole argument – “Homosexuality is like necrophilia!”; “God hates fags, and you’re defending them!” – but all throughout, as I was attempting to speak to them calmly and trying to get a small point across, I instead was forced to dodge the gobs of mud being flung at me from the other direction. Has it always been this way, I thought, or is this a new type of person we’re making?
From the “Aahh, shad-up”-ing of a Bill O’Reilly to the lip-pursing, “Worst Person in the World”-style sarcasm of a Keith Olbermann, modern people tend to move towards the vileness of a bad argument. I could be wrong, but I don’t think our social life has always been this way. Looking back, I think that, though the United States has witnessed countless horrors in its brief history, there may once have been a time, however brief, when people could talk to one another respectfully, logically, and without the urge to turn the other person into a bloody, meat-like pulp over a parking space at WalMart.
This is what Jimi and I were discussing last night. For my own part, I would like to use more ethos in my arguments in the future. That way, I may better be able to understand the other person’s point of view, no matter how much I find fault in his ideas.

Saturday, January 31, 2009

Hardcore Logo

On the front page of the current USA Today Weekend, wedged above the headline, a caption asks, “Which Ads Will You Be Talking About on Monday?” The question, of course, refers to the advertising coup of the NFL Super Bowl, which is as much a hallowed tradition as the game itself. This “question,” or whatever you could call it, made me ask myself a few other questions, namely, “Who would be stupid enough to answer this?” and “Why do these schmucks assume I’ll be talking to people about TV ads on Monday?”
Whenever I am asked ridiculous questions in advertisements, the type of questions that obviously weren’t intended by the agency to be answered with a logical reply, I can’t help but become devilish and conceive a sarcastic reply. "Hmmph…that’s a good question there, young feller. Ads…TV…I talk about TV ads all the time, sometimes even to my wife and kids over dinner. I like the Caveman ads, and the ones for the Snuggie blankets; I brought that one up to my boss. Goddamnit, I didn’t realize this would be so hard!"
I dislike thinking that television advertising has an influence on my choices and thinking; it certainly didn’t influence my choice for presidential candidate, because I refused to vote for precisely that reason. Never in my life have I found an instance in which I thought advertising was practically necessary, at least not for disseminating truth or expressing intelligent ideas; in short, it is my conviction that all advertising, at its core, is calculated to manipulate people into a blustering, irrational, and predetermined conclusion, whether for the sake of stuffing their fat faces with Doritos, purchasing healthcare from Montel Williams or Billy Mays, or voting for "change you can believe in." For this reason, I try my best, each day, to ward off advertising “persuasion,” which I either avoid like the plague or, like the cynic I am, level as much derision and mockery at as I possibly can.
As for “pathos, logos, and ethos,” I am not the best judge of that, and not only for the reason I just gave; as far as I am concerned, the general tendency of advertising is to aim square at the gut, bypassing reason and ethics altogether. So I suppose my answer would be “gut.” To take one example: while I was watching Headline News today, the newscaster, whoever it was, described, as the show went to commercials, what was coming up in the next segment. A new ad has been circulating on television – an advertisement for vegetables – in which half-naked women, crawling and writhing in a dimly lit hallway, caress broccoli, apples, oranges, etc. You may be asking yourself, “What does that have to do with vegetables?”; my question is, “Where were the bananas and zucchini?” Advertising, as everyone knows, is not intended to sell things; it is merely there to concentrate capital, to make the TNT premiere of “Back to the Future” last beyond three hours, and to give us an excuse to look at women.
That is all.

Tuesday, January 20, 2009

Rhetoric: Facts and Fiction

This afternoon, our Composition class discussed rhetoric, which Aristotle defined as “the faculty of discovering in any particular case all of the available means of persuasion.” Interestingly, though they purported to agree with this view, my class described a much different definition of “persuasion,” one which I’d now like to expound upon and then refute.
First, I will talk about Aristotle’s definition, which I think was grossly misrepresented in class – “the faculty” taken to represent human nature, and “any particular case” to imply every interaction an individual shares with other people. To better understand Aristotle’s view, we would do well to put in context how the Rhetoric was composed. The book was written during two periods in Aristotle’s career: the first (367 to 347 BCE) while he was apprenticed to Plato, and the second (335 to 322 BCE) after he had begun running his own academy, the Lyceum. The Rhetoric is made up of three books, but we may safely concern ourselves only with the first, which lays out the definition of rhetoric. When Aristotle writes about deliberative rhetoric, he is mainly concerned with how to conduct discourse in political matters: war, law-making, economy, commerce. He advocated, specifically, that people not engage in rhetorical discourse about subjects they cannot control. "But the subjects of deliberation are clear, and these are whatever...are within our power. [As judges] we limit our consideration to the point of discovering what is possible or impossible for us to do." What does Aristotle mean by “the subjects of deliberation”? He is referring to the “particular case” in which, once discovered, though not before (that, as he said, would rule out eligibility for rhetoric), one may safely begin to expose concepts to the light of reason and debate. [1] “Particular cases,” as Aristotle considers them, are in this book meant to refer to politics and speech-giving. Here, we come to the classroom discussion.
The tenor of the argument, I am glad to say, was cordial; there wasn’t as much disagreement as I would have liked; but that is beside my point, which is this: Before using the work of an important historical scholar, we would be well advised to read his work first. By that means, we would avoid looking silly afterwards.
The first premise the class put forward was this: human nature is self-evidently evil (selfish), and, as Aristotle supposedly said, people express this preternatural inclination by attempting to exert power over others in every situation in which they come into contact. I won’t attempt to dispute this as Aristotle’s view, for the simple reason that he would have completely disagreed with it. To begin, Aristotle was not framing his definition of persuasion within the context of evil, but rather of goodness, luck, and happiness; when using persuasion, we must realize, he writes, that there is no concrete definition of human values of the sort the class assumed (“human nature is self-evidently evil”). Aristotle argues that people are capable of reaching a common understanding of what an ideal means. By accepting this premise, the public realizes that all ideas are evaluated by the human propensity for goodness ("On Rhetoric", I.4.1359b:4) and the goodness they observe in others. He did not say, as the class did, that people evaluate ideas through the goodness they see in others for selfish, evil reasons.
The second premise the class put forward is related to the first: (1) as human nature is basically pathological, oriented toward its own gratification, we can conclude that (2) all attempts at persuasion are motivated by selfish ends; and (3) as all persuasion is pathological and all communication persuasive, all communication and action are rooted deep down in a nature that is selfish. Here, obviously, we have completely abandoned the subjects of rhetoric and Aristotle, so we may put those aside momentarily. On the first leg of the premise, “human nature is basically pathological”: if I or anybody decided to build an argument about human nature on the foundation that it was good or evil, either attempt would end in miserable failure. The reason is that, by looking around us, we could find whatever form of behavior we like. That isn’t reasoning, but more along the lines of cherry-picking. “All attempts at persuasion are motivated by selfish ends”? I don’t agree, frankly, that all communication is persuasion. Persuasion, for me, implies that an argument is in place; arguments have premises and a conclusion, and usually evidence to support that conclusion. In short, if I say “John F. Kennedy was the greatest president our country ever had” or “The table is red,” I am not making argumentative or persuasive statements. These are general statements, of which life is full. I never said why I thought John F. Kennedy was the greatest president, and perhaps I don’t believe the statement; nor did I give an example of a red table – I don’t have an opinion about it, because the red table I’m thinking of doesn’t exist.
Finally, “all communication is rooted deep down in a faculty that is fundamentally selfish.” Obviously, this doesn’t comply with Aristotle’s idea of the human faculty, but never mind that. I will answer this by stating my own opinion of selfishness, which is that I see no reason why all selfishness is necessarily bad. Here, our class employed what, from my point of view, was a very specious form of argument. The basis was: if you are slightly evil, your essence is evil. The only analogue I can think of to this reasoning is Kirk Cameron’s “Way of the Master” program. “Way of the Master” is a Christian television program, aired on cable access, which deals with the subject of evangelism. On the show, Cameron does a segment in which he approaches common people on the streets and, catching them off guard, proceeds to humiliate them. He asks them inane questions, expecting the same predictable answers. For example: “Have you ever lied before? Yes? Then, by your own admission, you are a liar. Ever stolen? Then that makes you a thief.” Notice, he doesn’t ask, “Have you ever given a present to somebody? Then that makes you generous.” Cameron fails to consider that life may present instances in which lying may be virtuous or, in fact, that there are times in life when people don’t lie. In my opinion, the class made the same mistake, but in this case on the subject of evil.

1. http://plato.stanford.edu/entries/aristotle-rhetoric/

Wednesday, January 14, 2009

A Few Thoughts on Good and Evil

What is Evil? Or, if you are a modern American, what isn’t Evil? I was compelled to answer this question in my Composition course this past Tuesday and, not having expected to confront such a complicated issue, I thought I looked absurd discussing it with the other students. But why? Like many people, my defining moment, the one that has shaped my notion of how depraved and cruel people can be, was the mass murder of the Jews by the Nazis; this was, of course, what several members of the class brought up to illustrate their points, from whatever perspective they represented, however convincingly they made them. The Holocaust is, however, quite unique to the American imagination; there are more Holocaust museums in the United States, though it took place in Europe, than there are museums and memorials commemorating the mass murder of Native Americans (Columbus Day is still celebrated), the enslavement of countless African Americans, or the unjust “internment” of Japanese, German, and Italian Americans during World War 2. While observing this startling phenomenon, of American citizens unwilling or unable to account for their country’s crimes, I cannot help but ask myself: why, on the one hand, can Americans be so indignant toward crimes committed by governments in other countries while remaining largely indifferent to their own historical atrocities? The answer is, in part: a lack of activism and community.
When I say “largely indifferent to its own atrocities,” I am, of course, overstating my point. More conscious than they once were, Americans have made some sort of general recognition of the genocide of the Indians. Great strides have also been made in the case of Black civil rights; the election of an African American president will attest to that. In fact, just looking at the frontrunner candidates of a major party, a black man and a woman, ought to convince one of the strides that have been made in the United States in the past 40 years.
But how were those strides made? Was it a gift from an angel? Or did a bureaucrat decide one day, “I think I’ll get rid of segregation for them”? No. The reason, which isn’t supposed to be discussed, incidentally, is in large part the hard work of the mass activism that took place in the 1960s: people formed groups, developed ideas, determined what they thought and what they wanted to see done, and carried out plans to see those ideas come to fruition.
“But hasn't that lack of discipline, that wanton disregard for authority, led to the current breakdown in values?” Yes, without a doubt, there are many failures to point to around us. But I am not convinced, as I experience the political climate around me each day, that this is purely, or even to a small extent, the result of the activism of the 1960s.
First of all, we must acknowledge that the activism of the 1960s was, for the most part, crushed. There is a very rich record we could look at which lays out in detail the White House’s anti-dissident agenda, both at home and abroad; the most shocking is COINTELPRO (the Counterintelligence Program) [1], perhaps the most dramatic attempt by the political class to crush free thought since the Palmer Raids [2] four decades earlier. My question is: why did those movements require our representatives to crush them? Here, we come back to the problem of evil.
The activism of the 1960s, all progressive activism in fact, has a particular goal: making people aware of oppression. The activist tells us that oppression is not simply foisted on people; it is learned. For instance, a major pro-slavery intellectual, George Fitzhugh [3], made, to the comprehension of Americans in the mid-1800s, many highly compelling arguments for slavery. He said to the Northerners, in essence, “We Southerners aren’t racist because, unlike you, we take care of our slaves. Where you have a system in which capital is concentrated and people are forced to rent themselves to survive, we don’t dispose of our slaves when we are through with them; here, slaves aren’t debased or humiliated; they are treated with respect, fed, educated, allowed to live in peace.” Reading the record now, one sees that many workers took that argument quite seriously. Generally speaking, when people want to make arguments, they don’t simply try to snub others; rather, they develop their ideas along lines that would be generally agreeable to everyone else. That is what Fitzhugh did, that is what his counterparts in the North did, and the result may be that, if unchecked, a great deal of oppression can become ingrained in people’s consciousness. The activist, therefore, calls for a circumstance, an arrangement, in which people can argue and be skeptical in as free a manner as possible or necessary – in other words, democracy.
The case of slavery in the United States is interesting, actually; if we were to ask the typical slave owner whether he thought what he did was evil, I am certain he would say no; he would probably think, for the reasons I enumerated above, that he was free of blame, that he was good and obeyed the law. I can sense myself coming closer to a trap, so I must jump back: because I am saying it appears obvious that values are incredibly variable and indistinct, seeming to come from without in as many instances as from within, it does not follow that I believe a person’s capacity for good or evil is just as changeable.
When observing people, we see they have many ostensible traits, ones that lead us to conclude that they have a moral faculty; what that faculty is, however, is a complete mystery. What we look at may indicate things, but the possibilities are so varied and complicated, no definitive conclusions could be made now about the idea of something like a human moral faculty. On the one hand, there is cruelty, sadism, genocide, sexual abuse; by the same token, there is kindness, mutual support, respect for human rights, mass democratic movements. My contention is that if people are ever to begin to discover the moral faculty, they must by necessity develop a mutual arrangement, one which would maximize the opportunities and incentives for healthy human behavior. I’m speaking, of course, about democracy.

1. http://www.icdc.com/~paulwolf/cointelpro/churchfinalreportIIIa.htm
2. http://en.wikipedia.org/wiki/Palmer_raids
3 http://reactor-core.org/cannibals-all.html