A couple weeks ago, I Googled myself. Admit it: you do it too. But it really is worth doing occasionally if you have become a blogger, because it gives you the illusion that you have some clue as to whether you are being needlessly slandered by others. Strangely, Tenured Radical itself is #5 in terms of hits for "Claire Potter," whereas a paper I gave at the University of Connecticut five years ago is on the top of the list (I suppose because the paper was about J. Edgar Hoover, who is slightly better known than I am.) But imagine my surprise when I saw at spot #3 the phrase: "Claire Potter is arrogant and inflammatory...." Whoa, now. Imagine my further surprise when, upon closer inspection, the post was not located in any place where I am used to being bashed for my politics or my behavior, but on the website faculty love to hate, RateMyProfessors.com.
Since I asked RateMyProfessors to take it down, I reproduce the text posted on April 5, 2008, in full here:
Claire Potter is arrogant and inflammatory, but also one of the most articulate people I've ever met. Her ideas and lectures are interesting, if offensive and disorganized (she sometimes tells the class she hasn't prepared). If she spent more time engaged with students and reality and less writing her blog, perhaps she'd be more palatable. Well perhaps, but not likely.
While I was at it, I took down this one from last semester:
While Potter is very engaging and clearly knows her stuff, she teaches a very biased view of history (that is, liberal feminiest[sic] history). any conservatives are always "they" and there is far more focus on liberal presidencies-with some scoffing over the kooky reagan years. If we're going to teach History, let's at least try to tell the truth. Yes, let's -- noting, for example, that we spent a week on Barry Goldwater and, subsequently, the last three weeks of the semester on the new conservatism and neo-liberal governance. And all historical figures are "they" to me because -- well, they aren't me. You don't need to know much about object relations theory to know that.
Now, why did I have these comments taken down -- a move which is, by the way, only temporary, since permanent deletion occurs at the discretion of the site managers, who have no idea whether the comments accurately reflect an experience in my class? Well, because I consider them inaccurate and malicious, and I don't want them popping up on random Google searches. There actually is also a little matter of truth at stake here. In relation to the first comment, I have never told a class I was unprepared, although strangely, the day before this "evaluation" was posted, I had lectured the class about their uneven preparation, and given them a little in-class, surprise writing exercise. And as for the second comment I took down, also completely wrong: this strikes me as something that may well have been written by someone who knows about me, assumes that I would teach a political history class in a certain way, but was not actually in the class. This is not the first time I have suspected that people who are not my students comment on me, since several years ago I had RateMyProfessors take down a comment for a class I had not even taught. This took several tries, by the way, and the rating itself remains.
To test my theory that ratings could be posted by people who had never been my students, I went to the dreaded site, and registered myself, under my own name, as a Zenith student. Easy-peasy. The only false information I provided was a birth date that made me 19 years old (I wish!) and the box I checked that affirmed my status as a Zenith sophomore. I then successfully added a rating about myself. You can see it here: it's the anxious looking green emoticon that has the comment "interesting." I thought it only fair to add something right down the middle, neither good nor bad. Inflammatory perhaps, but arrogant never, that's my motto.
Now, if I can do this using my own name without setting off any alarm bells on the website, it means that anyone can register as anyone and leave an evaluation -- for anyone -- that says anything. That's right. You could do it from prison if you had internet privileges, or from Afghanistan, if you were just farting around in between avoiding the Taliban. So think about that the next time you go into a class feeling insecure and betrayed because someone has posted something nasty about you on RateMyProfessors: it might not even be one of your students who did it. Furthermore, it makes you wonder -- when most professors have fewer than a dozen ratings, if someone has amassed an unusually high number -- sixty or seventy, say -- who the heck is actually adding them? Can it really be students?
RateMyProfessors does not address the question of verification anywhere on its site, nor does it even suggest that it is possible for a person who is not actually your student to post a review of your teaching. But it does have a page where the site managers tell you what you can do to correct an unfair comment (other than post a video of yourself telling your students how wrong they are, a new feature called "Professors Strike Back" that is designed for those of you longing for your own reality show.) You can also ask that the comment be reviewed by the site managers as false or defamatory, as I did with the comments above: even if they decide not to restore the comment, the numerical rating itself remains -- something they don't tell you. But what else can you do? Why, you can give them more publicity! As they explain:
"The best thing you can do is write an article about the site in your school paper. This ALWAYS has a huge impact on the number of ratings and makes the site become less entertainment oriented (an admission that selling advertising is the actual point of this website -- duh) and more of a resource for helping you plan your class schedule. Other things you can do is post flyers around campus, post links to the site on message boards that are school-related, and email your friends about the site."
Other than friendly advice on how to make a complete fool of yourself, this speaks to what I think is the central feature of the site. Students need to be highly motivated to use it, and normally they are motivated because they really like you or, conversely, because they are getting revenge for some real or imagined insult by writing something spiteful in public. The suggestion that professors themselves should mobilize students to "tell the truth" about their classroom experiences I find just bizarre and manipulative. I mean, who would do that?
But this isn't the only problem with RateMyProfessors from my point of view, or even the one that deserves serious attention. The problem is that, while accepting employment at a college or university means that you agree to the principle that your teaching will be evaluated as part of your contract with the school, none of us has any obligation to have things written about us by God Knows Who on a site that is completely and utterly commercial. Furthermore, the site can do whatever it wants with the ratings -- including, if it desires, publish a book based on the "data" it collects. The Terms of Use state explicitly that everything posted to the site is owned by the site and, while posters are warned that they may not do anything that violates civil or criminal statutes, by rules they themselves have established, RateMyProfessors claims that it may reproduce anything on the site without restriction. I'm sure that this is boilerplate copyright language, and I am equally sure that the site managers' refusal of liability is unenforceable, but the point remains that we -- the faculty -- are the ones whose work and identities are the substance of the site. And we have not been asked for our permission for "data" to be collected and published about us.
But that said, corporations are not this powerful: I think universities and individuals probably have the legal right to have themselves removed from, or to opt-out of, this site. If I knew how to do it I would, just on principle. Like Facebook, it is nothing but a crude marketing device in drag, where getting you on the site and keeping you there for as long as possible is the real point.
Sunday, April 27, 2008
Thursday, April 24, 2008
What Time Is It? It's Senior Thesis Time
Actually, it's past senior thesis time. I advised two wonderful honors students, read five other theses (also real labors of love -- my Zenith students never fail to impress me when they put their hearts into something), awarded a prize or two (or six, depending on who and what you are counting), and I have two more comments to write. Then I am done for this year. And of course, just as we send this group off into the sunset, the juniors start poking their heads around the doorjamb and saying, "Professor Radical -- I'd like to talk about my idea for a thesis?"
It's the time of year that I am full of gratitude: gratitude for students who did their work with integrity and shared it with me, gratitude that a new group has enough faith in my powers of whatever to give it a whirl despite my obvious flaws, and gratitude -- well....
That I wasn't Aliza Shvarts's tutor at Yale!
You knew I had to get around to this story eventually, didn't you? I am proud to say, as a former member of the editorial board of the Oldest College Daily, that the Yale Daily News report on this strange event is still the best and least sensational account of Aliza's senior art project -- which may or may not have happened -- a performance piece for which she claims to have inseminated and aborted herself repeatedly. Yale says it didn't happen; Aliza says it did, although initially (apparently) she said it didn't and then she changed her mind. What we can infer is that she doesn't appear to have any politics whatsoever, since the project was all about her feelings and experiences as she turned her body inside out, but not about abortion per se.
All the same, incoherent and weird as this seems from the sparse and contradictory information that has leaked from meetings where Yale officials are ripping their hair out, subsequent media coverage has just been too bizarre. FOX quotes NARAL communications director Ted Miller's opinion that Shvarts's project "is offensive and insensitive to the women who have suffered the heartbreak of miscarriage" (since when has miscarriage been a critical issue for NARAL?); and National Right to Life Committee President Wanda Franz's view that the Yale senior is "a serial killer."
As FOX says: We Report. You Decide. Just like on Tenured Radical.
Perhaps in an attempt to be helpful, Discover Magazine suggests that Yale test the installation (which never got installed) for Shvarts's DNA, while the New York Times helpfully notes that it could also be tested for the presence of hormones only produced during pregnancy. Oh, let's not and say we did. Maybe instead Yale could gift some DNA testing to some poor schmuck on death row somewhere, now that the Supremes have decided that death by lethal injection is not cruel after all.
Sigh. Such a mess. The Chronicle of Higher Ed has some pretty good coverage, although click here if you want to read ninety-six ways to call someone you don't know, and know almost nothing about, a sicko. However, in yet another moment where women become responsible for all the moral outcomes of reproduction, no one but the OCD is interested in pursuing the question that first popped into my head (after, I admit, "What was her advisor thinking?") which was -- where and how did she collect the alleged sperm? As Yale Daily News reporter Martine Powers explained,
"The 'fabricators,' (nice touch, Martine - sift through these articles, and you will see that I am not the only person who thinks there was no sperm) or donors, of the sperm were not paid for their services, but Shvarts required them to periodically take tests for sexually transmitted diseases. She said she was not concerned about any medical effects the forced miscarriages may have had on her body. The abortifacient drugs she took were legal and herbal, she said, and she did not feel the need to consult a doctor about her repeated miscarriages. (Really? Perhaps because the doctor would have told her she was completely insane. Just a thought. And anyway, she had an advisor in the Art department for that kind of conversation!)
Shvarts declined to specify the number of sperm donors she used, (ok, I'm thinking you loiter around Tang Cup practices, ask for volunteers, plastic cups are right there and can be rinsed out) as well as the number of times she inseminated herself....Pia Lindman, Shvarts' senior-project advisor, could not be reached for comment Wednesday night."
Well, I'll just bet that Professor Lindman was not available for comment. I would have been at home, cell phone at the bottom of the toilet tank, heavily medicated and wishing I had really insisted on those weekly tutorial meetings after all. And my guess is that the next time she is available for comment she will be working somewhere else. Yale -- otherwise known as Oligarch in these pages -- does not take kindly to having been made a fool of.
But my other thought is -- if Shvarts is not deranged, which she might be (it is also worth noting that academics at all levels who are otherwise sane are capable of telling giant, improbable whoppers, so it's hard to know) -- maybe the "abortions" were never the "performance" in the performance art at all -- maybe it's the rest of us responding to it that is the actual art! Now, how radical would that be?
Labels:
Aliza Shvarts,
art,
higher education,
Yale University
Monday, April 21, 2008
Be Afraid of Your Wife: Feminism and the History of Everyday Rage
(Crossposted at Cliopatria)
A Vietnam-era suburban housewife is standing in front of a kitchen counter. She stares calmly and without expression into the camera, as if she is the star of her own cooking show. “Knife,” she intones, displaying a knife in her right hand. With short, violent strokes she stabs the cutting board in front of her. She puts the knife aside. “Measuring cup,” she intones, and begins to flip an invisible liquid into the face of an invisible person. “Nutcracker,” she says, holding up the new implement and snapping it together sharply three or four times before setting it down.
Ouch. “Semiotics of the Kitchen” (1975), one of several short performance pieces produced and filmed by Martha Rosler, shows how ordinary kitchen implements express a woman’s rage, or what Betty Friedan famously called “the problem that has no name.” But Friedan – and other feminist writers – are considerably better known than the many female visual artists who worked for women’s liberation from 1965 on. If you are interested in an understudied, and dramatic, cultural history of second wave feminism, run – do not walk – to see “WACK! Art and the Feminist Revolution,” an exhibit at P.S. 1 in Long Island City, New York.
Curated by Connie Butler, WACK! represents 120 artists, collectives and collaborations in an international mixed media display that stretches from the mid-1960’s into the 1980’s, with the bulk of the exhibits concentrated in the years that defined movement feminism, 1966 through 1975. Collectively, and emphasizing images of the domestic objects that defined bourgeois women’s existence during the late Cold War, the exhibit foregrounds a series of questions that were central to feminist consciousness raising as it distinguished itself from other New Left movements. How does women’s oppression become visible in normal and everyday settings? Under what conditions does domestic patriarchy intersect with other oppressions, such as racism and American imperialism? What do women look like – and how would we know, when images generated by consumer culture construct “womanhood” as an artifact of cosmetic and commercial perfection?
Reading the exhibit as a historian of political feminism, and not as a historian of art, I was nevertheless struck by how difficult it was for these women to be perceived as artists at all when the gritty masculinity of the Cedar Tavern crowd in downtown New York dominated the gallery scene in these years. Entering the exhibit on the first floor, I was immediately drawn to Mary Beth Edelson’s “Some Living American Women Artists” (1972), a photo collage in which a spoof of the Last Supper (Georgia O’Keeffe’s head placed on Jesus’ body, flanked by twelve “apostles” that include Lee Krasner, Louise Bourgeois and Yoko Ono), is framed by miniature photographs of sixty less well known women artists. Further into the exhibit, Serbian performance artist Marina Abramovic’s silent film “Art Must Be Beautiful, Artist Must Be Beautiful” displays the dilemma of recognition for women artists, as Abramovic slashes at her thick, dark hair with a comb and brush for fourteen and a half minutes. Sometimes scraping her face and yanking at herself viciously in this “beautifying” effort, she obsessively mouths the title of the piece.
Even those with casual knowledge of the early years of women’s liberation will recognize one of its central themes, the critique of a consumer culture that urged women to perform a feminine role scripted by others. This later acquired a name, both among radical feminists and gay liberationists: “looksism,” something that women’s liberationists freed themselves from by throwing away girdles, bras, curling irons, make-up and shaving devices. Ann Newmarch’s photographic collage “Look Rich” (1975) centers a clipping from a women’s magazine that urges women to “look rich” while on vacation, and to purchase expensive luggage, so that they can attract potential marriage partners. In block letters, the artist comments: “We must risk unlearning all those things that have kept us alive so long.” In “Beauty Knows No Pain” (1972), Martha Rosler cuts strategic holes in print advertisements for foundation garments and lingerie, inserting pictures of breasts and other body parts, so that the models blatantly display what these feminine garments are intended to both conceal and “sell” to men. Predictably, several exhibits – Ana Mendieta’s photographic series “Untitled (Glass on Body Imprints, 1972)” and Alice Neel’s oil on canvas portrait “Margaret Evans, 1978” – address this theme by showing women’s naked bodies in their most unflattering light, the first distorted by the pressure of glass against skin and hair, and the second distorted naturally by the final stages of pregnancy.
In another register, Betye Saar’s “The Liberation of Aunt Jemima” (1972), one of the few pieces by a black artist in the collection, shows a raised plastic cartoon “Mammy” doll, positioned in front of a background made from multiple pictures of a more comely, domesticated “Jemima” cut from the pancake box. A broom in her right hand, “Jemima” carries a rifle in her left: under one armpit is a pistol, and inset into her capacious skirt is a portrait of yet a third “Jemima,” holding a particularly anxious-looking white baby. Rising up in front of the whole collage is a brown fist, raised in a black power salute.
Although the exhibit offers much to think about, a great deal of it can be best appreciated as a commentary on a time when, as Ruth Rosen has put it, “The world turned upside down.” One series, however, bridged past and present for me by its insistence that prosperity at “home” is inevitably linked to a violent foreign policy: Rosler’s series of collages (1967-72) that contrast everyday household scenes with a parallel world of U.S. imperialism in Viet Nam. In “Red Stripe Kitchen,” two GI’s are rummaging through a suburban American kitchen, one peering around a doorjamb, perhaps looking for insurgents; the other is thoughtfully removing a rocket from a kitchen cupboard, as if it were a favorite chafing dish. In another brightly-colored collage, an American woman in a seductive pose is reflected in a bedroom mirror, but out the window and on television we can see artillery firing in black and white. In a third, a napalmed Vietnamese woman cradles her baby in her arms as she runs through a bright, sunny living room carpeted in white shag. Yet another features a photograph of a brick suburban ranch house; on the curb, a private soldier pauses in between firefights for a cigarette break.
The exhibit, in its insistence that viewers question the “normal” – or the connections between what we consider normal and the violence to self and other that the normal conceals – reinforced the intellectual links in my mind between second wave feminist theory and queer theory. But the exhibit also does the important work of linking feminism to a variety of New Left movements, and demonstrates visually how much intellectual exchange there was between feminism and other political impulses, even as feminist activists began to narrow their focus to violence against women by the mid-1970’s. As the 1969 proposal for Mierle Laderman Ukeles’ performance piece “Washing/Tracks/Maintenance” (1973), in which she described how she would live and clean publicly every day in a gallery, asked: “after the revolution, who’s going to pick up the garbage Monday morning?”
Since the revolution didn’t come, we never got to find out – although we could probably guess, which was Ukeles’ point. But for historians who are interested in what a lesser-known feature of the women’s movement looked like, this exhibit is a gem: don’t miss it.
“WACK! Art and the Feminist Revolution” is at P.S. 1 Contemporary Art Center, 22-25 Jackson Ave. at 46th Ave., Long Island City, New York, 11101, through May 12. Open 12-6, Thursday – Monday. Admission is $5.00.
A Vietnam-era suburban housewife is standing in front of a kitchen counter. She stares calmly and without expression into the camera, as if she is the star of her own cooking show. “Knife,” she intones, displaying a knife in her right hand. With short, violent strokes she stabs the cutting board in front of her. She puts the knife aside. “Measuring cup,” she intones, and begins to flip an invisible liquid into the face of an invisible person. “Nutcracker,” she says, holding up the new implement and snapping it together sharply three or four times before setting it down.
Ouch. “Semiotics of the Kitchen” (1975), one of five short performance pieces produced and filmed by Lynda Begler, shows how ordinary kitchen implements express a woman’s rage, or what Betty Friedan famously called “the problem that has no name.” But Friedan – and other feminist writers – are considerably better known than the many female visual artists who worked for women’s liberation from 1965 on. If you are interested in an understudied, and dramatic, cultural history of second wave feminism, run – do not walk – to see “WACK! Art and the Feminist Revolution,” an exhibit at P.S. 1 in Long Island City, New York.
Curated by Connie Butler, WACK! represents 120 artists, collectives and collaborations in an international mixed media display that stretches from the mid-1960’s into the 1980’s, with the bulk of the exhibits concentrated in the years that defined movement feminism, 1966 through 1975. Collectively, and emphasizing images of domestic objects that defined bourgeois women’s existence during the late Cold War, the exhibit centers a series of critical questions that were central to feminist consciousness raising as it distinguished itself from other New Left movements. How does women’s oppression become visible in normal and everyday settings? Under what conditions does domestic patriarchy intersect with other oppressions, such as racism and American imperialism? What do women look like – and how would we know, when images generated by consumer culture construct “womanhood” as an artifact of cosmetic and commercial perfection?
Reading the exhibit as a historian of political feminism, and not as a historian of art, I was nevertheless struck at how difficult it was for these women to be perceived as artists at all when the gritty masculinity of the Cedar Tavern crowd in downtown New York dominated the gallery scene in these years. Entering the exhibit on the first floor, I was immediately drawn to Mary Beth Edelson’s “Some Living Women Artists” (1972), a photo collage in which a spoof of the Last Supper (Georgia O’Keefe’s head placed on Jesus’ body, surrounded on either side by twelve “apostles” that include Lee Krasner, Louise Bourgeois and Yoko Ono), is framed by miniature photographs of sixty, less well known, women artists. Further into the exhibit, Serbian performance artist Marina Abramovic’s silent film “Art Must Be Beautiful, Artist Must Be Beautiful” (1973) displays the dilemma of recognition for women artists, as Abramovic slashes at her thick, dark hair with a comb and brush for fourteen and a half minutes. Sometimes scraping her face and yanking at herself viciously in this “beautifying” effort, she obsessively mouths the title of the piece.
Even those with casual knowledge of the early years of women’s liberation will recognize one of its central themes, the critique of a consumer culture that urged women to perform a feminine role scripted by others. This later acquired a name, both among radical feminists and gay liberationists: “looksism,” something that women’s liberationists freed themselves from by throwing away girdles, bras, curling irons, make-up and shaving devices. Ann Newmarch’s photographic collage “Look Rich” (1975) centers a magazine clipping from a women’s magazine that urges women to “look rich” while on vacation, and to purchase expensive luggage, so that they can attract potential marriage partners. In block letters, the artist comments: “We must risk unlearning all those things that have kept us alive so long.” In “Beauty Knows No Pain” (1972), Martha Rosler cuts strategic holes in print advertisements for foundation garments and lingerie, inserting pictures of breasts and other body parts, so that the models blatantly display what these feminine garments are intended to both conceal and “sell” to men. Predictably, several exhibits – Ann Mendieta’s photographic series “Untitled (Glass on Body Imprints, 1972)” and Alice Neel’s oil on canvas portrait “Margaret Evans, 1978” – address this theme by showing women’ naked bodies in their most unflattering light, the first distorted by the pressure of glass against skin and hair, and the second distorted naturally by the final stages of pregnancy.
In another register, Betye Saar’s “The Liberation of Aunt Jemima” (1972), one of the few pieces by a black artist in the collection, shows a raised plastic cartoon “Mammy” doll, positioned in front of a background made from multiple pictures of a more comely, domesticated “Jemima” cut from the pancake box. A broom in her right hand, “Jemima” carries a rifle in her left: under one armpit is a pistol, and inset into her capacious skirt is a portrait of yet a third “Jemima,” holding a particularly anxious-looking white baby. Rising up in front of the whole collage is a brown fist, raised in a black power salute.
Although the exhibit offers much to think about, a great deal of it can be best appreciated as a commentary on a time when, as Ruth Rosen has put it, “The world turned upside down.” One series, however, bridged past and present for me by its insistence that prosperity at “home” is inevitably linked to a violent foreign policy: Rosler’s series of collages (1967-72) that contrast everyday household scenes with a parallel world of U.S. imperialism in Viet Nam. In “Red Stripe Kitchen,” two GI’s are rummaging through a suburban American kitchen, one peering around a doorjamb, perhaps looking for insurgents; the other is thoughtfully removing a rocket from a kitchen cupboard, as if it were a favorite chafing dish. In another brightly-colored collage, an American woman in a seductive pose is reflected in a bedroom mirror, but out the window and on television we can see artillery firing in black and white. In a third, a napalmed Vietnamese woman cradles her baby in her arms as she runs through a bright, sunny living room carpeted in white shag. Yet another features a photograph of a brick suburban ranch house; on the curb, a private soldier pauses in between firefights for a cigarette break.
The exhibit, in its insistence that viewers question the “normal” – or the connections between what we consider normal and the violence to self and other that the normal conceals – reinforced the intellectual links in my mind between second wave feminist theory and queer theory. But the exhibit also does the important work of linking feminism to a variety of New Left movements, and demonstrates visually how much intellectual exchange there was between feminism and other political impulses, even as feminist activists began to narrow their focus to violence against women by the mid-1970s. As the 1969 proposal for Mierle Laderman Ukeles’s performance piece “Washing/Tracks/Maintenance” (1973), in which she described how she would live and clean publicly every day in a gallery, asked: “after the revolution, who’s going to pick up the garbage Monday morning?”
Since the revolution didn’t come, we never got to find out – although we could probably guess, which was Ukeles’s point. But for historians who are interested in what a lesser-known feature of the women’s movement looked like, this exhibit is a gem: don’t miss it.
“WACK! Art and the Feminist Revolution” is at P.S. 1 Contemporary Art Center, 22-25 Jackson Ave. at 46th Ave., Long Island City, New York, 11101, through May 12. Open 12-6, Thursday – Monday. Admission is $5.00.
Labels:
feminism,
Political History,
popular culture,
reviews
Wednesday, April 16, 2008
Return to Sender; or, the Art of Rejection
In response to recent accusations of smuggery, I would like to say that, although I occupy a privileged position in the world, I am still subject to rejection from time to time. I hate rejection. It makes me feel unwanted. I hate it when students reject me by writing mean teaching evaluations. It makes me feel misunderstood and resentful. Fortunately it doesn't happen very often.
I have had to get used to rejection, though, since between my exalted position as Chair of the Program and the never-ending project of keeping my scholarly life vital, I have to apply for things constantly -- internal to Zenith as well as external -- and, as they say, you can't win 'em all. One year, during the Unfortunate Events, because members of my department were giving me the Big Raspberry and because I couldn't really sleep, I applied for everything under the sun: five jobs, three year-long fellowships, and a tiny research fellowship that the actual fellowship committee chair at the archive had urged me to apply for. It was that last one that tore it -- I sat down on my front stairs and wept. Never, I vowed, NEVER will I apply for anything again -- not even an extension on my taxes.
I got over it. And now, because of "my privilege," as my Zenith students would put it, and because I am really an optimist, I apply for things all the time without worrying much whether I will get them or not. Jobs, fellowships, symposia where your work has to be accepted. And sometimes I do get them, or in the case of jobs, I get a nibble here and there. This is an ideal outcome, by the way, if you are already employed in a good situation: I know I am appreciated, but I don't have to move or say goodbye to my friends. Although I must admit, my friends seem to say goodbye to me with regularity -- another story, for another day.
But because I have been getting a few rejection letters myself, and because I recently sent sixty or so of them, I have been following the discussion in the blogosphere about rejection letters rather avidly. These are people being rejected for tenure track jobs -- one fellow apparently took to papering the wall of the TA lounge with them. This is what I have learned: many of you do not do such a great job rejecting people -- some of you never send a thing, assuming that time will pass and people will just get it after a while that they aren't going to be interviewed. So pay attention, search chairs of 2008-09:
1. Do not send rejections by email.
2. Do not send rejections by post card.
3. When writing a letter to candidates, if you actually met them, or solicited the candidacy, take two seconds to write a personal note. This means not having your departmental secretary sign them, of course.
4. Send rejections in a timely way: at least when the search is over, if not before. In fact, although wisdom has it that you reject no one until the chosen candidate has signed on the dotted line, truth be told, a large part of the pool is out of the running after the first cut. Why not tell the people who didn't make the semi-final cut -- say, in January, rather than April?
I suspect I get the special handling variety of rejection letters because of the rank thing, but there is no reason that has to be so. Being respectful to job candidates goes a long way, from my perspective. I would like to say that this year I got no response from one search chair, one ordinary "We hired blah blah blah..." letter, and two particularly nice letters, both from (cough, cough) women. Odd coincidence, no?
One letter -- that came by email, true, but I don't really care -- said it was so good to have read my materials because there might be a position that suited me better some day (isn't that nice? It was the position that didn't fit the candidate, not the candidate that didn't fit the position. Sweet.) The other said that I was not a candidate because of how the position was ultimately defined, not because of "any deficiencies on [my] part." Again -- nice.
Of course, the last letter struck exactly the right note, since this is the letter I always fear I will receive:
Dear Professor Radical,
Are you kidding? We have people defending dissertations who look better on paper than you do. And full professor? Puh-lease.
Sincerely,
Sherman Pinprick
Distinguished Tinky Winky Chair of Queer Studies
Venerable University
P.S. You also didn't get the job because of your stinking blog.
Labels:
the Job Market,
the Progress of the Radical,
writing
Sunday, April 13, 2008
Congratulations Brian Donovan
Since the Radical is now associated in the public mind with all things tenure, I noted with pleasure last week that The Chronicle of Higher Education had linked me with this YouTube video, which is one of the most perfect visual conceits I have ever seen.
Now, I ask you -- how funny is this? Very funny. It also makes me jealous that he knows how to make a video like this and I don't. Finally, it reminds me that one of the things I love about being an academic is the wit and the high jinks. Other than professional comedians, the only group of people who are funnier are people who work in advertising.
So just in case you think the Radical has fallen into the trap common to radicals everywhere -- in other words, taking her political positions so seriously that she loses her sense of fun (North Korea is a good example of this error, as was the second incarnation of the radical feminist collective Redstockings) -- let's give a big round of applause for Brian and everyone else who made it through the tenure process this year.
And for those of you who were made to walk Spanish: take heart. Life will go on, most likely even better than before.
Labels:
tenure
Friday, April 11, 2008
Thinking War, Thinking History: A Short Review Essay
(Editor's Note: every once in a while, someone who follows the Tenured Radical is Reading feature asks what I think about a book I have read. Mostly I don't say, since it would be a great burden to review everything, and because there is a reason book review sections have editors. However, as I am formally inaugurating my relationship with Cliopatria today, I thought this cross-posted essay might be a good start.)
Perhaps it is an effect of the fifth anniversary of the war in Iraq, or perhaps it is the impending retirement of my American Studies colleague Richard Slotkin, but I seem to be reading more about war and violence this year than I have in the last decade.
Following on William A. Williams’ Tragedy of American Diplomacy (1959), a book that sought to understand how the project of democracy could be simultaneously well intentioned and destructive, a few scholars of the United States – Slotkin among them -- proposed that the history of violence was central to the formation of an “American” identity. Critical to this was a re-examination of the intellectual relationship between American democracy and the frontier, articulated originally by Frederick Jackson Turner (1893). Slotkin, for example, argued over the course of three volumes that it was not the frontier itself, but rather the naturalization of violence particular to the “frontier” -- on the Great Plains, in southeast Asia, or wherever Euro-Americans encountered racial “others” -- that shaped and re-shaped American culture and politics. His most recent book, The Lost Battalion: the Great War and the Crisis of American Nationality (2005) explores another, paradoxical, feature of this history of national violence: the exclusion of African Americans from democratic rights in 1919, despite black soldiers’ hopes that participation in World War I might result in full citizenship.
In the spirit of what I have cited above, I would like to take special note of four books I have read recently. Each pursues important questions about the history of violence, war and nationalism that are useful to us as historians and as critical thinkers about the contemporary United States.
The first volume I would recommend is Ned Blackhawk’s Violence Over the Land: Indians and Empires in the Early American West (Cambridge: Harvard University Press, 2006). Blackhawk is part of a generation of Native scholars who are making radical interventions in American history. Violence Over the Land demonstrates the consequences of European imperial ambitions for Native North Americans from the sixteenth century onward. But it is the Indian empires of the Great Basin, primarily the Ute, Paiute and Shoshone nations, their strategies, economies and political cultures, which are central to this history of American borderlands. It is also significant to note that Blackhawk, following Richard White’s influential The Middle Ground: Indians, Empires and Republics In the Great Lakes Region, 1650-1815 (1991), challenges the idea of an American “frontier” that moved in a linear way. Rather, Indians negotiated their survival on multiple, overlapping frontiers. Simultaneously, intermingled Americans, English, French and Spanish colonists and entrepreneurs created the political, economic and environmental conditions for the success of a United States imperial project in the Far West well before it was fully launched in the second half of the nineteenth century.
The second book on my list is a cultural and political history of death that has already received well-deserved attention, Drew Faust’s This Republic of Suffering: Death and the American Civil War (New York: Alfred A. Knopf, 2008). Faust asks a critical question that could usefully be asked about other wars and different forms of social violence: what is the impact of unimaginable death on a society? And how do people assimilate, and learn to cope with, a way of dying that is new to them? Of course, this is a particularly relevant question for the Civil War because, as Faust points out, neither the United States nor the Confederacy had any reason to imagine in 1861 that death might occur on such a scale. Furthermore, the idea that family members might die anonymously far away from the comfort loved ones normally provided, that soldiers might be dismembered and/or buried in haste, was unimaginable to antebellum bourgeois Americans who idealized a “good death.” Asking us to think about the everyday consequences of war to those who bear the brunt of it seems particularly relevant at a moment in time when the United States military works assiduously to keep the dead and wounded from Iraq and Afghanistan uncountable and hidden from view.
My third pick is historical fiction: Pat Barker’s Life Class (New York: Doubleday, 2008). Barker is a British novelist who won the Booker prize for The Ghost Road (1995), the third volume of her trilogy about World War I. This book views the Great War through the eyes of a group of friends whose studies at the Slade are interrupted in 1914. As in Faust’s book, the grisly descriptions of soldiers’ wounds point to the horrifying details of battlefield violence that individualize death through detail: for example, the slight – but consequential -- fact that on the Western Front men died of injuries that might have healed, had dirt teeming with untreatable microbes not been blown into the wounds. One cannot help but think of the contemporary phenomenon of suicide bombers, or IEDs exploded next to groups of soldiers and civilians, during which microscopic bits of human flesh penetrate and infect the skins of those who survive the attack. But the intellectual questions the book asks also should compel historians: does war awaken sensibilities that give art depth and meaning it might not otherwise have? Under what conditions does an aesthetic approach to war trivialize its violence? And under what conditions might any of us – as one female artist in the novel does – decide not to “see” the violence of war at all, and choose to focus on art instead?
My final pick is Cathy Wilkerson’s Flying Close to the Sun: My Life and Times as a Weatherman (New York: Seven Stories Press, 2007), and I make this recommendation as someone who thinks virtually all memoirs from the radical anti-war movement are oddly annoying historical documents. Bill Ayers’s book, Fugitive Days, which had the bad luck to be published on September 11, 2001, romanticizes the violence and overestimates the political impact of Weatherman; Jane Alpert’s Growing Up Underground (1981) is simultaneously apologetic and too quick to pin responsibility on her co-conspirators. Furthermore, no former activist can write honestly about what happened, since there are people who might still be harmed by what is revealed. But what I love about the Wilkerson book is that it asks the question: how did someone who cared so deeply about peace and justice come to embrace violence as a political necessity? Wilkerson embeds the answer in autobiography, in an excellent social history of the New Left, and in reflections on the life-long burdens she has had to come to terms with for having chosen violence over peaceful opposition to a deeply immoral war.
Labels:
History News Network,
Iraq
Monday, April 07, 2008
The Check is in the Mail; or, What to Do With an Honorarium
Because of a lovely speaking gig in the Midwest last week, I find myself in possession of an honorarium this week. Furthermore, because I won a prize in March for my article on Miss Mary Hoover (it was the Audre Lorde prize, given by the Committee for Lesbian and Gay History, an AHA affiliate), I was in possession of another, smaller, check. But it was a check all the same, one that was not only unexpected, but really a bonus, since who ever expects to make any money on an academic article? Much less win a prize, so that now the "awards and fellowships" section of my vita actually has an award on it? Last, but not least, I expect George Bush to send me a very large check later this month, most of which will go to pay my bloated Shoreline property taxes in June, but there will still be some left over. What do you do in such a situation of excess, dear? I mean, after buying a flat screen TV so that our nephews no longer mock us because we are still happy with a twelve-inch Trinitron?
Well, I always start by saying: "Thank you." Thank you Goddess, for my good health, and the job that allows me to write rather than drive to six colleges every week to teach twelve courses a semester; and thanks to those of you who took the time to judge the Audre Lorde prize and give it to me. Thanks to the journal that published Miss Thing after two journals rejected her. Thanks to the Director of Women's Studies at Little Midwestern College, one of the many unsung heroes of higher education who, this spring, spent a lot of time arranging my visit when she could have been doing her own writing. Thank you fate, not just for getting me a nice job, but one that pays well (my rants against tenure have revealed that many of you out there, of similar rank and accomplishment, make half to two-thirds what I do.) Thank you that I am not being foreclosed on, or paying for chemotherapy with my tax return. Thank you, George W. Bush, for not spending every single dime I sent you on outsourcing the war at premium prices.
But now that the thanking is done, here's the question: what to do with the extra money? If I were a graduate student or an adjunct, I would probably buy food. But I have a salary that pretty much takes care of that, plus the utilities and the car insurance. So what to do with windfall profits?
I put them in my Vanguard Roth IRA.
You see, I had one of those old-fashioned fathers, who would not have understood the current world of credit and debt we live in: he had an American Express card, which he paid every month. He is the only person I have ever heard of who paid cash for a house (ok, Tony Soprano did, but my dad was a gastroenterologist.) The idea that you would take a loan on your home to consolidate your credit card bills would have appalled him. And he was the kind of father who said, when I graduated from college, "Just remember: when you get a paycheck, pay yourself first." What this meant was, no matter how little you can save, save it anyway. People like Suze Orman make zillions from giving this kind of advice now. My father wrote a book called A Better Life With Your Ulcer, which, despite the snazzy title, did not make zillions.
Of course, I completely ignored my father's advice for many years, and for the same reason many of you will ignore mine now: during years when I was scrounging under the sofa for pizza money, or even later when the mortgage for our first house in Zenith plus the rent in New York was more or less cleaning me out -- like many of you -- I didn't have any money. Then when I did have money, I started amping up my TIAA-CREF (word to the wise: do not even open your TIAA-CREF statements for a few years. It's bad right now.) But a few years back, I found myself in possession of a chunk of cash for which there was no earthly use except perhaps to take a ski trip. Instead I bought a Roth IRA. And now, whenever I get a royalty check (always small), or a speaking fee (bigger, but not as large as Bill Clinton's), or any other unexpected windfall, I send it off to Vanguard, on the theory that $100 saved today will grow exponentially by the time I take it out at age 70. Why do I recommend this policy? Because even with the instability of the current market, I am over $7,000 ahead of my original investment after only five years of this personal savings plan. By my calculations, and adding a social security payment of a little more than two large after age 67, that is an extra three months of retirement at my current after-tax budget.
And actually, Vanguard shares have only dropped four dollars over the past year, so it's really ok to open the envelope.
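For the skeptics, the "grow exponentially" arithmetic is easy to check yourself. Here is a minimal sketch in Python; the 5% annual return and the lump-sum-compounded-yearly simplification are my illustrative assumptions, not Vanguard's numbers or anyone's guarantee:

```python
# Rough sketch of the "pay yourself first" arithmetic: a single deposit
# left alone compounds once per year at an assumed (hypothetical) rate.
def future_value(principal, annual_rate, years):
    """Future value of a lump sum compounded annually."""
    return principal * (1 + annual_rate) ** years

# $100 put away today, at an assumed 5% per year, held for 20 years:
print(round(future_value(100, 0.05, 20), 2))  # 265.33
```

Even at that modest assumed rate, the $100 check more than doubles by the time you cash out; a windfall invested at 40 instead of spent has three decades to do this, which is the whole point of the policy.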
In other news, I am pleased to say that I have been invited -- and have accepted the invitation -- to join Cliopatria, a group history blog hosted by George Mason's History News Network. Don't read it yet? Well, if you snooze, you lose, that's all I can say. Stay tuned.
Well, I always start by saying: "Thank you." Thank you Goddess, for my good health, and the job that allows me to write rather than drive to six colleges every week to teach twelve courses a semester; and thanks to those of you who took the time to judge the Audre Lorde prize and give it to me. Thanks to the journalthat published Miss Thing after two journals rejected her. Thanks to the Director of Women's Studies at Little Midwestern College, one of the many unsung heroes of higher education who, this spring, spent a lot of time arranging my visit when she could have been doing her own writing. Thank you fate, not just for getting me a nice job, but one that pays well (my rants against tenure have revealed that many of you out there, of similar rank and accomplishment, make half to two-thirds what I do.) Thank you that I am not being foreclosed on, or paying for chemotherapy with my tax return. Thank you, George W. Bush for not spending every single dime I sent you on outsourcing the war at premium prices.
But now that the thanking is done, here's the question: what to do with the extra money? If I were a graduate student or an adjunct, I would probably buy food. But I have a salary that pretty much takes care of that, plus the utilities and the car insurance. So what to do with windfall profits?
I put them in my Vanguard Roth IRA.
You see, I had one of those old-fashioned fathers who would not have understood the current world of credit and debt we live in: he had an American Express card, which he paid off in full every month. He is the only person I have ever heard of who paid cash for a house (OK, Tony Soprano did, but my dad was a gastroenterologist.) The idea that you would take a loan on your home to consolidate your credit card bills would have appalled him. And he was the kind of father who said, when I graduated from college, "Just remember: when you get a paycheck, pay yourself first." What this meant was: no matter how little you can save, save it anyway. People like Suze Orman make zillions from giving this kind of advice now. My father wrote a book called A Better Life With Your Ulcer, which, despite the snazzy title, did not make zillions.
Of course, I completely ignored my father's advice for many years, and for the same reason many of you will ignore mine now: during the years when I was scrounging under the sofa for pizza money, or even later when the mortgage for our first house in Zenith plus the rent in New York was more or less cleaning me out -- like many of you -- I didn't have any money. Then when I did have money, I started amping up my TIAA-CREF (word to the wise: do not even open your TIAA-CREF statements for a few years. It's bad right now.) But a few years back, I found myself in possession of a chunk of cash for which there was no earthly use except perhaps to take a ski trip. Instead I bought a Roth IRA. And now, whenever I get a royalty check (always small), or a speaking fee (bigger, but not as large as Bill Clinton's), or any other unexpected windfall, I send it off to Vanguard, on the theory that $100 saved today will grow exponentially by the time I take it out at age 70. Why do I recommend this policy? Because even with the instability of the current market, I am over $7,000 ahead of my original investment after only five years of this personal savings plan. By my calculations, and adding a Social Security payment of a little more than two large after age 67, that is an extra three months of retirement at my current after-tax budget.
And actually, Vanguard shares have only dropped four dollars over the past year, so it's really OK to open the envelope.
In other news, I am pleased to say that I have been invited -- and have accepted the invitation -- to join Cliopatra, a group history blog hosted by George Mason's History News Network. Don't read it yet? Well, if you snooze, you lose, that's all I can say. Stay tuned.
Wednesday, April 02, 2008
Tenure, Tenure -- Who's Got the Tenure?
Apparently not a number of women at Baylor. In yesterday's Inside Higher Ed, Scott Jaschik reports that in a move that is spreading through colleges and universities like the herpes virus, Baylor University administrators decided to "raise standards" for tenure this year, subsequent to the cases being prepared and submitted. They did this without informing anyone who would be affected by it, or the tenured faculty, for that matter. This means that probationary faculty who did what they were asked to do, and even those who might have looked beyond their department for a second opinion as to how to meet the bar, were hammered. Tenure denials went from ten percent to forty percent. Two-thirds of the women up for promotion were denied.
The story is a follow-up on a Monday story that features yours truly. Many hits to Tenured Radical (almost 700) linked from Scott's Monday piece, but no comments were left: strange. However, I would report three interesting features of the comments left at IHE. First, they are overwhelmingly pro-tenure (fair enough, I understand that), but almost all aimed at the protection of academic freedom during the post-tenure years. Second, few are interested in what I think is a key question: what is the effect of the tenure process on young scholars, and how do we protect their academic freedom? Few people are interested in unionization as an alternative, either. A third feature of these comments is that, as one of the commenters pointed out, although they are overwhelmingly from tenured people, virtually all are written pseudonymously. Which suggests, as this commenter pointed out, that they don't feel their speech is well protected by tenure, or that something else -- a sense of one's reputation being fragile -- is provoked by even talking about tenure.