Merit-hypocrisy in the Air

April 18, 2015

Meritocracy cartoon, October 29, 2010 (Josh C. Lyman via http://www.clibsy.com/).

One of the hardest ideals for me to give up on in all of my life has been the idea of meritocracy. Even when I couldn’t spell the word, much less define it or use it in a sentence, I believed in this ideal. It was the driving force behind my educational progression from the middle of fourth grade in January ’79 until I finished my doctorate in May ’97. The meritocratic ideal even guided me in my career, in both academia and the nonprofit world. Only to realize by the end of ’09 what I had suspected, but ignored, for many years: my ideal of a meritocracy is shared by only a precious few, and the rest give lip service to it before wiping it off their mouths, concealing their split lips and forked tongues with nepotism instead.

Being the historian I am — which people like Jelani Cobb have joked about on Twitter as a curse — I am programmed to look back at situations in my own life to look for root causes, to understand what I can do to not repeat my own mistakes, my not-so-well-planned decisions. I’ve thought about my advisor Joe Trotter and my dissertation committee of Trotter, Dan Resnick (husband of education researcher Lauren Resnick) and Bruce Anthony Jones. The biggest mistake I made was in putting this hodgepodge committee of an HNIC advisor, a racial determinist and a closeted wanderer together to help guide me through my dissertation and then into my first postdoctoral job.

Aaron Eckhart as the main character in I, Frankenstein (2014), August 12, 2013. (http://sciencefiction.com/).

Of course, I didn’t know enough about these men to describe them this way, certainly not until I’d graduated and couldn’t find full-time work for more than two years. The signs, though, were there. Trotter’s unwillingness to recommend me for any job before the first draft of my dissertation was complete (it took me two weeks to revise my dissertation from first to final draft). Resnick calling my dissertation writing “journalistic” and saying that my nearly 2,000 endnotes and thirty pages of sources were “insufficient.” Bruce pulling back on his schedule with me even before taking the job at the University of Missouri at Columbia in July ’96.

None of this had anything to do with my work. It was about me, whether I as a twenty-six-year-old had suffered enough, had gone through enough humiliation, to earn a simple letter of recommendation for a job. When Trotter finally decided it was time to write me a letter of recommendation, it was December ’96, and the job was at the University of Nebraska-Omaha, “subject to budget considerations,” meaning that it could (and it did) easily fall through. Resnick flat-out refused to share anything he wanted to write about me, with all his “confidentiality” concerns, while I had to write my own letters for Bruce. It was a disaster, and none of it had anything to do with the quality of my work as a historian, educator, or academic writer.

The work I ended up getting after Carnegie Mellon was the result of my dissertation, my teaching experiences, and my networking. The sense that I’d earned my spot, though, was still lacking in the places in which I worked, particularly at Presidential Classroom, where I was the token highly educated Negro on staff, and at the Academy for Educational Development (AED), where I worked with the New Voices Fellowship Program. In both cases, I had bosses whose racial biases only became clear once I began working with them. The then-executive director Jay Wickliff never cared about the quality of my work or my degrees. Wickliff’s only concern was that I should keep my mouth shut when he acted or spoke in a racist manner.

My immediate supervisor Ken, on the other hand, wanted all the credit for work I did under him, except in cases when he deemed my methods “not diplomatic enough.” Even before his bipolar disorder led him to a psychological breakdown, Ken regularly accused me of gunning for his position, sometimes turning red whenever he heard about my latest publication, teaching assignment or conference presentation. I had to fight to keep my job and to move on within AED in those final months of ’03 and early ’04, a fight that had zero to do with merit.

Dixie Biggs, Lip Service teapot, April 19, 2015. (http://pinterest.com).

I say all this because the one thing that every one of these folks had in common was their lip service to the belief that hard work and results are the keys to success and career advancement. Yet for every one of them, the merit I had earned didn’t matter. My relative youth, my race, my heterosexual orientation, even my achievements, either scared them or gave them reason to have contempt for me.

I say all of this because in the past eleven years, I have been very careful about the company I keep, about the mentors I seek, about the friends I make, personally and professionally. I went from not trusting anyone as a preteen and teenager to trusting a few too many folks in my twenties and early thirties, all because I believed that my hardworking nature and talent mattered more than anything else. What has always mattered more is who you know, especially in high places like academia and with large nonprofits and foundations. So, please, please, please be careful about the supposedly great people you meet. Many of them aren’t so great at all.

That’s why the idea that academia is a place full of progressive leftists is ridiculous. Yes, people like Dick Oestreicher, Wendy Goldman, Joe Trotter and so many others wrote and talked about progressive movements and ideals while I was their student. But fundamentally, they couldn’t have cared less about the actual human beings they worked with and advised, particularly my Black ass. Their ideals stopped the moment they ended their talk at a conference or wrote the last sentence of a particular book. They only cared about people they could shape and mold into their own image. And that’s not meritocracy. That’s the ultimate form of nepotism.


Before and After Spencer

April 14, 2015

Seattle Seahawks’ Jermaine Kearse making a great catch off a tipped ball while on the ground on the final drive of Super Bowl XLIX, Glendale, AZ, February 1, 2015. (http://reddit.com).

This week marks twenty years since the now-retired Catherine Lacey called me up on a Friday morning while I was brushing my teeth to tell me that I’d been selected to be a Spencer Foundation Dissertation Fellow for the 1995-96 year. I’d hoped and prayed for that day for more than twenty months, after my fellowship and teaching plans for the summer of ’93 fell through. But I’ve talked about Catherine Lacey and some of my Spencer experiences already, as well as about the reaction of Joe Trotter and some of my Carnegie Mellon grad school mates to this news.

This post is about the days before I received Lacey’s call, before I knew that I would be on the fast track to a doctorate. Because before I’d been selected for the Spencer Dissertation Fellowship, the selection committee had rejected me, with a 6-1-1 vote (that’s six in favor, one not in favor, and one abstaining). I knew this because Catherine had sent me a rejection letter with a handwritten note at the bottom of it, one that I received after two months away in DC doing my dissertation research. My suspicion was that most of the Fellows had received an 8-0 or 7-1 selection vote.

That was all on March 31, ’95. Catherine’s note, though, was encouraging. She said to “stay tuned,” that she was “looking into other alternatives.” So there was still a chance that I’d get the fellowship. Still, I didn’t want to do what I did two years earlier, when assumptions and hope led me to six weeks of joblessness and an eviction notice.

John Hancock Center, Downtown Chicago – The Spencer Foundation is on the 39th Floor, April 14, 2015. (http://milenorthhotel.com).

So I did what I’ve done best throughout my work experiences. I scrambled to make sure I had work during the summer and upcoming school year. I didn’t want to be stuck borrowing more in student loans or teaching more of Peter Stearns’ version of World History courses — really, World Stereotypes — for entitled CMU freshmen.

I talked with both then-associate provost (and eventual mentor) Barbara Lazarus and John Hinshaw, a fellow but further-along grad student, about me taking over his job as a part-time assistant to Barbara. John really wanted to finish his dissertation and move on (who could blame him, given that Trotter was his advisor as well), and Barbara would’ve liked me for the job. So I gave them both a tentative yes, knowing that the job was contingent on John’s timetable for leaving it and finding an academic job elsewhere, all while completing his dissertation.

The thought occurred to me, though, that I might need more than a 15-20-hour-per-week job to get through the dissertation stage. Especially if I was to avoid teaching for the mercurial Stearns again. So I scheduled a meeting with Trotter to see if he had any research projects he needed help with.

We met at 2 pm on Thursday, April 13. Trotter was as excited about us meeting as he had been when I first decided to transfer to Carnegie Mellon to work with him as my advisor two and a half years earlier. He had at least three migration studies projects for which he wanted my labor. All the projects were about extending his grand proletarianization thesis. All would be dreadfully boring drudgery compared to my dissertation, but they would keep me in additional paychecks for a year or two. I faked a smile, and tentatively said yes to Trotter as well.

Dikembe Mutombo putting the wood to the LA Lakers’ Andrew Bynum, April 14, 2015. (http://fortheloveofgif.tumblr.com).

Eighteen hours later came Catherine’s call about me being offered the Spencer Fellowship! I took it as a sign from God that, at the very least, I’d finish my dissertation and my doctorate without needing an extra two or three years to work on it. Unfortunately, neither John Hinshaw nor Joe Trotter saw my great fortune the way I did. When John found out a week later, he didn’t talk to me for nearly three years. And from reading my previous blog posts, you all already know how my work with Trotter devolved after the Spencer award announcement.

The one thing that fellowship did for me as a person — and not just as an academician, researcher or educator — was to give me the space to question academia and my role in it. Even two decades later, I’m still ambivalent about the academic method of obtaining tenure, about the publish-or-perish paradigm, about the hypocrisy that exists in such a cloistered world. Even as I still hold a job and play a role in this world.

What I’ve come to learn is that hypocrisy is everywhere: in the nonprofit world, in romance, and in academia, too. We could all start with, “Did you hear the one about how merit and hard work alone can lead to a prosperous life?” That’s the hypocrisy I had to learn to see in academia, and began to, thanks to the space that the Spencer Dissertation Fellowship gave me that year. More on that later.


The Importance of The Great Society, 50 Years Later

April 10, 2015

Cartoon about American politics and the economy during the Great Recession, May 9, 2010. (Mike Luckovich, Atlanta Journal-Constitution). Qualifies as fair use under copyright laws (low resolution, relevance to subject matter).

This weekend marks a half-century since President Lyndon Baines Johnson signed the first series of bills into law that signified his Great Society/War on Poverty work. We won’t hear much about this, though. Not with 2015 being a benchmark year for commemorating and celebrating so much else. The Voting Rights Act of 1965. The 70th anniversary of the end of World War II. Appomattox at 150, along with President Lincoln’s assassination, Juneteenth, and the ratification of the 13th Amendment to the US Constitution, ending legal slavery (except for prisoners, of course). The trench warfare of World War I, and the Battle of Gallipoli (if one’s a real war trivia buff). Even the 30th anniversary of Back To The Future and the UN Conference on Women, which gathered in Nairobi, Kenya in the fall of 1985, will likely get more air time, cyber time, and ink than the first of LBJ’s Great Society programs.

But the Great Society was supposed to be a grand experiment and experience. It was supposed to wipe out poverty, end all legal forms of discrimination and segregation, and make the opportunity to achieve the modern American Dream of a middle-class life or better a reality for almost everyone. And it would’ve worked, too, if it weren’t for those pesky kids, um, LBJ’s pesky decisions to escalate the Vietnam War. The war cost $269 billion in 1970 dollars, or $1.7 trillion in 2015 money.

What $1 trillion looks like, January 2012. (http://usdebt.kleptocracy.us/ via Elsolet Joubert).

Yet it’s not exactly true that it would’ve worked, or that it didn’t work. You see, despite the best efforts of an elitist, right-wing and ineffectual federal government corrupted by plutocrats and its military-industrial complex, the Great Society programs are still with us. For starters, there’s the Elementary and Secondary Education Act (ESEA) of 1965 and the Higher Education Act (HEA) of 1965, the first of which LBJ signed into law on April 11, fifty years ago this weekend.

Without these laws, the modern era of educating millions of children whom states, colleges and universities, and local school districts had shut out of K-12 and higher education wouldn’t have occurred at all. With federal government dollars and regulations, not to mention a foundation in the form of the Civil Rights Act of 1964, ESEA and HEA in their initial years provided real hope for “an equal opportunity for all.”

This wasn’t and isn’t just about Jim Crow and racial discrimination and segregation, whether de jure in the South or de facto via residential segregation and redlining in cities outside the South. This became about opening doors for whole classes of children and adults, doors that had been steel-reinforced-concrete walls before.

Harvard Economics Professor Roland Fryer at American Enterprise Institute, Washington, DC, July 16, 2007. (http://aei.org).

Technocrats and other public education crisis gurus of the likes of Michelle Rhee, Wendy Kopp, and Roland Fryer would have us think that the $600 billion that states and the federal government via ESEA spend on public education each year is all wasteful spending. They say that we have about the same number of students in schools today (about 50 million) as we did in 1970.

Of course, they don’t tell the whole story, leaving out a lot of the truth. For in my lifetime, students with disabilities (cognitive, physical, emotional) have gone from shut out to included in public schools, as part of the Americans with Disabilities Act of 1990 and, of course, ESEA. Students with language proficiency issues became part of the public education fabric under the Lau v. Nichols Supreme Court decision in 1974, with funds provided out of ESEA. Charter schools, magnet programs, school counseling, and so many other things that have made public education more inclusive and also more expensive wouldn’t have been possible without ESEA. Meaning, technocrats, that public education was severely underfunded and exclusionary prior to 1965.

The HEA hasn’t had the same effect, primarily because we as a society tend to think of higher education as a luxury and not a right, and have fought hard to make many colleges and universities the exclusive domain of “the worthy.” Still, without HEA and the Pell Grants (named after former Sen. Claiborne Pell, D-RI, 1961-97), the Supplemental Educational Opportunity Grant, the Direct Student Loan Program, and the Supplemental Loan Program, a whole generation of low-income and middle-income families wouldn’t have been able to send their kids to college. Those kids were often the first in their families to go, as Blacks, as women, as Black and Latino women, as working-class Whites with only a steel mill or automobile plant in their future. At least back then.

President Lyndon Johnson signing the Higher Education Act of 1965, Master's Gymnasium, Texas State University, San Marcos, TX, November 8, 1965. (http://smmercury.com/). In public domain.

There’s also the National Endowment for the Arts and the National Endowment for the Humanities, the Head Start program, and Congress passing Medicaid and Medicare, providing health coverage for the elderly and deeply poor for the first time. And all in 1965.

What does all of this mean in 2015? That despite the mountains of books written on LBJ and the failures of the War on Poverty and the Great Society, these programs worked and still work in ameliorating poverty and expanding opportunities for all, even across racial lines. Would they have been more effective with more investment, especially in those initial years, when Vietnam became more of a priority? No doubt.

That’s just it, though. Most Americans — who, despite the fears of Ted Cruz and Rudy Giuliani, remain White — didn’t care about poverty and racism and the structures that supported them then, and they mostly don’t care now. They want Congress to work, just not on issues that would permanently unbalance the social hierarchy that they’ve assumed as their birthright for two centuries.

Hence the one-two-punch relationship they see between poverty and those with black and brown skin. Hence the constant carping about taxes and needless spending on food stamps and welfare, even though the true face of government assistance in the US has always been a haggard White one. Hence the constant media tropes about hard work and so-called self-made men worth billions making their money without holding a college degree, who somehow can tell the rest of us how to live. The Great Society, for all of LBJ’s foibles and its weaknesses, remains a great legacy of what America could be, and at times has been, but overall refuses to be.


Easter Seder 1995

April 8, 2015

Matzo and a cup of wine in a Kiddush cup for first evening of Passover, April 7, 2015. (http://www.timeanddate.com).

Like most of my posts, this is a story of irony, sarcasm and identity. It may be a bit out of time, since the first night of Passover and Easter already occurred last weekend. But it’s still Passover week for those who do more than eat matzos and chicken liver paste with a glass of Manischewitz on the first night.

In all, I have been present, prayed, dined, wined and whined at four Passover Seders. Three of them were during the Hebrew-Israelite years, 1982, 1983, and 1984. All of them involved a roasted leg of lamb, bitter herbs, and chewing down raw horseradish while chugging super-sweet wine to chase away the five-alarm-fire in my mouth, throat and stomach. Endless praises to Yahweh, too many exhortations of Moses, and awkward snorts toward being strangers among strangers in a strange and oppressive land. That was my Passover experience in a lifetime and timeline determined by my Mom and idiot stepfather Maurice, before I turned to Christianity, before I gave up on the idea that I could be from one of the Ten Lost Tribes of ancient Israel.

My fourth Seder, though, came eleven years later, in mid-April 1995. I’d been a Christian for as long as I hadn’t commemorated Passover as part of my religious birthright. I wasn’t sure about the idea of attending this celebration, as it wasn’t even at sundown on that year’s first day of Passover, Saturday, April 15. My friend Carl and our fellow Carnegie Mellon history grad school mates Alan, Jeff, and Susannah were holding their little Seder on Easter Sunday, April 16, as the first two had rented a house together in the Point Breeze neighborhood of Pittsburgh (really, the White end of Homewood-Brushton, which asked for a race-based divorce in 1961).

Picture of the Henry Clay Frick Mansion, or "Clayton", located at 7200 Penn Avenue, Point Breeze, Pittsburgh, PA, March 21, 2010. (Lee Paxton via Wikipedia). Released to public domain via CC-SA-3.0.

They had invited me a week earlier, a few days before my Spencer Foundation Fellowship application went from no-go to a go. I thought about saying no, but generally, I didn’t do anything on Easter Sundays anyway. Even as a member of Covenant Church of Pittsburgh, the one Sunday I didn’t attend church was Easter Sunday. It was the holiest of days, like Passover, and for so many people, the only day all year they attended church. For so many, it was show-off-my-new-spring-clothes day, not Jesus’ Resurrection Day. I didn’t like the overcrowdedness that came with an Easter Sunday or Christmas service. It smacked of hypocrisy, my own included.

So I decided, for one Sunday, to attend a Seder prepared by folks who’d only ever known themselves as Jews, both ethnically and religiously, their whole lives. Except that, unlike the stern, orthodox Seders of my Hebrew-Israelite days, full of bitterness and tears, joy and triumph, this was a lighthearted affair. It was as unorthodox a Seder as I could’ve expected, with lots of conversation about grad school, about my dissertation fellowship, about life and sports and music in general. No raw horseradish, but lots of chicken liver paste. No Manischewitz, but some Mogen David, along with more traditional red and white wines, and an empty seat for Elijah.

Manischewitz wine, in bottle and a wine glass, September 11, 2012. (http://tabletmag.com/).

Carl and Alan, of course, expressed surprise when I did ask questions or make comments. Like about the kosher-ness of eating mashed-up chicken livers, or the differences in taste between the traditional Pesach beverages, or how peanut butter and jelly went well with matzo crackers. Alan, about to be a one-year-and-done CMU history doctoral student, did ask me, “Where did you learn about Passover?” I said, “This is my fourth Seder.”

I knew better than to fully unlock everything I knew about Pesach, Judaism, Jewish history, the Ten Lost Tribes, being a Hebrew-Israelite, and the racial privileging that I had observed growing up in Mount Vernon between “real” Jews and us “weird” (read “not White”) Jews. For a few hours, though, I had to confront a part of my past that I’d all but locked away by the beginning of ’90. Not just locked away. I’d taken everything from between April 13, ’81 and July 23, ’89, wrapped it in Saran Wrap, put that in a Ziploc bag, thrown it in a safe, locked it, and then built a force field to keep out intruders.

I was relieved when I finally left Carl and Alan’s Easter Sunday/Passover Seder and walked back to my apartment in East Liberty. I wasn’t ready yet to take a look back at what I lived through during the Reagan Years. I was all about moving forward, and the previous days and weeks of dissertation research followed by a major-league dissertation fellowship made me feel like the completely different person that I believed I actually was. At least ninety-five percent of the time.


Degrees of Fakery

March 17, 2015

Anne-Marie Johnson in I’m Gonna Git You Sucka (1988), March 17, 2015. (http://cdn5.movieclips.com/). Qualifies as fair use under US copyright laws (low resolution and relevance to subject matter).

All too often, there are Americans high and low who believe they can say, “That’s just your opinion!” to anyone about anything. It doesn’t matter if the person they say this to is an expert in, say, climate change or American history or twentieth-century education policy. Or, for that matter, if the person they question is a total bullshit artist. All opinions are equal, and equally discountable.

But it does matter if the opinion comes from someone rich and famous, or at least someone Americans see on TV and/or hear on the radio nearly every day, someone they like, someone they could see themselves sharing a laugh or cry with. That’s why opinions like those of Rudy Giuliani, Bill Cosby, Michelle Malkin, even Brian Williams seem to have mattered more over the years than the expert interpretations of many a scholar, scientist or public intellectual.

On the scale of those experts, those in the media likely view me as a middle-of-the-pack expert. I went to graduate school for five and a half years, earning two advanced degrees with a focus on twentieth-century US and African American history, with an even sharper focus on history of American education, African American identity and multiculturalism.

Front and left-side view of Chevrolet Citation II (1980-1985), Clinton, MD, August 28, 2008. (IFCAR via Wikipedia). Released to public domain.

Despite what my Mom, my dad and some of my more cynical former high school classmates may think, earning a PhD in history wasn’t nearly as simple as answering 1,000 Final Jeopardy questions correctly before getting a stamp of approval. Twenty-three master’s and doctoral courses, more than forty paper assignments of twenty pages or more, two years as a teaching assistant, one year as an undergraduate student advisor, two summers as a research assistant, and twenty-seven months of single-minded focus researching and writing a 505-page dissertation with more citations than the number of Citations Chevrolet made between 1980 and 1985. Oh, and did I mention the nineteen months of burnout afterward?

Yet, when I take the years I’ve spent researching, writing, learning, teaching, publishing and working in the fields of history and education, and express views based on that, I get told what anyone else on the street could say. “That’s just your opinion!” Unbelievable!

I think, too, about those from a time not too long ago who could’ve and should’ve earned a doctorate, a medical degree, or a JD, but the structures of socioeconomic privilege, racism and sexism prevented them from earning these most expert of degrees. Yet at many an HBCU, in many a segregated classroom, in so many barbershops, we still called them “Doc,” a sign of respect for their abilities, for their experience, for their — dare I say — expertise.

We still do this now, even for people who don’t deserve the nickname “Doc.” My father and my idiot, late ex-stepfather both, at one point in their lives or another, laid claim to being doctors and/or lawyers. For the first two years I knew my then-stepfather Maurice, between ’77 and ’79, he carried a monogrammed briefcase, always full of his “important papers,” telling me and anyone else he bumped into on the streets of Mount Vernon, New York that he was a “doctor” or a “lawyer.” When drunk, my father sometimes took it a step further, telling strangers on the Subway that he was a “big-shot doctor an’ a lawyer” on his Friday-evening paydays. Maurice drove a Reliable taxicab during his delusions-of-grandeur years, and my father was a janitorial supervisor.

Given the history of education and our society’s denial of quality education to people of color and the poor in the US, though, I didn’t entirely hold it against them then, and I don’t now. What I do have much bigger problems with are the people who should know better, and yet don’t do any better. Just in my lifetime alone, there have been people with Dr. in front of their names without having earned a doctorate or a four-year medical degree. Like “Dr.” Henry Kissinger, “Dr.” Bill Cosby, and of late, “Dr.” Steve Perry (not to be confused with the former lead singer for Journey, I guess). And no, honorary doctorates for giving money to Harvard, Temple, or the University of Massachusetts don’t count! Nor does starting an outline for a dissertation without actually finishing one. Still, they insist on the “Dr.,” even when it’s obvious I could’ve sat on the stoop at 616 East Lincoln Avenue thirty years ago and gotten the same half-baked opinions from one of my hydro-smoking neighbors.

Stock photo of former NYC mayor Rudolph Giuliani, August 2013. (AP/New York Post).

Then again, numbskulls like William Kristol and Newt Gingrich have earned doctorates from Harvard and Tulane University respectively, and Ben Carson was once one of the most respected pediatric neurosurgeons in the world! Yet, for some dumb reason, our media and most Americans think that their opinions are somehow more valuable, more consumable, than those of people who’ve spent decades understanding education, American culture, racial, gender and socioeconomic inequality, and government corruption. Or maybe we just like listening to fake opinions from people with fake degrees and/or fake expertise on subjects about which they know nothing. Because nothing is exactly what Americans want to hear.


Boy @ The Window Origins: Meltzer Conversations

March 14, 2015

X-Men Origins: Wolverine (2009) scene, where Wolverine frees mutants kept as experiments by Colonel William Stryker, March 13, 2015. (http://cdn.collider.com/).

Of all the tangents I took related to writing Boy @ The Window, the most direct path to writing a memoir about the most painful period in my life ran through several conversations with my dear teacher, friend and mentor, the late Harold Meltzer. I’ve discussed bits and pieces of some of those conversations here and in longer form in Boy @ The Window. It’s still worth rehashing some of them, at least in terms of what was and wasn’t good advice, as well as in explaining how some of the main themes of the memoir developed over time.

As I wrote in Boy @ The Window, though my “first interview with him was in August ’02,” the first time “we discussed the possibility of me doing Boy @ The Window went back to February ’95.” Meltzer had been retired from teaching about a year and a half, while I was beginning the heavy lifting phase of my doctoral thesis, “living in DC for a couple of months while hitting the archives and libraries up for dusty information. In need of a writing break, I gave him a call on one cold and boring Saturday afternoon.”

It was in response to a letter he sent congratulating me. I'd recently published an op-ed in my hometown and county newspaper, "Solving African American Identity Crisis." I was writing about issues like using the n-word, hypermasculinity, and internalized racism in the short and, for me at least, dumbed-down piece. Somehow our discussion of that piece led to a discussion of my classmate Sam. Did I really want to spend an hour and a half talking with Meltzer about Sam and some of my other Humanities classmates and their possible identity issues, considering some of my own serious growing pains — the Hebrew-Israelite years, my suicide attempt, my Black masculinity and manhood issues? Absolutely not!

But I learned quite a bit about how I might want to approach writing Boy @ The Window through that phone call. Not because Meltzer had given me any sage advice (he didn't), nor because he revealed things to me that I shouldn't have come to learn during our conversation (which he definitely did).

Benetton ad, 1980s, January 2013. (http://fashionfollower.com/).

No, it was the idea that a lot of the things I had pursued as a historian and researcher came out of my experiences growing up. Multiculturalism as a historical phenomenon (at least if one linked it to cultural pluralism)? Can anyone say Humanities Program, or what I used to call the "Benetton Group" when we were at A.B. Davis Middle School? Writing about African American identity issues? Obviously related to living in Mount Vernon, the land where any hint of weakness translated into me being called a "faggot" or a "pussy."

And what about any scholarly concerns with racial and socioeconomic inequality and Black migration? Anyone ever meet my Mom and my father Jimme, 1960s-era migrants from Arkansas and Georgia/Florida respectively? An examination of the Black Washingtonian elite and their looking down upon ordinary Blacks because of their own colorism or the latter's lack of education? Come on down, Estelle Abel and any number of well-established Black Mount Vernon-ites who never gave me the time of day! As much as academia had been an escape for me, into a world of rationalism and logic, a place of dispassionate scholarship, it was all personal for me, though I didn't realize it until that phone conversation with Meltzer.

Fast-forward to November ’02, the last interview I did with Meltzer before his death two months later. We spent the last couple of hours on that brisk fall Thursday discussing the book idea that would become Boy @ The Window. Meltzer thought that it should be a work of fiction, “based on the real flesh and blood folks in my life, but with different names of course to protect me from any potential lawsuits. He did make me rethink the project from a simple research study of my high school years into narrative nonfiction or a memoir.” 

Screen shot of fictional character Harper Stewart’s bestselling novel Unfinished Business, from The Best Man (1999), March 14, 2015. (hitchdied via http://s785.photobucket.com/).

Was Meltzer correct? Should I have done a Harper Stewart — played by actor Taye Diggs in The Best Man (1999)? Should I have fictionalized all of my experiences and those of my family, teachers, administrators and classmates? I'm not sure if it would've made a difference. Works of fiction tend to have a tight symmetry to them, and the theme of "what goes around comes around" is usually a big one in any novel. You can't leave too many loose threads or unresolved issues, even if the novel is part of a series. For my purposes, since my life remains a work in progress, a story of relative — not obvious or absolute — success, telling it as fiction would hardly ring true to me, much less to any group of readers.

Whatever else anyone wants to say about the late Harold Meltzer, the dude got me to think about difficult things until I was no longer comfortable leaving my uncomfortable experiences and assumptions unchallenged. The very definition of a mentor, the very purpose of Boy @ The Window.


ETS Using Test Results To Justify Its Test-Filled Vision

March 4, 2015

America's Skills Challenge: Millennials and the Future (cover), February 17, 2015. (ETS).

I actually like the Educational Testing Service (ETS). I've done work for them as a consultant and as an AP Reader over the years. I enjoyed most of my testing experiences with them, especially the AP US History Exam of 1986. I like many of the conferences that they host and sponsor, and they beat almost everyone with the spreads of food they provide at their events. Yet even with all that, ETS's agenda is one of promoting the ideal of a meritocratic society through a repressive regime of testing, even as those tests show beyond a shadow of a doubt the socioeconomic determinism of standardized assessments. Or, in plain English, tests that favor the life advantages of the middle class and affluent over the poor, and of Whites and assimilated East Asians over Blacks, Latinos, and only partially assimilated immigrants of color.

Such is the case with a nearly unreported new report from ETS. They had scheduled a press event for "America's Skills Challenge: Millennials and the Future" on Tuesday, February 17th at the National Press Club in Washington, DC. The organizers postponed the event, though, because of a phantom snowstorm that turned out to be a typical snow shower. So I didn't get to ask my preliminary questions about the findings of researchers Madeline J. Goodman, Anita M. Sands, and Richard J. Coley: that despite the educational gains of the generation born after 1980, millennials sorely lack the skills they need for life and work in the twenty-first century. My questions? How could anyone expect millennials to develop independent thinking, critical thinking, innovative thinking, writing and other analytical skills if they spend precious little time in their education actually doing any of these things? How would the constant barrage of high-stakes tests from kindergarten to twelfth grade instill in students ways to think outside the box, to look at issues from more than one perspective, to stand in opposition to policies based on evidence, and not just on their gut or something they picked up from a test?

Mass of students taking high-stakes test, September 4, 2014. (http://newrepublic.com via Shutterstock).

Well, the report is worse than I thought. Goodman, Sands and Coley put together an argument that makes circular reasoning look like a Thomas the Tank Engine episode. The authors produced this first in a series of reports for ETS, relying solely on “data from the Programme for the International Assessment of Adult Competencies (PIAAC).” The PIAAC, developed by the Organisation for Economic Co-operation and Development (OECD), is a survey that assesses the skill levels of a broad spectrum of people between the ages of sixteen and sixty-five, the primary working population in most developed countries (meaning the US and Canada, the EU, the Baltic states, Australia, Japan and South Korea). ETS and the authors claim that this survey instrument is better at assessing how far behind millennials in particular are when compared to “their international peers in literacy, numeracy, and problem solving in technology-rich environments (PS-TRE)” than the international testing of high school students alone. And as such, the authors concluded that

PIAAC results for the United States depict a nation burdened by contradictions. While the U.S. is the wealthiest nation among the OECD countries, it is also among the most economically unequal. A nation that spends more per student on primary through tertiary education than any other OECD nation systematically scores low on domestic and international assessments of skills. A nation ostensibly based on the principles of meritocracy ranks among the highest in terms of the link between social background and skill level. And a nation with some of the most prestigious institutions of higher learning in the world houses a college-educated population that scores among the lowest of the participating OECD nations in literacy and numeracy.

I don't know about anyone else who reads my blog, but to me these conclusions smack of so much hypocrisy that they're stomach-ache-inducing. Really? Years of promoting testing at every level of K-12 education, everything from state and district-level assessments to PARCC and Smarter Balanced Assessments, and it's only because of growing economic inequality that US students-turned-adults don't score well in the super-advanced, highly skilled categories? Not to mention the SAT, AP exams, GREs, LSATs, GMATs, MCATs, Praxis I, Praxis II, and so many other exams that it would cause the average psychometrician's head to explode? Seriously?

Terrier dog chasing its own tail, March 3, 2015. (http://webmd.com).

This is yet another case of the dog chasing its own tail. A case where the $3-billion-per-year nonprofit just outside Princeton, New Jersey is sounding a clarion call about a crisis that it helped create. Not the crisis of rapidly rising inequality, though ETS's promotion of a false meritocracy through constant testing has served to lull affluent America into an intellectual coma. But the crisis of cutting history and social studies, literature and art, theater and music classes, from kindergarten really all the way through a bachelor's degree program.

In the promotion of testing as the way to address achievement gaps and to deal with the so-called education crisis, much of what was good about K-12 and even higher education has fallen away. Reading for the sake of reading and learning has drifted away, with more English and less literature in schools and at many colleges and universities than ever. Want to teach someone how to express themselves in writing, to express their numeracy in proofs? That thinking runs counter to what goes on in the Common Core school systems of 2015, meaning most people will either never develop these skills or, if lucky, might develop them somewhere between their junior year of college and finishing a master's degree or doctorate. We emphasize STEM fields with billions of STEM dollars without realizing that great STEM is much more than equations and formulas. It's also imagination, applying the ability to break down pictures, ideas, words and sentences contextually to the world of numbers and algorithms.

And don't give me this whole "the SAT now has an essay section on it" spiel! Fact is, everyone knows that expressing your words on paper, on a screen or in speech is critical in modern societies. After almost seven decades of testing, ETS figured this out, too? What they haven't figured out yet, though, is how to make standardized high-stakes testing a necessity for the entire working adult population in the US. Believe me, that's where they want to head next.

