Merit-hypocrisy in the Air

April 18, 2015

Meritocracy cartoon, October 29, 2010 (Josh C. Lyman via http://www.clibsy.com/).

One of the hardest ideals for me to give up on in all of my life has been the idea of meritocracy. Even when I couldn’t spell the word, much less define it or use it in a sentence, I believed in this ideal. It was the driving force behind my educational progression from the middle of fourth grade in January ’79 until I finished my doctorate in May ’97. The meritocratic ideal even guided me in my career, in both the academic and nonprofit worlds. Only by the end of ’09 did I realize what I had suspected, but ignored, for many years: my ideal of a meritocracy is shared by only a precious few, and the rest give lip service to it before wiping it off their mouths, concealing their split lips and forked tongues with nepotism instead.

Being the historian I am (a habit people like Jelani Cobb have joked on Twitter is a curse), I am programmed to look back at situations in my own life for root causes, to understand what I can do to avoid repeating my own mistakes and not-so-well-planned decisions. I’ve thought about my advisor Joe Trotter and my dissertation committee of Trotter, Dan Resnick (husband of education researcher Lauren Resnick) and Bruce Anthony Jones. The biggest mistake I made was in putting this hodgepodge committee of an HNIC advisor, a racial determinist and a closeted wanderer together to help guide me through my dissertation and then into my first postdoctoral job.

Aaron Eckhart as main character in movie I, Frankenstein (2014), August 12, 2013. (http://sciencefiction.com/).

Of course, I didn’t know enough about these men to describe them this way, certainly not until I’d graduated and couldn’t find full-time work for more than two years. The signs, though, were there. Trotter’s unwillingness to recommend me for any job before the first draft of my dissertation was truly complete (it took me two weeks to revise my dissertation from first to final draft). Resnick calling my dissertation writing “journalistic” and saying that my nearly 2,000 endnotes and thirty pages of sources were “insufficient.” Bruce pulling back on his schedule with me even before taking the job at the University of Missouri at Columbia in July ’96.

None of this had anything to do with my work. It was about me, about whether I as a twenty-six-year-old had suffered enough, had gone through enough humiliation, to earn a simple letter of recommendation for a job. When Trotter finally decided it was time to write me a letter of recommendation, it was December ’96, and the job, at the University of Nebraska-Omaha, was “subject to budget considerations,” meaning that it could (and did) easily fall through. Resnick flat-out refused to share anything he wanted to write about me, citing his “confidentiality” concerns, while Bruce had me draft my own letters of recommendation for his signature. It was a disaster, and none of it had anything to do with the quality of my work as a historian, educator, or academic writer.

The work I ended up getting after Carnegie Mellon was the result of my dissertation, my teaching experiences, and my networking. The idea that I’d earned my spot, though, was still lacking in the places in which I worked. Particularly at Presidential Classroom, where I was the token highly educated Negro on staff, and at the Academy for Educational Development, where I worked with the New Voices Fellowship Program. In both cases, I had bosses whose racial biases only became clear once I began working with them. The then-executive director Jay Wickliff never cared about the quality of my work or my degrees. Wickliff’s only concern was that I keep my mouth shut when he acted or spoke in a racist manner.

My immediate supervisor Ken, on the other hand, wanted all the credit for work I did under him, except in cases when he deemed my methods “not diplomatic enough.” Even before his bipolar disorder led him to a psychological breakdown, Ken regularly accused me of gunning for his position, sometimes turning red whenever he heard about my latest publication, teaching assignment or conference presentation. I had to fight to keep my job and to move on within AED in those final months of ’03 and early ’04, a fight that had zero to do with merit.

Dixie Biggs, Lip Service teapot, April 19, 2015. (http://pinterest.com).

I say all this because the one thing every one of these folks had in common was their lip service to the belief that hard work and results are the keys to success and career advancement. Yet for every one of them, the merit I had earned didn’t matter. My relative youth, my race, my heterosexual orientation, even my achievements either scared them or gave them reason to hold me in contempt.

I say all of this because in the past eleven years, I have been very careful about the company I keep, about the mentors I seek, about the friends I make, personally and professionally. I went from not trusting anyone as a preteen and teenager to trusting a few too many folks in my twenties and early thirties. All because I believed that my hardworking nature and talent mattered more than anything else. What has always mattered more is who you know, especially in high places like academia and at large nonprofits and foundations. So please, please, please be careful about the supposedly great people you meet. Many of them aren’t so great at all.

That’s why the idea that academia is a place full of progressive leftists is ridiculous. Yes, people like Dick Oestreicher, Wendy Goldman, Joe Trotter and so many others wrote and talked about progressive movements and ideals while I was their student. But fundamentally, they couldn’t have cared less about the actual human beings they worked with and advised, particularly my Black ass. Their ideals stopped the moment they ended their talk at a conference or wrote the last sentence of a particular book. They only cared about people whom they could shape and mold into their own image. And that’s not meritocracy. That’s the ultimate form of nepotism.


Degrees of Fakery

March 17, 2015

Anne-Marie Johnson in I’m Gonna Git You Sucka (1988), March 17, 2015. (http://cdn5.movieclips.com/). Qualifies as fair use under US copyright laws (low resolution and relevance to subject matter).

All too often, there are Americans high and low who believe they can say, “That’s just your opinion!” to anyone about anything. It doesn’t matter if the person they say this to is an expert in, say, climate change or American history or twentieth-century education policy. Or, for that matter, if the person they question is a total bullshit artist. All opinions are equal, and equally discountable.

But it does matter if the opinion comes from someone rich and famous, or at least someone Americans see on TV and/or hear on the radio nearly every day, someone they like, someone they could see themselves sharing a laugh or a cry with. That’s why the opinions of Rudy Giuliani, Bill Cosby, Michelle Malkin, even Brian Williams have seemed to matter more over the years than the expert interpretations of many a scholar, scientist or public intellectual.

On the scale of those experts, those in the media likely view me as a middle-of-the-pack expert. I went to graduate school for five and a half years, earning two advanced degrees with a focus on twentieth-century US and African American history, with an even sharper focus on history of American education, African American identity and multiculturalism.

Front and left-side view of Chevrolet Citation II (1980-1985), Clinton, MD, August 28, 2008. (IFCAR via Wikipedia). Released to public domain.

Despite what my Mom, my dad and some of my more cynical former high school classmates may think, earning a PhD in history wasn’t nearly as simple as answering 1,000 Final Jeopardy questions correctly before getting a stamp of approval. It took twenty-three master’s and doctoral courses, more than forty paper assignments of twenty pages or more, two years as a teaching assistant, one year as an undergraduate student advisor, two summers as a research assistant, and twenty-seven months of single-minded focus researching and writing a 505-page dissertation with more citations than the number of Citations Chevrolet made between 1980 and 1985. Oh, and did I mention the nineteen months of burnout afterward?

Yet, when I take the years I’ve spent researching, writing, learning, teaching, publishing and working in the fields of history and education, and express views based on that, I get told what anyone else on the street could say. “That’s just your opinion!” Unbelievable!

I think, too, about those from a time not too long ago who could’ve and should’ve earned a doctorate, a medical degree, or a JD, yet the structures of socioeconomic privilege, racism and sexism prevented them from earning these most expert of degrees. Yet, at many an HBCU, in many a segregated classroom, in so many barbershops, we still called them “Doc,” a sign of respect, for their abilities, for their experience, for their — dare I say — expertise.

We still do this now, even for people who don’t deserve the nickname “Doc.” My father and my idiot, late ex-stepfather both at one point in their lives or another laid claim to being doctors and/or lawyers. For the first two years I knew my then stepfather Maurice, between ’77 and ’79, he carried a monogrammed briefcase, always full of his “important papers,” telling me and anyone else he bumped into on the streets of Mount Vernon, New York that he was a “doctor” or a “lawyer.” When drunk, my father sometimes took it a step further, telling strangers on the subway that he was a “big-shot doctor an’ a lawyer” on his Friday-evening paydays. Maurice drove a Reliable taxicab during his delusions-of-grandeur years, and my father was a janitorial supervisor.

Given the history of education and our society’s denial of quality education to people of color and the poor in the US, though, I didn’t entirely hold it against them then, and I don’t now. What I do have much bigger problems with are the people who should know better, and yet don’t do any better. Just in my lifetime alone, there have been people with “Dr.” in front of their names who never earned a doctorate or a four-year medical degree. Like “Dr.” Henry Kissinger, “Dr.” Bill Cosby, and of late, “Dr.” Steve Perry (not to be confused with the former lead singer for Journey, I guess). And no, honorary doctorates for giving money to Harvard, Temple, or the University of Massachusetts don’t count! Nor does starting an outline for a dissertation without actually finishing one. Still, they insist on the “Dr.,” even when it’s obvious I could’ve sat on the stoop at 616 East Lincoln Avenue thirty years ago and gotten the same half-baked opinions from one of my hydro-smoking neighbors.

Stock photo of former NYC mayor Rudolph Giuliani, August 2013. (AP/New York Post).

Then again, numbskulls like William Kristol and Newt Gingrich earned doctorates from Harvard and Tulane University respectively, and Ben Carson was once one of the most respected pediatric neurosurgeons in the world! Yet, for some dumb reason, our media and most Americans think that their opinions are somehow more valuable, more consumable, than those of people who’ve spent decades understanding education, American culture, racial, gender and socioeconomic inequality, and government corruption. Or maybe we just like listening to fake opinions from people with fake degrees and/or fake expertise on subjects about which they know nothing. Because nothing is exactly what Americans want to hear.


When Work Really Is Too Much

January 27, 2014

From "How to Do More Work in Less Time" article, Forbes Magazine, February 28, 2012. (Deborah L. Jacobs/http://forbes.com).

I’ve been working for a paycheck in some capacity since September ’84, when my brother Darren and I began working with our father Jimme down in Upper West/East Side and Midtown Manhattan. Back then, we cleaned the floors of corporate offices, the carpets of condos and co-ops, and endured Jimme’s alcoholic ups and downs. There was one lesson, though, that stuck with me in the year or so that we worked for our father, one that extended the lesson we observed from our Mom before we fell into welfare in April ’83: that we wouldn’t get far without hard work or without having work, and that if we wanted to avoid the work of a low-paying, back-breaking job like buffing and waxing floors, we also needed to work smart, to use our brains as well as our muscles.

Since then, the longest I’ve been without a job has been ten months, between August ’86 and June ’87. I worked all the way through undergrad at Pitt and was a grad assistant and teaching assistant throughout grad school (with the exception of my time as a Spencer Foundation Dissertation Fellow in ’95-’96, and even then, I worked on two of Joe Trotter’s research projects). I’ve faced periods of unemployment and longer periods where I’ve cobbled together part-time and full-time work, as well as held stable full-time work in the nonprofit and higher education worlds.

Working long hours, January 23, 2014. (Mark Holder/http://www.findersandsellers.com).

In all that time, I’ve only held two jobs where I was overwhelmed with work. Not the actual act of performing the duties of these jobs, mind you. The number of hours for which I had to show up was what eventually made these jobs overwhelming. My first experience of full-time work outside of a summer job came in the middle of my Winter/Spring ’89 semester, when I worked for Pitt’s Computer and Information Systems’ (CIS) computer labs. I had requested more hours, and had gone from twelve to twenty to thirty-six per week between the beginning of January and the second half of February, covering for folks who had moved on to real full-time work after graduating.

This was a seven-week period in which I averaged 36 hours per week while taking sixteen credits — five classes — all while facing sexual harassment from my co-worker Pam, harassment tacitly sanctioned by our boss Cindy, Pam’s friend. Despite it all and my $4.15-an-hour wage, I focused on the work, the need for extra cash, and my friends, came out the other side, and hoped to avoid a situation like that again.

I stumbled my way into a worse situation in my first full-time job after earning my doctorate, with the now out-of-business Presidential Classroom. My official title was Director of Curriculum, but that was my main job for only nine months out of the year. Because Presidential Classroom had dedicated itself to edu-tainment with a full-time staff of only a dozen, all full-time staff were also part of what we called Program. For fifteen weeks during the winter, early spring and summer, one group of 300-400 high school juniors and seniors from across the country (as well as Puerto Rico and abroad) after another would spend a week in DC learning about “how government and politics work on Capitol Hill.” Or, as our brochures put it, “Not your typical week in Washington.”

One version of Presidential Classroom logo, January 27, 2014. (http://congressionalaward.org).

I worked on-site at the Georgetown University Conference Center (where Marriott had a hotel, primarily for families visiting their hospitalized loved ones at Georgetown University Hospital) for seven of those weeks. I supervised interns, so-called faculty (some of whom were government employees who seemed more interested in chasing skirts than in sharing their experiences) and worked with other staff while watching over these groups of students roaming all over DC and Northern Virginia week after week.

One week in February ’00, I counted up and found that I’d worked 120 hours in all. This included a 21-hour day, in which I caught a boy in a girls’ hotel room, then contacted his parents and expelled him from the program. Between that and the bigoted staff I worked with — including my boss, the ED, who once joked that “slavery was a hoax” — I knew that putting in 100+ hours per week and sleeping in lumpy beds for $35,000 a year wasn’t worth it. By the last week of June ’00, I was severely sleep-deprived and ready to run my co-workers through with a long spear.

The lesson here was that we all need work, and we all need to work hard in order to have a real shot at success. But working hard also requires hard thinking and decision-making. It required me to say “No” to things I had said “Yes” to when I was younger and more desperate for any job. The damnable misery of it, though, is knowing that there are millions of people stuck in jobs that require so much more of them than they should ever be willing to give.

No job should require the kind of hours I put in combined with harassment and bigotry unless the salary is in the six-figure range, and even then, it’s not worth it. It won’t be worth the loss of self-esteem, the sleep deprivation, the sudden weight gain, the irritability and the temptation to turn to forms of self-medication. It wasn’t worth it for me in ’89 or in ’00, as I’m sure it isn’t for those of you in jobs like this now.


The Quest For Work, Past and Present

August 21, 2012

Down and out on New York pier, 1935, June 2009. (Lewis W. Hine via FDR Presidential Library). In public domain.

Election ’12 should be about how to generate more jobs and how to grow the economy. Sadly, it hasn’t been about these issues, and given the toxic political and cultural climate, it will not be about jobs or the economy when this cycle ends on November 6.

I’ve seen this horror movie of economic downturns and mini-depressions in American society and in my own life now three times in the past thirty-five years. Each time, I’ve been better prepared, more informed, more able to ride out the storm. And each time, I’ve seen the ugly side of what we call the United States of America, a place that has and will continue to punish the unemployed and underemployed for problems beyond their control. Especially if they were and are women, young, over forty, of color, and among the poor.

In the period between ’79 and ’83, the effective inflation rate was more than thirty-five percent, we lived through a double-dip recession, and interest rates reached 22.5 percent. My mother’s meager income of $12,000 in ’79 didn’t keep up, even as it reached $15,000 in ’82. We were late with our rent at 616 by an average of three weeks each month, and going back to October ’81, we didn’t have food in the apartment for the last ten days of any month. Things were so bad that my mother, a supervisor in Mount Vernon Hospital’s dietary department, brought food home from the hospital kitchen for us to eat for dinner several times each month.

“Negro Women,” Earle, Arkansas, July 1936, August 21, 2012. (Dorothea Lange via Library of Congress/http://libinfo.uark.edu). In public domain.

The good news was that Mount Vernon Hospital’s employees went on strike for higher wages and increased job security in mid-July ’82. The bad news was that, although Mom was a sixteen-year veteran, nearly fifteen of those years as a dietary department supervisor, she had never joined the union. She didn’t want to pay “them bloodsuckers” dues, and said that she “couldn’t afford them” anyway.

I can only imagine how much spit and venom Mom faced on her way to work every day for three weeks. Considering our money situation, which I knew because I checked the mail and looked at our bills every day, picketing and getting union benefits might have been better than working. It wasn’t as if there was food in the house to eat anyway. As much as I enjoyed Mount Vernon Hospital’s Boston Cream Pie, I thought that picketing for a better wage was the way to go.

Soon after I started eighth grade, the other shoe dropped. Mom, so insistent on not joining Mount Vernon Hospital’s union, was the odd woman out. The hospital’s concession of five percent raises per year over three years left it looking to cut costs, and the only personnel left vulnerable were non-union service workers and their supervisors. Mom’s boss, Mrs. Hunce, cut her to half-time. Mom was screwed, but it was a screwing partly of her own making. It was the beginning of a two-decade-long period of welfare, underemployment, unemployment, and welfare-to-work, with an associate’s degree along the way. So much for hard work leading to prosperity!

I’ve gone through my own periods of unemployment and underemployment over the years. The most severe one for me was between June and September ’97, right after I finished my PhD. It was the first time in four years I hadn’t had work or a fellowship to rely on, and it was brutal. I did interviews with Teachers College and Slippery Rock University for tenure-track positions in education foundations, only to finish second for one job, and to see the folks at Slippery Rock cancel the other search. In the latter case, I think that they felt uncomfortable hiring someone of my age — twenty-seven — and my, um, ilk (read race here).

What made it worse was the fact that I couldn’t simply apply for any old job. I did actually try, too: McDonald’s, UPS, FedEx, Barnes & Noble, among others. I couldn’t even get Food Stamps in July, because my income for March, April and May ’97 — $1,200 per month — was above the threshold. And because I was technically a student for tax purposes during my last two semesters at Carnegie Mellon — even though I was an adjunct professor teaching history courses — I didn’t qualify for unemployment benefits either.

Shuttered Homestead steel mill, 1989, August 21, 2012. (Jet Lowe/Historical American Engineering Record). In public domain.

I had to omit the fact that I had a PhD to get a part-time job at the Carnegie Library of Pittsburgh, which began after Labor Day ’97. I ended up teaching as an adjunct professor at Duquesne University’s College of Education the following year. Still, my income did not return to where it had been in my last year of graduate school until June ’99, when I accepted a position with Presidential Classroom in the DC area.

I am nowhere near those times of being considered or treated as a statistic, marginalized in media and in politics as lazy, shiftless, not smart or hardworking enough. But as a person who teaches nearly full-time and has more than occasional consulting work, I know how precarious and temporary work can be.

Ironic, then, that the people making decisions that have put people like me and my Mom in terrible financial straits have never missed a meal or skipped a bill because they had to choose between heating their home and keeping phone service. That most Americans, regardless of party affiliation, shun the poor, unemployed and underemployed is a shame and a pitiful example of how we really don’t pull together during tough times.

These attitudes are why rugged individualism and hard work aren’t enough to get and hold a job. An education, a real social safety net, even regulation of the job market, would help level the playing field for millions. Or, maybe some of us should learn Mandarin Chinese, Hindi, Arabic or Portuguese and move to where the jobs really are.


The Human Race Addendum

August 13, 2012

Two years ago, I wrote a post about a curious observation I made about inequality, unfairness and humanity, all courtesy of my fourth grade teacher, Mrs. Pierce (“Hard Work and the Human Race,” September ’10 – see below). In the thirty-four years since that observation, it’s become fairly obvious that the legendary college football coach Barry Switzer was right about how people like Romney think about their station in life: “Some people are born on third base and go through life thinking they hit a triple.”

GOP presidential hopeful Mitt Romney’s pick of Paul Ryan as his vice-presidential running mate confirms the idea that there are folks in America who truly believe that their success came only as a result of hard work, luck and prayer. But to use a better analogy, it’s easy to be a winner when you’re born in the middle of the fourth lap of a 400m race, while someone like me had to fight just to get into the starting blocks. Politically, the Carter and Reagan years were the spark for my understanding of economic inequality. Three and a half decades later, the Romney-Ryan ticket reflects the long and winding road this mythology of “equal opportunity, not equal outcomes” has taken our nation down. Only, equal opportunities do not exist for most of us, as the track and field analogy illuminates.

===========================

When I was nine years old, my fourth grade teacher at Holmes, Mrs. Pierce — a grouch of an older White woman, really — talked about the human race and attempted to describe our species’ variations. She tried to do what we’d call a discussion of diversity now. It went over our heads, no doubt because she didn’t quite get the concept of diversity herself.

Holmes Elementary School, Mount Vernon, NY [Top left corner was Mrs. Pierce’s classroom in 1978-79 year], November 22, 2006. (Donald Earl Collins).

Like the fourth-grader I was, I daydreamed about the term human race. I thought of Whites, Blacks, Asians, Hispanics, young and old, male and female, from all over the world, all on a starting line. It was as if four billion people — that was the world population in ’79 — were lined up to run a race to the top of the world. In my daydream, some were faster than others, or at least appeared to be, while others hobbled along on crutches and in wheelchairs. Still others crawled along, falling farther and farther behind those in the lead, the ones who looked like runners in the New York City Marathon. Before I could ponder the daydream further, Mrs. Pierce yelled, “Wake up, Donald!” as if I’d really been asleep.

A high school friend recently gave me some much-needed feedback on my Boy @ The Window manuscript. Her feedback was helpful and insightful, and very much appreciated. But some of it reminded me of the realities of having someone who’s a character in a story actually read that story. Their perceptions will never fully match up with those of the writer, which is what is so groovy and fascinating about writing in the first place.

One of the things that struck me as a thread in her comments — not to mention in so many conversations I’ve had with my students about race and socioeconomics — was the theme of individual hard work trumping all obstacles and circumstances. As if words, slights, and mindsets in the world around us don’t matter. As if poverty is merely a mirage, and bigotry, race and racism merely words on a page. Sure, a story such as the one I have told in this blog for the past three years is about overcoming roadblocks, especially the ones we set for ourselves in life, to say nothing of the ones external to our own fears and doubts.

At the same time, I realized what my weird daydream from thirty-one years ago meant. Some people get a head start — or, in NASCAR terms, the pole — before the race even starts. That certainly doesn’t make what that individual accomplishes in life any less meaningful, but knowing that the person had an advantage that most others didn’t possess does provide perspective and illuminates how much distance the disadvantaged need to cover to make up ground. Those who limp and crawl and somehow are able to compete in this human race have also worked hard, likely at least as hard as those with a head start, and more than likely, harder than most human beings should ever have to work.

2009 London Marathon. (http://www.newsoftheworld.co.uk/)

Plus, there are intangibles that go with race, class and other variables that determine how the human race unfolds. “Good luck is where hard work meets opportunity,” at least according to former Pittsburgh Penguins goaltender Tom Barrasso. Most human beings work hard, but all need opportunities that may provide a real sprint to catch up or take a lead in the human race. Family status, political influence, social and community networks, religious memberships, being in the right place at the right time — all matter, and all are connected to race and class, at least in the US.

The moral of this story is, hard work matters, individual accomplishment matters. Yet a panoramic view of the race in which humans are engaged matters more in putting our individual successes and the distance that remains in some reasonable perspective. Without that, we’re all just pretending that individual hard work is the only thing that matters, when that’s only half the battle, or half of half the battle.


Promoting Fear of a “Black” America

February 4, 2012

Fear of a "Black" America front cover, July 2, 2004 (Donald Earl Collins).

It’s been seven years since my first radio interview and book signing for my first book, Fear of a “Black” America: Multiculturalism and the African American Experience (2004). In all, I spent sixteen months actively promoting the book, through PR releases, contacts at universities and through my work at the Academy for Educational Development, and a huge volume of email exchanges and phone conversations. Between this nearly full-time work, my full-time job, and being a full-time parent and husband, I was exhausted by the end of ’05.

It’s unbelievably hard work to promote a book. Especially a self-published one. Not to mention one that I’d proclaimed an in-depth response to the conservative movement’s “Culture Wars” on all things “multicultural.” One that combined personal vignettes with interviews and historical research to tell the story of African Americans and other groups of color coming to grips with their notions of multiculturalism in education and in their everyday lives. Granted, it was immediately available via Amazon.com, Barnes & Noble/B&N.com and the now out-of-business Borders.com. But if I’d done nothing, I would’ve sold maybe one hundred copies in ten years.

My work to promote Fear of a “Black” America began about a year and a half before it hit virtual and actual shelves in September ’04. I created a website for the manuscript (http://www.fearofablackamerica.com) in February ’03, learning HTML in detail in three weeks’ time. Within a year, the number of unique visitors to the fledgling site varied between 500 and 1,000 a month. After three years of coming close — but still failing — to publish Fear through traditional publishers like Beacon Press, Random House and Verso, I politely moved on from my agent and decided to self-publish.

A couple of months into the process, I hadn’t had much success beyond a couple of professors using copies of Fear in their African American studies courses (a completely random occurrence — they were in different parts of the country). At the end of November ’04, my friend Marc took it upon himself to have me meet him and a friend of his for a long talk about how to organize a marketing campaign for the book. While they were certainly well-meaning, their advice provided no real insight into the process beyond what I already knew: I just needed to be persistent.

That persistence paid off in early February ’05. In a span of three days, I did an evening drive interview with Howard University Radio (WHUR-FM) and a book signing at Karibu Books. Both, at least, gave me some momentum beyond Black History Month, as I continued doing book signings in the DC area and through my job up in New York that spring.

My promotions reached their height in April ’05, when I did an hour-long interview with Pacifica Radio DC (WPFW-FM) about Fear. There, I realized how much more interested callers were in my personal background and how it shaped my views of multiculturalism. I also learned that some of the callers — whom I didn’t know — had actually read my book. It made all of the groundwork I’d done to get to this point worth the effort. By then, I’d cracked the top 100,000 on the Barnes & Noble list (84,000), or roughly ten to fifteen sales per week, and the top 200,000 (161,000) on Amazon.com (another ten to fifteen sales per week).

WPFW 90.9 Interview (Part 1), Fear of a “Black” America, April 25, 2005

WPFW 90.9 Interview (Part 2), Fear of a “Black” America, April 25, 2005

During that summer and fall, I continued to promote Fear, with another interview on Pacifica Radio DC in August, and a book signing at Howard University Bookstore in October ’05. But I was running on empty. As fast as email was, it didn’t have the immediacy of what we now call social media. And in ’05, Facebook was in its infancy, Twitter didn’t exist, and Blogger was a relative novelty. Even with a website that received 4,000 hits and over 1,200 visitors a month, I couldn’t generate the cascade effect that I could now.

My final act of promotion for Fear of a “Black” America came in August ’06, through John Kelly’s Washington Post Metro column, “Getting Work Done – On the Way to Work,” in which I talked about editing my book on Metro Rail for two years. By then, I’d pivoted to work on Boy @ The Window, knee-deep in reopening memories that hadn’t been well-considered when I was a teenager.

Between September ’04 and December ’05, I promoted Fear of a “Black” America using $3,500 of my own resources, and made over $1,000 on the book, selling about 600 copies in sixteen months. Overall, I’ve sold over 1,000 copies between ’04 and ’08. Those numbers are on par with most works published in academia.

But I was hardly satisfied. I knew by ’09 that with a social media apparatus, I could’ve sold ten times as many books. I knew that my memoir manuscript deserved more than the fate of self-publishing, that I’d want to find a path to a traditional publisher. Still, despite my moments of despair, I believe that my persistence in finding an agent and a publisher is the right way to go. It’ll make it easier to work hard in promoting Boy @ The Window. In that case, I’ll be doing it in the virtual light of day.


Sometimes, I Am Walter White

July 17, 2011

Bryan Cranston as Walter White Screen Shot, Breaking Bad, Season 1, Episode 1. Qualifies as fair use under US copyright laws because picture is part of post describing the character and series.

Season Four of Breaking Bad begins tonight at 10 pm EDT on AMC in my part of the world. I’m a latecomer to the show, and only because my wife had sat on her Netflix delivery of the first two discs of the first season back in March. But boy did I catch up, watching the first two seasons in a span of ten days! Overall, I find the first six episodes of Breaking Bad the most intriguing. Those episodes provide me the reasons why I support Walter White (the main character, played by Bryan Cranston), because I can see some of myself and my life in his.

For those of you who haven’t watched or aren’t fans, Walter White is a brilliant yet foolish has-been-who-really-should’ve-been-somebody high school chemistry teacher in Albuquerque, New Mexico. He’s fifty years old, married for seventeen years, with a fifteen-year-old who has cerebral palsy, and with a surprise baby well on the way, as his wife’s in her third trimester. When he discovers after collapsing at his other job (at the local car wash) that he has advanced lung cancer and maybe six months to live, he decides through serendipity to use his training as a biochemist to produce high-grade methamphetamine, or crystal meth, in order to provide for his family before kicking the bucket.

I’m not terminally ill, at least as far as I know. Nor am I a biochemist. But like Walter White, I am an over-educated person with tons of skills and experience, skills I’m woefully under-applying in my current work as an adjunct professor and consultant. I wasn’t pushed out of a venture with a biotech company in which the other partners made billions of dollars off of my ideas. But I’ve had people in my life who’ve attempted to keep me from expressing my ideas, from getting a job, who’ve even made up stories to derail my career.

Unlike Walter White, I’m at least teaching college students, if only in the technical sense that the students I teach are in college. Although, given the sporadic nature of my consulting when combined with my teaching, it may be time to do like Walter White and obtain certification to teach high school social studies. For unlike in Albuquerque, teaching at the high school level out here often pays better than being a college professor, and can yield better results academically for the students involved.

Given where Walter could’ve been in life by the time he reached middle age, it’s small wonder that he has a

The Hulk Screen Shot, May 1, 2008. (Source: Lawrence Cohen/http://www.apple.com/trailers/universal/theincrediblehulk/large.html). Qualifies as fair use under US copyright laws because it’s a low-resolution depiction of a character as described in this post.

deep well of pent-up rage to draw from throughout the series. I understand that rage because I’ve seen it in myself over the years. But my rage comes from a life of deprivation and working my ass off to overcome it, only to feel as if there’s still tons more work to do. With the struggle to become a successful writer, and not just an academic one with a book and a couple dozen articles to my credit, I’m already tired. But the struggle for more work in a field in which you know you’re well qualified and have already done a ton of work can lead to Walter White rages. Or, for that matter, to the rages of Bruce Banner each time he turned into the Hulk.

Really, I realize that on the whole, I’m not Walter White. I’ve been written off too often in life to see myself that way. But I’ve spent the better part of three decades working to turn “No!” into “Yes!,” to prove myself as a thinker, educator, historian, manager and writer. Not only to myself, but to my God, and to those manning the gates to jobs, publishing, grants and degrees. So I get how and why rage can build up. I guess that if I found myself with Stage 3 lung cancer, I could use my talents to write other people’s books and dissertations, or even to write scripts for porn, but that wouldn’t exactly be me.

No, under Walter White’s circumstances, I’d probably call in every favor that I’ve been owed since seventh grade. I’d contact every writer I’m a fan of, every contact I know associated with publishing books, magazines and scholarly journals, and make myself a royal pain in the ass. That is, until I got a book contract for Boy @ The Window and published the several pieces I’ve been working on, in occasional bursts of writing, for the past two or three years. I’d do whatever I could to make sure that Noah and Angelia were taken care of before I passed.

Come to think of it, what I’ve just written should be my mantra, impending death or otherwise. As Nickelback says in “If Today Was Your Last Day,” “against the grain should be a way of life.” That’s been me for the past thirty years. So I’m really only sometimes Walter White.

Rajon Rondo, ultimate against the grain drive before hard foul, 2010 NBA Eastern Conference Finals, May 1, 2010. (Photo by Nathaniel S. Butler/NBAE via Getty Images).

