Copying, Lifting, and Cultural Appropriation

March 19, 2015

Pharrell Williams and Robin Thicke at the 56th Annual GRAMMY Awards, Beverly Hills, CA, January 25, 2014.
(Larry Busacca/Getty Images, via http://images.musictimes.com/). Qualifies as fair use under copyright laws, via Getty Images agreement with CC-SA-3.0.

I’m sure all of you have heard about the recent court decision that gave Marvin Gaye’s estate a $7.3 million award, finding that Pharrell Williams and Robin Thicke committed copyright infringement with their 2013 hit single “Blurred Lines.” They lifted the melody and rhythm for their song from Marvin Gaye’s 1977 single “Got To Give It Up.” The two songs do sound similar enough, and interviews with Williams show that he was heavily influenced by Gaye’s work. I find myself agreeing with the jury on this because of Williams’ Whiteness rhetoric about being the “new Black” last year, as well as Thicke’s constant cultural appropriation in his videos and music.

The decision, though, made me think about how much copying has gone on in music over the years. It made me think about the first time I heard Madonna on the airwaves in the fall of ’83, with “Holiday,” her first Billboard Top 40 single, off her self-titled debut album. Except that the first dozen or so times I heard it, I thought it was “And The Beat Goes On,” a late-’79/early-’80 disco hit from the group The Whispers.

It was the first time I realized that music artists could copy each other, or at least, have similar sounds, rhythms, tones and other musical arrangements in their songs. The lyrics were obviously different, but both “And The Beat Goes On” and “Holiday” were “forget-the-cares-of-this-world” dance-pop songs with heavy R&B influences.

I’d wondered for years whether Madonna ever gave The Whispers any formal credit for sampling their music for one of her very first tracks. I did find an answer in the Madonna (1983) album’s liner notes. Nope, not a single mention, not a word of acknowledgement. But John “Jellybean” Benitez was mentioned as producer. There’s no way in this world that he and the other folks who worked on “Holiday” didn’t know who The Whispers were or hadn’t heard their song “And The Beat Goes On.” It’s possible that Madonna herself didn’t know, but given her constant credits to the disco era, I seriously doubt that, too. Take a gander below, folks, and tell me how similar the two songs were/are:

Keep in mind, though, this was before Madonna had become “Like A Virgin” Madonna, “Material Girl” Madonna, and “Vogue” Madonna. And copying, sampling, and lifting were more acceptable in the early 1980s than they are today, especially since, at the time, neither The Whispers nor Madonna were music icons. Of course, lifting from relatively obscure Black artists to mainstream a song or music genre is nothing new. Just ask Al Jolson and Elvis Presley!

Thirteen years later, The Fugees released their big hit, “Ready Or Not” (1996). As soon as I heard it, I knew they had sampled Enya’s “Boadicea” (1987), because I’m that kind of eclectic music enthusiast. They didn’t give Enya credit in their initial liner notes, either, and hadn’t obtained permission to use her music in their song. Enya threatened to sue over this rather obvious copyright infringement, and The Fugees and Enya settled the issue out of court.

By ’96, the rules for sampling other music artists’ work had tightened, and Enya herself was a well-known, if not iconic, new age music artist. The up-and-coming Fugees picked the wrong Irish singer to copy without permission or acknowledgement.

What does all of this mean? For starters, you should never plagiarize someone whose work is well-known. Vanilla Ice, meet Queen and David Bowie. The Verve and “Bitter Sweet Symphony” from ’97? Let me introduce you to The Rolling Stones!

But the “Blurred Lines” decision means much more than the message that one should steal from an unknown artist without a major music contract instead of stealing from Marvin Gaye. The legal decision blurs the distinction between illegal sampling and inappropriate cultural appropriation. Really, Madonna’s use of The Whispers’ “And The Beat Goes On” was just as blatant, and so was her appropriation of disco, R&B and other Black and Latino dance rhythms between ’82 and ’93. Unlike Pharrell Williams and Robin Thicke, though, Madonna’s appropriation wasn’t seen as such, at least during her first years of fame. Heck, I knew more than a few Black folk who thought Madonna was either Black or biracial prior to the Like A Virgin (1984) album, likely because, like me, they didn’t have cable to watch MTV ad nauseam.

I guess that Pharrell’s and Robin Thicke’s act has worn thin with the fickle public. This may well point to the larger fact that mainstream popular music and the artists creating today’s music are about as creatively collaborative and eclectic as a dunker in basketball with no jumpshot and no defensive skills. This isn’t your father’s White Soul, aka Michael McDonald, Daryl Hall & John Oates, or even Kenny Loggins, working with Earth, Wind & Fire or Kool & The Gang. Today’s music artists can only do their music one way, and need “inspiration” to “create” a “new sound.” One that is too often lifted from the past, yet never placed in context, and sampled with and without permission.


Degrees of Fakery

March 17, 2015

Anne-Marie Johnson in I’m Gonna Git You Sucka (1988), March 17, 2015. (http://cdn5.movieclips.com/). Qualifies as fair use under US copyright laws (low resolution and relevance to subject matter).

All too often, there are Americans high and low who believe they can say, “That’s just your opinion!” to anyone about anything. It doesn’t matter if the person they say this to is an expert in, say, climate change or American history or twentieth-century education policy. Or, for that matter, if the person they question is a total bullshit artist. All opinions are equal, and equally discountable.

But it does matter if the opinion comes from someone rich and famous, or at least someone Americans see on TV and/or hear on the radio nearly every day, someone they like, someone they could see themselves sharing a laugh or cry with. That’s why opinions like those of Rudy Giuliani, Bill Cosby, Michelle Malkin, even Brian Williams seem to have mattered more over the years than the expert interpretations of many a scholar, scientist or public intellectual.

On the scale of those experts, the media would likely view me as middle-of-the-pack. I went to graduate school for five and a half years, earning two advanced degrees with a focus on twentieth-century US and African American history, and an even sharper focus on the history of American education, African American identity and multiculturalism.

Front and left-side view of Chevrolet Citation II (1980-1985), Clinton, MD, August 28, 2008. (IFCAR via Wikipedia). Released to public domain.

Despite what my Mom, my dad and some of my more cynical former high school classmates may think, earning a PhD in history wasn’t nearly as simple as answering 1,000 Final Jeopardy questions correctly before getting a stamp of approval. Twenty-three master’s and doctoral courses, more than forty paper assignments of twenty pages or more, two years as a teaching assistant, one year as an undergraduate student advisor, two summers as a research assistant, and twenty-seven months of single-minded focus researching and writing a 505-page dissertation with more citations than the number of Citations Chevrolet made between 1980 and 1985. Oh, and did I mention, nineteen months of burnout afterward?

Yet, when I take the years I’ve spent researching, writing, learning, teaching, publishing and working in the fields of history and education, and express views based on that, I get told what anyone else on the street could say. “That’s just your opinion!” Unbelievable!

I think, too, about those from a time not too long ago who could’ve and should’ve earned a doctorate, a medical degree, or a JD, yet the structures of socioeconomic privilege, racism and sexism prevented them from earning these most expert of degrees. Yet, at many an HBCU, in many a segregated classroom, in so many barbershops, we still called them “Doc,” a sign of respect for their abilities, for their experience, for their — dare I say — expertise.

We still do this now, even for people who don’t deserve the nickname “Doc.” My father and my idiot, late ex-stepfather both, at one point in their lives or another, laid claim to being doctors and/or lawyers. For the first two years I knew my then stepfather Maurice, between ’77 and ’79, he carried a monogrammed briefcase, always full of his “important papers,” telling me and anyone else he bumped into on the streets of Mount Vernon, New York that he was a “doctor” or a “lawyer.” When drunk, my father sometimes took it a step further, telling strangers on the Subway that he was a “big-shot doctor an’ a lawyer” on his Friday-evening paydays. Maurice drove a Reliable taxicab during his delusions-of-grandeur years, and my father was a janitorial supervisor.

Given the history of education and our society’s denial of quality education to people of color and the poor in the US, though, I didn’t entirely hold it against them then, and I don’t now. What I do have much bigger problems with are the people who should know better, and yet don’t do any better. Just in my lifetime alone, there have been people with “Dr.” in front of their names without having earned a doctorate or a four-year medical degree. Like “Dr.” Henry Kissinger, “Dr.” Bill Cosby, and of late, “Dr.” Steve Perry (not to be confused with the former lead singer for Journey, I guess). And no, honorary doctorates for giving money to Harvard, Temple, or the University of Massachusetts don’t count! Nor does starting an outline for a dissertation without actually finishing one. Still, they insist on the “Dr.,” even when it’s obvious I could’ve sat on the stoop at 616 East Lincoln Avenue thirty years ago and gotten the same half-baked opinions from one of my hydro-smoking neighbors.

Stock photo of former NYC mayor Rudolph Giuliani, August 2013. (AP/New York Post).

Then again, numbskulls like William Kristol and Newt Gingrich have earned doctorates from Harvard and Tulane University respectively, and Ben Carson was once one of the most respected pediatric neurosurgeons in the world! Yet, for some dumb reason, our media and most Americans think that their opinions are somehow more valuable, more consumable, than those of people who’ve spent decades understanding education, American culture, racial, gender and socioeconomic inequality, and government corruption. Or maybe we just like listening to fake opinions from people with fake degrees and/or fake expertise on subjects about which they know nothing. Because nothing is exactly what Americans want to hear.


Boy @ The Window Origins: Meltzer Conversations

March 14, 2015

X-Men Origins: Wolverine (2009) scene, where Wolverine frees mutants kept as experiments by Colonel William Stryker, March 13, 2015. (http://cdn.collider.com/).

Of all the tangents I took related to writing Boy @ The Window, the most direct path to writing a memoir about the most painful period in my life ran through several conversations with my dear teacher, friend and mentor, the late Harold Meltzer. I’ve discussed bits and pieces of some of those conversations here and in longer form in Boy @ The Window. It’s still worth rehashing some of those conversations, at least in terms of what was and wasn’t good advice, as well as in explaining how some of the main themes of the memoir developed over time.

As I wrote in Boy @ The Window, though my “first interview with him was in August ’02,” the first time “we discussed the possibility of me doing Boy @ The Window went back to February ’95.” Meltzer had been retired from teaching about a year and a half, while I was beginning the heavy lifting phase of my doctoral thesis, “living in DC for a couple of months while hitting the archives and libraries up for dusty information. In need of a writing break, I gave him a call on one cold and boring Saturday afternoon.”

It was in response to a letter he sent congratulating me. I’d recently published an op-ed in my hometown and county newspaper, “Solving African American Identity Crisis.” I was writing about issues like using the n-word, hypermasculinity, and internalized racism in the short and, for me at least, dumbed-down piece. Somehow our discussion of that piece led to a discussion of my classmate Sam. Did I really want to spend an hour and a half talking with Meltzer about Sam and some of my other Humanities classmates and their possible identity issues, considering some of my own serious growing pains — the Hebrew-Israelite years, my suicide attempt, my Black masculinity and manhood issues? Absolutely not!

But I learned quite a bit about how I might want to approach writing Boy @ The Window through that phone call. Not because Meltzer had given me any sage advice, which he didn’t, or because he revealed things to me that I shouldn’t have come to learn during our conversation, which he definitely did.

Benetton ad, 1980s, January 2013. (http://fashionfollower.com/).

No, it was the idea that a lot of the things that I had pursued as a historian and researcher were things that came out of my experiences growing up. Multiculturalism as a historical phenomenon (at least if one linked it to cultural pluralism)? Can anyone say Humanities Program, or, what I used to call “Benetton Group” when we were at A.B. Davis Middle School? Writing about African American identity issues? Obviously related to living in Mount Vernon, the land where any hint of weakness translated into me being called a “faggot” or a “pussy.”

And what about any scholarly concerns with racial and socioeconomic inequality and Black migration? Anyone ever meet my Mom and my father Jimme, 1960s-era migrants from Arkansas and Georgia/Florida respectively? An examination of the Black Washingtonian elite and their looking down upon ordinary Blacks because of their own colorism or the latter’s lack of education? Come on down, Estelle Abel and any number of well-established Black Mount Vernon-ites who never gave me the time of day! As much as academia had been an escape for me, into a world of rationalism and logic, a place of dispassionate scholarship, it was all personal for me, without realizing it until that phone conversation with Meltzer.

Fast-forward to November ’02, the last interview I did with Meltzer before his death two months later. We spent the last couple of hours on that brisk fall Thursday discussing the book idea that would become Boy @ The Window. Meltzer thought that it should be a work of fiction, “based on the real flesh and blood folks in my life, but with different names of course to protect me from any potential lawsuits. He did make me rethink the project from a simple research study of my high school years into narrative nonfiction or a memoir.” 

Screen shot of fictional character Harper Stewart’s bestselling novel Unfinished Business, from The Best Man (1999), March 14, 2015. (hitchdied via http://s785.photobucket.com/).

Was Meltzer correct? Should I have done a Harper Stewart — played by actor Taye Diggs in The Best Man (1999)? Should I have fictionalized all of my experiences and those of my family, teachers, administrators and classmates? I’m not sure if it would’ve made a difference. Stories of fiction tend to have a tight symmetry to them. Or, the theme of “what goes around comes around” is usually a big one in any novel. You can’t leave too many loose threads or unresolved issues, even if the novel is part of a series. For my purposes, since my life remains a work in progress, a story of relative — not obvious or absolute — success, telling it as fiction would hardly ring true to me, much less to any group of readers.

Whatever else anyone wants to say about the late Harold Meltzer, the dude got me to think about difficult things until I was no longer comfortable in leaving my uncomfortable experiences and assumptions unchallenged. The very definition of a mentor, the very purpose of Boy @ The Window.


On Kicking My Damsel-in-Distress Syndrome

March 7, 2015

Chivalry with a suit blazer, March 7, 2015. (http://genius.com/).

This week marked thirty-three years since the fight that led to a crush that led to me falling in love for the first time, via a ballerina in training. The three-month period between March and June ’82 shaped how I dealt with teenage girls and women between the time I turned twelve and my mid-thirties. The crush on “Ballerina Wendy” and its mutation because of my stepfather’s knocking out my Mom in front of me helped shape my feminism, my womanism and my sexist damsel-in-distress syndrome.

Wonder Woman, October 30, 2012. (http://tvequals.com).

It was the beginning of my damsel-in-distress syndrome. Though it was triggered by the Memorial Day incident, my damsel-in-distress syndrome had been latent for years. I was in fact a mama’s boy, tempered by living at 616 and in Mount Vernon. I’d always been enamored by strong, athletic women (or at least, actresses with that role), going back to Lynda Carter as Wonder Woman. Yet I’d also been surrounded by sexism and misogyny, from my father calling my Mom a “Black bit'” since I was four to my stepfather’s constant quoting of the Torah to justify his laying of violent hands and feet on my Mom.

What I did in response was to help my Mom in every way I could, and in ways I never should’ve. Calling up Con Ed and Ma Bell to pay the electric and telephone bills. Listening to years of conversations about her failed marriages, about my father’s alcoholic failings, about her bills, about the burdens we as her children had put on her. Washing clothes for the house every weekend from October ’82 through August ’87 and anytime I was home for the summer and for the holidays once I went off to college. Going to the store as many as five times in a single afternoon and evening because my Mom forgot that she needed diapers or cigarettes. Hunting my father down for money even on weekends I didn’t want to be bothered because we were out of food for my younger siblings. Taking a fist-filled beating here or there from my stepfather to take the pressure off of my Mom. Promising my Mom that after I finished my degree, I’d come back to New York to work and help her out financially.

Atlas supports the terrestrial globe on a building in Collins Street, Melbourne, Australia, October 9, 2006. (Biatch via Wikipedia). Released to the public domain.

On that last promise especially, I reneged. I changed my major from computer science to history, and decided to stay at Pitt, to go to graduate school, to earn a PhD, to start writing, both in the academic world and a bit as a freelancer, to teach for a living. It was the basis, I think, for her falling out with me in ’97, and why our relationship remains limited.

My Mom was hardly the only woman in my life whom I wanted to assist. Some of my Pitt friends can certainly attest to the fact that I was there to help, sometimes too often. To the point where, once I realized I was overburdened or that the other person had become too reliant on me, it pretty much killed that friendship. Either way, I was angry, and sometimes felt used, while some of my Pitt friends were either confused or angry themselves.

I’ve had to learn over the years to say no, even to my wife, when I realized that one too many logs on the fire would actually put that fire out. It started with everything high-tech. Every computer glitch, every printer error, every Internet issue, and I was there like Clark Kent, ready to help. But by the time I hit thirty-five, I was just too tired and felt too burdened to be that “on” all the time. I finally stopped helping my wife with her tech issues. I stopped offering to help, and have only interjected when the issue actually affects all of our equipment.

The irony is, my wife is a stronger person than my Mom, stronger in many ways than how I perceived Wendy as a person so many years ago. It’s not as if my wife doesn’t need or appreciate the help. But, as I’ve learned over the years, too, sometimes, help is just emotional support, a hug or a joke. Or, when I’m ready to, simply listening without feeling the need to use a quadratic equation to solve the problem.

American Ballet Theater soloist Misty Copeland in a promotional photo (cropped) via her Under Armour ad deal, January 30, 2014. (Under Armour via Huffington Post).

Damsel-in-distress syndrome, as chivalrous as it is, can also be extremely sexist, for women and men, girls and boys alike. It means constantly attempting to help people who may or may not want your help, even in cases where it is clear they need it. It means taking on emotional and psychological burdens that should otherwise belong only to the person you’re trying to support. It means, sadly, offering advice, answers and solutions that may not be answers or solutions at all.

The Memorial Day ’82 incident with my mother changed what was an otherwise innocent crush and love into something contradictory, even as it became more meaningful. It made me appreciate women who could and can kick some ass, whose strength would be obvious to all. And it made me think women who weren’t like that — women like my Mom — needed constant help from people like me. Wendy defended herself thirty-three years ago. My Mom tried and couldn’t. Life and strength for us, male and female and transgender, though, have never been that simple. And though I have saved quite a few damsels in distress over the years, it isn’t my eternal burden to carry.


ETS Using Test Results To Justify Its Test-Filled Vision

March 4, 2015

America’s Skills Challenge: Millennials and the Future (cover), February 17, 2015. (ETS).

I actually like the Educational Testing Service (ETS). I’ve done work for them as a consultant and as an AP Reader over the years. I enjoyed most of my testing experiences with them, especially the AP US History Exam of 1986. I like many of the conferences that they host and sponsor, and they beat almost everyone with the spreads of food they provide at their events. Yet even with all that, ETS’ agenda is one of promoting the ideal of a meritocratic society through a repressive regime of testing, one whose results show beyond a shadow of a doubt the socioeconomic determinism of standardized assessments. Or, in plain English, tests that favor the life advantages of the middle class and affluent over the poor, Whites and assimilated East Asians over Blacks, Latinos, and only partially assimilated immigrants of color.

Such is the case with a nearly unreported new report from ETS. They had scheduled a press event to release “America’s Skills Challenge: Millennials and the Future” on Tuesday, February 17th at the National Press Club in Washington, DC. The organizers postponed the event, though, because of the phantom snow storm that turned out to be a typical snow shower. So I didn’t get to ask my preliminary questions about the findings of researchers Madeline J. Goodman, Anita M. Sands, and Richard J. Coley that, despite the educational gains of the generation born after 1980, its members sorely lack the skills they need for life and work in the twenty-first century. My questions? How could anyone have expected millennials to develop independent thinking, critical thinking, innovative thinking, writing and other analytical skills if they spend precious little time in their education actually doing any of these things? How would the constant barrage of high-stakes tests from kindergarten to twelfth grade have been able to instill in students ways to think outside the box, to look at issues with more than one perspective, to stand in opposition to policies based on evidence, and not just based on their gut or something they picked up from a test?

Mass of students taking high-stakes test, September 4, 2014. (http://newrepublic.com via Shutterstock).

Well, the report is worse than I thought. Goodman, Sands and Coley put together an argument that makes circular reasoning look like a Thomas the Tank Engine episode. The authors produced this first in a series of reports for ETS, relying solely on “data from the Programme for the International Assessment of Adult Competencies (PIAAC).” The PIAAC, developed by the Organisation for Economic Co-operation and Development (OECD), is a survey that assesses the skill levels of a broad spectrum of people between the ages of sixteen and sixty-five, the primary working population in most developed countries (meaning the US and Canada, the EU, the Baltic states, Australia, Japan and South Korea). ETS and the authors claim that this survey instrument is better at assessing how far behind millennials in particular are when compared to “their international peers in literacy, numeracy, and problem solving in technology-rich environments (PS-TRE)” than the international testing of high school students alone. And as such, the authors concluded that

PIAAC results for the United States depict a nation burdened by contradictions. While the U.S. is the wealthiest nation among the OECD countries, it is also among the most economically unequal. A nation that spends more per student on primary through tertiary education than any other OECD nation systematically scores low on domestic and international assessments of skills. A nation ostensibly based on the principles of meritocracy ranks among the highest in terms of the link between social background and skill level. And a nation with some of the most prestigious institutions of higher learning in the world houses a college-educated population that scores among the lowest of the participating OECD nations in literacy and numeracy.

I don’t know about anyone else who reads my blog, but I find these conclusions smack of so much hypocrisy that they’re stomach-ache-inducing. Really? Years of promoting testing at every level of K-12 education, everything from state and district-level assessments to PARCC and Smarter Balanced Assessments, and it’s only because of growing economic inequality that US students-turned-adults don’t score well in the super-advanced, highly skilled categories? Not to mention the SAT, AP exams, GREs, LSATs, GMATs, MCATs, Praxis I, Praxis II, and so many other ETS exams that it would cause the average psychometrician’s head to explode? Seriously?

Terrier dog chasing its own tail, March 3, 2015. (http://webmd.com).

This is yet another case of the dog chasing its own tail. A case where the $3-billion-per-year nonprofit just outside Princeton, New Jersey is sounding a clarion call about a crisis that it helped create. Not the crisis of rapidly rising inequality, though ETS’ promotion of a false meritocracy through constant testing has served to lull affluent America into an intellectual coma. Rather, the crisis created by cutting history and social studies, literature and art, theater and music classes, from kindergarten really all the way through a bachelor’s degree program.

In the promotion of testing as the way to address achievement gaps, to deal with the so-called education crisis, so much of what was good about K-12 and even higher education has fallen away. Reading for the sake of reading and learning has drifted away, with more English and less literature in schools and at many colleges and universities than ever. Want to teach someone how to express themselves in writing, to express their numeracy in proofs? That thinking runs counter to what goes on in the Common Core school systems of 2015, meaning most people will either never develop these skills or, if lucky, might develop them somewhere between their junior year of college and finishing a master’s degree or doctorate. We emphasize STEM fields with billions of STEM dollars without realizing that great STEM is much more than equations and formulas. It’s also imagination, applying the ability to break down pictures, ideas, words and sentences contextually to the world of numbers and algorithms.

And don’t give me this whole “the SAT now has an essay section on it” spiel! Fact is, everyone knows that expressing their words on paper, on a screen or in speech is critical in modern societies. After almost seven decades of testing, ETS figured this out, too? What they haven’t figured out yet, though, is how to make standardized high-stakes testing a necessity for the entire working adult population in the US. Believe me, that’s where they want to head next.


Malcolm X, “Make It Plain”

February 21, 2015

Plain Conscious Chocolate, February 21, 2015. (http://www.ethical-treats.co.uk/).

I’d be a terrible historian not to comment on the fact that today marks fifty years since some Nation of Islam malcontents — with support from J. Edgar Hoover’s FBI — murdered Malcolm X at the Audubon Ballroom (now the Shabazz Center) in Washington Heights in Upper Manhattan. I wasn’t around for the event, or any of the tumultuous events that defined “The ’60s.” All I know is that I didn’t learn about Malcolm Little or Malcolm X until the summer between my undergraduate and graduate years at Pitt, the summer of ’91. Although the year before, I’d gone to a Malcolm X birthday celebration at the Homewood-Brushton branch of Carnegie Library of Pittsburgh. There, I saw poets performing their work, got to listen to some good jazz and rap, and saw the Afrocentric set out in full force.

Audubon Ballroom, where Malcolm X was murdered (now the Shabazz Center, with the Columbia University Medical Center’s Mary Woodard Lasker Biomedical Research Building in the background), Washington Heights, New York, June 4, 2014. (Beyond My Ken via Wikipedia). Released under the GNU Free Documentation License, Version 1.2.

You’d think after three years as a Hebrew-Israelite and years around children of Nation of Islam members as a kid that I would’ve heard all about Malcolm. Nope, hardly a peep about him growing up in Mount Vernon. Mostly, I got questions like “Yo, you a ‘five percenter’?” which for me translated into being among the chosen few living in the midst of the end times. Other than that, there was always the dichotomy trope of Martin versus Malcolm, laid on us real thick through school and the newspapers. Dr. King was respectable, nonviolent, a true representative of the race. Malcolm was a street thug, a leading member of a heathen religion, a violent man who hated White people.

My Mom, who normally rejected mainstream White ways of thinking about Black folks, had bought this trope and tried to sell it to me and my older brother growing up. But as with so many things my Mom attempted to instill in me growing up, I wouldn’t make any decisions about Malcolm the person (as opposed to the icon) until I got around to reading, in this case about him and the Nation of Islam, as an adult.

The Five Percenter logo (apparently popular among the rapper set), January 8, 2013. (http://assets.vice.com)

The one thing I realized after reading the Afrocentric, mainstream and Alex Haley interpretations of Malcolm in the early ’90s is that just like with King, we could make Malcolm X represent whatever we wanted. He could be nonviolent and a militant at the same time, or a thug and an ambassador of peace at the same time. Yes, as the late Manning Marable’s book shows, Malcolm — like most of us — was a walking, breathing contradiction of convictions (literal and figurative) and beliefs. For the purposes of my post today, though, he was a social justice activist, acting on the part of those poor, Black and discarded, plain and simple.

Which is why I think anyone who thinks Malcolm X brought murder to his own pulpit in February 1965 is an idiot. The idea that teaching others self-defense in opposition to White mobs, lynching, and blatant police brutality deserved a violent death. Really, now? So, if that’s the case, then Dr. King should have died of natural causes about three or four years ago, since his was the path of nonviolence, right? Yet, you still hear the likes of Rudy Giuliani, Bill O’Reilly and Geraldo de Stupido slinging this shit (and similar crap playing on such respectability politics themes) as if it were McDonald’s hash browns on sale for half-price.

Manning Marable’s Malcolm X: A Life of Reinvention (2011) cover (Marable died four days before his last book dropped), May 28, 2012. (Malik Shabazz via Wikipedia). Qualifies as fair use under copyright laws (relevant subject matter, low resolution).

Speaking of that lot, I don’t wonder what Malcolm X would say about our racist, plutocratic democracy these days. Anyone who’s read his words would know what he’d say. That what happened to Michael Brown and Eric Garner and Renisha McBride and so many others should be resisted “by any means necessary.” That we should unmask those powerful people lurking in the shadows but pulling the strings that keep the systems of oppression working 24/7 in our world. He would’ve supported Occupy Wall Street in 2011, when and where few Black leaders did, called Islamic State or IS (that’s what they are called outside the US, where we can’t get our acronyms straight) a “chickens coming home to roost” scenario, and put Tavis Smiley and Cornel West in the same self-aggrandizing bag as Giuliani and Rivera.

I get why it took Malcolm Little so long to transform himself into Malcolm X, and even then, until after his thirty-ninth birthday, to find himself and his purpose in the world. It’s taken me nearly four and a half decades to do the same. It’s hard to “make it plain,” especially to ourselves. It’s scary to be in a constant state of disillusionment, about family and friends, about your identity, about your religion and beliefs. But it also allows you to see yourself and everyone around you fresh for the first time, to know who people really are.


Carnegie Library of Pittsburgh and My Own Prison

February 16, 2015

East Library branch of Carnegie Library of Pittsburgh, before (the version I worked in) and after renovation, October 4, 2006 and September 25, 2011. (http://popcitymedia.com and http://eastliberty.org).

On February 17th seventeen years ago, we opened one of the first community-based computer labs in the US at the East Liberty branch of the Carnegie Library of Pittsburgh. What was once known as the Microsoft Library Fund (which later became the Gates Library Foundation, and then became part of the Bill & Melinda Gates Foundation) had provided the initial $110,000 to place this computer lab in one of the East Library branch’s resource rooms. I guess it could’ve been a proud moment for me, if I hadn’t earned my PhD the year before, only to face unemployment for three months during the summer of ’97 and underemployment in the five months since taking the Carnegie Library job. But this was a humiliating moment, not one of pride or, at least, taking comfort in a job done well. It was a learning moment at a time when I thought I already knew what I needed to move forward with my career and life.

The dissertation process, my battles with Joe Trotter, the truth about my relationship with my Mom, had all taken a heavy toll on my heart and mind by the time Memorial Day ’97 rolled around. So much so that I lived between moments of humility (which is different from humiliation) and moments of rage in the sixteen months between May ’97 and the fall of ’98. I was living on fumes from my last Carnegie Mellon paycheck when I began working for Carnegie Library the day after Labor Day that year. I’d been conditioned, though, to think that everything happens for a reason. So I assumed that God was attempting to teach me a lesson, that I needed to give more, even out of my own need, in order for the things I thought I deserved to come my way.

John Wooden saying on being humble, February 16, 2015. (https://pbs.twimg.com).

There was a bit of a flaw in my logic around God’s lessons. For one, the idea that I wasn’t finding work in academia because I hadn’t been a giver was ridiculous. Between volunteering for soup kitchens, tutoring high school students, tithing at church, and so many other things, it was dumb to think that not enough humility was the reason I didn’t get the job at Teachers College or had trouble finding adjunct work in the fall of ’97. Or rather, it was dumb not to think that bigger issues — like my dissertation committee abandoning me when I needed them the most — played a greater role in my not finding full-time work in my chosen profession than any inability to serve others.

The Carnegie Library job provided a part-time stop-gap for my income while I attempted to figure out how to move forward without my advisor and my committee and move on with the knowledge that my relationship with my Mom would never be the same. I figured that the job gave me the opportunity to help others and to do good, and that it was a good first foray into the nonprofit world, especially with money from the world of Microsoft.

Boy, I couldn’t have been more wrong! I had a co-worker who was jealous of my degree and attempted at every turn to undermine the work of putting together the lab and the class materials for teaching patrons how to use the computers. I figured out that the bosses at the central branch in Oakland had essentially pocketed some of the funding for the lab to cover the costs of new computers for their own personal use, and had underfunded both my position and my co-worker’s position as part of the grant.

Album cover for Creed’s My Own Prison (includes title track), released August 26, 1997. (Jasper the Friendly Punk via Wikipedia). Qualifies as fair use under US copyright laws to illustrate title and theme of this blog post.

But I didn’t learn all of this until June. By February ’98, I began to realize that, more than anything else, I needed to free myself from my own prison of an idea: that I’d done something wrong or sinful to end up running a computer lab project at twenty-eight when I had done much of this same work at nineteen years old. I had to begin to find prominent people in my field(s) to support me in finding work, even if none of them were on my dissertation committee. I still needed to apply for academic jobs, even if my status meant that some would reject me because of my issues with my advisor. I even needed to explore the idea of jobs outside academia, in the nonprofit and foundation worlds, where my degrees and my ideas about education policy and equity might still matter.

It definitely helped when Duquesne hired me in April to teach graduate-level education foundations courses in History of American Education and Multicultural Education. It helped even more, though, when I decided in August to quit the Carnegie Library job. Between the Microsoft folks and the sycophants at Carnegie Library who were willing to do almost anything for a few extra dollars — anything other than serve their neighborhoods, that is — I’d had enough of duplicitous people. Who knew that my first job with sycophants and Gates money would come back to haunt me in the seventeen years since!

