Mark Sussman

Writer - Researcher - Teacher

Perhaps Overly Detailed Statement Regarding the Definitions, Effects, and Institutional Mores of Plagiarism

I’ve been working on a doc for my undergrads in Intro to Literary Studies and Intro to Writing about Lit that will explain why plagiarism is a big deal, why you shouldn’t do it, and why your teachers sometimes lose their marbles when they suspect you of it. There’s a bit at the end that goes beyond the usual tautological reasoning (“Don’t plagiarize because it’s wrong”) and gestures to the ways in which plagiarism affects teacher-student collaboration. Here’s a draft.
— — —

Perhaps Overly Detailed Statement Regarding the Definitions, Effects, and Institutional Mores of Plagiarism

by Mark Sussman

 

In this class, and likely in every class you will take at Hunter, you are expected to submit work that is wholly your own. You are also expected to demonstrate that you have mastered the material at hand, which means you will often be quoting and paraphrasing the work of experts. So, turn in work that is 100% original, but make sure that original work borrows from the work of other people. Hmmmmmm …

This seeming contradiction can make the rules of plagiarism and academic integrity sound confusing, if not downright impossible to follow. It can also obscure the rather complicated reasons plagiarism is treated so seriously, despite the myriad ways in which social media has made sharing, reposting, regramming, retweeting, and other forms of appropriation acceptable and normal. But I am going to try to explain things as clearly as I can.

 

What is plagiarism?

The simplest definition of plagiarism is appropriating someone else’s writing or ideas without attributing them to the original author. The effect is to make it seem as though you are the originator of what are, in reality, someone else’s words or ideas. So, for example, if I write, “Othello shows us that, as T.S. Eliot wrote, ‘[N]othing dies harder than the desire to think well of oneself’” (244), I have attributed the quote and idea to their author and cited the source. Everything is fine. But if I write, “Othello shows us that nothing dies harder than the desire to think well of oneself,” I have committed plagiarism, because I have taken Eliot’s words and passed them off as my own.

 

What is originality?

When you hear your professors (at least your English professors) say they want you to produce “original” work, they mean “original” in a very specific sense. They mean that you should produce a piece of writing and analysis whose argument and thesis statement are the product of your own research, writing, and thought. All the writing in your essay should support that thesis statement and argument, which are original in the sense that you formulated them yourself after examining and analyzing the evidence at hand (the text, other scholars, etc.). They don’t mean that every word or idea in your essay has to be yours. Learning about what others have thought and said about the texts you study is a crucial part of writing about them in an informed manner. You are expected to read, cite, and quote from outside sources in order to learn what other writers and thinkers have said about them.

But your professors do ask that when you use someone else’s words or ideas, you give credit to the original source by using a standard system of citation (like MLA). At the undergraduate level, they don’t even ask that you argue something that no one has ever argued. They only ask that you come up with the argument on your own — if someone somewhere happens to have had the same thought and you don’t know about it, that is understandable in most cases. You’re all still learning how to do this; no one expects you to have comprehensive knowledge of your subject.

So essentially, all of the rules surrounding citation, attribution, and plagiarism are there to prevent you from doing one thing: taking credit for other people’s work, whether accidentally or purposefully. The reason style guidelines like MLA, APA, and Chicago are so intricate and infuriating, and the reason your professors get all worked up about them, is that they are central to making sure credit is given to the people who earned it. Professional scholars dedicate their lives to producing new knowledge about the world, and it matters that they receive credit for their work.

 

Ok, but why is that important?

You may ask what difference this credit makes in the context of a college class. You’re not trying to “steal credit” for writing or ideas in a professional context, like a journalist who passes off someone else’s reporting as his own. By borrowing an elegant formulation or a slick analysis from someone else, you’re only trying to create a better essay, which is, after all, what your professor told you to do. So no harm, no foul.

No. That attitude misconstrues why your professors think citation and giving credit are so important. The reason they furrow their brows when you misplace a comma in your works cited, and get unreasonably upset and prosecutorial when you borrow a few sentences from a website, is that they are trying to train you to think of citation as a matter of ethics, a matter of fairness and rightness. Failing to give proper credit in the proper way is, in the context of academic institutions, wrong in the same way that stealing money from your neighbor is wrong. In that sense, not citing a source is a categorically different error from, say, writing “its” when you mean “it’s” or messing up a plot point in Othello. From their perspective, failing to credit your sources looks like a failure of character.

 

This doesn’t sound like we’re talking about writing anymore … 

Like it or not, your English professors are trying to train you not only to be a certain kind of writer and thinker, but to be a certain kind of person, the kind of person who doesn’t steal from their academic neighbor and who looks down on anyone who would. Your professors will not really say this to you because, frankly, the idea that we’re trying to impose our own morals and character on you really weirds most of us out. The reasons for this are complicated, and I’m happy to go into them later. But trust me, it’s true. They want you to experience moral revulsion at the very suggestion of not citing your sources, just like they do. And when you don’t give credit where it’s due, your professor starts to ask themselves whether something is going on. They start to ask themselves if they have a thief on their hands. Not a “rule-breaker,” but a thief.

 

That sounds harsh.

It is. In my experience and that of most of my teacher friends, most plagiarism is accidental. Some plagiarism is intentional, but done out of desperation, fear, and anxiety. A very, very small amount of plagiarism is done in a calculating, sneaky, underhanded way. The problem is, all of those kinds of plagiarism look the same when you find them. When you’re confronted with a paper that contains plagiarism, you don’t know if you’re dealing with a) someone who simply doesn’t know the rules and has accidentally broken them, b) someone who is having real problems in the class, and perhaps in life, that can be addressed in an honest conversation, or c) a total sociopath.

At that point a wall of suspicion imposes itself between teacher and student. The suspected plagiarist’s behavior is dissected, his or her papers are examined with a fine-tooth comb, and a perceptible chill hovers over the teacher’s dealings with the student. Everything the student says and does is colored by the possibility that it might all be part of some elaborate con (English professors tend to be suspicious — it’s actually part of their training). You never really know if you’re dealing with an honest mistake or an attempt to deceive and manipulate.

So plagiarism is about your teachers’ feelings?

Yeah, kinda. There are reasons why plagiarism is a crucial issue for professional scholars, and why scholars and journalists who have been found to plagiarize in published work are essentially kicked out of the profession and shunned. Again, I’m happy to discuss that later. But in the context of the classroom, even the appearance of plagiarism, never mind flagrant, sociopathic theft, can fracture the one-on-one communication that’s necessary for teachers to really improve their students’ writing and work. You will simply learn more if there is a one-on-one component in your courses, and that is almost impossible to have when your teacher is constantly asking themselves if the sentences they are reading are yours at all. So if the appeal to ethics doesn’t do it for you, consider the quality of instruction you would like to get for the ever-increasing tuition you pay.

 

So let’s say you think I’m plagiarizing. What happens?

What happens is I call you into my office and point to what I think are instances of plagiarism. I ask whether you admit that it is plagiarism or whether you have some explanation for why it looks as though plagiarism is present. Then I refer the matter to Hunter College’s Academic Integrity Official, who will initiate a process that could end in a warning, expulsion from Hunter, or anything in between, depending on the severity of the offense. You can either officially admit to the accusation or contest it, in which case a hearing of sorts will be held to determine what will happen. You can read all about this on Hunter’s Academic Integrity website.

 

Ok. Got it. Don’t plagiarize. But I’m worried that I might accidentally plagiarize. How do I not do that?

  1. Keep track of your sources. You will probably accumulate many sources you would like to quote from. As you start incorporating quotations, and especially as you start paraphrasing, it will become surprisingly easy to lose track of what you thought of and wrote and what someone else thought of and wrote. Keep a doc that has only the material you’re getting from elsewhere and the citation information for that material so you can double-check.
  2. Cite as you go. Do not tell yourself you’ll insert in-text citations later because you’re on a roll and don’t want to stop writing to check a page number. Take a second to do it as you’re writing, or you may forget.
  3. “Borrowing” language from a website without attribution is plagiarism. Taking language from any source (including a website) and changing around a few of the words to make it look slightly different but not citing it is most definitely plagiarism. It’s tempting, but don’t do it. It’s very easy to spot.
  4. Err on the side of caution. If you’re not sure if you should cite something or not, cite it. I’ll let you know if it’s something you don’t need to cite.
  5. If you have questions about how or whether to cite, ask me. I promise I will not be mad. In fact, I will be happy that you are taking these issues so seriously!

 

Butler, Speech, and the Campus

[Note: a slightly expanded version of this post is up at Souciant, titled “Looking for Judith Butler.” I’m keeping the post as-is for posterity’s sake.]

I really enjoyed Molly Fischer’s piece about Judith Butler for New York, but I think it misses something significant about Butler’s ongoing relevance. The piece ends with the suggestion that discourse about gender has moved beyond the performative theories Butler expounded in Gender Trouble. Paragraphs like this one convey the idea that Butler has triumphed, but also that she has been surpassed:

Isaac belongs to a generation for whom Butler is part of the canon. Today, it is possible to go online and read Judith Butler’s theory of gender performativity as explained with cats. There are Facebook pages like “Judith Butler Is My Homegirl.” Quotes from Gender Trouble are reliably reblogged on Tumblr. And yet, Maria Trumpler, director of Yale’s Office of LGBTQ Resources and a professor of women’s, gender, and sexuality studies, says that for the kids she sees at Yale today, 40 years after Butler was an undergraduate there, Gender Trouble is “really old-fashioned.” The last four years in particular have seen an enormous growth of student interest in identities “beyond the binary,” Trumpler says, like agender, bigender, genderqueer.

Fair enough. But Butler remains wildly relevant on college campuses, particularly for undergraduates. Nathan Heller’s recent piece for the New Yorker and reports about campus protests make it clear that it’s Butler’s work on speech (in Excitable Speech) and assembly (in Notes Toward a Performative Theory of Assembly) that has the most relevance to campus life right now. In fact, I would say that, from the perspective of the present, Butler’s work as a theorist of gender looks like a special case of her broader work as a theorist of speech. It is difficult for me to read accounts of students calling the speech they hear on campus “violence” without thinking of Butler’s work after Gender Trouble.

Here, for example, is a passage from the introduction to Excitable Speech:

Understanding performativity as a renewable action without clear origin or end suggests that speech is finally constrained neither by its specific speaker nor its originating context. Not only defined by social context, such speech is also marked by its capacity to break with context. Thus, performativity has its own social temporality in which it remains enabled precisely by the contexts from which it breaks. This ambivalent structure at the heart of performativity implies that, within political discourse, the very terms of resistance and insurgency are spawned in part by the powers they oppose (which is not to say that the latter are reducible to the former or always already coopted by them in advance).

In other words, Butler is saying that when you “resist” dominant social forces by construing their hate speech (like racial slurs) as violence, you are actually participating in validating a model of language that can work against you as well. Butler uses the example of arguments about pornography, but we could just as easily look at arguments against gay marriage. We may scoff at a straight, married couple who says their religious rights are being infringed upon when two people of the same gender get married. But what they’re saying is that the political act that legitimizes gay marriages changes the terms of the institution of marriage without their consent, and so does injury to them in the same way that a slur or hate speech does injury.

Performativity, though it is often thought of as a tool of insurgent political analysis, has no political allegiances. I think this is the push-pull we see on campuses now, with some campus activists calling for protections from what they see as hate speech and others saying that such protections constitute a restriction on free speech, and thus a form of injury. Butler has spent a long time describing and theorizing this sort of structure, where, as she puts it, “language constitutes the subject in part through foreclosure, a kind of unofficial censorship or primary restriction in speech that constitutes the possibility of agency in speech.” In other words, what we think of as a freedom of speech, with all of the privileges of expression that implies, is only enabled by a tacit agreement not to speak about certain things or in certain ways.

Right now the nature of those certain things and certain ways is becoming more and more uncertain. The limits of speech are being tested on both the left and the right. They are tested on the left by campus activists who demand institutional protection from forms of speech they consider to be violence, and who seek the power to punish people for certain kinds of hurtful language. Though Butler’s writings do not endorse those sorts of punitive measures (at least none that I can see; I’m not a Butler expert), it seems clear to me that the dissemination of her ideas has influenced these activists. From the right, those same forms of hurtful speech are becoming part of the political lingua franca. Utterances that would otherwise be called hate speech are drawn into a zone of acceptance that protects them from any plausible claim that they constitute a form of violence. Butler’s ideas, far from approaching comfortable retirement, need to be engaged now more than ever.

Criticism in Doubt: A.O. Scott’s Better Living Through Criticism

It’s not clear why, exactly, A.O. Scott wrote Better Living Through Criticism in the first place. It seems like the sort of thing the lead film reviewer at the New York Times ought to do, I guess. But, as Leon Wieseltier’s dead-on and damning review makes clear, Scott doesn’t have any particular critical position, never mind a thesis, to defend. The book’s subtitle, How to Think about Art, Pleasure, Beauty, and Truth, turns out to be a joke: of course no one can tell you these things, or at least Scott doesn’t seem to think one should. He’s perfectly comfortable praising or trashing an individual film, as you can see in his reviews on any given day in the Times, but when it comes to stating his working principles as a critic, he retreats.

Scott finds that two major critical positions, elitism and populism, turn out to be the same: “The idea of critical authority and the ideal of common knowledge are not in competition, but are rather the antithetical expressions of a single impulse toward comprehensive judgment, toward an integral aesthetic experience, the achievement of which would eliminate the need for critics altogether.” This sounds nice, but its strategy is to nullify the difference between meaningful critical attitudes. And this discomfort with the disagreements that occur when critics take concrete positions on art they are willing to defend becomes a problem for the rest of Scott’s book.

Better Living Through Criticism constantly performs this anxiety over disagreement, and particularly over being on the wrong side of it. Over time, Scott tells us, “You are guaranteed to be wrong”: proven wrong by changing tastes, which come to laud the movie you trashed in print; proven wrong by history, which now views that delightful comedy you reviewed favorably as a paragon of fascist cinema. Your opinion fossilizes, turns to dust, and so on. Times change, tastes change, and critics can only accept it. That’s good practical advice for a working critic, but terrible advice if you happen to be writing a book about the actual practice of criticism. What the reader wants from a book like Scott’s, what its subtitle seems to view as an ironic impossibility, is a book of ideas and methods that have the chutzpah to claim critical authority for themselves, to tell you why you should believe them, and to anticipate and eviscerate any argument that says it ain’t so.

But it’s almost as though Scott feels he doesn’t have the authority to make such claims — then why write the book? Several sections are written as dialogs between Scott and some interlocutor (modeled, Scott tells us, on David Foster Wallace’s “Brief Interviews with Hideous Men”), presumably his own nagging voice of self-criticism and doubt. In them he is forever offering qualifications and revisions, telling us, “Of course, we’re all determined beings, made by circumstances beyond our control. But we’re also changeable creatures, highly susceptible to the influence of accident, free agents with the power to invent ourselves.” The formation of critical judgment is equal parts nature and nurture, sure. But surely a book like this ought to offer some thoughts as to how the practice of criticism can identify and, to some extent, ground our unknowing suspension between the tastes we absorb by osmosis and those we cultivate, to make conscious and explicit the unacknowledged and implicit forms we struggle to see in the objects before us and which exert power over our thoughts and beliefs. Scott’s book opts out of the difficult work of hewing knowledge from uncertainty. Criticism begins in doubt, but the point is to overcome it, not enshrine it.

 

David Bowie, the Language of the Tribe, Weirdness, and so on

There have been a couple of pieces written about David Bowie and what he meant to the “weird kids.” Here are some vaguely continuous thoughts I’ve had over the last 24 hours as a former “weird kid.”

1. Bowie was the ur-weird kid transformed into something larger than life. Ziggy Stardust was the theater nerd as messiah, the sci-fi geek as rock star, the choir dork as diva.

2. One of Bowie’s qualities that I think made weird kids latch onto him was a sincere lack of belief in authenticity. This was seemingly instinctual rather than intellectual, felt rather than theorized. He didn’t believe that “authenticity” was a real thing, and that wasn’t just some postmodern line.

3. Authenticity is a problem for weird kids. You aren’t part of any clique or group. The experience of being a weird kid is one of constantly trying to fit into some group and knowing full well you don’t belong there. And they, the group members who do belong in the group, know it too. And the more you try to pretend you do belong there, the worse it gets. You speak nervously and try to adopt the language of the tribe, but it doesn’t take. There’s something wrong with the way you’re dressed, with the way you talk — your self-consciousness gives away the fact that you are trying to fit in instead of just fitting in.

4. When you try to fit in and fail, you are exposed as inauthentic, as a faker, as someone trying to deceive their way into friendship, human contact, something. You aren’t really a jock — you’re not good at sports, don’t know anything about them. You aren’t a skater — you don’t even own a skateboard. You aren’t a stoner — you’re too scared to smoke weed! You insist that you belong, but this is a desperate lie, and a transparent one. Telling it feels really bad, but not because you’re being dishonest. It’s because you’d rather lie than face the social wasteland, which is where everyone knows you truly belong. And that is pathetic.

5. I remember being 12 or 13 and hearing an interview with Bowie where he used the word “dilettante” to refer to himself. He sounded ironic; I think he was laughing or smiling when he said it. I had to go look it up, and after I did, I remember feeling bad for him, because he had been found out too. They knew he wasn’t authentic, that he didn’t belong. He was just a dabbler. He was cast out. I listened to his music obsessively all through middle school and high school.

6. It took me longer than it should have to understand that Bowie was laughing about the word “dilettante” itself. It implies lack of commitment, dabbling, and so on. It’s the sort of word specialists throw in the faces of curious generalists when they feel like their enclaves are being invaded. To be made to feel like a dilettante in a room full of specialists is to be reminded of your inauthenticity.

7. But Bowie seemed to feel no such pressure to “commit” to one thing or another, to one style or another. He pursued an idea until he had exhausted it. He seemed to feel no compulsion to continue to lug the exhausted idea around. He shed it once it was complete. He wasn’t the idea; the idea wasn’t him.

8. If you watch the BBC documentary Cracked Actor, which follows Bowie after the end of his Ziggy Stardust phase, you see him struggling with this process. He is frighteningly thin, reedy-voiced, as unsure of himself in interviews as he is confident on stage.

9. By the time he entered his Thin White Duke phase, he seems to have gotten over these nerves.

10. If David Bowie was a dabbler, a dilettante, an outsider forever intruding into mediums, genres, and styles that were not properly “his,” this inauthenticity was liberating rather than fraudulent. His ability to leave behind a form or statement once it ceased to be alive for him kept him in a state of continuous curiosity about what it was he was even doing.

11. If you’re a weird kid, you exhaust yourself trying to figure out how “to be authentic.” You’re exhausted because you can’t figure out authenticity — you are authentic or you aren’t. You belong or you don’t. So you spend an inordinate amount of time worrying about it. You wear yourself out, make yourself anxious. You internalize your own position as an outsider and become an enigma to yourself.

12. But David Bowie was an unworried outsider. He didn’t want to be anything he wasn’t, he just wanted to know what it would be like. This sense of relaxed acceptance, of curiosity rather than anxiety, was what he gave to the weird kids.

 

 

Disconnected Postscript: Almost none of the remembrances of Bowie published yesterday or today mention what a tremendous singer he was. Watch footage of any performance from the 70s and stand in awe.

Stenography, Spirituality, and the Media History of Liberation

[Image: The Book of Psalms in the Corresponding Style of Pitman’s Shorthand]

More religion and stenography, this from Isaac Pitman’s A Manual of Phonography, or, Writing by Sound (1864):

In the 16th and 17th centuries, the principles of the Reformation were extensively promulgated in this country from the pulpit. A desire to preserve for future private reading the discourses of the principal preachers of that day, led to the cultivation of the newly invented art of shorthand writing. Teachers and systems increased rapidly; and by a comparison of one mode with another, and by experimenting with various series of alphabetical signs, Mason at length produced a system of the art. [The period] from the publication in 1588 of Bright’s system of arbitrary characters for words (or rather from the publication of the first shorthand alphabet by John Willis, in 1602) to the appearance of Mason’s system in 1682, may therefore be considered as resulting from the dawn of RELIGIOUS FREEDOM. Mason’s system was published by Thomas Gurney, in 1751, and it is used by members of his family, as reporters to the Government, to the present time (17).

So, on the one hand, because the “desire to preserve … the discourses of the principal preachers of the day” required a mode of recording faster and more efficient than normal writing, our knowledge of the Reformation depended on the development of shorthand writing systems. On the other hand, though, the rapid increase of shorthand writing systems and schools in the 16th and 17th centuries also points to the idea that shorthand writing was a product of the Reformation. So shorthand and “RELIGIOUS FREEDOM,” in Pitman’s account, sort of produce each other.

This seems like a dubious empirical claim, but there’s something about it that I don’t want to let go of. After all, we’re now used to talking about how new technologies enable the transmission and circulation of ideas that have real effects in the world (the Arab Spring and Twitter, for example). But the idea of print and the printing press dominates the way we imagine information circulating in the entire period before the invention of the telegraph. (I’m ignoring the work of many important scholars, like Lisa Gitelman and Bernhard Siegert, but stay with me.) Pitman may offer a hyperbolic, slightly dubious account of the Reformation’s media ecology, but in doing so he forces us to imagine a world in which the means by which information, language, and ideas made their way from one medium to another, from the voice of a speaker to the eyes or ears of a distant reader or listener, were not so settled.

In Deep Time of the Media, Siegfried Zielinski urges us to recapture those lost possibilities that inhere in forgotten or vestigial media. He argues that progressivist models of media history view our present media environment as developing inevitably out of prior environments. In this view, “history is the promise of continuity and a celebration of the continual march of progress in the name of humankind” (3). This progressivist idea of history is, in truth, ahistorical, since it suggests that “everything has always been around, only in less elaborate form. One needs only to look” (3). Media-historical progressivism remains blind to the possibilities offered up by media and technologies that didn’t survive, that remain buried. It is these forms that Zielinski finds interesting, urging us not to “seek the old in the new” but to “find something new in the old” (3).

The history of shorthand offers this kind of new oldness, but the relation between past and present, new and old, in shorthand is more dialectical than Zielinski suggests. I do see in Pitman the echo of an older way of thinking about the metaphysics of the voice’s relation to the hand, and a foreshadowing of media history to come. Pitman’s enthusiasm about shorthand’s entwinement with political and religious liberation could easily transform into its opposite. Where he saw shorthand as an agent of freedom, we could also see the origins of the copyist as a mechanized drudge. I’m still not settled on how I think all this plays out. Maybe it’s all because Pitman was a Swedenborgian.
