University of Colorado Denver | Anschutz Medical Campus
Why plagiarism is evil

Barbara Goodrich, Ph.D.

With the increased use of the internet, of multi-authored web pages such as wikis and newspaper discussion boards, and of other electronic data, some young people have become confused about what is and is not acceptable regarding plagiarism.  This is a brief guide to help out.

The basics:

 A working definition of plagiarism:

Presenting another's intellectual work as if it were one's own, with the probable result that one gets credit for it.  The clearest example is quoting someone at more than just a few words' length without giving credit to the real author.

How to avoid it:

A proper, formal citation according to your discipline's protocol is best, but as long as you give the author the credit, you are safe from the charge of plagiarism.  If in doubt, just put quote marks around it, and include the author's name and the title of the work in parentheses.  Easy.

If you need more details:

 Using your own words:

For an assignment in which you are to use your own words to describe or explain something, use your own words, ideally from scratch.  (You'll remember it much more easily when you've put it in your own terms, too.)  It's not enough to import the original and then alter it slightly by, say, substituting a synonym every twentieth word, or by changing an active construction to a passive one ("John Doe then compared the eating habits of chimpanzees and bonobos with regard to sharing food with strangers" becomes "the eating habits of chimpanzees and bonobos were then compared with regard to sharing food with strangers").


Other technical definitions of plagiarism:

These might include, say, stealing the idea for a particular plot and/or characters; stealing particular verses or melodies of songs, or a breakthrough scientific hypothesis; or even stealing, say, a very distinctive interpretation of something.  In general, anything you can imagine someone suing to protect as his/her own intellectual property should be respected as such, and not reproduced without giving the creator proper credit and proper references.  (In some cases, you may also be legally required to obtain the creator's permission to reproduce the creation.)

What if a friend offers me something of his/her own, and gives me permission to use it as my own work?  Wouldn't that be OK?

Nope, that's still not enough!  Think of it like this: you wouldn't be exploiting your friend here, and that is very important, granted.  But you would nonetheless be mistreating your audience by misleading them.  (And if the context is school, and the friend is a better-than-average student, then all the other students would be mistreated as well, since your submission would gain an unfair competitive advantage over theirs.)  It's much better just to give credit where it's due, and be admired for your wide reading and good taste in quotes.

What if a friend pressures me to give him/her my own work to be submitted as his/her own?

Well, your Auntie Barbara doesn't recommend using physical violence . . .

What if my professor wants us to use some weird kind of citation format that I've never seen before?

You've definitely got your auntie's sympathy, and the sympathy of many, many academics, on that one.  It's an unfortunate historical accident that different disciplines developed entirely different citation forms, so that poor undergraduate students taking, say, philosophy and chemistry at the same time have to juggle entirely different formats, making it difficult to learn either as a habit.  I hope that eventually disciplines and publications will converge on a single citation form.  But in terms of ethics, the important thing is just to have some kind of citation with at least the author's name and the work's title in it.

Other information that is useful in a citation:

If it's a book: the edition if there's more than one, the translator if it's translated, the publisher and city, the year it was published, the page number(s).

If it's a journal article: the name of the journal and the volume it appeared in, the year and possibly month it was published, the page number(s).

If it's from the internet: the URL; the date it was made public, if available; and the name of the organization holding the website, if any.

If it's a personal communication: just that it's a personal communication, and the date it was communicated.

In each case, the format includes whatever information would be helpful to someone who loves your quote and wants to find the source!
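For readers who like to see the idea mechanically, the field lists above can be sketched as a tiny formatter.  This is purely an illustration: the field names, the ordering, and the output format are my own invention, not any official citation style (MLA, APA, Chicago, etc.).

```python
# Illustrative sketch only: field names and output format are invented
# for this example, not taken from any official citation style.

def format_citation(kind, **fields):
    """Assemble a simple, human-readable citation from whichever of the
    expected fields are present; missing optional fields are omitted."""
    if kind == "book":
        order = ["author", "title", "edition", "translator",
                 "publisher", "city", "year", "pages"]
    elif kind == "journal":
        order = ["author", "title", "journal", "volume", "year", "pages"]
    elif kind == "internet":
        order = ["author", "title", "url", "date_published", "organization"]
    elif kind == "personal":
        # A personal communication needs only a note and a date.
        fields.setdefault("note", "personal communication")
        order = ["author", "date", "note"]
    else:
        raise ValueError(f"unknown citation kind: {kind}")
    parts = [str(fields[f]) for f in order if f in fields]
    return ", ".join(parts) + "."

print(format_citation("journal",
                      author="J. Doe", title="Food sharing in bonobos",
                      journal="Primate Studies", volume=12,
                      year=2004, pages="88-102"))
# -> J. Doe, Food sharing in bonobos, Primate Studies, 12, 2004, 88-102.
```

The point of the sketch is the one made in the text: whatever the format, the citation should carry enough of these fields for an interested reader to find the source.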

What about how to use quote marks?   Sometimes you do, sometimes you don't.  It's confusing.

The general convention is that if a quote is short (up to two or three full lines of text), you just put quote marks around it, using “ and ” in the U.S. and ‘ and ’ in the U.K.  If a quote is longer than that, then rather than using quote marks, you indent the whole thing as a block.  For long quotes, that makes it easier to see what the quote is, so the reader doesn't have to keep hunting for the closing quote mark.  There are various little rules about this, e.g. how to mark quotes within quotes (alternate the double and single marks), how to mark a quote containing two paragraphs (omit the closing quote mark at the end of the first paragraph, but add a new opening quote mark at the start of the next), and so on.  But again, following the punctuation rules perfectly is far less important than just setting the quote off as a quote, however you do it.

But where do I draw the line?  We're encouraged to search the 'net, talk, share ideas, and so on.  And I've had many influences on my thinking; I can't stop and give footnotes to everyone who's given me an idea.  And sometimes, after a long and intense discussion with a friend, it's hard to tell whose idea an idea is.

Absolutely true!   And there is certainly a gray area between "X has influenced me" and "X said this and I'll repeat it verbatim."    Influences are so numerous, and often so subconscious, that normally they needn't be mentioned, though it's considered particularly impressive if you can be explicit about the sources that have influenced your thinking the most.  (It took me until after I'd earned my doctorate to realize that my own philosophical thinking was heavily influenced by reading Jack London animal stories as a kid – no kidding!)   Sometimes mentioning them does add substantively to a discussion; sometimes it doesn't.   Go with what seems most suitable in each situation.   If it's more of a particular specific idea, and you developed it with someone, it's polite to mention the friend's name, too.  (Spouse and I tease each other about this.  I know he really likes an idea I suggest in an informal discussion if he assimilates it so thoroughly that he forgets it was mine.    But we don't care about credit in this context, just compliments and offers to buy drinks.)      

Expectations differ for different contexts, too.   In an internet discussion based on a newspaper column, for instance, nobody expects citations unless the topic happens to turn on a purported quote or something similar.   It's in more formal situations, e.g. academic papers for classes or for professional publications, that people expect citations.  Here, if the citations aren't there, people will assume that the ideas and/or words presented are one's own unless otherwise specified, so, if they aren't your own, you do need to specify otherwise.   With experience, you'll get a good reliable sense of what counts as original thought though influenced by others, and what counts as a review or interpretation of others' work which requires references.

But really, why is plagiarism evil?  What's so important about all this?  I mean, when I get a job, I'll be able to go online and get documents when I need them.  Why can't I just do that now, in school? 

Almost no jobs involve this kind of thing, whatever you may have heard.  For example, when journalists just use the publicity packs they are given, rather than doing their own research and writing, they are running a huge professional risk.   In the short term they may sometimes be rewarded by, e.g., the corporations or government administrations whose propaganda they may be propagating, but pretty obviously in some cases this can involve a gross violation of journalistic ethics, and when found out, may be punished very severely, with the journalists disgraced as well as fired.    It's far preferable to get in the habit of writing your own work, taking pride in developing your own style, thinking of things according to your own judgment.   In fact, given the internet's growth, the ability to write clearly and originally is even more important in most jobs now, since even more communication takes place in writing. 

Also, if a paper is being submitted for a grade, as an example of one's own writing skills, it needs to be an example of one's own writing skills.   Otherwise, not only would the purported author be misrepresented, but that purported author would have seized an unfair competitive advantage over you other students. (That's why it's evil, in Auntie Barbara's opinion.)  Relax, though.  If it's easy these days for less than scrupulous students to obtain pre-written papers on the internet, it's at least as easy for professors to check submitted papers for suspicious origins, and most of us get pretty zealous in defending all you other students from the occasional cheat, even to the point of expelling the cheat.  (And of course one of the saddest characteristics of cheating is that it's often very poor quality.  You read it and think, "That poor young fool probably paid two days' wages for this?!   Golly, if the other students knew that, they'd probably forgive him/her.") 

Rather than going through all this as a misguided student and living with the resulting anxiety, nightmares, and undermined confidence of becoming a fake, it's so much easier and simpler just to live straightforwardly, to learn to write fairly well (or superbly well), to be proud of it, and to let it be appreciated.

What's all this about academics and plagiarism?  I just read a story about some big historian/physicist/whatever getting fired for plagiarism.   Why is it so important?

Several prominent academics around the globe have been discovered plagiarizing in the past few years, which has shocked and distressed intellectuals worldwide, and which is part of my motivation for writing this guide.  The integrity of history, archaeology, the sciences, etc. rests on the individual integrity of those who contribute to them.  E.g. we need to be able to trust that someone who finds a fossil found it where and when he says it was found, since it might mean something very different if found in a different location in different circumstances.  We must be able to trust that someone who reports certain results in a study of carcinogens got precisely those results, in precisely those circumstances.  (If someone else then falsely claims to have carried out the same kind of study too, it gives an illusion of confirmation, of reproducibility.  We need real duplication of important studies, for real confirmation of data.)  We also have to be able to trust people when they say that they came up with a certain hypothesis, or developed a certain idea.  This is key both for reasons of intellectual property and for reasons of proper evaluation (academics get graded too, by their peers, with the rewards being promotions, tenure, raises, and grants).  It's also key for keeping the intellectual history of those ideas and studies straight.  Thus, a supported charge of plagiarism will quickly end even a prestigious career.

An interesting variant: one high-profile academic recently in the headlines on plagiarism charges turns out to have written various articles under pseudonyms.  Normally this is not a problem, especially in fiction, or if the author uses a pseudonym to write in a different field.  (E.g. several of Britain's finest mystery and spy-thriller writers are scholars who write their fiction under noms de plume.)  However, this academic allegedly was writing scholarly articles confirming other scholarly articles he'd written under his own name.  This gave an illusion of group consensus on matters about which he was in fact the only source.  To make matters worse, the pseudonyms he chose were not fictional names but the names of real academics in the discipline, which lent weight to the illusory consensus; it was a sort of reverse plagiarism.  This may appear at first a harmless act of generosity, since he was giving credit for his own work to others.  But it really deceived his readers in a number of ways, and exploited the names of the other academics.

(What seems most poignant to me is that any of these notoriously absent-minded intellectuals should delude themselves into believing that they could juggle half-a-dozen different lies like a practiced con man, when they probably -- if they're like me and like most intellectuals I've known -- have difficulty remembering where they parked their cars.)

Gee, now I'm worried.  There are only so many possible sentences in English.  What if I accidentally happen to write a paragraph that someone else already wrote?   Or what if I subconsciously remember someone else's idea from a long time ago, and I use it without realizing that I should give credit to someone else?

Relax.  With English's vast vocabulary and complex grammatical structure, it's almost impossible to come up with much more than a phrase that is identical to someone else's.  When you take into account different topics, contexts, outline structures, and so on, it's so close to impossible in normal situations that we can consider it impossible.  There are a few set phrases that people use, such as my favorite bugbear "it is interesting to note that . . ."   but these are accepted as set phrases, and probably nobody would want to claim credit (or blame) for inventing them anyway.  

If you discover that you have subconsciously used someone else's idea without realizing it, then just give credit when you do realize it, and things should be fine.  Once in a while it happens, but almost always the idea has been largely reworked by the subconscious to be integrated into your own larger picture of the world, so that the result is more an homage and less a copy, and that makes all the difference.  (One fascinating exception: poor Helen Keller, who became blind and deaf after an illness as a toddler, wrote a little story later in her life that, it was later discovered, was almost verbatim a story that had been read to her as a toddler, before her illness.  But this case demonstrates how extraordinary such a thing is: it required a prodigiously gifted, precocious child just to absorb a story at such a young age, and then to rediscover it after such devastating traumas and losses, when some long-dormant abilities and memories were finally reawakened.  It's not at all surprising that she had no idea it was not original.  And while some people did criticize her at the time, nobody does now.  Her critics are regarded as churlish, instead, for insisting on applying rules in an obviously exceptional case.)

Possibly the most famous example of unconscious borrowing is Freud's co-opting Schopenhauer's and Nietzsche's notion of the subconscious, which he presented as his own original idea, though with a rather different meaning, since his theoretical context was different.   When people who were familiar with philosophy finally pressed him on this, he admitted that he'd probably got it from them, but only indirectly, not directly.  What he'd done with the notion of an unconscious psyche was sufficiently different from what they'd done that he escaped charges of plagiarism, but his subsequent attempts to minimize their obvious contributions did cast a shadow over his legacy.   A more gracious acknowledgement of their influence, once he realized it, would have garnered him respect, instead.

I'm writing papers in an advanced science class.  I've been told that in the sciences, it's not as important to put quote marks and references around things as it is in the humanities, because what matters is not who wrote the words but what objective data were found.  Why doesn't writing this way count as plagiarism?

Scientists, like all academics, should give all the citations needed; that by itself is enough to prevent any substantive plagiarism.  Science writing is very different stylistically from the humanities in that direct quotes in the sciences are considered stylistic bad form, since, indeed, as you say, it's the data that are important, not who described them and in which words.  However, normally it's also bad form (at the very least) to quote passages from others without quote marks.  So how to solve the problem?  Paraphrase!  It's that simple.  And let's be honest: frequently it's easy to paraphrase a science journal article so that the paraphrase is quite an improvement on the original in clarity, precision, elegance, and so on.  It may sometimes be a little painful to get the paraphrase going, but that it'll be an improvement is often a given.  (Of course, the same is at least as true for philosophy journal articles, too, and we philosophers don't even have the excuse of needing to use scientific technical terminology.)

But I'm not sure I agree with all this emphasis on intellectual property rights.  Ideas should be free, and freely exchanged; we shouldn't have to pay for them.   And if I ever write something good enough to be published, I may stand by this even then, and not ask for royalties! 

A growing number of people share these values!  Mozilla, the much-loved software, is built on them.  So are, e.g., some outstanding introductory physics books, such as Benjamin Crowell's online "Light and Matter" series; if only all scientists and philosophers wrote even half this well.  (Incidentally, this is an astonishing devotion to ideals on Crowell's part.  As many money-strapped students know all too well, textbook authors can pretty much coin money if their books are good enough to be adopted by a few universities.  As a premier physics author, Crowell is forgoing considerable royalties for the sake of his students and other readers.)

Even if you don't want to charge royalties for your work, though, you may want to keep creative control over how your work can be treated, e.g. to prevent others from taking parts and changing them in ways you don't approve of, or from taking parts and then charging for them through aggressive publicizing.  The Creative Commons approach that Crowell uses protects author control while allowing free use: you can copyright your work so that you retain control over it, but stipulate that users may not alter or resell it.  Exciting and inspiring developments!


© The Regents of the University of Colorado, a body corporate. All rights reserved.
