
Opinion: The college essay is not dead

Georgia Gwinnett College associate professor Amanda Sepulveda teaches students in her English class. (Photo credit: Georgia Gwinnett College)

In a guest column today, Matthew Boedy, an associate professor of rhetoric and composition at the University of North Georgia, discusses the development of artificial intelligence programs that can spit out accurate and fluid essays in response to any prompt.

Boedy is responding to a column in The Atlantic by English teacher Daniel Herman, who writes that OpenAI’s new ChatGPT program “may signal the end of writing assignments altogether—and maybe even the end of writing as a gatekeeper, a metric for intelligence, a teachable skill. ... But most jaw-dropping of all, on a personal level: It made quick work out of an assignment I’ve always considered absolutely ‘unhackable.’”

In another Atlantic essay on sophisticated generative AI, novelist and essayist Stephen Marche writes: “Practical matters are at stake: Humanities departments judge their undergraduate students on the basis of their essays. They give Ph.D.s on the basis of a dissertation’s composition. What happens when both processes can be significantly automated? Going by my experience as a former Shakespeare professor, I figure it will take 10 years for academia to face this new reality: two years for the students to figure out the tech, three more years for the professors to recognize that students are using the tech, and then five years for university administrators to decide what, if anything, to do about it.”

Here is Boedy’s take on whether AI programs endanger writing and writing instruction.

By Matthew Boedy

It’s that time of year when I read reflections by my students in my first-year writing course. This course is part of a mandated two-semester program and is populated by many dual-enrolled students.

By and large, they praise the class and my teaching, though I suspect at times some are merely buttering me up for a better grade. But I also ask them to reflect on how my ways with reading and writing compare to their high school experiences.

Matthew Boedy. (Credit: Peggy Cozart)

The overwhelming claim by these “cream of the crop” students about their high school experience is twofold. First, they are often assigned little to no writing beyond one to two pages. Second, the mechanical or formulaic teaching of writing they received often meant they learned nothing of consequence.

I can’t vouch for the complete veracity of those claims. But I bring this up because of two paired headlines recently racing across the parts of the internet where teachers like me meet: “The End of High-School English” and its companion, “The College Essay Is Dead.”

On the website of The Atlantic, both muse about the impact of a new technology called ChatGPT, an artificial intelligence program that can create essays that sound as good as or better than the run-of-the-mill ones I regularly read from students.

The basic claim of mortality here is that this software is the greatest plagiarism program of all time, and that from this point on, people like me who still assign essays will only get prose produced by what people in the AI business call language-production algorithms. I won’t bore you with examples, but basically you can ask the algorithm to write an essay not only on any topic but also in the style of any famous author. As if sounding like Hemingway gets you extra points.

Let me dispel any notion that the college essay is dead or that this new technology will end my career as a writing teacher.

Contrary to popular belief, we writing teachers believe more in the process of writing than the product. If we have done our jobs well and students have learned, reading that final draft during this time of year is often a formality. The process tells us the product will be amazing.

Writing is a process of learning not merely about a subject. It’s also learning about how that subject can best be framed for an audience. It’s also a writer learning about themselves. What do they want to say? What do they want to sound like? What rhetorical tools best fit their own skill set?

Asking an algorithm to make you sound like Hemingway will actually raise the reddest of red flags for plagiarism, because the paper doesn’t sound like a first-year student.

On that note, if we writing teachers are doing our job well, we are crafting assignments that simply can’t be plagiarized. That is, an essay for my assignments can’t be bought off the internet or created by an algorithm. For example, I ask students to write an essay about three to five pictures of their own choosing. Sure, students can and do select pictures from the internet. But many don’t, instead using pictures from their phone. And coming up with things to say thematically about those pictures can’t be done by an algorithm. Another assignment is a research essay where I give students two sources and they have to find two others. The plagiarism I find most now is students stealing from the examples I show from previous semesters or other students in the class when they post early drafts to a class discussion board.

But for the writing-to-learn process to work, students also have to do their jobs. They have to be willing to fail, to write badly, or simply admit they don’t know what to say. And that is extremely hard if indeed they have never been asked to fill a blank page with little to no guidance from the teacher. To think as they write, not already have thought and then write.

And yes, that initial failure does bring the temptation to cheat. But what I hear from students in these end-of-semester reflections is not the siren song of plagiarism but a fear of failure. Because many of these students have never failed.

And for the students whose only experience with writing has been failure, this process only reinforces that sense of dread.

The answer to that is not an algorithm but advice. It’s why I have conferences with my students roughly every two weeks.

This new technology may indeed be the end of high school English. A certain kind of high school English. And a certain kind of college essay.

But it isn’t the death of the kind of education you should expect from our state’s higher education institutions. It’s the opportunity to show why we need more faculty, not fewer. Why we need fewer students per class, not more.

The cost of college has exploded due to one sizable factor – the death of public support. Lawmakers think tuition is the burden students should pay to have skin in the game.

But we all as a collective have skin in this particular composition game, because good writing is that key fundamental civic skill, one we so desperately need more of. And it must be taught only by those who see it as learning, not keystrokes after learning.

Even the algorithm agrees. I asked ChatGPT “What is the best way to teach writing?” I don’t have space to share its whole answer. But it suggests “clear and detailed feedback,” helping “students develop their own writing process” through revision practice, and working to “encourage creativity and originality in students’ writing.”

Maybe though it is just telling me what I want to hear. Like some of my students.

About the Author

Maureen Downey has written editorials and opinion pieces about local, state and federal education policy since the 1990s.


The End of High-School English

I’ve been teaching English for 12 years, and I’m astounded by what ChatGPT can produce.

Teenagers have always found ways around doing the hard work of actual learning. CliffsNotes dates back to the 1950s, “No Fear Shakespeare” puts the playwright into modern English, YouTube offers literary analysis and historical explication from numerous amateurs and professionals, and so on. For as long as those shortcuts have existed, however, one big part of education has remained inescapable: writing. Barring outright plagiarism, students have always arrived at that moment when they’re on their own with a blank page, staring down a blinking cursor, the essay waiting to be written.

Now that might be about to change. The arrival of OpenAI’s ChatGPT, a program that generates sophisticated text in response to any prompt you can imagine, may signal the end of writing assignments altogether—and maybe even the end of writing as a gatekeeper, a metric for intelligence, a teachable skill.

If you’re looking for historical analogues, this would be like the printing press, the steam drill, and the light bulb having a baby, and that baby having access to the entire corpus of human knowledge and understanding. My life—and the lives of thousands of other teachers and professors, tutors and administrators—is about to drastically change.

I teach a variety of humanities classes (literature, philosophy, religion, history) at a small independent high school in the San Francisco Bay Area. My classes tend to have about 15 students, their ages ranging from 16 to 18. This semester I am lucky enough to be teaching writers like James Baldwin, Gloria Anzaldúa, Herman Melville, Mohsin Hamid, Virginia Held. I recognize that it’s a privilege to have relatively small classes that can explore material like this at all. But at the end of the day, kids are always kids. I’m sure you will be absolutely shocked to hear that not all teenagers are, in fact, so interested in having their mind lit on fire by Anzaldúa’s radical ideas about transcending binaries, or Ishmael’s metaphysics in Moby-Dick.

To those students, I have always said: You may not be interested in poetry or civics, but no matter what you end up doing with your life, a basic competence in writing is an absolutely essential skill—whether it’s for college admissions, writing a cover letter when applying for a job, or just writing an email to your boss.

I’ve also long held, for those who are interested in writing, that you need to learn the basic rules of good writing before you can start breaking them—that, like Picasso, you have to learn how to reliably fulfill an audience’s expectations before you get to start putting eyeballs in people’s ears and things.

I don’t know if either of those things is true anymore. It’s no longer obvious to me that my teenagers actually will need to develop this basic skill, or if the logic still holds that the fundamentals are necessary for experimentation.

Let me be candid (with apologies to all of my current and former students): What GPT can produce right now is better than the large majority of writing seen by your average teacher or professor. Over the past few days, I’ve given it a number of different prompts. And even if the bot’s results don’t exactly give you goosebumps, they do a more-than-adequate job of fulfilling a task.

I mean, look at this: I asked the program to write me a playful, sophisticated, emotional 600-word college-admissions essay about how my experience volunteering at my local SPCA had prepared me for the academic rigor of Stanford. Here’s an excerpt from its response:

In addition to cleaning, I also had the opportunity to interact with the animals. I was amazed at the transformation I saw in some of the pets who had been neglected or abused. With patience and care, they blossomed into playful and affectionate companions who were eager to give and receive love. I was also able to witness firsthand the process of selecting the right pet for the right family. Although it was bittersweet to see some animals leave the shelter, I knew that they were going to a loving home, and that was the best thing for them.

It also managed to compose a convincing 400-word “friendly” cover letter for an application to be a manager at Starbucks. But most jaw-dropping of all, on a personal level: It made quick work out of an assignment I’ve always considered absolutely “unhackable.” In January, my junior English students will begin writing an independent research paper, 12 to 18 pages, on two great literary works of their own choosing—a tradition at our school. Their goal is to place the texts in conversation with each other and find a thread that connects them. Some students will struggle to find any way to bring them together. We spend two months on the paper, putting it together piece by piece.

I’ve fed GPT a handful of pairs that students have worked with in recent years: Beloved and Hamlet, The Handmaid’s Tale and Parable of the Sower, Homer’s The Odyssey and Dante’s Inferno. GPT brought them together instantly, effortlessly, uncannily: memory, guilt, revenge, justice, the individual versus the collective, freedom of choice, societal oppression. The technology doesn’t go much beyond the surface, nor does it successfully integrate quotations from the original texts, but the ideas presented were on-target—more than enough to get any student rolling without much legwork.

It goes further. Last night, I received an essay draft from a student. I passed it along to OpenAI’s bots. “Can you fix this essay up and make it better?” Turns out, it could. It kept the student’s words intact but employed them more gracefully; it removed the clutter so the ideas were able to shine through. It was like magic.

I’ve been teaching for about 12 years: first as a TA in grad school, then as an adjunct professor at various public and private universities, and finally in high school. From my experience, American high-school students can be roughly split into three categories. The bottom group is learning to master grammar rules, punctuation, basic comprehension, and legibility. The middle group mostly has that stuff down and is working on argument and organization—arranging sentences within paragraphs and paragraphs within an essay. Then there’s a third group that has the luxury of focusing on things such as tone, rhythm, variety, mellifluence.

Whether someone is writing a five-paragraph essay or a 500-page book, these are the building blocks not only of good writing but of writing as a tool, as a means of efficiently and effectively communicating information. And because learning writing is an iterative process, students spend countless hours developing the skill in elementary school, middle school, high school, and then finally (as thousands of underpaid adjuncts teaching freshman comp will attest) college. Many students (as those same adjuncts will attest) remain in the bottom group, despite their teachers’ efforts; most of the rest find some uneasy equilibrium in the second category.

Working with these students makes up a large percentage of every English teacher’s job. It also supports a cottage industry of professional development, trademarked methods buried in acronyms (ICE! PIE! EDIT! MEAT!), and private writing tutors charging $100-plus an hour. So for those observers who are saying, Well, good, all of these things are overdue for change—“this will lead to much-needed education reform,” a former colleague told me—this dismissal elides the heavy toll this sudden transformation is going to take on education, extending along its many tentacles (standardized testing, admissions, educational software, etc.).

Perhaps there are reasons for optimism, if you push all this aside. Maybe every student is now immediately launched into that third category: The rudiments of writing will be considered a given, and every student will have direct access to the finer aspects of the enterprise. Whatever is inimitable within them can be made conspicuous, freed from the troublesome mechanics of comma splices, subject-verb disagreement, and dangling modifiers.

But again, the majority of students do not see writing as a worthwhile skill to cultivate—just like I, sitting with my coffee and book, rereading Moby-Dick, do not consider it worthwhile to learn, say, video editing. They have no interest in exploring nuance in tone and rhythm; they will forever roll their eyes at me when I try to communicate the subtle difference, when writing an appositive phrase, between using commas, parentheses, or (the connoisseur’s choice) the em dash.

Which is why I wonder if this may be the end of using writing as a benchmark for aptitude and intelligence. After all, what is a cover letter? Its primary purpose isn’t to communicate “I already know how to do this job” (because of course I don’t) but rather “I am competent and trustworthy and can clearly express to you why I would be a good candidate for this job.” What is a written exam? Its primary signal isn’t “I memorized a bunch of information” but rather “I can express that information clearly in writing.” Many teachers have reacted to ChatGPT by imagining how to give writing assignments now—maybe they should be written out by hand, or given only in class—but that seems to me shortsighted. The question isn’t “How will we get around this?” but rather “Is this still worth doing?”

I believe my most essential tasks, as a teacher, are helping my students think critically, disagree respectfully, argue carefully and flexibly, and understand their mind and the world around them. Unconventional, improvisatory, expressive, meta-cognitive writing can be an extraordinary vehicle for those things. But if most contemporary writing pedagogy is necessarily focused on helping students master the basics, what happens when a computer can do it for us? Is this moment more like the invention of the calculator, saving me from the tedium of long division, or more like the invention of the player piano, robbing us of what can be communicated only through human emotion?

Maybe the moment when I felt most caught between these two thoughts was when I asked GPT a question from an exam I give in my Eastern-philosophy course, something I’ve never imagined a student could find outside assistance to answer: “Explain the Madhyamaka Buddhist idea that all phenomena lack inherent existence, using a funny, specific example.”

ChatGPT wrote back, as ever, instantly:

The Madhyamaka Buddhist idea that all phenomena lack inherent existence can be illustrated with a funny example: let’s say you have a pet hamster. You might think that your hamster has its own inherent existence, but in reality, your hamster is made up of many different elements such as its fur, its bones, its muscles, and its brain. None of these elements exist independently or inherently on their own, so your hamster itself does not have any inherent existence either. The same goes for all phenomena—everything is made up of various components that don’t have inherent existence, so nothing has any inherent existence either.

Everything is made up; it’s true. The essay as a literary form? Made up. Grammatical rules as markers of intelligence? Writing itself as a technology? Made up. Starting now, OpenAI is forcing us to ask foundational questions about whether any of those things are worth keeping around.



A.I. Could Be Great for College Essays

Every year, the artificial intelligence company OpenAI improves its text-writing bot, GPT. And every year, the internet responds with shrieks of woe about the impending end of human-penned prose. This cycle repeated last week when OpenAI launched ChatGPT—a version of GPT that can seemingly spit out any text, from a Mozart-styled piano piece to the history of London in the style of Dr. Seuss. The response on Twitter was unanimous: The college essay is doomed. Why slave over a paper when ChatGPT can write an original for you?

Chatting with ChatGPT is fun. (Go play with it!) But the college essay isn’t doomed, and A.I. like ChatGPT won’t replace flesh-and-blood writers. They may make writing easier, though.

GPT-3, released by OpenAI in 2020, is the third and best-known version of OpenAI’s Generative Pre-trained Transformer—a computer program known as a large language model. Large language models produce language in response to language—typically, text-based prompts (“Write me a sonnet about love”). Unlike traditional computer programs that execute a series of hard-coded commands, language models are trained by sifting through large datasets of text like Wikipedia. Through this training, they learn patterns in language that are then used to generate the most likely completions to questions or commands.

Language is rife with repetition. Our ability to recognize and remember regularities in speech and text allows us to do things like complete a friend’s sentence or solve a Wordle in three tries. If I asked you to finish the sentence, The ball rolled down the … you’d say hill, and so would GPT-3. Large language models are, like people, great at learning regularities in language, and they use this trick to generate human-like text. But when tested on their ability to understand the language they produce, they often look more like parrots than poets.
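The “most likely completion” trick can be made concrete with a toy sketch (my illustration, far simpler than GPT-3, which uses a neural network rather than raw counts): tally which word follows each two-word context in a tiny corpus, then complete a prompt with the most frequent continuation.

```python
from collections import Counter, defaultdict

# Tiny training corpus: "hill" follows "down the" twice, "slope" once.
corpus = (
    "the ball rolled down the hill "
    "the ball rolled down the hill "
    "the ball rolled down the slope"
).split()

# Tally which word follows each two-word context.
following = defaultdict(Counter)
for a, b, c in zip(corpus, corpus[1:], corpus[2:]):
    following[(a, b)][c] += 1

def complete(prompt: str) -> str:
    """Return the most frequent continuation of the prompt's last two words."""
    context = tuple(prompt.lower().split()[-2:])
    return following[context].most_common(1)[0][0]

print(complete("The ball rolled down the"))  # -> hill
```

A large language model plays the same completion game with billions of learned parameters instead of raw counts, which is why it generalizes far beyond its training text; the parrot-versus-poet question is whether that generalization amounts to understanding.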

Gary Marcus, an emeritus professor at NYU and the author of Rebooting AI, is a vocal critic of the idea that bots like GPT-3 understand what they’re writing. In a recent preprint (an academic paper that hasn’t been peer-reviewed), Marcus and two co-authors used OpenAI’s DALL-E 2 to demonstrate the limitations of large language models. DALL-E 2, like GPT-3, responds to text-based prompts. But instead of producing language, it creates images. DALL-E 2 was trained to learn associations between images and their captions to produce entirely new images based on the text users enter. If you ask it for, say, “an impressionist painting of Albert Einstein surfing,” you get four images of exactly that.

Not bad, right? DALL-E 2 was recently used to win an art competition.

But as Marcus and his co-authors point out in their paper, DALL-E 2 doesn’t understand the relationship between the words that users enter. If, for instance, you ask it to produce an image of “an old man talking to his parents,” you get four images of three people talking. The AI fails to understand that, however old the old man looks, the people he’s talking to must, by definition of being his parents, look older.

Compared with DALL-E 2, ChatGPT appears to have a better grasp of the text users enter and the responses it spits out. This might be why the internet responded to its release so enthusiastically. ChatGPT “is causing jaws to drop and brows to furrow,” wrote Axios.

I had trouble stumping it with some classic riddles from cognitive psychology. Take the Lily Pad problem:

Prompt: A lily pad, in a pond, doubles in size every day. On the 20th day, it covers the entire pond. On what day did it cover half the pond?

ChatGPT: A lily pad in a pond that doubles in size every day will cover half the pond on the 19th day, one day before it covers the entire pond.
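ChatGPT is right, and the doubling is easy to verify outside the chat window; a few lines of Python (mine, not the article’s) walk the coverage backward from the full pond:

```python
# The pond is fully covered on day 20 and coverage doubles every day,
# so walking backward halves the coverage each day.
coverage = {20: 1.0}  # fraction of the pond covered on each day
for day in range(19, 0, -1):
    coverage[day] = coverage[day + 1] / 2

# Exactly one day shows half coverage (halving powers of two is exact).
print([day for day, frac in coverage.items() if frac == 0.5])  # -> [19]
```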

The A.I. got that the question described exponential growth—a phenomenon that, as the pandemic demonstrated, many humans fail to understand. But I did fool ChatGPT with the bat-and-ball problem made famous by Princeton psychologist Daniel Kahneman’s book, Thinking, Fast and Slow:

Prompt: A bat and ball cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?

ChatGPT: If the bat costs $1 more than the ball, and the bat and ball cost $1.10 in total, then the ball must cost $1.10 - $1.00 = $0.10.

The correct answer is 5 cents. (Trust me.)
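The arithmetic is quick to verify: if the ball costs b, the bat costs b + 1.00, so b + (b + 1.00) = 1.10 and b = 0.05. A short Python check (mine, working in cents to sidestep floating-point rounding):

```python
# bat + ball = 110 cents and bat = ball + 100 cents,
# so 2 * ball + 100 = 110 and the ball costs 5 cents.
total = 110       # cents
difference = 100  # cents
ball = (total - difference) // 2
bat = ball + difference
assert ball + bat == total and bat - ball == difference
print(f"ball = {ball} cents, bat = {bat} cents")  # -> ball = 5 cents, bat = 105 cents
```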

ChatGPT might fail at tests of reasoning, but it’s great at generating humanlike text of different lengths and in various styles. This brings us back to the college essay—an easy task for ChatGPT, many commentators claimed on Twitter. “We’re witnessing the death of the college essay in realtime,” tweeted one Google employee. Ethan Mollick, a management professor at the University of Pennsylvania, had ChatGPT write an essay question, create a grading rubric for said question, answer the question, and grade its own answer. (It got an A minus.) How could the essay not be doomed?

This isn’t the first time that large language models have been predicted to fell the essay or worse. “To spend ten minutes with Sudowrite [a GPT-3-based A.I.] is to recognize that the undergraduate essay, the basic pedagogical mode of all humanities, will soon be under severe pressure,” wrote journalist Stephen Marche in a 2021 New Yorker piece. (On Tuesday, Marche wrote an article for The Atlantic titled “The College Essay Is Dead.”) And in 2019, when GPT-2 was created, OpenAI withheld it from the public because the “fear of malicious applications” was too high.

If any group were to put an A.I. to malicious use, essay-burdened undergraduates would surely be the first. But the evidence that A.I. is being used to complete university assignments is hard to find. (When I asked my class of 47 students recently about using A.I. for schoolwork, they looked at me like I was mad.) It could be a matter of time and access before A.I. is used more widely by students to cheat; ChatGPT is the first free text-writing bot from OpenAI (although it won’t be free forever). But it could also be that large language models are just not very good at answering the types of questions professors ask.

If you ask ChatGPT to write an essay contrasting socialism and capitalism, it produces what you expect: 28 grammatical sentences covering wealth distribution, poverty reduction, and employment stability under these two economic systems. But few professors ask students to write papers on broad questions like this. Broad questions lead to a rainbow of responses that are impossible to grade objectively. And the more you make the question like something a student might get—narrow, and focused on specific, course-related content—the worse ChatGPT performs.

I gave ChatGPT a question about the relationship between language and colour perception that I ask my third-year psychology of language class, and it bombed. Not only did its response lack detail, but it attributed a paper I instructed it to describe to an entirely different study. Several more questions produced the same vague and error-riddled results. If one of my students handed in the text ChatGPT generated, they’d get an F.

Large language models generate the most likely responses based on the text they are fed during training, and, for now, that text doesn’t include the reading lists of thousands of college classes. They also prevaricate. The model’s calculation of the most probable text completion is not always the most correct response—or even a true response. When I asked Gary Marcus about the prospect of ChatGPT writing college essays, his answer was blunt: “It’s basically a bullshit artist. And bullshitters rarely get As—they get Cs or worse.”

If these problems are fixed—and, based on how these models work, it’s unclear that they can be—I doubt A.I. like ChatGPT will produce good papers. Even humans who write papers for money struggle to do it well. In 2014, a department of the U.K. government published a study of history and English papers produced by online-essay writing services for senior high school students. Most of the papers received a grade of C or lower. Much like the work of ChatGPT, the papers were vague and error-filled. It’s hard to write a good essay when you lack detailed, course-specific knowledge of the content that led to the essay question.

ChatGPT may fail at writing a passable paper, but it’s a useful pedagogical tool that could help students write papers themselves. Ben Thompson, who runs the technology blog and newsletter Stratechery, wrote about this change in a post about ChatGPT and history homework. Thompson asked ChatGPT to complete his daughter’s assignment on the English philosopher Thomas Hobbes; the A.I. produced three error-riddled paragraphs. But, as Thompson points out, failures like this don’t mean that we should trash the tech. In the future, A.I. like ChatGPT can be used in the classroom to generate text that students then fact-check and edit. That is, these bots solve the problem of the blank page by providing a starting point for papers. I couldn’t agree more.

I frequently used ChatGPT while working on this piece. I asked for definitions that, after a fact-check, I included. At times, I threw entire paragraphs from this piece into ChatGPT to see if it produced prettier prose. Sometimes it did, and then I used that text. Why not? Like spell check, a thesaurus, and Wikipedia, ChatGPT made the task of writing a little easier. I hope my students use it.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.



Is the college essay dead? AI apps write scripts, speeches and so much more

Plus, the House of Representatives will vote on a new speaker today. How does that work?

I want you to experience some artificial intelligence applications that I have been trying recently. You will see some jaw-dropping potential in these apps to disrupt, create and even cheat. Let’s start with ChatGPT, which, with just a little prompting, can write an essay or even a TV script. I asked it to create a Seinfeld scene in which George decides to become a journalist. In 10 seconds, the script appeared:

(Screenshot of the ChatGPT-generated Seinfeld scene. Credit: OpenAI.com)

The New York Times can’t seem to rave enough about ChatGPT, saying, “ChatGPT is, quite simply, the best artificial intelligence chatbot ever released to the general public. It was built by OpenAI, the San Francisco A.I. company that is also responsible for tools like GPT-3 and DALL-E 2, the breakthrough image generator that came out this year.” In case you were wondering, GPT stands for “generative pre-trained transformer.”

I asked it how I could know if a racehorse would become a champion:

(Screenshot of ChatGPT’s response)

I asked ChatGPT to write a sermon that a United Methodist minister might deliver about lotteries. I asked it to explain quantum physics at a fourth-grade level. After it gives a response, you can ask for another response, and it will compose a new answer. And, according to the Times, “It can write jokes (some of which are actually funny), working computer code and college-level essays. It can also guess at medical diagnoses, create text-based Harry Potter games and explain scientific concepts at multiple levels of difficulty.”

Recently, an essay in The Atlantic suggested that artificial intelligence technology makes it easy for a program to produce a logical, conversational article or essay. One student who was caught using AI to produce an essay said it was not unlike using a spellcheck program.  

The essay continues: “They don’t feel like they’re cheating, because the student guidelines at their university state only that you’re not allowed to get somebody else to do your work for you. GPT-3 isn’t ‘somebody else’ — it’s a program. The world of generative AI is progressing furiously. Last week, OpenAI released an advanced chatbot named ChatGPT that has spawned a new wave of marveling and hand-wringing, plus an upgrade to GPT-3 that allows for complex rhyming poetry; Google previewed new applications last month that will allow people to describe concepts in text and see them rendered as images; and the creative-AI firm Jasper received a $1.5 billion valuation in October. It still takes a little initiative for a kid to find a text generator, but not for long. Kevin Bryan, an associate professor at the University of Toronto, tweeted in astonishment about OpenAI’s new chatbot last week: ‘You can no longer give take-home exams/homework … Even on specific questions that involve combining knowledge across domains, the OpenAI chat is frankly better than the average MBA at this point. It is frankly amazing.’ Neither the engineers building the linguistic tech nor the educators who will encounter the resulting language are prepared for the fallout.”

HyperWrite is another interesting program that includes templates. Look at all of the options just under “marketing”:

(Screenshot: HyperWrite)

HyperWrite allows the user to build documents step by step. I asked HyperWrite to explain World War II in language that a 5-year-old might understand. Here are three possibilities it offered:

(Screenshot of HyperWrite’s responses)

And in 30 seconds, I built a 322-word essay about the importance of submarines from World War II to today. Here’s part of a finished essay:

(Screenshot of HyperWrite’s essay)

Here are some other places you can go to learn how AI is moving into journalism.

  • The future of content creation with AI is closer than you might think – Poynter 
  • A news photo editor’s nightmare is an art director’s dream – Poynter
  • These projects are using AI to fight misinformation – Poynter
  • How news organizations used automated news to cover COVID-19 – Poynter
  • What is the future of automated fact-checking?  – Poynter
  • Here’s how publishers around the world are using automated news – NiemanLab
  • “Look at the robot as your new colleague”: what automation can do for sports reporting – Journalism.co.uk
  • How a local paper in Argentina uses AI to publish hundreds of sports pieces a month – Reuters Institute for Journalism
  • Runway – an example of text-driven video editing. This is its latest trailer for 2023.

The battle over today’s vote for Speaker of the House

Let’s keep in mind that this has to do with one of the most powerful and important positions in U.S. government, and hours before the House of Representatives is to vote on who holds that position, the outcome is uncertain. House Republican Leader Kevin McCarthy (R-Calif.) has the backing of almost all his Republican colleagues, but 15 Republicans don’t back him, and that is enough to send the leadership vote to a second round, which hasn’t happened in a century.

Since today’s vote will not be the formality it usually is, you should know the rules. PBS explained some of the details that you might not expect, including that the speaker does not have to be a member of Congress and that, to be elected, the speaker does not have to get 218 of the 435 House votes (a majority). The vote requires only a majority of those who are present and vote by name.

All candidates for speaker must be nominated by members of the House, but they don’t need to be elected lawmakers of the House. Article I, Section 2 of the Constitution says only that the House “shall choose their Speaker and other officers.” So far, the chamber has only chosen its own members as speaker, but a non-lawmaker is possible. Earlier this year, former Secretary of State Colin Powell received a vote for speaker, as did Senator Rand Paul of Kentucky. To be the next speaker, a person needs a majority of the votes from House members who are present and voting. (See this useful Congressional Research Service (CRS) report for more detail.) That means that while a majority is 218 votes in the full House, a person could become speaker with fewer votes if several members do not attend the vote. That happened in 2021, when Rep. Nancy Pelosi, D-Calif., won with just 216 votes after three members voted “present.”
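The arithmetic behind that rule is simple enough to sketch. The helper below is hypothetical, not anything from the CRS report; it just encodes the principle that the threshold is a majority of members voting for a candidate by name, not of the full 435-seat House:

```python
def speaker_threshold(votes_by_name: int) -> int:
    """Smallest vote count that is a majority of the members voting
    for a candidate by name ("present" answers don't count)."""
    return votes_by_name // 2 + 1

# All 435 members voting by name: the threshold is 218.
# If, say, only 427 vote by name, the threshold drops to 214.
# That is how a speaker can be elected with fewer than 218 votes.
```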

The House has been deadlocked 14 times before. The House historian traces the floor fights back to 1793, when it took three ballots to choose a speaker.

Most House Speaker floor battles happened before the Civil War. But for sheer drama, read about the House Speaker vote from 1917, or the nine ballots required to elect Rep. Frederick Gillett of Massachusetts to be speaker in 1923.

CNN reminds us:

In 1855 and 1856, it took 133 separate votes for Rep. Nathaniel Banks of Massachusetts to be elected, again by a plurality and not a majority. The process stretched over more than a month and included a sort of inquisition on the House floor of the three contenders. They answered questions about their view of the expansion of slavery. Read more from the House historian’s website. It’s also interesting to read about Banks; his official House biography notes he was elected to office as a Republican, an independent, a member of the American Party and as a Democrat.

The Congressional Institute explains what happens when the House meets today:

  • The House establishes that there is a quorum. If a member in attendance declines to vote, they do not count toward the quorum.
  • The clerk calls for speaker candidate nominations.
  • Chairs of the Republican Conference and Democratic Caucus make short speeches on behalf of their parties’ nominees.
  • The clerk then asks for any other nominations. 
  • The clerk then appoints tellers to record the votes, and the voting begins. 
  • Each member-elect has the opportunity to vote when the clerk calls his or her name. Typically, the member-elect responds by calling out the last name of the person they wish to be speaker. They may also answer “present,” which does not count as a vote but does contribute to a quorum. Or they could decline to respond at all.
