Responding to Responses to “What Automated Essay Grading Says To Children”

I recently wrote about what I feel the use of machine scoring for student writing looks like to children.  The responses were strong, so I thought it made sense to clarify what I was saying, what I wasn’t saying, and what I didn’t say.

Let’s tackle the last one first.  I didn’t say that I’m unsympathetic to the idea that more writing would happen if there were less grading to do.  Certainly, one reason that writing isn’t happening enough in classrooms now is the perception that every piece written must be “marked” or “graded” or “bled upon” by a teacher.  That perception is completely false, and a terrible idea.

What our students need isn’t more end comments or suggestions for grammatical and technical correction; they need to be responded to as writers, by readers who are actually reading their work.  Others have said this far better than I ever could: we teachers should be doing less evaluating and more responding.

So, yes.  Teachers are taking too long with papers.  The answer isn’t to stop reading them. It’s to read them differently.  Or to have more teachers reading fewer students’ writing.  And we don’t need to read everything that a student writes.  We certainly don’t need to grade everything a student writes.

Where I think this gets messy is the notion that students need more grading from us in order to get better as writers.  They do not.  They need us, their teachers, to write with them, and to create cultures of inquiry and reflection, rather than regurgitation, in our classrooms.  They need to be treated as apprentice writers and brought up accordingly.

Robotic graders are for people too busy to read the work our students are investing in.  That’s not fair to our students.

Now, to clarify.  I’ve been in classrooms where existing writing assessment software has been used, and I’ve been pleasantly surprised by what I’ve seen.  My most recent experience with a writing assessment tool was in a middle school classroom in my school district, where a gifted teacher was using the tool as a starting place for her writing courses.  The software freed her up to be in conversation with her students about their writing.  That was just the right way for her and the class to be – the students drafting, the teacher conversing and reading and being with her students.

The students wrote more and revised more.  In talking with them, I learned that they felt a connection to their teacher and that she was concerned for them as writers.  The software was a scaffold, and a place to start.

I was okay with that.  More than okay.  The teacher made the classroom shine.  The software augmented the teacher.  She could’ve run a similar, maybe not as prolific, writing workshop with her students using only paper and pencil.

And she read what they wrote.  And encouraged them to share their writing with each other.

Writing for a machine to read all the time, though, is not really writing.  It’s pretending.  It’s make-believe.  And not the good and playful kind.  It’s faking it when there isn’t another person reading at least some of the work.  We want our students to write well not because they’ll need to do so in some far-off future job.  We want them to write well because they have something important to say to the world right now.

So let me clarify further.  I get how the computers do the “reading” that they do.  (By the way, reading up on natural language processing is worth your time if you want to understand the processes and processing involved.)  And I won’t completely knock it.  It’s handy if you need to score a bunch of tests in a hurry.  And that’s one kind of writing – writing as proof of knowing.  But it’s writing that assumes unimportance.

And it’s writing that suggests the students could build their own robot essay writers to write their essays for them.  In fact, that’s what an awful lot of student “cheating” cases are – students crowdsourcing their homework.  Some students do that out of malicious intent.  Others out of ignorance.  But too many students fake their way through essays out of boredom, and out of the knowledge that the teacher will be in a hurry and probably not notice.

You’ve got to notice what your students are doing.  You’re going to miss some things.  But you can’t miss all of them, or even most.

I don’t think a machine grading writing is the end-all of everything I hold dear.  I’m sympathetic to the argument that our students need to write more and perhaps the machines will encourage that.  But the fervor with which I suspect machine grading of writing will be adopted suggests the real problem – we don’t actually want to read and write with our students.  We want to do reading and writing to them.  And that’s wrong.

10 thoughts on “Responding to Responses to ‘What Automated Essay Grading Says To Children’”

  1. I wonder how many teachers “do” reading and writing to students simply because they don’t know any different? Right now I am on a journey to learn how to help my students be better readers and writers. I can say that the “training” I received in college 18 years ago didn’t help much. I can honestly say that five years ago I would have loved a program that graded student writing, and I can also say that it would have done so more competently than I could have.

  2. The problem is that automated essay grading is already seriously distorting the curriculum. Look at the Common Core standards. When Common Core did some international benchmarking in an early draft, it was crystal clear that compared to other standards, they’d systematically removed parts of the reading and writing standards that could not be scored by computer.

    If we go down this path, before too long the algorithm will become the standard. The computer will be seen as more reliable and valid than human graders — there will be no possible argument against the software, because the software will be the standard.

    Also, I agree that this software can be beneficial in a limited role.

  3. When people view the role of writing assessment as something that results in a grade, they are missing the point.

    Feedback and review should highlight opportunities for improvement. It should be a process of asking questions, as opposed to making statements. Feedback and review need to be grounded in at least a passing knowledge of grammatical sentence types, rhetorical sentence types, verb choice, sentence length, pacing, zeugma, the judicious use of polysyndeton, and other devices that help creative minds illustrate ideas more effectively.

    The largest disservice that robo graders do is highlight the myth that a piece of writing is ever done beyond the point of improvement. There are times when it’s necessary to just finish something and be done with it, but that is a different skill set than expressing oneself creatively with words.

    Robo graders narrow the conversation, and help people confound assessment with knowledge, and a grade with learning. Assessment and grades are indicators of a point in time; learning to write (and learning to edit) is a process of asking questions.

  4. Attending a writing project summer institute changed my way of teaching! It forced me to look at teaching as a process where I worked side by side with my students in writing, reading, and even math to build knowledge. We collaborated, shared ideas, shared writing, drafted, and did all of the things that occur in a “workshop” format. By the time a piece of writing was turned in for a grade, the process had morphed the piece into a finished product. One student had his or her name on it, but it was a work produced by a class of writers who advised each other through the process, so there were no surprises about what the grade would be.

    It takes effort and time to build a classroom culture like that. It really helps when the teacher sits down in front of the class and asks for their advice and input on his or her own writing; then the students truly believe they are valued members of a writing community.
