Mr. Roboto only wants you to make the grade

Ian Bogost’s opening move in the first chapter of Persuasive Games, “Procedural Rhetoric,” was to appeal to a game called Tenure. I was both surprised and pleased to see teaching positioned as a game so prominently in a book about games. As a reminder, since this already feels like a while ago, Bogost’s description of Tenure accounted for a training device meant to simulate the first year of teaching with the aim of earning a contract for the second year (1). Through its multiple-choice decision-making events, the game makes a procedural argument that teaching requires a complexity of decision making that a teacher must navigate, involving contradictory and multidimensional conditions related to classroom management, collegiality with other teachers, student advocacy, and career advancement (to name a few) that can have dynamic results, many of which may or may not be in the best interest of any one party at any one moment. Bogost homes in on the institutional politics of the game, but I’d argue that even within the confines of determining the best decision to make as a teacher with a duty toward students, complexity abounds in any decision made.

Where there is complexity, there is also, I am increasingly finding, a beckoning for potential assistance from computation. If we can explore the possibility of mechanical ethicists, as Maher does in “Artificial Rhetorical Agents and the Computing of Phronesis,” then why not mechanical teachers? Teaching, after all, can be rule-bound and is also heavily invested in making decisions. In a way, it is a series of moments of dealing with oughts and ought nots, much like what Tenure procedurally displays. Maher’s description of Beliefs, Desires, and Intentions as creating an architecture that “attends to the fact that action most often occurs in a space of ‘competing alternatives’ that must be weighed before deciding one” suggests that the machine’s potential ability to deliberate about a moral decision is similar to the Tenure player’s deliberation about whether or not to ignore a student’s tardiness (15). But let’s imagine what could happen if we remove the player and insert the machine as player. What would it be like to have a robot teacher?
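Maher’s BDI description can be sketched in code, if only as a toy. The sketch below is entirely my own invention (the names, weights, and options are made up, and this is nowhere near a real BDI architecture): an agent filters options through its beliefs, scores them against competing desires, and commits to the winner as its intention, much like the Tenure player deliberating over the tardy student.

```python
# A toy sketch of Belief-Desire-Intention deliberation (hypothetical,
# not Maher's actual architecture): the agent scores each competing
# alternative against its desires, filtered by its current beliefs,
# and commits to the best-scoring one as its intention.

def deliberate(beliefs, desires, options):
    """Pick the option whose expected effects best satisfy the desires."""
    def score(option):
        total = 0.0
        for effect, strength in option["effects"].items():
            # An effect only counts if the agent believes it is achievable.
            if beliefs.get(effect + "_possible", True):
                total += desires.get(effect, 0.0) * strength
        return total
    return max(options, key=score)

beliefs = {"rapport_possible": True, "punctuality_possible": True}
desires = {"rapport": 0.7, "punctuality": 0.3}  # competing values to weigh
options = [
    {"name": "ignore the tardiness",
     "effects": {"rapport": 1.0, "punctuality": -0.5}},
    {"name": "mark the student late",
     "effects": {"rapport": -0.4, "punctuality": 1.0}},
]

intention = deliberate(beliefs, desires, options)
print(intention["name"])
```

The point of the sketch is that neither option is flagged as "right" in advance; the intention simply falls out of how the competing alternatives are weighed, which is the space of deliberation Maher describes.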

[Image: random robot with 2 plus 2]

Apparently, from Ohio to South Korea, robot teachers are already in action. However, these robots are controlled by humans and not (at least not totally) by code. I’m interested in something like Maher’s AMA (which would require an ARA). I guess, to be very creative, we will call it the Artificial Teacher Agent (ATA) for now. If we accept that teachers deal with oughts, then perhaps it wouldn’t be too much of a stretch to go from moral systems to teaching decisions. After getting through the promise and problematics of Kantian and utilitarian ethics as models for machines to run, Maher spends a moment on Aristotelian virtue ethics and case-based ethics, providing an example of Marcello Guarini’s method that “uses a series of cases to train artificial ‘neural network models’ so that any kind of abstract moral rules rise organically from the situation rather than deontologically” (10). The machine would be educated in morals through this process of experiencing case-based ethics of sorts. If we maintain the assumption that teaching can be considered a series of oughts, we might also consider the dynamism and contingency that the situatedness of teaching provides: things that maybe can be planned for, but perhaps still can’t be completely planned for (situations that might somehow violate the rules and models we hold).
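Guarini’s approach can be gestured at with a much humbler stand-in. The following is my own toy sketch, not Guarini’s actual neural network models: rather than encoding a rule like “never grant extensions,” the agent stores past cases with their judgments and decides a new case by analogy to its nearest stored neighbors, so whatever “rule” there is rises only from the accumulated cases.

```python
# A toy case-based learner (my sketch, far simpler than Guarini's
# neural network models): no moral rule is hard-coded; a new case is
# resolved by majority judgment among the most similar past cases.

def distance(a, b):
    """Euclidean distance between two case feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def judge(new_case, past_cases, k=3):
    """Majority judgment among the k most similar past cases."""
    nearest = sorted(past_cases,
                     key=lambda c: distance(c["features"], new_case))[:k]
    votes = sum(1 if c["judgment"] == "grant" else -1 for c in nearest)
    return "grant" if votes > 0 else "deny"

# Invented features: (deadlines missed, quality of work 0-1,
# outside obligations 0-1) -- purely illustrative.
past_cases = [
    {"features": (0, 0.9, 0.8), "judgment": "grant"},
    {"features": (0, 0.8, 0.9), "judgment": "grant"},
    {"features": (4, 0.3, 0.1), "judgment": "deny"},
    {"features": (3, 0.4, 0.2), "judgment": "deny"},
    {"features": (1, 0.7, 0.6), "judgment": "grant"},
]

print(judge((0, 0.85, 0.7), past_cases))  # resembles the "grant" cases
```

Whether anything this crude could cope with a genuinely novel situation (one that violates the models we hold) is exactly the contingency question above.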

Some Teaching Oughts that I thought of (this is furious free association, so it could be a little suspect):

This student ought to spend time doing this kind of writing

I ought to spend more time working on the understanding of the assignment in class with students

I ought to grant an extension because the student has been dealing with a personal issue

I ought to reach out to this student to have a meeting about generating ideas for writing

I ought to be more stern today to send a message that the class or assignment will be difficult

I ought to be more friendly today to send a message that this is a space for experimentation

I ought to say very little today and do more listening to see where they are at

I read a book this summer called After Pedagogy: The Experience of Teaching by Paul Lynch. Lynch’s project is a response to a movement in composition studies called “postpedagogy.” A brief (and probably sloppy) attempt to define this movement would be something like this: work in the field has positioned developments in pedagogical theory as an impossible aim because, as Lynch writes, teaching is “too complex, too particular, too situated to be rendered in any repeatable and therefore portable way” (xiv). In other words, pedagogy cannot be relied upon a priori to reliably make good decisions as teachers. Lynch’s response to the movement is not a rebuttal or a counterargument but instead, drawing from John Dewey’s conception of experience (one that is very difficult to pin down), a move to develop a philosophy of experience in teaching writing that accounts for both the “raw data of everyday living” as well as “our methods of reflecting, repurposing, and learning from everyday living,” so that compositionists can make a sort of yin and yang of pedagogy and contingency (xix).

One of Lynch’s heavy investments for his application of a philosophy of experience to teaching is in the moral system of casuistry. Essentially, casuistry “asks whether and when circumstances change the ways in which we judge moral action….casuists check their judgments against paradigm and analogy and frame their decisions for the particular case at hand….a judgment holds only for the given case” (Lynch 104-105). For an application to classroom ethics, Lynch provides the following example:

“A student asks for an extension. ‘I am swamped in my other classes,’ she says. This claim (which students invariably fail to understand insults the teacher to whom they are talking) might not win much sympathy, until we consider that the student (a) has never missed a deadline, (b) has been stellar all semester long, (c) is holding down a full-time job, (d) is raising three kids alone, and (e) is a member of the honor society. Given these circumstances, teachers might be inclined to bend the rules a bit. This is a basic casuistic situation, in which circumstances seem to demand some deviation from the usual procedure” (111-112).
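Lynch’s example almost reads like pseudocode already, so here is a deliberately crude attempt (my own illustration, with invented weights and an invented threshold) to render the casuistic move: start from the paradigm rule, then ask whether this case’s particular circumstances warrant bending it, without the judgment hardening into a new general rule.

```python
# A toy casuistic check (my own rendering of Lynch's example, not a
# real system): the paradigm rule is "deny extension," and enough
# accumulated mitigating circumstance bends the rule for this case only.

PARADIGM = "deny extension"

def casuistic_judgment(circumstances):
    """Weigh mitigating circumstances against the paradigm, case by case."""
    mitigating = {
        "never missed a deadline": 2,
        "stellar all semester": 2,
        "full-time job": 1,
        "raising three kids alone": 2,
        "honor society member": 1,
    }
    weight = sum(mitigating.get(c, 0) for c in circumstances)
    # The threshold is an assumption; casuists would weigh by analogy,
    # not arithmetic, but the shape of the move is the same.
    return "grant extension" if weight >= 4 else PARADIGM

case = ["never missed a deadline", "stellar all semester", "full-time job",
        "raising three kids alone", "honor society member"]
print(casuistic_judgment(case))                          # the rule bends
print(casuistic_judgment(["swamped in other classes"]))  # paradigm holds
```

Notice that nothing about the second call changes the function: the judgment in the first case holds only for that case, which is the casuistic point.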

There is so much information available and imagined at any given moment in the classroom and in preparing for class. If we imagine an ATA like an AMA, then how great would it be (and possibly, as with the AMA, this would be a requirement) if it were also an ARA, if it also had to explain the decisions it made in the classroom? Lynch’s example is complex, but the classroom certainly has more potential for complexity. The words I choose as a teacher, the activities I design, the way I arrange the room, the questions I ask, the way I speak to students at any given moment, how I talk to one student vs. another student (say, a student with low self-esteem and another with, well, plenty of esteem to go around): all of these are decisions made for one reason or another, and they may lead to myriad outcomes. Because of the situatedness of teaching, I might have no time or foresight to prepare to make some of these decisions (or adjustments to decisions already made). If an ATA had some sort of case-based learning ability, it might make some interesting and effective moves as a teacher.

Perhaps it wouldn’t work, but maybe, like Tenure, there could at least be some training value. Maher also seems to retreat to this sort of possibility about the potential to learn about morality and possibly update it for a digitized world (30). Teaching teachers, via intentional professionalization or more routine observations by administrators or other teachers, has long been held as a difficult and onerous task, especially when implications for one’s career are implicitly or explicitly involved. Maybe having a machine report out decisions it would make in your classroom could be an outstanding way to consider something like Lynch’s argument for a balance between pedagogy and contingency: considering and reflecting on what we’ve done to learn how to (loosely) teach tomorrow (only to again follow the same pattern). Having an ATA (that is also, necessarily, an ARA) explain its decisions to a teacher (novice or seasoned) might be promising for future professionalization and training of teachers. This would ostensibly remove the interpersonal anxiety that sometimes manifests between and among teachers; the lack of a holistic subject with whom one has (or, at least, could have) an ongoing social relationship might allow someone to let their guard down when considering the case analysis by the ATA. This might be kind of like the Rogerian ELIZA, which was found to be a great conversation partner. Rather than right or wrong answers, the ATA could provide options in decision making that the teacher may not have considered, and that information might inform future decisions (pedagogically a priori or ad libbed in the moment of teaching in the future).

Or it could be a terrible intervention that only encourages neoliberal surveillance and quests for “teacher effectiveness.” But I think it is interesting to imagine as a useful possibility.

2 thoughts on “Mr. Roboto only wants you to make the grade”

  1. I do wonder how all of this would end up playing out if the ATA were left to adjudicate a borderline case. For example, I have had weird moments in my teaching where I have had to make decisions in the moment about whether a student would fail my course or not, based on what was an admittedly harsh attendance policy. This bit me in the ass last semester when a student was assaulted and sent to the hospital while already out of absences. Things did not end well for me, because I behaved too procedurally for the situation, or did not have enough meta-procedure in place to actually “be human” about things.

    It’s clear to me, at this point reflecting on my pedagogy, that I have made a lot of my classroom rules too procedural, that I have attempted to construct syllabi that behave like very simple computers to create emotional distance between myself and what are inevitably my decisions. My student last semester undoubtedly felt they were trapped in an endless procedure loop produced by me and my syllabus. How much worse would it have been if a hypothetical student had to navigate some dystopian software bureaucracy? Who would they talk to once the machine had given the final NO? I can’t imagine a world where a computer could cope with the situatedness of the bizarre interpersonal construct that is the classroom.

  2. It is interesting to think of taking the ideas of an AMA and applying them to something like teaching. In this case, perhaps another line of thought could consider teaching procedures instead of replacing or helping classroom teachers. For example, the Google self-driving cars were discussed as AMAs (or machines with the high potential to become AMAs). What if driver’s ed classes were instead replaced with courses of sitting in self-driving cars as they navigated the streets perfectly? Then, the students could take over driving while the self-driving program took a back seat with the ability to forcibly take over driving in the case of an impending accident or major rule violation. Perhaps the self-driving car could make obnoxious beeping sounds when the student was about to violate good driving procedure, like turning without signalling.

    I think the logics for this were already mentioned in a couple of the readings, both in Bogost and in this week’s, but they mainly centered around search-and-rescue robots and search-and-rescue simulation/training programs.
