Mr. Polemic or: Techno-Comprehension and Limited Retention

Please take my response to this week's reading with a grain of salt (I'm pseudo-serious, but I know I'm wrong). I start this post with several questions I have at the forefront (if you would rather respond to one of these, by all means do so) before diving in:

First, what is code? In Katherine Hayles' chapters, I'm left with the vague and nebulous feeling that the phenomenon of "code" signifies an empty signified, making the abstraction too theoretically detached to serve as an independent artifact of analysis.

Second, does the academic move treat language as a monolithic subject, divorced from its current use in programming?

Third, how does technological optimism frame the discussion of computation?

Fourth, literacy and code imply a looking-in/regulatory function that feels deeply violating to communities that do not want to be revealed.

These gestures were all focal points I originally wished to write about, and I think there is much to be discussed, although this post takes a different direction.

Computation, Hayles identifies, "connotes far more than the digital computer, which is only one of many platforms on which computational operations can run" (p. 17), unpacking hardware onto social systems (via langue) and complex modes of interaction and being. In chapter two, Hayles attempts to critically delineate the limits/operations of "speech, writing, and code" (p. 39). She argues that "we cannot afford to ignore code or allow it to remain the exclusive concern of computer programmers and engineers. Strategies can emerge from a deep understanding of code that can be used to resist and subvert hegemonic control by megacorporations" (p. 61). My concern is that theorizing a concept of "code" literacy need not (re)articulate itself apart from speech and writing, but should instead be embedded within pragmatic interpretations of language proper.

 

Hayles argues, "Speech, writing, and code: the three major systems for creating signification interact with each other in millions of encounters every day" (p. 38). When "code" is treated as an external system (la langue), it becomes a privileged technical system exterior to and insulated from speaking and writing. When the technique of coding becomes its own techne, it removes itself from the basic supposition that coding is writing. She argues, "Now that the information age is well advanced, we urgently need nuanced analyses of the overlaps and discontinuities of code with the legacy systems of speech and writing, so that we can understand how the processes of signification change when speech and writing are coded into binary digits" (p. 38). In a logical-positivist manner, she situates code as the next phase of learned knowledge, understanding, and development, because "Computers are no longer merely tools (if they ever were) but are complex systems that increasingly produce the conditions, ideologies, assumptions, and practices that help to constitute what we call reality" (p. 60). Tension emerges when the computer is treated as both simplistic tool and complex machine (a tension latent in this last quotation itself); pinning down its mechanistic processes and capabilities in all their fluidity does not necessarily serve analysis of the artifice proper. In less congested terms, selectively deciding how to theorize code/computation/the digital in relation to speech/word/writing functions as a self-serving mode of interpretation that further externalizes computational literacy, creating an additional layer of literacy and making it harder to teach "non-programmers" its use.

 

Hayles argues, "Code is not the enemy, any more than it is the savior. Rather, code is increasingly positioned as language's pervasive partner. Implicit in the juxtaposition is the intermediation of human thought and machine intelligence, with all the dangers, possibilities, liberations, and complexities this implies" (p. 61). What is problematic in this "partnership metaphor" is that code is part of language, not an external device. Computer code is not a partner to language; computer code is itself language. If the goal is to create a public that is literate and capable of understanding "computer code" (my criticism is more of the concept of code than of the computer itself), we need a public that is literate (with code). If knowledge of the digital becomes a part of learning language itself, then language inherently becomes what it was always/already: code.

 

Digital language needs to articulate itself through discourse at earlier stages of development, both to show its utility and to dispel the magical effect it takes on once it becomes insulated from ordinary language (the "Eliza effect," I believe, is what Wardrip-Fruin calls it). For instance, if computational language were taught at the elementary level (we can easily scrap cursive, etc.), computer "code" would not be a foreign entity but part of our basic system of understanding and literacy. As Hayles notes, computation and code constitute a "metaphor pervasive in culture" (p. 20); the ultimate task is to change the metaphor into something more attainable at the literacy stage.
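To make that demystification concrete, here is a toy sketch, entirely my own and not Weizenbaum's actual DOCTOR script or anything drawn from the readings, of the kind of pattern substitution that produces the Eliza effect: a few rules that reflect a user's words back as questions, with no understanding anywhere in the loop.

```python
import re

# Toy "Eliza"-style responder: a handful of pattern-substitution rules.
# Each rule pairs a regular expression with a response template that
# reuses whatever the user typed after the matched phrase.
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"because (.*)", "Is that the real reason?"),
]

def respond(utterance: str) -> str:
    text = utterance.lower().strip(".!?")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*match.groups())
    return "Tell me more."  # fallback when no rule matches

print(respond("I need a better metaphor for code."))
# -> Why do you need a better metaphor for code?
```

The entire "intelligence" here is a short list of substitution rules; seeing the mechanism laid out this plainly is exactly the kind of early exposure that would strip the magic away.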

Citizenship in the Digital Age

My contribution to this week's reading, Expressive Processing: Digital Fictions, Computer Games, and Software Studies by Noah Wardrip-Fruin, centers around the concepts of digital media, citizenship, technical expertise, and simulation. My concern is only pseudo-germane to the work itself; it is more an indictment of articulating better citizenship as the output of work on computational literature. For example, Wardrip-Fruin argues, "Learning to understand the ideologies encoded in models and processes, especially when unacknowledged by system authors, is an important future pursuit for software studies… to be better citizens we need to understand software critically" (pp. 422-424). There are a few problems I would like to address here. First and foremost, citizenship itself is an ever-fluid concept that hardly seems conceivable in light of recent debate and discussion. Second, the simulated citizen is always/already operative within simulation when addressing "politics." Finally, every act of identification is an act of division (Burke), causing new (in)humanness to occur, and as Wardrip-Fruin argues, "our ability to identify with human characters is closely tied to their graphical representation" (p. 414).

 

My first interjection comes at the point of a Marxist critique of media and citizenship. Tying concerns of computational literacy to machines that cost thousands of dollars (at the cutting edge) translates directly into a new form of academic elitism that may have negative effects. When I finished this book I could remember the aspects pertaining to ideology but couldn't remember much in relation to economy, so I searched for the use of economic value in its relationship to ideology and found something surprising. Each time the book utilized the term "ideology," it was always/already devoid of material conditions and focused purely on symbolic, social, and political ideologies that should not be divorced from economic analysis. If a central tenet of the work's justification resides in the fact that it is a precursor for citizenship, then we need to carefully reanalyze who gets to be a semi-citizen in the first place. In 2013, 83% of households owned a computer, while among poor households the figure was approximately 60 percent (Census.gov).

 

https://www.census.gov/history/pdf/acs-internet2013.pdf

 

This brings me to my second point: was there ever a notion of citizenship devoid of simulation? Politics itself seems to be the bridge where ideas are circulated, simulated, and voted upon to make them actual. At the representational and naive level, our congresspeople directly simulate our responsiveness whenever they cast a vote for or against a particular piece of legislation based on what is articulated to them. If I gave you the next passage from the book without explanation (and removed The Sims from it), you might be convinced that the quote directly supports a politics in tune with its people: "Not only should we aim for engaging expression but also for expression that communicates the evolving state of the underlying system. We should strive for the closeness of surface and simulation achieved by the Sims, but while moving both forward and sideways toward elements of human life other than the most basic" (pp. 415-416). Which brings me to another thought: where does the line between simulation and the real actually lie? There are many times when people think, "this isn't the real me"; what exactly does that proposition mean? In a world where we are rhetorically sound and repudiate higher truths, isn't the "real you" just a series of communicative exchanges in the creation of identity? If so, how does this differ from simulation itself? Just to make fun of myself for a terrible question:

 

 

Finally and most importantly, digital citizenship seems to be here to stay; what does that mean for our current geopolitical situation? If most individuals lack the technical expertise even to understand that our climate is at risk from anthropogenic activity, how is it possible to get individuals to think critically about simulation, whose goal is self-erasure (a simulation succeeds when it disavows its own status as simulation) and which obfuscates the very processes at hand (seriously, can't I just play Grand Theft Auto, SimCity, etc., in peace)? Isn't this going to get harder as "today's authors are increasingly defining rules for system behavior" with greater technical capacity (p. 3)?

 

 

P.S. Just to show everyone that I'm very bad with the fundamentals of a computer, here is an image of me trying to show a chart.

 

 

[image: Untitled]

 

I rest my case:

 

 

 

or

 

Coding til the lights turn off or: Does Not Compute

As the title suggests, I'm currently finishing my two-hour run on Codecademy (really 1.5 hours, but the Cathedral of Learning is very creepy at night and they seem to have switched off the lights outside our computer lab office) with some bittersweet feelings:

[image: Cathedral Is Creepy]

 

Attempting to learn the language of basic code was a harder task than I had anticipated. Most of the time I found myself staring at the page, attempting to recall the early lessons I had flown through, finally culminating in the realization that I should start again from the beginning (using repetition as a means to get a better grasp of a computational language that certainly seems foreign). My ego certainly got checked, as I had overestimated my ability to just "jump in and succeed." I quickly realized that everything I lack academically (attention to detail, patience, and a memory for mathematical signs) is the sine qua non for quickly learning this linguistic structure and trade. After the initial shock wore off, the repetition of trial and error eventually got to the point where some of the commands were beginning to make sense, which quickly shot some optimism back into the fold.
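For anyone curious what I mean by "basic commands," here is a rough reconstruction from memory of the sort of first exercises I kept fumbling; I'm assuming a Python-style beginner track, so the exact prompts and even the language on the site may differ:

```python
# Reconstructed beginner-style exercise (from memory, not copied from the site):
# variables, string concatenation, a list, and a loop.
name = "student"
readings = ["Hayles", "Wardrip-Fruin"]

greeting = "Hello, " + name   # a missing quote or "+" here is exactly where I kept stalling
print(greeting)

for author in readings:       # forgetting the colon on this line throws a SyntaxError
    print("This week's reading: " + author)
```

Nothing in there is conceptually hard; the difficulty is entirely in the unforgiving attention to punctuation that the interpreter demands.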

 

As I write this, I'm excited again about the computational and technical aspects of this course. First and foremost, the efficiency and simplicity of the language (once you are attuned to it), mixed with a determined infrastructure, gives me hope that there might be a computational critical method that can be excavated for rhetorical criticism at large (not just semantically), and while this thought is still nascent, I'm definitely going to be thinking about it as I move forward. Second, and selfishly, I hope that continued work with technical computation can help me resolve some issues with how I approach text in general (whether it be word efficiency or better attention to detail). Finally, I hope that (as the reading suggests) I become pseudo-competent over the next few months of the course and literate enough to understand what's beyond "the black box."