
Making Sense of All Things Digital

For the last few years, I have been volunteering my time at a local senior center, teaching computing skills.  One of the struggles is explaining the subtle cues the computer gives us, as users, to let us know what we can do at any given moment.  What do I mean by “cues”?  This Friday, I talked about how you know when you can type text in a particular spot.  Think about it.  You look for your cursor to change to a blinking vertical line.  Wherever it blinks is where your text will appear when you press the keys on your keyboard.  We all know this, right?  The problem is that there are thousands of these cues.  Each appears to be a small thing, without much consequence.  And yet, by paying attention to these small visual cues, we all know what we can do and when we can do it.  It’s challenging to teach people who aren’t used to paying attention to, much less deciphering, these subtle cues.  I love it, but I’m constantly struggling to explain why things are the way they are on PCs, to help my students make sense of the virtual world.

One of the things I have never been able to explain is why sometimes you need to click and why sometimes you need to double-click.  I would like to be able to articulate a rule about when to use each action, but I have not yet been able to do so.  Instead, I tell the students in this class that they should first try clicking on something and, if nothing happens, try double-clicking.  This explanation feels wholly unsatisfactory to me because I want to believe that computers are logical.  But deep in my heart, I know they aren’t.  They are just as subject to the whims of culture-making as any other artifact of our culture.  And now I have proof of that.  Tim Berners-Lee (that’s SIR Berners-Lee to you) recently admitted that he regrets the double-slash.

Sir Berners-Lee invented the World Wide Web.  The author of the article I linked to says he is considered a father of the Internet, but that’s not true.  There is much confusion about the difference between the Internet and the World Wide Web.  In fact, most people consider them to be the same thing.  But they are not.  The Internet is the hardware that the World Wide Web (which is made of information) resides on.  The Internet was created in the early 1970s.  The World Wide Web was conceived by Berners-Lee in the early 1990s.  Berners-Lee’s achievement is monumental.  We don’t have to give him credit for the entire Internet.  He’s still an amazing guy.

The World Wide Web is made up of web pages and social networking sites and blogs and such, rather than the actual machines that hold all of that information.  When we browse the World Wide Web, we typically use a web browser like Internet Explorer or Firefox.  If you look in the address bar of the web browser you’re using, you will see that the address there contains a number of pieces.  The first part of the address is the protocol that your computer is using to communicate with the computer that contains the information you want to see.  A protocol is simply a set of rules that both computers agree to abide by in their communication.  You can think of a protocol as a language that the computers agree to use when they talk to each other.  Typically, the protocol these days for web browsing is http (hypertext transfer protocol) or https (hypertext transfer protocol secure).  Most of the rest of the address specifies the name of a computer and the name of some space on that computer.

The thing that Berners-Lee regrets is the set of characters he chose to separate the protocol from the rest of the address.  He chose “://”.  He doesn’t regret the colon; it’s a piece of punctuation that represents a separation.  He does regret the two slashes.  They’re superfluous, unnecessary.  This whole conversation makes me feel better about teaching the senior citizens who choose to take my class.  Some digital things are not logical.  They are whims.  Just ask Tim Berners-Lee.
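If it helps to see those pieces laid out concretely, here is a minimal sketch in Python (using the standard urllib.parse module; the example.com address is just a placeholder) that pulls a web address apart at the separator Berners-Lee regrets:

```python
# A small sketch of how a web address breaks apart around "://".
from urllib.parse import urlsplit

address = "https://example.com/some/space/on/that/computer"

parts = urlsplit(address)
print(parts.scheme)  # "https" -- the protocol, everything before the "://"
print(parts.netloc)  # "example.com" -- the name of a computer
print(parts.path)    # "/some/space/on/that/computer" -- the space on that computer

# The same split done by hand: the colon alone would have been enough to
# mark the boundary; the two slashes are just along for the ride.
protocol, rest = address.split("://", 1)
print(protocol)  # "https"
print(rest)      # "example.com/some/space/on/that/computer"
```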

Article written by: Cathie LeBlanc

I am currently Professor of Digital Media at Plymouth State University in Plymouth, NH. I am also the current Coordinator of General Education at the University. I am interested in astrophotography, game studies, digital literacies, open pedagogies, and generally how technology impacts our culture.

1 Comment

  1. Ian

    Poor Seniors. The sad fact is that while computer interfaces have made strides in presenting information and functions to a user, they’re still exceptionally difficult for those without a lot of specialized knowledge. It’s like asking someone to play a game with you, giving them a few of its hundreds of rules, then pestering them when they ‘make a mistake’. The only consistent method a user has to deal with the situation is to abandon task goals for exploration and simply ‘poke & look’.

    This breakdown in communication isn’t so much a literacy issue as it is a design problem (at least until a user has to deal with something that doesn’t make sense). Your heart of hearts is exceptionally correct. While computers are entirely logical, this doesn’t mean their interfaces make any sense. You might be interested in “The Design of Everyday Things” by Donald Norman and/or “The Humane Interface” by Jef Raskin.



Licensed by Cathie LeBlanc under a Creative Commons Attribution 4.0 International License