I have long been interested in design. I started my career as a computer scientist. I had fallen in love with programming at the age of 15 and wanted to be a software developer. I loved to come up with new and unique software solutions to tasks that were important to me. I learned an important lesson in my first job, working as a software developer for The Geary Corporation. Our clients were Fortune 100 companies that typically worked with large software development companies like Electronic Data Systems and Andersen Consulting. Our success against our larger competitors came from our unique way of designing software. We worked with presidents and CEOs on the budget for the software development, and then we worked with the people who would actually use the software to design it. We built quick interface prototypes and tested them with actual users so that we could understand their workflow and build software that supported it. We avoided working with middle managers who might tell us how they thought the software should work but whose opinions were not grounded in the day-to-day reality of using it. Instead, we saw the actual users as the experts whose expertise we needed to elicit, and by focusing our design process on them, we built software that met their needs.
This design lesson has stuck with me. As a computer science faculty member, I co-authored a book called Project-Based Software Engineering: An Object-Oriented Approach, published by Pearson. Our textbook focused on a single path through the vast array of possibilities for how to design software. The path we developed centered on constant interaction with users during the design process, along with rapid prototyping and testing with those same users. The process starts with designing, building, and testing core functionality first; once the core is finished, more and more functionality is added until the software is complete or the deadline for completion arrives. By using this iterative process, problems in understanding users’ needs, expertise, or workflow can be addressed before too many resources are committed to them. In our book, we outlined some real-life software development nightmares in which millions of dollars were spent over long periods of time, resulting in flawed software that never got used. The majority of these nightmares occurred because the design and development process was separated from actual software users for far too long.
Once I left computer science, I began to focus on game design and development as a research and teaching area. I teach my students a game design process that is similar to the software design process that I learned at Geary and promoted in my textbook. The iterative game design process that I teach starts with deciding what we want the player to experience while playing our game. This player experience goal becomes the deciding factor as we revise our game through the design process. Once we have a player experience goal, we brainstorm ideas for how we might give the player that experience (what the game might look like and be about). Very quickly, we formalize some of those ideas into a game prototype, which we playtest with some users. Whether we are creating board games, card games, computer games, social games, etc., we make a crude paper prototype with core gameplay mechanics and perhaps a few other features laid out, and we ask some people to play the game. We gather feedback from these playtesters and determine whether the decisions we have made up to this point move us closer to providing players with the experience we set as our goal at the beginning. We figure out where our problem points are (where the player experience goal gets muddied) and brainstorm some ideas about how to fix those problems. We build a new prototype and playtest again. We repeat this process over and over until we’re satisfied with the result (or, more likely, until we reach our development deadline: the end of the semester).
So why am I talking about my experience with design as a field? I’m doing some research for a new class for the Spring 2019 semester. The class is called Designing Online and Face-to-Face Experiences for New PSU Students and is part of the pilot project for our new Integrated Capstone experience for General Education. I’m making the syllabus publicly available as I develop the course. The idea of the course is that the enrolled students (who are juniors and seniors) will learn about the various activities we provide for new students coming to PSU. We will talk with the admissions and student affairs staff who are responsible for putting on these events in order to learn the goals of these experiences. The students will remember and reflect on their own incoming experiences. They will interview their peers about their experiences. We will read about experience design and service design and learn about the tools these fields use. And then the students will design new experiences for our incoming students. Finally, they will share their ideas with the admissions and student affairs staff.
In my preparation for this class, I’ve been reading a lot about experience design, service design, interaction design, and other design disciplines. Although these are distinct disciplines, they seem to share many commonalities, which matches what my own experience across several of them has led me to believe. In my search for research that articulates the commonalities among these disciplines, I came across a foundational article from 35 years ago called “Designerly Ways of Knowing” by Nigel Cross.
Writing in 1982, Cross makes the case for design as a third culture or “way of knowing” that all students, regardless of major, should be required to study. The first two cultures that all students are already required to study are the sciences and the humanities. He contrasts the three cultures along several dimensions:

The sciences study the natural world, the humanities study human experience, and design studies the artificial world.

The sciences use the methods of controlled experiment, classification, and analysis; the humanities use analogy, metaphor, criticism, and evaluation; design uses modelling, pattern-formation, and synthesis.

The sciences value objectivity, rationality, neutrality, and a concern for ‘truth’; the humanities value subjectivity, imagination, commitment, and a concern for ‘justice’; design values practicality, ingenuity, empathy, and a concern for ‘appropriateness.’

Scientists and humanists rely on deductive and inductive thinking, while designers rely on constructive thinking. Scientists think and communicate in numerical modes, humanists in verbal and literary modes, and designers in nonverbal, graphic images (drawings, diagrams, sketches, etc.).

Through this argument for the study of design as a third culture, Cross articulates the tools and attitudes of design that are important for everyone to learn as part of their general education.
In particular, Cross compares the problem-solving strategies of scientists and humanists with those of designers. Scientists and humanists spend a significant amount of time learning about the problem to be solved. Scientists, for example, tend to believe that a thorough understanding of the problem to be solved will result in the discovery of a fundamental rule describing the solution to the problem. In other words, the solution exists and the scientist needs to observe the world carefully to discover the solution. Designers, on the other hand, learn about the nature of the problem by trying out solutions, by focusing on the desired result. Cross says, “The scientists adopted a generally problem-focused strategy and the architects a solution-focused strategy.” Designers are “constrained to produce a practicable result within a specific time limit, whereas the scientist and [humanist] are both able, and often required, to suspend their judgements and decisions until more is known–‘further research is needed’ is always a justifiable conclusion for them.”
Cross also says that design problems tend to be ill-defined, ill-structured, or “wicked.” “They are not problems for which all the necessary information is, or ever can be, available to the problem-solver.” Design is focused on “how things ought to be.” That is, designers focus on changing the constructed world. An educated person, Cross argues, must be able to “understand the nature of ill-defined problems, how to tackle them, and how they differ from other kinds of problems.”
Finally, Cross identifies five aspects of designerly ways of knowing:
Designers tackle ‘ill-defined’ problems.
Their mode of problem-solving is ‘solution-focused’.
Their mode of thinking is ‘constructive’.
They use ‘codes’ that translate abstract requirements into concrete objects.
They use these codes to both ‘read’ and ‘write’ in ‘object languages’.
This articulation of “design” as a discipline that provides important and useful skills, knowledge, and ways of engaging with the world makes sense to me. Cross argues that design should be an explicit part of a general education program, just as the sciences and the humanities are part of general education programs. I’ll need to spend some time thinking about that and how we might incorporate these designerly ways of knowing more fully into PSU’s Gen Ed program. In the meantime, this article has helped me think about what I want students to gain from taking my INCAP course in the Spring.
I am currently Professor of Digital Media at Plymouth State University in Plymouth, NH. I am also the current Coordinator of General Education at the University. I am interested in game studies, digital literacies, open pedagogies, and generally how technology impacts our culture.