Now that this shortened, very challenging Fall 2020 semester is over, I have started thinking about January and the Spring 2021 semester. As I posted back in October, we received a grant from the Northeast Big Data Hub to create a Data Analytics Learning Community (DALC). The grant provides stipends for 5 steering committee members to facilitate and plan activities for the DALC, as well as stipends for 10 faculty members to participate in the learning community. We had 15 applications for the 10 stipends, and several of the people who were not chosen to receive stipends have indicated that they would like to participate anyway. The DALC kicks off with a weeklong workshop in January.
From the grant application: “The goal of the workshop will be to build faculty sense of competence, autonomy, and relatedness so that they feel able to effectively engage students in classroom activities that move beyond the traditional lecture and into the realm of using data science projects in the classroom.” In other words, the January workshop will teach faculty about data analytics, but we will also have many conversations about applying what we know about how people learn so that faculty can plan activities that fully engage their students. We have a model for this kind of faculty professional development in the Cluster Pedagogy Learning Community (CPLC), which has been running on our campus for a year and a half. In particular, we engage faculty in the CPLC using the same strategies, techniques, and tools that we think they could use in their own classes, so that faculty experience that engagement as learners. We will use a similar strategy for the DALC.
In doing research about the principles of the science of learning, I encountered two resources that I found particularly interesting for our planning.
The first is a 2019 article about strategies for professional development for the teaching of middle school science. It describes a study that compared the student learning outcomes resulting from two different approaches to teacher professional development. (As an aside, I have full access to this article because I work at a university that provides such access. The article is otherwise behind a paywall owned by Wiley, which makes it inaccessible to many people who would benefit from its results.) From the abstract: “Ninety schools were randomly assigned into one of three arms: (a) a treatment arm in which the textbook curriculum was modified based on four principles of cognitive science coupled with teacher professional development (PD), (b) a second treatment arm in which teachers received PD designed to improve their knowledge of the science content, and (c) a business‐as‐usual control group.”
In particular, the professional development for the cognitive science group focused on helping teachers apply four known principles of learning to the science content: (1) case comparisons to identify similarities and differences that illustrate abstract concepts; (2) visualization and diagrammatic reasoning related to communicating scientific concepts; (3) spaced testing that revisits previous learning; and (4) a focus on connecting new knowledge with prior knowledge while also addressing misconceptions in that prior knowledge. The professional development for the content knowledge group focused solely on increasing individual teachers’ knowledge of the science content they would be teaching. The control group did not receive any professional development related to the teaching of science.
The random assignment of schools to the three arms happened to skew the teaching experience of the participating teachers: the cognitive science group averaged 7.5 years of teaching experience, while the other two groups averaged 15 years. Such imbalances sometimes occur with random assignment.
The authors measured two outcomes: (1) the science knowledge of the teachers and (2) the science knowledge of the students. They found that the teachers’ science knowledge varied little regardless of which professional development arm they participated in. The students’ science knowledge (as measured by end-of-unit tests as well as state standardized tests), however, was significantly higher when their teacher had engaged in the professional development based on science of learning principles, despite the fact that those teachers had, on average, half the teaching experience of the other two groups. Interestingly, the science knowledge of students whose teachers had participated in the content knowledge professional development was lower than that of students in the control group. In other words, professional development that focuses solely on teachers’ knowledge of the science content appears to have actually harmed the learning of those teachers’ students. The researchers also note that the teachers who received professional development related to cognitive science and learning were able to transfer their skills to the teaching of scientific concepts other than the ones they encountered in the professional development.
I think these findings support the idea that if a goal of our learning community is to help faculty become better teachers of data analytics content, we should focus on helping them develop pedagogical skills that incorporate the principles of the science of learning.
The second resource is a highly cited document from Deans for Impact that summarizes the existing research from cognitive science related to how students learn, and, most importantly, connects this research to its practical implications for teaching and learning. The document is full of interesting principles related to learning along with ideas for how teachers can put those principles to use in their classrooms. There is also a substantial works cited section that provides additional background about all of the principles. As we think about the January workshop and the resulting DALC, we will want to model the principles in the design of the professional development. The particular principles that I think it will be critical for us to model are:
Students learn new ideas by reference to ideas they already know.
Students have limited working memory capacities that can be overwhelmed by tasks that are cognitively too demanding. Understanding new ideas can be impeded if students are confronted with too much information at once.
Information is often withdrawn from memory just as it went in. We usually want students to remember what information means and why it is important, so they should think about meaning when they encounter to-be-remembered material.
The transfer of knowledge or skills to a novel problem requires both knowledge of the problem’s context and a deep understanding of the problem’s underlying structure.
We understand new ideas via examples, but it’s often hard to see the unifying underlying concepts in different examples.
As I said, the document contains ideas about how to incorporate these principles into teaching, so we will use those ideas as we develop activities and discussions for the DALC.
I am looking forward to learning with the DALC and to seeing more data analytics content taught on our campus. If there is anyone from PSU who would like to participate in the DALC (without a stipend), please feel free to contact me.
Image Credit: I took this photo on November 28, 2020. To me, it looks like a visual representation of trying to make sense of a bunch of data.
I am currently Professor of Digital Media at Plymouth State University in Plymouth, NH. I am also the current Coordinator of General Education at the University. I am interested in game studies, digital literacies, open pedagogies, and generally how technology impacts our culture.