Author: Gloria E. Jacobs

Synthesis and Design Workshop: Principles for the equitable design of digitally-distributed, studio-based, STEM learning environments

This is a post from Jill Castek of the University of Arizona about a collaborative interdisciplinary workshop held at the UA’s Biosphere outside of Tucson, AZ.  Be sure to click on the included link to view the slide show that provides more detail.


The Synthesis and Design Workshop dovetails with work on digital equity and inclusion. Our efforts to convene this workshop were centered on a single question: *How do we design innovations in learning for the future with inclusive principles at the core of cyberlearning?*

We designed the workshop with four outcomes in mind: (1) advance knowledge regarding critical gaps and opportunities, (2) identify and characterize models of collaboration, networking, and innovation that operate within and across STEM education, (3) improve our understanding of effective practices for generating sustained interest and success in STEM fields, and (4) synthesize evidence to inform the effective uses of technologies and design principles for establishing and assessing effective, inclusive environments.

We’re sharing a brief overview of our project in these few slides as a means of prompting dialogue, connections, and critique that will help push our thinking forward as we continue to craft our white paper and final report for the National Science Foundation summarizing the efforts we’ve undertaken. These efforts pave the way for sustained work that brings together research and practice communities.

On behalf of our whole team at the University of Arizona, we thank you for the opportunity to share our thinking, and we welcome continued conversations, connections, and partnerships that will push this work forward and increase its impact. You can connect on Twitter @jillcastek.

Launch of 21st Century Learning Ecosystems News (LENS) Blog

The Literacy, Language and Technology Research Group at Portland State University and the EdTech Center @ World Education are proud to launch a new blog series, the 21st Century LENS. We will be using the 21C LENS to provide regular updates on our three-year study of workplace learning. The goal of this research is to identify the essential factors of an effective learning ecosystem by learning about and from working learners’ experiences, choices, preferences, and motivations.

Please take a look at the blog, subscribe, and share widely!  The 21CLEO team will be cross-posting to the DLAER Hub on occasion.  Look for information on Twitter using #21CLEO

University/Library Partnership in Research: How We Did It and What We Learned

In this post, we describe how we conducted research about digital problem solving processes of library users.  This research was a collaboration between the Multnomah County Library system (Portland, OR) and researchers at Portland State University and the University of Arizona.  With generous support from the Institute of Museum and Library Services, we were able to spend three years conducting this research. We recognize that three years is a longer time commitment than most libraries are able to devote to a research project. However, we offer specifics about our process and methods in the hope they may help you in research you conduct in your setting.

The research team
The team was made up of university researchers focused on digital literacies and library staff who work with library users. Each collaborator offered her own unique perspective. As with many collaborations, the collective whole was greater than the sum of its parts. Collaboration occurred at every stage of the research process, from planning and design of the project through data collection, data analysis, presenting findings at conferences, and sharing the results widely.

We learned that collaboration was key to the success of this research project. Each member of the research team contributed a unique set of skills that proved invaluable for working with library patrons within the community and for understanding the data.  University/library partnerships hold great promise for advancing our understandings of digital problem solving and adult learning in general.

The research tools
We used three different tools for collecting data, and each tool provided a different lens on our examination of digital problem solving. Together, these tools offered insight into the lives and learning of library patrons and, when combined, yielded rich interpretive power for describing digital problem solving processes.

  • We designed a library use background survey to collect data on basic demographic information, access to digital devices and the internet, specifics about library use and use of the library’s website, and participants’ self-assessment of the difficulty of tasks commonly completed using the library’s website.

The results from the survey were essential for helping us understand the backgrounds and experiences of those who participated in the study. The survey also allowed us to identify two distinct groups of people in the study’s population: individuals who have internet access at home and use the library’s website for multiple purposes, and those who access the internet mostly from the library and who use the internet and the library’s resources for specific purposes (in a more limited way). The survey responses allowed us to make comparisons across these distinct populations.

  • We used PIAAC’s Problem Solving in Technology Rich Environments (PSTRE) assessment, which is available through Education and Skills Online for a fee. We chose the PSTRE because it is one of the few valid, reliable, scenario-based assessments of digital problem solving available for adults. After taking the assessment, participants are provided with a score report that summarizes their results. The organization that pays for the test administration is provided with a spreadsheet of scores that includes a scaled score and a level.

Language of the tool. We learned that although the PSTRE can be administered in Spanish, the register of the assessment differed from the Spanish spoken by the majority of Multnomah County Library patrons.  

Interpreting results. We learned that while the PSTRE is useful for understanding patterns across demographic groups, it is less useful for gaining deeper insight into how people actually engage in digital problem solving. We were interested in instructional implications, but the assessment wasn’t designed with that goal in mind.

Resources for learning. We also found that the scores provided to participants needed interpretation, so we developed an explanation of the scores they received along with a list of resources for those individuals who want to improve their digital problem solving.  

Application to learning in libraries. We discovered that this tool was focused on contexts that many participants perceived as workplace oriented, and thus it may not be the right tool for collecting data within libraries. There continues to be a need for an assessment specific to library contexts.

  • Because the PSTRE does not provide the information needed by librarians to inform the design of instructional supports, we developed an observation and interview protocol for use with participants as they completed the PSTRE and a set of online library tasks based on the PSTRE framework’s four core areas:  1. Setting Goals and Monitoring Progress, 2. Planning, Self-organizing, 3. Acquiring and Evaluating Information, and 4.  Using Information.
  • Participants were partnered with another library patron to encourage them to talk through the problem solving process; however, when a participant’s partner did not show up for the scheduled session, one of the researchers teamed up with that participant rather than asking them to work alone. We also asked questions as participants completed the tasks in order to gain insight into what they were doing. When participants began to struggle, one of the researchers would step in and help so as to minimize frustration and provide a learning opportunity.

We learned that assessment contexts can be instructionally rich. When working with individuals, we were flexible and supportive of their needs.  Although these sessions were work intensive, we found that the data generated gave us a great deal of insight into the nature of digital problem solving.  Additionally, participants walked away having acquired some new strategies for navigating different kinds of online contexts.

Who participated?
Approximately 450 library users completed the library use background survey. Of those invited to continue, 211 completed the PSTRE assessment. Because we decided against using the Spanish version, all of our participants used English as their primary language.

At different times, we had participants complete the PSTRE assessment on their own computers at home and found that remote study participation presented challenges. We administered the assessment multiple times and had more success recruiting participants from vulnerable populations when we set up administration sessions in libraries and other community locations.

Compensating individuals for their time. Upon completing the PSTRE, participants received a payment of $20.  An additional 18 individuals participated in the observation and interview protocol. These individuals were invited to participate through the library’s outreach program and were paid $40.  

We learned that the financial incentive was important to our population and encouraged participation. We had to be flexible in accommodating the number of people who wanted to participate.

Data Analysis
Quantitative analysis included descriptive and inferential statistics: (a) basic demographics; (b) comparisons between two groups; and (c) a latent class analysis of the relationship between the two groups and key variables. The two groups were those who participated in the first round of data collection and those who were recruited through the library’s outreach program. We conducted the same descriptive statistical analyses for both groups and found that their demographics differed. We used the latent class analysis to compare the two groups using three key variables from the library background survey and the PSTRE scores.
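To make these quantitative steps concrete, here is a minimal, hypothetical sketch of the kinds of group comparisons described above. The file name and column names (group, home_internet, pstre_score) are assumptions for illustration, not the actual study data, and the latent class analysis itself was run with specialized software, so it is only noted in a comment.

```python
import pandas as pd
from scipy import stats

# Hypothetical file: one row per participant, with group membership,
# selected survey variables, and the PSTRE scaled score.
df = pd.read_csv("survey_and_pstre.csv")

# (a) Descriptive statistics for each group
print(df.groupby("group")["pstre_score"].describe())
print(df.groupby("group")["home_internet"].value_counts(normalize=True))

# (b) Between-group comparison of PSTRE scaled scores (Welch's t-test)
round1 = df.loc[df["group"] == "round1", "pstre_score"].dropna()
outreach = df.loc[df["group"] == "outreach", "pstre_score"].dropna()
t, p = stats.ttest_ind(round1, outreach, equal_var=False)
print(f"Welch's t = {t:.2f}, p = {p:.3f}")

# (c) The latent class analysis relating group membership to three key survey
# variables and the PSTRE scores was run with specialized software; not shown.
```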

Qualitative analysis was conducted using codes developed from the PSTRE framework and a coding scheme developed by the research team. Initially, the whole team worked together to analyze selected screen capture recordings. Once we were in agreement on how to code the data, we divided into subgroups to analyze the remaining recordings. We continued to meet weekly to discuss what we were seeing in the data. Once all data were coded, the university researchers developed categories and themes, which were shared with and further developed by the whole team.
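Along the same lines, here is a small, hypothetical sketch of how coded observation data might be tallied once a team agrees on a coding scheme. The file and column names are assumptions for illustration, not our actual workflow.

```python
import pandas as pd

# Hypothetical file: one row per coded segment from the screen capture
# recordings, with the participant ID and the assigned code (e.g.,
# "setting_goals", "planning_self_organizing", "acquiring_evaluating",
# "using_information").
codes = pd.read_csv("coded_segments.csv")

# Frequency of each code across all recordings
print(codes["code"].value_counts())

# Code-by-participant matrix, useful for spotting patterns to bring to
# weekly team discussions before developing categories and themes
print(pd.crosstab(codes["participant"], codes["code"]))
```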

Our data analysis valued examining data through the eyes of practitioners as well as researchers.  Having a team with a wide range of backgrounds and experiences generated rich discussions about what we were seeing and the implications for libraries.  

Final Thoughts
The research prompted rich discussions among researchers and library staff, and these discussions yielded outcomes that were well worth the effort. The collaboration was generative in both the short and long term. Looking at research and practice together, through multiple lenses, introduced new insights that encouraged reflection, innovation, and new ideas. Although you may not have the luxury of a large research team and years to conduct the research, we hope that some of the things we’ve learned can help you learn more about the needs of the individuals you work with. We encourage you to share your experiences with us and ask questions of other readers of this blog.

Critical Thinking and Digital Problem Solving: What’s the Difference?

Brian Kane is the Digital Literacy Coordinator at Literacy Volunteers of Rochester.  The Digital Literacy program at LVR provides a free drop-in service where individuals can learn basic computer skills or get assistance completing  computer-essential tasks. This service is provided by volunteers who work one-to-one with learners. Brian read our blog and asked, “What’s the difference between critical thinking and problem solving? Or, are they essentially the same thing?”   As Brian explained in his email,

I’m re-thinking completely the way Digital Literacy thinks about what we’re trying to achieve with customers, and as a consequence how we train and support volunteers. Ultimately, we want to help customers become more digitally literate (capable of using technology on their own, and learning new skills as they go).

Brian’s question, and his insight about an expanded view of literacy in today’s world, required a great deal of thought, so we wanted to share our thinking with the DLAER Hub community. Rethinking literacy learning in the digital age is what we’re engaged in too, so it’s exciting to see how these ideas are playing out in practice in different contexts.

Before turning to a discussion of digital problem solving, it might be useful to review what we mean by digital literacy.  

What is digital literacy?  There are many definitions of digital literacy available, but the one we have found ourselves using most frequently is one presented by the American Library Association:

Digital literacy is the ability to use information and communication technologies to find, evaluate, create, and communicate information.  

In an issues brief, our colleague Kathy Harris provided a concrete definition of basic digital literacy skills:

The physical ability to (1) use digital devices, (2) create and use computer files, and (3) choose appropriate digital applications for different purposes.

What is critical thinking? Critical thinking is a term commonly used in the field of education, whether one is working with children or adults. It is commonly understood as the ability to analyze and evaluate information in order to come up with an answer or form a judgment. This type of thinking occurs throughout our lives, online and offline, whenever we are asked to think about the information we are presented with and make a decision about it. Critical thinking is a key component of digital problem solving.

In general, we prefer the term digital literacies, in the plural, because it involves a composite of competencies: basic computer skills, the ability to navigate online interfaces and use digital and online tools efficiently, and the ability to engage in educational pursuits and digital networking.

What is digital problem solving? As part of our work with the Multnomah County Library on digital problem solving with library patrons, we developed a definition of digital problem solving that differentiated it from a number of related constructs such as Problem Solving in Technology Rich Environments (PSTRE), Media Literacy, Information Literacy, New Literacies of Online Reading and Research, and Digital Literacies.  

Digital Problem Solving involves the nimble use of skills, strategies, and mindsets required to navigate online in everyday contexts, including the library, and use novel resources, tools, and interfaces in efficient and flexible ways to accomplish personal and professional goals.

So what’s the difference between critical thinking and digital problem solving?

We suggest that digital problem solving occurs at the intersection of critical thinking skills and basic digital literacy skills, but goes beyond both to embrace a discovery orientation, along with the ability and willingness to be flexible and to adapt to novel situations. This is essential in a digital world marked by constant evolution and change in technologies, interfaces, tools, and texts.

Digital problem solving involves learning how to learn within ever-changing digital environments.

Critical thinking is part of digital problem solving. During digital problem solving, a learner engages in critical thinking, sometimes individually and at other times collaboratively, especially when examining different websites and resources, determining which ones are most useful for solving the task at hand, and deciding which information within any given resource is relevant.

What is different about digital problem solving is that in addition to thinking critically about information, learners  also have to be able to flexibly and rapidly shift perspective and adapt their skills to novel situations.

For example, during observations of learners navigating novel technology-rich environments, we saw people who tried to enact the same strategy or approach again and again when solving a digital problem, even though it didn’t work the first, second, or third time. So even if a digital resource had the information the learner needed, they still weren’t able to solve the problem at hand because they weren’t able to make use of the resource in a meaningful or efficient way.

We hope this post provides some clarity to the issue and helps those of you in the field continue to build on your important work.  We encourage you to make comments and keep asking hard questions!