
Synthesis and Design Workshop: Principles for the equitable design of digitally-distributed, studio-based, STEM learning environments

This is a post from Jill Castek of the University of Arizona about a collaborative, interdisciplinary workshop held at the UA’s Biosphere 2 outside Tucson, AZ.  Be sure to click on the included link to view the slide show, which provides more detail.


The Synthesis and Design Workshop dovetails with our ongoing work on digital equity and inclusion. Our efforts to convene this workshop were centered on the question: *How do we design innovations in learning for the future with inclusive principles at the core of cyberlearning?*

We designed the workshop with four outcomes in mind: (1) advance knowledge regarding critical gaps and opportunities, (2) identify and characterize models of collaboration, networking, and innovation that operate within and across STEM education, (3) improve our understanding of effective practices for generating sustained interest and success in STEM fields, and (4) synthesize evidence to inform the effective uses of technologies and design principles for establishing and assessing effective, inclusive environments.

We’re sharing a brief overview of our project in these few slides as a means of prompting dialogue, connections, and critique that will help push our thinking forward as we continue to craft our white paper and final report for the National Science Foundation summarizing the efforts we’ve undertaken.  This groundwork paves the way for sustained efforts that bring together research and practice communities.

On behalf of our whole team at the University of Arizona, we thank you for the opportunity to share our thinking, and we welcome continued conversations, connections, and partnerships that will push this work forward and increase its impact.  You can connect with us on Twitter @jillcastek.

Digital Equity Act of 2019

We’ve not posted much on this blog lately, but that doesn’t mean we haven’t been busy! Before we give you a brief update on our activities, we want you to know about the Digital Equity Act of 2019.

The Digital Equity Act was introduced into the US Senate by Sen. Patty Murray (D-WA).  A companion House bill will soon be introduced by US Representative Jerry McNerney (D-CA).  The Digital Equity Act supports a diverse array of digital equity projects at the state and local level to help close the digital divide.  The National Digital Inclusion Alliance created the Digital Equity Act website as a way to share information about the act.

Twitter #digitalequitynow


And now a little bit about what we’ve been doing!

We’ve been hard at work writing for publications, and we’ll be sure to let you know when those come out. We’ve also been presenting at conferences, thinking about future projects, and moving forward on our work on the 21CLEO project, which investigates the experiences of incumbent workers as they engage and persist in employer-supported learning opportunities.  Through this work we intend to capture and elevate the voices of working learners.  Additionally, we are striving to cross silos and build a culture of inquiry. To that end, we have a blog where we regularly post updates about our work and invite comments. Take a trip over to the 21CLEO blog to join the conversation.

New Publication! Promising Library Practices: Assessing and Instructing Digital Problem Solving

We have a new article, entitled Promising Library Practices: Assessing and Instructing Digital Problem Solving, out in a special issue of the International Journal on Innovations in Online Education. Authored by Jill Castek and Gloria Jacobs, the article introduces a practical framework and tools that can be used or adapted to assess the digital problem-solving abilities of library users. Castek and Jacobs argue that a structured digital problem-solving assessment can be used by librarians and other library staff to customize supports for underserved adult learners specifically, and to enhance learning experiences with digital tools in libraries for the wider public more generally.

Many thanks to the editors, Sandy Toro and Chuck Thomas, for bringing this special issue to fruition, and to our reviewers for providing valuable feedback as we honed the manuscript.  We also could not have done this work without our colleagues at the Multnomah County Library, Portland State University, and the University of Arizona.

We hope you find the ideas in the article thought-provoking and helpful as you consider your work. We’d love to get your feedback, so please feel free to comment.  And take a look at the other articles in the issue too!  There’s some good stuff!

Launch of 21st Century Learning Ecosystems News (LENS) Blog

The Literacy, Language and Technology Research group at Portland State University and the EdTech Center @ World Education are proud to launch a new blog series, the 21st Century LENS. We will be using the 21C LENS to provide regular updates on our three-year study of workplace learning. The goal of this research is to identify essential factors of an effective learning ecosystem by learning about and from working learners’ experiences, choices, preferences, and motivations.

Please take a look at the blog, subscribe, and share widely!  The 21CLEO team will be cross-posting to the DLAER Hub on occasion.  Look for information on Twitter using #21CLEO

University/Library Partnership in Research: How We Did It and What We Learned

In this post, we describe how we conducted research about digital problem solving processes of library users.  This research was a collaboration between the Multnomah County Library system (Portland, OR) and researchers at Portland State University and the University of Arizona.  With generous support from the Institute of Museum and Library Services, we were able to spend three years conducting this research. We recognize that three years is a longer time commitment than most libraries are able to devote to a research project. However, we offer specifics about our process and methods in the hope they may help you in research you conduct in your setting.

The research team
The team was made up of university researchers focused on digital literacies and library staff who work with library users. Each collaborator offered her own unique perspective.  As with many collaborations, the whole was greater than the sum of its parts. Collaboration occurred at every stage of the research process, from planning and design of the project through data collection, data analysis, presenting findings at conferences, and sharing the results widely.

We learned that collaboration was key to the success of this research project. Each member of the research team contributed a unique set of skills that proved invaluable for working with library patrons within the community and for understanding the data.  University/library partnerships hold great promise for advancing our understandings of digital problem solving and adult learning in general.

The research tools
We used three different tools for collecting data, and each tool provided a different lens on digital problem solving.  Together, these tools offered insights into the lives and learning of library patrons and yielded rich interpretive power for describing digital problem solving processes.

  • We designed a library use background survey to collect data on basic demographic information, access to digital devices and the internet, specifics about library use and use of the library’s website, and participants’ self-assessment of difficulty for tasks commonly completed using the library’s website.

The results from the survey were essential for helping us understand the background and experiences of those who participated in the study.  The survey also allowed us to identify two distinct groups of people in the study’s population: individuals who have internet access at home and use the library’s website for multiple purposes, and those who access the internet mostly from the library and who use the internet and the library’s resources for specific purposes (in a more limited way). The survey responses allowed us to make comparisons across these distinct populations.
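As a rough illustration only, here is a minimal sketch of how survey responses might be split into those two groups for comparison. The column names (home_internet, website_uses) and the threshold are hypothetical assumptions, not the actual survey fields we used.

```python
import pandas as pd

# Hypothetical survey export: one row per respondent.
# Column names and values are illustrative, not the study's actual fields.
survey = pd.DataFrame({
    "home_internet": ["yes", "no", "yes", "no", "yes"],
    "website_uses":  [5, 1, 3, 2, 4],  # number of distinct library-website purposes reported
})

# Split respondents into the two groups described above:
# broad-use respondents with home access vs. library-dependent, limited-purpose users.
broad_use = survey[(survey["home_internet"] == "yes") & (survey["website_uses"] >= 3)]
limited_use = survey[survey["home_internet"] == "no"]

print(len(broad_use), "broad-use respondents;", len(limited_use), "library-dependent respondents")
```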

  • We used PIAAC’s Problem Solving in Technology Rich Environments (PSTRE) assessment. This assessment is available through Education and Skills Online for a fee.  We chose the PSTRE because it is one of the few valid and reliable scenario-based assessments of digital problem solving available for adults.  After taking the assessment, participants are provided with a score report that summarizes their results. The organization that pays for the test administration is provided with a spreadsheet of scores that includes a scaled score and a level.

Language of the tool. We learned that although the PSTRE can be administered in Spanish, the register of the assessment differed from the Spanish spoken by the majority of Multnomah County Library patrons.  

Interpreting results. We learned that while the PSTRE is useful for understanding patterns across demographic groups, it is less useful for gaining a deeper insight into how people actually engage in digital problem solving. We were interested in instructional implications, but this assessment wasn’t designed for this goal.   

Resources for learning. We also found that the scores provided to participants needed interpretation, so we developed an explanation of the scores they received along with a list of resources for those individuals who want to improve their digital problem solving.  

Application to learning in libraries.  We discovered that this tool focuses on contexts that many participants perceived as workplace oriented, and thus it may not be the right one for collecting data within libraries. There continues to be a need for a tool specific to library contexts.

  • Because the PSTRE does not provide the information needed by librarians to inform the design of instructional supports, we developed an observation and interview protocol for use with participants as they completed the PSTRE and a set of online library tasks based on the PSTRE framework’s four core areas:  1. Setting Goals and Monitoring Progress, 2. Planning, Self-organizing, 3. Acquiring and Evaluating Information, and 4.  Using Information.
  • Participants were partnered with another library patron to encourage them to talk through the problem solving process; however, a handful of participants did not show up for their scheduled sessions. Rather than asking the remaining participant to work alone, one of the researchers teamed up with that participant.  We also asked questions as participants completed the tasks in order to gain insight into what they were doing. When participants began to struggle, one of the researchers would step in and help, both to minimize frustration and to provide a learning opportunity.

We learned that assessment contexts can be instructionally rich. When working with individuals, we were flexible and supportive of their needs.  Although these sessions were work intensive, we found that the data generated gave us a great deal of insight into the nature of digital problem solving.  Additionally, participants walked away having acquired some new strategies for navigating different kinds of online contexts.

Who participated?
Approximately 450 library users completed the library use background survey. Of those invited to continue in the study, 211 completed the PSTRE assessment.  Because we decided against using the Spanish version, all of our participants used English as their primary language.

At different times, we had participants complete the PSTRE assessment on their own computers at home, and we found that remote study participation posed challenges.  We readministered the assessment multiple times and had more success recruiting participants from vulnerable populations than when we set up administration sessions in libraries and other community locations.

Compensating individuals for their time. Upon completing the PSTRE, participants received a payment of $20.  An additional 18 individuals participated in the observation and interview protocol. These individuals were invited to participate through the library’s outreach program and were paid $40.  

We learned that the financial incentive was important to our population and encouraged participation. We had to be flexible in accommodating the number of people who wanted to participate.

Data Analysis
Quantitative analysis included descriptive and inferential statistics: (a) basic demographics; (b) comparisons between two groups; and (c) a latent class analysis of the relationship between the two groups and key variables. The two groups were those who participated in the first round of data collection and those who were recruited through the library’s outreach program. We conducted the same descriptive statistical analyses of both groups and found that the demographics of the two groups differed. We then used the latent class analysis to compare the two groups using three key variables from the library background survey along with the PSTRE scores.
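To give a feel for the descriptive and group-comparison steps, here is a hedged sketch in Python; it is not the team’s actual analysis code. The dataset, column names, and the choice of a t-test are all assumptions for illustration, and the latent class analysis is not shown.

```python
import pandas as pd
from scipy import stats

# Hypothetical merged dataset: one row per participant, with group membership,
# a survey variable, and the PSTRE scaled score. Names and values are illustrative only.
df = pd.DataFrame({
    "group":       ["round_one", "outreach", "round_one", "outreach", "round_one", "outreach"],
    "age":         [34, 52, 41, 47, 29, 61],
    "pstre_score": [291, 243, 278, 255, 302, 230],
})

# (a) Basic descriptives, overall and by group.
print(df.describe())
print(df.groupby("group")["pstre_score"].agg(["count", "mean", "std"]))

# (b) A simple comparison of PSTRE scores between the two groups
#     (Welch's t-test shown here as one reasonable option).
round_one = df.loc[df["group"] == "round_one", "pstre_score"]
outreach = df.loc[df["group"] == "outreach", "pstre_score"]
t, p = stats.ttest_ind(round_one, outreach, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")
```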

Qualitative analysis was conducted using codes developed from the PSTRE framework and a coding scheme developed by the research team.  Initially, the whole team worked together to analyze selected screen capture recordings. Once we were in agreement on how to code the data, we divided into subgroups to analyze the remaining recordings. We continued to meet weekly to discuss what we were seeing in the data.  Once all data were coded, the university researchers developed categories and themes, which were shared with and further developed by the whole team.
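As a small, hypothetical sketch of how coded observations might be tallied once coding is complete: the code labels, framework-area names, and data layout below are illustrative assumptions, not the team’s actual coding scheme.

```python
import pandas as pd

# Hypothetical coded segments from the screen-capture recordings:
# one row per coded segment, tagged with a PSTRE framework area and a team-developed code.
segments = pd.DataFrame({
    "participant":    ["P01", "P01", "P02", "P02", "P03"],
    "framework_area": ["Setting goals", "Planning", "Acquiring/evaluating information",
                       "Using information", "Planning"],
    "team_code":      ["rechecks task criteria", "explores interface", "ignores irrelevant results",
                       "copies text into form", "repeats same strategy"],
})

# Tally how often each framework area and team-developed code appears,
# as one starting point for developing categories and themes.
print(segments.groupby(["framework_area", "team_code"]).size())
print(segments["framework_area"].value_counts())
```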

Our data analysis valued examining data through the eyes of practitioners as well as researchers.  Having a team with a wide range of backgrounds and experiences generated rich discussions about what we were seeing and the implications for libraries.  

Final Thoughts
The research prompted rich discussions among researchers and library staff, and these discussions yielded outcomes that were well worth the effort. The collaboration was generative in both the short and long term.  Looking at research and at practice together, through multiple lenses, introduced new insights that encouraged reflection, innovation, and new ideas. Although you may not have the luxury of a large research team and years to conduct the research, we hope that some of the things we’ve learned can help you learn more about the needs of the individuals you work with. We encourage you to share your experiences with us and ask questions of other readers of this blog.

Innovations in Digital Literacy Programs

Our guest blogger today is Drew Pizzolato, co-manager of NTEN’s Digital Inclusion Fellowship.

When was the first time you were excited to use a computer? What got you interested in learning new tools and skills on the computer? What was your first spark of joy and creativity online? It seems like everyone who wasn’t born using a smartphone has a story about their early days using computers or the internet. While these stories often have common themes, they’re all a little different. We all have our own quirks, interests, and skills that prime us for our spark moment – that first instance where technology becomes a living, exciting, and accessible world of possibility.  As the co-manager of NTEN’s Digital Inclusion Fellowship, I am lucky to support digital inclusion leaders around the United States.  Fellows join a year-long, cohort-based professional development program where they build skills in program management, leadership, and digital inclusion. They spend the year focused on developing and implementing new or expanded digital literacy programs. We often brainstorm new ideas about how they can engage new community members. How can we foster those spark moments to engage learners in building digital skills?

Teachers and program designers are driving some of the most fun and exciting innovations in digital literacy programming by incorporating the latest technology and gadgets into their classes. In a recent webinar hosted by NTEN, presenters shared some of the unique programs they’ve developed. Taina Evans from the Brooklyn Public Library described Xbox bowling for seniors as well as an Amazon Alexa Jeopardy session. I’m really inspired by these types of programs. As internet-connected devices become ubiquitous and the ways we interact with them proliferate, digital literacy leaders should be looking for new opportunities to connect adults with technology. Not everyone needs to enter the digital world through keyboarding and email. Why not voice-controlled AI and virtual bowling?

Similarly, creative program designers are finding new ways to give learners the opportunity to build skills in contemporary digital spaces. New tools with sophisticated user interfaces are designed so that users experience immediate success while intuitively exploring and experimenting with features. The simplicity and intuitive nature of these interfaces provide opportunities for new-to-computer users, too. Andrew Farrelly is a 2018 Digital Inclusion Fellow and a coordinator at Rising Tide Capital, a nonprofit that uplifts communities and individuals by empowering entrepreneurs to grow their businesses. To help his adult students build mousing skills, Andrew introduced them to Canva, a web-based design and publishing tool. Its simple drag-and-drop interface allows students to practice their mousing skills while creating authentic brochures and business cards that they can use for their small businesses. Students loved playing with the tool and creating their own custom materials. Canva, and tools like it, are a fun and easy update on the classic “create your own greeting card/business card/etc.” activity.

So, if you’re eyeing the latest gadget for your own work or leisure, think about how it might be implemented in a digital literacy class. If you have other stories of high-tech gadgets and interfaces in the new-to-technology user space, I’d love to learn about them. And if you’re interested in learning more about the Digital Inclusion Fellowship, you’ll be pleased to know that we’re currently accepting applications for the 2019 cohort. Learn more on the application page.

21st Century Learning Ecosystem Opportunities Research Project

We are pleased to announce a new project led by Kathy Harris, Applied Linguistics faculty at Portland State in Portland, OR.  Co-investigators include Jill Castek of the University of Arizona in Tucson, AZ, Jen Vanek of WorldEd in Minneapolis, MN, and Gloria Jacobs of Portland State and University of Arizona.

The new project, entitled 21st Century Learning Ecosystems Opportunities (21CLEO), is a $750,000, three-year grant from the Walmart Foundation. The purpose of the project is to study the 21st century learning ecosystem of adults employed in frontline service work. The national research team will explore the barriers and incentives to participation in employer-provided education as well as other learning opportunities, with a focus on digital literacies and digital access. An important goal of the research is to amplify the voices of frontline service workers.

The progress of the research will be shared via the 21CLEO blog, hosted by WorldEd.  We will be cross-posting on the DLAERHub blog.

We are excited about this new project and hope you will follow our progress and contribute your thoughts via comments on the 21CLEO blog.

Critical Thinking and Digital Problem Solving: What’s the Difference?

Brian Kane is the Digital Literacy Coordinator at Literacy Volunteers of Rochester.  The Digital Literacy program at LVR provides a free drop-in service where individuals can learn basic computer skills or get assistance completing  computer-essential tasks. This service is provided by volunteers who work one-to-one with learners. Brian read our blog and asked, “What’s the difference between critical thinking and problem solving? Or, are they essentially the same thing?”   As Brian explained in his email,

I’m re-thinking completely the way Digital Literacy thinks about what we’re trying to achieve with customers, and as a consequence how we train and support volunteers. Ultimately, we want to help customers become more digitally literate (capable of using technology on their own, and learning new skills as they go).

Brian’s question, and his insight about an expanded view of literacy in today’s world, is a great one and required a great deal of thought, so we wanted to share our thinking with the DLAERHub community. Rethinking literacy learning in the digital age is what we’re engaged in too, so it’s exciting to see how these ideas are playing out in practice in different contexts.

Before turning to a discussion of digital problem solving, it might be useful to review what we mean by digital literacy.  

What is digital literacy?  There are many definitions of digital literacy available, but the one we have found ourselves using most frequently is one presented by the American Library Association:

Digital literacy is the ability to use information and communication technologies to find, evaluate, create, and communicate information.  

In an issues brief, our colleague Kathy Harris provided a concrete definition of basic digital literacy skills:

The physical ability to (1) use digital devices, (2) create and use computer files, and (3) choose appropriate digital applications for different purposes.

What is critical thinking? Critical thinking is a term commonly used in the field of education, whether one is working with children or adults. It is commonly understood as the ability to analyze and evaluate information in order to come up with an answer or form a judgment.  This type of thinking occurs throughout our lives, online and offline, whenever we are asked to think about the information we are presented with and make a decision about it. Critical thinking is a key component of digital problem solving.

In general, we prefer to use the term digital literacies, referring to it in the plural because a composite of competencies is involved: basic computer skills, the ability to navigate online interfaces and use digital and online tools efficiently, and the ability to engage in educational pursuits and digital networking.

What is digital problem solving? As part of our work with the Multnomah County Library on digital problem solving with library patrons, we developed a definition of digital problem solving that differentiated it from a number of related constructs such as Problem Solving in Technology Rich Environments (PSTRE), Media Literacy, Information Literacy, New Literacies of Online Reading and Research, and Digital Literacies.  

Digital Problem Solving involves the nimble use of skills, strategies, and mindsets required to navigate online in everyday contexts, including the library, and use novel resources, tools, and interfaces in efficient and flexible ways to accomplish personal and professional goals.

So what’s the difference between critical thinking and digital problem solving?

We suggest that digital problem solving occurs at the intersection of critical thinking skills and basic digital literacy skills, but goes beyond both to embrace a discovery orientation, with the ability and willingness to be flexible and to adapt to novel situations. This is essential in a digital world that is marked by constant evolution and change in technologies, interfaces, tools, and texts.

Digital problem solving involves learning how to learn within ever-changing digital environments.

Critical thinking is part of digital problem solving. During digital problem solving, a learner engages in critical thinking, sometimes individually and at other times collaboratively, especially when examining different websites and resources, determining which ones are the most useful for solving the task at hand, and determining which information within any given resource is relevant.

What is different about digital problem solving is that in addition to thinking critically about information, learners  also have to be able to flexibly and rapidly shift perspective and adapt their skills to novel situations.

For example, during observations of learners navigating in novel technology-rich environments, we saw people repeatedly try the same strategy or approach when solving a digital problem, even though it didn’t work the first, second, or third time. So even if the digital resource had the right information the learner needed, they still weren’t able to solve the problem at hand because they weren’t able to make use of the resource in a meaningful or efficient way.

We hope this post provides some clarity to the issue and helps those of you in the field continue to build on your important work.  We encourage you to make comments and keep asking hard questions!

The Role of Experience in Digital Problem Solving

Our work with digital problem solvers made clear the importance of an individual’s experience with a variety of digital tools.

The ability to use digital tools to problem solve is affected by the socioeconomic and cultural situation of the problem solvers. For example, do they have access to both digital hardware and high speed internet? If so, is it at home, or do they have to go to another location such as a library or school? Also, do the problem solvers have access to learning opportunities and support around digital problem solving?  As our previous research into digital literacy acquisition shows, experience with online tools contributes to an individual’s confidence in using those tools. Individuals who do not have consistent and reliable access to hardware and the internet, or the opportunity to learn, are therefore unlikely to build the confidence required to navigate the multiple online resources involved in problem solving.

In the qualitative findings report from the Digital Equity in Libraries study, we provide scenarios of problem solvers that demonstrate this point. The scenarios are not meant to be generalized to all problem solvers. Instead they are meant to illustrate the differences between more experienced and less experienced problem solvers. Because problem solving is context dependent, how individual problem solvers move through a digital problem necessarily differs. Furthermore, not all problem solvers demonstrate all behaviors and may, in fact, demonstrate a mix of behaviors from the experienced and less experienced columns.

The table below, which also appears in the qualitative findings report, encapsulates the differences we saw between more experienced problem solvers and those with less experience.

More experienced problem solvers | Less experienced problem solvers
adapted to novel environments | struggled within novel environments
surveyed the resources available to them | tended to overlook many of the resources available or focus on the resources immediately apparent or those that are familiar to them
decided/figured out how to use resources to their best advantage | once aware of resources, needed support to determine how to use them
used exploration/investigation to learn new approaches | were slow to examine what the resources and interfaces offer
let go of what they know (from previous experience) | relied heavily on approaches they know from past experience and resist trying new approaches
figured out what does not work through trial and error | were hesitant to try something different for fear of making a mistake, and they needed reassurance before trying something new

We would love to hear what you’ve seen in your work with adult digital problem solvers.  Have you seen the same types of things we documented in our work?  What other ways have you seen people approach digital problem solving?

Digital Problem Solving Strategies

In previous posts, we’ve shared (a) the interactive relationship of approaches, strategies, and purpose; and (b) the different approaches we saw individuals use when engaging in digital problem solving.  In today’s post we discuss the digital problem solving strategies we saw individuals use.  We stress that this list is not necessarily inclusive of all problem solving strategies. You may see other strategies, and we invite you to share those with us in the comments!

We identified six strategies, which we’ve organized into two sets:

  • Information focused: These ask: do you understand the purpose of the task, do you know what information you need, and are you accomplishing what you expected?
  • Navigation focused: These ask: do you understand the purpose or function of a digital interface, are you willing to explore the different resources and interfaces available to you, and are you willing to experiment with the different digital interfaces to solve your task?

Information seeking strategies

  • identifying information needed,
    Choosing to ignore the information you don’t need and focusing on what is needed. Doing this not only at the beginning but throughout the problem solving process. This is related to the strategy of checking and rechecking one’s work.  This is also related to the strategy of identifying one’s purpose.
  • identifying the purpose of the task,
    Continually clarifying or specifying the purpose by reviewing the criteria on an ongoing basis.  This is related to the strategy of identifying the information needed and can include checking and rechecking.
  • checking and rechecking one’s work.
    Clarifying your purpose and examining the criteria against one’s progress on the task.  Making sure that you’ve accomplished your task.

These three strategies may be enacted at any given time and recursively. For example, a problem solver needs to identify what information is needed and the purpose of the task as they set goals, but then they have to recheck the information needed and purpose of the task as they monitor their progress; plan their next steps; and acquire, evaluate, and use the information.

Navigation Strategies

  • identifying the purpose of an interface or function of an interface,
    Based on prior experience and one’s personal learning needs, or exploration and experimentation, understanding why a particular interface or function of an interface is used.
  • exploring the resources and interfaces,
    Taking the time to look at the different resources and what the functions are within an interface before attempting to solve the problem or complete the task.  When the current method isn’t working, stopping mid-task to look at the resources and functions within an interface before continuing.
  • experimenting with the interfaces
    Using the interface and functions in different ways to see how the output differs. Trying different functions to see which one would give the desired result.

These three strategies may be enacted in order for the problem solver to make use of hardware, software, commands and functions, and representations (text, sound, numbers, graphics, videos).

For more illustrative examples of these strategies, read our complete report on PDX Scholar.