University/Library Partnership in Research: How We Did It and What We Learned

In this post, we describe how we conducted research about digital problem solving processes of library users.  This research was a collaboration between the Multnomah County Library system (Portland, OR) and researchers at Portland State University and the University of Arizona.  With generous support from the Institute of Museum and Library Services, we were able to spend three years conducting this research. We recognize that three years is a longer time commitment than most libraries are able to devote to a research project. However, we offer specifics about our process and methods in the hope they may help you in research you conduct in your setting.

The research team
The team was made up of university researchers focused on digital literacies and library staff who work directly with library users. Each collaborator offered a unique perspective, and as with many collaborations, the whole was greater than the sum of its parts. Collaboration occurred at every stage of the research process, from planning and designing the project through collecting and analyzing data, presenting findings at conferences, and sharing the results widely.

We learned that collaboration was key to the success of this research project. Each member of the research team contributed a unique set of skills that proved invaluable for working with library patrons in the community and for understanding the data. University/library partnerships hold great promise for advancing our understanding of digital problem solving and adult learning more generally.

The research tools
We used three different tools for collecting data, and each provided a different lens on digital problem solving. Together, these tools revealed insights into the lives and learning of library patrons and, when combined, offered rich interpretive power for describing digital problem solving processes.

  • We designed a library use background survey to collect basic demographic information, data on access to digital devices and the internet, specifics about library use and use of the library’s website, and participants’ self-assessed difficulty of tasks commonly completed using the library’s website.

The results from the survey were essential for helping us understand the backgrounds and experiences of those who participated in the study. The survey also allowed us to identify two distinct groups within the study’s population: individuals who have internet access at home and use the library’s website for multiple purposes, and those who access the internet mostly from the library and use the internet and the library’s resources for specific, more limited purposes. The survey responses allowed us to make comparisons across these distinct populations.
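To make that grouping concrete, here is a minimal sketch of how survey responses might be split into the two groups using pandas. The file name, column names, and the cutoff for “multiple purposes” are all hypothetical stand-ins, not the study’s actual instrument.

```python
import pandas as pd

# Hypothetical survey export; the file and column names are illustrative.
survey = pd.read_csv("library_use_survey.csv")

# Split respondents into the two groups described above: home internet
# access with broad website use vs. library-based access with narrower use.
has_home_access = (survey["home_internet"] == "yes") & (survey["num_website_uses"] >= 3)
survey["group"] = has_home_access.map({True: "home_access", False: "library_access"})

# Compare the groups on any survey item, e.g. a self-rated difficulty score.
print(survey.groupby("group")["website_task_difficulty"].describe())
```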

  • We used PIAAC’s Problem Solving in Technology Rich Environments (PSTRE) assessment, which is available through Education and Skills Online for a fee. We chose the PSTRE because it is one of the few valid, reliable, scenario-based assessments of digital problem solving available for adults. After taking the assessment, participants receive a score report that summarizes their results, and the organization that pays for the test administration receives a spreadsheet of scores that includes a scaled score and a level.

Language of the tool. We learned that although the PSTRE can be administered in Spanish, the register of the assessment differed from the Spanish spoken by the majority of Multnomah County Library patrons.  

Interpreting results. We learned that while the PSTRE is useful for understanding patterns across demographic groups, it is less useful for gaining deeper insight into how people actually engage in digital problem solving. We were interested in instructional implications, but the assessment wasn’t designed with that goal in mind.

Resources for learning. We also found that the scores provided to participants needed interpretation, so we developed an explanation of the scores along with a list of resources for individuals who wanted to improve their digital problem solving.

Application to learning in libraries. We discovered that the tool focused on contexts that many participants perceived as workplace oriented, and it may therefore not be the right one for collecting data in libraries. There continues to be a need for a tool specific to library contexts.

  • Because the PSTRE does not provide the information needed by librarians to inform the design of instructional supports, we developed an observation and interview protocol for use with participants as they completed the PSTRE and a set of online library tasks based on the PSTRE framework’s four core areas: (1) Setting Goals and Monitoring Progress, (2) Planning and Self-organizing, (3) Acquiring and Evaluating Information, and (4) Using Information.
  • Participants were partnered with another library patron to encourage them to talk through the problem solving process; however, a handful of participants did not show up for their scheduled sessions, leaving their partners without a pair. Rather than asking those participants to work alone, one of the researchers teamed up with them. We also asked questions as participants completed the tasks in order to gain insight into what they were doing, and when participants began to struggle, one of the researchers would step in and help, both to minimize frustration and to provide a learning opportunity.

We learned that assessment contexts can be instructionally rich. When working with individuals, we were flexible and supportive of their needs. Although these sessions were labor intensive, the data they generated gave us a great deal of insight into the nature of digital problem solving. Additionally, participants walked away with new strategies for navigating different kinds of online contexts.

Who participated?
Approximately 450 library users completed the library use background survey. Of those invited to continue with the study, 211 completed the PSTRE assessment. Because we decided against using the Spanish version, all of our participants used English as their primary language.

At different points in the study, we had participants complete the PSTRE assessment on their own computers at home and found that remote participation posed challenges. We administered the assessment multiple times and had more success recruiting participants from vulnerable populations when we set up administration sessions in libraries and other community locations.

Compensating individuals for their time. Upon completing the PSTRE, participants received a payment of $20.  An additional 18 individuals participated in the observation and interview protocol. These individuals were invited to participate through the library’s outreach program and were paid $40.  

We learned that the financial incentive was important to our population and encouraged participation. We also had to be flexible in accommodating the number of people who wanted to participate.

Data Analysis
Quantitative analysis included descriptive and inferential statistics: (a) basic demographics; (b) comparisons between the two groups; and (c) a latent class analysis of the relationship between the two groups and key variables. The two groups were those who participated in the first round of data collection and those who were recruited through the library’s outreach program. We conducted the same descriptive statistical analyses for both groups and found that their demographics differed. We then used latent class analysis to compare the two groups using three key variables from the library background survey together with the PSTRE scores. A minimal sketch of the descriptive and between-group pieces appears below.
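This sketch shows only steps (a) and (b) with pandas and SciPy; the file and column names are hypothetical, and a Welch’s t-test stands in for whatever group comparison a given study would actually choose. Latent class analysis (c) is typically run in specialized software (for example, the poLCA package in R), so it is not shown here.

```python
import pandas as pd
from scipy import stats

# Hypothetical merged file of survey responses and PSTRE results;
# all column names here are illustrative.
df = pd.read_csv("merged_survey_pstre.csv")

# (a) Descriptive statistics, computed separately for each recruitment group.
print(df.groupby("recruitment_group")[["age", "pstre_scaled_score"]].describe())

# (b) A simple between-group comparison of PSTRE scaled scores.
first_round = df.loc[df["recruitment_group"] == "first_round", "pstre_scaled_score"]
outreach = df.loc[df["recruitment_group"] == "outreach", "pstre_scaled_score"]
t_stat, p_value = stats.ttest_ind(first_round, outreach, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.3f}")
```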

Qualitative analysis was conducted using codes developed from the PSTRE framework and a coding scheme developed by the research team. Initially, the whole team worked together to analyze selected screen capture recordings. Once we agreed on how to code the data, we divided into subgroups to analyze the remaining recordings. We continued to meet weekly to discuss what we were seeing in the data. Once all data were coded, the university researchers developed categories and themes, which were shared with and further developed by the whole team.
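The post does not say how agreement was checked before the team split into subgroups, but a common approach is an inter-rater reliability statistic such as Cohen’s kappa. Here is a minimal sketch with scikit-learn, using made-up codes loosely named after the PSTRE framework’s four core areas.

```python
from sklearn.metrics import cohen_kappa_score

# Codes assigned independently by two coders to the same ten recording
# segments; the labels are illustrative, not the team's actual scheme.
coder_a = ["goals", "planning", "acquiring", "using", "goals",
           "acquiring", "planning", "using", "goals", "acquiring"]
coder_b = ["goals", "planning", "acquiring", "using", "planning",
           "acquiring", "planning", "using", "goals", "using"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa = {kappa:.2f}")  # values near 1 indicate strong agreement
```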

Our data analysis valued examining data through the eyes of practitioners as well as researchers.  Having a team with a wide range of backgrounds and experiences generated rich discussions about what we were seeing and the implications for libraries.  

Final Thoughts
The research prompted rich discussions among researchers and library staff, and those discussions yielded outcomes that were well worth the effort. The collaboration was generative in both the short and the long term. Looking at research and practice together, through multiple lenses, introduced new insights that encouraged reflection, innovation, and new ideas. Although you may not have the luxury of a large research team and years to conduct the research, we hope that some of the things we’ve learned can help you learn more about the needs of the individuals you work with. We encourage you to share your experiences with us and to ask questions of other readers of this blog.