Lessons Learned From a Research Paper

Academic librarians often conduct original research to fulfill tenure-track requirements or their own professional development needs. Unlike most other terminal degree programs, Library Science graduate programs do not all require a research methodology course, leaving some librarians at a disadvantage.1

After conducting our first research project as academic librarians, we realized we had taken some missteps and would have benefited from a primer that walked us through the basics. When administering our first survey, we discovered that ample forethought, planning, and investigation are essential. This article evaluates our first foray into research and provides accessible tips and guidelines for our colleagues.

We focused our project on the New York University Abu Dhabi (NYUAD) Library’s preference for e-books. Our library has the unique opportunity to build a new collection that capitalizes on the most recent developments in 21st-century academic publishing. E-books suit our collection well because they do not require physical space and can be shared by the global NYU community. The NYUAD Library collection development policy thus states that e-books are considered the first purchase option when available. If e-book collecting continues to grow exponentially, the NYUAD Library may someday hold only one-tenth of its projected 1 million monographs in print.

The NYUAD Library benefits from being part of the NYU library system: our patrons can borrow from the multi-million-volume collection at Bobst Library. While our ties with Bobst offer rich resources, we still need to build a collection to support the research and teaching at NYUAD.

Early in our first academic year (2010–11), faculty and students began voicing negative comments about library e-books. We found that some students tried to print entire e-books, while others disliked extended reading on a computer screen and viewed print books as a respite from being plugged in.

We established a research project with a threefold purpose: to determine whether negative comments about e-books were widespread or held by only a few; to gauge whether the library’s preference for e-books was truly serving the broader NYUAD user community; and to inform our collection development policy in order to better match our patrons’ preferences.

We surveyed the NYUAD population (384 people in total: 147 students and 237 faculty and staff members) during the 2011 spring semester. With the survey completed by over half of our university population and 100 percent of our student body, we expected the data would answer our original research queries.
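For context, the sketch below recomputes what that coverage means in raw numbers. The student figures follow directly from the text (147 students, all of whom responded); the faculty-and-staff response count is a hypothetical placeholder, since the text states only that overall completion exceeded half of the population.

```python
# Rough response-rate check based on the population figures cited above.
# The faculty/staff response count is a hypothetical placeholder; the text
# says only that overall completion exceeded 50% of the 384-person population.

population = {"students": 147, "faculty_staff": 237}   # 384 people in total
responses = {"students": 147, "faculty_staff": 60}     # students: 100%; faculty/staff: assumed

for group, pop in population.items():
    rate = responses[group] / pop
    print(f"{group}: {responses[group]}/{pop} responded ({rate:.0%})")

total_resp, total_pop = sum(responses.values()), sum(population.values())
print(f"overall: {total_resp}/{total_pop} responded ({total_resp / total_pop:.0%})")
# With 60 assumed faculty/staff responses, the overall rate is roughly 54%,
# consistent with "over half of our university population."
```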

After analyzing the results, however, we realized that there were a few problems with our methodology. While we did learn a lot about patron e-book usage, our outcomes did not correspond well with our initial queries. Our survey experience taught us several things about the research process that we should have considered when planning.

Lessons learned

1) Establish a clear research focus and small goals. At the beginning of our project, we lacked a clear purpose and a sense of urgency. We began with the general goal of learning about our community’s perceptions of e-books and then following up with an article that evaluated those perceptions. We should have taken the advice we give students: focus your topic and get to work.

With summer fast approaching, we realized we might lose the opportunity to survey our campus. Suddenly we had motivation. We chose the best time to offer the survey and worked backwards to develop a schedule: seek approval from our supervisors; secure an incentive for respondents; write and test the survey; advertise the survey; and explore options for survey distribution.

By mid-May, our survey was complete and the harder work of analyzing results began. As we discovered problems with manipulating the data, we became discouraged about the possibility of publishing our results. Could we still draw insightful and helpful conclusions?

The tide turned again, however, when the opportunity arose for an ACRL research writing consultation at the ALA Annual Conference. This meeting gave us the impetus to compile preliminary results of the survey and to outline the article we hoped to write. The consultation was very helpful, and our consultant, Aline Soules, offered an alternative approach to our topic, which resulted in this article.

Our next steps included submitting an article query and developing a writing plan. This was a challenge since the writing of our article coincided with the start of a new school year, but we set small goals and divided the work. The main lesson here: have a clear sense of direction and use external forces to motivate your writing process.

2) Garner support from colleagues. This lesson has two parts. First, the process of administering a survey, analyzing the results, and then writing an article is involved and time-consuming. As a result, it is beneficial to divide the work. Find a colleague who is interested in collaborating. Not only will the tasks be distributed, but your project will also benefit from the input and skills of others. Additionally, you will be motivated to complete the work as others hold you accountable.

Second, make sure you have your colleagues’ support. Explain your goals and ask their advice. In our case, this meant asking for our colleagues’ input on the survey and requesting financial support to purchase an incentive for respondents. Our supervisors were generous enough to offer an iPad, which was the perfect reward, considering the topic. We also asked our fellow librarians and staff to help advertise the survey and to offer technical support.

3) Do your homework. When most of us hear the phrase lit review, we groan. Library science literature is not known for being pleasure reading. It is essential, however, to find out what has been written about your topic to be sure you are adding something new to the conversation.

In our case, we found many articles about e-book usage but nothing specifically about returning to print collecting in response to anti-e-book sentiment. The article “Why Aren’t E-Books Gaining More Ground in Academic Libraries?”2 surveys the literature on e-book usage, access, and acquisition. It certainly informed our understanding of how other libraries are handling the e-book question, but it did not address our particular question. Reading the library literature assured us that we had a unique angle and that there appeared to be a gap in the literature.

We conducted further research when we decided on the new approach of writing about “lessons learned.” We found articles suggesting tips for the writing process, such as one describing the benefits of working with colleagues in a dossier support group.3 Since we were unable to find any article exactly like the one we wanted to write, we had the confidence to move forward with our query.

The bottom line: no one works in a vacuum. We are part of the larger world of academic libraries. When conducting research, engage the community by reading relevant literature and joining the larger conversation.

4) Know your software. NYU Libraries subscribe to Qualtrics, a Web-based survey tool. We created several test surveys to understand how the program worked and to make sure it would meet our needs. After the testing, we were confident that Qualtrics was a good choice; however, some glitches appeared while administering the survey and afterward when analyzing the results.

One issue occurred when offering the survey in the campus café. We had several laptops set up during lunch hours so we could take advantage of the midday crowd. We encouraged people to take turns using the laptops to complete the survey, but we soon realized that Qualtrics blocks repeat surveys in the same browser. A workaround was to run several browsers simultaneously and to refresh them after each survey, but this led to some scrambling on our part.

A larger issue emerged when analyzing our results. We originally decided to survey the entire campus community, including students, faculty, and staff. After looking at the results and trying to draw conclusions, we realized that to reflect our community’s e-book usage accurately we needed the ability to analyze the data by population. We understood after the fact that most staff members likely do not use library e-books, because the collection consists primarily of academic titles. Since nearly a quarter of respondents were staff (23%), including their responses dramatically skewed the results. We assumed it would be easy to remove staff responses and re-examine the data using only faculty and student input. However, after more closely examining the Qualtrics program, we discovered that there was no easy way to eliminate staff responses. Ultimately, we were left with data that did not accurately represent academic e-book preferences and uses. Spending more time working with Qualtrics initially would have helped us avoid this mistake and possibly given us the desired outcome.
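A workaround we could have planned for is exporting the raw responses and segmenting them outside the survey tool. The following sketch, written in Python with pandas, illustrates the general idea; the file name and column labels (“role,” “ebook_use”) are hypothetical stand-ins for whatever an actual export would contain, and the snippet describes a generic post-hoc analysis rather than a Qualtrics feature.

```python
# Minimal sketch: segment exported survey responses by respondent group.
# File name and column labels ("role", "ebook_use") are hypothetical; they
# stand in for whatever the survey export actually contains.
import pandas as pd

responses = pd.read_csv("ebook_survey_export.csv")

# Keep only the academic population (drop staff responses).
academic = responses[responses["role"].isin(["Student", "Faculty"])]

def pct_never(df: pd.DataFrame) -> float:
    """Share of respondents who report never reading library e-books."""
    return (df["ebook_use"] == "Never").mean() * 100

print(f"Never read e-books, all respondents:      {pct_never(responses):.0f}%")
print(f"Never read e-books, students and faculty: {pct_never(academic):.0f}%")
```

Since we could tell that 23% of respondents were staff, the survey evidently captured each person’s role; testing an export and a filter like this while piloting the survey would have surfaced the reporting limitation before the data came in.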

5) Be prepared to respond. Finally, explore the ramifications of your research. Is your institution ready to respond to your results? How will your colleagues react to your findings? It is impossible to predict responses before conducting a survey; however, it is possible to discuss your research plans with your colleagues and to design your project in a way that will more likely yield viable indicators for change.

We designed our project in response to patron complaints about e-books. We did not think to pose the question, “If survey results indicate that our campus is resistant to e-books, could we really shift the collection development policy? What other options might be available?” In hindsight, it would have been helpful to consult our colleagues at NYU-New York to gauge whether there was room for change in e-book policies.

The results of our survey indicate that 71% of respondents read library e-books only a few times a year or not at all. The survey also finds that 43% of participants do not read library e-books, though this figure was skewed by staff responses. Patrons’ major reasons for e-book frustration included discomfort with reading on electronic displays for extended periods, network dependence, and the distraction of other online activities. The survey also reveals a desire for printing options and the ability to annotate.

While one respondent said, “E-books are incredibly convenient and in many circumstances are ideal for research,” another said, “I find it extremely difficult to quickly navigate ebooks [and I] just print [the pages] anyway.”

The survey yielded a new set of questions: Are e-books easy for our patrons to locate? Would the use of e-books increase if they were available on all e-readers? Do we need e-books in more or different disciplines? The only clear finding that emerged from the survey was that 73% of respondents would like to know more about e-book access and usage.

Since e-books are a crucial building block of the NYUAD Library collection, they will remain the focus of the collection development policy. Our goal should not be to change the policy; rather, we should aim to better educate our readers in finding and using e-books.

The lesson learned: seek to understand the broader forces affecting your research topic in order to ask the appropriate questions. If you customize the project accordingly, your institution will be better able to respond to the results.

Conclusion

As librarians conducting our first research study, we were excited to contribute to the conversation about e-book usage and preferences among university populations. We thought it was a good time to survey our campus on this topic after listening to comments from students and faculty, and we were confident that we would be able to survey a large percentage of our population. While we found the exercise valuable and it provided some insight into the e-book pulse at NYUAD, we did not achieve our intended outcome of informing our collection development policy.

As librarians new to the process, we learned that we would have benefited from structured planning, collegial support, reading the current literature in advance, testing (and re-testing) any survey tools, and ensuring that the environment is ready to respond to the results. While we were fortunate to have some of these pieces in place for our own project, a little more homework about the process of conducting research would have provided us with greater benefits in the end.


Notes
1. C. Tysick and N. Babb, “Perspectives on . . . Writing Support for Junior Faculty Librarians: A Case Study,” The Journal of Academic Librarianship 32, no. 1 (2006): 94. Tysick and Babb write: “An analysis of 48 ALA-accredited MLS programs in the United States shows that 54 percent require students take a research methods course while only 10 percent require a thesis or project.”
2. R. Slater, “Why Aren’t E-Books Gaining More Ground in Academic Libraries? E-Book Use and Perceptions: A Review of Published Literature and Research,” Journal of Web Librarianship 4, no. 4 (2010): 305–331.
3. K. A. Hanna, A. O’Brien, and K. F. Petsche, “Our Excellent Adventure: A Somewhat Irreverent Look at How Three Tenure Track Librarians Prepared Their Dossiers and Lived To Tell About It,” College & Research Libraries News 69, no. 9 (2008): 554–556.

Copyright © 2012 Laura Andersen and Beth Russell
