Lynne Lysiak, Appalachian State University Library, December 17, 2000
The presentation was badly misnamed, and absolutely fascinating.
I'm probably biased toward one set of jargon, but when I hear the word "accessibility", my first thoughts are about handicap access: can someone who's color-blind use the site, or someone who uses a text-to-speech processor? My second thoughts are about multi-platform accessibility: is the site readable in Netscape (Navigator or Communicator), MSIE, and Lynx; is the site readable on PC, Mac, and X Windows?
Well, this talk wasn't about that.
It was about web *usability*. More specifically, it was about testing library web pages to see if students can actually use them to find information. Answer: no, they can't.
Lynne Lysiak, of Appalachian State University (ASU) Library, first heard about the idea of web accessibility testing about two years ago, at a conference presentation given by people from University of Arizona (UofA). She thought it sounded fascinating, and told her boss about it. A few weeks later, he also heard a conference presentation on the topic. Nothing happened, though.
Last year, her former boss moved to Georgia Southern (GSU), and discovered that GSU's library was interested in redesigning its home page. He called Lynne back, and said, "You know, we're both interested, both institutions need to do this, let's work together."
After Lynne contacted the UofA people for permission to use their forms and questions as a starting point, they began planning.
- They had to decide whether they were testing the entire library site (drilling down through the pages for specific information) or just the front page of the site.
- They had to develop the questions, and make sure that the questions themselves weren't leading the student in one direction or another.
- They had to set up a testing protocol (more on that later).
- They had to work with their Research Office for the appropriate permissions and requirements for dealing with human subjects.
Michael, Kathy, and Janet all have copies of the handouts from the presentation (printouts of the library home pages, sample data collection sheets, an overview of the results, and a suggested bibliography).
To keep things simple for the first run, they decided to work with very small, homogeneous populations, and to have the students test sites other than their home university's. In all, they had 32 students:
- 16 at ASU (8 tested the GSU site and 8 tested the UofA site)
- 16 at GSU (8 tested the ASU site and 8 tested the UofA site)
All the students were second-semester freshmen, selected from Freshman Seminar and Freshman English classes.
Each student was observed separately. Students were asked to answer a series of eleven questions by clicking links on the library website and working their way down to the specific information source. No information source was more than three clicks from the home page. Students were asked to print the first page of each section they reached, and encouraged to verbalize their thoughts while they were searching.
After completing the eleventh question, students were asked to look back at the home page and say what was most helpful and what was most confusing.
Staff worked in groups of three: one coach and two observers. The coach's role was to hand the student a card with the question printed on it, have the student read the question out loud, and then continue to encourage the student to keep verbalizing. The observers' role was to note which links the student followed, and to write down what the student was saying about why he or she chose each link.
Each observer had a data sheet for each question (11 questions × 2 observers = 22 sheets per student). The data sheet listed the question and the most efficient paths (as predetermined by the researchers) for finding the source of the answer. The observers were asked to list the links selected, in order; to rate the student's level of confidence; and to note both key words from the student's statements and their own observations.
Immediately after the observation session, the researchers met to debrief and summarize; they found that their memories and ability to recreate the session were very high immediately after, and dropped off sharply after even an hour or two. The three of them worked together to complete a summary sheet for each of the eleven questions.
Data were next entered into an MS Access database for analysis.
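Lynne didn't describe the actual Access schema, but the data-sheet fields above suggest what the database probably captured. As a minimal sketch (using SQLite from Python for illustration; every table and column name here is invented, not taken from the study):

```python
import sqlite3

# Hypothetical structure for the observation data described above.
# The real study used MS Access; this schema is an assumption based
# on the data-sheet fields (links taken, confidence rating, notes).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE observations (
        student_id   INTEGER,  -- one of the 32 participants
        question_no  INTEGER,  -- 1 through 11
        observer     TEXT,     -- which of the two observers
        links_taken  TEXT,     -- links selected, in order
        on_best_path INTEGER,  -- 1 if the path matched a predetermined efficient path
        confidence   INTEGER   -- observer's rating of the student's confidence
    )
""")

# Two sample rows, invented purely for illustration.
conn.executemany(
    "INSERT INTO observations VALUES (?, ?, ?, ?, ?, ?)",
    [
        (1, 1, "A", "Catalog > Books", 1, 4),
        (1, 2, "A", "Reference > Indexes", 0, 2),
    ],
)

# The kind of summary such a database makes easy:
# success rate per question across all students.
for row in conn.execute("""
        SELECT question_no, AVG(on_best_path) AS success_rate
        FROM observations
        GROUP BY question_no
        ORDER BY question_no
    """):
    print(row)
```

A structure like this would let the researchers pull per-question success rates or compare confidence ratings across the three sites with simple queries rather than hand tallies.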
Rather than going heavily into the statistics, Lynne offered some general results.
- Students like colors, especially bright, primary colors.
- Students like graphics.
- Students ignore text. If it's more than a couple of words, or if it contains any jargon at all, they won't read it.
- Students take icons literally. If the catalog icon shows a book and a microfiche, they won't look there for videos or journals.
- Students don't understand library jargon. They don't recognize that a "journal" and a "magazine" will be found in the same place. They don't differentiate between "reference" and "reserve". Asked to find out whether they have any overdue books, they don't think to look at "borrower information".
- Students don't generalize easily. They view articles as separate from journals (i.e., asked to find an article on a topic, they skip over "journals" and try to find a link to "articles"). They don't see any difference between journal title and journal content.
- Students neither understand nor appreciate system complexity. They don't distinguish between an index of databases, a database, and a specific item.
- What a student *thinks* and what he or she *does* are rarely related.
Not all students are created equal
- Their visual orientations vary. Many students seemed not to see anything in the upper left of the screen; Lynne hypothesizes that they are conditioned by scroll bars to look to the right.
- Their color perception varies. Lynne mentioned this specifically because every male in her maternal line is color-blind.
- Many students actually used alt-tags.
- Most students do not use drop-down menus. If they can't see it, I gather, they don't look for it.
- Students will have different levels of experience: general, computer use, web searching, and library use.
What students want
- A big white box in the middle of the screen, where they can type in their question.