Internet-oriented courses and teaching materials tend to focus on technology. A typical approach covers the basic services (terminal emulation, file transfer, gopher, etc.) and how to use the programs that support them. The development of integrated browsers, which adapt to the different protocols of the Internet without user knowledge or intervention, has made this technological focus irrelevant for most students. As the Internet has become less of a challenge for computer science, it has become much more of a challenge for information science. Students now need to learn how to formulate strategies for finding information on the Internet and how to evaluate that information, rather than concentrating on the mechanics of Internet tools. This shifts the focus of Internet education from computer literacy programs to library skills programs.
The computing manager covered Internet history and structure, connecting, FTP, archie, telnet, e-mail and mailing lists. The librarian covered keyword searching, Boolean operators and search strategies by demonstrating OPACs, periodical indexes and gopher/veronica. For-hire databases and services such as Lexis/Nexis and CompuServe illustrated the tools used by journalists. Classes met for one hour, once per week. Computer lab facilities were minimal at the time, so any hands-on experience was organized around computers in library offices. With limited time and few opportunities for hands-on lab sessions, only the basics could be covered. There was time only for demonstrating the mechanics of Internet tools, and minimal class time was devoted to typical Internet issues such as ethical use, netiquette, access, and the quality and evaluation of information. Although the majority of students had used e-mail, if only sporadically, few had used other Internet tools; hence the emphasis on how-to. There were weekly assignments, some of which were e-mailed to the instructors, others printed. A scavenger hunt served as the final exam.
After the initial offering of LS389, class time was expanded to three hours, once per week. The additional class time enabled the instructors to give more detailed demonstrations and explanations, and provided opportunities for lab sessions and more class discussion. A class mailing list was created for announcements, student questions and additional material from the instructors. The majority of assignments were designed to be e-mailed.
The course retained its basic structure through 1994. The world wide web was included at the end of the syllabus as an interesting new topic. Two major changes occurred during 1995: world wide web graphical browsers and search engines became popular, and Rasmuson Library merged with Computing and Communications (C&C), the University of Alaska Fairbanks computing unit. While the LS389 instructors had begun covering Lynx, the web had received relatively little attention in the class. By the Spring 1995 semester (my first for LS389), the web was one of the last topics covered in the syllabus; a sort of dessert at the end of the main meal. One class covered Lynx operations, another Mosaic with an explanation of hypertext. Since none of the students at the time had graphical web browsers on their own computers and Mosaic was available on a limited basis in University computer labs, web assignments were completed using Lynx. C&C also began to offer a series of one-hour sessions on computing topics, including the Internet. These presentations included the standard Internet tools (e-mail, FTP, telnet, etc.), UNIX and connecting to the University computing system. The librarians also presented sessions in this series on the databases available through the Library's LAN. LS389 students were encouraged to attend these presentations.
The versatility of Netscape and its installation as the "standard" browser in UAF's computer labs have recently allowed us to de-emphasize the mechanics of other Internet tools. In earlier versions of LS389, demos of FTP took a minimum of two hours, with many questions and handouts detailing steps and commands. Using Netscape, the same demo took ten minutes. To compare and contrast the two approaches, I retrieved a file using the command-driven approach, then retrieved the same file via Netscape. Suddenly, students could "see" what was happening and the process became clearer. They completed the FTP assignment with less confusion than usual. Students now question the relevance of running programs the "old" command-driven way and want to know why particular tools such as telnet and gopher are still useful. Why use telnet and gopher when you can use the web? This leads to discussions and evaluations of tools and, eventually, the realization that the web isn't always the answer to information needs.
Web indexes and search engines have allowed us to concentrate on information-seeking skills: the coverage of indexes and the types of materials retrieved; keyword searching and Boolean operators; and, most importantly, the content of the materials retrieved. The course has become more subject-oriented. We cover topical areas such as federal, state and local government information, business and advertising on the Internet, ethical use and privacy. We haven't abandoned gopher, telnet, and similar tools, but they no longer dominate the syllabus.
We continue to assign weekly exercises using whatever tools and information sources have been covered up to that point. The assignments blend "scavenger hunt" questions with evaluations and comparisons of information sources and tools. We no longer give a final exam, but we do require students to write two papers on a choice of Internet-related topics (copyright, access, privacy, business and advertising, censorship) and to contribute to weekly class discussion of those topics. We now introduce the web, indexes and search engines early in the course so that students can use the web as a resource for their papers.
Several options are under consideration. We could include more Internet resources in LS101, our keystone course. Currently, we include e-mail and SLED (Statewide Library Electronic Doorway), a web gateway to Internet resources established by Rasmuson Library and the Alaska State Library. SLED is an easy-to-use entry to the web which has been widely promoted throughout the state and is often already familiar to students in both LS389 and LS101. Using LS101 as a vehicle ensures more student exposure to the Internet: each Fall and Spring semester, an average of 300 students enroll in LS101, as opposed to the 35-40 students in LS389. LS101 students could also be required to attend one or more of Rasmuson's one-hour presentations on computing topics. This takes advantage of computing staff expertise and allows librarians to concentrate on information content and search strategies (Page and Kesselman 1994). The Library is also developing a web page, with the option of including links to information once covered in LS389.
Regardless of what we do, there will be less emphasis on a class devoted exclusively to the Internet and more integration of Internet tools in the library's teaching efforts. The web and the tools developing in response to it will allow us to concentrate on the "why" rather than the "how".
Statewide Library Electronic Doorway. http://sled.alaska.edu