The Universe at Your Fingertips

The Web for Documents Librarians

Jerry Breeze
Head, Documents Service Center
Lehman Library
Columbia University

Jane Cramer
Documents Librarian
Brooklyn College Library

David Hellman
Documents Librarian
New York University


Copyright 1997, Jerry Breeze, Jane Cramer, and David Hellman. Used with permission.

Abstract

The presenters will offer a tour of both well-established and "beginning" documents web sites to show the variety of approaches and levels of complexity that currently exist. We will discuss the factors that influence the size and complexity of a site: purpose, staffing, equipment, connectivity, and expertise. We will look at some sites that are useful in the operation of a documents collection, both official government sites and sites created by documents librarians themselves in response to a particular need. Finally, we will look at a number of print tools and web sites that serve as resources for building a web site and determining its content.

Note:

The text of this paper should be read in the context of browsing through the accompanying web page: http://www.library.ucsb.edu/universe/breeze2.html

Discussion

The Web is a powerful and versatile resource for documents librarians. It can serve as a source of information for patrons looking for government information, as a means of publicizing and distributing local information and services beyond the walls of the depository library, as a tool for more effective administration of a documents collection, and as a resource for building and maintaining a presence on the web itself. The three librarians involved in this presentation represent three different approaches to using the web in our depository libraries. Many of the differences in our web sites stem from our differing organizational structures.

At Brooklyn College, the documents librarian locates and evaluates web resources, writes the text of the page, including annotations, and hands it over to members of the Publications Committee, who do the actual coding and control the overall scope and style of Brooklyn College web pages. Web development is at a beginning stage -- a single page with links to only a few major sources. Initially, Jane was told that updates would be infrequent, perhaps only once a semester. As it became obvious that more frequent updating was necessary, the policy changed to allow biweekly updating.

At NYU, the librarian also locates and evaluates web items, and sends the information to the library systems office, whose staff are responsible for maintaining the library web presence. The approved NYU icons and logos must appear on web pages, but David is solely responsible for the content and arrangement of the pages. The NYU pages are constantly evolving, with David sending weekly updates to the systems office.

At Columbia, my job was redefined to explicitly include web development. I learned HTML coding as part of an informal group of librarians who felt that this was an important development in library services. We meet on a monthly basis to discuss library web issues and developments. Much of the early development of LibraryWeb at Columbia was a result of the efforts of this informal group. We designed a template which each of the individual library units uses for its initial page, presenting standard information (location, hours, staff, etc.) in the same way. Beyond that, the organization, structure, content, and maintenance of the pages are the responsibility of the individual library unit. I chose to integrate web resources with the traditional print and microform resources, so that students and faculty will know the full range of materials available to them. I update my pages on a daily basis, adding new material and updating links as needed.
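As a rough illustration, a unit page built from a template of this kind might look something like the sketch below. This is not the actual Columbia template; the headings, hours, and link are placeholders, written in the HTML 3.2 markup these pages were checked against, with the standard information at the top and the unit-specific material following.

    <HTML>
    <HEAD>
    <TITLE>Documents Service Center -- Lehman Library</TITLE>
    </HEAD>
    <BODY>
    <H1>Documents Service Center</H1>
    <!-- Standard information, presented the same way on every unit's page -->
    <H2>Location</H2>
    <P>Lehman Library (placeholder address)</P>
    <H2>Hours</H2>
    <P>Monday-Friday, 9:00 a.m.-5:00 p.m. (placeholder hours)</P>
    <H2>Staff</H2>
    <P>Jerry Breeze, Head, Documents Service Center</P>
    <!-- Beyond this point, organization, content, and maintenance are up to the unit -->
    <H2>Selected Resources</H2>
    <UL>
    <LI><A HREF="http://www.example.gov/">A government web resource</A> -- placeholder link and annotation</LI>
    </UL>
    </BODY>
    </HTML>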

Beyond its value to the communities we serve, the web is also a valuable resource for the librarians who administer documents collections. Many of the tools and publications that we once used in paper are now available on the web, giving us better access to them and opening up new ways of using them. The FDLP Administration site now makes nearly all of these administrative tools available. For example, the Item Lister allows us to create and download a current List of Item Selections for our individual libraries, rather than relying on the annual mailing from GPO. I was able to create a list of Superseded CD-ROMs in Columbia's collection by searching, copying, and pasting from the GPO Superseded List.

As handy as these tools are, it's important to note that GPO has been rather late in making them available to the documents community. Some depository libraries didn't wait and offered their own versions of core tools and resources much earlier. Items like the Depository Library Directory, the Inactive and Discontinued Items list, and the PRF are still only available from non-official web sites. Additionally, a host of other useful information can be found on other library sites -- things like GODORT information, CD-ROM guides, and lists of depository titles now available on the Internet.

Most of the publishers who specialize in government information now have web sites, which can include their annotated catalogs and price lists, information about new titles and services, and even online ordering capabilities. On some of my web pages, I have chosen to link directly to publishers' descriptions of their microform sets. And when I want to check for something, it's certainly easier to go to the web site than to dig through catalogs in my vertical file.

There are now many publications dealing with the web, such as Internet World, which are useful for general web development. Two, however, are specifically targeted at government information. The Federal Internet Source, published by the National Journal, comes out in the fall and spring of each year and is arranged into the standard categories for Federal government sources, as well as state and general political sites. Particularly nice features are the reproduction of the top-level page for many sites and the provision of web, gopher, ftp, and telnet addresses. I find it very handy for quick referrals to areas not covered on my own pages -- states other than New York or the judiciary, for example. The Internet Connection, from Bernan, is issued ten times per year. Each short issue contains articles about four to six government sites. For the sites it covers, it provides some of the most detailed descriptions available. I also find it useful for its comparative articles, such as the one in the current April issue, which compares the NTDB on the web with the CD-ROM version. Bernan was kind enough to send me a number of copies of this issue to distribute to you here today.

Finally, there are a multitude of sites that offer tools and tips on creating web pages. The ones included here are a small sample of sites that are either indispensable (the Adobe Acrobat download site, the Beginner's Guide to HTML, Doctor HTML, the Table Sampler) or representative of the kinds of tools available (backgrounds, icons, clip art, etc.).

How did the documents web pages at these three sites come to have their present form? They grew out of a process that began with an assessment of our own capabilities and the audiences we were trying to reach. At NYU, for example, since David is also the bibliographer for Political Science, he has links to guides to the print collections for both U.S. Government Documents and Political Science. Early on in Columbia's web development, I decided not to include the judicial branch, since there is a separate Law School on campus.

We came up with three basic principles to follow for the ongoing care that these web sites require:

  1. Simplicity - Start small and add later, if you find that you have the time and support to do so. Keep the arrangement of your pages simple and logical. Use graphics sparingly, keeping in mind that many people will be connecting to your pages via modems, not T1 lines. As much as I admire the content of the University of Michigan's Documents Center pages, the site can be frustrating to use.

  2. Borrow from others - The web is a collaborative effort, so take advantage of it. Why re-create a page when a link will do? Sometimes the information needs to be customized for your local audience, but often it does not. Someone else's page on a topic can also serve as a valuable supplement to yours, especially if they have the resources to update and maintain it more often than you can. Luckily, we deal with information that is largely not copyrighted; I copy source code and graphics from government web servers all the time. (A brief illustration of linking rather than re-creating follows this list.)

  3. Evolve - There is no such thing as a finished web page. Once your web site has grown beyond a few pages, you may need to think anew about its arrangement and scope. Things may have to be moved around. I have cut and pasted whole sections between pages in an effort to make my site more accessible. You have to realistically decide how much you can maintain. In some ways, the initial creation is the easy (and fun) part. But web pages are like infants which never grow up -- they have to be constantly watched, nourished, and cleaned up after. You can rely on programs like Doctor HTML or Linkbot to help you out only so far. There is still no substitute for having a human check those links. If you have other staff who can help, great. If not, realize when you add each link that you're going to have to verify that link again soon. So, is it worth adding?
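To illustrate the second principle, the fragment below points to a page maintained elsewhere and adds only a brief local note, rather than duplicating its content. The URL, title, and wording are placeholders, not an actual link from any of our pages.

    <!-- Link out instead of re-creating; annotate for the local audience -->
    <UL>
    <LI><A HREF="http://www.example.edu/docs/states.html">State Government Information on the Internet</A>
    -- maintained and updated elsewhere more often than we could manage locally;
    useful for states other than New York.</LI>
    </UL>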

