It’s All About User Services: A Summary and Commentary on the LITA Top Technology Trends meeting

What follows is a summary and commentary on the LITA Top Technology Trends meeting at the ALA Annual Conference in New Orleans on 25-Jun-2006. What I've tried to do is collate comments from the panel members and add my own commentary (marked off as such from the rest of the summary) where I thought I had something useful to add. It is my hope that this summary is a faithful representation of the statements made by the participants in the panel. If not, please let me know privately or in the comment area here and I will make the appropriate corrections in the body of the blog post.

Please note that this is not intended to be a complete summary of the comments of the panelists; in some cases I forgot to write things down, and in other cases what was said didn't fit neatly into this collated set of topics. For a more complete accounting of the topics, please see Karen Coombs' and Michelle Boule's LITA Top Technology Trends postings.

Evolution and Interim Solutions

As a profession, we need to establish a collective mindset that "everything we do is an interim solution." When our perspective is that of managing interim solutions, we begin to describe our activities with the language and context of interim solutions -- that this is work not completed. For instance, faceted browsing is not the solution. Its adoption is part of an iterative process. (Tom Wilson) There is lots of experimentation in the arena of "next generation" OPACs and interfaces -- navigation, scope of OPAC content, consolidation of purchased and subscribed content -- and none of them are "good enough" to be long-term solutions. (Marshall Breeding) And if our gaze rests solely on the OPAC, we are in trouble. The catalog is but one source of information about library content, yet it receives the lion's share of attention and effort. (Roy Tennant)

Approaching change with the mindset of managing interim solutions will encourage flexibility and more experimentation. I agree with these statements, and I think we need to be prepared to move a little more nimbly in the coming years. Perhaps not at "internet speed" -- we have firm roots in sound practices -- but certainly no longer at, say, "committee speed."

Research in information retrieval is now being done by those who are not in our profession. We are no longer the landlords of the information space that we were before (but perhaps we can reclaim some of it through a lease-back arrangement). (Andrew Pace)

My recent trip to JCDL brought this home. The "joint" of JCDL is, by the way, the Association for Computing Machinery (ACM) and the IEEE Computer Society -- nary a "library organization" in sight for this "digital library" conference. There are top-notch information retrieval experiments and practices being explored there...stuff that we could apply or consider applying to our own systems.

There has been lots of consolidation in the business side of the library automation industry, but it is still fragmented, and more consolidation will be coming. The future will have fewer companies and probably fewer -- and hopefully better -- products. The large automation companies are outsourcing development and integration of some modules, particularly for ERM functions (to companies such as Serials Solutions and TDnet). (Marshall Breeding)

There has also been the rise of "managed" open source. Some open source software has an audience that is wide enough to be community-maintained (the Apache web server, for instance). For applications of more limited interest, companies are making it their business to provide support for open source software (IndexData and others). (Karen Schneider) One example in particular: the field of institutional repositories was initially open source, but this capability is now being marketed as a complementary part of an ILS. This puts institutional repository capabilities into the hands of more institutions. (Clifford Lynch)

Focus on the Service Aspects

The results of the mass-digitization efforts will change library operations. What is the role of the library if everyone has content on their gizmo? Our role must be to provide services on that content. (Eric Morgan)

Faceted Browsing

NCSU's Endeca-enabled catalog is part of a long-term strategy for improving access to the items in the catalog; it is not the end. (Andrew Pace) Any decent search engine in 2006 will have this capability. Faceted navigation does a good job of marrying the search and browse modalities. (Karen Schneider)
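
To make that marriage of the search and browse modalities concrete, here is a minimal sketch in Python. The catalog records and facet fields are invented for illustration; this is not how NCSU's Endeca-based catalog works under the hood, just the general shape of the idea.

```python
# Hypothetical sketch of faceted navigation over a handful of invented
# catalog records; a real system (Endeca, for example) indexes facets at scale.
from collections import Counter, defaultdict

records = [
    {"title": "Ambient Findability", "format": "Book",    "subject": "Information retrieval", "year": 2005},
    {"title": "Digital Libraries",   "format": "Book",    "subject": "Digital libraries",     "year": 2004},
    {"title": "D-Lib Magazine",      "format": "Journal", "subject": "Digital libraries",     "year": 2006},
]

def search(keyword):
    """Search modality: match a keyword anywhere in a record."""
    keyword = keyword.lower()
    return [r for r in records if any(keyword in str(v).lower() for v in r.values())]

def facet_counts(results, fields=("format", "subject", "year")):
    """Count the distinct values of each facet field across a result set."""
    facets = defaultdict(Counter)
    for record in results:
        for f in fields:
            facets[f][record[f]] += 1
    return dict(facets)

def narrow(results, field, value):
    """Browse modality: keep only the records matching a chosen facet value."""
    return [r for r in results if r[field] == value]

hits = search("digital")
print(facet_counts(hits))                 # shows what the user can narrow by
print(narrow(hits, "format", "Journal"))  # the user clicks the "Journal" facet
```

The point of the sketch is simply that a search produces a result set, the facet counts tell the user what is in that set, and each click narrows the set without requiring the user to reformulate the query.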

Findability as a Service

The recent focus on "findability" is very healthy, and the dissatisfaction with the library catalog is part of a reorientation to better serve the user. (Karen Schneider) Software for faceted browsing and personalization has reached commodity status. Automatic classification, subject assignment, and natural language processing are the first part of the last mile. (Andrew Pace)
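
For readers wondering what "automatic subject assignment" might look like at its very simplest, here is a hedged Python sketch. The subject headings and trigger terms are invented, and real systems use trained classifiers and controlled vocabularies rather than a keyword lookup like this.

```python
# Hypothetical sketch of rule-based automatic subject assignment.
# The headings and trigger terms below are invented for illustration only.
SUBJECT_RULES = {
    "Information retrieval": ["search", "retrieval", "relevance", "query"],
    "Library automation":    ["opac", "catalog", "ils", "circulation"],
    "Digital preservation":  ["preservation", "migration", "archiving"],
}

def assign_subjects(text):
    """Return every subject heading whose trigger terms appear in the text."""
    words = text.lower()
    return [subject for subject, terms in SUBJECT_RULES.items()
            if any(term in words for term in terms)]

print(assign_subjects("Improving relevance ranking and query expansion in the OPAC"))
# ['Information retrieval', 'Library automation']
```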

Along with this focus on findability has to come the realization that the OPAC is not the center of the library universe; other services are of equal importance to the users. Users with full-text expectations are coming to our metadata universe. And for them, tutorials, screen captures, and desktop movies are not going to cut it. (Karen Schneider)

Karen has a new twist on the "librarians like to search, users like to find" axiom. Our users now expect content -- not pointers to the content -- to be at the end of their finding process. And increasingly they are familiar with finding modalities coming out of the web-as-a-whole and will not sit through a bibliographic instruction session or watch a "screencast" (a movie of a desktop capture) to learn how to use a new service. Think Jakob Nielsen here -- if everyone is doing it a certain way, you probably should, too, regardless of the fact that you might know a better way to do it.

At the same time we see the breakdown of barriers to publication -- with blogging and wiki software, anyone can be a publisher. Along with the benefit of everyone being able to publish comes the detriment of everyone being a publisher. Should the library offer a filtering and selection service for this content? (Roy Tennant)

I can't remember the exact context of Roy's mention of this, but it seems that we should be offering this kind of sorting and filtering service for the "unwashed" blog and wiki content so it is inter-filed with selected and vetted content from our commercially-produced collections. That would be a service to our users, I think.

Actionability as a Service

Dissatisfaction with the OPAC is not just about its use as a findability tool. It is not just about getting the thing; users want to execute services against the thing: talk about it, find others who read it and what they read, create quick bibliographies, and discuss the work with the author. (Eric Morgan)

Eric was talking fast, so not all of the actions he mentioned are listed above. I can't remember if one of the actions he listed was "excerpt". Technology has made it easy to excerpt and recombine pieces of content into a new work, and users have taken advantage of this capability -- they call it a "mash-up." As we put content online we need to bring along the enabling technologies that allow it to be excerpted -- when appropriate -- and to track back the provenance of that excerpted content.
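
As a purely hypothetical sketch of what carrying provenance along with an excerpt could look like as a data structure, here is a small Python example. The field names and the placeholder identifier are my own inventions, not anything the panelists proposed.

```python
# Hypothetical sketch: an excerpt that carries the provenance of its source
# along with it, so a mash-up can be traced back to the original work.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Excerpt:
    text: str                # the excerpted passage itself
    source_title: str        # the work the passage was taken from
    source_identifier: str   # e.g. a URL or other persistent identifier
    location: str            # where in the source the passage appears
    excerpted_on: date = field(default_factory=date.today)

# Illustrative values only; the identifier is a placeholder, not a real URL.
excerpt = Excerpt(
    text="The catalog is but one source of information about library content...",
    source_title="It's All About User Services",
    source_identifier="http://example.org/blog/lita-top-tech-trends",
    location="paragraph 5",
)
print(excerpt)
```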

Publishing Platform as a Service

Voice-over-IP (VoIP) makes it easier for users to communicate all over the world. (Eric Morgan) Ubiquitous and constant communication means that those with arcane interests can find each other on the network and create a small community. Sometimes these small communities create artifacts (for example, the code4lib conference). How can libraries serve these microcommunities well? Since we operate inside geographic boundaries and these communities don't, how do we serve them? (Roy Tennant)

Web pages created in the form of blogs and wikis are becoming the norm rather than the exception. How do we think about these sorts of things? Content is married to the software and underlying database, and it will be difficult to migrate these things forward. (Eric Morgan) The rise of community sites (collectively, systems like Flickr, MySpace, GMail, etc.) increases the confusion between services for sharing versus services for preservation. "Over the next few months this lesson will be driven home." (Clifford Lynch)

This is a concern of mine as well, particularly with the conflicting value systems of the entities in question. Corporate bodies, accountable to venture capital firms or shareholders, will sustain a service only as long as the business model is profitable. When it is no longer profitable, what happens to the content on those systems? These corporate bodies also seem to be relying -- again -- on revenue from advertising to sustain their activities. Despite the success of Google in reviving this method of moving money around cyberspace, do we really think that advertising-supported sites will continue indefinitely?

And everyone is publishing, with resulting decreases in this thing we call "privacy." Teenagers are now being counseled that what they put in Facebook will follow them for the rest of their lives. (Karen Schneider)

If libraries do serve these communities, what is our responsibility to inform users of the risk to their privacy and/or take proactive steps to protect their privacy? This question goes beyond, of course, statutory requirements in the United States and other countries regarding the solicitation and possession of information about minors.

Granting agencies and university administrators are coming to understand the importance of long-term data management and curation. The National Science Foundation's Office of Cyberinfrastructure will be putting out guidelines on this soon, and it will be something to watch for. (Clifford Lynch)

Network Services

In May, Internet2 announced the blueprint and initial capabilities for its next generation network. Initially the core links will have 80 gigabits per second (Gb/s) of bandwidth; the technology being employed is extensible to 800 Gb/s. Over the course of the next 18 months, it will replace the existing Abilene network. And this network will work differently from networks as we know them now: it will be a mixed optical/IP network, meaning that dedicated point-to-point links can be provisioned across the fiber for very high-speed transmissions. With network capacities increasing at this rate, it is possible to rethink how one uses the network. Distributed storage, or "grid storage," is now reasonably possible, for instance. (Clifford Lynch)