This document summarizes a presentation given by Judy McNally and Doreen Herold of Lehigh University about the challenges facing their technical services department and how they are adapting workflows to address changing trends. Key challenges include acquiring fewer print materials, an explosion of digital resources, reduced budgets, and changing staff roles. The department is shifting from print to electronic serials, outsourcing more work, and cross-training staff. Staff are taking on new roles like resolving access issues for electronic journals and doing more batch cataloging of materials like ETDs and SpringerLink titles. The department is also exploring new cataloging solutions like OLE.
Dirk Lewandowski gave a presentation on how to rank library materials in an online public access catalog (OPAC). He discussed the importance of relevance ranking to improve search results. Some key factors for ranking include text matching, popularity, freshness, and locality. Results lists should include a mixture of item types from different collections to satisfy user needs. Relevance ranking is complex and requires considering multiple factors and data sources. The core function of an OPAC remains search, but other features can enhance the user experience.
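The factors above suggest a weighted combination. A minimal sketch of how such a score might be computed, where the `Record` fields, weights, and normalizations are invented for illustration (the presentation names the factors but does not prescribe a formula):

```python
from dataclasses import dataclass

# Toy relevance score over the four factors Lewandowski names:
# text matching, popularity, freshness, and locality.
@dataclass
class Record:
    title: str
    text_score: float    # query/text match, already normalized to 0..1
    circulations: int    # popularity proxy
    pub_year: int        # freshness proxy
    held_locally: bool   # locality: is the item held at this branch?

def relevance(r: Record, now_year: int = 2014,
              weights=(0.5, 0.2, 0.2, 0.1)) -> float:
    w_text, w_pop, w_fresh, w_local = weights
    pop = min(r.circulations / 100, 1.0)                # cap popularity at 100 loans
    fresh = max(0.0, 1 - (now_year - r.pub_year) / 50)  # linear decay over 50 years
    local = 1.0 if r.held_locally else 0.0
    return w_text * r.text_score + w_pop * pop + w_fresh * fresh + w_local * local

records = [
    Record("Older, well-matching, held locally", 0.9, 10, 2012, True),
    Record("Newer, popular, held elsewhere", 0.7, 300, 2013, False),
]
ranked = sorted(records, key=relevance, reverse=True)
```

Tuning the weights is exactly where the complexity lies: different user populations and collections call for different trade-offs among the factors.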
Data mining OCLC for translations.
Creating authority records for VIAF.
Remodelling the bibliographic structure to make the best multi-lingual displays from all available data in a work set.
Feb 19, 2014: NISO Virtual Conference: The Semantic Web Coming of Age: Technologies and Implementations
Deck includes presentations from:
Ramanathan V. Guha, Google Fellow; Founder of Schema.org; Pierre-Paul Lemyre, Director of Business Development, Lexum; Bob Du Charme, Director of Digital Media Solutions, TopQuadrant
This document discusses how libraries can leverage data from their collections to support new research and discovery. It outlines several initiatives that treat library collections as data, including the Library of Congress labs and a project exploring computationally-driven research. The document also discusses OCLC's work analyzing institutional repository data through its Repository Analytics and Metrics Portal (RAMP) and making data more interoperable through support of the IIIF standard for sharing images and metadata.
Charting Communication: Assessment and Visualization Tools for Mapping the Co...
The document summarizes a study conducted by Becky Skeen, Liz Woolcott, and Andrea Payant at Utah State University on assessing communication patterns within their cataloging and metadata services department. They used interaction logs filled out by staff weekly and an anonymous survey distributed to other library departments. The study found lower than expected interaction with other technical services units and higher interaction with special collections. It also contradicted stereotypes of catalogers being withdrawn by finding most interactions were social. The data analysis tools used included Excel, Qualtrics, Tableau and OpenRefine. Conducting this assessment on a regular basis and expanding the research was recommended to provide more useful insights into communication over time.
The Future of the Integrated Library System
The traditional ILS as we know it will not disappear; it will die out only in the sense that it will evolve. Libraries need automation and resource management now more than ever. But our collections are becoming ever more heavily electronic, and we need a system that handles digital content efficiently. The current ILS does not.
Current ILSs are built around the traditional library practice of print collections and the services designed around those collections, but the last ten to fifteen years have seen great shifts in both library collections and services. Print and physical materials are no longer the dominant resources. In many libraries, especially academic and research libraries, the building of electronic and digital collections has taken a larger role in collection development.
As libraries have moved increasingly to accommodate digital collections, they have found ILS products unable to be reconfigured well enough to smoothly and efficiently integrate the distinct yet necessary workflows for both print and digital materials.
The current ILS serves its purpose for an academic library, but instead of one system with seamless interaction we have one system with add-on components for functions that are now necessary, such as electronic resource management and the discovery layer.
There are three trends that will lead to the change in the traditional ILS: “1. Increased digital collections; 2. Changed expectations regarding interfaces; 3. Shifted attitudes toward data and software.”
There are four distinguishing characteristics of the next-generation ILS we believe are critical. They are comprehensive library resources management; a system based on service-oriented architecture; the ability to meet the challenge of new library workflow; and a next-generation discovery layer.
Up until recently, libraries developed collections to serve the communities that they were located in. And that's going to shift because the collections that they create will define the communities they serve, which is the exact opposite of the way it used to be in the physical world. In the electronic world it will be completely opposite. (VINOD CHACHRA, VTLS)
Our collections are now booming with digital content, yet we have a very inept way of serving it. The traditional ILS was not created to handle digital content. The new ILS will serve as a library services platform where digital content is a forethought instead of an afterthought.
Breeding writes that “the next generation of library automation systems needs to be designed to match the workflows of today’s libraries, which manage both digital and print resources.”
November 19, 2014 NISO Virtual Conference: Can't We All Work Together?: Inter...
Leveraging Wikipedia as a Hub for Data Integration: the Remixing Archival Metadata Project (RAMP)
Timothy A. Thompson, Metadata Librarian (Spanish/Portuguese Specialty), Princeton University Library
NISO Two Day Virtual Conference:
Using the Web as an E-Content Distribution Platform:
Challenges and Opportunities
Oct 21-22, 2014
R. David Lankes, Dean’s Scholar for the New Librarianship at Syracuse University’s School of Information Studies; Director of the Information Institute of Syracuse
Transforming University Research - Mar 2006
The document discusses Scholars Portal, a consortium of Ontario university libraries that provides access to digital scholarly resources and services. It aims to create a single point of access for integrated searching, as well as long-term archiving of content. Services described include article searching, access to ejournals and databases, interlibrary loans, and a digital repository. Future plans include expanding content and developing a shared infrastructure to ensure sustainability. The goal is to transform research, teaching and learning through a centralized portal for high-quality scholarly materials.
Connecting the Dots: Constellations in the Linked Data Universe
The universe of linked data is rapidly expanding and our community is finding innovative ways to link and apply data. This session will cover several initiatives and projects using linked data to improve discovery and reuse of information.
Speakers: Richard Wallis, Technology Evangelist, OCLC; Tom Johnson, Digital Applications Librarian, Oregon State University
November 19, 2014 NISO Virtual Conference: Can't We All Work Together?: Inter...
Learning the Lingo: Building Foundations for Successful Partnerships and Collaborations upon which Successful Systems Integrations can be Built
Carl Grant, Associate Dean, Knowledge Services & Chief Technology Officer, University of Oklahoma
This presentation was delivered by Rebekah Cummings of the University of Utah during a NISO Virtual Conference on the topic of data curation, held on Wednesday, August 31, 2016
The document discusses the future of the integrated library system (ILS). It predicts that if the ILS remains constrained by its current design, it has no future. However, if the ILS is freed from these constraints to evolve and integrate outside data sources, it may survive. Currently, ILSs are good for books but not other content like journals or digital materials. The document advocates for setting library data free by presenting it in multiple ways and sharing it flexibly. It also calls for more modern and customizable user interfaces and improved search capabilities. The real competition for ILS vendors are content management systems that can manage a library's entire web presence. The future may see ILSs evolving into CMSs or libraries adopting other systems
Come to the Fiesta! Join the OLE Project
Led by Duke University, the OLE Project intends to build a design document for an open source library management system which will be based on the software design philosophy of service-oriented architecture (SOA). SOA is becoming a dominant trend in technology, as early adopters have shown that it provides the benefit of an agile system, one that is flexible in response to information demands. Lehigh’s Doreen Herold and Tim McGeary will present the status of the OLE Project, its process, its goals, and how other PALINET members can participate.
The Missing Link: Metadata Conversion Workflows for Everyone
This document describes workflows developed by Utah State University and the University of Nevada, Las Vegas to streamline metadata creation between special collections and digital initiatives departments. The workflows allow for converting finding aid information into Dublin Core for uploading item records to a digital repository, and batch linking digitized content to finding aids. The processes are designed to be taught easily and performed by various staff levels to automate metadata work and make it more flexible.
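One step in such a conversion workflow, crosswalking a few finding-aid elements into Dublin Core, might be sketched as follows. The element names, the crosswalk, and the sample EAD fragment are illustrative assumptions (real EAD uses namespaces and richer structure), not the actual USU or UNLV mapping:

```python
import xml.etree.ElementTree as ET

# Simplified EAD finding-aid fragment; real finding aids are namespaced
# and far more detailed.
EAD = """<ead>
  <archdesc>
    <did>
      <unittitle>Jensen Family Papers</unittitle>
      <unitdate>1890-1940</unitdate>
      <origination>Jensen, Lars</origination>
    </did>
  </archdesc>
</ead>"""

# Crosswalk: EAD element name -> Dublin Core term (illustrative choices)
CROSSWALK = {"unittitle": "dc:title", "unitdate": "dc:date", "origination": "dc:creator"}

def ead_to_dc(ead_xml: str) -> dict:
    """Pull crosswalked fields out of a finding aid into a flat DC record."""
    root = ET.fromstring(ead_xml)
    record = {}
    for ead_el, dc_term in CROSSWALK.items():
        el = root.find(f".//{ead_el}")
        if el is not None and el.text:
            record[dc_term] = el.text.strip()
    return record

print(ead_to_dc(EAD))
# {'dc:title': 'Jensen Family Papers', 'dc:date': '1890-1940', 'dc:creator': 'Jensen, Lars'}
```

The appeal of the workflow described above is precisely that, once a crosswalk like this is agreed on, the conversion can be run by staff at many levels rather than hand-keyed record by record.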
Presented at ALAO, October 29, 2010.
Program Description: Separation, whether physical or by principle, from the public side of the library can prove challenging for Technical Services departments, complicating communication and interaction between the department and the rest of the library staff and patrons. How can Technical Services departments overcome these disconnects? This presentation will survey the online efforts of several TS departments to facilitate communication with other library departments and beyond, including Miami University’s Technical Services LibGuides page.
Design for Marketing Technical Services
Design for marketing technical businesses is complicated: the product is complex, yet consumers are ill-suited to evaluate it. Simplicity and good design are therefore essential to successfully marketing technical products and services.
Measuring the Wind: Metrics in Technical Services
The document discusses the importance of metrics in technical services departments in libraries. It begins by outlining different types of metrics that can be used to measure individual, departmental, library, and university performance. These include quantitative metrics like number of records processed, as well as qualitative metrics like accuracy rates. The document then discusses how metrics can be used for training, performance evaluations, benchmarking, planning, and assessing how well the library is meeting its mission and supporting the university's goals. Overall, the document advocates for the implementation of comprehensive metrics programs in technical services to facilitate accountability, continuous improvement, and effective communication of the department's value and contributions.
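A toy example of computing one quantitative metric (records processed) and one qualitative metric (accuracy rate) from a staff log; the field names and figures here are invented for illustration:

```python
# Hypothetical weekly production log for a technical services unit.
monthly_log = [
    {"staff": "A", "records_processed": 420, "errors_found": 6},
    {"staff": "B", "records_processed": 390, "errors_found": 2},
]

def department_metrics(log):
    """Roll individual figures up into department-level metrics."""
    total = sum(row["records_processed"] for row in log)
    errors = sum(row["errors_found"] for row in log)
    return {
        "records_processed": total,                     # quantitative: throughput
        "accuracy_rate": round(1 - errors / total, 4),  # qualitative: error-free share
    }

print(department_metrics(monthly_log))
# {'records_processed': 810, 'accuracy_rate': 0.9901}
```

The same roll-up pattern extends upward: departmental totals feed library-level figures, which in turn feed the university-level assessment the document describes.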
The document discusses key performance indicators (KPIs) for technical support engineers. It provides examples of KPIs, performance appraisal methods, and steps to create KPIs for this position. Mistakes to avoid in developing technical support engineer KPIs are also outlined, such as creating too many KPIs or not linking them to overall strategies. The document recommends visiting an external website for additional KPI samples and materials.
This document contains information about performance evaluation forms and methods for evaluating technical support engineers. It includes a sample performance evaluation form with sections for reviewing job performance factors, employee strengths and accomplishments, performance areas needing improvement, and signatures. It also provides examples of performance review phrases for evaluating various skills and examples of the top 12 methods for performance appraisal, such as management by objectives, critical incident method, behaviorally anchored rating scales, and 360 degree feedback. The document aims to provide useful resources for conducting thorough performance evaluations of technical support engineers.
A Journey from a Bad KPI to an Excellent Strategy
When I face a business challenge, I try to see the root of the problem. A quick-fix solution is fine, but what if a quick fix is not possible unless fundamental issues are resolved? I see this often with KPIs. People ask for help with KPIs for this or that, but the problem they actually experience is a fundamental one: they have a vague strategy that is hard to follow.
Recently I had a conversation with Pablo, one of our Spain-based customers. His company is a leading national manufacturer and his question was about a KPI to help with the poor performance of their business. Our dialog was really insightful for both of us. Pablo sorted out things about strategy and KPIs; I was able to trace verbally the problem of a bad KPI back to its root - a poorly formulated strategy.
The result is not an article, but a dialog between me and Pablo. Together we completed a journey from a pointless KPI request to ideas about formulating a better strategy:
http://www.bscdesigner.com/a-journey-from-a-bad-kpi-to-an-excellent-strategy.htm
Making the Big Move: Moving to Cloud-Based OCLC’s WorldShare Management Servi...
The library migrated from their previous integrated library system to OCLC's WorldShare Management Services over a 6 month period. They moved their search, circulation, and catalog infrastructure to the new cloud-based system. The migration process involved preparing data for transfer and working through various technical issues. The library has made changes to workflows for acquisitions, technical services, and electronic resources management as a result of the new system. They have also provided feedback to OCLC on ways the system and services could be improved.
Digital Library User Experience: SCONUL Conference
Ben Showers
Libraries now require a broader range of skills including design, data literacy, and web development rather than specialization, as library systems move to the web; this challenges libraries to attract and manage staff with these new skills, such as through employing students.
The webinar discussed social reading experiences of sharing bookmarks and annotations in e-books. It covered three main presentations:
1. Todd Carpenter discussed a NISO working group on digital annotation requirements like specifying how annotations are rendered and challenges with annotating text.
2. Rob Sanderson presented on the W3C Open Annotation model for annotating web resources in RDF. It defines a basic model of annotations as comments linked to targets.
3. Dan Whaley discussed building an annotation platform that supports peer review of annotations to build reputation and scale to large numbers of users and annotations.
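The core shape of the Open Annotation model from item 2, an annotation linking a body (the comment) to a target (the annotated resource), can be sketched in a few lines. The URIs are invented examples, and triples are shown as plain tuples rather than through an RDF library:

```python
# Open Annotation core model: an oa:Annotation whose oa:hasBody points at
# the comment and whose oa:hasTarget points at the resource being annotated.
OA = "http://www.w3.org/ns/oa#"
RDF_TYPE = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"

def make_annotation(anno_uri: str, body_uri: str, target_uri: str):
    """Return the three core triples for one annotation."""
    return [
        (anno_uri, RDF_TYPE, OA + "Annotation"),
        (anno_uri, OA + "hasBody", body_uri),      # the comment itself
        (anno_uri, OA + "hasTarget", target_uri),  # what is being annotated
    ]

triples = make_annotation(
    "http://example.org/anno1",
    "http://example.org/comment1",
    "http://example.org/ebook/chapter3",
)
```

Because body and target are both just resources, the same pattern covers bookmarks (an annotation with no body), shared highlights, and the e-book annotation scenarios the webinar discusses.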
BIBFLOW and the Libhub Initiative: Leveraging our past to define our future
Eric Miller, President, Zepheira
Jeff Penka, Director of Channel and Product Development, Zepheira
Innovation and Research (Digital Library Information Access)
Libcorpio
Innovation and research, Digital Library Information Access, LIS Education, Library and Information Science, LIS Studies, Information Management, Education and Learning, Library science, Information science, Digital Libraries, Research on Digital Libraries, DL, Innovation in libraries and publishing, Areas of Research for DL, Information Discovery, Collection Management and Preservation, Interoperability, Economic, Social and Legal Issues, Core Topics In Digital Libraries, DL Research Around The World
The document summarizes a presentation about reinventing cataloging models for libraries in light of adopting new standards like BIBFRAME. It discusses challenges with current workflows that inhibit changes. The project aims to research how libraries can adapt practices and relationships to support evolving standards by testing conversion of data and prototype systems. The goals are to understand challenges and opportunities to develop a roadmap for planning investments and changes.
Web-Scale Discovery: Post Implementation
Rachel Vacek
Discovery services provide users a single search box to access a library’s entire pre-indexed collection. Representatives from two academic libraries serving different user populations will discuss marketing, instructing users, evaluating the product, and maintaining the resource after a discovery service is implemented.
The document discusses the challenges of cataloging and metadata today and in the future, including changes in technology, user behavior, and the types of information objects that need to be described. It provides biographical information about Hendro Wicaksono and his experience working in libraries and developing cataloging systems. The document touches on the evolving nature of libraries, catalogs, metadata standards, and the tasks and skills needed for cataloging in the digital age.
The HIKE project aimed to evaluate integrating data between the KB+ knowledge base and local systems, and evaluating Intota as a potential replacement for the traditional library management system. It mapped existing electronic resources workflows and found inefficiencies in dealing with different formats. It tested KB+ and compared it to 360 Resource Manager, finding KB+ better for managing deals. The project recommended next steps including adopting Summon, embedding KB+ and 360 in workflows, and forming an Intota working group.
The role of catalogers is expanding in the digital age. Catalogers need to build partnerships with vendors to leverage outside expertise and provide access to vast online resources that are too large for any one institution to catalog alone. Catalogers also need to retool by utilizing knowledge bases, batch loads, OpenURL, and other standards to continue enabling discovery of library resources. By building on past successes and embracing new technologies, catalogers can adapt to changing needs and ensure patrons can find and access information.
Transparent Licenses: Making user rights clear (OLA Super Conference 2015) by Hong (Jenny) Jing
Recent changes to Canada’s Copyright Act have propelled copyright and licensed use into the spotlight at colleges and universities in Canada. This session will look at Queen’s and University of Toronto libraries’ experience implementing a licensing permissions workflow using the OCUL Usage Rights database (OUR). The systems covered will be 360 Link, Summon, Voyager OPAC, and Endeca. We will explain how to implement the license links with and without using an API.
Alma, the Cloud & the Evolution of the Library Systems Department by Kevin Kidd
As libraries implement Alma and other cloud-based technologies, there are many questions about the future role of the traditional sysadmin focused library systems department. What opportunities and challenges will systems departments face as libraries push their applications and services into the cloud? What will be the practical effect of implementing Alma on your systems department? What tasks will systems librarians give up? What new duties will they take on? What new skills will systems librarians need to develop? I will discuss these questions in the context of the implementation of Alma at the Boston College Libraries. As the first adopters of Alma, we would like to share thoughts and experiences in a broad discussion of the effects of cloud computing on library systems and services.
Library Makeover: Retooling & Re-engineering of Library Services by Fe Angela Verzosa
presented at the Seminar on the theme “The New Face of the 21st Century Libraries and Information Specialists,” sponsored by Cavite Librarians Association, Inc., held at La Salette Retreat House, Biga, Silang, Cavite, Philippines on Dec 5, 2007
The document discusses future trends in technical services that may impact libraries over the next 5-10 years. It notes that technical services departments will need to focus on innovation and managing change. Key trends discussed include a shift to more machine-based cataloging upstream; new standards like RDA; providing access to non-English materials; developing institutional repositories; implementing electronic resource management systems; and the potential impact of open-source ILS and federated searching systems like WorldCat Local. Technical services staff will need new skills in areas like metadata and may take on roles like maintaining institutional repositories or digital collections.
The document summarizes the JISC HIKE Project at the University of Huddersfield which evaluated the Intota library management system from Serials Solutions and the JISC Knowledge Base+. The project aimed to understand current workflows, identify pain points, evaluate the new systems, provide guidance on integration, and assess the impact on workflows. Intota promises improved integrated workflows from discovery to acquisition and more automated processing. The project found opportunities to reduce duplication and break down silos through new interoperable systems.
What does success look like when it comes to library discoverability? Index based discovery systems have seen a dramatic rate of adoption since introduction to the research ecosystem in 2009, with more than 9,000 libraries relying on a discovery system to provide users with a comprehensive index to their offerings. Some issues bar the way to providing this comprehensive view, but many challenges have been overcome through collaboration between libraries, content providers and discovery partners. The NISO ODI initiative began to examine these issues in 2011, and released a best practice in June 2014.
Speakers will highlight examples of successful collaboration, note continued areas of challenge, and provide insight on how the Open Discovery Initiative Conformance Checklists can be used as a mechanism to evaluate content provider or discovery provider conformance with the best practice.
This presentation was provided by Diane C. Mirvis of The University of Bridgeport, during the NISO event, "Library Resource Management Systems: New Challenges, New Opportunities," held October 8 - 9, 2009.
The document summarizes a panel discussion on BIBFRAME and linked data. It discusses how BIBFRAME aims to replace MARC with a more network-friendly format, distinguishing works from manifestations. Panelists discussed projects involving linked data and increased collaboration across institutions. Specific projects at Cornell and Columbia were mentioned. Questions were asked about controlled access points, vocabularies, and cataloging's role in the semantic web.
This presentation was provided by Ted Koppel of Auto-Graphics, Inc., Ed Riding of SirsiDynix, Andrew K. Pace of OCLC, and John Mark Ockerbloom of The University of Pennsylvania, during the NISO webinar "Library Systems & Interoperability: Breaking Down Silos," held on June 10, 2009.
Presented at the OCLC Research Library Partnership meeting by Senior Program Officer, Karen Smith-Yoshimura and hosted by the University of Sydney in Sydney, NSW Australia, 17 February 2017. This meeting provided an opportunity for Research Library Partners to touch base with each other on issues of common concern and explore possible areas of future engagement with the OCLC Research Library Partnership and OCLC Research.
Crowdsourcing the Maintenance of E-Resource Metadata: How WorldCat Knowledge ... by Charleston Conference
This document discusses OCLC's WorldCat knowledge base and its Cooperative Management Initiative to improve metadata quality. It notes that the knowledge base contains metadata on electronic resources from over 5,800 providers. Through the Initiative, member libraries can approve/deny provider changes and add/update their own records. While cooperative management has increased transparency and prevented bad data, challenges include inconsistent participation, a lack of change protection, and the need for clearer guidelines. The document calls for balancing provider and community contributions to better leverage crowdsourcing for metadata maintenance.
Similar to PaLA2010 Annual Cultivating Technical Services (20)
Delegation Inheritance in Odoo 17 and Its Use Cases by Celine George
There are 3 types of inheritance in odoo Classical, Extension, and Delegation. Delegation inheritance is used to sink other models to our custom model. And there is no change in the views. This slide will discuss delegation inheritance and its use cases in odoo 17.
Front Desk Management in the Odoo 17 ERP by Celine George
Front desk officers are responsible for taking care of guests and customers. Their work mainly involves interacting with customers and business partners, either in person or through phone calls.
How to Create Sequence Numbers in Odoo 17 by Celine George
Sequence numbers are mainly used to identify or differentiate each record in a module. Sequences are customizable and can be configured in a specific pattern such as suffix, prefix or a particular numbering scheme. This slide will show how to create sequence numbers in odoo 17.
How to Install Theme in the Odoo 17 ERP by Celine George
With Odoo, we can select from a wide selection of attractive themes. Many excellent ones are free to use, while some require payment. Putting an Odoo theme in the Odoo module directory on our server, downloading the theme, and then installing it is a simple process.
No, it's not a robot: prompt writing for investigative journalism by Paul Bradshaw
How to use generative AI tools like ChatGPT and Gemini to generate story ideas for investigations, identify potential sources, and help with coding and writing.
A talk from the Centre for Investigative Journalism Summer School, July 2024
How to Handle the Separate Discount Account on Invoice in Odoo 17 by Celine George
In Odoo, a separate discount account can be set up to accurately track and manage discounts applied on various transactions and ensure precise financial reporting and analysis.
The Membership Module in the Odoo 17 ERP by Celine George
Some business organizations give membership to their customers to ensure the long term relationship with those customers. If the customer is a member of the business then they get special offers and other benefits. The membership module in odoo 17 is helpful to manage everything related to the membership of multiple customers.
Split Shifts From Gantt View in Odoo 17 by Celine George
Odoo allows users to split long shifts into multiple segments directly from the Gantt view. Each segment retains details of the original shift, such as employee assignment, start time, end time, and specific tasks or descriptions.
(T.L.E.) Agriculture: Essentials of Gardening by MJDuyan
(TLE 100) (Lesson 1.0) - Finals
Lesson Outcome:
-Students will understand the basics of gardening, including the importance of soil, water, and sunlight for plant growth. They will learn to identify and use essential gardening tools, plant seeds, and seedlings properly, and manage common garden pests using eco-friendly methods.
Lecture_Notes_Unit4_Chapter_8_9_10_RDBMS for the students affiliated by alaga... by Murugan Solaiyappan
Title: Relational Database Management System Concepts(RDBMS)
Description:
Welcome to the comprehensive guide on Relational Database Management System (RDBMS) concepts, tailored for final year B.Sc. Computer Science students affiliated with Alagappa University. This document covers fundamental principles and advanced topics in RDBMS, offering a structured approach to understanding databases in the context of modern computing. PDF content is prepared from the text book Learn Oracle 8I by JOSE A RAMALHO.
Key Topics Covered:
Main Topic : DATA INTEGRITY, CREATING AND MAINTAINING A TABLE AND INDEX
Sub-Topic :
Data Integrity,Types of Integrity, Integrity Constraints, Primary Key, Foreign key, unique key, self referential integrity,
creating and maintaining a table, modifying a table, altering a table, deleting a table
Create an Index, Alter Index, Drop Index, Function based index, obtaining information about index, Difference between ROWID and ROWNUM
Target Audience:
Final year B.Sc. Computer Science students at Alagappa University seeking a solid foundation in RDBMS principles for academic and practical applications.
About the Author:
Dr. S. Murugan is Associate Professor at Alagappa Government Arts College, Karaikudi. With 23 years of teaching experience in the field of Computer Science, Dr. S. Murugan has a passion for simplifying complex concepts in database management.
Disclaimer:
This document is intended for educational purposes only. The content presented here reflects the author’s understanding in the field of RDBMS as of 2024.
Feedback and Contact Information:
Your feedback is valuable! For any queries or suggestions, please contact muruganjit@agacollege.in
PaLA2010 Annual Cultivating Technical Services
1. CULTIVATING TECHNICAL SERVICES
Pruning and Growing for the Future
Judy McNally
Technical Services Team Leader
Doreen Herold
Catalog Librarian
LEHIGH UNIVERSITY
October 26, 2010
2. Contents
• Challenges
• Lehigh’s Technical Services Team
• Our Work Processes
• The Search for New Solutions
• Summary
• Questions/Comments/Conversation
3. Challenges
• Trends
– Acquiring fewer print publications
– Explosion of online/digital resources
– Keeping up with critical ongoing duties
– Dealing with reduced budgets and staffing levels
– Preparing for future changes
– Blurring of the lines between traditional staffing units
– The one CONSTANT is CHANGE
4. Challenges
• Criticisms of traditional procedures
– One-by-one adding/editing of records
– Access vs. the perfect record
– Reliance on MARC vs. other metadata standards
– Continuing to check-in print periodicals
– Continuing to bind and house print journals
5. Challenges
– Outdated OPACs and technologies
• Competing with the popularity and power of search engines
• Limited ability to link out to the web and make use of Web 2.0 technologies
– Staff and data silos
HOW DO WE SORT THIS ALL OUT?
WHAT DO WE PRUNE?
WHAT DO WE CULTIVATE?
6. Technical Services
at Lehigh
• Online catalog continues to be valued by Collection Management Librarians
• Shifting responsibilities and workflows
• Cross-training and staff development
• Outsourcing
• Batch Processing
7. Technical Services Team
• Traditional Team units
– Serials/Bindery, Cataloging, Acquisitions, Government Documents
– Staff members welcome change, learning new skills and collaborating
– Recognizing individual strengths and talents and building on them
8. Technical Services Team
• Serials – Shift from print to electronic
– Serials/Cataloging Assistants
• Decline of check-in duties
• Transition to work with electronic formats, copy cataloging, inventory and deaccession projects
• Projects originating from Collection Management, Special Collections, Access Services, Government Documents
9. Technical Services Team
• Serials (cont.)
– Bindery Assistant
• Declining bindery duties
• Transition to work in Remote Storage Facility
– Back-up staffing for Access Services Team
– Mending duties
– Projects for Special Collections
10. Technical Services Team
• Serials (cont.)
– Senior Serials Cataloging Assistant
• All electronic journal copy cataloging (print also)
– New subscriptions, JSTOR titles, title changes, holdings information, SFX issues
• Resolving electronic journal access problems for Collection Management Librarians, Help Desk and ILL staff (and our clients)
• Collaborates with Senior Serials Acquisitions Assistant; resource person for Serials staff
12. Technical Services Team
• Serials and Acquisitions
– Senior Serials Acquisitions Assistant
• Vendor/publisher/aggregator issues, licensing issues, backfiles, changes to packages, moving from print to online
• Resolving access problems
• Monograph Acquisitions
– Exploring link with ILL
– Ordering electronic books, PDFs
13. Technical Services Team
• Government Documents
– Government Documents and Metadata Coordinator
• Shift towards online Government Documents and training of Serials/Cataloging Assistants
• Transition into split position, on Technical Services and Library Technology Team (working on Digital Projects)
14. Technical Services Team
• Monograph cataloging
– Decline in amount of new print material
• 1998/1999: 13,116 titles, 19,203 volumes added
• 2008/2009: 7,299 titles, 9,155 volumes added
– Expanding into electronic resource cataloging, Special Collections projects, metadata work for Institutional Repository
– Hidden collections
15. Our Work Processes
• Increased batch cataloging
– ETDs (from original to batch)
– From print to print+online to online only (LNCS, et al.; one by one to batch)
– New subscriptions (ebrary)
– DB maintenance (Z39.50)
17. ETDs
The Old:
• Original cataloging by Archives and Special Collections Librarian, with assistance by Technical Services Team Leader and Catalog Librarian
The New:
• Records acquired from ProQuest
• Edited locally, using MarcEdit, by Cataloging Assistant with additional help from Technical Services Assistant
18. From Print to Print+Online to Online Only
“As you've probably noticed, the MRS proceedings site is a nightmare. The system makes sense to the members (I guess...) but it's pretty hard to navigate if you're new to the site. Links in ASA are a big help (gee, catalogs do perform a function).”
Sharon Siegler
Lehigh University’s Engineering Librarian
19. From Print to Print+Online to Online Only
• Print
– One-by-one cataloging
• Print & Electronic
– One-by-one cataloging
– Verifying (and sometimes finding) URL
• Electronic ONLY
– Developing search strategies for WorldCat
– Batch searching by OCLC number from Springer
– WorldCat Collection Sets (free from Springer!)
20. New Subscriptions
“The emphasis on perfection, on creating the perfect bibliographic record, which was a hallmark of technical services for decades, is being replaced by a model that places a higher value on access to information rather than on the information itself.”
Pamela Bluh
Associate Director for Technical Services and Administration, University of Maryland School of Law
Making Waves: Technical Services, Past, Present, and Future
21. New Subscriptions
• 46,313 GOOD vendor records
• Minor editing in MarcEdit
• Loaded in 12 minutes (a “slow news day” for the server)
24. Database Maintenance
Adding URLs: Early American Imprints
• 37,500 records for our microform/electronic access to Early American Imprints, series 1 (the Evans set)…
• …but 7,500 are missing URLs
• MarcEdit to the rescue!
25. Database Maintenance
Updating URLs: Annals of the New York Academy of Sciences
From http://www3.interscience.wiley.com/journal/123238624/issue to
26. The Search for New Solutions
• Editing records
• Locally-hosted files
• OLE
30. Bibliography
(2010, March). Public Perception. American Libraries, 41(3), 24.
Ambaum, G., & Barnes, B. (2010, October 11). Unshelved, http://www.unshelved.com/
Bluh, P. (2008, October 17). Making Waves: Technical Services, Past, Present, and Future. Paper presented at the annual meeting of the Potomac Technical Processing Librarians, Bowie, MD. Retrieved from http://digitalcommons.law.umaryland.edu/cgi/viewcontent.cgi?article=1629&context=fac_pubs
Eden, B.L. (2010). The New User Environment: The End of Technical Services? Information Technology and Libraries, 29(2), 93-100.
Fessler, V. (2007). The Future of Technical Services (It's Not the Technical Services It Was). Library Administration & Management, 21(3), 139-155.
Godin, S. (2010, January 9). The Future of the Library. Seth’s Blog, http://sethgodin.typepad.com/seths_blog/
McLeod, S. (2009, November 3). 10 Questions About Books, Libraries, Librarians, and Schools. Education Week LeaderTalk Blog, http://blogs.edweek.org/edweek/LeaderTalk
31. THANK YOU FOR JOINING US!
Judy McNally
njm0@lehigh.edu
Doreen Herold
dherold@lehigh.edu
32. Questions/Comments/Conversation
Suggested starters:
• How does your own technical services department manage change, as affected by resource format (print to electronic), budgets, etc.?
• Has your serials check-in changed? Doing more or less? Have you stopped checking in? Do you “check in” electronic issues? How are other serials procedures changing and being affected? Bindery? Weeding?
• What do you outsource? What are the pros and cons of your experience?
• How are libraries implementing some of the new cataloging standards (RDA, other metadata schemas)?
• What staff changes are you seeing and how are you managing them? Staff decreases/increases? Cross-training? Moving to/collaborating with other departments?
• How do you manage the cataloging of resources, especially standing-order resources, that were print but are now online?
E-journals, databases, reference sources, dissertations, E-books, hidden collections, Institutional Repository collections, streaming video, digital collections, locally hosted files
Print resources, weeding, database maintenance
RDA, FRBR, next generation catalogs and replacements for integrated library systems
WorldCat Local
Blurring as formats blur
Formats blending
Online training classes, webinars, self-paced tutorials, wiki for documentation, post-it sessions
Use of interns; some outsourcing (authority control, record sets); some comes back to us (DVDs, videos)
16,740 volumes weeded in 2009/2010, primarily print journal runs we have online coverage for; PALCI cooperative storage
Final binds
The one constant is change
Purchase items requested often; adding links for open-source electronic resources
Reclass project, print recon work
The previous slide outlined what I’ll discuss about our work processes. Judy provided an overview of our entire department but I’ll be focusing on our work processes in cataloging to provide somewhat of a case study of how our department has been affected by change. One primary factor that has affected our work processes in cataloging, as well as other areas, has been the increase in the cataloging of resources accessible online. This chart shows that increase over the past 15 years, in segments of five years.
As an aside, I should note that this is just the cataloging of these resources, as we also have online access to other resources that are not cataloged yet our department does manage access to such resources through tools such as SFX (for example, we do not catalog all titles available through ProQuest). Our last official count of the number of resources managed through SFX was from June of 2009, with the count being 32,124.
You can see the biggest jump has been with government documents, followed by electronic books (and the figure for the most recent five years doesn’t even include the 46,313 records that were added earlier this year for our new subscription to ebrary). We continue to see an increase in electronic journals, even after the massive additions that were made in the late 90s and early 2000s to address new subscriptions to aggregate databases. And our electronic conferences have exploded as small, specialized associations that don’t have publications as their primary mission forge their way into online publishing.
The first project I’d like to talk about is the cataloging of our theses and dissertations. At Lehigh, all cataloging is done within Technical Services, with some exceptions, one of those being that the cataloging of items in Special Collections is done by the Special Collections staff. In the past, our theses and dissertations were cataloged by our Archives and Special Collections Librarian, one title at a time. As necessary, he would seek additional help from the Technical Services Team Leader and the Catalog Librarian.
But, in 2007, we worked with ProQuest to begin acquiring records from them. Initially, they provided a load of almost 900 records, which included an 856 field to link to the full text of our dissertations, back to 1997.
Minor editing = add 655 “Electronic books” genre heading; add 710 ebrary, Inc.; add 856 $z Available to Lehigh users; add 949 call number/holdings statement; add 978 (experiment to capture statistics, based on Rutgers University’s use of the 978)
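The batch edits just listed can be sketched as a small script. This is a minimal sketch, not Lehigh's actual MarcEdit task list: records are modeled as plain (tag, value) pairs rather than real MARC, the subfield coding is simplified, and the 949/978 fields are omitted because their exact contents are local specifications.

```python
# Sketch of the "minor editing" pass described above, modeled on plain
# (tag, value) pairs rather than real MARC records; the actual workflow
# used MarcEdit. Field values follow the note; coding is simplified.
EDITS = [
    ("655", " 7 $a Electronic books."),
    ("710", "2  $a ebrary, Inc."),
    ("856", "40 $z Available to Lehigh users"),
]

def apply_edits(record, edits=EDITS):
    """Append each edit field unless the record already carries it."""
    present = set(record)
    for field in edits:
        if field not in present:
            record.append(field)
    return record

rec = [("245", "10 $a Sample ebrary title")]  # hypothetical record
apply_edits(rec)
print(len(rec))  # 4: the original 245 plus the three added fields
```

Checking for an existing copy of each field keeps the pass idempotent, which matters when the same routine runs against both new loads and records already edited in an earlier batch.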
Since the initial load of 46,313 in April, ebrary record loads have become part of the routine. 4,161 records have been loaded since then, in batches ranging from 2 to just under 2,000. Batches are normally done on a monthly basis but our Collection Development librarians also make selections on a title-by-title basis.
In 2008, our Collection Development librarians established an Ebook Committee and, during their work, the subject of access arose. They connected with the Technical Services department regarding our practices with ebooks and we reviewed our application of the genre heading “Electronic books”.
We decided to begin applying this term, to empower searching for such materials and so, based on our specifications, our Systems staff pulled records from our system that we then edited with MarcEdit to ensure uniform application of that term to records already in our system. This enabled a canned search for electronic books, developed by our Digital Library team, as well as for general searching in our catalog.
We (that is, me and our intern, Phil Hewitt) used a canned report in our ILS to extract approximately 7,500 MARC records, which we then edited in MarcEdit. We had a base URL with which to start but did some creative editing, duplicating and dicing the series statement to capture the volume number, which was then merged with the base URL in the 856. We were then able to overlay the records and, voila!, had our URL.
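The "duplicate and dice" step can be sketched in a few lines: pull the serial number out of the series statement and splice it onto a base URL. The base URL below is a hypothetical placeholder, not the vendor's actual address, and the series wording is illustrative.

```python
import re

# Sketch of the series-statement manipulation described above: extract
# the Evans serial number and merge it with a base URL to build the
# 856 value. BASE_URL is a hypothetical placeholder, not the real one.
BASE_URL = "https://example.org/evans/"

def url_from_series(series_statement, base_url=BASE_URL):
    """Return an 856 $u value built from the series number, or None."""
    match = re.search(r"no\.\s*(\d+)", series_statement)
    if match is None:
        return None  # record would need manual review instead of overlay
    return base_url + match.group(1)

print(url_from_series("Early American imprints. First series ; no. 7404"))
```

Returning None for records without a parseable number mirrors the practical reality of batch work: a handful of outliers always fall out for one-by-one handling.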
We finally broke from referring to our Systems staff or using a canned ILS report and, instead, extracted 657 records from our catalog using MarcEdit’s Z39.50 server. The records were pulled using the base URL that is in bold, above the graphic. The records were manipulated a bit to put them into Excel in order to filter them for those under the Annals… series. This left 364 records remaining. The 260 and 440/830 fields were duplicated and edited to merge with the new base URL that you can see in the graphic. They were then loaded back into the system to overlay the records with the old URL.
Much of the work I’ve mentioned has heavily involved use of Terry Reese’s wonderful tool, MarcEdit. I’m just beginning to scratch the surface of using MarcEdit’s capabilities.
I’ve planned a series of 16 courses through WebJunction that I hope will help me develop a greater skill set to use MarcEdit, and other tools, more fully. If you don’t know about WebJunction, go to pa.webjunction.org to see their list of self-paced courses on a variety of topics, many or all of which are $5.00 per course, thanks to Commonwealth Libraries.
There are other tools out there, including one recently mentioned on Marshall Breeding’s Library Technology Guides. I haven’t looked into this yet—has anyone used this?
One more thing about MarcEdit—I’m wondering if it would be desirable to have a session at next year’s annual meeting on MarcEdit. Is there any interest in that? And, if you would be willing to co-present, I’d love to hear from you. I think it might be helpful to have a number of people provide detailed presentations on how they’ve used MarcEdit, something that attendees can learn about and take back home to implement.
A recent issue that has arisen for us a number of times concerns files that we cannot access remotely. A vendor will supply us with files, mostly PDFs, that we’ve purchased but does not provide remote access. So providing online access is up to us. Our first, and temporary, attempt was to put them on my individual webserver space.
But, as we’re developing our IR, we decided to make this part of the development and we’re now beginning to load the files into CONTENTdm. In the case you see here, we received almost 100 PDFs for musical scores. There were no records for them in WorldCat so a student is creating the metadata, after we developed guidelines, consulting Dublin Core standards and Music Library Association recommendations.
Once the metadata is complete, it will be exported, converted into MARC using MarcEdit, and then loaded into our catalog, with referring URLs. We do have MARC records for other files and will do the opposite, exporting the records from our system, running them through MarcEdit to convert into Dublin Core, and then putting that metadata into CONTENTdm as a base for the PDFs.
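The export-and-convert step can be sketched as a simple crosswalk. This is a hypothetical stand-in for MarcEdit's built-in conversion: the tag choices follow the common Dublin Core to MARC mapping, only a few elements are handled, and indicators and subfield coding are omitted for brevity.

```python
# Minimal Dublin Core -> MARC crosswalk sketch, standing in for the
# MarcEdit conversion described above. Tag choices follow the usual
# DC-to-MARC mapping; indicators/subfields are omitted for brevity.
DC_TO_MARC = {
    "title": "245",
    "creator": "100",
    "subject": "650",
    "identifier": "856",
}

def dc_to_marc(dc_record):
    """Translate a dict of DC elements into sorted (tag, value) fields."""
    fields = []
    for element, values in dc_record.items():
        tag = DC_TO_MARC.get(element)
        if tag is None:
            continue  # unmapped elements would need cataloger review
        for value in values:
            fields.append((tag, value))
    return sorted(fields)

score = {
    "title": ["Sonata in G major"],  # hypothetical score, not a real record
    "creator": ["Doe, Jane"],
    "identifier": ["https://example.org/scores/1.pdf"],
}
print(dc_to_marc(score))
```

Because the mapping is a plain dict, the same table can be inverted for the reverse trip described in the note (MARC exported from the catalog, converted to Dublin Core for CONTENTdm).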
These files are for Lehigh access only (our users can access them off-campus, with their Lehigh login; visitors can access them on campus).
And, in the search for new solutions, I should mention our involvement with the Kuali OLE Project.