Testing SAML Attribute release

What goes on when you log in to a Single Sign On controlled web site? From the user’s point of view, not much: you may have to select your institution from a long list of possible sites, but once you have found the right one you just type in the required username and password and you are in.

From the system’s point of view there is a lot of behind-the-scenes activity. The whole login process has to be tweaked so that the information received about a logged-in user matches what the website, or more precisely the Service Provider (SP), expects.

When a user succeeds in logging in, a package of information is generated and shared with the SP in SAML format (SAML is an XML-based format). Generally only the minimum amount of information required to access a site is released, in ways designed to prevent tracking of users and to avoid spam emails.

The SAML messages released can be viewed with the Firefox add-on SAML Tracer, but SAML is meant to be machine- rather than human-readable. If there is a problem with access to a site, it may well be because the attributes released do not match those expected by the Service Provider.
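Since the decoded SAML is just XML, a quick sanity-check of the released attributes can also be scripted. Here is a minimal sketch using Python’s standard library; the assertion fragment and attribute names below are invented for illustration, not what DMU actually releases:

```python
import xml.etree.ElementTree as ET

# Namespace used by SAML 2.0 assertions.
SAML_NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

# A made-up AttributeStatement fragment, for illustration only.
sample = """<saml:AttributeStatement xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Attribute Name="eduPersonScopedAffiliation">
    <saml:AttributeValue>member@dmu.ac.uk</saml:AttributeValue>
  </saml:Attribute>
  <saml:Attribute Name="eduPersonTargetedID">
    <saml:AttributeValue>abc123</saml:AttributeValue>
  </saml:Attribute>
</saml:AttributeStatement>"""

def released_attributes(xml_text):
    """Return a {attribute name: [values]} map from an AttributeStatement."""
    root = ET.fromstring(xml_text)
    return {
        attr.get("Name"): [v.text for v in attr.findall("saml:AttributeValue", SAML_NS)]
        for attr in root.findall("saml:Attribute", SAML_NS)
    }

print(released_attributes(sample))
```

A script like this is handy for comparing what SAML Tracer captures against the attribute list the SP says it needs.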

There is a quick way of viewing the attributes released by building a URL that looks like https://login.library.dmu.ac.uk/oala/sso-debug?entityID=&lt;entityID-of-SP&gt;.
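Building such a URL can be sketched in a few lines of Python; the entityID needs percent-encoding because it is usually itself a URL. The entityID below is hypothetical:

```python
from urllib.parse import urlencode

DEBUG_BASE = "https://login.library.dmu.ac.uk/oala/sso-debug"

def debug_url(entity_id):
    """Build the sso-debug URL, percent-encoding the SP's entityID."""
    return DEBUG_BASE + "?" + urlencode({"entityID": entity_id})

# Hypothetical entityID, for illustration only.
print(debug_url("https://sp.example.com/shibboleth"))
# colons and slashes in the entityID come out percent-encoded
```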

You will need to log in with your own DMU Single Sign On details. The resulting Single Sign On Debug page tells you where the information is being sent, the names of the SAML attributes being released, and the values associated with those attributes.

If there is an issue with logging in to a Service Provider’s resources, it might be worth checking the attributes released to ensure that the ones expected are in fact present.



Creating cost & usage analysis for combined databases packages in Intota

I’m currently adding ERM data to Intota for all of DMU Library’s online databases. The library uses Intota to store and manage ERM data for e-resource renewals (I have previously blogged about creating a renewal checklist and automated alerts in Intota) and has also started to use Intota’s Assessment module to capture COUNTER usage stats and create usage/cost analysis to assist with library e-resource evaluation.

Some of the library’s negotiated agreements contain multiple online resources, even though we pay one set annual subscription cost. Recording this single cost for budgetary allocation/spend purposes is simple enough, but it can present complexities when you want to extrapolate usage and cost analysis for the individual databases within a combined deal.

For example, I have recently been dealing with a renewal agreement which consists of two online databases for which the library pays one set annual cost. The price of the deal is weighted differently in terms of the two databases (database X costs one fee, database Y costs a different fee), so we cannot apply a simple 50/50 split on the deal. The invoice for the deal also contains a single line for total VAT and a single line for a transaction fee applied by the negotiating agent.

So, to get a true reflection of the exact cost for each of the individual databases within the combined agreement, I had to apply the following formula to give me one “exact” cost for database X and one “exact” cost for database Y:

Weighted cost of database X in combined deal + VAT applied to database X + half of transaction fee = exact cost of database X.

(I then ran the same formula for database Y).
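The same formula can be captured in a small script. The figures below are made up for illustration (the real weighted costs, VAT rate and transaction fee are not given in this post):

```python
def exact_cost(weighted_cost, vat_rate, transaction_fee, fee_share=0.5):
    """Weighted cost + VAT on that cost + a share of the agent's transaction fee."""
    return weighted_cost + weighted_cost * vat_rate + transaction_fee * fee_share

# Made-up figures: database X weighted at 6000 GBP, database Y at 4000 GBP,
# 20% VAT, and a 100 GBP transaction fee split 50/50 across the two databases.
cost_x = exact_cost(6000, 0.20, 100)
cost_y = exact_cost(4000, 0.20, 100)
print(cost_x, cost_y)  # 7250.0 4850.0
```

Note the assumption here that VAT is apportioned in line with the weighted costs and the transaction fee is split evenly, as in the formula above.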

Once I had calculated the single subscription cost for each of the databases, I added these monetary values into the “Costs” section of each database record on the Intota knowledgebase. Intota saves and stores these figures and plugs the costs into its Assessment module, from where library staff can download and export various COUNTER and library usage reports:


Different usage and cost reports can be created in the Intota Assessment module.

Where possible, the library has set up SUSHI connections with content providers in Intota to automatically capture usage data on a regular basis. If a SUSHI connection is not possible or desirable, Intota collates data manually on a twice yearly basis (this involves providing Intota with admin logins for the publishers we have selected for this manual upload of library usage stats). For databases X and Y in the combined deal I mention above, we set up a SUSHI connection between Intota and the relevant vendor.

So, the setup for the combined database package is good to go. The library has set up the correct SUSHI credentials between Intota and the publisher, and it has calculated a breakdown of costs for the individual databases contained within the combined agreement for 2015/16. When the library is ready to run a usage report for the combined deal, Intota Assessment should be able to provide the necessary information and a cost per use figure for the subscription period selected.
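The cost per use calculation itself is simple: the “exact” cost of a database divided by the COUNTER-reported uses for the subscription period. A sketch, with made-up numbers:

```python
def cost_per_use(annual_cost, counter_uses):
    """Simple cost-per-use: subscription cost divided by COUNTER-reported uses."""
    if counter_uses == 0:
        return None  # avoid division by zero for an unused resource
    return annual_cost / counter_uses

# Made-up numbers: 7250 GBP exact cost for database X, 5000 recorded uses.
print(round(cost_per_use(7250, 5000), 2))  # 1.45
```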

DMU Library has a number of combined database agreements, so we will be monitoring how we and Intota handle the administration of these (potentially) complex and challenging resource deals.


Intota implementation: ERM data population, first steps…

DMU Library’s Content Delivery Team are currently implementing ProQuest’s new Intota Assessment product. DMU Library has started the process of populating the Intota knowledge base with subscription, cost, licence and usage data for its full text online resources, A&I databases, e-journals and e-books. We are using Intota in an attempt to streamline work processes across the directorate and provide more effective and efficient outputs for library staff who are part of e-resource/e-journal/e-book renewal cycles and purchase decisions. DMU Library was provided with full access to Intota in Nov 2014, and a number of library staff have spent the last two months or so getting familiar with the online Intota admin system, receiving online training from ProQuest Support and adding first batches of resource data (mainly costs and subscription term data).

I oversee the day-to-day administration of the library’s online resources portfolio, so I am very interested to see how Intota can help improve my own workflows when it comes to online databases management and assessment. The management of a hundred or more different online library databases/collections has traditionally been based on many different silos of information – Excel spreadsheets, Outlook calendar alerts, email inboxes, publisher admin sites, resource paper files and my own memory bank!

When a library e-resource is coming up for renewal, the Content Delivery Team provide subject teams with key info relating to the renewal – this is usually a cost, licence and usage analysis of the individual resource which assists the librarian in making sound and informed decisions to maximise annual library investment in the online products it purchases. This analysis has to be created weeks/months in advance, often set against strict notice period or licence deadlines which need to be met, so plotting the various workflows to join together can be tricky (especially if a number of resources are coming up for renewal at the same time).

Two aspects of functionality within Intota immediately struck me as having the potential to make life easier when managing elements of this resource data: the ability to create an online review checklist which can easily be annotated during the different stages of a resource renewal process, and the setting up of automatic email alerts for library staff to receive at key points during a renewal. So, I set about reading the Intota user guide to see how I would set up each option and gauge its usefulness…

Review checklist

I have created a basic review checklist within Intota which tries to capture the various stages of the e-resource renewal cycle. I am sure this looks different for all HE libraries, depending on how a resource is purchased and administered. The checklist is very much a work in progress at this stage; I was more interested in exploring the functionality within Intota, rather than scoping out exactly what the checklist should state step-by-step.

Intota allows you to manually input individual “steps” for the renewal cycle you want to create, based on your own library workflows. Once these individual steps are saved, the review checklist is then available to view for each database which is active within the Intota knowledge base and potentially subject to a renewal. Each step of the renewal process then has a box for library staff to “tick” once a stage is completed in the sequence, and an empty “note” field for library staff to enter relevant information (e.g. the date the step was completed and initials of library staff who completed the action).

Intota allows you to create a review checklist to assist resource renewals.


Renewal alert

Intota also allows you to create automated “alerts” which can be emailed to nominated library staff as a prompt to start a particular renewal workflow or to act as a reminder to complete a task in the renewal cycle. I created a draft test alert for an upcoming library resource renewal.

The alert operates in line with renewal milestones (dates) set by library staff in the Database record for whichever resource you want to renew in Intota. These milestones will be based on the resource expiry date, and other key deadlines the library may wish to set for each evaluation stage of the renewal process. So, say a library resource expires on 28 Feb 2015, the library may set a “renewal decision date” of 28 Jan 2015 in line with a 30 day notice period contained in the resource agreement between library and supplier. The renew/cancel evaluation of the resource is obviously going to have to start much earlier than the decision date deadline – library staff need to be prompted to start collating usage data, cost information and the latest licence for subject teams to review. This is where an automated alert may be helpful.

Automated email alerts can be set up in Intota to act as a workflow prompt or reminder.


For the test alert I created, I requested an automated email notification be generated 30 days before the “renewal decision date” (28 Jan 2015). An email alert duly appeared in my work inbox on 29 Dec 2014:
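The date arithmetic behind the alert is easy to check with a few lines of Python, using the milestones from the example above:

```python
from datetime import date, timedelta

expiry = date(2015, 2, 28)          # resource expiry date
decision_date = date(2015, 1, 28)   # renewal decision date set by library staff
alert_date = decision_date - timedelta(days=30)  # alert 30 days before decision

print(alert_date)  # 2014-12-29, the day the email arrived
```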

An automated email message is generated and sent to a nominated email account on the date requested.


Setting up these automated messages as a means to trigger the start of a library workflow is a simple and effective method to begin to better manage processes in this area. The alert requires a “renewal decision date” to be added to all subscribed library resource records within the Intota knowledge base, but once this is populated, it should work well. The alert can also be sent to other library staff, as long as nominated email accounts are listed when the alert is created.

This blog post outlines functionality of two Intota features at the outset of DMU Library’s implementation of the service. The Content Delivery Team will continue to blog on progress and development with Intota over the coming months.





Summon implemented: what next?

‘Library Search’ at De Montfort University was launched in September 2014 and on 22nd December the implementation phase was declared to be officially over. Over the Spring and Summer of 2014 there was a good deal of concentrated effort put into getting Summon to reflect the variety of print and electronic material available to members of the university and that effort seems to have paid off. I expect people soon to be wondering how they ever got anything done without it.

I did get the chance to ask a pharmacy student how she felt about Library Search last week. I was pleased that she not only knew what it was, but explained that her tutors were always highlighting the importance of using academic content in her assignments and that Library Search was a good way of finding this kind of material.

It is an important comment, underlining how being able to find and use academic content makes a difference to assignments and, eventually, degree results. But it also highlights the responsibility of the Library, and the Content Delivery team in particular, to ensure that all the appropriate content is findable within Library Search.

That is, in fact, where the emphasis of the team’s work is being focused in 2015. We are moving into the implementation phase of Intota, a tool for managing electronic resources. Behind Library Search is a whole range of electronic resources to which the university subscribes. Intota does not add to that range; it only makes managing those resources easier and ensures, for example, that we do not lose access to anything just because someone forgot to pay a bill.

One of the tasks that needs to be completed in getting Intota up and running is to add prices for our individual journal subscriptions. Where we have made a start on this we have already found more content that we can activate in Library Search, so that more articles show up in the search results. Equally, we have also been able to tune the activated date ranges to exclude years where we no longer have access. This helps to avoid frustrating users with articles that can’t be accessed and cuts down on helpdesk calls as people try to figure out what went wrong.

Once we have prices for journals and start collecting usage figures for article downloads we can make evidence-based decisions on how to promote and manage our electronic collections.

As we gain new collections, like the recently added Avery Index to Architectural Periodicals, they are added to both Summon and Intota so that we can keep track of them and make the content available to students via Library Search.

New journals are also being accepted into the Directory of Open Access Journals and we are monitoring the additions and adding those relevant to the university. As the range of students and their interests becomes ever more international this can be an important way of ensuring that material is available in languages other than English.

One of the other key elements of the Library Search infrastructure is EZproxy, which in turn requires attention and further fine-tuning. When Library Search was first introduced the EZproxy logs were monitored daily to identify and fix any problems with the way the service was configured. Having a proxy service in place means that many login problems that could be potential obstacles for students can be precluded. But there is an art to proxy configuration, and odd side-effects need to be investigated and resolved.
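For context, an EZproxy database stanza typically pairs a resource’s Title and starting URL with one or more Domain lines that tell the proxy which hosts to rewrite; much of the “art” lies in getting those host lists right. A minimal, hypothetical stanza (the hostnames are placeholders, not a real DMU resource):

```
Title Example Publisher Platform
URL https://platform.example.com/
Domain example.com
```

Real stanzas for complex platforms often need several more Domain (or host-specific) lines to cover images, PDFs and login redirects served from other hostnames.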

The trend towards internationalisation draws increased attention to the licenses that accompany and regulate the use we can make of electronic resources. We are hoping that Intota proves a useful tool for keeping track of the licenses themselves and alerting us to provisions that need to be enforced.

Of all these activities to support Library Search, perhaps the most exciting is the interpretation of the incoming usage statistics. It will be interesting to learn whether there has been a ‘Summon effect’ on the use of electronic resources. Already it appears that the Ebrary e-book collection was better promoted, via Library Search, in 2014 than we managed in 2013. Where usage was already more mature there may still be discernible trends as Library Search gains in popularity. Whatever the figures, those for the first term of Library Search use are likely to be closely investigated, as they provide a new benchmark for electronic resource use in the university.





Why this blog?

There is a team of people at De Montfort University working on making access to electronic resources easier for our users. One such venture is Library Search [login required], introduced in the Summer of 2014.

‘Simpler’ usually means taking the complicated bits and moving them behind the scenes. In this blog we will reflect on our progress and frustrations.
