Google Analytics on library websites

OU Library seminar, given by Tony Hirst and Hassan Sheikh. Reprise of a talk Hassan gave at ILI last year (draft presentation, PPT). Topic: tracking referrals from course websites and how that affects users' behaviour on the OU Library site.

Google Analytics allows you to track users across your website during their session – how long they spend, which pages they visit. Tracking code (JavaScript) is put into the page templates. Handles up to 5m page views per month. Easy to set up.

Lets you ask and answer: how well is the home page working? Gives you an overlay showing the % of clicks on each link. One way of using the data is to change your site design to make things easier for users (e.g. to follow Fitts's Law – make the common elements larger and hence easier to acquire).

Most clicked links: Databases (20%), eJournals (19%), library catalogue (11%).

53% of traffic is direct, 41% from referring sites (i.e. via a link), 5% via search engines. The list can be broken down by referrer – the top referrer accounts for 18%, with the intranet next.
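The direct/referral/search split above is just a classification of each visit by its referrer. A minimal sketch of that classification, using a made-up list of referrer domains (the sample visits and the cut-down search-engine list are illustrative, not OU data):

```python
from collections import Counter

def traffic_sources(referrers):
    """Classify each visit the way GA's Traffic Sources overview does:
    'direct' (no referrer), 'search' (known search engine), else 'referral'."""
    search_engines = {"google.com", "bing.com", "yahoo.com"}  # simplified list
    counts = Counter()
    for referrer in referrers:
        if not referrer:
            counts["direct"] += 1
        elif referrer in search_engines:
            counts["search"] += 1
        else:
            counts["referral"] += 1
    total = sum(counts.values())
    return {k: round(100 * v / total, 1) for k, v in counts.items()}

# Hypothetical sample: referrer domain per visit (None = direct)
visits = [None, None, "intranet.open.ac.uk", "google.com", None, "learn.open.ac.uk"]
print(traffic_sources(visits))
```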

Then look at a summary report of the behaviour of visitors from a specific referrer site, taking one referrer's traffic as an example. Shows a clear work-week peak in traffic. Bounce rate (single-page visits): 27%.
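Bounce rate is simply the share of visits that viewed exactly one page. A quick sketch of the calculation, with a made-up list of pages-per-visit counts:

```python
def bounce_rate(page_views_per_visit):
    """Share of visits that viewed exactly one page – GA's bounce definition
    for the classic pageview-based tracker."""
    bounces = sum(1 for n in page_views_per_visit if n == 1)
    return 100 * bounces / len(page_views_per_visit)

# Hypothetical sample: 11 visits, 3 of them single-page – about 27%.
print(bounce_rate([1, 4, 2, 1, 6, 3, 2, 1, 5, 2, 4]))
```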

Direct traffic much more steady through the week.

Content overview is another headline report – shows you the top content and popular pages: / is top, then find/journals, then find/databases, etc. (You can map URLs in reports to easier-to-read names.)

Most popular pages: journals, databases, eResources.  Top traffic sources – shows you where the traffic comes from.

Can drill down too. Interestingly, the databases get about 45% of traffic on the site, but drilling down into the databases themselves, even the top one only captures 5% – the distribution is much flatter. But we can't get down to the activity on the journals themselves.

The Library is using GA to generate some performance indicators for the site – e.g. unique visitors, bounce rate, visitor loyalty, average page views, depth of visit, length of visit. A high bounce rate is not necessarily bad for the Library site – if visitors come in and then go straight to somewhere you're trying to send them, that's good.

You can export the data from any report that fits on a single screen in GA – as XML or CSV – so you can plot e.g. average time on site vs pages/visit, with dot size for bounce rate, or average time on site per network location. Tony has done quick graphs of this in Many Eyes.
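Working from such a CSV export, building the bubble-chart data is a few lines. The column names below are assumed for illustration – a real GA export's headers will differ:

```python
import csv
import io

# Hypothetical shape of a GA CSV export: one row per network location.
export = """location,avg_time_on_site,pages_per_visit,bounce_rate
Campus,412,5.2,18
Home ISP,240,3.1,31
"""

rows = list(csv.DictReader(io.StringIO(export)))

# Points for a bubble plot: x = avg time on site, y = pages/visit,
# bubble size = bounce rate.
points = [(float(r["avg_time_on_site"]),
           float(r["pages_per_visit"]),
           float(r["bounce_rate"])) for r in rows]
print(points)  # → [(412.0, 5.2, 18.0), (240.0, 3.1, 31.0)]
```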

Search traffic – can track the search terms used; the top ones were: athens, safari, refworks, referencing. 99.29% of visits involve no search (good site design?).

These are all averages – but be wary of them.

Next up: exploring OU Library website usage based on course referrals. A brief look at traffic from Moodle (the OU VLE), and also from TU120 (which has Google Analytics on it, so the two ends can be matched up).

Segmentation (breakdown) by referral URL. Moodle has complex URLs with query strings in them (which identify the course, etc.), but Google Analytics throws those away by default. You can define a rule in GA to say 'don't discard that'. Then you can see where traffic comes from (which courses), and where it goes on the Library website. You can look at the originating page too, so you can get an inkling of how effective (little bits of) the course pages on the VLE are in terms of where they send people.
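Once the query strings are kept, mapping a referral back to a course is a matter of parsing the URL. A sketch, where the `id` query parameter and the example URL are assumptions about the VLE's URL scheme rather than the actual Moodle setup:

```python
from urllib.parse import urlparse, parse_qs

def course_from_referrer(url):
    """Pull a course identifier out of a Moodle-style referral URL.
    Assumes the course is carried in an 'id' query parameter."""
    query = parse_qs(urlparse(url).query)
    return query.get("id", [None])[0]

# Hypothetical Moodle URL, with the query string GA would normally discard:
print(course_from_referrer("http://learn.open.ac.uk/course/view.php?id=1234"))  # → 1234
```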

Landing pages across VLE referrals – mostly home page, then eResources, then the Library Guide, then specific pages on eResources.

Tracking back, you can find e.g. that 'Article for Question 3' was a big traffic driver on M882 – "Success Factors for Implementing Global Information Systems". (Currently a few technical fiddles are required because it's hard to distinguish links to separate sections of the same resource – it needs an additional bit of tracking code on each link.)

TU120 2008J presentation – the information skills course "Beyond Google" – in the Relevant Knowledge programme.

Out of 227k visits, TU120 generated 1,678 (in this sample – Sep–Dec for a single presentation). You can segment down in Google Analytics. The data isn't all TU120 students – it's TU120 students visiting the Library.

Profile of visits – a big spike at the start, another in the middle, another towards the end. Content Performance tells you which pages were viewed: databases and journals are popular. Look at the referrer – it's mostly the ECA (week 10, the final spike), then Section 3 of the course (multiple pages, week 3, the mid-course spike). Can drill down to which databases (Academic Search Complete and Nexis UK).

Average 4.6 pages per visit, 14 min. But for the ECA, 28% of visits (the overwhelming mode) are 2 pages deep – not a normal distribution. The depth-of-visit distribution is much flatter for the mid-course spike.
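The "depth of visit" report is just a histogram of pages-per-visit. A sketch of building that distribution, with invented sample data (not the TU120 figures) showing a strong mode at 2 pages:

```python
from collections import Counter

def depth_distribution(depths):
    """Percentage of visits at each depth (pages per visit)."""
    counts = Counter(depths)
    total = len(depths)
    return {d: round(100 * c / total, 1) for d, c in sorted(counts.items())}

# Hypothetical ECA-week sample with a heavy mode at 2 pages deep.
print(depth_distribution([2, 2, 2, 1, 3, 2, 4, 2, 1, 2, 5, 2, 2, 3]))
```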

Can run A/B tests using Google Analytics – 50% of visitors see page A, 50% see page B, and you look at whether the behaviour patterns differ. A low-risk way of trying A/B testing out for real on course content. (Multivariate testing would be more efficient, but more complex.)
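The core of a 50/50 split is assigning each visitor to a variant so that the same visitor always sees the same page. A hash-based bucketing sketch of that idea (GA's own experiment tooling handles the assignment for you; the function and experiment name here are illustrative):

```python
import hashlib

def ab_variant(visitor_id, experiment="homepage-test"):
    """Deterministically assign a visitor to A or B with a 50/50 split.
    Hashing (experiment, visitor) keeps each visitor in the same
    variant across return visits."""
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(ab_variant("visitor-42"))
```

Keying the hash on the experiment name as well as the visitor means a new experiment reshuffles the buckets, so the same people aren't always in group A.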

Can look at which pages are sending traffic to, e.g., the journals page.

Actions: segment onsite/offsite and regional users (IP range filters). Track by course referrals from the VLE. Enhanced OU Library PI reports (Many Eyes?). Improve the homepage by keeping an eye on site overlays. Worth tracking changes – a useful flag for problems. Keep an eye on usage of the databases. Reduce the long list of databases (?). Use consistent names and URL paths.
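The onsite/offsite segmentation in the actions list amounts to an IP-range filter. A minimal sketch – the campus network range below is a placeholder, not the OU's actual allocation:

```python
from ipaddress import ip_address, ip_network

# Hypothetical campus ranges for an onsite/offsite filter.
CAMPUS_NETWORKS = [ip_network("192.0.2.0/24")]

def is_onsite(addr):
    """True if the visitor's IP address falls inside a campus range."""
    ip = ip_address(addr)
    return any(ip in net for net in CAMPUS_NETWORKS)

print(is_onsite("192.0.2.17"), is_onsite("8.8.8.8"))  # → True False
```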

Tony posts about Library analytics – eight posts already.
