Tag Archives: augmented reality

A Dose of (Augmented) Reality: Exploring possible uses within a library setting

One of the areas mentioned in our surveys as something to investigate for the future was augmented reality. Birmingham City University library’s Mobile Technologies Working Group has been considering different ways of using augmented reality, as this guest post by Anthony Humphries demonstrates (recreated with kind permission from the BCU eLibrary blog). A brief introduction to Anthony…

I’m the Learning Resources Co-ordinator within a busy Lending Services department, supporting the Help Desk to improve our customers’ experience as much as possible. A committed techno-positivist, I am highly interested in the ability of emerging technologies to enrich the experience of our users and sustain the relevance of our libraries. I am always keen to discuss my ideas, and if you want further information please contact me: anthony.humphries@bcu.ac.uk.

And now for his blog post…

Of the many emerging mobile technologies that libraries are looking at, one that has always appealed to me is augmented reality (AR). Compared to other technologies under discussion, AR:

  • has fewer introductory barriers to overcome
  • is virtually cost-free
  • does not require specialised technical staff
  • is something the general public will increasingly have some familiarity with
  • can also be a lot of fun.

So I committed myself to turning some of these ideas into practical demonstrations for a group of interested colleagues.

I used the Aurasma platform as it’s free, straightforward to use, and has considerable market penetration. It works by having a pre-prepared image – a trigger – uploaded to their servers. Then, when a device using the Aurasma browser focuses on one of these triggers, information in the form of images and movies is overlaid onto the image in a predetermined way. Digital information is ‘superimposed’ onto what you are seeing through the device’s camera. The big advantage of this optical approach over location-based AR is that you can be precise about the location and it can be used over multiple floors without interference. There was a steep learning curve initially, learning what worked well (formats, sizes, scales) as a trigger and overlay, but after some trial and error, using the software is actually quick and easy. Development forums provided some useful advice, but a thorough introductory ‘best practice’ guide would have been welcome.

I came up with nine possible categories of use for AR and put together a demonstration for each. The focus was on provoking ideas rather than fleshed-out practical applications:

  1. Video demonstration: Pointing a mobile device at the screen of the self-service issue machines automatically plays a video guiding the user through how the machine operates. There is also a button beneath the video labelled ‘Need PIN?’ which, when tapped, takes the user to a website with information on this.
  2. Enhanced publicity/directional map: Pointing a mobile device at a floor plan map (either on a plinth at the library entrance or in hand-held form) overlays a re-coloured map indicating areas that can be tapped. When tapped, a pop-up shows a photo of that location, giving users a ‘virtual tour’ and more information on that area.
  3. Help on a screen-based service: Pointing a mobile device at the Summon discovery tool overlays guidance arrows and notes onto the screen, pointing out where to enter the search, where to refine filters and where to view results.
  4. Virtual bay-ends: Pointing a mobile device at a particular image (perhaps located near catalogue PCs) overlays directional arrows to where resources are located, giving users an initial idea of where to find what they are looking for.
  5. Enhanced instructional guide: Pointing a mobile device at a leaflet about accessing our online resources automatically plays a video with screenshots showing the stages users need to go through. To the right are buttons that can be tapped to call, email or complete a form directly if further help is needed.
  6. Induction/treasure hunt: Students could scan a ‘frame’ placed in an area of the library. Once scanned, a video would play introducing them to that area and how to use it; alongside the video a new question would appear, guiding them to another area to continue the ‘game’.
  7. Enhanced publicity material: Pointing a mobile device at our main library introduction guide enhances it with pictures, videos and extra information beyond what could be included on a physical copy. All telephone numbers, email addresses and hyperlinks are also made into tappable live links.
  8. Staff assistance/reminder: Pointing a mobile device at the borrower registration screen of the LMS we use overlays extra information showing the various fields that need completing. It is designed as a quick check for staff to ensure that registration is completed accurately.
  9. ‘Book Locator’/directional video: Using a mobile device to scan an image near a catalogue PC brings up a virtual table of Dewey ranges, e.g. 000–070. Tapping one of these makes a simple video pop up directing the user from that location to the approximate shelving run. Technically this does not use AR at all, but it was an interesting use of the software.
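The ‘Book Locator’ in point 9 is, at heart, a range lookup: given a Dewey classmark, find the shelving run whose range contains it. A minimal Python sketch of that lookup (the ranges and bay labels here are invented for illustration; real data would come from your own shelf plan):

```python
import bisect

# Hypothetical shelf plan: each bay starts at a Dewey number.
# Start values must be sorted in ascending order for bisect to work.
bay_starts = [0, 100, 300, 500, 700, 900]
bay_labels = ["Bay A (000-099)", "Bay B (100-299)", "Bay C (300-499)",
              "Bay D (500-699)", "Bay E (700-899)", "Bay F (900-999)"]

def locate(dewey_number):
    """Return the bay whose range contains the given Dewey number."""
    # bisect_right finds the first start greater than the number;
    # the bay we want is the one just before that.
    i = bisect.bisect_right(bay_starts, dewey_number) - 1
    return bay_labels[i]

print(locate(551.5))  # meteorology -> "Bay D (500-699)"
```

In a real deployment the result of the lookup would simply select which directional video to play.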

The demonstrations went well and generated some interesting debate amongst my library colleagues. Some brief thoughts after the demonstrations:

  • Point-of-need content – The way triggers work allows them to be highly context-specific; you are essentially just ‘looking’ at the thing you want help with, e.g. a room, a screen or a leaflet. Could there be a future where users simply get used to pointing their device at things to get assistance and extended content?
  • AR vs QR codes – AR feels a lot more immediate than QR codes. Whereas scanning a code sometimes feels like an additional step and takes you away from what you are doing, the extra information from AR is more integrated into your activity. Aurasma allows extra functionality too.
  • Getting library users on board – This is an issue whenever something new is introduced, and some level of training would be required. People have to download the app, subscribe to a particular channel and then know where to scan. Technological improvements may mitigate some of this – for example, Aurasma allows their software to be integrated into an existing app, meaning users would not need to install anything new or subscribe to channels.
  • Ease of development – As described above, the platform is not as intuitive as it might be initially, but after a brief explanation I could see colleagues from across the service creating content; all it takes is some very basic image manipulation. I was creating these rough demos in about 15 minutes. The technical barrier is very low.
  • Range of devices – The demos all worked equally well on the iOS and Android smartphones that I tested. They looked great on larger tablet devices.

Are you currently using augmented reality or planning to do so? Let us know your ideas in the comments.

Mlibs event – Mobile devices in the physical environment in libraries, exhibitions and galleries

This is part of a series of blog posts based on the sessions held at the Mobile technologies in libraries: information sharing event. More resources from the day are available at the event Lanyrd page.

Gary Green

Jason Curtis (Shrewsbury & Telford Hospitals NHS Trust), Gary Green (Surrey County Council), and Peter Kargbo (Manchester Metropolitan University) facilitated a group discussion during the morning breakout sessions on using mobile devices to link the physical to the virtual. Gary had prepared a collection of links on Delicious to help shape the discussion, and ideas were shared by those attending the session too.

Some topics discussed in this session included:

  • AR Apps – scan location to find local areas of interest e.g. restaurant – including Layar and Wikitude
  • Aurasma – AR app to scan book cover & view video reviews
  • Using AR to see videos for exhibitions (e.g. John Rylands)
  • AR issues around the technology, e.g. reflections/lighting
  • How do users know about AR?
  • Rooms tagged to provide further info on room (e.g. British Museum)
  • QR code based quizzes e.g. for school groups
  • Lists of journals with QR codes
  • Problem linking to non-mobile sites
  • Study rooms and booking forms – system for real-time booking
  • How to track usage of QR codes? – Google tracking codes
  • Complex QR codes may be difficult to scan on some devices
  • QR codes on opening hours posters e.g. holidays  – feedback very useful
  • QR codes on catalogues e.g. www.shelib.nhs.uk
  • Concerns over data charges (study at Huddersfield) and use
  • Could use plain text for contact details etc. – how to update
  • Finding out more especially visual search
  • Using QR codes to book computers in FE libraries, give students some independence

The facilitators asked everyone in the group to note down some of the challenges faced in implementing such technologies. The key themes emerging from those notes were:

  • Acceptance of mobile technologies in libraries
  • Lack of awareness of technologies from library staff
  • Lack of awareness of technologies from users – will they need to be trained?
  • Uncertainty about technologies in terms of their suitability for meeting a need – sometimes it can seem like a solution looking for a problem
  • How to apply the use of technologies in local context – which technologies to use, where, how…
  • What skills are needed to implement these technologies?
  • What have other people done? How can we learn from that?
  • What resource is needed to implement? What benefits will we gain?

Mlibs event – Augmented Reality for Special Collections

This is part of a series of blog posts based on the sessions held at the Mobile technologies in libraries: information sharing event. More resources from the day are available at the event Lanyrd page.

Matt Ramirez

Matt Ramirez (Mimas) gave a presentation during the afternoon breakout sessions on the topic of Augmented Reality for Special Collections. The presentation was based on the work of the JISC-funded Special Collections using Augmented Reality to Enhance Learning and Teaching (SCARLET) project. The main points from Matt’s session were recorded on a flipchart:

  • SCARLET project used Junaio app to create their A.R.
  • Students wanted to be able to interact with A.R. models, rather than just being signposted elsewhere
  • Useful tool for enquiry based learning
  • SCARLET toolkit will be available to use
  • Sketchup good for pre-built 3D models
  • No W3C standards for A.R. browsers

A more detailed overview is below, thanks to Pete Dalton:

Matt presented details of the Special Collections using Augmented Reality to Enhance Learning and Teaching (SCARLET) project at the University of Manchester (Mimas), run in collaboration with the John Rylands Library. Through the use of augmented reality on mobile devices, students are able to have more immersive experiences when interacting with rare materials in special collections. While viewing an object first-hand, AR markers and spatial triggers provide access to supporting materials through mobile devices. Through the use of mobile technology, the original object is in effect ‘surrounded’ by additional contextual material to enhance the learning experience.

It was reported that, to date, the AR functionality had been generally well received by students. The project had learned lessons about developing such content, including not underestimating the time it takes to create the surrounding content and the need to get buy-in from all stakeholders. In addition, it was clear that AR should be presented as a unique additional experience and not as an attempt to simply duplicate other experiences.

It was noted that this was a rapidly developing area and that the project had only begun to scratch the surface of what might be possible with AR. New possibilities were opening up all the time, such as the ability for browsers to visually recognise 3D objects.

Matt highlighted the forthcoming Augmented Reality toolkit that the project will produce which can help others to harness AR to support teaching and research.

You can find out more about the project (including news about an extension project, SCARLET+) on the SCARLET project blog. You may also be interested in reading the case study the SCARLET team wrote for our community website, and if you have a smartphone you can sample the AR using the SCARLET demonstrator channel.

What is m-libraries?

Now that we’ve covered what we mean by mobile, it’s time to consider the concept of m-libraries*. In a nutshell:

Mobile devices + libraries = m-libraries

You may have heard of the successful m-libraries and Handheld Librarian conferences, or read some of their conference proceedings or related blog posts. The scope for m-libraries is vast – basically any initiative that enables the use of mobile devices in libraries could be included under this umbrella. This could include (though is not limited to):

  • Accessing library content via mobile devices (e.g. e-books, e-journals, special collections)
  • Using SMS to support enquiries or provide information to users
  • Developing a mobile interface for a library website or library catalogue
  • Using QR codes around the library to link to electronic content accessible by mobile devices
  • Staff using mobile devices within the library to support roving enquiries
  • Developing a dedicated mobile app to provide library content to users
  • Utilising augmented reality within the library (e.g. special collections) using cameras on mobile devices
  • Using mobile devices to interact with the library (e.g. renewing books, checking in on location services, doing tasks via mobile devices for points/rewards)

As the m-library support project is JISC-funded, our primary focus is academic libraries in the UK, though we are also interested in innovative projects further afield from which we could learn (e.g. different types of libraries, or academic libraries outside the UK).

We’ll be sharing relevant information and links we find on the blog as well as gathering case studies to develop a web presence to share these examples. If you have any examples you think might be relevant for us to examine as part of the project, please submit an example via our form.


*The project’s official title is the mobile library support project but owing to the existing mobile library concept (i.e. library on wheels!) we chose to adopt the m-library name.