Monday, 3 October 2011

Lessons Learned

We have learned several lessons about the practicalities of conducting usability testing and product development. The timing of our project, running across the Summer vacation, meant that representative stakeholders were in short supply, and retaining them through the life of the project was also a challenge. This mattered because of the subject of the project, the qualitative methodology we employed, and its reliance on customers to lead the rapid development phase. Equally, the Summer presented a challenge to the focus of the project team, who were all scheduled to take annual leave at different points. Here we had to ensure that leave periods overlapped as little as possible and that handovers and project meetings through the project phases were detailed, enabling team members to pick up where others left off. Institutional lessons were learnt regarding lead times for equipment and ordering processes for short-term projects where quick turnarounds are required.

This project has been an affirmation for the Project Team, if one were needed, of the importance of usability in product development for customer-focussed applications. Using a range of usability methodologies, rather than relying on a single tool for guidance, was useful in confirming the areas to focus on in development. Usability tools such as wireframing and paper prototyping gave us a very quick way of testing ideas and changes with customers, avoiding the need for lengthy development and programming work. Here we found that using Balsamiq software, along with some basic Flash ActionScript, helped bring a concept to life for the customers whose feedback we were seeking.

Overall, we strongly feel that it is important to remain focussed on what you originally intended to develop. Our qualitative feedback is extremely rich, and at times there was an overwhelming temptation to be pulled off onto interesting tangents. Those tangents can still be explored, and the feedback is equally valid and important, but the focus must remain set on the area you have highlighted for initial development.

Having attended the Usability and Learnability workshop, I was encouraged to see the path the UsabilityUK project is taking. The support project would have been a crucial resource for us had it been operational at the time we were pulling our plan together. I have no doubt that future projects will look to UsabilityUK for guidance and support.

In all this, it is encouraging that the customer remains the focus. If we are developing applications for customer use, surely that customer needs to be a central consideration in their design and function, and to have a role in the design? Usability gives us the tools to place the customer at the heart of development.

How successful has the project been?

Our success measures for the project were qualitative, focussing on the results of usability testing pre- and post-development and also on the range of usability measures used in the project.

1. A mobile interface is developed for Locate with increased usability over the desktop version when used on a mobile device. This will be measured by comparison of baseline testing with summative testing following the rapid development phase.

As mentioned in our previous post, we were unable to develop a production version of LOCATE as part of this project; however, we were able to generate mock-ups and wireframes that tested developments identified from the baseline testing. Summative feedback, and feedback received during development, indicated that the relative usability of the planned mobile version had improved over the desktop model.

It was clear that customer expectation on a mobile device was different from that at a desktop. Tablet devices were more ‘forgiving’ in translating the desktop version of LOCATE; handheld devices such as iPhones and Android phones were clearly where the desktop interface stumbled, due to the difference in screen sizes.
From testing, it was obvious that significant work would be required to pare down what is displayed in the limited screen real estate in order to deliver to the customer the information they require in an easily readable, interactive display. Large commercial companies, such as eBay and Amazon, have developed OS-specific applications to deal with the tension between usability and screen size. It is to commercial deployments such as these that we can look for ideas on how to create our own service.
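As a rough illustration only (not part of LOCATE itself), here is a minimal sketch of one way a web interface could pare down what it shows on small screens, using the standard window.matchMedia API; the breakpoint, element id and class names are assumptions for the sake of the example:

```typescript
// Minimal sketch: switch a results list between a compact and a full layout
// depending on viewport width. The 600px breakpoint, "search-results" id and
// CSS class names are illustrative assumptions, not taken from LOCATE.
const smallScreen = window.matchMedia("(max-width: 600px)");

function applyLayout(mq: MediaQueryList | MediaQueryListEvent): void {
  const results = document.getElementById("search-results");
  if (!results) return;
  // On small screens show only title and availability; on larger screens
  // the full record display can be used.
  results.classList.toggle("compact-view", mq.matches);
  results.classList.toggle("full-view", !mq.matches);
}

applyLayout(smallScreen);
// Re-apply whenever the viewport crosses the breakpoint (e.g. on rotation).
smallScreen.addEventListener("change", applyLayout);
```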

A broader question, which this project does not seek to answer, is whether customers actually do want to view Library catalogue data on their mobile device. Stakeholder feedback during testing appears to confirm that they can indeed see a use for it, but we should ensure that we are being customer-led in developing a service they will ‘need’ to enhance their study and research, and not technology-led in investing resources in developing a service either because we can or because we feel we should in order to remain relevant.

2. Case study utilises a range of usability testing methods focussing on practical usage in a rapid development environment. We will report on the appropriateness and relative value of the approaches taken in achieving the end result as development and testing progresses.

Throughout the project we have utilised a number of recognised usability methodologies. These included:

• Personas
• Cognitive walkthrough
• Focus groups
• Paper prototyping
• Wireframing
• Guerrilla testing
• Contextual analysis

The blog entries we have created show how we used each of these and the results that were obtained. Although we were not using the skills of a recognised usability expert, we were fortunate to be able to draw on the expertise of colleagues in our e-learning team, who have practical knowledge and awareness of this area and were invaluable in supporting our project as it progressed. Clearly, access to a usability expert would also have been valuable, had resources been available.

When looking at the methods we used in this project, we feel that they were both appropriate and appropriately employed; however, it is clear that the availability of a resource such as the one being developed by the Usability Support Project (UsabilityUK) in Strand A would have been valuable to us, both at the project definition stage and throughout development and testing.

LocateME Project Summary - Recap

What did the LocateME project set out to achieve? Our goal and related objectives were two-fold, with comments on our achievements set out below:

1. Product Development: To enhance our resource discovery environment (LOCATE) enabling it to operate across a range of devices.

a. Assessing usability of current desktop version of LOCATE
b. Developing a device independent mobile view of LOCATE
c. Evaluation to determine if experience has improved

This was an ambitious goal, and we realised at an early stage of development that a full production version of a mobile environment for LOCATE would be beyond the timescale or resources of the project. Producing a mobile ‘App’ or device-specific web view would require significant developer time and skill. We focussed instead on creating mock-ups for development purposes, with the intention of implementing the improvements over a longer timescale. We were, however, able to carry out live baseline testing on the desktop version of LOCATE across a wide range of devices.

This assessment of the current usability of the desktop version of LOCATE has been extremely useful and provided very good qualitative data for us to draw upon in development. Developing a mock-up device-independent view was important in testing out development issues extracted from the baseline review. We were also able to activate and explore a live mobile version of Locate that was in the early stages of development. Our summative evaluation with selected stakeholders was of mixed success, with some comments and feedback relating to matters outside the control and design of the interface. In general, the stakeholders we questioned thought that the interface had improved; however, a working live version would be required to test this further.

2. Usability: To provide a usability case study of benefit to the wider community

Again, this was to be achieved by:
a. Trialling usability testing approaches applied to mobile device interface development
b. Providing an evaluative case study of the approaches used.

This blog and related documents provide evidence of our efforts to achieve this objective. We have utilised a range of recognised usability methods and have evidenced and documented our approach in order to inform the wider community. We believe we have fully achieved the key objectives for this goal.

Thursday, 29 September 2011

Summary of Contextual Inquiry Results

The contextual inquiry took place over the week of August 15-19.

The original plan was to recruit 4 students as testers, chosen to match four general "profiles" developed for the purpose of usability testing and to cover a range of user capabilities. The four profiles were:

EXEX - expert at mobile device, expert at library services
NONO - novice at mobile device, novice at library services
EXNO - expert at mobile device, novice at library services
NOEX - novice at mobile device, expert at library services

The students were sorted into these profiles based on the self-reported survey and questionnaire data.
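For illustration only, a minimal sketch of how the two self-reported dimensions could be mapped onto the four profile codes; the field names and levels are assumptions for the example, not the project's actual survey fields:

```typescript
// Hypothetical mapping from self-reported survey answers to a profile code.
type Level = "expert" | "novice";

interface SurveyResponse {
  deviceLevel: Level;   // self-reported mobile device experience
  libraryLevel: Level;  // self-reported familiarity with library services
}

function profileFor(r: SurveyResponse): string {
  const device = r.deviceLevel === "expert" ? "EX" : "NO";
  const library = r.libraryLevel === "expert" ? "EX" : "NO";
  return device + library; // e.g. "EXNO"
}

// Example: an expert device user who is new to library services -> "EXNO"
console.log(profileFor({ deviceLevel: "expert", libraryLevel: "novice" }));
```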

Four students were signed up, but two did not show up to the initial device checkout appointment. A third student was brought in ad hoc to try to make up some of the difference, but a fourth was not found in time.

I have assembled the relevant profile data for each contextual inquiry tester and made them anonymous: Profile Tester 1 | Profile Tester 2 | Profile Tester 3

The devices were checked out at the Monday session and students were asked to sign consent and release forms. They were also briefed on their tasks and desired focus for testing, outlined in the Student Tasks (final) document.

Over the course of the week, Amanda Hardy checked in with the Moodle module and posted several loose prompts for feedback. The students developed their own reporting format and left some comments in that way, which were collected and brought to the final interview session.

The final interview session was loose and relatively conversational, but followed the general themes in the Individual Interviews Notes document.

Generally speaking, there was very little contextual feedback even with prompting and direct questioning; most of what the testers commented on were functionality issues. The full list of derived issues and contextually significant data, as well as some thoughts as to why so little was found, is collected in the Contextual Inquiry Results document.

Summary of Final Testing - Functionality Aspects

This post summarises the key issues and comments about functionality that came from the final testing of the full, mobile and mock-up versions of Locate. Testers were given iPads, iPhones and Android phones over a period of four days to test Locate on the devices and were asked to contribute to a Moodle discussion board as well as attend a Focus Group to discuss their findings. The Focus Groups were recorded and the results below come from those recordings. A further post regarding use of the devices in different environments will follow separately.


Summary of Testing

The testers were asked to look at the 3 different versions of Locate (full web version, basic mobile version, and the mock-up created on the basis of the earlier testing) and to test how well they worked on various devices: iPad, iPhone and HTC (Android). Testers were asked to complete similar tasks to those completed in the earlier testing to look at functionality, usability, colour-schemes and uses of Locate in various locations.


Comments and Issues arising from Locate and Mobile Locate Testing

·         Mobile version of Locate very limited (some testers would prefer to use full version even though there are zooming/scrolling issues).

·         One tester preferred the mobile version but would like some features added (‘simpler the better’).

·         Testers were confused by having the section descriptions available on the mobile version (e.g. Books, Journals & Media, Article Search, Subject Databases etc) – these need to be hyperlinked if they are going to be available (will mobile Locate have these options available??).

·         One tester suggested having information appear only as you needed it (collapsible menus, sliding etc) so that all the information is not immediately in front of you but is still accessible.

·         Mobile version does not give enough information about availability – want to know how many copies are available – frustrated that couldn’t click on book to find information. One tester expected the Available at text to be a hyperlink – also recommended red text for not available and green for available (as already used in Locate full version).

·         One tester noticed a large white space after the end of the search bar in the mobile version which looked a bit odd – perhaps extend the search bar to reach across the page? (Would depend on whether the device is held horizontally or vertically?)

·         One tester suggested having an auto-detect function that would recognise which type of device you are using and take you to the appropriate version – however, they also mentioned the need for a link between the two versions so that users can move between the two (see the sketch after this list).

·         Issues around zooming in and out on various devices (maybe need to test these to get a definitive idea of which ones are working and which ones aren’t??).

·         In the mobile version the ‘Previous’ and ‘Next’ buttons are stuck in the corner and are very close together – difficult to click on the right one.  

·         In the full version the ‘Editions’ link is very far off to the right and it is possible that users won’t see it at all (need to check the mobile version re. the ‘Editions’ link – the tester didn’t try this).

·         Functionality mentioned that would be useful:
o   Availability information (no. of copies etc)
o   Requesting a book
o   Looking at account info (and paying fines – Paypal)

·         Issues surrounding linking to other library resources e.g. EBSCO (outside of library control) are seen as library issues. Also issues around logging in to non-Shibboleth resources. (iPads easier than phone screens)

·         Issues with using mobile phones to read e-books (scrolling left-right turns pages rather than moving across the page).
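Referring to the auto-detect suggestion above, here is a minimal hedged sketch of how a page could route users to the mobile or full view while still letting them switch manually; the URLs, cookie name and user-agent patterns are illustrative assumptions, not LOCATE's actual implementation:

```typescript
// Hypothetical sketch: redirect handheld devices to a mobile view unless the
// user has explicitly chosen the full version.
const MOBILE_URL = "/mobile";              // assumed location of the mobile view
const PREFER_FULL_COOKIE = "preferFullSite=1"; // assumed cookie set by a "View full site" link

function looksLikeHandheld(userAgent: string): boolean {
  // Simple heuristic; real deployments usually rely on a maintained device list.
  return /iPhone|Android.*Mobile|Windows Phone|BlackBerry/i.test(userAgent);
}

function maybeRedirectToMobile(): void {
  const prefersFull = document.cookie.includes(PREFER_FULL_COOKIE);
  const onMobileView = window.location.pathname.startsWith(MOBILE_URL);
  if (looksLikeHandheld(navigator.userAgent) && !prefersFull && !onMobileView) {
    window.location.assign(MOBILE_URL);
  }
}

maybeRedirectToMobile();
// A "View full site" link on the mobile pages would set the cookie so users
// can move between the two versions, as the testers requested.
```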



LocateMe Mock-Ups

It is possible that there were some misunderstandings about the purpose of the Mock-Ups in the testing. Some testers were confused about what they were supposed to be looking for or testing. Possibly because of the time lapse between the original testing and the final testing, some testers may have forgotten the issues that they deemed most important at the beginning of the project – and that these were the things we had worked on in creating the mock-ups.

Some comments:

·         Suggestion to change Login to My Account (tester: “login for what?”)
·         Suggestion that there may be ‘surface target’ issues with the drop down menu (e.g. books, fines, requests etc) especially on small screens.
·         Also noted the search bar still does not go all the way to the end – suggestion to replace with magnifying glass to save horizontal space.
·         One tester received an error when testing the mock-up (iPhone).

Tuesday, 27 September 2011

LocateME: Creating a virtual mock-up.


Methods used: Paper-based wireframing, Balsamiq wireframing software (mock-ups), Guerrilla usability testing, and Adobe Flash ActionScript.

After analysing the comments regarding the use of Locate on mobile devices from the feedback sheets, we identified one of the key functionalities* and began working towards creating a ‘virtual mobile environment’ that would incorporate the identified component effectively within the Locate mobile interface.

 *Getting an option to login to My Account – primarily to renew books and see books out on loan (renewing higher priority than requesting).

Paper based wireframing.
We created two separate walkthrough scenarios which demonstrated how the Locate mobile environment could display the ‘My Account’ functionality within the web application.  The initial walkthrough mock-ups were created using paper, pen and post-it notes.

A screen copy of our current mobile environment was captured and printed.  We tailored the screen captures and positioned the ‘My Account’ option in several places throughout the walkthrough.

Scenario 1: Access ‘My Account’ from the initial home screen via the ‘Sign in’ option.
Scenario 2: Access ‘My Account’ during or after executing a search.

Guerrilla usability testing.
We randomly approached 30 library users and asked each of them if we could have 5-10 minutes of their time, briefly explaining what we were trying to achieve.  Each participant was handed the first page of the wireframe walkthrough and asked to access their ‘My Account’ information from the mobile environment mock-up.  We then presented each participant with the next paper mock-up according to which option and direction they took.

All comments were noted throughout the process, and the participants were also each given the chance to offer additional feedback at the end of the session.

Wireframe screen captures with Post-it notes:

Balsamiq development.
The findings from the initial paper-based wireframe guerrilla usability testing were subsequently fed back into the design process using the ‘Balsamiq’ wireframing software.  The two scenarios were mapped out using Balsamiq, fine-tuned and printed as graphical printouts.  We then ran the previous tests again, this time with a different random selection of library users; again, each participant was allowed to determine their own way through the task.



Scenario 1: Access ‘My Account’ from the initial home screen via the ‘Sign in’ option:

Scenario 2: Access ‘My Account’ during or after executing a search:
 
 
Key Comments:

(1) “I normally expect to click on a toolbar at the top, or bottom.  The login option is in the right place.”

(2) “The web pages are simple and easy to get around.”

(3) “Smooth, logical and easy to use.”

(5) “Can the font be different colours, if you are logged in?”

(6) “The layout is intuitive and fresh.”

(7) “The information on the screen is not overwhelming.”

(8) “Is the ‘My Account’ link above the item just for that item?”

(9) “The word login is old fashioned.  Sign in is the normal option.”

(10) “It is very much like Sainsbury’s. It works like other similar mobile search pages.”




Flash and ActionScript: Virtual mobile device.

We took our findings from the paper-based and Balsamiq wireframe walkthroughs and used the information to create a virtual mobile walkthrough that demonstrates the two predetermined routes that allow the user to obtain their ‘My Account’ information.
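The project build itself used Flash ActionScript; purely to illustrate the underlying idea, here is a hedged sketch of the same click-through logic written in TypeScript, where each mock-up image is swapped for the next according to which hotspot the user taps. Screen names, routes, element id and image paths are assumptions based on the two scenarios above, not the project's code:

```typescript
// Minimal click-through prototype: a map from (screen, hotspot) to next screen.
type Screen = "home" | "signIn" | "myAccount" | "searchResults";

const transitions: Record<Screen, Partial<Record<string, Screen>>> = {
  home:          { signIn: "signIn", search: "searchResults" },
  signIn:        { submit: "myAccount" },
  searchResults: { myAccount: "myAccount" }, // Scenario 2: reach My Account after a search
  myAccount:     {},
};

let current: Screen = "home";

function show(screen: Screen): void {
  current = screen;
  // Swap the displayed mock-up image; element id and image paths are assumed.
  const img = document.getElementById("mockup") as HTMLImageElement | null;
  if (img) img.src = `screens/${screen}.png`;
}

function tap(hotspot: string): void {
  const next = transitions[current][hotspot];
  if (next) show(next);
}

// Scenario 1: Home -> Sign in -> My Account
tap("signIn");
tap("submit");
```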

Adobe Flash: Adding user interaction:

Adobe Flash: Adding 'My Account' functionality:

You can view the final virtual mobile interactive environment by clicking on the link below:


IMPORTANT: Flash version 9 or above is required to view the interactive mobile environment.
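For anyone embedding the walkthrough elsewhere, a hedged sketch of how such a requirement could be checked before showing the Flash content; this assumes the open-source SWFObject library is loaded on the page and is not part of the project's own code (the element id is also an assumption):

```typescript
// Assumes the SWFObject library is available on the page.
declare const swfobject: { hasFlashPlayerVersion(version: string): boolean };

const container = document.getElementById("mobile-walkthrough"); // assumed element id
if (container && !swfobject.hasFlashPlayerVersion("9.0.0")) {
  // Show a fallback message instead of the interactive walkthrough.
  container.textContent =
    "Flash Player 9 or above is required to view the interactive mobile environment.";
}
```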

Tuesday, 20 September 2011

The initial stages of testing Locate on smart phones and tablet devices.


The initial walkthroughs by the instructional designers and learning technologist, using the latest mobile devices, allowed us to evaluate the interface and identify specific issues for the development team. The group analysis was recorded and made available to the development team for further discussion in a scheduled LocateMe meeting.

A detailed report overview of the analysis videos can be accessed online: The five cognitive walkthrough reports.

The first video analysed Locate using the iPad on the university WiFi network – in particular its feature of zooming in and out:






The second video included in the blog uses the iPhone. The task (on the 3G network) was to search for a book (Operations Management) in the main university collection and find where it was located:




The final video to be referenced makes use of an ebook on the iPhone (using the 3G network):

Thursday, 14 July 2011

Usability Testing update - June (Stage 1)

The first task undertaken for the usability testing was the development of an initial recruitment questionnaire (Kirsty Kift), which was placed on SurveyMonkey and distributed via an online announcement to students. In addition to being a recruitment tool for Stage 1 testing, the questionnaire was used to establish a rough baseline of students' familiarity with library services and various mobile devices, and whether they had access to wireless internet at home (a requirement for Stage 2). We received over 40 responses, although several respondents did not leave contact information, making it impossible to recruit them.


Of those remaining, approximately 20 were recruited for the Stage 1 rapid usability analysis (identified by Kirsty, contacted by Sue Adcock). They were asked to sign up for one of several time slots for "testing groups" of 4-6. Before these groups met, Juliet Hinrichsen, Amanda Hardy, and Paul Grove developed a plan for a task-based cognitive walkthrough of the Locate interface using their own devices to identify any early issues.

Using the sample personas from the "JISC Usability Studies Final" document and the user tasks from the initial Locate development project, we assigned Helen (Amanda) to the status of "novice device user, novice library services user" and Monica (Paul) to the status of "expert device user, expert library services user" to suit both the descriptions in the personas and the experience of the people enacting the walkthrough.

Videos of the tasks were recorded and made available to the project's developer. Samples may be uploaded to a YouTube channel if there is demand.

The walkthroughs (and subsequent reports) were useful. They identified many early issues with the existing website's performance on the target mobile devices and flagged items for consideration in development. For example, not all of Locate's webpages could be "zoomed" on the Apple devices, which made using those pages difficult. Interface elements (such as "view in new window" buttons) were not always present on the HTC. Several accessibility notes were made as well. Please see the reports for the complete information.


After the cognitive walkthroughs were completed, the rapid usability analysis "testing groups" took place. Students gathered from the initial SurveyMonkey questionnaire respondents (in groups of 4-6) were asked to bring and use their own mobile devices (as a convenient solution to delayed procurement of project devices). Testing groups were led by Library/eLU staff in pairs (Kirsty, Theresa Morley, Paul, Amanda). At the testing groups, students were asked to complete a "baseline survey"(Amanda/Kirsty) to determine some demographic and experiential data. We intend to administer this survey to the remaining participants at the end of the project to see whether some of their usage habits have changed. These surveys were also used to help identify potential persona assignments for those students willing to participate in Stage 2.


The testing group activity was in three parts: completion of basic library tasks using the website (to establish familiarity); completion of the same basic library tasks on the user's mobile device (with feedback for the Library developers on this stage written on a handout); and a brief "focus group" at the end to discuss their experience, feedback and any issues as a group. Focus group feedback was collected by the facilitators and typed up.


It was useful to talk directly to the students about their experience and what they might (or might not) like to do with the mobile version of the website. Some of their suggestions the project members had anticipated (the ability to renew books) and others we had not (a link to the university's new managed printing system). Some of the suggestions are likely to be technologically complex to implement, but all suggestions were collected and will be evaluated for priority and feasibility.


And a summary of the biggest priorities, as provided by Theresa:
The following seem to be the biggest priorities to come out of last week's testing:

· Getting a simple search screen and results screen that limits scrolling/zooming (with clear availability info)
o Users frustrated with slowness at times – simpler info = quicker?
· Getting an option to login to My Account – primarily to renew books and see books out on loan (renewing higher priority than requesting).
o Lots of login issues on mobile devices that caused frustrations
Other issues that could be included in future development:
· Opening online resources in a new window automatically (SFX, ebooks etc)
· Students would like to be able to search for articles and add them/save them for future reference (Article search was very slow/froze up in testing)
· Some students mentioned a way to get help (icons, tips etc) – Ask A Librarian link?
· Viewing loan history (linked with My Account)
· Paying fines / receiving overdue notices

Thursday, 30 June 2011

Risk Analysis

Risk | Probability | Severity | Action to prevent
Staffing:
  Loss of key personnel | 2 | 4 | Build / shadow expertise across team
Organisational:
  Availability of stakeholders for testing | 3 | 3 | Effective project planning; offer of incentives; effective communication
Technical:
  Procurement process and timely availability of hardware | 2 | 4 | Effective project planning and agreement with purchasing dept.
  Dependency on Balsamiq software | 2 | 2 | Build familiarity asap; develop options to proceed without Balsamiq if found inappropriate
External suppliers:
  Dependency on supplier support | 1 | 2 | Effective communication with Ex Libris

Project budget

LocateME Project Budget








Project Timeline





Project Team

Phil Brabban is Assistant Director (Public Services and Service Development) at Coventry University Lanchester Library. He has led on many projects in the areas of service enhancement and efficiency, most recently managing the implementation of a complete RFID circulation system for the Library. He is currently the Library's lead project member on University level projects to implement Smartcards and cashless trading across the campus. Phil has previously worked at Durham University as Systems Librarian overseeing the management and development of the University's Library Management system, including resource discovery tools.

Theresa Morley is a professionally qualified librarian with several years’ experience supporting Health & Life Sciences courses. As a member of the project group responsible for the implementation of Primo, she carried out the usability study which informed development of the interface as part of JISC project 12/09; she was also responsible for co-ordinating the focus group which informed the design of the home page. She has continued to play a key role in liaising between technical development staff and other stakeholders during the evolution of the product.

Kirsty Kift is a chartered librarian who has several years’ experience as the Engineering Librarian at Coventry University. She has a particular interest in student support and the use of technologies to enhance the student experience. She contributed to an internal usability study which investigated students’ use of the eLibrary and catalogue.

Paul Smith is Library Systems Manager at Coventry University Lanchester Library and has a strong record of using our current portfolio of products to deliver scalable solutions to the university’s staff and students. His graphic design background has proved invaluable in previous projects with a focus on the end-user interface.

Juliet Hinrichsen is an Instructional Designer with extensive experience of technology enhanced learning and academic support. She has developed tools and resources for a wide range of learners and has research interests in design for conceptual scaffolding using principles of cognitive ergonomics. She led the successful ELTAC Benefits Realisation (BR) project on lecture capture and has had workpackage leader roles in various JISC projects including ELTAC: Enhancing Lectures Through Automated Capture (Institutional Exemplars); OCEP: Open Content Employability Project (OER); Location Independent Working (institutional innovation); Pathfinder (transition to HE); EnCoRe (DiVLE programme).

Aims and Objectives: LocateME usability case study

Aims and objectives
Goal: To enhance the Library’s integrated resource discovery environment, locally branded as LOCATE, enabling it to operate effectively across a range of mobile devices. The usability analyses undertaken through this project will provide an evidence informed case study of the development of benefit to the wider community.

To successfully meet this goal, the project has a number of objectives which fall into two broad groups: those relating to product enhancement and those relating to the process of usability testing.

Product Development
i. Assess the usability of the current desktop version of LOCATE as accessed via the selected mobile devices and experienced by different stakeholder groups
ii. Develop a device independent mobile view for LOCATE as informed by the findings of usability testing
iii. Evaluate the use of the new interface to assess whether user experience has improved

Usability
iv. Trial and evaluate approaches to usability testing as applied to mobile devices
v. Provide an evaluative case study to JISC USP focussed on the use and usability of mobile devices

Success Measures
i. A mobile interface is developed for Locate with increased usability over the desktop version when used on a mobile device. This will be measured by comparison of baseline testing with summative testing following the rapid development phase.

ii. Case study utilises a range of usability testing methods focussing on practical usage in a rapid development environment. We will report on the appropriateness and relative value of the approaches taken in achieving the end result as development and testing progresses

Our focus in delivering LOCATE to our users is to provide them with an effective service that is ‘usable’ on a variety of devices. By measuring the relative increase in usability from the desktop version to a prototype mobile version, we ensure that the development is customer-driven. Ultimately, the user is the arbiter of our success in this regard. In order to support achievement of this goal, we need to carefully select appropriate usability testing approaches that will maximise the benefits of a user-focussed development.

We will baseline current performance by conducting a ‘cognitive walkthrough’, based on a set of agreed personas, in addition to conducting group testing during the initial phase. Summative testing following development will be used to compare usability of the service post-development and thus provide a measure of success.

The measures outlined focus on the user experience of the site on a mobile device. This will demonstrate effectiveness and value of the development of specific mobile environments and the impact they have on the users.

An alternative would have been to take a purely persona-based approach to the baseline and development phases; however, this would have led to the team making ‘assumptions’ about how we feel users would use the site, without supporting evidence from the users themselves.

A further alternative would have been to carry out a comparative analysis of the features of sites considered to have successful mobile environments and to identify common elements where their mobile interface differs from their standard desktop offering. Development could then take place by similarly identifying where like elements in LOCATE could be suitably adjusted or changed. Although there is considerable value in such a scoping exercise, it could very quickly become time-consuming and superfluous given that richer data can be sought from the users themselves.