A Usability Study of a Language Centre Web Site

Abstract

Submitted by Andrew J. Morrall for the degree of Master of Science in Information Technology in Education at the University of Hong Kong, September, 2002.

A usability study of the Internet site of the Centre for Independent Language Learning (CILL), part of the English Language Centre (ELC) at the Hong Kong Polytechnic University (HKPU), was carried out in June and July 2002. The purpose of the research was to investigate whether the technique of ‘Discount Usability Testing’ advocated by Nielsen (1994, pp. 245-272) is effective in improving the usability of the CILL Internet site, given the technique’s departures from the ‘classical’ experimental approach detailed in Rubin (1994), such as its use of a small number of test subjects.

This overall question was divided into three. Firstly, do the improvements made to the site during the usability study raise test participants’ success rate in completing tasks on the CILL site? Secondly, do these improvements enable participants to carry out the tasks in a shorter time? Finally, do participants say that they find the site easier to use?

In the research, CILL members were surveyed by questionnaire about how they use the CILL site. From this information, ten test tasks were developed. Ten test participants, a mixture of CILL students and ELC teachers and including both native speakers of English and native speakers of Chinese, then attempted these tasks. The tests were observed, and notes were taken on usability problems with the site, on whether the participant completed each task successfully, and on how long the participant took to complete it.

Nineteen modifications were then made to the site to try to eliminate the usability problems, and further tests were conducted using the same test tasks with two groups of participants. The first group consisted of five participants who had taken part in the first round of testing; the second consisted of eight participants who had not. As before, these tests were observed, and notes were taken on usability problems with the site, on whether the participant completed each task successfully, and on how long the participant took to complete it.

The small number of test participants made it difficult to answer research questions one and two reliably. Within this limitation, however, the statistics indicate an improvement in task success rate and a reduction in task completion time. For research question three, of the five participants who re-took the test, one strongly agreed that the site was easier to use, one agreed, and three neither agreed nor disagreed. Thus, on these criteria, and with these limitations, ‘Discount Usability Testing’ was an effective methodology for improving the usability of the CILL Internet site.
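To make the first two measures concrete, the following is a minimal Python sketch, using entirely hypothetical per-task results rather than the study’s data, of how task success rate and completion time might be summarised for each round of testing; the dissertation itself does not specify any analysis code.

```python
# Illustrative sketch only: hypothetical results, not the study's data.
# Each entry is (task completed?, seconds taken), one entry per attempted task.
from statistics import mean

round_1 = [(True, 95), (False, 180), (True, 120), (False, 180), (True, 60)]
round_2 = [(True, 70), (True, 110), (True, 85), (False, 180), (True, 50)]

def summarise(results):
    """Return the success rate and the mean time of successful attempts."""
    successful_times = [seconds for completed, seconds in results if completed]
    success_rate = len(successful_times) / len(results)
    return success_rate, mean(successful_times)

for label, data in (("Round 1", round_1), ("Round 2", round_2)):
    rate, avg_time = summarise(data)
    print(f"{label}: success rate {rate:.0%}, mean completion time {avg_time:.0f}s")
```

Comparing these per-round summaries is one simple way to see whether the nineteen modifications were accompanied by a higher success rate and shorter completion times, bearing in mind that with so few participants the differences cannot be treated as statistically reliable.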

Further research is recommended into the use of more authentic tasks, such as a text correction task; the effects of testers and participants reading the task instructions aloud; the effect of participants’ level of computing skill on their task success rate and completion time; and the optimum number of participants to test between site design iterations.

In addition, as the aims, users and content of educational web sites vary from site to site, it is recommended that the webmasters of such sites carry out their own usability studies, both to investigate the effectiveness of this methodology and to find out how their sites could be improved.

References:

Nielsen, J. (1994). Guerrilla HCI: Using discount usability engineering to penetrate the intimidation barrier. In Bias, R. G. & Mayhew, D. (Eds.), Cost-justifying usability (pp. 245-272). Boston: Academic Press.

Rubin, J. (1994). Handbook of usability testing. New York, NY: John Wiley & Sons.
