Usability and user experience surveys


Draft

Introduction

According to Perlman (2009), “Questionnaires have long been used to evaluate user interfaces (Root & Draper, 1983). Questionnaires have also long been used in electronic form (Perlman, 1985). For a handful of questionnaires specifically designed to assess aspects of usability, the validity and/or reliability have been established, including some in the [table below].” (retrieved 20:57, 14 March 2011 (CET))

See also: learning surveys

List of web usability questionnaires

We have not yet found any questionnaires designed specifically for web usability. Below we list generic usability survey instruments that can be adapted to specific websites. Often it is sufficient to replace the word "system" with "web site"; see, for example, the SUS adaptation presented below.

List of usability and user experience questionnaires

User Interface Usability Evaluation with Web-Based Questionnaires

Author: Gary Perlman (2009)

Available through the User Interface Usability Evaluation with Web-Based Questionnaires page, either as an online interface or as a set of Perl scripts that you can install on your own server (also available as an online service at hcibib.org).

The service is a customizable web-based Perl CGI script that administers a few "standard" user interface evaluation questionnaires and collects the responses. The questionnaires can be applied to web sites, but also to other software.

Online service: http://hcibib.org/perlman/question.cgi. It will send results by email.

Before clicking the link above or the links below, go to the original page at hcibib, scroll down, and configure the questionnaire, i.e.:

  • customize the system name, administrator email, etc.
  • customize the rating scale (number of points, labels, ...)
  • customize the number of open-ended positive/negative comments requested
  • select the questionnaire

For reference, the table from the original page is reproduced below; on the original page each entry links to the corresponding questionnaire.

Acronym | Instrument | Reference | Institution | Example
QUIS | Questionnaire for User Interface Satisfaction | Chin et al., 1988 | Maryland | 27 questions
PUEU | Perceived Usefulness and Ease of Use | Davis, 1989 | IBM | 12 questions
NAU | Nielsen's Attributes of Usability | Nielsen, 1993 | Bellcore | 5 attributes
NHE | Nielsen's Heuristic Evaluation | Nielsen, 1993 | Bellcore | 10 heuristics
CSUQ | Computer System Usability Questionnaire | Lewis, 1995 | IBM | 19 questions
ASQ | After Scenario Questionnaire | Lewis, 1995 | IBM | 3 questions
PHUE | Practical Heuristics for Usability Evaluation | Perlman, 1997 | OSU | 13 heuristics
PUTQ | Purdue Usability Testing Questionnaire | Lin et al., 1997 | Purdue | 100 questions
USE | USE Questionnaire | Lund, 2001 | Sapient | 30 questions


This page seems to be the best starting point for exploring well-known web-based usability evaluation questionnaires.

Purdue Usability Testing Questionnaire (PUTQ)

Reference: Lin, H. X., Choong, Y.-Y., & Salvendy, G. (1997). A proposed index of usability: A method for comparing the relative usability of different software systems. Behaviour and Information Technology, 16(4/5), 267-278.

The list is available through http://hcibib.org. Both the questionnaire and the answer sheets may be reproduced without permission, provided that the copyright notice is reproduced.

Measuring Usability with the USE Questionnaire

Reference: Lund, A. M. (2001). Measuring usability with the USE Questionnaire. STC Usability SIG Newsletter, 8(2), October 2001.

Available: Measuring Usability with the USE Questionnaire

The questionnaire was developed over time, starting from a large pool of items. “The questionnaires were constructed as seven-point Likert rating scales. Users were asked to rate agreement with the statements, ranging from strongly disagree to strongly agree. Various forms of the questionnaires were used to evaluate user attitudes towards a variety of consumer products. Factor analyses following each study suggested that users were evaluating the products primarily using three dimensions, Usefulness, Satisfaction, and Ease of Use.”

The items are rated on a seven-point Likert scale, e.g. from -3 (totally disagree) to +3 (totally agree); a simple scoring sketch follows the item list below.

Usefulness
It helps me be more effective.
It helps me be more productive.
It is useful.
It gives me more control over the activities in my life.
It makes the things I want to accomplish easier to get done.
It saves me time when I use it.
It meets my needs.
It does everything I would expect it to do.
Ease of Use
It is easy to use.
It is simple to use.
It is user friendly.
It requires the fewest steps possible to accomplish what I want to do with it.
It is flexible.
Using it is effortless.
I can use it without written instructions.
I don't notice any inconsistencies as I use it.
Both occasional and regular users would like it.
I can recover from mistakes quickly and easily.
I can use it successfully every time.
Ease of Learning
I learned to use it quickly.
I easily remember how to use it.
It is easy to learn to use it.
I quickly became skillful with it.
Satisfaction
I am satisfied with it.
I would recommend it to a friend.
It is fun to use.
It works the way I want it to work.
It is wonderful.
I feel I need to have it.
It is pleasant to use.
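
To summarize responses, the item ratings can simply be averaged within each of the four dimensions. Below is a minimal sketch, assuming responses are coded on the -3 to +3 scale mentioned above; the example ratings are invented for illustration, and the item counts match the lists above (8, 11, 4, and 7 items).

```python
# Illustrative only: average each USE dimension, assuming responses are
# coded -3 (totally disagree) to +3 (totally agree).
from statistics import mean

responses = {
    "Usefulness":       [2, 3, 2, 1, 2, 3, 2, 1],           # 8 items
    "Ease of Use":      [3, 3, 2, 1, 2, 2, 3, 1, 2, 2, 1],  # 11 items
    "Ease of Learning": [3, 3, 3, 2],                        # 4 items
    "Satisfaction":     [2, 2, 1, 2, 1, 0, 2],               # 7 items
}

for dimension, ratings in responses.items():
    print(f"{dimension}: {mean(ratings):+.2f}")
```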

System Usability Scale - SUS

One of the most popular questionnaires is the SUS, which is short and seems to yield reliable results across sample sizes (Tullis and Stetson, 2004). The usual scoring procedure is sketched after the item lists below.

The System Usability Scale (SUS) includes 10 items with five-point response options (strongly disagree to strongly agree):

  1. I think that I would like to use this system frequently
  2. I found the system unnecessarily complex
  3. I thought the system was easy to use
  4. I think that I would need the support of a technical person to be able to use this system
  5. I found the various functions in this system were well integrated
  6. I thought there was too much inconsistency in this system
  7. I would imagine that most people would learn to use this system very quickly
  8. I found the system very cumbersome to use
  9. I felt very confident using the system
  10. I needed to learn a lot of things before I could get going with this system

Adapted for websites, this gives:

  1. I think that I would like to use this website frequently
  2. I found the website unnecessarily complex
  3. I thought the website was easy to use
  4. I think that I would need the support of a technical person to be able to use this website
  5. I found the various functions in this website were well integrated
  6. I thought there was too much inconsistency in this website
  7. I would imagine that most people would learn to use this website very quickly
  8. I found the website very cumbersome to use
  9. I felt very confident using the website
  10. I needed to learn a lot of things before I could get going with this website
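
SUS responses are commonly converted to a 0-100 score: with items coded 1 (strongly disagree) to 5 (strongly agree), each odd-numbered (positively worded) item contributes its rating minus 1, each even-numbered (negatively worded) item contributes 5 minus its rating, and the sum of the contributions is multiplied by 2.5. A minimal sketch (the function name and example responses are illustrative):

```python
def sus_score(responses):
    """0-100 SUS score from ten item responses coded 1-5.

    Odd-numbered items are positively worded (rating - 1);
    even-numbered items are negatively worded (5 - rating).
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses coded 1-5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Example: a fairly positive respondent
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```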

TAM Satisfaction Questionnaire

The Technology Acceptance Model (TAM) was introduced by Davis (1989). Six items measure perceived usefulness and six measure perceived ease of use; both constructs are expected to explain the use of a technology. Several small variants of this original version exist, differing mainly in wording. The items below are taken from Davis (1989).

  1. Using [.....] in my job would enable me to accomplish tasks more quickly.
  2. Using [.....] would improve my job performance.
  3. Using [.....] in my job would increase my productivity.
  4. Using [.....] would enhance my effectiveness on the job.
  5. Using [.....] would make it easier to do my job.
  6. I would find [.....] useful in my job.
  7. Learning to operate [.....] would be easy for me.
  8. I would find it easy to get [.....] to do what I want it to do.
  9. My interaction with [.....] would be clear and understandable.
  10. I would find [.....] to be flexible to interact with.
  11. It would be easy for me to become skillful at using [.....].
  12. I would find [.....] easy to use.

Responses use a seven-point likely/unlikely scale: extremely, quite, slightly, neither, slightly, quite, extremely.
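
The usefulness and ease-of-use items are usually analysed as separate subscales, and when the instrument is adapted (e.g. [.....] replaced by a concrete system name) it is worth re-checking internal consistency. Below is a minimal sketch computing Cronbach's alpha for one subscale; the data are invented for illustration.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a subscale.

    `items` is a list of per-item response lists of equal length
    (one value per respondent), e.g. ratings coded 1-7.
    """
    k = len(items)
    item_variance_sum = sum(pvariance(item) for item in items)
    totals = [sum(vals) for vals in zip(*items)]  # each respondent's subscale total
    return k / (k - 1) * (1 - item_variance_sum / pvariance(totals))

# Illustrative data: six ease-of-use items, five respondents, coded 1-7
ease_of_use = [
    [6, 5, 7, 4, 6],
    [6, 6, 7, 3, 5],
    [5, 5, 6, 4, 6],
    [7, 6, 7, 4, 5],
    [6, 5, 6, 3, 6],
    [6, 6, 7, 4, 5],
]
print(round(cronbach_alpha(ease_of_use), 2))  # -> 0.99 for this toy data
```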

More complex models also exist, e.g. UTAUT below.

UTAUT

The Unified Theory of Acceptance and Use of Technology (UTAUT) was created by Venkatesh et al. (2003). As the word "unified" suggests, it “integrates eight theories of technology adoption and provides a comprehensive view of the factors affecting users’ adoption behavior. The UTAUT model consisted of four main constructs – performance expectancy, effort expectancy, social influence, and facilitating conditions – and four moderating variables: gender, age, experience, and voluntariness of use.” (Soo Kang, 2017: Abstract)
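
As an illustration of how the constructs are used, one can compute per-respondent construct scores (e.g. item means) and then relate performance expectancy, effort expectancy and social influence to behavioural intention with a regression model. The sketch below uses ordinary least squares on invented construct scores; it is not the analysis from Venkatesh et al. (2003), which also includes the moderating variables.

```python
import numpy as np

# Invented construct means per respondent (1-7 scale); columns are
# PE (performance expectancy), EE (effort expectancy), SI (social influence).
X = np.array([
    [6.0, 5.5, 4.0],
    [4.5, 3.0, 3.5],
    [6.5, 6.0, 5.0],
    [3.0, 2.5, 2.0],
    [5.0, 4.0, 4.5],
    [5.5, 5.0, 3.0],
])
y = np.array([6.0, 3.5, 6.5, 2.0, 4.5, 5.0])  # behavioural intention (BI)

X1 = np.column_stack([np.ones(len(X)), X])     # add an intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)  # ordinary least squares
print(dict(zip(["intercept", "PE", "EE", "SI"], coef.round(2))))
```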

Venkatesh et al. (2003) version

Performance expectancy
U6 - I would find the system useful in my job.
RA1 - Using the system enables me to accomplish tasks more quickly.
RA5 - Using the system increases my productivity.
OE7 - If I use the system, I will increase my chances of getting a raise.
Effort expectancy
EOU3 - My interaction with the system would be clear and understandable.
EOU5 - It would be easy for me to become skillful at using the system.
EOU6 - I would find the system easy to use.
EU4 - Learning to operate the system is easy for me.
Attitude toward using technology
A1 - Using the system is a bad/good idea.
AF1 - The system makes work more interesting.
AF2 - Working with the system is fun.
Affect1 - I like working with the system.
Social influence
SN1 - People who influence my behavior think that I should use the system.
SN2 - People who are important to me think that I should use the system.
SF2 - The senior management of this business has been helpful in the use of the system.
SF4 - In general, the organization has supported the use of the system.
Facilitating conditions
Do you have everything you need to use the system?
PBC2 - I have the resources necessary to use the system.
PBC3 - I have the knowledge necessary to use the system.
PBC5 - The system is not compatible with other systems I use.
FC3 - A specific person (or group) is available for assistance with system difficulties.
Self-efficacy
I could complete a job or task using the system...
SE1 - If there was no one around to tell me what to do as I go.
SE4 - If I could call someone for help if I got stuck.
SE6 - If I had a lot of time to complete the job for which the software was provided.
SE7 - If I had just the built-in help facility for assistance.
Anxiety
ANX1 - I feel apprehensive about using the system.
ANX2 - It scares me to think that I could lose a lot of information using the system by hitting the wrong key.
ANX3 - I hesitate to use the system for fear of making mistakes I cannot correct.
ANX4 - The system is somewhat intimidating to me.
Behavioral intention to use the system
BI1 - I intend to use the system in the next <n> months.
BI2 - I predict I would use the system in the next <n> months.
BI3 - I plan to use the system in the next <n> months.

Simeonova et al. (2014) version for e-learning, a cross-cultural validation

Performance Expectancy
How useful do you think Moodle/Blackboard is?
PE1 I would find Moodle/Blackboard useful for my studies
PE2 Using Moodle/Blackboard enables me to accomplish tasks more quickly
PE3 Using Moodle/Blackboard increases my productivity
PE4 If I use Moodle/Blackboard, I will increase my chances of successfully completing the course
Effort Expectancy
How much effort does it take?
EE1 My interaction with Moodle/Blackboard would be clear and understandable
EE2 It would be easy for me to become skilful at using Moodle/Blackboard
EE3 I would find Moodle/Blackboard easy to use
EE4 Learning to operate Moodle/Blackboard is easy for me
Attitude toward using technology
Is it enjoyable?
ATUT1 Using Moodle/Blackboard is a good idea
ATUT2 Moodle/Blackboard makes work more interesting
ATUT3 Working with Moodle/Blackboard is fun
ATUT4 I like working with Moodle/Blackboard
Social Influence
What do your social surroundings think about Moodle/Blackboard?
SI1 People who influence my behaviour think I should use Moodle/Blackboard
SI2 I use it because most of my classmates do
SI3 The teachers are supporting the use of Moodle/Blackboard
SI4 In general, the University supports the use of Moodle/Blackboard
Facilitating Conditions
Do you have everything you need to use Moodle/Blackboard?
FC1 I have the resources necessary to use Moodle/Blackboard
FC2 I have the knowledge necessary to use Moodle/Blackboard
FC3 Moodle/Blackboard is compatible with other systems I use
FC4 A specific person (or group) is available for assistance with Moodle/Blackboard difficulties
Self-efficacy
I could complete a job or task using Moodle/Blackboard…
SE1 If there was no one around to tell me what to do as I go
SE2 If I could call someone for help if I got stuck
SE3 If I had a lot of time to complete the job for which the software was provided
SE4 If I had just the built-in help facility for assistance
Anxiety
Are there any concerns?
Anx1 I feel apprehensive about using Moodle/Blackboard
Anx2 It scares me to think that I could lose a lot of information using Moodle/Blackboard by hitting the wrong key
Anx3 I hesitate to use Moodle/Blackboard for fear of making mistakes
Anx4 Moodle/Blackboard is somewhat intimidating to me

Affinity for Technology Interaction (ATI) scale

This scale was developed by Franke, Attig and Wessel (2018), who define affinity for technology interaction “as the tendency to actively engage in intensive technology interaction, as a key personal resource for coping with technology.”

"In the following questionnaire, we will ask you about your interaction with technical systems. The term “technical systems” refers to apps and other software applications, as well as entire digital devices (e.g., mobile phone, computer, TV, car navigation)."

"Please indicate the degree to which you agree/disagree with the following statements.","Completely disagree","Largelydisagree","Slightly disagree","Slightly agree","Largely agree","Completely agree"

  1. I like to occupy myself in greater detail with technical systems.
  2. I like testing the functions of new technical systems.
  3. I predominantly deal with technical systems because I have to.
  4. When I have a new technical system in front of me, I try it out intensively.
  5. I enjoy spending time becoming acquainted with a new technical system.
  6. It is enough for me that a technical system works; I don’t care how or why.
  7. I try to understand how a technical system exactly works.
  8. It is enough for me to know the basic functions of a technical system.
  9. I try to make full use of the capabilities of a technical system.
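
In Franke, Attig and Wessel's scale the negatively worded items (items 3, 6 and 8 above) are reverse-coded, and an overall ATI score is typically computed as the mean of the nine items on the six-point scale. A minimal sketch, with invented example responses:

```python
REVERSED_ITEMS = {3, 6, 8}  # negatively worded items in the list above

def ati_score(responses):
    """Mean ATI score from nine responses coded 1 (completely disagree)
    to 6 (completely agree); negatively worded items are reverse-coded."""
    if len(responses) != 9:
        raise ValueError("ATI expects nine item responses")
    adjusted = [
        (7 - rating) if (number + 1) in REVERSED_ITEMS else rating
        for number, rating in enumerate(responses)
    ]
    return sum(adjusted) / len(adjusted)

print(round(ati_score([5, 6, 2, 5, 4, 2, 5, 3, 5]), 2))  # -> 4.89
```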

Fun questionnaire

This questionnaire was used in Afke Donker, Human factors in educational software for young children, PhD thesis, Vrije Universiteit, Netherlands http://hdl.handle.net/1871/9782

  1. Do you work with the program without someone telling you to?
  2. Would you like to work with the program when other children can decide for themselves what to do?
  3. Do you think it is boring to work with the program?
  4. When you started working with the program, did you want to continue working with it?
  5. Do you think your friends would like the program?
  6. Do you think the program is childish?
  7. Is the program too difficult to play with?
  8. When you have worked with the program once, does it remain fun?
  9. Do you enjoy yourself when you are working with the program?
  10. Does the program contain many surprises?
  11. Would you like to work with the program more often?
  12. Do you perform well on the exercises in the program?
  13. Would you like to have the program at home?
  14. Do you make many mistakes while you are working with the program?

Geneva Appraisal Questionnaire (GAQ)

The Geneva Appraisal Questionnaire (GAQ) was developed by members of the Geneva Emotion Research Group on the basis of Klaus R. Scherer's Component Process Model of Emotion (CPM). Its purpose is to assess, as far as possible through recall and verbal report, the results of an individual's appraisal process in a specific emotional episode.

WEBLEI

“This instrument is designed to capture students' perception of web based learning environment. Apart from demographics and background information sections, there are four core aspects in the instrument. The first three aspects are adapted from Tobin's (1998) work on Connecting Communities Learning (CCL) and the final aspect focuses on information structure and the design aspect of the web based material. Each of these aspects is explained in the following section.” (Chang, 1999; retrieved March 2014)

(1) WEBLEI Scale I: Emancipatory activities

  1. I can access the learning activities at times convenient to me.
  2. The online material is available at locations suitable for me.
  3. I can use time saved in travelling and on campus class attendance for study and other commitments.
  4. I am allowed to work at my own pace to achieve learning objectives.
  5. I decide how much I want to learn in a given period.
  6. I decide when I want to learn.

(2) WEBLEI Scale II: Co-participatory activities

  1. The flexibility allows me to meet my learning goals.
  2. The flexibility allows me to explore my own areas of interest.
  3. I am encouraged to explore concepts beyond my regular web based lessons.
  4. The asynchronous nature of the interactions enables me to reflect and respond when I had formulated an appropriate response.
  5. This mode of learning enables me to interact with other students and the tutor asynchronously.
  6. I communicate with other students in this subject electronically (via email, fax, bulletin boards, chat line).
  7. I have the autonomy to ask my tutor what I do not understand.
  8. The tutor responds promptly to my queries.
  9. The tutor addresses my queries adequately.
  10. The tutor sends me comprehensive feedback on my assignment.
  11. I have the autonomy to ask other students what I do not understand.
  12. Other students respond promptly to my queries.
  13. In this learning environment, I have to be self-disciplined in order to learn.
  14. I regularly participate in self-evaluations.
  15. I regularly participate in peer-evaluations.
  16. It is easy to organise a group for a project.
  17. It is easy to work collaboratively with other students involved in a group project.

(3) WEBLEI Scale III: Qualia

  1. I felt a sense of satisfaction and achievement about this learning environment.
  2. I enjoy learning in this environment.
  3. I could learn more in this environment.
  4. The technology resources enhance learning.
  5. I was supported by positive attitude from my peers.
  6. I was able to access the materials without much difficulty.
  7. I had no difficulty using the technology.
  8. I am confident in using the technology.
  9. I have no problems going through the materials on my own.
  10. I was in control of my progress as I moved through the material.
  11. It was easy to move about in the material.
  12. The web based learning environment held my interest throughout my course of study.
  13. I felt a sense of boredom towards the end of my course of study.
  14. I felt isolated towards the end of my course of study.

(4) WEBLEI Scale IV: Information structure and design activities

  1. The learning objectives are clearly stated in each lesson.
  2. The scope of the lesson is clearly stated.
  3. The organisation of each lesson is easy to follow.
  4. The structure keeps me focused on what is to be learned.
  5. Expectations of assignments are clearly stated in my subject.
  6. Activities are planned carefully.
  7. The subject content is appropriate for delivery on the Web.
  8. There is a logical sequence of presentation of the subject content.
  9. The presentation of the subject content is clear.
  10. The quiz in the web based materials enhances my learning process.
  11. The material shows evidence of originality and creativity in the visual design and layout.
  12. The graphics used in the material are appropriate.
  13. The colours used in the material are appropriate.
  14. The multimedia technology (eg animation, graphics, sound, video) contributes to the affective appeal of the material.
  15. The links provided in the material are clearly visible and logical.
  16. The links provided are relevant and appropriate to the document.
  17. The links provided are reliable ie no inactive links.
  18. The 'Help' system included in the material is context sensitive.
  19. The web based learning approach can substitute traditional classroom approach.
  20. The web based learning approach can be used to supplement traditional classroom approach.

Links

Bibliography

  • Chandra, V., & Fisher, D. L. (2009). Students' perceptions of a blended web-based learning environment. Learning Environments Research, 12(1), 31-44.
  • Chandra, V., Fisher, D., & Chang, V. S. (2011). Investigating higher education and secondary school web-based learning environments using the WEBLEI. In T. Le & Q. Le (Eds.), Technologies for enhancing pedagogy, engagement and empowerment in education: Creating learning-friendly environments (pp. 93-104). Information Science Reference, IGI Global.
  • Chang, V. (1999). Evaluating the effectiveness of online learning using a new web based learning instrument. Proceedings Western Australian Institute for Educational Research Forum 1999. http://www.waier.org.au/forums/1999/chang.html
  • Chang, V., & Fisher, D. (2003). The validation and application of a new learning environment instrument for online learning in higher education. In Technology-rich learning environments: A future perspective (pp. 1-18).
  • Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-340.
  • Franke, T., Attig, C., & Wessel, D. (2018). A personal resource for technology interaction: Development and validation of the Affinity for Technology Interaction (ATI) scale. International Journal of Human-Computer Interaction. Taylor & Francis.
  • Perlman, G. (1985). Electronic surveys. Behavior Research Methods, Instruments, and Computers, 17(2), 203-205.
  • Root, R. W., & Draper, S. (1983). Questionnaires as a software evaluation tool. Proceedings of ACM CHI'83 Conference on Human Factors in Computing Systems, 83-87.
  • Simeonova, B., Bogolyubov, P., Blagov, E., & Kharabsheh, R. (2014). Cross-cultural validation of UTAUT: The case of University VLEs in Jordan, Russia and the UK. Electronic Journal of Knowledge Management, 12(1), 25-34.
  • Tobin, K. (1998). Qualitative perceptions of learning environments on the World Wide Web. In B. J. Fraser & K. G. Tobin (Eds.), International handbook of science education (pp. 139-162). Kluwer Academic Publishers.
  • Tullis, T., & Albert, B. (2008). Measuring the user experience: Collecting, analyzing, and presenting usability metrics (p. 317). Morgan Kaufmann. ISBN 0-12-373558-0.
  • Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478.

Licensed under CC BY-SA 3.0 | Source: https://edutechwiki.unige.ch/en/Usability_and_user_experience_surveys