Abstract
This two-part study focused first on gaining insight into the current state of Web-based learning site usability, and second on better understanding the perceptions of Web-based learning site builders regarding the usefulness of usability analysis results generated by an automated tool. Part one of this study involved analysis of usability attributes of Web-based learning sites using an empirically based automated Web usability evaluation tool. Part two of the study involved Web-based surveys focused on the perceptions of Web-based learning site builders about Web usability. A literature review covering usability, Web usability, Web-based learning, Web-based learning usability evaluation, and automated Web site usability evaluation tools supports the rationale for this study. To begin the study, an automated usability evaluation, using a software program called WebTango developed at the University of California at Berkeley (UCB), was performed on Web sites that are part of the DLESE Reviewed Collection (DRC), a subset of the educational resources in the Digital Library for Earth System Education (DLESE). Subsequently, a Web-based cross-sectional survey was distributed to the builders of a subset of the DLESE resources that had been subjected to the usability evaluation. The first part of the survey explored the builders' perceptions of Web usability in general, and the second part explored their perceptions of the analysis results yielded by WebTango for the particular Web-based learning sites they had developed. Both Likert-scale (agree-disagree) and open-ended response formats were used for description and exploration. The results of the first part of the study indicated that WebTango rated the usability quality of the Web-based learning sites as average or below average.
In the survey study that constituted the second part, respondents (builders of DLESE resources) were generally unfamiliar with usability and had little expertise or experience in applying usability evaluation methods. They expressed a desire to learn more about usability and to enhance the usability of their resources, but they were generally uncertain about the utility and value of an automated evaluation tool. The respondents identified benefits of, and issues with, automated evaluation tools that may inform future research and development of automated usability evaluation tools.