The Accessibility of College and University Home Pages in the State of New York

Joseph J. DiLallo
josephdilallo@adelphi.edu

Robert M. Siegfried
siegfrie@adelphi.edu

Department of Mathematics and Computer Science
Adelphi University
Garden City, NY 11530 USA

Abstract

Website accessibility has become more important in recent years as several companies have been sued over the inaccessibility of their websites. This study involved the development of a software tool that evaluates the compliance of websites with the "10 quick tips" that guide web designers seeking to address the highest-priority concerns of the Web Accessibility Initiative's guidelines. This tool was then used to evaluate the accessibility of the home pages of the degree-granting institutions of higher education in the State of New York. The results are discussed, including which concerns of the disabled still need to be addressed.

Keywords: accessibility, World Wide Web, disabled

1. INTRODUCTION

The accessibility of websites to the disabled has become an important issue in recent years. In February 2006, the National Federation of the Blind (NFB) filed a lawsuit against the Target Corporation over its website (http://target.com), alleging inaccessibility for the blind (Sliwa 2006). There were many issues with Target's website: the page was difficult to navigate, screen readers were unable to read parts of the site (including the weekly specials), and it was impossible for a blind user to match a product with its price. Target.com is not the only website accused of being inaccessible to the blind; the NFB settled a lawsuit with America Online that led to more accessible service for the blind (Parker 2006). Additionally, Priceline and Ramada Hotels made their websites (http://priceline.com and http://ramada.com, respectively) easier for the blind to navigate as a result of a settlement with the State of New York (Meyers 2006). Southwest Airlines and the organizers of the 2000 Olympic Games in Sydney, Australia were also sued over accessibility issues (McCullagh 2002).

Website owners are generally, but not always, required to make their sites accessible to the disabled. Section 508 of the Rehabilitation Act states that federal websites, and sites owned by state and local agencies receiving federal funds, "must insure that the electronic information technology is accessible to people with disabilities, unless it presents an undue burden" (Noble 2004). The Individuals with Disabilities Education Act (IDEA 1997) requires that disabled students be granted access to educational resources, including online resources. The requirements under the Americans with Disabilities Act (ADA) are not as clear. While one court ruled that the Metropolitan Atlanta Rapid Transit Authority's website (http://itsmarta.com) had to be accessible to the disabled, another judge ruled that Southwest Airlines' website was not a public accommodation and therefore was not required under the ADA to be accessible to the blind (Hudson 2003).

In addition to the legal considerations cited above, web accessibility is, at its core, an ethical issue. A large portion of websites today carries information in the form of videos, sounds, and animations. While these "bells and whistles" make a website more appealing to a mainstream user, they can make it inaccessible to disabled users unless certain steps are taken. With every passing day, a greater number of services become available online.
Given the pervasive role of the Internet in our lives today, keeping this essential service out of the reach of millions begins to resemble discrimination. Moreover, existing research suggests that people with disabilities rely on technology in general, and on the Internet in particular, more than the average individual. Bonner (2002) states that many people with disabilities find that the Web "makes a difference between living and just existing." Gristock (2003) agrees, suggesting that "often, a computer is a link to the outside world where a disabled person can perform as an equal to a non-disabled person". Thus, ensuring accessibility must become an integral part of web content development from its earliest stages.

Several studies have analyzed the accessibility of selected web pages in many areas of the Web. Sullivan and Matson (2000) examined the accessibility of 50 of the Web's most popular sites and found that most of them (41 out of 50) had accessibility issues. Stowers (2002) analyzed 148 federal websites and found that a majority lacked the proper accessibility features. A more recent study of state legislature websites by Fagan and Fagan (2004) revealed a similar situation: although many organizations are making an effort to improve their compliance with accessibility standards, there are several egregious violators. They point out that "even states that meet the minimum requirements for accessibility have not chosen to follow the full guidelines."

In the area of higher education websites, Schmetzke studied the homepages of 1051 community colleges and found that only 29% were free of major barriers to accessibility (Schmetzke 2001). Diaper and Worman (2003), by contrast, found that the homepages of the British university websites they examined were largely free of major barriers. Most of these studies used automated tools such as WebXACT in their analysis (Spindler 2002), but some conducted their research manually. Thompson et al. (2003) used two "experts" to judge the homepages of 102 research universities on a 5-point scale. The scores given by the two experts were compared with each other and with those generated by WebXACT. They concluded that the results from an automated tool are fairly good indicators of the actual accessibility of a web page, even though there are many aspects that must be judged manually. Lazar et al. (2003) analyzed 50 homepages in the Mid-Atlantic United States and found a disturbing trend: the most accessibility problems existed on the web pages of web design and IT firms, which should be the leaders in the field.

The present study differs from the previous research in several ways. In order to avoid copyright issues and to focus on the problems considered most important, a new tool for analyzing accessibility was developed. The degree-granting institutions of higher education in the State of New York were chosen for several reasons. Educational institutions have an obligation to make their services available to every individual, and the policy changes they implement in governing their website design eventually trickle down to the entire web community; as educators, they are also trendsetters for tomorrow. Moreover, the content of these websites does not change significantly over time, which makes possible the longitudinal analyses that many studies have recommended.
It is also important to note that no major study has been published in this area in the last two to three years.

2. DEVELOPMENT OF WAAP

The Watchfire Corporation was considered "one of the leaders" in web accessibility auditing software (Sliwa 2006). It provided a free service known as WebXACT, a Web-based tool that could be used to test a website for quality, privacy, and accessibility. This tool was fairly popular, but it was not used in this study for a variety of reasons. The Terms of Service put forth by Watchfire (2007) clearly stated that "You shall not… use the Service to scan a website page that is not owned by you or otherwise under your management and control, unless you have received permission from the person or entity who owns or otherwise has management and control of such website page." Getting permission from every website included in the study would have been a lengthy endeavor, and many of the websites' owners would most likely have refused. The WebXACT tool also lacked a facility for auditing a large number of pages in one batch; its use would have required entering the address of each website, waiting for the assessment, and recording and interpreting the results manually. Additionally, the IBM Corporation bought Watchfire in July 2007 and the online service was taken offline (Wiens 2007).

Design of WAAP

The Web Accessibility Audit Program (WAAP) was designed to overcome these major barriers to using WebXACT. Because it is our own tool, there were no restrictions on its use in this study. It was designed with easy automation in mind and can check a potentially limitless number of web pages in one run by reading an input file containing the URLs of the pages to be checked (a minimal sketch of such a driver appears after Figure 1). Furthermore, the main focus of this study was on the highest-priority problems. While WAAP checks fewer issues than Watchfire's tool did, its output provides more information on the issues it does examine, e.g., alt attributes that are present but empty. While it seems to be common practice to give insignificant layout images an empty but present alt attribute, some websites, such as Adelphi University's, also have empty alt attributes on significant images that contain information (Thompson 2003). Because the same set of files was created for each web page, it was easy to compare and analyze the results.

- Images & animations - An alt attribute should describe each visual.
- Image maps - Client-side (instead of server-side) maps and text for hotspots should be used.
- Multimedia - Audio should have captioning or transcripts and video should have descriptions.
- Hypertext links - Link text should be clear even when read out of context.
- Page organization - Should use consistent structure and cascading style sheets for layout and style when possible.
- Graphs & charts - Should have a summary or longdesc attribute.
- Scripts, applets, & plug-ins - Should have alternative content in case these features are not available.
- Frames - Should include the noframes element and a title that explains the content.
- Tables - Should have a summary and should make sense when read line-by-line.
- Check your work - Using tools available on the W3C website.

Figure 1. The ten quick tips for making websites accessible (from http://www.w3.org/WAI/quicktips/Overview.php)

Figure 1 shows the ten quick tips that the World Wide Web Consortium (W3C) lists for making websites accessible (1999). While they summarize key concepts in accessible web design, they are not all-inclusive. However, they do cover most of the Level 1 issues in website accessibility, which are the most crucial issues in making a web page accessible to the disabled.
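WAAP's source code is not reproduced in this paper; the following is only a minimal sketch, in Java, of the kind of batch driver the design above describes: it reads homepage URLs from an input file and fetches each page's HTML for auditing. The class name, the input file name, and the reporting details are hypothetical, not WAAP's actual implementation.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class BatchDriver {
        public static void main(String[] args) throws Exception {
            // One homepage URL per line; the file name is hypothetical.
            BufferedReader urls = new BufferedReader(new FileReader("urls.txt"));
            String address;
            while ((address = urls.readLine()) != null) {
                HttpURLConnection conn =
                    (HttpURLConnection) new URL(address).openConnection();
                // Some sites tailor (or withhold) their HTML depending on the
                // requesting browser, so a common browser identity is claimed.
                conn.setRequestProperty("User-Agent", "Mozilla/5.0");
                BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()));
                StringBuilder html = new StringBuilder();
                String line;
                while ((line = in.readLine()) != null)
                    html.append(line).append('\n');
                in.close();
                // Each accessibility check would be run against html here,
                // writing its errors and warnings to per-page report files.
                System.out.println(address + ": " + html.length() + " bytes");
            }
            urls.close();
        }
    }

Running every check inside this loop and writing one set of report files per page is what makes it straightforward to compare results across pages and across surveys.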
Still and animated images were all checked for the presence of an alt attribute, which was checked in turn to see whether it was empty or actually contained text. Images were also flagged for human checking, because graphs and charts need a longdesc attribute that is relevant to the graph or chart. Server-side image maps were detected by searching all images for the ismap attribute, and all client-side image maps were checked to see whether hotspots had text labels by analyzing area tags for the alt attribute.

Multimedia is difficult, if not impossible, to check automatically for alternative content for the deaf or blind. Consequently, the program flags all object and script tags for human checking to ensure that there are other methods of obtaining the meaning of these features. Hypertext links, located through the anchor tag, are both machine-checked for the presence of text in the link (or of images with an alt attribute) and flagged for human checking to ensure that they are understandable with or without their surrounding text and images.

Each page was checked to see whether it used a style sheet for formatting, and these style sheets were in turn checked to see whether they specified absolute or relative sizes. Scripts, applets, and plug-ins were checked by searching for descriptions in object tags and for noscript tags in pages containing scripts. Frames were checked for a name attribute, frameset tags were checked for title attributes, and pages that used frames were also checked for a noframes equivalent for browsers that do not support frames. Lastly, tables were checked against several criteria, including the existence of a summary for the entire table, the use of row and column headers, and the pairing of each item with its appropriate header.

Implementation of WAAP

WAAP was implemented in the Java programming language, which allowed the use of Java's standard libraries and simplified the development process. One problem encountered was that some websites tailor the HTML they return to match the browser sending the request. Of the few schools that did this, at least one sent a blank page back to WAAP; the others sent WAAP a generic web page, which may not be representative of the page received by most users. A minimal sketch of one of WAAP's checks, the image check, appears below.
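This sketch shows how the image check described above might distinguish a missing alt attribute from one that is present but empty. It is an illustration under simplifying assumptions, not WAAP's actual source; the class name is hypothetical, and it uses only the regular-expression classes in Java's standard library.

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class AltAttributeCheck {
        // Matches any img tag; assumes tags are well formed.
        private static final Pattern IMG_TAG =
            Pattern.compile("<img\\b[^>]*>", Pattern.CASE_INSENSITIVE);
        // Captures the quoted value of an alt attribute (double or single quotes).
        private static final Pattern ALT_ATTR =
            Pattern.compile("\\balt\\s*=\\s*(\"([^\"]*)\"|'([^']*)')",
                            Pattern.CASE_INSENSITIVE);

        public static void check(String html) {
            Matcher img = IMG_TAG.matcher(html);
            while (img.find()) {
                String tag = img.group();
                Matcher alt = ALT_ATTR.matcher(tag);
                if (!alt.find()) {
                    System.out.println("ERROR: missing alt attribute: " + tag);
                } else {
                    String text = alt.group(2) != null ? alt.group(2) : alt.group(3);
                    // An empty alt is acceptable only on purely decorative images,
                    // so it is reported separately from a missing attribute.
                    if (text.trim().isEmpty())
                        System.out.println("ERROR: empty alt attribute: " + tag);
                }
                // Every image is also flagged as a warning so that a human can
                // confirm that graphs and charts carry a relevant longdesc.
                System.out.println("WARNING: check image manually: " + tag);
            }
        }
    }

A production tool would also need to handle unquoted attribute values and malformed HTML; the point here is the three-way classification of alt attributes (missing, empty, or containing text) that WAAP reports.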
Verification of WAAP

For the sake of comparison, we examined the results of both WebXACT's scan and WAAP's scan of the Adelphi University homepage and of the second author's personal homepage. The most noticeable result was that while WebXACT looked for many more types of errors and warnings, covering all three levels of priority, it did not look for all of the criteria indicated in W3C's Quick Tips. For example, both WebXACT and WAAP check for tables without summary attributes, and both programs warned that images needed to be checked for animations; both tools found the same number of instances on each homepage. But they returned very different results on other points, such as relative versus absolute sizing: WebXACT checked for this error in font tags and tables, while WAAP checked font tags and style sheets. As a result, they reported different numbers of instances and pointed to different parts of the source document. The Ten Quick Tips also warn against using the bold and italics tags, so WAAP was designed to check for these as well and pointed them out, while WebXACT did not. One final but important difference is that WebXACT does not discriminate between proper alt attributes on images and those whose descriptions are empty strings, while WAAP does. As a result, WAAP reported more errors of this type, and manually checking Adelphi's homepage confirmed that some of these empty alt attributes appeared on important images that should have had actual text.

IBM's acquisition of Watchfire occurred while WAAP was still under development, and shortly thereafter WebXACT was no longer available for further use. WAAP was therefore further tested against three other online accessibility tools: CynthiaSays, FAE (Functional Accessibility Evaluator), and WAVE (Web Accessibility Evaluator). All four programs were used to evaluate a sample of ten college websites. WAAP, FAE, and CynthiaSays all noted the instances of images with empty alt attributes. All three noted the use of the bold and italics tags, which are deprecated, instead of the strong and em tags that should be used in their place. While they all recognized the improper use of tables for layout, their messages were quite different: WAAP noted that the layout tables lacked summaries, CynthiaSays noted the lack of row and column headings and the nesting of tables, and FAE simply noted that tables were nested and that layout should have been done using a style sheet. WAAP and CynthiaSays gave error messages for the use of scripts that lacked noscript tags; FAE produced no messages when these were encountered. Of the programs used to audit these test sites, only WAAP produced error messages when it encountered absolute sizing; a sketch of such a check appears at the end of this section.

WAVE produced an annotated version of the page being tested, with tags that identified accessibility errors, alerts (web page features that needed to be checked manually for accessibility problems), and accessibility features that should be checked for accuracy. It also marked structural and semantic features, making it easier to spot tables, headings, and lists. However, WAVE was unable to identify most of the problems identified by the other programs; the only problems it identified were images missing alt attributes and forms missing labels.

How well do web accessibility testing programs work? Diaper and Worman (2003) found discrepancies between how A-Prompt and Bobby (later renamed WebXACT) evaluated web pages. Lazar et al. (2003) found Bobby inaccurate enough that they did not use it in evaluating homepages for either WCAG or Section 508 compliance. This has led several researchers to devise testing methodologies that do not rely on software: Brajnik (2006) describes a heuristic walkthrough method for evaluating accessibility, and Hackett et al. (2004) derive a metric that they used to quantify the accessibility and complexity of web pages. While automated evaluations such as the one described here are less than perfect, they do provide a way to examine a larger number of pages than could be evaluated without automation.
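Since only WAAP flagged absolute sizing among the tools compared above, that check is worth illustrating. The following Java sketch scans style-sheet text for font sizes given in absolute units; the class name and the exact set of units treated as absolute are assumptions, not WAAP's actual rules.

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class AbsoluteSizingCheck {
        // Absolute units (pt, px, etc.) override the font size a reader has
        // chosen; relative units (em, %) respect it.
        private static final Pattern ABSOLUTE_FONT_SIZE = Pattern.compile(
            "font-size\\s*:\\s*[0-9.]+\\s*(pt|px|pc|in|cm|mm)",
            Pattern.CASE_INSENSITIVE);

        public static void check(String styleSheet) {
            Matcher m = ABSOLUTE_FONT_SIZE.matcher(styleSheet);
            while (m.find()) {
                System.out.println("ERROR: absolute font size \"" + m.group()
                    + "\"; a relative unit such as em or % would let users"
                    + " adjust the text size");
            }
        }
    }

A similar scan over font tags would treat values such as size="+1" as relative and plain numeric values as absolute.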
3. RESULTS

A list of 204 schools located within New York State was found online at http://www.univsource.com/ny.htm. Of these schools, Excelsior College and Practical Bible College were omitted from this study, the former because of problems accessing its website and the latter because of its merger with Davis College. Of the remaining schools, three websites returned merely a transfer page or a frameset definition rather than the homepage proper. However, these three are still included in the results, bringing the total to 202 websites.

WAAP was used twice to audit the accessibility of all 202 homepages, first in August 2007 and again in July 2008. The program also produced a summary report that calculated the average number of error categories found on the various homepages and the average number of total instances of errors found. These results appear in Table 1. The number of classes of errors appearing per homepage dropped from 6.61 to 4.46; the number of instances dropped from 88.55 to 38.95.

Table 1. A summary of the errors

    Survey         Types of Errors    Instances of Errors
                   Per Homepage       Per Homepage
    August 2007    6.61               88.55
    July 2008      4.46               38.95

The report also produced a summary of the average number of warning categories for each homepage and the number of instances of each warning. The results appear in Table 2. Unlike the errors, there was no significant change in the number of warnings or in the number of instances. This is not a concern in itself, because warnings only indicate that a feature appearing on the page needs to be checked manually.

Table 2. A summary of the warnings

    Survey         Types of Warnings    Instances of Warnings
                   Per Homepage         Per Homepage
    August 2007    3.11                 93.19
    July 2008      3.12                 90.99

Overall, the results show room for improvement but not outright inaccessibility to those who are impaired. Of the 29 criteria for different accessibility issues, the average number of error types per page was 6.6, and no single school showed more than 10 of the different types of errors. However, many of these errors occurred numerous times on a single home page. One university in the study had 1151 instances of errors on its home page, which contained a large number of scripts. There were, on average, 97 instances of errors per home page, with the second and third largest numbers of instances being 648 and 475, respectively. Six of the 29 categories of errors did not appear on any of the homepages: (1) server-side image maps, (2) empty but present title attributes in frameset tags, (3) frame tags missing the name attribute, (4) frame tags with a name attribute given but no text provided, (5) table data tags with present but empty header attributes, and (6) tables with more than two levels of rows and/or columns that did not use headers to identify cells. Table 3 (in the Appendix) shows the frequency of various types of errors and warnings.

Some interesting trends appeared in the data. Alternative text for images, in the form of alt attributes, is a vital link for those who rely on screen readers to make sense of what they cannot see. Of the 202 schools audited, 63% had at least one image with no alternative text, with an average of 7.5 occurrences per homepage. In addition, 51% of the schools had at least one instance of an alt attribute specified but with no text provided within it, with an average of 11.6 occurrences per page. Some of these images are insignificant images used for layout, but many are significant images that would be important, if not vital, to proper navigation and understanding of the page. Both errors showed minor declines when the follow-up survey was done eleven months later. The three most common errors were (1) links containing images but no text, (2) the use of scripts without noscript alternatives, and (3) the use of absolute sizing in font tags and style sheets rather than relative sizing, with at least one instance on 93%, 92.6%, and 88% of the websites, respectively.
These three problems can represent significant hurdles not only for the completely blind but also for the visually impaired. Absolute sizing makes it more difficult for users to set font sizes with which they are comfortable and can necessitate the use of screen magnifiers when they would not otherwise be needed. Scripts can pose a problem for any user if they are not supported by the browser or the appropriate plug-ins, and they can pose additional problems for the disabled in some cases. Links containing an image but no text can be impossible to understand when the images lack alternative text describing not only the image itself but also its function as a link; adding text to all links along with images can ease navigation.

It is noteworthy that most errors declined over the eleven months between the two surveys, most notably the occurrence of absolute sizing; this will require further investigation. It is particularly interesting that the only error that was more common in the second survey was the occurrence of scripts without noscript alternatives. As websites become more interactive, scripts become more common, and in many cases the noscript alternative is neglected.

Table 4. Occurrences of missing style sheets and frames lacking the noframes element

    Search Criteria                                          Percentage       Percentage
                                                             (August 2007)    (July 2008)
    Style sheets: sites missing any style sheets             7%               6%
    Frames: sites with frames but no noframes alternative    0.5%             0%

The homepages were checked for missing style sheets and for frames that lacked a noframes alternative. These statistics appear in Table 4. Both types of errors were rare when the homepages were audited in August 2007. Missing style sheets were slightly less common in July 2008, and the few sites that had used frames without a noframes alternative fixed this error before the follow-up audit.

To gain a perspective on how these accessibility errors affect blind users, the homepages with the most instances of errors were examined using the text-based browser Lynx. One homepage had over a thousand instances of accessibility errors in the original audit; it presented 17 screens of links as a preface to the homepage's actual content, and some of the site's other pages ran to as many as 50 screens, making the site very difficult to navigate. However, the homepage with the second largest number of instances was only four screens long and was reasonably navigable. This kind of disparity was not unusual.

4. CONCLUSIONS

The results of the study show that New York's colleges and universities do better than one might fear, although not as well as one might hope. While there is room for improvement, users with accessibility issues would most likely be able to glean the necessary information from these websites. The persistence of very common and easily fixed issues on academic websites may be representative of the entire Web.

One interesting note of the study was that the potential accessibility barriers imposed by the use of frames were much lower than expected in the original survey and disappeared entirely in the follow-up survey. There are many possible explanations for this, including the use of cascading style sheets and layout tables. This is a great boon to those relying on screen readers and magnifiers to access websites, even if it was not the intent of the website designers.
The misuse of alternative text for images is still a very common barrier for the blind to overcome; affixing null alternative text to images can be just as confusing as missing alternative text. It is sometimes suggested that the null alt attribute (alt="") be used to denote insignificant images within a layout (Sullivan 2000). However, this is not necessarily a good practice; some of the schools in the study used the null attribute on significant images. It might be helpful if insignificant images used a different tag that enabled them to be distinguished more easily from other images. It is interesting that the current versions of the other accessibility auditing tools now check for this problem; this was not so common a year ago.

There is more to be said on the topic of Web accessibility than this study covers. By limiting the study to the ten quick tips, the lower-priority accessibility concerns were ignored; a follow-up study will examine these concerns. The colleges and universities of New York State may not be representative of the nation's institutions of higher education, but they remain a good place to start. Using only college websites as the subject of this study also leaves open the question of how well commercial websites are doing. It is increasingly difficult to assemble a list that portrays an accurate cross-section of what the World Wide Web is really like, and small segments such as college websites may or may not be representative of the Internet overall.

As noted in Section 2, automated accessibility tools have known limitations. Researchers also disagree about how to quantify barriers to accessibility. Lazar et al. (2003) consider the number of distinct errors to be of primary importance; e.g., if there were five images on a page, each lacking alternative text, this would still count as one error in their view. They consider this more important than the number of occurrences because it is easier for a webmaster to fix multiple occurrences of one error than five problems that are all different in nature. Law et al. (2005) concur: programmers do tend to consider it easier to fix multiple occurrences of the same problem than a smaller number of different problems that may occur only once. But Hackett et al. (2004) view this from the perspective of a blind end user, who is just as frustrated by multiple occurrences of the same problem as by single occurrences of different problems.

Why do so many accessibility problems still exist almost a decade after the first accessibility guidelines were released? Lazar et al. (2004) found that while most webmasters claimed to be familiar with accessibility guidelines, they lacked the time and/or funding to implement them, and there was a lack of support from management and clients.
5. ACKNOWLEDGEMENTS

The authors gratefully acknowledge the assistance of Akhil Ketkar. The authors also gratefully acknowledge the cooperation of the Adelphi University Web Team, which gave us permission to use WebXACT to audit the University's homepage for accessibility issues.

6. REFERENCES

Bonner, P., 2002, "And Websites for All", PC Magazine, May 7, 2002, IP01.

Brajnik, Giorgio, 2006, "Web Accessibility Testing: When the Method Is the Culprit", Computers Helping People with Special Needs: Proceedings of the 10th International Conference, ICCHP 2006, Linz, Austria, July 11-13, p. 156-163.

Diaper, Dan, and Linzy Worman, 2003, "Two Falls out of Three in the Automated Accessibility Assessment of World Wide Websites: A-Prompt v. Bobby", People and Computers XVII - Designing for Society: Proceedings of HCI 2003, Eamonn O'Neill, Philippe Palanque, and Peter Johnson, eds., Springer, New York, p. 349-364.

Fagan, J. C., and B. Fagan, 2004, "An Accessibility Study of State Legislature Websites", Government Information Quarterly 21, p. 65-85.

Gristock, M., 2003, "Accessibility on the Web", The International Center for Disability Resources on the Internet. Retrieved August 21, 2007 from http://www.icdri.org/WebAccess/accessibility_and_web_JKD.htm.

Hackett, Stephanie, Bambang Parmanto, and Xiaoping Zeng, 2004, "Accessibility of Internet Websites through Time", Proceedings of the 6th International ACM SIGACCESS Conference on Computers and Accessibility, Atlanta, Georgia, October 18-20, p. 32-39.

Hudson, William, 2003, "Public Accommodation: The US Web Accessibility Jigsaw", SIGCHI Bulletin 35, 1 (January/February), p. 8.

Individuals with Disabilities Education Act Amendments of 1997, Pub. L. 105-17.

Law, Chris, Julie Jacko, and Paula Edwards, 2005, "Programmer-Focused Website Accessibility Evaluations", Proceedings of the 7th International ACM SIGACCESS Conference on Computers and Accessibility, Baltimore, Maryland, October 9-12, p. 20-27.

Lazar, J., P. Beere, K. Greenidge, and Y. Nagappa, 2003, "Web Accessibility in the Mid-Atlantic United States: A Study of 50 Homepages", Universal Access in the Information Society 2, 4 (November), p. 331-341.

Lazar, Jonathan, Alfreda Dudley-Sponaugle, and Kisha-Dawn Greenidge, 2004, "Improving Web Accessibility: A Study of Webmaster Perception", Computers in Human Behavior 20, 2 (March), p. 269-288.

McCullagh, Declan, 2002, "Judge: Disabilities Act Doesn't Cover Web", CNET News.com, October 21. Retrieved August 21, 2007 from http://news.com.com/2100-1023-962761.html.

Meyers, Michelle, 2006, "Blind Patrons Sue Target for Site Inaccessibility", CNET News.com, February 10. Retrieved August 21, 2007 from http://news.com.com/Blind+patrons+sue+Target+for+site+inaccessibility/2100-1030_3-6038123.html.

Noble, S., 2004, "Web Access and the Law: A Public Policy Framework", Library Hi Tech 20, 4, p. 305-405.

Parker, Laura, 2006, "National Federation of the Blind Files Target Lawsuit", USA Today (October 26), p. 2A.

Schmetzke, A., 2001, "Accessibility of the Homepages of the Nation's Community Colleges". Retrieved July 29, 2008 from http://library.uwsp.edu/aschmetz/Accessible/nationwide/CC_Survey2001/summary_CCC.htm.

Sliwa, Carol, 2006, "Accessibility Issue Comes to a Head", Computerworld (May 8), p. 1.

Stowers, G., 2002,
"The State of Federal Websites: The Pursuit of Excellence", The PricewaterhouseCoopers Endowment for the Business of Government. Retrieved September 7, 2007 from http://www.businessofgovernment.org/pdfs/StowersReport0802.pdf.

Sullivan, T., and R. Matson, 2000, "Barriers to Use: Usability and Content Accessibility on the Web's Most Popular Sites", Proceedings of the ACM Conference on Universal Usability (November 16-17), p. 139-144.

Thompson, T., S. Burgstahler, and D. Comden, 2003, "Research on Web Accessibility in Higher Education". Retrieved September 7, 2007 from http://www.rit.edu/~easi/itd/itdv09n2/thompson.htm.

Watchfire Corporation, 2007, WebXACT Terms of Use. Retrieved September 7, 2007 from http://webxact.watchfire.com/themes/standard-en-us/termsofuse.htm.

Wiens, Jordan, 2007, "Watchfire Blazes Past Rival", InformationWeek (October 8), p. 59-60.

World Wide Web Consortium, 1999, "WAI Quick Tips to Make Accessible Websites". Retrieved June 7, 2007 from http://www.w3.org/WAI/quicktips/.

APPENDIX

Table 3. The frequency of various types of accessibility errors and warnings in August 2007 and July 2008

                                                             August 2007               July 2008
    Search Criteria                                          Avg. Occ.   % of Sites    Avg. Occ.   % of Sites
    Image tag: empty alt attribute                           11.58       51%           5.36        44%
    Image tag: missing alt attribute                         7.48        63%           5.69        62%
    Table tag: missing summary attribute                     6.88        74%           5.97        66%
    Scripts: missing noscript alternatives                   4.71        93%           7.02        95%
    Client-side image maps: missing alt attribute            0.95        11%           0.75        10%
    Visual style: use of the bold tag                        2.45        33%           1.62        29%
    Visual style: use of the italics tag                     0.21        11%           0.19        11%
    Links: links containing only images (no text)            11.21       93%           10.98       93%
    Object tags: alternative text                            0.27        21%           0.27        19%
    Absolute sizing in style sheets and font tags            50.72       88%           0.57        21%
    Warnings: scripts to check for adequate summaries        5.14        93%           7.96        95%
    Warnings: images to check for animations                 29.86       93%           21.92       97%
      and charts/graphs
    Warnings: links to check for out-of-context legibility   58.66       98%           61.32       99%
    Warnings: objects to check for adequate summaries        0.30        24%           0.29        20%

    Avg. Occ. = average number of occurrences per homepage; % of Sites = percentage of sites with the error or warning.