This paper introduces automatic and manual evaluation methods for analyzing web accessibility. The first
topic examines recent advances in authoring, including modifications to existing CMS systems and new development
toolkits. Next, the session explores the accessibility of specialized content such as graphics and interface components. The
last topic in the session covers results from the Web Accessibility Benchmarking Cluster (WAB-Cluster), a group of European
Union supported projects. The authors discuss the technologies needed for both automatic and manual evaluation.