[LON-CAPA-dev] Report on automated testing systems.
Ron Fox
lon-capa-dev@mail.lon-capa.org
Tue, 07 May 2002 00:18:10 -0400
I've looked at a few automated testing systems for GUI/web pages. Here
are my observations, comments, and conclusions:
What we'd like:
A testing tool, or set of testing tools, capable of regression testing
in a GUI environment. We'll really need to capture tests and test data on:
Linux   - various web browsers, e.g. Netscape and Opera
Windows - Netscape and IE, various versions
Mac     - Netscape and IE, various versions
I believe that no single testing tool will do the trick on all
platforms, and, even nastier, we won't be able to cross-pollinate test
results/data collection from platform to platform, or browser to browser.
Unfortunately, in the short time I've looked at the current state of
the art, I see help on the Windows and Linux platforms only.
There are trade-offs to consider; some data points on the kinds of testing systems:
1. Automated capture/playback/compare systems: easy to get test cases;
comparison, if available, is automatic. Comparisons can be tricked.
2. Scripted testing, automated comparison: test setup can be 'real work'.
Simple comparisons can be easily tricked.
3. Scripted testing, manual compare: labor intensive in both script
generation and playback, since you get to look at every page of web
output and compare it to a captured prior image. Comparisons are harder
to trick.
4. Scripted testing and scripted comparison: setting up test cases is like
programming, and therefore also error prone (how do you test the tests
;-)?).
Scripted comparisons, while effortful, are also flexible and less easily
tricked.
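To make approach 4 concrete, here is a minimal sketch (in Python, with hypothetical names - none of this comes from any tool below): a scripted comparison first 'canonicalizes' captured pages by masking fields known to vary between identical runs, then diffs the canonical forms against a stored baseline.

```python
import re

# Patterns for page content expected to differ between identical test
# runs (hypothetical examples: a load-average readout, timestamps).
VOLATILE = [
    (re.compile(r"load average: [\d., ]+"), "load average: <masked>"),
    (re.compile(r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}"), "<timestamp>"),
]

def canonicalize(page: str) -> str:
    """Replace volatile content so only meaningful differences remain."""
    for pattern, replacement in VOLATILE:
        page = pattern.sub(replacement, page)
    return page

def pages_match(baseline: str, current: str) -> bool:
    """Scripted comparison: compare canonical forms, not raw bytes."""
    return canonicalize(baseline) == canonicalize(current)

# Two captures of the 'same' page, differing only in a volatile field:
run1 = "<p>Users: 12</p><p>load average: 0.42, 0.31</p>"
run2 = "<p>Users: 12</p><p>load average: 1.07, 0.88</p>"
assert pages_match(run1, run2)
# A real content change is still caught:
assert not pages_match(run1, "<p>Users: 13</p><p>load average: 1.07, 0.88</p>")
```

The masking list is exactly where the "tricked" risk lives: an over-broad pattern hides real regressions, which is why scripted comparison is effortful to get right.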
Android : http://www.wildopensource.com/larry-projects/doc-android.html
Extensions to the Tcl/Tk scripting language which support
event capture/playback as well as checkpointing and comparing
windows. While it would have been nice to believe that Android is
portable, the event capture/playback mechanism depends on the
applications running in the X11 environment.
Empirix: http://www.empirix.com
A Windows-environment test management tool named e-Tester. Provides an
extensible proprietary scripting language. In contrast to, e.g., Android,
event capture is not automatic but must be scripted. Test results are
displayed but not automatically checked against prior browser contents.
Segue SilkTest:
http://www.segue.com/html/s_solutions/s_silktest/s_silktest_toc.htm
Much like Empirix's e-Tester, down to the platform support: support for a
wide variety of browsers under the wide variety of Windows platforms. Tests
are constructed via a scripting language, manually. Automated
comparisons are not supported, but prior test results are kept in a
repository for manual comparison with current results.
Atesto: http://www.atesto.com/productsandservices/functionchecker.asp
Again manual scripting, but it does automated comparisons between Web
pages in the baseline test and those of the current test. Comparisons are
done at the level of objects in the browser's DOM. This should be
insensitive to, e.g., font selection. Runs on Windows.
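A DOM-level comparison of this kind can be sketched as follows (a hypothetical illustration, not Atesto's actual algorithm): reduce each page to its tag structure and text, dropping purely presentational attributes, and compare those skeletons.

```python
from html.parser import HTMLParser

# Attributes treated as presentational and ignored (assumed list):
PRESENTATIONAL = {"style", "class", "face", "color", "size"}

class Skeleton(HTMLParser):
    """Reduce a page to a stream of (tag, structural-attrs) and text events."""
    def __init__(self):
        super().__init__()
        self.events = []

    def handle_starttag(self, tag, attrs):
        kept = sorted((k, v) for k, v in attrs if k not in PRESENTATIONAL)
        self.events.append(("start", tag, tuple(kept)))

    def handle_endtag(self, tag):
        self.events.append(("end", tag))

    def handle_data(self, data):
        if data.strip():
            self.events.append(("text", data.strip()))

def dom_equal(a: str, b: str) -> bool:
    """Compare two pages structurally, ignoring presentational markup."""
    pa, pb = Skeleton(), Skeleton()
    pa.feed(a)
    pb.feed(b)
    return pa.events == pb.events

# Same content in different fonts compares equal:
assert dom_equal('<p><font face="Arial">Hi</font></p>',
                 '<p><font face="Times">Hi</font></p>')
# A genuine content change does not:
assert not dom_equal('<p>Hi</p>', '<p>Bye</p>')
```

This is what buys the font insensitivity: the comparison never sees rendering details, only the object structure.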
WebKing: http://www.parasoft.com/jsp/products/home.jsp?product=WebKing
Runs on Windows NT/2000/XP, Linux, and Solaris 2.7+.
Manually scripted tests. Scripted comparisons allow you to decide
which elements of the resulting web page are significant (e.g. the
load-average field on the login page can be ignored with WebKing;
even with font-independent DOM comparisons, that field would typically
cause a regression mismatch).
Conclusions:
- I would say it's worth looking at Android - hey, it's free, and the
automated capture and compare could include a procedure which
'canonicalizes' the browser output.
- Amongst the commercial systems, I think WebKing holds the most
promise... it is also probably the most complex system to learn: you
need to learn two languages, the scripting language (there is no
automated test capture) and the comparison language.
Ron.