RE: [Inkscape-devel] Test strategy
Either Wiki or CVS would be a suitable place for storing it. There are roughly equal pros and cons both ways. Wiki would be more widely accessible, but CVS would allow a little more flexibility for manipulating the document (e.g., grep, txt2pdf, etc.)
-> Workaround: HTML format (portable, and supports useful things such as images, links, tables...) maintained in CVS and linked from a wiki page?
It may be worthwhile, if we have better ideas for the categorization, to alter the bug tracker to match. Those categories were decided when we first started the project, and probably need updating to match what people actually report bugs about. If anyone has ideas for improvements let me know and I'll apply them.
Yup, this is directly linked to the test process. I'm preparing it (70% done).
A subcategory of this is 'Compliance testing', which validates the features against a spec - in our case the W3C SVG spec. There may be other specs we can test compliance against, although some (like the GNOME Human Interface Guidelines) will generally not be automatable.
Ok, got it. Will take a look at the GNOME HIG to see what can be done there (not my #1 priority, but kept on the list). Already diving into the SVG standards, plus the test tool suite.
There is also performance testing, which can be done in a variety of ways. For example, startup timing, function-level profiling, etc.
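For the startup-timing case, a minimal sketch of a harness (the command shown in the usage comment is a placeholder; a real run would invoke Inkscape, ideally in some batch mode so the process terminates on its own):

```python
import subprocess
import time

def startup_time(cmd, runs=3):
    """Median wall-clock time for cmd to run to completion, in seconds.

    For a GUI application one would instead time until the main window
    appears, or use a non-interactive invocation as a proxy. Taking the
    median over several runs damps warm-cache / scheduler noise.
    """
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        subprocess.run(cmd, check=True, stdout=subprocess.DEVNULL)
        samples.append(time.perf_counter() - t0)
    samples.sort()
    return samples[len(samples) // 2]

# Usage (placeholder command):
# startup_time(["inkscape", "--version"])
```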
Also, we should include memory error testing ala purify and valgrind.
Already in mind for the categories.
Tests: Build
- Linux (Debian / Gentoo / Slackware / Mandrake / Fedora / SuSE / Others)
- Windows (98 / 2000 / XP)
- MacOS (X)
- BSD (Free / Open)
- Other Unices (Solaris / IRIX)
Note that an important condition for distros is what installation options were used. For example, a pure-base SuSE installation will not include developer tools like gcc. Most distros also have installation options which will end up lacking the -devel libs that Inkscape needs to compile. So each of these will require specification as to which installation options must be used during setup.
Argh, sounds complicated. The best thing, I suppose, is to identify a maintainer per distro (à la Mandrake) and ask them to write down the specs for their respective distro.
Note: this should be automated (for an example, see http://public.kitware.com/Dart/HTML/Index.shtml)
Interesting - have you used this system?
Yes, that's why I was thinking about it :) But I'm trying to find something even better and more flexible.
- Open Inkscape, launch a reference editing sequence, export the result as a bitmap, and compare it with a reference bitmap produced by a reference application from the same sequence (Inkscape itself for non-regression, Batik?). Previous sequences can be combined into one in order to save time, but then analysis of the results can become more complex.
Note, I developed a way using motion tracker software to calculate a "percent similarity" rating for an Inkscape-generated PNG against a Batik-generated PNG, which makes the analysis work easier. There's a glitch that'll need to be worked out: the original SVGs contain the description in the image, which throws off the comparison.
Simply separate the analysis of pixels and metadata (I'm quite used to it: I worked on the DICOM standard for 3 years).
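That separation can be sketched as follows. This percent-similarity metric operates only on already-decoded pixel data (decoding the two PNGs is left to an image library), so embedded metadata such as an SVG description cannot affect the result. All names here are illustrative:

```python
def percent_similarity(pixels_a, pixels_b, tolerance=0):
    """Percentage of corresponding pixels that match within a tolerance.

    pixels_a / pixels_b: equal-length sequences of (r, g, b) tuples,
    e.g. obtained by decoding the Inkscape- and Batik-generated PNGs.
    A small nonzero tolerance absorbs minor anti-aliasing differences
    between renderers.
    """
    if len(pixels_a) != len(pixels_b):
        raise ValueError("images must have the same dimensions")
    matches = sum(
        1 for a, b in zip(pixels_a, pixels_b)
        if all(abs(ca - cb) <= tolerance for ca, cb in zip(a, b))
    )
    return 100.0 * matches / len(pixels_a)
```

Comparing decoded pixels rather than raw file bytes is the design point: two PNGs with identical imagery but different metadata chunks still score 100%.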
Verification of robustness - testing for memory leaks (I'm not a specialist in this: I need help)
This is an area where our work as testers is really cut out for us, because there are a ton of memory issues in the codebase, and we need to be able to categorize them in a way that makes them accessible to the developers. Right now there's such an overwhelming number that developers wouldn't know where to start, so if in testing we can triage and prioritize the "top 10" or whatever, it would make the fixup work more tractable.
Right. But in fact, as for unit tests, this part of the verification is hard to manage for the testing team. It's mainly up to the coders to operate this. The testing team can provide help (crash reports, reproducibility of malicious scenarios...) but (in my case for example) we are not code/design wizards able to slalom through the memory/architecture maze.
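To help with that triage, a small sketch that extracts the "definitely lost" figure from a valgrind leak summary, so a batch of runs can be ranked by leak size (this assumes valgrind's usual `definitely lost: N bytes in M blocks` summary line; the regex is the only moving part):

```python
import re

# Matches valgrind memcheck's leak-summary line, e.g.
#   ==1234== definitely lost: 4,096 bytes in 2 blocks
LEAK_RE = re.compile(r"definitely lost: ([\d,]+) bytes in ([\d,]+) blocks")

def definitely_lost(valgrind_output):
    """Return the number of bytes reported 'definitely lost', or 0 if the
    output contains no leak summary. Sorting test runs by this value is a
    cheap way to build a 'worst offenders' list for the developers."""
    m = LEAK_RE.search(valgrind_output)
    return int(m.group(1).replace(",", "")) if m else 0
```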
- Provide a shaker tool (this tool would launch Inkscape and then issue a predefined number of pseudo-random commands, logging each action). The goal is to detect crashes and analyse their conditions from the logs.
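The shaker idea can be sketched as a small replayable fuzz driver (all names here are hypothetical; `execute` stands in for whatever mechanism actually drives Inkscape commands):

```python
import logging
import random

def shake(commands, steps, seed, execute):
    """Issue pseudo-random commands, logging each action before it runs.

    commands: list of command names to choose from (hypothetical set).
    execute:  callback that performs one named command in the application.
    Logging each action *before* executing it means that after a crash,
    the log plus the seed is enough to reproduce the exact sequence.
    """
    rng = random.Random(seed)   # fixed seed => fully reproducible run
    log = []
    for step in range(steps):
        cmd = rng.choice(commands)
        log.append(cmd)
        logging.info("step %d: %s", step, cmd)
        execute(cmd)            # a crash here still leaves the log intact
    return log

# Same seed, same command set => same sequence:
# shake(["undo", "draw", "zoom"], 100, seed=42, execute=drive_inkscape)
```

Seeding the generator is the key design choice: a crash report only needs to include the seed and step count, not the whole log, to be reproducible.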
Another idea along these lines is to fire a variety of signals at it, and detect if it behaves correctly.
I just need a few clarifications so that I understand better.
Again, this plan looks very good; I'm looking forward to this. I have a feeling that there is a *huge* amount of improvement we can gain by taking a systematic approach like this. I think it'll also be very intriguing to figure out how to effectively meld systematized testing with open source practices, and to see how much can be accomplished in doing so.
Ok, I'm pleased if you find the approach interesting. Regards,
Matiphas
On Mon, 19 Jul 2004, Gazal, Geraud (MED) wrote:
HTML maintained in CVS would work fine. We can use a subdir in the inkscape_web module, and link to it from the Documentation page; no need for Wiki to be involved.
Let's think on this some more. I think it is possible to programmatically extract this sort of info from the package for that system. Being able to automatically test the dependency needs for various platforms would be extremely useful (and would be useful to other programs beyond Inkscape), but would take a good bit of work to figure out how to do it. Put this a bit lower in the priorities, and let's think about it a while.
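As a sketch of that extraction idea, assuming Debian-style control-file metadata (other package formats would need their own parsers, and real control files also use continuation lines, which this deliberately ignores):

```python
def parse_depends(control_text, field="Build-Depends"):
    """Extract package names from one field of a Debian control file.

    Version constraints in parentheses and alternatives after '|' are
    stripped; only the primary package name of each entry is recovered.
    """
    for line in control_text.splitlines():
        if line.startswith(field + ":"):
            raw = line.split(":", 1)[1]
            names = []
            for entry in raw.split(","):
                primary = entry.split("|")[0]         # first alternative only
                name = primary.split("(")[0].strip()  # drop version constraint
                if name:
                    names.append(name)
            return names
    return []
```

Cross-referencing such a list against what a given installation profile actually provides would flag missing -devel packages before a build is even attempted.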
I'd like to get more info from you on it. We (at OSDL) have a similar framework that we're developing, called STP. I'd be interested to see how Dart stacks up to it.
There are a variety of signals you can send a program to cause it to do various things. `kill -9` (or kill -KILL) is one example. Some others:
{bryce@...347...} ~/src/Stp (185): kill -l
HUP INT QUIT ILL TRAP ABRT BUS FPE KILL USR1 SEGV USR2 PIPE ALRM TERM STKFLT CHLD CONT STOP TSTP TTIN TTOU URG XCPU XFSZ VTALRM PROF WINCH POLL PWR SYS RTMIN RTMIN+1 RTMIN+2 RTMIN+3 RTMAX-3 RTMAX-2 RTMAX-1 RTMAX
What should Inkscape do when you throw one of those signals at it? What does it do? This could be a simple thing to test.
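A sketch of such a test, assuming a POSIX system (the command under test is a stand-in for launching Inkscape; a real harness would loop this over many signals and flag any unexpected deaths):

```python
import signal
import subprocess
import time

def signal_response(cmd, sig, grace=1.0):
    """Launch cmd, send it a signal, and report how it exited.

    Returns the process return code: by subprocess convention, a negative
    value -N means the process was killed by signal N. Returns None if the
    process survived the signal (it is then killed so the test can move on).
    """
    proc = subprocess.Popen(cmd)
    time.sleep(0.2)                 # give the process a moment to start
    proc.send_signal(sig)
    try:
        return proc.wait(timeout=grace)
    except subprocess.TimeoutExpired:
        proc.kill()                 # still alive: it survived the signal
        proc.wait()
        return None
```

For each signal the expected behaviour (terminate cleanly, ignore, save and exit, etc.) would be recorded, and any mismatch such as a SIGSEGV-style death on a benign signal is a bug report.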
Bryce
participants (2):
- Bryce Harrington
- Gazal, Geraud (MED)