W3C SVG test suite and trunk
Hi,
I've run the W3C SVG 1.1 2nd Edition test suite using both Inkscape 0.48 and trunk. There are quite a few improvements in test compliance with trunk but some tests show regressions:
http://tavmjong.free.fr/INKSCAPE/W3C_SVG_0.49/htmlInkscapeHarness/coords-uni... http://tavmjong.free.fr/INKSCAPE/W3C_SVG_0.49/htmlInkscapeHarness/pservers-g... http://tavmjong.free.fr/INKSCAPE/W3C_SVG_0.49/htmlInkscapeHarness/pservers-g... http://tavmjong.free.fr/INKSCAPE/W3C_SVG_0.49/htmlInkscapeHarness/text-altgl... http://tavmjong.free.fr/INKSCAPE/W3C_SVG_0.49/htmlInkscapeHarness/text-tref-...
Tav
On Tue, 2013-10-01 at 14:56 +0200, Tavmjong Bah wrote:
I've run the W3C SVG 1.1 2nd Edition test suite using both Inkscape 0.48 and trunk. There are quite a few improvements in test compliance with trunk but some tests show regressions:
Thanks Tavmjong,
I fixed an issue with the gradients this morning in r12646. I've now looked at the SVGs for some of those failures; they've improved but are still wrong.
Could you let me know how to run the test suite? I'd like to make one with the revision number.
Martin,
2013/10/1 Martin Owens <doctormo@...400...>:
I fixed an issue with the gradients this morning in r12646. I've now looked at the SVGs for some of those failures; they've improved but are still wrong.
r12648 should render objectBoundingBox patterns and gradients correctly again, so I think some of those regressions should be fixed.
Regards, Krzysztof
On Tue, 2013-10-01 at 15:35 +0200, Krzysztof Kosiński wrote:
r12648 should render objectBoundingBox patterns and gradients correctly again, so I think some of those regressions should be fixed.
The rendering is better... but still some minor problems. I've updated the images.
Tav
2013/10/1 Tavmjong Bah <tavmjong@...8...>:
The rendering is better... but still some minor problems. I've updated the images.
pservers-grad-13-b shows major breakage. The rendering reverts to how it was before r12646 when it's nudged in any direction.
I have no idea what's going on at the moment, and gdb shows me some completely nonsensical output - the results of the pretty printer are different from _item_bbox and the same bbox passed down to pattern_new, even though print /r shows that they are identical...
Regards, Krzysztof
On 2013-10-01 17:38 +0200, Krzysztof Kosiński wrote:
pservers-grad-13-b shows major breakage. The rendering reverts to how it was before r12646 when it's nudged in any direction.
JFYI - this also happens with the oldest build I have archived on my current laptop, r10795: nudging the group breaks gradient rendering.
https://www.dropbox.com/sh/qg4zykdc0ljptop/mhUXGIX1uc#/
(Screenshots with r10795 and r12648 on-load, and nudged)
On 2013-10-01 18:54 +0200, su_v wrote:
JFYI - this also happens with the oldest build I have archived on my current laptop, r10795: nudging the group breaks gradient rendering.
https://www.dropbox.com/sh/qg4zykdc0ljptop/mhUXGIX1uc#/
(Screenshots with r10795 and r12648 on-load, and nudged)
Nevermind the tests with older trunk revisions - the broken rendering of the gradients when nudging the group in any direction actually already happens with current stable 0.48.4, and with 0.47 as well.
(Screenshots of 0.48.4 and 0.47 added to dropbox folder)
2013/10/1 Tavmjong Bah <tavmjong@...8...>:
The rendering is better... but still some minor problems. I've updated the images.
In r12649, pservers-grad-13-b should render correctly.
There's still the issue of the gradients going awry when the object is nudged. I think this is caused by a faulty conversion to a user-space gradient - I vaguely remember seeing some code related to this.
pservers-grad-21-b is rendered wrong, because position attributes are not inherited. Inheritance for other attributes is implemented in sp-gradient.cpp, so fixing this test case isn't that hard. su_v - can you verify whether there is a bug report for this?
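The inheritance in question works through xlink:href: per SVG 1.1, positional attributes (x1/y1/x2/y2 on a linear gradient) that are absent on a referencing gradient should be taken from the referenced one. An illustrative fragment (element ids here are made up):

```xml
<linearGradient id="base" x1="0" y1="0" x2="1" y2="0">
  <stop offset="0" stop-color="blue"/>
  <stop offset="1" stop-color="red"/>
</linearGradient>
<!-- "derived" declares no x1/y1/x2/y2 and no stops of its own; per
     SVG 1.1 both should be inherited from #base via xlink:href. -->
<linearGradient id="derived" xlink:href="#base"/>
```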
I haven't yet analyzed the other test cases.
Regards, Krzysztof
2013/10/2 Krzysztof Kosiński <tweenk.pl@...400...>:
There's still the issue of the gradients going awry when the object is nudged. I think this is caused by a faulty conversion to a user-space gradient - I vaguely remember seeing some code related to this.
It turns out the conversion to a user-space gradient is OK - the problem is something related to clones.
If I unlink the cloned rectangles, the gradients on them behave correctly.
Regards, Krzysztof
On 2013-10-02 01:39 +0200, Krzysztof Kosiński wrote:
pservers-grad-21-b is rendered wrong, because position attributes are not inherited. Inheritance for other attributes is implemented in sp-gradient.cpp, so fixing this test case isn't that hard. su_v - can you verify whether there is a bug report for this?
The only one I found so far with regard to inheritance of gradient attributes:
- Bug #1167689 “Referenced linearGradient is not rendered properly” https://bugs.launchpad.net/inkscape/+bug/1167689
On Wed, 2013-10-02 at 01:39 +0200, Krzysztof Kosiński wrote:
In r12649, pservers-grad-13-b should render correctly.
It does, I've updated the images at:
http://tavmjong.free.fr/INKSCAPE/W3C_SVG_0.49/htmlInkscapeHarness/pservers-g...
Thanks,
Tav
Hi! I've been looking at the GSoC test suite and, after solving a few problems, I can run the tests now. Next I have to find a way to update the tests.
I've set a personal branch in my launchpad account with the changes I made to the original code. https://code.launchpad.net/~neandertalspeople/+junk/inkscape-testsuite
After updating the tests I'll try to find a way to automate them and run them regularly. What do you think about that? Any ideas about how that could be done? Maybe it is possible to set some kind of trigger when there is a commit to Inkscape trunk on Launchpad?
Guiu Rocafort
_______________________________________________
Inkscape-devel mailing list
Inkscape-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/inkscape-devel
On Wed, 2013-10-09 at 00:51 +0200, Guiu Rocafort wrote:
After updating the tests I'll try to find a way to automate them and run them regularly. What do you think about that? Any ideas about how that could be done? Maybe it is possible to set some kind of trigger when there is a commit to Inkscape trunk on Launchpad?
I was thinking of doing something like: create a deb for the test suite, when the deb gets built, it consumes the inkscape package, runs the tests and outputs the results and any logs etc.
Then to make use of the test results (i.e. on the new website) we install the package and read the files and the logs for useful display.
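That collection step might look something like the sketch below; all paths and names here are illustrative assumptions, not part of the existing build:

```python
import shutil
from pathlib import Path

def collect_pngs(src, dest):
    """Copy every PNG a test run produced into one directory.

    A packaging step could then install that directory (alongside the
    logs) for display. Paths are assumptions for illustration only.
    """
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    for png in sorted(Path(src).rglob("*.png")):
        shutil.copy(png, dest / png.name)
        copied.append(png.name)
    return copied
```

A deb rule would then just install the destination directory into the results package.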
The only potential non-technical issue is using Canonical servers to run our tests. Would they complain about a deb build like that?
Martin,
We could also consider doing this the other way round and add the svg compliance tests into the test suite in inkscape trunk. The trunk ppa already runs tests daily or whenever trunk changes so we wouldn't need to add any new packages.
AV
On Wed, 2013-10-09 at 02:08 +0100, Alex Valavanis wrote:
We could also consider doing this the other way round and add the svg compliance tests into the test suite in inkscape trunk. The trunk ppa already runs tests daily or whenever trunk changes so we wouldn't need to add any new packages.
I'm actually interested in seeing those test results if possible. I'd like to know how easy it is to gather the information for project maintenance use.
If we formalise the svg tests, then we need to be able to get at the resulting png files too. Hopefully those can be saved somehow when the tests run.
Martin,
On Tue, 2013-10-08 at 22:18 -0400, Martin Owens wrote:
I'm actually interested in seeing those test results if possible. I'd like to know how easy it is to gather the information for project maintenance use.
+1, I wasn't aware that there was testing being done other than that the compilation was successful.
Tav
On 9 October 2013 06:46, Tavmjong Bah <tavmjong@...8...> wrote:
+1, I wasn't aware that there was testing being done other than that the compilation was successful.
Actually, yes, the PPA package builder runs "make check" at the end of each compilation and reports a build-failure if any of the tests fail (or if known-failures succeed). At the moment, "make check" runs the following:
1. C++ unit tests, using cxxtest. (Some of these are known failures that need fixing.)
2. Unit tests on the extension scripts.
3. A translations test.
If the automated SVG compliance tests are added to the "TESTS" list in src/Makefile.am, then the PPA builder will take care of it.
As for seeing the test output, I think that should also be fairly straightforward. We'd just need to install the images that are generated in the PPA build into a new binary target called something like "inkscape-trunk-svg-test-results".
This will all then result in completely "hands-free" testing whenever trunk is updated. For best results, we should really try to separate the SVG tests into known-success and known-failure lists that can be put into separate good/bad lists for "make check". The automated builder will then complain at us whenever our SVG compliance level changes.
AV
On Thu, 2013-10-10 at 15:12 +0100, Alex Valavanis wrote:
This will all then result in completely "hands-free" testing whenever trunk is updated. For best results, we should really try to separate the SVG tests into known-success and known-failure lists that can be put into separate good/bad lists for "make check". The automated builder will then complain at us whenever our SVG compliance level changes.
Looking at the structure of our SVG tests, they do appear to have good and bad result directories. Marking a test as known bad is just a matter of filling the relevant directory with specifically named files, and likewise for known good.
Martin,
On 10 October 2013 16:02, Martin Owens <doctormo@...400...> wrote:
Looking at the structure of our svg tests, they do appear to have good and bad result directories. Known bad is just a matter of filling the specific directory with specifically named files and likewise known good.
OK, but I mean there need to be two separate executable scripts for testing, so we can have something like the following in the Makefile. I haven't looked at the SVG test infrastructure yet, so I don't know if this is already available:
# List of all tests to be run
TESTS = $(check_PROGRAMS) \
        ../share/extensions/test/run-all-extension-tests \
        run-svg-tests-good \
        run-svg-tests-bad

# List of all tests that are known to fail
XFAIL_TESTS = $(check_PROGRAMS) \
        run-svg-tests-bad
AV
On Thu, 2013-10-10 at 16:14 +0100, Alex Valavanis wrote:
two separate executable scripts
Not yet. And while the python is pretty hairy (needs some code review) it should be possible to make it do what you want without too many issues.
Guiu, Tav; what do you think? Would it be easy enough to modify and would you like to patch it or should I?
Martin,
Well, I guess the easiest thing would be to organise the svg input files for the rendering tests into two separate subfolders in known-pass/testcases and known-fail/testcases. We can then just tell the test runner to look in the appropriate folder:
# cat run-svg-tests-good
runtests.py --directory=known-pass

# cat run-svg-tests-bad
runtests.py --directory=known-fail
On 2013-10-10 19:00, Martin Owens wrote:
It's not completely clear to me what you need exactly (just a change/no change answer, a number of failed/succeeded tests, ...?), but there is currently no separation between good and bad tests. In particular, a test could have (and some do have) both known good and bad references related to it.
However, you can simply pass a list of tests to perform on the command line if you want (or use a directory, as suggested by Alex). Any results are kept track of in teststatus.json. This file gets read and updated on each run (rather than replaced with new results), so it would be possible to run only the tests that failed/succeeded last time (based on a command-line option), if that's what you need. If I can be of any help (since I actually wrote that dense bit of Python code), just ask.
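The "rerun only what failed last time" idea could sit on top of teststatus.json along these lines. The schema assumed here (a mapping from test name to a record carrying a "status" field) is an illustrative guess, not the file's actual format:

```python
import json

def select_tests(path="teststatus.json", status="fail"):
    """Return the names of tests whose last recorded status matches.

    Assumes teststatus.json maps test names to records with a "status"
    field; the real schema may differ.
    """
    with open(path) as f:
        records = json.load(f)
    return sorted(name for name, rec in records.items()
                  if rec.get("status") == status)
```

The runner would then be invoked with just that list instead of the full suite.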
2013/10/10 Jasper van de Gronde <th.v.d.gronde@...528...>:
It's not completely clear to me what you need exactly (just a change/no change answer, a number of failed/succeeded tests, ...?), but there is currently no separation between good and bad tests. In particular, a test could have (and some do have) both known good and bad references related to it.
Makefile.am contains two variables, TESTS and XFAIL_TESTS. In order for a "make check" to succeed, the programs in the TESTS variable must exit normally (exit status zero), while the programs in XFAIL_TESTS must exit with an error (exit status nonzero). http://www.gnu.org/software/automake/manual/html_node/Scripts_002dbased-Test...
Therefore, we need to know which tests fail at present and which don't, as well as a way to automatically tell whether a test fails or passes. "make check" will then alert us when Inkscape's output changes.
Regards, Krzysztof
On 10 October 2013 19:27, Jasper van de Gronde <th.v.d.gronde@...528...> wrote:
It's not completely clear to me what you need exactly (just a change/no change answer, a number of failed/succeeded tests, ...?), but there is currently no separation between good and bad tests. In particular, a test could have (and some do have) both known good and bad references related to it.
Basically we need scripts that return a 0 exit code if all rendering tests succeed and non-zero if something goes wrong. In this context, a "pass" means that everything in all included rendering tests works and "fail" means that something in any of the included tests didn't work. In pseudo-code:
# Approximate behaviour of runtests.py...
num_errors = 0  # number of errors found

for test in test_list:
    test_succeeded = run_rendering_test(test)
    if not test_succeeded:
        num_errors += 1

sys.exit(num_errors)
Again, I'm working in the dark a bit, because I haven't tried running the tests yet so I might be talking bollocks!
On Wed, 2013-10-09 at 00:51 +0200, Guiu Rocafort wrote:
Hi! I've been looking at the GSoC test suite and, after solving a few problems, I can run the tests now. Next I have to find a way to update the tests.
I was able to run the tests without too much trouble... however I did not have a set of "good" PNGs to test against... so I gave up and went back to using the latest SVG 1.1 test suite.
To update the tests, I think you just need to put the new SVG files into the directory with the existing SVG files and then create reference PNGs in the appropriate files (known good rendering, known bad rendering). You can't just use the reference PNG files from the SVG test suite for doing automated pixel-by-pixel comparison as the results depend on the font used. (The SVG2 test suite will use a WOFF font, I think, to avoid this problem.) The existing files from the SVG test suite should be replaced by the files from the latest version of the SVG 1.1 2nd Ed test suite.
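The pixel-by-pixel comparison mentioned above boils down to something like the sketch below, which operates on already-decoded flat lists of RGBA tuples (decoding the PNGs, e.g. with an imaging library, is assumed to happen elsewhere). The optional per-channel tolerance is illustrative; it could absorb minor anti-aliasing differences such as those caused by fonts:

```python
def pixels_match(pixels_a, pixels_b, tolerance=0):
    """Compare two flat sequences of RGBA tuples channel by channel.

    With tolerance=0 this is an exact match; a small nonzero tolerance
    allows per-channel differences up to that bound.
    """
    if len(pixels_a) != len(pixels_b):
        return False
    return all(abs(ca - cb) <= tolerance
               for pa, pb in zip(pixels_a, pixels_b)
               for ca, cb in zip(pa, pb))
```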
I've set a personal branch in my launchpad account with the changes I made to the original code. https://code.launchpad.net/~neandertalspeople/+junk/inkscape-testsuite
After updating the tests I'll try to find a way to automate them and run them regularly. What do you think about that? Any ideas about how that could be done? Maybe it is possible to set some kind of trigger when there is a commit to Inkscape trunk on Launchpad?
This would be awesome.
Tav
Hi.
I would like to know how the tests are run. I see that the last update on this page ( http://wiki.inkscape.org/wiki/index.php/SVG_Test_Suite_Compliance ) is from 2008, so it does not reflect the current SVG support in Inkscape. It would be good to re-do all the tests so there is more precise documentation about what is currently implemented and what is still missing.
On Tue, 2013-10-01 at 15:36 +0200, Guiu Rocafort wrote:
At one time we had automated testing set up (it was a GSOC project). The source code can be found at:
http://sourceforge.net/p/inkscape/code/HEAD/tree/
(I don't think it was brought over to Bazaar.)
The SVG 1.1 test suite needs to be updated with the latest tests that can be found at:
http://www.w3.org/Graphics/SVG/WG/wiki/Test_Suite_Overview#SVG_1.1_Second_Ed...
(Tests have been added, and many tests were changed so that the color red is not shown when a test passes.)
Tav
participants (7)
- Alex Valavanis
- Guiu Rocafort
- Jasper van de Gronde
- Krzysztof Kosiński
- Martin Owens
- su_v
- Tavmjong Bah