Currently Inkscape renders the W3C Gaussian blur test (filters-gauss-01-b.svg) visibly differently from the reference PNG provided by the W3C. However, from what I can tell it might not be Inkscape's problem... (Or is it?)
This spreadsheet: http://home.hccnet.nl/th.v.d.gronde/filters-gauss-01-b.ods shows some graphs of data extracted from the reference PNG and Inkscape's output, as well as some artificially created data. The top and bottom graphs correspond to lines extracted at y=220 and y=320, and they show that Inkscape's output matches pretty well to the reference PNG. I've only included the green channel of the reference PNG, but the other channels match as well.
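For reference, extracting a scanline's green channel from the two renderings can be sketched as below (a minimal sketch using Pillow; the filenames are placeholders, not the actual test-suite paths):

```python
from PIL import Image

def green_scanline(path, y):
    """Return the green channel of scanline y as a list of 0-255 values."""
    img = Image.open(path).convert("RGB")
    return [img.getpixel((x, y))[1] for x in range(img.size[0])]

# e.g. compare the reference and Inkscape renderings at y=270
# (placeholder filenames):
# ref = green_scanline("filters-gauss-01-b-ref.png", 270)
# out = green_scanline("filters-gauss-01-b-inkscape.png", 270)
```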
The interesting bit is the green channel in the middle graph (corresponding to y=270). Legend:
- the continuous green line is Inkscape's output
- the dashed green line is the W3C reference PNG
- the orange line is the result of alpha compositing the green channels from y=220 and y=320 (corresponding to only red and only yellow)
- the brown line is the result of blurring a 45-pixel-wide stretch of "red" roughly corresponding to the visible part of the red rectangle (only the green channel, so all 255 except for the "red" part, which is 0)

As can be seen, Inkscape's output matches the artificially generated data almost perfectly, and in this case it apparently doesn't matter much whether the filter is applied to each rectangle separately or to the combined image (which is NOT the general case).
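The two artificial curves boil down to two operations: a discrete 1-D Gaussian blur of one channel, and per-pixel source-over compositing. A rough sketch of both (the geometry and the stdDeviation of 10 are assumed values for illustration, not taken from the test file):

```python
import math

def gaussian_kernel(std_dev, radius=None):
    """Discrete, normalised 1-D Gaussian kernel."""
    if radius is None:
        radius = int(math.ceil(3 * std_dev))
    k = [math.exp(-(i * i) / (2 * std_dev * std_dev))
         for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur_line(line, std_dev):
    """Blur one scanline (e.g. the green channel), clamping at the edges."""
    kernel = gaussian_kernel(std_dev)
    r = len(kernel) // 2
    n = len(line)
    out = []
    for x in range(n):
        acc = 0.0
        for j, w in enumerate(kernel):
            acc += w * line[min(max(x + j - r, 0), n - 1)]
        out.append(acc)
    return out

def source_over(top, top_alpha, bottom):
    """Per-pixel source-over compositing of one (non-premultiplied) channel."""
    return [t * a + b * (1 - a) for t, a, b in zip(top, top_alpha, bottom)]

# e.g. the "brown" curve: a 45-pixel stretch of green=0 on a green=255
# background, blurred (positions and stdDeviation are assumptions):
line = [255.0] * 200
line[80:125] = [0.0] * 45
brown = blur_line(line, 10)
```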
I've applied for a Bugzilla account at the W3C to file a bug, but in the meantime: did I make some mistake in my analysis, or is this really a bug in the W3C test? Also, what on earth could cause such a weird mismatch?
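One possible source of differences between renderers: for larger stdDeviation values the SVG 1.1 spec allows approximating the true Gaussian with three successive box blurs, with the box size derived from the standard deviation. Whether a given renderer takes that path, and how it handles edges and even box sizes, could all differ. A sketch of the approximation (the box-size formula is from the spec; the rest is a simplified odd-size-only illustration):

```python
import math

def box_size(std_dev):
    # Box size d from the SVG 1.1 spec's note on approximating
    # feGaussianBlur with three box blurs.
    return int(math.floor(std_dev * 3 * math.sqrt(2 * math.pi) / 4 + 0.5))

def box_blur(line, d):
    # Simple centred moving average (odd d only), clamping at the edges.
    r = d // 2
    n = len(line)
    return [sum(line[min(max(i, 0), n - 1)]
                for i in range(x - r, x + r + 1)) / d
            for x in range(n)]

def triple_box_blur(line, std_dev):
    d = box_size(std_dev)
    if d % 2 == 0:
        d += 1  # the spec handles even d differently; keep the sketch simple
    for _ in range(3):
        line = box_blur(line, d)
    return line
```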
And in an interesting twist, Batik gives more or less the same result as the W3C reference PNG. I've taken a brief look at their blur code, but without spending hours (if not days) delving into it I really can't tell what causes the difference in behaviour.
BTW, the strange fluctuations in the left part of the bottom graph are due to text appearing in the reference PNG (which has been removed from Inkscape's tests to make it easier to compare images automatically).
Just got this from #svg on irc.freenode.net:
heycam: that batik gives the same result as the reference slide isn't surprising, since it was used to generate them
heycam: but as for the actual issue, looks like it needs some in-depth analysis
On Tue, Dec 2, 2008 at 9:42 PM, Jasper van de Gronde <th.v.d.gronde@...528...> wrote:
Andy Fitzsimon wrote:
Just got this from #svg on irc.freenode.net:
heycam: that batik gives the same result as the reference slide isn't surprising, since it was used to generate them
That at least explains that bit :)
heycam: but as for the actual issue, looks like it needs some in-depth analysis
Thanks for mailing this. I filed a bug in the SVG Issue tracker (bug 6261), but haven't heard anything so far (not terribly surprising, as it has only been a few days).
participants (2)
- Andy Fitzsimon
- Jasper van de Gronde