
Previously, Inkscape didn't check whether there was enough free memory for its pixel buffers and could crash without warning when memory ran out, e.g. upon zooming in. This problem became much worse after implementing Gaussian blur, because rendering blurred objects at high zooms may require a pixel buffer much bigger than the visible canvas. Now this situation is handled more gracefully: if a display operation requires more memory than is available, or more than 100Mb (which corresponds to a 5000x5000 pixel buffer), it is skipped. This may result in blurred objects "disappearing" at high zooms. It is purely a display issue, however, and never corrupts data; just zoom out (or reduce the blur radius) and the missing object will show up OK.
The "100Mb" magic number was chosen to prevent overly large allocations which, even if they succeed, bog the system down to an almost unusable state. I realize it's far from perfect, but at least it's a solution that kinda works for a "typical" system with at least 256Mb of memory. Any suggestions on how to handle this better are welcome.

bulia byak wrote:
> ... This problem became much worse after implementing Gaussian blur, because rendering blurred objects at high zooms may require a pixel buffer much bigger than the visible canvas...
This seems odd; why would you ever need a pixel buffer *much* larger than the visible canvas? Is it because the entire object has to be rendered? If so, can something be done about that? (It might also speed things up.)
> The "100Mb" magic number was chosen to prevent overly large allocations which, even if they succeed, bog the system down to an almost unusable state. I realize it's far from perfect, but at least it's a solution that kinda works for a "typical" system with at least 256Mb of memory. Any suggestions on how to handle this better are welcome.
For rendering purposes this seems okay, but is this also enabled when exporting? Personally I wouldn't mind waiting a long time when exporting (not that I am likely to actually have documents that would approach this limit, but still), and I've got 1GB of memory, so I would rather see Inkscape just try to do the best it can.
And, assuming it's not possible to remedy the situation, when rendering it might be a good idea to give some kind of visual indication that part of the drawing was skipped (so the user isn't left wondering what the hell happened and possibly even tries to apply an effect twice, for example).

On 10/15/06, Jasper van de Gronde <th.v.d.gronde@...528...> wrote:
> This seems odd; why would you ever need a pixel buffer *much* larger than the visible canvas? Is it because the entire object has to be rendered? If so, can something be done about that? (It might also speed things up.)
Not ALL of the object, of course. But to render blur correctly, you must have your canvas enlarged by the blur radius on all sides, because those margins affect what you see. And the blur radius at high zooms can easily run into thousands of screen pixels.
> For rendering purposes this seems okay, but is this also enabled when exporting? Personally I wouldn't mind waiting a long time when exporting (not that I am likely to actually have documents that would approach this limit, but still), and I've got 1GB of memory, so I would rather see Inkscape just try to do the best it can.
Yes, that makes sense. I'll remove the 100Mb pre-allocation cutoff for command-line operations (but not for GUI export, because the entire point of that cutoff is to prevent the GUI from being bogged down).
> And, assuming it's not possible to remedy the situation, when rendering it might be a good idea to create some kind of visual indication of having skipped some part (so the user isn't left wondering what the hell happened and possibly even tries to apply an effect twice for example).
Right now you get a warning on the console when this happens. I know this is not very Windows-friendly, but I'd be reluctant to provide some sort of GUI indication for what is essentially an exceptional condition.
participants (2)
- bulia byak
- Jasper van de Gronde