On Wed, 10 Nov 2004 17:41:47 +1100, Peter Moulder
<peter.moulder@...38...> wrote:
> My first Inkscape task after Friday:
> If a "ghost spike" is detected (i.e. if the curve goes too far from
> the data points), then use a sharp corner instead of a smooth node.
Good, that's what is needed.
> I wouldn't expect this to have much effect on the cases being
> discussed here.
Yes, but otherwise it's important.
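To be sure we mean the same thing by the fix, here is roughly how I picture the ghost-spike check. This is only a sketch: the helper names, the Point type and the 2x-tolerance threshold are my assumptions, not the actual fitting code.

    /* Sketch only: names and the 2x-tolerance threshold are assumptions,
     * not the actual Inkscape code. */
    #include <cmath>
    #include <cstddef>
    #include <vector>

    struct Point { double x, y; };

    static double point_distance(Point const &a, Point const &b) {
        return std::hypot(a.x - b.x, a.y - b.y);
    }

    /* Evaluate a cubic Bezier with control points b[0..3] at parameter t. */
    static Point bezier_at(Point const b[4], double t) {
        double const s = 1.0 - t;
        Point p;
        p.x = s*s*s*b[0].x + 3*s*s*t*b[1].x + 3*s*t*t*b[2].x + t*t*t*b[3].x;
        p.y = s*s*s*b[0].y + 3*s*s*t*b[1].y + 3*s*t*t*b[2].y + t*t*t*b[3].y;
        return p;
    }

    /* True if the fitted segment strays too far from the sampled points
     * (a "ghost spike"); the caller would then finish the segment with a
     * sharp (cusp) node instead of a smooth one. */
    static bool has_ghost_spike(Point const bez[4],
                                std::vector<Point> const &data,
                                std::vector<double> const &u, /* parameter of each data point */
                                double tolerance)
    {
        for (std::size_t i = 0; i < data.size(); ++i) {
            if (point_distance(bezier_at(bez, u[i]), data[i]) > 2.0 * tolerance) {
                return true;
            }
        }
        return false;
    }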
> After that, there are a couple of possible freehand changes, and also
> three non-freehand patches, which I'll discuss first because they're
> more ready, and in the hope of comments as to what stuff's worth doing
> when.
> - A simple patch adding a g_return_val_if_fail to something in
>   sp-object.cpp. There are two possible conditions: one equivalent to
>   what would cause a crash now anyway, or a tighter but simpler (and
>   less random-looking) test. I'd like to check callers to see whether
>   I can use the simpler tighter test. And get mental's opinion too.
> - A patch whose correctness one can be fairly confident of but whose
>   benefits are relatively minor:
>   - Rejecting some strings that have a valid value as a prefix, e.g.
>     rejecting foo=nonebglx rather than treating it as foo=none.
>   - Minor efficiency gain through using memcmp instead of strcmp
>     (though a reviewer may prefer to keep strcmp for simpler-looking
>     code).
Sounds OK to me, but perhaps still worth delaying until after 0.40.
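For reference, the kind of exact-match check I imagine for the prefix problem looks roughly like this; the helper name is made up and the real patch may well differ.

    /* Illustrative only; the helper name is an assumption.  Given a
     * property value as pointer + length (as the parser has after
     * splitting "prop:value;"), require an exact match so that e.g.
     * "nonebglx" is rejected rather than read as "none".  memcmp is
     * safe here because the length is checked first. */
    #include <cstddef>
    #include <cstring>

    static bool value_equals(char const *val, std::size_t len, char const *keyword) {
        std::size_t const klen = std::strlen(keyword);
        return len == klen && std::memcmp(val, keyword, klen) == 0;
    }

Something like value_equals(val, len, "none") then accepts "none" but rejects "nonebglx".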
> - A patch that makes our style string parsing more conformant, both in
>   what we accept and what we reject. The patch more or less rewrites
>   sp_style_merge_from_style_string, so its correctness is harder to
>   verify. We could just fix where we currently reject style strings
>   that have lots of whitespace (caused by confusion as to whether
>   trim_space sets *right to a length or to a final index) and delay
>   applying the rewrite until after 0.40, so that it gets more testing.
Good, certainly after 0.40
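As an aside, the whitespace part seems to come down to making the trim helper unambiguous about what it reports; a rough sketch of the idea (the signature is an assumption, not the real trim_space):

    /* Sketch of the idea only.  Returning the trimmed LENGTH (rather
     * than a final index) removes the off-by-one ambiguity, and lets
     * "  fill : none ;" parse the same as "fill:none;". */
    #include <cctype>
    #include <cstddef>

    /* *start receives the first non-space character; the return value
     * is the trimmed length (0 if the string is all whitespace). */
    static std::size_t trim_space(char const *s, std::size_t len, char const **start) {
        std::size_t begin = 0, end = len;
        while (begin < end && std::isspace(static_cast<unsigned char>(s[begin]))) ++begin;
        while (end > begin && std::isspace(static_cast<unsigned char>(s[end - 1]))) --end;
        *start = s + begin;
        return end - begin;
    }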
> Back to freehand:
> - I have a fairly short patch that affects only the first segment of
>   the curve (so it's helpful for short / smooth curves but has no
>   significant effect for the signature example). It will improve
>   fairly straight lines, but worsen curves that start with a sharp
>   bend: more specifically, curves whose initial radius of curvature is
>   comparable to or smaller than the tolerance.
Hmm, I'm afraid that initial small curves were part of the problem
Bryce experienced. This needs to be tested.
> - I have a larger, unfinished patch that delays committing a curve
>   until we have two (or more) curves' worth of data points. This
>   patch would approximately triple the CPU cost of freehand drawing,
>   in exchange for a small improvement in curve fidelity. That
>   improvement can be built on if we use more sophisticated maths for
>   curve estimation (for yet more CPU cost, as well as, of course,
>   development effort).
Hard to say, needs testing, after 0.40 I think
> I really think the ideal tolerance would be a function of pointer
> speed. (I'm not commenting on how worthwhile it would be to implement
> this.) The thinking is that if one is moving fast then clearly[*1]
> one is not paying much attention to the exact position of the pointer,
> so we can use a larger tolerance and get smooth curves (assuming
> smoothness is considered good). If moving slowly, then it may be that
> one cares about the position, or maybe the user's just resting the
> pointing device, so we shouldn't make the tolerance too small, but at
> least it needn't make allowance for the broad-strokes case.
That's a good idea. Actually it must depend on the intended radius of
curvature that the user has in mind, but since that is hard to
estimate, the speed seems to be a good approximation. Though it's not
ideal: for example, when I'm trying to draw a good straight line I may
move the pointer more slowly than when I'm trying to do a smooth
sweeping curve. So if you can implement this one I'd certainly want to
test it out.
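To make the speed idea concrete, something along these lines is what a first attempt might look like; the constants and the speed measure are placeholders to tune during testing, not the result of any experiment.

    /* Sketch only; the constants and the speed measure are assumptions.
     * Fast motion -> larger tolerance (smoother curve); slow motion
     * keeps a reasonable floor so a resting pointer doesn't add noise. */
    #include <algorithm>
    #include <cmath>

    struct Point { double x, y; };

    static double speed_dependent_tolerance(Point const &prev, Point const &cur,
                                             double dt_ms /* time between events */)
    {
        double const dist  = std::hypot(cur.x - prev.x, cur.y - prev.y);
        double const speed = dist / std::max(dt_ms, 1.0);   /* px per ms */

        double const tol_min = 1.0;  /* assumed floor, px */
        double const tol_max = 8.0;  /* assumed ceiling, px */
        double const gain    = 4.0;  /* extra px of tolerance per px/ms */

        double const tol = tol_min + gain * speed;
        return std::max(tol_min, std::min(tol_max, tol));
    }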