On Mon, Jun 27, 2011 at 4:55 AM, Jon Cruz <jon@...18...> wrote:
On Jun 26, 2011, at 8:44 PM, Campbell Barton wrote:
Small update: I built Inkscape with clang and it works, in that it opens and a quick test went fine (drawing and moving selections around). I had to comment out some chunks of code, though, so obviously that can't be committed.
So my question is: would you accept patches to replace variable-length arrays with heap allocations (val = new Type[size] ... delete[] val)? I can send in patches for this; there are only five or so.
For the single-dimension arrays this is straightforward, which leaves only ~3 multi-dimensional ones that need to be dealt with.
That sounds good.
However, I'm not sure I like that solution. If code requires an explicit delete[] to be called, there is always the potential to miss the call and end up leaking.
Of course, I also dislike abuse of autopointers, so there might be something to look at and find the balance for.
The first example that springs to mind is a std::string. That is a handy little class that manages its own memory, and cleans it up when it goes out of scope. So even if one uses multiple early returns (evil, IMHO), or has an exception thrown, etc., cleanup should occur as needed.
Would a std::vector<T> be an appropriate replacement in the areas you're seeing things?
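To make the RAII point concrete, here is a minimal sketch (the function and its contents are hypothetical, not from Inkscape) showing why a std::vector, unlike new Type[n], cannot leak on an early return or a thrown exception: the destructor frees the storage when the vector goes out of scope.

```cpp
#include <vector>
#include <cstddef>

// Hypothetical example: a scratch buffer whose lifetime is managed
// automatically. No delete[] is needed, so early returns (or
// exceptions) cannot leak the allocation.
int sum_first_n(std::size_t n) {
    std::vector<int> buf(n, 0);   // zero-initialized; freed on scope exit
    for (std::size_t i = 0; i < n; ++i)
        buf[i] = static_cast<int>(i);
    if (n == 0)
        return 0;                 // early return: buf is still cleaned up
    int total = 0;
    for (int v : buf)
        total += v;
    return total;                 // sum_first_n(5) yields 0+1+2+3+4 = 10
}
```

With a raw new int[n], both return paths would need a matching delete[]; the vector version has no such obligation.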
I have to say I don't know C++ well enough to judge. Here's a simple example of one of the cases, with a three-line function body:
./src/libcola/shortest_paths.cpp
dijkstra(...) {
    Node vs[n];
    dijkstra_init(vs, es, eweights);
    dijkstra(s, n, vs, d);
}
So not freeing in this case is really not a worry, though some day the function may become more complicated, which is still an argument against replacing the VLAs with plain allocations. There are tools to detect memory leaks, of course; I use valgrind frequently. But valgrind only finds a leak if someone actually exercises that function, so errors can still go unnoticed.
Would std::vector<T> be acceptable here?
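For what it's worth, the change would be small. The following is only a sketch with hypothetical stand-ins for libcola's Node type and helpers (the real signatures in shortest_paths.cpp differ): since std::vector guarantees contiguous storage, vs.data() (or &vs[0]) can be handed to the existing pointer-based functions unchanged, and the memory is released automatically when vs goes out of scope.

```cpp
#include <vector>
#include <cstddef>

// Hypothetical stand-ins for libcola's types, just to show the shape
// of the VLA -> vector change; not the real shortest_paths.cpp API.
struct Node { double d; };

static void init_nodes(Node* vs, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        vs[i].d = 0.0;
}

// Before (VLA, not standard C++):  Node vs[n];
// After: contiguous, automatically freed storage.
double run(std::size_t n) {
    std::vector<Node> vs(n);       // elements value-initialized
    init_nodes(vs.data(), n);      // existing Node* helpers work as-is
    return n ? vs[0].d : 0.0;
}
```

The call sites stay the same apart from the declaration and the .data() at the pointer boundary, so it seems like a low-risk substitution, if that style is acceptable to you.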