Thursday, 31 May 2007

Making use of Inkscape

I'm spending plenty of time developing a drawing program, so I decided to draw something for a change.

This drawing is based on a photograph, but only to get the anatomy correct. I started with ordinary A4 copier paper and a pencil for the first sketches. For the outline drawing I used heavier A4 drawing paper, on which I first drew the subject in pencil and then inked it with a brush pen and a 0.20 mm pen.

Scanning was a bit of a problem, as my scanner seems to be breaking down. The scanned image was full of horizontal coloured lines. Luckily all those lines were bright colours, so I was able to open the image in GIMP, take a CMYK separation of the image and discard all but the K (black) channel.

From this point on, I used Inkscape for the work. Tracing the inked line drawing gave me a nice vector representation. This was actually surprisingly simple; in GIMP - which I have used for drawing, too - separating these lines from the background would have taken many steps and still wouldn't have yielded such well-defined lines.

The new paint bucket tool is a godsend for colouring drawings. Filling areas of the drawing with the basic colour is only a mouse click away. With default settings, the filled area is not quite correct, as white gaps show up between the area and the bounding line, but this is easily remedied by increasing the value of "Grow / shrink by".

For adding shadows, I got to use the new features I had created myself (Yay!). For this drawing I decided that the lighting should be quite warm, like summer daylight, so I picked a cold colour for the shadows - dark blue, to be exact. Simple alpha blending with this kind of colour leaves a bit to be desired, so I used feBlend in multiply mode. Each shadow shape has an opacity of 20% instead of the whole shadow layer having that opacity - this way, stacking shadows on top of each other creates darker shadows.

The few brighter spots are also created using feBlend, this time in screen mode. I feel that bright spots are really easy to overdo, so they are quite subtle here - some maybe even too subtle.

Well, for the bubbles and background I used plenty of gradients - probably not hard to tell ;) The bubbles are all clones, so it was easy to edit their appearance whenever they didn't look quite right.

The pier planks are all unique, though the tops were created by duplicating one plank. Also, lots of guidelines were used here. This is a place where I wished the 3D tool were a usable tool instead of still being in the planning stage.

And yes, the result:

I was also thinking that I might create some kind of tutorial about this. I did take screenshots and saved several copies of the image along the way. This might be a bit too large a subject for a tutorial, though.

And yes, I haven't totally forgotten coding. As pointed out on the inkscape-devel mailing list, my last changes to the codebase broke the about screen... Mixing C and C++ objects is tricky sometimes. This time the problem was that a C++ object inside a C object wasn't getting initialized. If both objects had been C++ style, this initialization would have happened automatically.
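A minimal standalone example of this failure mode - hypothetical code, not the actual about-screen object: when a C++ member lives inside a C-style, malloc-allocated struct, its constructor (and destructor) must be invoked by hand, or using the member is undefined behaviour.

```cpp
#include <new>
#include <string>
#include <cstdlib>

// Hypothetical illustration (not actual Inkscape code): a C-style
// struct that embeds a C++ member.
struct CStyleObject {
    int ref_count;
    std::string name;   // C++ member: its constructor must be run by hand
};

CStyleObject *c_style_new() {
    // malloc() hands back raw memory; no C++ constructor runs here...
    CStyleObject *obj = static_cast<CStyleObject *>(std::malloc(sizeof(CStyleObject)));
    obj->ref_count = 1;
    // ...so the embedded std::string has to be constructed explicitly
    // with placement new. Forgetting this is exactly the kind of bug
    // described above.
    new (&obj->name) std::string("about screen");
    return obj;
}

void c_style_free(CStyleObject *obj) {
    obj->name.~basic_string();  // matching explicit destructor call
    std::free(obj);
}
```

Had `CStyleObject` itself been allocated with `new`, the `std::string` member would have been constructed automatically.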

Tuesday, 29 May 2007

Filtering the image background

It's beginning to seem that my planned timetable for this summer project is way off. Luckily, it seems I've overestimated the time I need instead of underestimating it. So what does this mean exactly? Well, it's now been a full two weeks since I started the project, but I had planned five weeks for what I've done up to this point. Also, I had planned two weeks for creating renderer support for the BackgroundImage and BackgroundAlpha input images. Well, it turns out that support for BackgroundImage already exists; it just needed the changes I committed today to enable it.

Well then, about today's commit: support for the in parameter in filter primitives. This means we are no longer limited to filtering only the image itself and building linear filter chains. The background image can be taken into account when filtering, and building complex filter graphs should be possible too.

There are lots of possibilities this enables. The next picture shows one of them: a frosted glass effect. Here I've got an ordinary Gaussian blur, but instead of applying it to the object itself, I apply it to the background image. The object itself disappears; all we see is that a square area of the image is blurred.
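The idea can be sketched in a few lines - a toy one-dimensional model of my own, not the actual renderer: the primitive reads the background, blurs it, and its output replaces the object's own rendering entirely.

```cpp
#include <array>

// Toy 1-D model of the frosted glass effect described above; my own
// illustration, not Inkscape's filter code.
std::array<double, 5> frost(const std::array<double, 5> &background) {
    std::array<double, 5> out = background;
    for (int i = 1; i < 4; ++i)   // 3-tap box blur; edge samples kept as-is
        out[i] = (background[i - 1] + background[i] + background[i + 1]) / 3.0;
    // Note: the object's own SourceGraphic is never consulted, which is
    // why the object itself seems to disappear from the rendering.
    return out;
}
```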

This support for different input images isn't without problems, though. A really annoying one is with the blend filter: it sometimes doesn't use the specified blending mode, but seems to apply normal alpha blending instead. I don't know yet whether this is a bug in the input image support, in the blend filter, or even in some seemingly unrelated part of the code.

To help debug this, I think I'll reorder my timetable and implement the feOffset filter next. This will let me use some of the example images the W3C has published and compare Inkscape's renderings against them.

Monday, 21 May 2007

Refactoring work

The first planned work for this summer was to refactor the filter renderer initialization. The old initialization code was a monolithic piece of code copy-pasted into three places in the codebase (nr-arena-shape.cpp, nr-arena-group.cpp and nr-arena-image.cpp). While not bad in itself, each new filter primitive implemented would have grown that code by a dozen lines or so, and it would soon have reached an unmaintainable size.

So, the new approach: SPFilter has a single method, sp_filter_build_renderer, which initializes a given renderer object (NR::Filter) to the correct state. Calling this method is all those three nr-arena-* classes need to do to set the correct filter renderer state.

The inner workings of sp_filter_build_renderer are as follows: each filter primitive (the SPFilterPrimitive subclasses) has a build_renderer virtual function that adds the correct NR::FilterPrimitive object to the filter renderer. This function in turn calls sp_filter_primitive_renderer_common, which does the part of the initialization that is common to all filter primitives.
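In outline, the scheme looks something like the sketch below. The names follow this post, but the signatures and types are my own simplifications - they are assumptions, not Inkscape's actual declarations.

```cpp
#include <memory>
#include <vector>

struct FilterRenderer {              // stands in for NR::Filter
    std::vector<int> primitives;     // stand-in for the NR::FilterPrimitive list
};

struct SPFilterPrimitive {
    virtual ~SPFilterPrimitive() {}
    // Each primitive subclass knows how to add its renderer counterpart.
    virtual void build_renderer(FilterRenderer &r) const = 0;
protected:
    // The part of initialization common to all primitives
    // (sp_filter_primitive_renderer_common in the real codebase).
    void renderer_common(FilterRenderer &r, int id) const {
        r.primitives.push_back(id);
    }
};

struct SPGaussianBlur : SPFilterPrimitive {
    void build_renderer(FilterRenderer &r) const override { renderer_common(r, 1); }
};
struct SPFeBlend : SPFilterPrimitive {
    void build_renderer(FilterRenderer &r) const override { renderer_common(r, 2); }
};

// The single entry point the three nr-arena-* classes call.
void sp_filter_build_renderer(
        const std::vector<std::unique_ptr<SPFilterPrimitive>> &prims,
        FilterRenderer &r) {
    for (auto const &p : prims)
        p->build_renderer(r);
}
```

The point of the design is that adding a new primitive type only means writing a new subclass; the three call sites stay untouched.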

Also, I looked into how modification messages are handled in document-level filter objects. This actually took more time than the refactoring itself, as it required quite a few recompiles and plenty of debugging to figure out how these messages are propagated. Well, now these messages are propagated from SPGaussianBlur and SPFeBlend, which means that updating the underlying XML nodes for these two filters will cause the on-screen image to be updated. This allows for further improvements, for example making the blur slider update the existing filter instead of creating a new filter every time the slider is moved.
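The propagation can be pictured as a chain of forwarded notifications. The sketch below is my own minimal observer model of the idea; the names and structure are simplifications, not Inkscape's actual SP-object signalling code.

```cpp
#include <functional>
#include <vector>

// Minimal observer sketch of the message propagation described above.
struct ModifiedSignal {
    std::vector<std::function<void()>> listeners;
    void connect(std::function<void()> fn) { listeners.push_back(std::move(fn)); }
    void emit() const { for (auto const &fn : listeners) fn(); }
};

struct FilterPrimitiveNode {            // think: an SPGaussianBlur
    ModifiedSignal modified;
    void xml_node_changed() { modified.emit(); }  // an XML edit lands here
};

struct FilterNode {                     // think: the document-level SPFilter
    ModifiedSignal modified;
    void watch(FilterPrimitiveNode &child) {
        // forward the child's notification one level up
        child.modified.connect([this] { modified.emit(); });
    }
};
```

In this picture, the on-screen item subscribes to the filter's signal and schedules a redraw when it fires, so an XML edit on the primitive ends up refreshing the screen.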

Now that this refactoring is done, I will look into making the in parameter for filter primitives work. There is now a single method, sp_filter_primitive_renderer_common, where I need to make this change, so it will be somewhat simpler than it would have been before the refactoring.

Also, I should probably write documentation covering basically the same things I've said about this refactored initialization routine in this blog post, just in more detail.

Wednesday, 16 May 2007

Rather odd blending bugs

Not long after committing the feBlend renderer, ScislaC found some rather odd bugs in it. Well, I'll show you the pictures:

What it looked like:

What it should have looked like:

So, what was up with this? Well, due to round-off errors, under some circumstances the feBlend renderer would output colours with more than 100% in some RGB components. The image compositing code didn't take such colours nicely...
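The class of fix needed here fits in one line. This is an illustration of the principle - the committed patch may well look different: clamp each channel after blending so no component leaves the representable range before it reaches the compositing code.

```cpp
#include <algorithm>

// Illustration of the kind of fix described above (the committed patch
// may differ): after blending, clamp each 8-bit channel so no colour
// component exceeds 100% before compositing sees it.
unsigned char clamp_channel(int value) {
    return static_cast<unsigned char>(std::min(255, std::max(0, value)));
}
```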

Well, now this should be fixed in SVN.

On another topic, today I was looking at prices for new memory modules, so I could make this computer run more smoothly. It seems that, increasingly often, 512 MB just won't do and the hard disk thrashing begins. SO-DIMM memory is a bit on the expensive side, but at a bit less than 150 € / GB it would likely be worth the money...

Monday, 14 May 2007

Setup phase and feBlend

Today I finally wrapped up some modifications I had lying around on my hard disk and committed the feBlend renderer into the Inkscape codebase. The codebase surely changes fast, as testing those changes with the latest code required a full recompile (and that does take quite a while).

That feBlend renderer isn't of much use yet, as it needs some features I'm planning to develop over the summer - for example, changing the input image. Blending the filtered image with itself isn't too useful a feature...

Recompiling Inkscape will hopefully take significantly less time in the future, though, as I installed distcc, which lets me offload part of the compilation onto my other computer. It's quite a nice tool: it works well even though that other computer doesn't have all the libraries Inkscape requires, has different versions of some of them, and even has a different processor architecture.

Inkscape and I

A bit over a year ago, I noticed I could apply for Google's Summer of Code program. While browsing the list of projects I could apply for, I came upon the Inkscape website. Graphics programming has been one of my interests for a long time, and Inkscape was a program I was already using now and then. After discussing with the Inkscape developers, I ended up submitting a proposal on creating infrastructure for SVG filter effects support.

The proposal was accepted, and during the summer I worked on developing filter effects support in Inkscape. Soon it was fall and I returned to my studies at the university. I did commit some code during the winter, too, but not really much. When spring came and I noticed ads promoting Kesäkoodi - a Finnish program similar to Google's Summer of Code - in our students' room, I thought it would be great to continue what I did last summer.

So here I am, beginning another summer filled with reading code, planning, writing documentation - and of course, writing code, too.