June 14, 2007
All my Processing-based AV experiments to date have been in real time. In preparation for my dissertation piece I thought I should get rendering. This post at the Processing forums was immensely helpful. This Davbol fella has given many budding flight404s a nice head start.
With Davbol’s structure, I worked in my still very rough leech code and had things running surprisingly quickly. At 320 x 240 it took over ten minutes to render about 3800 JPEGs. I loaded the images into After Effects, dropped in the audio, and everything synced nicely.
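The sync arithmetic behind an image-sequence-plus-audio workflow like this is simple, though the post doesn't state the frame rate or sample rate. Here's a sketch in plain Java, assuming 30 fps and 44.1 kHz (both assumptions, not figures from the post), of how a frame index maps onto a position in the audio when rendering offline:

```java
public class FrameAudioSync {
    // Assumed values -- the frame rate and sample rate aren't stated in the post.
    static final int FPS = 30;
    static final int SAMPLE_RATE = 44100;

    // Time (in seconds) that frame n represents.
    static double frameTime(int frame) {
        return (double) frame / FPS;
    }

    // First audio sample belonging to frame n.
    static long sampleOffset(int frame) {
        return (long) frame * SAMPLE_RATE / FPS;
    }

    // Total frames needed to cover an audio clip of the given length.
    static int framesFor(double durationSeconds) {
        return (int) Math.ceil(durationSeconds * FPS);
    }

    public static void main(String[] args) {
        System.out.println(frameTime(90));     // frame 90 lands three seconds in
        System.out.println(sampleOffset(30));  // one second in at 44.1 kHz
        System.out.println(framesFor(10.0));   // a ten-second clip needs 300 frames
    }
}
```

At these assumed settings, 3800 frames covers a bit over two minutes of audio, which is why dropping the sequence onto the soundtrack in After Effects lines up with no extra work.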
Finally I could see the sketch running smoothly, but unfortunately, after all that waiting, the problems that real-time chunkiness had been hiding became apparent. Everything was flashing and blisteringly fast, so I had to slow down all my motion and increase the audio dampening to settle things down.
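The usual way to dampen a jumpy audio level is exponential smoothing: ease a stored value toward each raw reading instead of using the reading directly. A minimal sketch in plain Java (the class and parameter names are mine, not from the leech code):

```java
public class Dampener {
    private double level = 0.0;
    private final double damping; // 0..1 -- smaller means smoother and slower

    Dampener(double damping) {
        this.damping = damping;
    }

    // Ease the stored level toward the raw reading; repeated calls converge on it.
    double update(double raw) {
        level += (raw - level) * damping;
        return level;
    }

    public static void main(String[] args) {
        Dampener fast = new Dampener(0.5);
        Dampener slow = new Dampener(0.1);
        // Feed both a sudden spike in the signal; the heavily damped one lags behind,
        // which is exactly the settling effect wanted here.
        for (int i = 0; i < 10; i++) {
            System.out.printf("%.3f  %.3f%n", fast.update(1.0), slow.update(1.0));
        }
    }
}
```

Turning the damping constant down is the "increase the audio dampening" knob: the visuals stop flashing because the driving signal can no longer jump a full step per frame.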
Whenever I make a programming breakthrough it seems inevitable that along with it comes a set of new tasks and a haunting and realistic vision of the workload ahead.
I’ll have to find a way to work both in real time and with offline rendering: real time for experimenting with new forms and structures, offline for fine-tuning values and variables. Sitting around waiting for things to render just seems so strange and inefficient after a year and a half of playing with this stuff in real time. I guess it’s the price you pay for detail. Time to refine this leech code and get it reacting in a smooth and dynamic manner.