Well, let’s be honest: we are certainly figuring this out as we go.
We have stripped out the generative music (and thus, the pipe organ) from the project. Mostly because we thought this would help distinguish us from the other generative music group, and also because we weren’t convinced that approach would learn anything very interesting. That said, we are still working out which part of our project learns – and also, whether it is art.
We are keeping the monsters, and the buttons. People will be able to push the buttons to make the monsters do something. We will also be tracking people as they move and translating that motion into monster movement. Very reactive. The plan is to save all of these movement patterns and use them to create generated monsters that interact with the real, people-based monsters in the animation.
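To make the "save movement patterns and replay them" idea concrete, here is a minimal Python sketch of what that could look like. Everything here is an assumption for illustration – the class name, the JSON format, and the linear interpolation are all hypothetical choices, not what we've actually built:

```python
import json
from dataclasses import dataclass, field

@dataclass
class MovementRecorder:
    """Hypothetical recorder: stores (t, x, y) samples from a tracked
    person so the path can later be replayed by a generated monster."""
    samples: list = field(default_factory=list)

    def record(self, t, x, y):
        self.samples.append((t, x, y))

    def save(self, path):
        # Persist the pattern so generated monsters can reuse it later.
        with open(path, "w") as f:
            json.dump(self.samples, f)

    @classmethod
    def load(cls, path):
        with open(path) as f:
            return cls([tuple(s) for s in json.load(f)])

    def position_at(self, t):
        """Linearly interpolate the recorded path at time t,
        clamping to the first/last sample outside the recording."""
        s = self.samples
        if t <= s[0][0]:
            return s[0][1:]
        for (t0, x0, y0), (t1, x1, y1) in zip(s, s[1:]):
            if t0 <= t <= t1:
                a = (t - t0) / (t1 - t0)
                return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))
        return s[-1][1:]
```

A generated monster could then call `position_at` each frame with its own clock, so an old visitor's path drives a new monster on screen.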
Things to implement
- People (blob) tracking
- Masking the blobs, perhaps into some sort of particle system that the monster can be drawn from
- Building a box to house the projector and laptop
- Constructing button interface
- Animating the scene with monsters (maybe 8-bit)
- Storing the movement patterns so that they can be recreated
- Bonus: customizable monster creation?
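For the blob-tracking item, the core idea is finding connected regions of "person" pixels and reducing each to a position the monster can follow. Here is a minimal, dependency-free sketch of that step, operating on a binary occupancy grid; in practice the grid would come from background-subtracted, thresholded camera frames (likely via a vision library), so this is an illustration of the algorithm, not our actual pipeline:

```python
from collections import deque

def find_blobs(grid):
    """Find 4-connected components ("blobs") in a binary grid and
    return each blob's centroid as (row, col). A stand-in for
    camera-based people tracking."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                # Flood-fill one blob with BFS.
                queue = deque([(r, c)])
                seen[r][c] = True
                cells = []
                while queue:
                    y, x = queue.popleft()
                    cells.append((y, x))
                    for ny, nx in ((y+1, x), (y-1, x), (y, x+1), (y, x-1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # Centroid = mean cell position of the blob.
                cy = sum(y for y, _ in cells) / len(cells)
                cx = sum(x for _, x in cells) / len(cells)
                centroids.append((cy, cx))
    return centroids
```

Each centroid (or the blob's full cell list, for the particle-system idea) would then be fed to the animation and to the movement recorder.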