Today's catch of #generativemusic: a combination of algorithmically driven mangling of recordings and synths triggered by #minibee sensors. Enjoy!
For the last two weeks I've been slowly piecing together an #openFrameworks app: visualizing #minibee accelerometer data, posture recognition, MIDI support, and finally sending #osc data to a Max patch on the other side of the world (hint: a VPN is your best friend ever!). The result will be a telematic performance/lecture together with German media artist Chris Ziegler (movingimages.de) and, as always, @klli, on the 3rd of December. Code here: https://gitlab.com/kflak/forest-openframeworks.
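For anyone curious what "sending #osc data" boils down to on the wire: an OSC message is just a padded address string, a type-tag string, and big-endian arguments in a UDP packet. A minimal Python sketch (the `/minibee/accel` address and port 57120 are illustrative assumptions, not the app's actual settings):

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all float32 (big-endian)."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# Send one accelerometer frame (x, y, z) over UDP.
# Address and port are made up for this example.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/minibee/accel", 0.1, -0.5, 0.98), ("127.0.0.1", 57120))
```

In practice you'd use a ready-made OSC library on both ends; the point here is just how little framing sits between the sensor data and the remote Max patch.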
@mathr @yaxu ping ping! Yes, indeed, that's been our party trick since 2014 or so. We're using #minibee accelerometers strapped to a dancer's wrists or ankles. I haven't found any standard libraries, but I've been experimenting with the GRT (Gesture Recognition Toolkit) for basic posture recognition (https://gitlab.com/kflak/minibee-posture-recognition), in addition to a bunch of classes in #supercollider for working with the data to trigger/control events (https://gitlab.com/kflak/minibeeutils).
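The core idea behind basic posture recognition from a wrist-worn accelerometer is simple: gravity gives each static posture a characteristic mean (x, y, z) vector, so a nearest-centroid classifier already gets you surprisingly far. A toy Python sketch (the postures and centroid values are invented for illustration; GRT itself is a C++ library with a much richer pipeline):

```python
import math

# Toy "trained" centroids: mean accelerometer vectors per posture.
# These numbers are made up; real centroids would be averaged from
# labelled MiniBee recordings.
centroids = {
    "arm_down":  (0.0, 0.0, -1.0),
    "arm_up":    (0.0, 0.0,  1.0),
    "arm_front": (1.0, 0.0,  0.0),
}

def classify(sample):
    """Return the posture whose centroid is closest to the sample."""
    return min(centroids, key=lambda name: math.dist(sample, centroids[name]))
```

A noisy reading like `(0.1, 0.0, -0.9)` still lands on `"arm_down"`, which is why this crude scheme works for triggering events from held poses (dynamic gestures need the time-series classifiers GRT provides).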
Phase modulation with feedback, triggered by a #minibee accelerometer, as always in #supercollider.
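For anyone wondering what "phase modulation with feedback" means: the oscillator's own previous output sample is fed back into its phase, which pushes a pure sine toward richer, noisier spectra as the feedback amount grows. In SuperCollider this is roughly what `SinOscFB` does; here is the core recurrence as a rough pure-Python sketch (parameter values are arbitrary examples):

```python
import math

def pm_feedback(freq=220.0, fb=0.8, sr=44100, n=64):
    """Sine oscillator whose phase is modulated by its own previous
    output sample, scaled by the feedback amount `fb`:
        y[k] = sin(phase[k] + fb * y[k-1])
    """
    out, prev, phase = [], 0.0, 0.0
    inc = 2 * math.pi * freq / sr  # phase increment per sample
    for _ in range(n):
        prev = math.sin(phase + fb * prev)
        out.append(prev)
        phase += inc
    return out

samples = pm_feedback()
```

At `fb = 0` this is a plain sine; cranking `fb` up adds harmonics and eventually chaos, which is what makes it fun to drive from an accelerometer.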
Having fun with the 22-tone equal temperament scale, #minibee movement sensors and my new #supercollider discovery, the OteyPiano. I smell possibilities.
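For the curious: 22-tone equal temperament just divides the octave into 22 equal steps of 1200/22 ≈ 54.5 cents instead of the usual 12 steps of 100 cents. The frequency math in a couple of lines of Python (the 440 Hz base is an arbitrary example):

```python
def tet22_freq(step, base=440.0):
    """Frequency `step` steps above `base` in 22-tone equal temperament:
    each step multiplies the frequency by 2**(1/22)."""
    return base * 2 ** (step / 22)

# 22 steps up is exactly one octave.
octave = tet22_freq(22)  # 880.0 Hz
```

None of the 21 intermediate steps coincides with a 12-TET pitch, which is exactly why the scale sounds so alien and full of possibilities.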