Wednesday, May 13, 2015

entangled_beat v1.1 - the dust has settled

Here are some installation and action photos. I dimmed the LEDs a bit so that they caused less afterimage and also drew less power; the lower power draw let us run through the night. The software ran with no hitches, hang-ups or crashes, which makes me very happy: a stable install. We did take the installation down for two days because of the wind. From a technical point of view the install went perfectly. From an interaction and engagement point of view I have a few mods I want to make: the sensor takes a long time to respond and does not work in the daytime. Now for v1.2.







Monday, April 20, 2015

entangled_beat v1.0 - almost there

v1.0 of the Arduino code is ready to roll. Some sensor, screen and wiring bits are left to do, along with some 3D printing and more testing. We have a central spot in Tankwa Town to install the work: find us at 6ish.





Tuesday, September 24, 2013

#autoselfie now with glitches

I updated the #autoselfie app so that it glitches the image where you touch it. The glitch effects were ported from the glitchP5 Processing library.

I learnt some interesting things about the Matrix class along the way. The ImageView transforms the bitmap you put into it, and touch events give you x and y relative to the ImageView, not the bitmap, so how do you know where to apply the glitch? Luckily the ImageView will tell you what it did to the bitmap via the getImageMatrix() call. This Matrix object contains all the translation, scale and rotation info, and you don't have to decipher it yourself: you can create an inverse matrix and apply it to the bounding box of the path created by the touch events. The resulting rectangle defines the glitch area.
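Roughly, that mapping looks like this. A minimal sketch, assuming touchPath is a Path built from the MotionEvents and glitchBitmap() is a hypothetical stand-in for the ported glitch code:

    // android.graphics: Matrix, Path, Rect, RectF; android.widget.ImageView
    RectF bounds = new RectF();
    touchPath.computeBounds(bounds, true);          // bounds in ImageView space

    Matrix inverse = new Matrix();
    if (imageView.getImageMatrix().invert(inverse)) {
        inverse.mapRect(bounds);                    // bounds now in bitmap space
    }

    // Clamp to the bitmap so the glitch region never runs off the edge.
    Rect glitchArea = new Rect(
            Math.max(0, (int) bounds.left),
            Math.max(0, (int) bounds.top),
            Math.min(bitmap.getWidth(), (int) bounds.right),
            Math.min(bitmap.getHeight(), (int) bounds.bottom));

    glitchBitmap(bitmap, glitchArea);               // hypothetical glitch call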



Get the app on Google Play - see the stream

Sunday, September 15, 2013

#autoselfie - the making of

Self portrait, selfie, autoportrait, autoselfie.
Self portraits are a recurring theme. I have made them in many media, such as the ones I made in Painter in 2002.


People post selfies all over. So how can I build something that lets me play with self portraits at the code level? My phone can recognise faces, so what do I look like to my phone?

Ideas of protecting and revealing identity emerge: the software knows where the eyes are; people hide their eyes or reveal only their eyes.

I built the #autoselfie app to explore these ideas. I also wanted to play with OAuth and to integrate OpenCV into an app.


The users of the app become the subjects of the portraits. The software in the app alters the images and posts them to Twitter, and the hashtag #autoselfie becomes the place where all these experiments are collected.

On the implementation side

OAuth seems simple enough when you read about it, but implementing it is not without its tricks. I ended up using a WebView in the app, because using an Intent to break out and do the Twitter authorisation in the browser caused back-stack complications.
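The WebView flow with twitter4j looks roughly like this. A sketch only, with a made-up autoselfie://oauth callback scheme and the consumer key and secret passed in as placeholders; the real app's details differ:

    import android.app.Activity;
    import android.net.Uri;
    import android.util.Log;
    import android.webkit.WebView;
    import android.webkit.WebViewClient;
    import twitter4j.Twitter;
    import twitter4j.TwitterException;
    import twitter4j.TwitterFactory;
    import twitter4j.auth.AccessToken;
    import twitter4j.auth.RequestToken;

    // Both token calls hit the network, so call this method off the UI thread.
    void startTwitterAuth(final Activity activity, final WebView webView,
                          String consumerKey, String consumerSecret) throws TwitterException {
        final Twitter twitter = new TwitterFactory().getInstance();
        twitter.setOAuthConsumer(consumerKey, consumerSecret);
        final RequestToken requestToken = twitter.getOAuthRequestToken("autoselfie://oauth");

        activity.runOnUiThread(new Runnable() {
            @Override public void run() {
                webView.setWebViewClient(new WebViewClient() {
                    @Override
                    public boolean shouldOverrideUrlLoading(WebView view, String url) {
                        if (url.startsWith("autoselfie://oauth")) {
                            final String verifier =
                                    Uri.parse(url).getQueryParameter("oauth_verifier");
                            // Another network call: exchange for the access token in the background.
                            new Thread(new Runnable() {
                                @Override public void run() {
                                    try {
                                        AccessToken token =
                                                twitter.getOAuthAccessToken(requestToken, verifier);
                                        // persist token.getToken() / token.getTokenSecret()
                                    } catch (TwitterException e) {
                                        Log.e("autoselfie", "OAuth exchange failed", e);
                                    }
                                }
                            }).start();
                            return true;   // handled in-app, no browser Intent, no backstack mess
                        }
                        return false;
                    }
                });
                webView.loadUrl(requestToken.getAuthorizationURL());
            }
        });
    }

Keeping the whole flow inside the app means the back button behaves the way the user expects instead of bouncing through the browser's history.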

I used the twitter4j libraries and the OpenCV Java jars. For the latter the app needs the OpenCV Manager companion app. Initially this may be a barrier, but it is really convenient not to have to integrate the C++ OpenCV libraries myself, especially since Android Studio does not have NDK support.
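Hooking into OpenCV Manager is the usual asynchronous init. A minimal sketch inside an Activity, assuming a 2.4.x version constant that matches the jars you ship:

    import org.opencv.android.BaseLoaderCallback;
    import org.opencv.android.LoaderCallbackInterface;
    import org.opencv.android.OpenCVLoader;

    private final BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            if (status == LoaderCallbackInterface.SUCCESS) {
                // The Manager app has loaded the native libs; org.opencv.* is safe to use now.
            } else {
                super.onManagerConnected(status);  // prompts the user to install OpenCV Manager
            }
        }
    };

    @Override
    protected void onResume() {
        super.onResume();
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_6, this, mLoaderCallback);
    }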

Read the code on GitHub.

UX

The user interface design purposefully removes control from the user. In this way the software layers, which normally recede and try to be invisible, are brought to the user's attention. The image is also posted automatically, and the user can only view the result after it has been posted. At that point the user is confronted with the image that has become public, and with its recognisability to humans and machines alike. In this way the user experience becomes part of the concept of the work.

Trivia and Easter eggs

At the moment the app creates images in four flavours; there will be more later. References to these can be seen in the tweet text and in the code.

8bit Roy is a riff on Roy Lichtenstein: primary colours, which I make using thresholds, and sort-of Ben-Day dots, which I draw only in the black and red areas. How? By chasing through the pixels and drawing circles where they are black or red :)
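Something along these lines. A hypothetical sketch, assuming a mutable bitmap that has already been thresholded to flat colours; the grid spacing, dot size and dot colour are made up for illustration:

    // android.graphics: Bitmap, Canvas, Color, Paint
    Canvas canvas = new Canvas(bitmap);                 // bitmap must be mutable
    Paint dot = new Paint(Paint.ANTI_ALIAS_FLAG);
    dot.setColor(Color.WHITE);

    int step = 12;    // grid spacing, illustrative
    int radius = 4;   // dot radius, illustrative

    for (int y = 0; y < bitmap.getHeight(); y += step) {
        for (int x = 0; x < bitmap.getWidth(); x += step) {
            int c = bitmap.getPixel(x, y);
            if (c == Color.BLACK || c == Color.RED) {
                canvas.drawCircle(x, y, radius, dot);   // stamp a dot in the black/red areas
            }
        }
    }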

cannyKonny is a riff on Konny, who pastes large black-and-white ink portraits with red highlights on the streets of Paris. Her works are ephemeral: here today, stripped away tomorrow. The filter uses Canny edge detection to find the edges, a bitwise not to invert the colours, and a splash of algorithmic red.
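The edge pass, sketched with the OpenCV Java bindings; the thresholds are illustrative and the red splash is left out, since that part is harder to pin down in a few lines:

    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.imgproc.Imgproc;

    // src is assumed to be an RGBA Mat of the captured photo.
    Mat gray = new Mat();
    Mat edges = new Mat();
    Mat dst = new Mat();
    Imgproc.cvtColor(src, gray, Imgproc.COLOR_RGBA2GRAY);
    Imgproc.Canny(gray, edges, 80, 160);        // edge detection, thresholds illustrative
    Core.bitwise_not(edges, edges);             // invert: black lines on white
    Imgproc.cvtColor(edges, dst, Imgproc.COLOR_GRAY2RGBA);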

andAnotherOneForLuck is just that: red, pink and yellow are lucky colours, some say.

noPointInHoldingOn is the title of a song by Alveol. I found it on a beatoracle playlist.

Get the app on Google Play - see the stream