Wednesday, July 1, 2009

My Final Project Post

So, this is a belated final project entry. But I thought it would be good to record a post-mortem for this project. First, a description of the intent of the project:

Project: Add #17 to Brancusi's Bird series (I like the brass ones the best).
Remaking Brancusi's Bird came to mind for a couple of reasons.

1. Rory suggested that we really consider how the final product is packaged for a physical computing piece. I think Brancusi is really excellent at packaging complex ideas into highly simplified 3-D forms, so I thought I could learn quite a bit by carefully considering his forms and how they might be translated for physical computing pieces.



Bad pic. But I love his sculptures.

Is Brancusi the godfather of the iPod? Hmm. That idea makes me feel a little sad for some reason, though I don't know why.

2. Once I thought of Brancusi and started considering the Bird series, it seemed like a "cover" of his bird in physical computing materials actually had some relevance. When he made the Bird, accelerated travel was a huge metaphor for people at the turn of the century--with travel in many ways becoming a metaphor for being able to reach outside of our physical existence. Much like what information technology is for people today. I see us projecting ourselves onto technology and allowing technology to be projected onto us, for much the same reasons people were excited about airplanes, trains, and such. It's an opportunity to have our individual selves be larger than our immediate experiences; for its potentiality, and of course utility. And I like that.

But I also think Brancusi's bird recognizes that no matter how high we fly, we are tied to earth. And of course, even though we have wireless and handhelds, we must all come back to ground; ultimately it's an electric pulse, it's a 1 or a 0. And ultimately we are always these physical beings, even as we also have these alter selves--projected in whatever different ways we have now made possible with IT.

So, how would I remake this bird?

The intent? Multiple LCDs placed in an upward spiral around a cylindrical form. The LCDs would be triggered on and off by someone's (or multiple people's) movement around the object. The images would be live feeds from webcams built into the piece.

The reality:
Ahhh. So much went wrong.

And, what did it end up being? Just a simple projection from my laptop's webcam, captured through Max/MSP and triggered by a serial feed from a motion detector controlled by the Arduino. But this seemingly simple setup was so time-consuming.

So, I think the value in blogging after the fact about what turned out to be a very low-tech physical computing project is the lessons learned:

Lessons Learned
1. There are no cheap mini LCDs that support full video
While I wanted to have many, many LCDs all the way up a tall, thin cylindrical frame, I didn't want to pay for all of them. So, I decided I would use at least three LCDs to create a proof of concept of the motion-trigger effect I wanted and stop at that. However, I did not find any cheap LCD screens that support full video/color graphics. The PS1 LCD, it turns out, is in high demand and goes for about $65-75 used. With more time, I might have been able to get them a bit cheaper.

2. The Coby media picture frames (and other media frames) that support media and are under $50 also do not support live video. You must have recorded video loaded onto these things (though it does appear that some of the frames over $100 will support live data from a computer). Also, don't tear apart a Coby media picture frame. Its electronic connections are pretty delicate, but tearing apart the frame takes some force. I tore one apart and it quit working. Maybe if you sawed it apart instead? Now that I know where the connection is, I think I might be able to saw the cover apart if I were very, very careful. But you might be sensing a trend: I was too cheap to try sawing apart another one and ruining it too.

3. The Max/MSP serial object does not reliably read serial data sent to the buffer from the Arduino.
I saw a few people posting about this issue, after spending two days trying to get Max to read the data being sent from a motion detector. I did get it to work if 1) a comma and a line break were included between each reading and I had a delay(100); 2) Max was shut down when I started up the Arduino program--but I did not turn on the serial monitor; and 3) I physically restarted the board before starting Max and having a patch read the serial data being sent. Even then, Max wouldn't always read it after this elaborate ritual, and it generally only started working 30-60 seconds after starting the patch. I installed several of the libraries that have been created to help support this, and none of them worked reliably for me.

I wasted so much time on this issue--and since I really didn't know enough to take a different technology angle for getting a live webcam feed projected with the on/off controlled the way I wanted, I doggedly stuck with Max to control the video. For example, how would I hack Logitech's cam software so I could pause a feed? Also, I couldn't get the cam to show a full-screen view in the software supporting the cams; and for some reason that escapes me now, that seemed really important.

4. QuickTime might be based on MPEG-4, but MPEG-4 compression is not necessarily supported through QuickTime.
I found two cams that offered several Apple compression choices and an MPEG-4 choice. I don't know why, but I thought these would work with Max/MSP/Jitter, which uses QuickTime for the live feeds. It doesn't. QuickTime ain't MPEG. So obvious now.

5. Don't be cheap and think you will do full graphic LCDs
Paying approx. $100 x 3 for mini LCDs just seemed like too much money to spend on my first real physical computing project. But the biggest lesson here is: don't do LCDs if you are cheap. There was no cheap solution, and I wasted a lot of time being cheap.

6. Don't doggedly stick with your original intent if you have less than 3 weeks.
And I am pretty sure this was the best lesson I learned. For reasons that seem completely unnecessary now, I felt I had to have live video no matter what. Anyway, I do think this was actually a great lesson. I will scope better, research up front better, really think about what is feasible within timeframes, and assess better as I go along. So, maybe it was good--even if it was depressing to have such a lame outcome.
