I had a lot of fun with the RGB LED this week. Instead of using a ping-pong ball to diffuse it, I used a flexible plastic material and was able to film the subtle differences in color by placing the plastic directly on my phone’s camera lens. I played some Amadou & Mariam in the background for rhythm:
In the second exercise for this week it was hard to get super creative, but I was able to change the fade rate. Here is a faster fade than the original, with the delay set to 5:
Hi, everybody! So, I did my best to modify this code a little bit, but never really understood why I couldn’t control the sweep better. I’m pocketing this question for class. But here’s what the sample code looks like:
Here’s the closest thing to the way I had it in the end, when it still wasn’t functioning… I don’t have a video of that, because it really wasn’t any different except for a little skip in the middle of the sweep.
I’m a maker with a background in a variety of materials including steel, wood, and ceramics. I’m originally from central Arkansas, but most recently moved to Brooklyn from a small craft school in the Blue Ridge Mountains in Western North Carolina. Experimenting with new materials and processes is one of my favorite things. I also really enjoy good food and good tea. Before a recent injury, you’d likely find me playing Ultimate when I wasn’t working, eating, or sleeping. I’m working on that and I’ll be on the field again soon!
This is a project on using 123D Catch to get a 3D scan of my whole body. Once I had the scan, I cleaned it up and manipulated it to get it ready for 123D Make. I sliced the model in Make and prepped it for laser cutting. Once done, I laser-cut the parts and integrated the componentry and an Arduino to make the head turn. I created two functions: an automatic tracking function and a manually controlled function driven through Max MSP. In a way, this is a modern-day puppet.
Advances in physical computing and interaction design hardware in recent years have created a new breed of smartobjects, which are gaining more and more traction in the design world. These smartobjects have the potential to be far more interactive and immersive than ever before. What is exciting is that it’s becoming easier and cheaper to take part, with DIY and hacker community initiatives such as Maker Faire and Instructables, as well as numerous other organisations and people, showing the growing interest in this area. This project was done as part of the Making Studio class taught by Becky Stern in the Products of Design masters program at the School of Visual Arts. It aims to capture the essence of this style of designing, where ideas, thinking, and process are shared for others to use and expand on.
On one hand, ‘Cloud’ is an Arduino-controlled, motion-triggered lightning & thunder performance. On the other, it is a music-activated visualizer & suspended speaker unit.
The cloud is made by felting hypoallergenic fiberfill onto a sponge casing, which forms the frame of the cloud and holds the speakers and componentry. The felting tool is a custom one made from the leftover sponge and four felting needles. The cloud’s functions are controlled by three tactile switches scattered around the base. The concept references real clouds, which constantly change shape: the switches require constant exploration to find the right one to turn each feature on or off.
Acting either as a semi-immersive lightning experience or as a speaker with visual feedback, this Nightlight – Nightspeaker hybrid introduces a new discourse not only for what a nightlight is or could be, but also for what a smartobject is or could be.
Arduino-controlled, motion-triggered lightning & thunder and music-activated, EL-wire-diffused flashes combined into a suspended speaker unit. The Cloud is a felted Nightlight – Nightspeaker hybrid, acting either as a semi-immersive lightning experience or as a speaker. The cloud’s controls are tactile switches scattered around the base; like real clouds, which constantly change, the switches are subtle and require constant exploration to find the right switch to turn the right feature on or off.
Work in progress: after days of working to get the separate components of the cloud working independently, I now have them working together in the same Arduino sketch. The components I am using are the PIR sensor, the Wave Shield & LED storm. The system works like this: the PIR sensor is triggered by motion > the randomized lightning sequence begins in a randomized cycle, followed by a super lightning strike > milliseconds after the super lightning strike, a randomly selected thunder sound plays, and the cloud then waits until the PIR is triggered again.