Archives for the month of: February, 2011

Part 3 has the same flow of data as part 2; the only differences are the type of sensor and the 3-dimensional relationships it affects. In this example we use a PING))) Ultrasonic Sensor to measure the proximity of an object (in this case my hand) to the sensor. The ultrasonic range finder works by sending out a burst of ultrasound and listening for the echo as it reflects off an object. Code written to the Arduino board sends a short pulse to trigger the detection, then listens for a pulse on the same pin using the pulseIn() function. The duration of this second pulse is equal to the time taken by the ultrasound to travel to the object and back to the sensor. Using the speed of sound, this time can be converted to distance. For more information on the code and circuit click here.
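
A sketch along the lines of the standard Arduino PING))) example referenced above is shown here; the signal pin number (7) is an assumption and may differ from the wiring in the video.

    // Ping the sensor, time the echo with pulseIn(), and convert the round trip to distance.
    const int pingPin = 7;   // assumed signal pin for the PING))) sensor

    // Sound travels roughly 74 microseconds per inch and 29 per centimeter;
    // the echo covers the distance out and back, so divide by 2.
    long microsecondsToInches(long microseconds) {
      return microseconds / 74 / 2;
    }

    long microsecondsToCentimeters(long microseconds) {
      return microseconds / 29 / 2;
    }

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      // Trigger the sensor with a short HIGH pulse on the signal pin.
      pinMode(pingPin, OUTPUT);
      digitalWrite(pingPin, LOW);
      delayMicroseconds(2);
      digitalWrite(pingPin, HIGH);
      delayMicroseconds(5);
      digitalWrite(pingPin, LOW);

      // Switch the same pin to input and time the returning echo pulse.
      pinMode(pingPin, INPUT);
      long duration = pulseIn(pingPin, HIGH);

      // Convert the round-trip time to distance using the speed of sound.
      long inches = microsecondsToInches(duration);
      long cm = microsecondsToCentimeters(duration);

      Serial.print(inches);
      Serial.print("in, ");
      Serial.print(cm);
      Serial.print("cm");
      Serial.println();

      delay(100);
    }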

The example code from Arduino prints values in both “in” and “cm”. In this example we only need one value, so we will use Serial.println(cm) and comment out the other Serial.print() functions. This will send a single stream of values through the serial port, similar to the example in our previous post. In Rhino the aperture is built by creating a circular array of pivot points about a center axis. These pivot points rotate a series of overlapping blades controlled by a single parameter.
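
In practice that change only touches the print section at the end of loop() in the sketch above, along these lines:

    // Comment out the inch/centimeter label output...
    // Serial.print(inches);
    // Serial.print("in, ");
    // Serial.print(cm);
    // Serial.print("cm");
    // Serial.println();
    Serial.println(cm);   // ...and send one centimeter value per line instead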

The values streaming from the serial port are fed directly into the Grasshopper build, driving the rotation of the blades. To see this process in action, watch the video below. Notice the values printed on the right-hand screen displaying the distance in centimeters.

See the video in HD here.

In our previous post, making things that talk part 1, we used the analogy of a group of people communicating to share and transfer information. In this post we build on the same concept with the micro-controller (Carl) and the sensor (Susan). There are two changes: (Paul) from Processing has left us and is replaced by (Grant) from Grasshopper, and the source (Sam) is still here, but he now represents wind variance rather than light. In this example we use a really interesting anemometer/wind sensor from Modern Device. This little sensor uses a technique called “hot wire,” which involves heating an element to a constant temperature and measuring the electrical power required to keep the element at that temperature as the wind changes. As the wind varies, the sensor produces varying voltage values. These values are sent through the serial port into Grasshopper via the generic serial read component from Firefly. Once the values are piped into Grasshopper they alter a series of relationships that inflate a 3-dimensional balloon.
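
A minimal sketch for streaming the wind sensor's reading is shown below. The analog pin (A0), baud rate and bare Serial.println() format are assumptions rather than the exact code used here, and Modern Device's own example adds calibration that this sketch skips.

    // Read the wind sensor's analog output and stream one raw value per line,
    // which the Firefly serial read component can pick up in Grasshopper.
    const int windPin = A0;   // assumed analog pin for the sensor's OUT signal

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int windValue = analogRead(windPin);   // 0-1023 reading that rises with airflow
      Serial.println(windValue);             // single value per line over the serial port
      delay(50);
    }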

The single stream of data from the sensor controls four relationships (a rough sketch of this one-to-many mapping follows the list below).

  • a. varying division points spaced along the central axis
  • b. varying circle radius about the division points
  • c. a mesh surface constructed from the control curves
  • d. z-axis translation based on radius
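
The real one-to-many mapping is built graphically in the Grasshopper definition, but a hypothetical Arduino-flavored sketch can illustrate the idea of one incoming value fanning out into the four relationships. Every range and ratio below is invented for illustration only.

    // One sensor value drives four derived parameters (relationships a-d above).
    // The ranges are made up; the actual mapping lives in the GHX build.
    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int windValue = analogRead(A0);                       // single incoming value, 0-1023

      int   divisions = map(windValue, 0, 1023, 4, 16);     // a. division points along the central axis
      float radius    = 1.0 + (windValue / 1023.0) * 9.0;   // b. circle radius about each division point
      // c. a mesh surface is lofted through the resulting circles (a pure geometry step)
      float zOffset   = radius * 0.5;                       // d. z-axis translation proportional to radius

      Serial.print(divisions);
      Serial.print(" ");
      Serial.print(radius);
      Serial.print(" ");
      Serial.println(zOffset);
      delay(50);
    }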

GHX build

The final build showcases the digital inflation of a balloon, driven by input from the wind sensor.

See this video in HD here.

Our previous post, thoughts on response and interaction, mentions the idea of embedded systems and the process of physically making things talk. Our first example uses an Arduino micro-controller, an ambient light sensor, the Processing programming language/development environment, and a source of data, which in this case is light. This communication involves four participants, with the micro-controller acting as the manager directing all parties involved. To make things simple I will make up names for each piece of technology used. We will call the micro-controller (Carl), the sensor (Susan), the Processing IDE (Paul) and the light source (Sam). We will assume that each person speaks a few languages; in reality this communication takes place via programming syntax.

The dialog goes as so:

  1. Carl the controller goes to Susan the sensor located at room (A0) and asks Susan to get data from Sam the source.
  2. Once Susan the sensor has the requested data, she sends it back to Carl the controller.
  3. Carl the controller then takes the data and sends it to Paul from Processing.
  4. Paul is very artistic and draws the data on the screen.

Carl is quite demanding, looping through this sequence over and over again…

Ambient light data

Ambient light data display via Processing and Arduino

In reality the process goes something like this:

  1. Code is written to the Arduino micro-controller asking for the analog input voltage streaming from pin (A0).
  2. The sensor plugged into pin (A0) converts the light level into a voltage, which the Arduino reads as a numeric value ranging from 0 to 1023.
  3. The code on the Arduino grabs those values and sends them through the serial port (see the Arduino-side sketch after this list).
  4. The Processing code catches the values from the serial port and draws vertical lines based on them, which in turn gives us a graphic representation of light.
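
The Arduino side of steps 1-3 can be as small as the sketch below (a representative sketch, not necessarily the exact code behind the screenshot); the Processing side then listens on the same serial port and draws a line for each value it receives.

    // Read the ambient light sensor on analog pin A0 and stream the raw value
    // over the serial port for Processing to draw.
    void setup() {
      Serial.begin(9600);               // open the serial port Processing will listen to
    }

    void loop() {
      int lightValue = analogRead(A0);  // step 2: light level as a value from 0 to 1023
      Serial.println(lightValue);       // step 3: send it through the serial port
      delay(10);
    }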

As we build on this example, take note of the fact that the general story stays the same: we swap out a few different characters, a few different roles, a few different languages, but at the end of the day it’s still a group of ‘people’ ‘talking’ and relaying information.

Interaction is often defined as a type of action that occurs as two or more objects have an effect upon one another. When people interact with each other, the progression is an iterative loop of speaking, listening and processing. This phenomenon is ubiquitous, taking place in many forms within our environment. Smart-phones, cars, software and the World Wide Web are all things we interact with daily. As technology continues to advance, these interactions have become a woven and integral part of the way we live.

Most of us have already designed our own digitally interactive relationships using parametric modeling software such as Revit (family parameter), MAX (modifier stack), Rhino (grasshopper logic) and CATIA (component object model).


The parametric software user interfaces of Revit, 3dsMAX, Rhino and CATIA

Designers use software to interact with a modeled space or geometry and its parameters in a continuous loop until the desired outcome or design is reached. This dynamic level of interaction is becoming more and more vital in practice, but remains predominantly digital and process-based: its primary focus is managing the design process and making it more flexible. The flexibility and interaction expire once the design is set and the model is finalized and materialized in construction documents.

In an effort to create responsive buildings (rather than merely responsive models) we seek to pull this notion of flexible interaction from the realm of digital design into the physical world we design for. Understanding these fundamental relationships of interaction and communication is key to realizing the potential buildings have to respond to their users, environment, function, etc. As a first step towards realizing these potentials, our next few blog posts explore the concept of embedded computation or embedded systems and how they can contribute to responsive building skins.

This article is dedicated to buildings that incorporate adjustable/movable technologies that can adapt to variations in climate and the position of the sun.

Jean Nouvel’s Arab Institute, completed in 1987, is among the first buildings to employ sensor-based automated response to environmental conditions. Its 25,000 photoelectric cells, similar to a camera lens, are controlled via a central computer to moderate light levels on the south facade (1). Now famously frozen in place, the apertures are commonly referenced in cautionary tales used to warn designers of the perils of developing kinetic facades.


UCLA, SCHOOL of ARCHITECTURE and URBAN DESIGN
PERLOFF HALL
DECEMBER 6, 2010

TRANSFORMABLE DESIGN: ICONIC TO ENVIRONMENTAL

Inventor Chuck Hoberman spoke about his work in the field of Transformable Design. He started the talk by discussing one of his earliest installations, the Hoberman Sphere.

A few years later, he launched a line of toys centered on the now famous object.

He went on to show how his work has evolved into a wide range of objects, from stage installations for U2 to mechanisms that enhance building glazing performance, such as his Adaptive Fritting and Tessellate projects.

In 2008 Hoberman Associates teamed up with the engineering firm Buro Happold to form the Adaptive Building Initiative (ABI),  “dedicated to designing a new generation of buildings that optimize their configuration in real time by responding to environmental changes.”  http://www.adaptivebuildings.com/


FACADES CONFERENCE 2010
UNIVERSITY OF SOUTHERN CALIFORNIA
SCHOOL OF ARCHITECTURE
GIN D. WONG, FAIA CONFERENCE CENTER
NOVEMBER 19-20, 2010

Lawrence Berkeley National Laboratory: Integrated, High Performance Facade Solutions

Introducing the appropriate tools, tests and methods early in the design process can lead to more successful building facades. The speaker, Eleanor Lee, emphasized the importance of balancing HVAC loads, artificial lighting and daylighting to achieve optimal performance. She introduced several freely available analysis tools that can be used at different scales of facade analysis to alert designers to potential human comfort issues (glare, solar heat gain, etc.). Among these were COMFEN, a front end for EnergyPlus; BCVTB, a middleware used to integrate EnergyPlus and Radiance; and DAYSIM, a Radiance-based analysis program. More detailed descriptions of these tools can be found in the analysis tools blog entry here.

Eleanor also described research and testing performed through a partnership between LBNL and The New York Times on the Renzo Piano-designed headquarters building in New York. LBNL performed extensive computational analysis and built a full-scale mock-up in order to advise the client on the selection of appropriate automated interior sun-shading systems and their optimal calibration.

Interior view looking west from within the daylighting mockup of the NYT headquarters building.
Mockup of the NYT headquarters building elevation with exterior shading provided by ceramic tubes.

The results of their research can be found here.
