Archives for category: Proof of Concept

CJ FACADE

As part of a recent design effort here in the studio we attempted to develop a kinetic facade that could respond and adapt in real time to both solar radiation and user input. The client, CJ Corporation of Korea, was enthusiastic about the idea as part of their “only one” initiative, which promotes unique, one-of-a-kind thinking. While this certainly isn’t the only kinetic facade in the world, it presented our team with a new set of challenges.

[vimeo http://vimeo.com/19900510 w=460&h=259]

Read the rest of this entry »


For many of us, the holy grail of modeling surface detail is the ability to “paint” geometry directly onto a surface in 3d space – being able to generate complex effects, or influence subtle variation, with the stroke of a mouse or stylus. Tools such as Mudbox and ZBrush already support this exact mode of working; however, combining geometry painting with the parametrics of 3ds Max to achieve responsive panel behavior would be a “best of both worlds” scenario. We’ll test this concept in the video below. Using the new Viewport Canvas tool in combination with the displacement modifier, we’ll attempt to build and manipulate surface effects similar to the embossed pattern on Zaha Hadid’s footwear for Lacoste.

Read the rest of this entry »

The possibility of controlling panels in a Revit curtain wall through an Excel spreadsheet opens up a wide range of opportunities for interoperability between Revit and other tools in our workflow. Any program capable of exporting a range of values could potentially send values directly to Revit, saving users the hours of manual data entry currently required to translate a design or analysis model into a documentation (BIM) model. To enable Revit to read in values from Excel, we looked at two Revit plugins: Revit Excel Link, developed by CAD Technology Center, and Whitefeet, written by Mario Guttman. Using a combination of Revit and both plugins, we were able to develop the workflow demonstrated in the video below.

Read the rest of this entry »

Evolutionary problem solving mimics biological evolution, employing the same trial-and-error methods that nature uses in order to arrive at an optimized result. When automated for specific parameters and results, this technique becomes an effective way to computationally drive controlled results within the iterative design process – allowing designers to produce optimized parameters resulting in a form, graphic, or piece of data that best meets the design criteria. In this post we walk you through the process of using Galapagos, an evolutionary solver for Rhino/Grasshopper, and show an example of how this method can be tied in with analysis tools to optimize form based on energy data.
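
Galapagos handles the mechanics internally, but the core trial-and-error loop is easy to illustrate. Below is a minimal C++ sketch of the principle – not Galapagos itself – where a candidate parameter is repeatedly mutated and kept only when it scores better against a fitness function. The fitness function here is a hypothetical stand-in for an analysis result such as an energy score; in the Grasshopper version, the genome sliders play the role of the parameter and the analysis output plays the role of fitness.

    #include <cmath>
    #include <cstdio>
    #include <cstdlib>
    #include <ctime>

    // Hypothetical stand-in for an analysis result (e.g., an energy
    // score). Lower is better. A real project would call out to the
    // analysis tool here instead.
    double fitness(double x) {
        return std::pow(x - 42.0, 2.0); // optimum at x = 42
    }

    // Random nudge in [-step, +step].
    double mutate(double x, double step) {
        return x + step * (2.0 * std::rand() / RAND_MAX - 1.0);
    }

    int main() {
        std::srand(static_cast<unsigned>(std::time(nullptr)));
        double best = 0.0;                // initial guess for the parameter
        double bestScore = fitness(best);

        // Evolutionary loop: mutate, evaluate, keep the fitter candidate.
        for (int gen = 0; gen < 1000; ++gen) {
            double candidate = mutate(best, 5.0);
            double score = fitness(candidate);
            if (score < bestScore) {      // selection: survival of the fittest
                best = candidate;
                bestScore = score;
            }
        }
        std::printf("best parameter: %.3f (score %.6f)\n", best, bestScore);
        return 0;
    }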

Read the rest of this entry »

I had the pleasure of exploring stadium concepts for a potential project in Saudi Arabia. The stadium typology has much to offer on the subject of skin, response, and computational design, given its sheer size and volume of components. One avenue explored was a loop consisting of analysis data generated by Ecotect and a design concept formulated in Rhino3d.

Extracting data from Ecotect is quite simple. A .txt file with data in CSV format can be exported from Ecotect through the following location: Display->Object Attribute Values->Properties->Export Data… This file can be imported directly into Grasshopper with the ‘read file’ GH component. Although the information is accurately imported, there is a break in the loop which forces one to stop, export, and re-import the information. Luckily, Ecotect creates a Dynamic Data Exchange (DDE) server, which allows running applications to speak with each other. The GecoGH plugin is a great set of tools that executes this exchange in a seamless loop, where the scripted geometry/design is streamed to and from Ecotect. Below you will see some screenshots of a process utilizing this functionality.
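
For the file-based half of the loop, the exported .txt can be parsed by any tool that reads CSV. Here is a minimal C++ sketch, assuming a simple two-column id,value layout – the actual Ecotect export may carry more columns, and the filename is hypothetical:

    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <vector>

    int main() {
        // Path to the file exported via Display -> Object Attribute
        // Values -> Properties -> Export Data... (name is hypothetical).
        std::ifstream file("ecotect_export.txt");
        if (!file) {
            std::cerr << "could not open export file\n";
            return 1;
        }

        std::vector<double> values;
        std::string line;
        while (std::getline(file, line)) {
            std::stringstream row(line);
            std::string id, value;
            // Assumed layout: object id, attribute value, comma-separated.
            if (std::getline(row, id, ',') && std::getline(row, value, ',')) {
                try {
                    values.push_back(std::stod(value));
                } catch (...) { /* skip header or malformed rows */ }
            }
        }
        std::cout << "read " << values.size() << " attribute values\n";
        return 0;
    }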

GH mesh geometry

Read the rest of this entry »

Part 3 has an identical flow of data to part 2; the only difference is the type of sensor and the 3-dimensional relationships it affects. In this example we use a PING))) Ultrasonic Sensor to measure the proximity of an object (in this case my hand) to the sensor. The ultrasonic range finder works by sending out a burst of ultrasound and listening for the echo as it reflects off an object. Code written to the Arduino board sends a short pulse to trigger the detection, then listens for a pulse on the same pin using the pulseIn() function. The duration of this second pulse is equal to the time taken by the ultrasound to travel to the object and back to the sensor. Using the speed of sound, this time can be converted to distance. For more information on the code and circuit click here.
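
The Arduino side follows the standard Ping))) example closely; here is a trimmed sketch of that trigger/listen/convert cycle (the signal pin number is an assumption):

    const int pingPin = 7;   // signal pin for the PING))) sensor (assumed)

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      // Trigger: a short HIGH pulse on the signal pin starts a measurement.
      pinMode(pingPin, OUTPUT);
      digitalWrite(pingPin, LOW);
      delayMicroseconds(2);
      digitalWrite(pingPin, HIGH);
      delayMicroseconds(5);
      digitalWrite(pingPin, LOW);

      // Listen: the sensor answers with a pulse whose duration equals the
      // round-trip travel time of the ultrasound, read with pulseIn().
      pinMode(pingPin, INPUT);
      long duration = pulseIn(pingPin, HIGH);

      // Sound travels at roughly 29 microseconds per centimeter; halve
      // the time because the pulse covers the distance out and back.
      long cm = duration / 29 / 2;

      Serial.println(cm);
      delay(100);
    }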

The code written for the Arduino prints values in “in” and “cm”. In this example we only need one value, so we will use Serial.println(cm) and comment out the other Serial.print() functions. This will send a single array of values through the serial port, similar to the example in our previous post. In Rhino, the aperture is built by creating a circular array of pivot points about a center axis. These pivot points rotate a series of overlapping blades controlled by a single parameter, sketched below.
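
The geometric side is simple in principle. Here is a rough C++ sketch of the idea – pivot points arrayed on a circle, with a single rotation parameter mapped from the incoming distance value. The blade count, radius, and mapping range are all assumptions, not the actual Rhino build:

    #include <cmath>
    #include <cstdio>

    const double PI = 3.14159265358979;

    int main() {
        const int blades = 12;       // number of overlapping blades (assumed)
        const double radius = 10.0;  // radius of the pivot circle (assumed)

        // Incoming distance from the serial stream, in cm (assumed range).
        double cm = 25.0;

        // Map distance 5..50 cm onto a 0..60 degree blade rotation --
        // the single parameter that opens and closes the aperture.
        double t = (cm - 5.0) / (50.0 - 5.0);
        if (t < 0) t = 0;
        if (t > 1) t = 1;
        double rotation = t * 60.0;

        // Circular array of pivot points about the center axis.
        for (int i = 0; i < blades; ++i) {
            double a = 2.0 * PI * i / blades;
            double x = radius * std::cos(a);
            double y = radius * std::sin(a);
            std::printf("pivot %2d at (%6.2f, %6.2f), blade rotated %5.1f deg\n",
                        i, x, y, rotation);
        }
        return 0;
    }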

The values streaming from the serial port are fed directly into the Grasshopper build, driving the rotation of the blades. To see this process in action, watch the video below. Notice the values printed to the right-hand screen displaying the distance in centimeters.

See the video in HD here.

In our previous post, Making Things That Talk Part 1, we used the analogy of a group of people communicating in order to share and transfer information. In this post we build on the same concept with the microcontroller (Carl) and the sensor (Susan). There are two changes: Paul from Processing has left us and is replaced by Grant from Grasshopper, and the source, Sam, is still here, but he now supplies wind variance instead of light. In this example we use a really interesting anemometer/wind sensor from Modern Device. This little sensor uses a technique called “hot wire”, which involves heating an element to a constant temperature and measuring the electrical power required to keep the element at that temperature as the wind changes. As the wind varies, the sensor produces a varying voltage. These values are sent through the serial port into Grasshopper via the generic serial read component from Firefly. Once the values are piped into Grasshopper, they alter a series of relationships that inflate a 3-dimensional balloon.

The single array of data streaming from the sensor controls four relationships (sketched in code after the list):

  • a. varying division points spaced along central axis
  • b. varying circle radius about the division points
  • c. mesh surface constructed from control curves
  • d. z-axis translation based on radius
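
Here is a rough C++ sketch of those four relationships, with a single hard-coded sensor value standing in for the serial stream. The profile shape and all ranges are assumptions, not the actual GHX definition:

    #include <cmath>
    #include <cstdio>

    int main() {
        // Single value from the wind sensor, normalized 0..1 (assumed range).
        double wind = 0.6;

        const int divisions = 10;    // (a) division points along the axis
        const double height = 20.0;  // length of the central axis (assumed)

        for (int i = 0; i <= divisions; ++i) {
            double t = static_cast<double>(i) / divisions;

            // (a) division points spaced along the central axis
            double z = t * height;

            // (b) circle radius about each division point: a sine profile
            // scaled by the sensor value, so more wind = fatter balloon.
            double radius = (1.0 + 4.0 * wind) * std::sin(t * 3.14159265);

            // (d) z-axis translation based on radius: larger sections are
            // pushed upward, stretching the form as it inflates.
            z += 0.5 * radius;

            std::printf("section %2d: z = %6.2f, radius = %5.2f\n", i, z, radius);
        }
        // (c) in Grasshopper a mesh surface is then lofted through these
        // circles; here we only compute the control-section data.
        return 0;
    }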

GHX build

The final build showcases the digital inflation of a balloon driven by a wind sensor.

See this video in HD here.

Our previous post, Thoughts on Response and Interaction, mentions the idea of embedded systems and the process of physically making things talk. Our first example uses an Arduino microcontroller, an ambient light sensor, the Processing programming language/development environment, and a source of data, which in this case is light. This communication involves four participants, with the microcontroller acting as the manager directing all parties involved. To make things simple, I will make up names for each piece of technology used. We will call the microcontroller (Carl), the sensor (Susan), the Processing IDE (Paul), and the light source (Sam). We will assume that each person speaks a few languages; in reality, this communication takes place via programming syntax.

The dialog goes as so:

  1. Carl the controller goes to Susan the sensor, located at room (A0), and asks her to get data from Sam the source.
  2. Once Susan the sensor has the requested data, she sends it back to Carl the controller.
  3. Carl the controller then takes the data and sends it to Paul from Processing.
  4. Paul is very artistic and draws the data on the screen.

Carl is quite demanding, looping through this sequence over and over again…

Ambient light data display via Processing and Arduino

In reality, the process goes something like this (a minimal sketch of the Arduino side follows the list):

  1. Code written to the Arduino microcontroller asks for the analog input voltage streaming from pin (A0).
  2. The sensor plugged into pin (A0) captures light levels in the form of voltage, read as numeric values ranging from 0 to 1023.
  3. The code written to the Arduino microcontroller grabs those values and sends them through the serial port.
  4. The Processing code catches the values from the serial port and draws vertical lines based on them, which in turn gives us a graphic representation of light.
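
A minimal version of Carl’s side of the conversation, as an Arduino sketch (the baud rate and delay are assumptions):

    void setup() {
      Serial.begin(9600);        // open the serial port to Processing
    }

    void loop() {
      // Read the light sensor on analog pin A0: 0..1023 (10-bit ADC).
      int light = analogRead(A0);

      // Send the value down the serial port for Processing to draw.
      Serial.println(light);
      delay(10);                 // keep Carl's demands reasonable
    }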

As we build on this example, note that the general story stays the same; we swap out a few different characters, a few different roles, and a few different languages, but at the end of the day it’s still a group of ‘people’ ‘talking’ and relaying information.