Interaction is often defined as a type of action that occurs as two or more objects have an effect upon one another. When people interact with each other, the progression is an iterative loop of speaking, listening and processing. This phenomenon is ubiquitous, taking place in many forms within our environment. Smartphones, cars, software and the World Wide Web are all things we interact with daily. As technology continues to advance, these interactions have become a woven and integral part of the way we live.

Most of us have already designed our own digitally interactive relationships using parametric modeling software such as Revit (family parameters), 3ds Max (modifier stack), Rhino (Grasshopper logic) and CATIA (component object model).


The parametric user interfaces of Revit, 3ds Max, Rhino and CATIA

Designers use software to interact with a modeled space or geometry and its parameters in a continuous loop until the desired outcome or design is reached. This dynamic level of interaction is becoming more and more vital in practice, but it remains predominantly digital and process-based: its primary focus is managing the design process and making it more flexible. That flexibility and interaction expire once the design is set and the model is finalized and materialized in construction documents.
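To make that loop concrete, here is a minimal sketch of a parametric relationship in plain Python. The class and parameter names (PanelModel, width, height, louver_angle) are hypothetical and not tied to Revit, Rhino or CATIA; the point is simply that geometry is derived from parameters, so changing a parameter and regenerating is the whole interaction.

```python
import math


class PanelModel:
    """A toy parametric model: geometry is derived entirely from parameters."""

    def __init__(self, width=1.0, height=2.0, louver_angle=30.0):
        self.params = {"width": width, "height": height, "louver_angle": louver_angle}

    def regenerate(self):
        """Recompute derived geometry from the current parameter values."""
        w = self.params["width"]
        h = self.params["height"]
        angle = math.radians(self.params["louver_angle"])
        # The effective opening shrinks as the louver rotates toward closed.
        opening = w * h * math.cos(angle)
        return {"area": w * h, "opening": round(opening, 3)}


# The design loop: adjust a parameter, regenerate, evaluate, repeat.
model = PanelModel()
for angle in (0, 30, 60):
    model.params["louver_angle"] = angle
    print(angle, model.regenerate())
```

Once the drawings go out the door, though, this loop stops running: the parameters are frozen into a fixed artifact.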

In an effort to create responsive buildings (rather than merely responsive models), we seek to pull this notion of flexible interaction from the realm of digital design into the physical world we design for. Understanding these fundamental relationships of interaction and communication is key to realizing the potential buildings have to respond to their users, environment, function, etc. As a first step towards realizing these potentials, our next few blog posts explore the concept of embedded computation, or embedded systems, and how they can contribute to responsive building skins.
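As a preview of where those posts are headed, the sketch below shows the kind of sense-process-actuate loop an embedded system runs continuously, which is essentially the same iterative loop as the parametric example above, moved from the model into the building. The sensor and actuator functions are hypothetical placeholders rather than any specific microcontroller API.

```python
import random
import time


def read_light_level():
    """Stand-in for a photoresistor or lux sensor reading (0-100)."""
    return random.uniform(0, 100)


def set_louver_angle(angle):
    """Stand-in for driving a servo or linear actuator on the skin."""
    print(f"louver angle -> {angle:.0f} degrees")


def control_loop(cycles=5, threshold=60.0):
    """Sense the environment, decide a response, act, then listen again."""
    for _ in range(cycles):
        light = read_light_level()                   # sense
        angle = 90.0 if light > threshold else 0.0   # process
        set_louver_angle(angle)                      # actuate
        time.sleep(0.1)                              # and repeat


control_loop()
```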