Nature and Tech Talk: A Q&A with Kimchi and Chips

Kimchi and Chips give us the skinny on their augmented digital installation Lit Tree.

Lit Tree is an augmented projection that responds to viewers’ gestures, using plants as volumetric canvases. The installation, made from a cloud of glittering, dancing voxels, was one of the premier artworks featured at our Seoul Event this past September. The designers behind this extraordinary exploration of the interaction and influence between tech and nature are the London- and Seoul-based Kimchi and Chips, who we sat down with to discuss the details of their incredibly cryptic and visually stunning installation.


The Creators Project: As a unification between nature, technology and audience, Lit Tree showcases a fairly dense technological concept in a rather simplified interactive medium. Please describe the objective behind your work.
Kimchi and Chips: We hate visual trash, and the more powerful visual tools become, the more visual trash takes over our environment. On the other hand, we’re innovating new platforms for digital media. The conundrum is how to create something that really gets to people and connects with their thoughts, but can’t be easily exploited.

We look at media facades and we see cheap, low-hanging fruit abused by marketing companies all over the planet. In Seoul, every big street is dotted with bright neon, LEDs and projections. Big flat concrete surfaces make great canvases for product messages and high-contrast graphics.

We wanted to make a new platform for visual media within the built environment and started moving towards trees due to their inherent complexity, unpredictable nature and [their] volumetric surface. If you can project onto the surface of a tree, taking into account the intricate 3D position of where every pixel lands, then you achieve this incredible type of volumetric projection and can present volumetric graphics.
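The idea described above — knowing the 3D position where every projector pixel lands, then coloring each pixel by sampling content at that point in space — can be sketched as follows. This is an illustrative sketch, not Kimchi and Chips' actual code; the `climbing_wave` content function and the random stand-in points are hypothetical.

```python
import numpy as np

def render_volumetric_frame(pixel_positions, content_fn):
    """Color each projector pixel by sampling volumetric content at the
    3D point on the tree surface where that pixel lands.

    pixel_positions: (N, 3) array of world-space XYZ, one row per pixel
    content_fn: maps (N, 3) points to (N, 3) RGB values in [0, 1]
    """
    return content_fn(pixel_positions)

def climbing_wave(points, phase=0.0):
    """Hypothetical content: a wave of brightness tied to height (Z),
    so light appears to climb through the tree's volume."""
    brightness = 0.5 + 0.5 * np.sin(points[:, 2] * 4.0 + phase)
    return np.stack([brightness, brightness, brightness], axis=1)

# Stand-in for scanned per-pixel positions on the tree surface.
points = np.random.rand(1000, 3)
frame = render_volumetric_frame(points, climbing_wave)
```

Because the content is defined over 3D points rather than a flat image plane, the same function can drive any scanned geometry — which is what makes a tree, rather than a concrete wall, a viable canvas.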

What sort of software/hardware did you use?
We generally aim to use low-cost hardware and make our software open source. For the scanning of the tree we developed our own software MapTools-SL, which works with a pair of webcams and ordinary video projectors. We try to choose hardware that has the right depth of field, resolution and drivers that other people can get a hold of.
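A webcam-plus-projector scan of this kind typically works by projecting a sequence of structured-light patterns and decoding, at each camera pixel, which projector pixel it sees. The sketch below shows one common approach — binary Gray-code patterns — as a minimal illustration; MapTools-SL's actual pipeline may differ, and the function names here are assumptions.

```python
import numpy as np

def gray_to_binary(g):
    """Convert Gray-coded integers to plain binary integers."""
    b = g.copy()
    shift = 1
    while (b >> shift).any():
        b ^= b >> shift
        shift <<= 1
    return b

def decode_column_index(captures, threshold=0.5):
    """Decode which projector column each camera pixel sees.

    captures: list of camera images (H, W), one per projected bit
              pattern, most significant bit first.
    Returns an (H, W) array of projector column indices.
    """
    bits = [(img > threshold).astype(np.uint32) for img in captures]
    gray = np.zeros_like(bits[0])
    for bit in bits:
        gray = (gray << 1) | bit
    return gray_to_binary(gray)

# Simulate ideal captures of an 8-column projector (3 Gray-code bits).
cols = np.arange(8, dtype=np.uint32)
gray = cols ^ (cols >> 1)
captures = [((gray >> b) & 1).astype(float).reshape(1, 8) for b in (2, 1, 0)]
decoded = decode_column_index(captures)
```

With a second camera (the pair of webcams mentioned above), the decoded correspondences can be triangulated into the 3D position of each projector pixel on the tree.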


You explain Lit Tree as an alternative to the media facade. By combining interactivity with the natural environment in this sort of way, how is technology seamlessly integrated into the ambiance rather than causing a disturbance?
Lit Tree experiments with how to create volumetric digital content for a tree. What we find is that in order to really convince and capture the viewer, we must present patient graphics that work hard to describe themselves. And in order to work at all, the system must ‘keep in touch’ with the evolving shape of the tree. It’s a challenge that involves studying the form of the tree, rather than overriding it.

What sort of obstacles do you face when localizing Lit Tree for different venues, like Platoon for instance?
It depends on the space. In art spaces we can keep the environment controlled and work on a preconceived design once we get to the venue. Adapting to existing spaces is more challenging and really pushes the limits of the system.

We need decent scans of the whole installation, so ideally you can stand back and find a camera position that can cover it all at once without going too wide. At Platoon we were in a confined space [where we joined] four or six scans from different viewpoints. Our software has all the maths to deal with this, but it means more work and less time to fine-tune the results, which is painful.
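Joining scans from several viewpoints, as described above, amounts to transforming each partial scan from its own camera frame into one shared world frame. A minimal sketch, assuming each scan's pose (rotation `R`, translation `t`) is already known from calibration:

```python
import numpy as np

def merge_scans(scans, poses):
    """Bring several partial scans into a single world frame.

    scans: list of (N_i, 3) point arrays, each in its own camera frame
    poses: list of (R, t) pairs — R a 3x3 rotation matrix, t a 3-vector —
           mapping that camera's frame into the shared world frame
    """
    merged = [pts @ R.T + t for pts, (R, t) in zip(scans, poses)]
    return np.vstack(merged)

# Two toy scans: one already in the world frame, one offset by 1 m in Z.
scan_a = np.array([[1.0, 0.0, 0.0]])
scan_b = np.array([[0.0, 2.0, 0.0]])
identity = np.eye(3)
cloud = merge_scans(
    [scan_a, scan_b],
    [(identity, np.zeros(3)), (identity, np.array([0.0, 0.0, 1.0]))],
)
```

The harder part in practice — and the "more work" mentioned above — is estimating those poses reliably in a cramped space, then fine-tuning so the joined scans line up without visible seams.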

How do you see Lit Tree developing (i.e., as you mentioned, alternative methods for photosynthesis, botanical research)?
Yes, due to the photosynthesis effects triggered by the projector and the enormous control you get over that in 3D using our system, there are definite opportunities for research into plant growth and triggering complex hormonal states within them. We’d love to get started on this and have a few contacts to chase up in that field.

Will you be using MapTools-SL, your structured light system, in future projects?
We have a big project coming up that will extend the software even more. What if we can project onto moving materials, such as trees moving in the wind, with each element of the tree rasterizing a volume of light? There’s so much more that can be done with this.

But also, we’d really like to work on getting the tools into [the hands of] other artists and designers—slowing down and explaining the ideas, rather than speeding off and making new tools. We hope more people will get involved with using the software.

Video footage courtesy of Kimchi and Chips.