For those who have always been drawn to Julianna Barwick’s exquisite soundscapes, her latest project takes the term literally. Partnering with the Ace Hotel for its latest boutique offering, Sister City, on Manhattan’s Lower East Side, Barwick created a sound installation for the hotel lobby that uses AI technology and rooftop cameras to pull together a score that’s never the same twice. The cameras help the AI adjust the score based on weather and light, in an attempt to match the music of the place with the mood of the city.
Instead of relying on a staid playlist or something like the slightly cheesy, self-aggrandizing presence of a live player and a grand piano, the hotel brought their emphasis on technology all the way into small but crucial details like the sonics of the hotel’s main shared space. Working with technology from Microsoft and her own signature ambient, looping style, the Southern-raised, LA-living composer, vocalist and producer has created a unique, completely individual musical project that even she hasn’t heard in action yet.
Sister City opens tomorrow, May 16, and it isn’t until then that the apparatus behind the living score will be fully set up and begin soundtracking the hotel’s lobby. And though she’s living in Los Angeles now, Barwick said she sees this project as a tribute to the many years she spent living, performing, and creating in New York — a fitting send-off for a city known for its ability to shape artists, from an artist who built her following there. Since she was prepping for a set at last weekend’s FORM Arcosanti festival, I spoke with Barwick about the project over the phone, discussing influences like Brian Eno and whether or not we’ll ever be able to hear the hotel’s soundtrack outside of Sister City’s lobby. Read a condensed, edited version of our conversation below.
How did you get involved with the project and what drew you to want to work with the Ace Hotel on this?
The company Listen, they haven’t really been mentioned, but they’re a creative company. They facilitate all kinds of cool collaborations and installations and things like that. They hit me up probably almost a year ago, and asked if I wanted to make music in collaboration with Microsoft to score the lobby of a new Ace Hotel in New York City. I thought it was a really cool opportunity, and a way to learn so many new things, so I was on board immediately. We’ve just been working all together on it ever since.
Additionally, it has been a really delightful project to do, like an homage to my time in New York, because I lived there for 16 years and had an infinity of experiences there. It was really cool for it to come full circle, especially in that neighborhood where I saw a million shows, where Other Music was. It’s been really awesome for it to be full circle in that New York City way, and then also to learn about this new technology. It’s a totally new world for me.
In a video clip where you’re talking about the project, you describe it as a sound installation that’s ever-changing. I think it’s so interesting to relate the fact that it’s ever-changing to your own role as the artist and creator, because that’s a pretty interesting tension with the freedom of the installation.
Right. What is interesting about it is I made five different sets that play through the day, so morning, noon, afternoon, evening, and night. I created a ton of music and little sound bites for when the AI recognizes the events, which are the things that the camera reads. There’s a camera pointed up into the sky on the roof, and it’s reading weather, and airplanes, and all kinds of things like that. I made all of the music, but I can’t be there 24 hours a day, seven days a week, running it. I think the interesting thing about this project is that the AI is used as a tool to create the music as well, so it’s doing a lot of work when I can’t be there to do it. It’s an interesting way to look at it, and in that way, it can always be on. It can always be reading information and events and keep the score constantly moving, and it will never be the same two days in a row, ever. I just find that really interesting, but that’s the role that the AI has in it.
Even decades-old fears about AI and technology center on the idea that, like, ‘They’re going to use us.’ But it’s an interesting role reversal because here the music is using the AI. It feels like the music has the power in this dichotomy.
And without the music, the AI could read the events, but then what? It did take, in this instance, several humans to dream this up, set it up, do the work, and then apply the AI to make it interesting. The Microsoft music technologist on the project wrote the program. It’s a generative music program that she wrote, so things are just accumulating. The AI is reading the events, and it goes straight into that program that she wrote. You need that conduit in between.