The challenge was to raise awareness of Shopping on Instagram. We partnered with Unicorn XP and worked with the Instagram design team on the launch of an exclusive 12-day pop-up with London’s most iconic retailer, Selfridges.
For maximum exposure, and to bring the shop to life, Instagram took over one of Selfridges’ famous Christmas windows with a digital art installation that shoppers on Oxford Street could interact and play with.
The deadline was tight, and we had to take a linear piece of motion graphics and make it interactive. Access to the site was difficult, and we weren’t going to be able to install the hardware until the last minute. That was far from ideal for development and testing, so we had to be preemptive in our approach to the technology.
We started by researching all of the tools at our disposal to enable interaction. The options were:
- Standard webcam with face detection. This only requires a clear view of the street outside the window and the ability to recognise faces. We built prototypes and were able to detect multiple faces at once. Using this method we could track where each face was on screen and use its position to influence the content.
- Standard webcam with OpenPose to detect full skeletal pose information. This framework is computationally intensive, but it can estimate a user’s skeletal structure from a standard camera feed alone. With this technique we could influence the content with a user’s arms and legs, enabling a lot of fun interaction. The problem was apparent, however: it requires a user to be clearly visible and isolated in the frame, and on a busy Oxford Street this would be very difficult.
- Microsoft Kinect depth-sensing camera. This is the optimal technology for recognising a user in front of the display, but its infra-red sensor requires a clear, direct line of sight to the user. We knew this was going to be an issue with the glass frontage unless we could install the sensor at street level and at the right height.
As anticipated, the glass became a huge problem for us. The client wanted the best interactive experience, and in the studio this was achieved using the Kinect depth sensor; our research and testing proved that it could work through the thick glass.
Unfortunately, when the screen was installed it was much higher than we’d calculated and planned for, so the depth sensor had to be positioned much higher up and angled down towards the user. The steep angle combined with refraction through the thick glass was a perfect storm, and we had real problems getting the sensor to detect users’ gestures.
With some tweaking of the code, we were able to get it to recognise movement as pedestrians walked past the screen, and this was enough to provide some interaction. Luckily, this was all we were ever going to achieve anyway, as the crowds on Oxford Street over Christmas were unbelievable.
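The movement detection we fell back on can be approximated by simple frame differencing: compare consecutive camera frames and treat a large enough change as someone walking past. A minimal sketch, assuming greyscale frames as NumPy arrays (the threshold value here is illustrative, not the one we shipped):

```python
import numpy as np

def motion_level(prev_frame, curr_frame):
    """Mean absolute pixel difference between two greyscale frames (0-255)."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float(diff.mean())

def is_moving(prev_frame, curr_frame, threshold=8.0):
    """True when enough pixels changed to suggest a pedestrian walking past."""
    return motion_level(prev_frame, curr_frame) > threshold
```

The nice property of this approach is that it degrades gracefully in a crowd: it doesn’t need to isolate one person, it only needs the scene to change, which suited the Christmas footfall on Oxford Street.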
Unicorn XP has a case-study page on their website with the results they collected.
Here is a video we shot of the content playing out on the screen.