Bob Greenberg, the founder of R/GA, has a new exhibit at the Cooper Hewitt Museum of Design that opened on February 23, 2018, and runs until September 2018.
I led a team of developers in Buenos Aires. We built an Android and iOS app that gives users more information about each item on display. It uses machine learning to detect which object the user's phone camera is pointed at, and a tap brings up audio tours or more information.
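The recognition layer can be sketched roughly like this: a classifier labels the camera frame, and a lookup table maps that label to exhibit content. Everything here (the item names, the threshold, the stubbed `classify`) is illustrative, not the app's actual model or data.

```python
# Minimal sketch of the recognition-to-content lookup (all data made up).
CONFIDENCE_THRESHOLD = 0.6  # ignore uncertain predictions

EXHIBIT_ITEMS = {
    "vinyl_record": {"title": "Record Collection", "has_audio_tour": True},
    "sneaker": {"title": "Design Object", "has_audio_tour": False},
}

def classify(image_bytes):
    """Stand-in for the on-device image classifier: returns (label, confidence)."""
    return ("vinyl_record", 0.92)  # hard-coded for the sketch

def item_for_frame(image_bytes):
    """Map the top prediction to exhibit content, or None if unsure."""
    label, confidence = classify(image_bytes)
    if confidence < CONFIDENCE_THRESHOLD:
        return None  # camera isn't clearly pointed at a known object
    return EXHIBIT_ITEMS.get(label)

print(item_for_frame(b"")["title"])  # -> Record Collection
```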
I was also responsible for the AV setup, where we had three portrait displays showing video content. For two of those videos you could listen to the audio via the app, synced by a sub-audible Lisnr tone.
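Lisnr's actual encoding is proprietary, but the core idea is a near-ultrasonic audio carrier that speakers and phone mics can handle while most adults can't hear it. A toy sketch, with a crude frequency estimator to show the tone is recoverable from samples:

```python
import math

SAMPLE_RATE = 44100
CARRIER_HZ = 18_750  # near-ultrasonic: inaudible to most adults, within mic range

def tone(duration_s, freq=CARRIER_HZ, rate=SAMPLE_RATE):
    """Generate a pure sine carrier (Lisnr's real signal carries data on top)."""
    n = int(duration_s * rate)
    return [math.sin(2 * math.pi * freq * i / rate) for i in range(n)]

def estimate_freq(samples, rate=SAMPLE_RATE):
    """Crude frequency estimate: a sine crosses zero twice per cycle."""
    changes = sum(1 for a, b in zip(samples, samples[1:]) if (a >= 0) != (b >= 0))
    return changes * rate / (2 * len(samples))
```

A receiver listening for energy around the carrier frequency can then detect the tone and use it as a sync cue.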
The app is currently in the App Store and available for download, although you really need to be in the exhibit to use it.
Next, there was a site that generated Spotify playlists you could send to people to encourage them to buy you the device you wanted.
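I can't speak to the site's exact implementation, but programmatic playlist creation against the Spotify Web API boils down to two calls: create the playlist, then add tracks. This sketch just builds those request payloads (endpoint paths per Spotify's public docs; the names and data are illustrative):

```python
def build_playlist_requests(user_id, name, track_uris, message):
    """Build the two Spotify Web API request payloads: create, then add tracks."""
    create = {
        "method": "POST",
        "path": f"/v1/users/{user_id}/playlists",
        "body": {"name": name, "description": message, "public": True},
    }
    add_tracks = {
        "method": "POST",
        # real path needs the playlist id returned by the create call
        "path": "/v1/playlists/{playlist_id}/tracks",
        "body": {"uris": track_uris},
    }
    return create, add_tracks

create, add = build_playlist_requests(
    "gift_hinter", "Hint Hint", ["spotify:track:example123"], "Buy me the phone!"
)
```

Both calls need an OAuth token with playlist-modify scope, which the real site would handle server-side.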
Finally, there was a custom-generated video site that let you choose an adorable kid or a sassy grandma. You could add your own custom elements to the video to convince someone to buy you one of the devices available at Verizon.
For the release of the new iPhone 8, Verizon was looking for interesting ways to be top of mind for people shopping for the new phone. The concept we came up with was a Snapchat lens that could only be found in certain areas of the country. If a user was nearby, they would be driven to our mobile experience via a Snap Ad. It would show a clue directing them to a general location; when they got there, they'd check the site again and there'd be a more specific clue. When they got to the right location, the lens would be unlocked, and they would snap it to us to win. We had to build a mobile experience with a lot of integrations with sports and weather APIs to make sure the lens was available at the right time.
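The availability check behind something like this combines a geofence with live conditions. The rules below (rain kills the lens, game must be in progress, the radius and dates) are made up to illustrate the shape of the logic, not the actual campaign rules:

```python
import math
from datetime import datetime

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def lens_unlocked(user_lat, user_lon, site, now, game_on, raining):
    """Made-up rules standing in for the real sports/weather API checks."""
    if raining or not game_on:
        return False
    if not (site["start"] <= now <= site["end"]):
        return False
    return haversine_km(user_lat, user_lon, site["lat"], site["lon"]) <= site["radius_km"]

# illustrative unlock site near the Flatiron Building during a game window
site = {"lat": 40.7411, "lon": -73.9897, "radius_km": 0.2,
        "start": datetime(2017, 9, 22, 12), "end": datetime(2017, 9, 22, 18)}
```

In production the game and weather states would come from polling the external APIs, cached so every page hit doesn't fan out to third parties.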
My first chatbot! A friend at work was trying to launch a wine guide for the Google Home. I used api.ai to structure all the things you can ask it and created a backend API that let him easily add and edit the responses. If you have a Google Home or use Google Actions, give it a try.
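The backend for an api.ai agent is essentially a webhook that maps a matched intent to response text. This sketch assumes the v1-era webhook shape (`result.metadata.intentName` in, `speech`/`displayText` out); the intent names and wine pairings are invented, not the real guide's content:

```python
# Editable response table the backend exposed for easy updates (made-up data).
RESPONSES = {
    "pairing.steak": "A bold Cabernet Sauvignon stands up well to steak.",
    "pairing.fish": "Try a crisp Sauvignon Blanc with most fish dishes.",
}

def fulfill(request_json):
    """Handle an api.ai-style webhook request and return the fulfillment JSON."""
    intent = request_json.get("result", {}).get("metadata", {}).get("intentName", "")
    text = RESPONSES.get(intent, "Sorry, I don't have a pairing for that yet.")
    return {"speech": text, "displayText": text}
```

Keeping the responses in a table like this (backed by a database in practice) is what lets a non-developer add and edit answers without touching the agent.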
To promote the latest Samsung TV, we created the first-ever billion-color film. Because of the sheer scale of the color data, we ended up having to use data-science methods to count the distinct colors and then add new colors into the video.
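To give a sense of the counting problem: a billion colors implies more than 8 bits per channel (24-bit RGB tops out at ~16.7 million), so think 10 bits per channel. A naive exact count packs each pixel into one integer and collects them in a set; this is my illustrative sketch, not our actual pipeline, and at real film scale you'd reach for approximate distinct-count methods instead:

```python
def pack_rgb10(r, g, b):
    """Pack a 10-bit-per-channel color (each 0-1023) into a single int."""
    return (r << 20) | (g << 10) | b

def count_distinct_colors(frames):
    """Exact distinct-color count over frames of (r, g, b) pixels."""
    seen = set()
    for frame in frames:
        for r, g, b in frame:
            seen.add(pack_rgb10(r, g, b))
    return len(seen)

frames = [[(0, 0, 0), (1023, 0, 0)], [(0, 0, 0), (0, 512, 0)]]
print(count_distinct_colors(frames))  # -> 3
```

A full feature's worth of 10-bit pixels would blow out memory with an exact set, which is why approximate counters (e.g. HyperLogLog-style sketches) are the usual tool at that scale.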
My son is in the Cub Scouts and wanted to build a “Star Power Mario Car” for the pinewood derby. There are a lot of awards other than fastest that you can pick up, so I decided to go for style points. I used a Trinket, Bluefruit, and NeoPixel strips from Adafruit and basically combined this and this tutorial.
It worked great for the judging, but during the actual race the NeoPixels weren’t working. I think they didn’t have enough power going to them.
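The power math backs that theory up. Adafruit's rule of thumb is roughly 60 mA per NeoPixel at full white, so even a short strip outruns a small battery fast. I don't remember the exact pixel count on the car; 30 below is just illustrative:

```python
MA_PER_PIXEL = 60  # Adafruit rule of thumb: one NeoPixel at full white draws ~60 mA

def strip_amps(num_pixels, brightness=1.0):
    """Worst-case supply current (in amps) for a NeoPixel strip."""
    return num_pixels * MA_PER_PIXEL * brightness / 1000.0

print(strip_amps(30))        # -> 1.8 (amps at full white)
print(strip_amps(30, 0.25))  # -> 0.45 (dimming buys a lot of headroom)
```

At 1.8 A a small LiPo or coin-cell setup will sag hard, which matches pixels that light for a quick demo but die under sustained use.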
Samsung was looking for a new experience in their 837 space in the meatpacking district. We worked with a photographer named Carlos Serrao to put together a tunnel where photography lighting and a Samsung Galaxy S7 phone worked together to create interesting pictures of people. After walking through, you could use a different S7 to see your picture, email it to yourself, and share it on the video wall outside the tunnel. Here’s my picture from when we opened it.
Unfortunately, my Kickstarter idea did not get funded. Due to the way the product gets manufactured, I needed A LOT of money up front. It was a stretch, but sometimes these things take off and sometimes they don’t. In this case it didn’t. The process of coming up with an idea, developing it, launching a Kickstarter, and trying to promote it was a really fun learning experience for me. I’m looking forward to the next project where I can get out of my comfort zone and try something new.
A lot of my time in 2015 was spent on these large screens in the cafe of the R/GA office. I was the technical director, and there was a large team of creatives and developers on this project: 30 separate displays all working together, able to be updated in real time, all at full resolution. It was not easy putting together a solution, but on January 4, 2016, when we moved into the new office, it was all online. Shortly after launch, I figured out a way to use Lisnr technology to let people hear the audio for the videos on their mobile devices.
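One piece of a wall like this is deciding which slice of the full-resolution canvas each display renders. Assuming a simple uniform grid (the real wall's layout and pipeline were more involved than this), the mapping is just row-major arithmetic:

```python
def crop_for_display(index, cols, display_w, display_h):
    """(x, y, w, h) region of the full canvas shown by display `index`, row-major.
    Assumes a uniform grid of identical displays -- an illustrative simplification."""
    col, row = index % cols, index // cols
    return (col * display_w, row * display_h, display_w, display_h)

# e.g. 30 displays as an assumed 6x5 grid of 1080p panels
FULL_W, FULL_H = 6 * 1920, 5 * 1080
print(crop_for_display(7, 6, 1920, 1080))  # -> (1920, 1080, 1920, 1080)
```

Each player then only decodes and renders its own crop, while a central clock keeps all 30 in step for real-time updates.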