- Published: Wednesday, 30 September 2015
Aipoly aims to help the blind navigate the world. Our first application lets a blind person take a photo of their surroundings and have it described to them in a sentence. Aipoly's initial prototype uses convolutional neural networks to interpret images and relay descriptions to the user in real time, backed by an API with a human layer for more descriptive semantic responses.

We got into TechCrunch with Aipoly — my first ever TechCrunch article! :) You can read it in full here! I worked on the project with my Singularity University classmate Alberto Rizzoli from Italy, and Mark Parncutt from Australia helped us build our first prototypes.
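The two-tier flow above — a fast on-device classifier for instant answers, falling back to a slower, human-backed API when confidence is low — can be sketched roughly as follows. This is a minimal, hypothetical illustration; the function names, threshold, and return values are my own assumptions, not Aipoly's actual code.

```python
# Hypothetical sketch of the two-tier description flow:
# a fast on-device classifier gives an instant one-sentence label,
# and low-confidence results fall back to the slower, more
# descriptive (human-assisted) API. All names are illustrative.

CONFIDENCE_THRESHOLD = 0.6  # assumed cutoff for trusting the CNN

def classify_on_device(image):
    """Stand-in for the CNN classifier: returns (label, confidence)."""
    # A real implementation would run a convolutional network here.
    return ("dog", 0.82)

def describe_via_api(image):
    """Stand-in for the API with a human layer."""
    return "A small brown dog sitting on a park bench."

def describe(image):
    label, confidence = classify_on_device(image)
    if confidence >= CONFIDENCE_THRESHOLD:
        # Fast path: turn the predicted label into a spoken sentence.
        return f"I see a {label}."
    # Slow path: richer semantic description from the human-backed API.
    return describe_via_api(image)

print(describe(None))
```

In this sketch the fast path fires because the stubbed classifier is confident; dropping the confidence below the threshold would route the request to the human-backed API instead.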
Here is the video we made during the first week of our two-week project at Singularity University, showing users' responses to the app.
I was attending a picnic with 70 blind people, showing them the app, when morning broke in Australia — so I did a Weekend Breakfast interview with ABC News 24.
After two weeks at Singularity University, our work on Aipoly and our pitch impressed our classmates and the judges so much that we were given the privilege of presenting at the Closing Ceremony as the final speakers. Here is a video made by Singularity University in which we speak about that experience!
And finally, Andrea Bocelli used our Aipoly app and gave us his feedback on it!