Ground-breaking BBC app prototype using augmented reality, image recognition and machine learning
As part of an initiative to explore interactive digital technologies, BBC iWonder chose Bolser’s cutting-edge mobile app team to develop an interactive proof-of-concept mobile app.
The app used image recognition software to transform photos of objects taken by the user into an augmented reality 3D view. Users were encouraged to add content to the AR image and share it online so that others could learn and contribute further content, effectively creating a virtual conversation about a particular object, all powered by the BBC.
The initial target market for the prototype app was BBC employees.
- Positive user feedback
- Working prototype delivery
Integrating innovative interactive technologies
The main challenge for our specialist teams was that this had never been done before.
Requirements included image recognition so that, when a user took a photograph of an everyday object, that photograph would link to a database of images stored in an image recognition library.
Once the user had tagged their photograph, an iPhone for example, the app would allow them to write a story about the phone, save it to the database and share it on social media.
Another user could then learn about an iPhone, with content/stories provided by other users, and would be able to view the image from every angle in an augmented reality environment.
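The capture, tag, story and share flow described above can be sketched in JavaScript. This is a hypothetical illustration only: the function and field names (`recognizeImage`, `tagAndWriteStory`, `addStory`) are invented for clarity, and the recognition step is mocked rather than calling a real image library.

```javascript
// Mock recognition step standing in for the image recognition library lookup.
function recognizeImage(photo) {
  const library = { "photo-001": "iPhone", "photo-002": "coffee mug" };
  return library[photo.id] || "unknown object";
}

// Attach a recognised tag and the user's first story to the photo record.
function tagAndWriteStory(photo, story) {
  const tag = recognizeImage(photo);
  return { photoId: photo.id, tag, stories: [story] };
}

// Another user adds to the shared conversation about the same object.
function addStory(record, story) {
  return { ...record, stories: [...record.stories, story] };
}

const record = tagAndWriteStory({ id: "photo-001" }, "My first smartphone.");
const updated = addStory(record, "I learned to code on one of these.");
console.log(updated.tag, updated.stories.length); // iPhone 2
```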
The solution also needed ‘always on’ connectivity, and multi-device and operating system compatibility.
The BBC wanted us to deliver a working proof of concept app within two months of initiating the project.
Used extensive UX research to create an interactive and user-friendly app
Our app development specialists designed, built, tested and delivered a fully functional prototype within the specified two-month timescale. Activities included:
- UX-led research to inform design specifications
- Using agile scrum methodology for rapid prototyping
- Sourcing and integrating AR and machine learning technologies
How we delivered AR and machine learning
Our team built the app using the Ionic framework on Apache Cordova. This allowed us to develop a highly interactive app in HTML, CSS and JavaScript and deploy it across both iOS and Android devices.
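In an Ionic/Cordova app, native device features such as the camera are typically reached through plugins. The sketch below wraps the `cordova-plugin-camera` callback API in a promise, a common pattern; the stub fallback is an assumption added so the code can run outside a device, and is not part of the plugin itself.

```javascript
// Promise wrapper around cordova-plugin-camera's navigator.camera.getPicture.
// On a device this resolves with a file URI; elsewhere it returns a stub
// reference so the flow can be exercised in a plain JS environment.
function capturePhoto() {
  return new Promise((resolve, reject) => {
    if (typeof navigator !== "undefined" && navigator.camera) {
      navigator.camera.getPicture(resolve, reject, {
        quality: 75,
        destinationType: 1, // Camera.DestinationType.FILE_URI
      });
    } else {
      // Outside Cordova (e.g. browser or Node testing), return a stub image reference.
      resolve("stub://photo-001");
    }
  });
}

capturePhoto().then((uri) => console.log(uri));
```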
We used CloudSight's machine learning software, which recognises almost any image, developed software to instantly display an AR marker over the recognised object, and modified Wikitude's AR software to display user stories.
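The recognition step works over HTTP: the app sends an image to the recognition service and receives a description back. The sketch below shows the general shape of such a request; the endpoint URL, auth header format and body fields are assumptions for illustration, not CloudSight's documented contract.

```javascript
// Hedged sketch of building an image recognition request in the style of a
// CloudSight-like HTTP API. Endpoint and auth scheme below are assumed.
function buildRecognitionRequest(imageUrl, apiKey) {
  return {
    url: "https://api.cloudsight.ai/v1/images", // assumed endpoint
    method: "POST",
    headers: {
      Authorization: `CloudSight ${apiKey}`, // assumed auth scheme
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ remote_image_url: imageUrl, locale: "en" }),
  };
}

const req = buildRecognitionRequest("https://example.com/phone.jpg", "MY_KEY");
console.log(req.method, JSON.parse(req.body).remote_image_url);
```

The request object could then be passed to `fetch` or Cordova's HTTP plugin; it is kept as a pure builder here so the shape is easy to inspect and test.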
The app uses the GPS co-ordinates of the image, plus device sensor data on the angle and position of the camera, to allow location targeting and accurate overlaying of the library images.
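Combining location and orientation into an AR anchor might look like the sketch below. `buildAnchor` is an invented helper, not part of the app or of Wikitude; the input shapes mirror the standard browser Geolocation and `deviceorientation` APIs.

```javascript
// Hypothetical helper combining GPS coordinates with device orientation
// readings to anchor an AR overlay at a location and viewing angle.
function buildAnchor(position, orientation) {
  return {
    lat: position.coords.latitude,
    lon: position.coords.longitude,
    heading: orientation.alpha, // rotation around the z-axis, in degrees
    tilt: orientation.beta,     // front-to-back tilt, in degrees
  };
}

// In a browser, the inputs would come from the standard sensor APIs:
//   navigator.geolocation.getCurrentPosition(pos => ...);
//   window.addEventListener("deviceorientation", evt => ...);
const anchor = buildAnchor(
  { coords: { latitude: 51.5072, longitude: -0.1276 } },
  { alpha: 90, beta: 10, gamma: 0 }
);
console.log(anchor.lat, anchor.heading); // 51.5072 90
```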
A fully functional, easy-to-use app with 100% positive feedback
A fully functional prototype was delivered on budget and within the specified two-month time frame.
The app is easy to use, engaging and interactive, and focuses on the stories rather than the technology, making it accessible to everyone.
Feedback from internal testing at the BBC was 100% positive, and the project was deemed a success.