WHAT IS PA!GO?

PA!GO is an innovation project by Panasonic that uses Google's Coral products to design and develop an educational and entertaining product for kids, encouraging them to explore, play and learn.

Nowadays, young children know how to unlock mobile devices before they can even tie their shoes. We need to ensure technology nurtures a child's imagination and curiosity rather than taking it over. So, instead of reducing their exposure to technology, we asked ourselves how we could use it to encourage them to spend more time outside and learn about the world around them.

We decided to turn the ubiquitous remote-control into something that would allow children to interact with the real world in a new and magical way instead of relying on a screen.

INPUT

EXPLORING

Find, Capture and Store

PA!GO is built around the Coral USB Accelerator and captures, analyzes and identifies the world around you using on-device machine learning powered by Google's TensorFlow. Because its MobileNet models are trained on the iNaturalist dataset, users can immediately hear descriptions of the things they've captured. A secondary button runs these base descriptions through Google's Knowledge Graph so kids can learn even more about their world.

OUTPUT

VIEWING

Share with Family

The information captured by PA!GO can also transform the time a family spends around a screen into an opportunity for children to learn and share their experiences and the objects they encounter during their day.

Chromecast PA!GO to any TV or monitor, and kids can see their day's adventure and navigate it along a timeline, pointing out the images they captured along with related topics, relevant educational videos, and other information from Google's Knowledge Graph.

Storytelling [Coming Soon]

The vision for PA!GO aims to combine the best of Panasonic’s camera and projection hardware capabilities, including a micro projector as a ‘torch light’ attachment. This attachment will allow kids to project the images and animals they’ve captured onto walls or their bedroom ceiling.

The open and flowing ‘at home’ user interface and functionality of PA!GO encourages children to develop their own descriptions and stories of their day. Google’s Knowledge Graph and relevant YouTube Kids content will further inform their stories allowing the entire family to learn more about the world around them.

HOW IT WORKS

Hardware
[Edge TPU + Pi ZERO]

PA!GO is powered by Google’s TensorFlow platform and the Coral USB Accelerator. The USB Accelerator uses Google’s Edge TPU to provide inference acceleration for machine learning models and is linked to the Raspberry Pi Zero dev board over a USB 2.0 interface.

The benefit of on-device ML using the mobile-optimised TensorFlow Lite is low power consumption, and all processing is done without the need for internet connectivity.
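As a rough sketch of how this capture-and-classify step might look in code: the pycoral calls below are the library's real API, but the model and label filenames, the input image path, and the surrounding structure are assumptions for illustration, not the actual PA!GO firmware.

```python
# Pure post-processing helper: pick the top-k label indices from a score
# vector, highest score first. This mirrors what pycoral's classify adapter
# does and is hardware-independent, so it can run anywhere.
def top_k(scores, k=1):
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]


def classify_capture():
    # Hardware-dependent part: needs a Coral USB Accelerator plus the
    # pycoral and pillow packages, so the imports live inside the function.
    # In the prototype this would run when the capture button is pressed.
    from PIL import Image
    from pycoral.adapters import classify, common
    from pycoral.utils.dataset import read_label_file
    from pycoral.utils.edgetpu import make_interpreter

    # Model and label filenames are assumed examples.
    interpreter = make_interpreter("mobilenet_v2_inat_bird_quant_edgetpu.tflite")
    interpreter.allocate_tensors()
    labels = read_label_file("inat_bird_labels.txt")

    image = Image.open("capture.jpg").resize(common.input_size(interpreter))
    common.set_input(interpreter, image)
    interpreter.invoke()  # inference runs on the Edge TPU

    top = classify.get_classes(interpreter, top_k=1)[0]
    return labels.get(top.id), top.score
```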

TensorFlow + Coral USB Accelerator

Button 2 [More info]

Learn more about what you’ve captured by pressing this button. PA!GO takes the description of your last capture and taps into Google’s Knowledge Graph to analyze it even further, giving you a more detailed description.
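For illustration, a lookup like this could use Google's public Knowledge Graph Search API. The endpoint and response fields below match that real API, but the query, API key, and helper names are placeholders, not PA!GO's actual implementation.

```python
import json
import urllib.parse
import urllib.request

# Real endpoint of the Knowledge Graph Search API.
KG_ENDPOINT = "https://kgsearch.googleapis.com/v1/entities:search"


def kg_description(response):
    """Pull a readable description out of a Knowledge Graph Search response."""
    items = response.get("itemListElement", [])
    if not items:
        return None
    result = items[0]["result"]
    # Prefer the longer encyclopedic text, fall back to the short description.
    detail = result.get("detailedDescription", {})
    return detail.get("articleBody") or result.get("description")


def lookup(query, api_key):
    # Network call: needs a valid API key, so it is not exercised here.
    params = urllib.parse.urlencode({"query": query, "key": api_key, "limit": 1})
    with urllib.request.urlopen(f"{KG_ENDPOINT}?{params}") as resp:
        return kg_description(json.load(resp))
```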

Button 1 [Capture]

Point PA!GO at anything you want to capture and press the button. You’ll hear a sound once the object has been recognized, then hold PA!GO up to your ear to hear an automatic description of what you’ve captured through the internal speaker.
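The capture interaction above can be sketched as a small flow. The `classify`, `beep` and `speak` callables here are hypothetical stand-ins for the camera/classifier, confirmation sound and text-to-speech plumbing; the confidence threshold is also an assumption.

```python
def on_capture(classify, beep, speak, min_score=0.5):
    """Capture-button handler: classify the current frame, confirm
    recognition with a sound, then speak the label through the
    internal speaker. Returns the label, or None if unrecognized."""
    label, score = classify()
    if score < min_score:
        return None            # not confident enough; stay silent
    beep()                     # audible confirmation of recognition
    speak(f"I see: {label}.")  # description played through the speaker
    return label
```

Injecting the callables keeps the interaction logic testable without any Coral or audio hardware attached.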

Software
[Image recognition + Datasets]

MobileNetV2 is the next generation of mobile vision models and a significant improvement over MobileNetV1. It pushes the state of the art for mobile visual recognition, including classification, object detection and semantic segmentation.

One dataset we used to train MobileNetV2 is iNaturalist. It yields several models that each specialise in a category (e.g. birds, insects, plants), and users can switch between models as they like.
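Switching models per category can be as simple as a lookup table. The filenames below follow the naming of Coral's published iNaturalist Edge TPU models, but treat both the names and this helper as an illustrative assumption rather than the shipped code.

```python
# Map a capture category to its Edge TPU-compiled iNaturalist model file.
# Filenames follow Coral's published model naming; treat them as examples.
INAT_MODELS = {
    "birds": "mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite",
    "insects": "mobilenet_v2_1.0_224_inat_insect_quant_edgetpu.tflite",
    "plants": "mobilenet_v2_1.0_224_inat_plant_quant_edgetpu.tflite",
}


def model_for(category):
    """Return the model file for a category, or raise for unknown ones."""
    try:
        return INAT_MODELS[category]
    except KeyError:
        raise ValueError(f"No iNaturalist model for category: {category!r}")
```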


[Camera + Projector]

A magnetic ring within the blue camera component connects the PA!GO remote with the green micro-projector attachment. In exploring mode the remote detaches from the projector, and an LED light gives feedback when you’re capturing an object or the remote is charging.

MOVING FORWARD

PA!GO is still in the prototyping phase as we develop and refine the product, vision and rollout. Given the educational nature of the project, we’re reaching out to as many developers as possible to support it. Below are some ways you can help us bring it to life.

Project Partners
