Problem Scenario

Paper documents have been replaced by digital ones for many reasons, yet we still encounter plenty of paper in daily life. Machines cannot understand what is written on those physical pages, and converting handwritten characters to digital text has long been a difficult problem (and largely still is). Unless we can convert physical documents into digital ones, we cannot process them efficiently with computers.

...

In this use case, we train a CNN model on the MNIST dataset, which consists of 70,000 images of handwritten digits. Each image is 28 by 28 pixels and contains a single handwritten digit. We train the model on 60,000 images and hold out the remaining 10,000 for testing.
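Before training, the raw MNIST arrays are typically scaled and reshaped for a CNN. The sketch below shows that preprocessing on synthetic stand-in arrays (the function name `preprocess` and the synthetic data are illustrative, not from the pipeline's actual script):

```python
import numpy as np

def preprocess(images, labels, num_classes=10):
    """Scale 28x28 grayscale pixels to [0, 1] and one-hot encode labels."""
    x = images.astype("float32") / 255.0   # pixel values 0-255 -> 0.0-1.0
    x = x.reshape((-1, 28, 28, 1))         # add a channel axis for the CNN
    y = np.eye(num_classes)[labels]        # one-hot encode the digit labels
    return x, y

# Synthetic stand-in for the MNIST arrays (the real set has
# 60,000 training and 10,000 test images of shape 28x28).
images = np.random.randint(0, 256, size=(5, 28, 28), dtype=np.uint8)
labels = np.array([3, 1, 4, 1, 5])
x, y = preprocess(images, labels)
```

The same transformation applies unchanged to the real 60,000-image training split.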

The live demo is available at our Machine Learning Showcase.

Objectives

  1. Use the Remote Python Script Snap from the ML Core Snap Pack to execute a Python script that trains a CNN model on the MNIST dataset.
  2. Test the model with a sample.
  3. Use the Remote Python Script Snap from the ML Core Snap Pack to execute a Python script that hosts the model, and schedule an Ultra Task to provide an API.
  4. Test the API.
  5. Develop a demo with HTML and JavaScript.

...

The Python script can be found here.

Below is a piece of the Python script from the Remote Python Script Snap used in this pipeline. The code is adapted from the official Keras example.

There are three main functions: snaplogic_init, snaplogic_process, and snaplogic_final. The first, snaplogic_init, is executed before any documents are consumed from the upstream Snap. The second, snaplogic_process, is called once for each incoming document. The last, snaplogic_final, runs after all incoming documents have been consumed by snaplogic_process. In this case, the Remote Python Script Snap has no input view, so snaplogic_process is never executed.
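The three-function contract described above can be sketched as a minimal skeleton. The function names follow the pattern in SnapLogic's examples; the bodies here are illustrative placeholders, not the pipeline's actual training code:

```python
def snaplogic_init():
    # Runs once, before any input document arrives: set up shared state
    # (e.g. build or load the model here).
    return {"model": None}

def snaplogic_process(row):
    # Called once per incoming document. With no input view, as in this
    # training pipeline, the runtime never invokes this function.
    return row

def snaplogic_final():
    # Runs after all input has been consumed: train the model here and
    # emit the result as the Snap's output document.
    return {"status": "trained"}
```

Each function's return value becomes a document on the Snap's output view.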

...

The prediction of the Remote Python Script Snap is shown below. It can be seen that the character in the input image has been correctly identified.
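The predicted digit is simply the class with the highest probability in the model's 10-way softmax output. A minimal sketch (the helper name `read_prediction` and the sample probabilities are hypothetical):

```python
import numpy as np

def read_prediction(probabilities):
    """Map a 10-way softmax output to the predicted digit and its confidence."""
    probs = np.asarray(probabilities)
    digit = int(probs.argmax())            # index of the most probable class
    return {"digit": digit, "confidence": float(probs[digit])}

# Hypothetical softmax output in which class 7 dominates.
probs = [0.01, 0.0, 0.02, 0.0, 0.0, 0.01, 0.0, 0.93, 0.02, 0.01]
result = read_prediction(probs)   # {'digit': 7, 'confidence': 0.93}
```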


Python Script

The Python script can be found here.

Model Hosting

This pipeline is scheduled as an Ultra Task to provide a REST API accessible to external applications. The core Snaps are File Reader, JSON Parser, and Remote Python Script, which are the same as in the Model Testing pipeline. The remaining Snaps handle authentication, parameter extraction, and Cross-Origin Resource Sharing (CORS).
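A client of such an API typically sends the drawn digit as a base64-encoded image in a JSON body, with a bearer token for authentication. The sketch below only assembles the request; the field name "image", the helper name, and the token handling are assumptions for illustration, not the pipeline's documented contract:

```python
import base64
import json

def build_request(png_bytes, token):
    """Assemble a JSON body and headers for a digit-recognition API call."""
    body = json.dumps({"image": base64.b64encode(png_bytes).decode("ascii")})
    headers = {
        "Authorization": "Bearer " + token,   # checked by the auth Snaps
        "Content-Type": "application/json",
    }
    return body, headers

# Example: package raw PNG bytes for sending (transport itself omitted).
body, headers = build_request(b"\x89PNG...", "my-token")
```

The actual Ultra Task URL and token come from the scheduled task's configuration.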

Scheduling Ultra Task

...

Once the API is ready, it is time to build an application that demonstrates the power of our handwritten-digit recognition model. Below is the video demo. The live demo is available at our Machine Learning Showcase; feel free to try it and let us know your feedback. You can access the code here.

HTML Code

In this demo, there are four main components: the canvas, the CLEAR button, the READ button, and the result label. You can use a mouse or touch screen to write a digit on the canvas, and clear it with the CLEAR button. When you are ready, click the READ button to send the request to the API; once it completes, the result is displayed.

...