Thursday, February 21, 2019

The Fastest Path to Object Detection on Tensorflow Lite

Ever thought it would be cool to make an Android app that fuses Augmented Reality and Artificial Intelligence to draw 3D objects on-screen that interact with particular recognized physical objects viewed on-camera?  Well, here's something to help you get started with just that!

Making conference talks can be a chicken-and-egg problem.  Do you hope the projects you've already worked on are interesting enough to draw an audience, or do you go out on a limb, pitch a wild idea, and hope you can develop it between the close of the call for papers and the conference?  Well, in this case, the work I did for DevFests in Chicago and Dallas yielded a template for talks formulated by either approach.

The most impressive part is that you can recreate for yourself the foundation I've laid out on GitHub by cloning the Tensorflow Git project, adding Sceneform, and editing (mostly removing) code.  However, it wasn't such a walk in the park to produce.  Here are the steps, edited down as best I can from the stream-of-consciousness note-taking that this blog post is derived from.  It has been distilled even further in slides on SlideShare, but this version might give you some insight into the paths I took that didn't work out -- but that might work in the future.

  • Upgrade Android Studio (I have version 3.3).
  • Upgrade Gradle (4.10.1).
  • Install the latest Android API platform (SDK version 28), tools (28.0.3), and NDK (19).
  • Download Bazel just as Google tells you to.  However, you don't need MSYS2 if you already have other things like Git Shell -- or maybe I already have MinGW somewhere, or who knows.

Nota Bene: ANYTHING LESS THAN THE SPECIFIED VERSIONS will cause a multitude of problems which you will spend a while trying to chase down.  Future versions may enable more compatibility with different versions of external dependencies.

Clone the Tensorflow Github repo.

A Fork In the Road

Make sure you look for the correct Tensorflow Android example buried within the Tensorflow repo!  The first one is located at path/to/git/repo/tensorflow/tensorflow/examples/android .  While valid, it's not the best one for this demo.  Instead, note the subtle difference -- addition of lite -- in the correct path, path/to/git/repo/tensorflow/tensorflow/lite/examples/android .  

You should be able to build this code in Android Studio using Gradle with little to no modifications.  It should be able to download assets and model files appropriately so that the app will work as expected (except for the object tracking library -- we'll talk about that later).  If it doesn't, here are some things you can try to get around it:

  • Try the Bazel build (as explained below) in order to download the dependencies.
  • Build the other project at path/to/git/repo/tensorflow/tensorflow/examples/android and then copy the downloaded dependencies into the corresponding locations in the Lite example.

However, by poking around the directory structure, you will notice several BUILD files (not build.gradle) that are important to the Bazel build.  It is tempting (but incorrect) to build the one in the tensorflow/lite/examples/android folder itself; also don't bother copying this directory out into its own new folder.  You can in fact build it this way, if you trim the stem of directories mentioned in the BUILD file so you're left with //app/src/main at the beginning of each dependency's label.  By doing this, you will still be able to download the necessary machine learning models, but you will be disappointed to find that it will never build the object tracking library.  For it to work all the way, you must run the Bazel build from the higher-up path/to/git/repo/tensorflow folder and reference the build target all the way down in tensorflow/lite/examples/android .

For your reference, the full Bazel build command looks like this, from (the correct higher-up path) path/to/git/repo/tensorflow :
bazel build //tensorflow/lite/examples/android:tflite_demo

Now, if you didn't move your Android code into its own folder, don't run that Bazel build command yet.  There's still a lot more work you need to do.

Otherwise, if you build with Gradle, or if you did in fact change the paths in the BUILD file and copied the code from deep within the Tensorflow repo somewhere closer to the root, you'll probably see a Toast message about object detection not being enabled when you run the app; this is because we didn't build the required library yet.  We'll do this later with Bazel.

Now, let's try implementing the augmented reality part.

But First, a Short Diatribe On Other Models & tflite_convert

There's a neat Python utility called tflite_convert (evidently also available as a Windows binary, though that one always broke for me, failing to load dependencies or other such nonsense unbecoming of something supposedly dubbed an EXE) that will convert regular Tensorflow models into TFLite format.  As a first step, it's good to import the model into Tensorboard to make sure it's being read in correctly and to understand some of its parameters.  Models from the Tensorflow Model Zoo imported into Tensorboard correctly, but I didn't end up converting them to TFLite, probably due to the difficulties explained in the next paragraph.  However, models from TFLite Models wouldn't read in Tensorboard at all.  Granted, these shouldn't need conversion anyway, but it seems unfortunate that Tensorboard is incompatible with them.

Specifically, tflite_convert turns .pb files or models in a SavedModel dir into .tflite-format models.  The first problem with tflite_convert on Windows was finding exactly where Pip installs the EXE file.  Once you've located it, the EXE has a bug: it references a Python import structure different from how the package is laid out now.  Building from source hits the same trouble; TF 1.12 from Pip doesn't have the import structure that tflite_convert expects.  The easiest thing to do is just download the Docker image (on a recent Skylake or better system -- which means that even my newest desktop with an RX580 installed can't handle it) and use tflite_convert in there.
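If you're hunting for where Pip put the EXE, Python itself can point you at the right directory.  This is a hedged sketch (tflite_convert is the real tool name, but whether it's actually present depends on your install):

```python
import os
import sysconfig

# Pip drops console-script executables (tflite_convert.exe on Windows)
# into the interpreter's "scripts" directory.
scripts_dir = sysconfig.get_path("scripts")
print(scripts_dir)

# Candidate location for the converter; existence depends on your install.
candidate = os.path.join(scripts_dir, "tflite_convert")
print(os.path.exists(candidate) or os.path.exists(candidate + ".exe"))
```

On Windows this is typically something like C:\PythonXY\Scripts, which is easy to miss if it isn't on your PATH.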

Looking Into Augmented Reality

Find the Intro to Sceneform codelab and go through the steps.  I got about halfway through it before pausing to switch out quite a lot of code, mostly swapping the original CameraActivity for an ArFragment and piping the camera input from the ArFragment into the Tensorflow Lite model as well.  More on the specifics can be seen in the recording of my presentation in Chicago (in full clarity, since I painstakingly recorded these code snippets in sync with how they were shown on the projector).

To build Sceneform with Bazel, first I must say it's probably not possible at this time.  But if you want to try (at least on Windows), make sure you have the WORKSPACE file from Github or else a lot of definitions for external repos (@this_is_an_external_repo) will be missing, and you'll see error messages such as:

error loading package 'tensorflow/tools/pip_package': Unable to load package for '@local_config_syslibs//:build_defs.bzl': The repository could not be resolved

After adding the Sceneform dependency into Bazel, I also faced problems loading its dependencies.  There were weird issues connecting to the repository of AAR & JAR files over HTTPS (even though the Tensorflow Lite assets downloaded fine).  Worse, Bazel would only reveal Sceneform's missing dependencies a few at a time rather than listing them all at once, so I was stuck downloading about 26 files one at a time, with each library turning out to depend on about 3 more itself.  Or not... so I wrote a script to automate all this.

The following script, while useful for its general purpose of crawling and downloading dependencies, alas did not solve my problem: once you do all this, the build claims it's missing a dependency that you literally can't find on the Internet anywhere.  This leads me to believe it's currently impossible to build Sceneform with Bazel at all.  Nevertheless, here it is, in case you have something more mainstream you're looking to build:

import ast
import re
import urllib.request

allUrls = []
allDeps = []

def depCrawl(item):
    # Remember this library's download URL (once)
    if (item['urls'][0] not in allUrls):
        allUrls.append(item['urls'][0])
    # Crawl each dependency before emitting this library's stanza
    for dep in item['deps']:
        depCrawl(aar[dep])
    # Build an aar_import()/java_import() stanza for a BUILD file
    depStr = ""
    depStr += "\n%s_import(" % item['type']
    depStr += "\n  name = '%s'," % item['name']
    filepath = ":%s" % (item['urls'][0].split("/"))[-1]
    if (item['type'] == "java"):
        depStr += "\n  jars = ['%s']," % filepath
    else:
        depStr += "\n  aar = '%s'," % filepath
    if (len(item['deps']) > 0):
        depStr += "\n  deps = ['%s']," % "','".join(item['deps'])
    depStr += "\n)\n"
    if (depStr not in allDeps):
        allDeps.append(depStr)

f = ""
with open('git\\tensorflow\\tensorflow\\lite\\examples\\ai\\gmaven.bzl') as x:
    f = x.read()

# Each library in gmaven.bzl appears as an import_external(...) stanza
m = re.findall('import_external\(.*?\)', f, flags=re.DOTALL)

aar = {}

for item in m:
    aarName = re.search('name = \'(.*?)\'', item)
    name = aarName.group(1)
    aarUrls = re.search('(aar|jar)_urls = (\[.*?\])', item)
    type = "java" if aarUrls.group(1) == "jar" else "aar"
    urls = ast.literal_eval(aarUrls.group(2))
    aarDeps = re.search('deps = (\[.*?\])', item, flags=re.DOTALL)
    deps = ast.literal_eval(aarDeps.group(1)) if aarDeps else []
    # Trim the leading '@' and trailing '//aar' or '//jar' from each label
    deps = [dep[1:-5] for dep in deps]
    dictItem = {"urls": urls, "deps": deps, "type": type, "name": name, "depStr": ""}
    aar[name] = dictItem
    if (len(urls) > 1):
        print("%s has >1 URL" % name)

# Crawl the library you're trying to load, e.g. (name is illustrative):
# depCrawl(aar['com_google_ar_sceneform_ux'])
# print("".join(allDeps))  # stanzas you can paste into your BUILD file

for url in allUrls:
    print("Downloading %s" % url)
    urllib.request.urlretrieve(url, 'git\\tensorflow\\tensorflow\\lite\\examples\\ai\\%s' % url.split("/")[-1])

The important part of this script is toward the bottom, where it runs the depCrawl() function.  You provide an argument consisting of the library you're trying to load.  The script then seeks out everything listed as a dependency for that library in the gmaven.bzl file (previously downloaded from the Internet) and saves each file to a local directory (note the paths here are formatted for Windows).
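To see what the parsing stage is doing, here's a toy run of the same regular expressions against a made-up import_external() stanza (the library name, URL, and dependency label are invented for illustration):

```python
import ast
import re

# A made-up stanza in the same shape gmaven.bzl uses; the name, URL,
# and dependency label are invented for illustration.
stanza = """import_external(
  name = 'com_example_widget',
  aar_urls = ['https://example.com/widget-1.0.aar'],
  deps = ['@com_example_base//aar'],
)"""

name = re.search("name = '(.*?)'", stanza).group(1)
kind, urls_text = re.search(r"(aar|jar)_urls = (\[.*?\])", stanza).groups()
urls = ast.literal_eval(urls_text)
deps = [d[1:-5] for d in ast.literal_eval(
    re.search(r"deps = (\[.*?\])", stanza, flags=re.DOTALL).group(1))]

print(name)   # com_example_widget
print(kind)   # aar
print(urls)   # ['https://example.com/widget-1.0.aar']
print(deps)   # ['com_example_base']
```

The [1:-5] slice is what turns a Bazel label like @com_example_base//aar into the bare repository name that the script uses as a dictionary key.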

Giving Up On Bazel For the End-To-End Build

Nevertheless, for the reasons just described above, forget about building the whole app from end to end with Bazel for the moment.  Let's just build the object tracking library and move on.  For this, we'll queue up the original command as expected:

bazel build //tensorflow/lite/examples/android:tflite_demo

But before running it, we need to go into the WORKSPACE file in /tensorflow and add the paths to our SDK and NDK -- but don't include references to the specific SDK version or build tools version; when those were in, the build seemed to get messed up.


  • Install the Java 10 JDK and set your JAVA_HOME environment variable accordingly.
  • Find a copy of visualcppbuildtools_full.exe, and install the following:
    • Windows 10 SDK 10.0.10240
    • .NET Framework SDK
  • Look at the Windows Kits\ directory and move files from older versions of the SDK into the latest version
  • Make sure your Windows username doesn't contain spaces (might also affect Linux & Mac users)
  • Run the Bazel build from an Administrator command prompt instance
  • Pray hard!
Confused by any of this?  Read my rationale below.

Eventually the Bazel script will look for javac, the Java compiler.  For this, I started out installing Java 8, as it was not immediately clear which Java Bazel was expecting to use, and according to Android documentation, it supports "JDK 7 and some JDK 8 syntax."  Upon setting up my JAVA_HOME and adding Java's bin/ to my PATH, it got a little bit further but soon complained about an "unrecognized VM option 'compactstrings'".  Some research showed similar errors are caused by the wrong version of the JDK being installed, so I set off to install JDK 10.  However, according to Oracle, JDK 10 is deprecated, so it redirected me to JDK 11.  Then, I had another issue with some particular class file that "has wrong version 55.0, should be 53.0".  Once again, this is due to a JDK incompatibility.  I tried a little harder to find JDK 10, and eventually did, but had to log in to Oracle to download it (bugmenot is a perfect way to avoid divulging personal information to Oracle).

After installing JDK 10, I then came across an error that Bazel can't find cl.exe, relating to the Microsoft Visual C++ '15 compiler & toolchain required to build the C++ code on Windows.  However, downloading the recommended vc_redist_x64.exe file didn't help me, since the installer claims the program is already installed (I must have installed Visual C++ a long time ago).  However, the required binaries are still nowhere to be found in the expected locations.  I ended up finding an alternate source, a file called "visualcppbuildtools_full.exe".  Unfortunately, this installs several GB of stuff onto your computer.  I first selected just the .NET Framework SDK to hasten the process, save hard disk space, and avoid installing unnecessary cruft, but then it couldn't find particular system libraries, so I had to select Windows 10 SDK 10.0.10240 and install that as well.

Trying again with the build, now it can't find Windows.h.  What?!?  I should have just installed the libraries & include files with this SDK!  Well, it turns out they did install correctly, but according to the outputs of SET INCLUDE from the Bazel script, it was looking in the wrong directory: C:\Program Files (x86)\Windows Kits\10\Include\10.0.15063.0 rather than C:\Program Files (x86)\Windows Kits\10\Include\10.0.10240.0.  To make my life easier, I just copied all the directories from 10240 into 15063, renaming the original directories in 15063 first.  I later had to do the same thing with the Lib directory, in addition to Include.
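If you'd rather script that shuffle than drag folders around by hand, here's a rough sketch; the graft_sdk helper and both version strings are mine (matching the directories named above), so treat it as an illustration of the manual step rather than a blessed tool:

```python
import os
import shutil

HAVE = "10.0.10240.0"   # the SDK version actually installed
WANT = "10.0.15063.0"   # the version Bazel's SET INCLUDE output expects
KITS = r"C:\Program Files (x86)\Windows Kits\10"

def graft_sdk(kits, subdir, have=HAVE, want=WANT):
    # Move the existing WANT directory aside, then copy HAVE into its place.
    src = os.path.join(kits, subdir, have)
    dst = os.path.join(kits, subdir, want)
    if os.path.isdir(dst):
        os.rename(dst, dst + ".orig")   # keep the originals around
    shutil.copytree(src, dst)

if os.path.isdir(KITS):                 # only meaningful on a Windows box
    for subdir in ("Include", "Lib"):   # both tripped up the build for me
        graft_sdk(KITS, subdir)
```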

Upon setting this up, I made it to probably just about the completion of the build:

bazel-out/x64_windows-opt/bin/external/bazel_tools/tools/android/resource_extractor.exe bazel-out/x64_windows-opt/bin/tensorflow/lite/examples/android/tflite_demo_deploy.jar bazel-out/x64_windows-opt/bin/tensorflow/lite/examples/android/_dx/tflite_demo/extracted_tflite_demo_deploy.jar
Execution platform: @bazel_tools//platforms:host_platform
C:/Program Files/Python35/python.exe: can't open file 'C:\users\my': [Errno 2] No such file or directory

Aww, crash and burn!  It can't deal with the space in my username.  Instead, make an alternate user account (without a space in its name) if you don't already have one.  One thing you may notice is that the new account doesn't get permission to access files from the original user account, even if you make it an Administrator.  Running Windows "cmd" as Administrator will finally allow your Bazel build to succeed.
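To make the failure concrete, here's a minimal Python illustration (not part of the build itself) of what happens when an unquoted path containing a space gets split into command-line arguments:

```python
# A command line resembling the one Bazel generated, with an unquoted
# path containing a space (the path is made up for illustration):
cmd = r"python.exe C:\users\my name\script.py"

# Naive whitespace splitting -- effectively what happens when the
# path isn't quoted -- breaks the path into two arguments:
args = cmd.split()
print(args)     # ['python.exe', 'C:\\users\\my', 'name\\script.py']
print(args[1])  # C:\users\my  -- the "file" Python is then asked to open
```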


Look closely; this is the image of success.

Now, you're not out of the woods yet.

Tying It All Together

Now, you need to actually incorporate the object detection library in your Android code.

  • Find the shared library (.so file) built by Bazel.  It's probably been stashed somewhere in your user home directory, no matter your operating system.
  • Copy this file into your Android project.  Remember where you normally stash Java files?  Well this will go into a similar spot called src/main/jniLibs/<CPU architecture> , where <CPU architecture> is most likely going to be armeabi-v7a (unless you're not reading this in 2019).
  • To support this change, you'll also need to add a configuration to your build.gradle file so that it will only build the app for ARMv7; otherwise if you have an ARMv8 (or otherwise different) device, it won't load the shared library and you won't get the benefit of object tracking.  This is described in the YouTube presentation linked above.
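For reference, the ARM-only restriction is typically expressed with the standard ndk abiFilters setting in the module's build.gradle; this is a sketch of that configuration, not the exact file from the video:

```groovy
android {
    defaultConfig {
        ndk {
            // Only package the ABI we built the Bazel .so for
            abiFilters 'armeabi-v7a'
        }
    }
}
```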
The final thing to do to get this all working is to add in the rest of the Sceneform stuff.  At this point, if you've followed the coding instructions in the YouTube video linked above that mentions what to change, then all you should need to do is build the Sceneform-compatible *.sfb model.

But hold tight!  Did you see where the Codelab had you install Sceneform 1.4.0 through Gradle, but the Sceneform plugin offered through Android Studio is now at least 1.6.0?  Well, if you proceed in building the model, you won't notice any difficulties until the first time your app successfully performs an object detection and tries to draw the model...only to realize the SFB file generated by the plugin isn't compatible with the Sceneform 1.4.0 library you included in your app.  The worst part is that if you try to upgrade Sceneform to 1.6.0 in Gradle, your Sceneform plugin in Android Studio will refuse to work properly ever again.

Your two solutions to this problem:
  • Rectify the Sceneform versions (plugin & library) prior to building anything, or at least making your first SFB file
  • Just use Gradle to build your SFB file, as shown in the YouTube video
Turns out you don't need the Sceneform plugin in Android Studio at all, and after a while relying on it will probably seem like a noob move, especially if you have a lot of assets to convert for your project or you're frequently changing things.  You'll want conversion automated as part of your build stage.

The big payoff is now you should be able to perform a Gradle build that builds and installs an Android app that:
  • Doesn't pop a Toast message about missing the object tracking library
  • Performs object detection on the default classes included with the basic MobileNet model
  • Draws Andy the Android onto the detected object

Any questions?

This is a lot of stuff to go through!  And I wonder how much of it will change (hopefully be made easier) before too long.  Meanwhile, have fun fusing AI & AR into the app of your dreams and let me know what you build!

As for me, I'm detecting old computers: (but not drawing anything onto them at this point)

* Not responsible for the labels on the detected objects ;) Obviously the model isn't trained to detect vintage computers!

And for the sake of demonstrating the whole thing, here's object detection and augmented reality object placement onto a scene at the hotel right before I presented this to a group:
