= OpenCV-Android

Dedicated to providing an optimized port of OpenCV for the Google Android OS.

== Requirements

In order to use OpenCV-Android, you will need to download and install both the Android SDK and NDK, version 1.6. It may or may not work with a higher version, but it has been confirmed that it does not work with older versions, so please use 1.6 or higher.

In addition to the SDK and NDK, you will also need one of the following:

* An Android Dev Phone (it might work with other phones, but this has not been tested)
* A newer version of the QuickTime Java Libraries

For those of you running on the emulator, I built a very simple socket-based camera server that sends images over a socket connection. It uses the QuickTime libraries, which are present by default on Mac OS X, but I haven't tested it on other OSes. If it doesn't work for you, you can always try the original WebcamBroadcaster that mine is derived from, which uses the JMF (and therefore doesn't work on Mac OS X, hence the QTWebcamBroadcaster): http://www.tomgibara.com/android/camera-source

== Build

Building is quite simple. I'm going to assume that you are familiar with Android if you are reading this, so I will keep it rather short.

* Create a symbolic link from the directory where you pulled the OpenCV-Android repository to your [ANDROID_NDK_ROOT]/apps/ directory. Example: ln -s ~/Source/OpenCV-Android opencv
* From the [ANDROID_NDK_ROOT] run: build/host-setup.sh
* Now run: make APP=opencv

By default, this will build the opencv library and push it into: [OPENCV_ANDROID_ROOT]/tests/VideoEmulation/libs

You can change where the lib is delivered by modifying the APP_PROJECT_PATH in: [OPENCV_ANDROID_ROOT]/Application.mk

Once you have built the OpenCV library, you can build and run [OPENCV_ANDROID_ROOT]/tests/VideoEmulation.

NOTE: If you plan to use the Socket Camera, you will need to build and run the [OPENCV_ANDROID_ROOT]/tests/QTWebcamBroadcaster first.

Also, if you use Eclipse to develop for Android, there are already projects defined for both of these applications; you can simply import them into your workspace.

== Setup

If you want to test face tracking, then you need a Haar Classifier Cascade XML file. I have provided one for use; it is stored in: tests/haarcascade_frontalface_alt.xml

Before attempting to run the VideoEmulator application, you must first copy this XML file into the emulator at the following location: /data/data/org.siprop.opencv/files/haarcascade_frontalface_alt.xml

Currently, this is a hard-coded path that we look up. Hopefully, this can be remedied in a future version.
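One possible way to copy the file, assuming the SDK's adb tool is on your path and the emulator is already running (this command is a suggestion and is not part of the original instructions), is:

adb push tests/haarcascade_frontalface_alt.xml /data/data/org.siprop.opencv/files/haarcascade_frontalface_alt.xml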
== Run

In order to use the VideoEmulator, you have to use the emulator (hence the name). If you have a Dev Phone, you can play around with the old 'OpenCVSample' test or modify the VideoEmulator to support a real camera. This is something we will work on resolving in the future.

When using the emulator, there are two slightly different 'flavors' of running. Both are socket-based cameras, but one is written in C++ and emulates a real OpenCV capture, while the other (loosely) emulates a camera implementation in Java. The C++ version is the default, as it is slightly faster and uses a little less memory. Also, the ultimate goal is to hook up a real camera in C++ so that we don't have to pass huge amounts of data (images) back and forth through the JNI interface.

NOTE: For all of these examples, you cannot use localhost or 127.0.0.1 as the address for the socket camera. When the client is running on the Android emulator, both of these map to the emulator's own localhost, not the machine the emulator is running on. This means you have to be connected to a network in order to use the socket camera, which is a limitation.

=== C++

Since this is the default, we have made it pretty easy:

* Start the WebcamBroadcaster - this is a socket server that grabs images from your camera and serves them up.
* Start the VideoEmulator - this runs the Android application that allows you to try out the various pieces implemented thus far.
* Once the application comes up, you will have to configure your machine's address and port for the socket camera to work.
* Leave Use C++ SocketCapture CHECKED!
* Choose which test you want to run.

=== Java

To use Java, you have to make a small code change. Eventually we will make this a configurable option so that no code change is needed. The change is needed because when we send data over a socket from Java to Java, it is faster to send serialized buffered images, whereas when we send data to C++, we have to send a raw byte array (see the sketch after this list).

* Modify the WebcamBroadcaster by changing the default value assigned to RAW to false.
* Start the WebcamBroadcaster - this is a socket server that grabs images from your camera and serves them up.
* Start the VideoEmulator - this runs the Android application that allows you to try out the various pieces implemented thus far.
* Once the application comes up, you will have to configure your machine's address and port for the socket camera to work.
* UNCHECK Use C++ SocketCapture!
* Choose which test you want to run.
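To make the effect of the RAW flag concrete, here is a minimal, illustrative Java sketch of a frame-sending method. It is not the actual WebcamBroadcaster code: the class name, method name, and wire format (width, height, then pixel data) are assumptions made purely for illustration, and the real broadcaster and SocketCapture may use a different format.

[source,java]
----
import java.awt.image.BufferedImage;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.OutputStream;

// Illustrative sketch only: names and wire format are assumptions,
// not the actual WebcamBroadcaster implementation.
public class FrameSenderSketch {

    // RAW = true  -> plain bytes for the C++ SocketCapture
    // RAW = false -> serialized pixel data for the Java capture
    private static final boolean RAW = true;

    public static void send(BufferedImage frame, OutputStream socketOut) throws IOException {
        int w = frame.getWidth();
        int h = frame.getHeight();
        int[] argb = frame.getRGB(0, 0, w, h, null, 0, w);

        if (RAW) {
            // Write width, height, and one B, G, R byte per pixel so the
            // native side can rebuild the image without Java deserialization.
            DataOutputStream out = new DataOutputStream(socketOut);
            out.writeInt(w);
            out.writeInt(h);
            for (int p : argb) {
                out.writeByte(p & 0xFF);         // blue
                out.writeByte((p >> 8) & 0xFF);  // green
                out.writeByte((p >> 16) & 0xFF); // red
            }
            out.flush();
        } else {
            // Java-to-Java: serialize the dimensions and pixel array, which
            // a Java client can read back with an ObjectInputStream.
            ObjectOutputStream out = new ObjectOutputStream(socketOut);
            out.writeInt(w);
            out.writeInt(h);
            out.writeObject(argb);
            out.flush();
        }
    }
}
----

The raw branch avoids Java object-stream headers entirely, which is in line with the note above that the C++ side expects a plain byte array rather than serialized Java objects.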