Remote Android emulator

I often use the Android emulator to check my apps with different display configurations and to stress-test them. The problem is that it is really slow on my development laptop, so I installed the Android emulator on my desktop PC running Windows and connect to it over my LAN. The major advantage is that you can keep working on your development machine while a “server” handles the emulation – you could even emulate several devices at once and still continue programming.

The approach in a nutshell: forward the emulator’s port so that it is accessible in the local network, then connect ADB to it.

On your desktop – the “server”:

  1. Store the Trivial Portforward executable on the desktop system (e.g. directly as C:\trivial_portforward.exe).
  2. Create a virtual device to emulate (HowTo) and name it “EmulatedAndroid”.
  3. Create a batch file:
    start "" <your-android-sdk-path>\tools\emulator -avd EmulatedAndroid
    echo On the development machine: adb kill-server and then: adb connect <desktop-pc-name>:5585
    C:\trivial_portforward 5585 127.0.0.1 5555
  4. If you execute this batch file on your desktop PC, it will start the emulator with the specified virtual device and forward port 5585 to the emulator’s local ADB port 5555.

Now on your laptop – the “client”:

  1. Now – given that both systems are in the same network – you can connect to the emulator from your laptop by typing in a terminal:
    adb kill-server
    adb connect <desktop-pc-name>:5585
  2. Now you can install apps, read the logcat and run adb commands against your remote emulator just like against any other Android device – all without any performance impact on your workstation (see the example after this list).
  3. If you are experiencing communication losses, increase the emulator timeout in the Eclipse settings to, say, 5000 ms (Window → Preferences → Android → DDMS → ADB connection time out (ms)).
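For example – assuming your app’s APK is called myApp.apk (a made-up name for illustration) – installing it and reading the log of the remote emulator works just like with a local device; the serial reported by adb devices is <desktop-pc-name>:5585:

adb devices
adb -s <desktop-pc-name>:5585 install myApp.apk
adb -s <desktop-pc-name>:5585 logcat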

Hello World for Android computer vision

Every once in a while I start a new computer vision project with Android. And I always face the same question: “What do I need again to retrieve a camera image ready for processing?”. While there are great tutorials around, I just want a downloadable project with a minimal amount of code – no taking pictures, no setting resolutions, just the continuous retrieval of incoming camera frames.

So here they are – two different “Hello World” projects for computer vision. I will show you some excerpts from the code and then provide a download link for each project.

Pure Android API

The main problem to solve is how to get the camera image into a processable image format – in this case android.graphics.Bitmap.

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width,
		int height) {
	if(camera != null) {
		camera.release();
		camera = null;
	}
	camera = Camera.open();
	try {
		camera.setPreviewDisplay(holder);
	} catch (IOException e) {
		e.printStackTrace();
	}
	camera.setPreviewCallback(new PreviewCallback() {

		public void onPreviewFrame(byte[] data, Camera camera) {
			System.out.println("Frame received!"+data.length);
			Size size = camera.getParameters().getPreviewSize();
			/*
			 * Directly constructing a bitmap from the data would be possible if the preview format
			 * had been set to RGB (params.setPreviewFormat() ) but some devices only support YUV.
			 * So we have to stick with it and convert the format
			 */
			int[] rgbData = convertYUV420_NV21toRGB8888(data, size.width, size.height);
			Bitmap bitmap = Bitmap.createBitmap(rgbData, size.width, size.height, Bitmap.Config.ARGB_8888);
			/*
			 * TODO: now process the bitmap
			 */
		}
	});
	camera.startPreview();
}

Notice the function convertYUV420_NV21toRGB8888(), which is needed since the camera’s internal frame format (YUV NV21 by default) does not match any supported Bitmap format.
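The conversion itself is plain per-pixel arithmetic. A minimal sketch of such a function could look like the following – it assumes the standard NV21 layout (a full-resolution Y plane followed by interleaved V/U values at quarter resolution) and the usual BT.601 coefficients; the implementation in the downloadable project may differ in details:

private static int[] convertYUV420_NV21toRGB8888(byte[] data, int width, int height) {
	int size = width * height;
	int[] pixels = new int[size];
	for (int row = 0; row < height; row++) {
		for (int col = 0; col < width; col++) {
			int y = data[row * width + col] & 0xFF;
			// one interleaved V/U pair per 2x2 block of luma values (NV21: V comes first)
			int vuIndex = size + (row / 2) * width + (col / 2) * 2;
			int v = (data[vuIndex] & 0xFF) - 128;
			int u = (data[vuIndex + 1] & 0xFF) - 128;
			// YUV -> RGB (BT.601), clamped to [0, 255]
			int r = clamp(y + (int) (1.402f * v));
			int g = clamp(y - (int) (0.344f * u + 0.714f * v));
			int b = clamp(y + (int) (1.772f * u));
			// pack as ARGB for Bitmap.Config.ARGB_8888
			pixels[row * width + col] = 0xFF000000 | (r << 16) | (g << 8) | b;
		}
	}
	return pixels;
}

private static int clamp(int value) {
	return value < 0 ? 0 : (value > 255 ? 255 : value);
}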

 

Using OpenCV

This is even more straightforward: we just use OpenCV’s JavaCameraView. If you are new to Android+OpenCV, here is a good tutorial for you.

cameraView = (CameraBridgeViewBase) findViewById(R.id.cameraView);
cameraView.setCvCameraViewListener(this);
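The activity passed as listener has to implement one of the CvCameraViewListener interfaces. A minimal sketch using CvCameraViewListener2 – the downloadable project may use the other variant – could look like this:

@Override
public void onCameraViewStarted(int width, int height) {
	// called once the preview size is known – allocate working buffers here if needed
}

@Override
public void onCameraViewStopped() {
}

@Override
public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
	Mat frame = inputFrame.rgba(); // the current camera frame as an OpenCV Mat
	// TODO: process the frame
	return frame; // whatever is returned here is drawn on the screen
}

Keep in mind that the view only starts delivering frames after cameraView.enableView() has been called, which is typically done once the OpenCV library has been loaded (e.g. inside a BaseLoaderCallback).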

OpenCV Image Watch for cv::Matx

When developing for/with OpenCV in Visual Studio, the Image Watch plug-in is very useful. However, it does not support the newer cv::Matx types (e.g. cv::Matx33f, which is the same as cv::Matx<float,3,3>). Here is how I used debugger type visualizers to customize the plug-in:

  1. Go to the folder <VS Installation Directory>\Visualizers\ and create a new file called Matx.natvis
  2. Open the file and insert the following:
    <?xml version="1.0" encoding="utf-8"?>
    <!-- Philipp Hasper, http://www.hasper.info-->
    <AutoVisualizer xmlns="http://schemas.microsoft.com/vstudio/debugger/natvis/2010">
      <UIVisualizer ServiceId="{A452AFEA-3DF6-46BB-9177-C0B08F318025}" Id="1" MenuName="Add to Image Watch"/>  
    
    	<Type Name="cv::Matx&lt;*,*,*&gt;">
    		<UIVisualizer ServiceId="{A452AFEA-3DF6-46BB-9177-C0B08F318025}" Id="1" />
    	</Type> 
    
    	<Type Name="cv::Matx&lt;*,*,*&gt;">
    		<DisplayString Condition='strcmp("float", "$T1") == 0'>{{FLOAT32, size = {$T3}x{$T2}}}</DisplayString>
    		<DisplayString Condition='strcmp("double", "$T1") == 0'>{{FLOAT64, size = {$T3}x{$T2}}}</DisplayString>
    
    		<Expand>
    		<Synthetic Name="[type]" Condition='strcmp("float", "$T1") == 0'>
    			<DisplayString>FLOAT32</DisplayString>
    		</Synthetic>
    		<Synthetic Name="[type]" Condition='strcmp("double", "$T1") == 0'>
    			<DisplayString>FLOAT64</DisplayString>
    		</Synthetic>
    
    		<Item Name="[channels]">1</Item>
    		<Item Name="[width]">$T3</Item>
    		<Item Name="[height]">$T2</Item>
    		<Item Name="[data]">(void*)val</Item>
    		<Item Name="[stride]">$T3*sizeof($T1)</Item>
    		</Expand>
    
    	</Type>
    
    </AutoVisualizer>
  3. You don’t even have to restart Visual Studio. Just start a new debugging session and you can look at your cv::Matx types.

Image Watch for cv::Matx

More about customizing the Image Watch plug-in can be found here.

OpenCV and Visual Studio: Empty Call Stack

For a couple of years I had used OpenCV for Android and developed with Eclipse. But a while back I started a bigger project which will run on desktop machines, so I began to learn how to use Visual Studio 2013. The integration of OpenCV 2.4.8 was fairly easy and I was quickly able to run my code.

(Just as a convenience – since the library names on the linked site are outdated, here are all the names for easy copying:)

opencv_calib3d248.lib
opencv_contrib248.lib
opencv_core248.lib
opencv_features2d248.lib
opencv_flann248.lib
opencv_gpu248.lib
opencv_highgui248.lib
opencv_imgproc248.lib
opencv_legacy248.lib
opencv_ml248.lib
opencv_nonfree248.lib
opencv_objdetect248.lib
opencv_ocl248.lib
opencv_photo248.lib
opencv_stitching248.lib
opencv_superres248.lib
opencv_ts248.lib
opencv_video248.lib
opencv_videostab248.lib
opencv_calib3d248d.lib
opencv_contrib248d.lib
opencv_core248d.lib
opencv_features2d248d.lib
opencv_flann248d.lib
opencv_gpu248d.lib
opencv_highgui248d.lib
opencv_imgproc248d.lib
opencv_legacy248d.lib
opencv_ml248d.lib
opencv_nonfree248d.lib
opencv_objdetect248d.lib
opencv_ocl248d.lib
opencv_photo248d.lib
opencv_stitching248d.lib
opencv_superres248d.lib
opencv_ts248d.lib
opencv_video248d.lib
opencv_videostab248d.lib

But then I experienced a strange behaviour: every time an exception or assertion was thrown inside an OpenCV method, I had no clue what had happened since the call stack had only four entries: something about KernelBase.dll, msvcr120d.dll, opencv_core248d.dll and the last one, “Frames below may be incorrect and/or missing, no symbols loaded for opencv_core248d.dll”.


Upon further examination (clicking on the opencv_core248d.dll entry) Visual Studio revealed that the .pdb file was missing: it said “opencv_core248d.pdb not loaded” and “opencv_core248d.pdb could not be found in the selected paths”.


I quickly found some .pdb files in C:\opencv248\build\x86\vc12\staticlib, but since they did not match the .dlls, they did not work either. So what to do? Essentially, we have to build OpenCV ourselves. We will leave out any 3rd-party libraries since we only want to debug, not produce fast code (of course you can build a complete version, but I skipped that to save time). In the following I will describe only the basic steps; for a full documentation including pictures and instructions on adding performance-improving 3rd-party libraries, visit the original tutorial.

  1. I assume you have an OpenCV copy, e.g. under C:\opencv248\ . The folder contains a build and a sources folder.
  2. Install CMake
  3. Start CMake (cmake-gui) – it should be in your start menu. Enter C:\opencv248\sources in the first field (“Where is the source code:”) and a freely chosen path, e.g. C:\opencv248\ownBuild\, in the second one (“Where to build the binaries:”).
  4. Press “Configure” and select your compiler – for Visual Studio 2013 32-bit this is “Visual Studio 12”. I then ignored a warning about Java AWT and Python missing and pressed the “Generate” button.
  5. Wait for the process to finish, then open C:\opencv248\ownBuild\OpenCV.sln. Build both the Debug and the Release configuration, which will take some time.
  6. After the build, go into C:\opencv248\ownBuild\bin. There are two folders containing all files you will need. Now you have two options:
    1. Remove any directory previously leading to the OpenCV dlls from your PATH (e.g. in my case I removed C:\opencv248\build\x86\vc12\bin ) and then add C:\opencv248\ownBuild\bin\Debug and C:\opencv248\ownBuild\bin\Release to your PATH.
    2. Remove any directory previously leading to the OpenCV dlls from your PATH. Then move all .dll and .pdb files from the Debug and Release folders to a safe place, e.g. C:\opencv248\debuggableDLL. Add this folder to your PATH, then delete the whole C:\opencv248\ownBuild\ folder to free disk space.
  7. Restart Visual Studio and start a debug session. Now the call stack shows exactly what happened, including the correct source lines.
  8. Remember to switch back to the optimized dlls when doing performance testing!

If you don’t want to build this all by yourself, here is the result of the build process for Visual Studio 2013 32-bit (the original size of >800MB is compressed to 80MB). To download the archive, activate JavaScript, enter “vsopencv” in the following field and then click the download button. Uncompress the archive with 7-zip and then perform step 6.b.


OpenCV: Reading an image sequence backwards

Here is a small code snippet for OpenCV which reads an image sequence backwards. It needs a sequence of images 000.png, 001.png, 002.png, … in the project’s folder.

cv::Mat frame;
cv::VideoCapture capture("000.png");
capture.set(CV_CAP_PROP_POS_AVI_RATIO, 1); // jump to the end of the sequence
while (true)
{
	capture >> frame;
	if (frame.empty())
		break; // reading failed, i.e. we have passed the first image

	// the read auto-incremented the position, so step back two frames
	capture.set(CV_CAP_PROP_POS_FRAMES, capture.get(CV_CAP_PROP_POS_FRAMES) - 2);

	cv::imshow("image", frame);
	cv::waitKey(30);
}

So what does the code do?

  1. Setting the property CV_CAP_PROP_POS_AVI_RATIO to 1 means starting at the end of the sequence (0 = at the beginning).
  2. The property CV_CAP_PROP_POS_FRAMES defines the index of the next image to load. Since it is automatically incremented after each image retrieval, we have to decrement it by 2 to step backwards.

Start AfterEffects in a different language

If you want to start AfterEffects CS4 in a different language than the one it was installed in (e.g. if you want to follow an English tutorial and cannot find the effects), start the program with the additional command-line argument -L en_US.

Just create a shortcut on your desktop and append the argument to its target: "C:\...\AfterFX.exe" -L en_US.

You can find all available language codes by looking into the installation directory of AfterEffects and opening the AMT folder.

xkcd widget

When I saw the xkcd comic “Now”, I immediately wanted it as widget on my smartphone. A cool little gadget showing you the approximate time of day all around the world.

I searched through the Google Play Store and only found versions with a huge file size – since they just stored all possible images in the app files, the widgets reached a size of around 25 MB.

So I spent a few hours learning how to build Android widgets and compiled my own version – less than 1 MB in size and with a cool preview animation.

Download the xkcd widget and try it for yourself!


See you at the CeBIT 2014

Like last year, I will be an exhibitor at this year’s CeBIT, taking place March 10-14 in Hannover. We will present the awesome AR-Handbook in Hall 9, Booth D44.

Our topic is Fast MRO (Fast Maintenance, Repair and Overhaul) – a comprehensive Augmented Reality Maintenance Information System that shows and explains technical details for simplified maintenance work right where it is needed. Using a John Deere tractor as an example, Fast MRO offers support for the exchange of defective consumables or the maintenance of lubrication parts. The system offers information about the position of machine elements, maintenance intervals or the use of certain components and guides the repairman step by step through the individual work instructions.

Remote Execution vs. Simplification for Mobile Real-time Computer Vision

As part of my work at the DFKI Kaiserslautern, I published a paper at VISAPP 2014 dealing with Remote Execution for mobile Augmented Reality:

Remote Execution vs. Simplification for Mobile Real-time Computer Vision. Philipp Hasper, Nils Petersen, Didier Stricker. In Proceedings of the 9th International Conference on Computer Vision Theory and Applications (VISAPP) 2014. doi:10.5220/0004683801560161.

Mobile implementations of computationally complex algorithms are often prohibitive due to performance constraints. There are two possible solutions for this: (1) adopting a faster but less powerful approach which results in a loss of accuracy or robustness. (2) using remote data processing which suffers from limited bandwidth and communication latencies and is difficult to implement in real-time interactive applications. Using the example of a mobile Augmented Reality application, we investigate those two approaches and compare them in terms of performance. We examine different workload balances ranging from extensive remote execution to pure onboard processing.

@inproceedings{Hasper2014,
author = {Hasper, Philipp and Petersen, Nils and Stricker, Didier},
booktitle = {Proceedings of the 9th International Conference on Computer Vision Theory and Applications},
doi = {10.5220/0004683801560161},
isbn = {978-989-758-003-1},
pages = {156--161},
publisher = {SCITEPRESS - Science and Technology Publications},
title = {{Remote Execution vs. Simplification for Mobile Real-time Computer Vision}},
year = {2014}
}