Example Android Gradient

From BoofCV
<center>
<gallery heights=300 widths=550 >
Image:Example_android_video.jpg| Colorized image gradient being displayed on an Android phone.
</gallery>
</center>


<center><SPAN STYLE="font-size: 20pt">New projects should use [[Example_Android_Fragment_Gradient | Fragments]]</SPAN></center>

Demonstration of how to capture and process a video stream in real-time using BoofCV on an Android device. On Android, video streams are accessed through a camera preview, which requires jumping through several hoops to work well. This example captures the image in NV21 format, converts it into a GrayU8, computes the image gradient, visualizes the gradient in a Bitmap image, and displays the result. Note that the example below is not entirely self-contained; see the complete project for additional files.

Example File: [https://github.com/lessthanoptimal/BoofCV/blob/v0.31/integration/boofcv-android/examples/video/app/src/main/java/org/boofcv/video/GradientActivity.java GradientActivity.java]

Complete Project: [https://github.com/lessthanoptimal/BoofCV/blob/v0.31/integration/boofcv-android/examples/video/ Android Project]
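As noted above, the camera delivers frames in NV21 format. NV21 stores the full-resolution luminance (Y) plane first, followed by interleaved chroma bytes, so converting to grayscale amounts to copying the Y plane. A minimal standalone sketch of that idea (the class and method names here are illustrative, not BoofCV's API):

```java
public class Nv21ToGray {
    /**
     * Extracts the luminance (Y) plane from an NV21 buffer.
     * NV21 stores width*height Y bytes followed by interleaved V/U chroma,
     * so grayscale conversion is a straight copy of the first plane.
     */
    public static int[] toGray(byte[] nv21, int width, int height) {
        int[] gray = new int[width * height];
        for (int i = 0; i < gray.length; i++) {
            gray[i] = nv21[i] & 0xFF; // bytes are signed in Java; mask to 0..255
        }
        return gray;
    }
}
```

In the real example this conversion is handled internally by the camera activity, which hands the processing thread an already-converted GrayU8 image.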

Latest revision as of 14:23, 15 July 2023


Concepts:
* Android
* Camera Preview
* Image Gradient
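The image gradient in this example comes from FactoryDerivative.three, i.e. a three-tap derivative filter: each output pixel is the difference of its two neighbors along the axis (kernel [-1, 0, 1], left unscaled here; BoofCV's exact scaling and border handling may differ). A small illustrative sketch of the computation:

```java
public class GradientThree {
    /** Horizontal derivative with the three-tap kernel [-1, 0, 1]. Border columns are left zero. */
    public static int[][] derivX(int[][] img) {
        int h = img.length, w = img[0].length;
        int[][] dx = new int[h][w];
        for (int y = 0; y < h; y++)
            for (int x = 1; x < w - 1; x++)
                dx[y][x] = img[y][x + 1] - img[y][x - 1];
        return dx;
    }

    /** Vertical derivative, same kernel applied along columns. Border rows are left zero. */
    public static int[][] derivY(int[][] img) {
        int h = img.length, w = img[0].length;
        int[][] dy = new int[h][w];
        for (int y = 1; y < h - 1; y++)
            for (int x = 0; x < w; x++)
                dy[y][x] = img[y + 1][x] - img[y - 1][x];
        return dy;
    }
}
```

On a horizontal intensity ramp the x-derivative is constant and the y-derivative is zero, which is why the colorized output highlights edges perpendicular to each axis.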

Related Tutorial:

Related Examples:

Example Code

/**
 * Demonstrates how to use the visualize activity. A video stream is opened and the image gradient
 * is found. The gradient is then rendered into a format which can be visualized and displayed
 * on the Android device's screen.
 *
 * Extending VisualizeCamera2Activity greatly simplifies the process of capturing and
 * visualizing image data from a camera. Internally it uses the camera 2 API. You can
 * customize its behavior by overriding different internal functions. For more details,
 * see the JavaDoc of its parent classes.
 *
 * @see VisualizeCamera2Activity
 * @see boofcv.android.camera2.SimpleCamera2Activity
 *
 * @author Peter Abeles
 */
public class GradientActivity extends VisualizeCamera2Activity
{
	// Storage for the gradient
	private GrayS16 derivX = new GrayS16(1,1);
	private GrayS16 derivY = new GrayS16(1,1);

	// Computes the image gradient. In general you will want to pre-declare data structures
	// because garbage collection is expensive
	private ImageGradient<GrayU8,GrayS16> gradient = FactoryDerivative.three(GrayU8.class, GrayS16.class);

	// Used to display text info on the display
	private Paint paintText = new Paint();

	public GradientActivity() {
		// The default behavior for selecting the camera's resolution is to
		// find the resolution which comes the closest to having this many
		// pixels.
		targetResolution = 640*480;
	}

	@Override
	protected void onCreate(Bundle savedInstanceState) {
		super.onCreate(savedInstanceState);

		setContentView(R.layout.gradient);
		FrameLayout surface = findViewById(R.id.camera_frame);

		// By calling this function you are telling the camera library that you wish to process
		// images in a gray scale format. The video stream is typically in YUV420. Color
		// image formats, such as RGB and YUV color spaces, are also supported.
		setImageType(ImageType.single(GrayU8.class));

		// Configure paint used to display FPS
		paintText.setStrokeWidth(4*displayMetrics.density);
		paintText.setTextSize(14*displayMetrics.density);
		paintText.setTextAlign(Paint.Align.LEFT);
		paintText.setARGB(0xFF,0xFF,0xB0,0);
		paintText.setTypeface(Typeface.create(Typeface.MONOSPACE, Typeface.BOLD));

		// The camera stream will now start after this function is called.
		startCamera(surface,null);
	}

	/**
	 * This is where you specify custom camera settings. See {@link boofcv.android.camera2.SimpleCamera2Activity}'s
	 * JavaDoc for more functions which you can override.
	 *
	 * @param captureRequestBuilder Used to configure the camera.
	 */
	@Override
	protected void configureCamera(CameraDevice device, CameraCharacteristics characteristics, CaptureRequest.Builder captureRequestBuilder) {
		captureRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_VIDEO);
		captureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
	}

	/**
	 * During camera initialization this function is called once after the resolution is known.
	 * This is a good function to override and pre-declare data structures which depend
	 * on the video feed's resolution.
	 */
	@Override
	protected void onCameraResolutionChange( int width , int height, int sensorOrientation ) {
		super.onCameraResolutionChange(width, height,sensorOrientation);

		derivX.reshape(width, height);
		derivY.reshape(width, height);
	}

	/**
	 * This function is invoked in its own thread and can take as long as you want.
	 */
	@Override
	protected void processImage(ImageBase image) {
		// The data type of 'image' was specified in the onCreate() function.
		// The line below computes the gradient and stores it in two images: one for the
		// gradient along the x-axis and the other along the y-axis
		gradient.process((GrayU8)image,derivX,derivY);
	}

	/**
	 * Override the default behavior and colorize gradient instead of converting input image.
	 */
	@Override
	protected void renderBitmapImage(BitmapMode mode, ImageBase image) {
		switch( mode ) {
			case UNSAFE: { // this application is configured to use double buffering, so the other modes could be ignored
				VisualizeImageData.colorizeGradient(derivX, derivY, -1, bitmap, bitmapTmp);
			} break;

			case DOUBLE_BUFFER: {
				VisualizeImageData.colorizeGradient(derivX, derivY, -1, bitmapWork, bitmapTmp);

				if( bitmapLock.tryLock() ) {
					try {
						Bitmap tmp = bitmapWork;
						bitmapWork = bitmap;
						bitmap = tmp;
					} finally {
						bitmapLock.unlock();
					}
				}
			} break;
		}
	}

	/**
	 * Demonstrates how to draw visuals
	 */
	@Override
	protected void onDrawFrame(SurfaceView view, Canvas canvas) {
		super.onDrawFrame(view, canvas);

		// Display info on the image being processed and how fast the input camera
		// stream (probably in YUV420) is converted into a BoofCV format
		int width = bitmap.getWidth();
		int height = bitmap.getHeight();
		canvas.drawText(String.format(Locale.getDefault(),
				"%d x %d Convert: %4.1f (ms)",
				width,height,periodConvert.getAverage()),
				0,120,paintText);

		// Pro tip: Run the app in fast or release mode for a dramatic speed up!
		// In Android Studio expand the "Build Variants" tab on the left.
	}
}
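The DOUBLE_BUFFER branch in renderBitmapImage() above swaps the work bitmap with the displayed one only when it can acquire the lock, dropping the frame otherwise so rendering never blocks the UI thread. A generic stdlib sketch of that handoff (the class below is illustrative, not part of BoofCV):

```java
import java.util.concurrent.locks.ReentrantLock;

/**
 * Sketch of a double-buffer handoff. The producer renders into 'work' and
 * swaps it with 'front' only if the consumer is not currently reading
 * (tryLock). If the lock is busy the frame is simply dropped, so the
 * producer never blocks.
 */
public class DoubleBuffer<T> {
    private final ReentrantLock lock = new ReentrantLock();
    private T front, work;

    public DoubleBuffer(T front, T work) { this.front = front; this.work = work; }

    /** Returns the buffer the producer should render into. */
    public T work() { return work; }

    /** Attempts to publish 'work'; returns true if the swap happened. */
    public boolean publish() {
        if (!lock.tryLock()) return false; // consumer is reading; drop this frame
        try {
            T tmp = work; work = front; front = tmp;
            return true;
        } finally {
            lock.unlock();
        }
    }

    /** The consumer locks while reading the front buffer. */
    public T lockFront() { lock.lock(); return front; }
    public void unlockFront() { lock.unlock(); }
}
```

Dropping frames instead of blocking is the right trade-off for a live preview: the camera keeps producing new frames, so waiting on a stale one buys nothing.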