= Kinect Support =
<center>
<gallery widths=320px heights=240px>
file:Kinect_depth_basket.jpg | Depth image from Kinect sensor
file:Kinect_point_cloud_basket.jpg | 3D point cloud created from RGB and depth images
</gallery>
</center>


The Kinect is a popular RGB-D sensor which provides both video and depth information.  BoofCV provides support for the [http://openkinect.org OpenKinect] driver directly through several helper functions.  The OpenKinect project already provides a Java interface.  The Kinect is much easier to work with than stereo cameras and provides similar information.  The major downsides to working with a Kinect sensor are that it can't be used outdoors and it has a more limited range.
To access the Kinect helper functions in BoofCV, go to the boofcv/integration/openkinect directory.  Several useful examples and utilities are provided in the openkinect/examples directory, while the main source code is in the openkinect/src directory.  To use these functions, be sure to include BoofCV_OpenKinect.jar in your project.  This jar can be downloaded precompiled or you can compile it yourself using ant.


Two classes are provided in the Kinect jar: StreamOpenKinectRgbDepth and UtilOpenKinect.  StreamOpenKinectRgbDepth is a high-level interface for streaming data from a Kinect using OpenKinect.  UtilOpenKinect contains functions for manipulating, reading, and saving Kinect data.  The depth image is stored as an unsigned 16-bit grayscale image (GrayU16) and the RGB image in any standard image format.
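
Because the depth image is just a GrayU16, individual range samples can be read with BoofCV's normal image accessors.  Here is a minimal sketch (not part of the tutorial code; the class name and values are made up for illustration) which assumes depth is reported in millimeters, as with the Kinect's millimeter depth formats, and that a value of 0 marks a pixel with no reading:

<syntaxhighlight lang="java">
import boofcv.struct.image.GrayU16;

/**
 * Minimal sketch: interpreting the 16-bit Kinect depth image.
 * Assumes depth values are in millimeters and that 0 means "no reading" at a pixel.
 */
public class DepthValueSketch {
	public static void main( String[] args ) {
		GrayU16 depth = new GrayU16(640,480);   // resolution the Kinect streams at
		depth.set(320,240,1500);                // pretend the center pixel is 1.5 m away

		int mm = depth.get(320,240);            // unsigned 16-bit value returned as an int
		if( mm == 0 )
			System.out.println("No depth reading at this pixel");
		else
			System.out.printf("Depth at center = %.3f m%n", mm/1000.0);
	}
}
</syntaxhighlight>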
 
Related Examples:
* [https://github.com/lessthanoptimal/BoofCV/tree/v0.23/integration/openkinect/example/src/boofcv/example/ Examples in source code]
* [[Example_Point_Cloud_Depth_Image| Depth Point Cloud]]
* [[Example Visual Odometry Depth| Depth Visual Odometry]]


== Usage Example ==
Many more examples are provided in the openkinect/examples directory. Here is one showing typical usage.

<syntaxhighlight lang="java">
/**
 * Example demonstrating how to process and display data from the Kinect.
 *
 * @author Peter Abeles
 */
public class OpenKinectStreamingTest {

	{
		// be sure to set OpenKinectExampleParam.PATH_TO_SHARED_LIBRARY to the location of your shared library!
		NativeLibrary.addSearchPath("freenect", OpenKinectExampleParam.PATH_TO_SHARED_LIBRARY);
	}

	Planar<GrayU8> rgb = new Planar<GrayU8>(GrayU8.class,1,1,3);
	GrayU16 depth = new GrayU16(1,1);

	BufferedImage outRgb;
	ImagePanel guiRgb;

	BufferedImage outDepth;
	ImagePanel guiDepth;

	public void process() {
		Context kinect = Freenect.createContext();

		if( kinect.numDevices() <= 0 )
			throw new RuntimeException("No kinect found!");

		Device device = kinect.openDevice(0);

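		// REGISTERED depth is aligned to the RGB image and reported in millimeters,
		// which lets depth and color pixels be matched up directly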
		device.setDepthFormat(DepthFormat.REGISTERED);
		device.setVideoFormat(VideoFormat.RGB);

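		// frames are delivered asynchronously through these callbacks as they arrive from the sensor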
		device.startDepth(new DepthHandler() {
			@Override
			public void onFrameReceived(FrameMode mode, ByteBuffer frame, int timestamp) {
				processDepth(mode,frame,timestamp);
			}
		});
		device.startVideo(new VideoHandler() {
			@Override
			public void onFrameReceived(FrameMode mode, ByteBuffer frame, int timestamp) {
				processRgb(mode,frame,timestamp);
			}
		});

		// stream for 100 seconds, then shut the device down
		long startTime = System.currentTimeMillis();
		while( startTime+100000 > System.currentTimeMillis() ) {}
		System.out.println("100 Seconds elapsed");

		device.stopDepth();
		device.stopVideo();
		device.close();

	}

	protected void processDepth( FrameMode mode, ByteBuffer frame, int timestamp ) {
		System.out.println("Got depth! "+timestamp);

		if( outDepth == null ) {
			depth.reshape(mode.getWidth(),mode.getHeight());
			outDepth = new BufferedImage(depth.width,depth.height,BufferedImage.TYPE_INT_RGB);
			guiDepth = ShowImages.showWindow(outDepth,"Depth Image");
		}

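		// copy the raw 16-bit depth buffer into the GrayU16 image, one value per pixel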
		UtilOpenKinect.bufferDepthToU16(frame, depth);

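		// render the depth image using a pseudo-colored disparity visualization;
		// the commented-out call below is a plain grayscale alternative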
//		VisualizeImageData.grayUnsigned(depth,outDepth,UtilOpenKinect.FREENECT_DEPTH_MM_MAX_VALUE);
		VisualizeImageData.disparity(depth, outDepth, 0, UtilOpenKinect.FREENECT_DEPTH_MM_MAX_VALUE,0);
		guiDepth.repaint();
	}

	protected void processRgb( FrameMode mode, ByteBuffer frame, int timestamp ) {
		if( mode.getVideoFormat() != VideoFormat.RGB ) {
			System.out.println("Bad rgb format!");
		}

		System.out.println("Got rgb!   "+timestamp);

		if( outRgb == null ) {
			rgb.reshape(mode.getWidth(),mode.getHeight());
			outRgb = new BufferedImage(rgb.width,rgb.height,BufferedImage.TYPE_INT_RGB);
			guiRgb = ShowImages.showWindow(outRgb,"RGB Image");
		}

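		// convert the raw RGB buffer into the Planar<GrayU8> image, then into a BufferedImage for display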
		UtilOpenKinect.bufferRgbToMsU8(frame, rgb);
		ConvertBufferedImage.convertTo_U8(rgb,outRgb,true);

		guiRgb.repaint();
	}

	public static void main( String args[] ) {
		OpenKinectStreamingTest app = new OpenKinectStreamingTest();

		app.process();
	}
}
</syntaxhighlight>