Example SURF Feature

Speeded Up Robust Feature (SURF) is a region descriptor and interest point detector. Two different ways of using SURF are demonstrated in this example. The easy way uses a high-level interface that is easy to work with, but sacrifices flexibility. The harder way directly creates the SURF classes, which is more complex and requires a better understanding of how the code works.
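
At a glance, the easy path reduces to a handful of calls. The sketch below is a minimal version that uses the default detector settings (the full example further down tunes them explicitly); the method name easySketch is only for illustration.

<syntaxhighlight lang="java">
// Minimal sketch of the "easy" path with default settings. Assumes the same imports as the full example below.
public static void easySketch( GrayF32 image ) {
	// combined detector/descriptor created through the high-level factory
	DetectDescribePoint<GrayF32,TupleDesc_F64> surf =
			FactoryDetectDescribe.surfStable(new ConfigFastHessian(), null, null, GrayF32.class);

	// detect interest points and compute their SURF descriptions in one call
	surf.detect(image);

	System.out.println("Found Features: " + surf.getNumberOfFeatures());
}
</syntaxhighlight>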

Example File: [https://github.com/lessthanoptimal/BoofCV/blob/v0.40/examples/src/main/java/boofcv/examples/features/ExampleFeatureSurf.java ExampleFeatureSurf.java]

Concepts:

* SURF
* Region Descriptor
* Interest Point

= Example Code =

<syntaxhighlight lang="java">
/**
 * Example of how to use the SURF detector and descriptors in BoofCV.
 * 
 * @author Peter Abeles
 */
public class ExampleFeatureSurf {
	/**
	 * Use generalized interfaces for working with SURF. This removes much of the drudgery, but also reduces flexibility
	 * and slightly increases memory and computational requirements.
	 * 
	 *  @param image Input image type. DOES NOT NEED TO BE GrayF32, GrayU8 works too
	 */
	public static void easy( GrayF32 image ) {
		// create the detector and descriptors
		ConfigFastHessian configDetector = new ConfigFastHessian();
		configDetector.extract = new ConfigExtract(2, 0, 5, true);
		configDetector.maxFeaturesPerScale = 200;
		configDetector.initialSampleStep = 2;

		DetectDescribePoint<GrayF32,TupleDesc_F64> surf = FactoryDetectDescribe.
				surfStable(configDetector, null, null,GrayF32.class);

		// specify the image to process
		surf.detect(image);

		System.out.println("Found Features: "+surf.getNumberOfFeatures());
		System.out.println("First descriptor's first value: "+surf.getDescription(0).data[0]);
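
		// Note (not part of the original example): individual results can also be read back and
		// stored for later processing. getDescription(i) is shown above; getLocation(i) is assumed
		// here to come from the interest point interface that DetectDescribePoint implements:
		//
		//   for (int i = 0; i < surf.getNumberOfFeatures(); i++) {
		//       Point2D_F64 where = surf.getLocation(i);
		//       TupleDesc_F64 desc = surf.getDescription(i);
		//   }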
	}

	/**
	 * Configured exactly the same as the easy example above, but requires a lot more code and a more in-depth
	 * understanding of how SURF works and is configured. Each sub-problem which composes "SURF" is now explicitly
	 * created and configured independently. This allows an advanced user to tune it for a specific problem.
	 *
	 * @param image Input image type. DOES NOT NEED TO BE GrayF32, GrayU8 works too
	 */
	public static <II extends ImageGray<II>> void harder(GrayF32 image ) {
		// SURF works off of integral images
		Class<II> integralType = GIntegralImageOps.getIntegralType(GrayF32.class);
		
		// define the feature detection algorithm
		ConfigFastHessian config = new ConfigFastHessian();
		config.extract = new ConfigExtract(2, 0, 5, true);
		config.maxFeaturesPerScale = 200;
		config.initialSampleStep = 2;
		FastHessianFeatureDetector<II> detector = FactoryInterestPointAlgs.fastHessian(config);

		// estimate orientation
		OrientationIntegral<II> orientation =  FactoryOrientationAlgs.sliding_ii(null, integralType);

		DescribePointSurf<II> descriptor = FactoryDescribeAlgs.surfStability(null,integralType);
		
		// compute the integral image of 'image'
		II integral = GeneralizedImageOps.createSingleBand(integralType,image.width,image.height);
		GIntegralImageOps.transform(image, integral);

		// detect fast hessian features
		detector.detect(integral);
		// tell algorithms which image to process
		orientation.setImage(integral);
		descriptor.setImage(integral);

		List<ScalePoint> points = detector.getFoundFeatures();

		List<TupleDesc_F64> descriptions = new ArrayList<>();

		for( ScalePoint p : points ) {
			// estimate orientation
			orientation.setObjectRadius( p.scale*BoofDefaults.SURF_SCALE_TO_RADIUS);
			double angle = orientation.compute(p.pixel.x,p.pixel.y);
			
			// extract the SURF description for this region
			TupleDesc_F64 desc = descriptor.createDescription();
			descriptor.describe(p.pixel.x,p.pixel.y,angle,p.scale, true, desc);

			// save everything for processing later on
			descriptions.add(desc);
		}
		
		System.out.println("Found Features: "+points.size());
		System.out.println("First descriptor's first value: "+descriptions.get(0).data[0]);
	}

	public static void main( String[] args ) {
		// Need to turn off concurrency since the order in which features are returned
		// is not deterministic if turned on
		BoofConcurrency.USE_CONCURRENT = false;

		GrayF32 image = UtilImageIO.loadImage(UtilIO.pathExample("particles01.jpg"), GrayF32.class);
		
		// run each example
		ExampleFeatureSurf.easy(image);
		ExampleFeatureSurf.harder(image);
		
		System.out.println("Done!");
	}
}
</syntaxhighlight>
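
As the Javadoc for easy() notes, the input does not have to be GrayF32. The sketch below shows the same high-level path with an 8-bit GrayU8 image; it assumes FactoryDetectDescribe.surfStable and UtilImageIO.loadImage accept GrayU8.class the same way they accept GrayF32.class above, and easyU8 is only an illustrative name.

<syntaxhighlight lang="java">
// Sketch: the "easy" path with an 8-bit gray input. Only the image type changes;
// the configuration mirrors easy() in the full example above.
public static void easyU8( GrayU8 image ) {
	ConfigFastHessian configDetector = new ConfigFastHessian();
	configDetector.extract = new ConfigExtract(2, 0, 5, true);
	configDetector.maxFeaturesPerScale = 200;
	configDetector.initialSampleStep = 2;

	DetectDescribePoint<GrayU8,TupleDesc_F64> surf =
			FactoryDetectDescribe.surfStable(configDetector, null, null, GrayU8.class);

	surf.detect(image);
	System.out.println("Found Features: " + surf.getNumberOfFeatures());
}
</syntaxhighlight>

The input can be loaded directly in that format as well, e.g. GrayU8 image = UtilImageIO.loadImage(UtilIO.pathExample("particles01.jpg"), GrayU8.class);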