<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://boofcv.org/index.php?action=history&amp;feed=atom&amp;title=Example_Loop_Closure</id>
	<title>Example Loop Closure - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://boofcv.org/index.php?action=history&amp;feed=atom&amp;title=Example_Loop_Closure"/>
	<link rel="alternate" type="text/html" href="https://boofcv.org/index.php?title=Example_Loop_Closure&amp;action=history"/>
	<updated>2026-05-08T05:31:20Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.36.2</generator>
	<entry>
		<id>https://boofcv.org/index.php?title=Example_Loop_Closure&amp;diff=3218&amp;oldid=prev</id>
		<title>Peter: Created page with &quot;A key part of mapping is the ability to detect if you are retraversing an area that you&#039;ve already traveled. That&#039;s what loop closure does. It&#039;s when you&#039;ve identified that th...&quot;</title>
		<link rel="alternate" type="text/html" href="https://boofcv.org/index.php?title=Example_Loop_Closure&amp;diff=3218&amp;oldid=prev"/>
		<updated>2022-09-03T00:40:43Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;A key part of mapping is the ability to detect if you are retraversing an area that you&amp;#039;ve already traveled. That&amp;#039;s what loop closure does. It&amp;#039;s when you&amp;#039;ve identified that th...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;A key part of mapping is the ability to detect when you are re-traversing an area that you have already traveled through. That is what loop closure does: once you have identified that the same location has been seen before, you can connect two previously disconnected views in the graph.&lt;br /&gt;
&lt;br /&gt;
Example Code:&lt;br /&gt;
* [https://github.com/lessthanoptimal/BoofCV/blob/v0.41/examples/src/main/java/boofcv/examples/reconstruction/ExampleLoopClosure.java ExampleLoopClosure]&lt;br /&gt;
&lt;br /&gt;
Concepts:&lt;br /&gt;
* SLAM&lt;br /&gt;
* Scene Reconstruction&lt;br /&gt;
* Loop Closure&lt;br /&gt;
&lt;br /&gt;
Relevant Examples/Tutorials:&lt;br /&gt;
* [[Example_Scene_Recognition|Scene Recognition]]&lt;br /&gt;
&lt;br /&gt;
= Example Code =&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;java&amp;quot;&amp;gt;&lt;br /&gt;
/**&lt;br /&gt;
 * Shows how you can detect if two images are of the same scene. This is known as loop closure and is done in&lt;br /&gt;
 * robotic mapping, e.g. SLAM. Here we use a fast recognition approach that takes only a few milliseconds to find&lt;br /&gt;
 * the most likely candidate images using image features alone. After that we perform feature matching to reduce&lt;br /&gt;
 * false positives. A complete solution would also involve a geometric check, e.g. estimating a fundamental matrix.&lt;br /&gt;
 *&lt;br /&gt;
 * Using scene recognition drastically reduces computational time as it eliminates most bad matches. As a result&lt;br /&gt;
 * this can run in a real-time or near real-time environment.&lt;br /&gt;
 *&lt;br /&gt;
 * @author Peter Abeles&lt;br /&gt;
 */&lt;br /&gt;
public class ExampleLoopClosure {&lt;br /&gt;
	public static void main( String[] args ) {&lt;br /&gt;
		System.out.println(&amp;quot;Finding Images&amp;quot;);&lt;br /&gt;
		String pathImages = &amp;quot;loop_closure&amp;quot;;&lt;br /&gt;
		videoToImages(UtilIO.pathExample(&amp;quot;mvs/stone_sign.mp4&amp;quot;), pathImages);&lt;br /&gt;
		List&amp;lt;String&amp;gt; imagePaths = UtilIO.listSmart(String.format(&amp;quot;glob:%s/*.png&amp;quot;, pathImages), true, ( f ) -&amp;gt; true);&lt;br /&gt;
&lt;br /&gt;
		// Create the feature detector. Default settings are often not the best configuration for recognition.&lt;br /&gt;
		// Finding the best settings is left as an exercise for the reader.&lt;br /&gt;
		DetectDescribePoint&amp;lt;GrayU8, TupleDesc_F64&amp;gt; detector =&lt;br /&gt;
				FactoryDetectDescribe.surfFast(null, null, null, GrayU8.class);&lt;br /&gt;
&lt;br /&gt;
		// Detect features in all the images&lt;br /&gt;
		var descriptions = new ArrayList&amp;lt;FastAccess&amp;lt;TupleDesc_F64&amp;gt;&amp;gt;();&lt;br /&gt;
		var locations = new ArrayList&amp;lt;FastAccess&amp;lt;Point2D_F64&amp;gt;&amp;gt;();&lt;br /&gt;
&lt;br /&gt;
		System.out.println(&amp;quot;Feature Detection&amp;quot;);&lt;br /&gt;
		for (int pathIdx = 0; pathIdx &amp;lt; imagePaths.size(); pathIdx++) {&lt;br /&gt;
			// Print out the progress&lt;br /&gt;
			System.out.print(&amp;quot;*&amp;quot;);&lt;br /&gt;
			if (pathIdx%80 == 79)&lt;br /&gt;
				System.out.println();&lt;br /&gt;
&lt;br /&gt;
			// Load the image and detect features&lt;br /&gt;
			String path = imagePaths.get(pathIdx);&lt;br /&gt;
			GrayU8 gray = UtilImageIO.loadImage(path, GrayU8.class);&lt;br /&gt;
&lt;br /&gt;
			detector.detect(gray);&lt;br /&gt;
&lt;br /&gt;
			// Copy all the features into lists for this image&lt;br /&gt;
			var imageDescriptions = new DogArray&amp;lt;&amp;gt;(detector::createDescription);&lt;br /&gt;
			var imageLocations = new DogArray&amp;lt;&amp;gt;(Point2D_F64::new);&lt;br /&gt;
&lt;br /&gt;
			for (int i = 0; i &amp;lt; detector.getNumberOfFeatures(); i++) {&lt;br /&gt;
				imageDescriptions.grow().setTo(detector.getDescription(i));&lt;br /&gt;
				imageLocations.grow().setTo(detector.getLocation(i));&lt;br /&gt;
			}&lt;br /&gt;
			descriptions.add(imageDescriptions);&lt;br /&gt;
			locations.add(imageLocations);&lt;br /&gt;
		}&lt;br /&gt;
		System.out.println();&lt;br /&gt;
&lt;br /&gt;
		// Put feature information into a format scene recognition understands&lt;br /&gt;
		var listRecFeat = new ArrayList&amp;lt;FeatureSceneRecognition.Features&amp;lt;TupleDesc_F64&amp;gt;&amp;gt;();&lt;br /&gt;
		for (int i = 0; i &amp;lt; descriptions.size(); i++) {&lt;br /&gt;
			FastAccess&amp;lt;Point2D_F64&amp;gt; pixels = locations.get(i);&lt;br /&gt;
			FastAccess&amp;lt;TupleDesc_F64&amp;gt; descs = descriptions.get(i);&lt;br /&gt;
			listRecFeat.add(new FeatureSceneRecognition.Features&amp;lt;&amp;gt;() {&lt;br /&gt;
				@Override public Point2D_F64 getPixel( int index ) {return pixels.get(index);}&lt;br /&gt;
&lt;br /&gt;
				@Override public TupleDesc_F64 getDescription( int index ) {return descs.get(index);}&lt;br /&gt;
&lt;br /&gt;
				@Override public int size() {return pixels.size();}&lt;br /&gt;
			});&lt;br /&gt;
		}&lt;br /&gt;
&lt;br /&gt;
		System.out.println(&amp;quot;Learning model. Can take a minute. You can save and reload this model.&amp;quot;);&lt;br /&gt;
&lt;br /&gt;
		var config = new ConfigRecognitionNister2006();&lt;br /&gt;
		config.learningMinimumPointsForChildren.setFixed(20);&lt;br /&gt;
		FeatureSceneRecognition&amp;lt;TupleDesc_F64&amp;gt; recognizer =&lt;br /&gt;
				FactorySceneRecognition.createSceneNister2006(config, detector::createDescription);&lt;br /&gt;
&lt;br /&gt;
		// Pass image information in as an iterator that it understands.&lt;br /&gt;
		recognizer.learnModel(new Iterator&amp;lt;&amp;gt;() {&lt;br /&gt;
			int imageIndex = 0;&lt;br /&gt;
&lt;br /&gt;
			@Override public boolean hasNext() {return imageIndex &amp;lt; descriptions.size();}&lt;br /&gt;
&lt;br /&gt;
			@Override public FeatureSceneRecognition.Features&amp;lt;TupleDesc_F64&amp;gt; next() {&lt;br /&gt;
				return listRecFeat.get(imageIndex++);&lt;br /&gt;
			}&lt;br /&gt;
		});&lt;br /&gt;
&lt;br /&gt;
		// To find functions for saving and loading these models look at RecognitionIO&lt;br /&gt;
&lt;br /&gt;
		System.out.println(&amp;quot;Creating database&amp;quot;);&lt;br /&gt;
		for (int imageIdx = 0; imageIdx &amp;lt; descriptions.size(); imageIdx++) {&lt;br /&gt;
			// Note that images are assigned a name equal to their index&lt;br /&gt;
			recognizer.addImage(imageIdx + &amp;quot;&amp;quot;, listRecFeat.get(imageIdx));&lt;br /&gt;
		}&lt;br /&gt;
&lt;br /&gt;
		System.out.println(&amp;quot;Scoring likely loop closures&amp;quot;);&lt;br /&gt;
&lt;br /&gt;
		// Have a strict requirement for matching to reduce false positives&lt;br /&gt;
		var configAssociate = new ConfigAssociateGreedy();&lt;br /&gt;
		configAssociate.forwardsBackwards = true;&lt;br /&gt;
		configAssociate.scoreRatioThreshold = 0.9;&lt;br /&gt;
&lt;br /&gt;
		var scorer = FactoryAssociation.scoreEuclidean(detector.getDescriptionType(), true);&lt;br /&gt;
		var associate = FactoryAssociation.greedy(configAssociate, scorer);&lt;br /&gt;
&lt;br /&gt;
		// Go through all the images and use scene recognition to greatly reduce the number of images that need&lt;br /&gt;
		// to be considered. Scene recognition is very fast, while feature matching is slow, and geometric&lt;br /&gt;
		// checks are even slower.&lt;br /&gt;
		var matches = new DogArray&amp;lt;&amp;gt;(SceneRecognition.Match::new);&lt;br /&gt;
		for (int imageIdx = 0; imageIdx &amp;lt; descriptions.size(); imageIdx++) {&lt;br /&gt;
			// Query results to find the best matches.&lt;br /&gt;
			// We are going to pass in a filter that will remove all the most recent frames since we don&amp;#039;t care&lt;br /&gt;
			// about those. This way we know all the returned results are potential loop closures.&lt;br /&gt;
			int _imageIdx = imageIdx;&lt;br /&gt;
			recognizer.query(&lt;br /&gt;
					/*query*/ listRecFeat.get(imageIdx),&lt;br /&gt;
					/*filter*/ ( id ) -&amp;gt; Math.abs(_imageIdx - Integer.parseInt(id)) &amp;gt; 20,&lt;br /&gt;
					/*limit*/ 5, /*found matches*/ matches);&lt;br /&gt;
&lt;br /&gt;
			// Set up association&lt;br /&gt;
			associate.setSource(descriptions.get(imageIdx));&lt;br /&gt;
			int numFeatures = descriptions.get(imageIdx).size;&lt;br /&gt;
&lt;br /&gt;
			System.out.printf(&amp;quot;Image[%3d]\n&amp;quot;, imageIdx);&lt;br /&gt;
			for (var m : matches.toList()) {&lt;br /&gt;
				// Note how earlier each image was assigned a name equal to its index value as a string&lt;br /&gt;
				int imageDstIdx = Integer.parseInt(m.id);&lt;br /&gt;
&lt;br /&gt;
				// Perform association&lt;br /&gt;
				associate.setDestination(descriptions.get(imageDstIdx));&lt;br /&gt;
				associate.associate();&lt;br /&gt;
&lt;br /&gt;
				// Compute and print quality of fit metrics&lt;br /&gt;
				double matchFraction = associate.getMatches().size/(double)numFeatures;&lt;br /&gt;
				System.out.printf(&amp;quot;  %4s error=%.2f matches=%.2f\n&amp;quot;, m.id, m.error, matchFraction);&lt;br /&gt;
&lt;br /&gt;
				// A loop closure will have a large number of matching features. When the fraction goes&lt;br /&gt;
				// over 30% in this example, you probably have a good match.&lt;br /&gt;
&lt;br /&gt;
				// Typically a geometric check is done next, such as estimating a fundamental matrix or PnP.&lt;br /&gt;
				// With a geometric check the odds of a false positive are low.&lt;br /&gt;
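&lt;br /&gt;
				// A hedged sketch of such a check (class names below are from BoofCV, but the&lt;br /&gt;
				// exact signatures may differ between versions, so this is left as comments):&lt;br /&gt;
				// estimate a fundamental matrix with RANSAC from the associated pairs and&lt;br /&gt;
				// accept the loop closure only if enough inliers remain, e.g.&lt;br /&gt;
				//&lt;br /&gt;
				//   ModelMatcher&amp;lt;DMatrixRMaj, AssociatedPair&amp;gt; robustF =&lt;br /&gt;
				//           FactoryMultiViewRobust.fundamentalRansac(&lt;br /&gt;
				//                   new ConfigFundamental(), new ConfigRansac(500, 1.0));&lt;br /&gt;
				//   // build AssociatedPair list from associate.getMatches() + pixel locations,&lt;br /&gt;
				//   // then: boolean success = robustF.process(pairs) and inspect getMatchSet().size()&lt;br /&gt;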
			}&lt;br /&gt;
		}&lt;br /&gt;
		System.out.println(&amp;quot;Done!&amp;quot;);&lt;br /&gt;
	}&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;/div&gt;</summary>
		<author><name>Peter</name></author>
	</entry>
</feed>