Background

Recently I have decided to explore tracking from 3D point clouds extracted from stereo vision cameras. Step 1: extract a 3D point cloud from stereo vision cameras. So right now I’m implementing “Segment-Based Stereo Matching Using Belief Propagation and a Self-Adapting Dissimilarity Measure” by Klaus, Sormann, and Karner. This paper is rated by the source on stereo vision as the best one around. The paper has two parts. Part 1: segment the image. Part 2: compute disparity (and depth) from the segments. Well, today I finished Part 1.

Stereo Cameras

First Try

The authors refer to a mean-shift segmentation algorithm presented in “Mean Shift: A Robust Approach Toward Feature Space Analysis” [pdf] by Comaniciu and Meer to do the image segmentation. This paper (unlike some of my own previous work) leans toward oversegmentation of an image, meaning that you would rather get lots of little bits than the “right object” after the algorithm has run.

Well, after looking over the paper and getting a grasp of the mathematics, I took a crack at implementing it. Easily done… HOWEVER, my first attempt, written in Matlab, was painfully slow. (For a simple image it took 6 hours to run!) So, I got on the internet and came up with a better solution!
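For the curious, the brute-force approach looks something like this. It is an illustrative sketch (not my actual code) of mean-shift filtering on a grayscale image, with hs and hr standing in for the spatial and range bandwidths; every pixel’s window test touches every other pixel, which is exactly why a direct Matlab implementation crawls.

```matlab
% Naive mean-shift filtering sketch for a grayscale image.
% hs = spatial bandwidth, hr = range (intensity) bandwidth.
function F = naive_meanshift(I, hs, hr)
    [rows, cols] = size(I);
    [X, Y] = meshgrid(1:cols, 1:rows);
    pts = [X(:), Y(:), double(I(:))];    % N-by-3 joint (x, y, intensity) space
    F = zeros(rows, cols);
    for i = 1:size(pts, 1)
        p = pts(i, :);
        for iter = 1:5                   % a few shift iterations per pixel
            % Points inside the current window: near in space AND in range.
            in = abs(pts(:,1) - p(1)) <= hs & ...
                 abs(pts(:,2) - p(2)) <= hs & ...
                 abs(pts(:,3) - p(3)) <= hr;
            p = mean(pts(in, :), 1);     % shift the point to the window mean
        end
        F(i) = p(3);                     % converged intensity for this pixel
    end
end
```

The window test is O(N) per pixel per iteration, so the whole thing is O(N²) per pass; for even a modest image that is billions of comparisons, hence the hours-long runtimes.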

The Solution

Some great guys at Rutgers University implemented this paper in C++ and made the code available to the public under the name EDISON. (There’s also a nice GUI that goes along with it if you just want to play around and see whether these codes will work for you.) Okay, so I had C++ codes that worked well (only 2 seconds to do an image rather than 6 hours). The next step was to bring the code into Matlab.

Mean Shift Segmentation Results
These were the kind of results I was aiming for

I cracked my knuckles and got ready to write a MEX wrapper for this EDISON code. Then I said to myself, “Self, maybe you should check the ‘net first.” Turns out I had a good point. I found the website of Shai Bagon. Mr. Bagon had already made the MEX wrapper! Awesome.

I downloaded the codes and put them together. Mr. Bagon’s stuff worked right out of the box, although it would have saved me about an hour if I had had this information (alternative readme.txt for the Matlab Interface for EDISON). I also wrote my own wrapper-wrapper so that I could process grayscale images and make simpler calls to accomplish what I wanted. If you’d like the code, download my wrapper-wrapper here (msseg.m).
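For reference, usage ends up being just a couple of lines. The parameter list below is my assumption about a typical mean-shift interface (spatial bandwidth, range bandwidth, minimum region area), not the documented one; check the header comments of msseg.m for the actual signature:

```matlab
% Hypothetical call to the msseg wrapper-wrapper; the parameter list is an
% assumption, not the documented interface -- see the header of msseg.m.
I = imread('peppers.png');                % RGB or grayscale both work
[S, labels] = msseg(I, 7, 6.5, 20);       % spatial BW, range BW, min region area
figure;
subplot(1,2,1); imshow(I); title('original');
subplot(1,2,2); imshow(S); title('segmented');
```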

Results

Here is a sample of the output of this algorithm. The first image is a regular photo of some posed objects. The second image is the segmented version. Notice how the regions of the image are much, much more constant. This image has been broken into “tiles” of constant color.

Left Image
The original image (part of a standard pair of test images).
segmented image
The segmented image (ready to be processed in step 2)

Conclusion

Don’t re-invent the wheel. Taking a first crack at the implementation was good, and it helped me understand the algorithm. However, there was no need for me to spend a week tweaking it to be super-fast or two days getting the Matlab interface working. These things had already been done! It feels nice to knock out in a few hours a task that you thought was going to take a week : ) Stay tuned for the stereo part of this paper coming soon. Then maybe people will be writing about my page!

76 thoughts on “Mean Shift Segmentation in Matlab”

  1. Hi! I too am an enthusiast about these things. I’ve read many papers on the subject, and in the case of this one I seriously doubt they’ve told everything they did in order to get their awesome results. Good luck and regards

  2. I’ve spent the whole week trying to implement this. Thank god I finally decided to try google (and found your post)!

  3. Hi Shawn, thanks for piecing this together. I have a question which I hope is straightforward:

    What is the best way to get the filtered image back into the same image space as it started?

    Consider for example a grayscale image of, say, intensities from 1 to 1024. Your wrapper converts this to RGB (using repmat). The filter itself works in Luv space, so the edison_wrapper converts RGB to Luv. The result is then converted by your wrapper back to RGB using Luv2RGB, then to grayscale using rgb2gray(). However, this converted grayscale space is still only from 0 to 1.

    Does this 0-1 map linearly to the min-max of the original image?

    Is it best to, say, loop over each returned label and calculate the mean pixel intensity from the original image? This would give an OK result, but I’m not sure if I’m missing an easier and perhaps faster method…

    Thanks,
    Sven.

  4. The code by Bagon seems like it has a color space issue. It assumes you convert to LUV before calling the mex wrapper, but the synergistic code assumes the image is RGB. In EDISON’s Windows app you can see they do the synergistic calcs in RGB and then convert to LUV. Therefore, I think you are not using the EDISON library correctly. That is, the weights you are calculating are not optimal.

  5. I have to implement mean shift clustering in matlab and I think the code you mention here will do exactly what I need. Unfortunately I have forgotten how to compile a MEX file.
    Could you give me a step by step check list to guide me through the steps?
    Thank you.

  6. I ran into a problem: the file “msseg” does not work on my Windows XP 64-bit machine. I compile with “mex edison_wrapper_mex.cpp”. One of the errors is:
    error LNK2019: unresolved external symbol “public: __cdecl msImageProcessor::~msImageProcessor(void)” (??1msImageProcessor@@QEAA@XZ) referenced in function mexFunction
    Maybe some .lib files are missing.
    Could you help me figure this out?
    And I’m wondering if you could send me a copy of your source code? That would help me a lot.

    Thanks in advance.

  7. @David: You may be right… This code gets most of the way there, though.

    @Olaf: to compile mex, use >>mex filename.cpp. Check this tutorial.

    @Tina: The codes I posted are merely a wrapper for the codes by Shai Bagon. Many people are having trouble compiling them on Windows systems. My apologies, but I do not have the original source.

  8. Hi Shawn, just a quick question please: did you by any chance use the EDISON execution script at all? I’m trying to find some examples of its syntax; do you have anything in mind? Thanks in advance. George

  9. I used the EDISON codes, but not the “execution script.” I have download links (above) for all of the scripts and functions I wrote to help run the EDISON codes. Enjoy.

  10. Hi Shawn, I’ve just got some work on segmentation and fortunately found your post via Google, which will save me a lot of time. Thanks very much!

  11. Hi Shawn,

    I just want to ask if you have seen or know where I can find an implementation of this segmentation method (the information bottleneck method): “Image Segmentation using Information Bottleneck Method”, A. Bardera, J. Rigau, I. Boada, M. Feixas and M. Sbert. IEEE Transactions on Image Processing, Volume 18, No 7, pp. 1601-1612, July 2009.

    Thank you

  12. Hi there :)
    Thanks for the great post! It saved me lots of time :)
    I have just one question… when using the EDISON GUI (the exe file), I can get a good result by segmenting the image and then applying the ‘Fusion Only’ option, which gives an image much more homogeneous than the first transformation.
    My doubt is how to make this work in Matlab with the Shai Bagon wrapper… I can’t tell whether the ‘steps’ option must be 2 (the default) or whether there is some ‘hidden 3rd step’ I can activate. I cannot get the same result as the GUI by calling the function twice, and from the readmes for the Matlab wrapper I found that the only GUI feature Matlab won’t do is the edge detection. So I can’t find a way to accomplish the same result… :(
    One last detail: I could not compile the MEX file. When I try, the compiler outputs lots of errors and then terminates saying “Too many errors”. But I can use your wrapper with no problem, and it reproduces the same segmentation as the exe GUI, so I think this may not be the problem…
    Does anyone know what I’m missing?

    thanks in advance :)*

  13. @Jeanine So there are a couple of things: 1) The Matlab wrapper that I wrote deliberately removes some of the functionality of Bagon’s wrapper because that made the codes easier to use. You may look more deeply into the MEX codes to get access to the specific parameter you’re looking for.
    2) Lots of people have trouble compiling for the Windows operating system. Because I don’t run Windows, I can’t debug it, but people have told me that it isn’t too hard to find the offending line of code.

  14. @Olaf
    Hi, could you please mail me the mean-shift clustering Matlab code, which can be used for classification of a data set? I need it badly…

  15. For the record, compiling in Windows requires a few steps:

    1) mex -setup, choose VC++ compiler (the default, lcc, doesn’t work)
    2) Please run “compile_edison_wrapper.m”, don’t call mex to compile it directly since it’s missing all the other important files!
    3) You may get an error about a “MAC file format” on one or two files. Open them with a text editor and save them again. If that doesn’t solve it, check the editor’s options to save them specifically in a format that is NOT Mac.
    4) You may get an error about “ambiguous function call for pow()”. The first parameter to that function is 2, change it to 2.0.

    Hope I didn’t forget anything :)

    BTW I can’t get the weights to work (ie, defining a “strength” for each sample), if anyone can help…

  16. Hi, Thank you very much for your article.

    I can run the Matlab code:
    >> %% This will load, segment, and display the "peppers" image
    >> I = imread('b.jpg');
    >> [fimg labels modes regsize grad conf] = edison_wrapper(I, @RGB2Luv);
    >> imshow(Luv2RGB(fimg))

    I am now just confused about how to use the output data to perform my next step, which is the registration part (to shift different segments and compare them with the reference image). After I run the code, we only get the segmented image; we cannot get each separate segment’s data.

    Do you happen to know how to do that please? Thank you!

  17. @Shawn Lankton

    Hello sir,
    Can you explain the Matlab code for mean shift? You have used three functions; can you give a clear explanation of each?

    Thank you,
    Rakesh

  18. Hi Shawn,
    As you mention, the first part is segmentation of the image and the second part is computation of the disparity (depth) map from the segments using RANSAC. I’m doing the second part as described in the “Stereoscopic Inpainting: Joint Color and Depth Estimation” paper and implementing that paper.
    Disparity assignment for partly visible segments has been done as described in the paper, but for invisible (more occluded) regions I’m not getting good results. Can you provide any ideas, or more explanation of the computation of the depth map from segments?

    regards
    Balaji

  19. Jack: The “labels” matrix has the info you want. It’s just like the labels that bwlabel outputs (but bwlabel only works when the clusters are disjoint). It assigns a cluster index number to each pixel. regionprops works with this structure and outputs blob information. Since these indexes are 0-based instead of 1-based (as Matlab usually likes), you have to add 1. So, call regionprops(labels+1); see help for regionprops to select the characteristics you want (centroid, convex hull, etc).
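Written out as code, the recipe in the comment above is just (a sketch, using the variable names from comment 16):

```matlab
% labels is 0-based from the wrapper; regionprops expects 1-based labels.
stats = regionprops(labels + 1, 'Centroid', 'Area', 'BoundingBox');
numSegments = numel(stats);        % one struct per mean-shift cluster
c1 = stats(1).Centroid;            % [x y] centroid of the first segment
```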

  20. How can I separate each part of a segmented image?

    For more explanation: suppose we segment an image using the mean-shift algorithm.
    Then we want to separate the segmented parts and use these extracted parts as the input of a classifier such as
    a neural network.

    How can I take the colored segmented parts apart?
    Would you please give me Matlab code for this purpose, or even the structure of such a program?

  21. Hi Shawn,

    Thanks for the article. I am new to Matlab and would like to ask whether or not it is possible to use edison_matlab_interface without having the Matlab Component Runtime (MCR) on my machine.

    Mine is Matlab 7.10.0 (R2010a). I am trying to locate the MCR on my machine, but it seems not to exist.

    any helps?

    -Bree

  22. @mohi I believe the function returns a label map as well as the visual output. The label map is an image where pixels in each segment have the same value (i.e., the first segment is all 1’s, the second is all 2’s, etc.)
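Concretely, pulling one segment out of the image is a logical comparison against that label map. A sketch (variable names as in comment 16; if your labels come back 0-based, add 1 first):

```matlab
% Keep only the pixels of segment k; black out everything else.
k = 2;
mask = (labels == k);                    % logical map of segment k
mask3 = repmat(mask, [1 1 size(I, 3)]);  % expand to all color channels
segment = I;
segment(~mask3) = 0;                     % segment k on a black background
```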

  23. Shawn,
    Thank you for this lovely post! I see a lot of traffic going through here. Apparently mean-shift and its Matlab implementation draw some attention.
    I’m happy my wrapper helped some people.
    Best,
    Shai Bagon

  24. Hi Shawn,
    Thank you for sharing these implementations with all of us!
    Have you succeeded with any further implementation of this paper (I mean Part 2)?
    Best,

  25. Hi Shawn,
    your work is wonderful. I really appreciate it. Could you suggest a way to resolve a problem so I can continue my work?
    I’m working in a visual saliency context. I have a saliency map of the same image that I give to the edison_wrapper method. The saliency map is a grayscale matrix in which the more salient points are represented as brighter pixels.
    So, what I need is a way in Matlab to calculate the mean value of saliency for each region (for example, the mean value of the saliency matrix over the coordinates that are labelled 0 in the Labels matrix, then the ones labelled 1, and so on). The first way I think of is obviously a loop, but I’m sure there’s something more efficient.
    Could you help me?
    Any suggestion is appreciated.
    Thanks in advance!
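For what it’s worth, the loop-free way to do a per-region mean in Matlab is accumarray. A sketch, assuming S is the saliency map and labels is the (0-based) label map from the wrapper:

```matlab
% Mean saliency of every mean-shift region, without an explicit loop.
regionMean = accumarray(labels(:) + 1, double(S(:)), [], @mean);
% regionMean(k) is the mean saliency of the region whose label is k-1.
```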

  26. yongsheng :
    hi, how can I get the above Matlab code and try it?


    @yongsheng

  27. @Jotaf
    Thank you very much for sharing your experience! ~O(∩_∩)O~

  28. @Tina
    Unzip “edison_matlab_interface.tar.gz” to a folder and change the Matlab path to that folder.
    Then run “mex -setup” to select the compiler, and run the m-file “compile_edison_wrapper.m”.
    After you do that, read the remarks in “edison_wrapper.m” and do as it says in that file.

  29. hi, I’m looking for a simple and useful mean shift segmentation code in the IDL language. Can you help me? Please, I need it very much….

    thank you.

  30. Hello, I’m a graduate student in Taiwan. (My English is poor ><)
    Could you email me the mean-shift segmentation sample code like the solution above?
    Visual C++ or Matlab is also good, thank you.

  31. @David Ehrenberg
    Bagon’s Matlab code (mine :-) does not have a feature-space issue. You input the raw RGB image (for the synergistic weights) and a feature function (for example: one that converts from RGB to LUV) that transforms the image to the appropriate feature space. In that respect, the interface allows more flexibility in choosing the working domain for the mean-shift function.

    Shai Bagon

  32. @Shai
    Your wrapper (Matlab and C++ code) seems to be limited to application on RGB images; single-channel or even high-dimensional data cannot be processed. My feature function computes features other than 3-dimensional (Luv-like) ones, so I guess I have to make some modifications to your wrapper.

  33. Hi everybody,

    I am curious about the hr parameter: what is it?

    Am I right if I think that:
    1. hr is the window in Luv color space, a square window of size 2*hr+1; or in other words, every point in Luv belongs to the mode if it has a distance less than hr from the centroid of the mode (circular).

    2. hr is the maximum contrast within a mode, so if I have a circular window, the maximum distance = hr/2?

    thanks you

    PSU

  34. hi, thanks for your article. Could you please send me the EDISON code (C++)? I need it badly for my research. Thank you.

  35. hi, thanks for your article. Could you please send me the EDISON code (C++)? I need it badly for my research. Thank you.

    Jack

  36. @Shawn Lankton: Thanks for your post.
    I just wonder how to combine the parts of an object to extract it from the other objects. For example, the lamp (in your segmentation result image) is segmented into many parts, and we actually don’t know which parts belong to the lamp. How do you extract the lamp from the image?

  37. Hi, have you implemented Part 2? I need it very urgently; please can you mail me?
    I want to find region-based disparity by affine transformation of the segmented image.

  38. Hi, I can’t compile the command-prompt version of the code in Visual Studio 2010. It gives linker errors; any suggestions?
