
image processing - Nontrivial background removal



I have an image of a product on a poorly made green screen and need to segment out just the product:


[image: the product photographed on the poorly made green screen]


The problem is that it contains a mirror, so simple color-based methods are not enough.


I tried playing with the RemoveBackground function using markers, but had no luck. Here's what I've tried so far:


RemoveBackground[img, {"Background", Green}]
RemoveBackground[img, {"Background", {"Uniform", 0.1}}]

[images: the output of the two RemoveBackground attempts]


Update:


With Python and OpenCV I can do it easily using the GrabCut algorithm referenced in the comments, but I can't find a way to do it in Mathematica.



%matplotlib inline

import numpy as np
import cv2
import skimage
import skimage.transform
from matplotlib import pyplot as plt

img = cv2.imread(path_to_img)
print("img", img.shape)

# resize so the longest side is `side` pixels
side = 600
ratio = float(side) / max(img.shape[:2])
img = skimage.img_as_ubyte(
    skimage.transform.resize(
        img, (int(img.shape[0] * ratio), int(img.shape[1] * ratio))))

# initial rectangle in OpenCV's (x, y, w, h) convention, leaving a ~10% margin on every side
my, mx = img.shape[0] // 10, img.shape[1] // 10
rect = (mx, my, img.shape[1] - 2 * mx, img.shape[0] - 2 * my)

mask = np.zeros(img.shape[:2], np.uint8)
bgdModel = np.zeros((1, 65), np.float64)
fgdModel = np.zeros((1, 65), np.float64)

cv2.grabCut(img, mask, rect, bgdModel, fgdModel, 5, cv2.GC_INIT_WITH_RECT)

# keep only pixels labelled as (probable) foreground
mask2 = np.where((mask == 2) | (mask == 0), 0, 1).astype('uint8')
img = img * mask2[:, :, np.newaxis]

plt.imshow(cv2.cvtColor(img, cv2.COLOR_BGR2RGB))  # OpenCV loads BGR; matplotlib expects RGB
plt.show()

[image: the GrabCut result]
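For reference, a marker-based attempt in Mathematica itself might look like the sketch below. This is only a rough analogue of the GrabCut rectangle initialisation: it assumes RemoveBackground accepts a {"Foreground", pts} marker given as image coordinates (check the documentation for the exact marker forms), and the seed points are placeholders, not values from the original post.

img = Import["https://i.stack.imgur.com/zP5xF.jpg"];
{w, h} = ImageDimensions[img];
(* a few seed points assumed to lie inside the product; adjust by eye *)
fgPts = {{w/2, h/2}, {0.4 w, 0.5 h}, {0.6 w, 0.5 h}};
RemoveBackground[img, {"Foreground", fgPts}]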



Answer




image = Import["https://i.stack.imgur.com/zP5xF.jpg"];
imageData = Flatten[ImageData[ColorConvert[image, "LAB"]], 1];
c = ClusterClassify[imageData, 4, Method -> "KMedoids"];
decision = c[imageData];
mask = Image /@
  ComponentMeasurements[{image,
      Partition[decision, First@ImageDimensions[image]]}, "Mask"][[All, 2]]

[images: the four cluster masks]


allMask = FillingTransform[Dilation[ColorNegate[mask[[4]]], 1]];
SetAlphaChannel[image, Blur[allMask, 8]]

[image: the product cut out using the blurred alpha mask]
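The index 4 above comes from inspecting the four masks by eye. A minimal sketch of picking the background cluster automatically, assuming the cluster that covers most of the image border is the green screen (a heuristic, not part of the original answer):

borderFraction[m_Image] := Mean[Flatten[{
     ImageData[m][[1]], ImageData[m][[-1]],
     ImageData[m][[All, 1]], ImageData[m][[All, -1]]}]];
bgIndex = Last[Ordering[borderFraction /@ mask]];
bgMask = FillingTransform[Dilation[ColorNegate[mask[[bgIndex]]], 1]];
SetAlphaChannel[image, Blur[bgMask, 8]]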




Method one: classify each pixel with a NetChain


I have to say this method is nearly worthless in real life, because it is extremely inefficient (with a CUDA-capable GPU it would be faster). I don't remember how long it took to run. Well, it's just for fun.


First we select the regions we need. The selection only has to be rough, which means a few stray pixels in the training data are acceptable. Of course you can build your own training data. This is what I selected, more or less arbitrarily:
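A minimal sketch of how such training pixel lists could be built, assuming rough crop rectangles around a patch of the ring and a patch of the green screen; the coordinates below are placeholders, not the regions actually selected in the original answer.

image = Import["https://i.stack.imgur.com/zP5xF.jpg"];
(* placeholder crop rectangles: one inside the ring, one on the green screen *)
yes = Catenate[ImageData[ImageTrim[image, {{250, 200}, {400, 330}}]]];
no  = Catenate[ImageData[ImageTrim[image, {{1, 1}, {120, 120}}]]];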



Then define a net and train it


image = Import["https://i.stack.imgur.com/zP5xF.jpg"];
(* no and yes are the lists of background and foreground pixel values selected above *)
trainData = Join[Thread[Rule[no, False]], Thread[Rule[yes, True]]];
net = NetChain[{20, Tanh, 2,
    SoftmaxLayer["Output" -> NetDecoder[{"Class", {True, False}}]]},
   "Input" -> 3];
ringQ = NetTrain[net, trainData, MaxTrainingRounds -> 20]


Be patient and wait a few minutes; then you get your ring. The final result depends on your training data and a bit of luck.


Image[Map[If[ringQ[#], #, N@{1, 1, 1}] &, ImageData[image], {2}]]


We can use the method above to refine this in a follow-up step.
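A minimal sketch of one possible refinement, assuming the goal is to turn the per-pixel classification into a cleaned-up alpha mask; the morphological cleanup and the parameter values are my own guesses, not from the original answer.

rawMask = Image[Map[Boole[ringQ[#]] &, ImageData[image], {2}]];
SetAlphaChannel[image,
 Blur[FillingTransform[DeleteSmallComponents[Binarize[rawMask]]], 5]]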


Method two: use the built-in function Classify


The result of this method is not bad, but I should admit that this code took a whole night to run, which means it is even slower than the NetChain approach. First, make some sample data:




(* here no and yes are small sample images cropped from the background and the ring *)
match = Classify[<|False -> Catenate[ImageData[no]],
    True -> Catenate[ImageData[yes]]|>];
ImageApply[If[match[#], #, {1, 1, 1}] &, image]

Be more patient, please: after just one night, the result will appear, like this:
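One practical way to make either pixel classifier bearable (my own suggestion, not part of the original answer) is to classify a downsampled copy first; the scale factor here is arbitrary.

small = ImageResize[image, Scaled[0.25]];
ImageApply[If[match[#], #, {1, 1, 1}] &, small]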



image = Import["https://i.stack.imgur.com/zP5xF.jpg"];

Method one



SetAlphaChannel[image, 
Erosion[Blur[
DeleteSmallComponents[
FillingTransform[Binarize[GradientFilter[image, 1], 0.035]]], 10],
1]]


Method two


SetAlphaChannel[image,
 Blur[Binarize[
   Image[WatershedComponents[GradientFilter[image, 2],
      Method -> {"MinimumSaliency", 0.2}] - 1]], 5]]


Method three


SetAlphaChannel[image,
 Blur[FillingTransform[
   MorphologicalBinarize[
    ColorNegate[
     First[ColorSeparate[ColorConvert[image, "CMYK"]]]], {.6, .93}]], 7]]


Last but not least, this method does a principal-component decomposition of the color channels, which copes with a wider range of situations:


First[KarhunenLoeveDecomposition[
ColorCombine /@ Tuples[ColorSeparate[image], {3}]]]


Note that pictures 2 through 5 all have stronger contrast than the original. We can then use the first three methods on them for the next step.
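For example, a sketch of feeding one of the higher-contrast KL channels back into Method one; the channel index 2 is an assumption, and the thresholds are carried over from above, so both may need adjusting by eye.

klChannels = First[KarhunenLoeveDecomposition[
    ColorCombine /@ Tuples[ColorSeparate[image], {3}]]];
contrasted = klChannels[[2]];
SetAlphaChannel[image,
 Erosion[
  Blur[DeleteSmallComponents[
    FillingTransform[Binarize[GradientFilter[contrasted, 1], 0.035]]], 10], 1]]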


