
TypeError: UMat() missing required argument 'ranges' (pos 2)

I'm trying to write a simple face detection script, but I can't understand why it raises this error:

import cv2

def detect(img):
    # Wrap the image in a UMat so OpenCV can run on the GPU via OpenCL
    img_umat = cv2.UMat(img)
    gray = cv2.cvtColor(img_umat, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.4, minNeighbors=3)

    if len(faces) == 0:
        return None, None, None, None

    # Return the bounding box of the first detected face
    x, y, w, h = faces[0]
    return x, y, w, h
 img_umat = cv2.UMat(img)
TypeError: UMat() missing required argument 'ranges' (pos 2)

What's wrong with it?

1 Answer

Try changing the dtype of img before wrapping it. I suspect img is a float16 array, which cv2.UMat doesn't accept; convert it to float32 first.

OpenCV often gives weird errors like this one that turn out to be related to the dtype of the input.
