I've been experimenting with a homemade stereo camera, using a pair of webcams. I've taken several pictures and used OpenCV for calibration.
Rectified sample images:
The calibration chessboards seem to be lined up horizontally.
But when I generate a disparity map based on these instructions, the result seems meaningless.
My disparity-map code is fairly trivial:
stereo = cv2.StereoBM_create(numDisparities=16, blockSize=15)
disparity = stereo.compute(image_left, image_right)
cv2.imwrite('try2.ppm', disparity)
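For reference, image_left and image_right are the rectified pair loaded as grayscale, roughly like this (the file names here are just placeholders):

import cv2

# StereoBM expects single-channel 8-bit images
image_left = cv2.imread('rectified_left.png', cv2.IMREAD_GRAYSCALE)
image_right = cv2.imread('rectified_right.png', cv2.IMREAD_GRAYSCALE)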
Where am I going wrong?
You should calibrate the cameras with calibration-pattern images taken at several different distances. Then compute disparity maps for scenes at different distances from the camera and find the range where the results look best.
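If it helps, here's a minimal sketch of that second step: sweep the disparity range, convert StereoBM's fixed-point output (int16, scaled by 16) into a viewable 8-bit image, and see which setting gives a usable map. The file names and parameter values below are just placeholders, not your actual data:

import cv2
import numpy as np

image_left = cv2.imread('rectified_left.png', cv2.IMREAD_GRAYSCALE)
image_right = cv2.imread('rectified_right.png', cv2.IMREAD_GRAYSCALE)

# numDisparities sets how much depth range is covered and must be a multiple of 16
for num_disp in (16, 32, 64, 128):
    stereo = cv2.StereoBM_create(numDisparities=num_disp, blockSize=15)
    # compute() returns int16 disparity scaled by 16; convert to pixels
    disparity = stereo.compute(image_left, image_right).astype(np.float32) / 16.0
    # stretch to 0-255 so the saved image is actually viewable
    vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    cv2.imwrite(f'disparity_{num_disp}.png', vis)

Saving the raw int16 result directly often looks like noise, which may be part of why the map seems meaningless.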