Tags: c++, pointers, rgb, intel-realsense

Pointer Exception while getting RGB values from (video) frame Intel Realsense


I'm trying to read the RGB values of individual pixels from a frame with the RealSense SDK. This is for a 3D depth camera with an RGB stream. According to https://github.com/IntelRealSense/librealsense/issues/3364 I need to use

int i = 100, j = 100; // fetch pixel 100,100
rs2::frame rgb = ...
auto ptr = (uint8_t*)rgb.get_data();
auto stride = rgb.as<rs2::video_frame>().stride();
cout << "R=" << ptr[3*(i * stride + j)];
cout << ", G=" << ptr[3*(i * stride + j) + 1];
cout << ", B=" << ptr[3*(i * stride + j) + 2];

In my code I get a pointer exception when I try to read the values for pixel (x,y) = (1000,1000). With (x,y) = (100,100) it works every time... Error: Exception thrown: read access violation. ptr was 0x11103131EB9192A.


I enable the stream with cfg.enable_stream(RS2_STREAM_COLOR, WIDTH_COLOR_FRAME, HEIGTH_COLOR_FRAME, RS2_FORMAT_RGB8, 15); where the .h file defines:

#define WIDTH_COLOR_FRAME   1920
#define HEIGTH_COLOR_FRAME  1080
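
(For context, a minimal sketch of how such a configuration is usually started with the librealsense2 pipeline API; the pipeline setup itself isn't shown in the question:)

    #include <librealsense2/rs.hpp>

    #define WIDTH_COLOR_FRAME   1920
    #define HEIGTH_COLOR_FRAME  1080

    rs2::config cfg;
    // 1920x1080 color stream, 3 bytes per pixel (RGB8), 15 FPS
    cfg.enable_stream(RS2_STREAM_COLOR, WIDTH_COLOR_FRAME, HEIGTH_COLOR_FRAME,
                      RS2_FORMAT_RGB8, 15);

    rs2::pipeline pl;
    pl.start(cfg);   // start streaming with this configuration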

This is my code. Maybe it has something to do with the RS2_FORMAT_RGB8?

frameset frames = pl.wait_for_frames();
frame color = frames.get_color_frame();

uint8_t* ptr = (uint8_t*)color.get_data();
int stride = color.as<video_frame>().get_stride_in_bytes();

int i = 1000, j = 1000; // fetch pixel 1000,1000

cout << "R=" << int(ptr[3 * (i * stride + j)]);
cout << ", G=" << int(ptr[3 * (i * stride + j) + 1]);
cout << ", B=" << int(ptr[3 * (i * stride + j) + 2]);
cout << endl;

Thanks in advance!


Solution

  • stride is already in bytes (the length of one row in bytes), so it must not be multiplied by 3; only the column index j needs the factor of 3, because each RGB8 pixel occupies 3 bytes within the row.

    cout << "  R= " << int(ptr[i * stride + (3*j)    ]);
    cout << ", G= " << int(ptr[i * stride + (3*j) + 1]);
    cout << ", B= " << int(ptr[i * stride + (3*j) + 2]);