I’m tinkering with a few USB bore cams and a stereoscopic (dual-camera) webcam. I’m looking at the Video demo, and I need to be able to grab a pair of frames from 2 sources synchronised (or at least as close to each other in time as possible / reasonable, given USB capabilities and throughput…)
My question is about the Video module: I can’t see how grabbing frames from 2 cameras can be done by one module. How can I identify the source camera device in an imageReceived callback? The only thing I can think of is creating 2 camera devices and registering 2 separate callbacks somehow (my CPP is still awful).
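To frame what I mean, here’s a rough sketch of the two-listener pattern I’m imagining. The types here are made up for illustration (they stand in for the real camera/listener classes, they’re not the actual module API): each camera gets its own listener tagged with a device index, both forward into a small pairer that matches up the latest frame from each side by timestamp.

```cpp
#include <cstdint>
#include <functional>
#include <optional>
#include <utility>

// Illustrative stand-in for the real frame type, not the actual API.
struct Frame { int64_t timestampMs; int data; };

// Collects the latest frame from each of two cameras and emits a pair once
// both sides have arrived. If the two timestamps differ by more than
// `toleranceMs`, the older frame is dropped and we wait for a closer match.
// (Real camera callbacks would arrive on background threads, so a mutex
// around frameReceived() would be needed; omitted here for brevity.)
class StereoPairer
{
public:
    StereoPairer (int64_t toleranceMs,
                  std::function<void (const Frame&, const Frame&)> onPair)
        : tolerance (toleranceMs), emit (std::move (onPair)) {}

    // Called from each camera's listener with its device index (0 or 1),
    // which is how the shared handler tells the two sources apart.
    void frameReceived (int cameraIndex, const Frame& f)
    {
        latest[cameraIndex] = f;

        if (latest[0] && latest[1])
        {
            int64_t dt = latest[0]->timestampMs - latest[1]->timestampMs;
            if (dt < 0) dt = -dt;

            if (dt <= tolerance)
            {
                emit (*latest[0], *latest[1]);
                latest[0].reset();
                latest[1].reset();
            }
            else if (latest[0]->timestampMs < latest[1]->timestampMs)
            {
                latest[0].reset();   // drop the stale left frame
            }
            else
            {
                latest[1].reset();   // drop the stale right frame
            }
        }
    }

private:
    int64_t tolerance;
    std::function<void (const Frame&, const Frame&)> emit;
    std::optional<Frame> latest[2];
};
```

Each real imageReceived callback would then just call `pairer.frameReceived (myIndex, frame)`, so one shared handler sees both cameras but can still tell them apart.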
I’d ideally like to implement this on a Raspberry Pi, too, and I’m generally curious why Linux has no camera implementation. (I can imagine it’s no walk in the park, but is it just not possible to open a /dev/video device and process it, as Android presumably does?)
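For reference, this is the sort of minimal V4L2 probing I had in mind on the Pi. Just a sketch with error handling stripped down, and it assumes the standard Linux V4L2 headers:

```cpp
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <cstdio>

// Try to open a V4L2 device node and query its capabilities.
// Returns true (and fills `cardName`) only if the node exists and
// advertises video capture; returns false otherwise.
bool queryCamera (const char* path, char* cardName, size_t cardLen)
{
    int fd = open (path, O_RDWR | O_NONBLOCK);
    if (fd < 0)
        return false;

    v4l2_capability cap {};
    bool ok = ioctl (fd, VIDIOC_QUERYCAP, &cap) == 0
              && (cap.capabilities & V4L2_CAP_VIDEO_CAPTURE) != 0;

    if (ok)
        snprintf (cardName, cardLen, "%s", cap.card);

    close (fd);
    return ok;
}
```

Frames would then come from `VIDIOC_REQBUFS` / `mmap` / `VIDIOC_DQBUF`, which is where the real work is, but the enumeration step at least looks straightforward.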
I’ve tinkered with some simple MJPEG code, but haven’t had a chance to test it. (It ought to do something handy on Linux.)
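The MJPEG part seems simple enough in principle: each JPEG frame in the stream is delimited by the SOI (FF D8) and EOI (FF D9) markers, and FF bytes inside the entropy-coded data are byte-stuffed, so a naive marker scan should find frame boundaries. Something like:

```cpp
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

// Scan a raw MJPEG byte stream and return (offset, length) for each
// complete JPEG frame, delimited by SOI (FF D8) and EOI (FF D9).
std::vector<std::pair<size_t, size_t>> findJpegFrames (const std::vector<uint8_t>& buf)
{
    std::vector<std::pair<size_t, size_t>> frames;
    size_t start = 0;
    bool inFrame = false;

    for (size_t i = 0; i + 1 < buf.size(); ++i)
    {
        if (buf[i] == 0xFF && buf[i + 1] == 0xD8 && ! inFrame)
        {
            start = i;          // start of a new frame
            inFrame = true;
        }
        else if (buf[i] == 0xFF && buf[i + 1] == 0xD9 && inFrame)
        {
            frames.emplace_back (start, i + 2 - start);   // include the EOI bytes
            inFrame = false;
        }
    }

    return frames;
}
```

Each (offset, length) slice could then be handed straight to a JPEG decoder, one frame at a time.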