GitHub Repository: Tetragramm/opencv
Path: blob/master/modules/videoio/src/cap_v4l.cpp
/* This is the contributed code:

File:             cvcap_v4l.cpp
Current Location: ../opencv-0.9.6/otherlibs/videoio

Original Version: 2003-03-12  Magnus Lundin [email protected]
Original Comments:

ML: This set of files adds support for FireWire and USB cameras.
First it tries to install a FireWire camera;
if that fails, it tries a v4l/USB camera.
It has been tested with the motempl sample program.

First Patch: August 24, 2004 Travis Wood [email protected]
For Release: OpenCV-Linux Beta4 opencv-0.9.6
Tested On:   LMLBT44 with 8 video inputs
Problems?    Post your questions at answers.opencv.org,
             report bugs at code.opencv.org,
             submit your fixes at https://github.com/opencv/opencv/
Patched Comments:

TW: The cv cam utils that came with the initial release of OpenCV for LINUX Beta4
were not working. I have rewritten them so they work for me. At the same time, trying
to keep the original code as ML wrote it as unchanged as possible. No one likes to debug
someone else's code, so I resisted changes as much as possible. I have tried to keep the
same "ideas" where applicable, that is, where I could figure out what the previous author
intended. Some areas I just could not help myself and had to "spiffy-it-up" my way.

These drivers should work with V4L frame capture cards other than my bttv-driven
frame capture card.

Rewritten driver for standard V4L mode. Tested using an LMLBT44 video capture card.
Standard bttv drivers are on the LMLBT44 with up to 8 inputs.

This utility was written with the help of the document:
http://pages.cpsc.ucalgary.ca/~sayles/VFL_HowTo
as a general guide for interfacing into the V4l standard.

Made the index value passed for icvOpenCAM_V4L(index) be the number of the
video device source in the /dev tree. The -1 uses the original /dev/video.

Index   Device
  0     /dev/video0
  1     /dev/video1
  2     /dev/video2
  3     /dev/video3
  ...
  7     /dev/video7
with
 -1     /dev/video

TW: You can select any video source, but this package was limited from the start to only
ONE camera opened at any ONE time.
This is an original program limitation.
If you are interested, I will make my version available to other OpenCV users. The big
difference in mine is that you may pass the camera number as part of the cv argument, but this
convention is non-standard for current OpenCV calls and the camera number is not currently
passed into the called routine.

Second Patch: August 28, 2004 Sfuncia Fabio [email protected]
For Release:  OpenCV-Linux Beta4 Opencv-0.9.6

FS: This patch fixes non-sequential device indices (unplugged devices) and reports the real numCameras.
For index -1 (icvOpenCAM_V4L) I do not use /dev/video but the first real device available, because
if /dev/video is a link to /dev/video0 and the device on /dev/video0 is unplugged, /dev/video
is a dangling link. I search for the first available device with indexList.

Third Patch: December 9, 2004 Frederic Devernay [email protected]
For Release: OpenCV-Linux Beta4 Opencv-0.9.6

[FD] I modified the following:
 - handle YUV420P, YUV420, and YUV411P palettes (for many webcams) without using floating-point
 - cvGrabFrame should not wait for the end of the first frame, and should return quickly
   (see videoio doc)
 - cvRetrieveFrame should in turn wait for the end of frame capture, and should not
   trigger the capture of the next frame (the user chooses when to do it using GrabFrame).
   To get the old behavior, re-call cvRetrieveFrame just after cvGrabFrame.
 - having global bufferIndex and FirstCapture variables makes the code non-reentrant
   (e.g. when using several cameras); put these in the CvCapture struct.
 - according to V4L HowTo, incrementing the buffer index must be done before VIDIOCMCAPTURE.
 - the VID_TYPE_SCALES stuff from V4L HowTo is wrong: image size can be changed
   even if the hardware does not support scaling (e.g. webcams can have several
   resolutions available). Just don't try to set the size at 640x480 if the hardware supports
   scaling: open with the default (probably best) image size, and let the user scale it
   using SetProperty.
 - image size can be changed by two subsequent calls to SetProperty (for width and height)
 - bug fix: if the image size changes, realloc the new image only when it is grabbed
 - issue errors only when necessary, fix error message formatting.

Fourth Patch: Sept 7, 2005 Csaba Kertesz [email protected]
For Release:  OpenCV-Linux Beta5 OpenCV-0.9.7

I modified the following:
 - Additional Video4Linux2 support :)
 - Use mmap functions (v4l2)
 - New methods are internal:
     try_palette_v4l2 -> rewrite try_palette for v4l2
     mainloop_v4l2, read_image_v4l2 -> these methods are moved from the official v4l2 capture.c example
     try_init_v4l -> device v4l initialisation
     try_init_v4l2 -> device v4l2 initialisation
     autosetup_capture_mode_v4l -> autodetect capture modes for v4l
     autosetup_capture_mode_v4l2 -> autodetect capture modes for v4l2
 - Modifications follow the conventions of the old Video4Linux code
 - Falls back to Video4Linux handling automatically if a Video4Linux2 device is not recognized
 - Tested successfully with Logitech Quickcam Express (V4L), Creative Vista (V4L) and Genius VideoCam Notebook (V4L2)
 - Corrected source lines that produced compiler warnings
 - Added an informational message for v4l/v4l2 detection

Fifth Patch: Sept 7, 2005 Csaba Kertesz [email protected]
For Release: OpenCV-Linux Beta5 OpenCV-0.9.7

I modified the following:
 - Support for SN9C10x chip based webcams
 - New methods are internal:
     bayer2rgb24, sonix_decompress -> decoder routines for SN9C10x decoding from Takafumi Mizuno <[email protected]> with his pleasure :)
 - Tested successfully with Genius VideoCam Notebook (V4L2)

Sixth Patch: Sept 10, 2005 Csaba Kertesz [email protected]
For Release: OpenCV-Linux Beta5 OpenCV-0.9.7

I added the following:
 - Capture control support (hue, saturation, brightness, contrast, gain)
 - Get and change V4L capture controls (hue, saturation, brightness, contrast)
 - New method is internal:
     icvSetControl -> set capture controls
 - Tested successfully with Creative Vista (V4L)

Seventh Patch: Sept 10, 2005 Csaba Kertesz [email protected]
For Release: OpenCV-Linux Beta5 OpenCV-0.9.7

I added the following:
 - Detect, get and change V4L2 capture controls (hue, saturation, brightness, contrast, gain)
 - New methods are internal:
     v4l2_scan_controls_enumerate_menu, v4l2_scan_controls -> detect capture control intervals
 - Tested successfully with Genius VideoCam Notebook (V4L2)

8th patch: Jan 5, 2006, [email protected]
Added support for V4L2_PIX_FMT_YUYV and V4L2_PIX_FMT_MJPEG.
With this patch, newer Logitech webcams, such as the QuickCam Fusion, work.
Note: to use these webcams, look at the UVC driver at
http://linux-uvc.berlios.de/

9th patch: Mar 4, 2006, [email protected]
 - Try V4L2 before V4L, because some devices are V4L2 by default
   but only try to implement the V4L compatibility layer,
   so it is better to support V4L2 first.
 - Better separation between V4L2 and V4L initialization (this was needed to support
   some drivers that work, but not fully, with V4L2, since we do not know when we
   need to switch from V4L2 to V4L).

10th patch: July 02, 2008, Mikhail Afanasyev [email protected]
Fixed reliability problems with high-resolution UVC cameras on Linux;
the symptoms were damaged images and 'Corrupt JPEG data: premature end of data segment' on stderr.
 - V4L_ABORT_BADJPEG detects JPEG warnings and turns them into errors, so bad images
   can be filtered out
 - USE_TEMP_BUFFER fixes the main problem (improper buffer management) and
   prevents bad images in the first place

11th patch: April 2, 2013, Forrest Reiling [email protected]
Added v4l2 support for getting the capture property CV_CAP_PROP_POS_MSEC.
Returns the millisecond timestamp of the last frame grabbed, or 0 if no frames have been grabbed.
Used to successfully synchronize 2 Logitech C310 USB webcams to within 16 ms of one another.

12th patch: March 9, 2018, Taylor Lanclos <[email protected]>
Added support for CV_CAP_PROP_BUFFERSIZE.

make & enjoy!

*/

/*M///////////////////////////////////////////////////////////////////////////////////////
//
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
//
//                        Intel License Agreement
//                For Open Source Computer Vision Library
//
// Copyright (C) 2000, Intel Corporation, all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
//   * Redistribution's of source code must retain the above copyright notice,
//     this list of conditions and the following disclaimer.
//
//   * Redistribution's in binary form must reproduce the above copyright notice,
//     this list of conditions and the following disclaimer in the documentation
//     and/or other materials provided with the distribution.
//
//   * The name of Intel Corporation may not be used to endorse or promote products
//     derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/

#include "precomp.hpp"

#if !defined _WIN32 && (defined HAVE_CAMV4L2 || defined HAVE_VIDEOIO)

#include <stdio.h>
#include <unistd.h>
#include <fcntl.h>
#include <errno.h>
#include <sys/ioctl.h>
#include <sys/types.h>
#include <sys/mman.h>

#include <string.h>
#include <stdlib.h>
#include <assert.h>
#include <sys/stat.h>

#ifdef HAVE_CAMV4L2
#include <asm/types.h>          /* for videodev2.h */
#include <linux/videodev2.h>
#endif

#ifdef HAVE_VIDEOIO
// NetBSD compatibility layer with V4L2
#include <sys/videoio.h>
#endif

/* Defaults - If your board can do better, set it here.  Set for the most common type inputs. */
#define DEFAULT_V4L_WIDTH  640
#define DEFAULT_V4L_HEIGHT 480
#define DEFAULT_V4L_FPS 30

#define CHANNEL_NUMBER 1
#define MAX_CAMERAS 8


// default and maximum number of V4L buffers, not including last, 'special' buffer
#define MAX_V4L_BUFFERS 10
#define DEFAULT_V4L_BUFFERS 4

// if enabled, then bad JPEG warnings become errors and cause NULL returned instead of image
#define V4L_ABORT_BADJPEG

#define MAX_DEVICE_DRIVER_NAME 80

namespace cv {

/* Device Capture Objects */
/* V4L2 structure */
struct buffer
{
    void *  start;
    size_t  length;
};

struct CvCaptureCAM_V4L CV_FINAL : public CvCapture
{
    int getCaptureDomain() /*const*/ CV_OVERRIDE { return cv::CAP_V4L; }

    int deviceHandle;
    int bufferIndex;
    int FirstCapture;
    String deviceName;

    char *memoryMap;
    IplImage frame;

    __u32 palette;
    int width, height;
    int width_set, height_set;
    int bufferSize;
    __u32 fps;
    bool convert_rgb;
    bool frame_allocated;
    bool returnFrame;

    /* V4L2 variables */
    buffer buffers[MAX_V4L_BUFFERS + 1];
    v4l2_capability cap;
    v4l2_input inp;
    v4l2_format form;
    v4l2_crop crop;
    v4l2_cropcap cropcap;
    v4l2_requestbuffers req;
    v4l2_buf_type type;
    v4l2_queryctrl queryctrl;

    timeval timestamp;

    /* V4L2 control variables */
    Range focus, brightness, contrast, saturation, hue, gain, exposure;

    bool open(int _index);
    bool open(const char* deviceName);

    virtual double getProperty(int) const CV_OVERRIDE;
    virtual bool setProperty(int, double) CV_OVERRIDE;
    virtual bool grabFrame() CV_OVERRIDE;
    virtual IplImage* retrieveFrame(int) CV_OVERRIDE;

    Range getRange(int property_id) const {
        switch (property_id) {
        case CV_CAP_PROP_BRIGHTNESS:
            return brightness;
        case CV_CAP_PROP_CONTRAST:
            return contrast;
        case CV_CAP_PROP_SATURATION:
            return saturation;
        case CV_CAP_PROP_HUE:
            return hue;
        case CV_CAP_PROP_GAIN:
            return gain;
        case CV_CAP_PROP_EXPOSURE:
            return exposure;
        case CV_CAP_PROP_FOCUS:
            return focus;
        case CV_CAP_PROP_AUTOFOCUS:
            return Range(0, 1);
        case CV_CAP_PROP_AUTO_EXPOSURE:
            return Range(0, 4);
        default:
            return Range(0, 255);
        }
    }

    virtual ~CvCaptureCAM_V4L();
};

static void icvCloseCAM_V4L( CvCaptureCAM_V4L* capture );

static bool icvGrabFrameCAM_V4L( CvCaptureCAM_V4L* capture );
static IplImage* icvRetrieveFrameCAM_V4L( CvCaptureCAM_V4L* capture, int );

static double icvGetPropertyCAM_V4L( const CvCaptureCAM_V4L* capture, int property_id );
static int icvSetPropertyCAM_V4L( CvCaptureCAM_V4L* capture, int property_id, double value );

/*********************** Implementations ***************************************/

CvCaptureCAM_V4L::~CvCaptureCAM_V4L() {
    icvCloseCAM_V4L(this);
}

static bool try_palette_v4l2(CvCaptureCAM_V4L* capture)
{
    capture->form = v4l2_format();
    capture->form.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    capture->form.fmt.pix.pixelformat = capture->palette;
    capture->form.fmt.pix.field = V4L2_FIELD_ANY;
    capture->form.fmt.pix.width = capture->width;
    capture->form.fmt.pix.height = capture->height;

    if (-1 == ioctl (capture->deviceHandle, VIDIOC_S_FMT, &capture->form))
        return false;

    return capture->palette == capture->form.fmt.pix.pixelformat;
}

static int try_init_v4l2(CvCaptureCAM_V4L* capture, const char *deviceName)
{
    // Test the device for V4L2 compatibility
    // Return value:
    // -1 : unable to open device
    //  0 : detected nothing
    //  1 : V4L2 device

    int deviceIndex;

    /* Open and test V4L2 device */
    capture->deviceHandle = open (deviceName, O_RDWR /* required */ | O_NONBLOCK, 0);
    if (-1 == capture->deviceHandle)
    {
#ifndef NDEBUG
        fprintf(stderr, "(DEBUG) try_init_v4l2 open \"%s\": %s\n", deviceName, strerror(errno));
#endif
        icvCloseCAM_V4L(capture);
        return -1;
    }

    capture->cap = v4l2_capability();
    if (-1 == ioctl (capture->deviceHandle, VIDIOC_QUERYCAP, &capture->cap))
    {
#ifndef NDEBUG
        fprintf(stderr, "(DEBUG) try_init_v4l2 VIDIOC_QUERYCAP \"%s\": %s\n", deviceName, strerror(errno));
#endif
        icvCloseCAM_V4L(capture);
        return 0;
    }

    /* Query the index of the current video input */
    if (-1 == ioctl (capture->deviceHandle, VIDIOC_G_INPUT, &deviceIndex))
    {
#ifndef NDEBUG
        fprintf(stderr, "(DEBUG) try_init_v4l2 VIDIOC_G_INPUT \"%s\": %s\n", deviceName, strerror(errno));
#endif
        icvCloseCAM_V4L(capture);
        return 0;
    }

    /* Query information about the current input */
    capture->inp = v4l2_input();
    capture->inp.index = deviceIndex;
    if (-1 == ioctl (capture->deviceHandle, VIDIOC_ENUMINPUT, &capture->inp))
    {
#ifndef NDEBUG
        fprintf(stderr, "(DEBUG) try_init_v4l2 VIDIOC_ENUMINPUT \"%s\": %s\n", deviceName, strerror(errno));
#endif
        icvCloseCAM_V4L(capture);
        return 0;
    }

    return 1;
}

static int autosetup_capture_mode_v4l2(CvCaptureCAM_V4L* capture) {
    // if the palette is already set and works, there is no need to set it up again
    if (capture->palette != 0 && try_palette_v4l2(capture)) {
        return 0;
    }
    __u32 try_order[] = {
        V4L2_PIX_FMT_BGR24,
        V4L2_PIX_FMT_RGB24,
        V4L2_PIX_FMT_YVU420,
        V4L2_PIX_FMT_YUV420,
        V4L2_PIX_FMT_YUV411P,
        V4L2_PIX_FMT_YUYV,
        V4L2_PIX_FMT_UYVY,
        V4L2_PIX_FMT_SBGGR8,
        V4L2_PIX_FMT_SGBRG8,
        V4L2_PIX_FMT_SN9C10X,
#ifdef HAVE_JPEG
        V4L2_PIX_FMT_MJPEG,
        V4L2_PIX_FMT_JPEG,
#endif
        V4L2_PIX_FMT_Y16,
        V4L2_PIX_FMT_GREY
    };

    for (size_t i = 0; i < sizeof(try_order) / sizeof(__u32); i++) {
        capture->palette = try_order[i];
        if (try_palette_v4l2(capture)) {
            return 0;
        }
    }

    fprintf(stderr,
            "VIDEOIO ERROR: V4L2: Pixel format of incoming image is unsupported by OpenCV\n");
    icvCloseCAM_V4L(capture);
    return -1;
}

static void v4l2_control_range(CvCaptureCAM_V4L* cap, __u32 id)
{
    cap->queryctrl = v4l2_queryctrl();
    cap->queryctrl.id = id;

    if (0 != ioctl(cap->deviceHandle, VIDIOC_QUERYCTRL, &cap->queryctrl))
    {
        if (errno != EINVAL)
            perror ("VIDIOC_QUERYCTRL");
        return;
    }

    if (cap->queryctrl.flags & V4L2_CTRL_FLAG_DISABLED)
        return;

    Range range(cap->queryctrl.minimum, cap->queryctrl.maximum);

    switch(cap->queryctrl.id) {
    case V4L2_CID_BRIGHTNESS:
        cap->brightness = range;
        break;
    case V4L2_CID_CONTRAST:
        cap->contrast = range;
        break;
    case V4L2_CID_SATURATION:
        cap->saturation = range;
        break;
    case V4L2_CID_HUE:
        cap->hue = range;
        break;
    case V4L2_CID_GAIN:
        cap->gain = range;
        break;
    case V4L2_CID_EXPOSURE_ABSOLUTE:
        cap->exposure = range;
        break;
    case V4L2_CID_FOCUS_ABSOLUTE:
        cap->focus = range;
        break;
    }
}

static void v4l2_scan_controls(CvCaptureCAM_V4L* capture)
{
    __u32 ctrl_id;

    for (ctrl_id = V4L2_CID_BASE; ctrl_id < V4L2_CID_LASTP1; ctrl_id++)
    {
        v4l2_control_range(capture, ctrl_id);
    }

    for (ctrl_id = V4L2_CID_PRIVATE_BASE;; ctrl_id++)
    {
        errno = 0;

        v4l2_control_range(capture, ctrl_id);

        if (errno)
            break;
    }

    v4l2_control_range(capture, V4L2_CID_FOCUS_ABSOLUTE);
}

static int v4l2_set_fps(CvCaptureCAM_V4L* capture) {
    v4l2_streamparm setfps = v4l2_streamparm();
    setfps.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    setfps.parm.capture.timeperframe.numerator = 1;
    setfps.parm.capture.timeperframe.denominator = capture->fps;
    return ioctl (capture->deviceHandle, VIDIOC_S_PARM, &setfps);
}

static int v4l2_num_channels(__u32 palette) {
    switch(palette) {
    case V4L2_PIX_FMT_YVU420:
    case V4L2_PIX_FMT_YUV420:
    case V4L2_PIX_FMT_MJPEG:
    case V4L2_PIX_FMT_JPEG:
    case V4L2_PIX_FMT_Y16:
    case V4L2_PIX_FMT_GREY:
        return 1;
    case V4L2_PIX_FMT_YUYV:
    case V4L2_PIX_FMT_UYVY:
        return 2;
    case V4L2_PIX_FMT_BGR24:
    case V4L2_PIX_FMT_RGB24:
        return 3;
    default:
        return 0;
    }
}

static void v4l2_create_frame(CvCaptureCAM_V4L *capture) {
    CvSize size = {capture->form.fmt.pix.width, capture->form.fmt.pix.height};
    int channels = 3;
    int depth = IPL_DEPTH_8U;

    if (!capture->convert_rgb) {
        channels = v4l2_num_channels(capture->palette);

        switch(capture->palette) {
        case V4L2_PIX_FMT_MJPEG:
        case V4L2_PIX_FMT_JPEG:
            size = cvSize(capture->buffers[capture->bufferIndex].length, 1);
            break;
        case V4L2_PIX_FMT_YVU420:
        case V4L2_PIX_FMT_YUV420:
            size.height = size.height * 3 / 2; // "1.5" channels
            break;
        case V4L2_PIX_FMT_Y16:
            depth = IPL_DEPTH_16U;
            break;
        }
    }

    /* Set up Image data */
    cvInitImageHeader(&capture->frame, size, depth, channels);

    /* Allocate space for the pixel format we convert to.
     * If we do not convert, the frame just points to the buffer.
     */
    if (capture->convert_rgb) {
        capture->frame.imageData = (char*)cvAlloc(capture->frame.imageSize);
    }

    capture->frame_allocated = capture->convert_rgb;
}

static int _capture_V4L2 (CvCaptureCAM_V4L *capture)
{
    const char* deviceName = capture->deviceName.c_str();
    if (try_init_v4l2(capture, deviceName) != 1) {
        /* init of the v4l2 device is not OK */
        return -1;
    }

    /* V4L2 control variables are zero-initialized */

    /* Scan V4L2 controls */
    v4l2_scan_controls(capture);

    if ((capture->cap.capabilities & V4L2_CAP_VIDEO_CAPTURE) == 0) {
        /* Nope. */
        fprintf( stderr, "VIDEOIO ERROR: V4L2: device %s is unable to capture video memory.\n",deviceName);
        icvCloseCAM_V4L(capture);
        return -1;
    }

    /* The following code sets the CHANNEL_NUMBER of the video input.  Some video sources
       have sub "Channel Numbers".  For a typical V4L TV capture card, this is usually 1.
       I myself am using a simple NTSC video input capture card that uses the value of 1.
       If you are not in North America or have a different video standard, you WILL have to change
       the following settings and recompile/reinstall.  This set of settings is based on
       the most commonly encountered input video source types (like my bttv card). */

    if (capture->inp.index > 0) {
        capture->inp = v4l2_input();
        capture->inp.index = CHANNEL_NUMBER;
        /* Set only channel number to CHANNEL_NUMBER */
        /* V4L2 has a status field for the selected video mode */
        if (-1 == ioctl (capture->deviceHandle, VIDIOC_ENUMINPUT, &capture->inp))
        {
            fprintf (stderr, "VIDEOIO ERROR: V4L2: Unable to set channel number\n");
            icvCloseCAM_V4L (capture);
            return -1;
        }
    } /* End if */

    /* Find Window info */
    capture->form = v4l2_format();
    capture->form.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

    if (-1 == ioctl (capture->deviceHandle, VIDIOC_G_FMT, &capture->form)) {
        fprintf( stderr, "VIDEOIO ERROR: V4L2: Could not obtain specifics of capture window.\n\n");
        icvCloseCAM_V4L(capture);
        return -1;
    }

    if (autosetup_capture_mode_v4l2(capture) == -1)
        return -1;

    /* try to set framerate */
    v4l2_set_fps(capture);

    unsigned int min;

    /* Buggy driver paranoia. */
    min = capture->form.fmt.pix.width * 2;

    if (capture->form.fmt.pix.bytesperline < min)
        capture->form.fmt.pix.bytesperline = min;

    min = capture->form.fmt.pix.bytesperline * capture->form.fmt.pix.height;

    if (capture->form.fmt.pix.sizeimage < min)
        capture->form.fmt.pix.sizeimage = min;

    capture->req = v4l2_requestbuffers();

    unsigned int buffer_number = capture->bufferSize;

try_again:

    capture->req.count = buffer_number;
    capture->req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    capture->req.memory = V4L2_MEMORY_MMAP;

    if (-1 == ioctl (capture->deviceHandle, VIDIOC_REQBUFS, &capture->req))
    {
        if (EINVAL == errno)
        {
            fprintf (stderr, "%s does not support memory mapping\n", deviceName);
        } else {
            perror ("VIDIOC_REQBUFS");
        }
        /* free capture, and return an error code */
        icvCloseCAM_V4L (capture);
        return -1;
    }

    if (capture->req.count < buffer_number)
    {
        if (buffer_number == 1)
        {
            fprintf (stderr, "Insufficient buffer memory on %s\n", deviceName);

            /* free capture, and return an error code */
            icvCloseCAM_V4L (capture);
            return -1;
        } else {
            buffer_number--;
            fprintf (stderr, "Insufficient buffer memory on %s -- decreasing buffers\n", deviceName);

            goto try_again;
        }
    }

    for (unsigned int n_buffers = 0; n_buffers < capture->req.count; ++n_buffers)
    {
        v4l2_buffer buf = v4l2_buffer();
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = n_buffers;

        if (-1 == ioctl (capture->deviceHandle, VIDIOC_QUERYBUF, &buf)) {
            perror ("VIDIOC_QUERYBUF");

            /* free capture, and return an error code */
            icvCloseCAM_V4L (capture);
            return -1;
        }

        capture->buffers[n_buffers].length = buf.length;
        capture->buffers[n_buffers].start =
            mmap (NULL /* start anywhere */,
                  buf.length,
                  PROT_READ | PROT_WRITE /* required */,
                  MAP_SHARED /* recommended */,
                  capture->deviceHandle, buf.m.offset);

        if (MAP_FAILED == capture->buffers[n_buffers].start) {
            perror ("mmap");

            /* free capture, and return an error code */
            icvCloseCAM_V4L (capture);
            return -1;
        }

        if (n_buffers == 0) {
            capture->buffers[MAX_V4L_BUFFERS].start = malloc( buf.length );
            capture->buffers[MAX_V4L_BUFFERS].length = buf.length;
        }
    }

    v4l2_create_frame(capture);

    // reinitialize buffers
    capture->FirstCapture = 1;

    return 1;
} /* End _capture_V4L2 */

/**
 * Some properties cannot be changed while the device is in streaming mode.
 * This method closes and re-opens the device to restart the stream.
 * This also causes buffers to be reallocated if the frame size was changed.
 */
static bool v4l2_reset( CvCaptureCAM_V4L* capture) {
    String deviceName = capture->deviceName;
    icvCloseCAM_V4L(capture);
    capture->deviceName = deviceName;
    return _capture_V4L2(capture) == 1;
}

bool CvCaptureCAM_V4L::open(int _index)
{
    cv::String name;
    /* Select camera, or rather, V4L video source */
    if (_index < 0) // Asking for the first device available
    {
        for (int autoindex = 0; autoindex < MAX_CAMERAS; ++autoindex)
        {
            name = cv::format("/dev/video%d", autoindex);
            /* Test with open() to see if this device name really exists. */
            int h = ::open(name.c_str(), O_RDONLY);
            if (h != -1)
            {
                ::close(h);
                _index = autoindex;
                break;
            }
        }
        if (_index < 0)
        {
            fprintf(stderr, "VIDEOIO ERROR: V4L: can't find camera device\n");
            name.clear();
            return false;
        }
    }
    else
    {
        name = cv::format("/dev/video%d", _index);
    }

    bool res = open(name.c_str());
    if (!res)
    {
        fprintf(stderr, "VIDEOIO ERROR: V4L: can't open camera by index %d\n", _index);
    }
    return res;
}

bool CvCaptureCAM_V4L::open(const char* _deviceName)
{
#ifndef NDEBUG
    fprintf(stderr, "(DEBUG) V4L: opening %s\n", _deviceName);
#endif
    FirstCapture = 1;
    width = DEFAULT_V4L_WIDTH;
    height = DEFAULT_V4L_HEIGHT;
    width_set = height_set = 0;
    bufferSize = DEFAULT_V4L_BUFFERS;
    fps = DEFAULT_V4L_FPS;
    convert_rgb = true;
    deviceName = _deviceName;
    returnFrame = true;

    return _capture_V4L2(this) == 1;
}

static int read_frame_v4l2(CvCaptureCAM_V4L* capture) {
    v4l2_buffer buf = v4l2_buffer();

    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;

    if (-1 == ioctl (capture->deviceHandle, VIDIOC_DQBUF, &buf)) {
        switch (errno) {
        case EAGAIN:
            return 0;

        case EIO:
            if (!(buf.flags & (V4L2_BUF_FLAG_QUEUED | V4L2_BUF_FLAG_DONE)))
            {
                if (ioctl(capture->deviceHandle, VIDIOC_QBUF, &buf) == -1)
                {
                    return 0;
                }
            }
            return 0;

        default:
            /* display the error and stop processing */
            capture->returnFrame = false;
            perror ("VIDIOC_DQBUF");
            return -1;
        }
    }

    assert(buf.index < capture->req.count);

    memcpy(capture->buffers[MAX_V4L_BUFFERS].start,
           capture->buffers[buf.index].start,
           capture->buffers[MAX_V4L_BUFFERS].length );
    capture->bufferIndex = MAX_V4L_BUFFERS;
    //printf("got data in buff %d, len=%d, flags=0x%X, seq=%d, used=%d)\n",
    //       buf.index, buf.length, buf.flags, buf.sequence, buf.bytesused);

    // set the timestamp in the capture struct to the timestamp of the most recent frame
    capture->timestamp = buf.timestamp;

    if (-1 == ioctl (capture->deviceHandle, VIDIOC_QBUF, &buf))
        perror ("VIDIOC_QBUF");

    return 1;
}

static int mainloop_v4l2(CvCaptureCAM_V4L* capture) {
    for (;;) {
        fd_set fds;
        struct timeval tv;
        int r;

        FD_ZERO (&fds);
        FD_SET (capture->deviceHandle, &fds);

        /* Timeout. */
        tv.tv_sec = 10;
        tv.tv_usec = 0;

        r = select (capture->deviceHandle+1, &fds, NULL, NULL, &tv);

        if (-1 == r) {
            if (EINTR == errno)
                continue;

            perror ("select");
        }

        if (0 == r) {
            fprintf (stderr, "select timeout\n");

            /* end the infinite loop */
            break;
        }

        int returnCode = read_frame_v4l2 (capture);
        if (returnCode == -1)
            return -1;
        if (returnCode == 1)
            return 1;
    }
    return 0;
}

static bool icvGrabFrameCAM_V4L(CvCaptureCAM_V4L* capture) {
    if (capture->FirstCapture) {
        /* Some general initialization must take place the first time through */

        /* This is just a technicality, but all buffers must be filled up before any
           staggered SYNC is applied.  SO, filler up. (see V4L HowTo) */

        {
            for (capture->bufferIndex = 0;
                 capture->bufferIndex < ((int)capture->req.count);
                 ++capture->bufferIndex)
            {
                v4l2_buffer buf = v4l2_buffer();

                buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                buf.memory = V4L2_MEMORY_MMAP;
                buf.index = (unsigned long)capture->bufferIndex;

                if (-1 == ioctl (capture->deviceHandle, VIDIOC_QBUF, &buf)) {
                    perror ("VIDIOC_QBUF");
                    return false;
                }
            }

            /* enable the streaming */
            capture->type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            if (-1 == ioctl (capture->deviceHandle, VIDIOC_STREAMON,
                             &capture->type)) {
                /* error enabling the stream */
                perror ("VIDIOC_STREAMON");
                return false;
            }
        }

#if defined(V4L_ABORT_BADJPEG)
        // skip the first frame. it is often bad -- this goes unnoticed in traditional apps,
        // but could be fatal if bad jpeg is enabled
        if (mainloop_v4l2(capture) != 1)
            return false;
#endif

        /* preparation is ok */
        capture->FirstCapture = 0;
    }

    if (mainloop_v4l2(capture) != 1) return false;

    return true;
}

/*
 * Turn a YUV4:2:0 block into an RGB block
 *
 * Video4Linux seems to use the blue, green, red channel
 * order convention -- rgb[0] is blue, rgb[1] is green, rgb[2] is red.
 *
 * Color space conversion coefficients taken from the excellent
 * http://www.inforamp.net/~poynton/ColorFAQ.html
 * In his terminology, this is a CCIR 601.1 YCbCr -> RGB.
 * Y values are given for all 4 pixels, but the U (Pb)
 * and V (Pr) are assumed constant over the 2x2 block.
 *
 * To avoid floating point arithmetic, the color conversion
 * coefficients are scaled into 16.16 fixed-point integers.
 * They were determined as follows:
 *
 *  double brightness = 1.0;  (0->black; 1->full scale)
 *  double saturation = 1.0;  (0->greyscale; 1->full color)
 *  double fixScale = brightness * 256 * 256;
 *  int rvScale = (int)(1.402 * saturation * fixScale);
 *  int guScale = (int)(-0.344136 * saturation * fixScale);
 *  int gvScale = (int)(-0.714136 * saturation * fixScale);
 *  int buScale = (int)(1.772 * saturation * fixScale);
 *  int yScale = (int)(fixScale);
 */
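
/* As a worked check of the derivation above: with saturation = 1.0 and
 * fixScale = 65536, truncation toward zero gives
 *   (int)( 1.402    * 65536) =  91881
 *   (int)(-0.344136 * 65536) = -22553
 *   (int)(-0.714136 * 65536) = -46801
 *   (int)( 1.772    * 65536) = 116129
 * which are exactly the constants hard-coded in move_411_block() below.
 */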

/* LIMIT: convert a 16.16 fixed-point value to a byte, with clipping. */
#define LIMIT(x) ((x)>0xffffff?0xff: ((x)<=0xffff?0:((x)>>16)))
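
// Illustrative compile-time checks of LIMIT's clipping behaviour (all three
// operands are integer constant expressions, so C++11 static_assert applies):
static_assert(LIMIT(0x2000000) == 0xff, "values above 16.16 full scale clip to 255");
static_assert(LIMIT(-100) == 0, "negative values clip to 0");
static_assert(LIMIT(0x20000) == 2, "in-range values are shifted down to 8 bits");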

static inline void
move_411_block(int yTL, int yTR, int yBL, int yBR, int u, int v,
               int /*rowPixels*/, unsigned char * rgb)
{
    const int rvScale = 91881;
    const int guScale = -22553;
    const int gvScale = -46801;
    const int buScale = 116129;
    const int yScale  = 65536;
    int r, g, b;

    g = guScale * u + gvScale * v;
//  if (force_rgb) {
//      r = buScale * u;
//      b = rvScale * v;
//  } else {
        r = rvScale * v;
        b = buScale * u;
//  }

    yTL *= yScale; yTR *= yScale;
    yBL *= yScale; yBR *= yScale;

    /* Write out the first two pixels */
    rgb[0] = LIMIT(b+yTL); rgb[1] = LIMIT(g+yTL);
    rgb[2] = LIMIT(r+yTL);

    rgb[3] = LIMIT(b+yTR); rgb[4] = LIMIT(g+yTR);
    rgb[5] = LIMIT(r+yTR);

    /* Write out the last two pixels */
    rgb += 6;
    rgb[0] = LIMIT(b+yBL); rgb[1] = LIMIT(g+yBL);
    rgb[2] = LIMIT(r+yBL);

    rgb[3] = LIMIT(b+yBR); rgb[4] = LIMIT(g+yBR);
    rgb[5] = LIMIT(r+yBR);
}

/* Converts from planar YUV420P to RGB24. */
static inline void
yuv420p_to_rgb24(int width, int height, uchar* src, uchar* dst, bool isYUV)
{
    cvtColor(Mat(height * 3 / 2, width, CV_8U, src), Mat(height, width, CV_8UC3, dst),
             isYUV ? COLOR_YUV2BGR_IYUV : COLOR_YUV2BGR_YV12);
}
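
// The two conversion codes above differ only in chroma plane order:
// IYUV/I420 stores the planes as Y, then U, then V, while YV12 stores
// Y, then V, then U. Both fit the same (height*3/2) x width CV_8U wrapper Mat.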

// Consider a YUV411P image of 8x2 pixels.
//
// A plane of Y values as before.
//
// A plane of U values   1   2
//                       3   4
//
// A plane of V values   1   2
//                       3   4
//
// The U1/V1 samples correspond to the ABCD pixels.
//     U2/V2 samples correspond to the EFGH pixels.
//
/* Converts from planar YUV411P to RGB24. */
/* [FD] untested... */
static void
yuv411p_to_rgb24(int width, int height,
                 unsigned char *pIn0, unsigned char *pOut0)
{
    const int numpix = width * height;
    const int bytes = 24 >> 3;
    int i, j, y00, y01, y10, y11, u, v;
    unsigned char *pY = pIn0;
    unsigned char *pU = pY + numpix;
    unsigned char *pV = pU + numpix / 4;
    unsigned char *pOut = pOut0;

    for (j = 0; j < height; j++) {
        for (i = 0; i <= width - 4; i += 4) {
            y00 = *pY;
            y01 = *(pY + 1);
            y10 = *(pY + 2);
            y11 = *(pY + 3);
            u = (*pU++) - 128;
            v = (*pV++) - 128;

            move_411_block(y00, y01, y10, y11, u, v,
                           width, pOut);

            pY += 4;
            pOut += 4 * bytes;
        }
    }
}

/* convert from 4:2:2 YUYV interleaved to RGB24 */
static void
yuyv_to_rgb24(int width, int height, unsigned char* src, unsigned char* dst) {
    cvtColor(Mat(height, width, CV_8UC2, src), Mat(height, width, CV_8UC3, dst),
             COLOR_YUV2BGR_YUYV);
}

static inline void
uyvy_to_rgb24 (int width, int height, unsigned char *src, unsigned char *dst)
{
    cvtColor(Mat(height, width, CV_8UC2, src), Mat(height, width, CV_8UC3, dst),
             COLOR_YUV2BGR_UYVY);
}

static inline void
y16_to_rgb24 (int width, int height, unsigned char* src, unsigned char* dst)
{
    Mat gray8;
    // scale by 1/256 to map the 16-bit range down to 8 bits
    Mat(height, width, CV_16UC1, src).convertTo(gray8, CV_8U, 0.00390625);
    cvtColor(gray8, Mat(height, width, CV_8UC3, dst), COLOR_GRAY2BGR);
}

static inline void
y8_to_rgb24 (int width, int height, unsigned char* src, unsigned char* dst)
{
    Mat gray8(height, width, CV_8UC1, src);
    cvtColor(gray8, Mat(height, width, CV_8UC3, dst), COLOR_GRAY2BGR);
}

#ifdef HAVE_JPEG

/* convert from mjpeg to rgb24 */
static bool
mjpeg_to_rgb24(int width, int height, unsigned char* src, int length, IplImage* dst) {
    Mat temp = cvarrToMat(dst);
    imdecode(Mat(1, length, CV_8U, src), IMREAD_COLOR, &temp);
    return temp.data && temp.cols == width && temp.rows == height;
}

#endif

/*
 * BAYER2RGB24 ROUTINE TAKEN FROM:
 *
 * Sonix SN9C10x based webcam basic I/F routines
 * Takafumi Mizuno <[email protected]>
 *
 */
static void bayer2rgb24(long int WIDTH, long int HEIGHT, unsigned char *src, unsigned char *dst)
{
    long int i;
    unsigned char *rawpt, *scanpt;
    long int size;

    rawpt = src;
    scanpt = dst;
    size = WIDTH*HEIGHT;

    for ( i = 0; i < size; i++ ) {
        if ( (i/WIDTH) % 2 == 0 ) {
            if ( (i % 2) == 0 ) {
                /* B */
                if ( (i > WIDTH) && ((i % WIDTH) > 0) ) {
                    *scanpt++ = (*(rawpt-WIDTH-1)+*(rawpt-WIDTH+1)+
                                 *(rawpt+WIDTH-1)+*(rawpt+WIDTH+1))/4;  /* R */
                    *scanpt++ = (*(rawpt-1)+*(rawpt+1)+
                                 *(rawpt+WIDTH)+*(rawpt-WIDTH))/4;      /* G */
                    *scanpt++ = *rawpt;                                 /* B */
                } else {
                    /* first line or left column */
                    *scanpt++ = *(rawpt+WIDTH+1);                       /* R */
                    *scanpt++ = (*(rawpt+1)+*(rawpt+WIDTH))/2;          /* G */
                    *scanpt++ = *rawpt;                                 /* B */
                }
            } else {
                /* (B)G */
                if ( (i > WIDTH) && ((i % WIDTH) < (WIDTH-1)) ) {
                    *scanpt++ = (*(rawpt+WIDTH)+*(rawpt-WIDTH))/2;      /* R */
                    *scanpt++ = *rawpt;                                 /* G */
                    *scanpt++ = (*(rawpt-1)+*(rawpt+1))/2;              /* B */
                } else {
                    /* first line or right column */
                    *scanpt++ = *(rawpt+WIDTH);                         /* R */
                    *scanpt++ = *rawpt;                                 /* G */
                    *scanpt++ = *(rawpt-1);                             /* B */
                }
            }
        } else {
            if ( (i % 2) == 0 ) {
                /* G(R) */
                if ( (i < (WIDTH*(HEIGHT-1))) && ((i % WIDTH) > 0) ) {
                    *scanpt++ = (*(rawpt-1)+*(rawpt+1))/2;              /* R */
                    *scanpt++ = *rawpt;                                 /* G */
                    *scanpt++ = (*(rawpt+WIDTH)+*(rawpt-WIDTH))/2;      /* B */
                } else {
                    /* bottom line or left column */
                    *scanpt++ = *(rawpt+1);                             /* R */
                    *scanpt++ = *rawpt;                                 /* G */
                    *scanpt++ = *(rawpt-WIDTH);                         /* B */
                }
            } else {
                /* R */
                if ( i < (WIDTH*(HEIGHT-1)) && ((i % WIDTH) < (WIDTH-1)) ) {
                    *scanpt++ = *rawpt;                                 /* R */
                    *scanpt++ = (*(rawpt-1)+*(rawpt+1)+
                                 *(rawpt-WIDTH)+*(rawpt+WIDTH))/4;      /* G */
                    *scanpt++ = (*(rawpt-WIDTH-1)+*(rawpt-WIDTH+1)+
                                 *(rawpt+WIDTH-1)+*(rawpt+WIDTH+1))/4;  /* B */
                } else {
                    /* bottom line or right column */
                    *scanpt++ = *rawpt;                                 /* R */
                    *scanpt++ = (*(rawpt-1)+*(rawpt-WIDTH))/2;          /* G */
                    *scanpt++ = *(rawpt-WIDTH-1);                       /* B */
                }
            }
        }
        rawpt++;
    }

}

// SGBRG to RGB24
// for some reason, red and blue need to be swapped
// at least for 046d:092f Logitech, Inc. QuickCam Express Plus to work
// see: http://www.siliconimaging.com/RGB%20Bayer.htm
// and 4.6 at http://tldp.org/HOWTO/html_single/libdc1394-HOWTO/
static void sgbrg2rgb24(long int WIDTH, long int HEIGHT, unsigned char *src, unsigned char *dst)
{
    long int i;
    unsigned char *rawpt, *scanpt;
    long int size;

    rawpt = src;
    scanpt = dst;
    size = WIDTH*HEIGHT;

    for ( i = 0; i < size; i++ )
    {
        if ( (i/WIDTH) % 2 == 0 ) //even row
        {
            if ( (i % 2) == 0 ) //even pixel
            {
                if ( (i > WIDTH) && ((i % WIDTH) > 0) )
                {
                    *scanpt++ = (*(rawpt-1)+*(rawpt+1))/2;              /* R */
                    *scanpt++ = *(rawpt);                               /* G */
                    *scanpt++ = (*(rawpt-WIDTH) + *(rawpt+WIDTH))/2;    /* B */
                } else
                {
                    /* first line or left column */

                    *scanpt++ = *(rawpt+1);                             /* R */
                    *scanpt++ = *(rawpt);                               /* G */
                    *scanpt++ = *(rawpt+WIDTH);                         /* B */
                }
            } else //odd pixel
            {
                if ( (i > WIDTH) && ((i % WIDTH) < (WIDTH-1)) )
                {
                    *scanpt++ = *(rawpt);                               /* R */
                    *scanpt++ = (*(rawpt-1)+*(rawpt+1)+*(rawpt-WIDTH)+*(rawpt+WIDTH))/4;  /* G */
                    *scanpt++ = (*(rawpt-WIDTH-1) + *(rawpt-WIDTH+1) + *(rawpt+WIDTH-1) + *(rawpt+WIDTH+1))/4;  /* B */
                } else
                {
                    /* first line or right column */

                    *scanpt++ = *(rawpt);                               /* R */
                    *scanpt++ = (*(rawpt-1)+*(rawpt+WIDTH))/2;          /* G */
                    *scanpt++ = *(rawpt+WIDTH-1);                       /* B */
                }
            }
        } else
        { //odd row
            if ( (i % 2) == 0 ) //even pixel
            {
                if ( (i < (WIDTH*(HEIGHT-1))) && ((i % WIDTH) > 0) )
                {
                    *scanpt++ = (*(rawpt-WIDTH-1)+*(rawpt-WIDTH+1)+*(rawpt+WIDTH-1)+*(rawpt+WIDTH+1))/4;  /* R */
                    *scanpt++ = (*(rawpt-1)+*(rawpt+1)+*(rawpt-WIDTH)+*(rawpt+WIDTH))/4;  /* G */
                    *scanpt++ = *(rawpt);                               /* B */
                } else
                {
                    /* bottom line or left column */

                    *scanpt++ = *(rawpt-WIDTH+1);                       /* R */
                    *scanpt++ = (*(rawpt+1)+*(rawpt-WIDTH))/2;          /* G */
                    *scanpt++ = *(rawpt);                               /* B */
                }
            } else
            { //odd pixel
                if ( i < (WIDTH*(HEIGHT-1)) && ((i % WIDTH) < (WIDTH-1)) )
                {
                    *scanpt++ = (*(rawpt-WIDTH)+*(rawpt+WIDTH))/2;      /* R */
                    *scanpt++ = *(rawpt);                               /* G */
                    *scanpt++ = (*(rawpt-1)+*(rawpt+1))/2;              /* B */
                } else
                {
                    /* bottom line or right column */

                    *scanpt++ = (*(rawpt-WIDTH));                       /* R */
                    *scanpt++ = *(rawpt);                               /* G */
                    *scanpt++ = (*(rawpt-1));                           /* B */
                }
            }
        }
        rawpt++;
    }
}

static inline void
rgb24_to_rgb24 (int width, int height, unsigned char *src, unsigned char *dst)
{
    cvtColor(Mat(height, width, CV_8UC3, src), Mat(height, width, CV_8UC3, dst), COLOR_RGB2BGR);
}

#define CLAMP(x) ((x)<0?0:((x)>255)?255:(x))

typedef struct {
    int is_abs;
    int len;
    int val;
} code_table_t;


/* local storage */
static code_table_t table[256];
static int init_done = 0;


/*
    sonix_decompress_init
    =====================
        pre-calculates a locally stored table for efficient huffman-decoding.

        Each entry at index x in the table represents the codeword
        present at the MSB of byte x.

*/
static void sonix_decompress_init(void)
{
    int i;
    int is_abs, val, len;

    for (i = 0; i < 256; i++) {
        is_abs = 0;
        val = 0;
        len = 0;
        if ((i & 0x80) == 0) {
            /* code 0 */
            val = 0;
            len = 1;
        }
        else if ((i & 0xE0) == 0x80) {
            /* code 100 */
            val = +4;
            len = 3;
        }
        else if ((i & 0xE0) == 0xA0) {
            /* code 101 */
            val = -4;
            len = 3;
        }
        else if ((i & 0xF0) == 0xD0) {
            /* code 1101 */
            val = +11;
            len = 4;
        }
        else if ((i & 0xF0) == 0xF0) {
            /* code 1111 */
            val = -11;
            len = 4;
        }
        else if ((i & 0xF8) == 0xC8) {
            /* code 11001 */
            val = +20;
            len = 5;
        }
        else if ((i & 0xFC) == 0xC0) {
            /* code 110000 */
            val = -20;
            len = 6;
        }
        else if ((i & 0xFC) == 0xC4) {
            /* code 110001xx: unknown */
            val = 0;
            len = 8;
        }
        else if ((i & 0xF0) == 0xE0) {
            /* code 1110xxxx */
            is_abs = 1;
            val = (i & 0x0F) << 4;
            len = 8;
        }
        table[i].is_abs = is_abs;
        table[i].val = val;
        table[i].len = len;
    }

    init_done = 1;
}
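
// Worked example of the table above: for byte 0xD3 (binary 1101 0011),
// (0xD3 & 0xF0) == 0xD0 matches code "1101", so table[0xD3] gets
// { is_abs = 0, len = 4, val = +11 }: consume 4 bits and add 11 to the
// neighbour-based prediction computed in sonix_decompress() below.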


/*
    sonix_decompress
    ================
        decompresses an image encoded by a SN9C101 camera controller chip.

    IN  width
        height
        inp     pointer to compressed frame (with header already stripped)
    OUT outp    pointer to decompressed frame

        Returns 0 if the operation was successful.
        Returns <0 if operation failed.

*/
static int sonix_decompress(int width, int height, unsigned char *inp, unsigned char *outp)
{
    int row, col;
    int val;
    int bitpos;
    unsigned char code;
    unsigned char *addr;

    if (!init_done) {
        /* do sonix_decompress_init first! */
        return -1;
    }

    bitpos = 0;
    for (row = 0; row < height; row++) {

        col = 0;



        /* first two pixels in first two rows are stored as raw 8-bit */
        if (row < 2) {
            addr = inp + (bitpos >> 3);
            code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));
            bitpos += 8;
            *outp++ = code;

            addr = inp + (bitpos >> 3);
            code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));
            bitpos += 8;
            *outp++ = code;

            col += 2;
        }

        while (col < width) {
            /* get bitcode from bitstream */
            addr = inp + (bitpos >> 3);
            code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));

            /* update bit position */
            bitpos += table[code].len;

            /* calculate pixel value */
            val = table[code].val;
            if (!table[code].is_abs) {
                /* value is relative to top and left pixel */
                if (col < 2) {
                    /* left column: relative to top pixel */
                    val += outp[-2*width];
                }
                else if (row < 2) {
                    /* top row: relative to left pixel */
                    val += outp[-2];
                }
                else {
                    /* main area: average of left pixel and top pixel */
                    val += (outp[-2] + outp[-2*width]) / 2;
                }
            }

            /* store pixel */
            *outp++ = CLAMP(val);
            col++;
        }
    }

    return 0;
}

static IplImage* icvRetrieveFrameCAM_V4L( CvCaptureCAM_V4L* capture, int) {
    /* Return what has already been captured, as an IplImage */
    // we need memory iff convert_rgb is true
    bool recreate_frame = capture->frame_allocated != capture->convert_rgb;

    if (!capture->convert_rgb) {
        // for mjpeg streams the size might change between frames, so we have to change the header
        recreate_frame += capture->frame.imageSize != (int)capture->buffers[capture->bufferIndex].length;
    }

    if (recreate_frame) {
        // printf("realloc %d %zu\n", capture->frame.imageSize, capture->buffers[capture->bufferIndex].length);
        if (capture->frame_allocated)
            cvFree(&capture->frame.imageData);
        v4l2_create_frame(capture);
    }

    if (!capture->convert_rgb) {
        capture->frame.imageData = (char*)capture->buffers[capture->bufferIndex].start;
        return &capture->frame;
    }

    switch (capture->palette)
    {
    case V4L2_PIX_FMT_BGR24:
        memcpy((char *)capture->frame.imageData,
               (char *)capture->buffers[capture->bufferIndex].start,
               capture->frame.imageSize);
        break;

    case V4L2_PIX_FMT_YVU420:
    case V4L2_PIX_FMT_YUV420:
        yuv420p_to_rgb24(capture->form.fmt.pix.width,
                         capture->form.fmt.pix.height,
                         (unsigned char*)(capture->buffers[capture->bufferIndex].start),
                         (unsigned char*)capture->frame.imageData,
                         capture->palette == V4L2_PIX_FMT_YUV420);
        break;

    case V4L2_PIX_FMT_YUV411P:
        yuv411p_to_rgb24(capture->form.fmt.pix.width,
                         capture->form.fmt.pix.height,
                         (unsigned char*)(capture->buffers[capture->bufferIndex].start),
                         (unsigned char*)capture->frame.imageData);
        break;
#ifdef HAVE_JPEG
    case V4L2_PIX_FMT_MJPEG:
    case V4L2_PIX_FMT_JPEG:
        if (!mjpeg_to_rgb24(capture->form.fmt.pix.width,
                            capture->form.fmt.pix.height,
                            (unsigned char*)(capture->buffers[capture->bufferIndex].start),
                            capture->buffers[capture->bufferIndex].length,
                            &capture->frame))
            return 0;
        break;
#endif

    case V4L2_PIX_FMT_YUYV:
        yuyv_to_rgb24(capture->form.fmt.pix.width,
                      capture->form.fmt.pix.height,
                      (unsigned char*)(capture->buffers[capture->bufferIndex].start),
                      (unsigned char*)capture->frame.imageData);
        break;
    case V4L2_PIX_FMT_UYVY:
        uyvy_to_rgb24(capture->form.fmt.pix.width,
                      capture->form.fmt.pix.height,
                      (unsigned char*)(capture->buffers[capture->bufferIndex].start),
                      (unsigned char*)capture->frame.imageData);
        break;
    case V4L2_PIX_FMT_SBGGR8:
        bayer2rgb24(capture->form.fmt.pix.width,
                    capture->form.fmt.pix.height,
                    (unsigned char*)capture->buffers[capture->bufferIndex].start,
                    (unsigned char*)capture->frame.imageData);
        break;

    case V4L2_PIX_FMT_SN9C10X:
        sonix_decompress_init();
        /* use the next mmap buffer as scratch space for the decompressed bayer data */
        sonix_decompress(capture->form.fmt.pix.width,
                         capture->form.fmt.pix.height,
                         (unsigned char*)capture->buffers[capture->bufferIndex].start,
                         (unsigned char*)capture->buffers[(capture->bufferIndex+1) % capture->req.count].start);

        bayer2rgb24(capture->form.fmt.pix.width,
                    capture->form.fmt.pix.height,
                    (unsigned char*)capture->buffers[(capture->bufferIndex+1) % capture->req.count].start,
                    (unsigned char*)capture->frame.imageData);
        break;

    case V4L2_PIX_FMT_SGBRG8:
        sgbrg2rgb24(capture->form.fmt.pix.width,
                    capture->form.fmt.pix.height,
                    (unsigned char*)capture->buffers[capture->bufferIndex].start,
                    (unsigned char*)capture->frame.imageData);
        break;
    case V4L2_PIX_FMT_RGB24:
        rgb24_to_rgb24(capture->form.fmt.pix.width,
                       capture->form.fmt.pix.height,
                       (unsigned char*)capture->buffers[capture->bufferIndex].start,
                       (unsigned char*)capture->frame.imageData);
        break;
    case V4L2_PIX_FMT_Y16:
        if (capture->convert_rgb) {
            y16_to_rgb24(capture->form.fmt.pix.width,
                         capture->form.fmt.pix.height,
                         (unsigned char*)capture->buffers[capture->bufferIndex].start,
                         (unsigned char*)capture->frame.imageData);
        } else {
            memcpy((char *)capture->frame.imageData,
                   (char *)capture->buffers[capture->bufferIndex].start,
                   capture->frame.imageSize);
        }
        break;
    case V4L2_PIX_FMT_GREY:
        if (capture->convert_rgb) {
            y8_to_rgb24(capture->form.fmt.pix.width,
                        capture->form.fmt.pix.height,
                        (unsigned char*)capture->buffers[capture->bufferIndex].start,
                        (unsigned char*)capture->frame.imageData);
        } else {
            memcpy((char *)capture->frame.imageData,
                   (char *)capture->buffers[capture->bufferIndex].start,
                   capture->frame.imageSize);
        }
        break;
    }

    if (capture->returnFrame)
        return(&capture->frame);
    else
        return 0;
}

static inline __u32 capPropertyToV4L2(int prop) {
    switch (prop) {
    case CV_CAP_PROP_BRIGHTNESS:
        return V4L2_CID_BRIGHTNESS;
    case CV_CAP_PROP_CONTRAST:
        return V4L2_CID_CONTRAST;
    case CV_CAP_PROP_SATURATION:
        return V4L2_CID_SATURATION;
    case CV_CAP_PROP_HUE:
        return V4L2_CID_HUE;
    case CV_CAP_PROP_GAIN:
        return V4L2_CID_GAIN;
    case CV_CAP_PROP_AUTO_EXPOSURE:
        return V4L2_CID_EXPOSURE_AUTO;
    case CV_CAP_PROP_EXPOSURE:
        return V4L2_CID_EXPOSURE_ABSOLUTE;
    case CV_CAP_PROP_AUTOFOCUS:
        return V4L2_CID_FOCUS_AUTO;
    case CV_CAP_PROP_FOCUS:
        return V4L2_CID_FOCUS_ABSOLUTE;
    default:
        return -1;
    }
}

static double icvGetPropertyCAM_V4L (const CvCaptureCAM_V4L* capture,
                                     int property_id ) {
{
    v4l2_format form;
    memset(&form, 0, sizeof(v4l2_format));
    form.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (-1 == ioctl (capture->deviceHandle, VIDIOC_G_FMT, &form)) {
        /* display an error message, and return an error code */
        perror ("VIDIOC_G_FMT");
        return -1;
    }

    switch (property_id) {
    case CV_CAP_PROP_FRAME_WIDTH:
        return form.fmt.pix.width;
    case CV_CAP_PROP_FRAME_HEIGHT:
        return form.fmt.pix.height;
    case CV_CAP_PROP_FOURCC:
    case CV_CAP_PROP_MODE:
        return capture->palette;
    case CV_CAP_PROP_FORMAT:
        return CV_MAKETYPE(IPL2CV_DEPTH(capture->frame.depth), capture->frame.nChannels);
    case CV_CAP_PROP_CONVERT_RGB:
        return capture->convert_rgb;
    case CV_CAP_PROP_BUFFERSIZE:
        return capture->bufferSize;
    }

    if (property_id == CV_CAP_PROP_FPS) {
        v4l2_streamparm sp = v4l2_streamparm();
        sp.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        if (ioctl(capture->deviceHandle, VIDIOC_G_PARM, &sp) < 0){
            fprintf(stderr, "VIDEOIO ERROR: V4L: Unable to get camera FPS\n");
            return -1;
        }

        return sp.parm.capture.timeperframe.denominator / (double)sp.parm.capture.timeperframe.numerator;
    }

    /* initialize the control structure */

    if (property_id == CV_CAP_PROP_POS_MSEC) {
        if (capture->FirstCapture) {
            return 0;
        } else {
            return 1000 * capture->timestamp.tv_sec + ((double) capture->timestamp.tv_usec) / 1000;
        }
    }

    __u32 v4l2id = capPropertyToV4L2(property_id);

    if (v4l2id == __u32(-1)) {
        fprintf(stderr,
                "VIDEOIO ERROR: V4L2: getting property #%d is not supported\n",
                property_id);
        return -1;
    }

    v4l2_control control = {v4l2id, 0};

    if (-1 == ioctl (capture->deviceHandle, VIDIOC_G_CTRL,
                     &control)) {

        fprintf( stderr, "VIDEOIO ERROR: V4L2: ");
        switch (property_id) {
        case CV_CAP_PROP_BRIGHTNESS:
            fprintf (stderr, "Brightness");
            break;
        case CV_CAP_PROP_CONTRAST:
            fprintf (stderr, "Contrast");
            break;
        case CV_CAP_PROP_SATURATION:
            fprintf (stderr, "Saturation");
            break;
        case CV_CAP_PROP_HUE:
            fprintf (stderr, "Hue");
            break;
        case CV_CAP_PROP_GAIN:
            fprintf (stderr, "Gain");
            break;
        case CV_CAP_PROP_AUTO_EXPOSURE:
            fprintf (stderr, "Auto Exposure");
            break;
        case CV_CAP_PROP_EXPOSURE:
            fprintf (stderr, "Exposure");
            break;
        case CV_CAP_PROP_AUTOFOCUS:
            fprintf (stderr, "Autofocus");
            break;
        case CV_CAP_PROP_FOCUS:
            fprintf (stderr, "Focus");
            break;
        }
        fprintf (stderr, " is not supported by your device\n");

        return -1;
    }

    /* get the min/max values */
    Range range = capture->getRange(property_id);

    /* all was OK, so convert to 0.0 - 1.0 range, and return the value */
    return ((double)control.value - range.start) / range.size();

}
};

static bool icvSetControl (CvCaptureCAM_V4L* capture,
                           int property_id, double value) {

    /* clamp the input value to the [0.0, 1.0] range */
    if (value < 0.0) {
        value = 0.0;
    } else if (value > 1.0) {
        value = 1.0;
    }

    /* initialisations */
    __u32 v4l2id = capPropertyToV4L2(property_id);

    if (v4l2id == __u32(-1)) {
        fprintf(stderr,
                "VIDEOIO ERROR: V4L2: setting property #%d is not supported\n",
                property_id);
        return false;
    }

    /* get the min/max values */
    Range range = capture->getRange(property_id);

    /* scale the value we want to set */
    value = value * range.size() + range.start;

    /* set which control we want to set */
    v4l2_control control = {v4l2id, int(value)};

    /* The driver may clamp the value or return ERANGE, ignored here */
    if (-1 == ioctl(capture->deviceHandle, VIDIOC_S_CTRL, &control) && errno != ERANGE) {
        perror ("VIDIOC_S_CTRL");
        return false;
    }

    if (control.id == V4L2_CID_EXPOSURE_AUTO && control.value == V4L2_EXPOSURE_MANUAL) {
        // update the control range for exposure after disabling autoexposure
        // as it is not read correctly at startup
        // TODO check this again as it might be fixed with Linux 4.5
        v4l2_control_range(capture, V4L2_CID_EXPOSURE_ABSOLUTE);
    }

    /* all was OK */
    return true;
}

static int icvSetPropertyCAM_V4L( CvCaptureCAM_V4L* capture,
                                  int property_id, double value ){
    bool retval = false;
    bool possible;

    /* two subsequent calls setting WIDTH and HEIGHT will change
       the video size */

    switch (property_id) {
    case CV_CAP_PROP_FRAME_WIDTH:
    {
        int& width = capture->width_set;
        int& height = capture->height_set;
        width = cvRound(value);
        retval = width != 0;
        if (width != 0 && height != 0) {
            capture->width = width;
            capture->height = height;
            retval = v4l2_reset(capture);
            width = height = 0;
        }
    }
        break;
    case CV_CAP_PROP_FRAME_HEIGHT:
    {
        int& width = capture->width_set;
        int& height = capture->height_set;
        height = cvRound(value);
        retval = height != 0;
        if (width != 0 && height != 0) {
            capture->width = width;
            capture->height = height;
            retval = v4l2_reset(capture);
            width = height = 0;
        }
    }
        break;
    case CV_CAP_PROP_FPS:
        capture->fps = value;
        retval = v4l2_reset(capture);
        break;
    case CV_CAP_PROP_CONVERT_RGB:
        // returns "0" for formats we do not know how to map to IplImage
        possible = v4l2_num_channels(capture->palette);
        capture->convert_rgb = bool(value) && possible;
        retval = possible || !bool(value);
        break;
    case CV_CAP_PROP_FOURCC:
    {
        __u32 old_palette = capture->palette;
        __u32 new_palette = static_cast<__u32>(value);
        capture->palette = new_palette;
        if (v4l2_reset(capture)) {
            retval = true;
        } else {
            capture->palette = old_palette;
            v4l2_reset(capture);
            retval = false;
        }
    }
        break;
    case CV_CAP_PROP_BUFFERSIZE:
        if ((int)value > MAX_V4L_BUFFERS || (int)value < 1) {
            fprintf(stderr, "V4L: Bad buffer size %d, buffer size must be from 1 to %d\n", (int)value, MAX_V4L_BUFFERS);
            retval = false;
        } else {
            capture->bufferSize = (int)value;
            if (capture->bufferIndex > capture->bufferSize) {
                capture->bufferIndex = 0;
            }
            retval = v4l2_reset(capture);
        }
        break;
    default:
        retval = icvSetControl(capture, property_id, value);
        break;
    }

    /* return the status */
    return retval;
}

static void icvCloseCAM_V4L( CvCaptureCAM_V4L* capture ){
    /* Deallocate space - Hopefully, no leaks */

    if (!capture->deviceName.empty())
    {
        if (capture->deviceHandle != -1)
        {
            capture->type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            if (-1 == ioctl(capture->deviceHandle, VIDIOC_STREAMOFF, &capture->type)) {
                perror ("Unable to stop the stream");
            }

            for (unsigned int n_buffers = 0; n_buffers < MAX_V4L_BUFFERS; ++n_buffers)
            {
                if (capture->buffers[n_buffers].start) {
                    if (-1 == munmap (capture->buffers[n_buffers].start, capture->buffers[n_buffers].length)) {
                        perror ("munmap");
                    } else {
                        capture->buffers[n_buffers].start = 0;
                    }
                }
            }

            if (capture->buffers[MAX_V4L_BUFFERS].start)
            {
                free(capture->buffers[MAX_V4L_BUFFERS].start);
                capture->buffers[MAX_V4L_BUFFERS].start = 0;
            }
        }

        if (capture->deviceHandle != -1)
            close(capture->deviceHandle);

        if (capture->frame_allocated && capture->frame.imageData)
            cvFree(&capture->frame.imageData);

        capture->deviceName.clear(); // flag that the capture is closed
    }
};

bool CvCaptureCAM_V4L::grabFrame()
{
    return icvGrabFrameCAM_V4L( this );
}

IplImage* CvCaptureCAM_V4L::retrieveFrame(int)
{
    return icvRetrieveFrameCAM_V4L( this, 0 );
}

double CvCaptureCAM_V4L::getProperty( int propId ) const
{
    return icvGetPropertyCAM_V4L( this, propId );
}

bool CvCaptureCAM_V4L::setProperty( int propId, double value )
{
    return icvSetPropertyCAM_V4L( this, propId, value );
}

} // end namespace cv

CvCapture* cvCreateCameraCapture_V4L( int index )
{
    cv::CvCaptureCAM_V4L* capture = new cv::CvCaptureCAM_V4L();

    if (capture->open(index))
        return capture;

    delete capture;
    return NULL;
}

CvCapture* cvCreateCameraCapture_V4L( const char * deviceName )
{
    cv::CvCaptureCAM_V4L* capture = new cv::CvCaptureCAM_V4L();

    if (capture->open( deviceName ))
        return capture;

    delete capture;
    return NULL;
}

#endif