  • A sample program written with Qt and OpenCV that captures images from a USB camera and displays them in real time.
  • Python code for grabbing USB camera images, including: 1. drawing text on the original image; 2. conversion to grayscale; 3. binarization; 4. saving to disk.
  • Captures USB camera data under Ubuntu 16.04. The program reads /dev/video0 through the FFmpeg API; the camera is a Logitech C270i and the captured image format is yuyv422.
  • Grabs images from USB cameras (multiple cameras supported). Important first: the archive contains detailed instructions (especially on setting up the AForge environment and referencing the DLLs)! The package holds a .NET 2008 project and program. The code is thoroughly commented! It is written in VB.NET and opens the camera via the AForge library...
  • Abstract: With the development of electronic technology, embedded image-processing systems are used ever more widely; products based on them appear in transportation, the military, consumer electronics, and many other fields.... This paper proposes a USB-camera image-acquisition system based on the Linux operating system, built around an S3C2...

    Abstract:

    With the development of electronic technology, embedded image-processing systems are used ever more widely; products based on them appear in transportation, the military, consumer electronics, and many other fields. Traditional image-processing systems suffer from long development cycles, difficult software upgrades, and high prices; by contrast, embedded image-processing systems, with their short development cycles, easy upgrades, and low cost, are gradually replacing them and have become the industry trend. This paper proposes an image-acquisition system based on the Linux operating system and a USB camera. An S3C2440 serves as the embedded processor and, together with peripheral SDRAM and Flash chips, forms the core of the development board; an external LCD screen and a USB camera are attached to build a complete embedded image-processing system. Using the free, open-source Linux operating system further lowers the system's cost. The thesis first introduces the research background and embedded-systems theory; it then analyzes the hardware modules that make up the system, namely image acquisition, image processing, storage, program execution, and image display; next it covers system porting; finally it describes driver and application development from the software side. In summary, the thesis covers: 1. An introduction to embedded-systems theory. 2. The overall hardware framework and its main modules. 3. The system porting process, including building and installing the cross-compilation toolchain, porting U-Boot, configuring and porting the kernel, and constructing and porting the root filesystem. 4. An introduction to the USB camera driver. 5. Development of the image-acquisition and display application.

  • USB Camera Preview Tool

    2019-03-01 10:15:04
    A USB camera viewing and configuration tool for Windows. Small, portable (no installation), and easy to use.
  • Displays the video in a PictureBox and then saves it as MP4; thoroughly commented.
  • Android USB camera source code, built with Android Studio, personally tested and working. An external camera shows an image immediately; supports taking photos, recording video, and adjusting resolution, brightness, contrast, and more.
  • Reading a USB Camera in C#

    2019-02-22 23:24:03
    Written in C#, this program opens a laptop's built-in camera or a USB camera, displays the feed in a form, and includes screenshot and recording functions.
  • A camera image-capture tool implemented with DirectShow; supports zoom in/out, grayscale/color, rotation, mirroring, resolution switching, and more.
  • Capturing a USB Camera with OpenCV 3

    2018-11-18 09:44:34
    Captures USB camera video with OpenCV 3; compiles under Visual Studio 2017.
  • Android Dual-USB-Camera Program

    2020-12-24 14:18:06
    A USB camera program in Java for Android Studio that runs as-is; displays the images of two USB cameras and performs image processing.
  • Zynq-7000 study notes (part 8) — USB camera image acquisition, attached resources
  • ROS development: use a USB camera and publish its images on the image_raw topic.
  • Aiming at image acquisition under embedded Linux and display on an embedded development board, this paper studies video... and ultimately implements USB camera image acquisition on an embedded Linux platform and display on the board, plus simple operations on the displayed image.
  • The code has no syntax errors and the camera opens, but the current frame cannot be read. #include <opencv.hpp> using namespace cv; int main() { VideoCapture capture(0); if (!capture.isOpened()) ...

    I use OpenCV to read my computer's camera with the code below, but it never runs correctly, and none of the solutions I found online helped. The code has no syntax errors and the camera opens, but the current frame cannot be read.

    #include <opencv.hpp>
    using namespace cv;
    int main() {
        VideoCapture capture(0);
        if (!capture.isOpened())
            return -1;
        Mat frame;
        while (1) {
            capture >> frame;
            imshow("Video", frame);
            waitKey(30);
        }
        return 0;
    }

    Since I could not find the cause, I searched online for the older-style code below, which does successfully read and display the camera feed.

    #include <opencv.hpp>
    using namespace cv;
    
    int main(int argc, char** argv)
    {
    	cvNamedWindow("Video");
    	CvCapture* capture = cvCreateCameraCapture(0);
    	IplImage* frame;
    
    	while (1)
    	{
    		frame = cvQueryFrame(capture);
    		cvResizeWindow("Video", 640, 480);
    		cvShowImage("Video", frame);
    		cvWaitKey(50);
    	}
    
    	cvReleaseCapture(&capture);
    	cvDestroyWindow("Video");
    	return 0;
    }

    My rough guess is that the frame grab fails because of a compatibility problem between the camera and the OpenCV version: on Windows the C++ VideoCapture path and the legacy cvCreateCameraCapture path can select different capture backends, and some cameras work with only one of them. Checking frame.empty() before calling imshow, or forcing a backend in newer OpenCV builds (e.g. VideoCapture capture(0, CAP_DSHOW)), is worth trying.

     

     

  • A LabVIEW sub-VI that captures images through a USB camera; it can also capture from the built-in camera.
  • Implementation of a single-frame USB camera image-acquisition system under embedded Linux.
  • Verified in practice; the step-by-step instructions are detailed and clear, and useful for USB camera video capture under Linux.
  • Under Ubuntu Linux, run the code and the system reads USB camera data and displays the captured video in real time.
  • #include <iostream> #include <opencv2/opencv.hpp>...// passing 0 selects the laptop's built-in camera; 2 and 1 select the two external USB cameras VideoCapture cap1(1); // set the resolution cap2.set(CV_CAP_PROP_FRAME_
    #include <iostream>
    #include <opencv2/opencv.hpp>
     
    using namespace std;
    using namespace cv;
     
    int main()
    {
        VideoCapture cap2(2); // passing 0 selects the laptop's built-in camera; 1 and 2 select the two external USB cameras
        VideoCapture cap1(1);
    
        // set the resolution (CV_CAP_PROP_* are the OpenCV 2/3 constant names)
        cap2.set(CV_CAP_PROP_FRAME_WIDTH, 640);
        cap2.set(CV_CAP_PROP_FRAME_HEIGHT, 480);
        
        cap1.set(CV_CAP_PROP_FRAME_WIDTH, 640);
        cap1.set(CV_CAP_PROP_FRAME_HEIGHT, 480);
     
        Mat img1;
        Mat img2;
     
        while(cap2.read(img2) && cap1.read(img1))
        {
            imshow("img1", img1);
            imshow("img2", img2);
            char c = waitKey(1);
            if(c == 'q' || c == 'Q') // press q to quit
            {
                break;
            }
        }
        return 0;
    }
    

    Test result
    (screenshot omitted)
    Both cameras display normally, with no stuttering.

  • Given that image-processing algorithms are complex and relatively rarely applied in routine inspection, this project, building on the sensors-and-measurement course platform, studies techniques for constructing an image-based measurement platform around an ordinary USB camera, suitable for research, engineering measurement, and teaching experiments, and carries out...
  • Capturing USB Camera Images (DirectShow)

    2016-08-23 15:48:54
    Besides capturing a video stream, a camera can also capture single still images, whose quality is higher than the stream's. A camera that supports still output generally provides a still-image pin of category PIN_CATEGORY_STILL. The filter commonly used to capture still images is the Sample...

    Besides capturing a video stream, a camera can also capture single still images, and a still image's quality is higher than the stream's. A camera that supports still output generally provides a still-image pin, whose category is PIN_CATEGORY_STILL.

    The filter commonly used to capture still images is the Sample Grabber filter; see the manual for its usage. Connect the capture filter's still pin to the Sample Grabber, then connect the Sample Grabber to a Null Renderer filter — the Null Renderer is there only so that something is connected to the Sample Grabber's output pin. The resulting graph looks like this:

                                               Capture Device ---------------> Sample Grabber ---------------------> Null Renderer

    To realize the graph shown above, each filter must be created, added to the graph, and then connected; that completes the whole pipeline.

    1. Capture Device: build the filter for the capture device with the following code

    // Bind the device whose id is deviceId to the given pFilter
    bool UsbCamera::BindFilter(int deviceId, IBaseFilter **pFilter)
    {
        if (deviceId < 0)
            return false;
        CComPtr<ICreateDevEnum> pCreateDevEnum;
        HRESULT hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER,
            IID_ICreateDevEnum, (void**)&pCreateDevEnum);
        if (hr != NOERROR)
            return false;
        CComPtr<IEnumMoniker> pEm;
        hr = pCreateDevEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory,
            &pEm, 0);
        if (hr != NOERROR)
            return false;
        pEm->Reset();
        ULONG cFetched;
        IMoniker *pM;
        int index = 0;
        // the original used comma operators here, which silently ignored hr == S_OK
        while ((hr = pEm->Next(1, &pM, &cFetched)) == S_OK && index <= deviceId)
        {
            IPropertyBag *pBag;
            hr = pM->BindToStorage(0, 0, IID_IPropertyBag, (void **)&pBag);
            if (SUCCEEDED(hr))
            {
                VARIANT var;
                var.vt = VT_BSTR;
                hr = pBag->Read(L"FriendlyName", &var, NULL);
                if (hr == NOERROR)
                {
                    if (index == deviceId)
                    {
                        pM->BindToObject(0, 0, IID_IBaseFilter, (void**)pFilter);
                    }
                    SysFreeString(var.bstrVal);
                }
                pBag->Release();
            }
            pM->Release();
            index++;
        }
        return true;
    }

    2. Implementing the Sample Grabber filter

    The Sample Grabber filter is added in order to set the media type.

    IBaseFilter* pSampleGrabberFilter;
    hr = CoCreateInstance(CLSID_SampleGrabber, NULL, CLSCTX_INPROC_SERVER,
        IID_IBaseFilter, (LPVOID *)&pSampleGrabberFilter);
    ISampleGrabber* pSampleGrabber;
    hr = pSampleGrabberFilter->QueryInterface(IID_ISampleGrabber, (void**)&pSampleGrabber);
    // set the media type
    AM_MEDIA_TYPE mt;
    ZeroMemory(&mt, sizeof(AM_MEDIA_TYPE));
    mt.majortype = MEDIATYPE_Video;
    if (mode == "MEDIASUBTYPE_RGB24")
    {
        mt.subtype = MEDIASUBTYPE_RGB24;
        bytePP     = 3.0;
    }
    else if (mode == "MEDIASUBTYPE_YUY2")
    {
        mt.subtype = MEDIASUBTYPE_YUY2;
        bytePP     = 2.0;
    }
    mt.formattype = FORMAT_VideoInfo;
    hr = pSampleGrabber->SetMediaType(&mt);

    3. Implementing the Null Renderer filter

    IBaseFilter* pNullFilter;
    hr = CoCreateInstance(CLSID_NullRenderer, NULL, CLSCTX_INPROC_SERVER,
        IID_IBaseFilter, (LPVOID*) &pNullFilter);

    4. Add the filters to the graph

    IGraphBuilder* pGraph;
    hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC,
        IID_IGraphBuilder, (void **)&pGraph);
    pGraph->AddFilter(pDeviceFilter, NULL);
    pGraph->AddFilter(pSampleGrabberFilter, L"Grabber");
    pGraph->AddFilter(pNullFilter, L"NullRenderer");

    5. Some capture devices have multiple input and output pins, so a route through the crossbar must be selected via SetCrossBar; the media type is also set here.

    // For devices with multiple inputs and outputs, select one route; also set the media type
    void UsbCamera::SetCrossBar(int fr, int iiw, int iih, string mode)
    {
        IAMCrossbar *pXBar1        = NULL;
        IAMStreamConfig* pVSC      = NULL;
        HRESULT hr = CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL,
            CLSCTX_INPROC_SERVER, IID_ICaptureGraphBuilder2,
            (void **)&pBuilder);
        if (SUCCEEDED(hr))
            hr = pBuilder->SetFiltergraph(pGraph);
        hr = pBuilder->FindInterface(&LOOK_UPSTREAM_ONLY, NULL,
            pDeviceFilter, IID_IAMCrossbar, (void**)&pXBar1);
        if (SUCCEEDED(hr))
        {
            long OutputPinCount;
            long InputPinCount;
            long PinIndexRelated;
            long PhysicalType;
            long inPort = 0;
            long outPort = 0;
            pXBar1->get_PinCounts(&OutputPinCount, &InputPinCount);
            for (int i = 0; i < InputPinCount; i++)
            {
                pXBar1->get_CrossbarPinInfo(TRUE, i, &PinIndexRelated, &PhysicalType);
                if (PhysConn_Video_Composite == PhysicalType)
                {
                    inPort = i;
                    break;
                }
            }
            for (int i = 0; i < OutputPinCount; i++)
            {
                pXBar1->get_CrossbarPinInfo(FALSE, i, &PinIndexRelated, &PhysicalType);
                if (PhysConn_Video_VideoDecoder == PhysicalType)
                {
                    outPort = i;
                    break;
                }
            }
            if (S_OK == pXBar1->CanRoute(outPort, inPort))
            {
                pXBar1->Route(outPort, inPort);
            }
            pXBar1->Release();
        }
        // set the media type
        hr = pBuilder->FindInterface(&PIN_CATEGORY_CAPTURE, 0, pDeviceFilter,
            IID_IAMStreamConfig, (void**)&pVSC);
        AM_MEDIA_TYPE *pmt;
        if (SUCCEEDED(hr))
        {
            hr = pVSC->GetFormat(&pmt);
            if (hr == NOERROR)
            {
                if (pmt->formattype == FORMAT_VideoInfo)
                {
                    VIDEOINFOHEADER *pvi = (VIDEOINFOHEADER*) pmt->pbFormat;
                    if (mode == "MEDIASUBTYPE_RGB24")
                        pmt->subtype = MEDIASUBTYPE_RGB24;
                    else if (mode == "MEDIASUBTYPE_YUY2")
                        pmt->subtype = MEDIASUBTYPE_YUY2;
                    pvi->AvgTimePerFrame = (LONGLONG)(10000000 / fr);
                    pvi->bmiHeader.biWidth  = iiw;
                    pvi->bmiHeader.biHeight = iih;
                    pvi->bmiHeader.biSizeImage = DIBSIZE(pvi->bmiHeader);
                    hr = pVSC->SetFormat(pmt);
                }
                FreeMediaType(*pmt);
            }
            SAFE_RELEASE(pVSC);
        }
    }

    6. Connect the filters

    IPin* pCameraOutput;
    IPin* pGrabberInput;
    IPin* pGrabberOutput;
    IPin* pNullInputPin;
    CComPtr<IEnumPins> pEnum;
    pDeviceFilter->EnumPins(&pEnum);
    hr = pEnum->Reset();
    hr = pEnum->Next(1, &pCameraOutput, NULL);
    pEnum = NULL;
    pSampleGrabberFilter->EnumPins(&pEnum);
    pEnum->Reset();
    hr = pEnum->Next(1, &pGrabberInput, NULL);
    pEnum = NULL;
    pSampleGrabberFilter->EnumPins(&pEnum);
    pEnum->Reset();
    pEnum->Skip(1);
    hr = pEnum->Next(1, &pGrabberOutput, NULL);
    pEnum = NULL;
    pNullFilter->EnumPins(&pEnum);
    pEnum->Reset();
    hr = pEnum->Next(1, &pNullInputPin, NULL);
    // connect the pins
    hr = pGraph->Connect(pCameraOutput, pGrabberInput);
    hr = pGraph->Connect(pGrabberOutput, pNullInputPin);

    7. Run

    pSampleGrabber->SetBufferSamples(TRUE); // TRUE: buffer frames internally, no separate single-frame buffer needed
    pSampleGrabber->SetOneShot(TRUE); // FALSE = keep the graph running after a grab, TRUE = stop the graph
    hr = pSampleGrabber->GetConnectedMediaType(&mt);
    VIDEOINFOHEADER *videoHeader;
    assert(mt.formattype == FORMAT_VideoInfo);
    videoHeader = reinterpret_cast<VIDEOINFOHEADER*>(mt.pbFormat);
    width  = videoHeader->bmiHeader.biWidth;
    height = videoHeader->bmiHeader.biHeight;
    FreeMediaType(mt);
    pMediaControl->Run();

    8. Full source code

    //==================================ImageBuffer.h================================================================

    #ifndef _imagebuffer_h_
    #define _imagebuffer_h_

    class ImageBuffer
    {
    public:
    enum {
      FORMAT_YUV444=0,
      FORMAT_YUV422,
      FORMAT_YUV411,
      FORMAT_RGB,
      FORMAT_MONO,
      FORMAT_MONO16,
      FORMAT_UYV
    };   

    int width;             
    int height;             
    int format;            
    int size;               
    unsigned char* buffer; 

    // note: inside the class body the constructors must not be qualified with ImageBuffer::
    ImageBuffer()
    {

    }
    ImageBuffer(int width, int height, int format,
      unsigned char* buffer, int size)
      : width(width), height(height), format(format), buffer(buffer), size(size)
    {

    }

    //static void convert(const ImageBuffer& src, ImageBuffer& dst) throw (RobotsException);

    };

    #endif

    //==================================ImageSource.h================================================================

    #ifndef _imagesource_h_
    #define _imagesource_h_

    #include "ImageBuffer.h"

    class ImageSource
    {
    public:
    virtual ImageBuffer getImage() =0;
    virtual int getWidth() const=0;
    virtual int getHeight() const=0;
    };
    #endif

    //==================================UsbCamera.h================================================================

    #ifndef USBCAMER_H_INCLUDE
    #define USBCAMER_H_INCLUDE

    #include <windows.h>
    #include <dshow.h>
    #include <atlbase.h>
    #include <qedit.h>
    #include <string>
    #include "ImageSource.h"

    #define WIN32_LEAN_AND_MEAN

    #ifndef SAFE_RELEASE
    #define SAFE_RELEASE( x ) \
    if ( NULL != x ) \
    { \
    x->Release( ); \
    x = NULL; \
    }
    #endif

    using namespace std;
    class UsbCamera : public ImageSource
    {
    public:
    UsbCamera();
    virtual ~UsbCamera();

    virtual ImageBuffer getImage();
    virtual int   getWidth() const;
    virtual int   getHeight() const;

    static UsbCamera* getCamera(  int port = 0, int framerate = 30, int width = 320,
      int height = 240, string mode = "" );
    static void   destroyUsbCamera();

    void    Init( int deviceId, bool displayProperties = false, int framerate = 30,int iw = 320, int ih = 240, string mode = "" );
    void    DisplayFilterProperties();

    bool    BindFilter(int deviceId, IBaseFilter **pFilter);
    void    SetCrossBar( int fr = 30, int iiw = 320, int iih = 240, string mode = "" );
    HRESULT    GrabByteFrame();
    long    GetBufferSize()  { return bufferSize; }
    long*    GetBuffer()   { return pBuffer;    }
    BYTE*    GetByteBuffer()  { return pBYTEbuffer;}

    public:
    bool        bisValid;

    protected:
    IGraphBuilder*   pGraph;
    IBaseFilter*   pDeviceFilter;
    IMediaControl*   pMediaControl;
    IBaseFilter*   pSampleGrabberFilter;
    ISampleGrabber*   pSampleGrabber;
    IPin*     pGrabberInput;
    IPin*     pGrabberOutput;
    IPin*     pCameraOutput;
    IMediaEvent*   pMediaEvent;
    IBaseFilter*   pNullFilter;
    IPin*     pNullInputPin;
    ICaptureGraphBuilder2*  pBuilder;

    static UsbCamera*  m_camera;
    ImageBuffer    imagebuf;

    double                  bytePP;
    private:
    void     ErrMsg(LPTSTR szFormat,...);
    void     FreeMediaType(AM_MEDIA_TYPE& mt);

    long  bufferSize;
    long*  pBuffer;
    BYTE*  pBYTEbuffer;
    bool  connected;
    int   width;
    int   height;

    ImageBuffer m_buffer;
    bool  bnotify;
    string     format_mode;

    };

    #endif

    //==================================UsbCamera.cpp================================================================

    #include "StdAfx.h"
    #include <assert.h>
    #include "UsbCamera.h"

    #ifndef USE_YUV422_FORMAT
    #define USE_YUV422_FORMAT 0
    #endif

    #ifndef USE_RGB24_FORMAT
    #define USE_RGB24_FORMAT  1
    #endif

    #ifndef SEND_WORK_STATE
    #define SEND_WORK_STATE  0
    #endif

    UsbCamera* UsbCamera::m_camera = NULL;

    UsbCamera::UsbCamera():bisValid(false),pBuffer(NULL),pBYTEbuffer(NULL),bufferSize(0),bytePP(2.0),
    connected(false),bnotify(false),width(0),height(0)
    {
    if (FAILED(CoInitialize(NULL)))
    {
      return;
    }
    pGraph     = NULL;
    pDeviceFilter   = NULL;
    pMediaControl   = NULL;
    pSampleGrabberFilter = NULL;
    pSampleGrabber   = NULL;
    pGrabberInput   = NULL;
    pGrabberOutput   = NULL;
    pCameraOutput   = NULL;
    pMediaEvent    = NULL;
    pNullFilter    = NULL;
    pNullInputPin   = NULL;
    pBuilder    = NULL;

    }

    UsbCamera::~UsbCamera()
    {
    if( connected )
    {
      if (pMediaControl )
      {
       pMediaControl->Stop();
      }
      SAFE_RELEASE(pGraph);
      SAFE_RELEASE(pDeviceFilter);
      SAFE_RELEASE(pMediaControl);
      SAFE_RELEASE(pSampleGrabberFilter);
      SAFE_RELEASE(pSampleGrabber);
      SAFE_RELEASE(pGrabberInput);
      SAFE_RELEASE(pGrabberOutput);
      SAFE_RELEASE(pCameraOutput);
      SAFE_RELEASE(pMediaEvent);
      SAFE_RELEASE(pNullFilter);
      SAFE_RELEASE(pNullInputPin);
      SAFE_RELEASE(pBuilder);
      CoUninitialize();
    }
    if( pBuffer )
      delete[] pBuffer;
    if( pBYTEbuffer )
      delete[] pBYTEbuffer;
    }

    void UsbCamera::Init(int deviceId, bool displayProperties,
    int framerate, int iw , int ih, string mode )
    {
    HRESULT hr = S_OK;
    format_mode = mode;
    // Create the Filter Graph Manager.
    hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC,
      IID_IGraphBuilder, (void **)&pGraph);

    hr = CoCreateInstance(CLSID_SampleGrabber, NULL, CLSCTX_INPROC_SERVER,
      IID_IBaseFilter, (LPVOID *)&pSampleGrabberFilter);

    hr = pGraph->QueryInterface(IID_IMediaControl, (void **) &pMediaControl);
    hr = pGraph->QueryInterface(IID_IMediaEvent, (void **) &pMediaEvent);

    hr = CoCreateInstance(CLSID_NullRenderer, NULL, CLSCTX_INPROC_SERVER,
      IID_IBaseFilter, (LPVOID*) &pNullFilter);

    hr = pGraph->AddFilter(pNullFilter, L"NullRenderer");

    hr = pSampleGrabberFilter->QueryInterface(IID_ISampleGrabber, (void**)&pSampleGrabber);

    AM_MEDIA_TYPE   mt;
    ZeroMemory(&mt, sizeof(AM_MEDIA_TYPE));
    mt.majortype  = MEDIATYPE_Video;

    if (mode == "MEDIASUBTYPE_RGB24" )
    {
      mt.subtype = MEDIASUBTYPE_RGB24;
      bytePP     = 3.0;
    }
    else if (mode == "MEDIASUBTYPE_YUY2" )
    {
      mt.subtype = MEDIASUBTYPE_YUY2;
      bytePP     = 2.0;
    }

    mt.formattype = FORMAT_VideoInfo;
    hr = pSampleGrabber->SetMediaType(&mt);

    pGraph->AddFilter(pSampleGrabberFilter, L"Grabber");

    // Bind Device Filter. We know the device because the id was passed in
    if(!BindFilter(deviceId, &pDeviceFilter))
    {
      ErrMsg(TEXT("USB camera not found!\nPlease check the device and try again!"));
      exit(0);
      return;
    }

    pGraph->AddFilter(pDeviceFilter, NULL);

    CComPtr<IEnumPins> pEnum;
    pDeviceFilter->EnumPins(&pEnum);
    hr = pEnum->Reset();
    hr = pEnum->Next(1, &pCameraOutput, NULL);
    pEnum = NULL;
    pSampleGrabberFilter->EnumPins(&pEnum);
    pEnum->Reset();
    hr = pEnum->Next(1, &pGrabberInput, NULL);
    pEnum = NULL;
    pSampleGrabberFilter->EnumPins(&pEnum);
    pEnum->Reset();
    pEnum->Skip(1);
    hr = pEnum->Next(1, &pGrabberOutput, NULL);
    pEnum = NULL;
    pNullFilter->EnumPins(&pEnum);
    pEnum->Reset();
    hr = pEnum->Next(1, &pNullInputPin, NULL);

    SetCrossBar(framerate,iw,ih,mode);

    if (displayProperties)
    {
      CComPtr<ISpecifyPropertyPages> pPages;

      HRESULT hr = pCameraOutput->QueryInterface(IID_ISpecifyPropertyPages, (void**)&pPages);
      if (SUCCEEDED(hr))
      {
       PIN_INFO PinInfo;
       pCameraOutput->QueryPinInfo(&PinInfo);
       CAUUID caGUID;
       pPages->GetPages(&caGUID);
       OleCreatePropertyFrame( NULL,0, 0,L"Property Sheet",1,
        (IUnknown **)&(pCameraOutput),
        caGUID.cElems, caGUID.pElems,
        0,0,NULL );
       CoTaskMemFree(caGUID.pElems);
       PinInfo.pFilter->Release();
      }
    }

    hr = pGraph->Connect(pCameraOutput, pGrabberInput);
    hr = pGraph->Connect(pGrabberOutput, pNullInputPin);

    pSampleGrabber->SetBufferSamples(TRUE); // TRUE: buffer frames internally, no separate single-frame buffer needed
    pSampleGrabber->SetOneShot(TRUE); // FALSE = keep the graph running after a grab, TRUE = stop the graph

    hr = pSampleGrabber->GetConnectedMediaType( &mt );
    VIDEOINFOHEADER *videoHeader;
    assert(mt.formattype == FORMAT_VideoInfo);
    videoHeader = reinterpret_cast<VIDEOINFOHEADER*>(mt.pbFormat);
    width  = videoHeader->bmiHeader.biWidth;
    height = videoHeader->bmiHeader.biHeight;
    FreeMediaType(mt);

    pMediaControl->Run();
    connected = true;
    }

    // Bind the device whose id is deviceId to the given pFilter
    bool UsbCamera::BindFilter(int deviceId, IBaseFilter **pFilter)
    {
    if (deviceId < 0)
      return false;
    CComPtr<ICreateDevEnum> pCreateDevEnum;
    HRESULT hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER,
      IID_ICreateDevEnum, (void**)&pCreateDevEnum);
    if (hr != NOERROR)
      return false;

    CComPtr<IEnumMoniker> pEm;
    hr = pCreateDevEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory,
      &pEm, 0);
    if (hr != NOERROR)
      return false;

    pEm->Reset();
    ULONG cFetched;
    IMoniker *pM;
    int index = 0;
    while ((hr = pEm->Next(1, &pM, &cFetched)) == S_OK && index <= deviceId) // the original comma operators ignored hr == S_OK
    {
      IPropertyBag *pBag;
      hr = pM->BindToStorage(0, 0, IID_IPropertyBag, (void **)&pBag);
      if(SUCCEEDED(hr))
      {
       VARIANT var;
       var.vt = VT_BSTR;
       hr = pBag->Read(L"FriendlyName", &var, NULL);
       if (hr == NOERROR)
       {
        if (index == deviceId)
        {
         pM->BindToObject(0, 0, IID_IBaseFilter, (void**)pFilter);
        }
        SysFreeString(var.bstrVal);
       }
       pBag->Release();
      }
      pM->Release();
      index++;
    }
    return true;
    }

    // For devices with multiple inputs and outputs, select one route; also set the media type
    void UsbCamera::SetCrossBar(int fr, int iiw, int iih,string mode)
    {
    IAMCrossbar *pXBar1             = NULL;
    IAMStreamConfig*      pVSC      = NULL;

    HRESULT hr = CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL,
      CLSCTX_INPROC_SERVER, IID_ICaptureGraphBuilder2,
      (void **)&pBuilder);

    if (SUCCEEDED(hr))
      hr = pBuilder->SetFiltergraph(pGraph);

    hr = pBuilder->FindInterface(&LOOK_UPSTREAM_ONLY, NULL,
      pDeviceFilter,IID_IAMCrossbar, (void**)&pXBar1);

    if (SUCCEEDED(hr))
    {
      long OutputPinCount;
      long InputPinCount;
      long PinIndexRelated;
      long PhysicalType;
      long inPort = 0;
      long outPort = 0;

      pXBar1->get_PinCounts(&OutputPinCount,&InputPinCount);
      for(int i =0;i<InputPinCount;i++)
      {
       pXBar1->get_CrossbarPinInfo(TRUE,i,&PinIndexRelated,&PhysicalType);
       if(PhysConn_Video_Composite==PhysicalType)
       {
        inPort = i;
        break;
       }
      }
      for(int i =0;i<OutputPinCount;i++)
      {
       pXBar1->get_CrossbarPinInfo(FALSE,i,&PinIndexRelated,&PhysicalType);
       if(PhysConn_Video_VideoDecoder==PhysicalType)
       {
        outPort = i;
        break;
       }
      }

      if(S_OK==pXBar1->CanRoute(outPort,inPort))
      {
       pXBar1->Route(outPort,inPort);
      }
      pXBar1->Release(); 
    }

    // set the media type
    hr = pBuilder->FindInterface( &PIN_CATEGORY_CAPTURE,0,pDeviceFilter,
      IID_IAMStreamConfig,(void**)&pVSC );

    AM_MEDIA_TYPE *pmt;
    if( SUCCEEDED(hr) )
    {
      hr = pVSC->GetFormat(&pmt);

      if (hr == NOERROR)
      {
       if (pmt->formattype == FORMAT_VideoInfo )
       {
        VIDEOINFOHEADER *pvi = (VIDEOINFOHEADER*) pmt->pbFormat;
        if (mode == "MEDIASUBTYPE_RGB24" )
         pmt->subtype = MEDIASUBTYPE_RGB24;
        else if (mode == "MEDIASUBTYPE_YUY2" )
         pmt->subtype = MEDIASUBTYPE_YUY2;

        pvi->AvgTimePerFrame = (LONGLONG)( 10000000 / fr );
        pvi->bmiHeader.biWidth  = iiw;
        pvi->bmiHeader.biHeight = iih;
        pvi->bmiHeader.biSizeImage = DIBSIZE(pvi->bmiHeader);
        hr = pVSC->SetFormat(pmt);
       }
       FreeMediaType(*pmt);
      }
      SAFE_RELEASE( pVSC );
    }
    }

    HRESULT UsbCamera::GrabByteFrame()
    {
    HRESULT hr;
    long    size = 0;
    long evCode;

    hr = pMediaEvent->WaitForCompletion(10e4, &evCode); // INFINITE

    #if SEND_WORK_STATE
    if( evCode == EC_COMPLETE )
      pMediaControl->Pause();
    else if( FAILED(hr) || evCode <= 0 )
    {
      if( !bnotify )
      {
       bnotify = true;
       bisValid = false;
       return E_FAIL;
      }
    }
    #endif

    pSampleGrabber->GetCurrentBuffer(&size, NULL);

    // use YUV422 format
    #if USE_YUV422_FORMAT
    // if buffer is not the same size as before, create a new one
    if( size != bufferSize )
    {
      if( pBuffer )
       delete[] pBuffer;
      bufferSize = size;
      pBuffer = new long[bufferSize];
      if( pBYTEbuffer )
       delete[] pBYTEbuffer;

      pBYTEbuffer = new BYTE[bufferSize*2];
      memset(pBYTEbuffer,0,sizeof(BYTE)*bufferSize*2);
    }

    pSampleGrabber->GetCurrentBuffer(&size, pBuffer);

    const BYTE* pSrc = (BYTE*) pBuffer;
    const BYTE* pSrcEnd = pSrc + (width*height*2);
    BYTE* pDest = pBYTEbuffer;

    while (pSrc < pSrcEnd)
    {
      for (register int i =0; i< width; i++)
      {
       BYTE temp = *(pSrc++);
       BYTE temp2 = *(pSrc++);
       *(pDest++) = temp2;
       *(pDest++) = temp;
      }
    } // closes the while loop (this brace was missing in the original)

    #endif

    #if USE_RGB24_FORMAT
    // use RGB format
    if (size != bufferSize)
    {
      if (pBuffer)
       delete[] pBuffer;
      bufferSize = size;
      pBuffer = new long[bufferSize];
      if(pBYTEbuffer)
       delete[] pBYTEbuffer;
      pBYTEbuffer = new BYTE[bufferSize*3];
    }

    pSampleGrabber->GetCurrentBuffer(&size, pBuffer);

    BYTE* pDest = pBYTEbuffer;
    BYTE *pBYTETemp = pBYTEbuffer;
    const BYTE* pSrc = (BYTE*) pBuffer;


    // source DIB rows are padded to a 4-byte boundary
    const ULONG remainder = ((width*3+3) & ~3) - width*3;

    for (int i = 0; i < height; i++ )
    {
      // the DIB is bottom-up: source row i becomes destination row (height-1-i)
      pDest = pBYTETemp + (height-1-i) * width * 3;
      for (int j = 0; j < width; j++ )
      {
       const BYTE blue = *(pSrc++);
       const BYTE green = *(pSrc++);
       const BYTE red = *(pSrc++);

       *(pDest++) = red;
       *(pDest++) = green;
       *(pDest++) = blue;
      }
      pSrc += remainder; // skip source row padding (the original wrongly advanced pDest per pixel)
    }
    #endif

    return S_OK;
    }

    ImageBuffer UsbCamera::getImage()
    {
    HRESULT hr = S_OK;
    hr = GrabByteFrame();

    if(FAILED(hr))
      ErrMsg(TEXT("UsbCamera disconnect!"));

    const BYTE* pSrc = GetByteBuffer();

    #if USE_YUV422_FORMAT
    m_buffer = ImageBuffer( width,height,ImageBuffer::FORMAT_YUV422,
      (unsigned char*)pSrc,(int)(width*height*2.));
    #endif

    #if USE_RGB24_FORMAT
    m_buffer = ImageBuffer( width,height,ImageBuffer::FORMAT_RGB,
      (unsigned char*)pSrc,(int)(width*height*3.));
    #endif

    pMediaControl->Run();

    return m_buffer;
    }

    UsbCamera* UsbCamera::getCamera(int port, int framerate /* = 30 */, int width /* = 320 */,
    int height /* = 240  */,string mode)
    {
    if (m_camera == NULL)
    {
      m_camera = new UsbCamera();
      m_camera->Init(port,false,framerate,width,height,mode);
      m_camera->bisValid = true;
    }
    return m_camera;
    }

    void UsbCamera::destroyUsbCamera()
    {
    if (m_camera)
    {
      delete m_camera;
      m_camera = NULL;
    }
    }

    void UsbCamera::FreeMediaType(AM_MEDIA_TYPE& mt)
    {
    if (mt.cbFormat != 0)
    {
      CoTaskMemFree((PVOID)mt.pbFormat);
      mt.cbFormat = 0;
      mt.pbFormat = NULL;
    }
    if (mt.pUnk != NULL)
    {
      mt.pUnk->Release();
      mt.pUnk = NULL;
    }
    }

    int UsbCamera::getWidth() const
    {
    return width;
    }

    int UsbCamera::getHeight() const
    {
    return height;
    }

    void UsbCamera::ErrMsg(LPTSTR szFormat,...)
    {
    static TCHAR szBuffer[2048]={0};
    const size_t NUMCHARS = sizeof(szBuffer) / sizeof(szBuffer[0]);
    const int LASTCHAR = NUMCHARS - 1;
    va_list pArgs;
    va_start(pArgs, szFormat);
    _vsntprintf(szBuffer, NUMCHARS - 1, szFormat, pArgs);
    va_end(pArgs);

    szBuffer[LASTCHAR] = TEXT('\0');
    ///MessageBox(NULL, szBuffer, "UsbCamera Error",
    // MB_OK | MB_ICONEXCLAMATION | MB_TASKMODAL);
    }

    void UsbCamera::DisplayFilterProperties()
    {
    ISpecifyPropertyPages* pProp;

    HRESULT hr = pDeviceFilter->QueryInterface(IID_ISpecifyPropertyPages, (void**)&pProp);
    if (SUCCEEDED(hr))
    {
      FILTER_INFO FilterInfo;
      hr = pDeviceFilter->QueryFilterInfo(&FilterInfo);
      IUnknown* pFilterUnk;
      pDeviceFilter->QueryInterface(IID_IUnknown,(void**)&pFilterUnk);

      CAUUID caGUID;
      pProp->GetPages(&caGUID);
      pProp->Release();
      OleCreatePropertyFrame(
       NULL,
       0,
       0,
       FilterInfo.achName,
       1,
       &pFilterUnk,
       caGUID.cElems,
       caGUID.pElems,
       0,
       0,
       NULL);
      pFilterUnk->Release();
      FilterInfo.pGraph->Release();
      CoTaskMemFree(caGUID.pElems);
    }

    }

    //==========================================main.cpp===========================================

    #include "UsbCamera.h"
    #include <opencv/highgui.h> // assumed include for the OpenCV C API (cvNamedWindow, cvShowImage); the original omitted it

    int main(int argc, char** argv)
    {

    cvNamedWindow( "Motion", 1 );
    UsbCamera* pCamera = UsbCamera::getCamera(0,30,480,240,"MEDIASUBTYPE_RGB24");
    if(NULL == pCamera)
    {
      return 0;
    }
    ImageBuffer buffer = pCamera->getImage();
    IplImage* pImage = cvCreateImage(cvSize(buffer.width,buffer.height),8,3);
    for(;;)
    {
     
      buffer = pCamera->getImage();
      for(int i=0;i<pImage->height;i++)
      {
       uchar* data=(uchar*)pImage->imageData + pImage->widthStep * i;
       for(int j=0;j<pImage->width;j++)
       {
        data[3*j] = buffer.buffer[3*(i*buffer.width+j)+2];
        data[3*j+1] = buffer.buffer[3*(i*buffer.width+j)+1];
        data[3*j+2] = buffer.buffer[3*(i*buffer.width+j)+0];
       }
      }
      cvShowImage( "Motion", pImage );

      if( cvWaitKey(10) >= 0 )
       break;
    }
    return 0;
    }

  • Live preview and photo capture from a USB camera; the captured picture can be fed to a third-party face-recognition API or compared with your own code. Simple code; calling preview and capture takes fewer than 20 lines.
  • 1. Install the LabVIEW 2010 Vision module (for 8.6 you must install a USB driver, and after that also change a file under win32 — I forget its name, you can search for it). If you enjoy image processing, install Vision Assistant and Vision Builder as well; these two modules are point-and-click...
  • Can stitch vertically or horizontally with real-time display, stitching the live images of two cameras — practical!
  • Earlier posts covered capturing bmp images and yuv video with a USB camera; display simply means drawing the bmp image on the LCD. The camera handling is largely the same as before; if the camera operations are unclear, see https://github.com/zhangdalei/video_lcd_show. ...
  • The Raspberry Pi has its own camera connector, but the official camera is a bit pricey; a common approach is to substitute a cheap driver-free (supported by the Linux kernel) USB webcam costing under 20 yuan to implement simple monitoring. The camera I used: USB 2.0, 0.3 megapixels, 640*480 frame...
  • Use this resource together with the blog post https://blog.csdn.net/AIRKernel/article/details/121807558. It converts the YUV images produced by a USB camera into an H264 file that can be played back on Ubuntu, completing the video compression step.
  • Capture mjpg-format data from a USB camera — v4l2 usb camera

Keyword: USB camera — no image