  • USB摄像头采集图像的VC源代码图象采集.rar (VC source code for USB camera image capture)
  • A Qt + OpenCV example that captures images from a USB camera and displays them in real time
  • Capturing images from a USB camera (DirectShow)

    Viewed 1,000+ times 2016-08-23 15:48:54

    Besides capturing a video stream, a camera can also capture single still images, and a still image is typically higher quality than a frame of the stream. A camera that supports still output generally exposes a dedicated still-image pin, whose pin category is PIN_CATEGORY_STILL.

    The filter commonly used to capture still images is the Sample Grabber filter; see the SDK documentation for its usage. Connect the capture filter's still pin to the Sample Grabber, then connect the Sample Grabber to a Null Renderer filter — the Null Renderer is there only so that the Sample Grabber's output pin has something to connect to. The resulting graph looks like this:

                                               Capture Device ---------------> Sample Grabber ---------------------> Null Renderer

    To build the graph shown above, create each filter, add them all to the graph, and then connect them.

    1. Capture Device — build the filter for the capture device with the following code:

    // Bind the device whose id is deviceId to the given pFilter
    bool UsbCamera::BindFilter(int deviceId, IBaseFilter **pFilter)
    {
        if (deviceId < 0)
            return false;
        CComPtr<ICreateDevEnum> pCreateDevEnum;
        HRESULT hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER,
            IID_ICreateDevEnum, (void**)&pCreateDevEnum);
        if (hr != NOERROR)
            return false;
        CComPtr<IEnumMoniker> pEm;
        hr = pCreateDevEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory, &pEm, 0);
        if (hr != NOERROR)
            return false;
        pEm->Reset();
        ULONG cFetched;
        IMoniker *pM;
        int index = 0;
        // Enumerate devices until the requested index has been passed
        while (pEm->Next(1, &pM, &cFetched) == S_OK && index <= deviceId)
        {
            IPropertyBag *pBag;
            hr = pM->BindToStorage(0, 0, IID_IPropertyBag, (void **)&pBag);
            if (SUCCEEDED(hr))
            {
                VARIANT var;
                var.vt = VT_BSTR;
                hr = pBag->Read(L"FriendlyName", &var, NULL);
                if (hr == NOERROR)
                {
                    if (index == deviceId)
                    {
                        pM->BindToObject(0, 0, IID_IBaseFilter, (void**)pFilter);
                    }
                    SysFreeString(var.bstrVal);
                }
                pBag->Release();
            }
            pM->Release();
            index++;
        }
        return true;
    }

    2. Implementing the Sample Grabber filter

    The Sample Grabber filter is added so that the media type can be set:

    IBaseFilter* pSampleGrabberFilter;
    hr = CoCreateInstance(CLSID_SampleGrabber, NULL, CLSCTX_INPROC_SERVER,
        IID_IBaseFilter, (LPVOID *)&pSampleGrabberFilter);
    ISampleGrabber* pSampleGrabber;
    hr = pSampleGrabberFilter->QueryInterface(IID_ISampleGrabber, (void**)&pSampleGrabber);

    // Set the media type
    AM_MEDIA_TYPE mt;
    ZeroMemory(&mt, sizeof(AM_MEDIA_TYPE));
    mt.majortype = MEDIATYPE_Video;
    if (mode == "MEDIASUBTYPE_RGB24")
    {
        mt.subtype = MEDIASUBTYPE_RGB24;
        bytePP     = 3.0;
    }
    else if (mode == "MEDIASUBTYPE_YUY2")
    {
        mt.subtype = MEDIASUBTYPE_YUY2;
        bytePP     = 2.0;
    }
    mt.formattype = FORMAT_VideoInfo;
    hr = pSampleGrabber->SetMediaType(&mt);

    3. The Null Renderer filter

    IBaseFilter* pNullFilter;
    hr = CoCreateInstance(CLSID_NullRenderer, NULL, CLSCTX_INPROC_SERVER,
        IID_IBaseFilter, (LPVOID*) &pNullFilter);

    4. Add the filters to the graph

    IGraphBuilder* pGraph;
    hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC,
        IID_IGraphBuilder, (void **)&pGraph);
    pGraph->AddFilter(pDeviceFilter, NULL);
    pGraph->AddFilter(pSampleGrabberFilter, L"Grabber");
    pGraph->AddFilter(pNullFilter, L"NullRenderer");

    5. Some capture devices expose several input and output pins; in that case one route must be selected through the crossbar, which SetCrossBar does (it also sets the media type):

    // For devices with multiple inputs and outputs, select one route; also set the media type
    void UsbCamera::SetCrossBar(int fr, int iiw, int iih, string mode)
    {
        IAMCrossbar*     pXBar1 = NULL;
        IAMStreamConfig* pVSC   = NULL;
        HRESULT hr = CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL,
            CLSCTX_INPROC_SERVER, IID_ICaptureGraphBuilder2,
            (void **)&pBuilder);
        if (SUCCEEDED(hr))
            hr = pBuilder->SetFiltergraph(pGraph);
        hr = pBuilder->FindInterface(&LOOK_UPSTREAM_ONLY, NULL,
            pDeviceFilter, IID_IAMCrossbar, (void**)&pXBar1);
        if (SUCCEEDED(hr))
        {
            long OutputPinCount;
            long InputPinCount;
            long PinIndexRelated;
            long PhysicalType;
            long inPort = 0;
            long outPort = 0;
            pXBar1->get_PinCounts(&OutputPinCount, &InputPinCount);
            for (int i = 0; i < InputPinCount; i++)
            {
                pXBar1->get_CrossbarPinInfo(TRUE, i, &PinIndexRelated, &PhysicalType);
                if (PhysConn_Video_Composite == PhysicalType)
                {
                    inPort = i;
                    break;
                }
            }
            for (int i = 0; i < OutputPinCount; i++)
            {
                pXBar1->get_CrossbarPinInfo(FALSE, i, &PinIndexRelated, &PhysicalType);
                if (PhysConn_Video_VideoDecoder == PhysicalType)
                {
                    outPort = i;
                    break;
                }
            }
            if (S_OK == pXBar1->CanRoute(outPort, inPort))
            {
                pXBar1->Route(outPort, inPort);
            }
            pXBar1->Release();
        }

        // Set the media type
        hr = pBuilder->FindInterface(&PIN_CATEGORY_CAPTURE, 0, pDeviceFilter,
            IID_IAMStreamConfig, (void**)&pVSC);
        AM_MEDIA_TYPE *pmt;
        if (SUCCEEDED(hr))
        {
            hr = pVSC->GetFormat(&pmt);
            if (hr == NOERROR)
            {
                if (pmt->formattype == FORMAT_VideoInfo)
                {
                    VIDEOINFOHEADER *pvi = (VIDEOINFOHEADER*) pmt->pbFormat;
                    if (mode == "MEDIASUBTYPE_RGB24")
                        pmt->subtype = MEDIASUBTYPE_RGB24;
                    else if (mode == "MEDIASUBTYPE_YUY2")
                        pmt->subtype = MEDIASUBTYPE_YUY2;
                    pvi->AvgTimePerFrame = (LONGLONG)(10000000 / fr);
                    pvi->bmiHeader.biWidth  = iiw;
                    pvi->bmiHeader.biHeight = iih;
                    pvi->bmiHeader.biSizeImage = DIBSIZE(pvi->bmiHeader);
                    hr = pVSC->SetFormat(pmt);
                }
                FreeMediaType(*pmt);
            }
            SAFE_RELEASE(pVSC);
        }
    }
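    A note on the numbers above: AvgTimePerFrame in VIDEOINFOHEADER is a REFERENCE_TIME, i.e. it is measured in 100-nanosecond units, which is where the 10000000 / fr expression comes from. A small sketch of the conversion:

    ```python
    def avg_time_per_frame(fps: int) -> int:
        """Frame interval in 100-ns units (REFERENCE_TIME), as stored in
        VIDEOINFOHEADER.AvgTimePerFrame; mirrors 10000000 / fr above."""
        return 10_000_000 // fps

    # 30 fps -> 333333 units, i.e. roughly 33.3 ms per frame
    ```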

    6. Connect the filters

    IPin* pCameraOutput;
    IPin* pGrabberInput;
    IPin* pGrabberOutput;
    IPin* pNullInputPin;
    CComPtr<IEnumPins> pEnum;
    pDeviceFilter->EnumPins(&pEnum);
    hr = pEnum->Reset();
    hr = pEnum->Next(1, &pCameraOutput, NULL);
    pEnum = NULL;
    pSampleGrabberFilter->EnumPins(&pEnum);
    pEnum->Reset();
    hr = pEnum->Next(1, &pGrabberInput, NULL);
    pEnum = NULL;
    pSampleGrabberFilter->EnumPins(&pEnum);
    pEnum->Reset();
    pEnum->Skip(1);
    hr = pEnum->Next(1, &pGrabberOutput, NULL);
    pEnum = NULL;
    pNullFilter->EnumPins(&pEnum);
    pEnum->Reset();
    hr = pEnum->Next(1, &pNullInputPin, NULL);
    // Connect the pins
    hr = pGraph->Connect(pCameraOutput, pGrabberInput);
    hr = pGraph->Connect(pGrabberOutput, pNullInputPin);

    7. Run the graph

    pSampleGrabber->SetBufferSamples(TRUE); // buffer samples internally; no separate per-frame buffer is needed
    pSampleGrabber->SetOneShot(TRUE); // FALSE = keep the graph running after a grab, TRUE = stop it after one shot
    hr = pSampleGrabber->GetConnectedMediaType(&mt);
    VIDEOINFOHEADER *videoHeader;
    assert(mt.formattype == FORMAT_VideoInfo);
    videoHeader = reinterpret_cast<VIDEOINFOHEADER*>(mt.pbFormat);
    width  = videoHeader->bmiHeader.biWidth;
    height = videoHeader->bmiHeader.biHeight;
    FreeMediaType(mt);
    pMediaControl->Run();

    8. Full source code

    //==================================ImageBuffer.h================================================================

    #ifndef _imagebuffer_h_
    #define _imagebuffer_h_

    class ImageBuffer
    {
    public:
    enum {
      FORMAT_YUV444=0,
      FORMAT_YUV422,
      FORMAT_YUV411,
      FORMAT_RGB,
      FORMAT_MONO,
      FORMAT_MONO16,
      FORMAT_UYV
    };   

    int width;             
    int height;             
    int format;            
    int size;               
    unsigned char* buffer; 

    ImageBuffer()
    {

    }
    ImageBuffer(int width, int height, int format,
      unsigned char* buffer, int size)
      : width(width), height(height), format(format), size(size), buffer(buffer)
    {

    }

    //static void convert(const ImageBuffer& src, ImageBuffer& dst) throw (RobotsException);

    };

    #endif

    //==================================ImageSource.h================================================================

    #ifndef _imagesource_h_
    #define _imagesource_h_

    #include "ImageBuffer.h"

    class ImageSource
    {
    public:
    virtual ImageBuffer getImage() =0;
    virtual int getWidth() const=0;
    virtual int getHeight() const=0;
    };
    #endif

    //==================================UsbCamera.h================================================================

    #ifndef USBCAMERA_H_INCLUDE
    #define USBCAMERA_H_INCLUDE

    #define WIN32_LEAN_AND_MEAN // must be defined before <windows.h> to take effect

    #include <windows.h>
    #include <dshow.h>
    #include <atlbase.h>
    #include <qedit.h>
    #include <string>
    #include "ImageSource.h"

    #ifndef SAFE_RELEASE
    #define SAFE_RELEASE( x ) \
    if ( NULL != x ) \
    { \
    x->Release( ); \
    x = NULL; \
    }
    #endif

    using namespace std;
    class UsbCamera : public ImageSource
    {
    public:
    UsbCamera();
    virtual ~UsbCamera();

    virtual ImageBuffer getImage();
    virtual int   getWidth() const;
    virtual int   getHeight() const;

    static UsbCamera* getCamera(  int port = 0, int framerate = 30, int width = 320,
      int height = 240, string mode = "" );
    static void   destroyUsbCamera();

    void    Init( int deviceId, bool displayProperties = false, int framerate = 30,int iw = 320, int ih = 240, string mode = "" );
    void    DisplayFilterProperties();

    bool    BindFilter(int deviceId, IBaseFilter **pFilter);
    void    SetCrossBar( int fr = 30, int iiw = 320, int iih = 240, string mode = "" );
    HRESULT    GrabByteFrame();
    long    GetBufferSize()  { return bufferSize; }
    long*    GetBuffer()   { return pBuffer;    }
    BYTE*    GetByteBuffer()  { return pBYTEbuffer;}

    public:
    bool        bisValid;

    protected:
    IGraphBuilder*   pGraph;
    IBaseFilter*   pDeviceFilter;
    IMediaControl*   pMediaControl;
    IBaseFilter*   pSampleGrabberFilter;
    ISampleGrabber*   pSampleGrabber;
    IPin*     pGrabberInput;
    IPin*     pGrabberOutput;
    IPin*     pCameraOutput;
    IMediaEvent*   pMediaEvent;
    IBaseFilter*   pNullFilter;
    IPin*     pNullInputPin;
    ICaptureGraphBuilder2*  pBuilder;

    static UsbCamera*  m_camera;
    ImageBuffer    imagebuf;

    double                  bytePP;
    private:
    void     ErrMsg(LPTSTR szFormat,...);
    void     FreeMediaType(AM_MEDIA_TYPE& mt);

    long  bufferSize;
    long*  pBuffer;
    BYTE*  pBYTEbuffer;
    bool  connected;
    int   width;
    int   height;

    ImageBuffer m_buffer;
    bool  bnotify;
    string     format_mode;

    };

    #endif

    //==================================UsbCamera.cpp================================================================

    #include "StdAfx.h"
    #include <assert.h>
    #include "UsbCamera.h"

    #ifndef USE_YUV422_FORMAT
    #define USE_YUV422_FORMAT 0
    #endif

    #ifndef USE_RGB24_FORMAT
    #define USE_RGB24_FORMAT  1
    #endif

    #ifndef SEND_WORK_STATE
    #define SEND_WORK_STATE  0
    #endif

    UsbCamera* UsbCamera::m_camera = NULL;

    UsbCamera::UsbCamera():bisValid(false),pBuffer(NULL),pBYTEbuffer(NULL),bufferSize(0),bytePP(2.0),
    connected(false),bnotify(false),width(0),height(0)
    {
    if (FAILED(CoInitialize(NULL)))
    {
      return;
    }
    pGraph     = NULL;
    pDeviceFilter   = NULL;
    pMediaControl   = NULL;
    pSampleGrabberFilter = NULL;
    pSampleGrabber   = NULL;
    pGrabberInput   = NULL;
    pGrabberOutput   = NULL;
    pCameraOutput   = NULL;
    pMediaEvent    = NULL;
    pNullFilter    = NULL;
    pNullInputPin   = NULL;
    pBuilder    = NULL;

    }

    UsbCamera::~UsbCamera()
    {
    if( connected )
    {
      if (pMediaControl )
      {
       pMediaControl->Stop();
      }
      SAFE_RELEASE(pGraph);
      SAFE_RELEASE(pDeviceFilter);
      SAFE_RELEASE(pMediaControl);
      SAFE_RELEASE(pSampleGrabberFilter);
      SAFE_RELEASE(pSampleGrabber);
      SAFE_RELEASE(pGrabberInput);
      SAFE_RELEASE(pGrabberOutput);
      SAFE_RELEASE(pCameraOutput);
      SAFE_RELEASE(pMediaEvent);
      SAFE_RELEASE(pNullFilter);
      SAFE_RELEASE(pNullInputPin);
      SAFE_RELEASE(pBuilder);
      CoUninitialize();
    }
    if( pBuffer )
      delete[] pBuffer;
    if( pBYTEbuffer )
      delete[] pBYTEbuffer;
    }

    void UsbCamera::Init(int deviceId, bool displayProperties,
    int framerate, int iw , int ih, string mode )
    {
    HRESULT hr = S_OK;
    format_mode = mode;
    // Create the Filter Graph Manager.
    hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC,
      IID_IGraphBuilder, (void **)&pGraph);

    hr = CoCreateInstance(CLSID_SampleGrabber, NULL, CLSCTX_INPROC_SERVER,
      IID_IBaseFilter, (LPVOID *)&pSampleGrabberFilter);

    hr = pGraph->QueryInterface(IID_IMediaControl, (void **) &pMediaControl);
    hr = pGraph->QueryInterface(IID_IMediaEvent, (void **) &pMediaEvent);

    hr = CoCreateInstance(CLSID_NullRenderer, NULL, CLSCTX_INPROC_SERVER,
      IID_IBaseFilter, (LPVOID*) &pNullFilter);

    hr = pGraph->AddFilter(pNullFilter, L"NullRenderer");

    hr = pSampleGrabberFilter->QueryInterface(IID_ISampleGrabber, (void**)&pSampleGrabber);

    AM_MEDIA_TYPE   mt;
    ZeroMemory(&mt, sizeof(AM_MEDIA_TYPE));
    mt.majortype  = MEDIATYPE_Video;

    if (mode == "MEDIASUBTYPE_RGB24" )
    {
      mt.subtype = MEDIASUBTYPE_RGB24;
      bytePP     = 3.0;
    }
    else if (mode == "MEDIASUBTYPE_YUY2" )
    {
      mt.subtype = MEDIASUBTYPE_YUY2;
      bytePP     = 2.0;
    }

    mt.formattype = FORMAT_VideoInfo;
    hr = pSampleGrabber->SetMediaType(&mt);

    pGraph->AddFilter(pSampleGrabberFilter, L"Grabber");

    // Bind Device Filter. We know the device because the id was passed in
    if(!BindFilter(deviceId, &pDeviceFilter))
    {
      ErrMsg(TEXT("No USB camera found!\nPlease check the device and try again!"));
      exit(0);
      return;
    }

    pGraph->AddFilter(pDeviceFilter, NULL);

    CComPtr<IEnumPins> pEnum;
    pDeviceFilter->EnumPins(&pEnum);
    hr = pEnum->Reset();
    hr = pEnum->Next(1, &pCameraOutput, NULL);
    pEnum = NULL;
    pSampleGrabberFilter->EnumPins(&pEnum);
    pEnum->Reset();
    hr = pEnum->Next(1, &pGrabberInput, NULL);
    pEnum = NULL;
    pSampleGrabberFilter->EnumPins(&pEnum);
    pEnum->Reset();
    pEnum->Skip(1);
    hr = pEnum->Next(1, &pGrabberOutput, NULL);
    pEnum = NULL;
    pNullFilter->EnumPins(&pEnum);
    pEnum->Reset();
    hr = pEnum->Next(1, &pNullInputPin, NULL);

    SetCrossBar(framerate,iw,ih,mode);

    if (displayProperties)
    {
      CComPtr<ISpecifyPropertyPages> pPages;

      HRESULT hr = pCameraOutput->QueryInterface(IID_ISpecifyPropertyPages, (void**)&pPages);
      if (SUCCEEDED(hr))
      {
       PIN_INFO PinInfo;
       pCameraOutput->QueryPinInfo(&PinInfo);
       CAUUID caGUID;
       pPages->GetPages(&caGUID);
       OleCreatePropertyFrame( NULL,0, 0,L"Property Sheet",1,
        (IUnknown **)&(pCameraOutput),
        caGUID.cElems, caGUID.pElems,
        0,0,NULL );
       CoTaskMemFree(caGUID.pElems);
       PinInfo.pFilter->Release();
      }
    }

    hr = pGraph->Connect(pCameraOutput, pGrabberInput);
    hr = pGraph->Connect(pGrabberOutput, pNullInputPin);

    pSampleGrabber->SetBufferSamples(TRUE); // buffer samples internally; no separate per-frame buffer is needed
    pSampleGrabber->SetOneShot(TRUE); // FALSE = keep the graph running after a grab, TRUE = stop it after one shot

    hr = pSampleGrabber->GetConnectedMediaType( &mt );
    VIDEOINFOHEADER *videoHeader;
    assert(mt.formattype == FORMAT_VideoInfo);
    videoHeader = reinterpret_cast<VIDEOINFOHEADER*>(mt.pbFormat);
    width  = videoHeader->bmiHeader.biWidth;
    height = videoHeader->bmiHeader.biHeight;
    FreeMediaType(mt);

    pMediaControl->Run();
    connected = true;
    }

    // Bind the device whose id is deviceId to the given pFilter
    bool UsbCamera::BindFilter(int deviceId, IBaseFilter **pFilter)
    {
    if (deviceId < 0)
      return false;
    CComPtr<ICreateDevEnum> pCreateDevEnum;
    HRESULT hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER,
      IID_ICreateDevEnum, (void**)&pCreateDevEnum);
    if (hr != NOERROR)
      return false;

    CComPtr<IEnumMoniker> pEm;
    hr = pCreateDevEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory,
      &pEm, 0);
    if (hr != NOERROR)
      return false;

    pEm->Reset();
    ULONG cFetched;
    IMoniker *pM;
    int index = 0;
    while (pEm->Next(1, &pM, &cFetched) == S_OK && index <= deviceId)
    {
      IPropertyBag *pBag;
      hr = pM->BindToStorage(0, 0, IID_IPropertyBag, (void **)&pBag);
      if(SUCCEEDED(hr))
      {
       VARIANT var;
       var.vt = VT_BSTR;
       hr = pBag->Read(L"FriendlyName", &var, NULL);
       if (hr == NOERROR)
       {
        if (index == deviceId)
        {
         pM->BindToObject(0, 0, IID_IBaseFilter, (void**)pFilter);
        }
        SysFreeString(var.bstrVal);
       }
       pBag->Release();
      }
      pM->Release();
      index++;
    }
    return true;
    }

    // For devices with multiple inputs and outputs, select one route; also set the media type
    void UsbCamera::SetCrossBar(int fr, int iiw, int iih,string mode)
    {
    IAMCrossbar *pXBar1             = NULL;
    IAMStreamConfig*      pVSC      = NULL;

    HRESULT hr = CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL,
      CLSCTX_INPROC_SERVER, IID_ICaptureGraphBuilder2,
      (void **)&pBuilder);

    if (SUCCEEDED(hr))
      hr = pBuilder->SetFiltergraph(pGraph);

    hr = pBuilder->FindInterface(&LOOK_UPSTREAM_ONLY, NULL,
      pDeviceFilter,IID_IAMCrossbar, (void**)&pXBar1);

    if (SUCCEEDED(hr))
    {
      long OutputPinCount;
      long InputPinCount;
      long PinIndexRelated;
      long PhysicalType;
      long inPort = 0;
      long outPort = 0;

      pXBar1->get_PinCounts(&OutputPinCount,&InputPinCount);
      for(int i =0;i<InputPinCount;i++)
      {
       pXBar1->get_CrossbarPinInfo(TRUE,i,&PinIndexRelated,&PhysicalType);
       if(PhysConn_Video_Composite==PhysicalType)
       {
        inPort = i;
        break;
       }
      }
      for(int i =0;i<OutputPinCount;i++)
      {
       pXBar1->get_CrossbarPinInfo(FALSE,i,&PinIndexRelated,&PhysicalType);
       if(PhysConn_Video_VideoDecoder==PhysicalType)
       {
        outPort = i;
        break;
       }
      }

      if(S_OK==pXBar1->CanRoute(outPort,inPort))
      {
       pXBar1->Route(outPort,inPort);
      }
      pXBar1->Release(); 
    }

    // Set the media type
    hr = pBuilder->FindInterface( &PIN_CATEGORY_CAPTURE,0,pDeviceFilter,
      IID_IAMStreamConfig,(void**)&pVSC );

    AM_MEDIA_TYPE *pmt;
    if( SUCCEEDED(hr) )
    {
      hr = pVSC->GetFormat(&pmt);

      if (hr == NOERROR)
      {
       if (pmt->formattype == FORMAT_VideoInfo )
       {
        VIDEOINFOHEADER *pvi = (VIDEOINFOHEADER*) pmt->pbFormat;
        if (mode == "MEDIASUBTYPE_RGB24" )
         pmt->subtype = MEDIASUBTYPE_RGB24;
        else if (mode == "MEDIASUBTYPE_YUY2" )
         pmt->subtype = MEDIASUBTYPE_YUY2;

        pvi->AvgTimePerFrame = (LONGLONG)( 10000000 / fr );
        pvi->bmiHeader.biWidth  = iiw;
        pvi->bmiHeader.biHeight = iih;
        pvi->bmiHeader.biSizeImage = DIBSIZE(pvi->bmiHeader);
        hr = pVSC->SetFormat(pmt);
       }
       FreeMediaType(*pmt);
      }
      SAFE_RELEASE( pVSC );
    }
    }

    HRESULT UsbCamera::GrabByteFrame()
    {
    HRESULT hr;
    long    size = 0;
    long evCode;

    hr = pMediaEvent->WaitForCompletion(100000, &evCode); // 100 s timeout; pass INFINITE to block forever

    #if SEND_WORK_STATE
    if( evCode == EC_COMPLETE )
      pMediaControl->Pause();
    else if( FAILED(hr) || evCode <= 0 )
    {
      if( !bnotify )
      {
       bnotify = true;
       bisValid = false;
       return E_FAIL;
      }
    }
    #endif

    pSampleGrabber->GetCurrentBuffer(&size, NULL);

    // use YUV422 format
    #if USE_YUV422_FORMAT
    // if buffer is not the same size as before, create a new one
    if( size != bufferSize )
    {
      if( pBuffer )
       delete[] pBuffer;
      bufferSize = size;
      pBuffer = new long[bufferSize];
      if( pBYTEbuffer )
       delete[] pBYTEbuffer;

      pBYTEbuffer = new BYTE[bufferSize*2];
      memset(pBYTEbuffer,0,sizeof(BYTE)*bufferSize*2);
    }

    pSampleGrabber->GetCurrentBuffer(&size, pBuffer);

    const BYTE* pSrc = (BYTE*) pBuffer;
    const BYTE* pSrcEnd = pSrc + (width*height*2);
    BYTE* pDest = pBYTEbuffer;

    // Swap each pair of bytes in the YUYV stream
    while (pSrc < pSrcEnd)
    {
      for (int i = 0; i < width; i++)
      {
       BYTE temp  = *(pSrc++);
       BYTE temp2 = *(pSrc++);
       *(pDest++) = temp2;
       *(pDest++) = temp;
      }
    }
    #endif

    #if USE_RGB24_FORMAT
    // use RGB format
    if (size != bufferSize)
    {
      if (pBuffer)
       delete[] pBuffer;
      bufferSize = size;
      pBuffer = new long[bufferSize];
      if(pBYTEbuffer)
       delete[] pBYTEbuffer;
      pBYTEbuffer = new BYTE[bufferSize*3];
    }

    pSampleGrabber->GetCurrentBuffer(&size, pBuffer);

    BYTE* pDest = pBYTEbuffer;
    BYTE *pBYTETemp = pBYTEbuffer;
    const BYTE* pSrc = (BYTE*) pBuffer;


    // RGB24 DIB rows are padded up to a 4-byte boundary; this is the padding per row
    const ULONG remainder = ((width*3+3) & ~3) - width*3;

    for (int i = 0; i < height; i++)
    {
      // DIB data is bottom-up: source row i goes to output row (height-1-i)
      pDest = pBYTETemp + (height-1-i) * width * 3;
      for (int j = 0; j < width; j++)
      {
       const BYTE blue  = *(pSrc++);
       const BYTE green = *(pSrc++);
       const BYTE red   = *(pSrc++);

       *(pDest++) = red;
       *(pDest++) = green;
       *(pDest++) = blue;
      }
      pSrc += remainder; // skip the row padding once per row, not per pixel
    }
    #endif
    #endif

    return S_OK;
    }
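    The remainder expression in the RGB24 branch accounts for each row of a Windows DIB being padded up to a 4-byte boundary. A quick sketch of that stride arithmetic:

    ```python
    def dib_stride(width: int, bytes_per_pixel: int = 3) -> int:
        """Bytes per DIB row, rounded up to the next 4-byte boundary."""
        return (width * bytes_per_pixel + 3) & ~3

    def row_padding(width: int) -> int:
        """Padding bytes at the end of each RGB24 row (the 'remainder' in the C++ code)."""
        return dib_stride(width) - width * 3

    # 320 px * 3 bytes = 960, already a multiple of 4, so no padding is added
    ```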

    ImageBuffer UsbCamera::getImage()
    {
    HRESULT hr = S_OK;
    hr = GrabByteFrame();

    if(FAILED(hr))
      ErrMsg(TEXT("UsbCamera disconnect!"));

    const BYTE* pSrc = GetByteBuffer();

    #if USE_YUV422_FORMAT
    m_buffer = ImageBuffer( width,height,ImageBuffer::FORMAT_YUV422,
      (unsigned char*)pSrc,(int)(width*height*2.));
    #endif

    #if USE_RGB24_FORMAT
    m_buffer = ImageBuffer( width,height,ImageBuffer::FORMAT_RGB,
      (unsigned char*)pSrc,(int)(width*height*3.));
    #endif

    pMediaControl->Run();

    return m_buffer;
    }

    UsbCamera* UsbCamera::getCamera(int port, int framerate /* = 30 */, int width /* = 320 */,
    int height /* = 240  */,string mode)
    {
    if (m_camera == NULL)
    {
      m_camera = new UsbCamera();
      m_camera->Init(port,false,framerate,width,height,mode);
      m_camera->bisValid = true;
    }
    return m_camera;
    }

    void UsbCamera::destroyUsbCamera()
    {
    if (m_camera)
    {
      delete m_camera;
      m_camera = NULL;
    }
    }

    void UsbCamera::FreeMediaType(AM_MEDIA_TYPE& mt)
    {
    if (mt.cbFormat != 0)
    {
      CoTaskMemFree((PVOID)mt.pbFormat);
      mt.cbFormat = 0;
      mt.pbFormat = NULL;
    }
    if (mt.pUnk != NULL)
    {
      mt.pUnk->Release();
      mt.pUnk = NULL;
    }
    }

    int UsbCamera::getWidth() const
    {
    return width;
    }

    int UsbCamera::getHeight() const
    {
    return height;
    }

    void UsbCamera::ErrMsg(LPTSTR szFormat,...)
    {
    static TCHAR szBuffer[2048]={0};
    const size_t NUMCHARS = sizeof(szBuffer) / sizeof(szBuffer[0]);
    const int LASTCHAR = NUMCHARS - 1;
    va_list pArgs;
    va_start(pArgs, szFormat);
    _vsntprintf(szBuffer, NUMCHARS - 1, szFormat, pArgs);
    va_end(pArgs);

    szBuffer[LASTCHAR] = TEXT('\0');
    ///MessageBox(NULL, szBuffer, "UsbCamera Error",
    // MB_OK | MB_ICONEXCLAMATION | MB_TASKMODAL);
    }

    void UsbCamera::DisplayFilterProperties()
    {
    ISpecifyPropertyPages* pProp;

    HRESULT hr = pDeviceFilter->QueryInterface(IID_ISpecifyPropertyPages, (void**)&pProp);
    if (SUCCEEDED(hr))
    {
      FILTER_INFO FilterInfo;
      hr = pDeviceFilter->QueryFilterInfo(&FilterInfo);
      IUnknown* pFilterUnk;
      pDeviceFilter->QueryInterface(IID_IUnknown,(void**)&pFilterUnk);

      CAUUID caGUID;
      pProp->GetPages(&caGUID);
      pProp->Release();
      OleCreatePropertyFrame(
       NULL,
       0,
       0,
       FilterInfo.achName,
       1,
       &pFilterUnk,
       caGUID.cElems,
       caGUID.pElems,
       0,
       0,
       NULL);
      pFilterUnk->Release();
      FilterInfo.pGraph->Release();
      CoTaskMemFree(caGUID.pElems);
    }

    }

    //==========================================main.cpp===========================================

    #include "UsbCamera.h"
    // OpenCV 1.x C API headers used by the display code below
    #include <opencv/cv.h>
    #include <opencv/highgui.h>

    int main(int argc, char** argv)
    {

    cvNamedWindow( "Motion", 1 );
    UsbCamera* pCamera = UsbCamera::getCamera(0,30,480,240,"MEDIASUBTYPE_RGB24");
    if(NULL == pCamera)
    {
      return 0;
    }
    ImageBuffer buffer = pCamera->getImage();
    IplImage* pImage = cvCreateImage(cvSize(buffer.width,buffer.height),8,3);
    for(;;)
    {
     
      buffer = pCamera->getImage();
      for(int i=0;i<pImage->height;i++)
      {
       uchar* data=(uchar*)pImage->imageData + pImage->widthStep * i;
       for(int j=0;j<pImage->width;j++)
       {
        data[3*j] = buffer.buffer[3*(i*buffer.width+j)+2];
        data[3*j+1] = buffer.buffer[3*(i*buffer.width+j)+1];
        data[3*j+2] = buffer.buffer[3*(i*buffer.width+j)+0];
       }
      }
      cvShowImage( "Motion", pImage );

      if( cvWaitKey(10) >= 0 )
       break;
    }
    return 0;
    }

  • Earlier posts covered capturing BMP images and YUV video from a USB camera; display just means drawing the BMP on the LCD. The camera handling is largely the same as before; if unfamiliar, see https://github.com/zhangdalei/video_lcd_show. ...

    Earlier posts covered capturing BMP images and YUV video from a USB camera; display is then just a matter of drawing the BMP on the LCD. The camera handling is largely the same as before; if you are unfamiliar with it, see
    http://blog.csdn.net/zhangdaxia2/article/details/72763847. Framebuffer handling is not hard either; the details follow:
    The framebuffer device node is /dev/fbx, typically /dev/fb0, one node per LCD controller. The steps:

    1. Open the framebuffer.

    int fb_fd = -1;
    fb_fd = open("/dev/fb0", O_RDWR);

    2. Get the LCD parameters with ioctl.

    static struct fb_fix_screeninfo fixinfo;
    static struct fb_var_screeninfo varinfo;
    ret = ioctl(fb_fd, FBIOGET_VSCREENINFO, &varinfo);
    ret = ioctl(fb_fd, FBIOGET_FSCREENINFO, &fixinfo);

    3. mmap the framebuffer memory, mapping the kernel address into user space; writing data to the mapped address puts it on the LCD.

    fbmmap = (int*)mmap(NULL, fixinfo.smem_len, PROT_READ | PROT_WRITE, MAP_SHARED,
            fb_fd, 0);
    if(MAP_FAILED == fbmmap){
            printf("mmap memory error.\n");
            exit(0);
        }
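    With the xres and bits_per_pixel values obtained from FBIOGET_VSCREENINFO, the byte offset of a given pixel inside the mapped region can be computed as follows (a sketch assuming a linear framebuffer with no per-line padding; real drivers may report a larger line_length in the fixed-info struct):

    ```python
    def fb_pixel_offset(x: int, y: int, xres: int, bits_per_pixel: int) -> int:
        """Byte offset of pixel (x, y) in a linear framebuffer with no line padding."""
        return (y * xres + x) * (bits_per_pixel // 8)

    # hypothetical 800x480 panel at 32 bpp: pixel (10, 2) sits 6440 bytes into the map
    ```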

    4. Write pixel data to the mapped address to draw into the framebuffer.

    void fb_drawbmp(uint height,uint width,unsigned char* pdata)
    {
        uint x,y,cnt;
        uint cnt1 = 0;
        uint* p = (uint*)fbmmap;
        for(x=0; x<height; x++)
            for(y=0; y<width; y++)
            {
                cnt = x*(varinfo.xres)+y;       // pixel position on the LCD
                cnt1 = (x*width+width-y)*3;     // pixel position in the image; width-y mirrors it to fix the left/right flip
                *(p+cnt) = (pdata[cnt1]<<0) | (pdata[cnt1+1]<<8) | (pdata[cnt1+2]<<16);
            }
    }
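    The expression written into *(p+cnt) above packs three image bytes into one 32-bit framebuffer word, with the first byte landing in the low (blue) position — i.e. XRGB8888 ordering. The same packing in Python:

    ```python
    def pack_xrgb(b: int, g: int, r: int) -> int:
        """Pack 8-bit B, G, R values into one 32-bit XRGB8888 word,
        matching b | (g << 8) | (r << 16) in fb_drawbmp."""
        return (b & 0xFF) | ((g & 0xFF) << 8) | ((r & 0xFF) << 16)
    ```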

    5. Put fb_drawbmp() inside the camera read function readfram(): convert the captured data to RGB and write it straight into the framebuffer, and the camera image appears on the LCD.

    static int readfram()
    {
        struct pollfd pollfd;
        int ret, i;
        char filename[50];
        for(i = 0; i < 10; i++){
            memset(&pollfd, 0, sizeof(pollfd));
            pollfd.fd = fd;
            pollfd.events = POLLIN;
            ret = poll(&pollfd, 1, 800);
            if(-1 == ret){
                perror("poll fail.\n");
                return -1;
            }else if(0 == ret){
                printf("poll time out\n");
                continue;
            }
            if(pollfd.revents & POLLIN){
                memset(&buf, 0, sizeof(buf));
                buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                buf.memory = V4L2_MEMORY_MMAP;
                ret = ioctl(fd, VIDIOC_DQBUF, &buf);
                if(0 != ret){
                    perror("VIDIOC_DQBUF fail.\n");
                    return -1;
                }
                // To save the raw YUYV data directly:
                //FILE *file = fopen(YUYV_FILE, "wb");
                //ret = fwrite((char*)framebuf[buf.index].start, 1, buf.length, file);
                //fclose(file);

                // Convert to RGB
                starter = (unsigned char*)framebuf[buf.index].start;
                newBuf = (unsigned char*)calloc((unsigned int)(framebuf[buf.index].length*3/2), sizeof(unsigned char));
                yuv422_2_rgb();
                // Write the RGB data into the framebuffer to show it on the LCD
                fb_drawbmp(height, width, newBuf);

                ret = ioctl(fd, VIDIOC_QBUF, &buf);
            }
        }
        return ret;
    }
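    The yuv422_2_rgb() helper is not shown in this excerpt. A minimal sketch of what such a YUYV-to-RGB conversion does, using fixed-point BT.601 coefficients (the actual helper in the linked repository may differ in detail):

    ```python
    def clamp(v: int) -> int:
        """Clamp a value to the 0..255 byte range."""
        return 0 if v < 0 else 255 if v > 255 else v

    def yuyv_to_rgb(data: bytes, width: int, height: int) -> bytearray:
        """Convert packed YUYV (YUV 4:2:2) to RGB24.
        Each 4-byte group Y0 U Y1 V yields two RGB pixels sharing U and V."""
        out = bytearray(width * height * 3)
        o = 0
        for i in range(0, width * height * 2, 4):
            y0, u, y1, v = data[i], data[i+1] - 128, data[i+2], data[i+3] - 128
            for y in (y0, y1):
                out[o]     = clamp(y + ((91881 * v) >> 16))              # R = Y + 1.402 V
                out[o + 1] = clamp(y - ((22554 * u + 46802 * v) >> 16))  # G = Y - 0.344 U - 0.714 V
                out[o + 2] = clamp(y + ((116130 * u) >> 16))             # B = Y + 1.772 U
                o += 3
        return out
    ```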

    Because Ubuntu and Android keep redrawing the screen, running this on those systems may not give the expected result. On a plain Linux development board it runs normally and shows the camera image.
    The complete code, including the USB camera handling, can be downloaded from:
    https://github.com/zhangdalei/video_lcd_show

  • A LabVIEW sub-VI for capturing images from a USB camera; it can also capture from the built-in camera.
  • Real-time camera capture and frame saving with Python + OpenCV

    1. Use the laptop's built-in camera, or an external USB camera on a desktop, to show live video and save every captured frame to a given folder (directory).

    Note: this generates a lot of files; when capturing for a long time, remember to delete the images you do not need.

    The r prefix on the path (and the doubled trailing backslash) are there to handle escaping, so the path is not mangled by escape sequences.

    Windows7+Python3.x+OpenCV
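    A quick illustration of why the r prefix matters for Windows paths (illustrative strings only):

    ```python
    # In a normal string, \t and \n are turned into tab and newline characters;
    # the r prefix keeps every backslash literal.
    plain = "C:\temp\note"
    raw   = r"C:\temp\note"

    assert "\t" in plain and "\n" in plain   # escapes were interpreted
    assert raw.count("\\") == 2              # backslashes survived intact
    ```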

    import cv2
    # Open the laptop's built-in camera
    cap = cv2.VideoCapture(0)
    i = 0
    # could also be written as while True
    while(1):
        """
        ret: True or False, whether a frame was read
        frame: the captured frame
        """
        ret,frame = cap.read()
        # Show the frame
        cv2.imshow('capture',frame)
        # Save the frame
        cv2.imwrite(r"E:\Test\\"+ str(i) + ".jpg",frame)
        i = i + 1
        """
           cv2.waitKey(1): refreshes the display and returns the code of the currently pressed key
           0xFF: a bit mask; after masking, the value can be compared against the expected key code
           ord('q'): the Unicode code point of 'q' (113)
        """
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
    # Release the capture object and destroy the windows
    cap.release()
    cv2.destroyAllWindows()
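    Why the & 0xFF: on some platforms waitKey returns a value with extra high bits set, so only the low byte is compared against the key code. A standalone illustration with a hypothetical raw return value:

    ```python
    key = 0x100071                 # hypothetical waitKey return with high bits set
    assert key != ord('q')         # the raw value does not match directly
    assert key & 0xFF == ord('q')  # but its low byte is 0x71 == 113
    ```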
    

     

    2. Python-OpenCV: detect faces in real time and save the cropped snapshots to a directory.

    Windows7+Python3.x+OpenCV

    Download link for the face-detection file haarcascade_frontalface_default.xml:  https://download.csdn.net/download/GGY1102/18830646

    # Real-time face detection from the camera with OpenCV + Python, saving crops to the project directory
    import cv2
    cap = cv2.VideoCapture(0)
    face = cv2.CascadeClassifier('E:/haarcascade_frontalface_default.xml')
    while(1):
        ret,frame = cap.read()
        # OpenCV frames are BGR, so convert with COLOR_BGR2GRAY
        gray = cv2.cvtColor(frame,cv2.COLOR_BGR2GRAY)
        faces = face.detectMultiScale(gray,1.1,3,0,(200,200))
        for (x,y,w,h) in faces:
            img = cv2.rectangle(frame,(x,y),(x+w,y+h),(255,255,0),2)
            gray_roi = gray[y:y+h,x:x+w]   # slice the width with w, not h
            cv2.imwrite('face.jpg',gray_roi)
            print('Image saved!!')
        cv2.imshow('camera',frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
    cap.release()
    cv2.destroyAllWindows()
    
    


     

  • Capturing MJPG-format data from a USB camera with V4L2 (usb camera)
  • An image-processing toolbox built with VC++ 6.0; powerful, and can capture live images from a USB camera.
  • An OpenCV example that reads two external USB cameras at the same time (camera index 0 is the laptop's built-in camera; 1 and 2 select the two external ones):
    #include <iostream>
    #include <opencv2/opencv.hpp>

    using namespace std;
    using namespace cv;

    int main()
    {
        VideoCapture cap2(2); // index 0 is the built-in camera; 1 and 2 are the two external USB cameras
        VideoCapture cap1(1);

        // Set the resolution
        cap2.set(CV_CAP_PROP_FRAME_WIDTH,640);
        cap2.set(CV_CAP_PROP_FRAME_HEIGHT,480);

        cap1.set(CV_CAP_PROP_FRAME_WIDTH,640);
        cap1.set(CV_CAP_PROP_FRAME_HEIGHT,480);

        Mat img1;
        Mat img2;

        while(cap2.read(img2) && cap1.read(img1))
        {
            imshow("img1", img1);
            imshow("img2", img2);
            char c = waitKey(1);
            if(c == 'q' || c == 'Q') // press q to quit
            {
                break;
            }
        }
        return 0;
    }
    

    Test result
    [screenshot: the two camera windows]
    Both camera feeds display normally, with no stuttering.

  • Captures USB camera data on Ubuntu 16.04: the program reads /dev/video0 through the FFmpeg APIs; the camera is a Logitech C270i and the captured image format is yuyv422
  • Raspberry Pi with a USB 2.0 camera; clients connect to the Pi (the server) over Ethernet or Wi-Fi. Completed: fire alarm, visitor chime, intrusion alarm; DHT11 temperature/humidity reading with validation; multi-connection support via pthread and mutexes; basic Qt UI, video, logging. This project implements...
  • Because /dev/video0 on an RK3399Pro + Fedora system was occupied and python3 + opencv could not read the camera, V4L2 was used to drive the USB camera instead
  • Verified hands-on; clear step-by-step instructions, useful for USB camera video capture under Linux
  • Capturing from a USB camera on Linux and saving BMP images

    Viewed 1,000+ times 2017-05-26 10:09:46
    Camera access is built on the system calls V4L2 provides, roughly as follows: 1. open the device with open; 2. query the device's information and capabilities, using struct v4l2_capability and the ioctl command VIDIOC_QUERYCAP: struct v4l2_capability { __u8 driver[16]...
  • Zynq-7000 study notes (8) — USB camera image capture (attachment)
  • Grabs USB camera images in Python, including: 1. drawing text on the original frame; 2. converting to grayscale; 3. binarization; 4. saving locally.
  • A self-designed LabVIEW program that captures images from four cameras; the final version after several revisions, built around a state machine, with the error-exit prompt fixed
  • USB camera capture on embedded Linux: image formats, saving pictures, and displaying them on an LCD.
  • Placeholder: USB camera image capture and saving
  • FPGA-based image capture with LCD display, using a 5-megapixel sensor
  • Aimed at image capture under embedded Linux and display on an embedded development board, this article studies video... It achieves USB camera image capture on an embedded Linux platform, display on the board, and simple operations on the displayed image.
  • Connects the system to an external USB stereo camera module, with the live view shown in real time... Note: this requires an external USB stereo camera; for a different camera, change the camera name in the code (here the name is 3D_SHOP).
  • A program that captures images from a USB camera using video4linux2 (note: not video4linux); source found online, ran successfully, sharing the process. Compile directly with gcc -o test test.c. Arguments used when running: ./...
  • printf("Failed to open camera: dev_number=%d\n", dev_number); return -1; } cv::namedWindow("camera_setting"); cv::createTrackbar("fps", "camera_setting", &fps, 100, on_FPS_trackbar); cv::...
  • Placeholder: USB camera image capture and saving. Reposted from: https://www.cnblogs.com/vacajk/p/6165592.html
  • USB camera image capture on an ARM920T under Linux
  • Zynq-7000 study notes (16) — capturing USB camera images with OpenCV

    Viewed 1,000+ times Popular discussion 2016-09-21 19:21:37
    PC platform: Windows 10 64-bit + an Ubuntu 14.04 virtual machine. Xilinx design suite: Xilinx_vivado_sdk... USB camera: Logitech C270 (720p). Linux source: 2016_R1. Linaro rootfs: linaro-vivid-developer-20150618-705.tar.gz Q
