  • AndroidUSBCamera  AndroidUSBCamera is developed based on saki4510t/UVCCamera; it provides a high-level wrapper around USB Camera (UVC device) handling and video data acquisition, and it can help...
  • MTK USB camera patch

    2018-12-11 18:14:23
    An MTK patch implementing UVC camera support: preview, photo capture, and video recording with USB cameras. You can verify it with a USB camera APP. The patch targets the MTK platform and was released via the official site.
  • USB Camera remote monitoring

    2011-09-20 13:45:42
    "监控号" is an intelligent USB camera monitoring platform... USBCamera is the controlled-side camera program; once installed it runs as a system service. As long as the controlled side is on the Internet, the user can monitor it anytime, anywhere.
  • Android USB Camera source code

    2014-05-27 14:56:24
    This resource is a copy of Android USB camera source code downloaded from a website. It tested OK on my Exynos4412 development board, but preview only, and I hard-coded the device file in ImageProc.c to point at my USB camera's device node. Build environment: ubuntu + ndk_build + eclipse ...
  • AndroidUSBCamera usage steps

    2020-03-10 11:12:21

    1. Clone: https://github.com/jiangdongguo/AndroidUSBCamera

    2. After downloading, unzip and integrate the modules libusbcamera and libutils into your own project: copy them into the project root directory, then configure:

    In settings.gradle, append   , ':libusbcamera', ':libutils'

    In the app module's build.gradle, add   implementation project(':libusbcamera')

    In the project-level build.gradle:

    allprojects {
        repositories {
            jcenter()
            google()
            maven { url 'https://jitpack.io' }
            maven { url 'https://raw.githubusercontent.com/saki4510t/libcommon/master/repository/' }
    
        }
    }

    Enable the required permissions in AndroidManifest.xml (on Android 6.0+ the dangerous permissions must also be requested at runtime):

        <uses-permission android:name="android.permission.RECORD_AUDIO"/>
        <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
        <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
    
        <uses-feature android:name="android.hardware.usb.host"/>

    Configure the project's NDK, then sync the project.

    3. MainActivity

    public class MainActivity extends AppCompatActivity implements CameraViewInterface.Callback {
    
        private final String TAG = MainActivity.class.getSimpleName();
    
        public View mTextureView;
        private UVCCameraHelper mCameraHelper;
        private CameraViewInterface mUVCCameraView;
    
        private boolean isRequest = false;
        private boolean isPreview = false;
        private boolean isRecording = false;
        private UVCCameraHelper.OnMyDevConnectListener listener = new UVCCameraHelper.OnMyDevConnectListener() {

            @Override
            public void onAttachDev(UsbDevice device) {
                // request open permission
                Log.d(TAG, "camera: usb device " + device.getProductName() + " attached");
                if (mCameraHelper == null || mCameraHelper.getUsbDeviceCount() == 0) {
                    showShortMsg("No USB camera device detected");
                    return;
                }
                List<UsbDevice> devList = mCameraHelper.getUsbDeviceList();
                /*
                 * When a USB device attaches, check whether it is our camera and, if so, open it.
                 * This implements hot-plugging: each replug increments the device id, so its
                 * address changes; the id resets when the machine reboots. getProductName()
                 * returns the camera's name.
                 */
                if (!isRequest) {
                    for (int i = 0; i < devList.size(); i++) {
                        UsbDevice _device = devList.get(i);
                        String name = _device.getProductName();
                        if (name != null && name.contains("camera")) { // getProductName() may return null
                            isRequest = true;
                            mCameraHelper.requestPermission(i); // open the USB camera
                        }
                    }
                }
            }
    
            @Override
            public void onDettachDev(UsbDevice device) {
                // close camera
                Log.d(TAG, "camera: usb device " + device.getProductName() + " detached");
                if (isRequest) {
                    isRequest = false;
                    mCameraHelper.closeCamera();
                    showShortMsg(device.getProductName() + " unplugged");
                }
            }

            @Override
            public void onConnectDev(UsbDevice device, boolean isConnected) {
                Log.d(TAG, "camera: usb device " + device.getProductName() + " connect result: " + isConnected);
                if (!isConnected) {
                    showShortMsg("Connection failed; check whether the resolution parameters are correct");
                    isPreview = false;
                } else {
                    isPreview = true;
                    showShortMsg("USB device connected");
                    // need to wait for UVCCamera to finish initializing
                    Log.d(TAG, "camera is connected");
                }
            }

            @Override
            public void onDisConnectDev(UsbDevice device) {
                Log.d(TAG, "camera: usb disconnecting");
                showShortMsg("USB device disconnected");
            }
        };
    
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_main);
            // step.1 initialize UVCCameraHelper
            mTextureView = findViewById(R.id.camera_view);
            mUVCCameraView = (CameraViewInterface) mTextureView;
            mUVCCameraView.setCallback(this);
            mCameraHelper = UVCCameraHelper.getInstance();
            mCameraHelper.setDefaultFrameFormat(UVCCameraHelper.FRAME_FORMAT_YUYV);
            /*
             * The initial preview size must be one the device supports,
             * otherwise the camera will not work properly.
             */
            mCameraHelper.setDefaultPreviewSize(640, 480);
            mCameraHelper.initUSBMonitor(this, mUVCCameraView, listener);
            mCameraHelper.setOnPreviewFrameListener(new AbstractUVCCameraHandler.OnPreViewResultListener() {
                int printNum = 0;

                @Override
                public void onPreviewResult(byte[] nv21Yuv) {
                    // log every 300th frame to confirm the preview callback is alive
                    printNum++;
                    if (printNum == 300) {
                        printNum = 0;
                        Log.d(TAG, "onPreviewResult: " + nv21Yuv.length + " bytes, previewing");
                    }
                }
            });
        }
    
        // recording (prefs is a SharedPreferences instance initialized elsewhere)
        private void cameraRecording(boolean isStartRecording, String name) {
            isRecording = isStartRecording;
            if (mCameraHelper == null || !mCameraHelper.isCameraOpened() || !isPreview) {
                showShortMsg("Camera error; plug it into another port and restart the app");
                return;
            }
            String orderRecordStr = prefs.getString(Config.ORDER_RECORDING, "");
            Log.d(TAG, "OrderRecorde1=" + orderRecordStr);
            if (!mCameraHelper.isPushing() && isStartRecording) {
                // set the file path yourself
                String videoPath = Config.VIDEO_DIRECTORY + "/" + name;
                orderRecordStr = orderRecordStr + "&" + name;
                prefs.edit().putString(Config.ORDER_RECORDING, orderRecordStr).apply();
                RecordParams params = new RecordParams();
                params.setRecordPath(videoPath);
                params.setRecordDuration(0);    // auto-divide saved files; default 0 means not divided
                params.setVoiceClose(true);     // mute audio
                params.setSupportOverlay(true); // overlay only supports armeabi-v7a & arm64-v8a
                mCameraHelper.startPusher(params, new AbstractUVCCameraHandler.OnEncodeResultListener() {
                    @Override
                    public void onEncodeResult(byte[] data, int offset, int length, long timestamp, int type) {
                        // type = 1, h264 video stream
                        if (type == 1) {
                            FileUtils.putFileStream(data, offset, length);
                        }
                        // type = 0, aac audio stream
                        if (type == 0) {

                        }
                    }

                    @Override
                    public void onRecordResult(String videoPath) {
                        if (TextUtils.isEmpty(videoPath)) {
                            return;
                        }
                        new Handler(getMainLooper()).post(() -> Toast.makeText(MainActivity.this, "save videoPath:" + videoPath, Toast.LENGTH_SHORT).show());
                    }
                });
                // if you only want to push the stream, call it like this:
                // mCameraHelper.startPusher(listener);
                showShortMsg("Started recording video");
            } else if (mCameraHelper.isPushing() && !isStartRecording) {
                FileUtils.releaseFile();
                mCameraHelper.stopPusher();
                showShortMsg("Stopped recording video");
                String[] orderRecordArr = orderRecordStr.split("&");
                if (orderRecordArr.length > 5) {
                    String order = orderRecordArr[1];
                    String filePath = Config.VIDEO_DIRECTORY + "/" + order + ".mp4";
                    deleteFile(filePath);
                    StringBuilder rest = new StringBuilder();
                    for (int i = 0; i < orderRecordArr.length; i++) {
                        // use equals(), not !=, to compare string contents
                        if (!orderRecordArr[i].equals(order) && orderRecordArr[i].length() > 0)
                            rest.append("&").append(orderRecordArr[i]);
                    }
                    prefs.edit().putString(Config.ORDER_RECORDING, rest.toString()).apply();
                    Log.d(TAG, "OrderRecorde=" + prefs.getString(Config.ORDER_RECORDING, ""));
                }
            }
        }
    
        // delete a file
        public boolean deleteFile(String filePath) {
            File file = new File(filePath);
            if (!file.exists()) return true;       // nothing to delete
            return file.isFile() && file.delete(); // only delete regular files
        }
        @Override
        public void onResume() {
            super.onResume();
            // resume the camera preview
            if (mUVCCameraView != null) mUVCCameraView.onResume();
        }
    
        @Override
        protected void onStart() {
            super.onStart();
            // step.2 register USB event broadcast
            if (mCameraHelper != null) {
                mCameraHelper.registerUSB();
            }
        }
    
        @Override
        protected void onStop() {
            super.onStop();
            // step.3 unregister USB event broadcast
            if (mCameraHelper != null) {
                mCameraHelper.unregisterUSB();
            }
        }
    
        @Override
        protected void onPause() {
            super.onPause();
            
            if (mUVCCameraView != null) mUVCCameraView.onPause();
        }
    
        private void showShortMsg(String msg) {
            Toast.makeText(this, msg, Toast.LENGTH_SHORT).show();
        }
    
        @Override
        public USBMonitor getUSBMonitor() {
            return mCameraHelper.getUSBMonitor();
        }
    
        @Override
        public void onDialogResult(boolean canceled) {
    
        }
    
        public boolean isCameraOpened() {
            return mCameraHelper.isCameraOpened();
        }
    
        @Override
        public void onSurfaceCreated(CameraViewInterface view, Surface surface) {
            isPreview = false;
            new Thread(new Runnable() {
                @Override
                public void run() {
                    // wait for camera created
                    try {
                        Thread.sleep(1000);
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                    Log.d(TAG, "camera: surface start preview " + isPreview + "  " + isCameraOpened());
                    if (!isPreview && isCameraOpened()) {
                        mCameraHelper.startPreview(mUVCCameraView);
                        isPreview = true;
                        Log.d(TAG, "camera: surface start preview");
                    }
                }
            }).start();
        }
    
        @Override
        public void onSurfaceChanged(CameraViewInterface view, Surface surface, int width, int height) {
    
        }
    
        @Override
        public void onSurfaceDestroy(CameraViewInterface view, Surface surface) {
            if (isPreview && isCameraOpened()) {
                mCameraHelper.stopPreview();
                Log.d(TAG, "surface is destroyed");
            }
            isPreview = false;
        }
    
        @Override
        protected void onDestroy() {
            super.onDestroy();
            FileUtils.releaseFile();
            // step.4 release uvc camera resources
            if (mCameraHelper != null) {
                mCameraHelper.release();
                Log.d(TAG, "camera is release");
            }
            isPreview = false;
            isRequest = false;
        }
    
    }
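    The "&"-joined order list that cameraRecording() maintains can be exercised as a small pure-Java sketch (the class and method names below are hypothetical, shown for illustration); note that Java string contents must be compared with equals(), never with != as in the original code:

    ```java
    // Hypothetical helper mirroring the list pruning in cameraRecording():
    // drop one order name from an "&"-joined list and re-join the rest.
    public class OrderRecordList {
        public static String remove(String joined, String order) {
            StringBuilder rest = new StringBuilder();
            for (String entry : joined.split("&")) {
                // equals() compares contents; != only compares object identity
                if (!entry.equals(order) && entry.length() > 0) {
                    rest.append("&").append(entry);
                }
            }
            return rest.toString();
        }
    }
    ```

    Keeping this logic out of the Activity makes it testable on a plain JVM, without a device.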

    activity_main.xml      

    <com.serenegiant.usb.widget.UVCCameraTextureView
        android:id="@+id/camera_view"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_gravity="center"/>

    4. Notes

    (1) mCameraHelper.requestPermission(int index) opens the USB device. Some USB devices are not cameras, so filter by device name to open only the specific USB camera you want; hot-plug display is supported.

    (2) When the app is closed or sent to the background, isPreview must be reset to false; otherwise the preview stays blank when the app is re-entered, because startPreview is never executed.

    (3) After the device reboots, the first launch of the app may show no preview image even though recording actually works; re-entering the app fixes it.
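    The name filtering from note (1) can be factored into a small plain-Java helper so it can be unit-tested off-device; the class and method names here are hypothetical, not part of the UVCCameraHelper API:

    ```java
    // Hypothetical helper for note (1): decide whether a USB product name
    // looks like a camera before calling mCameraHelper.requestPermission(index).
    public class UsbDeviceFilter {
        public static boolean isCameraDevice(String productName) {
            // UsbDevice.getProductName() can return null on some devices/ROMs
            return productName != null && productName.toLowerCase().contains("camera");
        }
    }
    ```

    Matching case-insensitively avoids missing devices that report, say, "USB Camera" rather than "camera".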

  • AndroidUsbCamera build problem

    2013-05-26 09:54:09

    System version:

        Ubuntu 12.04LTS

    After downloading the AndroidUsbCamera source, cmake completes normally, but make fails:

    eric@eric-Lenovo-3000-G430:~/work/AndroidUsbCamera/AndroidUsbCamera/build$ make
    [ 11%] Built target qtInterfaces_lib
    [ 17%] Building CXX object src/CMakeFiles/AndroidUsbCameraStaticLib.dir/framesconverter.cpp.o
    /home/eric/work/AndroidUsbCamera/AndroidUsbCamera/src/framesconverter.cpp:32:28: fatal error: linux/videodev.h: No such file or directory
    compilation terminated.
    make[2]: *** [src/CMakeFiles/AndroidUsbCameraStaticLib.dir/framesconverter.cpp.o] Error 1
    make[1]: *** [src/CMakeFiles/AndroidUsbCameraStaticLib.dir/all] Error 2
    make: *** [all] Error 2
    eric@eric-Lenovo-3000-G430:~/work/AndroidUsbCamera/AndroidUsbCamera/build$ sudo make
    [ 11%] Built target qtInterfaces_lib
    [ 17%] Building CXX object src/CMakeFiles/AndroidUsbCameraStaticLib.dir/framesconverter.cpp.o
    /home/eric/work/AndroidUsbCamera/AndroidUsbCamera/src/framesconverter.cpp:32:28: fatal error: linux/videodev.h: No such file or directory


    Solution: the V4L1 header linux/videodev.h was removed from modern kernel headers; comment out that include and use libv4l1-videodev.h instead (provided by the libv4l development package, libv4l-dev on Ubuntu):

     28 extern "C" {
     29 #include <libavcodec/avcodec.h>
     30 #include <libavformat/avformat.h>
     31 #include <libswscale/swscale.h>
     32 }
     33 //#include "linux/videodev.h"
     34 #include <libv4l1-videodev.h>
     35 #include <sys/ioctl.h>
     36 #include "frame.h"
     37 #include <QCoreApplication>
     38 #include <qstringlist.h>



  • usbcamera.apk

    2015-03-14 10:04:43
    An Android application that connects a USB camera to a smartphone or tablet device and displays and records video.
  • Android USB Camera .zip

    2019-12-24 14:35:57
    This demo uses an external USB camera rather than the device's built-in camera; download as needed. The demo can take photos, record video, and adjust the resolution.
  • AndroidUSBCamera.zip

    2019-06-03 17:57:13
    This demo uses an external USB camera rather than the device's built-in camera; download as needed. The demo can take photos, record video, and adjust the resolution.
  • Displaying ROS usb camera data in the web

    In the previous article we implemented a ROS node that reads the usb camera and publishes the data as a topic. Next we display the usb camera data in the web, as follows:

    1. Install the dependency async-web-server

    sudo apt-get install ros-kinetic-async-web-server-cpp

    2. Clone the ROS project repository

    git clone https://github.com/RobotWebTools/web_video_server

    3. Build and run

    mkdir src
    mv web_video_server src/
    cd ../
    catkin build
    roscore
    # if anything about the usb_cam node is unclear, see: https://blog.csdn.net/pengrui18/article/details/88958487
    roslaunch usb_cam usb_cam-test.launch
    # start the web video server
    rosrun web_video_server web_video_server

    4. Display in the web

    Open a browser and enter:
    http://localhost:8080/stream?topic=/usb_cam/image_raw
    Here /usb_cam/image_raw is the rostopic carrying the usb camera's raw image data.
    (Screenshot of the resulting stream omitted.)

  • deltavision usb camera: MobileNet-SSD with Tengine

    Development environment:

    Rock960开发板

    Ubuntu16.04

    deltavision usb camera

     

    I. Install Tengine

        Download the Tengine code: https://github.com/OAID/Tengine

        Installation doc: doc/install.md

       1. Dependencies

    • Caffe dependencies
    sudo apt install libprotobuf-dev protobuf-compiler libboost-all-dev libgoogle-glog-dev
    • opencv
    sudo apt install libopencv-dev

      2. Configuration

    cd ~/tengine
     
    cp makefile.config.example makefile.config

       Since development happens directly on the rock960, the default configuration can be used as-is.

      3. Build

    cd ~/tengine
    make

     4. Verify

    ./build/tests/bin/bench_mobilenet -r1

    II. Run the bundled SSD example

     1. Build the example

    cd ~/tengine/examples
    vim linux_build.sh

    Modify the cmake invocation: "/home/usr/tengine" is the directory where Tengine lives, and "/usr/lib/aarch64-linux-gnu" is the directory containing the protobuf libraries.

    Modify examples/mobilenet_ssd/CMakeLists.txt and add the line set( TENGINE_DIR /home/rock/Tengine)

    cmake -DPROTOBUF_DIR=/usr/lib/aarch64-linux-gnu -DTENGINE_DIR=/home/usr/tengine \
          ..

      Build:

    mkdir build
    cd build
    ../linux_build.sh
    make -j4 

    2. Run the SSD

      Put the officially provided models into ${Tengine_ROOT}/models/

    • MobileNetSSD_deploy.caffemodel
    • MobileNetSSD_deploy.prototxt

     Run:

    cd example/build/mobilenet_ssd
    
    ./MSSD -p ../../../models/MobileNetSSD_deploy.prototxt -m ../../../models/MobileNetSSD_deploy.caffemodel -i img.jpg

    III. Capture and process images from the usb camera

       For reading from a usb camera, see: https://mp.csdn.net/postedit/85252440

       1. Copy the mobilenet_ssd directory and name the copy mobilenet_ssd_camera

          Modify the examples/CMakeLists.txt file and add the line

    add_subdirectory(mobilenet_ssd_camera)

       2. Modify mobilenet_ssd_camera/mssd.cpp

    /*
     *  V4L2 video capture example
     *
     *  This program can be used and distributed without restrictions.
     *
     *      This program is provided with the V4L2 API
     * see https://linuxtv.org/docs.php for more information
     */
    
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <assert.h>
    #include <iomanip>
    #include <vector>
    #include <getopt.h>             /* getopt_long() */
    
    #include <fcntl.h>              /* low-level i/o */
    #include <unistd.h>
    #include <errno.h>
    #include <sys/stat.h>
    #include <sys/types.h>
    #include <sys/time.h>
    #include <sys/mman.h>
    #include <sys/ioctl.h>
    #include <inttypes.h>
    
    #include <linux/videodev2.h>
    #include <iostream>
    #include <opencv2/opencv.hpp>
    #include <opencv2/core/core.hpp>
    #include "opencv2/imgproc/imgproc.hpp"
    #include <opencv2/highgui/highgui.hpp>
    
    #include "tengine_c_api.h"
    #include "common.hpp"
    
    
    #define DEF_PROTO "models/MobileNetSSD_deploy.prototxt"
    #define DEF_MODEL "models/MobileNetSSD_deploy.caffemodel"
    #define DEF_IMAGE "tests/images/ssd_dog.jpg"
    
    #define CLEAR(x) memset(&(x), 0, sizeof(x))
    #define COLS (640)
    #define ROWS (480)
    #define SSD_IMG_H (300)
    #define SSD_IMG_W (300)
    using namespace cv;
    
    enum io_method {
            IO_METHOD_READ,
            IO_METHOD_MMAP,
            IO_METHOD_USERPTR,
    };
    
    struct buffer {
            void   *start;
            size_t  length;
    };
    
    typedef struct buffer* PBUF;
    
    static char            *dev_name;
    static enum io_method   io = IO_METHOD_MMAP;
    static int              fd = -1;
    struct buffer          *buffers;
    static unsigned int     n_buffers;
    static int              out_buf;
    static int              force_format;
    static int              frame_count = 70;
    
    static std::string proto_file;
    static std::string model_file;
    static std::string image_file;
    static std::string save_name="save.jpg";
    const char *model_name = "mssd_300";
        
    cv::Mat yuvImg(ROWS , COLS, CV_8UC2);
    cv::Mat rgbImg(ROWS, COLS,CV_8UC3);
    cv::Mat resizeImg(SSD_IMG_W, SSD_IMG_H,CV_8UC3);
    cv::Mat floatImg(SSD_IMG_W, SSD_IMG_H, CV_32FC3);
    
    static int fpsTick();
    
    struct Box
    {
        float x0;
        float y0;
        float x1;
        float y1;
        int class_idx;
        float score;
    };
    
    void get_input_data_ssd(std::string& image_file, float* input_data, int img_h,  int img_w)
    {
        cv::Mat img = cv::imread(image_file);
    
        if (img.empty())
        {
            std::cerr << "Failed to read image file " << image_file << ".\n";
            return;
        }
       
        cv::resize(img, img, cv::Size(img_h, img_w));
        img.convertTo(img, CV_32FC3);
        float *img_data = (float *)img.data;
        int hw = img_h * img_w;
    
        float mean[3]={127.5,127.5,127.5};
        for (int h = 0; h < img_h; h++)
        {
            for (int w = 0; w < img_w; w++)
            {
                for (int c = 0; c < 3; c++)
                {
                    input_data[c * hw + h * img_w + w] = 0.007843* (*img_data - mean[c]);
                    img_data++;
                }
            }
        }
    }
    
    void post_process_ssd(std::string& image_file,float threshold,float* outdata,int num,std::string& save_name)
    {
        std::cout<<"post_process_ssd\n";
        const char* class_names[] = {"background",
                                "aeroplane", "bicycle", "bird", "boat",
                                "bottle", "bus", "car", "cat", "chair",
                                "cow", "diningtable", "dog", "horse",
                                "motorbike", "person", "pottedplant",
                                "sheep", "sofa", "train", "tvmonitor"};
    
        //cv::Mat img = cv::imread(image_file);
        int raw_h = rgbImg.size().height;
        int raw_w = rgbImg.size().width;
        std::vector<Box> boxes;
        int line_width=raw_w*0.005;
        printf("detect result num: %d \n",num);
        for (int i=0;i<num;i++)
        {
            if(outdata[1]>=threshold)
            {
                Box box;
                box.class_idx=outdata[0];
                box.score=outdata[1];
                box.x0=outdata[2]*raw_w;
                box.y0=outdata[3]*raw_h;
                box.x1=outdata[4]*raw_w;
                box.y1=outdata[5]*raw_h;
                boxes.push_back(box);
                printf("%s\t:%.0f%%\n", class_names[box.class_idx], box.score * 100);
                printf("BOX:( %g , %g ),( %g , %g )\n",box.x0,box.y0,box.x1,box.y1);
            }
            outdata+=6;
        }
        for(int i=0;i<(int)boxes.size();i++)
        {
            Box box=boxes[i];
            cv::rectangle(rgbImg, cv::Rect(box.x0, box.y0,(box.x1-box.x0),(box.y1-box.y0)),cv::Scalar(255, 255, 0),line_width);
            std::ostringstream score_str;
            score_str<<box.score;
            std::string label = std::string(class_names[box.class_idx]) + ": " + score_str.str();
            int baseLine = 0;
            cv::Size label_size = cv::getTextSize(label, cv::FONT_HERSHEY_SIMPLEX, 0.5, 1, &baseLine);
            cv::rectangle(rgbImg, cv::Rect(cv::Point(box.x0,box.y0- label_size.height),
                                      cv::Size(label_size.width, label_size.height + baseLine)),
                          cv::Scalar(255, 255, 0), CV_FILLED);
            cv::putText(rgbImg, label, cv::Point(box.x0, box.y0),
                        cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(0, 0, 0));
        }
        cv::imshow("opencv",rgbImg);
        waitKey(1);
        std::cout<<"======================================\n";
        std::cout<<"[DETECTED IMAGE SAVED]:\t"<< save_name<<"\n";
        std::cout<<"======================================\n";
    
    
    }
    
    static void errno_exit(const char *s)
    {
            fprintf(stderr, "%s error %d, %s\n", s, errno, strerror(errno));
            exit(EXIT_FAILURE);
    }
    
    static int xioctl(int fh, int request, void *arg)
    {
            int r;
    
            do {
                    r = ioctl(fh, request, arg);
            } while (-1 == r && EINTR == errno);
    
            return r;
    }
    
    static void process_image(const void *p, int size,float *input_data,int img_w,  int img_h)
    {
        std::cout<<"process_image\n";
        //int fps = fpsTick();
        
        memcpy(yuvImg.data, p, COLS*ROWS*2);
        cv::cvtColor(yuvImg, rgbImg, CV_YUV2BGR_YUYV);
        
        cv::resize(rgbImg, resizeImg, cv::Size(img_w, img_h));
        resizeImg.convertTo(floatImg, CV_32FC3);
        float *img_data = (float *)floatImg.data;
        int hw = img_h * img_w;
    
        float mean[3]={127.5,127.5,127.5};
        for (int h = 0; h < img_h; h++)
        {
            for (int w = 0; w < img_w; w++)
            {
                for (int c = 0; c < 3; c++)
                {
                    input_data[c * hw + h * img_w + w] = 0.007843* (*img_data - mean[c]);
                    img_data++;
                }
            }
        }
        
        //cv::cvtColor(yuvImg, rgbImg, CV_YUV2BGR_YUYV);
        
        //char title[10];
        //sprintf(title, "fps:%d", fps);
        //cv::imshow("opencv",rgbImg);
        //waitKey(1);
        
        /*
        static int frame_number = 0;
        char fn[256];
        sprintf(fn, "%d.raw", frame_number);
        frame_number++;
    
        uint8_t *pixel = (uint8_t *) rgbImg.data;
        size = COLS*ROWS*3;
        int found = 0;
        for (int i=0; i < size; i++) {
            if (pixel[i] != 0) {
            found = 1;
            break;
            }
        }
    
        if (found) {
            FILE *f = fopen(fn, "wb");
            if (f == NULL) { printf("Error opening file\n"); exit(EXIT_FAILURE); }
            fwrite(pixel, size, 1, f);
            fclose(f);
    
            fprintf(stdout, "%s\n", fn);
            fflush(stdout);
        } else {
            fprintf(stdout, "empty image");
        }
        */
        
    }
    
    static int read_frame(float *input_data,int img_w,  int img_h)
    {
            struct v4l2_buffer buf;
            unsigned int i;
            static uint64_t timestamp;
            uint64_t stamp =0;
    
            switch (io) {
            case IO_METHOD_READ:
                    if (-1 == read(fd, buffers[0].start, buffers[0].length)) {
                            switch (errno) {
                            case EAGAIN:
                                    return 0;
    
                            case EIO:
                                    /* Could ignore EIO, see spec. */
    
                                    /* fall through */
    
                            default:
                                    errno_exit("read");
                            }
                    }
    
                    process_image(buffers[0].start, buffers[0].length,input_data, img_w, img_h);
                    break;
    
            case IO_METHOD_MMAP:
                    CLEAR(buf);
    
                    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                    buf.memory = V4L2_MEMORY_MMAP;
    
                    if (-1 == xioctl(fd, VIDIOC_DQBUF, &buf)) {
                            switch (errno) {
                            case EAGAIN:
                                    return 0;
    
                            case EIO:
                                    /* Could ignore EIO, see spec. */
    
                                    /* fall through */
    
                            default:
                                    errno_exit("VIDIOC_DQBUF");
                            }
                    }
                    stamp = buf.timestamp.tv_sec*1000000+buf.timestamp.tv_usec;
                    //printf("timestamp :%ld", timestamp);
                    if(timestamp == stamp){
                        break;
                    }
                    
                    assert(buf.index < n_buffers);
    
                    process_image(buffers[buf.index].start, buf.bytesused, input_data, img_w, img_h);
    
                    if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
                            errno_exit("VIDIOC_QBUF");
                    break;
    
            case IO_METHOD_USERPTR:
                    CLEAR(buf);
    
                    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                    buf.memory = V4L2_MEMORY_USERPTR;
    
                    if (-1 == xioctl(fd, VIDIOC_DQBUF, &buf)) {
                        
                            switch (errno) {
                            case EAGAIN:
                                    return 0;
    
                            case EIO:
                                    /* Could ignore EIO, see spec. */
    
                                    /* fall through */
    
                            default:
                                    errno_exit("VIDIOC_DQBUF");
                            }
                    }
    
                    for (i = 0; i < n_buffers; ++i)
                            if (buf.m.userptr == (unsigned long)buffers[i].start
                                && buf.length == buffers[i].length)
                                    break;
    
                    assert(i < n_buffers);
    
                    process_image((void *)buf.m.userptr, buf.bytesused, input_data, img_w, img_h);
    
                    if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
                            errno_exit("VIDIOC_QBUF");
                    break;
            }
    
            return 1;
    }
    
    static void mainloop(void)
    {
            unsigned int count;
    
            count = frame_count;
            
    
                // init tengine
            init_tengine_library();
            if (request_tengine_version("0.1") < 0)
                return ;
            if (load_model(model_name, "caffe", proto_file.c_str(), model_file.c_str()) < 0)
                return ;
            std::cout << "load model done!\n";
        
            // create graph
            graph_t graph = create_runtime_graph("graph", model_name, NULL);
            if (!check_graph_valid(graph))
            {
                std::cout << "create graph0 failed\n";
                return ;
            }
    
            
            int repeat_count = 1;
            const char *repeat = std::getenv("REPEAT_COUNT");
    
            if (repeat)
                repeat_count = std::strtoul(repeat, NULL, 10);
            
            int node_idx=0;
            int tensor_idx=0;
            tensor_t input_tensor = get_graph_input_tensor(graph, node_idx, tensor_idx);
            if(!check_tensor_valid(input_tensor))
            {
                printf("Get input node failed : node_idx: %d, tensor_idx: %d\n",node_idx,tensor_idx);
                return;
            }
    
            
            // input
            int img_h = 300;
            int img_w = 300;
            int img_size = img_h * img_w * 3;
            float *input_data = (float *)malloc(sizeof(float) * img_size);
            int dims[] = {1, 3, img_h, img_w};
            set_tensor_shape(input_tensor, dims, 4);
            
            prerun_graph(graph);
        
            while (1) {
               printf("Reading frame\n");
                    for (;;) {
                            fd_set fds;
                            struct timeval tv;
                            int r;
    
                            FD_ZERO(&fds);
                            FD_SET(fd, &fds);
    
                            /* Timeout. */
                            tv.tv_sec = 2;
                            tv.tv_usec = 0;
    
                            r = select(fd + 1, &fds, NULL, NULL, &tv);
    
                            if (-1 == r) {
                                    if (EINTR == errno)
                                            continue;
                                    errno_exit("select");
                            }
    
                            if (0 == r) {
                                    fprintf(stderr, "select timeout\n");
                                    exit(EXIT_FAILURE);
                            }
                            
                            
                        if (read_frame(input_data, img_w, img_h))
                        {
                            std::cout << "run_graph\n";
                            set_tensor_buffer(input_tensor, input_data, img_size * 4);
                            run_graph(graph, 1);
    
                                //gettimeofday(&t1, NULL);
                                //float mytime = (float)((t1.tv_sec * 1000000 + t1.tv_usec) - (t0.tv_sec * 1000000 + t0.tv_usec)) / 1000;
                                //total_time += mytime;
    
                                //std::cout << "--------------------------------------\n";
                                //std::cout << "repeat " << repeat_count << " times, avg time per run is " << total_time / repeat_count << " ms\n";
                                
                                tensor_t out_tensor = get_graph_output_tensor(graph, 0,0);//"detection_out");
                                int out_dim[4];
                                get_tensor_shape( out_tensor, out_dim, 4);
    
                                float *outdata = (float *)get_tensor_buffer(out_tensor);
                                int num=out_dim[1];
                                float show_threshold=0.5;
                                
                                post_process_ssd(image_file,show_threshold, outdata, num,save_name);
                                put_graph_tensor(out_tensor);
                                
                                
                                std::cout<<"run end\n";
                                
                                break;
                            }
    
                    }
                    
        }

        /* note: never reached -- the while (1) capture loop above has no exit path */
        postrun_graph(graph);
            
            free(input_data);
            
            put_graph_tensor(input_tensor);
            
            
    
            destroy_runtime_graph(graph);
            remove_model(model_name);
    
            return;
    }
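mainloop() relies on read_frame() (defined earlier in the full source, not shown in this excerpt) to turn each captured YUYV frame into the 1x3x300x300 float buffer handed to set_tensor_buffer(). As a rough sketch of the per-pixel step only (the helper name and the BT.601 full-range coefficients are my assumptions, not the tutorial's code):

```c
/* Hypothetical sketch of the color conversion inside read_frame().
 * A YUYV macropixel packs two pixels that share one U/V pair;
 * BT.601 full-range conversion recovers two RGB triples. */
static void yuyv_to_rgb(const unsigned char yuyv[4], unsigned char rgb[6])
{
    for (int i = 0; i < 2; i++) {
        float y = (float)yuyv[i * 2];     /* Y0 or Y1  */
        float u = (float)yuyv[1] - 128.f; /* shared U  */
        float v = (float)yuyv[3] - 128.f; /* shared V  */

        float r = y + 1.402f * v;
        float g = y - 0.344f * u - 0.714f * v;
        float b = y + 1.772f * u;

        /* clamp each channel to [0, 255] before narrowing */
        rgb[i * 3 + 0] = (unsigned char)(r < 0.f ? 0.f : r > 255.f ? 255.f : r);
        rgb[i * 3 + 1] = (unsigned char)(g < 0.f ? 0.f : g > 255.f ? 255.f : g);
        rgb[i * 3 + 2] = (unsigned char)(b < 0.f ? 0.f : b > 255.f ? 255.f : b);
    }
}
```

The real read_frame() presumably also resizes to 300x300, applies the MobileNet mean/scale, and reorders HWC to the planar CHW layout the network expects.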
    
    static void stop_capturing(void)
    {
            enum v4l2_buf_type type;
    
            switch (io) {
            case IO_METHOD_READ:
                    /* Nothing to do. */
                    break;
    
            case IO_METHOD_MMAP:
            case IO_METHOD_USERPTR:
                    type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                    if (-1 == xioctl(fd, VIDIOC_STREAMOFF, &type))
                            errno_exit("VIDIOC_STREAMOFF");
                    break;
            }
    }
    
    static void start_capturing(void)
    {
            unsigned int i;
            enum v4l2_buf_type type;
    
            switch (io) {
            case IO_METHOD_READ:
                    /* Nothing to do. */
                    break;
    
            case IO_METHOD_MMAP:
                    for (i = 0; i < n_buffers; ++i) {
                            struct v4l2_buffer buf;
    
                            CLEAR(buf);
                            buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                            buf.memory = V4L2_MEMORY_MMAP;
                            buf.index = i;
    
                            if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
                                    errno_exit("VIDIOC_QBUF");
                    }
                    type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                    if (-1 == xioctl(fd, VIDIOC_STREAMON, &type))
                            errno_exit("VIDIOC_STREAMON");
                    break;
    
            case IO_METHOD_USERPTR:
                    for (i = 0; i < n_buffers; ++i) {
                            struct v4l2_buffer buf;
    
                            CLEAR(buf);
                            buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                            buf.memory = V4L2_MEMORY_USERPTR;
                            buf.index = i;
                            buf.m.userptr = (unsigned long)buffers[i].start;
                            buf.length = buffers[i].length;
    
                            if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
                                    errno_exit("VIDIOC_QBUF");
                    }
                    type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                    if (-1 == xioctl(fd, VIDIOC_STREAMON, &type))
                            errno_exit("VIDIOC_STREAMON");
                    break;
            }
    }
    
    static void uninit_device(void)
    {
            unsigned int i;
    
            switch (io) {
            case IO_METHOD_READ:
                    free(buffers[0].start);
                    break;
    
            case IO_METHOD_MMAP:
                    for (i = 0; i < n_buffers; ++i)
                            if (-1 == munmap(buffers[i].start, buffers[i].length))
                                    errno_exit("munmap");
                    break;
    
            case IO_METHOD_USERPTR:
                    for (i = 0; i < n_buffers; ++i)
                            free(buffers[i].start);
                    break;
            }
    
            free(buffers);
    }
    
    static void init_read(unsigned int buffer_size)
    {
            buffers = (PBUF)calloc(1, sizeof(*buffers));
    
            if (!buffers) {
                    fprintf(stderr, "Out of memory\n");
                    exit(EXIT_FAILURE);
            }
    
            buffers[0].length = buffer_size;
            buffers[0].start = malloc(buffer_size);
    
            if (!buffers[0].start) {
                    fprintf(stderr, "Out of memory\n");
                    exit(EXIT_FAILURE);
            }
    }
    
    static void init_mmap(void)
    {
            struct v4l2_requestbuffers req;
    
            CLEAR(req);
    
            req.count = 4;
            req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            req.memory = V4L2_MEMORY_MMAP;
    
            if (-1 == xioctl(fd, VIDIOC_REQBUFS, &req)) {
                    if (EINVAL == errno) {
                            fprintf(stderr, "%s does not support "
                                     "memory mapping\n", dev_name);
                            exit(EXIT_FAILURE);
                    } else {
                            errno_exit("VIDIOC_REQBUFS");
                    }
            }
    
            if (req.count < 2) {
                    fprintf(stderr, "Insufficient buffer memory on %s\n",
                             dev_name);
                    exit(EXIT_FAILURE);
            }
    
            buffers = (PBUF)calloc(req.count, sizeof(*buffers));
    
            if (!buffers) {
                    fprintf(stderr, "Out of memory\n");
                    exit(EXIT_FAILURE);
            }
    
            for (n_buffers = 0; n_buffers < req.count; ++n_buffers) {
                    struct v4l2_buffer buf;
    
                    CLEAR(buf);
    
                    buf.type        = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                    buf.memory      = V4L2_MEMORY_MMAP;
                    buf.index       = n_buffers;
    
                    if (-1 == xioctl(fd, VIDIOC_QUERYBUF, &buf))
                            errno_exit("VIDIOC_QUERYBUF");
    
                    buffers[n_buffers].length = buf.length;
                    buffers[n_buffers].start =
                            mmap(NULL /* start anywhere */,
                                  buf.length,
                                  PROT_READ | PROT_WRITE /* required */,
                                  MAP_SHARED /* recommended */,
                                  fd, buf.m.offset);
    
                    if (MAP_FAILED == buffers[n_buffers].start)
                            errno_exit("mmap");
            }
    }
    
    static void init_userp(unsigned int buffer_size)
    {
            struct v4l2_requestbuffers req;
    
            CLEAR(req);
    
            req.count  = 4;
            req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            req.memory = V4L2_MEMORY_USERPTR;
    
            if (-1 == xioctl(fd, VIDIOC_REQBUFS, &req)) {
                    if (EINVAL == errno) {
                            fprintf(stderr, "%s does not support "
                                     "user pointer i/o\n", dev_name);
                            exit(EXIT_FAILURE);
                    } else {
                            errno_exit("VIDIOC_REQBUFS");
                    }
            }
    
            buffers = (PBUF)calloc(4, sizeof(*buffers));
    
            if (!buffers) {
                    fprintf(stderr, "Out of memory\n");
                    exit(EXIT_FAILURE);
            }
    
            for (n_buffers = 0; n_buffers < 4; ++n_buffers) {
                    buffers[n_buffers].length = buffer_size;
                    buffers[n_buffers].start = malloc(buffer_size);
    
                    if (!buffers[n_buffers].start) {
                            fprintf(stderr, "Out of memory\n");
                            exit(EXIT_FAILURE);
                    }
            }
    }
    
    static void init_device(void)
    {
            struct v4l2_capability cap;
            struct v4l2_cropcap cropcap;
            struct v4l2_crop crop;
            struct v4l2_format fmt;
            unsigned int min;
    
            if (-1 == xioctl(fd, VIDIOC_QUERYCAP, &cap)) {
                    if (EINVAL == errno) {
                            fprintf(stderr, "%s is no V4L2 device\n",
                                     dev_name);
                            exit(EXIT_FAILURE);
                    } else {
                            errno_exit("VIDIOC_QUERYCAP");
                    }
            }
    
            if (!(cap.capabilities & V4L2_CAP_VIDEO_CAPTURE)) {
                    fprintf(stderr, "%s is no video capture device\n",
                             dev_name);
                    exit(EXIT_FAILURE);
            }
    
            switch (io) {
            case IO_METHOD_READ:
                    if (!(cap.capabilities & V4L2_CAP_READWRITE)) {
                            fprintf(stderr, "%s does not support read i/o\n",
                                     dev_name);
                            exit(EXIT_FAILURE);
                    }
                    break;
    
            case IO_METHOD_MMAP:
            case IO_METHOD_USERPTR:
                    if (!(cap.capabilities & V4L2_CAP_STREAMING)) {
                            fprintf(stderr, "%s does not support streaming i/o\n",
                                     dev_name);
                            exit(EXIT_FAILURE);
                    }
                    break;
            }
    
    
            /* Select video input, video standard and tune here. */
    
      struct v4l2_format format = {0};
      format.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
      format.fmt.pix.width = COLS;
      format.fmt.pix.height = ROWS;
      format.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
      format.fmt.pix.field = V4L2_FIELD_NONE;
      int retval = xioctl(fd, VIDIOC_S_FMT, &format);
  if (retval == -1) errno_exit("VIDIOC_S_FMT");
    
    
    //
    //        CLEAR(cropcap);
    //
    //        cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    //
    //        if (0 == xioctl(fd, VIDIOC_CROPCAP, &cropcap)) {
    //                crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    //                crop.c = cropcap.defrect; /* reset to default */
    //
    //                if (-1 == xioctl(fd, VIDIOC_S_CROP, &crop)) {
    //                        switch (errno) {
    //                        case EINVAL:
    //                                /* Cropping not supported. */
    //                                break;
    //                        default:
    //                                /* Errors ignored. */
    //                                break;
    //                        }
    //                }
    //        } else {
    //                /* Errors ignored. */
    //        }
    //
    //
    //        CLEAR(fmt);
    //
    //        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    //        if (force_format) {
    //                fmt.fmt.pix.width       = 640;
    //                fmt.fmt.pix.height      = 480;
    //                fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
    //                fmt.fmt.pix.field       = V4L2_FIELD_INTERLACED;
    //
    //                if (-1 == xioctl(fd, VIDIOC_S_FMT, &fmt))
    //                        errno_exit("VIDIOC_S_FMT");
    //
    //                /* Note VIDIOC_S_FMT may change width and height. */
    //        } else {
    //                /* Preserve original settings as set by v4l2-ctl for example */
    //                if (-1 == xioctl(fd, VIDIOC_G_FMT, &fmt))
    //                        errno_exit("VIDIOC_G_FMT");
    //        }
    //
    //        /* Buggy driver paranoia. */
    //        min = fmt.fmt.pix.width * 2;
    //        if (fmt.fmt.pix.bytesperline < min)
    //                fmt.fmt.pix.bytesperline = min;
    //        min = fmt.fmt.pix.bytesperline * fmt.fmt.pix.height;
    //        if (fmt.fmt.pix.sizeimage < min)
    //                fmt.fmt.pix.sizeimage = min;
    //
        switch (io) {
        case IO_METHOD_READ:
                /* use 'format' as filled in by VIDIOC_S_FMT above;
                 * the older 'fmt' is never populated in this version */
                init_read(format.fmt.pix.sizeimage);
                break;

        case IO_METHOD_MMAP:
                init_mmap();
                break;

        case IO_METHOD_USERPTR:
                init_userp(format.fmt.pix.sizeimage);
                break;
        }
    }
    
    static void close_device(void)
    {
            if (-1 == close(fd))
                    errno_exit("close");
    
            fd = -1;
    }
    
    static void open_device(void)
    {
            struct stat st;
    
            if (-1 == stat(dev_name, &st)) {
                    fprintf(stderr, "Cannot identify '%s': %d, %s\n",
                             dev_name, errno, strerror(errno));
                    exit(EXIT_FAILURE);
            }
    
            if (!S_ISCHR(st.st_mode)) {
                    fprintf(stderr, "%s is no device\n", dev_name);
                    exit(EXIT_FAILURE);
            }
    
            fd = open(dev_name, O_RDWR /* required */ | O_NONBLOCK, 0);
    
            if (-1 == fd) {
                    fprintf(stderr, "Cannot open '%s': %d, %s\n",
                             dev_name, errno, strerror(errno));
                    exit(EXIT_FAILURE);
            }
    }
    
    
static int fpsTick()
{
    /* note: clock() measures CPU time, not wall time; while blocked in
     * select() the clock barely advances, so the reported fps is inflated.
     * Use clock_gettime(CLOCK_MONOTONIC) for wall-clock fps. */
    static clock_t last = clock();
    static float avgDuration = 0.f;
    static const float alpha = 1.f / 10.f;
    static int frameCount = 0;

    clock_t now = clock();
    clock_t delta = now - last;

    printf("delta clock:%ld\n", (long)delta);
    last = now;

    frameCount++;

    if (1 == frameCount)
        avgDuration = (float)delta;
    else
        avgDuration = avgDuration * (1.f - alpha) + (float)delta * alpha;

    int fps = 0;
    if (avgDuration > 0.f)
        fps = (int)(1.f * CLOCKS_PER_SEC / avgDuration);
    printf("fps :%d\n", fps);

    return fps;
}
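fpsTick() smooths the frame interval with an exponential moving average. The update rule can be exercised in isolation; the helper name below is mine, not the tutorial's:

```c
/* Exponentially weighted moving average: avg' = (1-alpha)*avg + alpha*sample.
 * With alpha = 0.1 the estimate tracks roughly the last ten samples;
 * the first sample seeds the average directly. */
static float ema_update(float avg, float sample, float alpha, int is_first)
{
    return is_first ? sample : avg * (1.f - alpha) + sample * alpha;
}
```

Feeding the per-frame delta through this filter and dividing CLOCKS_PER_SEC by the result gives the smoothed fps printed above.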
    
    static void usage(FILE *fp, int argc, char **argv)
    {
            fprintf(fp,
                     "Usage: %s [options]\n\n"
                     "Version 1.3\n"
                     "Options:\n"
                     "-d | --device name   Video device name [%s]\n"
                     "-h | --help          Print this message\n"
                     "-m | --mmap          Use memory mapped buffers [default]\n"
                     "-r | --read          Use read() calls\n"
                     "-u | --userp         Use application allocated buffers\n"
                     "-o | --output        Outputs stream to stdout\n"
                     "-f | --format        Force format to 640x480 YUYV\n"
                     "-c | --count         Number of frames to grab [%i]\n"
                     "",
                     argv[0], dev_name, frame_count);
    }
    
    static const char short_options[] = "d:hmruofc:";
    
    static const struct option
    long_options[] = {
            { "device", required_argument, NULL, 'd' },
            { "help",   no_argument,       NULL, 'h' },
            { "mmap",   no_argument,       NULL, 'm' },
            { "read",   no_argument,       NULL, 'r' },
            { "userp",  no_argument,       NULL, 'u' },
            { "output", no_argument,       NULL, 'o' },
            { "format", no_argument,       NULL, 'f' },
            { "count",  required_argument, NULL, 'c' },
            { 0, 0, 0, 0 }
    };
    
    int main(int argc, char **argv)
    {
            dev_name = "/dev/video0";
    
            for (;;) {
                    int idx;
                    int c;
    
                    c = getopt_long(argc, argv,
                                    short_options, long_options, &idx);
    
                    if (-1 == c)
                            break;
    
                    switch (c) {
                    case 0: /* getopt_long() flag */
                            break;
    
                    case 'd':
                            dev_name = optarg;
                            break;
    
                    case 'h':
                            usage(stdout, argc, argv);
                            exit(EXIT_SUCCESS);
    
                    case 'm':
                            io = IO_METHOD_MMAP;
                            break;
    
                    case 'r':
                            io = IO_METHOD_READ;
                            break;
    
                    case 'u':
                            io = IO_METHOD_USERPTR;
                            break;
    
                    case 'o':
                            out_buf++;
                            break;
    
                    case 'f':
                            force_format++;
                            break;
    
                    case 'c':
                            errno = 0;
                            frame_count = strtol(optarg, NULL, 0);
                            if (errno)
                                    errno_exit(optarg);
                            break;
    
                    default:
                            usage(stderr, argc, argv);
                            exit(EXIT_FAILURE);
                    }
            }
            
        
            //cvNamedWindow("opencv", CV_WINDOW_AUTOSIZE);
            
            const std::string root_path = get_root_path();
            std::string save_name="save.jpg";
            proto_file = root_path + DEF_PROTO;
            model_file = root_path + DEF_MODEL;
            image_file = root_path + DEF_IMAGE;
        
            open_device();
            init_device();
            start_capturing();
            mainloop();
            stop_capturing();
            uninit_device();
            close_device();
            fprintf(stderr, "\n");
            waitKey(0);
            return 0;
    }

     3. Re-run make

        Go back to the examples directory and run make.

        If everything went well, the examples/build/mobilenet_ssd_camera directory will have been generated.

     4. Run

    ./MSSD -p ../../../models/MobileNetSSD_deploy.prototxt -m ../../../models/MobileNetSSD_deploy.caffemodel
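For scripting, the invocation in step 4 can be parameterized; a small sketch (the wrapper is mine — the paths follow the tutorial's layout, adjust MODEL_DIR to your own tree):

```shell
# Hypothetical wrapper around step 4; run it from examples/build/mobilenet_ssd_camera.
MODEL_DIR=../../../models
PROTO=$MODEL_DIR/MobileNetSSD_deploy.prototxt
MODEL=$MODEL_DIR/MobileNetSSD_deploy.caffemodel
CMD="./MSSD -p $PROTO -m $MODEL"
echo "$CMD"   # inspect, then launch with: eval "$CMD"
```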

  • linux usb-camera application

    2018-05-14 10:16:56
    Linux usb-camera application development, with a detailed write-up included — well worth downloading!
  • ROS_Kinetic_12 ROS programming basics with Eclipse C++ (part 3): usb camera. Package download address: https://github.com/bosch-ros-pkg/usb_cam — after downloading, place it under src in catkin_ws and build it in Eclipse. Note the "pixel_format" setting: if after building it cannot...
  • Porting an open-source APP (USB CAMERA APK) into Android: a problem summary

    1,000+ reads · hot discussion · 2015-09-15 19:07:51
    A recent company project (a dash cam) needed USB... support. Since the usb camera APK had to be built into the system image, and simple debugging would be helpful when problems came up, I downloaded a source-code version of the apk from the web and set out to port it into the system source tree. I had assumed the porting would be quite simple...
  • [USBCamera] driver installation and usage

    1,000+ reads 2015-07-11 00:15:51
    Trial notes: plugged it straight into the laptop's USB port; Win7 popped up the automatic driver-search dialog but failed to install anything automatically. It did offer a driver download address, so I downloaded and installed the driver; the archive was named DRV_VC0305_VA6241_50hz. After extracting it, no Win7 folder could be found...
  • The usb2.0 camera driver is a universal webcam driver that supports nearly every camera model on the market, on Win7, Win8, and 64-bit systems. Usage notes: on a 64-bit system you must run UniversalThemePatcher-x64.exe, which requires administrator privileges...
  • USB PC CAMERA SN9C102 driver

    2012-04-27 23:25:32
    USB PC CAMERA driver download — often hard to find on the web; I am offering it here for free.
  • Platform: RK3288, OS: Android 6.0, Kernel: 3.10.92. Kernel side: enable the UVC option CONFIG_USB_VIDEO_CLASS=y ... After building, flashing, and plugging in a usb camera, boot logs similar to the following should appear: [ 3.612836] usb 3-1: New USB device found, idVen
  • On Linux, attaching 4 USB cameras produced "VIDIOC_STREAMON: No space left on device"; the new patch fixes this problem — downloads welcome.
  • UVCCamera: driving a USB webcam on Android

    10,000+ reads · hot discussion · 2017-05-03 09:36:43
    I have recently been working on a USB-camera project that needed an Android APP. Having never written an Android program before, I could only search the web; I found source code called simplewebcam, but perhaps because my configuration was wrong it never... 1. Download the open-source project. Download address: https://github.com/saki4510t/UVCCamera
  • The Camera-168 driver is a webcam driver that fixes cameras that will not work normally. It supports many models — see the details below; friends who need it are welcome to download and try it! Driver notes: if the Camera-168 driver does not install properly, first remove the original installation files or...
  • Porting mjpg-streamer to the Android platform is also simple: just write a proper Android.mk and build. Since mjpg-streamer relies on the jpeg library to process the camera data, when porting mjpg-streamer to... Download mjpg-streamer: https://github.com/chenguangx
  • The UVCCamera open-source project ships 8 example programs, each demonstrating a different feature. 1. Download the source. Download address: (download link) Click "clone or download" in the figure to grab the project as a zip archive. 1) USBCameraTest0 shows how to use a SurfaceView to start/stop...
  • USB host function must be required. If you want to try on Android 3.1, you will need some modification(need to remove setPreviewTexture method in UVCCamera.java etc.), but we have not confirm whether...
  • The USB PC Camera 301P does not work even when Windows Update installs it automatically: after the Windows Update download it gets misidentified — for example as a LOOK 312P — and stops working, yet the system reports the device as healthy; a QQ video test says the video component started successfully, but no image is shown...
  • Step 1: download the MJPG-streamer source. Official download address: https://sourceforge.net/projects/mjpg-streamer/ Step 2: install the dependencies: sudo apt-get install libjpeg62-dev; sudo apt-get install libjpeg8-dev. Switch to the downloaded...
  • The pccamera driver suits the vast majority of webcams, although some brand-specific features will be unavailable; for cameras from no-name vendors you can also search this site by exact model. About the pccamera driver: ... installation instructions; also works for PDA phones — welcome to download and try it
  • The Yahoo Messenger webcam driver is the dedicated driver for that camera and keeps it working normally; simply install it... Come download and try it! About this universal driver: the tool is simple to operate and easy to install. If you want your camera to make your life richer, you are welcome to download and try it
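One of the results above enables UVC on an RK3288 kernel via CONFIG_USB_VIDEO_CLASS=y. A typical set of related options looks like the following (symbol names taken from mainline Kconfig — a sketch to check against your own 3.10.92 tree, not a verified config):

```
CONFIG_MEDIA_SUPPORT=y
CONFIG_MEDIA_USB_SUPPORT=y
CONFIG_MEDIA_CAMERA_SUPPORT=y
CONFIG_VIDEO_DEV=y
CONFIG_VIDEO_V4L2=y
CONFIG_USB_VIDEO_CLASS=y
```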
