  • d435i_gazebo (source code)

    A Gazebo ROS plugin package for the Intel RealSense D435i camera. It lets you run a simulated D435i and include the urdf/xacro files in your launch file. Developed and tested on ROS Melodic, Ubuntu 18.04.
  • A document on capturing images with the left and right imagers of a RealSense D435 (D435i), with detailed setup steps for VS2017 + OpenCV 4.2.0 + SDK 2.0 and the related test code. Note: the project was never connected to an actual camera, but it compiles.
  • Hand-eye calibration with a D435i camera, including intrinsic acquisition and extrinsic estimation.
  • SIMOTION__D435__实例.pdf

    Introduces an actual project. The SIMOTION SCOUT installation package ships a beginner project "D435_BEGINNER" with complete project files and documentation; it can be run in simulation on a SIMOTION D435 demo unit.
  • Aubo i5 dual-arm collaborative robot + RealSense D435 + 3D object pose estimation + ROS: a package for detecting and estimating the 6-DoF pose of known objects, using a novel architecture and data-generation pipeline, with an Aubo i5 collaborative robot and an Intel RealSense D435i camera.
  • I used a RealSense D435 for a while; due to my limited experience the code has many shortcomings, and the packaged D435 functionality is far from complete.
  • Object detection with an Intel RealSense D435 using YOLOv3 under the OpenCV DNN framework, plus 3D localization of foreign objects from the depth data; coordinates in the camera frame are shown in real time. Requirements: C++, Ubuntu 18.04 or 16.04.
  • Object detection with an Intel RealSense D435: the Computer Science (COSC) and Software Engineering departments at the University of Canterbury recently bought a Land Rover and mounted an Intel RealSense D435 3D camera on it, with the goal of completing a mini-DARPA challenge across campus.
  • MFC application that displays the D435i video stream and captures and saves individual image frames.
  • Realsense_D435i.zip

    D435i driver setup for Ubuntu 18.04 that does not throw errors (fixes included) and avoids IMU timestamp disorder, so VINS-Mono runs without IMU errors.
  • Low-speed SLAM with a D435i: low-speed 3D mapping and SLAM indoors with a RealSense D435i. Install Ubuntu, ROS, the Intel RealSense SDK and the wrapper from the linked instructions (method 2 recommended), then install SLAM with the D435i packages.
  • Master/slave multi-camera code (多机位.cpp): timestamps, alignment, point clouds, save options; runs at 6, 15 or 30 fps; designed for streaming from two Intel RealSense cameras, keeping frames in the in-memory fssLeft and fssRight vectors; outputs timestamps to a text file.
  • Open3D with the D435: Python code, examples and .ply data; the data and part of the code come from an open-source GitHub project.
  • The parts of ORB_SLAM that need modification are CMakeLists.txt and rgbd_tum.cc, and the changes are substantial. Starting with rgbd_tum.cc: since the goal is a real-time system, data loading and color/depth association are no longer needed, which saves a fair amount of work.
  • d435i calibration results file
  • Raspberry Pi 3 (Raspbian Stretch) or Ubuntu 16.04 / Ubuntu MATE + Neural Compute Stick (NCS/NCS2) + RealSense D435 (or USB camera or PiCamera) + MobileNet-SSD. [Notice] On 2018-12-19 OpenVINO added support...
  • USB interface error when using the RealSense D435i with ROS (full post below)

    ROS: USB access error when using the RealSense D435i


    1. Background

    A reader asked why, when using a RealSense D435i with the usb_cam package from "ROS Camera Calibration", all of the images come out green.
    I happened to have a RealSense D435i at hand and tried it; the images are indeed green.
    Since usb_cam_node is a prebuilt node and documentation is scarce, the cause could not be tracked down there.

    To inspect the topics instead, use the realsense2-camera package. Install it first:

    $ sudo apt install ros-noetic-realsense2-camera
    

    Then launch it:

    $ roslaunch realsense2_camera rs_camera.launch
    

    It failed immediately with the error: failed to open usb interface: 0, error: RS2_USB_STATUS_ACCESS
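
    Before touching the udev rules, it is worth confirming that this really is a USB permissions problem. rs-enumerate-devices is a quick check (it ships with the librealsense2-utils package rather than with the ROS wrapper, so it may need to be installed separately); if the camera is only listed when run with sudo, permissions are the culprit:

    $ lsusb | grep -i 8086
    $ rs-enumerate-devices
    $ sudo rs-enumerate-devices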


    2. Fix


    2.1. Create the udev rules file

    Check whether the directory /etc/udev/rules.d/ contains 99-realsense-libusb.rules.
    If it exists, update its contents.
    If it does not, create the rules file 99-realsense-libusb.rules with the following contents:

    ##Version=1.1##
    # Device rules for Intel RealSense devices (R200, F200, SR300 LR200, ZR300, D400, L500, T200)
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0a80", MODE:="0666", GROUP:="plugdev", RUN+="/usr/local/bin/usb-R200-in_udev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0a66", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0aa5", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0abf", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0acb", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0ad0", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="04b4", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0ad1", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0ad2", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0ad3", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0ad4", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0ad5", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0ad6", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0af2", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0af6", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0afe", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0aff", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b00", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b01", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b03", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b07", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b0c", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b0d", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b3a", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b3d", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b48", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b49", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b4b", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b4d", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b52", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b5b", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b5c", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b64", MODE:="0666", GROUP:="plugdev"
    
    # Intel RealSense recovery devices (DFU)
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0ab3", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0adb", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0adc", MODE:="0666", GROUP:="plugdev"
    SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b55", MODE:="0666", GROUP:="plugdev"
    
    # Intel RealSense devices (Movidius, T265)
    SUBSYSTEMS=="usb", ENV{DEVTYPE}=="usb_device", ATTRS{idVendor}=="8087", ATTRS{idProduct}=="0af3", MODE="0666", GROUP="plugdev"
    SUBSYSTEMS=="usb", ENV{DEVTYPE}=="usb_device", ATTRS{idVendor}=="8087", ATTRS{idProduct}=="0b37", MODE="0666", GROUP="plugdev"
    SUBSYSTEMS=="usb", ENV{DEVTYPE}=="usb_device", ATTRS{idVendor}=="03e7", ATTRS{idProduct}=="2150", MODE="0666", GROUP="plugdev"
    
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0ad5", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor_custom", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0ad5", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0af2", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0af2", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0afe", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor_custom", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0afe", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0aff", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor_custom", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0aff", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b00", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor_custom", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b00", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b01", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor_custom", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b01", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b3a", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b3a", RUN+="/bin/sh -c ' chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b3d", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b3d", RUN+="/bin/sh -c ' chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b4b", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b4b", RUN+="/bin/sh -c ' chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b4d", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b4d", RUN+="/bin/sh -c ' chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b5b", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b5b", RUN+="/bin/sh -c ' chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b5c", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b5c", RUN+="/bin/sh -c ' chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    KERNEL=="iio*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b64", MODE:="0777", GROUP:="plugdev", RUN+="/bin/sh -c 'chmod -R 0777 /sys/%p'"
    DRIVER=="hid_sensor*", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b64", RUN+="/bin/sh -c ' chmod -R 0777 /sys/%p && chmod 0777 /dev/%k'"
    

    2.2. Apply the rules

    Restart udev so the new rules take effect:

    $ sudo /etc/init.d/udev restart
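
    On systems where that init script is not available, reloading the rules directly with udevadm (standard udev tooling, nothing RealSense-specific) achieves the same thing:

    $ sudo udevadm control --reload-rules
    $ sudo udevadm trigger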
    

    2.3. Test

    Unplug and replug the camera,
    then launch again; this time it starts normally:

    $ roslaunch realsense2_camera rs_camera.launch
    

    Use rviz to check the various image topics (screenshot omitted).
    For calibration, the image topic /camera/color/image_raw is all you need. Done.


    Thanks.

  • RealSense D435i calibration


    Reference links: Kalibr wiki    https://github.com/ethz-asl/kalibr/wiki/camera-imu-calibration

    https://blog.csdn.net/HUST_lc/article/details/96144499  (thanks to a fellow student for the pointers)

    https://blog.csdn.net/weixin_40830684/article/details/88768225

    https://blog.csdn.net/zhubaohua_bupt/article/details/80222321  explains the calibration results and the individual parameters

    https://blog.csdn.net/mxdsdo09/article/details/83514310  same topic: calibration results and parameters explained

    https://blog.csdn.net/tercel_zhang/article/details/90637392  a very clear explanation of the coordinate frames involved in calibration

    https://blog.csdn.net/Aoulun/article/details/78768570  same topic; also covers the distortion parameters

    https://blog.csdn.net/zhuoyueljl/article/details/89509623  notes and caveats for calibrating a monocular camera

    https://blog.csdn.net/qq_31119155/article/details/79908668

    https://blog.csdn.net/u012177641/article/details/92824014


    Overall workflow:

    Prerequisites: a working RealSense D435i environment on Ubuntu 16.04; see my earlier posts: https://blog.csdn.net/weixin_40628128/article/details/90376767

    https://blog.csdn.net/weixin_40628128/article/details/89891227

    https://blog.csdn.net/weixin_40628128/article/details/89703000

    Install and configure the calibration tool (Kalibr) ----> calibrate the depth camera (the RGB camera; the IR pair is handled as a computed stereo pair and is not calibrated here) ----> calibrate the IMU ----> jointly calibrate the depth camera + IMU

    1. Installing Kalibr

    Follow the official instructions:

    https://github.com/ethz-asl/kalibr/wiki/installation

    I installed without CDE, in a ROS Kinetic environment. A pip error came up along the way; fix: https://blog.csdn.net/lyll616/article/details/85090132

    With that, Kalibr is installed.

    Before running any calibration command (e.g. kalibr_calibrate_cameras), cd into the kalibr_workspace directory and source it:

    cd ~/kalibr_workspace
    source ~/kalibr_workspace/devel/setup.bash

    2. Camera calibration for the RealSense D435i

    (1) Intrinsics that the D435i reports directly:

    roslaunch realsense2_camera rs_camera.launch   # start the camera node
    rostopic echo /camera/color/camera_info        # print the camera intrinsics
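
    rostopic echo also accepts a message count and a field path, which is handy for printing just the 3x3 intrinsic matrix K once (same topic as above):

    rostopic echo -n 1 /camera/color/camera_info/K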

    (2) Calibrating the camera with Kalibr

    • Write the checkerboard parameters into a yaml file (anywhere you like; you pass its absolute path later, e.g. kalibr_calibrate_cameras --target ~/bagfiles/checkboard.yaml)

           A template checkerboard yaml can be downloaded from https://github.com/ethz-asl/kalibr/wiki/downloads; edit the values to match your own calibration target. Mine is a purchased GP340 checkerboard with 9x12 squares, i.e. 8x11 internal corners:

    target_type: 'checkerboard' #gridtype
    targetCols: 8                #number of internal chessboard corners (not squares)
    targetRows: 11               #number of internal chessboard corners (not squares)
    rowSpacingMeters: 0.025      #size of one chessboard square [m]
    colSpacingMeters: 0.025      #size of one chessboard square [m]
    • Record the camera calibration data as a rosbag

     To make monitoring easier, open a visualization first:

    Terminal 1: roslaunch realsense2_camera rs_camera.launch   # start the camera node
    Terminal 2: rviz; set Fixed Frame to camera_link, then Add -> topic -> camera -> color -> image_raw. Keep the camera fixed and move the calibration target (this tends to give slightly better results).

    For recording and replaying ROS data, see the official tutorials or https://blog.csdn.net/u010510350/article/details/72457758.

    Lower the image rate to 4 Hz. topic_tools throttle does this reliably and publishes a new topic without modifying the original:

    rosrun topic_tools throttle messages /camera/color/image_raw 4.0 /color

    The throttled RGB stream is republished at 4 Hz on the new topic /color. Record it with:

    rosbag record -O camd435i /color

    -O names the bag (camd435i.bag); /color is the topic to record. For recording details see https://blog.csdn.net/zhuoyueljl/article/details/89509623.

    (Stop recording with Ctrl-C; the bag is saved in the directory where the record command was run, in my case ~/bagfiles.)
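
    Before running Kalibr it is worth a quick sanity check that the throttled topic really came out at the reduced rate and that the bag contains it (standard ROS tooling):

    rostopic hz /color
    rosbag info ~/bagfiles/camd435i.bag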

     

    • Calibrate the single camera with Kalibr (source the kalibr_workspace first, as above)
    kalibr_calibrate_cameras --target ~/bagfiles/checkboard.yaml --bag ~/bagfiles/camd435i.bag --bag-from-to 5 50 --models pinhole-radtan --topics /color 

    Here --target ~/bagfiles/checkboard.yaml is the path to the checkerboard yaml; --bag ~/bagfiles/camd435i.bag is the recorded bag; --bag-from-to 5 50 uses only the data between 5 s and 50 s (pick the window according to the quality of the recording); --models pinhole-radtan is the camera model; --topics /color is the image topic recorded in the bag. Kalibr writes a camchain yaml plus a PDF/TXT report.

    As expected it got stuck on corner extraction. Thanks to those who ran into this before (https://github.com/ethz-asl/kalibr/issues/164): append --show-extraction to watch the extraction:

    kalibr_calibrate_cameras --target ~/bagfiles/checkerboard.yaml --bag ~/bagfiles/camd435i.bag --bag-from-to 5 50 --models pinhole-radtan --topics /color --show-extraction

    [You can also calibrate several cameras at once, e.g. two cameras:

    kalibr_calibrate_cameras --target src/kalibr/checkerboard.yaml --bag src/kalibr/bag/fisheye_2019-07-15-21-52-12.bag --models pinhole-radtan pinhole-radtan --topics /image_raw1_th /image_raw2_th

    For details see https://blog.csdn.net/HUST_lc/article/details/96144499 ]

    3. IMU intrinsics / calibration

    https://cloud.tencent.com/developer/article/1438368 is a good reference; it links the MATLAB calibration code and explains the paper.

    This step requires familiarity with IMU models and algorithms; I did not budget time for it this week and will come back to it while studying VIO. For now I use example values from another user's yaml and move straight on to the joint calibration.

     

     

     

    4. Camera+IMU calibration with Kalibr

    Reference: https://blog.csdn.net/mxdsdo09/article/details/83514310

    • Purpose of the joint calibration

        Camera-IMU calibration gives us the relative pose matrix T between the IMU and the camera frame and the relative time delay t_shift (t_imu = t_cam + t_shift).

    • Files needed

    1. .bag: a ROS bag containing both images and IMU data
    2. camchain.yaml: the camera intrinsics and distortion parameters (for a stereo pair it also contains the transform between the two cameras)
    3. IMU.yaml: the IMU noise densities and random walks

    4. checkboard.yaml: the calibration target parameters

    So before the camera-IMU calibration we need the camera intrinsics (Section 2) and the IMU parameters (Section 3), i.e. camd435i.yaml and IMU.yaml. I placed the prepared files under ~/bagfiles.

    • Output

      The IMU-to-camera pose T and the reprojection error. The reported Pixel Error is the standard deviation of the reprojection error (in pixels) in the x and y directions; by the optimization criterion, the smaller the reprojection error, the more accurate the camera calibration.
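
    Concretely (a generic definition, not something specific to Kalibr's report): for the i-th detected target corner with known position X_i on the board, the reprojection error is

    $$ e_i = \hat{x}_i - \pi\left(K,\, d,\, T_{cam\,target}\, X_i\right) $$

    where \hat{x}_i is the corner detected in the image and \pi projects the corner through the estimated intrinsics K, distortion d and board-to-camera pose T. The Pixel Error column is the standard deviation of the x and y components of e_i over all corners.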

    • Calibration procedure (Kalibr)

    checkerboard.yaml:

    target_type: 'checkerboard' #gridtype
    targetCols: 8               #number of internal chessboard corners
    targetRows: 11               #number of internal chessboard corners
    rowSpacingMeters: 0.025      #size of one chessboard square [m]
    colSpacingMeters: 0.025      #size of one chessboard square [m]

    camchain.yaml (replace the numbers with your own camera intrinsics, taken from camd435i.yaml):

    cam0:
      camera_model: pinhole
      intrinsics: [632.9640658678117, 638.2668942402212, 339.9807921782614, 243.68020465500277]
      distortion_model: equidistant
      distortion_coeffs: [0.366041713382057, 1.1433178097591097, -3.008125411486294, -3.1186836086733227]
      T_cam_imu:
      - [0.01779318, 0.99967549,-0.01822936, 0.07008565]
      - [-0.9998017, 0.01795239, 0.00860714,-0.01771023]
      - [0.00893160, 0.01807260, 0.99979678, 0.00399246]
      - [0.0, 0.0, 0.0, 1.0]
      timeshift_cam_imu: -8.121e-05
      rostopic: /color
      resolution: [640, 480]
    

    IMU.yaml:

    rostopic: /imu
    update_rate: 200.0 #Hz
     
    accelerometer_noise_density: 0.01 #continous
    accelerometer_random_walk: 0.0002 
    gyroscope_noise_density: 0.005 #continous
    gyroscope_random_walk: 4.0e-06
    

    The procedure is basically the same as the camera calibration, except that the IMU rate needs to be set (reduced to 200 Hz) and the IMU topic has to be recorded together with the color topic.

    For the calibration motion, follow the video on the Kalibr site (https://github.com/ethz-asl/kalibr): the key is to excite all six degrees of freedom (rotations and translations about/along x, y and z).

    https://blog.csdn.net/u012177641/article/details/92824014 was mainly used as a reference for merging the accel and gyro streams into a single topic.

     

    The RealSense ROS launch file already has options for merging the acc and gyro streams into a single imu topic; just set:

    <arg name="enable_sync"               default="true"/>
    <arg name="unite_imu_method"          value="linear_interpolation"/>     <!-- the original value was empty: value="" -->

    linear_interpolation can be replaced by another method; the wrapper also accepts copy (see the IMU calibration section further down).
     

    Lower the image rate to 20 Hz and the IMU rate to 200 Hz. Again use topic_tools throttle, which publishes new topics without modifying the originals:

    rosrun topic_tools throttle messages /camera/color/image_raw 20.0 /color
    rosrun topic_tools throttle messages /camera/gyro/image_info 200.0 /imu
    

    The throttled RGB stream is republished at 20 Hz as the new topic /color, and the IMU at 200 Hz as the new topic /imu.

        Record dynamic.bag (recording the /imu and /color topics together):

    rosbag record -O dynamic /color /imu

        Start the calibration:

    kalibr_calibrate_imu_camera --target ~/bagfiles/checkerboard.yaml --cam ~/bagfiles/camchain.yaml --imu ~/bagfiles/imu.yaml --bag ~/bagfiles/dynamic.bag --bag-from-to 5 45 --show-extraction
    

     

     

    5. RealSense D435i IMU configuration (Intel's official method)

    This method is geared towards production use, but it only uses six static orientations, so there is little calibration data and the error can be comparatively large; it is not ideal for high-accuracy measurement and navigation, and probably not sufficient for SLAM.

    Follow the official white paper: https://www.intel.com/content/www/us/en/support/articles/000032303/emerging-technologies/intel-realsense-technology.html

    While setting up the software environment, sudo pip install numpy kept getting interrupted by network problems (just keep retrying); sudo apt install python-numpy also seems to work. To be sure, try importing numpy to check that the installation succeeded:

    If no error is raised, numpy is installed (thanks to a fellow student for the tip).
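
    A one-liner is enough for that check (plain Python, nothing RealSense-specific):

    python -c "import numpy; print(numpy.__version__)"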


    Tip: to align the depth image with the color image, set align_depth to true in the D435i launch file (rs_camera.launch).
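
    The same option can also be passed on the command line instead of editing the launch file (align_depth is a standard argument of the realsense2_camera launch files):

    roslaunch realsense2_camera rs_camera.launch align_depth:=true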

     

  • Configuring and running ORB-SLAM2 with a RealSense D435i


    Environment: Ubuntu 16.04 LTS, Linux kernel 4.15

    1. Install the Intel RealSense SDK

    Option 1: install the prebuilt official packages (see the referenced tutorial)

    # Register the server's public key
    sudo apt-key adv --keyserver keys.gnupg.net --recv-key C8B3A55A6F3EFCDE || sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-key C8B3A55A6F3EFCDE
    # Add the server to the list of repositories
    sudo add-apt-repository "deb http://realsense-hw-public.s3.amazonaws.com/Debian/apt-repo xenial main" -u
    # Install the libraries
    sudo apt install librealsense2-dkms
    sudo apt install librealsense2-utils
    # Optionally install the developer and debug packages
    sudo apt install librealsense2-dev
    sudo apt install librealsense2-dbg
    # Reconnect the Intel RealSense depth camera and run: realsense-viewer to verify the installation.
    

    Option 2: download the source code and build/install it yourself (see the referenced tutorial)

    # ***************************Prerequisites***************************** #
    # Update Ubuntu distribution, including getting the latest stable kernel
    sudo apt update && sudo apt upgrade && sudo apt dist-upgrade
    # Download the complete source tree with git
    git clone https://github.com/IntelRealSense/librealsense.git
    # Navigate to librealsense root directory and Unplug any connected Intel RealSense camera
    # Install the core packages required to build librealsense binaries and the affected kernel modules
    sudo apt install git libssl-dev libusb-1.0-0-dev pkg-config libgtk-3-dev
    sudo apt install libglfw3-dev libgl1-mesa-dev libglu1-mesa-dev
    # certain librealsense CMAKE flags (e.g. CUDA) require version 3.8+(check: cmake --version)
    # Run Intel Realsense permissions script located from librealsense root directory
    ./scripts/setup_udev_rules.sh
    # Build and apply patched kernel modules
    ./scripts/patch-realsense-ubuntu-lts.sh
    # Tracking Module requires hid_sensor_custom kernel module to operate properly.
    echo 'hid_sensor_custom' | sudo tee -a /etc/modules
    
    # ***************************Building and Installing librealsense2 SDK***************************** #
    # Navigate to librealsense root directory
    mkdir build && cd build
    # Builds librealsense along with the demos and tutorials
    cmake ../ -DBUILD_EXAMPLES=true
    # Recompile and install librealsense binaries
    sudo make uninstall && make clean && make -jX && sudo make install
    # Use make -jX for parallel compilation, where X stands for the number of CPU cores available
    # The shared object will be installed in /usr/local/lib, header files in /usr/local/include
    # The binary demos, tutorials and test files will be copied into /usr/local/bin
    
    # ***************************Testing librealsense2 SDK***************************** #
    # Navigate to librealsense root directory
     cd build/examples/capture
    ./rs-capture 
    

    2. Install the ROS wrapper for Intel RealSense

    See the referenced tutorial:

    # Step 1: Install the latest Intel RealSense SDK
    
    # Step 2: Install the ROS Kinetic --- http://wiki.ros.org/kinetic/Installation/Ubuntu
    
    # Step 3: Install Intel RealSense ROS from Sources
    # Create a catkin workspace
    mkdir -p ~/catkin_ws/src
    cd ~/catkin_ws/src/
    # Clone the latest Intel RealSense ROS into 'catkin_ws/src/'
    git clone https://github.com/IntelRealSense/realsense-ros.git
    cd realsense-ros/
    git checkout `git tag | sort -V | grep -P "^\d+\.\d+\.\d+" | tail -1`
    cd ../..
    # Make sure all dependent ros packages are installed. You can check .travis.yml file for reference.
    sudo apt install ros-kinetic-cv-bridge ros-kinetic-image-transport ros-kinetic-tf ros-kinetic-diagnostic-updater ros-kinetic-ddynamic-reconfigure
    catkin_make -DCATKIN_ENABLE_TESTING=False -DCMAKE_BUILD_TYPE=Release
    catkin_make install
    echo source $(pwd)/devel/setup.bash >> ~/.bashrc
    source ~/.bashrc
    
    # Step 4: Verify that the RealSense camera works under ROS
    # Connect the camera to the computer over USB
    sudo apt install ros-kinetic-rgbd-launch
    roslaunch realsense2_camera rs_rgbd.launch
    # List the topics published by the camera
    rostopic list
    # Two ways to view the camera intrinsics
    rostopic echo /camera/color/camera_info
    rostopic echo /camera/aligned_depth_to_color/camera_info
    # In another terminal
    rviz
    # Nothing shows at first: in the Displays panel (top left), set Fixed Frame to camera_link; Global Status turns green
    # Click Add -> By topic -> /depth_registered -> /points -> PointCloud2
    # Click Add -> By topic -> /color -> /image_raw -> Image
    

    3. Test ORB_SLAM2 with public datasets

    See the referenced tutorial:

    # Prerequisites
    # For Pangolin, OpenCV and Eigen3 installation see: https://blog.csdn.net/jiangchuanhu/article/details/89163864
    
    # Building ORB-SLAM2 library and examples
    git clone https://github.com/raulmur/ORB_SLAM2.git ORB_SLAM2
    cd ORB_SLAM2 && chmod +x build.sh && ./build.sh
    # For build errors see: https://github.com/raulmur/ORB_SLAM2/issues/337
    
    # Test ORB_SLAM2 with publicly available datasets
    
    # 1、Monocular Examples --- TUM Dataset
    # Download a sequence from http://vision.in.tum.de/data/datasets/rgbd-dataset/download and uncompress it.
    # Change TUMX.yaml to TUM1.yaml, TUM2.yaml or TUM3.yaml for freiburg1, freiburg2 and freiburg3 sequences respectively. Change PATH_TO_SEQUENCE_FOLDER to the uncompressed sequence folder.
    ./Examples/Monocular/mono_tum Vocabulary/ORBvoc.txt Examples/Monocular/TUMX.yaml PATH_TO_SEQUENCE_FOLDER
    
    # 3、RGB-D Example --- TUM Dataset
    # Download a sequence from http://vision.in.tum.de/data/datasets/rgbd-dataset/download and uncompress it.
    # We already provide associations for some of the sequences in Examples/RGB-D/associations/. You can generate your own associations file executing:
    python associate.py PATH_TO_SEQUENCE/rgb.txt PATH_TO_SEQUENCE/depth.txt > associations.txt
    # Execute the following command. Change TUMX.yaml to TUM1.yaml, TUM2.yaml or TUM3.yaml for freiburg1, freiburg2 and freiburg3 sequences respectively. Change PATH_TO_SEQUENCE_FOLDER to the uncompressed sequence folder. Change ASSOCIATIONS_FILE to the path of the corresponding associations file.
    ./Examples/RGB-D/rgbd_tum Vocabulary/ORBvoc.txt Examples/RGB-D/TUMX.yaml PATH_TO_SEQUENCE_FOLDER ASSOCIATIONS_FILE
    

    4. Configure and run ORB_SLAM2 on the D435i

    See the referenced tutorial:

    # Add the path including Examples/ROS/ORB_SLAM2 to the ROS_PACKAGE_PATH environment variable. 
    # Open .bashrc file and add at the end the following line. Replace PATH by the folder where you cloned ORB_SLAM2
    export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:PATH/ORB_SLAM2/Examples/ROS
    # Execute build_ros.sh script
    chmod +x build_ros.sh
    ./build_ros.sh
    # For build errors see: https://github.com/raulmur/ORB_SLAM2/issues/337
    

    If ./build_ros.sh fails with an error related to the boost library, edit Examples/ROS/ORB-SLAM2/CMakeLists.txt to add -lboost_system to the link libraries, then rerun ./build_ros.sh:

    set(LIBS 
    ${OpenCV_LIBS} 
    ${EIGEN3_LIBS}
    ${Pangolin_LIBRARIES}
    ${PROJECT_SOURCE_DIR}/../../../Thirdparty/DBoW2/lib/libDBoW2.so
    ${PROJECT_SOURCE_DIR}/../../../Thirdparty/g2o/lib/libg2o.so
    ${PROJECT_SOURCE_DIR}/../../../lib/libORB_SLAM2.so
    -lboost_system              # here
    )
    

    The camera parameters can be obtained as follows: connect the camera to the computer over USB, then run

    roslaunch realsense2_camera rs_rgbd.launch
    rostopic echo /camera/color/camera_info
    

    The message has the following structure:

    ---
    header: 
      seq: 17
      stamp: 
        secs: 1560907148
        nsecs: 588988566
      frame_id: "camera_color_optical_frame"
    height: 480
    width: 640
    distortion_model: "plumb_bob"
    D: [0.0, 0.0, 0.0, 0.0, 0.0]
    K: [615.9417724609375, 0.0, 322.3533630371094, 0.0, 616.0935668945312, 240.44674682617188, 0.0, 0.0, 1.0]
    R: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
    P: [615.9417724609375, 0.0, 322.3533630371094, 0.0, 0.0, 616.0935668945312, 240.44674682617188, 0.0, 0.0, 0.0, 1.0, 0.0]
    binning_x: 0
    binning_y: 0
    roi: 
      x_offset: 0
      y_offset: 0
      height: 0
      width: 0
      do_rectify: False
    ---
    

    Based on two reference articles found online, write a D435i.yaml from the D435i camera parameters above. Compared with the yaml files that ship with ORB-SLAM2, only the Camera Parameters section needs to change. In the camera_info message, K is the row-major 3x3 matrix [fx, 0, cx; 0, fy, cy; 0, 0, 1] and D holds the plumb_bob distortion coefficients [k1, k2, p1, p2, k3], so the values map directly onto the Camera.* entries below.

    %YAML:1.0
    
    #--------------------------------------------------------------------------------------------
    # Camera Parameters. Adjust them!
    #--------------------------------------------------------------------------------------------
    
    # Camera calibration and distortion parameters (OpenCV) 
    Camera.fx: 615.9417724609375
    Camera.fy: 616.0935668945312
    Camera.cx: 322.3533630371094
    Camera.cy: 240.44674682617188
    
    Camera.k1: 0.0
    Camera.k2: 0.0
    Camera.p1: 0.0
    Camera.p2: 0.0
    Camera.p3: 0.0
    
    Camera.width: 640
    Camera.height: 480
    
    # Camera frames per second 
    Camera.fps: 30.0
    
    # IR projector baseline times fx (aprox.)
    # bf = baseline (in meters) * fx; the D435i baseline is 50 mm
    Camera.bf: 30.797   
    
    # Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
    Camera.RGB: 1
    
    # Close/Far threshold. Baseline times.
    ThDepth: 40.0
    
    # Deptmap values factor
    DepthMapFactor: 1000.0
    
    #--------------------------------------------------------------------------------------------
    # ORB Parameters
    #--------------------------------------------------------------------------------------------
    
    # ORB Extractor: Number of features per image
    ORBextractor.nFeatures: 1000
    
    # ORB Extractor: Scale factor between levels in the scale pyramid 	
    ORBextractor.scaleFactor: 1.2
    
    # ORB Extractor: Number of levels in the scale pyramid	
    ORBextractor.nLevels: 8
    
    # ORB Extractor: Fast threshold
    # Image is divided in a grid. At each cell FAST are extracted imposing a minimum response.
    # Firstly we impose iniThFAST. If no corners are detected we impose a lower value minThFAST
    # You can lower these values if your images have low contrast			
    ORBextractor.iniThFAST: 20
    ORBextractor.minThFAST: 7
    
    #--------------------------------------------------------------------------------------------
    # Viewer Parameters
    #--------------------------------------------------------------------------------------------
    Viewer.KeyFrameSize: 0.05
    Viewer.KeyFrameLineWidth: 1
    Viewer.GraphLineWidth: 0.9
    Viewer.PointSize: 2
    Viewer.CameraSize: 0.08
    Viewer.CameraLineWidth: 3
    Viewer.ViewpointX: 0
    Viewer.ViewpointY: -0.7
    Viewer.ViewpointZ: -1.8
    Viewer.ViewpointF: 500
    

    The rgb and depth image topics published by the camera node differ from the RGB and depth topics that ORB-SLAM2 subscribes to, so edit ros_rgbd.cc in ORB-SLAM2/Examples/ROS/ORB-SLAM2/src and change the subscriptions:

    message_filters::Subscriber<sensor_msgs::Image> rgb_sub(nh, "/camera/color/image_raw", 1);
    message_filters::Subscriber<sensor_msgs::Image> depth_sub(nh, "/camera/aligned_depth_to_color/image_raw", 1);
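
    If the upstream ros_rgbd.cc in your checkout still subscribes to the default /camera/rgb/image_raw and camera/depth_registered/image_raw topics, an alternative that avoids editing the source is plain ROS topic remapping on the command line (a sketch; adjust the left-hand names to whatever your ros_rgbd.cc actually subscribes to):

    rosrun ORB_SLAM2 RGBD Vocabulary/ORBvoc.txt Examples/RGB-D/D435i.yaml /camera/rgb/image_raw:=/camera/color/image_raw /camera/depth_registered/image_raw:=/camera/aligned_depth_to_color/image_raw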
    

    Finally, from the ORB_SLAM2 working directory:

    # Rebuild with build_ros.sh
    chmod +x build_ros.sh
    ./build_ros.sh
    # Run ORB_SLAM2
    rosrun ORB_SLAM2 RGBD Vocabulary/ORBvoc.txt Examples/RGB-D/D435i.yaml
    
  • Intel RealSense D435i Calibration


    0. Introduction

    The RealSense D435i contains four modules: two infrared cameras, an infrared projector, an RGB camera and an IMU. These sensors obviously sit at different physical positions, so when processing image and IMU data everything has to be expressed in one common coordinate frame. For example, when running VINS with a D435i, the images and the IMU measurements must live in the same frame, so the pose of the IMU relative to the RGB camera (rotation and translation) has to be calibrated.
      In addition, the camera's inherent parameters (focal length, distortion coefficients, etc.) and the IMU's biases and scale factors need to be known in advance. The relative poses are the extrinsics and the per-sensor parameters are the intrinsics; both should be calibrated before running, whether or not the program can self-calibrate, since a good initial calibration also helps self-calibration.

    1. Installing the calibration tools

    1. Install the RealSense SDK 2.0, including the D435i driver, until realsense-viewer runs and shows the image and depth streams.
    2. Install realsense-ros (ROS Kinetic required). This package reads the D435i data streams directly and publishes the individual topics; the calibration steps below simply subscribe to them.
    3. Install imu_utils (code_utils is required first). This is used to calibrate the IMU noise density and random-walk coefficients.
    4. Install Kalibr. It can calibrate the intrinsics and extrinsics of multiple cameras at once (with a choice of camera models), as well as the camera-IMU extrinsics.

    1.1. imu_utils install

    Follow the official instructions.

    1.2. Kalibr install

    Calibration order: IMU calibration -> camera calibration -> camera+IMU joint calibration. This order is used because the final camera-IMU step needs both the IMU and the camera intrinsics. The steps are detailed below.

    2. IMU calibration

    • Step 1: from the realsense-ros package, start the D435i and publish the IMU data. You can also modify rs_camera.launch directly.
    roslaunch realsense2_camera rs_imu_calibration.launch
    

    rs_imu_calibration.launch is rs_camera.launch with changes for IMU calibration: the acc and gyro streams are aligned and published on a single topic.
    Modified:

    <!-- original: <arg name="unite_imu_method" default=""/> -->  
    <arg name="unite_imu_method"          default="linear_interpolation"/>
    <!-- or set the parameter to copy -->
    <arg name="unite_imu_method"          default="copy"/>
    
    • Step 2: record an IMU bag
     rosbag record -O imu_calibration /camera/imu
    
    • Step 3: run the calibration

    Based on A3.launch in the imu_utils folder, write a launch file for the D435i: d435i_imu_calibration.launch. Note: remember to adjust the max_time_min parameter. The default is 120, i.e. two hours; if the bag holds less than two hours of IMU data the tool never gets past "wait for imu data". My recording stopped on its own for some reason after about an hour, so I set this parameter to the actual data length, 50.

    d435i_imu_calibration.launch:

    <launch>
    
        <node pkg="imu_utils" type="imu_an" name="imu_an" output="screen">
        	<!-- topic name; must match the topic published above -->
            <param name="imu_topic" type="string" value= "/camera/imu"/>
            <!-- imu_name: arbitrary -->
            <param name="imu_name" type="string" value= "d435i"/>
            <!-- where the calibration results are saved -->
            <param name="data_save_path" type="string" value= "$(find imu_utils)/data/"/>
            <!-- recording duration in minutes -->
            <param name="max_time_min" type="int" value= "120"/>
            <!-- sampling rate, i.e. the IMU rate; check it with rostopic hz /camera/imu; set to 200, which matches the rosbag play rate below -->
            <param name="max_cluster" type="int" value= "200"/>
        </node>
        
    </launch>
    

    Run the calibration launch file:

    roslaunch imu_utils d435i_imu_calibration.launch
    
    • Step 4: replay the bag
    rosbag play -r 200 imu_calibration.bag  # play at 200x speed; the IMU recording is long and would take far too long at normal speed
    

    Calibration result imu_utils/data/d435i_imu_param.yaml:

    %YAML:1.0
    ---
    type: IMU
    name: d435i
    Gyr:
       unit: " rad/s"
       avg-axis: # the average over the three axes is used as the calibration result
          gyr_n: 2.1732068912927271e-03 # white noise (noise density)
          gyr_w: 1.7797900203083191e-05 # bias random walk
       x-axis:
          gyr_n: 2.1391701666780305e-03
          gyr_w: 1.9895954849984289e-05
       y-axis:
          gyr_n: 2.5929821451625827e-03
          gyr_w: 1.9897115334756576e-05
       z-axis:
          gyr_n: 1.7874683620375685e-03
          gyr_w: 1.3600630424508712e-05
    Acc:
       unit: " m/s^2"
       avg-axis:  # the average over the three axes is used as the calibration result
          acc_n: 2.3786845794688424e-02  # white noise (noise density)
          acc_w: 5.9166889270489845e-04  # bias random walk
       x-axis:
          acc_n: 2.1534587157119929e-02
          acc_w: 6.2088900106164286e-04
       y-axis:
          acc_n: 2.2044805094743814e-02
          acc_w: 6.2033852597901415e-04
       z-axis:
          acc_n: 2.7781145132201524e-02
          acc_w: 5.3377915107403855e-04
    

    Terminal output:

    [ INFO] [1583551991.407318058]: Loaded imu_topic: /camera/imu
    [ INFO] [1583551991.409118658]: Loaded imu_name: d435i
    [ INFO] [1583551991.410842690]: Loaded data_save_path: /home/ipsg/D435I/imu_utils/src/imu_utils/data/
    [ INFO] [1583551991.412641903]: Loaded max_time_min: 60
    [ INFO] [1583551991.414503358]: Loaded max_cluster: 200
    gyr x  num of Cluster 200
    gyr y  num of Cluster 200
    gyr z  num of Cluster 200
    acc x  num of Cluster 200
    acc y  num of Cluster 200
    acc z  num of Cluster 200
    wait for imu data.
    gyr x  numData 1379952
    gyr x  start_t 1.58355e+09
    gyr x  end_t 1.58355e+09
    gyr x dt 
    -------------3600.83 s
    -------------60.0138 min
    -------------1.00023 h
    gyr x  freq 383.231
    gyr x  period 0.00260939
    gyr y  numData 1379952
    gyr y  start_t 1.58355e+09
    gyr y  end_t 1.58355e+09
    gyr y dt 
    -------------3600.83 s
    -------------60.0138 min
    -------------1.00023 h
    gyr y  freq 383.231
    gyr y  period 0.00260939
    gyr z  numData 1379952
    gyr z  start_t 1.58355e+09
    gyr z  end_t 1.58355e+09
    gyr z dt 
    -------------3600.83 s
    -------------60.0138 min
    -------------1.00023 h
    gyr z  freq 383.231
    gyr z  period 0.00260939
    Gyro X 
    C   -1.06724    30.4679   -8.79353    1.24469 -0.0272342
     Bias Instability 2.04796e-05 rad/s
     Bias Instability 1.9896e-05 rad/s, at 110.646 s
     White Noise 7.64079 rad/s
     White Noise 0.00213917 rad/s
      bias -0.147923 degree/s
    -------------------
    Gyro y 
    C   -1.50544    38.3445   -8.85277    1.13221 -0.0212569
     Bias Instability 1.41036e-05 rad/s
     Bias Instability 1.98971e-05 rad/s, at 110.646 s
     White Noise 9.31086 rad/s
     White Noise 0.00259298 rad/s
      bias 0.01896 degree/s
    -------------------
    Gyro z 
    C  -0.866715     24.625   -5.52922    0.72493 -0.0121455
     Bias Instability 6.32623e-06 rad/s
     Bias Instability 1.36006e-05 rad/s, at 96.9284 s
     White Noise 6.42775 rad/s
     White Noise 0.00178747 rad/s
      bias -0.0351278 degree/s
    -------------------
    ==============================================
    ==============================================
    acc x  numData 1379952
    acc x  start_t 1.58355e+09
    acc x  end_t 1.58355e+09
    acc x dt 
    -------------3600.83 s
    -------------60.0138 min
    -------------1.00023 h
    acc x  freq 383.231
    acc x  period 0.00260939
    acc y  numData 1379952
    acc y  start_t 1.58355e+09
    acc y  end_t 1.58355e+09
    acc y dt 
    -------------3600.83 s
    -------------60.0138 min
    -------------1.00023 h
    acc y  freq 383.231
    acc y  period 0.00260939
    acc z  numData 1379952
    acc z  start_t 1.58355e+09
    acc z  end_t 1.58355e+09
    acc z dt 
    -------------3600.83 s
    -------------60.0138 min
    -------------1.00023 h
    acc z  freq 383.231
    acc z  period 0.00260939
    acc X 
    C -3.05601e-05   0.00142723 -0.000646156  0.000264777 -1.39214e-06
     Bias Instability 0.000620889 m/s^2
     White Noise 0.0215346 m/s^2
    -------------------
    acc y 
    C -3.20282e-05   0.00142617 -0.000709606   0.00030218 -3.36171e-06
     Bias Instability 0.000620339 m/s^2
     White Noise 0.0220448 m/s^2
    -------------------
    acc z 
    C  -1.2485e-05   0.00135554  0.000129763 -1.62121e-05   3.7871e-06
     Bias Instability 0.000533779 m/s^2
     White Noise 0.0277811 m/s^2
    -------------------
    [imu_an-2] process has finished cleanly
    

    You can also skip the record/replay and calibrate online, i.e. only steps 1 and 3. The calibration produces one yaml file and many txt files; the yaml is the important one: it gives the noise density and the random walk for the three accelerometer and gyroscope axes, together with their averages, and those averages are what the IMU+camera joint calibration needs later.

    The default IMU and camera parameters (acc and gyro reported separately) can be queried with the official topics:

    rostopic echo /camera/gyro/imu_info
    rostopic echo /camera/accel/imu_info
    rostopic echo /camera/color/camera_info # camera intrinsics
    

    (screenshot of the rostopic output omitted)

    From this output it appears the IMU is not factory-calibrated. The factory camera intrinsics:

    height: 480
    width: 640
    distortion_model: "plumb_bob"
    D: [0.0, 0.0, 0.0, 0.0, 0.0]
    K: [616.1290893554688, 0.0, 319.9371032714844, 0.0, 616.3303833007812, 240.64352416992188, 0.0, 0.0, 1.0]
    R: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
    P: [616.1290893554688, 0.0, 319.9371032714844, 0.0, 0.0, 616.3303833007812, 240.64352416992188, 0.0, 0.0, 0.0, 1.0, 0.0]
    

    imu_utils also plots Allan deviation curves (figures omitted).

    3. Camera calibration

    Kalibr is used to calibrate the camera intrinsics and the relative poses (extrinsics) between multiple cameras. You need one of the calibration targets Kalibr supports; see "Calibration targets" in the Kalibr wiki and download the april grid, which provides the target pattern and its parameters. Targets of other sizes can also be made. The steps:

    • Step 0: install Kalibr. Use Kalibr's own target-generation script to create an aprilgrid of the desired size; displaying the PDF on a monitor at true scale saves printing, and because the screen is perfectly flat it introduces no extra calibration error.
    kalibr_create_target_pdf --type apriltag --nx 6 --ny 6 --tsize 0.02 --tspace 0.28571429
    #kalibr_create_target_pdf --type apriltag --nx [NUM_COLS] --ny [NUM_ROWS] --tsize [TAG_WIDTH_M] --tspace [TAG_SPACING_PERCENT]
    

    The target yaml then has to be edited to match the chosen dimensions. Along the way this error came up: ImportError: No module named pyx.
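
    That error just means the Python pyx module used by the PDF generator is missing; installing it (Ubuntu/Python 2 package name assumed) should clear it:

    sudo apt-get install python-pyx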
    Example of the modified yaml:

    target_type: 'aprilgrid' #gridtype
    tagCols: 6               #number of apriltags
    tagRows: 6               #number of apriltags
    tagSize: 0.02           #size of apriltag, edge to edge [m]
    tagSpacing: 0.28571429          #ratio of space between tags to tagSize
    
    • Step 1: from the realsense-ros package, start the D435i and publish the camera image topics
    roslaunch realsense2_camera rs_camera.launch                 
    

    Note: the D435i has an infrared projector that casts many small IR dots; with it enabled you will see the dots in rviz, which can hurt the calibration, so switch the projector off while calibrating. Turn it off in realsense-viewer; settings made there remain in effect when the camera is later opened through the ROS wrapper, which is also very handy elsewhere, e.g. for configuring filters.
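
    The emitter can also be toggled through the wrapper's dynamic_reconfigure interface while the node is running (parameter name and namespace as exposed by recent realsense2_camera versions; verify yours with rosrun rqt_reconfigure rqt_reconfigure):

    rosrun dynamic_reconfigure dynparam set /camera/stereo_module emitter_enabled 0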

    • Step 2: place the AprilGrid at a reasonable distance in front of the camera, then move the target slowly so that every camera can see it

    Do not go too far away, or the target features cannot be detected; the calibration checks that enough images contain detected features, otherwise it simply refuses to calibrate. Do not move the target so fast that motion blur appears; we only need images from different positions and angles, sharp and with the target fully visible. Cover as many angles and positions as possible (up, down, left, right), even out towards the edges of the image; moving the target around for about one minute is enough.

    • Step 3: throttle the image topics and record the bag

    Kalibr does not want a high image rate when processing the calibration data; 4 Hz is usual (the later processing threw errors for me, so I changed it to 20). Limit the image rates with:

    rosrun topic_tools throttle messages /camera/color/image_raw 20 /color
    rosrun topic_tools throttle messages /camera/infra2/image_rect_raw 20 /infra_left
    rosrun topic_tools throttle messages /camera/infra1/image_rect_raw 20 /infra_right
    

    Then record (I recorded about two minutes):

    rosbag record -O multicameras_calibration /infra_left /infra_right /color
    
    • Step 4: run Kalibr to compute each camera's intrinsics and extrinsics
    kalibr_calibrate_cameras --target ../yaml/april_6x6_A4.yaml --bag ./bag/0_multicameras_calibration.bag --models pinhole-equi pinhole-equi pinhole-equi --topics /infra_left /infra_right /color --bag-from-to 10 100
    

    or, with a checkerboard target:

    kalibr_calibrate_cameras --target checkerboard_8x11_30x30cm.yaml --bag ./bag/0_multicameras_calibration.bag --models pinhole-equi pinhole-equi omni-radtan omni-radtan --topics /cam0/image_raw /cam1/image_raw /cam2/image_raw /cam3/image_raw
    

    Command notes:
    april_6x6_A4.yaml holds the target parameters (sizes of the grid); I printed the grid scaled down onto A4, so the values had to be adjusted accordingly. pinhole-equi is the chosen camera model; Kalibr offers many, pick your own. --bag-from-to selects a time window, since not every part of the recording is necessarily good. The computation takes a long time (really, a long time) and finally writes a PDF and a TXT file with the intrinsic and extrinsic results.

    Calibrating only the main camera:

    kalibr_calibrate_cameras --target ../yaml/april_6x6_A4.yaml --bag ./bag/0_multicameras_calibration.bag --model pinhole-equi  --topic  /color  --show-extraction --approx-sync 0.04
    

    In the end I calibrated all the cameras together anyway:

    kalibr_calibrate_cameras --target april_6x6_A4.yaml --bag multicameras_calibration.bag --models pinhole-equi pinhole-equi pinhole-equi --topics /infra_left /infra_right /color  --show-extraction --approx-sync 0.04 --bag-from-to 10 100
    # --bag-from-to 10 100 selects the data between 10 s and 100 s
    

    Errors encountered along the way:

    (1) err1: ImportError: cannot import name NavigationToolbar2Wx. There is a known workaround; no recompilation is needed.
    (2) err2:

    Cameras are not connected through mutual observations, please check the dataset. Maybe adjust the approx. sync. tolerance.
    Traceback (most recent call last):
      File "/home/ipsg/D435I/kalibr_ws/devel/bin/kalibr_calibrate_cameras", line 15, in <module>
        exec(compile(fh.read(), python_script, 'exec'), context)
      File "/home/ipsg/D435I/kalibr_ws/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 447, in <module>
        main()
      File "/home/ipsg/D435I/kalibr_ws/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_calibrate_cameras", line 204, in main
        graph.plotGraph()
      File "/home/ipsg/D435I/kalibr_ws/src/kalibr/aslam_offline_calibration/kalibr/python/kalibr_camera_calibration/MulticamGraph.py", line 311, in plotGraph
        edge_label=self.G.es["weight"],
    KeyError: 'Attribute does not exist'
    

    There is a known workaround for this error as well.

    Generated reports:

    (1) PDF (figure omitted)

    (2) TXT

    Calibration results 
    ====================
    Camera-system parameters:
    	cam0 (/infra_left):
    	 type: <class 'aslam_cv.libaslam_cv_python.EquidistantDistortedPinholeCameraGeometry'>
    	 distortion: [ 0.40059551 -0.73828444  4.12512287 -6.18517722] +- [ 0.0096611   0.06314745  0.09801433  0.08414741]
    	 projection: [ 386.71250573  387.18240561  320.55905537  239.8204248 ] +- [ 0.14888357  0.16540833  0.34725177  0.27395728]
    	 reprojection error: [-0.000012, -0.000007] +- [0.148506, 0.106890]
    
    	cam1 (/infra_right):
    	 type: <class 'aslam_cv.libaslam_cv_python.EquidistantDistortedPinholeCameraGeometry'>
    	 distortion: [ 0.36572062 -0.38257775  3.28883181 -6.72070632] +- [ 0.01450564  0.12847927  0.29582794  0.15893508]
    	 projection: [ 386.94988374  387.35006645  321.07250734  240.44316048] +- [ 0.13223764  0.16242541  0.3057414   0.26940346]
    	 reprojection error: [0.000004, -0.000005] +- [0.164897, 0.108821]
    
    	cam2 (/color):
    	 type: <class 'aslam_cv.libaslam_cv_python.EquidistantDistortedPinholeCameraGeometry'>
    	 distortion: [  0.24904226   2.8043341  -15.57145308  26.7679881 ] +- [ 0.0099252   0.09318402  0.2472954   0.11722548]
    	 projection: [ 611.67156667  611.71826379  323.28627384  240.57596379] +- [ 0.10541019  0.13846652  0.29617104  0.23250236]
    	 reprojection error: [0.000003, -0.000001] +- [0.207269, 0.167030]
    
    	baseline T_1_0:
    	 q: [-0.00053684  0.00066745  0.00003144  0.99999963] +- [ 0.00122298  0.00096443  0.00015755]
    	 t: [ 0.04996444  0.00001392  0.0000511 ] +- [ 0.00013475  0.00012981  0.00032963]
    
    	baseline T_2_1:
    	 q: [ 0.00164081 -0.00396511  0.00180494  0.99998916] +- [ 0.00097517  0.00060299  0.00012675]
    	 t: [ 0.01598901 -0.00016613  0.00066788] +- [ 0.00010335  0.00010625  0.00027807]
    
    
    
    Target configuration
    ====================
    
      Type: aprilgrid
      Tags: 
        Rows: 6
        Cols: 6
        Size: 0.021 [m]
        Spacing 0.00614285721 [m]
    

    4. IMU+camera joint calibration

    When calibrating IMU and camera you can include several cameras or just one. The cameras were already calibrated above; if you prefer, you can calibrate only the main camera (the one whose frame serves as the reference) against the IMU.

    • Step 0: prepare the IMU and camera configuration files, filling Kalibr's yaml formats with the values calibrated earlier.

    Copy the averaged Acc and Gyr values into Kalibr's imu.yml file:

    rostopic: /camera/imu  # the topic is renamed below, so change this to rostopic: /imu
    update_rate: 200.0 #Hz
     
    accelerometer_noise_density: 2.3786845794688424e-02  # white noise
    accelerometer_random_walk: 5.9166889270489845e-04  # bias random walk
    gyroscope_noise_density: 2.1732068912927271e-03 # continuous
    gyroscope_random_walk: 1.7797900203083191e-05
    

    For monocular + IMU calibration, copy the single camera's entry out of camchain-multicameras_calibration.yaml into its own file, e.g. color_d435i.yaml for /color + /imu; for multi-camera + IMU, use camchain-multicameras_calibration.yaml directly:

    cam0:
      T_cn_cnm1:
      - [0.9999620402124648, 0.0035968244723977626, 0.007936056188530766, 0.015989010745942417]
      - [-0.0036228484455317336, 0.9999880998809918, 0.003267271879129718, -0.0001661290746026598]
      - [-0.007924209945064884, -0.003295898983009658, 0.9999631712951373, 0.0006678796643165944]
      - [0.0, 0.0, 0.0, 1.0]
      cam_overlaps: [0, 1]
      camera_model: pinhole
      distortion_coeffs: [0.24904225693789128, 2.804334100694488, -15.571453079619436,
        26.767988102446466]
      distortion_model: equidistant
      intrinsics: [611.6715666677603, 611.7182637869023, 323.2862738374678, 240.57596378988336]
      resolution: [640, 480]
      rostopic: /color
    
    • Step 1: from the realsense-ros package, start the D435i with rs_camera.launch so that both the image topics and the IMU topic are published; rs_camera.launch needs to be modified.
      Modified:
    ...
    <arg name="enable_sync"               default="true"/>,  
    ...
     <arg name="unite_imu_method"          default="copy"/>
    

    Purpose: keep the IMU and image streams synchronized, and publish a single merged IMU topic.

    • Step 2: fix the calibration target in place and, while keeping the target features detectable, move the D435i through a full range of orientations and positions; record the bag.

    Follow the Kalibr video: face the target, then move through pitch, yaw and roll while facing it, then translate forwards/backwards, left/right and up/down, exciting the motion fully. Kalibr recommends 200 Hz IMU data and 20 Hz images, so use topic_tools throttle to limit the rates.

    Note: after throttling, always check the actual output rate with rostopic hz /topic; it often differs from the requested value. For example, rosrun topic_tools throttle messages /topic_1 25 /topic_2 on a 40 Hz input gives roughly 20 Hz rather than 25 Hz. The reason is that throttle only forwards a message once at least 1/rate seconds have passed since the previously forwarded one, so the achieved rate is the input rate divided by a whole number.

    Make absolutely sure the target stays clearly visible in the image, and keep the whole recording as short as possible, otherwise the subsequent optimization takes a very long time.

    Kalibr does not want high image rates when processing the data; lower the images to 20 Hz and the IMU to 200 Hz with:

    rosrun topic_tools throttle messages /camera/imu  200 /imu
    rosrun topic_tools throttle messages /camera/color/image_raw 20 /color
    rosrun topic_tools throttle messages /camera/infra2/image_rect_raw 20 /infra_left
    rosrun topic_tools throttle messages /camera/infra1/image_rect_raw 20 /infra_right
    

    Then record (I recorded about two minutes):

    rosbag record -O camera_imu_calibration   /color /imu  /infra_left  /infra_right  # multi-camera
    rosbag record -O camera_imu_calibration   /color /imu  # monocular
    
    • Step 3: run Kalibr to compute the IMU-camera extrinsics

    Monocular + IMU:

    kalibr_calibrate_imu_camera --target april_6x6_A4.yaml --cam  color_d435i.yaml --imu imu_d435i.yaml --bag imu_cameras_calibration.bag --bag-from-to 10 100 --max-iter 15  --show-extraction
    

    Monocular + IMU, with a checkerboard target:

     kalibr_calibrate_imu_camera --target checkerboard_8x11_30x30cm.yaml --cam ./20200528calibr/color_d435i.yaml --imu ./20200528calibr/imu_d435i.yaml --bag  ./0529calibr/camera_imu_calibration.bag  --imu-models scale-misalignment --max-iter 15  --show-extraction 
    

    Multi-camera + IMU:

    kalibr_calibrate_imu_camera --target april_6x6_A4.yaml --cam camchain-multicameras_calibration.yaml --imu imu_d435i.yaml --bag  camera_imu_calibration.bag  --bag-from-to 10 100 --max-iter 15  --show-extraction
    

    Notes:

    --bag-from-to 10 100  use only the data between 10 s and 100 s.
    --max-iter 15  limit the optimization to 15 iterations (default 30).
    --show-extraction  display the feature extraction.

    With several cameras the optimization matrix dimensions became too large and the optimization failed (output omitted):

    Calibration report:

    (1) PDF (figure omitted)

    (2) TXT

    Calibration results
    ===================
    Normalized Residuals
    ----------------------------
    Reprojection error (cam0):     mean 0.309357318376, median 0.251843164322, std: 0.230997537619
    Gyroscope error (imu0):        mean 0.265159164537, median 0.228561532289, std: 0.176229173919
    Accelerometer error (imu0):    mean 0.221421363455, median 0.195499299021, std: 0.128105231257
    
    Residuals
    ----------------------------
    Reprojection error (cam0) [px]:     mean 0.309357318376, median 0.251843164322, std: 0.230997537619
    Gyroscope error (imu0) [rad/s]:     mean 0.00814934517662, median 0.00702456135722, std: 0.00541618982304
    Accelerometer error (imu0) [m/s^2]: mean 0.0744854379606, median 0.065765338454, std: 0.0430941897671
    
    Transformation (cam0):
    -----------------------
    T_ci:  (imu0 to cam0): 
    [[ 0.99996271 -0.00181009  0.00844446  0.02651246]
     [ 0.00171691  0.9999377   0.01102943  0.00830124]
     [-0.0084639  -0.01101452  0.99990352 -0.01286511]
     [ 0.          0.          0.          1.        ]]
    
    T_ic:  (cam0 to imu0): 
    [[ 0.99996271  0.00171691 -0.0084639  -0.02663461]
     [-0.00181009  0.9999377  -0.01101452 -0.00839444]
     [ 0.00844446  0.01102943  0.99990352  0.01254843]
     [ 0.          0.          0.          1.        ]]
    
    timeshift cam0 to imu0: [s] (t_imu = t_cam + shift)
    -0.0253737394755
    
    
    Gravity vector in target coords: [m/s^2]
    [ 0.25486723 -9.78147004 -0.65292373]
    
    
    Calibration configuration
    =========================
    
    cam0
    -----
      Camera model: pinhole
      Focal length: [611.6715666677603, 611.7182637869023]
      Principal point: [323.2862738374678, 240.57596378988336]
      Distortion model: equidistant
      Distortion coefficients: [0.24904225693789128, 2.804334100694488, -15.571453079619436, 26.767988102446466]
      Type: aprilgrid
      Tags: 
        Rows: 6
        Cols: 6
        Size: 0.021 [m]
        Spacing 0.00614285721 [m]
    
    
    
    IMU configuration
    =================
    
    IMU0:
    ----------------------------
      Model: calibrated
      Update rate: 200.0
      Accelerometer:
        Noise density: 0.0237868457947 
        Noise density (discrete): 0.336396799289 
        Random walk: 0.000591668892705
      Gyroscope:
        Noise density: 0.00217320689129
        Noise density (discrete): 0.0307337865951 
        Random walk: 1.77979002031e-05
      T_i_b
        [[ 1.  0.  0.  0.]
         [ 0.  1.  0.  0.]
         [ 0.  0.  1.  0.]
         [ 0.  0.  0.  1.]]
      time offset with respect to IMU0: 0.0 [s]
    

    After the multi-camera and joint calibrations, Kalibr produces a report and the estimated extrinsics. The report includes a diagram of the camera positions, so comparing it against the physical layout gives a fairly intuitive check of whether the result is plausible. Also look at the reprojection error: the smaller the better. Mine came out somewhat large, which I suspect is due to imprecise target parameters. The intrinsics are worth a look too: for my D435i the two infrared cameras calibrated to essentially identical intrinsics, slightly different from the values in /camera_info, but not by much.

    5. VINS yaml configuration

    realsense config update in 202005292.

    %YAML:1.0
    
    #common parameters
    imu_topic: "/camera/imu"
    image_topic: "/camera/color/image_raw"
    output_path: "/home/fb/output/"
    
    #camera calibration
    model_type: PINHOLE
    camera_name: camera
    image_width: 640
    image_height: 480
      #TODO modify distortion
    
    distortion_parameters:
       k1: 0.10364469
       k2: -0.1823355
       p1: 0.002330617
       p2: 0.0037446
    projection_parameters:
       fx: 601.226091
       fy: 601.3432164
       cx: 332.171979
       cy: 240.5101526
    
    # Extrinsic parameter between IMU and Camera.
    estimate_extrinsic: 0   # 0  Have an accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam, don't change it.
                            # 1  Have an initial guess about extrinsic parameters. We will optimize around your initial guess.
                            # 2  Don't know anything about extrinsic parameters. You don't need to give R,T. We will try to calibrate it. Do some rotation movement at beginning.
    #If you choose 0 or 1, you should write down the following matrix.
    #Rotation from camera frame to imu frame, imu^R_cam
    extrinsicRotation: !!opencv-matrix
       rows: 3
       cols: 3
       dt: d
       # data: [  0.99977841,  0.00412757,  0.02064201,
       #         -0.00374241,  0.99981883, -0.01866291,
       #          -0.0207153,   0.01858152,  0.99961273 ] 
       data: [ 0.99998318, -0.00302186, -0.00495021,
               0.00296347,  0.99992645, -0.01176032,
               0.00498539,  0.01174545,  0.99991859 ]   
    #Translation from camera frame to imu frame, imu^T_cam
    extrinsicTranslation: !!opencv-matrix
       rows: 3
       cols: 1
       dt: d
       # data: [-0.02154582, 0.00681016, 0.02740755]
       data: [-0.0132516, -0.00082214, 0.01535377]
    
    
    
    #feature traker paprameters
    max_cnt: 200           # max feature number in feature tracking
    min_dist: 15            # min distance between two features
    freq: 10                # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image
    F_threshold: 1.0        # ransac threshold (pixel)
    show_track: 1           # publish tracking image as topic
    equalize: 0             # if image is too dark or light, trun on equalize to find enough features
    fisheye: 0              # if using fisheye, trun on it. A circle mask will be loaded to remove edge noisy points
    
    #optimization parameters
    max_solver_time: 0.04  # max solver itration time (ms), to guarantee real time
    max_num_iterations: 8   # max solver itrations, to guarantee real time
    keyframe_parallax: 10.0 # keyframe selection threshold (pixel)
    
    #imu parameters       The more accurate parameters you provide, the better performance
    #for handheld, wheeld
    acc_n: 0.021793          # accelerometer measurement noise standard deviation. #0.2
    gyr_n: 0.00215568        # gyroscope measurement noise standard deviation.     #0.05
    acc_w: 0.00050207        # accelerometer bias random work noise standard deviation.  #0.02
    gyr_w: 1.71656e-05      # gyroscope bias random work noise standard deviation.     #4.0e-5
    
    #for tracked applications
    #acc_n: 0.5          # accelerometer measurement noise standard deviation. #0.2
    #gyr_n: 0.01         # gyroscope measurement noise standard deviation.     #0.05
    #acc_w: 0.001         # accelerometer bias random work noise standard deviation.  #0.02
    #gyr_w: 2.0e-5       # gyroscope bias random work noise standard deviation.     #4.0e-5
    
    
    g_norm: 9.805       # gravity magnitude
    
    #loop closure parameters
    loop_closure: 1                    # start loop closure
    fast_relocalization: 1             # useful in real-time and large project
    load_previous_pose_graph: 0        # load and reuse previous pose graph; load from 'pose_graph_save_path'
    #pose_graph_save_path: "/home/fb/output/pose_graph/" # save and load path
    pose_graph_save_path: "*****"
    #unsynchronization parameters
    estimate_td: 1                      # online estimate time offset between camera and imu
    td: -0.0237                          # initial value of time offset. unit: s. readed image clock + td = real image clock (IMU clock)
    
    #rolling shutter parameters
    rolling_shutter: 1                      # 0: global shutter camera, 1: rolling shutter camera
    rolling_shutter_tr: 0.033               # unit: s. rolling shutter read out time per frame (from data sheet).
    
    #visualization parameters
    save_image: 1                   # save image in pose graph for visualization prupose; you can close this function by setting 0
    visualize_imu_forward: 0        # output imu forward propogation to achieve low latency and high frequence results
    visualize_camera_size: 0.4      # size of camera marker in RVIZ
    

    Backup of the VINS_ws configuration:

    %YAML:1.0
    
    #common parameters
    imu_topic: "/camera/imu"
    image_topic: "/camera/color/image_raw"
    output_path: "/home/ipsg/output/"
    
    #camera calibration 
    model_type: PINHOLE
    camera_name: camera
    image_width: 640
    image_height: 480
    distortion_parameters:
       k1: 9.2615504465028850e-02
       k2: -1.8082438825995681e-01
       p1: -6.5484100374765971e-04
       p2: -3.5829351558557421e-04
    projection_parameters:
       fx: 616.1290893554688
       fy: 616.3303833007812
       cx: 319.9371032714844
       cy: 240.64352416992188
    
    
    # Extrinsic parameter between IMU and Camera.
    estimate_extrinsic: 2   # 0  Have an accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam, don't change it.
                            # 1  Have an initial guess about extrinsic parameters. We will optimize around your initial guess.
                            # 2  Don't know anything about extrinsic parameters. You don't need to give R,T. We will try to calibrate it. Do some rotation movement at beginning.                        
    #If you choose 0 or 1, you should write down the following matrix.
    #Rotation from camera frame to imu frame, imu^R_cam
    extrinsicRotation: !!opencv-matrix
       rows: 3
       cols: 3
       dt: d
       data: [ 0.99964621,  0.01105994,  0.02418954,
               -0.01088975,  0.9999151,  -0.00715601, 
               -0.02426663,  0.00689006,  0.99968178]
    #Translation from camera frame to imu frame, imu^T_cam
    extrinsicTranslation: !!opencv-matrix
       rows: 3
       cols: 1
       dt: d
       data: [0.07494282, -0.01077138, -0.00641822]
    
    #feature traker paprameters
    max_cnt: 150            # max feature number in feature tracking
    min_dist: 25            # min distance between two features 
    freq: 10                # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image 
    F_threshold: 1.0        # ransac threshold (pixel)
    show_track: 1           # publish tracking image as topic
    equalize: 0             # if image is too dark or light, trun on equalize to find enough features
    fisheye: 0              # if using fisheye, trun on it. A circle mask will be loaded to remove edge noisy points
    
    #optimization parameters
    max_solver_time: 0.04  # max solver itration time (ms), to guarantee real time
    max_num_iterations: 8   # max solver itrations, to guarantee real time
    keyframe_parallax: 10.0 # keyframe selection threshold (pixel)
    
    #imu parameters       The more accurate parameters you provide, the better performance
    #acc_n: 0.1          # accelerometer measurement noise standard deviation. #0.2
    #gyr_n: 0.01         # gyroscope measurement noise standard deviation.     #0.05
    #acc_w: 0.0002         # accelerometer bias random work noise standard deviation.  #0.02
    #gyr_w: 2.0e-5       # gyroscope bias random work noise standard deviation.     #4.0e-5
    #g_norm: 9.805       # gravity magnitude
    acc_n: 0.2          # accelerometer measurement noise standard deviation. #0.2
    gyr_n: 0.05         # gyroscope measurement noise standard deviation.     #0.05
    acc_w: 0.02         # accelerometer bias random work noise standard deviation.  #0.02
    gyr_w: 4.0e-5       # gyroscope bias random work noise standard deviation.     #4.0e-5
    g_norm: 9.81       # gravity magnitude
    
    
    
    #loop closure parameters
    loop_closure: 1                    # start loop closure
    fast_relocalization: 1             # useful in real-time and large project
    load_previous_pose_graph: 0        # load and reuse previous pose graph; load from 'pose_graph_save_path'
    pose_graph_save_path: "/home/tony-ws1/output/pose_graph/" # save and load path
    
    #unsynchronization parameters
    estimate_td: 1                      # online estimate time offset between camera and imu
    td: 0.000                           # initial value of time offset. unit: s. readed image clock + td = real image clock (IMU clock)
    
    #rolling shutter parameters
    rolling_shutter: 0                      # 0: global shutter camera, 1: rolling shutter camera
    rolling_shutter_tr: 0             # unit: s. rolling shutter read out time per frame (from data sheet). 
    
    #visualization parameters
    save_image: 1                   # save image in pose graph for visualization prupose; you can close this function by setting 0 
    visualize_imu_forward: 0        # output imu forward propogation to achieve low latency and high frequence results
    visualize_camera_size: 0.4      # size of camera marker in RVIZ
    
  • An overview of the differences between the Intel RealSense D415, D435, D435i and T265 cameras (映维网, 2019-04-23).
  • Detailed calibration steps for the Intel RealSense D435i

    The official document: https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/RealSense_Depth_D435i_IMU_Calib.pdf ; others use imu_utils and kalibr (a very powerful calibration...
  • The lab's new D435i arrived today and needed setting up. The D435i is the upgraded D435; the only difference is the added IMU (the D435i has one), otherwise they are identical (according to the official site). Bought on JD.com for 1999 CNY including the invoice. First, set up the development environment, ...
  • Based on the cuda10.2-cudnn8-devel-ubuntu18.04 image
  • realsense D435i gazebo slam (px4) simulation

    RealSense D435i Gazebo SLAM simulation: contains urdf and sdf files for the RealSense T265 and D435i, the realsense_gazebo_plugin package, and usage examples for the RealSense model files. To download the simulation models ([catkin_ws] is your own workspace directory): mkdir -p [catkin...
  • References: 1. How to run VIO algorithms such as VINS-Mono with a RealSense D435i and obtain synchronized IMU data; 2. Installing the SDK and ROS wrapper for the RealSense D435i on Ubuntu and running ORB-SLAM2, RTAB-Map and VINS-Mono; 3. A practical summary of rosbag recording and playback.
  • Comparison of the RealSense D435i (2018), D415 (2018) and T265 (2019): depth range (m) 0.4 m -> 10 m (0.4-6 m recommended), 0.2 m - 10 m, 0.3 m - 10 m; power consumption 360 mW ...
  • Calibrating the Intel D435i depth camera

    The official calibration guide; download the PDF locally first, it is easier to study: ... this other link also looks good; calibration with OpenCV (Python). I plan to add more after finishing the tests...
  • Calibrating the Intel D435i camera with Kalibr

    Contents: 1. camera calibration; 2. IMU calibration; 3. camera+IMU joint calibration; 4. evaluating the calibration results. Environment: Ubuntu 16.04 + ROS Kinetic. Prerequisites: basic ROS knowledge (such as how to use the current terminal...
