Optical Flow-based Spherical Camera Motion Estimation and 3D Reconstruction

Spherical images suffer from high distortion, which can induce errors in feature point tracking and offset the advantage of their large fields of view. This research presents a novel approach that uses dense optical flow for distortion-robust spherical camera motion estimation and 3D reconstruction. Because dense optical flow incorporates smoothness terms, it is free of local outliers and encodes both the camera motion and dense 3D information. The approach decomposes the dense optical flow into the epipolar geometry and a dense 3D model of the environment, and reprojects this model to estimate the 6-DoF camera motion.
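
A minimal sketch of the underlying idea, in Python with OpenCV and NumPy (this is not the authors' optimization; the equirectangular-to-bearing convention, the Farneback flow parameters, and the linear essential-matrix fit are illustrative assumptions): dense equirectangular optical flow is treated as a field of spherical bearing correspondences, from which the epipolar geometry is recovered.

    import cv2
    import numpy as np

    def equirect_to_bearing(x, y, w, h):
        """Convert equirectangular pixel coordinates to unit bearing vectors."""
        lon = x / w * 2.0 * np.pi - np.pi            # longitude in [-pi, pi)
        lat = np.pi / 2.0 - y / h * np.pi            # latitude in [-pi/2, pi/2]
        return np.stack([np.cos(lat) * np.cos(lon),
                         np.cos(lat) * np.sin(lon),
                         np.sin(lat)], axis=-1)

    def epipolar_from_flow(img1, img2, step=16):
        """Fit an essential matrix to dense flow between two grayscale equirectangular frames."""
        h, w = img1.shape
        flow = cv2.calcOpticalFlowFarneback(img1, img2, None, 0.5, 4, 21, 5, 7, 1.5, 0)
        ys, xs = np.mgrid[0:h:step, 0:w:step]
        ys, xs = ys.ravel(), xs.ravel()
        b1 = equirect_to_bearing(xs.astype(np.float64), ys.astype(np.float64), w, h)
        b2 = equirect_to_bearing(xs + flow[ys, xs, 0], ys + flow[ys, xs, 1], w, h)
        # Linear (8-point style) fit of b2^T E b1 = 0 over all sampled bearing pairs.
        A = np.einsum('ni,nj->nij', b2, b1).reshape(-1, 9)
        _, _, vt = np.linalg.svd(A)
        E = vt[-1].reshape(3, 3)
        u, _, vt = np.linalg.svd(E)
        return u @ np.diag([1.0, 1.0, 0.0]) @ vt     # rank-2 essential matrix (R, t up to scale)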

Related papers:

  1. Sarthak Pathak, Alessandro Moro, Atsushi Yamashita and Hajime Asama: "Optical Flow-based Epipolar Estimation of Spherical Image Pairs for 3D Reconstruction", SICE Journal of Control, Measurement, and System Integration, Vol. 10, No. 5, pp. 476-485, September 2017. [doi:10.9746/jcmsi.10.476]
  2. Sarthak Pathak, Alessandro Moro, Atsushi Yamashita and Hajime Asama: "Dense 3D Reconstruction from Two Spherical Images via Optical Flow-based Equirectangular Epipolar Rectification", Proceedings of the 2016 IEEE International Conference on Imaging Systems and Techniques (IST2016), pp. 140-145, Crete Island (Greece), October 2016. [doi:10.1109/IST.2016.7738212] [PDF]
  3. Sarthak Pathak, Alessandro Moro, Hiromitsu Fujii, Atsushi Yamashita and Hajime Asama: "Distortion-Robust Spherical Camera Motion Estimation via Dense Optical Flow", Proceedings of the 2018 IEEE International Conference on Image Processing (ICIP2018), Athens (Greece), October 2018.

Optical Flow-based Spherical Video Stabilization

This research produces a rotation-free, stabilized video from footage taken by a moving spherical camera. Rotation is removed directly in the image with a novel algorithm that aligns the dense spherical optical flow field along the epipolar direction, exploiting the directional symmetry of purely translational pixel motion in spherical images.
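
A minimal sketch of the derotation step, assuming a per-frame rotation estimate R is already available from the flow-alignment stage; the equirectangular mapping convention is an illustrative assumption and seam handling is simplified.

    import cv2
    import numpy as np

    def derotate_equirect(img, R):
        """Remove a rotation R (mapping original-frame bearings to stabilized-frame
        bearings) from an equirectangular frame."""
        h, w = img.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
        lon = (xs + 0.5) / w * 2.0 * np.pi - np.pi
        lat = np.pi / 2.0 - (ys + 0.5) / h * np.pi
        # Bearing vectors of the stabilized (output) pixels.
        b = np.stack([np.cos(lat) * np.cos(lon),
                      np.cos(lat) * np.sin(lon),
                      np.sin(lat)], axis=-1)
        # Rotate them back into the original frame to look up colors (R^T b).
        b_src = b @ R
        lon_s = np.arctan2(b_src[..., 1], b_src[..., 0])
        lat_s = np.arcsin(np.clip(b_src[..., 2], -1.0, 1.0))
        map_x = ((lon_s + np.pi) / (2.0 * np.pi) * w - 0.5).astype(np.float32)
        map_y = ((np.pi / 2.0 - lat_s) / np.pi * h - 0.5).astype(np.float32)
        return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR)

Applying this remap with the accumulated rotation of each frame yields the rotation-free video.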

Related papers:

  1. Sarthak Pathak, Alessandro Moro, Atsushi Yamashita and Hajime Asama: "A Decoupled Virtual Camera Using Spherical Optical Flow", Proceedings of the 2016 IEEE International Conference on Image Processing (ICIP2016), pp. 4488-4492, Phoenix (USA), September 2016. [doi:10.1109/ICIP.2016.7533209] [PDF] [Movie]
  2. Sarthak Pathak, Alessandro Moro, Hiromitsu Fujii, Atsushi Yamashita and Hajime Asama: "Spherical Video Stabilization by Estimating Rotation from Dense Optical Flow Fields", Journal of Robotics and Mechatronics, Vol. 29, No. 3, pp. 566-579, June 2017. [doi:10.20965/jrm.2017.p0566]

Spherical Video Occlusion Removal (first author: Binbin Xu)

Spherical cameras are widely used for their full 360° field of view. However, a common and severe problem is that whatever carries the camera is always included in the view, occluding visual information. In this research, a novel method is proposed to remove such occlusions from videos taken by a freely moving spherical camera. This is achieved by inpainting the color and motion information of the occluded pixels: the missing color and motion inside the occluded region are recovered iteratively in a coarse-to-fine optimization that enforces spatial and temporal coherence while accounting for spherical image geometry.
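
A simplified, single-pass sketch of the flow-guided color propagation idea (the actual method recovers color and motion jointly in an iterative coarse-to-fine optimization on the sphere; the flow extrapolation, ring width, and flow parameters below are illustrative assumptions).

    import cv2
    import numpy as np

    def fill_occlusion(frame, neighbor, mask):
        """frame, neighbor: BGR frames; mask: uint8, 255 inside the occluded region."""
        g0 = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        g1 = cv2.cvtColor(neighbor, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(g0, g1, None, 0.5, 4, 21, 5, 7, 1.5, 0)
        # The flow is unreliable inside the occluded region (the occluder sits there),
        # so crudely extrapolate it from a ring just outside the mask.
        ring = cv2.dilate(mask, np.ones((15, 15), np.uint8)) - mask
        flow[mask > 0] = flow[ring > 0].mean(axis=0)
        # Pull colors for the occluded pixels from the neighboring frame.
        h, w = mask.shape
        ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
        warped = cv2.remap(neighbor, xs + flow[..., 0], ys + flow[..., 1], cv2.INTER_LINEAR)
        out = frame.copy()
        out[mask > 0] = warped[mask > 0]
        return out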

Related papers:

  1. Binbin Xu, Sarthak Pathak, Hiromitsu Fujii, Atsushi Yamashita and Hajime Asama: "Spatio-temporal Video Completion in Spherical Image Sequences", IEEE Robotics and Automation Letters, Vol. 2, No. 4, pp. 2032-2039, October 2017. [doi:10.1109/LRA.2017.2718106] [Movie]
  2. Binbin Xu, Sarthak Pathak, Hiromitsu Fujii, Atsushi Yamashita and Hajime Asama: "Optical Flow-based Video Completion in Spherical Image Sequences", Proceedings of the 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO2016), pp. 388-395, Qingdao (China), December 2016. [doi:10.1109/ROBIO.2016.7866353] [PDF]

Global Visual Localization of Spherical Images Based on Lines (first author: Tsubasa Goto)

In this research, indoor localization is achieved using a spherical camera. Spherical cameras capture a complete view of the surroundings and thus allow the use of global environmental information. Taking advantage of this, the position and orientation with respect to a known 3D line map of an indoor environment are estimated from a single image. 2D line information is robustly extracted from the spherical image via spherical-gradient filtering in the Hough space and matched to the 3D line information in the map.
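
A minimal sketch of the 3D-2D line alignment idea, in Python with NumPy (the inputs and the scoring are illustrative assumptions; the Hough-space extraction and the matching step are not shown): a 3D line projects onto the sphere as a great circle, so a candidate pose can be scored by comparing the normals of the projected great-circle planes with the normals of the detected spherical lines, assuming correspondences are given.

    import numpy as np

    def pose_score(R, t, map_lines, detected_normals):
        """R: world-to-camera rotation; t: camera position in world coordinates.
        map_lines: list of (point, direction) pairs in world coordinates.
        detected_normals: unit normals of detected great-circle planes (camera frame)."""
        score = 0.0
        for (p, d), m in zip(map_lines, detected_normals):
            v = R @ (p - t)                  # a point on the 3D line, in the camera frame
            u = R @ d                        # the line direction, in the camera frame
            n = np.cross(v, u)
            n /= np.linalg.norm(n) + 1e-12   # normal of the projected great-circle plane
            score += abs(n @ m)              # 1.0 when projection and detection coincide
        return score / len(map_lines)

Evaluating such a score over candidate poses (or refining it with nonlinear optimization) yields the global position and orientation.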

Related papers:

  1. Tsubasa Goto, Sarthak Pathak, Yonghoon Ji, Hiromitsu Fujii, Atsushi Yamashita and Hajime Asama: "3D-2D Matching Based on Line Features for Position and Orientation Estimation of a Spherical Camera in Man-made Environments" (in Japanese), Journal of the Japan Society for Precision Engineering (精密工学会誌), Vol. 83, No. 12, pp. 1209-1215, December 2017. [doi:10.2493/jjspe.83.1209]
  2. Tsubasa Goto, Sarthak Pathak, Yonghoon Ji, Hiromitsu Fujii, Atsushi Yamashita and Hajime Asama: "Spherical Camera Localization in Man-made Environment Using 3D-2D Matching of Line Information", Proceedings of the International Workshop on Advanced Image Technology 2017 (IWAIT2017), Penang (Malaysia), January 2017. [PDF]
  3. Tsubasa Goto, Sarthak Pathak, Yonghoon Ji, Hiromitsu Fujii, Atsushi Yamashita and Hajime Asama: "Line-based Global Localization of a Spherical Camera in Manhattan Worlds", Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA2018), Brisbane (Australia), May 2018. [PDF] [Movie]

Distortion-resistant Convolutional Neural Networks for Spherical Camera Motion Estimation (first author: Dabae Kim)

In this research, spherical camera motion is estimated in a distortion-resistant manner via dense optical flow. The distorted optical flow field on the equirectangular image is rotated to uniformize its patterns across all axes, which leads to better learning. It is shown that naively applying convolutions to a single equirectangular image without this uniformization results in weak learning that does not improve even when the number of convolutional layers is increased.
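
A minimal sketch of the uniformization step (the equirectangular conventions, the choice of rotation R, and the final resampling are illustrative assumptions): the equirectangular flow field is re-expressed after a global spherical rotation by rotating both the start and end bearings of every flow vector and re-projecting them.

    import numpy as np

    def equirect_to_bearing(x, y, w, h):
        lon = x / w * 2.0 * np.pi - np.pi
        lat = np.pi / 2.0 - y / h * np.pi
        return np.stack([np.cos(lat) * np.cos(lon),
                         np.cos(lat) * np.sin(lon),
                         np.sin(lat)], axis=-1)

    def bearing_to_equirect(b, w, h):
        lon = np.arctan2(b[..., 1], b[..., 0])
        lat = np.arcsin(np.clip(b[..., 2], -1.0, 1.0))
        return (lon + np.pi) / (2.0 * np.pi) * w, (np.pi / 2.0 - lat) / np.pi * h

    def rotate_flow_field(flow, R):
        """Rotate an equirectangular flow field (H x W x 2, pixel units) by R."""
        h, w = flow.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
        b0 = equirect_to_bearing(xs, ys, w, h)
        b1 = equirect_to_bearing(xs + flow[..., 0], ys + flow[..., 1], w, h)
        x0, y0 = bearing_to_equirect(b0 @ R.T, w, h)   # rotated start points
        x1, y1 = bearing_to_equirect(b1 @ R.T, w, h)   # rotated end points
        # The rotated vectors live at the irregular positions (x0, y0); resample
        # them onto the regular pixel grid (handling the longitude seam) before
        # feeding them to the network.
        return np.stack([x1 - x0, y1 - y0], axis=-1), np.stack([x0, y0], axis=-1)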

Related papers:

  1. Dabae Kim, Sarthak Pathak, Alessandro Moro, Ren Komatsu, Atsushi Yamashita and Hajime Asama: "E-CNN: Accurate Spherical Camera Rotation Estimation via Uniformization of Distorted Optical Flow Fields", Proceedings of the 2019 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP2019), pp. 2232-2236, Brighton (UK), May 2019. [doi:10.1109/ICASSP.2019.8682203] [PDF]
  2. Dabae Kim, Sarthak Pathak, Alessandro Moro, Ren Komatsu, Atsushi Yamashita and Hajime Asama: "Construction of E-CNN for Rotation Estimation of a Spherical Camera" (in Japanese), Proceedings of the 19th SICE System Integration Division Annual Conference (SI2018), Osaka (Japan), December 2018.