
PaintCopter: An Autonomous UAV for Spray Painting on Three-Dimensional Surfaces



Abstract:

This letter describes a system for autonomous spray painting using an unmanned aerial vehicle (UAV), suitable for industrial applications. The work is motivated by the potential for such a system to achieve accurate and fast painting results. The PaintCopter is a quadrotor that has been custom fitted with an arm plus a spray gun on a pan-tilt mechanism. To enable long deployment times for industrial painting tasks, power and paint are delivered by lines from an external unit. The ability to paint planar surfaces, such as walls in single color, is a basic requirement for a spray painting system. But this work addresses more sophisticated operation that subsumes the basic task, including painting on three-dimensional (3D) structure, and painting of a desired texture appearance. System operation consists of (a) an offline component to capture a 3D model of the target surface, (b) an offline component to design the painted surface appearance and generate the associated robotic painting commands, and (c) a live system that carries out the spray painting. Experimental results demonstrate autonomous spray painting by the UAV, doing area fill and versatile line painting on a 3D surface.
Published in: IEEE Robotics and Automation Letters (Volume: 3, Issue: 4, October 2018)
Page(s): 2862 - 2869
Date of Publication: 11 June 2018
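The system operation described in the abstract splits into three stages: offline capture of a 3D surface model, offline design of the painted appearance with generation of robotic painting commands, and live execution of the spray painting. As a rough illustration of the command-generation stage only, the sketch below rasterizes a flat rectangular patch into a boustrophedon area-fill path of spray waypoints; SprayWaypoint, plan_area_fill, and the patch parameters are hypothetical names, not taken from the paper's implementation.

```python
# Hypothetical sketch (not from the paper): generating area-fill spray commands
# for a flat rectangular patch, as the offline command-generation stage might
# do for a single-color region.
from dataclasses import dataclass
from typing import List


@dataclass
class SprayWaypoint:
    x: float        # target point on the surface, metres
    y: float
    z: float
    spray_on: bool  # whether the spray gun trigger is open on the move to this point


def plan_area_fill(width: float, height: float, stripe_spacing: float,
                   z: float = 0.0) -> List[SprayWaypoint]:
    """Boustrophedon (back-and-forth) raster over a width x height patch.

    The spray is switched off on the short connecting moves between stripes,
    so paint is deposited only along the long passes.
    """
    waypoints: List[SprayWaypoint] = []
    y = 0.0
    left_to_right = True
    while y <= height + 1e-9:
        x_start, x_end = (0.0, width) if left_to_right else (width, 0.0)
        waypoints.append(SprayWaypoint(x_start, y, z, spray_on=True))
        waypoints.append(SprayWaypoint(x_end, y, z, spray_on=True))
        # Connecting move to the next stripe, spray off.
        if y + stripe_spacing <= height + 1e-9:
            waypoints.append(SprayWaypoint(x_end, y + stripe_spacing, z, spray_on=False))
        y += stripe_spacing
        left_to_right = not left_to_right
    return waypoints


if __name__ == "__main__":
    for wp in plan_area_fill(width=2.0, height=1.0, stripe_spacing=0.25):
        print(wp)
```

Line painting of a designed texture would presumably replace the raster with stroke polylines mapped onto the captured 3D model, but the output, a sequence of surface target points with spray on/off state, would play the same role.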


I. Introduction

Robotic painting using a UAV has the potential to produce accurate (predictable and repeatable) painted appearance, to be low-cost, and to avoid the need for scaffolding and ladders. This motivates our work on PaintCopter for autonomous spray painting. A basic painting task is to paint a planar surface, such as a wall, in a single color. This letter addresses a larger challenge that subsumes the basic task in two ways: the ability to paint on 3D structure, and the ability to paint texture. Fig. 1 provides an illustration in which painting is done on a synthetic rock; the goal is to paint a uniform base color and then overlay color striations, producing a (user-designed) rock-like surface appearance. This type of task, combining painting on a 3D object with painting a texture for theming/styling, is the motivation for our work. It requires a more sophisticated approach than the single-color wall painting problem, and it explains design choices that would be unnecessary for the basic task alone, for example, the use of a spray gun on a pan-tilt unit (PTU) instead of a rigid mount.
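One consequence of mounting the spray gun on a pan-tilt unit rather than a rigid mount is that the nozzle can be re-aimed at a target point on the surface without repositioning the whole vehicle. Below is a minimal sketch of that aiming computation, assuming the target point has already been transformed into the PTU base frame (x forward, y left, z up) and that pan rotates about z while tilt is measured as elevation; the function name and frame conventions are assumptions, not the paper's implementation.

```python
import math


def aim_pan_tilt(x: float, y: float, z: float) -> "tuple[float, float]":
    """Pan/tilt angles (radians) that point the nozzle at a target point.

    The target (x, y, z) is expressed in the pan-tilt unit's base frame,
    assumed here to be x forward, y left, z up. Pan rotates about z
    (positive towards +y); tilt is the elevation above the x-y plane.
    """
    pan = math.atan2(y, x)                   # heading to the target in the x-y plane
    tilt = math.atan2(z, math.hypot(x, y))   # elevation of the target above that plane
    return pan, tilt


# Example: a target 1.5 m ahead, 0.2 m to the left, and 0.3 m below the nozzle.
pan, tilt = aim_pan_tilt(1.5, 0.2, -0.3)
print(f"pan = {math.degrees(pan):.1f} deg, tilt = {math.degrees(tilt):.1f} deg")
```

Composed with the UAV's pose estimate, such a computation would let the live system keep a designed stroke on target while the vehicle holds a coarser standoff trajectory; with a rigidly mounted gun, those two degrees of freedom would have to come from the vehicle itself.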
