
Integration of Hardware and Software Designs for Object Grasping and Transportation by a Mobile Robot With Navigation Guidance Via a Unique Bearing-Alignment Mechanism

Open Access


Abstract:

In this paper, we consider the systematic design and implementation of grasping and transportation by a mobile robot with low implementation complexity. The study is divided into three aspects: 1) design of a mechanical gripper and an arm mounted on the robot for grasping; 2) design of the hardware communication configuration (HCC) for transportation guidance, which includes a destination beacon and a sensor system on the robot that work together through designed event-triggered communication mechanisms, among which the unique bearing-alignment mechanism calibrates the heading direction of the robot; the HCC is low cost in the sense of low energy consumption; and 3) an effective and low-cost heuristic transportation algorithm that is tightly incorporated with the designed HCC in an event-triggered manner. The particular contribution is the tight integration of the heuristic transportation algorithm and the HCC; the very simple nature of the transportation algorithm is due to this integration, which passes some computational functions to the HCC. Numerical simulations and experiments are presented, showing the effectiveness of the proposed system.
Published in: IEEE/ASME Transactions on Mechatronics ( Volume: 21, Issue: 1, February 2016)
Page(s): 576 - 583
Date of Publication: 05 May 2015



SECTION I.

Introduction

Mobile robot manipulation and transportation are becoming increasingly important in many applications [1]–[5], [21], [22]. For example, grasping using robotic multifingered hands was developed in [1] and [2]. Cooperative manipulation of a rigid body with multiple attached unilateral thrusters was investigated in [3]. A grasper was implemented to perform pick-and-place tasks [4]. Hand exoskeletons were designed in [5]. Cooperative transportation using multiple nonholonomic robots with physical rigid-formation-motion constraints was investigated in [29].

As for positioning or localization [6]–[14], [32]–[35] and path planning [15], [16], [23]–[26], which are quite essential in transportation, different methods have been proposed. Localization techniques include, e.g., the global positioning system, wheeled odometry [8], ultrasound localization [9], visual odometry [10]–[12], maps of the environment, and multisensor fusion methods [14]. For example, a visual odometry method with 3-D sensors was used in [11]. In [10], a fusion of monocular vision and odometry/AHRS sensors (AHRS: three-axial gyroscopes, accelerometers, and magnetometers) was proposed, with parallel processing and graphics processing units used to improve the performance. Generally, visual odometry methods are computationally intensive, with higher requirements on hardware resources. In [8], a local sensor fusion technique with a global position correction was proposed, using a zenithal camera for global sensing and the sensors (encoders, gyroscope, accelerometers, compass, etc.) on the robot for localization, which consumed higher computational resources, as in [9]–[11] and [15]. An onboard robotic module was designed for determining relative positions among miniature robots [13]. For a system of unmanned air and ground vehicles, a cooperative path planning method was proposed, with vision occlusions for obstacle avoidance [16]. For path planning techniques, also refer to [20].

Mechatronic implementation of a system also requires the design of hardware and software architectures, as in systems for UAV tracking [17], UAVs-GAVs [18], wall-climbing robots [19], flapping-wing robots [31], etc. To the best of our knowledge, there is relatively little literature dealing with the implementation of a mechatronic system for grasping and transportation, particularly with low implementation complexity.

This paper considers a systematic design and implementation for grasping a cylinder-shaped object and transporting it to the destination. Microcontroller units (MCUs) with limited computational capacity are used; thus, we are particularly concerned with low manipulation and path-planning complexity. The platform is a three-wheeled nonholonomic mobile robot.

For grasping, we design a mechanical gripper together with an arm, equipped with a set of sensors and mounted on the platform.

For path planning, one of the distinct characteristics of this paper is that we do not rely on the positioning or localization techniques mentioned earlier. Instead, we guide the robot with a rough direction, or bearing, of the destination provided by the specifically designed destination beacon and the sensor system on the robot. The necessary infrared and laser emitters and receivers are equipped on both the beacon and the robot. In our design, the robot first moves along rough directions of the destination provided by the infrared system; during this process, the robot has the opportunity to calibrate an accurate direction to the destination beacon via the specially designed event-triggered bearing-alignment mechanism, with very low energy and computational cost. We then present an effective and computationally low-cost heuristic algorithm for transportation that is tightly incorporated with the designed hardware communication configuration (HCC) in an event-triggered manner, which is robust and suitable for limited onboard computational resources. The very simple and effective nature of the algorithm is due to the designed HCC, which is also low cost (in the sense of low energy consumption); the transportation algorithm integrated with the HCC can also be viewed as a simple yet efficient version of sensor fusion, i.e., information inference using multiple cues or signals [27] (although [27] discusses the inference of human mind states, the idea is readily applicable to sensor applications). By comparison, the transportation methods mentioned earlier that use visual odometry (unavoidably influenced by lighting), visibility-graph construction, multisensor fusion, etc., have clearly higher computational costs.

For robot control, we design a modular, hierarchical approach in the software architecture that decomposes the task into a set of high-level behaviors and low-level behaviors that are suitable for the whole implementation.

The remainder of this paper is organized as follows. Section II presents the mechanical design. Section III presents the HCC, mainly for transportation. Section IV describes grasp and release. Section V presents the transportation algorithm. Section VI concludes the paper.

SECTION II.

Mechanical Design

To facilitate grasping, there are two mechanical components designed: the mechanical gripper and the robot arm.

The MCU used on the robot is an ATmega328p (Flash: 32 kB, RAM: 2 kB, clock: 20 MHz). The platform is a nonholonomic three-wheeled robot (two wheels are driven by servo motors and the third is a caster). The geometric parameters of the robot are given in Appendix B (see Table III).

A. Design of Gripper

Assume the size of the cylinder-shaped objects is known prior to grasping. To implement grasping, we design the mechanical gripper shown in Fig. 1; the claw of the gripper is driven by a servo motor with two gears for mechanical transmission.

Fig. 1. Gripper with the main parameters (notation ø represents the diameter).

The gripper here is designed for cylindrical objects; for objects of other shapes that are known a priori, one may simply consider replacing the claw (see Fig. 1), which is not the focus of this paper.

B. Design of Robot Arm

To facilitate grasping with the gripper, we design a 3-DOF arm (see Fig. 2); each joint is driven by a servo motor (CYS model S0090) with a 180° range of motion. The robot arm is made of aluminum alloy. The lengths of the three parts of the arm are l_1 = l_3 = 10\,{\rm cm} and l_2 = 12\,{\rm cm}. The posture of grasping is shown in Fig. 10 in Section IV.

Fig. 2. Mechanical structure of the arm of the robot.

Fig. 3. Design of the destination beacon.

Fig. 4. Sensory system of the robot. The laser emitter and ultrasonic sensor are equipped on the front of the robot.

Fig. 5. Illustration of the four infrared receivers on the robot.

Fig. 6. (a) Illustration of the position and orientation of the robot. (b) In this scenario of transportation, only the sensor r_3 can receive the signal.

Fig. 7. Invalid combinations of the received signal.

Fig. 8. Valid combinations of the received signal.

Fig. 9. Scenario of Event I: the bearing-alignment event mechanism.

Fig. 10. Posture of grasping.

Fig. 11. Example of path planning with multiple encounters of obstacles and multiple event triggers. The arrows denote the moving directions of the robot. The solid round discs denote the obstacles. Event II (obstacle avoidance) is triggered at positions B and D; Event I (the bearing-alignment event mechanism) is triggered at positions C and E, making the robot head exactly toward the beacon (denoted by the dotted lines).

SECTION III.

Hardware Communication Configuration

As one main emphasis of this paper, this section presents the design of the HCC, which consists of four main parts and is low cost (in the sense of very low energy consumption).

In Section V, an effective heuristic transportation algorithm will be provided that is tightly incorporated with the HCC in an event-triggered manner. The transportation algorithm becomes simple, with low computational cost, precisely because of the HCC designed here.

Another characteristic is that only a rough direction of the destination is used to guide the robot; thus, by the nature of the design, the path-planning heuristics are robust to inaccurate detection of the destination's direction. Other aspects of robustness, concerning multiple encounters of obstacles and multiple event triggers, are considered in Section V.

A. Design of Destination Beacon

As shown in Fig. 3, the beacon is designed to guide the robot to get to the destination area, which consists of:

  1. an XBee module on the top of the beacon to communicate with the XBee module mounted on the mobile robot;

  2. six infrared emitters that transmit infrared signals in all directions, with a carrier frequency of 38 kHz and a duty cycle of 75%;

  3. six laser receivers for receiving the laser signal emitted from the transmitters equipped on the robot;

  4. an MCU (ATmega328p) to control the aforementioned modules.

We also print some black lines around the beacon representing the boundary of the destination area.

B. Design of the Sensor System of the Robot

As depicted in Fig. 4, the robot is equipped with:

  1. an ultrasonic sensor on the front of the robot that is used for object detection and obstacle avoidance;

  2. a sensor system for transportation guidance that consists of:

    1. a laser emitter (its function is described in Section III-D and shown in Fig. 9);

    2. four infrared receivers for getting the infrared information from the beacon;

    3. an XBee module for communication with the XBee module mounted on the beacon;

  3. four Parallax QTI sensors to detect the boundary of the destination area (printed as the black lines in Fig. 3);

  4. a force sensing resistor to measure the grasping force.

C. Heuristics for Path Planning

Consider the robot moving toward the destination under the guidance of the beacon. There are four infrared receivers on the robot (the configuration is shown in Fig. 5; the most sensitive field of view of each receiver is 90°) for receiving the infrared waves from the infrared emitters mounted on the destination beacon.

The kinematics of the wheeled robot can be described as
\begin{eqnarray}
\dot x(t) &=& v_0 \cos \theta (t)\\
\dot y(t) &=& v_0 \sin \theta (t)
\end{eqnarray}
where (x,y) and \theta are the position and orientation of the robot, respectively (see Fig. 6), and v_0 is the velocity. For numerical control, this is discretized as
\begin{eqnarray}
p(k + 1) &=& p(k) + v_0 \Delta t\left(\begin{array}{c} \cos \theta (k) \\ \sin \theta (k) \end{array} \right) \nonumber\\
\theta (k + 1) &=& \theta (k) + u_\sigma (k)
\end{eqnarray}
where p(k) = [x(k),y(k)]^T is the position at step k, \Delta t = 1\,{\rm s}, and u_\sigma (k) is the control input at step k, in which \sigma represents the signal states of the four infrared receivers on the robot (as illustrated in Fig. 5), defined as
\begin{equation}
\sigma (k) = (r_1 (k),r_2 (k),r_3 (k),r_4 (k))
\end{equation}
where r_i (k) \in \{ 0,1\} is the status of infrared receiver i at step k: if receiver i receives the signal, then r_i (k) = 1; otherwise r_i (k) = 0. The motion action of the control input u_\sigma (k), for each value of \sigma (k), is given in Table I. For example, Fig. 6 shows an instance in which only receiver r_3 receives the signal from the beacon. In this case, \sigma (k) = (0,0,1,0) and, by consulting Table I (Line 3), the robot turns right in place with rotation angle \pi/4. The numerical control of the transportation path, integrated with the HCC, can be viewed as a simple yet efficient version of sensor fusion: u_\sigma (k) is inferred from the multiple cues \sigma (k) = (r_1 (k),r_2 (k),r_3 (k),r_4 (k)).
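
To make the above concrete, a minimal C sketch of the discretized update and the table lookup is given below. It is only an illustration: the packing of \sigma(k) into a table index, the sign convention for turning right, and all table entries other than the single example discussed above are our assumptions, not values taken from Table I.

/* Minimal sketch of the discretized update and the Table I lookup.
 * Index packing and table contents (beyond the one example from the
 * text) are assumptions, not values from the paper.                 */
#include <math.h>
#include <stdint.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define DT 1.0    /* Delta t = 1 s          */
#define V0 0.15   /* forward velocity (m/s) */

typedef struct { double x, y, theta; } pose_t;

/* Pack sigma(k) = (r1, r2, r3, r4) into an index 0..15. */
static uint8_t sigma_index(int r1, int r2, int r3, int r4) {
    return (uint8_t)((r1 << 3) | (r2 << 2) | (r3 << 1) | r4);
}

/* Reference table of motion actions u_sigma (Table I). Only the entry from
 * the example in the text is filled in: sigma = (0,0,1,0), index 2, turn
 * right in place by pi/4 (sign convention assumed). The remaining entries
 * must be taken from Table I.                                             */
static const double u_table[16] = { [2] = -M_PI / 4.0 };

/* One step of the discretized kinematics:
 * p(k+1) = p(k) + v0*dt*[cos theta; sin theta], theta(k+1) = theta(k) + u. */
static void step(pose_t *p, double u) {
    p->x     += V0 * DT * cos(p->theta);
    p->y     += V0 * DT * sin(p->theta);
    p->theta += u;
}

int main(void) {
    pose_t robot = { 0.0, 0.0, 0.0 };
    double u = u_table[sigma_index(0, 0, 1, 0)];   /* only r3 sees the beacon */
    step(&robot, u);
    printf("x=%.2f y=%.2f theta=%.2f rad\n", robot.x, robot.y, robot.theta);
    return 0;
}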

TABLE I Reference Table of Motion Actions

Fig. 7 shows the invalid combinations of \sigma (k), and Fig. 8 shows the valid combinations.

D. Description of Bearing-Alignment Event Mechanism (Event I)

Event I is the bearing-alignment event mechanism, an important component of the HCC; it is specifically designed for, and plays an important role in, Algorithm 1 in Section V. It proceeds as follows, as illustrated in Fig. 9 and sketched in code after this list.

  1. During the turning process (refer to Section III-C and Line 3 of Algorithm 1 in Section V), the laser beam emitted from the laser emitter on the robot may have the opportunity to point directly at the beacon.

  2. When the laser receiver on the beacon receives the laser signal, an interrupt is triggered and reported to the MCU of the beacon.

  3. Then, the MCU of the beacon sends the message through the XBee module of the beacon to the XBee module of the robot to report the state of current alignment direction.

  4. An Event I interrupt is triggered and reported to the MCU of the robot. The robot stops its rotation and calibrates its heading to point along the alignment direction.
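
A minimal host-side sketch of this event-triggered exchange is given below. All names, the message format, and the XBee wrappers are hypothetical stand-ins, since the paper does not specify the firmware interfaces; the XBee link is simulated by a simple flag so that the sketch runs on a PC.

/* Host-side simulation of the Event I (bearing-alignment) exchange.
 * All names and the message format are hypothetical assumptions.    */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define MSG_ALIGNMENT 0x01            /* "laser hit" report from the beacon */

/* Stand-in for the XBee link between beacon and robot. */
static bool    link_has_msg = false;
static uint8_t link_msg;

static void xbee_send(uint8_t msg)  { link_msg = msg; link_has_msg = true; }
static bool xbee_poll(uint8_t *msg) {
    if (!link_has_msg) return false;
    *msg = link_msg; link_has_msg = false; return true;
}

/* Beacon side: laser-receiver interrupt -> report alignment over XBee. */
static void beacon_laser_isr(void) { xbee_send(MSG_ALIGNMENT); }

/* Robot side: Event I handler, polled inside the transportation loop. */
static bool event1_triggered = false;
static void robot_check_event1(void) {
    uint8_t msg;
    if (xbee_poll(&msg) && msg == MSG_ALIGNMENT) {
        /* stop the in-place rotation and latch the current heading
         * as the calibrated direction toward the beacon             */
        event1_triggered = true;
        printf("Event I: heading aligned with the beacon\n");
    }
}

int main(void) {
    robot_check_event1();             /* nothing received yet                 */
    beacon_laser_isr();               /* robot's laser sweeps over the beacon */
    robot_check_event1();             /* alignment report arrives             */
    return event1_triggered ? 0 : 1;
}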

Remark 1:

The energy required by the HCC is very low. To further save energy, the laser emitter can be set to begin working only after the cylinder-shaped object has been grasped.

Remark 2:

During the motion to the destination, if the ultrasonic sensor detects an object, then Event II is triggered (see Section V-B); one example with multiple encounters of obstacles and multiple event triggers is shown in Fig. 11.

SECTION IV.

Grasp and Release

The software architecture is designed and provided in Appendix A; the task sequence is shown in Fig. A1. This section describes the task of grasp and release, which consists of three parts: 1) object detection, 2) control of the arm and gripper, and 3) detection of the destination area and release of the object.

Fig. A1. Hierarchical task architecture and the task sequence.

A. Distance Measurement and Object Detection

To locate an object in the starting area, the robot first turns in place and records the data measured by the ultrasonic sensor; if an object is detected (refer to Fig. A2 in Appendix A), the robot then moves closer to the object until the distance equals the specified grasping distance (d_1 in the following).
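
A rough, host-side sketch of this scan-and-approach behavior is given below. The motion and sensor wrappers are simulated stand-ins (not the robot's API), and the simulated object position is arbitrary; only the overall turn-scan-approach structure and the d_1 = 18 cm stopping distance follow the description above.

/* Host-side sketch of the scan-and-approach behavior for object detection.
 * The motion/sensor wrappers below are simulated stand-ins.               */
#include <stdbool.h>
#include <stdio.h>

#define D1_CM            18.0   /* d1: calibrated grasping distance (Section IV-B) */
#define DETECT_RANGE_CM 150.0   /* assumed maximum useful sonar range              */

/* Simulated world: object is 25 degrees to the right, 60 cm away. */
static double heading_deg = 0.0, range_cm = 60.0, object_bearing_deg = 25.0;

static double ultrasonic_read_cm(void) {
    double err = heading_deg - object_bearing_deg;
    return (err > -5 && err < 5) ? range_cm : 400.0;   /* 400 = nothing in view */
}
static void turn_in_place_deg(double d) { heading_deg += d; }
static void move_forward_cm(double d)   { range_cm -= d; }

int main(void) {
    /* 1) Turn in place, recording sonar data, until an object is detected. */
    bool found = false;
    for (int a = 0; a < 360 && !found; a += 5) {
        if (ultrasonic_read_cm() < DETECT_RANGE_CM) found = true;
        else turn_in_place_deg(5);
    }
    /* 2) Move closer until the distance equals d1 = 18 cm. */
    while (found && ultrasonic_read_cm() > D1_CM)
        move_forward_cm(1.0);
    printf("found=%d heading=%.0f deg range=%.0f cm\n", found, heading_deg, range_cm);
    return 0;
}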

Fig. A2. Data of the ultrasonic sensor for object detection.

B. Control of the Arm and Gripper for Grasping

The servo motors on the joints of the arm are controlled by pulse width modulation (PWM) input, and the angle of each motor has a linear response to the number of pulses. We only need to calculate the angle of every motor required for grasping and then control the posture of the arm and claw to grasp the object accurately. From the geometry of the grasping posture, we obtain the following equations:
\begin{eqnarray*}
&& d_1 = l_1 \sin \alpha + l_2 \sin \left({\alpha + \beta } \right) + l_3 \\
&& h_2 = h_1 + l_1 \cos \alpha + l_2 \cos \left({\alpha + \beta } \right) \\
&& \alpha + \beta + \gamma = \frac{3\pi}{2} \\
&& {\rm s.t.}\;\left\{ \begin{array}{l} 0 \le \alpha \le \frac{\pi}{2} \\ 0 \le \beta \le \pi \end{array} \right.
\end{eqnarray*}
where the parameters \alpha, \beta, \gamma, d_1, h_1, and h_2 are shown in Fig. 10.

Some frequently used data are precomputed and stored in the flash memory of the MCU of the robot; d_1 is set to 18 cm (an optimal value for the arm, found via experiments), as shown in Table II. The data have been tested and calibrated through experiments. When the value of h_2 is given, the required angles of the three motors mounted on the arm are easily obtained to control the posture of the arm; a sketch of this computation is given after Table II. The claw is then controlled to grip the object while the force between the object and the claw is measured; when the force exceeds a calibrated threshold, the robot lifts and holds up the object.

TABLE II Frequently-Used Data
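
As referred to above, the following sketch illustrates how the joint angles could be computed from these equations. The closed-form solution is a standard two-link inverse-kinematics derivation consistent with the equations and constraints above; it is our reconstruction, not code from the paper, and the example input h_2 - h_1 = 0 is purely illustrative (h_1 and h_2 are not reproduced here).

/* Reconstruction (our derivation, not from the paper) of the joint angles
 * alpha, beta, gamma from the geometric equations in Section IV-B:
 *   d1 = l1 sin(a) + l2 sin(a+b) + l3
 *   h2 = h1 + l1 cos(a) + l2 cos(a+b)
 *   a + b + g = 3*pi/2,  0 <= a <= pi/2,  0 <= b <= pi                    */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define L1 0.10   /* arm segment lengths (m), from Section II-B */
#define L2 0.12
#define L3 0.10
#define D1 0.18   /* grasping distance d1 = 18 cm                */

/* Solve for the joint angles; returns 0 on success, -1 if the target is
 * unreachable or the stated angle constraints are violated.              */
static int arm_angles(double dh /* = h2 - h1 */, double *a, double *b, double *g) {
    double x = D1 - L3;                 /* horizontal reach of the 2-link part */
    double y = dh;                      /* vertical offset of the wrist        */
    double c = (x*x + y*y - L1*L1 - L2*L2) / (2.0 * L1 * L2);
    if (c < -1.0 || c > 1.0) return -1; /* out of reach                        */
    *b = acos(c);                                         /* 0 <= beta <= pi   */
    *a = atan2(x, y) - atan2(L2 * sin(*b), L1 + L2 * cos(*b));
    *g = 1.5 * M_PI - *a - *b;                            /* gripper pitch     */
    return (*a >= 0.0 && *a <= M_PI / 2.0) ? 0 : -1;
}

int main(void) {
    double a, b, g;
    /* dh = h2 - h1 = 0 is only an example input, not a value from Table II. */
    if (arm_angles(0.0, &a, &b, &g) == 0)
        printf("alpha=%.1f beta=%.1f gamma=%.1f (deg)\n",
               a * 180.0 / M_PI, b * 180.0 / M_PI, g * 180.0 / M_PI);
    return 0;
}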

Algorithm 1: Algorithm for path planning

Set k = 0.

Loop

1. The robot reads the states of the four infrared receivers to determine the state of \sigma (k) at step k (refer to Section III-C).

2. The robot then refers to the reference table (see Table I) in its memory to determine the value of the control input u_\sigma (k).

3. The robot takes the corresponding action, i.e., turns in place by the value of u_\sigma (k).

4. If Event I (the bearing-alignment event; see the details of Event I in Section III-D) is triggered, then go to Line 9, out of the loop.

5. The robot records the distance d detected by the ultrasonic sensor at each step.

6. If Event II (the obstacle avoidance event described in the following) is triggered (i.e., when an obstacle is detected in front), then the robot moves with obstacle avoidance (see Section V-B).

7. If neither Event I nor Event II is triggered, the robot moves ahead with velocity v_0 for \Delta t (s).

8. k = k + 1.

End loop

9. The robot moves straight ahead along the alignment direction (refer to Event I) until Event III (the reaching-destination-area event in the following) is triggered. During this process, if Event II is triggered, then go to Line 6 and restart the loop (refer to the remarks that follow).

C. Release of Object in the Destination Area

When the robot reaches the destination area, it releases the object; in this case, the posture of the arm and the values of \alpha, \beta, and \gamma are controlled to be the same as those in the grasping posture.

SECTION V.

Transportation

With the HCC designed in Section III, the software transportation algorithm of the robot designed in this section has three distinct characteristics:

  1. without localization of the robot in the environment;

  2. transportation using only a rough direction of the destination, and thus robustness in this sense;

  3. effectiveness and very low computational cost of the software algorithm, with robustness in event handling and obstacle avoidance.

A. Algorithm for Transportation

The algorithm is shown in Algorithm 1; it is tightly incorporated with the HCC (see Section III) in an event-triggered manner. The very simple and effective nature of the algorithm is due to the designed mechanism of the HCC.
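
The outline below sketches this loop in C. The scripted sensor readings and helper functions are stand-ins for the HCC interfaces of Section III (assumptions, not the authors' firmware), and the re-entry into the loop after Event II in Line 9 (Remark 2) is omitted for brevity.

/* Runnable outline of Algorithm 1. The "hardware" below is a tiny scripted
 * simulation, not the robot's real interfaces.                            */
#include <stdbool.h>
#include <stdio.h>

#define OBSTACLE_THRESH_CM 20.0

/* --- scripted stand-ins for the HCC / sensors (assumptions) ------------- */
static int t = 0;                               /* global time step          */
static double lookup_u(void)          { return 0.3; }       /* Table I stub  */
static void   turn_in_place(double u) { printf("k=%d turn %.1f\n", t, u); }
static void   move_forward_dt(void)   { printf("k=%d move ahead\n", t); }
static double ultrasonic_cm(void)     { return (t == 2) ? 15.0 : 100.0; }
static bool   event1_triggered(void)  { return t >= 4; }    /* alignment     */
static bool   event3_triggered(void)  { return t >= 7; }    /* black line    */
static void   avoid_obstacle(void)    { printf("k=%d Event II: avoid\n", t); }

int main(void) {
    /* Lines 1-8: infrared-guided loop. */
    for (t = 0; ; ++t) {
        turn_in_place(lookup_u());              /* Lines 1-3              */
        if (event1_triggered()) break;          /* Line 4: leave the loop */
        if (ultrasonic_cm() < OBSTACLE_THRESH_CM)
            avoid_obstacle();                   /* Lines 5-6: Event II    */
        else
            move_forward_dt();                  /* Line 7                 */
    }
    /* Line 9: follow the calibrated direction until Event III
     * (the loop re-entry of Remark 2 is not modeled here).     */
    printf("Event I: aligned, going straight\n");
    for (; !event3_triggered(); ++t) {
        if (ultrasonic_cm() < OBSTACLE_THRESH_CM) avoid_obstacle();
        else                                      move_forward_dt();
    }
    printf("Event III: destination area reached\n");
    return 0;
}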

Remark 1:

As the robot moves along the alignment direction, the monitoring of Event I can be disabled by the MCU of the robot to further save energy and computational resources, until an additional Event II (if any) is triggered afterward.

Remark 2:

In Algorithm 1, in Line 4 (if Event I is triggered), execution leaves the loop; in Line 9, if Event II is triggered (whether the laser is on or off, refer to Remark 1; note that if the monitoring of Event I was disabled, the MCU of the robot should restart it after Event II is triggered), execution reenters the loop. This enables the robot to handle multiple encounters of obstacles, and the algorithm is robust in this sense.

Remark 3:

Algorithm 1 (together with the HCC designed in Section III) has inherent robustness in 1) using rough directions to the destination beacon (refer to Section III-C) and 2) handling multiple encounters of obstacles (one example is shown in Fig. 11).

B. Action of Event II for Obstacle Avoidance

For obstacle avoidance, the threshold on the distance d measured by the ultrasonic sensor is set to 20 cm (a value validated by experiments) to judge whether there is an obstacle in front of the robot. If an obstacle is detected, the action of the robot is quite simple: it first turns right in place by a certain angle and checks its right side for an outlet; if there is none, it checks the left side. The details of the action are not the focus and are thus omitted here.
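
The decision logic described above can be sketched as follows; the function and the side-probing simplification are our illustration of the omitted details, not the authors' implementation.

/* Sketch of the Event II decision logic: turn right first, probe for an
 * outlet, otherwise try the left side. Hypothetical, not the paper's code. */
#include <stdio.h>

#define OBSTACLE_THRESH_CM 20.0

typedef enum { GO_AHEAD, TURN_RIGHT, TURN_LEFT } action_t;

/* front/right/left are the ultrasonic distances (cm) measured ahead and
 * after probing to each side; the probing motion itself is not modeled.  */
static action_t event2_action(double front, double right, double left) {
    if (front >= OBSTACLE_THRESH_CM) return GO_AHEAD;    /* no Event II         */
    if (right >= OBSTACLE_THRESH_CM) return TURN_RIGHT;  /* outlet on the right */
    if (left  >= OBSTACLE_THRESH_CM) return TURN_LEFT;   /* otherwise the left  */
    return TURN_RIGHT;   /* assumption: keep probing to the right if boxed in   */
}

int main(void) {
    /* obstacle ahead at 15 cm, right side clear -> TURN_RIGHT (prints 1) */
    printf("%d\n", event2_action(15.0, 80.0, 60.0));
    return 0;
}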

One example of path planning is shown in Fig. 11, with multiple encounters of obstacles and multiple event triggers of Event I (refer to Section III-D) and Event II.

C. Description of Event III

Event III is the reaching-destination-area event. When the QTI sensors on the robot detect the black line (see Fig. 3), which means the robot has already arrived at the destination, the robot stops.

D. Integration of Hardware and Software Algorithm for Computational Efficiency

Finally, we emphasize the computational-efficiency merit of the integration [30] of software and hardware. It is precisely because of the HCC designed in Section III, and the tight integration of the heuristic software algorithm with the HCC, that the transportation algorithm becomes simple and computationally cheap: the integration passes some computational functions to the HCC (e.g., the heuristics for path planning in Section III-C, which rely on the sensor system and the destination beacon of the HCC, and the bearing-alignment event mechanism, which relies on the laser emitter/receiver and the destination beacon together with the communication mechanism). This reduces or even eliminates many computational costs on the MCU of the robot, such as conventional computations for positioning or localization, visual or wheeled odometry, visibility-graph construction, and multisensor fusion.

The simulations and experiment are provided in Appendix C.

SECTION VI.

Conclusion

In this paper, we have designed the hardware architecture and low-cost algorithms for object grasping and transportation; the transportation solution is, at least, suitable for environments with sparse obstacles. The physical experiments show the effectiveness of the system design and algorithms.

How can this work be extended to environments full of static or dynamic obstacles? In this case, more powerful microprocessors may be required. One possible way is to combine the architecture and algorithm of this paper with existing sophisticated algorithms (typically computationally intensive, with the necessary hardware) for such environments, and to give the algorithm of this paper the highest priority (when applicable) under certain conditions to save resources; this will be addressed in a future paper.

Future work may also include developing a more intelligent algorithm, a cooperative path-planning method for multiple mobile robots as in the scenarios of [16] and [35], and an implementation of a defender–intruder problem [28], with treatment of moving obstacles.

Appendix

SECTION A.

Software Architecture

The software architecture consists of three layers.

  1. The lowest layer is the hardware abstraction layer. In this layer, the function interfaces of the various sensors and motors are provided, for example, robot motion, distance detection, arm movement, reading the states of the infrared receivers, and sending and receiving data through the XBee modules. This layer provides the application programming interfaces (APIs) that encapsulate the hardware implementation of these functionalities.

  2. The middle layer is the robotic action layer. Behavior functions are provided by integrating certain functions of the hardware abstraction layer, for example, obstacle avoidance, object detection, grasping and releasing, destination detection, and path planning.

  3. The highest layer is the task layer. The functions provided in the robotic action layer are organized to form an integrated whole to accomplish the given transportation task.

The program is executed through an event-driven mechanism. The whole hierarchical task architecture and the task sequence are shown in Fig. A1.
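
A compilable sketch of this three-layer organization is given below. All function names and stub behaviors are illustrative placeholders, not the actual APIs of the robot's firmware.

/* Sketch of the three-layer software architecture described above.
 * Function names and stub behaviors are illustrative placeholders.  */
#include <stdbool.h>
#include <stdio.h>

/* ---- Hardware abstraction layer: wraps sensors and motors ------------ */
static double hal_ultrasonic_cm(void)       { return 30.0;  /* stub reading */ }
static bool   hal_qti_on_black_line(void)   { return false; /* stub reading */ }
static void   hal_drive(double v, double w) { printf("drive v=%.2f w=%.2f\n", v, w); }
static void   hal_arm_set(double a, double b, double g)
                                            { printf("arm %.2f %.2f %.2f\n", a, b, g); }

/* ---- Robotic action layer: behaviors built on top of the HAL --------- */
static void action_avoid_obstacle(void)     { hal_drive(0.0, 0.5); }
static void action_grasp(void)              { hal_arm_set(0.13, 2.42, 2.17); }
static bool action_at_destination(void)     { return hal_qti_on_black_line(); }
static void action_path_planning_step(void) {
    if (hal_ultrasonic_cm() < 20.0) action_avoid_obstacle();
    else                            hal_drive(0.15, 0.0);
}

/* ---- Task layer: sequences the behaviors to accomplish the task ------ */
static void task_transport(void) {
    action_grasp();
    for (int k = 0; k < 3 && !action_at_destination(); ++k)
        action_path_planning_step();        /* a few steps only, for the demo */
}

int main(void) { task_transport(); return 0; }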

SECTION B.

Geometric Parameters of Robot

TABLE III Geometric Parameters of Robot

SECTION C.

Simulations and Experiment

Algorithm 1 is simulated in the following. In every simulation, the robot is initially located at the origin of the coordinates with orientation \theta (0) = 0; the destination area is a circle centered at the coordinates (90, 110) with radius 20 cm, and v_0 = 0.15\,{\rm m/s}.

Fig. A3 shows the simulation of Algorithm 1 without obstacles: in (a), the trigger of Event I is disabled; for comparison, in (b), the trigger of Event I is enabled and is triggered when the robot is at the coordinates (10.61, 10.61).

Fig. A3. Simulation result of the transportation task of the robot without obstacles. The center of the destination area is the location of the beacon. (a) Without Event I triggered. (b) With Event I triggered.
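
The post-alignment segment of the trajectory in Fig. A3(b) can be reproduced with the short program below. It is our own reconstruction: the pre-alignment table-driven guidance and any Event II maneuver are omitted, and only the stated coordinates, speed, and destination radius are used.

/* Reconstruction of the post-alignment phase of Fig. A3(b): after Event I
 * at (10.61, 10.61), the robot heads straight toward the beacon at (90, 110)
 * until it is inside the 20-cm destination circle. Units are cm;
 * v0 = 0.15 m/s corresponds to 15 cm per 1-s step.                          */
#include <math.h>
#include <stdio.h>

int main(void) {
    double x = 10.61, y = 10.61;            /* pose at which Event I triggers */
    const double bx = 90.0, by = 110.0;     /* beacon / destination center    */
    const double step = 15.0, radius = 20.0;

    double theta = atan2(by - y, bx - x);   /* calibrated alignment direction */
    int k = 0;
    while (hypot(bx - x, by - y) > radius) {
        x += step * cos(theta);             /* p(k+1) = p(k) + v0*dt*[cos;sin] */
        y += step * sin(theta);
        ++k;
    }
    printf("reached destination area after %d steps at (%.1f, %.1f)\n", k, x, y);
    return 0;
}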

In Fig. A4, one obstacle is located at the coordinates (40, 50) with radius 15 cm. Event II is triggered when the robot is at the coordinates (21.21, 36.21); the turning angle is set as 10°. Event I is triggered when the robot is at the coordinates (27.55, 64.81).

Fig. A4. Simulation result of the transportation task of the robot with an obstacle. The center of the destination area is the location of the beacon. (a) Without Event I triggered. (b) With Event I triggered.

The whole experiment is implemented in a 3 m × 4 m rectangular area. The diameter and the height of the obstacle are 8 and 20 cm, respectively. The radius of the black line around the beacon is 50 cm. As illustrated in Fig. A5, the robot moves to the cylinder-shaped object (a), grasps the object (b), lifts up the object (c), then moves toward the beacon with obstacle avoidance (d)–(h), Event I is triggered (i), the robot stops when it detects the black line (k), and finally the robot puts the object down (l). The experiment shows the validity of the system design and the effectiveness of the algorithm.

Fig. A5. Experiment of the grasping and transportation task.
