Lesson 15: Diego1 Plus Introduction

From this section onward, because of the image processing involved, the Raspberry Pi's processing power is no longer sufficient: after repeated testing, running even relatively simple video processing functions pushes the Raspberry Pi 3B's CPU usage above 80%, which cannot meet the requirements of the subsequent functions. The Diego1 hardware is therefore upgraded; I call it Diego1 plus here, and the corresponding code repository on GitHub is diego1-plus.

1.Hardware adjustment

The following table lists the Diego1 plus hardware adjustments:

Mini PC (count: 1; replaces the Raspberry Pi 3 of Diego 1#)
Features:
  • Intel i5-5200U
  • 4 GB DDR3 1600
  • 802.11 b/g/n WLAN
  • Bluetooth 4.1
  • 1 x 10/100 Ethernet port
  • 1 x HDMI video/audio connector
  • 4 x USB 3.0 ports
  • 2 x USB 2.0 ports
  • Dimensions: 181 x 128 x 55 mm
Role in Diego1 plus:
  • Main control board
  • Ubuntu 16.04
  • ROS Kinetic
  • Other ROS packages

Buck module (count: 0; the buck module of Diego 1# is removed)
Features:
  • High-power 5 A output with low ripple
  • Onboard USB output
  • Two adjustable modes: constant voltage and constant current
  • Overcurrent protection
  • Onboard voltage, current, and power display
Role in Diego 1#:
  • Provided 5 V USB power to the Raspberry Pi

12V 18650 battery (count: 2; one unit added to the battery of Diego 1#)
Features:
  • Nominal voltage: 12 V
  • Discharge current: 6.0 A
  • Charge current: 0.5 A
  • Internal resistance: ≤60 mΩ
  • Maximum charge voltage: 12.6 V
  • Minimum discharge voltage: 9.0 V
Role in Diego1 plus:
  • One pack powers the servos of the 6-DOF arm
  • One pack powers the mini PC

Xtion Pro (count: 1; newly added)
Features:
  • Maximum power consumption: below 2.5 W
  • Working distance: 0.8 m ~ 3.5 m
  • Field of view: 58° H, 45° V, 70° D (horizontal, vertical, diagonal)
  • Sensor: depth
  • Depth image size: VGA (640*480): 30 fps; QVGA (320*240): 60 fps
  • Interface: USB 2.0
Role in Diego1 plus:
  • Serves as the robot's vision sensor, providing depth image data and point cloud data

Adjustment instructions:

  • Mini PC: uses the Intel low-power i5-5200U processor with 4 GB of memory, installed directly on the Diego1 plus base.
  • Power: since the mini PC needs a 12 V supply, a 12 V 6 A 18650 battery pack is added to power it. The original 9 V pack is dedicated to the chassis motors, and together with the pack that powers the arm there are three 18650 battery packs in total.
  • Buck module: the 5 V 3 A buck module that originally powered the Raspberry Pi is no longer needed in Diego1 plus.
  • Xtion Pro: Diego1 plus adds a new Xtion Pro depth camera. Its visual range is 0.8 m ~ 3.5 m, so it must be installed such that Diego1 plus operates within that visible range.

2.Software adjustment

Software adjustment has two main aspects:

  • Operating System: the Ubuntu Mate 16.04 used on the original Raspberry Pi 3 is changed to the x64 build of Ubuntu 16.04
  • ROS: the ros-kinetic-desktop installation used on the Raspberry Pi 3 in Diego1 is changed to ros-kinetic-desktop-full, the full version of ROS

Even with the full version of ROS installed, we still need to add the following packages to ensure the normal operation of Diego1 plus:

  • gmapping
  • amcl
  • move_base
  • hector slam
  • openni
  • tf2
  • sphinx-related packages

Please execute the following commands to complete the installation:

sudo apt-get install ros-kinetic-gmapping
sudo apt-get install ros-kinetic-navigation
sudo apt-get install ros-kinetic-hector-slam
sudo apt-get install ros-kinetic-tf2-sensor-msgs
sudo apt-get install ros-kinetic-openni-camera
sudo apt-get install ros-kinetic-openni-launch
sudo apt-get install libopenni-sensor-primesense0
sudo apt-get install ros-kinetic-audio-common
sudo apt-get install libasound2
sudo apt-get install gstreamer0.10-*
sudo apt-get install python-gst0.10

The sphinx deb packages also need to be replaced with their amd64 versions; for the installation procedure please refer to Lesson 12, and select the amd64 version of each package when downloading.

Lesson 14: Keyboard control

In this section we show how to use the computer keyboard to control the robot chassis and the movement of the robot arm. ROS provides the teleop_twist_keyboard package to control the movement of the chassis, but it lacks arm control, so we modify it on this basis to add the part that controls the arm.

Please see http://wiki.ros.org/teleop_twist_keyboard for more detailed information on teleop_twist_keyboard.

1.Keyboard control diagram:

2.Control principle

2.1Chassis motion control

As described in the previous section, the robot chassis is controlled by receiving Twist messages, so we only need to convert the corresponding keyboard commands into Twist messages to achieve motion control of the chassis.
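
As a minimal sketch of this idea (the full teleop script in section 3 does the same thing per key binding), a single "forward" key press becomes a Twist message like this:

#!/usr/bin/env python
# Minimal sketch: publish one Twist corresponding to a "forward" key press.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('twist_example')
pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
rospy.sleep(1)  # give the subscriber time to connect

twist = Twist()
twist.linear.x = 0.25  # forward at 0.25 m/s
twist.angular.z = 0.0  # no rotation
pub.publish(twist)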

2.2Motion control of the manipulator

arduino_node.py provides a servo control service:

# A service to position a PWM servo
rospy.Service('~servo_write', ServoWrite, self.ServoWriteHandler)

Controlling the arm is in fact controlling the servo at each of its joints, so we only need to send servo commands by calling this arduino_node.py service to control the arm.
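
For a quick manual test, the service can also be called from the command line. This assumes the ServoWrite request has the fields id and value (value in radians, as the script below passes), which you can confirm with rossrv show ros_arduino_msgs/ServoWrite:

rosservice call /arduino/servo_write 0 1.22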

3. Code

The following is the code; the file name is diego_teleop_twist_keyboard.py.

  • Chassis motion control: retains the original teleop_twist_keyboard control logic
  • Arm control: increases or decreases the control angle by 5° on each key press
#!/usr/bin/env python
import roslib
import rospy

from geometry_msgs.msg import Twist
from ros_arduino_msgs.srv import *
from math import pi as PI, degrees, radians
import sys, select, termios, tty

msg = """
Reading from the keyboard  and Publishing to Twist, Servo or sensor!
---------------------------
Moving around:
   u    i    o
   j    k    l

, : up (+z)
. : down (-z)

----------------------------
Left arm servo control:
+   1   2   3   4   5   6
-   q   w   e   r   t   y  
----------------------------
Right arm servo control:
+   a   s   d   f   g   h
-   z   x   c   v   b   n 

p : init the servo

CTRL-C to quit
"""

moveBindings = {
 'i':(1,0,0,0),
 'o':(1,0,0,1),
 'j':(-1,0,0,-1),
 'l':(-1,0,0,1),
 'u':(1,0,0,-1),
 'k':(-1,0,0,0),
 ',':(0,0,1,0),
 '.':(0,0,-1,0),
 }

armServos={
 '1':(0,1),
 '2':(1,1),
 '3':(2,1),
 '4':(3,1),
 '5':(4,1),
 '6':(5,1),
 'a':(6,1),
 'q':(0,0),
 'w':(1,0),
 'e':(2,0),
 'r':(3,0),
 't':(4,0),
 'y':(5,0),
 'z':(6,0),
 }

armServoValues=[70,100,60,80,70,70,35]

def getradians(angle):
    return PI*angle/180


def getKey():
    tty.setraw(sys.stdin.fileno())
    select.select([sys.stdin], [], [], 0)
    key = sys.stdin.read(1)
    termios.tcsetattr(sys.stdin, termios.TCSADRAIN, settings)
    return key


def servoWrite(servoNum, value):
    rospy.wait_for_service('/arduino/servo_write')
    try:
        servo_write = rospy.ServiceProxy('/arduino/servo_write', ServoWrite)
        servo_write(servoNum, value)
        print servoNum
        print value
    except rospy.ServiceException, e:
        print "Service call failed: %s" % e


def vels(speed,turn):
    return "currently:\tspeed %s\tturn %s " % (speed,turn)

if __name__=="__main__":
    settings = termios.tcgetattr(sys.stdin)

    pub = rospy.Publisher('cmd_vel', Twist, queue_size = 1)
    rospy.init_node('teleop_twist_keyboard')

    speed = rospy.get_param("~speed", 0.25)
    turn = rospy.get_param("~turn", 1.0)
    x = 0
    y = 0
    z = 0
    th = 0
    status = 0  

    try:
        print msg
        print vels(speed,turn)
        servoWrite(0, radians(armServoValues[0]))
        servoWrite(1, radians(armServoValues[1]))
        servoWrite(2, radians(armServoValues[2]))
        servoWrite(3, radians(armServoValues[3]))
        servoWrite(4, radians(armServoValues[4]))
        servoWrite(5, radians(armServoValues[5]))
        servoWrite(6, radians(armServoValues[6]))
        while(1):
            key = getKey()
            print key
            if key in moveBindings.keys():
                x = moveBindings[key][0]
                y = moveBindings[key][1]
                z = moveBindings[key][2]
                th = moveBindings[key][3]
                twist = Twist()
                twist.linear.x = x*speed; twist.linear.y = y*speed; twist.linear.z = z*speed;
                twist.angular.x = 0; twist.angular.y = 0; twist.angular.z = th*turn
                pub.publish(twist)
            elif key in armServos.keys():  
                if(armServos[key][1]==0):
                    armServoValues[armServos[key][0]]=armServoValues[armServos[key][0]]-5
                    if armServoValues[armServos[key][0]]<=0:
                       armServoValues[armServos[key][0]]=0
                else:
                    armServoValues[armServos[key][0]]=armServoValues[armServos[key][0]]+5
                    if armServoValues[armServos[key][0]]>=180:
                       armServoValues[armServos[key][0]]=180
                print armServoValues[armServos[key][0]]
                servoWrite(armServos[key][0], radians(armServoValues[armServos[key][0]]))
            else:
                x = 0
                y = 0
                z = 0
                th = 0
                if (key == '\x03'):
                    break



    except BaseException,e:
        print e

    finally:
        twist = Twist()
        twist.linear.x = 0; twist.linear.y = 0; twist.linear.z = 0
        twist.angular.x = 0; twist.angular.y = 0; twist.angular.z = 0
        pub.publish(twist)

        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, settings)

4.Start keyboard control

rosrun teleop_twist_keyboard diego_teleop_twist_keyboard.py

Lesson 13: Vision system-Xtion+Openni

Machine vision currently relies mostly on depth cameras, binocular cameras, and similar solutions, of which depth cameras are the most mature. Popular depth cameras currently include:

  • Microsoft Kinect
  • Intel RealSense
  • ASUS Xtion pro live

In Diego 1# we used the ASUS Xtion as the vision hardware solution, mainly because the Xtion uses a USB 2.0 interface while the other two products use USB 3.0, and the Raspberry Pi only supports USB 2.0. In addition, the Xtion is relatively small and requires no extra power supply.

For software we use OpenNI, an open source 3D machine vision library that provides a convenient and powerful API and, importantly, supports the Xtion Pro; on Linux the Xtion depth camera can be driven through OpenNI.

1.Install OpenNI

1.1Install OpenNI

sudo apt-get install ros-kinetic-openni-camera
sudo apt-get install ros-kinetic-openni-launch

1.2.Install Xtion

sudo apt-get install libopenni-sensor-primesense0

1.3.Start the openni node

roslaunch openni_launch openni.launch

After a successful start, the terminal displays the node startup information.
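
As a quick check that the camera is publishing (topic names here assume the default camera namespace of openni.launch), you can list the topics and view the RGB stream:

rostopic list | grep camera
rosrun image_view image_view image:=/camera/rgb/image_color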

2.Converts Openni’s point cloud data to laser data

OpenNI provides 3D point cloud data, and we can also convert the 3D point cloud into 2D laser scan data, which can substitute for a lidar. We will use the pointcloud_to_laserscan package (https://github.com/ros-perception/pointcloud_to_laserscan) to transform the data.

2.1.Install pointcloud_to_laserscan

clone the package to ~/catkin_ws/src

cd ~/catkin_ws/src
git clone https://github.com/ros-perception/pointcloud_to_laserscan

Execute the following command to install ros-kinetic-tf2-sensor-msgs:

sudo apt-get install ros-kinetic-tf2-sensor-msgs

compile

cd ~/catkin_ws/

catkin_make

modify sample_node.launch

<?xml version="1.0"?>

<launch>

    <arg name="camera" default="camera" />

    <!-- start sensor-->
    <include file="$(find openni_launch)/launch/openni.launch">
        <arg name="camera" default="$(arg camera)"/>
    </include>

    <!-- run pointcloud_to_laserscan node -->
    <node pkg="pointcloud_to_laserscan" type="pointcloud_to_laserscan_node" name="pointcloud_to_laserscan">

        <remap from="cloud_in" to="$(arg camera)/depth_registered/points_processed"/>
        <remap from="scan" to="$(arg camera)/scan"/>
        <rosparam>
            target_frame: camera_link # Leave disabled to output scan in pointcloud frame
            transform_tolerance: 0.01
            min_height: 0.0
            max_height: 1.0

            angle_min: -1.5708 # -M_PI/2
            angle_max: 1.5708 # M_PI/2
            angle_increment: 0.087 # M_PI/360.0
            scan_time: 0.3333
            range_min: 0.45
            range_max: 4.0
            use_inf: true

            # Concurrency level, affects number of pointclouds queued for processing and number of threads used
            # 0 : Detect number of cores
            # 1 : Single threaded
            # 2->inf : Parallelism level
            concurrency_level: 1
        </rosparam>

    </node>

</launch>

2.2.Start the sample node

roslaunch pointcloud_to_laserscan sample_node.launch

After a successful start, the following information appears:

Execute the following command to view the laser data

rostopic echo /camera/scan


You can also use rviz to view laser data

rosrun rviz rviz


The depth image:

The above covers the depth camera, its installation, and a simple conversion to laser scan data; subsequent sections will gradually build other robot applications on top of the vision system.

Lesson 12: Voice system-sphinx

The robot voice system consists of two parts: speech recognition and speech synthesis. For speech recognition ROS uses the sphinx package, and for speech synthesis the sound_play package.

1.Speech Recognition

ROS uses sphinx for speech recognition, but sphinx is not included in the ROS Kinetic release, so it must be installed manually.

The installation process is as follows:

1.1 Installation

1.1.1 First install the following dependency packages

sudo apt-get install ros-kinetic-audio-common
sudo apt-get install libasound2
sudo apt-get install gstreamer0.10-*
sudo apt-get install python-gst0.10

1.1.2.Install libsphinxbase1

Download libsphinxbase1 from https://packages.debian.org/jessie/libsphinxbase1. Since Diego uses the Raspberry Pi platform, download the armhf version. After downloading, execute:

sudo dpkg -i libsphinxbase1_0.8-6_armhf.deb

1.1.3.Install libpocketsphinx1

Download it from https://packages.debian.org/jessie/libpocketsphinx1. Also download the armhf version; after downloading, execute:

sudo dpkg -i libpocketsphinx1_0.8-5_armhf.deb

1.1.4.Install gstreamer0.10-pocketsphinx

Download it from https://packages.debian.org/jessie/gstreamer0.10-pocketsphinx. Also download the armhf version; after downloading, execute:

sudo dpkg -i gstreamer0.10-pocketsphinx_0.8-5_armhf.deb

1.1.5.Install pocketsphinx

Go to the workspace directory and clone the git repository:

cd ~/catkin_ws/src
git clone https://github.com/mikeferguson/pocketsphinx

1.1.6.Download the English voice pack pocketsphinx-hmm-en-tidigits (0.8-5)

https://packages.debian.org/jessie/pocketsphinx-hmm-en-tidigits

Create a model directory under the pocketsphinx package to store the voice model files:

cd ~/catkin_ws/src/pocketsphinx
mkdir model

After downloading the voice pack, decompress it and copy all the files under its model directory to ~/catkin_ws/src/pocketsphinx/model.

1.2.Create the launch script

Create a new launch folder in the ~/catkin_ws/src/pocketsphinx directory and create the diego_voice_test.launch file in it:

cd ~/catkin_ws/src/pocketsphinx
mkdir launch
cd launch
vi diego_voice_test.launch

diego_voice_test.launch

<launch>

  <node name="recognizer" pkg="pocketsphinx" type="recognizer.py" output="screen">
    <param name="lm" value="$(find pocketsphinx)/model/lm/en/tidigits.DMP"/>
    <param name="dict" value="$(find pocketsphinx)/model/lm/en/tidigits.dic"/>
    <param name="hmm" value="$(find pocketsphinx)/model/hmm/en/tidigits"/>
  </node>

</launch>

1.3.Modify recognizer.py

In the def __init__(self) function, add reading of the hmm parameter:

def __init__(self):
        # Start node
        rospy.init_node("recognizer")

        self._device_name_param = "~mic_name"  # Find the name of your microphone by typing pacmd list-sources in the terminal
        self._lm_param = "~lm"
        self._dic_param = "~dict"
        self._hmm_param = "~hmm" 

        # Configure mics with gstreamer launch config
        if rospy.has_param(self._device_name_param):
            self.device_name = rospy.get_param(self._device_name_param)
            self.device_index = self.pulse_index_from_name(self.device_name)
            self.launch_config = "pulsesrc device=" + str(self.device_index)
            rospy.loginfo("Using: pulsesrc device=%s name=%s", self.device_index, self.device_name)
        elif rospy.has_param('~source'):
            # common sources: 'alsasrc'
            self.launch_config = rospy.get_param('~source')
        else:
            self.launch_config = 'gconfaudiosrc'

        rospy.loginfo("Launch config: %s", self.launch_config)

        self.launch_config += " ! audioconvert ! audioresample " \
                            + '! vader name=vad auto-threshold=true ' \
                            + '! pocketsphinx name=asr ! fakesink'

        # Configure ROS settings
        self.started = False
        rospy.on_shutdown(self.shutdown)
        self.pub = rospy.Publisher('~output', String)
        rospy.Service("~start", Empty, self.start)
        rospy.Service("~stop", Empty, self.stop)

        if rospy.has_param(self._lm_param) and rospy.has_param(self._dic_param):
            self.start_recognizer()
        else:
            rospy.logwarn("lm and dic parameters need to be set to start recognizer.")

In the def start_recognizer(self) function, add the code for the hmm parameter, as follows:

    def start_recognizer(self):
        rospy.loginfo("Starting recognizer... ")

        self.pipeline = gst.parse_launch(self.launch_config)
        self.asr = self.pipeline.get_by_name('asr')
        self.asr.connect('partial_result', self.asr_partial_result)
        self.asr.connect('result', self.asr_result)
        self.asr.set_property('configured', True)
        self.asr.set_property('dsratio', 1)

        # Configure language model
        if rospy.has_param(self._lm_param):
            lm = rospy.get_param(self._lm_param)
        else:
            rospy.logerr('Recognizer not started. Please specify a language model file.')
            return

        if rospy.has_param(self._dic_param):
            dic = rospy.get_param(self._dic_param)
        else:
            rospy.logerr('Recognizer not started. Please specify a dictionary.')
            return
   
        if rospy.has_param(self._hmm_param):
            hmm = rospy.get_param(self._hmm_param)
        else:
            rospy.logerr('Recognizer not started. Please specify a hmm.')
            return

        self.asr.set_property('lm', lm)
        self.asr.set_property('dict', dic)
        self.asr.set_property('hmm', hmm)       

        self.bus = self.pipeline.get_bus()
        self.bus.add_signal_watch()
        self.bus_id = self.bus.connect('message::application', self.application_message)
        self.pipeline.set_state(gst.STATE_PLAYING)
        self.started = True

1.4.Start sphinx

roslaunch pocketsphinx diego_voice_test.launch

Now you can talk to your robot; note that only the words in the voice model dictionary can be recognized.
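
To verify the recognition results, you can echo the recognizer's output topic; the node publishes recognized text on ~output, which with the node name above resolves to /recognizer/output:

rostopic echo /recognizer/output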

Sphinx recognizes speech well in the specific acoustic environment it was configured for, but once the environment changes and the noise differs, the recognition rate drops significantly; this is a common problem facing today's speech recognition technology.

2. Speech synthesis

ROS already integrates a complete speech synthesis package, sound_play, which only supports English speech synthesis. Execute the following commands to test it:

rosrun sound_play soundplay_node.py
rosrun sound_play say.py "hi, i am diego."
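
sound_play can also be called from Python code; the following is a minimal sketch using its SoundClient API (soundplay_node.py must already be running):

#!/usr/bin/env python
# Minimal sketch: speak a sentence through the sound_play node.
import rospy
from sound_play.libsoundplay import SoundClient

rospy.init_node('say_hi')
soundhandle = SoundClient()
rospy.sleep(1)  # give the publisher time to connect to soundplay_node
soundhandle.say("hi, i am diego.")
rospy.sleep(2)  # keep the node alive while the audio plays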

With this section, the voice system is in place; subsequent chapters will use it to build other applications.

Lesson 11: Navigation-Location & planner

Given an existing map, we need the robot to locate itself on the map, which is done with the ROS amcl package; publishing a target location to move_base then performs path planning, bypassing obstacles to reach the destination.

1.Location

1.1 Write the amcl launch script

<launch>
<master auto="start"/>

<include file="$(find flashgo)/launch/lidar.launch" />

<node name="arduino" pkg="ros_arduino_python" type="arduino_node.py" output="screen">
<rosparam file="$(find ros_arduino_python)/config/my_arduino_params.yaml" command="load" />
</node>

<node pkg="tf" type="static_transform_publisher" name="base_frame_2_laser_link" args="0.0 0.0 0.2 3.14 3.14 0 /base_link /laser 40"/>

<!-- Map server -->
<node name="map_server" pkg="map_server" type="map_server" args="$(find diego_nav)/maps/f4_gmapping.yaml" />

<!-- amcl node -->
<node pkg="amcl" type="amcl" name="amcl" output="screen">
<remap from="scan" to="scan"/>
<!-- Publish scans from best pose at a max of 10 Hz -->
<param name="initial_pose_x" value="0.0"/>
<param name="initial_pose_y" value="0.0"/>
<param name="initial_pose_a" value="0.0"/>
<param name="use_map_topic" value="true"/>
<param name="odom_model_type" value="diff"/>
<param name="odom_alpha5" value="0.1"/>
<param name="transform_tolerance" value="0.5" />
<param name="gui_publish_rate" value="10.0"/>
<param name="laser_max_beams" value="300"/>
<param name="min_particles" value="500"/>
<param name="max_particles" value="5000"/>
<param name="kld_err" value="0.1"/>
<param name="kld_z" value="0.99"/>
<param name="odom_alpha1" value="0.1"/>
<param name="odom_alpha2" value="0.1"/>
<!-- translation std dev, m -->
<param name="odom_alpha3" value="0.1"/>
<param name="odom_alpha4" value="0.1"/>
<param name="laser_z_hit" value="0.9"/>
<param name="laser_z_short" value="0.05"/>
<param name="laser_z_max" value="0.05"/>
<param name="laser_z_rand" value="0.5"/>
<param name="laser_sigma_hit" value="0.2"/>
<param name="laser_lambda_short" value="0.1"/>
<param name="laser_lambda_short" value="0.1"/>
<param name="laser_model_type" value="likelihood_field"/>
<!-- <param name="laser_model_type" value="beam"/> -->
<param name="laser_min_range" value="1"/>
<param name="laser_max_range" value="8"/>
<param name="laser_likelihood_max_dist" value="2.0"/>
<param name="update_min_d" value="0.2"/>
<param name="update_min_a" value="0.5"/>
<param name="resample_interval" value="1"/>
<param name="transform_tolerance" value="0.1"/>
<param name="recovery_alpha_slow" value="0.0"/>
<param name="recovery_alpha_fast" value="0.0"/>
</node>
</launch>

It must be emphasized here that the configured initial position has to match where the robot actually is: if the robot is placed at an arbitrary position, amcl will not be able to localize. The map origin is the starting point used when the map was built with gmapping/hector, and the launch file must set the initial pose accordingly. Here we place the robot at the starting position, so the values are set to 0.0.

<param name="initial_pose_x" value="0.0"/>
<param name="initial_pose_y" value="0.0"/>

1.2 Start the amcl node

roslaunch diego_nav diego_run_gmapping_amcl_flashgo.launch

Open a new terminal and start the keyboard control node

rosrun teleop_twist_keyboard teleop_twist_keyboard.py

Now we can use the keyboard to move the robot around, and open rviz to view the localization:

rosrun rviz rviz
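
Besides rviz, you can also watch amcl's pose estimate directly; amcl publishes it on the /amcl_pose topic:

rostopic echo /amcl_pose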

In this picture we can see a large number of red arrows: the particle distribution generated by the Monte Carlo localization algorithm, where the direction of each arrow is the direction of robot motion.

This picture shows the particles more tightly clustered than in the previous one; actual observation shows that the tighter the clustering, the better the localization.

2.Route planning

Path planning means moving the robot from one location to another while avoiding obstacles along the way; it is also navigation in the most intuitive sense. In ROS, path planning is implemented by the move_base package.

2.1 Write the move_base launch script

<launch>

<!-- move_base node -->
<node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
<rosparam file="$(find diego_nav)/config/costmap_common_params.yaml" command="load" ns="global_costmap" />
<rosparam file="$(find diego_nav)/config/costmap_common_params.yaml" command="load" ns="local_costmap" />
<rosparam file="$(find diego_nav)/config/local_costmap_params.yaml" command="load" />
<rosparam file="$(find diego_nav)/config/global_costmap_params.yaml" command="load" />
<rosparam file="$(find diego_nav)/config/base_local_planner_params.yaml" command="load" />
</node>

</launch>

The contents of this launch file can also be combined with the amcl startup script into a single file.
The configuration parameters of the move_base node are distributed across four configuration files:
  • costmap_common_params.yaml
  • global_costmap_params.yaml
  • local_costmap_params.yaml
  • base_local_planner_params.yaml

2.2 costmap_common_params.yaml

Cost map general configuration file

obstacle_range: 2.5  #Maximum obstacle detection range
raytrace_range: 3.0  #Maximum range for detecting free space
footprint: [[0.14, 0.14], [0.14, -0.14], [-0.14, 0.14], [-0.14, -0.14]] #The robot is rectangular; this sets the area it occupies in the coordinate system
inflation_radius: 0.55 #Inflation radius, the safety margin kept around obstacles

observation_sources: laser_scan_sensor #Only use the lidar data

laser_scan_sensor: {sensor_frame: laser, data_type: LaserScan, topic: scan, marking: true, clearing: true} #Set the lidar-related parameters

 

2.3 global_costmap_params.yaml

Global cost map configuration file

global_costmap:
 global_frame: /map #The global cost map reference frame is /map
 robot_base_frame: base_link #base_frame is base_link
 update_frequency: 5.0 #Map update frequency
 static_map: true #Use a static map and initialize from it
 transform_tolerance: 0.8 #tf tolerance of 0.8; adjust this parameter to the actual hardware. On the Raspberry Pi, values below 0.8 cause continuous tf timeout warning messages

 

2.4 local_costmap_params.yaml

Local cost map configuration file

local_costmap:
 global_frame: odom #The local cost map reference frame is odom
 robot_base_frame: base_link #base_frame is base_link
 update_frequency: 5.0 #Map update frequency
 publish_frequency: 2.0 #Frequency at which cost map visualization information is published
 static_map: false #The local cost map is continuously updated, so this is set to false
 rolling_window: true #Use a rolling window so that the robot always stays at the center of the window
 width: 4.0 #Width of the cost map
 height: 6.0 #Height of the cost map
 resolution: 0.05 #Cost map resolution

 

2.5 base_local_planner_params.yaml

Local Planner configuration file

TrajectoryPlannerROS:
 max_vel_x: 0.45 #Maximum speed in the x-axis direction
 min_vel_x: 0.1 #Minimum speed in the x-axis direction
 max_vel_theta: 1.0 #Maximum angular velocity
 min_in_place_vel_theta: 0.4 #Minimum in-place rotation angular velocity

 acc_lim_theta: 3.2 #Maximum angular acceleration
 acc_lim_x: 2.5 #Maximum acceleration in the x-axis direction
 acc_lim_y: 2.5 #Maximum acceleration in the y-axis direction

2.6 Start the move_base node

roslaunch diego_nav diego_run_gmapping_amcl_flashgo.launch

Diego 1# now has localization and navigation capability; next we run a navigation test.
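
Before running the scripted test below, you can send a single goal from the command line (the coordinates here are example values; adjust them to your map):

rostopic pub /move_base_simple/goal geometry_msgs/PoseStamped '{header: {frame_id: "map"}, pose: {position: {x: 1.0, y: 0.0, z: 0.0}, orientation: {w: 1.0}}}'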

3.Navigation test

For the navigation test, we can place obstacles at fixed indoor positions, use gmapping or hector to draw a map containing the obstacles, and then start amcl and move_base as described above for localization and navigation.

3.1 Navigation test code

We can modify the navigation test code, changing the locations to positions on the actual map.

#!/usr/bin/env python
import roslib;
import rospy
import actionlib
from actionlib_msgs.msg import *
from geometry_msgs.msg import Pose, PoseWithCovarianceStamped, Point, Quaternion, Twist
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from random import sample
from math import pow, sqrt

class NavTest():
    def __init__(self):
        rospy.init_node('nav_test', anonymous=True)

        rospy.on_shutdown(self.shutdown)

        # How long in seconds should the robot pause at each location?
        self.rest_time = rospy.get_param("~rest_time", 10)

        # Are we running in the fake simulator?
        self.fake_test = rospy.get_param("~fake_test", False)

        # Goal state return values
        goal_states = ['PENDING', 'ACTIVE', 'PREEMPTED',
                       'SUCCEEDED', 'ABORTED', 'REJECTED',
                       'PREEMPTING', 'RECALLING', 'RECALLED',
                       'LOST']

        # Set up the goal locations. Poses are defined in the map frame.
        # An easy way to find the pose coordinates is to point-and-click
        # Nav Goals in RViz when running in the simulator.
        # Pose coordinates are then displayed in the terminal
        # that was used to launch RViz.
        locations = dict()

        locations['point_0'] = Pose(Point(0.0, 2.0, 0.000), Quaternion(0.000, 0.000, 0.223, 0.975))
        locations['point_1'] = Pose(Point(0.5, 2.0, 0.000), Quaternion(0.000, 0.000, -0.670, 0.743))

        # Publisher to manually control the robot (e.g. to stop it)
        self.cmd_vel_pub = rospy.Publisher('cmd_vel', Twist)

        # Subscribe to the move_base action server
        self.move_base = actionlib.SimpleActionClient("move_base", MoveBaseAction)

        rospy.loginfo("Waiting for move_base action server...")

        # Wait 60 seconds for the action server to become available
        self.move_base.wait_for_server(rospy.Duration(60))

        rospy.loginfo("Connected to move base server")

        # A variable to hold the initial pose of the robot to be set by
        # the user in RViz
        initial_pose = PoseWithCovarianceStamped()

        # Variables to keep track of success rate, running time,
        # and distance traveled
        n_locations = len(locations)
        n_goals = 0
        n_successes = 0
        i = n_locations
        distance_traveled = 0
        start_time = rospy.Time.now()
        running_time = 0
        location = ""
        last_location = ""

        # Get the initial pose from the user
        #rospy.loginfo("*** Click the 2D Pose Estimate button in RViz to set the robot's initial pose...")
        #rospy.wait_for_message('initialpose', PoseWithCovarianceStamped)
        #self.last_location = Pose()
        #rospy.Subscriber('initialpose', PoseWithCovarianceStamped, self.update_initial_pose)

        # Make sure we have the initial pose
        #while initial_pose.header.stamp == "":
        #    rospy.sleep(1)

        rospy.loginfo("Starting navigation test")

        # Begin the main loop and run through a sequence of locations
        while not rospy.is_shutdown():
            # If we've gone through the current sequence,
            # start with a new random sequence
            if i == n_locations:
                i = 0
                sequence = sample(locations, n_locations)
                # Skip over first location if it is the same as
                # the last location
                if sequence[0] == last_location:
                    i = 1

            # Get the next location in the current sequence
            location = sequence[i]

            # Keep track of the distance traveled.
            # Use updated initial pose if available.
            if initial_pose.header.stamp == "":
                distance = sqrt(pow(locations[location].position.x -
                                    locations[last_location].position.x, 2) +
                                pow(locations[location].position.y -
                                    locations[last_location].position.y, 2))
            else:
                rospy.loginfo("Updating current pose.")
                distance = sqrt(pow(locations[location].position.x -
                                    initial_pose.pose.pose.position.x, 2) +
                                pow(locations[location].position.y -
                                    initial_pose.pose.pose.position.y, 2))
                initial_pose.header.stamp = ""

            # Store the last location for distance calculations
            last_location = location

            # Increment the counters
            i += 1
            n_goals += 1

            # Set up the next goal location
            self.goal = MoveBaseGoal()
            self.goal.target_pose.pose = locations[location]
            self.goal.target_pose.header.frame_id = 'map'
            self.goal.target_pose.header.stamp = rospy.Time.now()

            # Let the user know where the robot is going next
            rospy.loginfo("Going to: " + str(location))

            # Start the robot toward the next location
            self.move_base.send_goal(self.goal)

            # Allow 5 minutes to get there
            finished_within_time = self.move_base.wait_for_result(rospy.Duration(300))

            # Check for success or failure
            if not finished_within_time:
                self.move_base.cancel_goal()
                rospy.loginfo("Timed out achieving goal")
            else:
                state = self.move_base.get_state()
                if state == GoalStatus.SUCCEEDED:
                    rospy.loginfo("Goal succeeded!")
                    n_successes += 1
                    distance_traveled += distance
                    rospy.loginfo("State:" + str(state))
                else:
                    rospy.loginfo("Goal failed with error code: " + str(goal_states[state]))

            # How long have we been running?
            running_time = rospy.Time.now() - start_time
            running_time = running_time.secs / 60.0

            # Print a summary success/failure, distance traveled and time elapsed
            rospy.loginfo("Success so far: " + str(n_successes) + "/" +
                          str(n_goals) + " = " +
                          str(100 * n_successes/n_goals) + "%")
            rospy.loginfo("Running time: " + str(trunc(running_time, 1)) +
                          " min Distance: " + str(trunc(distance_traveled, 1)) + " m")
            rospy.sleep(self.rest_time)

    def update_initial_pose(self, initial_pose):
        self.initial_pose = initial_pose

    def shutdown(self):
        rospy.loginfo("Stopping the robot...")
        self.move_base.cancel_goal()
        rospy.sleep(2)
        self.cmd_vel_pub.publish(Twist())
        rospy.sleep(1)

def trunc(f, n):
    # Truncates/pads a float f to n decimal places without rounding
    slen = len('%.*f' % (n, f))
    return float(str(f)[:slen])

if __name__ == '__main__':
    try:
        NavTest()
        rospy.spin()
    except rospy.ROSInterruptException:
        rospy.loginfo("AMCL navigation test finished.")

3.2Start navigation

rosrun diego_nav nav_test.py

After starting, the robot moves to the locations set in the navigation test file one by one and can avoid the obstacles on the map. In a small space, however, the robot may keep spinning without finding a way out; the relevant parameters need continuous tuning to achieve good results.

Lesson 10: Navigation-Create map based on the laser

Building a map is the basis of SLAM; both localization and path planning are performed against a given map. In this section we use the lidar with two packages, gmapping and hector slam, to build maps.

1.Lidar

Lidar offers very high measurement accuracy and is an excellent choice for distance measurement, but it also has the drawbacks of high power consumption and high cost. 2D lidars, which measure distances within a single plane, are currently in widespread use, while 3D lidars remain too expensive to be widely adopted. Because a 2D lidar can only measure one plane, its usage scenarios are limited; for navigation it is only suitable for relatively simple, regular environments.

In the Diego robot we use the Flash Go F4 lidar from EAI, which has the following features:

• 360-degree omnidirectional scanning, 10 Hz adaptive scanning frequency, 4,500 range samples per second
• Ranging distance of no less than 8 m, with a measurement resolution of 1% of the range
• Low noise and long life
• Class 1 laser safety standard

The F4 connects to the host over USB, which carries both power and data, so installation is very convenient: just plug it into the host's USB port.

1.1. Install the lidar ROS driver package

Execute the following command

cd ~/catkin_ws/src
git clone https://github.com/EAIBOT/flashgo.git
cd ..
catkin_make

After this completes, a flashgo directory is added under src; this is the F4 ROS package.

1.2. Modify the serial number in the configuration file

When the F4 is connected to the Raspberry Pi over USB, a serial port appears in the system. Since the Arduino UNO also communicates over USB serial, two serial devices appear in the system, and the configuration must clearly distinguish the serial numbers of the two devices.


As shown above, ttyACM0 and ttyACM1 are the corresponding serial port numbers; in diego1#, ttyACM1 corresponds to the Arduino and ttyACM0 to the F4 lidar. We need to modify the corresponding parameters. First modify the Arduino configuration file: open my_arduino_params.yaml in the config directory.
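
A simple way to check which serial devices are present is to list them (the device numbers may differ on your system):

ls -l /dev/ttyACM*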

Set the serial port that arduino_bridge connects to, to ttyACM1:
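
The original screenshot is missing here; the relevant line in my_arduino_params.yaml looks like the following sketch (parameter name as used by ros_arduino_bridge; verify it against your own file):

port: /dev/ttyACM1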

Then modify the F4 configuration file: open the lidar.launch file under the flashgo/launch directory.

Set the serial port that the F4 connects to, to ttyACM0:
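
The original screenshot is missing here as well; the relevant line in lidar.launch should look roughly like the following sketch (the parameter name is an assumption; check your flashgo lidar.launch for the exact name):

<param name="port" type="string" value="/dev/ttyACM0"/>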

2.Use hector slam to build a map

2.1Install hector slam

sudo apt-get install ros-kinetic-hector-slam

2.2Write the hector slam launch file

The following is the hector launch file. One point needs attention here: the EAI Flash lidar uses a left-handed coordinate system while ROS uses a right-handed one, so the file needs a static tf node (base_frame_2_laser) for the coordinate conversion.

<?xml version="1.0"?>

<launch>
  <arg name="tf_map_scanmatch_transform_frame_name" default="/scanmatcher_frame"/>
  <arg name="pub_map_odom_transform" value="true"/> 
  <arg name="map_frame" value="map"/> 
  <arg name="base_frame" value="base_link"/> 
  <arg name="odom_frame" value="base_link"/>
  <arg name="scan_subscriber_queue_size" default="5"/>
  <arg name="scan_topic" default="scan"/>
  <arg name="map_size" default="2048"/>

  <master auto="start"/>

  <include file="$(find flashgo)/launch/lidar.launch" />

  <node name="arduino" pkg="ros_arduino_python" type="arduino_node.py" output="screen">
      <rosparam file="$(find ros_arduino_python)/config/my_arduino_params.yaml" command="load" />
  </node> 

  <node pkg="tf" type="static_transform_publisher" name="base_frame_2_laser" args="0 0 0 0 0 0 /$(arg base_frame) /laser 100"/>  
  <node pkg="tf" type="static_transform_publisher" name="map_2_odom" args="0.0 0.0 0.0 0 0 0.0 /odom /$(arg base_frame) 10"/>


  <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping" output="screen">   
  <!-- Frame names -->
  <param name="map_frame" value="$(arg map_frame)" />
  <param name="base_frame" value="$(arg base_frame)" />
  <param name="odom_frame" value="$(arg base_frame)" />   
   <!-- Tf use -->
  <param name="use_tf_scan_transformation" value="true"/>
  <param name="use_tf_pose_start_estimate" value="false"/>
  <param name="pub_map_odom_transform" value="$(arg pub_map_odom_transform)"/> 
  <!-- Map size / start point -->
  <param name="map_resolution" value="0.050"/>
  <param name="map_size" value="$(arg map_size)"/>
  <param name="map_start_x" value="0.5"/>
  <param name="map_start_y" value="0.5" />
  <param name="map_multi_res_levels" value="2" />
  <!-- Map update parameters -->
  <param name="update_factor_free" value="0.4"/>
  <param name="update_factor_occupied" value="0.7" />   
  <param name="map_update_distance_thresh" value="0.2"/>
  <param name="map_update_angle_thresh" value="0.9" />
  <param name="laser_z_min_value" value = "-1.0" />
  <param name="laser_z_max_value" value = "1.0" />  
  <!-- Advertising config -->
  <param name="advertise_map_service" value="true"/>
  <param name="scan_subscriber_queue_size" value="$(arg scan_subscriber_queue_size)"/>
  <param name="scan_topic" value="$(arg scan_topic)"/>
  <param name="tf_map_scanmatch_transform_frame_name" value="$(arg tf_map_scanmatch_transform_frame_name)" />
  </node>

  <include file="$(find hector_geotiff)/launch/geotiff_mapper.launch"/> 

</launch>

2.3 Start hector slam

Now you can execute the following command to start the lidar and hector:

roslaunch diego_nav diego_run_hector_flashgo.launch

Now we can drive diego# around the room with the keyboard, and starting rviz will draw the corresponding map. Execute the following commands in different terminals:

rosrun teleop_twist_keyboard teleop_twist_keyboard.py
rosrun rviz rviz

3.Use gmapping to build a map

3.1Install the ros navigation package

sudo apt-get install ros-kinetic-navigation

gmapping, amcl, and move_base will all be installed by this command.

3.2Write the gmapping launch file

<launch>
<master auto="start"/>

<include file="$(find flashgo)/launch/lidar.launch" />

<node name="arduino" pkg="ros_arduino_python" type="arduino_node.py" output="screen">
<rosparam file="$(find ros_arduino_python)/config/my_arduino_params.yaml" command="load" />
</node>

<node pkg="tf" type="static_transform_publisher" name="base_frame_2_laser_link" args="0.0 0.0 0.2 3.14 3.14 0 /base_link /laser 40"/>

<!-- gmapping node -->
<node pkg="gmapping" type="slam_gmapping" name="slam_gmapping">
<param name="base_frame" value="base_link"/>
<param name="odom_frame" value="odom"/>
<param name="maxUrange" value="4.0"/>
<param name="maxRange" value="5.0"/>
<param name="sigma" value="0.05"/>
<param name="kernelSize" value="3"/>
<param name="lstep" value="0.05"/>
<param name="astep" value="0.05"/>
<param name="iterations" value="5"/>
<param name="lsigma" value="0.075"/>
<param name="ogain" value="3.0"/>
<param name="lskip" value="0"/>
<param name="minimumScore" value="30"/>
<param name="srr" value="0.01"/>
<param name="srt" value="0.02"/>
<param name="str" value="0.01"/>
<param name="stt" value="0.02"/>
<param name="linearUpdate" value="0.05"/>
<param name="angularUpdate" value="0.0436"/>
<param name="temporalUpdate" value="-1.0"/>
<param name="resampleThreshold" value="0.5"/>
<param name="particles" value="8"/>
<!--
<param name="xmin" value="-50.0"/>
<param name="ymin" value="-50.0"/>
<param name="xmax" value="50.0"/>
<param name="ymax" value="50.0"/>
make the starting size small for the benefit of the Android client's memory...
-->
<param name="xmin" value="-1.0"/>
<param name="ymin" value="-1.0"/>
<param name="xmax" value="1.0"/>
<param name="ymax" value="1.0"/>

<param name="delta" value="0.05"/>
<param name="llsamplerange" value="0.01"/>
<param name="llsamplestep" value="0.01"/>
<param name="lasamplerange" value="0.005"/>
<param name="lasamplestep" value="0.005"/>
</node>
</launch>

3.3 Start gmapping

Now you can execute the following command to start the lidar and gmapping:

roslaunch diego_nav diego_run_gmapping_flashgo.launch

Now we can drive diego# around the room with the keyboard, and starting rviz will draw the corresponding map. Execute the following commands in different terminals:

rosrun teleop_twist_keyboard teleop_twist_keyboard.py
rosrun rviz rviz

4.Generate map files

Whether you use hector or gmapping to build the map dynamically, you can generate a map file with a command.

4.1.Create the maps folder

Give the maps folder 777 permissions, allowing map_server to generate map files in this folder.
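
The original screenshots are missing here; the commands would look like this sketch, with the path taken from the next step:

mkdir -p ~/catkin_ws/src/diego_nav/maps
chmod 777 ~/catkin_ws/src/diego_nav/maps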

4.2.Generate map files

Open a terminal, cd into the maps directory, and execute the map generation command; map_saver writes the map image (f4_gmapping.pgm) and its metadata (f4_gmapping.yaml) into this folder.

cd ~/catkin_ws/src/diego_nav/maps/

rosrun map_server map_saver -f f4_gmapping


Lesson 9: Navigation-Diego 1# navigation framework

SLAM navigation is a complicated topic involving many mathematical models and algorithms. On the ROS platform, these models and algorithms have already been implemented and encapsulated as packages within the ROS architecture, so in Diego 1# we use the ROS navigation framework.

1. ROS navigation framework

The following figure shows the navigation framework as illustrated by ROS; you can find more detail at http://wiki.ros.org/navigation/Tutorials/RobotSetup

 

In the navigation stack, the packages in white and grey are already provided by ROS, while the packages in blue require customized development for the hardware platform.

Though both gmapping and hector can be used to build the map, they use different algorithms: gmapping relies on odom data while hector does not. You can choose either one.

2. Diego 1# related function package

Following figure shows function packages and corresponding hardware of Diego 1#.

Navigation function: move_base
Package: move_base
Hardware: Raspberry Pi

Navigation function: map_server
Packages: gmapping, hector
Hardware: Raspberry Pi; lidar; Xtion
Notes: choose either the lidar or the Xtion depth camera

Navigation function: amcl
Package: amcl
Hardware: Raspberry Pi; lidar; Xtion
Notes: choose either the lidar or the Xtion depth camera

Navigation function: base_controller
Package: ros_arduino_bridge
Hardware: Arduino UNO
Notes: the ros_arduino_bridge package contains the base_controller

Navigation function: odometry source
Package: ros_arduino_bridge
Hardware: Arduino UNO; Hall encoders
Notes: ros_arduino_firmware reads the Hall encoder data, and ros_arduino_bridge computes and publishes the odom message

Navigation function: sensor transforms
Hardware: Raspberry Pi
Notes: the tf transforms are basically static and can be implemented in the launch file

Navigation function: sensor source
Packages: rplidar A2 driver; OpenNI
Hardware: Raspberry Pi; lidar; Xtion
Notes: the rplidar lidar provides its own driver package that publishes laser data; if a depth camera is used, the OpenNI package can publish the laser data

Lesson 8: Move control-calibrate angular

1.Angular velocity calibration principle and code

Similar to linear velocity, angular velocity calibration is done by checking the consistency between the robot's actual rotation angle and the rotation angle required by the control command. Given the hardware precision, an error within an allowable range is acceptable.

Here we control the robot to rotate 360 degrees clockwise; the following is the angular velocity calibration code:

#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist, Quaternion
from nav_msgs.msg import Odometry
import tf
from math import radians, copysign
from transform_utils import quat_to_angle, normalize_angle
import PyKDL
from math import pi

class CalibrateAngular():
    def __init__(self):
        # Give the node a name
        rospy.init_node('calibrate_angular', anonymous=False)

        # Set rospy to execute a shutdown function when terminating the script
        rospy.on_shutdown(self.shutdown)

        # How fast will we check the odometry values?
        self.rate = 10
        r = rospy.Rate(self.rate)

        # The test angle is 360 degrees
        self.test_angle = 2*pi # ROS uses radians rather than degrees

        self.speed = 0.1 # radians per second
        self.tolerance = radians(1) # tolerance of 1 degree, converted to radians
        self.odom_angular_scale_correction = 1
        self.start_test = True

        # Publisher to control the robot's speed
        self.cmd_vel = rospy.Publisher('/cmd_vel', Twist, queue_size=5)

        # The base frame is usually base_link or base_footprint
        self.base_frame = rospy.get_param('~base_frame', '/base_link')

        # The odom frame is usually just /odom
        self.odom_frame = rospy.get_param('~odom_frame', '/odom')

        # Initialize the tf listener
        self.tf_listener = tf.TransformListener()

        # Give tf some time to fill its buffer
        rospy.sleep(2)

        # Make sure we see the odom and base frames
        self.tf_listener.waitForTransform(self.odom_frame, self.base_frame, rospy.Time(), rospy.Duration(60.0))

        rospy.loginfo("Bring up rqt_reconfigure to control the test.")

        reverse = 1

        while not rospy.is_shutdown():
            if self.start_test:
                # Get the current rotation angle from tf
                self.odom_angle = self.get_odom_angle()
                rospy.loginfo("self.odom_angle: "+str(self.odom_angle))

                last_angle = self.odom_angle
                turn_angle = 0
                self.test_angle *= reverse
                error = self.test_angle - turn_angle
                rospy.loginfo("errir: "+str(error))

                # Alternate directions between tests
                reverse = -reverse

                while abs(error) > self.tolerance and self.start_test:
                    if rospy.is_shutdown():
                        return
                    rospy.loginfo("*************************** ")
                    # Rotate the robot to reduce the error
                    move_cmd = Twist()
                    move_cmd.angular.z = copysign(self.speed, error)
                    rospy.loginfo("move_cmd.angular.z: "+str(move_cmd.angular.z))
                    self.cmd_vel.publish(move_cmd)
                    r.sleep()

                    # Get the current rotation angle from tf                   
                    self.odom_angle = self.get_odom_angle()
                    rospy.loginfo("current rotation angle: "+str(self.odom_angle))

                    # Compute how far we have gone since the last measurement
                    delta_angle = self.odom_angular_scale_correction * normalize_angle(self.odom_angle - last_angle)
                    rospy.loginfo("delta_angle: "+str(delta_angle))

                    # Add to our total angle so far
                    turn_angle += abs(delta_angle)
                    rospy.loginfo("turn_angle: "+str(turn_angle))

                    # Compute the new error
                    error = self.test_angle - turn_angle
                    rospy.loginfo("error: "+str(error))

                    # Store the current angle for the next comparison
                    last_angle = self.odom_angle

                # Stop the robot
                self.cmd_vel.publish(Twist())

                # Update the status flag
                self.start_test = False

            rospy.sleep(0.5)

        # Stop the robot
        self.cmd_vel.publish(Twist())

    def get_odom_angle(self):
        # Get the current transform between the odom and base frames
        try:
            (trans, rot)  = self.tf_listener.lookupTransform(self.odom_frame, self.base_frame, rospy.Time(0))
        except (tf.Exception, tf.ConnectivityException, tf.LookupException):
            rospy.loginfo("TF Exception")
            return

        # Convert the rotation from a quaternion to an Euler angle

        return quat_to_angle(Quaternion(*rot))

    def shutdown(self):
        # Always stop the robot when shutting down the node
        rospy.loginfo("Stopping the robot...")
        self.cmd_vel.publish(Twist())
        rospy.sleep(1)
if __name__ == '__main__':
    try:
        CalibrateAngular()
    except:
        rospy.loginfo("Calibration terminated.")

2. Code Description

Firstly, import the necessary packages (these match the imports at the top of the calibration script above):

import rospy
from geometry_msgs.msg import Twist, Quaternion
from nav_msgs.msg import Odometry
import tf
from math import radians, copysign
from transform_utils import quat_to_angle, normalize_angle
import PyKDL
from math import pi

Initialize the node and set the command-sending frequency with standard ROS initialization code:


class CalibrateAngular():
    def __init__(self):
        # Give the node a name
        rospy.init_node('calibrate_angular', anonymous=False)

        # Set rospy to execute a shutdown function when terminating the script
        rospy.on_shutdown(self.shutdown)

        # How fast will we check the odometry values?
        self.rate = 10
        r = rospy.Rate(self.rate)

Setting the calibration parameters:

test_angle: calibrated rotation angle, in radians

speed: angular velocity

tolerance: precision range

odom_angular_scale_correction: odom data scale factor

start_test: test ON/OFF switch

        # The test angle is 360 degrees
        self.test_angle = 2*pi # ROS uses radians rather than degrees

        self.speed = 0.1 # radians per second
        self.tolerance = radians(1) # tolerance of 1 degree, converted to radians
        self.odom_angular_scale_correction = 1
        self.start_test = True

Create the publisher for the Twist messages that control the robot:

        # Publisher to control the robot's speed
        self.cmd_vel = rospy.Publisher('/cmd_vel', Twist, queue_size=5)

Initialize base_frame, odom_frame, and the tf listener:

        # The base frame is base_footprint for the TurtleBot but base_link for Pi Robot
        self.base_frame = rospy.get_param('~base_frame', '/base_link')

        # The odom frame is usually just /odom
        self.odom_frame = rospy.get_param('~odom_frame', '/odom')

        # Initialize the tf listener
        self.tf_listener = tf.TransformListener()

        # Give tf some time to fill its buffer
        rospy.sleep(2)

        # Make sure we see the odom and base frames
        self.tf_listener.waitForTransform(self.odom_frame, self.base_frame, rospy.Time(), rospy.Duration(60.0))

        rospy.loginfo("Bring up rqt_reconfigure to control the test.")

The following while loop is used to rotate the robot through the test angle:

        while not rospy.is_shutdown():
            if self.start_test:
                # Get the current rotation angle from tf
                self.odom_angle = self.get_odom_angle()
                rospy.loginfo("self.odom_angle: "+str(self.odom_angle))

                last_angle = self.odom_angle
                turn_angle = 0
                self.test_angle *= reverse
                error = self.test_angle - turn_angle
                rospy.loginfo("errir: "+str(error))

                # Alternate directions between tests
                reverse = -reverse

                # error measures the difference between the robot's rotation angle and the
                # rotation angle required by the control command; the while loop below
                # drives the error toward zero
                while abs(error) > self.tolerance and self.start_test:
                    if rospy.is_shutdown():
                        return
                    rospy.loginfo("*************************** ")
                    # Rotate the robot to reduce the error
                    move_cmd = Twist()
                    move_cmd.angular.z = copysign(self.speed, error)
                    rospy.loginfo("move_cmd.angular.z: "+str(move_cmd.angular.z))
                    self.cmd_vel.publish(move_cmd)
                    r.sleep()

                    # Get the current rotation angle from tf
                    self.odom_angle = self.get_odom_angle()
                    rospy.loginfo("current rotation angle: "+str(self.odom_angle))

                    # Compute how far we have gone since the last measurement
                    delta_angle = self.odom_angular_scale_correction * normalize_angle(self.odom_angle - last_angle)
                    rospy.loginfo("delta_angle: "+str(delta_angle))

                    # Add to our total angle so far
                    turn_angle += abs(delta_angle)
                    rospy.loginfo("turn_angle: "+str(turn_angle))

                    # Compute the new error
                    error = self.test_angle - turn_angle
                    rospy.loginfo("error: "+str(error))

                    # Store the current angle for the next comparison
                    last_angle = self.odom_angle

                # Stop the robot
                self.cmd_vel.publish(Twist())

                # Update the status flag
                self.start_test = False

            rospy.sleep(0.5)

Getting the current rotation angle from the tf transform between the odom and base frames:

    def get_odom_angle(self):
        # Get the current transform between the odom and base frames
        try:
            (trans, rot)  = self.tf_listener.lookupTransform(self.odom_frame, self.base_frame, rospy.Time(0))
        except (tf.Exception, tf.ConnectivityException, tf.LookupException):
            rospy.loginfo("TF Exception")
            return

        # Convert the rotation from a quaternion to an Euler angle

        return quat_to_angle(Quaternion(*rot))

Stopping the node:

    def shutdown(self):
        # Always stop the robot when shutting down the node
        rospy.loginfo("Stopping the robot...")
        self.cmd_vel.publish(Twist())
        rospy.sleep(1)

3.Calibration

With the above calibration node we can calibrate the robot's angular velocity. First find a flat open space of about 2 m x 2 m and put the robot in it, then execute the following command and check whether the robot rotates exactly 360 degrees:

rosrun diego_nav calibrate_angular.py

The main parameter affecting the angular velocity is wheel_track. If the parameters in the configuration file match the real measurements but the robot still cannot rotate to the exact angle, the wheel_track parameter can be adjusted to compensate for the precision.
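
As a worked example of the scale correction used in the code above: if the commanded rotation is 360° but the robot physically turns only 350°, the odometry over-reports by a factor of 360/350, so setting odom_angular_scale_correction to 350/360 ≈ 0.97 makes the corrected measurement (correction × odom delta) match the physical rotation.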

For high precision control, high precision hardware is required.

Lesson 7: Move control-calibrate linear

After the above steps the robot can move, but it still lacks precise control. To achieve the desired control precision, calibration is required, including linear velocity calibration for the moving distance and angular velocity calibration for the rotation angle.

1. Linear velocity calibration principle and code

Linear velocity calibration is done by checking the difference between the robot's actual moving distance and the distance required by the control command. Given the hardware precision, an error within an allowable range is acceptable.

The following code controls the robot to move forward a distance of 1 m.

#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist, Point
from math import copysign, sqrt, pow
import tf
class CalibrateLinear():
    def __init__(self):
        # Give the node a name
        rospy.init_node('calibrate_linear', anonymous=False)

        # Set rospy to execute a shutdown function when terminating the script
        rospy.on_shutdown(self.shutdown)

        # How fast will we check the odometry values?
        self.rate = 10
        r = rospy.Rate(self.rate)

        # Set the distance to travel
        self.test_distance = 1.0 # meters
        self.speed = 1.0 # meters per second
        self.tolerance = 0.01 # meters
        self.odom_linear_scale_correction = 1.0
        self.start_test = True

        # Publisher to control the robot's speed
        self.cmd_vel = rospy.Publisher('/cmd_vel', Twist, queue_size=5)

        # The base frame is base_footprint for the TurtleBot but base_link for Pi Robot
        self.base_frame = rospy.get_param('~base_frame', '/base_link')

        # The odom frame is usually just /odom
        self.odom_frame = rospy.get_param('~odom_frame', '/odom')

        # Initialize the tf listener
        self.tf_listener = tf.TransformListener()

        # Give tf some time to fill its buffer
        rospy.sleep(2)

        # Make sure we see the odom and base frames
        self.tf_listener.waitForTransform(self.odom_frame, self.base_frame, rospy.Time(), rospy.Duration(60.0))

        rospy.loginfo("Bring up rqt_reconfigure to control the test.")

        self.position = Point()

        # Get the starting position from the tf transform between the odom and base frames
        self.position = self.get_position()

        x_start = self.position.x
        y_start = self.position.y

        move_cmd = Twist()

        while not rospy.is_shutdown():
            # Stop the robot by default
            move_cmd = Twist()

            if self.start_test:
                # Get the current position from the tf transform between the odom and base frames
                self.position = self.get_position()

                # Compute the Euclidean distance from the target point
                distance = sqrt(pow((self.position.x - x_start), 2) +
                                pow((self.position.y - y_start), 2))

                # Correct the estimated distance by the correction factor
                distance *= self.odom_linear_scale_correction
                # How close are we?
                error =  distance - self.test_distance

                # Are we close enough?
                if not self.start_test or abs(error) <  self.tolerance:
                    self.start_test = False
                    rospy.loginfo("Target distance reached; stopping the test.")
                else:
                    # If not, move in the appropriate direction
                    move_cmd.linear.x = copysign(self.speed, -1 * error)
            else:
                self.position = self.get_position()
                x_start = self.position.x
                y_start = self.position.y

            self.cmd_vel.publish(move_cmd)
            r.sleep()

        # Stop the robot
        self.cmd_vel.publish(Twist())

    def get_position(self):
        # Get the current transform between the odom and base frames
        try:
            (trans, rot)  = self.tf_listener.lookupTransform(self.odom_frame, self.base_frame, rospy.Time(0))
        except (tf.Exception, tf.ConnectivityException, tf.LookupException):
            rospy.loginfo("TF Exception")
            return

        return Point(*trans)

    def shutdown(self):
        # Always stop the robot when shutting down the node
        rospy.loginfo("Stopping the robot...")
        self.cmd_vel.publish(Twist())
        rospy.sleep(1)

if __name__ == '__main__':
    try:
        CalibrateLinear()
        rospy.spin()
    except:
        rospy.loginfo("Calibration terminated.")

2. Code description

First, import the necessary packages:

import rospy
from geometry_msgs.msg import Twist, Point
from math import copysign, sqrt, pow
import tf

Initialize the node and set the rate at which the odometry is checked and commands are sent, following the standard ROS node initialization pattern:


class CalibrateLinear():
    def __init__(self):
        # Give the node a name
        rospy.init_node('calibrate_linear', anonymous=False)

        # Set rospy to execute a shutdown function when terminating the script
        rospy.on_shutdown(self.shutdown)

        # How fast will we check the odometry values?
        self.rate = 10
        r = rospy.Rate(self.rate)

Set the calibration parameters:

test_distance: the forward distance to travel, in meters

speed: the forward velocity, in m/s

tolerance: the allowed precision range, in meters

odom_linear_scale_correction: a scale factor applied to the odom-reported distance

start_test: the test ON/OFF switch

        # Set the distance to travel
        self.test_distance = 1.0 # meters
        self.speed = 1.0 # meters per second
        self.tolerance = 0.01 # meters
        self.odom_linear_scale_correction = 1.0
        self.start_test = True

Publish a Twist message to control the robot's movement:

        # Publisher to control the robot's speed
        self.cmd_vel = rospy.Publisher('/cmd_vel', Twist, queue_size=5)

Initialize base_frame, odom_frame, and the tf listener:

        # The base frame is base_footprint for the TurtleBot but base_link for Pi Robot
        self.base_frame = rospy.get_param('~base_frame', '/base_link')

        # The odom frame is usually just /odom
        self.odom_frame = rospy.get_param('~odom_frame', '/odom')

        # Initialize the tf listener
        self.tf_listener = tf.TransformListener()

        # Give tf some time to fill its buffer
        rospy.sleep(2)

        # Make sure we see the odom and base frames
        self.tf_listener.waitForTransform(self.odom_frame, self.base_frame, rospy.Time(), rospy.Duration(60.0))

        rospy.loginfo("Bring up rqt_reconfigure to control the test.")

Initialize the position:

        self.position = Point()

        # Get the starting position from the tf transform between the odom and base frames
        self.position = self.get_position()

        x_start = self.position.x
        y_start = self.position.y

        move_cmd = Twist()

A while loop drives the robot forward 1 m, as shown in the following code:


        while not rospy.is_shutdown():
            # Stop the robot by default
            move_cmd = Twist()

            if self.start_test:
                # Get the current position from the tf transform between the odom and base frames
                self.position = self.get_position() #Getting current location information

                #  Calculating the distance between current location and initial location
                distance = sqrt(pow((self.position.x - x_start), 2) +
                                pow((self.position.y - y_start), 2))

                # Correct the estimated distance by the correction factor
                distance *= self.odom_linear_scale_correction

                # Calculating the distance to target location
                error =  distance - self.test_distance

                # Are we close enough?
                if not self.start_test or abs(error) < self.tolerance: # Stop once the target location is reached
                    self.start_test = False
                    rospy.loginfo("Target distance reached; stopping the test.")
                else:
                    # If the target has not been reached, keep moving forward; if it was overshot, reverse the motors
                    move_cmd.linear.x = copysign(self.speed, -1 * error)
            else:
                self.position = self.get_position()
                x_start = self.position.x
                y_start = self.position.y

            self.cmd_vel.publish(move_cmd)  # Publish the Twist message
            r.sleep()

Getting the current position from the tf data:

    def get_position(self):
        # Get the current transform between the odom and base frames
        try:
            (trans, rot)  = self.tf_listener.lookupTransform(self.odom_frame, self.base_frame, rospy.Time(0))
        except (tf.Exception, tf.ConnectivityException, tf.LookupException):
            rospy.loginfo("TF Exception")
            return

        return Point(*trans)

Stopping the node:

    def shutdown(self):
        # Always stop the robot when shutting down the node
        rospy.loginfo("Stopping the robot...")
        self.cmd_vel.publish(Twist())
        rospy.sleep(1)

3.Calibration

With the calibration node above, we can calibrate the robot's linear velocity. First, find a flat space that allows 1 m of forward travel and place the robot at the starting point, then execute the following command and check whether the robot moves forward exactly 1 m:

rosrun diego_nav calibrate_linear.py 

Ideally the robot moves forward exactly 1 m. If not, double-check the robot-related parameters in the my_arduino_params.yaml file, keeping in mind that ROS units are meters.

# === Robot drivetrain parameters
wheel_diameter: 0.02900
wheel_track: 0.18
encoder_resolution: 2 # from Pololu for 131:1 motors
gear_reduction: 75.0
motors_reversed: True

If the values in the configuration file match the physical measurements but the robot still cannot reach the required precision, the cause may be motor performance; the my_arduino_params.yaml parameters can be adjusted to compensate, as the sketch below illustrates.
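
The compensation can be computed the same way as in the angular case. A minimal sketch (an assumed helper, not part of the package):

# If odom reports 1.0 m when a tape measure shows only measured_m, odometry
# over-counts distance. Since ticks_per_meter = encoder_resolution *
# gear_reduction / (wheel_diameter * pi), shrinking wheel_diameter raises
# ticks_per_meter and reduces the distance reported for the same tick count.
def corrected_wheel_diameter(wheel_diameter, measured_m, reported_m=1.0):
    return wheel_diameter * (measured_m / reported_m)

print(corrected_wheel_diameter(0.029, measured_m=0.95))  # -> 0.02755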

For high-precision control, high-precision hardware is required.

Lesson 6: Move control - PID control for two motors

In ros_arduino_bridge, the base controller uses a single set of PID parameters for both motors. In practice, however, differences in motor characteristics, terrain, and load balance can keep the robot from following the planned trajectory; the simplest test is whether the robot can drive straight. This requires tuning the PID speed of each motor separately, so in this section we modify ros_arduino_bridge to support different PID parameters for the two motors.

1.Modify the Arduino code

1.1.diff_controller.h
Add PID control variables for the left and right motors:

/* PID Parameters */
int Kp = 20;
int Kd = 12;
int Ki = 0;
int Ko = 50;

int left_Kp=Kp;
int left_Kd=Kd;
int left_Ki=Ki;
int left_Ko=Ko;

int right_Kp=Kp;
int right_Kd=Kd;
int right_Ki=Ki;
int right_Ko=Ko;

Modify the resetPID() function:

void resetPID(){
 leftPID.TargetTicksPerFrame = 0.0;
 leftPID.Encoder = readEncoder(LEFT);
 leftPID.PrevEnc = leftPID.Encoder;
 leftPID.output = 0;
 leftPID.PrevInput = 0;
 leftPID.ITerm = 0;

 rightPID.TargetTicksPerFrame = 0.0;
 rightPID.Encoder = readEncoder(RIGHT);
 rightPID.PrevEnc = rightPID.Encoder;
 rightPID.output = 0;
 rightPID.PrevInput = 0;
 rightPID.ITerm = 0;
}

Define the dorightPID() and doleftPID() functions for the right and left motors, respectively:

/* PID routine to compute the next motor commands */
void dorightPID(SetPointInfo * p) {
  long Perror;
  long output;
  int input;

  //Perror = p->TargetTicksPerFrame - (p->Encoder - p->PrevEnc);
  input = p->Encoder - p->PrevEnc;
  Perror = p->TargetTicksPerFrame - input;

  /*
  * Avoid derivative kick and allow tuning changes,
  * see http://brettbeauregard.com/blog/2011/04/improving-the-beginner%E2%80%99s-pid-derivative-kick/
  * see http://brettbeauregard.com/blog/2011/04/improving-the-beginner%E2%80%99s-pid-tuning-changes/
  */
  //output = (Kp * Perror + Kd * (Perror - p->PrevErr) + Ki * p->Ierror) / Ko;
  // p->PrevErr = Perror;
  output = (right_Kp * Perror - right_Kd * (input - p->PrevInput) + p->ITerm) / right_Ko;
  p->PrevEnc = p->Encoder;

  output += p->output;
  // Accumulate Integral error *or* Limit output.
  // Stop accumulating when output saturates
  if (output >= MAX_PWM)
    output = MAX_PWM;
  else if (output <= -MAX_PWM)
    output = -MAX_PWM;
  else
  /*
  * allow turning changes, see http://brettbeauregard.com/blog/2011/04/improving-the-beginner%E2%80%99s-pid-tuning-changes/
  */
    p->ITerm += right_Ki * Perror;

  p->output = output;
  p->PrevInput = input;
}

/* PID routine to compute the next motor commands */
void doleftPID(SetPointInfo * p) {
  long Perror;
  long output;
  int input;

  //Perror = p->TargetTicksPerFrame - (p->Encoder - p->PrevEnc);
  input = p->Encoder - p->PrevEnc;
  Perror =p->TargetTicksPerFrame + input;

  /*
  * Avoid derivative kick and allow tuning changes,
  * see http://brettbeauregard.com/blog/2011/04/improving-the-beginner%E2%80%99s-pid-derivative-kick/
  * see http://brettbeauregard.com/blog/2011/04/improving-the-beginner%E2%80%99s-pid-tuning-changes/
  */
  //output = (Kp * Perror + Kd * (Perror - p->PrevErr) + Ki * p->Ierror) / Ko;
  // p->PrevErr = Perror;
  output = (left_Kp * Perror - left_Kd * (input - p->PrevInput) + p->ITerm) / left_Ko;
  p->PrevEnc = p->Encoder;

  output += p->output;
  // Accumulate Integral error *or* Limit output.
  // Stop accumulating when output saturates
  if (output >= MAX_PWM)
    output = MAX_PWM;
  else if (output <= -MAX_PWM)
    output = -MAX_PWM;
  else
  /*
  * allow turning changes, see http://brettbeauregard.com/blog/2011/04/improving-the-beginner%E2%80%99s-pid-tuning-changes/
  */
    p->ITerm += left_Ki * Perror;

  p->output = output;
  p->PrevInput = input;
}
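
Both routines implement the incremental PID form with derivative-on-measurement described in the Brett Beauregard articles linked in the comments. Per frame the new output is computed as output += (Kp * Perror - Kd * (input - PrevInput) + ITerm) / Ko, where Ko acts as a common divisor so the integer gains behave like fixed-point fractions, and ITerm accumulates the per-motor Ki * Perror only while the output is not saturated, which provides simple anti-windup.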

Modify the updatePID() function:

void updatePID() {
  /* Read the encoders */
  leftPID.Encoder =readEncoder(LEFT);
  rightPID.Encoder = readEncoder(RIGHT);

  /* If we're not moving there is nothing more to do */
  if (!moving){
    /*
    * Reset PIDs once, to prevent startup spikes,
    * see http://brettbeauregard.com/blog/2011/04/improving-the-beginner%E2%80%99s-pid-initialization/
    * PrevInput is considered a good proxy to detect
    * whether reset has already happened
    */
    if (leftPID.PrevInput != 0 || rightPID.PrevInput != 0) resetPID();
    return;
  }

  /* Compute PID update for each motor */
  dorightPID(&rightPID);
  doleftPID(&leftPID);

  /* Set the motor speeds accordingly */
  setMotorSpeeds(leftPID.output, rightPID.output);

}

1.2. encoder_driver.ino

Because the left and right motors are mounted mirrored, their rotation directions are opposite: when the robot drives straight, one encoder count increases while the other decreases, which breaks the PID calculation. We therefore negate the reading of the left motor's encoder by modifying the readEncoder function.

/* Wrap the encoder reading function */
long readEncoder(int i) {
  if (i == LEFT) return 0 - left_enc_pos;
  else return right_enc_pos;
}

1.3.ROSArduinoBridge.ino

Modify the declaration of pid_args as follows:

int pid_args[8];

Modify the runCommand() function: in the UPDATE_PID case, comment out the original Kp, Kd, Ki, Ko assignments and add the following code:

    case UPDATE_PID:
      while ((str = strtok_r(p, ":", &p)) != '\0') {
        pid_args[i] = atoi(str);
        i++;
      }
//      Kp = pid_args[0];
//      Kd = pid_args[1];
//      Ki = pid_args[2];
//      Ko = pid_args[3];

      left_Kp = pid_args[0];
      left_Kd = pid_args[1];
      left_Ki = pid_args[2];
      left_Ko = pid_args[3];

      right_Kp = pid_args[4];
      right_Kd = pid_args[5];
      right_Ki = pid_args[6];
      right_Ko = pid_args[7];
      Serial.println("OK");
      break;

The Arduino side now supports driving the two motors with different PID parameters.

1.4.ROSArduinoBridge.ino

Modify the definitions of the two argument buffers argv1 and argv2, expanding their array length. These buffers hold the command arguments received from the host computer, and the added PID parameters make the command longer than the original buffers allowed.

// Character arrays to hold the first and second arguments
char argv1[32];
char argv2[32];
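
For example, the argument of the new PID update command, 10:12:0:50:8:12:0:50, is 20 characters long, which would overflow the 16-byte buffers used in the stock ros_arduino_bridge.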

2.Modify the ROS package code

2.1.my_arduino_params.yaml
Add the PID parameters of the left and right motors to the PID parameter section:

# === PID parameters
Kp: 10
Kd: 12
Ki: 0
Ko: 50
accel_limit: 1.0

left_Kp: 10
left_Kd: 12
left_Ki: 0
left_Ko: 50

right_Kp: 8
right_Kd: 12
right_Ki: 0
right_Ko: 50

2.2.arduino_driver.py

Modify the update_pid function:

def update_pid(self, left_Kp, left_Kd, left_Ki, left_Ko, right_Kp, right_Kd, right_Ki, right_Ko):
        ''' Set the PID parameters on the Arduino
        '''
        print "Updating PID parameters"
        cmd = 'u ' + str(left_Kp) + ':' + str(left_Kd) + ':' + str(left_Ki) + ':' + str(left_Ko) + ':' + str(right_Kp) + ':' + str(right_Kd) + ':' + str(right_Ki) + ':' + str(right_Ko)
        self.execute_ack(cmd)  
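
As an illustration, the following snippet (not part of the package) reconstructs the command string that update_pid sends for the parameter values from my_arduino_params.yaml above:

# left_Kp, left_Kd, left_Ki, left_Ko and right_Kp, right_Kd, right_Ki, right_Ko
left = (10, 12, 0, 50)
right = (8, 12, 0, 50)
cmd = 'u ' + ':'.join(str(v) for v in left + right)
print(cmd)  # -> u 10:12:0:50:8:12:0:50

The Arduino parses this line into pid_args[0..7] in the UPDATE_PID case shown in section 1.3.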

2.3.base_controller.py

Modify def __init__(self, arduino, base_frame), adding initialization code for the left and right motor PID parameters:

def __init__(self, arduino, base_frame):
        self.arduino = arduino
        self.base_frame = base_frame
        self.rate = float(rospy.get_param("~base_controller_rate", 10))
        self.timeout = rospy.get_param("~base_controller_timeout", 1.0)
        self.stopped = False

        pid_params = dict()
        pid_params['wheel_diameter'] = rospy.get_param("~wheel_diameter", "") 
        pid_params['wheel_track'] = rospy.get_param("~wheel_track", "")
        pid_params['encoder_resolution'] = rospy.get_param("~encoder_resolution", "") 
        pid_params['gear_reduction'] = rospy.get_param("~gear_reduction", 1.0)

        #modify by william 
        pid_params['left_Kp'] = rospy.get_param("~left_Kp", 20)
        pid_params['left_Kd'] = rospy.get_param("~left_Kd", 12)
        pid_params['left_Ki'] = rospy.get_param("~left_Ki", 0)
        pid_params['left_Ko'] = rospy.get_param("~left_Ko", 50)
        pid_params['right_Kp'] = rospy.get_param("~right_Kp", 20)
        pid_params['right_Kd'] = rospy.get_param("~right_Kd", 12)
        pid_params['right_Ki'] = rospy.get_param("~right_Ki", 0)
        pid_params['right_Ko'] = rospy.get_param("~right_Ko", 50)

        self.accel_limit = rospy.get_param('~accel_limit', 0.1)
        self.motors_reversed = rospy.get_param("~motors_reversed", False)

        # Set up PID parameters and check for missing values
        self.setup_pid(pid_params)

        # How many encoder ticks are there per meter?
        self.ticks_per_meter = self.encoder_resolution * self.gear_reduction  / (self.wheel_diameter * pi)

        # What is the maximum acceleration we will tolerate when changing wheel speeds?
        self.max_accel = self.accel_limit * self.ticks_per_meter / self.rate

        # Track how often we get a bad encoder count (if any)
        self.bad_encoder_count = 0

        now = rospy.Time.now()    
        self.then = now # time for determining dx/dy
        self.t_delta = rospy.Duration(1.0 / self.rate)
        self.t_next = now + self.t_delta

        # internal data        
        self.enc_left = None            # encoder readings
        self.enc_right = None
        self.x = 0                      # position in xy plane
        self.y = 0
        self.th = 0                     # rotation in radians
        self.v_left = 0
        self.v_right = 0
        self.v_des_left = 0             # cmd_vel setpoint
        self.v_des_right = 0
        self.last_cmd_vel = now

        # subscriptions
        rospy.Subscriber("cmd_vel", Twist, self.cmdVelCallback)

        # Clear any old odometry info
        self.arduino.reset_encoders()

        # Set up the odometry broadcaster
        self.odomPub = rospy.Publisher('odom', Odometry)
        self.odomBroadcaster = TransformBroadcaster()

        rospy.loginfo("Started base controller for a base of " + str(self.wheel_track) + "m wide with " + str(self.encoder_resolution) + " ticks per rev")
        rospy.loginfo("Publishing odometry data at: " + str(self.rate) + " Hz using " + str(self.base_frame) + " as base frame")

Modify def setup_pid(self, pid_params):

def setup_pid(self, pid_params):
        # Check to see if any PID parameters are missing
        missing_params = False
        for param in pid_params:
            if pid_params[param] == "":
                print("*** PID Parameter " + param + " is missing. ***")
                missing_params = True

        if missing_params:
            os._exit(1)

        self.wheel_diameter = pid_params['wheel_diameter']
        self.wheel_track = pid_params['wheel_track']
        self.encoder_resolution = pid_params['encoder_resolution']
        self.gear_reduction = pid_params['gear_reduction']

        #self.Kp = pid_params['Kp']
        #self.Kd = pid_params['Kd']
        #self.Ki = pid_params['Ki']
        #self.Ko = pid_params['Ko']

        #self.arduino.update_pid(self.Kp, self.Kd, self.Ki, self.Ko)

        #modify by william
        self.left_Kp = pid_params['left_Kp']
        self.left_Kd = pid_params['left_Kd']
        self.left_Ki = pid_params['left_Ki']
        self.left_Ko = pid_params['left_Ko']

        self.right_Kp = pid_params['right_Kp']
        self.right_Kd = pid_params['right_Kd']
        self.right_Ki = pid_params['right_Ki']
        self.right_Ko = pid_params['right_Ko']

        #Pass the 8 parameters to the update_pid function
        self.arduino.update_pid(self.left_Kp, self.left_Kd, self.left_Ki, self.left_Ko, self.right_Kp, self.right_Kd, self.right_Ki, self.right_Ko)

After the above modifications, ros_arduino_bridge can drive the two motors with different PID parameters.

3.PID Speed control

Tuning the PID speed of two motors is tedious: the parameters must be adjusted repeatedly until the speeds of the two motors match. The author's experience is to tune one motor well first and then use it as the reference while adjusting the other motor's PID parameters. The achievable accuracy is, of course, also limited by motor performance and encoder resolution. To make the tuning easier, we use rqt_plot to display the speed, PID input, and PID output of the two motors as curves.

3.1. base_controller.py

Modify def __init__(self, arduino, base_frame), adding the self.debugPID flag at the beginning: debugPID = True enables debugging, debugPID = False turns it off.

class BaseController:
    def __init__(self, arduino, base_frame):
        self.arduino = arduino
        self.base_frame = base_frame
        self.rate = float(rospy.get_param("~base_controller_rate", 10))
        self.timeout = rospy.get_param("~base_controller_timeout", 1.0)
        self.stopped = False
        self.debugPID = False

Modify def __init__(self, arduino, base_frame), adding the following code at the end of the function (Int32 must be imported from std_msgs.msg at the top of the file):

if self.debugPID:
    self.lEncoderPub = rospy.Publisher('Lencoder', Int32)
    self.rEncoderPub = rospy.Publisher('Rencoder', Int32)
    self.lPidoutPub = rospy.Publisher('Lpidout', Int32)
    self.rPidoutPub = rospy.Publisher('Rpidout', Int32)
    self.lVelPub = rospy.Publisher('Lvel', Int32)
    self.rVelPub = rospy.Publisher('Rvel', Int32)

Modify def poll(self), adding the following code at the beginning of the function:

if self.debugPID:
    rospy.logdebug("poll start-------------------------------: ")
    try:
        left_pidin, right_pidin = self.arduino.get_pidin()
        self.lEncoderPub.publish(left_pidin)
        self.rEncoderPub.publish(right_pidin)
        rospy.logdebug("left_pidin: " + str(left_pidin))
        rospy.logdebug("right_pidin: " + str(right_pidin))
    except:
        rospy.logerr("getpidin exception count: ")
        return

    try:
        left_pidout, right_pidout = self.arduino.get_pidout()
        self.lPidoutPub.publish(left_pidout)
        self.rPidoutPub.publish(right_pidout)
        rospy.logdebug("left_pidout: " + str(left_pidout))
        rospy.logdebug("right_pidout: " + str(right_pidout))
    except:
        rospy.logerr("getpidout exception count: ")
        return

Modify def poll(self), adding the following code after the if not self.stopped: line:

if self.debugPID:
    self.lVelPub.publish(self.v_left)
    self.rVelPub.publish(self.v_right)

3.2. arduino_driver.py

Add the get_pidin and get_pidout functions:

def get_pidin(self):
    values = self.execute_array('i')
    if len(values) != 2:
        print "pidin was not 2"
        raise SerialException
    else:
        return values

def get_pidout(self):
    values = self.execute_array('f')
    if len(values) != 2:
        print "pidout was not 2"
        raise SerialException
    else:
        return values
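
When the Arduino receives the 'i' or 'f' command it replies with the two values separated by a space (see the runCommand() cases in section 3.5), for example 12 11, which execute_array parses into a two-element list.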

3.3. Arduino commands.h

Add the pidin and pidout commands:

#define READ_PIDOUT    'f'
#define READ_PIDIN     'i'

3.4. Arduino diff_controller.h

Add the readPidIn and readPidOut functions at the end:

long readPidIn(int i) {
  long pidin = 0;
  if (i == LEFT) {
    pidin = leftPID.PrevInput;
  } else {
    pidin = rightPID.PrevInput;
  }
  return pidin;
}

long readPidOut(int i) {
  long pidout = 0;
  if (i == LEFT) {
    pidout = leftPID.output;
  } else {
    pidout = rightPID.output;
  }
  return pidout;
}

3.5. Arduino ROSArduinoBridge.ino

Modify runCommand(), adding case READ_PIDIN: and case READ_PIDOUT: to the switch block:

  switch (cmd) {
    case GET_BAUDRATE:
      Serial.println(BAUDRATE);
      break;
    case READ_PIDIN:
      Serial.print(readPidIn(LEFT));
      Serial.print(" ");
      Serial.println(readPidIn(RIGHT));
      break;
    case READ_PIDOUT:
      Serial.print(readPidOut(LEFT));
      Serial.print(" ");
      Serial.println(readPidOut(RIGHT));
      break;
     ...

3.6. Running the PID speed tuning

Execute the following commands in separate terminals.
Start roscore:

roscore

Source the workspace environment:

. ~/catkin_ws/devel/setup.bash

Start the node:

roslaunch ros_arduino_python arduino.launch

Now we can publish Twist messages to control the robot, for example:

rostopic pub /cmd_vel geometry_msgs/Twist -r 1 -- '[0.5, 0.0, 0.0]' '[0.0, 0.0, 0.0]'
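
The first triple of the Twist message is the linear velocity (x, y, z) in m/s and the second the angular velocity in rad/s, so this command drives the robot straight ahead at 0.5 m/s, republished once per second (-r 1).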

Start rqt_plot:

rqt_plot
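
The topics to plot can also be passed directly on the command line, for example (topic names as published by the debug code in section 3.1):

rqt_plot /Lencoder /Rencoder /Lpidout /Rpidout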

For the use of rqt_plot, see the official documentation: http://wiki.ros.org/rqt_plot
Now you can watch the curves in the rqt_plot graphical interface and tune the PID according to the following classic tuning rhyme:

Tune the parameters from small to large, checking the response in order.
First the proportional term, then the integral, and only then add the derivative.
If the curve oscillates frequently, reduce the proportional gain.
If the curve drifts in large slow swings, increase the proportional gain.
If the curve recovers from a deviation slowly, shorten the integral time.
If the fluctuation period is long, lengthen the integral time.
If the oscillation frequency is fast, first turn the derivative down.
If the fluctuations are large and slow, lengthen the derivative time.
The ideal curve shows two waves, the second peak about a quarter of the first.
Observe, adjust, and analyze repeatedly, and the control quality will not be poor.
