Implementation of an algorithm for determining the width of an object using an ultrasonic rangefinder and encoders in RobotC language

Good day!

Today I would like to share an implementation of the algorithm for determining the width of an object using an ultrasonic rangefinder and encoders in the RobotC programming language (using the VEX EDR platform).

Background


Recently, I was invited to give a master class on solving interesting problems in the RobotC programming language. During the master class, I covered the task of determining the width of an object, and I want to share the implementation of the algorithm in this publication.

Components


As an experimental training ground, I used the Robot Virtual Worlds environment for the VEX platform, the VEX Launcher virtual robot model, virtual models of the VRC Turning Point competition elements (Figure 1), and elementary mathematical formulas (for finding the length of a chord in a circle from a known central angle).

Figure 1. Virtual training ground with the robot

Task


- Determine the diameter of a ball using only an ultrasonic rangefinder and the encoders mounted on the robot

Theoretical basis


To determine the width of an object, I used the concept of the sensor's radiation pattern (Figure 2).

Figure 2. Sensitivity and radiation pattern of the ultrasonic rangefinder

In essence, we need to determine the angle swept between the tangents to the object at the extreme points of the pattern as the robot turns from the rightmost position to the leftmost one.
Consider Figure 3.

Figure 3. Algorithm geometry

We will find the central angle of the arc from the relationship between the radius, the central angle, and the arc length traced as the robot rotates. The final width value corresponds to m, the chord of the arc.

Input parameters


As input parameters, I have selected:

  • dist = 100 mm (the distance at which the robot stops to measure; the value is chosen so that other objects around the target do not fall into the sensor's field of view and interfere with the measurement)
  • d_wheel = 102 mm (robot wheel diameter)
  • d_ball = 102 mm (actual ball diameter)

As the output parameter of the task I chose the error, which indicates the accuracy of our method and of the selected parameters:

  • error = abs(d_ball_exper - d_ball)

Program Operation Algorithm


The algorithm consists of the following stages.

  1. Drive up to the object.
  2. Turn to the rightmost point of the object.
  3. Turn to the leftmost point of the object.
  4. Calculate the angle from the encoder readings.
  5. Calculate the error.

RobotC implementation of the algorithm


First of all, we need to initialize the standard configuration of the virtual robot:

#pragma config(StandardModel, "VEX Launchbot")

We implement a function responsible for turning the robot to the rightmost and/or leftmost position. The function returns the encoder reading after the turn, which will be used in further calculations:

// Turns the robot until the object enters (need_object == true) or
// leaves (need_object == false) the sensor's field of view, turning
// left (left_side == true) or right (left_side == false).
// Returns the left motor encoder reading after the turn.
int turn_to_over_object(int leftSpeed, int rightSpeed, int distance, bool need_object, bool left_side)
{
    if (need_object == true)
    {
        if (left_side == true)
        {
            // turn left until the object appears at <= distance
            while (SensorValue[sonarSensor] > distance)
            {
                setMultipleMotors(-leftSpeed, leftMotor, frontLeftMotor);
                setMultipleMotors(rightSpeed, rightMotor, frontRightMotor);
            }
        }
        else
        {
            // turn right until the object appears at <= distance
            while (SensorValue[sonarSensor] > distance)
            {
                setMultipleMotors(leftSpeed, leftMotor, frontLeftMotor);
                setMultipleMotors(-rightSpeed, rightMotor, frontRightMotor);
            }
        }
    }
    else
    {
        if (left_side == true)
        {
            // turn left until the object leaves the field of view
            while (SensorValue[sonarSensor] < (distance + 1))
            {
                setMultipleMotors(-leftSpeed, leftMotor, frontLeftMotor);
                setMultipleMotors(rightSpeed, rightMotor, frontRightMotor);
            }
        }
        else
        {
            // turn right until the object leaves the field of view
            while (SensorValue[sonarSensor] < (distance + 1))
            {
                setMultipleMotors(leftSpeed, leftMotor, frontLeftMotor);
                setMultipleMotors(-rightSpeed, rightMotor, frontRightMotor);
            }
        }
    }

    return getMotorEncoder(leftMotor);
}

Let's take a closer look at the function arguments:

  • leftSpeed - left drive speed
  • rightSpeed - right drive speed
  • distance - the distance to the object at which the robot stopped
  • need_object - movement mode: True if the object must be in the sensor's field of view, False if it must not
  • left_side - direction of rotation: True - turn left, False - turn right

With need_object = True, the robot rotates until the object appears within distance of the robot (in our case, 100 mm).

With need_object = False, the robot rotates until the object is farther than distance from the robot, i.e. at (distance + 1) or more. This margin of one unit affects the final accuracy of detecting the absence of an object in the sensor's field of view (the program could be improved by passing this margin as a function argument).

Let's write the main body of the program:

task main()
{
    float d_ball = 10.2;  // actual ball diameter, cm
    float d_wheel = 10.2; // wheel diameter, cm

    int dist = 10; // stopping distance, cm

    // position the arm before measuring
    setMotor(armMotor, -50);
    wait(1);

    // drive forward until the object is dist away
    while (SensorValue[sonarSensor] > dist)
    {
        setMultipleMotors(50, leftMotor, rightMotor, frontLeftMotor, frontRightMotor);
    }
    stopAllMotors();

    int enc = 0;
    // turn right until the object leaves the field of view
    enc = turn_to_over_object(50, 50, dist, false, false);
    // turn left until the object reappears (rightmost position)
    enc = turn_to_over_object(50, 50, dist, true, true);
    resetMotorEncoder(leftMotor); // measure from the rightmost position
    // turn left until the object leaves the field of view (leftmost position)
    enc = turn_to_over_object(50, 50, dist, false, true);
    stopAllMotors();

    // central angle from the encoder reading
    float teta = abs(enc) * PI * d_wheel * d_wheel / (dist * 627);
    // chord of the arc = experimental ball diameter
    float d_ball_exper = 2 * dist * sin(teta / 2);

    float error = abs(d_ball_exper - d_ball);

    // endless loop so the values stay visible in the debugger
    while (1)
    {
    }
}

The first while loop runs until the robot reaches the specified distance to the object. We then call the function written above three times in sequence to measure the encoder reading between the rightmost and leftmost positions (note that the encoder must be reset to zero before the measurement).

To obtain the angle formula, I relied on the following relations:
- enc = N * 627, where
    N is the number of wheel revolutions,
    enc is the encoder reading,
    627 is the encoder resolution => express N;
- S = N * l, where
    l is the length of the wheel, l = PI * d_wheel * d_wheel => substituting N and l gives S = enc * PI * d_wheel * d_wheel / 627;
- on the other hand, S = dist * teta, where
    S is the arc through which the robot turned,
    teta is the desired central angle;
- combining the above, we express teta => teta = enc * PI * d_wheel * d_wheel / (627 * dist)

To find the chord, which equals the desired ball diameter, we use the formula:
    d_ball_exper = 2 * dist * sin(teta / 2)

The endless while loop at the end of the program is needed to keep the obtained values visible in the debugger.

Summary data


The following ball diameter and error values were obtained on the virtual field:

    d_ball_exper = 10.6 cm
    error = 0.4 cm

As you can see, the method gives a small measurement error, which is acceptable for the accuracy class of the sensors used. The error can be reduced further by tuning the dist parameter (for example, repeating the measurement in a loop while moving the robot back in fixed steps to vary dist), or by using a PID controller, which reduces the influence of the robot's inertia on the final encoder value during braking.

Prospects


In the future, I will consider other methods for using robot sensors to determine environmental parameters.
