Routines to control a humanoid echolocator robot.

Semester Project by Frederike Dümbgen

Acoustic Robot

Analyze

As explained in Section Operate, a large number of files are created in the output folder after running the experiments. Note that if the program had to be restarted because of unexpected errors during an experiment, it is worth checking and rearranging the measurements so that they respect the suggested file structure (for example, group all measured encoder readings into one file and compensate for the counter being reset to 0 whenever the robot had to be stopped), as sketched below.
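For instance, encoder logs from several runs could be merged along the following lines; the file names and the single-column format are assumptions and need to be adapted to your own output:

import numpy as np

# Hypothetical sketch: merge encoder logs from several runs into one file,
# compensating for the counter being reset to 0 between runs.
parts = ['output/encoders_run1.txt', 'output/encoders_run2.txt']
offset = 0.0
merged = []
for fname in parts:
    data = np.loadtxt(fname)      # one encoder reading per line
    merged.append(data + offset)  # shift this run by the accumulated offset
    offset = merged[-1][-1]       # the next run continues from the last value
np.savetxt('output/encoder_X.txt', np.concatenate(merged))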

Analysis class

Important: you need to adjust the two variables "PTS_BASIS" and "MARGIN" manually for the program to function correctly. See Setup for how they are defined.

Positions

First of all, there are text files containing the robot's real positions, its odometry (encoder) measurements, its positions calculated using the cameras, and the cameras' positions. For their evaluation, a Python class Analysis.py was created. You need to construct an Analysis object with at least the output folder as argument. Then, the names of the output files of interest can be assigned to the respective fields, as explained in the code documentation (Develop).

For example, if you are only interested in plotting the robot's real positions, the visual localization positions (using fixed height) and the calculated camera positions, you could enter:
import Analysis

b = Analysis.Analysis('output_analysis/')               # folder where the results are saved
b.output_real = 'output/real_X.txt'                     # real robot positions
b.output_vis = 'output/posobj_fix_X.txt'                # visual localization (fixed height)
b.output_cam = 'output/cameras_X.txt'                   # calculated camera positions
b.output_camreal = 'output/cameras_real_positions.txt'  # real camera positions
b.read_files()

Once the object is created, you can plot its components, as long as they are defined, by calling
b.plot_geometry('my_name')
The resulting figure is saved under output_analysis/'my_name'. It will look similar to Figure 1.

Figure 1 - Plot obtained from Analysis.plot_geometry('my_name').
If you would like to zoom in on the robot's trajectory only (without showing the camera positions and the reference frames), simply type
b.plot_geometry('my_name_zoomed', True)

Room impulse response

You can also obtain the impulse responses at every position, with different zoom levels, using the Analysis object. Simply declare the input_wav file and the output_wav_list. If you also want to include vertical lines at the expected times of arrival of the principal echoes, add 'output/real_X.txt' and run b.get_TOA() to calculate the times of arrival. Then, run b.get_RIR() to get plots of the impulse responses. The function also returns the pairs [t, h] and [f, H] for the impulse response in the time and frequency domain, respectively. A typical plot created by this function is shown in Figure 2.

import Analysis

b = Analysis.Analysis('output_analysis/')
b.input_wav = 'input/sound.wav'
b.output_wav_list = ['output/0_audio_X_0.txt', 'output/0_audio_X_1.txt', ...]  # list all recorded responses
# optional: vertical lines at the expected times of arrival
b.output_real = 'output/real_X.txt'
b.get_TOA()
b.read_files()                # get impulse responses
[t, h], [f, H] = b.get_RIR()
Figure 2 - Plot obtained from Analysis.get_RIR().

Note that the program can filter out certain harmonic frequencies for better results. The list of frequencies needs to be adjusted to contain all frequencies at which the response Y(w) shows peaks; one way to find them is sketched below. If no filtering is desired, you can comment out the line b.apply_filter(time, freq).
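As a rough illustration of how these peak frequencies could be found, the sketch below inspects the magnitude spectrum of one recorded response; the file name, sampling rate and threshold are assumptions and are not part of Analysis.py:

import numpy as np

fs = 44100                                # assumed sampling rate of the recordings
y = np.loadtxt('output/0_audio_X_0.txt')  # one recorded response, one sample per line
Y = np.abs(np.fft.rfft(y))                # magnitude spectrum |Y(w)|
f = np.fft.rfftfreq(len(y), d=1.0/fs)
peaks = f[Y > 10 * np.median(Y)]          # frequencies with pronounced peaks
print(np.round(peaks))                    # candidates for the harmonic filter list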

Correlation

The Analysis class also serves for the latency analysis described in Prepare. You can use b.get_crosscor() to plot the cross correlation between the input file b.input_wav and each output file from b.output_wav_list (Figure 3), for example as shown below.
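A minimal sketch of such a run, assuming the same fields as in the examples above and that read_files() has to be called before the correlation (both the file names and the call order are assumptions):

import Analysis

b = Analysis.Analysis('output_analysis/')
b.input_wav = 'input/sound.wav'
b.output_wav_list = ['output/0_audio_X_0.txt', 'output/0_audio_X_1.txt']
b.read_files()
b.get_crosscor()  # plots the cross correlation with each output and marks its maximum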

Figure 3 - Plot of cross correlation with maximum.

Other results

For the visual localization, a plot of the resulting 2D and 3D errors of the robot position (or of the reference point positions) is created and saved under "output/N_combi_rob_X.png" (Figure 4).
Figure 4 - 2D and 3D errors of robot position in "output/N_combi_rob_X.png".