GENERAL INFORMATION
-------------------

1. Dataset Title: 

Videos underlying the publication: A Novel MPC Formulation for Dynamic Target Tracking with 
Increased Area Coverage for Search-and-Rescue Robots


2. Authorship: 

	Name: Mirko Baglioni
	Institution: TU Delft, Faculty of Aerospace Engineering, Department of Control and Operations
	Email: M.Baglioni@tudelft.nl
	ORCID: 0000-0001-5421-2824



DESCRIPTION
-----------

1. Abstract: 

This dataset contains the videos of the trajectories of a robot and victims in a simulated 
search-and-rescue scenario, the videos of experiments performed with robots in real life, 
and the tables with the uncertainty values used in the simulations.

The videos of the trajectories of a robot and victims in a simulated search-and-rescue 
scenario consider five different approaches for comparison purposes:

1) trajectory_video_1_our_approach.avi for our proposed tube-based Model Predictive Control (MPC) approach 
(see 'Resource DOI' of the 4TU.ResearchData repository at https://doi.org/10.4121/22270498);
2) trajectory_video_2_Farrohksiar.avi for Farrohksiar tube-based MPC approach; 
3) trajectory_video_3_AstarMPC.avi for A*-MPC approach; 
4) trajectory_video_4_randMPC.avi for randomized MPC approach; 
5) trajectory_video_5_BMastar.avi for Boustrophedon-motion-A* approach. 

The videos of the experiments with robots in real life show our tube-based Model Predictive Control (MPC) 
approach in three scenarios in a lab environment (respectively, experiment_scenario_1.MP4,
experiment_scenario_2.MP4 and experiment_scenario_3.MP4), with a TurtleBot 3 Burger robot acting 
as the search-and-rescue robot, an iRobot Create 3 robot acting as the victim, and three static obstacles.

The non-smoothness_map_x.csv and non-smoothness_map_y.csv files are the tables of the non-smoothness map 
of the environment for the x and y coordinates, respectively (values in meters; each cell is 1 m by 
1 m in a 12 m by 12 m environment).


2. Keywords: 

	Model predictive control
	Tube-based model predictive control
	Robust control
	Coverage path planning
	Robots in search and rescue operations
	Disaster robotics


3. Date of data collection: 

	14 March 2023 (videos of the trajectories in the simulations, and non-smoothness map tables)
	26 February 2024 (videos of experiments with robots in real life)


4. Date of dataset publication: 

	17 March 2023 (videos of the trajectories in the simulations, and non-smoothness map tables)
	20 March 2024 (videos of experiments with robots in real life)


5. Funding: 

	This research has been supported jointly by the TU Delft AI Labs program - as a part of the 
	AI*MAN lab research - and by the NWO Talent Program Veni project "Autonomous drones flocking 
	for search-and-rescue" (18120), which has been financed by the Netherlands Organisation for 
	Scientific Research (NWO).



ACCESS INFORMATION
------------------

1. Creative Commons License of the dataset: 
	
	The contents of this dataset are publicly released under a Creative Commons Attribution 
	Non-Commercial No-Derivatives 4.0 International license (CC BY-NC-ND 4.0). See 
	https://creativecommons.org/licenses/by-nc-nd/4.0/ for more information.


2. Dataset DOI:

	https://doi.org/10.4121/22270498



VERSIONING AND PROVENANCE
-------------------------

1. Last modification date: 

	26 September 2024


2. Was data derived from another source? If yes, which source? 

	n.a.



METHODOLOGICAL INFORMATION
--------------------------

1. Description of data collection methods: 

The video files have been generated with MATLAB R2021a using the output data of our simulations. The aim 
of the simulations was to analyze and validate a tube-based MPC approach for the mission planning 
of a robot in the presence of victims in a search-and-rescue scenario. The scenario consisted of a 
disaster-struck building: the robot enters the building, explores the environment to detect the victims 
and attach a tracking system to them while avoiding static obstacles, and finally goes to the exit point, 
while the victims move according to an established crowd evacuation model.

The tube-based MPC approach is explained in the article related to this dataset (see 'Resource DOI' of 
the 4TU.ResearchData repository at https://doi.org/10.4121/22270498). The csv files contain values 
generated randomly within the range [-0.1 m, 0.1 m] using MATLAB R2021a. These files have been used 
to provide the uncertainty in the robot position that enters the robot dynamics equation in our simulations. 
The values were generated randomly but kept fixed across our simulations to guarantee reproducibility.
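The generate-once-and-reuse step described above can be sketched outside MATLAB as well. The following is a hypothetical Python equivalent, not the authors' code: the seed value, file name reuse, and the assumption of one uncertainty value per 1 m cell (a 12 by 12 table) are illustrative choices.

```python
import numpy as np

# Hypothetical sketch of the reproducibility scheme described above:
# draw a fixed table of position-uncertainty values in [-0.1 m, 0.1 m]
# once, save it to CSV, and reload the same values in every run.
GRID_SIZE = 12   # assumed: one value per 1 m cell of the 12 m x 12 m environment
BOUND = 0.1      # uncertainty bound in meters

rng = np.random.default_rng(seed=42)   # fixed seed: the table is reproducible
uncertainty_x = rng.uniform(-BOUND, BOUND, size=(GRID_SIZE, GRID_SIZE))

np.savetxt("non-smoothness_map_x.csv", uncertainty_x, delimiter=",")

# Later simulation runs reload the stored table instead of redrawing it,
# so the position uncertainty is identical across runs.
reloaded = np.loadtxt("non-smoothness_map_x.csv", delimiter=",")
assert np.allclose(uncertainty_x, reloaded)
```

Keeping the drawn values on disk, rather than re-seeding the generator inside each run, makes the simulations reproducible independently of the software that later consumes the table.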

The video files of the experiments have been recorded with a video camera. The aim of the experiments 
was to show the applicability of the tube-based MPC approach beyond simulations. The three videos 
correspond to three search-and-rescue scenarios in which the search-and-rescue robot (a TurtleBot 3 Burger)
explores the environment to detect the victim (an iRobot Create 3), which moves according to an 
established crowd evacuation model, while avoiding static obstacles, and finally goes to the exit point.


2. Methods for processing the data: 

The simulation data have been processed with MATLAB R2021a to generate the videos and the tables.
The simulation data were available as MATLAB .mat files; the variables in these files have been used
to generate the videos through the MATLAB function VideoWriter.
The values in the tables have been used as variables during our simulations.


3. Instrument- or software- specific information (incl. software version) needed to interpret the data: 

The videos can be viewed with any video player that supports the .avi and .mp4 formats; the csv files 
can be opened with any text editor or spreadsheet application.
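Since the tables are plain comma-separated numbers, any CSV reader can load them. Below is a minimal, hedged sketch in Python; the two-row inline sample stands in for the real file, and the file name in the comment is the one from this dataset.

```python
import csv
import io

# Minimal sketch: parse a non-smoothness map table into a list of float rows.
# A small inline sample is used here; for the real data, replace the StringIO
# with open("non-smoothness_map_x.csv", newline="").
sample = "0.05,-0.02,0.10\n-0.07,0.00,0.03\n"
table = [[float(v) for v in row] for row in csv.reader(io.StringIO(sample))]

# All uncertainty values lie within the [-0.1 m, 0.1 m] range stated above.
assert all(abs(v) <= 0.1 for row in table for v in row)
```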



FILE OVERVIEW
-------------

1. Explain the file naming convention, if applicable: 

The videos of the simulations are numbered 1 to 5 according to the comparison approach simulated 
in the corresponding video.
The videos of the experiments are numbered 1 to 3 according to the scenario considered.


2. Extra information on the files:

In the videos, each cell of the 6 by 6 grid represents a 2 m by 2 m cell, for a whole scenario 
of 12 m by 12 m. 
The small black dot and the black circle indicate the robot and its perception field, respectively. 
The red, green and blue colors indicate each of the three victims' trajectories, and the perception 
field of the robot when it detects the corresponding victim; an asterisk of the same color indicates 
the victim's position at the time it is detected by the robot. The cyan circles indicate the static 
obstacles. The black squares indicate the starting positions and the stars indicate the final positions, 
for both the robot and the victims.



REFERENCES
----------

See related article provided in 'Resource DOI' of the 4TU.ResearchData repository at 
https://doi.org/10.4121/22270498
See the comparison approaches:
1) Farrohksiar tube-based MPC approach,
https://www.sciencedirect.com/science/article/abs/pii/S0921889013001401; 
2) A*-MPC approach,
https://ieeexplore.ieee.org/abstract/document/8899955;
3) randomized MPC approach,
https://ieeexplore.ieee.org/abstract/document/5152240;
4) Boustrophedon-motion-A* approach,
https://link.springer.com/article/10.1007/s10489-012-0406-4.
