The Open Call Trial for the RODENT (Rodent Obstruction through Drone-Enabled Non-invasive Technology) project is part of the PULL 2 Open Call for Farming, Forestry, and Rural Challenges (FFRC). It aims to validate the effectiveness of drone-mounted ultrasound technology in repelling rodents and preventing crop losses in a sustainable, non-invasive manner. By emitting ultrasound waves in a selected frequency range, the drones create an acoustic barrier that deters rodents without harming ecosystems.
This repository contains:
- testing_SPL/
  - rezultati_merenja.m loads measured signals from signali/ with filenames of the form elevacija_x.mat, where x ranges from 1 to 30. These signals were recorded on the surface beneath the parabola to assess sound-field coverage. The script computes the sound pressure level (SPL) on the surface located 1.0 m below the parabola's center, for a parabola of radius 0.58 m, and summarises the spatial SPL distribution used for coverage analysis. The signali/ dataset is hosted on the Zenodo platform (DOI/URL in progress).
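A minimal sketch of the SPL computation in Python (the repository script itself is MATLAB); the variable name `signal` inside each .mat file and the units (pascals) are assumptions:

```python
import numpy as np
from scipy.io import loadmat

P_REF = 20e-6  # reference pressure in air, 20 uPa

def spl_db(pressure):
    """Sound pressure level [dB] of a measured pressure signal [Pa]."""
    p_rms = np.sqrt(np.mean(np.square(pressure)))
    return 20.0 * np.log10(p_rms / P_REF)

# Assumed layout: signali/elevacija_1.mat ... elevacija_30.mat,
# each storing the recorded pressure trace under the key "signal".
levels = []
for x in range(1, 31):
    data = loadmat(f"signali/elevacija_{x}.mat")
    levels.append(spl_db(data["signal"].ravel()))

print(f"SPL over the surface: min {min(levels):.1f} dB, "
      f"max {max(levels):.1f} dB, mean {np.mean(levels):.1f} dB")
```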
- infrared-rodent-detection/
  - Contains scripts and models for rodent detection using YOLOv11n and OpenCV with an IR thermal camera.
  - For more details, see infrared-rodent-detection/README.md.
  - Key functionalities:
    - YOLOv11n fine-tuning on a rodent thermal dataset (dataset link).
    - Real-time rodent detection and tracking using IR thermal camera streams (see the inference sketch after this list).
    - File export, playback, and inference scripts.
    - Model evaluation and performance metrics.
    - Environment setup for GPU training and Spinnaker SDK integration.
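A minimal real-time inference sketch using the ultralytics and OpenCV packages; the weights filename rodent_yolo11n.pt and the camera index are assumptions, and the actual scripts live in infrared-rodent-detection/:

```python
import cv2
from ultralytics import YOLO

# Hypothetical fine-tuned weights; the real filename is defined in the repo scripts.
model = YOLO("rodent_yolo11n.pt")

cap = cv2.VideoCapture(0)  # IR thermal camera exposed as a video device (assumption)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Run detection plus built-in tracking on the current frame
    results = model.track(frame, persist=True, verbose=False)
    annotated = results[0].plot()  # draw boxes, class labels and track IDs
    cv2.imshow("rodent detection", annotated)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```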
- drone-path-optimization/
  - GA.py
Implements a genetic algorithm (GA) that plans an autonomous drone’s grid-based path over a 20 m × 10 m field. The objective combines coverage, battery use, and rodent (mice) interaction shaping.
The script:
- Builds a 1 m node grid (with a 0.6 m sensing/coverage footprint and 0.05 m resolution for coverage accounting).
- Encodes a path as a sequence of moves.
- Simulates mice behavior on the grid (16 by default): mice tend to move opposite the drone's motion; they can be "removed" when exiting the field.
- Evaluates fitness with tunable coefficients: uncovered area, redundancy penalties (sliding window), battery shaping with a hard over-use penalty, mice edge-seeking/avoidance shaping, and rewards for removal.
- Runs a GA with roulette selection, single-point crossover, mutation, and elitism. Generation 1 explores variable path lengths; subsequent generations fix the length to the best found (a minimal GA-loop sketch follows after this list).
- Produces diagnostics and plots for the best individual.

Related files:
- best_run.json (created after a run): saved artifact with the best path, fitness breakdown, and the full mice trajectories for later analysis/visualization.
- figures/ (created): plots generated by the run:
  - plot_fitness: fitness vs. generation.
  - plot_path: the best path over the field with coverage and mice traces.
- data/Battery_Model_Stats.xlsx: empirical flight statistics used by the battery model (described below).
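For orientation, a minimal sketch of the GA loop described above (roulette selection, single-point crossover, mutation, elitism). The real fitness in GA.py combines coverage, battery, and mice terms; the toy_fitness placeholder and all constants here are assumptions:

```python
import random

MOVES = ["U", "D", "L", "R"]          # grid moves between 1 m nodes
PATH_LENGTH, POP_SIZE, GENS = 60, 80, 100
MUTATION_RATE, ELITISM = 0.05, 2

def toy_fitness(path):
    # Placeholder: reward varied consecutive moves (stands in for the real
    # coverage / battery / mice terms used in GA.py).
    return len(set(zip(path, path[1:]))) / max(len(path) - 1, 1)

def roulette(pop, fits):
    # Fitness-proportional (roulette wheel) selection.
    pick = random.uniform(0, sum(fits))
    acc = 0.0
    for ind, f in zip(pop, fits):
        acc += f
        if acc >= pick:
            return ind
    return pop[-1]

def crossover(a, b):
    cut = random.randrange(1, len(a))  # single-point crossover
    return a[:cut] + b[cut:]

def mutate(path):
    return [random.choice(MOVES) if random.random() < MUTATION_RATE else m for m in path]

population = [[random.choice(MOVES) for _ in range(PATH_LENGTH)] for _ in range(POP_SIZE)]
for gen in range(GENS):
    fits = [toy_fitness(p) for p in population]
    # Elitism: carry the best individuals over unchanged.
    elite = [p for _, p in sorted(zip(fits, population), reverse=True)[:ELITISM]]
    children = [mutate(crossover(roulette(population, fits), roulette(population, fits)))
                for _ in range(POP_SIZE - ELITISM)]
    population = elite + children

best = max(population, key=toy_fitness)
print("best fitness:", round(toy_fitness(best), 3))
```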
Key parameters (edit at the top of GA.py):
- Field & motion: FIELD_WIDTH, FIELD_HEIGHT, NODE_SPACING, DRONE_SPEED, COVERAGE_RADIUS, GRID_RESOLUTION, START_NODE
- GA: POP_SIZE, GENS, PATH_LENGTH_MIN/MAX, MUTATION_RATE, ELITISM
- Mice: NUM_MICE
- Battery: automatically inferred from data/Battery_Model_Stats.xlsx when available; otherwise a fallback average drain rate is used.
- Fitness weights.
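For reference, a hedged sketch of how these constants might appear at the top of GA.py; the field size, node spacing, coverage footprint, grid resolution, and mouse count come from the description above, while all other values (and the form of START_NODE) are illustrative assumptions:

```python
# Field & motion (dimensions from the project description)
FIELD_WIDTH = 20.0        # m
FIELD_HEIGHT = 10.0       # m
NODE_SPACING = 1.0        # m between grid nodes
DRONE_SPEED = 2.0         # m/s (illustrative value)
COVERAGE_RADIUS = 0.6     # m sensing/coverage footprint
GRID_RESOLUTION = 0.05    # m cell size for coverage accounting
START_NODE = (0, 0)       # grid index of the take-off node (assumption)

# GA (illustrative values)
POP_SIZE = 80
GENS = 100
PATH_LENGTH_MIN, PATH_LENGTH_MAX = 40, 120
MUTATION_RATE = 0.05
ELITISM = 2

# Mice
NUM_MICE = 16             # from the description above
```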
Outputs:
- Console: timing, generation summaries, and best-so-far updates.
- best_run.json: best path, fitness components, and mice histories.
- figures/: fitness curve and best-path visualization.
- Battery_Model_Stats.xlsx
  Contains empirical flight statistics collected from earlier DJI Inspire 2 missions. Each record summarizes one flight session, including battery usage, duration, and energy metrics. This dataset is used by GA.py to model the drone's average battery drain rate and improve the accuracy of mission-level battery consumption simulations. Columns typically include:
  - Battery_Used [%]: total battery percentage consumed during the flight.
  - Duration [s]: total mission time in seconds.
  - Avg_Drain_Rate [%/s]: precomputed average consumption rate.
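A minimal sketch of deriving the average drain rate from this spreadsheet, assuming the column names listed above; the actual loading logic inside GA.py may differ:

```python
import pandas as pd

# Column names follow the README description; reading .xlsx requires openpyxl.
stats = pd.read_excel("data/Battery_Model_Stats.xlsx")

# Prefer the precomputed per-flight rate when present; otherwise derive it.
if "Avg_Drain_Rate [%/s]" in stats.columns:
    rates = stats["Avg_Drain_Rate [%/s]"]
else:
    rates = stats["Battery_Used [%]"] / stats["Duration [s]"]

avg_drain_rate = rates.mean()  # %/s, fed into the mission-level battery model
print(f"Average drain rate: {avg_drain_rate:.4f} %/s")
```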
This project is funded by the European Union, grant ID 101060643.
