This repository presents a tool for 3D volleyball court reconstruction and ball tracking using a multi-camera system of 10 synchronized cameras strategically positioned around the court. The proposed pipeline integrates intrinsic and extrinsic camera calibration: intrinsic calibration estimates each camera’s focal length, principal point, and lens distortion to correct image deformations, while extrinsic calibration determines the cameras’ positions and orientations within a shared 3D coordinate system.
For intrinsic calibration, the official OpenCV calibration method is employed with a checkerboard pattern. Extrinsic parameters are optimized using Perspective-n-Point (PnP) algorithms and bundle adjustment, matching keypoints across multiple views.
By combining these calibrations with multi-view geometry and deep-learning-based ball detection (YOLOv11), the system achieves robust 3D localization of the volleyball, even under challenging conditions such as occlusions and rapid motion.
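Given each camera's projection matrix, the ball's 3D position follows from multi-view triangulation. The numpy-only sketch below shows linear (DLT) triangulation from two views under illustrative camera parameters; the function name and matrices are assumptions for this example, and the actual scripts generalize the same idea to every camera that detects the ball.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two pixel observations.

    P1, P2 are 3x4 projection matrices K [R | t]; x1, x2 are 2D pixel points.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]            # null-space vector = homogeneous 3D point
    return X[:3] / X[3]
```

With more than two views, two rows per detecting camera are stacked into `A` and the same SVD solution applies.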
To model trajectory dynamics, a particle filter is implemented, incorporating physics-based motion priors to enable accurate predictions during brief detection dropouts. The estimated trajectory is further refined with a low-pass filter.
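The tracking stage can be illustrated with a numpy-only particle filter sketch: prediction under a gravity motion prior, a Gaussian likelihood update around each triangulated detection, multinomial resampling, and an exponential-moving-average low-pass smoother. Function names, the frame interval `DT`, and all noise levels are illustrative assumptions, not the repository's actual values.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # z-up world frame (assumed)
DT = 1.0 / 50.0                        # illustrative frame interval

def predict(particles, velocities, rng, vel_noise=0.02):
    """Propagate each particle one frame under a ballistic motion prior."""
    velocities = velocities + GRAVITY * DT + rng.normal(0, vel_noise, velocities.shape)
    return particles + velocities * DT, velocities

def update(weights, particles, measurement, sigma=0.05):
    """Reweight particles by a Gaussian likelihood around the detection."""
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / sigma ** 2) + 1e-300
    return weights / weights.sum()

def resample(particles, velocities, weights, rng):
    """Multinomial resampling to concentrate particles on likely states."""
    idx = rng.choice(len(weights), size=len(weights), p=weights)
    return particles[idx], velocities[idx], np.full(len(weights), 1.0 / len(weights))

def low_pass(traj, alpha=0.3):
    """Exponential moving average to smooth the estimated trajectory."""
    out = np.empty_like(traj)
    out[0] = traj[0]
    for i in range(1, len(traj)):
        out[i] = alpha * traj[i] + (1 - alpha) * out[i - 1]
    return out
```

During a detection dropout, only `predict` runs, so the physics prior carries the estimate until the ball is seen again.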
To ensure compatibility and reproducibility, this project requires Python and the necessary packages. Follow the steps below to set up the environment.
- Clone the Repository to your local machine
git clone https://github.com/pietrolechthaler/MultiViewCalibration-BallTracking.git
cd MultiViewCalibration-BallTracking/
- Create a virtual environment
python -m venv env
- Activate the virtual environment and install the required packages
source env/bin/activate # macOS and Linux
env\Scripts\activate # Windows
pip install -r requirements.txt
In the parameters.py file, you can modify the parameters used by the tool.
To process videos and extract frames, use the following command:
./preprocessing.sh
This command creates a folder named src-gen containing the extracted frames for each video.
After extracting the frames, you can run the intrinsic and extrinsic calibration using the following commands:
- Intrinsic Calibration:
./intrinsic.sh
- Extrinsic Calibration:
./extrinsic.sh
Make sure to check the output for any errors and verify that the calibration results are as expected.
To start the 3D Reconstruction Viewer tool:
./viewer_linux.sh # macOS and Linux
./viewer_windows.sh # Windows
Open your web browser and go to http://127.0.0.1:5000/ to access the 3D Reconstruction Viewer.
To run the ball detection script, use the following command:
./detection.sh
Once the ball detection process is complete, you can proceed to run the 3D tracking script with the following command:
./tracking.sh
Ylenia Graziadei, Pietro Lechthaler
The project has been built and tested on the following systems:
- Windows 11 Pro, Python 3.10.14
- Ubuntu 24.10, Python 3.12.7