The following was saved from the old littleripper.com website, just in case anyone was looking for it (Source: archive.org):
SHARK DETECTION USING DEEP LEARNING AND ARTIFICIAL INTELLIGENCE
Professor Michael Blumenstein, the Head of the School of Software in the Faculty of Engineering and Information Technology, University of Technology Sydney (UTS), is heading a team to develop an algorithm to automatically detect sharks from the video footage streamed from the Westpac Little Ripper Lifesaver UAVs.
The work of the UTS team on Shark Spotter has involved:
- Preprocessing aerial videos of sharks from publicly available sources to train the algorithms and create ‘ground truth’ for video frames of sharks
- A sophisticated Deep Learning framework (machine learning) as the backbone of the shark detection and recognition algorithm, using a Region-based Convolutional Neural Network (R-CNN) for accurate object detection and recognition
- Using the objects (sharks, swimmers, surfboards, etc.) in the available video footage to test and check the preliminary approach
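Checking detections against the 'ground truth' boxes described above is commonly done with an intersection-over-union (IoU) overlap measure. The sketch below is illustrative only; the box format, function names, and 0.5 threshold are assumptions, not details from the project:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def matches_ground_truth(detection, truth_boxes, threshold=0.5):
    """A detection counts as correct if it overlaps some ground-truth
    box by at least the threshold (0.5 is a typical, assumed value)."""
    return any(iou(detection, t) >= threshold for t in truth_boxes)
```

A detected box that lines up closely with an annotated shark passes; a box far from every annotation does not.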
Based on the generated ground truth, a Deep Learning approach is then used to recognise objects (e.g. sharks, swimmers) in real time in the video feeds from the UAVs.
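In shape, real-time recognition on a video feed is a loop that applies a trained detector frame by frame and keeps only confident detections. The sketch below is a rough outline of that loop, not the project's code; the detector is passed in as a callable (e.g. an R-CNN wrapped in a function), and all names and the confidence threshold are assumptions:

```python
from typing import Callable, Iterable, Iterator, List, Tuple

Box = Tuple[int, int, int, int]
Detection = Tuple[str, float, Box]  # (label, confidence, bounding box)

def detect_stream(
    frames: Iterable,
    detector: Callable[[object], List[Detection]],
    min_confidence: float = 0.5,
) -> Iterator[Tuple[int, List[Detection]]]:
    """Run a detector over a stream of frames, yielding the index of each
    frame that contains detections above the confidence threshold."""
    for index, frame in enumerate(frames):
        hits = [d for d in detector(frame) if d[1] >= min_confidence]
        if hits:
            yield index, hits
```

Because it is a generator, the loop processes frames as they arrive, which is the behaviour a live downlinked feed requires.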
Initially, a CPU (central processing unit) based system has been used for the investigations; this will be moved to a GPU (graphics processing unit) based system, which is better suited to Deep Learning.
The aim is to conduct shark detection in real time, initially on the video downlinked from the UAVs and then on the UAVs themselves, with only the detections downlinked to the UAV pilots and operators.
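Downlinking only detections, rather than full video, implies sending a small message per detection over a low-bandwidth link. A hypothetical message format is sketched below; the field names and JSON encoding are assumptions for illustration and not taken from the project:

```python
import json

def detection_message(frame_index, label, confidence, box):
    """Serialize one detection as a compact JSON message for downlink.
    Field names are illustrative; box is (x1, y1, x2, y2) in pixels."""
    return json.dumps(
        {
            "frame": frame_index,
            "label": label,
            "conf": round(confidence, 2),
            "box": list(box),
        },
        separators=(",", ":"),  # no whitespace, to keep the message small
    )
```

A message like this is a few dozen bytes, versus megabits per second for raw video, which is the point of moving detection onto the UAV itself.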