Human Activity Recognition
The proposed project uses a simple webcam to detect and classify the following activities performed by a single person: sitting, standing, walking, blinking, and waving. It relies on image processing and pre-defined cascades to detect human activities.
The current implementation uses Haar cascades and thresholds to detect and classify the activities. It is written in Python using the OpenCV library.
To detect walking, the code finds the face cascade, computes the velocity of the face's midpoint over 7 frames, and classifies the activity as walking if the average velocity exceeds a certain threshold. Standing and sitting are distinguished by checking the y coordinate of the midpoint of the face.
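The midpoint-velocity check described above can be sketched in pure Python as follows. The window size of 7 frames comes from the text; the velocity and y-coordinate thresholds are illustrative assumptions, not the project's tuned constants.

```python
from collections import deque

WINDOW = 7            # frames the face midpoint is tracked over (from the text)
WALK_THRESHOLD = 4.0  # assumed average pixels/frame above which we call it walking
SIT_Y = 300           # assumed y coordinate (image space) below which the face implies sitting

midpoints = deque(maxlen=WINDOW)

def classify(face_box):
    """Update the midpoint history for one face box and classify the posture."""
    x, y, w, h = face_box
    midpoints.append((x + w / 2, y + h / 2))
    if len(midpoints) == WINDOW:
        # Average per-frame displacement of the face midpoint over the window.
        pts = list(midpoints)
        avg_v = sum(((bx - ax) ** 2 + (by - ay) ** 2) ** 0.5
                    for (ax, ay), (bx, by) in zip(pts, pts[1:])) / (WINDOW - 1)
        if avg_v > WALK_THRESHOLD:
            return "walking"
    _, cy = midpoints[-1]
    return "sitting" if cy > SIT_Y else "standing"
```

A stationary face high in the frame yields "standing", a face moving steadily across frames yields "walking", and a stationary face low in the frame yields "sitting".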
Blinking is detected by searching for the open-eye cascade within the face cascade. If the frequency with which the cascade appears inside the face crosses a certain threshold, the activity is counted as blinking.
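One way to realize this check, assuming one boolean per frame recording whether the open-eye cascade fired inside the face region, is to count short "eyes absent" gaps bounded by open-eye frames. The gap length and blink-rate thresholds below are assumptions for illustration.

```python
MAX_BLINK_GAP = 5  # assumed: an eye absence longer than this is not a blink
BLINK_RATE = 2     # assumed: blinks per window needed to report "blinking"

def count_blinks(eye_detected):
    """Count closed-eye gaps (runs of False bounded by True) in a frame window."""
    blinks, gap, seen_open = 0, 0, False
    for open_now in eye_detected:
        if open_now:
            if seen_open and 0 < gap <= MAX_BLINK_GAP:
                blinks += 1  # the eyes reappeared after a short gap: one blink
            gap, seen_open = 0, True
        else:
            gap += 1
    return blinks

def is_blinking(eye_detected):
    return count_blinks(eye_detected) >= BLINK_RATE
```

For example, the sequence open-open-closed-closed-open-open-closed-open contains two short gaps, so it would be reported as blinking under these thresholds.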
Waving is detected by finding the hand cascade and tracking its movement.
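The movement tracking can be sketched as counting direction reversals of the hand box's x coordinate: several back-and-forth swings in a short window read as a wave. Note that OpenCV does not bundle a hand cascade, so the project presumably uses an externally trained cascade XML; the thresholds below are assumptions.

```python
MIN_MOVE = 8        # assumed: pixels of x movement needed to register a swing
WAVE_REVERSALS = 3  # assumed: direction changes needed to call it waving

def is_waving(hand_xs):
    """hand_xs: x coordinates of the hand box centre over recent frames."""
    reversals, direction = 0, 0
    for prev, cur in zip(hand_xs, hand_xs[1:]):
        dx = cur - prev
        if abs(dx) < MIN_MOVE:
            continue  # ignore jitter below the movement threshold
        new_dir = 1 if dx > 0 else -1
        if direction and new_dir != direction:
            reversals += 1
        direction = new_dir
    return reversals >= WAVE_REVERSALS
```

A hand oscillating left and right triggers the check, while a hand drifting steadily in one direction does not.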
The activities predicted by the system are shown below:
More activities can be added, and the project can be generalized to support more than one user. The current implementation also assumes the person stays within a certain range of the webcam for some activities, a restriction that could be relaxed.
As part of this project, we learned Python, OpenCV, machine learning, and other image processing techniques.
The project was executed successfully, and all 5 activities were recognized.
● Rohan Jiju
● Rithik Jain
● Preethi Reddy