National Institute of Technology Karnataka, Surathkal ie@nitk.edu.in

Human Activity Recognition

PROBLEM STATEMENT:

The proposed project uses a simple webcam to detect and classify five activities performed by a single person: sitting, standing, walking, blinking, and waving. Detection is based on image processing with pre-trained Haar cascades.

PROPOSED SOLUTION:

The current implementation uses Haar cascade classifiers together with simple thresholds to detect and classify human activities. It is written in Python using the OpenCV library.

METHODOLOGY

To detect walking, the code locates the face with the face cascade, computes the velocity of the midpoint of the resulting bounding box over a window of 7 frames, and classifies the activity as walking if the average velocity exceeds a threshold. Sitting and standing are distinguished by the y-coordinate of that same midpoint.
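The walking/sitting/standing decision can be sketched in isolation as pure Python over a track of face midpoints (the window size follows the text; `WALK_THRESHOLD` and `SIT_Y` here are illustrative placeholders, not the project's tuned values):

```python
WINDOW = 7            # number of frames over which velocity is averaged
WALK_THRESHOLD = 5.0  # illustrative: average midpoint speed (px/frame) for "walking"
SIT_Y = 300           # illustrative: midpoints lower in the frame than this count as sitting

def classify_posture(midpoints):
    """Classify from a list of recent (x, y) face midpoints, newest last."""
    if len(midpoints) < WINDOW:
        return "unknown"
    recent = midpoints[-WINDOW:]
    # Average per-frame displacement of the midpoint over the window.
    speeds = [
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(recent, recent[1:])
    ]
    avg_speed = sum(speeds) / len(speeds)
    if avg_speed > WALK_THRESHOLD:
        return "walking"
    # Otherwise the person is stationary: use the vertical midpoint position.
    return "sitting" if recent[-1][1] > SIT_Y else "standing"

print(classify_posture([(10 * i, 100) for i in range(7)]))  # fast motion → "walking"
print(classify_posture([(100, 100)] * 7))                   # still, high in frame → "standing"
print(classify_posture([(100, 350)] * 7))                   # still, low in frame → "sitting"
```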
Blinking is detected by searching for the open-eye cascade within the face region. If the eye detection disappears and reappears at a rate above a certain threshold, it is counted as blinking.
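Given a per-frame boolean of whether the open-eye cascade fired inside the face box, the counting step might look like the following pure-Python sketch (the rate threshold is an assumed placeholder):

```python
def count_blinks(eye_visible):
    """Count blinks from per-frame booleans: a blink is the eye cascade
    ceasing to fire for some frames and then firing again."""
    blinks = 0
    closed = False
    for visible in eye_visible:
        if not visible:
            closed = True       # eye detection dropped out
        elif closed:
            blinks += 1         # eye reappeared: one blink completed
            closed = False
    return blinks

def is_blinking(eye_visible, fps=30, min_rate=0.2):
    """Illustrative frequency test: blinks per second above min_rate."""
    seconds = len(eye_visible) / fps
    return count_blinks(eye_visible) / seconds >= min_rate

frames = [True, True, False, False, True, True, False, True]
print(count_blinks(frames))  # → 2
```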
Waving is detected by locating the hand with a hand cascade and tracking its movement across frames.
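One plausible way to turn the tracked hand positions into a wave/no-wave decision (an assumed sketch, since the report does not spell out the test) is to count left/right direction reversals of the hand box's x-coordinate:

```python
MIN_DIRECTION_CHANGES = 3  # illustrative: repeated reversals count as waving

def is_waving(hand_xs):
    """Detect waving from the hand-cascade centre's x positions over
    recent frames: a wave shows up as repeated direction reversals."""
    changes = 0
    prev_dx = 0
    for x1, x2 in zip(hand_xs, hand_xs[1:]):
        dx = x2 - x1
        # A sign flip relative to the previous non-zero step is a reversal.
        if dx != 0 and prev_dx != 0 and (dx > 0) != (prev_dx > 0):
            changes += 1
        if dx != 0:
            prev_dx = dx
    return changes >= MIN_DIRECTION_CHANGES

print(is_waving([100, 120, 140, 120, 100, 120, 140, 120]))  # oscillating → True
print(is_waving([100, 110, 120, 130]))                      # one-way drift → False
```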

RESULTS

The predicted activities are shown below:

[Figure: webcam frames annotated with each of the five recognized activities]
FUTURE WORK

More activities can be added, and the project can be generalized to handle more than one person. The current implementation also assumes the person stays within a certain range of the webcam for some activities; this constraint could be relaxed.

KEY LEARNINGS

As part of this project, we learned Python, OpenCV, and a range of image-processing and machine-learning techniques.

CONCLUSION

The project was implemented successfully, and all five target activities were recognized.

TEAM

● Rohan Jiju
● Rithik Jain

● Mentor: Preethi Reddy