Welcome to Seunghun Oh's web page

Assignment 2: Circularity, ROC analysis, Template Matching

Collaborated with Lang Gao

Part 1

For this part of the assignment, I used my pair of glasses as a template for tracking motion. The glasses were tracked in a live feed provided by my webcam, and the matched region was outlined with a black border. Below is the template image used for this part of the assignment:

And now below are the results of live feed template matching:

The first image is a relatively normal shot; the glasses are tracked correctly. The second image has enlarged eyes, yet the tracking does not fail. The third image shows the subject leaning backwards; this motion causes the tracker to jump to the shadows on the subject's neck. The fourth photo shows the subject leaning in; the tracker jumps to the neck shadows again. Finally, the fifth photo shows the subject leaning sideways; the tracker follows the glasses.

One of the challenges was figuring out what type of object could be used as a template. If the object is too thin, it is not recognized by the template matching.

This series of photos shows that the tracking system I coded for this assignment follows the template well under one condition: the template cannot change much in scale, or the tracking will fail.
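The matching idea behind the tracker can be sketched in plain NumPy. This is a minimal illustration, not the submitted code: the assignment used the OpenCV template-matching routines, while the function below (`match_template_ssd`, a name I made up for this sketch) does the same sliding-window comparison by brute force, scoring each window by sum of squared differences and returning the best position. It also shows why a fixed-size template fails under scale change: the window size never adapts.

```python
import numpy as np

def match_template_ssd(image, template):
    """Slide the template over the image and return the (row, col)
    of the window with the lowest sum-of-squared-differences score."""
    ih, iw = image.shape
    th, tw = template.shape
    best_score, best_pos = None, None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            ssd = np.sum((patch.astype(float) - template.astype(float)) ** 2)
            if best_score is None or ssd < best_score:
                best_score, best_pos = ssd, (r, c)
    return best_pos

# Synthetic check: embed the template in an otherwise empty frame.
frame = np.zeros((20, 20))
template = np.array([[1.0, 2.0], [3.0, 4.0]])
frame[5:7, 8:10] = template
print(match_template_ssd(frame, template))  # → (5, 8)
```

OpenCV's `cv2.matchTemplate` computes the same kind of score map far faster; a normalized score (as in the normalized cross-correlation modes) is usually preferred in a live feed because it tolerates lighting changes.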

Part 2

For this part of the assignment, I used background differencing, frame-to-frame differencing, motion energy templates, and skin-color detection to delineate hand gestures.
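The frame-to-frame differencing and motion-energy steps can be sketched as follows. This is a minimal NumPy illustration of the idea, not our submitted OpenCV code: `update_mhi` (an illustrative name) thresholds the absolute difference between consecutive frames, sets moving pixels to full brightness in a motion history image, and fades stationary pixels over time, so recent motion appears bright and older motion dims. The threshold and decay values here are arbitrary choices for the demo.

```python
import numpy as np

def update_mhi(mhi, prev_frame, curr_frame, tau=255, decay=32, thresh=25):
    """Frame-to-frame differencing feeding a motion history image:
    pixels that changed jump to `tau`; all others fade by `decay`."""
    motion = np.abs(curr_frame.astype(int) - prev_frame.astype(int)) > thresh
    mhi = np.maximum(mhi.astype(int) - decay, 0)  # fade old motion
    mhi[motion] = tau                             # stamp new motion
    return mhi.astype(np.uint8)

# Toy demo: a bright block "moves" one pixel to the right between frames.
f1 = np.zeros((8, 8), np.uint8); f1[2:4, 2:4] = 200
f2 = np.zeros((8, 8), np.uint8); f2[2:4, 3:5] = 200
mhi = np.zeros((8, 8), np.uint8)
mhi = update_mhi(mhi, f1, f2)
print(mhi[2, 2], mhi[2, 3], mhi[2, 4])  # → 255 0 255
```

Only the columns that changed (the trailing and leading edges of the block) light up, which is why a waving hand leaves a bright swept band in the motion history screencaps.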

Below are the images of the motion history on the left and skin-color detection on the right:

My classmate and I struggled to find efficient and fast algorithms for processing the frames. We also struggled to find a method of distinguishing between different hand gestures; because that part was incomplete, it was omitted from the submitted version.

In the first row of images, note the hand shape is clearly delineated as well as the waving hand movement. The second set of images shows closing of the hand into a fist; the fist is shown on the right and the motion can clearly be seen on the left. The third set of images shows a pointed finger moving around. The hand sign is clearly shown on the right and the movement is shown clearly on the left. Finally, the last row shows the fingers spreading apart on the hand. The hand gesture is not shown clearly in the motion history screencap nor on the skin detection screencap.

The algorithm struggles when motions are rapid; the screencaps cannot capture much movement at a time. However, it excels at accurately detecting skin and at tracking movements at normal speed.
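The skin-color detection step can be sketched with a simple per-pixel color rule. This is a hedged illustration rather than our submitted code: it uses one common RGB heuristic (bright, red-dominant pixels are labeled skin), and the specific thresholds are assumptions for the demo, not values from the assignment. Rules like this are fast enough for a live feed but are sensitive to lighting and skin tone, which is consistent with the failure on the spread-fingers gesture above.

```python
import numpy as np

def skin_mask(rgb):
    """Label a pixel as skin if it is bright and red-dominant
    (one common RGB skin heuristic; thresholds are illustrative)."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r - g > 15) & (r - b > 15)

# A skin-like pixel passes; a blue pixel does not.
skin_pixel = np.array([[[200, 120, 90]]], np.uint8)
blue_pixel = np.array([[[40, 60, 200]]], np.uint8)
print(skin_mask(skin_pixel)[0, 0], skin_mask(blue_pixel)[0, 0])  # → True False
```

In practice the resulting binary mask is usually cleaned up with morphological operations before the hand region is extracted.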

The code was written by me and my classmate Lang Gao, with guidance from the professor and the TF. The ideas for the algorithms were taken from the OpenCV tutorials for template matching.