Please use this identifier to cite or link to this item: https://idr.l1.nitk.ac.in/jspui/handle/123456789/8214
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Dobhal, T.
dc.contributor.author: Shitole, V.
dc.contributor.author: Thomas, G.
dc.contributor.author: Navada, G.
dc.date.accessioned: 2020-03-30T10:18:13Z
dc.date.available: 2020-03-30T10:18:13Z
dc.date.issued: 2015
dc.identifier.citation: Procedia Computer Science, 2015, Vol. 58, pp. 178-185 (en_US)
dc.identifier.uri: http://idr.nitk.ac.in/jspui/handle/123456789/8214
dc.description.abstract: View-based recognition methods use visual templates for recognition and hence do not extract complex features from the image; instead, they retain the entire raw image as a single feature in a high-dimensional space. These example images, or templates, are learnt under different poses and illumination conditions for recognition. With this in mind, we build on the idea of a 2-D representation of an action video sequence by combining the image sequence into a single image, called the Binary Motion Image (BMI), to perform human activity recognition. For classification, we employ Convolutional Neural Networks, which inherently provide slight invariance to translational and rotational shifts, partial occlusions, and background noise. We test our method on the Weizmann dataset, focusing on actions that look similar, such as the run, walk, jump, side, and skip actions. We also extend our method to 3-D depth maps using the MSR Action3D dataset by extracting three BMI projections, namely the front view, the side view, and the top view. From the results obtained, we believe that the BMI is sufficient for activity recognition and has been shown to be invariant to the speed of the performed action, in addition to the aforementioned variations. © 2015 Published by Elsevier B.V. (en_US)
dc.title: Human Activity Recognition using Binary Motion Image and Deep Learning (en_US)
dc.type: Book chapter (en_US)
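
The abstract above names the Binary Motion Image but this record does not spell out how it is computed. The Python sketch below shows one plausible construction, assuming the BMI is an OR-accumulation of thresholded absolute frame differences over a clip, together with a crude way of obtaining front, side and top views from a single depth map; the function names, the diff_threshold parameter and the depth-binning scheme are illustrative assumptions, not the authors' method.

# Minimal sketch under the assumptions stated above, not the paper's implementation.
import numpy as np

def binary_motion_image(frames, diff_threshold=30.0):
    """frames: sequence of grayscale frames, shape (T, H, W).
    Returns one (H, W) binary image marking every pixel that moved during the clip."""
    frames = np.asarray(frames, dtype=np.float32)
    bmi = np.zeros(frames.shape[1:], dtype=bool)
    for prev, curr in zip(frames[:-1], frames[1:]):
        bmi |= np.abs(curr - prev) > diff_threshold  # accumulate motion over time
    return bmi.astype(np.uint8)

def depth_view_projections(depth, depth_bins=64):
    """depth: (H, W) depth map, with 0 treated as background (an assumption).
    Returns crude binary front (H, W), side (H, depth_bins) and top (depth_bins, W)
    occupancy views; a BMI could then be computed per view across the clip."""
    H, W = depth.shape
    front = depth > 0
    side = np.zeros((H, depth_bins), dtype=bool)
    top = np.zeros((depth_bins, W), dtype=bool)
    # Quantise depth values into bins so they can index the side/top planes.
    d = np.clip((depth / (depth.max() + 1e-6) * (depth_bins - 1)).astype(int),
                0, depth_bins - 1)
    ys, xs = np.nonzero(front)
    side[ys, d[ys, xs]] = True
    top[d[ys, xs], xs] = True
    return front.astype(np.uint8), side.astype(np.uint8), top.astype(np.uint8)

Each resulting single-channel image could then be passed to a CNN for classification, in line with the pipeline the abstract describes.
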
Appears in Collections: 2. Conference Papers

Files in This Item:
There are no files associated with this item.

