Please use this identifier to cite or link to this item: https://idr.l1.nitk.ac.in/jspui/handle/123456789/8955
Full metadata record
DC Field | Value | Language
dc.contributor.author | Heshi, R.
dc.contributor.author | Suma, S.M.
dc.contributor.author | Koolagudi, S.G.
dc.contributor.author | Bhandari, S.
dc.contributor.author | Rao, K.S.
dc.date.accessioned | 2020-03-30T10:23:07Z | -
dc.date.available | 2020-03-30T10:23:07Z | -
dc.date.issued | 2016
dc.identifier.citation | Smart Innovation, Systems and Technologies, 2016, Vol. 43, pp. 603-609 | en_US
dc.identifier.uri | http://idr.nitk.ac.in/jspui/handle/123456789/8955 | -
dc.description.abstract | In this work, an effort has been made to analyze rhythm- and timbre-related features to identify the raga and tala of a piece of Carnatic music. Raga and tala classification is performed using both rhythm and timbre features. Rhythm patterns and the rhythm histogram are used as rhythm features. Zero-crossing rate (ZCR), spectral centroid, spectral roll-off, flux, and entropy are used as timbre features. The music clips contain both instrumental and vocal content. The T-test is used as the similarity measure between feature vectors. Classification is then performed using Gaussian Mixture Models (GMMs). The results show that the rhythm patterns are able to distinguish different ragas and talas with average accuracies of 89.98% and 86.67%, respectively. © Springer India 2016. | en_US
dc.title | Rhythm and timbre analysis for Carnatic music processing | en_US
dc.type | Book chapter | en_US
Appears in Collections: 2. Conference Papers

Files in This Item:
There are no files associated with this item.
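
The abstract above outlines a feature-extraction and GMM-classification pipeline. As an illustration only, the following is a minimal sketch of a comparable timbre-feature + GMM raga classifier, assuming librosa and scikit-learn are available; the paper's actual feature set, rhythm-pattern features, frame parameters, T-test similarity step, and GMM configuration are not specified here and will differ.

import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def timbre_features(path):
    """Frame-level timbre descriptors: ZCR, spectral centroid, roll-off,
    and onset strength used as a flux-like measure (illustrative choices)."""
    y, sr = librosa.load(path, sr=22050)
    zcr = librosa.feature.zero_crossing_rate(y)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)
    rolloff = librosa.feature.spectral_rolloff(y=y, sr=sr)
    flux = librosa.onset.onset_strength(y=y, sr=sr)
    # Align frame counts across features before stacking.
    n = min(zcr.shape[1], centroid.shape[1], rolloff.shape[1], len(flux))
    return np.vstack([zcr[0, :n], centroid[0, :n], rolloff[0, :n], flux[:n]]).T

def train_gmms(clips_by_raga, n_components=4):
    """Fit one GMM per raga on pooled frame-level features."""
    models = {}
    for raga, paths in clips_by_raga.items():
        feats = np.vstack([timbre_features(p) for p in paths])
        models[raga] = GaussianMixture(n_components=n_components).fit(feats)
    return models

def classify(path, models):
    """Assign the raga whose GMM gives the highest average log-likelihood."""
    feats = timbre_features(path)
    return max(models, key=lambda raga: models[raga].score(feats))

# Hypothetical usage (file names are placeholders):
# models = train_gmms({"Kalyani": ["kalyani_clip.wav"], "Todi": ["todi_clip.wav"]})
# print(classify("test_clip.wav", models))

The same structure could be reused for tala classification by swapping in rhythm features; the accuracies reported in the abstract come from the authors' rhythm-pattern features, not from this sketch.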

