Please use this identifier to cite or link to this item: https://idr.l1.nitk.ac.in/jspui/handle/123456789/7610
Full metadata record
DC Field | Value | Language
dc.contributor.author | Upadhya, B.A. | -
dc.contributor.author | Udupa, S. | -
dc.contributor.author | Sowmya, Kamath S. | -
dc.date.accessioned | 2020-03-30T10:02:33Z | -
dc.date.available | 2020-03-30T10:02:33Z | -
dc.date.issued | 2019 | -
dc.identifier.citation | 2019 10th International Conference on Computing, Communication and Networking Technologies, ICCCNT 2019, 2019, Vol., pp. | en_US
dc.identifier.uri | https://idr.nitk.ac.in/jspui/handle/123456789/7610 | -
dc.description.abstract | Automatic generation of responses to questions is a challenging problem with applications in fields such as customer support and question-answering forums. A prerequisite for developing such systems is a methodology that classifies questions as yes/no or opinion-based, so that quick and accurate responses can be provided. Performing this classification is advantageous, as yes/no questions can generally be answered using data that is already available. For an opinion-based question, or a yes/no question that was not previously answered, an external knowledge source is needed to generate the answer. We propose an LSTM-based model that classifies questions into the two aforementioned categories. Given a question as input, the objective is to classify it as opinion-based or yes/no. The proposed model was tested on the Amazon community question-answer dataset, as it is reflective of the problem statement we are trying to solve. The proposed methodology achieved promising results, with a high accuracy of 91% in question classification. © 2019 IEEE. | en_US
dc.title | Deep Neural Network Models for Question Classification in Community Question-Answering Forums | en_US
dc.type | Book chapter | en_US
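
As an illustration of the approach described in the abstract above, the following is a minimal sketch of an LSTM-based binary question classifier. It assumes a Keras/TensorFlow implementation; the toy questions, vocabulary size, sequence length, and layer sizes are assumptions chosen for demonstration and are not details taken from the paper.

```python
# Illustrative sketch only: a minimal LSTM question classifier (Keras).
# Vocabulary size, sequence length, and layer sizes are assumptions,
# not values reported in the paper.
import numpy as np
from tensorflow.keras.layers import Dense, Embedding, LSTM
from tensorflow.keras.models import Sequential
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.preprocessing.text import Tokenizer

# Hypothetical example questions: 1 = yes/no, 0 = opinion-based.
questions = [
    "does this phone support wireless charging",
    "is the battery removable",
    "how well does this case protect against drops",
    "what do you think of the sound quality",
]
labels = np.array([1, 1, 0, 0])

# Convert each question into a padded sequence of word indices.
tokenizer = Tokenizer(num_words=5000, oov_token="<unk>")
tokenizer.fit_on_texts(questions)
sequences = pad_sequences(tokenizer.texts_to_sequences(questions), maxlen=20)

# Embedding -> LSTM -> sigmoid output for the binary yes/no vs opinion decision.
model = Sequential([
    Embedding(input_dim=5000, output_dim=64),
    LSTM(64),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(sequences, labels, epochs=5, verbose=0)

# Classify a new question (output near 1 suggests yes/no, near 0 opinion-based).
probe = pad_sequences(tokenizer.texts_to_sequences(["is this waterproof"]), maxlen=20)
print(model.predict(probe))
```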
Appears in Collections: 2. Conference Papers

Files in This Item:
File | Description | Size | Format
10 Deep Neural Network.pdf |  | 792.32 kB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.