Constructing linguistically motivated structures from statistical grammars
Research output: Contribution to journal › Conference article › Research › peer-review
Standard
Constructing linguistically motivated structures from statistical grammars. / Basirat, Ali; Faili, Heshaam.
In: International Conference Recent Advances in Natural Language Processing, RANLP, 2011, p. 63-69.
Bibtex
@article{366047680,
  title = "Constructing linguistically motivated structures from statistical grammars",
  author = "Basirat, Ali and Faili, Heshaam",
  year = "2011",
  pages = "63-69",
  journal = "International Conference Recent Advances in Natural Language Processing, RANLP",
  issn = "1313-8502",
}
RIS
TY - GEN
T1 - Constructing linguistically motivated structures from statistical grammars
AU - Basirat, Ali
AU - Faili, Heshaam
PY - 2011
Y1 - 2011
N2 - This paper discusses two Hidden Markov Models (HMMs) for linking the linguistically motivated XTAG grammar with the automatically extracted LTAG used by the MICA parser. The former is a detailed LTAG enriched with feature structures, while the latter is a very large LTAG whose statistical nature makes it well suited to statistical approaches. The lack of an efficient parser and sparseness in the supertag set are the main obstacles to using the XTAG and MICA grammars, respectively. The models were trained with the standard HMM training algorithm, Baum-Welch. To steer the training algorithm toward a better local optimum, the initial states of the models were also estimated using two semi-supervised EM-based algorithms. The resulting accuracy of the models (about 91%) shows that they provide a satisfactory way of linking these grammars so that their capabilities can be shared.
AB - This paper discusses two Hidden Markov Models (HMMs) for linking the linguistically motivated XTAG grammar with the automatically extracted LTAG used by the MICA parser. The former is a detailed LTAG enriched with feature structures, while the latter is a very large LTAG whose statistical nature makes it well suited to statistical approaches. The lack of an efficient parser and sparseness in the supertag set are the main obstacles to using the XTAG and MICA grammars, respectively. The models were trained with the standard HMM training algorithm, Baum-Welch. To steer the training algorithm toward a better local optimum, the initial states of the models were also estimated using two semi-supervised EM-based algorithms. The resulting accuracy of the models (about 91%) shows that they provide a satisfactory way of linking these grammars so that their capabilities can be shared.
UR - http://www.scopus.com/inward/record.url?scp=84866851139&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:84866851139
SP - 63
EP - 69
JO - International Conference Recent Advances in Natural Language Processing, RANLP
JF - International Conference Recent Advances in Natural Language Processing, RANLP
SN - 1313-8502
T2 - 8th International Conference on Recent Advances in Natural Language Processing, RANLP 2011
Y2 - 12 September 2011 through 14 September 2011
ER -
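As background for the abstract above: Baum-Welch is the standard EM procedure for re-estimating HMM parameters from an observation sequence, and its sensitivity to the initial model is why the paper estimates initial states with semi-supervised algorithms. The sketch below shows one Baum-Welch iteration for a tiny discrete HMM; the 2-state, 2-symbol model and data are purely illustrative and are not taken from the paper.

```python
# Minimal Baum-Welch (EM) re-estimation for a discrete HMM.
# Illustrative only: the tiny 2-state, 2-symbol model below is made up.

def forward(obs, pi, A, B):
    """Forward probabilities alpha[t][i] = P(o_1..o_t, q_t = i)."""
    n = len(pi)
    alpha = [[pi[i] * B[i][obs[0]] for i in range(n)]]
    for t in range(1, len(obs)):
        alpha.append([
            B[j][obs[t]] * sum(alpha[t - 1][i] * A[i][j] for i in range(n))
            for j in range(n)
        ])
    return alpha

def backward(obs, A, B):
    """Backward probabilities beta[t][i] = P(o_{t+1}..o_T | q_t = i)."""
    n, T = len(A), len(obs)
    beta = [[1.0] * n for _ in range(T)]
    for t in range(T - 2, -1, -1):
        for i in range(n):
            beta[t][i] = sum(A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j]
                             for j in range(n))
    return beta

def baum_welch_step(obs, pi, A, B, n_symbols):
    """One EM iteration: returns re-estimated (pi, A, B)."""
    n, T = len(pi), len(obs)
    alpha, beta = forward(obs, pi, A, B), backward(obs, A, B)
    likelihood = sum(alpha[T - 1][i] for i in range(n))
    # gamma[t][i]: posterior probability of being in state i at time t
    gamma = [[alpha[t][i] * beta[t][i] / likelihood for i in range(n)]
             for t in range(T)]
    # xi[t][i][j]: posterior probability of transition i -> j at time t
    xi = [[[alpha[t][i] * A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j] / likelihood
            for j in range(n)] for i in range(n)] for t in range(T - 1)]
    new_pi = gamma[0][:]
    new_A = [[sum(xi[t][i][j] for t in range(T - 1)) /
              sum(gamma[t][i] for t in range(T - 1))
              for j in range(n)] for i in range(n)]
    new_B = [[sum(gamma[t][i] for t in range(T) if obs[t] == k) /
              sum(gamma[t][i] for t in range(T))
              for k in range(n_symbols)] for i in range(n)]
    return new_pi, new_A, new_B

if __name__ == "__main__":
    obs = [0, 1, 0, 0, 1, 1, 0]          # toy observation sequence
    pi = [0.6, 0.4]                       # initial state distribution
    A = [[0.7, 0.3], [0.4, 0.6]]          # transition matrix
    B = [[0.8, 0.2], [0.3, 0.7]]          # emission matrix
    for _ in range(10):
        pi, A, B = baum_welch_step(obs, pi, A, B, 2)
```

Because EM only finds a local optimum, different initial values of `pi`, `A`, and `B` can converge to very different models, which motivates the semi-supervised initialization the abstract describes.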