Mimicking Infants' Bilingual Language Acquisition for Domain Specialized Neural Machine Translation

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

Mimicking Infants' Bilingual Language Acquisition for Domain Specialized Neural Machine Translation. / Park, Chanjun; Go, Woo Young; Eo, Sugyeong; Moon, Hyeonseok; Lee, Seolhwa; Lim, Heuiseok.

In: IEEE Access, Vol. 10, 2022, p. 38684-38693.


Harvard

Park, C, Go, WY, Eo, S, Moon, H, Lee, S & Lim, H 2022, 'Mimicking Infants' Bilingual Language Acquisition for Domain Specialized Neural Machine Translation', IEEE Access, vol. 10, pp. 38684-38693. https://doi.org/10.1109/ACCESS.2022.3165572

APA

Park, C., Go, W. Y., Eo, S., Moon, H., Lee, S., & Lim, H. (2022). Mimicking Infants' Bilingual Language Acquisition for Domain Specialized Neural Machine Translation. IEEE Access, 10, 38684-38693. https://doi.org/10.1109/ACCESS.2022.3165572

Vancouver

Park C, Go WY, Eo S, Moon H, Lee S, Lim H. Mimicking Infants' Bilingual Language Acquisition for Domain Specialized Neural Machine Translation. IEEE Access. 2022;10:38684-38693. https://doi.org/10.1109/ACCESS.2022.3165572

Author

Park, Chanjun ; Go, Woo Young ; Eo, Sugyeong ; Moon, Hyeonseok ; Lee, Seolhwa ; Lim, Heuiseok. / Mimicking Infants' Bilingual Language Acquisition for Domain Specialized Neural Machine Translation. In: IEEE Access. 2022 ; Vol. 10. pp. 38684-38693.

Bibtex

@article{94ffc31588704f988756a29f50530981,
title = "Mimicking Infants' Bilingual Language Acquisition for Domain Specialized Neural Machine Translation",
abstract = "Existing methods of training domain-specialized neural machine translation (DS-NMT) models are based on the pretrain-finetuning approach (PFA). In this study, we reinterpret existing methods based on the perspective of cognitive science related to cross language speech perception. We propose the cross communication method (CCM), a new DS-NMT training approach. Inspired by the learning method of infants, we perform DS-NMT training by configuring and training DC and GC concurrently in batches. Quantitative and qualitative analysis of our experimental results show that CCM can achieve superior performance compared to the conventional methods. Additionally, we conducted an experiment considering the DS-NMT service to meet industrial demands.",
keywords = "cross communication method, deep learning, Domain-specialized neural machine translation, neural machine translation",
author = "Chanjun Park and Go, {Woo Young} and Sugyeong Eo and Hyeonseok Moon and Seolhwa Lee and Heuiseok Lim",
note = "Publisher Copyright: {\textcopyright} 2013 IEEE.",
year = "2022",
doi = "10.1109/ACCESS.2022.3165572",
language = "English",
volume = "10",
pages = "38684--38693",
journal = "IEEE Access",
issn = "2169-3536",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
}

RIS

TY - JOUR

T1 - Mimicking Infants' Bilingual Language Acquisition for Domain Specialized Neural Machine Translation

AU - Park, Chanjun

AU - Go, Woo Young

AU - Eo, Sugyeong

AU - Moon, Hyeonseok

AU - Lee, Seolhwa

AU - Lim, Heuiseok

N1 - Publisher Copyright: © 2013 IEEE.

PY - 2022

Y1 - 2022

N2 - Existing methods of training domain-specialized neural machine translation (DS-NMT) models are based on the pretrain-finetuning approach (PFA). In this study, we reinterpret existing methods based on the perspective of cognitive science related to cross language speech perception. We propose the cross communication method (CCM), a new DS-NMT training approach. Inspired by the learning method of infants, we perform DS-NMT training by configuring and training DC and GC concurrently in batches. Quantitative and qualitative analysis of our experimental results show that CCM can achieve superior performance compared to the conventional methods. Additionally, we conducted an experiment considering the DS-NMT service to meet industrial demands.

AB - Existing methods of training domain-specialized neural machine translation (DS-NMT) models are based on the pretrain-finetuning approach (PFA). In this study, we reinterpret existing methods based on the perspective of cognitive science related to cross language speech perception. We propose the cross communication method (CCM), a new DS-NMT training approach. Inspired by the learning method of infants, we perform DS-NMT training by configuring and training DC and GC concurrently in batches. Quantitative and qualitative analysis of our experimental results show that CCM can achieve superior performance compared to the conventional methods. Additionally, we conducted an experiment considering the DS-NMT service to meet industrial demands.

KW - cross communication method

KW - deep learning

KW - Domain-specialized neural machine translation

KW - neural machine translation

UR - http://www.scopus.com/inward/record.url?scp=85128258171&partnerID=8YFLogxK

U2 - 10.1109/ACCESS.2022.3165572

DO - 10.1109/ACCESS.2022.3165572

M3 - Journal article

AN - SCOPUS:85128258171

VL - 10

SP - 38684

EP - 38693

JO - IEEE Access

JF - IEEE Access

SN - 2169-3536

ER -