Translation Quality Estimation for the IT Domain using Knowledge Distillation Approach (Record no. 11021)
| 000 -LEADER | |
|---|---|
| fixed length control field | 02880nam a22002657a 4500 |
| 008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION | |
| fixed length control field | 201210b2022 a|||f bm|| 00| 0 eng d |
| 024 7# - Author Identifier | |
| Standard number or code | https://orcid.org/0000-0003-3874-805X |
| Source of number or code | ORCID |
| 040 ## - CATALOGING SOURCE | |
| Original cataloging agency | EG-CaNU |
| Transcribing agency | EG-CaNU |
| 041 0# - Language Code | |
| Language code of text | eng |
| Language code of abstract | eng |
| Language code of abstract | ara |
| 082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER | |
| Classification number | 610 |
| 100 0# - MAIN ENTRY--PERSONAL NAME | |
| Personal name | Amal Abdelsalam Mahmoud Mohamed |
| 245 1# - TITLE STATEMENT | |
| Title | Translation Quality Estimation for the IT Domain using Knowledge Distillation Approach |
| Statement of responsibility, etc. | /Amal Abdelsalam Mahmoud Mohamed |
| 260 ## - PUBLICATION, DISTRIBUTION, ETC. | |
| Date of publication, distribution, etc. | 2022 |
| 300 ## - PHYSICAL DESCRIPTION | |
| Extent | p. |
| Other physical details | ill. |
| Dimensions | 21 cm. |
| 500 ## - GENERAL NOTE | |
| Materials specified | Supervisor: Mohamed El-Helw |
| 502 ## - Dissertation Note | |
| Dissertation type | Thesis (M.A.)--Nile University, Egypt, 2022. |
| 504 ## - Bibliography | |
| Bibliography | "Includes bibliographical references" |
| 505 0# - Contents | |
| Formatted contents note | Contents: |
| 520 3# - Abstract | |
| Abstract | Machine Translation (MT) plays a vital role in overcoming language barriers in today’s interconnected world, producing vast streams of translated text used in business and daily life. Traditional evaluation methods that rely on human-generated reference translations are no longer practical, creating an urgent need for automatic Quality Estimation (QE) systems to assess translation quality. While recent advances in Deep Learning (DL) have significantly improved MT systems, QE systems still face challenges in domain-specific and low-resource settings. Addressing these gaps is crucial for building reliable and cost-effective QE systems for real-world use. Recognizing that building QE models requires a carefully curated model design, this thesis proposes a knowledge distillation approach for building bilingual distributed representations used to train a lightweight neural QE model. The proposed design aims to generate bilingual representations that embed the deep semantics and linguistics of the translation language pair into a single vector space. In addition, by exploiting knowledge distillation as a model compression technique, the design is intended to make the QE model practical to deploy in real-world applications. The model is evaluated on sentence-level QE using the Information Technology (IT) domain datasets provided by the Conference on Machine Translation (WMT). It outperforms strong QE systems based on complex deep networks and ensemble models, achieving the best performance on the 2016 and 2017 WMT IT-domain QE datasets and the third-best reported correlation on the 2018 version. In addition, the proposed model reduces the QE model size to one third of that of existing QE ensemble models. With these achievements, this research offers a scalable and efficient solution for real-world applications in low-resource QE. |
| 546 ## - Language Note | |
| Language Note | Text in English, abstracts in English and Arabic |
| 650 #4 - Subject | |
| Subject | Informatics (IFM) |
| 655 #7 - Index Term-Genre/Form | |
| Source of term | NULIB |
| Focus term | Dissertation, Academic |
| 690 ## - Subject | |
| School | Informatics (IFM) |
| 942 ## - ADDED ENTRY ELEMENTS (KOHA) | |
| Source of classification or shelving scheme | Dewey Decimal Classification |
| Koha item type | Thesis |
| Withdrawn status | Lost status | Source of classification or shelving scheme | Damaged status | Not for loan | Home library | Current library | Date acquired | Total Checkouts | Full call number | Date last seen | Price effective from | Koha item type |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | | Dewey Decimal Classification | | | Main library | Main library | 07/19/2025 | | 610/A.M.T/2022 | 07/19/2025 | 07/19/2025 | Thesis |
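
The abstract above describes distilling knowledge into bilingual distributed representations that feed a light sentence-level QE model. The thesis code is not part of this record; the sketch below is only a generic, hypothetical illustration of that kind of pipeline in PyTorch, under assumed names and dimensions: a small "student" encoder is trained to match precomputed "teacher" sentence embeddings while a light head regresses sentence-level quality scores. None of it reflects the thesis's actual architecture or data.

```python
# Hypothetical sketch of distillation-based sentence-level QE (not the thesis code).
# A small student encoder mimics teacher sentence embeddings; a light head
# regresses quality scores. Sizes, names, and the toy batch are assumptions.
import torch
import torch.nn as nn

TEACHER_DIM, STUDENT_DIM, VOCAB, MAX_LEN = 768, 256, 1000, 32

class StudentEncoder(nn.Module):
    """Small bilingual encoder: embeds (source, MT) tokens and mean-pools."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, STUDENT_DIM)
        self.gru = nn.GRU(STUDENT_DIM, STUDENT_DIM, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * STUDENT_DIM, TEACHER_DIM)  # map into the teacher's space

    def forward(self, tokens):
        h, _ = self.gru(self.emb(tokens))
        return self.proj(h.mean(dim=1))  # one vector per (source, MT) pair

class QERegressor(nn.Module):
    """Light head mapping the distilled representation to a quality score."""
    def __init__(self):
        super().__init__()
        self.head = nn.Sequential(nn.Linear(TEACHER_DIM, 128), nn.Tanh(), nn.Linear(128, 1))

    def forward(self, z):
        return self.head(z).squeeze(-1)

student, regressor = StudentEncoder(), QERegressor()
opt = torch.optim.Adam(list(student.parameters()) + list(regressor.parameters()), lr=1e-3)
mse = nn.MSELoss()

# Toy batch: token ids for concatenated (source, MT) sentences, stand-in teacher
# embeddings, and stand-in gold sentence-level quality scores. In practice the
# teacher embeddings would be precomputed by a large multilingual encoder.
tokens = torch.randint(0, VOCAB, (8, MAX_LEN))
teacher_vec = torch.randn(8, TEACHER_DIM)
gold_score = torch.rand(8)

for step in range(100):
    z = student(tokens)
    distill_loss = mse(z, teacher_vec)        # match the teacher's representation
    qe_loss = mse(regressor(z), gold_score)   # fit sentence-level quality scores
    loss = distill_loss + qe_loss
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final loss: {loss.item():.4f}")
```

In such a setup only the small student and head are kept at inference time, which is the general motivation behind distillation as compression that the abstract refers to; the specific loss weighting and model sizes here are illustrative only.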