MARC details
| 000 - LEADER |
| fixed length control field |
09307nam a22002537a 4500 |
| 008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION |
| fixed length control field |
210830s2019 ||||f mb|| 00| 0 eng d |
| 040 ## - CATALOGING SOURCE |
| Original cataloging agency |
EG-CaNU |
| Transcribing agency |
EG-CaNU |
| 041 0# - LANGUAGE CODE |
| Language code of text |
eng |
| Language code of abstract |
eng |
| 082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER |
| Classification number |
610 |
| 100 0# - MAIN ENTRY--PERSONAL NAME |
| Personal name |
Moustafa Alaa Mohamed |
| 245 1# - TITLE STATEMENT |
| Title |
Learning Meters of Arabic Poems with Deep Learning / |
| Statement of responsibility, etc. |
Moustafa Alaa Mohamed |
| 260 ## - PUBLICATION, DISTRIBUTION, ETC. |
| Date of publication, distribution, etc. |
2019 |
| 300 ## - PHYSICAL DESCRIPTION |
| Extent |
56 p. |
| Other physical details |
ill. |
| Dimensions |
21 cm. |
| 500 ## - GENERAL NOTE |
| General note |
Supervisor: Samhaa El-Beltagy |
| 502 ## - DISSERTATION NOTE |
| Dissertation type |
Thesis (M.A.) -- Nile University, Egypt, 2019.
| 504 ## - BIBLIOGRAPHY, ETC. NOTE |
| Bibliography |
"Includes bibliographical references" |
| 505 0# - FORMATTED CONTENTS NOTE |
| Formatted contents note |
Contents:<br/>Dedication i<br/>Acknowledgment i<br/>Table of Contents ii<br/>List of Figures iv<br/>List of Tables v<br/>Thesis Outline 1<br/>Abstract 1<br/>1 INTRODUCTION 3<br/>1.1 Arabic Poetry . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3<br/>1.2 Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3<br/>1.3 Thesis Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3<br/>2 BACKGROUND 5<br/>2.1 Arabic Arud . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6<br/>2.1.1 Al-Farahidi and Pattern Recognition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6<br/>2.1.2 Feet Representation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7<br/>2.1.3 Arabic Poetry feet . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8<br/>2.1.4 Arabic Poetry Meters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8<br/>2.1.5 Meters Relation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12<br/>2.2 Deep Learning Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15<br/>2.2.1 Logistic Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16<br/>2.2.2 The Neuron . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21<br/>2.2.3 The Neural Network Representation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21<br/>2.2.4 Neural Network Computation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22<br/>2.2.5 Recurrent Neural Networks (RNNs) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27<br/>2.2.6 Long Short Term Memory networks (LSTMs) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29<br/>2.2.7 Machine Learning Model Assessment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32<br/>2.3 Literature Review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35<br/>2.3.1 Deterministic (Algorithmic) Approach . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35<br/>ii<br/>3 DESIGN DATASET AND EXPERIMENTS 36<br/>3.1 Dataset Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37<br/>3.1.1 Data Scraping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37<br/>3.1.2 Data Preparation and Cleansing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38<br/>3.1.3 Data Encoding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38<br/>3.2 Model Training . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41<br/>3.2.1 Parameters of Data Representation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41<br/>3.2.2 Parameters of Network Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42<br/>3.3 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44<br/>3.3.1 Hardware . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44<br/>3.3.2 Software . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44<br/>3.3.3 Implementation Outline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44<br/>3.4 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46<br/>3.4.1 Overall Accuracy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46<br/>3.4.2 Data Representation Effects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46<br/>3.4.3 Network Configurations Effects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47<br/>3.4.4 Per-Class (Meter) Accuracy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47<br/>3.4.5 Encoding Effect . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48<br/>3.4.6 Comparison with Literature . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49<br/>3.4.7 Classifying Arabic Non-Poem Text . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49<br/>3.5 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51<br/>3.5.1 Dataset Unbalanced . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51<br/>3.5.2 Encoding Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51<br/>3.5.3 Weighting Loss Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51<br/>3.5.4 Neural Network configurations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51<br/>3.5.5 Model Assessment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52<br/>4 Conclusion and Future Work 53<br/>4.1 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53<br/>4.2 Future Work . . . . . . . . . . . . . . . . . . . . |
| 520 3# - ABSTRACT |
| Abstract |
Abstract:<br/>People can easily determine whether a piece of writing is a poem or prose, but only specialists can determine the class of poem.<br/>In this thesis, we built a model that can classify poetry according to its meters; a forward step towards machine understanding<br/>of the Arabic language.<br/>A number of different deep learning models are proposed for poem meter classification. As poetry is sequence data, then recurrent<br/>neural networks are suitable for the task. We have trained three variants of them; LSTM, GRU with different architectures<br/>and hyper-parameters. Because meters are a sequence of characters, we have encoded the input text at the character-level, so that<br/>we preserve the information provided by the letters succession directly fed to the models. Moreover, we introduce a comparative<br/>study on the difference between binary and One-hot encoding regarding their effect on the learning curve. We also introduce a new<br/>encoding technique called Two-Hot, which merges the advantages of both Binary and One-Hot techniques.<br/>Artificial Intelligence currently works to do the human tasks such as our problem here. Our target in this thesis is to achieve the<br/>human accuracy which will make it easy for anyone to recognise the meter for any poem without referring to the language experts<br/>or to study the whole field.<br/>In this thesis, we will explain how to use deep learning to classify the Arabic poem. We will also explain in details the feature of<br/>Arabic poem and how to deal with this feature. We explain how anyone can work with Arabic text encoding in a dynamic way to<br/>encode the text at the character level and deal with Arabic text features such as the Tashkeel.<br/>To the best of the author’s knowledge, this research is the first to address classifying poem meters in a machine learning approach,<br/>in general, and in RNN featureless based approach, in particular. In addition, the dataset is the first publicly available<br/>dataset prepared for the purpose of future computational research. |
| 546 ## - LANGUAGE NOTE |
| Language Note |
Text in English, abstracts in English. |
| 650 #4 - Subject |
| Subject |
Informatics-IFM |
| 655 #7 - INDEX TERM--GENRE/FORM |
| Source of term |
NULIB |
| Focus term |
Dissertations, Academic
| 690 ## - Subject |
| School |
Informatics-IFM |
| 942 ## - ADDED ENTRY ELEMENTS (KOHA) |
| Source of classification or shelving scheme |
Dewey Decimal Classification |
| Koha item type |
Thesis |