Self-Supervised Learning Framework for Sequential Data Applications / (Record no. 9253)
| 000 -LEADER | |
|---|---|
| fixed length control field | 06042nam a22002537a 4500 |
| 008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION | |
| fixed length control field | 220110b2021 |||a|||f mb|| 00| 0 eng d |
| 040 ## - CATALOGING SOURCE | |
| Original cataloging agency | EG-CaNU |
| Transcribing agency | EG-CaNU |
| 041 0# - Language Code | |
| Language code of text | eng |
| Language code of abstract | eng |
| 082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER | |
| Classification number | 610 |
| 100 0# - MAIN ENTRY--PERSONAL NAME | |
| Personal name | Karim Magdy Amer |
| 245 1# - TITLE STATEMENT | |
| Title | Self-Supervised Learning Framework for Sequential Data Applications / |
| Statement of responsibility, etc. | Karim Magdy Amer |
| 260 ## - PUBLICATION, DISTRIBUTION, ETC. | |
| Date of publication, distribution, etc. | 2021 |
| 300 ## - PHYSICAL DESCRIPTION | |
| Extent | 95 p. |
| Other physical details | ill. |
| Dimensions | 21 cm. |
| 500 ## - GENERAL NOTE | |
| Materials specified | Supervisor: Mohamed Elhelw |
| 502 ## - Dissertation Note | |
| Dissertation type | Thesis (M.A.)--Nile University, Egypt, 2021. |
| 504 ## - Bibliography | |
| Bibliography | "Includes bibliographical references" |
| 505 0# - Contents | |
| Formatted contents note | Contents:<br/>Abstract<br/>Dedication<br/>Acknowledgments<br/>List of Tables<br/>List of Figures<br/>Chapters:<br/>1. Introduction<br/>1.1 Overview and Motivation<br/>1.2 Problem Statement<br/>1.3 Objectives and Contributions<br/>1.4 Organization of Thesis<br/>2. Literature Review<br/>2.1 Introduction<br/>2.2 Deep Neural Network Architectures<br/>2.2.1 Convolutional Neural Network<br/>2.2.2 Sequence Models<br/>2.3 Self-Supervised Learning<br/>2.3.1 Computer Vision<br/>2.3.2 Natural Language Processing<br/>2.3.3 Automatic Speech Recognition<br/>2.4 Sequential Data Applications using Machine Learning and Deep Learning<br/>2.4.1 Bioinformatics<br/>2.4.2 Agriculture<br/>2.5 Conclusions<br/>3. Self-Supervised Learning Framework for Sequential Data<br/>3.1 Introduction<br/>3.2 Framework Architecture<br/>3.3 Self-Supervised Learning Modules<br/>3.3.1 Denoising Autoencoder Module<br/>3.3.2 Contrastive Learning Module<br/>3.4 Supervised Learning Module<br/>3.5 Deep Neural Network Module<br/>3.6 Conclusions<br/>4. Experiments and Results<br/>4.1 Introduction<br/>4.2 Case Study: Crop Identification<br/>4.2.1 Problem Description<br/>4.2.2 Dataset and Evaluation Metrics<br/>4.2.3 Experimental Setup<br/>4.2.4 Results<br/>4.3 Case Study: mRNA Vaccine Degradation Prediction<br/>4.3.1 Problem Description<br/>4.3.2 Dataset and Evaluation Metrics<br/>4.3.3 Experimental Setup<br/>4.3.4 Results<br/>4.4 Conclusions<br/>5. Conclusions and Future Work<br/>5.1 Introduction<br/>5.2 Summary and Contributions<br/>5.3 Future Work Directions<br/>Bibliography |
| 520 3# - Abstract | |
| Abstract | Abstract:<br/>Recent advances in Deep Learning (DL) and Artificial Intelligence (AI) algorithms have proven greatly beneficial in many applications. The success of such algorithms over the last decade was built mainly on supervised training of very complex models using large labeled datasets. However, the data labeling process is expensive and time consuming, especially for applications with sequential data.<br/>Self-Supervised Learning (SSL) emerged as a solution for training complex models with unlabeled datasets. The main idea behind SSL is to create an auxiliary task, based on some property of the dataset, in order to learn a latent space that captures the data semantics. Models trained using SSL can later be fine-tuned on much smaller labeled datasets. This learning scheme has become very successful in Computer Vision (CV) and Natural Language Processing (NLP) applications but has not yet been generalized to other sequential data applications.<br/>In this work, we propose a novel DL framework for sequential data processing that adopts SSL techniques to make use of unlabeled data. Our framework is evaluated on two applications from two different domains: Crop Identification and mRNA Vaccine Degradation Prediction. Details of both applications' datasets, pre-processing steps, and learning objectives are explained. Experimental results demonstrate that the proposed framework offers a promising approach for utilizing SSL in the training pipeline of commonly used DL models. |
| 546 ## - Language Note | |
| Language Note | Text in English, abstracts in English. |
| 650 #4 - Subject | |
| Subject | Informatics-IFM |
| 655 #7 - Index Term-Genre/Form | |
| Source of term | NULIB |
| focus term | Dissertation, Academic |
| 690 ## - Subject | |
| School | Informatics-IFM |
| 942 ## - ADDED ENTRY ELEMENTS (KOHA) | |
| Source of classification or shelving scheme | Dewey Decimal Classification |
| Koha item type | Thesis |
| Withdrawn status | Lost status | Source of classification or shelving scheme | Damaged status | Not for loan | Home library | Current library | Date acquired | Total Checkouts | Full call number | Date last seen | Price effective from | Koha item type |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | | Dewey Decimal Classification | | | Main library | Main library | 01/10/2022 | | 610/ K.M.S/ 2021 | 01/10/2022 | 01/10/2022 | Thesis |