Neural Architecture Search With Application to Image Super Resolution / (Record no. 9252)

MARC details
000 -LEADER
fixed length control field 13691nam a22002537a 4500
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
fixed length control field 220110b2021 |||a|||f mb|| 00| 0 eng d
040 ## - CATALOGING SOURCE
Original cataloging agency EG-CaNU
Transcribing agency EG-CaNU
041 0# - Language Code
Language code of text eng
Language code of abstract eng
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER
Classification number 610
100 0# - MAIN ENTRY--PERSONAL NAME
Personal name Mahmoud Abdelgyd
245 1# - TITLE STATEMENT
Title Neural Architecture Search With Application to Image Super Resolution /
Statement of responsibility, etc. Mahmoud Abdelgyd
260 ## - PUBLICATION, DISTRIBUTION, ETC.
Date of publication, distribution, etc. 2021
300 ## - PHYSICAL DESCRIPTION
Extent 61 p.
Other physical details ill.
Dimensions 21 cm.
500 ## - GENERAL NOTE
Materials specified Supervisor: Mohamed Elhelw
502 ## - Dissertation Note
Dissertation type Thesis (M.A.)—Nile University, Egypt, 2021.
504 ## - Bibliography
Bibliography "Includes bibliographical references"
505 0# - Contents
Formatted contents note Contents:
LIST OF FIGURES
ABSTRACT
CHAPTER 1: INTRODUCTION
1.1 INTRODUCTION
1.2 OBJECTIVES AND CONTRIBUTIONS
1.3 ORGANIZATION OF THESIS
CHAPTER 2: REVIEW OF SINGLE IMAGE SUPER RESOLUTION AND NEURAL ARCHITECTURE SEARCH (NAS)
2.1 NEURAL NETWORKS
2.2 REGULARIZATION
2.3 OPTIMIZATION
2.3.1 Derivative free optimization
2.3.2 First order methods
2.3.3 Advanced first order optimization techniques
2.4 BACKPROPAGATION
2.5 BIOLOGICAL INSPIRATION
2.6 CONVOLUTIONAL NEURAL NETWORKS
2.6.1 Concrete example
2.6.2 Pooling layers
2.6.3 Normalization Methods
2.7 CROSS-VALIDATION
2.8 DEEP NEURAL NETWORK APPLICATION
2.8.1 Multi digit recognition using deep recurrent attention model
2.9 IMAGE SUPER RESOLUTION
2.9.1 Classic Methods
2.9.2 Deep Learning Methods
2.10 NEURAL ARCHITECTURE SEARCH
2.10.1 Search Space
2.10.2 Search Strategy
2.10.3 Performance Estimation Strategy
2.11 DIFFERENTIABLE ARCHITECTURE SEARCH
2.12 PROXYLESS NEURAL ARCHITECTURE SEARCH
2.13 SUPERNET
2.14 SUMMARY
CHAPTER 3: IMAGE SUPER RESOLUTION USING NAS
3.1 EDSR BASELINE MODEL
3.2 EDSR NAS
3.2.1 Normalization Search Cells
3.2.2 Convolution Type Search Cells
3.2.3 Kernel Size Search Cells
3.2.4 Architecture Search
3.3 ESRGAN BASELINE MODEL
3.4 ESRGAN NAS
3.4.1 ESRGAN Search Cell
3.4.2 Architecture Search
3.5 SUMMARY
CHAPTER 4: EXPERIMENTS AND RESULTS
4.1 INTRODUCTION
4.2 EXPERIMENTAL SETUP
4.3 DATASETS
4.3.1 Div2k
4.3.2 Set5 & Set14
4.3.3 Urban100
4.3.4 B100
4.4 RESULTS AND ANALYSIS
4.4.1 EDSR Baseline
4.4.2 EDSR NAS
4.4.3 ESRGAN Baseline
4.4.4 ESRGAN NAS
CHAPTER 5: CONCLUSIONS AND FUTURE WORK
5.1 Conclusions
5.2 Future Work
REFERENCES
520 3# - Abstract
Abstract Abstract:
Deep learning uses deep neural networks (DNNs) to learn high-dimensional data representations. DNNs have surpassed state-of-the-art benchmarks in a variety of tasks previously dominated by classical machine learning methods, including natural language processing, speech recognition, and computer vision. One noticeable trend in deep neural networks is that their performance depends on the choice of architecture and hyperparameters. There are many DNN architectures even for a single task, and finding the right architecture and hyperparameters is usually a tedious and time-consuming task. Some attempts have used partially automated methods, tuning a network's hyperparameters or switching between several fixed architectures, but optimizing both the architecture and the hyperparameters simultaneously was rarely done. Neural architecture search (NAS) focuses on designing effective neural networks automatically.
In this work, NAS algorithms were used to design deep networks for image super resolution as an application domain. Image super resolution is the task of improving image quality or resolution, and it has long been an active area of research in the image processing community. For some applications it is essential to have high-resolution images with many details, such as forensic investigations, security video surveillance, and license plate recognition systems. Classical methods, such as sampling and interpolation, dominated the image super resolution domain for decades; nowadays DNNs have proven superior for image super resolution tasks.
To the best of our knowledge, this is the first work to apply NAS to the design of image super resolution deep networks. Novel building blocks defining the search space for NAS were designed for this task. These building blocks automate the manual process of designing deep network layers, such as choosing the right data normalization type, convolution type, or convolution kernel size. A set of experiments using NAS algorithms (Differentiable Architecture Search, Proxyless Neural Architecture Search, SuperNet) showed the effectiveness of NAS in designing DNNs with competitive performance, in terms of the Peak Signal-to-Noise Ratio metric, relative to manually designed deep networks. The resulting architectures have much smaller size and lower computational latency, making them suitable for hardware-constrained environments (in terms of memory and computing capabilities) such as edge devices. Extensive experiments and analysis on the widely used Set5, Set14, Urban100, and BSD100 benchmark datasets demonstrate the effectiveness of our architectures across different image varieties and application scenarios. We believe that NAS is a promising approach for designing image super resolution deep networks, removing the burden of hand-engineering architectures.
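The abstract mentions search-space building blocks over normalization type, convolution type, and kernel size, optimized with Differentiable Architecture Search. The sketch below illustrates the general idea of such a differentiable search cell; it assumes PyTorch, all class and variable names are hypothetical, and it is not code from the thesis.

```python
# A minimal DARTS-style differentiable search cell (illustrative sketch only).
# Candidate operations varying in convolution kernel size and normalization
# type are mixed with softmax-weighted architecture parameters, so the layer
# design can be learned by gradient descent rather than chosen by hand.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedSRCell(nn.Module):
    """Softmax-weighted mixture of candidate ops (kernel size x normalization)."""

    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList()
        for k in (1, 3, 5):                                   # kernel-size candidates
            for make_norm in (nn.Identity,                     # "no normalization"
                              lambda: nn.BatchNorm2d(channels)):
                self.ops.append(nn.Sequential(
                    nn.Conv2d(channels, channels, k, padding=k // 2),
                    make_norm(),
                    nn.ReLU(inplace=True),
                ))
        # One architecture parameter per candidate op; in DARTS these are
        # optimized jointly with the network weights (bi-level optimization).
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))


# After the search phase, only the op with the largest architecture weight
# would be kept and the rest pruned, leaving a small fixed architecture:
# cell = MixedSRCell(channels=64)
# best_op = cell.ops[int(cell.alpha.argmax())]
```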
546 ## - Language Note
Language Note Text in English, abstracts in English.
650 #4 - Subject
Subject Informatics-IFM
655 #7 - Index Term-Genre/Form
Source of term NULIB
Focus term Dissertation, Academic
690 ## - Subject
School Informatics-IFM
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Source of classification or shelving scheme Dewey Decimal Classification
Koha item type Thesis
Holdings
Source of classification or shelving scheme Dewey Decimal Classification
Home library Main library
Current library Main library
Date acquired 01/10/2022
Full call number 610/M.A.N/2021
Date last seen 01/10/2022
Price effective from 01/10/2022
Koha item type Thesis