
Neural Architecture Search With Application to Image Super Resolution / Mahmoud Abdelgyd

By: Mahmoud Abdelgyd
Material type: Text
Language: English Summary language: English
Publication details: 2021
Description: 61 p. : ill. ; 21 cm
DDC classification:
  • 610
Dissertation note: Thesis (M.A.)—Nile University, Egypt, 2021.

Supervisor: Mohamed Elhelw

Thesis (M.A.)—Nile University, Egypt, 2021.

"Includes bibliographical references"

Contents:
LIST OF FIGURES
ABSTRACT
CHAPTER 1: INTRODUCTION
1.1 INTRODUCTION
1.2 OBJECTIVES AND CONTRIBUTIONS
1.3 ORGANIZATION OF THESIS
CHAPTER 2: REVIEW OF SINGLE IMAGE SUPER RESOLUTION AND NEURAL ARCHITECTURE SEARCH (NAS)
2.1 NEURAL NETWORKS
2.2 REGULARIZATION
2.3 OPTIMIZATION
2.3.1 Derivative free optimization
2.3.2 First order methods
2.3.3 Advanced first order optimization techniques
2.4 BACKPROPAGATION
2.5 BIOLOGICAL INSPIRATION
2.6 CONVOLUTIONAL NEURAL NETWORKS
2.6.1 Concrete example
2.6.2 Pooling layers
2.6.3 Normalization Methods
2.7 CROSS-VALIDATION
2.8 DEEP NEURAL NETWORK APPLICATION
2.8.1 Multi digit recognition using deep recurrent attention model
2.9 IMAGE SUPER RESOLUTION
2.9.1 Classic Methods
2.9.2 Deep Learning Methods
2.10 NEURAL ARCHITECTURE SEARCH
2.10.1 Search Space
2.10.2 Search Strategy
2.10.3 Performance Estimation Strategy
2.11 DIFFERENTIABLE ARCHITECTURE SEARCH
2.12 PROXYLESS NEURAL ARCHITECTURE SEARCH
2.13 SUPERNET
2.14 SUMMARY
CHAPTER 3: IMAGE SUPER RESOLUTION USING NAS
3.1 EDSR BASELINE MODEL
3.2 EDSR NAS
3.2.1 Normalization Search Cells
3.2.2 Convolution Type Search Cells
3.2.3 Kernel Size Search Cells
3.2.4 Architecture Search
3.3 ESRGAN BASELINE MODEL
3.4 ESRGAN NAS
3.4.1 ESRGAN Search Cell
3.4.2 Architecture Search
3.5 SUMMARY
CHAPTER 4: EXPERIMENTS AND RESULTS
4.1 INTRODUCTION
4.2 EXPERIMENTAL SETUP
4.3 DATASETS
4.3.1 Div2k
4.3.2 Set5 & Set14
4.3.3 Urban100
4.3.4 B100
4.4 RESULTS AND ANALYSIS
4.4.1 EDSR Baseline
4.4.2 EDSR NAS
4.4.3 ESRGAN Baseline
4.4.4 ESRGAN NAS
CHAPTER 5: CONCLUSIONS AND FUTURE WORK
5.1 Conclusions
5.2 Future Work
REFERENCES

Abstract:
Deep learning uses deep neural networks (DNNs) to learn high-dimensional data representations. DNNs have exceeded state-of-the-art benchmarks in a variety of tasks previously dominated by classical machine learning methods, including natural language processing, speech recognition, and computer vision. One noticeable trend in deep neural networks is that their performance depends on the choice of architecture and hyperparameters. There is a wide variety of DNN architectures even for a single task, and finding the right architecture and hyperparameters is usually a tedious and time-consuming task. Some attempts used partially automated methods: the authors tuned a network's hyperparameters or switched between several fixed architectures, but optimizing both the architecture and the hyperparameters simultaneously was rarely done. Neural architecture search (NAS) focuses on designing effective neural networks automatically.
In this work, NAS algorithms were used to design deep networks for image super resolution as an application domain. Image super resolution is the task of improving image quality or resolution, and it has long been an active area of research in the image processing community. Some applications require high-resolution images with fine detail, such as forensic investigations, security video surveillance, and license plate recognition systems. Classical methods, such as sampling and interpolation, dominated the image super resolution domain for decades; nowadays, DNNs have proven their superiority in image super resolution tasks.
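As a concrete point of reference for the classical baseline mentioned above, the sketch below upscales an image with bicubic interpolation and scores it with PSNR, the metric used in the thesis. It is an illustrative PyTorch example, not code from the thesis; the tensor shape and scale factor are arbitrary assumptions.

```python
import torch
import torch.nn.functional as F

def psnr(sr: torch.Tensor, hr: torch.Tensor, max_val: float = 1.0) -> torch.Tensor:
    """Peak Signal to Noise Ratio (in dB) between a super-resolved and a reference image."""
    mse = F.mse_loss(sr, hr)
    return 10.0 * torch.log10(max_val ** 2 / mse)

# Classical baseline: shrink a (stand-in) high-resolution image by 4x,
# upscale it back with bicubic interpolation, and measure PSNR.
hr = torch.rand(1, 3, 128, 128)  # hypothetical ground-truth image in [0, 1]
lr = F.interpolate(hr, scale_factor=0.25, mode="bicubic", align_corners=False)
sr = F.interpolate(lr, scale_factor=4, mode="bicubic", align_corners=False).clamp(0, 1)
print(f"bicubic PSNR: {psnr(sr, hr).item():.2f} dB")
```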
To the best of our knowledge, this is the first work applying NAS to the design of image super resolution deep networks. Novel building blocks defining the NAS search space were designed for this task. These building blocks automate design choices for deep network layers that would otherwise be made by hand, such as the data normalization type, convolution type, or convolution kernel size. A set of experiments using NAS algorithms (Differentiable Architecture Search, Proxyless Neural Architecture Search, SuperNet) demonstrated the effectiveness of NAS in designing DNNs with competitive performance, in terms of the Peak Signal to Noise Ratio (PSNR) metric, relative to manually designed deep networks. The resulting architectures have much smaller size and lower computational latency, making them suitable for hardware-constrained environments (in terms of memory and computing capabilities) such as edge devices. Extensive experiments and analysis on the widely used Set5, Set14, Urban100, and BSD100 benchmark datasets demonstrate the effectiveness of our architectures across different image types and application scenarios. We believe NAS is a promising approach for designing image super resolution deep networks, removing the burden of hand-engineering architectures.
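To make the "search cell" idea in the abstract concrete, the following minimal PyTorch sketch shows a DARTS-style mixed operation: a discrete choice among candidate layers (here, illustrative kernel-size and normalization options, not the thesis's exact cells) is relaxed into a softmax-weighted sum, and after the search the highest-weighted candidate is kept.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedCell(nn.Module):
    """Continuous relaxation over candidate operations, in the spirit of
    Differentiable Architecture Search (DARTS). The candidate set below is
    purely illustrative, not the search cells defined in the thesis."""

    def __init__(self, channels: int):
        super().__init__()
        self.candidates = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),  # 3x3 conv, no norm
            nn.Conv2d(channels, channels, kernel_size=5, padding=2),  # 5x5 conv, no norm
            nn.Sequential(                                            # 3x3 conv + BatchNorm
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(channels),
            ),
        ])
        # One architecture parameter per candidate, trained jointly with the network weights.
        self.alpha = nn.Parameter(torch.zeros(len(self.candidates)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Softmax-weighted mixture of all candidate operations (search phase).
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.candidates))

    def derive(self) -> nn.Module:
        # Discretize: keep only the highest-weighted candidate for the final network.
        return self.candidates[int(self.alpha.argmax())]
```

A search-phase network built from such cells can be trained end to end; calling derive() on each cell then yields the discrete architecture, which is retrained from scratch. This is roughly how DARTS-style methods operate and is intended only as a sketch of the mechanism.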

Text in English, abstracts in English.
