Deep Learning from Scratch: From Basics to Building Real Neural Networks in Python with Keras by Artem Kovera
Author: Artem Kovera [Kovera, Artem]
Language: eng
Format: azw3
Published: 2018-11-08T16:00:00+00:00
A pooling layer has no learnable parameters. Because of this, pooling layers are usually not counted in the total number of layers of a convolutional network. For example, if a neural network has seven convolutional layers, three fully-connected layers, one softmax layer, and four pooling layers, we say that it is an 11-layer-deep neural network (7 + 3 + 1), because we count only the layers with learnable parameters.
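We can see this in Keras with a tiny model (the layer sizes below are only illustrative, not taken from any particular network): the convolutional layer has weights and biases, while the pooling layer reports zero parameters.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Minimal sketch: one convolutional layer followed by one pooling layer.
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(8, kernel_size=3, activation="relu"),  # has weights and biases
    layers.MaxPooling2D(pool_size=2),                     # no learnable parameters
])

for layer in model.layers:
    print(layer.name, layer.count_params())
# The Conv2D layer reports 8 * (3*3*1) + 8 = 80 parameters,
# while the MaxPooling2D layer reports 0.
```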
In some recent convolutional neural network architectures, the fully-connected layers at the end of the network are replaced by average pooling layers. This drastically reduces the total number of learnable parameters, which helps prevent overfitting.
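As a minimal sketch of this idea (the layer sizes and number of classes are made up for illustration), a global average pooling layer in Keras can take the place of a Flatten layer followed by a wide fully-connected layer:

```python
from tensorflow import keras
from tensorflow.keras import layers

num_classes = 10  # hypothetical number of output classes

model = keras.Sequential([
    keras.Input(shape=(32, 32, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.Conv2D(64, 3, activation="relu"),
    # Global average pooling collapses each feature map to a single number,
    # so no large Flatten -> Dense weight matrix is needed before the output.
    layers.GlobalAveragePooling2D(),
    layers.Dense(num_classes, activation="softmax"),
])

model.summary()  # far fewer parameters than a Flatten + wide Dense head
```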
In addition to convolutional neural networks, there are also so-called deconvolutional neural networks. These networks utilize special operations called deconvolutions or transposed convolutions.
Deconvolutional neural networks usually use an operation called unpooling (or upsampling) in their unpooling layers. It is essentially the opposite of what pooling does: the output of an unpooling layer is an enlarged but sparse feature map. After unpooling, a deconvolutional network applies its deconvolutional layer, which densifies the sparse feature map coming from the unpooling layer using transposed convolutions. As a result, deconvolutional neural networks can increase image resolution or, more generally, the dimensionality of their inputs. They can be used in generative models, for example.
The same increase in input dimensionality that deconvolutional networks provide can also be achieved with convolutional networks that contain appropriate upsampling or transposed-convolution layers, so we do not necessarily need a separate deconvolutional architecture. In fact, networks that contain deconvolutional layers are often simply called convolutional. A minimal sketch of this idea is shown below.
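In the sketch below (shapes and filter counts are only illustrative), a small upsampling "decoder" is built from ordinary Keras layers. Keras has no switch-based unpooling layer, so UpSampling2D, which simply enlarges each feature map, stands in for unpooling here, and Conv2DTranspose provides the transposed convolution:

```python
from tensorflow import keras
from tensorflow.keras import layers

decoder = keras.Sequential([
    keras.Input(shape=(8, 8, 64)),             # a small, dense feature map
    layers.UpSampling2D(size=2),                # 8x8 -> 16x16 (enlarged feature map)
    layers.Conv2DTranspose(32, 3, padding="same", activation="relu"),
    layers.UpSampling2D(size=2),                # 16x16 -> 32x32
    layers.Conv2DTranspose(1, 3, padding="same", activation="sigmoid"),
])

decoder.summary()  # output shape (32, 32, 1): higher spatial resolution than the input
```

Note that Conv2DTranspose lives alongside the ordinary convolutional layers in keras.layers, which reflects the point above: networks built from such layers are usually just called convolutional.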