56 questions
0 votes · 1 answer · 60 views
Given groups=1, weight of size [8, 3, 3, 3], expected input[1, 1000, 28, 28] to have 3 channels, but got 1000 channels instead
I tried to implement a neural network with residual connections and recreate LeNet-5, but I can't set up the architecture.
Here is the residual connection block:
class ResidualBlock(torch.nn.Module):
def __init__(...
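A minimal sketch of the usual fix for this class of error, assuming the data is really 1000 single-channel 28×28 images (the code above is truncated, so this is only an illustration): Conv2d expects input shaped (N, C, H, W), and the first convolution's in_channels must match C.
import torch
import torch.nn as nn

x = torch.randn(1, 1000, 28, 28)   # hypothetical tensor shaped like the one in the error message
x = x.view(1000, 1, 28, 28)        # reinterpret as 1000 grayscale images in (N, C, H, W) layout
conv = nn.Conv2d(in_channels=1, out_channels=8, kernel_size=3)  # in_channels must equal C (here 1)
out = conv(x)                      # output shape: (1000, 8, 26, 26)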
0 votes · 1 answer · 582 views
model diagnosis in GLMM model of binary outcome variable
Dear scientists and researchers, greetings.
I am working on a research article on morbidity, which is categorical, i.e., yes or no, and I also consider fixed and random effects, given that all of the ...
0 votes · 0 answers · 752 views
Can Low gradient norm be an indicator of a problem in my deep learning model?
Does anyone know how to troubleshoot deep learning model training through the gradient norm? I am reproducing work from a research paper but I'm not getting the same results as theirs. I am training a model of 16 ...
-1 votes · 1 answer · 284 views
Image segmentation with VGG16 and random forest
According to the link below, the writer has implemented image segmentation with VGG16 as the encoder and a random forest as the classifier.
https://github.com/bnsreenu/python_for_microscopists/...
1 vote · 1 answer · 134 views
VGG16 Custom Activation Function used in ResNet function
Here's my code:
import os
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"
import tensorflow as tf
from tensorflow import keras
from keras import layers
from keras.datasets import cifar10
...
1 vote · 0 answers · 97 views
Why does the accuracy fluctuate widely after using batch normalization
I'm training a model which includes a batch normalization layer, but I noticed that the accuracy can fluctuate widely (from 55% to 31% in just one epoch), both train accuracy and test accuracy, so I ...
0 votes · 1 answer · 72 views
How to properly define a tf.Variable if I have a number of blocks
I just started moving from PyTorch to TensorFlow, and have some problems designing the residual blocks. I have a residual group which contains a number of residual blocks, and each block ...
1 vote · 0 answers · 259 views
Gated Residual and Variable Selection Networks for regression
I came across this Keras tutorial which is for a classification problem, but I'm trying to apply it to a regression one. Does anyone know why they change the dimension of the features? Has anybody ...
0 votes · 1 answer · 891 views
ResNet for 32x32 images
I am trying to train a ResNet for 32x32 images, and I came upon a tutorial: https://towardsdatascience.com/resnets-for-cifar-10-e63e900524e0, which applies to CIFAR-10 (a 32x32 image dataset), but I don'...
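A minimal sketch of the common CIFAR-style adaptation, assuming torchvision's resnet18 as the starting point (the linked tutorial may differ): use a 3×3, stride-1 stem and drop the initial max-pool so a 32×32 input is not downsampled too aggressively.
import torch.nn as nn
from torchvision import models

model = models.resnet18(num_classes=10)  # hypothetical 10-class setup, e.g. CIFAR-10
model.conv1 = nn.Conv2d(3, 64, kernel_size=3, stride=1, padding=1, bias=False)  # 3x3/1 stem instead of 7x7/2
model.maxpool = nn.Identity()            # skip the 3x3/2 max-pool for small images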
1 vote · 0 answers · 64 views
Residual Unit with ShakeDrop Depth regularization
I need Python code implementing a residual unit with ShakeDrop depth regularization.
Here is the design: (design architecture diagram)
1 vote · 1 answer · 94 views
ValueError: Shapes (None, 14065, 17) and (None, 17) are incompatible
I am trying to train the model below with the Indian Pines dataset, but I get the following error.
Model:
def ResNet50(input_shape, classes=16):
# Define the input as a tensor with shape ...
1 vote · 1 answer · 4k views
cannot import name 'Deconvolution2D' from 'keras.layers'
My keras code is throwing this error:
2021-03-01 08:31:47.267964: W tensorflow/stream_executor/platform/default/dso_loader.cc:60] Could not load dynamic library 'cudart64_110.dll'; dlerror: ...
0 votes · 1 answer · 1k views
Building a dense residual network with keras
I am trying to build a classifier based on a dense network with Keras. My inputs are (26, 1) vectors and I want to get a binary classification, 1 or 0, as output.
Using a Dense network and some ...
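A minimal sketch of such a network with the Keras functional API, assuming the (26, 1) vectors are flattened to 26 features and that identity shortcuts around pairs of Dense layers are what is wanted:
from tensorflow import keras
from tensorflow.keras import layers

def dense_residual_block(x, units):
    # two Dense layers on the main path, identity skip added back before the activation
    y = layers.Dense(units, activation="relu")(x)
    y = layers.Dense(units)(y)
    return layers.Activation("relu")(layers.Add()([x, y]))

inputs = keras.Input(shape=(26,))
x = layers.Dense(64, activation="relu")(inputs)
x = dense_residual_block(x, 64)
x = dense_residual_block(x, 64)
outputs = layers.Dense(1, activation="sigmoid")(x)  # binary classification
model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])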
0 votes · 1 answer · 181 views
How can I use the Edmonds–Karp algorithm on a residual graph?
I am trying to apply the Edmonds–Karp algorithm. On normal networks I know how to use it, but I have no clue how to use the algorithm on a residual graph, since there are back-edges.
...
0 votes · 1 answer · 426 views
adding batch norm to a non-batch norm layer
I'm implementing a modified ResNet architecture. In the Basic Block of ResNet, I've used the Conv layer in a shortcut connection. So my main path consists of two Conv layers, each followed by Batch ...
12 votes · 2 answers · 24k views
What is the idea behind using nn.Identity for residual learning?
So, I've read about half the original ResNet paper, and am trying to figure out how to make my version for tabular data.
I've read a few blog posts on how it works in PyTorch, and I see heavy use of ...
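A minimal sketch of the pattern those posts tend to use, here adapted to tabular data (names are illustrative): nn.Identity is simply a pass-through, so the shortcut can be an identity when the feature dimensions match and a linear projection otherwise, without changing the forward pass.
import torch.nn as nn
import torch.nn.functional as F

class TabularResidualBlock(nn.Module):
    def __init__(self, dim_in, dim_out):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(dim_in, dim_out),
            nn.ReLU(),
            nn.Linear(dim_out, dim_out),
        )
        # identity shortcut when dimensions already match, linear projection otherwise
        self.shortcut = nn.Identity() if dim_in == dim_out else nn.Linear(dim_in, dim_out)

    def forward(self, x):
        return F.relu(self.body(x) + self.shortcut(x))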
0 votes · 1 answer · 1k views
RuntimeError: Expected 4-dimensional input for 4-dimensional weight [1024, 64, 3, 3], but got input of size [32, 10] instead
This line works fine
self.conv = nn.Conv2d(3, 64, kernel_size=3, stride=2, padding=1, bias=False)
I introduced ResNet18
self.conv = ResNet18()
**ResNet Class**
'''ResNet in PyTorch.
For Pre-...
1 vote · 1 answer · 955 views
Modifying residual LSTM
I found some code for a residual LSTM here: https://gist.github.com/bzamecnik/8ed16e361a0a6e80e2a4a259222f101e
I have been using an LSTM for time-series classification with a 3D input (sample, timestep,...
0 votes · 1 answer · 3k views
Resnet18 first layer output dimensions
I am looking at the model implementation in PyTorch. The 1st layer is a convolutional layer with filter size = 7, stride = 2, pad = 3. The standard input size to the network is 224x224x3. Based on ...
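For reference, the standard output-size formula gives floor((224 + 2·3 − 7) / 2) + 1 = 112, so the first layer produces a 112×112×64 feature map; the following 3×3, stride-2 max-pool then brings it to 56×56×64.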
8 votes · 4 answers · 6k views
Does it make sense to build a residual network with only fully connected layers (instead of convolutional layers)?
Residual networks are always built with convolutional layers. I have never seen residual networks with only fully connected layers. Does it work to build a residual network with only fully connected ...
0 votes · 0 answers · 110 views
How to turn a deep learning network architecture into a diagram?
How do I draw a diagram of this deep learning network architecture?
I'm using Faster R-CNN: R50-FPN.
Any ideas or tips for converting this to a diagram?
For this I used the detectron2 framework with PyTorch. But ...
0 votes · 1 answer · 161 views
Could someone explain where BatchNorm is performed in the ResNeXt (https://github.com/facebookresearch/ResNeXt) neural network?
I was reading the original paper that describes ResNeXt (a variation of ResNet) at https://arxiv.org/pdf/1611.05431.pdf.
On page 5, top-right column, it says:
ReLU is performed right after each BN, ...
0 votes · 1 answer · 226 views
Residual Network : Operands could not be broadcast together with shapes (128, 128, 16) (126, 126, 16)
I am trying to code ResNet-12 in Keras based on this paper.
But I have an error in the 8th layer, and in my code below the problem is in the function Layer_Type3.
I cannot see where the problem is, ...
1 vote · 1 answer · 2k views
Good training accuracy but poor validation accuracy
I am trying to implement a residual network to classify images on the CIFAR-10 dataset for a project, and I have a working model whose accuracy grows logarithmically, but the validation ...
0 votes · 1 answer · 2k views
How to Add Layers together in a residual network [duplicate]
# import the necessary packages
import keras
from keras.initializers import glorot_uniform
from keras.layers import AveragePooling2D, Input, Add
from keras.models import Model
from keras.layers....
0 votes · 1 answer · 658 views
ValueError: A merge layer should be called on a list of inputs. Add()
# import the necessary packages
import keras
from keras.initializers import glorot_uniform
from keras.layers import AveragePooling2D, Input, Add
from keras.models import Model
from keras.layers....
1 vote · 0 answers · 3k views
Get tensorflow autograph warning when building model by calling self-defined module
I tried to build a model by calling a self-defined module in Google Colab, but I get an AutoGraph warning.
Folder structure :
drive/My Drive/Colab Notebooks/stackoverflow/q001/
├── train
│ └── train.ipynb ...
0 votes · 0 answers · 272 views
About Computational complexity of a residual network
While training a residual network, doesn't the residual layer increase the computational complexity of the network, with so many weights to train in the residual block?
0 votes · 1 answer · 117 views
Deep residual networks in Resnets
Are the weight matrices of the residual blocks already set to 0, or do we need to train the weight matrices of the residual block to be close to 0?
In what cases do we backpropagate through the weight ...
1 vote · 2 answers · 2k views
Inception-v1 vs Inception-Resnet-V1
Like the title says, is there any difference between the two? I think the original inception v1 model does not have Res Blocks, but maybe I'm wrong. Are they the same thing?
0 votes · 1 answer · 716 views
PyTorch method for conditional use of an intermediate layer instead of the final CNN layer output, i.e. allow the network to learn to use more or fewer layers
I'm implementing a residual CNN (a modified, smaller version of Xception) in a low-latency environment. I've done a lot of manual tuning to minimize the runtime of my network (reducing the number of ...
0 votes · 1 answer · 438 views
How to build a resnet with Keras that trains and predicts the subclass from the main class?
I would like to implement a hierarchical ResNet architecture. However, I could not find any solution for this. For example, my data structure is like:
class A
Subclass 1
Subclass 2
....
class B
...
0 votes · 0 answers · 53 views
Explain the meaning of MATLAB code involving filters in binary
Hi, I have sample code to convert to Python. Unfortunately I am very new to MATLAB. Could you please help me understand the code? I want to write it in Python. Any leads would be appreciated. ...
0 votes · 1 answer · 339 views
On Implementing Wide Resnet architecture
While I was working on implementing the Wide ResNet architecture, I had one main question regarding the calculation of N according to the Wide ResNet paper:
In their implementation I found that N is ...
3 votes · 1 answer · 7k views
How to add skip connection between convolutional layers in Keras
I would like to add a skip connection between residual blocks in Keras. This is my current implementation, which does not work because the tensors have different shapes.
The function looks like this:
...
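A minimal sketch of the usual workaround when the two tensors have different shapes, assuming 2D convolutions: project the shortcut with a 1×1 convolution (with matching stride and filter count) so that Add() receives identical shapes.
from tensorflow.keras import layers

def conv_block_with_skip(x, filters, stride=1):
    y = layers.Conv2D(filters, 3, strides=stride, padding="same")(x)
    y = layers.BatchNormalization()(y)
    y = layers.ReLU()(y)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    y = layers.BatchNormalization()(y)
    # 1x1 projection so the shortcut matches the main path in both channels and spatial size
    shortcut = layers.Conv2D(filters, 1, strides=stride, padding="same")(x)
    return layers.ReLU()(layers.Add()([y, shortcut]))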
-2 votes · 1 answer · 4k views
Why does my neural network never overfit?
I am training a deep residual network with 10 hidden layers on game data.
Does anyone have an idea why I don't get any overfitting here?
Training and test loss still decreasing after 100 epochs of ...
0 votes · 1 answer · 1k views
residual LSTM layers
I have trouble understanding tensor behaviour in LSTM layers in Keras.
I have preprocessed numeric data that looks like [samples, time steps, features]. So 10,000 samples, 24 time steps and 10 ...
0 votes · 1 answer · 64 views
My residual neural network is giving a very strange depth map as output. I don't know how to improve my model.
The output I am getting from my residual model is an image with little squares on it (a very low-resolution image), but it is supposed to give me a depth map. The objects in the image are lost ...
1 vote · 1 answer · 8k views
How to implement a 1D convolutional neural network with residual connections and batch-normalization in Keras?
I am trying to develop a 1D convolutional neural network with residual connections and batch-normalization based on the paper Cardiologist-Level Arrhythmia Detection with Convolutional Neural Networks,...
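A minimal sketch of one such 1D block with pre-activation ordering, assuming inputs of shape (timesteps, channels); the paper's exact filter counts, subsampling, and dropout are not reproduced here.
from tensorflow.keras import layers

def residual_block_1d(x, filters, kernel_size=16):  # kernel_size=16 is an illustrative choice
    y = layers.BatchNormalization()(x)
    y = layers.ReLU()(y)
    y = layers.Conv1D(filters, kernel_size, padding="same")(y)
    y = layers.BatchNormalization()(y)
    y = layers.ReLU()(y)
    y = layers.Conv1D(filters, kernel_size, padding="same")(y)
    # identity shortcut; a 1x1 Conv1D on x would be needed if `filters` differed from x's channels
    return layers.Add()([x, y])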
3 votes · 0 answers · 202 views
How to set batch-normalization op in inference mode without calling tf.layers.batch_normalization() ?
I define a deep CNN with TensorFlow, including a batch-normalization op, i.e., my code may look like this:
def network(input):
...
input = tf.layers.batch_normalization(input, ...)
...
...
1 vote · 0 answers · 662 views
combining DropoutWrapper and ResidualWrapper with variational_recurrent=True
I'm trying to create a MultiRNNCell of LSTM cells wrapped with both DropoutWrapper and ResidualWrapper. For using variational_recurrent=True, we must provide the input_size parameter to DropoutWrapper. I'...
1 vote · 1 answer · 1k views
Why does each block in a deep residual network have two convolutional layers instead of one?
This figure shows a basic block of a residual network. Why does it have two convolutional layers? What will happen if it has only one convolutional layer?
0 votes · 1 answer · 559 views
Keras residual connection on only part of input
I already made a model without a residual connection which compiles and fits without any errors [using the Keras Sequential API].
I wish to test a modified version that just adds a residual connection, like in ...
0 votes · 1 answer · 388 views
Implementing residual block
In the traditional residual block, is the "addition" of layer N to the output of layer N+2 (prior to non-linearity) element-wise addition or concatenation?
The literature indicates something like ...
1 vote · 1 answer · 300 views
Can Residual Nets skip one linearity instead of two?
The standard in ResNets is to skip 2 linearities.
Would skipping only one work as well?
8 votes · 3 answers · 3k views
Tensorflow: How to set the learning rate in log scale and some Tensorflow questions
I am a deep learning and TensorFlow beginner and I am trying to implement the algorithm in this paper using TensorFlow. The paper uses MatConvNet + MATLAB to implement it, and I am curious if ...
3 votes · 1 answer · 4k views
Residual learning in tensorflow
I am attempting to replicate this image from a research paper. In the image, the orange arrow indicates a shortcut using residual learning and the layer outlined in red indicates a dilated convolution....
8 votes · 2 answers · 5k views
Residual Neural Network: Concatenation or Element Addition?
With the residual block in residual neural networks, is the addition at the end of the block true element addition or is it concatenation?
For example, would addition([1, 2], [3, 4]) produce [1, 2, 3,...
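In the standard ResNet formulation the merge is true element-wise addition (concatenation is the DenseNet-style alternative), so for the example in the question:
import numpy as np

a, b = np.array([1, 2]), np.array([3, 4])
print(a + b)                   # element-wise addition: [4 6]
print(np.concatenate([a, b]))  # concatenation: [1 2 3 4]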
3 votes · 1 answer · 1k views
Clarification on NN residual layer back-prop derivation
I've looked everywhere and can't find anything that explains the actual derivation of backprop for residual layers. Here's my best attempt and where I'm stuck. It is worth mentioning that the ...
13 votes · 3 answers · 10k views
What is "linear projection" in convolutional neural network [closed]
I am reading through Residual learning, and I have a question.
What is "linear projection" mentioned in 3.2? Looks pretty simple once got this but could not get the idea...
Can someone ...
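For context, the paper's shortcut becomes y = F(x, {W_i}) + W_s·x when the dimensions differ, and W_s is usually realised as a 1×1 convolution; a minimal PyTorch sketch with illustrative channel counts:
import torch.nn as nn

# linear projection W_s: match channels (64 -> 128) and halve the spatial size on the shortcut
projection = nn.Sequential(
    nn.Conv2d(64, 128, kernel_size=1, stride=2, bias=False),
    nn.BatchNorm2d(128),
)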