0 votes
1 answer
60 views

I tried to implement a residual-connection neural network and recreate LeNet-5, but I can't set up the architecture. Here is the residual connection block: class ResidualBlock(torch.nn.Module): def __init__(...
2Razor007's user avatar
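
A minimal sketch of the kind of block the question above is after, assuming the plain identity-shortcut formulation from the ResNet paper; the channel count is illustrative, not taken from the asker's code:

    import torch

    class ResidualBlock(torch.nn.Module):
        """Two 3x3 convolutions with an identity skip connection (illustrative sizes)."""
        def __init__(self, channels):
            super().__init__()
            self.conv1 = torch.nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
            self.bn1 = torch.nn.BatchNorm2d(channels)
            self.conv2 = torch.nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
            self.bn2 = torch.nn.BatchNorm2d(channels)

        def forward(self, x):
            identity = x                              # saved for the skip connection
            out = torch.relu(self.bn1(self.conv1(x)))
            out = self.bn2(self.conv2(out))
            return torch.relu(out + identity)         # element-wise addition, shapes must match

    block = ResidualBlock(16)
    print(block(torch.randn(1, 16, 32, 32)).shape)    # torch.Size([1, 16, 32, 32])

Because the addition requires matching shapes, dropping such a block into a LeNet-5-style network only works at points where the feature map size and channel count are unchanged.
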
0 votes
1 answer
582 views

Dear scientists and researchers, greetings. I am trying to write a research article on morbidity, which is categorical, i.e., yes or no, and I also consider fixed and random effects, given that all of the ...
Sofonias Derso's user avatar
0 votes
0 answers
752 views

Does anyone know how to troubleshoot deep learning model training through the gradient norm? I am reproducing work from a research paper but I'm not getting the same results as theirs. I am training a model of 16 ...
Repo1's user avatar
  • 11
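
One common way to troubleshoot via the gradient norm, as asked above, is to log the total L2 norm of all parameter gradients after each backward pass. A hedged PyTorch sketch (the tiny model, loss, and data here are placeholders, not the asker's 16-layer setup):

    import torch

    model = torch.nn.Sequential(torch.nn.Linear(10, 32), torch.nn.ReLU(), torch.nn.Linear(32, 1))
    criterion = torch.nn.MSELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    x, y = torch.randn(8, 10), torch.randn(8, 1)      # dummy batch
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()

    # Total L2 norm over all parameter gradients.
    total_norm = sum(p.grad.detach().norm(2).item() ** 2
                     for p in model.parameters() if p.grad is not None) ** 0.5
    print(f"grad norm: {total_norm:.4f}")
    optimizer.step()

torch.nn.utils.clip_grad_norm_ also returns the pre-clipping total norm, so it can double as a logging hook if gradient clipping is used anyway.
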
-1 votes
1 answer
284 views

According to the link below, the writer has implemented image segmentation with VGG16 as the encoder and a random forest as the classifier. https://github.com/bnsreenu/python_for_microscopists/...
ali ali's user avatar
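
For reference, the general pattern such tutorials follow is: a pretrained VGG16 without its classification head extracts features, and a scikit-learn random forest is trained on those features. A hedged sketch of that recipe (the image size, random data, and per-image flattening are my own simplifications, not the linked author's exact code, which works per pixel for segmentation):

    import numpy as np
    from tensorflow.keras.applications import VGG16
    from sklearn.ensemble import RandomForestClassifier

    encoder = VGG16(weights="imagenet", include_top=False, input_shape=(128, 128, 3))
    encoder.trainable = False                          # frozen feature extractor

    X = np.random.rand(20, 128, 128, 3).astype("float32")   # placeholder images
    y = np.random.randint(0, 2, size=20)                     # placeholder labels

    features = encoder.predict(X)                      # (20, 4, 4, 512) feature maps
    features = features.reshape(len(X), -1)            # one flat feature vector per image

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(features, y)
    print(clf.predict(features[:5]))
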
1 vote
1 answer
134 views

Here's my code: import os os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2" import tensorflow as tf from tensorflow import keras from keras import layers from keras.datasets import cifar10 ...
James Newton's user avatar
1 vote
0 answers
97 views

I'm training a model which includes a batch normalization layer, but I noticed that the accuracy can fluctuate widely (from 55% to 31% in just one epoch), both train accuracy and test accuracy, so I ...
QZero's user avatar
  • 43
0 votes
1 answer
72 views

I just started to move from PyTorch to TensorFlow, and I have some problems designing the residual blocks. I have a residual group which contains a number of residual blocks, and each block ...
jessie's user avatar
  • 1
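
For the PyTorch-to-TensorFlow question above, the closest Keras analogue of a subclassed block is a tf.keras.layers.Layer subclass; a minimal sketch with illustrative channel counts (not the asker's architecture):

    import tensorflow as tf

    class ResidualBlock(tf.keras.layers.Layer):
        """3x3 conv -> BN -> ReLU -> 3x3 conv -> BN, plus an identity shortcut."""
        def __init__(self, filters, **kwargs):
            super().__init__(**kwargs)
            self.conv1 = tf.keras.layers.Conv2D(filters, 3, padding="same", use_bias=False)
            self.bn1 = tf.keras.layers.BatchNormalization()
            self.conv2 = tf.keras.layers.Conv2D(filters, 3, padding="same", use_bias=False)
            self.bn2 = tf.keras.layers.BatchNormalization()

        def call(self, x, training=False):
            shortcut = x
            out = tf.nn.relu(self.bn1(self.conv1(x), training=training))
            out = self.bn2(self.conv2(out), training=training)
            return tf.nn.relu(out + shortcut)

    class ResidualGroup(tf.keras.layers.Layer):
        """A stack of residual blocks, mirroring the 'residual group' in the question."""
        def __init__(self, filters, num_blocks, **kwargs):
            super().__init__(**kwargs)
            self.blocks = [ResidualBlock(filters) for _ in range(num_blocks)]

        def call(self, x, training=False):
            for block in self.blocks:
                x = block(x, training=training)
            return x

    y = ResidualGroup(64, num_blocks=4)(tf.random.normal([1, 32, 32, 64]))
    print(y.shape)                                     # (1, 32, 32, 64)
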
1 vote
0 answers
259 views

I came across this Keras tutorial which is for a classification problem, but I'm trying to apply it to a regression one. Does anyone know why they change the dimension of the features? Has anybody ...
Caterina's user avatar
  • 1,117
0 votes
1 answer
891 views

I am trying to train a ResNet for 32x32 images, and I came upon a tutorial: https://towardsdatascience.com/resnets-for-cifar-10-e63e900524e0, which applies to CIFAR-10 (a 32x32 image dataset), but I don'...
Bobby Joe's user avatar
1 vote
0 answers
64 views

Please, I need Python code for the implementation of a residual unit with ShakeDrop depth regularization. Here is the design: [design architecture figure]
Tiwalade Modupe Usman's user avatar
1 vote
1 answer
94 views

I am trying to train the model below with the Indian Pines dataset, but I get the following error. Model: def ResNet50(input_shape, classes=16): # Define the input as a tensor with shape ...
Eduard's user avatar
  • 11
1 vote
1 answer
4k views

My keras code is throwing this error: 2021-03-01 08:31:47.267964: W tensorflow/stream_executor/platform/default/dso_loader.cc:60] Could not load dynamic library 'cudart64_110.dll'; dlerror: ...
Chris's user avatar
  • 23
0 votes
1 answer
1k views

I am trying to build a classifier based on a dense network with Keras. My inputs are (26,1) vectors and I want to get a binary classification, 1 or 0, as output. Using a Dense network and some ...
johnnyp's user avatar
  • 99
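
A minimal sketch of a dense binary classifier for (26, 1) inputs in Keras, assuming the trailing dimension can simply be flattened; the layer widths and the random data are illustrative:

    import numpy as np
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(26, 1)),
        tf.keras.layers.Flatten(),                      # (26, 1) -> (26,)
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid")  # probability of class 1
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    X = np.random.rand(100, 26, 1).astype("float32")    # placeholder data
    y = np.random.randint(0, 2, size=(100,))
    model.fit(X, y, epochs=2, batch_size=16, verbose=0)
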
0 votes
1 answer
181 views

I am working on the Edmonds-Karp algorithm. On normal networks I know how to use it, but I am unsure how to use the algorithm on a residual graph, since there are back-edges. ...
none_kak's user avatar
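
For the Edmonds-Karp question above: the residual graph is not a separate input, it is the bookkeeping the algorithm maintains, and the back-edges are simply residual capacities in the reverse direction. A small sketch on an adjacency-matrix capacity table (the example graph is made up):

    from collections import deque

    def edmonds_karp(capacity, source, sink):
        """Max flow via BFS augmenting paths on the residual graph (capacity: n x n matrix)."""
        n = len(capacity)
        residual = [row[:] for row in capacity]     # residual capacities, updated in place
        max_flow = 0
        while True:
            # BFS over edges with positive residual capacity (this includes back-edges).
            parent = [-1] * n
            parent[source] = source
            queue = deque([source])
            while queue and parent[sink] == -1:
                u = queue.popleft()
                for v in range(n):
                    if parent[v] == -1 and residual[u][v] > 0:
                        parent[v] = u
                        queue.append(v)
            if parent[sink] == -1:                  # no augmenting path left: done
                return max_flow
            # Bottleneck along the path, then update forward edges and back-edges.
            bottleneck = float("inf")
            v = sink
            while v != source:
                bottleneck = min(bottleneck, residual[parent[v]][v])
                v = parent[v]
            v = sink
            while v != source:
                residual[parent[v]][v] -= bottleneck   # forward edge loses capacity
                residual[v][parent[v]] += bottleneck   # back-edge gains capacity
                v = parent[v]
            max_flow += bottleneck

    cap = [[0, 3, 5, 0],
           [0, 0, 2, 3],
           [0, 0, 0, 4],
           [0, 0, 0, 0]]
    print(edmonds_karp(cap, 0, 3))                  # 7
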
0 votes
1 answer
426 views

I'm implementing a modified ResNet architecture. In the Basic Block of ResNet, I've used the Conv layer in a shortcut connection. So my main path consists of two Conv layers, each followed by Batch ...
Quamer Nasim's user avatar
12 votes
2 answers
24k views

So, I've read about half the original ResNet paper, and am trying to figure out how to make my version for tabular data. I've read a few blog posts on how it works in PyTorch, and I see heavy use of ...
rocksNwaves's user avatar
  • 6,342
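
For tabular data, as in the question above, the convolutional layers can be swapped for fully connected ones while keeping the same skip-connection pattern. A hedged PyTorch sketch (the widths and the final regression head are made up for illustration):

    import torch

    class DenseResidualBlock(torch.nn.Module):
        """Linear -> BN -> ReLU -> Linear -> BN, plus identity skip (same width in and out)."""
        def __init__(self, width):
            super().__init__()
            self.fc1 = torch.nn.Linear(width, width)
            self.bn1 = torch.nn.BatchNorm1d(width)
            self.fc2 = torch.nn.Linear(width, width)
            self.bn2 = torch.nn.BatchNorm1d(width)

        def forward(self, x):
            out = torch.relu(self.bn1(self.fc1(x)))
            out = self.bn2(self.fc2(out))
            return torch.relu(out + x)              # skip connection

    model = torch.nn.Sequential(
        torch.nn.Linear(20, 64),                    # project raw features to the block width
        DenseResidualBlock(64),
        DenseResidualBlock(64),
        torch.nn.Linear(64, 1),                     # e.g. one regression target
    )
    print(model(torch.randn(8, 20)).shape)          # torch.Size([8, 1])
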
0 votes
1 answer
1k views

This line works fine: self.conv = nn.Conv2d(3, 64, kernel_size=3, stride=2, padding=1, bias=False). Then I introduced ResNet18: self.conv = ResNet18(). ResNet class: '''ResNet in PyTorch. For Pre-...
Khawar Islam's user avatar
  • 2,984
1 vote
1 answer
955 views

I found some code for a residual LSTM here: https://gist.github.com/bzamecnik/8ed16e361a0a6e80e2a4a259222f101e I have been using an LSTM for time-series classification with a 3D input (sample, timestep, ...
Jim Björklund's user avatar
0 votes
1 answer
3k views

I am looking at the model implementation in PyTorch. The 1st layer is a convolutional layer with filter size = 7, stride = 2, pad = 3. The standard input size to the network is 224x224x3. Based on ...
Sahil Gupta's user avatar
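
For the question above, the spatial output size per dimension is floor((in + 2*pad - kernel) / stride) + 1, so the 7x7/stride-2/pad-3 layer maps 224 to 112. A quick check:

    def conv_out_size(size, kernel, stride, pad):
        """Spatial output size of a convolution along one dimension."""
        return (size + 2 * pad - kernel) // stride + 1

    # First ResNet layer: 7x7 conv, stride 2, padding 3, on a 224x224 input.
    print(conv_out_size(224, kernel=7, stride=2, pad=3))   # 112
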
8 votes
4 answers
6k views

Residual networks are always built with convolutional layers. I have never seen residual networks with only fully connected layers. Does it work to build a residual network with only fully connected ...
rxxcow's user avatar
  • 91
0 votes
0 answers
110 views

How do I draw this deep learning network architecture diagram? I'm using Faster R-CNN: R50-FPN. Any ideas or tips for converting this to a diagram? For this I used the detectron2 framework with PyTorch. But ...
Sebastián's user avatar
0 votes
1 answer
161 views

I was reading the original paper that described ResNeXt (a variation of ResNet) at https://arxiv.org/pdf/1611.05431.pdf. On page 5, top right column, it says: ReLU is performed right after each BN, ...
Joe Black's user avatar
  • 667
0 votes
1 answer
226 views

I am trying to code ResNet-12 in Keras based on this paper. But I have an error in the 8th layer, and in my code below the problem is in the function Layer_Type3. I cannot see where the problem is, ...
hamzaca's user avatar
  • 11
1 vote
1 answer
2k views

I am trying to implement a residual network to classify images on the CIFAR10 dataset for a project, and I have a working model whose accuracy grows logarithmically, but a validation ...
Riley K's user avatar
  • 403
0 votes
1 answer
2k views

# import the necessary packages import keras from keras.initializers import glorot_uniform from keras.layers import AveragePooling2D, Input, Add from keras.models import Model from keras.layers....
roma972012's user avatar
0 votes
1 answer
658 views

# import the necessary packages import keras from keras.initializers import glorot_uniform from keras.layers import AveragePooling2D, Input, Add from keras.models import Model from keras.layers....
roma972012's user avatar
1 vote
0 answers
3k views

I tried to build a model by calling a self-defined module in Google Colab, but I get an AutoGraph warning. Folder structure: drive/My Drive/Colab Notebooks/stackoverflow/q001/ ├── train │ └── train.ipynb ...
Paul's user avatar
  • 43
0 votes
0 answers
272 views

While training a residual network, doesn't the residual layer increase the computational complexity of the network, with so many weights to train in the residual block?
Dude's user avatar
  • 21
0 votes
1 answer
117 views

Are the weight matrices of the residual blocks already set to 0, or do we need to train the weight matrices of the residual block to be close to 0? In what cases do we backpropagate through the weight ...
Dude's user avatar
  • 21
1 vote
2 answers
2k views

Like the title says, is there any difference between the two? I think the original inception v1 model does not have Res Blocks, but maybe I'm wrong. Are they the same thing?
katiex7's user avatar
  • 913
0 votes
1 answer
716 views

I'm implementing a residual CNN (a modified, smaller version of Xception) in a low-latency environment. I've done a lot of manual tuning to minimize the runtime of my network (reducing the number of ...
user3029296's user avatar
0 votes
1 answer
438 views

I would like to implement a hierarchical ResNet architecture. However, I could not find any solution for this. For example, my data structure is like: class A Subclass 1 Subclass 2 .... class B ...
TheJokerAEZ's user avatar
0 votes
0 answers
53 views

Hi, I have some sample code to convert to Python. Unfortunately I am very new to MATLAB. Could you please help me understand the code? I want to write it in Python. Any leads would be appreciated. ...
codeprb's user avatar
  • 27
0 votes
1 answer
339 views

While I was working on implementing the Wide ResNet architecture, I had one main question regarding the calculation of N according to the Wide ResNet paper: in their implementation I found that N is ...
I. A's user avatar
  • 2,322
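
If I read the Wide ResNet paper's bookkeeping correctly (worth double-checking against the paper, so treat this as an assumption), the total depth of a basic-block WRN is 6*N + 4: three groups of N blocks with two 3x3 convolutions each, plus four other convolutional layers. N is then recovered from the depth as:

    def wrn_blocks_per_group(depth):
        """Blocks per group N for a basic-block Wide ResNet of the given total depth."""
        assert (depth - 4) % 6 == 0, "depth should be of the form 6*N + 4"
        return (depth - 4) // 6

    print(wrn_blocks_per_group(28))   # 4, e.g. WRN-28-10
    print(wrn_blocks_per_group(16))   # 2, e.g. WRN-16-8
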
3 votes
1 answer
7k views

I would like to add a skip connection between residual blocks in keras. This is my current implementation, which does not work because the tensors have different shapes. The function looks like this: ...
atlas's user avatar
  • 411
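
When the tensors being added have different shapes, the usual fix (the projection shortcut from the ResNet paper) is to pass the shortcut through a 1x1 convolution with matching filters and stride before the Add. A hedged Keras sketch with illustrative shapes, not the asker's:

    import tensorflow as tf
    from tensorflow.keras import layers

    inputs = layers.Input(shape=(32, 32, 64))

    # Main path: downsamples and doubles the channels.
    x = layers.Conv2D(128, 3, strides=2, padding="same", use_bias=False)(inputs)
    x = layers.BatchNormalization()(x)
    x = layers.ReLU()(x)
    x = layers.Conv2D(128, 3, padding="same", use_bias=False)(x)
    x = layers.BatchNormalization()(x)

    # Projection shortcut: 1x1 conv with the same stride and filter count so shapes match.
    shortcut = layers.Conv2D(128, 1, strides=2, padding="same", use_bias=False)(inputs)
    shortcut = layers.BatchNormalization()(shortcut)

    out = layers.ReLU()(layers.Add()([x, shortcut]))
    model = tf.keras.Model(inputs, out)
    model.summary()
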
-2 votes
1 answer
4k views

I am training a deep residual network with 10 hidden layers on game data. Does anyone have an idea why I don't get any overfitting here? Training and test loss are still decreasing after 100 epochs of ...
Dookie's user avatar
  • 11
0 votes
1 answer
1k views

I have trouble understanding tensor behaviour in LSTM layers in Keras. I have preprocessed numeric data that looks like [samples, time steps, features]: 10,000 samples, 24 time steps and 10 ...
JacobJacox's user avatar
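
For the shape question above: Keras LSTM layers expect (samples, time steps, features), and return_sequences decides whether the time axis survives to the next layer. A tiny sketch using the stated 24 time steps and 10 features (the layer sizes are illustrative):

    import tensorflow as tf
    from tensorflow.keras import layers

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(24, 10)),            # (time steps, features); samples are implicit
        layers.LSTM(32, return_sequences=True),    # output (batch, 24, 32): one vector per step
        layers.LSTM(16),                           # output (batch, 16): last step only
        layers.Dense(1),
    ])
    model.summary()
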
0 votes
1 answer
64 views

The output I am getting from my residual model is an image with small squares on it (a very low-resolution image), but it is supposed to give me a depth map. The objects in the image are lost ...
Vishesh Breja's user avatar
1 vote
1 answer
8k views

I am trying to develop a 1D convolutional neural network with residual connections and batch-normalization based on the paper Cardiologist-Level Arrhythmia Detection with Convolutional Neural Networks,...
JPM's user avatar
  • 489
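
A minimal 1D residual block in the spirit of that paper, hedged: this is a generic Conv1D/BatchNorm block with an identity shortcut, not the paper's exact layer counts, dropout, or subsampling shortcuts:

    import tensorflow as tf
    from tensorflow.keras import layers

    def residual_block_1d(x, filters, kernel_size=16):
        """Conv1D -> BN -> ReLU -> Conv1D -> BN, plus identity shortcut (channels must match)."""
        shortcut = x
        y = layers.Conv1D(filters, kernel_size, padding="same", use_bias=False)(x)
        y = layers.BatchNormalization()(y)
        y = layers.ReLU()(y)
        y = layers.Conv1D(filters, kernel_size, padding="same", use_bias=False)(y)
        y = layers.BatchNormalization()(y)
        return layers.ReLU()(layers.Add()([y, shortcut]))

    inputs = layers.Input(shape=(4096, 1))             # illustrative single-channel signal length
    x = layers.Conv1D(32, 16, padding="same")(inputs)  # bring channels up to the block width
    x = residual_block_1d(x, 32)
    x = residual_block_1d(x, 32)
    x = layers.GlobalAveragePooling1D()(x)
    outputs = layers.Dense(4, activation="softmax")(x) # illustrative number of classes
    model = tf.keras.Model(inputs, outputs)
    model.summary()
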
3 votes
0 answers
202 views

I define a deep CNN with TensorFlow, including a batch-normalization op, i.e., my code may look like this: def network(input): ... input = tf.layers.batch_normalization(input, ...) ... ...
Arsenal591's user avatar
  • 1,626
1 vote
0 answers
662 views

I'm trying to create a MultiRNNCell of LSTM cells wrapped with both DropoutWrapper and ResidualWrapper. To use variational_recurrent=True, we must provide the input_size parameter to DropoutWrapper. I'...
devin's user avatar
  • 1,120
1 vote
1 answer
1k views

This figure shows a basic block of a residual network. Why does it have two convolutional layers? What will happen when it has only one convolutional layer?
user570593's user avatar
  • 3,560
0 votes
1 answer
559 views

I have already made a model without a residual connection which compiles and fits without any errors [using the Keras Sequential API]. I wish to test a modified version, just adding a residual connection like in ...
AlexDtd's user avatar
  • 21
0 votes
1 answer
388 views

In the traditional residual block, is the "addition" of layer N to the output of layer N+2 (prior to non-linearity) element-wise addition or concatenation? The literature indicates something like ...
rodrigo-silveira's user avatar
1 vote
1 answer
300 views

Standard in ResNets is to skip 2 linearities. Would skipping only one work as well?
Seguy's user avatar
  • 398
8 votes
3 answers
3k views

I am a deep learning and TensorFlow beginner and I am trying to implement the algorithm in this paper using TensorFlow. The paper uses MatConvNet + MATLAB to implement it, and I am curious if ...
chesschi's user avatar
  • 708
3 votes
1 answer
4k views

I am attempting to replicate this image from a research paper. In the image, the orange arrow indicates a shortcut using residual learning and the layer outlined in red indicates a dilated convolution....
Devin Haslam's user avatar
8 votes
2 answers
5k views

With the residual block in residual neural networks, is the addition at the end of the block true element addition or is it concatenation? For example, would addition([1, 2], [3, 4]) produce [1, 2, 3,...
C. R.'s user avatar
  • 87
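
It is true element-wise addition in residual blocks, so the shapes must match: addition([1, 2], [3, 4]) gives [4, 6], while concatenation (the DenseNet-style alternative) gives [1, 2, 3, 4]. A quick check, using numpy for illustration:

    import numpy as np

    a, b = np.array([1, 2]), np.array([3, 4])
    print(a + b)                    # [4 6]      element-wise addition, as in ResNet blocks
    print(np.concatenate([a, b]))   # [1 2 3 4]  concatenation, as in DenseNet-style blocks
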
3 votes
1 answer
1k views

I've looked everywhere and can't find anything that explains the actual derivation of backprop for residual layers. Here's my best attempt and where I'm stuck. It is worth mentioning that the ...
Jacob Statnekov's user avatar
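
For the derivation asked about above, the core step is just the chain rule through y = x + F(x, W); writing it out (notation assumed, not taken from the asker's attempt):

    y = x + F(x, W), \qquad
    \frac{\partial L}{\partial x}
      = \frac{\partial L}{\partial y}\,\frac{\partial y}{\partial x}
      = \frac{\partial L}{\partial y}\left(I + \frac{\partial F(x, W)}{\partial x}\right), \qquad
    \frac{\partial L}{\partial W}
      = \frac{\partial L}{\partial y}\,\frac{\partial F(x, W)}{\partial W}.

The identity term I is the reason the upstream gradient always reaches the block input intact, even when the Jacobian of the residual branch is small.
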
13 votes
3 answers
10k views

I am reading through the residual learning paper, and I have a question: what is the "linear projection" mentioned in section 3.2? It probably looks simple once you get it, but I could not grasp the idea... Can someone ...
Troy's user avatar
  • 125
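
In section 3.2 of the ResNet paper, the linear projection is the W_s in y = F(x, {W_i}) + W_s x: a learned linear map applied to the shortcut so its shape matches the block output, in practice a (possibly strided) 1x1 convolution. A hedged PyTorch sketch of that kind of shortcut:

    import torch

    class ProjectionBlock(torch.nn.Module):
        """Residual block whose shortcut is a linear projection (1x1 conv) when shapes change."""
        def __init__(self, in_ch, out_ch, stride=1):
            super().__init__()
            self.conv1 = torch.nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
            self.bn1 = torch.nn.BatchNorm2d(out_ch)
            self.conv2 = torch.nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False)
            self.bn2 = torch.nn.BatchNorm2d(out_ch)
            if stride != 1 or in_ch != out_ch:
                # W_s: 1x1 convolution matching the main path's stride and channel count.
                self.shortcut = torch.nn.Sequential(
                    torch.nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False),
                    torch.nn.BatchNorm2d(out_ch),
                )
            else:
                self.shortcut = torch.nn.Identity()  # plain identity shortcut otherwise

        def forward(self, x):
            out = torch.relu(self.bn1(self.conv1(x)))
            out = self.bn2(self.conv2(out))
            return torch.relu(out + self.shortcut(x))

    print(ProjectionBlock(64, 128, stride=2)(torch.randn(1, 64, 32, 32)).shape)  # [1, 128, 16, 16]
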