neural-network-in-cuda-sgupta38

Fully Connected Neural Network

This project achieves 90% accuracy on MNIST. The expensive functions 'forward', 'backprop', and 'update_weights' are executed on the GPU. MNIST images are used for training and testing.
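The kernel code itself lives in cuda_functions.cu; as a rough illustration only (hypothetical names and memory layout, not the repository's actual signatures), a fully connected forward pass on the GPU can be sketched with one thread per output neuron:

    // Illustrative sketch only; names and weight layout are hypothetical.
    // One thread computes one output neuron:
    //   out[j] = relu(bias[j] + sum_i in[i] * W[j*n_in + i])
    __global__ void fc_forward(const float* in, const float* weights,
                               const float* bias, float* out,
                               int n_in, int n_out)
    {
        int j = blockIdx.x * blockDim.x + threadIdx.x;
        if (j >= n_out) return;

        float acc = bias[j];
        for (int i = 0; i < n_in; ++i)
            acc += in[i] * weights[j * n_in + i];

        out[j] = acc > 0.0f ? acc : 0.0f; // ReLU activation
    }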

The following are some of the important files:

  • main.cu --> Entry point; this is where the layers are configured and hyperparameters such as the number of neurons and the dropout rate are set.
  • cuda_functions.cu --> GPU kernels for the forward pass, backpropagation, and weight updates.
  • CFullyConnectedLayer.cu --> Hidden-layer implementation that invokes the CUDA kernels.

How to RUN?

> ./fnn
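No build step is documented; assuming a standard CUDA toolchain, an nvcc invocation along these lines (file names taken from the list above, binary name from the run command) should produce the fnn executable:

> nvcc -o fnn main.cu cuda_functions.cu CFullyConnectedLayer.cu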

Architecture:

Layers

Input Layer

The input layer is a 28×28×1 grayscale image, flattened into a 784-element vector before being fed to the fully connected layers.

Fully Connected Hidden Layer

This is a fully connected layer with ReLU activation. It consists of 1024 neurons with a dropout rate of 0.4: during training, any given neuron has a 0.4 probability of “dropping out”, i.e. its output is set to 0 regardless of its inputs.
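As a hedged illustration (hypothetical kernel; the repository's actual dropout code may differ), training-time dropout can be implemented on the GPU by zeroing each activation with probability p. In the common “inverted dropout” variant sketched here, surviving activations are also scaled by 1/(1-p) so their expected value is unchanged:

    #include <curand_kernel.h>

    // Illustrative inverted-dropout kernel (hypothetical, not the repo's code).
    // Each activation is zeroed with probability p; survivors are scaled by
    // 1/(1-p) so the expected activation stays the same.
    __global__ void dropout_forward(float* act, int n, float p,
                                    unsigned long long seed)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        curandState state;
        curand_init(seed, i, 0, &state); // one RNG stream per thread

        if (curand_uniform(&state) < p)
            act[i] = 0.0f;               // dropped neuron
        else
            act[i] *= 1.0f / (1.0f - p); // kept and rescaled
    }

For this network, p = 0.4 and n = 1024.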

Output Layer

This is the final classification layer. It is a fully connected layer consisting of 10 neurons, one for each class. It computes a softmax over these 10 scores to produce a probability distribution over the digit classes.
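The softmax over the 10 logits z_0..z_9 is softmax(z_j) = exp(z_j - m) / sum_k exp(z_k - m), where m = max_k z_k; subtracting the maximum avoids overflow in expf. A minimal single-thread sketch (hypothetical; with only 10 outputs one thread, or even the host, suffices):

    // Illustrative numerically stable softmax (hypothetical; launch as
    // softmax<<<1, 1>>>(logits, probs, 10) since n_out is tiny).
    __global__ void softmax(const float* logits, float* probs, int n_out)
    {
        // Subtract the max logit before exponentiating to avoid overflow.
        float m = logits[0];
        for (int k = 1; k < n_out; ++k)
            m = fmaxf(m, logits[k]);

        float sum = 0.0f;
        for (int k = 0; k < n_out; ++k) {
            probs[k] = expf(logits[k] - m);
            sum += probs[k];
        }
        for (int k = 0; k < n_out; ++k)
            probs[k] /= sum;
    }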
