A wide range of activation functions could be used and explored in depth, but for simplicity LeakyReLU is used for the hidden layer [PyTorch LeakyReLU](https://pytorch.org/docs/stable/generated/torch.nn.LeakyReLU.html). The input and output layers are plain linear (fully connected) layers [PyTorch Linear](https://pytorch.org/docs/stable/generated/torch.nn.Linear.html), and LogSoftmax is applied to the output to produce log-probabilities [PyTorch LogSoftmax](https://pytorch.org/docs/stable/generated/torch.nn.LogSoftmax.html).
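A minimal sketch of such a network is shown below; the layer sizes (`input_dim`, `hidden_dim`, `num_classes`) are placeholder values, not the ones used in this repository.

```python
import torch
from torch import nn

# Placeholder dimensions for illustration only
input_dim, hidden_dim, num_classes = 784, 128, 10

model = nn.Sequential(
    nn.Linear(input_dim, hidden_dim),    # input layer (linear)
    nn.LeakyReLU(),                      # hidden-layer activation
    nn.Linear(hidden_dim, num_classes),  # output layer (linear)
    nn.LogSoftmax(dim=1),                # log-probabilities, paired with NLLLoss
)
```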
Similarly, many loss functions could be used to compute the loss, but again for simplicity NLLLoss, i.e. Negative Log Likelihood Loss, is used [PyTorch NLLLoss](https://pytorch.org/docs/stable/generated/torch.nn.NLLLoss.html).
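NLLLoss expects log-probabilities, which is why LogSoftmax is applied at the output above. A short usage sketch with a dummy batch (batch size and data are made up for illustration):

```python
criterion = nn.NLLLoss()

x = torch.randn(32, input_dim)                    # dummy batch of 32 samples
targets = torch.randint(0, num_classes, (32,))    # dummy class labels

log_probs = model(x)                              # shape: (32, num_classes)
loss = criterion(log_probs, targets)
loss.backward()
```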