I have the following output of my neural network algorithm:
Training accuracy: 0.875817
Test accuracy: 0.659091
Training error: 0.632095
Testing error: 0.43051
I can raise my training accuracy by adding more hidden units and layers, but my training error stays very high at about 0.63. No matter how long (or how many times) I train, the training error stays that high.
First, what do 0.63 and 0.43 actually mean? Do they mean 63% and 43% error, respectively? Second, are these numbers related in any way to the accuracy rates?
FYI, by training/testing error I mean the training and test cost (the value of the cost function).
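To make the question concrete, here is a minimal NumPy sketch of how I understand the two quantities are computed; it assumes a binary cross-entropy cost, which may or may not match my actual setup, and uses made-up labels and probabilities. It shows that predictions with the same accuracy can still give very different cost values, which is why I'm unsure how the two numbers relate:

```python
import numpy as np

# Toy data: 4 binary labels (purely illustrative, not from my network)
y_true = np.array([1, 0, 1, 1])

# Two sets of predicted probabilities that round to the SAME class predictions
p_confident = np.array([0.95, 0.05, 0.90, 0.85])  # confident predictions
p_unsure    = np.array([0.55, 0.45, 0.60, 0.52])  # barely-over-the-line predictions

def accuracy(y, p):
    # Fraction of examples where the thresholded prediction matches the label
    return np.mean((p >= 0.5) == y)

def cross_entropy_cost(y, p):
    # Mean binary cross-entropy (what I believe "cost" refers to here)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

for name, p in [("confident", p_confident), ("unsure", p_unsure)]:
    print(f"{name}: accuracy={accuracy(y_true, p):.2f}, "
          f"cost={cross_entropy_cost(y_true, p):.3f}")
# Both cases have accuracy 1.00, yet the cost is ~0.09 vs ~0.59,
# so the cost is clearly not "percent of examples misclassified".
```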