This paper introduces an approach for fast training of feedforward neural networks (FNNs). The approach linearizes the nonlinear output activation function by inverting it and computes the weights using gradient-descent and QR decomposition techniques; it is called the Gradient-descent Orthogonalized Training (GOT) algorithm. GOT is applied to several benchmark problems and the results are compared with those obtained using the Error Back-Propagation (EBP) algorithm.
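The core linearization idea can be illustrated with a minimal sketch: inverting a sigmoid output activation turns the nonlinear output-layer fit into a linear least-squares problem, which QR decomposition solves directly. This is an illustrative assumption about one step of the method (a single output layer with sigmoid activation and fixed hidden activations), not the full GOT algorithm, which also uses gradient descent.

```python
import numpy as np

def fit_output_weights(H, T, eps=1e-6):
    """Fit output-layer weights by linearizing a sigmoid output.

    H : (n_samples, n_hidden) hidden-layer activations (assumed fixed)
    T : (n_samples, n_outputs) targets in (0, 1)

    Applying the inverse sigmoid (logit) to the targets reduces the
    nonlinear fit to linear least-squares, solved here via QR.
    """
    Tc = np.clip(T, eps, 1 - eps)        # keep the logit finite
    Z = np.log(Tc / (1 - Tc))            # inverse sigmoid of targets
    Q, R = np.linalg.qr(H)               # H = Q R, Q with orthonormal columns
    W = np.linalg.solve(R, Q.T @ Z)      # W = R^{-1} Q^T Z
    return W

# Toy usage: hidden activations and targets generated from known weights.
rng = np.random.default_rng(0)
H = rng.standard_normal((50, 5))
W_true = rng.standard_normal((5, 1))
T = 1.0 / (1.0 + np.exp(-(H @ W_true)))  # sigmoid outputs
W_hat = fit_output_weights(H, T)
print(np.allclose(W_hat, W_true, atol=1e-4))  # prints True
```

The QR route avoids forming the normal equations H^T H, which is better conditioned than a direct pseudoinverse when the hidden activations are nearly collinear.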