Set to rewrite the future of AI? NVIDIA's new nGPT trains up to 20 times faster
According to media reports, NVIDIA's latest research may reshape the future of AI. Its research team has proposed a new neural network architecture called the Normalized Transformer (nGPT). The architecture performs representation learning on the hypersphere and significantly speeds up the training of large language models, by as much as 20 times, while maintaining model accuracy.
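The core idea behind the hypersphere constraint can be illustrated with a minimal sketch: every embedding and hidden-state vector is rescaled to unit length so that it lies on the unit hypersphere. The snippet below assumes a PyTorch-style tensor; the function name project_to_hypersphere is illustrative and not taken from NVIDIA's code.

    import torch

    def project_to_hypersphere(x: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
        # Rescale each vector along the last dimension to unit L2 norm,
        # so it lies on the unit hypersphere.
        return x / (x.norm(dim=-1, keepdim=True) + eps)

    # Toy usage: a batch of token hidden states with shape (batch, seq_len, dim)
    hidden = torch.randn(2, 4, 8)
    hidden = project_to_hypersphere(hidden)
    print(hidden.norm(dim=-1))  # each vector now has norm close to 1.0

Keeping all representations at unit norm is what lets the model treat learning as movement along the sphere's surface rather than unconstrained drift in magnitude, which the researchers credit for the faster convergence.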