The Chinese AI lab may have just found a practical, scalable way to train advanced LLMs, even for more cash-strapped developers.
Chinese AI company DeepSeek has unveiled a new training method, Manifold-Constrained Hyper-Connections (mHC), which it says will make it possible to train large language models more efficiently and at lower cost.