Meituan is reportedly testing a new-generation large model with trillions of
parameters, according to Cailian Press.
The model is said to have been trained entirely on domestic computing
clusters and is currently available only to invited users. If confirmed, this
would mark a notable step in training large-scale models on local compute
infrastructure.