虼蚻_cmchao
There was one last week:
https://images.plurk.com/6P0OzsFdS1VAE9pwUXFxHP.png
虼蚻_cmchao
Passerby 1: "They're still multiplying matrices, it's just that one of the matrices has entries -1, 0, 1."
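(A minimal numpy sketch of what passerby 1 means: with entries restricted to -1, 0, 1, every "multiplication" collapses into an add, a subtract, or a skip. The function name and shapes below are just illustrative, not from the thread.)

```python
import numpy as np

def ternary_matmul(x, w_ternary):
    # x: (n, k) activations; w_ternary: (k, m) with entries in {-1, 0, +1}
    out = np.zeros((x.shape[0], w_ternary.shape[1]), dtype=x.dtype)
    for j in range(w_ternary.shape[1]):
        plus = x[:, w_ternary[:, j] == 1].sum(axis=1)    # add where the weight is +1
        minus = x[:, w_ternary[:, j] == -1].sum(axis=1)  # subtract where the weight is -1
        out[:, j] = plus - minus                         # zeros are simply skipped
    return out

# sanity check against an ordinary float matmul
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8)).astype(np.float32)
w = rng.integers(-1, 2, size=(8, 3)).astype(np.float32)
assert np.allclose(ternary_matmul(x, w), x @ w, atol=1e-5)
```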
虼蚻_cmchao
Passerby 2: "This kind of weight is called a ternary weight. There is a paper called "Ternary Weight Networks", published in 2016.
However, LLMs have less error tolerance in quantization, so I think this method still has a long way to go before it is widely used in LLMs."
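(For reference, a rough sketch of the quantization rule from that 2016 "Ternary Weight Networks" paper as I understand it: threshold at roughly 0.7 × the mean absolute weight, then scale the surviving weights by their mean magnitude. The constant and the function name are my paraphrase, not something stated in this thread.)

```python
import numpy as np

def ternarize(w):
    # threshold ~ 0.7 * mean(|W|); the constant is from my reading of the paper
    delta = 0.7 * np.mean(np.abs(w))
    t = np.zeros_like(w)
    t[w > delta] = 1.0      # large positive weights -> +1
    t[w < -delta] = -1.0    # large negative weights -> -1
    mask = np.abs(w) > delta
    # scaling factor: mean magnitude of the weights that survive the threshold
    alpha = np.abs(w[mask]).mean() if mask.any() else 0.0
    return alpha, t         # the layer then uses alpha * t in place of w

# example: ternarize a random weight matrix
w = np.random.default_rng(1).standard_normal((4, 4)).astype(np.float32)
alpha, t = ternarize(w)
print(alpha, t)
```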
虼蚻_cmchao
Their assessments of the person who posted the original plurk....