SparseGPT: are fewer parameters better?
How to get rid of 100 billion parameters and happily infer on one GPU
7 min read · Feb 8, 2023
What is the limitation of language models? We have heard of the wonders of GPT-3, LaMDA, and so on, and of the supposedly gigantic number of parameters in GPT-4 (but it seems these are just rumors). On the other hand, more parameters also mean that a model takes up more space, and you also need more…