May we please get a light-weight Deepseek model?
#27 opened by blun4me
My 16GB GPU and 64GB RAM cannot load something like this. I'd love to see a model within the 16B-32B range in the future, if that is possible. Thanks in advance.
For real, it would be great to see smaller models from them instead of these huge TB-scale models that need a data center just to load, let alone run at any usable speed.
I think Deepseek is more interested in trying to push the frontier rather than catering to edge devices.
Fair, but it would be nice to have a lightweight Deepseek model that CAN run on smaller devices.