Honestly, it's such a good idea to share models via P2P; it saves so much bandwidth. Of course there should still be a DDL (direct download link) for preservation, but still.
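The torrent-first, DDL-fallback idea could be sketched like this (a minimal illustration, not anything Hugging Face actually ships; `p2p_fetch` is a hypothetical callable standing in for a real torrent client wrapper):

```python
import urllib.request

def fetch_model(magnet_uri, ddl_url, dest, p2p_fetch=None):
    """Fetch a model file: try the torrent swarm first, fall back to the DDL.

    `p2p_fetch` is a hypothetical callable (e.g. a thin wrapper around a
    torrent client) that downloads `magnet_uri` to `dest`. If it's missing
    or fails, fall back to a plain HTTP GET of the direct download link,
    which is what keeps the file available when the swarm dies.
    """
    if p2p_fetch is not None:
        try:
            p2p_fetch(magnet_uri, dest)
            return "p2p"
        except Exception:
            pass  # dead swarm, no seeds, tracker down, etc.
    # DDL fallback: same file, plain HTTP, always works as long as the host does
    urllib.request.urlretrieve(ddl_url, dest)
    return "ddl"
```

The point is that the DDL stays load-bearing for preservation even when P2P handles most of the bandwidth.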
this post was submitted on 08 Dec 2023
LocalLLaMA
The only concern I had was, my god, that's a lot of faith to put in this random Twitter account; hope they never get hacked, lol. But otherwise, yes, it's a wonderful idea. It would be a good feature for Hugging Face to speed up downloads/uploads.