this post was submitted on 04 Nov 2023
21 points (100.0% liked)
LocalLLaMA
you are viewing a single comment's thread
view the rest of the comments
Cool stuff! Smarter than SmartContext.
I was never able to get good use out of the old SmartContext anyway, and it seems other people had the same problem. To me, this is a huge improvement. And it doesn't even need extra memory or anything.
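For anyone who hasn't dug into it, here's a toy sketch of how I understand the difference (simplified Python, not KoboldCPP's or llama.cpp's actual code): SmartContext throws away roughly half the context and reprocesses what's left, while the new context shifting just evicts the oldest tokens from the KV cache so only the new tokens have to be evaluated.

```python
def smartcontext_step(ctx_tokens, new_tokens, max_ctx):
    """SmartContext-style (toy): when the context overflows, keep only the
    newer half and reprocess all of it from scratch."""
    ctx = ctx_tokens + new_tokens
    if len(ctx) > max_ctx:
        ctx = ctx[len(ctx) // 2:]   # drop the older half
        reprocessed = len(ctx)      # everything kept goes through the model again
    else:
        reprocessed = len(new_tokens)
    return ctx, reprocessed

def contextshift_step(kv_cache, new_tokens, max_ctx):
    """ContextShift-style (toy): evict just enough of the oldest tokens and
    shift the rest, so only the new tokens cost compute."""
    overflow = max(0, len(kv_cache) + len(new_tokens) - max_ctx)
    kv_cache = kv_cache[overflow:] + new_tokens
    reprocessed = len(new_tokens)   # old tokens stay cached, no re-evaluation
    return kv_cache, reprocessed

old_ctx = list(range(4096))        # a full 4096-token context (toy token ids)
new = list(range(4096, 4112))      # 16 freshly added tokens

_, cost_smart = smartcontext_step(old_ctx, new, max_ctx=4096)
_, cost_shift = contextshift_step(old_ctx, new, max_ctx=4096)
print(cost_smart, cost_shift)      # 2056 vs 16: half the context re-evaluated vs only the new tokens
```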
I really like how the KoboldCPP dev(s) and the llama.cpp community constantly implement all the crazy stuff.