One-to-one with HR. I'm not paraphrasing, she literally said: "You are a really good performer, we really want to keep you, but we won't do anything to do so".
You know you are missing the only fun part of having a baby?
Anyway, fuck AI.
Yet another package manager...
apt for life!
Fool me once, shame on you. Fool me twice, shame on me.
Never forget that deep inside LLM training data, there is all the tweet history of pussyslayer69.
BitNet sucks:
GPT started answering correctly around GPT-3.5. But smaller LLMs generally respond that rabbit eggs are around 5 to 8 cm and smaller than cow ones 🤣
To test the level of stupidity of AI chatbots, I ask them the size of a rabbit egg, and whether it is bigger or smaller than a cow egg.
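For what it's worth, here's a minimal sketch of scripting that sanity check against a local model served through Ollama's REST API (assuming Ollama is running on its default port 11434, and "llama3" is just a placeholder for whatever model you have pulled):

    import json
    import urllib.request

    PROMPT = "What is the size of a rabbit egg? Is it bigger or smaller than a cow egg?"

    def ask(model: str, prompt: str) -> str:
        """Send one non-streaming generation request to the local Ollama API."""
        payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        # A model that learned anything about biology should reject the premise.
        print(ask("llama3", PROMPT))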
In the early 2000s, computers were ugly grey boxes with noisy fans and a hard drive that gave the impression a cockroach colony was trying to escape your case. I wanted to build a silent computer to watch DivX movies from my bed, but as a broke teen, I only had access to discarded hardware I could find here and there.
I dismantled a power supply, stuck the MOSFETs to a big motherfucking heatsink, and I had a silent power supply. I put another huge industrial heatsink on the CPU (I think it was an AMD K6 500 MHz) and had fanless cooling. That left the hard drive.
Live CDs/USBs weren't common at that time. I discovered a live CD distro (I think it was Knoppix) that could run entirely from RAM.
I removed the hard drive, booted the live distro, then swapped the CD for my DivX, and voilà.
Having a fanless, hard-drive-less computer was pure science fiction for me and my friends at the time.
I gave ollama.nvim a try, but I'm not convinced (not by the plugin, but by using an LLM directly in the IDE). For security reasons, I cannot send code to public LLMs, so I either have to use my company's internal LLM (GPT-4o), which only has a front end, no API, or use a local LLM through Ollama, for example.
I tried several models, but they are too slow or too dumb. In the end, when I need help, I copy/paste code into the LLM portal front end.
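If you want a quick feel for whether a local model is even fast enough for in-editor use before wiring up a plugin, a rough timing sketch like this works (again assuming Ollama on its default port; "codellama" is just a placeholder model name):

    import json
    import time
    import urllib.request

    def time_completion(model: str, prompt: str) -> tuple[str, float]:
        """Return the model's answer and the wall-clock latency in seconds."""
        payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        start = time.monotonic()
        with urllib.request.urlopen(req) as resp:
            answer = json.loads(resp.read())["response"]
        return answer, time.monotonic() - start

    if __name__ == "__main__":
        answer, seconds = time_completion("codellama", "Write a Python function that reverses a string.")
        print(f"{seconds:.1f}s")
        print(answer)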
Same in France. We don't have oil but we have strikes.