It is a concern.
Check out https://tiktokenizer.vercel.app/?model=deepseek-ai%2FDeepSeek-R1 and try entering some freeform hexadecimal data - you'll notice that it does not cleanly segment the hexadecimal numbers into individual tokens.
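The uneven chunking is easy to see with a toy greedy longest-match tokenizer. The vocabulary below is made up purely for illustration and doesn't correspond to any real model's vocabulary, but the effect is the same: hex strings land in uneven multi-character chunks rather than one token per digit.

```python
# Toy greedy longest-match tokenizer over a made-up vocabulary.
# Real BPE tokenizers are more involved, but they produce the same
# kind of uneven segmentation on freeform hexadecimal data.
VOCAB = ["dead", "bee", "f0",
         "0", "1", "2", "3", "4", "5", "6", "7", "8", "9",
         "a", "b", "c", "d", "e", "f"]

def tokenize(s, vocab=VOCAB):
    tokens = []
    i = 0
    while i < len(s):
        # Pick the longest vocabulary entry matching at position i.
        match = max((v for v in vocab if s.startswith(v, i)), key=len)
        tokens.append(match)
        i += len(match)
    return tokens

print(tokenize("deadbeef012"))  # → ['dead', 'bee', 'f0', '1', '2']
```

Note how the digit boundaries in the input don't line up with the token boundaries the model actually sees.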
Still, this does not quite address the issue that tokenization makes it difficult for most models to accurately distinguish between the hexadecimal values here.
Having the model write code to solve a problem and then execute it is an established technique for circumventing this issue, but all of the model interfaces I know of with this capability are very explicit about when they are making use of that tool.
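As a sketch of why offloading to code works: instead of comparing hex literals "in its head" across awkward token boundaries, the model can emit and run a snippet like this (the values are made-up examples, not from any real task):

```python
# Exact arithmetic on hex values, immune to tokenization quirks.
# Python's int() accepts the "0x" prefix when a base is given.
a, b = "0x9F3C", "0x9FC3"
print(int(a, 16) > int(b, 16))  # → False (40764 vs 40899)
```

The interpreter does the character-exact work, and the model only has to read back the result.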
Is this real? Because of how LLMs tokenize their input, this can actually be a pretty tricky task for them to accomplish. It's also the reason why it's hard for them to count the number of 'R's in the word 'Strawberry'.
Fwiw, that's not the traditional presentation. The traditional one is, from bottom to top:
I've never seen one with jam, although people go wild with the variations these days.
Well, it's obviously not going to be the iPad that wins in that case
If there were real demand for big pockets, there would be money to be made in selling them, and big-pocket brands would dominate.
I think you might be giving a bit too much credit to the industry here.
No, that's the wrong sweet baked good - the story goes that he went to town on 14 servings of Hetvägg, the ancestor to the popular Semla.
We eat Semla for Shrove Tuesday in Sweden, which traditionally marked the beginning of Lent and hence fasting until Easter. Another name for Semla is fastlagsbulle, which roughly translates to "fasting bun".
Hetvägg is a Semla served on a plate with hot milk.
If you have a Swedish bakery nearby, try seeing if they serve Semla. Or get one when you visit Sweden, if they are in season during your visit.
There used to be pockets on women's clothes - or, more accurately, you tied them on yourself, as they came separately from the clothes - but they fell out of fashion as handbags became the fashion statement that said: look, I'm not poor enough to need pockets.
Very dumb, but it is what it is.
What baffles me now is that pockets on women's clothes haven't made a comeback yet. How asleep at the wheel are fashion designers?
Depends on what generation of MacBook it is. Intel-series? I'm leaning ThinkPad. M-series? It's gonna have to be the MacBook.
As others mentioned, it's diminishing returns, but there's still a lot of good innovation going on in the codec space. As an example, the reduction in the amount of space h265 requires compared to h264 at similar quality is staggering. Codecs are a special form of black magic.
Fucking losers
It's not out of the question that we get emergent behaviour where the model can connect non-optimally mapped tokens and still translate them correctly, yeah.