

Is the code available somewhere?
Lol, there are smaller versions of Deepseek-r1. These aren’t the “real” Deepseek model; they’re distillations of R1 onto other foundation models (Qwen2.5 and Llama 3 in this case).
For the full 671b-parameter model, the medium-quality quantization weighs in at 404 GB. That means you need 404 GB of RAM/VRAM just to load the thing, and then you ideally want ALL of that in VRAM (i.e. GPU memory) to get it to generate anything fast.
For comparison, I have 16 GB of VRAM and 64 GB of RAM on my desktop. If I run the 70b parameter version of Llama3 at Q4 quant (medium quality-ish), it’s a 40 GB file. It’ll run, but mostly on the CPU. It generates ~0.85 tokens per second. So a good response will take 10-30 minutes. Which is fine if you have time to wait, but not if you want an immediate response. If I had two beefy GPUs with 24 GB VRAM each, that’d be 48 total GB and I could run the whole model in VRAM and it’d be very fast.
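To make that arithmetic concrete, here’s a rough back-of-the-envelope sketch. The bits-per-weight values are my own approximations for Q4-class quants, and real runs also need extra memory for the KV cache and context, so treat the numbers as ballpark figures only.

```python
# Rough memory and speed estimates for quantized local models (ballpark only).

def quant_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate size of a quantized model: parameters x bits per weight / 8."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

def response_minutes(tokens: int, tokens_per_second: float) -> float:
    """How long a response takes at a given generation speed."""
    return tokens / tokens_per_second / 60

# 70b model at ~4.5 bits/weight (Q4-ish) -> roughly the 40 GB file mentioned above
print(f"70b @ ~Q4: {quant_size_gb(70, 4.5):.0f} GB")

# 671b model at ~4.8 bits/weight -> roughly the 404 GB figure for the full R1
print(f"671b @ ~Q4: {quant_size_gb(671, 4.8):.0f} GB")

# A 500-token answer at 0.85 tokens/s (mostly-CPU inference) -> about 10 minutes
print(f"500 tokens @ 0.85 tok/s: {response_minutes(500, 0.85):.0f} min")
```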
They’re probably referring to the 671b parameter version of deepseek. You can indeed self host it. But unless you’ve got a server rack full of data center class GPUs, you’ll probably set your house on fire before it generates a single token.
If you want a fully open source model, I recommend Qwen 2.5 or maybe DeepSeek V2. There’s also OLMo 2, but I haven’t really tested it.
Mistral Small 24B also just came out and is Apache licensed. I’m testing it now.
Most open/local models require a fraction of the resources of ChatGPT. They’re usually not AS good in a general sense, but they’re often good enough, and they can sometimes surpass ChatGPT in specific domains.
It’s enough to run quantized versions of the distilled R1 models based on Qwen and Llama 3. Don’t know how fast they’ll run though.
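If you want to poke at one of those distilled models from code, here’s a minimal sketch. It assumes you’re running Ollama (which exposes an OpenAI-compatible API on localhost:11434) and that you’ve already pulled a distilled R1 tag; the model name and port are assumptions, so adjust them for whatever server and quant you actually use.

```python
# Minimal sketch: chat with a locally hosted distilled R1 model through an
# OpenAI-compatible endpoint (Ollama-style server assumed on localhost:11434).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed local server; change as needed
    api_key="not-used-by-local-servers",   # required by the client, ignored locally
)

resp = client.chat.completions.create(
    model="deepseek-r1:14b",  # whichever distilled/quantized tag you pulled
    messages=[{"role": "user", "content": "Summarize how model quantization works."}],
)
print(resp.choices[0].message.content)
```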
Even the smell of Olives causes me to gag. I absolutely cannot eat them. Olive oil is fine. But actual olives, no. Doesn’t matter if they’re old, new, canned, fresh. They’re absolutely disgusting. One of the few foods I outright cannot and will not eat.
Doesn’t GNOME already have this?
Lol. Git itself can act as a server over the git protocol. Might have been easier 🤪
There’s plenty of git forges that aren’t GitHub. Git itself has nothing to do with central servers and can theoretically be used in a completely decentralized manner.
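To show what “git itself can act as a server” looks like in practice, here’s a small sketch that drives git’s own daemon (the git:// protocol on port 9418) from Python. The /srv/git path and repo name are placeholders; point --base-path at a directory of bare repositories.

```python
# Sketch: serve repos over the plain git:// protocol with no forge involved.
import subprocess

server = subprocess.Popen([
    "git", "daemon",
    "--reuseaddr",            # allow quick restarts on the same port
    "--export-all",           # serve repos even without a git-daemon-export-ok file
    "--base-path=/srv/git",   # git://host/myrepo.git maps to /srv/git/myrepo.git
])

try:
    # Any peer can now do a read-only clone over the git protocol:
    subprocess.run(["git", "clone", "git://localhost/myrepo.git"], check=True)
finally:
    server.terminate()
```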
It’s the opening of the Canterbury Tales.
Makes sense when many of the spiders in Australia are dangerous, though.
I use a Misskey fork for microblogging, and I can’t even get Lemmy posts to load. Community profiles load, but that’s it.
Ah right. What I really meant to ask was whether it can handle protocols other than HTTP.
Which I don’t think it can…
Are you able to tunnel ports other than 80 and 443 through Cloudflare?
Right. I agree.
You mean the part about people citing laws like GDPR is dead on?
Definitely a good way to do it. PhotoPrism supports uploading to WebDAV for sharing. You could front a CDN upload with a WebDAV server 🤔
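As a rough illustration of that idea, here’s a minimal sketch of pushing a file to a WebDAV endpoint with a plain HTTP PUT. The URL and credentials are placeholders (not PhotoPrism’s actual import path), so swap in whatever WebDAV server you’d put in front of the CDN.

```python
# Minimal sketch: upload a file to a WebDAV server with an HTTP PUT.
import requests

WEBDAV_URL = "https://dav.example.com/uploads/photo.jpg"  # placeholder endpoint

with open("photo.jpg", "rb") as f:
    resp = requests.put(
        WEBDAV_URL,
        data=f,                          # stream the file body
        auth=("username", "password"),   # HTTP Basic auth, typical for WebDAV
        timeout=30,
    )
resp.raise_for_status()
print("Uploaded with status", resp.status_code)
```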
Can you link the feeds?