autonomoususer

joined 2 years ago
MODERATOR OF
llm
24
submitted 1 day ago* (last edited 23 hours ago) by [email protected] to c/[email protected]
 

cross-posted from: https://lemmy.world/post/27088416

This is an update to a previous post found at https://lemmy.world/post/27013201


Ollama uses the AMD ROCm library, which can work with many AMD GPUs not listed as compatible, by forcing a specific LLVM target.

The original Ollama documentation is wrong: the following cannot be set for individual GPUs, only for all GPUs or none, as shown at github.com/ollama/ollama/issues/8473

AMD GPU issue fix

  1. Check that your GPU is not already listed as compatible at github.com/ollama/ollama/blob/main/docs/gpu.md#linux-support
  2. Edit the Ollama service file. This uses the text editor set in the $SYSTEMD_EDITOR environment variable.
sudo systemctl edit ollama.service
  3. Add the following, save and exit. You can try different versions as shown at github.com/ollama/ollama/blob/main/docs/gpu.md#overrides-on-linux
[Service]
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
  4. Restart the Ollama service.
sudo systemctl restart ollama
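
If you want to sanity-check the override content before applying it, here is a minimal sketch. It only prints the drop-in text; 10.3.0 is the example version from the steps above, so swap in the value matching your GPU:

```shell
# Sketch: print the override drop-in content for a chosen version
# (10.3.0 is the example above; pick the value matching your GPU).
GFX_VERSION="10.3.0"
printf '[Service]\nEnvironment="HSA_OVERRIDE_GFX_VERSION=%s"\n' "$GFX_VERSION"
```

After restarting the service, systemctl cat ollama.service should show this same Environment line, confirming the override took effect.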
 

[–] [email protected] 1 points 1 day ago* (last edited 1 day ago)

Windows fails to include a libre software license text file. We do not control it, anti-libre software.

So, to restrict its access to external resources such as networking, it must be sandboxed in an environment we do control, such as by virtualising it under a libre software operating system.

This is only harm reduction. We still do not control Windows itself, anti-libre software.

[–] [email protected] 1 points 1 day ago* (last edited 1 day ago) (1 children)

How might this impact VRAM requirements? ~~I would also like to see a libre software implementation.~~

[–] [email protected] 1 points 2 days ago* (last edited 1 day ago)

Llama fails to include a libre software license text file. We do not control it, anti-libre software.

[–] [email protected] 3 points 2 days ago* (last edited 2 days ago)

Cosmos Cloud fails to include a libre software license text file. We do not control it, anti-libre software. This defeats the purpose of running Ollama, libre software, on our own device.

Also, although Docker on Arch Linux is fine, Docker Desktop, used to install Docker on Windows, is also anti-libre software. The upcoming guide will provide a workaround. Of course, Windows is anti-libre software, so this is harm reduction at best.

However, thank you for freeing this information from Discord, also anti-libre software.

 

cross-posted from: https://lemmy.world/post/27013201

Ollama lets you download and run large language models (LLMs) on your device.

Install Ollama on Arch Linux (Windows guide coming soon)

  1. Check whether your device has an AMD GPU, NVIDIA GPU, or no GPU. A GPU is recommended but not required.
  2. Open Console, type only one of the following commands and press return. This may ask for your password but not show you typing it.
sudo pacman -S ollama-rocm    # for AMD GPU
sudo pacman -S ollama-cuda    # for NVIDIA GPU
sudo pacman -S ollama         # for no GPU (for CPU)
  3. Enable the Ollama service (on-device, runs in the background) so it starts with your device, and start it now.
sudo systemctl enable --now ollama
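
Not sure which GPU you have? A hypothetical sketch using lspci (from the pciutils package) to pick the matching package from step 2; the vendor-string matching is approximate, so verify the result before installing:

```shell
# Sketch: choose the Ollama package based on the detected GPU vendor.
# Falls back to the CPU-only package if no GPU is detected.
GPU_INFO="$(lspci 2>/dev/null || true)"
case "$GPU_INFO" in
  *AMD*|*ATI*)  PKG="ollama-rocm" ;;   # AMD GPU
  *NVIDIA*)     PKG="ollama-cuda" ;;   # NVIDIA GPU
  *)            PKG="ollama" ;;        # no GPU (CPU only)
esac
echo "Install with: sudo pacman -S $PKG"
```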

Test Ollama alone (Open WebUI guide coming soon)

  1. Open localhost:11434 in a web browser and you should see "Ollama is running". This shows Ollama is installed and its service is running.
  2. Run ollama run deepseek-r1 in one console and ollama ps in another to download and run the DeepSeek R1 model while checking whether Ollama is using your slow CPU or fast GPU.
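
The browser check in step 1 can also be scripted. A sketch assuming the default port 11434; it prints a fallback message when the service is unreachable:

```shell
# Sketch: query the local Ollama API root; it replies "Ollama is running".
# -s silences progress output, -f makes curl fail on HTTP errors.
curl -sf http://localhost:11434/ 2>/dev/null || echo "Ollama is not reachable"
```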

AMD GPU issue fix

https://lemmy.world/post/27088416

[–] [email protected] 1 points 2 days ago* (last edited 2 days ago)

A large part of why I recently took over [email protected] and not [email protected].

[–] [email protected] 2 points 2 days ago

Modern software is complex, so we need to work together. Libre software defends us computing alone and in groups. Anti-libre licenses stop groups.

 

[–] [email protected] 1 points 3 days ago* (last edited 3 days ago)

No, it shows the bot you are active in chat, so they spam you more.

 

Rules

  1. Please tag [not libre software] and [never on-device] services as such (those not green in the License column here).
  2. Be useful to others

Resources

github.com/ollama/ollama
github.com/open-webui/open-webui
github.com/Aider-AI/aider
wikipedia.org/wiki/List_of_large_language_models

[–] [email protected] 67 points 5 days ago (2 children)

Anon doesn't know, Discord fails to include a libre software license text file. We do not control it, anti-libre software.

7
Requesting c/llm (lemmy.world)
submitted 5 days ago* (last edited 5 days ago) by [email protected] to c/[email protected]
 

[email protected] community and @[email protected] moderator appear inactive.

[–] [email protected] 5 points 1 week ago (1 children)

BlueSky is not decentralised.

[–] [email protected] 5 points 1 week ago* (last edited 1 week ago) (2 children)

Install KDE on Mint and select it at login

[–] [email protected] 1 points 2 weeks ago* (last edited 2 weeks ago)

They fail to do that too.

 

watomatic.app

Example

🤖 Automated Reply

💬 I reply faster on example.org

⁉️ WhatsApp is anti-libre software. We do NOT control it. It withholds a libre software license text file, like GPL.

Explained

I reply faster on

Deleting the only way to reach someone online breaks your influence.

example.org

A link and only one link, so (1) they see it's an app, not some random word or typo, (2) they can download it without searching, and (3) they don't have multiple choices, so they don't need to do any thinking or research. Remove everything stopping them.

anti-libre software.

Never say privacy, they've heard it all before (from you, no doubt). Say something different.

We do NOT control it.

Make it simple and direct. Think of the most removed person you know and break it down in a way they would understand. Think about every angle it could be misunderstood.

It withholds

Libre software is normal, default. Anti-libre software is cringe, weird, dangerous. Act like it. Also, humans care less about getting and more about losing stuff.

libre software license text file

Show them what to check for, for themselves, easily, obvious. Later, show them how to spread these ideas. Then, show them how to show others how to spread these ideas, make more of you.

GPL

A keyword for them to web search for more, with better results than more complex terms like AGPL or misleading terms like 'open source'.

Don't waste a word.

Lastly, make yourself someone everyone wants to talk to.

 

Automatic calendar synchronisation with no server and no cloud?

view more: next ›