this post was submitted on 20 May 2024
23 points (100.0% liked)

Technology

top 27 comments
[–] woelkchen@lemmy.world 18 points 11 months ago

Chrome, Spotify, Zoom, WhatsApp, Blender, Affinity Suite, DaVinci Resolve and many more now run natively on Arm

"Meanwhile we're too lazy to even port our basic casual games like Minesweeper or Solitaire to ARM even though we're shipping our own Surface line of tablets with ARM CPUs since 12 years. Fun fact, we're shipping native ARM versions of Minecraft on iPhone, iPad, Android, and even low-performace hardware like Nintendo Switch but our own top-notch Copilot+ ARM PCs: lolnope."

[–] possiblylinux127@lemmy.zip 15 points 11 months ago (1 children)

Wow!

I can't wait until there is nothing left of Windows except Copilot and OneDrive

[–] applepie@kbin.social 10 points 11 months ago (1 children)
[–] db2@lemmy.world 4 points 11 months ago
[–] morrowind@lemmy.ml 10 points 11 months ago (4 children)

I know most people are over AI, as am I, but if we're gonna have it, I'm glad to see there's a focus on it being local

[–] possiblylinux127@lemmy.zip 22 points 11 months ago (1 children)

It isn't though

Copilot is a cloud service

[–] morrowind@lemmy.ml 2 points 11 months ago (1 children)

The goal is to make it work on-device in the next 4 years. That's the point of an "AI PC".

[–] possiblylinux127@lemmy.zip 10 points 11 months ago (2 children)

Why would Microsoft want that? They make money from the cloud. There is a reason they want you to move to Azure.

[–] shiftymccool@programming.dev 2 points 11 months ago

They want you to foot the electric bill for the LLM processing, they're still going to collect your data. Double-win for MS!

[–] Thekingoflorda@lemmy.world 2 points 11 months ago (1 children)

Because they don’t want to pay for running AI in the cloud.

But they DO want that sweet sweet customer data. If you think there’s not going to be some sort of user data and behavior profiling bullshit going on at the very least, I’ve got a bridge to sell you.

[–] BroBot9000@lemmy.world 15 points 11 months ago (1 children)

It’s not going to be private lol

[–] morrowind@lemmy.ml 2 points 11 months ago (1 children)

The MS implementations won't, but once they build the capability, we can make our own

[–] possiblylinux127@lemmy.zip 8 points 11 months ago (1 children)

That's not how that works. Also, we already have our own; it's called Ollama.

[–] kaboom36@ani.social 1 points 11 months ago (1 children)

How would one set this up with ollama?

[–] possiblylinux127@lemmy.zip 2 points 11 months ago* (last edited 11 months ago) (1 children)

On which platform?

Basically you need three things: the Ollama software, an LLM such as Mistral, and a front end like Open WebUI.

Ollama is pretty much just a daemon that exposes a web API apps can use to query LLMs.
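
For example, here's a minimal sketch of hitting that API from plain Python (assuming Ollama is listening on its default port 11434 and you've already pulled the Mistral model with `ollama pull mistral`):

```python
# Minimal sketch: query a local Ollama daemon through its REST API.
# Assumes the daemon listens on the default 127.0.0.1:11434 and that
# the "mistral" model has already been pulled.
import json
import urllib.request

payload = {
    "model": "mistral",
    "prompt": "Explain what an NPU is in one sentence.",
    "stream": False,  # ask for a single JSON object instead of streamed chunks
}

req = urllib.request.Request(
    "http://127.0.0.1:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

print(result["response"])  # the model's generated text
```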

[–] kaboom36@ani.social 1 points 11 months ago (1 children)

Linux, specifically Nobara (a gaming-focused Fedora distro) for me

Do you have any guides you would recommend?

[–] possiblylinux127@lemmy.zip 2 points 11 months ago

Actually it's pretty easy. You can either run it in a VM or run it in Podman.

For a VM, you could install virt-manager and then Debian. From there you of course need to do the normal setup of SSH and disable root login.

Once you have a Debian VM you can install Ollama and pull down LLaVA and Mistral. Make sure you give the VM plenty of resources, including almost all cores and 8 GB of RAM. To set up Ollama you can follow the guides.

Once you have Ollama working you can then set up Open WebUI. I had to use network: host with the Ollama environment variable pointed at 127.0.0.1 (loopback).

Once that's done you should be able to access it at the VM's IP on port 8080. The first time it runs you need to click create account.

Keep in mind that a blank screen means it can't reach Ollama.
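
If you do hit that blank screen, a quick sketch like this (assuming Ollama's default port 11434; adjust the host if Open WebUI is pointed elsewhere) can confirm whether the daemon is reachable:

```python
# Quick reachability check for the Ollama daemon.
# Assumption: the daemon is on 127.0.0.1:11434 (Ollama's default).
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"

try:
    # /api/tags lists the models the daemon currently has pulled
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
        models = json.load(resp).get("models", [])
    print("Ollama is reachable. Pulled models:", [m["name"] for m in models])
except OSError as err:
    print("Cannot reach Ollama:", err)
```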

The alternative to this setup would be Podman. You could theoretically create an Ollama container and an Open WebUI container; they would need to be attached to the same internal network. It would probably be simpler to run, but I haven't tried it.

[–] webghost0101@sopuli.xyz 9 points 11 months ago (2 children)

It's only partly local, and I bet only because the alternative is too expensive.

Just look at the kind of crap Windows pulls today.

I thought I had disabled pretty much all the spyware, then on the Microsoft online dashboard I found:

  • Full machine specs
  • A list of all applications installed
  • Edge search history

Btw, did you know that anything typed in the Start menu search now counts as a search in Edge? I wonder what's behind that move.

In other words, I understand much of the AI may function locally, but it will be fully integrated with cloud systems, and you can safely assume a full record of your prompts and usage is stored on Microsoft servers and delivered to Bill Gates for reading before bed.

[–] morrowind@lemmy.ml 3 points 11 months ago (1 children)

full machine specs

Just FYI, every website you visit has access to this; it's not private.

[–] webghost0101@sopuli.xyz 3 points 11 months ago

Depends on your browser extensions; you can even pretend to be a phone if you want. ;)

And it's also not the same level of detail as what Microsoft is lifting straight from your hardware, like motherboard model and manufacturer, how many drives you have, how they're formatted, and how much space is used on each of them.

[–] Railcar8095@lemm.ee 1 points 11 months ago (1 children)

Btw, did you know that anything typed in the Start menu search now counts as a search in Edge? I wonder what's behind that move.

It runs a search for everything you type as you type it, so that by the time you press enter it already has the results for the last query, hence seeming faster.

[–] webghost0101@sopuli.xyz 1 points 11 months ago* (last edited 11 months ago) (1 children)

What I meant is: what good reason is there for local search to always count as a Bing internet search?

If I wanted to look up something online, I'd open a browser with the search engine of my preference.

Spamming my local search results with online rubbish is the opposite of value. Even Microsoft devs acknowledge this, which is why Microsoft PowerToys has a macOS-style local search bar that's way faster, better at finding your files, and works offline.

[–] Railcar8095@lemm.ee 1 points 11 months ago

Well, it does search online too. That the implementation is atrocious is a different thing.

[–] BombOmOm@lemmy.world 7 points 11 months ago

Yeah, that is a big deal for privacy reasons. There is no reason one needs to send such information to companies.

[–] LinusWorks4Mo@kbin.social 3 points 11 months ago
[–] LinusWorks4Mo@kbin.social 1 points 11 months ago