brownmustardminion

joined 3 years ago
[–] brownmustardminion@lemmy.ml 26 points 4 months ago (1 children)

FBI, open up!

Jk. Thank you for your service

[–] brownmustardminion@lemmy.ml 5 points 4 months ago (1 children)

Compressed air can spin the fans fast enough to cause damage unfortunately.

[–] brownmustardminion@lemmy.ml 21 points 4 months ago (4 children)

Did you use compressed air to clean out the fans?

It's possible to fry circuitry if you artificially spin the fans too fast: the motor acts as a generator, and the induced voltage (back-EMF) can exceed what the fan's controller and the attached header are rated for.

Probably rare to cause damage with modern computers but an old PC might be more susceptible to this type of damage.
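As a rough back-of-envelope (all numbers here are made-up examples, not specs for any particular fan): if the back-EMF scales linearly with RPM, overspinning the blades scales the induced voltage by the same factor:

```python
# Back-of-envelope sketch; the 12 V / 2000 RPM rating and the 3x overspin
# are assumed example numbers, not measurements.
rated_voltage = 12.0               # typical case-fan supply voltage
rated_rpm = 2000.0                 # assumed rated speed at that voltage
k = rated_voltage / rated_rpm      # idealized motor constant, volts per RPM

overspin_rpm = 3 * rated_rpm       # compressed air overspinning the blades
induced_voltage = k * overspin_rpm # back-EMF pushed into the fan header

print(f"~{induced_voltage:.0f} V induced at {overspin_rpm:.0f} RPM")  # ~36 V
```

Three times the voltage on a 12 V header is the kind of thing marginal circuitry won't appreciate, which is why the usual advice is to hold the blades still while blasting them.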

[–] brownmustardminion@lemmy.ml 4 points 4 months ago (1 children)

Am I understanding correctly that if users had 2FA enabled, this vulnerability couldn't have been used to gain access?

[–] brownmustardminion@lemmy.ml 3 points 6 months ago

I was in your position recently and decided to install PVE from scratch and restore VMs from backup.

I had a fairly complex PVE config so it took some additional work to get everything up and running. But it was absolutely worth it.

[–] brownmustardminion@lemmy.ml 1 points 6 months ago

Same. It works great.

[–] brownmustardminion@lemmy.ml 1 points 6 months ago (1 children)

I don't want to be too specific for opsec reasons. But windows 10 is the OS. OFX aka OpenFX.

[–] brownmustardminion@lemmy.ml 4 points 6 months ago

The most important term to research regarding the arr apps is "hardlinking". Make sure you have your apps configured with hardlinks. Everything else is pretty easy and self-explanatory.
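A quick way to see what hardlinking buys you (paths below are just examples, not any app's defaults):

```shell
# Demo: a hardlink gives the arr library a second path to the same file,
# so "importing" a finished download costs no extra disk space.
mkdir -p downloads media/movies
echo "fake video data" > downloads/movie.mkv
ln downloads/movie.mkv media/movies/movie.mkv    # hardlink, not a copy
ls -i downloads/movie.mkv media/movies/movie.mkv # same inode number on both
```

The catch is that hardlinks only work within a single filesystem, so your downloads folder and media library need to live on the same volume (and, if you run the apps in Docker, be exposed through the same mount).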

[–] brownmustardminion@lemmy.ml 1 points 6 months ago

I replaced the drives, installed the newest version of PVE, then restored all of my VMs from local USB backup. I had to reconfigure a number of things, such as HDD passthrough and some network settings, but in the end the migration was a success.

[–] brownmustardminion@lemmy.ml 2 points 7 months ago

I don't work in IT at all. My self-hosting journey started when I got sick of feeling powerless in the face of big tech companies that are increasingly ripping off customers or violating their right to privacy. There's also the general mistrust that comes from my data being repeatedly breached or leaked because shareholder profits are more important than investing in basic security.

[–] brownmustardminion@lemmy.ml 2 points 7 months ago

When I say local I mean automated PVE backups the same as it would be through PBS. If that makes any difference.

[–] brownmustardminion@lemmy.ml 1 points 7 months ago (2 children)

I have a remote PBS, but the backups aren't current because there was a connection error. I have local Proxmox backups on a USB thumb drive. That's what I was going to restore from.

 

I'll start by stating my threat model is avoiding corporate tracking, profiling, and analytics. For anything beyond that scope I believe tor is ideal.

Correct me if I'm wrong but my understanding is that Newpipe is a frontend to provide an alternative to the awful YouTube app and/or youtube account. However, your IP along with other device information may still be exposed to google servers. Any ideas as to what info beyond IP is sent to google?

Invidious instances, on the other hand, act as a proxy on top of what Newpipe offers, but you are trusting your privacy to the instance owner.

My idea for utilizing these services is the following: Newpipe for managing subscription based YouTube viewing. Google would have my IP, but this IP would be a VPN IP address that periodically changes. Much more reliable than invidious and better quality. App is great.

Invidious for random video searches as well as content I may want to be slightly more cautious about associating with.

I'm looking for feedback on this conceptual setup. I've also been considering making a public invidious instance that I can use but hopefully obfuscates my viewing through its usage by others.

 

spotify-downloader is great. I already have an arr stack running for movies and shows. It would be cool to add music to the mix.

I have a shared spotify playlist with friends that I pretty much listen to exclusively as of late. What I'd like is to have an arr app that constantly pulls from that playlist and downloads via spotify-downloader, so that I can listen to those songs from my private server and then I don't need to have spotify open so much.

The ideal setup would be a system where songs are pulled from a Spotify playlist and downloaded via spotify-downloader, then, once a higher-quality version is discovered, that version is downloaded to replace the initial YouTube-quality copy.

I can't be the first to think of this, so I'm hopeful something like this is already ready to deploy. Thoughts?
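The "replace it later with a better copy" decision I have in mind is simple enough to sketch (function names, fields, and the bitrate threshold below are all made up for illustration, not from any existing app):

```python
# Sketch of the upgrade step: keep the YouTube-sourced rip until a
# meaningfully higher-bitrate copy shows up somewhere.
def should_replace(existing_kbps: int, candidate_kbps: int, min_gain: int = 64) -> bool:
    """Replace only when the candidate is meaningfully better quality."""
    return candidate_kbps - existing_kbps >= min_gain

library = {"Song A": 128, "Song B": 320}      # track -> current bitrate (kbps)
candidates = {"Song A": 320, "Song B": 256}   # newly discovered copies

upgrades = {track: kbps for track, kbps in candidates.items()
            if should_replace(library.get(track, 0), kbps)}
print(upgrades)  # {'Song A': 320}
```

The rest would just be a loop that polls the playlist on a schedule, diffs it against the library, and hands new or upgradable tracks to the downloader.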

 

I recently acquired a Pixel phone and set up gos. Prior to trying gos I was using an iPhone hardened as much as possible based on recommendations and guides from respected OSINT experts.

It’s only been a week but I’ve found gos extremely frustrating and mostly useless except for web browsing.

I can’t seem to get my Yubikey to work so my 2FA is borked. Works fine on my iPhone.

I’ve previously managed to degoogle my life but now certain apps require me to use sandboxed google apps just to run.

I’m facing the nearly insurmountable task of convincing my friends, family, and colleagues to download and use signal when they are all using encrypted iMessage.

Most of my banking apps simply do not work. Mobile banking is unfortunately something important that I need in my occupation. Part of the appeal of gos was being able to have an isolated, dedicated profile for banking.

There are also a few features that I'm assuming are iPhone-exclusive that it really sucks to be without: double-tapping the bottom of the screen to shift everything down so you can reach the top of the screen one-handed, and holding down on the space bar to move the text cursor between characters. Maybe these exist on gos though?

I understand most of the issues lie with the app developers. I'm grateful to the devs for creating and working on this project. I'm not bashing anyone here. I'm simply asking for some guidance on how I can break through the hurdles and make this work for me, from the mouths of those who were once in my position.

 

I’ve been using invidious for a few years. I recently changed up my morning routine and have been eating breakfast watching YouTube via the TV app versus on my PC.

It made me realize I kind of miss the recommended videos in some circumstances like when I just wanna veg out.

Are there any currently viable YouTube frontends that either maintain the algorithm or use their own to surface new content?

 

If you have an outdoor Ethernet port—in my case with a WiFi AP connected—how can you go about protecting your network from somebody jacking in?

Is there a way to bind that port to only an approved device? I figured a firewall rule to only allow traffic to and from the WiFi AP IP address, but would that also prevent traffic from reaching any wireless clients connected to the AP?

Edit: For more context, my router is a Ubiquiti UDM and the AP is also Unifi AP

 

What is the general consensus on trusting data removal services with the data you provide them?

I’ve spent 5 years telling myself I’ll go through the long lists of data aggregators and one by one manually send removal requests. But it’s such a massive undertaking. I’d like to finally get it done through one of these services, but my gut tells me it feels wrong.

Has anybody used them and how do you feel about it? Is DeleteMe a good choice?

 

I have a Dell PowerEdge R720xd in RAID10. I've had a couple of drives fail since I bought it and was able to buy cheap replacements on eBay.

I had another drive fail recently, and one of the spare eBay drives came up as "blocked". That set me back a few days while I waited for a new one to arrive, also from eBay.

I'd like to avoid getting another dud drive. Are there any reputable resellers of these old drives so I can stock up on some spares?

 

I’ve made a few posts in the past about my experimentation with connecting various devices and servers over a VPN (hub and spoke configuration) as well as my struggles adapting my setup towards a mesh network.

I recently decided to give a mesh setup another go. My service of choice is Nebula. Very easy to grasp the system and get it up and running.

My newest hurdle is now enabling access to the nebula network at the same time as being connected to my VPN service. At least on iOS, you cannot utilize a mesh network and a VPN simultaneously.

TLDR: Is it a bad or a brilliant idea to connect my iOS device to a Nebula mesh network to access, for example, my security camera server, while also routing all traffic/web requests through another Nebula host that runs a VPN such as Mullvad? That way I could use my phone over a VPN connection while still having access to my mesh network servers.
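Roughly what I'm imagining, sketched against Nebula's `unsafe_routes` option (the addresses are placeholders, and I haven't verified this works from the iOS app):

```yaml
# Client-side config sketch (assumed addresses): send all traffic through
# the Nebula host 192.168.100.5, which itself runs the Mullvad tunnel.
# Some setups split the default route into 0.0.0.0/1 and 128.0.0.0/1 so it
# doesn't clash with the device's existing default route.
tun:
  unsafe_routes:
    - route: 0.0.0.0/0
      via: 192.168.100.5
```

The exit host would also need IP forwarding enabled, NAT out through the Mullvad interface, the route permitted in its certificate/firewall config, and care taken that Mullvad's own kill switch doesn't swallow the Nebula traffic.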

 

As the title says, I'm trying to multiboot Fedora 40 and Ubuntu 24. The documentation and guides for this all seem pretty outdated through my searching and troubleshooting.

I currently have ubuntu installed. My drive partition table looks like this:

  • sda1 -- EFI (250MB)
  • sda2 -- /boot (ext4, 2GB)
  • sda3 -- empty (ext4, 2TB) <-- Fedora partition
  • sda4 -- Ubuntu 24 (LUKS encrypted, 2TB)

I'm trying to install Fedora now and it's giving me nothing but errors. The most useful guide I found for this specific setup just has you adding sda3 as the installation path (mounted at /) for Fedora, and it's supposed to figure out the EFI and boot partitions, but that doesn't happen. In fact, the EFI and /boot partitions show up under an "Unknown" tab in the Fedora custom partitioning window of the installer. They should be under a tab such as "Ubuntu 24 LTS". Fedora isn't recognizing the Ubuntu installation (because it's encrypted?).

Am I wrong in assuming that both OSes should share the EFI and /boot partitions? Maybe that's the issue?

Anybody out there successfully dual booting Linux distros with both distros encrypted?

 

For years I’ve had a dream of building a rack mounted PC capable of splitting its resources to host multiple GPU intensive VMs:

  • a few gaming VMs
  • a VM for work that can run Davinci Resolve and Blender renders
  • an LLM server
  • a Stable Diffusion server
  • media server

Just to name a few possibilities…

Every time I've looked into it, it seemed like the technology just wasn't there yet. I remember a few years ago Linus TT took a shot at it, but in the end suggested the technology (for non-commercial entities) just wasn't in a comfortable spot yet.

So how far off are we? Obviously AI focused companies seem to make it work, but what possibilities exist for us self-hosters who might also want to run multiple displays in addition to the web gui LLM servers? And without forking out crazy money for GPU virtualization software licenses?

 

HACS has a problem with hitting the GitHub rate limit when you first install it. It’s not really that big of a deal. You usually just need to wait an hour for the local database to populate.

It used to be optional to link your GitHub to HACS to bypass the rate limiting but now it seems the installation requires it.

I'm not a fan of this as somebody who uses Homeassistant for its privacy values, and I'm kind of frustrated that HACS removed the ability to install without a GitHub API key.

Is there a manual way to override the API linking process?

 

Would this work or would I have problems:

Using the dd command to back up an entire SSD containing dual-boot Windows/Ubuntu partitions into a raw image file (not a true .iso, despite the extension), with the intent to then dd that image back onto a same-size SSD in case of a drive failure?
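To make the question concrete, here's a miniature round trip of what I mean on a scratch file (with the real disk it would be `if=/dev/sdX` run from a live USB, with nothing on the drive mounted read-write, and the image written to external storage):

```shell
# Miniature dd round-trip on a ~1 MiB scratch file standing in for the SSD.
src=$(mktemp); img=$(mktemp); dst=$(mktemp)
head -c 1048576 /dev/urandom > "$src"                # stand-in for the SSD
dd if="$src" of="$img" bs=4M conv=fsync 2>/dev/null  # "backup" to an image
dd if="$img" of="$dst" bs=4M conv=fsync 2>/dev/null  # "restore" to a new SSD
cmp "$src" "$dst" && echo "round-trip OK"
```

Because dd copies the whole device byte for byte, the image would include the partition table, both OS partitions, and the bootloader, so the restore target needs to be at least as large as the original.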
