That’s a weird read considering I had to move to Wayland because X11 had severe screen tearing. I would have guessed Wayland had better support.
I don’t think there are services like that, since it usually means deploying and destroying an instance, which takes a few minutes (if you just turn the instance off you still get billed).
Probably the best option would be to keep a snapshot, which costs way less than the actual instance, and create an instance from it each day or so to run on the images accumulated since it was last destroyed.
This is kind of what I do with my media collection: I process it on my main machine with a GPU, and then just serve it from a low-power one with Jellyfin.
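A rough sketch of that create-process-destroy cycle, using the AWS CLI as an example (the AMI/instance IDs and the instance type are placeholders; other providers’ CLIs work the same way):

    # spin up an instance from the saved snapshot/AMI (IDs are made up)
    aws ec2 run-instances --image-id ami-0123456789abcdef0 --instance-type g4dn.xlarge
    # ...run the processing job, then destroy it so the hourly billing stops
    aws ec2 terminate-instances --instance-ids i-0123456789abcdef0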
IIRC this was already addressed and should be automatic.
There was an issue specifically mentioning GDPR, and the devs implemented a way to automatically delete an account’s data within the given time.
It’s not a GDPR request in itself, but AFAIK a normal delete-account request should be compliant… IANAL
Start by learning Docker. You don’t have to selfhost anything yet, just learn to run a container, especially how to run automated stuff. Then learn to build images and run docker compose.
You could also start looking into some form of infrastructure as code; I usually hear about Ansible and NixOS.
This gives you a way to easily redeploy your services on any hardware, as sketched below.
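For example, a first compose file can be as small as this (a minimal sketch; nginx is just a stand-in image, swap in whatever service you want to try):

    # compose.yaml — service name, image, and port are arbitrary examples
    services:
      web:
        image: nginx:alpine
        ports:
          - "8080:80"
        restart: unless-stopped

Then docker compose up -d starts it, docker compose down removes it, and the same file redeploys the service on any machine with Docker installed.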
Does it apply to all feeds, or can it detect which feeds are actually YouTube ones?
Why do you need the files locally?
Is your network that slow?
I’ve heard of multiple content creators who keep their video files on their NAS to share with their editors, and they work directly from the NAS.
Could you do the same? You’ll be working with music, so the network traffic will be lower than with video.
If you do this you just need a way to mount the remote directory, either with rclone or with sshfs.
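Either of these should work (paths and remote names are made up, adjust to your setup):

    # mount the NAS music directory over plain SSH
    sshfs user@nas:/srv/music ~/nas-music
    # or with rclone, assuming you already configured an SFTP remote named "nas"
    rclone mount nas:/srv/music ~/nas-music --daemon --vfs-cache-mode writes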
> The disks on my NAS go to sleep after 10 minutes idle time and if possible I would prefer not waking them up all the time
I think this is a good strategy to avoid putting additional stress on your drives (speaking as a non-expert on NAS), but I’ve read that most of the actual wear and tear on drives happens during this process of spinning up and down. That’s why NAS drives should be kept spinning all the time.
And drives built specifically for NAS setups are designed with this in mind.
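If you do want them spinning all the time, the spindown timer can usually be disabled per drive with hdparm (the device name is just an example, and some NAS software manages this itself):

    # -S 0 disables the standby (spindown) timer so the drive keeps spinning
    sudo hdparm -S 0 /dev/sda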
IIRC: the webp and webm file extensions, and the VP8/VP9 video formats.
I could even go further and say: always test every change you make; don’t assume the change took effect just because you updated a file.
I use rclone or duplicati depending on the needs of the backup.
For long term I use duplicati: it has a GUI, and you can upload to several places (mine are spread between e2 and Drive).
You configure the backend, the password for encryption, the schedule, and version retention.
With rclone and its crypt submodule, you mount your backups as an external drive, so you need to manually handle the actual copying of data into it, plus versioning and retention.
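A minimal sketch of that rclone flow (remote and folder names are made up; “secret” would be a crypt remote you’ve already set up with rclone config on top of another remote):

    # mount the encrypted remote like an external drive
    rclone mount secret: ~/backups --daemon --vfs-cache-mode writes
    # copying the data in (and any versioning/retention) is then on you
    rclone copy ~/documents secret:documents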
I can’t give you the technical explanation, but it works.
My Caddyfile only has something like this:
    @forgejo host forgejo.pe1uca
    handle @forgejo {
        reverse_proxy :8000
    }
and everything else has worked properly, including cloning via SSH with git@forgejo.pe1uca:pe1uca/my_repo.git.
My guess is git only needs the host to resolve the IP and then connects to the SSH port directly, so it never goes through the reverse proxy.
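That matches how SSH works in general: Caddy only proxies HTTP(S), while git’s SSH traffic goes straight to the SSH daemon. If sshd listened on a non-standard port you’d handle it client-side, e.g. (port 2222 is hypothetical):

    # ~/.ssh/config
    Host forgejo.pe1uca
        User git
        Port 2222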
Ohhh! Now I understand!
Yeah, then that’s an issue on Mastodon’s side.
As I mentioned some time ago, the fact that Mastodon and Lemmy use the same protocol is annoying because the experiences are different, so it causes a lot of issues :/
Unless the Lemmy devs have changed something since last year, this shouldn’t be the case; there’s a bug in there.
All interactions are received by the instance hosting the community, and that instance is responsible for broadcasting each interaction to every instance that has a subscribed user.
So Mastodon is only responsible for sending the upvote to feddit.dk, and then feddit.dk sends it to all the other instances.
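For reference, the upvote travels as a plain ActivityPub Like activity, roughly like this (the IDs and URLs are made up):

    {
      "@context": "https://www.w3.org/ns/activitystreams",
      "id": "https://mastodon.example/activities/123",
      "type": "Like",
      "actor": "https://mastodon.example/users/alice",
      "object": "https://feddit.dk/comment/456"
    }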
I’m not saying to delete anything, I’m saying the file system could save space with something similar to deduplication.
If I understand correctly, deduping works by pointing similar files at the same data blocks, so there’s no actual data loss.
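On file systems that support it this already exists, e.g. on btrfs or XFS (the paths are placeholders):

    # make a copy that shares data blocks with the original instead of duplicating them
    cp --reflink=always original.bin copy.bin
    # or scan existing files and deduplicate identical blocks after the fact
    duperemove -dr /path/to/files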
I had a similar case.
My mini PC has a microSD card slot and I figured if it could be done for an RPi, why not for a mini PC? :P
After a few months I bought a new M.2 NVMe drive, but I didn’t want to start from scratch (maybe I should’ve looked into Nix?)
So what I did was:
    sudo dd if=/dev/sda of=/dev/sdc bs=1024k status=progress
And that worked perfectly!
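One follow-up I’d expect when cloning to a bigger drive (an assumption on my part, since dd copies the partition table as-is) is growing the last partition and its file system afterwards:

    # grow partition 2 to fill the new disk, then resize the ext4 file system
    sudo growpart /dev/sdc 2
    sudo resize2fs /dev/sdc2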
Things to note:
What?
Well, I can only speak for myself: I’m not here to follow users but communities.
And if someone wants to follow me, I’d see it as kind of annoying for them to see all the different topics I post and comment on instead of something focused.
IMO the ability to see Mastodon interactions in Lemmy and vice versa is quite annoying, since they use the same protocol for different experiences.
Text to speech is what piper does.
What I’m looking for is called a voice changer, since I want to change a voice that has already read something.
That’s exactly what I want: “the thing in the Darth Vader halloween masks”, but for Linux, preferably via CLI so I can ingest audio files and configure it to change the voice however I want, not only to Darth Vader.
I don’t want to manage piper voices; I can handle that directly in my file system since I only have a few.
The issue is that none of the voices I’ve found are good for me, so what I need is something to change the voice once piper has generated the audio.
I haven’t fully looked into creating a model for piper, but having to deal with a dataset (gathering the data and everything that implies) is not something I look forward to.
So I’m thinking it’s easier to take an existing model and make adjustments so it fits a bit better what I would like to hear constantly.
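For the post-processing step, sox can already do basic voice changing on piper’s output (the model name and pitch amount are just examples):

    # generate speech with piper, then shift the pitch down 300 cents with sox
    echo 'Hello there' | piper --model en_US-lessac-medium.onnx --output_file out.wav
    sox out.wav changed.wav pitch -300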
I’m looking at this in Eternity, and it seems only spoilers don’t work from the post you linked.
User and community links work properly.
With Invidious in FreshRSS, I use the YouTube extension for the embedded video player; you just need to update this part of the code: https://github.com/FreshRSS/Extensions/blob/master/xExtension-YouTube/extension.php#L153-L163
It’s easy, just replace it with this:
    public function getHtmlContentForLink(FreshRSS_Entry $entry, string $link): string {
        $domain = 'www.youtube.com';
        if ($this->useNoCookie) {
            $domain = 'www.youtube-nocookie.com';
        }
        // point the embed at your Invidious instance instead
        $domain = 'invidious.personal.com';
        // ask the player for DASH quality
        $params = 'quality=dash';
        $url = str_replace('//www.youtube.com/watch?v=', '//'.$domain.'/embed/', $link);
        $url = str_replace('http://', 'https://', $url);
        $url = $url . '?' . $params;
        return $this->getHtml($entry, $url);
    }
The only changes are to use
    $domain = 'invidious.personal.com';
and to add the parameter
    quality=dash
There also seems to be this one: https://github.com/tunbridgep/freshrss-invidious
but I haven’t tried it.