What it takes to set up a new domain name (for me)
1 December 2023 | 7:29 pm

While I’m trying to get certificates for reinfo.wiki, I’m going to talk about setting up new domain names for stuff.

Usually, I put my stuff under a subdomain of jacksonchen666.com, like videos.jacksonchen666.com, status.jacksonchen666.com, or files.jacksonchen666.com.

Those kinds of new domains are easy to deal with:

  1. Add more nginx configuration
  2. Use the new subdomain as the server_name (rough sketch below)

And that’s it. The server software part is pretty specific and setup varies, so that’s not included in the list.
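
For illustration, the nginx side is roughly something like this (a minimal sketch; the certificate paths and backend port are placeholders, not my actual config):

server {
    listen 443 ssl;
    listen [::]:443 ssl;
    server_name videos.jacksonchen666.com;

    # placeholder certificate paths
    ssl_certificate /etc/ssl/placeholder/fullchain.pem;
    ssl_certificate_key /etc/ssl/placeholder/privkey.pem;

    location / {
        # whatever the actual server software listens on
        proxy_pass http://127.0.0.1:8080;
    }
}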

But what about the cases of new different domains? Like reinfo.wiki?

Well, that involves a lot more setup. Here’s the case for reinfo.wiki:

  1. Get the domain name
  2. Set up an authoritative nameserver (if not set up already)
  3. Add a domain in deSEC.io
  4. Set up DNSSEC
  5. Set up DNS records
  6. Get certificates
  7. Set up the software that’s gonna run the thing (already done)
  8. Set up the Tor Onions
  9. Add nginx configuration
  10. Do HSTS preloading
  11. Update prometheus and blackbox configuration (the uptime checker)
  12. Update links in many places (specific for reinfo wiki, as it already existed)
  13. Add CAA records (example below)

Yeah, there’s a lot of work involved in a new domain name that isn’t just a subdomain.
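
To give an example for the CAA step: a CAA record limits which certificate authorities are allowed to issue certificates for the domain. In zone-file form it looks something like this (Let’s Encrypt is just the example CA here; swap in whatever CA actually issues your certificates):

reinfo.wiki.    3600    IN    CAA    0 issue "letsencrypt.org"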

So yeah that’s it I guess. There’s not much special about the setup.


The Logistical Mess of Trying to Process a Lot of Videos
26 November 2023 | 2:51 pm

Warning: Rushed blog post. Yeah.

The last time vs Now

Remember last time? Yeah, this time I’m doing it for over 700 gigabytes of videos, with more specific parameters to ffmpeg, and with more videos still being added to the pile.

Again, it’s gameplay videos (because it turns out I’m back to playing that game again). This time, with more specific parameters.

Goals

This time, I wanted to do the following:

  • Preserve 4k resolution (for text clarity)
  • Make the quality of the videos OK-ish but not horrible
  • Keep 60 fps
  • Use much less space than the original recording

I settled on using -crf 29 -preset slow to keep file sizes low and quality OK enough.
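
For context, the full command is roughly along these lines (a simplified sketch; assume libx264 here and the audio stream left untouched, the real invocation has a few more options):

ffmpeg -i input.mkv -c:v libx264 -crf 29 -preset slow -c:a copy output.mkv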

Not enough time

However, there’s a major problem with -preset slow.

It’s slow (duh). Like, way too slow for me to process a single hour of video in less than 10 hours.

After spending many weeks trying to process a single video, I decided that my Macbook alone, with maybe 8 hours of availability to process videos, was not enough. I had to go further with the amount of time available.

An idea

After some thought, I decided that I would have to use every other computer available to me. Here’s the list of hostnames of the computers I could use for video processing:

  • imac
  • stupid-desktop
  • stupid-laptop

Actual setup

The actual setup was a bit different.

I tried using laptop-server, but I forgot it doesn’t have enough RAM, and ffmpeg needed 2 or 4 gigabytes of RAM for some reason. So I use it as a middleman for all the video files (using an external SSD), and as a place to hold videos while I’m away.

The computers were set up to run in a while loop, constantly going over all the videos that hadn’t been processed yet and then processing them.
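
The loop itself isn’t anything fancy. Here’s a rough sketch of the idea in Python (not the actual script; the directories, file extension, and exact ffmpeg arguments are placeholders):

import subprocess
import time
from pathlib import Path

TODO = Path("videos/todo")  # unprocessed videos pulled from the middleman
DONE = Path("videos/done")  # processed videos waiting to go back

DONE.mkdir(parents=True, exist_ok=True)

while True:
    for src in sorted(TODO.glob("*.mkv")):
        dst = DONE / src.name  # same filename, different folder
        if dst.exists():
            continue  # already processed, skip it
        subprocess.run([
            "ffmpeg", "-i", str(src),
            "-c:v", "libx264", "-crf", "29", "-preset", "slow",
            "-c:a", "copy", str(dst),
        ], check=True)
    time.sleep(60)  # wait a bit before checking for newly arrived videos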

Now for the other shenanigans.

File synchronization

To get files from my SSD to the computers, I used laptop-server as a middleman with another SSD plugged in. It’s actually my old 2TB SanDisk SSD, from before I needed to get a larger SSD for more crap.

I would transfer the video files onto the SSD plugged into laptop-server, then make the processing computers pull the video files from there.

I would then transfer the finished video files back onto laptop-server, then back to my actual SSD where I store my crap.

This extra roundtrip is pretty inefficient, but it does (technically) eliminate the extra trip of transferring files directly onto each processing computer. The middleman is also kind of used as a place to hold files before I get the time to actually grab them.

It also allows using a single rsync command to grab all the processed videos, as long as all of them are on the middleman. And I did use rsync for all of this file transfer stuff.
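
Concretely, the transfers were a handful of rsync commands along these lines (the hostname is real, the paths are made up for illustration):

# push raw recordings from my SSD to the middleman
rsync -avP /mnt/my-ssd/raw/ laptop-server:/mnt/old-ssd/raw/
# each processing computer pulls its share of videos
rsync -avP laptop-server:/mnt/old-ssd/raw/ ~/videos/todo/
# processed videos go back to the middleman
rsync -avP ~/videos/done/ laptop-server:/mnt/old-ssd/processed/
# and one command grabs everything that's finished
rsync -avP laptop-server:/mnt/old-ssd/processed/ /mnt/my-ssd/processed/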

If run on LockedMac

The actual time it took to process 1 hour of video is about 12 hours, which means processing over 72 hours of video would take a long fucking time (72 × 12 = 864 hours, or roughly 36 days, on a single machine).

So processing it on my Macbook (LockedMac) would’ve been a major pain and would take way too long.

actually it will still take too long whatever just continue with this ok

Charging for stupid-laptop

Of course, due to resource constraints (lack of cables for charging stupid-laptop), I also had to rearrange my charging setup a little bit:

  • Charging brick for f had to be given to stupid-laptop
  • Charging cable for f was plugged into stupid-desktop
  • Another charging cable was provided to stupid-laptop

This was what I could do with what I had. Not much, and it will change when I have to take the charging brick (although not by much, and not in a very disruptive way).

Speedrunning an install on stupid-desktop

I didn’t have an OS installed on stupid-desktop, so I had to install one.

I followed my own blog post for Alpine Linux with /boot/ and / encrypted, and finished it before I had to leave to do other things. It took about half an hour to an hour.

Doing all the commands was quite manual work, so later I decided to make a script that does all of it. It’s unlikely to be as robust as Alpine’s setup-disk, but hey, it works (although I haven’t published it yet).

Networking for stupid-desktop

stupid-desktop is a desktop computer. It does not have Wi-Fi, so I had to give it my Ethernet cable so it could get access to the files.

That meant LockedMac would have to use Wi-Fi, which I had experienced many problems with in the past few days just trying to play games.

Storage and space management

I also had to manage the space available and used.

Each hour of unprocessed video is around 20 gigabytes. Yes, 20 gigabytes per hour. That’s about 44444 kilobits per second (20 gigabytes is 160 gigabits, spread over 3600 seconds). Very intense footage, and it makes sense given that it’s also in 4k 60 fps.

I could maybe give each computer 100 gigabytes before there is concern about running out of space for processed videos. So currently, all computers have 100 gigabytes of videos to deal with.

Long-term solutions

This kind of setup was actually for the short term, to process everything I had right now, and quickly (hopefully).

The longer-term plan would be to get a computer that can handle a stupid amount of CPU transcoding crap (and other intensive stuff), then use that for my PeerTube and whatever else.


Recovering From Syncthing Deleting My Data
19 November 2023 | 9:49 pm

Syncthing-fork

So I decided to try Syncthing-fork to see if it was any better at battery life (spoilers: I have no idea).

To my surprise, it has many more tweaking options and stuff, plus more information.

However, I now wanted to migrate from Syncthing to Syncthing-fork because now I wanted to use Syncthing-fork.

The easiest way was to just export the config from Syncthing and import it into Syncthing-fork. Easy and done.

Data loss

I explored around the app, saw a folder with the “revert local changes” button, promptly pressed it, and didn’t think about it more.

Much later, I wanted to use my password manager to log in to GitHub to find the code that defines the mode function in libqalculate (actually, the specifics don’t matter, I just wanted to use my password manager). I open my password manager app, and it says file not found.

So I went ahead and looked in my Sync folder (at the root of user-facing internal storage). I found… nothing but .stfolder and .stversions.

Well, fuck. I’ve just lost data, and setting it all back up was going to be a pain.

Complaints sent

I file an issue about the data loss I had, then move on to some other thing.

Data recovery

Later, I look into the .stversions folder. And I realize that my data is literally right there. It’s all there.

So I made a copy of all my data in .stversions by doing an rsync from my phone to my computer in Termux. I think it took over 5 hours.

(In hindsight, I did not need to make a copy of 100GB of data, only the filenames. But hey, who knows what the future can hold.)
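
For the curious, the copy itself was basically a single rsync run from Termux, something like this (the destination host and the exact paths are placeholders):

rsync -avP /sdcard/Sync/.stversions/ user@computer:/backups/stversions/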

Anyways, looking at the filenames, I came up with this Python program to change the names to remove the Syncthing time marker:

import re

# Read the list of recovered file names, one per line.
filenames = open("list").read()

# Strip the "~YYYYMMDD-HHMMSS" marker Syncthing adds to versioned files,
# keeping everything before and after it on each line.
result = re.sub(r"(.*)~\d{8}-\d{6}(.*)", r"\1\2", filenames, flags=re.MULTILINE)

if result:
    with open("list2", "w") as f:
        f.write(result)

(Code is under public domain/CC0-1.0 by the way)

The program gets file contents from a file called list, does some regex (which should only remove the last date-and-time part Syncthing added to the filename), and writes the result to list2.

Renaming the files was done with a shell script with a bunch of mv commands, not the Python program.

After making the (hyperspecific to my setup and files) shell script, I put it on my phone and transferred it.
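
For reference, the pairing step could be done something like this (a rough sketch of turning list and list2 into mv commands; the real shell script was hand-made and hyperspecific, and the quoting here is naive):

# Sketch: pair old and new names line by line and emit mv commands.
# Assumes list and list2 line up exactly and filenames contain no single quotes.
old_names = open("list").read().splitlines()
new_names = open("list2").read().splitlines()

with open("rename.sh", "w") as f:
    f.write("#!/bin/sh\n")
    for old, new in zip(old_names, new_names):
        if old != new:
            f.write(f"mv -n '{old}' '{new}'\n")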

I then also had to restart my phone to fix Syncthing-fork not using the right private keys.

And my data is back, and Syncthing-fork is working.

Bye bye Syncthing (not fork)

After all that, I just deleted Syncthing (not the fork one) from my phone, because I already had another version (it was the fork version).

Post-mortem

So how did Syncthing just delete all my data? And keep it?

Well, here’s how it went:

  1. A receive only folder was set to the entire Sync folder
  2. I revert local changes
  3. It deletes (and saves a copy into .stversions)
  4. I realize the issue (trying to use my password manager)

I have no idea how one of my receive only folders was set to the entire Sync folder, but I guess it was in that state.


