Out of the Dog House

I just got over a cold, and I am back at home. My parents went on a two-week cruise, so I got to take care of the dogs. Because my parents have a fenced yard and their house is closer to my work, it was easier just to stay there. I got a lot of Zelda playing done; have I mentioned how much I love my Steam Deck?

Actually, I put time into my 3DS. One of the Zelda games I decided on was one I had never played. I own A Link Between Worlds, and decided my stay at my parents’ was a good time to play it. Before I booted up the game, I decided to hack my 3DS. Most people who hack their 3DS do so to pirate games; let’s not pretend otherwise. However, I did it for a weird reason: I wanted to take screenshots.

It’s really easy to hack your 3DS. I had to switch the SD card between the 3DS and a computer a lot, but I did not run into any issues. After I was done, I was able to enable cheats, download homebrew games, FTP into the console, and take screenshots. There are a few other features, but the screenshots and FTP are the coolest parts to me.

A Link Between Worlds is a very beautiful game, and I took a lot of screenshots. I should not have been surprised, but I was a little shocked at how small the screenshots are on a PC! The top screen captures are 800×240, so they are pretty small. I can use GIMP to upscale them, but that makes them blurry, so I opted to try the AI route.

I found Upscayl. It’s free to download, and it has a couple of different methods built in for upscaling. A dedicated GPU is recommended, but I was able to use it on my aging laptop. It took a while, and one of the methods produced a distorted mess, but most of the attempts turned out great. When I got home, I did a few on my desktop PC, and it went quickly.

Here is one of the original screenshots:

Zelda Link Between Worlds Screenshot - original size

It is small and definitely needs some upscaling. Upscayl only has two sizing options, 4x and 8x, and both greatly increase the file size. I’m happy with anything between 1080p and 4K, so I use GIMP to scale the image back down to a reasonable resolution. Here are some of my results:

Zelda Link Between Worlds screenshot upscaled with remacri

Pretty good, right? Upscayl has other options, and I think the digital art one is interesting. It kind of anime-izes the image. Here’s the result:

Zelda Link Between Worlds digital art

It’s cool, but I think it removes too much detail, and takes away from the original. The jagged edges give the other one character.

Outside of messing around with screenshots, I worked on some Python projects. I am bad at coming up with efficient ways of doing things with scripts. I usually brute-force my way into a satisfactory result. My Games website is built on the back of inefficiency. You can read about the process on the About page. The abridged version is that I use Python scripts to process an offline database, output a bunch of web pages, and upload them to the server. Over the years, I have made generating the web pages faster, and I am happy with the speed for the most part. The upload script was bad, though. While at my parents’, I woke up one day and came up with a solution.

You would think an upload script would be easy. Compare the local files to the server files. If a file doesn’t exist on the server, upload it. If it doesn’t exist locally, remove it from the server. If the file is newer locally, overwrite the server copy. If the local file is older, keep the server copy. Easy enough to program.
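That decision table can be sketched in a few lines of Python. This is just a sketch of the naive approach, not my actual script; the two dicts mapping paths to modification times are assumed inputs, and in reality the server-side listing would have to come from an FTP session or similar:

```python
def plan_sync(local, server):
    """Naive mtime-based sync plan.

    local and server map relative path -> modification time.
    Returns (uploads, deletions).
    """
    uploads = []
    deletions = []
    for path, mtime in local.items():
        # New locally, or newer locally -> upload (overwrites the server copy).
        if path not in server or mtime > server[path]:
            uploads.append(path)
    for path in server:
        # Gone locally -> remove from the server.
        if path not in local:
            deletions.append(path)
    return uploads, deletions
```

The catch, as described below, is that this plan is only as good as the modification times feeding it.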

However, the modified dates of the local files are a lie. Most of them are completely regenerated every time I run the scripts, so they will always look newer than the server versions. I ended up finding a way to make it work, but the script took ten to fifteen minutes to run. It essentially removes and re-uploads all of the game pages. Even after the upload script ran, I still needed to manually upload achievement icons, and any changes to images. I made it work, but I finally got tired of it.

I devised a system where the upload script saves another copy of the database data after it uploads. When the database data changes, the script can compare the new and old data and see the changes. It then uploads only the files that changed between uploads. While ripping into the script, I also added logic to check images and achievement icons. The image check is the longest part of the script, but it is not too bad. In the end, the script takes about two and a half minutes to run. It’s a huge improvement, and I no longer have to manually upload images or icons.
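The core of the snapshot idea could look something like this. It is a minimal sketch under my own assumptions: the database data is a dict keyed by some record ID, the snapshot lives in a JSON file, and the `last_upload.json` filename is hypothetical:

```python
import json
from pathlib import Path

SNAPSHOT = Path("last_upload.json")  # hypothetical location for the saved copy

def changed_records(current):
    """Compare the current database data against the snapshot saved after
    the last upload, and return only the keys whose records changed."""
    old = json.loads(SNAPSHOT.read_text()) if SNAPSHOT.exists() else {}
    changed = [key for key, record in current.items() if old.get(key) != record]
    # Save the new snapshot so the next run only sees fresh changes.
    SNAPSHOT.write_text(json.dumps(current))
    return changed
```

Because the comparison is on the data itself rather than file modification times, regenerating every page no longer tricks the script into re-uploading everything.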

The other scripting project is a weird one. When I first started Royfuss, I manually typed up blog posts in an HTML editor offline, and then uploaded the files. If anything happened to my website, I had an offline copy ready to re-upload. A few months after starting Royfuss, my brother created a basic PHP/MySQL app that allowed me to publish blog posts. As a result, I stopped making offline copies of my posts. In July 2006, our server died, and our web host brought up a new one and tried to restore data. Unfortunately, we lost about three months of data. After that, I went back to typing up blog posts, and manually uploading them in HTML. I switched hosts after that, and have had zero issues since. It took almost five years, but I ended up setting up a database, and switching to WordPress.

I still do backups of the database, the images, and WordPress. If something were to happen, I would be able to set up a new WordPress instance and do an import. But I am weird, and my paranoia runs deep. I worry about an update causing data loss. Old data might become incompatible with a new version of the database, PHP, WordPress, or something else.

In the event that I needed to migrate my posts to a new instance of WordPress, I think I would just create a static copy and start anew. I am nearing 100 posts, and I feel like that is too many (even though it’s not). For my sanity, I created a lifeboat in the form of Python scripts. One script scrapes my WordPress posts. It grabs the title, metadata, and post content. For the post content, I preserve all of the WordPress tags, so I have as much data saved as possible. It outputs each post into a file that is easily digestible by BeautifulSoup. It works quickly, but I still need to manually back up the images.
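The extraction step might look something like this. It is a sketch, not my actual script, and the `entry-title` and `entry-content` class names are common WordPress defaults rather than something confirmed for my theme:

```python
from bs4 import BeautifulSoup

def extract_post(html):
    """Pull the title and raw post body out of a WordPress post page.

    decode_contents() returns the inner HTML verbatim, so all of the
    WordPress markup inside the post body is preserved.
    """
    soup = BeautifulSoup(html, "html.parser")
    title = soup.find(class_="entry-title").get_text(strip=True)
    content = soup.find(class_="entry-content").decode_contents()
    return {"title": title, "content": content}
```

Keeping the inner tags intact means the saved files can always be re-parsed later for whatever format I need.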

As a test, I whipped up a script that scans the output files and tries to convert them into the look of my older archives. I got it working, and the posts look pretty good in that format. I would have to do some overhauling of the archival pages, but I’ll cross that bridge if I need to. For now, I am content with the scripts.

All this has got me thinking about the look of my current site, and I have decided I am going to switch themes. I want a different look, and QI cannot match what I want. It’s a freemium theme, and a lot of basic features are locked behind a paywall; I have to manually edit the code every time there is an update to get around it. There are a lot of themes like that, but I think I found one I like. I might even get it installed tonight. We’ll see.
