Retrobrite

Disclaimer

There are many reasons why you probably SHOULDN’T Retrobrite your yellowing retro devices. The harsh chemicals are not good for your skin, and they CAN break down the plastic if you keep reapplying Retrobrite over time. Also, while this does work to reduce (and sometimes completely remove) the yellowing of your retro device, the yellowing will come back. Unless you change the materials, or find a way to break the plastic down at a molecular level and rebuild it with better materials, the plastic will continue to age and yellow.

What is Retrobrite?

I realized quickly when I began collecting that many of us are not the biggest fans of the look of aging retro consoles. Some systems, like the NES and SNES, yellow severely, while others are lucky and stay minty fresh. More than likely, the ones that end up in your hands will be the less pretty ones. And if you are like me, you would love to see that original color come back to life. Enter Retrobrite.

While the original “formula” for Retrobrite is somewhere on the internet, you can read more details about where the idea came from here. I will link some videos you can watch on how to attempt this with your own consoles. It is not perfect, but it works. And you will be shocked at where you can buy the materials you need.

My NES restoration

My first attempt at this was with my NES console and controllers. It turned out better than I thought, but it certainly has areas where it could have been much better. Here are some pictures:

So you want to try it out too?

Lucky for you, it is pretty easy as long as you have patience and the right materials. It also helps to set your expectations low: this is not a perfect science, unfortunately. As you can see in my pictures, there are certainly some areas where the plastic did not completely transition back to the original color, but it still looks better than yellow. Regardless, I suggest you check out the following people for various ways you can Retrobrite your old consoles. Please note, some of the materials you need could be hard to find. The salon cream most people recommend was overpriced everywhere except actual beauty stores. So I strongly suggest you check the places it is, you know, actually supposed to be sold at.

The 8-Bit Guy is the place to go if you need plain and simple instructions (with multiple techniques, might I add) on how to Retrobrite your consoles. There are also plenty of others on YouTube, or on various forums (Reddit), that have instructions as well. But for me, I would put my money on The 8-Bit Guy.

What is next?

I have a few things that need to be repaired within my collection, so expect to see some posts (and maybe a repair video) coming in the future. Currently, I am on the hunt for a good soldering iron and materials to assist with the repairs. So stay tuned!

A New Challenger Approaches!

Collection Update

Recently, I found a source who was helping their father sell his video game collection of 40 years. When they mentioned the size of the collection, I may have shed a tear or two. For privacy reasons, I will not disclose my source, but I will be buying from them multiple times in the future. So, why don’t we take a look at the goodies?

New Additions

What Is Next?

The SNES I acquired actually does not power on, so that is on the list to repair. Luckily, I have a few things I can do to narrow down the issue. My hunch is a bad fuse, which is an easy fix. However, I need to get my hands on a multimeter and a soldering iron. Plus, the Pokemon Special Edition Game Boy Color did not work originally. A battery had exploded at some point while it sat in the battery compartment, so there was massive corrosion on the terminals and on the board. After a good cleaning, it worked like a charm. Only, the speaker was toast.

Despite the repairs that are needed, I was very pleased. Not pictured are another copy of Pokemon Blue I received and another Atomic Purple Game Boy Color. My goal is probably to resell/trade these games to add to the collection. If you are interested, let me know!

Being Perfectly Human

Going to try something a little different here on this blog, and really just hold myself to a focus on self-care. I will post things I am doing for myself to read later, to reflect on, and to see how I grow as a person. As a perfectionist, simple tasks cause me to overthink to extremes, which in turn produces anxiety. While everyone experiences anxiety and handles it in their own way, I was able to hold it at bay for 20 years.

This year has been one of the hardest and most life-changing for me. I am changing for the better and focusing on the positive aspects of life, always looking forward and never looking back. The person I am now is who I want to be.

Stardew Collector’s Edition

It should come as no shock that I bought Fangamer’s Stardew Valley Collector’s Edition. I currently own the game on almost every platform and have poured hundreds of hours into it. It is, as you may know, one of my favorite games of all time. So, when Fangamer announced, with ConcernedApe, that they were releasing a collector’s edition for the Switch…I knew I had to get it.

The Contents

Starting off, the Collector’s Edition is available for both PC and the Nintendo Switch, appropriately priced at $72 (PC is $69). To most people, it may seem crazy to spend $72 on a game that normally costs $15 (or less when on sale). However, this Collector’s Edition is not meant for the casual Stardew fan; it is for the crazy people like me who eat, sleep, and breathe the game. Thankfully, Fangamer has made it possible to purchase just the physical edition of the game for both PC ($29) and Switch ($34).

So what does the $72 give you?

  1. Collector’s Box
  2. Wooden Standee
  3. Wooden Lapel Pin
  4. Junimo Comic
  5. Farm Deed
  6. Cleaning Cloth
  7. Physical Game and Case
  8. AN ACTUAL MANUAL!

How Is It?

After opening the Collector’s Box, which has killer artwork on it, you are presented with the physical copy of the game, which includes an actual manual with actually helpful information, something you don’t see anymore. But the exciting stuff is behind the insert holding the game.

The Farm Deed is printed on such high-quality stationery/card stock that I almost do not want to touch it, to preserve its condition. The Wooden Lapel Pin, Junimo comic, and cleaning cloth all seem to be made of high-quality materials. The design and artwork on everything is nothing short of fine pixel art, but my favorite piece of all this is the wooden standee.

It is made of high-quality pine and comes with a protective layer on both sides of every piece. When put together, it looks absolutely gorgeous. ConcernedApe has always done a fantastic job with Stardew Valley; I love the collaborations he is partaking in and hope he continues to be active with his community. I am so excited to happily place this on my shelf, along with all the contents that come with it.

There is a BOARD GAME?

Yes, there is! It was earlier this year, shortly after the 1.5 update released on consoles, that ConcernedApe dropped a bombshell: the Stardew Valley Board Game was available for purchase through the game’s website. At the time, I was purchasing the Collector’s Edition, so I could not get both, which is a shame because the board game sold out like hotcakes. Luckily, ConcernedApe has already announced that a new printing is underway and fans will be able to buy more in the future. You can check it out here, or simply Google it to find many raving reviews of how well put together the game is.

What is next?

I have some purchases lined up that I am excited to share with everyone, including some posts on some common repairs that can be done to older gaming consoles and handhelds. I should be posting these within the next few weeks. Until next time!

Collection Update

So I wanted to give everyone an update on where the collection is at this point. I really started collecting Nintendo consoles and games at the beginning of March, coincidentally just as COVID-19 was starting to consume our lives. Little did I know it would also assist in the further ballooning of “retro” game and console prices. Luckily for me, I was able to snag all of the Nintendo consoles except the Wii U and SNES last year.

The next focus will be on the handhelds, but I really want to get an SNES before those prices get worse than they already are. Additionally, I am trying to snag as many of the Pokemon games as I can. That is also going to be a challenge, as most of them are rising in price literally by the day.

Apart from that, I did recently purchase the Stardew Valley Collector’s Edition for the Nintendo Switch, and I plan to do a post about that unboxing. Plus, I want to show people how easy it is to Retrobrite older consoles that have yellowed over the years. There are plenty of ideas in the hopper for collection-related posts, but those will ramp up as the garage sale season starts. With being fully vaccinated by the end of April and working from home, I can take full advantage of the sales around me.


If you know, or you yourself are looking to sell some of your older retro games, feel free to reach out to theretrodba@gmail.com!

Checking Who Be Active

After talking about Ola’s Maintenance Solution for SQL Server, I wanted to make sure those new/accidental DBAs out there add a very helpful (and widely used) stored procedure to their toolbox.

sp_whoisactive

Created by Adam Machanic, sp_whoisactive shows you all of the active connections running at the time of execution. This can be very useful when trying to identify what may be inflicting pain on your SQL Server or, in some cases, help you figure out who keeps executing that pesky “SELECT * FROM”. It is easy to use, and it is well documented. You can read the documentation here or on Adam’s GitHub page. He has recently moved his updates to GitHub, so if you want the most up-to-date version of sp_whoisactive, go there.
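If you just want to see it in action, here is a minimal sketch of a basic call, plus two optional parameters I find handy (both are documented at the links at the bottom of this post):

EXEC sp_WhoIsActive; -- one row per active request

-- Optional extras: include the query plan and flag sessions at the
-- head of a blocking chain
EXEC sp_WhoIsActive
    @get_plans = 1,
    @find_block_leaders = 1;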

As always, I do want to point out a useful parameter: one that pulls in the sleeping SPIDs as well, so you can see the connections that are still open but just waiting for their next task. This is an over-generalization, but the idea is that these could still impact your SQL Server’s performance, even if they are not currently active.

EXEC sp_WhoIsActive @show_sleeping_spids = 2;

We can see that session_id 54 is in a “sleeping” status. While it is not currently querying our database server, at any time this user could execute a new query. Where this is helpful is in identifying a user, or even an application, with an open connection to the server that “recently” executed a query (details are in the “sql_text” column of the results). We can see system data on the performance impact the query had and, to some extent, what to expect the next time they run it. I am over-generalizing here, and I am sure some of you are screaming at the screen that I am forgetting about X detail, Y context, etc. But for those out there not “classically” trained in database administration, this is what they need.

Last Bit

Finally, my last tip: make sure to install the procedure in the same database (whether that is your master database or a user database created for your tools) as the rest of the stored procedures you use for the maintenance of your servers. Hopefully this quick post helped someone out there. Thanks again for reading.

http://whoisactive.com/

https://github.com/amachanic/sp_whoisactive/releases

So You Need A Maintenance Plan?


I mean it this time when I say this will be a quick one. For most DBAs, this information is already part of your bread and butter. But for some accidental DBAs, it may be useful.

One of the first things that was drilled into my brain was to ensure that we had a solid maintenance plan for our SQL Servers, complete with a consistent backup schedule and troubleshooting tools to assist when a problem arose. Luckily, the SQL community has developed many free, open-source tools that are widely used across many industries. Today, we are looking at one of the most important tools of them all.

Ola Hallengren’s Maintenance Solution

https://ola.hallengren.com/

This is the first thing I deploy to my new servers after SQL Server has been installed. It sets up various jobs (backups, integrity checks, reindexing, etc.) that you can easily schedule to your heart’s content. The parameters are WELL documented, and the community can assist with anything you run into if you hit an issue installing the plan. Out of the box, this is solid, but the customization is endless. Below are the general parameters that you SHOULD look at when installing the plan, as your needs will be different than mine.

USE [master] -- Specify the database in which the objects will be created.

1) I use a separate database, like a “DBADMIN” database, to store all my tools rather than master. But if you use master, just keep it set to this.

DECLARE @CreateJobs nvarchar(max)          = 'Y'         -- Specify whether jobs should be created.

2) If this is your first install, leave this as ‘Y’. If you are updating your plan because Ola pushed out a new version, make sure to switch this to ‘N’.

DECLARE @BackupDirectory nvarchar(max)     = NULL        -- Specify the backup root directory. If no directory is specified, the default backup directory is used.

3) I strongly suggest backing up your databases to another server; put that network path here. This sets the default backup path for your jobs, so change NULL to that path.

DECLARE @CleanupTime int                   = NULL        -- Time in hours, after which backup files are deleted. If no time is specified, then no backup files are deleted.

4) For me, 72 hours works here. But that does not mean the needs of your client/customer will be the same. Make sure to ask them! And if they don’t tell you, then maybe it is worth the effort to sit down with them and work out your RTO/RPO. (More on that in another post?)

DECLARE @OutputFileDirectory nvarchar(max) = NULL        -- Specify the output file directory. If no directory is specified, then the SQL Server error log directory is used.

5) Find a location on your server to output a separate log for this plan. You want this, trust me. With the path set, you can always take a look at the log to see why a job failed, in an easier-to-read format 🙂

DECLARE @LogToTable nvarchar(max)          = 'Y'         -- Log commands to a table.

6) I always leave this as ‘Y’, as there are times when I need to see what commands were passed (in case I screwed something up) in a previous job execution. The table will be created in whatever database you installed the plan in.
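Putting it all together, here is a sketch of what the top of MaintenanceSolution.sql could look like with the choices above plugged in (the server names and paths are hypothetical, so swap in your own):

USE [DBADMIN] -- my tools database instead of master

DECLARE @CreateJobs nvarchar(max)          = 'Y'  -- first install, so create the jobs
DECLARE @BackupDirectory nvarchar(max)     = N'\\BACKUPSERVER\SQLBackups' -- network path, not the local drive
DECLARE @CleanupTime int                   = 72   -- keep 72 hours of backups (ask your customer!)
DECLARE @OutputFileDirectory nvarchar(max) = N'D:\SQLLogs\MaintenanceSolution' -- separate, readable job logs
DECLARE @LogToTable nvarchar(max)          = 'Y'  -- keep the command log table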

That Wasn’t So HARD!?

After that, go ahead and execute the script, and now you have a solid maintenance plan! Adjust the schedules to your liking, and make sure to include your operator in the notifications if a job fails!
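You can set those notifications through the job properties in SSMS, or with a quick script like this sketch (the job name is one of the defaults the plan creates; the operator name is hypothetical):

EXEC msdb.dbo.sp_update_job
    @job_name = N'DatabaseBackup - USER_DATABASES - FULL',
    @notify_level_email = 2,                   -- 2 = e-mail on failure
    @notify_email_operator_name = N'DBA Team'; -- your operator here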

Changing Backup Paths, Better

Starting in the last week of February, I think I will try to do a smaller post every week with something new I learned or found useful while doing my job. The goal is to deliver small tidbits of info that provide some sort of value for those seeking it! This week, it is about changing the default backup path for your database maintenance jobs!

There will come a time when you need to change the default path for your backup jobs. For some of you, this might be a pretty straightforward task on a handful of servers. For the rest of us, it is not ideal to hop from one instance to the next, updating the default paths manually across several servers. Thankfully for everyone, our community has a solution (most of the time) for our daily struggles. And this situation is no different!

Identifying WHERE your backups are going


I know you are probably thinking, “How could ANYONE not know where their backups are going?” I will simply respond by saying: please see any sysadmin who was dubbed an accidental DBA by their company and told to “just figure it out.” Plus, maybe you are someone new to the gig who really has no guidance on where anything is! These things happen, people!

A quick solution would be to connect to your database servers and run the following command:

USE msdb;

-- List every job step whose command text mentions BACKUP
SELECT  j.job_id,
    s.srvname,
    j.name,
    js.step_id,
    js.command,
    j.enabled
FROM    dbo.sysjobs j
JOIN    dbo.sysjobsteps js
    ON  js.job_id = j.job_id
JOIN    master.dbo.sysservers s
    ON  s.srvid = j.originating_server_id
WHERE   js.command LIKE N'%BACKUP%';

This looks through the job tables in msdb, identifies job steps whose command text mentions a backup, and returns them to you. From this output, you can identify the job ID, the server name, the job name, which step the backup command is in, and the command itself. Unfortunately, I do not have a test server to show you the output of the command, so you will need to just trust me (or run it on your server!).

Change the backup path!

Now for the next step: the following script makes changing default backup paths an absolute BREEZE.

USE msdb;
DECLARE @Find nvarchar(max);
DECLARE @Replace nvarchar(max);
DECLARE @DebugOnly bit;
SET @Find = N'\\SERVERA\Backups';
SET @Replace = N'\\SERVERB\Backups';
SET @DebugOnly = 1;
IF OBJECT_ID(N'tempdb..#excludeJobs', N'U') IS NOT NULL
BEGIN
    DROP TABLE #excludeJobs;
END
CREATE TABLE #excludeJobs
(
    JobName sysname NOT NULL
        PRIMARY KEY CLUSTERED
);

INSERT INTO #excludeJobs (JobName)
VALUES ('The Name of a job you want to skip');

IF OBJECT_ID(N'tempdb..#deets', N'U') IS NOT NULL
DROP TABLE #deets;
CREATE TABLE #deets 
(
    JobName sysname NOT NULL
    , StepName sysname NOT NULL
    , OldCommand nvarchar(max) NOT NULL
    , NewCommand nvarchar(max) NOT NULL
    , PRIMARY KEY (JobName, StepName)
);
DECLARE @JobName sysname;
DECLARE @StepName sysname;
DECLARE @StepID int;
DECLARE @Command nvarchar(max);
DECLARE @NewCommand nvarchar(max);

BEGIN TRY
    BEGIN TRANSACTION;

    DECLARE cur CURSOR LOCAL FORWARD_ONLY STATIC READ_ONLY 
    FOR
    SELECT sj.name
        , sjs.step_name
        , sjs.step_id
        , sjs.command
    FROM dbo.sysjobsteps sjs
        INNER JOIN dbo.sysjobs sj ON sjs.job_id = sj.job_id
    WHERE sjs.command LIKE N'%' + @Find + N'%' ESCAPE N'|' COLLATE SQL_Latin1_General_CP1_CI_AS
        AND sj.enabled = 1
        AND NOT EXISTS (
            SELECT 1
            FROM #excludeJobs ej
            WHERE ej.JobName = sj.name
            )
    ORDER BY sj.name
        , sjs.step_name;

    OPEN cur;
    FETCH NEXT FROM cur INTO @JobName
        , @StepName
        , @StepID
        , @Command;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @NewCommand = REPLACE(@Command, @Find, @Replace) COLLATE SQL_Latin1_General_CP1_CI_AS;
        INSERT INTO #deets (JobName, StepName, OldCommand, NewCommand)
        SELECT JobName = @JobName
            , StepName = @StepName
            , OldCommand = @Command
            , NewCommand = @NewCommand;

        IF @DebugOnly = 0
        BEGIN
            EXEC dbo.sp_update_jobstep @job_name = @JobName, @step_id = @StepID, @command = @NewCommand;
            PRINT N'Updated ' + @JobName;
        END
    
        FETCH NEXT FROM cur INTO @JobName
            , @StepName
            , @StepID
            , @Command;
    END
    CLOSE cur;
    DEALLOCATE cur;

    SELECT *
    FROM #deets;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
    BEGIN
        ROLLBACK TRANSACTION;
        PRINT N'Transaction rolled back';
    END
    PRINT ERROR_MESSAGE();
    PRINT ERROR_LINE();
END CATCH

Let’s focus on the parameters you can modify, which make this script super helpful. In @Find, insert the backup path you have already identified, either on your own or via the script I provided earlier. In @Replace, add the NEW default path you would like all of your backup jobs to use. Finally, and this is my favorite part, there is @DebugOnly. By default, I leave this set to 1, as it shows you the output of running the script without actually changing the default path.

Once you have set your parameters, run the script and take a look at the details provided. In the results, you can see the JobName, StepName, OldCommand, and NewCommand. By running the script in debug mode, you can verify that your new paths are correct. Once you are confident in the new path, set @DebugOnly to 0 and execute the script. I recommend running the script twice; the second execution should come up blank, signifying that your change was made successfully.

Do the backup jobs run successfully with the new path?

Since I cannot account for every situation, I will speak plainly: I strongly encourage you to run a system database backup just to verify that backups can be written to the new path. The last thing you want is to change the backup paths and then fail to verify that the database server can even write to them. My guess is you will receive an unwanted call once the server starts sending out emails saying backups are failing…
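One simple way to do that is a throwaway COPY_ONLY backup of a small system database straight to the new path (here I am reusing the @Replace path from the script above; the file name is just an example):

BACKUP DATABASE [master]
TO DISK = N'\\SERVERB\Backups\master_pathtest.bak'
WITH COPY_ONLY, CHECKSUM, COMPRESSION; -- drop COMPRESSION if your edition lacks it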

To be fair, the script I provided above is not my own; you can find the author’s write-up here:

Find-and-Replace for SQL Server Agent Jobs – SQL Server Science

Until next week! 

TempDB; Shrinking Responsibly

I just wanted to post this as a “lessons learned” for me, a very green DBA, as I wish I had known this before I ran into an issue in production. Yes, production. I hear all the people screaming “You monster! How could you!” Believe me, I was in the same boat. Thankfully, many other people have done the same!

So, let us get to the point. There will come a time when someone runs a query that bloats your tempdb (and any of its separate data files) and you begin to receive alerts. If you can, the first response should ALWAYS be to just extend the drive, and then look at why your tempdb expanded to fill it.
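If you want a starting point for the “why,” the standard DMVs can show which sessions are holding tempdb space. A minimal sketch:

-- Allocation counts are in 8 KB pages, so divide by 128 for MB
SELECT su.session_id,
       su.user_objects_alloc_page_count / 128     AS user_mb,
       su.internal_objects_alloc_page_count / 128 AS internal_mb
FROM tempdb.sys.dm_db_session_space_usage AS su
ORDER BY su.internal_objects_alloc_page_count DESC;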

However, some of us may not be as lucky and have to find a way to make space. In this case, we shrink responsibly!

My first move was to make sure I could even shrink the files. 
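A quick sanity check is to look at the size and free space of each tempdb file; FILEPROPERTY reports used pages, so a rough sketch looks like this:

USE [tempdb];
SELECT name,
       size / 128 AS size_mb, -- size is in 8 KB pages
       (size - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int)) / 128 AS free_mb
FROM sys.database_files;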

USE [tempdb];
GO
DBCC SHRINKFILE (Tempdev, 20480); -- target size in megabytes; make sure to be exact
GO
DBCC SHRINKFILE (Temp02, 20480);
GO

If all goes well, you should be good, right? Some people suggest that you do not need to restart the SQL Server service. In my case, however, I needed to, as there was another piece to the problem I have not touched on yet. Regardless, if you have already scheduled the downtime to shrink the tempdb files, it would not hurt to do the restart.

BUT! My problem was not a rogue transaction that expanded the tempdb files to max out the drive. No, the problem was my own stupidity: during a previously scheduled downtime, I had added more tempdb files to match the rest of our database servers, and I added TOO MANY. Yes…TOO many. So I was going back to fix my mistake. In this case, the SQL gods were making it a point to teach me a lesson, and here is why.

I had to shrink all of the files down and delete the extra one. So I shrank the files and went to delete the extra tempdb file. In order to do so, a file needs to be “empty” before it can be removed. So…

DBCC SHRINKFILE (LogicalFilename, EMPTYFILE);
GO

And then the problem hit me:

Msg 2555, Level 16, State 1, Line 1
DBCC SHRINKFILE: Page 3:44101 could not be moved because it is a work table page.

There are work tables present. Of course!

Now, you could try restarting the instance to see if the work tables magically go away, but I wanted to reduce the number of times we restart the service. Therefore, I took to the community and came across some suggestions about clearing various caches. Let’s give it a shot!

DBCC DROPCLEANBUFFERS;        -- flush clean pages from the buffer pool
GO
DBCC FREEPROCCACHE;           -- clear the plan cache
GO
DBCC FREESESSIONCACHE;        -- clear the distributed-query connection cache
GO
DBCC FREESYSTEMCACHE ('ALL'); -- clear all unused entries from all caches
GO

But life isn’t easy, right? Why would it be! I tried again to empty the file, and it did not work. Thankfully, I have a FANTASTIC mentor, who suggested, “Why don’t you just try to shrink it down to 1MB, do a restart, and delete it?” At this point, I was happy for any suggestion. So…

DBCC SHRINKFILE (Temp05, 1);
GO

Then came the restart of the service. So far so good, as the file DID shrink to close to 1MB. Let’s try to delete it via SSMS…and SUCCESS! I could not believe it. I had spent two nights, during off hours, trying to fix this, and the answer was as simple as shrinking the file to 1MB, restarting, and deleting.
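For reference, the SSMS delete is equivalent to this one-liner (Temp05 being the logical name of my extra file):

ALTER DATABASE [tempdb] REMOVE FILE Temp05; -- the file must be empty first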

I would like to link two wonderful articles that I found in the process of battling tempdb, and you should definitely follow both Erin and Andy. They are fantastic people, and smart. Very smart.

Moral of the story: don’t shrink, add space. Because should you ever really shrink your SQL Server database? Andy has the answer for you.

Running, From a Pandemic

Where it started

Back in 2019, a group of co-workers and I decided to start an unofficial run club at work. We met once a week after work, ran a few miles locally, and just encouraged each other to keep running on our own. I had dabbled with running when I was younger, and loved it. However, my body was in the middle of a huge growth spurt and it HURT to run. I gave it up, tried a few more times over the years, but it never stuck.

Flash forward to the beginning of 2020, and we were regularly running decent miles together multiple times a week, on our own or after work. Then COVID-19 came around in full swing, and our typical running routine turned into solo runs. I knew it was going to suck without the guys, but I knew it was for the better. Since we could not be together, we started a running club on Strava, which helped us keep each other accountable and encourage one another to run. Some of the group fell by the wayside; no harm, no foul. Thankfully, two co-workers stuck with it, and we certainly made it a competition throughout the year to keep running.

Running on a country road

I am super grateful that I started running, because it came in clutch as a stress reliever during times we have all experienced. This year alone, I learned how to overcome various running injuries, found my recovery pace, discovered how hard it was to not run, and understood WHY everyone recommends not going all out on the first mile of your first race. In my first race ever (COVID-style), I blew my energy on the first mile of a 5K and regretted EVERY second of it in my second and third miles. While I ran my fastest mile ever that day (6:07), I ended up aggravating an injury from earlier that year (plantar fasciitis), which set me back a week or so.

The final part of this journey actually started with a community I wish I had jumped into at the first leg of my 2020 running year. One of the co-workers in the run group introduced me to Kofuzi and The Ginger Runner, two YouTubers from the running community. I originally watched their videos for reviews of the many shoes, apparel, and accessories for running, to see what was worth buying. One video after another, I became integrated into the community. They are part of my inspiration, and motivation, to keep taking it one step at a time. Both are heavily involved with their communities and really care about their followers. It is really refreshing, especially now, to see how much they give back and how they engage their followers. Absolutely fantastic!

So what this long-winded post has been leading up to is my recap of the year of running, but ALSO my 2021 running “themes”.

My 2020 Stats from Strava!

2021 Running Themes

  1. Half Marathon Attitude with Marathon Dreams: Focus on increasing weekly mileage to train my body to complete half marathons COMFORTABLY. Find my target pace and get comfortable with it. Avoid injury by constantly working on recovery.
  2. Run Healthier: Buy the proper apparel. Rotate shoes more often, pending I have pairs to rotate in. Diversify my running workouts.
  3. Trail-Hunter: Get out on a trail and run. Rotate trail running into the work schedule. Don’t get ticks.
  4. 500 Mile Club: Hit a minimum of 500 miles. I know I can hit this based on 2020’s stats, but I think increasing to 500 is a healthy and safe goal. And I know that if I increase my weekly mileage from the 17 miles I am currently doing to 30 miles, stay healthy, and stay motivated, I will make that goal.

So, if you are interested, follow me on Strava, and join me in tackling the next run.