Bios 1 Hackintosh 0

I recently said that a firmware update can do wonders.  But a firmware (or BIOS) update can cause problems too.  🙁

The BIOS version I was running on my Gigabyte Z390 M Gaming motherboard was 8, while the latest version is 9M.  I was having an issue where my new NVMe drive, the one I had cloned Catalina onto, was not recognized as a boot option.  The BIOS didn’t think it was a UEFI OS like the old NVMe was.  It had to be a UEFI OS since I had cloned it with CCC 5, right?  I figured I should update my motherboard to the latest BIOS version.  Gigabyte’s BIOS updates come with a utility that lets you flash the BIOS, but you need to be running Windows to use it.  My motherboard has a feature in the BIOS called Q-Flash that lets you update the BIOS from within the BIOS itself.  No need for Windows.

The Q-Flash utility can read the new BIOS image from an external disk.  I loaded version 9M onto a USB stick and booted into the BIOS.  Running Q-Flash was easy enough.  You can also save a backup of the current BIOS; I suspect that is for the case where the update fails and the motherboard won’t start with the newly installed version.  I selected the update file from a file menu of sorts inside Q-Flash and then ran the update.  Q-Flash reported that the install was successful.  I rebooted and went back into the BIOS to check: I was now running version 9M and I could see all of the drives as boot options.

But my new NVMe drive was still not listed as a UEFI OS.  I exited the BIOS and booted into the OpenCore menu.  The current default selection was my new NVMe, so I let the process continue.  It never made it into Catalina.  An error was displayed:

OCS: No schema for KeyMergeThreshold at 2 index, context <Input>!
OC: Failed to bootstrap IMG4 values - Invalid Parameter
Halting on critical error

I then tried the old NVMe drive and it also stalled during boot, before it ever got to the Apple logo!  No error messages that I could see.  I reverted the BIOS back to version 8 and tried to boot from both of the NVMe drives, and the same thing happened.  So I reinstalled BIOS version 9M, figuring that the BIOS upgrade wasn’t what had caused my issue.

I went to another computer so I could search the internet for a possible solution to my boot problem.  As far as the “No schema” error goes, I was able to find some information.  It seems that as of OpenCore 0.6.7, KeyMergeThreshold is no longer used and should be removed from the config.plist file.  But that still didn’t really explain the boot stalling.  After looking at various things, something finally helped me figure out what had happened.  When I first installed Catalina and tried to boot, it stalled in exactly the same way.  So I went back into the BIOS and checked the settings I had previously had to change.  I hadn’t thought about it, but a good number of those settings had been reset: the new BIOS came with new defaults (except Fast Boot, which was False this time around).  And just like last time, once I made those BIOS changes I was able to boot into Catalina again.  Using the OpenCore menu I was able to boot into Catalina on all three of the devices that had it: my two NVMe drives and the old HACK1 boot drive, which I had cloned the old NVMe drive onto.

But there was still the question of why the new NVMe drive wasn’t listed as a UEFI OS like the old one.  It turned out that CCC 5 had not made a complete clone of the old NVMe drive.  Using a bit of command-line magic, I mounted the EFI volume/partition on each of the NVMe drives.  Checking the new NVMe, I saw that its EFI volume/partition was empty.  I then copied the files from the old NVMe’s EFI volume/partition into the new NVMe’s EFI volume/partition.  I rebooted into the BIOS, and checking the Boot menu I could now see that the new NVMe was listed as a UEFI OS just like the old one.  I don’t know if CCC 5 has trouble with APFS, or maybe there was an issue because I was copying from APFS case-sensitive to APFS case-insensitive.  Anyway, that problem was solved.
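For anyone curious, the “command-line magic” boils down to something like this (a sketch; the disk identifiers are examples, assuming disk0s1 is the old NVMe’s EFI partition and disk1s1 is the new one’s, so run diskutil list first to find yours):

diskutil list
sudo diskutil mount disk0s1
sudo diskutil mount disk1s1
sudo cp -R /Volumes/EFI/EFI "/Volumes/EFI 1/"
sudo diskutil unmount disk0s1
sudo diskutil unmount disk1s1

When both partitions carry the default name EFI, the second one mounts as “EFI 1”, which is why the copy destination looks a little odd.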

Side note:  Later on I also did the same thing for the Samsung 850 EVO drive that I had cloned from the old NVMe drive.  It too was missing the contents of its EFI volume/partition.

The last thing I needed to do was get rid of that “No schema for KeyMergeThreshold” error.  This called for editing the config.plist file and removing the KeyMergeThreshold entry.  You can’t edit the config.plist file directly in the EFI volume/partition, so I made a copy that I could edit.  I opened my copy of the config.plist file in Xcode to remove the entry, which is located under UEFI->Input.  I deleted it and saved the changes.  I then deleted the original file and copied the edited file into the EFI volume/partition.  I rebooted the machine to see whether the error would still appear.  Because the error message just flashes by on the screen without pausing the boot, I had to keep an eye out for it.  And it did not appear!  Success!  That evens the score with the BIOS. 🙂  My next task will be to see if I can install Win 10 on that older NVMe drive while using OpenCore.  I will have to do some research on it.  Until next time.
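Side note:  If you would rather skip Xcode, the same deletion can probably be done from Terminal with PlistBuddy, which ships with macOS (a sketch, assuming the working copy of config.plist is on the Desktop and that the key lives under UEFI->Input as it does in my config):

/usr/libexec/PlistBuddy -c "Delete :UEFI:Input:KeyMergeThreshold" ~/Desktop/config.plist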


A tale of two ports

While I was installing Fortnite, I noticed the download was very slow.  I ran the Speedtest utility from Ookla and the speeds were indeed slow (6.78Mbps download and 7.67Mbps upload).  Faster than when I had wifi service, but too slow for fiber.  When I ran the speed test on the Hack1 running Win10, my speed was 456.45Mbps download and 119.78Mbps upload.  I booted the Hack2 into Linux Mint so I could check the ethernet controller.  The speeds I got in Linux Mint were very poor too.  I was able to determine that there was an issue with ethernet port 14 on my DLink DGS-1216T switch, the port that the Hack2 was plugged into.  I plugged the Hack2’s ethernet cable into port 12 on the switch and I was able to get good numbers: 445.90Mbps download and 228.89Mbps upload.  When I booted the Hack2 back into Catalina, the numbers were in the same range (459Mbps download and 224Mbps upload).  I checked out the rest of the ports on the switch and determined that ports 5, 9, and 14 were a bit flaky.  Ports 5 and 9 were stuck at 100Mbps while port 14 was stuck at 10Mbps.
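In Linux Mint, the easiest way I know of to see what speed a port has negotiated is ethtool (a sketch; ethtool may need to be installed with apt, and the interface name here is an assumption, so check ip link for yours):

ip link show
sudo ethtool enp5s0

The Speed: line in the ethtool output shows the negotiated link rate; a gigabit port that reports 10Mb/s or 100Mb/s points at a bad cable or a flaky switch port.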

The DLink DGS-1216T switch is about 15 years old and is now discontinued, but DLink still has a support page for it on their website.  I was able to download the user manual, the last few firmware releases, and the SmartConsole utility.  The SmartConsole Utility is only available for Windows.  When I first started the utility, it couldn’t find the switch.  I noticed that there was another program called SmartServer.exe in the folder.  After I ran the SmartServer.exe program (which opened in a command window), I started the SmartConsole Utility again.  This time, when I pressed the Discover button in the SmartConsole Utility, it found the DLink DGS-1216T.  The reason I hadn’t been able to find it before was that the switch wasn’t on the same network that I was running on.  I changed its IP address and gateway settings to match the correct network.  I determined that the firmware I was running was 4.10.06 and that the last available firmware for the switch was 4.21.02.  The SmartConsole Utility also confirmed the port issues that the port lights had indicated.  After looking at the release notes, I decided to go with the next-to-last firmware release (4.21.01); the last release doesn’t allow you to downgrade the firmware if you need to for some reason.  Once the firmware had been updated, I had to change the IP address and gateway address again because they had been set back to the defaults.  I was then able to access the switch from the browser to check on the ports.  The 3 ports that were having issues were fine now.  Port 4 has a device attached to it that uses 100M Full, but the port tested out at 1000M Full.  A firmware upgrade can do wonders!

Games on the Mac Act 2

Amazon delivered my 1TB WD Black NVMe drive today, along with 6 SATA cables with straight end connectors.  I needed to replace two of the SATA cables in the Hack2 since they both have one connector end with a 90 degree bend.  That bend made it difficult to plug in the cables with how things are arranged in the case.  It was a pain in the neck installing the NVMe drive on the motherboard.  I had to remove the wifi/bluetooth card, the video card, and the CPU cooler before I could access the NVMe slot.  But before I could remove the video card, I had to reach the butterfly locking lever on the PCI Express x16 slot and push it down.  With a long video card like my Sapphire Radeon Pulse RX 580 8GB GDDR5, it was hard to get to the locking lever, but I managed to flip it down and release the card.

I formatted the new drive as APFS case-insensitive.  I used Carbon Copy Cloner (CCC) 5 to clone the old drive, with the new drive as the destination.  It gave me a warning that the new drive’s format didn’t match the old drive’s APFS case-sensitive format.  I told it to continue with the cloning.
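For anyone who prefers Terminal over Disk Utility, the formatting step is roughly the following (a sketch; the disk identifier and volume name are examples, so check diskutil list first, and diskutil listFilesystems shows the exact personality name to use if you want the case-sensitive APFS variant instead):

diskutil list
diskutil eraseDisk APFS NewNVMe disk2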

I was able to boot up using the new drive.  Then I was able to install Steam and League of Legends without getting that warning message about the case sensitivity.  I went ahead and resumed my attempt at installing Fortnite.  I didn’t have enough space to install it before. It is a large download.  It is still in the install process at this moment.

That wraps up this post.  Catch you later.

Fortnite, LOL, and Steam on the Mac

There are a few online games that I play on my Win 10 machine that I wanted to play on the Mac: Fortnite, League of Legends (LOL), and my Steam games.  Fortnite is from Epic Games, League of Legends is distributed by Riot Games, and Epic, Riot, and Steam all have clients that run on the Mac.

The League of Legends application starts up ok.  But there are problems with the other two.  Steam requires a case-insensitive filesystem and it won’t start.  The Epic Games Launcher uses the Unreal Engine, which also needs a case-insensitive file system, so it doesn’t start either.  While you can move the Epic Games Launcher to a drive that has a case-insensitive file system, you can’t do the same with the Steam launcher.  I tried using the Crossover (Wine) application to run the Windows Steam application.  While I was able to get it to run, the game Among Us locked up on me.  When I installed Catalina I chose APFS case-sensitive for the drive, while the default was APFS case-insensitive.  As a Unix/Linux fan, I prefer case-sensitive.  I am planning to upgrade the current 500G NVMe drive to a 1TB NVMe drive since I was running out of space.  Since I will be using APFS for Catalina, I could create a volume that is APFS case-insensitive and put those applications there.  “Space sharing” is a feature of APFS that lets me create a volume without needing to decide its size up front; the volumes in an APFS container share the container’s space.  As long as the total space used by the volumes is less than the total available space, things will work out ok.  The current drive will then become a Win 10 boot drive.  Catch you later.
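Side note:  creating that case-insensitive volume inside the existing APFS container should be a one-liner (a sketch; the container identifier and the volume name are examples, so check diskutil list for the container on your own system):

diskutil apfs addVolume disk1 APFS Games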

Overclocking the i5-3570K

When I put together my original Hackintosh, I planned on overclocking the i5-3570K but never did.  The K CPU units are unlocked so they can be overclocked.  Now that my new Hackintosh is fully functional, I have reassigned the original Hackintosh to other duties.  I put a hard drive containing Win 10 Pro into the system.  I had previously installed a copy of Win 10 Pro on this drive while it was attached to this system.  I am going to run Win 10 Pro for a while.  I still haven’t decided whether or not I will make the system into a Proxmox box.  But in the meantime I am going to overclock the i5-3570K in the system.  One of the first things I did was replace the stock Intel CPU cooler with something more substantial.  The Arctic cooler is on the left and the stock cooler is on the right.

The fan on the Arctic cooler howled and needed to be replaced.  I was able to mount a 92mm fan onto the cooler’s shroud.  This is the second time I have had to replace the fan; I have had the Arctic unit for more than 10 years.  I also managed to break two of the mounting pins and had to take two mounting pins from the original cooler.  I really don’t like how the mounting assembly was designed.  If I have to remove the fan in the future, I will look into getting an adapter bracket to make mounting the cooler easier.

For my research on overclocking, I looked at a few videos where they overclocked an i5-3570K.  Linus Tech Tips had a good one and so did JayzTwoCents.  Plus there were a few more that I looked at.  In preparation for the overclocking event, I gathered a few of their recommended programs (plus a few others that I found): Cinebench, Prime95, OCCT, CPU-Z, CPUID HWMonitor, GeekBench5 (free version), IntelBurn Test, and RealTemp.  I will probably not use them all, but you never know.  To get a baseline before I started overclocking, I ran GeekBench5.  The report that GeekBench5 generates is quite nice.  For my baseline I will use the Single-Core Score, which was 805, and the Multi-Core Score, which was 2625.  I also ran Cinebench to see what numbers it came up with.  The CPU Multi-Core score was 2783 and the CPU Single-Core score was 752.  The MP ratio was 3.70x.

The two settings that I changed in my computer’s BIOS were the CPU Ratio (the multiplier) and the CPU Core Voltage.  The initial values were 34 for the CPU Ratio and 1.350V for the CPU Core Voltage.  First I increased the CPU Ratio by 2 to give me 36, saved the changes, and rebooted into Windows.  Once in Windows I ran IntelBurn Test to check the stability of the overclock.  I eventually increased the CPU Ratio to 42.  After getting to that point I reduced the CPU Core Voltage in increments of 0.050V, running IntelBurn Test after each change to check stability, until I reached a CPU Core Voltage of 1.200V.  While I was running IntelBurn Test, I also ran CPUID HWMonitor to watch the CPU core temperatures and the CPU usage.  The last run of GeekBench5 netted me 895 for the Single-Core Score and 3015 for the Multi-Core Score.  The last run of Cinebench gave me a CPU Multi-Core score of 3172 and a CPU Single-Core score of 835.  The MP ratio was 3.80x.

While those numbers look good, there are some other numbers that aren’t so great.  The core temps get too high during the IntelBurn Test. 🙁  The temperatures hit 105C at the highest overclocked settings that I used.  At the default settings the temperatures were in the low 80C range, which is probably already a bit too hot.  The CPU might have a thermal transfer issue, where the conductivity between the actual CPU die and its heat spreader (lid) is poor.  While I am tempted, I am not going to go through the effort of delidding the CPU and applying liquid metal.  It is not worth it for this scenario.  So I have settled on a CPU Ratio of 40 with the voltage set at 1.100V.  Those settings only raise the core CPU temperatures a little above what they are with the stock settings.  The default settings have the CPU Ratio at 34 and the voltage at 1.035V.  The max temperatures at the default settings under the IntelBurn Test were 80C for core 0, 85C for core 1, 84C for core 2, and 82C for core 3.  The max temperatures at my overclocked settings under the IntelBurn Test were 84C for core 0, 88C for core 1, 88C for core 2, and 84C for core 3.  That’s it for this post.  Catch you later.


Hyper-V

Yet Another Virtualization Platform!  Hyper-V is Microsoft’s entry in the virtualization market.  It is a competitor to VMware’s Fusion and Oracle’s VM VirtualBox.  I have used both of those other products in the past and I still use VirtualBox from time to time.  Hyper-V is available in three versions: Hyper-V for Windows Servers, Hyper-V Server, and Hyper-V on Windows 10.  Hyper-V for Windows Servers is an add-on to the Windows Server OS.  Hyper-V Server is a standalone solution.  Hyper-V on Windows 10 is the version that you can run on your laptop or desktop computer.  I have installed it on my Win 10 laptop and it works well.  I am currently doing an install on a Windows 10 desktop computer.  Well, actually, there isn’t any software to install: Hyper-V is built into Windows 10 as an optional feature, so there is no Hyper-V download.

There are a few requirements that you need to check off before you can use Hyper-V.  You must be running Windows 10 Enterprise, Pro, or Education; it cannot be enabled on Windows 10 Home, so if you have Home you must upgrade to Pro.  You need a 64-bit processor with Second Level Address Translation (SLAT).  You need CPU support for VM Monitor Mode Extensions (VT-x on Intel CPUs).  You must have a minimum of 4 GB of memory in your system; while 4 GB is the minimum, having 8 GB to 16 GB would be better.
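A quick way to check most of these requirements is to run systeminfo from a command prompt and look at the Hyper-V Requirements section at the end of the output (if Hyper-V is already enabled, that section just reports that a hypervisor has been detected):

systeminfo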

After you have determined that your computer meets the requirements for Hyper-V, you can enable it.  You can do this from the command line (CLI) or from the Control Panel.  I am just going to talk about using the CLI; you can search on how to enable Hyper-V in the Control Panel.

There are basically two commands that you run to enable Hyper-V.  Open a PowerShell window as Administrator.  Make sure you run it as Administrator or this won’t work.  Enter the following command:

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All

When the installation has completed, reboot.  Once the computer has rebooted and we are logged in, we will open a PowerShell window as Administrator again.  This time we will enable Hyper-V with the Deployment Image Servicing and Management tool (DISM).  DISM helps configure Windows and Windows images, and among its many uses it can enable Windows features while the operating system is running.

In our PowerShell we will type in the following command:

DISM /Online /Enable-Feature /All /FeatureName:Microsoft-Hyper-V

At this point Hyper-V has been enabled and is ready for us to use.  But first we have to create a virtual machine.  Open Hyper-V Quick Create from the Windows Start menu.  Once the Hyper-V Quick Create application has started, you can select an operating system from the menu or choose your own by selecting the Local installation source.  Then press the Create Virtual Machine button in the application.  After the virtual machine has been created, open Hyper-V Manager from the Windows Start menu.  This application will allow you to configure your virtual machine and also start it.  That’s about all there is to it.  For Windows users, Hyper-V lets you experiment with other operating systems in a familiar environment.
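If you prefer PowerShell over the Quick Create window, a VM can also be created with the New-VM cmdlet (a sketch; the VM name, memory size, VHD path and size, ISO path, and switch name are all examples to adjust for your own setup):

New-VM -Name "TestVM" -MemoryStartupBytes 4GB -Generation 2 -NewVHDPath "C:\VMs\TestVM.vhdx" -NewVHDSizeBytes 60GB -SwitchName "Default Switch"
Add-VMDvdDrive -VMName "TestVM" -Path "C:\ISOs\install.iso"
Start-VM -Name "TestVM"

After that, Hyper-V Manager (or vmconnect) gives you the console for the new VM, just as it does for one made with Quick Create.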