Debloating Windows 10 the easy way

I found this project on GitHub: Windows10Debloater.

There's one script in particular on the main page.

At the time of this posting it's called Windows10Debloater.ps1.

I've used it on some of my machines. While it wasn't able to remove everything, I'd say about 90% of the bloat is gone, and you just have to uninstall a couple of other apps yourself.

I do suggest checking the page out before running the code, since things change; as of this writing the script was last updated 9 days ago.

Running scripts in PowerShell

PowerShell's execution policy is set to Restricted by default, so if you try to run a PowerShell script on a new machine you'll find that it doesn't work.

So here are the commands I use most often.

First off, this sets the policy back to the default:

Set-ExecutionPolicy -ExecutionPolicy Restricted

This is the command I run if a PowerShell script will be run regularly on that computer:

Set-ExecutionPolicy -ExecutionPolicy RemoteSigned

And this is the one I use to run something just once (it only applies to the current PowerShell session):

Set-ExecutionPolicy -Scope Process -ExecutionPolicy AllSigned

LogMeIn Hamachi setup on Linux

This was going to be a completely different post, but here we are. I needed a Linux VM to access a Hamachi network. Ultimately the build didn't work right, but the Hamachi part did.

First I got the latest build from LogMeIn; at the time of this writing, this was it:

sudo wget http://vpn.net/installers/logmein-hamachi_2.1.0.198-1_amd64.deb

Then I installed it:

sudo dpkg -i logmein-hamachi_2.1.0.198-1_amd64.deb

And then I started the service:

sudo /etc/init.d/logmein-hamachi start

Now that that was done, it was time to log in and attach it to my account.

So I did

sudo hamachi login

Then I attached it to my account:

sudo hamachi attach myemail@address.com

Then I ran the following command, changing the 1s to my network ID:

sudo hamachi do-join 111-111-111

And that was all I needed to do to get the TurnKey VM running on a Hamachi VPN network.
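The steps above can be condensed into one script. This is just a sketch: the installer URL and network ID are the examples from this post, and each command is echoed through a dry-run helper rather than executed, so you can review the sequence before swapping in the real thing.

```shell
# Sketch of the Hamachi steps above. The run() helper only echoes each
# command; replace its body with "$@" to actually execute them.
run() { echo "+ $*"; }

run wget http://vpn.net/installers/logmein-hamachi_2.1.0.198-1_amd64.deb
run sudo dpkg -i logmein-hamachi_2.1.0.198-1_amd64.deb
run sudo /etc/init.d/logmein-hamachi start
run sudo hamachi login
run sudo hamachi attach myemail@address.com
run sudo hamachi do-join 111-111-111
```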

Batch file to sort files by name

So we have electronic journals that get dumped into a folder. These are things we need to keep, but having them all thrown into the same folder is a pain. Not only does it leave a huge pile of files in one place, but if we need to look back at, say, January 18th, 2017, we'd have to sort through tons and tons of these files looking for the one we want.

So I wrote this bat file that looks at our file names and puts everything into its own separate folder.

So first I’ll post the code then I’ll explain.

@echo off
setlocal enabledelayedexpansion
cls
pushd c:\EJ\
for /f "tokens=*" %%1 in ('dir /a-d /b c:\EJ\*.JRN') do (
    set filename=%%1&set dirname=!filename:~0,2!\!filename:~2,2!\!filename:~4,2!
    if not exist c:\sortedfolder\20!dirname!\ (md c:\sortedfolder\20!dirname!\)
    move %%1 c:\sortedfolder\20!dirname!\>nul
)

Now, our file format looks like this: 18091700-000101Z.JRN

Now the only part we’re really interested in is

(180917) 00-000101Z.JRN

18 is the year, 09 the month, and 17 the day.

Now this does all the sorting.

set filename=%%1&set dirname=!filename:~0,2!\!filename:~2,2!\!filename:~4,2!

!filename:~0,2! – Starts at the first character and goes two over so it grabs 18

!filename:~2,2! – Starts after the second character and goes 2 over so 09

!filename:~4,2! – Starts after the 4th character and grabs 2, so 17

If you adjust the offset and length you can split the name other ways.

So !filename:~0,4! would grab 1809.
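For comparison, bash has the same substring feature as a `${variable:offset:length}` expansion. Here's the same split applied to the sample journal name from above; this is just a bash sketch of the idea, not part of the batch file:

```shell
# Bash equivalent of the batch !filename:~offset,length! substring syntax,
# applied to the sample journal name from this post.
filename="18091700-000101Z.JRN"
year="${filename:0:2}"    # characters 0-1 -> 18
month="${filename:2:2}"   # characters 2-3 -> 09
day="${filename:4:2}"     # characters 4-5 -> 17
dirname="20${year}/${month}/${day}"
echo "$dirname"           # prints 2018/09/17
```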

Clear Dropbox cache

So at work we tend to use Dropbox for some rather large files. On most of my computers I don't have an issue with how Dropbox functions, but occasionally these large files remain on the system in a hidden folder, and I have to go in and clean out the old files.

Just today I had to go and remove about 200 gigs of old files. From what I've read, Dropbox is supposed to take care of this itself, but clearly it doesn't happen all the time.

So here’s how you clean this out.

Hit the start button and type:

%HOMEPATH%\Dropbox\.dropbox.cache

This will bring you to the Dropbox cache, where you can delete files that you've already deleted from Dropbox but that still remain on your system.

Purge old files and folders from directory

So at work we have a couple of directories that generate logs on a regular basis. These logs really are not that important. So I keep them for 60 days.

Problem of course is that some also create folders.

So I put together this PowerShell script from some pieces of code I found; it goes through the folder, cleans out all the old files, and then cleans out the old empty folders.

I'm sure I could clean this up a bit, but seeing as it does what I want, it's good enough. You just need to change a couple of variables.

# Days older than
$HowOld = -60
# Path to the root folder
$Path = "C:\directory you want purged\"

# File deletion task
Get-ChildItem $Path -Recurse | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays($HowOld) -and -not $_.PSIsContainer } | ForEach-Object { Remove-Item $_.FullName -Force -Verbose }

# Empty folder deletion task
do {
    $dirs = Get-ChildItem $Path -Directory -Recurse | Where-Object { (Get-ChildItem $_.FullName -Force).Count -eq 0 } | Select-Object -ExpandProperty FullName
    $dirs | ForEach-Object { Remove-Item $_ }
} while ($dirs.Count -gt 0)
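For what it's worth, the same two-step purge (old files first, then any folders left empty) can be done on Linux with GNU find. This is just an analogous sketch using a made-up throwaway path, not part of the PowerShell script; note that find's `-delete` processes deepest entries first, so nested empty folders are handled in one pass.

```shell
# Same idea with GNU find/coreutils: delete files older than 60 days,
# then remove any directories left empty. Demo data under a throwaway path.
path="/tmp/purge-demo"
mkdir -p "$path/old"
touch -d "90 days ago" "$path/old/stale.log"   # older than the cutoff
touch "$path/fresh.log"                        # recent, should survive

find "$path" -type f -mtime +60 -delete
find "$path" -mindepth 1 -type d -empty -delete
```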

PowerShell script to back up Hyper-V servers

So while I don't have a Hyper-V server now, I did have one. I cobbled this script together from some things I found online to back up a Hyper-V server (meaning straight Hyper-V, not a full Windows Server install) to a backup server, keeping a few copies in case a backup went bad.

So here’s the script.

# This script is designed to retrieve all virtual machines from a remote Hyper-V server and export them to a location of your choosing.
# You can set it up as a scheduled task based on your needs, daily, weekly, whatever you need.
# It will then remove any backups beyond the number you configure.

# Enter the name of the server you want to backup
$server = "HYPERV-SERVER"
# Enter the full path for the backup folder
$backuppath = "\\backup server\Hyper-V Backups" # Example: "\\backupserver\fileshare\HyperVBackups"
# How many backups of each VM you want to keep
$backupstokeep = 7

# Get all Hyper-V VMs
$vms = Get-VM -ComputerName $server
# Grab the current date
$today = Get-Date -Format MM-dd-yy

foreach ($vm in $vms) {
    $vmname = $vm.Name
    Write-Host "Backing up $vmname..."
    # Create the folders for the backup. The first New-Item fails if the VM
    # folder already exists, which is fine; it's there for new VMs.
    New-Item -ItemType Directory -Path "$backuppath\$vmname"
    New-Item -ItemType Directory -Path "$backuppath\$vmname\$today"
    # Export the VM
    Export-VM -VM $vm -Path "$backuppath\$vmname\$today"
    # Remove all but the newest $backupstokeep backups
    Get-ChildItem "$backuppath\$vmname" | Sort-Object -Property CreationTime -Descending | Select-Object -Skip $backupstokeep | Remove-Item -Recurse
}

Worked great for my needs, especially since the only other options I could find cost money.
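The retention piece (sort newest-first, skip the ones to keep, delete the rest) is the same pattern you'd use anywhere. Here's a shell sketch of that keep-the-newest-N idea; the path and folder names are made up for the demo:

```shell
# Keep only the $keep newest backup folders, mirroring the
# Sort-Object / Select-Object -Skip pipeline above. Demo data only.
backupdir="/tmp/vm-backup-demo"
keep=2
mkdir -p "$backupdir"
for i in 1 2 3 4; do
    mkdir -p "$backupdir/backup-$i"
    touch -d "$i days ago" "$backupdir/backup-$i"   # stagger the timestamps
done

# List newest-first by mtime, skip the first $keep, delete the rest.
ls -1t "$backupdir" | tail -n +"$((keep + 1))" | while read -r d; do
    rm -rf "$backupdir/$d"
done
```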

Windows Command Line Repair commands

Lately I find that running a few repair commands at the Windows command prompt fixes just about any of the "something isn't working right" problems, as long as I know they're not hardware related.

Anyway, from an elevated command prompt, these are the commands.

1.

sfc /scannow

That fixes most of the issues.

2.

DISM.exe /Online /Cleanup-image /Scanhealth && DISM.exe /Online /Cleanup-image /Restorehealth

Now, there are two parts to this one: the ScanHealth part and the RestoreHealth part. You can run just the first by running everything before the &&, but I typically run both at the same time.

Now, I've honestly never run into this myself, but if it says something about "cannot find source files," you need to get an ISO of the exact same version of Windows, mount it, and then run this:

DISM /Online /Cleanup-Image /RestoreHealth /source:WIM:X:\Sources\Install.wim:1 /LimitAccess

Change the X to the drive letter where the ISO is mounted.

I'll be honest, I'd say 80% of the time the SFC command fixes it if something just seems corrupt. I can't say I've ever had a computer fixed by DISM, but I always run it to make certain things are working fine.

They’re fairly simple things you can do and it doesn’t hurt to try.

Pi-Hole Debian setup on VMware

So personally I don't have any Raspberry Pis, just 3 big VMware servers. So I decided to set up a Pi-hole server for my network using a Linux distro that Pi-hole says it supports.

So here are the results. For me it took 3 attempts to get it done simply.

  • Ubuntu 18.04 – Had trouble with the network connection. After setup I couldn't ping anything outside my network, and DNS didn't appear to work.
  • CentOS 7 – Couldn't access the web interface.
  • Debian 9.5 – The choice that worked with minimal effort.

So here’s how it was done.

  • First I did an install of Debian 9.5.0 from a net install ISO. I gave the virtual machine 1 processor core and 2 GB of RAM.
  • For the Debian install I did the most basic install possible, because the only thing this machine is going to do is function as a Pi-hole.
  • After Debian completed its install, I installed curl:
apt-get install curl
  • Then once that was installed, I ran:
curl -sSL https://install.pi-hole.net | bash
  • Then I just went through the Pi-hole setup prompts, which are pretty self-explanatory.

And that was pretty much it. After this, everything works and is ready to be configured to your liking.

Delete files older than… in PowerShell

I have a couple of directories that accumulate files over time.

Here’s a script I use for deleting files older than 30 days in powershell.

$Path = "C:\temp"
$Daysback = -30
 
$CurrentDate = Get-Date
$DatetoDelete = $CurrentDate.AddDays($Daysback)
Get-ChildItem $Path | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Remove-Item

You need to change the $Path location, and if you want to delete files on a different time frame, just edit the 30 to however many days you want.