Wednesday, October 21, 2015

Find when servers were last updated

My colleague is just starting to learn PowerShell, so I was giving him a hand with an idea he had. He wanted to write a script that would check when servers last had updates applied. I'm not saying we came to the best solution, but here is what we came up with.

I think my favourite part of this is the evolution of the script, so I'll walk you through each stage it went through.

I knew that Get-HotFix would return the installed updates as well as when they were installed. I figured I could run that, sort it by date, and select the last line.

(Get-HotFix | sort InstalledOn | select -Last 1).installedon
This will return the date of the last update. I also noticed that even though InstalledOn includes a time component, it's always midnight, so the actual install time must not be recorded.

The next evolution was to run this against multiple computers. I figured a foreach loop would be the easiest way to get him started.

$servers = 'Server1','Server2'
foreach ($server in $servers)
{
    (Get-HotFix -ComputerName $server | sort InstalledOn | select -Last 1).InstalledOn
}
Well, it turns out that Get-HotFix doesn't like this. I got a bunch of red text, so I decided to use Invoke-Command instead.

$servers = 'Server1','Server2'
foreach ($server in $servers)
{
    $date = Invoke-Command -ComputerName $server -ScriptBlock { (Get-HotFix | sort InstalledOn | select -Last 1).InstalledOn }
    Write-Host $server $date
}
This worked much better, but it would be nice to grab the servers from AD instead of having to know the server names.

$servers = Get-ADComputer -Filter {OperatingSystem -Like "Windows Server 2012*"} | select -expand Name
foreach ($server in $servers)
{
    $date = Invoke-Command -ComputerName $server -ScriptBlock { (Get-HotFix | sort InstalledOn | select -Last 1).InstalledOn }
    Write-Host $server $date
}
So that will grab every server 2012 or 2012 R2 box in AD. The output of this isn't great though, it's just a list of server names and dates on the screen. Sorry Don Jones, I used 'write-host'.

A better solution would be to output this to a csv. Luckily the easiest way to do this will also speed up the command. Rather than use a loop and get the results serially I'm going to use a session and get the results in parallel.

$computers = Get-ADComputer -Filter {OperatingSystem -Like "Windows Server 2012*"} | select -expand Name
$session = New-PSSession -ComputerName $computers
Invoke-Command -Session $session -ScriptBlock { Get-HotFix | sort InstalledOn | select PSComputerName,InstalledOn -Last 1 } | Export-Csv c:\temp\servers.csv
I think this is much better. Now we're getting the results faster and it's going out to a CSV file. It's still far from perfect. Error handling isn't great and I'm getting a couple extra columns in my CSV that I don't want.

This gets him the data he wants so I'm going to leave it in his hands to try to refine.
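If he wants a head start on the refinement, one possible direction looks like this. It's an untested sketch, not the script we actually ran: the calculated LastUpdate property and the $failed error variable are my additions, and it assumes WinRM is enabled on every box the AD filter returns.

```powershell
# Grab the 2012/2012 R2 server names from AD
$computers = Get-ADComputer -Filter {OperatingSystem -Like "Windows Server 2012*"} |
    Select-Object -ExpandProperty Name

# Query them all in parallel; unreachable machines collect in $failed
$results = Invoke-Command -ComputerName $computers -ErrorAction SilentlyContinue -ErrorVariable failed -ScriptBlock {
    (Get-HotFix | Sort-Object InstalledOn | Select-Object -Last 1).InstalledOn
}

# Keep only the two columns we care about and drop the type header
$results |
    Select-Object PSComputerName, @{Name='LastUpdate'; Expression={$_}} |
    Export-Csv c:\temp\servers.csv -NoTypeInformation

# Report anything that couldn't be reached
$failed | ForEach-Object { Write-Warning $_.Exception.Message }
```

Invoke-Command tags each returned object with PSComputerName, which is what makes the two-column select possible, and -NoTypeInformation keeps the "#TYPE" line out of the CSV.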

Thursday, October 8, 2015

Server Reboots

I can't say I'm a huge fan of server reboots, and I really don't like doing them during business hours, but today I had to give our main file server a boot. I came in this morning to complaints that the shared drives were slow, so I checked it out. The SAN wasn't getting taxed and the host was idling, so it must be the VM. I could log in just fine, but Task Manager hung... that's odd. Everything else on the VM was fine, opening windows, etc., but I could not get Task Manager to open.

I opened up PowerShell, and Get-Process took forever to populate, about one line per second. Finally it started to catch up, and there were all these 'cscript.exe' processes, with more and more scrolling in. When it finished, the process ID of the last one was in the 90,000s. Wow, I think I found the problem.

Well, it should be an easy solution, right? Get-Process -Name cscript | Stop-Process -Force. Nope, access denied. Ah, I didn't run PowerShell as admin, that'll get it. Nope, access denied. Shoot. It's time to put a change request in, I guess. Luckily it's only a file server and it's a VM, so it rebooted in about 10 seconds. I don't think anyone even noticed it was gone.
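If you hit something similar, a quick way to size up the leak before deciding on a reboot is to count the instances and find the oldest one. A small sketch; cscript just happens to be the culprit from my case:

```powershell
# Count the leaked script-host processes
$procs = Get-Process -Name cscript -ErrorAction SilentlyContinue
Write-Host "cscript.exe instances:" $procs.Count

# Show the oldest instance; StartTime can be blank for
# processes you don't have rights to query
$procs | Sort-Object StartTime | Select-Object -First 1 Id, StartTime
```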

Wednesday, October 7, 2015

Epson Receipt Printers and RemoteApp

If you've ever had the joy of working with those two products, I'm sure you shuddered as you read the title. I'm currently struggling my way through an RDS environment that has the requirement to be able to print to these receipt printers. The app is old and basically just throws a print job at the LPT1 port and hopes for the best, so it doesn't work very well with redirected printers. Also, I need to set it up so the users don't have to do anything.

I know that RDP can redirect local ports, so I figured this would be easy. I was wrong. I couldn't get the LPT redirection working, so my first idea was just adding the redirected printer to the app. Unfortunately it doesn't allow you to choose a printer; when you print the first receipt it just throws the job at LPT1. You can specify a printer in the options menu, but the redirected printer name might change on every login because the session number is appended to the end of the printer name. This would potentially require the user to change the printer settings every time they logged in. So that's "Plan B".

Next I thought I could use the old "net use LPT1 \\client\printer" to make it work. And indeed it did work, but again there are problems. First, I'd have to make sure the local printer was shared. I don't like this idea, because then people will be printing their receipts to each other's receipt printers by accident. And to run this in a login script, I'd have to check the local host name to map to the correct printer, because users might be on a different machine. This is "Plan C".

I finally had a breakthrough when I tried a regular RDP session to the host server with the option to redirect ports checked but redirect printers unchecked. I figured out that if I have a dummy printer on the server assigned to the non-existent LPT1 port, it will print to the client because of the port redirection. Hallelujah, I think I'm onto something here. So why doesn't it work in RemoteApp? I found the .RDP file in %appdata%\microsoft\workspaces\<guid>\resource and took a look at it. Port redirection is off! I knew that LPT redirection was allowed by default in GPO, but I didn't realize that it was turned off by default in the RDP files.

Now to make that change in RemoteApp. Well, it turns out that 2008 R2 had a GUI to do it, but 2012 nixed that and the only option is PowerShell. That's not so hard, except the Remote Desktop Services blog entry had the command wrong. Boo.

The working command is:

Set-RDSessionCollectionConfiguration -CollectionName %collectionname% -CustomRdpProperty "redirectcomports:i:1"
Put your own collection name in for %collectionname%

Now this will add "redirectcomports:i:1" to the registry. You can check it at HKLM\Software\Microsoft\Windows NT\CurrentVersion\Terminal Server\CentralPublishedResources\PublishedFarms\%collectionname%\DeploymentSettings
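A quick way to eyeball that key is to dump it with Get-ItemProperty. The collection name "POS" below is hypothetical; substitute your own, and note I'm listing every value under the key rather than guessing at a specific value name:

```powershell
# Hypothetical collection name 'POS' -- substitute your own
$key = 'HKLM:\Software\Microsoft\Windows NT\CurrentVersion\Terminal Server\CentralPublishedResources\PublishedFarms\POS\DeploymentSettings'

# Dump everything under DeploymentSettings and look for redirectcomports:i:1
Get-ItemProperty -Path $key | Format-List *
```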

As far as I know, to test it you'll have to remove the existing connection to this collection from your client machine, which deletes the RDP files, and then reconnect to download the new copies from the server.

So now I'm able to open the remoteapp app from the start menu on my client and print a receipt to the local receipt printer without having to make any changes in my app each time I log in. The only change I have to make is initially I have to tell it to use the local printer on the server. For ease I named both the server printer and the client local printer RECEIPT. Everything seems to be working so far. I've got two weeks left before it has to go into production.

Monday, October 5, 2015

Finding old files

One of the systems at work automatically backs up to one of the file servers, but unfortunately it doesn't follow its own rules. The backups are supposed to happen once a week and get purged after a year. The thousands and thousands of backup files would lead me to believe that's not working right. That side of it is on the network team to fix, but it's on me to remove all the old files. They decided we didn't need anything from before Jan 1, 2015, and I need to remove it. There are 50 devices that back up, and each one has a multi-file backup from each day, so there are way too many things to go through and remove manually. I'm fairly new to PowerShell, but I knew I could whip up a quick script.

Finding old files is easy. Piping Get-ChildItem to Get-Member reveals there is a "LastWriteTime" property. It's easy to get anything that hasn't been written to in the last year.

Get-ChildItem | where LastWriteTime -le (get-date).addyears(-1)
That'll work, but it's not really what I wanted. I need files that were written on or before a certain day, not just anything more than a year old.

 Get-ChildItem | where LastWriteTime -le 1/1/2015
Ok, that's better. I'm not really worried about specific files, since each backup is in its own folder.
Get-ChildItem -directory -recurse | where LastWriteTime -le 1/1/2015
That did it. I just get a list of directories back. Unfortunately I'm getting some false positives. Looks like I'll have to actually write a script.

cd 'Relevant Folder'

$directory = Get-ChildItem -Directory

foreach ($d in $directory) {
    Get-ChildItem .\$d\backup -Directory -Recurse |
        where LastWriteTime -le 1/1/2015 |
        Remove-Item -Force -Recurse
}
There we go. That'll get all the directories in the share and then step through each one and find the directories that are too old and send them to be deleted.
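One tip if you're nervous about mass deletes: Remove-Item supports -WhatIf, so you can dry-run the whole thing and see exactly what would go before committing. Same script, with a hypothetical share path standing in for the real folder:

```powershell
cd 'D:\Backups'   # hypothetical path -- use your real share

$directory = Get-ChildItem -Directory
foreach ($d in $directory) {
    Get-ChildItem ".\$d\backup" -Directory -Recurse |
        Where-Object LastWriteTime -le '1/1/2015' |
        Remove-Item -Force -Recurse -WhatIf   # drop -WhatIf to actually delete
}
```

With -WhatIf in place, each directory prints a "What if: Performing the operation..." line instead of being removed, which makes it easy to spot false positives before they cost you a restore from tape.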

First Post!

Welcome to my new blog. Hopefully I'll be able to put some funny stories or useful information on here. I'm not really sure what I'm going to post but I'm sure it'll evolve as I go.