Sunday, November 24, 2013

Creating output to make your boss proud

Yes, it's possible. And easy. And annoying. Bosses love reports, and I love to get reports. But I also hate to write reports, because usually we just look at them and say, "Yes! This looks like progress. Make me another next week." Preferably on those see-through sheets they used to put on the projectors in the 90's, so we can compare one on top of the other. Yes, that would be fantastic.

However, outputting in PowerShell is probably something you use on a daily basis when you are writing your scripts. Most commonly, we use Write-Host to output directly to the console. At least that's what I do - I guess I'm weird like that. When I run a script in production I like to watch it in real time and look at the waterfalls of red text because I forgot a $ or "" or something, but that's another story. So what kind of output can we do in PowerShell? Well, the answer to that question is another question: what kind of output can we not do in PowerShell? I actually don't know the answer to that one. I have a feeling I will soon, though. So let's start with the basics. If you are coming from BATCH processing, you are familiar with Echo; in PowerShell we use Write-Host. It is essentially the same thing. Like so:

Write-Host Pancakes are delicious.

Write-Host can do variables also:

$WriteVar = [string]"Pancakes"
Write-Host What is delicious? $WriteVar are delicious.

It is this most basic form of output that you will probably find the most useful, especially when it comes to troubleshooting commands. A lot of times when I am running a command against AD and using variables inside of quotes (such as building a CN=), I need to make sure the string is being represented correctly. Because things get weird when you mix variables, special characters, and quotes inside of single-quoted strings inside of quotes. A pain you will soon come to get used to. For this I recommend you arm yourself with the knowledge of escape characters. I will cover those in another post and link them here.
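In the meantime, here's a quick sketch of what I mean - the account and OU names here are made up, the point is just to echo the string back to yourself before you trust it:

$Username = "PancakeBro"
$OU = "OU=Breakfast,DC=pancakes,DC=local"
#Double quotes expand the variables
Write-Host "CN=$Username,$OU"
#Single quotes do not - this prints the variable names literally
Write-Host 'CN=$Username,$OU'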

Sorry, anyways - I need to write things to a .txt file, or a .log file (they are the same thing). So how do I do that?

$WriteVar = [string]"Pancakes"
$WriteVar | Out-File -FilePath C:\Working\Output.txt -Append

Adding more than one variable can be tricky at first, for example:

$WriteVar = [string]"Pancakes"
$WriteVar2 = [string]"Pancakes are awesome"
$WriteVar,$WriteVar2 | Out-File -FilePath C:\Working\Output.txt -Append

You might think this would put them on the same line; however, the comma acts as a delimiter, and the output goes on separate lines. If you need to put them on the same line separated by a comma, just surround the whole statement in quotes.

$WriteVar = [string]"Pancakes"
$WriteVar2 = [string]"Pancakes are awesome"
"$WriteVar, $WriteVar2" | out-file -FilePath C:\Working\output.txt -Append

When you do this, you don't need the comma there - you can have just spaces or whatever. Since the whole statement is in double quotes, the variables expand and everything else is written exactly as you typed it. Note this isn't only for strings. In the following example we will get a user from Active Directory using Get-ADUser, and output the results to a text file.

$User = Get-ADUser -Identity PancakeBro
$User | Out-File -FilePath C:\Working\output.txt -Append

For those of you without RSAT, you can follow along with this example:

$GetBIOS = Get-WMIObject Win32_BIOS
$GetBIOS | Out-File -FilePath C:\Working\Output.txt -Append

But wait, I don't need all of that junk. How do I write just certain pieces of a command's output? For instance, what if I don't need to know the Manufacturer, just the SerialNumber? Aha, now we are getting into usefulness territory. So you need just the SN#, do you?

$GetBIOS = Get-WmiObject Win32_BIOS
$GetBIOS | select SerialNumber | out-file -FilePath C:\Working\output.txt -Append

Now you've output just the SN# to your output path. But if you notice, it's a bit messy, isn't it? There's a header, a bunch of spaces, and it just looks dumb. This was actually asked of me just the other day when a colleague of mine was compiling a report. What we need is the -ExpandProperty parameter on Select. So let's try the following bit of code, and it should look much nicer in Output.txt.

$GetBIOS = Get-WmiObject win32_bios
$GetBIOS | select -ExpandProperty SerialNumber | out-file -FilePath C:\Working\output.txt -Append

Protip: -ExpandProperty goes before the property name.

You can do this with anything en masse - put it in a loop and collect massive amounts of information from your network. If I were to take a bet, this is what you will be using it for. PowerShell won't be writing your next college paper, but it could write one hell of a SN# report. But we need to go a little deeper, another level down. You have a list of SN#'s now; let's say you have 1,234 of them. Awesome, but what do they belong to? You could run two reports and combine them yourself in Excel, but what if some of the PCs are unreachable and the lines don't match up? Well, Jimmys-AwesomePC is going to have the wrong serial number, and audit is going to hit you for it. You provided the serial number for Awesome-TorrentPC instead. Or you need to combine lots of data from different points - not just two columns, but maybe 5. Or 321. Or 65,536. Who knows, I don't know, I don't care - what's nice is that these approaches all scale, so we don't really mind how much data there is.
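Just to make that concrete, here's a rough sketch of the kind of loop I mean. (Computers.txt here is a hypothetical list of hostnames, one per line, and unreachable machines aren't handled at all - which is exactly the problem.)

$Computers = Get-Content C:\Working\Computers.txt
foreach ($Computer in $Computers)
    {
    Get-WmiObject Win32_BIOS -ComputerName $Computer | select -ExpandProperty SerialNumber | Out-File -FilePath C:\Working\Output.txt -Append
    }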

This is where we start wanting to write things to another file type. Let's start with the most basic: CSV. A few pointers about CSV, though - a .csv is still just a text file, so you need to provide a header row to define your columns, and make sure each row of data lines up with those columns. Otherwise there is no real difference between a list in .txt and a list in .csv; sprinkling commas into your output doesn't make it behave like a proper table unless every row lines up with the header. So let's create two columns for a CSV. But before we get into that, you can export the output of any cmdlet to a CSV like so:

Get-WMIObject Win32_BIOS | Export-CSV C:\Working\Output.csv -NoTypeInformation

Protip: -NoTypeInformation keeps the useless "#TYPE ..." object-type header out of the output.

Make note of -Append if you want to write multiple sets of data to the same file. However, -Append on Export-Csv only exists as of PowerShell v3, which might cause you some headaches. Appending to CSV in older versions is quite a bit more complicated - I will cover that in another post at another time. Essentially, you really want to upgrade to PS 3 or 4 as soon as you can. With that said, straight Export-Csv output is most likely going to contain a lot of stuff you don't care about, so we can use Select to make our life more awesome and less disorganized. Note that in this fashion, the property names also become your headers. So if you run the same command on different computers but output to the same file, you get a SerialNumber column with the SN#'s listed, and a Manufacturer column with the manufacturers listed as well.

Get-WmiObject win32_bios | select manufacturer,serialnumber | Export-Csv C:\Working\Output.csv -NoTypeInformation -Append
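So, sticking with the hypothetical Computers.txt list from before, the multi-computer version (PowerShell v3 or later, because of -Append) would look roughly like this:

$Computers = Get-Content C:\Working\Computers.txt
foreach ($Computer in $Computers)
    {
    Get-WmiObject Win32_BIOS -ComputerName $Computer | select manufacturer,serialnumber | Export-Csv C:\Working\Output.csv -NoTypeInformation -Append
    }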

Maybe now you are getting the hang of making some simple reports. However, what sucks about the above example is that you are constraining yourself to only the properties that come back from Win32_BIOS. Because what you would obviously want to include in this report is the hostname, or you have no reference for what you are looking at. So back to the original goal: let's create our own headers and put whatever info we want into them, instead of piping our output and limiting ourselves. Break free! KHANNNNNNNN!

Let's start off by saying there are several ways to create a CSV file. You can build custom PSObjects, you can build it with a here-string (my personal favorite), and probably a few more ways besides. However, they are all a little confusing at first, so I really recommend you mess around with them before you get started in earnest. It's good to note that you should be able to run these commands on your computer directly from this guide to get a good feel for them, and then start experimenting. I like here-strings because they offer flexibility in the way the tables and strings are entered, and you can drop one into each pass of a loop or at the end of a script - however you prefer to run it, really. They are a bit rigid in other ways, though: things like comments and quoting don't behave quite the way you might expect inside them. For reference, here is roughly what the PSObject route looks like; after that, let's get started with the here-string approach.
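(A caveat on this sketch: the column order isn't guaranteed, because the properties come from an unordered hashtable. On PowerShell v3 and later you could use [PSCustomObject]@{ } instead, which keeps the order you typed.)

$WMInfo = Get-WmiObject Win32_BIOS
$Row = New-Object PSObject -Property @{
    Hostname     = $env:COMPUTERNAME
    Manufacturer = $WMInfo.Manufacturer
    SerialNumber = $WMInfo.SerialNumber
    }
#Each object becomes a row, and the property names become the headers
$Row | Export-Csv C:\Working\Output.csv -NoTypeInformation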

Pancake:
$csv = 
@"
name,manufacturer,serialnumber
$var1,$var2,$var3
"@

So what we are doing here is creating a variable, $csv, where we will store our data. The commas separate the columns in the header on the first line, and you enter the data, in the same order, on the line below. For the work I do and have seen others do, this is probably going to be the most used way. As usual, $var1-3 can be any strings we would like. But to understand how this works for us, we need to understand how it doesn't work for us. Let's look at the following example.

$WMInfo = Get-WmiObject win32_bios

$csv = 
@"
name,manufacturer,serialnumber
$WMInfo.name,$WMInfo.manufacturer,$WMInfo.serialnumber
"@

$csv > C:\working\outputstuff.csv
Import-Csv C:\working\outputstuff.csv

Here is the gotcha: unfortunately, this doesn't work. In any normal situation it would be perfectly acceptable - you can get at a property of an object just by adding a . and the name of whatever it is you want. Inside a double-quoted here-string, however, only $WMInfo gets expanded, and the .manufacturer part is treated as plain text. This is a little more manual than maybe we would like, so we have to add something for it to work as expected. So what is it that we have to do? Not much - we just need to put those properties into their own simple variables first, like so:

$WMInfo = Get-WmiObject win32_bios
$WMInfoName = $WMInfo.Name
$WMInfoManufacturer = $WMInfo.Manufacturer
$WMInfoSerial = $WMInfo.SerialNumber

$csv = 
@"
name,manufacturer,serialnumber
$WMInfoName,$WMInfoManufacturer,$WMInfoSerial
"@

$csv > C:\working\outputstuff.csv
Import-Csv C:\working\outputstuff.csv

So there we have it: the simple way, in my opinion, to make a CSV file with custom information. Just for an example, let's throw in something else - a string of our own that could be coming from anywhere.

$Hostname = [string]"Awesome-TorrentPC"

$WMInfo = Get-WmiObject win32_bios
$WMInfoManufacturer = $WMInfo.Manufacturer
$WMInfoSerial = $WMInfo.SerialNumber

$csv = 
@"
hostname,manufacturer,serialnumber
$Hostname,$WMInfoManufacturer,$WMInfoSerial
"@

$csv > C:\working\outputstuff.csv
Import-Csv C:\working\outputstuff.csv

Essentially, this gives you the structure to create a CSV file, fill the columns with whatever kind of data you want, name the columns whatever you want, or add more columns (up to infinity, I guess? But let's not get carried away here). And if you put this in a loop while you are gathering information, you can use this structure to write all of the relevant output to a single file.
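For instance, looping over the same hypothetical Computers.txt list from earlier, it might look roughly like this - with the header row written once, outside the loop, so it doesn't get repeated:

$Computers = Get-Content C:\Working\Computers.txt

#Write the header row once
"hostname,manufacturer,serialnumber" | Out-File C:\Working\Report.csv

foreach ($Computer in $Computers)
    {
    $WMInfo = Get-WmiObject Win32_BIOS -ComputerName $Computer
    $WMInfoManufacturer = $WMInfo.Manufacturer
    $WMInfoSerial = $WMInfo.SerialNumber

    #Same idea as the here-string above, just one data row per pass
    "$Computer,$WMInfoManufacturer,$WMInfoSerial" | Out-File C:\Working\Report.csv -Append
    }

Import-Csv C:\Working\Report.csv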

I think that is enough for now, we will go over output again at another time, as there is just so much to cover. But in the meantime, make your boss proud!

Wednesday, November 6, 2013

Jobs, and how they save your life

No, I don't mean your job. You are reading this page because you have a job and you need your life to be easier, or you don't have a job and cats got boring. What I do know is that PowerShell, and its ability to thread workloads, has let me keep my current job, as well as other important things like sanity, clean underpants, hair, and carpal-tunnel-free hands. You get the idea. It's also a topic where, when I was starting out learning about jobs, I had trouble finding many good, easy-to-use examples of how to apply them to what I needed.

There are many types of jobs in PowerShell, and just as many different ways to execute them. However, for the most part, you either have a script you wish you could thread, or you have a requirement to run a lot of things in parallel instead of sequentially. A few instances where this might be used: starting or stopping services on hundreds of servers, copying a file to a remote file system on each server, creating and nesting groups in AD based off of a document, grabbing log files from many different hosts, or whatever else it is you may be doing. I hope the valuable application of this has already become apparent.

Let's get something clear first: Start-Job is expensive. It starts a separate instance of PowerShell, which brings with it two unique challenges:
  1. You are going to catch your computer or server on fire (maybe)
  2. You can't pass variables from the host shell so easily
Number 1 is because Start-Job creates a new instance of PowerShell, which needs resources. Personally, I have gotten away with running about 150 PowerShell instances concurrently, which required about 12GB of available memory. Number 2 is the real gotcha: since Start-Job spins up a fresh, clean instance of PowerShell, it won't have any of your session variables, so you need to pass them to the job. That is the function of -ArgumentList, which we will go over in this post.

Pancake:
Start-Job -Name JobNameWhatever -Scriptblock {..} -ArgumentList $var,$var2

Easy enough, right? This is the basic breakdown of Start-Job, and how 99% of the world's population will be using it. You can start a job based on a remote script as well, which is similar to Invoke-Expression (we won't cover that here); you can use a very large and complicated script block (what you will most likely be doing); or you can enter any plain old boring PowerShell command, which you will put into a script block anyway. So let's take another example of actually using Start-Job in a loop. This covers just about every need I think you will have for it.

$Names = Get-Content C:\PowershellPancakes\AuntJemima.txt
foreach ($Name in $Names)
    {
    $MapleSyrup =
        {
        Param ($name)
        New-Item -Name $Name -ItemType Directory -Path C:\PowershellPancakes
        }

Start-Job -Name $Name -Scriptblock $MapleSyrup -ArgumentList $Name
    }

This is just a simple script in a loop, where we are creating some delicious folders based off of the names in the text file. You can open the loop however you want, in any way you want - this is the basic pattern. Now imagine your $MapleSyrup scriptblock: you can do whatever you want in that script block. You can even take an existing script that takes manual input, put it into a script block like this, and start a job based off of it. I have been doing this more and more whenever someone asks me, can we do this, like, all at once? Why yes, client A, we can do this all at once. To maintain your full head of hair, I suggest you set the job name to a variable you can identify. That way, if you are running several hundred jobs, you can skim through the list with Get-Job.

I want to go into detail about the following bit, because this may cause you a lot of headaches when you get started with jobs. So here it is:

Param ($name)

Were you expecting more? This just needs to be noted. This is how your parameters get transferred into your scriptblock: Param names them inside the block, and -ArgumentList on Start-Job supplies the values. If you skip this, the value of $name will be null when the job starts, which could create unforeseen problems, as well as just not work. Another note: variables you create inside the script block don't need to be added to -ArgumentList or Param - they are contained. As long as you pass the external variables from outside the scriptblock through Param and -ArgumentList, you are okay. Also remember to separate params with a comma, like so:

Param ($name,$plants,$syrup)


Or, personally,

Param ($name,
       $syrup,
       $plants)


You have just one line you need to run?

Start-Job -Name Blueberries -Scriptblock {robocopy \\serverB\1 \\serverB\2}

What's nice about starting jobs is that, since they use another instance, you're free to continue using the parent shell for whatever you want. Some of you may have run into the situation where, if you were running a script in the ISE and wanted to do something else in another script in the same ISE, you couldn't. Since Start-Job creates its own instance of PowerShell, you don't have to worry about this.

So you've started a job and it's just...running...somewhere...

Get-Job 

Though no, you can't exactly crack open a running job and watch it work - you will need to have some confidence in your own script. In another post we will go over how you can add error handling to pass things back to your parent shell, or out to a log file, using different commands, including Try and Catch, which we touched on earlier.
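In the meantime, here is roughly how you check on a job and pull its output back into the parent shell, using the Blueberries job from above:

#Check on a specific job by the name you gave it
Get-Job -Name Blueberries

#Block until it finishes, then grab whatever the scriptblock output
Wait-Job -Name Blueberries
Receive-Job -Name Blueberries -Keep

#Clean up when you're done looking at it
Remove-Job -Name Blueberries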

But I have a job I need to run through the 400 objects in the CSV file I'm looping over, and I only have 4GB of available memory! Oh, what do I do? First I will refer you to a previous posting that goes hand in hand with this one: What Do Until loops do for you. They are connected directly. But how? Let's take our earlier example and put some job control on it.

$Names = Get-Content C:\PowershellPancakes\AuntJemima.txt
foreach ($Name in $Names)
    {
    $MapleSyrup =
        {
        Param ($name)
        New-Item -Name $Name -ItemType Directory -Path C:\PowershellPancakes
        }

Start-Job -Name $Name -Scriptblock $MapleSyrup -ArgumentList $Name
DO {
    Start-Sleep -Seconds 5
    #Give it some time
    Remove-Job -State Completed
    } Until (@(Get-Job).Count -le 5)
    }

What just happened? Well, we just limited the number of active jobs at any given time. I probably don't need to tell you why you want to do this, especially if you are working with huge amounts of data and lots of jobs need to be created to chew through it. You can even create jobs that create sub-jobs, but that's a bit unnecessary in my personal opinion, unless each job has a lot of items to complete that would be better suited to threading as well. Just remember your hardware limits so you don't do anything too crazy and bring everything to a halt. Also consider your network, storage, and whatever else it is you will be working with. Let's not be reckless here.

I think that basically covers it for jobs - an absolutely essential feature of PowerShell that I think is sometimes misunderstood. Mess around with the examples in this post and you should get a very good feel for how to incorporate these Pancakes into your own recipes.



Monday, November 4, 2013

What Do Until loops do for you

Lately I've been getting into a lot of the multi-processing and threading features of PowerShell. Of course, being the Powershell Pancaker, I don't typically like most of the examples I see - oftentimes they are just too bloated, and they hurt to read (variables in the ISE burn my retinas). But I really needed this. Almost as badly as I need coffee, or exercise, or underpants.

Have you ever wanted something to happen, until it happened correctly? What if you had a bit of code that depends on something external - maybe you want to run a script, then (for whatever reason) go manually complete a task, and have the script continue when you are done? I don't know why you would do that. But this can be particularly useful in scenarios where you are waiting on some bit of AD replication, as I was, which is what prompted me to write this. It is also essential if you are running jobs in PowerShell.

The lazy man's way out of this predicament is to use the Start-Sleep cmdlet - which basically means you have way too much free time to run this script, and obviously you really just don't care about pretty much anything. But that doesn't work if you have a deadline and need this done as fast as possible. And doubly so if you are running your function as a separate job, where you either aren't particularly bothered to report results from each job, or it is just unnecessary. It also doesn't work if you need to execute 400 instances of PowerShell and you only have 16GB of RAM. And it could be troublesome if your lazy Start-Sleep timer wasn't long enough and, unbeknownst to you, your script failed or didn't meet your success criterion.

So let's get started.

Pancake:
DO { #Start a DO loop
    $a++
    Write-Host $a
    } Until ($a -eq 10) #Write $a++ until -eq 10

This is the basic breakdown. Until (...) accepts all sorts of conditions, which makes this useful in a number of scenarios. Inside of the DO loop, you can do pretty much anything you want. Let's try something completely nonsensical.

DO {
    Start-Sleep 2
    Write-Host Waiting for folder..
    $Folder = Test-Path C:\PowerShellPancakes\Testdir
    } Until ($Folder -eq $true)

In the above, we are using the unlikely scenario where you manually create a folder and the loop waits for it to show up. Let's say we need to wait until this folder is created, and then, when it is, copy something into it. If you are interleaving copy jobs, maybe you will find this useful; otherwise no one else will ever use this exact example. But these are Pancakes, remember? We can put whatever we want on them. To do this we just add whatever comes next after the end of the loop, and the script continues on.

DO {
    Start-Sleep 2
    Write-Host Waiting for folder..
    $Folder = Test-Path C:\PowerShellPancakes\Testdir
    } Until ($Folder -eq $true)
Copy-Item -Path C:\PowershellPancakes\AuntJemima.txt -Destination C:\PowershellPancakes\testdir
#Script continues as normal
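Here's another quick sketch along the same lines - say you want to wait for the print spooler service (picked arbitrarily) to come back up before a script carries on:

DO {
    Start-Sleep 5
    Write-Host Waiting for the spooler..
    $Service = Get-Service -Name Spooler
    } Until ($Service.Status -eq "Running")
Write-Host "Spooler is back, carrying on.."
#Script continues as normal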

As you can imagine, we can use this bit of code to turn basically anything that involves a wait time, a success criterion, or a value limit into more of a fire-and-forget system. Is there anything else we can do with this? Hmm, what about Try and Catch? This is what you need to know if you don't want a bunch of error output spamming you while you wait for your condition to become true, or whatever it is you are doing. Let's use the following example.

DO {
    Start-Sleep 2
    Write-Host Waiting for folder..
    $Folder = Move-Item C:\PowershellPancakes\testdir -Destination C:\PowershellPancakes\Destination
    } Until ($Folder -eq $true)

This is going to be super annoying. You haven't even created TestDir yet - how is PowerShell supposed to move it if it doesn't exist? It's okay, PowerShell will definitely let you know that it doesn't exist. Every two seconds, actually. What if you aren't as stupid as it assumes you are, and you want it to chill out until you have created the folder so it can be moved? (Again, let's forget practicality for a moment.) Well, this is where try and catch come into play. With try, we can do just that - try things. There is a big difference between try and do, except in PowerShell, we do try. (See what I did there?) So let's look at something that won't hurt our eyes quite so much.

$ErrorActionPreference = "Stop"
DO {
    Start-Sleep 1
    Write-Host Waiting for folder..
    try {
        $GetFolder = Get-Item C:\PowershellPancakes\Testdir
        }
    catch { }
    } Until ($GetFolder -ne $null)
Copy-Item -Path C:\PowershellPancakes\AuntJemima.txt -Destination C:\PowershellPancakes\testdir

Ah, much better. Thank you for understanding, PowerShell; don't be so impatient. There is one important thing to note, however: try and catch only work for terminating errors. The errors in the examples above are non-terminating by default - the script complains and carries on - which is why we set $ErrorActionPreference to "Stop", turning them into terminating errors that catch can actually grab. In the do loop we are doing exactly what we want: we try to get the folder, and we keep trying every second, while catch captures whatever error comes out of that try block. You can put a lot of things in the catch block - some additional processing, maybe a nicer-looking Write-Error, writing out to a log file - the possibilities are endless. You can even catch particular exception types from try, if you happen to be running more than one set of commands. When I write scripts I usually leave the catch block empty, as I haven't had a reason to fill it in yet; I use it just to swallow the error output. But much like If and Else statements, try and catch follow each other naturally.
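As a quick illustration of that last bit, here is a sketch of catching one specific exception type - for a missing path, that happens to be ItemNotFoundException:

$ErrorActionPreference = "Stop"
try {
    $GetFolder = Get-Item C:\PowershellPancakes\Testdir
    }
catch [System.Management.Automation.ItemNotFoundException] {
    #Only fires for the path-not-found case; any other error still gets thrown
    Write-Host Still waiting on that folder..
    }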

That's all for now - this should get you started with the Do Until, Try and Catch statements.