Sep 11

devLink 2013 Retrospective

I realize this is late now, but I’ve finally both recovered from devLink 2013 and (most importantly) caught up to the point that I can think about blog posts again. What a great time that was. I can’t overstate how much I enjoyed devLink.

This was my first devLink (and obviously my first time presenting at devLink). I was clearly unprepared for the intensity of a three-day conference. By the end of the third day, I was worn out. I don’t think I could have fit another nugget of knowledge in my head. I was lucky to attend some great sessions by speakers who were willing to share their vast knowledge.

For those who attended either of my sessions, thank you. I’ve posted the slides and notes for both sessions for anyone who is interested.

Jul 17

Unhelpful PowerShell Azure Message

Even though my day job does not involve much Azure (right now), I thought I should spend some time becoming familiar with the PowerShell tooling available for Azure. I’m hardly stretching the state of the art here, but I have had more than a couple of instances lately where standing up a quick website using Azure Websites was handy. So far, I have done this using WebMatrix. While WebMatrix is a nice tool for the quick stuff, I feel like it has isolated me from the real deployment workflow. So off to PowerShell I go!
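For the curious, the rough flow I’m working toward looks something like this. This is only a sketch using the service-management cmdlets available as of this writing; the site name and publish-settings file name are made up.

Import-Module Azure

#Point the session at a subscription using a downloaded publish-settings file
Import-AzurePublishSettingsFile .\MySubscription.publishsettings

#Stand up a quick site and confirm it exists
New-AzureWebsite -Name "my-quick-site"
Get-AzureWebsite -Name "my-quick-site"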

Unfortunately, even this early on, I have found some things that rub me the wrong way. Take this example from running the Set-AzureSubscription cmdlet.

Get-AzureSubscription Error Message

That is a perfect example of a bad help message.
Me: “What does Subscription Name mean?”
System: “It’s the name of the subscription, dummy.”

Jul 14

CodeStock 2013 Retrospective

Another CodeStock is in the books. Just like every year, I had a great time, and I learned a lot. There were many great sessions this year and I’ll be digesting this information for a while.

One of the things that struck me this year is that, even though there were great sessions, some of the most fun I had was outside of them. The dinners, the social events, and just hanging out with other developers were rejuvenating. It was great sharing ideas with everyone. At any conference, the sessions get most of the attention, but I realized this year that just talking to other developers is at least as valuable.

My only “disappointment” this year was that there were too many good sessions. I got to attend only about half of the sessions I wanted to. Part of that was scheduling overlap, and part was over-crowded rooms. Whatever the reason, the big take-away is that there were a high number of great sessions this year.

More than once, I was unable to get into my first session pick during a time slot. I could point out that sticking the more popular sessions in the smaller rooms wasn’t a great idea, but I assume that point has already been made. Some of the sessions I ended up attending as backups were incredible. Next year, I’m going to continue to aim mostly for sessions on technologies I currently use or am interested in, but I’m also going to make it a point to attend at least a couple of sessions that seem completely unrelated to my interests. “Phat Stacks” is one that sticks out in my mind. I ended up there completely by accident, and it turned out to be one of the most interesting and informative sessions of the conference.

All of the positives said, there were definitely some scheduling issues that I’m sure will be fixed for next year. Specifically, in many cases it seemed like sessions on similar technologies were stacked horizontally (at the same time) instead of vertically (at different times). In the last session block of the weekend, there were two ASP.NET MVC sessions I wanted to see, but I could obviously only attend one. Earlier on Saturday, there were two sessions at the same time about single-page application libraries in JavaScript. It seems to me that instead of forcing attendees to pick a framework and attend that session blind, a more appropriate schedule would have stacked those vertically, so we could attend both, get an overview of each framework, and decide after the fact which fit better. Hopefully that’s something that will be investigated for next year.

I already mentioned the over-crowding in some sessions. Combined with the horizontal stacking of sessions, I’m hopeful that next year sees a return to the post-selection session builder. If we all could have preselected the sessions we planned on attending (out of the ones that were accepted to the conference), I think we could knock out both the stacking and the crowding issues. Is there a technological solution for this?

Again, even slight issues with scheduling aside, CodeStock 2013 was a wonderful and educational time. I can’t wait for next year.

May 31

PowerShell for Fun and Irritation

If you’re a fan of the show “Family Guy”, you may remember the joke where Robert Loggia was asked to spell his name. He did so by working his name into a sentence for every letter. For example, “E as in everybody loves Robert Loggia.” We’ve started doing that around the office, the challenge being whether you can come up with the verbose spelling fast enough for it to still be relevant. I quickly decided that PowerShell could help me with this. There’s no trick here, really. I break the incoming text into a character array, and for each letter I look up a sentence opener in a hash table.

function Get-VerboseSpelling
{
    Param(
        [string]$Text
    )

    #Sentence openers for each letter of the alphabet
    $spellingDictionary = @{
        "A"="Awesome,";
        "B"="Boy howdy,";
        "C"="Certainly you now know";
        "D"="Don't you know";
        "E"="Everybody knows";
        "F"="Feel good about yourself because you know";
        "G"="Good thing you now know";
        "H"="Have you heard";
        "I"="I have now told you";
        "J"="Just accept that";
        "K"="Knowledge has been passed because you know that";
        "L"="Look, just accept that";
        "M"="Man, aren't you glad you now know that";
        "N"="Now you know that";
        "O"="Other people know that";
        "P"="Please remember that";
        "Q"="Quick, remember that";
        "R"="Remember that";
        "S"="Stop, collaborate and listen,";
        "T"="Thank you for remembering that";
        "U"="Ultimately, just accept that";
        "V"="Very smart people know that";
        "W"="Why won't you accept that";
        "X"="Xenobiologists know that";
        "Y"="Your significant other already knows that";
        "Z"="Zebras know that";
    }

    #Walk the text one letter at a time, skipping spaces,
    #and emit a sentence for each letter
    foreach($char in $Text.ToUpper().ToCharArray())
    {
        $char = $char.ToString()
        if($char -ne ' ') {
            "{0} as in `"{1} it's {2}`"" -f $char, $spellingDictionary[$char], $Text
        }
    }
}

The real humor here comes when you pair this with the PowerShell Community Extensions’ cmdlet Out-Speech. You can type something in and have the verbose spelling read to you. The default voice in Windows 7 doesn’t do a particularly good job of it, but it’s still fun.
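For example, assuming PSCX is already loaded so Out-Speech is available:

#Have the verbose spelling read aloud
Get-VerboseSpelling -Text "Loggia" | Out-Speech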

I’m already seeing ways to improve this, such as defining multiple sentence openers for each letter and having the script pick one at random; a rough sketch of that idea is below. If you do anything neat to build on this, I’d love to hear about it!
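Something like the following could work (hypothetical openers, only two letters shown):

#Each letter maps to an array of openers; Get-Random picks one per use
$spellingDictionary = @{
    "A" = @("Awesome,", "All your friends know");
    "B" = @("Boy howdy,", "Better remember that");
}

$opener = $spellingDictionary["A"] | Get-Random
"A as in `"$opener it's Apple`""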

Apr 30

Upcoming Conference Sessions

Conference season is almost upon us. CodeStock is coming up in mid-July and devLink at the end of August. I’m really looking forward to some focused learning, as well as getting to “geek out” with some other developers.

If you haven’t yet signed up for CodeStock, you should do so if travel to Knoxville, TN is possible for you. It’s a great conference with some great people, all for less than the price of a couple of books. If you sign up before May 15, 2013, tickets are at a reduced price and you get to cast a vote on which sessions will be selected. If you’re doing that, I’d appreciate a vote for the sessions I’ve submitted.


Since nobody said otherwise and the information is on the website, I’m going to assume it’s public knowledge that I’ve also been selected to speak at devLink. This will be the first year I am able to attend devLink, but I’ve always heard very good things about this conference. I encourage you to attend that one as well, if you can make it to Chattanooga, TN. It’s a little more expensive than CodeStock, but it’s still a steal for three days.

Apr 19

Maintaining Custom web.config Elements During an Upgrade

We have been using PowerShell to perform low-friction updates to our intranet application for several months. This lets any of our DevOps staff build an update package, deploy it to the client system, and run a single command to update the application. This has worked well for us, until we finally bumped into our first client site that had a web.config file different from the others. In this case, it was the presence of an <identity impersonate="true" /> tag. After we performed an update at this site, the setting was lost and the application stopped working.

This is clearly not acceptable, but we also do not want to maintain two (or more) copies of a stock config file just to account for a single setting. Fortunately, PowerShell’s powerful handling of XML comes to the rescue. Not only does XML data get treated like an object in PowerShell (so you can do $myXml.ParentNode.ChildNode.AnotherChild), but you have access to all of the XmlDocument methods you’re used to from C#.

Before performing the update, we want to check the config file for the <identity> tag and grab a copy of the node. Not the text, but the node itself. This ensures that whatever attributes are defined come with it, without us having to do anything extra. That means that "impersonate=true", "impersonate=false", "username=blah", etc. will all be migrated without having to handle each possibility manually. We then update the web.config file inside the update package, before deploying it, by inserting a copy of the existing <identity> tag. Doing the work in the package ensures that a crash during the update (such as after copying over the new website but before updating the web.config) won’t leave the client’s system in an invalid state.

There is one big gotcha here. When you try to insert a node from one XmlDocument into another, it complains. The fix is to perform an ImportNode() first, then append the node to the new document. Also note that AppendChild() will put the <identity> at the end of <system.web>, not necessarily in the same position it was in before. If node order matters for your XML document, be aware of this. If the order of sibling nodes matters for your XML document, I’d argue that you might have a different problem anyway.

After all of that, here’s the code we used. Hopefully it proves useful to someone.

function MigrateIdentityTagsInWebConfig
{
	Param(
		$Source,
		$Destination
	)

	foreach($webConfigFile in (Get-ChildItem -Path $Destination -Filter web.config -Recurse))
	{
		$oldXml = [xml](Get-Content $webConfigFile.FullName)
		$identityElement = $oldXml.configuration."system.web".identity

		#Did we have an identity element in the existing web.config file?
		if($identityElement)
		{
			$correspondingSourceFile = $webConfigFile.FullName.Replace($Destination, $Source)
			$newXml = [xml](Get-Content $correspondingSourceFile)

			#Make a copy of the node from the existing web.config, import it into the new, and then append it
			$newIdentityElement = $newXml.configuration."system.web".OwnerDocument.ImportNode($identityElement, $true)
			[void]$newXml.configuration."system.web".AppendChild($newIdentityElement)

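			#Format-Xml comes from the PowerShell Community Extensions (PSCX)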
			$newXml.OuterXml | Format-Xml | Set-Content -Path $correspondingSourceFile
		}
	}
}
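Usage looks something like this (hypothetical paths; $Source points at the update package and $Destination at the currently deployed site):

MigrateIdentityTagsInWebConfig -Source "C:\Updates\Website" -Destination "C:\inetpub\wwwroot\Intranet"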

Mar 07

Running PowerShell 2 Scripts After Installing PowerShell 3

My employer has a lot of PowerShell v2 scripts in production, spread across a great many servers at a great many sites. It’s because I have to support these scripts in production that I have held off on upgrading myself to PowerShell v3. I finally got tired of holding myself back and took the plunge. I updated my PowerShell install, but was immediately hit with a wall of red text concerning an incompatibility in my install of PowerShell Community Extensions. No problem, I go and download the latest. Now I can run PowerShell, but I’m smacked in the face with problems from our deployment package scripts. What I really needed to be able to do right away was get back to a v2-compatible mode. Fortunately, that wasn’t too bad.

PowerShell.exe -Version 2.0

That will start version 2.0 of PowerShell. As far as I can tell, this isn’t just some compatibility mode, but the real v2 environment (if you know otherwise, let me know in the comments). You can tell there’s a difference by checking the value of $PSVersionTable, or by the difference in behavior of Start-Process. Now I can start up in version 2 and run all of our production scripts in the same environment that (most of) our servers will have.
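A quick way to confirm which engine you actually landed in:

$PSVersionTable.PSVersion   #Major should report 2 in the new window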

I’ve had to do this enough now that I just created a new tab type in Console2.

Console2 PowerShell v2 Tab

Now when I want to be 100% sure that I’m stuck in PowerShell v2 mode for dealing with production scripts requiring that environment, I can just start up this tab.  It’s clearly labeled via the title, and I selected a fairly irritating cursor style so that there’s never any question what mode I’m running.
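For reference, the Shell field for that tab is just the full path to powershell.exe plus the version switch (adjust the path for your system):

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -Version 2.0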

All of this is great, except now I get a wall of red text because the new PSCX I just installed doesn’t work with PowerShell v2. Don’t worry, there’s actually a fairly simple solution to that. First, you’re going to want to go grab a version of PSCX that works with PowerShell v2. Install it to a sibling folder of your v3 PSCX install (I picked Modules\Pscx_21).

Most of the time, Import-Module wants the module name and the name of the folder in which it is stored to be the same. The way around this is to point Import-Module at the actual .psd1 file, so Import-Module .\Modules\Pscx_21\Pscx.psd1 works just fine. All that’s left is a little bit of trickery in your $profile to figure out which version of PowerShell is loading, and to import the correct version of PSCX in response.

#Load the PSCX version that matches the running PowerShell engine
if($PSVersionTable.PSVersion.Major -eq 2) {
    Import-Module .\Modules\Pscx_21\Pscx.psd1
}
else {
    Import-Module .\Modules\Pscx
}

Now you will be loading the correct version of PowerShell Community Extensions no matter which version of PowerShell you’re starting. Of course, there are still a few gotchas that we’ve bumped into in our upgrade. I’ll cover those in future posts. What about those of you reading this? Anything you want to warn other developers about before they upgrade to PowerShell version 3?

Jan 15

PowerShell to Clean Up Invalid UniqueName Properties for RadGrid

In a recent upgrade to our product, we discovered that a great many of our custom reports had UniqueName properties that were incompatible with the new version of Telerik’s RadGrid. The question fell to me, “how many reports have spaces in their UniqueName property?” Fortunately, PowerShell made it fairly simple to identify the reports with problems.

Get-ChildItem -Path . -Filter *.aspx -Recurse | 
  Select-String -Pattern "UniqueName=`"\w+\s\w*`"" | 
  Group-Object -Property Path | 
  Select-Object -Property @{Name='Spaces in Name';Expression={$_.Name}}

Wonderful, I have identified the problem. It should come as no surprise to any developer that the next step was “great, can you fix it?” I went through a few files by hand to verify that any kind of mass replace was unlikely to cause issues. Once I was sure it was “safe” (as safe as blindly replacing property values can be), PowerShell made a day-long job take only minutes.

$reports = Get-ChildItem -Path . -Filter *.aspx -Recurse |
    Select-String -Pattern "UniqueName=`"\w+\s\w*`"" |
    Group-Object -Property Path |
    Select-Object -Property Name

foreach($report in $reports)
{
    $aspxPage = $report.Name
    #The VB code-behind lives right next to the .aspx page
    $codeBehind = "{0}.vb" -f $report.Name

    $uniqueNamesWithSpaces = (Select-String -Path $aspxPage -Pattern `
      "UniqueName=`"(?<uniqueName>\w+\s\w*)`"" `
      | Select-Object -Expand Matches)

    $aspxContent = (Get-Content -Path $aspxPage)
    $codeContent = (Get-Content -Path $codeBehind)

    foreach($uniqueName in $uniqueNamesWithSpaces)
    {
        $aspxContent = $aspxContent -replace `
          $uniqueName.Groups["uniqueName"].Value, `
          $uniqueName.Groups["uniqueName"].Value.Replace(" ", "")

        $codeContent = $codeContent -replace `
          $uniqueName.Groups["uniqueName"].Value, `
          $uniqueName.Groups["uniqueName"].Value.Replace(" ", "")
    }

    Set-Content -Path $aspxPage -Value $aspxContent
    Set-Content -Path $codeBehind -Value $codeContent 
}

And that’s all there is to it. Any custom report that has a RadColumn defined with a UniqueName property containing a space would magically have that space removed, both in the ASPX page and the codebehind file. What about you? How would you have handled this differently? One of the great things about PowerShell is that there’s no One Way to handle everything, so post your solutions below.

Dec 21

A Pomodoro Timer in PowerShell

This post falls under the heading of things that probably don’t make much sense, except it was fun and I wanted to do something very specific using a very specific technology. Rather than run yet another program in my task tray (which mostly irritates me because of icon clutter and not system resources), how could I just use another tab in my PowerShell window to help me use The Pomodoro Technique? I came up with the following function that I’ve placed in my $profile.

Function Start-Pomodoro
{
    Param (
        [int]$Minutes = 25
    )

    $seconds = $Minutes*60
    $delay = 15 #seconds between ticks

    #Count down, updating the progress bar on every tick
    for($i = $seconds; $i -gt 0; $i = $i - $delay)
    {
        $percentComplete = 100-(($i/$seconds)*100)
        Write-Progress -SecondsRemaining $i `
                       -Activity "Pomodoro" `
                       -Status "Time remaining:" `
                       -PercentComplete $percentComplete
        Start-Sleep -Seconds $delay
    }

    #Time's up: loop the alarm clip a few times to get my attention
    $player = New-Object System.Media.SoundPlayer "C:\Users\me\Dropbox\Music\CTU.wav"
    1..6 | %{ $player.Play() ; sleep -m 1400 }
}

What does this do? For starters, in my now-dedicated tab, I can just run “Start-Pomodoro” and get a 25-minute timer going, with a progress bar that shows how long I have remaining in my current pomodoro. At the end, it repeats the CTU ringtone from “24” (which is just irritating enough to get my attention). I had to tinker with the delay on the sleep command so that the clip had time to complete before restarting, so you will need to do the same with whatever audio clip you use.
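Shorter timers work too, via the Minutes parameter:

Start-Pomodoro -Minutes 5    #a five-minute break timer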

If an external audio clip isn’t your thing, you can just use whatever you have for system sounds. Just replace the two audio-playing lines above with something like the following.

1..10 | %{ [System.Media.SystemSounds]::Exclamation.Play() ; sleep -m 750 }

Like I said, I don’t know how useful this is. But it was fun, and I use it pretty much all day.

Nov 02

PowerShell, Dropbox, and Windows Live Writer (Part 2)

In a previous post, I described how to set up a file system junction to make Windows Live Writer save to Dropbox without knowing it. That solution worked if set up on each computer that I use. However, I am a huge fan of having the PowerShell profile I stash in Dropbox keep track of this type of configuration. The solution was fairly simple. First, the script snippet:

if( (Get-ChildItem $HomeDocuments "My Weblog Posts" | Select-Object -Property Attributes | Select-String "ReparsePoint") -eq $null)
{
    [void](New-Junction -LiteralPath "$HomeDocuments\My Weblog Posts" -TargetPath "..\..\Documents\My Weblog Posts")
}

Let’s break that down. I’m checking whether a “My Weblog Posts” junction already exists in my local documents folder (that’s what the ReparsePoint attribute indicates). If it doesn’t, the snippet creates one pointing at the Dropbox copy of the folder.
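If string-matching the attributes ever feels fragile, a more direct test of the ReparsePoint flag works as well (a sketch, reusing the same $HomeDocuments variable and target path from above):

$folder = Get-Item "$HomeDocuments\My Weblog Posts" -ErrorAction SilentlyContinue
if(-not ($folder -and ($folder.Attributes -band [IO.FileAttributes]::ReparsePoint)))
{
    [void](New-Junction -LiteralPath "$HomeDocuments\My Weblog Posts" -TargetPath "..\..\Documents\My Weblog Posts")
}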


“But Nathan, what happens if you already had a Windows Live Writer installed and that folder existed?”

Good question. This depends on my having PowerShell set up and running on any new computer I use prior to installing or using Windows Live Writer. The way I work, I can 100% guarantee that both PowerShell and Dropbox are set up before I even think about things like Writer. Your workflow may be different, but you can pretty easily adapt this snippet to handle that. If I ever run into a situation where I have Writer configured before PowerShell, I can just remove the folder manually and then let PowerShell take over. That feels safer to me than any script that would blindly remove the folder, in case I had local drafts saved that I might not want to delete.
