RightScale Blog

Cloud Management Blog

Windows Azure VMs with RightScale for Rapid Testing & Validation

With Microsoft's announcement today of the general availability of its IaaS solutions, it's only appropriate that we press on and get technical with how to use Windows Azure within your dev/test workflow. In my article on how to Integrate Windows Azure IaaS with RightScale to Improve Your SDLC, I discussed strategies DevOps shops can use to improve development and test cycles, shared some insights into where to focus in your process, and pointed to tools that can help. The next step is to tune up your well-oiled software development lifecycle (SDLC). I’ll show you how you can reduce your wait times for testing and improve your continuous deployment process using MSBuild and RightScale.

I’ll cover:

  • Architecting a custom MSBuild script for flexibility and ease of use
  • Utilizing Windows Azure services to store and manage your build packages
  • Implementing RightScale processes and assets to deploy, configure, and manage your dev and testing environment

Now let’s dig into the nuts and bolts of how your DevOps team can improve your dev/test workflow – from build script architectures to deployment automation using Windows Azure and RightScale.

The Reference App

To show how the process can work, I chose as a reference project the Mileage Stats application that Microsoft's Patterns and Practices group put out under the Project Silk initiative. The group did a great job of building a visually appealing web app that has some cool HTML5 in it and uses SQL Server's LocalDB engine to host the data. For a demonstration of web technologies it's a good choice, but I actually want to dig into integrating SQL Server deployments with the demo app, so I modified the project to use SQL Server by building out a new class library and modifying the Unity configuration to point to it. I also uploaded two database backups to GitHub (SQL Server 2008 R2 and SQL Server 2012).

MSBuild Process and Architectures

If you want to learn more about MSBuild, there are a ton of great resources online to get you up to speed.

Frameworks like MSBuild are the key to getting DevOps right in the Microsoft ecosystem. MSBuild is completely entrenched in the software world and is easily extensible. For me, it has become the go-to scripting framework for managing deployments and configuration changes. If you’re a Visual Studio developer, you already use MSBuild, even if you're not aware of it. Visual Studio project files are actually (you guessed it!) MSBuild project files, and are consumed by the MSBuild engine to compile your application every time you press F5 or Ctrl-Shift-B when you're in Visual Studio. When putting your custom MSBuild scripts together, it can be tough to figure out how much flexibility to put into the framework you're architecting, so I thought I'd share my template and thought process on building it out.


To start, I wrapped my XML docs in a standard class library project so that I can version them and eventually build a publish process out of it, but for now, it's just the container I'm using. You can get the code from GitHub. I have a main Build.xml file that includes a number of other files along the way: an environment-specific file, a project-specific file, and a base set of custom targets that may consume custom tasks.
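In outline, the main file might look something like this (file names here are illustrative; the real layout is in the GitHub repo):

```xml
<!-- Build.xml (sketch): the main entry point that stitches the other files together -->
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <!-- Environment-specific settings: machine paths, SQL Server location, etc. -->
  <Import Project="Environments\$(ComputerName).xml"
          Condition="Exists('Environments\$(ComputerName).xml')" />

  <!-- Solution/project-specific properties and item lists -->
  <Import Project="Properties.xml" />

  <!-- Shared custom targets, which may consume custom tasks -->
  <Import Project="Targets.xml" />

</Project>
```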

You need an environment file because developers all set up their machines differently, such as where SQL Server is installed, for instance, or specific repository root paths, or the possible use of a secondary partition to store data. Also, some critical paths in the process of building this application  – most notably the location where we're going to publish the package – may need to be accounted for on an environment-by-environment basis.  While you can push the name of the environment file into the MSBuild process via the command line, it's much easier to set defaults and use the $(ComputerName) variable to pull in the environment file automatically. With this technique, you only need to create the file in the right place, and if you're on a domain network, you're guaranteed to have a unique name! Check out how it works: if a value is found for $(EnvFile) among the command-line parameters, MSBuild imports that file; otherwise, if a corresponding environment file exists for the computer, it imports that one:
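A sketch of that selection logic (a fragment from the main build file; file layout is illustrative):

```xml
<!-- If $(EnvFile) was passed on the command line (/p:EnvFile=...), import it;
     otherwise fall back to the file named after this machine, if one exists. -->
<Import Project="$(EnvFile)"
        Condition="'$(EnvFile)' != '' And Exists('$(EnvFile)')" />
<Import Project="Environments\$(ComputerName).xml"
        Condition="'$(EnvFile)' == '' And Exists('Environments\$(ComputerName).xml')" />
```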

With this setup, the process of configuring an environment is the same for your local machine as it is for your build box. That gives you a ton of flexibility as to how you set things up on your machine depending on what else you need to have set up to do the rest of your day job.

Next, the build process pulls in all of the solution-specific artifacts, such as what to build and other solution-level defaults. In the Mileage Stats application, check out the Properties.xml file and note that I'm setting up a number of properties for the build process itself, such as paths for tasks and the solution root path, as well as compiler-level hints for code analysis and build configuration and architecture. Next, I'm creating the collection of project and solution files that I'll actually build; for this application I’m just building the overall .sln file: 
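The shape of that file is roughly this (property and item names are illustrative; see the repo for the real Properties.xml):

```xml
<!-- Solution-level defaults, overridable from the command line or environment file -->
<PropertyGroup>
  <SolutionRoot Condition="'$(SolutionRoot)' == ''">$(RepoRoot)\MileageStats</SolutionRoot>
  <Configuration Condition="'$(Configuration)' == ''">Release</Configuration>
  <RunCodeAnalysis Condition="'$(RunCodeAnalysis)' == ''">false</RunCodeAnalysis>
</PropertyGroup>

<!-- The collection of things to build - just the one solution file here,
     but an ItemGroup makes it trivial to add more projects later -->
<ItemGroup>
  <SolutionsToBuild Include="$(SolutionRoot)\MileageStats.sln" />
</ItemGroup>
```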

Finally, I have the rest of the targets and process behind what MSBuild is actually going to do. At this point I'm just packaging up the web application using the Web Deploy package target and doing some file manipulation to zip up the entire contents of the package using DNZip from the MSBuild Extension Pack.
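That stage might be sketched like this (target, property, and path names are illustrative, and you should check the Extension Pack docs for the exact DNZip parameters):

```xml
<!-- Pull in the MSBuild Extension Pack task declarations -->
<Import Project="$(TasksPath)\MSBuild.ExtensionPack.tasks" />

<Target Name="Package" DependsOnTargets="Compile">
  <!-- Build the Web Deploy package into a staging folder -->
  <MSBuild Projects="$(SolutionRoot)\MileageStats.Web\MileageStats.Web.csproj"
           Targets="Package"
           Properties="Configuration=$(Configuration);PackageLocation=$(PackageRoot)\Web\MileageStats.zip" />

  <!-- Zip up the entire package folder (package, .cmd, parameters files) -->
  <MSBuild.ExtensionPack.Compression.DNZip TaskAction="Create"
                                           CompressPath="$(PackageRoot)\Web"
                                           ZipFileName="$(PackageRoot)\MileageStats_Package.zip" />
</Target>
```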

At the end of the process I have a fully compiled MSDeploy package that I can use to move the web app out in the wild. I've incorporated RightScale and Windows Azure into TFS, and rather than building the solution file directly, I'm calling my custom build process and this package target in addition to my normal build/compile/test process.

Taking It to the Cloud

With that part of the process completed, we've actually got the majority of the heavy lifting out of the way. In order to deploy our assets in the cloud, we're going to utilize Windows Azure Storage to house our build packages. This way we can modify some inputs on a RightScale ServerTemplate™ (more on that in a bit) and get the system fired up “automagically.”

On the tail end of the package process in MSBuild, we're going to add a custom task to take the final file and upload it to a specific container in a Windows Azure Storage account. This process is simple and is just a matter of consuming the Windows Azure Storage API within a custom MSBuild Task:
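A minimal version of such a task might look like this (class and property names are illustrative; the actual task is in the GitHub repo):

```csharp
// Sketch of a custom MSBuild task that pushes the build package to blob
// storage via the Windows Azure Storage client library.
using System.IO;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;
using Microsoft.WindowsAzure.Storage;

public class UploadPackageToAzure : Task
{
    [Required] public string ConnectionString { get; set; }
    [Required] public string ContainerName { get; set; }
    [Required] public string PackagePath { get; set; }

    public override bool Execute()
    {
        var account = CloudStorageAccount.Parse(ConnectionString);
        var container = account.CreateCloudBlobClient()
                               .GetContainerReference(ContainerName);
        container.CreateIfNotExists();

        // Upload the zipped package under its file name
        var blob = container.GetBlockBlobReference(Path.GetFileName(PackagePath));
        using (var stream = File.OpenRead(PackagePath))
        {
            blob.UploadFromStream(stream);
        }

        Log.LogMessage("Uploaded {0} to container {1}", PackagePath, ContainerName);
        return true;
    }
}
```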

It's that simple (from a task perspective), and calling it from MSBuild is just as easy:
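Wiring it in is a one-line UsingTask plus a target (assembly and property names here are illustrative):

```xml
<!-- Register the custom task assembly and call it after packaging -->
<UsingTask TaskName="UploadPackageToAzure"
           AssemblyFile="$(TasksPath)\Custom.BuildTasks.dll" />

<Target Name="PublishPackage" DependsOnTargets="Package">
  <UploadPackageToAzure ConnectionString="$(AzureStorageConnectionString)"
                        ContainerName="builds"
                        PackagePath="$(PackageRoot)\MileageStats_Package.zip" />
</Target>
```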

With all of this integrated, there's nothing cooler than seeing it all work without errors.

Deploying Automatically via RightScale

Once we have our compiled (and hopefully unit-tested) application deployment package sitting in the cloud, the next step is where all the magic happens. Using an all-in-one ServerTemplate in RightScale that contains SQL Server and IIS, we're going to spin up an instance for testing.

From the build script, we can run either of two tasks to finalize the process: UpdateCITestServer, which updates a running server with the latest version of code (using RefreshApplication), or LaunchDevTestServer, which terminates the server if it’s currently running (using TerminateWithWait), then relaunches it (using LaunchRSServer) with the newly built code package.

The three custom build tasks I’m using to leverage RightScale’s API are pretty simple, too, when implemented with a .NET wrapper for the RightScale API 1.5:
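For illustration, here's roughly what one of them (the launch) would look like if you called API 1.5 directly over REST rather than through the wrapper; session handling is simplified, and you should double-check endpoints and status codes against the API 1.5 docs:

```csharp
// Sketch of a LaunchRSServer-style MSBuild task hitting the RightScale
// API 1.5 launch endpoint directly. The real tasks use the .NET wrapper.
using System.Net;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

public class LaunchRSServer : Task
{
    // Session cookie obtained from a prior POST to /api/session
    [Required] public string ApiSessionCookie { get; set; }
    [Required] public string ServerId { get; set; }

    public override bool Execute()
    {
        var request = (HttpWebRequest)WebRequest.Create(
            string.Format("https://my.rightscale.com/api/servers/{0}/launch", ServerId));
        request.Method = "POST";
        request.ContentLength = 0;
        request.Headers["X-API-Version"] = "1.5";
        request.Headers[HttpRequestHeader.Cookie] = ApiSessionCookie;

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            Log.LogMessage("Launch requested; API returned {0}", response.StatusCode);
            return response.StatusCode == HttpStatusCode.Created;
        }
    }
}
```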

You can find this ServerTemplate in our MultiCloud Marketplace. It is essentially the Database Manager for SQL Server ServerTemplate with modifications to support this test environment deployment process. After the SQL instance is up, the RightScale cloud management platform automatically runs PowerShell scripts to add database users and configure IIS for the application. It:

  • Installs IIS via the Add-WindowsFeature cmdlet and registers .NET 4.0/4.5 with IIS
  • Using the Web Platform Installer 4.5, installs MVC3 and .NET 4.5 (if it's not already installed)
  • Installs Web Deploy 3.0
  • Downloads, unpackages, and imports the Web Deploy Package into IIS

We're calling the launch action on the server from MSBuild, so we have a lot of control over the inputs and setup on the server via RightScale:

  • We set the Web Deploy package name at launch time so we can launch the package we just built.
  • We can also set variables and modify the configuration of the Web Deploy package – for instance, modifying connection strings via inputs on the ServerTemplate™ that populate the ParameterSet.xml file.
  • Along with that, we can tightly control creating accounts to make sure that the web.config and the SQL Server accounts are synced for connection strings.

End-to-end, I can rapidly deploy an instance for testing from within my build workflow. In 20 to 30 minutes, I have an instance deployed and ready for testing in the cloud, initiated from my development environment with just a few easy clicks.

Behind the scenes, the RightScale ServerTemplate that I'm using also has been set up to conform to my specific security and policy standards. In an upcoming post I’ll show how the ServerTemplate can even be domain-joined to pull down group policy rules so that not only am I able to test on a clean machine, but that I can also test on one that's representative of the proper environment and security context.

A Final Word

Admittedly, this is a lot to take in, even if it is an “easy” example of an all-in-one test server.  But scaling out to an environment that includes multiple tiers (load balancers, SQL Servers, and IIS Servers) for testing and eventually production is really just a matter of mechanics rather than an exercise in building new components. Given the flexibility of the demo assets, with a little modification you could be standing up full stacks right from your development environment! 
