How true it is….

I found this while searching for information on using GTK on Windows. Here is the thread (http://episteme.arstechnica.com/eve/forums/a/tpc/f/48409524/m/346009087831/p/3).

Quote

Originally posted by The Shadow:
You know, funny thing is, they said the same thing about XP’s animations when XP came out – 5 years ago. But most 2002-vintage machines don’t run XP very well unless they’ve been pretty heavily upgraded. I don’t really see Vista as being any different in that respect.

Not just there: most of the complaints about Vista are the same complaints people made about XP and Win2K when they were released, right down to the same catch phrases and statements.

Bloated… blah blah… resource hog… blah blah… incompatible… blah blah… wait for SP1… blah blah…

It’s actually almost comical.

Why “Scorched Earth” can be a good thing.

Historically here in the School of Engineering, software installation was handled by a single person. What I mean is, if a faculty member required specific software to be installed, one individual would take the time to get that software to work. Little if any documentation was created, since we relied on this one person to always be there.

Currently, our image is very poor. We lack the documentation (documentation = knowledge) to properly configure the upwards of 60 unique applications needed by the various disciplines within the school. Consequently, when an application is found to be missing or misconfigured, it takes a significant amount of time to reinvent the wheel. Because of that, confidence in our abilities has suffered extensively.

So the only real solution is to start over, and in our environment the only real way to do that is to blow everything away and come up with a new way of delivering the services that faculty, staff, and most importantly the students require. To that end, I have a very simple plan that I will begin to implement.

Step 1.

Create a new way to deploy software.

All software will be packaged into an easily deployed .msi file. Each .msi file will be deployed via a GPO attached to the appropriate OU in Active Directory.
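Since a GPO software installation runs unattended, every package has to install silently before it is ever attached to an OU. As a minimal sketch of how that can be pre-tested (the package and log paths below are placeholders, not our real paths), a silent install can be exercised ahead of time with msiexec, wrapped here in a bit of Python:

import subprocess
import sys

def test_silent_install(msi_path, log_path):
    # /i install, /qn no UI, /norestart suppress reboots, /l*v verbose log
    result = subprocess.run(
        ["msiexec", "/i", msi_path, "/qn", "/norestart", "/l*v", log_path]
    )
    return result.returncode

if __name__ == "__main__":
    # Placeholder paths; substitute the package actually being tested.
    code = test_silent_install(r"C:\packages\xyz.msi", r"C:\packages\xyz_install.log")
    # msiexec returns 0 on success and 3010 when a reboot is still pending.
    if code in (0, 3010):
        print("silent install succeeded (exit code %d)" % code)
    else:
        print("silent install failed (exit code %d); check the log" % code)
        sys.exit(1)

If a package cannot get through something like this without prompting, it is not ready for GPO deployment.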

Step 2.

During the first full week after classes end, systematically wipe and re-install each individual lab using Windows Deployment Services combined with Managed Software Installation and Windows Server Update Services.

The end result should be a lab that is fully configured and updated, with software that works. I was intentionally vague about the “new way” process, as I will detail that in a second posting, but this is basically the idea.

The “new way” or Software Lifecycles

Currently we have a very poor methodology for managing the host of software applications that we support on our network. The process is as follows:

New Software Request

Faculty: I need application xyz.exe installed for my class

Staff: Ok

Student: Application xyz.exe doesn’t work, and I need this for a very important project

Staff: Ok

Faculty: Why doesn’t this application work? I can download it and run it just fine on my computer at home.

Staff: Ok

Software Upgrade

Faculty: The latest release of xyz.exe is out today; we need this loaded in the labs.

Staff: Ok

Student: Application abc.exe doesn’t work anymore, and I can’t get xyz.exe to run widget.

Staff: Ok

Faculty: I don’t know why abc.exe doesn’t work; I don’t have that problem on my computer at home.

Staff: Ok

Do you see the problem? We have no formal process for evaluating the impact of a given application in our environment, and beyond that, once we have a given application working, we have no procedure for delivering that software to the appropriate computers. This is where the Software Lifecycle comes into play. All software, whether new or an upgrade to an existing application, will go through the following process.

Step 1.

Install the application onto a test machine. The test machine will have the base load of software, consisting of Windows, Office and Sophos Anti-Virus.

Step 2.

Verify that the application will start, then have someone who is knowledgeable in the program verify that it works under a regular user context. If there is a problem, research and resolve that problem and then test again. This process will continue until the application works.
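For the “will it start” part of this check, a simple smoke test can be scripted; a knowledgeable person still has to exercise the program by hand under a regular (non-administrator) account. A rough sketch, with a placeholder install path:

import subprocess
import time

def app_starts(exe_path, settle_seconds=10):
    # Launch the program and confirm it is still running a few seconds later.
    proc = subprocess.Popen([exe_path])
    time.sleep(settle_seconds)
    still_running = proc.poll() is None  # poll() returns None while the process is alive
    if still_running:
        proc.terminate()
    return still_running

# Placeholder path for the application under test.
print(app_starts(r"C:\Program Files\Xyz\xyz.exe"))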

Once the application has been verified to be functional, it will be packaged into an .msi file and installed onto a target machine. The target machine will exactly mirror the targeted lab environment in every way.

Step 3.

Verify that the application will start, then have someone who is knowledgeable in the program verify its functionality. Document any issues that occur at this point, and resolve them. This process will continue until the application works. Once the application is functional, all applications that are already installed will be verified to work; if any issues are discovered, document and resolve them.
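The same smoke test from Step 2 can be swept across everything already installed on the target machine, so regressions are caught before the package ever reaches a lab. The application list below is purely illustrative:

# Reuse the app_starts() check from Step 2 across the existing software load.
installed_apps = [
    r"C:\Program Files\Abc\abc.exe",
    r"C:\Program Files\Xyz\xyz.exe",
]

failures = [path for path in installed_apps if not app_starts(path)]
if failures:
    for path in failures:
        # Document each regression before resolving it.
        print("regression: %s no longer starts" % path)
else:
    print("all previously installed applications still start")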

Once the target machine functions correctly, any fixes will be packaged into the application prior to deployment.

Step 4.

Load the packaged application onto the distribution server, and configure the appropriate software deployment policy.
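As a sketch of what loading the package onto the distribution server could look like in practice (the share name below is made up; ours will differ), copy the tested .msi to the share the deployment policy points at and keep a hash, so the published file can be compared against the copy that passed testing:

import hashlib
import shutil

def publish_package(msi_path, share_path):
    # Copy the tested package to the distribution share and fingerprint it.
    dest = shutil.copy(msi_path, share_path)
    with open(dest, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return dest, digest

# Placeholder package and UNC share path.
dest, digest = publish_package(r"C:\packages\xyz.msi", r"\\deployserver\packages")
print(dest, digest)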

Another option would be to use System Center Configuration Manager to schedule the deployment of the application at a time when it will have the least impact on the lab and the students who may be using the lab. The benefit of this solution is the ability to schedule when an application gets installed as opposed to waiting for a computer reboot to occur.

Once this system is in place we can begin to develop a valuable documentation resource for our applications.