Printing from a Scheduled Task as a different user

It does sound a bit odd, but I’m in the process of moving all the regular monitoring I do to scheduled tasks, and this particular one caused me headaches all afternoon.

I have a script that I run that will update the DPM VolumeSizing spreadsheet that Microsoft put together for System Center Data Protection Manager. It’s a great tool; if you’re running DPM and haven’t looked at it, you should check it out!

The problem I had was that I scheduled this to run as my account and it worked just fine. As soon as I configured it to run as a service account, the script would run, but nothing involving Excel worked. I found several threads on Google that mention as much.

I finally found a very nice thread on TechNet; the answer, from a user named JensKalski, recommends creating a Desktop folder under systemprofile. I’ve read about this before, though it escapes me now where I saw it, but as soon as I created that folder on my server, I got the printout!
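
For anyone hitting the same wall, this is the folder in question: Excel automation run non-interactively expects a Desktop folder under the system profile, and it isn’t there by default. A quick PowerShell sketch, assuming the default Windows paths (the SysWOW64 one covers 32-bit Office on a 64-bit server):

    # Create the Desktop folders Excel expects when automated from a service account
    New-Item -ItemType Directory -Force -Path 'C:\Windows\System32\config\systemprofile\Desktop'
    New-Item -ItemType Directory -Force -Path 'C:\Windows\SysWOW64\config\systemprofile\Desktop'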

YAY! Thanks Jens!

New Home

It’s been forever since I’ve posted anything; a lot has gone on in my work and family life. I don’t have the time I used to have to tinker around with hosting my own stuff. I had thought about moving my site over to WordPress, but in order to do that I would lose all my custom DNS settings.

So I have imported my blog to blogger.com, which was relatively easy and helped even more.
Not a lot else is going on; Super Bowl 46 just ended a few hours ago, the dishes are in the dishwasher, and it felt like the right time to move everything.

I have several posts with images, and I haven’t decided yet if I will update those. I do have some things I want to make sure I carry forward, as I use this site quite a bit. I will also most likely move my repository over to Google as well; I was going to use CodePlex, but I wanted to keep my history and I think I can do that with code.google.com.

Have a nice evening!

Supporting ancient hardware

Today we’ll be working on some moldy oldies! I give you the SGI Indigo2, and its successor the SGI Octane! Bow before their immense glory!

SGI Indigo2 (circa 1994) and SGI Octane (circa 1997)

Here is the problem:

We have research analysis software designed solely for SGI, and our main computer for these analyses, the SGI Octane unix computer, OCTANE, will not boot due to apparent hard drive RAM failure. This 10+ year old computer may not be recoverable.
We can regain our analysis functionality using the functional SGI Indigo 2 computer that has the necessary software, INDIGO. However, the network functionality is not currently available on INDIGO to allow file transfer to/from the computer. Thus, my primary request is that INDIGO be configured to allow secure ftp and remote shell access only.
I believe the computer needs to be added to the workgroup (or in setup in that “domain”). Please assign someone to complete this network access task in the next few days. Please coordinate with my lab assistant to assure that sftp and secure shell from our PC computers is working.
Secondarily, if it is straightforward to recover an image of the failed SGI Octane hard drive, we may consider replacing the drive (if possible), restoring the image onto the “new” (likely used) drive and getting the SGI Octane, OCTANE, working again. This is a lower priority, as there is no substantial data loss, and we have already been looking at ways to move the analyses done on the SGI over to Windows PC software (or possibly Linux). We have no desire to invest large amounts of time and/or money to restore the failed SGI. But if the process is simple and the cost is “cheap”, then we would like to restore the SGI Octane until a long-term solution is found.

So how do you start? Well, we started with one assumption: the SCSI card is dead. If the disk itself is dead, then there is really nothing we can do. Working on our assumption, we were able to scrounge up a full-length PCI SCSI card.
We verified that it was recognized in the BIOS and could see an attached disk.
After our regularly scheduled lunch (Buffalo Wild Wings…yum) and our staff meeting, we headed over to the lab. We had been there once before with Nick, and it wasn’t much fun then either. It was during an IP inventory that we found these machines, and it was on that visit that we configured them to use our network. Sadly, the “Supercomputers” were so old they pre-dated DHCP…but that’s another story. We spoke with the advisor, who was unaware that SGI ceased to exist as a company about eight years ago; further, he was also unaware that SGI stopped using MIPS and switched to Intel-based CPUs about four years before the company filed for bankruptcy.
In the lab we found many things, including some truly lovely wiring.
We also found the sad little Octane, with its drive removed. The first thing we noted was that the disk was six years newer than the computer. This was good news for us, as we had servers that we could mount that disk into natively. We also “fixed” their network issue: it turns out the Indigo2 had the subnet incorrectly specified. That sounds like a stupid mistake, aside from the fact that you had to set it in hex. Yeah, you heard me. We gave them the IP of their box and they were able to access it via sftp.
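
If you’ve never had the pleasure, and assuming it was the subnet mask that had to go in as hex, the hex value is just the dotted-decimal mask written out byte by byte. A little PowerShell sketch of the conversion, using 255.255.255.0 as an example (not necessarily the mask the lab needed):

    # Convert a dotted-decimal netmask to the hex form the IRIX config wanted
    $mask = '255.255.255.0'
    '0x' + (($mask.Split('.') | ForEach-Object { '{0:x2}' -f [int]$_ }) -join '')   # 0xffffff00
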
We informed the advisor that we would attempt to access the disk back in our office, but first we had to make a stop at our storage room. We picked up a lovely Dell PowerEdge 2650 and swapped one of its disks for the failed SGI disk.
Upon booting into the BIOS we ran the disk utilities, which informed us that the disk was in fact dead. Sadly, this means we were unable to fix that part of the problem. But the good news is that we had a nice trip down memory lane, playing with hardware that used to cost thousands of dollars and is now up for sale on eBay for about $400. Sorry this has been a rather rambling post, but upon receiving this ticket in the help desk I felt it really merited some form of posting.

Do you suffer from “Premature Installation”?

Or, “What’s in a name?”

Turns out a whole hell of a lot! First I need to thank Nick for the awesome title, as he completely pinpointed my issue after I told him what happened! The last article I posted talked about our desire to move away from vanilla Windows 2008 and up to Windows 2008 R2. What should have been a pretty straightforward process got slightly mangled by two things: I forgot to rename the computer, and I moved too fast, hence the “Premature Installation!”

Naming is important; there are some names you can change and some you can’t. How computers get their names has also changed with 2008: it used to be that you were prompted for a name during installation, and now you do that afterward. One of the things we found out was that a Domain Controller can have multiple names; while I don’t know how recent that change is, it was new to us. Back to the naming process: while there’s nothing inherently wrong with a Domain Controller named WIN-LLF3467Q0, you would undoubtedly agree it doesn’t really roll off the tongue.

So that was the first problem. I installed Windows 2008 R2 without mishap, Directory Services installed, and when I hopped over to the Domain Controllers OU I noticed my problem. So the first thing I did was go to the above article and rename my new Domain Controller, and this is where the second problem occurred.
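
For reference, the rename itself boils down to stacking the new name on as an alternate, making it primary, rebooting, and then dropping the old one. A sketch with made-up names, so swap in your own domain and the generated WIN-* name:

    netdom computername WIN-LLF3467Q0 /add:dc1.example.com
    netdom computername WIN-LLF3467Q0 /makeprimary:dc1.example.com
    shutdown /r /t 0
    # after the reboot (and once replication has caught up), remove the old name
    netdom computername dc1 /remove:WIN-LLF3467Q0.example.com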

Replication, while speedy, does take time, and the more things you have in your AD the longer it can potentially take. The end result of my fubar is that we wound up with no fewer than three different entries in DNS for the same server, only one of which was correct, and thanks to replication latency the name of the server in AD was completely wrong.

So I did what I imagine most people would do and went to uninstall Directory Services from the server to attempt to start over. But because things had gotten so trashed I was unable to uninstall it: the server name I was on didn’t exist in AD. I really should have taken screenshots, but take my word for it, I was on dc1 and the error was that dc1 didn’t exist…which was technically true. It was a crazy weird edge case; you could actually connect to DC1, but you had to type the name in manually in order to get there. At any rate, I was unable to remove Directory Services, so I turned off the computer and attempted to remove the listed computer account from the Domain.

The problem with that was that in order to do it, you MUST be on a Domain Controller to remove a non-functional Domain Controller from the Domain. I’ve not found an article on TechNet that mentions that, but I’ve not looked in any great detail; the information was found on the TechNet Social site. After connecting over RDP to the off-site Domain Controller, I was able to remove the offending account.
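
For the record, the heavyweight way to evict a dead Domain Controller is ntdsutil’s metadata cleanup, run from a healthy DC; on 2008 R2, deleting the dead DC’s computer object from a working DC performs the same cleanup for you. A rough sketch of the interactive flow, with dc2 standing in for whichever healthy DC you connect to (the domain, site, and server numbers come from the list commands):

    ntdsutil
      metadata cleanup
        connections
          connect to server dc2
          quit
        select operation target
          list domains
          select domain 0
          list sites
          select site 0
          list servers in site
          select server 0
          quit
        remove selected server
      quit
    quit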

So, in the future, remember to be patient and make sure you have a checklist!

  1. Install Windows OS
  2. Change the default name before network connectivity
  3. Make any needed changes
    1. Disable IPv6
    2. Apply 3rd party DNS Hotfix
  4. Install Directory Services
  5. Wait
  6. Wait
  7. Wait
  8. Verify successful replication (see the sketch below)
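
For step 8, repadmin (and dcdiag, if you want a deeper health check) is the usual way to confirm replication has actually converged; a quick sketch, run from any Domain Controller:

    repadmin /replsummary        # per-DC summary of failures and largest deltas
    repadmin /showrepl           # inbound replication status for each partition
    dcdiag /test:replications    # the replication-specific dcdiag test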

These are the steps I followed on my server rebuild yesterday, and the same ones I followed when I migrated the second Domain Controller this morning.

Google Reader feed to Blogengine blogroll.xml

So I wanted the list of blogs I follow in Google Reader to be displayed on my website. BlogEngine uses XML files for just about everything, so I decided to see how difficult it would be to convert the Google Reader OPML to a format suitable for BlogEngine. The first step is to export your list to an OPML file; once you have that, you need to grab the XSLT file that will handle the conversion, and finally some sort of utility that will read in the OPML and XSLT and output the appropriate XML.

Getting your feed

Log in to Google Reader and navigate to Manage Subscriptions; you can find that link in the lower left of your Reader display. From there you will need to click the Import/Export tab, and then just click the “Export your subscriptions as an OPML file” link.

That file will download to your computer; you’ll want to remember where it goes, as you’ll need it shortly. Once you have that file you will need the XSLT that handles the conversion. I found the code on CodePlex, where the original is located.

You should save that as a file with a .XSLT extension, ideally in the same folder as the .OPML file you received from Google. Now you need something that will convert those two files into what you need: blogroll.xml. I found the utility on Microsoft.com, msxsl.exe, and it did the trick; you just need the msxsl.exe file. I downloaded it to the same place as my OPML and XSLT files.

Now we can create our file. The syntax is pretty straightforward, and actually quite flexible; you can literally key in your XML and XSLT from stdin. But this is what my syntax looked like.

msxsl.exe google-reader-subscriptions.xml google-reader-to-blogengin.xslt -o blogroll.xml

The first parameter is your input XML file, and the second parameter is the XSLT file that will be used to create your output (-o) file. The resulting blogroll.xml file can be copied into the App_Data folder of your BlogEngine installation, and you may or may not need to restart the webserver before it shows up. If you don’t have the Blogroll extension displayed, you’ll need to log in and add it to the site.
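
If you can’t track down msxsl.exe, the same transform can be done with the .NET XSLT classes straight from PowerShell; a minimal sketch, assuming the same file names as the command above and that everything sits in the current folder:

    # Use .NET's XslCompiledTransform instead of msxsl.exe; absolute paths keep
    # .NET's idea of the working directory from tripping things up
    $xslt = New-Object System.Xml.Xsl.XslCompiledTransform
    $xslt.Load("$PWD\google-reader-to-blogengin.xslt")
    $xslt.Transform("$PWD\google-reader-subscriptions.xml", "$PWD\blogroll.xml")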

Updated Theme

So I’ve spent the past few days working up a new theme for the site. I’ve been wanting something that looked similar to a newspaper, and I think I came pretty close.

I used the NY Times, LA Times, and the Chicago Tribune as inspiration. I’m using some fonts that I found online. The site title font is Wedding Text BT Regular, the post title font is Baker Signet BT Roman and the body of the site is Baskerville Old Face Regular.

I’ve been using Blueprint CSS since I moved off Drupal and have been very happy with it. I wouldn’t say that I’m an expert by any means but I’m getting pretty good at using it and understanding what I’m looking at.

I hope you enjoy it!

Welcome to the new site

I’ve been running on an OpenVZ server from PhotonVPS for quite some time now. That server was running Ubuntu Server 9.10 (I was wrong, Carson), and the site was running on Django. I’ve had no complaints for the most part; Photon has always been very prompt, and aside from a few minor annoyances, which seemed to plague Carson and Nick more than me, it’s been fun.

Over the past weekend I decided to look for a different hosting provider, and I also wanted to change the site’s look and feel. Since I am at heart a Windows guy, I felt I should move over to a Windows server. I looked at the Photon Hyper-V service, compared it to the AccuWebHosting Hyper-V service, and decided that for the money, AccuWebHosting was where I wanted to be.

The past few days we’ve had some crazy cold weather, so on Monday I moved my site over to the new server. I was able to set up Apache, Python and mod_wsgi and get my new server up and running in about 20 minutes, thanks to an article I wrote a while back. Then I started poking around for an alternative to Django on IIS (which I’ve not got working…yet). I found BlogEngine.NET, and so far I really enjoy it; it feels very Drupal-ish. I don’t think I’ll stay with it for the long haul, but it works for now.

Over the next few days I’ll make more tweaks and move the rest of my content over.

Redeeming my IT card

So, flat in the middle of standing up a new file server cluster and writing about it (in draft), I tweeted a few things that Carson nailed me on. First, I was just slightly annoyed at how the Dell NX3000 comes pre-configured, to which Carson pointed out that it’s an appliance and to get over it. Then I was annoyed because of the default roles and services, and I progressed through the baseline in my head, adding what I needed and removing what was there that wasn’t.

I informed Carson that they had the web server role installed by default and he said, “Did you open the site?”

Um, well…to be honest no I didn’t. So he pointed out that one of the core foundations of an IT guy is curiosity. If you’re not curious how can you possibly figure things out? So, I’m currently re-installing one of the nodes back to OEM, for the sole purpose of clicking on the web role to see if it has anything configured.

Thanks a lot Carson!

What to do when you receive a truly horrible request

I’m fairly certain this happens more often than not. A user fashions an email that, in their mind, makes perfect and complete sense. They blithely send this email off to you, without a care for the damage that could be wreaked upon your psyche. Today a good friend of mine received just such a letter, and it took a considerable amount of time to fashion a politically correct response. So I decided that there should be a canned response to these types of requests, and I would like to submit mine today.


Dear User,

I would like to respond to the email that you sent earlier, but I noticed that shortly after reading it, I was bleeding out of both eyes. It wasn’t until that moment that I realized that both my pen and pencil were jammed into each. So if you will pardon me I am currently unable to help you with your particular request, as I am headed to the emergency room.

Regards,

Me

Feel free to copy and paste this wherever you feel the need!