Intellisense not working via COM

I recently needed to write a class in C# that would be accessed by a VB6 application – something quite new to me.

The existing VB6 application (which is in the process of being rewritten in C#) currently uses OLE Automation to generate MSWord documents based on templates. While this approach was working fine it was pretty slow. Having made the decision that the new version of the application would be capable of generating Word 2007 documents it was decided that the resulting class should be accessible to the VB6 application via COM Interop.

The process of exposing a C# class via COM is well documented out there and only a matter of clicking a couple of project level options. To test the concept (which as already mentioned was new to me) I created a simple class in C# with a public method to create a text file.

With the class compiled I opened up VB6 (it’s been a while!) and after adding the COM class as a reference was pleased to see that I could create an instance of it as normal:

Dim engine As OpenXMLDocumentEngine.Engine
Set engine = New OpenXMLDocumentEngine.Engine

The problem arose when I attempted to call one of the COM Class methods – there was no Intellisense (at least nothing showing my test method). However, entering the method name manually and running the code resulted in the test file being created so the handles were there.

I Googled the problem and many solutions involved creating and implementing a public interface for the C# class. However, this approach did not work for me. After a lot of hunting around I found a solution that did work, and it was as simple as adding a reference to the InteropServices assembly and a class attribute setting the ClassInterfaceType to AutoDual.

using System.Runtime.InteropServices;

namespace OpenXMLDocumentEngine
{
    [ClassInterface(ClassInterfaceType.AutoDual)]
    public class Engine
    {
        // ...
    }
}

A quick recompile and behold – full method-level Intellisense in VB6.


Upgrading to Windows 7 (Dual Boot with Ubuntu)

I’ve run Vista for about a year now and on the whole have had no problems with it – frankly I’m not sure what all the fuss was about. I was happy with XP and only upgraded to Vista because my new employer was using it.

So why was I looking to upgrade to Windows 7 now? Well, there were a number of reasons but the most pressing was that Vista was starting to get on my nerves. For some reason it had started to run painfully slowly, mainly because the hard drive would be chattering away for a good 20 minutes after I had logged in – now that’s infuriating! What the hell was it doing? Well, I didn’t really have the drive to spend hours looking for the source so I had just lived with it.


When Windows 7 hit Release Candidate I thought, “What the hell”. If it all went to hell in a wheelbarrow then I could either restore my XP system with the CD that came with the laptop or install Vista from the CD I had bought for the upgrade (OEM version purchased with a new HDD).


In a previous post I explained how I configured the laptop to dual-boot Vista/Ubuntu, and when I took the decision to upgrade the Vista part of this configuration I had a few alarm bells going off in my head. Would it adversely affect my Ubuntu installation (apart from overwriting the grub bootloader)? Would I lose all my documents, music and photos?


Well, it was as good a time as any to do some housekeeping on my systems to delete old stuff and backup the rest. So after my little spring clean I downloaded the latest Release Candidate (7100), burnt the iso to a CD and kicked off the setup process.


The first part of the process was to perform a compatibility check which advised me that Skype, SQL Server 2005 and my Sony Ericsson PC Suite may have problems running on the new operating system. No big deal, I knew that Skype had a fix for this and was pretty sure that Microsoft would sort out SQL Server. As for the mobile phone software – well I don’t use it that much and if a new version was not available then it was no big deal.


The rest of the installation took about 4 hours and restarted the system a number of times – no interaction needed from me so that was good.


When the installation was complete I booted into Windows 7 and had a nose about – and although it’s early days I’m quite impressed. Impressed enough to buy it when it’s released? Well, let’s not get ahead of ourselves here :-).


Ok, so now I had a system running Windows 7 and an Ubuntu installation that I could not access – the grub bootloader had been overwritten (which I knew would happen before I started).


So, how do I reinstall grub so that I could get my dual boot back?


Well, the Ubuntu Documentation has an article covering exactly this process, which worked like a charm for me. I used the first method (with the Live CD) and within 5 minutes I was booting into Ubuntu (which, to my relief, was still there).


Tweeting my External IP Address from Ubuntu Server

In my previous post I described the problems I encountered while trying to configure my Ubuntu Server to be able to send emails via the command line (in my case it was actually via a script). The reason I wanted to do this was so that I could run the script on a scheduled basis to check my external IP address and notify me when it changed.

Why? Well, my ISP has provided me with a dynamic IP address which changes on a periodic basis – not that I normally notice. But if I want to be able to administer my Ubuntu Server from outside my local network, i.e. over the Internet via SSH, then I need to know the outward-facing IP address of my router.

I had already found a script to do this and tweaked it a little to run under a non-admin user but while it could detect the change I needed some mechanism for it to tell me. After a fruitless evening trying to set up email I gave up and decided to use the Twitter API instead.


Now there are a few downsides to using Twitter:

  • The data is on someone else’s server – i.e. Twitter’s. This is not a major problem, but what if Twitter decided to start charging for their service, or was (for some reason) forced to take it down altogether (unlikely, I know)
  • Username/Passwords are sent over the wire in plain text (at least in my implementation they are)
  • I may not want to stick around on Twitter forever (the SPAM is driving me nuts and although I have protected my updates it does defeat the point a little bit IMHO)

 Having said that – it is so damn simple it makes no sense not to have a go 🙂


The logic of the script is this:

  1. Store the current IP address in a file
  2. Resolve the external IP address using a web service
  3. Compare the value with the one stored in the IP address file
  4. If the IP addresses are different, send a Tweet

Simple eh?
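The steps above can also be sketched in Python as a minimal illustration of the logic – the notification step is stubbed out, and the file path and function names here are my own, not from the original script:

```python
import os

IPFILE = "known_ip.txt"  # hypothetical path for the stored address

def check_ip(current_ip, notify):
    """Compare current_ip with the stored address; notify and store if it changed."""
    known_ip = None
    if os.path.exists(IPFILE):
        with open(IPFILE) as f:
            known_ip = f.read().strip()
    if current_ip != known_ip:
        with open(IPFILE, "w") as f:
            f.write(current_ip)
        notify("External IP Address has changed to " + current_ip)

# Example run with print standing in for the real tweet
check_ip("203.0.113.7", notify=print)   # first run: no stored address, so it notifies
check_ip("203.0.113.7", notify=print)   # second run: unchanged, no notification
```

In the real script the `notify` step is the Twitter call and `current_ip` comes from a web lookup, but the compare-store-notify core is the same.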


Well first of all I had to get my Twitter script sorted. My previous post links to the site where I found this script but I’ll post it here for reference:

curl --basic --user "username:password" --data-ascii 
      "status=`echo $@|tr ' ' '+'`" 
          "" -o /dev/null

Note: This is all on one line and you will obviously need to specify your own Twitter username/password.


With that in place (saved as /usr/bin/twitter) I just needed my IP Checking script and here is the script I used:

IPFILE=$HOME/ipaddress.txt   # adjust the path of the stored-address file as required
CURRENT_IP=$(wget -q -O - "")

if [ -f $IPFILE ]; then
        KNOWN_IP=$(cat $IPFILE)
fi

if [ "$CURRENT_IP" != "$KNOWN_IP" ]; then
        echo $CURRENT_IP > $IPFILE
        twitter "External IP Address has changed to $CURRENT_IP"
fi

Running the following command will make the script (called checkipaddress) executable:

sudo chmod +x checkipaddress

I decided to create a new Twitter account for my Ubuntu Server and to protect its updates – no offence but I didn’t think that my external IP address was a good thing to be broadcasting to the world. As my updates are protected (I was just getting too much annoying SPAM) I’m happy that my IP address is as secure as it needs to be.


Running the checkipaddress script does what it says on the tin: it checks the IP address and Tweets me if it is different from the last time it checked, so I can now SSH into my Ubuntu Server whenever I like. Simple!


Now I didn’t come up with the checkipaddress script myself – although it’s not difficult – but I cannot remember where I copied it from. If I remember then I’ll post the link; if you know then post a comment and I’ll update the post.


Command Line EMail on Ubuntu Server Failed – Tweeting Instead

Now that my Ubuntu Server is up and running and configured for the network, I want to be able to remotely access it via SSH. That’s easy, even for me: I have OpenSSH installed and my router configured, and it works like a charm. The problem is that I don’t have a static IP address from my ISP, so periodically I will be assigned a new one – so how do I know what it is at any point in time? Answer: I don’t. After browsing around I found a script that would resolve my external IP address and email it to me. Brilliant! Everything was working up to the point where it needed to send the email.

Well I didn’t think it would be that difficult – I just wanted to be able to configure the system so that it could send me an email via the script. I found a few tutorials on the web and decided to follow one which boasted that the setup could be completed in just 5 steps. There was even a comment from someone saying ‘Thanks, it worked where others didn’t’ – well it didn’t for me. Why is it so difficult?
Once I had tweaked the script to include details for the appropriate SMTP server etc I could not get past an annoying ‘Invalid HELO message‘ error. Using the same settings in my email client proved that they were valid.
At one point I thought I had it working, i.e. the command didn’t result in a nasty error and an email duly appeared in my Inbox – success? Not quite, it appeared to come from ‘Unknown Sender’ even though I had configured a From address. I tweaked the script and was promptly returned to the ‘Invalid HELO message‘ error that had been greeting me for the previous couple of hours.
I trawled the Web for answers but the results were far too cryptic for my frame of mind, and in the end I decided that it was too close to midnight to be banging my head on the wall and called it a night.
This morning on the way into work I was thinking about alternative approaches, knowing that I would have to revisit the whole email server setup process in the future, but for now I just wanted a working solution. Then it hit me – Twitter!
I know that Twitter has a RESTful API and with that in mind I knew it could not be that difficult – at least compared to my efforts last night with mailx. As it turns out typing “command line twitter ubuntu” into Google gave me the answer – use Curl!
curl --basic --user "USERNAME:PASSWORD" --data-ascii
      "status=`echo $@|tr ' ' '+'`"
           "" -o /dev/null
This should all be on one line, with your own Twitter username and password inserted – I’ve split it here to prevent it scrolling off the page.
Now I just have to create an account for my Ubuntu Server (protecting the updates of course), install curl, create the script and I’m away. Once this is sorted I’ll be able to update the script to resolve the External IP address and configure it as an hourly cron job.
I know I will have to revisit the configuration of the email system – after all it’s not as if it’s a totally new technology. I knew that this process would not be straightforward and that I would hit obstacles like this along the way, so I’m not that surprised, although I am still frustrated.
If anyone has any input, help or advice then please leave a comment.
Update: I’ve resolved this issue now – see my next post.

Remotely Connecting to Ubuntu Server

In a previous post I managed to get my Ubuntu Server test system connected to my home network and the Internet (at least from the inside looking out). If you read the post then you will know that because my house is almost 100% wireless I needed to move the system into the hallway to be next to the router in order to physically connect.

Now although I have a very nice hallway I don’t fancy sitting in it for hours with a keyboard on my lap. I also don’t really want to be running wires around the house and as I think that the system will end up in the garage I need to be able to connect remotely, from my Ubuntu or Windows systems.
Although it’s been a while since I used any flavour of Linux (and even then my use was pretty limited) I knew that remote connections were the way to go, and remembered that OpenSSH was a good tool for the job.
After checking that it was not already installed by running:
dpkg --get-selections | grep openssh-server
I went about the installation by running:
sudo apt-get install openssh-server
followed by:
sudo /etc/init.d/ssh start
to get things going.
So far so good, I now have OpenSSH installed and running – how do I connect to it..?
Well I had already booted into Ubuntu (9.04 – Jaunty) on my laptop so it was just a matter of running: [see note at end of post] 
ssh dilbert@ 
[dilbert being the username I provided during the installation of Ubuntu Server, followed by its IP address].
I was prompted for the password for the dilbert user and I was in … but I was not quite finished.
By default OpenSSH accepts remote logins from the root user – not a good idea! So while I was connected I edited the configuration file:
sudo vi /etc/ssh/sshd_config
and set PermitRootLogin to no.
All that’s needed now is to restart OpenSSH so that the new settings take effect:
sudo /etc/init.d/ssh restart
So there it is – I now have my Ubuntu Server set up so that I can access it remotely from my laptop when I boot into Ubuntu. But what if I’ve booted into Windows? Well, I’ve heard that PuTTY is a good tool for connecting to SSH from Windows so I’ll have to check that out.
The fact that I was able to run SSH from my Ubuntu laptop means that the client must have already been installed – although I can’t remember doing it, so it must be there by default. If it’s not installed on your setup you will need to run:
sudo apt-get install openssh-client



Configuring Networking on Ubuntu Server After Installation

I recently bought an ‘old’ PC to use as a test system, running XP for my .NET development and Ubuntu Server (Jaunty 9.04) for my investigations into Linux. I have a few spare drives kicking around so having a totally separate installation of XP and Ubuntu would be a doddle.

I fitted an additional 20GB drive and for now I’m content to open the side and swap the cables until I get around to sorting out a suitable boot loader.
The installation progressed without any problems until I reached the network configuration. As the router is in the hall and my usual laptop connects via wireless, I didn’t have a network socket nearby – we only have one wired PC in the house. I opted for the ‘Setup Networking Later’ option and the process completed without any further problems. Now, as a Windows Luddite, how do I configure the network without a GUI..?
Well, the first step was to get the box connected to the router – which acts as the DHCP server for my network – i.e. move it into the hall.
After logging in the first thing to do is to configure the network card – this is done by running the following command:
sudo vi /etc/network/interfaces
which will open the file in the ‘vi’ editor.
Navigate to the end of the file and then press ‘i’ (which will put vi into Insert mode) and add the following lines:
auto eth0
iface eth0 inet dhcp
Now press the escape key to take vi out of Insert mode. So far so good but now we need to write the changes to disk and exit the editor. To write the updated file back to disk simply enter :w and press enter. Now to exit the vi editor enter :q and press enter.
So what have we just done? Well the file contains details of all network devices connected to the system and specifies how they are configured.
The first line we added auto eth0 tells the system to start this adapter, eth0, when it boots up. The next line iface eth0 inet dhcp tells the system to query DHCP for an IP address for this adapter.
Now we just need to restart the networking service to apply the new settings. To do that run the following command:
sudo /etc/init.d/networking restart
To check that all is well enter
ifconfig
and review the resulting output for details of the eth0 adapter, which should include an IP address as assigned by the router.
Once I had completed this I was able to ping the router and an external host, thus confirming that I had Internet access.

ASP.NET Date Validation

A simple requirement at first sight, but not one with a simple solution. You have an ASP.NET page which allows the user to specify a date, which is then used as an input parameter for an SQL stored procedure. What is stopping the user from entering ‘Hello World’ and submitting it?

Answer: nothing, unless you configure some sort of validation.
So what sort of validator do you use? At first sight there does not appear to be a suitable candidate, but to enforce a specific format, e.g. dd/mm/yyyy, the RegularExpressionValidator could be called into play. This will stop the user from entering data in the wrong format, but what about invalid dates such as 32/12/07 or 29/02/07? How do you make sure that the date is a real one?
This is where I was a little while ago, and I decided to opt for the CustomValidator, writing my own server-side and client-side validation functions that cast the user input to a DateTime type and catch any exceptions.
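The same try-parse idea can be sketched in Python (just an illustration of the technique, not the C# code I wrote): the parser raises an exception for impossible dates, which the function turns into a simple True/False answer.

```python
from datetime import datetime

def is_valid_date(text):
    """Return True only if text is a real date in dd/mm/yyyy format."""
    try:
        datetime.strptime(text, "%d/%m/%Y")
        return True
    except ValueError:
        return False

print(is_valid_date("29/02/2008"))   # True - 2008 was a leap year
print(is_valid_date("29/02/2007"))   # False - 2007 was not
print(is_valid_date("32/12/2007"))   # False - no 32nd of December
print(is_valid_date("Hello World"))  # False - not a date at all
```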
While this worked, I was not too happy with the operation of the code. Then I watched a podcast from dnrTV where Peter Blum demonstrated some lesser-known features of the ASP.NET validator controls. He used a CompareValidator to check for a valid date – a CompareValidator??!!
I normally associate the CompareValidator with checking one value against another, e.g. Password and Confirm Password. So how does it validate a date?
  • Simply add a CompareValidator and set its ControlToValidate property to the textbox containing the date.
  • Now locate the Operator property and select DataTypeCheck from the dropdown list.
  • Finally, set the Type property to Date and you’re done.
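Putting those three steps together, the resulting markup looks something like this (the control IDs and error message are my own illustrative choices):

```
<asp:TextBox ID="txtDate" runat="server" />
<asp:CompareValidator ID="valDate" runat="server"
    ControlToValidate="txtDate"
    Operator="DataTypeCheck"
    Type="Date"
    ErrorMessage="Please enter a valid date." />
```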
So there you have it – no need for custom code on Server or Client side.

Reading an RSS Feed with C# and Python

When I started this site I had a project in mind that would download Podcasts as they were posted and maintain the content of my MP3 player so that I didn’t have to do it myself. Well since then I have lost my iTunes virginity and while it doesn’t do everything that I wanted (like telling me that a new episode has been downloaded) it does automatically download and delete them once I’ve watched/listened to them.

But just because I don’t need to develop a complete application there is still an itch to scratch here – a few of them in fact.
  • How do I download an RSS stream – it’s not just podcasts that use them
  • How do I parse the resulting XML
  • How do I download a file and store it locally
  • and how do I do this in C# and Python
Well, this post will answer the first two questions using C# with LINQ, and Python with its XML libraries.
Using C#
First of all let’s take a look at using C# and LINQ to pull an RSS feed down and read the resulting XML, outputting the results to a console application.
Using Visual Studio 2008 create a C# Console application – it should present you with a default Program.cs file with a few namespace imports and a Main method.
Before we can start we need to import the .NET Framework LINQ to XML library. To do this, add the following line under the last using statement at the top of the file.
using System.Xml.Linq;
With that in place we can now declare an XDocument specifying the URL to the feed we want to download.
Add the following line to the Main method:
XDocument feedXML = XDocument.Load("");
That single line pretty much takes care of downloading the XML (I’ve targeted the 20-show feed from DotNetRocks); all we need to do now is parse the content.
Now I could write a simple class and fill a generic collection and then do a foreach over the collection … but this is .NET 3.5 and I have implicit types and LINQ at my disposal.
The following simple LINQ query will populate a collection of XElements:
var feeds = from feed in feedXML.Descendants("item")
            select new
            {
                Title = feed.Element("title"),
                Link = feed.Element("link"),
                Date = feed.Element("pubDate"),
                Description = feed.Element("description")
            };
Each element has Title, Link, Date and Description properties which will show up in intellisense.
A simple foreach statement over this collection, again using implicit typing, will provide access to the content:
foreach (var feed in feeds)
{
    Console.WriteLine("Title:{0}, Date:{1}", feed.Title.Value, feed.Date.Value);
}
So that’s how it can be done in a handful of lines using C#, so how about Python?
Using Python
Now I’m not going to go into project structure here (two reasons, I’m new to Python and I’m in the process of changing from Eclipse to Python Machine as my IDE of choice – more on that in another post), I’m just going to review the code required to do the job.
Just as with C# there is a library to do much of the grunt work leaving us to add a handful of lines of code. The library in question is urllib. With this imported it is just a matter of navigating to the RSS feed and reading it into a file.
This is the code I used for downloading the RSS to a local file:
import urllib
file = urllib.urlopen("")
text = file.read()
fileObj = open("feed.xml","w")
fileObj.write(text)
Line 2 creates a ‘file-like’ object that is connected to the specified network resource, in this case the RSS feed for FLOSS Weekly on the TWiT network. This object exposes a number of methods for reading the content of the resource including .read() which I use in line 3 to simply read the contents into a text object. Lines 4 and 5 simply create a filesystem object which creates a file called feed.xml and writes the contents of the text object into it. 
So now we have the contents of the RSS feed we still need to parse it – but we don’t have LINQ, how will we cope?
Well as luck would have it Python is Open Source and there are plenty of libraries out there that will provide the required functionality. I’ve chosen feedparser for this exercise – why, because it was written with this task in mind and it does exactly what it says it does.
The code required to provide a similar output to the C# example is:
import feedparser
feed = feedparser.parse('feed.xml')
print feed["channel"]["title"]

for item in feed["entries"]:
    encLocation = item["enclosures"][0]["href"]
    encLength = item["enclosures"][0]["length"]
    if encLength.isdigit():
        print encLocation, int(encLength)/(1024*1024), "MB"
    else:
        print encLocation, "N/A"
Now I could try to explain all of this but the documentation is pretty good itself, at least this will provide a starting point. Essentially the feed.xml document is opened and the channel title is printed out. After this a for loop is executed to iterate over the entries and output the required information. I’ve added a little check to ensure that a missing ‘length’ value does not throw an exception.
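If you’d rather avoid a third-party dependency, Python’s standard library can parse the same RSS structure. Here’s a minimal sketch using xml.etree.ElementTree against a small inline feed (standing in for a downloaded feed.xml; note it uses Python 3 syntax, unlike the examples above):

```python
import xml.etree.ElementTree as ET

# A tiny inline RSS document standing in for a downloaded feed.xml
rss = """<rss version="2.0">
  <channel>
    <title>Example Feed</title>
    <item><title>Episode 1</title><link>http://example.com/1</link></item>
    <item><title>Episode 2</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

root = ET.fromstring(rss)
channel = root.find("channel")
print(channel.find("title").text)

# Iterate over the items, much as the LINQ query does with Descendants("item")
for item in root.iter("item"):
    print(item.find("title").text, item.find("link").text)
```

feedparser does a lot more for you (date parsing, tolerating malformed feeds), so for real-world feeds it is still the easier choice.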
So it is clear that with a similar number of lines of code it is possible to download and parse an RSS feed from a remote computer using either C# or Python. While C# has the weight of Microsoft behind it and the functionality of LINQ, Python has the Open Source community and the functionality of … well whatever the community decides it needs.

Upgrading Ubuntu Intrepid to Jaunty

When it comes to upgrading Operating Systems I’m not known as an early adopter, I normally wait a while for others to have the headache of encountering and resolving problems. However, in a moment of madness I decided to upgrade my fully functional Ubuntu 8.10 (Intrepid Ibex) installation to 9.04 (Jaunty Jackalope).

I’ve read a few blog posts where users have upgraded and then found that their sound no longer works, or that their display crashes or won’t hit the previous resolution, so I made sure that I had a backup of my /home directory and exported a list of my installed applications.
The backup was performed using rsync and the following command:
rsync -av /home/dave "/media/FREECOM HDD/ubuntu"
Note that the quotes were required because when my external HDD connects it is given a name with a space in it!
I pinched a tip from the LifeHacker site to use a CLI command to generate a text file containing a list of my installed applications. This file could be used, again via the CLI, to reinstall the applications should the worst happen.
The command was:
sudo dpkg --get-selections "*" > current-installations.txt
This file was then copied to the USB drive containing the backup. Apparently, to reinstall from this file I would just need to run the following commands:
sudo apt-get update
dpkg --set-selections < current-installations.txt
apt-get -u dselect-upgrade
I say apparently because my upgrade went without a hitch 🙂
I opened the Update Manager and sure enough was prompted that a Distribution Upgrade was available. Before progressing with the upgrade I thought it prudent to install the handful of Updates that were listed as well.
Once the updates were installed I took a deep breath and went for it.
After downloading the Upgrade Installer I was prompted that the download would take in excess of 3 hours – not totally unexpected, but this would tie up my laptop all night, so I went to the gym instead (I’ve got a holiday coming up and I don’t want Greenpeace trying to roll me back into the sea!).
The process still had about an hour to go when I went to bed, so I expected it to be finished when I came down this morning – which it sort of was… It was waiting for me to respond to a prompt! It also indicated that there was another hour to go … great.
The prompt was to determine what I wanted to do with the grub menu.lst file. I run the laptop as a Dual-Boot with Vista on the other partition (yeah, yeah, I need it for my day job!) so as I didn’t really know what it would do to my settings I opted for the ‘leave it alone’ option.
I was prompted twice more for the same thing which was frustrating as I was getting ready for work and started to think that when I got home I would be faced with a prompt saying ‘Are you sure?’ or similar. As it happened the installation completed within about 5 minutes (not the hour it originally indicated).
Once the reboot had completed I logged in (via the new swishy login screen) and gave it a quick once-over. Everything was there (despite the installation removing numerous obsolete packages) and appeared to be working normally.
The only thing I noticed was that my Eclipse IDE failed to load my PyDev perspective when I opened it for the first time after the upgrade. I loaded it manually and it seems to be fine now.
So for me, the upgrade process was a piece of cake, if a bit lengthy.

Request format is unrecognized for URL unexpectedly ending in …

Another thing that you would think would be straightforward but turned out to be quite frustrating.

I have written Web Services in the past without any problems but recently when I was writing one specifically for Excel (2007) I could not get it to link properly.
Running the web service in VS2005 resulted in the test page we all know and love and the Invoke button worked fine! Taking the URL and pasting it into Excel resulted in a “Request format is unrecognized for URL unexpectedly ending in…” error!
It turns out that this is by design – sort of. Basically, the HTTP GET and HTTP POST protocols are disabled by default (which was not the case in .NET 1.0). Enabling them is a simple matter of adding the following to the system.web section of the web.config:
<webServices>
  <protocols>
    <add name="HttpGet"/>
    <add name="HttpPost"/>
  </protocols>
</webServices>
TaDa! it worked.
If you want some more information then see Knowledge Base article KB819267.