Creating a SublimeText plugin to publish to WordPress

The Plan

As you may have noticed over the past few blog posts, I tend to take a lot of notes at seminars, conferences etc. At the last one I attended I took along my teeny old Samsung netbook and just used SublimeText in “zen” mode to take notes with a nice, basic, “white on black” UI.

I took my notes using Markdown syntax (when I remembered to), but the process of getting them onto my blog was a little longer than I’d like. I’ve since added the Markdown on Save plugin, which will automatically convert a WordPress post from Markdown syntax to HTML.

I thought I’d have a crack at speeding up the process by hooking SublimeText into WordPress, either letting the WordPress plugin do the Markdown conversion or perhaps doing that from SublimeText itself.

A Little Background

What’s SublimeText?

Only the most awesome text editor EVER (at the moment until I find another one I like better). It can be extended with plugins written in python, which gave me a great excuse to learn a basic chunk of a new language.

What’s Markdown?

A lightweight markup language for writing human-readable content which can be compiled into HTML.

What’s Python?

A general-purpose, multi-paradigm scripting language.

What’s WordPress?

An insanely popular free blog engine; sometimes even used as a CMS. It has an XML RPC API.

I initially wanted to use the “Markdown On Save” plugin to automatically convert my Markdown content to HTML as it gets inserted into WordPress, so decided to self-host WordPress, since hosted wordpress.com blogs don’t allow for plugins; the easiest route to do this is with Azure websites.

How I hooked this all together

  1. download SublimeText 2
  2. set up an Azure account
  3. set up a new website in Azure, using the gallery to choose Blog → WordPress
  4. use Google one helluva lot to find:
    i. the WordPress MetaWeblog API
    ii. LOADS of Python scripting help

Unfortunately, I can’t seem to get the “Markdown on Save” plugin to fire when I post remotely; I’m pretty sure there is support for this, as the documentation for the plugin mentions support for 3rd party apps and there are comments referring to xmlrpc in the code.

I couldn’t manage to work it out though, so have opted for implementing the Markdown conversion in Python within SublimeText. The library I’m using for this is the fantastic python-markdown2; I just had to copy the file from its lib/ directory into the SublimeText “plugins/user/” directory (or anywhere in “plugins”, actually) and reference it from my script.
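For anyone curious what the posting side looks like, here’s a rough sketch of the idea. It’s python 3 for brevity (SublimeText 2 actually embeds python 2, where the module is called xmlrpclib), and the blog URL, credentials and helper names are made up for illustration:

```python
# Sketch: convert buffer contents to HTML, then push to WordPress over
# XML-RPC. URL, credentials and function names below are hypothetical.
import xmlrpc.client


def build_post(title, html, tags):
    """Build the struct that metaWeblog.newPost expects.

    Note: tags go in the oddly-named 'mt_keywords' field.
    """
    return {
        "title": title,
        "description": html,
        "mt_keywords": ",".join(tags),
    }


def publish(url, user, password, title, html, tags, publish_now=True):
    # metaWeblog.newPost(blogid, username, password, struct, publish)
    server = xmlrpc.client.ServerProxy(url)
    return server.metaWeblog.newPost(
        "1", user, password, build_post(title, html, tags), publish_now
    )


if __name__ == "__main__":
    # In a SublimeText plugin you'd grab the buffer and convert it first,
    # e.g. with python-markdown2:
    #   import markdown2
    #   html = markdown2.markdown(view.substr(sublime.Region(0, view.size())))
    html = "<h1>My notes</h1>"
    # publish("http://myblog.example.com/xmlrpc.php", "me", "secret",
    #         "Seminar notes", html, ["notes", "seminar"])
```

The only WordPress-specific quirk worth remembering is that tags travel as the comma-separated `mt_keywords` string, as noted in the reference material below.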

Other random reference material

  • pyblog
  • Fiddler & Windows Live Writer: for finding out which parameter is actually used to specify “tags”: mt_keywords

TeamCity and MSBuild Quirks #2

If your build works just peachy on your own PC but bombs out on your TeamCity server with the error “Default parameter specifiers are not permitted”, head over to your build step and update the Tools Version to 4.0.

Build step tools version

This way your CI build will be targeting C# 4, the version that introduced optional parameters. And yes, I realise that the correct solution here is to not have optional parameters in your codebase; the codebase is new to me and I haven’t refactored them all out yet!


TeamCity and MSBuild Quirks #1

I’m using a postbuild event to fire off resgen and AL to create DLLs from resource files (for nice multilingual content). This event looks a bit like:

"$(WindowsSDKDir)bin\resgen.exe" "$(SolutionDir)Lang\fr.resx"
"$(WindowsSDKDir)bin\AL" /t:lib /culture:fr /embed:"$(SolutionDir)Lang\fr.resources" /out:"$(SolutionDir)bin\"

As such, when I build this locally, either from VS2010 or by running MSBuild against the csproj file, all is fine. The resx file is picked up by resgen, which creates a resources file, which is in turn picked up by AL and embedded into a dll. It appears in the build log/command line as something similar to:

  "C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\resgen.exe" "..\Language\fr.resx"
  "C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\AL" /t:lib /culture:fr /embed:"..\Language\fr.resources" /out:"..\bin\"

However, when I committed this into my branch and saw TeamCity pick it up, even though the build appeared to pass the build log shows the truth:

[PostBuildEvent] Exec
[Exec] "bin\resgen.exe" "..\Language\fr.resx"
[Exec] "bin\AL" /t:lib /culture:fr /embed:"..\Language\fr.resources" /out:"..\bin\"

Eh? Where’s the path to the Windows SDK? Maybe it’s not installed on the build server. Fine, I’ll pop off and install it. And restart. And try again. Still the same result. What’s up with that? I whacked in a <Message> tag to output the value of $(WindowsSDKDir)… and it shows the correct path!

I still don’t understand why the value of $(WindowsSDKDir) is correctly set, but when used in a post build event it appears to be empty. If anyone can help with that, please let me know.

So, to my hack: I have yet to discover the correct way to fix this, so have instead edited the MSBuild build step on the CI server to override the empty value with a specific one (i.e., the location of the Windows SDK on the build server):

TeamCity Build Step Parameter

(A little tip: for some reason I needed to remove the trailing quote from the path parameter, else it resulted in “C:\blahblah\”bin\resgen.exe” instead of “C:\blahblah\bin\resgen.exe”)
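The screenshot isn’t reproduced here, but the override boils down to passing an MSBuild property in the build step’s “command line parameters” box. A hypothetical sketch (the property name is from this post, but the path and exact quoting are examples — note the missing closing quote, per the tip above):

```shell
# Hypothetical TeamCity MSBuild build step command line parameter;
# adjust the SDK path to match your own build server.
/p:WindowsSDKDir="C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\
```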

TeamCity doesn’t approve of this, and instead suggests you use a TeamCity Build Parameter rather than an MSBuild parameter; unfortunately I couldn’t get that to work, so am sticking with the command line parameter as above. If anyone can tell me how to do this correctly, I’d appreciate it.

Check out the big brain on Brett!

Wow. Ok, so I managed to completely forget that I’d registered this WP account. No idea when I did it either. Weird.

And I’ve managed to register a matching domain name too, and completely forget about it. I am just not on the ball these days!

It’s just as well too, since my other blog seems to have been taken down in a blaze of lightning (thanks for nothing EC2).

Posts coming up – in no particular order – prepare your bookmarking finger!

  • Grading a developer
  • Lessons learned: MSBuild for n00bs (i.e., me)
  • Lessons learned: Continuous Integration using TeamCity, a VirtualBox VM, NUnit, Selenium, a private NuGet repo, SVN, git and oodles and oodles of patience. For n00bs.
  • Content templating using Nustache (mustache for .Net), a.k.a. “How to work effectively with an external design company: a case study”

Ok, so that’s a lot to commit to. I’d better get started!

Project: Hands-free (or as close as possible) DVD Backup #2

So after huge amounts of frustration, it has come down to lots of Diagnostics.Process calls with StandardError and StandardOutput redirected to files, which then get RegEx-ed to work out the contents of a dvd, rip specific streams (demux), then recombine specific streams (mux).

This, however, is proving incredibly difficult so I thought I’d share my annoyances:

1. DvdDecrypter – has proven fantastic for me in the past, so I tried to use it here to demux the dvd into streams on my HD.
Tip: make sure you change the settings to not write anything to the registry, else it will fail to close cleanly.
Pro: Great command line functionality
Con: Out of date and does not handle current dvd encryption.

2. DvdFab – apparently the best thing since Decrypter went out of date.
Pro: Command line is almost as easy as Decrypter’s
Con: Doesn’t work. It just dies on my PC, so this is a non-starter.

3. SmartRipper
Pro: Decent command line support
Con: Throws up a “could not unlock dvd” error upon opening, which means it won’t rip. You have to be playing the dvd at the time you open SmartRipper for it to get access to the contents of the disc. This is fiddly, but doesn’t totally remove it from the running (I could use “mplayer dvd:// -endpos 00:00:10” just as I kick off SmartRipper if necessary).

4. mencoder – After masses of trial and error with the various options, I’m now using this to mux the streams back into one file. Can I use it to also rip the dvd in the first place? It seems like I can, but once again – not recent dvds.
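Whatever tool ends up doing the ripping, the glue is the same pattern throughout: run a process, capture its output, and regex what you need out of it. A quick sketch of that pattern (in python rather than the C# Diagnostics.Process version, and with a made-up command and pattern standing in for the real DVD tools):

```python
# Sketch of the "run a tool, capture its output, regex it" pattern the
# post describes (the C# version uses System.Diagnostics.Process with
# RedirectStandardOutput/RedirectStandardError). The command and the
# pattern below are stand-ins, not real DVD tool invocations.
import re
import subprocess
import sys


def run_and_scrape(cmd, pattern):
    """Run a command, capture stdout+stderr, return all regex matches."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    output = result.stdout + result.stderr
    return re.findall(pattern, output)


# Stand-in for parsing title/stream listings out of a ripper's output
titles = run_and_scrape(
    [sys.executable, "-c",
     "print('Title 1: 01:32:10'); print('Title 2: 00:04:02')"],
    r"Title (\d+): (\d{2}:\d{2}:\d{2})",
)
```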

This is SO annoying! I just want to backup my dvd collection with some nice command line tomfoolery! ARGH!

Project: Hands-free (or as close as possible) DVD Backup


I’ve recently bought a 2TB LaCie LaCinema Classic HD Media HDD as the solution to my overly complex home media solution. The previous solution involved a networked Mac Mini hooked to the TV, streaming videos from an NSLU2 Linksys NAS (unslung, obviously) or my desktop in another room, using my laptop to VNC in to the Mac and control VLC.

Not exactly a solution my wife could easily use.

The LaCinema is a wonderful piece of kit; very simple interface, small but mighty remote control, is recognised as a media device on your network, can handle HD video, and pretty reasonable for the capacity and functionality. Plus it’s so easy to use I can throw the remote to the missus and she’ll be happy to use it.

Now comes the hard part: transferring a couple of hundred DVDs to the LaCinema internal HD. Ripping CDs is easy: you can configure even Windows Media Player to detect a CD being inserted, access the CDDB, create the correct folders, rip the CD, and even eject it when done.

Nothing comparable seems to exist for DVDs, which is extremely frustrating. You always need manual interaction to specify the name of the DVD you’re ripping, the streams you want to rip, the size and format of the output video file, etc.

I can’t be arsed with all that faffing around for my sprawling DVD collection, so I thought about creating a solution.

I’ve gone for a windows service with a workflow-esque model that has the following steps:

1. Detect a DVD being inserted
2. Look up the film/series name, year, genre, related images online
3. Determine which sections and streams to rip
4. Rip to local PC
5. Move to media centre
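Those five steps can be sketched as a simple pipeline; the step functions below are empty placeholders with invented names and values, not the actual service code:

```python
# Minimal sketch of the five-step workflow as a pipeline. Every step
# function is a placeholder; names and return values are invented
# for illustration, not taken from the real windows service.

def detect_disc(state):
    state["drive"] = "D:"          # placeholder: wait for a DVD insertion
    return state

def look_up_metadata(state):
    state["title"] = "Unknown"     # placeholder: query film/series info online
    return state

def choose_streams(state):
    state["streams"] = []          # placeholder: pick sections/streams to rip
    return state

def rip_to_disk(state):
    state["local_path"] = None     # placeholder: rip to the local PC
    return state

def move_to_media_centre(state):
    state["done"] = True           # placeholder: copy onto the media centre
    return state

PIPELINE = [detect_disc, look_up_metadata, choose_streams,
            rip_to_disk, move_to_media_centre]

def run(pipeline):
    """Thread a state dict through each step in order."""
    state = {}
    for step in pipeline:
        state = step(state)
    return state
```

The appeal of the workflow-esque model is that each stage can be swapped out independently, which matters given how flaky the individual ripping tools turned out to be.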

Over the next few posts I’ll go into a bit more detail on the challenges each stage posed and the solutions I came up with. I’ll post the code online and would love for some constructive feedback!

This isn’t about me making something that everyone should look at and go “oooh, he’s so clever”, it’s about having a solution for ripping a DVD library that everyone can use and tweak to suit their own requirements. As such, help is always appreciated.

Data URI scheme

The Data URI Scheme is a method of including (potentially external) data in-line in a web page or resource.

For example, the usual method of referencing an image (which is almost always separate to the page you’ve loaded) would be one of these schemes; either html:

[html]<img src="/assets/images/core/flagsprite.png" alt="flags" />[/html]

or css:

[css]background-image: url(/assets/images/core/flagsprite.png);[/css]
However, this remote image (or other resource) can be base64 encoded and included directly into the html or css using the data uri scheme:

[html]<img src="data:image/png;base64,iVBORw0KGgo…" alt="flags" />[/html]

So, if you fancy cutting down on the number of HTTP requests required to load a page whilst massively increasing the size of your css and html downloads, then why not look into the data uri scheme to actually include images in your css/html files instead of referencing them?!

Sounds crazy, but it just might work.
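The encoding step itself is trivial. A quick python sketch of building a data uri (the helper name is mine, and the payload is a made-up byte string rather than a real PNG; the post’s own proof of concept is in C#):

```python
# Toy illustration of the data uri scheme: base64-encode some bytes and
# prepend the "data:<mime>;base64," prefix. The payload here is a
# made-up byte string rather than real image data.
import base64


def to_data_uri(data: bytes, mime_type: str) -> str:
    encoded = base64.b64encode(data).decode("ascii")
    return "data:{};base64,{}".format(mime_type, encoded)


uri = to_data_uri(b"hello", "text/plain")
# uri is now "data:text/plain;base64,aGVsbG8="
```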

Using the code below you can recursively traverse a directory for css files with "url(" image references in them, download the images, encode them, and inject the encoded images back into the css files. The idea is that this little proof of concept will allow you to see the difference in HTTP requests versus full page download size between referencing multiple external resources (normal) and referencing fewer, bigger resources (data uri).

Have a play, why don’t you:

[csharp]using System;
using System.IO;
using System.Net;
using System.Text.RegularExpressions;

namespace Data_URI
{
    class Data_URI
    {
        static void Main(string[] args)
        {
            try
            {
                var rootPath = @"C:\VisualStudio2010\Projects\WebWithLoadsOfCssReferencedImages\";

                // css file specific stuff
                var cssExt = "*.css";
                // RegEx "url(....)"
                var cssPattern = @"url\(([a-zA-Z0-9_.\:/]*)\)";
                // new structure to replace "url(...)" with
                var cssReplacement = "url(data:{0};base64,{1})";

                // recursively get all files matching the extension specified
                foreach (var file in Directory.GetFiles(rootPath, cssExt, SearchOption.AllDirectories))
                {
                    Console.WriteLine(file + " injecting");

                    // read the file
                    var contents = File.ReadAllText(file);

                    // get the new content (with injected images)
                    // match css referenced images: "url(/blah/blah.jpg);"
                    var newContents = GetAssetDataURI(contents, cssPattern, cssReplacement);

                    // overwrite file if it's changed
                    if (newContents != contents)
                    {
                        File.WriteAllText(file, newContents);
                        Console.WriteLine(file + " injected");
                    }
                    else
                    {
                        Console.WriteLine(file + " no injecting required");
                    }
                }

                Console.WriteLine("** DONE **");
            }
            catch (Exception e)
            {
                Console.WriteLine(e.Message);
            }
        }

        static string GetAssetDataURI(string fileContents, string pattern, string replacement)
        {
            // pattern matching fun
            return Regex.Replace(fileContents, pattern, new MatchEvaluator(delegate(Match match)
            {
                try
                {
                    string assetUrl = match.Groups[1].ToString();

                    // check for relative paths; the site's base url was mangled
                    // out of the original post, so substitute your own here
                    if (assetUrl.IndexOf("http://") < 0)
                        assetUrl = "http://www.example.com" + assetUrl;

                    // get the image, encode, build the new css content
                    var client = new WebClient();
                    var base64Asset = Convert.ToBase64String(client.DownloadData(assetUrl));
                    var contentType = client.ResponseHeaders["content-type"];

                    return String.Format(replacement, contentType, base64Asset);
                }
                catch (Exception)
                {
                    Console.WriteLine("Error"); // usually a 404 for a badly referenced image
                    return match.Value;         // leave the original reference untouched
                }
            }));
        }
    }
}[/csharp]
The key lines are the ones in GetAssetDataURI that download the referenced resource as a byte array, encode it as base64, and generate the new css.

Comments welcomed.

