This is my story about a legacy system I wrote.

I ran across a great article the other day called The Most Remarkable Legacy System I Have Seen. It's worth a read, if you haven't read it yet! It instantly reminded me of a legacy system I wrote while working at an insurance company.

Background

I was the first full-time web developer at an insurance company, taking over maintenance and further development of some contractor-written code. The websites were run on a colocated server that we owned (it was 2004!) and were written in Active Server Pages. The code was quite a mess.

One thing there was a lot of: forms. Comment forms, feedback forms, request-a-quote forms, you name it. All they did was send emails. Lots of emails. So we had a ton of ASP code out there that would send email through an Exchange Server installed on the same machine. (There were far fewer security concerns at the time.)

The time eventually came to vacate the colocation facility, so we brought the machines back to the office on wheely chairs. We moved the actual hosting to Allstream, before they were owned by MTS, so they were still pretty good. Allstream took care of patching and stuff, so that saved us a lot of time.

One of the problems with the Allstream solution was that they didn’t have a mail server, so our IIS machine couldn’t send mail. But we had mail capabilities back at the office, and for the most part the mail just went to internal people, so we decided we’d just send mail from home. Then the problem became a transport issue. It was 2005 by now and nobody was doing much more than simple web services using SOAP, so we needed a more standard method of bringing information home.

Standardizing Messages

Every email we wanted to send had a sender and a recipient, of course. The sender was the person submitting the form and the recipient was the internal staffer who had to read the feedback, fill out the quote, whatever. Apart from that, there were a number of custom fields per form.

You have to understand that XML was the belle of the ball back then, and JSON was not quite yet A Thing People Used. So I designed an XML format (complete with XSD) that looked kinda like this:

<?xml version="1.0"?>
<message>
    <type>french-site-feedback</type>
    <timestamp>2005-01-12T13:49:25.674+00:00</timestamp>
    <fields>
        <field>
            <name>sender</name>
            <value>someone@somewhere.com</value>
        </field>
        <field>
            <name>recipient</name>
            <value>someone@ourcompany.com</value>
        </field>
        <field>
            <name>comment</name>
            <value>Hey, I never got my pink card for my new auto policy. What's up with that?</value>
        </field>
    </fields>
</message>

Building Messages

Obviously you don’t want to be writing XML in your ASP code, and besides, you want it to be consistent, so I created a COM library to go with it. Then your ASP code can look like this:

<%
Set message = Server.CreateObject("DMPS.Message")
message.SetField "sender", Request.Form("myName")
message.SetField "recipient", Application("feedbackRecipient")
message.SetField "comment", Request.Form("comment")
message.Save "french-site-feedback", Application("feedbackDirectory")
%>

This ensured a consistent XML format. The Save() function built an XML document using MSXML and saved the XML file into the configured directory. That directory was configured as explicitly writeable, unlike the rest of the site. And since user input never made it into the filename, we didn't have to worry about path injection.

Now we have a gradually filling directory of XML files, with names like french-site-feedback-20050112134925674.xml, qc-site-quote-request-20050112135001771.xml, etc.
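The original Save() was a COM object built on MSXML, but the idea translates directly. Here's a rough Python sketch of what it did; the function name, field handling, and filename scheme are my reconstruction, not the original code:

```python
# Sketch of the COM library's Save(): build the message XML with
# a DOM-style API (MSXML in the original) and write it to the
# configured drop directory under a type + timestamp filename.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone
from pathlib import Path

def save_message(msg_type: str, fields: dict, directory: str) -> Path:
    root = ET.Element("message")
    ET.SubElement(root, "type").text = msg_type
    now = datetime.now(timezone.utc)
    ET.SubElement(root, "timestamp").text = now.isoformat(timespec="milliseconds")
    fields_el = ET.SubElement(root, "fields")
    for name, value in fields.items():
        field_el = ET.SubElement(fields_el, "field")
        ET.SubElement(field_el, "name").text = name
        ET.SubElement(field_el, "value").text = value
    # The filename comes only from the message type and a timestamp,
    # never from user input -- so no path-injection worries.
    filename = f"{msg_type}-{now.strftime('%Y%m%d%H%M%S')}{now.microsecond // 1000:03d}.xml"
    path = Path(directory) / filename
    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)
    return path
```

Every form handler funnels through one function, which is what kept the XML consistent across dozens of forms.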

Transport

Transport was one of the simpler parts of the equation. We had a Scheduled Task running on the web server that would run a zip command every minute or two, doing a move operation into a .zip file that had a timestamp in its name. The .zip file was placed into a directory that was served by an SFTP server. Honestly, I forget which one it was, but back in the day it was a paid product.

On our brand-spanking-new Windows 2003 Server back home, we needed to get those files, so we had another Scheduled Task that would pull from that SFTP server and delete the files as it went. Since these two jobs were out of sync, and a server could go down at any time (but honestly never did) they had to operate as though the other side could be non-functional, and they did.

Once the .zip file(s) were downloaded, they were unzipped into a special directory.
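The sender-side task is simple enough to sketch. This Python version is illustrative only (the real thing was a zip command-line tool driven by a Scheduled Task), but it shows the essential move-not-copy semantics:

```python
# Sketch of the sender-side sweep: gather the drop directory's
# XML files into a timestamped .zip, then delete the originals --
# a move, so the next sweep starts from an empty directory.
import zipfile
from datetime import datetime
from pathlib import Path

def sweep_to_zip(drop_dir: str, outgoing_dir: str):
    files = sorted(Path(drop_dir).glob("*.xml"))
    if not files:
        return None  # nothing to ship this cycle
    stamp = datetime.now().strftime("%Y%m%d%H%M%S")
    zip_path = Path(outgoing_dir) / f"dmps-{stamp}.zip"
    with zipfile.ZipFile(zip_path, "w") as zf:
        for f in files:
            zf.write(f, arcname=f.name)
    # Delete only after the archive is fully written -- the "move" part.
    for f in files:
        f.unlink()
    return zip_path
```

Because each side only ever moves completed files, the two scheduled jobs never need to coordinate: if the puller is down, zips simply accumulate until it comes back.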

Processing

In 2005 a big technology was emerging called .NET. Imagine Java, by Microsoft. There’s more to it than that, but just know that it was a big step forward and we didn’t have to write in VBScript anymore.

I formulated a system called DMPS – the Decoupled Message Processing System. Its core was a configuration-based pipeline through which messages could flow.

The program itself was a .NET service. It was headless, read a configuration file at startup, and logged as it went. It used a FileSystemWatcher to observe an incoming\ directory for files to appear, spawned a thread for the message, then used the instructions in its configuration to determine what to do with it. Once it was done processing, it would move the file to a completed\ directory.

The instructions themselves were a list of processors, one list per message type. A processor was located by the path to its assembly (the .dll) and the name of the type. The idea was that I would write a few processors to do the most standard tasks and then reuse them. From what I recall, I started with two processors:

  1. A table loader, which would dump the message into the specified table; and
  2. An XSLT + email processor, which would apply an XSLT template to the message and then email it to the configured address.

These could be combined and even reused in the same message type. Therefore you could have one type of message get copied into a table, have one XSLT applied and emailed to an internal staffer, then have a different XSLT applied and emailed back to the sender (like a “thank you for your email” message).

If a processor failed, the rest of the pipeline was cancelled and the XML file was moved to a directory of failed messages (like a Dead Letter Queue). If you solved a backend issue and had to reprocess the messages, all you had to do was drag the files from failed\ back into incoming\ and they’d be processed again.
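The real service was .NET, with a FileSystemWatcher, a thread pool, and config-driven processor lists. This Python sketch shows only the per-file dispatch rule described above (the function shape and names are mine, not the original API):

```python
# Sketch of the DMPS dispatch rule for a single message file:
# run the message type's processors in order; on the first
# failure, stop and move the file to failed\ (the dead-letter
# directory); otherwise move it to completed\.
import shutil
import xml.etree.ElementTree as ET
from pathlib import Path

def process_file(path: Path, pipelines: dict, completed: Path, failed: Path) -> bool:
    msg = ET.parse(path)
    msg_type = msg.findtext("type")
    try:
        for processor in pipelines[msg_type]:
            processor(msg)  # each processor is a callable; raises on failure
    except Exception:
        shutil.move(str(path), str(failed / path.name))
        return False
    shutil.move(str(path), str(completed / path.name))
    return True
```

Note that reprocessing falls out for free: because a failed message is just a file in failed\, dragging it back to incoming\ replays the whole pipeline.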

Here’s an example of the configuration. Please forgive my syntax, it’s probably wrong, but this is more or less how it went:

<configuration>
    <appSettings>
        <add key="incoming" value="D:\DMPS\incoming" />
        <add key="completed" value="D:\DMPS\completed" />
        <add key="failed" value="D:\DMPS\failed" />
        <add key="threads" value="2" />
    </appSettings>
    <messages>
        <message type="french-site-feedback">
            <processors>
                <processor assembly="DMPS.Processors.dll" type="DMPS.Processors.TableLoader">
                    <add key="udl" value="C:\dmps.udl" />
                    <add key="table" value="T_Feedback" />
                </processor>
                <processor assembly="DMPS.Processors.dll" type="DMPS.Processors.XSLTEmailer">
                    <add key="template" value="D:\DMPS\templates\any-feedback.xslt" />
                    <add key="senderField" value="sender" />
                    <add key="recipientField" value="recipient" />
                </processor>
            </processors>
        </message>
    </messages>
</configuration>

This is just an example of a pipeline that logs the message to the DB and then transforms the XML and sends an email to the recipient. Pretty simple and mostly copy/paste.

Installing the service on a Windows machine was very simple. Just copy a directory with the .exe, the processor DLLs, and the .exe.config file, and then run .NET’s installutil on the server. In the pre-Docker era, this was a godsend, since this was a problem you’d normally solve with InstallShield or the emerging MSI standard.

Putting It All Together

What I designed is essentially an asynchronous RPC system. With a few calls in VBScript (or by writing the XML file yourself in another language), a job is started on a remote system. The only client-side setup is registering the COM DLL once, and you're done.

Short of an exploit in MSXML itself, the attack surface was small. Files were constructed by a process with limited permissions, saved to the only directory that process was allowed to write to, and then transferred securely (for the time, yes).

Extensibility

Insurance uses a lot of batch jobs. Files coming in, files going out. Later on, another team built a hack into an existing system to automatically calculate the rates for a quote without having that quote open in the GUI. I was able to decode the table formats (in a DB/2 database, no less), and build a processor that took the XML message and inserted records into the DB/2 database. The job would run on its own schedule, rate those quotes, and then emit some XML files of its own. Guess which format we’d use? We’d have those messages come back into DMPS and use a new processor that would use Apache FOP instead of XSLTs (to produce PDFs) and email those out.

Because I’d load the processor by assembly path and type name, you could build a processor without having the source code for DMPS itself.
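In .NET terms that was (more or less) Assembly.LoadFrom plus Activator.CreateInstance. The same late-binding trick exists in most runtimes; here's a Python analogue of the idea, with illustrative names:

```python
# Load a "processor" class by file path and type name, without
# the host program ever having compile-time knowledge of it --
# the same plugin pattern DMPS used via .NET reflection.
import importlib.util

def load_processor(module_path: str, class_name: str):
    spec = importlib.util.spec_from_file_location("processor_module", module_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return getattr(module, class_name)()  # instantiate by name
```

That's the whole extensibility story: drop a new DLL next to the service, add two attributes to the config, restart.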

Legacy

I built this system to be extensible but it lasted a lot longer and worked a lot better than I anticipated.

At one point the service ran for a year without being restarted (hey, eventually Windows Server 2003 went off support) and never exceeded 5 MB of RAM. Yeah, 5 megabytes. It didn't leak.

The company got wined and dined by some enterprise software people and bought this huge system called Sonic ESB. It used lots of its own XML. Whenever a developer wanted to send mail, they'd try to use JavaMail, which never worked, and they'd always end up dropping a DMPS message into my system to send the mail instead. Enterprise!

When my friend left the company in ~2017, the system was still in use.