Should Companies Pay More For Legacy Development?
This week, I scoped out a statement of work for some legacy development: enhancements to a Visual Basic 6.0 application.
It sounds really weird to call VB6 a legacy platform. But since around 2000, the whole Microsoft platform paradigm has shifted away from COM-based development toward managed code (.NET), and the skillset of the developer community as a whole has shifted with it.
When everyone was regularly doing VB6 development, myself included, it was considered a commodity skillset and therefore brought in relatively low bill rates for consultants (compared to more cutting-edge languages like Java). That was just classic supply-and-demand economics.
That mindset still exists among my customers today. They think, "VB6 is old, therefore it should be very simple to work with." With that comes an expectation of low bill rates to perform the work. But is this necessarily true?
There's now a reverse learning curve for me on this work: I have to unlearn some .NET syntax in order to write VB6 code, and that directly cuts into my productivity. Not to mention that I primarily work in C# now. (For disclosure, I still do A LOT of VBScript development, since I have to work on classic ASP/ADO web applications for this same customer.)
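To make that reverse learning curve concrete, here is a small, hypothetical VB6 function (not from the actual engagement) showing two habits a .NET developer has to consciously switch back to: the return value is assigned to the function name rather than returned with a return statement, and errors are handled with On Error GoTo instead of structured try/catch.

' Hypothetical example: VB6 idioms that differ from C#/.NET
Public Function AddTax(ByVal amount As Currency) As Currency
    On Error GoTo ErrHandler      ' no try/catch in VB6
    AddTax = amount * 1.08        ' "return" by assigning to the function name
    Exit Function
ErrHandler:
    AddTax = 0                    ' errors fall through to a labeled handler
End Function

None of this is hard, but every one of these little context switches is part of why the work is not automatically cheap.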
And that brings me to the title question: Should companies expect to pay more for legacy development, even if the legacy system is less than a decade old?