Do you still remember the time before the Web? When servers were something that ran only on bank mainframes and data connections were mainly V.22bis?

I can still remember how easy it was to build services back then. To set up a BBS with MBBS or PCBoard, all you needed was a cheap PC and a phone line. No application servers, no load balancers, no clustered databases, no redundant power supplies, no incremental backups, no fiber ring networks... (Well, maybe the backups were there, sometimes.)

When the web boom was starting in 1995-1996, things were still easy. Nobody talked about putting web servers into rack cabinets, let alone blade servers. You just took a PC, and maybe even wrote your own HTTP server to run on it when the existing ones couldn't do the job.

Somewhere along the line, things got terribly complex. The audiences grew, and suddenly you had to serve several hundred requests per second. Setting up a new service took weeks instead of days, because you actually had to plan and think about it in advance...

Where am I going with this? Well, I have been working on a new project for several months now, and it just occurred to me that the days of simple and easy services might be permanently over. Whatever you start building nowadays, it takes months before you can publish it. Why is that?

I suppose the real reason is the demand for quality, multiplied by larger and larger audiences. One of my first UNIX programs was a simple messaging application that could send messages (nicer than plain "write") to other ttys. The user base was maybe 5 or 10 people at a time. Now the minimum seems to be a thousand times that, with scaling requirements of up to a hundredfold on top.

On the other hand, maybe the knack for building small and simple services is something you lose as you get older. It would, after all, be wise to always start with something quick and easy, then trust your refactoring abilities to make it scale when demand grows. Oh well...