
Thursday, 26 February 2015

The Enterpri[s|c]e Trap

The Beginning

In the beginning of the "computer", people were programming in 0s and 1s; really. Then some dude came up with an abstraction which some know as "assembly", which made it easier to write those 0s and 1s. Though it's a very old abstraction, I myself "developed" in assembly. I did some Windows-specific applications in the "masm" dialect. I learned assembly in school. At first assembly was like "eh?", but once you understood it, you could write it like you speak. My assembly teacher died during my school time. He died in his car... the best death if you ask me. He just fell "asleep". No other human being was hurt. He parked his car and died.

But of course writing assembly code takes quite a lot of time. It has some advantages though. First, you really know what you are doing. That sounds like a matter of course, but nowadays it isn't. It has its cons as well. It's very processor- and operating-system-specific. I myself used to write Windows applications in masm, which is _very_ specific. Rewriting them for Linux would require a completely new implementation.

People who were used to writing 0s and 1s were afraid of writing assembly, because that abstraction could only do harm. Of course.

Ritchie came up with C

Later on a scientist called Dennis MacAlistair Ritchie invented the C programming language. Together with Brian W. Kernighan he wrote a book called "The C Programming Language", and it's a "must buy" imho, if only because it's still valid today. C first appeared 43 years ago.

C was yet another abstraction, and people who had adapted to assembly now feared C, because now C was the evil one. But as time went on people adapted to C, and C was such a success that it's still one of the most used programming languages out there. A programming language that's 43 years old is still in heavy use. C influenced a lot of other programming languages: C++, Objective-C, D, Go, Rust, Java (gosh!), JavaScript (oh dear!), Limbo (nice in its own way), Perl (thank you), PHP (ok, thanks, bye), Pike (nice people!) etc...

C even turned into some kind of "target": quite a few new programming languages just "compile" down to C code. Also, most tools for Linux and Unix-like operating systems are written in C, and you'd rarely find anything in the *nix world that doesn't need a C compiler.

But when you first learn C and don't understand the underlying hardware, you'll probably find C "hard" to understand. So one might wonder why it's so widely used. I say it's evolution. Assembly programmers knew what their C programs did. They just understood. And they understood that C was helping them write efficient code while still understanding it.
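
To make that concrete, here's a tiny sketch of my own; the comment shows roughly what a typical x86-64 compiler turns it into at -O2 (exact output varies with compiler and flags):

    /* A trivial C function... */
    int add(int a, int b)
    {
        return a + b;
    }

    /* ...typically becomes something like:
     *
     *     lea eax, [rdi+rsi]   ; eax = a + b
     *     ret
     *
     * One C line, one instruction plus a return. No magic in between.
     */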

Perhaps C is the best abstraction one needs for a computer. C lets you write structured code that's not too far away from your hardware. You have full control of your hardware and still understand your code, because it's structured.
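
A minimal sketch of what I mean by "full control"; the "status register" here is a plain variable standing in for a real memory-mapped one, and the flags and values are made up for illustration:

    #include <stdint.h>
    #include <stdio.h>

    /* Pretend this is a device status register; in real embedded code
     * it would be a memory-mapped address taken from the datasheet. */
    static volatile uint32_t status_reg = 0x09; /* READY and ERROR set */

    #define FLAG_READY (1u << 0)
    #define FLAG_ERROR (1u << 3)

    int main(void)
    {
        /* Bit-level read-modify-write: you can see exactly which
         * loads, masks and stores this becomes. */
        if (status_reg & FLAG_READY)
            status_reg &= ~FLAG_ERROR;

        printf("status: 0x%02x\n", (unsigned)status_reg); /* 0x01 */
        return 0;
    }

Structured, readable, and still nothing happens that you can't point at.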

Abstraction Jungle

Still, every "new" language wants to abstract further away from the computer, which sounds fine to people who don't understand it. But once you have a very specific problem, you are left alone with the programming language and its ecosystem; and because you have no clue about your hardware and all the layers you are blindly using, you're lost. I'm not inventing this; I experienced it myself. You are working on a pile of layers you don't really understand. You will get the first 80% right and the last 20% horribly wrong. Is that what you want? Then go ahead and build on that totally fucked up pile of layers.

What I'm saying here is that most developers don't know what they are actually doing.

A simple task like counting the lines of a text file with 10+ million lines in a reasonably short time is a problem for most of the programmers I know. That's not meant as an accusation, but it still frightens me, because it _is_ a simple task.
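
Here's a minimal C sketch of that task (my own illustration, error handling kept deliberately basic): read big chunks and count '\n' bytes, the same convention wc -l uses:

    #include <stdio.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s FILE\n", argv[0]);
            return 1;
        }

        FILE *f = fopen(argv[1], "rb");
        if (!f) {
            perror("fopen");
            return 1;
        }

        static char buf[1 << 20]; /* 1 MiB read buffer */
        unsigned long long lines = 0;
        size_t n;

        /* Count '\n' bytes chunk by chunk instead of going through
         * some line-by-line abstraction. */
        while ((n = fread(buf, 1, sizeof buf, f)) > 0)
            for (size_t i = 0; i < n; i++)
                if (buf[i] == '\n')
                    lines++;

        fclose(f);
        printf("%llu\n", lines);
        return 0;
    }

A few lines of C, one pass over the file, and 10 million lines are counted in about the time it takes the disk to hand them over.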

Further abstraction means 100% trust. So why would one need a further abstraction of an abstraction that was already far away from the hardware?

Well, it's the "business" thing. Business obviously doesn't need the last 20%. They are ok with you churning out some code that delivers the 80%. Most of the time it doesn't even matter whether you understand what you are doing. You are basically just wiring several libraries together with your own logic and seeing if it works out. You are the "gluer", aka the conductor. Phew...

The Actual Enterprise Trap

And here comes the enterpri[s|c]e trap.

Let's say you need a database, because you can't write one yourself. It's a difficult task, right? So it probably makes sense to use something that's proven (I always give that advice, funny, eh?). You have some choices here. It's often the choice between MySQL, MariaDB and PostgreSQL, because people believe they're "free". I tell you they're not. By using MySQL, MariaDB or PostgreSQL you depend on them from the minute you start to use them. In most cases you won't hit severe problems, but I can tell you: at some stage you will, and then you will need support. You can probably find a solution using Google if someone has already faced your problem (which is likely), but if no one ever had your problem, you are _lost_. No, not really; you can buy "support" from one of the several companies that have built their business around MySQL, MariaDB and PostgreSQL.

Eh? What's wrong here? You started to use something you never completely understood. Most people have no clue what these databases are doing behind the layers they are using, and a lot of developers state that this is the way to go: to have abstractions whose implementation you shall not understand. You shall only know the "interface".

This leads to a strange kind of evolution. People get used to the gotchas of these layers without knowing what actually happens behind them. This also means that this knowledge can become invalid very quickly if the implementation changes (which is likely).

But people are ok with it. They get all this "enterprise" stuff for "free", basically to play around with on their own. Once they've got used to it, they use it for their next product. If that happens, it's likely they are going to sign some contract with some other company which _does_ understand the abstraction (or they are ok with the 80/20 myth).

I only talked about databases, but you can map this onto almost anything you find on the "internet": ActiveMQ, Drools, Wildfly, Tomcat, Elasticsearch, Solr, etc... Just read a bit of reddit.com/r/programming and you will be infected with all of it.

It's all there to make you believe it's going to make your "business" easier and cheaper, but in the end it's often a PITA.

Don't get me wrong: I'm not saying that being a software conductor is wrong. Conducting software is something that obviously needs to be done.

But for me it has a taste I don't really like too much. All the "enterprise" stuff is moving too fast to keep track of all the layers and the implementations underneath. I also talked a lot about hardware, but when talking about all this "enterprise" stuff, your "interface" is probably just an interface on top of hundreds of other abstractions.

You are further away from your hardware than you will ever believe because you are used to all this abstract shit.

Rescuing Myself

Please don't get me wrong on this "abstract shit". I love "abstract shit", but only if I understand its implementation. So there's really just one decision to make: be a "dumb", replaceable conductor, or be a software specialist. I decided to become the latter.

Thank you for reading.