We were having a discussion about languages, prompted by a piece of technology strategy work we are doing around Node.js, and I came away with the impression that there is a general sense that we are moving from a period when the C-family of languages had a hegemony (from the mid-nineties onwards) into a period of fragmentation and diversity.
First, I'm going to challenge that notion with a bit of history, and then see where that perspective leaves us in the "language wars" of today.
When I was a baby developer, every single engineer was reasonably proficient in at least one of the popular assembly language families - typically Motorola's 680x0 or Intel's x86, but it could be Z80, 6502, ARM or something more esoteric (IBM mainframes, anyone?).
Here's some nostalgic Z80 code. Ah, little-endian 8-bit architectures. How I miss you.
; 99 Bottles of Beer program in Zilog Z80 assembly language.
; Assembles for ZX Spectrum/Timex - change Spectrum: lines
; if porting. Help from Marko! Compiled and tested with SPIN
;
; Adapted from Alan deLespinasse's Intel 8086 version
;
; Author: Damien Guard
; email@example.com
; www.damieng.com

        org 32768

start:
        ld a, 2             ; Spectrum: channel 2 = "S" for screen
        call $1601          ; Spectrum: Select print channel using ROM
        ld c,99             ; Number of bottles to start with

loopstart:
        call printc         ; Print the number of bottles
        ld hl,line1         ; Print the rest of the first line
        call printline
        call printc         ; Print the number of bottles
        ld hl,line2_3       ; Print rest of the 2nd and 3rd lines
        call printline
        dec c               ; Take one bottle away
        call printc         ; Print the number of bottles
        ld hl,line4         ; Print the rest of the fourth line
        call printline
        ld a,c
        cp 0                ; Out of beer bottles?
        jp nz,loopstart     ; If not, loop round again
        ret                 ; Return to BASIC

printc: ; Routine to print C register as ASCII decimal
        ld a,c
        call dtoa2d         ; Split A register into D and E
        ld a,d              ; Print first digit in D
        cp '0'              ; Don't bother printing leading 0
        jr z,printc2
        rst 16              ; Spectrum: Print the character in 'A'
printc2:
        ld a,e              ; Print second digit in E
        rst 16              ; Spectrum: Print the character in 'A'
        ret

printline: ; Routine to print out a line
        ld a,(hl)           ; Get character to print
        cp '$'              ; See if it is the '$' terminator
        jp z,printend       ; We're done if it is
        rst 16              ; Spectrum: Print the character in 'A'
        inc hl              ; Move onto the next character
        jp printline        ; Loop round
printend:
        ret

dtoa2d: ; Decimal to ASCII (2 digits only), in: A, out: DE
        ld d,'0'            ; Starting from ASCII '0'
        dec d               ; Because we are inc'ing in the loop
        ld e,10             ; Want base 10 please
        and a               ; Clear carry flag
dtoa2dloop:
        inc d               ; Increase the number of tens
        sub e               ; Take away one unit of ten from A
        jr nc,dtoa2dloop    ; If A still hasn't gone negative, do another
        add a,e             ; Decreased it too much, put it back
        add a,'0'           ; Convert to ASCII
        ld e,a              ; Stick remainder in E
        ret

; Data
line1:   defb ' bottles of beer on the wall,',13,'$'
line2_3: defb ' bottles of beer,',13,'Take one down, pass it around,',13,'$'
line4:   defb ' bottles of beer on the wall.',13,13,'$'
Even then, C was starting to take a firm hold everywhere, while scientists were using FORTRAN, line-of-business devs were still knocking out COBOL (and various mainframe languages like APL and PL/I), and CompSci academics were using languages like ML, LISP and Haskell.
We lived in a world of profound language diversity, with each language specialized to a particular use case. It is often perceived that people used "the right tool for the right job" - but I think the reality was somewhat different. As I said, everyone knew a bit of assembler. You had to if you wanted to be able to debug things at the lowest level on your platforms of choice. But LOB developers knew COBOL, not ML. Scientists knew FORTRAN, not LISP. Language diversity was really programmer diversity.
So where are we today? Well, judging by the TIOBE index, the C-family languages - C, C++, Java, C# and Objective-C - account for more than half of all developer mindshare.
Given the importance of the concepts embodied in Node.js on the one hand, and the apparently insatiable industry demand for ever more elaborate web pages on the other, why might this still be so?
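Those "concepts embodied in Node.js" are, at their heart, a single-threaded event loop and non-blocking I/O: you request work, register a callback, and carry on. A minimal sketch of my own, with setTimeout standing in for a real asynchronous operation such as a file or network read:

```javascript
// Non-blocking style: ask for the work, register a callback, carry on.
console.log("request the read");

setTimeout(function () {
    // Runs later, when the event loop dispatches the completion.
    console.log("read finished");
}, 10);

// Executes immediately - nothing blocks waiting for the "read".
console.log("carry on with other work");
```

Nothing in this program blocks: the last line prints before the callback fires, which is exactly the inversion of control that trips up developers raised on synchronous C-family code.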
I think the answer is probably influenced by a risk/reward calculation, as I mentioned in the context of HTML5 a while ago. The tooling is poor; developer education is poor; the language is deceptive (it looks like C, but has much more in common with LISP and ML); the debugging experience is extremely poor (even, perhaps especially, in the world of Node.js); and although there are many third-party libraries (just look at the 28,000-odd packages on NPM), they are riddled with incompatibilities, and even the base libraries supported across all implementations are barely fit for purpose.
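To make that "deceptive" point concrete, here is a small sketch of my own: the braces and semicolons look like C, but the semantics - first-class functions and closures - come from the LISP/ML tradition.

```javascript
// Looks like C: braces, semicolons, 'return'.
// Behaves like LISP/ML: functions are values that capture their environment.
function makeCounter() {
    var count = 0;          // not a stack slot - captured by the closure below
    return function () {    // a function returned as an ordinary value
        count += 1;
        return count;
    };
}

var next = makeCounter();
console.log(next()); // 1
console.log(next()); // 2 - 'count' has outlived the call that created it
```

A developer reading this with C instincts expects `count` to be gone when `makeCounter` returns; instead it lives on inside the returned function. That mismatch between familiar-looking syntax and unfamiliar semantics is where much of the pain comes from.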
As a CTO, I certainly wouldn't bet the farm on that kind of technology at this stage if I didn't have to. Of course, it is really interesting to work with, and if people don't work with it, it will never improve, but it is (clearly) not yet ready for mainstream adoption - Mort and Elvis are not in the building. If it is really the way forward, then how is this technology going to evolve to meet the constraints of broad adoption?