The JFDI Theory of Language Adoption


Perhaps you've seen a proposal to reboot PHP. How telling that its explanation of new features focuses on syntax issues.

I don't know if phpreboot is doomed. I do know that PHP isn't doomed, at least not because it has awful syntax (it does) or baffling feature inconsistencies (it does) or lends itself to awful code (it does).

Consider also JavaScript, a halfway decent language tied for the longest time to implementations which competed to out-awful each other, bound to the horrible DOM, and limited by syntactic inconsistencies, a mish-mash of ideas from Scheme and Self, a strange paucity of built-in data structures, and a scoping implementation so deliberately wrong you might scratch your head and wonder if its designer accidentally stumbled upon the other principles of Scheme and Self.

JavaScript's not doomed, either. (It's not the all-singing, all-dancing crud of the world, and if you liked awful desktop software written in VB and Access, you absolutely must look forward to a world where the only software is web software written in client-side pre-ECMAScript 5 JavaScript, but it's not doomed.)

Programmers and techies have a theory that the best product will win, even though we all know this is a silly theory. To start, we don't all agree on what's best. Some people like mixing code with HTML. Other people have good taste. Some people like semicolons and sigils. Other people like their code to look like a big homogeneous mishmash of run-on sentences. The degree to which this matters as a general explanation for the whole set of potential programming language adopters is slim, because aesthetics matter much, much less to techies than we want to admit. (How strange that we complain about slick marketers in suits nattering on about presentation, then argue over the placement of braces and the presence or absence of parentheses on method calls.)

The JFDI theory of language adoption is this: the primary concern of a novice adopting a programming language is the distance between zero and just getting something done.

For PHP, that's dumping a new tag into an existing HTML file you already know how to write and upload.

For JavaScript, that's copying and pasting code from View Source into an existing HTML file you already know how to write and upload.

For Perl CGI programs, that's dumping a slightly-better-than-shell script in a directory on the server where you already know how to create an HTML file.

The quality of PHP or JavaScript or any implementation is relevant to the JFDI Theory only insofar as an implementation which blocks this short onramp prevents that initial success. Any other implementation or design problems do not matter.

The semi-obvious corollary to the JFDI Theory is the Nomad Theory of Language Unadoption: the people to whom semicolons and pseudo-DSLs and sigils are Vitally Important Issues and Markers of Good Taste and Reasons Your 20 Year Old Language Family will Soon be Irrelevant will move on en masse to something else in six to eighteen months anyway, so why bother pleasing them?

In other words, the primary driver of the possibility of large-scale adoption of a new language or an existing language in a new niche is the ease with which regular people can get something done.

That's why projects like ActiveState's PPM repository—much improved over a couple of years ago—and Strawberry Perl and Perlbrew and Plack and cpanminus and hopefully Carton are important to Perl. Continuing to ease the onramp to Perl and the CPAN can help attract and retain new users.


I couldn't agree more with the JFDI principle.

I've always thought that if Perl + mod_perl + HTML::Mason (or Mason 2 now) were packaged together in a single OS package called 'perl' that 'just works' when you name a file .pl in your Apache's default directory, then it would be as successful as PHP, if not more.

Is that crazy thinking?
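For what it's worth, the Apache half of that "just works" behavior is only a few lines of mod_perl configuration. A sketch of what such a package could ship (the module path and directory block are illustrative, not a tested package; ModPerl::Registry is the stock way to run unmodified .pl scripts under mod_perl 2):

```apache
# Hypothetical httpd.conf fragment a 'perl' OS package could install:
# any .pl file in the document root runs under the embedded interpreter.
LoadModule perl_module modules/mod_perl.so
PerlModule ModPerl::Registry

<FilesMatch "\.pl$">
    SetHandler perl-script
    PerlResponseHandler ModPerl::Registry
    PerlOptions +ParseHeaders
    Options +ExecCGI
</FilesMatch>
```

Drop a .pl file next to your HTML, reload, and you have the PHP-style onramp.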

Add in a browser-based CPAN client, and that could be a winning idea.

I think one of the biggest limiting factors we experience with getting people into Perl is the difficulty of installing XS modules on machines, which pretty much requires a compiler.

And since what really makes Perl Perl is CPAN, not being able to install a CPAN module because it needs XS or depends on something that needs XS is a recurrent problem.

However, it may be interesting for somebody to take the PPM approach for non-Windows builds of Perl.

After all, people release binary distribution-agnostic copies of things all the time with mostly successful results, so there's no reason Perl can't do this.

A roadmap could look like this:

  1. Produce binary pre-built releases of Perl for every major release, for every operating system we wish to support. (Start with Linux ELF binary builds that you can just tar -xvf into a directory such as /opt and have "just work".)
  2. Have a binary pre-built repository (like PPM) of Perl modules pre-compiled for each of the above releases.

Then you've largely quashed the need to have a working compiler and make and all that sort of stuff, and quashed the need for 90% of the install toolchain.

mkdir -p /opt/perl/

cd /opt/perl/

# fetch the list of tarball URLs, then the tarballs themselves
wget http://someserver/some_tar_list
wget -i some_tar_list

# extract each downloaded tarball; tar takes one archive per invocation,
# so don't feed the whole list to a single tar -xf
for tarball in *.tar*; do tar -xf "$tarball"; done

^ and you're basically good to go.

It'd be easy enough to unify this into a sort of web-based interface that makes upgrading and maintaining this simple.

I myself would not be inclined to use such a service; I'm really happy using source-based everything.

But lots of people really love binary builds.

Ctypes for Perl would go a long way to solving the XS problem.
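Python's ctypes, which the proposed Ctypes for Perl takes as its model, shows what that buys you: binding a shared C library at runtime, with no compiler, make, or XS toolchain involved. A minimal sketch, assuming a Unix-ish system where the C math library can be located:

```python
# Bind the C math library at runtime and call cos(3) -- no compile step.
import ctypes
import ctypes.util

# find_library locates libm without hard-coding a platform-specific path
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# declare the C signature of cos() so arguments marshal correctly
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0
```

A Perl equivalent of this pattern would let pure-Perl bindings replace a whole class of XS modules, which is exactly the "no compiler required" win discussed above.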

Binary builds are troublesome in that you have to consider how you built your perl binary itself, with build options such as threading, multiplicity, IV size, and more. A better (or "less C-centric") Perl 5 API would help.

Yeah, but if perl itself is also shipped to users in binary form, and module builds are matched to those perl builds, that solves that issue.

You'll still have the occasional issue when something XS-based needs system libraries to work, I guess, and static builds are not always wonderful, but something that works might be better than something that doesn't.

Programmers and techies have a theory that the best product will win, even though we all know this is a silly theory.

Not at all so.

The best product does win – it’s just that it’s not obvious up the line which axis of fitness will matter further down.

Every oft-cited example of the apparent truism that the best may fail, like Betamax vs VHS, reveals on closer examination that “the market” judged the products by criteria the enthusiasts did not care about (if they did not openly thumb their noses at them), and found the enthusiasts’ darling wanting. But the product the market chose is always the better one according to the criteria that it decides are the ones that matter.

The example you went on to give is no different. PHP won because it’s disproportionately better along the axis which is most relevant to most “customers”.

Because products aren’t just better or worse. Almost every dis-/advantage is in fact a trade-off along some axis. A product comprises many different trade-offs, and you can never say it is just “better” or “worse” than another, often not even if you narrow down your criteria and presume a context of evaluation. And “the market” is good at finding the most rational, if often boring (for that reason!), basis for evaluation.

(This is what “worse is better” means. Worse isn’t better. But worse in particular ways may be necessary to make it better in other ways, and if those other ways are the ones that truly matter, then “worse” will be better. It’s really “better is better”.)

Of course it’s bitter to care about criteria no one else gives a flying monkey about. Painting those who disagree as irrational and short-sighted is a convenient narrative that soothes the ego, however much it may spite fact.

I'm a rampant capitalist, but I'm not so enamored of economics that I believe that all agents are rational free actors with sufficient information to choose the best product.

Besides, best is such a loaded term that I can't let you get away with a post hoc "it's whatever sold best" definition.

Read again: there is no “best” or “better” as one linear scale. That is mere narrative.

Nor am I saying the actors are rational – where did you see that? But they are self-interested. More so, they are lazy. And so the market as an aggregate mass finds a path of least resistance – which, you should know, is an optimisation problem. It achieves this optimisation independently of any actor’s particular (non-)strategy – emergently.

And nowhere did I say the criteria that the market picks are the ones that will lead to the most rationally and maybe ethically desirable choice being favoured. “You can’t derive ought from is.” As well it is prone to tragedy of the commons.

I am no believer in the free market fairy. (Just as a note on that count that I am not going to pursue into here: efficiency is often not a virtue. How is that for contrarian?)

But this chimera that the market makes random bad choices needs to die. The reasons behind the success of any product over another may not ultimately be desiderata, but reasons are there. And if you want the market to pick what you think it ought to, you have to respect it and account for its desires in your schemes.

(As an example, as an advocate of libre software I find Stallman’s recent tirades against Apple embarrassing. I’m on the same side of the debate as he is but he so completely fails to empathise with people who choose Apple that after hearing him they can only be less predisposed to hear anyone else – and even if I get past that barrier I’ll have had to disown his views first. I hate where we are going but I feel almost alone on my side of that debate in understanding why we’re going there, which gives me little hope that it can be averted.)

The beauty of a free market system (i.e. open source) is that if someone produces something that someone else pays for because they find it valuable, then they are free to do so and both parties are happy.

If the purchaser values the product more than the cash they hold, then they profit when they buy it.

But what if another product is cheaper and better? Then those may not be the factors they consider most important. What if they don't know about the other product? Then the product they are buying is clearly valuable enough to them that they purchase it.

Heterogeneity and competition are the secret sauce to the free market. In a planned economy they are labeled things like "overbuild" and "duplication", which ultimately results in products that try to suit everyone’s needs and end up suiting no one’s.

If people buy an 'overpriced' Apple device, then they value the device sufficiently to purchase it. Spectators may criticize them because the spectators' values differ significantly. That criticism lacks empathy and deprives buyers of their right to choose. The purchase may simply boil down to trust, without which why would anyone spend a dollar?

Such is open source: it is a pure free market without capitalism. If I code in Perl, Python, Ruby, C#, C, C++, COBOL, Pascal or whatever, then I am free to do so, and while I find the language sufficiently suited to my needs, I profit. What is my concern that another language is better because of xyz? My objective is to solve a problem as economically as possible. I may choose to add the cost of learning a new language; it may prove a valuable long-term investment, or not. I trust my language and understand its quirks.

Apple has mastered that 'JFDI theory': they make their products easy to use. People can feel like they are getting value quickly and then become passionate about them.

Programming languages can learn to do the same. A new programmer needs to taste the thrill of achievement and creation! Then to be seduced by it.

It kept bothering me that I made a point so simple and obvious so badly. Hundreds of words, all wasted.

So let me try this again:

The market always emergently optimises for something when choosing which product will win, even when it optimises for an undesirable thing. Products that win are always better by some set of objective criteria.


I agree in principle, but anthropomorphizing the market has two flaws. First, it's inherently reductionist. Second, it leads many people into the fallacy that markets are consistent in their optimization characteristics.

For example, PHP succeeded as it did because it was cheap and easy. That doesn't mean that a successor to PHP will succeed solely because it's cheap and easy.

I am aware, that’s why I put “‘the market’” in quotes in my first comment, and what “emergently” in my last one is trying to cover for. It’s a fair point.

As for PHP, the key is that a competitor won’t succeed – at the same game – if it’s not at least sufficiently cheap and easy. (I’ve written about what I think would beat PHP, and the quiet gasps of demand for the platform-as-a-service offers that have come to exist since then seem to confirm that theory. The problem that the PaaS offers suffer is that they lack PHP’s ubiquiy – too much service and not enough platform. Language fragmentation stands in the way, in part. If someone were to package node.js into an easily installable PaaS and to remedy node.js’s lack of libraries and to then get it to go viral (it’s always step 3 that gets you…), I’m convinced it would engulf “the market” like wildfire, considering the moods that currently blow. It’s windy…)

About this Entry

This page contains a single entry by chromatic published on October 10, 2011 10:58 AM.
