August 2013 Archives

Drawbacks of a Perl Specification

In What if Perl Had a Specification, I suggested that part of the C language's staying power came from the fact that it allows multiple implementations of a regularly revised specification. That may be good for Perl—and I'll discuss that in a future post.

Very little in life is free, however. The downsides of a specification are numerous and real.

Who Writes the Specification

The biggest drawback of a specification process for Perl is that it'll require a lot of volunteer labor, if it ever happens. The last time anything similar happened was the P6 RFC process, and, over a decade later, you can still see hints of the varied suggestions that remained after a small cohort of editors revised that specification into its current state.

Unlike a specification for an Internet standard such as a networking protocol, a mechanism to improve device accessibility in web pages, or even something as simple as allowing international characters in domain names, there's no industry lined up behind Perl with multiple multinational vendors and mutually cooperating and competing non-profit vendors already implementing parts of the specification and jockeying to have their ideas included with as few modifications as possible.

It's not even like the Java JSR process, where the Enterprisey software companies get together to design the worst possible persistence format, date and time library, or attempt to wedge dynamic programming into a static language through the use of convoluted frameworks driven by XML.

It's also not like Perl is a small language.

Standards are opinions, and if I were in charge, you'd see a language that looks the way I program, and not much that doesn't.

What Gets in and What Stays Out

That brings up the second point: whoever does the work decides what gets done.

If I were in charge, an ideal Perl specification would have strict and warnings and enforced version numbers in all files by default. That decision would instantly mean that almost all existing Perl files do not conform to the specification.
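A file conforming to that hypothetical rule might look like the following sketch. The pragmas are real Perl; the conformance requirement itself is my invention:

```perl
# Sketch only: what every file might have to declare under a
# hypothetical specification that requires strict, warnings, and
# version numbers by default.
use strict;       # forbid undeclared variables and symbolic references
use warnings;     # warn on questionable constructs
use v5.18;        # declare the minimum language version

our $VERSION = '1.00';   # a mandatory file version number

print "This file conforms.\n";
```

Almost no existing Perl file contains all of those lines, which is exactly the conformance problem.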

Someone else might remove tie and AUTOLOAD, unwittingly breaking big parts of the CPAN.

There's probably little controversy over removing features such as reset and srand and dump, but unbundling a long-standard core module affects real people and real programs.

Your small core might not be my small core. You might want one client as the bootstrap mechanism for the CPAN, while I might prefer cpanm. You might want the SysV IPC keywords available by default, while I'm fine with them as a loadable module. You might want a guarantee that the only symbols available from the core POSIX module are those that Perl can provide across all platforms, and I might disagree.

If there's a solution, it's finding an extension mechanism. Even as much as the zeitgeist on p5p and PerlMonks and elsewhere might want to suggest that "Just install it from the CPAN!" is the ultimate solution, it's still not a ubiquitous solution.

Which Parts of the Implementation are Deliberate

Because Perl has never had a single specification created around a few axioms rigorously devised for maximum flexibility from a minimal combination of primitives, it's grown organically. That metaphor is deliberate. Nature is messy and nature is red in tooth and claw and nature resists attempts to make it conform.

In other words, Perl is good at things Perl was never deliberately designed to do and Perl is capable of things no one ever thought Perl would do, and one reason why is that Perl allows you to be messy. Perl has side effects and Perl has quirks of implementation.

Any specification will have to get into the details of how, for example, polymorphic value caching works, or when you can treat a blessed reference as a regular reference in the face of overloading or tie, or exactly how cross-platform behaviors like symlinks, filesystem encoding, and threading implementations must work.
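To make the overloading case concrete, here's a small sketch (the package is invented for illustration) of the kind of corner a specification would have to pin down: a blessed hash reference whose array-dereference behavior comes from overload rather than from its underlying data:

```perl
package Sneaky;
use overload
    '@{}'    => sub { [ 'from the overload' ] },  # array-deref overload
    fallback => 1;

package main;

# A hash reference, blessed into a class that overloads @{}.
my $obj = bless { key => 'from the hash' }, 'Sneaky';

# Today the overload wins; a specification would have to codify
# exactly when an implementation may treat $obj as a plain reference.
print $obj->[0], "\n";    # prints "from the overload"
```

Every place the core itself bypasses overloading (or doesn't) becomes a clause in the document.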

A good specification ought to settle the question of reference counting of items on the Perl stack.

A great specification would draw a line around parts of the Perl XS API and codify what exactly is part of the API and what isn't and break large pieces of the CPAN until things get better. (If that disturbs you as much as it does me, good.)

Which Flaws Need Addressing

I write documentation last. If something's difficult or embarrassing to explain, I revise the code until it makes more sense. (Writing tests first helps with this, but doesn't always prevent it. Code and prose are different, and anyone who tells you otherwise should have a lollipop and step away from computers for a while.)

Even if all of the previous obstacles to creating a usable and useful specification are surmountable, arguments will arise over whether a flaw exists and whether it's ethically acceptable to enshrine it in a specification (however ephemeral this version is). These will be nasty arguments. There's a strong reluctance to mark too many things as undefined corner cases, even when they exist.

(This is especially true in the Perl world, where it took ages to get rid of Perl 5.005 threads, pseudohashes, deprecated core modules, smartmatch, and indirect notation. QED.)

On the other hand, there will be pressure to fix bugs and corner cases before the specification comes out, but that's pressure on developer and tester resources, which means a prioritization process involving volunteers, and everyone knows how that goes.

Is it Worth It

This is a huge amount of effort.

Then again, other projects in the Perl world have used up huge amounts of resources, so that shouldn't mean the plan is unworkable.

It would be expensive and burdensome and take a lot of effort from a lot of unidentified people, as well as a lot of effort from people already burdened with the maintenance of the existing code and important CPAN modules. It'll cause arguments and fights, but some of those arguments deserve resolving anyway.

The biggest drawback is that Perl 5 doesn't have a Larry; he's been filling that role for P6 for many years. Similarly, there's no Damian to provide the rigorous attention to detail bounded by practical experience teaching the language and helping users at all levels write workable code.

Is that a fatal flaw in the idea? By no means. The advantages to a specification are compelling—but it's important to be realistic about whether this might (or even could) ever happen.

What if Perl Had a Specification?

The C language is old. Really old. It's older than the parents of some of the people reading this.

(Okay, it's not that old; the English language is older and so are some of the houses in which the people reading this are sitting, but it's old in terms of technology, which evaluates technology in units of "how new is this?" and "how few people know this yet?")

C is more than four decades old, and yet C compilers are still being written and C specifications are still being written and I have a fair amount of money invested in a semiconductor company that sells to a lot of people who write C code to make phones and calculators and microwaves and DVD players and all sorts of industrial and consumer electronics devices work.

C may not be a language with the ironic mustache and candy-colored rounded corners of the latest Y Combinator-funded todo list social networking app for marmots who love mobile analytics (they pivoted halfway through me typing that sentence), but people use it to get things done, and even among the brogrammer set, you will get people admitting that "You know, I don't consider myself a real programmer, because I don't know how pointers work."

(You do find people who think that, I am not making this up, "JavaScript is the new assembly", but if there's any justice in the world, these people will eventually have to maintain their code, and that thought amuses me greatly.)

No one really thinks "I want to use the C language version 5.20" or "Wow, if C had a newer version number, I could totally justify using it for the touch-screen microcontroller!"

Then again, C has a specification. GCC and LLVM both attempt to keep up with what's in, for example, the C11 specification. So do the C standard libraries on many operating systems.

(Okay, Microsoft's a holdout here. It took Microsoft several years to get around to supporting CSS in IE, which would have had arguably more users than a C compiler, but Microsoft's too busy trying to figure out how to release a tablet computer one of these years, so I'll give them a break. Technology is hard.)

C isn't a more or less valid language for its uses because of its age. (C++ didn't replace C. D didn't replace C++, let alone C. Rust won't replace C. Objective C didn't replace C.) The market expanded and newer niches grew and some languages found ecological footholds in those new niches, but nothing has managed to replace C altogether. (If you think JavaScript will replace C, then take it from me: you're not too cool to have balloons and a clown at your 12th birthday party next year. Be in no hurry to grow up.)

C does have its warts: its type system is a series of patches on the PDP memory model, its linking and symbol visibility mechanism is baroquen, and it's funny how people like to communicate in text with things like string data and the language really doesn't support that. Oh, and its mechanisms for parallelism and concurrency mechanisms for parallelism and concurrency mechanisms for parallelism and concurrency dead lock memory corruption purple monkey dishwasher contents of /etc/shadow:.

Go won't kill that off either.

What C does have going for it is a history, a commitment to backwards compatibility, an ecological niche that's grown to span just about every operating system everywhere—even places where the only operating system you get is the bootloader you wrote yourself in 256 bytes and if you want to use printf for anything more complicated than 16-bit integers, floats, and strings, you get to write it yourself (I told you, I'm not running the timing system of my new car on an embedded web browser through a RESTful single-page Angular.js app, no matter how much you insist it's the wave of the future)—because it's useful. That's pragmatic. Also, there's a specification, and you can decide the degree to which your implementation will conform.

That's why GCC exists. That's why LLVM exists.

I'm not sure that anyone thinks C is more or less relevant because there's a K&R specification, a C89/C90 specification, and a C11 specification.

Now what about Perl?

Perl first appeared in the '80s. You can claim the current version of Perl dates from either 1994 (Perl 5.0) or 2007 (Perl 5.10).

Perl has an enormous test suite. It's almost comprehensive. It covers major, minor, and esoteric features of the language. It also has very comprehensive documentation (almost too comprehensive in places).

Yet Perl has a single implementation. It's had at various times attempts at alternate implementations—some used the test suite and some didn't—but there's no serious worthy secondary implementation.

Contrary to the six-monthly nonsense that "Perl Cannot Be Parsed", this isn't because writing a parser for Perl is difficult. (It is, but writing parsers for complex languages is difficult anyway. You're not a language designer. You polluted a global namespace with a bunch of silly little self-chaining methods and called it a DSL.) This is because implementing a Perl is difficult, and that's because no one knows exactly what you have to do to implement enough of Perl.
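The usual demonstration of the difficulty (reconstructed here from the well-known undecidability argument; the sub name is arbitrary) is that the parse of a single line can hinge on a declaration the parser may never get to see:

```perl
# How does the last line parse? It depends entirely on the declaration:
sub whatever;       # no prototype: the / below begins a regex match,
                    # so the line parses as whatever(/ 25 ; # /); die ...
# sub whatever();   # empty prototype instead: the / means division,
                    # so it parses as whatever() / 25, and everything
                    # after the # is a comment -- the die disappears

whatever  / 25 ; # / ; die "this dies!";
```

If the declaration only comes into being at runtime (via a string eval, say), no static parser can know which reading is correct—which is the real force of the argument, and still distinct from the question of what a conforming implementation must do once the parse is settled.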

... whereas everyone knows exactly what you have to do to write a JVM or a C compiler and standard library, because there are specifications.

The presence of a specification makes the task of reimplementation neither easy nor simple, but it makes the task more possible and much, much more plausible.

More than that, you can revise a specification, such that you identify the essential, experimental, and ephemeral components of a modern implementation of the language and runtime and ecosystem. People who want the old versions know where to get them, and you get to keep evolving because you're setting concrete and specific expectations.

In the Perl world, Task::Kensho has attempted to do this with a collection of recommended CPAN modules that would form an extended core of libraries the average Perl developer should have installed. In the Python world, one might fairly characterize a recent slowdown in proposed enhancements as a love letter to alternate implementations to help catch up to the 3.x series. (A specification is not in and of itself a sufficient precondition to language success. P6 has had a specification for years. Perhaps specifications are, like frameworks, best when extracted from working projects.)

This is difficult work and it's less rewarding than it sounds. It's fraught with peril. (Do you specify smartmatch? ithreads? XS? Autodereferencing? Tie?) I haven't mentioned this to p5p and the idea might be met with universal disdain or even polite ignoring.

Yet there are advantages beyond even the ability to promote a language and platform as current and vibrant—but that's the subject of another post.

Trustworthy Developer Estimates


If you're tired of people complaining that you can't make and meet commitments, try making commitments you can keep and then keeping them.

Practicality and TPF Grants

Matthew Wilson has requested a $10,000 grant to embed Perl 5 in a potential new VM for Rakudo. While this comes from Ian Hague's Perl 6 Development Grant and not TPF's general fund, the grant process is still under the purview of the Perl Foundation.

I've been thinking about this grant since its announcement. It raises several questions about the purposes of TPF grants in general. In my mind, all grant applications should answer one question:

What's the desired practical effect of funding this work?

Some grants answer that question with ease. For example, Dave Rolsky's work to write Moose documentation produced a large manual which turned Moose from an esoteric project into something people could actually use without having hacked on it. Similarly, Paul Johnson's grant to work on Devel::Cover has fixed several longstanding bugs and added quite a few desirable features. (Even though Devel::NYTProf was sponsored by the New York Times and not TPF, its success reinforces the argument that practical sponsorships and grants have a greater chance of, well, success.) Funding Nicholas Clark, Dave Mitchell, and Tony Cook to maintain Perl 5 has closed a lot of bugs, improved performance, and cleaned up a lot of code. Perl 5.18 exists at its current level of high quality due in no small part to these three and their grants.

I should add a caveat here. The suite of projects under the "Perl 6" umbrella have always been research and development. Even though that may not have been the initial intent of the project (and certainly not the intent of all of the developers who've come and gone), there's never been a serious and sustained effort to turn any of those projects into a product suitable for general use. (I can imagine howls of protest over this statement, but I'd need evidence that more than a couple of handfuls of people have ever used any of these projects as products before I'd consider that even possible, let alone a priority.)

Investing in research and development is the opposite of eating your seed corn. It may be what saves your business or project from slow and creeping irrelevance. Certainly the ubiquity and quality of the CPAN and its infrastructure has done more to sustain and expand Perl 5 than almost anything else. (Look, you don't get Moose or Mouse or Moo without the CPAN. Sure, maybe if Perl 5 hadn't borrowed Python's terrible, barely-there object system and had devised something that worked much better, Perl 5 wouldn't have needed Moose, but a lot of worse for a long time gave us a lot more better.)

Yet research and development doesn't always pay off, and you can't always judge its long term success in a short time period. Back in 2000, Perl QA didn't know Test::Builder would take off like it has now. It was just a fun little hack Schwern and I came up with. There are probably a dozen other hacks I perpetrated back in that time that never went anywhere.

Hence my question about the purpose of these grants. I see risks in Matthew's proposal:

  • What if MoarVM can't run Rakudo effectively?
  • What if the project takes an order of magnitude more time or effort than estimated?
  • What if supporting that last 5% of XS modules is difficult, and that lack renders a significant portion of the CPAN unusable?
  • What if Rakudo needs another two, five, or ten years to become a usable product?
  • What if Perl 5.20 or 5.22 makes a breaking change that the project can't support?

I'm not saying these are likely, but that they're possible. They're not the most pressing question, though:

When will people be able to take advantage of this project in practical ways?

Even if Perl 5 were usable from MoarVM today, I still think it would take years before Rakudo on MoarVM were a usable product for general purposes. Even if that's an optimistic two years instead of five, how is the group overseeing Hague grants to judge the success of the project without waiting two to five years to see if the research has paid off?

Maybe my criterion is too strict. Maybe grants shouldn't be solely for people and projects who and which have demonstrated practical applicability to a measurable quantity of users. (Maybe all of the Hague grants should be considered as funds for research and development that may never have any practical applicability.) Maybe it's sufficient that only half of the awarded grants have a practical benefit. (Maybe 3/4 should, or maybe 25% should. I don't know what's the right ratio.)

I don't know how I'd vote if I were in the TPF group overseeing this proposal. In a hypothetical world with a working P6, I see tremendous value in Perl 5 interoperability. (Goodness knows I embedded Parrot in Perl 5 and then in Ruby several years ago, just to be able to use its grammar engine.) In the real world of 2013 now, however, I can't quite convince myself that Rakudo or MoarVM (and especially Rakudo on MoarVM) are close enough to being real products that anyone can predict what's practical.

Update: A previous version of this article implied that TPF's Grant Committee oversees Hague grants. Private correspondence from a Grant Committee member corrected me that the GC does not oversee these grants.


About this Archive

This page is an archive of entries from August 2013 listed from newest to oldest.
