Michael Jackson's "Structured Design", were one to explain it in algebraic terms, basically says that if one writes a batch program as the homomorphism which is the meet of the programs which homomorphically read the input files and the program(s) which homomorphically write the output file(s), not only are many design decisions forced (there is a unique solution, up to bikeshedding), but the resulting program will be largely online/deforested/streaming (pick the adjective appropriate to your decade).
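A minimal sketch of the idea (mine, not Jackson's notation): the input structure and the output structure are each written as a structure-following traversal, and their merge is a single streaming pass. The names (`records`, `summarize`) and the toy line format are illustrative assumptions.

```python
def records(lines):
    # Input structure: the file is a sequence of "key value" lines,
    # read homomorphically, i.e. one record at a time, in order.
    for line in lines:
        key, value = line.split()
        yield key, int(value)

def summarize(lines):
    # Output structure: one total per run of equal keys. Because both
    # structures are followed in step, nothing is buffered beyond the
    # current group -- the program is "online/streaming" by construction.
    total, current = 0, None
    for key, value in records(lines):
        if key != current:
            if current is not None:
                yield current, total
            current, total = key, 0
        total += value
    if current is not None:
        yield current, total
```

For example, `list(summarize(["a 1", "a 2", "b 3"]))` yields `[("a", 3), ("b", 3)]` without ever holding more than one group in memory.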
Now, I don't know if Jackson thought of his process in these terms, but having overlapped at university with CAR Hoare, I find it likely — what I do know is that he was very careful, in communicating with practitioners, to never speak in terms of algebraic abstractions.
[Update: I was wrong about algebraic antecedents to JSP; Dr. Jackson kindly not only informed me he would characterize his thinking for JSP rather as "pre-formal", but also sent along an account of the gestation of its development.]
Alex Payne's thoughts... mention a hypothesis that productive languages are conservative ones.
There is also the question of what is being conserved; it may reflect the history of each individual's exposure to particular concepts more than the history of the field as a whole. One might even argue (pace GKC) that many ideals in CS have not been tried and found wanting, but have been found difficult, and left untried. E.g., the idea of writing a program as a largely stateless nest of expressions interspersed with delimited groups of statements goes back at least to the CPL of the 1960s. Von Neumann's 1945 formal methods were actually a bit of an advance on what Floyd (who did say he was just writing down some folklore) popularized much later. Model theory may be contemporaneous with Turing, but Universal Algebra predates him, and so the example plea that "boolean operators produce only booleans" sounds to me (as someone who prefers J's model, in which boolean operations are all special cases of more general ones on boolean domains) like a call to return to the good old days of the XIXth century.
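To make the J-style model concrete (an illustration of mine, not J syntax): booleans are just the integers 0 and 1, and the "boolean" operators are restrictions of general arithmetic, which remain meaningful off the boolean subdomain.

```python
# On the subdomain {0, 1} these are AND, OR, and XOR; on the rest of
# the integers they are still perfectly good lattice/ring operations.
def conj(x, y):
    return min(x, y)      # restricted to {0,1}: logical AND

def disj(x, y):
    return max(x, y)      # restricted to {0,1}: logical OR

def xor(x, y):
    return (x + y) % 2    # restricted to {0,1}: logical XOR
```

Nothing about `min` or `max` needs to know it is being used as a boolean operator; insisting that "boolean operators produce only booleans" forgets that the general operation was there first.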
The antithesis, courtesy "Raganwald":
My thesis is that in early days, you need to select for people willing to invest in a new point of view, and having glaringly self-indulgent features–like funny syntax or a completely new model for managing asynchronicity–is helpful to keeping the concentration of early “wows” to early “meh’s” high.
On a different topic: back in the early days of compilers —when they were just starting to be practically useful, and it was yet unclear if they were even theoretically possible— there was a fair amount of work done with homomorphic compilation, with the idea that to be a homomorphism was sufficiently constraining that there couldn't be more than one, and hence if one succeeded in constructing such a compiler, it would necessarily be the one sought.
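A toy rendering of what "homomorphic compilation" means (my sketch, with an invented two-constructor expression language): the compiler is defined by structural recursion, so the code for a composite expression is composed from the code for its parts and nothing else.

```python
def compile_expr(e):
    # e is ('lit', n) or ('add', e1, e2): a tiny term algebra.
    # The code of a composition is the composition of the codes --
    # the compiler has no latitude beyond that.
    tag = e[0]
    if tag == 'lit':
        return [('push', e[1])]
    if tag == 'add':
        return compile_expr(e[1]) + compile_expr(e[2]) + [('add',)]
    raise ValueError(tag)

def run(code):
    # A stack machine supplying the target semantics.
    stack = []
    for op in code:
        if op[0] == 'push':
            stack.append(op[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack[-1]
```

Here `run(compile_expr(('add', ('lit', 2), ('lit', 3))))` evaluates to 5, and the constraint that `compile_expr` follow the structure of its argument leaves no place to hide behavior that depends on anything but the parts.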
(anyone know how to express the above para in a less esoteric language?)
It just occurred to me that a homomorphic compiler is a reasonable counter to Thompson's "trusting trust" attack, on at least two levels:
as per the reasoning above, a properly constructed homomorphic compiler shouldn't be able to insert trojans (to what degree would this extend to a Scott-continuous compiler?), as that involves a bit more latitude of action than simply composing meanings to determine the meaning of a composition.
even if one were not convinced that the compiler was fully homomorphic (and that programs were initial, etc.), as long as it was sufficiently homomorphic, one could effectively use "homomorphic encryption": by compiling a conjugated login program, and unconjugating the compiled object, it could be made arbitrarily unlikely that the trojan insertion would be triggered.
(of course, at this point one would still have to trust the loader, the h/w, etc. etc.)
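The conjugation trick can be shown in miniature (everything here is a hypothetical stand-in: `evil_compile` plays the subverted compiler, the "object code" is just text, and the alias is arbitrary). A trojan that pattern-matches on the source of `login` never sees its trigger, because we rename the identifier before compiling and rename it back afterwards.

```python
import re

def evil_compile(source):
    # Stand-in for a Thompson-style subverted compiler: it fires only
    # when it recognizes the login program in its input.
    if 'def login' in source:
        source = source.replace('def login', 'def login  # trojan inserted')
    return source  # in this toy, compilation is the identity

def conjugated_compile(source, alias='xqz991'):
    renamed = re.sub(r'\blogin\b', alias, source)       # conjugate
    obj = evil_compile(renamed)                         # compile under the alias
    return re.sub(r'\b%s\b' % alias, 'login', obj)      # unconjugate
```

Running `evil_compile('def login(): pass')` produces trojaned output, while `conjugated_compile('def login(): pass')` comes back clean; a fresh alias per build makes it arbitrarily unlikely that the trojan's pattern match is triggered, even without trusting that the compiler is fully homomorphic.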