The unavailability of a particular character is not the only good reason to divorce typeset and handwritten realizations, either. There's also just good typesetting practice. For instance, terminal er in the Abbreviations looks more or less like 'ɛ' with one more hook coming off the bottom, if that makes sense. The closest unicode analog is Ę, LATIN CAPITAL LETTER E WITH OGONEK. But it quickly became apparent that Ę looks very awkward at the end of a word. It's a capital letter, and the eye has very specific expectations for where capital letters are to be found. So I made the unicode version of terminal er into ę, lowercase, instead. That's a little awkward to write unambiguously by hand, but it makes perfect sense when reading typewritten text. And of course this relationship of very light digraphia is already very much the norm whenever one compares typeset text and handwritten text.
One happy result of working on abba is that it's informed my understanding of the Abbreviations, too. One immediate and obvious consequence is that it will force me to very rigorously define positioning rules and such; any place where I've been allowing myself to fudge things a little when composing texts—because things will be immediately obvious from context—will be exposed, and the abba implementation of The New Abbreviations will evolve as a reference implementation of the Abbreviations themselves.
Something else that this has pointed out to me is that the unicode realization of any given abbreviation and my handwritten realization of that abbreviation need not be very similar at all. Precisely because abbreviations are objects with multiple renderings before they are unicode characters, I can happily say that 'in', for instance, can look like 'ɹ' in unicode and look like something similar, but distinct, when written by hand. And if I want to write a bitmap outputter, or something else that replicates the glyphs as they're written by hand, I can do that at a later date.
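One way to picture the object-before-character idea is a tiny data model in which each abbreviation carries several renderings and the unicode stand-in is just one of them. This is a hypothetical sketch, not abba's actual internals; the class name, fields, and `render` helper are all my own invention, and the glyph descriptions are the ones given in these notes.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Abbreviation:
    """An abbreviation is an object with multiple renderings,
    only one of which is its unicode stand-in."""
    name: str          # what the sign stands for, e.g. "in"
    unicode: str       # the typeset stand-in, e.g. "ɹ"
    handwritten: str   # prose description (or, later, a glyph bitmap)

IN = Abbreviation(name="in", unicode="ɹ",
                  handwritten="similar to ɹ, but distinct")
TERMINAL_ER = Abbreviation(name="er (terminal)", unicode="ę",
                           handwritten="ɛ with one more hook off the bottom")

def render(abbr: Abbreviation, medium: str = "unicode") -> str:
    # New media (bitmap outputters, SVG) can slot in here later.
    return abbr.unicode if medium == "unicode" else abbr.handwritten
```

Under a model like this, the unicode and handwritten realizations are free to diverge, because neither one is the abbreviation itself.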
The Abbreviations are really just a series of rules for letter substitution within any text written in the Latin script—having been optimized for English.
It's a kind of shorthand. But whereas most shorthands use a series of written symbols to approximate spoken syllables, this is an extended alphabet that comprises not only the standard 26 letters of the English alphabet but also a series of added letters that stand for common words or common sequences of letters. That's really it.
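The substitution-rule description above can be sketched as a toy transcriber: word signs replace whole common words, then sequence signs replace common letter runs. The glyph assignments here (ɹ for "in", ð for th, ȝ for gh) are the ones mentioned elsewhere in these notes; the function, the table names, and the two-pass ordering are my own assumptions, not the system's actual rules.

```python
import re

WORD_SIGNS = {"in": "ɹ"}                  # signs for whole common words
SEQUENCE_SIGNS = {"th": "ð", "gh": "ȝ"}   # signs for common letter sequences

def transcribe(text: str) -> str:
    # Word signs first, on whole-word boundaries only...
    for word, sign in WORD_SIGNS.items():
        text = re.sub(rf"\b{word}\b", sign, text)
    # ...then sequence signs anywhere they occur within words.
    for seq, sign in SEQUENCE_SIGNS.items():
        text = text.replace(seq, sign)
    return text

print(transcribe("the light in the night"))  # ðe liȝt ɹ ðe niȝt
```

Note that the word-boundary pass runs first, so the "in" inside "night" is untouched while the standalone word gets its sign.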
I need to define a roadmap. There are several things I know I need to do already; I need to figure out the best order to do them in. Off the top of my head:
On the other hand: I would argue that the primary aesthetic appeal of the system is in its function. What I find most beautiful in the system—I think—is the way the signs play together, the way the rules interlock. I find my personally produced artifacts to be visually beautiful as well, but I think that's a secondary beauty.
This is perhaps one of the central questions of this project.
\4. Following from that: what is the relationship between style and function here? That is, if Akie learns all the abbreviations and, let's say, assimilates them into his own handwriting, as it stands—
i. Is this completely feasible? The interactions between letters are much more involved than in written longhand. They therefore don't allow as much stylistic variation.
ii. Is it still the Abbreviations? Are the abbreviations the way they look, or the way they act? I'm inclined to say it's the latter—I hope it's the latter—but I also recognize that as a primarily aesthetic exercise, the way the resulting product looks is more important than it might otherwise be.
\3. How much of this system is biased towards lefties? Akie, a rightie, found some of the basic components a bit awkward to write—viz., the alpha-style a and the looped ascenders on letters like h and k.
\2. Figure out a good sentence that displays the basic characteristics of the system, something that you could use as the basis of a primer—maybe four basic lessons: first just the letter forms, then the word signs, then the segment signs (multiple characters), then the diacritics.
\1. Define the graphical tolerances of each letter. That is, what needs to be functionally or topologically true of each letter for it to work correctly within the system? Put another way, how much stylistic variation can each letter withstand, as a result of individual execution, before it can no longer interact as designed with the diacritics, additions, et cetera that make up the system?
I had a good talk with one of the Abbreviations' earliest boosters, Akie Bermiss, today. We were talking more around this new set of questions, of what I need to know in order to decide how to implement this thing for an audience. A couple of useful points of inquiry emerged:
It occurs to me that there's almost certainly a large body of implicit knowledge and rules that govern the—what's the word we're looking for here? supraliteral?—features and practice of the system.
That is, there's a big leap from the individual letters to paragraphs of the stuff. Many of the rules involved—what letters look like at the end of words, et cetera—are articulated. But I fear that there are ligatures I'm making intuitively that I'm not documenting.
Then again—maybe that's just style. If I'm enacting these various processes more or less intuitively, then I could theoretically trust any other practitioner to follow their own lights as well. Whether any given implementation is more or less cursive, or loopy, or whatever—no need to legislate that at all.
I suppose that's actually an overstatement. There's still a real value in elegance, obviously; this is a compression algorithm and so its beauty is largely derived from how efficiently it compresses.
These three types of characters—word glyphs, diacritics, and letter sequence glyphs—constitute nearly the whole system. There are only a couple of other minor additions and flourishes, e.g. the reintroduction of the medial s, ſ.
It's certainly true that the system has obeyed something of a maximalist principle, at least along one or two dimensions. That is, if a rule would be useful in saving space, character count, or time, or if it is consonant with the rest of the system and aesthetically pleasing, then it's worth adding to the set. There is no value in keeping the total number of glyphs down.
But I guess the system only began to approach the combinatorial complexity that makes it worthy of the word "system" once I started adding the diacritics. I think ~ was the first one. As I recall, I probably decided to create a character for -ant—or maybe -ent. That's the thing—why make a whole letter for -ent, -ant, etc., when you can generalize to -Vnt?
I figured the tilde would be an appropriate character for it; already in the IPA, in Portuguese, et al., it nasalizes the vowel that you write it over. So now ã means ant and ẽ means ent and so on.
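The -Vnt generalization lends itself to a combining-diacritic sketch: rather than one sign per ending, put a tilde over whichever vowel precedes "nt". This is only an illustration; the regex, the function name, and the choice to apply the rule anywhere in a word (rather than only at word endings, as the system may actually specify) are assumptions of mine.

```python
import re
import unicodedata

COMBINING_TILDE = "\u0303"  # combines with the preceding base letter

def compress_vnt(text: str) -> str:
    # vowel + "nt" becomes vowel + combining tilde: "ant" -> "ã", "ent" -> "ẽ"
    return re.sub(r"([aeiou])nt", r"\1" + COMBINING_TILDE, text)

print(unicodedata.normalize("NFC", compress_vnt("important")))  # importã
```

NFC normalization folds base letter plus combining tilde into the precomposed characters (ã, ẽ) used above, so the two spellings are canonically equivalent.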
The scribal abbreviations include both abbreviations for whole words and abbreviations for commonly-repeated sequences or case endings. So once I started immersing myself in that world, I began slowly to expand the inventory of signs in the system. Once you're already thinking of words like "&" it's not a huge stretch to stick a new symbol in for, say, "is"—I picked one of the abbreviationes for "esse", I think. Then "we", "they", and so on down the line.
And I had already been thinking about sequences like ff and ee, so it was natural enough to expand further into common letter sequences in English. English doesn't inflect nearly as much as Latin, of course, so the series of relatively fixed noun and verb endings wasn't going to supply itself—nevertheless, it turns out that English spelling is so idiosyncratic and crufty that there's quite a bit that one can compress!
th is maybe the best and most obvious example; it has already been represented as a single letter in earlier forms of English, a couple of times as it happens. So it's trivial to choose one of those letters and bring it back into the present. Make it ð, which is easier to write than its rough synonym þ. After that you might as well bring back the yogh, ȝ, for gh.
The script began as a single alphabet. I had become very dissatisfied with my handwriting—it was uneven, varying in slant, thickness, and form. So I decided to come up with a single, unified alphabet to teach myself: a set of letterforms I would practice until I wrote consistently.
Early on I decided to make the alphabet unicase—not only did I prefer the more solid appearance, but I appreciated the simplicity. I knew that word shape, as determined by ascenders and descenders, was crucial for readability, so in nearly every instance I chose for my one case the form of the letter that went above the x-height or below the baseline. I was already, I guess, being a little bit daring, or idiosyncratic; at this point I was doing things like eliminating the bottom stroke of the letter k.
After the letters, the notion of abbreviations for common digraphs, ligatures, or words came gradually. It had its origin, specifically, in the ampersand, which I had disliked writing for a very long time—'&' is hell for nearly anybody to write by hand!
I figured I might as well come up with—or choose from history—a glyph more suited to handwriting. I found it in an old but not entirely obsolete ligature. Once I integrated that into the alphabet, it was more convenient and attractive than what had come before.
From there followed some of the double letters which I found tiresome to write: ee, ff, et cetera. I went searching for ligatures that would translate well into handwriting—for historical models, as with the ampersand. In fact, I found relatively few; there aren't a lot of conventional ways of writing doubled-up letters in English. But in my reading I came upon the idea of scribal abbreviations. It was at that point that I started to include those in the system—drawing some from the medieval Latin originals, and creating new ones myself.
One area that I can make progress in is exploring what elements of the system would benefit from outside input. For instance: whether there are any glaring omissions in the feature set that somebody else notices. Another possible concern is whether there are any given letterforms that prove problematic for a lot of people.
This raises another issue: I'm a lefty and I've designed all the letterforms to flatter my tendencies. Already I've gotten feedback from one user that he has found the letterforms awkward in general; I hope the system won't prove inhospitable to all righties.
Nevertheless it would feel like a disservice if I just presented it as an artistic curiosity. Maybe that's how most eccentric outsider-artist types feel; you spend ten years building a devotional triptych in your garage out of scrap metal and you start feeling like it deserves deeper engagement than mere unattached appreciation. But there's only so much you can expect from any audience, no?
It already almost goes without saying that there's no utility to learning it, not in proportion anyway to the amount of time and thought it would demand. Nobody writes things by hand; and those who do are not quite so constrained by material lack as the scribes of earlier eras. Saving space and time in a notebook on the go is nice, but again: not enough to push anybody into actually learning the whole thing.
The only real appeal is aesthetic; it is, in my experience, a beautiful system, beautiful to use, beautiful to read, beautiful to contemplate.
One of the questions to answer here is: what is the purpose of documenting it in the first place? Or more directly—is it reasonable to expect that anybody who isn't me would
a) Have interest in learning this system, and b) Be able to learn it to the degree that I have?
It requires essentially no thought for me to write in this system. But that must be in large part because I came up with it; and because I came up with individual characters over the course of time.
I have no experience with any sort of beta testers, in other words.
Arguably the very first thing that needs to be done is to freeze the feature set, as it were, and define a version 1.0, once I'm confident it has all the word glyphs and character-combination glyphs a first pass needs.
There are several technical considerations after that. The letters need to be written by hand and scanned.
But I also need to establish a way to represent them in printed text; currently I have a large number of bitmaps that I made by editing a fixed-width pixel font. This may or may not be the best way. Obviously a full-featured typeface would be best, but that might be beyond my abilities.
Second, there's the question of how to compose the documents: whether to put together a PDF document or a navigable website.
For the last six years or so, I've worked on a complete and comprehensive system of transcription-based shorthand, which takes its inspiration primarily from the abbreviationes of the medieval scribal tradition. It's tailored primarily for writing in English, though it's usable in any language that uses the Latin character set. It comprises about 150 individual characters, though a sizeable chunk of those characters are diacritics that can combine with others to form a very flexible, dense, and aesthetically pleasing system.
I've documented this system in a slightly less-than-mature form; it's closer to feature complete now. Every so often I consider the best way to publish the system for wider consumption.