under the hood

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nulla eu dui tellus. Mauris nisi enim, posuere id laoreet eget, laoreet sagittis ante. Vivamus finibus arcu id metus molestie vehicula.

Just kidding. Always good to start off a page with a fresh batch of Lorem ipsum, especially on a tech page exuberantly bubbling with cheerful technobabble. Anyway, herein is even more technical stuff – everything you always wanted to know but not necessarily today. By the way, your basic desktop browser will be easier to use for this page, what with the hashes and copy-and-paste of code and whatnot.

all your hash are belong to us

For fun, we build a unique robot for each web page with information derived from that web page. The robot, then, is a simple image representing that web page. If the robot changes, the web page must have changed. If the web page changes, the robot changes. (Of course, the website is served with HTTPS, ensuring what is rendered in your browser is really what was intended.)

Robot graphics are generated by a local copy of the most excellent RoboHash code created by Colin Davis. See this write-up about RoboHash. The RoboHash code takes a string, any string, and constructs a robot based on that string. For our robots, the string is a hash (also called a “message digest”) generated with a cryptographic hash function from preprocessed web page markup (processed with our secret brew of interlocking code gears, pulleys, and steam pistons). So, before it is hashed, the web page markup (XHTML5 in this case) is flattened, reduced, and normalized down to a block of only alpha characters. There are several benefits of, and compelling arguments for, some sort of reduction before hashing, which might be worth more paragraphs in the future; that will probably happen here, thanks to the tight binding glue of several time machine algorithms.
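
If you want to poke at the string-to-robot idea itself, the public robohash.org service (the official home of the same RoboHash code) will turn any string into a robot. Purely as an illustration, and with the caveat that our robots come from a local copy of the code and so won’t necessarily match the public service pixel for pixel, something like this fetches a robot for one of the hashes listed further down:

    # illustration only: feed a string (here, one of the page hashes listed
    # below) to the public robohash.org service and save the robot image it
    # returns (PNG by default); this site renders its robots locally instead
    curl -s -o robot.png \
        "https://robohash.org/7d6f1f8b0a7787a2b2c807f6cb76c62443e82ccf8be046e6f9525deb7b86ff29"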

So, suffice it to say, the web page hashes are spun up with our own vaguely secret prep-then-hash mechanisms and, yes, you, too, can do the same thing and generate a hash and see if it matches. After generating the web page markup in our vast, underground secret web factories, we hash it. For this work, hashing is done with SHA256. If we were only building the unique RoboHash robots, which have a small number of permutations relative to a hash’s permutations, we could get by with, say, MD5 and not worry about collisions and whatnot. However, SHA256 moves us to a more useful place, versatile enough for later verifications we might want to do.

Note that this is not a comprehensive hash of everything that results in the final rendered web page. For example, cascading style sheets (CSS) and images (PNG, SVG, JPEG) are not hashed, just the markup, and not all of the markup at that. The preprocessing of the markup is an arbitrary compression scheme that tosses out data (more on that in a moment). The tossed data is not part of the hash, so to speak, so comparing hashes isn’t a complete validation. Even so, it’s utilitarian enough at this level, weighing the practical risk of, say, just the punctuation being compromised.

Anyway, want to see the secret code? Keep reading; it’s just up ahead. Indeed, it’s not every day that such light lunch reading magically appears.

first, preprocess and then hash the web page’s markup

There are a million stories in Hash Prep City and this is just one of them. By the way, you can hash anything “as is” – it doesn’t have to be prepped. We prep to reduce and simplify what we are hashing, yet not so much that uniqueness is lost. Think of it as one variation on lossy compression. In this arbitrarily determined process, we first flatten the web page markup with the quite useful program tr(1) to ensure a predictable, reduced block of text for hashing: drop anything that is not printable (control characters and the like), then strip out whitespace, punctuation, and digits. This gives a reduced, predictable, consistent block of “enough content that counts,” minimizing the chance of spurious differences from, say, retrieval or platform text quirks. Then we pipe that reduced block of text to the program sha256(1), which outputs the SHA256 hash. Here’s the code snippet, written for ksh(1).

    # keep only printable characters, then strip whitespace, punctuation,
    # and digits, and hash the remaining block of text with SHA256
    function htmlhash
    {
        tr -cd "[:print:]" | tr -d "[:space:]" | \
        tr -d  "[:punct:]" | tr -d "[:digit:]" | sha256
    }
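
The sha256(1) program ships with the BSDs; if your system of choice doesn’t have it, a variant of the same function can lean on shasum(1) or sha256sum(1) instead. Those print a trailing filename field (“-” for standard input), so keep only the first column. A sketch, not the site’s own tooling:

    # same preprocessing, different hasher: a sketch for systems without
    # sha256(1); shasum/sha256sum print "<hash>  -" for stdin, so awk trims
    # the output down to the bare hash for easy diffing
    function htmlhash
    {
        tr -cd "[:print:]" | tr -d "[:space:]" | \
        tr -d  "[:punct:]" | tr -d "[:digit:]" | \
        shasum -a 256 | awk '{print $1}'
    }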

get the web page

Now, pull down the website code with curl(1) and pipe it to the htmlhash function.

    curl -s [pageurl] | htmlhash > somehash

then, compare hashes

Then diff(1) it with the hash listed for the page.

    diff somehash publishedhash
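
Or, skipping the temporary file, the fetch, hash, and compare can be chained into one pipeline, since diff(1) reads standard input when given - as a file name:

    # one-liner variant: hash the freshly fetched markup and diff it
    # against the published hash without a temporary file
    curl -s [pageurl] | htmlhash | diff - publishedhash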

As a practical matter for determining validity, there will be an off-line protected list of hashes to diff with hashes generated from retrieved web page stuff. In other words, obviously if the website is compromised, anything can be compromised, including the list of hashes on this page. And, of course, the robots. But this exercise using the published list suffices for trivial proof of concept and other cool buzzphrases.
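
For the curious, here is a rough sketch of what that off-line check might look like, assuming a hypothetical list file of page-name and expected-hash pairs and a hypothetical base URL (neither of which reflects how things are actually wired up here):

    # rough sketch only: walk a hypothetical "hashlist" file of lines like
    #   getting-started 6fbd0778bdc8...
    # fetch each page, run it through the htmlhash function defined above,
    # and report whether the freshly computed hash matches the expected one
    base="https://example.org"          # hypothetical base URL
    while read -r page expected; do
        got=$(curl -s "$base/$page" | htmlhash)
        if [ "$got" = "$expected" ]; then
            print "ok    $page"
        else
            print "FAIL  $page"
        fi
    done < hashlist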

Note that the program names mentioned above are written in the Unix style of the program name followed by its manual section number in parentheses, generalized as name(section). See man(1), Unix, and Unix-like.

page hashes and related spiffy robots

A motley crew, indeed. Notably, this page is the only page not listed. Wait, what? A mystery! The answer to which is left as an exercise for the reader.


creative-energy-and-hard-work
7d6f1f8b0a7787a2b2c807f6cb76c62443e82ccf8be046e6f9525deb7b86ff29


the-importance-of-art-education
61ed50b931218bd51621e2df367164e2ca4213dfb80c60b59a003630132e799f


wordle
4031fd6e8ce7a59dee8d1b18d2e241c3dafc48409246269fe6a91c596897e7f8


finishing-touches
9a72b8cfd6f671cb94eb1ed2aabdf0a20fd09639d3a3b07e3f79a0d52ed12e56


ride-the-wave
24da18c1489464f373ccc91ba4901e55c96bdc0064a9dbb90a6a5cdc45cd8293


living-in-the-digital-age
e9e41a38b2a5b5c41053c108936f99c5750e49dc6f2e84c16c71c0944aab199c


creative-juices
c46a54a10ee42929c2d486a8f47cf51c67a4508412a6de3ee84b9a4dd97dd8e2


mothers-day-gnomes
f089283aa5999fe48f641e1dcb2c216dca6ae62bafbeffa11fb2b640df1ce7db


poppies-and-sketch-crawls-and-rain-oh-my
8a4aa7cbbe1731d89ee44ae749cf31c90fe1cc4d905eeedc03a56dd8be9d695a


22nd-world-wide-sketch-crawl-in-georgetown-texas
c78a9d48d91e6ae7cbe7532ce06e29582491c2d237fdcbb217084d2d1a4db7fd


creative-energy-put-to-good-use
069d440c292315e75d378f91fe1435e3e44d7069aa5283dbadd118761c495eae


creativity-tastes-good
1ea21c8dbe2faa310c76f77ebff24d93edd4b3948a72a756e439d9f224e2ae72


of-chairs-and-scale
e021fd95cf6bf4100b2825095838beb2558dd28006616e5d299c4786963dd76e


convergence-bead-set–3
7930c4fef96cb206af85e9180809ebb16635ad6925377e7b7d2fd53dd49f5f9d


another-convergence-bead-set
58fefb69a002a51dc653da68bbea4d0bdb8e8b83900bb8530bee70785c154d77


convergence
2fa0a4f7f9541676888764eab644de1fcefe9979d7b561ae046625fa12755c03


finished-the-shrines-for-now
7b2fcbaa9b62424253753e273016683eeda27ab67ad1e2481f67cd815b1c4616


shrines-continued
5d5cd10d655575e54c64a91c15752f192f0cc86844c5933ad3841c15dba8938e


something-new
d960b09b905e0f8829d35e2fc6d3d708c356c541fa838d50f6341955d97eae38


charismacolors
cd6fdcf2b642562889ae60250a35754deab565644e75907e41e16b17f0930e4e


creative-stirrings
31f53a80e45229e2a6314f95f16cd6619acb41debab1942609693d3fb68a171b


taking-advantage-of-down-time
a4d6c4fed811d1ce5772633f085ab6095d70892e42223a80e1f28ed6322b2573


okay-so-it-worked
10f6afc528bc87f77d660a65ce997c926f25edae7452e922a67bd95975fae6cc


the-point-of-this-is
f9fe3cb83a337ff2517ccee6d37ae7b9669a7dceb7ee4eefa46ebae13c400a64


getting-started
6fbd0778bdc80bac061172dc911cd05bfdc60280a2af61016ed1752865f92c7f


index
6b8383faba27c983ee4d297f5356ea315ca797cdbdfbc5bc40f5b0b7f375b33c

typefaces

Typefaces are courtesy Google Fonts and Font Squirrel. Licenses are documented for each typeface on their respective web pages linked in the colophon.

Web font format is WOFF2 and is the only font format delivered by this site. Nothing but cutting edge here. Typeface data is embedded in CSS for performance. The CSS property font-variant-ligatures is in play. So is some kerning magic.

In theory, font rendering should only depend on your browser but, for example, in the case of Mac OS X, the OS version is also a factor; WOFF2 is not supported on Mac OS X Yosemite and lower even with the very-latest new-fangled WOFF2-capable modern browsers. Peruse browser WOFF2 support.

If your browser of choice can’t handle WOFF2, the browser will fall back to generic defaults, which in any case will be far less of a typographical experience than designed and intended, and may even cause your computing device to suddenly fold into a black hole, perhaps leaping time and space back to, say, 1874, with no electrical grid, internets, or webtubulars to be found. Upgrading is no doubt prudent in that case.

Body fallbacks are typically Georgia and serif. Heading fallbacks are usually HelveticaNeue‑CondensedBold, Arial, and sans-serif.

modular scale

We use a modular scale to calculate heading sizes. Specifically, the ratio of choice here is 1.250, otherwise known as the Major Third. H2 and H3 are the only headings used throughout, at least with this edition. Heading sizes are calculated with Jeremy Church’s magical type scale. The relative value rem is used throughout the CSS, with just a tiny part using px or em. Plenty of good articles on the web about all this and more, starting with this excellent treatise on rem and em.
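
As a worked example (the 1 rem base here is an assumption for illustration, not a measurement of this site), a Major Third ladder climbs 1 rem, then 1 × 1.25 = 1.25 rem, then 1.25 × 1.25 ≈ 1.563 rem, then 1.563 × 1.25 ≈ 1.953 rem, and the H3 and H2 sizes are simply picked off rungs of that ladder.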

your browser’s secret life

For light lunch reading, check your browser’s ideally excellent support for HTML5 and CSS3.

the source and only the source

Coming soon! All the source, served by mercurial. Yep, no plans to use git around here. Source, no doubt, will definitely include the magical pumpkin code.