GitHub Pages For The Rest Of Us

In A web of agreements and disagreements I documented one aspect of a recent wiki migration: conversion of MediaWiki’s markup lingo, wikitext, to GitHub’s lingo, GitHub Flavored Markdown. Here I’ll describe the GitHub hosting arrangement we ended up with.

There were two ways to do it. We could use GitHub’s built-in per-repository wiki, powered by an engine called Gollum. Or we could use GitHub Pages, a general-purpose web publishing system that powers (most famously) the website for the Ruby language. The engine behind GitHub Pages, Jekyll, is also often used for blogs. If you scan the list of Jekyll sites you’ll see that a great many are software developers’ personal blogs. That’s no accident. They are the folks who most appreciate the benefits of GitHub Pages, which include:

Simple markup. You write Markdown, Jekyll converts it to HTML. Nothing prevents you from mixing in HTML, or using HTML exclusively, but the simplicity of Markdown — more accurately, GitHub Flavored Markdown — is a big draw.

No database. For simple websites and blogs, a so-called dynamic system, backed by a database, can be overkill. You have to install and maintain the database which then regulates all access to your files. Why not just create and edit plain old files, either in a simple lingo like Markdown or in full-blown HTML, then squirt them through an engine that HTMLizes the Markdown (if necessary) and flows them through a site template? People call sites made this way static sites, which I think is a bit of a misnomer. It’s a mouthful but I prefer to call them dynamically generated and statically served. I’ve built a lot of web publishing systems over the years and they all work this way. If what you’re publishing is data that naturally resides in a database then of course you’ll need to feed the site from the database. But if what you’re publishing is stuff that you write, and that most naturally lives in the filesystem, why bother?
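
Here’s that idea in miniature: a minimal sketch in Python, assuming the python-markdown package and a deliberately trivial one-string template. An engine like Jekyll adds layouts, front matter, and much more, but the shape is the same:

import pathlib
import markdown  # the python-markdown package: pip install markdown

# a deliberately trivial site template; real engines use layout files
TEMPLATE = "<html><head><title>{title}</title></head><body>{body}</body></html>"

def build_site(src="pages", dest="site"):
    out = pathlib.Path(dest)
    out.mkdir(exist_ok=True)
    for page in pathlib.Path(src).glob("*.md"):
        html = markdown.markdown(page.read_text())  # HTMLize the Markdown
        (out / (page.stem + ".html")).write_text(
            TEMPLATE.format(title=page.stem, body=html))

build_site()  # regenerate pages/*.md into site/*.html

Run the script when a file changes; let any web server deliver the results. Dynamically generated, statically served.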

Version control. A GitHub Pages site is just a branch in a GitHub repository associated with some special conventions. And a GitHub repository offers many powerful affordances, including an exquisitely capable system for logging, tracking, and visualizing the edits to a set of documents made by one or more people.

Collaborative editing. When more than one person edits a site, each can make a copy of the site’s pages, edit independently of others, and then ask the site’s owner to merge in the changes.

Issue tracking. Both authors and readers of the site can use GitHub to request changes and work collaboratively to resolve those requests.

You don’t have to be a programmer to appreciate these benefits. But GitHub is a programmer-friendly place. And its tools and processes are famously complex, even for programmers. If you already use those tools on a daily basis, GitHub Pages will feel natural. Otherwise you’ll need a brain transplant.

Which is a shame, really. All sorts of people would want to take advantage of the benefits of GitHub Pages. If it were packaged very differently, and presented as GitHub Pages For The Rest of Us, they might be able to. The collaborative creation and management of sets of documents is a general problem that’s still poorly solved for the vast majority of information workers. Mechanisms for collaborative editing, version control, and issue tracking often don’t exist. When they do, they’re typically add-on features that every content management system implements in its own way. GitHub inverts that model. Collaborative editing, version control, and issue tracking are standard capabilities that provide a foundation on which many different workflows can be built. Programmers shouldn’t be the only ones able to exploit that synergy.

In this case, though, the authors of the wiki I was migrating are programmers. We use GitHub, and we know how to take advantage of the benefits of GitHub Pages. But there was still a problem. You don’t make a wiki with GitHub Pages, you make a conventional website. And while you can use GitHub Flavored Markdown to make it, the drill involves cloning your repository to a local working directory, then installing Jekyll and using it to compile your Markdown files into their HTML counterparts, which you preview and finally push to the upstream repo. It’s a programmer’s workflow. We know the drill. But just because we can work that way doesn’t mean we should. Spontaneity is one of wiki’s great strengths. See something you want to change? Just click to make the page editable and do it. The activation threshold is as low as it can possibly be, and that’s crucial for maintaining documentation. Every extra step in the process is friction that impedes the flow of edits.

So we went with the built-in wiki. It’s easy to get started: you just click the Wiki link in the sidebar of your GitHub repository and start writing. You can even choose your markup syntax from a list that includes MediaWiki and Markdown. As we went along, though, we felt increasingly constrained by the fixed layout of the built-in wiki. Wide elements like tables and preformatted blocks of text got uncomfortably squeezed. You can create a custom sidebar, but that doesn’t replace the default sidebar, which lists pages alphabetically in a way that felt intrusive. And we found ourselves using Markdown in strange ways to compensate for the inability to style the wiki.

If only you could use GitHub Pages in a more interactive way, without having to install Jekyll and then compile and push every little change. Well, it turns out that you can. Sort of.

I haven’t actually run Jekyll locally so I may be mistaken, but here’s how it looks to me. Jekyll compiles your site to a local directory which becomes the cache from which it serves up the results of its Markdown-to-HTML conversion. When you push your changes to GitHub, though, the process repeats. GitHub notices when you update the repo and runs Jekyll for you in the cloud. It compiles your Markdown to a cache that it creates and uses on your behalf.

If that’s how it works, shouldn’t you be able to edit your Markdown files directly in the repository, using GitHub’s normal interface for editing and proofing? And wouldn’t that be pretty close to the experience of editing a GitHub wiki?

Yes and yes, with some caveats. Creating new pages isn’t as convenient as in the wiki. You can’t just type in [see here](New Page) and then click the rendered link to conjure that new page into existence. Jekyll requires more ceremony. You have to manually create NewPage.md (not, evidently, “New Page.md”) in your repo. And then you have to edit NewPage.md and add something like this at the top:

---
title: New Page
layout: default
---

Since conjuring new pages by name is arguably the essence of a wiki, this clearly isn’t one. But you can create a page interactively using GitHub’s normal interface. Once that’s done, you can edit and preview NewPage.md the same way. To me the process feels more like using the built-in wiki than compiling locally with Jekyll. And it opens the door to the custom CSS and layouts that the built-in wiki precludes.
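
To give a flavor of what that opens up, here’s a hypothetical sketch of the layout that the front matter above points to. Jekyll looks for it in _layouts/default.html; it’s a Liquid template, where {{ content }} is where each page’s converted Markdown lands and {{ page.title }} comes from the front matter:

<html>
  <head><title>{{ page.title }}</title></head>
  <body>
    {{ content }}
  </body>
</html>

Add a stylesheet link to that one file and every page on the site picks it up.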

There are, alas, still more caveats. You can’t always believe the preview. Some things that look right in preview are wrong in the final rendering. And that final rendering isn’t immediate. Changes take more or less time to show up, depending (I suppose) on how busy GitHub’s cloud-based Jekyll service happens to be. So this is far from a perfect solution. If you only need something a bit more robust than your repository’s README.md file, then the built-in wiki is fine. If you’re creating a major site like ruby-lang.org then you’ll want to install and run Jekyll locally. Between those extremes, though, there’s a middle ground. You can use GitHub Pages in a cloud-based way that delivers a wiki-like editing experience along with the ability to use custom CSS and layouts.

I don’t think this particular patch of middle ground will appeal widely. Maybe a hypothetical GitHub Pages For The Rest Of Us will. Or maybe Ward Cunningham’s Smallest Federated Wiki (see also http://hapgood.us/tag/federated-wiki/) will. In any case, the ideas and methods that enable software developers to work together online are ones that everyone will want to learn and apply. The more paths to understanding and mastery, the better.

Voyage of the Captain Kirk Floating Arms Keyboard Chair

When we moved last month we let go of a great many things in order to compress our household and Luann’s studio into a set of ABF U-Pack containers. At one point we planned to shed all our (mostly second-hand) furniture, figuring it’d be cheaper to replace than to ship cross-country. But since Luann had acquired all that furniture, it was much harder for her to let go of it than it was for me. So to put a bit more of my own skin into the game I sacrificed my beloved Captain Kirk chair with Floating Arms keyboard.

The idea was to preserve the essential one-of-a-kind keyboard and replace the commodity chair. Which was foolish: Bodybilt chairs don’t come cheap. But I was in the grip of an obsession to lighten our load, and there was no time left to sell it, so off to the curb it went. My friend John Washer and I immortalized the moment.

Then, happily, fate intervened. First another friend, George Ponzini, sensibly picked up the chair and took it home. Then we decided to use our reserve fourth U-Pack container to bring a sofa, some living room chairs, and other stuff we thought we’d leave behind. Now there was room for the Captain Kirk chair to come along on our voyage. George kindly brought it back, I packed it, and off to California we went.

Weeks later we unpacked our household containers in our rented home in Santa Rosa. When I set the chair down in my office, the hydraulic lifter broke. Not a disaster, I could live with it at the lowest setting until I could replace the lifter. But then, as we emptied box after box, I began to worry. The Floating Arms keyboard wasn’t showing up. Disaster!

Then, finally, it turned up. Joy!

But when I tried to reattach it to the chair, two crucial parts — the rods that connect the arms of the chair to the custom keyboard — were inexplicably missing. Disaster!

Eventually it dawned on me. This wasn’t the Floating Arms keyboard I’d been using for the past 15 years. It was the original prototype that I’d reviewed for BYTE, and that Workplace Designs had replaced with the production model. I’d had a backup Floating Arms keyboard all this time, forgotten in a box up in the attic. So now I could recreate my setup. I just needed to replace the connector rods and the hydraulic lifter. Joy! Maybe! If those parts were still available!

I called The Human Solution and spoke to the very friendly and helpful Jonah Gardner. He took down the serial number on the chair, asked for photos of the broken hydraulic lifter and the arms into which the missing connector rods needed to fit, and promised to get back to me.

The next day the missing Floating Arms keyboard turned up in the bottom of a bag of shoes. More joy! I hooked it up to my broken-but-still-functional chair and got to work. The first order of business was to contact Jonah and let him know I didn’t need those connector rods; they were attached to the missing-but-now-found keyboard. “You’re lucky,” he said. “We couldn’t have replaced those. But the lifter is still available for your chair, and you can order it.” So I did.

The lifter arrived today. It wasn’t immediately obvious how to extract the old one in order to replace it. There were no fasteners. Do you just need to pound on it with a sledgehammer? I wrote to Jonah and he responded with this video and these instructions:

Someone will need to use a 3-4 pound, short handle, steel-head sledge hammer. Timidity will not get the old cylinder out so do not be afraid to HIT the mechanism. After 20 some-odd years, they are going to have to HIT the mechanism.

That’s just what I needed to know. And he wasn’t kidding about the weight of the hammer. I didn’t have a sledgehammer handy, and a regular hammer didn’t work, so I improvised:

And that did the trick. I HIT the cylinder a bunch of times, it popped out, I popped the new one in, and I’m back in business.

Thank you, Workplace Designs, for inventing the best ergonomic keyboard ever. Thank you, Jonah, for helping me bring it back to life. Thank you, World Wide Web, for enabling Bodybilt to share a video showing exactly how hard to HIT when replacing a hydraulic lifter. And thank you, Captain Kirk Floating Arms Keyboard Chair, for being with me all these years. I’m sorry I threatened to abandon you. It’ll never happen again.

A web of agreements and disagreements

Recently I migrated a wiki from one platform to another. It was complicated in a couple of ways. The first wrinkle was hosting. The old wiki ran on a Linux-based virtual machine and the new one runs on GitHub. The second wrinkle was markup. MediaWiki uses one flavor of lightweight markup and GitHub uses (a variant of) another.

The process was confusing even for me. But logistics aside, it raised questions about standards, interoperability, and the challenge of working in an evolving digital realm.

The wiki in question is the documentation for the Thali project which I’ve mentioned in a number of posts. The project is mainly documented by Thali’s creator, Yaron Goland. Why use a wiki? Thali is a fast-moving project. Yaron has a blog, and he could use that to document Thali. But while blogs are agile publishing tools, they don’t shine when it comes to restructuring and spontaneous editing. Those are the great strengths of wikis.

Thali was originally hosted on CodePlex. Since that service doesn’t offer a built-in wiki, Yaron augmented it with a Bitnami MediaWiki image hosted in Azure. This was a DIY setup, not a managed service, which meant that when the Heartbleed Bug showed up he had to patch it himself, and he would have been on the hook again when Shellshock arrived. Life’s too short for that.

Also, with the project’s source code hosted on GitHub, it made sense to explore hosting the documentation there too. It’s simpler for readers of the code and the documentation to find everything in one place. And it’s simpler for writers of both forms of text to put everything in that place. There’s just one service to authenticate to, and tools for version control and issue tracking can be used for both forms of text.

I started by moving a few experimental pages from the MediaWiki to the GitHub wiki. Were there tools that could automate the translation? Maybe, but I’ve learned to walk before attempting to run. Converting a few pages by hand gave me an appreciation of the differences between the two markup languages. Each is a de facto standard with many derived variations. GitHub, for example, uses a variant of Markdown called GitHub Flavored Markdown (GFM). Tools that read and write “standard” Markdown don’t properly read and write GFM.
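
One concrete example of the divergence: GFM adds tables and strikethrough, which the original Markdown lacks. On GitHub this renders as a small table:

| markup   | tables? | strikethrough? |
|----------|---------|----------------|
| Markdown | no      | no             |
| GFM      | yes     | yes            |

A classic Markdown converter passes the pipes and hyphens through as literal text.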

If I were teaching a course in advanced web literacy, I’d pose the following homework exercise:

You’re required to migrate a wiki from MediaWiki to GitHub. Possible strategies include:

  1. Use a tool that does the translation automatically.
  2. Create that tool if it doesn’t exist.
  3. Do the job manually.

Evaluate these options.

Of course there are assumptions buried in the problem statement. A web-literate student should first ask: “Why? Are we just chasing a fad? What problems will this migration solve? What problems will it create?”

Assuming we agree it makes sense, I’d like to see responses that:

  • Enumerate available translators.
  • Cite credible evaluations of them (and explain why they’re credible).
  • Analyze the source and target data to find out which markup features might or might not be supported by the available translators.
  • Consider the translators’ implementation costs. Are they local or cloud-based? If local, how much infrastructure must be installed, and how complex are its dependencies? If cloud-based, how will bulk operations work?
  • If no translators emerge, make a back-of-the-envelope estimate of the distance between the two formats and the effort required to create software to map between them.
  • Evaluate the time and effort required to research, acquire, and use an automated tool, vis-à-vis that required to do the job manually.
  • Estimate the break-even point at which a reusable automated tool pays off.
  • Recognize that there really isn’t a manual option. Doing the job “by hand” in a text editor means using a tool that enables a degree of automation.

In my case that last point proved salient. The tools landscape looked messy, there were only a few dozen pages to move over, the distance between the two markups wasn’t great, it was (for me) a one-time thing, and I wanted to make an editorial pass through the stuff anyway. So I wound up using a text editor. To bridge one gap between the two formats — different syntaxes for hyperlinks — I recorded a macro to convert one to the other.

To achieve this result in MediaWiki:

all about frogs

You type this:

[[Frog|all about frogs]]

In a GitHub wiki it’s this:

[all about frogs](Frog)

So much writing nowadays happens in browsers, never mind word processors, never mind old-school text editors, that it’s worth pointing out those old dogs can do some cool tricks. I won’t even mention which editor I use because people get religious about this stuff. Suffice it to say that it’s one of a class of tools that make it easy to record, and then play back, a sequence of actions like this:

  • Search for [[
  • Delete both brackets
  • Select up to the | and cut the page name (Frog)
  • Delete the |
  • Type [
  • Search for ]]
  • Change it to ](
  • Paste the page name
  • Type )

You might find an automated translator that encodes that same recipe. You might be able to write code to implement it. But for a large class of textual transformations like this you can most certainly use an editor that records and runs macros. Given that the web is still a largely textual medium, where transformations like this one are often needed, it’s a shame that macros are a forgotten art. I often use them to prototype recipes that I’ll then translate into code. But sometimes, as in this case, they’re all the code I need. That’s something I’d want students of web literacy to realize.
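
For the record, here’s what “write code to implement it” might look like: a minimal sketch in Python (a hypothetical stand-in, not any particular translator’s implementation), using a regular expression to apply the recipe to a whole document at once:

import re

def wikitext_links_to_markdown(text):
    # [[Page|display text]]  ->  [display text](Page)
    return re.sub(r"\[\[([^|\]]+)\|([^\]]+)\]\]", r"[\2](\1)", text)

print(wikitext_links_to_markdown("See [[Frog|all about frogs]]."))
# prints: See [all about frogs](Frog).

For a few dozen pages, though, the macro got me the same result without ever leaving the editor.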

What would really make my day, though, would be for one of those students to say:

“Hey, wait a sec. This doesn’t make sense. There is no such thing as GitHub Flavored HTML. Why is there GitHub Flavored Markdown?”

Or Standard Flavored Markdown, which quickly became Common Markdown, then CommonMark. How de facto standards become de jure standards, or don’t, is a fascinating subject. The web works as well as it does because we mostly agree on a set of tools and practices. But it evolves when we disagree, try different approaches, and test them against one another in a marketplace of ideas. Citizens of a web-literate planet should appreciate both the agreements and the disagreements.

A cost-effective way to winterize windows

D’Arcy Norman asks:

If there’s a better way to winterize windows than just taping plastic to the frame, I’d love to hear about it.

Indeed. In New Hampshire, when fuel prices first skyrocketed, we did that for a couple of years. It’s an incredibly effective way to stop the leaks that suck precious warm air out of your home. But it’s a royal pain to install the plastic sheeting every fall, and when you remove it in the spring you inevitably pull paint chips off your window frames.

The solution is interior storms, a really nice hack I learned about from John Leeke. He’s a restorer of historic homes, and — what brought him to my attention — a narrator of that work. Interior storms are just removable frames, surrounded by gaskets, to which you attach your plastic sheets permanently. Once made they pop into your window frames in a few seconds every fall, and pop out as easily in the spring.

Achieving that result is, however, not trivial, at least it wasn’t for me. My first generation of interior storms, based on John’s instructions, was suboptimal. The round backer rod material he recommended had to be split lengthwise to form a D profile. I wound up making a jig to do that by drilling a backer-rod-diameter hole in a piece of wood, splitting it in half, embedding a razor blade on an angle, and rejoining the pieces. Great idea in principle, but in practice it was still hard to draw hundreds of feet of backer rod through the jig and achieve a clean lengthwise split. It was also hard to apply hundreds of feet of double-sided tape to the split material.

The backer rod I used also turned out not to be sufficiently compressible. The critical thing with interior storms is a tight fit. When you tape plastic to your windows you’re guaranteed to get that result, which is why it’s so effective. Interior storms need to press into their surrounding window frames really snugly to achieve the same effect. Inconsistencies in the width of my split backer rod, and the relative incompressibility of the material, resulted in storms that didn’t always fit as snugly as they should have.

Another problem with the first-generation storms was flimsy frames. I ripped pine boards lengthwise to create inch-wide frame members. They really should have been inch-and-a-half.

So last year I rebooted and created second-generation storms. I started with inch-and-a-half wide frame members. Then I ditched the backer rod and went with pretaped rubber gasket. It’s a much more expensive material but it obviates the need for do-it-yourself taping and has the compressibility I was looking for.

Yet another problem with the first-gen storms was that I made all the frames from the same template. The windows were nominally all the same dimensions, but it turns out there were minor variations and those matter when you really need to achieve a snug fit.

So the second time around I customized each frame to its window. Yes, it was tedious. But for our house it was necessary, and it might be for many old houses. Here’s the algorithm I came up with:

1. Cut short dummy pieces from spare inch-and-a-half-wide frame members.

2. Attach gasket to one side of each dummy piece.

3. Make all long and short frame members a bit longer than needed.

4. For each frame member:
– Place dummy pieces on either end
– Place the frame member between the dummy pieces
– Compress the gasket at one end
– Mark the frame member at the other end (accounting for gasket compression on that end)
– Cut the frame member
– Label the frame member (“living room west wall”)

Once the frame is made, you attach the plastic sheet in the usual way. I used Warp Brothers SK-38 kits which come with double-stick tape. You tape around the edge of the frame, lay down the plastic, smooth it by hand, press it down, trim the edges, and use a blow dryer or heat gun to shrink it tight.

This is the kind of job I hate doing. You spend lots of time climbing the learning curve, and then once you’re done you never reuse the knowledge you’ve painfully acquired. Since the method is so effective, though, I’ll toss out an idea that’s been percolating for a while.

Consider an older house in a northern climate, with older windows and storms, and adequate attic insulation. The walls may or may not be adequately insulated, but the first line of defense is to tighten up those windows. It’s expensive to replace them, and the replacements are going to be vinyl that will ruin the aesthetics of the house and won’t age well. It’s even more expensive to hire a restorer to rebuild the old windows.

Let’s say that interior storms deliver 80% of the benefit of replacement windows for 10% of the cost. Deploying this solution to all the eligible houses in a region is arguably the most cost-effective way to tighten up that population of houses. But the method I’ve described here won’t scale. It entails more effort, and more hassle, than most folks will be willing to put up with.

How could we scale out deployment of interior storms across a whole community? I’d love to see high schools take on the challenge. Set up a workshop for making interior storms. Market it as a makerspace. No, it’s not 3D printing, but low-tech interior storms deployed community-wide will mean way more to the community than anything a MakerBot can print. Also, turn the operation into a summer jobs program. Teach kids how to run it like a business and pay themselves better than minimum wage.

Since I am now living in Santa Rosa, winterization of windows is no longer a big concern. But I’ve been meaning to document what I learned and did back in New Hampshire. And I would really like to see John Leeke’s idea applied at scale in places where it’s needed. So I hope that the new owner of the house we sold in Keene will be successful with this method, that D’Arcy Norman and others will too, and that communities will figure out how to make it happen at scale.

3D Elastic Storage, part 3: Five stars to U-Pack!

It’s been a busy month. We sold our house in Keene, NH, drove across the country, and rented a house in Santa Rosa, CA. A move like that entails plenty of physical, emotional, and financial stress. The last thing you need is trouble with a fraudulent mover which, sadly, is so common that http://www.movingscam.com/ needs to exist. Luann spent a lot of time exploring the site and Jeff Walker, its founder, wrote her a couple of really helpful and supportive emails. When we realized that a full-service move wasn’t feasible in our case, Jeff agreed that ABF U-Pack — the do-it-yourself company I’d identified as our only viable option — was a good choice.

I’ve chronicled our experience with U-Pack before and during the move. Now that it’s done, I’m wildly positive about the service. Every aspect of it has been thoughtfully and intelligently designed.

The non-standard size and shape of U-Pack’s ReloCube are, at first, surprising. It’s 6′3″ x 7′ x 8′4″, and the long dimension is the height. As Marc Levinson’s The Box wonderfully explains, standardization of shipping containers created the original Internet of Things: a packet-switched network of 20′ and 40′ boxes. Those shapes don’t meet U-Pack’s requirements for granular storage, transport on flatbed trailers, and delivery to curbside parking spaces. But while the ReloCube’s dimensions are non-standard, the ReloCube system provides the key benefits of a packet-switched network: variable capacity and store-and-forward delivery. In our case, we’ve now taken delivery of the two cubes that held our household stuff. The two that hold Luann’s studio remain in storage until we figure out where that stuff will land. Smaller containers enable that crucial flexibility.

Smaller containers are also easier to load. Here’s a picture of a ReloCube interior:

All the surfaces are nicely smooth. And there are plenty of slots for hooking in straps. But I wound up using very few straps because I was able to pack the cubes tightly. It’s easier to do that in a smaller space.

I also like how the doors shut flush against the edge of the cube:

When you lever the doors shut on a tightly-packed container they compress and help stabilize the load. That wouldn’t be a significant factor with an 8x8x16 PODS container but with the smaller ReloCube it can be.

On the receiving end, I wondered how the cubes would be positioned. You’d want them snug to the curb, but then how could the doors open toward the house? The video linked to this picture documents the elegant solution:

The forklift driver placed the cube’s edge on top of the curb. Not shown in the video is the final tap with the forklift that aligned the cube perfectly. These folks really pay attention to details!

I can’t say enough good things about our U-Pack experience. No conventional service offered the flexibility we needed, so none was an option, but we did solicit estimates early on and they were astronomical: three to four times the $6300 we paid U-Pack to move four containers across the country and make them available to us on demand. (We’ll also now pay $100 per month per container for the two studio containers until we retrieve them.) There was very little paperwork involved. Every U-Pack employee I talked to was friendly and helpful. So I’m giving the service a five-star rating.

For me the experience was an echo of a time, fifty years ago, when our family moved from suburban Philadelphia to New Delhi. Here are some pictures of the “sea trunk” that was delivered, by bullock cart, to 102 Jorbagh.

Now the delivery vehicle is a flatbed trailer:

But the resemblance between our New Delhi sea trunk and our ReloCubes is, I think, not coincidental.

Actually the sea trunk trumped the ReloCube in one way. When it was delivered back home my dad arranged to keep it, and he turned it into a playhouse in the backyard:

3D Elastic Storage, part 2

Our U-Pack containers arrived on Thursday, August 21. We loaded them Friday through Monday, and they departed on Wednesday, August 27. If your loading phase crosses a weekend you get 5 days to load. That’s enough time to consolidate and reconsolidate as you fill the cubes, and to make final decisions about what to take or toss as you go along.

I’ve always enjoyed the challenge of packing things into containers. It’s kind of like building a stone wall. You wind up with oddly-shaped spaces to fill, and you look for oddly-shaped things that will fill them.

In our case we had more odd shapes than normal. Luann collects, among other things, antique wooden boxes that she uses to frame her sculptures and jewelry. On the first iteration I nested them into one another and consolidated them into standard 6 cu ft boxes. The advantage of standard-size boxes is that you can pack them tightly into a container. But if there’s a lot of air inside those boxes you lose many precious cubic feet.

So we unbundled the boxes and began using them, instead of standard small (1.5 cu ft) or medium (3 cu ft) cardboard boxes, for all the loose stuff that wasn’t packed tightly in the drawers of Luann’s various cabinets of wonders. As we filled the wooden boxes we wrapped them with mover’s wrap. That stuff was incredibly useful! It comes in 20″ by 1000′ rolls, it’s cheap, and it’s wonderfully designed for the purpose. The plastic doesn’t shrink-wrap but it’s tough and sticks to itself. We must have wrapped more than a hundred boxes. As a bonus you can see into the boxes so there’s less need to label the contents.

Packing boxes of different sizes and shapes is like a game of Tetris, but in 3D and with irregular shapes. You pack as tightly as you can, but there will be gaps. Fortunately Luann’s studio offered another useful resource: collections of yarn and fabric. These were originally packed in plastic totes. But totes aren’t space-efficient so we tossed them, redistributed the contents into plastic bags of various shapes and sizes, and evacuated as much air from the bags as we could. The result was a supply of packing material to fill spaces and cushion the load. For the studio containers, in particular, we wound up using very few cardboard boxes. An unanticipated benefit of the wooden boxes: structural support. When you’re stacking into an 8-foot-high space cardboard boxes tend to crush, wooden ones don’t.

In the end we used all four of the containers I’d reserved. Containers #1 and #2 are now storing Luann’s studio, #3 and #4 are storing our household stuff. If we’d been really brutal about excluding furniture we could have used only three and returned the fourth unused at no charge. I liked the idea of starting from scratch with nothing but a table and the bed we bought last year. But in the end a sofa, some chairs, and a few other items came along for the ride.

The household containers held no surprises for U-Pack. But the studio containers, especially #1, raised an eyebrow. There are some heavy items in that load. So heavy that I wound up hiring Dave Gillerlain and his team at Affordable Movers to help me load containers #1 and #2. What weighs so much? Among other things, African trade beads. Luann’s been collecting them for a long time, and she put Keene on the map of places that traders visit. A couple of times a year, Ibrahim Kabba would show up in his van and stage a bead show in our house. The van always rode low, and Kabba wore a back brace to carry in his wares. A cabinet packed full of those beads is a surprisingly dense and heavy object.

Here’s container #1 nearly full:

I’d wanted Dave to distribute the heaviest cabinets between containers #1 and #2, but things went quickly and by the time we got to this point I realized #1 was going to be a beast to lift. A useful refinement for U-Pack would be to embed a scale in each container. That feedback would have helped us balance the studio load between #1 and #2.

We’d left the house by Wednesday morning when the truck showed up to fetch the containers. But I dropped by for a final check, just in time for the pickup. It was the same portly middle-aged guy who had delivered the empties. One person can do the job, but that person is heavily augmented with some serious exoskeletons. This time, I was relieved to see, the forklift was much beefier than the one that had unloaded the empties. Still, I was worried about #1. Sure enough, he’d gotten #2, #3, #4 loaded, had struggled with #1, and was about to reposition the forklift for a second try. “What’s in that one?” I explained as best I could, and asked if it’d be OK. “Yep, just need to come at it from another angle.” He was cheerful, like every U-Pack person I’ve talked to, but despite his optimism I couldn’t bear to watch and drove away. Nobody called from U-Pack, and an hour later the truck and all four cubes were gone.

We left Keene a week after the closing, on September 3, drove across the country visiting friends and family along the way, arrived in Santa Rosa on the evening of the 13th, and rented our new home yesterday, the 15th. It’ll be another week before we can move in, but it’s worth the wait. The place we’ve rented has enough space to unload everything and create a basic working studio for Luann. So we’ll be able to retrieve all four containers and end all the storage charges. But that’s an unexpectedly good outcome. This is the North Bay, space is at a premium, and the rental market is tight as a drum. We were prepared to rent a small apartment, retrieve only the two household containers, then later rent a separate studio and retrieve the two studio containers. Shipping a load to two unknown destinations, for retrieval on two unknown dates, with pay-as-you-use storage for each part of the load, was a tricky set of requirements. U-Pack has designed a really smart system that can, perhaps uniquely, meet those requirements.

Not the link Zillow was looking for

In For sale by owner I talked about the online tools that helped us sell our house. I gave Zillow high marks. Even though our buyers didn’t find us on Zillow — in the end, it was a good old-fashioned drive-by — the service was useful for the reasons I mentioned. But now I’m going to have to subtract some points.

A few days ago I received this email, misleadingly titled Zillow inquiry:

Hi Jon,

I work for Zillow, the online real estate network. When looking for groups that have cited our brand, I came across your great blog post discussing your marketing strategy when selling you (sic) home and noticed you mentioned Zillow. http://blog.jonudell.net/2014/08/05/for-sale-by-owner/

Would you consider linking the word ‘Zillow’ in the third paragraph within the text as a resource to your users? Here’s the URL to the Zillow City Page http://www.zillow.com/keene-nh/

We really appreciate your coverage and thank you for considering the link on your page. Feel free to use me as a point of contact here if you need any data or content in the future, and if nothing else, I’m just glad to have had the chance to connect!

If this is not the correct contact would you please forward it to someone that can be of any assistance, thanks.

Regards,

NAME WITHHELD

I’m withholding the name because the guy was just doing his job. But shame on Zillow for making that his job. It got worse. A few days later:

Hey Jon,

Just wanted to follow up to see if you can help with adding the link. Let me know, thanks!

Regards,

NAME WITHHELD

Where to start? First, this is my blog. I choose whether to link the word ‘Zillow’ in paragraph 3, and if so, where to point that link. And now, because you had the gall to tell me how to do that, and then bug me about it, I’m going to point here.

Second, people who need a link to Zillow in order to find Zillow, if such people exist, are not your customers.

Third, consider who you’re dealing with. Zillow’s users are by definition going through a seriously stressful phase of life. We are likely to be emotionally and physically exhausted by the process of buying and/or selling a home, and by preparing to move. We wake up in the middle of the night obsessing about our checklists. You presume to add to our lists? Disrespectful. Bad form. Don’t.