Mike Sugarbaker

Are there still any non-technical publications that talk mainly about the web? Or is the web just kinda... obsolete that way?

Mike Sugarbaker

(Although it begins to look like IndieWeb and beakerbrowser.com are in the beginnings of a beautiful friendship!)

Mike Sugarbaker

All IndieWeb'd up and nothing to post.

Mike Sugarbaker

I cannot quite yet tell whether this is interesting:

http://beakerbrowser.com

Mike Sugarbaker

Do the tools work? What was the point of this?

Mike Sugarbaker

Two and a half consciousnesses are having tea at Townshend's

Mike Sugarbaker

There is no front-end web development crisis

I mean, that’s a thing, right? Us devs are all talking all the time about the tool chain and the libraries and the npms and just how hard building the web has become. But we’re talking about it wrong. The problem is not that writing JavaScript today is hard; the problem is we still have to write so much of it.

There isn’t a front-end development crisis; there is a browser development crisis.

Think about how many rich text editors there are. I don’t mean how many repositories come up when you search for “wysiwyg” or whatever on GitHub; I mean how many individuals out there had to include a script to put a rich text editor in their page. And for how long now? Ten years? Sure, we got contenteditable, but how much human suffering did that bring us?

Where is <textarea rich="true" />? Where, for the Medium-editor fans, is <textarea rich="true" controls="popup" />?

Believe me, I have already thought of your cynical response. There are 9,999 reasons to call this a pipe dream. I don’t have time for them, thanks to all the Webpack docs I have to read. I’m not talking about things that’d break the web here – I don’t want us to try to build jQuery or React into the JS engine. I’m talking about things that are eminently polyfillable, no matter how people are deploying them now. And do I want to start another browser war? Yes – if that’s the only way my time can be won back.
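Since I said "polyfillable," here's the whole trick in miniature: probe for native support of the (entirely hypothetical) rich attribute, and only ship the script editor to browsers that lack it. Everything named here is invented, so today the check would always say yes, load the polyfill.

```javascript
// Feature-detect a hypothetical rich attribute on <textarea>. No browser
// implements "rich", so for now this always reports that a polyfill is
// needed. A native implementation would reflect the attribute as a
// property on the element, which is what we probe for.
function needsRichTextPolyfill(doc) {
  const probe = doc.createElement('textarea');
  return !('rich' in probe);
}

// In a page you'd call it with the real document, roughly:
//   if (needsRichTextPolyfill(document)) {
//     loadScript('rich-textarea-polyfill.js'); // hypothetical loader
//   }
```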

Lots of web pages have modal content – stuff that comes up over the top of other stuff, blocking interaction until it’s dismissed. It was pointed out to me on Twitter that there have already been not one, but two standards for modals in JS, both of which have been abandoned. But they tried to reach toward actual, application-level modals, which already constitutes a UX disaster even before you add the security problems. By contrast, the web modals you see in use today are just elements in the page; a <modal> element, that you can inspect and delete in Dev Tools if you want, makes perfect sense. It might not replace a ton of code, but every little bit helps.
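To be concrete about "just elements in the page": here's a sketch of what a <modal>'s behavior could look like, written as a custom element today. Nothing in it is standard; the name and API are invented, and since the custom elements registry requires a hyphen, it's registered as x-modal.

```javascript
// A sketch of <modal> as a plain in-page element. Outside a browser there
// is no HTMLElement, so fall back to a bare class to keep the logic
// loadable anywhere.
const BaseElement = typeof HTMLElement !== 'undefined' ? HTMLElement : class {};

class ModalElement extends BaseElement {
  constructor() {
    super();
    this._open = false; // starts dismissed; it's just an element in the page
  }
  show()  { this._open = true; }   // a real polyfill would also toggle display
  close() { this._open = false; }  // and you could still delete it in Dev Tools
  get open() { return this._open; }
}

// Custom element names need a hyphen, hence x-modal rather than modal.
if (typeof customElements !== 'undefined') {
  customElements.define('x-modal', ModalElement);
}
```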

It doesn’t stop with obvious individual elements, although that may be the best initial leverage point (reference the <web-map /> initiative and its Web Components-based polyfill, the better to slot neatly into a standards proposal). There are plenty of cowpaths to pave. We need to start looking at anything that gets built over and over again in JS as a polyfill… even if for a standard that might not have been proposed yet.

You know what a lot of web sites have? Annotations on some element or another, that users can create. They have some sort of style, probably; that’s already handled. They might send data to a URL when you create them; that could be handled by nothing more than an optional attribute or two. While you’re at it, I want my Ajax-data-backed typeahead combobox. But now that we’re talking to servers…

You know what a lot of web sites have? Users. I’m not the first to point out that certificates have been a thing in browsers for pretty much the entire history of the web, but have always had the worst UX on all the civilized planets. There is no reason a browser vendor couldn’t do a little rethinking of that design, and establish a world in which identity lives in the browser. People who want to serve different content to different humans should be able to do it with 20% of the code it takes now, tops. (Web Access Control is on a standards track. Might some of it require code to be running on the server? Okay – Apache and Nginx are extensible, and polyfills aren’t just for JS; they’re for PHP too.)

And all of that implies: you know what a lot of web sites have? REST APIs. Can our browser APIs know more about that, and use it to make Ajax communication way more declarative without any large JS library having to reinvent HTML? Again, it’s been like ten years. REST is a thing.
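Purely to illustrate the shape of "more declarative" (the rest-* attribute names below are invented for this post, not any real proposal): the markup names the resource and the verb, and one small shared piece of machinery turns that into a request, instead of every site writing its own Ajax layer.

```javascript
// Translate hypothetical declarative attributes, e.g.
//   <form rest-resource="/api/posts" rest-method="put">
// into the url and options you'd hand to fetch(). All of the rest-*
// names are invented; this only sketches the idea.
function restOptionsFromAttributes(attrs) {
  const method = (attrs['rest-method'] || 'GET').toUpperCase();
  const options = {
    method,
    headers: { 'Accept': 'application/json' },
  };
  // Only non-GET requests carry a body.
  if (method !== 'GET' && attrs['rest-body']) {
    options.headers['Content-Type'] = 'application/json';
    options.body = JSON.stringify(attrs['rest-body']);
  }
  return { url: attrs['rest-resource'], options };
}
```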

While we’re talking reinvention, remember the little orange satellite-dish icons that nobody could figure out? Well, if we didn’t want to reinvent RSS, maybe we shouldn’t have de-invented it to begin with. In the time since we failed to build adequate feed-reading tools into browsers and the orange icons faded away, nearly all of the value of the interconnected web has been captured for profit by about three large companies, the largest being Facebook. For all practical purposes in America, you can no longer simply point to a thing on the web and expect people who read you to see it. Nor can you count on them seeing any update you make, unless you click Boost Post and kick down some cash.

Users voted with their feet for a connected web, which had to be built on one company or another’s own servers – centralized. It had to be centralized because we weren’t pushing forward on the strength of the web’s connective tissue, making it easy enough to get the connections users wanted. And credit where it’s due to Facebook and Twitter (and Flickr before them) for doing the hard work of making the non-obvious obvious – now we know, for example, that instead of inscrutable little orange squares in the location bar, we should put a Follow button in the toolbar whenever a page has an h-feed microformat in it. Or a bunch of FOAF resources marked out in RDFa, for that matter.
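The detection half of that Follow button really is small. h-feed is the actual microformats2 class name (hfeed is the legacy spelling); the sketch below fakes the parsing with a naive scan of class attributes, where a browser would of course use a real microformats parser on the DOM.

```javascript
// Naive check for whether a page advertises a followable feed: look for
// the h-feed (microformats2) or hfeed (legacy) class names. A browser
// would parse the DOM properly; string-scanning is just for illustration.
function pageHasFeed(html) {
  const classAttr = /class\s*=\s*["']([^"']*)["']/gi;
  let m;
  while ((m = classAttr.exec(html)) !== null) {
    const classes = m[1].split(/\s+/);
    if (classes.includes('h-feed') || classes.includes('hfeed')) return true;
  }
  return false;
}
```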

Speaking of microformats and RDF and bears oh my[1], it might be time to stop laughing at the Semantic Web people, now known as the Linked Data people. While we’ve been (justifiably) mocking their triplestores, they’ve quietly finished building a bunch of really robust data-schema stuff that happens to be useful for a clear and present problem: that of marking things up for half a dozen different browser-window sizes. Starting with structured data is great for that. Structured data may also be helpful for the project of making browsers help us do things to data by default, instead of having to build incredibly similar web applications, over and over and over again.

But Mike, you’re thinking, if the browsers build all these things we’ve been building in JS as in-browser elements, then everything will look the same! To which I say, yes – and users will stand up and applaud. They love Facebook, after all, and there ain’t no custom CSS on my status updates. It’s not worth it. Look, I don’t want to live in a visual world defined by Bootstrap any more than you do, but it’s time for the pendulum to swing back for a little while. We need to spend some time getting right about how the web works. Then we can go back to sweating its looks. And it’s not as if I’m asking for existing browser functionality to go away.

But Mike, you’ve now thought hard enough that you’re furiously typing it into a response box already, you have no idea. Seriously, you have no idea how hard it would be to do all this. Well, you don’t spend 20 years building the web, as I have, without getting at least some idea of how hard some of this will be. But you’re right, it will be stupid hard. And I’ve never been a browser engineer, so I have no real idea how hard.

And you, I counter, have no idea how worth it all the hard work will be. To break Facebook’s chokehold on all the linking and most of the journalism, or, if that doesn’t move you, to just see what would happen, what new fields would open up to us if connection were free instead of free-to-play. To bring users some more power and consistency whether individual web builders lift a finger or not. And yes, to bring front-end web development back a little bit towards the realm of the possible and practical.

Flash is dead; that is good. Apple may have dealt the decisive blow, but browser vendors did most of the legwork, and now as a direct result we have browser APIs for everything from peer-to-peer networking to 3D, VR, and sound synthesis. All of that is legitimately awesome. But for all the talk about defending the open web, that stuff only got done because a software-platform vendor (or three – Google and Microsoft’s browser teams helped a bunch) detected a mortal threat to the market of its product. When Mozilla threw its first View Source conference in Portland last year, that was my biggest takeaway: Mozilla is a software platform vendor, first and foremost, and will make decisions like one. It happens to be a nonprofit, which is great, but which may also contribute to its proclivity to protect itself first. That self-interest is what will drive it to do things.

So. Dear Mozilla: there is a new mortal threat to the market of your product. It is the sheer weight of all this code, not in terms of load time, although that’s bad enough, but development time. The teacups of abstraction are wobbling something awful, and we need you to have our backs. You employ people who are way smarter than me, and they can probably think of way better things to put into place than the examples I’ve got here. That isn’t the point. The point is there has to be less code to write. Pave some cowpaths. Make Browsers Great Again. Or something. Please. Thank you.

[1] Because I know they’ll get all in my mentions, I hasten to add that microformats were created by an entirely different tribe of developers than RDF-and-such, and were in fact created as a direct response to how awful RDF was to deal with at the time. And yeah, it was pretty awful to deal with… at the time. Now it’s better, and I kind of think team Linked Data has regained the edge. I tried really hard not to make this piece into a Semantic Web/IndieWeb beef tape. I’m sorry.

Mike Sugarbaker

That funny-looking Roundhead kid

One of my favorite memories of childhood is lying on the floor of my Dad’s office at Northbrae Community Church – he was the minister, about which I have a great story that has been cut for time – in Berkeley, California, reading Peanuts strips, of which my Dad had several collections shelved alongside Bibles, commentaries, philosophy and whatnot. I want to say those comic strip collections had pride of place on that bookshelf, but the truth is I was lying on the floor and that’s the only reason I spotted them there on the lowest shelf. So who knows?

Somehow I got to reading some about Charles Schulz and his approach to his work. Maybe there was an interview in the back of one of the books. Early on, I took in his opinion that the reason we all love Charlie Brown so much is that “he keeps on trying.” To kick the football, to talk to the little red-haired girl, to win a baseball game, to belong.

I didn’t connect with that sentiment. It’s not clear what I did do with it, but I never felt like that was a reason to love Charlie Brown. At the time I just loved him instinctually. Here was this kid who, like me, didn’t really fit in, and got a lot of shit thrown at him for no reason that he could see – maybe just because others needed some entertainment. Just like him, I didn’t have the tools to deal with that harassment, not without poisoning myself a little bit inside every time, and just to mix metaphors and switch over to the Peanuts animated cartoons, none of the adults seemed to be speaking my language when I asked them what to do. Their advice – just ignore them when they pick on you! – might as well have been a series of muted trumpet sounds.

I didn’t love Charlie Brown because he kept on trying – I loved him because the alternative was loving a world that thinks some people are just better than others, and that those people who don’t seem to have the world’s favor should certainly never ask why or why not. They should just keep on trying. (Charles Schulz, by the way, was a lifelong Republican donor.)

Now, I’m notorious for reading literature a bit shallowly (and yes, Peanuts is literature, up there with The Great Gatsby as some of the greatest and most iconically American of the 20th century, but that’s another post), and I miss layers of meaning sometimes. My dad pointed out as I was writing this that reading Charlie Brown more generally as hope, and specifically as a tragic hero defined by his inability to give up hope, is a pretty strong reading that also supports that Schulz quote. Personally, I could see Schulz connecting with Charlie Brown more on the level of commitment to one’s job; the fact that Schulz could do the same gags with Charlie Brown for 50+ years and never have to deal with him changing is something he could feel good about (n.b. his own career as a cartoonist, and the occasional strips about Brown’s father, a barber, and his connection to that craft). Charlie Brown kept showing up for work, which Schulz and others could admire and enjoy on more than one level.

But permit me an indulgence. Lately I’ve been nursing this crackpot theory that the American Civil War actually started in England in the 1600’s. I have another theory on the side, more straightforwardly supportable, that said war is also ongoing. To get at my case for its beginning, though, I’ve gone to Albion’s Seed: Four British Folkways in America by historian David Hackett Fischer. One of the so-called folkways – a “normative structure of values, customs and meanings” – Fischer chronicles is that of the Royalist side of the English Civil War that became known as the Cavaliers.

The Cavaliers were, as you might guess, known for having horses when their opponents more often didn’t, but also for mostly being wealthy and interested in letting you know they were wealthy, and for their interest in having big estates with really, really big fuck-off lawns; a particular style of being landed as well as moneyed. The English Civil War separated the monarchy from political power – if not quite for good, and as it turns out, Puritans make lousy rulers – but it didn’t separate the Cavaliers from the kind of power that they had. And when England got cold for them in the 1640’s, a lot of them moved to more receptive territory in the colonies, namely in Virginia and points south. Fischer draws a strong correlation between this migration and the “Southern Strategy” that put conservatism back into its current power in America.

In the English Civil War, the King and the Cavaliers were opposed by a bunch of factions which, thanks in part to the close-cropped Puritan hairstyle, became collectively known as Roundheads. I was so happy when I heard that. I imagined that round-headed kid, good ol’ Charlie Brown, in peasant clothes holding up a pike, demanding an end to the divine right of kings. Permit me that.

I allow that Charlie Brown is an awkward symbol for forces aligned against conservatism. He doesn’t win much, for starters. There’s also the uncomfortable invitation to misogyny in the relationship between failed jock Charlie Brown and frequent football holder Lucy Van Pelt, which a certain flavor of person will accept wholeheartedly. Speaking of which, one facet of Charlie’s woes is a major contributor to the entitlement we now see in certain nerd cultures gone sour. (There was a point when it could easily have done that in me. I’m still not entirely sure how I avoided this.)

Instead, I ask you to respond to Charlie-Brown-the-symbol the way I did as a child, but couldn’t articulate until recently: negatively. I want you to tell him to stop being who he is, to grow out of his perhaps-essential nature and start making demands. But stay his friend, by demanding that the forces that make his world step into the frame and be seen, lose the muted trumpets this time, and name their reasons for letting this world exist. Charlie Brown has hope, but he shouldn’t need it.

This is obviously personal for me. I didn’t become tough and wise by virtue of recreational abuse at the hands of my peers; any wisdom I have I was able to get in spite of their best efforts. Any strength is left over from what they sapped. Some kids might respond to abuse and interpersonal adversity by getting stronger, but if you’re writing off the ones who don’t as losers, or trying the same methods over and over of teaching them to cope, you’re indulging yourself in a toxic, convenient fantasy. Making others feel small to feel bigger yourself is no more inevitable a part of human life than humans killing one another for sport. Polite society eliminated one of those; it can lose its taste for the other.

When people become identified with a power they take for granted, they go halfway into bloodlust when you threaten to mitigate that power in even the smallest way. In the end, that’s the basis of conservatism. But the power to take a shit on someone, at some point, when we’ve decided it’s okay, might be one that we all identify with. So I don’t have a lot of hope that we’ll change this in my lifetime, or even make a dent. But I want to stop kicking the football. I want to start asking the question.

Mike Sugarbaker

You knew you were tired, but then where are your friends tonight?

In late October I declared November to be NaNoTwiMo – National No Twitter Month – and took the month off of Twitter. I pledged neither to read posts nor to make them, except in emergencies. I declared an emergency for the day I finally got user creation working for theha.us, my multi-user instance of the up-and-coming “distributed social network” tool Known. (I say “up-and-coming” when I ought to say “coming someday,” since the distributed part is still unimplemented, but uh, I’ll get into that later.) And I decided not to count the occasional trip to the profile page of a tech person who’d recently announced something – the public nature of Twitter often makes it more useful than email for open-source-related communications. And I cheated a few times.

Why do this when Twitter is more or less where I live online these days? Because Twitter, corporately speaking, is steadily becoming less committed to letting me direct my own attention. I can turn off the display of retweets, but not globally – just one friend at a time – and Twitter now also occasionally offers me something from someone a friend follows, apropos of nothing. I can use a list, for those times that I only want updates from the people dearest to me, but lists now ignore my no-retweets settings. Without that ability to turn down the noise when I want, I find that using Twitter makes me less happy. And this is all to say nothing of Twitter’s then-ongoing refusal to do anything systemic to manage its abuse problem and protect my most vulnerable friends. (Things have since gotten a hair better on that front.)

In a post on Ello that’s no longer visible to the public, net analyst Clay Shirky wrote, “really, the only first-order feature that anyone cares about on a social network is ‘Where my dogs @?’” It is devastatingly, sublimely true. It is astonishing how much people will put up with to be where their people are.

For November, when I had something to say I generally put it on Ello. My account, like Shirky’s, is set only to be visible to other registered Ello users (I have invites if you’re curious). I’m not sure why I’m doing that, as it doesn’t make things private per se; Shirky is also aware of this and thoughtful about how different levels of privacy influence a piece of writing. It feels right sometimes to talk this way in a different room, even if the door isn’t closed. The most surprising thing about the last month is how many people – how many of my friends – not only came over to Ello when I raised it as an option, but stayed. They didn’t burn their Twitter accounts down behind them, and they didn’t show up a lot; I’m often the only voice I can see above the fold in my Ello Friends stream. But there were Monica and Jesse and Jenny and Megan, showing up now and then, posting things that are longer than 140 characters, the way we thought we would (and did for a while!) at Google+.

But that’s not a movement. It’s a pleasant day trip, and it might be over.

It’s an article of faith in the tech community that a social network can always hollow out the way MySpace did when a new competitor reaches a certain level. But that was a different world. Almost ten years ago, right? Getting all the kids to move is a whole other ballgame from moving the kids, plus their parents, plus the brands and photo albums and invitations and who knows what else. Not to be too specific; I’m just citing Facebook as an example, my beef isn’t with them in particular. (Facebook also beat MySpace in part by being perceived as high status, and what’s higher status than every celebrity you could name having an @-name?)

The last ten years have made us awfully demanding in some ways. If you ship social software to the web, it had better have every feature that people might want and have it immediately, because it will be taken for always-and-forever being what it is when the first wave of hype hits. No minimum viable product is going to win over the mass. Even more frustrating is the IndieWeb movement: I may be about to display myself here as one of those who give up hope when a feature is missing, but I’m also in a position to know that the rate of progress of open-source distributed social networks has been ludicrously slow. We finally have an almost-viable open-source product, analogous to WordPress – that’s the aforementioned Known – but it still has no interface for following people, whether on the same site or elsewhere. The code infrastructure is there, but there’s no way to use it yet. I guess all its hardcore users are still using standalone RSS readers like good Web citizens or something, but the mainstream was never interested in fiddling with that. (Nor will standalone RSS readers support private posts.) Given the, er, known impatience of the mass for anything that doesn’t do all of the things already, I’m starting to worry that the indie web won’t have what it needs to get traction when the time is ripe (that is, when Twitter finally falls over).

Maybe I’m only running a Known instance, or caring at all, out of nostalgia. I’m old enough to remember the web we lost. On the other hand, there’s an important sense in which we got what we (I) wanted – we’re all together, all connected… and it’s terrible. Clay Shirky has an idea – a whole book in fact – about the cognitive surplus of a population having been liberated by the 40-hour work week and creating a kind of crisis where we didn’t know what to do with ourselves, until television stepped in. Like the gin pushcarts on the streets of London after the industrial revolution, television stopped us from having to figure out what was wrong and fix it. In (Shirky’s) theory, the internet is our equivalent to the parks and urban reforms that made gin pushcarts obsolete – but what if all that connection is actually a crisis of its own? I think a lot about something Brian Eno wrote in 1995 in his book A Year With Swollen Appendices (he was writing about terrorism, but it applies): “the Utopian techie vision of a richly connected future will not happen – not because we can’t (technically) do it, but because we will recognize its vulnerability and shy away from it.”

We may be shying away already, by using mass-blocking lists and tools and the like. Maybe that’s not so bad, provided that Twitter’s infrastructure can keep up. But then, we’re usually willing to do as little as we can to stay comfortable instead of getting to the root of the problem. I’m back on Twitter now, using a second account in place of a list, which isn’t ideal (lists can be private). But where else am I going to tell my friends when I’ve found something better?

Mike Sugarbaker

Design for the user who's over your crap

It’s happening again as I write this, with tilde.club: at first people were excited about the stripped-down, back-to-basics user experience of a plain UNIX server designed for serving web pages, and the aspect where logged-in users could chat at the command line gave the place the feeling of an actual social network. But now the initial excitement is spinning down and people are updating their pages less often; whether the chat is still hopping, I couldn’t say – I don’t have an account – but I guarantee you it’s changing.

What do we need from the social network that’s next, the one that we actually own? (You could argue as to whether it’s coming, but no need for that right now.) I propose that the moment we get bored is the most important moment for the designer of an app to consider. Right? Because what’ll people do with whatever revolutionary new web thing you put in front of them? If my experience on both sides of the transaction is any guide, they’ll probably get sick of it, and fast.

There are so many kinds of boredom, though. There’s the smug disappointment of paddling your surfboard over to what looks like the next wave, only to find that it “never” crests. A more common pair, though: there’s the comedown – when something was legit exciting but then the magic leaves – and then there’s the letdown, when something seems exciting at first blush but you investigate and find the glamour was only skin deep. Most systems have more to fear from the latter. New systems that are any good, though, don’t often have a plan for the former. Distributed social networking needs one.

What do people need at first, and then what do they need later?

At first:

  • Co-presence (hanging out)
  • Discovery (more and more users!)
  • Things to play/show off with (hashtags, what have you)

Later:

  • Messaging (purpose-driven – I need to get hold of *that* person)
  • Defense (from spam, griefing, and attention drains of various kinds – generally, but not entirely, from the public)
  • Things to use and enjoy (tools and toys that aren’t purely social)

One’s needs from the first list never go away, exactly. You’ll always want to bring something up to the group now and then (where “the group” is whoever you’re actually personally invested in conversation with), and play and discovery don’t die. But we see so much more design for that first list – probably because a commercial social network needs to privilege user acquisition over user retention… or thinks it does. And as a whole culture we are only now coming around to the importance of designing for defense, despite the evidence having been here for 35 years.

It’s hard to keep coding when the bloom is off the rose of a project. One way to keep yourself motivated, when the work is unpaid, is to take the perspective of that un-jaded, excited new user, discovering and fooling around. This naturally leads to features that appeal to that mindset. A major obstacle we face in developing the decentralized, user-owned permanent social network is making faster progress while maintaining the mindset that will result in a grownup network for grownups.