
Human Shardable Apps: Designing for Perpetuity


February 4, 2013 at 10:56 am
Yea, not so much. (CC by jcarbaugh)

There’s a bright orange Gowalla shirt in my closet.  There’s a Gowalla sticker on the door of the painfully named Suburbia Torchy’s Tacos, exhorting you to check in.  There might even still be a Gowalla app installed on your iPhone.  But Gowalla is no more.  When it was shutting down after the team was acqui-hired by Facebook, they claimed to be working on a way to let users download their data: photos, status messages and check-ins.  That never happened.

Those of us in the web startup community don’t spend much time thinking about the legacy our applications will leave.  We rush to new technologies and platforms without a thought to what will happen when the investors pull their cash or the company pivots to selling speakers out of the back of a truck.  Just like we’ve embraced things like scalability, test suites, and code maintainability, it’s time to start taking our software legacy seriously.  It’s time to start thinking about our responsibility to our users, not as table IDs or profiles, but as human beings.

I’m as guilty of ignoring this issue as anyone.  From 2006 to 2010 I led a team at Polycot that built and hosted the Specialized Riders Club, a social network for riders of Specialized Bicycle Components gear.  We were a contract development shop, so aside from our monthly budget for hosting, we only got paid for doing big new development projects, like adding photo and video sharing or internationalization.  When we designed new features we never discussed what would become of things if the site was shut down, and we didn’t budget money for shutdown contingencies or user data exports.

When the time came to shut the site down and migrate the Riders Club to a new platform, a notification was sent out to users.  They were given a few months to archive any content from the site the old-fashioned way: copying and pasting, or right-clicking and saving.  Then it was gone.  Admittedly, the number of active users we had at the Riders Club was dwarfed by the number of users Gowalla had, but the same responsibility applies.  If we’d gotten export requests we would have pulled the data and sent it on, but we need to start thinking about the data our users entrust us with from the start.  By asking them to share their content with us, we take on a responsibility to them.

Bruce Sterling talked about this in his SXSW 2010 closing talk, and Jason Scott of Archive Team and the Internet Archive gave a great talk about it at dConstruct 2012; I suggest you take a listen.  Archive Team tries to collect sites that are destined for the trash heap, archiving things like FortuneCity, GeoCities and MobileMe.  They offer a VM you can spin up that runs their automated scraper tool.  It’s a pretty cool hack, but the fact that Archive Team even has to exist is a testament to how bad we are at considering our legacy.

Historically, few sites offered useful data exports, and those that did used formats you’d need to write your own application to utilize.  37signals’ Basecamp had XML exports, but no HTML option as of 2009.  Facebook added a data export option in 2010, and it’s getting better, but I don’t believe it’s in an application-friendly format.  Twitter is finally rolling one out for its users, but it’s been three months and I still can’t export mine.  Even if I have my archive and you have yours, there’s no way for us to put the two together and get any networked value.  They’re designed for offline reading or data processing, not so the spirit and utility of the service can live on.
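To make the "networked value" point concrete, here's a minimal sketch of what combining exports could look like. It assumes a hypothetical export format (a JSON-style list of posts with "user", "time" and "text" fields, which no real service of the day actually provided) and simply interleaves two people's archives into one shared timeline:

```python
def merge_timelines(*exports):
    """Merge several users' exported timelines (each a list of
    {"user", "time", "text"} dicts) into one chronological stream.
    ISO 8601 timestamps sort correctly as plain strings."""
    merged = [item for export in exports for item in export]
    return sorted(merged, key=lambda item: item["time"])

# Two hypothetical personal exports, merged into a shared view:
mine = [{"user": "don", "time": "2012-05-02T10:00", "text": "Checked in at Torchy's"}]
yours = [{"user": "ana", "time": "2012-05-01T09:00", "text": "New bike day!"}]
timeline = merge_timelines(mine, yours)
```

The code is trivial; the point is that none of the export formats of the era were designed so that two archives could be recombined this way.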

Especially as web applications get more dynamic and collaborative, I think we need to start thinking in terms of giving users a program they can use interactively, or even one that can combine multiple data exports into a mini-version of the site.  If your application is simple, maybe that’s a stripped-down Python or Ruby application you can access with a web browser.  If it’s complex, maybe it’s an i386-based VM: spin it up, and it has a complete site environment which can import the data exports from your live site, perhaps as many exports as you have access to.  You should already have something like this for getting your developers up to speed quickly, so it shouldn’t be too hard to repurpose it for users.

You may say, “But my code is proprietary, why would I want to share it?”  Most sites don’t really do anything special in software, though.  Gowalla might have had a unique ranking algorithm, but you can pull that out of a public release.  And if your code is so terrible that you wouldn’t want it up on GitHub, you have other problems, but don’t let that stop you.  Bad code is better than no code.  When you start a project, make an implicit pact with your users: they’ll take care of you if you take care of them.
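A "stripped-down application you can access with a web browser" can be very small indeed. Here's a sketch, using only the Python standard library, that serves a hypothetical export.json (a list of posts with "date", "author" and "body" fields, my invention for illustration) as a browsable page:

```python
import json
from html import escape
from http.server import BaseHTTPRequestHandler, HTTPServer

def render_archive(posts):
    """Render an exported post list as a minimal HTML page."""
    rows = "\n".join(
        f"<li><b>{escape(p['date'])}</b> {escape(p['author'])}: {escape(p['body'])}</li>"
        for p in posts
    )
    return f"<html><body><ul>\n{rows}\n</ul></body></html>"

class ArchiveHandler(BaseHTTPRequestHandler):
    """Serve the archive from export.json in the current directory."""
    def do_GET(self):
        with open("export.json") as f:
            page = render_archive(json.load(f))
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(page.encode("utf-8"))

if __name__ == "__main__":
    # Browse the dead site's content at http://127.0.0.1:8080/
    HTTPServer(("127.0.0.1", 8080), ArchiveHandler).serve_forever()
```

A real site would need templates, media files and a richer schema, but even something this crude keeps the content readable after the servers go dark.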

In Aaron Cope’s time pixels keynote at the New Zealand National Digital Forum, he talks about downloading archives of his Flickr photos with a project of his called Parallel Flickr (here’s the related conference talk and blog post), and the idea that if we could download our contacts’ photos, perhaps it would be possible to re-assemble a useful web of photos when (inevitably) Flickr goes away.  That’s great, but building this code shouldn’t be left to users.  As web application developers, we should encourage it.  When you build a client application, give it the ability to use an alternate API endpoint.  That way, if your site shuts down and your domain goes away, people can connect the client to another host.  Or they can run it through a private API middleware which archives the things they want to keep private, away from your service.
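Making the endpoint configurable costs almost nothing at design time. A sketch of the idea, with an invented service name and path (the class, URL and /photos route are all hypothetical):

```python
from urllib.parse import urljoin
from urllib.request import urlopen

class ArchiveAwareClient:
    """A client that takes its API endpoint as configuration rather
    than hard-coding it, so users can point it at a mirror, a private
    archiving middleware, or a local instance after the original
    service shuts down."""

    DEFAULT_ENDPOINT = "https://api.example-photo-site.com/v1/"

    def __init__(self, endpoint=None):
        # Fall back to the official endpoint, but never require it.
        self.endpoint = endpoint or self.DEFAULT_ENDPOINT

    def url_for(self, path):
        """Resolve an API path against whichever endpoint is configured."""
        return urljoin(self.endpoint, path.lstrip("/"))

    def fetch_photos(self, user):
        return urlopen(self.url_for(f"photos/{user}")).read()
```

One constructor argument is the whole trick: `ArchiveAwareClient("http://localhost:8000/api/")` keeps the client alive against a community-run host long after the official domain lapses.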

Eventually your site’s going to go away, and no matter how much lead time you give people, some day your funding will run out and there won’t be a site to host an export button anymore.  If your site is social, like MySpace or Facebook, the data has inherent privacy concerns.  You can’t post an archive of all of Facebook or MySpace for people to download; there are private messages, photos, comments and all kinds of other secure stuff in there.  But knowing this is going to be an issue, maybe we could create a standard method for authenticating a site’s users and bundling user data.  We could set up archive.org or some other site with enough ongoing donations (kind of like how the Federal Deposit Insurance Corporation works) to store all the data forever, and provide a self-service way to authenticate yourself and get at it.  Maybe even a volunteer team to help children and loved ones download a deceased relative’s data, or to help people who’ve lost access to the email addresses they signed up with.

My birth mom kept a paper diary her entire life, and after she passed away from breast cancer the diaries passed down to her kids.  The family got together at our house last month, and tidbits of information from her diaries were mentioned several times, by my brother’s girlfriend who never even had a chance to meet her.  Imagine if my mom had thought, “Oh, I’ll just use Gowalla to log what I do every day.”  Her future daughter-in-law (hint, hint, Don) would never have had the chance to know her in her own words.

Developers of the world, that’s the mandate.  Build your applications with their post-shutdown legacy in mind.  We need to consider it at every step in our development process, just like we consider deployment, usability and scalability.  We need to start building mechanisms for users to maintain their data without us before the money runs out.  We need user-centric exports built into the system from the start.  We need a way for users to get access to that data even when the site hosting the export button disappears.  We need all this so we can build the future with a clear conscience, knowing we’re leaving a legacy we can be proud of.

P.S. If you’re building legacy tools into your codebase, or know of someone who’s doing a really good job of this, leave me a comment.  I’d love to put up a post of real-world examples and pointers.

Update 1: Aaron Cope has an excellent talk/blog post on this topic as it relates to Flickr, explaining in eloquent detail the trials and concerns of someone who’s built a shardable version of a major social service.  You can (should) read it here.

Platform Persistence, Virtual Death and Pocket Worlds


October 26, 2012 at 1:00 pm

Note: This is a long, rambling, train of thought post. The tl;dr version is: Emotional connection to bots happens, we get sad when things we care for go away, so there’s a big ethical risk associated with human-acting bots living in unportable platforms. We members of the ‘Bot 2.0’ community need to address this before we get too far.

A little over a year ago I started playing a cloud-based iPhone game called GodVille. GodVille describes itself as a Zero Player Game. You take the role of a god, you create a hero, and you send that hero out into the game world to fight on your behalf. Your hero is an independent being.  When you come back to check on them, they will have recorded an entertaining diary of monsters fought, treasures collected, and items sold, all without your input. You have only four ways to influence your hero: you can encourage them, which makes them heal faster; discourage them, which makes them fight better; shout down at them; and activate some of the items they pick up.

While it isn’t a very interactive game, it’s still a compelling experience. I check on my hero every day or two, look for interesting items to activate, and encourage him as much as I can.

Your GodVille hero can’t permanently die. They can be killed, but they’ll just wait around in the ground, writing notes in their diary until you resurrect them. (They’ll get tired of waiting for you and dig themselves out after a few days.) Not killing these bot-like characters is common in online games; permanent death is generally reserved for the hardcore modes of single-player releases. (A really interesting article on wired.co.uk postulates that the free-to-play model is driving this, because developers don’t want to give you an excuse to walk away from their microtransactions, or the feeling that your money was wasted.)

Pets in GodVille

Once sufficiently powerful, your GodVille hero can adopt a pet, its own sub-bot that helps it fight and gains its own levels. My hero adopted a pet earlier this year. Over the next few weeks I watched the pet (a dust bunny named Felix) fight alongside my hero, shield him from attacks and help heal him. The pet went up in level, gained some abilities, and everything was going just peachy.

Then I opened the app one day, and the pet was dead. My hero was carrying around Felix’s corpse. I went to the web and searched for pet resurrection, but found it wasn’t possible. Sometimes the hero will pay to have the pet resurrected; sometimes they’ll just bury them. After a grieving period, they’ll adopt a new one.

Felix’s death had a lot more of an emotional impact on me than I expected. I didn’t know Felix, I never met it, it really only existed as a few hundred bytes of data on a server somewhere. I’ve had more interactions with lamps in my house than I did with Felix.  If you tip a lamp I really like off a table and shatter it into a million pieces, I may be angry, but I likely won’t feel an immediate emotional loss.


A Lamp with Feelings

Felix’s death was hard because I’d made an emotional connection to him, watching him interact with my hero. His death highlighted my powerlessness in the game. I can resurrect my hero, within the confines of the game mechanic, but I can’t resurrect his pet. No matter what I do, no matter how hard I try, I can’t bring Felix back to life.

Someday, inevitably, GodVille will shut down. People will move on to other projects, the server bill won’t get paid, iPhone apps won’t be the hot thing anymore. My hero, his diary and pet will disappear, and because he only lives inside the GodVille system (and being part of that system is a fundamental aspect of who he is), he will be gone forever.

Bruce Sterling at SXSW 2010 (photo by jonl)

Bruce Sterling gave a great talk about this at SXSW in 2010, about how the Internet doesn’t take care of its creations. We build and throw away. Startups form, grow like crazy, and if they don’t sufficiently hockey stick, they close. Or they get popular but not popular enough, and the team gets hired away to bigger players. Either way, the service shutters, the content and context disappear, history is lost. If it’s bad to have this happen to your restaurant check-ins and photos, how much worse is it when it happens to virtual beings you’ve created an emotional attachment to? As creators, if we encourage platforms like this, roach motels where content comes in and never comes out, what does that say about us?

Eighteen and a half years ago I created my first character on a text-based multiplayer internet game called Ghostwheel, hosted by my first ISP, Real/Time Communications. Ghostwheel was a MOO, an object-oriented version of a Multi-User Dungeon, the progenitor of today’s MMORPGs like World of Warcraft. In a MOO you can create characters, build environments and objects, talk to other people, fight, and even create bots.

Real/Time Communications hosted Ghostwheel on a small server in their data center, a 486 desktop machine. People from all over the world connected to that server, created characters, and wove shared stories together over the early boom years of the internet.

A Late 90’s Austin Ghostwheel Austin Meetup

Eventually Real/Time Communications lost interest in hosting and maintaining Ghostwheel (and eventually Real/Time itself disappeared), so we took it elsewhere. As someone with colocated servers and ISP experience, I ended up hosting it on one of my machines. It now lives in a cloud VM, and even though the players have left for newer, more exciting destinations, everything they created, the characters, the setting, the dusty echoes of romances and feuds and plots all still exist. It still exists because someone with the wherewithal got their hands on it, and cared enough about it to keep it going, and it exists because MOO is an open source platform that doesn’t depend on one company being in business.

While piecing together the thoughts for this post, it occurred to me that the MOO server could probably be compiled on some modern Linux-based smartphone. They have more than enough CPU power and memory, and even a 3G connection is fine for text. I could conceivably load Ghostwheel on one and carry it around in my pocket. A whole world, nearly a thousand characters, tens of thousands of rooms and objects, dozens and dozens of species of monsters, all living in my pocket. I could hand it to people and ask them about the weight of a world. Every time I think about that it blows my mind. There’s definitely the kernel of something new and weird there.

So back to my point: as I’ve talked about before, there’s a whole species of autonomous bots appearing around us that we relate to as nearly human. Like my GodVille character, we don’t have direct control over them, their autonomy being one of the things that makes them seem more human. They’re coming, they’re awesome, and I think in a few years they’ll be as common as Facebook accounts.

The most exciting work I’ve seen in this field is from the good folks at Philter Phactory and their Weavrs system. Weavrs are social bots defined by location, work and play interests, and groups of emotional tags. The Weavrs system hooks into Twitter, generates its own personal web pages (kind of like a bot-only mini-Tumblr) for each weavr, and is extensible through API-driven modules called prosthetics. One example is the dreams prosthetic, which folds images the weavr has reposted into strange, creepy kaleidoscopes.

Weavrs are easy to create, they produce some compelling content, and they’re fun to watch. I’ve created a few, my wife has one, several of my friends have them. Interest is picking up from marketing and branding agencies, and where the cool hunters go, tech interests will inevitably follow.

The thing that’s starting to concern me is the possibility that Bots 2.0 could end up being another field, like social networking, where the hosted model gets out ahead of ownership and portability. What happens when the service hosting our bots disappears?  What happens to all its posts, its images, its conversations?  (I suppose I wouldn’t be qualified to work at a cloud provider if I didn’t have strong feelings about data portability.)

Weavrs as a whole isn’t open source, but it has lots of open source bits. Philter Phactory is trying to run a business, and I don’t begrudge them that. They have the first mover advantage in a field that’s going to be huge. I’m sure data portability is on their radar, but it’s a lot easier to prototype and build a service when you’re the only one running it. Conversely, it’s a lot easier to scale out a platform designed to be run stand-alone than to create a stand-alone version of a platform.

Once a few more folks start to realize how interesting and useful these things are, I think we’re going to see a Cambrian Explosion of social bots, and I’m sure plenty of entrants in the field won’t be thinking in terms of portability. They’ll be thinking about the ease of centralized deployment and management, and the reams of juicy data they can mine out of these things.

I remember feeling a similar excitement about self-publishing (blogging) in the early 2000s. It was obviously going to be around forever once it was perfected. You could see the power in its first fits and starts, and it was just going to keep getting better. In fact, I think there are more than superficial similarities between self-publishing platforms and social bot platforms.

Thinking back on that evolution, I think the archetype we should hope for is the WordPress model. I remember Matt Mullenweg visiting the Polycot offices in 2004 or so. He was passionate, had a great project on his hands, and I’m embarrassed to say that we weren’t smart enough to figure out a way to help him with it. Matt, Automattic and the WordPress community have done a great job of managing the vendor lock-in problem while still providing a great hosted service people are willing to pay for. They get the best of both worlds: the custom WordPress sites and associated developer community, millions of blogs hosted by ISPs, the plugin developers, and they still get to run a nicely profitable, extremely popular managed service.  If wordpress.com goes away (god forbid), someone will still be maintaining the core codebase, and you’ll be able to export your data and run your own instance as long as you like. (Just remember to register your own domain name.)

I hope the social bot community evolves something similar. I think platforms are coming online to encourage that, and I think the people in the field are smart and recognize the ethical implications. Maybe in a year you’ll be able to run your bots on a hosted service or, if you’re motivated, run your own bot server and fiddle with its innards as you please.  Who knows, you may even run them on your smartphone.