Touring the Breakfast Factory: Thoughts on High Output Management

by phildini on January 10, 2019


As I mentioned in my last post, I recently moved from being a Senior Software Engineer to a Team Lead. I’m fortunate to have received the advice early in my career that moving to management is less a promotion and more starting a new job; I immediately started looking for information on how to get better at this new job, fast. 

I am doubly fortunate to know Jacob Kaplan-Moss, and to have come across his reading list for new engineering managers last year. As soon as I knew that I was heading towards a management path, I bought every book on his list, including Andrew S. Grove’s High Output Management.

One of my winter break goals was to get through as many books as possible, and High Output Management was at the top of the stack. As soon as I started reading it, I understood why it’s so highly recommended in management circles: it’s the best book on managing teams of people that I’ve read so far. It’s so good, in fact, that some of the best ideas in it seemed obvious to me. The ideas seem obvious because every company I’ve worked for has implemented some part of Grove’s ideas about management. They seem obvious because I have the advantage of living in a world that has had High Output Management in it for the past 30 years.

Take the idea of metrics and outcomes guiding a team. Every company I’ve worked for, especially in tech, has every team or department within it tracking metrics that get reported up the chain on a regular basis. To an engineer, this obsession with team metrics and trying to improve them can seem like a waste of time. “We feel good about the things we’re working on, why do we have to spend so much time quantifying them?”

The answer to this comes almost immediately in High Output Management: Every team is a black box to everyone not on the team, and the only way to know if a team is successful or not is to check the team’s metrics or see what they’ve shipped (which is itself another kind of metric). Thinking about metrics and outcomes in this way permanently changed my approach to teams. I started immediately looking at my team’s reporting metrics not as some arbitrary goal to hit, but as the only measure of the team’s health that most of the rest of the company would see.

Once you start thinking about metrics and outcomes in this way, if you’re like me you’re driven to make sure the metrics are real for your team. “Real” here means that the metrics actually line up with what the team AND the company care about, that your team can do something to affect the metrics, and that the members of the team are bought in to what the metrics represent. That last bit is especially crucial. Once your team knows why the metrics are important and agrees on what they should be, they can start making suggestions for how to improve them that might be better than the planned workstreams.

Speaking of outcomes and ideas that were popularized by High Output Management, let’s talk about OKRs. OKRs are an instance of the endless acronym parade that permeates Silicon Valley, and this one stands for “Objectives and Key Results”. Grove introduces this in talking about “management by objectives” (MBO, hooray another acronym), which is how every team I think I’ve ever been on has been managed without my ever knowing the term. “Objectives and Key Results” is an unfortunately jargon-heavy way to express an idea that I actually love, namely “Here’s where we think we’re going, and here’s how we’ll know if we’re going in the right direction.”

A trivial example. Say I want to get from my house in Alameda to my favorite taco place in Oakland, Xolo. “Get to Xolo” is my objective. There’s a myriad of ways I could check how close I am, and each one of those is a potential key result. I could carefully measure the odometer (metrics based), or I could know that I’m about a quarter of the way there when I hit the dog park, halfway there when I hit the tunnel, and roughly three-quarters of the way there when I turn on 12th St (milestone based). Take this simple idea and expand it to what your team or company cares about, and hopefully some of the chaos of running a team gets a little bit clearer.

“Making things clearer over time” could be a subtitle for the book, in fact. Grove lays out his material in such a way that every chapter has at least one idea I found immediately useful, although the later chapters on performance evaluation and especially hiring feel a touch outdated. This is the disadvantage of reading such a seminal book 30 years after its publication -- Grove’s ideas were so good we adopted many of them and kept iterating!

After reading High Output Management, I’m doubly indebted to Jacob KM and the others who recommended it to me. Once, because it gave me more tools in my management toolbox. Twice, because I now have an iron-clad recommendation for anyone who asks “what books should I read about being a manager?”. Grove’s book is near the top of that list.

Have you read High Output Management? Think I’m wrong in some of my thinking on it, or want to talk about strategies from the book that worked for you? Drop a note in the comments.
 


Senior Engineer -> Team Lead

by phildini on January 1, 2019


In November, I was promoted from Senior Software Engineer to Team Lead. As of right now, I lead the Platform team at Patreon, and have three software engineers who report to me. I want to talk about how I realized this was something I wanted, how I made this happen for me, and why I’m excited about it. 

First, how did I realize this was something I wanted? It’s helpful to know that I am mercilessly driven by the idea of impact. At least once a week, and sometimes multiple times a day, I ask myself these questions: “Am I working on the highest-impact thing I could be working on? If not, why not?” Being impact-driven (which dovetails nicely with being outcomes-driven for devotees of that brand of organizational thinking) means that I’m always looking to increase my impact, and especially looking for tools that will give me more leverage.

For the kind of impact I want to have, the impact to shape customers’ experiences and shape the paths of teams and shape the careers of individuals, the tools available to managers and team leaders are more impactful than the tools available to engineers. Now, I expect some disagreement to that statement, and I welcome discussion in the comments. For the goals I care about, the tools of a manager are higher impact than the tools of an engineer.

How did I make this happen for me? I saw an opportunity, and I pushed for it. That sentence masks a truly monumental amount of privilege, luck, and institutional biases that I think about a lot. Would a non-white, non-dude have been as successful in their push? What institutional biases might have kept those around me from pushing? So, there’s a lot hidden that I would love to go into in another post when I say: “I saw an opportunity, and I pushed for it.”

What opportunity? Well, Patreon is constantly working on becoming a leaner, more agile shop. As a result, in the Summer of this year an Engineering Director was leading the Platform team, at a critical time in the Platform team’s history. We were in the middle of trying to launch the Reddit integration, our Product Manager was on his way out to work full-time on being an author, and we had just hired a new grad engineer. I saw that there was an opportunity for a strong leader to step forward, so I did. What followed was a month and a half of me talking to peers, managers, directors, VPs, and HR folk, to see how feasible this was. I ended up writing my own job description for the role that I now occupy, and then vetting that job description in another round with the people I listed above. I began pushing in late September, and in early November my transition to Team Lead became official.

As an aside, why “Team Lead” and not “Engineering Manager”? For one, I’m still doing some engineering work, on the order of 40% of my time. This doesn’t mean “I’m coding 40% of my time” but it does mean I’m doing “Senior Engineer” things with 40% of my time, and “Team Lead” things with the other 60%. I’m also Team Lead for the team I was a Senior Software Engineer on, the Platform team at Patreon, and as a result my number of direct reports is fairly small compared to other Engineering Managers at Patreon.

This is probably obvious since I said above that I wrote the job description, but I am the first and so far only Team Lead at Patreon. There are others who I think would be great at the role, and this is one of the reasons I’m excited about it: Team Lead hopefully gives engineers a chance to explore management without “closing the door” on returning to engineering.

Why am I so excited? At the root is how much I am excited about Patreon, and especially my team at Patreon, and what I think the Platform team is capable of. I’d also be lying by omission if I didn’t mention a few other things: I’m excited by the challenges that management presents, and how different they are from engineering. I’m excited to learn about a whole new discipline of work that has so much impact on engineering, but is not strictly engineering. I’m excited to gain a new lens through which to see the world.

I also really hope I don’t fuck it up. I want to do right by myself, my reports, and the company, in roughly that order. If I’m not doing right by me, I can’t be an effective leader or example. If I’m not doing right by my reports, then my team as a whole suffers, and my measured output as a manager crumbles. If my team is unhappy or unproductive, then the company as a whole will suffer. The challenge of having to manage the stack of responsibilities, of having to coordinate conflicting demands, of figuring out how to build a team while meeting the needs of each person on the team is a challenge that I’m exceptionally excited about.

Want to know more? Want to challenge me on my thinking or ask questions about how I got here? Leave a comment below or ping me on mastodon at @[email protected].
 


Documentation for Life

by phildini on August 25, 2018


As I started writing this post, I got blocked by the dang title. I couldn't think up one, and so I started writing in the hope that one would come to me. 

It's been a long time since anything was published to this blog. It's not that I haven't been writing; if anything my volume of prose has gone up dramatically in the past year as I've started pushing for more and more documentation on the teams I work in and the projects I lead.

I think there are three reasons nothing has been published here in the past year.

For starters, there's an infinite bikeshed of possibility in running your own blog, powered by software you maintain. See something you want to fix? The bottomless rabbit hole is there, ready and waiting for you to fix it. It becomes nearly impossible to resist the siren song of everything you want to do in code to make the blog better, forgetting of course that the point of the blog is the content, not the chrome.

Secondly, Dunning-Kruger. I'm past the hump of thinking I know everything about the subjects I want to talk about, but I don't yet have the confidence to believe that my observations are valid. This position is, thankfully, changing: I'm starting to get some validation through my work at Patreon and with technical organizations that my experience and opinions are valuable, and worth adding to the collective conversation.

Thirdly, and perhaps most importantly, today especially, is that my own brain chemistry is acting against my best interests at the moment. There is quite a lot from the last six years of my life that I'm processing, and trying to heal, and covering on a regular basis in therapy. I am out of "fighting for survival" mode, and my brain is taking the break in constant survival stress to raise issues that I need to deal with, and which come with their own flavors of toxic brain chemistry.

So, what do? As I was writing this, and talking above about the fact that my prose output has actually increased, I had the realization that I value documentation to an obsessive degree, and that treating the posts in this blog as attempts at "documenting my life, and the experiences I have in it" might get me around some of those blocks listed above.

We'll see how it goes, wish me luck.

 


A Brief Guide to Locking Down Your Mastodon Account

by phildini on October 22, 2017


Mastodon is currently my favorite social network. I love it so much, I started my own server with some friends, and I'm proud to say it's still going strong. You can read about The Wandering Shop in my previous post about why I started it.

Part of the reason I love Mastodon and The Wandering Shop is that it's a social community where we get to define the rules, and we get to control who is and isn't allowed in our neighborhood. The other shopkeeper, Annalee, and I do a good job keeping out the riff-raff as per our Code of Conduct. That said, if you aren't on our server, or if you want a tighter grip on who you share with, Mastodon provides some of the most comprehensive options I've seen for privacy in a social network.

So here are 6 things you can do to lock down your Mastodon account.


1. Develop a good relationship with your server admins

While Mastodon provides some excellent options for blocking people and servers just for your account, involving your server admins will help keep bad actors and bad instances off everyone's feed, and help the neighborhood feel better as a whole. This is tougher on a large server like mastodon.social, but the admins there still try to respond to reports as they can. That "personal relationship" is one reason why I prefer the smaller servers.

2. Lock your account

The next steps in this guide are going to be found in your Mastodon preferences, which you can find under the "Gear" tab in the Mastodon web interface. This guide, and all the screenshots, assume your server is on Mastodon 2.0, which many servers have moved to by this point.

[Screenshot: How to lock your account in the Mastodon preferences]

In Mastodon, locking your account means that you must manually approve every follower. The Mastodon default is that anyone can follow anyone else, without approval. Enabling this setting will require action from you every time someone wants to follow you, but it also means no one can follow you without your permission. This is especially important if you want to...

3-4. Set privacy defaults on toots and unlist from search results

[Screenshot: How to change privacy settings and search result settings in Mastodon]

The default for toots that you post in Mastodon is "Public", meaning everyone can see them and re-toot them. The next level of privacy is "Unlisted", meaning anyone can see them if they go looking for them, or if they follow you, but they won't show up on the public timelines, like the "Local" feed or the "Federated" feed. The final level of non-direct-message privacy is "Followers-only". When a toot is followers-only, only your followers can see it, they CANNOT re-toot it, and it won't show up in any public feeds.

All of these options are available on a per-toot basis in every client I've seen, but if you'd like your toots to be more restricted by default, you can change that here. However you are most comfortable using Mastodon is the right way to use Mastodon, but it's worth noting that interesting toots in the public timelines are how people find other interesting people on Mastodon, and removing your toots from those feeds by default may limit how many people get to appreciate what you have to offer.

On this same preference page is "Opt out of search engine indexing" option, which will translate to your public profile and status pages not being crawled by search engines that respect things like robots.txt files.

5. Set up 2FA for your account

[Screenshot: How to set up 2FA on a Mastodon account]

This falls under "good internet hygiene": it's a good idea to set up two-factor authentication for your account, and Mastodon has made it easy to do so. Accounts getting hacked sucks; turning on 2FA makes that less likely.

6. Donate to Mastodon development and encourage more privacy features

Mastodon is created and run by volunteers, and you can help support the lead developer through the Mastodon Patreon Page. Additionally, suggestions for more privacy features come up all the time in the Mastodon Github, and you can help make them a reality by pitching in your time and expertise.


WordFugue is a blog run by phildini and oboechick. The best way to show your appreciation is to share this article with friends. The second best way is to donate to phildini's Patreon Page.


Finding Your Tribe or: Why You Should Join Me at DjangoCon

by phildini on June 26, 2017


“If you’re a programmer you should attend technical conferences to further your career.” Some variation of this was said to me so often when I was starting out as a writer of software that it became something like gospel. It became how I approached conferences; I was there to gain skills or a network that would help me further my career in some way, or further the interests of whoever my employer happened to be at the time. 
 
If you approach conferences with this mindset, I think you will be disappointed. I certainly was. And it took a couple years of going to conferences before I realized (with the help of my wife and some close friends, I should point out) that I had the most fun when I focused less on how any particular conference was going to further my career and focused more on making genuine connections with people, and focusing on topics I actually found exciting.
 
This makes sense to me when I step back to think about it. Writing software, even when you’re on a large project or part of a large team, can be a very lonely, isolating business. We spend most of our time in our own heads, building castles of imagination that we make real through code. Given the viral strains of imposter syndrome, burnout, and depression that run through our industry, it can feel incredibly difficult to reach out and make connections, to share our problems and commiserate even with our closest peers.
 
This is the strength of the best conferences for me. Yes, you will learn things at a good technical conference. You will be exposed to ideas and approaches to problems (both technical and social) that you maybe hadn’t thought of before. Delighting in learning is a totally valid reason to attend technical conferences, and part of why I attend so many.
 
But the primary reason for me is finding and reconnecting with my tribe. Technical conferences, especially in the Python community, are filled with some of the best and brightest people I’ve had the fortune of knowing, and, more than that, are filled with people who are kind, and willing to listen, and also want to connect with others in their community. I will tell you a secret: Many of the best and brightest, those you might be coming to a conference specifically to see speak, are coming because they also want to make those connections. They also want to reach out, commiserate, and find their tribe.
 
Now let’s talk about DjangoCon, specifically DjangoCon US which is coming up in August. PyCon is the big conference in our community, and it draws the biggest crowds. PyCon is excellent, and I enjoy going every year. I connect with people at PyCon that I basically don’t see for the rest of the year. But where PyCon is the big yearly reunion with the whole community, and can therefore be overwhelming, DjangoCon is the smaller gathering with friends. Where PyCon is, in many ways, a week-long festival for the Python community, DjangoCon is closer to an intimate dinner party, where you can hear more of each other’s conversations, and join in some incredible discussions.
 
If you’re still searching for a tribe, or want to reconnect with the Python and Django Community, and want to do so in an intimate gathering of friends, I hope you’ll consider attending DjangoCon this year. As an added bonus, you’ll get to hear myself and the other speakers give a frankly incredible lineup of talks. Seriously, I get excited just looking at it. 
 
Now, some people might be turned off by the fact that the conference is in Spokane. It’s a little out of the way, this is true, but this is one of the reasons I get excited about conferences: Chances to visit places I wouldn’t visit otherwise. I’ll also say that the best breakfast I ever had was in a small town in Washington, and I’m excited for the brunch game in Spokane.
 
If you’re still not sure that DjangoCon is where you’ll find your tribe, I direct you to the opening talk: “The Shy Person’s Guide to Tech Conferences”. DjangoCon is here for you, and we can’t wait to meet you. 
 
Hope to see you in Spokane.
 
P.S. About that “technical conferences will further your career” thing. Nothing has done more for my career, and my well-being as a human, than having a collection of real friends that I’ve met at conferences.
 



WordFugue is an independent collection of writings and ramblings, and we’ll never show you ads. If you want to support our work, consider donating to Philip’s Patreon, or buy a book from our Amazon affiliate store. In the spirit of DjangoCon, consider purchasing Two Scoops of Django, our favorite Django book.
 


Introducing Epithet

by phildini on June 13, 2017


There are many challenges to running an Open Source organization, but the one that I have personally felt the pain of again and again is that our tooling is awful. Github (and realistically we’re all using Github at this point) still feels in many ways like a tool designed around the idea that all the action is going to happen in one repo. This may not be entirely the fault of Github. Git itself is very tightly coupled to the idea that anything you care about for a particular action is going to happen in one, and only one, repository. 
 
When Github released Organizations, the world rejoiced, because we could now map permissions and team members in our source repository the way they were mapped in the real world. Every new feature Github adds to its Organizations product causes more rejoicing, because so many teams work across multiple repos, and the tooling around multiple repos is still awful.
 
The awfulness of this tooling is probably a strong factor in the current trend towards “microservice, monorepo” code organization, but that’s another post. 
 
I’ve been the equivalent of a core contributor for a half dozen Github organizations, and I’ve noticed that one area where the tooling is especially lacking is around labels. I’ve seen labels used to designate team or individual ownership, indicate the status of pull requests, signal that certain issues are friendly for beginners, and even used as deploy targets for chunks of code. It’s fair to say that labels form a core tool in the infrastructure of every team I’ve seen using Github, and yet the tooling Github exposes for labels is painfully lacking.
 
I could go on and on about this, but my goal here isn’t to necessarily make Github feel bad. I hope they’re working on better label tooling, and if they want ideas, boy am I willing to give them. But there is one label-specific wall I kept banging my head against, and that is label consistency across all the repos of an Organization. 
 
Some of you read that and feel remembered pain. I feel that pain with you, and we are here for each other. Some of you might have no idea what I’m talking about, so I’ll explain a bit more.
 
Let’s say you want to add a “beginner-friendly” label to all the repos in your Open Source Organization, so that new contributors can find issues to start with. Right now on Github, you would need to go into every repo, click into the Issues page, click into the Labels tab, and manually create that label. There are no “Org-wide labels”, and no tool for easily creating and updating labels across all the repos of an organization.
 
Until now.
 
Introducing Epithet, a Python-based command line tool for managing labels across an organization. You give it a Github key, organization, and label name, and it will make sure that label exists across all the repos in your org. Give it a color, and it’ll make the color of that label consistent across all repos as well. Have you decided you’re done with a particular label? Epithet can delete it from all your repos for you. Are you using Github Enterprise? Epithet supports that too.
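The core loop is simple enough to sketch. Here's a minimal, illustrative version of the idea, not Epithet's actual code: the endpoints are the real GitHub REST API label endpoints, but the function name and the injectable session are inventions of mine for the sketch.

```python
# Sketch: ensure a label exists, with a consistent color, in every repo
# of a GitHub organization. Hypothetical helper, not Epithet's real API.

API = "https://api.github.com"

def ensure_label(org, name, color, token, session=None):
    """Create or update label `name` (hex `color`, no '#') across `org`."""
    if session is None:
        import requests  # third-party; only needed for real use
        session = requests.Session()
    session.headers.update({"Authorization": f"token {token}"})
    # List the org's repos, then touch the label in each one.
    repos = session.get(f"{API}/orgs/{org}/repos").json()
    for repo in repos:
        labels_url = f"{API}/repos/{org}/{repo['name']}/labels"
        # Try updating an existing label first; create it on a 404.
        resp = session.patch(
            f"{labels_url}/{name}", json={"new_name": name, "color": color}
        )
        if resp.status_code == 404:
            session.post(labels_url, json={"name": name, "color": color})
```

Deleting a label everywhere is the same loop with a DELETE request, and pointing the base URL at your own host is roughly how the Github Enterprise support works.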
 
Epithet exists to fill a very particular need in open (and closed) source Github organizations, and it’s still pretty alpha. We use it for the BeeWare project, and it might be used soon for syncing labels in the Ragtag organization. You can start using it today by checking out the (sadly small) documentation, and if there’s a feature missing you’d like to see, I’m happy to work with you on getting a PR submitted.
 
Managing Open Source organizations is hard. My hope is Epithet makes it a little bit easier.


WordFugue is independent, and we will never run traditional ads. If you like what we're doing, consider donating to phildini's Patreon, or buy a book from our affiliate store. This week we're reading Patrick Rothfuss' "The Name of the Wind".

Special thanks to Katie Cunningham and Kenneth Love for reviewing this post.
 


Only We Can Save Pythonkind

by phildini on June 6, 2017


Python is the best technical community I’ve seen, and close to the best community I’ve seen at this scale. If you’ve been programming for any length of time, you’ve seen technologies and frameworks and languages rise and fall. We often bemoan the loss of certain ideas from these fallen works, but rarely talk about the communities that fell with them. Python is in many ways the most deliberate community that I’ve ever seen around a technology, and my life will be worse if it ever falls.

I think the Python Community is either near an inflection point, or right on top of one. What do I mean by that? I mean that, over the next five to ten years, I see two paths for the Python community and ecosystem. (Because “Python community and ecosystem” is long to type and read, I’m going to use “Python” to mean “the Python community and ecosystem” for the rest of this post.)

Path one, the one I hope we take, is the one where we take active steps to grow Python. It means that we are continuing to welcome new people into the community, from areas we never considered. It means we have a surplus of good, well-paying jobs for Pythonistas at every experience level. It means the companies and organizations creating those jobs recognize what Python gives them, and sponsor the ecosystem and community events to be better than ever.

Path two is the path I’m worried about. It’s the path where we expect Python to take care of itself, where we collectively take a more passive approach to the community that so many of us enjoy, and which has given much to many of us. I think this path results not in Python dying overnight, but in a slow decline, in Python becoming more and more irrelevant over time. It results in fewer Python jobs, and more Go or Node or “insert language here” jobs. It results in Python being pigeonholed into certain industries, and new Pythonistas being forced to learn some other language to start their career. It results in our major events slowly shrinking over time, and a time where we start counting down attendees instead of counting up.

I’m not going to try too hard to convince you that this is where we are, that we are at or close to a fork in the road. It’s what I believe, and I think some of you might agree already, but here are some of the things I’ve noticed that make me think we’re close to such a point.

  • PyCon 2017 was fantastic, and had more attendees than ever, but had noticeably fewer booths in the expo hall than last year, and I believe fewer sponsors overall.
  • Other Python and Django conferences, especially the smaller regional conferences, are finding it harder and harder to get sponsors. Some of this is the market tightening, some of this is companies moving out of Python, or not feeling like they get a return on their investment.
  • More programs and code schools are using Python as their teaching language, but for many the entry-level positions just aren’t there. Some of this is, again, the market not hiring entry-level, and some of this is the companies we work for not being willing to take risks and train.

Based on the above, and some other feelings and anecdotes, I think we’re right on top of the fork in the road. So what do we do about it? We take deliberate actions to help grow Python. Here’s what I’m planning to do over the next year:

  • Running for the PSF Board of Directors. Why do I think being on the Board is important in the context of this post? Because I can push for growth at the Python organization level, and I can get things done as a Board member that I can’t get done as a non-Board member of the PSF. Anyone reading this can, and should, run for the Board if they feel so inclined. But I’d also love to see more participation in the PSF committees, especially along the lines of fundraising and outreach. No matter the outcome of the election, I’m going to continue my work on the Sponsorships committee, and keep doing the other things on this list.
  • Reaching out to University Computer Science departments about using Python. I’m already in the process of arranging a guest lecture with classes in my old CS department about life as a professional Software Engineer. I’m planning to add specifics about how I use Python (which is more and more the introductory teaching language) in my professional life. My hope is I can help connect classroom lessons to professional Python just by showing up and giving a small talk.
  • Reaching out to University Science departments about Python. If the keynotes at PyCon 2017 taught us anything, they taught us that Python is an incredible resource in research science departments, statistics departments, anywhere deep thinkers need to do computation and visualization. I’m hoping to put together a “Python in Science” roadshow to help with this, but the reality is Software Carpentry is years ahead of me in making this happen, and anything we can do to help them is almost certainly worthwhile.
  • Being a Core Contributor to the BeeWare project. Python has great stories around developing web applications, working in the sciences, and doing systems tasks. Our stories around developing consumer apps are lacking, and I don’t think they need to be. BeeWare, and many others, are taking a stab at filling this gap, but for you reading this the action item could be “find a Python project in an area you care about, and work at making it the best it can be.”
  • Volunteering time to get more companies and projects started in Python. This one is more nebulous, and I haven’t done it yet but plan to soon. I’m planning to reach out to VCs and incubators and especially hackathons and say “Here’s my background, I’m happy to show up to any event and donate my time to help, but I’m only going to help with Python.” I don’t know how this is going to go over, but this idea has some exciting potential. If we want more jobs in Python, we need to be pushing for more companies and projects to use Python, right from the beginning. 

If any of these ideas seem interesting to you, feel free to copy them! If they seem interesting but daunting, feel free to reach out to me ([email protected]) to chat about them. If these ideas inspired your own ideas in a different direction, great! Tell me about what you’re doing and I’ll share it far and wide. My goal in listing these ideas isn’t to toot my own horn, but to start a conversation about methods for Python outreach, in the hope of growing Python.

Of course, I could be wrong in my beliefs. (I’d actually love to be corrected with stories or data that show I’m wrong, and would happily share them here.) What if Python is healthy, and is going to grow consistently over the next decade?
 
Then I’d still do everything I’m planning to do, and encourage others to do the same. I think everything we pour into the Python community is valuable, and any new Pythonista we bring in enriches us all in ways we can’t possibly anticipate.
 
If I’m wrong, and we make Python better for no reason, we’ll still have a better Python.


Working to Ensure the Future of the Python Community

by phildini on May 25, 2017


I'm entering the arena for the third year. I'm running to be a Director for the Python Software Foundation. This post will help explain why.

There's an argument that anything I write here should instead be in my candidate statement. I don't disagree, and the reason I'm writing here instead of there relates to one of the things I'd like to change: Nominating for the Board of Directors requires editing the Python Wiki. The Python Wiki is hard to use, the documentation on it is not well-exposed, and a room full of Pythonistas during the PyCon sprints (including one current Board member) couldn't tell me who maintains it. Beyond that, you have to answer somewhat-esoteric Python trivia to submit your edits.

I'd like a clear definition of what the Board and the community thinks the Wiki is for, and regular check-ins on whether it's serving the community well. I'd like to see us running "PSF Sprints", where Board members or anyone else interested is writing documentation about how the PSF is run. Our election processes and funding processes and budget processes and outreach processes should be checked on regularly, and I'll be pushing for more transparency and openness about how we run the business side of Python.

Speaking of outreach, I'd like the PSF to be doing more of it, and funding groups who are growing the Python community. There will be more about this in a future post, because I have plans on how to get our community of Pythonistas back out into the world growing the community at universities and hackathons and incubators and corporations. I want every group and individual trying to grow Python to know that the PSF has their back, and will put money behind them. 

I also want to be on the Board to remind the PSF that they have power beyond grant giving. Yes, the majority of what the PSF Board has done in recent years has been giving grants to organizations around the world. That work is excellent, and I want to see it increase. But the PSF is also in a unique position to be a promotion clearing house and force multiplier for good ideas in the community. When good learning materials are written, they should be easily findable from the official Python websites. When Python events are being held, the PSF should be a cheerleader, spreading the word about what's happening in the community.

These are the things I plan to do as a PSF Director to help grow Python. I haven't even gotten into the investment I want to see us putting into our core tools and platform infrastructure; that will have to be another post and my brain is a little fried from PyCon. 

So the only question left is: Why do I need to be on the Board to do these things? And the answer is: I don't. These are things I'm going to push for no matter what. But the PSF is in many ways the voice of the community, and I want to see that voice brought to bear on the issues that will be affecting our community for the next year and the next decade. I think I can help use that voice to speak for the Pythonistas of the future, and I hope you agree.


Thoughts on PyCon 2017, Day 1

by phildini on May 19, 2017


Here we are at the end of the first conference day of PyCon. Thinking over the day, and including thoughts from the opening reception last night, I'm struck by something that is even more true this year than it was last year:

The Python community is incredible. We are at an inflection point where we need to be making measured, conscious decisions to keep Python and its community thriving. 

I'm going to be writing even more about this in the coming weeks, but let me jot down some observations, and then try to sum them up at the end:

1. It was pretty obvious to anyone who had been here last year that there were fewer sponsor booths in the expo hall. Noticeably fewer. Speaking to someone who had a booth last year and chose not to this year, there was perhaps a level of disgruntlement with the organizers that caused them to skip this year. That's troubling. What's even more troubling is talking to conference organizers for other Python and Django conferences about how it's gotten harder this year to find sponsors.

2. Jake VanderPlas' keynote this morning highlighted some areas where Python is making incredible inroads, and showcased how Python is becoming the de facto tool in many areas of the science community. They choose Python for many reasons, and one of them is:

[embedded tweet]

Or, to put it more succinctly:

[embedded tweet]

These thoughts really resonated with the audience, based on the number of likes and retweets I got. And I think these are sentiments the community at large shares. We don't (necessarily) choose Python because it's the fastest language on the planet. We choose it because we like working in the language and we love the community that comes with it.

3. Speaking of that community, I didn't get a chance to see as many of the talks as I would like, because I spent so much time chatting with people about fascinating topics in the hallway track. The hallway track continues to be one of the best parts of PyCon, and it was especially noticeable this year that people were being encouraged to participate. One of the amazing things about PyCon is that all the talks are recorded and put online for free, sometimes within hours of their being given, so attending a talk can often be considered secondary to meeting interesting people in the hallway.

4. This is even more pure anecdote than (3), but it felt like I heard of more people finding it harder to get jobs in Python building web applications, and easier in things like data or science. I can't prove this is true, and it might not be all bad, but it's something to watch. Any area where it's suddenly harder to find work in Python means a pillar of our community is weakening, and we should be aware of it.

5. The day ended with lightning talks, and I hope everyone in the audience saw Cameron Dershem's talk about what the Rust community is doing better than the Python community, especially when it comes to improving usability of the language and making it easier to contribute. Furthermore, I hope it was a call to arms for all of us to start pushing for making every aspect of our community feel welcoming, and like new people can make a difference.

Summing up: Python's community still feels like home, to me and many others, and PyCon feels like a homecoming. If we want to make sure the community continues to be incredible, we need to keep an eye on trends in where people and companies are using Python. We also need to continue to be excellent to each other, whether the person we're talking to has been here for years or just learned about Python today.

I'm going to keep pushing to make Python better, and I look forward to seeing you all at PyCon tomorrow.


Keeping Shop

by phildini on May 8, 2017


There is a recurring villain in the Terry Pratchett novels called The Auditors. They show up over a number of books as the adversary of Death, and make one of their most daring ploys in Thief of Time, which I also consider in the top five of the Discworld canon.

The Auditors, which I describe in the singular because it thinks of itself as one entity, is one of the most insidious adversaries in the Discworld because it is not evil, or even mean-spirited. It simply wishes there was more order to the universe, and would prefer all life to stop because life is so disorderly.

One of the best descriptions of the existence of The Auditors begins thusly:

Nine-tenths of the universe is the knowledge of the position and direction of everything in the other tenth.  Every atom has its biography, every star its file, every chemical exchange its equivalent of the inspector with a clipboard.  It is unaccounted for because it is doing the accounting for the rest of it, and you cannot see the back of your own head.
Nine-tenths of the universe, in fact, is paperwork.

This phrasing has stuck with me recently in regards to critique of the internet and online communities. In many ways, the internet is its own auditor, its own bookkeeper; its existence is the record of its existence. And yet, thousands upon millions of words have been written explaining more about the communities and worlds that make up the internet. Entire libraries could be filled with the printed pages of digital commentary on Twitter and Facebook and all manner of internet forums that have come before.

One of those pieces of internet critique, which I will not link to because I do not wish to drive traffic to it, rankled me, to the point where I felt a need to counter it. But to counter it properly, we need to establish some background. Carl Sagan once said "If you wish to make an apple pie from scratch, you must first invent the universe."

And so must we, but the universe we're inventing is The Fediverse, specifically one small corner of it called The Wandering Shop. This is the story of how The Wandering Shop came to be.


Twitter is awful. This is the sentiment that so many of us, myself included, have felt in the past months and years. Its product decisions are opaque, its community vacillates between cynicism and bigotry on a daily basis, and you can never shake the feeling that you're trying to have an intimate conversation while yelling at the top of your lungs in the public square.

Enter Mastodon, a piece of Open Source software (which in this case, as in most cases of Open Source, means software anyone can edit and everyone has opinions about which they'll happily go to the barricades for). Mastodon has its own fascinating and complex history, involving multiple internet generations of Open Source and Free Software activists, but it is, in essence, a piece of software that behaves much like Twitter that you can host yourself. This definition is untrue by almost every measure, but it aids wonderfully in understanding.

Two crucial facts about Mastodon are true, and possibly beautiful, making them True again: 1) You can host it yourself, meaning you have full ultimate control over the canonical copy of the data. 2) Anyone who joins your instance, your little slice of the network of servers, is subject to your rules, and your community decisions.

One of the things that Twitter got wrong, in my opinion and the opinion of other heavy Twitter users, is that it thought the only way to grow and to have a userbase committed enough to fill the gaping maw of Venture Capital was to allow free speech to be the default. It's truly mind-boggling how, when unfettered free speech isn't allowed in the classroom, the workplace, on public television or radio, or in most human relationships, the developers of communication tools for humans think that free speech with no rules (or poorly enforced rules, amounting to the same thing) will prompt quality conversations.

Mastodon, from the outset, made no bones of the fact that the owners of the instances could mute or block or kick off individuals who broke that instance's community guidelines. They could even block or mute whole other servers, allowing the instance owners to set their own rules and sanctions.

This is what interested me, when I first read the post from Eugen Rochko that kicked off Mastodon for most of us. I joined immediately, and loved the growing community I found there. Mastodon replaced Twitter in my life so quickly that at times it almost felt like whiplash. My fingers would reach to open Twitter of their own accord, and when my mind returned from wherever it had been I would realize what I was doing, close Twitter, and open Mastodon.

I'm forever grateful to that muscle memory, however, because without it the Wandering Shop would not be. It was during one of those brief unconscious Twitter checks that I saw my friend Annalee suggesting the idea of a moderated Scifi/Fantasy Mastodon instance. You can still see the thread here. We got to chatting, first over Twitter, then over email, and days later The Wandering Shop was born.

As you can see from that thread, a strong Code of Conduct was part of the Shop's DNA from the beginning. Many of us, myself included, feel like The Wandering Shop is a real, shared coffee shop that happens to mainly exist in our minds, and so it makes sense to me to say that the guidelines and code of conduct are built into the very walls of the Shop.

Annalee and I wanted to build a community that was open to all, but deliberate in what was and was not acceptable behavior. To the best of my knowledge, we have yet to act on poor behavior from the Shop's patrons (those who have registered their accounts at The Wandering Shop), but we have muted or blocked accounts and instances that we thought were making life in the Shop worse. Additionally, we've put effort into deliberate community building. We made a weekly calendar to encourage conversations, and we try to generally be available as part of the community.

This is, for me, the power of Mastodon, when run deliberately: You can build online communities like neighborhoods, full of people you enjoy sharing the sunset with, and still be connected to the rest of the city down the road. I can honestly say The Wandering Shop is in the top three online communities I've ever been a part of. Twitter and Facebook don't even make top ten.


The article that came out today, that fired me up enough to write this piece, was describing why an instance admin was shutting down his Mastodon instance.

I'm paraphrasing here, but the gist was: "I set up my instance as a place with no rules, where anyone could come and be who they wanted to be, and say what they wanted to say. I was stunned and saddened at the abuse and horrific imagery I had to encounter when dealing with this instance, and so I am shutting it down before it gets me in real trouble."

There is a part of me, a small part, that is always reaching out to connect with other human beings. I don't think anyone could try so hard to start online communities and not have a portion of yearning in that direction. I feel for this man, who tried an experiment and had it go so wrong.

But mostly I look at what is possible when thoughtful care and attention is taken towards creating a deliberate community, and a spark goes off behind my eyes. I think of the world we're building at The Wandering Shop, and compare it to this person's dismissal of the whole concept, and that spark quickly becomes the heart of a forge.

At the XOXO Festival in 2015, Eric Meyer spoke about building the kinds of online communities we want to live in. We're trying to build a deliberate community at The Wandering Shop. I hope you'll join us, if you feel so inclined.

You can support The Wandering Shop by donating to our Patreon.


Point's Problems a Product of the Presidency?

by phildini on May 5, 2017


This originally appeared as a Letter to the Editor for the Alameda Sun on May 4th, 2017.

The recent article regarding the slowdown with the Alameda Point Site A project (“Point’s Developer Reaches Impasse,” April 27) does an excellent job of explaining the current state of affairs between Alameda Point Partners and the City of Alameda, but doesn’t go into a possible cause for lack of subcontractor bids and increased construction costs: uncertainty around our economic future and the effects of the current national administration.

We have already witnessed adverse effects with other projects in Alameda. As the national tax structure becomes less certain, as our economic future seems to change day-to-day, organizations across the board are becoming spooked and less willing to spend on economic growth.

The fact remains that Alameda is seeing its largest homeless population in years, that the number of empty storefronts on Park and Webster streets keeps growing, and that there is no guarantee the national climate will change for our benefit over the next four years. I hope the City Council keeps in mind how desperately we need the housing and economic growth this project will bring when they make their decision in July.


Things Can Only Get Better

by phildini on May 4, 2017


This post is a long time coming. A much longer time coming than I intended, actually. It's been almost a year since any post was made to this site, and it's just been sitting here, ticking away. I've changed jobs, the country has changed Presidents, the world and I have changed in so many immeasurable ways.

There is more to say here, and trust me that I'm going to be writing here even more. Just the act of writing is helping to center me right now, and I'm coming to think that maybe I need to do it for my own survival.

Along those lines, with my dear friend Annalee (who helps run The Bias) I have started a small community for writers, specifically Science Fiction and Fantasy writers. Think of it as a Twitter for Writers, with an excellent community and a strong Code of Conduct. It lives at the Wandering Shop, and you're cordially invited to join us. The community there is part of why this post exists, and why I'm trying to resurrect this site. They inspire me to write every day.

More will come. It's quite honestly been pretty rocky for me over the past year, but things can only get better.


My Capstone for the Data Science Immersive at General Assembly

by oboechick on October 14, 2016


I started my capstone project by looking for some way to look at the last reports from Trends in International Mathematics and Science Study (TIMSS). This is a group that looks at the way that math is taught worldwide and creates studies to see the best way to teach math so that the students remember it years after they've stopped going to school. TIMSS does this for science as well, but I am less familiar with that branch of the group.

While I was looking for this data I read an article that summarized results from the Organisation for Economic Co‑operation and Development Programme for International Student Assessment (OECD PISA). I discovered that this was one of the umbrellas under which TIMSS published its results. So I dove in and collected the data.

PISA is a group that gives assessments to students ages 15 years 3 months to 16 years 2 months from about 70 different countries. The assessments determine how literate the students are in math, science, language, and finances (starting in 2012). This assessment is given every 3 years starting in 2000. I was able to collect the survey scores from the 2012 assessments.

This data came in the form of over 300 different Excel spreadsheets; about 250 of them had more than one sheet in the file. I decided that I would start by importing all of the spreadsheets into pandas dataframes and clean all of them with one function. I then followed these steps.

  1. Pulled all of the Excel files into pandas dataframes
  2. Left out the first row because it was irrelevant
  3. Combined the 2nd and 3rd rows to make the headers more descriptive and understandable
  4. Wanted to merge all dataframes together using the country column as the index, but ran into a problem: how do I know which question goes with which header?
  5. Added the question ID and sheet name to the beginning of every header
  6. Merged all the data
  7. Got a dataframe with 65 rows and 4,095 columns
  8. Attempted to use logistic regression.
    1. Used the countries as the target y.
    2. Used headers containing the words “none”, “once”, “twice”, “four”, and “five” as my X features.
    3. Got 1,194 columns in my X and could not get the model to work
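The header cleanup in steps 3 and 5 might be sketched in pandas like this (the question ID, sheet name, and toy data below are hypothetical stand-ins, not the real PISA files):

```python
import pandas as pd

def flatten_headers(df: pd.DataFrame, question_id: str, sheet: str) -> pd.DataFrame:
    """Collapse a two-row header into one descriptive column name,
    prefixed with the question ID and sheet name (steps 3 and 5)."""
    out = df.copy()
    out.columns = [
        f"{question_id}|{sheet}|{top} {sub}".strip()
        for top, sub in out.columns
    ]
    return out

# Toy frame standing in for one PISA sheet; real files would come from
# something like pd.read_excel(path, skiprows=1, header=[0, 1])
raw = pd.DataFrame(
    [["Albania", 1.2], ["Australia", 3.4]],
    columns=pd.MultiIndex.from_tuples(
        [("Country", ""), ("Used a computer", "none")]
    ),
)
clean = flatten_headers(raw, "IC01", "Sheet1")
# clean.columns → ['IC01|Sheet1|Country', 'IC01|Sheet1|Used a computer none']
```

For step 6, the flattened frames could then be merged pairwise on the country column (e.g. with functools.reduce over pd.merge), provided the country header is kept consistent across files.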

There were many problems with the way that I went about this. Apparently not all of my headers got cleaned, half of the columns in my features list were click logs rather than assessment results, and I had click logs and survey results mixed together in one dataframe.

What did I learn from this? 

Find a dictionary of the data or create one before you try to analyze it and take your time familiarizing yourself with the data. In the end it will take less time if you don't have to go back and correct things after you have done all the hard work. 


Thoughts on the PSF, Introduction

by phildini on June 14, 2016


The Python Software Foundation (PSF) is the non-profit that owns python.org, helps run PyPI, and makes sure PyCon happens. This is the introduction to a series of posts that will discuss some challenges that face the PSF and community as a whole, as well as some suggested solutions.

The big idea underlying all the little ideas in the following posts is this: The Python community is a unique and incredible community, and it is a community that I want to see grow and improve.

Python is full of welcoming, caring people, and the Python community has shown over and over that it is not content to rest on any past good deeds, but is continually pushing to be more welcoming and more diverse. It was an incredibly powerful symbol to me that I spoke with multiple people at PyCon who don’t currently use Python for their jobs, but come to PyCon to be a part of the community. When I find people who want to get into programming, I point them at Python partially because I think the language is more beginner-friendly than most, but mostly because I know the community is there to support them.

The only qualification I claim for this series is caring deeply about this incredible community. If you want to learn more about my background, check out the about page. The ideas that I’m going to be presenting are a combination of my own thoughts, and conversations I’ve had at various conferences, and in IRC channels, and on mailing lists. I’m not claiming to be the most qualified to speak on these things.

I have no real desire to critique the past. My goal is to start a conversation about the PSF’s future, a future which hopefully sees the PSF taking an even bigger role in supporting the community. To that end, there are three things that I think we should be talking about, which I’ll discuss over the next three posts.

  • Strengthening the Python ecosystem
  • Encouraging new adoption of Python and new Python community members
  • Supporting the existing Python community

If you are inspired to start these conversations, comments will be open on these posts, although I will be moderating heavily against anything that devolves into attacks. Assume the PyCon Code of Conduct applies. I would be thrilled if these posts started discussion on the official PSF mailing lists, or in local user groups, or among your friends. 

In the upcoming post, I’ll talk about challenges that face the Python ecosystem. I’ll talk about support and maintenance of the Python Package Index, why it should matter tremendously to the Python community, and what the community and the PSF could be doing to better support PyPI and package maintainers. Sign up for our mailing list to hear about the next post when it’s published.


My Open Source Workflow

by phildini on June 7, 2016


I think people have an impression that I make lots of contributions to Open Source (only recently true), and that therefore I am a master of navigating the steps contributing to Open Source requires (not at all true).

Contributing to Open Source can be hard. Yes, even if you’ve done it for a while. Yes, even if you have people willing to help and support you. If someone tries to tell you that contributing is easy, they’re forgetting the experience they’ve gained that now makes it easy for them.

After much trial and error, I have arrived at a workflow that works for me, which I’m documenting here in the hopes that it’s useful for others and in case I ever forget it.

Let’s say you want to contribute to BeeWare’s Batavia project, and you already have a change in mind. First you need to get a copy of the code.

 

Image of arrow pointing to the "fork" button on the Batavia repo.

 

I usually start by forking the repository (or “repo”) to my own account. “Forking” makes a new repo which is a copy of the original repo. Once you fork a repo, you won’t get any more changes from the original repo, unless you ask for them specifically (more on that later).

Now I have my own copy of the batavia repo (note the phildini/batavia instead of pybee/batavia).


Image of an arrow pointing at the batavia repo name on phildini's GitHub account.

 

To get the code onto my local machine so I can start working with it, I open a terminal, and go to the directory where I want the code to live. As an example, I have a “Repos” directory where I’ve checked out all the repos I care about.

cd Repos
git clone git@github.com:phildini/batavia.git

This will clone the batavia repo into a folder named batavia in my Repos directory. How did I know what the URL to clone was? Unfortunately, GitHub just changed their layout, so it’s a bit more hidden than it used to be.

 

The GitHub clone popup

 

Now we have the code checked out to our local machine. To start work, I first make a branch to hold my changes, something like:

git checkout -b fix-class-types

I make some changes, then make a commit with my changes.

git commit -av

The -a flag will add all unstaged files to the commit, and the -v flag will show a diff in my editor, which will open to let me create the commit message. It’s a great way to review all your changes before you’ve committed them.

With a commit ready, I will first pull anything that has changed from the original repo into my fork, to make sure there are no merge conflicts.

But wait! When we forked the repo, we made a copy completely separate from the original, and cloned from that. How do we get changes from the official repo?

The answer is through setting up an additional remote server entry.

If I run:

git remote -v

I see:

origin	git@github.com:phildini/batavia.git (fetch)
origin	git@github.com:phildini/batavia.git (push)

Which is what I would expect -- I am pulling from my fork and pushing to my fork. But I can set up another remote that lets me get the upstream changes and pull them into my local repo.

git remote add upstream git@github.com:pybee/batavia

Now when I run:

git remote -v

I see:

origin	git@github.com:phildini/batavia.git (fetch)
origin	git@github.com:phildini/batavia.git (push)
upstream	git@github.com:pybee/batavia.git (fetch)
upstream	git@github.com:pybee/batavia.git (push)

So I can do the following:

git checkout master
git pull upstream master --rebase
git push origin master --force
git checkout fix-class-types
git rebase master

These commands will:

  1. Check out the master branch
  2. Pull changes from the original repository into my master branch
  3. Update the master branch of my fork of the repo on GitHub.
  4. Checkout the branch I’m working on
  5. Pull any new changes from master into the branch I’m working on, through rebasing.

Now that I’m sure my local branch has the most recent changes from the original, I push the branch to my fork on GitHub:

git push origin fix-class-types

With my branch all ready to go, I navigate to https://github.com/pybee/batavia, and GitHub helpfully prompts me to create a pull request. Which I do, remembering to create a helpful message and follow the contributing guidelines for the repo.

That’s the basic flow; now let’s answer some questions.

Why do you make a branch in your fork, rather than make the patch on your master branch?

  • GitHub pull requests are a little funny. From the moment you make a PR against a repo, any subsequent commits you make to that branch in your fork will get added to the PR. If I did my work on my master, submitted a PR, then started work on something else, any commits I pushed to my fork would end up in the PR. Creating a branch in my fork for every patch I’m working on keeps things clean.

Why did you force push to your master? Isn’t force pushing bad?

  • Force pushing can be very bad, but mainly because it messes up other collaborators’ histories, and can cause weird side effects, like losing commits. On my fork of a repo, there should be no collaborators but me, so I feel safe force pushing.  You’ll often need to force push upstream changes to your repo, because the commit pointers will be out of sync.

What if you need to update your PR?

  • I follow a similar process, pulling changes from upstream to make sure I didn’t miss anything, and then pushing to the same branch again. GitHub takes care of the rest.

What about repos where you are a Core Contributor or have the commit bit?

  • Even when I’m a Core Contributor to a repo, I still keep my fork around and make changes through PRs, for a few reasons. One, it forces me to stay in touch with the contributor workflow, and feel the pain of any breaking changes. Two, another Core Contributor should still be reviewing my PRs, and those are a bit cleaner if they’re coming from my repo (as compared to a branch on the main repo). Three, it reduces my fear of having a finger slip and committing something to the original repo that I didn’t intend.

That’s a good overview of my workflow for Open Source projects. I’m happy to explain anything that seemed unclear in the comments, and I hope this gives you ideas on how to make your own contribution workflow easier!


Tips for Becoming a Core Contributor

by phildini on June 5, 2016


During the PyCon 2016 Sprints, I was made a Core Contributor to the BeeWare project, and was given the ‘commit bit’ on Batavia, an implementation of the Python virtual machine written in Javascript. A friend of mine who works with the PDX PyLadies and regularly encourages people to contribute to Open Source saw this, and asked that I write a blog post on becoming a Core Contributor to Open Source projects.

It’s true that, for many projects, how you become a Core Contributor can seem mysterious. It often seems unclear what a Core Contributor even does, and it doesn’t help that each Open Source project has a slightly different definition of the responsibilities of a Core Contributor.

So this deliberately isn’t a “How to Become a Core Contributor” guide. It would be impossible to write such a guide and be definitive. This is me trying to reverse engineer how I became a Core Contributor on BeeWare and then extracting out things I think are good behaviors for getting to that stage.

How I Became a Core Contributor to BeeWare:

  1. Met Russell Keith-Magee at DjangoCon EU 2016, where he spoke about BeeWare and Batavia.

  2. Chatted with Russell about BeeWare, sprinted some on Batavia at DjangoCon EU 2016.

  3. Saw Russell and Katie McLaughlin at PyCon 2016, chatted more about BeeWare with both of them, joined the BeeWare sprint.

  4. Recognized that BeeWare had some needs I could fill, namely helping onboard new people and reviewing Pull Requests.

  5. Asked Russell for, and received, the ‘commit bit’ on the Batavia project so I could help review and merge PRs.

Tips I Can Give Based on My Experience:

  • Be excited about the project and the project’s future. I think the whole BeeWare suite has amazing potential for pushing Python to limits it hasn’t really reached before, and I want to see it succeed. A Core Contributor is a caretaker of a project’s future, and should be excited about what the future holds for the project.

  • Be active in the community. Go to conferences and meetups when you can, join the mailing lists and IRC channels, follow the project and the project maintainers on Twitter. I met Russell and Katie at a conference, then kept in touch via various IRC channels and Twitter, then hung out with them again at another conference. Along the way, I was tracking BeeWare and helping where I could.

  • Be friendly with the existing project maintainers and Core Contributors. It’s less likely I would be a Core Contributor if I wasn’t friends with Russell and Katie, but the way we all became friends was by being active in the community around Python, Django, and BeeWare. One way to figure out if you want to be a Core Contributor on a project is to see which projects and project maintainers you gravitate towards at meetups and conferences. If there’s a personality match, you’re more likely to have a good time. If you find yourself getting frustrated with the existing Core Contributors that’s probably a sign you’ll be more frustrated than happy as a Core Contributor to that project. It’s totally fine to walk away, or find other ways to contribute.

  • Focus on unblocking others. I still make individual code contributions to BeeWare projects, but I prioritize reviewing and merging pull requests, and helping out others in the community. From what I’ve seen, a Core Contributor’s time mainly goes to triaging issues in the issue tracker, reviewing patches or pull requests, and helping others. It’s only when everyone else is unblocked that I start looking at my own code contributions.

  • Have fun. I asked to become a Core Contributor to BeeWare because I enjoy the community, enjoy Russell’s philosophy on bringing on newcomers, and think the project itself is really neat. If you’re having fun, it’s obvious, and most Core Contributors want to promote the people who are on fire for a project.

My hope is that I have made becoming a Core Contributor to an Open Source project seem achievable. It is completely achievable, no matter your current skill level. There’s a lot more detail I didn’t cover here, and I can’t promise that if you do all these things you’ll become a Core Contributor, even on the BeeWare project. When you ask to become a Core Contributor to a project, the existing project maintainers are evaluating all kinds of things, like how active you are, how well you might mesh with the existing team, and what existing contributions you’ve made to the project and the community. It might not be a great fit, but it doesn’t mean you’re not a great person.

What I can say is that being a Core Contributor is work, hard work, but incredibly rewarding. Seeing someone make their first contribution, and helping shepherd that contribution to acceptance, is more rewarding for me than making individual contributions. Seeing a project grow, seeing the community grow around a project, makes the work worth it.

If you have questions about my experience, or about contributing to Open Source in general, I'm happy to answer them in the comments, on twitter @phildini, or via email at [email protected].


Self-Importance

by phildini on May 13, 2016


Originally, I was going to start this post with:

Humans have a tendency to over-attribute our own importance.

But then I realized by starting the post that way, I was being incredibly guilty of the very thing I was saying. I mean, read that sentence again. I was getting ready to start a blog post by pretending to speak for all of humanity. That's like god-level delusions of self-importance there. So. Let's try one more time.

I have a tendency to over-attribute my own importance. This manifests itself most often in thinking that the way people act around me has something to do with me. I'll meet a friend on the street, or have an interaction with someone at work, and if it doesn't go the way I'm planning, or they seem upset, the conclusion I'll immediately jump to is that I did something wrong, or that they don't like me. On the one hand, this seems like a form of social anxiety, that I'm trying to please all the people around me in an attempt to make and keep friends. And I'm not saying I don't have that going on, and it's a struggle that's being fought in my head a whole lot of the time, but let's take a step back. 

How egotistical do I have to be to start by thinking that I am the sole driver of how someone else behaves?

It is totally possible that I am doing something or saying something to cause these weird social interactions, but in order to be fair, to treat the other person as a f%&$ing human being with some measure of agency in their own lives, I need to allow that at least fifty percent of their reaction to any given situation comes from what's going on in their heads, and has nothing to do with me at all. I say "everyone is the hero in their own story" so often that it's almost a damn catchphrase, but when it comes to dealing with the people in my own life I rarely stop to think through what that means.

If I am doing my best to be a decent human being, and treating the people around me with respect, then whether or not any given interaction goes well is basically out of my control. I should, we all should, be trying to treat other people with a baseline of respect, and not attributing to malice that which can be described by ignorance (excepting blatant -isms. F--- you HB2!), but I should also remember the flip side: Sometimes people have bad days, or don't like me, and that's not always my fault or under my control.

To think otherwise is pure ego.

Looping back to how I was going to start this post, I think I'm not the only one who has trouble with this. A theme among people I talk to, especially people who live on the internet, is that they attribute good social interactions to the other person, and take all the blame for the bad interactions on themselves. That is self-loathing, and self-importance, and I hope I can remember to do better should we ever meet (again).

tl;dr: I should examine my words and actions to make sure they meet my own standards, and remember that people are entitled to their own lives and reactions.

 


Using Django Channels as an Email Sending Queue

by phildini on April 8, 2016


Channels is a project led by Andrew Godwin to bring native asynchronous processing to Django. Most of the tutorials for integrating Channels into a Django project focus on Channels' ability to let Django "speak WebSockets", but Channels has enormous potential as an async task runner. Channels could replace Celery or RQ for most projects, and do so in a way that feels more native.

To demonstrate this, let's use Channels to add non-blocking email sending to a Django project. We're going to add email invitations to a pre-existing project, and then send those invitations through Channels.

First, we'll need an invitation model. This isn't strictly necessary, as you could instead pass the right properties through Channels itself, but having an entry in the database provides a number of benefits, like using the Django admin to keep track of what invitations have been sent.

from django.db import models
from django.contrib.auth.models import User


class Invitation(models.Model):

    email = models.EmailField()
    sent = models.DateTimeField(null=True)
    sender = models.ForeignKey(User)
    key = models.CharField(max_length=32, unique=True)

    def __str__(self):
        return "{} invited {}".format(self.sender, self.email)

We create these invitations using a ModelForm.

from django import forms
from django.utils.crypto import get_random_string

from .models import Invitation


class InvitationForm(forms.ModelForm):

    class Meta:
        model = Invitation
        fields = ['email']

    def save(self, *args, **kwargs):
        self.instance.key = get_random_string(32).lower()
        return super(InvitationForm, self).save(*args, **kwargs)
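As an aside, the key generation above leans on Django's get_random_string. A rough plain-Python equivalent (an illustrative sketch using the stdlib secrets module, not what the form itself does) looks like this:

```python
import secrets
import string

def make_invite_key(length=32):
    # Roughly mirrors get_random_string(32).lower(): a random
    # lowercase alphanumeric key, suitable for use in an invite URL.
    alphabet = string.ascii_lowercase + string.digits
    return ''.join(secrets.choice(alphabet) for _ in range(length))
```

Because the model declares key with unique=True, a duplicate key would raise an IntegrityError on save; with 32 random characters, collisions are vanishingly unlikely.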

Connecting this form to a view is left as an exercise to the reader. What we'd like to have happen now is for the invitation to be sent in the background as soon as it's created, which means we need to install Channels.

pip install channels

We're going to be using Redis as a message carrier, also called a layer in Channels-world, between our main web process and the Channels worker processes. So we also need the appropriate Redis library.

pip install asgi-redis

Redis is the preferred Channels layer and the one we're going to use for our setup. (The Channels team has also provided an in-memory layer and a database layer, but use of the database layer is strongly discouraged.) If we don't have Redis installed in our development environment, we'll need instructions for installing Redis on our development OS. (This possibly means googling "install redis {OUR OS NAME}".) If we're on a Debian/Linux-based system, this will be something like:

apt-get install redis-server

If we're on a Mac, we're going to use Homebrew, then install Redis through Homebrew:

brew install redis

The rest of this tutorial is going to assume we have Redis installed and running in our development environment.

With Channels, Redis, and asgi-redis installed, we can start adding Channels to our project. In our project's settings.py, add 'channels' to INSTALLED_APPS and add the channels configuration block.

import os  # usually already present at the top of settings.py

INSTALLED_APPS = (
    ...,
    'channels',
)

CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "asgi_redis.RedisChannelLayer",
        "CONFIG": {
            "hosts": [os.environ.get('REDIS_URL', 'redis://localhost:6379')],
        },
        "ROUTING": "myproject.routing.channel_routing",
    },
}

Let's look at the CHANNEL_LAYERS block. If it looks like Django's database settings, that's not an accident. Like we have a default database defined elsewhere in our settings, here we're defining a default Channels configuration. Our configuration uses the Redis backend, specifies the url of the Redis server, and points at a routing configuration. The routing configuration works like our project's urls.py. (We're also assuming our project is called 'myproject', you should replace that with your project's actual package name)

Since we're just using Channels to send email in the background, our routing.py is going to be pretty short.

from channels.routing import route

from .consumers import send_invite

channel_routing = [
    route('send-invite', send_invite),
]

Hopefully this structure looks somewhat like how we define URLs. What we're saying here is that we have one route, 'send-invite', and what we receive on that channel should be consumed by the 'send_invite' consumer in our invitations app. The consumers.py file in our invitations app is similar to a views.py in a standard Django app, and it's where we're going to handle the actual email sending.

import logging
from django.contrib.sites.models import Site
from django.core.mail import EmailMessage
from django.utils import timezone

from invitations.models import Invitation

logger = logging.getLogger('email')

def send_invite(message):
    try:
        invite = Invitation.objects.get(
            id=message.content.get('id'),
        )
    except Invitation.DoesNotExist:
        logger.error("Invitation to send not found")
        return
    
    subject = "You've been invited!"
    body = "Go to https://%s/invites/accept/%s/ to join!" % (
            Site.objects.get_current().domain,
            invite.key,
        )
    try:
        email = EmailMessage(
            subject=subject,
            body=body,
            from_email="Invites <invites@%s>" % Site.objects.get_current().domain,
            to=[invite.email],
        )
        email.send()
        invite.sent = timezone.now()
        invite.save()
    except Exception:
        logger.exception('Problem sending invite %s', invite.id)

Consumers consume messages from a given channel, and messages are wrapper objects around blocks of data. That data must reduce down to a JSON blob, so it can be stored in a Channels layer and passed around. In our case, the only data we're using is the ID of the invite to send. We fetch the invite object from the database, build an email message based on that invite object, then try to send the email. If it's successful, we set a 'sent' timestamp on the invite object. If it fails, we log an error.
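Since everything sent over a channel has to survive serialization, a quick way to sanity-check a payload is to round-trip it through the stdlib json module. This is an illustrative sketch of the constraint, not Channels API code:

```python
import json

# The payload we send on the 'send-invite' channel: just the invite's id.
notification = {'id': 42}

# Anything stored in the channel layer must reduce to JSON, so a
# round trip like this must succeed without losing information.
assert json.loads(json.dumps(notification)) == notification

# A model instance would not survive: json.dumps raises TypeError for
# objects it cannot serialize, which is why we send the id instead.
```

This is also why passing the invitation's id, rather than the Invitation object itself, is the natural design here.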

The last piece to set in motion is sending a message to the 'send-invite' channel at the right time. To do this, we modify our InvitationForm:

from django import forms
from django.utils.crypto import get_random_string

from channels import Channel

from .models import Invitation


class InvitationForm(forms.ModelForm):

    class Meta:
        model = Invitation
        fields = ['email']

    def save(self, *args, **kwargs):
        self.instance.key = get_random_string(32).lower()
        response = super(InvitationForm, self).save(*args, **kwargs)
        notification = {
            'id': self.instance.id,
        }
        Channel('send-invite').send(notification)
        return response

We import Channel from the channels package, and send a data blob on the 'send-invite' channel when our invite is saved.

Now we're ready to test! Assuming we've wired the form up to a view, and set the correct email host settings in our settings.py, we can test sending an invite in the background of our app using Channels. The amazing thing about Channels in development is that we start our devserver normally, and, in my experience at least, It Just Works.

python manage.py runserver

Congratulations! We've added background tasks to our Django application, using Channels!

Now, I don't believe something is done until it's shipped, so let's talk a bit about deployment. The Channels docs make a great start at covering this, but I use Heroku, so I'm adapting the excellent tutorial written by Jacob Kaplan-Moss for this project.

We start by creating an asgi.py, which lives in the same directory as the wsgi.py Django created for us.

import os
import channels.asgi

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
channel_layer = channels.asgi.get_channel_layer()

(Again, remembering to replace "myproject" with the actual name of our package directory)

Then, we update our Procfile to include the main Channels process, running under Daphne, and a worker process.

web: daphne myproject.asgi:channel_layer --port $PORT --bind 0.0.0.0 -v2
worker: python manage.py runworker --settings=myproject.settings -v2

We can use Heroku's free Redis hosting to get started, deploy our application, and enjoy sending email in the background without blocking our main app serving requests.

Hopefully this tutorial has inspired you to explore Channels' background-task functionality, and think about getting your apps ready for when Channels lands in Django core. I think we're heading towards a future where Django can do even more out-of-the-box, and I'm excited to see what we build!


Special thanks to Jacob Kaplan-Moss, Chris Clark, and Erich Blume for providing feedback and editing on this post.


6 Steps to Make Beginner Workshops More Beginner Friendly

by oboechick on March 30, 2016


I have spent roughly the last ten years of my life studying how to teach mathematics and music. I am not by any means an expert. I am simply a person with an educated opinion in how to teach beginners. I find that code is very similar to mathematics and music. So, I am going to put this into terms that I understand and hope they help you.

I want you to imagine you are teaching an oboe player how to play Clair de Lune by Claude Debussy on the piano.

  1. You would not say, “Here is the music.”

  2. “Here is a chart to show you what the notes are on the piano.”

  3. “You know how to read music. Ready, set, go!”

This makes no sense. Here is a plan that would work a lot better.

  1. Make sure that the person can read the music. An oboe player regularly reads only one of the two staffs a pianist does, so they may be able to read only one of them.

  2. Teach them the fundamentals of how to play the piano. A piano and an oboe are completely different. One major difference is that on the oboe, dynamics (volume) are controlled by air, while on the piano they are controlled by how hard you press the key. It doesn’t matter how hard you press the keys of an oboe; it will not get any louder.

  3. Break the music down several times to make sure that the music is possible.

    1. Each hand on the piano is playing something different. So we start by breaking the music up into a few bars at a time and only work with one hand at a time.

    2. After the student knows what each hand is doing separately, you take it a few bars at a time putting the two hands together adding more in each lesson until you reach the end of the piece.

  4. Play the whole piece and enjoy!

    1. Note that this could take a week or a year. The secret is to know that it is ok for it to take as long as you need. It doesn't mean that you are stupid or that you will never be able to do it, it just means that you need more time than others.

When I walk into a workshop at a tech conference, the first words I usually hear are, “Here is a set of directions; follow them and you can make (fill in the blank)!”. It is nice: each person is able to follow the instructions at their own pace. There can be two people or fifty people in the room, and the workshop leader can walk around and answer questions without feeling too overwhelmed by the number of people present. This is an optimal way to teach something given limited time, unknown space, and differing skill levels of the attendees.

There is a lot that is being done very well. The people are very friendly, and higher level attendees help those who are lower level. However, for a person who is looking at coding for close to the first time, or a person who is not familiar with the programs being used, it can be overwhelming to have a lot of information thrown at them all at once. I am going to use the experience I had in my first workshop because it has stuck with me the most. I will not name the person who ran the workshop or the conference it was held at, because I do not want there to be backlash on this person or the conference organizers. I found the person who ran this workshop very smart; it was simply their first time running a workshop, and the con is one of my favorites.

That said, there are a few simple but very important steps that those who are running these workshops can take to make beginners feel that what they are working on is doable.

  1. There is no question that is stupid or boring.

    1. When you teach something, questions are the metric to show you how well you, as a teacher, are doing. You want to make the environment you are in safe. If you make someone believe that their question is stupid they are less likely to ask more questions which could result in them quitting.

  2. Don’t get mad at a beginner because they did not “google” it before they asked you.

    1. Having to figure out the right question to find the correct context, which site will have the most correct answer, and having to worry about whether or not there is a reliable internet connection to even do a web search can make a beginner who is on the verge of quitting quit.

    2. I have a background in Mathematics education. For a short time I tried to go into theoretical math. This did not work out, but it has influenced the way that I look for information. Some of the subjects I was trying to research did not exist on the internet. That meant I had to go to the school library and go through a couple hundred books to find one reference. If you were lucky, you could find a person who had the information and could help you find the right resources. This means that I did not think to do an internet search to look for something; I ask a person, because in my experience that is the fastest way to gather information. I did not appreciate being scolded for not doing something that was not normal for me. This was the reason I stopped learning to code the first time. It was almost three years before I started again.

  3. Don’t use words like “easy”, “simple”, and “fast”.

    1. The first workshop I ever attended was a twitter bot tutorial. I do not remember which program was used for the workshop; I was going to be able to build a twitter bot, and nothing else really mattered at the time. I was excited because my partner had been making a lot of twitter bots and now I would be able to make one too. The workshop was two hours long, and the phrases I heard used to describe it were along the lines of “this is so easy you will be done in no time”. I was given a booklet of instructions (which were online) with fifteen to twenty steps on each page, and there were somewhere between ten and fourteen pages. I did not understand half of what was on the first page. I was so overwhelmed that, not even ten minutes into the workshop, I left the room to go hide in the bathroom and cry. Do not imply that something is “easy” or “simple”. Not everyone will find it easy, fast, or simple. When a person sees these words they often feel that if it isn’t easy for them, they must be stupid. This can make the difference between whether a beginner quits or keeps going.

  4. Don’t assume they will know all of the terms that seem second nature to you.

    1. The hardest part about learning mathematics is the vocabulary. Depending on where you go there could be as many as ten different terms that mean the same thing. There are also a few terms that mean different things depending on the context. When planning a workshop for beginners, pretend that the person has no coding experience. This means they will not know any of the correct vocabulary.

      1. The best example I can think from my first workshop experience is the word “fork”. In the instructions I was told to fork something on github. First, I had no idea what github was and secondly, what in the world does “fork” mean!?! I knew it was an eating utensil but what did that have to do with coding? (There were a lot of curse words swirling around in my head at this point.)

      2. Another example: I once had to explain what “click” means to someone who did not speak English as a first language. Do not assume; even the most basic terminology may not be familiar to them.

    2. Here are a few suggestions for how you can fix this without having to make everyone sit through you verbally explaining every term.

      1. Place a footnote on the page with the term explaining what it is. This small step will take time but it will help the beginner feel less overwhelmed by the amount of information you are asking them to process.

      2. Have a glossary with any term that you think may not be known by a new person. (This is the option I believe is best.) If no one needs it then they don’t have to even look at it. This would be like a glossary in the back of a history or math book with all the vocabulary words and definitions in it. You can create the glossary once and use it for any workshop you do. If someone asks you about one you don’t have in your glossary, thank them, write it down, and add it in for next time. Having this glossary will make the workshop more accessible all around.

    3. Does the definition of the word you are using match what your attendees know? Here is an example of why having some form of glossary or footnote for the vocabulary is a really good idea. Take the word “set”.

      1. There are eight definitions for “set” in the Merriam-Webster dictionary, and none of them have anything to do with math or computers.

      2. In math there are different kinds of sets; subsets and power sets are a few examples. And the number of rules defining whether or not a group is a set changes depending on the context.

      3. In computer science a set is a group of data. Some of these definitions may overlap with those in math but I am too unfamiliar with the subject to say definitively one way or the other. I have been told that there are at least five definitions in computer science.

      4. Are you confused yet? I certainly was when I was first learning set theory. This is just one word that I chose; imagine what it is like looking at a whole booklet of instructions and not understanding half of what is there. Believe me when I say there have been more than a few tears shed over similar difficulties, and I’m pretty sure they were not shed only by me.

    4. People will not feel stupid because you put more information than was needed into the instructions but they will feel stupid if they have no idea what a word means and they have to ask again and again. Especially if you roll your eyes then mutter under your breath that they should know it already because it is a beginning thing. This is supposed to be a workshop for beginners. They will not know everything you know. I know that I felt like I was taking a big risk by going to this workshop with my limited skills. I am guessing that there are many others who will feel the same. It will work best if you simply pretend none of your attendees know what you are talking about and go from there.

  5. Break it down more than you think is needed.

    1. The best way to prevent the feeling of being overwhelmed is to break the steps down more.

      1. As mentioned above, at the first workshop I went to I was given a booklet of instructions with fifteen to twenty steps on each page and somewhere between ten and fourteen pages. This is a lot to process all at once even if I did know all of the correct terminology. The workshop booklet that I was asked to look at felt like a calculus book and I was a student who had just started algebra 1. I felt overwhelmed and all I had done was look at the instructions.

      2. Don’t put more than three to five directions on a page. Designing the instructions as a powerpoint will help you know the right amount of information to put on a page. The less a beginner has to look at at a time, the more doable it will seem.

        1. If something is not blatantly obvious, add pictures with arrows. A picture of what something should look like will be a point that the student can check to make sure they are on the right track.

          1. In the instructions for the workshop, I was asked to create a new file in the program. This was all I was given; there were no instructions for how to do it. It turned out that I had to back out of where I was, find a menu (there were five), and then click the new file option. Beginners are not likely to know where to look in an unfamiliar program. And if you are using the program for the first time yourself, the beginner should not be held to a standard higher than you hold yourself.

        2. Breaking it down further will help you see steps you skipped. This is something I have nightmares about when planning a lesson, whether for student teaching or simply because I have been asked to teach something. Looking at three to five steps at a time and asking yourself if anything needs to be done in between each step will help eliminate the chance of missed steps.

  6. Always have a backup plan.

    1. Know that no matter what you do, chances are you are going to have to change something (or everything). Every group of people that you work with is going to have different experiences and knowledge to pull from. This may mean that everyone who shows up is at an advanced skill level, everyone is a beginner, or you have a group that is all over the board. Be ready to make changes once you see who is there; however, remember that it is always easier to make an easy lesson harder, but it is almost impossible to simplify a lesson on the fly.

    2. If you plan to use the internet, MAKE SURE THERE IS A RELIABLE INTERNET SOURCE! If you are unsure, download a copy onto a thumbdrive and have a plan that allows the attendees to participate in the workshop whether or not internet is available.

      1. If there is a program that the attendees need to have downloaded before hand, have a thumb drive with copies of the program on it so that the attendee doesn’t spend two hours downloading the program(s). (This is assuming there is internet to download it.)

      2. You could also try to get the conference to put a note into the description of your workshop asking the attendees to download what you are using before they come if that is possible. Not everyone will be able to do it but it may make it so that fewer people are frustrated by trying to download something with questionable internet or not having enough thumbdrives for everyone.

Just because a coder feels that the level of responsibility they have been given is more than they are capable of does not necessarily mean that they are, in fact, a true beginner. Do these people have more that they need to learn? Yes; the moment you stop learning is the moment your career dies. However, these coders also know the basics of coding and are therefore not true beginners. We need to stop allowing people who are beginning-intermediate, intermediate, advanced-intermediate, and advanced level coders to set the bar for what a beginner is, because chances are they have a hard time remembering what it was like to be a true beginner.

These six steps should help to make it so that true beginners feel more welcome at workshops while allowing higher level coders to participate. Let’s all remember that the goal is to encourage as many people as possible to code.


The Engineer's Day

by oboechick on March 23, 2016


Waiting waiting for this to run,

Ooo! I've found a site that looks fun!

Maybe if I'm really good,

I'll finish this before it's done!


Jonah Henderson is One Dynamite Guy

by phildini on March 16, 2016


I think last night's Alameda City Council meeting might be my fault. Before the meeting, as I was sitting in the Council Chambers, I looked at the agenda, and tweeted:

With a total of 3 regular agenda items on tonight's #alamtg agenda, I'm hoping for an efficient meeting.

— Philip John James (@philipjohnjames) March 16, 2016

This obviously jinxed the whole enterprise, and I apologize to all those involved.

The meeting started out with incredible promise. Some items were pulled out of the consent calendar, like changing the city's investment strategy, and the Alameda Landing Transportation Demand Management Program. These items had some discussion, most notably Councilmember Daysog being strongly for the Alameda Landing TDM (which endears him to me), and continuing to be against the project at 2100 Clement.

The first regular item, the confirmation of members to the various city commissions and boards, also passed unanimously, without comment. Given that members were being confirmed to the Housing Authority Board of Commissioners and the Rent Review Advisory Committee, I was somewhat shocked there was no discussion from the various renters groups in Alameda. There have been months of work and comment about rent control, but no apparent interest in the people who sit on the committees overseeing housing and rent.

Then we come to the big enchilada, the two hour discussion that was both comedy and tragedy in equal measure: the Building 8 project at Alameda Point. There are some important things to know:

  1. Jonah Henderson, one of the chief developers on the project, is a stand-up guy. This was proved by comments from bankers, lawyers, tenants, Berkeley City Council Members, the Mayor of Berkeley, various artists, various historical committees, friends of Jonah's, and the teachers of Jonah's children. (He's apparently an amazing parent too.)*
  2. Mayor Trish Spencer is worried about housing. And parking. And whether we're getting enough money from the developer. And whether city staff did their due diligence. There were also multiple occasions where she indicated that the Planning Board was not public enough for the discussion at hand. (Having attended a Planning Board meeting last night, they seemed fairly public and well-managed to me.)

The discussion between councilmembers at times broke into actual barbs and interruptions, with threats from the Mayor of invoking the gavel. I have not been at enough meetings to know how serious the threat of the gavel is.

The meeting ended at 10:59pm, with the last agenda item pushed to the beginning of the next meeting.

Once again, my apologies for placing a jinx upon last night's city council meeting. Also, city staff: you're doing a great job. Keep it up. 


*The title of this post is tongue-in-cheek. Jonah Henderson seems like a really nice guy, the litany of his awesomeness was maybe a tad much.


Follow the Yellowbrick Pavement

by phildini on March 15, 2016


Over the past couple months, it has been at times painful to watch civic discourse in Alameda. I believe the City Council has arrived at a good starting point in most of its decisions, but the path taken has often been confusing to follow.

Last night's Planning Board meeting was a gust of fresh air in comparison. Two major issues were discussed by the Board: street names for the 2100 Clement Street project, and Design Review Approval for block 11, block 8, and phase 1 of the waterfront park at Alameda Point Site A.

  • The Planning Board wants to put more consideration behind the street names for the 2100 Clement project. There's a worry about the current suggestions being pronounceable to the average Alamedan, as well as a worry about the appropriateness of some of the alternates.
  • The review for the design of block 11, block 8, and the waterfront park at Alameda Point Site A revolved around:
    • Making sure windows are up to city code
    • Close inspection of the proposed exterior construction materials
    • The color of the street in the shared plaza. Earlier sketches showed a yellowish color, last night's designs had the road returning to a more street-ish gray. Based on the comments of President Knox White and others, there's going to be more thought put into this area, as many on the Board feel the street color helps dictate how the space would be used, and the apparent preference is for it to be pedestrian-focused.
    • Discussion on the name of the street currently called West Atlantic, which would potentially be an extension of Ralph Appezzato Memorial Parkway. There's a concern that Alamedans already shorten that street name to RAMP and so the board should consider naming the street Appezzato Parkway or Appezzato Boulevard.

(To that last point: Ralph Appezzato was the first Alameda Mayor I knew personally, and during these months of turmoil in Alameda civic discourse I find myself missing his presence strongly. I am immensely glad the Planning Board is doing what it can to keep his name in the memory of Alamedans.)

The Planning Board voted to approve the proposed design for the pieces of Alameda Point Site A mentioned above, and I believe the vote was unanimous. In both discussions, President Knox White and the other board members asked informed questions, voiced solid points, and arrived at conclusions that balanced a push for future improvements to Alameda with a sense of keeping Alameda's history and character. 

It may seem like I am being extraordinarily complimentary to the Planning Board, or that I'm being far too friendly with them. To that I would say: The Planning Board accomplished the business they came there to do, including time for public comments, and did so in less than 90 minutes.

It will be interesting to see how tonight's City Council meeting compares.


A Rising Tide Lifts All Transit

by phildini on March 11, 2016


One of the most common concerns I hear about Alameda growing as a city is the congestion at our bridges and tunnels, and the difficulty people have getting on and off the island. To me, it feels like we need a comprehensive plan that increases public transit access and thinks about access to and from Alameda in terms of the whole island and the whole region. Public transit is critical if Alamedans want to maintain the quality of life the island has to offer.

The speakers at Wednesday night's City of Alameda Democratic Club meeting agree that public transit is essential. Speakers from all the transit agencies that serve Alameda spoke in turn about what they're doing already to serve the area, and how they would like to improve.

  • BART has about 430,000 riders every weekday, riding on infrastructure that was built in the 70s, and in train cars that are about as old. They want to spend $9.6 billion on improvements, mostly on purchases of new train cars and infrastructure upgrades.
  • BART has about half the money they're looking for, and will most likely be putting a parcel tax or a bond measure on the ballot in November for the other half.
  • AC Transit has about 179,000 riders every weekday, mostly people going to work and schoolchildren. Schoolchildren alone make up 30,000 of their riders. AC Transit's main goal is to increase service by working closely with the City of Alameda; they're expanding lines that run through the city and collaborating on a city transit plan.
  • AC Transit hopes to improve its service and its fleet with the funds it already has, although they are also investigating a parcel tax.
  • The Water Emergency Transportation Authority (WETA, the agency in charge of the San Francisco Bay Ferries) sees Alameda as the city it serves most, and wants to deepen its commitment to Alameda by building a maintenance facility and another terminal on the southwest side of Alameda.
  • WETA knows that transit to the terminals, as well as parking at the terminals, is the greatest challenge their ridership faces; they're hoping for stronger collaboration with the other agencies and the city to make it easier to get to the existing Alameda ferry terminals.
  • The West Alameda Transportation Demand Management Association (TMA) is running a series of apparently ridiculously successful shuttles from the West End of Alameda to 12th St. BART in Oakland, and wants to see their service expand as well. 
  • TMA's main focus right now seems to be on education, getting Alamedans, Alameda businesses, and the employees of Alameda businesses thinking about public transit options and how we can all better utilize public transit.

It was an information-dense first half of a meeting, to say the least. The major takeaways for me were:

  • Public transportation is on the up in Alameda, and many want to see it increase.
  • The transit agencies see themselves in cooperation, not competition. They understand their inter-connectedness to each other, and seem to want each other to thrive.
  • They're all trying to buy American and bring jobs to Alameda.

As I am unabashedly in favor of more public transit, I'm thrilled to hear about the programs currently in place, and that those programs are trying to expand. I want Alameda to be more walkable and bikeable, and I want public transit to be a deeply viable alternative to owning a car in Alameda.

I said above that public transit is critical to the quality of life for current Alamedans, and it will be just as critical for future Alamedans. The second half of the meeting was dedicated to presentations from property developers, specifically the organizations behind the Del Monte project, the Encinal Terminals project, and the Alameda Point Site A project.

I'm not going to go too much into the projects here, mostly because I don't have hard numbers like I have with the transit agencies, and partially because growth in Alameda, and in the Bay Area, is a thorny subject. I think more growth is good for Alameda, and I think these projects have a shot at being a massive net positive to the city. Others feel differently.

What I can say is that both groups feel public transit is critical to their developments' success, and both are putting plans in place to improve transit in their developments. The group in charge of Del Monte/Encinal Terminals seems a bit more on the ball in this regard, as they talked about having an organization like the TMA (potentially joined with the TMA) to continually improve transit in that part of the city, but both groups stressed how transit would be integral to what they're building.

I've talked so far about increasing public transit because it will ease congestion, and make Alameda an even better place to live. But there's another benefit of public transit that the Alameda Point Site A group drove home for me: Getting cars off the road.

The plan for Alameda Point Site A includes raising the level of some streets and buildings, and a terraced waterfront park area. Why? Because global warming has become enough of a reality that property developers are working "sea level rise strategies" into their plans. They're so certain the seas will rise from global warming that they're betting money on it. 

Every train car BART adds, every bus or shuttle added by AC Transit or the TMA, every ferry added by WETA takes cars off the road and keeps CO2 out of our atmosphere. In a world where major corporations are now banking on global warming happening, increased public transit in Alameda can't come soon enough.


Rain

by oboechick on March 10, 2016


Rain oh rain. 

Thou pluggest up the drain. 

Whatever shall I do with you, 

But admit the love I have for you is true.


Rentopia in Alameda

by phildini on March 4, 2016


The Alameda Renters Coalition has published the text of the amendment to the Alameda City Charter they're trying to add to the ballot for November. It's well worth a read, but here are the key points, as I see them:

  • Renters and Homeowners should have protection under the law
  • Alameda needs a Rental Housing Board to oversee administration of rental units in the city
  • Evictions should only be enacted for Just Cause
  • Rent should be pinned to the Consumer Price Index

The charter amendment, if enacted, will provide incredible renter protections in Alameda. I haven't read the text of the rent control measures for other California cities, but I'm willing to wager this proposal would put Alameda in the top three cities in terms of renter friendliness.

I'm biased here, but as a renter (and someone who both wants to see more people in Alameda and see the current residents protected), I'm in favor of shifting the landlord-tenant power balance a bit more in favor of the tenants. That said, there are definitely some parts of the measure that give me pause.

The big one is capital improvements. As I read the amendment, there is no allowance for general capital improvements to a property. The amendment is very explicit about allowing relocation and rent increases for capital improvements to bring the property up to code, but what if a landlord wants to do general remodeling to make a property nicer, or more attractive? The amendment doesn't seem to allow for that. It feels like an oversight that could be taken advantage of, and which would decrease the overall appeal of the housing stock in Alameda.

Also, the Rental Housing Board, again as I read the amendment, seems to operate with absolutely no oversight. They're chosen by general election, operate completely autonomously from the rest of city government, and their budget is approved only by them. Ostensibly, this is so the Board can't be influenced by a city council that is being too partisan to landlords or tenants. However, the way the amendment is currently worded, the Rental Housing Board could decide to charge a $1000/unit Rental Housing Fee, and the only recourse would be a lawsuit or another election. There doesn't seem to be a lot of 'check' to this 'balance'.

Where does that leave us? I think the debate around this amendment, especially in light of the rent stabilization passed by the City Council on March 1st, is going to be intense, and I hope it raises the level of discourse about how to prepare Alameda for the next decade and the next century. I want strong renter protections, I want myself and other renters to feel secure in our homes. Housing is a home, first and foremost. This amendment provides for that idea, but seems focused on solving the problems of the present, without considering the problems of the future.


Things Learned at the City Council Meeting

by phildini on March 1, 2016


Here are some things, learned by myself and others, at the Alameda City Council meeting on March 1st.

  • The city council continues to treat its staff in a way I find weirdly antagonistic
  • Whenever a council member uses the phrase "Real World", what they mean is: "Your researched presentation means nothing, city staffer. Alameda is different."
  • "Appropriate" means you make funds available. Those funds can be taken back, especially if they aren't spent
  • The Mayor and City Council are maybe really underpaid?
  • Alameda cares about golf way more than I thought it did. Like, an hour and a half more than I thought it did.
  • I don't understand Councilmember Daysog's long-term strategy for Alameda
  • Councilmembers Ashcraft and Oddie seem like people I would enjoy hanging out with

And, the big one

  • Alameda now has rent stabilization.


Housing in Alameda, Eyes on November

by phildini on February 28, 2016


Update 2016-03-04 The ARC has published the text of the amendment. Read my thoughts about that here.

The Alameda Renters Coalition is going to be filing an initiative with the City of Alameda tomorrow (2016-02-29). Text of press release follows.

Press Release
For Immediate Release                   Media Contact: Catherine Pauling 510.220.2030
Alameda Renters Coalition filing ballot initiative 2/29/16 at 4 p.m. at Alameda City Hall
ALAMEDA, CA - The Alameda Renters Coalition (ARC) will file the “Alameda Renter Protection and Community Stabilization Charter Amendment” initiative at Alameda City Hall Monday for inclusion on the November ballot in response to a crisis of mass evictions and average rent increases of more than fifty percent over a span of only four years.
ARC spokesperson Catherine Pauling says, "We are filing this initiative so the people of Alameda can do what its City Council has been unable to do: enact a firm set of laws to stabilize our community and protect renters from greedy investors."
City officials have deliberated over the rental crisis for three years, recently crafting an ordinance the coalition says falls far short of what's needed to protect tenants.
"ARC recognizes the City Council's efforts but their ordinance has too many concessions to real estate interests and will not keep Alameda renters in their homes" says Pauling.
Specifically, the coalition objects to there being no cap on rent increases, merely a review process triggered by a rent increase of more than 5 percent.  Additionally, the City ordinance subjects renters, once again, to the threat of no cause evictions when the current moratorium expires. ARC further objects to what it calls a "poison pill" in the ordinance allowing landlords to escape all eviction controls by simply issuing a "Fixed Term Lease".
"This filing begins the process of allowing voters of Alameda, over half of whom are renters, the opportunity to choose clear protections that renters and owners deserve instead of the cumbersome and un-tested process outlined in the City ordinance," says Pauling.
The Alameda Renters Coalition, formed in 2014, advocates for clear, rational rent stabilization tied to the Consumer Price Index, as is used by many other cities in the Bay Area in determining rental rate increases.
Prior to the 4 p.m. filing in the City Clerk's office, representatives of the Alameda Renters Coalition will be available on the steps of Alameda City Hall at 2263 Santa Clara Avenue to answer questions of the media and provide copies of the initiative.

I'll be linking to the actual press release if I can find it online. I'm very interested to read the text of what they will be filing, and the timing is excellent. Tuesday will see the continuation of the City Council's January discussion about rent stabilization, and I think we all hope this one won't go to 4 am.

It's increasingly clear that housing and rent stabilization will be a critical issue in this upcoming election. As an Alamedan and renter, I am thrilled these issues are getting the attention they deserve.


Hello, world.

by phildini on February 26, 2016


This is the first post posted to WordFugue.com. WordFugue is an experiment, something neither of the founders of this site have ever done before. I've had blogs off and on for over a decade, but two things are different about WordFugue.

  1. It's the first blog I've run where I've built the whole thing as a blog, and built it just for myself and my partner to use.
  2. It's the first time I'm blogging with a partner, sharing a space with someone else, sharing the writing, and the code, and vision.

There will be posts that backfill to different dates as I start migrating various other blogs to this blog, but this is where it all starts in many ways.

Watch this space.


An Interview, Interrupted

by phildini on October 25, 2015


This is a short story inspired by this post from Chuck Wendig, mashing up two other stories. Full reveal of the mashups at the bottom!

To say I was nervous would be an understatement. Months of research, of hunting down leads, of following urban legends and whispered truths had paid off with this night, this potentially life-changing night.

It started with a rumor, heard at parties and in whispers throughout the year, growing strongest around Halloween, a rumor of a shadowy figure draped equal parts in violence and elegance. A question that would be asked if the right people were sufficiently drunk with the other right people.

“You know there’s a vampire in San Francisco, right?”

It sounds crazy in my own head when I think about it. How cliche, to think of that book and the history around it, and try to extend that world into the real world. What a perfect representation of this city, to think that there’s a creature of cultured carnage who drifts among us, civilized on the outside with a tortured heart of evil inside.

My name is Susan Harper, I’m a reporter for the San Francisco Chronicle. Well, I like to say I’m a reporter for the Chronicle. Really, I’ve had just a few bylines in print, and most of my writing has been for the collection of blogs that catalog the former hippies and capitalist yuppies that make up the City by the Bay.

I’m known for tracking down urban legends and weird stories, pieces of San Francisco folklore that get passed in some new age oral tradition at parties and bars and in parlors. I hear about them, or they get sent to me, and I spend a few weeks to a few months tracking down the truth and the origin of these tales, then selling the story to whatever outlet will pay the most for it.

Emperor Norton’s Ghost, wandering around the Barbary Coast? That would be George, a lovely if eccentric man who works in theaters around the city and likes dressing up. The moans of dead gold miners, trapped under Nob Hill to haunt those who had gotten wealthy off their gold? A problem with the city’s natural water lines. That one required actually prying the manhole off a sewer entry, and almost getting arrested, but resulted in an official thank-you. Turns out the city didn’t know about the leak, and it almost disrupted the foundations under a city councilmember’s house. Oops.

My success record isn’t perfect, but I’ve been able to find an answer to most of the legends and weird occurrences that have persisted over the years. Except the damn vampire tale.

I kept hearing it, it felt like someone mentioned it at every party, and it rattled in my brain until I could think about nothing else. I started getting emails, tweets, forum posts asking me about it; I felt like the city itself was crying out in my dreams.

“You know there’s a vampire in San Francisco, right?”

The unspoken second question was always: “Is it really true?”

I reached a breaking point, put aside all the other stories I was working on, even got the Chronicle to put me on a small retainer to work on the story. Enough people were talking about it that I thought it would be two weeks, tops, until I had this story in the bag, and had leveraged the pageviews into a more solid gig with the paper. That was six months ago.

Days, weeks, then months went by and I had no proof, no shred of the origin of the legend. I began to doubt myself, doubt my sanity, doubt the sanity of the whole city, and became more and more certain that there was nothing there.

But the whispers! They never stopped! I expected an initial flood after people found out I was working the vampire story, but I wasn’t prepared for the constant typhoon. It seemed my investigation had opened a bottomless pit of shadows.

Normally, when an urban legend persists there’s some kernel of truth to the story. Somebody sees something, like, say, an old woman dressed all in grey walking along Ocean Beach in the fog, and the watcher is slightly drunk, or high, so they make up a story about the Grey Ghost of Ocean Beach or whatever. They tell their friends, and the legend spreads for a bit, or dies right there. If enough weird things happen that roughly match the outline of the story, the spread intensifies, and the story might enter the realm of city folklore. The best urban legends can carry on for years, told and re-told until everyone who could possibly be interested moves away, or dies, or the story is exposed by someone like me. The longest I had seen a piece of folklore live, without exposure, and still be taken seriously, was about twenty years, give or take. Enough time for a whole generation to come up and move on in our ever-changing city.

After digging into stories, and old journals, and hinted rumors in ancient newspapers and antique books, it looked like the Vampire story had been living, non-stop, in San Francisco for over a hundred years. Well before the publishing of that damn book, almost back to the glory days of the gold rush themselves. When I was able to trace that line all the way back, I felt my first thrill of uncertainty, tinged with fear. The immensity of the story seemed to loom over me.

And yet! I still had no clear lead, no clear path. Rumored sightings, whispered stories, nothing concrete! Barely a consistent description, and one that could have matched most of the men in the Financial District. Pale, blond, lithe or muscular depending on who you asked. And always, always impeccably dressed. I would hear he had been at this party, or that gala, or this orgy (San Francisco being what it is), but never any proof, any evidence. Once, I got a text from a friend at a party, who knew how long I had been searching: “HE’S HERE COME NOW”. I practically sprinted across town, not even remembering how I got there, and rushed into the club, only to find my friend looking like she was on the biggest high of her life, dreamy and moving slowly.

“Where is he?” I asked, yelling over the music.

“Wha?” she replied.

“The Vampire! You said he was here!”

“He… he was!” She looked around. “I don’t see him now, though.”

I never did know if she was just fucking with me, but I left the party feeling lower than I had ever felt. I got back to my apartment, stared at the snowdrifts of printouts and newspaper articles, dotted with rotting takeout boxes like flowers in the snow, and decided to pack it in. I would write the most unsatisfactory conclusion to six months of searching that I could imagine, the journalistic equivalent of a shrug emoji. I would fade back into the obscurity of San Francisco’s limitless pool of wannabe journalists, and keep making rent by writing copy for soon-to-fail startups.

I was sorting the last scraps of paper into trashbags and wishful-thinking storage boxes, with the first draft of my greatest shame sitting open on my laptop, when my cell phone rang. Despite my policy of never answering numbers where the caller ID says “Unknown”, I was looking for anything to distract myself from the disappointment and tedium. I picked up the call, and a clear male voice with the barest hint of an Eastern European accent said:

“I hear you’ve been looking for me.”

“I.. What?” Not my most graceful response, but how do you answer that?

“I am under the impression that you would like to write a story about me.”

“A story about you? Who are you?”

“Ah, my apologies, I thought it would be obvious. I am the Vampire of San Francisco.” He paused, while my heart stopped beating for a moment. “The only one, as far as I know.”

My first thought was that some loony had got ahold of my number, and wanted his ego (hopefully only his ego) stroked by having an actual journalist listen to him for what would probably be hours. I’m normally pretty tight with my real cell phone number, but a few friends have it and one of them could have been convinced to give it to some rando. It wouldn’t be the first time, or probably the last.

Well, that’s not quite true. My first thought, if I’m being more honest, was a mixture of hope and fear and uncertainty. Hope that my story might not be dead after all, uncertainty about what my next step was, and fear that maybe the rumor was right.

“I can understand if you think this might be a deception, but I assure you I am being completely honest. I got your number from a mutual friend.”

“Are you reading my mind?” I mentally kicked myself for saying the first thing that came to mind. Probably should have been a bit more guarded than that, Susan. I’ll admit I was caught off-guard by his directness, and how close he was to what I was thinking.

The man claiming to be a vampire on the other end of the line laughed, and it was a full, throaty laugh that seemed genuine and slightly predatory.

“No, reading minds is not a gift of mine, and doing so over the phone would be a feat I’ve never heard of. You might say instead that I can think very, very quickly, and select the best outcome for any given situation. Were I in your place, I would also suspect this might be a ruse.”

“Why me?” Again with brain-mouth malfunction, Susan. Get it together. “I mean, why contact me now? If what you say is true, you’ve done an excellent job staying out of the spotlight for decades. Why expose yourself now?”

“Partially because you impress me,” the voice replied. “I’ve read all your work, and you show a thoroughness and intelligence that helps me believe I’ll get a good story out of our interview. As for why I’m granting such an interview, my reasons are my own. Say it’s boredom, if that satisfies you.”

Many thoughts in quick succession: a flush of pride at the idea that someone found my work worthy, a double-take at how quickly he had assumed we were going to do the interview, and a lingering suspicion of his motives.

“I will admit you’ve got my attention, Mr. Claims-to-be-a-Vampire. When and where would you like to meet?”

I swear I could hear him smile a fanged smile as he replied. “Excellent! It just so happens that the opening gala for the Museum of Modern Art is this Friday. Would eight o’clock work?”

Eight o’clock at the MoMA gala. How on earth was I going to get tickets? But if this guy was for real, I needed to take this interview. I’d bribe someone at the Chronicle’s Art and Culture desk if I had to. “Sounds great. How will I recognize you?”

“Oh, I’ll recognize you, Ms. Harper. Until Friday.” Thanks for that extra bout of creepiness, mystery man. The line went dead.

An interview with a… Good lord. My life actually is becoming that damn book. If he asks me to call him Louis or Lestat, I’m leaving and publishing the shrug.

I convinced myself the interview was credible, and was able to convince the editor I had been assigned at the Chronicle. She gave me the go-ahead on taking the interview, and gave me a memo to use as armor against the snooty stares of the arts and culture desk in acquiring a ticket to the gala. The only condition was that I take a photographer with me, some young kid from New York who was out here as part of an exchange. Paul something or other. I wasn’t thrilled about the photog, since I didn’t know if it would spook Mr. Vampire, but I figured having an extra to corroborate my story couldn’t hurt, and photographic evidence of San Francisco’s vampire might well get me that regular job I had been angling for.

Which is how we get here, to this night, to the opening gala at the Museum of Modern Art in glorious, sunny, foggy San Francisco, with me in my best dress and some photographer from New York in a fairly smart tux at my side. I’d spent most of the week completely unsure of what I was getting myself into. Every piece of folklore and weirdness I’ve chased down has either faded away as people lost interest, or been debunked. Here was a man claiming to be the embodiment of a legend over a hundred years old, and I couldn’t tell you going into the gala if I thought he was real or fake.

He had called on Monday. By Tuesday morning somehow all my friends, and it seemed most of the city, knew I was interviewing the vampire. I study rumor for a living and I still get surprised at how fast news travels. Everyone I knew was calling to see if it was true, offer me advice, or offer me a warning. The truly surprising thing was how small the number of skeptics was.

All of this, the months of confusion and hunting, the whirlwind of rumor and the calm, predatory nature of the voice on the end of the line, led me to be more nervous than I can remember being as I walked in the large glass doors at the front of the MoMA.

There’s this thing I do, when I’m presented with something that overloads my rational mind. My brain seems to slow down, and make one of those photo-mosaics out of what I’m seeing. It’s like I’m taking hyper-accurate pictures of a thousand little details, and only once I’ve got all the details will I see the whole scene. I call it my “reporter’s sense”, and it’s served me well as I try to navigate the world of urban fantasies.

The gala was a sensory overload, and I found my reporter’s sense kicking in as I tried to process everything I was seeing. There was the Mayor, standing with the chief curator of the museum, each of their spouses dressed to the nines and flashing bright smiles for the camera. There was the chief of police, sharing a drink with a councilwoman, and my brain annotated the detail that they were rumored to be having an affair. Between the groups of urban aristocracy and political dignitaries was Donald Peregrine, the venture capitalist. The open secret of San Francisco was that most of the political machine and new money in the city owed him favors, and that real policy in the city was set by him.

Off in the corner, never far from the bar, was the Arts & Culture Editor for the Chronicle, who I’m sure would pretend like I didn’t exist all night long.

As the picture of the gala came together in my mind, one piece of the mosaic stood out. Off in the corner, uniquely apart from the crowd, stood a man who was almost certainly my interview. He was dressed in an impeccable suit that appeared dark as night on first glance, but revealed itself to be grey with darkest red accessories when I focused in. His face was pale, paler than you normally find under the California sun, and his hair was silver-speckled blond that seemed to halo his head. Standing as he was, with the enormous Mark Rothko painting at his back, he presented a striking image, like a modern-day king holding court.

I turned to the photog to snap a photo that would be the centerpiece for sure (he had to have staged himself like that, right?), but Paul whoever from New York had disappeared. Great. Guess it’s just me and Mr. Vampire then.

I walked across the gala with a purpose, my eyes fixed firmly on the man who was staring at me and now grinning a smile that looked like nothing so much as a jungle cat’s. A small group of partiers crossed in front of me, blocking my view of him, and when they passed he was gone. Of course. Mr. Vampire wants to play hide and seek.

I reached the point where he had been standing, and spun in a slow circle, trying to see if I could spot him. I caught a flash of brilliant hair and dark suit turning a corner down the hall and nearly sprinted after him.

Through the upper echelons of the city’s elite I ducked and weaved, trying to keep a smile on my face so I wouldn’t be stopped with awkward questions. My mysterious quarry led me through galleries and showcases, up and down stairs, through parts of the museum I had never seen, until I was thoroughly lost. Some rational part of my brain screamed at me to stop letting this man, who at his most harmless had convinced himself he was a dangerous predator, lead me into who knows what.

That part of my brain was outweighed by the part that had spent six months chasing mist, and that really enjoyed seeing the byline “Susan Harper” in print.

Finally, I found Mr. Vampire in a small, dim, dead-end gallery on one of the upper floors, lounging casually on one of those strange couch-benches they have for gazing at art.

“Ms. Harper,” he said as I approached. “I’m so pleased you accepted my invitation. I’m sure you have many questions. Please, won’t you have a seat?” He indicated the cushioned section next to him, and I hesitated at the familiarity of his gesture. The only thing I knew about this man was that he dressed immaculately, claimed to be a vampire, and had led me to a corner of the building where I suspected help would be a long time coming.

He saw my hesitation and chuckled. “I’m only here to meet you, Ms. Harper. My intentions are strictly honorable.” He patted the cushion again, and I found myself subconsciously leaning closer, my body rebelling against my mind. Luckily, my will held and I remained standing. A fire twinkled in his eyes and his smile grew more feral.

“Suit yourself. Would you like to begin?”

It took me a minute to find my voice, but when I did so I started with the basics. “Well, since it wouldn’t exactly read well to call you Mr. Claims-to-be-a-vampire, what is your name?”

“You can call me Drake, and I’m not merely claiming to be a vampire, I am indeed a vampire.”

“Just Drake?”

“Just Drake for now, Ms. Harper. Any last name I gave you at this point would perforce be a lie, and I would hate to start our conversation on falsehoods.”

“Ok, let’s start at the beginning. You say you’re a vampire. Were you born one?”

“Hah! No, no-one I know was born a vampire. I was born a poor peasant in what is now Eastern Europe.”

“When were you born?”

“Time has not always been so accurately measured as it is now, but around the time of the Crusades.”

“The Crusades,” I said, disbelief in my voice. “Like, the Charlemagne, Holy Roman Empire Crusades?”

“Yes,” Drake said simply.

“O…k. How did you become a vampire?”

“Ah!” Drake said, brightening, “that tale will take some time!”

Drake stood to begin his tale of dark rituals and frightened villagers, of his transformation into something out of nightmare, of his lonely years wandering as a monster, of his slow re-integration into society, and of his travels around the world before making his home in San Francisco. As he told his tale, he began to pace around the room, his face and hands animated to punctuate the highs and lows of his story, and I didn’t notice until his voice was winding down that he had been pacing closer, and closer, until he was just a breath away from me.

Up close, I could see glimpses of his teeth, I would swear they were pointed, and the closer he came the less I seemed able to think clearly. As his story was ending, with the tale of his increasing loneliness and how it had caused him to reach out to a young reporter who might understand, I saw his head begin to lower towards my neck.

It was all I could do to softly say “What about your honorable intentions?”, to which he replied “Your life for my story seems an honorable trade to me…”. Then his lips were on my neck and-

CRASH!

The skylight above us shattered, and glass rained down on the couch where I was now very glad I had chosen not to sit. Drake’s head snapped around to look, and suddenly I could think clearly again.

A figure, dressed in a tight black suit from head to toe, slid upside-down through the skylight, hanging on what seemed to be a rope made of silver thread.

“Hey. This guy bothering you?” the figure said.

Drake snarled, and moved faster than I would’ve thought possible, going straight from standing to leaping at the masked figure in a blink. A shot of some silver-greyish goo fired from a device at the figure’s wrist, and hit Drake square in the face. Drake paused to claw it off, and the masked man fired another string of the stuff at Drake’s feet, binding him to the marble floor.

The man in black dropped to the floor, and fired a few more blasts at Drake’s arms and legs, partially mummifying the vampire where he stood. Walking past the snarling and straining Drake, the masked man said “Ok Not-feratu, stay put. I’m going to check on that nice reporter you were trying to snack on.”

Walking up to me, he asked “Are you alright miss? Did he hurt you?”

“Me? I’m fine,” I said. Another helpful aspect of my honed reporter instincts: I can delay shock-processing until I’m back at my apartment, preferably with a bottle of scotch. Tonight was going to be hell on my liquor cabinet. “What about you? Who ARE you?”

“Me? I’m just your friendly neighborhood… hmm.” The man paused. “This isn’t really my neighborhood, is it?”

As he was pondering, Drake burst out of his bonds with a roar, snarled in our direction, and leapt straight up through the skylight. The man in black sighed, and said “Next time, load the shooters with garlic. Check.” He started running towards the center of the room, yelled back at me “Good luck with the story!”, then also jumped straight through the skylight and into the night.

Only after Drake and the mysterious stranger had left did security arrive, and the best answer I could give them about what happened was “Earthquake. Didn’t you feel it?” I still got escorted from the party, while the Arts & Culture Editor tried to kill me with his brain.

I got back to my apartment, stared down at the draft of my story, and eventually pieced it into something that would read well, even if it was mostly fiction. I mixed enough truth with fantasy to be believable, even if I didn’t believe the truth myself. I had spent my whole career disproving myths and legends, and it turned out vampires and super-human masked crusaders actually existed in the world. The story, a cobbled-together city-interest piece about Eastern European cults and the power of rumor, was enough to please my editor, and the mystery surrounding the myth made the piece my most popular ever. The whispers about what actually transpired at the gala didn’t hurt the story’s popularity, by any stretch.

For most, the vampire story was put to bed, and I started hearing about the Vampire of San Francisco less and less. I’m not sure what actually happened between Drake and the masked man that night, but now I have an answer when people ask. “You know there’s a vampire in San Francisco, right?”

“I heard he died,” I reply.

“Of spiderbite.”

Thanks for reading! This story was a mashup of Anne Rice's "Interview with the Vampire" and Marvel's Spider-Man. Hope you enjoyed it, please leave feedback in the comments!


Why Doesn't the Django CSRF Cookie Default to 'httponly'?

by phildini on October 19, 2015


Recently, some questions asked by a friend prompted me to look deeper into how Django actually handles its CSRF protection, and something stuck out that I want to share.

As a refresher, Cross-Site Request Forgery (CSRF) is a vulnerability in web applications where the server will accept state-changing requests without validating they came from the right client. If you have example.com/user/delete, where normally a user would fill out a form to delete that account, and you're not checking for CSRF, potentially any site the user visits could delete the account on your site.

Django, that marvelous framework for perfectionists with a deadline, does some things out-of-the-box to try and defend you from CSRF attacks. It comes default-configured with the CSRF middleware active in the middleware stack, and this is where most of the magic happens.

The middleware works like so: When it gets a request, it tries to find a csrf_token in the request's cookies (all cookies the browser knows about for a URL are sent with every request to that URL, and you can read about some interesting side-effects of that here: Cookies Can Be Costly On CDNs). If it finds a token in the cookie, and the request is a POST request, it looks for a matching token in the request's POST data. If it finds both tokens, and they match, hooray! The middleware approves the request, and the request marches forward. In all other cases, the middleware rejects the request, and an error is returned.
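The core of that check can be sketched in a few lines of Python. This is a toy version for illustration only, not Django's actual middleware, which also handles token masking, the 'X-CSRFToken' header, and referer checking on HTTPS:

```python
import hmac

def csrf_ok(method, cookies, post_data):
    """Toy sketch of the CSRF middleware's core comparison.

    Not Django's real code: the real middleware also accepts the token
    via a header, masks tokens, and does referer checking over HTTPS.
    """
    if method in ("GET", "HEAD", "OPTIONS", "TRACE"):
        return True  # "safe" methods skip the check
    cookie_token = cookies.get("csrftoken")
    form_token = post_data.get("csrfmiddlewaretoken")
    if not cookie_token or not form_token:
        return False
    # Compare in constant time to avoid leaking info via timing.
    return hmac.compare_digest(cookie_token, form_token)
```

The key property: an attacker's page can make the victim's browser *send* the cookie, but it can't *read* the cookie to copy the token into the forged form data, so the two tokens won't match.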

The CSRF middleware also modifies the response on its way out, in order to do one important thing: set the cookie containing the CSRF token. It's here that I noticed something that struck me as curious: the CSRF cookie doesn't default to 'httponly'.

When a site sets a cookie in the browser, it can choose to set an 'httponly' property on that cookie, meaning the cookie can only be read by the server, and not by anything in the browser (like, say, JavaScript). When I first read this, I thought this was weird, and possibly a mistake. Not setting the CSRF token 'httponly' means that anyone who can run JS on your pages could steal and modify the CSRF cookie, rendering its protection meaningless.

Another way to read what I just wrote would be: "If my site is vulnerable to Cross-Site Scripting (XSS) attacks, then they can break my CSRF protection!" This phrasing highlights a bit more why what I just said is funny: If your site is vulnerable to an XSS attack, that's probably game over, and worrying about the CSRF protection is akin to shutting the barn door after the horse has been stolen.

Still, if the CSRF cookie defaulted to 'httponly', and you discovered your site had an XSS, you might breathe a little easier knowing that bad state-changing requests had a harder time getting through. (Neglecting other ways the cookie could be broken in an XSS attack, like cookie jar overflow). I was talking to Asheesh Laroia about this, and he called this the "belt-and-suspenders" approach to securing this facet of your web application. He's not wrong, but I was still curious why Django, which ships with pretty incredible security out-of-the-box, didn't set the default to 'httponly'.

We don't know the answer for sure (and I would love to have someone who knows give their thoughts in the comments!), but the best answer we came up with is: AJAX requests.

The modern web is composed less-and-less of static pages. Increasingly, we're seeing rich client-side apps, built in JavaScript and HTML, with simple-yet-strong backends fielding requests from those client-side apps. In order for state-changing AJAX requests to get the same CSRF protection that forms on the page get, they need access to the CSRF token in the cookie.

It's worth noting that we're not certain about this, and the Django git history isn't super clear on an answer. There is a setting, CSRF_COOKIE_HTTPONLY, that makes your CSRF cookie 'httponly', and it's probably good to set it to 'True' if you're certain your site will never need to read the CSRF token from JavaScript for AJAX requests.
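For reference, the belt-and-suspenders version is a one-line change in your settings module:

```python
# settings.py
# Only safe if nothing in your front-end JavaScript needs to read the
# CSRF cookie (e.g. to attach the token to AJAX requests).
CSRF_COOKIE_HTTPONLY = True
```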

Thanks for reading, let me know what you think in the comments!

Update (2015-10-19, 10:28 AM): Reader Kevin Stone left a comment with one implementation of what we’re talking about:

$.ajaxSetup({
    headers: {
        'X-CSRFToken': $.cookie('csrftoken')
    }
});

Django will also accept the CSRF token in the 'X-CSRFToken' header, so this is a great example. (Note that $.cookie comes from the jquery-cookie plugin, not from jQuery itself.)

Also! Check out the comment left by Andrew Godwin for confirmation of our guesses.


Bots!

by phildini on September 29, 2015


Last week I went to an excellent meetup hosted by Erin McKean of Wordnik on making twitter bots, and now I've got the bot bug. Making bots, these little autonomous pieces of code that exist for some singular purpose, has the highest satisfaction-to-lines-of-code ratio I've ever experienced. This is the most sheer fun I've had writing code in a while, and I'm full of ideas for writing more. Philip's Forest of Bots is currently small, but growing:

  • Legendary Bot was the first bot I created, at that workshop last week. If you've seen How I Met Your Mother, and heard Barney Stinson say "It's going to be LEGEN-wait for it-DARY!", then you know how this bot operates.
  • SnozzBot was bot number 2, conceived as I walked home from that meetup. Inspired by the original Willy Wonka movie, picture Gene Wilder saying "The snozzberries taste like snozzberries" and this bot will make more sense.
  • BuddyBot is still a work in progress. After writing the two twitter bots above, I wanted to do something with Slack. BuddyBot sends positive messages to members of my social Slack group, because we could all use more positivity in our day.
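To give a sense of that satisfaction-to-lines-of-code ratio, the heart of a bot like Legendary Bot is a few lines of string mangling. This is my guess at the logic, not the bot's actual source; actually posting the result would go through a Twitter library like tweepy:

```python
def legendary(word, split_at):
    """Barney-Stinson-ify a word: 'legendary' -> 'LEGEN-wait for it-DARY!'"""
    word = word.upper()
    return "{}-wait for it-{}!".format(word[:split_at], word[split_at:])

print(legendary("legendary", 5))  # LEGEN-wait for it-DARY!
```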

This post is just to get these bots out there, more details and resources on building bots to come, thanks for reading.


Porting Django Apps to Python 3, Part 1

by phildini on May 26, 2015


Hello! Welcome to the first in a series of posts about my experiences making Django apps Python 3 compatible. Through these posts I'll start with a Django app that is currently written for Python 2.7, and end up with something that can be run on Python 3.4 or greater.

Some quick notes before we begin:

  • Why am I doing this? Because we have 5 years until Python 2.7 goes end-of-life, and I want to be as ready as possible for making that change in the code that I write for my job. To prep for that, I'm converting all the Django apps I can find, from side-projects and Open Source projects.
  • Why 5 years? Because that's the time outlined in PEP-0373, and based on Guido's keynote at PyCon 2015, that's the timeline we all should be sticking to. It's also recently been brought to my attention that further Python 2.7 releases are really the responsibility of one person, the inimitable Benjamin Peterson, and if he for any reason decides to stop making updates that 2020 timeline may get drastically shortened. It's better to be prepared now.
  • Why "Python 3 compatible"? Why not fully Python 3? Because I believe the best way forward for the next 5 years will be writing polyglot code that can be run in either Python 2.7 or Python 3.4+ environments. (I'm going to start shortening those to py2 and py3 for the rest of this post.) So I won't be using 2to3, but I will be using six.
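To make the polyglot style concrete: such modules typically start with __future__ imports so the same file behaves identically under py2 and py3, with six covering the remaining differences (like string types). A minimal sketch:

```python
from __future__ import division, print_function, unicode_literals

# With the division import, / is true division on py2 as well,
# so this is 3.5 everywhere instead of 3 on py2.
ratio = 7 / 2

# With unicode_literals, bare string literals are text (unicode)
# on py2, matching py3's str type.
greeting = "hello"

print(ratio)  # 3.5 on both py2 and py3
```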

With those pieces in mind, let's begin!

I started with Cards Against Django, a Django implementation of Cards Against Humanity that I wrote with some friends a couple years ago. We didn't own Cards Against Humanity, and hilariously thought it would be easier to build it than to buy it. (We also may have just wanted the challenge of building a usable Django app from scratch). The end result was a game that could be played with an effectively unlimited number of players, each on their own device, and which was partially optimized for mobile play. To get a sense of what the code was like before I started the migration, browse the Github repo at this commit.

Now it turns out I made one assumption right at the beginning of this port that made things a bit harder, and may have distracted from the original mission. The assumption was that Django 1.5 is not py3 compatible, when in fact it was the first py3-compatible version. Had I found and read this Python 2 to 3 porting guide for Django, I might have saved myself some headache. You now get the benefit of a free mini-lesson on upgrading from Django 1.5 to Django 1.8.

Resource #1: The Django Python 3 Porting Guide

Real quick, I'm going to go through how my environment was set up at the beginning of this project, based on the starting commit listed above.

To start, I set up a virtual environment using mkvirtualenv, installed the local requirements for the app, and initialized the db using the local settings.

Ok, let's upgrade to Django 1.8:

$ pip install -U Django

...and naively try to run the dev server.

Well that's a bummer, but fairly expected that I wouldn't be able to make the jump to 1.8 easily. What's interesting about this error is that it's not my code that seems to be the problem -- it looks like the problem is in django-nose.

$ pip install -U django-nose nose

Try runserver again...

Hmm... obviously the API for transactions changed between Django 1.5 and Django 1.8. Here I looked at the Django release notes, and noticed that 'commit_on_success' had been deprecated in 1.6 and removed in 1.8. Digging in to the new transaction API, it looked like 'transaction.atomic' was pretty much the behavior I wanted, so I went with that.
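The semantics I wanted from 'transaction.atomic' are roughly these: commit on clean exit, roll back on any exception. Here's a toy in-memory sketch of that behavior over a plain dict, purely for illustration; Django's real implementation works with database transactions and savepoints:

```python
class ToyAtomic:
    """Toy sketch of commit-on-success / rollback-on-exception semantics.

    A dict stands in for the database; this is NOT how Django's
    transaction.atomic is implemented.
    """
    def __init__(self, db):
        self.db = db

    def __enter__(self):
        self._snapshot = dict(self.db)  # "begin transaction"
        return self.db

    def __exit__(self, exc_type, exc, tb):
        if exc_type is not None:
            self.db.clear()
            self.db.update(self._snapshot)  # roll back the half-done writes
        return False  # let the exception propagate

db = {"score": 0}
try:
    with ToyAtomic(db):
        db["score"] = 10
        raise ValueError("mid-transaction failure")
except ValueError:
    pass
# db["score"] is 0 again: the partial write was rolled back
```

Either way, a failed request leaves the data as it was, which is exactly what 'commit_on_success' was for in the first place.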

Resource #2: The Django Release Notes

Third time's the charm, yes?

Apparently not. This one was weird to me, because I didn't have South in my installed apps. Through a sense of intuition that I can't really explain, I suspected django-allauth, the authentication package this project uses. I wondered if an older version of django-allauth was trying to do South-style migrations.

$ pip install -U django-allauth

Sure enough, an old version of allauth was the culprit, and an upgraded version allowed the runserver to launch successfully.

So now I have the development server running, but I've got that warning about needing to run migrations. This is the part of this upgrade that I knew was coming, and I was most worried about. I already have the database initialized from Django 1.5's 'syncdb' -- what will happen when I run 'migrate'?

It turns out, not a whole lot. Running this command gave me a 'table already exists' DatabaseError. Googling for this issue left me a little stumped, so eventually I turned to the #django channel on Freenode IRC. (If you're curious how to get a persistent connection to IRC, check out this post.) I was able to get some great help there, and it was suggested I try the one-two punch of running 'migrate' and then 'migrate --fake'.

That '--fake' bit did the trick, convincing Django I had run the migrations (since the tables were already correctly created), and silencing the warning.

With the development server running on Django 1.8 (including the very limited test suite), I'm feeling confident about the migration to Python 3. Is my confidence misplaced? Find out in part 2!

If you'd like to see the totality of the work required to migrate this Django app from 1.5 to 1.8, check out this commit.

If you have feedback about what I did wrong or right, or have questions about what's here, leave a comment, and I'll respond as soon as I'm able!


Review: The Improbable Rise of Singularity Girl by Bryce Anderson

by phildini on May 22, 2015


If you look at the people who are trying to predict Strong AI, Artificial Intelligence that's equal to or better than a human's intelligence, there are two pieces of consensus among them: 1) That there's a real good chance we'll have that kind of human-or-better AI by 2040, and 2) that the reality of such an AI will change our world and our existence in ways that we almost can't comprehend. If you dig into that second piece a bit, you find two camps of people. One camp thinks "the future is so bright we're going to need shades." The other camp thinks "Yeah. Shades to shield our eyes from the nuclear fallout when a bunch of AIs decide humans aren't worth keeping around anymore." (I'm mischaracterizing the pessimist group, but not by much.)

Caught between these two extremes, it's pretty easy to gain anxiety about the future, especially if you work in tech and know how fragile things currently are. (If you want to join me, and a lot of other really smart people, in celebrating/fearing the future, read these two blog posts from Wait But Why.) Both camps agree on one thing though: Humanity basically won't be able to keep up, at all, with our new technological Gods.

But there's an idea that's not explored in the blog posts above, a third option that could be far better or far worse than a benevolent machine God or destructive robotic despot (but ultimately more relatable than either): What if we could upload a human brain, upload all human brains, and beef up their processing power to beyond any intelligence level we can think of today? What if the next superintelligence was actually a human?

This is the idea that's explored in Bryce Anderson's The Improbable Rise of Singularity Girl. A young woman, Helen, the titular character of Anderson's novel, donates her body, and most specifically her frozen brain, to science, on the condition that they try to rebuild her, neuron by neuron, in a computer. Or, more realistically, a vast network of computers. As time progresses, Moore's Law marches on, the computers powering Helen get faster and faster, she gets smarter and smarter, and eventually reaches a level of intelligence and power that can only be described to us real-time, single-brained humans through some very clever literary devices.

The road to super-intelligence is not easy for Helen, as she must navigate the landscape of human interactions while at the same time being a brand new type of human. Not to mention having to make political arguments to fund her survival through grants, and keeping an eye on a true Strong AI that may not have humanity's best interests at heart.

All of this is set against the backdrop of a technological near-future that I had no trouble believing in. With the blog posts above fresh in my mind, I was prepared to dismiss any fictional representation of AI as Science Fantasy, but Anderson has done his homework, and knows his subject material well. (The dates he includes at the start of the book's chapters help build a timeline that will seem fairly plausible after reading Wait But Why). The most impressive part of the book, from a literary standpoint, is the way Anderson can construct the worlds-within-worlds-within-worlds required for a story that happens in an increasingly digital space, and not leave the reader confused as to where they are. There were only a few moments in the book where I felt lost as to what environment the characters were really in, and even then my confusion didn't distract from the action.

The thing that drew me in deep, however, the thing that made me sit up and take notice and plow through Singularity Girl, was that core idea, the idea that maybe we can prevent the technological apocalypse by making ourselves better, rather than making the machines better than us. I'm sure there are many that consider the idea wishful thinking, that would point out there's nothing inherently great about humans at a galactic scale, and that I shouldn't make our species out to be any better than it is. To me, it seems like there's a very thin line between a machine that has our best interests at heart and a machine that wants to turn us all into power sources. One line of code may be all it takes, and it may be that the only thing that can fight a super-intelligent robot is a super-intelligent human.

You should absolutely go read The Improbable Rise of Singularity Girl. The book has good characters, incredible worlds, edge-of-your-seat action sequences, and is almost guaranteed to expand your mind.


IRC all the way down (ZNC + IRCCloud + Quassel)

by phildini on May 2, 2015


For years, I felt that IRC was something I had to put up with. Most of the communities I want to be part of have a large IRC presence, and so I would fire up my trusty local IRC client, connect to Freenode or OFTC, and try to learn from the excellent people who also hang out in various IRC communities. But I was always frustrated by the fact that I would miss discussions when I wasn't connected.

A few months back, a friend of mine introduced me to Quassel, an open source software package that gets around IRC's major limitation (from my point of view): that your ability to read the contents of a channel are limited by your client being connected to the network. (The number of IRC loggers and other workarounds for persistence indicates others also find this a limitation.)

Quassel, in its preferred configuration, requires at least two machines: a core that runs on an always-on server, and a client that connects to that core. The core is what actually connects to the IRC networks with your ident, and keeps a persistent connection for you. On the surface, this might not seem like an improvement over, say, irssi running on a server. It's an improvement for me because, despite several attempts, I have never been able to wrap my mind or fingers around irssi's keyboard shortcuts. Quassel has a nicer interface, a good desktop app, and some mobile app support.

How do you get Quassel? Quite easily, if you're on an Ubuntu system. I recommend one of the cheap boxes from DigitalOcean. They're easy to use, and only $5/month for a 512MB RAM / 20GB disk box.

On the server where you want your Quassel core to run, add the Quassel ppa to your apt repositories:

sudo add-apt-repository ppa:mamarley/quassel

Install the Quassel core package:

sudo apt-get update; sudo apt-get install quassel-core

You also want to make sure you've opened up port 4242 to outside traffic, as that's the port Quassel runs on. If you're not running a firewall (you probably should be!), you don't have to do anything. If you're running ufw like I am, you'll need to do this:

sudo ufw allow 4242
sudo ufw reload

Now that your core is all set up, let's configure it! One of the amazing things about Quassel is that you configure the core through the client. Download the client for your OS of choice, and it will walk you through how to get everything up and running.

So Quassel is great, and for a few months it served all my IRC needs perfectly well. But as I started getting more and more involved in communities on IRC, I started to feel the desire for a more mobile-ready solution. Quassel does have a free Android app, but I currently run iOS, and the iOS app didn't thrill me based on what I saw of it. I started looking for a better solution.

Some of my friends on IRC have been using IRCCloud for months, and they seemed to really enjoy it. I got an invite to the service from one of them, played around a bit, but didn't immediately see the appeal. At the time, I was still happy with my Quassel core and client. When I started hankering for a mobile solution, I gave IRCCloud another look, but didn't feel I could leave Quassel completely behind. By this point, I had given accounts on the core to some other friends interested in IRC, so I knew I couldn't shut it down. Plus, having Quassel as a backup in case IRCCloud ever went down seemed like a great idea. How could I get the best of both worlds, where Quassel and IRCCloud could use the same IRC connection, and I would never lose uptime?

Enter ZNC. ZNC is an IRC bouncer, a piece of software that essentially proxies IRC connections for you. It connects to IRC, and you connect to it, similarly to Quassel. The difference is, the Quassel client speaks to the Quassel core over the Quassel protocol. You can connect to ZNC over IRC, using any client. Like IRCCloud, and the Quassel core.

How do you get setup with ZNC? On the same box where you're running that Quassel core, do:

sudo apt-get install python-software-properties
sudo add-apt-repository ppa:teward/znc
sudo apt-get update
sudo apt-get install znc znc-dbg znc-dev znc-perl znc-python znc-tcl

This will add the ZNC ppa to your apt repositories, and install ZNC. Next you need to choose a user that will run the ZNC service. This could be your default user, although that's not recommended, and it most certainly shouldn't be the root user. I created a new user for running ZNC like this:

sudo adduser znc-admin

Before you configure ZNC to run under this user, you'll need to open another port in your firewall.

sudo ufw allow 5000
sudo ufw reload

Now you're ready to start up ZNC.

sudo su znc-admin
znc --makeconf

ZNC will ask you a whole bunch of questions, like what port to run on, what users to create, and how connections should be set up. The directions starting about halfway down this DigitalOcean article are pretty good, and I followed most of their options, changing the user details to match what I needed. Once you've finished setup, ZNC will give you two important URLs: The URL to connect to the ZNC web interface, where you'll most likely configure ZNC going forward, and the URL for connecting an IRC client to ZNC. That connection URL will be in the form of:

{your server address or IP}:{port you chose} {username}:{password}

If you have an IRCCloud account, you'll need to pay special attention to those last bits, because {username}/{network name}:{password} will be your full server password to connect to the right account. For example:

UserName/freenode:password

When you add the network to IRCCloud, it'll look something like this:

[Screenshot: IRCCloud network settings]

You can use similar settings to connect Quassel to the same ZNC server.

Unfortunately, IRCCloud makes you upgrade your account to add servers with passwords. But in my opinion, IRCCloud is totally worth the $5/month. The more I use it, the more I like the service, the interface, and the mobile support. IRCCloud plus ZNC, with Quassel as a backup client connected to the same ZNC service, solves all my IRC woes. Hopefully, some combination of these services will be helpful to you as well.

And I'll see you on IRC.


I Must Not Fear

by phildini on March 10, 2015


I must not fear. Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over me and through me. And when it has gone past I will turn the inner eye to see its path. Where the fear has gone there will be nothing. Only I will remain.

Recently, I've had a lot of anxiety in my life. I'm dealing with closing my father's estate, projects are changing at work, and parts of my home life are adding a kind of stress I thought I had left behind in college.

I was brought up religious, and the response my mother instilled in me when presented with stress (to be fair, this was her idea of an appropriate response to everything) was "prayer and exercise". I'm not sure how religious I consider myself, but while teenage me thought my mother's advice was too simple, new adult (when do you actually stop being a young adult?) me thinks that simplicity is part of its elegance.

I have discovered few situations that don't seem just a little bit better by working out and admitting your problems, either to yourself or to some higher power.

I posted the quote above because I read Dune in high school, and the Litany Against Fear has stuck with me ever since. You may think it's silly that the mantra of a made-up religious order from a science fiction novel would bring such comfort, but I encourage you to say the words to yourself a few times and see if you don't have a reaction. Also, of course I would get solace from science fiction.

Now if you'll excuse me, I have a long walk to take, and some words to ponder.


Review: Atlanta Burns by Chuck Wendig

by phildini on January 31, 2015


I should be working on editing my own novel, but for some reason I find editing more nerve-wracking and fear-inducing than I found writing the thing, so I'm going to do another review, and see if that brings the focus. On the docket today: Atlanta Burns, the first in a series of the same name by Chuck Wendig.

I think I'll start by claiming some bias. I've been following Chuck Wendig on the Twitterz for about four months, and called upon his mighty spirit to help me get through my first fight with editing my novel. He responded with a virtual bourbon-beard.

It was a touching moment. I like Chuck Wendig, or at least the part of his persona that he shows through his blogging and tweeting. I've had discussions with co-conspirators in the past about how to define a relationship where you feel really close to someone you've never met, and I've never heard of a great word or phrase to adequately describe it. So: bias disclosed, I think Chuck Wendig is pretty great.

I'm conflicted about his latest novel.

This feels like a bit of a betrayal to put in type, and possibly hypocritical. If we look at just the facts, ma'am, the fact is that I finished the novel in *checks GoodReads* less than 48 hours. That's a pretty quick turnaround for someone who is working full-time and watching too much Futurama to boot. So I can't say I wasn't gripped by the story, or engaged by the characters, because I certainly was. Both those barrels hit me in the face and I kept going back for more.

But reading Atlanta Burns was painful. Not painful in the "Oh God, what creative writing dropout wrote this" kind of way, because the writing is excellent. Like, seriously, the man breaks one of his own rules for YA character perspective and does it amazingly. No, Atlanta Burns was painful because I felt the pain the novel's protagonist (named, as it so happens, Atlanta Burns) was dragged through practically from the first page. It felt visceral in a way that I truly wasn't expecting.

I always feel a little strange trying to give a synopsis of a book when I'm reviewing it, because the back cover will do a better job than I ever will, and in reality you should go read the book and then come read my review. But this feels like an appropriate moment to say: Atlanta Burns is a novel about a high-school girl who resists being molested by her mother's boyfriend through the mechanism of a shotgun blast into the boyfriend's nether regions. That's more-or-less the start, and things kind of go downhill from there. The novel takes her through a series of Sisyphean tasks against the most downright-messed-up characters that the mind can imagine when it thinks "backwoods America". People die in this book, and not the people you want to, when all is said and done.

This is background for what I mean when I say I felt some of the pain Atlanta went through. There were moments of physical pain that made my muscles clench, and there were moments of mental anguish where I had to step away for a moment. Wendig is a great writer; it was a bit like being slowly cut by the most exquisitely crafted scalpel, perfectly honed and embellished with decorative filigree.

To say I'm conflicted about the work is an understatement. On the one hand, I've known people who have gone through situations that are approximations of what Atlanta goes through, and there's some scar tissue there. On the other hand, Atlanta takes every opportunity for agency she is given, and is basically the epitome of "don't let the bastards get you down".

It's probably against the law to talk about YA fiction with lead female characters without mentioning The Hunger Games, but here's the difference: Most of what happens to Katniss is the result of a system, of a corrupt governance inflicting oppression and pain on its people. Katniss is often a victim by proxy; President Snow never slams her head against a metal wall himself. Everything that happens to Atlanta is, more or less, personal. The villains are going after her or her friends directly. The scale of the violence is much smaller than in Panem, but it's all the more visceral for it.

Yeah, I'm conflicted.

I have one true complaint, and only one, really. (GREAT SPIRIT OF CHUCK WENDIG FORGIVE MY TRANSGRESSIONS!) I don't read a ton of YA fiction, so maybe this sort of thing is normal. There's a bit at the end where Atlanta records a video message to bring hope to the downtrodden and a warning against the oppressors. In a book where the main character has tried to fight the worst of humanity and fight for the outcast at every turn, the statement felt unnecessary, and diminished, for me, the character's power. The bad guys know what she's capable of, the audience has seen her take a beating and give it back ten-fold, neither side needs the reminder.

There's a quote by Cory Doctorow that goes something along the lines of "I write so many blog posts to help me realize what I actually think about things." Having now written a review of Chuck Wendig's Atlanta Burns, I can say:

Atlanta Burns was one hell of a ride, and worth reading. I'm both excited and terrified for the next volume, but I will certainly be checking it out.


Let's Talk About Country Music

by phildini on January 13, 2015


I make no bones about the fact that I'm not a big fan of Country Music. The closest I get to enjoying the genre is the fact that I love Johnny Cash, but I make a special exception for him in my head: "He's not country, he's like really good folk rock or something." And though I was blown away the first time I witnessed Garth Brooks's stage presence (through a YouTube video, no less), I could not in good faith call myself a country music fan, and have often made and laughed at many jokes at the expense of the genre and those who like it.

Likewise, I was prepared to laugh and join in the fun-poking when I saw an article on Gawker about how all country songs sound the same. You should click through, and watch the video all the way to the end. It's background for the rest of this post, and entertaining as hell.

I reacted, as many of you may have reacted, with an amused smile followed by hearty laughter. How unoriginal those country artists are! How funny this compilation is! We were right to laugh at them all along!

Except. Spectacular, wildly popular art is often created when the artist is under some set of constraints. We respect well-made stained glass because of the constraints of the medium. We respect poetry because it is more constrained than prose. We admire Shakespeare in part because of what he was able to do in the restrained structure of iambic pentameter.

As I listened to the video above, and listened again, I noticed that while the instrumentals were almost identical, the lyrics and the stories being told were unique. Six songs, six stories, all constrained by the definition of the most popular country melody. I realized that the musical composition that has been consistent in popular country for years is the canvas that the artists paint their stories on.

And it's a hard constraint. The most popular country songs from the past few years are about the same length, with about the same structure, and about the same time given to lyrics as instrumentals. With the tiny bit of writing I've done, I can easily see how shoehorning the story that you want to tell into that structure would be quite a challenge.

This was a 'eureka' moment. Everything about country, from the audience to the marketing, to the songs, to the artists themselves is geared not around the musical composition, but around the story. Hell, popular culture even refers to the purveyors of the genre as artists more often than as musicians. They know they're story-tellers more than rock stars. (When was the last time you heard of a rock or rap artist?) They know their music is really about the stories they're telling, and they smile their kind genuine smiles waiting for those of us who turn up our noses to realize this.

As an aside: My wife grew up in an area where Country is King, and country's core audience knows that music is secondary to the story. They're waiting for the rest of us to get off our high horses too.

I'm not saying that Garth Brooks is the next Shakespeare, or that Taylor Swift is channeling Emily Dickinson. And I'll probably continue listening to the same eclectic mix of electronic, classical, and indie rock that I've listened to for the past decade.

But the next time I think or hear the phrase "Country music all sounds the same", I'll remind myself that it's so the story might flow.


Review: Redwall by Brian Jacques

by phildini on January 6, 2015


It has been more than a decade since I first picked up a Redwall book. I can't quite remember what pushed me to pick up that first volume of heroic mice and baleful rats, although I fancy that some well-meaning librarian recommended them to me. The result, of course, is that I tore through every volume the library had, reading Redwall, then Mossflower, then Mattimeo, all the way up to around the Triss-era. I fell out of the series around 2004, and didn't really pick up the following novels.

Since I've spent so much time recently trying to determine my literary roots and inspiration, I got it into my head that I should re-read some of the Redwall series, starting with the titular book itself.

Re-reading Redwall as an adult, with potentially hundreds of books and almost a dozen years between that first reading and now, was a simultaneously enthralling and disappointing journey. About fifty pages in, I realized that the writing was not at all what I remembered. Not necessarily bad, just overly simplistic, as though Brian is trying to talk down to his readers.

I give some credit to the fact that Redwall was the first, and by all accounts first novels are never as good as what comes after. I may dig into the later books at a later date to see if the writing improves, but there were whole sections of Redwall that seemed just too sappy and simple to have ever been believable.

Then again, maybe I'm just cynical, and jaded.

The balance to the at-times mediocre writing (and here I feel bad, damning the dead author and causing my inner child to cry a little) is the fantastic story being told. Redwall is a book whose characters are defined by their actions, not their words, and the actions of the humble band of woodland creatures that inhabit Redwall abbey in their fight against a horde of rats still make me race through the pages. It is a testament to Jacques' quality as a storyteller that, even knowing the end of the story, there were times where I couldn't put the book down, couldn't wait to see what would happen to Matthias and Constance and Basil and Cluny the Scourge.

While the speeches Jacques' characters give can feel flat, the actions they take make them more real than some humans I've met.

I'll end by saying that as I've written this review, it has occurred to me that perhaps Redwall might be best experienced read aloud, and indeed it seems like a perfect book to make into a bedtime story. After a child has outgrown Peter Rabbit, perhaps their minds can feast on Matthias, champion of Redwall. Mossflower wood is waiting, and Redwall abbey is the gateway to a world of adventure.


Homage for the Holidays: Pobal - Twitter Link Aggregation

by phildini on December 16, 2014


I think this week's Homage for the Holidays solidifies my status in the Andy Baio fanclub. My card is in the mail, I'm sure.

Not too long after XOXO, I saw a link on Andy's twitter to this site called BELONG. I wasn't sure what I was looking at, at first. It looked like a collection of interesting links pulled from twitter and aggregated. I got the impression that it was links from people Andy follows, but I've never really known anything about how it was constructed until I was researching this blog post. (The most Andy seems to have talked about it is in this Product Hunt listing.) BELONG is a collection of interesting things shared by people Andy thinks are themselves interesting.

Let's talk about what it does, or at least does for me.

BELONG shows me a set of viewpoints, lets me peek into world views that I'm not sure how else I would've been exposed to. Product announcements, interesting articles, discussions on race and class and gender and equality - BELONG mixes all of these into a half-daily-ish digest that serves me better than any 'social news' site I've seen. I've been turned off Reddit almost completely this year, the Hackernews echo chamber is wearing thin in some places, and once a week is about all I can handle of my Facebook feed. Yet I check my twitter, and BELONG, multiple times a day.

I began to wonder what my own twitter feed would look like if it were given the same treatment as BELONG, so I built one. It's called POBAL (an Irish word for community) and you can find mine under my pebble.ink account.

Let's talk for a brief moment about what POBAL is, starting with some techno-babble. Feel free to skip to the next paragraph for a tl;dr. POBAL is a python script, a shell script, an html template, and a cron job. The POBAL script, triggered to run every hour by cron, pulls tweets from my twitter feed, figures out which ones have links, fetches the titles for the pages being linked to, and renders a nice list to html. The links are (currently) weighted by one-half the number of favorites plus the number of retweets. The algorithm may change as I play with it more. All of the code, minus the one line of cron, lives in the POBAL github.
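For the curious, the weighting described above can be sketched in a few lines of Python. This is an illustrative sketch, not the actual POBAL code; the dict field names (`urls`, `favorites`, `retweets`) are assumptions for the example.

```python
# A minimal sketch of POBAL's link ranking: keep tweets that contain
# links, weighted by one-half the favorites plus the retweets.
# Field names here are illustrative, not the real POBAL structures.

def link_score(tweet):
    """Weight a tweet: one-half its favorites plus its retweets."""
    return tweet["favorites"] / 2 + tweet["retweets"]

def rank_links(tweets):
    """Keep only tweets that carry links, highest-scoring first."""
    with_links = [t for t in tweets if t.get("urls")]
    return sorted(with_links, key=link_score, reverse=True)

tweets = [
    {"text": "a", "urls": ["http://example.com/a"], "favorites": 4, "retweets": 1},
    {"text": "b", "urls": [], "favorites": 100, "retweets": 50},
    {"text": "c", "urls": ["http://example.com/c"], "favorites": 2, "retweets": 3},
]

ranked = rank_links(tweets)
# Tweet "b" has no links, so it drops out; "c" scores 2/2 + 3 = 4
# and outranks "a" at 4/2 + 1 = 3.
```

The real script then fetches each linked page's title and renders the ranked list to HTML on a cron schedule.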

So: POBAL is a collection of interesting links from my twitter.

You're welcome to POBAL it as you see fit. If you'd like to use this but don't want to do the setup, I'm hoping to get POBAL to a point this week where others can have their own easily. You're welcome to ask me for help, and I will lovingly take feedback (my design sense is probably atrocious, and the logo was generated from a Python art program). Pull requests also welcomed.

This is in many ways the project that really inspired Homage for the Holidays, and the one that I will probably use the most. There's something indescribable about seeing what your network is sharing with you. You begin to get a sense of the caliber of people you follow, what your network cares about, and by extension what you care about. I've been using POBAL for about 24 hours, and it's already prompted me to take a good hard look at who I'm following, and the quality of what they're adding to my life. I've followed some others, and unfollowed some dead weight (mostly corporate twitter accounts).

But the other thing about seeing all the best links from your twitter listed out is that they get harder to ignore. Social media isn't really ephemeral, in that nothing ever actually dies, but the way we consume it often is. Pulling the materials being shared out of the stream-of-consciousness context forces you to look at them more critically, to evaluate what normally drifts past your eyeballs. In the best case, it exposes you to thoughts that make you and those around you better human beings.

POBAL is not BELONG. The code is different, the algorithm is different, the design is different (way worse, most likely) but the spirit of gathering news from your network how you want is there. That makes it a decent homage, I think. If I'm feeling grandiose, BELONG is a facet of the new oral tradition we call social media, of which POBAL is an imperfect mirror. If I'm being more realistic, POBAL was just fun to build and a kick to use.

Hope you enjoy.


Homage for the Holidays: Pebble.Ink

by phildini on December 9, 2014


I'll admit, the next Homage for the Holidays post is kissing the line between "homage" and "abject copy".

Here's the backstory. A couple weeks after XOXO 2014, Internet's Paul Ford started a project called tilde.club. You should go read the original post about the project, but here's a quick tl;dr: Tilde.club is a delicious piece of nostalgia from the beginning of the world wide web. Every member gets shell access, email to other members, and a small piece of web real estate.

I feel like "nostalgia" may not be the right word to describe tilde.club, because that word has become a little overloaded. The explosion of pixel-art games and the social conservative rhetoric of a return to simpler times has added a context to "nostalgia" that can leave a bad taste in the mouth. This is a little unfortunate, because things that are older are not necessarily bad. (see above re: pixel-art. Some fabulous examples there.)

A better word might be "ownership". Being a member of tilde.club means you have a little slice of the web that is yours, you can put whatever you want there, make it look however you like. While owning a chunk of the web may sound like no big deal, and in fact I would wager that most tilde.club members have their own dedicated web presence elsewhere, there's a fundamental difference.

Having your own space on the web, on a server you own or rent with your own domain name, is like having a massive plot of land a few hours outside of town. It's yours, you can do whatever you want with it, but you have to push people to come over and visit. Having a space in tilde.club is like leasing an apartment in the trendy new complex on the town square. You can still do almost anything you want with it, but your neighbors are all in shouting distance, they're probably really friendly, and you can see what their places look like as well. And, because the place is run by Internet's Paul Ford, a lot of people are always dropping by to visit.

That's the setup, here's the homage. I applied, and did not get into, tilde.club. This bummed me out a bit, until I realized that I possessed the materials to build my own. So I did. I spun up the cheapest Digital Ocean box that they make, installed apache with user directories turned on, and opened it up to friends. It's called Pebble.Ink.

Now, I want to open it up to you. As of today, if you're a member of Pebble.Ink, you get shell access, email forwarding, and your own little slice of the web to do whatever you want with. More features will be coming soon, including (possibly) the ability to run rich python and php apps.

If you've been paying attention to the tilde.club story, you probably noticed that other people had the same idea I had, and started their own versions. This put a damper on my plans somewhat, but I'm charging forward. I have not added Pebble.Ink to that list, although I plan to try and do so soon. I'm not using the official tilde.club puppet script, either.

I would be thrilled to have you along for the ride. Check out Pebble.Ink for more info, and watch this space for updates.


Homage for the Holidays: from Portland get XOXO

by phildini on December 2, 2014


This is the first Homage for the Holidays project. It may help to read the previous blog post for context.

Let's follow the logic train on this one. XOXO was awesome, and inspired me to want to do projects not for fame or money, but just because they were interesting to me and hopefully pushed the envelope in some way. I started by looking at what the organizers, Andy Baio and Andy McMillan, had done previous to and alongside XOXO for inspiration. Digging through Andy Baio's profile, I found his work with Inform 7, and Playfic.

Briefly: Inform 7, often shortened just to Inform, is a language and tool for creating and running interactive fiction. If you think you don't know what interactive fiction is, think back to really old text based games - "go north you are eaten by a grue" type stuff. Playfic is a site where you can run and play interactive fiction works in a web browser. Playfic was created by Andy Baio, because he wanted to -- well, I'll let him tell it:

Andy loves interactive fiction and wanted to make a game, but found it too hard to share his work-in-progress online. In an epic tale of yak shaving, he built Playfic before writing his first game.

From the Playfic website

I deeply empathized with this level of yak shaving, and Playfic/Inform seemed like a great way to give homage to one of the people who had inspired me. Additionally, in my head Inform and I had unfinished business. If my education had gone according to plan, I would have taken a course on interactive fiction (a subject that really excites me) and spent three months doing a deep dive into Inform. My school career didn't exactly follow a linear path, so I never got a chance to play with Inform.

Until now.

Here is what I have done, although started may be a better word. I have rebuilt, as much as possible, the XOXO 2014 experience in Inform 7, hosted on Playfic.

The project is called "from Portland get XOXO".

Is it finished? Nope, although I'll be working on it more this week, and posting updates when I can. What is there is the bones, including pretty much every major location and most of the events from the four days of fun.

What would I like to happen? Well, mostly I want people to play it. And have a reaction to it that hopefully isn't boredom. I also want people to change it, make it better, make it crazier, basically take what I've done as a base and push the limits of it.

So I've put the whole thing in a github repo, and I am actively soliciting issues and pull requests. Anyone who contributes will get my eternal thanks, and a call-out on twitter plus this blog.

Play it, have fun with it, and make it your own.

Thanks, and look for more updates about the project this week, plus more Homage for the Holidays posts in the coming weeks.


There's No Place Like Homage for the Holidays

by phildini on November 30, 2014


Update 2014-12-08: Pebble.Ink, the second Homage for the Holidays post, is up.

Update 2014-12-01: The first Homage for the Holidays project is up.

Over the summer, on my first trip to Portland, I read the book Steal Like an Artist by Austin Kleon. If you do something that could remotely be considered creative, I highly recommend you check out the book, and his more recent Show Your Work.

Reading Steal Like an Artist, then returning to Portland for XOXO in September, helped drive home a lesson that my Game Design professors tried to teach me in college, but that I rejected: The best way to expand your creative horizons is to copy work you like, preferably with your own take on it.

When I was in school, I didn't see the value in trying to copy what someone else had done. That didn't sound 'creative enough' to me, partly from ignorance and partly from arrogance. Maybe more than partly from arrogance. It's also 100% possible that I just didn't care enough about games to try and emulate the ones that interested me.

So I graduated, started an engineering job, and was quickly exposed to the fact that those who are best at what they do often succeed because they've learned from the mistakes and successes of those who have come before them. And the best way to learn from someone is to try and do what they've done.

I got a double helping of this when I tried to build my own Content Management System - basically a glorified blogging system - for The Adventures of Captain Quail, a webcomic run by myself and my incredibly talented artistic partner. I went into the project thinking that I'd have the whole thing together in a month, tops. A year later, there's still more I want to fix about it. Did I need to build a CMS? Is mine any better than what else is out there? Probably not. But the lessons I learned in building it I'm not sure I could have learned any other way.

This is where I've learned the value of building things that are directly or indirectly influenced by others: You get a tiny insight into what they've gone through, and can use that to make your own work better. I think this was, in many ways, a subtheme of this year's XOXO: That nobody produces great work in a vacuum, and the projects that seem most original or creative can be traced to specific influences from the creator's life.

So, reading Austin Kleon's book and going to XOXO and thinking about what I've discovered in the past year that really excites me, I'm embarking on a project I'm calling Homage for the Holidays. Every week, starting December 1st, I'll release a new project directly inspired by something I've seen this year that I thought was awesome. I'll be posting about them here, and right now the projects I'm planning to release will be available on the web. I'll try to document everything that happens with them, and the source materials for all of them will be freely distributed online.

Most of the projects I'm planning will be collaborative by nature, and I would be thrilled and grateful if people wanted to work on them with me, but I'd also love to see other people run their own homages. Tell me about them, and I'll link them here.

Thank you to everyone who's provided me with inspiration this year. I hope I can spread that to other people this month.

See you on the 1st!


XOXO 2014

by phildini on September 22, 2014


A sample of my badge collection.

I've gone to a lot of conferences. When I was 17 I lied on the registration for LinuxWorld and said I was 18, which was the minimum age requirement. I was pretty into Linux in those days, and having LinuxWorld in San Francisco was too good a chance to miss. As it turns out, I probably interacted with people at that conference who would end up being friends and co-workers more than 5 years in the future.

My early conference-going years, and I should point out this was before I was actually full-time employed in tech, were all about LinuxWorld and MacWorld. Those were the things I liked, because I was really weird in high school, apparently. In 2010 I discovered comic conventions with WonderCon. I was blown away with the realization that this whole world of comic-and-movie lovers existed, and I had only dipped my toes in it in comparison.

2012-2014 was filled with technical conferences and fan conventions. HTML5DevConf, PyCon, KrakenCon, AOD, APE, BigWow ComicFest. There was something that drove me to each of them, and PyCon sticks out as being full of friendly pythonistas. PyCon still ranks as my favorite technical conference.

Sometime through all this I realized that I enjoy going to conferences and conventions because I like hanging out with groups of people who are guaranteed to share at least one interest with me. The strength of that shared interest normally dictates how much I like the con.

Now we come to XOXO 2014. I will be forever grateful to Tom Cenzani, one of the many excellent people I work with at Eventbrite, for showing the videos from last year's XOXO Conf. As I watched Cabel Sasser, and Jonathan Coulton, and Maciej Cegłowski and all the other speakers talk openly and honestly about their successes, and failures, and fears in trying to build things, in trying to add something to the world while following their own path, I knew I had to be a part of that. The idea that there was a con, this thing that I already knew I loved, dedicated at least in part to being a creator, a thing I struggled with daily - how could I not want to be there?

Of course, I promptly forgot about this desire. The videos came out last fall, registration didn't open until the spring, and we lead busy lives. But when registration opened, I remembered watching those videos, and immediately went to sign up. And right from the beginning of registration I had a feeling I was in for something special.

We've established that I go to a lot of conferences, but most of them are technical, and therefore not deeply accessible to my wife. She's incredibly intelligent, but has chosen math and music over technology. I have, in the past, felt sad about not being able to share my joy at certain cons with her, because the common thread or theme was something that she didn't have experience with. So when I saw the blog post from XOXO about families, I paused for thought, and then realized I had finally found a conference that (maybe, hopefully) my wife and I could enjoy together. That in and of itself is an incredible notion, one that still brings me joy.

So I got her to sign up, and we went. Myself, my wife, and my cofounder. I wasn't entirely sure what to expect, they were definitely not sure what to expect, but all of us saw enough interesting bits in the program that we were excited. And well we should have been excited. From the opening party Thursday night to the closing party Sunday night, XOXO 2014 was an experience that none of us were truly prepared for, and none of us will forget.

This is where writing this post gets tricky, for me. I could go into excruciating detail about every game, and musician, and speaker that we loved, but I think that would miss some of the true character of XOXO, for me. PyCon was the first convention where I heard about the concept of the 'hallway track' as a measure of how good a con is. The 'hallway track' is all the conversations and random meetings and excellent discussions you have between sessions, over meals, or generally outside the scripted part of the con. For cons like PyCon, the quality of the 'hallway track' is one of many factors used to determine how good a particular year's con was.

For XOXO, the 'hallway track' is the con. The spirit of XOXO may be distilled in speakers, and the musicians, and the games, and the films, but it lives and breathes and shouts with joy at the conversations and chance meetings that take up every spare moment. I thought I had found friendly groups of people at other cons - they don't hold a candle to the friendliness and warmth of the attendees at XOXO. I made what I hope are lifelong friends during that weekend in Portland. Out of *hundreds* of conversations, I can only remember one where I didn't come away feeling excited and so happy to have talked with that person.

For me, XOXO wasn't a huge 'BANG' of insight or revelation. It was a slow burn shared with a thousand perfect strangers and true friends who were there because they make things, and who wanted to share the reassurance that makers are not alone. If you look up a bit from the candle that you are desperately trying to keep burning, you'll see hundreds of others, all with their own candles, ready to lend a hand. It was the best four days of my life that I can remember, and the fact that I got to share that with my wife makes it all that much sweeter.

Now, the hardest part: I am privileged. I am a white, straight, twenty-something male who works as a software engineer for a startup in the San Francisco Bay Area. It would be really, really hard to be playing life on an easier mode than what I'm currently playing. I try, every hour of every day, to recognize my privilege and not let it drive my actions. I am surrounded by those less privileged than I, and I struggle with what I can do to help. I am not perfect, and never will be.

And because I recognize my privilege, and want to be a human being despite it, I applaud and support everything the Andys do to make XOXO a more diverse place. I don't think that the conference is for everyone (how could it be? how can anything with any focus be for everyone?), but I do think it benefits from having as diverse an audience as possible. Which means that, if there is another XOXO, it's unlikely I'll be selected to attend. There is most likely someone far more deserving of the spot, and I am incredibly lucky to have been able to attend once. This is probably how the universe should work.

But I also desperately, desperately, desperately want another chance to spend four days in Portland with all the friends I have, and all the friends I haven't met yet. To be inspired together, laugh together, play together, sweat together, and remind one another that we are not alone.

Thank you, Andy, and thank you, Andy, for an incredible XOXO.




Weekly Words 3

by phildini on September 2, 2014


I'm working my way through a book of Jack Vance-inspired stories, and the vocabulary is thick as fog.

philtre - a magic potion (Wiki)

ague - a fever (Merriam-Webster)

whit - the smallest part imaginable (Merriam-Webster)

sortie - deployment of military forces from a strongpoint (Wiki)

ell - a unit of measure derived from the length of a man's forearm. (Wiki)

lictor - a bodyguard for Roman political leaders (Wiki)

nacre - another name for mother-of-pearl (Wiki)

cataphract - heavily armored cavalry (Wiki)


Weekly Words 2

by phildini on August 25, 2014


Another installment of "Words What I Looked Up This Week", defined for our mutual pleasure.

transom - transverse horizontal structural beam, normally over a door (Wiki)

loggia - exterior gallery supported by pillars in a facade (Wiki)

pilaster - ornamental fake column (Wiki)

sestina - fixed verse poem consisting of six six-line stanzas (Wiki)

villanelle - 19-line poem consisting of five tercets, followed by a quatrain (Wiki)

damask - reversible fabric of patterned silk (Wiki)

orpiment - deep orange/yellow arsenic sulfide mineral (Wiki)

roseate - rose-colored, optimistic (Wiki)


Weekly Words 1

by phildini on August 14, 2014


Howdy. On the recommendation of a really good book I picked up a couple weeks ago, I've started collecting words that are unknown to me. Here are the words I came across this week that I needed to look up, with my interpretation of their definitions.

milieu - the general environment something takes place in, including natural and unnatural features, atmosphere, and culture (Wiki Milieu)

serge - a type of fabric, tightly woven, often wool, with a distinct pattern (Wiki Serge)

zwieback - twice-baked sweet toast from Germany. Literally means "twice baked" (Wiki Zwieback)

dramaturge - researcher and developer in a theater company. Kind of a meta-producer, responsible for meshing all the different creative visions with the source material (Wiki Dramaturge)

trinitite - glassy mineral residue left by the Trinity Bomb test (Wiki Trinitite)

Check back next week for more words!


On The Long Memory of the Internet

by phildini on August 13, 2014


It's common knowledge that anything you do online stays online forever. Once you publish something to the internet, it can never really be unpublished. I'm not saying my past weekend completely disproved this theory, but it definitely made me think hard about what actually lives and dies on the Great Wide Web.

This blog has been running in some form or another since 2008 or so. This past weekend, after discovering my long-running WordPress setup was utterly borked, and seeing that I had a recent backup, I decided to blow away the WordPress install, re-install WordPress, and restore the backup.

This went smoothly enough, until I realized that what I had for a backup wasn't a WordPress backup, but a raw SQL dump. After looking online for ways to restore that SQL dump into the new WordPress installation, I gave up, and momentarily sat stunned at my loss.

It's not a complete loss, since all the text and data still lives in that SQL backup, but it means that any links to anything I've written in the past six years no longer work. That was the bigger blow, since I'd like to fancy that some of the things I've written have decent SEO.
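As consolation, the words themselves are usually recoverable straight from a dump like that: WordPress keeps posts as rows in the wp_posts table, so even a crude pass over the INSERT statements gets the text back out. A minimal sketch in Python (the dump snippet and column layout below are made up for illustration, not taken from my actual backup):

```python
import re

def extract_rows(dump_text, table="wp_posts"):
    """Pull the VALUES tuple out of each INSERT statement for one table."""
    pattern = re.compile(
        r"INSERT INTO `{0}`.*?VALUES\s*\((.*?)\);".format(table),
        re.DOTALL,
    )
    return pattern.findall(dump_text)

# A made-up two-row dump, just to show the shape of the output.
dump = """
INSERT INTO `wp_posts` (`ID`, `post_title`, `post_content`)
VALUES (1, 'Hello World', 'My first post.');
INSERT INTO `wp_posts` (`ID`, `post_title`, `post_content`)
VALUES (2, 'Second Post', 'More words.');
"""

for row in extract_rows(dump):
    print(row)
```

Real dumps are messier (escaped quotes, multi-row VALUES clauses, serialized metadata), so this is a starting point for salvaging text, not an actual parser.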

And that train of thought led me to the idea at the top: nothing dies on the internet, right? Between the Internet Archive's Wayback Machine and the copies of the internet Google has cached, all my work is still accessible online, right?

The answer is "kind of". Yes, there are three snapshots of my site on the Wayback Machine (which is a bit of an ouch moment; so much for my importance online!). But those copies are pretty much just text, and don't include any of the media in the posts. And there are probably cached copies of my posts in Google's index, but good luck finding them. I could not.

Food for thought when deciding whether to host your own system or host through a company that has some interest in your data always being accessible. I'm not proposing abandoning WordPress and moving completely to Tumblr (or Medium or Svbtle or Posthaven), but it is making me consider one value of cross-posting: backup!



Well, Crap.

by phildini on August 7, 2014


Looks like I lost everything and have to start over on this blog.

I have a SQL dump that I should seriously look at restoring in some meaningful way.

Until then, I'm starting over.