There are many challenges to running an Open Source organization, but the one that I have personally felt the pain of again and again is that our tooling is awful. Github (and realistically we’re all using Github at this point) still feels in many ways like a tool designed around the idea that all the action is going to happen in one repo. This may not be entirely the fault of Github. Git itself is very tightly coupled to the idea that anything you care about for a particular action is going to happen in one, and only one, repository.
When Github released Organizations, the world rejoiced, because we could now map permissions and team members in our source repository the way they were mapped in the real world. Every new feature Github adds to its Organizations product causes more rejoicing, because so many teams work across multiple repos, and the tooling around multiple repos is still awful.
The awfulness of this tooling is probably a strong factor in the current trend towards “microservices in a monorepo” code organization, but that’s another post.
I’ve been the equivalent of a core contributor for a half dozen Github organizations, and I’ve noticed that one area where the tooling is especially lacking is around labels. I’ve seen labels used to designate team or individual ownership, indicate the status of pull requests, signal that certain issues are friendly for beginners, and even used as deploy targets for chunks of code. It’s fair to say that labels form a core tool in the infrastructure of every team I’ve seen using Github, and yet the tooling Github exposes for labels is painfully lacking.
I could go on and on about this, but my goal here isn’t necessarily to make Github feel bad. I hope they’re working on better label tooling, and if they want ideas, boy am I willing to give them. But there is one label-specific wall I kept banging my head against, and that is label consistency across all the repos of an Organization.
Some of you read that and feel remembered pain. I feel that pain with you, and we are here for each other. Some of you might have no idea what I’m talking about, so I’ll explain a bit more.
Let’s say you want to add a “beginner-friendly” label to all the repos in your Open Source Organization, so that new contributors can find issues to start with. Right now on Github, you would need to go into every repo, click into the Issues page, click into the Labels tab, and manually create that label. There are no “Org-wide labels”, and no tool for easily creating and updating labels across all the repos of an organization.
Introducing Epithet, a Python-based command line tool for managing labels across an organization. You give it a Github key, organization, and label name, and it will make sure that label exists across all the repos in your org. Give it a color, and it’ll make the color of that label consistent across all repos as well. Have you decided you’re done with a particular label? Epithet can delete it from all your repos for you. Are you using Github Enterprise? Epithet supports that too.
Epithet exists to fill a very particular need in open (and closed) source Github organizations, and it’s still pretty alpha. We use it for the BeeWare project, and it might be used soon for syncing labels in the Ragtag organization. You can start using it today by checking out the (sadly small) documentation, and if there’s a feature missing you’d like to see, I’m happy to work with you on getting a PR submitted.
Managing Open Source organizations is hard. My hope is Epithet makes it a little bit easier.
Python is the best technical community I’ve seen, and close to the best community I’ve seen at this scale. If you’ve been programming for any length of time, you’ve seen technologies and frameworks and languages rise and fall. We often bemoan the loss of certain ideas from these fallen works, but rarely talk about the communities that fell with them. Python is in many ways the most deliberate community that I’ve ever seen around a technology, and my life will be worse if it ever falls.
I think the Python Community is either near an inflection point, or right on top of one. What do I mean by that? I mean that, over the next five to ten years, I see two paths for the Python community and ecosystem. (Because “Python community and ecosystem” is long to type and read, I’m going to use “Python” to mean “the Python community and ecosystem” for the rest of this post.)
Path one, the one I hope we take, is the one where we take active steps to grow Python. It means that we are continuing to welcome new people into the community, from areas we never considered. It means we have a surplus of good, well-paying jobs for Pythonistas at every experience level. It means the companies and organizations creating those jobs recognize what Python gives them, and sponsor the ecosystem and community events to be better than ever.
Path two is the path I’m worried about. It’s the path where we expect Python to take care of itself, where we collectively take a more passive approach to the community that so many of us enjoy, and which has given much to many of us. I think this path results not in Python dying overnight, but in a slow decline, in Python becoming more and more irrelevant over time. It results in fewer Python jobs, and more Go or Node or “insert language here” jobs. It results in Python being pigeonholed into certain industries, and new Pythonistas being forced to learn some other language to start their careers. It results in our major events slowly shrinking over time, and a time where we start counting down attendees instead of counting up.
I’m not going to try too hard to convince you that this is where we are, that we are at or close to a fork in the road. It’s what I believe, and I think some of you might agree already, but here are some of the things I’ve noticed that make me think we’re close to such a point.
PyCon 2017 was fantastic, and had more attendees than ever, but had noticeably fewer booths in the expo hall than last year, and I believe fewer sponsors overall.
Other Python and Django conferences, especially the smaller regional conferences, are finding it harder and harder to get sponsors. Some of this is the market tightening, some of this is companies moving out of Python, or not feeling like they get a return on their investment.
More programs and code schools are using Python as their teaching language, but for many graduates the entry-level positions just aren’t there. Some of this is, again, the market not hiring entry-level, and some of this is the companies we work for not being willing to take risks and train.
Based on the above, and some other feelings and anecdotes, I think we’re right on top of the fork in the road. So what do we do about it? We take deliberate actions to help grow Python. Here’s what I’m planning to do over the next year:
Running for the PSF Board of Directors. Why do I think being on the Board is important in the context of this post? Because I can push for growth at the Python organization level, and I can get things done as a Board member that I can’t get done as a non-Board member of the PSF. Anyone reading this can, and should, run for the Board if they feel so inclined. But I’d also love to see more participation in the PSF committees, especially along the lines of fundraising and outreach. No matter the outcome of the election, I’m going to continue my work on the Sponsorships committee, and keep doing the other things on this list.
Reaching out to University Computer Science departments about using Python. I’m already in the process of arranging a guest lecture with classes in my old CS department about life as a professional Software Engineer. I’m planning to add specifics about how I use Python (which is more and more the introductory teaching language) in my professional life. My hope is I can help connect classroom lessons to professional Python just by showing up and giving a small talk.
Reaching out to University Science departments about Python. If the keynotes at PyCon 2017 taught us anything, they taught us that Python is an incredible resource in research science departments, statistics departments, anywhere deep thinkers need to do computation and visualization. I’m hoping to put together a “Python in Science” roadshow to help with this, but the reality is Software Carpentry is years ahead of me in making this happen, and anything we can do to help them is almost certainly worthwhile.
Being a Core Contributor to the BeeWare project. Python has great stories around developing web applications, working in the sciences, and doing systems tasks. Our stories around developing consumer apps are lacking, and I don’t think they need to be. BeeWare, and many others, are taking a stab at filling this gap, but for you reading this the action item could be “find a Python project in an area you care about, and work at making it the best it can be.”
Volunteering time to get more companies and projects started in Python. This one is more nebulous, and I haven’t done it yet but plan to soon. I’m planning to reach out to VCs and incubators and especially hackathons and say “Here’s my background, I’m happy to show up to any event and donate my time to help, but I’m only going to help with Python.” I don’t know how this is going to go over, but this idea has some exciting potential. If we want more jobs in Python, we need to be pushing for more companies and projects to use Python, right from the beginning.
If any of these ideas seem interesting to you, feel free to copy them! If they seem interesting but daunting, feel free to reach out to me ([email protected]) to chat about them. If these ideas inspired your own ideas in a different direction, great! Tell me about what you’re doing and I’ll share it far and wide. My goal in listing these ideas isn’t to toot my own horn, but to start a conversation about methods for Python outreach, in the hope of growing Python.
Of course, I could be wrong in my beliefs. (I’d actually love to be corrected with stories or data that show I’m wrong, and would happily share them here.) What if Python is healthy, and is going to grow consistently over the next decade?
Then I’d still do everything I’m planning to do, and encourage others to do the same. I think everything we pour into the Python community is valuable, and any new Pythonista we bring in enriches us all in ways we can’t possibly anticipate.
If I’m wrong, and we make Python better for no reason, we’ll still have a better Python.
I'm entering the arena for the third year. I'm running to be a Director for the Python Software Foundation. This post will help explain why.
There's an argument that anything I write here should instead be in my candidate statement. I don't disagree, and the reason I'm writing here instead of there relates to one of the things I'd like to change: Nominating for the Board of Directors requires editing the Python Wiki. The Python Wiki is hard to use, the documentation on it is not well-exposed, and a room full of Pythonistas during the PyCon sprints (including one current Board member) couldn't tell me who maintains it. Beyond that, you have to answer somewhat-esoteric Python trivia to submit your edits.
I'd like a clear definition of what the Board and the community thinks the Wiki is for, and regular check-ins on whether it's serving the community well. I'd like to see us running "PSF Sprints", where Board members or anyone else interested is writing documentation about how the PSF is run. Our election processes and funding processes and budget processes and outreach processes should be checked on regularly, and I'll be pushing for more transparency and openness about how we run the business side of Python.
Speaking of outreach, I'd like the PSF to be doing more of it, and funding groups who are growing the Python community. There will be more about this in a future post, because I have plans on how to get our community of Pythonistas back out into the world growing the community at universities and hackathons and incubators and corporations. I want every group and individual trying to grow Python to know that the PSF has their back, and will put money behind them.
I also want to be on the Board to remind the PSF that they have power beyond grant giving. Yes, the majority of what the PSF Board has done in recent years has been giving grants to organizations around the world. That work is excellent, and I want to see it increase. But the PSF is also in a unique position to be a promotion clearing house and force multiplier for good ideas in the community. When good learning materials are written, they should be easily findable from the official Python websites. When Python events are being held, the PSF should be a cheerleader, spreading the word about what's happening in the community.
These are the things I plan to do as a PSF Director to help grow Python. I haven't even gotten into the investment I want to see us putting into our core tools and platform infrastructure; that will have to be another post and my brain is a little fried from PyCon.
So the only question left is: Why do I need to be on the Board to do these things? And the answer is I don't. These are things I'm going to push for no matter what. But the PSF is in many ways the voice of the community, and I want to see that voice brought to bear on the issues that will be affecting our community for the next year and the next decade. I think I can help use that voice to speak for the Pythonistas of the future, and I hope you agree.
Here we are at the end of the first conference day of PyCon. Thinking over the day, and including thoughts from the opening reception last night, I'm struck by something that is even more true this year than it was last year:
The Python community is incredible. We are at an inflection point where we need to be making measured, conscious decisions to keep Python and its community thriving.
I'm going to be writing even more about this in the coming weeks, but let me jot down some observations, and then try to sum them up at the end:
1. It was pretty obvious to anyone who had been here last year that there were fewer sponsor booths in the expo hall. Noticeably fewer. Speaking to someone who had a booth last year and chose not to this year, there was perhaps a level of disgruntlement with the organizers that caused them to skip this year. That's troubling. What's even more troubling is talking to conference organizers for other Python and Django conferences about how it's gotten harder this year to find sponsors.
2. Jake VanderPlas' keynote this morning highlighted some areas where Python is making incredible inroads, and showcased how Python is becoming the de facto tool in many areas of the science community. They choose Python for many reasons, and one of them is:
"Speed of development is primary, speed of execution is secondary" #PyCon2017
These thoughts really resonated with the audience, based on the number of likes and retweets I got. And I think these are sentiments the community at large shares. We don't (necessarily) choose Python because it's the fastest language on the planet. We choose it because we like working in the language and we love the community that comes with it.
3. Speaking of that community, I didn't get a chance to see as many of the talks as I would like, because I spent so much time chatting with people about fascinating topics in the hallway track. The hallway track continues to be one of the best parts of PyCon, and it was especially noticeable this year that people were being encouraged to participate. One of the amazing things about PyCon is that all the talks are recorded and put online for free, sometimes within hours of their being given, so attending a talk can often be considered secondary to meeting interesting people in the hallway.
4. This is even more pure anecdote than (3), but it felt like I heard of more people finding it harder to get jobs in Python building web applications, and easier in things like data or science. I can't prove this is true, and it might not be all bad, but it's something to watch. Any area where it's suddenly harder to find work in Python means a pillar of our community is weakening, and we should be aware of it.
5. The day ended with lightning talks, and I hope everyone in the audience saw Cameron Dershem's talk about what the Rust community is doing better than the Python community, especially when it comes to improving usability of the language and making it easier to contribute. Furthermore, I hope it was a call to arms for all of us to start pushing for making every aspect of our community feel welcoming, and like new people can make a difference.
Summing up: Python's community still feels like home, to me and many others, and PyCon feels like a homecoming. If we want to make sure the community continues to be incredible, we need to keep an eye on trends in where people and companies are using Python. We also need to continue to be excellent to each other, whether the person we're talking to has been here for years or just learned about Python today.
I'm going to keep pushing to make Python better, and I look forward to seeing you all at PyCon tomorrow.
There is a recurring villain in the Terry Pratchett novels called The Auditors. They show up over a number of books as the adversary of Death, and make one of their most daring ploys in Thief of Time, which I also consider in the top five of the Discworld canon.
The Auditors, whom I describe in the singular because it thinks of itself as one entity, is one of the most insidious adversaries in the Discworld because it is not evil, or even mean-spirited. It simply wishes there was more order to the universe, and would prefer all life to stop because life is so disorderly.
One of the best descriptions for the existence of The Auditors begins thusly:
Nine-tenths of the universe is the knowledge of the position and direction of everything in the other tenth. Every atom has its biography, every star its file, every chemical exchange its equivalent of the inspector with a clipboard. It is unaccounted for because it is doing the accounting for the rest of it, and you cannot see the back of your own head.
Nine-tenths of the universe, in fact, is paperwork.
This phrasing has stuck with me recently in regards to critique of the internet and online communities. In many ways, the internet is its own auditor, its own bookkeeper; its existence is the record of its existence. And yet, thousands upon millions of words have been written explaining more about the communities and worlds that make up the internet. Entire libraries could be filled with the printed pages of digital commentary on Twitter and Facebook and all manner of internet forum that have come before.
One of those pieces of internet critique, which I will not link to because I do not wish to drive traffic to it, rankled me, to the point where I felt a need to counter it. But to counter it properly, we need to establish some background. Carl Sagan once said "If you wish to make an apple pie from scratch, you must first invent the universe."
And so must we, but the universe we're inventing is The Fediverse, specifically one small corner of it called The Wandering Shop. This is the story of how The Wandering Shop came to be.
Twitter is awful. This is the sentiment that so many of us, myself included, have felt in the past months and years. Its product decisions are opaque, its community vacillates between cynicism and bigotry on a daily basis, and you can never shake the feeling that you're trying to have an intimate conversation while yelling at the top of your lungs in the public square.
Enter Mastodon, a piece of Open Source software (which in this case, as in most cases of Open Source, means software anyone can edit and everyone has opinions about which they'll happily go to the barricades for). Mastodon has its own fascinating and complex history, involving multiple internet generations of Open Source and Free Software activists, but it is, in essence, a piece of software that behaves much like Twitter that you can host yourself. This definition is untrue by almost every measure, but it aids wonderfully in understanding.
Two crucial facts about Mastodon are true, and possibly beautiful, making them True again: 1) You can host it yourself, meaning you have full ultimate control over the canonical copy of the data. 2) Anyone who joins your instance, your little slice of the network of servers, is subject to your rules, and your community decisions.
One of the things that Twitter got wrong, in my opinion and the opinion of other heavy Twitter users, is that it thought the only way to grow and to have a userbase committed enough to fill the gaping maw of Venture Capital was to allow free speech to be the default. It's truly mind-boggling how, when unfettered free speech isn't allowed in the classroom, the workplace, on public television or radio, or in most human relationships, the developers of communication tools for humans think that free speech with no rules (or poorly enforced rules, amounting to the same thing) will prompt quality conversations.
Mastodon, from the outset, made no bones of the fact that the owners of the instances could mute or block or kick off individuals who broke that instance's community guidelines. They could even block or mute whole other servers, allowing the instance owners to set their own rules and sanctions.
This is what interested me, when I first read the post from Eugen Rochko that kicked off Mastodon for most of us. I joined immediately, and loved the growing community I found there. Mastodon replaced Twitter in my life so quickly that at times it almost felt like whiplash. My fingers would reach to open Twitter of their own accord, and when my mind returned from wherever it had been I would realize what I was doing, close Twitter, and open Mastodon.
I'm forever grateful to that muscle memory, however, because without it the Wandering Shop would not be. It was during one of those brief unconscious Twitter checks that I saw my friend Annalee suggesting the idea of a moderated Scifi/Fantasy Mastodon instance. You can still see the thread here. We got to chatting, first over Twitter, then over email, and days later The Wandering Shop was born.
As you can see from that thread, a strong Code of Conduct was part of the Shop's DNA from the beginning. Many of us, myself included, feel like The Wandering Shop is a real, shared coffee shop that happens to mainly exist in our minds, and so it makes sense to me to say that the guidelines and code of conduct are built into the very walls of the Shop.
Annalee and I wanted to build a community that was open to all, but deliberate in what was and was not acceptable behavior. To the best of my knowledge, we have yet to act on poor behavior from the Shop's patrons (those who have registered their accounts at The Wandering Shop), but we have muted or blocked accounts and instances that we thought were making life in the Shop worse. Additionally, we've put effort into deliberate community building. We made a weekly calendar to encourage conversations, and we try to generally be available as part of the community.
This is, for me, the power of Mastodon, when run deliberately: You can build online communities like neighborhoods, full of people you enjoy sharing the sunset with, and still be connected to the rest of the city down the road. I can honestly say The Wandering Shop is in the top three online communities I've ever been a part of. Twitter and Facebook don't even make top ten.
The article that came out today, that fired me up enough to write this piece, was describing why an instance admin was shutting down his Mastodon instance.
I'm paraphrasing here, but the gist was: "I set up my instance as a place with no rules, where anyone could come and be who they wanted to be, and say what they wanted to say. I was stunned and saddened at the abuse and horrific imagery I had to encounter when dealing with this instance, and so I am shutting it down before it gets me in real trouble."
There is a part of me, a small part, that is always reaching out to connect with other human beings. I don't think anyone could try so hard to start online communities and not have a portion of yearning in that direction. I feel for this man, who tried an experiment and had it go so wrong.
But mostly I look at what is possible when thoughtful care and attention is taken towards creating a deliberate community, and a spark goes off behind my eyes. I think of the world we're building at The Wandering Shop, and compare it to this person's dismissal of the whole concept, and that spark quickly becomes the heart of a forge.
This post is a long time coming. A much longer time coming than I intended, actually. It's been almost a year since any post was made to this site, and it's just been sitting here, ticking away. I've changed jobs, the country has changed Presidents, the world and I have changed in so many immeasurable ways.
There is more to say here, and trust me that I'm going to be writing here even more. Just the act of writing is helping to center me right now, and I'm coming to think that maybe I need to do it for my own survival.
Along those lines, with my dear friend Annalee (who helps run The Bias) I have started a small community for writers, specifically Science Fiction and Fantasy writers. Think of it as a Twitter for Writers, with an excellent community and a strong Code of Conduct. It lives at the Wandering Shop, and you're cordially invited to join us. The community there is part of why this post exists, and why I'm trying to resurrect this site. They inspire me to write every day.
The Python Software Foundation (PSF) is the non-profit that owns python.org, helps run PyPI, and makes sure PyCon happens. This is the introduction to a series of posts that will discuss some challenges that face the PSF and community as a whole, as well as some suggested solutions.
The big idea underlying all the little ideas in the following posts is this: The Python community is a unique and incredible community, and it is a community that I want to see grow and improve.
Python is full of welcoming, caring people, and the Python community has shown over and over that it is not content to rest with any past good deeds, but is continually pushing to be more welcoming and more diverse. It was an incredibly powerful symbol to me that I spoke with multiple people at PyCon who don’t currently use Python for their jobs, but come to PyCon to be a part of the community. When I find people who want to get into programming, I point them at Python partially because I think the language is more beginner-friendly than most, but mostly because I know the community is there to support them.
The only qualification I claim for this series is caring deeply about this incredible community. If you want to learn more about my background, check out the about page. The ideas that I’m going to be presenting are a combination of my own thoughts, and conversations I’ve had at various conferences, and in IRC channels, and on mailing lists. I’m not claiming to be the most qualified to speak on these things.
I have no real desire to critique the past. My goal is to start a conversation about the PSF’s future, a future which hopefully sees the PSF taking an even bigger role in supporting the community. To that end, there’s three things that I think we should be talking about, which I’ll discuss over the next three posts.
Strengthening the Python ecosystem
Encouraging new adoption of Python and new Python community members
Supporting the existing Python community
If you are inspired to start these conversations, comments will be open on these posts, although I will be moderating heavily against anything that devolves into attacks. Assume the PyCon Code of Conduct applies. I would be thrilled if these posts started discussion on the official PSF mailing lists, or in local user groups, or among your friends.
In the upcoming post, I’ll talk about challenges that face the Python ecosystem. I’ll talk about support and maintenance of the Python Package Index, why it should matter tremendously to the Python community, and what the community and the PSF could be doing to better support PyPI and package maintainers. Sign up for our mailing list to hear about the next post when it’s published.
I think people have an impression that I make lots of contributions to Open Source (only recently true), and that therefore I am a master of navigating the steps contributing to Open Source requires (not at all true).
Contributing to Open Source can be hard. Yes, even if you’ve done it for a while. Yes, even if you have people willing to help and support you. If someone tries to tell you that contributing is easy, they’re forgetting the experience they’ve gained that now makes it easy for them.
After much trial and error, I have arrived at a workflow that works for me, which I’m documenting here in the hopes that it’s useful for others and in case I ever forget it.
Let’s say you want to contribute to BeeWare’s Batavia project, and you already have a change in mind. First you need to get a copy of the code.
I usually start by forking the repository (or “repo”) to my own account. “Forking” makes a new repo which is a copy of the original repo. Once you fork a repo, you won’t get any more changes from the original repo, unless you ask for them specifically (more on that later).
Now I have my own copy of the batavia repo (note the phildini/batavia instead of pybee/batavia).
To get the code onto my local machine so I can start working with it, I open a terminal, and go to the directory where I want the code to live. As an example, I have a “Repos” directory where I’ve checked out all the repos I care about.
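The clone itself is a single command. A minimal sketch, assuming the fork lives at phildini/batavia (substitute your own fork’s URL):

```shell
# From the directory where the code should live (my "Repos" directory):
cd ~/Repos

# Clone the fork; this creates a "batavia" folder containing the code.
# The URL assumes a fork at phildini/batavia -- use your own fork's URL.
git clone https://github.com/phildini/batavia.git
cd batavia
```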
This will clone the batavia repo into a folder named batavia in my Repos directory. How did I know what the URL to clone was? Unfortunately, GitHub just changed their layout, so it’s a bit more hidden than it used to be.
Now we have the code checked out to our local machine. To start work, I first make a branch to hold my changes, something like:
git checkout -b fix-class-types
I make some changes, then make a commit with my changes.
git commit -av
The -a flag will add all unstaged files to the commit, and the -v flag will show a diff in my editor, which will open to let me create the commit message. It’s a great way to review all your changes before you’ve committed them.
With a commit ready, I will first pull anything that has changed from the original repo into my fork, to make sure there are no merge conflicts.
But wait! When we forked the repo, we made a copy completely separate from the original, and cloned from that. How do we get changes from the official repo?
The answer is through setting up an additional remote server entry.
1. Pull changes from the original repository into my master branch.
2. Update the master branch of my fork of the repo on GitHub.
3. Check out the branch I’m working on.
4. Pull any new changes from master into the branch I’m working on, through rebasing.
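Concretely, those four steps map onto git commands along these lines. This is a sketch of my own workflow, not official documentation; it assumes the original repo is pybee/batavia, my fork is the “origin” remote, and my working branch is named fix-class-types:

```shell
# One-time setup: add the original repo as a second remote, named "upstream".
git remote add upstream https://github.com/pybee/batavia.git

# Pull changes from the original repository into my master branch.
git checkout master
git pull upstream master

# Update the master branch of my fork on GitHub.
git push origin master

# Check out the branch I'm working on.
git checkout fix-class-types

# Pull any new changes from master into my branch, through rebasing.
git rebase master
```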
Now that I’m sure my local branch has the most recent changes from the original, I push the branch to my fork on github:
git push origin fix-class-types
With my branch all ready to go, I navigate to https://github.com/pybee/batavia, and GitHub helpfully prompts me to create a pull request. Which I do, remembering to create a helpful message and follow the contributing guidelines for the repo.
That’s the basic flow, let’s answer some questions.
Why do you make a branch in your fork, rather than make the patch on your master branch?
GitHub pull requests are a little funny. From the moment you make a PR against a repo, any subsequent commits you make to that branch in your fork will get added to the PR. If I did my work on my master, submitted a PR, then started work on something else, any commits I pushed to my fork’s master would end up in the PR. Creating a branch in my fork for every patch I’m working on keeps things clean.
Why did you force push to your master? Isn’t force pushing bad?
Force pushing can be very bad, but mainly because it messes up other collaborators’ histories, and can cause weird side effects, like losing commits. On my fork of a repo, there should be no collaborators but me, so I feel safe force pushing. You’ll often need to force push upstream changes to your fork, because the commit pointers will be out of sync.
What if you need to update your PR?
I follow a similar process, pulling changes from upstream to make sure I didn’t miss anything, and then pushing to the same branch again. GitHub takes care of the rest.
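As a sketch (again with a local stand-in repo in place of GitHub, and the example branch name from above): addressing review feedback is just another commit and another push to the same branch, and GitHub attaches the new commits to the open PR automatically.

```shell
set -e
# Stand-in for the clone of my fork, with the PR branch already pushed;
# "fork.git" plays my fork on GitHub.
tmp=$(mktemp -d); cd "$tmp"
git init -q --bare fork.git
git clone -q "$tmp/fork.git" batavia
cd batavia
git config user.email "[email protected]"; git config user.name "Demo"
git commit -q --allow-empty -m "initial"
git checkout -qb fix-class-types
git push -q origin fix-class-types   # the branch behind the open PR

# Address review feedback, then commit and push to the same branch.
echo "feedback addressed" > review.txt
git add review.txt
git commit -q -m "Address review feedback"
git push -q origin fix-class-types
```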
What about repos where you are a Core Contributor or have the commit bit?
Even when I’m a Core Contributor to a repo, I still keep my fork around and make changes through PRs, for a few reasons. One, it forces me to stay in touch with the contributor workflow, and feel the pain of any breaking changes. Two, another Core Contributor should still be reviewing my PRs, and those are a bit cleaner if they’re coming from my repo (as compared to a branch on the main repo). Three, it reduces my fear of having a finger slip and committing something to the original repo that I didn’t intend.
That’s a good overview of my workflow for Open Source projects. I’m happy to explain anything that seemed unclear in the comments, and I hope this gives you ideas on how to make your own contribution workflow easier!
It’s true that, for many projects, how you become a Core Contributor can seem mysterious. It often seems unclear what a Core Contributor even does, and it doesn’t help that each Open Source project has a slightly different definition of the responsibilities of a Core Contributor.
So this deliberately isn’t a “How to Become a Core Contributor” guide. It would be impossible to write such a guide and be definitive. This is me trying to reverse engineer how I became a Core Contributor on BeeWare, then extracting the things I think are good behaviors for getting to that stage.
How I Became a Core Contributor to BeeWare:
1. Met Russell Keith-Magee at DjangoCon EU 2016, where he spoke about BeeWare and Batavia.
2. Chatted with Russell about BeeWare, and sprinted some on Batavia at DjangoCon EU 2016.
3. Saw Russell and Katie McLaughlin at PyCon 2016, chatted more about BeeWare with both of them, and joined the BeeWare sprint.
4. Recognized that BeeWare had some needs I could fill, namely helping onboard new people and reviewing Pull Requests.
5. Asked Russell for, and received, the ‘commit bit’ on the Batavia project so I could help review and merge PRs.
Tips I Can Give Based on My Experience:
Be excited about the project and the project’s future. I think the whole BeeWare suite has amazing potential for pushing Python to limits it hasn’t really reached before, and I want to see it succeed. A Core Contributor is a caretaker of a project’s future, and should be excited about what the future holds for the project.
Be active in the community. Go to conferences and meetups when you can, join the mailing lists and IRC channels, follow the project and the project maintainers on Twitter. I met Russell and Katie at a conference, kept in touch via IRC and Twitter, then hung out with them again at another conference. Along the way, I was tracking BeeWare and helping where I could.
Be friendly with the existing project maintainers and Core Contributors. It’s less likely I would be a Core Contributor if I weren’t friends with Russell and Katie, but the way we all became friends was by being active in the community around Python, Django, and BeeWare. One way to figure out if you want to be a Core Contributor on a project is to see which projects and project maintainers you gravitate towards at meetups and conferences. If there’s a personality match, you’re more likely to have a good time. If you find yourself getting frustrated with the existing Core Contributors, that’s probably a sign you’ll be more frustrated than happy as a Core Contributor to that project. It’s totally fine to walk away, or find other ways to contribute.
Focus on unblocking others. I still make individual code contributions to BeeWare projects, but I prioritize reviewing and merging pull requests, and helping out others in the community. From what I’ve seen, a Core Contributor’s time mainly goes to triaging issues in the issue tracker, reviewing patches or pull requests, and helping others. It’s only when everyone else is unblocked that I start looking at my own code contributions.
Have fun. I asked to become a Core Contributor to BeeWare because I enjoy the community, enjoy Russell’s philosophy on bringing on newcomers, and think the project itself is really neat. If you’re having fun, it’s obvious, and most Core Contributors want to promote the people who are on fire for a project.
My hope is that I have made becoming a Core Contributor to an Open Source project seem achievable. It is completely achievable, no matter your current skill level. There’s a lot more detail I didn’t cover here, and I can’t promise that if you do all these things you’ll become a Core Contributor, even on the BeeWare project. When you ask to become a Core Contributor to a project, the existing project maintainers are evaluating all kinds of things, like how active you are, how well you might mesh with the existing team, and what existing contributions you’ve made to the project and the community. It might not be a great fit, but it doesn’t mean you’re not a great person.
What I can say is that being a Core Contributor is work, hard work, but incredibly rewarding. Seeing someone make their first contribution, and helping shepherd that contribution to acceptance, is more rewarding for me than making individual contributions. Seeing a project grow, seeing the community grow around a project, makes the work worth it.
If you have questions about my experience, or about contributing to Open Source in general, I'm happy to answer them in the comments, on Twitter @phildini, or by email at [email protected].