Anyone who really knows me knows that I am wary of communities of practice. My wariness has little to do with the people themselves. In the places and spaces where I’ve been part of a community of practice (education, music, software testing), I have found many passionate and well-intentioned people. However, I’ve also observed things about these communities that have invariably led me to move to the edge of most of them, or to leave them altogether.
First, the closer to the center of a community one comes, the more tightly held the belief structure. A friend of mine has a great saying – “Strong opinions, loosely held.” I like it. It means that I can hold an opinion and stand for it with fervor, but, when presented with adequate evidence for a different point of view, I can let go, loosen my grip, make room for multiple interpretations, or even change my view altogether. This level of flexibility is the stuff of academic rigor and scientific integrity, but studies and hypotheses are not required to practice it. The art of listening with the intention of learning (rather than “listening to respond”) is key. Positive communities learn from each other, and their center is less like a rock and more like molten lava, shifting and changing with new information.
Second, most communities of practice start to sound like broken records after a period of time. While there are large bodies of knowledge that inform many communities of practice, those communities tend to draw upon the same historic body of literature or scholarship, which necessitates a certain kind of repetition. In communities of practice like software testing, where ideas and practices are rooted far less often in rigorous academic study and far more often in the personal experiences of the participants themselves, the ideas seem to circulate more and be examined less, at least from an academic perspective.
What do I mean by an academic perspective? Well, since software testing is largely a qualitative pursuit, at least in its context-driven or exploratory modes, this would entail performing structured qualitative case studies to better understand what actually works, and to be able to answer the questions “How do we know it works? What evidence do we have that this practice is working?” Qualitative research methods in the social sciences require specific tools and structures to be applied before a claim is made and verified, and even then there is still (and likely always will be!) discussion and disagreement. However, those areas of disagreement are substantiated through rigorous qualitative research methodologies applied by skilled researchers.
In software testing, our single biggest source of information and evidence that a practice or idea works is the narratives presented at software conferences. We do not actually know that exploratory testing produces better results than scripted testing, writ large, because we have never studied the practice in any structured way. While inspiring and interesting, narratives and appeals to the “thinking tester” do not make a coherent body of evidence on which to judge any action. Similarly, in other fields like education, teacher narratives and appeals to the “thinking educator” would not constitute a rigorous understanding of what works in classrooms. Many communities of practice suffer from an abundance of opinion and a lack of evidence, but software testing’s near-total absence from academic circles (particularly in North America and Europe, less so in India) makes the absence of evidence all the more conspicuous to a participant with an academic background (in this case, me).
Finally, in communities of practice, even those whose participants come from a wide range of backgrounds (bankers, musicians, writers, scientists), the bulk of the thinking about the community’s problems is centered on the domain knowledge of that community. Certainly, participating in a community of practice means understanding its fundamental values, principles, and problems. What is tough for me is that so many participants see the trees, some see the forest, but few see the landscape(s) beyond it. In the education community, most of the teachers I worked with (out of a faculty of 220) told me that they did not understand why I would pursue a doctoral degree (a master’s is required in NYS, so we all had them). They frequently reiterated that they did not see the value or point in my pursuing further education. Nor did they see the value in my teaching in areas and subjects outside of my own (for a time, I taught English grammar at a private school in addition to my job). Interestingly, they could understand doing it for money, but they could not understand doing it for the sake of learning.
In the software testing community, people initially responded to my learning programming by telling me that I “didn’t need to program” to be a tester. People in the same community have responded to my music and teaching background with surprise – “How did you get into software testing from music teaching? It’s so different!” (For the record, it’s not so different.) When people learn that I am a writer (of technical books for others, and of poetry and blogs for myself), they ask, “Where do you find the time?” In my mind, the question becomes “How can you afford not to do something outside of your realm of experience?” The more seemingly disparate lenses we can look through to understand a problem, the more innovative our potential approaches to solving it. Communities of practice, almost by definition, don’t encourage breadth of knowledge, as they are focused on solidifying beliefs around a central core.
So, where do we go from here? We can do nothing. That’s easy. Or we can learn from other communities of practice and adopt rigorous ways of looking critically at our own experiences, so that we have a broader and better understanding of the good work that we do. It’s up to us.