The After: A take on complexity and emergence in evaluation
Updated: Oct 25
Back in March 2023, I wrote a blog post before beginning intensive training through the Emergent Learning project. That post was about how I was connecting complexity theory with the practice of Emergent Learning (EL) at the outset of the cohort.
As promised, here are my reflections after completing the EL training. Many of these ideas continue to evolve and have been shaped by my colleagues in the EL cohort. While I still think the initial ideas from that first post apply, my perspective on the link between complex adaptive systems and Emergent Learning has shifted; or maybe a better metaphor is that I have continued to fine-tune my binoculars for better sight.
A. Centering principles instead of practices. In the earlier post I tried to show how core concepts in complexity show up in evaluation and learning practices, focusing primarily on methods. While tools remain helpful in this work, below I frame core concepts in complexity in alignment with core EL principles.
B. Taking time to unlearn and re-pattern so our Emergent Learning aligns with complexity theory. There are a lot of common mindsets that I believe limit our ability to integrate complex thinking into our work. Below I explore a few “myths” and where we might benefit from re-patterning ways of thinking and doing (more here from Griffith Centre for Systems Innovation).
C. Embodiment as a key component of practicing complexity through EL. My own somatics and mindfulness work has me considering how we're missing "data" by focusing on what we're thinking rather than what we're feeling or experiencing in our bodies. I've experienced how white supremacy culture has taught us that our brains are disconnected from our bodies, but, like the disjointed robot dance I described in the last post, that framing doesn't tell us the whole story. Neither does being stuck in our minds alone. I won't elaborate further here because my journey with embodiment still feels too in progress to share.
More on the first point above (point A) - let’s just play with two principles here as a start:
Returning learning to the system
Core CAS concept: Complex adaptive systems are nondeterministic; we cannot predict the future, so we look for patterns retrospectively.
Considering the core concept above, one EL principle that can hold us accountable to observing patterns both in real time and retrospectively is the principle around returning learning to the system. My interpretation is that if we are intentional about returning learning to the system, then we don’t try to fool ourselves into predicting the future or get too fixated on one notion of how the future looks.
I’ve also been thinking about how, as we review data, say in a data interpretation session, our role as evaluators is not to map everything out but to invest in a well-facilitated conversation that adds meaning to the analysis and interpretation and actively returns learning to the system in real time.
Inviting diverse voices to the table
Core CAS concept: Through the interaction of agents, emergent patterns are generated.
Agents don’t always mean people, but in this case we’re referring to people. In evaluation work that treats participatory and reciprocal approaches as avenues toward equitable evaluation, inviting diverse voices to the table is essential. We say “essential” because we seek to go beyond diversity, equity, and inclusion as a reaction to problematic moments, or as merely good for business. Our belief is that without requisite diversity we don’t even have an accurate view of reality, and our evaluations aren’t rigorous.
As for the second point above (point B), related to myths:
Most importantly, I am finding it is helpful to identify what exactly we need to unlearn or re-pattern in order to be thinking more like a complex adaptive systems scientist. Some initial “myths” I have been unpacking for myself:
Preferring analysis to synthesis. We know from CAS that the whole is greater than the sum of its parts, yet so often in evaluation we break things into separate parts to be able to understand them. What would a more holistic way of understanding and synthesizing data allow us to see that we miss when we look only at components?
The assumption that we’re moving closer and closer to an equilibrium. Systems seek balance but aren’t necessarily moving toward that balance in clear, linear ways. Spontaneous and minor changes can have significant impacts. Good things can come from chaos. Progress towards peace and justice will not always be longstanding.
Drawing rigid conclusions from statistical significance. You’ve all heard me say this before, and even the professional statistics community is in active conversation about it. We have a real opportunity to embrace uncertainty over certainty through Emergent Learning practices. These practices may help us better operationalize equity in our work by legitimizing approaches like participatory evaluation and interactive data interpretation as equally important as quantitative findings.
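As an illustrative sketch of why rigid significance thresholds can mislead (the effect sizes and standard errors here are hypothetical, not from any real evaluation), consider two near-identical results whose p-values happen to fall on opposite sides of 0.05:

```python
from statistics import NormalDist

def p_value(effect, se):
    """Two-sided p-value for an effect estimate under a normal approximation."""
    z = effect / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Two hypothetical evaluations with the same estimated effect;
# a tiny difference in standard error nudges one p-value across 0.05.
p_a = p_value(effect=0.30, se=0.152)  # just under 0.05
p_b = p_value(effect=0.30, se=0.155)  # just over 0.05

print(p_a, p_b)
# A rigid threshold would label one finding "significant" and the other not,
# even though the underlying evidence is essentially identical.
```

The point is not that p-values are useless, but that a hard cutoff turns a smooth continuum of evidence into a binary verdict, which is exactly the kind of false certainty complexity thinking warns against.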
To me, Emergent Learning and complexity are separate but interrelated ideas. Sometimes adding new frameworks to the mix can dilute or distract us. In this case, I find that layering complex adaptive systems theory onto Emergent Learning has been a helpful way to challenge the tendency to want the world to be simpler and more rational than it is.
I don’t consider this thinking to be a new model, but I am hopeful that reflecting on the interactions between these adjacent schools of thought can serve us in our learning and evaluation work. I am eager to keep pushing our learning at Emergence Collective on these topics, and lucky for me, the EL community just started a sub-group on complex adaptive systems. Please reach out with your additional thoughts and ideas!