Our Community

Sometimes it can be challenging being the only evaluation or impact measurement specialist in your organisation. You might enjoy a unique bird’s eye view across all of your organisation’s activities and impact. But you may also sit (virtually, these days) slightly apart from the teams knee-deep in design and implementation. So when you run into a really sticky MEL issue – where do you go for advice?

People, ping pong and healthier lunches – what we’re looking forward to in 2021

For many of us, the start of 2021 has felt very much the same as the last few months of 2020: there’s still much uncertainty and many things put on hold as we wait for a global vaccination roll-out.

In the interim, we thought we’d put into practice one of the lessons learnt from 2020 – making time for reflection and focusing on the small, good things to get us through. So we asked the Clear Horizon team what learnings from 2020 they’ll be bringing into the new year, and what they’re looking forward to.

“Spending time together face to face in our new office. Our end of year get-together was so fantastic, and while we have come to love many aspects of virtual meetings, and have honed our online facilitation, it was a reminder that it can be surprisingly lovely to meet in person. So now comes the year where we perfect the hybrid model?”
– Jess Dart, Founding Director, Health Futures Lead

“I’m looking forward to graduating and finishing my studies this year. I’m also looking forward to going back into the office and seeing everyone again and getting back in the routine of exercising as well as our epic ping pong battles at lunch.”
– Kim Saldago, BI Developer

“It feels like last year pushed us 10 years into the future. We’ve all been dragged (some kicking and screaming) into online learning. Many of us have experienced how great it can be when done well and how truly awful it is when done poorly (somehow worse than my first-year chemistry lecturer who scrawled illegibly on a blackboard with his back to the room, mumbling to himself). So this year, I’m excited about continuing to explore how to design and deliver truly engaging participatory online learning experiences that surpass face-to-face learning – yes, it’s possible!”
– Cam Elliott, Head of the Clear Horizon Academy

“When we began working remotely, we decided to start each day with a 15 minute team check-in – who’s doing what, who needs help with what. We found it so valuable that we’re keeping this habit, despite returning to a physical office. It’s made us more attuned to what’s going on for each of us, each day, and helps us better share the load in terms of our team’s work.”
– Lee-Anne Molony, Director

“I am looking forward to the new normal of being back in the office a few days a week and working alongside colleagues, but also keeping up the exercise and (the healthier!) eating habits I established whilst working from home.”
– Jenny Riley, Chief Digital & Data Officer

What it takes to build a Liveable Company

If this year has proven anything, it’s that we are capable of change and of rising to new challenges. In that spirit, we’re challenging ourselves to better walk the talk and build a liveable company. But what does that mean?

Let’s bridge the cultural divide: why evaluation and anthropology need each other

Research Analyst Alessandra Prunotto reflects on the Australian Anthropological Society conference, 2-5 December 2019.

This year’s Australian Anthropological Society conference in Canberra was themed “Values in anthropology, values of anthropology”. With this focus on values, you might assume that there would be a panel on evaluation, or at least a paper or two.

But there wasn’t.

How is this possible? I thought. After all, anthropology and evaluation are disciplines that overlap significantly.

Socio-cultural anthropology is the study of human societies in their environments. And evaluation is the practice of ascribing value to an initiative that’s creating change within a socio-environmental system. So at an anthropology conference on values, some debates on the thorny issues in evaluation practice seemed like par for the course.

But this was not the case. Instead, the undercurrent of concern that permeated the conference was the value that anthropology as a discipline has to those outside academia.

I suspect the conference theme of “value” manifested in this way because the majority of attendees were academics or PhD students. I believe that some of these intellectuals are struggling to understand the meaning of their work in a world where the pursuit of knowledge for its own sake is seen as retreating into an ivory tower.

As I listened to the various discussions circulating around “applied” and “public” anthropology, I noted down the themes emerging around the principles and skills that anthropology can offer the world beyond academia.

Looking around at the work we’re doing at Clear Horizon, I believe anthropology has much to offer evaluation.

In what follows, I expand on the ways that anthropological perspectives can help us navigate emerging trends in evaluation practice, as we see change-making initiatives move towards systems-change approaches and as cultural safety becomes essential for evaluation in First Nations contexts.

Anthropology’s superpowers

What are anthropology’s strengths, in comparison with other social sciences? At the conference, I noticed several themes emerging that painted a picture of applied anthropology done well.

Understanding of the other. Anthropology is the comparative study of human societies, and anthropologists aim to understand other people’s perspectives, no matter how radically different. This requires you to listen deeply and attentively, with humility. You aim to learn and empathise before reflecting critically.

Challenging assumptions. With an awareness that your way of being and thinking is one of many, anthropology compels you to question your own assumptions, as well as those of others. Anthropology is a critical discipline that aims to surface the most fundamental aspects of different worldviews and value systems.

Bringing together different worlds. As the goal of anthropology is to bridge cultural divides, it gives you the foundation to bring together different stakeholders, creating a space for different voices and perspectives to come together. It helps you to find zones where mutual understanding can flourish.

Reflexivity and awareness of power dynamics. Sharpening your observation skills and handing you a toolkit of theorists from Fanon to Foucault, anthropology enables you to identify power dynamics and maintain an awareness of your own position within them.

Contextualisation and comfort with complexity. Anthropology aims to understand human behaviour and beliefs within a broader social, historical, environmental and political context. It asks you to embrace messiness and complexity, and avoid reductive explanations.

Anthropology in systems-change evaluation

Across the change-making sector, we’re seeing a move away from programs and towards systems-change initiatives, such as place-based and collective impact approaches.

A number of elements that characterise these approaches heighten the complexity required of evaluation practice. These elements include the increased number and diversity of stakeholders involved, long timeframes, the range of changes initiated and the difficulties in identifying how initiatives contribute to certain outcomes.

As we can see from the list above, anthropological training leaves you well-placed to tackle these kinds of evaluations.

It allows you to bring together diverse stakeholders and facilitate the negotiations between different value systems and worldviews. It helps you to understand and manage the power relations that might exist between these different groups. And sitting with the complexity of the work is a sign that you’re dealing with the reality of change, not boxing it down into something that is both manageable and fictional.

Anthropology in First Nations evaluation

Attitudes in relation to First Nations evaluation are changing. At Clear Horizon, we’re hearing more compelling demands for decolonising evaluation and power-sharing. A merely participatory and inclusive approach to evaluating with First Nations peoples is no longer acceptable – instead, evaluation needs to be conducted by First Nations peoples “as” First Nations peoples, a principle that is underpinned by the rights of Indigenous peoples in the UNDRIP.

Anthropological perspectives have much to offer to help non-Indigenous evaluators collaborating with First Nations peoples navigate these issues sensitively and appropriately.

Most importantly, anthropology can provide a means to understand that hegemonic practices in evaluation and change-making are not the only ways of working, nor are they necessarily the right ones. Furthermore, anthropological perspectives can foreground how histories of colonisation can continue to manifest in power relations and structural disadvantage today. These aspects of anthropology can lend the humility needed to truly shift evaluative practice back to First Nations peoples and communities who have the expertise to evaluate in their own context.

Bridging the divide

It’s clear that anthropology can bring important skills and perspectives to the emerging trends in evaluation practice.

But it seems that evaluation is not on the radar as a potential career for anthropology students. As I’ve highlighted, it was not even mentioned at the most recent annual conference, despite the focus on “value” and a more public anthropology. And from my own experience, having graduated from an anthropology degree just last year, I’d never heard evaluation mentioned as a potential career path.

What can we do to bring evaluation and anthropology together?

One option might be a partnership, or at least more dialogue, between the Australian Evaluation Society and anthropology departments. Local chapters of the AES might have the potential to run careers information talks, mentorships, or even graduate placement programs.

Perhaps advertisements for evaluation positions could specifically mention that anthropology graduates are welcome in the list of disciplinary backgrounds accepted.

But the skills and ideas of anthropology should not just be limited to those who studied it at university. There could also be an opportunity for anthropologists to run training for evaluators who come from disciplines more removed from anthropology, to upskill them for these emerging areas of work.

Evaluation and anthropology have the potential to achieve more impact together. Evaluation can benefit from the nuanced, critical perspectives anthropology can bring. And anthropologists are seeking ways to make themselves useful outside universities.

Let’s reach out.

Reflections on transferring to the international evaluation space

Kaisha Crupi is a consultant at Clear Horizon and has recently made the move from the domestic Sustainable Futures team into Clear Horizon International. Below is a reflection piece from Kaisha, discussing her learnings and observations in her new role.

Before joining Clear Horizon 18 months ago, I had only a small taste of the international working world. However, since I made the full transition into ‘Clear Horizon International’, affectionately known as CHI (pronounced chai), I feel as if I have jumped into the deep end with each team member supporting me to learn how to swim, and to swim fast. I also feel that the rest of the organisation is on the sideline cheering me on. Below are my reflections and learnings from the last few months of being part of the CHI team.

Working in the international space is tricky

When I was in the domestic team I was exposed to many different industries, types of work and ways of working, and met so many people who are passionate about their work through workshops, interviews and product development. Now, starting to work in the international space, I am learning about things outside my home country, and in countries that I am working in (and not necessarily living in). Trying to understand different cultural customs, work around language barriers and time zones, and understand different political systems and social contexts is proving to be quite tricky. I am learning a lot more, asking a lot of questions and reading more widely than my easily accessible news sources. I am also being kind to myself – I know that I am not expected to know everything, and am not putting pressure on myself to do so, especially in a short amount of time!

The work is the same, but different

When I first joined the CHI team, I thought it would be a great chance to learn a whole new skill set and challenge myself even further by learning something different and working in a different way. To my surprise, my first task when I joined the team was to conduct a document review against the monitoring and key evaluation questions for an upcoming workshop – something I had finished doing for a domestic project a week earlier to feed into a report! The questions were similar and the way to go about it was the same. The only difference (which was quite a big difference, mind) was that the language and jargon were different, and instead of talking about an area or region, the project was focusing on a whole country! My biggest challenge in joining the team so far has been getting used to all the acronyms in reports and discussions with my peers and our clients. I am slowly getting there; though someone should quiz me in the early stages of next year.

Understanding complex challenges

You learn about a country in a very different way when you travel there for work rather than for a holiday. There is a saying that you should not discuss politics when you are in polite company – this is very different in the international working space, particularly when working in design, monitoring and evaluation. You learn about a country’s context on a more granular level, ask the difficult political questions and try to understand the country as much as you can, as fast as you can, especially whilst in-country. I have been exposed to the complex ways of working, what the client deals with and the small spot fires they must put out on a day-to-day basis (which are quite different from those in the domestic space). These issues also do not have quick-fix solutions. There is at times a feeling of helplessness – now that you know this information, what are you going to do with it? I believe that doing design, monitoring and evaluation work helps with this, as knowledge is power and can be a communication tool to change someone’s life for the better.

I feel very fortunate to have landed where I am today. Not many people can say that they have ended up with their dream job, especially in such a great values-driven organisation in a very niche area. I have a great team of people supporting me and putting up with my countless questions and reflections. I also look back fondly at my time in the domestic team, where I was able to build my strong foundational knowledge and be supported in every direction. I am looking forward to continuing to swim out toward the horizon and reflecting on the successes and challenges that are happening in such a complex world.

2019 Changefest: Evaluation Tools session

Evaluation Tools session, with Monique Perusco (Jesuit Social Services and Working Together in Willmot), Skye Trudgett and Ellise Barkley (Clear Horizon)

Our session started with a contextual summary of the work and characteristics of ‘Together in Willmot’, a collaborative social change effort in Mt Druitt involving The Hive, Jesuit Social Services, service providers, schools and many other partners. Clear Horizon is working with Together in Willmot as an evaluation partner. Our shared approach to learning and evaluation responds to the challenges of evaluating systems change and place-based approaches, and is tailored to the phase, pace and strengths of the collaboration. We introduced the process for evaluation we are undertaking, which has involved training in Most Significant Change Technique and local data collection which will feed into building a theory of change and then an evaluation plan. We are planning next year to do co-evaluation focused on the efforts and outcomes to date.

During the session we looked at examples of some Most Significant Change stories so far collected as part of this work.

Most Significant Change (MSC) technique was developed by Jess Dart and Rick Davies. Together Jess (Clear Horizon’s founder and CEO) and Rick authored the User Guide in 2005, and MSC is now applied in innumerable contexts worldwide. MSC is a story-based method that can be used for participatory monitoring and evaluation. The process follows a simple interview structure that can generate a one-page change story. It is participatory because many stakeholders are involved both in deciding the sorts of change to be recorded and in analysing the data. Stories are collected from those most directly involved, such as project participants and field staff, usually by asking a simple question such as ‘During the past year, what, in your opinion, has been the most significant change for participants as a result of this program?’ Once the stories are collected, stakeholders sit together to analyse them and each select the story that represents the most significant change for them. The process of selecting the most significant story allows for dialogue among project stakeholders about what is most important. This dialogue is then used as evaluation data to create knowledge about the project and what it is achieving.

We also covered concepts and tools for evaluating systems change and place-based approaches from the Place-based Evaluation Framework and Place-based Evaluation Toolkit, which were commissioned by the Commonwealth and Queensland governments last year and are a leading guide for evaluation in this context. We introduced the generic theory of change for place-based approaches and ‘the concept cube’, which shows the multiple dimensions of evaluation in this context. Clear Horizon worked with TACSI and CSIA to lead the co-design of the framework and has been working with government, community, philanthropy and non-government partners to test, apply and progress these learning, measurement and evaluation approaches.

AES Conference 2019 – for a first-time attendee & emerging Evaluator

By now I’ve had a few weeks to reflect on my first Australian Evaluation Society conference, where I was exposed to an amazing variety of evolving and inspiring ideas from within the Australasian evaluation community.

On the first day I found myself inspired by Dr Tracey Westerman, who questioned the use and adaptation of Western-focused data collection tools in a First Nations context. For me, this highlighted the necessity of remaining reflective and adaptive in our approach, and of tailoring every evaluation’s approach and methods to the context, preferably in partnership with community. This in turn made me reflect on the work of one of my Clear Horizon colleagues, Skye Trudgett, a proud Aboriginal woman who is passionately pursuing Indigenous Data Sovereignty and is leading a program to build First Nations young women’s evaluation capacity in remote Australia.

Following Dr Tracey Westerman’s plenary, I attended a presentation on applying Systems Evaluation Theory by Lewis Atkinson, Brian Keogh and Ralph Renger, which helped frame my thinking about complexity. I found myself identifying with the concept of cascading success or failure as a neat way to consider the upstream challenges that produce a downstream issue. I could see the concept’s similarity to framing challenges through approaches such as the Problem Tree, and found it a resonant concept in which to couch my thinking on causal pathways.

My third and final reflection was on the space for emerging evaluators. The conference provided a valuable sounding board for ideas and challenges facing those new to the profession and was highlighted on the final day by Eunice Sotelo and Francesca Demetriou, who are conducting research on these experiences. I found myself identifying closely with the key findings, and introspectively, the session highlighted a need to establish a community of support and practice for emerging evaluators. Personally, I will be seeking to develop an informal network, but I believe that an AES-associated group would be invaluable in attracting, retaining and developing those new to the space. Young people are not only the future (inevitably), but have the potential to bring new ideas, perspectives and approaches to evaluation. I feel that this potential influx of creativity and vigour should be encouraged and enhanced through more formal arrangements.

As Kailash Satyarthi is credited as saying, ‘No segment in the society can match the power, idealism, enthusiasm and courage of the young people’, or in this context, young (not necessarily age-specific) evaluators.

AES International Evaluation conference Day 2!

It’s been an amazing AES conference so far – lots of interesting topics and great conversations. Like Jess, the highlight for me so far has been the keynote speaker from day one, Dr Tracey Westerman – an Aboriginal woman from the Pilbara in WA who has been a trailblazer in Aboriginal mental health. The key takeaway message for me was that measurement matters – but even more importantly, the right measures matter. She described how, in many cases of Aboriginal youth suicide, there had been no prior mental health assessment. And when assessment tools are used, they are Western-based and not culturally appropriate, which can lead to misdiagnosis. For over 20 years, Tracey has argued that it is not appropriate to ‘modify’ existing measures because of their inherent racism; the only way is to develop new tools from the ground up. Tracey has developed seven tools specifically for Aboriginal youth mental health with very little funding – no easy feat. It was a truly inspiring presentation from an amazingly passionate and optimistic woman who really cares about her people.

A highlight from day 2 was a panel of designers and evaluators from Australia and New Zealand: Jess Dart, Kate McKegg, Adrian Field, Jenny Riley and Jacqueline (Jax) Wechsler, who explored how we might move out of the traditional box of program evaluation to make a bigger difference. They discussed the role of evaluators in supporting people to move beyond measuring, to think through whether we are doing the right things and whether we are really making a difference across complex systems. Questions covered included: where can evaluators add value in a co-design process? Does evaluation get in the way and slow things down? Do evaluators need new skills to help analyse and make sense of big data? Jenny reminded us that evaluators are learners and we are curious, and that we need to get on board with the digital revolution.

One really interesting concurrent session I attended was on the use of rubrics, by Julian King, Kate McKegg, Judy Oakden and Adrian Field. They presented the basics of rubrics and then described how rubrics can be a tool for democratising evaluative reasoning, engaging stakeholders and communicating results. They presented two very different examples – one from a developmental evaluation, and the other using rubrics to evaluate the value for money of an agricultural funding program. I found the second example particularly interesting, having experienced the challenges of answering the value-for-money question. Using a rubric in this way is great for balancing the multiple dimensions of value from different perspectives.

Another memorable moment was at an ignite session (which is a really short presentation). Damien Sweeny and Dave Green from Clear Horizon did a great job at presenting a rather convincing argument for placing more emphasis on monitoring over evaluation – Big M vs small e as they call it. And they cheekily suggested changing the name of the AES to AMeS. An interesting thought.

The day finished with a second keynote speaker, Gary VanLandingham, from the Askew School of Public Administration and Policy. He reminded us of the vast amount of evaluative information available through ‘What Works’ warehouses. They are a good place to start when beginning an evaluation, but they come with warnings. The main caution for me is that they privilege certain types of data over others: they don’t include what doesn’t work, or things not measured using experimental approaches (such as randomised control trials and quasi-experimental methods).

The day was topped off by a short ferry ride to Luna Park, where we had a lovely conference dinner overlooking the Opera House. Sydney is a very beautiful city and a great setting for a wonderful conference.

Now for day three….

Have you visited our booth at the conference?

AES International Evaluation Conference Day 1!

Dr Tracey Westerman, a proud Aboriginal woman, had me totally gripped throughout her keynote presentation on Day 1 of the AES International Evaluation Conference. She began with the statistic that Australia has the highest rate of child suicide in the world. But she urged us to be optimistic and to focus also on the positive outcomes that are occurring, such as the six young Aboriginal people graduating from medicine in WA this year alone. She stressed that education is the most powerful solution in the world and described how in one generation her family ‘closed the gap’ – she gained a doctorate despite living very remotely, with a family background of very limited formal education.

She walked us through the importance of developing assessment tools that are culturally sensitive and to avoid confusing causes with risk factors. It seems many tools in existence today are culture-blind and can lead to stereotyping and actual discrimination. She has developed a whole range of specific assessment tools that are culturally sensitive, including an assessment tool for work-based culture. She made the case that there hasn’t been the right sort of research with Aboriginal people, that the causes are different, and need to be assessed in culturally sensitive ways.

She’s working on ‘building an army’ of Indigenous psychologists across the country to address child suicide and influence child protection. She ended with the note that there is nothing that can’t be achieved by Aboriginal people if they believe in themselves.

After this I had the privilege of moderating a panel about the client-consultant relationship, a topic dear to my heart and my business! The panel was from the Victorian Department of Education and Training (DEET), as well as consultants from Melbourne Uni and Deloitte Access Economics. DEET have set up a ‘state of the art’ supplier panel with over 30 suppliers on it, and are working to deepen the partnership between evaluation suppliers and commissioners, as well as embedding a culture of evaluation. They were generous in sharing their challenges, including lots of tricky moments around data sharing and IP.

Just before lunch I had the pleasure of indulging in a session on evaluation theory led by Brad Astbury and Andrew Hawkins from ARTD. They explored the seven great thinkers of evaluation, who laid the foundations of our seven-decade-long journey of building the discipline’s theoretical base. So lovely to wallow in theory – I remember savouring that learning when studying for my PhD. Their conversation was framed around Foundations of Program Evaluation by Shadish, Cook and Leviton (1991) – it was my evaluation textbook back then, and good to see it’s still valued!

It was a busy day for me, and I also convened a fun panel on digital disruption. We had a great spread of panellists, with a designer from Paper Giant, Ruben Stanton; a data scientist, Kristi Mansfield; a social procurement expert, Chris Newman; as well as our very own Chief Innovation Officer, Jenny Riley. We explored the trends, the opportunities and the scary stuff that might come with the fourth industrial revolution – yes, the robots are here! I saw a few people jump in their seats when Jen described how Survey Monkey stores data overseas and is not bound by the same data security regulations. We also looked into the amazing opportunities for evaluators to be the sensemakers of Big Data. When the firehose of data hits you in the face, maybe the evaluators will be there to bring you calmly back to the most important questions. We also explored the opportunities for evaluators to get involved in evaluating new technology and innovation, and to help consider how ready the system is to receive these innovations. I didn’t want it to end!

The day was topped off with a keynote from David Fetterman on empowerment evaluation. Empowerment evaluation is now 26 years old! David explained how empowerment evaluation is a self-evaluation approach designed to help people help themselves. It felt familiar!

Progress 2019

Progress 2019 was held over two days in June to address the pressing social and environmental issues the world is currently facing. The conference is a biennial event at which progressive thinkers and change makers in the social and environmental space come together to discuss current issues. Over 1500 attendees from Australia and around the world attended the event at Melbourne Town Hall. Headline speakers included Anat Shenker-Osorio, Ellie Mae O’Hagan, Bruce Pascoe, Kumi Naidoo, Owen Jones and Behrouz Boochani. The two-day program was emceed by Yassmin Abdel-Magied.

Kaisha Crupi and Shani Rajendra represented Clear Horizon at Progress 2019 in a bid to understand how Clear Horizon can better engage in this space. Their key learnings are as follows:

  1. We work in complex and messy systems that are tricky to navigate

We found that intersectionality was a common thread throughout this year’s conference. The key speakers noted that Australia cannot achieve true social change unless social change includes all Australians, especially the interests of First Nations people. Panels such as “First Nations Justice” and “First Nations Land Justice” highlighted the significant gaps in current social change movements. Similarly, the “A Very Human Climate Crisis” panel reinforced that Australia cannot achieve true social change unless it is coupled with environmental change. Progress 2019 drove home the message that our everyday challenges in the social and environmental change space are complex and multi-faceted in nature. It was an important reminder that we all need to work collaboratively across different initiatives to tackle such complex issues.

The discussion on intersectionality tied in well with other discussions at the conference about embedding into progressive practice a greater understanding of the complex systems in which we operate. In his keynote address, Kumi Naidoo, the Secretary-General of Amnesty International, referenced the role of systems change. He named three areas where the world needs to push the boundaries: system re-design, system innovation and system transformation. With these in place, we will be in a better position to achieve social and environmental change. Kumi’s sentiments were also echoed by Lyn Morgain, Chief Executive at CoHealth, who stated that:

“We need to acknowledge that transformation is iterative, dynamic and murky.” 

For us at Clear Horizon, the discussion of intersectionality and systems thinking was really exciting, as this is how we like to work. Clear Horizon prefers to work with organisations that are already working across multi-faceted issues, particularly where social and environmental justice intersect. This also includes systems-thinking projects and organisations.

  2. Working in complexity is best done through strong partnerships

Each panel discussion at Progress reinforced the importance of partnerships in achieving social and environmental change. At the “Philanthropists & Changemakers: Effective Partnerships to Win Systemic Change” conversation, the panellists discussed how, although developing relationships and partnerships is challenging, true partnership is rewarding once fully developed. The key element, they noted, is to come to an agreed understanding in order to build trust. Building trust takes time, and understanding what everyone wants to achieve from the partnership (both as individuals and together) takes careful research and negotiation. We learnt that all organisations can benefit from developing strong partnerships with like-minded organisations and working together towards a common goal.

This discussion reinforced how lucky we are to work in a space with existing partnerships that have formed over the years and are now incredibly strong. Clear Horizon has long-standing relationships with several organisations. We have found that, when working in increasingly complex environments, a wide range of strong partnerships helps us have more social and environmental impact than if we were going it alone.

  3. We need to make space for marginalised voices

We were reminded throughout the conference of the importance of creating space for people from underrepresented groups in the social and environmental change space. Speakers at Progress highlighted the often-used and sometimes tokenistic “let them speak” approach to inclusion that is common in current practice. Instead, change makers should look towards organisations that provide a supportive platform and tools that people from underrepresented groups can choose to use when speaking for themselves. Phillip Winzer from Seed said the question organisations should be asking themselves is:

“What can we do to support Aboriginal and Torres Strait Islander people to implement the solutions they want?”

This question made us reflect on how organisations currently work, both at Clear Horizon and at the organisations we work with. We also thought about Clear Horizon’s current strategies for recruiting people from underrepresented communities, as well as how we work with these communities – including the people who advise us on how to better create spaces where people feel welcome and accepted.

Progress taught us that no matter how progressive or values-driven an organisation is, to achieve social and environmental change it needs to be proactive in involving underrepresented groups. As Kumi Naidoo said in his keynote address:

“If progressive organisations do not internalise fully the challenges we face, then we are part of the problem.”

  4. We need to think about how we communicate

Progress also showed us that the social and environmental sectors need to think about how they connect with wider audiences. This means using accessible, plain language so everyone can understand: avoiding jargon, providing image descriptions for people with a visual impairment, and communicating with drawings and cartoons rather than text alone. Anat Shenker-Osorio, Founder and Principal of ASO Communications, gave a presentation on communication in campaigning which highlighted the need for strengths-based communication that avoids “othering”. This was an important learning for organisations trying to bring alternative voices into the spaces in which they work.

From a Clear Horizon perspective, we saw that we can further develop our own practices: communicating in our workshops with people with a disability (through digital storytelling or verbal visual descriptions), avoiding our evaluation and design (and sometimes sector-specific) jargon, and having clear messaging and metrics of success.


Progress 2019 was a thought-provoking two days. A major learning for us was how Clear Horizon could include more diverse voices in our spaces, through more meaningful partnerships and jargon-free, accessible communication. As a values-based, mission-driven organisation, we felt that continuing to engage with progressive movements such as Progress is a must for Clear Horizon. We also realised that our current work in systems thinking, monitoring and evaluation, and design provides us with a useful toolkit to contribute to the social and environmental justice space. We hope to work with more organisations striving for this in the future.