Impact
Why measure impact? Not every local campaign measures the impact of its work, but it's a good idea to capture what your campaign achieves. Knowing how successful your campaign is can help you plan future campaigns, motivate participants, recruit partners and obtain funding.
Take Back the Tech! has a local campaign reporting form we use to gather impact data from campaigners so we can put all the numbers and stories together and analyse the results of the campaign as a whole. You can fill it out and send it to ideas@takebackthetech.net. You can use it to guide your campaign's evaluation, but you can also develop your own outcomes to measure.
Developing outcomes
What do you want your campaign to achieve? Obviously, we're all working toward significant cultural change, which is tough to measure, especially since widespread cultural change happens over time – far longer than the duration of one campaign. But each campaign is chipping away at entrenched power structures, so there are outcomes to measure in the meantime. You can develop short-term outcomes you want to achieve with a single campaign and long-term outcomes you want to achieve through several campaigns over a period of time.
To figure out what to measure, you first need to determine what you want to achieve with your target audience. Are you influencing lawmakers to pass or improve legislation? Are you pushing internet intermediaries to develop new policies? Are you raising awareness of online GBV among young people? Are you improving digital security skills of women journalists? Maybe you're doing all of it! You can measure each one.
Outcomes express a change in knowledge, attitude, behaviour or condition. Below are some examples of change at different levels for campaign actions that aim to end GBV.
| | Individuals | Institutions | Society |
|---|---|---|---|
| Knowledge | Know more about online GBV, safety measures, digital security, reporting, tech tools, digital content creation, activism, advocacy | Know more about the needs of women and LGBTQI staff and stakeholders | Knows more about online GBV as a rights violation, the experiences of survivors, how to help, how to protect privacy |
| Attitude | Recognise online GBV as causing real harm, apply a feminist lens, want to take action | Are open to hearing from marginalised groups, internal culture is more gender sensitive | Does not tolerate online GBV, supports women's rights, wants internet intermediaries to take action |
| Behaviour | Adopt safety measures, take action against GBV, support other women facing online GBV, start a blog, share their stories | Create and follow formal policies and procedures for dealing with online GBV internally, work with marginalised groups, train staff on relevant issues and skills, create an atmosphere in which people feel safe reporting | Develops policies and laws or improves existing ones to address online GBV, takes action when witnessing online GBV |
| Condition | Feel safer, are able to engage in self-expression online, experience more meaningful internet access | Become more diverse, sustainable, responsive, effective | Experiences increased civic, political and economic participation since women and LGBTQI people can more freely exercise rights |
Outcomes should be SMART: specific, measurable, achievable, realistic, and time-bound. You can write up your outcomes like these samples below:
| Campaign/action | TBTT game |
|---|---|
| Target | Women human rights defenders |
| Change (or objective) | Increased understanding of how online GBV affects women human rights defenders |
| Expected outcome(s) that reflect or contribute to change | |

| Campaign/action | Online GBV tweet chat led by participants in the youth programme |
|---|---|
| Target | Young people in Manila using Twitter |
| Change (or objective) | Increased understanding of online GBV |
| Expected outcome(s) that reflect or contribute to change | |

| Campaign/action | Letter/email writing |
|---|---|
| Target | Local telecom |
| Change (or objective) | Develop a reporting mechanism for people experiencing online GBV from users whose internet access is provided by the telecom |
| Expected outcome(s) that reflect or contribute to change | |
Another strategy is to take quantitative data and find a way to dig deeper. Let's say you're tracking media mentions of your campaign. That's great (and something we ask for in our local campaign reporting form), but you could also do broader media monitoring: watch for an increase in media coverage of the issue you are addressing, changes in how journalists frame the issue, and an increase in audience interest in that coverage (more comments, tweets, etc.).
If you are supporting survivors, remember that each survivor defines success differently, and that's often tough to measure in a uniform way. Also, you may not be able to stop the abuse, but you can help people feel safer and stronger despite it. In these cases, measuring changes in knowledge and attitude might be more effective than measuring changes in behaviour. Maybe you helped survivors feel more confident in some aspect of their lives, and that's fantastic even if the change was not as big as you were aiming for.
You may not meet all of your desired outcomes, and that's all right. You decide what success means for your campaign, and evaluation is a continuous process. We tweak outcomes and measurement tools all the time, and sometimes we find that what we set out to measure wasn't quite right. With some practice, you will start to learn what's realistic for your work and your community. Furthermore, not achieving all of your outcomes does not mean you failed. Learning is a success in itself, and often there are unexpected outcomes.
Measuring outcomes
Once you have figured out what changes you want to measure, you need to determine the baseline (if possible). The baseline is the status before your action to encourage change. If you want to improve knowledge, what is the level of knowledge among your target population right now? You have to find a way to show change, so starting with the baseline gives you a before and after comparison. If you want to do a digital security training, for example, query participants about their level of knowledge or skills prior to starting the training and then do the same thing after. With the telecom example above, our baseline would be that the telecom has done nothing to address online GBV, including expressing any recognition of the problem or developing ways for people to report incidents.
You can establish a baseline by surveying or conducting focus groups with your target audience and by searching government and civil society resources for statistics. If you are holding workshops or other activities where you expect participants to experience a change during the short period of the activity, you can do a pre-test to get the baseline and then repeat the same test as a post-test at the end of the activity to capture change. Let's say you are holding a digital storytelling workshop. Your pre- and post-test could ask questions about level of knowledge and confidence related to digital storytelling tools and methods.
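If you keep pre- and post-test answers in a simple spreadsheet, even a few lines of code can show the change against your baseline. Here is a minimal sketch in Python; the participant IDs and 0-10 knowledge scores are invented purely for illustration, so substitute your own data and scale.

```python
# Hypothetical pre- and post-test scores (0-10 scale) for the same
# participants in a digital storytelling workshop. Replace these with
# the answers you actually collected.
pre_scores = {"P1": 3, "P2": 5, "P3": 2, "P4": 6}
post_scores = {"P1": 7, "P2": 8, "P3": 5, "P4": 6}

def average(scores):
    """Average score across all participants."""
    return sum(scores.values()) / len(scores)

baseline = average(pre_scores)   # knowledge level before the workshop
after = average(post_scores)     # knowledge level after the workshop
improved = sum(1 for p in pre_scores if post_scores[p] > pre_scores[p])

print(f"Baseline (pre-test) average: {baseline:.1f}/10")
print(f"Post-test average:           {after:.1f}/10")
print(f"Average change:              {after - baseline:+.1f}")
print(f"Participants who improved:   {improved} of {len(pre_scores)}")
```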
Sometimes you can't get the baseline in advance. As long as you can show a change, that's okay. With the tweet chat example in the section above, you might see the change you need in the content of the tweets themselves. You could also encourage participants to take a Twitter poll that asks them if the chat improved their understanding of online GBV. If you perform a play or do street theatre, you could poll the audience before they leave.
For each outcome you want to measure, you will need to determine how you will get the data you need. Conventional tools include surveys, pre- and post-tests, interviews and focus groups, but they may not be feasible for all of your actions, especially those that take place online. Even conventional tools can be used in creative ways. Surveys are often conducted using a Likert scale (five options ranging between two extremes). One local campaigner uses a range of smiley and frowning faces as answer options in surveys to capture participants' reactions to the activity and feelings about their knowledge and capacity in a way that feels informal and comfortable.
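Whatever answer options you choose, tallying the responses is straightforward. The sketch below maps smiley-face options onto a five-point scale and summarises a set of answers; the labels and responses are invented for the example, so adjust them to whatever your own survey uses.

```python
from collections import Counter

# Illustrative five-point scale and responses (all made up for this example).
SCALE = {"very sad": 1, "sad": 2, "neutral": 3, "happy": 4, "very happy": 5}
responses = ["happy", "very happy", "neutral", "happy", "sad", "very happy"]

counts = Counter(responses)
average = sum(SCALE[r] for r in responses) / len(responses)

print("Responses by option:")
for option in SCALE:
    print(f"  {option:10s} {counts.get(option, 0)}")
print(f"Average rating: {average:.2f} out of 5")
```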
Another method we like is the Most Significant Change technique. It is suitable for everyone regardless of their level of experience with evaluation, and it works well when outcomes are hard to define, when outcomes are too varied to measure easily or when you want participants to be active at multiple levels of the evaluation process. The original guide by Rick Davies and Jess Dart is long but fairly easy to understand, and Davies notes that you do not have to follow all ten steps. You can adapt the technique to suit your needs.
Social media analytics
Social media platforms make some analytics available. For instance, in the upper righthand corner of Twitter, you can find your analytics page, which includes such data as tweet impressions, profile visits, mentions and follower change by month. On Facebook, you can click the insights link at the top of the page to discover analytics that you can define by a date range and export for your records.
For more detailed data, you will need to use a social media analysis tool. Unfortunately, most of these tools charge a fee, but you can make the most of their free trials. There are also some decent free tools, and some companies offer their most basic tools at no cost. Characteristics to look for: whether there is a free trial you can use, how many accounts and hashtags you can measure, what kind of data you have to give them and what timeframes they offer.
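Whether you export data from the platform itself or from a tool, a small script can track changes over time for your records. This sketch assumes a hypothetical CSV file called analytics_export.csv with month, followers and mentions columns; your own export will almost certainly use different file and column names, so adapt accordingly.

```python
import csv

# Read a month-by-month analytics export (file and column names are
# assumptions for this sketch; match them to your actual export).
with open("analytics_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Compare each month with the previous one to show follower change.
for previous, current in zip(rows, rows[1:]):
    change = int(current["followers"]) - int(previous["followers"])
    print(f"{current['month']}: followers {change:+d}, "
          f"mentions {current['mentions']}")
```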
Note that we already track #takebackthetech, #dominemoslastic and any additional global hashtags we create for campaigns, but you may want to track hashtags you have created or translated. For all social media use, you will want to track metrics around likes, follower increase and reach, but make sure you also monitor metrics that reflect more significant engagement, such as shares and conversations. You might also find good qualitative data in social media interactions. We emphasise learning by doing and connecting online and offline, so the success of a campaign cannot be reduced to the number of retweets. Helping one woman stay online is worth more than 100 retweets.
More resources
If you want to get serious about measuring impact, here are some more resources on monitoring and evaluation:
| Attachment | Size |
|---|---|
| TBTT_CampaignEvaluation.odt | 13.37 KB |
| TBTT_CampaignEvaluation.doc | 38.5 KB |