Innovation Advisory · Framework 03
The Evidence Canvas
Eight questions about what counts as proof and for whom. For practitioners commissioning studies, reviewing reports, or deciding whether the evidence in front of them is worth acting on.
The Audience
Who needs to be convinced — and what would they do differently if convinced?
A district collector needs different proof than a journal reviewer. Name the person, not the institution.
The Decision
What decision does this evidence feed? If no evidence would change the decision, why are you collecting it?
If the answer is "for the record," the study is decoration. Decoration is expensive.
The Counterfactual
What would have happened without the programme? Can you answer that honestly?
This is the hardest question in evaluation. If you cannot answer it, say so in the report rather than pretending.
The Method
Does the method match the question — or the funding call?
A randomised controlled trial (RCT) answers one kind of question well and other kinds badly. Pick the method that fits the question, not the method that fits the budget line.
The Distance
Who collected the data, and how far are they from the person who will analyse it?
The distance between the enumerator and the analyst is where bias lives. Shorten it or name it.
The Exclusion
What did you leave out? Which voices, which geographies, which seasons?
Every study has exclusions. Write yours on the first page. The ones you hide will be found by someone less sympathetic.
The Monday Test
Can a practitioner use this finding on Monday morning?
Evidence that cannot be acted on was written for the wrong audience. If it only works as a conference slide, rewrite the conclusion.
The Hostile Read
Give the report to the person most likely to disagree. Does it survive?
If it does, it is ready. If it does not, the problem is not the reader.