The chat explored six questions as prompts for the discussion:

Welcome to the 3rd #HeritageChat! We’re tweeting from 13.00-14.00 on the theme of Heritage and evaluation. This month’s #HeritageChat comes from @ERS_Limited tweeting as @HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
We’ve compiled the answers to each question below, but you can also look back at the conversation chronologically on Twitter via #HeritageChat. The discussion first addressed the nature of evaluation and the different types of evaluation data that heritage projects tend to collect.

January's #HeritageChat is on 'Evaluation'. Join us and @ERS_Limited on 18 Jan 13.00-14.00 to explore how the heritage sector approaches evaluation and how we can share evaluation data and create a shared evidence base. Here are the 6 Qs we'll be using https://t.co/Lqsmgp8XSz pic.twitter.com/nUIpw1zZH8
— Heritage Chat (@HeritageChat) January 15, 2018
Perhaps I'll try that again. What type of evaluation data do you collect for your heritage projects? #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
A1 #HeritageChat - value for money, case studies, learning outcomes…@Heritage2020 is interested in evaluating changes in working practice #collaboration
— Historic Environment Forum (@HistEnvForum) January 18, 2018
@ERS_Limited - what does your experience show are common types of #evaluation studies? #HeritageChat
— Historic Environment Forum (@HistEnvForum) January 18, 2018
Evaluation means different things to different people; it's not an audit but a means of telling a story #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Many heritage projects are only undertaking evaluation out of obligation to HLF/other funders
— ERS Ltd (@ERS_Limited) January 18, 2018
There needs to be buy in across your organisation and a realisation that this is so much more than a tick box exercise #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Thanks @ERS_limited - and we need to tell our stories with numbers as well as words? #HeritageChat https://t.co/s88rPl65CA
— Heritage Chat (@HeritageChat) January 18, 2018
Good evaluation is about bringing together robust and meaningful quantitative and qualitative evidence, analysing it and presenting it appropriately #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Asked this Q1 about #evaluation data at the start of today’s #HeritageChat as could be so varied that difficult to get any common framework - what do others think?
— Heritage Chat (@HeritageChat) January 18, 2018
#HeritageChat Historic England have developed a detailed monitoring framework for the HAZ programme which aims to collect annual indicators according to 7 themes.
— Adala Leeson (@AdalaLeeson) January 18, 2018
Thanks @AdalaLeeson - is this framework published? Or are there other guidelines/principles widely used in the #heritagesector #HeritageChat https://t.co/Ume1XnrGz6
— Heritage Chat (@HeritageChat) January 18, 2018
The framework is not published yet but is based on best practice evidence and indicators from the @heritagelottery and @ArchHFund.
— Adala Leeson (@AdalaLeeson) January 18, 2018
@CITiZAN1 ask our volunteers for feedback on our training and social events: are they confident in the new skills they've learned, and - v. important - whether they enjoyed it! Feedback is essential for us to deliver the best possible programme for our vols! #HeritageChat https://t.co/erHunaD0VJ
— CITiZAN (@CITiZAN1) January 18, 2018
Contributors shared guidance and resources:

Not nearly enough! (Data, analysis, sharing) - 30 yrs exp in sector & one of my biggest frustrations, so much we need to know, esp qualitative visitor feedback
— Carolyn Lloyd Brown (@heritageangel) January 18, 2018
#HeritageChat Q2 - What resources, guidance or principles do you draw on to inform your evaluation?
— Heritage Chat (@HeritageChat) January 18, 2018
#HeritageChat The recent publication of the Treasury's Public Value Framework led by Sir Michael Barber is an excellent new addition to the discussion.
— Adala Leeson (@AdalaLeeson) January 18, 2018
The Sir Michael Barber report can be found here https://t.co/5tg84kC7Pj
— Alex Hayes (@AlexHayes27) January 18, 2018
A theory of change is a really important starting point for any funded programme and its evaluation. @NPCthinks has some good guidance here https://t.co/4ruDrG6aVe
— Alex Hayes (@AlexHayes27) January 18, 2018
An action research approach involving stakeholders at a range of levels can bring learning to the foreground and inform change. Combine that with theory-of-change for something interesting #HeritageChat
— Chris Strong (@StrChristo) January 18, 2018
Sounds good - any examples of people doing this to share with #HeritageChat ?
— Historic Environment Forum (@HistEnvForum) January 18, 2018
Hopefully us with our KTD programme 'Ignite Yorkshire'! 🙂
— Chris Strong (@StrChristo) January 18, 2018
See #kickthedust
— Chris Strong (@StrChristo) January 18, 2018
#DustKickers are Heritage Ambassadors for #KickTheDust @heritagelottery projects. Good for involving young people. Follow some of their activities here https://t.co/mFgcf0YUwl #HeritageChat
— Historic Environment Forum (@HistEnvForum) January 18, 2018
We have used Generic Learning Outcomes evaluation via the museum service in the past.
— ArchaeologyUK (@archaeologyuk) January 18, 2018
The discussion then touched on the relationship between the evaluation data required by funders and the types of data required for internal planning.

Thanks @archaeologyuk - these are widely used by #museums. Useful guidance/ toolkit from @ace_national https://t.co/ke5iwdIeL2 https://t.co/i5shEhbmCW
— Heritage Chat (@HeritageChat) January 18, 2018
Is there an issue that the types of data you need to plan activity differ from those included in reports to HLF/other funders? #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Sometimes you need to think of YOUR organisation as well as funders, because the data you will need in the future may differ from the funding body, combining both evaluations can be fruitful to both parties. But plan at the start and open to change as the project progresses.
— ArchaeologyUK (@archaeologyuk) January 18, 2018
It's important to recognise that the requirements of funders differ. e.g. if you're funded by HLF and ERDF don't think the same report will satisfy both. #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
How can evaluation create added value?

Hence guidance such as that available from HE may guide one report, but ERDF guidance may be needed for another #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
#HeritageChat Q3 - How can your evaluation create added value (e.g. through involving young people or integrating capacity building)
— Heritage Chat (@HeritageChat) January 18, 2018
We've found it extremely useful to work with an external evaluator who has worked with us from set-up of the project to the end: to provide regular feedback throughout the project that we can incorporate and improve on; not just receiving an assessment at the end. #HeritageChat
— CITiZAN (@CITiZAN1) January 18, 2018
Providing training for heritage sector staff & volunteers alongside an evaluation, about the purpose, methodology & benefits of evaluation. Don't assume that members of the team know much about evaluation.
— Esther Gill (@Esther_Gill) January 18, 2018
Yes #evaluation provides excellent #learning opportunities for all #HeritageChat https://t.co/s3sWu4EkNp
— Heritage Chat (@HeritageChat) January 18, 2018
Do any of you have experience of using volunteers, local schools, voluntary groups etc. to support evaluation? #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Using volunteers etc. can be resource-intensive but potentially valuable. It can enhance the quality of the evaluation (helping to gain valuable insights from peer groups) and offer those helping useful skills and experience. #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
ERS’s word of caution on the need for honesty in self-evaluation sparked a debate on admitting failures as well as successes:

Adam Richards from @socialvalueuk: Value is in the eye of the stakeholder #HeritageChat
— Alice Purkiss (@AlicePurk) January 18, 2018
Word of caution re self-evaluation: there is a need for honesty. Admitting failures can be difficult for some but it's all part of the learning process. Evaluation should not be predicated on just promoting the positives #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Agree with this. Many @Heritage2020 conversations around #Learning from when things don’t go so well… But is the sector willing to share these examples? #HeritageChat https://t.co/6MNS54XZQF
— Historic Environment Forum (@HistEnvForum) January 18, 2018
Completely agree. Sometimes things work on paper but not in practice -evaluations are important tools for assessing what works and what doesn't. #HeritageChat
— Adala Leeson (@AdalaLeeson) January 18, 2018
Understanding what didn't work is just as important as what did work and must be shared. And this also includes understanding: how/why did it work/not work? In what context did it work/not work? Only then, can we gain meaningful learning and insight for future projects.
— Alex Hayes (@AlexHayes27) January 18, 2018
Agree that understanding & communicating failure is critical to growth, but this can be understandably difficult for an organisation if it 'feels' that future funding relies upon promoting the positives.
— Esther Gill (@Esther_Gill) January 18, 2018
We've had a few examples from members but when we put a call out for case studies for our Toolkit it was one of the least responded to requests. There's definitely a lot to learn from each other's mistakes but it takes guts to admit our own. https://t.co/hmBrFoYLcs
— Heritage Trust Network (@HTNmembers) January 18, 2018
We have seen people who've supported our evaluations gain valuable skills, grow in confidence and go on to get jobs or do other volunteering #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
In our evaluations we encourage honesty. @ERS_Limited has completed more than 700 evaluations and we've yet to find the perfect project #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Something that our Start-up trusts can take comfort in. Even the most successful projects will have made mistakes somewhere down the line but by learning from them, it enables them to improve and gives them important knowledge for their next projects. #HeritageChat https://t.co/trJQOpffSZ
— Heritage Trust Network (@HTNmembers) January 18, 2018
Where time allows and evaluation starts early in project delivery there is an opportunity for it to shape delivery and avoid mistakes being repeated #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
How can we embed a culture and practice of evaluation? Suggestions included demystifying the process, embedding it from the very start of a project and encouraging discussions on the purpose of evaluation.

Wise words from @ERS_Limited on #Evaluation as part of #HeritageChat https://t.co/sZ6MIZExf0
— Historic Environment Forum (@HistEnvForum) January 18, 2018
Q4: How can heritage professionals and stakeholders embed a culture and practice of evaluation in a) their organisation b) the sector? #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
#HeritageChat A4 by demystifying it. Some organisations see it as a huge effort post project, if smaller evaluation practice takes place on a regular basis in their organisation from the start it becomes part of the culture.
— ArchaeologyUK (@archaeologyuk) January 18, 2018
Agree. Evaluation needs to be seen as part of an organisational culture of learning and not simply something to do as part of a funding requirement. #heritageChat
— Esther Gill (@Esther_Gill) January 18, 2018
Absolutely, make it part and parcel of the process right from the very start, not just something to do once the project has finished. #heritagechat
— Laura Jayne Gardner (@lajaga) January 18, 2018
Also explore the different types of evaluation and feedback processes that can be done - some are a hell of a lot more engaging than others! #heritagechat
— Laura Jayne Gardner (@lajaga) January 18, 2018
I think clarifying the purpose and benefits of evaluation is critical for getting everyone on board. It's not just a data collection exercise but a real opportunity to understand whether the objectives have been achieved and what can be learned from the process
— Alex Hayes (@AlexHayes27) January 18, 2018
Also to encourage open discussion about the purpose of data gathering & evaluation. Why are you doing it? Do you need to do it? Who is it for? #HeritageChat
— Esther Gill (@Esther_Gill) January 18, 2018
Like this - going back to our #Guidelines question - is more needed, or is there enough already out there to encourage people to start early and embed throughout so it’s not a big ‘end of project’ task? #HeritageChat https://t.co/E8jZG4nrHp
— Heritage Chat (@HeritageChat) January 18, 2018
#HeritageChat organisations appoint a H&S representative, why not an Evaluation rep?
— ArchaeologyUK (@archaeologyuk) January 18, 2018
There is so much out there e.g. Magenta Book. The key is to develop a monitoring framework linked to your objectives and populate it with good quality monitoring data #HeritageChat
— Adala Leeson (@AdalaLeeson) January 18, 2018
Where there may be resistance, the merits of evaluation need to be clearly explained. This isn't about finger wagging it's more about hand holding (where needed) #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
The conversation also touched on the benefits and logistics of sharing data and frameworks:

Some people can become very defensive, fearing that anything that is perceived negatively will reflect badly on them. Such fears can be overcome with reassurance as to what evaluation is about: learning, improvement etc.
— ERS Ltd (@ERS_Limited) January 18, 2018
Q5: What do you see as the benefits of sharing evaluation data and do you have any examples of evaluation frameworks that can be shared? #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Glad you mentioned #sharing ! How can we be better at this? #HeritageChat https://t.co/pHmk4vK2KS
— Heritage Chat (@HeritageChat) January 18, 2018
Online forum across all sectors (?HLF) & focused conferences / workshops would be great (hint hint) ? #HeritageChat https://t.co/sJkhJmmTT9
— Carolyn Lloyd Brown (@heritageangel) January 18, 2018
Yes, #sharing should also be about building the bigger picture for advocacy purposes as well as satisfying funders. #HeritageChat https://t.co/9hN3mcAnNF
— Kate Pugh (@kateheritage) January 18, 2018
Of course, evaluation frameworks need to be bespoke to each project, but some component parts might be replicable #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
I think this is where I struggle - how realistic is a common framework for #evaluation? Or are there guiding principles that if used, would enable comparison and sharing of data once collected? #HeritageChat #Collaboration https://t.co/RVsIcwh9sG
— Historic Environment Forum (@HistEnvForum) January 18, 2018
Up front, it is essential that all the right questions are in place (so far as can be anticipated) and built into the Framework #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
This suggests a collaborative approach with partners. You don't want to get to the end of the project and have someone ask, why did you never collect evidence on such and such? #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Coming back to #KickTheDust - I think this is what @heritagelottery is trying to do for this scheme #HeritageChat https://t.co/0yMG57zMsv
— Heritage Chat (@HeritageChat) January 18, 2018
More generally, the trend towards bringing in evaluators at the outset is very welcome. It ensures that evaluation is embedded and that data capture systems are in place #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Finally, #HeritageChat considered how to adapt models from other sectors – the question gathered very few responses:

Waiting until the latter stages of a project to bring in evaluators means it cannot benefit from lessons learnt and some opportunities to have demonstrated achievement may have been missed #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Q6: Are there any models of how other sectors share evidence and evaluation data that the heritage sector could adapt? #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Already our final #HeritageChat for this lunch time!
— Heritage Chat (@HeritageChat) January 18, 2018
What happens elsewhere that we could adapt? https://t.co/85uawpjyuw
There is perhaps insufficient recognition of the diversity of the heritage sector. e.g. some projects are focused on visitor numbers/ reaching new audiences but this is not always the case #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
There is much to be learnt from looking at the approaches taken by others, but wholesale adoption of indicators is not advised - you need to choose what is right for you #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
But perhaps possible where evaluating the same type of project/activity? #HeritageChat https://t.co/CKbnDwRwQD
— Heritage Chat (@HeritageChat) January 18, 2018
The same goes for targets. Some projects are far better placed than others to boost visitor numbers or engage volunteers. That doesn't mean more modest targets are unimportant. Or easy. #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
ERS and @HeritageChat wrapped up this productive session with a few final thoughts.

There are often common themes: education, collections, volunteering etc. so you are bound to glean something from the approaches others have taken #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
The past hour has flown by! Thanks so much for your contributions to #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
And thank you @ERS_Limited for leading our January #HeritageChat on #evaluation and sharing all your experience with us!
— Heritage Chat (@HeritageChat) January 18, 2018
A few final thoughts: choose indicators that reflect your ambitions and in respect of which there is a link between what you are doing and what you claim can be achieved. (Having a school attend an event may be great but might not be crucial to exam results) #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Select targets that are achievable but stretching and make sure you can gather data relating to those indicators cost-effectively. If you can't, there's really not much point adopting them! #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Shout-out to the following contributors who highlighted things happening elsewhere in the sector on the topic of evaluation:

Thanks very much for your company. We've been @ERS_Limited and I've been @carryonkeith Please feel free to DM us with any follow-up queries #HeritageChat
— ERS Ltd (@ERS_Limited) January 18, 2018
Delighted to welcome colleagues from across the heritage sector to today’s workshop at @UniofOxford on ‘Evaluation: A Question of Measurement’, perfectly (if serendipitously) timed for today’s #HeritageChat pic.twitter.com/PbhOJPxsOn
— Alice Purkiss (@AlicePurk) January 18, 2018
Fascinating talk from Kim Pickin of the @TheStoryMuseum on developing an institutional ‘Happy Tracker’, responding to different funders’ evaluation requirements & not letting evaluation kill the impact of a visitor’s experience #HeritageChat
— Alice Purkiss (@AlicePurk) January 18, 2018
The @Heritage_NGOs is moving offices as we speak but those in @HeritageChat may be interested in our latest @Givingtoheritag evaluation demonstrating the programme brought in over £3m as a result of the training. https://t.co/VarAM7loH1 #heritagechat
— Lizzie Glithero-West (@heritage_lizzie) January 18, 2018
Many thanks to everyone who participated in this month’s #HeritageChat – our next chat will take place on 15 February (13.00-14.00) on the topic of Diversity.

Today is all about evaluation - @heritagelottery has also published its latest impact study on Heritage Grants. https://t.co/6cyplEHxND
— Sara Crofts FRSA (@sarajcrofts) January 18, 2018