Commitment 4.4 We reflect on, share and apply results and lessons with stakeholders.

Compliance Indicators

Compliance with the Commitments will be assessed against the following Compliance Indicators. All of the applicable Compliance Indicators must be met by every ACFID Member to be considered compliant with the Code. Each of the Compliance Indicators has one or more compliance Verifiers. Verifiers are the description of evidence that is required to substantiate compliance with each Compliance Indicator. Guidance is also provided.

4.4.1 Members disseminate information about results and lessons to all stakeholders – primary stakeholders, partners and donors.


  • Policy, statement or guidance document which commits Members to the dissemination of information about results and lessons to all stakeholders – primary stakeholders, partners and donors.
  • Evidence of consistent dissemination of information and results on website.


You could disseminate information through: structured feedback sessions with partners, communities and other stakeholders in-country and with your own staff; publication of evaluation reports or findings in newsletters; presentation of results and lessons at public conferences or meetings; and through your website.

Dissemination of information on your website could include: publication of evaluation reports with both positive and negative findings; findings of research; or outcomes of reflection processes.

4.4.2 Members reflect on results and lessons in order to inform and improve practice.


Documented process or evidence of consistent reflection on results and lessons and how these are used to inform and improve practice.


Your approaches could include: sharing of evaluation reports and findings with partners and other staff; scheduled and resourced meetings or workshops bringing together key staff and partners, providing the time and space away from day-to-day work to systematically discuss results, challenges and learnings; and the establishment of a shared database capturing data and lessons.

Download and read ACFID’s PMEL Guidance tool from the resources section below for further guidance on developing planning, monitoring, evaluation and learning frameworks and tools that meet this requirement.

Good Practice Indicators

The following Good Practice Indicators describe a higher standard of practice than that set out in the Compliance Indicators. While Members do not need to meet the Good Practice Indicators to be considered compliant with the Code, they will self-assess against these indicators once every three years. This provides a clear pathway for Members to strengthen and improve practice over time.

  • Multi-stakeholder learning events (this may include conferences, workshops, presentations, etc.) are hosted and/or engaged with.
  • A yearly schedule of reflection and learning events is in place.
  • Mechanisms are in place to ensure findings are shared and feedback is sought from primary stakeholders in accessible ways.


Good Practice Guidance

Here are some practical suggestions for your organisation to further deepen and improve practice over time.


  • Prepare information about results and lessons in accessible formats and languages to ensure authentic accessibility for all stakeholders.
  • Jointly define with your partners and other stakeholders what success or progress will look like and how it will be assessed and measured. This could involve defining indicators and targets or could be done in a more open-ended manner.
  • Establish monitoring and evaluation systems that regularly and systematically include the participation and leadership of partners, community members and other critical stakeholders.
  • Consider including staff from other partner organisations or projects in evaluation teams to enable peer learning and sharing.
  • Present findings and seek feedback in an accessible and appropriate way to your stakeholders. This may require summaries of research or evaluations to be translated into local languages.
  • Organise events or opportunities for stakeholders and staff to reflect on lessons learned and explicitly incorporate those lessons into forward planning.
  • Maintain a database of learnings which is searchable and can be accessed by staff when designing future projects.
  • Ensure that project visits, evaluations and research trips include debriefing times with local stakeholders and partner staff to present findings and receive feedback.
  • Schedule and resource regular staff and partner meetings for the structured discussion of results and findings.
  • Schedule and resource community sessions to share results and findings.



DAISI’s Commitment to Principle 4.4 We reflect on, share and apply results and lessons with stakeholders.

  • It is DAISI policy that the minutes of Board meetings must include a review of reporting and evaluation methods.
  • The Annual Report must include reporting of outcome factors in a truthful and transparent manner.
  • This reporting should be publicly available for the perusal of all members and stakeholders – primary stakeholders, partners and donors.
  • Members must reflect on lessons learnt from field trips, and this is best achieved through planning, monitoring and evaluation.
  • DAISI has clear policy guidelines on the planning, monitoring and evaluation requirements of volunteers before, during and after their trips. It is often the post-trip analysis that shapes policy development and the improvement of future trips and programs.
  • For medical and surgical trips, scheduled reporting from Members and Volunteers should be provided within 30 days of completing a trip to the South Pacific and should include the following:
    • A clear statement of the aims and goals of the trip, with measurable outcome factors.
    • Detailed documentation of activities performed (for surgical trips this would include a comprehensive de-identified log book of cases seen and operated on).
    • Objective, measurable outcome factors (for surgical trips this would include morbidity and mortality data).
    • Outcomes of briefing and debriefing sessions, and the level of volunteer and partner involvement in these sessions.
    • Where aims, goals and measurable outcome factors were met, and where they were not.
    • Consideration of lessons learnt and areas for improvement.