Our Evaluation Practice
We encourage high standards of evidence, specialising in experimental and quasi-experimental designs suited to each context. We embed these designs in our mixed-methods approach to conduct evaluations focused on impact, programme activities and context, model comparisons, and overall learning.
We promote relevant and inclusive processes, often working with our clients and stakeholders to ensure each evaluation is fit for purpose. We also provide technical assistance to evaluation teams.
We focus on informing decisions, helping people access quality evidence throughout implementation to support success, and at the end of a programme for accountability.
- Impact evaluation
- Formative evaluation
- Summative evaluation
- Process evaluation
- Outcomes evaluation
- Experimental design
- Quasi-experimental design
- Randomised controlled trials (RCTs)
- Mixed methods
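To give a flavour of what one such design looks like in code, here is a minimal difference-in-differences sketch in Python. The data are simulated and every variable name is purely illustrative; it assumes the pandas, NumPy and statsmodels libraries are available.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a simple two-group, before/after dataset (all values invented)
rng = np.random.default_rng(42)
n = 400
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),  # 1 = programme group, 0 = comparison
    "post": rng.integers(0, 2, n),     # 1 = after programme start
})
df["outcome"] = (
    10
    + 1.5 * df["treated"]
    + 0.5 * df["post"]
    + 2.0 * df["treated"] * df["post"]  # true programme effect of 2.0
    + rng.normal(0, 1, n)
)

# The coefficient on the interaction term is the difference-in-differences
# estimate of the programme's impact
model = smf.ols("outcome ~ treated * post", data=df).fit()
print(model.params["treated:post"])
```

The interaction term isolates the change in the programme group over and above the change in the comparison group, which is the core logic of this quasi-experimental design.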
Our Research Practice
We review and synthesise evidence, from literature, documents and data through to full meta-analyses, and present the results so they are useful for their intended purpose.
We analyse data, using the most appropriate qualitative or quantitative techniques for both primary and secondary data. We use the Integrated Data Infrastructure (IDI) and explore other administrative datasets.
- Statistical analysis
- Systematic review
- Meta-analysis
- Literature review
- Segmentation research
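As a flavour of the arithmetic behind a meta-analysis, here is a minimal fixed-effect (inverse-variance) pooling sketch in Python. The study effect sizes and standard errors are invented for illustration.

```python
import numpy as np

# Illustrative effect sizes and standard errors from three hypothetical studies
effects = np.array([0.30, 0.45, 0.25])
ses = np.array([0.10, 0.15, 0.12])

# Fixed-effect (inverse-variance) pooling: weight each study by 1 / SE^2
weights = 1.0 / ses**2
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"Pooled effect: {pooled:.3f}, 95% CI: ({ci[0]:.3f}, {ci[1]:.3f})")
```

Weighting by inverse variance gives more precise studies more influence on the pooled estimate, which is the standard logic of a fixed-effect synthesis.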
Our Monitoring Practice
We inspire progress through evidence. Data is vital when working with multiple stakeholders to achieve “collective impact”. We develop theories of change with stakeholders to ensure measures are relevant and have the buy-in required to succeed, and we develop and assess measures that build confidence and ensure monitored evidence supports the overall goals.
We make quality data accessible, creating real-time reports that pull together relevant data so it is useful to everyone who can benefit from continuous improvement: implementation teams, managers and funders.
- Progress reporting
- Collective impact monitoring
- Evaluative monitoring
- Data management system
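To show the kind of roll-up a progress report depends on, here is a minimal pandas sketch that aggregates hypothetical indicator data from several delivery partners against their targets. Every name and figure is illustrative.

```python
import pandas as pd

# Hypothetical indicator records from multiple delivery partners
records = pd.DataFrame({
    "partner": ["A", "A", "B", "B", "C"],
    "indicator": ["enrolments"] * 5,
    "actual": [120, 140, 90, 110, 75],
    "target": [150, 150, 100, 100, 80],
})

# Roll up to one row per partner and express progress against target
report = records.groupby("partner", as_index=False)[["actual", "target"]].sum()
report["progress_pct"] = (100 * report["actual"] / report["target"]).round(1)
print(report)
```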
Our Measurement Practice
- Evaluation framework
- Implementation plan
- Instrument design
- Sampling design
- Population estimation
- Scale validation
- Measurement review
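As one example of the arithmetic behind a sampling design, here is a minimal power-calculation sketch in Python for comparing two proportions. The 20%-to-30% outcome scenario is hypothetical, and statsmodels is an assumed dependency.

```python
import math
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Hypothetical scenario: detect a lift from 20% to 30% in a binary outcome
h = proportion_effectsize(0.30, 0.20)  # Cohen's h effect size

# Sample size per group for a two-sided test at alpha = 0.05, power = 0.80
n_per_group = NormalIndPower().solve_power(
    effect_size=h, alpha=0.05, power=0.80, ratio=1.0
)
print(math.ceil(n_per_group))  # participants needed in each group
```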
Our Data Science and Analytics Practice
- Insights generation
- Descriptive analysis and profiling
- Multivariate and regression modelling
- Validation and operationalisation of predictive models
- Randomised controlled trials, A/B (split) testing
- SAS, SQL, R, Python, Cloud
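To illustrate the A/B (split) testing entry above, here is a minimal two-proportion z-test sketch in Python. The conversion counts are invented, and statsmodels is an assumed dependency.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical A/B test: conversions out of visitors in each variant
conversions = [220, 260]  # variant A, variant B
visitors = [2000, 2000]

# Two-sided z-test for a difference in conversion rates
stat, p_value = proportions_ztest(conversions, visitors)
print(f"z = {stat:.2f}, p = {p_value:.4f}")
```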