The evidence is anecdotal – if it exists at all
There have been 337 programmes in Australia to “support the attraction, retention and progression of girls and women” into STEM careers. Just seven have provided “evidence of impact or an evaluation of effectiveness.”
Merryn McKinnon (ANU) argues that “understanding what initiatives are most effective, and why, is fundamental to making any significant progress in increasing the participation of women in STEM studies and careers,” and so she examines whom these programmes have targeted, as well as why and where.
But as to what they have achieved – evidence, when recorded at all, is often anecdotal, focusing on participants’ enjoyment and enthusiasm. This can be unavoidable: the success of an activity that convinced a girl in primary school to become a mathematician won’t be apparent for 20 years.
So, what is to be done? Dr McKinnon proposes “mandating a certain percentage of any grant awarded” be devoted to evaluation, and making reports public, so that “policymakers have an opportunity to create a culture of evaluation, where rigorous examination of impact and open sharing of results is considered the norm rather than the exception.
“Evidence collected here suggests that any existing requirements are not effective – or stringent – enough,” she concludes.