New Self-Regulatory System Unsafe for Children

At the end of last year, the Turnbull government announced the trial of a self-regulatory tool to classify films and television series. This Australian innovation is a world first, and while it seems tech-savvy and a time saver, there are inevitable dangers, particularly the lack of protection for children against gaining access to harmful material.

This self-regulating system involves a broad range of classifications, which can then be reviewed by the Classification Board.


Over the next 12-month trial period, the Board will assess the integrity of this self-regulatory tool to ensure that the ratings meet Australian community standards; however, there is no guarantee that during this trial period harmful images will not reach our young people.

For example, although children’s profiles with restricted content can be created in Netflix, children can easily click on mum and dad’s account if it is not password protected.

According to studies released by stoppornculture.org, 79% of kids’ unwanted exposure to porn occurs in the home. Home should be the one place where parents have the most control over their children’s viewing habits.

There are many different categories and genres of films and TV shows on Netflix, and while Netflix doesn’t contain an actual XXX section of adult films, it still contains some content that is sexually explicit, violent, or otherwise inappropriate for children. 

There are also many titles that are Unrated (UR) or Not Rated (NR). Films given the NC-17 rating, the most restrictive rating issued by the Motion Picture Association of America (MPAA), are not even shown in mainstream theatre chains due to their notoriously graphic sexual content. Films with any of these three ratings can show full frontal male and female nudity as well as long, graphic sex scenes.

If needed, we are told, the Classification Board has the power to revoke classifications made by the Netflix tool and replace them with its own decisions, but how long does this take? Will there be an appeal process for Netflix, and how many children will be affected during, and possibly after the review process?

The principal advantages of this system seem to be for Netflix itself, such as the increased speed of the classification process and the removal of red tape; there appears to be very little advantage for parents who wish to exclude inappropriate material from their children’s viewing, or for children who deserve to be protected from such material.

This self-regulatory tool is the second of its kind in Australia – the Government is currently evaluating its pilot of the International Age Rating Coalition tool, another innovative classification system which has produced over half a million Australian classifications for online and mobile games on Google Play, the Microsoft Windows Store and the Nintendo eShop.

There is no indication of when the evaluation will be available.

Ultimately, these self-regulatory tools are entirely incompatible with last year’s findings of the Australian Government’s Select Committee on the harm being done to Australian children through access to pornography on the Internet.

The Select Committee concluded in its report that the Australian Government needs to do more to ensure the safety of children online, and urged it to do so.

The Committee also recommended that the Federal Government commission research into the exposure of Australian children to online pornography, including the adequacy of current policies. Particularly alarming was evidence to the inquiry about child-on-child sexual abuse in schools.

It appears that children are accessing harmful images online and acting out on other children, predominantly in schools, with very little being done about it.

When the harm done to these children is recognised, extensive therapeutic intervention in the form of cognitive behavioural therapy and preventative and diversionary education is necessary, and there appears to be no cohesive plan in place to provide this.

The experience of the Royal Commission into Institutional Responses to Child Sexual Abuse has demonstrated the long-term emotional and financial costs of ignoring the damage children suffer at the hands of those who are supposed to protect them.

Self-regulation systems for organisations or industries are frequently shown to be ineffective, as in many cases they act only to promote the interests of the organisation, which do not necessarily reflect the values of the society in which it functions.

As a society, regulating to minimise the harm being done to children is a basic first step. In the words of Pearl Buck in “My Several Worlds: A Personal Record” (1954), ‘the test of a civilisation is the way that it cares for its helpless members’.

Our children deserve to grow up in an environment where they are protected from all harm, in line with Australia’s due diligence obligations to protect children from material injurious to their well-being, as laid out in Articles 3 and 17(e) of the Convention on the Rights of the Child.

As Dietrich Bonhoeffer put it: “The ultimate test of a moral society is the kind of world that it leaves to its children”.

This article was also posted on the Porn Harms Kids website.
