Sex and Gender Education (SAGE) Australia’s Response to the Consultation on The Parliament of the Commonwealth of Australia, House of Representatives, Online Safety Bill 2020 (short title: Online Safety Act 2020) (Communications, Cyber Safety and the Arts) – A Bill for an Act relating to online safety for Australians, and for other purposes – Exposure Draft.
Submission 1 February 2021
Prepared for SAGE by Dr Tracie O’Keefe DCH, Sexologist, post-grad ADV Dip NSHAP, BHSc, ND. Tracie is a clinical psychotherapist, sex educator, researcher and therapist, and mental health professional; a member of the Psychotherapy and Counselling Federation of Australia (PACFA), College of Psychotherapy, and of the Australian Society of Sex Educators, Researchers and Therapists (ASSERT, NSW). She has been in private practice for 26 years and previously worked with sex and/or gender diverse people in the voluntary sector for 25 years. She is presently involved in a three-year research project on suicide in sex and/or gender diverse groups, and is the author of four books and over 100 articles and papers on sex and/or gender diverse groups.
Since 2001 SAGE has campaigned for the human and legal rights and dignity of people of sex and/or gender diverse groups who may be intersex, sex non-specific, transexed, transsexual, transgendered, cross-dressers, androgynous, bigendered, gender fluid, without sex and/or gender identity, have atypical sex characteristics, and people with sex and gender culturally specific differences. Sex and/or gender diverse groups of people is an inclusive phrase and excludes no one who may be sex and/or gender diverse in any way.
Background
Electronic, internet and social media bullying, naming and shaming, and victimisation of people from sex and/or gender diverse groups is a large problem in Australia. This includes internationally hosted networks that are registered and administered outside the Australian jurisdiction but offer services to Australian end users. Because posted information and comments go out publicly and to a large number of people, even in private groups, posting is essentially broadcasting.
The line between non-consensual internet broadcasting and consensual sharing of material becomes clear when the broadcaster intentionally seeks to damage the second party via the contents of the material posted.
It can be seen in the Australian Trans Pathways study on the mental health of trans and gender diverse youth that 74% experienced bullying, much of which will have been via electronic means (Strauss, Cook, Winter, Watson, Wright Toussaint and Lin, 2020). This correlates with 68% experiencing discrimination, 74.6% depression, 74.6% anxiety, 82.4% suicidal thoughts and 48.1% attempting suicide.
Many adults from sex and/or gender diverse groups have also taken their own lives because of online bullying; this is as great a problem as the high levels of depression and social ostracisation that online bullying causes. Abreu and Kenny (2018) reported findings in a systematic review of the literature that non-cisgendered individuals are more at risk of online bullying than cisgendered individuals.
The proposed act sets out to create an eSafety Commissioner who will handle complaints around electronic abuse and bullying, seeking fast remedies for victims.
The Proposed Draft Bill
The proposed bill purports to improve online (internet) safety against the effects of bullying and the damage done by the posting of untrue and extremely biased statements against Australians. This should particularly pertain to vulnerable groups whose lives and mental health could be severely damaged by prejudicial and untrue statements.
“The core stated objective of the Act is:
3 Objects of this Act
(a) to improve online safety for Australians; and
(b) to promote online safety for Australians.”
The bill proposes:
There is to be an eSafety Commissioner.
“The functions of the Commissioner include:
(a) promoting online safety for Australians; and
(b) administering a complaints system for cyber-bullying material targeted at an Australian child; and
(c) administering a complaints system for cyber-abuse material targeted at an Australian adult; and
(d) administering a complaints and objections system for non-consensual sharing of intimate images; and
(e) administering the online content scheme; and
(f) coordinating activities of Commonwealth Departments, authorities and agencies relating to online safety for Australians.” (Parliament of the Commonwealth, 2020)
Regulation is aimed at providers of social media, hosting service providers and any person who posts cyber-bullying material, with the Commissioner having the power to issue a notice to remove any offending material, including text and non-consensual intimate images and videos.
“An internet service provider may be requested or required to block access to
(a) material that promotes abhorrent violent conduct; or
(b) material that incites abhorrent violent conduct; or
(c) material that instructs in abhorrent violent conduct; or
(d) abhorrent violent material.” (Parliament of the Commonwealth, 2020)
The bill indicates that the Australian authorities would have the ability to request that the service provider disclose the identity of the original source of the posted information. Problems arise here in that material can be posted from outside Australia, or by someone in Australia using a masked identity, and the source may even be hidden from the service provider. This would make it impossible for the service provider to provide such information.
A better provision would be one where the victim can request that the service provider remove the offending misinformation and bullying material within 24 hours; failure to do so would then result in the service provider becoming the offending and libellous body.
There is a proposed provision for persons and corporations to be issued with punitive fines for posting such material and for not removing such material. Electronic material can go viral in a very short space of time and be posted outside the legal reach of the Australian authorities.
The bill, however, does not make provision for the damage done to the victim by viral postings and streaming, derived from the original posting, which the Australian authorities have no power to order removed. Should this happen, extra punitive measures need to be levied against the offending parties.
It is unclear whether the service provider and the eSafety Commissioner would have a duty to inform the victim of the identity of the person or organisation that posted the original offending material.
It is important for personal safety that the victim of offending postings is informed of the original source of the material wherever possible. Not to do so would leave them at future risk of victimisation from that source.
Allowing the service providers to form voluntary online codes of service has not worked. Such companies’ first obligation is to their shareholders’ profits, and they often only remove offending posts after considerable, protracted publicity. A victim of online bullying may be emotionally and mentally damaged and rendered vulnerable by the material, and be afraid to publicise the incident for fear of further victimisation. Large tech companies often register their businesses outside Australia, flout the tax laws, and fail to comply with a whole host of Australian guidelines, rules and regulations.
Tech companies often do not respond to the public’s enquiries. Many of their staff are offshore in Asia and do not understand Australian laws or have any desire to comply with them. Complaints are frequently not escalated, or not responded to at all. Dealing with complaints costs these companies money, which they often do not wish to spend.
We can see that Mark Zuckerberg, the CEO of Facebook, told a US Congressional committee that Facebook would not take down lies or fact check a lot of posted content (PBS NewsHour, 2019).
After the attack on the US Capitol by rioters, supposedly incited by Donald Trump, Facebook blocked Trump’s account; however, it then targeted extreme right-wing groups with adverts for combat gear (Vamos, 2021).
Google removes genuine information from its search results in order to block content that it deems counter to the products and profits of its own business, its acquired companies, sister companies and other companies belonging to its parent company (Karp, 2021). So not only does it bully some competitors but also large corporations, and it can do so because it is a monopoly which claims to have 90% of online searches.
Twitter has been accused of being a bullying playground. It runs programs in which algorithms select keywords and phrases so that material that may be associated with bullying can be removed. However, this is moderation by machine rather than by people: the programs fail to understand the semantics, pragmatics and contextual meanings of human communication. Its automated regulation therefore fails to respond to a great deal of bullying (Sterner and Felmlee, 2017).
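As a purely illustrative sketch of this limitation (not a description of Twitter’s actual system; the blocklist and example posts below are hypothetical placeholders), a simple keyword filter of the kind described can be expressed in a few lines of Python:

# Minimal sketch of keyword-based moderation; the blocklist and example
# posts are hypothetical placeholders, not any platform's real terms.
BLOCKLIST = {"insult_a", "insult_b"}

def flag_post(text: str) -> bool:
    """Flag a post if it contains any blocklisted keyword, ignoring context."""
    words = {w.strip(".,!?'\"").lower() for w in text.split()}
    return bool(words & BLOCKLIST)

# A supportive post that merely quotes a flagged word is caught...
print(flag_post("Wearing 'insult_a' as a badge of pride today."))  # True
# ...while targeted abuse written without any flagged word is missed.
print(flag_post("People like you should just disappear."))         # False

The filter matches surface words only; it cannot distinguish reclamation or quotation from abuse, nor detect harassment phrased without any listed term.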
A new app for social communication is launched every few weeks, and bullying then often moves location onto its facilities.
In short, tech companies cannot be trusted to form voluntary codes to stop bullying. Many people from sex and/or gender diverse groups have been bullied electronically, which leads to suicide, physical violence and murder.
Allowing tech companies to supposedly combat bullying via a voluntary industry code would create a beast with no teeth and no protection for the Australian public.
It is necessary that the eSafety Commissioner, with public consultation, be the one to set an industry standard, also requiring tech companies to respond to complaints using people, not simply via automated programs. Only that way can the public be better protected.
Legal compliance of foreign-based tech companies operating in Australia, as well as other countries, is a large problem today. These companies often act as if they are outside Australian laws, even though they are operating within the jurisdiction. Sometimes they have no tangible assets in Australia, considering themselves above the law and able to operate with impunity, giving them little incentive to protect Australian consumers.
In forming new regulation protecting Australians against online bullying, it would be necessary to require tech companies to respond to complaints about misinformation, bullying material or postings of non-consensual intimate images within 24 hours. Failure to do so must carry the risk of the eSafety Commissioner issuing an emergency removal order, fines far higher than 500 penalty units, and publication of each offence within the past five years on the eSafety Commissioner’s website.
The present draft omits any reference to government employees or departments being as liable as the general public or companies for posting offending material. This is out of line with standard Australian anti-discrimination laws, which do not exempt the behaviour of government employees and departments.
In drafting the legislation, it is necessary to include wording that states government employees and departments can be possible offenders in posting bullying material. No Australian entity should be exempt from liability under this legislation.
In determining whether material is offensive the draft refers to the following:
“The matters to be taken into account in deciding for the purposes of this Act whether an ordinary reasonable person in the position of a particular Australian adult would regard particular material as being, in all the circumstances, offensive, include:
The general character of the material (including whether it is of a medical, legal or scientific character)” (Parliament of the Commonwealth, 2020)
Science is neither exact nor static; in other words, it is not dogma. When a scientific study suggests that a hypothesis is true, the job of science is then to attempt to disprove that hypothesis and to defend any new hypothesis publicly.
The draft uses a standard of offensiveness judged by an “ordinary reasonable person in the position of a particular Australian adult”. The problem here is that the “position of a particular Australian adult” is not scientifically qualified (Parliament of the Commonwealth, 2020). Such a person could neither determine scientific fact nor assess the validity of a scientific hypothesis for which they are not trained.
Human rights are constantly being determined according to scientific arguments that are intertwined with the sometimes opposing perspectives of scientists. Sex and/or gender diverse groups of people have constantly been oppressed by so-called scientific dogma in Australia. Section 8 Clause C of the draft needs to be removed as it is indefensible in law.
Whilst the Act is federal, each state has its own set of laws and policies regulating offensive behaviour under state anti-discrimination laws. Those anti-discrimination laws have been fought for by generations and should never be disregarded.
The state laws should not be ignored; they should be taken into consideration by the eSafety Commissioner. We are the Australian states, not the State of Australia.
Tech corporations, like other corporations, practise risk assessment relative to reduction in profits. In other words, if something is not causing the corporation financial losses, the problem is usually not addressed, including persistent internet bullying. This is standard in large corporate management and is even more pronounced in multinationals that register their businesses outside the countries where they operate and where the problem may exist. We can see that the Australian Competition and Consumer Commission (ACCC) has no ability to handle complaints against tech companies. In fact, no Australian agency has that ability when the corporation is a foreign entity operating in Australia via the internet and stating that its data processing is done outside Australia.
Persistent complaints and offences of failing to take down offensive material, connected with bullying and contravening Australian state and federal anti-discrimination law, need to warrant increasingly heavy punitive financial penalties, higher than 5000 penalty units, decided by the eSafety Commissioner.
Trying to get tech companies to give apologies to the victims of bullying is impractical. Companies do not issue apologies, for fear of admitting legal liability, unless it is a matter of public relations. One of the basic tenets of large corporate legal practice is ‘admit nothing unless proved’, which is indeed the legal advice they generally receive. An order requiring service providers to issue large numbers of apologies to victims would therefore be impractical.
Bullies who post offensive material and create victims on the internet may have mental and social problems themselves. There is accidental bullying, where the offender posts inappropriate material and comments without realising they amount to bullying, and may be happy to apologise. There is, however, psychopathic bullying, where the bully has mental health problems, attachment issues and a lack of empathy with others, and may be incapable of apologising regardless of any order made against them by the Commissioner.
Punitive orders made against service providers and end users who post offensive material should be kept to large financial fines and removal and destruction orders. The fact that a fine has been issued is proof enough of wrongdoing and can be a major incentive against reoffending.
Exemptions for posting offensive material can be a slippery slope, a licensed privilege whereby members of certain groups in society are permitted to operate above the law. Certainly, freedom of speech in civilised societies should never be squashed, shelved or constrained. However, the unreasonable bombarding and pillorying of minority groups, such as people from sex and/or gender diverse groups, can do great harm, both by creating physical and mental health problems in those groups and by inciting violence, suicide and murder against them, as the research shows.
There is a dividing line where freedom of speech crosses over into abuse with the intent to cause harm or offence. This should never be masked as journalistic freedom or religious privilege in an egalitarian secular society, which is how Australia bills itself on the world stage.
There should be no exemptions for transgressing Australian equal rights or anti-discrimination laws for any party posting offensive material leading to bullying on the grounds of being a journalist or having religious license.
The very nature of electronic material is that it can be stored without detection at times for the term of a person’s life and beyond to re-emerge at any time. In dealing with offensive and sensitive information, images, or film that incites bullying, simply asking a service provider to no longer make that information public does not restore the privacy of the victim. The information may still be stored by third parties out of the control of the victim.
In some cases, the posting of offensive material will involve the abuse of minors in sometimes unpredictable ways, whether that be physical, mental or sexual. In other cases, it will involve crimes against adults.
The new Act needs to give the eSafety Commissioner the ability to issue notices ordering the destruction of all such offensive material held by parties other than the victim once the complaint has been resolved. Even governments should not have the ability to store potentially damaging material, images or film that caused the victim distress in the first place.
While the complainants must be assured confidentiality in making a complaint with regard to privacy, this should not supersede the eSafety Commissioner’s and staff’s obligation of mandatory reporting.
In cases of child abuse and criminal actions the eSafety Commissioner and staff should be obliged to inform complainants of the department’s mandatory reporting obligations. In revelation of abuse or crimes outside Australia, the eSafety Commission should work with the relevant ministers, Interpol and foreign police to report such matters.
Electronic evolution is fast and unexpected, meaning that new ways of electronically communicating emerge each year in forms that cannot be foreseen or foretold. Some of them are bought up by large tech companies to stop the new evolution competing with their already profitable product. Others, however, can emerge from grassroots beginnings and become widely publicly accessible to end users within a matter of a few months.
The proposed bill needs to give itself wide enough scope to encompass emerging electronic technologies where bullying may take place in the future.
This proposed draft bill fails to protect children and adults against discrimination and bullying on the grounds of their sex and/or gender presentation. By omitting that wording it fails to comply with the Sex Discrimination Act (1984) and Australia’s United Nations international commitments on human rights (1966).
The Bill must contain wording that protects people on the grounds of their sex and/or gender presentation and sexuality in order to comply with Australian law.
Conclusion
It is clear from a broad spectrum of research that youth and adults from sex and/or gender diverse groups are at high risk of being bullied in Australian society. Much of that bullying takes place online and is presently unregulated by any Australian agency, because those agencies claim they have no jurisdiction over internet providers that are private corporations registered abroad.
The victims of such bullying have no recourse but to complain to the service providers, who often do not respond, claim freedom of speech, deny any responsibility for material posted, or take months to process a complaint. During that time any damaging posted material remains on the internet, is spread virally and creates greater and compounded victimisation, and the victims get little or no resolution.
Industry voluntary codes clearly do not work in this area, as the service providers can be registered throughout the world, often in jurisdictions that offer them tax advantages and over which the Australian Government has no authority. Therefore SAGE supports the formation of an eSafety Commissioner who has the legislated power to order a service provider offering services to Australians to immediately remove offensive material, to issue large punitive financial fines to both the service provider and the end user who posted the offensive material, and to order the destruction of the offensive material.
In order for this bill to be truly protective of sex and/or gender diverse groups it must mention that people should not be bullied or discriminated against on the grounds of their sex, gender or sexuality.
Recommendations
- There needs to be a provision whereby the victim can request that the service provider remove the offending misinformation and bullying material within 24 hours. Failure to do so would then result in the service provider becoming the offending and libellous body.
- The bill needs to make provision for damage done to the victim by viral postings and streaming, derived from the original posting, which the Australian authorities have no power to order removed. Should this happen, extra punitive measures need to be levied against the offending parties.
- It is important for personal safety that the victim of offending postings is informed of the original source of the offending material wherever possible. Not to do so would leave them at future risk of victimisation from that source.
- It is necessary that the eSafety Commissioner, with public consultation, be the one to set an industry standard and require tech companies to respond to complaints using people, not simply via automated programs. Only that way can the public be better protected.
- In forming new regulation protecting Australians against online bullying, it would be necessary to require tech companies to respond to complaints about misinformation, bullying material or postings of non-consensual intimate images within 24 hours. Failure to do so must carry the risk of the eSafety Commissioner issuing an emergency removal order, fines far higher than 500 penalty units, and publication of each offence within the past five years on the eSafety Commissioner’s website.
- In drafting the legislation, it is necessary to include wording that government employees and departments can also be offenders in posting bullying material. No Australian entity should be exempt from liability under this legislation.
- Human rights are constantly being determined according to scientific arguments that are intertwined with sometimes opposing scientific perspectives. Sex and/or gender diverse groups of people have constantly been oppressed by so-called scientific dogma in Australia. Section 8 Clause C of the draft needs to be removed as it is indefensible in law.
- The state laws should not be ignored; they should be taken into consideration by the eSafety Commissioner. We are the Australian states, not the State of Australia.
- Removal of offending material leading to bullying is time-sensitive, and the eSafety Commissioner must have the legal ability to make an emergency immediate removal order while the complaint is processed. Persistent complaints and offences of failing to take down offensive material, connected with bullying and contravening Australian state and federal anti-discrimination law, need to warrant increasingly heavy punitive financial penalties, higher than 5000 penalty units, decided by the eSafety Commissioner.
- Punitive orders made against service providers and end users posting offensive material should be kept to large financial fines and removal and destruction orders, not requests for apologies. The fact that a fine has been issued is proof enough of wrongdoing and can be a major incentive against reoffending.
- There should be no exemptions for transgressing Australian equal rights or anti-discrimination laws for any party posting offensive material leading to bullying, on the grounds of being a journalist or having religious license.
- The new Act needs to give the eSafety Commissioner the ability to issue notices to destroy all such offensive material held by parties other than the victim once the complaint has been resolved. Even governments should not have the ability to store potentially damaging material, images or film that caused the victim distress in the first place.
- In cases of child abuse and criminal actions the eSafety Commissioner and staff should be obliged to inform complainants of the department’s mandatory reporting obligations. In revelation of abuse or crimes outside Australia the eSafety Commission should work with the relevant ministers, Interpol and foreign police to report such matters.
- The proposed bill needs to give itself wide enough scope to encompass emerging electronic technologies where bullying may take place in the future.
- The Bill must contain wording that protects people on the grounds of their sex, gender and sexuality in order to comply with Australian law.
Government officials, politicians, media outlets and others wishing to respond to this paper or discuss the issues can contact Dr Tracie O’Keefe DCH at the Australian Health & Education Centre (02 8021 6429) or by email at sageaustraliateam@gmail.com
References
Abreu, L. & Kenny, M. (2018). Cyberbullying and LGBTQ Youth: A Systematic Literature Review and Recommendations for Prevention and Intervention. J Child Adolesc Trauma, 11(1), 81–97. Retrieved from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7163911/
Karp, P. (2021, January 13). Google admits to running ‘experiments’ which remove some media sites from its search results. The Guardian. Retrieved from https://www.theguardian.com/technology/2021/jan/13/google-admits-to-running-experiments-which-remove-some-media-sites-from-its-search-results
PBS NewsHour (2019, October 25). Rep. Ocasio-Cortez questions Mark Zuckerberg on when Facebook will fact check. Retrieved from https://www.youtube.com/watch?v=xT9BRUoXhh8
Sex Discrimination Act 1984 (Cth) (Austl.).
Sterner, G., & Felmlee, D. (2017, June). The Social Networks of Cyberbullying on Twitter. ResearchGate. Retrieved from https://www.researchgate.net/publication/318136951_The_Social_Networks_of_Cyberbullying_on_Twitter
Strauss, P., Cook, A., Winter, S., Watson, V., Wright Toussaint, D., & Lin, A. (2020). Associations between negative life experiences and the mental health of trans and gender diverse young people in Australia: findings from Trans Pathways. Psychol Med, 50(5), 808-817. Retrieved from https://pubmed.ncbi.nlm.nih.gov/31280740/
United Nations, International Covenant on Civil and Political Rights (ICCPR) for Australia. (1966, December 16). Retrieved from: https://www.ohchr.org/en/professionalinterest/pages/ccpr.aspx
The Parliament of the Commonwealth of Australia, House of Representatives (2020). Online Safety Bill 2020, Exposure Draft, 2019-2020. Australian Government Department of Infrastructure, Transport, Regional Development and Communications. Retrieved from https://www.communications.gov.au/have-your-say/consultation-bill-new-online-safety-act
Vamos, I. (2021, January 26). Facebook is bombarding rightwing users with ads for combat gear. See for yourself. The Guardian. Retrieved from https://www.theguardian.com/commentisfree/2021/jan/26/facebook-ads-combat-gear-rightwing-users