Online Safety Live (in conjunction with SWGfL and UK Safer Internet Centre)


Attending one of SWGfL and the UK Safer Internet Centre (UKSIC)’s Online Safety Live events is a must, especially in these Covid-19 times when so much of our lives is spent online, whether for education, work or leisure.  If you missed this one, do try to reserve a place at one of their forthcoming events; keep an eye on their website or their electronic newsletters for further dates.

After the presenters had introduced themselves, Boris Radanovic outlined the format of the session, which concentrated on five areas:

  • Covid-19;
  • Threats;
  • Obligations;
  • Actions;
  • Support.

To begin with, Boris reflected on the impact of Covid-19.  He discussed the situation pre-Covid in terms of technology (AI and IoT; big data and smart places; voice interaction and facial recognition; services with no content; personalised) and safety (empowering, not stifling; risks to all online, not just young people; data and privacy; regulation and safety tech industry) and where we are now (connectivity; contactless; screen stacking; communities of interest versus place; trust; necessity is the mother of invention).  The images projected clearly showed how the ‘landscape’ has changed.  In addition, ‘2020: This is what happens in an Internet minute’ was fascinating.

SWGfL and the UK Safer Internet Centre have produced a resource, entitled ‘Covid-19: Expectation and effects on children online’, which looks at digital migration, warnings and predictions and impact, and makes for interesting reading.  Their concluding message is: ‘Policymakers will need to consider and accommodate the impacts of Covid-19 on children for many years to come.  There will, without doubt, be a lasting impact on children from Covid-19.’

There are five main threats, which Andrew Williams addressed in turn, namely gaming, mental well-being, harmful content, online bullying and self-generated images.  In terms of gaming, Andrew explored the question, ‘Have we seen more negative issues arising from Covid-19?’ and discussed content, sociality and spend.  Surprisingly, the average female gamer is 36 years old and the average male gamer is 32 years old.  Mobile gaming, i.e. gaming played on a smartphone or tablet, increased dramatically, from 39% to 51% between 2016 and 2020, and is estimated to be worth around $72.3 billion today.  There are more male than female social gamers (63% compared with 37%); 42% of social gamers fall within the 16-24 age bracket and 45% of gamers played a game online with their real friends in the last month.  We digressed somewhat to talk about the popularity of ‘Among Us’, including the benefits to be gained and some of the dangers to be aware of.

Andrew also questioned whether online bullying had improved or worsened during Covid-19 and explored three aspects, namely variation, benefits and targeting.  Rising levels of hate speech and online toxicity during this time of crisis have been apparent; after analysing millions of websites, popular teen chat sites and gaming platforms, AI-based start-up L1ght recorded a 900% increase in hate speech directed towards the Chinese and a 40% rise in online toxicity among teens and children.

In connection with mental well-being, Andrew explored the question: ‘Is screen time an issue?’.  He considered addiction versus problematic use.  An infographic created by SWGfL in March 2018 was projected, suggesting that ‘there are clear correlations between screen time and well-being.’  The report series found that time online increases with age, that a third of young people spend more than three hours a day online and that males are more likely to be heavy users.  Heavy users are more likely to send and receive abuse, see upsetting content, go online because they are lonely, and worry about what they have seen and how much time they spend online.  He shared some advice for parents and carers (take a precautionary approach; keep meal times screen-free; allow no screens before bed; take regular breaks), as well as emphasising the importance of discussion, agreeing boundaries and leading by example.

Do children and parents need to be warned about dangerous apps or challenges?  Andrew ‘zoomed in’ on a few articles relating to Blue Whale, Doki Doki Literature Club and Momo, and we debated the difference between fake news, misinformation and disinformation.  He congratulated all those who ‘took a step back’ and did not publicise such ‘harmful content’, thereby denying it the attention it was craving.

In relation to self-generated images, Andrew asked whether children share these less if they know that it is a crime.  The Internet Watch Foundation (IWF) assessed 22,484 reports of self-generated child sexual abuse material between January and June 2019 and discovered that 96% depicted girls, with 85% of those featured being between 11 and 13 years old.

So, what obligations do we have?  Boris linked to some of the very latest guidance produced by SWGfL and the UK Safer Internet Centre, as well as that issued by the Department for Education and Ofsted.  Educational professionals need to make sure that they have accessed these and familiarised themselves with the content of each.

Action can be sub-divided into seven areas, namely ownership, reporting, policy, staff, education, technology and evaluation.  Each area was visited and useful resources highlighted:

  • Ownership: the 360° safe online safety self-review tool and 360° early years reinforce ownership.
  • Reporting: Online Safety BOOST, SELMA Hacking Hate and Childnet’s Digital Leaders programme can aid reporting.
  • Policy: SWGfL has many policy templates that can be downloaded.
  • Staff: make sure that staff receive appropriate online safety training that is relevant and up-to-date, from the likes of SWGfL, Edify or the UK Safer Internet Centre.
  • Education: the UK Council for Internet Safety (UKCIS), Childnet International and the UK Safer Internet Centre have numerous high-quality resources to educate children and young people effectively.  Andrew also promoted ‘ProjectEVOLVE’ here, a global first with more than 1,000 schools on board to date!  Do register and explore the incredible toolkit, which is growing all the time.
  • Technology: seek guidance from SWGfL, the 360° data protection self-review tool, the National Cyber Security Centre and the UK Safer Internet Centre.  Andrew showcased ‘Swiggle’, a child-friendly search engine that can be set for use in schools (www.swiggle.org.uk).
  • Evaluation: look to SWGfL and the 360° safe online safety self-review tool.

Finally, Boris provided an overview of the support available to both educational professionals and parents/carers, e.g. organisations for reporting both illegal and legal (but harmful) content; the Professionals Online Safety Helpline (POSH); and updates via newsletters, podcasts and SWGfL’s website.  Make sure you save the following date too: Safer Internet Day on Tuesday 9th February 2021.

Many thanks to Boris and Andrew for a hugely informative and engaging session this afternoon.  Great teamwork, once again!
