Around six months ago I wrote a series of posts covering the DfE Meeting Digital and Technology Standards in schools and colleges specifications, which at that time covered the broadband internet, network switching and cabling, and wireless networking standards.

This was followed by the rather mammoth series looking at Cyber security standards.
As of March 2023, three additional standards have been added to that list, covering:
  • Filtering and monitoring
  • Cloud solution standards
  • Servers and storage standards

As before, specific recommendations for each area are broken down into component parts with advice on how and when to implement them.

We will look at these over the course of three articles.

First up, filtering and monitoring.  In a slight deviation from the earlier standards, this one is much more focused on policy, process and procedural controls than on specifically technical ones.  It also begins to strain a little under the attempt to cover both schools and colleges, which often operate in very different ways, in a single document.

Filtering and Monitoring

Roles and Responsibilities

Filtering and monitoring of online activity should already be an important part of your organisation's safeguarding activities, but these standards aim to add a degree of structure to this aspect of the operation.

As with earlier parts of this series of standards and recommendations, what is put in place to mitigate risks can be characterised as technical, operational or policy-based in nature.

The first element of the standard, therefore, isn't a technical capability at all but a matter of clarity: overall strategic responsibility for filtering and monitoring must be assigned to specific people, namely a member of the senior leadership team and a governor tasked with ensuring that the standards are met.

It is also clear that responsibilities for all staff and other stakeholders, such as third-party providers, need to be set out and clearly understood.  The simplest way to ensure this is to develop and communicate policies that articulate everyone's responsibilities for filtering and monitoring: both who is responsible for carrying out the activities and, crucially, what the wider user community needs to understand about what is being monitored and what actions will follow any failure to adhere to the rules.

Unlike in schools, where the Head will often be the arbiter, in further education establishments the decisions as to what is blocked and what is allowed all too often sit with the IT team, which is not, in my view, appropriate.  By default, the technology will generally have a standard set of blocked themes and sites, updated automatically on a regular basis.  However, this will often lead to conflict when sites that are deemed valid by teaching staff are blocked.

The decision on whether to block or unblock sites should rest with a group rather than an individual, and that group should include senior safeguarding and academic input alongside the technical team, as the arguments are often complex.  For example, should sites dealing with suicide be blocked?

Well, the easy answer is yes, but how then does a sociology course cover peer pressure leading to suicide or similar subjects?  There is a range of potentially contentious issues (drugs and drug policy, guns, gang culture, racism and so on) that may be blocked by default and yet have genuine and valid reasons for being available for curriculum purposes.  This tension becomes even greater in colleges that also have HE provision, which will usually use the same infrastructure.

There is a tendency to try to make these issues a problem to solve with technology, but this can lead to a great deal of complexity: maintaining different rule sets for different cohorts is an administrative overhead that most IT departments are ill-equipped to manage.  Instead, a policy-based approach, agreed within that wider organisational context, is a better way to make decisions.

Such decisions also need a robust system of documentation to explain and justify them in the context in which they were made.  This can include any discussion and differing views that informed the eventually agreed position.

Of course, having technologies in place and people with defined roles and responsibilities is fine, but those people also need the time and capacity to deal with the work, so it is important that clear expectations exist around what is monitored and what reporting is undertaken.  Is it routine, or ad hoc based on concern?  Does it cover all users, or is it driven by activity (volume of usage, or sites accessed or attempted)?

Although in some of these standards the ordering of specific points within the main topic area can be arbitrary, I think in this case placing this area first underlines that, of all the standards, this one is less about technical measures and much more about the policy framework and operational processes that make use of any technology.

The emphasis on a team effort involving safeguarding and curriculum expertise alongside the IT team is also in my view a critical part of making this work effectively.

Review

This standard recommends that filtering and monitoring be reviewed at least annually.  This may be new to many organisations that set up systems several years ago and, while keeping things up to date, don't actually check that they are still fit for purpose in an ever-changing regulatory and technology landscape.

Once again though, this isn’t framed as a technical evaluation and indeed the IT team may lack the skills and knowledge to answer many of the questions posed by the standard.

The standard suggests that to contextualise the effectiveness of the filtering and monitoring you should consider some of the following elements:

  • the risk profile of your pupils, including their age range, pupils with special educational needs and disability (SEND), pupils with English as an additional language (EAL)
  • what your filtering system currently blocks or allows and why
  • any outside safeguarding influences, such as county lines
  • any relevant safeguarding reports
  • the digital resilience of your pupils
  • teaching requirements, for example, your RSHE and PSHE curriculum
  • the specific use of your chosen technologies, including Bring Your Own Device (BYOD)
  • what related safeguarding or technology policies you have in place
  • what checks are currently taking place and how resulting actions are handled

The review should of course lead to actions where appropriate, and those could relate to updating policies and procedures, amending roles and responsibilities, additional training, or consideration of future monitoring strategies or technologies, which could naturally have budgetary implications.

The specification suggests at least an annual review, but a review should also be triggered by a change of working practices, such as remote learning, or by the introduction of new technologies.  As always, a sensible approach is to review the risks that may be introduced by any change as part of the change process.  There also needs to be a flow in the other direction, so that any safeguarding or other emerging risk identified will trigger a review, as happened with the introduction of the Prevent duty a few years ago.

As well as reviewing the processes, it is also important to test that the current mitigations do what they were designed to do, and all such tests should be recorded along with any actions identified to close gaps.  As with any other element of safeguarding, staff should know how to report and record concerns relating to this area and, equally, IT teams should receive training to recognise when a support request is in fact a request to amend filtering or monitoring.
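
As a concrete illustration, a simple scripted check can exercise the filter against a set of benign test URLs that policy says should be blocked, and record the outcome.  This is a minimal sketch in Python: the URLs are hypothetical placeholders, and how a block manifests (a block page, a reset connection) varies by product, so the detection logic would need adapting.

    import csv
    import datetime
    import requests

    # Benign test URLs the policy says should be blocked; these are
    # hypothetical placeholders, not real test services.
    TEST_URLS = {
        "gambling": "https://example.com/gambling-test-page",
        "vpn-proxy": "https://example.com/vpn-test-page",
    }

    def run_filtering_tests(outfile="filtering_tests.csv"):
        """Probe each test URL and append a timestamped result row."""
        with open(outfile, "a", newline="") as f:
            writer = csv.writer(f)
            for category, url in TEST_URLS.items():
                try:
                    resp = requests.get(url, timeout=10)
                    # Many filters serve a block page rather than an error,
                    # so this string check is a heuristic to adapt to your product.
                    outcome = ("blocked" if "blocked" in resp.text.lower()
                               else "GAP - page reachable")
                except requests.RequestException:
                    outcome = "blocked (connection refused or reset)"
                writer.writerow([datetime.datetime.now().isoformat(),
                                 category, url, outcome])

    run_filtering_tests()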

It is also important that technical staff in particular understand the legal framework under which testing, investigation and monitoring take place.  As an extreme example, there is no legal exemption from the laws surrounding child sexual exploitation and abuse imagery for testing or investigation activities, so staff could potentially face prosecution should they access such materials.

Block harmful or inappropriate content without unreasonably impacting teaching and learning

As noted earlier, decisions as to what should and should not be filtered or blocked should not be left to the IT team, or even the IT leadership.  Assessing the validity of subject areas or individual sites often requires a far deeper understanding of the curriculum or the needs of young people.

The standard sets the conundrum of effectively filtering harmful sites and inappropriate content while not:
  • unreasonably impacting teaching and learning or administration tasks
  • restricting students from learning how to assess and manage risks themselves

To provide a starting point, most providers of filtering systems will have a well-researched default list of categories that can be allowed, blocked or investigated further to apply more granular blocks, and when implementing a new service this should be the first port of call for the group responsible for decision-making.
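
To make that decision-making auditable (and to satisfy the documentation point raised earlier), the category list can be paired with a record of each decision and its rationale.  A minimal sketch, assuming the group keeps a simple structured log; all names and categories are illustrative, not taken from any particular product:

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class CategoryDecision:
        category: str          # vendor category name
        action: str            # "allow", "block" or "review"
        rationale: str         # why the group decided this
        decided_by: str        # the group, not an individual
        decided_on: date
        exceptions: list = field(default_factory=list)  # curriculum carve-outs

    decisions = [
        CategoryDecision(
            "self-harm", "block",
            "Blocked by default; sociology teaching accesses agreed "
            "resources via a reviewed exception list.",
            "Online Safety Group", date(2023, 3, 1),
            exceptions=["approved sociology reading list"]),
    ]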

Further reassurance can be gained by ensuring that the provider of the technology, and the lists, are:
  • members of the Internet Watch Foundation (IWF)
  • signed up to Counter-Terrorism Internet Referral Unit lists (CTIRU)
  • blocking access to illegal content, including child sexual abuse material
From a technical standpoint, the key points relate to who and what the filtering covers (a configuration sketch follows the lists below).  In essence, it should apply to:
  • all users, including guest accounts and guest wireless networks
  • all devices owned by the organisation
  • all devices using the organisational internet feeds
The systems should also:
  • cover all internet feeds including backups and any remote sites that have independent internet access
  • be age and ability-appropriate
  • handle multiple languages, images and account for slang, abbreviations and common misspellings
  • identify and block technologies used to subvert the system, such as VPNs and proxies
  • provide the users and those monitoring with alerts when attempts are made to access blocked content
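
One way to keep these coverage requirements honest is to write them down as a single structure that a review can check against reality.  A minimal sketch with entirely illustrative values; the point is that a forgotten backup feed or remote site becomes visible:

    # Capture the coverage requirements as one policy structure so gaps
    # (e.g. an unfiltered backup internet feed) are easy to spot in review.
    FILTERING_POLICY = {
        "applies_to": {
            "users": ["staff", "students", "guests"],   # incl. guest wifi
            "devices": ["organisation-owned", "BYOD on our network"],
            "internet_feeds": ["primary", "backup", "remote-site-1"],
        },
        "capabilities_required": [
            "age and ability appropriate rule sets",
            "multi-language, image, slang and misspelling handling",
            "VPN and proxy detection",
            "alerts to users and monitoring staff on blocked attempts",
        ],
    }

    # Simple review check: every known feed must be in scope.
    known_feeds = {"primary", "backup", "remote-site-1", "remote-site-2"}
    uncovered = known_feeds - set(FILTERING_POLICY["applies_to"]["internet_feeds"])
    if uncovered:
        print(f"Feeds without confirmed filtering coverage: {uncovered}")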

When collecting activity data it is important that sufficient information is present to allow meaningful analysis; this includes recording devices, IP addresses and information about who is accessing the materials.  The logs should also record times and dates, what content was blocked, and what search terms were used to try to access the inappropriate materials.
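
For illustration, a log entry that supports that kind of analysis might carry fields along these lines; the field names are my assumptions, not any vendor's schema:

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class FilterLogEntry:
        timestamp: datetime
        username: str                       # who was signed in
        device_name: str                    # which device
        ip_address: str
        url: str                            # what was requested
        action: str                         # "allowed" or "blocked"
        category: Optional[str] = None      # matched filter category
        search_terms: Optional[str] = None  # terms used, where available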

Obviously, the above data would potentially count as personally identifiable for the purposes of GDPR, so a data protection impact assessment should be undertaken as part of the review of the processes and systems.  It is also important that users of the devices and networks are aware that they will be monitored as a condition of access to the systems.

Effective monitoring strategies

Of all the elements of these standards, monitoring is perhaps the most challenging for schools and colleges.  Not from a technical perspective, although that shouldn’t be underestimated, but from a time and resourcing angle.

The best filtering and activity monitoring technology can be deployed, but without time being allocated to a person or team to review the data and determine what it means, those data are largely irrelevant.
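
One way to make that review manageable is to reduce the raw logs to a short digest that a named reviewer reads on a fixed schedule.  A minimal sketch, reusing the FilterLogEntry records sketched above; the summary fields are illustrative:

    from collections import Counter

    def weekly_digest(entries, top_n=10):
        """Summarise a week of FilterLogEntry records for a reviewer."""
        blocked = [e for e in entries if e.action == "blocked"]
        by_user = Counter(e.username for e in blocked)
        by_category = Counter(e.category for e in blocked if e.category)
        return {
            "total_blocked": len(blocked),
            "top_users": by_user.most_common(top_n),       # repeat attempts
            "top_categories": by_category.most_common(top_n),
        }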

Some systems will of course flag inappropriate activity via alerts, but again this raises the question of who receives those alerts and what they are tasked with doing with them.  All too often, monitoring is in reality reporting after the fact.  For example, a suspicion is flagged that a student has been sending inappropriate emails; this can be investigated, but it will probably only have come to light following a complaint.
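
Making alert ownership explicit can be as simple as a routing table that names an owner and an expected action for each alert type.  A minimal sketch with illustrative categories, recipients and actions; the point is that no alert type is left without a named owner:

    ALERT_ROUTING = {
        # filter category -> (who is alerted, expected action)
        "self-harm": ("Designated Safeguarding Lead", "same-day welfare check"),
        "extremism": ("Designated Safeguarding Lead", "follow Prevent referral process"),
        "vpn-proxy": ("IT service desk", "investigate circumvention attempt"),
    }
    DEFAULT_ROUTE = ("IT service desk", "triage; escalate if safeguarding-related")

    def route_alert(category: str):
        """Return the owner and expected action for an alert category."""
        return ALERT_ROUTING.get(category, DEFAULT_ROUTE)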

Similarly, unless real-time keystroke monitoring is taking place (hint: it should be!), inappropriate conversations could be happening on platforms that are fully allowed, including the establishment's own learning platforms.  Again, even if such technology is deployed, where alerts are sent and what follows is as important as, if not more important than, the technology itself.

Clearly, this is part of a wider discussion relating to safeguarding.  There has sometimes been a tendency to treat "real-world" and online safeguarding as separate discussions, which I have always felt is totally wrong.  There needs to be an organisation-wide, joined-up approach, led by safeguarding experts.

