
One Weird Trick for Media Literacy (and Cyber Security, and Scams, and…)

“Does anyone in the audience have any top tips for Cyber Security?”

While at SCVO’s annual Gathering earlier this month, I attended the session on cyber safety hosted by the inimitable Maddie Stark. After a range of questions to the knowledgeable panel, she put this question to the audience.

Now, despite having ‘digital’ in my job title, I’m far from an expert on how to deal with data breaches, ransomware or any of the other ghouls that third sector execs hear going bump in the night. I suspected that many of my fellow attendees were similarly wary, attending the session with more questions than answers – which is of course how it should be when you sign up to absorb the wisdom of experts.

But I did have the germ of an answer to Maddie’s query, and Cyber Scotland Week seems like the ideal time to share…

A bit of background: Rather than being a cyber security specialist, my focus is on digital inclusion, and these days it manifests in the ways that media literacy and critical thinking overlap with our online lives. In this work, I’ve often wondered what the secret sauce is, the ‘one weird trick’ that might unlock criticality.

My gut feeling is that it’s got very little to do with knowing what button to press next or which rules to follow, because those actions necessarily occur after a little light bulb has appeared to prompt questions about what we’re seeing. To put this another way, in his book ‘Thinking, Fast and Slow’, Daniel Kahneman talks about two modes of thought: system one is immediate and intuitive, whereas system two is a more logical and laborious process of deliberation.

So, to me, the key thing is to power that wee system one light bulb so that it’s ready to flicker on, whenever it’s needed. It’s tricky, because something instinctual is needed for people to quickly feel, rather than think, when something’s awry. I’m mixing my metaphors a little, but folk need a kind of muscle memory to kick in. How the heck do we build that muscle?

I’ve slowly come round to the conclusion that one possible answer lies in self-recognition: in knowing yourself as the kind of person who doesn’t believe everything you see, who’s nobody’s fool, who’s savvy enough to contest the things that need to be contested. When you understand yourself as someone who carries themselves that way every day, it could provide the spark of electricity for our light bulb when a piece of fake news or spurious content pops up. Not only that, but it’s so much more positive than a ‘danger everywhere’ mindset, where nothing can be trusted. There’s a world of difference between “don’t believe everything you hear” and “you can’t trust anything”: whereas the latter is exhausting and stressful, the former provides a jolt of self-esteem and genuine power. Perhaps that could fuel and inspire the light bulb?

In that spirit, my one weird trick is: saying no to cookies.


Wait, hear me out! I did say it was weird.

Cookies, in case you need a refresher, are tiny files that track your activity online. There are various sorts, and some are perfectly benign, doing things like remembering your password or keeping things in your virtual basket as you shop. These ‘functional cookies’ make websites work, and you’re never given the choice to reject them where they’re needed. Other gremlins, often called ‘tracking’ or ‘third-party’ cookies, can follow you around the web, and are the reason you Google for couches one day and then see adverts for discounted three-seaters for months afterwards. The data they hold is big business too, making up a sizeable chunk of the multi-billion-pound industry of surveillance capitalism that Shoshana Zuboff dissects in her tome of the same name.
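For the technically curious, the distinction above can be seen in the cookie itself. Here’s a minimal sketch using Python’s standard `http.cookies` module; the cookie names, values and ad domain are invented purely for illustration:

```python
# A small sketch of what a cookie really is: a named scrap of data a
# website asks your browser to store. All values here are made up.
from http.cookies import SimpleCookie

# A 'functional' cookie: no expiry date (it dies with your session)
# and no cross-site domain, so it just keeps your basket working.
functional = SimpleCookie()
functional.load("basket_id=abc123; Path=/; HttpOnly")

# A 'tracking' cookie: a far-off expiry and a third-party ad domain,
# which is what lets it follow you from site to site.
tracker = SimpleCookie()
tracker.load("ad_uid=xyz789; Domain=.ads.example.com; "
             "Expires=Fri, 01 Jan 2027 00:00:00 GMT")

print(functional["basket_id"].value)   # abc123
print(tracker["ad_uid"]["domain"])     # .ads.example.com
```

The telltale signs of the tracker are the `Domain` attribute pointing at an advertising network rather than the site you’re visiting, and an expiry date years in the future.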

Since 2018, we’ve had the choice to reject those furtive files, and yet… most people don’t. Instead, in a fleeting gesture, they tap that oh-so-colourful and easily clickable “accept” button, which is usually (purposely) contrasted with a sad, grey refusal button or, worse, a whole host of sub-options and menus to be worked through. The line of least resistance, persuasive design, or just cynical social engineering?

Either way, I’m always harping on about them, and I love this as a teachable behaviour because our adversaries give us ample room for practice! On any given day, we all surely see five or ten of these prompts, and they allow our muscle memory to build as we reject them. It’s also a spectacularly easy thing to explain: when you’re asked to accept cookies, channel your inner Nancy Reagan and “just say no”. Yuk, maybe not – we can quote Bartleby instead: “I would prefer not to.”

From a digital skills point of view, we can build on this to discuss actively clearing existing cookies from our devices, and the key considerations that come into play at that point – most vitally, whether we’ll have access to the passwords and other data that might be cleansed in the process, or how we can use our email accounts to re-access lost logins when we’ve been locked out. It’s also hugely instructive to show the difference in algorithmic content or advertising once the virtual decks have been cleared.

In community settings, that simple starting point regularly gives me a springboard to other enlivening conversations: the idea that our online activity is tracked and traded to sell advertising space and to tempt us into buying stuff. The fact that somebody built this or that website with the explicit intention of monetising our behaviour. The financial incentives to keep our eyeballs on the screen for as long as possible. The core media literacy idea that behind every presentation there are people, motivations and systems at work. That’s not conspiratorial, it’s the economics of the internet.

All that from a quick question of “what does this say, and do I accept it?” Amplifying these moments of quiet refusal can generate a shift from passive acceptance to active enquiry, and from there, it’s a hop, skip and a jump to self-recognition. The good people at Ofcom (the funders of Mhor and Glasgow Life’s media literacy efforts) would say it’s “building digital and media savvy” – understanding how the online world and media work.

Having gained the instinct to read something online, challenge it and make decisions based on what we’re seeing, we can start to become the smart, savvy, curious and emboldened versions of ourselves we want to be. Beyond media literacy, in the field of fraud and scam awareness, we nourish a split-second of challenge that rises up whenever a too-good deal or a peremptory message comes into view.

This is just good community work: moving, as Paulo Freire said, from answering questions to questioning answers; from a relatively minor techy tip to sweeping enquiry about how the world works. Maybe it leaches out into how we understand the simplistic narratives of populist politicians, or the mangled ‘protest’ messaging of their minions. Perhaps we feel more confident in challenging everyday racism or misogyny rather than letting it go, like a tiny ‘accept’ button that we know is easy, and yet also problematic.

But back to cyber security. Although it may be a stretch, we know that a great many incidents occur because of one momentary lapse: a link that should never have been clicked, new bank details that weren’t properly vetted, a spoofed email from the boss that wasn’t verified verbally. Minuscule mistakes in the blink of an eye, but with vast consequences.

What if something deep in us could introduce a bit of healthy scepticism at the vital moment? What if there were a way to flex our muscles umpteen times daily when we’re invited to passively accept something, via a simple and basic action that has no downsides?

Well, you know my tip by now. Saying no to cookies is a crystal clear, repeatable action that helps us dispute persuasive messages just as we’re being asked to accept them without question. It’s a spur for media literacy, critical conversations, scam awareness, cyber security, and so much more.

Maybe it’s weird, maybe it isn’t. But it’s no trick.

Digital Lifelines Are Not Optional: They Are Human Rights Infrastructure

Yesterday’s announcement that Home Office authorities can now seize mobile phones from people deemed to be in the UK illegally without arresting them, and even check their mouths for concealed SIM cards, is deeply concerning. Furthermore, this new piece of legislation notes that authorised actors of the state can also take biometric information from individuals.

For us, this raises a deeper question that goes far beyond any single policy decision: what happens when the state expands its digital reach faster than it strengthens its safeguards? What happens, really, when this is happening in a democracy moving further to the right?

For people fleeing conflict, persecution, or trafficking, a phone is not a luxury. It is the only route to legal representation, healthcare, support, contact with family, evidence‑gathering for asylum claims, protection from exploitation.

We know from multiple evaluations of digital inclusion programmes that access to a device and data is directly tied to safety, wellbeing, and the ability to exercise rights. When digital access is removed or restricted, people’s health deteriorates, their isolation increases, and their ability to navigate essential services collapses.

In other words: digital access is a lifeline, not a loophole.

The slow creep of state powers deserves more scrutiny

The UK’s own State of Digital Government Review acknowledges that the public sector now holds vast digital capacity, with over £26bn spent annually and millions of interactions mediated through digital systems. Yet the same review warns that many of these systems were not designed for a digital age, and that oversight, transparency, and accountability have not kept pace.

This matters, because every new power layered onto a digital system risks reducing previous freedoms, normalising ‘exceptional’ measures, expanding surveillance without proportionality tests, and creating data trails that outlive the policy that created them, trails that could be used in ways we can barely imagine to control and limit our lives.

When the state gains new powers over people already in precarious situations, the burden of proof must be higher, not lower.

Three Questions We Should All Be Asking

1. Do we have safeguards that genuinely minimise the loss of previous freedoms?

Safeguards cannot be theoretical. They must be independently monitored, transparent, challengeable and designed with the people most affected.

At present, many digital systems across government lack these foundations — a gap highlighted repeatedly in official reviews.

2. Will there be meaningful intelligence to judge whether these powers are proportionate?

Proportionality requires evidence, impact assessments, rights‑based analysis and ongoing evaluation. Without this, “proportionate” becomes a political claim rather than a measurable standard. 

3. Is this subject to regular, independent review?

Any expansion of state power should come with:

  • short-term clauses for review
  • independent oversight
  • public reporting
  • mechanisms for civil society challenge

If the review mechanisms are weaker than the powers themselves, the balance is already lost.

And our fourth question: Do we actually care, as a society, about the people behind these policies?

Those of us working directly in the places where people are moved, the so‑called “hotels” that are in reality the most basic, isolating forms of accommodation, see the truth every day. We know that even the idea of offering people access to former barracks in disrepair sparks public outrage. Yet the outrage is rarely about the conditions people are forced to live in. It’s about their presence.

So we have to ask: where is our humanity? Where is our shared care?

If we can ‘tolerate’ people living in institutional spaces designed for containment rather than dignity, then the question is no longer about policy design. It’s about who we are becoming and what freedoms we are willing to let erode when the people affected are those with the least power to resist.

Digital Rights Are Human Rights and They Must Be Defended Proactively

We cannot treat digital access as a side issue when it is now the primary route to justice, safety, and participation. Nor can we allow the slow normalisation of expanded state powers to go unchallenged simply because they are digital, invisible, or framed as administrative efficiency.

The question is not whether the state can do these things. It is whether it should, and under what conditions.

If we want a society where digital systems uphold rights rather than erode them, we need stronger safeguards with transparent oversight. We should expect proportionate, deeply scrutinised powers within a commitment to centring the experiences of those most affected.

Every day in our work, we see families living in limbo, people navigating unimaginable trauma in spaces that offer no privacy, digital isolation compounding emotional isolation, and rights that barely exist on paper and increasingly not in practice.

And we also see the public reaction. Not outrage at the conditions, but outrage at the idea that people seeking safety might be treated with even a fraction of dignity.

So the question becomes unavoidable: where is our shared care?

If we can accept these conditions for some people, we normalise them for all people. The erosion of rights always begins at the margins.

This is why digital lifelines matter. This is why safeguards matter. This is why proportionality and oversight matter. Because when humanity is already stretched thin, the expansion of state powers without robust checks doesn’t just risk harm, it guarantees it.