A threat to health is being weaponised: inside the fight against online hate crime – The Guardian
In the winter of 2002, nine months before Hanif Qadir unpacked his bag at a terrorist training camp in Afghanistan, a group of men walked into the London MOT testing centre he owned with his brothers. They were collecting money for civilians caught up in the US invasion of Afghanistan; hundreds of children had been orphaned by indiscriminate bombing, the men claimed. Could he help? The appeal resonated with Qadir, who had lost his father when he was seven. He made a donation.
The men returned regularly. Each time, they asked for more money, before gradually changing the subject to Qadir's faith. Eventually they invited him to a meeting at a local house to discuss the war in Afghanistan more freely. "I felt they were sincere and genuine," Qadir recalls. At the meeting, the men encouraged Qadir to visit websites that claimed to show photographic evidence of violence against Afghan civilians by western troops.
Qadir browsed hundreds of distressing images, among them scores of orphans, each accompanied by extended captions that described the way in which the child's family had been killed. One girl's story has remained with him. The website claimed she had lost 21 members of her family to a stray US missile. The caption explained it had taken locals three days to scrape their remains from the walls of the girl's home. The more he saw, the closer Qadir became to the men who were, unbeknown to him, recruiters for al-Qaida.
Qadir grew up in Thornaby-on-Tees, a small town in North Yorkshire. After his father died, he had disengaged from school, leaving at 14 and moving to London. After a few odd jobs, he founded a business with his brothers, buying, repairing and selling cars. By the early 2000s, the business was profitable enough that he was able to donate generously to charitable community causes, a reputation that, he believes, led the recruiters to his door.
The suggestion that Qadir travel to Afghanistan was seeded gently. "When a person is radicalised they become suggestible," he tells me. "We discussed that, in order to prevent more loss of life, we needed to be prepared to fight." On 2 December 2002, he flew to Islamabad in Pakistan. A few days later, he crossed the border into Afghanistan.
Soon after he arrived at a training camp, Qadir saw a man measuring up children who lived there. "I thought they were being tailored for new clothes," he recalls. Then he heard one of the leaders telling the children they would soon be reunited with their dead parents. They were being fitted for suicide vests. "I felt sick and angry," he says. "I wanted to walk away."
But in the middle of a desert compound patrolled by armed guards, any attempt to defect could be fatal. Qadir was trapped. "I knew that if I asked to leave things would end badly." He had to think carefully.
***
In 2002, when Qadir was being radicalised, the internet was not yet ubiquitous. There was no Twitter, no Facebook; websites looking to groom people into supporting extremist causes were obscure. Two decades later, the digital landscape has been transformed. As the All-Party Parliamentary Group on Hate Crime wrote last year, the internet has become "a key breeding ground for extremism and hate speech, emboldened by the increasing ease of dissemination, anonymity and, thanks to outdated legislation, a lack of meaningful consequences".
Perpetrators of terrorist attacks now routinely leave online statements or manifestos to justify their actions, hoping their words might encourage others. The 28-year-old gunman who killed 51 mosque-goers in Christchurch, New Zealand, last year posted a 73-page white nationalist rant to the fringe web forum 8chan and livestreamed the attack on Facebook.
But now, just as Facebook and Twitter have become the prodigious muck-spreaders of our age, a handful of clandestine startups are using technology to stem the flow. Moonshot, whose office is at a secret location in London, is, at five years old, a veteran in this emerging industry. Its premises have the feel of a typical Silicon Valley operation: distressed floorboards, glass-fronted offices, beanbags by an open fireplace, exposed brickwork, a snug for breathers. There are a few clues that the company's business (using technology to disrupt violent extremism) is different from that of the fitness app developers, social media influencers and virtual reality speculators with whom it shares an aesthetic. The posters are not vintage prints but disquieting infographics revealing, for example, that after 22 people were shot dead in an El Paso Walmart last August, there was an 82% rise in the Google search term "how to murder Mexicans". There is also a bomb-proof door.
Co-founder Vidhya Ramalingam set up the EU's first intergovernmental research initiative to investigate far-right terrorism in the aftermath of the 2011 murder of 77 people by Anders Breivik in Norway. She describes Moonshot's work as "experimental programming". The company employs 50 people, and uses a mixture of software and human judgment to identify individuals on the internet who, like Qadir, appear interested in extremist propaganda. They then attempt to serve them counter-messaging.
The technology uses a database of indicators of risk. An individual is awarded risk points according to their online behaviour. You score one point for showing curiosity about the Ku Klux Klan or National Socialist Movement. Activity that indicates sympathy with a violent movement or ideology (eg Googling "white pride worldwide") earns three points, while showing a desire to join, send money to, or commit acts on behalf of a violent extremist group or individual earns six.
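The tiered point system described above can be sketched as a simple keyword scorer. The indicator phrases and point values below follow the article's examples, but the matching logic, phrase list and function name are illustrative simplifications, not Moonshot's actual system:

```python
# Points per tier, as reported: curiosity = 1, sympathy = 3,
# intent to join/fund/act = 6. Phrases here are illustrative.
INDICATORS = {
    "ku klux klan": 1,             # curiosity about an extremist group
    "national socialist movement": 1,
    "white pride worldwide": 3,    # sympathy with a violent ideology
    "how to join": 6,              # desire to join or act on behalf of a group
    "send money to": 6,
}

def risk_score(search_history):
    """Sum risk points for every indicator phrase found in a user's searches."""
    score = 0
    for query in search_history:
        q = query.lower()
        for phrase, points in INDICATORS.items():
            if phrase in q:
                score += points
    return score

print(risk_score(["white pride worldwide forum", "ku klux klan history"]))  # 4
```

In practice the scoring would feed into human review rather than act automatically; the article notes Moonshot combines software with human judgment.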
Home Office initiatives such as Prevent have traditionally focused on training teachers and other leaders to identify people likely to be drawn to violent extremism within their communities, but these methods risk introducing discriminatory practices. "In France, for example, there were posters telling people their sons might be at risk of violent extremism if they grow a beard, start speaking Arabic or stop eating baguettes," explains Ross Frenett, Moonshot's co-founder. "That is obvious bullshit."
By contrast, Frenett says, if someone makes a post glorifying Hitler, or calls for genocide against Muslims, there is a high degree of certainty that they fall into a high-risk category. "They've essentially tattooed a swastika to their forehead in the online space," he says. "So our level of confidence when identifying individuals who are vulnerable to radicalisation is way higher online than it could ever be offline. And it sidesteps some of the discriminatory, stigmatising practices we've seen in an offline setting."
Moonshot, founded in September 2015, is a for-profit company that earns its income from government contracts in the UK, US, Canada, Australia and across western Europe. It does not limit its work to any particular strain of radicalism; in addition to the far-right and jihadism, Moonshots work covers everything from Buddhist extremism in south Asia, to Hindu nationalism and incel terrorism in Canada.
At the broadest level, Moonshot runs what it refers to as "redirection" material: advertising that is designed to get in front of extremist material in Google's search results. Google has granted Moonshot dispensation to advertise against banned search terms such as "join Isis". If a user clicks on one of Moonshot's camouflaged results, they are taken to, for example, a mental health website with relevant downloadable guides and a chat option. (These sites are run by partnered mental health organisations and groups that have experience dealing with gang violence. As Frenett puts it, they have appropriate risk protocols, and connections with law enforcement, should they be required.) So long as the search terms are carefully calibrated (advertising against "white power" is useless, Frenett explains, as you end up competing with power-tool companies), this can be an effective first contact.
Success is measured in much the same way as any company seeking to advertise on Google, via click conversions. ("We pay for advertising just like any commercial advertiser does," Ramalingam says. "We don't get special rates. I wish we had a better story on that front.") A key metric is "search impression share", which records the amount of time your at-risk audience saw the ad. "We've had campaigns that have run with only 50%, and that's not good enough," Ramalingam says. "So we work hard to get that up to 98% where possible." For this reason, as well as mental health practitioners and ex-police officers, Moonshot also employs marketers. "Most of our work is analytics, marketing and social work," Frenett says. "It just happens to be marketing, analytics and social work related to terrorism."
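The two metrics mentioned, click conversions and search impression share, are ordinary advertising arithmetic. A minimal sketch, with hypothetical function and field names (real campaigns would pull these figures from ad-platform reporting rather than compute them by hand):

```python
def impression_share(impressions_won, eligible_impressions):
    """Fraction of eligible searches on which the ad was actually shown."""
    return impressions_won / eligible_impressions

def click_through_rate(clicks, impressions_won):
    """Fraction of shown ads that the at-risk searcher clicked."""
    return clicks / impressions_won

# A campaign shown on 50 of 100 eligible searches sits at 50% share --
# the level Ramalingam describes as "not good enough".
print(f"{impression_share(50, 100):.0%}")   # 50%
print(f"{click_through_rate(4, 50):.0%}")   # 8%
```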
Occasionally the company will identify an individual who is too high risk for their interventions. "That's where, depending on the country we're working in, we refer a user to the police," Frenett says. In Australia, for example, Moonshot identified someone at the top of a network of around 200 at-risk individuals, considered "so risky we couldn't intervene". A few days later, the local police arrested the man, who was subsequently convicted on terror charges.
There are deeper kinds of intervention. One of Moonshot's advertisements for, say, bomb manuals will take the searcher to a WhatsApp chat manned by a specialist trained in deradicalisation techniques. The company may also identify someone on a particular social media platform openly espousing pro-extremist or pro-terrorist views. Then a trained social worker, typically from a charitable partner organisation, contacts that individual via Twitter direct message or Facebook Messenger. Choosing the right person to make this kind of contact, which may be perceived as invasive, is essential. In many cases, the right person is a former extremist: someone like Hanif Qadir.
***
When Qadir realised the children in the Afghan training camp were being measured for suicide vests, his first instinct was to exact revenge on the people who had manipulated him. "But I only had a knife, no gun," he says. "And I knew that I couldn't tell anyone I wanted out."
He stepped outside the gate of the camp to consider his options. There he spotted the driver of a pick-up truck with whom he had talked a few times. The men did not share a first language, but Qadir gambled. He pulled out £50 and waved it at the man, asking if he could hitch a ride to Turkham, on the border between Pakistan and Afghanistan. The driver nodded. Qadir climbed into the passenger seat. "I didn't even collect my bag," he recalls.
Qadir says he cried during the flight to London. "I kept asking myself: what the hell have I done?" When he arrived home, he and his brothers attempted to find his recruiters, but they had disappeared; the word was that they had moved to Manchester. Qadir decided he no longer wanted to run the car business and convinced his brothers to sell up. "I just wanted to stay at home with my children."
After a period of recuperation, he and his brothers opened a gym in a disused nightclub, which became a place where local youths, many of them young Muslims, would congregate. "We'd talk," he says. "I'd ask them questions about Afghanistan. I saw a lot of anger and questioning. It was clear to me that all it would take is for one person to manipulate them emotionally and they would get straight on a plane to fight. Or maybe they would do something here."
Eager to communicate this to someone in a position of power, Qadir started attending local council meetings. A police inspector, Ian Larnder, took him for a coffee, hoping to better understand why this former mechanic seemed so passionate about the subject. "Until then, I had told nobody about what had happened," Qadir recalls. "Ian was the first person I opened up to." A week later, Larnder was appointed to the police's national community tension team. He took Qadir with him to speak to forces around the country about his experiences.
Today, with a number of other former extremists, Qadir works with Moonshot, where he provides training for online interventions. "The skill is in finding out what has raised a person's interest in extremist ideology," he explains. "You can't redirect a person until you understand this. It's no good asking something so broad as: what do you think about what is happening in India? It has to be specific and personable. So instead you might say: is it permissible to seek revenge for the loss of a loved one?"
This sort of broad line of questioning (and the fact that an anonymous dialogue might tail off, without scope for any follow-up) can seem frustratingly opaque for anyone trying to measure Moonshot's success. It's a criticism the company is used to fielding. "The struggle with preventive work is that, very often, it's unscientific and we have to ask people to take it on trust," Frenett says. "It's easy for a military contractor to come in and say: I installed a big, high fence and a man with a gun, and that reduced terrorism. Likewise, the army can come along and state: we killed 200 Taliban this week.
"But it's much harder to say: OK, we invested $1m here and we prevented this much terrorism. Our long-term aim is to start to change that calculation. Then we'll be able to say: if one dollar in every 100 spent on military hardware went towards targeted, community-focused preventive work, it would be better value and probably better for the world."
***
In the corner of a chilly room at the end of a corridor in Cardiff University's Glamorgan Building, a flood of racial slurs, misogyny, antisemitism and far-right slogans flows across a PC screen. "Imagine you had a crystal ball in which you could watch someone perpetrating every hate crime as it occurred somewhere out there, on the streets," explains Matthew Williams, director of HateLab. "That's what you're looking at here, except the hate is happening online."
While Moonshot and Qadir intervene with individuals who are vulnerable to extremism, HateLab's aim is to provide a more accurate picture of hate speech across the internet. It is, Williams says, the first platform to use AI to detect online hate speech in real time and at scale.
Online hatred is so commonplace that the majority of incidents go unreported. According to British government data, 1,605 hate crimes occurred online between 2017 and 2018, a 40% increase on the previous year. But the Home Office admits this figure is probably a gross underestimate.
"Unlike the police, we don't have to wait for a victim to file a report," Williams says. "The program reflects a true indication of the prevalence of online hatred."
It offers a granular indication, too. Williams specifies a date range, then picks from a filter of potential target groups: Jews, homosexuals, women, and so on (misogyny is by far the most prevalent form of hate speech on Twitter, he says). He selects "anti-Muslim" and a heat map of the UK lights up in red blotches showing geographical hotspots. Elsewhere, it reports the average number of hateful posts per minute and the peak times of day (hate speech, the group has found, is most prevalent during the daily commute, when people read and react to the day's news).
A word cloud indicates the most-used anti-Muslim slurs, while a spiderweb visualises a network of perpetrators, identifying the "thought leaders" who are generating the most retweets, and how they are linked via online accounts. "HateLab gives situational awareness to hate speech on Twitter at any given time," Williams says.
Early last month, HateLab identified three forms of coronavirus-related hate speech: anti-Chinese or Asian; antisemitic, focused on conspiracy theories; and Islamophobic, focused on accusations of profiteering. "What we are seeing is a threat to health being weaponised to justify targeting minority groups, no matter how illogical the connections may seem," Williams explains.
(Moonshot has monitored similar rises in hate speech targeting Chinese nationals. The hashtag #ChinaLiedPeopleDied was tweeted 65,895 times in March, while #coronavirustruth, implying that the pandemic is a hoax, was used 77,548 times. The company also picked up tweets showing old videos of Muslim men leaving mosques accompanied by text claiming the footage was filmed during quarantine, a seemingly deliberate attempt to create anti-Muslim sentiment.)
Williams, author of a forthcoming book titled The Science Of Hate, is a professor of criminology at Cardiff, but his interest in the field is not purely academic. In 1998, he travelled to London with friends to celebrate a birthday. At some point during the evening, he stepped out of the gay bar in which the group was drinking. Three young men approached. One asked if Williams had a light. As he handed over his Zippo, the man punched him in the face. Williams returned to his friends but said nothing, fearing that they would want to retaliate. Eventually, one of them noticed blood on his teeth and urged him to report the attack. "I said no," Williams recalls. "At that time my parents didn't know I was gay. My siblings didn't know, and neither did most people from my town. I didn't want to come out to the police."
But Williams returned to Wales a changed person. "Any attack on your identity has a profoundly destabilising effect," he says. "I became angry and depressed. I modified my behaviour. I stopped holding my boyfriend's hand. I still won't show affection in public." He was not alone in failing to report his attackers; based on the combined 2015/16 to 2017/18 Crime Survey for England and Wales, only 53% of hate crime incidents came to the attention of the police. "People are fearful of secondary victimisation," Williams says.
As domestic internet use became more commonplace, Williams noticed the hate speech he encountered on the streets reflected online. The difference was that it was there for everyone to witness. Fellow academics were initially sceptical of his preoccupation with online behaviour, but by 2011 "everyone knew hate speech was the key problem of the internet". That year, Williams received a lottery grant of more than half a million pounds to accelerate his research.
Every social media platform represents a torrent of information too deep and wide to sift by hand. Williams and his team began by taking a random sample of 4,000 tweets from a dataset of 200,000. The trove was then handed to four police officers, trained to recognise racial tensions, who each evaluated whether every tweet was discriminatory. If three of the four officers concurred, the tweet was classified as hate speech. Over a four-week period, the officers identified around 600 tweets they deemed discriminatory, data that formed the "gold standard" by which the AI would test if a message was malignant or benign.
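The labelling rule described above, at least three of the four trained officers must agree before a tweet enters the gold standard, is a straightforward majority vote. A minimal sketch, with hypothetical names and toy data:

```python
def gold_label(officer_votes, threshold=3):
    """officer_votes: list of booleans, True meaning the officer
    judged the tweet discriminatory. Classified as hate speech only
    when at least `threshold` officers concur."""
    return sum(officer_votes) >= threshold

# Toy annotations: one tweet per row, one vote per officer.
annotations = [
    [True, True, True, False],   # 3 of 4 agree -> hate speech
    [True, False, False, True],  # only 2 agree -> benign
]
labels = [gold_label(votes) for votes in annotations]
print(labels)  # [True, False]
```

Requiring agreement from three of four annotators trades recall for precision: ambiguous tweets are excluded, so the resulting training set errs towards unambiguous examples of hate speech.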
On the afternoon of 22 May 2013, when fusilier Lee Rigby was killed by two Islamist converts in Woolwich, London, the software had its first live test. Within 60 minutes of the attack, Williams and his team began harvesting tweets that used the keyword "Woolwich". As the software sifted the data, the team was able to examine the drivers and inhibitors of hate speech, and identify accounts spreading anti-Muslim rhetoric. The team found that hate speech peaked for 24-48 hours, then rapidly fell, while the baseline of online hate remained elevated for several months. Astonishingly, this was one of the first times a link between terror attacks and online hate speech had been demonstrated. And importantly, an increase in localised hate speech both anticipated the attack and, in the aftermath, shadowed it, showing that it might be possible to predict real-world attacks.
The data fascinated social scientists, but Williams believed it was more than interesting: it could have a practical application in helping counter these narratives. In 2017, he began a pilot scheme with the national online hate crime hub, which was set up to coordinate reporting into this area. It now uses the HateLab dashboard to gauge ebbs and flows in the targeting of particular groups, as well as nuances in local tensions. This information can then inform operational decisions, helping direct frontline police work.
There are obvious privacy concerns, and HateLab must comply with data protection regulations. The platform depends on the willingness of Twitter to make its data available to third-party applications. (Facebook closed down open access in 2018, so independent organisations cannot screen its posts.) Twitter shares data on the proviso that HateLab does not identify individual accounts via its dashboard. "In that sense, we can only provide the 10,000ft view," Williams says. The dashboard can highlight patterns, target groups and geographical hotspots, but connecting with individuals is outside its remit.
Meanwhile, Qadir and the other former extremists working alongside Moonshot recognise the power that hate speech can have, and know firsthand that a conversation can steer someone down a more positive path. "You can only change people if you can reach them via conversation," he tells me. "Violent extremists do this very cleverly, and evidence shows that it works for them, so I based all my programmes on this concept. You have to engage and create conversations, but direct them positively; allow for grievances to be heard and discussed."
Since Moonshot was founded, there has been a radical shift in the perception of technology's role when it comes to extremist terrorism. "Five years ago, there were still people inside the government who thought tech was for the kids," Frenett says. "There was a sense that it was almost amusing that terrorists were on the internet. You don't get that any more. Likewise, five years ago there were some great organisations doing great work on the violent far-right, but again it was almost seen as niche. That's no longer the case."