'A threat to health is being weaponised': inside the fight against online hate crime – The Guardian
In the winter of 2002, nine months before Hanif Qadir unpacked his bag at a terrorist training camp in Afghanistan, a group of men walked into the London MOT testing centre he owned with his brothers. They were collecting money for civilians caught up in the US invasion of Afghanistan; hundreds of children had been orphaned by indiscriminate bombing, the men claimed. Could he help? The appeal resonated with Qadir, who had lost his father when he was seven. He made a donation.
The men returned regularly. Each time, they asked for more money, before gradually changing the subject to Qadir's faith. Eventually they invited him to a meeting at a local house to discuss the war in Afghanistan more freely. "I felt they were sincere and genuine," Qadir recalls. At the meeting, the men encouraged Qadir to visit websites that claimed to show photographic evidence of violence against Afghan civilians by western troops.
Qadir browsed hundreds of distressing images, among them scores of orphans, each accompanied by extended captions that described the way in which the child's family had been killed. One girl's story has remained with him. The website claimed she had lost 21 members of her family to a stray US missile. The caption explained it had taken locals three days to scrape their remains from the walls of the girl's home. The more he saw, the closer Qadir became to the men who were, unbeknown to him, recruiters for al-Qaida.
'If someone calls for genocide against Muslims, they've essentially tattooed a swastika to their forehead online'
Qadir grew up in Thornaby-on-Tees, a small town in North Yorkshire. After his father died, he had disengaged from school, leaving at 14 and moving to London. After a few odd jobs, he founded a business with his brothers, buying, repairing and selling cars. By the early 2000s, the business was profitable enough that he was able to donate generously to charitable community causes, a reputation that, he believes, led the recruiters to his door.
The suggestion that Qadir travel to Afghanistan was seeded gently. "When a person is radicalised they become suggestible," he tells me. "We discussed that, in order to prevent more loss of life, we needed to be prepared to fight." On 2 December 2002, he flew to Islamabad in Pakistan. A few days later, he crossed the border into Afghanistan.
Soon after he arrived at a training camp, Qadir saw a man measuring up children who lived there. "I thought they were being tailored for new clothes," he recalls. Then he heard one of the leaders telling the children they would soon be reunited with their dead parents. They were being fitted for suicide vests. "I felt sick and angry," he says. "I wanted to walk away."
But in the middle of a desert compound patrolled by armed guards, any attempt to defect could be fatal. Qadir was trapped. "I knew that if I asked to leave things would end badly." He had to think carefully.
***
In 2002, when Qadir was being radicalised, the internet was not yet ubiquitous. There was no Twitter, no Facebook; websites looking to groom people into supporting extremist causes were obscure. Two decades later, the digital landscape has been transformed. As the All-Party Parliamentary Group on Hate Crime wrote last year, the internet has become "a key breeding ground for extremism and hate speech", emboldened by the increasing ease of dissemination, anonymity and, thanks to outdated legislation, a lack of meaningful consequences.
Perpetrators of terrorist attacks now routinely leave online statements or manifestos to justify their actions, hoping their words might encourage others. The 28-year-old gunman who killed 51 mosque-goers in Christchurch, New Zealand, last year posted a 73-page white nationalist rant to the fringe web forum 8chan and livestreamed the attack on Facebook.
But now, just as Facebook and Twitter have become the prodigious muck-spreaders of our age, a handful of clandestine startups are using technology to stem the flow. Moonshot, whose office is at a secret location in London, is, at five years old, a veteran in this emerging industry. Its premises have the feel of a typical Silicon Valley operation: distressed floorboards, glass-fronted offices, beanbags by an open fireplace, exposed brickwork, a snug for breathers. There are a few clues that the company's business – using technology to disrupt violent extremism – is different from that of the fitness app developers, social media influencers and virtual reality speculators with whom it shares an aesthetic. The posters are not vintage prints but disquieting infographics revealing, for example, that after 22 people were shot dead in an El Paso Walmart last August, there was an 82% rise in the Google search term "how to murder Mexicans". There is also a bomb-proof door.
Cofounder Vidhya Ramalingam set up the EU's first intergovernmental research initiative to investigate far-right terrorism in the aftermath of the 2011 murder of 77 people by Anders Breivik in Norway. She describes Moonshot's work as "experimental programming". The company employs 50 people, and uses a mixture of software and human judgment to identify individuals on the internet who, like Qadir, appear interested in extremist propaganda. They then attempt to serve them counter-messaging.
The technology uses a database of indicators of risk. An individual is awarded risk points according to their online behaviour. You score one point for showing curiosity about the Ku Klux Klan or National Socialist Movement. Activity that indicates sympathy with a violent movement or ideology (eg Googling "white pride worldwide") earns three points, while showing a desire to join, send money to, or commit acts on behalf of a violent extremist group or individual earns six.
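As a minimal sketch, such a points-based tally might look like the following. The category names and weights below mirror the three tiers the article describes, but they are illustrative only – not Moonshot's actual indicator database.

```python
# Illustrative sketch of a points-based risk tally, loosely following the
# three tiers described above. The categories and weights are assumptions
# for the example, not Moonshot's real indicator database.
RISK_INDICATORS = {
    "curiosity": 1,  # eg searching for a group's name
    "sympathy": 3,   # eg googling a movement's slogan
    "intent": 6,     # eg seeking to join, fund or act for a group
}

def risk_score(observed_behaviours):
    """Sum the points for each observed category of online behaviour."""
    return sum(RISK_INDICATORS[b] for b in observed_behaviours)

# Curiosity plus sympathy scores 4; a single intent signal alone scores 6.
print(risk_score(["curiosity", "sympathy"]))  # 4
print(risk_score(["intent"]))                 # 6
```

A cumulative score like this is easy to threshold: an analyst can triage anyone above a chosen total for human review rather than reacting to any single search.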
Home Office initiatives such as Prevent have traditionally focused on training teachers and other leaders to identify people likely to be drawn to violent extremism within their communities – but these methods risk introducing discriminatory practices. "In France, for example, there were posters telling people their sons might be at risk of violent extremism if they grow a beard, start speaking Arabic or stop eating baguettes," explains Ross Frenett, Moonshot's cofounder. "That is obvious bullshit."
By contrast, Frenett says, if someone makes a post glorifying Hitler, or calls for genocide against Muslims, there is a high degree of certainty that they fall into a high-risk category. "They've essentially tattooed a swastika to their forehead in the online space," he says. "So our level of confidence when identifying individuals who are vulnerable to radicalisation is way higher online than it could ever be offline. And it sidesteps some of the discriminatory, stigmatising practices we've seen in an offline setting."
Moonshot, founded in September 2015, is a for-profit company that earns its income from government contracts in the UK, US, Canada, Australia and across western Europe. It does not limit its work to any particular strain of radicalism; in addition to the far right and jihadism, Moonshot's work covers everything from Buddhist extremism in south Asia to Hindu nationalism and incel terrorism in Canada.
'The skill is in finding out what raised a person's interest in extremist ideology. You can't redirect them until you do'
At the broadest level, Moonshot runs what it refers to as redirection material – advertising that is designed to get in front of extremist material in Google's search results. Google has granted Moonshot dispensation to advertise against banned search terms such as "join Isis". If a user clicks on one of Moonshot's camouflaged results, they are taken to, for example, a mental health website with relevant downloadable guides and a chat option. (These sites are run by partnered mental health organisations and groups that have experience dealing with gang violence. As Frenett puts it, they have appropriate risk protocols, and connections with law enforcement, should they be required.) So long as the search terms are carefully calibrated (advertising against "white power" is useless, Frenett explains, as you end up competing with power-tool companies), this can be an effective first contact.
Success is measured in much the same way as any company seeking to advertise on Google: via click conversions. ("We pay for advertising just like any commercial advertiser does," Ramalingam says. "We don't get special rates. I wish we had a better story on that front.") A key metric is "search impression share", which records the proportion of the time your at-risk audience saw the ad. "We've had campaigns that have run with only 50%, and that's not good enough," Ramalingam says. "So we work hard to get that up to 98% where possible." For this reason, as well as mental health practitioners and ex-police officers, Moonshot also employs marketers. "Most of our work is analytics, marketing and social work," Frenett says. "It just happens to be marketing, analytics and social work related to terrorism."
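Search impression share is a standard advertising ratio: the number of times the ad was actually shown, divided by the number of searches in which it was eligible to appear. A quick sketch, with invented campaign figures:

```python
# Illustrative calculation of "search impression share": the fraction of
# eligible searches in which the counter-messaging ad was actually shown.
# The campaign figures here are invented for the example.
def impression_share(impressions_shown, eligible_impressions):
    return impressions_shown / eligible_impressions

# An ad shown in 490 of 500 eligible searches hits the 98% target;
# one shown in only 250 of 500 sits at the 50% level Ramalingam rejects.
print(f"{impression_share(490, 500):.0%}")  # 98%
print(f"{impression_share(250, 500):.0%}")  # 50%
```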
Occasionally the company will identify an individual who is too high risk for its interventions. "That's where, depending on the country we're working in, we refer a user to the police," Frenett says. In Australia, for example, Moonshot identified someone at the top of a network of around 200 at-risk individuals who was considered so risky "we couldn't intervene". A few days later, the local police arrested the man, who was subsequently convicted on terror charges.
There are deeper kinds of intervention. One of Moonshot's advertisements for, say, bomb manuals will take the searcher to a WhatsApp chat manned by a specialist trained in deradicalisation techniques. The company may also identify someone on a particular social media platform openly espousing pro-extremist or pro-terrorist views. Then a trained social worker, typically from a charitable partner organisation, contacts that individual via Twitter direct message or Facebook Messenger. Choosing the right person to make this kind of contact, which may be perceived as invasive, is essential. In many cases, the right person is a former extremist – someone like Hanif Qadir.
***
When Qadir realised the children in the Afghan training camp were being measured for suicide vests, his first instinct was to exact revenge on the people who had manipulated him. "But I only had a knife, no gun," he says. "And I knew that I couldn't tell anyone I wanted out."
He stepped outside the gate of the camp to consider his options. There he spotted the driver of a pick-up truck with whom he had talked a few times. The men did not share a first language, but Qadir gambled. He pulled out £50 and waved it at the man, asking if he could hitch a ride to Torkham, on the border between Pakistan and Afghanistan. The driver nodded. Qadir climbed into the passenger seat. "I didn't even collect my bag," he recalls.
Qadir says he cried during the flight to London. "I kept asking myself: what the hell have I done?" When he arrived home, he and his brothers attempted to find his recruiters, but they had disappeared; the word was that they had moved to Manchester. Qadir decided he no longer wanted to run the car business and convinced his brothers to sell up. "I just wanted to stay at home with my children."
After a period of recuperation, he and his brothers opened a gym in a disused nightclub, which became a place where local youths, many of them young Muslims, would congregate. "We'd talk," he says. "I'd ask them questions about Afghanistan. I saw a lot of anger and questioning. It was clear to me that all it would take is for one person to manipulate them emotionally and they would get straight on a plane to fight. Or maybe they would do something here."
Eager to communicate this to someone in a position of power, Qadir started attending local council meetings. A police inspector, Ian Larnder, took him for a coffee, hoping to better understand why this former mechanic seemed so passionate about the subject. "Until then, I had told nobody about what had happened," Qadir recalls. "Ian was the first person I opened up to." A week later, Larnder was appointed to the police's national community tension team. He took Qadir with him to speak to forces around the country about his experiences.
Today, with a number of other former extremists, Qadir works with Moonshot, where he provides training for online interventions. "The skill is in finding out what has raised a person's interest in extremist ideology," he explains. "You can't redirect a person until you understand this. It's no good asking something so broad as: what do you think about what is happening in India? It has to be specific and personable. So instead you might say: is it permissible to seek revenge for the loss of a loved one?"
This sort of broad line of questioning – and the fact that an anonymous dialogue might tail off, without scope for any follow-up – can seem frustratingly opaque to anyone trying to measure Moonshot's success. It's a criticism the company is used to fielding. "The struggle with preventive work is that, very often, it's unscientific and we have to ask people to take it on trust," Frenett says. "It's easy for a military contractor to come in and say, 'I installed a big, high fence and a man with a gun and that reduced terrorism.' Likewise, the army can come along and state: 'We killed 200 Taliban this week.'"
"But it's much harder to say: 'OK, we invested $1m here and we prevented this much terrorism.' Our long-term aim is to start to change that calculation. Then we'll be able to say: if one dollar in every 100 spent on military hardware went towards targeted, community-focused preventive work, it would be better value and probably better for the world."
***
In the corner of a chilly room at the end of a corridor in Cardiff University's Glamorgan Building, a flood of racial slurs, misogyny, antisemitism and far-right slogans flows across a PC screen. "Imagine you had a crystal ball in which you could watch someone perpetrating every hate crime as it occurred somewhere out there, on the streets," explains Matthew Williams, director of HateLab. "That's what you're looking at here, except the hate is happening online."
While Moonshot and Qadir intervene with individuals who are vulnerable to extremism, HateLab's aim is to provide a more accurate picture of hate speech across the internet. It is, Williams says, the first platform to use AI to detect online hate speech in real time and at scale.
Moonshot's ads for, say, bomb manuals take the searcher to a WhatsApp chat manned by a deradicalisation specialist
Online hatred is so commonplace that the majority of incidents go unreported. According to British government data, 1,605 online hate crimes were recorded between 2017 and 2018, a 40% increase on the previous year. But the Home Office admits this figure is probably a gross underestimate.
"Unlike the police, we don't have to wait for a victim to file a report," Williams says. The program reflects "a true indication of the prevalence of online hatred".
It offers a granular indication, too. Williams specifies a date range, then picks from a filter of potential target groups: Jews, homosexuals, women, and so on (misogyny is by far the most prevalent form of hate speech on Twitter, he says). He selects "anti-Muslim" and a heat map of the UK lights up in red blotches showing geographical hotspots. Elsewhere, it reports the average number of hateful posts per minute and the peak times of day (hate speech, the group has found, is most prevalent during the daily commute, when people read and react to the day's news).
A word cloud indicates the most-used anti-Muslim slurs, while a spiderweb visualises a network of perpetrators, identifying the thought leaders who are generating the most retweets, and how they are linked via online accounts. "HateLab gives situational awareness to hate speech on Twitter at any given time," Williams says.
Early last month, HateLab identified three forms of coronavirus-related hate speech: anti-Chinese or anti-Asian; antisemitic, focused on conspiracy theories; and Islamophobic, focused on accusations of profiteering. "What we are seeing is a threat to health being weaponised to justify targeting minority groups, no matter how illogical the connections may seem," Williams explains.
(Moonshot has monitored similar rises in hate speech targeting Chinese nationals. The hashtag #ChinaLiedPeopleDied was tweeted 65,895 times in March, while #coronavirustruth, implying that the pandemic is a hoax, was used 77,548 times. The company also picked up tweets showing old videos of Muslim men leaving mosques accompanied by text claiming the footage was filmed during quarantine, a seemingly deliberate attempt to create anti-Muslim sentiment.)
Williams, author of a forthcoming book titled The Science of Hate, is a professor of criminology at Cardiff, but his interest in the field is not purely academic. In 1998, he travelled to London with friends to celebrate a birthday. At some point during the evening, he stepped out of the gay bar in which the group was drinking. Three young men approached. One asked if Williams had a light. As he handed over his Zippo, the man punched him in the face. Williams returned to his friends but said nothing, fearing that they would want to retaliate. Eventually, one of them noticed blood on his teeth and urged him to report the attack. "I said no," Williams recalls. "At that time my parents didn't know I was gay. My siblings didn't know, and neither did most people from my town. I didn't want to come out to the police."
But Williams returned to Wales a changed person. "Any attack on your identity has a profoundly destabilising effect," he says. "I became angry and depressed. I modified my behaviour. I stopped holding my boyfriend's hand. I still won't show affection in public." He was not alone in failing to report his attackers; based on the combined 2015/16 to 2017/18 Crime Survey for England and Wales, only 53% of hate crime incidents came to the attention of the police. "People are fearful of secondary victimisation," Williams says.
As domestic internet use became more commonplace, Williams noticed the hate speech he encountered on the streets reflected online. The difference was that it was there for everyone to witness. Fellow academics were initially sceptical of his preoccupation with online behaviour, but by 2011 "everyone knew hate speech was the key problem of the internet". That year, Williams received a lottery grant of more than half a million pounds to accelerate his research.
Every social media platform represents a torrent of information too deep and wide to sift by hand. Williams and his team began by taking a random sample of 4,000 tweets from a dataset of 200,000. The trove was then handed to four police officers, trained to recognise racial tensions, who each evaluated whether every tweet was discriminatory. If three of the four officers concurred, the tweet was classified as hate speech. Over a four-week period, the officers identified around 600 tweets they deemed discriminatory, data that formed the gold standard against which the AI would judge whether a message was malignant or benign.
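The labelling rule described above, where a tweet counts as hate speech only when at least three of the four officers agree, can be sketched in a few lines. The example tweets and votes below are invented for illustration:

```python
# Sketch of the gold-standard rule described above: a tweet is labelled
# hate speech only if at least three of the four trained annotators agreed.
# The votes below are invented for illustration.
def gold_label(votes, threshold=3):
    """votes: one boolean per annotator, True meaning 'discriminatory'."""
    return sum(votes) >= threshold

annotations = [
    [True, True, True, False],   # 3 of 4 concur: classified as hate speech
    [True, True, False, False],  # only 2 of 4: left out of the gold standard
    [True, True, True, True],    # unanimous: classified as hate speech
]
print([gold_label(v) for v in annotations])  # [True, False, True]
```

Requiring a supermajority rather than a bare majority trades recall for precision: fewer borderline tweets enter the training set, but the labels the classifier learns from are ones humans broadly agreed on.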
'You have to engage and create conversations, but direct them positively – allow for grievances to be heard and discussed'
On the afternoon of 22 May 2013, when Fusilier Lee Rigby was killed by two Islamist converts in Woolwich, London, the software had its first live test. Within 60 minutes of the attack, Williams and his team began harvesting tweets that used the keyword "Woolwich". As the software sifted the data, the team was able to examine the drivers and inhibitors of hate speech, and identify accounts spreading anti-Muslim rhetoric. The team found that hate speech peaked for 24–48 hours and then rapidly fell, while the baseline of online hate remained elevated for several months. Astonishingly, this was one of the first times a link between terror attacks and online hate speech had been demonstrated. And, importantly, an increase in localised hate speech both anticipated the attack and shadowed it in the aftermath, suggesting that it might be possible to predict real-world attacks.
The data fascinated social scientists, but Williams believed it was more than interesting: it could have a practical application in helping counter these narratives. In 2017, he began a pilot scheme with the national online hate crime hub, which was set up to coordinate reporting into this area. It now uses the HateLab dashboard to gauge ebbs and flows in the targeting of particular groups, as well as nuances in local tensions. This information can then inform operational decisions, helping direct frontline police work.
There are obvious privacy concerns, and HateLab must comply with data protection regulations. The platform depends on the willingness of Twitter to make its data available to third-party applications. (Facebook closed down open access in 2018, so independent organisations cannot screen its posts.) Twitter shares data on the proviso that HateLab does not identify individual accounts via its dashboard. "In that sense, we can only provide the 10,000ft view," Williams says. The dashboard can highlight patterns, target groups and geographical hotspots, but connecting with individuals is outside its remit.
Meanwhile, Qadir and the other former extremists working alongside Moonshot recognise the power that hate speech can have, and know firsthand that a conversation can steer someone down a more positive path. "You can only change people if you can reach them via conversation," he tells me. "Violent extremists do this very cleverly, and evidence shows that it works for them, so I based all my programmes on this concept. You have to engage and create conversations, but direct them positively – allow for grievances to be heard and discussed."
Since Moonshot was founded, there has been a radical shift in the perception of technology's role when it comes to extremist terrorism. "Five years ago, there were still people inside the government who thought tech was for the kids," Frenett says. "There was a sense that it was almost amusing that terrorists were on the internet. You don't get that any more. Likewise, five years ago there were some great organisations doing great work on the violent far right, but again it was almost seen as niche. That's no longer the case."