Rumble Is Part of an ‘Active and Ongoing’ SEC Investigation

In May 2021, the site was reportedly valued at around $500 million. In September 2022, Rumble became a publicly traded company listed on the Nasdaq as part of a Special Purpose Acquisition Company (SPAC) deal. Its valuation currently exceeds $1.2 billion.

In April 2023, investment research firm Culper Research released a report expressing skepticism about the legitimacy of Rumble’s claimed monthly active user (MAU) counts, a key metric for investors to evaluate the performance of a social media company. Culper Research said it had taken a short position in Rumble, meaning it stands to profit if Rumble’s stock price decreases.

“Combined, the web and app data suggest to us that Rumble has only 38 to 48 million unique users, and the Company has overstated its user base by 66% to 108%,” Culper Research claimed in its report.

In a quarterly earnings call following the report’s publication, Rumble reported that its monthly active users declined by 40 percent during the first three months of 2023, from 80 million to 48 million. In a financial filing, Rumble attributed the decrease in users to its popular creators being less active on the platform in the first part of 2023, and news events slowing down following the 2022 midterm elections.
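
For a rough sense of how Culper’s percentages line up with Rumble’s own figures, the arithmetic can be checked directly: an overstatement is the claimed figure divided by the estimated actual figure, minus one. The sketch below uses only the numbers quoted above (Rumble’s previously reported 80 million MAUs and Culper’s 38 million to 48 million estimate); small differences from the report’s 66 percent and 108 percent come down to rounding in the published figures.

```python
# Rough check of Culper Research's overstatement claim, using only the figures
# quoted in the report and in Rumble's own disclosures. Rounding differences
# from the published 66%-108% range are expected.
claimed_mau = 80_000_000                            # Rumble's previously reported MAUs
culper_low, culper_high = 38_000_000, 48_000_000    # Culper's unique-user estimate

for actual in (culper_high, culper_low):
    overstatement = claimed_mau / actual - 1
    print(f"vs. {actual:,} actual users: overstated by {overstatement:.0%}")
# prints roughly 67% and 111%, in the same ballpark as the report's 66% to 108%
```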

“Investors should be especially dubious of rumors peddled by short-sellers who are attempting to distort facts for their own financial benefit. We are aware of misleading claims about Rumble’s monthly active user (MAU) statistics, which, as we have previously disclosed, are provided by Google Analytics,” Rumble spokesperson Rumore says. “Any suggestion that Rumble has inflated its MAUs is false—as any objective person quickly realizes upon even a cursory review of the data.”

Christian Lamarco, the founder of Culper Research, believes the change in reported users was a response to its report. “That was a bit of validation, in my view,” he says.

Updated 5:45 pm ET, January 8, 2024: Immediately following publication, Chris Pavlovski, Rumble’s founder and CEO, said in a post on X that the SEC investigation was part of “the playbook to try and destroy” the company.

“A short seller creates a bogus report and sends it to the SEC. The SEC investigates the bogus report. Then the short seller talks to the media to get a story about how the SEC is investigating the report that started with him. The media happily writes the story,” Pavlovski wrote. “The report is bogus, but that doesn’t matter—it’s all to get investors to sell the stock so the short seller profits.”

Pavlovski added that the company used Google Analytics to track user metrics “so we could be ready for this very moment.”

The Hamas Threat of Hostage Execution Videos Looms Large Over Social Media

Ahmed alleges that the companies are failing to implement systems that automatically detect violent extremist content as effectively as they detect some other kinds of content. “If you have a snatch of copyrighted music in your video, their systems will detect it within a microsecond and take it down,” Ahmed says, adding that “the fundamental human rights of the victims of terrorist attacks” should carry as much urgency as the “property rights of music artists and entertainers.”

Social platforms have shared few details about how they plan to curb the use of livestreams, in part because they are concerned that giving away too much information could allow Hamas, Palestinian Islamic Jihad (PIJ), and other militant groups or their supporters to circumvent the measures that are in place, an employee of a major platform claimed in a communication with WIRED. The employee was granted anonymity because they are not authorized to speak publicly.

Adam Hadley, founder and executive director of Tech Against Terrorism, a United Nations-affiliated nonprofit that tracks extremist activity online, tells WIRED that while maintaining secrecy around content moderation methods is important during a sensitive and volatile conflict, tech companies should be more transparent about how they work.

“There has to be some degree of caution in terms of sharing the details of how this material is discovered and analyzed,” Hadley says. “But I would hope there are ways of communicating this ethically that don’t tip off terrorists to detection methods, and we would always encourage platforms to be transparent about what they’re doing.”

The social media companies say their dedicated teams are working around the clock right now as they await the launch of Israel’s expected ground assault in Gaza, which Hadley believes could trigger a spate of hostage executions.

And yet, for all of the time, money, and resources these multibillion-dollar companies appear to be putting into tackling this potential crisis, they are still reliant on Tech Against Terrorism, a tiny nonprofit, to alert them when new content from Hamas or PIJ is posted online.

Hadley says his team of 20 typically knows about new terrorist content before any of the big platforms. So far, while tracking verified content from Hamas’ military wing and PIJ, he says the volume appearing on the major social platforms is “very low.”

Insiders Say X’s Crowdsourced Anti-Disinformation Tool Is Making the Problem Worse

On Saturday, the official Israel account on X posted a picture of what looks like a child’s bedroom with blood covering the floor. “This could be your child’s bedroom. No words,” the post reads. There is no suggestion the picture is fake, and publicly there are no notes on the post. However, in the Community Notes backend, viewed by WIRED, multiple contributors are engaging in a conspiracy-fueled back-and-forth.

“Deoxygenated blood has a shade of dark red, therefore this is staged,” one contributor wrote. “Post with manipulative intent that tries to create an emotional reaction in the reader by relating words and pictures in a decontextualized way,” another writes.

“There is no evidence that this picture is staged. A Wikipedia article about blood is not evidence that this is staged,” another contributor writes.

“There is no evidence this photo is from the October 7th attacks,” another claims.

These types of exchanges raise questions about how X approves contributors for the program, but this, along with precisely what factors are considered before each note is approved, remains unknown. X’s Benarroch did not respond to questions about how contributors are chosen.

None of those approved for the system are given any training, according to all contributors WIRED spoke to, and the only limitation placed on the contributors initially is an inability to write new notes until they have rated a number of other notes first. One contributor claims this approval process can take fewer than six hours.

In order for notes to become attached to a post publicly, they need to be approved as “helpful” by a certain number of contributors, though how many is unclear. X describes “helpful” notes as ones that get “enough contributors from different perspectives.” Benarroch did not say how X evaluates a user’s political leanings. However, the system at least previously employed a technique known as bridge-based ranking to favor notes that receive positive interactions from users estimated to hold differing viewpoints. Still, how this works is not clear to at least some Community Notes contributors. 
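
X has published the Community Notes ranking code, and the documented approach is, at its core, a small matrix-factorization model: each rater and each note gets a position on a latent “viewpoint” dimension learned from rating patterns, and a note’s helpfulness score (its intercept) only stays high when its positive ratings can’t be explained by one side of that dimension alone. The sketch below is an illustrative reimplementation of that general idea, not X’s code; the function name, hyperparameters, and toy data are invented for the example.

```python
# Illustrative sketch of bridge-based ranking (not X's code): a tiny matrix
# factorization in which a note's intercept only stays high when users on
# opposite ends of a latent viewpoint dimension both rate it helpful.
import numpy as np

def bridge_rank(ratings, n_users, n_notes, dim=1, lr=0.05, reg=0.1, epochs=500, seed=0):
    """ratings: iterable of (user_id, note_id, value) with 1 = helpful, 0 = not helpful."""
    rng = np.random.default_rng(seed)
    mu = 0.0
    b_user = np.zeros(n_users)
    b_note = np.zeros(n_notes)                           # the "helpfulness" score we rank by
    f_user = 0.1 * rng.standard_normal((n_users, dim))   # latent viewpoint of each rater
    f_note = 0.1 * rng.standard_normal((n_notes, dim))   # latent leaning of each note

    for _ in range(epochs):
        for u, n, y in ratings:
            pred = mu + b_user[u] + b_note[n] + f_user[u] @ f_note[n]
            err = y - pred
            # plain SGD update; regularizing the intercepts means one-sided
            # support tends to be explained by the viewpoint factors instead
            mu += lr * err
            b_user[u] += lr * (err - reg * b_user[u])
            b_note[n] += lr * (err - reg * b_note[n])
            f_user[u], f_note[n] = (
                f_user[u] + lr * (err * f_note[n] - reg * f_user[u]),
                f_note[n] + lr * (err * f_user[u] - reg * f_note[n]),
            )
    return b_note

# Toy data: note 0 is rated helpful across both "camps"; note 1 only by one camp.
ratings = [
    (0, 0, 1), (1, 0, 1), (2, 0, 1), (3, 0, 1),
    (0, 1, 1), (1, 1, 1), (2, 1, 0), (3, 1, 0),
]
print(bridge_rank(ratings, n_users=4, n_notes=2))  # note 0 should outscore note 1
```

The toy run is the whole point: note 0, rated helpful across both camps, keeps a high intercept, while note 1’s one-sided support is absorbed by the viewpoint factors, which is roughly how ratings from “different perspectives” can be weighed without X ever asking anyone their politics.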

“I don’t see any mechanism by which they can know what perspective people hold,” Anna, a UK-based former journalist whom X invited to become a Community Notes contributor, tells WIRED. “I really don’t see how that would work, to be honest, because new topics come up that one could not possibly have been rated on.” Anna asked only to be identified by her first name for fear of backlash from other X users.

How to Install Threads on Your Windows Desktop

After all the press about Mark Zuckerberg and Elon Musk potentially taking it to each other in the octagon, the only analog we’re likely to see is Twitter versus Meta’s new darling, Threads. The platform has picked up 70 million sign-ups in the first couple of days, and it shows no sign of slowing down. The only trouble is that, right now, it’s mobile only. You can view individual posts in a browser, but you can’t post or read your whole feed.

Personally, my relationship with the blue bird has been in sharp decline over the last few months, so I decided to give Threads a try. The best way I can describe it is, it’s like rerolling a new character class in a game after having already been through the endgame content. You know, kind of refreshing.

Threads still falls behind because it’s locked to Android and iOS devices, so I can’t really use it on anything other than my phone and tablet. But if you’re running Windows 11, there’s a quick path around that restriction using the Windows Subsystem for Android.

It all hinges on the Amazon Appstore and activating the ability to sideload Android APKs with the flick of a few switches. So this guide isn’t just for Threads, but more of a … meta guide for most Android apps that have available APKs. (I hope you see what I did there.)

Before We Begin

  • Have Windows 11 installed. (A quick way to check is sketched just after this list.)
  • Have the latest Windows updates installed.
  • Make sure Microsoft supports Amazon Appstore in your country or region. (Check here.)
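
If you want to confirm that first requirement programmatically, here is a minimal sketch; it assumes you have Python on the machine in question and relies only on the fact that Windows 11 reports build numbers of 22000 and up.

```python
# Minimal Windows 11 check (assumes Python is installed on the target machine).
# Windows 11 builds start at 22000; anything lower is Windows 10 or older.
import sys

build = sys.getwindowsversion().build  # Windows-only API
if build >= 22000:
    print(f"Build {build}: Windows 11 detected, good to go.")
else:
    print(f"Build {build}: this guide needs Windows 11.")
```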

Install Amazon Appstore/Windows Subsystem for Android

  • Open the Microsoft Store and search for Amazon Appstore. Click Install or Get to begin the download.
  • This will start you through a three-step setup process—just follow it through. It will ask for permission to make changes to a couple of utilities—allow them, and you’ll soon be prompted to restart your computer.
  • When it comes back from restart, your PC will automatically begin installing the Windows Subsystem for Android. When that’s finished, you’ll be prompted with an Amazon Appstore login screen. (You don’t have to log in.)
  • You’ll now be able to find Windows Subsystem for Android in your Start menu. Open it and select Advanced settings on the left, then toggle the Developer mode slider to the right.
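
With Developer mode switched on, the subsystem exposes a local ADB endpoint; the Developer mode pane shows the exact address, commonly 127.0.0.1:58526, though it may differ on your machine. If you already have Google’s Android platform-tools installed, you can optionally confirm the bridge is reachable before moving on. This is a hedged sketch rather than a required step, and the address is a placeholder for whatever your settings show.

```python
# Optional sanity check: confirm the Windows Subsystem for Android ADB bridge
# is reachable. Assumes the "adb" binary from Android platform-tools is on PATH.
# The address is a placeholder; use the one shown in WSA's Developer mode pane.
import subprocess

WSA_ADDR = "127.0.0.1:58526"  # replace with the address your settings display

subprocess.run(["adb", "connect", WSA_ADDR], check=True)
subprocess.run(["adb", "devices"], check=True)  # the WSA address should appear as "device"
```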

Download the Threads APK

There are multiple options for downloading Android app APKs, but if you don’t know where you’re going, you can end up in some unsavory corners of the web. One of the safest in my experience is APKMirror.

  • Get the Threads APK using the APKMirror link here.
  • By default this will go into your Downloads folder.
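
APKMirror typically lists verification details (file hashes and signing-certificate fingerprints) on its download pages. If you want to double-check that the file sitting in Downloads is the one you meant to grab, the sketch below computes its SHA-256 for comparison; the filename is a placeholder, so substitute whatever your download is actually called.

```python
# Compute the SHA-256 of the downloaded APK so it can be compared against the
# hash listed on the APKMirror download page. The filename is a placeholder;
# use the actual name of the file in your Downloads folder.
import hashlib
from pathlib import Path

apk_path = Path.home() / "Downloads" / "threads.apk"  # placeholder filename

digest = hashlib.sha256(apk_path.read_bytes()).hexdigest()
print(f"{apk_path.name}: sha256 {digest}")
```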

Install WSATools

While there are several options for apps that allow installation of Android APKs once the Windows Subsystem for Android is installed, WSATools is one of the simplest and most straightforward, and it’s another pickup from the Microsoft Store.

  • Open the Microsoft Store and search for WSATools. Install it.

Install Threads

All the pieces are in place. Let’s go!

  • WSATools will now be available on your Start menu. Open it up.
  • Click Install an APK. The first time you run it, it will tell you that ADB is missing. Click Install and select or make a folder to install it into. Personally, I just made C:\ADB to keep it simple. (You’ll never have to do this again.)
  • Once that’s done, you’ll get another prompt to find your file. Go to your Downloads folder and select your freshly downloaded Threads APK.
  • Click Install when it shows the Threads icon and information.
  • It may ask for permission to allow ADB debugging. If so, click Yes.

If this doesn’t work, or it says it can’t access the WSA, restarting your machine and trying again usually does the trick.
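
If WSATools keeps misbehaving, you can also sideload the APK by hand, since WSATools is essentially a friendly front end for ADB. The sketch below is an optional fallback under the same assumptions as before: the adb binary is on your PATH, the address is whatever your Developer mode pane shows, and the filename is a placeholder for your actual download.

```python
# Manual fallback: sideload the APK over ADB yourself. Assumes "adb" from
# Android platform-tools is on PATH; the address and filename are placeholders,
# so substitute your own values.
import subprocess
from pathlib import Path

WSA_ADDR = "127.0.0.1:58526"                           # from WSA's Developer mode pane
apk_path = Path.home() / "Downloads" / "threads.apk"   # placeholder filename

subprocess.run(["adb", "connect", WSA_ADDR], check=True)
subprocess.run(["adb", "install", str(apk_path)], check=True)  # Threads then shows up in Start
```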

And that’s it! Threads should now be available in your Start menu, so when you’re at your desk at work or on your gaming rig taking a break between runs, you can open Threads on your Windows 11 PC with the same ease as the Facebook, Twitter, and Instagram apps from the Microsoft Store. Happy spooling! (Is that what we’re calling it?)

Google Made Millions From Ads for Fake Abortion Clinics

Researchers at the CCDH also found several marketing firms catering to crisis pregnancy centers, offering services that include help accessing Google ad grants as well as strategies to ensure that the centers’ content appears next to legitimate reproductive health information by hijacking keywords used by people seeking abortions.

“There’s a set of keywords which are clearly abortion search keywords, and those keywords tend to be the names of abortion providers,” says Callum Hood, head of research at CCDH. “Amongst the top keywords that fake clinics target, ‘planned parenthood’ is in the top five.” Planned Parenthood is a genuine reproductive health organization.

This is not the first time Google’s free advertising perks have gone to anti-abortion groups. In 2019, a network of anti-choice clinics run by a Catholic organization was found to have received tens of thousands of dollars’ worth of free advertising on Google. In response, the company changed its policies to require such organizations to note whether they actually offer abortion services.

But the CCDH report found that sometimes these labels were still not applied to ads from crisis pregnancy centers. And even then, Shakouri says the label can be confusing to users who don’t know the difference between a crisis pregnancy center and a legitimate health clinic that may simply not provide abortion care. “There’s a lot of ways people could interpret that labeling, and that labeling has been applied to organizations like abortion funds or services that act as referral services,” she says.

This confusion extends beyond ads and search to Google Maps, where crisis pregnancy centers often show up alongside legitimate clinics.

“It’s very hard for people that are less digitally literate to find out who is a legitimate provider,” says Sanne Thijssen, the creator of #HeyGoogle, which maps crisis pregnancy centers throughout Europe to help women better identify fake clinics. “A lot of times if they see something on Google Maps … they aren’t able to really distinguish as well.”

Martha Dimitratou, media manager for PlanC, a nonprofit that provides information about access to the abortion pill, says that the organization’s Google Ads account was banned over a year ago for advertising “unauthorized pharmacies.” 

“We have tried to appeal this very many times, but Google does not want to change the system,” she says.

Meanwhile, Google continues to allow ads from crisis pregnancy centers directing users to sites that promote “abortion reversal,” an unscientific method of administering progesterone to a woman who has taken abortion medication in order to stop its effects.

Angela Vasquez-Giroux, vice president of communications and research at abortion advocacy group Naral, notes that a past study on “abortion reversal” had to be halted because the regimen posed a threat to the health of the women involved. “Imagine if there were a vaccine study that found the vaccines were harmful to people,” she says. “Google probably wouldn’t promote that as a legitimate regimen, but they allow these organizations to continue to promote abortion pill reversal and other fake science, despite the fact that it is physically dangerous.”