
Bad Bot Surge Forces Retailers To Strengthen Cyber Defenses

Artificial intelligence drove a significant increase in sophisticated bot traffic in the first three months of this year, with bad bots generating nearly half of all web traffic rather than human visitors.

AI-driven superbots accounted for 33% of observed bot activity and used advanced evasion methods to slip past traditional detection techniques. These sophisticated automated attacks on e-commerce revenue and customers are producing mounting financial losses and security breaches.

On May 30, bot defense developer Kasada released its Quarterly Threat Report covering January through March 2024. The report reveals a shift toward more organized, financially motivated online fraud and shows how adversaries combine existing and new solver services with advanced exploit kits to bypass traditional anti-bot tools.

The fact that bots account for 46% of internet traffic is not surprising, said Nick Rieniets, field CTO of Kasada. What did surprise him was that nearly one-third (33%) of these bad bots were classified as sophisticated.

That finding shows bots are growing more advanced and can now overcome even sophisticated bot defenses. Rieniets told the E-Commerce Times that fraudsters use tools such as highly customized versions of Google’s Puppeteer or Microsoft’s Playwright to build these automated threats.
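For readers unfamiliar with those tools, Puppeteer and Playwright are legitimate browser-automation frameworks built for testing: a few lines of script can drive a real browser session. The minimal Playwright sketch below is purely illustrative (the URL, selectors, and credentials are hypothetical placeholders, not taken from the report); it shows the kind of scripted interaction these frameworks make trivial, which fraudsters then customize to mask the automation.

```typescript
// Minimal Playwright sketch (illustrative only): a scripted browser session.
// Assumes Node.js with the "playwright" package; the URL, selectors, and
// credentials are hypothetical placeholders.
import { chromium } from "playwright";

async function run(): Promise<void> {
  const browser = await chromium.launch({ headless: true });
  const page = await browser.newPage();

  // Navigate and interact the same way a human-driven browser would.
  await page.goto("https://shop.example.com/login");
  await page.fill("#email", "user@example.com");
  await page.fill("#password", "example-password");
  await page.click("button[type=submit]");

  // Read the resulting page title to confirm the flow completed.
  console.log(await page.title());
  await browser.close();
}

run().catch(console.error);
```

Stock scripts like this are easy for defenses to fingerprint; the heavy customization Rieniets describes is aimed at making the automation look indistinguishable from human traffic.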

Increase in fraudulent online transactions

The Kasada report highlights the main shifts in bot operations compared with prior quarters. The Quarterly Threat Report’s primary objective is to give cybersecurity professionals and threat intelligence analysts critical information for countering current attack vectors.

The sophistication and coordination of cyberattacks have reached a new level. The report makes four key observations:

  1. Advanced solvers can automatically bypass CAPTCHAs and other human verification methods, mimicking human interaction through machine-learning algorithms and human-assisted solving services.
  2. New and updated exploit kits target vulnerabilities in web applications and APIs. This automation lets attackers launch large-scale attacks with minimal effort, increasing both efficiency and scalability and posing a serious threat to organizations that rely on traditional security measures.
  3. Bots are built to blend in with legitimate traffic by imitating human behavior such as mouse movements, keystrokes, and other user interactions, marking a shift toward bots as a primary tool for online fraud.
  4. In underground online forums, bad bot builders plan future account takeover campaigns and arbitrage opportunities. These forums serve as marketplaces for automated tools, services, and other products that facilitate such activity, lowering the barrier to entry for bad actors and increasing the frequency of automated attacks.

“Bots are being created by people with very low skills. Web scrapers are being used aggressively by organizations that provide public LLMs to train their models, so this has become a major concern for many businesses today,” observed Rieniets, adding that cybercrime-as-a-service is also a contributing factor.

“Today, they can buy anything [bots]. You can deploy them as you wish. Some of them are so automated that they can run the whole process, including all-in-one bots, or AIOs,” he said.

Geographical breakdown

The report’s geographic analysis shows that bot hotspots sit in regions with high levels of adversarial activity, including the United States and Great Britain.

Technology Is Fueling Bad Bot Availability

Rieniets was not surprised by the increase in bot traffic. The situation has worsened as sophisticated bots have become more prevalent: bots originally developed for purchasing sneakers online have been repurposed to commit fraud and abuse across a broader range of retail, e-commerce, travel, hospitality, and other segments.

These bots also provide a cost-effective, scalable way to generate profit through fraud techniques such as credential stuffing and reselling cracked accounts, as well as abusive tactics like automating the purchase of electronics and sneakers.

He added that “accessibility to better bots will lead to bigger profits.”

Account takeover (ATO) becomes a problem when consumers reuse the same login credentials across multiple accounts. Fraudsters exploit this by launching credential-stuffing attacks with stolen credentials.

But consumers are not the only ones to blame. Rieniets said many companies rely on anti-bot defenses that are ineffective and cannot detect automated abuse of their customers’ account logins.
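As one illustration of that detection gap, a defense that only counts failed logins per account will miss credential stuffing, which spreads a handful of attempts across many different accounts. The hedged sketch below (the thresholds, field names, and sliding-window approach are assumptions for illustration, not Kasada’s method) tracks a simple signal a defender might watch instead: how many distinct accounts a single source targets within a short window.

```typescript
// Hypothetical credential-stuffing signal: many *different* accounts attempted
// from one source in a short window, rather than many failures on one account.
// Thresholds and field names are illustrative assumptions.
interface LoginAttempt {
  sourceIp: string;
  username: string;
  timestampMs: number;
}

const WINDOW_MS = 10 * 60 * 1000;        // 10-minute sliding window
const DISTINCT_ACCOUNT_THRESHOLD = 25;   // flag sources probing 25+ accounts

// Recent attempts, grouped by source IP.
const recentBySource = new Map<string, LoginAttempt[]>();

function isLikelyCredentialStuffing(attempt: LoginAttempt): boolean {
  const history = recentBySource.get(attempt.sourceIp) ?? [];

  // Keep only attempts inside the window, then record the new one.
  const fresh = history.filter(a => attempt.timestampMs - a.timestampMs <= WINDOW_MS);
  fresh.push(attempt);
  recentBySource.set(attempt.sourceIp, fresh);

  // Count distinct usernames targeted from this source within the window.
  const distinctAccounts = new Set(fresh.map(a => a.username)).size;
  return distinctAccounts >= DISTINCT_ACCOUNT_THRESHOLD;
}
```

Real-world defenses need far richer signals than this, since sophisticated bots spread attempts across many sources and mimic human interaction, as the report describes.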

The Low Cost of Committing Cybercrime

Rieniets was most surprised to learn that a stolen retail account costs an average of just $1.15. These accounts, he said, are worth far more to the fraudsters who buy them.

Stolen accounts can be used to make unauthorized transactions and redeem loyalty rewards. Because stolen customer accounts can be obtained easily and cheaply on marketplaces and in private Discord or Telegram groups, he said, fraudsters can turn large profits.

Attackers have mastered CAPTCHAs, anti-bot defenses, and other traditional methods, with solver services available for less than a penny per solution. That tiny cost tips the balance in their favor: attacks are cheap, while defenders, Rieniets explained, must spend substantial time and money mitigating them, making it difficult to pivot quickly.

He said many account thefts stem from outdated anti-bot defenses; bot operators have retooled to bypass them, and customers often don’t know it.

According to Rieniets, the solution for defenders is to make it more expensive for adversaries to attack or retool. Modern anti-bot defenses can adjust themselves to present differently each time.

This approach confuses and deceives attackers, making successful attempts extremely time-consuming and costly. In doing so, modern tools keep attackers from profiting easily.
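To make that idea concrete, here is a conceptual sketch (not Kasada’s implementation; every name and parameter is an assumption) of a “moving target” defense that presents itself differently on each session, so a bot scripted against yesterday’s challenge has to be re-engineered today.

```typescript
// Conceptual "moving target" sketch: every session gets a differently shaped
// challenge, so hard-coded bot scripts break on each change. All names,
// fields, and rotation choices here are hypothetical illustrations.
import { randomBytes, randomInt } from "crypto";

interface ChallengeSpec {
  sessionId: string;       // opaque per-session identifier
  tokenFieldName: string;  // randomized hidden-field name the client must echo
  jsVariantId: number;     // which obfuscated challenge-script variant to serve
  expiresAtMs: number;     // short lifetime forces constant re-solving
}

function issueChallenge(): ChallengeSpec {
  return {
    sessionId: randomBytes(16).toString("hex"),
    tokenFieldName: "f_" + randomBytes(6).toString("hex"),
    jsVariantId: randomInt(0, 32),            // rotate among many script variants
    expiresAtMs: Date.now() + 2 * 60 * 1000,  // valid for two minutes
  };
}

// A bot that hard-codes last week's field name or script variant must now
// reverse-engineer the challenge again on every run, raising its cost per attack.
console.log(issueChallenge());
```

The point of the rotation is economic rather than cryptographic: each change forces the attacker to spend time retooling, which is exactly the cost imbalance Rieniets says defenders need to create.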