- Malicious bots accounted for 21.8% of all Internet traffic in 2017
- Companies struggle to distinguish bad bots from benign ones
- Companies can mitigate bot risk by steering bad bots away from core business assets and data
Last year, IT managers at an Asian airline noticed a troubling trend. Visits to the airline’s website to check seat availability and fares were surging—so much that they were over‑taxing an externally‑managed reservation system. Yet ticket sales hardly budged.
The behavior was causing a spike in IT costs and throwing off revenue and budget planning.
After an extensive investigation, the airline discovered it wasn’t consumers who were creating so many itineraries without buying. It was an army of bots, the tiny software applications that perform automated tasks over the web.
Bots, it turned out, had all but hijacked the airline’s website and reservation systems, accounting for 86% of all site traffic. Some of that came from benign commercial bots run by price‑comparison sites and competitors scouring the web for real‑time fare data. But the lion’s share of the bot behavior was malicious activity, designed to choke the website with fake seat‑availability requests and other queries.
The airline’s bot battles aren’t unique. Bad bots, run by competitors, spammers, and online criminals, and often disguised as ordinary web‑browser traffic, are a growing problem for business. In 2017, bad bots accounted for 21.8% of all internet traffic, a 9.5% rise since 2016, according to an annual report from Distil Networks, a company that offers bot mitigation tactics and tools.
Airlines, financial services, and healthcare are among the sectors that malicious bots target the most. This year, bad bots also caused an estimated $6.5 billion in corporate losses from digital ad fraud, according to a study by the Association of National Advertisers.
Bad bots create complex challenges for companies that rely on accurate, real‑time data. One reason: even if they’re aware of the problem, companies often have trouble identifying the real culprits. In the healthcare sector alone, only 20% of IT security execs say they can tell good bots from bad ones, according to a Ponemon Institute survey.
Bad bots = bad data
Bots—also called spiders, scrapers, or crawlers—perform a wide range of automated routines much more efficiently than people can. Search engines developed some of the first bots to “crawl” websites for content and data that helped improve the quality of search indexing and rankings.
Other commercial bots collect data for weather apps or currency‑rate services. So‑called good bots accounted for 20% of all internet traffic in 2017, according to the Distil report. The Asian airline and many other businesses don’t mind shouldering the IT burdens of good bots, because those bots raise the visibility of their products and services.
Bad bots are another story because they use their powers to disrupt inventory and transaction systems and pickpocket competitive data. Web “scraper” bots collect product information and dynamic price data from rivals. Bot operators can use the data to their own advantage or sell it to third parties that target price‑sensitive buyers.
The financial risks are significant. A typical online ecommerce site today that generates $100 million in annual revenue will incur an estimated $4 million a year in bot‑related losses, according to a study by Aberdeen.
Other bots take over customer shopping accounts and repeatedly place products in shopping carts without purchasing. Called “hoarding,” it’s a technique used to unmask hidden discounts only offered in the latter stages of buying, or to trigger “sold out” notices that steer shoppers to other sites.
Bots “can derive estimates of your inventory,” says Patrick Sullivan, senior director of security technology at Akamai, which helps companies solve bot‑related problems. Rivals can then beef up their own inventory in response. Whatever the tactics, bad bots most frequently wreak havoc with inventory and sales data and cause companies to make costly mistakes with pricing and purchasing.
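A hoarding pattern like the one described above can, in principle, be flagged from session data: many cart additions with almost no purchases. The sketch below is illustrative only; the thresholds and field names (`cart_adds`, `purchases`) are assumptions, not a production detection model.

```python
def looks_like_hoarding(session, min_adds=20, max_purchase_ratio=0.02):
    """Flag sessions that add many items to carts but almost never buy.

    Thresholds are illustrative assumptions: at least `min_adds` cart
    additions, with purchases making up no more than `max_purchase_ratio`
    of those additions.
    """
    adds = session.get("cart_adds", 0)
    buys = session.get("purchases", 0)
    if adds < min_adds:
        return False  # too little activity to judge
    return (buys / adds) <= max_purchase_ratio
```

A real system would combine signals like this with request timing, IP reputation, and account history rather than relying on a single ratio.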
For businesses that have grown accustomed to ubiquitous bots, the most painful consequences are often the indirect ones. Bad bots corrupt traffic data, and the resulting analytics can throw strategy and decision‑making off course.
One airline that recently launched an email campaign for bargain fares noticed that conversions were coming in below target. “The marketing team spent half a year wondering why they couldn’t get customers to convert,” says Reid Tatoris, marketing VP at Distil Networks.
Distil’s team discovered that bad bots were completing signups and generating 50% of the clicks on emails, all in an effort to scrape pricing and other data, Tatoris says. The bots grabbed the data but never bought tickets, creating the appearance of an unsuccessful campaign. Once IT analysts figured out how to identify the malicious portion of traffic, they discovered their conversion rates actually exceeded goals.
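The analytics correction described in the anecdote amounts to recomputing the conversion rate after excluding traffic flagged as bot‑generated. A minimal sketch, with assumed event fields (`is_bot`, `purchased`) and made‑up numbers:

```python
def conversion_rate(events, exclude_bots=False):
    """Fraction of click events that ended in a purchase.

    With exclude_bots=True, events flagged as bot traffic are
    removed before computing the rate.
    """
    counted = [e for e in events
               if not (exclude_bots and e.get("is_bot", False))]
    if not counted:
        return 0.0
    purchases = sum(1 for e in counted if e.get("purchased", False))
    return purchases / len(counted)
```

If half the clicks come from bots that never buy, the raw rate understates the human rate by half, which is exactly the distortion the airline’s marketing team was chasing.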
In sectors such as online ticketing, advanced bots mimic the behavior of human users and can easily bypass hurdles such as Captcha, a challenge‑response test designed to distinguish human users from automated programs. One ticket broker unleashed an army of bots that created more than 9,000 Ticketmaster accounts and used them to buy more than 30,000 tickets to the musical “Hamilton” over a 20‑month period. A lawsuit is pending, and Ticketmaster has since deployed a new user‑verification platform to defend against bot scalpers.
If you can’t beat ’em, fool ’em
How else can companies lower their risks? Some experts recommend a “manage, not mitigate” strategy. Give up the notion that you will reduce the number of bad bots or change their behaviors, and instead find ways to beat them at their own game.
It’s more effective to make bot operators believe they’re getting something worthwhile, in a way that doesn’t interfere with analytics.
Companies can deter bots by misleading them about the data they’re stealing, or by serving up cached web pages with outdated pricing or “out of stock” notices so they can’t scrape true data from a website. Other tactics, such as moving the checkout process from web browsers to native apps, can deter bots from placing products in shopping carts.
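The stale‑cache tactic boils down to a routing decision: score each request’s bot likelihood and, above a threshold, serve cached (outdated) data instead of live data. The signals and thresholds below are illustrative assumptions, not a real scoring model:

```python
SUSPICION_THRESHOLD = 0.7  # assumed cutoff for "likely a bot"

def bot_score(request):
    """Crude illustrative bot-likelihood score in [0.0, 1.0]."""
    score = 0.0
    if not request.get("accepts_cookies", True):
        score += 0.4  # many simple bots don't handle cookies
    if request.get("requests_per_minute", 0) > 60:
        score += 0.4  # inhuman request rate
    if "bot" in request.get("user_agent", "").lower():
        score += 0.3  # self-identified automation
    return min(score, 1.0)

def price_response(request, live_price, cached_price):
    """Serve stale cached pricing to likely bots, live pricing otherwise."""
    if bot_score(request) >= SUSPICION_THRESHOLD:
        return {"price": cached_price, "source": "cache"}
    return {"price": live_price, "source": "live"}
```

Production bot scoring relies on far richer signals (behavioral biometrics, TLS fingerprints, IP reputation); the point here is only the serve‑stale‑data decision.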
Bot traffic can also be diverted to servers that don’t host key workflows, protecting the business analytics, says Sullivan. The Asian airline took this approach to win its bot battles. Using Akamai’s tools, the airline neutralized most of its bot traffic and slashed vendor costs for its external reservation system by 59%.
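Diverting suspected‑bot traffic to a secondary backend can be sketched as a simple routing rule: bots go to a pool with no live inventory lookups and are excluded from analytics. The pool names and the `is_suspected_bot` flag are hypothetical:

```python
def route_request(request):
    """Pick a backend pool and an analytics tag for a request.

    Suspected bots are sent to a decoy pool that does not touch the
    external reservation system, and their traffic is excluded from
    business analytics.
    """
    if request.get("is_suspected_bot", False):
        return {"pool": "decoy-pool", "count_in_analytics": False}
    return {"pool": "reservation-pool", "count_in_analytics": True}
```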
Another effective countermeasure is to block traffic from outdated web browsers. Companies can use this approach to prevent less‑advanced bots from accessing their pricing and inventory data.
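Blocking outdated browsers typically means inspecting the User‑Agent header. A minimal sketch, assuming made‑up minimum versions; a real deployment would use a maintained user‑agent parsing library and an allow‑list, since User‑Agent strings are easily spoofed:

```python
import re

MIN_VERSIONS = {"chrome": 60, "firefox": 55}  # assumed version cutoffs

def major_version(user_agent, browser):
    """Extract a browser's major version from a User-Agent string."""
    match = re.search(rf"{browser}/(\d+)", user_agent, re.IGNORECASE)
    return int(match.group(1)) if match else None

def allow_request(user_agent):
    """Allow only recognized browsers at or above the minimum version."""
    for browser, minimum in MIN_VERSIONS.items():
        version = major_version(user_agent, browser)
        if version is not None:
            return version >= minimum
    return False  # unrecognized client: treated as suspect in this sketch
```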
Whatever the tactics, companies should devise a bot strategy soon if they don’t yet have one. The first step is usually the hardest: discovering where the bad bots are hiding with the good ones.