The move itself certainly made sense. The retailer tried moving non-customer traffic—such as search engine bots and agents that track various Web performance metrics—to slightly slower servers, with the intent of making the experience much faster for paying customers. And if a bot for Bing, Yahoo or Google gets its data a little more slowly, no harm done. That was the theory.
But slowing down responses for a Google spider, for instance, will in turn lead Google to conclude the site is slow, which could hurt the retailer's ranking on the search giant. It also caused some companies that track—and publicly report—Web performance to see the retailer's Web presence as slow or, in this case, down completely.
The IT team at 1-800-Flowers initially thought the reports of the site being down involved a carrier hiccup, but it soon turned its attention to internal server changes. That account comes from an unidentified 1-800-Flowers IT team member commenting on an outage report issued by uptime-tracking firm Pingdom, according to an E-mail sent by Joe Pititto, the investor relations VP for 1-800-Flowers.
"On [February] 11th, we moved our traffic to run out of our primary datacenter. If [Pingdom's] bot traffic was pointing to our backup datacenter or any of its associated cells, they would have seen an issue," said the IT person. "Pingdom monitors our site using a special user agent called the Pingdom.com_bot. It is possible that some of our cells have differential treatment for this user agent, as we have a number of redirect rules for different user agents for mobile users, search engines, etc. If Pingdom were to monitor our site using a standard user agent like Internet Explorer or Firefox, they would not have encountered this problem. To prove this theory though, we will have to get into an interactive troubleshooting exercise with Pingdom. This might also explain why sometimes Gomez sees us as down, when in fact we are alive and well."
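The redirect rules the IT team describes boil down to inspecting each request's User-Agent header and sending it to a different server pool. A minimal sketch of that kind of routing logic, in Python, might look like the following. The pool names and user-agent signatures here are illustrative assumptions, not 1-800-Flowers' actual configuration.

```python
# Hypothetical user-agent-based routing, of the kind described above.
# Signature lists and pool names are made up for illustration.
BOT_SIGNATURES = ("Pingdom.com_bot", "Googlebot", "bingbot", "Slurp")
MOBILE_SIGNATURES = ("iPhone", "Android")

def pick_backend(user_agent: str) -> str:
    """Return the server pool a request should be routed to,
    based on substring matches against the User-Agent header."""
    if any(sig in user_agent for sig in BOT_SIGNATURES):
        return "bot-pool"        # slower, isolated servers for crawlers/monitors
    if any(sig in user_agent for sig in MOBILE_SIGNATURES):
        return "mobile-pool"     # mobile-optimized pages
    return "customer-pool"       # fast path for ordinary shoppers

# A standard browser user agent takes the fast path...
print(pick_backend("Mozilla/5.0 (Windows NT 6.1) Firefox/3.6"))  # customer-pool
# ...while a self-identifying monitor lands in the bot pool.
print(pick_backend("Pingdom.com_bot version 1.4"))               # bot-pool
```

This also shows why switching Pingdom to a Firefox user-agent string would have sidestepped the problem: the router would have had no way to tell the monitor apart from a customer.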
Pingdom responded that its research into the problem came up with the same conclusion. "Clearly, the Pingdom user agent is routed differently by your systems. Responses/load times are much slower (several seconds versus less than one second) than if we use the Firefox user agent, and we see this consistently from all 25 locations," wrote Pingdom's Peter Alguacil. "So, I'm guessing you're redirecting them to different servers, and the one for 'bots' (depending on your criteria) is performing very poorly. So in this (very rare) case, Pingdom doing the 'polite' thing and clearly identifying itself to your servers actually made a performance difference."
The "polite" reference is a point made by multiple Web performance firms, and it may be the heart of this issue. The premise is that self-identification allows a retailer to easily identify and isolate those bots, so conversion rates don't become meaningless. If a search engine bot hits a site repeatedly, for example, conversion percentages could plummet, because the analytics engine would count every bot visit as a non-buying customer.
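The arithmetic behind that premise is simple to demonstrate. In the sketch below—with entirely made-up visit data and user-agent names—a heavy crawler drags an apparent 30 percent conversion rate down to 6 percent unless bot traffic is filtered out first.

```python
# Hypothetical illustration of how unfiltered bot traffic distorts
# conversion rates. All data here is invented for the example.
BOT_SIGNATURES = ("Googlebot", "Pingdom.com_bot")

def conversion_rate(visits, filter_bots=False):
    """Purchases divided by visits, optionally excluding known bots."""
    if filter_bots:
        visits = [v for v in visits
                  if not any(sig in v["ua"] for sig in BOT_SIGNATURES)]
    purchases = sum(1 for v in visits if v["purchased"])
    return purchases / len(visits)

visits = (
    [{"ua": "Mozilla/5.0 Firefox", "purchased": True}] * 30    # buyers
  + [{"ua": "Mozilla/5.0 Firefox", "purchased": False}] * 70   # browsers
  + [{"ua": "Googlebot/2.1", "purchased": False}] * 400        # crawler hits
)

print(f"{conversion_rate(visits):.1%}")                    # 6.0%
print(f"{conversion_rate(visits, filter_bots=True):.1%}")  # 30.0%
```

Filtering only works, of course, when the bots politely identify themselves—which is exactly the behavior that got Pingdom routed to the slow servers.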
Some services, such as Gomez, also self-identify their agents but will sometimes, at a client's request, run without identification so the service can make sure its performance metrics are real. In other words, by not identifying your bots—or by using direct browser connections, which is what Keynote Systems typically does—you avoid this issue, but you can also ruin a retailer's conversion calculations.
1-800-Flowers' Pititto didn't want to discuss the strategy behind moving bots to a different server, suggesting it was the wrong takeaway from this incident. "From our standpoint, this is less an issue of us or any retailer being unaware (of the implications of rerouting bots) and more an issue of the Pingdoms of the world being unable to accurately track actual uptime and, therefore, reporting inaccurate information," Pititto said. "Our focus is not on improving their accuracy; rather, as always, maintaining 100 percent uptime for our customers."
That's a fair point, but other companies that track retail uptime were much less hesitant to comment on the bot-rerouting strategy itself.
"It looks like [Flowers] deliberately deflected robots when they thought it was a peak time," said Dave Karow, a senior product manager at Keynote Systems, adding that doing such a move on purpose is unwise. "Deflecting low-value non-revenue traffic to the dungeon is short-sighted, extremely short-sighted. If they made this move on purpose, they did not understand the implications of what they were doing."
When pressed for any legitimate reason to make such a move, Karow came up with a deliciously Machiavellian thought. "The only positive motivation for hosing Gomez and others," he said, would be to feed rivals incorrect information about your Web performance. Some chains, the theory goes, will tweak their sites to keep up with or slightly better their competition. This type of bot-rerouting approach would cause competitors "to not push themselves while your customers still get a really fast performance."
That said, Karow still concluded that letting the bots see the real thing is the best approach. "If they're gaming the system, they should be treating the bots better," so that the performance stats that get publicized look better and attract more customers.
Karow compared this technique to a Web velvet rope, where some chains—including Sephora and Home Depot—"start deflecting people to a light version of their site when they have high traffic. You don't get a failure. With Home Depot, you don't even know that it's happened."
Over at Web tracking firm Gomez, Chief Technology Officer Imad Mouline echoed Keynote's opinion that rerouting bots could prove short-sighted. "It may have unintended consequences. The spidering types of services, they each have some type of value for the retailer," Mouline said. "If you're representing a different face, a different performance" to some visitors, you could end up being quite unhappy.