Notes from: How my Botnet Purchased Millions of Dollars in Cars and Defeated the Russian Hackers

What makes a good bot(net) project?

1) Don’t be afraid to do something different.

2) Don’t assume that because you can spider and scrape, you can make a copy of the internet!

Being able to screen-scrape and spider doesn’t mean you can copy the internet. If your project requires both batch processing and real-time results, you have a problem. Projects that demand massive scale also cause problems for botnet projects.

3) Realize you don’t own the target’s server

For example, say you want to build a project that monitors around 100,000 items on Amazon every 5 seconds. That’s 20,000 requests per second against infrastructure you don’t own, and it could actually land you in court facing a trespass to chattels suit.

4) Have a realistic profit model

Because you have to get paid.

He worked for a car dealership that wanted to purchase cars from an online sales site. People would manually refresh the page and never complete the purchase, since there was incredible lag and competition.

He set up a number of bots that would sync themselves with the sales site’s server clock, count down to the estimated time a car would go on sale (derived from the page’s meta refresh), and then attempt to make the purchase. This resulted in a 95% success rate for purchases.
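A minimal sketch of that countdown approach, assuming the server’s clock can be estimated from the HTTP Date header. Every name here (`SALE_URL`, `SALE_TIME`, `attempt_purchase`) is a hypothetical stand-in, not the speaker’s actual code:

```python
import time
import urllib.request
from email.utils import parsedate_to_datetime

SALE_URL = "https://example-auction-site.test/car/123"  # hypothetical target
SALE_TIME = 1_700_000_000.0  # hypothetical sale time (epoch), parsed from the meta refresh

def attempt_purchase(url):
    """Hypothetical stub for the actual purchase request."""
    print(f"firing purchase attempt at {url}")

def server_clock_offset(url):
    """Estimate (server time - local time) from the HTTP Date header."""
    t0 = time.time()
    with urllib.request.urlopen(url) as resp:
        t1 = time.time()
        server_time = parsedate_to_datetime(resp.headers["Date"]).timestamp()
    # Assume the Date header was stamped roughly halfway through the round trip.
    return server_time - (t0 + t1) / 2

def snipe(sale_time, url):
    offset = server_clock_offset(url)
    # Sleep until the sale opens, measured in *server* time, then fire.
    wait = sale_time - (time.time() + offset)
    if wait > 0:
        time.sleep(wait)
    attempt_purchase(url)

snipe(SALE_TIME, SALE_URL)
```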

This type of bot is typically called a “sniper”: a bot that is set up to hit one specific target at a certain time, or through certain logic.

Eventually, another dealership hired a team of Russian hackers to develop a bot of their own, and his success rate dropped to 50%.

In response, he programmed the botnet to cascade incrementally in volume. Each bot fired would then attempt the purchase multiple times, instead of just once. This brought the success rate back to close to 100%.
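One way to read “cascade incrementally in volume” is waves of bots of growing size, each bot retrying several times. The exact scheme wasn’t described, so this sketch is an illustration, with `attempt_purchase` again a hypothetical stub:

```python
import threading
import time

def attempt_purchase(url):
    """Hypothetical stub; returns True on a successful purchase."""
    return False

def bot(url, attempts=5):
    # Each bot retries the purchase several times instead of just once.
    for _ in range(attempts):
        if attempt_purchase(url):
            return
        time.sleep(0.1)

def cascade(url, waves=4, interval=0.5):
    """Fire bots in waves of doubling size: 1, 2, 4, 8, ..."""
    threads = []
    for wave in range(waves):
        for _ in range(2 ** wave):
            t = threading.Thread(target=bot, args=(url,))
            t.start()
            threads.append(t)
        time.sleep(interval)  # pause before escalating to the next wave
    for t in threads:
        t.join()
```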

Retrospective on bots, from his experience

Bots should be very lightweight clients, and they need to be easy to update and distribute. He should’ve built in analytics and collected metrics: What was the exact success rate per attempt? What were the cars actually purchased for, and then sold for (to determine exact value)? He should’ve added some process for better vehicle selection, like cross-referencing Kelley Blue Book to look for better deals to purchase.

He could’ve tried to purchase before the sale opened by enabling the Buy Now button, or by making the request directly outside of the browser, but that would’ve shown their hand, and they probably would’ve been noticed.

Today, he uses a “Task Queue”, which is essentially a table in a database that keeps track of what tasks his bots need to do. The task queue is then fed to “harvesters”: machines that execute the work.
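A minimal sketch of such a task-queue table, here in SQLite; the actual schema wasn’t described, so the columns are assumptions:

```python
import sqlite3

conn = sqlite3.connect("tasks.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS task_queue (
        id      INTEGER PRIMARY KEY,
        target  TEXT NOT NULL,              -- URL or item the bot should act on
        action  TEXT NOT NULL,              -- e.g. 'monitor', 'purchase'
        status  TEXT DEFAULT 'pending'      -- pending / running / done / failed
    )
""")
conn.execute(
    "INSERT INTO task_queue (target, action) VALUES (?, ?)",
    ("https://example.test/item/42", "purchase"),  # hypothetical task
)
conn.commit()
```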

He uses iMacros, browser macros that you can record and replay over and over again. His harvesters create and run iMacros, and through that his bots can “manipulate any website”, easily emulating human behavior. After each run, the harvester updates the task queue with the new status.
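Putting the two pieces together, a harvester loop might look like the sketch below: claim a pending task, replay a macro against it, and write the result back. `run_macro` is a hypothetical stand-in for invoking iMacros (or any browser-automation tool); the actual integration wasn’t shown in the talk:

```python
import sqlite3
import time

def run_macro(target, action):
    """Hypothetical stub: replay the recorded browser macro for this task,
    returning True on success."""
    return True

def harvest(db_path="tasks.db"):
    conn = sqlite3.connect(db_path)
    while True:
        row = conn.execute(
            "SELECT id, target, action FROM task_queue "
            "WHERE status = 'pending' LIMIT 1"
        ).fetchone()
        if row is None:
            time.sleep(1)  # nothing to do; poll again
            continue
        task_id, target, action = row
        conn.execute("UPDATE task_queue SET status = 'running' WHERE id = ?",
                     (task_id,))
        conn.commit()
        ok = run_macro(target, action)
        # Update the task queue with the new status, as described above.
        conn.execute("UPDATE task_queue SET status = ? WHERE id = ?",
                     ("done" if ok else "failed", task_id))
        conn.commit()
```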