Botnets are being used to generate up to 80,000 queries on search engines each day.
This, experts said, allows a filtered list of potentially exploitable sites to be compiled quickly and with minimal effort.
The attacker's identity remains concealed as searches are conducted using botnets and not the hacker's IP address.
Amichai Shulman, CTO of Imperva, said that hackers have become experts at using Google to create a map of hackable targets on the web, which allows them to target their attacks more productively.
"These attacks highlight that search engine providers need to do more to prevent attackers from taking advantage of their platforms," he said.
Imperva said that search engines deploy detection mechanisms, based on the IP address of the originating request, in order to block automated search campaigns. However, hackers easily overcome these detection mechanisms by distributing their queries across botnets.
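To illustrate why per-IP detection fails against a distributed campaign, here is a minimal sketch of a threshold-based detector. The function name, threshold, and IP addresses are hypothetical, and real search engines use far richer signals; the point is only that the same query volume spread across many bots stays under any per-IP limit.

```python
from collections import Counter

def flag_abusive_ips(query_log, threshold=100):
    """Flag source IPs whose query count exceeds a per-IP threshold.

    query_log: iterable of source-IP strings, one entry per query.
    Hypothetical illustration of IP-based rate detection.
    """
    counts = Counter(query_log)
    return {ip for ip, n in counts.items() if n > threshold}

# A single scraper issuing 1,000 queries is caught...
single_host = ["198.51.100.7"] * 1000
assert flag_abusive_ips(single_host) == {"198.51.100.7"}

# ...but the same 1,000 queries spread across a 50-host botnet
# (20 queries per host) stay under the threshold and go unflagged.
botnet = [f"203.0.113.{i}" for i in range(50) for _ in range(20)]
assert flag_abusive_ips(botnet) == set()
```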
According to Imperva, its Application Defense Center observed a specific botnet examining dozens, and even hundreds, of returned results by using paging parameters in the query during May and June.
This resulted in almost 550,000 queries being issued during the observation period, peaking at 81,000 queries in a single day with a daily average of 22,000. The attacker was able to take advantage of the bandwidth available to the dozens of controlled hosts in the botnet to seek out and examine vulnerable applications.
In terms of recommendations, Imperva said that search engine providers should start looking for unusual or suspicious queries, such as those that look for known sensitive files or database data files.
It also recommended blacklisting internet service providers suspected of being part of a botnet, applying strict anti-automation policies (such as CAPTCHA), and identifying additional hosts that exhibit the same suspicious behaviour pattern in order to update the IP blacklist.
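The last recommendation, extending a blacklist to other hosts showing the same behaviour pattern, could be sketched as follows. Here the "behaviour pattern" is reduced to sharing a query with a known-bad host, which is a deliberate oversimplification of the correlation a real system would perform; all IPs and queries are hypothetical.

```python
from collections import defaultdict

def extend_blacklist(query_log, seed_blacklist):
    """Flag additional IPs that issued any query also seen from a
    blacklisted host.

    query_log: iterable of (ip, query) pairs.
    seed_blacklist: set of IPs already known to be botnet members.
    Simplified sketch of 'same behaviour pattern' matching.
    """
    queries_by_ip = defaultdict(set)
    for ip, query in query_log:
        queries_by_ip[ip].add(query)

    # Collect the queries issued by known-bad hosts.
    bad_queries = set()
    for ip in seed_blacklist:
        bad_queries |= queries_by_ip.get(ip, set())

    # Any host that issued one of those queries joins the blacklist.
    extended = set(seed_blacklist)
    for ip, qs in queries_by_ip.items():
        if qs & bad_queries:
            extended.add(ip)
    return extended

log = [
    ("203.0.113.5", "inurl:admin filetype:sql"),
    ("203.0.113.9", "inurl:admin filetype:sql"),  # same dork, new host
    ("198.51.100.2", "weather in sydney"),
]
assert extend_blacklist(log, {"203.0.113.5"}) == {"203.0.113.5", "203.0.113.9"}
```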
This article originally appeared at scmagazineuk.com