Well, web vulnerability scanners are certainly a big help when you test web application security, but you shouldn't rely on them alone. Sure, they save time, but they also make a lot of noise and can be pretty dangerous. Imagine that somewhere in an application you find a SQL injection and the query behind it is a DROP statement: the scanner could make the database drop everything! Scanners also make WAFs/IDSes/protections raise alerts on practically every forged request.
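To make that danger concrete, here's a tiny self-contained simulation (hypothetical table and data, using SQLite) of what happens when a classic `' OR '1'='1` payload, the kind a scanner fires blindly, lands in a query that deletes data:

```python
import sqlite3

# Hypothetical demo: table and data are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

user_input = "0' OR '1'='1"  # a payload a scanner might send
# Vulnerable code: the parameter is concatenated straight into the query.
conn.execute("DELETE FROM users WHERE name = '%s'" % user_input)

remaining = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(remaining)  # 0 -- every row is gone, not just the targeted one
```

The `OR '1'='1'` makes the WHERE clause true for every row, so a scanner probing a DELETE (or DROP) endpoint can wipe data it never intended to touch.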

With that said, I'll give an opinion, based also on some facts and experience. The best scanner doesn't exist. There are many scanners, and some will suit your environment and needs better than others. It always depends on the web application environment and the scope limits, and also on the cost of the product itself.

But let's get more detailed:

1) Which one do I choose?

As said before, you choose the one that best suits your environment and your technology. There are enterprise and non-enterprise solutions. I'll talk about the non-enterprise ones, as the enterprise solutions are just the same scanners with managed engines and a pretty web UI. It also depends on what you want to find and test, how much, and for how long (talking about legal use). You need to test them before saying one is better than another. But if you only need an automated scan, you could go for Netsparker, AppSpider, Acunetix, or Burp Suite (for the scanner itself). If you need to do a lot of manual work, then I would choose Burp Suite anyway.

2) False positives down to 0%?

Welp, nowadays this can be true and not. Scanners just use signatures based on some data expected from responses (response changes, response body changes, timing changes, and more), and it's easy to get some false positives out of the box. Yes, the technology is more powerful now and there are better algorithms to check whether a finding is a real vulnerability or a false positive, but it's never certain, because of the signatures. What Netsparker says about a 100% false-positive-free guarantee isn't true, and here comes the talk about false negatives. A false negative arises when a scanner doesn't find a vulnerability that exists in some endpoint or parameter. False negatives are more common and easier to hit.
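As an illustration, here's a minimal sketch of how a time-based blind SQL injection signature works, and how both false positives and false negatives fall out of it. All numbers are made-up simulation values, not real measurements:

```python
# The scanner injects a payload like "'; WAITFOR DELAY '0:0:5'--" and
# flags the parameter as vulnerable if the response slows down enough.
SLEEP_PAYLOAD_DELAY = 5.0  # seconds the injected sleep should add
THRESHOLD = 4.0            # signature: "slower by at least this much"

def looks_vulnerable(baseline_time, payload_time, threshold=THRESHOLD):
    """Pure timing signature: no understanding of the app, just a delta."""
    return payload_time - baseline_time >= threshold

# True positive: the injected sleep actually ran on the database.
assert looks_vulnerable(0.2, 0.2 + SLEEP_PAYLOAD_DELAY)

# False positive: the app is NOT vulnerable, but a slow backend call or
# network hiccup pushed the response past the threshold anyway.
assert looks_vulnerable(0.2, 6.5)

# False negative: the app IS vulnerable, but (say) a WAF cut the request
# short, so the delay never showed up in the measured time.
assert not looks_vulnerable(0.2, 0.3)
```

The signature is just a timing delta, so anything that shifts response times (load, caching, a WAF) can flip the verdict in either direction.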

3) Remember that it's marketing.

I hope you know that every scanner vendor needs to attract clients, so Netsparker, Burp Suite, and Acunetix all just want more customers. But you can't form an opinion only from what they say on their websites. Of course they won't say negative things about their own product, and I understand that. So you need to go to some websites for opinions and ideas, but first you should try them and form your own idea.

Here is my list, not in order of preference.

"Best" web vulnerability scanners:

- Burp Suite
- Acunetix
- Netsparker
- AppSpider
- FAST by Wallarm (really cool)

What I would not use:

- AppScan
- WebInspect
- OWASP ZAP (nice project, and open source, but not that good in my opinion)
- Qualys WAS (cloud-based)
- Any other that isn't well documented

Other scanners like Detectify and other (cloud and not) solutions: I haven't tried them, and I'm not sure I even want to. Below I'll also list a few pros and cons of some of these scanners (personal opinion).

Burp Suite:


- Very good scanner

- You can use extensions to extend checks and usability, and also create your own

- Awesome for manual tests

- You have total control over the tool (except for fully automated scans, but you can direct them)


- Sadly it's based on Java, so it's heavy and resource-consuming (although they are working on that)

- Extensions can give many false positives (but that's a problem of whoever wrote them; the core scanner has its own ways to verify an issue, and the Extensions API helps out and is very well documented)

- Slow scanner

- The new crawler can skip some endpoints/requests, even the beta one based on Chromium
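On the extension false-positive point: a lot of it comes down to how naive the extension's own matching logic is. Here's a hypothetical passive check of the kind an extension might register (names and logic are mine, not from any real extension), showing how one extra validation step cuts false positives:

```python
import re

# Naive check: flag any 16-digit number as a "credit card leak".
def naive_check(body):
    return re.search(r"\b\d{16}\b", body) is not None

# Better: validate the candidate with the Luhn checksum first, which
# filters out session IDs and other random 16-digit numbers.
def luhn_ok(number):
    digits = [int(d) for d in number]
    # Double every second digit from the right; 10 -> 1+0, etc.
    for i in range(len(digits) - 2, -1, -2):
        digits[i] = sum(divmod(digits[i] * 2, 10))
    return sum(digits) % 10 == 0

def better_check(body):
    return any(luhn_ok(m) for m in re.findall(r"\b\d{16}\b", body))

print(naive_check("session=1234567890123456"))   # True  -- false positive
print(better_check("session=1234567890123456"))  # False -- filtered out
print(better_check("card=4111111111111111"))     # True  -- valid Luhn
```

The scanner core does this kind of secondary verification for its built-in checks; an extension only does it if its author bothered to.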


Netsparker:

- Very good crawler; it finds likely all paths/endpoints, but the bigger the website, the longer it takes to cover it all

- Checks are pretty good, and the out-of-band checks are way better

- Has a scan optimizer that can tune the scan policy in use to better suit the environment (it detects the web server, the language the website uses, and the technologies) and adjusts the scan accordingly

- Has some manual tools to exploit some vulnerabilities and do some manual tests


- Very slow scan, but that's because it tests every possible parameter, and it also depends on the scan policy

- Doesn't find every vulnerability (false negatives)

- The manual tools aren't that good, and there are only a few of them

- Sometimes DOM/JS parsing just fails, or loops/times out even on simpler websites


Acunetix:

- Good crawler

- Surprisingly fast

- Also has checks for many CVEs

- Integration with OpenVAS for network scans (good feature, but OpenVAS isn't that good)


- Many false negatives, because it skips checks that could take long and also skips some parameters

- Fully automated, with no manual tools or manual things to do (actually, there is the Acunetix Manual Tools software, but it's useless IMHO)

- Scan policy editing is a bit restrictive: you can only toggle vulnerability families or single known checks

- Sometimes the out-of-band checks are prone to false positives (but this changes from environment to environment)
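For context, out-of-band checks work by injecting a payload that makes the target call back to a listener the scanner controls, carrying a unique token so the hit can be tied to one specific test. Here's a minimal sketch of that correlation logic (the listener domain is made up, and no real network traffic happens here):

```python
import uuid

LISTENER = "oob.example-scanner.net"  # hypothetical scanner-owned domain

def make_payload():
    """Embed a unique token so a later callback maps to exactly one test."""
    token = uuid.uuid4().hex
    payload = "http://%s.%s/" % (token, LISTENER)
    return token, payload

def correlate(token, observed_callbacks):
    """The vulnerability is confirmed only if OUR token phoned home."""
    return any(token in host for host in observed_callbacks)

token, payload = make_payload()
# Simulated DNS callbacks seen by the listener:
hits = ["%s.%s" % (token, LISTENER)]
print(correlate(token, hits))          # True  -- confirmed
print(correlate("ZZZZZZZZ", hits))     # False -- unrelated traffic ignored
```

When the token matching or timing windows are sloppy (e.g. stale callbacks from an earlier scan, or scanners sharing a listener), you get exactly the environment-dependent false positives mentioned above.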

So this was just a little list based on my ideas and experience. I hope it opens your eyes about these tools and that you learn by yourself. They are just tools to help the tester with generic vulnerabilities; they can't find a logic flaw or anything similar, because they're not intelligent in any way. They just run payloads and use signatures, advanced or not.

I want to say again that this is my opinion, based on personal experiences and facts, and I'm not endorsing any particular product.
If you want to ask me something or correct me, you can find me on Twitter at @h0nus7