Hello everyone! In this post I will show you four tools that will make your recon process easier and help you get more effective results.
When you use gowitness or any other web screenshot tool to capture screenshots of many URLs, you end up with a huge number of images, many of them duplicates because different URLs serve the same content. This tool helps you remove images with duplicate content. For example:
Suppose you have screenshots of many httprobe'd URLs for "bingo.com" generated with gowitness, or you have run gowitness on a dirsearch result. Either way, you will get far too many screenshots to check manually for duplicates.
Above are the screenshots of the httprobe'd URLs, generated using gowitness.
Now we will run dupimageremover on this directory to remove the duplicate screenshots.
You can download this tool from GitHub: https://github.com/rook1337/dupimageremover
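The core idea behind this kind of deduplication is simple: hash the raw bytes of each image and delete any file whose hash has already been seen. Here is a minimal sketch of that approach in Python (the function name `remove_duplicate_images` is my own; the actual dupimageremover tool may work differently, e.g. with perceptual hashing):

```python
import hashlib
import os

def remove_duplicate_images(directory):
    """Delete files in `directory` whose content hash matches an earlier file.

    Minimal content-based dedup sketch: identical bytes -> identical hash.
    """
    seen = {}      # hash -> first filename with that content
    removed = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if not os.path.isfile(path):
            continue
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest in seen:
            os.remove(path)          # duplicate content, delete it
            removed.append(name)
        else:
            seen[digest] = name      # first time we see this content
    return removed
```

Note that byte-identical hashing only catches exact duplicates; two screenshots of the same page taken at different times may differ by a few pixels and would need fuzzier (perceptual) hashing to be caught.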
This is a simple tool to remove duplicate lines from any text file. You can use it when you have collected subdomains from multiple sources such as crt.sh, amass, and subfinder: merge all the subdomains into one text file, then remove the duplicate lines.
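The logic is equivalent to `sort -u`, except that a small script can also preserve the original order of first appearance. A sketch (my own helper, not the tool's actual code):

```python
def dedupe_lines(lines):
    """Return the lines with duplicates removed, keeping first-occurrence order."""
    seen = set()
    unique = []
    for line in lines:
        line = line.strip()
        if line and line not in seen:   # skip blanks and repeats
            seen.add(line)
            unique.append(line)
    return unique
```

To apply it to a merged subdomain file, read the file's lines, pass them through `dedupe_lines`, and write the result back out.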
This tool helps you compare two text files and find the new lines. For example, suppose you have a crt.sh subdomain list and a subfinder subdomain list for the target "bingo.com", and you want to see how many new subdomains crt.sh found compared to the subfinder list.
As the above screenshot shows, the crt.sh list contains 21 duplicate subdomains that also appear in the subfinder list, and the new subdomains are written out to the newlines.txt file.
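Under the hood this is just a set difference: lines in file A that do not appear in file B (the same result `comm -23` gives on sorted files). A hedged sketch, with a function name of my own choosing:

```python
def new_lines(file_a_lines, file_b_lines):
    """Return lines from file A that are not present in file B."""
    known = {line.strip() for line in file_b_lines}   # fast membership lookup
    return [line.strip()
            for line in file_a_lines
            if line.strip() and line.strip() not in known]
```

Here `file_a_lines` would be the crt.sh list and `file_b_lines` the subfinder list; the returned entries are the subdomains only crt.sh found.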
This tool lets you fetch links and domains from any URL. Example usage:
Fetching all subdomains/domains from the links on the apple.com page.
Using this tool you can retrieve domains/subdomains from almost anywhere, but you will need to filter out the domains/subdomains that are irrelevant to your target.
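The essence of this step is parsing the page's `<a href>` attributes and keeping only the hostnames. A minimal sketch using Python's standard library (shown on a static HTML snippet so it runs offline; the real tool fetches the page first, and its extraction logic may differ):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkDomainExtractor(HTMLParser):
    """Collect the hostnames found in <a href="..."> attributes."""
    def __init__(self):
        super().__init__()
        self.domains = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    host = urlparse(value).hostname
                    if host:                 # relative links have no hostname
                        self.domains.add(host)

def extract_domains(html):
    parser = LinkDomainExtractor()
    parser.feed(html)
    return sorted(parser.domains)
```

To reproduce the apple.com example you would download the page (e.g. with `urllib.request.urlopen`) and pass its HTML to `extract_domains`, then grep the output for the domains you actually care about.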