BinGoo is a dorking tool that uses Bing and Google to find multiple types of vulnerabilities in websites, using either a single dork or a mass dork list. BinGoo detects SQLi, XSS, RFI, LFI, etc.
BinGoo Download: https://github.com/Hood3dRob1n/BinGoo
BinGoo Overview
---------------
BinGoo is my version of an all-in-one dorking tool written in pure bash. It leverages the Google and Bing main search pages to scrape a large number of links based on provided search terms. You can search a single dork at a time, or make lists with one dork per line and perform mass scans. Once you're done with that, or if you have links gathered by other means, you can move to the analyzing tools to test for common signs of vulnerabilities. The results are neatly sorted into their own respective files based on findings. If you want to take it further, you can run them through the SQL or LFI tools, which are semi-working homebrewed creations I made in bash, or you can use the SQLMAP and FIMAP wrapper tools I wrote, which work much better and with greater accuracy and results. I have also included a few neat features to make life easy, such as Geo dorking based on domain type or domain country code, a shared hosting checker which uses a preconfigured Bing search and a dork list to find possible vulns on other sites on the same server, and a simple admin page finder which works from a provided list and uses server response codes to confirm existence. Together I think it all works as a nice little package!
BinGoo Functions Overview
-------------------------

1) Google & Bing Dorkers:
------------------------
Simply choose the search engine you want to use at the main menu and then follow the prompts. You can run a single dork based on your input, getting as custom as you like, or you can point it at a dork file. I have included a few with the default package, which can be found in the dorks/ directory within the BinGoo/ folder. You can point it anywhere you want; just ensure there is one dork per line so it doesn't mess things up. Once all info is provided, the tool will work its magic and generate link files containing all results: b-links.txt for Bing searches and g-links.txt for Google searches.

NOTE: The Google dorker doesn't bypass the Google restrictions anymore, but if you alternate engines (scan with Google, then analyze, then scan with Bing, then analyze) you will keep the blocks to a minimum and shouldn't notice any issues. If you don't get results from Google, it is likely due to a temporary IP block; just use Bing until it refreshes and goes away (follow the advice above to avoid this situation).
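To give a feel for the mechanics, here is a hypothetical sketch of the per-dork loop (this is NOT BinGoo's actual code; the file name and `build_search_url` helper are made up): each line of the dork file becomes one search-engine query URL.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a dork-file-driven search loop (not BinGoo's real code).

# Build a Bing search URL for one dork. Only spaces are encoded here;
# a real tool would percent-encode all reserved characters.
build_search_url() {
  local dork="$1"
  printf 'https://www.bing.com/search?q=%s\n' "${dork// /+}"
}

# One dork per line, exactly as BinGoo expects its dork files.
printf '%s\n' 'inurl:index.php?id=' 'inurl:news.php?id=' > /tmp/sample-dorks.lst

while IFS= read -r dork; do
  [ -n "$dork" ] && build_search_url "$dork"   # skip blank lines
done < /tmp/sample-dorks.lst
```

The real tool then scrapes the result pages of each such query and appends the harvested links to b-links.txt or g-links.txt.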
2) Bing Geo Dorker:
------------------
This is my way of letting users define the domain type for their searches. It uses a pre-built dork list (dorks/site.lst), which you can modify if you like. It runs a list scan against the user-provided SITE TYPE or COUNTRY CODE and writes the link results to a file whose name includes the Geo code or site type provided, to make it easier to keep track of. Options for the Bing Geo Dorker include: AC AD AE AF AG AI AL AM AO AQ AR AS AT AU AW AX AZ BA BB BD BE BF BG BH BI BJ BM BN BO BR BS BT BW BY BZ CA CC CD CF CG CH CI CK CL CM CN CO CR CU CV CX CY CZ DE DJ DK DM DO DZ EC EE EG ER ES ET EU FI FJ FK FM FO FR GA GD GE GF GG GH GI GL GM GN GP GQ GR GS GT GU GW GY HK HM HN HR HT HU ID IE IL IM IN IO IQ IR IS IT JE JM JO JP KE KG KH KI KM KN KP KR KW KY KZ LA LB LC LI LK LR LS LT LU LV LY MA MC MD ME MG MH MK ML MM MN MO MP MQ MR MS MT MU MV MW MX MY MZ NA NC NE NF NG NI NL NO NP NR NU NZ OM PA PE PF PG PH PK PL PM PN PR PS PT PW PY QA RE RO RS RU RW SA SB SC SD SE SG SH SI SK SL SM SN SO SR SS ST SV SY SZ TC TD TF TG TH TJ TK TL TM TN TO TR TT TV TW TZ UA UG UK US UY UZ VA VC VE VG VI VN VU WF WS YE ZA ZM ZW BIZ COM INFO NET ORG AERO ASIA CAT COOP EDU GOV INT JOBS MIL MOBI MUSEUM TEL TRAVEL XXX
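Conceptually, a Geo dork just restricts an ordinary dork to the chosen TLD with a site: operator. A hypothetical sketch (the `geo_dork` helper is made up, not from BinGoo's source):

```shell
# Hypothetical sketch of Geo dork composition (not BinGoo's actual code).
# Appends a site: restriction for the chosen country code or domain type.
geo_dork() {
  local code base
  code=$(printf '%s' "$1" | tr '[:upper:]' '[:lower:]')
  base="$2"
  printf '%s site:.%s\n' "$base" "$code"
}

geo_dork DE 'inurl:index.php?id='   # restrict results to .de domains
```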
3) Bing Shared Hosting Check:
----------------------------
This module runs a list scan using dorks/sharedhosting.lst against a user-provided IP or site name to see whether any other sites on the same server might have possible vulnerabilities. You can shrink or expand the dork file as you like.
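The trick behind this is Bing's ip: search operator, which scopes results to a single server. A made-up sketch of the query composition (not BinGoo's actual code):

```shell
# Hypothetical sketch: Bing's ip: operator scoped to one server (helper is made up).
shared_hosting_dork() {
  printf 'ip:%s %s\n' "$1" "$2"
}

# Each line of a sharedhosting.lst-style file would supply the second argument.
shared_hosting_dork 192.0.2.10 'inurl:page.php?id='
```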
4) Digger Recon Tool:
--------------------
This is another script I wrote separately and decided to incorporate into the mix, since I find it useful all the time and figured others would too. It lives in the plugins folder and is called by the main script via the menu options. It uses a few built-in tools and a few online sites to gather information on a target site: an Alexa ranking check, a SameIP shared hosting check, a Bing shared hosting check, a sub-domain check, whois info, and a quick nmap scan. Together this can provide a wealth of information while barely taking direct aim (you can comment out the nmap scan if you'd like to tone it down a bit).
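As a rough picture of what such a recon pass strings together, here is a hypothetical step list (the commands are only printed, not executed, and the flags are illustrative, not Digger's exact ones):

```shell
# Hypothetical sketch of a Digger-style recon step list (not Digger's real code).
target="example.com"
recon_cmds=(
  "whois $target"          # registration / ownership info
  "host -t ns $target"     # name servers, a starting point for sub-domain checks
  "nmap -F $target"        # quick port scan; comment this line out to stay quieter
)
printf '%s\n' "${recon_cmds[@]}"
```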
Analyze & Tools Section:
-----------------------
This option takes you to a new menu section where you can perform post-dorking activities. Here is a quick overview of the available options from this menu:

Analyze Bing Links File: analyzes ONLY the b-links.txt file, which is generated by a Bing search from the main menu.
Analyze Google Links File: analyzes ONLY the g-links.txt file, which is generated by a Google search from the main menu.
Analyze BOTH Google & Bing Links Files: combines the two files into a single file and then runs the analysis check on the combined file.
Analyze My Links File: lets you point the tool at your own links file. This can be any file you want, as long as it has a single link per line.

NOTE: The analyze options above all run an injection and regex check against the pages to look for common signs of injection vulnerabilities, and classify them accordingly. You can check the source for the regex if you want to add or remove anything from the lists used. Positive results are filtered into the results/ folder and classified based on how or what was found. Positive regex hits are clearly labeled as LFI or SQL, and the terminal output even includes the line where the match was found, which helps in determining what is garbage and what is real. The Possibles.results file holds pages where the injected page returned an 85% loss of text AND had a reduction in table rows (a decent indication that the injection caused a change, and therefore a potential vuln). It's not very accurate, but it's better to mark these for manual review than to omit them altogether. If you have a better method, please share it with me so I can improve this function.
Heads up: there is a small issue where back-to-back identified vulns can cause the next (third) site to be marked as vulnerable when it isn't. Further validation tests will weed these out, but be aware it's not 100% perfect.
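To illustrate the two kinds of checks described above, here is a hypothetical sketch (the helper names, regex list, and threshold are illustrative, not BinGoo's exact ones):

```shell
# Hypothetical sketch of the analysis checks (not BinGoo's real code).

# 1) Regex check: look for common SQL error signatures in a fetched page.
looks_like_sql_error() {
  printf '%s' "$1" | grep -Eqi 'sql syntax|mysql_fetch|ODBC|unclosed quotation'
}

# 2) "Possible" heuristic: flag a large drop in page size after injection.
#    (The real check also compares table-row counts before and after.)
possible_vuln() {
  local orig=$1 injected=$2
  [ $(( (orig - injected) * 100 / orig )) -ge 85 ]
}

page='You have an error in your SQL syntax near ...'
looks_like_sql_error "$page" && echo "SQL"
possible_vuln 10000 1200 && echo "Possible"
```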
Admin Finder:
------------
This is my homebrewed admin finder. It uses a small word list (plugins/admin.lst), judges the server response, and reports back accordingly. You can add to the list as much as you like. The source has been coded to emulate multi-threading so it can handle larger lists if/when needed, but the default list works pretty well.
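Judging "by server response" boils down to mapping HTTP status codes onto verdicts. A hypothetical sketch (the `classify_response` helper is made up; the curl invocation in the comment is a standard way to fetch just the status code):

```shell
# Hypothetical sketch of an admin-finder response check (not BinGoo's real code).
# 200 means the page exists; 401/403 usually means it exists but is protected.
classify_response() {
  case "$1" in
    200)      echo "FOUND" ;;
    301|302)  echo "REDIRECT" ;;
    401|403)  echo "PROTECTED (likely exists)" ;;
    *)        echo "not found" ;;
  esac
}

# The real check would fetch each candidate path, e.g.:
#   code=$(curl -s -o /dev/null -w '%{http_code}' "http://target/$path")
classify_response 200
```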
LFI Tools:
---------
This takes you to another sub-menu where you can choose to run my homebrewed validator script or my FIMAP wrapper to confirm LFI vulnerabilities. My tester doesn't work 100% and is a work in progress, but it's included in case someone wants to help me get it going a little better, and to show it can be done in bash.
The FIMAP wrapper works great! It lets you run a scan against a single site, against the default results/LFI.results file (generated after a successful analysis run), or against your own link file. This means you can process targets one at a time or in bulk!
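Conceptually, the wrapper just maps the menu choice onto fimap's single-target and mass-scan flags. A made-up sketch of that selection logic (the `pick_fimap_args` function is hypothetical; fimap's -u <url> and -m -l <file> flags are real):

```shell
# Hypothetical sketch of the wrapper's input selection (not BinGoo's real code).
pick_fimap_args() {
  case "$1" in
    single)  printf -- '-u %s' "$2" ;;                    # one URL
    default) printf -- '-m -l results/LFI.results' ;;     # analysis output
    custom)  printf -- '-m -l %s' "$2" ;;                 # user-supplied list
  esac
}

pick_fimap_args single 'http://site/page.php?file=x'
```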
SQLi Tools:
----------
Very similar to the LFI tools section. I have created a homebrewed column counter for verbose vulnerabilities. It works OK, but is still a work in progress. As a backup, I have built a SQLMAP wrapper script so you can pass links off to SQLMAP for validation with certainty. As with the LFI wrapper, you can run it against a single site, the default results/SQLi.results file from the analysis stage, or a custom file of your choosing with one link per line for bulk testing.
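For flavor, the classic column-counting trick behind a homebrewed counter works by raising the ORDER BY index until the page errors; the last working index is the column count. A hypothetical sketch (the helper and URL are made up; sqlmap's -u <url>, -m <file>, and --batch flags mentioned in the comment are real):

```shell
# Hypothetical sketch of ORDER BY probe generation (not BinGoo's real code).
# Serious validation is better left to sqlmap (-u <url> for one target,
# -m <file> for a list, --batch for non-interactive runs).
order_by_url() {
  printf '%s+ORDER+BY+%d--+\n' "$1" "$2"
}

for n in 1 2 3; do
  order_by_url 'http://site/page.php?id=1' "$n"
done
```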
./Tutorial Start
---------------
Once you have downloaded BinGoo, extract it and open up a terminal. You'll need to cd to the directory you extracted it to. For instance, I'd use this command in terminal:
cd /home/endo/Desktop/scripts/bingoo
Once you have changed directories to BinGoo, simply run this command (if you get a "Permission denied" error, make the script executable first with chmod +x bingoo):
./bingoo
You should now be looking at the BinGoo menu. You can then choose either Google or Bing; for this tutorial I'm going to use Google, so press 1 and hit Enter. It then gives you the option to use either a single dork or a mass dork list. I'm going to go with a dork list, so I will press 2, then Enter. It then asks you to provide the path to your dork list. The good thing about BinGoo is that it comes with quite a few dork lists: open the BinGoo folder and look in the dorks folder to find the list you want to use. I will use SQLi.lst, so my path will point at SQLi.lst inside the dorks folder.
Press Enter, and away it goes! It will go through whichever search engine you selected, looking for vulnerable websites with either the single dork you provided or the entire list of dorks, if you specified one. Once it's done scanning, you'll be prompted with a menu. Since I did the Google scan, I will use option 2; if you used Bing, press option 1, then Enter. If you ran scans with both engines, you can choose option 3 and it will analyze both your Google and Bing links.
Once you've selected your option, it will go through and check whether the links are vulnerable or not, and it will also list the type of vulnerability. Since I did SQLi, some results say [Possible Blind]; those links are not guaranteed to be vulnerable. The links with a regex match are the ones that are most likely exploitable.
Hey, there we go! Now we have links that BinGoo pulled for us to test. Another neat feature of BinGoo is that it saves the links in the results/ folder inside the BinGoo directory, so check there for the actual list of vulnerable links. That is it for BinGoo. There is a part two of this tutorial in the works, which will involve taking the links you got from BinGoo and running them through pwn3r.
Join us on IRC.
/server -ssl irc.anonops.com 6697
#LevelTwo
Twitter: _anonendo