Author Topic: "Hidden" website finder  (Read 3642 times)

0 Members and 1 Guest are viewing this topic.

Offline jyggorath

  • /dev/null
  • *
  • Posts: 8
  • Cookies: 2
    • View Profile
"Hidden" website finder
« on: October 22, 2013, 12:45:50 pm »
I came across this article on hackthissite: https://www.hackthissite.org/forums/viewtopic.php?f=104&t=10334&sid=d41da93c989495cdde82020036725157
In the article some guy claims to have written a Python script to scan the "deep web". All it does is constantly generate random IPs and attempt connections on port 80. I figured I could use this as a base for a script that actually digs up "hidden" websites.
[gist]anonymous/993898b8121a597f6ca0[/gist]
My script generates random IPs and does a reverse DNS lookup on them. If the lookup FAILS, it attempts a connection on port 80. If THAT succeeds, then there is most likely a website on this IP that doesn't have a domain (so it's sort of "hidden").
Now, I didn't think the script would grow this much when I started out, and the result is VERY messy code. But it works  ;D !
It includes functionality for scanning one single IP instead of constantly spamming them, logging the findings to a file, searching for keywords on the sites that turn up, and different degrees of verbosity.
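Roughly, the core loop boils down to this (simplified sketch; the actual script in the gist has the extra options mentioned above):
Code: [Select]
import random, socket

socket.setdefaulttimeout(1)                      # only helps connect(), see below

while True:
    ip = '.'.join(str(random.randint(1, 254)) for _ in range(4))
    try:
        socket.gethostbyaddr(ip)                 # reverse lookup succeeded -> it has a name, skip it
        continue
    except socket.error:
        pass                                     # no PTR record (or lookup failed) -> candidate "hidden" host
    s = socket.socket()
    try:
        s.connect((ip, 80))                      # something is listening on port 80
        print '[+] possible hidden website:', ip
    except socket.error:
        pass
    s.close()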


Finally, here's some things that I need help with:
I use socket.setdefaulttimeout() to get a one second timeout on the connection (for performance reasons). BUT, it turns out that socket.gethostbyaddr() doesn't care about that timeout. It only applies to socket.connect(). HOW can I set a timeout for gethostbyaddr??
Also, I am not satisfied with randomly generating IPs to scan. My goal is to fill a list with EVERY IP address from 1.1.1.2 to 254.255.255.254, then shuffle the list to randomize the scanning order. But I haven't found a smart way to fill the list yet.
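The naive version of what I mean would be something like this, but the full range is over four billion entries, so a plain list will never fit in memory:
Code: [Select]
import random
from itertools import product

# naive: build every dotted quad, then shuffle -- ~4.2 billion strings, far too much RAM
ips = ['%d.%d.%d.%d' % (a, b, c, d)
       for a, b, c, d in product(range(1, 255), range(256), range(256), range(256))]
random.shuffle(ips)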


Any and all improvements and suggestions are welcome.

Offline proxx

  • Avatarception
  • Global Moderator
  • Titan
  • *
  • Posts: 2803
  • Cookies: 256
  • ФФФ
    • View Profile
Re: "Hidden" website finder
« Reply #1 on: October 22, 2013, 01:02:51 pm »
Very interesting concept, I thought about this some time ago.
The code really hurts my eyes.
if something == True:
Ouch, that is just painful.
All the try: except: blocks will probably kill any performance that wasn't there to begin with.
This guy must have used this reference: http://sssslide.com/www.slideshare.net/pydanny/python-worst-practices
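For the record, the idiomatic form is simply this (the body is just a placeholder):
Code: [Select]
if something:        # never compare a boolean against True
    do_stuff()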
I'll do a rewrite sometime.
« Last Edit: October 22, 2013, 01:05:38 pm by proxx »
Wtf where you thinking with that signature? - Phage.
This was another little experiment *evillaughter - Proxx.
Evilception... - Phage

Offline jyggorath

  • /dev/null
  • *
  • Posts: 8
  • Cookies: 2
    • View Profile
Re: "Hidden" website finder
« Reply #2 on: October 22, 2013, 01:50:53 pm »
You are absolutely right, it hurts my eyes too. I'm just too lazy to fix all the ugly if conditions at the moment. In my defense, I have to have some try and except in there, because otherwise it will crash when connections fail. Both gethostbyaddr and connect throw exceptions when they fail, so... But I'm too fed up with it to rewrite today.
« Last Edit: October 22, 2013, 07:42:43 pm by Kulverstukas »

Offline proxx

  • Avatarception
  • Global Moderator
  • Titan
  • *
  • Posts: 2803
  • Cookies: 256
  • ФФФ
    • View Profile
Re: "Hidden" website finder
« Reply #3 on: October 22, 2013, 02:05:14 pm »
No need for praise, I'm as often wrong as right, and I'm not any sort of expert either.

I agree with you that try and except have their uses. Personally I use them as little as possible because they make debugging troublesome; a good bool test can replace them 80% of the time IMO.
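For example (just a sketch, not from the script above): socket objects have connect_ex(), which returns an error code instead of raising, so the port-80 probe becomes a plain comparison:
Code: [Select]
import socket

s = socket.socket()
s.settimeout(1.0)
# connect_ex() returns 0 on success and an errno value on failure,
# so the common "closed / timed out" case needs no try/except at all
if s.connect_ex(('203.0.113.5', 80)) == 0:   # 203.0.113.5 is just a placeholder IP
    print 'port 80 is open'
s.close()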


Wtf where you thinking with that signature? - Phage.
This was another little experiment *evillaughter - Proxx.
Evilception... - Phage

Offline Raavgo

  • Peasant
  • *
  • Posts: 88
  • Cookies: 12
  • On my way from a n00b to a PRO
    • View Profile
Re: "Hidden" website finder
« Reply #4 on: October 22, 2013, 02:14:29 pm »
That sure is interesting, but how are you searching the deep web with it?
Don't you need the Tor browser to access the deep web, or am I mistaken?

I like your idea of finding "hidden" websites on the www, but beware: there are things on the web you just DON'T want to see!

To everybody who wants to use this script:
Be careful, there are lots of perverts out there on the www, and I personally think these are exactly the people who want to hide their websites.

But that's just my opinion.
« Last Edit: October 22, 2013, 02:14:46 pm by Raavgo »

Offline Kulverstukas

  • Administrator
  • Zeus
  • *
  • Posts: 6627
  • Cookies: 542
  • Fascist dictator
    • View Profile
    • My blog
Re: "Hidden" website finder
« Reply #5 on: October 22, 2013, 03:08:34 pm »
This is actually useless unless you can make the tool check if the IP is associated with any domain. Otherwise you would get so many IPs that actually have a domain, it's ridiculous...

Offline proxx

  • Avatarception
  • Global Moderator
  • Titan
  • *
  • Posts: 2803
  • Cookies: 256
  • ФФФ
    • View Profile
Re: "Hidden" website finder
« Reply #6 on: October 22, 2013, 03:13:33 pm »
The description says it does a reverse lookup before doing anything else,
which basically covers the issue you described, Kulver.
« Last Edit: October 22, 2013, 03:13:44 pm by proxx »
Wtf where you thinking with that signature? - Phage.
This was another little experiment *evillaughter - Proxx.
Evilception... - Phage

Offline I_Learning_I

  • Knight
  • **
  • Posts: 267
  • Cookies: 26
  • Nor black or white, not even grey. What hat am I?
    • View Profile
    • Hacking F0r Fr33
Re: "Hidden" website finder
« Reply #7 on: October 22, 2013, 03:37:26 pm »
Quote
That sure is interesting, but how are you searching the deep web with it?
Don't you need the Tor browser to access the deep web, or am I mistaken?

No, the Deep Web is just any website that doesn't show up when you google it. That can easily be done by simply blocking the search engine bots (based on their user-agent).
Some people actually use the same user-agent as the Google crawler just so they can see information that normally requires registration (and is sometimes closed off).
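For instance (a sketch with a placeholder URL), spoofing the crawler user-agent is just a custom request header:
Code: [Select]
import requests

headers = {'User-Agent': 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'}
r = requests.get('http://example.com/members-only/', headers=headers)
print r.status_code, len(r.text)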

But now I'm the one with a doubt:
If this searches for IPs and checks whether they have a domain name, won't that rule out a lot of possible websites?
A server can host tons of websites with domain names that still won't show up in a Google search, isn't that right? Or do all registered domain names show up?
Also, if you can't access the contents of a website even though you can see the main page, it would still be considered DeepWeb. Like UndergroundHackers and CodeShock.
Thanks for reading,
I_Learning_I

Offline jyggorath

  • /dev/null
  • *
  • Posts: 8
  • Cookies: 2
    • View Profile
Re: "Hidden" website finder
« Reply #8 on: October 22, 2013, 05:23:01 pm »
Quote
That sure is interesting, but how are you searching the deep web with it? Don't you need the Tor browser to access the deep web, or am I mistaken?
The deep web is not defined as Tor sites; I'm not actually sure it is defined as anything at all. And I don't know if the term is even correct when referring to my script. But the idiot who wrote the script that gave me the idea called it the "deep web".
Quote
To everybody who wants to use this script: Be careful, there are lots of perverts out there on the www, and I personally think these are exactly the people who want to hide their websites.
I actually didn't think of that. Sorry guys, he's right, take caution! Maybe I should add functionality for excluding sites with specific keywords as well; that way you also get rid of all the sites that only say "It's working!".
Quote
If this searches for IPs and checks whether they have a domain name, won't that rule out a lot of possible websites? A server can host tons of websites with domain names that still won't show up in a Google search, isn't that right? Or do all registered domain names show up?
It's not perfect; every site that is hidden on a server that already has a domain will be discarded. If anyone has a good idea for solving this, please let me know.
« Last Edit: October 22, 2013, 05:28:08 pm by jyggorath »

Offline I_Learning_I

  • Knight
  • **
  • Posts: 267
  • Cookies: 26
  • Nor black or white, not even grey. What hat am I?
    • View Profile
    • Hacking F0r Fr33
Re: "Hidden" website finder
« Reply #9 on: October 22, 2013, 06:15:13 pm »
The simplest solution I can think of would be to do 2 different queries.
1) IP-only query, in which you discard everything that has DNS. Everything that doesn't give you a "can't reach" or "network timeout" goes into a list. Even so, that alone wouldn't matter much, because if you're running something illegal you will probably disable the regular ports and have some odd port running HTTPS/FTPS/SSH only. So an nmap scan with -sS against the IP would be the best option: if any port is open, the IP goes on the list. (I once had a problem where nmap wouldn't detect a port unless I specified the port range, so you might have to scan all ports.)

2) Query (with DNS)
I'm thinking: connect to the IP and do a reverse-DNS lookup to see if there is a domain name associated. If there is, instead of discarding the IP, do a port scan and then "bruteforce" the server by trying to access the IP on all ports. But that won't suffice, so to know whether it's DeepWeb you need to do a Google search and filter the results, as in site:domainname. If the results are few (<=2), all the same, or the content amounts to only two pages (the main page plus pages asking for a login), it is DeepWeb.

Problem: with this you still haven't found any of the hidden websites sitting on a server that also runs other websites.
To solve this you have to apply 2) in two ways:
implement it in 1), and also make a version of 2) that uses bruteforce.

Summary:
Every time you check an IP you would be checking whether it has a domain associated.

If it doesn't, it goes to the DeepWeb list (which isn't necessarily true either, but that can be fixed later by applying 2), i.e. checking the search engine results).

If it has Domain Name you can:

a) Check it (by checking the Google results for that search; a rough sketch is after this list)
     a1) You can/should keep a list of everything already checked, so you won't re-check it during the DNS bruteforce. That also makes the list slower to search and it can eat a lot of memory once it gets big, but it saves that many HTTP requests.
      (Advantage: you would get this website into the DeepWeb list right off the bat)

b) Ignore it
    b1) Later you bruteforce all possible DNS names anyway.
    (Advantage: simpler, safer code. Disadvantage: it will take longer and make many more HTTP requests.)

After that:
DNS Bruteforce in which you apply the method 2)
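A rough sketch of what that site: check could look like (hypothetical heuristic; in practice Google may well block or CAPTCHA automated queries like this):
Code: [Select]
import requests

def looks_like_deepweb(domain):
    # Hypothetical heuristic: if a site: query finds (almost) nothing indexed,
    # treat the domain as "deep web". The result-page parsing here is naive.
    r = requests.get('https://www.google.com/search',
                     params={'q': 'site:' + domain},
                     headers={'User-Agent': 'Mozilla/5.0'})
    return 'did not match any documents' in r.text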

Hmmmmmm
Halfway through my thinking I just remembered that this won't work, because you can have things like:
Code: [Select]
IP/asdshdjsadsalkdçlsanldksnadnsadnskajndjns-Website1/
google.com/youwillneverfindthisinamillionyearsbecauseits2big-Illegalshit1/
asdsadsadsadsadsadsadasdsadsadasd-W1.com/youwillneverfindthisinamillionyearsbecauseits2big-Illegalshit1/

This means you would have to bruteforce every IP for folders AND bruteforce DNS names AND bruteforce those DNS names for folders.

That's impossible: the number of requests is far too high (you would get your IP banned), it would raise suspicion with your ISP (not to mention government agencies) for so much nmap-like traffic, and even with 1 Gbit of bandwidth it would still take you zillions of years. It's a network-bound bruteforce (which is much slower), and you would have to do it for ALL IPs (I don't know if there are IPv6-only websites, but even v4 would be a pain...) and for all DNS names (max length 253 characters, with letters, digits and hyphens as valid characters, so something on the order of 37^253 possibilities, and feel free to ignore the dots).
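Just to put rough numbers on it (quick interpreter check; 37 = 26 letters + 10 digits + the hyphen):
Code: [Select]
>>> 2 ** 32                 # every possible IPv4 address
4294967296
>>> len(str(37 ** 253))     # number of digits in 37**253, a crude upper bound on hostnames
397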

That's why it's called the DeepWeb, I guess x)
« Last Edit: October 22, 2013, 07:46:02 pm by Kulverstukas »
Thanks for reading,
I_Learning_I

Offline jyggorath

  • /dev/null
  • *
  • Posts: 8
  • Cookies: 2
    • View Profile
Re: "Hidden" website finder
« Reply #10 on: October 22, 2013, 06:40:54 pm »
In that case I guess there would be no point in trying to make this a true "deep web" scanner... Your first solution, logging all IPs to a list and then doing a massive nmap scan, is good though  :D  But that means it's no longer fully automated...

Offline I_Learning_I

  • Knight
  • **
  • Posts: 267
  • Cookies: 26
  • Nor black or white, not even grey. What hat am I?
    • View Profile
    • Hacking F0r Fr33
Re: "Hidden" website finder
« Reply #11 on: October 22, 2013, 06:45:48 pm »
Quote
In that case I guess there would be no point in trying to make this a true "deep web" scanner... Your first solution, logging all IPs to a list and then doing a massive nmap scan, is good though  :D  But that means it's no longer fully automated...
Nmap-style scans are easy to reproduce yourself; the hard part is -sV, for which you need the whole service fingerprint database they ship, same as for -O.
But even so, you could distribute it as a .zip/.tar that bundles nmap.

But yes, the DeepWeb is unthinkably large and can't really be mapped (if it is done well).
There was a project about crawling it that involved automatically generating queries against search forms, or something like that (I didn't read much about it).

Read more about it here:
Code: [Select]
http://en.wikipedia.org/wiki/Deep_Web#Crawling_the_deep_Web
Thanks for reading,
I_Learning_I

Offline proxx

  • Avatarception
  • Global Moderator
  • Titan
  • *
  • Posts: 2803
  • Cookies: 256
  • ФФФ
    • View Profile
Re: "Hidden" website finder
« Reply #12 on: October 22, 2013, 07:08:40 pm »
Quote
In that case I guess there would be no point in trying to make this a true "deep web" scanner... Your first solution, logging all IPs to a list and then doing a massive nmap scan, is good though  :D  But that means it's no longer fully automated...

Sure you can automate nmap; there is, for example, an nmap module for Python.
Besides, it's pretty easy to create something along those lines yourself with raw sockets, or to build a simple SYN scanner.
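For example, with the python-nmap wrapper (a sketch; it assumes both the nmap binary and the python-nmap package are installed, and -sS needs root):
Code: [Select]
import nmap

scanner = nmap.PortScanner()
# SYN-scan port 80 on a single placeholder address
scanner.scan('203.0.113.5', '80', arguments='-sS')
for host in scanner.all_hosts():
    print host, scanner[host]['tcp'][80]['state']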

I would also suggest some multithreading, and keeping it really plain: no urllib or any of that bs.
Just make it spit out a list of responding addresses; you will probably be better off using something like curl or just a browser to view the final page.
It's a fun idea and I started some code myself; I think I've done this before but can't really remember where the F I put it.
You do a simple test to make sure it's not 192.x.x.x or 255.x.x.x or 0.x.x.x or 169.x.x.x.
I'm currently just using raw sockets, purely for speed.
You could try doing the lookups with something like scapy, or just do it manually all the way.
Python is not my language, I don't really understand all that high level shit :P
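A rough sketch of the multithreading part (not the script above; worker threads pull IPs from a queue so the slow timeouts overlap):
Code: [Select]
import socket, threading, Queue

def worker(q, results):
    while True:
        ip = q.get()
        s = socket.socket()
        s.settimeout(1.0)
        if s.connect_ex((ip, 80)) == 0:      # 0 means the TCP connect succeeded
            results.append(ip)
        s.close()
        q.task_done()

q = Queue.Queue()
results = []
for i in range(50):                          # 50 scanner threads
    t = threading.Thread(target=worker, args=(q, results))
    t.daemon = True
    t.start()

for ip in ['203.0.113.5', '198.51.100.7']:   # placeholder IPs; feed it whatever genIP() produces
    q.put(ip)
q.join()                                     # wait until every queued IP has been tested
print results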

« Last Edit: October 22, 2013, 07:16:18 pm by proxx »
Wtf where you thinking with that signature? - Phage.
This was another little experiment *evillaughter - Proxx.
Evilception... - Phage

Offline proxx

  • Avatarception
  • Global Moderator
  • Titan
  • *
  • Posts: 2803
  • Cookies: 256
  • ФФФ
    • View Profile
Re: "Hidden" website finder
« Reply #13 on: October 26, 2013, 12:50:42 pm »
I decided to write some code based on this idea.
It's kinda quick and dirty, but it runs pretty slick, pretty fast.
I didn't like socket.gethostbyaddr and used nslookup instead; the optimization of that tool is pretty much unbeatable.
The print statements scattered all around are ugly, gonna fix that for sure.
Just run it; any parameters are hardcoded for now, argparsing is such a pain in the ass.
I'm rotating between a handful of DNS servers ("DNS flux") to prevent excessive requests to any one server.


Code: [Select]
#! /usr/bin/env python2
#
#httpBrutev0.01a
#By proxx @ evilzone.org

## IMPORTS
import random,socket,os,time,requests

## ENV
socket.setdefaulttimeout(1.0)

## VARIABLES
amount = 0
has80 = []
starttime = time.time()
endtime = 0
timeout1 = 0.2
timeout2 = 0.8
DNSserver = ['208.67.222.222','8.8.8.8','156.154.70.1','198.153.192.1','4.2.2.1']


## TEMPSETTINGS
amount = 200



def timer(starttime):
    endtime = time.time()
    return str(endtime-starttime)+' Seconds'
   

def genIP():
    blocka = random.randrange(0,255)
    blockb = random.randrange(0,255)
    blockc = random.randrange(0,255)
    blockd = random.randrange(0,255)
    while blocka == 192 or blocka == 168 or blocka == 255 or blocka == 10 or blocka == 0:
        blocka = random.randrange(0,255)
    while blockc == 255:
        blockc = random.randrange(0,255)
    IP=str(blocka) +'.'+ str(blockb)+'.'+str(blockc) +'.'+str(blockd)

    return IP

def testIP(IP):
    # Try port 80, first with a short timeout, then retry once with a longer one.
    print '[*]Trying connection on port 80'
    host = IP
    port = 80
    timeouts = [timeout1, timeout2]

    for tr in range(len(timeouts)):
        # The timeout has to be set on the socket itself;
        # setdefaulttimeout() only affects sockets created afterwards.
        s = socket.socket()
        s.settimeout(timeouts[tr])

        if tr == 0:
            print '[*]Trying... ', IP
        else:
            print '[*]Retrying With Higher Timeout..'

        try:
            s.connect((host, port))
            s.close()
            return True
        except socket.error:
            print '[-]Failed (REASON: Timeout) On:', IP, ':80'
            s.close()

    # Both attempts failed
    return False




def dumpToFile(has80):
    file = 'OUTPUT' + str(time.time())
    fop = open(file,'w')
    fop.writelines(has80)
    fop.close()


def reverseDNS(IP):
    # Pick a random resolver so the lookups are spread over several DNS servers.
    DNSchoice = random.choice(DNSserver)
    print '[*]Checking for DNS record.. @',DNSchoice
    cmd = 'nslookup -timeout=1 ' + IP + ' ' + DNSchoice
    ret = os.popen(cmd).read()

    # True means "no PTR record found", so the IP is worth probing further.
    if "can't find" in ret or 'timed out' in ret:
        return True

    if 'Non-authoritative answer:' in ret:
        print '[+]DNS record = ', ret[100:180].replace('\n','')
    return False

       
def grabRaw(has80):
    # Fetch the first bytes of every responding host; short timeout so one slow host can't stall the run.
    for ip in has80:
        url = 'http://' + ip.strip()
        try:
            r = requests.get(url, timeout=3)
        except requests.exceptions.RequestException:
            print '[-]', url, 'did not answer the HTTP request'
            continue
        print '[+]', url, 'HTTP Status Code: ', r.status_code
        print '[+]', r.text[0:200]
       


#Exec
#




for i in range(0,amount):
    print '\n'
    IP = genIP()
    hasnodns = False
    print '[*]Generating IP..'
    print '[+]Generated: ' , IP


   

    if reverseDNS(IP):
        print '[+]No record, Moving on.'
        hasnodns = True
    else:
        print '[+]Has Record, Skipping. '
        hasnodns = False



    if hasnodns:
        if testIP(IP):
            print '[+]Has Service 80.'
            print '[*]Appending To Results.'
            IP = IP + '\n'
            has80.append(str(IP))
        else:
            print '[-]Giving up..'
       

if has80:
    dumpToFile(has80)


    print '\n'
    print '[+]',str(amount),'Addresses Tested'

if has80:
    print '[*]Generating Results..'
    print '[+]Results:'

    for entry in has80:
        print '\n[*] Webserver Running On: ',entry

    print ''
else:
    print '[-]No Results..'

if has80:
    grabRaw(has80)


print '\n[+]Execution took: ',timer(starttime)
print '[+]Tested: ', amount, 'Addresses.'
print '[+]Bailing out, bye bye.'

Sample output:
Code: [Select]
./httpBrutev0.01a.py


[*]Generating IP..
[+]Gnerated:  2.66.135.136
[*]Checking for DNS record.. @ 198.153.192.1
[+]No record, Moving on.
[*]Trying connection on port 80
[*]Trying...  2.66.135.136
[-]Failed (REASON: Timeout) On: 2.66.135.136 :80
[*]Retrying With Higher Timeout..
[-]Failed (REASON: Timeout) On: 2.66.135.136 :80
[-]Giving up..


[*]Generating IP..
[+]Gnerated:  108.45.19.88
[*]Checking for DNS record.. @ 4.2.2.1
[+]DNS record =  n-authoritative answer:1.2.2.4.in-addr.arpa    name = a.resolvers.level3.net.Aut
[+]Has Record, Skipping.


[*]Generating IP..
[+]Gnerated:  245.127.75.219
[*]Checking for DNS record.. @ 4.2.2.1
[+]DNS record =  Non-authoritative answer:1.2.2.4.in-addr.arpa    name = a.resolvers.level3.net.A
[+]Has Record, Skipping.


[*]Generating IP..
[+]Gnerated:  123.46.116.56
[*]Checking for DNS record.. @ 208.67.222.222
[+]DNS record =  on-authoritative answer:222.222.67.208.in-addr.arpa    name = resolver1.opendns.co
[+]Has Record, Skipping.
[-]No Results..

[+]Execution took:  1.87449288368 Seconds
[+]Tested:  4 Adresses.
[+]Bailing out, bye bye.


Here it actually found something:

Code: [Select]
[+] 200 Adresses Tested
[*]Generaing Results..
[+]Results:

[*] WebSerover Running On:  220.177.34.138

[+] http://220.177.34.138 HTTP Status Code:  200
[+] <html><head>
<title>Web user login</title>
<script language=javascript src="js/MulPlatAPI.js"></script>

<script language=javascript>

function onBodyLoad()
{
    var sUserLanguage = getBrowserLa

[+]Execution took:  37.0749008656 Seconds
[+]Tested:  200 Adresses.
[+]Bailing out, bye bye.



There is just one major flaw in the theory.
Back in the day, random (dynamic) IPs didn't have a name.
Nowadays many ISPs assign something, so those get filtered out, perhaps without justification.

I'll just let one run for a while and post any results :)
I need multithreading..

*Edit*
Quote
<html><head>
<title>403 Forbidden</title>
</head><body>
<h1>Forbidden</h1>
<p>You don't have permission to access /
on this server.</p>
<p>Additional


<!--
 -------------------------------------------------------------
                 !!DO NOT MODIFY THIS FILE!!

 Manual changes will be lost when this file is regenerated.

 Please rea


<html xmlns="http://www.w3.org/1999/xhtml">
<head>
    <meta http-equiv=
<meta HTTP-EQUIV="REFRESH" content="0; url=http://www.e-muge.lt">
 
</html>


<html><head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>ERROR: The requested


<html><head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>ERROR: The requested

Quote
207.59.130.10
147.76.241.129
59.58.17.179
210.56.101.68
67.88.121.250
124.153.242.4
201.174.101.77
76.61.26.232
169.198.203.204
40.136.25.75
76.86.180.13
70.41.145.18
107.12.13.239
78.233.129.33
201.229.218.29
88.245.85.100
96.243.139.62
99.123.234.77
207.70.159.46
114.221.225.254
216.92.65.199
176.28.68.106
195.26.78.132
145.217.135.227
91.135.248.217
132.215.47.85
114.143.56.168
12.162.231.85
60.218.127.83
218.155.150.220
92.115.68.227
197.37.28.114
63.123.142.162
70.41.54.118
24.9.86.239
151.237.16.231
120.238.173.45
212.52.39.83
27.130.219.46
82.223.226.71
128.165.220.241
158.227.36.164
78.179.52.58
223.7.231.31
197.231.244.71
198.240.83.110
92.44.122.70
205.155.151.218
46.137.88.89
160.17.20.194
5.34.45.160
130.55.24.131
110.85.159.151
216.169.109.210
188.76.124.29
76.61.37.194
178.77.80.122
64.17.145.227
78.83.194.127
193.239.191.133
85.124.41.126
23.79.220.7
212.193.229.252
87.237.210.225
108.14.178.46
23.78.165.211
23.75.161.32
131.239.55.225
50.118.116.56
78.4.116.202
219.3.38.179
160.26.157.51
23.74.148.227
93.81.226.30
178.139.181.92
174.104.51.69
23.50.144.156
64.113.107.45
38.99.15.246
133.242.207.163
88.238.25.7
123.149.188.94
202.152.128.194
38.110.50.116
86.49.82.143
176.113.96.199
130.47.50.134
50.76.31.17
180.245.207.80
69.164.74.64
222.90.70.57
141.211.144.56
71.18.248.42
41.41.35.117
85.41.16.227
76.20.90.3
158.227.211.176
58.147.158.241
184.28.246.24
120.151.24.34
220.173.181.156
142.11.243.98
189.235.251.122
83.44.53.72
39.45.208.188
198.27.64.197
210.247.172.95
195.11.16.133
188.128.146.230
199.30.156.34
217.118.128.48
108.189.28.240
87.221.211.127
59.91.126.28
210.155.149.2
83.59.236.63
81.174.234.105
95.84.219.161
89.245.153.121
71.69.53.173
180.253.250.178
222.112.46.160
166.154.190.137
184.26.62.126
82.58.154.224
88.198.43.165
216.147.168.75
41.37.58.56
24.195.229.160
1.4.133.237
201.184.3.50
54.242.251.224
46.211.61.106
223.6.57.141
1.245.154.21
69.197.73.9
63.116.192.130
54.230.127.222
140.237.167.208
199.83.128.213
187.134.215.125
115.9.71.41
190.251.56.117
38.75.24.155
62.7.72.242
189.175.10.32
201.239.164.38
62.176.124.250
77.88.73.194
176.31.184.209
190.212.168.218
76.141.132.145
208.113.208.18
172.232.91.112
118.11.253.79
142.4.190.40
95.132.248.74
65.188.78.42
77.55.100.30
121.58.232.250
12.230.88.145
125.164.90.209
72.44.206.55
85.133.163.109
163.239.77.214
23.44.214.99
67.48.195.235
64.111.120.3
23.13.118.79
87.47.36.49
216.218.162.220
184.28.254.253
128.165.135.65
108.58.148.81
209.160.4.28
23.13.188.32
165.219.23.6
186.160.147.147
41.225.144.214
190.233.230.109
64.246.117.203
23.43.90.229
151.237.84.56
97.74.132.172
186.116.229.118

Mostly just modems that are open to the web, some weird stuff about granny porn.
Some weird things, kinda funny.
« Last Edit: October 26, 2013, 05:25:00 pm by proxx »
Wtf where you thinking with that signature? - Phage.
This was another little experiment *evillaughter - Proxx.
Evilception... - Phage

Offline jyggorath

  • /dev/null
  • *
  • Posts: 8
  • Cookies: 2
    • View Profile
Re: "Hidden" website finder
« Reply #14 on: October 28, 2013, 12:01:25 pm »
Very nice indeed! I had not considered that solution; I try to avoid calling system commands from Python as much as I can, but I admit it might be a good fit here. Nice work  :D
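For what it's worth, here is a sketch of how the reverse lookup could be done without shelling out, assuming the third-party dnspython package (its resolver takes a real timeout, unlike gethostbyaddr):
Code: [Select]
import dns.resolver, dns.reversename, dns.exception

resolver = dns.resolver.Resolver()
resolver.nameservers = ['8.8.8.8']
resolver.lifetime = 1.0                                  # hard per-query timeout
try:
    name = dns.reversename.from_address('203.0.113.5')   # placeholder IP
    answer = resolver.query(name, 'PTR')
    print '[+]DNS record =', str(answer[0])
except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer, dns.exception.Timeout):
    print '[+]No record, moving on.'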