You could brute-force, but as mentioned, that generates a lot of noise and is likely to alert the admin or trip an IDS.
If the links appear anywhere on the website, you could run a spider over it, pull all the <a> tags, and collect the URLs.
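A minimal sketch of that link-pulling step, using only the Python standard library (the sample HTML and example.com URL are just illustrative -- for a live crawl you'd fetch each page first, e.g. with urllib.request.urlopen):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href target of every <a> tag seen while parsing."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative paths against the page's URL
                    self.links.add(urljoin(self.base_url, value))

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

# Offline demonstration with a hard-coded page body:
sample = '<a href="/admin/">admin</a> <a href="docs/index.html">docs</a>'
print(sorted(extract_links(sample, "http://example.com/")))
# → ['http://example.com/admin/', 'http://example.com/docs/index.html']
```

A real spider would loop: fetch a page, extract its links, queue any same-domain URLs it hasn't visited yet, and repeat.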
An old-school way to find hidden/disallowed directories is to check robots.txt, but that won't include direct file links, and it only works if the admin wants those directories excluded from search engines.
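Checking robots.txt is just fetching one file and reading the Disallow lines. A quick sketch (the sample body is made up; live, you'd fetch http://target/robots.txt with urllib.request.urlopen and decode it):

```python
def disallowed_paths(robots_txt):
    """Pull every non-empty Disallow: path out of a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:                            # 'Disallow:' alone allows everything
                paths.append(path)
    return paths

sample = "User-agent: *\nDisallow: /admin/\nDisallow: /backup/\nDisallow:\n"
print(disallowed_paths(sample))
# → ['/admin/', '/backup/']
```

Each path that comes back is a directory the admin explicitly didn't want indexed, which is exactly why it's worth a look.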
You could craft some Google dorks, or write a quick script that searches through Google, unless the admin blocks search engines from indexing those directories. Then again, the site would have to have been crawled/cached by Google beforehand.
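Building the dork queries themselves is trivial; a sketch below (the keyword list is just an example set, and note that automating actual Google searches is rate-limited and against their terms, so these are meant to be pasted into a browser):

```python
def build_dorks(domain, keywords=("admin", "backup", "login", "config")):
    """Assemble site-restricted queries for hunting interesting paths."""
    return [f"site:{domain} inurl:{kw}" for kw in keywords]

for query in build_dorks("example.com"):
    print(query)
# → site:example.com inurl:admin
#   site:example.com inurl:backup
#   ...
```

Other useful operators for this are intitle:"index of" (open directory listings) and filetype: for stray .sql or .bak files.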