# This is the A9.com robots.txt file. The only directories that should be crawled and
# indexed are the a9 homepage (a9.com/) and the company directories (a9.com/-/*)

User-Agent: *
Disallow: /a
Disallow: /b
Disallow: /c
Disallow: /d
Disallow: /e
Disallow: /f
Disallow: /g
Disallow: /h
Disallow: /i
Disallow: /j
Disallow: /k
Disallow: /l
Disallow: /m
Disallow: /n
Disallow: /o
Disallow: /p
Disallow: /q
Disallow: /r
Disallow: /s
Disallow: /t
Disallow: /u
Disallow: /v
Disallow: /w
Disallow: /x
Disallow: /y
Disallow: /z
Disallow: /A
Disallow: /B
Disallow: /C
Disallow: /D
Disallow: /E
Disallow: /F
Disallow: /G
Disallow: /H
Disallow: /I
Disallow: /J
Disallow: /K
Disallow: /L
Disallow: /M
Disallow: /N
Disallow: /O
Disallow: /P
Disallow: /Q
Disallow: /R
Disallow: /S
Disallow: /T
Disallow: /U
Disallow: /V
Disallow: /W
Disallow: /X
Disallow: /Y
Disallow: /Z
Disallow: /1
Disallow: /2
Disallow: /3
Disallow: /4
Disallow: /5
Disallow: /6
Disallow: /7
Disallow: /8
Disallow: /9
Disallow: /0
Disallow: /%
Disallow: /&
Disallow: /?
Disallow: /#