## Exploit

Reference: [OWASP WSTG: Conduct Search Engine Discovery Reconnaissance for Information Leakage](https://owasp.org/www-project-web-security-testing-guide/stable/4-Web_Application_Security_Testing/01-Information_Gathering/01-Conduct_Search_Engine_Discovery_Reconnaissance_for_Information_Leakage)

1. Explore the public `robots.txt` file:

   ```bash
   ❯ curl http://10.0.2.15/robots.txt
   User-agent: *
   Disallow: /whatever
   Disallow: /.hidden
   ```

2. Explore the links under `http://10.0.2.15/.hidden` with the `crawl.bash` script to find the flag.

## Fix

Same as **Information gathering 1**: do not put sensitive data in `robots.txt`. The file is publicly readable, so every `Disallow` entry points attackers directly at the content the site is trying to hide.
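The `crawl.bash` script referenced in the exploit steps is not included in this write-up. A minimal sketch of such a recursive crawler might look like the following; it assumes the server exposes HTML directory listings with `href="…"` links, and the `flag{...}` pattern is a placeholder since the actual flag format is not shown:

```bash
#!/usr/bin/env bash
# Hypothetical sketch of a crawl.bash-style crawler: walk an HTTP
# directory listing recursively and grep every file for a flag.
set -u

extract_links() {
  # Pull the href="..." values out of an HTML directory listing.
  grep -oE 'href="[^"]+"' | cut -d'"' -f2
}

crawl() {
  local url="$1" page link
  page="$(curl -s "$url")"
  while read -r link; do
    case "$link" in
      ../*|/*) continue ;;               # skip parent/absolute links
      */)      crawl "${url}${link}" ;;  # recurse into subdirectories
      *)       curl -s "${url}${link}" \
                 | grep -oE 'flag\{[^}]+\}' \
                 && echo "  ^ found at ${url}${link}" ;;
    esac
  done < <(extract_links <<<"$page")
}

# Run against the lab host from the write-up, e.g.:
#   ./crawl.bash http://10.0.2.15/.hidden/
if [ "$#" -ge 1 ]; then
  crawl "$1"
fi
```

The trailing slash check on each link is what distinguishes subdirectories (recurse) from files (fetch and grep) in a typical autoindex page.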