author	Thomas Vanbesien <tvanbesi@proton.me>	2026-03-26 22:13:08 +0100
committer	Thomas Vanbesien <tvanbesi@proton.me>	2026-03-26 22:13:08 +0100
commit	cf5cc6e1db519ef7bd1d786656027a64c208d8b9 (patch)
tree	b524668499e16b919118ef21539e6acfbb4884d8 /Information gathering 1
parent	0df2b018e730f32915012ee466db1953a8b84cd3 (diff)
Add information gathering via robots.txt solution
Diffstat (limited to 'Information gathering 1')
-rw-r--r--	Information gathering 1/Resources/htpasswd	1
-rw-r--r--	Information gathering 1/Resources/notes.md	19
-rw-r--r--	Information gathering 1/flag	1
3 files changed, 21 insertions, 0 deletions
diff --git a/Information gathering 1/Resources/htpasswd b/Information gathering 1/Resources/htpasswd
new file mode 100644
index 0000000..9e167a8
--- /dev/null
+++ b/Information gathering 1/Resources/htpasswd
@@ -0,0 +1 @@
+root:437394baff5aa33daa618be47b75cb49
diff --git a/Information gathering 1/Resources/notes.md b/Information gathering 1/Resources/notes.md
new file mode 100644
index 0000000..8d2ab06
--- /dev/null
+++ b/Information gathering 1/Resources/notes.md
@@ -0,0 +1,19 @@
+## Exploit
+
+[OWASP WSTG: Conduct Search Engine Discovery Reconnaissance for Information Leakage](https://owasp.org/www-project-web-security-testing-guide/stable/4-Web_Application_Security_Testing/01-Information_Gathering/01-Conduct_Search_Engine_Discovery_Reconnaissance_for_Information_Leakage)
+
+1. Explore public `robots.txt`
+ ```bash
+ ❯ curl http://10.0.2.15/robots.txt
+ User-agent: *
+ Disallow: /whatever
+ Disallow: /.hidden
+ ```
+1. Found an MD5 hash for user `root` at `http://10.0.2.15/whatever/htpasswd`
+1. Used [this website](https://md5.gromweb.com/) to reverse-look up the MD5 hash and recover `qwerty123@`
+1. Found an admin interface at `http://10.0.2.15/admin` by enumerating some common application admin paths
+1. Logged in to the admin interface with these credentials to find the flag
+
+## Fix
+
+The purpose of `robots.txt` is to mark files and directories that search engine crawlers should not index. However, anything listed there is publicly readable, so it should not point to sensitive data. Such resources must instead be stored outside the web root and thus not be mentioned in `robots.txt` at all.
diff --git a/Information gathering 1/flag b/Information gathering 1/flag
new file mode 100644
index 0000000..5e2459c
--- /dev/null
+++ b/Information gathering 1/flag
@@ -0,0 +1 @@
+d19b4823e0d5600ceed56d5e896ef328d7a2b9e7ac7e80f4fcdb9b10bcb3e7ff
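The `robots.txt` recon step from the notes above can be sketched as a small script. This is an illustrative sketch, not part of the commit: the helper name `parse_disallow` and the sample `robots.txt` body (mirroring the target `10.0.2.15`) are assumptions.

```python
# Hypothetical sketch: extract the Disallow paths that robots.txt
# accidentally advertises (the recon step documented in notes.md).

def parse_disallow(robots_txt: str) -> list[str]:
    """Return the paths listed in Disallow directives of a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments and whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow value means "allow everything"
                paths.append(path)
    return paths

# Sample body matching what `curl http://10.0.2.15/robots.txt` returned above.
robots = """\
User-agent: *
Disallow: /whatever
Disallow: /.hidden
"""
print(parse_disallow(robots))  # ['/whatever', '/.hidden']
```

Each returned path can then be probed with `curl` (as in the notes) to see whether the "hidden" directory is actually browsable.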