Introduction
As a seasoned red teamer or penetration tester, one of your primary objectives is to gain access to a target’s sensitive data. If your target uses a MySQL database, you need to know how to get that data onto your own system, and sometimes the tools you use to analyze it only work with Postgres. In this article, we’ll show you how to use Docker, mysqldump, and pgloader to pull data out of a MySQL database and load it into a PostgreSQL database on your system.
Docker
Docker is a platform for running applications in containers. A container is a lightweight, standalone executable package with everything needed to run an application, including code, libraries, and system tools. Docker allows you to create and manage containers, making deploying and running applications easy.
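For example, with Docker installed, a single command runs the nginx web server in a container without installing nginx on your machine (the host port 8080 here is an arbitrary choice):
docker run --rm -p 8080:80 nginx
Visiting http://localhost:8080 then serves the default nginx page, and stopping the command removes the container thanks to the --rm flag.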
MySQL
MySQL is a popular open-source relational database management system (RDBMS) that is widely used for web applications and known for its high performance, reliability, and scalability.
Mysqldump
Mysqldump is a command-line utility that comes with MySQL. It allows you to create backups of MySQL databases in SQL format. Mysqldump is helpful for transferring data from a MySQL database to another system.
Pgloader
Pgloader is a tool for loading data from various sources, including MySQL, into PostgreSQL databases. It can handle large data sets and is highly configurable, making it a popular choice for data migrations.
Prerequisites
Before transferring data from a MySQL database to your system using Docker, mysqldump, and pgloader, you’ll need to ensure that you have the necessary software and tools installed on your local machine. Here are the prerequisites you need to meet:
Docker: Docker is an essential component of this process. Ensure that you have Docker installed on your local machine. You can install Docker on your system by following the instructions provided on the Docker website: https://docs.docker.com/engine/install/
MySQL Client: To interact with the MySQL database, you’ll need a MySQL client installed on your system. You can install the MySQL client using the package manager provided by your operating system.
On Ubuntu or Debian, you can install the client with the following command:
sudo apt-get update
sudo apt-get install mysql-client
PostgreSQL Client: Similarly, you’ll need a PostgreSQL client installed on your system to interact with the PostgreSQL database. You can install the PostgreSQL client using the package manager provided by your operating system.
On Ubuntu or Debian, you can install the client with the following command:
sudo apt-get update
sudo apt-get install postgresql-client
Mysqldump: Mysqldump is a utility for exporting data from MySQL databases. Ensure that you have mysqldump installed on your local machine. If it’s not already installed, you can install it using the package manager provided by your operating system.
On Ubuntu or Debian, you can install mysqldump with the following command:
sudo apt-get update
sudo apt-get install mysql-client-core-8.0
Pgloader: Pgloader is a utility for loading data from various sources into PostgreSQL. Ensure that you have pgloader installed on your local machine. If it’s not already installed, you can install it using the package manager provided by your operating system.
On Ubuntu or Debian, you can install pgloader with the following command:
sudo apt-get update
sudo apt-get install pgloader
Text Editor: You’ll need a text editor to create and modify files, including SQL scripts. You can use any text editor of your choice. Some popular text editors include VS Code, Sublime Text, and Atom. I prefer VS Code, but you can use any text editor you like, as long as it doesn’t automatically reformat your files.
By meeting these prerequisites, you’ll be well-prepared to follow along with the process of transferring data from a MySQL database to your system using Docker, mysqldump, and pgloader. Ensure that each prerequisite is installed and working correctly before starting the process.
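As a quick sanity check, each of these tools prints its version when installed correctly; the exact versions will differ from system to system:
docker --version
mysql --version
mysqldump --version
psql --version
pgloader --version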
Process
Overview
Before we dive into the details, here is a high-level view of the process. The steps involved in transferring data from a MySQL database to your system using Docker, mysqldump, and pgloader are as follows:
- Set up a MySQL container in Docker and create a test database and table.
- Export the data from the MySQL database using mysqldump and save it to a dump file.
- Set up a PostgreSQL container in Docker and create a test database and table.
- Copy the dump file from the MySQL container to the PostgreSQL container.
- Use pgloader to load the data from the MySQL database into the PostgreSQL database.
- Connect to the PostgreSQL database from your local system using a PostgreSQL client.
- Verify that the data was loaded correctly.
For this exercise, we’ll create a test database in the MySQL container, but in a real-world scenario, you would likely be working with an existing database.
Step 1: Set up a MySQL container
First, we need to set up a MySQL container in Docker. We can do this by running the following command:
docker run --name mysql-container -e MYSQL_ROOT_PASSWORD=<password> -p 3306:3306 -d mysql
This command creates a container named mysql-container and sets the root password to <password>. The -p 3306:3306 flag publishes the MySQL port to your local machine so the client tools you installed earlier can reach it, and the -d flag tells Docker to run the container in the background.
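The MySQL image takes a few seconds to initialize on its first run, so it’s worth confirming the container is up and the server is ready before connecting:
# list the running container
docker ps --filter name=mysql-container
# wait until the log output shows "ready for connections"
docker logs mysql-container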
Step 2: Connect to the MySQL container
Next, we need to connect to the MySQL container. We can do this by running the following command:
docker exec -it mysql-container mysql -p
This command tells Docker to execute a command in the mysql-container container. The -it flags specify that we want an interactive terminal session, and mysql -p starts the MySQL command-line client inside the container, prompting us for the root password.
Step 3: Create a test database and table
Now that we’re connected to the MySQL container, let’s create a test database and table. We can do this by running the following SQL commands:
CREATE DATABASE test;
USE test;
CREATE TABLE users (id INT, name VARCHAR(255));
INSERT INTO users VALUES (1, 'Alice');
INSERT INTO users VALUES (2, 'Bob');
These commands create a database named test and a table called users with two rows.
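Before exporting anything, it’s worth double-checking that the rows are really there. You can run SELECT * FROM users; in the same MySQL session, or fire a one-off query from your local shell; passing the password inline is acceptable here only because this is a disposable test container:
docker exec mysql-container mysql -uroot -p'<password>' -e 'SELECT * FROM test.users;'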
Step 4: Export the data using mysqldump
Now that we have some data in our MySQL database, let’s export it using mysqldump. We can do this by running the following command:
mysqldump -h 127.0.0.1 -P 3306 -u root -p test > dump.sql
This command tells mysqldump to connect to the MySQL server published on 127.0.0.1:3306 and export the test database to a file named dump.sql. The -u flag specifies the username, and the -p flag tells mysqldump to prompt us for the root password.
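It’s a good idea to glance at the dump before moving it around. The file is plain SQL, so you should see a CREATE TABLE statement and INSERT statements for the users table:
# show the start of the dump
head -n 40 dump.sql
# count the INSERT statements
grep -c 'INSERT INTO' dump.sql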
Step 5: Set up a PostgreSQL container
Next, we need to set up a PostgreSQL container in Docker. We can do this by running the following command:
docker run --name postgres-container -e POSTGRES_PASSWORD=<password> -p 5432:5432 -d postgres
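The flags mirror the MySQL setup: the command creates a container named postgres-container, sets the password for the default postgres superuser, publishes port 5432 to your local machine, and runs the container in the background. As before, you can confirm the container is up before continuing:
docker ps --filter name=postgres-container
docker logs postgres-container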
Step 6: Copy the dump file to the PostgreSQL container
Now that we have exported the data from the MySQL database, let’s copy the dump file into the PostgreSQL container so that a copy of the exported data travels with the target database. Note that pgloader will read directly from the live MySQL database in step 9, so this dump serves mainly as a portable backup of what we exported. We can do this by running the following command:
docker cp dump.sql postgres-container:/dump.sql
This command tells Docker to copy the dump.sql file to the /dump.sql path inside the postgres-container container.
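A quick listing inside the container confirms that the copy landed where we expect:
docker exec postgres-container ls -l /dump.sql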
Step 7: Connect to the PostgreSQL container
Next, we need to connect to the PostgreSQL container. We can do this by running the following command:
docker exec -it postgres-container psql -U postgres
This command tells Docker to execute a command in the postgres-container container. The -it flags specify that we want an interactive terminal session, and psql -U postgres starts the PostgreSQL command-line client connected as the postgres user.
Step 8: Create a test database and table
Now that we’re connected to the PostgreSQL container, let’s create a test database and table. We can do this by running the following SQL commands:
CREATE DATABASE test;
\c test
CREATE TABLE users (id INT, name VARCHAR(255));
These commands create a database named test and a table called users with no rows.
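You can double-check the empty table from your local shell as well; this assumes the container name and database used above:
docker exec postgres-container psql -U postgres -d test -c '\dt'
docker exec postgres-container psql -U postgres -d test -c 'SELECT COUNT(*) FROM users;'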
Step 9: Load the data using pgloader
Now that we have a database and table in PostgreSQL, let’s load the data from the MySQL database using pgloader. We can do this by running the following command:
pgloader mysql://root:<password>@127.0.0.1:3306/test postgresql://postgres:<password>@127.0.0.1:5432/test
This command tells pgloader to connect to the MySQL database published by the mysql-container container, read the schema and data of the test database, and copy them into the test database running in the postgres-container container. The usernames and passwords are embedded in the two connection strings, so pgloader will not prompt for them.
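This assumes you installed pgloader locally and published both database ports in steps 1 and 5, so that 127.0.0.1 reaches each container. If you’d rather not install pgloader at all, an alternative sketch is to run it from its Docker image on a user-defined network shared by both containers; the network name migrate-net is arbitrary, and dimitri/pgloader is the image published on Docker Hub by pgloader’s author:
# attach both database containers to a shared network so their names resolve
docker network create migrate-net
docker network connect migrate-net mysql-container
docker network connect migrate-net postgres-container
# run pgloader from its image on that network
docker run --rm --network migrate-net dimitri/pgloader:latest pgloader mysql://root:<password>@mysql-container/test postgresql://postgres:<password>@postgres-container/test
One caveat: depending on the pgloader build, authentication against MySQL 8’s default caching_sha2_password plugin can fail; if the MySQL connection is refused, a common workaround is to create a migration user that uses mysql_native_password.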
Step 10: Verify the data
Finally, let’s verify that the data was loaded correctly. From the psql session we opened in step 7, while connected to the test database, run the following SQL command:
SELECT * FROM users;
This command should return the following output:
id | name
----+------
1 | Alice
2 | Bob
(2 rows)
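If the query comes back empty, it is often because pgloader created the migrated table in a schema named after the source MySQL database (test) rather than in public, where the table from step 8 lives. In that case, query the schema-qualified name, for example from your local shell:
docker exec postgres-container psql -U postgres -d test -c 'SELECT * FROM test.users;'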
Conclusion
This article showed you how to move data from a MySQL database into a PostgreSQL database on your system using Docker, mysqldump, and pgloader. We started by setting up a MySQL container in Docker, creating a test database and table, and exporting the data using mysqldump. We then set up a PostgreSQL container in Docker, created a test database and table, and loaded the data using pgloader. Finally, we verified that the data was loaded correctly.
These tools and techniques are essential for red teamers and penetration testers who must transfer data from a target’s MySQL database to their system. With these tools and techniques, you can quickly and easily transfer data and begin analyzing it for vulnerabilities and weaknesses.
This is also extremely useful for developers who need to migrate data from a MySQL database to a PostgreSQL database. With these tools and techniques, you can quickly and easily transfer the data and begin working with it.