How to back up MySQL databases with the Unix console?

Backing up a MySQL database via an interface like phpMyAdmin is very simple. If you manage your database through a Unix console, you can just as easily create a backup using the following command:

mysqldump -u root -p --all-databases > dump.sql

If you changed the default database user when setting up the database, replace root with that user. Otherwise, root works as is.
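To play the dump back in later, the counterpart reads the file into the mysql client. A short sketch, assuming the same root user and a dump.sql in the current directory:

```
# Restore all databases from the dump created above;
# mysql prompts for the password because of -p.
mysql -u root -p < dump.sql
```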

After the backup file has been created in the folder where you executed mysqldump, you can copy it to a local machine. To do so, first quit the server connection, then execute the following command locally:

scp remote_user@remote_host:/home/dump.sql /Users/local_user

Update: Since I was asked how to back up the file data of a website as well: the best way is to create a compressed archive of your website folder by executing

tar zcvf backup.tar.gz my_website_folder/

and then download the compressed archive as described above.
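Put together, the archiving step looks like this. A small sketch, where my_website_folder is just a placeholder for your site's directory (created here with a dummy file so the commands run as-is):

```shell
# Stand-in website folder with one file, purely for illustration.
mkdir -p my_website_folder
echo "<html></html>" > my_website_folder/index.html

# Create the compressed archive of the whole folder.
tar zcvf backup.tar.gz my_website_folder/

# Optional sanity check: list the archive contents without extracting.
tar ztf backup.tar.gz
```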

MySQL: Fulltext Search – An Example

A fulltext index speeds up token searches noticeably. With a simple LIKE implementation (i.e. token = %mytoken%), every database row has to be touched in order to find the token(s). That is no big deal on a small database, but on a big database holding several million rows it can cause serious performance problems. Fulltext search is the solution for such cases: if it is enabled, all fulltext-enabled columns of an inserted entry are indexed, and a search on an indexed column touches only the index, which is much faster.

This short example shows how to create a fulltext index in a MySQL database:

CREATE TABLE blogentry (
  entryID INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
  title VARCHAR(255),
  message TEXT,
  FULLTEXT (title, message)
);

In the above example we define that the columns title and message are indexed on every new entry insertion. Of course, you can also index just one column or more than two. But the more columns you index, the longer a fulltext search takes, so try to find an appropriate balance.
A fulltext search is then conducted with the MATCH() ... AGAINST() construct.

SELECT title
FROM blogentry
WHERE MATCH (title, message) AGAINST ('Java Apache Lucene');
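By default MATCH() ... AGAINST() runs in natural language mode; MySQL also offers a boolean mode with +/- operators for required and excluded tokens. A sketch against the table above (the search terms are just examples):

```
-- Rows must contain 'Java', rank higher with 'Lucene',
-- and must not contain 'PHP'.
SELECT title
FROM blogentry
WHERE MATCH (title, message) AGAINST ('+Java Lucene -PHP' IN BOOLEAN MODE);
```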

How to import and export a CSV file to and from Postgres

You can easily import and export a comma-separated values (CSV) file using either the command line or the pgAdmin SQL interface. The commands differ slightly between the two approaches.

Import

Using the pgAdmin SQL interface, you need to prefix the table name with public:

copy public."TableName" from '/data/my_csv_file.csv' using delimiters ';'

Using the command line, you do not need the public prefix:

copy TableName from '/data/my_csv_file.csv' using delimiters ';'
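The commands above expect a semicolon-delimited file, one row per line. A sample my_csv_file.csv (the columns are purely illustrative and must match the table definition) might look like this:

```
1;Alice;alice@example.com
2;Bob;bob@example.com
```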

Export

Much the same difference applies to the export commands.

Pgadmin SQL interface:

\copy public."TableName" to '/data/my_csv_file.csv' delimiters ';'

Command line:

\copy TableName to '/data/my_csv_file.csv' delimiters ';'

Error: “Kein Server angegeben!” in PHPPgAdmin

Anyone who tries to load larger dumps or SQL files through the PHP front end of the PgAdmin interface is often greeted by the rather unhelpful error message

Kein Server angegeben!

If you use PHPPgAdmin in English, it appears as

No Server supplied!

Behind this is not a Postgres problem but a PHP one. The error message actually means that the file is larger than the maximum upload size currently allowed for PHP, which is only 2 MB by default. To change it, two parameters have to be adjusted in the file php.ini. On Unix systems the file can easily be found from the console with locate php.ini. In that file, the parameters

upload_max_filesize

und

post_max_size

must be adjusted so that they are larger than the file to be imported.
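For example, the relevant php.ini entries could look like this (the 64M values are illustrative; note that post_max_size should be at least as large as upload_max_filesize):

```
; Both values must exceed the size of the dump being uploaded.
upload_max_filesize = 64M
post_max_size = 64M
```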

After that, reading in the dump works as well.