Tutorial - Introduction to Indexes
Database indexes work much like book indexes: both are used to speed up the retrieval of information.
A book index tells you the page or pages where a particular word appears, so you don't have to read the whole book to find it.
A database index provides the same shortcut to a row of data, so the database doesn't have to search
through the whole table to find the row it wants.
Throughout this tutorial, we will look at when to use indexes and how to set them up properly.
Creating indexes on tables is easy; they are created after the table itself. There are also different types of indexes, as we
will see. We will only cover the basic data types (text, numeric and binary), but the principles apply to any sort of data you want to
store and retrieve.
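As a quick preview of what index creation looks like, the general shape of the command is shown below. The table and column names here are placeholders for illustration only, not part of the exercise that follows:

```sql
-- Basic index on one column (PostgreSQL's default index type is a B-tree).
CREATE INDEX some_index_name ON some_table (some_column);

-- A UNIQUE index additionally enforces that no two rows share a value.
CREATE UNIQUE INDEX some_unique_idx ON some_table (another_column);
```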
Let's create the new database for this exercise now:
pgsql@elrond ~$ createdb indextest
To create and populate the tables, we will use a simple Perl script, since we need a lot of data to make sure the indexes are
actually used and to make the benefits easier to see.
#!/usr/bin/perl
# Generate import.sql: a table definition plus 5 million rows of random data.
$count = 1;
$outputfile = 'import.sql';
@chars = ('A' .. 'Z', 'a' .. 'z', 0 .. 9);
@numbers = (0 .. 9);
@single_chars = ('a' .. 'e');
$totalrecords = 5000000; # 5 million

open(OUTPUT, "> $outputfile") or die "Can't open $outputfile: $!";

print OUTPUT "CREATE TABLE indextest (\n";
print OUTPUT "id INT, name TEXT, number INT, letter CHAR(1)\n";
print OUTPUT ");\n";
print OUTPUT "COPY indextest (id,name,number,letter) FROM stdin;\n";

while ($count <= $totalrecords) {
    # Random 8-character name, 8-digit number and a single letter a-e.
    $randstring = join('', @chars[map { int(rand(@chars)) } (1 .. 8)]);
    $randnum    = join('', @numbers[map { int(rand(@numbers)) } (1 .. 8)]);
    $randletter = $single_chars[int(rand(@single_chars))];
    print OUTPUT $count . "\t" . $randstring . "\t" . $randnum . "\t" . $randletter . "\n";
    $count++;
}
print OUTPUT "\\.\n";
print OUTPUT "VACUUM ANALYZE indextest;\n";
close(OUTPUT);
This will create a file, import.sql, containing a table definition and 5 million rows of data ready to import, similar to this:
l7S692Zy 57812383 c
YR4d8IS2 38591827 e
tiQrAR53 31930296 a
pN4JWLdo 96351579 e
ku5LvEBR 33878365 b
Depending on your computer hardware, this might take some time.
Now we have the data ready to import, let's get to it:
[pgsql@elrond pgsql]$ time psql -d indextest < import.sql
On my system, it took approximately 5 minutes to import the 5 million records.
The VACUUM ANALYZE command is important. PostgreSQL keeps statistics about table sizes, the distribution of data and so on. Running
VACUUM ANALYZE updates these statistics so the query planner can make good decisions.
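If you are curious, you can peek at what the planner knows about the table. The catalog queried below, pg_class, is a standard PostgreSQL system catalog; its reltuples column holds the planner's row-count estimate, which VACUUM ANALYZE refreshes:

```sql
-- Refresh the planner's statistics for our table.
VACUUM ANALYZE indextest;

-- pg_class stores the planner's estimated row count for each relation.
SELECT relname, reltuples FROM pg_class WHERE relname = 'indextest';
```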
Now that we have a large table, let's try searching it.
First, let's grab a specific row:
[pgsql@elrond pgsql]$ time echo "SELECT * FROM indextest WHERE id = 50000" | psql -d indextest
  id   |   name   |  number  | letter
-------+----------+----------+--------
 50000 | UrMQnH4s |  7536232 | b
(1 row)
It took 32 seconds to retrieve one row from the table. This might seem reasonable given the size of the table, but let's see if we can make it quicker.
Note: Throughout this tutorial you will get different results to the ones shown here, both in the time taken to perform actions and in the actual results of queries. This is because the data is randomly generated, the hardware you run the tests on is different, and your system may be doing other work at the time.
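To see why the query is slow, PostgreSQL's EXPLAIN command shows the plan the planner chooses; with no index on id, it has no choice but a sequential scan over all 5 million rows. A sketch of the idea, where the index name is just an illustrative choice:

```sql
-- Show the query plan: without an index, expect a Seq Scan over the table.
EXPLAIN SELECT * FROM indextest WHERE id = 50000;

-- An index on id lets PostgreSQL jump straight to the matching row.
CREATE INDEX indextest_id_idx ON indextest (id);
```

After the index is built, rerunning the EXPLAIN should show an index scan instead of a sequential scan, and the query itself should return far faster.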