Bill Oliver wrote:
I have a user who creates his databases, creates tables, inserts initial data, and then wants to set the statistics on the indexes he has just created. In PostgreSQL, he would issue a `VACUUM ANALYZE;` command.
Here is what he came up with:
EXECUTE BLOCK AS
  DECLARE VARIABLE L_INDEX VARCHAR(100);
  DECLARE VARIABLE L_SQL VARCHAR(200);
BEGIN
  /* Recalculate the selectivity of every user index;
     system indexes (RDB$SYSTEM_FLAG = 1) are skipped. */
  FOR SELECT TRIM(RDB$INDEX_NAME)
      FROM RDB$INDICES
      WHERE COALESCE(RDB$SYSTEM_FLAG, 0) = 0
      INTO :L_INDEX
  DO
  BEGIN
    /* Note: no trailing semicolon, since EXECUTE STATEMENT expects
       the statement text without a terminator. */
    L_SQL = 'SET STATISTICS INDEX ' || L_INDEX;
    EXECUTE STATEMENT L_SQL;
  END
END
This is OK, I guess. Is there a cleaner way to do this, though?
Ann W. Harrison answers:
The cleanest way to do it is to:
- Create databases
- Create tables
- Load data & commit
- Create indexes
That sequence reduces load time, creates dense indexes, and sets the statistics correctly.
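For illustration, here is a minimal ISQL-style sketch of that order (the database, table, and index names are invented for the example, and connection credentials are omitted):

CREATE DATABASE 'example.fdb';

CREATE TABLE CUSTOMER (
  CUST_ID   INTEGER NOT NULL,
  CUST_NAME VARCHAR(60)
);
COMMIT;

/* Load the data first... */
INSERT INTO CUSTOMER (CUST_ID, CUST_NAME) VALUES (1, 'Acme');
/* ...many more rows, typically from a bulk-load script... */
COMMIT;

/* ...then build the indexes. Each index is created dense from the full
   data set and its selectivity is calculated at creation time, so no
   separate SET STATISTICS pass is needed. */
CREATE INDEX IDX_CUSTOMER_NAME ON CUSTOMER (CUST_NAME);
COMMIT;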
Bill Oliver follows up:
I suppose the user finds it more logical to submit all of the DDL first and create the data model up front.
If you used a data modeling tool like Erwin, you would get an output SQL file that creates the tables and indexes together in one place.
Ann W. Harrison answers:
That's nice and logical (I guess), but loading first and then indexing is much more efficient.
Addendum from Pavel Cisar:
Another way is to create the indices together with the other database objects, but as INACTIVE, and then activate them after the data is inserted. You can use the same EXECUTE BLOCK, but with ALTER INDEX ... ACTIVE instead of the SET STATISTICS command.
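A sketch of that variant, reusing the same kind of loop (the WHERE clause, which restricts the loop to inactive user indexes, is an assumption; adjust it to the actual schema):

EXECUTE BLOCK AS
  DECLARE VARIABLE L_INDEX VARCHAR(100);
  DECLARE VARIABLE L_SQL VARCHAR(200);
BEGIN
  /* Activate every user index that is currently inactive. Activation
     rebuilds the index from the loaded rows, which also recalculates
     its selectivity, so no SET STATISTICS call is needed afterwards. */
  FOR SELECT TRIM(RDB$INDEX_NAME)
      FROM RDB$INDICES
      WHERE COALESCE(RDB$SYSTEM_FLAG, 0) = 0
        AND RDB$INDEX_INACTIVE = 1
      INTO :L_INDEX
  DO
  BEGIN
    L_SQL = 'ALTER INDEX ' || L_INDEX || ' ACTIVE';
    EXECUTE STATEMENT L_SQL;
  END
END

Note that the indexes created automatically to back primary key, unique, and foreign key constraints cannot be deactivated, so this technique only covers explicitly created indexes.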