DBIx-Admin-BackupRestore
Changelog.ini
the first and last quote was removed, so a name like `public.t` became
public.t. But with Oracle, quotes are used around the schema and table
names separately, so "X" . "T" became X "." T :-(.
Now, all quotes are removed, so `public.t` still becomes public.t, but also
"X" . "T" becomes X.T.
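As a minimal sketch (not the module's actual code), the new behaviour amounts to stripping every quote character from the combined name; strip_quotes here is a hypothetical helper name:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper illustrating the quote handling described above:
# remove *all* quote characters, not just the first and last one.
sub strip_quotes
{
	my($name) = @_;
	$name =~ s/["'`]//g; # Drop every double quote, single quote and backtick.

	return $name;
}

print strip_quotes('`public.t`'), "\n"; # public.t
print strip_quotes('"X"."T"'), "\n";    # X.T
```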
EOT
[V 1.08]
Date=2005-07-19T10:45:00
Comments= <<EOT
- The previous patches did not include sub backup() calling sub process_table()
to split the schema name off from the table name, so table names with schema
names still attached were not being skipped. Big mistake. My apologies.
- Add XML::Parser, XML::Records and XML::TokeParser to Build.PL and Makefile.PL.
They should have been included since V 1.06
EOT
[V 1.07]
Date=2005-06-28T13:41:11
Comments= <<EOT
- The default behaviour of this version is the same as for the previous version,
so there is no need to upgrade unless you need the new features.
- Fix bug whereby sub backup() could output <row></row> when all columns
in a row were null, and sub restore() couldn't cope.
- Document that all spaces are converted to underscores in table and column names.
- New options:
o The value of the database handle attribute FetchHashKeyName governs
how table names are handled. Values are:
'NAME'    : The default - use the value returned by the database server
'NAME_uc' : Convert table names to upper case
'NAME_lc' : Convert table names to lower case. This is the recommended value
This possible conversion of the case of table names affects how you
specify table names in the constructor options:
- rename_columns
- rename_tables
- skip_tables
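The case conversion can be sketched in a few lines. This is illustrative only, with a plain $attr standing in for the database handle attribute $dbh -> {FetchHashKeyName}:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Illustrative only: $attr stands in for the database handle attribute
# FetchHashKeyName ('NAME', 'NAME_uc' or 'NAME_lc').
sub adjust_case
{
	my($attr, $name) = @_;

	return uc $name if ($attr eq 'NAME_uc');
	return lc $name if ($attr eq 'NAME_lc');

	return $name; # 'NAME': use the value returned by the database server.
}

print adjust_case('NAME_lc', 'Order_Items'), "\n"; # order_items
```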
The following new options can be passed to the constructor of this module.
o croak_on_error => 0 or 1. 1 is the default, for backwards compatibility.
During backup(), the $sth -> execute() is now wrapped in eval {}, and if
an error occurs, and croak_on_error is 1, we Carp::croak.
If croak_on_error is 0, we continue. Not only that, but if verbose is 1,
the error is printed to STDERR.
o odbc => 0 or 1. 0 is the default.
During backup, if odbc is 1 we use the simplified call $dbh -> tables()
to get the list of table names. This list includes what MS Access calls
Queries, which are possibly equivalent to views. MS Access does not
support the syntax used in the non-ODBC situation:
$dbh -> tables('%', '%', '%', 'table').
o rename_columns => {}. You can specify a hash ref in the form:
rename_columns => {'old name' => 'new name', ...}.
For example, 'order' is a reserved word under MySQL, so you would use:
rename_columns => {order => 'orders'} (or whatever you want).
The option affects all tables.
The database handle attribute FetchHashKeyName affects this option.
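Since the rename lookup happens against the name after the case conversion, the keys in rename_columns must match the converted case. A small sketch, assuming FetchHashKeyName => 'NAME_lc' and made-up column names:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch only: the rename hash and column names are made up.
my(%rename_columns) = (order => 'orders');

sub output_column_name
{
	my($name) = @_;
	$name     = lc $name; # Emulate FetchHashKeyName => 'NAME_lc'.

	return exists $rename_columns{$name} ? $rename_columns{$name} : $name;
}

print output_column_name('Order'), "\n"; # orders
print output_column_name('Title'), "\n"; # title
```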
rename_tables => {'old name' => 'new name', ...}.
The database handle attribute FetchHashKeyName affects this option.
Renaming takes place after the effect of FetchHashKeyName.
- Add method get_column_names(). This returns a hash ref, where the keys are
table names, possibly transformed according to the database handle attribute
FetchHashKeyName, and the values are array refs of column names, also converted
according to the aforesaid and understated FetchHashKeyName. Further, these
column names are sorted, and all spaces in column names are converted to
underscores.
This hashref is acceptable to the module DBIx::Admin::CreateTrivialSchema :-).
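The shape of that return value can be sketched like this (table and column names are invented for the demo, and normalize_columns is a hypothetical helper):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch of the documented structure: table name => sorted array ref of
# column names, with spaces converted to underscores.
sub normalize_columns
{
	my(@column) = @_;

	return [sort map{my($c) = $_; $c =~ tr/ /_/; $c} @column];
}

my($column_names) =
{
	people => normalize_columns('home phone', 'name'),
};

print join(', ', @{$$column_names{'people'}}), "\n"; # home_phone, name
```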
- The demo examples/backup-db.pl contains a list of MS Access tables which you
almost certainly want to supply to the skip_tables option if exporting
from MS Access.
EOT
[V 1.06]
Date=2005-05-20T15:45:00
Comments= <<EOT
- Correct docs discussing the value 2 for the fiddle_timestamp option, which said
timestamp and should have said datetime.
- Add an FAQ to the docs
- Add method restore_in_order(), which lets you specify the order in which tables
are restored. This allows you to define a column with a clause such as
'references foreign_table (foreign_column)' , and to populate the foreign_table
before the dependent table.
But mutually-dependent and self-referential tables are still not catered for.
- Add method split(), which reads an XML file output by backup() and splits out into
a separate file each table you are not skipping. The file names are the tables'
names, including schema if any, and with an extension of 'xml'. The output files
have headers and footers so they are identical in structure to the file output
by backup(). Hence they can be fed back in to restore() and restore_in_order().
This method helps circumvent the drawback of restore_in_order(), which reads its
input file once per table.
Since this is a file-to-file operation, the dbh parameter to new() is no longer
mandatory.
See examples/split-xml.pl and all-tables.xml for a demo.
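The per-table splitting can be mimicked in a few lines. This sketch works on an in-line string and collects the "files" in a hash so it stays self-contained; the real method reads and writes actual files:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Rough sketch of what split() does: for each <resultset> in a backup()
# file, emit a small XML document with the same header and footer, so
# restore() can read it. Input and output are in memory here.
my($xml) = <<'EOT';
<?xml version = "1.0"?>
<dbi database = "db_name">
	<resultset statement = "select * from t0">
		<row>
		</row>
	</resultset>
	<resultset statement = "select * from t1">
	</resultset>
</dbi>
EOT

my(%file, $table);

for my $line (split /\n/, $xml)
{
	if ($line =~ m!<resultset .+? from (.+)">!i)
	{
		$table        = $1;
		$file{$table} = qq|<?xml version = "1.0"?>\n<dbi database = "db_name">\n|;
	}

	$file{$table} .= "$line\n" if (defined $table);

	if ($line =~ m!</resultset>!i)
	{
		$file{$table} .= "</dbi>\n";
		$table = undef;
	}
}

print join(', ', sort keys %file), "\n"; # t0, t1
```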
- Change methods backup(), restore() and the new restore_in_order() and split(),
to use lower case XML tags 'dbi', 'resultset', and 'row', as they should
have been in the first place.
- Methods restore() and split() will read a file containing upper or lower case
tags.
- Warning: restore_in_order() only handles lower case tags, due to the way
XML::Records works.
- This module now requires these modules, installed in this order:
o XML::Parser
o XML::TokeParser
o XML::Records
2005-04-15 09:34:00.
- Expand the docs referring to fiddle_timestamp
EOT
[V 1.04]
Date=2005-03-02T13:20:00
Comments= <<EOT
- A new option has been added to the constructor: skip_schema.
Here, 'schema' is defined to be the prefix on a table name,
and to be separated from the table name by a '.'.
Eg: The backup phase, with Postgres as the input database, will output tables
with names like 'information_schema.sql_features' and 'pg_catalog.pg_aggregate'.
If new is called as new(skip_schema => ['some_schema_name']), the restore phase
does not restore any tables in the named schema.
This parameter is optional. It defaults to [], so the module behaves as it did
before.
- A new option has been added to the constructor: transform_tablenames.
Now, new(transform_tablenames => 1) chops the schema, up to and including the
first '.', off table names. Thus a table exported from Postgres as
'public.service' can be renamed 'service' when being imported into another
database, eg MySQL.
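A minimal sketch of that transformation (the regex and sub name are illustrative, not necessarily the module's own):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Chop the schema, up to and including the *first* '.', off a table name.
sub transform_tablename
{
	my($name) = @_;
	$name     =~ s/^.*?\.//; # Non-greedy, so only the first '.' counts.

	return $name;
}

print transform_tablename('public.service'), "\n"; # service
```

A name with no schema prefix passes through unchanged, since the substitution simply fails to match.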
Changes
- Switch from DBI's tables() method to table_info(). This is for Oracle
- Discard table names which match /^BIN\$.+\$./. This is for Oracle
- Change the handling of quotes around schema and table names. Previously,
the first and last quote was removed, so a name like `public.t` became
public.t. But with Oracle, quotes are used around the schema and table
names separately, so "X" . "T" became X "." T :-(.
Now, all quotes are removed, so `public.t` still becomes public.t, but also
"X" . "T" becomes X.T.
1.08 2005-07-19T10:45:00
- The previous patches did not include sub backup() calling sub process_table()
to split the schema name off from the table name, so table names with schema
names still attached were not being skipped. Big mistake. My apologies.
- Add XML::Parser, XML::Records and XML::TokeParser to Build.PL and Makefile.PL.
They should have been included since V 1.06
1.07 2005-06-28T13:41:11
- The default behaviour of this version is the same as for the previous version,
so there is no need to upgrade unless you need the new features.
- Fix bug whereby sub backup() could output <row></row> when all columns
in a row were null, and sub restore() couldn't cope.
- Document that all spaces are converted to underscores in table and column names.
- New options:
o The value of the database handle attribute FetchHashKeyName governs
how table names are handled. Values are:
'NAME'    : The default - use the value returned by the database server
'NAME_uc' : Convert table names to upper case
'NAME_lc' : Convert table names to lower case. This is the recommended value
This possible conversion of the case of table names affects how you
specify table names in the constructor options:
- rename_columns
- rename_tables
- skip_tables
The following new options can be passed to the constructor of this module.
o croak_on_error => 0 or 1. 1 is the default, for backwards compatibility.
During backup(), the $sth -> execute() is now wrapped in eval {}, and if
an error occurs, and croak_on_error is 1, we Carp::croak.
If croak_on_error is 0, we continue. Not only that, but if verbose is 1,
the error is printed to STDERR.
o odbc => 0 or 1. 0 is the default.
During backup, if odbc is 1 we use the simplified call $dbh -> tables()
to get the list of table names. This list includes what MS Access calls
Queries, which are possibly equivalent to views. MS Access does not
support the syntax used in the non-ODBC situation:
$dbh -> tables('%', '%', '%', 'table').
o rename_columns => {}. You can specify a hash ref in the form:
rename_columns => {'old name' => 'new name', ...}.
For example, 'order' is a reserved word under MySQL, so you would use:
rename_columns => {order => 'orders'} (or whatever you want).
The option affects all tables.
The database handle attribute FetchHashKeyName affects this option.
rename_tables => {'old name' => 'new name', ...}.
The database handle attribute FetchHashKeyName affects this option.
Renaming takes place after the effect of FetchHashKeyName.
- Add method get_column_names(). This returns a hash ref, where the keys are
table names, possibly transformed according to the database handle attribute
FetchHashKeyName, and the values are array refs of column names, also converted
according to the aforesaid and understated FetchHashKeyName. Further, these
column names are sorted, and all spaces in column names are converted to
underscores.
This hashref is acceptable to the module DBIx::Admin::CreateTrivialSchema :-).
- The demo examples/backup-db.pl contains a list of MS Access tables which you
almost certainly want to supply to the skip_tables option if exporting
from MS Access.
1.06 2005-05-20T15:45:00
- Correct docs discussing the value 2 for the fiddle_timestamp option, which said
timestamp and should have said datetime.
- Add an FAQ to the docs
- Add method restore_in_order(), which lets you specify the order in which tables
are restored. This allows you to define a column with a clause such as
'references foreign_table (foreign_column)' , and to populate the foreign_table
before the dependent table.
But mutually-dependent and self-referential tables are still not catered for.
- Add method split(), which reads an XML file output by backup() and splits out into
a separate file each table you are not skipping. The file names are the tables'
names, including schema if any, and with an extension of 'xml'. The output files
have headers and footers so they are identical in structure to the file output
by backup(). Hence they can be fed back in to restore() and restore_in_order().
This method helps circumvent the drawback of restore_in_order(), which reads its
input file once per table.
Since this is a file-to-file operation, the dbh parameter to new() is no longer
mandatory.
See examples/split-xml.pl and all-tables.xml for a demo.
- Change methods backup(), restore() and the new restore_in_order() and split(),
to use lower case XML tags 'dbi', 'resultset', and 'row', as they should
have been in the first place.
- Methods restore() and split() will read a file containing upper or lower case
tags.
- Warning: restore_in_order() only handles lower case tags, due to the way
XML::Records works.
- This module now requires these modules, installed in this order:
o XML::Parser
o XML::TokeParser
o XML::Records
You would use this option when transferring data from MySQL's 'datetime' type to
Postgres' 'datetime' type, and some MySQL output values match
/0000-00-00 00:00:00/ and some values are real dates, such as
2005-04-15 09:34:00.
- Expand the docs referring to fiddle_timestamp
1.04 2005-03-02T13:20:00
- A new option has been added to the constructor: skip_schema.
Here, 'schema' is defined to be the prefix on a table name,
and to be separated from the table name by a '.'.
Eg: The backup phase, with Postgres as the input database, will output tables
with names like 'information_schema.sql_features' and 'pg_catalog.pg_aggregate'.
If new is called as new(skip_schema => ['some_schema_name']), the restore phase
does not restore any tables in the named schema.
This parameter is optional. It defaults to [], so the module behaves as it did
before.
- A new option has been added to the constructor: transform_tablenames.
Now, new(transform_tablenames => 1) chops the schema, up to and including the
first '.', off table names. Thus a table exported from Postgres as
'public.service' can be renamed 'service' when being imported into another
database, eg MySQL.
MANIFEST
Changelog.ini
Changes
examples/all-tables.xml
examples/backup-db.pl
examples/restore-db.pl
examples/split-xml.pl
lib/DBIx/Admin/BackupRestore.pm
LICENSE
Makefile.PL
MANIFEST This list of files
MANIFEST.SKIP
README
t/00.versions.t
t/00.versions.tx
MANIFEST.SKIP
\bMakeMaker-\d
\b_build
\bBuild$
\bBuild.bat$
\bcover_db
~$
\
\.
\.bak$
\.old$
\.rej$
\.tmp$
examples/backup-db.pl
#!/usr/bin/perl

use strict;
use warnings;

use DBI;
use DBIx::Admin::BackupRestore;
my($file_name) = shift || die("Usage: perl backup-db.pl backup-db.xml");
my($driver)    = ($^O eq 'MSWin32') ? 'mysql' : 'Pg';
my($dsn)       = ($driver eq 'mysql') ? 'dbi:mysql:db_name' : 'dbi:Pg:dbname=db_name';
my($username)  = ($driver eq 'mysql') ? 'root' : 'postgres';
my($password)  = ($driver eq 'mysql') ? 'pass' : '';
my($dbh)       = DBI -> connect
(
	$dsn, $username, $password,
	{
		AutoCommit       => 1,
		FetchHashKeyName => 'NAME_lc',
open(OUT, "> $file_name") || die("Can't open(> $file_name): $!");
print OUT DBIx::Admin::BackupRestore -> new
(
	clean => 1,
	dbh   => $dbh,
	skip_tables =>
	[
	qw/msysaces msysaccessobjects msyscolumns msysimexcolumns msysimexspecs msysindexes msysmacros msysmodules2 msysmodules msysobjects msysqueries msysrelationships/
	],
	verbose => 1,
) -> backup('db_name');
close OUT;
lib/DBIx/Admin/BackupRestore.pm
sub adjust_case
{
	my($self, $s) = @_;

	$$self{'_dbh'}{'FetchHashKeyName'} eq 'NAME_uc'
		? uc $s
		: $$self{'_dbh'}{'FetchHashKeyName'} eq 'NAME_lc'
			? lc $s
			: $s;
}
sub backup
{
	my($self, $database) = @_;

	Carp::croak('Missing parameter to new(): dbh') if (! $$self{'_dbh'});

	$$self{'_quote'}  = $$self{'_dbh'} ? $$self{'_dbh'} -> get_info(29) : '';
	$$self{'_tables'} = $$self{'_odbc'} ? $self -> odbc_tables() : $self -> tables();
	$$self{'_xml'}    = qq|<?xml version = "1.0"?>\n|;
	$$self{'_xml'}   .= qq|<dbi database = "| . $self -> encode_xml($database) . qq|">\n|;
my($data, $display_sql, $display_table);
my($field);
my($i);
my($output_column_name);
my($sql, $sth);
my($table_name);
my($xml);

for $table_name (@{$$self{'_tables'}})
{
	$self -> process_table('backup', $table_name);

	next if ($$self{'_skipping'});

	$display_table = $self -> adjust_case($$self{'_current_table'});
	$sql           = "select * from $$self{'_current_table'}";
	$display_table = $$self{'_rename_tables'}{$display_table} ? $$self{'_rename_tables'}{$display_table} : $display_table;
	$display_sql   = "select * from $display_table";
	$display_sql   = $self -> adjust_case($display_sql);
	$display_sql   = $self -> encode_xml($display_sql);
	$$self{'_xml'} .= qq|\t<resultset statement = "$display_sql">\n|;
	$$self{'_xml'} .= "\t\t<row>\n$xml\t\t</row>\n" if ($xml);
}

Carp::croak("Can't fetchrow_hashref($sql): $DBI::errstr") if ($DBI::errstr);

$$self{'_xml'} .= "\t</resultset>\n";
}

$$self{'_xml'} .= "</dbi>\n";
}

sub decode_xml
{
	my($self, $s) = @_;

	for my $key (keys %_decode_xml)
	{
		$s =~ s/$key/$_decode_xml{$key}/eg;
{
	next if ($line =~ m!^(<\?xml|<dbi|</dbi)!i);

	if ($line =~ m!<resultset .+? from (.+)">!i)
	{
		$self -> process_table('Restore', $1);
	}
	elsif ( (! $$self{'_skipping'}) && ($line =~ m!<row>!i) )
	{
		$$self{'_key'}   = [];
		$$self{'_value'} = [];

		while ( ($line = <INX>) !~ m!</row>!i)
		{
			if ($line =~ m!^\s*<(.+?)>(.*?)</\1>!i)
			{
				push @{$$self{'_key'}}, $1;
$$record{'row'} = [$$record{'row'}] if (ref $$record{'row'} ne 'ARRAY');

for $row (@{$$record{'row'}})
{
	@{$$self{'_key'}} = keys %$row;
	$$self{'_value'}  = [];

	$self -> transform($_, $$row{$_}) for @{$$self{'_key'}};
	$self -> write_row();
}
open(OUT, "> $output_file_name") || Carp::croak("Can't open($output_file_name): $!");
print OUT qq|<?xml version = "1.0"?>\n|;
print OUT qq|<dbi database = "$$self{'_database'}">\n|;
print OUT qq|\t<resultset statement = "select * from $table_name">\n|;
}
}
elsif ( (! $$self{'_skipping'}) && ($line =~ m!<row>!i) )
{
	print OUT qq|\t\t<row>\n|;

	while ( ($line = <INX>) !~ m!</row>!i)
	{
		print OUT $line;
	}

	print OUT qq|\t\t</row>\n|;
}
C<DBIx::Admin::BackupRestore> - Backup all tables in a database to XML, and restore them
This method takes a set of parameters. Only the dbh parameter is mandatory.
For each parameter you wish to use, call new as new(param_1 => value_1, ...).
under the strict option for MySQL V 5, whereas 1970-01-01 00:00:00 is invalid.
This parameter is optional.
The default value is [].
If new is called as new(skip_tables => ['some_table_name', ...]), the restore phase
does not restore the tables named in the call to C<new()>.
This option is designed to work with CGI scripts using the module CGI::Session.
Now, the CGI script can run with the current CGI::Session data, and stale CGI::Session
data is not restored from the XML file.
See examples/backup-db.pl for a list of MS Access tables names which you are unlikely
to want to transfer to an RDBMS.
This parameter is optional.
Note: You would normally use these options to port data from Postgres to MySQL:
new(skip_schema => ['information_schema', 'pg_catalog'], transform_tablenames => 1).
This parameter is optional.
Further, these column names are sorted, and all spaces in column names are converted
to underscores.
This hashref is acceptable to the module DBIx::Admin::CreateTrivialSchema :-).
t/02.io.t
my(@opts) = ($ENV{DBI_DSN}, $ENV{DBI_USER}, $ENV{DBI_PASS});

try
{
	populate(\@opts, 't0');
	populate(\@opts, 't1');

	open(OUT, "> $xml_file") || die("Can't open(> $xml_file): $!");
	print OUT DBIx::Admin::BackupRestore -> new(dbh => generate_dbh(\@opts)) -> backup($db_file);
	close OUT;

	ok(-r $db_file, "$db_file is readable");
	ok(-r $xml_file, "$xml_file is readable");

	$db_file = File::Spec -> catfile($out_dir, 'restore.sqlite');

	unlink $db_file;
t/03.empty.table.t
my(@opts) = ($ENV{DBI_DSN}, $ENV{DBI_USER}, $ENV{DBI_PASS});

try
{
	populate(\@opts, 't0', false);
	populate(\@opts, 't1', true);
	populate(\@opts, 't2', false);

	open(OUT, "> $xml_file") || die("Can't open(> $xml_file): $!");
	print OUT DBIx::Admin::BackupRestore -> new(dbh => generate_dbh(\@opts)) -> backup($db_file);
	close OUT;

	ok(-r $db_file, "$db_file is readable");
	ok(-r $xml_file, "$xml_file is readable");

	$db_file = File::Spec -> catfile($out_dir, 'restore.sqlite');

	unlink $db_file;