lib/BackupPC/XS.pm
=head2 BackupPC::XS::AttribCache
Maintain a cache of directories, with full share/path semantics.
$ac = BackupPC::XS::AttribCache::new($host, $backupNum, $shareNameUM, $compress);
$attrHash = $ac->get($fileName, $allocateIfMissing, $dontReadInode);
$ac->set($fileName, $attrHash, $dontOverwriteInode);
$ac->delete($fileName);
lib/BackupPC/XS.pm
If you specify :all (see SYNOPSIS), then the BPC_FTYPE_ values are exported.
=head1 SEE ALSO
BackupPC, backuppc.sourceforge.net.
rsync-bpc.
=head1 AUTHOR
lib/Badger/Filesystem/Visitor.pm
inclusive options (0/1 flags, names, regexen, subroutines, or list refs
containing any of the above).
$dir->visit(
no_files => '*.bak',
no_dirs => ['tmp', qr/backup/i],
not_in_dirs => ['.svn', '.DS_Store'],
);
When the visit is done, the L<collect()> method can be called to return
a list (in list context) or reference to a list (in scalar context) of the
lib/Basset/Object.pm
Copies the object. B<Be warned>! C<copy> does a B<deep> copy of the object, so any objects, references, etc.
pointed to by the original object will also be copied.
You may optionally pass in a different object or structure and copy that instead.
my $backupBoard = $game->copy($game->board);
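To make the deep-copy semantics concrete, here is a minimal sketch using the core Storable module's C<dclone>. This illustrates deep copying in general, not Basset::Object's internal implementation:

```perl
use strict;
use warnings;
use Storable qw(dclone);

# A deep copy duplicates nested references too, so mutating the
# copy leaves the original untouched (a shallow copy would share
# the inner array and hash references).
my $game = { board => { squares => [ 'a1', 'b2' ] } };
my $copy = dclone($game);

$copy->{board}{squares}[0] = 'changed';

print $game->{board}{squares}[0], "\n";    # original still 'a1'
```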
=cut
=pod
lib/Batch/Batchrun/Retain.pm
#=========================================================================
# File: Retain.pm
#
# Usage: Subroutine
#
# Purpose: Copy and Compress files saving to backup directories
#
# EXAMPLE -
# use Batch::Batchrun::Retain;
#
# retain(FILE=>'test', LIMIT=>5, DIR=>'/apps/irmprod/archive', COMPRESS=>'yes',
lib/Batch/Batchrun/Retain.pm
__END__
=head1 NAME
Retain - keep backup copies of a file
=head1 SYNOPSIS
lib/Batch/Batchrun/Retain.pm
retain(FILE=>'test', LIMIT=>5, DIR=>'/apps/irmprod/archive', COMPRESS=>'yes', DELETE=>'no');
=head1 DESCRIPTION
The C<retain> function provides a convenient way to keep backups of files. It keeps
a configurable number of copies in numbered directories. Arguments are passed using named
parameters, and each name is case-insensitive. Of the several parameters, only FILE and DIR
are required.
=head2 REQUIRED PARAMETERS
lib/Batch/Batchrun/Retain.pm
=over 4
=item B<COMPRESS>
Compress the backup copies of the file. True values are C<1> or C<yes>. (Unix only; defaults to no.)
=item B<CHMOD>
The numeric mode to use when creating the backup file (defaults to 0775).
=item B<DELETE>
Delete the original file after it has been retained. True values are C<1> or C<yes>. (Defaults to no.)
=item B<LIMIT>
The number of backup copies to keep.
=item B<PREFIX>
The prefix to use for each numbered directory. The numbered directory will be
created automatically if it does not exist. (Defaults to C<bk>.)
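The numbered-directory rotation described above can be sketched in plain Perl. This is a hypothetical illustration of the technique only; C<retain_sketch> and its exact layout are assumptions, not the module's actual code:

```perl
use strict;
use warnings;
use File::Copy qw(copy);
use File::Path qw(make_path);

# Hypothetical sketch of LIMIT-bounded retention: copy $file into
# numbered directories bk1..bkN, shifting older copies down and
# letting the copy past LIMIT fall away.
sub retain_sketch {
    my (%arg) = @_;
    my ( $file, $dir, $limit, $prefix ) = @arg{qw(FILE DIR LIMIT PREFIX)};
    $prefix //= 'bk';
    $limit  //= 5;
    ( my $base = $file ) =~ s{.*/}{};    # basename of the file

    # shift bk(N-1) -> bkN, ..., bk1 -> bk2
    for my $n ( reverse 1 .. $limit - 1 ) {
        my $src = "$dir/$prefix$n/$base";
        next unless -e $src;
        make_path( "$dir/$prefix" . ( $n + 1 ) );
        copy( $src, "$dir/$prefix" . ( $n + 1 ) . "/$base" ) or die $!;
    }

    # newest copy always lands in bk1
    make_path("$dir/${prefix}1");
    copy( $file, "$dir/${prefix}1/$base" ) or die $!;
}
```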
MANIFEST.SKIP
\bBuild.bat$
\bBuild.COM$
\bBUILD.COM$
\bbuild.com$
# Avoid temp and backup files.
~$
\.old$
\#$
\b\.#
\.bak$
lib/Beagle/Cmd/Command/shell.pm
show_time($start) if enabled_devel;
}
else {
# backup settings
my ( $devel, $cache, $root ) =
( enabled_devel(), enabled_cache(), current_root('not die') );
my $start = Time::HiRes::time();
eval { Beagle::Cmd->run };
MANIFEST.SKIP
\bBuild.bat$
# Avoid Devel::Cover generated files
\bcover_db
# Avoid temp and backup files.
~$
\#$
\.#
\.bak$
\.old$
share/PerlCritic/Critic/Utils.pm
@newfiles = File::Spec->no_upwards(@newfiles);
@newfiles = grep { not $SKIP_DIR{$_} } @newfiles;
push @queue, map { File::Spec->catfile($file, $_) } @newfiles;
}
if ( (-f $file) && ! _is_backup($file) && _is_perl($file) ) {
push @code_files, $file;
}
}
return @code_files;
}
#-----------------------------------------------------------------------------
# Decide if it's some sort of backup file
sub _is_backup {
    my ($file) = @_;
    return 1 if $file =~ m{ [.] swp \z}xms;
    return 1 if $file =~ m{ [.] bak \z}xms;
    return 1 if $file =~ m{ ~ \z}xms;
    return 1 if $file =~ m{ \A [#] .+ [#] \z}xms;
    return 0;
}
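The four patterns above can be exercised directly. This self-contained mirror of the checks (a sketch for illustration, not Perl::Critic's actual packaging) classifies the usual editor droppings: vim swapfiles, C<.bak> files, tilde backups, and emacs C<#autosave#> files:

```perl
use strict;
use warnings;

# Mirror of the backup-file checks shown above.
sub is_backup {
    my ($file) = @_;
    return 1 if $file =~ m{ [.] swp \z}xms;          # vim swapfile
    return 1 if $file =~ m{ [.] bak \z}xms;          # .bak copy
    return 1 if $file =~ m{ ~ \z}xms;                # editor tilde backup
    return 1 if $file =~ m{ \A [#] .+ [#] \z}xms;    # emacs autosave
    return 0;
}

print is_backup('.Utils.pm.swp') ? "backup\n" : "code\n";
```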
share/PerlCritic/Critic/Utils.pm
Given a list of directories, recursively searches through all the
directories (depth first) and returns a list of paths for all the
files that are Perl code files. Any administrative files for CVS or
Subversion are skipped, as are things that look like temporary or
backup files.
A Perl code file is:
=over
lib/BenchmarkAnything/Storage/Backend/SQL.pm
tables => {
unit_table => 'bench_units',
benchmark_table => 'benchs',
benchmark_value_table => 'bench_values',
subsume_type_table => 'bench_subsume_types',
benchmark_backup_value_table => 'bench_backup_values',
additional_type_table => 'bench_additional_types',
additional_value_table => 'bench_additional_values',
additional_relation_table => 'bench_additional_relations',
additional_type_relation_table => 'bench_additional_type_relations',
backup_additional_relation_table => 'bench_backup_additional_relations',
},
};
my $hr_column_ba_mapping = {
bench_value_id => 'VALUE_ID',
lib/BenchmarkAnything/Storage/Backend/SQL.pm
$or_self->{query}->copy_additional_values({
new_bench_value_id => $i_bench_value_id,
old_bench_value_id => $hr_atts->{rows}[0]{bench_value_id},
});
for my $hr_backup_row ( @{$hr_atts->{rows}} ) {
if ( $hr_backup_row->{bench_subsume_type_rank} == 1 ) {
if ( $hr_atts->{backup} ) {
# copy data rows to backup table
$or_self->{query}->copy_benchmark_backup_value({
new_bench_value_id => $i_bench_value_id,
old_bench_value_id => $hr_backup_row->{bench_value_id},
});
my $i_bench_backup_value_id = $or_self->{query}->last_insert_id(
$or_self->{config}{tables}{benchmark_backup_value_table},
'bench_backup_value_id',
);
$or_self->{query}->copy_benchmark_backup_additional_relations({
new_bench_value_id => $i_bench_backup_value_id,
old_bench_value_id => $hr_backup_row->{bench_value_id},
});
}
}
else {
# update bench_value_id in backup table
$or_self->{query}->update_benchmark_backup_value({
new_bench_value_id => $i_bench_value_id,
old_bench_value_id => $hr_backup_row->{bench_value_id},
});
}
# now let's remove the old rows
$or_self->{query}->delete_benchmark_additional_relations(
$hr_backup_row->{bench_value_id},
);
$or_self->{query}->delete_benchmark_value(
$hr_backup_row->{bench_value_id},
);
}
};
lib/BenchmarkAnything/Storage/Backend/SQL.pm
my $or_strp = DateTime::Format::Strptime->new( pattern => '%F %T', );
my @a_rows;
my $i_counter = 0;
my $i_sum_value = 0;
my $b_backup = ((not exists $hr_options->{backup}) || $hr_options->{backup}) ? 1 : 0;
my $s_last_key = q##;
while ( my $hr_values = $or_data_values->fetchrow_hashref() ) {
my $s_act_key = join '__',
lib/BenchmarkAnything/Storage/Backend/SQL.pm
if ( $i_counter ) {
$or_self->$fn_add_subsumed_point({
rows => \@a_rows,
VALUE => $i_sum_value / $i_counter,
backup => $b_backup,
type_id => $hr_subsume_type->{bench_subsume_type_id}
});
}
@a_rows = ();
lib/BenchmarkAnything/Storage/Backend/SQL.pm
if ( $i_counter ) {
$or_self->$fn_add_subsumed_point({
rows => \@a_rows,
VALUE => $i_sum_value / $i_counter,
backup => $b_backup,
type_id => $hr_subsume_type->{bench_subsume_type_id}
});
}
return 1;
lib/BenchmarkAnything/Storage/Backend/SQL.pm
=head3 subsume
This is a maintenance function for reducing the number of data points in the
database. Calling this function reduces the rows in the benchmark values table
by building an average value for all benchmark data points grouped by specific
columns. By default all old grouped rows are added to backup tables so the
original state can be rebuilt.
It is highly recommended to do this periodically for better search performance.
my $b_success = $or_bench->subsume({
subsume_type => 'month',
exclude_additionals => [qw/ benchmark_date /],
date_from => '2013-01-01 00:00:00',
date_to => '2014-01-01 00:00:00',
backup => 0,
});
=over 4
=item subsume_type
lib/BenchmarkAnything/Storage/Backend/SQL.pm
=item exclude_additionals
Array Reference of additional values that should be excluded from grouping.
=item backup
By default all subsumed rows will be inserted into backup tables. If this
isn't desired, a false value must be passed.
=back
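The grouping-and-averaging idea behind C<subsume> can be sketched in plain Perl. This is an illustration only; the module does this work in SQL, and C<subsume_sketch> is a made-up name:

```perl
use strict;
use warnings;
use List::Util qw(sum);

# Collapse rows sharing a group key into one averaged row, keeping
# the originals in a backup list when $backup is true.
sub subsume_sketch {
    my ( $rows, $backup ) = @_;    # rows: [ [key, value], ... ]
    my ( %group, @backup_rows );
    push @{ $group{ $_->[0] } }, $_->[1] for @$rows;
    push @backup_rows, @$rows if $backup;

    # one averaged row per group key (division by the array gives
    # the element count in scalar context)
    my @subsumed = map {
        [ $_, sum( @{ $group{$_} } ) / @{ $group{$_} } ]
    } sort keys %group;

    return ( \@subsumed, \@backup_rows );
}
```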
=head3 init_search_engine( $force )
lib/BenchmarkAnything/Storage/Backend/SQL.pm
tables => {
unit_table => 'bench_units',
benchmark_table => 'benchs',
benchmark_value_table => 'bench_values',
subsume_type_table => 'bench_subsume_types',
benchmark_backup_value_table => 'bench_backup_values',
additional_type_table => 'bench_additional_types',
additional_value_table => 'bench_additional_values',
additional_relation_table => 'bench_additional_relations',
additional_type_relation_table => 'bench_additional_type_relations',
backup_additional_relation_table => 'bench_backup_additional_relations',
}
=item select_cache [optional]
If set to a true value, the module caches some select results
lib/BerkeleyDB/Manager.pm
=item log_auto_remove
Enables automatic removal of logs.
Normally logs should be removed after being backed up, but if you are not
interested in having full snapshot backups for catastrophic recovery scenarios,
you can enable this.
See L<http://www.oracle.com/technology/documentation/berkeley-db/db/ref/transapp/logfile.html>.
Defaults to false.
av_top_index|5.017009|5.003007|p
av_top_index_skip_len_mg|5.025010||Viu
av_undef|5.003007|5.003007|
av_unshift|5.003007|5.003007|
ax|5.003007|5.003007|
backup_one_GCB|5.025003||Viu
backup_one_LB|5.023007||Viu
backup_one_SB|5.021009||Viu
backup_one_WB|5.021009||Viu
bad_type_gv|5.019002||Viu
bad_type_pv|5.016000||Viu
BADVERSION|5.011004||Viu
BASEOP|5.003007||Viu
BhkDISABLE|5.013003||xV
Makefile.PL
use File::Find;
find( \&filecheck, "." );
sub filecheck {
unlink if /~$/; # Remove any emacs backup files
die "Aborting: Swapfile $_ found" if /\.swp$/;
}
&WriteMakefile(
'NAME' => 'Biblio::ILL::ISO',
lib/Biblio/Isis/Manual.pod
Sorted Link file (long terms)
=item C<xxxxxx.BKP>
Master file backup
=item C<xxxxxx.XHF>
Hit file index
lib/Biblio/Isis/Manual.pod
always 0 for user data base file (1 for system message files)
=back
(the last four fields are used for statistics during backup/restore).
=head2 C. Master file block format
The Master file records are stored consecutively, one after the other,
each record occupying exactly C<MFRL> bytes. The file is stored as
lib/Biblio/Isis/Manual.pod
As indicated above, as Master file records are updated the C<MST> file
grows in size and there will be lost space in the file which cannot be
used. The reorganization facilities allow this space to be reclaimed by
recompacting the file.
During the backup phase a Master file backup file is created (C<.BKP>).
The structure and format of this file is the same as the Master file
(C<.MST>), except that a Crossreference file is not required as all the
records are adjacent. Records marked for deletion are not backed up.
Because only the latest copy of each record is backed up, the system
does not allow you to perform a backup whenever an Inverted file update
is pending for one or more records.
During the restore phase the backup file is read sequentially and the
program recreates the C<MST> and C<XRF> files. At this point all records which
were marked for logical deletion (before the backup) are now marked as
physically deleted (by setting C<XRFMFB = -1> and C<XRFMFP = 0>).
Deleted records are detected by checking holes in the C<MFN> numbering.
=head1 Inverted file structure and record formats
lib/BigIP/iControl.pm
return %res
}
=head3 save_configuration ($filename)
$ic->save_configuration('backup.ucs');
# is equivalent to
$ic->save_configuration('backup');
# Not specifying a filename will use today's date in the
# format YYYYMMDD as the filename.
$ic->save_configuration();
lib/BigIP/iControl.pm
print $ic->download_file('/config/bigip.conf');
This method provides direct access to files on the target system. The method returns a scalar containing
the contents of the file.
This method may be useful for downloading configuration files for versioning or backups.
=cut
sub download_file {
my ($self,$file_name) = @_;
lib/Bigtop/Backend/Init/Std.pm
# Avoid Module::Build generated and utility files.
\bBuild$
\b_build
# Avoid temp and backup files.
~$
\.tmp$
\.old$
\.bak$
\#$
MANIFEST.SKIP
# Avoid Module::Build generated and utility files.
\bBuild$
\b_build/
# Avoid temp and backup files.
~$
\.old$
\#$
\b\.#
\.bak$
Write DTD for defaults; convert Adaptor.pm to parsing XML
Future items:
1. Option to back up software prior to install.
2. Starting and stopping of servers - is this working / robust during or after an install?
3. Server monitoring
lib/Bio/MUST/Core/Tree.pm
$mode == 2 ? $tree->get_entities :
$mode == 1 ? $tree->get_internals :
$tree->get_terminals
};
# Note: old labels are backed up in the specified attributes and vice-versa
# TODO: allow appending acc for terminal nodes?
for my $node (@nodes) {
my $label = $node->get_name;
my $attribute = $node->get_generic($key);
$node->set_generic($key => $label);
lib/Bio/MUST/Core/Tree.pm
sub store_tpl {
my $self = shift;
my $outfile = shift;
# backup and discard branch lengths
# Note: I have to do that since I cannot clone the tree (Bio::Phylo bug?)
my @branch_lengths;
for my $node ( @{ $self->tree->get_entities } ) {
push @branch_lengths, $node->get_branch_length;
$node->set_branch_length(undef);
scripts/manipulate_datasets.pl
}
else {
# need to sort by the mean of provided column indices
# we will generate a temporary column of the mean
# first need to set the target of mean which is needed by combine function
my $original = $opt_target; # keep a backup just in case
$opt_target = 'mean';
combine_function(@indices);
my $i = $Data->last_column;
$opt_target = $original; # restore backup just in case
$Data->sort_data( $i, $direction );
$Data->delete_column($i); # delete the temporary column
}
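The backup/restore dance around C<$opt_target> can also be expressed with Perl's C<local>, which restores a package variable automatically when the enclosing scope exits, even on exceptions. A minimal sketch with illustrative variable names:

```perl
use strict;
use warnings;

our $opt_target = 'median';

# "local" saves the current value of a package variable and
# restores it when the enclosing scope exits -- even on die().
sub combined_mean {
    local $opt_target = 'mean';    # temporary override
    return $opt_target;            # callees would also see 'mean'
}

print combined_mean(), " then ", $opt_target, "\n";
```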
# remove any pre-existing sorted metadata since no longer valid
scripts/manipulate_datasets.pl
and recombine the file (use C<join_data_file.pl>). This could be done
through a simple shell script.
The program keeps track of the number of manipulations performed, and if
any are performed, will write out to file the changed data. Unless an
output file name is provided, it will overwrite the input file (NO backup is
made!).
=head1 FUNCTIONS
This is a list of the functions available for manipulating columns. These may
lib/Bio/WGS2NCBI.pm
# have a FASTA file
if ( $file =~ /(.+)\.fsa$/ ) {
my $stem = $1;
# make backup of FASTA file
rename "${INDIR}/${file}", "${INDIR}/${file}.bak";
# read file, look op non-missing residue positions, write truncated
open my $fh, '<', "${INDIR}/${file}.bak" or die $!;
open my $out, '>', "${INDIR}/${file}" or die $!;
lib/Bio/WGS2NCBI.pm
INFO "$id\t$i1 .. $i2";
$coord{$id} = [ $i1, $i2 ];
$seq->trunc( $i1 + 1, $i2 + 1 )->write_fasta($out);
}
# make backup of TBL file, open handle for writing
rename "${INDIR}/${stem}.tbl", "${INDIR}/${stem}.tbl.bak";
open my $outtbl, '>', "${INDIR}/${stem}.tbl" or die $!;
# initialize variables
my $tr = Bio::WGS2NCBI::TableReader->new(
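The rename-then-rewrite pattern used above (move the original aside as a C<.bak> backup, then stream a transformed copy back to the original name) generalizes to a small helper. A sketch; C<rewrite_with_backup> is a hypothetical name:

```perl
use strict;
use warnings;

# Rename $path to $path.bak, then write a transformed copy of the
# backup back to the original name, line by line.
sub rewrite_with_backup {
    my ( $path, $transform ) = @_;
    rename $path, "$path.bak" or die "rename: $!";
    open my $in,  '<', "$path.bak" or die $!;
    open my $out, '>', $path       or die $!;
    while ( my $line = <$in> ) {
        print {$out} $transform->($line);
    }
    close $in;
    close $out or die $!;
}
```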
lib/Bio/DB/SoapEUtilities/FetchAdaptor/seq.pm
\s*(.*?)
\s*(?: \( (.*?) \) )?\.?
$}xms ) {
($organelle, $abbr_name, $common) = ($1, $2, $3); # optional
} else {
$abbr_name = $get->('source'); # nothing caught; this is a backup!
}
# Convert data in classification lines into classification array.
my @class = split(/; /, $get->('taxonomy'));
Bio/SeqFeature/Tools/Unflattener.pm
# of genbank records
#
# if no resolver tag is specified, we revert to the normal
# resolver_method
if ($resolver_tag) {
my $backup_resolver_method = $resolver_method;
# closure: $resolver_tag is remembered by this sub
my $sub =
sub {
my ($self, $sf, @possible_container_sfs) = @_;
my @container_sfs = ();
Bio/SeqFeature/Tools/Unflattener.pm
}
$match;
} @possible_container_sfs;
}
else {
return $backup_resolver_method->($sf, @possible_container_sfs);
}
return map {$_=>0} @container_sfs;
};
$resolver_method = $sub;
}
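The closure trick above, saving the previous resolver in a lexical and installing a wrapper that falls back to it, works the same way in miniature. A generic sketch, not the Unflattener's actual resolver signature:

```perl
use strict;
use warnings;

# Save the current resolver, then install a closure that tries a
# tagged lookup first and falls back to the saved resolver.
my $resolver = sub { my (%h) = @_; return $h{default} };

my $backup_resolver = $resolver;
$resolver = sub {
    my (%h) = @_;
    return exists $h{tagged} ? $h{tagged} : $backup_resolver->(%h);
};

print $resolver->( tagged  => 'T' ), "\n";    # tagged path wins
print $resolver->( default => 'D' ), "\n";    # falls back
```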
MANIFEST.SKIP
\bbuild.com$
# and Module::Build::Tiny generated files
\b_build_params$
# Avoid temp and backup files.
~$
\.old$
\#$
\b\.#
\.bak$
lib/BioX/Workflow/Plugin/Drake.pm
global:
- indir: /home/user/workflow
- outdir: /home/user/workflow/output
- file_rule: (.csv)$
rules:
- backup:
local:
- INPUT: "{$self->indir}/{$sample}.csv"
- OUTPUT: "{$self->outdir}/{$sample}.csv"
- thing: "other thing"
process: |
lib/BioX/Workflow/Plugin/Drake.pm
;
; Starting Workflow
;
;
; Starting backup
;
;
; Variables
; Indir: /home/guests/jir2004/workflow
; Outdir: /home/guests/jir2004/workflow/output/backup
; Local Variables:
; INPUT: {$self->indir}/{$sample}.csv
; OUTPUT: {$self->outdir}/{$sample}.csv
; thing: other thing
;
/home/guests/jir2004/workflow/output/backup/$[SAMPLE].csv <- /home/guests/jir2004/workflow/$[SAMPLE].csv
cp $INPUT $OUTPUT
;
; Ending backup
;
;
; Starting grep_VARA
lib/BioX/Workflow/Plugin/Drake.pm
Run drake
drake --workflow workflow.full.drake
The following steps will be run, in order:
1: /home/user/workflow/output/backup/test1.csv <- /home/user/workflow/test1.csv [timestamped]
2: /home/user/workflow/output/backup/test2.csv <- /home/user/workflow/test2.csv [timestamped]
3: /home/user/workflow/output/grep_vara/test1.grep_VARA.csv <- /home/user/workflow/output/backup/test1.csv [projected timestamped]
4: /home/user/workflow/output/grep_vara/test2.grep_VARA.csv <- /home/user/workflow/output/backup/test2.csv [projected timestamped]
5: /home/user/workflow/output/grep_varb/test1.grep_VARA.grep_VARB.csv <- /home/user/workflow/output/grep_vara/test1.grep_VARA.csv [projected timestamped]
6: /home/user/workflow/output/grep_varb/test2.grep_VARA.grep_VARB.csv <- /home/user/workflow/output/grep_vara/test2.grep_VARA.csv [projected timestamped]
Confirm? [y/n] y
Running 6 steps with concurrence of 1...
--- 0. Running (timestamped): /home/user/workflow/output/backup/test1.csv <- /home/user/workflow/test1.csv
--- 0: /home/user/workflow/output/backup/test1.csv <- /home/user/workflow/test1.csv -> done in 0.02s
--- 1. Running (timestamped): /home/user/workflow/output/backup/test2.csv <- /home/user/workflow/test2.csv
--- 1: /home/user/workflow/output/backup/test2.csv <- /home/user/workflow/test2.csv -> done in 0.01s
--- 2. Running (timestamped): /home/user/workflow/output/grep_vara/test1.grep_VARA.csv <- /home/user/workflow/output/backup/test1.csv
Working on /home/user/workflow/output/backup/test1csv
--- 2: /home/user/workflow/output/grep_vara/test1.grep_VARA.csv <- /home/user/workflow/output/backup/test1.csv -> done in 0.01s
--- 3. Running (timestamped): /home/user/workflow/output/grep_vara/test2.grep_VARA.csv <- /home/user/workflow/output/backup/test2.csv
Working on /home/user/workflow/output/backup/test2csv
--- 3: /home/user/workflow/output/grep_vara/test2.grep_VARA.csv <- /home/user/workflow/output/backup/test2.csv -> done in 0.01s
--- 4. Running (timestamped): /home/user/workflow/output/grep_varb/test1.grep_VARA.grep_VARB.csv <- /home/user/workflow/output/grep_vara/test1.grep_VARA.csv
--- 4: /home/user/workflow/output/grep_varb/test1.grep_VARA.grep_VARB.csv <- /home/user/workflow/output/grep_vara/test1.grep_VARA.csv -> done in 0.01s
--- 5. Running (timestamped): /home/user/workflow/output/grep_varb/test2.grep_VARA.grep_VARB.csv <- /home/user/workflow/output/grep_vara/test2.grep_VARA.csv
lib/BioX/Workflow/Plugin/Drake.pm
cat drake.log #Here is the log for the first run
2015-06-21 14:02:47,543 INFO Running 3 steps with concurrence of 1...
2015-06-21 14:02:47,568 INFO
2015-06-21 14:02:47,570 INFO --- 0. Running (timestamped): /home/user/workflow/output/backup/test1.csv <- /home/user/workflow/test1.csv
2015-06-21 14:02:47,592 INFO --- 0: /home/user/workflow/output/backup/test1.csv <- /home/user/workflow/test1.csv -> done in 0.02s
#So on and so forth
If you look in the example directory you will see a few png files, these are outputs of the drake workflow.
- outdir: data/processed
- file_rule: (.*)csv$
- plugins:
- FileExists
rules:
- backup:
local:
- wait: 0
process: cp {$self->indir}/{$sample}.csv {$self->outdir}/{$sample}.csv
- grep_VARA:
process: |
t/lib/TestsFor/BioX/Workflow/Test001.pm
global:
- indir: t/example/data/raw/test001
- outdir: t/example/data/processed/test001
- file_rule: (.*).csv\$
rules:
- backup:
process: cp {\$self->indir}/{\$sample}.csv {\$self->outdir}/{\$sample}.csv
- grep_VARA:
process: |
echo "Working on {\$self->{indir}}/{\$sample}.csv"
grep -i "VARA" {\$self->indir}/{\$sample}.csv >> {\$self->outdir}/{\$sample}.grep_VARA.csv
t/lib/TestsFor/BioX/Workflow/Test001.pm
my $cmd3 = <<EOF;
grep -i "VARB" {\$self->indir}/{\$sample}.grep_VARA.csv >> {\$self->outdir}/{\$sample}.grep_VARA.grep_VARB.csv
EOF
my $process_exp = [
{ backup => {
process =>
'cp {$self->indir}/{$sample}.csv {$self->outdir}/{$sample}.csv'
}
},
{ grep_VARA => { process => $cmd2 } },
t/lib/TestsFor/BioX/Workflow/Test001.pm
#return;
is( $got, $expected, "Got expected output!" );
ok( -d "$Bin/example/data/processed/test001" );
my @processes = qw(backup grep_VARA grep_VARB);
foreach my $process (@processes) {
ok( -d "$Bin/example/data/processed/test001/$process" );
}
}
if(@file>0){
if($file[0]=~/\S\.[n]?fa/){ ## to handle .nfa and .fa files
$output_file = $file[0]; $out_file_name_provided=1;
if(-s $output_file){
rename($output_file, "$output_file\.bak");
print "\n# (INFO) $output_file is present. $output_file\.bak will be created for backup\n";
}
}elsif($file[0]=~/(\S+)\.\S+/){
$output_file = "$1\.fa"; $out_file_name_provided=1;
}
}else{ $output_file='default_out.fa'; }
my %correct_head_box_entry = %{&read_correct_head_box()};
for($p=0; $p < @file; $p++){
$in_file = $file[$p];
##""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
## Make backup of the input file
##""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
&cp( "$in_file", "$in_file\.bak$$");
print "\n $in_file\.bak$$ is created as a backup \n\n";
print chr(7);
##""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
## Open files
##""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
MANIFEST.SKIP
# Avoid Module::Build generated and utility files.
\bBuild$
\b_build
# Avoid temp and backup files.
~$
\.gz$
\.old$
\.bak$
\.swp$
av_top_index|5.017009|5.003007|p
av_top_index_skip_len_mg|5.025010||Viu
av_undef|5.003007|5.003007|
av_unshift|5.003007|5.003007|
ax|5.003007|5.003007|
backup_one_GCB|5.025003||Viu
backup_one_LB|5.023007||Viu
backup_one_SB|5.021009||Viu
backup_one_WB|5.021009||Viu
bad_type_gv|5.019002||Viu
bad_type_pv|5.016000||Viu
BADVERSION|5.011004||Viu
BASEOP|5.003007||Viu
BhkDISABLE|5.013003||xV