NAME
File::Remote - Read/write/edit remote files transparently
SYNOPSIS
    #
    # Two ways to use File::Remote
    #
    # First, the function-based style. Here, we can use the
    # special :replace tag to overload Perl builtins!
    #
    use File::Remote qw(:replace);    # special :replace tag
DESCRIPTION
This module takes care of dealing with files regardless of whether
they're local or remote. It allows you to create and edit files without
having to worry about their physical location on the network. If a file
passed into a function is of the form "host:/path/to/file", then
"File::Remote" uses rsh/rcp (or ssh/scp, depending on how you configure
it) to edit the file remotely. Otherwise, it assumes the file is local
and passes calls directly through to Perl's core functions.
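For instance, here is a minimal object-oriented sketch (the
setrsh/setrcp configuration methods and the "host" name are used
for illustration; check the method list below for the exact names):

    use strict;
    use warnings;
    use File::Remote;

    # One object, configured to use ssh/scp instead of the
    # default rsh/rcp.
    my $remote = File::Remote->new;
    $remote->setrsh("/usr/bin/ssh");
    $remote->setrcp("/usr/bin/scp");

    # A host:/path file is edited remotely; a plain path is local.
    $remote->open(*FILE, ">>host:/remote/log") or die "open: $!\n";
    print FILE "remote append\n";
    $remote->close(*FILE) or die "close: $!\n";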
The nice thing about this module is that you can use it for *all* your
file calls, since it handles both remote and local files transparently.
This means you don't have to put a whole bunch of checks for remote
files in your code. Plus, if you use the function-oriented interface
along with the ":replace" tag, you can actually redefine the Perl
builtin file functions. This means that your existing Perl scripts can
automatically handle remote files with no re-engineering(!).
There are two ways to program with "File::Remote", an object-oriented
style and a function-oriented style. Both methods work equally well;
it's just a matter of taste. One advantage of the object-oriented method
is that it allows you to read and write from different servers using
separate objects. With the ":replace" tag in effect, the Perl builtins
work on local and remote files alike:

    print FILE "Hello, world!\n";
    close(FILE) or die "Close failed: $!\n";
    mkdir("/local/new/dir", "2775");
    mkdir("host:/remote/new/dir");
    chown("root", "other", "/local/new/dir");
    unlink("host:/remote/file");
This is pretty neat; since "File::Remote" will pass calls to local files
straight through to Perl's core functions, you'll be able to do all this
"transparently" and not care about the locations of the files. Plus,
this has the big advantage of making your existing Perl scripts capable
of dealing with remote files without having to rewrite any code.
Because the names for the "File::Remote" methods clash with the Perl
builtins, if you use the function-oriented style with the ":standard"
tag there is an extra 'r' added to the front of the function names.
Thus, "$remote->open" becomes 'ropen' in the ":standard"
function-oriented version:

    # Function-oriented method
    use File::Remote qw(:standard);    # use standard function names
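The r-prefixed calls otherwise mirror the builtins one-for-one. As a
brief sketch (assuming names like 'rclose' and 'runlink' follow the
same naming rule; the host and paths are placeholders):

    use strict;
    use warnings;
    use File::Remote qw(:standard);    # r-prefixed function names

    # Remote paths still use the host:/path/to/file form.
    ropen(FILE, ">host:/remote/file") or die "ropen failed: $!\n";
    print FILE "Hello, world!\n";
    rclose(FILE) or die "rclose failed: $!\n";
    runlink("host:/remote/file");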