lib/Alien/Build/Manual/AlienAuthor.pod view on Meta::CPAN
# PODNAME: Alien::Build::Manual::AlienAuthor
# ABSTRACT: Alien author documentation
# VERSION
__END__
=pod
=encoding UTF-8
=head1 NAME
Alien::Build::Manual::AlienAuthor - Alien author documentation
=head1 VERSION
version 2.84
=head1 SYNOPSIS
perldoc Alien::Build::Manual::AlienAuthor
=head1 DESCRIPTION
B<Note>: Please read the entire document before you get started in
writing your own L<alienfile>. The section on dynamic vs. static
libraries will likely save you a lot of grief if you read it now!
This document is intended to teach L<Alien> authors how to build their
own L<Alien> distribution using L<Alien::Build> and L<Alien::Base>.
Such an L<Alien> distribution consists of three essential parts:
=over 4
=item An L<alienfile>
This is a recipe for how to 1) detect an already installed version of
the library or tool you are alienizing 2) download and build the library
or tool that you are alienizing and 3) gather the configuration settings
necessary for the use of that library or tool.
=item An installer C<Makefile.PL> or C<Build.PL> or a C<dist.ini> if you are using L<Dist::Zilla>
This is a thin layer between your L<alienfile> recipe and the Perl
installer (either L<ExtUtils::MakeMaker> or L<Module::Build>).
=item A Perl class (.pm file) that inherits from L<Alien::Base>
For most L<Alien>s this does not need to be customized at all, since
L<Alien::Base> usually does what you need.
=back
For example if you were alienizing a library called libfoo, you might
have these files:
Alien-Libfoo-1.00/Makefile.PL
Alien-Libfoo-1.00/alienfile
Alien-Libfoo-1.00/lib/Alien/Libfoo.pm
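The Perl class is frequently just a few lines. A minimal
C<lib/Alien/Libfoo.pm> for our hypothetical libfoo looks like this:

 package Alien::Libfoo;

 use strict;
 use warnings;
 use parent qw( Alien::Base );

 1;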
This document will focus mainly on instructing you how to construct an
L<alienfile>, but we will also briefly cover making a simple
C<Makefile.PL> or C<dist.ini> to go along with it. We will also touch
on when you might want to extend your subclass to add non-standard
functionality.
=head2 Using commands
Most software libraries and tools will come with instructions for how to
install them in the form of commands that you are intended to type into
a shell manually. The easiest way to automate those instructions is to
just put the commands in your L<alienfile>. For example, let's suppose
that libfoo is built using autoconf and provides a C<pkg-config> C<.pc>
file.
We will also later discuss plugins. For common build systems like
autoconf or CMake, it is usually better to use the appropriate plugin
because they will handle corner cases better than a simple set of
commands. We're going to take a look at commands first because it's
easier to understand the different phases with commands.
(Aside, autoconf is a series of tools and macros used to configure
(usually) a C or C++ library or tool by generating any number of
Makefiles. It is the C equivalent to L<ExtUtils::MakeMaker>, if you
will. Basically, if your library or tool's instructions start with
C<./configure>, it is most likely an autoconf based library or tool).
(Aside2, C<pkg-config> is a standard-ish way to provide the compiler and
linker flags needed for compiling and linking against the library. If
your tool installs a C<.pc> file, usually in C<$PREFIX/lib/pkgconfig>,
then your tool uses C<pkg-config>).
Here is the L<alienfile> that you might have:
use alienfile;
probe [ 'pkg-config --exists libfoo' ];
share {
start_url 'http://www.libfoo.org/src/libfoo-1.00.tar.gz';
download [ 'wget %{.meta.start_url}' ];
extract [ 'tar zxf %{.install.download}' ];
build [
[ './configure --prefix=%{.install.prefix} --disable-shared' ],
[ '%{make}' ],
[ '%{make} install' ],
];
};
gather [
[ 'pkg-config --modversion libfoo', \'%{.runtime.version}' ],
[ 'pkg-config --cflags libfoo', \'%{.runtime.cflags}' ],
[ 'pkg-config --libs libfoo', \'%{.runtime.libs}' ],
];
There is a lot going on here, so let's decode it a little bit. An
L<alienfile> is just some Perl with some alien specific sugar. The
first line
use alienfile;
imports the sugar into the L<alienfile>. It also is a flag for the
reader to see that this is an L<alienfile> and not some other kind of
Perl script.
The second line is the probe directive:
probe [ 'pkg-config --exists libfoo' ];
is used to see if the library is already installed on the target system.
If C<pkg-config> is in the PATH, and if libfoo is installed, this should
exit with a success (0) and tell L<Alien::Build> to use the system
library. If either C<pkg-config> is not in the PATH, or libfoo is not
installed, then it will exit with non-success (!= 0) and tell
L<Alien::Build> to download and build from source.
You can provide as many probe directives as you want. This is useful if
there are different ways to probe for the system. L<Alien::Build> will
stop at the first probe that successfully finds a system library. Say our
library libfoo comes with a C<.pc> file for use with C<pkg-config> and
also provides a C<foo-config> program to find the same values. You
could then specify this in your L<alienfile>
probe [ 'pkg-config --exists libfoo' ];
probe [ 'foo-config --version' ];
Other directives can be specified multiple times if there are different
methods that can be tried for the various steps.
Sometimes it is easier to probe for a library from Perl rather than with
a command. For that you can use a code reference. For example, another
way to call C<pkg-config> would be from Perl:
probe sub {
my($build) = @_; # $build is the Alien::Build instance.
system 'pkg-config --exists libfoo';
$? == 0 ? 'system' : 'share';
};
The Perl code should return 'system' if the library is installed, and
'share' if not. (Other directives should return a true value on
success, and a false value on failure). You can also throw an exception with
C<die> to indicate a failure.
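For example (a hypothetical sketch; the platform check and message are
purely illustrative), a probe coderef that dies on a condition it cannot
handle might look like this:

 probe sub {
   my($build) = @_;
   # hypothetical: refuse to probe on an unsupported platform
   die 'MSWin32 probe not supported in this sketch' if $^O eq 'MSWin32';
   system 'pkg-config --exists libfoo';
   $? == 0 ? 'system' : 'share';
 };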
The next part of the L<alienfile> is the C<share> block, which is used
to group the directives which are used to download and install the
library or tool in the event that it is not already installed.
share {
start_url 'http://www.libfoo.org/src/libfoo-1.00.tar.gz';
download [ 'wget %{.meta.start_url}' ];
extract [ 'tar zxf %{.install.download}' ];
build [
[ './configure --prefix=%{.install.prefix} --disable-shared' ],
[ '%{make}' ],
[ '%{make} install' ],
];
};
The start_url specifies where to find the package that you are alienizing.
It should be either a tarball (or zip file, or what have you) or an
HTML index. The download directive as you might imagine specifies how
to download the library or tool. The extract directive specifies how
to extract the archive once it is downloaded. In the extract step, you
can use the variable C<%{.install.download}> as a placeholder for the archive
that was downloaded in the download step. This is also accessible if
command line tools, a pure Perl implementation (L<PkgConfig>), or
libpkgconf, depending on what is available). When using negotiation
plugins you may omit the C<::Negotiate> suffix. So as you can see using
the plugin here is an advantage because it is more reliable than just
specifying a command which may not be installed!
Next we use the download negotiation plugin. This is also better than
the version above, because again, C<wget> may not be installed on the
target system. You can also specify a URL which will be scanned for
links, and the most recent version will be used.
We use the Extract negotiation plugin to use either command line tools,
or Perl libraries to extract from the archive once it is downloaded.
Finally we use the Autoconf plugin
(L<Alien::Build::Plugin::Build::Autoconf>). This is a lot more
sophisticated and reliable than in the previous example, for a number of
reasons. This version will even work on Windows assuming the library or
tool you are alienizing supports that platform!
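Putting those plugins together, a plugin-based L<alienfile> for our
hypothetical libfoo might look something like this (the package name,
URL, and filename pattern are illustrative):

 use alienfile;

 plugin 'PkgConfig' => 'libfoo';

 share {
   start_url 'http://www.libfoo.org/src';
   plugin 'Download' => (
     # hypothetical filename pattern for libfoo tarballs
     filter  => qr/^libfoo-[0-9\.]+\.tar\.gz$/,
     version => qr/([0-9\.]+)/,
   );
   plugin 'Extract' => 'tar.gz';
   plugin 'Build::Autoconf';
 };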
Strictly speaking the build directive is not necessary, because the
autoconf plugin provides a default which is reasonable. The only reason
that you would want to include it is if you need to provide additional
flags to the configure step.
share {
...
build [
'%{configure} --enable-bar --enable-baz --disable-shared',
'%{make}',
'%{make} install',
];
};
=head2 Multiple .pc files
Some packages come with multiple libraries paired with multiple C<.pc>
files. In this case you want to provide the
L<Alien::Build::Plugin::PkgConfig::Negotiate> with an array reference
of package names.
plugin 'PkgConfig' => (
pkg_name => [ 'foo', 'bar', 'baz' ],
);
All packages must be found in order for the C<system> install to succeed.
Once installed the first C<pkg_name> will be used by default (in this
example C<foo>), and you can retrieve any other C<pkg_name> using
the L<Alien::Base alt method|Alien::Base/alt>.
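For example, assuming an Alien named C<Alien::Libfoo> built with the
configuration above, you could retrieve the flags for the C<bar>
package like this:

 use Alien::Libfoo;

 # default instance uses the first pkg_name ('foo')
 my $cflags = Alien::Libfoo->cflags;

 # switch to the 'bar' package with the alt method
 my $bar = Alien::Libfoo->alt('bar');
 my $bar_cflags = $bar->cflags;
 my $bar_libs   = $bar->libs;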
=head2 A note about dynamic vs. static libraries
If you are using your L<Alien> to build an XS module, it is important
that you use static libraries if possible. If you have a package that
refuses to build a static library, then you can use L<Alien::Role::Dino>.
Actually let me back up a minute. For a C<share> install it is best
to use static libraries to build your XS extension. This is because
if your L<Alien> is ever upgraded to a new version it can break your
existing XS modules. For a C<system> install shared libraries are
usually best because you can often get security patches without having
to re-build anything in perl land.
If you looked closely at the "Using commands" and "Using plugins"
sections above, you may notice that we went out of our way where
possible to tell Autotools to build only static libraries using the
C<--disable-shared> flag. The Autoconf plugin also does this by
default.
Sometimes though you will have a package that builds both, or maybe
you I<want> both static and dynamic libraries to work with XS and FFI.
For that case, there is the L<Alien::Build::Plugin::Gather::IsolateDynamic>
plugin.
use alienfile;
...
plugin 'Gather::IsolateDynamic';
What it does, is that it moves the dynamic libraries (usually .so on
Unix and .DLL on Windows) to a place where they can be found by FFI,
and where they won't be used by the compiler for building XS. It usually
doesn't do any harm to include this plugin, so if you are just starting
out you might want to add it anyway. Arguably it should have been the
default behavior from the beginning.
If you have already published an Alien that does not isolate its
dynamic libraries, then you might get some failures from old upgraded
aliens because the share directory isn't cleaned up by default (this is
perhaps a design bug in the way that share directories work, but it
is a long standing characteristic). One work around for this is to
use the C<clean_install> property on L<Alien::Build::MM>, which will
clean out the share directory on upgrade, and possibly save you a lot
of grief.
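For example, in your C<Makefile.PL> (a sketch; the abstract and module
names are illustrative, see L<Alien::Build::MM> for the full interface):

 use ExtUtils::MakeMaker;
 use Alien::Build::MM;

 my $abmm = Alien::Build::MM->new(
   clean_install => 1,  # clean the share directory on upgrade
 );

 WriteMakefile($abmm->mm_args(
   ABSTRACT     => 'Discover or download and install libfoo',
   NAME         => 'Alien::Libfoo',
   VERSION_FROM => 'lib/Alien/Libfoo.pm',
 ));

 sub MY::postamble {
   $abmm->mm_postamble(@_);
 }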
=head2 Verifying and debugging your alienfile
You could feed your alienfile directly into L<Alien::Build>, or
L<Alien::Build::MM>, but it is sometimes useful to test your alienfile
using the C<af> command (it does not come with L<Alien::Build>, you need
to install L<App::af>). By default C<af> will use the C<alienfile> in
the current directory (just as C<make> uses the C<Makefile> in the
current directory; just like C<make> you can use the C<-f> option to
specify a different L<alienfile>).
You can test your L<alienfile> in dry run mode:
% af install --dry-run
Alien::Build::Plugin::Core::Legacy> adding legacy hash to config
Alien::Build::Plugin::Core::Gather> mkdir -p /tmp/I2YXRyxb0r/_alien
---
cflags: ''
cflags_static: ''
install_type: system
legacy:
finished_installing: 1
install_type: system
name: libfoo
original_prefix: /tmp/7RtAusykNN
version: 1.2.3
libs: '-lfoo '
libs_static: '-lfoo '
prefix: /tmp/7RtAusykNN
version: 1.2.3
You can use the C<--type> option to force a share install (download and