App-Dapper
Result:
```
Quick. Brown. Fox.
```
#### [%~ Chomp Greedy ~%]
Remove all adjacent whitespace.
```
Quick.
[%~ "Brown." ~%]
Fox.
```
Result:
```
Quick.Brown.Fox.
```
## Dumping
The `dump` directive inserts a Data::Dumper printout of the given variable or
expression. If no argument is passed, it dumps the entire contents of the
current variable stash (with private keys removed). The output also includes
the file and line number from which the `dump` directive was called. Example:
```
[% dump %]       # dump everything
[% dump site %]  # dump everything in the site
[% dump page %]  # dump everything in the current page
```
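As a concrete illustration, you might temporarily drop a `dump` into a layout while debugging why a variable renders wrong. This is a sketch; the surrounding markup and the `_layouts/index.html` path are hypothetical:

```
<!-- _layouts/index.html (hypothetical layout file) -->
<h1>[% page.title %]</h1>

<!-- Temporarily show everything available on the page object;
     remove this line once you've found the problem. -->
[% dump page %]
```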
# Deployment
To deploy your content, you have a number of options. This section outlines
a few of them.
## Amazon S3
It's possible to serve a static website using Amazon S3. Here's how.
1. Go to Amazon's AWS Console and create two buckets: 'www.mydomain.com' and
   'mydomain.com'. Content will be loaded into the 'mydomain.com' bucket, and
   'www.mydomain.com' will simply redirect to it.
2. Under the properties for the 'www.mydomain.com' bucket, choose 'redirect
   all requests to another host name' under 'Static Web Hosting'.
3. Under the properties for the 'mydomain.com' bucket, choose 'enable website
   hosting' under 'Static Web Hosting', and set 'Index Document' to 'index.html'.
4. Install [s3cmd](http://s3tools.org/s3cmd). On Mac OS X, using
   [Homebrew](http://brew.sh/), install like this:

       $ brew install s3cmd
5. Configure `s3cmd` with your Amazon credentials (AWS access key and secret
   key):

       $ s3cmd --configure
6. Now any time you want to rebuild your content and push it to S3, it's a
   simple call to:

       $ s3cmd sync _output/ s3://mydomain.com --reduced-redundancy --acl-public --delete-removed
A few notes about the options. First, `--reduced-redundancy` tells
Amazon that your website content is non-critical and easily
reproduced if there is a problem. It lowers your Amazon charges
and is a good option for most static sites.

Second, `--acl-public` makes everything in the bucket publicly
readable, which is what we want for a static website: the world
gets read access to the contents of the bucket.

Third, `--delete-removed` tells `s3cmd` to delete files in the
bucket that are not stored locally. This keeps the bucket free of
unused files that would otherwise cost you money. If you upload
files to the bucket independently of Dapper or `s3cmd`, you may
want to skip this option.
An optional step is to route traffic to your website through Amazon's Route 53.
To do this, follow these steps:
1. Create two A records, one for 'mydomain.com' and one for
   'www.mydomain.com'.
2. For each A record, set 'Alias' to yes, and set 'Alias Target' to the S3
   bucket with the same name.
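If you prefer to script this instead of clicking through the console, the same record can be created with the AWS CLI's `aws route53 change-resource-record-sets` command. This is only a sketch: the hosted zone ID placeholder, the S3 website endpoint, and the region-specific alias zone ID (shown here for us-east-1) are assumptions you must replace with your own values:

```
$ aws route53 change-resource-record-sets \
    --hosted-zone-id Z_YOUR_ZONE_ID \
    --change-batch '{
      "Changes": [{
        "Action": "CREATE",
        "ResourceRecordSet": {
          "Name": "mydomain.com",
          "Type": "A",
          "AliasTarget": {
            "HostedZoneId": "Z3AQBSTGFYJSTF",
            "DNSName": "s3-website-us-east-1.amazonaws.com",
            "EvaluateTargetHealth": false
          }
        }
      }]
    }'
```

Repeat with `"Name": "www.mydomain.com"` for the second record.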
To make it easy to publish to Amazon S3, one option is to create a Makefile
that encodes the publishing instructions. Here is a Makefile that I use for
[Vanilla Draft](http://vanilladraft.com/):
```
BASEDIR=$(CURDIR)
INPUTDIR=$(BASEDIR)/_source
OUTPUTDIR=$(BASEDIR)/_output
S3_BUCKET=vanilladraft.com

build:
	dapper build

serve: build
	dapper serve

publish: build
	s3cmd sync $(OUTPUTDIR)/ s3://$(S3_BUCKET) --reduced-redundancy --acl-public --delete-removed

watch:
	dapper watch

.PHONY: build serve publish watch
```
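With a Makefile like this in place, the whole build-and-deploy cycle reduces to a couple of commands (assuming `make`, `dapper`, and `s3cmd` are all installed and configured):

```
$ make serve     # rebuild the site and preview it locally
$ make publish   # rebuild the site and sync the output to the S3 bucket
```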
## GitHub Pages