This cookbook installs and configures the elasticsearch search engine on a Linux system.
It requires a working Java installation on the target node; add your preferred Java cookbook to the node's run_list.
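For example, a minimal role pulling in a Java cookbook ahead of this one might look like the following sketch (the role name and the `java` cookbook are illustrative; substitute whichever Java cookbook you actually use):

```ruby
# roles/elasticsearch.rb -- hypothetical role file; the "java" cookbook
# name is an example, use your preferred Java cookbook instead.
name "elasticsearch"
description "elasticsearch search engine nodes"
run_list "recipe[java]", "recipe[elasticsearch]"
```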
The cookbook downloads the elasticsearch tarball (via the ark provider), unpacks it, and moves it to the directory you have specified in the node configuration (/usr/local/elasticsearch by default).
It installs a service which enables you to start, stop, restart, and check the status of the elasticsearch process.
If you include the elasticsearch::monit recipe, it will create a configuration file for Monit, which checks that elasticsearch is running, that it is reachable over HTTP, and that the cluster is in the "green" state.
(This assumes you have included a "monit" cookbook in your run list first.)
If you include the elasticsearch::aws recipe, the AWS Cloud Plugin will be installed on the node, allowing you to use the Amazon AWS features (node auto-discovery, etc.).
Set your AWS credentials either in the "elasticsearch/aws" data bag, or directly in the role/node configuration.
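If you choose the role/node configuration route, a sketch of the relevant attributes might look like this. The attribute keys are assumed to mirror the structure of the "elasticsearch/aws" data bag item; verify them against the cookbook's attributes/aws.rb before relying on them:

```ruby
# roles/elasticsearch.rb -- illustrative sketch; attribute names are
# assumed to mirror the "elasticsearch/aws" data bag item, not verified
# against the cookbook source.
default_attributes "elasticsearch" => {
  "discovery" => { "type" => "ec2" },
  "gateway"   => {
    "type" => "s3",
    "s3"   => { "bucket" => "YOUR BUCKET NAME" }
  },
  "cloud" => {
    "aws" => { "access_key" => "YOUR ACCESS KEY",
               "secret_key" => "YOUR SECRET ACCESS KEY" },
    "ec2" => { "security_group" => "elasticsearch" }
  }
}
```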
You may want to include the elasticsearch::proxy recipe, which will configure Nginx as a reverse proxy for elasticsearch, so you can access it remotely with HTTP authentication.
Set the credentials either in an "elasticsearch/users" data bag, or directly in the role/node configuration.
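As a sketch, the role/node configuration variant might look like the following; the attribute structure is assumed to mirror the "elasticsearch/users" data bag item, so check it against the cookbook's attributes/proxy.rb:

```ruby
# roles/elasticsearch.rb -- illustrative; mirrors the assumed structure
# of the "elasticsearch/users" data bag item.
default_attributes "elasticsearch" => {
  "users" => [
    { "username" => "USERNAME", "password" => "PASSWORD" }
  ]
}
```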
Read the tutorial on deploying elasticsearch with Chef Solo using this cookbook.
For a Chef Server based deployment, include the elasticsearch recipe in the role or node run_list.
Then, upload the cookbook to the Chef server:
knife cookbook upload elasticsearch
To enable the Amazon AWS related features, include the elasticsearch::aws recipe.
You will need to configure the AWS credentials, bucket names, etc.
You may do that in the node configuration (with knife node edit MYNODE or in the Chef Server console), but it is arguably more convenient to store the information in an "elasticsearch" data bag:
mkdir -p ./data_bags/elasticsearch
echo '{
  "id" : "aws",
  "discovery" : { "type": "ec2" },
  "gateway" : {
    "type" : "s3",
    "s3" : { "bucket": "YOUR BUCKET NAME" }
  },
  "cloud" : {
    "aws" : { "access_key": "YOUR ACCESS KEY", "secret_key": "YOUR SECRET ACCESS KEY" },
    "ec2" : { "security_group": "elasticsearch" }
  }
}' >> ./data_bags/elasticsearch/aws.json
Do not forget to upload the data bag to the Chef server:
knife data bag from file elasticsearch aws.json
Usually, you will restrict access to elasticsearch with firewall rules. However, it's convenient to be able to connect to the elasticsearch cluster from curl or an HTTP client, or to use a management tool such as BigDesk or Paramedic.
(Don't forget to set the node.elasticsearch[:nginx][:allow_cluster_api] attribute to true if you want to access these tools via the proxy.)
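A sketch of enabling that attribute in a role, using the standard Chef role DSL:

```ruby
# roles/elasticsearch.rb -- sketch; lets cluster API endpoints
# (e.g. /_cluster/health) pass through the Nginx proxy.
default_attributes "elasticsearch" => {
  "nginx" => { "allow_cluster_api" => true }
}
```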
To enable authorized access to elasticsearch, you need to include the elasticsearch::proxy recipe, which will install, configure and run Nginx as a reverse proxy, allowing users with proper credentials to connect.
As with AWS, you may store the usernames and passwords in the node configuration, or in a data bag item:
mkdir -p ./data_bags/elasticsearch
echo '{
  "id" : "users",
  "users" : [
    {"username" : "USERNAME", "password" : "PASSWORD"},
    {"username" : "USERNAME", "password" : "PASSWORD"}
  ]
}' >> ./data_bags/elasticsearch/users.json
Again, do not forget to upload the data bag to the Chef server:
knife data bag from file elasticsearch users.json
After you have configured the node and uploaded all the information to the Chef server, run chef-client
on the node(s):
knife ssh name:elasticsearch* 'sudo chef-client'
The cookbook comes with a Vagrantfile, allowing you to test-drive the installation and configuration with Vagrant, a tool for building virtualized development infrastructures.
First, make sure you have both VirtualBox and Vagrant installed.
Then, clone this repository into an elasticsearch directory on your development machine:
git clone git://github.com/karmi/cookbook-elasticsearch.git elasticsearch
Switch to the cloned repository:
cd elasticsearch
Install the necessary gems:
bundle install
You need to download the required third-party cookbooks (unless you already have them in ~/cookbooks).
You can install the cookbooks manually by simply downloading them:
curl -# -L -k http://s3.amazonaws.com/community-files.opscode.com/cookbook_versions/tarballs/1184/original/apt.tgz | tar xz -C tmp/cookbooks
curl -# -L -k http://s3.amazonaws.com/community-files.opscode.com/cookbook_versions/tarballs/1421/original/java.tgz | tar xz -C tmp/cookbooks
curl -# -L -k http://s3.amazonaws.com/community-files.opscode.com/cookbook_versions/tarballs/1098/original/vim.tgz | tar xz -C tmp/cookbooks
curl -# -L -k http://s3.amazonaws.com/community-files.opscode.com/cookbook_versions/tarballs/1413/original/nginx.tgz | tar xz -C tmp/cookbooks
curl -# -L -k http://s3.amazonaws.com/community-files.opscode.com/cookbook_versions/tarballs/915/original/monit.tgz | tar xz -C tmp/cookbooks
curl -# -L -k http://s3.amazonaws.com/community-files.opscode.com/cookbook_versions/tarballs/1631/original/ark.tgz | tar xz -C tmp/cookbooks
An easier way, though, is to use the bundled Berkshelf support -- the cookbooks will be automatically installed and mounted in the virtual machine.
(You can use the berks install --path ./tmp/cookbooks command as well.)
The Vagrantfile supports four Linux distributions so far:
- Ubuntu Precise 64 bit
- Ubuntu Lucid 32 bit
- Ubuntu Lucid 64 bit
- CentOS 6 32 bit
Use the vagrant status command for more information.
We will use the Ubuntu Precise 64 box for the purpose of this demo. You may want to test-drive this cookbook on a different distribution; check out the available boxes at http://vagrantbox.es or build a custom one with veewee.
Launch the virtual machine with Vagrant (it will download the box unless you already have it):
time vagrant up precise64
The machine will be started and automatically provisioned with
chef-solo.
(Note: To use the latest version of Chef, run CHEF=latest vagrant up precise64. You may substitute latest with a specific version.)
You'll see Chef debug messages flying by in your terminal, downloading, installing and configuring Java, Nginx, elasticsearch, and all the other components. The process should take about 15 minutes on a reasonable machine and internet connection.
After the process is done, you may connect to elasticsearch via the Nginx proxy from the outside:
curl 'http://USERNAME:PASSWORD@33.33.33.10:8080/test_chef_cookbook/_search?pretty&q=*'
Of course, you should connect to the box with SSH and check things out:
vagrant ssh precise64
ps aux | grep elasticsearch
service elasticsearch status --verbose
curl http://localhost:9200/_cluster/health?pretty
The cookbook provides test cases in the files/default/tests/minitest/ directory, which are executed as part of the Chef run on Vagrant (via the Minitest Chef Handler support).
They check the basic installation mechanics, populate the test_chef_cookbook index with some sample data, and perform a simple search.
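For orientation, a test case for the Minitest Chef Handler generally follows this shape (this is a sketch, not copied from the cookbook's own tests; the file name and assertions are illustrative):

```ruby
# files/default/tests/minitest/example_test.rb -- hypothetical sketch of a
# Minitest Chef Handler test; runs inside a Chef run, not standalone.
describe_recipe "elasticsearch::default" do
  it "runs the elasticsearch service" do
    service("elasticsearch").must_be_running
  end

  it "responds on port 9200" do
    assert system("curl -s localhost:9200 > /dev/null")
  end
end
```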
- attributes/default.rb: version, paths, memory and naming settings for the node
- attributes/aws.rb: AWS settings
- attributes/proxy.rb: Nginx settings
- templates/default/elasticsearch.init.erb: service init script
- templates/default/elasticsearch.yml.erb: main elasticsearch configuration file
- templates/default/elasticsearch-env.sh.erb: environment variables needed by the Java Virtual Machine and elasticsearch
- templates/default/elasticsearch_proxy.conf.erb: the reverse proxy configuration for Nginx
- templates/default/elasticsearch.conf.erb: Monit configuration file
- files/default/tests/minitest: integration tests
Author: Karel Minarik (karmi@elasticsearch.com) and contributors
MIT LICENSE