It’s been almost two months since I started working full time with Ansible. Like most sysadmins, I’ve been using Ansible via the CLI most of the time. Unlike Salt/Puppet, Ansible is agentless, so everything has to be invoked from the box on which Ansible and the respective playbooks are installed. Also, if you want to use Ansible with EC2 features like auto scaling, you need to either buy Ansible Tower or use ansible-pull along with a user data script. I’ve also seen people who use custom scripts that fetch their repo and execute the playbooks locally to bootstrap the machine.
Being a big fan of Flask, I’ve used it to create many backend APIs that automate a bunch of my tasks. So this time I decided to write a simple Flask API for executing Ansible playbooks, ad hoc commands, etc. Ansible also provides a Python API, which made my work easier. Like most Ansible users, I use roles for all my playbooks. We could directly expose an API that executes playbooks synchronously, but there are cases where a playbook run takes more than five minutes, and of course any network latency will slow down package downloads and the like. I don’t want to force my HTTP clients to wait for the final output of the playbook execution before getting a response back.
So I decided to go ahead with a job queue. Each time a request comes to my API, the job is queued in Redis and the job ID is returned to the client. My job workers then pick the jobs from the Redis queue, perform the execution on the backend, and keep updating the job status. So I need to expose two APIs: one for receiving jobs and one for job status. For the Redis queue there is an awesome library called rq, which I’ve been using for all my queuing tasks.
Flask API
The job accepts a bunch of parameters, such as host, role and env, via an HTTP POST. Since the role, host, etc. have to be retrieved from the HTTP request, my playbook YAML file has to be generated dynamically, so I decided to use Jinja templating to create it. Below is my sample API for role-based playbook execution.
@app.route('/ansible/role/', methods=['POST'])
def role():
    inst_ip = request.form['host']    # Host on which the playbook has to be executed
    inst_role = request.form['role']  # Role(s) to be applied by the playbook
    env = request.form['env']         # Extra env variables to be passed while executing the playbook
    ans_remote_user = "ubuntu"        # Default remote user
    ans_private_key = "/home/ubuntu/.ssh/id_rsa"  # Default ssh private key
    job = q.enqueue_call(             # Queuing the job on to Redis
        func=ansble_run,
        args=(inst_ip, inst_role, env, ans_remote_user, ans_private_key),
        result_ttl=5000, timeout=2000
    )
    return job.get_id()               # Returns the job id if the job was successfully queued
Below is a sample templating function that generates the playbook YAML file via Jinja2 templating.
def gen_pbook_yml(ip, role):
    templateLoader = jinja2.FileSystemLoader(searchpath="/")
    templateEnv = jinja2.Environment(loader=templateLoader)
    TEMPLATE_FILE = "/opt/ansible/playbook.jinja"  # Jinja template file location
    template = templateEnv.get_template(TEMPLATE_FILE)
    role = role.split(',')  # Make role a list if multiple roles are mentioned in the POST request
    r_text = ''.join([random.choice(string.ascii_letters + string.digits) for n in xrange(32)])
    temp_file = "/tmp/" + "ans-" + r_text + ".yml"  # Creating a unique playbook yml file
    templateVars = {"hst": ip,
                    "roles": role}
    outputText = template.render(templateVars)  # Rendering the template
    text_file = open(temp_file, "w")
    text_file.write(outputText)  # Saving the rendered output to the temp file
    text_file.close()
    return temp_file
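The contents of playbook.jinja are not shown above; a minimal sketch of what such a template might look like, given the hst and roles variables passed in gen_pbook_yml (the template body itself is my assumption), rendered here from an inline string:

```python
import jinja2

# A guess at what playbook.jinja might contain, based on the "hst" and
# "roles" variables passed by gen_pbook_yml (assumption, not the original).
PLAYBOOK_TEMPLATE = """---
- hosts: {{ hst }}
  roles:
{% for role in roles %}    - {{ role }}
{% endfor %}"""

template = jinja2.Environment().from_string(PLAYBOOK_TEMPLATE)
print(template.render({"hst": "10.0.0.5", "roles": ["common", "nginx"]}))
```

The for loop lets a single template handle both one role and a comma-separated list of roles from the POST request.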
Once the playbook file is ready, we need to invoke Ansible’s Python API to perform the bootstrapping. This is actually done by the job workers. Below is a sample function which invokes the PlayBook API from Ansible core.
def ansble_run(ans_inst_ip, ans_inst_role, ans_env, ans_user, ans_key_file):
    yml_pbook = gen_pbook_yml(ans_inst_ip, ans_inst_role)  # Generating the playbook yml file
    run_pbook = ansible.playbook.PlayBook(  # Invoking Ansible's PlayBook API
        playbook=yml_pbook,
        callbacks=playbook_cb,
        runner_callbacks=runner_cb,
        stats=stats,
        remote_user=ans_user,
        private_key_file=ans_key_file,
        host_list="/etc/ansible/hosts",  # Use either host_list or inventory
        # inventory='path/to/inventory/file',
        extra_vars={
            'env': ans_env
        }
    ).run()
    return run_pbook  # We can tune the output that has to be returned
The job workers execute the playbooks and update the status in Redis. Now we need to expose our job status API. Below is a sample Flask API for the same.
@app.route("/ansible/results/<job_key>", methods=['GET'])
def get_results(job_key):
    job = Job.fetch(job_key, connection=conn)
    if job.is_finished:
        ret = job.return_value
    elif job.is_queued:
        ret = {'status': 'in-queue'}
    elif job.is_started:
        ret = {'status': 'waiting'}
    elif job.is_failed:
        ret = {'status': 'failed'}
    else:
        ret = {'status': 'unknown'}  # Fallback so ret is always defined
    return json.dumps(ret), 200
Now we have a full-fledged API server for executing role-based playbooks. This API can also be used with user data scripts in auto scaling, where we just need to perform an HTTP POST to the API server and it will start the bootstrapping. I’ve tested this app locally with various scenarios and the results are promising. As a next step, I’m planning to extend the API to do more jobs, like automating code pushes and running ad hoc commands via the API. With applications like Ansible, Redis and Flask, I’m sure sysadmins can attain DevOps nirvana :). I’ll be pushing the latest working code to my GitHub account soon.