How to Install Datical in AWS

Introduction

Cloud adoption among large enterprises is at record numbers after nearly a decade of growing investments. AWS is a market leader in serving the growing enterprise appetite for cloud. Enterprises look to AWS for the following key benefits: cost reduction, flexibility, and availability. These benefits stem from the fact that cloud means hardware utilization is on-demand, delivery is near-instant along with scale up or scale down of infrastructure, and business continuity is better guaranteed as services can be replicated across the globe.

As cloud footprints are growing, organizations are looking beyond basic infrastructure as a service and are keen on getting more out of cloud investments by taking advantage of platform services to enable serverless architectures, machine learning, or intelligent networking such as automatic load-balancing and more. However, as cloud footprints start to mature, organizations are being held back by a slow, manual, and opaque database release process which threatens all the benefits of the cloud delivery model. In response, enterprises are now realizing the need to make database code deployment as fast, transparent, and automated as application releases.

To streamline software delivery from the cloud, organizations are looking to bring continuous integration to their databases in order to:

  • Discover mistakes quickly.
  • Deliver updates faster and more frequently.
  • Help developers write better code.
  • Automate the database release management process.

The end goal is to make sure database code can be promoted automatically and in lockstep with application code changes. With database continuous integration and delivery, software development teams can deliver smaller, less risky deployments, making it possible to respond more quickly to business or customer needs. After installing Datical on AWS, you will have a CI/CD pipeline for your database code.

Architecture Diagram

Prerequisites

You’ll need an AWS account, an Amazon EC2 key pair, and appropriate permissions for AWS Identity and Access Management (IAM). From Datical, you’ll need access to the Datical software; request a custom demo of Datical. You will also need to be familiar with executing SQL against a database and with general CI/CD concepts.

This solution utilizes EC2 and RDS instances; you should be familiar with these technologies along with AWS networking and IAM permissions.

Support

For support, contact support@datical.com.

Costs

This guide will create the AWS resources outlined below. You are responsible for the cost of AWS services used while running this deployment. The following assets are required to provide a functional platform:

  • 1 EC2 instance
  • 1 RDS instance

Refer to the AWS Pricing guide to see the most recent costs in your region. As of this writing, the estimated monthly cost of the Datical EC2 instance in US West is $281. RDS costs depend on your database size. Request current Datical software licensing costs here.

Disaster Recovery and High Availability

If Datical goes down, your mission-critical systems will not be impacted; the outage only prevents you from updating your database schema until service is restored.

You can optionally leverage AWS Disaster Recovery and High Availability for EC2 and RDS. Refer to the following documents for more information:

  • https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-increase-availability.html
  • https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Concepts.MultiAZ.html

Health Check

If your Datical installation is not performing as expected, you can verify its status with the following commands.

$ su - datical
$ datical-control services show
$ datical-control versions show

Restarting Datical services is the first step in diagnosing an issue.

$ su - datical
$ datical-control services restart
To verify the status for your AWS Region, visit https://status.aws.amazon.com/. You can also view your Personal Health Dashboard from this link. For more information on EC2 monitoring, visit this page: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/monitoring-system-instance-status-check.html.

Service Limits

To monitor your AWS service limits, we recommend using AWS Trusted Advisor. See the following for more information on increasing service limits, if necessary: https://docs.aws.amazon.com/general/latest/gr/aws_service_limits.html.

Install and configure Datical

Datical is a long-running server process built on a microservices architecture. As such, you will need to install it on an AWS EC2 instance. This guide assumes you will be installing on CentOS. Datical prerequisites include 4 cores and 8 GB RAM. For this example, we will use an m5.2xlarge instance with 250 GB of storage. Make certain that your VPC configuration allows access over SSH, HTTP, and HTTPS.

The external hostname of your EC2 instance needs to be discoverable from your local machine. Set the instance hostname to the DNS name returned by “nslookup <external IP address>” by issuing “sudo hostname <hostname>”.
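The hostname alignment above can be sketched as a small helper. This is only a sketch: the parsing assumes nslookup’s typical “name = …” answer line on Linux, and the IP address shown is an example, not a value from this guide.

```shell
# Helper: extract the DNS name from nslookup's reverse-lookup output.
# Assumption: the answer line has the form "... name = <dns-name>."
# Prints the name without the trailing dot.
ptr_name() {
  awk -F'name = ' '/name = /{sub(/\.$/, "", $2); print $2; exit}'
}

# Usage on the EC2 instance (substitute your instance's external IP):
#   PUBLIC_DNS=$(nslookup 54.191.92.21 | ptr_name)
#   sudo hostname "$PUBLIC_DNS"
```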

Step 0: Create an IAM user or role for Datical

Your IAM user should have a policy that allows AWS EC2 and RDS actions. Do not use your root account to deploy Datical. This deployment requires permissions to all services listed in the following section.
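As a sketch of the permissions setup, a dedicated IAM user with the AWS-managed EC2 and RDS policies could be created from the AWS CLI. The user name below is an example, not one the guide prescribes, and a narrower least-privilege custom policy (or an instance role) is preferable in production.

```shell
# Sketch: create a dedicated IAM user for the Datical deployment and
# attach the AWS-managed EC2 and RDS policies. The user name is an
# example; tighten to a custom policy for production use.
aws iam create-user --user-name datical-deploy

aws iam attach-user-policy --user-name datical-deploy \
  --policy-arn arn:aws:iam::aws:policy/AmazonEC2FullAccess

aws iam attach-user-policy --user-name datical-deploy \
  --policy-arn arn:aws:iam::aws:policy/AmazonRDSFullAccess
```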

Step 1: Create AWS EC2 instance to run Datical

Visit console.aws.amazon.com and create a new EC2 instance. For this guide, we will be using the public AWS AMI for Centos 7. Create an m5.2xlarge instance with 250GB of storage. Verify that you can access SSH, HTTP, and HTTPS for your VPC configuration.

Note: You should configure your EC2 instance to be in the same VPC as your database, or at least to have access to that VPC. Also, verify that the same IAM permissions are applied to both the Datical EC2 instance and the RDS instances you will manage with Datical.

Reference: https://docs.aws.amazon.com/vpc/latest/userguide/VPC_SecurityGroups.html

Your VPC settings should allow communication to your RDS database ports. For example, if you are running Oracle in RDS, you should expose port 1521.
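To illustrate the Oracle example, opening the listener port from the Datical instance to the database might look like the following. Both security group IDs are placeholders for your own; this is a sketch, not a prescribed configuration.

```shell
# Sketch: allow the Datical EC2 instance's security group to reach the
# RDS Oracle listener on port 1521.
#   --group-id:     security group attached to the RDS instance (placeholder)
#   --source-group: security group attached to the Datical EC2 instance (placeholder)
aws ec2 authorize-security-group-ingress \
  --group-id sg-0aaa1111bbbb22223 \
  --protocol tcp \
  --port 1521 \
  --source-group sg-0ccc3333dddd44445
```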

Step 2: Configure AWS EC2 instance to run Datical

Log into the EC2 instance using your command line and SSH.

Run the following commands to verify that all necessary packages are installed.

sudo yum -y update
sudo yum install epel-release -y
sudo yum install jq -y
sudo yum install \
"/bin/bash" \
"/bin/sh" \
"/sbin/ldconfig" \
"/usr/bin/env" \
"/usr/bin/python2" \
"attr" \
"ca-certificates" \
"container-selinux >= 2.9" \
"createrepo" \
"curl" \
"device-mapper-libs >= 1.02.90-1" \
"ebtables" \
"ethtool" \
"iproute" \
"iptables" \
"iptables >= 1.4.21" \
"java-1.8.0-openjdk-devel" \
"ld-linux-x86-64.so.2()(64bit)" \
"ld-linux-x86-64.so.2(GLIBC_2.3)(64bit)" \
"libacl.so.1()(64bit)" \
"libacl.so.1(ACL_1.0)(64bit)" \
"libaio.so.1()(64bit)" \
"libaio.so.1(LIBAIO_0.1)(64bit)" \
"libaio.so.1(LIBAIO_0.4)(64bit)" \
"libc.so.6(GLIBC_2.17)(64bit)" \
"libc.so.6(GLIBC_2.2.5)(64bit)" \
"libc.so.6(GLIBC_2.4)(64bit)" \
"libc.so.6(GLIBC_2.6)(64bit)" \
"libc.so.6(GLIBC_2.8)(64bit)" \
"libcgroup" \
"libcrypto.so.10()(64bit)" \
"libcrypto.so.10(OPENSSL_1.0.1_EC)(64bit)" \
"libcrypto.so.10(libcrypto.so.10)(64bit)" \
"libdevmapper.so.1.02()(64bit)" \
"libdevmapper.so.1.02(Base)(64bit)" \
"libdevmapper.so.1.02(DM_1_02_97)(64bit)" \
"libdl.so.2()(64bit)" \
"libdl.so.2(GLIBC_2.2.5)(64bit)" \
"libgcc_s.so.1()(64bit)" \
"libgcc_s.so.1(GCC_3.0)(64bit)" \
"libgcc_s.so.1(GCC_3.3.1)(64bit)" \
"liblvm2app.so.2.2()(64bit)" \
"liblvm2app.so.2.2(Base)(64bit)" \
"libm.so.6()(64bit)" \
"libncurses.so.5()(64bit)" \
"libpthread.so.0()(64bit)" \
"libpthread.so.0(GLIBC_2.12)(64bit)" \
"libpthread.so.0(GLIBC_2.2.5)(64bit)" \
"libpthread.so.0(GLIBC_2.3.2)(64bit)" \
"libreadline.so.6()(64bit)" \
"librt.so.1()(64bit)" \
"librt.so.1(GLIBC_2.2.5)(64bit)" \
"libseccomp >= 2.3" \
"libseccomp.so.2()(64bit)" \
"libsqlite3.so.0()(64bit)" \
"libssl.so.10()(64bit)" \
"libssl.so.10(libssl.so.10)(64bit)" \
"libsystemd.so.0()(64bit)" \
"libsystemd.so.0(LIBSYSTEMD_209)(64bit)" \
"libtinfo.so.5()(64bit)" \
"libuuid.so.1()(64bit)" \
"libuuid.so.1(UUID_1.0)(64bit)" \
"libxml2.so.2()(64bit)" \
"libxml2.so.2(LIBXML2_2.4.30)(64bit)" \
"libxml2.so.2(LIBXML2_2.6.0)(64bit)" \
"libxml2.so.2(LIBXML2_2.6.3)(64bit)" \
"libz.so.1()(64bit)" \
"lvm2" \
"nano" \
"nmap-ncat" \
"psmisc" \
"python(abi) = 2.7" \
"python-jinja2" \
"python-paramiko" \
"python-pycurl" \
"python-setuptools" \
"python-six" \
"python2-cryptography" \
"pyxattr" \
"rpcbind" \
"rtld(GNU_HASH)" \
"shadow-utils" \
"socat" \
"sshpass" \
"systemd" \
"systemd-units" \
"tar" \
"unzip" \
"util-linux" \
"vim" \
"wget" \
"xz" \
"yum-utils"

On the system where you are going to install Datical, do the following:

Create the datical user, set a password, and add it to the wheel group so it can use sudo.

$ sudo useradd -m datical 
$ sudo passwd datical 
$ sudo usermod -aG wheel datical

Visit software.datical.com and navigate to Folders –> COMMON –> Datical Service Software, then download the file get_datical_service.sh. Execute the script using “sudo ./get_datical_service.sh 5.12” and enter your username and password for software.datical.com. This downloads all required software onto your EC2 instance.

Step 3: Install Datical

On the host where you are going to install Datical Service, create directories to contain the installation files you downloaded.

sudo mkdir -p /opt/datical/install-files

Move all files from the download host to the Datical Service host.

  • From:  /tmp/datical59files on the download host
  • To: /opt/datical/install-files on the intended Datical Service host
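If the files were downloaded on a separate machine, the copy can be sketched with scp; the key path, user, and hostname below are placeholders for your environment. When the download host is the Datical Service host itself, a local move suffices.

```shell
# Sketch: copy the downloaded files to the Datical Service host.
# Key path, user, and hostname are placeholders for your environment.
scp -i ~/.ssh/my-key.pem /tmp/datical59files/* \
  centos@ec2-54-191-92-21.us-west-2.compute.amazonaws.com:/tmp/

# Then, on the Datical Service host, move them into the install directory:
#   sudo mv /tmp/datical-service*.part* /opt/datical/install-files/
```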

Run the Pre-Installation Check Script

The check script verifies that the host where you intend to install Datical Service meets the requirements. The output guides you to issues that need to be addressed.

Add execution privilege to the script before running it.

cd /opt/datical/install-files
chmod +x datical-service-preinstall-check_VERSION.sh
sudo ./datical-service-preinstall-check_VERSION.sh

The check script checks the following areas:

  • Hardware resources (CPUs, RAM)
  • yum configuration (existence of “extras” repository)
  • Dependencies (packages that must already be on the host)
  • Network and ports
  • Administrative user exists
  • All required .part files are available

Unpack Files into /opt/datical

Unpack the main Datical Service files into the Datical Service directory.

cat /opt/datical/install-files/datical-service*.part* | sudo tar xfv - -C /opt/datical

Run the Installation Script

sudo /opt/datical/datical-service-install.sh

Answer “y” whenever prompted during the dependencies check. Any needed packages are installed from the local repository created from the .part files.

You may be prompted more than once.

The dependencies check output takes the following form. Numbers for the actual dependencies appear in place of NNN.

Dependencies Resolved
# (list of packages and their repos appears here)
Transaction Summary
============================================================================
Install  N Package  (+NNN Dependent packages)
Upgrade             (  NN Dependent packages)
Total download size: NNN M
Is this ok [y/d/N]:

After the last dependencies check is complete, installation continues.

Answer the prompts with answers for your environment. 

You are prompted to confirm each answer. The confirmation prompts are not shown in the example below.

Configuring System...
------------------------------------------------------
------ Datical OS Check --------------
------------------------------------------------------

What IP should Datical Server listen on? [172.30.1.197 (ens5)]
Are you sure you want Datical Service to listen on 172.30.1.197? [y/N] y
What is the hostname used to access Datical Service? ec2-54-191-92-21.us-west-2.compute.amazonaws.com
Are you sure you will use 'https://ec2-54-191-92-21.us-west-2.compute.amazonaws.com' to access Datical Service? [y/N] y
Starting Datical Server OS-level configuration...

Verify the Installation

The installation script starts the services.

Verify that all services are running.

$ su - datical
$ datical-control services show
$ datical-control versions show

Note: It may take a minute or so for all services to show as RUNNING. Run the command again until they all show as RUNNING.

Step 4: Log On

Access and log in to Datical Service in a browser, using the virtual image URL.

URL – https://<hostname>/

  • Use the value you entered in the installation prompt: What is the hostname used to access Datical Service? [myhost]

User – support@datical.com

  • This is the pre-set user with administrative privileges in Datical Service.

Password – datical

  • This is the pre-set password for the initial administrative user.

Step 5: Prepare for future Deployments and Recovery (optional)

To support later Datical deployments and recovery, we recommend you create a private AMI of the previously configured system.

Reference: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/AMIs.html

Use the following CLI command to back up all services.

datical-control backup all

Backing Up Selected Components

You can also back up the following components individually by using an alternate option on the command line:

  • postgresql – Backs up the PostgreSQL application database.
  • internal-client – Backs up user data stored in the internal client service (Jenkins).

They are included when you use backup all.

Backup Location

Manual backups are stored in set directories using a timestamped file name pattern. Examples:

/opt/datical/data-clustered/backup/all/datical-service-all-2018-10-23_18-29.bak
/opt/datical/data-clustered/backup/postgresql/datical-service-postgresql-2018-10-23_18-29.bak
/opt/datical/data-clustered/backup/jenkins/datical-internal-client-2018-10-23_18-29.bak

Note: backups in the all directory come only from manual backups.

The postgresql and jenkins directories contain automated backups and any manual backups you perform. See Auto-Backup.

Backup Retention

Only the last 10 backups are retained in the postgresql and jenkins directories. Older backups are discarded as new backups are made.

There is no limit on backups kept in the all directory.

Restoring Datical Service

Use the following command to restore backup files.

sudo datical-control restore [all | postgresql | internal-client] --file <filename>

You may specify the absolute path or the path from the current working directory.

Auto-Backup

Auto-backup is performed by cron jobs that run at 12:00 am, 6:00 am, and 6:00 pm. They cannot be changed or turned off.

They perform the following backups:

  • postgresql – PostgreSQL application database
  • internal-client – User data stored in the internal client service (Jenkins).

Auto-backup uses the same directory and file pattern that manual backups use.

/opt/datical/data-clustered/backup/postgresql/datical-service-postgresql-2018-10-23_18-29.bak
/opt/datical/data-clustered/backup/jenkins/datical-internal-client-2018-10-23_18-29.bak

Disaster Recovery and Backup Storage

You should provide tools or processes for moving backup files to an offsite location so they are available for disaster recovery.

We recommend that you back up the following directory in coordination with your offline backup processes. It contains all automated and manual backups.

/opt/datical/data-clustered/backup/
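One way to get these backups offsite is a periodic aws s3 sync. This is a sketch under assumptions: the bucket name is a placeholder for one you control, and the instance or user running the command needs write access to it.

```shell
# Sketch: replicate all Datical backups to an S3 bucket for disaster
# recovery. The bucket name is a placeholder; the caller needs
# s3:PutObject and s3:ListBucket permissions on it.
aws s3 sync /opt/datical/data-clustered/backup/ \
  s3://my-datical-backups/datical/ --storage-class STANDARD_IA

# Example cron entry (daily at 6:30 am, after the 6:00 am auto-backup):
#   30 6 * * * aws s3 sync /opt/datical/data-clustered/backup/ s3://my-datical-backups/datical/
```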

Step 6: Datical upgrades (optional)

Datical can be upgraded via the Datical installer. Simply download a newer version of the software as detailed above. During installation, you will be prompted that an existing installation has been detected. Select “Y” to update the existing installation.