
Custom SQL Scripts with SpringBot

This article will explore how you can create custom SQL scripts in your SpringBot application.

As part of SpringBot, developers can add custom SQL scripts to gain more fine-grained control over the underlying database. These scripts define the desired database state through both schema and data migration scripts.

By default, SpringBot uses Hibernate and Java Persistence API (or JPA) as a bridge between the application and the database. However, you may sometimes require custom database manipulation that is not possible with Hibernate. SQL scripts can be added to allow better control and performance.

This article will walk through how to add your custom SQL scripts and then execute them in your project.

Migration Scripts

By default, Spring exposes multiple Hibernate configurations, which can be found in the serverside/src/main/resources directory. These files take the form application-[EnvironmentName].properties. For more details, refer to the Spring Common Application Properties documentation.

The most important database-related Hibernate configuration is spring.jpa.hibernate.ddl-auto. This property determines what Hibernate should do with the schema when the server bootstraps.

The two most common options are:

Option       Description
update       Update the current database schema and keep the data.
create-drop  Drop the tables in the database and create new tables.

However, these options are only suitable for development or testing. Database migration in production is inherently riskier and requires more care. When performing a production release of a SpringBot application, database migration is performed in a separate process run during server bootstrapping. Therefore, spring.jpa.hibernate.ddl-auto=none should be set in the beta and production environments instead.
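
As a rough sketch, the per-environment settings might look like the following (the exact file names depend on the environments defined for your project; application-dev.properties and application-production.properties are assumptions here):

# application-dev.properties (assumed name): let Hibernate manage the local schema
spring.jpa.hibernate.ddl-auto=update

# application-production.properties (assumed name): disable Hibernate schema management;
# migrations are handled by the separate SQL project described below
spring.jpa.hibernate.ddl-auto=none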

Liquibase

SpringBot database migration employs Liquibase, an open-source database-independent library for tracking, managing and applying database schema changes.

To help developers easily set up and use Liquibase in their local environment without installing it, we suggest creating a separate Gradle project. This project can be executed before the main project to handle database migration.

Setting up a Liquibase Project

Creating a Gradle Project

To start, you will need a folder which contains a Java project that employs Liquibase for database migration.

  1. Create a new folder in serverside called sql.

  2. Inside this folder, create the following files:

    File Name          Description                              Location
    build.gradle       Gradle configuration file                serverside/sql/build.gradle
    gradle.properties  Gradle properties file                   serverside/sql/gradle.properties
    settings.gradle    Gradle settings file                     serverside/sql/settings.gradle
    changelog.xml      Change log configuration for Liquibase   serverside/sql/changelog.xml
  3. If you have previously installed Gradle as part of your environment setup, open a new terminal and run gradle wrapper --gradle-version=5.2.1 in your new sql directory.

This command initialises a Gradle wrapper so that you can run the project with ./gradlew without a local Gradle installation. Next, we will add the Gradle configuration for our project to build.gradle.
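
If the wrapper was generated successfully, the sql directory should now contain something like the following, in addition to the files created in step 2 (listing shown for orientation only):

serverside/sql/
├── build.gradle
├── changelog.xml
├── gradle.properties
├── settings.gradle
├── gradlew
├── gradlew.bat
└── gradle/
    └── wrapper/
        ├── gradle-wrapper.jar
        └── gradle-wrapper.properties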

Adding Gradle Configuration

  1. Insert the following code into your build.gradle:
/**
 * Gradle Project for the database migration
 */

plugins {
  id 'org.liquibase.gradle' version '2.0.1'
}

repositories {
    mavenCentral()
}

dependencies {
    liquibaseRuntime 'org.liquibase:liquibase-core:3.8.0'
    // Database
    liquibaseRuntime 'org.postgresql:postgresql:42.2.5'
}

def CHANGELOG = "$projectDir/changelog.xml"

liquibase {
    activities {
        main {
            changeLogFile CHANGELOG
            url DATABASE_URL
            username DATABASE_USERNAME
            password DATABASE_PASSWORD
            driver 'org.postgresql.Driver'
        }
    }
}

This adds Liquibase as a project dependency and configures its task with the CHANGELOG, DATABASE_URL, DATABASE_USERNAME, and DATABASE_PASSWORD properties. The database properties come from the gradle.properties file, which holds all of the properties relevant to our project, so you will need to define them there as well.

  2. Insert the following code into your gradle.properties:
     # Dev Database Configuration
     DATABASE_URL=your_database_url
     DATABASE_USERNAME=username
     DATABASE_PASSWORD=password
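
For a local PostgreSQL development database, the values would look something like the following (the host, port, database name and credentials are example values only; use your own):

     # Dev Database Configuration (example values)
     DATABASE_URL=jdbc:postgresql://localhost:5432/fishdb
     DATABASE_USERNAME=postgres
     DATABASE_PASSWORD=postgres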

Adding Changelog Configuration

Liquibase requires a changelog file to drive the migration. Out of the box, Liquibase provides plenty of functionality for managing the migration process, such as preconditions, changesets, and recording the author of each change. For more details, refer to Liquibase's documentation.
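
To illustrate those features, a hand-written changeset inside a changelog might look like the sketch below; the id, author, and table name are placeholders rather than part of this project's setup:

     <changeSet id="20200114-01" author="your.name">
         <preConditions onFail="MARK_RAN">
             <!-- Only run this changeset if the table does not exist yet -->
             <not>
                 <tableExists tableName="example_table"/>
             </not>
         </preConditions>
         <createTable tableName="example_table">
             <column name="id" type="uuid">
                 <constraints primaryKey="true" nullable="false"/>
             </column>
             <column name="name" type="varchar(255)"/>
         </createTable>
     </changeSet>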

Follow the steps below to add the changelog configuration:

  1. Insert the following code into your changelog.xml:

     <?xml version="1.0" encoding="UTF-8"?>
     <databaseChangeLog
         xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
                      http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.1.xsd">
    
         <includeAll path="update_scripts"/>
     </databaseChangeLog>

    Here the changelog includes all of the *.sql and *.xml files inside the update_scripts directory. Liquibase runs these scripts in alphabetical order, which is why, as best practice, we recommend naming the files in the format yyyymmdd_{index}_{description}.sql, prepending each file name with the current date. An example layout is shown below.
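
    For example, an update_scripts directory following this convention might look like this (the second and third file names are purely illustrative):

    serverside/sql/update_scripts/20200114_01_insert_tank.sql
    serverside/sql/update_scripts/20200114_02_add_fish_description.sql
    serverside/sql/update_scripts/20200203_01_add_species_habitat.xml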

Creating Migration Scripts

First, create the directory update_scripts in your sql directory. We will be using this folder to contain all of our update script files. The path for your new directory should be serverside/sql/update_scripts.

Here we will create a custom update SQL script to add a bunch of Tanks, Species and Fish into our app:

Create a new file called 20200114_01_insert_tank.sql (replacing the date prefix with the current date) and insert the following code:

-- Load extension to generate uuid
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

DO $$
DECLARE
    tank_uuid uuid := uuid_generate_v4();
    clownfish_uuid uuid := uuid_generate_v4();
    moorish_idol_uuid uuid := uuid_generate_v4();
    yellow_tang_uuid uuid := uuid_generate_v4();
    pacific_cleaner_shrimp_uuid uuid := uuid_generate_v4();
    porcupine_pufferfish_uuid uuid := uuid_generate_v4();
    black_white_damselfish_uuid uuid := uuid_generate_v4();
    seastar_uuid uuid := uuid_generate_v4();
    royal_gramma_uuid uuid := uuid_generate_v4();
BEGIN

-- Insert Dentist Tank
insert into tank_entity (id, created, created_by, modified, modified_by, clean, height, last_cleaned, length, name, width) values (tank_uuid, now(), null, now(), null, null, 300, now(), 200, 'Dentist Tank',  200);
-- Insert Species
insert into species_entity (id, created, created_by, modified, modified_by, name) values (clownfish_uuid, now(), null, now(), null, 'Clownfish');
insert into species_entity (id, created, created_by, modified, modified_by, name) values (moorish_idol_uuid, now(), null, now(), null, 'Moorish Idol');
insert into species_entity (id, created, created_by, modified, modified_by, name) values (yellow_tang_uuid, now(), null, now(), null, 'Yellow Tang');
insert into species_entity (id, created, created_by, modified, modified_by, name) values (pacific_cleaner_shrimp_uuid, now(), null, now(), null, 'Pacific Cleaner Shrimp');
insert into species_entity (id, created, created_by, modified, modified_by, name) values (porcupine_pufferfish_uuid, now(), null, now(), null, 'Porcupine Pufferfish');
insert into species_entity (id, created, created_by, modified, modified_by, name) values (black_white_damselfish_uuid, now(), null, now(), null, 'Black & White Damselfish');
insert into species_entity (id, created, created_by, modified, modified_by, name) values (seastar_uuid, now(), null, now(), null, 'Seastar');
insert into species_entity (id, created, created_by, modified, modified_by, name) values (royal_gramma_uuid, now(), null, now(), null, 'Royal Gramma');
-- Insert Fishes
insert into fish_entity (id, created, created_by, modified, modified_by, alive, date_of_birth, born,  name, species_id, tank_id) values (uuid_generate_v4(), now(), null, now(), null, true, now(), 1,  'Nemo', clownfish_uuid, tank_uuid);
insert into fish_entity (id, created, created_by, modified, modified_by, alive, date_of_birth, born,  name, species_id, tank_id) values (uuid_generate_v4(), now(), null, now(), null, true, now(), 1,  'Gill', moorish_idol_uuid, tank_uuid);
insert into fish_entity (id, created, created_by, modified, modified_by, alive, date_of_birth, born,  name, species_id, tank_id) values (uuid_generate_v4(), now(), null, now(), null, true, now(), 1,  'Bubbles', yellow_tang_uuid, tank_uuid);
insert into fish_entity (id, created, created_by, modified, modified_by, alive, date_of_birth, born,  name, species_id, tank_id) values (uuid_generate_v4(), now(), null, now(), null, true, now(), 1,  'Jacques', pacific_cleaner_shrimp_uuid, tank_uuid);
insert into fish_entity (id, created, created_by, modified, modified_by, alive, date_of_birth, born,  name, species_id, tank_id) values (uuid_generate_v4(), now(), null, now(), null, true, now(), 1,  'Bloat', porcupine_pufferfish_uuid, tank_uuid);
insert into fish_entity (id, created, created_by, modified, modified_by, alive, date_of_birth, born,  name, species_id, tank_id) values (uuid_generate_v4(), now(), null, now(), null, true, now(), 1,  'Deb', black_white_damselfish_uuid, tank_uuid);
insert into fish_entity (id, created, created_by, modified, modified_by, alive, date_of_birth, born,  name, species_id, tank_id) values (uuid_generate_v4(), now(), null, now(), null, true, now(), 1,  'Peach', seastar_uuid, tank_uuid);
insert into fish_entity (id, created, created_by, modified, modified_by, alive, date_of_birth, born,  name, species_id, tank_id) values (uuid_generate_v4(), now(), null, now(), null, true, now(), 1,  'Gurgle', royal_gramma_uuid, tank_uuid);
END$$;

Executing Migration Scripts

Now that we have created the migration scripts, we can run the migration process. In this example we will apply the migration process to the local development database.

  1. Open a terminal, navigate to serverside/sql directory and run:
    ./gradlew update

Here, update is a Gradle task provided by the Liquibase plugin; it runs Liquibase against the database using the changelog configuration we created earlier.

Once done, your database will contain the following new tables:

  • databasechangelog
  • databasechangeloglock

By default, Liquibase creates two tables: databasechangelog and databasechangeloglock. These tables record the database changes applied by the migration scripts we defined previously. For more details, refer to the Liquibase update documentation.

If done correctly, you should also have some fish, tanks and species in your application now.
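
To verify the result, you can run a couple of quick queries against the development database (the table and column names below follow the seed script above):

-- Check the seeded data
SELECT name FROM tank_entity;
SELECT name FROM fish_entity;

-- Check what Liquibase has recorded about the migration
SELECT filename, dateexecuted FROM databasechangelog;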

Run ./gradlew update again and you may notice that there are no database changes. Liquibase keeps a record of all migrations in the databasechangelog table, preventing the same script from being run more than once. Additionally, every applied script is verified via a checksum comparison, so any previously applied script that has since been modified will be detected and cause an error.
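
If you want to see what an update would do before running it, the Liquibase Gradle plugin also maps other Liquibase commands to Gradle tasks; for example (consult the plugin documentation for the full list):

./gradlew status      # list the changesets that have not yet been applied
./gradlew updateSQL   # print the SQL an update would run, without applying it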

Invalid SQL

Liquibase will throw errors when trying to perform migrations with invalid SQL.

  1. Create a new file called 20200114_02_invalid_sql.sql and insert the following code:
    ALTER TABLE scales
    ADD COLUMN description varchar(255);
  2. Open a terminal and run ./gradlew update
  3. You will encounter a Liquibase MigrationFailedException because the relation "scales" does not exist.

When Liquibase encounters a failed migration, it does not record the failure in databasechangelog. Therefore, the script can still be executed once the invalid SQL statement has been fixed.
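
For example, assuming the column was meant to be added to the fish_entity table used by the seed script above (an assumption for illustration), the corrected script would simply target an existing table:

-- Corrected version of 20200114_02_invalid_sql.sql
ALTER TABLE fish_entity
ADD COLUMN description varchar(255);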

Testing Migration Scripts

To make sure your migration scripts run correctly, we recommend testing them on a development database before putting them into the update_scripts directory. A useful tool for testing migration scripts is pgAdmin.

Creating a Connection to the Development Database

  1. After installing and starting pgAdmin, create a connection to your development database by clicking Add New Server on the home page.

  2. Click the Connection tab, type in your testing database information, and click Save.

  3. If successful, you will see the menu has been updated with your database.

Running SQL Statements

  1. Right-click your development database and click Query Tool...

  2. A new SQL editor will open. Copy and paste in the SQL statements you want to run. In this example we will use the following:

     ALTER TABLE fish
     ADD COLUMN testing_column varchar(255);
  3. Click the Execute (lightning bolt) button in the query tool toolbar to execute the statements.

  4. If the script was executed without error, and the changes you want have been made to the database, you can now use the statements in your migration scripts.

  5. To avoid leaving test changes in your database, undo the change you just made by running:

     ALTER TABLE fish
     DROP COLUMN IF EXISTS testing_column;

If you run the Liquibase update against the same database after testing in pgAdmin, the update may fail because the changes already exist in that database. One solution is to back up your database before testing the migrations and restore it afterwards. Alternatively, manually revert the changes before running the update.

What's next?

Database migration in SpringBot is still in development. It is expected that this functionality will be moved into the bot, automating database migration during the release process.

For the time being, you can use the sql project to handle data migration for your project.