I’ve been part of the Ruby community for a while, and while I’ve had a lot of experience with other frameworks, I’ve drunk a lot of the Kool-Aid when it comes to the pleasure of working with Ruby on Rails. This holds especially true when I think about the ease of starting new services. I understand, though, that Ruby is a comparatively small community. While I love writing Ruby code, I realize other languages like Java are still the dominant and preferred choices for most enterprises. So I set out to see whether I could make my experience in Java more Ruby-like, and this was my experience. First off, I wanted to experience starting a brand-new web service.
The Beginning
I started by getting a project off the ground, and the recommended approach was to use Spring Initializr. There are some easily selected items out of the box, so I started with some recommended configurations with the expectation that I’d have to hunt for more libraries to include:
Lombok, which saves a lot of boilerplate code
Spring Data JPA (this will include QueryDSL, which will be useful later)
PostgreSQL Driver
From there I wanted to simply get a controller up to verify that the application was accepting requests. Here is a sample of a basic alive-and-ready check controller, similar to what Kubernetes would expect in a deployment:
package com.demo.fulfillment.controllers;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class Availability {

    @GetMapping("/live")
    public String alive() {
        return "Fulfillment Application online";
    }

    @GetMapping("/ready")
    public String well() {
        return "Fulfillment Application ready for orders";
    }
}
Great! Now let’s run the program:
java -jar target/fulfillment-0.0.1-SNAPSHOT.jar
...
...
2020-12-06 16:02:34.119 DEBUG 39858 --- [ main] o.s.b.d.LoggingFailureAnalysisReporter : Application failed to start due to an exception
org.springframework.boot.autoconfigure.jdbc.DataSourceProperties$DataSourceBeanCreationException: Failed to determine a suitable driver class
at org.springframework.boot.autoconfigure.jdbc.DataSourceProperties.determineDriverClassName(DataSourceProperties.java:235) ~[spring-boot-autoconfigure-2.4.0.jar!/:2.4.0]
...
...
2020-12-06 16:02:34.137 ERROR 39858 --- [ main] o.s.b.d.LoggingFailureAnalysisReporter :
***************************
APPLICATION FAILED TO START
***************************
Description:
Failed to configure a DataSource: 'url' attribute is not specified and no embedded datasource could be configured.
Reason: Failed to determine a suitable driver class
Well, I guess that means we’ve hit the first snag: we need to configure the DataSource.
Configuring the DataSource
In Spring Boot, these types of configurations are typically done with special XML files or application.properties files. While this is the standard approach in Java, it made my Rubyist brain unhappy, so I sought a way to configure the DataSource programmatically, similar to how ActiveRecord in Ruby on Rails does it with just the DATABASE_URL environment variable (like Heroku Data). Fortunately, I stumbled upon a few public blog posts about how to accomplish this task.
HikariCP, Hibernate, and Spring Boot
Since Hikari Connection Pooler and Hibernate ORM come standard in Spring Boot, I chose to try to work inside the Spring Boot framework as much as possible.
Example of a programmatic DataSource Configuration
This includes setup ahead for using Spring Data JPA; more on that in a bit.
Then comes configuring connections to a PostgreSQL database. In this example, I want to use PostgreSQL 12, though as of this writing 13 is available. The XML and properties configuration files hide a lot of the setup complexity, but since I want to be able to tune my application dynamically based on deployments and environments, this seems like a good way to understand the moving pieces under the hood.
package com.demo.fulfillment.config;

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import org.hibernate.cfg.Environment;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.dao.annotation.PersistenceExceptionTranslationPostProcessor;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

import javax.sql.DataSource;
import java.util.Properties;

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("com.demo.fulfillment.repositories")
public class GeneralDataConfig {

    final Properties additionalProperties() {
        final Properties hibernateProperties = new Properties();
        hibernateProperties.setProperty(Environment.HBM2DDL_AUTO, "update");
        hibernateProperties.setProperty(Environment.DIALECT, org.hibernate.dialect.PostgreSQLDialect.class.getCanonicalName());
        hibernateProperties.setProperty(Environment.SHOW_SQL, "true");
        hibernateProperties.setProperty(Environment.FORMAT_SQL, "true");
        return hibernateProperties;
    }

    @Bean
    public DataSource dataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:postgresql://localhost:5432/postgres");
        config.setUsername("bart");
        config.setPassword("51mp50n");
        config.setDriverClassName(org.postgresql.Driver.class.getCanonicalName());
        // These are HikariCP pool settings, so they go on HikariConfig itself;
        // addDataSourceProperty would pass them through to the JDBC driver instead.
        config.setConnectionTimeout(20000);
        config.setMinimumIdle(10);
        config.setMaximumPoolSize(20);
        config.setIdleTimeout(300000);
        return new HikariDataSource(config);
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource) {
        LocalContainerEntityManagerFactoryBean entityManagerFactory = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactory.setDataSource(dataSource);
        entityManagerFactory.setPackagesToScan("com.demo.fulfillment.models");
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        vendorAdapter.setDatabasePlatform(org.hibernate.dialect.PostgreSQLDialect.class.getCanonicalName());
        vendorAdapter.setGenerateDdl(true);
        entityManagerFactory.setJpaVendorAdapter(vendorAdapter);
        entityManagerFactory.setJpaProperties(additionalProperties());
        return entityManagerFactory;
    }

    @Bean
    public PlatformTransactionManager transactionManager(LocalContainerEntityManagerFactoryBean entityManagerFactory) {
        final JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(entityManagerFactory.getObject());
        return transactionManager;
    }

    @Bean
    public PersistenceExceptionTranslationPostProcessor exceptionTranslation() {
        return new PersistenceExceptionTranslationPostProcessor();
    }
}
One thing to note: setPackagesToScan has to be set or the programmatic configuration will not register with Spring Boot, even if there are no Java classes in that package yet. Now I can try this again with my local Postgres running on Docker. A quick sample docker-compose.yml below makes it easy to poke at this locally, though I am omitting the other pieces.
version: "3.8"
services:
  db:
    image: postgres:12
    ports:
      - "5432:5432"
    expose:
      - "5432"
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=bart
      - POSTGRES_PASSWORD=51mp50n
Then I run the application once my database is up:
docker-compose up -d
java -jar target/fulfillment-0.0.1-SNAPSHOT.jar
The nice piece of this setup is that if you have a database GUI tool like DBeaver, you can actually see the connections, DDL, and data changes as you make changes to your application.
Things I noticed that I thought were odd
Upon trying to start the application again, I am successful! Though as I poked around in the programmatic configuration, I noticed an odd strikethrough in my IntelliJ IDE on “PostgreSQLDialect”:
vendorAdapter.setDatabasePlatform(org.hibernate.dialect.PostgreSQLDialect.class.getCanonicalName());
It turns out this is expected behavior: the generically named dialect defaults to the PostgreSQL95Dialect class. This behavior is deprecated, so the Spring Boot maintainers advise specifying the version of your SQL dialect explicitly for the Hibernate ORM, as documented in Spring Boot issue 18641. Further digging found that PostgreSQL10Dialect is the most recent implementation at the time of this writing (December 2020). A few things I wondered here: is there a reasonable chance that not all functionality will be available, is the naming just misleading, and if features are missing, how could I detect what is missing? For now, I will proceed with the current working solution, as it does function for PostgreSQL 12.
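For anyone who does want to silence the deprecation, pinning the dialect explicitly is a small change in the two places it appears in the configuration above. A sketch, assuming your Hibernate version ships PostgreSQL10Dialect (check your Hibernate release notes first):

```java
// Explicit dialect instead of the deprecated generic PostgreSQLDialect name
vendorAdapter.setDatabasePlatform(org.hibernate.dialect.PostgreSQL10Dialect.class.getCanonicalName());
hibernateProperties.setProperty(Environment.DIALECT, org.hibernate.dialect.PostgreSQL10Dialect.class.getCanonicalName());
```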
Additionally, however, this is still a very localized setup with hardcoded connection information. I need to avoid checking those secrets in. There are a couple of ways to do this:
I could use a System.getenv call and parse the output
I could use a .properties file (do not want to do this)
I could use a Java programmatic properties component (ding ding)
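For comparison, the first option is workable too: a Heroku-style DATABASE_URL can be split into a JDBC URL plus credentials with a few lines of plain Java. This is a minimal sketch; the DatabaseUrlParser class and its fallback value are my own illustration, not part of the project:

```java
import java.net.URI;

public class DatabaseUrlParser {

    // Split a Heroku-style URL (postgres://user:pass@host:port/db) into
    // { jdbcUrl, username, password } for HikariConfig.
    public static String[] parse(String databaseUrl) {
        URI uri = URI.create(databaseUrl);
        String[] userInfo = uri.getUserInfo().split(":", 2);
        String jdbcUrl = "jdbc:postgresql://" + uri.getHost() + ":" + uri.getPort() + uri.getPath();
        return new String[] { jdbcUrl, userInfo[0], userInfo[1] };
    }

    public static void main(String[] args) {
        // Fall back to a local example value when the variable is unset.
        String url = System.getenv().getOrDefault(
                "DATABASE_URL", "postgres://bart:51mp50n@localhost:5432/postgres");
        String[] parts = parse(url);
        System.out.println(parts[0] + " user=" + parts[1]);
    }
}
```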
package com.demo.fulfillment.config;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class FulfillmentEnvProperties {

    @Value("${DATABASE_CONN_URL}")
    public String databaseConnectionString;

    @Value("${POSTGRES_USER}")
    public String databaseUser;

    @Value("${POSTGRES_PW}")
    public String databasePassword;
}
I opted for the Java programmatic properties, as I can be assured that environment variables are injected in a uniform way and read just as uniformly in the application. A few other changes have to accompany this in the original Java config.
public class GeneralDataConfig {

    @Autowired
    FulfillmentEnvProperties envProperties;
To access the information, I inject the variables into the programmatic configuration in the dataSource Bean:
config.setJdbcUrl(envProperties.databaseConnectionString);
config.setUsername(envProperties.databaseUser);
config.setPassword(envProperties.databasePassword);
This now means, however, that I need the environment variables to start my application. I could inject them each time in the java command. BUT! My docker-compose.yml can run my application and set this information automatically. I will have a Dockerfile to define the build of the application image, and use docker-compose to kick off that build, run containers for the application and the database, and network them together while exposing them to my actual computer. While I’m here, I might as well set up remote debugging for my IDE as well (the default remote debugging port is 5005 in IntelliJ).
# Dockerfile
FROM adoptopenjdk/openjdk14:latest
WORKDIR /app
EXPOSE 8080
ADD ./target/fulfillment-*.jar fulfillment.jar
ENTRYPOINT ["java"]
CMD ["-jar", "fulfillment.jar"]
# docker-compose.yml
version: '3.8'
services:
  fulfillment:
    image: fulfillment-demo
    build:
      context: ./
      dockerfile: Dockerfile
    restart: 'no'
    environment:
      POSTGRES_USER: bart
      POSTGRES_PW: 51mp50n
      DATABASE_CONN_URL: jdbc:postgresql://db:5432/postgres?user=bart&password=51mp50n
    ports:
      - '8080:8080'
      - '5005:5005'
    expose:
      - '8080'
      - '5005'
    command:
      - '-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005'
      - '-jar'
      - 'fulfillment.jar'
  db:
    image: postgres:12
    restart: 'always'
    ports:
      - '5432:5432'
    expose:
      - '5432'
    environment:
      POSTGRES_DB: postgres
      POSTGRES_USER: bart
      POSTGRES_PASSWORD: 51mp50n
Now, I can run a one-line command to watch what’s happening on both containers at the same time.
docker-compose up -d && docker-compose logs -f
curl localhost:8080/live
Fulfillment Application online
Schema, POJO, DAO, or something else?
To verify the basics, that Hibernate is now in charge of my database schema, queries, and data inserts, I need a way to insert and query data. At this point, my Ruby on Rails brain is begging for help. The earlier examples I read all mention making template Data Access Objects (DAOs) or serializing plain old Java objects (POJOs), which I don’t want to have to re-code in every Java project I make. I miss my ActiveRecord, so I started my search for an easier path. I discovered I can leverage some of the niceties of the Spring Data JPA and Lombok libraries to achieve a fairly close approximation.
Entities and Schemas: Strategy and Explicitness
package com.demo.fulfillment.models;

import com.fasterxml.jackson.databind.PropertyNamingStrategy;
import com.fasterxml.jackson.databind.annotation.JsonNaming;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.hibernate.annotations.CreationTimestamp;
import org.hibernate.annotations.UpdateTimestamp;

import javax.persistence.*;
import java.util.Date;
import java.util.UUID;

@Entity
@Access(AccessType.FIELD)
@Table(name = "customers")
@Builder
@Data
@NoArgsConstructor
@AllArgsConstructor
@JsonNaming(PropertyNamingStrategy.SnakeCaseStrategy.class)
public class Customer {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    @Column(name = "id", unique = true, nullable = false)
    private UUID id;

    @Column(name = "first_name", nullable = false)
    private String firstName;

    @Column(name = "last_name", nullable = false)
    private String lastName;

    @Column(name = "email", nullable = false)
    private String email;

    @Column(name = "address_one", nullable = false)
    private String addressOne;

    @Column(name = "address_two")
    private String addressTwo;

    @Column(name = "city", nullable = false)
    private String city;

    @Column(name = "state", nullable = false)
    private String state;

    @Column(name = "zip_code", nullable = false)
    private Integer zipCode;

    @Column(name = "created_at")
    @CreationTimestamp
    private Date createdAt;

    @Column(name = "updated_at")
    @UpdateTimestamp
    private Date updatedAt;
}
I start off by making a simple Customer entity model to represent the “customers” table in the database. The entity models used in Hibernate and JPA function as both database schema-as-code AND migrations here, so this is where some attentiveness is necessary.
Calling out the “@Access” use for anyone not used to seeing it: I’ve elected for the explicit FIELD strategy, which maps the member variables themselves. These entities can instead use getters and setters (the PROPERTY enum value) to define the columns and names. While you can use that access strategy, it’s generally discouraged for maintainability and readability, though you may see older Java code that does this. Be careful when dealing with Access annotations that are PROPERTY based.
The customer model uses snake_case for both its JSON serialization (via the Jackson naming strategy) and its column names (via the explicit @Column annotations), fitting with other standard ORMs like ActiveRecord, but I can still represent the Java objects and names the way most people who use Java would expect.
Finally, I can have Hibernate generate the IDs for me to guarantee new records are keyed, and have the created and updated timestamps populate records for me as well, for my Rails brain accustomed to the :timestamps Rails migration symbol.
Now that we have our schema set up, Java expects separation of concerns here, so we make a separate class to perform query execution. This is where Spring Data JPA shines for me. I will be using the JpaRepository interface to have Spring build the basic query functionality for me.
// CustomerRepository.java
package com.demo.fulfillment.repositories;

import com.demo.fulfillment.models.Customer;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;

import java.util.UUID;

@Repository
public interface CustomerRepository extends JpaRepository<Customer, UUID> {
}
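One nicety worth knowing here: Spring Data JPA can also derive custom queries from method names alone, so this interface can grow lookups without any hand-written SQL. A sketch of what that could look like (findByEmail and findByLastName are my own illustrations, not part of this project):

```java
// Added inside CustomerRepository: Spring Data parses the method name and
// generates the corresponding "where email = ?" query at runtime.
Optional<Customer> findByEmail(String email);
List<Customer> findByLastName(String lastName);
```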
The near-total absence of code in the JpaRepository is entirely intentional. I do not want to write a bunch of boilerplate error handling for lookups by ID and select statements, just as I don’t have to in ActiveRecord. Now I can add the controller endpoints to round out the testing, and finally I am ready to verify an end-to-end test.
Realizing End-to-End Functionality
package com.demo.fulfillment.controllers;

import com.demo.fulfillment.models.Customer;
import com.demo.fulfillment.repositories.CustomerRepository;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.*;

import java.util.List;
import java.util.Random;
import java.util.UUID;

@RestController
// Note: "value" (not "name") is what maps the request path.
@RequestMapping(value = "/customers", produces = MediaType.APPLICATION_JSON_VALUE)
public class Customers {

    @Autowired
    CustomerRepository repository;

    @GetMapping
    @ResponseBody List<Customer> getCustomers() {
        return repository.findAll();
    }

    @GetMapping("/{id}")
    @ResponseBody Customer getCustomer(@PathVariable UUID id) {
        return repository.findById(id).orElse(null);
    }

    @PostMapping(consumes = MediaType.ALL_VALUE)
    @ResponseBody Customer createCustomer() {
        // Randomize the email so repeated POSTs create distinct mock customers.
        String email = "chedards" +
                (new Random().nextInt()) +
                "@someplace.com";
        return repository.save(Customer.builder()
                .firstName("Charles")
                .lastName("Edwards")
                .email(email)
                .addressOne("123 Smith Lane")
                .city("New York")
                .state("NY")
                .zipCode(10011)
                .build());
    }
}
I now have the Customers controller able to generate mock data, so here I go with my API requests. I can see the DDL changes as well as the requests in the docker-compose logs now.
... Customers table is created
...
...
fulfillment_1 | 2020-12-07 04:13:38.252 DEBUG 1 --- [ main] o.h.loader.entity.plan.EntityLoader : Static select for entity com.demo.fulfillment.models.Customer [NONE]: select customer0_.id as id1_0_0_, customer0_.address_one as address_2_0_0_, customer0_.address_two as address_3_0_0_, customer0_.city as city4_0_0_, customer0_.created_at as created_5_0_0_, customer0_.email as email6_0_0_, customer0_.first_name as first_na7_0_0_, customer0_.last_name as last_nam8_0_0_, customer0_.state as state9_0_0_, customer0_.updated_at as updated10_0_0_, customer0_.zip_code as zip_cod11_0_0_ from customers customer0_ where customer0_.id=?
fulfillment_1 | 2020-12-07 04:13:38.376 DEBUG 1 --- [ main] org.hibernate.SQL :
fulfillment_1 |
fulfillment_1 | create table customers (
fulfillment_1 | id uuid not null,
fulfillment_1 | address_one varchar(255) not null,
fulfillment_1 | address_two varchar(255),
fulfillment_1 | city varchar(255) not null,
fulfillment_1 | created_at date,
fulfillment_1 | email varchar(255) not null,
fulfillment_1 | first_name varchar(255) not null,
fulfillment_1 | last_name varchar(255) not null,
fulfillment_1 | state varchar(255) not null,
fulfillment_1 | updated_at date,
fulfillment_1 | zip_code int4 not null,
fulfillment_1 | primary key (id)
fulfillment_1 | )
...
...
curl -X POST localhost:8080/customers
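If everything is wired up, the POST comes back with a JSON body shaped roughly like the following. This is an illustration rather than captured output: the id and the random email digits differ on every run, and Jackson renders the java.util.Date timestamps as epoch milliseconds by default unless configured otherwise.

```json
{
  "id": "0b9f3c1e-...",
  "first_name": "Charles",
  "last_name": "Edwards",
  "email": "chedards1971843970@someplace.com",
  "address_one": "123 Smith Lane",
  "address_two": null,
  "city": "New York",
  "state": "NY",
  "zip_code": 10011,
  "created_at": 1607314418000,
  "updated_at": 1607314418000
}
```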
I hope you found some of my musings insightful. So far, it’s not as easy as I’d like it to be in the initial setup, but I will persist. Stay tuned for Part 2, when I tackle more advanced features like multi-table queries, extra PostgreSQL extensions to support JSON columns, and more.