BarleyDB

BarleyDB is a Java ORM library which makes it as easy as possible to load and save application domain models to and from the database, while still offering an extensive feature set.

Query model generation

Query DSL classes are auto-generated from a schema specification and allow the programmer to easily build queries to load their application data.

  QUser quser = new QUser();
  quser.joinToAddress();
  QDepartment qdepartment = quser.joinToDepartment();
  qdepartment.joinToCountry();

  List<User> users = ctx.performQuery( quser ).getList();

Domain model generation

Application domain model classes are also auto-generated from the schema specification.
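
As a rough sketch of what this looks like in practice, the generated classes expose bean-style getters and setters for each node in the specification (the accessor names below are assumptions that mirror the schema, in the style of the other examples in this document):

//load a user and work with the generated bean-style accessors
//(illustrative sketch; accessor names are assumed to mirror the schema specification)
User user = ctx.performQuery(new QUser()).getList().get(0);

//read and modify simple value nodes
System.out.println(user.getName());
user.setName("John");

//navigate a relation via the generated reference accessor
Department department = user.getDepartment();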

Domain model inheritance

Object-oriented inheritance is supported, where entities can extend other entities. The generated domain model classes extend each other accordingly, and abstract query DSL classes are also generated so that abstract types can be queried.

//load all people 
for (Person person: ctx.performQuery(new QPerson()).getList()) {
  if (person instanceof Employee) {
    Employee emp = (Employee)person;
    ...
  }
  else if (person instanceof Customer) {
    Customer cust = (Customer)person;
    ...
  }
}

Complex where clauses

The query DSL supports the usual logical and arithmetic operators, as well as subqueries; for example:

  QUser quser = new QUser();
  QDepartment qdepartment = quser.existsDepartment();
  quser.where( quser.name().equal("fred") )
       .andExists( qdepartment.where( qdepartment.name().like("computing") ) );

Batching multiple queries

Multiple queries can be batched together; where the database vendor supports it, they are combined into a single composite query returning multiple result-sets.

QueryBatcher batch = new QueryBatcher();
batch.addQuery(new QUser());
batch.addQuery(new QDepartment());
batch.addQuery(new QCountry());

ctx.performQueries( batch );

Fetch plan definition whenever you want

Fetch plans for lazy loading can be registered at any time by specifying query models with the desired joins.

//create and register the fetch plan to be used the next time a department must be fetched.
QDepartment qdepartment = new QDepartment();
qdepartment.joinToCountry();
ctx.registerQuery( qdepartment );

//call user.getDepartment() which will cause a fetch.
Department dep = user.getDepartment();
//the country was fetched along with the department, as per the fetch plan
Country country = dep.getCountry();

Persisting changes to the database

Persist requests are used to bundle together domain models to be persisted. Persist operations cascade down to owned relations as expected.

PersistRequest pr = new PersistRequest();
pr.save( userA, userB, userC );
pr.delete( userX, userY );

ctx.persist( pr );

Dependency analysis and batching of operations

During persistence, a dependency tree is used to identify the correct operation order to satisfy database constraints and to promote operation batching.

BatchExecuter executing insert batch for EntityType [ org.example.acl.model.AccessArea ] of size 1
BatchExecuter 1 rows were modified in total
BatchExecuter executing insert batch for EntityType [ org.example.etl.model.Template ] of size 1
BatchExecuter 1 rows were modified in total
BatchExecuter executing insert batch for EntityType [ org.example.etl.model.TemplateContent ] of size 2
BatchExecuter 2 rows were modified in total
BatchExecuter executing insert batch for EntityType [ org.example.etl.model.BusinessType ] of size 2
BatchExecuter 2 rows were modified in total
BatchExecuter executing insert batch for EntityType [ org.example.etl.model.TemplateBusinessType ] of size 2
BatchExecuter 2 rows were modified in total

DTO Models

DTO classes are also automatically generated, allowing programmers to work with simple DTO objects. DtoConverter utilities are provided to make it easy to convert between DTOs and entities.

public List<UserDto> loadUsers() {
  //create a ctx and perform the query
  EntityContext ctx = new EntityContext(env, namespace);
  List<User> users = ctx.performQuery(new QUser()).getList();

  //convert all entities in the ctx to dtos
  DtoConverter converter = new DtoConverter(ctx);
  converter.convertToDtos();

  //get the list of UserDtos which match the list of user entities from the query
  List<UserDto> userDtos = converter.getDtos(users);
  return userDtos;
}

The DTOs extend BaseDto and have both EntityState and EntityConstraints so that information is not lost when mapping to and from entities. 1:N relationships are managed with DtoList which keeps track of the fetched state of the 1:N relation.

//update account 100
AccountDto account = new AccountDto();
account.setId(100);
//set the fetched state to true - this indicates that the relation is considered fetched and already contains all of its data
account.getTransactions().setFetched(true);
account.getTransactions().add( tran1 );
account.getTransactions().add( tran2 );

service.save(account);

Auditing

Auditing is very straightforward as a change report is generated every time domain models are persisted. The report below shows the table name, column name, old value and new value.

audit AUDIT SS_XML_MAPPING                 ID                             null                           3                             
audit AUDIT SS_XML_MAPPING                 SYNTAX_MODEL_ID                null                           1                             
audit AUDIT SS_XML_MAPPING                 XPATH                          null                           /root3                        
audit AUDIT SS_XML_MAPPING                 TARGET_FIELD_NAME              null                           target3                       
audit AUDIT SS_XML_MAPPING                 ID                             null                           4                             
audit AUDIT SS_XML_MAPPING                 SYNTAX_MODEL_ID                null                           2                             
audit AUDIT SS_XML_MAPPING                 XPATH                          null                           sub1                          
audit AUDIT SS_XML_MAPPING                 TARGET_FIELD_NAME              null                           subtarget1                    
audit AUDIT SS_XML_MAPPING                 ID                             null                           5                             
audit AUDIT SS_XML_MAPPING                 SYNTAX_MODEL_ID                null                           2                             
audit AUDIT SS_XML_MAPPING                 XPATH                          null                           sub2                          
audit AUDIT SS_XML_MAPPING                 TARGET_FIELD_NAME              null                           subtarget2     

Access control

An access control check is performed on every insert, update or delete operation. A 3rd party access control library can easily be plugged into the framework.
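
The plug-in interface itself is not shown in this document; as a purely hypothetical sketch, an adapter bridging an external access control library might look like the following (AccessChecker, its check method and the predicate-based delegate are illustrative names and shapes, not the actual BarleyDB API):

import java.util.function.BiPredicate;

//hypothetical shape of a pluggable access check; all names here are illustrative only
public interface AccessChecker {
    //called before every insert, update or delete operation
    void check(Object entity, String operation) throws SecurityException;
}

//adapter delegating the decision to an external access control library,
//represented here as a simple predicate for brevity
class AclAdapter implements AccessChecker {
    private final BiPredicate<Object, String> acl;

    AclAdapter(BiPredicate<Object, String> acl) {
        this.acl = acl;
    }

    @Override
    public void check(Object entity, String operation) {
        if (!acl.test(entity, operation)) {
            throw new SecurityException(operation + " denied for " + entity);
        }
    }
}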

Relationship management

Whether a relationship represents ownership or a simple reference affects how data is persisted across it.
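
The distinction shows up directly in the schema specification (see the Easy domain schema specification section below): ownsMany declares an ownership relation whose members are persisted along with their owner, while the refersTo helpers (mandatoryRefersTo, optionallyRefersTo) only record a foreign key reference. A short sketch reusing those helpers (the Department spec below is illustrative, not taken from the project):

//ownership vs simple reference in a schema specification (illustrative)
@Entity("GEN_DEPARTMENT")
public static class Department {

    public static final NodeSpec id = longPrimaryKey();

    public static final NodeSpec name = name();

    //a simple reference: persisting a department does not cascade to the country
    public static final NodeSpec country = mandatoryRefersTo(Country.class);

    //an ownership relation: employees cascade as owned data when the department is persisted
    public static final NodeSpec employees = ownsMany(Employee.class, Employee.department);
}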

Freshness checking (optimistic locking)

BarleyDB will verify any optimistic locks defined on entities.
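
In the XML specification this corresponds to the optimisticLock attribute on a NodeSpec. In a Java specification a timestamp node can be marked as the optimistic lock; a hedged sketch follows (the optimisticLock() helper name is an assumption made by analogy with the other spec helpers):

//marking a timestamp column as the optimistic lock for an entity (illustrative;
//the optimisticLock() helper name is assumed, mirroring optimisticLock="true" in the XML spec)
@Entity("GEN_DOCUMENT")
public static class Document {

    public static final NodeSpec id = longPrimaryKey();

    public static final NodeSpec name = name();

    //verified on every update; a stale value causes the persist operation to fail
    public static final NodeSpec modifiedAt = optimisticLock();
}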

Transaction management

ctx.setAutocommit(false);
//...
ctx.commit();

Large data-set support / streaming

Domain models can be streamed from the database.

QUser quser = new QUser();
quser.joinToAddress();

try ( ObjectInputStream<User> in = ctx.streamObjectQuery( quser ); ) {
  User user;
  while( (user = in.read()) != null ) {
    ...
  }
}

A stream can also be opened on any 1:N relation on any domain model.

try ( ObjectInputStream<Address> in = user.streamAddresses(); ) {
  Address address;
  while( (address = in.read()) != null ) {
    ...
  }
}

Garbage collection of unreferenced entities

BarleyDB supports garbage collection, so that entities which are no longer referenced are removed from the context. This works very well in combination with large data-set streaming, as memory is reclaimed automatically as the program proceeds through the data stream.

3 tier architecture support

BarleyDB can be used on the client tier in a 3 tier architecture. A client can create a remote context to the application server to perform queries and persist domain models as normal.

Easy domain schema specification

Both Java and XML schema definitions are supported, though Java is preferred as the compiler can catch inconsistencies.

public class ApplicationSpec extends PlatformSpec {

    public ApplicationSpec() {
        super("com.mycomp.application");
    }

    @Enumeration(JdbcType.INT)
    public static class EmployeeType {
        public static final int ADMIN = 1;
        public static final int ENGINEER = 2;
        public static final int MANAGER = 3;
        public static final int HR = 4;
        public static final int CHIEF = 5;
    }

    @Enumeration(JdbcType.INT)
    public static class Language {
        public static final int ENGLISH = 1;
        public static final int FRENCH = 2;
        public static final int SPANISH = 3;
        public static final int GERMAN = 4;
        public static final int SWISS = 5;
    }

    @Entity("GEN_EMPLOYEE")
    public static class Employee {

        public static final NodeSpec id = longPrimaryKey();

        public static final NodeSpec employeeType = mandatoryEnum(EmployeeType.class);

        public static final NodeSpec name = name();

        public static final NodeSpec department = mandatoryRefersTo(Department.class);

        public static final NodeSpec countryOfOrigin = mandatoryRefersTo(Country.class);

        public static final NodeSpec motherLang = mandatoryEnum(Language.class);

    }

    @Entity("GEN_LOB")
    public static class LineOfBusiness {

        public static final NodeSpec id = longPrimaryKey();

        public static final NodeSpec name = name();

        public static final NodeSpec parent = optionallyRefersTo(LineOfBusiness.class, "PARENT_ID");

        public static final NodeSpec children = ownsMany(LineOfBusiness.class, LineOfBusiness.parent);

    }    
    ...

The XML schema specification looks like this:

        <Definitions namespace="org.scott.vvl.gen">
            <EnumSpecs/>
            <EntitySpecs>
                <EntitySpec className="org.scott.vvl.gen.model.Country" tableName="acn_country" abstract="false">
                    <queryClass>org.scott.vvl.gen.query.QCountry</queryClass>
                    <NodeSpecs>
                        <NodeSpec name="id" javaType="LONG" jdbcType="BIGINT" columnName="id" nullable="NOT_NULL" optimisticLock="false" pk="true">
                            <id>org.scott.vvl.gen.model.Country.id</id>
                        </NodeSpec>
                        <NodeSpec name="modifiedAt" javaType="UTIL_DATE" jdbcType="TIMESTAMP" columnName="modified_at" nullable="NOT_NULL" optimisticLock="false">
                            <id>org.scott.vvl.gen.model.Country.modifiedAt</id>
                        </NodeSpec>
                        <NodeSpec name="name" javaType="STRING" jdbcType="VARCHAR" columnName="name" nullable="NOT_NULL" length="50" optimisticLock="false">
                            <id>org.scott.vvl.gen.model.Country.name</id>
                        </NodeSpec>
                        <NodeSpec name="address" optimisticLock="false">
                            <id>org.scott.vvl.gen.model.Country.address</id>
                            <relation type="REFERS" entitySpec="org.scott.vvl.gen.model.Address" backReference="org.scott.vvl.gen.model.Address.country" joinType="LEFT_OUTER_JOIN"/>
                        </NodeSpec>
                    </NodeSpecs>
                    <Constraints>
                        <PrimaryKey name="pk_acn_country" nodes="org.scott.vvl.gen.model.Country.id"/>
                    </Constraints>
                </EntitySpec>
      ...

Modular definition of Schemas and importing of schemas

Each schema definition has its own namespace and can import other schemas, enabling highly modular applications.
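
As a rough sketch, a second specification with its own namespace can refer to entities defined in another spec by referencing their spec classes (the exact import mechanism is not shown in this document, so the example below is illustrative only):

//a second spec with its own namespace referring to an entity from ApplicationSpec
//(illustrative; the precise schema import mechanism is not shown in this document)
public class HrSpec extends PlatformSpec {

    public HrSpec() {
        super("com.mycomp.hr");
    }

    @Entity("HR_CONTRACT")
    public static class Contract {

        public static final NodeSpec id = longPrimaryKey();

        //refers to an entity defined in the ApplicationSpec namespace
        public static final NodeSpec employee = mandatoryRefersTo(ApplicationSpec.Employee.class);

    }
}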

Database script generation

Create scripts, drop scripts and clean scripts can be automatically generated.
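
The script generation API itself is not shown in this document; purely as a hypothetical sketch, it might be driven from the schema specification along these lines (SchemaScriptGenerator and its method names are illustrative only, not the actual BarleyDB API):

//hypothetical sketch only: SchemaScriptGenerator and its methods are illustrative names
SchemaScriptGenerator gen = new SchemaScriptGenerator(ApplicationSpec.class);

Files.writeString(Path.of("sql/create.sql"), gen.generateCreateScript());
Files.writeString(Path.of("sql/drop.sql"), gen.generateDropScript());
//the clean script would delete all data while keeping the tables
Files.writeString(Path.of("sql/clean.sql"), gen.generateCleanScript());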

Auto-generation of artifacts from database meta-data

All artifacts (schema specification, query classes, model classes) can be generated from the database meta-data. The example below generates all artifacts from a PostgreSQL database.

EnvironmentDef liveDef = EnvironmentDef.build()
        .withDataSource()
            .withDriver("org.postgresql.xa.PGXADataSource")
            .withUser("test_user")
            .withPassword("password")
            .withUrl("jdbc:postgresql://172.18.0.3:5432/test_db")
            .end();

FromDatabaseSchemaToSpecification fdb = new FromDatabaseSchemaToSpecification("scott.acnplayautogen.autogenspec");
fdb.removePrefix("vvl", "acn");

//process the database meta-data and generate the specifications
SpecRegistry registry = fdb.generateSpecification(liveDef.getDataSource());

Generator.generate(registry, "src/main/java", "src/main/resources", false);

Easy bootstrapping

To get up and running, simply specify the datasource and the schema definitions as shown below. As can be seen, the schema can also optionally be dropped and recreated.

Environment env = EnvironmentDef.build()
        .withDataSource()
            .withDriver("org.postgresql.xa.PGXADataSource")
            .withUser("test_user")
            .withPassword("password")
            .withUrl("jdbc:postgresql://localhost:5432/test_db")
            .end()
         .withSpecs(ApplicationSpec.class)
         .withDroppingSchema(true)
         .withSchemaCreation(true)
         .create();

ApplicationCtx ctx = new ApplicationCtx( env );
ctx.performQuery(new QUser());

General Overview

BarleyDB is a Java ORM library which takes a different approach. Some of its more interesting characteristics are described below.

Dynamic and Static Nature

A key aspect of BarleyDB is the fact that compilation is a completely optional step. It is entirely possible and valid to import an XML specification describing the complete database schema and then use the meta-model to query and persist data. This is a very unusual and powerful feature which no other Java ORM solution offers, and it allows BarleyDB to be used in interesting ways.

It is of course also possible to generate Java classes which then allow static, compile-safe interaction with the database.

Benefits to UI development of dynamic nature

The schema / meta-model could be used to create generic CRUD UI screens in the UI technology of your choice, allowing users to view and edit the database data. This would allow products to ship very early, with custom UI screens created later on an as-needed basis.

Those custom UI screens can then use the Java classes generated from the schema to query and persist data, ensuring that any custom UI is completely compile-safe.

Benefits to ETL systems of dynamic nature

Having worked extensively with ETL tools, I can see BarleyDB being used to dynamically define a database schema. The schema could then be loaded, and the ETL tool could allow a message to be mapped onto the meta-model provided by BarleyDB. Once the data is mapped onto the meta-model, BarleyDB can simply be asked to persist the whole dataset to the database.

Loading data from an older schema version on the fly

As no compilation is required to load and save data, it is possible to import the BarleyDB XML definitions of an older schema into a running system on the fly and then use them to pull data out of the older database.

Sophisticated version management (future)

BarleyDB can generate XML schema definition files, and each schema definition can be reduced to a SHA-1 hash. If migration logic were introduced, it would be possible to define how to migrate data from one schema definition to another, allowing automatic forward porting and backporting of data.

Such a system would allow connecting to a database with an older schema version, loading its data, upgrading it to match the current schema and then inserting it into the current database. If backporting were supported, the reverse could also be accomplished.

Data Structure

BarleyDB has its own simple data model for holding database data, built from entities and their nodes.

Class Generation

As programmers usually want their own classes to program against, BarleyDB can generate the required classes, which are simply proxies onto the underlying entity data structure.

BarleyDB also generates a domain specific query DSL which can be used to query for data. A simple example is as follows:

  //build a simple query
  QUser quser  = new QUser();
  quser.where( quser.name().equal("John") );

  //execute the query and process the results.
  for (User user: ctx.performQuery( quser ).getList()) {
     System.out.println(user.getName() + " - " + user.getAge());
  }

A more detailed look...

The following sections take a closer look at BarleyDB's feature set.

Querying

A more complex query example is as follows:

  //find users with name 'John' who have a primary address with postcode 'KW14' or a secondary address with
  //postcode 'OSA'
  QUser quser = new QUser();
  QAddress primAddr = quser.existsPrimaryAddress(); //sub-query for primary address
  QAddress secAddr = quser.existsSecondaryAddress(); //sub-query for secondary address

  //join to the user's department and the department's country so the data is pulled
  //in as part of the same query.
  quser.joinToDepartment().joinToCountry();

  quser.where( quser.name().equal("John") )
       .andExists( primAddr.where( primAddr.postCode().like("KW14") ) )
       .orExists ( secAddr.where( secAddr.postCode().like("OSA") ) );

  //execute the query and process the results.
  for (User user: ctx.performQuery( quser ).getList()) {
     System.out.println(user.getName() + " - " + user.getDepartment().getCountry().getName());
  }

The querying feature set includes eager fetching via joins, subqueries and complex where clauses, batching of multiple queries, lazy-loading fetch plans and streaming of large result sets, as described in the earlier sections.

Persisting

An example of persisting data is as follows:

//create a new user
User user = ctx.newModel(User.class);
user.setName("John");

//create a new department
Department dept = ctx.newModel(Department.class);
dept.setName("Computer Science");

//assign the department to the user.
user.setDepartment( dept );

//Save the user. The user has a FK reference to the department so the department is saved too.
PersistRequest req = new PersistRequest();
req.save( user );
ctx.persist( req );

The persistence feature set includes cascading saves to owned relations, dependency analysis and operation batching, optimistic lock verification, access control checks and audit reports, as described in the earlier sections.

Client / Server

BarleyDB supports sending queries and data over the wire, allowing a client in a 3 tier architecture to create a remote context to the application server and to query and persist domain models as normal.

Entity Context

The entity context functions as the executor of queries and persist requests. It also holds all of the entities; in this respect it is similar to the JPA EntityManager, though it supports some extra features which the EntityManager does not.
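
A short recap of the context's role, using only calls shown elsewhere in this document:

//the entity context creates models, executes queries and persist requests,
//and holds all of the resulting entities
EntityContext ctx = new EntityContext(env, namespace);

User user = ctx.newModel(User.class);
user.setName("John");

PersistRequest req = new PersistRequest();
req.save( user );
ctx.persist( req );

//entities loaded by queries are also held by the same context
List<User> users = ctx.performQuery(new QUser()).getList();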

Specification

BarleyDB allows the schema to be specified in an XML file or via Java classes. Java classes have the advantage that compile-time safety ensures foreign key references between tables remain consistent.

Getting Started

The best way to see how BarleyDB works is to look at the test cases.