Datastores : SchemaTool

The meta-data files and annotations define how a class maps onto a persistent store (e.g. a database). Some datastores (RDBMS, HBase, Excel, ODF, and MongoDB, for example) have a schema defining the structure of the tables/columns where the classes are persisted. A schema can be

  • Already existing, and so the user maps their classes to their existing tables
  • Manually created by the user.
  • Automatically created at runtime by use of the datanucleus.autoCreateSchema property.
  • Created before running the application, using DataNucleus SchemaTool.

We will describe here the use of DataNucleus SchemaTool. It currently works with RDBMS, HBase, Excel, OOXML, ODF, and MongoDB datastores, and is very simple to operate.

DataNucleus SchemaTool has the following modes of operation:

  • create - create all database tables required for the classes defined by the input data.
  • delete - delete all database tables required for the classes defined by the input data.
  • validate - validate all database tables required for the classes defined by the input data.
  • dbinfo - provide detailed information about the database, its limits and datatype support. Only for RDBMS currently.
  • schemainfo - provide detailed information about the database schema. Only for RDBMS currently.

In addition, for RDBMS, the create/delete modes can be used with "-ddlFile {filename}", in which case the schema is not created/deleted; instead the DDL for the tables/constraints is written to the specified file.

For the create, delete and validate modes DataNucleus SchemaTool accepts either of the following types of input.

  • A set of MetaData and class files. The MetaData files define the persistence of the classes they contain. The class files are provided when the classes have annotations.
  • The name of a persistence-unit . The persistence-unit name defines all classes, metadata files, and jars that make up that unit. Consequently, running DataNucleus SchemaTool with a persistence unit name will create the schema for all classes that are part of that unit.
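For the persistence-unit form of input, a minimal persistence.xml could look like the following sketch (the unit name, class names, and connection values here are illustrative assumptions):

```xml
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="1.0">
    <persistence-unit name="MyUnit">
        <class>mydomain.A</class>
        <class>mydomain.B</class>
        <properties>
            <property name="datanucleus.ConnectionDriverName" value="com.mysql.jdbc.Driver"/>
            <property name="datanucleus.ConnectionURL" value="jdbc:mysql://localhost/mydb"/>
            <property name="datanucleus.ConnectionUserName" value="myuser"/>
            <property name="datanucleus.ConnectionPassword" value="mypasswd"/>
        </properties>
    </persistence-unit>
</persistence>
```

Running SchemaTool with "-pu MyUnit" would then manage the schema for the classes of that unit.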

Here we describe the different ways to invoke DataNucleus SchemaTool.

Manual Usage

If you wish to call DataNucleus SchemaTool manually, it can be called as follows

java [-cp classpath] [system_props] org.datanucleus.store.schema.SchemaTool [modes] [options] [props] 
                    [mapping-files] [class-files]
    where system_props (when specified) should include
        -Ddatanucleus.Mapping=orm_mapping_name (optional)
        -Dlog4j.configuration=file:{log4j_config_file} (optional)
    where modes can be
        -create : Create the tables specified by the mapping-files/class-files
        -delete : Delete the tables specified by the mapping-files/class-files
        -validate : Validate the tables specified by the mapping-files/class-files
        -dbinfo : Detailed information about the database
        -schemainfo : Detailed information about the database schema
    where options can be
        -ddlFile {filename} : RDBMS - only for use with "create"/"delete" mode to dump the DDL to the 
                              specified file
        -completeDdl : RDBMS - when using "ddlFile" in "create" mode to get all DDL output and not 
                              just missing tables/constraints
        -includeAutoStart : whether to include any auto-start mechanism in SchemaTool usage
        -api : The API that is being used (default is JDO, but can be set to JPA)
        -pu {persistence-unit-name} : Name of the persistence unit to manage the schema for
        -v : verbose output
    where props can be
        -props {propsfilename} : PMF properties to use in place of the "system_props"

All classes, MetaData files, and "persistence.xml" files must be present in the CLASSPATH. In terms of the schema to use, you either specify the "props" file (recommended), or you specify the System properties defining the database connection, or the properties in the "persistence-unit". You should specify only one of the [modes] above. Let's take a specific example and see the output from SchemaTool. We have the following files in our application

src/java/...                 (source files and MetaData files)
target/classes/...           (enhanced classes, and MetaData files)
lib/log4j.jar                (optional, for Log4J logging)
lib/datanucleus-rdbms.jar, lib/datanucleus-hbase.jar,  etc
lib/mysql-connector-java.jar (JDBC driver for our database, if using RDBMS)
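The "props" file is just a set of persistence properties. A minimal sketch for the MySQL case (the filename and values here are assumptions; the property names are the standard DataNucleus connection properties):

```
# datanucleus.properties (hypothetical filename) - datastore connection for SchemaTool
datanucleus.ConnectionDriverName=com.mysql.jdbc.Driver
datanucleus.ConnectionURL=jdbc:mysql://localhost/mydb
datanucleus.ConnectionUserName=myuser
datanucleus.ConnectionPassword=mypasswd
```

You would pass such a file to SchemaTool with "-props {propsfilename}".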

So we want to create the schema for our persistent classes. Let's invoke DataNucleus SchemaTool to do this, from the top level of our project. In this example we're using Linux (change the CLASSPATH definition to suit for Windows)

java -cp target/classes:lib/log4j.jar:lib/jdo-api.jar:lib/datanucleus-core.jar:lib/datanucleus-api-jdo.jar:lib/datanucleus-{datastore}.jar:
                lib/mysql-connector-java.jar org.datanucleus.store.schema.SchemaTool -create

DataNucleus SchemaTool (version 3.0.0) : Creation of the schema

DataNucleus SchemaTool : Classpath
>>  /home/andy/work/DataNucleus/samples/packofcards/target/classes
>>  /home/andy/work/DataNucleus/samples/packofcards/lib/log4j.jar
>>  /home/andy/work/DataNucleus/samples/packofcards/lib/datanucleus-core.jar
>>  /home/andy/work/DataNucleus/samples/packofcards/lib/datanucleus-api-jdo.jar
>>  /home/andy/work/DataNucleus/samples/packofcards/lib/datanucleus-rdbms.jar
>>  /home/andy/work/DataNucleus/samples/packofcards/lib/jdo-api.jar
>>  /home/andy/work/DataNucleus/samples/packofcards/lib/mysql-connector-java.jar

DataNucleus SchemaTool : Input Files
>> /home/andy/work/DataNucleus/samples/packofcards/target/classes/org/datanucleus/examples/inverse/package.jdo
>> /home/andy/work/DataNucleus/samples/packofcards/target/classes/org/datanucleus/examples/normal/package.jdo

DataNucleus SchemaTool : Taking JDO properties from file ""

SchemaTool completed successfully

So as you see, DataNucleus SchemaTool prints out our input, the properties used, and finally a success message. If an error occurs, a message will be printed to the screen, and more information will be written to the log.


If you are using Maven2 to build your system, you will need the DataNucleus Maven2 plugin. This provides 5 goals representing the different modes of DataNucleus SchemaTool. You can use the goals datanucleus:schema-create, datanucleus:schema-delete, and datanucleus:schema-validate depending on whether you want to create, delete, or validate the database tables. To use the DataNucleus Maven2 plugin you may need to set properties for the plugin (in your pom.xml). For example

Property            Default               Description
metadataDirectory   ${}                   Directory to use for schema generation files (classes/mappings)
metadataIncludes    **/*.jdo, **/*.class  Fileset to include for schema generation
metadataExcludes                          Fileset to exclude for schema generation
persistenceUnitName                       Name of the persistence-unit to generate the schema for (defines the
                                          classes and the properties defining the datastore)
props                                     Name of a properties file for the datastore (PMF)
log4jConfiguration                        Config file location for Log4J (if using it)
jdkLogConfiguration                       Config file location for JDK1.4 logging (if using it)
api                 JDO                   API in use (JDO, JPA)
verbose             false                 Verbose output?
fork                true                  Whether to fork the SchemaTool process (from M2 plugin v3.0.1). Note
                                          that if you don't fork the process, DataNucleus will likely struggle
                                          to determine class names from the input filenames, so you need to use
                                          a persistence.xml file defining the class names directly.
ddlFile                                   Name of an output file to dump any DDL to (for RDBMS)
completeDdl         false                 Whether to generate DDL including things that already exist (for RDBMS)
includeAutoStart    false                 Whether to include auto-start mechanisms in SchemaTool usage

So to give an example, I add the following to my pom.xml
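A minimal plugin configuration could look like the following sketch (the plugin version and file locations here are assumptions to adapt to your project):

```xml
<plugin>
    <groupId>org.datanucleus</groupId>
    <artifactId>maven-datanucleus-plugin</artifactId>
    <version>3.0.0</version>
    <configuration>
        <props>${basedir}/datanucleus.properties</props>
        <log4jConfiguration>${basedir}/log4j.properties</log4jConfiguration>
        <verbose>true</verbose>
    </configuration>
</plugin>
```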


So with these properties, when I run SchemaTool it uses properties from the file at the root of the Maven project. I am also specifying a Log4J configuration file defining the logging for the SchemaTool process. I can then invoke any of the Maven2 goals

mvn datanucleus:schema-create              Create the Schema
mvn datanucleus:schema-delete              Delete the schema
mvn datanucleus:schema-validate            Validate the Schema
mvn datanucleus:schema-info                Output info for the Schema
mvn datanucleus:schema-dbinfo              Output info for the datastore


An Ant task is provided for using DataNucleus SchemaTool. It has classname org.datanucleus.store.schema.SchemaToolTask, and accepts the following parameters

Parameter          Description                                                       Values
mode               Mode of operation                                                 create, delete, validate, dbinfo, schemainfo
verbose            Whether to give verbose output                                    true, false
props              The filename to use for PMF properties
ddlFile            The filename where SchemaTool should output the DDL (for RDBMS)
completeDdl        Whether to output complete DDL (instead of just missing tables);  true, false
                   only used with ddlFile
includeAutoStart   Whether to include any auto-start mechanism in SchemaTool usage   true, false
api                API that we are using in our use of DataNucleus                   JDO, JPA
persistenceUnit    Name of the persistence-unit that we should manage the schema for
                   (defines the classes and the properties defining the datastore)

The SchemaTool task extends the Apache Ant Java task, thus all parameters available to the Java task are also available to the SchemaTool task.

In addition to the parameters that the Ant task accepts, you will need to set up your CLASSPATH to include the classes and MetaData files, and to define the following system properties via the sysproperty parameter (not required when specifying the persistence props via the properties file, or when providing the persistence-unit)

Parameter                          Description                                      Required
datanucleus.ConnectionDriverName   Name of the JDBC driver class                    Mandatory
datanucleus.ConnectionURL          URL for the database                             Mandatory
datanucleus.ConnectionUserName     User name for the database                       Mandatory
datanucleus.ConnectionPassword     Password for the database                        Mandatory
datanucleus.Mapping                ORM mapping name                                 Optional
log4j.configuration                Log4J configuration file, for SchemaTool's log   Optional

So you could define something like the following, setting up the parameters schematool.classpath, datanucleus.ConnectionDriverName, datanucleus.ConnectionURL, datanucleus.ConnectionUserName, and datanucleus.ConnectionPassword to suit your situation.

You define the jdo files (specifying the tables to create) using a fileset.

<taskdef name="schematool" classname="org.datanucleus.store.schema.SchemaToolTask"/>

<schematool failonerror="true" verbose="true" mode="create">
    <classpath>
        <path refid="schematool.classpath"/>
    </classpath>
    <fileset dir="${classes.dir}">
        <include name="**/*.jdo"/>
    </fileset>
    <sysproperty key="datanucleus.ConnectionDriverName" value="${datanucleus.ConnectionDriverName}"/>
    <sysproperty key="datanucleus.ConnectionURL" value="${datanucleus.ConnectionURL}"/>
    <sysproperty key="datanucleus.ConnectionUserName" value="${datanucleus.ConnectionUserName}"/>
    <sysproperty key="datanucleus.ConnectionPassword" value="${datanucleus.ConnectionPassword}"/>
    <sysproperty key="datanucleus.Mapping" value="${datanucleus.Mapping}"/>
</schematool>

Jython is a powerful scripting language that allows you to automate tasks and ease development. If you are using Jython and want to use the DataNucleus tools, all you need to do is place the DataNucleus jars, the persistent classes, MetaData files, JDBC driver jars and dependencies into the CLASSPATH.

The Jython script may be written in several forms, all achieving the same goal.

Here we have a template for invoking the main method of SchemaTool. The main method acts like the command line, parsing its arguments and invoking the appropriate methods.

from org.datanucleus.store.schema import SchemaTool
tool = SchemaTool()
tool.main(args)   # args is a list of strings: [options] [mapping-files] [class-files]

The below is a concrete example.

from org.datanucleus.store.schema import SchemaTool
tool = SchemaTool()
tool.main(['-create', 'target/classes/org/datanucleus/examples/normal/package.jdo'])

For other operations of SchemaTool consult the DataNucleus javadocs. For questions about Jython, please refer to the Jython website.

SchemaTool API

DataNucleus SchemaTool can also be called programmatically from an application. You need to get hold of the StoreManager and cast it to SchemaAwareStoreManager. The API is shown below.


public interface SchemaAwareStoreManager
{
     public int createSchema(Set<String> classNames, Properties props);

     public int deleteSchema(Set<String> classNames, Properties props);

     public int validateSchema(Set<String> classNames, Properties props);
}

So, for example, to create the schema for classes mydomain.A and mydomain.B you would do something like this

// pmfProps holds the persistence properties defining the datastore connection
JDOPersistenceManagerFactory pmf = (JDOPersistenceManagerFactory)JDOHelper.getPersistenceManagerFactory(pmfProps);
NucleusContext ctx = pmf.getNucleusContext();
try
{
    Set<String> classNames = new HashSet<String>(Arrays.asList("mydomain.A", "mydomain.B"));
    Properties props = new Properties();
    // Set any properties for schema generation
    ((SchemaAwareStoreManager)ctx.getStoreManager()).createSchema(classNames, props);
}
catch (Exception e)
{
    // Handle any errors in schema generation
}