Tuesday 20 November 2012

my-datagenerator

What is my-datagenerator

my-datagenerator is a Java Swing GUI application (operating system independent) that I wrote to provide a GUI for Apache DBUnit and a MySQL data generator. Its features are:
- Clean all the database tables
- Full database export as a DBUnit dataset
- Partial database export, selecting only the tables of your interest
- Import into the database a DBUnit dataset compliant with the database structure
- Obtain information about the tables and learn the right filling order that respects the FK constraints
- Random data generation to fill MySQL 5.x databases

Download & Installation
Source code and download at:
https://github.com/fulvio999/my-datagenerator

To run the application, the only requirement is a working installation of the JRE (Java Runtime Environment).
No installation is required. Download the jar file with the "-bin" suffix (it may be necessary to set the execution bit on *nix systems) and from the command line execute "java -jar MyDatagenerator-<version>-bin.jar",
or right-click on the jar file and choose the "Open with Java" option or a similar one (depending on your operating system).

How to use it

The first operation is to set the target database by providing the connection parameters in the "Target Database" panel. It is also possible to enable an internal logging system, based on the Apache Log4j framework, to trace the activity of the tool. This feature is useful for debugging purposes.
The generated log file is named "mydatagenerator.log" and is overwritten at each execution.
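As an example (all the values below are hypothetical, adapt them to your environment), a MySQL instance running on the same machine is typically reached with:

host: localhost
port: 3306
database: mydb
user: user
password: password

which corresponds to the JDBC URL "jdbc:mysql://localhost:3306/mydb".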

The descriptions of the single features follow.

Database configuration panel



Database Cleaning panel



Allows cleaning the contents of ALL the tables. In MySQL, tables with engine type "ARCHIVE" do not support deletion operations (DELETE, TRUNCATE).
The user can choose whether or not to delete these tables by marking the checkbox in the cleaning panel.
NOTE: the deletion of the ARCHIVE tables is performed by temporarily changing the engine type to "InnoDB" and restoring it to "ARCHIVE" after the deletion.
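The idea behind the engine swap can be sketched with plain JDBC (a minimal sketch, not the tool's actual code; the table name and connection parameters are hypothetical):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CleanArchiveTable {
    public static void main(String[] args) throws Exception {
        Connection jdbc = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
        Statement stmt = jdbc.createStatement();
        // ARCHIVE tables refuse DELETE/TRUNCATE, so switch the engine first...
        stmt.executeUpdate("ALTER TABLE my_archive_table ENGINE = InnoDB");
        // ...empty the table...
        stmt.executeUpdate("DELETE FROM my_archive_table");
        // ...and restore the original engine type
        stmt.executeUpdate("ALTER TABLE my_archive_table ENGINE = ARCHIVE");
        stmt.close();
        jdbc.close();
    }
}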


Export panel



In this panel, the user can choose whether to export the full database content as an Apache DBUnit dataset or only the content of some tables.
In the second case, it is necessary to mark the table(s) to export (after loading them) and, optionally, customize the default export query by double-clicking on it and pressing Return to confirm the changes.
Note: there is no validation of the edited query, be careful!

The default export policy extracts the whole table content. It is possible to extract only a subset by adding a "where" condition to filter the table data.
Example 1: extract only the rows with a "myField" value greater than 5:

Default query: select * from myTable
Custom query: select * from myTable where myField > 5

Example 2: extract only the field "myField" :

Default query: select * from myTable
Custom query: select myField from myTable
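This kind of partial export maps naturally onto the DBUnit QueryDataSet API; here is a minimal sketch of the idea (connection parameters and file names are hypothetical, this is not the tool's actual code):

import java.io.FileOutputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.database.QueryDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSet;

public class PartialExport {
    public static void main(String[] args) throws Exception {
        Connection jdbc = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
        IDatabaseConnection connection = new DatabaseConnection(jdbc);

        // Only the rows matching the custom query end up in the dataset
        QueryDataSet dataSet = new QueryDataSet(connection);
        dataSet.addTable("myTable", "select * from myTable where myField > 5");

        FlatXmlDataSet.write(dataSet, new FileOutputStream("partial-export.xml"));
    }
}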


Import Panel



Here the user can import a valid Apache DBUnit dataset into the target database. Obviously, the input dataset must respect the structure and the data types of the destination database.
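For reference, importing a flat XML dataset with the DBUnit API looks roughly like this (a minimal sketch with hypothetical connection parameters and file name, not the tool's actual code):

import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.dbunit.operation.DatabaseOperation;

public class DatasetImport {
    public static void main(String[] args) throws Exception {
        Connection jdbc = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
        IDatabaseConnection connection = new DatabaseConnection(jdbc);

        IDataSet dataSet = new FlatXmlDataSetBuilder()
                .build(new FileInputStream("dataset.xml"));

        // INSERT adds the rows on top of the existing content;
        // CLEAN_INSERT would empty each involved table first
        DatabaseOperation.INSERT.execute(connection, dataSet);
    }
}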


Table Utils



Displays the tables contained in the database, ordered by FK (i.e. the filling order). That list can be exported to a text file. Selecting a table displays information about it, such as its field types and the engine type.
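DBUnit itself can compute this FK-aware ordering; here is a minimal sketch of how such a list can be obtained (hypothetical connection parameters, not necessarily how the tool does it):

import java.sql.Connection;
import java.sql.DriverManager;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.DatabaseSequenceFilter;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.filter.ITableFilter;

public class FkOrder {
    public static void main(String[] args) throws Exception {
        Connection jdbc = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
        IDatabaseConnection connection = new DatabaseConnection(jdbc);

        // DatabaseSequenceFilter sorts the tables so that every table
        // appears after the tables it references via foreign keys
        ITableFilter filter = new DatabaseSequenceFilter(connection);
        for (String table : filter.getTableNames(connection.createDataSet())) {
            System.out.println(table);
        }
    }
}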


Data Generator



Fills the target database (previously set in the Target Database panel) with randomly generated values. This feature is useful for new web applications, when there is no data yet for the front-end part of the application.
The data is generated with the help of the Apache Commons library. For performance reasons, and to prevent possible errors caused by duplicated data, the maximum number of rows that can be inserted in each table is 300.
Maybe this limit can be increased without unwanted effects, but I haven't made tests to find a new limit.
It is possible to customize this limit by making a custom build of the application.
To change the limit, download the source code, edit the configuration file named "MyDatagenerator-conf.properties" and build with the command "mvn clean package" (you need Java and Apache Maven to do this). The quick solution could be to edit the file directly inside the jar.
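The random values can be produced, for example, with the Apache Commons Lang helpers; here is a minimal sketch of the idea (table, columns and connection parameters are hypothetical; the real tool reads the table metadata to decide the value types):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import org.apache.commons.lang.RandomStringUtils;
import org.apache.commons.lang.math.RandomUtils;

public class RandomFill {
    public static void main(String[] args) throws Exception {
        Connection jdbc = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
        PreparedStatement insert = jdbc.prepareStatement(
                "insert into myTable (name, amount) values (?, ?)");

        int maxRows = 300; // the default per-table limit used by the tool
        for (int i = 0; i < maxRows; i++) {
            insert.setString(1, RandomStringUtils.randomAlphanumeric(10));
            insert.setInt(2, RandomUtils.nextInt(1000));
            insert.executeUpdate();
        }
        insert.close();
        jdbc.close();
    }
}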
During my tests I successfully filled a database on localhost with 85 tables (with an average of 5/6 fields per table) in [771123] ms.
Important: partitioned tables are supported by the data generator, but it can fail if there is no partition for a generated value (i.e. the value can't be placed in any partition).
This issue could be fixed by adding the "ignore" condition to the insert query, but that can cause problems during the database filling, like tables with different amounts of rows or (in the worst case) empty tables if none of the generated values can be placed in any partition.

Before starting with the database filling it is better to empty the database, but this is not mandatory. The only unwanted effects are that the tables can contain different amounts of rows, plus possible errors caused by duplicated data, especially if the database already contains a lot of data.

Tip: restart the application before trying a new database filling, to reset the generators, caching, etc.

The SQL statements executed during the filling process can be logged to a specified output file.

Thursday 13 September 2012

Disconnect from SCM



This is a Nautilus script for the GNOME desktop environment (available only on *nix-like operating systems) to disconnect a source code project from an SCM (Source Code Management) system like Subversion, CVS, Git or Mercurial.

Do you have a source code project containing the metadata folders of the versioning system used by the author, and you want to clean the source tree?
This is the tool for you.



More info & Download

Audio File Info

A Nautilus script to obtain information with SOXI

This is a simple Nautilus script for the GNOME desktop environment (i.e. available only on *nix-like systems) to get information about an audio file using the SOXI command line tool.



More info and download at:


Thursday 28 June 2012

JXMLUTIL


What is JXMLUTIL
With JXMLUTIL I want to offer a simple cross-platform GUI for Apache Xalan, Apache Xerces and json-lib.

It is useful to perform XML transformations and XML validation: for example XSLT transformations, XPath expression execution, various XML conversions (XML - JSON; Java Properties - XML), XML validation with XSD.
Simple to use and cross-platform (i.e. it runs on Windows, *nix, Mac OS).


Software requirements
A working Java JRE installation.
Download link


version 1.1: better XML tree viewer
version 1.2: NEW feature: XML file batch validation, i.e. validate several XML files against the same XSD



Execution
Right-click on the downloaded jar and choose "open with JRE" (or similar, depending on your OS).
Otherwise open a terminal and execute: java -jar JxmlUtil.jar
(Note: the "java" command must be in the PATH). On *nix operating systems it may be necessary to set the execution bit.
How to use it
Choose an input file to be processed (XML, JSON... depending on the conversion to perform) and, optionally, choose a file where the output result will be written. Then press the "Starting processing" button.
The input file can be edited using an internal editor and then reloaded in the dedicated text area with the refresh button; the user can then retry the conversion.

Screenshots
XPATH
Apply an XPath expression to an input XML file and show the result
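With the standard JAXP API (which Xalan and Xerces plug into) an XPath evaluation looks roughly like this (a minimal sketch; the file name and expression are hypothetical):

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class XPathExample {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse("input.xml");
        XPath xpath = XPathFactory.newInstance().newXPath();

        // Select every <book> element under the root <catalog>
        NodeList result = (NodeList) xpath.evaluate(
                "/catalog/book", doc, XPathConstants.NODESET);
        System.out.println("Matched nodes: " + result.getLength());
    }
}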
 
XSLT
Apply an XSLT transformation (with a provided XSL file) to an input XML file
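The equivalent JAXP code (with Xalan as the underlying processor) is short; a minimal sketch with hypothetical file names:

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XsltExample {
    public static void main(String[] args) throws Exception {
        // Compile the stylesheet and apply it to the input document
        Transformer transformer = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("stylesheet.xsl"));
        transformer.transform(new StreamSource("input.xml"),
                              new StreamResult("output.xml"));
    }
}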

XSD
Check if a provided XML file conforms to the supplied XSD; with the batch validation (available from version 1.2) several XML files can be validated against the same XSD file.
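The batch case reuses one compiled schema for every file; a minimal sketch of the idea with hypothetical file names (not the application's actual code):

import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import org.xml.sax.SAXException;

public class XsdBatchValidation {
    public static void main(String[] args) throws Exception {
        SchemaFactory factory =
                SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new File("schema.xsd"));
        Validator validator = schema.newValidator();

        // Validate several documents against the same compiled schema
        for (String xml : new String[] {"a.xml", "b.xml", "c.xml"}) {
            try {
                validator.validate(new StreamSource(new File(xml)));
                System.out.println(xml + " is valid");
            } catch (SAXException e) {
                System.out.println(xml + " is NOT valid: " + e.getMessage());
            }
        }
    }
}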




TREE
Show an XML file as a tree

JSON
Convert an XML file to JSON and vice versa
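With json-lib the two directions look like this (a minimal sketch; the sample documents are hypothetical):

import net.sf.json.JSON;
import net.sf.json.JSONObject;
import net.sf.json.xml.XMLSerializer;

public class XmlJsonExample {
    public static void main(String[] args) {
        XMLSerializer serializer = new XMLSerializer();

        // XML -> JSON
        JSON json = serializer.read("<person><name>Bob</name></person>");
        System.out.println(json.toString(2));

        // JSON -> XML
        JSONObject obj = JSONObject.fromObject("{\"name\":\"Bob\"}");
        System.out.println(serializer.write(obj));
    }
}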

PROPERTIES
Convert a java.util.Properties file to XML and vice versa
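This conversion is covered entirely by the standard JDK API; a minimal sketch with hypothetical file names:

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.util.Properties;

public class PropertiesXmlExample {
    public static void main(String[] args) throws Exception {
        // Properties -> XML
        Properties props = new Properties();
        props.load(new FileInputStream("config.properties"));
        props.storeToXML(new FileOutputStream("config.xml"), "exported");

        // XML -> Properties
        Properties fromXml = new Properties();
        fromXml.loadFromXML(new FileInputStream("config.xml"));
        fromXml.store(new FileOutputStream("roundtrip.properties"), null);
    }
}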

Technical details
 
Apache Xalan (http://xml.apache.org/xalan-j/) is used as the XSLT processor;
Apache Xerces (http://xerces.apache.org/xerces2-j/) is used for XML validation and parsing.

For JSON to XML conversion (and vice versa) the json-lib library (http://json-lib.sourceforge.net/) is used.
The Java Properties to XML conversion (and vice versa) is made with the standard JDK API.