This is the day-10 article of the Java Advent Calendar 2016 on Qiita. The day-9 article was @deaf_tadashi's "About JMH, a microbenchmark tool". Day 11 will be by @leak4mk0.
When you want to load test data (mainly images or PDFs) into BLOB columns with DBUnit, there is a mechanism for referencing the files from an XML data set. This time, however, I would like to introduce how to load test data into a BLOB column from Excel.
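For reference, here is a minimal sketch of loading an Excel workbook as a DBUnit data set via org.dbunit.dataset.excel.XlsDataSet. The file name testdata.xls and the omitted connection setup are assumptions for illustration, not part of the original article:

```java
import java.io.File;

import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.excel.XlsDataSet;

public class ExcelLoadSketch {
    public static void main(String[] args) throws Exception {
        // Each sheet in the workbook becomes one table; the first row of
        // each sheet holds the column names. (testdata.xls is hypothetical.)
        IDataSet dataset = new XlsDataSet(new File("testdata.xls"));

        // 'con' would be an org.dbunit.database.IDatabaseConnection wrapped
        // around your JDBC connection; obtaining it is omitted here.
        // DatabaseOperation.CLEAN_INSERT.execute(con, dataset);
        System.out.println(dataset.getTableNames().length + " table(s) loaded");
    }
}
```

With this, the same CLEAN_INSERT flow shown later in the article can be fed from Excel instead of XML.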
If you have never used DBUnit, there is plenty of Japanese-language information on how to use it, so please look there first.
I stumbled over importing BLOB column data, so this is a write-up of what I learned. It is probably a fairly rare situation, but I would be glad if it lets you know that there is a way through.
OS: Red Hat 6.5
Java: 1.8.20
DB: Oracle 12c
DBUnit: 2.5.3
OJDBC: 8
Background: our unit tests ran in a non-idempotent environment, where the test data lived in the DB and could easily be changed by hand. Whenever a test failed, someone had to hunt down the cause and fix it. So we decided to introduce CI and, for test code that depends on data in the DB, migrate that data so it can be imported by DBUnit, allowing the data to be restored from the IDE at any time.
By doing this, the tests can no longer break except through actual changes to the software, and no extra man-hours are wasted. We pushed the proposal through on that point, together with the quality benefits of doing CI.
The migration itself was done with the brute-force approach of copying from SQL Developer: the BLOB column data was BASE64-encoded in a SELECT statement via a stored function, then copied and pasted.
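The same encoding step can be reproduced on the Java side. A minimal sketch, assuming the bytes have already been fetched from the BLOB column via JDBC (the sample bytes here are just the "%PDF" file header, for illustration):

```java
import java.util.Base64;

public class BlobToBase64 {
    public static void main(String[] args) {
        // Example bytes standing in for BLOB column data fetched via JDBC
        byte[] blobBytes = {0x25, 0x50, 0x44, 0x46}; // the "%PDF" header

        // Encode to a BASE64 string that can be pasted into an Excel or
        // XML data set as plain text
        String encoded = Base64.getEncoder().encodeToString(blobBytes);
        System.out.println(encoded); // prints "JVBERg=="
    }
}
```

Decoding on the way back in is the mirror image: Base64.getDecoder().decode(encoded) returns the original byte array.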
When DatabaseOperation#execute of DBUnit runs, a ClassCastException occurs at the point where the BLOB column is written, and the process fails.
The complaint on the OJDBC side is simply: "Writing data to a BLOB column requires a Blob object, and the data you are pouring in is a byte array, which cannot be cast to it."
Reading aside, when I write a BLOB myself I never use a Blob object; I always end up writing it as bytes. So the DataType of the relevant org.dbunit.dataset.Column needs to be changed to BINARY, and I wrote the following code.
Java code (excerpt):

IDataSet databaseDataSet = con.createDataSet();
ITableMetaData tableMetaData = databaseDataSet.getTableMetaData(tableName);

// OraclePreparedStatement tries to set a Blob object into a BLOB column,
// but we want to set byte data, so we convert the column type to BINARY.
// Caveat: the columns held by ITableMetaData and the columns held by the
// data set's own ITableMetaData are managed separately, so both must be
// updated to BINARY.
ITable table = dataset.getTable(tableName);
Column[] cols = tableMetaData.getColumns();
for (int i = 0; i < cols.length; i++) {
    // Replace any BLOB column with a BINARY one
    if (DataType.BLOB.equals(cols[i].getDataType())) {
        Column col = new Column(cols[i].getColumnName(), DataType.BINARY);
        // The elements of an array we hold a reference to can be rewritten
        // freely, so we exploit that to achieve our goal.
        cols[i] = col;
        int idx = table.getTableMetaData().getColumnIndex(col.getColumnName());
        table.getTableMetaData().getColumns()[idx] = col;
    }
}

// Load the data
DatabaseOperation.CLEAN_INSERT.execute(con, dataset);
As noted in the comments, this works by forcibly rewriting objects through a held reference.
With this, BLOB column data can be written to the DB through DBUnit. If you find yourself in the same situation, please give it a try.
In our case the BLOB data was a Java instance created on the system side (not a PDF or image file, but the in-progress state of a business process), which is why I took this approach. Please don't ask why things are set up that way.
Also, since people often use Excel simply as if it were a database, it is easier for developers to pick up if the test data can be defined in Excel.
In summary: when using DBUnit to load data into a BLOB column via JDBC, set the DataType of the columns field of both ITableMetaData and IDataSet#ITableMetaData to BINARY and you are good to go. I hope this little technique helps someone.