[Java] CSV output processing using super-csv

1. Purpose

I want to output the contents of the schedule table to CSV.

Pressing the result DL button on the Top screen outputs the schedule table shown there as a CSV file.

2. Preparation

Add the super-csv-annotation dependency to build.gradle.

build.gradle


plugins {
	id 'org.springframework.boot' version '2.3.3.RELEASE'
	id 'io.spring.dependency-management' version '1.0.10.RELEASE'
	id 'java'
}

group = 'com.example'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = '11'

configurations {
	compileOnly {
		extendsFrom annotationProcessor
	}
}

repositories {
	mavenCentral()
}

dependencies {
	implementation 'org.springframework.boot:spring-boot-starter-thymeleaf'
	implementation 'org.springframework.boot:spring-boot-starter-web'
	implementation 'org.mybatis.spring.boot:mybatis-spring-boot-starter:2.1.3'
	compileOnly 'org.projectlombok:lombok'
	developmentOnly 'org.springframework.boot:spring-boot-devtools'
	runtimeOnly 'mysql:mysql-connector-java'
	annotationProcessor 'org.springframework.boot:spring-boot-configuration-processor'
	annotationProcessor 'org.projectlombok:lombok'
	testImplementation('org.springframework.boot:spring-boot-starter-test') {
		exclude group: 'org.junit.vintage', module: 'junit-vintage-engine'
	}
	// https://mvnrepository.com/artifact/javax.validation/validation-api
	implementation 'javax.validation:validation-api:2.0.1.Final'
	// https://mvnrepository.com/artifact/org.hibernate.validator/hibernate-validator
	runtimeOnly 'org.hibernate.validator:hibernate-validator:6.0.17.Final'
	// https://mvnrepository.com/artifact/org.glassfish/javax.el
	runtimeOnly 'org.glassfish:javax.el:3.0.1-b11'
	// https://mvnrepository.com/artifact/org.webjars/fullcalendar
	implementation group: 'org.webjars.bower', name: 'fullcalendar', version: '3.5.1'
	// https://mvnrepository.com/artifact/org.webjars.bower/moment
	implementation group: 'org.webjars.bower', name: 'moment', version: '2.19.1'
	// https://mvnrepository.com/artifact/org.webjars/jquery
	implementation group: 'org.webjars', name: 'jquery', version: '2.0.3'
	// https://mvnrepository.com/artifact/com.github.mygreen/super-csv-annotation
	implementation group: 'com.github.mygreen', name: 'super-csv-annotation', version: '2.2'
}

test {
	useJUnitPlatform()
}

3. Controller

Set the header information on the HttpServletResponse and call the CSV output method of the service from the controller. Since file-write exceptions may be raised in the service, the handler declares throws IOException.

TopController.java


...Omission

	// Accept CSV output requests
	@RequestMapping(value = "/top/csv", method = RequestMethod.GET)
	public String csvDownload(HttpServletResponse response) throws IOException {
		String header = String.format("attachment; filename=\"%s\";", UriUtils.encode("result.csv", StandardCharsets.UTF_8.name()));
		response.setHeader(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_OCTET_STREAM_VALUE);
		response.setHeader(HttpHeaders.CONTENT_DISPOSITION, header);
		topService.csvDownload(response);
		return "/top";
	}
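
For reference, the omitted part of TopController is assumed to look roughly like the sketch below; the package name, class-level annotation, and the injected TopService field are inferred from the snippet above and from the service class, and are not taken from the original post.

package com.example.alhproject.controller;

import java.io.IOException;
import java.nio.charset.StandardCharsets;

import javax.servlet.http.HttpServletResponse;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.util.UriUtils;

import com.example.alhproject.service.TopService;

@Controller
public class TopController {

	@Autowired
	private TopService topService;

	// The csvDownload handler shown above goes here.
}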

4. Service

Create an org.supercsv.io.CsvMapWriter instance from an OutputStreamWriter, passing the Excel-style preference to the constructor. Write the header with writeHeader, then format each schedule bean fetched from the DB and write it with writeComment.

TopService.java



package com.example.alhproject.service;

import java.io.IOException;
import java.io.OutputStreamWriter;
import java.nio.charset.Charset;
import java.util.List;

import javax.servlet.http.HttpServletResponse;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.supercsv.io.CsvMapWriter;
import org.supercsv.prefs.CsvPreference;

import com.example.alhproject.entity.Schedule;
import com.example.alhproject.mapper.ScheduleMapper;

@Service
public class TopService {
	@Autowired
	private ScheduleMapper scheduleMapper;

	private static final String OUTPUT_SCHEDULE_FORMAT = "%s,%s,%s,%s,%s,%s";
	private static final String SJIS = "SJIS";
	private static final String TITLE = "title";
	private static final String CONTEXT = "context";
	private static final String USER_ID = "user_id";
	private static final String CREATED_DATE = "created_date";
	private static final String SCHEDULE_START_TIME = "schedule_start_time";
	private static final String SCHEDULE_END_TIME = "schedule_end_time";

	// Fetch the contents of the schedule table
	public List<Schedule> getAllSchedule() {
		return scheduleMapper.selectAll();
	}

	// CSV output processing
	public void csvDownload(HttpServletResponse response) throws IOException {
		try (OutputStreamWriter osw = new OutputStreamWriter(response.getOutputStream(), Charset.forName(SJIS));
				CsvMapWriter wr = new CsvMapWriter(osw, CsvPreference.EXCEL_NORTH_EUROPE_PREFERENCE)) {
			wr.writeHeader(String.format(OUTPUT_SCHEDULE_FORMAT,
					TITLE,
					CONTEXT,
					USER_ID,
					CREATED_DATE,
					SCHEDULE_START_TIME,
					SCHEDULE_END_TIME
			));

			getAllSchedule().forEach(dbsc -> {
				String scheduleResult = String.format(OUTPUT_SCHEDULE_FORMAT,
						dbsc.getTitle(),
						dbsc.getContext(),
						dbsc.getUserId().toString(),
						dbsc.getCreatedDate().toString(),
						dbsc.getScheduleStartTime().toString(),
						dbsc.getScheduleEndTime().toString());
				try {
					wr.writeComment(scheduleResult);
				} catch (IOException e) {
					e.printStackTrace();
				}
			});
		}
	}
}
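
The service above relies on a Schedule entity and a MyBatis ScheduleMapper that are not shown in the post. A minimal sketch of what they might look like follows; the field types, table name, and query are assumptions inferred from the getters used in TopService.

package com.example.alhproject.entity;

import java.time.LocalDateTime;

import lombok.Data;

// Field names follow the getters used in TopService; the types are assumptions.
@Data
public class Schedule {
	private String title;
	private String context;
	private Integer userId;
	private LocalDateTime createdDate;
	private LocalDateTime scheduleStartTime;
	private LocalDateTime scheduleEndTime;
}

package com.example.alhproject.mapper;

import java.util.List;

import org.apache.ibatis.annotations.Mapper;
import org.apache.ibatis.annotations.Select;

import com.example.alhproject.entity.Schedule;

@Mapper
public interface ScheduleMapper {
	// Annotation-based query used here for brevity; the original mapping (likely XML) is not shown.
	@Select("SELECT * FROM schedule")
	List<Schedule> selectAll();
}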

(Reference) LazyCsvAnnotationBeanWriter looks easy to use when the bean's column labels exactly match the CSV header names: https://mygreen.github.io/super-csv-annotation/sphinx/labelledcolumn.html

Sample.java


    // When writing all records at once
    public void sampleWriteAll() {
...
        LazyCsvAnnotationBeanWriter<SampleCsv> csvWriter = new LazyCsvAnnotationBeanWriter<>(
                SampleCsv.class,
                Files.newBufferedWriter(new File("sample.csv").toPath(), Charset.forName("Windows-31j")),
                CsvPreference.STANDARD_PREFERENCE);
...
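
For completeness, a bean for the lazy writer could look roughly like the sketch below. The class, fields, and labels are illustrative assumptions (SampleCsv is only referenced, not defined, in the snippet above); according to the linked page, the column position can be resolved from the header label when the lazy writer is used.

import com.github.mygreen.supercsv.annotation.CsvBean;
import com.github.mygreen.supercsv.annotation.CsvColumn;

// Illustrative bean; the labels must match the CSV header text exactly.
@CsvBean(header = true)
public class SampleCsv {

	@CsvColumn(label = "title")
	private String title;

	@CsvColumn(label = "user_id")
	private Integer userId;

	// getters and setters omitted
}

The linked documentation then writes all rows at once with csvWriter.writeAll(list) before closing the writer.
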
5. Result

(Screenshots of the downloaded result.csv.)
