The task itself is what the title says, but this article is mainly about how to use Jackson.
ffprobe is invoked as follows.
cat ${file name} | ffprobe -hide_banner -v error -print_format json -show_streams -i pipe:0
I have specified various options, but the following two are the important ones.
- -print_format json → output the result in JSON format
- -show_streams → output information for each stream (audio, video, subtitles, and so on)
See the following article for how to call it from Java.
- [Java] Writing to the stdin of a Process - Qiita
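The details are in the linked article, but as a rough sketch, running the command above from Java might look something like this (the helper name runFfprobe is just for illustration, and ffprobe is assumed to be on the PATH):
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;

// Sketch: run the ffprobe command shown above and capture its JSON output
static String runFfprobe(Path videoFile) throws IOException, InterruptedException {
    Process process = new ProcessBuilder(
            "ffprobe", "-hide_banner", "-v", "error",
            "-print_format", "json", "-show_streams", "-i", "pipe:0")
            .redirectInput(videoFile.toFile())  // the file is fed to ffprobe's stdin (pipe:0)
            .start();
    // ffprobe writes the JSON to stdout
    String json = new String(process.getInputStream().readAllBytes(), StandardCharsets.UTF_8);
    process.waitFor();
    return json;
}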
The result of the call is as follows (it is long, so it is collapsed).
{
"streams": [
{
"index": 0,
"codec_name": "h264",
"codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
"profile": "Main",
"codec_type": "video",
"codec_time_base": "752/45075",
"codec_tag_string": "avc1",
"codec_tag": "0x31637661",
"width": 1920,
"height": 1080,
"coded_width": 1920,
"coded_height": 1088,
"has_b_frames": 1,
"sample_aspect_ratio": "1:1",
"display_aspect_ratio": "16:9",
"pix_fmt": "yuv420p",
"level": 41,
"color_range": "tv",
"chroma_location": "left",
"refs": 1,
"is_avc": "true",
"nal_length_size": "4",
"r_frame_rate": "30000/1001",
"avg_frame_rate": "45075/1504",
"time_base": "1/30000",
"start_pts": 0,
"start_time": "0.000000",
"duration_ts": 601600,
"duration": "20.053333",
"bit_rate": "6653705",
"bits_per_raw_sample": "8",
"nb_frames": "601",
"disposition": {
"default": 1,
"dub": 0,
"original": 0,
"comment": 0,
"lyrics": 0,
"karaoke": 0,
"forced": 0,
"hearing_impaired": 0,
"visual_impaired": 0,
"clean_effects": 0,
"attached_pic": 0,
"timed_thumbnails": 0
},
"tags": {
"creation_time": "2014-03-30T07:09:03.000000Z",
"language": "eng",
"handler_name": "\u001fMainconcept Video Media Handler",
"encoder": "AVC Coding"
}
},
{
"index": 1,
"codec_name": "aac",
"codec_long_name": "AAC (Advanced Audio Coding)",
"profile": "LC",
"codec_type": "audio",
"codec_time_base": "1/48000",
"codec_tag_string": "mp4a",
"codec_tag": "0x6134706d",
"sample_fmt": "fltp",
"sample_rate": "48000",
"channels": 2,
"channel_layout": "stereo",
"bits_per_sample": 0,
"r_frame_rate": "0/0",
"avg_frame_rate": "0/0",
"time_base": "1/48000",
"start_pts": 0,
"start_time": "0.000000",
"duration_ts": 962560,
"duration": "20.053333",
"bit_rate": "317375",
"max_bit_rate": "317625",
"nb_frames": "942",
"disposition": {
"default": 1,
"dub": 0,
"original": 0,
"comment": 0,
"lyrics": 0,
"karaoke": 0,
"forced": 0,
"hearing_impaired": 0,
"visual_impaired": 0,
"clean_effects": 0,
"attached_pic": 0,
"timed_thumbnails": 0
},
"tags": {
"creation_time": "2014-03-30T07:09:03.000000Z",
"language": "eng",
"handler_name": "#Mainconcept MP4 Sound Media Handler"
}
}
]
}
Now map the probe result obtained above to objects.
The output is an array of the streams in the video.
This time, we map to the following objects (@Getter and @Setter assume Lombok).
The base of the result is a stream, and the types are split by whether it is video or audio (or something else).
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonSubTypes;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.databind.annotation.NoClass;
import lombok.Getter;
import lombok.Setter;

@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "codec_type", defaultImpl = NoClass.class)
@JsonSubTypes({
        @JsonSubTypes.Type(value = FfprobeStream.VideoStream.class, name = "video"),
        @JsonSubTypes.Type(value = FfprobeStream.AudioStream.class, name = "audio")
})
@JsonIgnoreProperties(ignoreUnknown = true)
@Getter @Setter
abstract class FfprobeStream {

    private Double duration;
    private String codecName;

    @Getter @Setter
    static class VideoStream extends FfprobeStream {
        private Integer width;
        private Integer height;
    }

    @Getter @Setter
    static class AudioStream extends FfprobeStream {
        private String channelLayout;
    }
}
The annotations below control which target type is used for the mapping, based on the value of codec_type.
You could also create one god object that holds every field and map onto that, but the later processing is easier if the target type is chosen per stream, so it is better to control the mapping target.
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "codec_type", defaultImpl = NoClass.class)
@JsonSubTypes({
        @JsonSubTypes.Type(value = FfprobeStream.VideoStream.class, name = "video"),
        @JsonSubTypes.Type(value = FfprobeStream.AudioStream.class, name = "audio")
})
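As a quick illustration (a sketch that is not in the original article; mapper is assumed to be an ObjectMapper configured as in parseFfprobeResult below, and the method name is made up), a single stream object resolves to the subtype named by its codec_type:
// Sketch: a stream object deserializes to the subtype selected by codec_type
static void checkSubtypeResolution(ObjectMapper mapper) throws IOException {
    FfprobeStream stream = mapper.readValue(
            "{\"codec_type\": \"video\", \"width\": 1920, \"height\": 1080}",
            FfprobeStream.class);
    // codec_type is "video", so the concrete type is FfprobeStream.VideoStream
    System.out.println(stream instanceof FfprobeStream.VideoStream);  // true
}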
The following annotation makes Jackson ignore fields that cannot be mapped. Only some of the fields from the result are declared this time, so without it, any field that cannot be mapped would cause an error.
@JsonIgnoreProperties(ignoreUnknown = true)
The following code maps the probe result to the array of streams in the video.
Details are in the comments.
import java.io.IOException;
import java.util.Collections;
import java.util.List;
import java.util.Map;

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.PropertyNamingStrategy;

static List<FfprobeStream> parseFfprobeResult(String ffprobeResult) throws IOException {
    ObjectMapper mapper = new ObjectMapper();
    // ffprobe outputs JSON with snake_case keys, so configure PropertyNamingStrategy.SNAKE_CASE
    mapper.setPropertyNamingStrategy(PropertyNamingStrategy.SNAKE_CASE);
    // Convert the JSON into Java objects
    Map<String, List<FfprobeStream>> map
            = mapper.readValue(ffprobeResult, new TypeReference<Map<String, List<FfprobeStream>>>() {});
    // The parse result comes back as a Map, so return the "streams" entry
    return map.getOrDefault("streams", Collections.emptyList());
}
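Putting it together, usage might look like the following sketch (runFfprobe is the illustrative helper from earlier and the file name is made up; with the sample output above this should print a 1920x1080 video stream and a stereo audio stream):
// Usage sketch: probe a file, parse the JSON, and branch on the concrete stream type
List<FfprobeStream> streams = parseFfprobeResult(runFfprobe(Path.of("sample.mp4")));
for (FfprobeStream stream : streams) {
    if (stream instanceof FfprobeStream.VideoStream) {
        FfprobeStream.VideoStream video = (FfprobeStream.VideoStream) stream;
        System.out.println("video: " + video.getWidth() + "x" + video.getHeight());
    } else if (stream instanceof FfprobeStream.AudioStream) {
        FfprobeStream.AudioStream audio = (FfprobeStream.AudioStream) stream;
        System.out.println("audio: " + audio.getChannelLayout());
    }
}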