Fluid visualization with the BOS method: seeing the invisible at home

Introduction

Transparent gases are invisible to the naked eye. Still, there are times when you want to see them. Flow visualization techniques such as the schlieren method and the shadowgraph method are useful here. This time, I will visualize a flow using only things found at home. Basically, all you need is a camera.

References

The experiment follows this reference, which I recommend reading for the detailed theory: Improvement of density gradient visualization method based on BOS method

Introduction of fluid visualization technology

Schlieren method

If the density gradient is large enough, you can sometimes see it with the naked eye, for example heat haze in summer or the gas from a lighter. The shadowgraph method visualizes flows by the same principle that makes them visible to the naked eye, and the schlieren method further increases the sensitivity of this approach. schlierens_sam13.jpg NOBBY TECH schlieren device

The optical system is configured as follows.

シュリーレン法原理図.png

Light emitted from a point source is focused to a point by a lens (in reality the focus has some finite extent). A knife edge is installed to block half of this light. If there is a density gradient (unevenness of density) on the optical path, the light is refracted and no longer converges to a single point. Light refracted toward the knife edge is blocked and darkens the image on the sensor, while light refracted away from it brightens the image. In this way, the density gradient is recorded as brightness information.
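
As a rough, illustrative sketch of why density variations bend light (the numbers below are assumptions, not measurements): the refractive index of air follows the Gladstone-Dale relation n = 1 + Kρ, so a density gradient carries an index gradient that deflects a passing ray by approximately ε ≈ L·(dn/dy).

```python
# Estimate the deflection of a ray crossing a density gradient,
# using the Gladstone-Dale relation n = 1 + K*rho for air.
K = 2.26e-4          # m^3/kg, Gladstone-Dale constant for air
rho_air = 1.2        # kg/m^3, ambient density
rho_hot = 0.9        # kg/m^3, heated region (illustrative value)
width = 0.01         # m, layer thickness over which the density changes
path_len = 0.05      # m, distance the ray travels through the gradient

dn_dy = K * (rho_air - rho_hot) / width   # transverse index gradient, 1/m
epsilon = path_len * dn_dy                # small-angle deflection, radians
print(f"deflection = {epsilon * 1e6:.0f} microradians")
```

Deflections this small are why the knife edge (or, below, a background pattern) is needed to turn the angle into a visible brightness change.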

This method is highly sensitive and produces beautiful pictures, but it is a little difficult to reproduce at home. The first difficulty is the amount of equipment that must be prepared.

The next difficulty is setting up the optical system. Aligning the optical axis takes considerable effort; in particular, the knife edge must sit exactly at the focal position. A fine-adjustment stage would help, and without one some ingenuity is required.

SPBOS method (Stripe-patterned Background Oriented Schlieren)

The BOS method can visualize density gradients much like the schlieren method, but it requires no knife edge.

The optical system is also very simple. BOS法原理図.png

Take two images, one of the background alone and one with the density gradient present, and compute the density gradient from the difference. There is also a BOS variant that uses a random background pattern, but that makes the program cumbersome because, like PIV, it finds displacements by computing cross-correlations. This time we use the SPBOS method, whose striped background keeps the program simple.
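
The principle behind SPBOS can be checked in one dimension: for a small background shift u, I_shifted(x) ≈ I(x) + u·dI/dx, so the shift can be recovered from the difference divided by the gradient. A minimal synthetic sketch (not the analysis code used later):

```python
import numpy as np

# 1-D check of the SPBOS first-order relation:
# shifting a sine background by u gives I_shift ~= I + u * dI/dx,
# so u is recovered from (I_shift - I) / (dI/dx).
x = np.linspace(0.0, 10.0, 2001)
dx = x[1] - x[0]
u = 0.003                                # small shift, same units as x
I_ref = np.sin(2.0 * np.pi * x)          # stripe background, wavelength 1
I_shift = np.sin(2.0 * np.pi * (x + u))  # the same stripes, displaced by u

grad = np.gradient(I_ref, dx)
mask = np.abs(grad) > 1.0                # avoid dividing near stripe extrema
u_est = np.mean((I_shift - I_ref)[mask] / grad[mask])
print(u_est)                             # close to 0.003
```

With a random-dot background this division is not available, which is why those variants need PIV-style cross-correlation instead.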

Experiment

Things to prepare

  1. Striped screen
  2. Camera
  3. What you want to measure

This time I used the following.

  1. iPhone 8
  2. D5100, AF-S DX NIKKOR 55-300mm f/4.5-5.6G ED VR
  3. Monotaro Air Duster

The striped screen can be displayed on a smartphone or printed on paper with a printer.

It is desirable for the camera to be able to save RAW. The D5100 has a bit depth of 14 bits (16,384 levels), but a JPEG keeps only 8 bits (256 levels) to match the gradation of the display. Gamma correction is also applied, so converting back to a linear scale makes the quantization even worse. A tripod is fine, but placing the camera on the floor also works. Be careful not to shake the camera when pressing the shutter.
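
To put a number on the JPEG penalty, one can count how many of the 256 gamma-encoded code values cover a given range of linear brightness after undoing the gamma. A rough sketch assuming a pure 2.2 power-law gamma (real sRGB has a linear toe, so the exact numbers differ slightly):

```python
import numpy as np

# Count the 8-bit code values that land in the top half of the linear
# range after undoing a 2.2 power-law gamma, and compare with 14-bit RAW.
codes = np.arange(256) / 255.0            # 8-bit encoded values
linear = codes ** 2.2                     # decode back to linear light
bright = np.count_nonzero(linear >= 0.5)  # codes covering linear [0.5, 1.0]

raw_levels = 2 ** 14                      # D5100 RAW is linear, 14-bit
print(bright, "of 256 JPEG codes cover the top half;",
      raw_levels // 2, "RAW levels do")
```

After linearization the quantization steps are badly uneven, which is exactly what hurts when the BOS signal is a tiny brightness difference.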

Shooting

This time I set things up as shown below, with a focal length of 300 mm, an exposure time of 1/60 s, an aperture of f/20, and ISO 400. BOS光学系セットアップ.png

Focus the camera on the screen.

You need to be careful about the distance between the screen and the measurement target. If it is too close, there is no sensitivity: even when a density gradient right next to the screen bends the light, the bent ray is imaged at almost the same position as the unbent one. On the other hand, if it is too far, the measurement target is blurred; increase the F-number to deepen the depth of field.
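
This sensitivity argument can be put in numbers: a ray deflected by an angle ε at a distance L_B in front of the background appears shifted on the background by roughly ε·L_B, so the shift vanishes as the object approaches the screen. A back-of-the-envelope sketch with illustrative values:

```python
# Apparent background displacement in BOS: a ray deflected by epsilon
# at distance L_B before the background appears shifted by ~ epsilon * L_B.
epsilon = 3.4e-4      # rad, deflection from the density gradient (illustrative)
L_B = 0.30            # m, object-to-background distance

shift_mm = epsilon * L_B * 1000.0   # apparent shift on the background, mm
print(f"apparent shift: {shift_mm:.3f} mm")
# As L_B -> 0 the shift vanishes, which is why the measurement target
# must not sit right against the screen.
```

A shift of a tenth of a millimetre is a few pixels on a printed or displayed stripe pattern, which is comfortably detectable by the difference image.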

The background image was generated by the following program.

import numpy as np

# Display parameters (iPhone 8 screen)
ppi = 326                 # pixels per inch (25.4 mm)
x_pix = 750               # horizontal resolution
y_pix = 1334              # vertical resolution
pix_per_mm = ppi / 25.4
x_mm = x_pix / pix_per_mm
y_mm = y_pix / pix_per_mm
lam = 1.0                 # stripe wavelength in mm

def sin_img(direction="x", bit=8):
    if direction == "x":
        x = np.linspace(0.0, x_mm, x_pix)
        sin = np.sin(2.0*np.pi*x/lam)
        img = np.tile(sin, (y_pix, 1)).T
    elif direction == "y":
        y = np.linspace(0.0, y_mm, y_pix)
        sin = np.sin(2.0*np.pi*y/lam)
        img = np.tile(sin, (x_pix, 1))
    else:
        raise ValueError("direction must be 'x' or 'y'")

    # Map [-1, 1] to the full range of the requested bit depth
    img = 0.5 * (img + 1.0) * (2**bit - 1.0)
    if bit == 8:
        img = img.astype(np.uint8)
    elif bit == 16:
        img = img.astype(np.uint16)
    else:
        raise ValueError("bit must be 8 or 16")
    return img

The captured background: ref.png

With the measurement target in place: jet.png

Analysis

First, the program used for the analysis.

The RAW images are developed with rawpy, using linear-scale brightness values without gamma correction.

import rawpy
import numpy as np
from scipy import signal
import matplotlib.pyplot as plt
import cv2
from pathlib import Path

def postprocessing(path):
    # Develop the RAW file to linear 16-bit values: no gamma, no auto-brightening
    raw = rawpy.imread(str(path))
    return raw.postprocess(gamma=[1.0, 1.0],
                           no_auto_bright=True,
                           output_color=rawpy.ColorSpace.raw,
                           use_camera_wb=True,
                           use_auto_wb=False,
                           output_bps=16,
                           no_auto_scale=True
                           )

ref_path = Path("data/ref2.NEF")
jet_path = Path("data/jet2.NEF")

ref_img = postprocessing(ref_path)
jet_img = postprocessing(jet_path)

# Use the green channel as grayscale; cast to float so differences can go negative
ref_gray = ref_img[:, :, 1].astype(np.float64)
jet_gray = jet_img[:, :, 1].astype(np.float64)

The visualization itself is just a difference multiplied by a spatial gradient. The difference picks out the places where light was refracted by the density gradient, but light bent in the same direction can come out darker or brighter depending on the local phase of the background stripes, so the difference must be multiplied by the gradient of the background to make the sign consistent.

bos = (jet_gray-ref_gray) * np.gradient(0.5*(jet_gray+ref_gray))[1]

Apply a low-pass filter to suppress the noise. The cutoff radius is chosen by looking at the result.

def fft_lpf(img, r):
    # Ideal circular low-pass filter: keep spatial frequencies within radius r
    mask = np.zeros_like(img, dtype=float)
    X, Y = np.meshgrid(np.arange(img.shape[1]) - img.shape[1]/2.0,
                       np.arange(img.shape[0]) - img.shape[0]/2.0)
    mask[np.sqrt(X**2 + Y**2) < r] = 1.0

    # FFT -> mask -> inverse FFT; the inverse must use ifftshift/ifft2,
    # and we take the real part so the sign of the BOS signal is preserved
    fft_img = np.fft.fftshift(np.fft.fft2(img))
    fft_img *= mask
    return np.real(np.fft.ifft2(np.fft.ifftshift(fft_img)))
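
A quick sanity check for this kind of filter: low-frequency structure should pass through while broadband noise is suppressed. The snippet below repeats an ideal circular low-pass (with the inverse transform spelled out as ifftshift followed by ifft2) so it runs on its own:

```python
import numpy as np

def fft_lpf(img, r):
    # Ideal circular low-pass: keep frequencies within radius r of DC
    mask = np.zeros_like(img, dtype=float)
    X, Y = np.meshgrid(np.arange(img.shape[1]) - img.shape[1] / 2.0,
                       np.arange(img.shape[0]) - img.shape[0] / 2.0)
    mask[np.sqrt(X**2 + Y**2) < r] = 1.0
    fft_img = np.fft.fftshift(np.fft.fft2(img)) * mask
    return np.real(np.fft.ifft2(np.fft.ifftshift(fft_img)))

rng = np.random.default_rng(0)
smooth = np.outer(np.sin(np.linspace(0, np.pi, 128)),
                  np.sin(np.linspace(0, np.pi, 128)))    # low-frequency bump
noisy = smooth + 0.3 * rng.standard_normal((128, 128))   # add broadband noise

filtered = fft_lpf(noisy, r=8)
err_noisy = np.abs(noisy - smooth).mean()
err_filt = np.abs(filtered - smooth).mean()
print(err_filt < err_noisy)   # filtering brings us closer to the clean field
```

A sharp circular cutoff can ring near edges; a smoother (e.g. Gaussian) mask is a common alternative if that becomes visible.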

After applying the low-pass filter, save the image and we are done. 調整.png

There is a lot of noise, but the jet is clearly visualized. It seems the noise could be reduced with a little more ingenuity.

Postscript (2020/04/14): I fixed the camera down properly, took the pictures again, and adjusted the brightness afterwards to reduce the noise. A picture this clean is possible. 論文実装.png

Summary

We have shown that flow visualization is possible with only things found at home. Please give it a try.
