Oliver Bartlett
We present the current progress in reducing noise in astronomical survey data using a new processing method. This method applies machine learning, specifically a neural network architecture known as an autoencoder. The concept is to train the autoencoder to distinguish what is perceived as noise from what is perceived as a target, and thus remove the noise whilst preserving the morphological structure of the target. Applied in a real-world setting, this would save a great deal of time and effort in reducing visible noise in survey data compared with current techniques such as stacking.
The poster is 3 pages, each containing two columns. The first column of page 1 holds the concept and introduction sections, which cover using machine learning to reduce noise in survey data. A Gantt chart depicts a large selection of surveys between the 1950s and the 2000s and shows the number of detections each one made, illustrating that surveys are collecting more data than human effort can handle and that machine learning offers a solution. The second column contains a noise reduction section as well as a reference section. The noise reduction section explains what noise is, how it affects surveys, and what the current method to reduce it is. This method, called stacking, is effective but cannot be applied to transient targets; the new machine learning proposal therefore aims to reduce noise in these transient images. This section contains two versions of an artificially constructed survey image, one with noise and one without.
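The stacking technique summarised above is not shown as code on the poster; the following is a minimal, hypothetical NumPy sketch of the idea. Averaging many aligned exposures of a static field suppresses random noise (roughly by the square root of the number of frames), which is exactly why it fails for transient targets that change between exposures. The toy field, noise level, and frame count below are illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration of stacking: combine N aligned exposures of the
# same static field so random noise averages down while the source remains.
rng = np.random.default_rng(0)
true_field = np.zeros((64, 64))
true_field[30:34, 30:34] = 1.0          # a toy point-like source

n_exposures = 25
frames = [true_field + rng.normal(0.0, 0.2, true_field.shape)
          for _ in range(n_exposures)]

stacked = np.mean(frames, axis=0)        # noise std drops by ~sqrt(N)

print("single-frame noise std:", np.std(frames[0] - true_field))
print("stacked noise std:     ", np.std(stacked - true_field))
```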
The first column of page 2 holds the machine learning and autoencoder sections. The machine learning section outlines the basics of the field, focusing on a model called an artificial neural network. An artificial neural network operates somewhat like a brain, and depending on how it is structured it can be designed to accomplish particular tasks. As an example, a convolutional neural network is well suited to image classification because it contains convolutional layers. This leads into another neural network structure called an autoencoder. This section details that an autoencoder's purpose is to take an input, generate a compressed representation of it by removing unnecessary features, and then reconstruct that representation back into the original input. This structure is applied here by treating noise as the unnecessary feature to remove while retaining the morphological structure of the target. The section includes an image depicting the representation. The second column contains another image depicting the structure of an autoencoder, followed by a reference section at the end of the column.
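To make the encode-then-reconstruct idea concrete, here is a minimal sketch of a convolutional denoising autoencoder in Keras. The layer sizes and the 64x64 single-channel input are illustrative assumptions, not the architecture used on the poster.

```python
from tensorflow.keras import layers, models

def build_denoising_autoencoder(input_shape=(64, 64, 1)):
    inputs = layers.Input(shape=input_shape)

    # Encoder: progressively reduce spatial size, keeping salient structure.
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(16, 3, activation="relu", padding="same")(x)
    encoded = layers.MaxPooling2D(2)(x)          # compressed representation

    # Decoder: reconstruct the image from the representation.
    x = layers.Conv2D(16, 3, activation="relu", padding="same")(encoded)
    x = layers.UpSampling2D(2)(x)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(x)
    x = layers.UpSampling2D(2)(x)
    outputs = layers.Conv2D(1, 3, activation="sigmoid", padding="same")(x)

    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

# Training pairs noisy inputs with their clean counterparts, so the network
# learns to discard the noise during reconstruction, e.g.:
# model.fit(noisy_images, clean_images, epochs=50, batch_size=32)
```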
The third page contains the early results section in the first column. Data augmentation was used to create a training and testing set for the autoencoder from a single image. Noise was applied to these images, which were then fed into the trained autoencoder. The results are compared using image comparison techniques, specifically the mean squared error and structural similarity index functions. For mean squared error, a result close to 0.00 means the images are nearly identical; for the structural similarity index, a result close to 1.00 means the images are nearly identical. The column ends with the resulting image comparisons. The original versus noisy image has a mean squared error of 0.00 and a structural similarity index of 0.66. The original versus cleaned image has a mean squared error of 0.00 and a structural similarity index of 0.78.
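As a sketch of this comparison step, the metrics can be computed with scikit-image's mean squared error and structural similarity functions. The arrays below are toy stand-ins; on the poster the comparison is between the original, noisy, and autoencoder-cleaned versions of the artificial survey image.

```python
import numpy as np
from skimage.metrics import mean_squared_error, structural_similarity

# Toy images standing in for the original, noisy, and cleaned survey images.
rng = np.random.default_rng(1)
original = rng.random((64, 64))
noisy = np.clip(original + rng.normal(0.0, 0.1, original.shape), 0.0, 1.0)
cleaned = 0.95 * original + 0.05 * noisy   # stand-in for the autoencoder output

for label, candidate in [("noisy", noisy), ("cleaned", cleaned)]:
    mse = mean_squared_error(original, candidate)
    ssim = structural_similarity(original, candidate, data_range=1.0)
    print(f"original vs {label}: MSE = {mse:.2f}, SSIM = {ssim:.2f}")
```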
The second column contains the analysis section, a summary section, and contact information. The analysis states that the process is still being refined to further maximise noise reduction and to preserve the morphological features of the target, but that based on the testing so far the process looks very promising.
Contact information
Email: O.J.Bartlett-2018@hull.ac.uk
Website: http://www.milne.hull.ac.uk/#home