
Shota Saito, SkillUp AI Co. Ltd
Master of Information Science
About
Publications: 10
Reads: 663
Citations: 17
Publications (10)
This study targets the mixed-integer black-box optimization (MI-BBO) problem, in which continuous and integer variables must be optimized simultaneously. The CMA-ES, our focus in this study, is a population-based stochastic search method that samples solution candidates from a multivariate Gaussian distribution (MGD), which shows excellent performance...
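Below is a minimal sketch, not the paper's algorithm, of the basic idea the summary describes: candidates are sampled from a multivariate Gaussian distribution and the integer coordinates are rounded before evaluation. The toy objective, population size, and mean-only update are illustrative assumptions.

```python
# A minimal sketch (not the paper's algorithm) of CMA-ES-style sampling for a
# mixed-integer problem: candidates come from a multivariate Gaussian and the
# integer coordinates are rounded before evaluation. The toy objective,
# population size, and mean-only update are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
dim_cont, dim_int = 2, 2             # two continuous and two integer variables
pop_size = 8

def objective(x_cont, x_int):
    # Toy mixed-integer objective: sphere on the continuous part
    # plus a quadratic penalty on the integer part.
    return np.sum(x_cont ** 2) + np.sum((x_int - 3) ** 2)

mean = np.zeros(dim_cont + dim_int)  # mean of the Gaussian
sigma = 1.0                          # global step size
cov = np.eye(dim_cont + dim_int)     # covariance matrix (kept fixed here)

for generation in range(50):
    samples = rng.multivariate_normal(mean, sigma ** 2 * cov, size=pop_size)
    x_cont = samples[:, :dim_cont]
    x_int = np.round(samples[:, dim_cont:])                 # discretize integer coordinates
    fitness = np.array([objective(c, i) for c, i in zip(x_cont, x_int)])
    elite = samples[np.argsort(fitness)[: pop_size // 2]]   # best half of the population
    mean = elite.mean(axis=0)                               # move the distribution toward good solutions

print("final mean:", np.round(mean, 2))
```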
Using machine learning, we optimized an ultrasmall photonic crystal nanocavity to attain a high Q. Training data was collected via finite-difference time-domain simulation for models with randomly shifted holes, and a fully connected neural network (NN) was trained, resulting in a coefficient of determination between predicted and calculated values...
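A minimal sketch of the kind of surrogate model the summary describes: a fully connected regressor mapping hole displacements to a cavity Q value, evaluated by the coefficient of determination. The synthetic data below stands in for the FDTD simulation results; all shapes, scales, and hyperparameters are assumptions.

```python
# A minimal sketch of a fully connected regressor predicting a cavity Q value
# from randomly shifted hole positions. Synthetic data replaces the FDTD
# results; shapes, scales, and hyperparameters are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples, n_holes = 1000, 50
hole_shifts = rng.normal(scale=5.0, size=(n_samples, n_holes))   # placeholder hole displacements [nm]
q_factor = 1.0 + 0.1 * np.tanh(0.05 * hole_shifts @ rng.normal(size=n_holes))  # placeholder Q (arbitrary units)

X_train, X_test, y_train, y_test = train_test_split(hole_shifts, q_factor, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))      # coefficient of determination
```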
The convolutional neural network (CNN) is a standard tool for image recognition. To improve the performance of CNNs, it is important to design not only the network architecture but also the preprocessing of the input image. Extracting or enhancing the meaningful features of the input image in the preprocessing stage can help to improve the CNN perf...
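A minimal sketch of an input preprocessing stage of the kind referred to above: the image is standardized and an edge-magnitude channel is added before it would be fed to a CNN. The Sobel-based enhancement is an illustrative choice, not the specific preprocessing studied in the paper.

```python
# A minimal sketch of a preprocessing stage before a CNN: standardize the image
# and append an edge-magnitude channel. The Sobel-based enhancement is an
# illustrative choice, not the specific preprocessing studied in the paper.
import numpy as np
from scipy.ndimage import sobel

def preprocess(image: np.ndarray) -> np.ndarray:
    """Standardize a grayscale image and append an edge-magnitude channel."""
    img = (image - image.mean()) / (image.std() + 1e-8)       # zero mean, unit variance
    edges = np.hypot(sobel(img, axis=0), sobel(img, axis=1))  # gradient magnitude
    return np.stack([img, edges], axis=0)                     # shape: (channels, H, W)

example = np.random.default_rng(0).random((32, 32)).astype(np.float32)
print(preprocess(example).shape)  # (2, 32, 32): a two-channel input for the CNN
```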
A method of simultaneously optimizing both the structure of neural networks and the connection weights in a single training loop can reduce the enormous computational cost of neural architecture search. We focus on the probabilistic model-based dynamic neural network structure optimization that considers the probability distribution of structure pa...
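A minimal sketch, under simplifying assumptions, of probabilistic model-based structure optimization: each structural element is kept or dropped according to a Bernoulli parameter, sampled structures are scored, and the distribution parameters are nudged toward structures that scored well. The score function below is a stand-in for the joint weight-training and validation step, and the REINFORCE-style update is generic, not the rule used in the paper.

```python
# A minimal sketch of probabilistic model-based structure optimization: sample
# binary structures from a Bernoulli distribution, score them, and update the
# distribution parameters. score() stands in for training the weights of the
# sampled sub-network and measuring validation accuracy; the REINFORCE-style
# update is generic, not the paper's rule.
import numpy as np

rng = np.random.default_rng(0)
n_units = 10
theta = np.full(n_units, 0.5)           # Bernoulli parameters of the structure distribution
target = rng.integers(0, 2, n_units)    # hypothetical "best" structure for this toy score

def score(mask):
    # Placeholder: fraction of structural choices that match the hidden optimum.
    return np.mean(mask == target)

lr, n_samples = 0.1, 16
for step in range(200):
    masks = (rng.random((n_samples, n_units)) < theta).astype(float)  # sample structures
    scores = np.array([score(m) for m in masks])
    baseline = scores.mean()                                          # variance-reducing baseline
    grad = ((scores - baseline)[:, None] * (masks - theta)).mean(axis=0)
    theta = np.clip(theta + lr * grad, 0.01, 0.99)                    # keep probabilities valid

print("learned structure probabilities:", np.round(theta, 2))
print("hidden optimum:                 ", target)
```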
The high sensitivity of neural architecture search (NAS) methods to their inputs, such as the step size (i.e., learning rate) and the search space, prevents practitioners from applying them out of the box to their own problems, even though the purpose of NAS is to automate part of the tuning process. Aiming at a fast, robust, and widely applicable NAS, we develop a generic...
Black-box discrete optimization (BBDO) appears in a wide range of engineering tasks. Evolutionary and other BBDO approaches have been applied with the aim of automating the necessary tuning of system parameters, such as the hyperparameter tuning of machine-learning-based systems when they are installed for a specific task. However, automation is often jeopardized by...
In machine learning, feature selection is a commonly used technique for improving the predictive performance and interpretability of a trained model. Feature selection techniques are classified into three approaches: the filter, wrapper, and embedded approaches. The embedded approach performs the feature selection process during the model training...
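For context, a textbook example of the embedded approach mentioned above: an L1-regularized linear model (Lasso) performs feature selection as a by-product of training, unlike filter methods (scoring features beforehand) or wrapper methods (searching feature subsets around a black-box model). This is a standard illustration, not the method proposed in the paper.

```python
# A textbook example of embedded feature selection: Lasso's L1 penalty drives
# the weights of uninformative features to zero during training, so selection
# happens as a by-product of fitting the model. Standard illustration only,
# not the method proposed in the paper.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso

X, y = make_regression(n_samples=200, n_features=30, n_informative=5,
                       noise=0.1, random_state=0)
selector = SelectFromModel(Lasso(alpha=0.1)).fit(X, y)        # fits Lasso, keeps nonzero-weight features
print("selected feature indices:", np.flatnonzero(selector.get_support()))
```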