Shota Saito
SkillUp AI Co. Ltd

Master of Information Science

About

Publications: 10
Reads: 663
Citations: 17

Publications (10)
Preprint
Full-text available
This study targets the mixed-integer black-box optimization (MI-BBO) problem, in which continuous and integer variables must be optimized simultaneously. The CMA-ES, our focus in this study, is a population-based stochastic search method that samples solution candidates from a multivariate Gaussian distribution (MGD) and shows excellent performanc...
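As a rough illustration of the sampling step described above (a minimal sketch, not the method proposed in this preprint): candidates are drawn from a multivariate Gaussian distribution and the coordinates declared as integers are rounded before evaluation. All names and dimensions below are hypothetical.

    import numpy as np

    def sample_candidates(mean, cov, int_mask, n_samples, rng=None):
        # Draw candidates from N(mean, cov) and round the integer coordinates.
        rng = np.random.default_rng() if rng is None else rng
        x = rng.multivariate_normal(mean, cov, size=n_samples)
        x[:, int_mask] = np.round(x[:, int_mask])
        return x

    # Hypothetical 3-dimensional MI-BBO problem: dimensions 1 and 2 are integers.
    mean, cov = np.zeros(3), np.eye(3)
    int_mask = np.array([False, True, True])
    candidates = sample_candidates(mean, cov, int_mask, n_samples=8)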
Article
Using machine learning, we optimized an ultrasmall photonic crystal nanocavity to attain a high Q. Training data was collected via finite-difference time-domain simulation for models with randomly shifted holes, and a fully connected neural network (NN) was trained, resulting in a coefficient of determination between predicted and calculated values...
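A minimal sketch of the surrogate-modeling idea in this article, using synthetic data in place of the FDTD dataset: fit a fully connected regressor to predict a quality factor from hole-shift features and report the coefficient of determination. The feature dimensions and network size below are assumptions for illustration only.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 20))                            # stand-in for hole-shift features
    y = X @ rng.normal(size=20) + 0.1 * rng.normal(size=500)  # stand-in for simulated Q values

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
    model.fit(X_tr, y_tr)
    print("R^2 on held-out data:", r2_score(y_te, model.predict(X_te)))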
Chapter
The convolutional neural network (CNN) is a standard tool for image recognition. To improve the performance of CNNs, it is important to design not only the network architecture but also the preprocessing of the input image. Extracting or enhancing the meaningful features of the input image in the preprocessing stage can help to improve the CNN perf...
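To make the role of preprocessing concrete, here is an illustrative sketch (not the chapter's method): a fixed edge-enhancement filter is applied to the input, and the enhanced channel is fed to a small CNN alongside the raw image. The network and filter below are hypothetical.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Fixed Laplacian kernel used as a simple edge-enhancement preprocessing step.
    laplacian = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]]).view(1, 1, 3, 3)

    def preprocess(x):
        # x: (N, 1, H, W) grayscale batch -> (N, 2, H, W) raw + edge channels.
        edges = F.conv2d(x, laplacian, padding=1)
        return torch.cat([x, edges], dim=1)

    cnn = nn.Sequential(
        nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(16, 10),
    )

    x = torch.randn(4, 1, 28, 28)          # dummy image batch
    logits = cnn(preprocess(x))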
Chapter
A method of simultaneously optimizing both the structure of neural networks and the connection weights in a single training loop can reduce the enormous computational cost of neural architecture search. We focus on the probabilistic model-based dynamic neural network structure optimization that considers the probability distribution of structure pa...
Preprint
A method of simultaneously optimizing both the structure of neural networks and the connection weights in a single training loop can reduce the enormous computational cost of neural architecture search. We focus on the probabilistic model-based dynamic neural network structure optimization that considers the probability distribution of structure pa...
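The single-training-loop idea can be sketched roughly as follows (an assumption-laden toy example, not the algorithm from these papers): a Bernoulli distribution over hidden-unit gates represents the structure, the weights are updated by SGD for each sampled structure, and the gate probabilities are updated with a REINFORCE-style gradient of the same loss.

    import torch

    torch.manual_seed(0)
    n_in, n_hidden, n_out = 8, 32, 1
    W1 = (0.1 * torch.randn(n_in, n_hidden)).requires_grad_()
    W2 = (0.1 * torch.randn(n_hidden, n_out)).requires_grad_()
    theta = torch.full((n_hidden,), 0.5)       # gate (structure) probabilities

    opt = torch.optim.SGD([W1, W2], lr=1e-2)
    X = torch.randn(256, n_in)
    y = X.sum(dim=1, keepdim=True)             # toy regression target

    for step in range(200):
        gates = torch.bernoulli(theta)         # sample one structure
        pred = (torch.relu(X @ W1) * gates) @ W2
        loss = ((pred - y) ** 2).mean()

        opt.zero_grad()
        loss.backward()                        # weight update under the sampled structure
        opt.step()

        # REINFORCE-style update of the gate probabilities: lower loss reinforces the sampled gates.
        grad_logp = (gates - theta) / (theta * (1 - theta) + 1e-8)
        theta = (theta - 1e-3 * loss.item() * grad_logp).clamp(0.05, 0.95)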
Preprint
The high sensitivity of neural architecture search (NAS) methods to their inputs, such as the step-size (i.e., learning rate) and the search space, prevents practitioners from applying them out-of-the-box to their own problems, even though their purpose is to automate part of the tuning process. Aiming at a fast, robust, and widely applicable NAS, we develop a generi...
Preprint
Full-text available
Black-box discrete optimization (BBDO) appears in a wide range of engineering tasks. Evolutionary and other BBDO approaches have been applied with the aim of automating the necessary tuning of system parameters, such as the hyperparameter tuning of machine-learning-based systems when they are installed for a specific task. However, automation is often jeopardized by...
Conference Paper
Full-text available
In machine learning, feature selection is a commonly used technique for improving the predictive performance and interpretability of a trained model. Feature selection techniques are classified into three approaches: the filter, wrapper, and embedded approaches. The embedded approach performs the feature selection process during the model training...
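As a concrete (hypothetical) instance of the embedded approach mentioned above, not the method from this paper: an L1-regularized linear model performs feature selection during training by driving uninformative coefficients to exactly zero. The data and regularization strength below are illustrative assumptions.

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 30))
    true_coef = np.zeros(30)
    true_coef[:5] = rng.normal(size=5)          # only 5 informative features
    y = X @ true_coef + 0.05 * rng.normal(size=200)

    model = Lasso(alpha=0.1).fit(X, y)          # embedded selection via L1 penalty
    selected = np.flatnonzero(model.coef_)      # indices of retained features
    print("selected feature indices:", selected)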
