Recently, OpenAI collaborated with Uber AI to propose a new approach, called Synthetic Petri Dish, for accelerating the most expensive step of Neural Architecture Search (NAS). The researchers explored whether the computational efficiency of NAS can be improved by creating a new kind of surrogate, one that can benefit from miniaturised training and still generalise beyond the observed distribution of ground-truth evaluations.
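To make the surrogate idea concrete, the following is a minimal, illustrative PyTorch sketch in the spirit of the Synthetic Petri Dish, not the authors' actual code: a tiny "petri dish" network built around a candidate motif is trained on a small, learned synthetic dataset, and that data is tuned so the tiny network's final loss tracks a handful of ground-truth evaluations. The sizes, the helper name `surrogate_score`, and the reference numbers are all hypothetical.

```python
import torch

# Learnable synthetic training set: eight 2-d inputs and targets.
# (Shapes and sizes here are arbitrary toy choices.)
synthetic_x = torch.randn(8, 2, requires_grad=True)
synthetic_y = torch.randn(8, 2, requires_grad=True)

def surrogate_score(motif, inner_steps=20, lr=0.1):
    """Train a tiny one-layer 'petri dish' network built around the motif
    on the synthetic data; its final loss serves as the predicted score."""
    w = torch.eye(2, requires_grad=True)  # deterministic toy initialisation
    for _ in range(inner_steps):
        loss = ((motif(synthetic_x @ w) - synthetic_y) ** 2).mean()
        # create_graph keeps the update differentiable w.r.t. the data,
        # so the outer loop can backpropagate through this inner training.
        grad, = torch.autograd.grad(loss, w, create_graph=True)
        w = w - lr * grad
    return ((motif(synthetic_x @ w) - synthetic_y) ** 2).mean()

# Outer loop: tune the synthetic data so surrogate scores for a few motifs
# match their (made-up, hypothetical) ground-truth losses from full training.
ground_truth = {"relu": 0.30, "tanh": 0.45}
outer_opt = torch.optim.Adam([synthetic_x, synthetic_y], lr=0.01)
for _ in range(100):
    outer_loss = sum((surrogate_score(getattr(torch, name)) - gt) ** 2
                     for name, gt in ground_truth.items())
    outer_opt.zero_grad()
    outer_loss.backward()
    outer_opt.step()

# Once the synthetic data is learned, new motifs can be ranked cheaply
# with surrogate_score instead of full ground-truth training.
```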
For several years now, deep neural networks have achieved considerable success on business problems such as speech recognition, image recognition, and machine translation, among others.
According to the researchers, Neural Architecture Search (NAS) explores a large space of architectural motifs and is a compute-intensive process that often involves ground-truth evaluation of each motif: instantiating it within a large network, then training and evaluating that network on thousands of data samples or more. By motif, the researchers mean a design element, such as a recurrent cell or an activation function, that is repeated throughout a larger neural network blueprint.
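That ground-truth evaluation can be pictured with the following toy loop, a rough sketch assuming a small PyTorch search over activation-function motifs; the blueprint and the helper names `build_network` and `ground_truth_eval` are invented for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical search space of activation-function motifs.
MOTIFS = {"relu": nn.ReLU(), "tanh": nn.Tanh(), "swish": nn.SiLU()}

def build_network(motif: nn.Module) -> nn.Module:
    """Instantiate the candidate motif inside a larger blueprint,
    repeating it after every hidden layer (an invented toy blueprint)."""
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(784, 256), motif,
        nn.Linear(256, 256), motif,
        nn.Linear(256, 10),
    )

def ground_truth_eval(motif, train_loader, val_loader, epochs=10):
    """The expensive step: fully train the instantiated network on
    thousands of samples, then score the motif by validation accuracy."""
    net = build_network(motif)
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in train_loader:
            opt.zero_grad()
            loss_fn(net(x), y).backward()
            opt.step()
    correct = total = 0
    with torch.no_grad():
        for x, y in val_loader:
            correct += (net(x).argmax(dim=1) == y).sum().item()
            total += y.numel()
    return correct / total

# Every candidate motif pays the full training cost, which is exactly
# the expense the Synthetic Petri Dish surrogate is designed to avoid:
# scores = {name: ground_truth_eval(m, train_loader, val_loader)
#           for name, m in MOTIFS.items()}
```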