
Developing deep neural networks effectively requires a good choice of model hyperparameters. The process of choosing the optimal set of hyperparameters for a learning algorithm is called hyperparameter tuning, and common methods include Grid Search and Random Search. We have developed a novel approach to hyperparameter tuning (AiSara-opti-search) which yields a small loss value/high accuracy with fewer trials (as few as 9).
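For context, the two baseline methods can be sketched as follows. This is a minimal illustration over a hypothetical two-parameter loss surface (the objective function, parameter names, and ranges are assumptions for the example, not the AiSara objective):

```python
import itertools
import random

# Hypothetical toy loss surface with its minimum at lr=0.01, batch=32.
def loss(lr, batch):
    return (lr - 0.01) ** 2 + ((batch - 32) / 64) ** 2

# Grid Search: exhaustively evaluate every combination of preset values.
grid = {"lr": [0.001, 0.01, 0.1], "batch": [16, 32, 64]}
grid_best = min(itertools.product(grid["lr"], grid["batch"]),
                key=lambda hp: loss(*hp))

# Random Search: spend the same trial budget sampling uniformly at random.
rng = random.Random(0)
trials = [(10 ** rng.uniform(-3, -1), rng.randint(16, 64))
          for _ in range(9)]
rand_best = min(trials, key=lambda hp: loss(*hp))

print("grid best:", grid_best)
print("random best:", rand_best)
```

Grid search scales exponentially with the number of hyperparameters, while random search covers only as much of the space as its trial budget allows; this is the gap the approach below aims to close.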


  • AiSara-opti-search works similarly to the Random Search method, but it utilizes the AiSara predictive algorithm, which models the entire search space and explores it for the optimal set of HPs.

  • Our method is based on the concept of search space reduction: some of the random search trials are performed within focused search spaces instead of running all of them within the entire search space.

  • The sampling of the HP sets is done using Latin hypercube sampling, because it explores the search space more evenly than uniform random sampling; this helps the AiSara algorithm suggest better boundaries for the focused search space.
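The two-stage idea above can be sketched in a few lines. This is an illustrative sketch only: the objective function, parameter bounds, trial split, and the simple "shrink around the best point" rule standing in for AiSara's predicted focused boundaries are all assumptions, not the actual AiSara algorithm:

```python
import random

def latin_hypercube(n, dims, rng):
    """Draw n points in [0,1]^dims with one sample per stratum in each
    dimension, shuffled per dimension, so the space is covered evenly."""
    cols = []
    for _ in range(dims):
        col = [(i + rng.random()) / n for i in range(n)]  # one point per bin
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))  # n points, each with `dims` coordinates

def scale(point, bounds):
    """Map a unit-cube point onto the actual hyperparameter ranges."""
    return tuple(lo + p * (hi - lo) for p, (lo, hi) in zip(point, bounds))

# Hypothetical objective standing in for model training loss.
def loss(lr, dropout):
    return (lr - 0.01) ** 2 + (dropout - 0.2) ** 2

rng = random.Random(42)
bounds = [(0.0, 0.1), (0.0, 0.5)]   # assumed (lr, dropout) search space

# Stage 1: Latin hypercube trials over the entire search space.
stage1 = [scale(p, bounds) for p in latin_hypercube(5, 2, rng)]
best = min(stage1, key=lambda hp: loss(*hp))

# Stage 2: shrink the space around the stage-1 best (a crude stand-in for
# the predicted focused boundaries) and sample the remaining trials there.
focused = [(max(lo, b - 0.25 * (hi - lo)), min(hi, b + 0.25 * (hi - lo)))
           for b, (lo, hi) in zip(best, bounds)]
stage2 = [scale(p, focused) for p in latin_hypercube(4, 2, rng)]

best = min(stage1 + stage2, key=lambda hp: loss(*hp))
print("best HPs after 9 trials:", best)
```

With a 5 + 4 split, the total budget matches the 9 trials mentioned above; the second stage concentrates those trials where the first stage found low loss.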




In the sidebar, choose a dataset from the presets provided. Click View Data to preview the dataset.

Libraries & framework
