Gradient boosting algorithms such as XGBoost, CatBoost, and LightGBM are popular solutions for regression and classification tasks, and they have become widespread largely due to their strong results in competitions on the kaggle.com platform. One of the recurring tasks in machine learning is hyperparameter optimization, for which the most popular techniques are grid search and random search. Both methods require evaluating a large number of candidate combinations of hyperparameter values, each of which entails training and evaluating a model. In this article we consider the possibility of simplifying hyperparameter optimization by dividing the set of hyperparameters into groups and applying a stopping criterion.
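The following is a minimal sketch of how such group-wise tuning with a stopping criterion might look, written against scikit-learn and XGBoost. The group composition, the search ranges, and the improvement threshold `EPS` are illustrative assumptions, not the exact procedure described in this article:

```python
# Sketch: tune hyperparameters one group at a time, stopping when a group
# no longer improves the cross-validated score. Groups and ranges are
# assumptions chosen for illustration.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Hyperparameters split into groups, tuned sequentially.
groups = [
    {"max_depth": [3, 5, 7], "min_child_weight": [1, 3, 5]},
    {"subsample": [0.6, 0.8, 1.0], "colsample_bytree": [0.6, 0.8, 1.0]},
    {"learning_rate": [0.01, 0.05, 0.1], "n_estimators": [100, 300]},
]

best_params = {}
best_score = -float("inf")
EPS = 1e-3  # stopping criterion: minimum required improvement per group

for grid in groups:
    search = GridSearchCV(
        XGBClassifier(**best_params, eval_metric="logloss"),
        param_grid=grid,
        scoring="roc_auc",
        cv=3,
    )
    search.fit(X, y)
    # Stop if the best value found for this group barely improves the score.
    if best_score > -float("inf") and search.best_score_ - best_score < EPS:
        break
    best_score = search.best_score_
    best_params.update(search.best_params_)

print(best_score, best_params)
```

Compared with a single grid search over all hyperparameters at once, this sequential scheme evaluates the sum rather than the product of the group sizes, which is the source of the simplification the article explores.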