We demonstrate that N-body simulations of isolated disc galaxies suffer from numerical vertical heating, which slowly increases the vertical velocity dispersion and the disc thickness. Even for models with over a million particles in the disc, this heating can be significant. This effect is the same as that found in the numerical experiments of Sellwood. We also show that in a stellar disc, outside the boxy/peanut bulge if one is present, the saturation level of the bending instability is rather close to the value predicted by linear theory. We note that the bending instability develops and decays very quickly, so it cannot play any role in secular vertical heating. However, the bending instability sets a minimal value of the ratio between the vertical and radial velocity dispersions, sigma_z/sigma_R = 0.3 (and thus, indirectly, a minimal thickness), below which stellar discs in real galaxies cannot remain. We demonstrate that observations confirm this last conclusion.
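The dispersion ratio quoted above is a simple diagnostic to evaluate from simulation output. The following sketch shows how one might measure sigma_z/sigma_R from a particle snapshot and compare it with the bending-instability threshold; the particle velocities here are mock data drawn from Gaussians with illustrative dispersions, not values from the simulations described in the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mock disc snapshot: radial and vertical particle velocities (km/s).
# The dispersions 40 and 14 km/s are purely illustrative assumptions.
n = 100_000
v_R = rng.normal(0.0, 40.0, n)   # radial velocity component
v_z = rng.normal(0.0, 14.0, n)   # vertical velocity component

# Velocity dispersions are the standard deviations of the components.
sigma_R = v_R.std()
sigma_z = v_z.std()
ratio = sigma_z / sigma_R
print(f"sigma_z/sigma_R = {ratio:.2f}")

# Criterion discussed in the text: discs with sigma_z/sigma_R below
# about 0.3 are expected to be unstable to bending.
bending_unstable = ratio < 0.3
print("bending-unstable:", bending_unstable)
```

In practice sigma_R and sigma_z would be computed in radial bins, since both dispersions vary strongly with radius, but the ratio test itself is the same.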