The number of epochs is not that significant. More important are the validation and training errors. As long as they keep dropping, training should continue. For instance, if the validation error starts increasing, that might be an indication of overfitting.
Just to be clear, an epoch is one learning cycle in which the learner sees the whole training data set. If you have two batches, the learner needs two iterations to complete one epoch.
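A minimal sketch of that relationship (the data, batch size, and the placeholder update step are all illustrative, not from any particular framework):

```python
import numpy as np

# Toy data: 100 samples with batch size 50 -> 2 iterations per epoch.
X = np.random.randn(100, 10)
y = np.random.randn(100)
batch_size = 50
n_batches = len(X) // batch_size  # 2 batches

for epoch in range(5):                      # one epoch = one full pass over X
    for i in range(n_batches):              # one iteration = one batch
        xb = X[i * batch_size:(i + 1) * batch_size]
        yb = y[i * batch_size:(i + 1) * batch_size]
        # update_weights(xb, yb)            # placeholder for the actual gradient step
```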
If you have enough data, you can try the early stopping method: divide the data into three sets: training, validation, and evaluation. Train each network for enough epochs that the training mean squared error (MSE) settles into a minimum.
This approach can also be combined with regularization methods and k-fold cross-validation.
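A minimal early-stopping sketch under those assumptions, using a toy linear model trained by gradient descent and a simple patience counter (the split sizes, learning rate, and patience value are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=300)

# Three-way split: training / validation / evaluation.
X_tr, X_val, X_ev = X[:200], X[200:250], X[250:]
y_tr, y_val, y_ev = y[:200], y[200:250], y[250:]

w = np.zeros(5)                            # toy linear model
mse = lambda X_, y_: np.mean((X_ @ w - y_) ** 2)

best_val, best_w, patience, wait = np.inf, w.copy(), 10, 0
for epoch in range(1000):
    grad = 2 * X_tr.T @ (X_tr @ w - y_tr) / len(y_tr)
    w -= 0.01 * grad                       # one epoch of training
    val = mse(X_val, y_val)
    if val < best_val:                     # validation MSE still dropping: keep going
        best_val, best_w, wait = val, w.copy(), 0
    else:
        wait += 1
        if wait >= patience:               # validation MSE stopped improving: stop
            break

w = best_w                                 # roll back to the best weights
print(f"stopped at epoch {epoch}, evaluation MSE = {mse(X_ev, y_ev):.4f}")
```

The evaluation set is touched only once, at the very end, so it gives an unbiased estimate of the chosen model's error.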
You can follow the "elbow method". The basic idea: suppose the loss function is your priority. Make a line plot of the loss against the number of epochs; the point where the slope drops dramatically (to almost zero) is called the elbow point.
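A sketch of that plot, with a simple slope-threshold heuristic to mark the elbow (the loss curve here is synthetic; in practice you would use the per-epoch losses from your own training run, and the 0.01 threshold is illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic loss curve standing in for a real training history.
epochs = np.arange(1, 51)
loss = 2.0 * np.exp(-epochs / 8) + 0.05

# Elbow heuristic: first epoch where the magnitude of the slope falls below 0.01.
slope = np.diff(loss)
idx = int(np.argmax(-slope < 0.01))
elbow = int(epochs[idx + 1])

plt.plot(epochs, loss, label="training loss")
plt.axvline(elbow, color="red", linestyle="--", label=f"elbow ~ epoch {elbow}")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```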
You should set the number of epochs as high as possible and terminate training when the validation error starts increasing.
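If you use Keras, for example, the built-in EarlyStopping callback implements exactly this policy (the data and model below are placeholders to make the snippet self-contained):

```python
import numpy as np
from tensorflow import keras

# Placeholder regression data.
X = np.random.randn(500, 20).astype("float32")
y = np.random.randn(500, 1).astype("float32")

model = keras.Sequential([
    keras.layers.Input(shape=(20,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Set epochs high; training stops once val_loss has not improved for `patience` epochs.
early_stop = keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=10, restore_best_weights=True
)
model.fit(X, y, epochs=1000, validation_split=0.2, callbacks=[early_stop])
```

With restore_best_weights=True, the model is rolled back to the weights from the epoch with the lowest validation loss rather than the last epoch trained.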
Reference: https://www.researchgate.net/post/How_does_one_choose_optimal_number_of_epochs