## Chapter 3: Improving the way neural networks learn

The techniques we’ll develop in this chapter include

- a better choice of **cost function**, the cross-entropy cost function;
- four so-called **"regularization"** methods (L1 and L2 regularization, dropout, and artificial expansion of the training data), which make our networks better at generalizing beyond the training data;
- a better method for **initializing the weights** in the network;
- a set of heuristics to help choose **good hyper-parameters**.
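As a preview of the first item, here is a minimal sketch of the cross-entropy cost, computed with NumPy. The function name and the sample values are illustrative, not taken from the chapter; the `nan_to_num` call is a common guard so that a term like `0 * log(0)` is treated as `0` rather than producing a NaN:

```python
import numpy as np

def cross_entropy_cost(a, y):
    """Cross-entropy cost for output activations `a` and desired outputs `y`.

    Sums -[y*ln(a) + (1-y)*ln(1-a)] over the output neurons; nan_to_num
    keeps 0*log(0) from producing NaN.  (Illustrative sketch, not the
    book's own code.)
    """
    return -np.sum(np.nan_to_num(y * np.log(a) + (1 - y) * np.log(1 - a)))

# A confident, correct output gives a small cost,
# while a confident but wrong output gives a large one:
low = cross_entropy_cost(np.array([0.99]), np.array([1.0]))   # ~0.01
high = cross_entropy_cost(np.array([0.99]), np.array([0.0]))  # ~4.6
```

The key property, developed later in the chapter, is that this cost grows steeply when the network is confidently wrong, which helps avoid the learning slowdown suffered by the quadratic cost.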