adjust default batch size and number of passes to be a bit friendlier for CPU-only PyTorch runs
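for reference, a minimal sketch of what the CPU-friendly defaults could look like; the flag names and values here are assumptions, not the repo's actual ones:

    import argparse

    # hypothetical CLI defaults; the real flag names/values in the repo may differ
    parser = argparse.ArgumentParser()
    parser.add_argument("--batch-size", type=int, default=64)  # smaller batches keep per-step CPU time reasonable
    parser.add_argument("--passes", type=int, default=2)       # fewer passes over the data for quick CPU runs
    args = parser.parse_args()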
switch back to the AdamW optimizer; AdamW is the go-to for small toy models and transformers. refactored the NN classes into their own module, pairwise_comp_nn.py
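a minimal sketch of the AdamW wiring against the refactored module; the class name PairwiseCompNet and the hyperparameters below are assumptions, not the repo's actual ones:

    import torch
    from pairwise_comp_nn import PairwiseCompNet  # refactored module; class name is assumed

    model = PairwiseCompNet()
    # AdamW = Adam with decoupled weight decay, a solid default for small transformer-style models
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=0.01)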