Posts

Review of Large Margin Deep Networks for Classification

Current deep learning research is pushing forward in several directions. There are new architectures appearing:

- Automatic liver tumor segmentation in CT with fully convolutional neural networks and object-based postprocessing
- Look Closer to See Better: Recurrent Attention Convolutional Neural Network for Fine-grained Image Recognition
- Dropout: A Simple Way to Prevent Neural Networks from Overfitting

There are new optimization techniques for enabling faster distributed DNN training:

- Sparsified SGD with Memory
- 1-Bit Stochastic Gradient Descent and its Application to Data-Parallel Distributed Training of Speech DNNs

And last, but not least, current research is also looking at new loss functions, which is the topic of this blog post: Large Margin Deep Networks for Classification.

Under the motto "You get what you optimize", the authors of the paper argue that the benefits of using a large-margin loss function include better robustness to input perturbations, and in section 4 of the paper, on the MNIST datase…

Second

Mystery Lake

500 days of blogging

Sport clothes

I always thought you should do sports in an outfit that makes you feel comfortable, so you can concentrate on what you do best and forget you are wearing anything at all - or, better said, so clothes don't get in the way of achieving your best. I personally did not find this outfit in any way offensive. Great response though...

And just a reminder of Wimbledon in 1985:

Blue Pool, Oregon

Crater Lake, Oregon