
Favorite movies of 2020

Yes, there were far fewer movies than usual, and even among those that did come out, some fell a little short of my expectations (maybe I was too excited to see them?). So, here are the most intriguing movies / documentaries / TV shows of 2020 I noticed:

- The Boys in the Band
- Hillary (documentary)
- Brave New World

Happy 20th birthday, Wikipedia

  20 years with Wikipedia (January 15th, 2001)

When you are having too much fun with 3D glasses

  3D anaglyphic stereoscopic glasses (best $10 ever spent); Python code for solar system viewing

When you convert color computer monitor into black and white three dimensional monitor

I still remember when I was a high school kid, our math teacher (Mr. Volansky) brought a book with anaglyphic glasses into class, and you could see all the trigonometry we were taught in three dimensions. On that day, I thought it was the coolest thing I had ever seen. We already have three dimensional movies. For some reason 3D TVs are no longer selling well, so why not buy some cheap anaglyphic glasses, write some Python code, and get lost in wonderland? [Why we don't have passive-glasses computer monitors is beyond my understanding. 3D games would be so much of a blast, and we wouldn't need any heavy VR/AR headset. With passive-glasses monitors we would even have colors.] Enjoy the images (merry belated Christmas). If you already have the glasses, you might enjoy Grand Teton National Park, Wyoming as well.
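The basic trick behind red/cyan anaglyphic glasses can be sketched in a few lines: the red filter passes only the red channel (left-eye view), the cyan filter only green and blue (right-eye view), so compositing a stereo pair is just channel mixing. A minimal sketch, assuming the two views are available as flat lists of (R, G, B) tuples (the function name and the tiny synthetic "images" are my own, not the author's code):

```python
def make_anaglyph(left, right):
    """Combine a stereo pair into a red/cyan anaglyph.

    The red channel comes from the left-eye view (seen through the
    red filter); green and blue come from the right-eye view (seen
    through the cyan filter).
    """
    return [(l[0], r[1], r[2]) for l, r in zip(left, right)]

# Tiny synthetic 2-pixel "images", just to show the channel mixing.
left_view = [(200, 10, 10), (50, 60, 70)]
right_view = [(20, 180, 190), (40, 90, 100)]
print(make_anaglyph(left_view, right_view))
# → [(200, 180, 190), (50, 90, 100)]
```

The same per-pixel rule applies to real photos; only the image loading and display around it change.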

Comparing end results of cross entropy and standard deviation loss

  - Standard deviation loss: outputs of the Resnet end up closer together.
  - Cross entropy loss: outputs of the Resnet show bigger differences between correct samples. Also, the correct-class probability on the test set looks log-normal once again.

  Images created with PyTorch CIFAR-10 training.
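The "bigger differences" observation follows from how cross entropy scores a single sample: it is minimized by pushing the correct logit well above the others, so tightly clustered outputs are penalized. A minimal sketch with hypothetical logit vectors (the numbers are illustrative, not from the Resnet run):

```python
import math

def cross_entropy(logits, target):
    """Cross-entropy of one sample: -log softmax(logits)[target]."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    return -math.log(exps[target] / sum(exps))

clustered = [1.0, 0.8, 0.9]  # outputs close together
spread = [4.0, 0.8, 0.9]     # correct logit pushed well above the rest

print(cross_entropy(clustered, 0))  # ≈ 1.00
print(cross_entropy(spread, 0))     # ≈ 0.08
```

Driving the loss down therefore requires spreading the correct logit away from the rest, which is exactly the gap the plots show.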

Power Log Normal?

  Power log-normal distribution? It's funny, but when you remove ReLU from the Resnet, the distributions look more like a normal distribution. Once you add it back, they look more power log-normal.
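One intuition for why ReLU changes the shape: it clips the negative half of an otherwise symmetric distribution, leaving a skewed, one-sided one. A toy illustration with standard-normal samples (this is not the Resnet experiment, just the clipping effect in isolation):

```python
import random
import statistics

random.seed(0)

def relu(x):
    return max(0.0, x)

# Symmetric input: mean and median roughly agree.
normal_samples = [random.gauss(0.0, 1.0) for _ in range(10_000)]

# After ReLU, the negative half collapses to zero, so the
# distribution becomes one-sided and right-skewed.
relu_samples = [relu(x) for x in normal_samples]

print(statistics.mean(normal_samples), statistics.median(normal_samples))
print(statistics.mean(relu_samples), statistics.median(relu_samples))
```

In the clipped samples the median sits at (or near) zero while the mean stays positive, the classic signature of a skewed, log-normal-like shape.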

Having fun with losses (from standard deviation to bell's curves)

  Cross entropy still rocks