Setup Linux Env for Deep Learning

Install Ubuntu 18.04 LTS

The old server had been dead for a while. I finally bought a new PSU (not sure whether the PSU was actually the problem), reinstalled everything, organized the cables, and now I have a new desktop. Later I reinstalled the OS too; it had previously been running Windows Server 2012 for some reason (I had failed to install any Linux at that time due to the NVIDIA driver problem).

Surprisingly, this time it was a pretty smooth process to install Ubuntu 18.04 LTS (with an AMD GPU at the beginning) and then switch to an NVIDIA GPU (an old 4 GB GTX 970).

In case you have problems installing Ubuntu with a GTX GPU, try the solution of blacklisting the nouveau driver described here

Install NVIDIA Drivers

sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt update
sudo apt install nvidia-driver-440
sudo reboot
# alternatively, install via Settings -> Software & Updates -> Additional Drivers

Install CUDA

If you want to delete an old CUDA installation first, try sudo apt-get --purge remove cuda.

wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/cuda-ubuntu1804.pin
sudo mv cuda-ubuntu1804.pin /etc/apt/preferences.d/cuda-repository-pin-600
wget http://developer.download.nvidia.com/compute/cuda/10.2/Prod/local_installers/cuda-repo-ubuntu1804-10-2-local-10.2.89-440.33.01_1.0-1_amd64.deb
sudo dpkg -i cuda-repo-ubuntu1804-10-2-local-10.2.89-440.33.01_1.0-1_amd64.deb
sudo apt-key add /var/cuda-repo-10-2-local-10.2.89-440.33.01/7fa2af80.pub
sudo apt-get update
sudo apt-get -y install cuda

Set CUDA Environment Variables (not sure whether this is needed; append the lines below to ~/.bashrc to make them persistent)

export PATH=$PATH:/usr/local/cuda-10.2/bin
export CUDADIR=/usr/local/cuda-10.2
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/cuda-10.2/lib64

Test CUDA

mkdir cuda-testing
cd cuda-testing/
cp -a /usr/local/cuda-10.2/samples samples-10.2
cd samples-10.2
make -j 4 # (add -k to keep going past errors)
cd bin/x86_64/linux/release
./nbody

Error

CUDA 10.2 on Ubuntu 18.04 might give you an error message like:

make: Target 'all' not remade because of errors.
cudaNvSci.h:14:10: fatal error: nvscibuf.h: No such file or directory
 #include <nvscibuf.h>
          ^~~~~~~~~~~~
compilation terminated.
Makefile:394: recipe for target 'cudaNvSci.o' failed

This is a known issue with the newly added cudaNvSci sample, which requires headers that are not part of the standard CUDA install, according to https://github.com/NVIDIA/cuda-samples/issues/22#issuecomment-562105202; the other samples still build.

Install cuDNN

Download the cuDNN packages for the corresponding CUDA version from here

sudo dpkg -i libcudnn7_7.6.5.32-1+cuda10.2_amd64.deb
sudo dpkg -i libcudnn7-dev_7.6.5.32-1+cuda10.2_amd64.deb
sudo dpkg -i libcudnn7-doc_7.6.5.32-1+cuda10.2_amd64.deb

Test cuDNN Installation

cp -r /usr/src/cudnn_samples_v7/ ~/cuda-testing/cudnn_samples_v7/
cd ~/cuda-testing/cudnn_samples_v7/mnistCUDNN
make clean && make
./mnistCUDNN

Install PyTorch through Anaconda

sh ./Anaconda3-2020.02-Linux-x86_64.sh
conda install pytorch torchvision cudatoolkit=10.1 -c pytorch # conda bundles its own cudatoolkit, so it need not match the system CUDA 10.2

#test
python -c 'import torch;print(torch.cuda.is_available())'
True
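
If the check above prints True, a slightly fuller sanity check is to run a small tensor op on the GPU. This is a minimal sketch, assuming the GPU is device 0:

import torch

print(torch.cuda.get_device_name(0))        # device name, e.g. the GTX 970
x = torch.randn(1024, 1024, device='cuda')  # allocate directly on the GPU
y = x @ x                                   # matrix multiply runs on the GPU
print(y.device, float(y.sum()))             # result should live on cuda:0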

Install TensorFlow and Keras

Following the instructions from here:

pip install tensorflow-gpu==1.14
pip install keras

# or, equivalently, via conda
conda install tensorflow-gpu=1.14 keras

#test 
from keras import backend as K 
K.tensorflow_backend._get_available_gpus()    
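
For TensorFlow itself, a quick check is the built-in GPU test. A minimal sketch, assuming the TF 1.14 API installed above:

import tensorflow as tf
from tensorflow.python.client import device_lib

print(tf.test.is_gpu_available())  # True if TF 1.x can see a CUDA GPU
# list all devices TF can see, e.g. ['/device:CPU:0', '/device:GPU:0']
print([d.name for d in device_lib.list_local_devices()])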

Install MXNet

pip install mxnet-cu102 d2lzh
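
A quick GPU sanity check for MXNet; a minimal sketch, assuming a single GPU at index 0:

import mxnet as mx

# allocating an NDArray on the GPU fails loudly if CUDA or the driver is broken
x = mx.nd.ones((2, 3), ctx=mx.gpu(0))
print((x * 2).asnumpy())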

Done

[Paper Reading] A Fast Learning Algorithm for Deep Belief Nets

DBN

  • It is an unsupervised, probabilistic, generative graphical model that learns P(X), while LeNet/AlexNet and the like are discriminative models that focus on P(Y|X).
  • The top two layers of the DBN form an undirected bipartite graph called a Restricted Boltzmann Machine (RBM).
  • The lower layers form a directed sigmoid belief network.
  • A DBN can be formed by “stacking” RBMs; later work often uses autoencoders instead.
  • Greedy, layer-by-layer learning.
  • Optionally fine-tuned with gradient descent and backpropagation.

RBM

  • RBMs are a variant of Boltzmann machines, with the restriction that their neurons must form a bipartite graph.
  • Architecture: an RBM has an input layer (also referred to as the visible layer) and a single hidden layer, with connections only between the two layers (none within a layer). So an RBM looks like the fully connected links of an MLP between two layers; see the sketch below.
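
To make the bipartite structure and the layer-by-layer learning concrete, here is a minimal numpy sketch of one contrastive divergence (CD-1) update for a binary RBM; the layer sizes, learning rate, and toy sample are all made up for illustration, and stacking DBN-style just means training one such RBM per layer on the previous layer's hidden activations:

import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden, lr = 6, 3, 0.1

W = 0.01 * rng.standard_normal((n_visible, n_hidden))  # visible-hidden weights
b = np.zeros(n_visible)                                # visible bias
c = np.zeros(n_hidden)                                 # hidden bias

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0):
    # positive phase: sample hidden units given the data
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # negative phase: one Gibbs step back to a reconstruction
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # gradient estimate: data statistics minus reconstruction statistics
    return np.outer(v0, ph0) - np.outer(v1, ph1), v0 - v1, ph0 - ph1

v = rng.integers(0, 2, n_visible).astype(float)  # a fake binary training sample
dW, db, dc = cd1_update(v)
W += lr * dW
b += lr * db
c += lr * dc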

Book Reading List for 2020

Collections for Image Classification (and video and others)

Simply a collection of classic computer vision papers

Image Classification

  • Gradient-based learning applied to document recognition, LeNet-5, Proceedings of the IEEE 1998, pdf
  • ImageNet Classification with Deep Convolutional Neural Networks, AlexNet, NIPS 2012, pdf, slides
  • Visualizing and Understanding Convolutional Networks, ZFNet, ECCV 2014, pdf
  • Network In Network, NiN, ICLR 2014, pdf
  • Very Deep Convolutional Networks for Large-Scale Image Recognition, VGG, ICLR 2015, pdf
  • Going Deeper with Convolutions, Inception, CVPR 2015, pdf
  • Deep Residual Learning for Image Recognition, ResNet, CVPR 2016, pdf
  • Wide Residual Networks, BMVC 2016, pdf
  • Rethinking the Inception Architecture for Computer Vision, Inception v3, CVPR 2016, pdf
  • Aggregated Residual Transformations for Deep Neural Networks, ResNeXt, CVPR 2017, pdf
  • Densely Connected Convolutional Networks, DenseNet, CVPR 2017, pdf
  • Squeeze-and-Excitation Networks, SENet, CVPR 2018, pdf
  • Residual Attention Network for Image Classification, CVPR 2017, pdf

Video Classification

  • A Closer Look at Spatiotemporal Convolutions for Action Recognition, R(2+1)D, CVPR 2018, pdf
  • Video Classification with Channel-Separated Convolutional Networks, ICCV 2019, pdf
  • Large-scale weakly-supervised pre-training for video action recognition, CVPR 2019, pdf

Papers About Attention

Started to seriously read some NLP/CV/multimodal publications, trying to focus on trendy terms, such as attention and fusion, that are frequently mentioned everywhere.

‘Attention Model incorporates this notion of relevance by allowing the model to dynamically pay attention to only certain parts of the input that help in performing the task at hand effectively’

  • Attention Is All You Need, from Google, NeurIPS 2017

The Transformer architecture is proposed in this paper.
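
To make “dynamically pay attention to only certain parts of the input” concrete, here is a minimal numpy sketch of the scaled dot-product attention at the heart of the Transformer, softmax(QK^T / sqrt(d_k))V; the shapes are made up for illustration:

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key relevance
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)              # softmax over the keys
    return w @ V                                    # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))    # 4 queries of dimension 8
K = rng.standard_normal((6, 8))    # 6 keys of dimension 8
V = rng.standard_normal((6, 16))   # 6 values of dimension 16
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 16)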

Book Reading Planning

Starting to plan to read 26 books per year (revised down from 52). Considering that I have already finished 14 books in WeChat Reading since June, it should be a P75 target. One third of the books will be novels and the like for recreational purposes; one third non-tech books for personal improvement, such as GTD/time management/finance; and the last third tech books: new languages (though it seems I am no longer interested in trying new languages), ML/DL (the major focus), software engineering, and so on.

For Tech:

  1. Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville
  2. Model-Based Machine Learning by Bishop (not sure whether I want to read PRML again, so I'll just start the new book)
  3. The Master Algorithm by Pedro Domingos (bought a Chinese version in WeChat Reading…)
  4. Some Caffe/PyTorch/TF coding book

For Personal Growth

  1. Getting Things Done I (bought in WeChat Reading and also on Douban), never finished; currently at 50%
  2. Deep Work
  3. Outliers: The Story of Success
  4. A Message to Garcia: And Other Essential Writings on Success (I cannot believe I finished reading this one)
  5. One Up On Wall Street by Peter Lynch and John Rothchild
  6. So Good They Can't Ignore You
  7. Pomodoro Technique Illustrated by Staffan Noteberg

For Amusement

  1. Origin by Dan Brown


Served by AWS Lightsail

I did not realize that a VPS (Virtual Private Server) had become so easy and so inexpensive. Last week, while reading the book “Deep Work”, I suddenly realized that I am out of school and free again to write my ideas publicly. However, static pages hosted on github.io using Markdown are somewhat inefficient, and it is hard (easy to forget) to include meta information. An inexpensive VPS (with a static IP) can be a better choice. Finding a good VPS had been a task on my list… until I remembered it, googled, and found Amazon Lightsail.

A real lifesaver: log in with an Amazon account, a few clicks, and here it is. I have a new WordPress blog with a static IP. Finding the username and password for WordPress takes some time: the username is user, and the password is on the VPS itself; log in to your VPS via SSH, and the password is stored in the file called bitnami_application_password.

Here we go: changed to a simple theme, wrote some intro (hesitated to put my resume here), and wrote the first post. Done.

Happy Fourth of July!