Welcome to the Data Science Seminar 2018-2019. This seminar will cover deep learning and machine learning, from both theoretical and practical points of view.
If you have any problems, I will hold office hours in Room 1570 on Thursday, 11/30, in the evening (until 8:40).
Each seminar session will have two parts:
- Paper sharing: everyone should give a presentation on a paper they read during the past week (presenting more than one is welcome).
- Area introduction: a speaker will give us a review of a related domain.
Every week, you should complete a small assignment: publishing an article on this website. How to use this website will be introduced later.
You are welcome to add any article to this website. (But pay attention: I have administrator rights :))
1. Deep Learning Theory
This includes applying traditional machine learning theory to deep learning, for example using PAC learning bounds to give provable guarantees on the generalization of deep neural networks. We also focus on empirical explanations of the success of deep learning, in order to gain new insight into how deep learning works and to suggest new directions for future research in learning theory.
2. Deep Metric Learning, Deep Transfer Learning, and Applications
Features extracted by neural networks can apparently be used effectively in other areas, such as metric learning and transfer learning. We hope that pre-trained networks can benefit traditional tasks, and that traditional wisdom can also work in deep learning.
3. Deep Generative Models: everyone wants to do something in this area.
4. Deep Learning on Complicated Domains such as Graphs and Manifolds
Convolution, subsampling, and many other operators are not well-defined on complicated domains such as graphs and manifolds. Irregular sampling is also a challenge when designing continuous-time RNNs. At the same time, 3D convolution brings high computational cost. Doing deep learning on high-dimensional and irregular domains is a challenge.
5. Computer Vision Tasks
Some traditional tasks are still very hard, such as blind deblurring, optical flow, and blind inpainting.
Prove that nothing works in deep learning except stochastic gradient descent. (Take complexity and generalization into consideration.)
Guide to the Ghost System
Ghost is a fully open-source, hackable platform for building and running a modern online publication. You can find more information at https://ghost.org/.
Every week, you are asked to publish a new post about the paper-sharing section of that week's seminar. You should write an abstract of the paper you presented and comment on others' presentations. The following is an example.
Paper '....' gives a new framework for xxxx, in order to xxx. The paper makes the assumption that the data lie on a manifold, which I think is too strong. On real-world datasets like CIFAR, this algorithm will not work. However, the numerical method used in this paper is very interesting.... This paper is absolute nonsense; I don't know why xxx chose this one as his topic.
The Ghost system uses Markdown, a lightweight markup language with plain-text formatting syntax. You can take https://www.zhihu.com/question/20409634 as a reference. It is easy to use and very efficient. (I also recommend Typora, which is an amazing Markdown editor.)
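For instance, a minimal seminar post written in Markdown might look like this (the title, paper name, and comments here are placeholders, not a required template):

```markdown
# Paper Sharing: Week N

**Paper:** *An Example Paper Title*

- Main idea: one or two sentences summarizing the contribution.
- My comment: one strength and one weakness of the approach.

> A short quote from the paper, if it helps the discussion.

Comments on others' presentations go in a normal paragraph like this one.
```

Headings use `#`, bold uses `**...**`, bullet lists use `-`, and block quotes use `>`; Ghost renders these automatically when you publish.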
Publishing a new post is easy: click New Post, write the article, and then click the Publish Now button in the top-right corner.
I have added a tag, seminar; if you add an article about the seminar, remember to add this tag!
We also recommend a seminar held by Zhihan Li on toolboxes in applied mathematics; see GitHub.