
Data Science | The Pruning Technique of Neural Networks Related to Compressed Sensing

Published: 2019-10-23

Speaker: Dr. Xia Yu (Hangzhou Normal University)

Title: The Pruning Technique of Neural Networks Related to Compressed Sensing

Time & Venue: 3:30–5:30 pm, October 28, 2019, Room 417, Teaching Building 11, Yuquan Campus
Abstract: We discuss Net-Trim, a technique that simplifies a trained neural network. The method is a convex post-processing module that prunes a trained network layer by layer while preserving the internal responses. The initial and retrained models, before and after applying Net-Trim, are consistent, and the number of training samples needed to discover a network can be expressed in terms of its number of nonzero weights. Specifically, if there is a set of weights with at most s nonzero terms that can re-create the layer outputs from the layer inputs, these weights can be found from O(s log(N/s)) samples, where N is the input size. The related theoretical results are similar to those for sparse regression using the Lasso, and the tools involved come from concentration of measure and convex analysis.
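The layer-wise idea behind Net-Trim can be illustrated with a minimal sketch: for one layer, find a sparse weight matrix W that still reproduces that layer's outputs Y from its inputs X, by solving an L1-regularized least-squares problem with proximal gradient descent (ISTA). This is a simplified convex relaxation for illustration only; the actual Net-Trim formulation additionally accounts for the ReLU nonlinearity, and the function and parameter names below are hypothetical.

```python
import numpy as np

def prune_layer(X, Y, lam=0.5, iters=500):
    """Sparsify one layer so that X @ W still reproduces Y.

    Minimizes 0.5 * ||X W - Y||_F^2 + lam * ||W||_1 via ISTA
    (proximal gradient descent). A linear stand-in for the
    Net-Trim per-layer convex program, ignoring the ReLU.
    """
    d, m = X.shape[1], Y.shape[1]
    W = np.zeros((d, m))
    # step size from the Lipschitz constant of the smooth part
    t = 1.0 / np.linalg.norm(X, 2) ** 2
    for _ in range(iters):
        grad = X.T @ (X @ W - Y)          # gradient of the quadratic term
        Z = W - t * grad
        # soft-thresholding: proximal operator of the L1 norm,
        # which drives small weights exactly to zero
        W = np.sign(Z) * np.maximum(np.abs(Z) - t * lam, 0.0)
    return W

# Toy demo: the layer map is genuinely s-sparse, so a sparse W
# re-creating the outputs exists and can be recovered from samples.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))        # 200 samples, input size N = 50
W_true = np.zeros((50, 10))
W_true[:5] = rng.standard_normal((5, 10)) # only 5 active input rows
Y = X @ W_true
W_pruned = prune_layer(X, Y)
sparsity = np.mean(np.abs(W_pruned) < 1e-3)
rel_err = np.linalg.norm(X @ W_pruned - Y) / np.linalg.norm(Y)
```

Here `sparsity` is high (most weights are pruned to zero) while `rel_err` stays small, mirroring the talk's claim that the pruned layer preserves the internal responses.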


Contact: Prof. Song Li (songli@zju.edu.cn)

