
Institute for Mathematical Interdisciplinary Sciences Academic Seminar (Prof. Haizhang Zhang, Sun Yat-sen University)

Source: System Administrator    Published: 2024-10-11

Title: Power of Function Composition in Deep Neural Networks

Speaker: Prof. Haizhang Zhang, Sun Yat-sen University

Time: October 15, 2024 (Tuesday), 14:30-15:30

Venue: Tencent Meeting, ID 654-266-148

Abstract: It is well known that major breakthroughs in applied and computational mathematics have largely been stimulated by new function representation systems, including power series, splines, Fourier series, wavelets, and intrinsic mode functions. The most recent example is deep neural networks in deep learning. Unlike those linear representation systems, deep neural networks are nonlinear: their essence is to generate representation functions by composing linear functions with an activation function. Why function composition is so efficient at representing complex functions remains a mystery. We try to understand the power of function composition in deep neural networks from two perspectives. The first is to explain why function composition tends to generate complex functions. The second is to understand the efficiency of function approximation by compositions.
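A small numerical sketch (not taken from the talk itself) of the point that composition generates complexity: the classical hat (tent) function can be written as a one-hidden-layer ReLU network, and composing it with itself k times produces a piecewise-linear function with exponentially many oscillations, far more than any single shallow layer of the same width produces.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def hat(x):
    # Hat (tent) map on [0, 1], expressed as a one-hidden-layer ReLU network:
    # hat(x) = 2*relu(x) - 4*relu(x - 0.5)
    # It rises linearly from 0 to 1 on [0, 0.5] and falls back to 0 on [0.5, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def compose(f, k):
    # Return the k-fold composition f ∘ f ∘ ... ∘ f (a depth-k network).
    def g(x):
        for _ in range(k):
            x = f(x)
        return x
    return g

# Count strict local maxima of the k-fold composition on a fine dyadic grid.
x = np.linspace(0.0, 1.0, 2**10 + 1)
for k in range(1, 6):
    y = compose(hat, k)(x)
    peaks = int(np.sum((y[1:-1] > y[:-2]) & (y[1:-1] > y[2:])))
    print(k, peaks)  # the number of peaks grows as 2**(k-1)
```

Depth k costs only k ReLU layers of width 2, yet yields 2**(k-1) oscillations; reproducing the same function with a single hidden layer would require width growing exponentially in k, which is one standard way to make the "power of composition" quantitative.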

Speaker bio: Haizhang Zhang is a professor at the School of Mathematics (Zhuhai), Sun Yat-sen University. His research interests include learning theory, applied harmonic analysis, and function approximation. His representative results include a Weierstrass approximation theorem for reproducing kernels and the theory of reproducing kernel Banach spaces, which he pioneered internationally. A psychological classification method based on reproducing kernel Banach spaces was included in the New Handbook of Mathematical Psychology published by Cambridge University Press. He has published original work in Journal of Machine Learning Research, Applied and Computational Harmonic Analysis, Neural Networks, Neural Computation, Neurocomputing, Journal of Approximation Theory, and the IEEE Transactions series, with his most-cited single paper receiving over 360 citations by others. He has led multiple national and provincial/ministerial research grants.

Inviter: Daohong Xiang