
The Principle of Error Modeling

Editor: An Guo    Date: 2018-05-15

Speaker: Deyu Meng

Affiliation: School of Mathematics and Statistics, Xi'an Jiaotong University

Time: 15:30, May 15, 2018

Venue: Room A303, Basic Teaching Building

Homepage or email: http://gr.xjtu.edu.cn/web/dymeng


Abstract: Traditional machine learning focuses mainly on modeling deterministic information. In complex scenarios, however, machine learning methods are prone to robustness problems caused by data noise, and this robustness is closely tied to the choice of the error function. This talk focuses on the principle of robust machine learning through error modeling for data containing complex noise. The principle has already shown distinctive advantages in applications such as online video processing and medical image restoration, and it is expected to lead to further interesting machine-learning applications and discoveries.

Traditional machine learning methods are sensitive to the noise that is often present in data. This robustness issue is closely related to the choice of the error function. In this talk, I will introduce the principle of robust machine learning and how to mathematically model noisy data. As shown in our studies, this error modeling has demonstrated promising performance in video processing, medical image restoration, and other applications.
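As a minimal illustration of the abstract's central point, and not the speaker's specific method, the sketch below fits a line to outlier-contaminated data under two different error functions. The squared error implicitly assumes Gaussian noise and is pulled toward the outliers, while the absolute error corresponds to a heavier-tailed noise assumption and stays closer to the clean trend; the data, losses, and optimizer here are illustrative assumptions.

```python
# Sketch: how the choice of error function affects robustness to noisy data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.3, size=x.size)  # clean linear trend plus small noise
y[::10] += 15.0                                       # inject a few large outliers

def fit(loss):
    """Fit y ~ a*x + b by minimizing the given elementwise loss on the residuals."""
    obj = lambda p: np.sum(loss(y - (p[0] * x + p[1])))
    return minimize(obj, x0=[0.0, 0.0], method="Nelder-Mead").x

a_l2, b_l2 = fit(lambda r: r ** 2)     # squared error: Gaussian noise assumption
a_l1, b_l1 = fit(lambda r: np.abs(r))  # absolute error: heavier-tailed noise assumption

print(f"L2 fit: a={a_l2:.2f}, b={b_l2:.2f}")  # biased by the outliers
print(f"L1 fit: a={a_l1:.2f}, b={b_l1:.2f}")  # close to the true a=2, b=1
```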


Speaker bio: Professor and doctoral supervisor at the School of Mathematics and Statistics, Xi'an Jiaotong University. He has conducted academic visits and collaborations at The Hong Kong Polytechnic University, the University of Essex, and Carnegie Mellon University. He has had more than 80 papers accepted or published, including 22 IEEE Transactions papers and 30 papers at CCF class-A conferences. He has served on the program committees of conferences such as ICML and NIPS, and as a senior program committee member for AAAI 2016. His current research focuses on machine-learning topics including self-paced learning, error modeling, and tensor sparsity.


Prof. Deyu Meng received the B.Sc., M.Sc., and Ph.D. degrees from Xi'an Jiaotong University, Xi'an, China, in 2001, 2004, and 2008, respectively. He is currently a Full Professor with the Institute for Information and System Sciences, Xi'an Jiaotong University. From 2012 to 2014, he took a two-year sabbatical leave at Carnegie Mellon University. His current research interests include self-paced learning, noise modeling, and tensor sparsity.

