Exploration of Big Visual Data From A Human Tracking Perspective

Published: 2017-07-10 15:01

Academic Lecture Announcement

 

At the invitation of Professor Wen Xiangming of the 111 Project Innovation Base for Information Network Architecture and Convergence, Professor Jenq-Neng Hwang of the University of Washington, Seattle, USA, will visit BUPT on July 10, 2017 to give an academic lecture. Interested faculty and students are warmly welcome to attend.

 

Title: Exploration of Big Visual Data From A Human Tracking Perspective

Speaker: Prof. Jenq-Neng Hwang (IEEE Fellow)

Chair: Prof. Liu Yong

Time: 14:30, July 10, 2017 (Monday)

Venue: Room 108, Teaching Building 3

Abstract: With the huge number of networked video cameras installed everywhere nowadays, such as statically deployed surveillance cameras and constantly moving cameras on vehicles or drones, there is an urgent need for systematic exploration of the dynamic environment based on the collected big visual data. In this talk, I will first present automated and robust human tracking within a single camera, based on self-calibration of static/moving cameras. These cameras also continuously learn the temporal and color/texture appearance characteristics among one another in a fully unsupervised manner, so that human tracking across multiple cameras can be effectively integrated and reconstructed via a 3D open map service.
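To make the idea of appearance-based cross-camera matching concrete, here is a minimal, hypothetical sketch: re-identifying a person seen by one camera in another camera's gallery by comparing normalized color histograms. This is illustrative only, not Prof. Hwang's actual system; all function names and the histogram-intersection choice are assumptions made for this example.

```python
# Hypothetical sketch of one building block of cross-camera human tracking:
# compare the color appearance of a person detection against a gallery of
# known identities from another camera. Not the speaker's actual method.

def color_histogram(pixels, bins=8):
    """Histogram of (r, g, b) pixels in 0..255, normalized to sum to 1."""
    hist = [0.0] * (bins * 3)
    for r, g, b in pixels:
        for ch, val in enumerate((r, g, b)):
            hist[ch * bins + min(val * bins // 256, bins - 1)] += 1.0
    total = 3 * len(pixels) or 1  # avoid division by zero on empty input
    return [h / total for h in hist]

def histogram_intersection(h1, h2):
    """Appearance similarity in [0, 1]; 1.0 means identical color distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def best_match(query_hist, gallery):
    """Return the gallery identity whose stored appearance best matches the query."""
    return max(gallery, key=lambda pid: histogram_intersection(query_hist, gallery[pid]))
```

In practice, a tracker on camera A would build `query_hist` from a person's bounding-box pixels and match it against histograms accumulated on camera B; real systems additionally learn inter-camera color-transfer and travel-time statistics, as the abstract's "fully unsupervised" learning across cameras suggests.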

 

This is a frontier lecture; all faculty and students across the university are welcome to attend.

 

 

University Academic Committee; School of Information and Communication Engineering; 111 Project Innovation Base for Information Network Architecture and Convergence

                               July 5, 2017

 


 

Appendix: Speaker Biography

Jenq-Neng Hwang is a professor and doctoral advisor at the University of Washington (UW), Seattle, an IEEE Fellow, and an academic master of our university's 111 Project Innovation Base for Information Network Architecture and Convergence. He received his B.S. and M.S. degrees from National Taiwan University (NTU) in 1981 and 1983, respectively, and his Ph.D. from the University of Southern California in 1988. He joined the Department of Electrical Engineering (EE) at UW in 1989, where he currently serves as Associate Chair for Research and Global Affairs. He has served on the editorial boards of IEEE Transactions on Signal Processing (T-SP), Transactions on Neural Networks (T-NN), Transactions on Circuits and Systems for Video Technology (T-CSVT), Transactions on Image Processing (T-IP), and Signal Processing Magazine (SPM).

Prof. Hwang's research interests include multimedia networking, machine learning, and pattern recognition. He authored the book Multimedia Networking: From Theory to Practice, has published more than 300 papers in leading journals and international conferences in the communications field, and has filed 7 U.S. patents. His group has produced numerous results in video tracking and big visual data processing; some of this work was put to practical use after the Boston Marathon bombing, where it played a significant role. New Scientist covered this work, reporting: "Jenq-Neng Hwang and colleagues at the University of Washington in Seattle have also built a system which can track humans across different cameras automatically, even when the camera views do not overlap."