Multimedia

Abstract: Submodular functions capture a wide spectrum of discrete problems in machine learning, signal processing and computer vision. They are characterized by intuitive notions of diminishing returns and economies of scale, and often lead to practical algorithms with theoretical guarantees.
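The diminishing-returns property mentioned above can be made concrete with a small sketch (not part of the talk; the coverage function and example sets below are illustrative assumptions): for selections A ⊆ B, adding a new set to the smaller selection A increases coverage at least as much as adding it to the larger selection B.

```python
def coverage(chosen):
    """Coverage function: number of distinct elements covered by the
    chosen sets. Coverage is a classic example of a submodular function."""
    covered = set()
    for s in chosen:
        covered |= s
    return len(covered)

# Hypothetical example sets (not from the talk).
S1, S2, S3 = {1, 2}, {2, 3}, {3, 4}

A = [S1]        # smaller selection
B = [S1, S2]    # larger selection, with A contained in B
e = S3          # candidate set to add

gain_A = coverage(A + [e]) - coverage(A)  # marginal gain relative to A
gain_B = coverage(B + [e]) - coverage(B)  # marginal gain relative to B

# Diminishing returns: the marginal gain shrinks as the base set grows.
assert gain_A >= gain_B
```

Here gain_A is 2 (elements 3 and 4 are new to A) while gain_B is 1 (only 4 is new to B), so the diminishing-returns inequality holds.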

In the first part of this talk, I will give a general introduction to the concept of submodular functions, their optimization and example applications in machine learning.

In the second part, I will demonstrate how the close connection of submodularity to convexity leads to fast algorithms for minimizing a subclass of submodular functions: those that decompose as a sum of submodular functions. Using a specific relaxation, the algorithms solve the discrete submodular optimization problem as a "best approximation" problem. They are easy to use and parallelize, and solve both the convex relaxation and the original discrete problem. Their convergence analysis combines elements of geometry and spectral graph theory.

This is joint work with Robert Nishihara, Francis Bach, Suvrit Sra and Michael I. Jordan.

Speaker Bio: Stefanie Jegelka is the X-Consortium Career Development Assistant Professor in the Department of EECS at MIT, and a member of CSAIL and the Institute for Data, Systems and Society. Before joining MIT, she was a postdoctoral scholar in the AMPLab at UC Berkeley. She earned her PhD from ETH Zurich in collaboration with the Max Planck Institutes in Tuebingen, Germany, and a Diplom from the University of Tuebingen. Her research interests lie in algorithmic and combinatorial machine learning, with applications in computer vision, materials science, and biology. She has been a fellow of the German National Academic Foundation, and has received several other fellowships, as well as a Best Paper Award at ICML and the 2015 Award of the German Society for Pattern Recognition.

The Television Academy announced today that Alan Bovik, professor in the Cockrell School of Engineering at The University of Texas at Austin and member of the Wireless Networking and Communications Group (WNCG), and his team of former students and collaborators will be honored with a 2015 Primetime Engineering Emmy Award for Outstanding Achievement in Engineering Development. The team will be recognized for their development of an advanced algorithm that enhances the video viewing experiences for tens of millions of people throughout the world. 

The awards were presented at the 67th Engineering Emmy Awards on October 28 in Hollywood, hosted by Josh Brener of the HBO series "Silicon Valley."

Join WNCG alumnus Ian Wong, Senior Manager of the Advanced Wireless Research Group at Qualcomm, as he presents an overview of candidate waveforms for 5G at this year's Texas Wireless Summit: The View to 5G.
