Programme
Outline of the programme
All events will take place in Room Oahu, Sheraton Waikiki Hotel, unless otherwise stated.
Saturday, July 5
- 1:30pm-2pm: Registration and opening of the workshop
- 2pm-3pm: Keynote talk: Rissanen
- 3pm: Coffee break
- 3:30pm-4:45pm: Invited talks
- 5:30pm-7pm: Welcoming reception
Note change in time!
Location: Cabana on the Beach, Royal Hawaiian (same property as Sheraton)
Sunday, July 6
- 9am-10am: Plenary talk: Anantharam
- 10am: Coffee break
- 10:30am-12:10pm: Invited talks
- 12:10pm: Lunch break (on your own)
- 2pm-3pm: Plenary talk: Szpankowski
- 3pm: Coffee break
- 3:30pm-4:45pm: Invited talks (CSoI special session)
- 6pm-8pm: Banquet dinner
Note change in time!
Location: Terrace at Sheraton Waikiki (overlooking the ocean)
Monday, July 7
- 9am-10:15am: Invited talks
- 10:15am: Coffee break
- 10:45am-12:00pm: Invited talks
- 12:00pm: Lunch break (on your own)
- 2:00pm-3:15pm: Invited talks
- 3:15pm-3:45pm: Closing of the workshop
Technical Programme
Saturday, July 5
- 2pm-3pm Keynote talk: Jorma Rissanen "Entropy and Estimation of Random ML Models"
- 3:30pm Andrew Barron "Overview of Recent Developments in Penalized Likelihood and MDL"
- 3:55pm Sabyasachi Chatterjee "Statistical Implications of the Connections between Penalized Likelihood and MDL"
- 4:20pm Peter Harremoës "Consistent MDL"
Sunday, July 6
- 9am-10am Plenary talk: Venkat Anantharam "Entropy Power Inequalities: Results and Speculation"
- 10:30am Mokshay Madiman "Entropy and the Additive Combinatorics of Probability Densities on R^n"
- 10:55am Jun-ichi Takeuchi "When is a Tree Model an Exponential Family?"
- 11:20am Matthew Parry "Local Scoring Rules and Statistical Inference in Unnormalized Models"
- 11:45am Tara Javidi "Extrinsic Jensen-Shannon Divergence and Dynamic Noisy Search"
- 2pm-3pm Plenary talk: Wojciech Szpankowski "Structural Information"
- 3:30pm Narayana Prasad Santhanam "Data Driven Pointwise Convergence in Prediction and Compression"
- 3:55pm Sudeep Kamath "Information Flow in Wireline and Wireless Networks"
- 4:20pm Joachim Buhmann "Information Theory of Algorithms"
Monday, July 7
- 9:00am Sumio Watanabe "Discovery Phenomenon and Information Criteria"
- 9:25am Tommi Mononen "On the Applicability of WAIC and WBIC in the Gaussian Process Framework"
- 9:50am Xiao Yang "Compression and Predictive Distributions for Large Alphabet i.i.d. and Markov Models"
- 10:45am Kazuho Watanabe "Rate-Distortion Analysis for an Epsilon-Insensitive Loss Function"
- 11:10am Wray Buntine "Trees of Probability Vectors and Gibbs Sampling"
- 11:35am Teemu Roos "Information Theory of Squiggly Lines: Authentication by Gestures"
- 2:00pm Yuriy Mileyko "Letting Loops Loose"
- 2:25pm Lu Wei "Product of Random Matrices and Wireless Communications"
- 2:50pm Susanne Still
Invited speakers
- Andrew Barron, Yale University, USA
- Joachim Buhmann, ETH Zurich, Switzerland
- Wray Buntine, NICTA, Australia
- Sabyasachi Chatterjee, Yale University, USA
- Peter Harremoës, Copenhagen Business College, Denmark
- Tara Javidi, University of California, San Diego, USA
- Sudeep Kamath, University of California, San Diego, USA
- Mokshay Madiman, University of Delaware, USA
- Yuriy Mileyko, University of Hawaii, USA
- Tommi Mononen, Aalto University, Finland
- Matthew Parry, University of Otago, New Zealand
- Teemu Roos, University of Helsinki, Finland
- Narayana Prasad Santhanam, University of Hawaii, USA
- Susanne Still, University of Hawaii, USA
- Jun-ichi Takeuchi, Kyushu University, Japan
- Kazuho Watanabe, Toyohashi University of Technology, Japan
- Sumio Watanabe, Tokyo Institute of Technology, Japan
- Lu Wei, University of Helsinki, Finland
- Xiao Yang, Yale University, USA
Plenary speakers
Venkat Anantharam "Entropy Power Inequalities: Results and Speculation"
- Abstract: Shannon's entropy power inequality
characterizes the minimum differential entropy achievable by the sum
of two independent random variables with fixed differential entropies.
Since the pioneering work of Shannon, there has been a steady stream
of results over the years, trying to understand the structure of
Shannon's entropy power inequality, as well as trying to develop
similar entropy power inequalities in other scenarios, such as for
discrete random variables.
We will discuss some aspects of this landscape in this talk.
We will present old results, new results, and share
some speculation about how to prove new kinds of entropy power
inequalities.
- Time: 9am-10am, Sunday, July 6
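For reference, the inequality discussed in the abstract can be written as follows; this is one standard formulation, and the talk may use a different normalization:

```latex
% Shannon's entropy power inequality: for independent R^n-valued
% random variables X and Y with densities,
e^{2h(X+Y)/n} \;\ge\; e^{2h(X)/n} + e^{2h(Y)/n},
% where h(\cdot) denotes differential entropy. Equality holds when
% X and Y are Gaussian with proportional covariance matrices.
```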
Wojciech Szpankowski "Structural Information"
- Abstract:
F. Brooks argued in his 2003 JACM paper on the challenges of computer
science that there is "no theory that gives us a metric for the
information embodied in structure". C. Shannon himself noted this
fifty years earlier in his 1953 paper. More generally, we lack an
information theory of data structures (e.g., graphs, sets, social networks,
chemical structures, biological networks). In this talk, we present
some recent research results on structural information.
We first propose some fundamental limits of information content
for a wide range of data structures with correlated labels and then
propose asymptotically optimal lossless compression algorithms
achieving these limits for unlabeled graphs. Then we move to Markov
fields and try to understand structural properties of
large systems with local mutual dependencies and interaction.
In particular, we focus on enumerating Markov field and
universal types. Finally, we study the capacity of a
sequence-to-structure channel arising in protein folding applications.
The channel itself is characterized by the Boltzmann distribution with
a free parameter corresponding to temperature.
Interestingly, the capacity of such a channel exhibits an unusual phase transition
with respect to temperature. We tackle most of these problems
by complex analysis methods, thus within the realm of
analytic information theory.
- Time: 2pm-3pm, Sunday, July 6
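As a point of reference, the Boltzmann distribution mentioned in the abstract has the standard form below; the notation here is generic, not necessarily that used in the talk:

```latex
% Boltzmann distribution over structures y given a sequence x, with
% energy E(x, y) and temperature parameter T:
P(y \mid x) \;=\; \frac{e^{-E(x,y)/T}}{\sum_{y'} e^{-E(x,y')/T}}.
% As T \to 0 the mass concentrates on minimum-energy structures;
% as T \to \infty the distribution approaches uniform.
```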