Slides from the tutorial are now available!

Part I: Information theory basics: entropy, mutual information, estimation
Part II: Dynamic systems, transfer entropy
Part III: Entropy in high-dimensional spaces, content on social networks

Abstract

Social media is a collection of moving targets. Both the platforms and the behaviors of their users are diverse and constantly evolving. Ad hoc models based on assumptions about today's users may not hold tomorrow. Information theory provides a general framework for identifying meaningful signals without relying on assumptions about human behavior or on platform-specific implementation details. The flexibility of the information-theoretic approach allows researchers to go beyond the study of "re-tweets" to consider rich data including textual content, timing, and context.

The main objective of this tutorial is to provide a gentle introduction to basic information-theoretic concepts and to demonstrate how those concepts can be applied in the context of social network analysis. In particular, we emphasize an interpretation of these quantities as measures of predictability. The strongest signals in social media, and the ones most amenable to research, are the ones that most predictably lead to change. We will use several case studies to illustrate how information theory can be fruitfully applied to real-world social media and to demonstrate how this analysis can be simplified with available tools.

Background

The emergence of information theory as a scientific discipline is commonly attributed to a landmark 1948 paper by Claude Shannon, in which he laid down the basic principles of data transmission through a noisy communication channel. In particular, Shannon's theory tells us that the amount of information we can send through the noisy channel is related to a quantity called "mutual information".
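The noisy-channel idea above can be made concrete with a small computation. The sketch below, a minimal illustration rather than part of the tutorial materials, computes the mutual information between the input and output of a binary symmetric channel that flips each bit with some probability, using the identity I(X;Y) = H(Y) - H(Y|X):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_mutual_information(p_input_one, flip_prob):
    """Mutual information I(X;Y) between the input and output of a
    binary symmetric channel that flips each bit with probability
    flip_prob, computed as I(X;Y) = H(Y) - H(Y|X)."""
    p1 = p_input_one
    # Output distribution: P(Y=1) = P(X=1)(1-f) + P(X=0)f
    py1 = p1 * (1 - flip_prob) + (1 - p1) * flip_prob
    h_y = entropy([py1, 1 - py1])
    # Given X, the only remaining uncertainty in Y is the flip itself,
    # so H(Y|X) is the entropy of the flip probability.
    h_y_given_x = entropy([flip_prob, 1 - flip_prob])
    return h_y - h_y_given_x

# A noiseless channel carries the full 1 bit of a fair coin;
# a channel that flips half the bits carries nothing.
print(bsc_mutual_information(0.5, 0.0))  # 1.0
print(bsc_mutual_information(0.5, 0.5))  # 0.0
```

As the flip probability rises from 0 to 0.5, the mutual information falls from one bit to zero: the noisier the channel, the less the output tells us about the input.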
Mutual information between two random variables (e.g., transmitted and received messages) measures the average reduction in the uncertainty about one variable when we know the value of the other. This concept is illustrated by the Venn diagram below: the yellow and light blue areas denote the uncertainty in variables X and Y, respectively. Those uncertainties are quantified by the corresponding entropies H(X) and H(Y). The mutual information then corresponds to the area of the intersection.

The noisy channel is a powerful framework that has found numerous applications in speech recognition, machine translation, text summarization, and so on.

What does this have to do with influence, human speech, or social media? This abstract framework is remarkably flexible. What if the input is some statement made by Alice? Then the "noisy channel" consists of (e.g.) sound waves, the ear drum, and the brain of Bob. Now Bob "outputs" some other statement. In the example below on the left, Bob has said something very relevant to Alice's statement: ICWSM'13 is in Boston, and Bob mentions that he enjoys the MIT campus. Bob's statement gives us some information about what Alice's original statement was.

More generally, in recent years information-theoretic concepts have been used successfully to characterize processes in dynamic social networks and social media. For instance, Ghosh et al. used an information-theoretic approach to classify user activity on Twitter [4]. They traced the user activity connected with a particular URL and identified two features: time-interval entropy and user entropy. Using just these two features, they achieved good separation of different retweeting activities and were able to categorize content based on the collective user response it generates.
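The two features above can be sketched in a few lines of code. The example below is an illustration on a hypothetical retweet log, not Ghosh et al.'s exact procedure; in particular, the logarithmic binning of inter-event times is an assumed discretization choice:

```python
import math
from collections import Counter

def entropy(values):
    """Plug-in (empirical) Shannon entropy in bits of a sequence of
    discrete observations."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical retweet log for one URL: (user, timestamp in minutes).
events = [("alice", 0), ("bob", 2), ("carol", 3), ("bob", 10),
          ("dave", 11), ("alice", 60), ("erin", 61), ("alice", 62)]

# User entropy: how evenly is retweeting activity spread across users?
user_entropy = entropy([user for user, _ in events])

# Time-interval entropy: how regular are the gaps between retweets?
# Inter-event times are coarsely binned on a log scale before
# computing the entropy (an illustrative choice).
times = sorted(t for _, t in events)
intervals = [b - a for a, b in zip(times, times[1:])]
binned = [int(math.log2(dt + 1)) for dt in intervals]
interval_entropy = entropy(binned)

print(user_entropy, interval_entropy)
```

Content dominated by a single automated account yields low user entropy, while bursty, bot-like timing yields low time-interval entropy; organic, widely shared content tends to score high on both.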
Scope of the tutorial

We will begin with a survey of topics such as random variables, entropy, mutual information, and conditional mutual information, focusing on developing a deeper intuition for what these quantities represent. After demonstrating common pitfalls, we will present practical, state-of-the-art methods for estimating entropic measures from limited data samples. Finally, we will show how these tools can be fruitfully applied to real-world social media data using several case studies. Possible examples include discovering meaningful relationships from social signals using transfer entropy [1, 2], use of entropic measures for classifying temporal activity patterns of users in social media, characterizing randomness in social interactions on Twitter [5], and information-theoretic methods for community detection in social networks [3].

References

The following publications are recommended.