Speaker: Gerardo Berbeglia
Affiliation: Melbourne Business School
Title: The Effect of a Finite Time Horizon in the Durable Good Monopoly Problem with Atomic Consumers
Date: Monday, 27 June 2016
Time: 4-5pm
Location: OGGB, Room 6115

Abstract:
A durable good is a long-lasting good that can be consumed repeatedly over time, and a duropolist is a monopolist in the market of a durable good. In 1972, Ronald Coase conjectured that a duropolist who lacks commitment power cannot sell the good above the competitive price if the time between periods approaches zero. Coase’s counterintuitive conjecture was later proven by Gul et al. (1986) under an infinite time horizon model with non-atomic consumers. Remarkably, the situation changes dramatically for atomic consumers and an infinite time horizon. Bagnoli et al. (1989) showed the existence of a subgame-perfect Nash equilibrium where the duropolist extracts all the consumer surplus. Observe that, in these cases, duropoly profits are either arbitrarily smaller or arbitrarily larger than the corresponding static monopoly profits — the profit a monopolist for an equivalent consumable good could generate.

In this paper we show that the result of Bagnoli et al. (1989) is in fact driven by the infinite time horizon. Indeed, we prove that for finite time horizons and atomic agents, in any equilibrium satisfying the standard skimming property, duropoly profits are at most an additive factor more than static monopoly profits. In particular, duropoly profits are always at least static monopoly profits but never exceed twice the static monopoly profits. Finally we show that, for atomic consumers, equilibria may exist that do not satisfy the skimming property. For two time periods, we prove that amongst all equilibria that maximise duropoly profits, at least one of them satisfies the skimming property. We conjecture that this is true for any number of time periods.
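
To fix ideas, here is a small illustrative sketch of the static monopoly benchmark mentioned above: against a finite list of atomic consumers, the monopolist posts a single price and earns that price times the number of consumers whose valuation meets it. The valuations are made up for illustration and are not taken from the talk.

```python
# Static monopoly benchmark with atomic consumers: the best
# single-price profit. An optimal price can always be chosen
# among the consumers' valuations. Valuations are illustrative.
def static_monopoly_profit(valuations):
    """Best single-price profit against a finite list of valuations."""
    return max(p * sum(1 for v in valuations if v >= p) for p in valuations)

print(static_monopoly_profit([10, 6, 5, 1]))  # price 5 sells 3 units -> 15
```
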

Speaker: Arkadii Slinko
Affiliation: Department of Mathematics
Title: Growth of dimension in complete simple games
Date: Monday, 16 May 2016
Time: 4:00 pm
Location: Clock Tower 032

Simple games are used to model a wide range of situations from decision making in committees to reliability of systems made from unreliable components and McCulloch-Pitts units in threshold logic. Weighted voting games are a natural and practically important class of simple games, in which each agent is assigned a numerical weight, and a coalition is winning if the sum of weights of agents in that coalition achieves a certain threshold.
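
As a toy illustration of the weighted-game definition above (with purely hypothetical weights and quota), the winning-coalition test can be written as:

```python
# A coalition in a weighted voting game wins exactly when the sum of
# its members' weights reaches the quota. Weights and quota below are
# illustrative, not taken from the talk.
def is_winning(coalition, weights, quota):
    """Return True if the coalition's total weight meets the quota."""
    return sum(weights[agent] for agent in coalition) >= quota

weights = {"A": 4, "B": 3, "C": 2, "D": 1}  # hypothetical weights
quota = 6

print(is_winning({"A", "B"}, weights, quota))  # 4 + 3 = 7 >= 6 -> True
print(is_winning({"C", "D"}, weights, quota))  # 2 + 1 = 3 <  6 -> False
```
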

The concept of dimension in simple games was introduced by Taylor and Zwicker in 1993 as a measure of remoteness of a given simple game from a weighted game. They demonstrated that the dimension of a simple game can grow exponentially in the number of players. However, the problem of worst-case growth of the dimension in the important class of complete games was left open. Freixas and Puente (2008) showed that complete games of arbitrary dimension exist and, in particular, their examples demonstrate that the worst-case growth of dimension in complete games is at least linear. In this paper, using a novel technique of Kurz and Napel (2015), we demonstrate that the worst-case growth of dimension in complete games is at least polynomial in the number of players. Whether or not it can be exponential remains an open question.

This is a joint paper with Liam O’Dwyer.

Everyone welcome!

Speaker: Mark Wilson
Affiliation: Computer Science Department
Title: Average-case analysis of random assignment algorithms
Date: Monday, 2 May 2016
Time: 4:00 pm
Location: Clock Tower 032

I present joint work with summer scholarship student Jacky Lo. The problem of one-sided matching without money (also known as house allocation), namely computing a bijection from a finite set of items to a finite set of agents, each of whom has a strict preference order over the items, has been much studied. Symmetry considerations require the use of randomization, yielding the more general notion of random assignment. The two most commonly studied algorithms, Random Serial Dictatorship (also known as Random Priority, RP) and the Probabilistic Serial rule (PS), dominate the literature on random assignments. One feature of our work is the inclusion of several new algorithms for the problem. We adopt an average-case viewpoint: although these algorithms do not have the axiomatic properties of PS and RP, they are computationally efficient and perform well on random data, at least in the case of sincere preferences. We perform a thorough comparison of the algorithms, using several standard probability distributions on ordinal preferences and measures of fairness, efficiency and social welfare. We find that there are important differences in performance between the known algorithms. In particular, our lesser-known algorithms yield better overall welfare than PS and RP and better efficiency than RP, with small negative consequences for envy, and are computationally efficient. Thus, provided that worst-case and strategic concerns are relatively unimportant, the new algorithms should be seriously considered for use in applications.
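
For readers unfamiliar with the rules compared in the talk, here is a minimal sketch of Random Serial Dictatorship, one of the two standard algorithms: draw a random order over the agents, then let each agent in turn take their favourite remaining item. The preference profile below is illustrative.

```python
import random

def random_serial_dictatorship(preferences, rng=random):
    """preferences: dict agent -> list of items, best first.

    Returns a dict assigning each agent one item.
    """
    agents = list(preferences)
    rng.shuffle(agents)  # the random priority order
    remaining = set().union(*preferences.values())
    assignment = {}
    for agent in agents:
        # take the highest-ranked item still available
        choice = next(item for item in preferences[agent] if item in remaining)
        assignment[agent] = choice
        remaining.remove(choice)
    return assignment

prefs = {"a1": ["h1", "h2", "h3"],
         "a2": ["h1", "h3", "h2"],
         "a3": ["h2", "h1", "h3"]}
print(random_serial_dictatorship(prefs))
```

Averaging the resulting assignment over many random orders gives the random assignment (lottery over items) that the rule induces.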

Everyone welcome!

Speaker:     Jiamou Liu
Affiliation: Computer Science Department
Title:       Hierarchies, Ties and Power in Organizational Networks
Date:        Monday, 11 Apr 2016
Time:        4:00 pm
Location:    Clock Tower 032
An organizational structure consists of a network where employees are connected by work and social ties. Analyzing this network, one can discover valuable insights into information flow within the organization. Moreover, properly defined centrality measures reveal the distribution of power and, therefore, the important individuals in the network. We develop this idea and propose a simple network model that is consistent with management theory and captures the main traits of large corporations. The backbone of the model is an organizational hierarchy. We extend it by allowing additional types of connections, such as collaboration, consultation, and friendship. Having both reporting and non-reporting interpersonal ties, our model supports a multilevel approach to social networks. We then formally define power and power consistency in organizations. These notions enable us to analyze a range of organizational phenomena such as limited hierarchy height, restructuring through flattening, and the impact of non-reporting ties. This is joint work with Anastasia Moskvina.
Everyone welcome!

Speaker:     Rachel Liu
Affiliation: The University of Auckland
Title:       Generalised Second-Price Auctions
Date:        Monday, 4 Apr 2016
Time:        4:00 pm
Location:    Clock Tower 032
We model the Generalized Second-Price Auction for internet advertising as a strategic-form game with complete information. We begin with the simplest game, in which two advertisers bid against each other over two advertising positions. We show that each advertiser has a unique weakly dominant strategy and that there are multiple Nash equilibria. We then study the game with three advertisers and three positions and show that no player has a dominant strategy. Each of the six possible allocations may arise as a Nash equilibrium. It is not clear which allocation results in the largest welfare loss compared to the socially optimal outcome. In the general case, too, no player has a dominant strategy.
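
For readers new to the mechanism, here is a minimal sketch of the standard GSP allocation and pricing rule: bidders are ranked by bid, the i-th slot goes to the i-th highest bidder, who pays the (i+1)-th highest bid per click. The bids and slot count below are illustrative.

```python
# Generalized Second-Price sketch with illustrative bids. Ties and
# click-through rates are ignored for simplicity.
def gsp(bids, num_slots):
    """bids: dict bidder -> bid. Returns list of (winner, price per click)
    ordered from the top slot down."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    outcome = []
    for i in range(min(num_slots, len(ranked))):
        # price is the next-highest bid, or 0 if there is none
        price = bids[ranked[i + 1]] if i + 1 < len(ranked) else 0
        outcome.append((ranked[i], price))
    return outcome

print(gsp({"x": 5, "y": 3, "z": 1}, num_slots=2))  # [('x', 3), ('y', 1)]
```
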
Everyone welcome!

Speaker:     Arkadii Slinko
Affiliation: Department of Mathematics
Title:       Approximation Algorithms for Fully Proportional Representation by Clustering Voters
Date:        Wednesday, 9 Dec 2015
Time:        12:00 pm
Location:    303-561
Charles Dodgson (Lewis Carroll) asserted that “a representation system should find the coalitions in the election that would have formed if the voters had the necessary time and information … and allow each of the coalitions to elect their representative using some single-winner voting method.”
Both the Chamberlin-Courant and Monroe voting rules do exactly that. Given the preferences of voters, they select committees whose members represent the voters so that voters’ satisfaction with their assigned representatives is maximised. These rules suffer from a common disadvantage: computing the winning committee exactly is computationally intractable when the numbers of voters and candidates are large. As both of these rules, explicitly or implicitly, partition the voters, they can be seen as clustering voters so that the voters in each group share the same representative. This suggests studying approximation algorithms for these voting rules by means of cluster analysis, which is the subject of this paper. We develop several such algorithms and experimentally analyse their performance.
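
As a small illustration of the objective being approximated, here is a sketch of Chamberlin-Courant satisfaction under Borda scoring: each voter is represented by their most preferred committee member, and the voter's satisfaction is that member's Borda score in the voter's ranking. The profile and committee below are illustrative.

```python
# Chamberlin-Courant satisfaction of a fixed committee under Borda
# scoring; the rule then asks for a committee maximising this total.
def cc_satisfaction(profile, committee):
    """profile: list of rankings (best first); committee: set of candidates."""
    m = len(profile[0])  # number of candidates
    total = 0
    for ranking in profile:
        # each voter is assigned their best-ranked committee member
        best_position = min(ranking.index(c) for c in committee)
        total += (m - 1) - best_position  # Borda score of that member
    return total

profile = [["a", "b", "c", "d"],
           ["b", "a", "d", "c"],
           ["c", "d", "a", "b"]]
print(cc_satisfaction(profile, {"a", "c"}))  # 3 + 2 + 3 = 8
```
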
Joint work with Piotr Faliszewski and Nimrod Talmon.
Everyone welcome!

Speaker: Mark Wilson
Affiliation: Computer Science Department
Title: Predicting FPP elections
Date: Tuesday, 6 Oct 2015
Time: 5:00 pm
Location: CAG15/114-G15 (Commerce A)

In this informal talk I will discuss some basic issues involved in predicting elections in countries using the First Past The Post (single-winner plurality in districts) electoral system. A variety of methods have been tried with varying success. Part of the reason for this talk is to clarify for myself what “success” means. The talk will focus on standard methods involving models of “swing”, which often underlie more complicated models. I will make some predictions for the Canada 2015 election.
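
To make the idea of a swing model concrete, here is a minimal sketch of uniform-swing prediction: add the national swing to each party's previous share in every district, then call the district for the largest adjusted share. The district shares and swing values are made up for illustration.

```python
# Uniform-swing sketch: the same national shift is applied in every
# district. Real models refine this in many ways (regional swings,
# turnout models, etc.).
def predict_districts(district_shares, swing):
    """district_shares: list of dicts party -> share; swing: dict party -> delta.
    Returns the predicted winner of each district."""
    winners = []
    for shares in district_shares:
        adjusted = {p: s + swing.get(p, 0.0) for p, s in shares.items()}
        winners.append(max(adjusted, key=adjusted.get))
    return winners

districts = [{"Lib": 0.42, "Con": 0.40, "NDP": 0.18},
             {"Lib": 0.30, "Con": 0.45, "NDP": 0.25}]
print(predict_districts(districts, {"Lib": -0.03, "NDP": 0.04}))
```
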

Everyone welcome!

Speaker: Samin Aref
Affiliation: Department of Computer Science
Title: Measuring Partial Balance in Signed Networks
Date: Tuesday, 29 Sep 2015
Time: 5:00 pm
Location: CAG15/114-G15 (Commerce A)

Is the enemy of an enemy necessarily a friend, or is a friend of a friend a friend? If not, to what extent does this tend to hold? Such questions were formulated in terms of signed (social) networks, and necessary and sufficient conditions for a network to be “balanced” were obtained around 1960. Since then the idea that signed networks tend over time to become more balanced has been widely used in several application areas, such as international relations. However, investigation of this hypothesis has been complicated by the lack of a standard measure of partial balance, since complete balance is almost never achieved in practice.
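
As a toy illustration of the classical balance condition: a cycle in a signed network is balanced exactly when the product of its edge signs is positive, so "the enemy of my enemy is my friend" corresponds to a balanced triangle. The edge signs below are illustrative.

```python
# Classical balance check for a single cycle of a signed network:
# balanced iff the number of negative edges on the cycle is even,
# i.e. the product of signs is positive.
def cycle_is_balanced(signs):
    """signs: list of +1/-1 edge signs along a cycle."""
    product = 1
    for s in signs:
        product *= s
    return product > 0

print(cycle_is_balanced([+1, -1, -1]))  # two negative edges: balanced -> True
print(cycle_is_balanced([+1, +1, -1]))  # one negative edge: unbalanced -> False
```

A network is balanced when every cycle is; measures of partial balance quantify how far a network falls short of this.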

We formalise the concept of a measure of partial balance, compare several known measures on real-world and synthetic datasets, and investigate their axiomatic properties. We use both well-known datasets from the sociology literature, such as Read’s New Guinean tribes, and much more recent ones involving senate bill co-sponsorship. The synthetic data involves both Erdős-Rényi and Barabási-Albert graphs.

We find that under all our measures, real-world networks are more balanced than random networks. We also show that some measures behave better than others in terms of axioms. We make some recommendations for measures to be used in future work.

Everyone welcome!

Speaker:     José A. Rodrigues-Neto
Affiliation: Australian National University
Title:       Self-Consistency and Common Prior in Non-Partitional Knowledge Models
Date:        Tuesday, 22 Sep 2015
Time:        5:00 pm
Location:    CAG15/114-G15 (Commerce A)
In non-partitional models of knowledge with objective and subjective state spaces, the issue of self-consistency arises. The present paper defines a multigraph G_j for each player j, and also a global multigraph G. The posteriors of player j are self-consistent if and only if all cycle equations associated with cycles in G_j are satisfied. Similarly, the posteriors of all players are consistent with a common prior when all cycle equations corresponding to the cycles in G are satisfied. In particular, the self-consistency of player j is automatic when G_j is acyclic. Consistency always holds when G is acyclic, regardless of any probabilistic information. There is a simple formula to check for the acyclicity of G_j, and another formula to check for the acyclicity of G.
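
Since acyclicity of the multigraphs settles consistency outright, a concrete check may help fix ideas. The following is a generic union-find cycle test for an undirected multigraph, not necessarily the specific formula from the paper; the example graph is illustrative.

```python
# Acyclicity check for an undirected multigraph: an edge whose
# endpoints are already in the same component closes a cycle
# (this also catches parallel edges and self-loops).
def is_acyclic(vertices, edges):
    """vertices: iterable; edges: list of (u, v) pairs."""
    parent = {v: v for v in vertices}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False  # this edge closes a cycle
        parent[ru] = rv
    return True

print(is_acyclic([1, 2, 3], [(1, 2), (2, 3)]))          # tree -> True
print(is_acyclic([1, 2, 3], [(1, 2), (2, 3), (3, 1)]))  # triangle -> False
```
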
This is a joint paper with Luciana C. Fiorini.
Everyone welcome!