INTERNATIONAL GROUP OF COORDINATORS:
[To join, please contact one of the coordinators to get a meeting ID and password]
LIST OF TALKS
Date and time: 8 December 9AM GMT, 9AM London, 10AM Paris, noon Moscow, 6PM Seoul & Tokyo, 10PM Auckland
Title: Nondeterministic proportional consensus – theoretical and simulated properties of a novel nonmajoritarian single-winner voting method aiming at fairness and efficiency [paper & slides]
Abstract: Probabilistic methods can trivially distribute power equally so that any subgroup of x% size can guarantee their favourite a winning probability of x%. But can they at the same time elect a promising compromise option with certainty in strong Nash equilibrium? I’ll motivate and discuss a method, “MaxParC”, that does so even for partial compromises. It uses conditional commitments via thresholds for approval and then proceeds like Duddy’s “Conditional Utilitarian Rule”. I’ll present some theoretical results and extensive agent-based simulations that compare MaxParC to Plurality, Approval, IRV, Random Ballot, Nash Max Product and other methods. We used various preference models, risk attitude types, and behavioural types for these. One finding suggests that the “welfare costs of fairness” are small compared to the inequality costs of majoritarianism. Joint talk with Forest W Simmons.
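As a point of reference for the fairness guarantee mentioned in the abstract, Random Ballot (one of the benchmarks in the comparison) achieves the x%-power property in the most direct way. The sketch below is only an illustration of that benchmark, not of MaxParC itself; the profile is a made-up example.

```python
import random

def random_ballot(favourites):
    """Draw one ballot uniformly at random and elect the favourite on it.

    A coordinated subgroup holding a fraction x of the ballots thereby
    guarantees its favourite a winning probability of exactly x.
    """
    return random.choice(favourites)

# Example: 30 of 100 voters favour "c", so "c" wins with probability 0.3.
profile = ["a"] * 40 + ["b"] * 30 + ["c"] * 30
print(random_ballot(profile))
```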
Date and time: 1 December 8:30 AM Montreal, 14:30 Paris, 16:30 in Moscow, 10:30 PM in Seoul, and 2 December 2:30 AM Auckland
Title: Treating future people impartially implies avoiding future lives with low well-being
Abstract: It has been claimed that climate policies can be evaluated by the Pareto principle. However, climate policies lead to different identities and different numbers of future people. Even if one assumes that the number of future people is countably infinite independently of policy choice, the problem is that there exists no natural one-to-one correspondence between the components of the compared alternatives. This non-existence means that the components of streams are indexed by natural numbers that do not correspond to particular people, making a case for impartiality in the sense of Strong anonymity. Strong anonymity is incompatible with Strong Pareto. The paper re-examines this incompatibility and investigates how far sensitivity to the well-being at any one component can be extended without contradicting Strong anonymity. We show that Strong anonymity combined with four rather innocent axioms has two consequences:
(i) there can be sensitivity to the well-being at a particular component of the stream if and only if a cofinite set of people have well-being levels that are higher by more than some \epsilon > 0, and
(ii) adding people to the population cannot have positive social value.
Joint work with Kohei Kamaga and Stephane Zuber.
Date and time: 24 November 9AM GMT
Title: Ex-ante versus ex-post compromises
Abstract: A social choice rule maps preference profiles to alternatives. Depending on the properties that this function satisfies, very different outcomes can be produced starting from the same initial profile. The plurality rule consists of selecting, as winners, the alternatives that are considered the best by the largest number of voters in the society. Yet, this rule can pick as a winner an alternative that is considered the worst by a strict majority of voters. Several procedures have been proposed in the literature that aim to find a compromise involving at least a majority of individuals. Nevertheless, all those rules can be defined as ex-ante or procedural compromises, i.e., they impose on individuals a willingness to compromise but do not ensure an outcome in which everyone has effectively compromised. In this work, we approach the problem of compromise from an ex-post perspective, favoring an outcome where all individuals give up as equally as possible from their ideal points. For a large enough number of individuals and alternatives, Condorcet extensions, scoring rules, and even BK-compromises fail to pick ex-post compromises, under any reasonable meaning attributed to “giving up equally”. With two individuals and a large enough number of alternatives, all well-known two-person social choice rules of the literature fail to pick ex-post compromises. This is joint work with Beatrice Napolitano and Remzi Sanver.
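For readers who want to see the plurality pathology above concretely, here is a minimal sketch; the nine-voter profile is my own illustrative example, not one from the paper.

```python
from collections import Counter

# 9 voters over candidates a, b, c; each list is a ranking, best first.
profile = (
    [["a", "b", "c"]] * 4 +   # 4 voters rank a first
    [["b", "c", "a"]] * 3 +   # 3 voters rank b first and a last
    [["c", "b", "a"]] * 2     # 2 voters rank c first and a last
)

first_places = Counter(r[0] for r in profile)
last_places = Counter(r[-1] for r in profile)

winner = max(first_places, key=first_places.get)
print(winner)                                   # 'a' wins plurality with 4 first places
print(last_places[winner] > len(profile) / 2)   # True: 5 of 9 voters rank 'a' last
```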
Date and time: 17 November 9AM GMT
Title: Fair and consistent prize allocation in competitions [paper]
Abstract: Given the final ranking of a competition, how should the total prize endowment be allocated among the competitors? We study consistent prize allocation rules satisfying elementary solidarity and fairness principles. In particular, we axiomatically characterize two families of rules satisfying anonymity, order preservation, and endowment monotonicity, which all fall between the Equal Division rule and the Winner-Takes-All rule. Specific characterizations of rules and subfamilies are directly obtained. This is joint work with Aleksei Y. Kondratev.
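For concreteness, the two rules that bracket the family can be written down directly; the convex mixture at the end is only an illustrative compromise between them, not the family characterized in the paper.

```python
def equal_division(endowment, n):
    """Every competitor receives the same share, regardless of rank."""
    return [endowment / n] * n

def winner_takes_all(endowment, n):
    """The top-ranked competitor receives the whole endowment."""
    return [float(endowment)] + [0.0] * (n - 1)

def mixture(endowment, n, lam):
    """An illustrative compromise: a convex combination of the two rules."""
    ed, wta = equal_division(endowment, n), winner_takes_all(endowment, n)
    return [lam * w + (1 - lam) * e for w, e in zip(wta, ed)]

# Prizes listed from first to last place; note the allocation is order preserving.
print(mixture(100.0, 4, 0.5))   # [62.5, 12.5, 12.5, 12.5]
```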
Date and time: 10 November 9AM GMT
Title: Accountable Voting
Abstract: We propose a new framework of collective decision-making which takes into account conflicts of interest among the voters and the voting alternatives (e.g., applicants for some competition). We define accountability as properties of a social welfare function such that alternatives/applicants with fewer “related” voters are not disadvantaged compared to those with more “related” voters. We introduce two accountability axioms, impartiality and the no-power-game property. Impartiality, which requires independence from preferences over applicants of interest, is violated by ordinary scoring rules. We formulate several score-based rules that satisfy impartiality. Among them, the winning-rate rule considers the entire interest structure and satisfies some desirable axioms. Another important finding is that it is impossible to satisfy a variant of the unanimity axiom together with the no-power-game property, which requires that having one more interested voter should not improve the applicant’s social ranking. This is joint work with Yoko Kawada [Keio U], Yuta Nakamura [Yokohama City U] and Noriaki Okamoto [Meiji Gakuin U].
Date and time: 3 November 9AM GMT (9AM London, 10AM Paris, noon Moscow, 6PM Seoul, 10PM Auckland)
Title: Manipulation-resistant false-name-proof facility location mechanisms for complex graphs [paper]
Abstract: In many real-life scenarios, a group of agents needs to agree on a common action, e.g., on a public facility location, while there is some consistency between their preferences, e.g., all preferences are derived from a common metric space. The facility location problem models such scenarios, and it is a well-studied problem in social choice. We study mechanisms for facility location on unweighted undirected graphs that are resistant to manipulations (strategy-proof, abstention-proof, and false-name-proof) by both individuals and coalitions on the one hand, and anonymous and efficient (Pareto-optimal) on the other hand. We define a new family of graphs, ZV-line graphs, and show a general facility location mechanism for these graphs satisfying all these desired properties. This mechanism can also be computed in polynomial time, and it can equivalently be defined as the first Pareto-optimal location according to some predefined order. Our main result, the ZV-line graphs family and the mechanism we present for it, unifies all works in the literature of false-name-proof facility location on discrete graphs, including the preliminary (unpublished) works we are aware of. In particular, we show mechanisms for all graphs of at most five vertices, discrete trees, bicliques, and clique tree graphs. Finally, we discuss some generalizations and limitations of our result for facility location problems on other structures: weighted graphs, large discrete cycles, infinite graphs, and facility location problems concerning infinite societies. This is joint work with Taiki Todo and Makoto Yokoo.
Date and time: 27 October 9AM GMT
Title: Polarization in Networks: Identification-alienation framework
Abstract: We introduce a model of polarization in networks as a unifying setting for the measurement of polarization that covers a wide range of applications. We consider a substantially general setup for this purpose: node- and edge-weighted, undirected, and connected networks. We generalize the axiomatic characterization of Esteban and Ray (1994) and show that only a particular instance within this class can be used justifiably to measure polarization in networks. This is joint work with Kenan Huremovic (IMT Lucca).
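For reference, the baseline being generalized is the Esteban and Ray (1994) measure for a discrete distribution with group weights \pi_i located at positions y_i; up to a positive constant K and their admissible range of the polarization-sensitivity parameter \alpha, it reads:

\[
P_{\alpha}(\pi, y) \;=\; K \sum_{i}\sum_{j} \pi_i^{\,1+\alpha}\, \pi_j\, \lvert y_i - y_j \rvert, \qquad K > 0,\; \alpha \in (0, \alpha^{*}].
\]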
Date and time: 20 October 8AM GMT
Title: Menu mechanisms
Abstract: We investigate menu mechanisms: dynamic mechanisms where at each history, an agent selects from a menu of his possible assignments. In comparison to direct mechanisms, menu mechanisms offer better privacy to participants; we formalize this with a novel notion of mechanism informativeness. We consider both ex-post implementation and full implementation, for both subgame perfection and a strengthening of dominance that covers off-path histories, and provide conditions under which menu mechanisms provide these implementations of rules. Our results cover a variety of environments, including elections, marriage, college admissions, auctions, labor markets, matching with contracts, and object allocation. This is joint work with Yu Zhou from Kyoto University.
Date and time: 13 October 8AM GMT
Title: A Map of Elections: What is It and What Is It Good For?
Recording of the talk
Abstract: In this talk, I will discuss the problem of measuring distances (similarity) between elections (or, more specifically, between profiles of ordinal preferences). Since I will be interested in measuring distances between elections generated from a number of statistical cultures, I will consider distances that are invariant with respect to renaming candidates and voters (I refer to such distances as isomorphic distances). While many such distances are NP-hard to compute, there are also some that are efficiently computable and that, apparently, give meaningful results.
With an appropriate isomorphic distance in hand, I will describe a “map of elections”, i.e., a collection of elections generated from a number of statistical cultures. Such a map can be conveniently visualized: It suffices to compute distances between each pair and use an appropriate graph-drawing algorithm. I will show that such visualizations are very useful. In particular, they reveal some features of the statistical cultures considered, and they allow us to understand elections better (for example, we will see how close various elections are to having Condorcet winners, how the Borda scores of their winners align, and how much time various algorithms need for them).
Even if you have seen one of my previous talks on this topic, there will be new pictures to see and more evidence that maps of elections are useful!
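To make “isomorphic distance” concrete, here is a brute-force sketch for very small elections: it minimizes a Spearman-type positional cost over all candidate relabelings and all voter matchings, so it is invariant under renaming candidates and voters. This is only meant to illustrate the invariance requirement; it is not one of the efficiently computable distances discussed in the talk.

```python
from itertools import permutations

def spearman(vote1, vote2):
    """Sum of positional displacements between two rankings of the same candidates."""
    position = {c: i for i, c in enumerate(vote2)}
    return sum(abs(i - position[c]) for i, c in enumerate(vote1))

def isomorphic_distance(election1, election2):
    """Brute force over candidate relabelings and voter matchings.

    Each election is a list of rankings over its own candidate set; both
    elections must have the same numbers of voters and candidates.
    Exponential, so only usable on tiny instances.
    """
    cands1, cands2 = election1[0], election2[0]
    best = float("inf")
    for relabel in permutations(cands2):
        mapping = dict(zip(cands1, relabel))
        renamed = [[mapping[c] for c in vote] for vote in election1]
        for matching in permutations(range(len(election2))):
            cost = sum(spearman(renamed[i], election2[j])
                       for i, j in enumerate(matching))
            best = min(best, cost)
    return best

e1 = [["a", "b", "c"], ["b", "a", "c"]]
e2 = [["x", "y", "z"], ["y", "x", "z"]]   # e1 with candidates renamed
print(isomorphic_distance(e1, e2))        # 0
```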
Date and time: 6 October 8:30AM NY time (7 Oct 1:30am in AKL)
Title: Resource Allocation by Frugal Majority Rule
Abstract: We propose a model of ‘frugal aggregation’ in which the evaluation of social welfare must be based on information about agents’ top choices plus general qualitative background conditions on preferences. The former is elicited individually, while the latter is not. We apply this model to problems of public budget allocation, relying on the specific assumptions of convexity and separability of preferences. For the case of separably convex preferences, we propose a particular aggregation rule called ‘Frugal Majority Rule’ and show that it amounts to minimizing the sum of the L1 distances to the agents’ tops. An algorithmic characterization of Frugal Majority Rule as a ‘coordinated’ median renders it analytically tractable and efficiently computable; this also allows one to derive a number of desirable properties such as efficiency, the absence of the no-show paradox and partial strategy-proofness. For the case of merely convex preferences, we propose an alternative solution concept, the ‘Tukey’ median. Finally, we consider intermediate preference restrictions in which separability is weakened to allow for complementarities and/or substitutivities among subsets of goods. This is joint work with Klaus Nehring (University of California Davis).
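The stated objective can be sketched by brute force: among all integer allocations that exhaust the budget, pick one minimizing the sum of the L1 distances to the agents' reported tops. This is not the paper's coordinated-median algorithm, just an illustration of the objective on made-up numbers.

```python
from itertools import product

def l1(x, y):
    return sum(abs(a - b) for a, b in zip(x, y))

def frugal_l1_minimizer(tops, budget):
    """Brute force over integer allocations summing to `budget`.

    `tops` lists each agent's ideal allocation over the same goods.
    Exponential in the number of goods; for illustration only.
    """
    m = len(tops[0])
    best_alloc, best_cost = None, float("inf")
    for alloc in product(range(budget + 1), repeat=m):
        if sum(alloc) != budget:
            continue
        cost = sum(l1(alloc, top) for top in tops)
        if cost < best_cost:
            best_alloc, best_cost = alloc, cost
    return best_alloc

# Three agents allocating a budget of 6 across three public goods.
tops = [(3, 1, 2), (1, 3, 2), (2, 2, 2)]
print(frugal_l1_minimizer(tops, 6))   # (2, 2, 2): coordinatewise medians exhaust the budget here
```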
Date and time: 29 September 8AM GMT (e.g., 10am in Paris, 9pm in AKL)
Abstract: In the classical bankruptcy or estate division problem, an estate has to be divided among n players who each have an entitlement on the estate. The formal analysis of this problem started with O’Neill (1982) and Aumann and Maschler (1985). Most of the approaches in the literature are of an axiomatic and/or cooperative nature, but O’Neill (1982) also proposed a claim game: players put claims on parts of the estate, limited by their entitlements, and then each part of the estate is divided proportionally among the claimants for that part. There is also a literature which pursues several variations on this approach, and it has relations with several other topics, such as Hotelling games, contests, or Blotto games. In this presentation, we briefly review some of this literature, and also present current work, in particular on sequential claim games, in which the players put their claims sequentially rather than simultaneously.
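A minimal sketch of the division stage of the simultaneous claim game described above: each part of the estate is split among its claimants in proportion to the claims placed on it. The numbers are purely illustrative.

```python
def divide_parts(part_values, claims):
    """Split each part proportionally among the players claiming it.

    `part_values[p]` is the value of part p, and `claims[i][p]` is the amount
    player i claims on part p (already capped by her entitlement).
    Parts that nobody claims are left undivided here.
    """
    n = len(claims)
    payoffs = [0.0] * n
    for p, value in enumerate(part_values):
        total_claim = sum(claims[i][p] for i in range(n))
        if total_claim == 0:
            continue
        for i in range(n):
            payoffs[i] += value * claims[i][p] / total_claim
    return payoffs

# Two parts worth 60 and 40; player 0 claims mostly on part 0, player 1 on part 1.
print(divide_parts([60, 40], [[50, 10], [25, 30]]))   # [50.0, 50.0]
```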
Date and time: 22 September 8AM GMT
Title: On the axiomatic approach to sharing the revenues from broadcasting sports leagues
Abstract: We take the axiomatic approach to uncover the structure of the revenue-sharing problem from broadcasting sports leagues. Our starting point is to explore the implications of three basic axioms: additivity, order preservation and weak upper bound. We show that the combination of these axioms characterizes a large family of rules, which is made of compromises between the uniform rule and concede-and-divide, such as the one represented by the equal-split rule. The members of the family are fully ranked according to the Lorenz dominance criterion, and the structure of the family guarantees the existence of a majority voting equilibrium. Strengthening some of the previous axioms, or adding new ones, we provide additional characterizations within the family. Weakening some of those axioms, we also characterize several families encompassing the original one. Joint paper with Gustavo Bergantiños (ECOSOT, Universidade de Vigo).
Date and time: 15 September 8:30AM New York Time (i.e., 2:30pm in Paris, Sep 16 12:30 AM AKL)
Title: Generalized Borda Rules as a Resolution of Arrow’s Impossibility Challenge
Abstract: The Borda rule evaluates an alternative by the average of the majority margins over all feasible alternatives. Generalized Borda rules (GBRs) evaluate alternatives by a weighted average of the majority margins over all feasible alternatives, where the weights may depend on the set of feasible alternatives. We call this weighting function the “relevance index” defining the GBR. A basic result of the paper characterizes GBRs in terms of an axiom of Ordinal Admissibility. Ordinal Admissibility expresses the normative desideratum that the social choice rule be justifiable as optimal on “purely ordinal” grounds. The Borda rule itself is not a satisfactory response to Arrow’s impossibility challenge, since it is highly vulnerable to the exact specification of the feasible set. In particular, it starkly violates the Independence of Clones axiom due to Tideman (1987). That original formulation has proved rather restrictive and is arguably too strong: in particular, it essentially forces Polarization, i.e., the social choice of an alternative that is top-ranked by a majority of the agents and bottom-ranked by all others. By contrast, we show that moderate versions of Invariance to Cloning are satisfied by GBRs whenever their relevance indices satisfy appropriate invariance conditions. Such GBRs need not, and typically do not, exhibit Polarization. A simple and naturally axiomatized index is the plurality index given by the distribution of agents’ preference tops. The induced GBR, the “Pluri-Borda rule”, has attractive properties and instantiates a new possibility result on Post-Arrowian social choice.
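To fix the objects involved, the sketch below computes majority margins and evaluates an alternative by their average (an affine transform of the familiar Borda score), together with a generic weighted variant in which the weights depend on the opposing alternative. The particular relevance index of the Pluri-Borda rule is not reproduced here; the weighting interface is only an assumption for illustration.

```python
def majority_margin(profile, x, y):
    """Voters preferring x to y minus voters preferring y to x."""
    pro = sum(1 for ranking in profile if ranking.index(x) < ranking.index(y))
    return pro - (len(profile) - pro)

def borda_value(profile, x, alternatives):
    """Borda evaluation of x: average majority margin against the other alternatives."""
    others = [y for y in alternatives if y != x]
    return sum(majority_margin(profile, x, y) for y in others) / len(others)

def weighted_borda_value(profile, x, alternatives, weight):
    """A generalized variant: `weight(y)` re-weights the margin of x against y."""
    others = [y for y in alternatives if y != x]
    total = sum(weight(y) for y in others)
    return sum(weight(y) * majority_margin(profile, x, y) for y in others) / total

profile = [["a", "b", "c"], ["b", "c", "a"], ["a", "c", "b"]]
alts = ["a", "b", "c"]
print({x: borda_value(profile, x, alts) for x in alts})   # {'a': 1.0, 'b': 0.0, 'c': -1.0}
```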
Date and time: 8 September 8AM GMT
Title: A solution to the two-person implementation problem
Abstract: We propose strike mechanisms as a solution to the classical problem of Hurwicz and Schmeidler [1978] and Maskin [1999] according to which, in two-person societies, no Pareto efficient rule is Nash-implementable. A strike mechanism specifies the number of alternatives that each player vetoes. Each player simultaneously casts these vetoes and the mechanism selects randomly one alternative among the unvetoed ones. For strict preferences over alternatives and under a very weak condition for extending preferences over lotteries, these mechanisms are deterministic-in-equilibrium. They Nash implement a class of Pareto efficient social choice rules called Pareto-and-veto rules. Moreover, under mild richness conditions on the domain of preferences over lotteries, any Pareto efficient Nash-implementable rule is a Pareto-and-veto rule and hence is implementable through a strike mechanism. Joint paper with JF Laslier and Matias Nuñez.
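A minimal sketch of a strike mechanism's outcome function as described: both players cast their vetoes simultaneously and a uniformly random alternative is drawn among those left unvetoed. The assumption that the two veto quotas sum to one less than the number of alternatives, so that something always survives, is mine for the purposes of the sketch.

```python
import random

def strike_outcome(alternatives, vetoes1, vetoes2, quota1, quota2):
    """Draw uniformly among the alternatives not vetoed by either player.

    Assumes quota1 + quota2 == len(alternatives) - 1, which guarantees
    that at least one alternative remains unvetoed.
    """
    assert len(vetoes1) == quota1 and len(vetoes2) == quota2
    assert quota1 + quota2 == len(alternatives) - 1
    vetoed = set(vetoes1) | set(vetoes2)
    return random.choice([a for a in alternatives if a not in vetoed])

alternatives = ["a", "b", "c", "d", "e"]
print(strike_outcome(alternatives, {"d", "e"}, {"a", "b"}, 2, 2))   # only "c" survives
```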
Date and time: 1 September 8AM GMT
Title: Walking a Mile in Your Shoes: an Escape from Arrovian Impossibilities [slides]
Abstract: This paper explores approaches to comparative justice (Sen 2009) by drawing on Social Choice Theory. We introduce a procedure to correct for the influence of unquestioned parochial values on individual justice rankings: individuals are put into the position of other members of society allowing them to question (and possibly change) their justice ordering. In a first step, it is shown under which conditions this procedure leads to a domain restriction such that majority rule yields a social justice ordering. In a second step, it is examined how the introduced procedure can be used to distinguish between “reasoned” and “unreasoned” agreement. The paper concludes with a discussion as to how the findings cast doubt on the unqualified acceptance of the (weak) Pareto condition.
Date and time: 25 August 8AM GMT
Title: Pairwise justified changes in collective choices
Abstract: Consider the following principle regarding the performance of collective choice rules. “If a rule selects alternative x in situation 1, and alternative y in situation 2, there must be an alternative z, and some member of society whose appreciation of z relative to x has increased when going from situation 1 to situation 2.” This principle requires a minimal justification for the fall of x in the consideration of society: someone must have decreased their appreciation of x relative to some other possible alternative. We study the consequences of imposing this requirement of pairwise justifiability on a large class of collective choice rules that includes social choice and social welfare functions as particular cases. When preference profiles are unrestricted, it implies dictatorship, and both Arrow’s and the Gibbard-Satterthwaite theorems become corollaries of our general result. On appropriately restricted domains, pairwise justifiability, along with anonymity and neutrality, characterize Condorcet consistent rules, thus providing a foundation for the choice of the alternatives that win by majority over all others in pairwise comparisons, when they exist. This is a joint paper with Dolors Berga, Bernardo Moreno, and Antonio Nicoló.
Date and time: 18 August 8AM GMT
Title: A New Approach to Contests with Complete and Incomplete Information [slides]
Abstract: We propose an alternative strategy of modelling contests by introducing a new class of truncated polynomial probability-of-win functions. Our approach permits finding a closed-form solution and obtaining valuable comparative statics results not only for the complete information case (both for mixed and pure strategies), but also for the case of incomplete (private) information. In particular, we are able to address an important question of information design in contests. In addition, this approach allows us to explore the case when a further increase in the maximal effort exerted in a contest bestows a positive externality on other contenders with lower efforts, increasing their marginal probability of winning. We argue that this situation usually arises in R&D competition and patent races and cannot be modelled by standard Tullock-style contest models. This is joint work with Alexander Matros, North Carolina and Lancaster.
Date and time: 11 August 8AM GMT
Host: Simona Fabrizi
Title: Population ethics in an infinite universe [paper and slides]
Abstract: Population ethics studies the tradeoff between the total number of people who will ever live, and their welfare. But widely accepted theories in modern cosmology say that spacetime is probably infinite. In this case, its population is also probably infinite, so the quantity/quality tradeoff of population ethics is no longer meaningful. Instead, we face the problem of how to evaluate the social welfare of an infinite population dispersed through time and space. I propose spatiotemporal Cesàro average utility as a way to make this evaluation, and axiomatically characterize it.
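For intuition, the purely temporal analogue of the proposed criterion is the ordinary Cesàro average of a utility stream (u_t); the spatiotemporal version in the paper instead averages over expanding regions of spacetime.

\[
\bar{u} \;=\; \lim_{T \to \infty} \frac{1}{T} \sum_{t=1}^{T} u_t .
\]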
Date and time: 4 August 8AM GMT
Host: Remzi Sanver
Title: Manipulation of social choice functions under incomplete information
Abstract: The failure of strategy-proofness requires the presence of an individual who completely knows the others’ preferences. However, as observed by some authors, an individual might decide to misrepresent her preferences on the basis of a smaller amount of information. It is then possible to consider weak versions of strategy-proofness which take into account the amount of information individuals can get. We analyse one of those weak versions of strategy-proofness, called PC-strategy-proofness. A social choice function is PC-strategy-proof if it cannot be manipulated by an individual whose information about the preferences of the other members of the society is limited to the knowledge, for every pair of alternatives, of the number of individuals preferring the first alternative to the second one. We show that there are Pareto optimal, PC-strategy-proof and non-dictatorial social choice functions. We also prove that, when at least three alternatives are considered, no Pareto optimal, PC-strategy-proof and anonymous social choice function exists. Generalizing the definition of PC-strategy-proofness, we further propose a special family of weak versions of strategy-proofness which allows us to define a new index for measuring the degree of manipulability of social choice functions. Some simple remarks on that index are discussed.
Date and time: 28 July 8AM GMT
Host: Clemens Puppe
Title: A new scheme for constructing large Condorcet domains
Abstract: We present a new method of constructing Condorcet domains from pairs of Condorcet domains of smaller sizes (the concatenation+shuffle scheme). The concatenation+shuffle scheme provides maximal, connected, copious, peak-pit domains whenever the original domains had these properties. It provides peak-pit maximal Condorcet domains that are larger than those obtained by Fishburn’s alternating scheme for all n≥13 (this was previously known only for n≥40). The paper provides a new lower bound 2.1685^n for the size of the largest peak-pit Condorcet domain with n alternatives, and a new lower bound 2.2021^n for the size of the largest Condorcet domain without any restrictions.
Date and time: 21 July 9AM GMT (e.g., 11am in Paris, 9pm in AKL)
Title: Compromising on Compromise Rules [paper]
Abstract: We propose three mechanisms to reach a compromise between two opposite parties that must choose one out of a set of candidates and operate under full information. All three mechanisms weakly implement the Unanimity Compromise Set. They all rely on the use of some rule of k names, whereby one of the parties proposes a shortlist of k candidates, from which the opposite party selects the one to appoint. The decision regarding which particular rule in the class will be used involves determining who will be the first mover and the size of k. The chosen rule results endogenously from the strategic interaction between the parties, rather than being imposed a priori by any exogenous convention.
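Under full information, the outcome of a fixed rule of k names can be computed by backward induction: the choosing party picks its favourite from the shortlist, so the proposing party offers the k-set whose chooser-best element it likes most. The sketch below does this by brute force; the utility numbers are illustrative only.

```python
from itertools import combinations

def rule_of_k_names(candidates, u_proposer, u_chooser, k):
    """Subgame-perfect outcome: the proposer offers the size-k shortlist whose
    chooser-preferred element the proposer likes most."""
    best = None
    for shortlist in combinations(candidates, k):
        chosen = max(shortlist, key=lambda c: u_chooser[c])
        if best is None or u_proposer[chosen] > u_proposer[best]:
            best = chosen
    return best

candidates = ["a", "b", "c", "d"]
u_proposer = {"a": 4, "b": 3, "c": 2, "d": 1}   # proposer: a > b > c > d
u_chooser = {"a": 1, "b": 2, "c": 3, "d": 4}    # chooser:  d > c > b > a
print(rule_of_k_names(candidates, u_proposer, u_chooser, k=2))   # 'b'
```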
Date and time: 7 July 8AM GMT
Title: Fair Division with Money
Abstract: Agents share indivisible objects (desirable or not) and use cash transfers to achieve fairness. Utilities are linear in money but otherwise arbitrary. We look for n-person division rules preserving the informational simplicity of Divide and Choose or the Texas Shoot Out between two agents, treating agents symmetrically, and offering high individual welfare Guarantees. A single round of bidding for the whole manna is one such method but it does not capture the potential efficiency gains from debundling the objects as in Divide and Choose. Our Bid and Choose rules fix a price vector p for the objects; in each of the n-1 rounds of bidding the winner must also pay for the remaining objects he picks. These rules are simpler than Kuhn’s n-person generalisation of Divide and Choose, and they typically offer better Guarantees. They help agents with subadditive utilities, to the detriment of those with superadditive utilities. The talk is based on joint research with Anna Bogomolnaia.
Date and time: 30 June 8AM GMT
Title: Proportionality and the Limits of Welfarism
Abstract: We study two influential voting rules proposed in the 1890s by Phragmen and Thiele, which elect a committee or parliament of k candidates that proportionally represents the voters. Voters provide their preferences by approving an arbitrary number of candidates. Previous work has proposed proportionality axioms satisfied by Thiele but not Phragmen. By proposing two new proportionality axioms (laminar proportionality and priceability) satisfied by Phragmen but not Thiele, we show that the two rules achieve two distinct forms of proportional representation. Phragmen’s rule ensures that all voters have a similar amount of influence on the choice of the committee, and Thiele’s rule ensures a fair utility distribution. (Thiele’s rule is a welfarist voting rule that maximises a function of voters’ utilities.) We show that no welfarist rule can satisfy our new axiom, and we prove that no such rule can satisfy the core. Conversely, some welfarist fairness properties cannot be guaranteed by Phragmen-type rules. This formalises the difference between the two types of proportionality. We then introduce an attractive committee rule which satisfies a property intermediate between the core and extended justified representation (EJR). It satisfies laminar proportionality and priceability, and is computable in polynomial time. The talk is based on the recent paper: Dominik Peters and Piotr Skowron. Proportionality and the Limits of Welfarism. EC-2020.
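As a concrete instance of the welfarist side, Thiele's rule in its proportional approval voting (PAV) form scores a committee by giving each voter 1 + 1/2 + ... + 1/t when t of her approved candidates are included; a brute-force sketch with an illustrative approval profile:

```python
from itertools import combinations

def pav_score(committee, approvals):
    """Harmonic (PAV) score: a voter with t approved committee members
    contributes 1 + 1/2 + ... + 1/t."""
    score = 0.0
    for approved in approvals:
        t = len(approved & set(committee))
        score += sum(1.0 / i for i in range(1, t + 1))
    return score

def pav_winner(candidates, approvals, k):
    """Brute force over all size-k committees (exponential; illustration only)."""
    return max(combinations(candidates, k), key=lambda w: pav_score(w, approvals))

candidates = ["a", "b", "c", "d"]
approvals = [{"a", "b"}, {"a", "b"}, {"a", "b"}, {"c"}, {"d"}]
print(pav_winner(candidates, approvals, k=2))
```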
Date and time: 23 June 8AM GMT
Title: Participatory Budgeting with Cumulative Votes
Slides of the talk
Recording of the talk
Abstract: In participatory budgeting we are given a set of projects, each with a cost; an integer specifying the available budget; and a set of voters who express their preferences over the projects. The goal is to select, based on voter preferences, a subset of projects whose total cost does not exceed the budget. We propose several aggregation methods based on cumulative votes, i.e., for the setting where each voter is given one coin and specifies how this coin should be split among the projects. We compare our aggregation methods based on (1) axiomatic properties and (2) computer simulations. We identify one method, Minimal Transfers over Costs, that demonstrates particularly desirable behaviour: it significantly improves on existing methods and satisfies a strong notion of proportionality, and thus is promising for use in practice. This is a joint paper with Piotr Skowron and Arkadii Slinko.
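The cumulative-ballot format is easy to picture in code. The sketch below uses a deliberately naive aggregation (sum the coin shares per project, then fund greedily by support per unit of cost); it is not the Minimal Transfers over Costs method from the paper, only an illustration of the input format.

```python
def naive_cumulative_aggregation(costs, ballots, budget):
    """A naive baseline: add up the coin shares each project receives and
    greedily fund projects by support per unit of cost within the budget."""
    support = {project: 0.0 for project in costs}
    for ballot in ballots:                     # each ballot splits one coin
        for project, share in ballot.items():
            support[project] += share
    selected, remaining = [], budget
    for project in sorted(costs, key=lambda p: support[p] / costs[p], reverse=True):
        if support[project] > 0 and costs[project] <= remaining:
            selected.append(project)
            remaining -= costs[project]
    return selected

costs = {"park": 3, "library": 5, "bikelane": 2}
ballots = [{"park": 1.0}, {"library": 0.5, "bikelane": 0.5}, {"bikelane": 1.0}]
print(naive_cumulative_aggregation(costs, ballots, budget=5))   # ['bikelane', 'park']
```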
Date and time: 16 June 8AM GMT
Title: Condorcet super-domains
Abstract: We consider Condorcet domains (CDs) formed by a rhombus tiling on a zonogone Z(n; 2) as voting designs and consider the problem of aggregating voting designs using the majority rule. A Condorcet super-domain is a collection of CDs obtained from rhombus tilings with the property that if voting designs (serving as ballots) belong to this collection, then the simple majority rule does not yield cycles. I will discuss methods of constructing Condorcet super-domains and related problems. The talk is based on a joint paper with Vladimir Danilov and Aleksandre Karzanov (arXiv:2004.08183 [math.CO]).
Date and time: 9 June 8AM GMT
Title: Unanimous Jury Voting with an Ambiguous Likelihood
Recording of the talk
Abstract: We study collective decision-making in a voting game under the unanimity rule, with an ambiguous likelihood and ambiguity-averse voters who are MaxMin Expected Utility maximizers. We characterize the symmetric voting equilibria of this game, demonstrating that ambiguity helps reduce Type I errors: under ambiguity, voters are less likely to vote strategically against their information. Information aggregation improves as a result, and may even be restored to a fully informative equilibrium. We report evidence from a laboratory experiment supporting these predictions. This is joint work with Steffen Lippert, Addison Pan, and Matthew Ryan.
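For reference, a MaxMin Expected Utility voter evaluates an act f by its worst-case expected utility over a set C of priors (the set of likelihoods regarded as possible), with S the state space and u the utility function:

\[
V(f) \;=\; \min_{p \in C} \; \mathbb{E}_{p}\bigl[u(f)\bigr] \;=\; \min_{p \in C} \sum_{s \in S} p(s)\, u\bigl(f(s)\bigr).
\]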
Date and time: 2 June 11AM GMT
Abstract: Recent work by Bilò et al [2019] concerns allocating graph vertices (treated as indivisible objects) so that each share forms a connected subgraph, and so that no agent x envies another’s share “up to one outer good.” They obtain positive results that apply to arbitrarily many agents, but these are limited to Hamiltonian (aka traceable) graphs. What of the non-Hamiltonian case? We show that among topological classes of graphs, any non-Hamiltonian class has an upper bound on the number of agents for which fair shares are guaranteed. On the other hand, for the case of exactly 3 agents, positive results exist for some infinite, non-Hamiltonian graph classes. Our results – positive and negative – are obtained via transfer from related theorems in continuous fair division, but we must go beyond the standard model, which employs the unit interval [0,1] as the continuously divisible “cake.” Instead, we use several copies of [0,1] glued at their endpoints, to form the letter Y, or the figure 8, or the outline of a kiss . . . a “tangle.”
Date and time: 26 May 8AM GMT
Recording of the talk
Abstract: We consider manipulation of social choice correspondences in a preference-approval environment where voters not only rank the alternatives but also evaluate them as acceptable or unacceptable. A social choice correspondence is evaluationwise strategy-proof iff no voter can misrepresent his preference and obtain an outcome which he finds more acceptable than the one that would occur if he had told the truth. As outcomes are irresolute sets of alternatives, our analysis needs to extend the notion of acceptability of alternatives over sets. Under a plausible extension, we show the existence of efficient and evaluationwise strategy-proof social choice correspondences that satisfy either anonymity or neutrality. However, if anonymity and neutrality are jointly imposed, then an impossibility occurs when the number of voters is a multiple of 4. On the other hand, when there are three alternatives and the number of voters is not a multiple of 4, we show the existence of social choice correspondences which are efficient, evaluationwise strategy-proof, anonymous and neutral.
Date and time: 19 May, 8AM GMT
Contributor: Arkadii Slinko
Title: Generalisation and Properties of the Danilov-Karzanov-Koshevoy Construction for Peak-Pit Condorcet Domains
Recording of the talk
Abstract: Danilov, Karzanov and Koshevoy (2012) geometrically introduced an interesting operation of composition on Condorcet domains and, using it, disproved a long-standing conjecture of Fishburn about the maximal size of connected Condorcet domains. We give an algebraic definition of this operation and investigate its properties. We give a precise formula for the cardinality of the composition of two Condorcet domains and improve the Danilov, Karzanov and Koshevoy result, showing that Fishburn’s alternating scheme does not always produce a largest connected Condorcet domain. I will outline some new exciting developments in the search for largest Condorcet domains.