Jin Akiyama,
(Tokyo University of Science), Reversible Figures and Solids. An example of reversible (or hinge inside-out transformable) figures is Dudeney's Haberdasher's puzzle, in which an equilateral triangle is dissected into four pieces, hinged like a chain, and then transformed into a square by rotating the hinged pieces. Furthermore, the entire boundary of each figure goes into the inside of the other figure and becomes the dissection lines of that figure. Many intriguing results on the reversibility of figures have been found in preceding research, but most of them concern polygons. We generalize those results to general connected figures. It is shown that two nets obtained by cutting the surface of an arbitrary convex polyhedron along non-intersecting dissection trees are reversible. Moreover, we generalize reversibility for 2D figures to one for 3D solids.

Allan Borodin, (University of Toronto), Simplicity Is in Vogue (again). Throughout history there has been an appreciation of the importance of simplicity in the arts and sciences. In the context of algorithm design, and in particular in approximation algorithms and algorithmic game theory, the importance of simplicity is currently very much in vogue. I will present some examples of the current interest in the design of "simple algorithms". And what is a simple algorithm? Is it just "you'll know it when you see it", or can we benefit from some precise models in various contexts?

José Correa, (Universidad de Chile), Subgame Perfect Equilibrium: Computation and Efficiency. The concept of Subgame Perfect Equilibrium (SPE) naturally arises in games which are played sequentially. In a simultaneous game the natural solution concept is that of a Nash equilibrium, in which no player has an incentive to unilaterally deviate from her current strategy.
However, if the game is played sequentially, i.e., there is a prescribed order in which the players make their moves, an SPE is a situation in which all players anticipate the full strategy of all other players contingent on the decisions of previous players. Although most research in algorithmic game theory has been devoted to understanding properties of Nash equilibria, including their computation and the so-called price of anarchy, in recent years there has been an interest in understanding the computational properties of SPE and its corresponding efficiency measure, the sequential price of anarchy.

Alan Frieze, (Carnegie Mellon University), Buying Stuff Online. Suppose there is a collection x_{1}, x_{2}, ..., x_{N} of independent uniform [0, 1] random variables, and a hypergraph F of target structures on the vertex set {1, ..., N}. We would like to buy a target structure at small cost, but we do not know all the costs x_{i} ahead of time. Instead, we inspect the random variables x_{i} one at a time, and after each inspection choose either to keep the vertex i at cost x_{i}, or to reject vertex i forever.

Héctor García-Molina, (Stanford University), Data Crowdsourcing: Is It for Real? Crowdsourcing refers to performing a task using human workers that solve sub-problems that arise in the task. In this talk I will give an overview of crowdsourcing, focusing on how crowdsourcing can help traditional data processing and analysis tasks. I will also give a brief overview of some of the crowdsourcing research we have done at the Stanford University InfoLab.
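For the simplest target structure in Frieze's setting, buying a single vertex, a threshold rule already illustrates the online/offline gap. This is a minimal simulation sketch; the threshold log(N)/N and the fallback-to-last-item rule are illustrative choices, not taken from the talk:

```python
import math
import random

def online_single_item(xs, threshold):
    """Inspect the costs one at a time; buy the first one below the
    threshold, falling back to the last item if none qualifies."""
    for x in xs[:-1]:
        if x < threshold:
            return x
    return xs[-1]

def simulate(n=1000, trials=2000, seed=1):
    rng = random.Random(seed)
    online = offline = 0.0
    for _ in range(trials):
        xs = [rng.random() for _ in range(n)]
        online += online_single_item(xs, threshold=math.log(n) / n)
        offline += min(xs)  # an omniscient buyer always pays the minimum
    return online / trials, offline / trials
```

With n = 1000 the online rule pays only a small multiple of the offline minimum (which is about 1/n in expectation), far better than the 1/2 expected cost of committing to a fixed vertex in advance.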
[LATIN 2016]
Ronitt Rubinfeld,
(MIT), Something for Almost Nothing: Advances in Sub-linear Time Algorithms. Linear-time algorithms have long been considered the gold standard of computational efficiency. Indeed, it is hard to imagine doing better than that, since for a nontrivial problem, any algorithm must consider all of the input in order to make a decision. However, as extremely large data sets are pervasive, it is natural to wonder what one can do in sub-linear time. Over the past two decades, several surprising advances have been made on designing such algorithms. We will give a non-exhaustive survey of this emerging area, highlighting recent progress and directions for further research.

Gilles Barthe, (IMDEA Software Institute), Computer-Aided Cryptographic Proofs.

Robert Sedgewick, (Princeton), "If You Can Specify It, You Can Analyze It" - The Lasting Legacy of Philippe Flajolet. The "Flajolet School" of the analysis of algorithms and combinatorial structures is centered on an effective calculus, known as analytic combinatorics, for the development of mathematical models that are sufficiently accurate and precise that they can be validated through scientific experimentation. It is based on the generating function as the central object of study, first as a formal object that can translate a specification into mathematical equations, then as an analytic object whose properties as a function in the complex plane yield the desired quantitative results. Universal laws of sweeping generality can be proven within the framework, and easily applied. Standing on the shoulders of Cauchy, Pólya, de Bruijn, Knuth, and many others, Philippe Flajolet and scores of collaborators developed this theory and demonstrated its effectiveness in a broad range of scientific applications.
Flajolet's legacy is a vibrant field of research that holds the key not just to understanding the properties of algorithms and data structures, but also to understanding the properties of discrete structures that arise as models in all fields of science. This talk will survey Flajolet's story and its implications for future research.

Gonzalo Navarro, (University of Chile), Encoding Data Structures. Classical data structures can be regarded as additional information that is stored on top of the raw data in order to speed up some kind of queries. Some examples are the suffix tree to support pattern matching in a text, the extra structures to support lowest common ancestor queries on a tree, or precomputed shortest-path information on a graph.

J. Ian Munro, (University of Waterloo), Succinct Data Structures. Succinct data structures are data representations that use (nearly) the information-theoretic minimum space for the combinatorial object they represent, while performing the necessary query operations in constant (or nearly constant) time. So, for example, we can represent a binary tree on n nodes in 2n + o(n) bits, rather than the "obvious" 5n or so words, i.e. 5n lg(n) bits. Such a difference in memory requirements can easily translate to major differences in runtime as a consequence of the level of memory in which most of the data resides. The field developed to a large extent because of applications in text indexing, so there has been a major emphasis on trees and a secondary emphasis on graphs in general; but in this talk we will draw attention to a much broader collection of combinatorial structures for which succinct structures have been developed. These will include sets, permutations, functions, partial orders and groups, and yes, a bit on graphs.
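The 2n + o(n)-bit binary tree representation mentioned above can be sketched with a level-order bitmap, a simplified form of Jacobson's encoding: one 1-bit per real node and one 0-bit per absent child slot, for exactly 2n + 1 bits on n nodes. The `Node` class and function names here are illustrative, not from the talk:

```python
from collections import deque

class Node:
    def __init__(self, left=None, right=None):
        self.left, self.right = left, right

def encode(root):
    """Level-order bitmap: emit 1 for each real node (then enqueue both
    of its child slots) and 0 for each absent child. Uses 2n + 1 bits."""
    bits, q = [], deque([root])
    while q:
        node = q.popleft()
        if node is None:
            bits.append(0)
        else:
            bits.append(1)
            q.append(node.left)
            q.append(node.right)
    return bits

def decode(bits):
    """Rebuild the tree by consuming bits in the same level order."""
    it = iter(bits)
    if next(it) == 0:
        return None
    root = Node()
    q = deque([root])
    while q:
        node = q.popleft()
        for side in ('left', 'right'):
            if next(it) == 1:
                child = Node()
                setattr(node, side, child)
                q.append(child)
    return root
```

A real succinct structure would add o(n)-bit rank/select indexes on this bitmap to navigate in constant time; the sketch shows only the space bound.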
[LATIN 2014]
Martin Davis,
(Courant Institute, NYU, USA), Universality is Ubiquitous.

Scott Aaronson, (Massachusetts Institute of Technology, USA), Turing Year Lecture.

Kirk Pruhs, (University of Pittsburgh, USA), Green Computing Algorithmics: Managing Power Heterogeneity.

Marcos Kiwi, (Universidad de Chile, Chile), Combinatorial and Algorithmic Problems Involving (Semi-)Random Sequences.
[LATIN 2012]
Piotr Indyk,
(Massachusetts Institute of Technology, USA), Sparse Recovery Using Sparse Random Matrices.

Cristopher Moore, (University of New Mexico and Santa Fe Institute, USA), Continuous and Discrete Methods in Computer Science.

Sergio Rajsbaum, (Universidad Nacional Autónoma de México, Mexico), Iterated Shared Memory Models.

Leslie Valiant, (Harvard University, USA), Some Observations on Holographic Algorithms.

Ricardo Baeza-Yates, John Brzozowski, Volker Diekert, and Jacques Sakarovitch, (Yahoo Research (Spain), University of Waterloo (Canada), Universität Stuttgart (Germany), and Ecole Nationale Supérieure des Télécommunications (France)), Vignettes on the work of Imre Simon.
[LATIN 2010]
Claudio Lucchesi,
(Unicamp, Brazil), Pfaffian Bipartite Graphs: The Elusive Heawood Graph.

Moni Naor, (Weizmann Institute, Israel), Games, Exchanging Information and Extracting Randomness.

Wojciech Szpankowski, (Purdue University, USA), Tries.

Eva Tardos, (Cornell U., USA), Games in Networks.

Robert Tarjan, (Princeton U., USA), Graph Algorithms.
[LATIN 2008]
Ricardo Baeza-Yates,
(Universidad de Chile & Universitat Pompeu Fabra), Algorithmic Challenges in Web Search Engines. In this paper we present the main algorithmic challenges that large Web search engines face today. These challenges are present in all the modules of a Web retrieval system, ranging from the gathering of the data to be indexed (crawling) to the selection and ordering of the answers to a query (searching and ranking). Most of the challenges are ultimately related to the quality of the answer or the efficiency in obtaining it, although some are relevant even to the existence of search engines: context-based advertising.

Anne Condon, (U. British Columbia), RNA Molecules: Glimpses Through an Algorithmic Lens. Dubbed the "architects of eukaryotic complexity", RNA molecules are increasingly in the spotlight, in recognition of the important catalytic and regulatory roles they play in our cells and their promise in therapeutics. Our goal is to describe the ways in which algorithms can help shape our understanding of RNA structure and function.

Ferran Hurtado, (Universitat Politècnica de Catalunya), Squares. In this talk we present several results and open problems having squares, the basic geometric entity, as a common thread. These results have been gathered from various papers; coauthors and precise references are given in the descriptions that follow.

R. Ravi, (Carnegie Mellon University), Matching Based Augmentations for Approximating Connectivity Problems. We describe a very simple idea for designing approximation algorithms for connectivity problems: for a spanning tree problem, the idea is to start with the empty set of edges, and add matching paths between pairs of components in the current graph that have desirable properties in terms of the objective function of the spanning tree problem being solved.
Such matchings augment the solution by reducing the number of connected components to roughly half their original number, resulting in a logarithmic number of matching iterations. A logarithmic performance ratio results for the problem by appropriately bounding the contribution of each matching to the objective function by that of an optimal solution.

Madhu Sudan, (MIT), Modelling Errors and Recovery for Communication. The theory of error-correction has had two divergent schools of thought, going back to the works of Shannon and Hamming. In the Shannon school, error is presumed to have been effected probabilistically. In the Hamming school, the error is modeled as effected by an all-powerful adversary. The two schools lead to drastically different limits. In the Shannon model, a binary channel with error-rate close to, but less than, 50% is usable for effective communication. In the Hamming model, a binary channel with an error-rate of more than 25% prohibits unique recovery of the message.

Sergio Verdú, (Princeton), Lossless Data Compression via Error Correction. This plenary talk gives an overview of recent joint work with G. Caire and S. Shamai on the use of linear error-correcting codes for lossless data compression, joint source/channel coding and interactive data exchange.

Avi Wigderson, (IAS), The Power and Weakness of Randomness in Computation. Humanity has grappled with the meaning and utility of randomness for centuries. Research in the Theory of Computation in the last thirty years has enriched this study considerably. We describe two main aspects of this research on randomness, demonstrating its power and weakness respectively.
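The component-halving engine Ravi describes is the same one that drives Borůvka's minimum spanning tree algorithm, which makes a convenient concrete sketch (this is plain Borůvka, not Ravi's matching-based augmentation itself): in each round every component selects its cheapest outgoing edge, the number of components at least halves, and O(log n) rounds suffice.

```python
def boruvka_mst(n, edges):
    """edges: list of (weight, u, v) on vertices 0..n-1. Each round,
    every component picks its cheapest outgoing edge; merging along
    these edges at least halves the component count."""
    parent = list(range(n))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, components = [], n
    while components > 1:
        best = {}  # component root -> cheapest outgoing edge
        for w, u, v in edges:
            ru, rv = find(u), find(v)
            if ru == rv:
                continue
            for r in (ru, rv):
                if r not in best or w < best[r][0]:
                    best[r] = (w, u, v)
        if not best:
            break  # graph is disconnected
        for w, u, v in best.values():
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                mst.append((w, u, v))
                components -= 1
    return mst
```

Ravi's scheme replaces "cheapest outgoing edge" with a matching of paths chosen for the objective at hand, but the logarithmic round count is bounded by the same halving argument.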
[LATIN 2006]
Cynthia Dwork,
(Microsoft Research), Fighting Spam: The Science.

Mike Paterson, (U. of Warwick), Analysis of Scheduling Algorithms for Proportionate Fairness. We consider a multiprocessor operating system in which each current job is guaranteed a given proportion over time of the total processor capacity. A scheduling algorithm allocates units of processor time to appropriate jobs at each time step. We measure the goodness of such a scheduler by the maximum amount by which the cumulative processor time for any job ever falls below the "fair" proportion guaranteed in the long term.

Yoshiharu Kohayakawa, (U. São Paulo), Advances in the Regularity Method.

Jean-Eric Pin, (CNRS/U. Paris VII), The Consequences of Imre Simon's Work in the Theory of Automata, Languages and Semigroups. In this lecture, I will show how influential the work of Imre has been in the theory of automata, languages and semigroups. I will mainly focus on two celebrated problems: the restricted star-height problem (solved) and the decidability of the dot-depth hierarchy (still open). These two problems led to surprising developments and are currently the topic of very active research.

Dexter Kozen, (Cornell U.), Kleene Algebra with Tests and the Static Analysis of Programs. I will propose a general framework for the static analysis of programs based on Kleene algebra with tests (KAT). I will show how KAT can be used to statically verify compliance with safety policies specified by security automata. The method is sound and complete over relational interpretations. I will illustrate the method on an example involving the correctness of a device driver.
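Paterson's goodness measure can be made concrete with a greedy single-processor sketch: at each step, run the job whose cumulative fair share most exceeds the time it has actually received, and track the worst shortfall ever seen. The weights below are illustrative, and this greedy rule is a simplification, not the schedulers analyzed in the talk:

```python
def schedule(weights, steps):
    """At each time step, run the job with the largest lag: its guaranteed
    proportion of the elapsed time minus the units it has received.
    Returns the final allocation and the worst lag ever observed."""
    total = sum(weights)
    received = [0] * len(weights)
    worst_lag = 0.0
    for t in range(1, steps + 1):
        lag = [w / total * t - r for w, r in zip(weights, received)]
        i = max(range(len(weights)), key=lambda j: lag[j])
        received[i] += 1  # this step goes to the most-behind job
        worst_lag = max(worst_lag,
                        max(w / total * t - r
                            for w, r in zip(weights, received)))
    return received, worst_lag
```

For this greedy rule on one processor, the lag of every job stays bounded by a constant (below one unit of processor time), which is exactly the kind of guarantee the talk quantifies for more realistic schedulers.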
[LATIN 2004]
Jennifer T. Chayes,
(Microsoft Research), Phase Transitions in Computer Science. Phase transitions are familiar phenomena in physical systems. But they also occur in many probabilistic and combinatorial models, including random versions of some classic problems in theoretical computer science.

Christos H. Papadimitriou, (U. California, Berkeley), The Internet, the Web, and Algorithms. The Internet and the worldwide web, unlike all other computational artifacts, were not deliberately designed by a single entity, but emerged from the complex interactions of many. As a result, they must be approached very much the same way that cells, galaxies or markets are studied in other sciences: by speculative (and falsifiable) theories trying to explain how selfish algorithmic actions could have led to what we observe. I present several instances of recent work on this theme, with several collaborators.

Joel Spencer, (Courant Institute), Erdős Magic. The Probabilistic Method is a lasting legacy of the late Paul Erdős. We give two examples, both problems first formulated by Erdős in the 1960s, with new results in the last few years and both with substantial open questions. Further, in both examples we take a Computer Science vantage point, creating a probabilistic algorithm to create the object (coloring and packing, respectively) and showing that with positive probability the created object has the desired properties.

Jorge Urrutia, (UNAM), Open Problems in Computational Geometry. In this paper we present a collection of problems which have defied solution for some time. We hope that this paper will stimulate renewed interest in these problems, leading to solutions to at least some of them.

Umesh V. Vazirani, (U. California, Berkeley), Quantum Algorithms.

Mihalis Yannakakis, (Avaya Laboratories), Testing and Checking of Finite State Systems.
Finite state machines have been used to model a wide variety of systems, including sequential circuits, communication protocols, and other types of reactive systems, i.e., systems that interact with their environment. In testing problems we are given a system, which we may test by providing inputs and observing the outputs produced.
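Spencer's recipe of "create the object randomly, show it works with positive probability" can be sketched with the classic Property B argument for hypergraph 2-coloring: if every edge has k vertices and there are fewer than 2^(k-1) edges, a uniformly random 2-coloring leaves every edge bichromatic with positive probability (by the union bound), so resampling until success is a correct randomized algorithm. The example hypergraph in the test is illustrative:

```python
import random

def two_color(edges, rng):
    """Resample uniform 2-colorings until no edge is monochromatic.
    With m edges, each of size k, and m < 2**(k-1), each attempt fails
    with probability at most m * 2**(1-k) < 1, so the expected number
    of attempts is a constant."""
    vertices = sorted({v for e in edges for v in e})
    while True:
        color = {v: rng.randint(0, 1) for v in vertices}
        # an edge is "good" when both colors appear on it
        if all(len({color[v] for v in e}) == 2 for e in edges):
            return color
```

The union bound proves existence non-constructively; turning the same calculation into an expected-constant-time resampling loop is precisely the computer-science vantage point the abstract mentions.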
[LATIN 2002]
Allan Borodin,
(U. Toronto), On the Competitive Theory and Practice of Portfolio Selection.

Philippe Flajolet, (INRIA), .

Joachim von zur Gathen, (U. Paderborn), Subresultants Revisited.

Yoshiharu Kohayakawa, (U. São Paulo), Algorithmic Aspects of Regularity.

Andrew Odlyzko, (AT&T Labs), Integer Factorization and Discrete Logarithms.

Prabhakar Raghavan, (IBM Almaden), Graph Structure of the Web: A Survey.
[LATIN 2000]
Noga Alon,
(Tel Aviv Univ.), Spectral Techniques in Graph Algorithms.

Richard Beigel, (Lehigh Univ.), The Geometry of Browsing.

Gilles Brassard, (Univ. of Montréal), Quantum Cryptanalysis of Hash and Claw-Free Functions.

Herbert Edelsbrunner, (Univ. of Illinois, Urbana-Champaign), Shape Reconstruction with Delaunay Complex.

Juan A. Garay, (IBM, Yorktown Heights), Batch Verification with Applications to Cryptography and Checking.
[LATIN 1998]
Josep Diaz,
NC Approximations.

Isaac Scherson, Load Balancing in Distributed Systems.

J. Ian Munro, Fast, Space Efficient Data Structures.

Alberto Apostolico, The String Statistics Problem.

Mike Waterman, Probability Distributions for Sequence Alignment Scores.
[LATIN 1995]
Jean-Paul Allouche,
(CNRS), q-Regular Sequences and Other Generalizations of q-Automatic Sequences.

Manuel Blum, (U. California, Berkeley), Universal Statistical Tests.

Kosaburo Hashiguchi, (Toyohashi U. of Technology), The Double Reconstruction Conjectures about Colored Hypergraphs and Colored Directed Graphs.

Erich Kaltofen, (Rensselaer Polytechnic Institute), Polynomial Factorization 1987-1991.

Arjen K. Lenstra, (Bellcore), Massively Parallel Computing and Factoring.

Gene Myers, (U. of Arizona), Approximate Matching of Network Expressions with Spacers.

Jean-Eric Pin, (Bull), On Reversible Automata.

Vaughan Pratt, (Stanford U.), Arithmetic + Logic + Geometry = Concurrency.

Daniel D. Sleator, (Carnegie Mellon U.), Data Structures and Terminating Petri Nets.

Michel Cosnard, (Ecole Normale Supérieure de Lyon), Complexity Issues in Neural Network Computations.
[LATIN 1992]