### Workshops & Tutorials (Sunday, June 23, 2019)

**9:00-12:30**
**Data Science Through a Geometric Lens**

Location: West 212C (Organizers: Sanjoy Dasgupta, Cyrus Rashtchian, Ilya Razenshteyn)
**Nash Welfare, Stable Equilibrium, and Stable Polynomials**

Location: West 213A (Organizers: Nima Anari, Jugal Garg, Vasilis Gkatzelis)
**New Frontiers of Automated Mechanism Design for Pricing and Auctions (tutorial)**

Location: West 213B (Organizers: Maria-Florina Balcan, Tuomas Sandholm, Ellen Vitercik)

**2:00-5:00**
**Recent Advances in High Dimensional Robust Statistics**

Location: West 212C (Organizers: Ilias Diakonikolas, Daniel Kane)
**TCS Women Spotlight**

Location: West 213A (Organizers: Sofya Raskhodnikova, Barna Saha, Virginia Vassilevska Williams)
**New Frontiers of Automated Mechanism Design for Pricing and Auctions (workshop)**

Location: West 213B (Organizers: Maria-Florina Balcan, Tuomas Sandholm, Ellen Vitercik)

**Data Science Through a Geometric Lens**

Machine learning and data science applications rely heavily on geometric ideas and algorithms, such as clustering, metric embeddings, sketching, and nearest neighbor search. Unfortunately, classical computational models and worst-case analysis often fail to capture both the nuances of modern datasets and the overwhelming success of many methods in practice. The time is therefore ripe to focus on geometric algorithms with an eye towards relevant use cases. This workshop will introduce participants to both (i) the theoretical achievements in the area and (ii) the current and future applications of these ideas. Several excellent speakers will present their research, which should be of interest to anyone working in the union of theoretical computer science, machine learning, geometry, and statistics.

**Nash Welfare, Stable Equilibrium, and Stable Polynomials**

The last four years have seen a surge of work on the problem of partitioning a set of indivisible items among a group of agents so as to maximize the Nash social welfare (NSW) objective. A feasible partition allocates to each agent some subset of the items, providing that agent some utility, and the NSW objective is the geometric mean of the agents' utilities. This is a fundamental problem in fair division, closely related to the Santa Claus problem, which aims to maximize the minimum utility across all agents and which received a great deal of attention a decade ago. Recent work on the NSW objective has produced a sequence of novel constant-factor approximation algorithms that leverage a range of powerful techniques: from seemingly unrelated market equilibrium notions that define fractional solutions for rounding, to stable polynomial techniques that have also found applications in the construction of Ramanujan graphs, the Kadison-Singer problem, the traveling salesman problem, and many sampling, counting, and optimization problems involving determinants and determinantal distributions. This half-day tutorial will provide an in-depth overview of the key results, the algorithmic techniques, and a plethora of open problems from this exciting research area.
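As a concrete illustration of the objective, here is a minimal sketch of computing the NSW of an allocation under additive utilities. The valuation matrix and allocation are hypothetical examples, not from any of the works covered:

```python
import math

# Hypothetical instance: valuations[i][j] = value agent i has for item j.
valuations = [
    [4.0, 1.0, 2.0],
    [1.0, 5.0, 1.0],
]

def nash_welfare(allocation, valuations):
    """Geometric mean of agents' utilities; allocation[i] is the set of
    item indices given to agent i (additive utilities assumed)."""
    utilities = [sum(valuations[i][j] for j in bundle)
                 for i, bundle in enumerate(allocation)]
    return math.prod(utilities) ** (1.0 / len(utilities))

# Agent 0 gets items {0, 2} (utility 6), agent 1 gets item {1} (utility 5).
print(nash_welfare([{0, 2}, {1}], valuations))  # -> sqrt(30) ≈ 5.477
```

Because the geometric mean is maximized by balanced utilities, the NSW objective trades off efficiency against fairness, unlike the utilitarian sum or the Santa Claus max-min.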

**New Frontiers of Automated Mechanism Design for Pricing and Auctions**

Mechanism design is a field of game theory with significant real-world impact, encompassing areas such as pricing and auction design. Mechanisms are used in sales settings ranging from large-scale internet marketplaces to the US government's radio spectrum reallocation efforts. A powerful and prominent approach in this field is automated mechanism design, which uses optimization and machine learning to design mechanisms from data. This automated approach helps overcome challenges that have stalled traditional, manual mechanism design for decades due to inherent computational complexity: the revenue-maximizing mechanism is not known even for just two items for sale! This workshop focuses on the rapidly growing area of automated mechanism design for revenue maximization, encompassing both the foundations of batch and online learning (including statistical guarantees and optimization procedures) and real-world success stories.

**Recent Advances in High Dimensional Robust Statistics**

One of the major recent advances in theoretical machine learning is the development of efficient learning algorithms for various high-dimensional statistical models. The Achilles' heel of these algorithms is the assumption that the samples are generated exactly from the model. This assumption is crucial to their performance: even a very small fraction of outliers can completely compromise the algorithms' behavior. Recent results have led to the first computationally efficient robust estimators for a range of high-dimensional models. The goal of this tutorial is to introduce the broad TCS community to the core insights and techniques of algorithmic high-dimensional robust statistics, and to discuss new directions and opportunities for future work.
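To see the fragility that motivates this line of work, here is a toy one-dimensional sketch (with invented sample data, not from the tutorial) of how a single adversarial outlier ruins the empirical mean, while the median, a classically robust estimator in one dimension, is barely affected:

```python
# Invented clean samples concentrated near 1.0.
clean = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95]

def mean(xs):
    return sum(xs) / len(xs)

def median(xs):
    s = sorted(xs)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2

corrupted = clean + [1e6]  # one outlier out of seven samples

print(mean(clean))        # 1.0
print(mean(corrupted))    # 142858.0 -- ruined by a single point
print(median(corrupted))  # 1.0 -- essentially unaffected
```

The hard part, which the tutorial addresses, is that in high dimensions no simple coordinate-wise analogue of the median achieves both robustness and computational efficiency.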

**TCS Women Spotlight**

The workshop will feature an inspirational talk by Ronitt Rubinfeld and short Rising Star talks by graduate students and postdocs. The workshop is open to all.