
191014K02: Randomized Methods for Approximation and Parameterized Algorithms

A GIAN Course by Prof Daniel Lokshtanov

December 5–9, 2022
About the Course

Most computational problems that model real-world issues are not known to admit efficient algorithms that are provably correct on all inputs. Many of these problems are equivalent, under reductions, to the classical NP-complete problems, which are widely believed not to admit efficient algorithms; whether they do is a fundamental open problem in computer science. Since these problems are unlikely to be solved efficiently any time soon, computer scientists have, over the last few decades, developed several “workarounds” to cope with NP-hardness.

Two fundamental approaches in this program are approximation and fixed-parameter tractability. An approximation algorithm is a way of dealing with NP-completeness for optimization problems: it does not guarantee the best solution, but aims to come as close as possible to the optimum value in polynomial time. Parameterized algorithms, on the other hand, aim to confine the exponential blow-up to an identified parameter of the problem, leading to efficient exact algorithms whenever that parameter is reasonably small. In recent times, there has also been substantial research that involves an interplay of techniques from both approaches.
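
To make the first idea concrete, here is a small illustrative sketch (in Python, not taken from the course materials) of the classical maximal-matching based 2-approximation for Vertex Cover, one of the problems treated in the lectures: it runs in linear time and always outputs a cover at most twice as large as the optimum.

```python
# Illustrative sketch: the classical 2-approximation for Vertex Cover.
# Greedily pick uncovered edges and add both endpoints; the chosen edges form a
# maximal matching, and any optimal cover must contain at least one endpoint of
# each of them, so the output is at most twice the optimum size.

def vertex_cover_2_approx(edges):
    """edges: iterable of (u, v) pairs; returns a vertex cover of size at most 2*OPT."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# Example: on the path 1-2-3-4 the optimum cover {2, 3} has size 2;
# the sketch returns a cover of size at most 4.
print(vertex_cover_2_approx([(1, 2), (2, 3), (3, 4)]))
```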

All paradigms of algorithm design, including efficient polynomial time algorithms as well as the methods of approximation and parameterization discussed above, are substantially more powerful when combined with techniques based on randomness. Carefully employed, randomization leads to approaches that are faster and easier to implement than their deterministic counterparts, making them particularly well-suited to practice.
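
As one standard illustration of this (again only a sketch, not code from the course), the simplest randomized algorithm for Max Cut, a problem that appears in the first lecture, places each vertex on a uniformly random side of the cut; every edge is then cut with probability 1/2, so the expected number of cut edges is at least half the optimum.

```python
import random

# Illustrative sketch: the textbook randomized 1/2-approximation for Max Cut.
# Each vertex is assigned to a random side; an edge is cut exactly when its two
# endpoints land on different sides, which happens with probability 1/2, so the
# expected size of the cut is |E|/2 >= OPT/2.

def randomized_max_cut(vertices, edges, rng=None):
    rng = rng or random.Random()
    side = {v: rng.random() < 0.5 for v in vertices}
    cut_size = sum(1 for u, v in edges if side[u] != side[v])
    return side, cut_size

# Example: a 4-cycle is bipartite, so the optimum cuts all 4 edges;
# a single random assignment cuts 2 of them in expectation.
cycle = [(1, 2), (2, 3), (3, 4), (4, 1)]
print(randomized_max_cut([1, 2, 3, 4], cycle, random.Random(0)))
```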

Over the last two decades, sophisticated probabilistic techniques have been developed for a broad range of challenging computing applications. To begin with, this course will introduce the basic probabilistic techniques used in the design of randomized algorithms and in probabilistic analysis of algorithms. The course covers the basic probability theory required for working with these techniques and demonstrates their use in various computing applications, especially in the context of parameterized and approximation algorithms.

This course will demonstrate the algorithmic techniques in the context of a variety of combinatorial optimization problems that have significant real-world applications. These include: Longest Path, Minimum Cut, Maximum Cut, Clustering, Vertex Cover, Feedback Vertex Set, and Closest String.
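
For instance, Minimum Cut admits a strikingly simple randomized algorithm: Karger's contraction algorithm repeatedly merges the endpoints of a uniformly random edge until only two supervertices remain. The sketch below (illustrative only, and assuming a connected undirected graph given as a list of edges) shows a single contraction run; repeating it sufficiently many times and keeping the smallest cut found yields the minimum cut with high probability.

```python
import random

# Illustrative sketch: one run of Karger's random-contraction algorithm for
# global Minimum Cut. Repeating the run many times (roughly n^2 log n times)
# and keeping the smallest cut found succeeds with high probability.

def karger_one_run(edges, rng):
    """edges: list of (u, v) pairs of a connected undirected graph; returns one cut size."""
    parent = {}                       # maps a merged vertex to its representative

    def find(x):                      # follow merge pointers to the current supervertex
        while x in parent:
            x = parent[x]
        return x

    remaining = len({v for e in edges for v in e})
    crossing = list(edges)            # edges whose endpoints lie in different supervertices
    while remaining > 2:
        u, v = rng.choice(crossing)   # contract a uniformly random crossing edge
        parent[find(u)] = find(v)
        remaining -= 1
        crossing = [e for e in crossing if find(e[0]) != find(e[1])]
    return len(crossing)              # edges across the final two supervertices

# Example: a triangle with one pendant edge has minimum cut 1; take the best of 20 runs.
graph = [(1, 2), (2, 3), (1, 3), (3, 4)]
print(min(karger_one_run(graph, random.Random(seed)) for seed in range(20)))
```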

Target Audience

Anyone who is biting their nails over the NP-completeness cliffhanger at the end of their introduction to algorithms will probably enjoy this course. The course is open to students, postdocs, faculty, industry professionals, and anyone who is interested and is confident about the prerequisites listed below.

Prerequisites

This is a theoretical course that will require mathematical maturity (in particular, the ability to understand and write formal mathematical proofs), and some background in the design and analysis of algorithms.

Probability Prerequisites

Discrete probability spaces • Events • Random variables • Independence (of events, of random variables) • Conditional probability • Expectation of a random variable • Linearity of Expectation • Conditional Expectation (of a random variable on an event, and on another random variable) • Binomial coefficients (Pascal’s Triangle) • Bernoulli, Binomial, and Geometric random variables.

General Prerequisites

Correctness proofs for algorithms • Paradigms: Greedy, DP, Divide and Conquer • Big-Oh and asymptotic runtime analysis • Formulating and solving recurrences • P, NP, NP-completeness

Math Prerequisites

Linear algebra (Matrices, vectors, rank, basis, linear independence)

References
  1. The Design of Approximation Algorithms • David P. Williamson and David B. Shmoys
  2. Parameterized Algorithms • Marek Cygan, Fedor V. Fomin, Lukasz Kowalik, Daniel Lokshtanov, Daniel Marx, Marcin Pilipczuk, Michal Pilipczuk, and Saket Saurabh
  3. Randomized Algorithms • Rajeev Motwani and Prabhakar Raghavan
  4. Beyond the Worst-Case Analysis of Algorithms • Tim Roughgarden
  5. Algorithms • Jeff Erickson
Logistics
  • 5th December 2022: Registration + Coffee: 9AM to 10AM | Outside 1/101

  • 5th December 2022: Inaugural Event: 10AM | 1/101

    1. Address by Prof. Rajat Moona
      (Director, IITGN)
    2. Address by Prof. Anirban Dasgupta
      (Discipline Coordinator, Computer Science and Engineering, IITGN and Local GIAN Coordinator for IITGN)
    3. Address by Prof. Saket Saurabh
      (Professor, The Institute of Mathematical Sciences)
  • Lectures: Mon - Thu • 11AM — 12:30PM • 2:30PM — 4PM | Fri • 11AM — 12:30PM

  • Tutorial: Mon - Thu • 4:30PM — 6PM | Fri • 2:30PM — 4PM

  • Venue: Mon - Wed: 1/101 • Thu: 7/101 • Fri: 7/102


  • Zoom links and invitations to a WhatsApp group and Discord server were sent out to all registered participants.

  • Register here. Registrations are now closed. You can follow along on YouTube if you missed registering!


About the Instructor: Daniel Lokshtanov

Daniel Lokshtanov is a Professor at the Department of Computer Science at the University of California, Santa Barbara; before that, he was a Professor at the Department of Informatics at the University of Bergen. He received his PhD in Computer Science from the University of Bergen in 2009, and spent two years (2010-2012) as a Simons Postdoctoral Fellow at the University of California, San Diego.

His research interests span a wide area of algorithms, and he has made several fundamental contributions to exact exponential algorithms, parameterized and fine-grained algorithms, and approximation algorithms. He was awarded the Meltzer Prize for Young Researchers for his work at the University of Bergen, and is a recipient of a Bergen Research Foundation young researcher grant and an ERC Starting Grant on parameterized algorithms. He is a co-author of two texts, Kernelization (Cambridge University Press, 2019) and Parameterized Algorithms (Springer, 2015).

  • Lectures
  • Tutorials
  • Memories
  • Feedback
| Date | Lecture | Recap |
|---|---|---|
| 05 Dec, 2022 | 1. Introduction | Randomized Algorithms • MaxCut • MinCut |
| 05 Dec, 2022 | 2. Linear Programs | Algorithm: Vertex Cover LP • 2-Approximation by Deterministic Rounding • Algorithm: Set Cover • Randomized Rounding for an O(log n) approximation |
| 06 Dec, 2022 | 3. Vertex Cover and FVS | Randomized 2-approximation algorithm and an exact exponential algorithm for Vertex Cover • Randomized 4-approximation algorithm and an exact exponential algorithm for FVS • Comments on F-Deletion |
| 06 Dec, 2022 | 4. Color Coding | Color Coding (k-Path) • Chernoff Bounds • Approximate counting of k-paths • Chromatic Coding (Feedback Arc Set on Tournaments and d-Clustering) |
| 07 Dec, 2022 | 5. Closest String and Max Cut | (1+ε)-approximation for Closest String by Randomized Rounding and Local Search • Max Cut on Dense Graphs |
| 07 Dec, 2022 | 6. Max Cut and SDP | Max Cut on Dense Graphs (continued) • Introduction to Semi-Definite Programs • 0.87-approximation for Max Cut on General Graphs |
| 08 Dec, 2022 | 7. Algebraic Techniques | Isolation Lemma • Cut and Count |
| 08 Dec, 2022 | 8. Derandomization | Method of Conditional Expectation • Splitters |
| 09 Dec, 2022 | 9. Glimpses of Exact Algorithms | 1.333^n algorithm for 3-SAT • 2-approximation for Tournament Feedback Vertex Set |
| Issued | Assessment |
|---|---|
| 05 Dec, 2022 | Tutorial 1 |
| 06 Dec, 2022 | Tutorial 2 |
| 07 Dec, 2022 | Tutorial 3 |
| 08 Dec, 2022 | Tutorial 4 |
| 09 Dec, 2022 | Tutorial 5 |


Content by Neeldhara Misra