This problem was open until last year but has recently been solved twice. The first solution gives a competitive algorithm, while the second gives a nearly optimal competitive algorithm. The latter result is based on the Steiner point, which is the exact optimal solution to a related geometric problem called Lipschitz selection. In the talk, I will briefly outline the first solution and fully explain the second.

The goal of the school is to educate top international theory PhD students about exciting recent developments in the field.

The winter school will be held in Zinal, a mountain village in the Swiss Alps that has a long tradition of hosting academic workshops and that allows for nice excursions and stimulating discussions in a relaxed atmosphere.


The application deadline is November 15th, and acceptance notifications will be sent by December 1st. Attendance of the winter school is free of charge and includes room and board (shared rooms).

In this paper we study the feasibility of robust learning from the perspective of computational learning theory, considering both sample and computational complexity. In particular, our definition of robust learnability requires polynomial sample complexity. We start with two negative results.

We show that no non-trivial concept class can be robustly learned in the distribution-free setting against an adversary who can perturb just a single input bit. Finally, we provide a simple proof of the computational hardness of robust learning on the Boolean hypercube. Unlike previous results of this nature, our result does not rely on hardness assumptions from another computational model.

These are the first graph-based codes shown to have this property. Previously, the only codes known to achieve list-decoding capacity were completely random codes, random linear codes, and codes constructed by algebraic rather than combinatorial techniques.

This result opens up a potential avenue towards truly linear-time list-decodable codes which achieve list-decoding capacity.

Our result on list decoding follows from a much more general result: any local property satisfied with high probability by a random linear code is also satisfied with high probability by a random LDPC code from Gallager's distribution. Local properties are properties characterized by the exclusion of small sets of codewords, and include list-decoding, list-recovery and average-radius list-decoding. Along the way, we give a characterization of sets of codewords that are likely to appear in a random linear code, which may be of independent interest.

John Robson has worked on various problems including what is still the best result on separating words, the topic we discussed the other day. Ken first knew him for his proof that checkers is EXPTIME-complete and for similar hardness results for chess and Go. Today I want to talk about his theorem that any two words can be separated by an automaton with relatively few states. In his famous paper, he proved an upper bound on the Separating Word Problem.

This is the question: Given two distinct strings x and y, how many states does a deterministic automaton need to accept x and reject y? His theorem is: if x and y are distinct strings of length n, then there is an automaton with at most O(n^{2/5} log^{3/5} n) states that accepts x and rejects y. The story of his result is involved.

For starters, it is still the best upper bound after almost three decades. Another issue is that a web search does not quickly, at least for me, find a PDF of the original paper. I tried to find it and could not. More recent papers on the separating word problem reference his paper, but they do not explain how he proves it. Recall the problem of separating words: Given two distinct words of length n, is there a deterministic finite automaton that accepts one and rejects the other?

And the machine should have as few states as possible. Thus his theorem shows that, roughly, the number of states grows at most like the square root of n. I did finally track the paper down. The trouble for me is that the paper is encrypted. Well, not exactly, but the version I did find is a poor copy of the original.

His proof uses two main ingredients. The first is some basic string-ology; that is, he uses some basic facts about strings. For example, he uses the fact that a non-periodic string cannot overlap itself too much.
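The self-overlap fact is easy to experiment with. A minimal sketch (the function names are mine, not Robson's): the longest proper border of a string, i.e., the longest prefix that is also a suffix, measures how far a string can overlap itself, and the shortest period of a string is its length minus that border. A string whose shortest period is at least half its length can therefore overlap itself by at most half its length.

```python
def longest_self_overlap(s):
    # Longest proper prefix of s that is also a suffix of s (a "border"):
    # the maximum amount by which two copies of s can overlap.
    for k in range(len(s) - 1, 0, -1):
        if s[:k] == s[-k:]:
            return k
    return 0

def shortest_period(s):
    # Standard fact: shortest period = length minus longest border.
    return len(s) - longest_self_overlap(s)

# "abaab" has shortest period 3 >= 5/2, so it is not periodic, and indeed
# it overlaps itself by only 2 characters, less than half its length.
print(longest_self_overlap("abaab"), shortest_period("abaab"))    # 2 3
# "ababab" is periodic (period 2) and overlaps itself by 4 characters.
print(longest_self_overlap("ababab"), shortest_period("ababab"))  # 4 2
```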


He also uses a clever trick for simulating two deterministic machines for the price of one. This in general is not possible, and is related to deep questions about automata that we have discussed before here. Robson shows that it can be done in a special but important case. Let me explain. Suppose that w is a string. We can easily design an automaton that accepts an input if and only if the input is exactly w.

The machine will have on the order of the length of w many states. So far quite simple.
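Such a machine is just a chain of states plus one dead state. A minimal sketch, simulated in Python rather than written out as a transition table (the names are mine):

```python
def exact_match_dfa(w):
    """Return a recognizer for the language {w}: a chain of len(w)+1
    states plus one dead state, so order-of-len(w) states in total."""
    dead = len(w) + 1

    def accepts(s):
        state = 0  # state i means "the input so far is exactly w[:i]"
        for c in s:
            if state < len(w) and c == w[state]:
                state += 1
            else:
                state = dead  # any deviation can never be repaired
        return state == len(w)

    return accepts

m = exact_match_dfa("abba")
print(m("abba"), m("abb"), m("abbab"))  # True False False
```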


Now suppose that we have a string x of length n and wish to find a particular occurrence of a pattern w in x. We assume that there are k occurrences of w in x. The task is to construct an automaton that accepts exactly at the end of the j-th copy of w. Robson shows that this can be done by an automaton with far fewer states than the obvious construction requires.
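To make the task concrete, here is a sketch (my code, not Robson's construction) that scans the text with the Knuth-Morris-Pratt table and accepts exactly when the j-th occurrence of the pattern ends. As software this needs only an extra counter variable, but as a DFA that counter multiplies the state count, which is exactly the blow-up Robson's trick is designed to avoid.

```python
def prefix_function(p):
    # pi[i] = length of the longest proper border of p[:i+1] (KMP table)
    pi = [0] * len(p)
    k = 0
    for i in range(1, len(p)):
        while k > 0 and p[i] != p[k]:
            k = pi[k - 1]
        if p[i] == p[k]:
            k += 1
        pi[i] = k
    return pi

def accepts_at_jth_occurrence(pattern, j, text):
    """True iff the j-th occurrence of pattern ends at the last letter of text."""
    pi = prefix_function(pattern)
    q = seen = 0
    for i, c in enumerate(text):
        while q > 0 and (q == len(pattern) or pattern[q] != c):
            q = pi[q - 1]
        if q < len(pattern) and pattern[q] == c:
            q += 1
        if q == len(pattern):
            seen += 1
            if seen == j:
                return i == len(text) - 1
    return False

print(accepts_at_jth_occurrence("ab", 2, "abab"))  # True
print(accepts_at_jth_occurrence("ab", 1, "abab"))  # False
```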


Here n is the length of the string x. This is a simple, clever, and quite useful observation. Clever indeed. The obvious automaton that can do this would seem to require a Cartesian product of two machines, one to match the pattern and one to count, and hence roughly the product of their numbers of states. The goal is to see if there is a copy of this string starting at a position that is a multiple of three. The machine starts in its start state and tries to find the correct string as input.

If it does, then it reaches the accepting state.

If while doing this it gets a wrong input, then it switches to states that have stopped looking for the input. After seeing three inputs the machine reaches the last of these states and then moves back to the start state.

Lemma 2. Suppose i and j are distinct and both at most n. Then all but at most log_2 n primes p satisfy i mod p ≠ j mod p. Proof: Consider the quantity i - j for i not equal to j. Call a prime bad if it divides this quantity.

Since its absolute value is at most n, this quantity can be divisible by at most log_2 n primes. So there are at most log_2 n bad primes in total.
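The counting step is easy to check numerically. A small sketch (my code, with illustrative numbers I chose): factor |i - j| by trial division and compare the number of distinct prime divisors against log_2 n.

```python
import math

def bad_primes(i, j):
    # Distinct primes dividing |i - j|, found by trial division.
    d = abs(i - j)
    primes = []
    p = 2
    while p * p <= d:
        if d % p == 0:
            primes.append(p)
            while d % p == 0:
                d //= p
        p += 1
    if d > 1:
        primes.append(d)
    return primes

# 720720 = 2^4 * 3^2 * 5 * 7 * 11 * 13 has 6 distinct prime divisors,
# comfortably below log2(720722), which is about 19.5.
i, j = 720722, 2
print(bad_primes(i, j))                       # [2, 3, 5, 7, 11, 13]
print(len(bad_primes(i, j)) <= math.log2(i))  # True
```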


We need some definitions about strings. Let n be the length of the string x. Also let k be the number of occurrences of w in x. A string has period p provided each of its characters equals the character p positions later, wherever both are defined. A string is periodic provided it has a period that is less than half its length.

Lemma 3. For any string x, either x0 or x1 is not periodic. Proof: Suppose that xb is periodic with period p, where b is a single character.

Let the length of x equal n.


So by definition, b equals the character of x at position n + 1 - p. So it follows that if x0 had period p and x1 had period q, then position n + 1 - p of x would hold a 0 while position n + 1 - q would hold a 1. This shows that x0 and x1 cannot both be periodic, since both p and q would be less than half of n + 1, and then x would have a common period dividing both, forcing those two positions to carry the same character, a contradiction.

Lemma 4. Suppose that x is not a periodic string.
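Lemma 3 is also easy to verify exhaustively for small lengths. A minimal sketch using the definitions above, where a period p means every character equals the one p positions later and periodic means some period is below half the length:

```python
from itertools import product

def shortest_period(s):
    # Smallest p >= 1 with s[i] == s[i + p] for all valid i.
    n = len(s)
    for p in range(1, n + 1):
        if all(s[i] == s[i + p] for i in range(n - p)):
            return p
    return n

def is_periodic(s):
    return shortest_period(s) < len(s) / 2

# For every binary string x up to length 12, at least one of x0, x1
# is not periodic, exactly as Lemma 3 claims.
for n in range(1, 13):
    for bits in product("01", repeat=n):
        x = "".join(bits)
        assert not (is_periodic(x + "0") and is_periodic(x + "1"))
print("Lemma 3 verified for all binary strings of length <= 12")
```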