INFO-H-422 Information, Coding, Computing and Complexity Theory

Exam June 2023 - Nicolas Cerf's part - 10 Jun 2023

Here is a short list of the questions asked to different people during the session. Each time there were two main questions, with sub-questions along the way.

- Q1 : Explain what a BSC is and compute its capacity, then generalize (explain what a symmetric channel is, not necessarily binary). As intermediate questions, he first asked me why the capacity of an erasure channel is higher than that of a BSC when both the flip probability (BSC) and the erasure probability (erasure channel) are 0.1. He asked me to draw both channel diagrams to compare, and then to explain. It is because when an erasure occurs we know where it occurred, so we keep more information, whereas in the case of a bit flip we don't know which bit was flipped. Second, he asked what the capacity would be if we concatenated two BSCs. The answer was the proof of the data processing inequality, shown in the exercise sessions (TP).
- Q2 : Explain what an ECC is (in a few sentences). Explain in detail what a cyclic code is. As intermediate questions, he asked me why cyclic codes are so popular in practice. It is because in practice we work with large n (very long codewords): Shannon's second theorem implies that the larger n is, the smaller the error. But if we work with huge n, it becomes very hard to construct Hamming-type parity-check matrices whose columns are all distinct from each other. (He also asked me why the columns of the Hamming matrix must be distinct.) The cyclic code is a very good recipe to build huge matrices, for huge n, whose columns are all distinct.
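For reference, the capacity comparison in Q1 above works out as follows at parameter 0.1 (standard formulas, not something extra from the exam; h is the binary entropy function):

```latex
% Capacities at parameter 0.1, with h the binary entropy function:
\begin{align}
  C_{\mathrm{BSC}} &= 1 - h(0.1)
                    = 1 + 0.1\log_2 0.1 + 0.9\log_2 0.9 \approx 0.53 ,\\
  C_{\mathrm{BEC}} &= 1 - 0.1 = 0.9 .
\end{align}
```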

- Q1 : Huffman code (prove optimality) and Q2 : Hamming code (bound on the number of parity bits)

- Q1 : What an instantaneous code is + the Kraft inequality. Q2 : Prove Shannon's second theorem

- Q1 : Explain error-correcting codes, the ideal decoder, the code distance and its usefulness. Q2 : Prove the lower bound for instantaneous codes

- Q1 : instantaneous codes and Kraft's inequality. Q2 : informational and operational capacity, Shannon's second theorem and its converse. And finally the data processing inequality (to explain and prove)

- Q1 : explain the notions and definitions of typicality and its applications. Q2 : compute the capacity of a binary erasure channel

Cerf's part: Typicality, Hamming Bound - 20 Jun 2022

# Main question : Typicality, basically the whole chapter
- Define what a typical sequence is
- Prove Shannon's first theorem using this notion
Intermediary questions, depending on what I said :
- Why do we say that, for a binary source, a typical sequence has on average np zeros and n(1-p) ones? (see lecture 1, in the introduction, lol)
- Okay, this theorem shows we can get close to the entropy, but can we go below the entropy? Can we compress more? This is essentially the content of the converse of Shannon's first theorem, proven in the chapter on data compression: L - H >= 0, using the relative entropy and KRAFT'S INEQUALITY. This is actually the key to the proof. Kraft's inequality applies not only to instantaneous codes but also to uniquely decodable codes, etc. Kraft's inequality is at the heart of the converse of Shannon's first theorem, i.e. of the impossibility of compressing below the entropy (per symbol, on average).
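For reference, a condensed version of that converse argument (standard notation: codeword lengths $l_i$, source probabilities $p_i$):

```latex
% Converse of Shannon's first theorem (binary code alphabet), with
% c = \sum_i 2^{-l_i} \le 1 (Kraft) and q_i = 2^{-l_i}/c:
\begin{align}
  L - H &= \sum_i p_i l_i + \sum_i p_i \log_2 p_i
         = \sum_i p_i \log_2 \frac{p_i}{2^{-l_i}} \\
        &= D(p \,\|\, q) - \log_2 c \;\ge\; 0 ,
\end{align}
% since the relative entropy D(p||q) is non-negative and \log_2 c \le 0.
```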

# Second question : Hamming bound
- What is the Hamming bound ? Introduce the context, etc.
- Prove the validity of the Hamming bound for an (M,n) ECC

Intermediary questions :
- You wrote d >= 2e+1; can you show why having d=3 allows correcting 1 error?
- Before proving it, can you say, intuitively, why it is valid? What happens if d grows and we also want a higher M? -> As we are in a finite space, the more the distance grows, the less room we have for codewords
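For reference, the bound itself in its standard form:

```latex
% Sphere-packing argument: the Hamming balls of radius e around the M
% codewords are disjoint and must all fit inside {0,1}^n.
\begin{equation}
  M \sum_{k=0}^{e} \binom{n}{k} \;\le\; 2^{n},
  \qquad \text{with } d \ge 2e + 1 .
\end{equation}
```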

Oral exam 13 June 2022 - Cerf's part - 13 Jun 2022

Two questions: a main one (2/3 of the points) and a subsidiary one (1/3 of the points).

Question 1 :
- Explain Kraft's inequality and prove it
- Give an example of encoding with the Huffman code
- State the main property of the Huffman code and prove it (more or less rigorously)
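Not part of the exam itself, but a minimal sketch of the kind of Huffman encoding example he asks for (Python, using heapq; the tie-breaking, and therefore the exact code table, is arbitrary):

```python
import heapq

def huffman_code(probs):
    """Build a binary Huffman code for a {symbol: probability} dict.
    Repeatedly merge the two least probable subtrees, prepending '0'/'1'
    to the codewords of each merged half."""
    # Each heap entry: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, tie, merged))
        tie += 1
    return heap[0][2]

# Dyadic example: the average length equals the entropy, 1.75 bits/symbol.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
print(code)                                              # e.g. {'a': '0', 'b': '10', ...}
print(sum(probs[s] * len(w) for s, w in code.items()))   # 1.75
```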

Question 2:
- Give an example of a Hamming code
- Explain the encoding and the decoding.


Nicolas Cerf is very nice and easy-going; he lets you explain what you have prepared and steps in from time to time if he wants more details. If he sees you are not finding it right away, he helps by giving a few hints. So no stress, good luck!

Computing and Complexity Theory part - 11 Jun 2022

Question 1: Prove that TMs are equivalent to NTMs (proof using the MTM)

Question 2: Prove that EQ_lba is undecidable using E_lba (I took inspiration from the same proof done with TMs; you use a mapping reduction).

Computing and Complexity Theory part - 11 Jun 2022

Question 1 :
You are given a RASP whose memory has already been initialised with a set A, stored in memory starting at register 1000: A[0] is in r1000, A[1] in r1001, etc. You are given i and j as input, and your task is to write the RASP code that outputs A[i]/A[j] (he is mostly interested in your idea rather than in the actual code itself).
With a RASP the code has to be in memory, so you need to make that explicit, and also that pointer/indirect addressing is not possible in a RASP, so you have to do the little trick with the 6 steps (see slides).

Other than explaining my code and how I did the indirect addressing, he just asked me what the difference between a RAM and a RASP is (in a RASP the code sits in the registers, so it can change during the run; in a RAM the code is in a program memory that is fixed at launch) and that the extra power of the RAM is that it can do indirect addressing.
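A minimal sketch of that "code lives in the registers" trick, in Python. The instruction names (LOAD=, LOAD, ADD, STORE, DIV, HALT) and the two-cell instruction layout are my own toy convention, not the RASP instruction set from the slides; the point is only to show the idea of patching the operand of a later instruction to emulate indirect addressing:

```python
# Toy RASP sketch: program and data share one register file (`mem`),
# so the program can overwrite its own operand cells ("self-modifying code")
# to emulate the indirect addressing that a plain RASP lacks.

def run_rasp(mem, pc=0):
    """Interpret a toy instruction set; each instruction takes two cells:
    opcode at address pc, operand at address pc + 1."""
    acc = 0
    while True:
        op, arg = mem[pc], mem[pc + 1]
        if op == "LOAD=":
            acc = arg                 # acc <- constant
        elif op == "LOAD":
            acc = mem[arg]            # acc <- mem[arg]  (direct addressing only)
        elif op == "ADD":
            acc += mem[arg]
        elif op == "STORE":
            mem[arg] = acc            # may overwrite the program itself
        elif op == "DIV":
            acc //= mem[arg]
        elif op == "HALT":
            return acc
        pc += 2

# Data: the set A starts at register 1000; the inputs i and j sit in r100 and r101.
A = [10, 42, 7, 6]
mem = {1000 + k: v for k, v in enumerate(A)}
mem[100], mem[101] = 1, 3             # i = 1, j = 3

# Program (cells 0..17). The two STOREs patch the operand cells 13 and 15
# so that the later LOAD/DIV effectively address A[i] and A[j].
program = [
    "LOAD=", 1000, "ADD", 100, "STORE", 13,   # mem[13] <- 1000 + i
    "LOAD=", 1000, "ADD", 101, "STORE", 15,   # mem[15] <- 1000 + j
    "LOAD",  0,                                # becomes LOAD (1000 + i): acc <- A[i]
    "DIV",   0,                                # becomes DIV  (1000 + j): acc <- A[i] // A[j]
    "HALT",  0,
]
mem.update(dict(enumerate(program)))
print(run_rasp(mem))                           # A[1] // A[3] = 42 // 6 = 7
```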

Question 2:
Explain what a polynomial-time verifier is and give a language which can be verified in polynomial time:
- explain in detail what a polynomial-time verifier is
- explain the language in detail.
I explained CLIQUE, and there were some questions related to what I had written on the board, such as "you have written that CLIQUE belongs to NP, why?", and it is because NP is the class of languages that have polynomial-time verifiers.
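A minimal sketch of such a verifier for CLIQUE (Python; the certificate is the claimed clique itself, and the adjacency-set graph representation is my own choice):

```python
from itertools import combinations

def verify_clique(graph, k, certificate):
    """Polynomial-time verifier for CLIQUE: accept <G, k> together with a
    certificate iff the certificate is a set of k pairwise-adjacent vertices.
    `graph` maps each vertex to the set of its neighbours."""
    nodes = set(certificate)
    if len(nodes) != k or not nodes.issubset(graph):
        return False
    # O(k^2) adjacency checks, each O(1) with sets -> polynomial in the input size.
    return all(v in graph[u] for u, v in combinations(nodes, 2))

# Example: a triangle {1, 2, 3} plus a pendant vertex 4.
G = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
print(verify_clique(G, 3, [1, 2, 3]))   # True
print(verify_clique(G, 3, [1, 3, 4]))   # False: 1 and 4 are not adjacent
```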

All in all, the oral was quite chill, no stress; he is quite helpful, and even if you make mistakes he will ask again and guide you if needed.

Roland's part - 9 Jun 2022

Preparation: you draw your 2 questions (1 computability, 1 complexity) and he leaves you in a room to prepare for 45 minutes. You are allowed your notes and you can write whatever you want on the boards. You can take the oral in French or in English.

Questions :
1) HALT = {<M,w> | M is a RAM that halts on w} (/!\ in the course it is a TM, not a RAM)
- decidable? (no)
- Turing-recognizable? (yes)
- co-Turing-recognizable? (no)
--> he insists on the details (what is a RAM, what is a TM, what is the input, what is the language, etc.)

2) Prove that if A is polynomial-time reducible to B and B ∈ P, then A ∈ P
(same proof as in the course)
--> he insists on the definitions of a reduction, of P, of the input, etc.
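For reference, the time bound at the heart of that proof (standard argument, with $k$ and $c$ the degrees of the polynomials bounding the reduction $f$ and B's decider):

```latex
% A \le_p B via f computable in time O(n^k); B decided in time O(n^c).
% On input w: compute f(w), then run B's decider on f(w).
\begin{align}
  |f(w)| &= O(|w|^k)
    && \text{(a machine running in time $O(n^k)$ writes at most $O(n^k)$ symbols),} \\
  T_A(w) &= O(|w|^k) + O(|f(w)|^c) = O(|w|^{kc})
    && \Rightarrow\; A \in \mathrm{P}.
\end{align}
```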

Bonus question:
For question 2, can we swap it and say A ∈ P —> B ∈ P? (Not always, because the reduction is based on f : w ∈ A ⇔ f(w) ∈ B, and we can only swap if we can recover w from f(w), i.e. if f is invertible.)

Addendum: atmosphere at JRoland's exam - 9 Jun 2022

He takes note of EVERYTHING YOU SAY; I think he has a checklist of what he wants to hear during the proof, and he asks about whatever is missing.

In any case he "helps" in the sense that he grimaces or lets out a little "hmmm" if you lean towards the wrong answer on a yes/no question.

And the overall atmosphere is relaxed: he is there to see what you have understood, to probe edge cases, the impact of changing this or that term in a definition, etc.


Roland's part - 9 Jun 2022

Hello, draw two cards at random, then 45 minutes of open-book preparation, during which you can write on the boards.

# Question 1 : decidable languages.
Show that the class of decidable languages is closed under the following operations :

a) Concatenation, i.e. show that if L1 and L2 are decidable languages over the same alphabet, then the language {vw | v \in L1, w \in L2} is also decidable

b) The same with complementation: if L is decidable, show that \bar L is decidable

c) Idem with intersection

The answers are in the course (except for intersection, but we just have to run the two TMs one after the other and require an accepting run for both); a sketch of these constructions is given after the side questions below.

These were easy to answer because we are dealing with decidable languages: we can run the TMs one after the other without any risk of looping. He admitted that these were the easiest questions, so we moved on to the side dishes. These side questions really depend on what you are saying. Examples:
- Concatenation: can just w in L1 be in the concatenated language? Is this a valid concatenation? -> Discussing extreme cases
- What do we need to define a Turing machine?
- What happens to each of the proofs if we now ask for Turing-recognizability? Use an MTM for a) and c), but it doesn't work for b). A counter-example is A_TM, as shown in the course: A_TM is TR but \bar A_TM is not TR (because A_TM is undecidable).
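A toy sketch of the three closure constructions, with deciders modelled as total Python functions returning True/False (the example languages are my own):

```python
def concat_decider(decide_L1, decide_L2):
    """Decider for {vw | v in L1, w in L2}: try every split point.
    Termination is guaranteed because both input deciders always halt."""
    def decide(w):
        return any(decide_L1(w[:k]) and decide_L2(w[k:]) for k in range(len(w) + 1))
    return decide

def complement_decider(decide_L):
    """Closure under complement: flip the decider's answer.
    (This is exactly what fails for mere recognizers, which may loop.)"""
    return lambda w: not decide_L(w)

def intersection_decider(decide_L1, decide_L2):
    """Run the two deciders one after the other; accept iff both accept."""
    return lambda w: decide_L1(w) and decide_L2(w)

# Example over the alphabet {a, b}: L1 = a*, L2 = b*.
in_astar = lambda w: set(w) <= {"a"}
in_bstar = lambda w: set(w) <= {"b"}
print(concat_decider(in_astar, in_bstar)("aabbb"))    # True: "aa" + "bbb"
print(intersection_decider(in_astar, in_bstar)("a"))  # False: "a" is not in b*
print(complement_decider(in_astar)("ab"))             # True: "ab" is not in a*
```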

# Question 2 : The class P
Formally define the class P. Give an example of a language that is in the class P. Define it formally and prove that it is in P.
-> I went with the PATH language. I was very formal and very precise, so he only had a few extra questions to ask.
Side questions :
- What happens to P if we change the definition of TIME(t(n)) to languages that are O(t(n))-decidable by another computational model? -> It depends. If it is a RAM, RASP or MTM, P stays the same. But with an NTM we no longer get P, we get NP.
- Encoding of the graph: we need to specify that we take the binary encoding of the INTEGER encoding of the node labels.
- Explain the intuition for why step 2)a) of the proof takes polynomial time. -> The graph encoding has polynomial size. At stage 2)a) we repeatedly scan the tape and mark nodes, so this step does at worst a full scan of the tape several times, and we can mark at most $m$ nodes, so everything here is polynomial.
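A minimal sketch of the marking idea behind PATH ∈ P (Python; the adjacency-list representation is my own choice):

```python
from collections import deque

def path_decider(adj, s, t):
    """Decide PATH = {<G, s, t> | G is a directed graph with a path from s to t}.
    Breadth-first marking, the same idea as step 2)a) of the course proof:
    each node is marked at most once, so the loop runs at most |V| times
    and each pass scans at most |E| edges -> polynomial time."""
    marked = {s}
    frontier = deque([s])
    while frontier:
        u = frontier.popleft()
        if u == t:
            return True
        for v in adj.get(u, ()):
            if v not in marked:
                marked.add(v)
                frontier.append(v)
    return False

# Example: 1 -> 2 -> 3, plus an isolated node 4.
G = {1: [2], 2: [3], 3: [], 4: []}
print(path_decider(G, 1, 3))   # True
print(path_decider(G, 1, 4))   # False
```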


Cerf's part 2021 - 8 Jun 2021

- Explain the principle of the Shannon code. Prove the upper and lower bounds on its average length. What happens to these bounds when the distribution is unknown?
- Describe the binary erasure channel. Derive its capacity, and compare it with a feedback scheme.
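For reference, the standard statements behind those two questions:

```latex
% Shannon code: codeword lengths l_i = \lceil \log_2(1/p_i) \rceil, so
%   \log_2(1/p_i) \le l_i < \log_2(1/p_i) + 1, and averaging over p gives
\begin{equation}
  H(X) \;\le\; L \;<\; H(X) + 1 .
\end{equation}
% Binary erasure channel with erasure probability p:
\begin{equation}
  C_{\mathrm{BEC}} = \max_{p_X} I(X;Y) = 1 - p ,
\end{equation}
% a rate also achieved by the naive feedback scheme that simply retransmits
% every erased bit (on average 1/(1-p) channel uses per information bit).
```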

He helps and is nice. He draws a lot of parallels with the rest of the course while you are answering, so you really need to make the connections across the course. (The format of this exam under covid was 45 min of preparation with whatever material you want + 45 min of answering.)


