I was given a homework assignment that asks the following:
Given a system of $m$ linear equations in the variables $x_1, x_2, \dots, x_n$ over $\mathbb{F}_p,ドル find a randomized algorithm that produces an assignment to the $x_i$ under which the expected number of satisfied equations is $m/p$. Then derandomize it to obtain an efficient deterministic algorithm that satisfies at least $m/p$ of the equations.
A randomized algorithm is easy: assign each variable a value chosen uniformly at random from $\mathbb{F}_p$. By the principle of deferred decisions, each equation is satisfied with probability at least 1ドル/p$; since there are $m$ equations, linearity of expectation gives the desired result.
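A minimal sketch of this randomized step (the representation of an equation as a pair `(coeffs, b)` meaning $\sum_j c_j x_j = b \pmod p$ is my own convention, not from the assignment):

```python
import random

def random_assignment(n, p):
 """Assign each of the n variables a value drawn uniformly from F_p = {0, ..., p-1}."""
 return [random.randrange(p) for _ in range(n)]

def count_satisfied(equations, assignment, p):
 """Count how many equations (coeffs, b), meaning sum_j coeffs[j]*x_j = b (mod p),
 are satisfied by the given assignment."""
 return sum(
 1
 for coeffs, b in equations
 if sum(c * x for c, x in zip(coeffs, assignment)) % p == b % p
 )
```

Repeating `random_assignment` and keeping the best outcome already does well in practice, but the expectation bound is what the derandomization exploits.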
I am having difficulty with the second part. I wish to use the method of conditional expectations: choose an assignment for the first variable, then for the next variable, and so on, keeping the conditional expectation at least $m/p$ at each step, so that after $n$ iterations all variables are assigned and the number of satisfied equations equals that expectation, hence is at least $m/p$. I have seen this method used to derandomize MAX-CUT.
My problems in adapting the method:
Unlike MAX-CUT, I can only bound the expected number of satisfied equations by $m/p,ドル not compute the exact expectation. This is a problem when trying to pick the next best assignment, since I can't choose the one that maximizes the conditional expectation.
Unlike MAX-CUT, choosing an assignment for some of the variables can determine the value of another variable (e.g., after choosing $x_1, x_2$ when we have the equation $x_1 + x_2 + x_3 = 1$).
I have a suggestion, but I can't manage to prove it works: at iteration $k,ドル choose an assignment for $x_k$ such that the number of equations made false is minimal (that is, equations whose only remaining free variable is $x_k,ドル such as one that has been reduced to $x_k = t,ドル which becomes false if we assign $x_k$ a value other than $t$).
I would appreciate help analyzing my suggested algorithm, or a pointer in the right direction.
-
$\begingroup$ Try running your randomized algorithm on the one-equation linear system 0ドル \cdot x = 1_F$. $\endgroup$ – user12859, Aug 26, 2016 at 10:20
-
$\begingroup$ @RickyDemer - let's assume that each equation has at least two free variables (variables with non-zero coefficient); then the above analysis is fine. $\endgroup$ – Belgi, Aug 26, 2016 at 10:21
-
$\begingroup$ To "calculate the exact expectation", work out the expectation for equations with fewer non-zero coefficients too. $\endgroup$ – user12859, Aug 26, 2016 at 10:40
-
$\begingroup$ As Ricky Demer mentions, you have to assume that the equations are non-trivial (each of them includes at least one variable with a non-zero coefficient). $\endgroup$ – Yuval Filmus, Aug 26, 2016 at 13:28
1 Answer
In your case the method of conditional expectations devolves into a greedy algorithm. At each stage, choose for the current variable the value that satisfies the most of the equations it decides — those in which it is the last remaining variable with non-zero coefficient (if there are none, choose an arbitrary value) — and substitute this value into the remaining equations.
Why does this work? Suppose that at the beginning of stage $i$ there are $N_i$ remaining equations (i.e., ones in which at least one remaining variable has a non-zero coefficient). The $N_i - N_{i+1}$ equations whose last free variable is $x_i$ are decided at this stage, and averaging over the $p$ possible values of $x_i,ドル some value satisfies at least a 1ドル/p$ fraction of them; the greedy choice therefore satisfies at least $(N_i-N_{i+1})/p$ equations. In total, the number of equations you satisfy is at least $$ \sum_{i=1}^n \frac{N_i-N_{i+1}}{p} = \frac{N_1 - N_{n+1}}{p} = \frac{N}{p}, $$ where $N = N_1$ is the total number of non-trivial equations and $N_{n+1} = 0$ is the number of remaining equations at the end of the algorithm.
-
$\begingroup$ Thank you, but where can I find a book with theory and solved exercises to practice this randomization material? $\endgroup$ – Tuong Nguyen Minh, Mar 10, 2024 at 18:18