Problem Description
You are given an array A, consisting of N integers.
You are also given two integers K and L.
You must divide the whole array A into exactly K nonempty consecutive intervals, such that the length of each interval is at most L.
The cost of an interval [S, E] is the bitwise XOR sum of all elements of A whose indices are in [S, E].
The score of a division is the maximum cost among its K intervals. You are interested in the best division, the one that minimizes the score. Since this is too simple for you, the problem is reversed.
You know that the minimum score (the answer to the original problem) is not greater than X. Now you want to know the maximum possible value of K.
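As an illustrative aside (not part of the statement itself): because XOR is its own inverse, the cost of any interval can be evaluated in O(1) after one linear pass. With pre[0] = 0 and pre[i] = A[1] XOR ... XOR A[i], the cost of [S, E] equals pre[E] XOR pre[S-1]. A minimal C++ sketch, assuming a 1-based array a[1..n] (function names are illustrative):

    #include <cstdint>
    #include <vector>

    // pre[i] = A[1] ^ ... ^ A[i], with pre[0] = 0 (1-based, matching the statement).
    std::vector<uint32_t> prefixXor(const std::vector<uint32_t>& a, int n) {
        std::vector<uint32_t> pre(n + 1, 0);
        for (int i = 1; i <= n; ++i)
            pre[i] = pre[i - 1] ^ a[i];
        return pre;
    }

    // Cost of the interval [s, e]: the XOR of A[s..e].
    uint32_t cost(const std::vector<uint32_t>& pre, int s, int e) {
        return pre[e] ^ pre[s - 1];
    }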
Input
There are several test cases.
The first line of the input contains an integer T (1<=T<=20), the number of test cases. Then T test cases follow.
Each test case starts with 3 integers N, X, L (1 <= L <= N <= 100000, 0 <= X < 268435456; note that 268435456 = 2^28), which are described above.
The next line contains 3 integers A[1], P and Q. The remaining elements of the array A are generated by the following rule:
For every integer 1 < k <= N, A[k] = (A[k-1] * P + Q) mod 268435456.
(0 <= A[1], P, Q < 268435456)
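A minimal C++ sketch of this generator follows (an editorial aid, not part of the official statement; the function name is illustrative). The 64-bit intermediate is needed because A[k-1] * P can approach 2^56:

    #include <cstdint>
    #include <vector>

    // Builds A[1..n] from A[1], P, Q; index 0 is unused so indices match the statement.
    std::vector<uint32_t> generateA(int n, uint32_t a1, uint32_t p, uint32_t q) {
        const uint64_t MOD = 268435456; // 2^28
        std::vector<uint32_t> a(n + 1);
        a[1] = a1;
        for (int k = 2; k <= n; ++k)
            a[k] = static_cast<uint32_t>((static_cast<uint64_t>(a[k - 1]) * p + q) % MOD);
        return a;
    }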
Output
For each test case, you should print a single line containing the answer.
If no valid division exists for any K, just print 0.
Sample Input
2
3 1 2
1 1 1
3 0 3
1 1 1
Sample Output
2
1
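(For reference: in both cases the generated array is A = {1, 2, 3}. In the first case the division [1, 1], [2, 3] has costs 1 and 2 XOR 3 = 1, so K = 2 achieves score at most X = 1, while K = 3 would force singleton intervals of costs 2 and 3. In the second case only the single interval [1, 3], with cost 1 XOR 2 XOR 3 = 0, meets X = 0, so the answer is 1.)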