You can generate a "random" sequence of all N numbers from 0 to N-1 by starting with an arbitrary number R and then repeatedly taking the next R = (aR + b) mod N.
The conditions for choosing a and b under which this works (that is, the sequence cycles through all N values before repeating) are:
1. The greatest common factor of N and b = 1
2. a - 1 is a multiple of all of the prime factors of N.
3. If N mod 4 = 0, then a mod 4 must be 1 (that is, a - 1 is also a multiple of 4).
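As a quick sanity check, here is a minimal Python sketch of those three conditions; the helper names prime_factors and has_full_period are mine, not from the post:

```python
from math import gcd

def prime_factors(n):
    """Return the set of distinct prime factors of n."""
    factors = set()
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.add(d)
            n //= d
        d += 1
    if n > 1:
        factors.add(n)
    return factors

def has_full_period(a, b, n):
    """True if R -> (a*R + b) mod n cycles through all n values 0..n-1."""
    cond1 = gcd(b, n) == 1                                   # condition 1
    cond2 = all((a - 1) % p == 0 for p in prime_factors(n))  # condition 2
    cond3 = n % 4 != 0 or (a - 1) % 4 == 0                   # condition 3
    return cond1 and cond2 and cond3

print(has_full_period(5, 3, 8))  # True (this is the N = 8 example below)
```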
For example, let N = 8
b can be any odd number
The only prime factor of N is 2, so a - 1 must be a multiple of 2; a can be any odd number (so far)
N mod 4 = 0, so a mod 4 = 1; a can be any number that is 1 more than a multiple of 4
Let a = 5 and b = 3
Start with 0; the sequence is 0, 3, 2, 5, 4, 7, 6, 1, 0, 3, 2, 5, and so on, repeating itself.
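For what it's worth, a short Python sketch of that recurrence (the function and variable names are mine) reproduces the same cycle:

```python
def lcg_sequence(a, b, n, seed=0):
    """Generate n successive values of R -> (a*R + b) mod n, starting at seed."""
    r = seed
    values = []
    for _ in range(n):
        values.append(r)
        r = (a * r + b) % n
    return values

print(lcg_sequence(5, 3, 8))  # [0, 3, 2, 5, 4, 7, 6, 1]
```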
Quote: ThatDonGuy
I don't know about a Markov chain per se, but the "standard" RNG I learned is a chain.
link to original post
Thanks for your response... I have this feeling that no one answered because it's kind of a rhetorical question, I would assume, since generating random numbers can be aided by Markov chains... I think I need to reword the question... I've gotten them to understand what I meant in another thread, but this one I feel is a bit different since I didn't really elaborate.
"A Markov chain is a mathematical system that experiences transitions from one state to another according to certain probabilistic rules. The defining characteristic of a Markov chain is that no matter how the process arrived at its present state, the possible future states are fixed."
Since there is no possibility of branching, all the mathematics developed for Markov chains is wasted on such a trivial chain. In the long run, the probability of being in each state is the same, 1/N, for every state.
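To make that concrete, here is a small sketch (mine, not from the thread) of the N = 8 example viewed as a Markov chain: every row of the transition matrix has a single 1, so there is no branching, and the uniform distribution of 1/8 per state is trivially stationary.

```python
N, a, b = 8, 5, 3

# Transition matrix of the chain: P[i][j] = 1 if state i moves to state j, else 0.
P = [[0] * N for _ in range(N)]
for i in range(N):
    P[i][(a * i + b) % N] = 1

# Each row has a single 1: from any state there is exactly one possible successor.
assert all(sum(row) == 1 for row in P)

# The uniform distribution pi = (1/N, ..., 1/N) is stationary: pi P = pi.
pi = [1 / N] * N
pi_next = [sum(pi[i] * P[i][j] for i in range(N)) for j in range(N)]
assert pi_next == pi
print(pi_next)  # [0.125, 0.125, 0.125, 0.125, 0.125, 0.125, 0.125, 0.125]
```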