SRM 448 DIV II LEVEL 3
http://forums.topcoder.com/?module=Thread&threadID=651040&start=0
Let D(n, m) = the number of ways to arrange the cards, given that the set of cards used so far is m (a bitmask) and the last card chosen was card n.
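A minimal C++ sketch of that recurrence, filling the table forward over bitmasks. The adjacency test ok(a, b) is a placeholder, since the problem's actual constraint between consecutive cards is not restated in this post:

#include <vector>
using namespace std;

// dp[m][n] = number of orderings of the card subset m that end with card n.
// ok(a, b) is a stand-in for whatever condition the problem puts on
// consecutive cards.
long long countArrangements(int N, bool ok(int, int)) {
    vector<vector<long long> > dp(1 << N, vector<long long>(N, 0));
    for (int n = 0; n < N; ++n)
        dp[1 << n][n] = 1;                       // any single card starts an order
    for (int m = 1; m < (1 << N); ++m)
        for (int n = 0; n < N; ++n) {
            if (dp[m][n] == 0) continue;
            for (int k = 0; k < N; ++k)          // try appending card k
                if (!(m & (1 << k)) && ok(n, k))
                    dp[m | (1 << k)][k] += dp[m][n];
        }
    long long total = 0;
    for (int n = 0; n < N; ++n)
        total += dp[(1 << N) - 1][n];            // all N cards used
    return total;
}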
Saturday, November 28, 2009
Saturday, November 21, 2009
sscanf
Using sscanf to read the numbers in the following string:
string s = "1000 ml of water, weighing 1000 g"
Note: the use of c_str() and sscanf()
string s = "1000 ml of water, weighing 1000 g"
int v;
sscanf(s, "%d ml of", &v);
size_t p = s.find(',');
int m;
sscanf(s.substr(p+2, s.length()-p).c_str(), "weighing %d", &m);
Note: the use of c_str() and sscanf()
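A detail worth remembering: sscanf returns the number of fields it successfully assigned, so a failed conversion can be detected:

if (sscanf(s.c_str(), "%d ml of", &v) != 1) {
    // no leading integer was found; v was not assigned
}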
STL comparison FUNCTOR
http://www.topcoder.com/tc?module=Static&d1=match_editorials&d2=srm374
#include <iostream>
#include <vector>
#include <string>
#include <algorithm>
using namespace std;

bool isvowl(char c) {
    return c == 'a' || c == 'e' || c == 'i' || c == 'o' || c == 'u';
}

struct word {
    string w;
    vector<string> s1; // syllables, sorted
    vector<string> s2; // syllables, in original order
};

// comparison function
bool operator < (const word &a, const word &b) {
    if (a.s1 != b.s1) return a.s1 < b.s1;
    return a.s2 < b.s2;
}

class SyllableSorting {
public:
    vector<string> sortWords(vector<string> words) {
        vector<word> vw;
        for (int i = 0; i < words.size(); ++i) {
            vector<string> syl;
            word W;
            string s("");
            s += words[i][0];
            for (int j = 1; j < words[i].length(); ++j) {
                char c = words[i][j];
                bool pv = isvowl(words[i][j-1]);
                bool v = isvowl(c);
                if (pv) { // previous char is a vowel
                    if (v)
                        s += c;
                    else { // vowel run ends: close this syllable
                        syl.push_back(s);
                        s = c;
                    }
                } else { // previous char is a consonant
                    s += c;
                }
            }
            syl.push_back(s);
            for (int k = 0; k < syl.size(); ++k) // debug: print the syllables
                cout << syl[k] << " ";
            cout << endl;
            W.w = words[i];
            W.s2 = syl;
            sort(syl.begin(), syl.end());
            W.s1 = syl;
            vw.push_back(W);
        }
        sort(vw.begin(), vw.end());
        vector<string> ret;
        for (int i = 0; i < vw.size(); ++i)
            ret.push_back(vw[i].w);
        return ret;
    }
};
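Despite the title, the solution above overloads operator< globally rather than defining a functor. The same ordering can be packaged as an STL comparison functor, i.e. a struct whose operator() does the comparison, passed as the third argument to sort (WordLess is just an illustrative name; it reuses the word struct above):

struct WordLess {
    bool operator()(const word &a, const word &b) const {
        if (a.s1 != b.s1) return a.s1 < b.s1;
        return a.s2 < b.s2;
    }
};

// usage:
//   sort(vw.begin(), vw.end(), WordLess());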
Wednesday, November 18, 2009
Coin toss and Absorbing Markov Chain
http://www.mitbbs.com/mitbbs_bbsdoc_div_article.php?board=Quant&gid=31212871
A fair coin is flipped until the first time one of the following two patterns appears: TTH or HTH.
Question: which pattern should you choose for a better chance of winning?
Name each non-absorbing state by the suffix of flips that is still relevant (S is the start). First-step analysis gives:

TT = .5 TTH + .5 TT
HT = .5 HTH + .5 TT
T  = .5 TT  + .5 H
H  = .5 HT  + .5 H
S  = .5 T   + .5 H
If we let TTH = 1 and HTH = 0, solving the system gives S = 5/8 (the probability of reaching TTH first); if instead we let HTH = 1 and TTH = 0, we get S = 3/8 (the probability of reaching HTH first). So TTH is the better choice.
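The system is small enough to solve by direct substitution; a quick C++ check of the TTH = 1, HTH = 0 case (variable names mirror the states above):

#include <cstdio>

int main() {
    // First-step equations with TTH = 1, HTH = 0, solved by substitution:
    double TT = 1.0;                  // TT = .5*1 + .5*TT   =>  TT = 1
    double HT = 0.5 * 0 + 0.5 * TT;   // HT = .5*HTH + .5*TT =>  HT = 1/2
    double H  = HT;                   // H  = .5*HT + .5*H   =>  H  = HT
    double T  = 0.5 * TT + 0.5 * H;   // T  = 3/4
    double S  = 0.5 * T + 0.5 * H;    // S  = 5/8
    printf("S = %g\n", S);            // prints S = 0.625
    return 0;
}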
We start from the definition of an absorbing Markov chain.
REF: Chap 11, Markov Chains, from www.dartmouth.edu/~chance/teaching
A state s_i of a Markov chain is called absorbing if it is impossible to leave it (i.e., p_ii = 1). A Markov chain is absorbing if it has at least one absorbing state, and if from every state it is possible to reach an absorbing state (not necessarily in one step). A state that is not absorbing is called transient.
The interesting questions about an absorbing Markov chain are:
- What is the probability that the process will eventually reach a given absorbing state?
- On the average, how long will it take for the process to be absorbed?
- On the average, how many times will the process be in each transient state?
In general, the answers to all of these questions depend on
- the state from which the process starts, and
- the transition probabilities.
The theorems listed below answer these questions.
We first write the transition matrix in canonical form, P = [Q R; 0 I], writing out Q (transient-to-transient transitions) and R (transient-to-absorbing transitions) explicitly.
Q (rows and columns ordered TT, T, S, H, HT):

        TT      T       S       H       HT
TT      0.5000  0       0       0       0
T       0.5000  0       0       0.5000  0
S       0       0.5000  0       0.5000  0
H       0       0       0       0.5000  0.5000
HT      0.5000  0       0       0       0

R (columns TTH, HTH):

        TTH     HTH
TT      0.5000  0
T       0       0
S       0       0
H       0       0
HT      0       0.5000
Theorem: For an absorbing Markov chain the matrix I-Q has an inverse N = I + Q + Q^2 + .... The ij-entry of the matrix N is the expected number of times the chain is in state s_j, given that it starts in state s_i.
>> N = inv(eye(5)-Q)
N =
        TT      T       S       H       HT
TT      2.0000  0       0       0       0
T       1.5000  1.0000  0       1.0000  0.5000
S       1.2500  0.5000  1.0000  1.5000  0.7500
H       1.0000  0       0       2.0000  1.0000
HT      1.0000  0       0       0       1.0000
Reading the first row of N: starting from TT, the expected number of times the chain is in state TT is 2.
Theorem: The sum of row i of N is the expected number of steps before the chain is absorbed given that the chain starts in state s_i.
>> t = N*ones(5,1)
t =
TT      2
T       4
S       5
H       4
HT      2
So starting from S, the expected number of steps before being absorbed into either TTH or HTH is 5.
Theorem: Let b_ij be the probability that an absorbing chain will be absorbed in the absorbing state s_j if it starts in the transient state s_i. Let B be the matrix with entries b_ij. Then B is a t×r matrix (t = number of transient states, r = number of absorbing states), and
B = NR,
where N is the fundamental matrix and R is as in the canonical form.
>> B = N*R
B =
        TTH     HTH
TT      1.0000  0
T       0.7500  0.2500
S       0.6250  0.3750
H       0.5000  0.5000
HT      0.5000  0.5000
Thus starting from S, the probability of being absorbed into TTH is 0.625 = 5/8 and into HTH is 0.375 = 3/8, matching the first-step analysis above.
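These numbers can also be reproduced without MATLAB. Below is a self-contained C++ sketch that inverts I - Q by Gauss-Jordan elimination, then forms the row sums t and B = N*R for the five-state chain above:

#include <cstdio>
#include <vector>
using namespace std;

// Invert a small matrix by Gauss-Jordan elimination. No fancy pivoting;
// it just takes the first nonzero pivot, which is fine for these
// well-behaved (I - Q) matrices.
vector<vector<double> > invert(vector<vector<double> > a) {
    int n = a.size();
    vector<vector<double> > inv(n, vector<double>(n, 0.0));
    for (int i = 0; i < n; ++i) inv[i][i] = 1.0;
    for (int col = 0; col < n; ++col) {
        int piv = col;
        while (a[piv][col] == 0.0) ++piv;       // find a usable pivot row
        swap(a[piv], a[col]);
        swap(inv[piv], inv[col]);
        double d = a[col][col];
        for (int j = 0; j < n; ++j) { a[col][j] /= d; inv[col][j] /= d; }
        for (int i = 0; i < n; ++i) {
            if (i == col || a[i][col] == 0.0) continue;
            double f = a[i][col];
            for (int j = 0; j < n; ++j) {
                a[i][j]   -= f * a[col][j];
                inv[i][j] -= f * inv[col][j];
            }
        }
    }
    return inv;
}

int main() {
    // Transient states in the order TT, T, S, H, HT; absorbing TTH, HTH.
    const char *name[5] = { "TT", "T", "S", "H", "HT" };
    double Q[5][5] = { {0.5, 0, 0, 0, 0},
                       {0.5, 0, 0, 0.5, 0},
                       {0, 0.5, 0, 0.5, 0},
                       {0, 0, 0, 0.5, 0.5},
                       {0.5, 0, 0, 0, 0} };
    double R[5][2] = { {0.5, 0}, {0, 0}, {0, 0}, {0, 0}, {0, 0.5} };

    vector<vector<double> > IQ(5, vector<double>(5));
    for (int i = 0; i < 5; ++i)
        for (int j = 0; j < 5; ++j)
            IQ[i][j] = (i == j ? 1.0 : 0.0) - Q[i][j];   // I - Q

    vector<vector<double> > N = invert(IQ);              // fundamental matrix
    for (int i = 0; i < 5; ++i) {
        double t = 0, bTTH = 0, bHTH = 0;
        for (int j = 0; j < 5; ++j) {
            t    += N[i][j];               // row sum: expected steps to absorption
            bTTH += N[i][j] * R[j][0];     // (N*R) column TTH
            bHTH += N[i][j] * R[j][1];     // (N*R) column HTH
        }
        printf("%-2s: t = %g, B = (%g, %g)\n", name[i], t, bTTH, bHTH);
    }
    return 0;   // the S row prints t = 5, B = (0.625, 0.375)
}

Swapping in the 6×6 Q and 6×3 R from the three-pattern section below reproduces that table as well.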
Now let us consider adding another pattern, THT, as a third absorbing state. The canonical-form matrices become:
Q (rows and columns ordered S, T, H, TH, HT, TT):

        S       T       H       TH      HT      TT
S       0       0.5000  0.5000  0       0       0
T       0       0       0       0.5000  0       0.5000
H       0       0       0.5000  0       0.5000  0
TH      0       0       0.5000  0       0       0
HT      0       0       0       0       0       0.5000
TT      0       0       0       0       0       0.5000

R (columns THT, HTH, TTH):

        THT     HTH     TTH
S       0       0       0
T       0       0       0
H       0       0       0
TH      0.5000  0       0
HT      0       0.5000  0
TT      0       0       0.5000
>> N = inv(eye(6)-Q)
N =
        S       T       H       TH      HT      TT
S       1.0000  0.5000  1.2500  0.2500  0.6250  1.1250
T       0       1.0000  0.5000  0.5000  0.2500  1.2500
H       0       0       2.0000  0       1.0000  1.0000
TH      0       0       1.0000  1.0000  0.5000  0.5000
HT      0       0       0       0       1.0000  1.0000
TT      0       0       0       0       0       2.0000
>> N*ones(6,1)
ans =
S       4.7500
T       3.5000
H       4.0000
TH      3.0000
HT      2.0000
TT      2.0000
>> B = N*R
B =
        THT     HTH     TTH
S       0.1250  0.3125  0.5625
T       0.2500  0.1250  0.6250
H       0       0.5000  0.5000
TH      0.5000  0.2500  0.2500
HT      0       0.5000  0.5000
TT      0       0       1.0000

Thus starting from S, TTH is still the best choice of the three patterns, with absorption probability 0.5625 = 9/16, versus 0.3125 = 5/16 for HTH and 0.1250 = 1/8 for THT.