[Work Log]

July 16, 2013
Project Tulips
Subproject Data Association v2
Working path projects/tulips/trunk/src/matlab/data_association_2
Unless otherwise noted, all filesystem paths are relative to the "Working path" named above.

Implementing new direct method for marginal likelihood.

Implemented in curve_ml5.m.

After the initial attempt, the new method produced inconsistent results.

Re-derived the result two more ways and the math comes out the same. The theory looks right.

Finally found the bug -- was computing

log_det = sum(log(diag(chol(Sigma))))

instead of

log_det = 2 * sum(log(diag(chol(Sigma))))

Since the Cholesky factor is a square root of the matrix Sigma, I was forgetting to double its log-determinant to get the log-determinant of Sigma.
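
A quick sanity check of the identity (hypothetical snippet, not part of curve_ml5.m):

% Verify on a random SPD matrix that doubling the Cholesky log-determinant
% recovers log(det(Sigma)).
A = randn(5);
Sigma = A * A' + 5 * eye(5);      % random symmetric positive-definite matrix
R = chol(Sigma);                  % upper-triangular factor, Sigma = R' * R
log_det = 2 * sum(log(diag(R)));  % doubled because det(Sigma) = det(R)^2
assert(abs(log_det - log(det(Sigma))) < 1e-10)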

Now getting the correct results -- time to make it fast.

Optimizing direct method for ML

Need to get O(n) runtime instead of O(n^3). Will use existing code for Markov-decomposed PDF evaluation.
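
For reference, the idea behind the Markov decomposition (a toy sketch with made-up model parameters, not the project's code): when the variables form a Gaussian Markov chain, the joint log-density factors into pairwise conditionals, so each evaluation only touches adjacent terms and runs in O(n):

% log p(x) = log p(x_1) + sum_i log p(x_i | x_{i-1})
n  = 1000;
a  = 0.9;                  % chain coupling: x_i = a*x_{i-1} + noise
s2 = 1.0;                  % conditional (innovation) variance
x  = randn(n, 1);          % point at which to evaluate the log-density

logp = -0.5 * (log(2*pi*s2) + x(1)^2 / s2);           % log p(x_1)
for i = 2:n
    r    = x(i) - a * x(i-1);                         % conditional residual
    logp = logp - 0.5 * (log(2*pi*s2) + r^2 / s2);    % log p(x_i | x_{i-1})
end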

Bottleneck is 100% the Cholesky decomposition. The direct method doesn't remove redundant dimensions, so it's slower than curve_ml4.m, which does. Recall that removing redundant dimensions is also what makes curve_ml4.m not general enough to use with the new foreground models.

...........

After some investigation, the previous statement appears to be untrue. The bottleneck is dense matrix multiplication, which is fixed by using sparse matrices.
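
Roughly what the fix amounts to (illustrative only; the sizes and structure below are made up):

% When the operands are mostly zeros, storing them with sparse() makes the
% multiply cost scale with the number of nonzeros instead of n^2.
n = 2000;
A = spdiags(randn(n, 3), -1:1, n, n);   % tridiagonal matrix, stored sparse
x = randn(n, 1);

y_sparse = A * x;                % cost ~ O(nnz(A))
y_dense  = full(A) * x;          % same result, O(n^2) work and memory
assert(norm(y_sparse - y_dense) < 1e-8)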

Now Cholesky is the bottleneck, but not as dominant as before. The Markov-decomposition method only gives a ~30% speedup at best, while introducing ~2.5% error and significant extra complexity. Is it worth it?

Posted by Kyle Simek