26 Oct 2003 20:07:30 |

Tim - Secretariat |

change |

The Chinese have a great philosophy regarding change: it is inevitable and you'd better be aware of that FACT. Don't be too upset with the real good or the real bad, because it will change... and we often predicate our handicapping on what HAS happened (past performances) more than on what MIGHT happen (which is really more in line with reality). A race, once run, even if the SAME horses go next time, is a ONE-time thing. The horse's form changes, the course changes, the riders change, etc. The old philosophy of never being able to step in the same river twice is at work here. I realize that there has to be a baseline of similarity for a horse to even be considered a possible race winner today, but think about it for a moment: you are the trainer of a horse that is in today. Do you try exactly the same thing that got you beat last out? Or do you change things (within the realm of what is possible for the horse and rider) today? Or a rider goes about a contest one way, realizes the problem with the ride, changes, and gets a different (hopefully better) result next out. Something to ponder. Tim Yatcak |

28 Oct 2003 14:05:15 |

Ol Railbird |

Re: change |

The more things change, the more they stay the same. Most of what we "see" in a race result is an illusion -- the consequence of dumb, blind luck. Only by running the SAME race an infinite number of times can we know with certainty the true odds on the entrants. Evidence of this is found in what statisticians call "the laws of large numbers." Poisson (http://www-gap.dcs.st-and.ac.uk/~history/Mathematicians/Poisson.html) discussed this at length, expounding upon a theorem originally put forth by Bernoulli (http://www-gap.dcs.st-and.ac.uk/~history/Mathematicians/Bernoulli_Jacob.html). Quoting from the website above, "The interpretation of probability as relative-frequency says that if an experiment is repeated a large number of times then the relative frequency with which an event occurs equals the probability of the event." Mathematically, Bernoulli's theorem may be stated as follows: "If an experiment, whose results are simple alternatives with the probability p for the positive result, is repeated n times, and if e is an arbitrarily small number, the probability that the number of positive results will not be smaller than n(p - e), and not larger than n(p + e), tends to 1 as n tends to infinity." (from the book Probability, Statistics and Truth by Richard von Mises)

Think of flipping a coin. If you flip it 100 times, how many heads will come up? With n = 100, p = 0.5, and e = 0.01:

lower bound: n(p - e) = 100(0.5 - 0.01) = 50 - 1 = 49
upper bound: n(p + e) = 100(0.5 + 0.01) = 50 + 1 = 51

If we must toss a coin 100 times to approach "true odds", how many times must we toss a horse race? Tim makes the point that if the same horses are entered in a SECOND race, the connections might employ a different strategy in their attempt to win. But this SECOND race is not the FIRST race. He has changed the things that affect the limiting frequency of the odds on the horses in the FIRST race. 
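Bernoulli's theorem is easy to check numerically. The short sketch below (the function name and parameters are mine, not from the original post) simulates batches of coin flips and measures how often the head count lands inside the bounds n(p - e) and n(p + e) as n grows:

```python
import random

def inside_bounds(n, p=0.5, e=0.01, trials=500, seed=1):
    """Fraction of repeated experiments in which the number of heads
    lands inside the Bernoulli bounds [n(p - e), n(p + e)]."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        heads = sum(rng.random() < p for _ in range(n))
        if n * (p - e) <= heads <= n * (p + e):
            hits += 1
    return hits / trials

# As n grows, the fraction of experiments inside the +/- 1% band
# climbs toward 1, just as the theorem promises.
results = {n: inside_bounds(n) for n in (100, 1000, 10000)}
for n, frac in results.items():
    print(n, frac)
```

At n = 100 the band is only 49 to 51 heads, so most experiments miss it; by n = 10,000 nearly all of them land inside. That gap is the whole point: 100 trials is nowhere near enough to reveal the "true odds."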
The "baseline of similarity" Tim mentions can be measured by a horseplayer's historical record of winning percentage. Assuming you do not grossly modify your selection method, and you have a lot of experience handicapping similar race situations (big n), the limiting frequency of the race in front of you is probably very close to your historical win percentage in similar races. So yes, we should pause and ponder the changes made, because we are now handicapping an ENTIRELY DIFFERENT RACE. Speaking of races, it's time to watch Great Lakes Downs. Luck be with you, olrailbird |
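The same "big n" logic says how much to trust that historical win percentage. A sketch of the standard binomial calculation (function name and the sample numbers are mine, purely for illustration): with W wins in N similar races, the estimate W/N carries a standard error of sqrt(p(1 - p)/N), so the same hit rate is far more trustworthy over a long record than a short one.

```python
import math

def win_rate_interval(wins, races, z=1.96):
    """Approximate 95% confidence interval for a true win probability,
    using the normal approximation to the binomial distribution."""
    p = wins / races
    se = math.sqrt(p * (1 - p) / races)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# A 30% hit rate over 50 races vs. the same rate over 1000 races:
print(win_rate_interval(15, 50))     # wide interval -- little evidence
print(win_rate_interval(300, 1000))  # narrow interval -- big n at work
```

The point estimate is identical in both cases; only the length of the record makes the "baseline of similarity" meaningful.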