Normal Deviate

STATISTICS DECLARES WAR ON MACHINE LEARNING!

Well, I hope the dramatic title caught your attention. Now I can get to the real topic of the post, which is: finite-sample bounds versus asymptotic approximations.

In my last post I discussed Normal limiting approximations. One commenter, Csaba Szepesvari, wrote the following interesting comment:

What still surprises me about statistics, or the way statisticians do their business, is the following: the Berry-Esseen theorem says that a confidence interval chosen based on the CLT is possibly too short by a good amount, of order $c/\sqrt{n}$. Despite this, statisticians keep telling me that they prefer their "shorter" CLT-based confidence intervals to ones derived by using the finite-sample tail inequalities that we, "machine learning people", prefer (lies vs. honesty?). I could never understand the logic behind this reasoning, and I am wondering if I am missing something. One possible answer is that the Berry-Esseen result could be…
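To make the tradeoff in the comment concrete, here is a minimal sketch (not from the original post; the sample, sample size, and confidence level are illustrative assumptions) comparing a CLT-based confidence interval for the mean of a $[0,1]$-bounded variable with one derived from Hoeffding's inequality, a standard finite-sample tail bound.

```python
import numpy as np
from scipy import stats

# Illustrative setup (an assumption, not from the post): n i.i.d. draws of a
# [0, 1]-bounded random variable; we want a 95% confidence interval for its mean.
rng = np.random.default_rng(0)
n = 200
x = rng.beta(2, 5, size=n)          # any [0, 1]-valued sample would do
alpha = 0.05
xbar = x.mean()

# CLT-based (asymptotic) interval: xbar +/- z_{1-alpha/2} * s / sqrt(n).
# Its 95% coverage is only approximate: by Berry-Esseen, the normal
# approximation to the standardized mean can be off by order c / sqrt(n).
z = stats.norm.ppf(1 - alpha / 2)
half_clt = z * x.std(ddof=1) / np.sqrt(n)

# Hoeffding-based (finite-sample) interval for a [0, 1]-bounded variable:
# P(|xbar - mu| >= t) <= 2 * exp(-2 * n * t^2), so taking
# t = sqrt(log(2 / alpha) / (2 * n)) guarantees coverage >= 95% at every n,
# not just asymptotically.
half_hoeffding = np.sqrt(np.log(2 / alpha) / (2 * n))

print(f"CLT interval:       {xbar:.3f} +/- {half_clt:.3f}")
print(f"Hoeffding interval: {xbar:.3f} +/- {half_hoeffding:.3f}")
```

On a run like this the Hoeffding interval comes out noticeably wider, which is exactly the tension in the comment: the finite-sample bound buys guaranteed coverage at the price of length, while the CLT interval is shorter but its coverage is only asymptotically correct.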

