
I normally see that from engineers using "O(x)" to mean "approximately x" whenever it's clear from context that you're not actually talking about asymptotic complexity.



I've always thought it was like this, maybe I'm wrong:

O(some constant) -- "nearby" that constant (maybe "order of magnitude" or whatever is contextually convenient)

O(some parameter) -- denotes the asymptotic behavior of some parametrized process

O(some variable representing a small number) -- denotes the negligible part of something that you've decided you don't have to care about -- error terms with exponent larger than 2, for example
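A concrete instance of that third usage (my example, not from the thread) is the error term in a Taylor expansion:

```latex
% sin(x) agrees with its cubic Taylor polynomial up to a term
% bounded above by a constant multiple of |x|^5 near 0:
\[
  \sin x = x - \frac{x^3}{6} + O(x^5) \quad \text{as } x \to 0,
\]
% i.e. there exist C, delta > 0 such that
% |sin x - (x - x^3/6)| <= C |x|^5 whenever |x| < delta.
```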


Those last two notations are, formally, the same. To call a part negligible, we say it's asymptotically bounded above by a constant multiple of this expression, which obviously goes away as we approach the limit. The first one is a colloquial alternative definition that would probably be considered "wrong" in formal writing.

Agreed



