

\lim_{x\to0} \frac{\sin x}{x} = 1  

\lim_{x\to \infty} (1+\frac{1}{x})^x = e

\lim_{x\to 0} (1+x)^\frac{1}{x} = e

As x \rightarrow 0, the following equivalent-infinitesimal substitutions hold:

(1+x)^\alpha \sim 1+\alpha x

1+\alpha x \sim (1+x)^\alpha

\sin x \sim \tan x \sim x

e^x \sim 1+x

x \sim e^x - 1

a^x = e^{x\ln a}

a^x - 1 = e^{x\ln a} - 1 \sim x\ln a

a^x = (e^{\ln a})^x = e^{x\ln a}

({a^b})^c = a^{bc} = (a^c)^b

1+\sin x \sim 1 + x \sim e^x

\ln (1+x) \sim x
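As a worked illustration (my own example), substituting the equivalences e^x - 1 \sim x and \sin x \sim x inside a limit:

\lim_{x\to 0} \frac{e^x - 1}{\sin x} = \lim_{x\to 0} \frac{x}{x} = 1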

x' = 1

C' = 0

(x^n)' = nx^{n-1}

(a^x)' = a^x \ln a

(e^x)' = e^x \ln e = e^x

(\log_a x )' = \frac{1}{x\ln a } 

(\ln x )' = \frac{1}{x} 

(\sin x)' = \cos x

(\cos x)' = -\sin x

(\tan x)' = \sec^2 x

(\cot x)' = -\csc^2 x

(\sec x)' = \sec x \tan x

(\csc x)' = -\csc x \cot x

(\arcsin x)' = \frac{1}{\sqrt{1-x^2} } 

(\arccos x)' = -\frac{1}{\sqrt{1-x^2} } 

(\arctan x)' = \frac{1}{1+x^2 } 

(\operatorname{arccot} x)' = -\frac{1}{1+x^2 } 

(f(x) + \mu(x))' = f'(x) + \mu'(x)

(f(x)\mu(x))' = f'(x)\mu(x) + f(x)\mu'(x)

(C f(x))' = C f'(x)

( \frac{u}{v}  )' = \frac{u'v - uv'}{v^2} 

(uv)' = u'v + uv'

(uvw)' = u'vw + uv'w + uvw'
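As a quick check of the product rule above, differentiating x^2 \sin x (my own example):

(x^2 \sin x)' = (x^2)' \sin x + x^2 (\sin x)' = 2x\sin x + x^2\cos x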

L'Hôpital's rule: suppose that as x\rightarrow a,

f(x)\rightarrow 0, g(x)\rightarrow 0 \quad \text{or} \quad f(x)\rightarrow \infty, g(x)\rightarrow \infty

that f'(x) and g'(x) both exist near a, with

g'(x) \neq 0

and that \lim_{x\to a} \frac{f'(x)}{g'(x)} exists. Then

\lim_{x\to a} \frac{f(x)}{g(x)} = \lim_{x\to a} \frac{f'(x)}{g'(x)}
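A standard application of the rule to a 0/0 form (my own worked example), differentiating once and then using \lim_{x\to 0}\frac{\sin x}{x}=1:

\lim_{x\to 0} \frac{1-\cos x}{x^2} = \lim_{x\to 0} \frac{\sin x}{2x} = \frac{1}{2}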


\sin^2 \alpha = \frac{1-\cos 2\alpha}{2} 

\cos^2 \alpha = \frac{1+\cos 2\alpha}{2} 

\sin \frac{\alpha}{2} = \pm \sqrt{ \frac{1-\cos\alpha}{2} } 

\cos \frac{\alpha}{2} = \pm \sqrt{ \frac{1+\cos\alpha}{2} } 

\sin(\alpha \pm \beta ) = \sin\alpha \cos\beta \pm \cos\alpha \sin\beta

\cos(\alpha \pm \beta ) = \cos\alpha \cos\beta \mp \sin\alpha \sin\beta

\sin 2\alpha = 2\sin\alpha \cos\alpha

\cos 2\alpha = \cos^2\alpha - \sin^2\alpha = 1 - 2\sin^2\alpha = 2\cos^2\alpha - 1

\sin\alpha \cos\beta = \frac{1}{2} [ \sin( \alpha + \beta ) + \sin( \alpha - \beta ) ]

\cos\alpha \cos\beta = \frac{1}{2} [ \cos( \alpha + \beta ) + \cos( \alpha - \beta ) ]

\sin\alpha \sin\beta = -\frac{1}{2} [ \cos( \alpha + \beta ) - \cos( \alpha - \beta ) ]

\sin \alpha + \sin \beta = 2 \sin \frac{\alpha+\beta}{2} \cos \frac{\alpha-\beta}{2} 

\cos \alpha + \cos \beta = 2 \cos \frac{\alpha+\beta}{2} \cos \frac{\alpha-\beta}{2} 

a^m \cdot a^n = a^{m+n}

(a^m)^n = a^{mn}

(ab)^n = a^n \cdot b^n

a^0 = 1

a^{-n} = \frac{1}{a^n} 

a^{ \frac{m}{n} } = \sqrt[n]{a^m} 

a^b = n \Leftrightarrow b = \log_a n 

\lg N = \log_{10} N

\ln N = \log_e N

a^{\log_aN} = N

\log_a(M \cdot N) = \log_a M + \log_a N

\log_a{ \frac{M}{N}  } = \log_aM - \log_aN

\log_a(M^b) = b{\log_a(M)}

\log_b N = \frac{ \log_a N }{ \log_a b } \quad ( a, b > 0,\ a, b \neq 1,\ N > 0 ) 
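A quick numerical check of the change-of-base formula (my own example), taking a = e, b = 2, N = 8:

\log_2 8 = \frac{\ln 8}{\ln 2} = \frac{3\ln 2}{\ln 2} = 3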

I vaguely remember doing a C++ exercise that asked whether a number is prime. The algorithm I gave back then tested divisors from 2 up to x/2: if a divisor were greater than x/2, the quotient would have to be less than 2 (hence 1), so no proper divisor can exceed x/2.

Recently, while reading, I noticed that the standard algorithm only tests 2 through sqrt(x). This puzzled me: the algorithm is undoubtedly more efficient, dropping the time complexity from O(n) to O(sqrt(n)), but why is checking up to sqrt(x) enough?

After turning it over for a while I reached the conclusion; a short proof by contradiction settles it:

Suppose n is not prime, so it has a nontrivial factor pair a, b with a * b = n.
Assume both b > sqrt(n) and a >= sqrt(n).
Then a * b > sqrt(n) * sqrt(n), i.e. a * b > n, contradicting a * b = n.

It follows that if one factor is greater than sqrt(n),
the other factor must be less than sqrt(n).

The corresponding contrapositive is:

if n has no factor less than or equal to sqrt(n) (other than 1), then it has no factor greater than sqrt(n) (other than n itself). The "equal to" covers the edge case n = p^2, where both factors equal sqrt(n).

By this conclusion, to decide whether a number n is prime it suffices to test divisors up to sqrt(n).