Gram–Schmidt process

Figure: The first two steps of the Gram–Schmidt process.

In mathematics, particularly linear algebra and numerical analysis, the Gram–Schmidt process or Gram–Schmidt algorithm is a way of finding a set of two or more vectors that are perpendicular to each other.

By technical definition, it is a method of constructing an orthonormal basis from a set of vectors in an inner product space, most commonly the Euclidean space $\mathbb{R}^n$ equipped with the standard inner product. The Gram–Schmidt process takes a finite, linearly independent set of vectors $S = \{v_1, \ldots, v_k\}$ for $k \leq n$ and generates an orthogonal set $S' = \{u_1, \ldots, u_k\}$ that spans the same $k$-dimensional subspace of $\mathbb{R}^n$ as $S$.

The method is named after Jørgen Pedersen Gram and Erhard Schmidt, but Pierre-Simon Laplace had been familiar with it before Gram and Schmidt.[1] In the theory of Lie group decompositions, it is generalized by the Iwasawa decomposition.

The application of the Gram–Schmidt process to the column vectors of a full column rank matrix yields the QR decomposition (it is decomposed into an orthogonal and a triangular matrix).

The Gram–Schmidt process

Figure: The modified Gram–Schmidt process being executed on three linearly independent, non-orthogonal vectors of a basis for $\mathbb{R}^3$. The modification is explained in the Numerical stability section of this article.

The vector projection of a vector $v$ on a nonzero vector $u$ is defined as[note 1]
$$\operatorname{proj}_{u}(v) = \frac{\langle v, u \rangle}{\langle u, u \rangle}\, u,$$
where $\langle v, u \rangle$ denotes the inner product of the vectors $v$ and $u$. This means that $\operatorname{proj}_{u}(v)$ is the orthogonal projection of $v$ onto the line spanned by $u$. If $u$ is the zero vector, then $\operatorname{proj}_{u}(v)$ is defined as the zero vector.
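For concreteness, here is a minimal MATLAB/Octave rendering of this operator (the name proj is our own, and real vectors are assumed):

proj = @(v, u) (dot(v, u) / dot(u, u)) * u;   % orthogonal projection of v onto u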

Given nonzero linearly independent vectors $v_1, \ldots, v_k$, the Gram–Schmidt process defines the vectors $u_1, \ldots, u_k$ as follows:
$$\begin{aligned}
u_1 &= v_1, \\
u_2 &= v_2 - \operatorname{proj}_{u_1}(v_2), \\
u_3 &= v_3 - \operatorname{proj}_{u_1}(v_3) - \operatorname{proj}_{u_2}(v_3), \\
&\;\;\vdots \\
u_k &= v_k - \sum_{j=1}^{k-1} \operatorname{proj}_{u_j}(v_k).
\end{aligned}$$

The sequence $u_1, \ldots, u_k$ is the required system of orthogonal vectors, and the normalized vectors $e_i = u_i / \|u_i\|$ form an orthonormal set. The calculation of the sequence $u_1, \ldots, u_k$ is known as Gram–Schmidt orthogonalization, and the calculation of the sequence $e_1, \ldots, e_k$ is known as Gram–Schmidt orthonormalization.

To check that these formulas yield an orthogonal sequence, first compute $\langle u_1, u_2 \rangle$ by substituting the above formula for $u_2$: we get zero. Then use this to compute $\langle u_1, u_3 \rangle$ again by substituting the formula for $u_3$: we get zero. For arbitrary $k$ the proof is accomplished by mathematical induction.
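For example, in the real case the first check unwinds to (using bilinearity of the inner product):
$$\langle u_1, u_2 \rangle = \left\langle v_1,\; v_2 - \frac{\langle v_2, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1 \right\rangle = \langle v_1, v_2 \rangle - \frac{\langle v_2, v_1 \rangle}{\langle v_1, v_1 \rangle} \langle v_1, v_1 \rangle = 0.$$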

Geometrically, this method proceeds as follows: to compute $u_i$, it projects $v_i$ orthogonally onto the subspace $U$ generated by $u_1, \ldots, u_{i-1}$, which is the same as the subspace generated by $v_1, \ldots, v_{i-1}$. The vector $u_i$ is then defined to be the difference between $v_i$ and this projection, guaranteed to be orthogonal to all of the vectors in the subspace $U$.

The Gram–Schmidt process also applies to a linearly independent countably infinite sequence $\{v_i\}_i$. The result is an orthogonal (or orthonormal) sequence $\{u_i\}_i$ such that for every natural number $n$, the algebraic span of $v_1, \ldots, v_n$ is the same as that of $u_1, \ldots, u_n$.

If the Gram–Schmidt process is applied to a linearly dependent sequence, it outputs the 0 vector on the $i$th step, assuming that $v_i$ is a linear combination of $v_1, \ldots, v_{i-1}$. If an orthonormal basis is to be produced, then the algorithm should test for zero vectors in the output and discard them because no multiple of a zero vector can have a length of 1. The number of vectors output by the algorithm will then be the dimension of the space spanned by the original inputs.
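A minimal MATLAB/Octave sketch of this variant (the function name, the tolerance test, and its threshold value are illustrative choices, not part of the standard algorithm):

function U = gramschmidt_rank(V)
    U = zeros(size(V, 1), 0);                  % start with no accepted vectors
    for i = 1:size(V, 2)
        w = V(:, i) - U * (U' * V(:, i));      % remove components along accepted vectors
        if norm(w) > 1e-10 * norm(V(:, i))     % discard (near-)zero leftovers
            U(:, end + 1) = w / norm(w);       % keep a new orthonormal direction
        end
    end
end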

A variant of the Gram–Schmidt process using transfinite recursion applied to a (possibly uncountably) infinite sequence of vectors $(v_\alpha)_{\alpha < \lambda}$ yields a set of orthonormal vectors $(u_\alpha)_{\alpha < \kappa}$ with $\kappa \leq \lambda$ such that for any $\alpha \leq \lambda$, the completion of the span of $\{u_\beta : \beta < \min(\alpha, \kappa)\}$ is the same as that of $\{v_\beta : \beta < \alpha\}$. In particular, when applied to an (algebraic) basis of a Hilbert space (or, more generally, a basis of any dense subspace), it yields a (functional-analytic) orthonormal basis. Note that in the general case often the strict inequality $\kappa < \lambda$ holds, even if the starting set was linearly independent, and the span of $(u_\alpha)_{\alpha < \kappa}$ need not be a subspace of the span of $(v_\alpha)_{\alpha < \lambda}$ (rather, it's a subspace of its completion).

Example


Euclidean space


Consider the following set of vectors in $\mathbb{R}^2$ (with the conventional inner product):
$$S = \left\{ v_1 = \begin{bmatrix} 3 \\ 1 \end{bmatrix},\; v_2 = \begin{bmatrix} 2 \\ 2 \end{bmatrix} \right\}.$$

Now, perform Gram–Schmidt to obtain an orthogonal set of vectors:
$$u_1 = v_1 = \begin{bmatrix} 3 \\ 1 \end{bmatrix},$$
$$u_2 = v_2 - \operatorname{proj}_{u_1}(v_2) = \begin{bmatrix} 2 \\ 2 \end{bmatrix} - \frac{\langle v_2, u_1 \rangle}{\langle u_1, u_1 \rangle} \begin{bmatrix} 3 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \\ 2 \end{bmatrix} - \frac{8}{10} \begin{bmatrix} 3 \\ 1 \end{bmatrix} = \begin{bmatrix} -2/5 \\ 6/5 \end{bmatrix}.$$

We check that the vectors $u_1$ and $u_2$ are indeed orthogonal,
$$\langle u_1, u_2 \rangle = \left\langle \begin{bmatrix} 3 \\ 1 \end{bmatrix}, \begin{bmatrix} -2/5 \\ 6/5 \end{bmatrix} \right\rangle = -\frac{6}{5} + \frac{6}{5} = 0,$$
noting that if the dot product of two vectors is 0 then they are orthogonal.

For non-zero vectors, we can then normalize the vectors by dividing out their sizes as shown above:
$$e_1 = \frac{1}{\sqrt{10}} \begin{bmatrix} 3 \\ 1 \end{bmatrix}, \qquad e_2 = \frac{1}{\sqrt{8/5}} \begin{bmatrix} -2/5 \\ 6/5 \end{bmatrix} = \frac{1}{\sqrt{10}} \begin{bmatrix} -1 \\ 3 \end{bmatrix}.$$
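A quick numerical check of this example in MATLAB/Octave (variable names mirror the text):

v1 = [3; 1];  v2 = [2; 2];
u1 = v1;
u2 = v2 - (dot(v2, u1) / dot(u1, u1)) * u1;    % u2 = (-2/5, 6/5)
dot(u1, u2)                                    % 0 up to rounding
e1 = u1 / norm(u1);  e2 = u2 / norm(u2);       % (3,1)/sqrt(10) and (-1,3)/sqrt(10)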

Properties


Denote by $\operatorname{GS}(v_1, \ldots, v_k)$ the result of applying the Gram–Schmidt process to a collection of vectors $v_1, \ldots, v_k$. This yields a map $\operatorname{GS} \colon (\mathbb{R}^n)^k \to (\mathbb{R}^n)^k$.

It has the following properties:

  • It is continuous.
  • It is orientation preserving in the sense that $\operatorname{or}(v_1, \ldots, v_k) = \operatorname{or}(\operatorname{GS}(v_1, \ldots, v_k))$.
  • It commutes with orthogonal maps:

Let $g \colon \mathbb{R}^n \to \mathbb{R}^n$ be orthogonal (with respect to the given inner product). Then we have
$$\operatorname{GS}(g(v_1), \ldots, g(v_k)) = \left( g(\operatorname{GS}(v_1, \ldots, v_k)_1), \ldots, g(\operatorname{GS}(v_1, \ldots, v_k)_k) \right).$$
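As an illustration, this commutation property can be checked numerically with the gramschmidt function from the Algorithm section below (a sketch; the random orthogonal matrix is obtained from a QR factorization):

V = [3 2; 1 2];                 % the vectors from the example, as columns
[g, ~] = qr(randn(2));          % a random 2x2 orthogonal matrix
D1 = gramschmidt(g * V);        % orthonormalize the mapped vectors
D2 = g * gramschmidt(V);        % map the orthonormalized vectors
norm(D1 - D2)                   % ~0 up to rounding error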

Further, a parametrized version of the Gram–Schmidt process yields a (strong) deformation retraction of the general linear group $\mathrm{GL}(\mathbb{R}^n)$ onto the orthogonal group $O(\mathbb{R}^n)$.

Numerical stability


When this process is implemented on a computer, the vectors $u_k$ are often not quite orthogonal, due to rounding errors. For the Gram–Schmidt process as described above (sometimes referred to as "classical Gram–Schmidt") this loss of orthogonality is particularly bad; therefore, it is said that the (classical) Gram–Schmidt process is numerically unstable.

The Gram–Schmidt process can be stabilized by a small modification; this version is sometimes referred to as modified Gram–Schmidt or MGS. This approach gives the same result as the original formula in exact arithmetic and introduces smaller errors in finite-precision arithmetic.

Instead of computing the vector $u_k$ as
$$u_k = v_k - \operatorname{proj}_{u_1}(v_k) - \operatorname{proj}_{u_2}(v_k) - \cdots - \operatorname{proj}_{u_{k-1}}(v_k),$$
it is computed as
$$\begin{aligned}
u_k^{(1)} &= v_k - \operatorname{proj}_{u_1}(v_k), \\
u_k^{(2)} &= u_k^{(1)} - \operatorname{proj}_{u_2}\!\left(u_k^{(1)}\right), \\
&\;\;\vdots \\
u_k^{(k-1)} &= u_k^{(k-2)} - \operatorname{proj}_{u_{k-1}}\!\left(u_k^{(k-2)}\right), \\
u_k &= u_k^{(k-1)}.
\end{aligned}$$

This method is used in the animation above, where an intermediate vector is used when orthogonalizing the blue vector $v_3$.

Here is another description of the modified algorithm. Given the vectors $v_1, v_2, \ldots, v_k$, in our first step we produce vectors $v_2^{(1)}, \ldots, v_k^{(1)}$ by removing components along the direction of $v_1$. In formulas, $v_j^{(1)} := v_j - \frac{\langle v_j, v_1 \rangle}{\langle v_1, v_1 \rangle} v_1$. After this step we already have two of our desired orthogonal vectors $u_1, \ldots, u_k$, namely $u_1 = v_1$, $u_2 = v_2^{(1)}$, but we also made $v_3^{(1)}, \ldots, v_k^{(1)}$ already orthogonal to $u_1$. Next, we orthogonalize those remaining vectors against $u_2 = v_2^{(1)}$. This means we compute $v_3^{(2)}, v_4^{(2)}, \ldots, v_k^{(2)}$ by subtraction $v_j^{(2)} := v_j^{(1)} - \frac{\langle v_j^{(1)}, u_2 \rangle}{\langle u_2, u_2 \rangle} u_2$. Now we have stored the vectors $v_1, v_2^{(1)}, v_3^{(2)}, v_4^{(2)}, \ldots, v_k^{(2)}$ where the first three vectors are already $u_1, u_2, u_3$ and the remaining vectors are already orthogonal to $u_1, u_2$. As should be clear now, the next step orthogonalizes $v_4^{(2)}, \ldots, v_k^{(2)}$ against $u_3 = v_3^{(2)}$. Proceeding in this manner we find the full set of orthogonal vectors $u_1, \ldots, u_k$. If orthonormal vectors are desired, then we normalize as we go, so that the denominators in the subtraction formulas turn into ones.
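Below is a minimal MATLAB/Octave sketch of modified Gram–Schmidt that normalizes as it goes, following the description above (the function name is ours, not standard):

function U = modified_gramschmidt(V)
    U = V;
    for i = 1:size(V, 2)
        U(:, i) = U(:, i) / norm(U(:, i));     % normalize u_i
        for j = i+1:size(V, 2)
            % immediately orthogonalize all remaining vectors against u_i
            U(:, j) = U(:, j) - (U(:, i)' * U(:, j)) * U(:, i);
        end
    end
end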

Algorithm


The following MATLAB algorithm implements classical Gram–Schmidt orthonormalization. The vectors v1, ..., vk (columns of the matrix V, so that V(:,j) is the $j$th vector) are replaced by orthonormal vectors (columns of U) which span the same subspace.

function U = gramschmidt(V)
    [n, k] = size(V);
    U = zeros(n, k);
    U(:,1) = V(:,1) / norm(V(:,1));       % normalize the first vector
    for i = 2:k
        U(:,i) = V(:,i);
        for j = 1:i-1
            % subtract the projection of the ith vector onto the
            % already-computed orthonormal vector U(:,j)
            U(:,i) = U(:,i) - (U(:,j)'*U(:,i)) * U(:,j);
        end
        U(:,i) = U(:,i) / norm(U(:,i));   % normalize the result
    end
end
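Applied to the vectors of the Euclidean example above (as columns), the function reproduces the orthonormal pair $e_1, e_2$:

U = gramschmidt([3 2; 1 2])
% U is approximately [ 0.9487  -0.3162 ;
%                      0.3162   0.9487 ]
% i.e. columns (3,1)/sqrt(10) and (-1,3)/sqrt(10)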

The cost of this algorithm is asymptotically $O(nk^2)$ floating-point operations, where $n$ is the dimensionality of the vectors.[2]

Via Gaussian elimination


If the rows $\{v_1, \ldots, v_k\}$ are written as a matrix $A$, then applying Gaussian elimination to the augmented matrix $\left[ A A^{\mathsf{T}} \mid A \right]$ will produce the orthogonalized vectors in place of $A$. However the matrix $A A^{\mathsf{T}}$ must be brought to row echelon form, using only the row operation of adding a scalar multiple of one row to another.[3] For example, taking $v_1 = \begin{bmatrix} 3 & 1 \end{bmatrix}$, $v_2 = \begin{bmatrix} 2 & 2 \end{bmatrix}$ as above, we have
$$\left[ A A^{\mathsf{T}} \mid A \right] = \left[ \begin{array}{cc|cc} 10 & 8 & 3 & 1 \\ 8 & 8 & 2 & 2 \end{array} \right].$$

And reducing this to row echelon form produces
$$\left[ \begin{array}{cc|cc} 1 & 0.8 & 0.3 & 0.1 \\ 0 & 1 & -0.25 & 0.75 \end{array} \right].$$

The normalized vectors are then
$$e_1 = \frac{1}{\sqrt{10}} \begin{bmatrix} 3 & 1 \end{bmatrix}, \qquad e_2 = \frac{1}{\sqrt{10}} \begin{bmatrix} -1 & 3 \end{bmatrix},$$
as in the example above.
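A short MATLAB/Octave sketch of this elimination for the example (only the row operation of adding a multiple of one row to another is used):

A = [3 1; 2 2];                                     % rows are v1 and v2
M = [A * A', A];                                    % the augmented matrix [AA^T | A]
M(2, :) = M(2, :) - (M(2, 1) / M(1, 1)) * M(1, :);  % clear the entry below the pivot
M(:, 3:4)                                           % rows are now u1 = (3, 1) and u2 = (-0.4, 1.2)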

Determinant formula


The result of the Gram–Schmidt process may be expressed in a non-recursive formula using determinants.

$$e_j = \frac{1}{\sqrt{D_{j-1} D_j}} \begin{vmatrix} \langle v_1, v_1 \rangle & \langle v_2, v_1 \rangle & \cdots & \langle v_j, v_1 \rangle \\ \langle v_1, v_2 \rangle & \langle v_2, v_2 \rangle & \cdots & \langle v_j, v_2 \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle v_1, v_{j-1} \rangle & \langle v_2, v_{j-1} \rangle & \cdots & \langle v_j, v_{j-1} \rangle \\ v_1 & v_2 & \cdots & v_j \end{vmatrix}$$
and
$$u_j = \frac{1}{D_{j-1}} \begin{vmatrix} \langle v_1, v_1 \rangle & \langle v_2, v_1 \rangle & \cdots & \langle v_j, v_1 \rangle \\ \langle v_1, v_2 \rangle & \langle v_2, v_2 \rangle & \cdots & \langle v_j, v_2 \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle v_1, v_{j-1} \rangle & \langle v_2, v_{j-1} \rangle & \cdots & \langle v_j, v_{j-1} \rangle \\ v_1 & v_2 & \cdots & v_j \end{vmatrix},$$
where $D_0 = 1$ and, for $j \geq 1$, $D_j$ is the Gram determinant
$$D_j = \begin{vmatrix} \langle v_1, v_1 \rangle & \langle v_2, v_1 \rangle & \cdots & \langle v_j, v_1 \rangle \\ \langle v_1, v_2 \rangle & \langle v_2, v_2 \rangle & \cdots & \langle v_j, v_2 \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle v_1, v_j \rangle & \langle v_2, v_j \rangle & \cdots & \langle v_j, v_j \rangle \end{vmatrix}.$$

Note that the expression for $u_j$ is a "formal" determinant, i.e. the matrix contains both scalars and vectors; the meaning of this expression is defined to be the result of a cofactor expansion along the row of vectors.
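For instance, with $j = 2$ and the vectors from the example above, the cofactor expansion along the row of vectors gives
$$u_2 = \frac{1}{D_1} \begin{vmatrix} \langle v_1, v_1 \rangle & \langle v_2, v_1 \rangle \\ v_1 & v_2 \end{vmatrix} = \frac{1}{D_1} \left( \langle v_1, v_1 \rangle\, v_2 - \langle v_2, v_1 \rangle\, v_1 \right) = \frac{1}{10} \left( 10\, v_2 - 8\, v_1 \right) = \begin{bmatrix} -2/5 \\ 6/5 \end{bmatrix},$$
matching the recursive computation.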

The determinant formula for the Gram–Schmidt process is computationally (exponentially) slower than the recursive algorithms described above; it is mainly of theoretical interest.

Expressed using geometric algebra


Expressed using notation used in geometric algebra, the unnormalized results of the Gram–Schmidt process can be expressed as
$$u_k = v_k - \sum_{j=1}^{k-1} (v_k \cdot u_j)\, u_j^{-1},$$
which is equivalent to the expression using the $\operatorname{proj}$ operator defined above. The results can equivalently be expressed as[4]
$$u_k = v_k \wedge v_{k-1} \wedge \cdots \wedge v_1 \, (v_{k-1} \wedge \cdots \wedge v_1)^{-1},$$
which is closely related to the expression using determinants above.

Alternatives


Other orthogonalization algorithms use Householder transformations or Givens rotations. The algorithms using Householder transformations are more stable than the stabilized Gram–Schmidt process. On the other hand, the Gram–Schmidt process produces the $k$th orthogonalized vector after the $k$th iteration, while orthogonalization using Householder reflections produces all the vectors only at the end. This makes only the Gram–Schmidt process applicable for iterative methods like the Arnoldi iteration.

Yet another alternative is motivated by the use of Cholesky decomposition for inverting the matrix of the normal equations in linear least squares. Let $A$ be a full column rank matrix, whose columns need to be orthogonalized. The matrix $A^{*} A$ is Hermitian and positive definite, so it can be written as $A^{*} A = L L^{*}$ using the Cholesky decomposition. The lower triangular matrix $L$ with strictly positive diagonal entries is invertible. Then the columns of the matrix $Q = A \left( L^{*} \right)^{-1}$ are orthonormal and span the same subspace as the columns of the original matrix $A$. The explicit use of the product $A^{*} A$ makes the algorithm unstable, especially if the product's condition number is large. Nevertheless, this algorithm is used in practice and implemented in some software packages because of its high efficiency and simplicity.
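A minimal MATLAB/Octave sketch of this approach (illustrative only; in MATLAB, chol(X, 'lower') returns the lower triangular Cholesky factor and A' is the conjugate transpose):

A = [3 2; 1 2];                 % full column rank; columns to orthogonalize
L = chol(A' * A, 'lower');      % A'A = L L' (Cholesky decomposition)
Q = A / L';                     % Q = A (L')^{-1}; columns are orthonormal
norm(Q' * Q - eye(2))           % ~0 up to rounding error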

In quantum mechanics there are several orthogonalization schemes with characteristics better suited for certain applications than the original Gram–Schmidt process. Nevertheless, it remains a popular and effective algorithm for even the largest electronic structure calculations.[5]

Run-time complexity


Gram–Schmidt orthogonalization can be done in strongly polynomial time. The run-time analysis is similar to that of Gaussian elimination.[6]: 40 


References

  1. ^ Cheney, Ward; Kincaid, David (2009). Linear Algebra: Theory and Applications. Sudbury, MA: Jones and Bartlett. pp. 544, 558. ISBN 978-0-7637-5020-6.
  2. ^ Golub & Van Loan 1996, §5.2.8.
  3. ^ Pursell, Lyle; Trimble, S. Y. (1 January 1991). "Gram-Schmidt Orthogonalization by Gauss Elimination". The American Mathematical Monthly. 98 (6): 544–549. doi:10.2307/2324877. JSTOR 2324877.
  4. ^ Doran, Chris; Lasenby, Anthony (2007). Geometric Algebra for Physicists. Cambridge University Press. p. 124. ISBN 978-0-521-71595-9.
  5. ^ Pursell, Yukihiro; et al. (2011). "First-principles calculations of electron states of a silicon nanowire with 100,000 atoms on the K computer". Proceedings of 2011 International Conference for High Performance Computing, Networking, Storage and Analysis. pp. 1:1–1:11. doi:10.1145/2063384.2063386. ISBN 9781450307710. S2CID 14316074.
  6. ^ Gr?tschel, Martin; Lovász, László; Schrijver, Alexander (1993), Geometric algorithms and combinatorial optimization, Algorithms and Combinatorics, vol. 2 (2nd ed.), Springer-Verlag, Berlin, doi:10.1007/978-3-642-78240-4, ISBN 978-3-642-78242-8, MR 1261419

Notes

  1. ^ In the complex case, this assumes that the inner product is linear in the first argument and conjugate-linear in the second. In physics a more common convention is linearity in the second argument, in which case we define $$\operatorname{proj}_{u}(v) = \frac{\langle u, v \rangle}{\langle u, u \rangle}\, u.$$

Sources

  • Golub, Gene H.; Van Loan, Charles F. (1996). Matrix Computations (3rd ed.). Baltimore: Johns Hopkins University Press. ISBN 978-0-8018-5414-9.