author    Vasil Zlatanov <v@skozl.com>  2018-11-20 16:44:56 +0000
committer Vasil Zlatanov <v@skozl.com>  2018-11-20 16:44:56 +0000
commit    2bfa88f521b89a803f0b870b048b4ad593c03c9e (patch)
tree      f285ee185a3dca88b302868612f316969330da34
parent    cdaace0bbdc53e4735f52452538801c4b807d466 (diff)
Minor reductions of words
-rwxr-xr-x  report/paper.md  4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/report/paper.md b/report/paper.md
index 809af3a..ef1e9c4 100755
--- a/report/paper.md
+++ b/report/paper.md
@@ -219,7 +219,7 @@ affect recognition the most are: glasses, hair, sex and brightness of the picture
# Question 2, Generative and Discriminative Subspace Learning
-To combine both method it is possible to perform LDA in a generative subspace created by PCA. In order to
+One way to combine generative and discriminative learning is to perform LDA in a generative subspace created by PCA. In order to
maximize class separation and minimize the distance between elements of the same class it is necessary to
maximize the function $J(W)$ (the generalized Rayleigh quotient): $J(W) = \frac{W^T S_B W}{W^T S_W W}$.
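
For concreteness, here is a minimal numpy/scipy sketch of maximizing this quotient (our own illustration, not code from this commit; the name `lda_projection` and the small ridge on $S_W$ are assumptions, the latter to keep the generalized eigenproblem well-posed):

```python
# Sketch: maximize J(W) = (W^T S_B W) / (W^T S_W W).
# The optimal directions are the leading generalized eigenvectors of
# S_B w = lambda * S_W w. Names and the ridge term are ours, not the repo's.
import numpy as np
from scipy.linalg import eigh

def lda_projection(X, y, n_components, eps=1e-6):
    """X: (n_samples, n_features); y: integer class labels."""
    d = X.shape[1]
    mean_total = X.mean(axis=0)
    S_W = np.zeros((d, d))  # within-class scatter
    S_B = np.zeros((d, d))  # between-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        S_W += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_total)[:, None]
        S_B += len(Xc) * (diff @ diff.T)
    # eigh solves the generalized problem S_B v = lambda (S_W + eps*I) v and
    # returns eigenvalues in ascending order, so keep the last n_components.
    _, eigvecs = eigh(S_B, S_W + eps * np.eye(d))
    return eigvecs[:, -n_components:]  # columns of W maximizing J(W)
```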
@@ -239,7 +239,7 @@ of the projected samples: $W^T_{pca} = \arg\max_W |W^T S_T W|$, and
$W^T_{lda} = \arg\max_W \frac{|W^T W^T_{pca} S_B W_{pca} W|}{|W^T W^T_{pca} S_W W_{pca} W|}$.
-However, performing PCA followed by LDA carries a loss of discriminative information. This problem can
+Performing PCA followed by LDA discards some discriminative information. This problem can
be avoided through a linear combination of the two [@pca-lda]. In the following section we will use a
1-dimensional subspace *e*. The cost functions associated with PCA and LDA (with $\epsilon$ being a very
small number) are $H_{pca}(e) =$
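
To make the optimization in the hunk above concrete, here is a hedged sketch of the PCA-then-LDA pipeline (our own; `pca_projection` and the $N - c$ / $c - 1$ dimension choices from the Fisherfaces recipe are assumptions, not this repository's code):

```python
# Sketch of W_opt^T = W_lda^T W_pca^T: project onto the PCA subspace first,
# then run LDA there so that the projected S_W is non-singular.
import numpy as np

def pca_projection(X, n_components):
    """Principal directions of X as columns, via SVD of the centered data."""
    X_centered = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return Vt[:n_components].T  # (n_features, n_components)

# Usage with lda_projection from the earlier sketch (X: face images as rows,
# y: labels, N samples, c classes):
#   W_pca = pca_projection(X, n_components=N - c)  # Fisherfaces rule of thumb
#   X_red = (X - X.mean(axis=0)) @ W_pca
#   W_lda = lda_projection(X_red, y, n_components=c - 1)
#   W_opt = W_pca @ W_lda  # combined projection: y = W_opt^T x
```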