Diffstat (limited to 'report')
-rw-r--r--  report/makefile      | 10
-rwxr-xr-x  report/metadata.yaml |  6
-rwxr-xr-x  report/paper.md      |  2
3 files changed, 14 insertions(+), 4 deletions(-)
diff --git a/report/makefile b/report/makefile
index 6359a42..03a10e1 100644
--- a/report/makefile
+++ b/report/makefile
@@ -10,8 +10,16 @@ FLAGS = --bibliography=bibliography.bib \
 
 FLAGS_PDF = --template=template.latex
 
-all: pdf
+all: pdf code
+code:
+	echo '\small' > build/code.aux
+	echo '~~~~ {.python .numberLines}' >> build/code.aux
+	cat ../train.py >> build/code.aux
+	echo -n '~~~~' >> build/code.aux
+	pandoc -V geometry:margin=3em \
+		-o build/code.pdf build/code.aux
+	pdfjoin build/paper.pdf build/code.pdf -o build/paper+code.pdf
 
 pdf:
 	pandoc -o $(OUTPUT)/paper.pdf $(FLAGS) $(FLAGS_PDF) $(FILES)
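Outside of make, the new `code` target amounts to the following shell steps; a minimal sketch using a stand-in source file (the real target reads `../train.py`), with the render-and-join steps left as comments since they need pandoc, a LaTeX install, and pdfjoin:

```shell
# Build the code appendix the way the new `code` target does.
# `train_example.py` is a stand-in here; the real target reads ../train.py.
mkdir -p build
printf 'print("hello")\n' > train_example.py

printf '\\small\n' > build/code.aux                   # shrink the listing font (makefile uses echo)
echo '~~~~ {.python .numberLines}' >> build/code.aux  # open a numbered pandoc fenced block
cat train_example.py >> build/code.aux                # inline the script verbatim
printf '~~~~' >> build/code.aux                       # close the fence (makefile uses echo -n)

# Rendering and appending to the paper requires pandoc and pdfjoin:
#   pandoc -V geometry:margin=3em -o build/code.pdf build/code.aux
#   pdfjoin build/paper.pdf build/code.pdf -o build/paper+code.pdf
```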
diff --git a/report/metadata.yaml b/report/metadata.yaml
index 6c482df..7113dce 100755
--- a/report/metadata.yaml
+++ b/report/metadata.yaml
@@ -9,10 +9,10 @@ numbersections: yes
 lang: en
 babel-lang: english
 abstract: |
-  In this coursework we will analyze the benefits of different face recognition methods.
-  We analyze dimensionality reduction with PCA, obtaining a generative subspace which is very reliable for face reconstruction. Furthermore, we evaluate LDA, which is able to perform reliable classification, generating a discriminative subspace, where separation of classes is easier to identify.
+  In this coursework we analyze the benefits of different face recognition methods.
+  We look at dimensionality reduction with PCA, obtaining a generative subspace which is very reliable for face reconstruction. Furthermore, we evaluate LDA, which is able to perform reliable classification, generating a discriminative subspace, where separation of classes is easier to identify.
-  In the final part we analyze the benefits of using a combined version of the two methods using Fisherfaces and evaluate the benefits of ensemble learning with regards to data and feature space ranodmisation. We find that combined PCA-LDA obtains lower classification error PCA or LDA individually, while also maintaining a low computational costs, allowing us to take advantage of ensemble learning.
+  In the final part we analyze the benefits of using a combined version of the two methods using Fisherfaces and evaluate the benefits of ensemble learning with regards to data and feature space randomisation. We find that combined PCA-LDA obtains lower classification error than PCA or LDA individually, while also maintaining low computational costs, allowing us to take advantage of ensemble learning.
   The dataset used includes 52 classes with 10 samples each. The number of features is 2576 (46x56).
 ...
diff --git a/report/paper.md b/report/paper.md
index 44d3d70..7fb0961 100755
--- a/report/paper.md
+++ b/report/paper.md
@@ -514,4 +514,6 @@ We know that $S\boldsymbol{u\textsubscript{i}} = \lambda \textsubscript{i}\bolds
 From here it follows that AA\textsuperscript{T} and A\textsuperscript{T}A have the same eigenvalues and their eigenvectors follow the relationship $\boldsymbol{u\textsubscript{i}} = A\boldsymbol{v\textsubscript{i}}$.
+# Code
+All code and \LaTeX\ sources are available at [https://git.skozl.com/e4-pattern/](https://git.skozl.com/e4-pattern/).
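The eigenvector relationship quoted in the hunk above is worth spelling out; a minimal sketch of the step, assuming (as in the standard low-dimensional trick) that the scatter matrix $S$ is proportional to $AA^{T}$:

```latex
% If v_i is an eigenvector of A^T A with eigenvalue \lambda_i,
% left-multiplying by A shows u_i = A v_i is an eigenvector of
% AA^T with the same eigenvalue:
\[
  A^{T}A\,\boldsymbol{v}_i = \lambda_i \boldsymbol{v}_i
  \;\Longrightarrow\;
  AA^{T}\,(A\boldsymbol{v}_i) = \lambda_i\,(A\boldsymbol{v}_i)
  \;\Longrightarrow\;
  \boldsymbol{u}_i = A\boldsymbol{v}_i .
\]
```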