Diffstat (limited to 'report')
 report/paper.md | 8 ++++++--
 1 file changed, 6 insertions(+), 2 deletions(-)
diff --git a/report/paper.md b/report/paper.md
index 53cdb3f..e053353 100644
--- a/report/paper.md
+++ b/report/paper.md
@@ -151,7 +151,7 @@ architectures in Q2.**
We measure the performance of the considered GANs using the Inception score [-inception], as calculated
with L2-Net logits.
-$$ \textrm{IS}(x) = \exp(\mathcal{E}_x \left( \textrm{KL} ( p(y\|x) \|\| p(y) ) \right) ) $$
+$$ \textrm{IS}(x) = \exp(\mathbb{E}_x \left( \textrm{KL} ( p(y\mid x) \| p(y) ) \right) ) $$
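As a rough illustration of the formula above (not part of the report's code), the following NumPy sketch computes the score from per-sample class probabilities `p_yx`, assumed here to be the softmax of the L2-Net logits with shape `(N, num_classes)`:

```python
import numpy as np

def inception_score(p_yx, eps=1e-12):
    """IS = exp(E_x[ KL( p(y|x) || p(y) ) ]) for probabilities p_yx of shape (N, classes)."""
    p_y = p_yx.mean(axis=0, keepdims=True)                                # marginal p(y)
    kl = np.sum(p_yx * (np.log(p_yx + eps) - np.log(p_y + eps)), axis=1)  # KL(p(y|x) || p(y)) per sample
    return float(np.exp(kl.mean()))                                       # exp of the expectation over x
```

Higher scores indicate both confident per-sample predictions and diverse predicted classes.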
```
\begin{table}[]
@@ -252,7 +252,11 @@ as most of the testing images that got misclassified (mainly nines and fours) sh
# Bonus
-This is an open question. Do you have any other ideas to improve GANs or
+## Relation to PCA
+
+Similarly to GANs, PCA can be used to formulate **generative** models of a system. While GANs are trained neural networks, PCA is a deterministic statistical procedure which performs an orthogonal transformation of the data. Both attempt to identify the most important, i.e. highest-*variance*, features of the data (which we may then use to generate new data), but PCA by itself is only able to extract linearly related features. In a purely linear system, a GAN would converge to PCA. In a more complicated system, we would need to identify suitable kernels in order to extract the relevant features with PCA, while a GAN can leverage dense and convolutional neural network layers which may be trained to perform the relevant transformations.
+
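A minimal sketch of the PCA-as-generator idea (assumed setup, not taken from the report: scikit-learn's `PCA`, with new latent coefficients drawn from a per-component Gaussian fitted to the projected training data):

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_generate(X_train, n_components=32, n_samples=16, rng=None):
    """Fit PCA, sample latent coefficients from a Gaussian, and decode them back to data space."""
    rng = np.random.default_rng() if rng is None else rng
    pca = PCA(n_components=n_components).fit(X_train)
    Z = pca.transform(X_train)                           # project onto principal components
    z_new = rng.normal(Z.mean(axis=0), Z.std(axis=0),
                       size=(n_samples, n_components))   # sample new latent codes
    return pca.inverse_transform(z_new)                  # linear "decoder" back to data space
```

Because both the projection and the reconstruction are linear maps, this can only capture linearly related features, which is exactly the limitation discussed above.
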
+* This is an open question. Do you have any other ideas to improve GANs or
have more insightful and comparative evaluations of GANs? Ideas are not limited. For instance,
\begin{itemize}