author | nunzip <np.scarh@gmail.com> | 2019-03-08 10:37:52 +0000 |
---|---|---|
committer | nunzip <np.scarh@gmail.com> | 2019-03-08 10:37:52 +0000 |
commit | 1ea8da8eef6b424794c01b9ebb23bd674ed90b20 (patch) | |
tree | 07a65f8126cbe32560a55ff737b51767d1f38c0b /report | |
parent | e58605e30e90bbfcbfd37dcac57e9d97d4c17a85 (diff) | |
download | e4-gan-1ea8da8eef6b424794c01b9ebb23bd674ed90b20.tar.gz e4-gan-1ea8da8eef6b424794c01b9ebb23bd674ed90b20.tar.bz2 e4-gan-1ea8da8eef6b424794c01b9ebb23bd674ed90b20.zip |
Add complete inception table
Diffstat (limited to 'report')
-rw-r--r-- | report/paper.md | 27 |
1 file changed, 16 insertions, 11 deletions
diff --git a/report/paper.md b/report/paper.md
index f079f95..54f25db 100644
--- a/report/paper.md
+++ b/report/paper.md
@@ -150,17 +150,22 @@ with L2-Net logits.
 
 $$ \textrm{IS}(x) = \exp(\mathcal{E}_x \left( \textrm{KL} ( p(y\|x) \|\| p(y) ) \right) ) $$
 
-GAN type Inception Score (L2-Net) Test Accuracy (L2-Net)
-MNIST(ref) 9.67 1%
-cGAN 6.01 2%
-cGAN+VB 6.2 3%
-cGAN+LS 6.3 .
-cGAN+VB+LS 6.4 .
-cDCGAN+VB 6.5 .
-cDCGAN+LS 6.8 .
-cDCGAN+VB+LS 7.3 .
-
-
+\begin{table}[]
+\begin{tabular}{lll}
+ & \begin{tabular}[c]{@{}l@{}}Test Accuracy \\ (L2-Net)\end{tabular} & \begin{tabular}[c]{@{}l@{}}Inception Score \\ (L2-Net)\end{tabular} \\ \hline
+ Shallow CGAN & 0.7031 & 5.8 \\
+ Medium CGAN & 0.7837 & 6.09 \\
+ Deep CGAN & 0.8038 & 6.347 \\
+ Convolutional CGAN & 0.7714 & 6.219 \\
+ \begin{tabular}[c]{@{}l@{}}Medium CGAN\\ One-sided label smoothing\end{tabular} & 0.8268 & 6.592 \\
+ \begin{tabular}[c]{@{}l@{}}Convolutional CGAN\\ One-sided label smoothing\end{tabular} & 0.821 & 7.944 \\
+ \begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Dropout 0.1\end{tabular} & 0.7697 & 6.341 \\
+ \begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Dropout 0.5\end{tabular} & 0.751 & 6.16 \\
+ \begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Virtual Batch Normalization\end{tabular} & 0.787 & 6.28 \\
+ \begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Virtual Batch Normalization\\ One-sided label smoothing\end{tabular} & 0.829 & 6.62 \\
+ *MNIST original test set & 0.9846 & 9.685
+ \end{tabular}
+ \end{table}
 
 # Re-training the handwritten digit classifier
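For context, the hunk above swaps a plain-text results table for a LaTeX one built around the Inception Score formula shown in the context line. Below is a minimal sketch of how a score of that form could be computed from classifier softmax outputs; it assumes `probs` holds p(y|x) from the L2-Net classifier mentioned in the paper, and the function name, NumPy usage and `eps` smoothing constant are illustrative rather than taken from this repository's code.

```python
import numpy as np

def inception_score(probs, eps=1e-12):
    """Illustrative sketch: `probs` is an (N, num_classes) array of
    classifier softmax outputs p(y|x) for N generated images."""
    probs = np.asarray(probs, dtype=np.float64)
    # Marginal label distribution p(y), estimated by averaging over samples
    p_y = probs.mean(axis=0, keepdims=True)
    # Per-sample KL(p(y|x) || p(y))
    kl = np.sum(probs * (np.log(probs + eps) - np.log(p_y + eps)), axis=1)
    # IS = exp of the expected KL divergence
    return float(np.exp(kl.mean()))

# Example: confident predictions spread evenly over 10 digit classes
# push the score towards its upper bound of 10.
dummy = np.full((1000, 10), 1e-3)
dummy[np.arange(1000), np.arange(1000) % 10] = 1.0 - 9e-3
print(inception_score(dummy))
```

Since the score is bounded above by the number of classes, confident and well-spread predictions on MNIST approach 10, which is consistent with the 9.685 reference value the new table reports for the original MNIST test set.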