author | nunzip <np.scarh@gmail.com> | 2019-03-10 19:51:34 +0000
---|---|---
committer | nunzip <np.scarh@gmail.com> | 2019-03-10 19:51:34 +0000
commit | 3b7847633545673117eff53f66f47db519ad6cf2 (patch) |
tree | d60c8f4489becbe36f068734101a446bbb0e241f /report |
parent | d5e414f34e968fe18ff287591b7dbffec910314b (diff) |
download | e4-gan-3b7847633545673117eff53f66f47db519ad6cf2.tar.gz e4-gan-3b7847633545673117eff53f66f47db519ad6cf2.tar.bz2 e4-gan-3b7847633545673117eff53f66f47db519ad6cf2.zip |
Rewrite table
Diffstat (limited to 'report')
-rw-r--r-- | report/paper.md | 28 |
1 file changed, 12 insertions, 16 deletions
diff --git a/report/paper.md b/report/paper.md
index f3d73dc..3853717 100644
--- a/report/paper.md
+++ b/report/paper.md
@@ -153,26 +153,22 @@ with L2-Net logits.
 
 $$ \textrm{IS}(x) = \exp(\mathbb{E}_x \left( \textrm{KL} ( p(y\mid x) \| p(y) ) \right) ) $$
 
-```
 \begin{table}[]
 \begin{tabular}{llll}
-& \begin{tabular}[c]{@{}l@{}}Test \\ Accuracy \\ (L2-Net)\end{tabular} & \begin{tabular}[c]{@{}l@{}}Inception \\ Score \\ (L2-Net)\end{tabular} & \begin{tabular}[c]{@{}l@{}}Execution \\ time\\ (Training \\ GAN)\end{tabular} \\ \hline
- Shallow CGAN & 0.645 & 3.57 & 8:14 \\
- Medium CGAN & 0.715 & 3.79 & 10:23 \\
- Deep CGAN & 0.739 & 3.85 & 16:27 \\
- Convolutional CGAN & 0.737 & 4 & 25:27 \\
-
- \begin{tabular}[c]{@{}l@{}}Medium CGAN\\ One-sided label \\ smoothing\end{tabular} & 0.749 & 3.643 & 10:42 \\
- \begin{tabular}[c]{@{}l@{}}Convolutional CGAN\\ One-sided label \\ smoothing\end{tabular} & 0.601 & 2.494 & 27:36 \\
- \begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Dropout 0.1\end{tabular} & 0.761 & 3.836 & 10:36 \\
- \begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Dropout 0.5\end{tabular} & 0.725 & 3.677 & 10:36 \\
- \begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Virtual Batch \\ Normalization\end{tabular} & ? & ? & ? \\
- \begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Virtual Batch \\ Normalization\\ One-sided label \\ smoothing\end{tabular} & ? & ? & ? \\
- *MNIST original & 0.9846 & 9.685 & N/A
-
+ & \begin{tabular}[c]{@{}l@{}}Test \\ Accuracy \\ (L2-Net)\end{tabular} & \begin{tabular}[c]{@{}l@{}}Inception \\ Score \\ (L2-Net)\end{tabular} & \begin{tabular}[c]{@{}l@{}}Execution \\ time\\ (Training \\ GAN)\end{tabular} \\ \hline
+Shallow CGAN & 0.645 & 3.57 & 8:14 \\
+Medium CGAN & 0.715 & 3.79 & 10:23 \\
+Deep CGAN & 0.739 & 3.85 & 16:27 \\
+Convolutional CGAN & 0.737 & 4 & 25:27 \\
+\begin{tabular}[c]{@{}l@{}}Medium CGAN\\ One-sided label \\ smoothing\end{tabular} & 0.749 & 3.643 & 10:42 \\
+\begin{tabular}[c]{@{}l@{}}Convolutional CGAN\\ One-sided label \\ smoothing\end{tabular} & 0.601 & 2.494 & 27:36 \\
+\begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Dropout 0.1\end{tabular} & 0.761 & 3.836 & 10:36 \\
+\begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Dropout 0.5\end{tabular} & 0.725 & 3.677 & 10:36 \\
+\begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Virtual Batch \\ Normalization\end{tabular} & ? & ? & ? \\
+\begin{tabular}[c]{@{}l@{}}Medium CGAN\\ Virtual Batch \\ Normalization\\ One-sided label \\ smoothing\end{tabular} & ? & ? & ? \\
+*MNIST original & 0.9846 & 9.685 & N/A
 \end{tabular}
 \end{table}
-```
 
 # Re-training the handwritten digit classifier
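
For context on the Inception Score column in the rewritten table, here is a minimal sketch of how the IS formula shown in the diff could be evaluated from classifier softmax outputs. The `inception_score` helper and the random placeholder predictions are illustrative assumptions, not code from this repository; in the report's setting, `preds` would come from the L2-Net classifier run on generated images.

```python
import numpy as np

def inception_score(preds, eps=1e-16):
    """IS = exp(E_x[ KL(p(y|x) || p(y)) ]) for an (N, num_classes) array
    of softmax predictions, following the formula quoted in the report."""
    p_y = np.mean(preds, axis=0, keepdims=True)  # marginal label distribution p(y)
    kl = np.sum(preds * (np.log(preds + eps) - np.log(p_y + eps)), axis=1)  # per-sample KL
    return float(np.exp(np.mean(kl)))

# Hypothetical usage; in practice preds = classifier.predict(generated_images)
preds = np.random.dirichlet(np.ones(10), size=100)  # placeholder softmax outputs
print(inception_score(preds))
```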