-rwxr-xr-x | report2/paper.md | 44
1 file changed, 19 insertions, 25 deletions
diff --git a/report2/paper.md b/report2/paper.md
index 0ceeedd..e4c3ec0 100755
--- a/report2/paper.md
+++ b/report2/paper.md
@@ -1,17 +1,13 @@
-# Summary
-In this report we analysed how distance metrics learning affects classification
-accuracy for the dataset CUHK03. The baseline method used for classification is
-Nearest Neighbors based on Euclidean distance. The improved approach we propose
-mixes Jaccardian and Mahalanobis metrics to obtain a ranklist that takes into
-account also the reciprocal neighbors. This approach is computationally more
-complex, since the matrices representing distances are effectively calculated
-twice. However it is possible to observe a significant accuracy improvement of
-around 10% for the $@rank1$ case. Accuracy improves overall, especially for
-$@rankn$ cases with low n.
-
 # Formulation of the Addresssed Machine Learning Problem
 
-## CUHK03
+## Problem Definition
+
+The problem to solve is to create a ranklist for each image of the query set
+by finding the nearest neighbor(s) within a gallery set. However, gallery images
+with the same label and taken from the same camera as the query image should
+not be considered when forming the ranklist.
+
+## Dataset - CUHK03
 
 The dataset CUHK03 contains 14096 pictures of people captured from two
 different cameras. The feature vectors used come from passing the
@@ -23,13 +19,6 @@ on a training set (train_idx, adequately split between test, train and
 validation keeping the same number of identities). This prevents overfitting
 the algorithm to the specific data associated with query_idx and gallery_idx.
 
-## Probelm to solve
-
-The problem to solve is to create a ranklist for each image of the query set
-by finding the nearest neighbor(s) within a gallery set. However gallery images
-with the same label and taken from the same camera as the query image should
-not be considered when forming the ranklist.
-
 ## Nearest Neighbor ranklist
 
 Nearest Neighbor aims to find the gallery image whose feature are the closest to
@@ -46,7 +35,7 @@ EXPLAIN KNN BRIEFLY
 \begin{figure}
 \begin{center}
 \includegraphics[width=20em]{fig/baseline.pdf}
-\caption{Recognition accuracy of baseline Nearest Neighbor @rank k}
+\caption{Top K Accuracy for Nearest Neighbor classification}
 \label{fig:baselineacc}
 \end{center}
 \end{figure}
@@ -54,26 +43,31 @@ EXPLAIN KNN BRIEFLY
 \begin{figure}
 \begin{center}
 \includegraphics[width=22em]{fig/eucranklist.png}
-\caption{Ranklist @rank10 generated for 5 query images}
+\caption{Top 10 ranklist for 5 probes}
 \label{fig:eucrank}
 \end{center}
 \end{figure}
 
-
 # Suggested Improvement
 
 \begin{figure}
 \begin{center}
 \includegraphics[width=24em]{fig/ranklist.png}
-\caption{Ranklist (improved method) @rank10 generated for 5 query images}
+\caption{Top 10 ranklist (improved method) for 5 probes}
 \label{fig:ranklist2}
 \end{center}
 \end{figure}
+
+TODO:
+~~
+s/kNN/NN/
+~~
+
 
 \begin{figure}
 \begin{center}
 \includegraphics[width=20em]{fig/comparison.pdf}
-\caption{Comparison of recognition accuracy @rank k}
+\caption{Top K Accuracy comparison}
 \label{fig:baselineacc}
 \end{center}
 \end{figure}
@@ -81,7 +75,7 @@ EXPLAIN KNN BRIEFLY
 \begin{figure}
 \begin{center}
 \includegraphics[width=17em]{fig/pqvals.pdf}
-\caption{Recognition accuracy varying K1 and K2}
+\caption{Top 1 Accuracy when varying k1 and k2}
 \label{fig:pqvals}
 \end{center}
 \end{figure}
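
For reference, the problem-definition paragraph moved by this patch (rank every gallery image by distance to the probe, discarding gallery entries that share both the probe's label and its camera) can be sketched as follows. This is a minimal illustration under assumptions, not code from the repository: the array names (`query_feat`, `gall_lab`, `gall_cam`, ...) and the use of NumPy with Euclidean distance are assumed, matching only the baseline described in the paper.

```python
import numpy as np

def build_ranklists(query_feat, query_lab, query_cam,
                    gall_feat, gall_lab, gall_cam, top_k=10):
    """Return an (n_query, top_k) array of gallery indices, closest first."""
    ranklists = []
    for qf, ql, qc in zip(query_feat, query_lab, query_cam):
        # Euclidean distance from this probe to every gallery feature vector.
        dist = np.linalg.norm(gall_feat - qf, axis=1)
        # Exclude gallery images with the same label AND the same camera
        # as the probe, as required by the problem definition.
        dist[(gall_lab == ql) & (gall_cam == qc)] = np.inf
        ranklists.append(np.argsort(dist)[:top_k])
    return np.stack(ranklists)
```

The "Top K Accuracy" plots referenced in the captions would then count a probe as correctly matched if any of its first k ranked gallery entries carries the probe's label.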