In the past few years, the Generative Adversarial Network (GAN) has become a prevalent research topic. GANs can generate good-quality images that resemble natural images from a random vector. In this paper, we follow the basic idea of the GAN and propose a novel model for image saliency detection, called Supervised Adversarial Networks (SAN). Unlike a standard GAN, however, the proposed method trains both the G-Network and the D-Network with fully supervised learning by exploiting the class labels of the training set. Moreover, a novel kind of layer, called the conv-comparison layer, is introduced into the D-Network to further improve saliency performance. Experimental results on the Pascal VOC 2012 database show that the SAN model can generate high-quality saliency maps for many complicated natural images.
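The abstract does not specify the internals of the conv-comparison layer, so the following minimal numpy sketch shows only one plausible reading: shared convolution weights are applied to both the predicted and the ground-truth saliency map, and the feature responses are compared element-wise inside the D-Network. The function names and the absolute-difference comparison rule are illustrative assumptions, not the paper's definition.

```python
import numpy as np

def conv2d(x, k):
    # Naive valid-mode 2D convolution for a single-channel map.
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def conv_comparison(pred_map, gt_map, kernel):
    # Hypothetical "conv-comparison" step: convolve both saliency maps
    # with the same learned kernel, then compare the responses.
    # A low output means the prediction matches the ground truth locally.
    f_pred = conv2d(pred_map, kernel)
    f_gt = conv2d(gt_map, kernel)
    return np.abs(f_pred - f_gt)
```

Under this reading, a perfect prediction drives the comparison output to zero everywhere, giving the discriminator a spatially localized supervision signal rather than a single real/fake score.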
In fine-grained object recognition, over-fitting often occurs because of the small number of fine-grained samples per category, especially for CNNs with deep layers and millions of parameters. We therefore propose a data augmentation method based on feature interest points, which alleviates the over-fitting problem and effectively improves classification accuracy. The key idea is to find the interest points that attract the classifier, which come from the output of a CNN middle layer, locate the regions corresponding to those interest points in the original images, and cut the regions out for augmentation. All of this is done entirely within the training procedure; the method requires no additional models and no extra parameters. We applied the proposed augmentation to the CUB200-2011, Stanford Dogs, and Aircraft datasets and achieved excellent performance, with an 11.32% classification accuracy improvement. The experimental results show that the proposed method can mitigate over-fitting on fine-grained images.
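The locate-and-crop step described above can be sketched as follows, assuming a single-channel middle-layer activation map and a fixed square crop. The function name `interest_point_crop` and the peak-to-image coordinate mapping are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def interest_point_crop(image, feature_map, crop_size):
    """Crop a region of `image` around the strongest activation in
    `feature_map` (e.g. the output of a CNN middle layer), producing
    an augmented training sample focused on a discriminative part."""
    H, W = image.shape[:2]
    fh, fw = feature_map.shape
    # Location of the peak activation in feature-map coordinates.
    fy, fx = np.unravel_index(np.argmax(feature_map), feature_map.shape)
    # Map back to image coordinates (the feature map is downsampled).
    cy = int((fy + 0.5) * H / fh)
    cx = int((fx + 0.5) * W / fw)
    half = crop_size // 2
    # Clamp so the crop window stays inside the image bounds.
    y0 = min(max(cy - half, 0), H - crop_size)
    x0 = min(max(cx - half, 0), W - crop_size)
    return image[y0:y0 + crop_size, x0:x0 + crop_size]
```

Because the crop is driven by activations the network already computes during training, the augmentation adds no extra model and no learned parameters, matching the claim in the abstract.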