
Training Generative Adversarial Networks With Weights


Abstract:

The impressive success of Generative Adversarial Networks (GANs) is often overshadowed by the difficulties in their training. Despite continuous efforts and improvements, there are still open issues regarding their convergence properties. In this paper, we propose a simple training variation where suitable weights are defined and assist the training of the Generator. We provide theoretical arguments which indicate that the proposed algorithm is better than the baseline algorithm in the sense of creating a stronger Generator at each iteration. Performance results showed that the new algorithm is more accurate and converges faster on both synthetic and image datasets, resulting in improvements ranging between 5% and 50%.
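The abstract does not specify how the per-sample weights are constructed, so the following is only a minimal sketch of how weights could enter a generator objective in general. It weights the standard non-saturating generator loss per sample; the particular weighting rule shown (emphasizing samples the Discriminator confidently rejects) is a hypothetical illustration, not the paper's definition.

```python
import numpy as np

def weighted_generator_loss(d_fake, weights=None):
    """Weighted non-saturating generator loss: -mean(w * log D(G(z))).

    d_fake  : Discriminator outputs in (0, 1) for a batch of generated samples.
    weights : optional per-sample weights; normalized to mean 1 so the
              loss scale stays comparable to the unweighted baseline.
    """
    d_fake = np.asarray(d_fake, dtype=float)
    if weights is None:
        weights = np.ones_like(d_fake)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.mean()  # normalize to mean 1
    # Small epsilon guards against log(0) for saturated Discriminator outputs.
    return float(-np.mean(weights * np.log(d_fake + 1e-12)))

# Hypothetical weighting: up-weight samples the Discriminator rejects.
d_out = np.array([0.9, 0.5, 0.1])   # D(G(z)) for three generated samples
w = 1.0 - d_out                     # illustrative rule, not the paper's
baseline = weighted_generator_loss(d_out)      # unweighted loss
weighted = weighted_generator_loss(d_out, w)   # weighted loss
```

With this particular choice of weights, poorly scoring samples dominate the batch average, so the weighted loss exceeds the unweighted one and the gradient focuses on the samples the Generator handles worst.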
Date of Conference: 02-06 September 2019
Date Added to IEEE Xplore: 18 November 2019
Conference Location: A Coruña, Spain

1. Introduction

A fully data-driven paradigm for conducting science has emerged in recent years with the advent of GANs [1]. A GAN offers a methodology for drawing samples from an unknown distribution when only samples from that distribution are available, making GANs one of the most active areas of machine learning/artificial intelligence research. Indicatively, GANs have been successfully utilized in (conditional) image creation [2], [3], [4], generating very realistic samples [5], [6], speech signal processing [7], [8], natural language processing [9] and astronomy [10], to name a few.
