
Augmented GAN: Advancing Generative Adversarial Networks Through Enhanced Discriminator Training and Generator Stability


Abstract:

Generative Adversarial Networks (GANs) have emerged as a powerful framework for generating reliable and transformative synthetic data in areas such as image generation, image and text synthesis, and data augmentation. This paper presents a comprehensive guide to building a Generative Adversarial Network using TensorFlow and Python. We delve into the implementation details, providing step-by-step instructions for constructing both the generator and discriminator networks using TensorFlow, a popular deep-learning framework. Furthermore, we explore techniques for optimizing GAN performance, including architectural modifications, loss function selection, and training strategies. We demonstrate how to train a GAN on real-world datasets through practical examples, showcasing its capability to generate high-quality synthetic samples that closely resemble the training data distribution. In this paper, we use the GAN to match fashion products such as clothing and accessories. This paper also focuses on improving the robustness and generalization of Convolutional Neural Network (CNN) models used for tasks such as image classification and product recommendation. By generating synthetic images with the GAN and incorporating this artificial data into the training process, we aim to improve the performance of CNN models in real-world applications.
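
The abstract describes building the generator and discriminator in TensorFlow and training them with a chosen adversarial loss. The paper's exact architecture and dataset are not given here, so the following is only a minimal illustrative sketch in TensorFlow/Keras: the layer sizes, the 128-dimensional latent vector, the 28x28 grayscale fashion images, and the non-saturating binary cross-entropy losses are all assumptions for illustration, not the authors' reported configuration.

    # Illustrative sketch of a small GAN for fashion images in TensorFlow/Keras.
    # Latent size, layer widths, and 28x28 grayscale inputs are assumed, not from the paper.
    import tensorflow as tf
    from tensorflow.keras import layers

    LATENT_DIM = 128  # assumed size of the generator's noise input

    def build_generator():
        # Upsample a noise vector into a 28x28x1 synthetic image.
        return tf.keras.Sequential([
            layers.Input(shape=(LATENT_DIM,)),
            layers.Dense(7 * 7 * 128),
            layers.LeakyReLU(0.2),
            layers.Reshape((7, 7, 128)),
            layers.Conv2DTranspose(128, 4, strides=2, padding="same"),  # 14x14
            layers.LeakyReLU(0.2),
            layers.Conv2DTranspose(64, 4, strides=2, padding="same"),   # 28x28
            layers.LeakyReLU(0.2),
            layers.Conv2D(1, 7, padding="same", activation="sigmoid"),
        ])

    def build_discriminator():
        # Downsample an image to a single real/fake logit.
        return tf.keras.Sequential([
            layers.Input(shape=(28, 28, 1)),
            layers.Conv2D(64, 4, strides=2, padding="same"),
            layers.LeakyReLU(0.2),
            layers.Conv2D(128, 4, strides=2, padding="same"),
            layers.LeakyReLU(0.2),
            layers.Flatten(),
            layers.Dropout(0.3),
            layers.Dense(1),
        ])

    bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
    g_opt = tf.keras.optimizers.Adam(1e-4)
    d_opt = tf.keras.optimizers.Adam(1e-4)

    @tf.function
    def train_step(generator, discriminator, real_images):
        # One adversarial update: discriminator separates real from fake,
        # generator tries to make fakes that the discriminator labels real.
        batch = tf.shape(real_images)[0]
        noise = tf.random.normal((batch, LATENT_DIM))
        with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
            fake_images = generator(noise, training=True)
            real_logits = discriminator(real_images, training=True)
            fake_logits = discriminator(fake_images, training=True)
            d_loss = bce(tf.ones_like(real_logits), real_logits) + \
                     bce(tf.zeros_like(fake_logits), fake_logits)
            g_loss = bce(tf.ones_like(fake_logits), fake_logits)
        g_grads = g_tape.gradient(g_loss, generator.trainable_variables)
        d_grads = d_tape.gradient(d_loss, discriminator.trainable_variables)
        g_opt.apply_gradients(zip(g_grads, generator.trainable_variables))
        d_opt.apply_gradients(zip(d_grads, discriminator.trainable_variables))
        return g_loss, d_loss

Samples drawn from a trained generator of this kind could then be mixed into a CNN's training set, which is the augmentation strategy the abstract proposes for improving image classification and product recommendation models.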
Date of Conference: 04-05 July 2024
Date Added to IEEE Xplore: 22 August 2024
Conference Location: Karaikal, India