Training Generative Adversarial Networks (GANs) Over Parameter Server and Worker Node Architecture

Machine Learning and Big Data Analytics (2023)

Abstract
One of the most prominent recent developments in artificial intelligence (AI) is the study and widespread use of Generative Adversarial Networks (GANs). GANs have made progress in numerous applications such as image editing, style transfer, and scene generation. However, these generative models are computationally demanding because a GAN consists of two deep neural networks and is trained on large datasets. Like other AI models, GANs also face the problem of insufficient data when trained for some real-world scenarios. In many situations, the available data may be limited and distributed across various worker nodes (i.e., end users), where the local datasets are inherently private and the workers are unwilling to share them. In this chapter, we address the problem of training GANs in a distributed manner, so that they can be trained over datasets spread across multiple worker nodes. We develop a training framework for GANs under the parameter server and worker node setting. Under this framework, the workers can jointly produce samples that resemble real data while training in a fully distributed way and keeping their local data confidential. Experimental results on the CIFAR-10 dataset show that our architecture can generate high-quality data samples that resemble real data and can be used in various real-life applications.
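The workflow described in the abstract follows the classic parameter server pattern: a central server holds the global generator and discriminator, each worker runs local GAN updates on its private data shard, and only model parameters (never raw data) travel back to the server for aggregation. The sketch below is a minimal single-process simulation of that loop in PyTorch; the Worker class, the tiny MLP models, the synthetic data shards, and the parameter-averaging rule are all illustrative assumptions rather than the chapter's actual implementation, which trains on CIFAR-10.

# Minimal single-process simulation of GAN training under a parameter
# server / worker node topology (illustrative sketch, not the chapter's code).
# Each Worker holds a private data shard; the server broadcasts the global
# generator/discriminator, collects locally updated parameters, and averages
# them. A real deployment would use CIFAR-10 shards and network communication.
import copy
import torch
import torch.nn as nn

LATENT, DATA_DIM = 16, 32            # toy dimensions (assumed, not from the paper)
N_WORKERS, ROUNDS, LOCAL_STEPS = 4, 20, 5

def make_generator():
    return nn.Sequential(nn.Linear(LATENT, 64), nn.ReLU(), nn.Linear(64, DATA_DIM))

def make_discriminator():
    return nn.Sequential(nn.Linear(DATA_DIM, 64), nn.ReLU(), nn.Linear(64, 1))

class Worker:
    def __init__(self, data):
        self.data = data  # private local shard; never sent to the server

    def local_update(self, g_state, d_state):
        # Pull the global parameters, run a few local GAN steps, and push
        # back the updated parameters (only parameters leave the worker).
        G, D = make_generator(), make_discriminator()
        G.load_state_dict(g_state)
        D.load_state_dict(d_state)
        opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
        opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
        bce = nn.BCEWithLogitsLoss()
        for _ in range(LOCAL_STEPS):
            real = self.data[torch.randint(len(self.data), (32,))]
            fake = G(torch.randn(32, LATENT))
            # Discriminator step: separate real from generated samples.
            d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
            opt_d.zero_grad(); d_loss.backward(); opt_d.step()
            # Generator step: try to fool the updated discriminator.
            g_loss = bce(D(fake), torch.ones(32, 1))
            opt_g.zero_grad(); g_loss.backward(); opt_g.step()
        return G.state_dict(), D.state_dict()

def average_states(states):
    # Server-side aggregation: element-wise mean of the workers' parameters.
    out = copy.deepcopy(states[0])
    for k in out:
        out[k] = torch.stack([s[k].float() for s in states]).mean(dim=0)
    return out

# Parameter server loop: broadcast, local training, aggregate.
G, D = make_generator(), make_discriminator()
workers = [Worker(torch.randn(256, DATA_DIM) + i) for i in range(N_WORKERS)]  # synthetic shards
for _ in range(ROUNDS):
    updates = [w.local_update(G.state_dict(), D.state_dict()) for w in workers]
    G.load_state_dict(average_states([u[0] for u in updates]))
    D.load_state_dict(average_states([u[1] for u in updates]))
print("fake sample batch shape:", G(torch.randn(8, LATENT)).shape)

In the chapter's setting, the workers would hold disjoint partitions of CIFAR-10 and exchange updates with the parameter server over a network; the parameter-averaging step above is only a common stand-in for whatever aggregation rule the authors actually use.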
Keywords
GANs, parameter server, worker node architecture, networks