Machine Learning-Driven Random Number Generation: A Comparative Study of WGAN-GP and RNNs for Cryptographic Security


Rana Saeed Hamdi, Saif Al-alak, Elaf Ali Abbood

Abstract

Numerous applications require a high level of randomness, making random number generation an essential component of modern cryptographic systems. This paper explores high-quality random number generation using machine learning methods of differing complexity: recurrent neural networks (RNNs) and Wasserstein Generative Adversarial Networks with Gradient Penalty (WGAN-GP). The random sequences produced by the models under consideration could be used in secure cryptographic applications. While the RNN model captures temporal dependencies to model complex sequences, the WGAN-GP uses the Earth-Mover's distance to improve training stability and random number quality. The generated random numbers are exhaustively evaluated for randomness using the NIST and Diehard test suites. Higher p-values on the major statistical tests indicate that WGAN-GP outperforms the RNN in randomness quality. This work lays the groundwork for further studies on cryptographically secure random number generators (RNGs) and shows how machine learning could enhance the effectiveness of RNGs.
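The randomness claims above rest on statistical test suites such as NIST SP 800-22. For concreteness, here is a minimal sketch of the simplest test in that suite, the monobit frequency test, which yields the kind of p-value the abstract compares. This is an illustrative re-implementation using only the Python standard library, not the authors' code.

```python
import math

def monobit_p_value(bits):
    """NIST SP 800-22 monobit frequency test (illustrative sketch).

    Maps bits to +/-1, sums them, and converts the normalized deviation
    into a two-sided p-value via the complementary error function.
    p >= 0.01 is the conventional threshold for passing the test.
    """
    n = len(bits)
    # A balanced sequence of 0s and 1s sums to approximately zero.
    s_n = sum(1 if b else -1 for b in bits)
    s_obs = abs(s_n) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

# A perfectly balanced sequence passes with p = 1.0 ...
balanced = [0, 1] * 500
# ... while a constant sequence fails decisively (p near 0).
biased = [1] * 1000
print(monobit_p_value(balanced))  # 1.0
print(monobit_p_value(biased))    # far below the 0.01 threshold
```

A generator whose output consistently yields p-values above 0.01 across the full battery of NIST and Diehard tests is considered statistically indistinguishable from a true random source, which is the criterion the paper uses to compare the two models.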
