I. Introduction
Data standardization involves converting heterogeneous data formats into a common representation for exchange and ensuring that the number of samples is consistent across sources of varying size; consequently, the data must be normalized so that the method is well suited to the research at hand [1]. In addition to standardizing reports, such software can include a semiology extraction function that automatically identifies key semiological features [2]. A shared underlying data schema helps overcome barriers to information sharing, promoting transparency and regulatory compliance when techniques such as Deep Learning (DL) are applied [3]. Differences in the data, the techniques employed, and their parameters all affect the performance of trained models, including their accuracy and precision [4]. To evaluate the effectiveness of the proposed model, several experiments were conducted; standardized formats are essential in this setting because they allow different systems to exchange data consistently [5]. Finally, data standardization ensures that organizations have a complete and accurate picture of their data, enabling improved data sharing in combination with DL techniques [6].
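As a minimal sketch of the normalization and sample-count alignment described above, the Python example below resamples records of differing lengths to a common size and then applies z-score normalization. The function names, record lengths, and target size are illustrative assumptions for this sketch, not definitions taken from the cited works.

```python
import numpy as np

def zscore_normalize(x: np.ndarray) -> np.ndarray:
    """Rescale each feature to zero mean and unit variance (z-score)."""
    mean = x.mean(axis=0)
    std = x.std(axis=0)
    std[std == 0] = 1.0  # avoid division by zero for constant features
    return (x - mean) / std

def resample_to_length(x: np.ndarray, target_len: int) -> np.ndarray:
    """Linearly interpolate a 1-D record to a fixed number of samples,
    so that records of varying size become directly comparable."""
    src = np.linspace(0.0, 1.0, num=len(x))
    dst = np.linspace(0.0, 1.0, num=target_len)
    return np.interp(dst, src, x)

# Hypothetical records of different lengths, brought to a common size
# and then standardized feature-wise.
record_a = np.random.default_rng(0).normal(size=120)
record_b = np.random.default_rng(1).normal(size=200)

aligned = np.stack([resample_to_length(r, 128) for r in (record_a, record_b)])
standardized = zscore_normalize(aligned)
print(standardized.shape)  # (2, 128)
```

This kind of preprocessing is one common way to realize the consistency requirement above: interpolation equalizes the sample count, and z-scoring removes scale differences before a DL model is trained.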