1. Introduction
In recent years, along with the rapid development of information technologies and the extensive amounts of data being collected, the field of mathematical modeling and computer simulation has advanced rapidly in both data-intensive and computation-intensive directions. Computer simulations are of great importance mainly in situations where it is impractical, too expensive, or too risky to experiment directly on the real-world system, or when a system must be evaluated before it is actually built [1], [2]. Using simulations, the behavior of an event can be predicted from a set of parameters and initial conditions, and solutions can be found in advance, before time, money, and materials are invested. In most cases, simulating a realistic scenario is a long-running, computationally intensive, and memory-consuming job. Moreover, both model calibration and the research objective itself call for a great number of simulation runs over diverse input scenarios with varying input parameters and boundary conditions. For these reasons, the researcher must have access to an HPC infrastructure capable of accomplishing this task in as short a time as possible.

A second, no less important, issue is understanding the middleware required to execute applications; for scientists outside informatics, it is often too complicated to use correctly. A promising approach has proven to be the use of application-supporting tools that simplify the submission and execution of simulations. By hiding much of the middleware complexity, they insulate the researcher from the computing platform and give the impression that all the resources in use are available within a coherent virtual computer center.