I. Introduction
The last two decades have witnessed increasing efforts to develop effective models for predicting proton-induced single event upsets (p-SEUs) and other radiation-induced effects (total dose, displacement damage, etc.) in microelectronic devices operating in space, as well as neutron-induced upsets in avionics systems. These efforts have evolved from the simple approach of the BGR model [1] to sophisticated models, such as that of the IBM group [2], which take into account all stages of the upset process, from the creation of the products of the nuclear interaction of the primary nucleon, through the charge collection, up to the occurrence of upsets in a particular device. Since there is no clear information on the accuracy of the models at each stage of these complex processes, it is difficult to estimate their overall applicability, even given the rough accuracy (an error of up to a factor of three) considered acceptable in SEU rate calculations. Improving the accuracy of the calculations in the nuclear interaction stage would reduce the uncertainty in the calculated SEU rates. This calls for an extended comparison between the different nuclear models, with the aim of identifying the most appropriate one for space applications. To our knowledge, no such comparison has ever been made. A previous work [3] compared the energy deposition spectra calculated by the HETC and COSMIC codes (see below), but these are based on similar intranuclear cascade (INC) models, so large differences between them were not expected.