The performance of a conservative time management algorithm in a distributed simulation system degrades significantly if
a large number of null messages are exchanged across the logical processes to avoid deadlock. This situation worsens
when the exchange of null messages increases due to poor selection of key parameters such as the lookahead
value. However, a mathematical model that approximates the optimal values of the parameters directly involved
in the performance of a time management algorithm allows us to limit the exchange of null messages. This reduction
greatly improves the performance of the time management algorithm by minimizing the transmission overhead
while maintaining consistent parallelization. This paper presents a generic mathematical model that can be effectively used
to evaluate the performance of a conservative distributed simulation system that uses null messages to avoid deadlock. Since
the proposed mathematical model is generic, the performance of any conservative synchronization algorithm can be approximated.
In addition, we develop a performance model that demonstrates how a conservative distributed simulation system performs
with the null message algorithm (NMA). The simulation results show that the performance of a conservative distributed system
degrades if the NMA generates an excessive number of null messages due to improper parameter selection. Moreover,
the proposed mathematical model highlights the critical role of lookahead, which may increase or decrease the number of null
messages exchanged across the logical processes. Furthermore, the proposed mathematical model is not limited to the NMA; it can
be used with any conservative synchronization algorithm to approximate the optimal parameter values.
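To illustrate why lookahead governs the volume of null messages, consider a minimal sketch (not the paper's model) of a Chandy-Misra-Bryant-style logical process whose input channel carries no event messages, so it advances only through null messages stamped with the sender's clock plus lookahead. The function name and parameters below are illustrative assumptions, not taken from the paper.

```python
def null_messages_to_advance(end_time: float, lookahead: float) -> int:
    """Count the null messages one logical process must receive to advance
    its simulation clock from 0 to end_time, when each null message raises
    the lower bound on incoming timestamps by `lookahead` (a simplifying
    assumption: no event messages arrive on the channel)."""
    clock = 0.0
    count = 0
    while clock < end_time:
        clock += lookahead  # receiver's new lower bound on future timestamps
        count += 1
    return count

# A larger lookahead covers the same simulated interval with fewer null
# messages, which is the parameter sensitivity the abstract describes.
few = null_messages_to_advance(100.0, lookahead=10.0)   # 10 null messages
many = null_messages_to_advance(100.0, lookahead=1.0)   # 100 null messages
```

Under this simplification, the null-message count is inversely proportional to the lookahead, so a poorly chosen (small) lookahead multiplies the transmission overhead.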