What does "stochastic" mean? Basically, it describes something that involves a random variable.
A stochastic model is a tool for estimating probability distributions of potential outcomes by allowing for random variation in one or more inputs over time. The behaviour of this random variation is usually derived from historical data for a selected period.
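As a minimal sketch of this idea, the simulation below resamples hypothetical historical figures (the numbers are assumptions for illustration) to estimate the distribution of a future total via repeated random runs:

```python
import random

# Hypothetical historical daily demand figures (assumed data for illustration).
historical_demand = [96, 104, 99, 101, 103, 97, 100, 102, 98, 100]

def simulate_path(days, rng):
    """Simulate one possible future by resampling the historical variation."""
    return [rng.choice(historical_demand) for _ in range(days)]

def estimate_distribution(days=30, runs=10_000, seed=42):
    """Run many simulations to estimate the distribution of total demand."""
    rng = random.Random(seed)
    totals = [sum(simulate_path(days, rng)) for _ in range(runs)]
    mean = sum(totals) / len(totals)
    return mean, min(totals), max(totals)

mean, low, high = estimate_distribution()
```

Each run is one possible outcome; taken together, the runs approximate the probability distribution of outcomes rather than a single deterministic forecast.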
Examples of stochastic models include simulation models, queueing models, and Markov chains.
A queueing model, within the scope of queueing theory, is used to approximate a real queueing system so that it can be analysed mathematically. Queueing models allow a number of useful steady-state performance measures to be determined, including:
- The average number (of entities to be served) in the queue, or in the system.
- The average time spent (by the entities) in the queue, or in the system.
- The statistical distribution of the arrivals and the service times.
- The probability that the queue is full, or empty.
- The probability of finding the system in a particular state.
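For the simplest such model, the M/M/1 queue (Poisson arrivals at rate λ, exponential service at rate μ, a single server), these measures have well-known closed forms. The sketch below computes them, assuming a stable queue (λ < μ); the example rates are arbitrary:

```python
def mm1_measures(lam, mu):
    """Steady-state performance measures of an M/M/1 queue.

    lam: arrival rate (lambda), mu: service rate; requires lam < mu.
    The probability of finding exactly n entities in the system is
    (1 - rho) * rho**n, so p_empty below is the n = 0 case.
    """
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu                   # server utilisation
    return {
        "L": rho / (1 - rho),        # average number in the system
        "Lq": rho**2 / (1 - rho),    # average number in the queue
        "W": 1 / (mu - lam),         # average time in the system
        "Wq": rho / (mu - lam),      # average time waiting in the queue
        "p_empty": 1 - rho,          # probability the system is empty
    }

m = mm1_measures(lam=2.0, mu=5.0)
```

The measures are linked by Little's law, L = λW: the average number in the system equals the arrival rate times the average time spent in it.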
These performance measures are important as issues or problems caused by queueing situations are often related to customer dissatisfaction with service or may be the root cause of economic losses in a business. Analysis of the relevant queueing models allows the cause of queueing issues to be identified and the impact of proposed changes to be assessed.
In mathematics, a Markov chain (see note 1) is a stochastic process with the Markov property (see note 2).
Future states will be reached through a probabilistic process instead of a deterministic one. At each step the system may change its state from the current state to another state, or remain in the same state, according to a certain probability distribution.
The changes of state are called transitions, and the probabilities associated with various state-changes are called transition probabilities.
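The process above can be sketched with a small two-state chain; the states and transition probabilities are assumptions chosen only for illustration:

```python
import random

# Hypothetical two-state weather chain (assumed transition probabilities).
transition = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """One transition: the next state depends only on the current state's row."""
    probs = transition[state]
    return rng.choices(list(probs), weights=list(probs.values()))[0]

def run_chain(start, steps, seed=0):
    """Simulate the chain for a number of steps, returning the visited states."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(steps):
        state = step(state, rng)
        path.append(state)
    return path

path = run_chain("sunny", 10)
```

Note that `step` consults only the current state, never the history: that is the Markov property in code. Over a long run, the fraction of time spent in each state approaches the chain's stationary distribution.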
- Note 1: Andrey (Andrei) Andreyevich Markov (June 14, 1856 N.S. – July 20, 1922) was a Russian mathematician, best known for his work on the theory of stochastic processes. The subject of his research later became known as Markov chains.
- Note 2: Having the Markov property means that, given the present state, future states are independent of the past states. In other words, the present state fully captures all the information that could influence the future evolution of the process.