Discrete-time stochastic systems are an essential modelling tool for many engineering systems. We consider stochastic control systems evolving over continuous state spaces. For this class of models, methods for the formal verification and synthesis of control strategies are computationally hard and generally rely on approximate abstractions. Building on such abstractions, we compute control strategies together with lower and upper bounds on the probability of satisfying unbounded temporal logic specifications. Firstly, robust dynamic programming mappings over the abstract system are introduced to solve the control synthesis and verification problem. These mappings yield a control strategy and a unique lower bound on the satisfaction probability for temporal logic specifications that is robust to the incurred approximation errors. Secondly, upper bounds on the satisfaction probability are quantified, and properties of the mappings are analysed and discussed. Finally, we show the implications of these results for linear stochastic systems with a continuous state space. This abstraction-based synthesis framework is shown to handle infinite-horizon properties. Approximation errors, expressed as deviations in the outputs of the models and as deviations in the probabilistic transitions, are quantified using approximate stochastic simulation relations.
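To make the flavour of such a robust dynamic programming mapping concrete, the following is a minimal sketch, not the paper's own construction: a value iteration over a hypothetical finite abstraction, computing a lower bound on a reachability probability that is penalised by an assumed transition-probability deviation `delta`. The names (`P`, `target`, `delta`) and the exact form of the robust Bellman backup are illustrative assumptions.

```python
import numpy as np

def robust_reach_lower_bound(P, target, delta, n_iter=100):
    """Robust lower bound on the probability of reaching `target`.

    P: array of shape (n_actions, n_states, n_states), row-stochastic
       transition kernel of the finite abstraction (hypothetical).
    target: boolean array of shape (n_states,) marking goal states.
    delta: assumed deviation in the probabilistic transitions, subtracted
       at every backup so the result is robust to the abstraction error.
    """
    V = target.astype(float)
    for _ in range(n_iter):
        # Standard Bellman backup over actions, then penalise by delta
        # and truncate to [0, 1] to keep a valid probability bound.
        Q = P @ V                           # shape (n_actions, n_states)
        V_new = np.clip(Q.max(axis=0) - delta, 0.0, 1.0)
        V_new[target] = 1.0                 # target states stay satisfied
        V = V_new
    return V

# Usage: a 2-state, 1-action abstraction where state 0 surely moves to the
# target state 1; the bound at state 0 is reduced by delta.
P = np.array([[[0.0, 1.0],
               [0.0, 1.0]]])
V = robust_reach_lower_bound(P, np.array([False, True]), delta=0.1)
```

Subtracting `delta` at each backup is what distinguishes this robust operator from plain value iteration; for unbounded (infinite-horizon) properties one would iterate to a fixed point rather than for a fixed horizon.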