Decision-making for dynamic systems is challenging due to the scale and dynamic nature of such systems, and it comprises decisions at the strategic, tactical, and operational levels. One of the most important aspects of decision-making is incorporating real-time information that reflects the immediate status of the system. This type of decision-making, which may apply to any dynamic system, must comply with the system’s current capabilities and calls for a dynamic data-driven planning framework. The performance of such a framework relies on the decision-making process, which in turn depends on the quality of the available data. This means that the planning framework should be able to set the level of decision-making based on the current status of the system, which is learned through continuous readings of sensory data. In this work, a Markov chain Monte Carlo (MCMC) sampling method is proposed to determine the optimal fidelity of decision-making in a dynamic data-driven framework. To evaluate the performance of the proposed method, an experiment is conducted in which the impact of worker performance on production capacity and on the fidelity level of decision-making is studied.
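To make the idea concrete, the sketch below shows one way MCMC sampling could select a fidelity level: a simple Metropolis sampler over a small set of discrete fidelity levels. The score function, the number of levels, and the `capacity` parameter are illustrative assumptions for this sketch only, not the formulation used in the paper.

```python
import random

def score(level, capacity=0.7):
    # Hypothetical unnormalized target: higher fidelity improves
    # decision quality (diminishing returns) but costs more to compute.
    benefit = 1 - (1 - capacity) ** level
    cost = 0.1 * level
    return max(benefit - cost, 1e-9)  # keep strictly positive

def mcmc_fidelity(levels=(1, 2, 3, 4, 5), iters=5000, seed=0):
    """Approximate the best fidelity level by Metropolis sampling."""
    rng = random.Random(seed)
    current = rng.choice(levels)
    counts = {lvl: 0 for lvl in levels}
    for _ in range(iters):
        proposal = rng.choice(levels)  # symmetric proposal distribution
        # Metropolis acceptance: always move uphill, sometimes downhill
        if rng.random() < min(1.0, score(proposal) / score(current)):
            current = proposal
        counts[current] += 1
    # The most frequently visited level approximates the mode
    # of the target distribution over fidelity levels.
    return max(counts, key=counts.get)
```

Because the proposal is symmetric, the acceptance ratio reduces to the ratio of target scores; the chain then visits each fidelity level in proportion to its (unnormalized) score, so the most visited state estimates the optimal level.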