Fusion of probabilistic unreliable indirect information into estimation serving to decision making

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

Bayesian decision making (DM) quantifies information by the probability density (pd) of the treated variables. The gradual accumulation of information during acting increases the DM quality reachable by the agent exploiting it. The inspected way of accumulation uses a parametric model forecasting observable DM outcomes and updates the posterior pd of its unknown parameter. In the considered multi-agent case, a neighbouring agent moreover provides a privately designed pd forecasting the same observation. This pd may notably enrich the information available to the focal agent. Bayes’ rule is the unique deductive tool for a lossless compression of the information brought by the observations, but it does not suit the processing of the forecasting pd. The paper extends solutions of this case. It: \(\triangleright\) refines the Bayes’-rule-like use of the neighbour’s forecasting pd; \(\triangleright\) deductively complements former solutions so that the learnable reliability of the neighbour can be taken into account; \(\triangleright\) specialises the result to the exponential family, which reveals the high potential of this information processing; \(\triangleright\) cares about exploiting population statistics.
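
To make the abstract’s ingredients concrete, the following minimal sketch (in Python, assuming the simplest discrete conjugate setting of note 2) shows a standard Bayes’-rule update of a Dirichlet posterior by an observation, and a Bayes’-rule-like incorporation of a neighbour’s forecasting pd treated as reliability-weighted fictitious counts. It only illustrates the general idea; it is not the algorithm derived in the paper, and all identifiers (update_by_neighbour_forecast, w_reliability, ...) are hypothetical.

    # Minimal sketch (not the paper's algorithm): the agent models a discrete
    # observation o by a categorical pd whose unknown parameter p has a
    # conjugate Dirichlet posterior, summarised by the count vector
    # dirichlet_stats.  A neighbour supplies a forecasting pd F_n(o) for the
    # same observation; here it enters as fractional "fictitious" counts
    # scaled by a reliability weight, the quantity the paper proposes to learn.
    import numpy as np

    def update_by_observation(dirichlet_stats, o):
        """Standard Bayes' rule for the categorical model: add one count."""
        s = dirichlet_stats.copy()
        s[o] += 1.0
        return s

    def update_by_neighbour_forecast(dirichlet_stats, forecast_pd, w_reliability):
        """Bayes'-rule-like use of the neighbour's forecasting pd: its
        probability vector is added as fractional counts, scaled by a
        reliability weight in [0, 1] (0 = ignore the neighbour)."""
        return dirichlet_stats + w_reliability * np.asarray(forecast_pd)

    def predictive_pd(dirichlet_stats):
        """Posterior predictive pd of the next observation."""
        return dirichlet_stats / dirichlet_stats.sum()

    # Usage: flat prior over 3 outcomes, one real observation o = 2, and a
    # neighbour's forecast favouring outcome 2, trusted with weight 0.5.
    stats = np.ones(3)
    stats = update_by_observation(stats, o=2)
    stats = update_by_neighbour_forecast(stats, forecast_pd=[0.1, 0.2, 0.7],
                                         w_reliability=0.5)
    print(predictive_pd(stats))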


Availability of data and material

Not applicable.

Notes

  1. The existence of regular probability densities of the inspected probability measures with respect to Lebesgue or counting measures is assumed [39].

  2. Our manipulations assume discrete-valued modelled variables. The uncertain pds acting on them are then probability vectors, and their distributions can be modelled without the technicalities of measure theory. The resulting solution remains valid without this assumption.

  3. The work [43] calls the same functional “cross-entropy”. The use of this term is often challenged, so we stay with the name “Kullback–Leibler divergence”; its definition is recalled after these notes.

  4. The agreed implicit conditioning on the agent’s action, \(a_{\mathfrak {a}}\), and its regressor, \(r_{\mathfrak {a}}\), applies.

  5. It uses the implicit conditioning \(\mathsf {F}_{\mathfrak {n}}(o_{\mathfrak {a}})=\mathsf {F}_{\mathfrak {n}}(o_{\mathfrak {a}}|a_{\mathfrak {a}},r_{\mathfrak {a}})\), \(\mathsf {M}_{\mathfrak {a}}(o_{\mathfrak {a}}|p)=\mathsf {M}_{\mathfrak {a}}(o_{\mathfrak {a}}|p,a_{\mathfrak {a}},r_{\mathfrak {a}})\). The assumed neighbour, see Sect. 2, implies the relevance of \(a_{\mathfrak {a}},r_{\mathfrak {a}}\) in the forecasting pd \(\mathsf {F}_{\mathfrak {n}}\).

  6. Let us stress that the neighbour, \(\mathfrak {n}\), is generally unaware of the model, \(\mathsf {M}_{\mathfrak {a}}\), and its parameter, p.

  7. It uses the KLDs of posterior pds, not the KLDs of joint pds.
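
For completeness, the functional mentioned in notes 3 and 7, the Kullback–Leibler divergence (KLD) of a pd \(\mathsf{f}\) from a pd \(\mathsf{g}\), reads in the discrete-valued setting of note 2 as follows (a standard definition recalled here, not quoted from the paper):

\[ \mathrm{KLD}(\mathsf{f}\,\Vert\,\mathsf{g}) \;=\; \sum_{o} \mathsf{f}(o)\,\ln\frac{\mathsf{f}(o)}{\mathsf{g}(o)} \;\ge\; 0, \qquad \mathrm{KLD}(\mathsf{f}\,\Vert\,\mathsf{g}) = 0 \;\Leftrightarrow\; \mathsf{f}=\mathsf{g}. \]

Its non-negativity, vanishing only for coinciding pds, makes it a natural measure of the information lost when \(\mathsf{g}\) replaces \(\mathsf{f}\).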

References

  1. Antoniak C (1974) Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems. Ann Stat 2(6):1152–1174

  2. Åström K (1970) Introduction to stochastic control. Academic Press, New York

  3. Bader K, Lussier B, Schon W (2017) A fault tolerant architecture for data fusion: a real application of Kalman filters for mobile robot localization. Robot Auton Syst 88:11–23

  4. Bar-Shalom Y, Li X, Kirubarajan T (2003) Estimation with applications to tracking and navigation. Wiley, Hoboken

  5. Barndorff-Nielsen O (1978) Information and exponential families in statistical theory. Wiley, New York

  6. Berger J (1985) Statistical decision theory and Bayesian analysis. Springer, Berlin

  7. Bernardo J (1979) Expected information as expected utility. Ann Stat 7:686–690

  8. Bogdan P, Pedram M (2018) Toward enabling automated cognition and decision-making in complex cyber-physical systems. In: 2018 IEEE ISCAS, pp 1–4

  9. Foley C, Quinn A (2018) Fully probabilistic design for knowledge transfer in a pair of Kalman filters. IEEE Signal Proc Lett 25(4):487–490

  10. Galeano P, Pena D (2019) Data science, big data and statistics. Test 28:289–325

  11. Genest C, Zidek J (1986) Combining probability distributions: a critique and annotated bibliography. Stat Sci 1(1):114–148

  12. Hall D, Llinas J (1997) An introduction to multisensor data fusion. Proc IEEE 85(1):6–23

  13. Hlaváčková-Schindler K, Naumova V, Pereverzyev S (2016) Granger causality for ill-posed problems: ideas, methods, and application in life sciences. In: Wiedermann W, von Eye A (eds) Statistics and causality: methods for applied empirical research, Wiley, pp 249–276

  14. Hoshino T, Igari R (2017) Quasi-Bayesian inference for latent variable models with external information: application to generalized linear mixed models for biased data. In: Keio-IES Discussion Paper Series 2017–014, Institute for Economics Studies, Keio University

  15. Jazwinski A (1970) Stochastic processes and filtering theory. Academic Press, New York

  16. Jensen F (2001) Bayesian networks and decision graphs. Springer, New York

  17. Kárný M, Bodini A, Guy T, Kracík J, Nedoma P, Ruggeri F (2014) Fully probabilistic knowledge expression and incorporation. Stat Interface 7(4):503–515

  18. Kárný M, Böhm J, Guy T, Jirsa L, Nagy I, Nedoma P, Tesař L (2006) Optimized Bayesian dynamic advising: theory and algorithms. Springer, London

  19. Kárný M, Guy T (2012) On support of imperfect Bayesian participants. In: Guy T et al (eds) Decision making with imperfect decision makers. Intelligent Systems Reference Library, vol 28. Springer, Berlin, pp 29–56

  20. Kárný M, Herzallah R (2017) Scalable harmonization of complex networks with local adaptive controllers. IEEE Trans SMC Syst 47(3):394–404

  21. Kasabov N, Hu Y (2010) Integrated optimisation method for personalised modelling and case studies for medical decision support. Int J Funct Inform Person Med 3(3):236–256

  22. Kern-Isberner G, Lukasiewicz T (2017) Special issue on challenges for reasoning under uncertainty, inconsistency, vagueness, and preferences. Künstl Intell 31:5–8. https://doi.org/10.1007/s13218-016-0479-z

  23. Koopman R (1936) On distributions admitting a sufficient statistic. Trans Am Math Soc 39:399

  24. Kracík J, Kárný M (2005) Merging of data knowledge in Bayesian estimation. In: Filipe J et al (eds) Proceedings of the 2nd International Conference on Informatics in Control, Automation and Robotics, Barcelona, pp 229–232

  25. Kuhn H, Tucker A (1951) Nonlinear programming. In: Proceedings of the 2nd Berkeley Symposium, University of California Press, pp 481–492

  26. Kulhavý R, Zarrop MB (1993) On a general concept of forgetting. Int J Control 58(4):905–924

  27. Kullback S, Leibler R (1951) On information and sufficiency. Ann Math Stat 22:79–87

  28. van Laere J (2009) Challenges for IF performance evaluation in practice. In: 12th International Conference on Information Fusion, IEEE, Seattle, pp 866–873

  29. Lee H, Lee B, Park K, Elmasri R (2010) Fusion techniques for reliable information: a survey. Intern J Digit Cont Technol Appl 4(2):74–88

  30. Leevy J, Khoshgoftaar T, Bauder R, Seliya N (2018) A survey on addressing high-class imbalance in big data. J Big Data 5(42):1–30

  31. Meng T, Jing X, Yan Z, Pedrycz W (2019) A survey on machine learning for data fusion. Inform Fus. https://doi.org/10.1016/j.inffus.2019.12.001

  32. Mine H, Osaki S (1970) Markovian decision processes. Elsevier, Amsterdam

  33. Nelsen R (1999) An introduction to copulas. Springer, New York

  34. O’Hagan A et al (2006) Uncertain judgements: eliciting experts’ probabilities. Wiley

  35. Pearl J (1988) Probabilistic reasoning in intelligent systems: networks of plausible inference. Morgan Kaufman, Burlington

  36. Peterka V (1981) Bayesian system identification. In: Eykhoff P (ed) Trends and progress in system identification, Pergamon Press, Oxford, pp 239–304

  37. Quinn A, Ettler P, Jirsa L, Nagy I, Nedoma P (2003) Probabilistic advisory systems for data-intensive applications. Int J Adapt Control Signal Proc 17(2):133–148

  38. Quinn A, Kárný M, Guy T (2017) Optimal design of priors constrained by external predictors. Int J Approx Reason 84:150–158

  39. Rao M (1987) Measure theory and integration. Wiley

  40. Sassani B, Alahmadi A, Sharifzadeh H (2019) A cluster based collaborative filtering method for improving the performance of recommender systems in e-commerce. In: K. Arai et al (eds) Proceedings of the Future Technologies Conference (FTC) 2018, Advances in Intelligent Systems and Computing, vol 881. Springer, Cham

  41. Savage L (1954) Foundations of statistics. Wiley, Hoboken

  42. Scanagatta M et al (2019) A survey on Bayesian network structure learning from data. Progress AI 8:425–439

  43. Shore J, Johnson R (1980) Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans Inform Theory 26(1):26–37

  44. Smith A, Makov U (1978) A quasi-Bayes sequential procedure for mixtures. J R Stat Soc 40(1):106–112

  45. Tsai C, Lai C, Chiang M, Yang L (2014) Data mining for internet of things: a survey. IEEE Commun Surv Tutor 16(1):77–95

  46. Wang P, Yang L, Li J, Chen J, Hu S (2019) Data fusion in cyber-physical-social systems: State-of-the-art and perspectives. Inform Fus 51:42–57

  47. Xu Z, He Y, Wang X (2019) An overview of probabilistic-based expressions for qualitative decision-making: techniques, comparisons and developments. Int J Mach Learn Cybern 10:1513–1528

  48. Zadeh L (1976) A fuzzy-algorithmic approach to the definition of complex or imprecise concepts. Syst Theory Soc Sci 8:202–282

Acknowledgements

The paper was notably influenced by discussions with Dr. T.V. Guy.

Funding

The reported research has been supported by MŠMT ČR LTC18075 and EU-COST Action CA16228.

Author information

Contributions

Both authors cooperated closely on the paper. MK led the writing of the text and FH the experiments.

Corresponding author

Correspondence to Miroslav Kárný.

Ethics declarations

Conflict of interest

The authors have no affiliation with any organization with a direct or indirect financial interest in the subject matter discussed in the manuscript. This manuscript has not been submitted to, nor is under review at, another journal or other publishing venue.

Code availability

The code of examples is available at https://gitlab.com/hula-phd/bks.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Kárný, M., Hůla, F. Fusion of probabilistic unreliable indirect information into estimation serving to decision making. Int. J. Mach. Learn. & Cyber. 12, 3367–3378 (2021). https://doi.org/10.1007/s13042-021-01359-9
