Authors B.K. Lebedev, O.B. Lebedev, E.M. Lebedeva
Month, Year 07, 2016
Index UDC
DOI 10.18522/2311-3103-2016-7-89101
Abstract The paper deals with the clustering problem, solved by combining the sequential method with an integration method based on collective alternative adaptation. An initial partition of the objects into L classes is selected from the solution space, either randomly or by one of the structural diversity algorithms. The basic procedure performs a split into two classes (L = 2). When L > 2, the partition can be viewed as first separating the images of the first class from the rest; the decomposition is then repeated on the remaining set of images to identify the second class, and so on. The decomposition terminates once further separation of the remaining subset becomes impossible. The search for a solution is presented as an adaptive system that operates under partial (or complete) a priori uncertainty and changing environmental conditions; the information about these conditions obtained during the search is used to improve performance. The current situation is characterized by two factors: the state of the environment and the state of the adapting object itself. Search adaptation is a sequential, multi-stage process: at each stage an adapting action is applied to an object, increasing its efficiency and optimizing the quality criteria. Under an adapting action, an object moves from the cluster containing it to one of the neighboring clusters; the nature and magnitude of the action are individual for each element. Under a series of adapting actions, whose nature and magnitude vary at each iteration, the objects (the collective) are successively redistributed among clusters. The goal of an individual object mi is to reach a state in which the vector sum of the forces acting on it from all objects located in the same cluster as mi is maximal.
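The sequential decomposition described above (peel off one class, then repeat on the remainder) can be sketched as follows. This is an illustrative outline, not the paper's procedure: `split_two` is a hypothetical stand-in for the basic two-class partitioning step (here a simple 1-D 2-means pass), and the stopping rule is a plain size threshold rather than the paper's separability criterion.

```python
import random

def split_two(points, iters=20):
    """Split points into two groups with a simple 2-means pass.
    Stand-in for the paper's basic two-class (L = 2) procedure."""
    a, b = random.sample(points, 2)
    g1, g2 = points, []
    for _ in range(iters):
        g1 = [p for p in points if abs(p - a) <= abs(p - b)]
        g2 = [p for p in points if abs(p - a) > abs(p - b)]
        if not g1 or not g2:          # degenerate split
            return points, []
        a = sum(g1) / len(g1)         # update group centers
        b = sum(g2) / len(g2)
    return g1, g2

def sequential_clustering(points, min_size=2):
    """Peel off one cluster at a time until the remainder cannot be split."""
    clusters = []
    rest = list(points)
    while len(rest) > min_size:
        g1, g2 = split_two(rest)
        if not g2:                    # no further separation possible
            break
        clusters.append(g1)           # g1 becomes the next identified class
        rest = g2
    clusters.append(rest)
    return clusters
```

Each pass identifies one class and shrinks the working set, which is exactly the recursion the abstract describes for L > 2.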
The goal of the collective is to reach a distribution of objects among clusters in which the minimum distance between any pair of objects belonging to different clusters is maximal. To implement the adaptation mechanism, each object mi is assigned an adaptation automaton AAi with two groups of states {C1i, C2i} corresponding to the two alternatives A1i and A2i. The number of states in a group is given by the parameter Qi, called the memory depth. The input signal of the automaton AAi is a "reward" or a "punishment", depending on the state of the object mi in the environment. At each step of the collective adaptation process, the adaptive system performs four cycles. Studies have shown that the time complexity of one iteration of the algorithm is O(n), where n is the number of elements, and that the maximum efficiency of the adaptive search is achieved with the control parameters q = 2 and T = 80, where q is the memory depth of the automata AA and T is the number of iterations.
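The two-alternative adaptation automaton with memory depth q can be sketched as a Tsetlin-style automaton with linear tactics (the paper cites Tsetlin's automata theory). The state numbering and transition details below are assumptions chosen for illustration, not the paper's exact construction: states 1..q select alternative A1, states q+1..2q select A2; a reward moves the automaton deeper into its current group, a punishment moves it toward, and eventually across, the group boundary.

```python
class AdaptationAutomaton:
    """Two-alternative automaton with linear tactics and memory depth q.
    States 1..q choose alternative A1 (state 1 is deepest);
    states q+1..2q choose A2 (state 2q is deepest)."""

    def __init__(self, q=2):          # q = 2 is the paper's best setting
        self.q = q
        self.state = 1                # start committed to A1

    @property
    def alternative(self):
        return "A1" if self.state <= self.q else "A2"

    def reward(self):
        # "promotion": deepen commitment to the current alternative
        if self.state <= self.q:
            self.state = max(1, self.state - 1)
        else:
            self.state = min(2 * self.q, self.state + 1)

    def punish(self):
        # "punishment": move toward the boundary; crossing it
        # switches the chosen alternative
        if self.state <= self.q:
            self.state += 1           # state q -> q+1 switches to A2
        else:
            self.state -= 1           # state q+1 -> q switches to A1
```

With memory depth q = 2, two consecutive punishments are enough to flip the automaton's alternative, while rewards make it increasingly resistant to a single contrary signal.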


Keywords Pattern recognition; clustering; collective alternative adaptation; automatic adaptation; hybrid algorithm.
