2
Introduction to Unsupervised Learning in Bioinformatics
Nancy Anurag Parasa1, Jaya Vinay Namgiri1, Sachi Nandan Mohanty2 and Jatindra Kumar Dash1*
1 Department of Computer Science and Engineering, SRM University-AP, Amaravati, Andhra Pradesh, India
2 Department of Computer Science and Engineering, IcfaiTech, ICFAI Foundation for Higher Education, Hyderabad, India
*Corresponding author: [email protected]
Abstract
Unsupervised learning techniques group data according to shared attributes, similar patterns, or relationships among the data points. These machine learning models are also referred to as self-organizing models, since they operate by clustering, and each algorithm employs a distinct approach to splitting the data into clusters. Unsupervised machine learning uncovers previously unknown patterns in data and is applied when labeled training data are scarce or unavailable. Applications of unsupervised machine learning techniques include clustering and anomaly detection. Clustering