Research Article

Software Fault Detection using Honey Bee Optimization

by D. Asir Antony Gnana Singh, A. Escalin Fernando, E. Jebamalar Leavline
International Journal of Applied Information Systems
Foundation of Computer Science (FCS), NY, USA
Volume 11 - Number 1
Year of Publication: 2016
DOI: 10.5120/ijais2016451565

D. Asir Antony Gnana Singh, A. Escalin Fernando, E. Jebamalar Leavline. Software Fault Detection using Honey Bee Optimization. International Journal of Applied Information Systems 11, 1 (Jun 2016), 1-9. DOI=10.5120/ijais2016451565

@article{10.5120/ijais2016451565,
  author     = {D. Asir Antony Gnana Singh and A. Escalin Fernando and E. Jebamalar Leavline},
  title      = {Software Fault Detection using Honey Bee Optimization},
  journal    = {International Journal of Applied Information Systems},
  issue_date = {Jun 2016},
  volume     = {11},
  number     = {1},
  month      = {Jun},
  year       = {2016},
  issn       = {2249-0868},
  pages      = {1-9},
  numpages   = {9},
  url        = {https://www.ijais.org/archives/volume11/number1/900-2016451565/},
  doi        = {10.5120/ijais2016451565},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
Abstract

Recent developments in software technology assist humanity in various fields including engineering, technology, management, medical science, research, education, and banking. Fault identification is crucial for software testing professionals, since a large number of tests must be carried out to determine the extent of defects. Machine learning algorithms are therefore employed to build software fault detection models that predict faults in the software. Irrelevant and redundant test data reduce the accuracy of such a model; its accuracy depends strongly on the number of significant, relevant test attributes. Feature selection is therefore applied to choose appropriate features for building the fault detection model. This paper proposes a method that selects appropriate features using a honey bee optimization technique, reducing the search space and improving the accuracy of software fault detection.
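The abstract does not give the paper's exact algorithm, fitness function, or parameters, so the following is only an illustrative sketch of how a honey-bee (artificial-bee-colony-style) wrapper over binary feature masks can work: food sources are candidate feature subsets, employed bees do local search (single-bit flips), onlooker bees probe sources in proportion to fitness, and scout bees replace exhausted sources. The toy `fitness` here merely stands in for a classifier's accuracy on the selected features; in the paper's setting it would be the fault detection model's evaluation score.

```python
import random

def fitness(mask, relevant):
    """Toy stand-in for classifier accuracy: reward selecting the truly
    relevant features, penalize extra (irrelevant/redundant) ones."""
    hits = sum(1 for i in relevant if mask[i])
    extras = sum(mask) - hits
    return hits - 0.5 * extras

def flip_one(mask):
    """Neighbour of a food source: toggle one randomly chosen feature bit."""
    m = list(mask)
    m[random.randrange(len(m))] ^= 1
    return m

def abc_feature_select(n_features, relevant, colony=10, limit=5, iters=100, seed=0):
    random.seed(seed)
    sources = [[random.randint(0, 1) for _ in range(n_features)] for _ in range(colony)]
    trials = [0] * colony
    best = max(sources, key=lambda s: fitness(s, relevant))[:]  # global memory

    for _ in range(iters):
        # Employed bees: greedy local search around each food source.
        for k in range(colony):
            cand = flip_one(sources[k])
            if fitness(cand, relevant) > fitness(sources[k], relevant):
                sources[k], trials[k] = cand, 0
            else:
                trials[k] += 1

        # Onlooker bees: probe sources chosen in proportion to fitness.
        fits = [fitness(s, relevant) for s in sources]
        lo = min(fits)
        weights = [f - lo + 1.0 for f in fits]  # shift so weights are positive
        for k in random.choices(range(colony), weights=weights, k=colony):
            cand = flip_one(sources[k])
            if fitness(cand, relevant) > fitness(sources[k], relevant):
                sources[k], trials[k] = cand, 0

        # Remember the best feature subset seen so far.
        top = max(sources, key=lambda s: fitness(s, relevant))
        if fitness(top, relevant) > fitness(best, relevant):
            best = top[:]

        # Scout bees: abandon exhausted sources and explore afresh.
        for k in range(colony):
            if trials[k] > limit:
                sources[k] = [random.randint(0, 1) for _ in range(n_features)]
                trials[k] = 0
    return best

mask = abc_feature_select(n_features=8, relevant={0, 2, 4})
selected = [i for i, b in enumerate(mask) if b]
```

Because every wrong bit can be fixed by a single improving flip, this sketch converges quickly on the toy objective; with a real fault-prediction dataset the fitness landscape is rougher, which is exactly where the colony's exploration (onlookers and scouts) matters.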

Index Terms

Computer Science
Information Sciences

Keywords

Feature selection, software fault detection, honey bee optimization, data mining approaches, machine learning algorithms, improving accuracy in classification algorithms