Contents:
Chapter 1: Introduction
1.1 Vulnerability
1.1.1 Definition of a vulnerability
1.1.2 Classification of vulnerabilities
1.1.3 Causes of vulnerabilities
1.1.4 Identifying and removing vulnerabilities
1.2 Required preliminary concepts
1.2.1 Text mining
1.2.2 Classification and prediction
1.2.3 Clustering
1.2.4 Feature selection
1.3 Research objective
Chapter 2: Review of Previous Research
2.1 The role of different individuals and processes in vulnerabilities
2.2 Methods for assessing and ranking vulnerabilities
2.2.1 The Common Vulnerability Scoring System
2.3 Categorization of vulnerabilities
2.4 Security predictions using vulnerability reports
2.5 Detecting vulnerabilities using software source code
Chapter 3: Data and the Feature Extraction Method
3.1 Research data
3.2 Feature extraction for classification and prediction
3.3 Feature extraction for clustering
Chapter 4: Experimental Method and Results
4.1 Method and results of the classification and prediction experiments
4.1.1 Offline exploit prediction
4.1.2 Online exploit prediction
4.1.3 Time prediction
4.2 Comparison of OSVDB and CVE
4.3 Feature evaluation
4.4 Clustering of vulnerabilities
4.4.1 Analysis of the existing categories in the OSVDB database
4.4.2 A proposed categorization of vulnerabilities
4.4.3 Evaluation of the proposed categorization
Chapter 5: Discussion and Conclusion
5.1 Predicting exploitation of vulnerabilities
5.2 Clustering of vulnerabilities
Conclusion
Suggestions for future research
References