Many approaches have been proposed to handle DoS and DDoS attacks. These approaches address diverse aspects of these complex threats, such as attack prevention, detection, or response. Still, there is no common, comprehensive methodology for evaluating the impact of a DoS attack on a given network, or the performance of a given defense. Such a methodology is needed for the following reasons:
- To protect systems from DDoS attacks, we need ways to characterize how dangerous an attack is and to estimate the potential damage/cost it inflicts on a specific network (with or without a defense).
- Given the many existing DDoS defenses, we need a common evaluation setting in which to measure and compare their performance. Such tests also reveal a defense's weak features that need improvement.
This page describes our project on building a common methodology for DDoS defense evaluation. The project consists of: (1) DDoS benchmarks that represent a set of scenarios to be used for defense evaluation, (2) a set of performance metrics that characterize an attack's impact and a defense's performance, and (3) a set of tools we used for benchmark development, for integration of the benchmarks with the DETER testbed, and for calculation of the performance metrics from tcpdump traces collected during DDoS experiments. A toy illustration of computing such a metric from a trace is sketched below.
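As a hedged illustration of the metric-calculation step, the sketch below derives one crude service-denial indicator from a tcpdump trace: the fraction of TCP connection attempts that the server answers with a SYN-ACK. This is only an example under stated assumptions, not the project's actual metric suite (the user-centric metrics are defined in the Sigmetrics 2007 paper listed below); the server address and trace filename are hypothetical, and the scapy library is assumed to be available.

```python
# Crude DoS impact indicator from a pcap trace: the share of client SYNs
# that the server answers with a SYN-ACK. Illustrative sketch only; the
# server IP and filename are hypothetical placeholders.
from scapy.all import rdpcap, IP, TCP

SERVER_IP = "10.1.1.2"          # hypothetical server under attack

packets = rdpcap("trace.pcap")  # hypothetical tcpdump capture
syns = synacks = 0
for pkt in packets:
    if IP not in pkt or TCP not in pkt:
        continue
    flags = int(pkt[TCP].flags)
    if pkt[IP].dst == SERVER_IP and (flags & 0x02) and not (flags & 0x10):
        syns += 1               # pure SYN: a client connection attempt
    elif pkt[IP].src == SERVER_IP and (flags & 0x12) == 0x12:
        synacks += 1            # SYN-ACK: the server accepted the attempt

if syns:
    print(f"server answered {synacks}/{syns} connection attempts "
          f"({synacks / syns:.1%})")
```

A lightly loaded server answers nearly every SYN, so a ratio that drops sharply during the attack interval signals denied service; the project's actual metrics capture this from the user's perspective rather than from raw handshake counts.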
Publications
- E. Arikan, Attack Profiling for DDoS Benchmarks, MS Thesis, University of Delaware, August 2006.
- J. Mirkovic, B. Wilson, A. Hussain, S. Fahmy, P. Reiher, R. Thomas, and S. Schwab, Automating DDoS Experimentation, Proceedings of the DETER Workshop, August 2007.
- J. Mirkovic, A. Hussain, B. Wilson, S. Fahmy, P. Reiher, R. Thomas, W. Yao, and S. Schwab, Towards User-Centric Metrics for Denial-of-Service Measurement, Proceedings of the Workshop on Experimental Computer Science, June 2007.
- J. Mirkovic, S. Wei, A. Hussain, B. Wilson, R. Thomas, S. Schwab, S. Fahmy, R. Chertov, and P. Reiher, DDoS Benchmarks and Experimenter's Workbench for the DETER Testbed, Proceedings of Tridentcom 2007, May 2007.
- J. Mirkovic, A. Hussain, B. Wilson, S. Fahmy, W. Yao, P. Reiher, S. Schwab, and R. Thomas, When Is Service Really Denied? A User-Centric DoS Metric, Proceedings of Sigmetrics 2007, June 2007.
- J. Mirkovic, E. Arikan, S. Wei, S. Fahmy, R. Thomas, and P. Reiher, Benchmarks for DDoS Defense Evaluation, Proceedings of Milcom 2006, October 2006.
- J. Mirkovic, P. Reiher, S. Fahmy, R. Thomas, A. Hussain, S. Schwab, and C. Ko, Measuring Denial-of-Service, Proceedings of the 2006 Quality of Protection Workshop, October 2006.
- J. Mirkovic, S. Fahmy, P. Reiher, R. Thomas, A. Hussain, S. Schwab, and C. Ko, Measuring Impact of DoS Attacks, Proceedings of the DETER Community Workshop on Cyber Security Experimentation, June 2006.
- J. Mirkovic, E. Arikan, S. Wei, S. Fahmy, R. Thomas, and P. Reiher, Benchmarks for DDoS Defense Evaluation, Proceedings of the DETER Community Workshop on Cyber Security Experimentation, June 2006.
- R. Chertov, S. Fahmy, and N. B. Shroff, High Fidelity Denial of Service (DoS) Experimentation, Proceedings of the DETER Community Workshop on Cyber Security Experimentation, June 2006.
- R. Chertov, S. Fahmy, P. Kumar, D. Bettis, A. Khreishah, and N. B. Shroff, Topology Generation, Instrumentation, and Experimental Control Tools for Emulation Testbeds, Proceedings of the DETER Community Workshop on Cyber Security Experimentation, June 2006.
- A. Hussain, S. Schwab, R. Thomas, S. Fahmy, and J. Mirkovic, DDoS Experiment Methodology, Proceedings of the DETER Community Workshop on Cyber Security Experimentation, June 2006.
- S. Wei, J. Mirkovic, and E. Kissel, Profiling and Clustering Internet Hosts, Proceedings of the 2006 International Conference on Data Mining, June 2006.
Tools and Reports