Readme #7

Open
withzombies opened this issue Jul 28, 2016 · 4 comments

Comments

@withzombies
Contributor

withzombies commented Jul 28, 2016

Write a readme that includes:

  • Introduction to the challenge set
    • Why they're cool
    • What you can do with them
    • Some example uses (e.g. linter shootout, static-analysis tool comparison, fuzzer comparison, translation tool testing, etc.)
  • Build Instructions
    • For building all the binaries at once and for building individual binaries
    • Explanation of patched and non-patched binaries
  • Test Instructions
    • How to generate polls
    • How to run polls
    • How to test the PoVs (against both the patched and the vulnerable versions)
  • Why we made some decisions when porting
    • cmake
    • dynamic linking vs system libc
    • nostdinc
  • State of each binary on each platform
@artemdinaburg
Contributor

Going to start filling in some of these to be added to the readme. Mostly copied from the blog post.

What are they:
These programs were specifically designed with vulnerabilities that represent a wide variety of software flaws. They are more than simple test cases: they approximate real software with enough complexity to stress both manual and automated vulnerability discovery.

Why they're cool:

We all now have an industry benchmark for evaluating program-analysis tools. We can make comparisons such as:

  • How do the CGC tools compare to existing program-analysis and bug-finding tools?
  • When a new tool is released, how does it stack up against the current best?
  • Do static analysis tools that work with source code find more bugs than dynamic analysis tools that work with binaries?
  • Are tools written for Mac OS X better than tools written for Linux, and are they better than tools written for Windows?

What you can do with them:

The challenge binaries, valid test inputs, and sample vulnerabilities form an industry-standard benchmark suite for evaluating:

  • Bug-finding tools
  • Program-analysis tools (e.g. automated test coverage generation, value range analysis)
  • Patching strategies
  • Exploit mitigations

@artemdinaburg
Contributor

About -nostdinc:

The challenge binaries were written for a platform without a standard libc, so each binary re-implements just the libc features it needs, re-defining standard symbols in the process. Compiling with the -nostdinc flag keeps the standard library headers out of the include path, which avoids conflicting declarations and let us avoid rewriting a lot of challenge binary code.
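
As a concrete illustration, here is a minimal sketch of the kind of local libc re-implementation the challenge binaries contain. The file name and function body are illustrative, not taken from any actual CB; only the -nostdinc behavior itself is the point.

```c
/*
 * Minimal sketch: a challenge binary defining a libc routine under its
 * standard name.  If the system <string.h> were on the include path,
 * its memcpy declaration could conflict with this one.  Building with
 * something like `clang -nostdinc -c service.c` removes the standard
 * include directories from the search path, so only the challenge's
 * own headers are seen and the original code compiles unchanged.
 */
void *memcpy(void *dst, const void *src, unsigned long n) {
    unsigned char *d = dst;
    const unsigned char *s = src;
    while (n--)
        *d++ = *s++;
    return dst;
}
```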

@dguido
Member

dguido commented Jul 31, 2016

Added this to the readme: 1541ea9

@dguido
Member

dguido commented Aug 1, 2016

This is pretty good, but we could still use an explanation of cmake and of dynamic linking vs. the system libc.

artemdinaburg added a commit that referenced this issue Aug 1, 2016
…riptions of and added a link to the cb-test utility. Should address some of #7