
Secure Development with Static Analysis
Ted Unangst – Coverity

Source Code Analysis: Why do you care?
• As a developer, source code is the malleable object
• Many techniques can be shared between source and binary analysis
• Flip side: improve RE by looking for blind spots in source analysis

Outline
• Program analysis
• Source code analysis
• Source code analysis for security auditing
• Deployment

Program Analysis
• Source vs binary
• Static vs dynamic
• Complementary

Source and Binary Analysis
• Source:
  • Easy to identify location of flaw
  • CPU independent
  • Language dependent
  • Environment independent?
  • 1st party only
• Binary:
  • Hard to map back to source
  • CPU dependent
  • Language independent?
  • Environment independent?
  • 3rd party utility

Static and Dynamic Analysis
• Static:
  • Complete coverage
  • False positives
  • Can analyze anytime, anywhere
  • Precise description of problem, unknown impact
• Dynamic:
  • Very rare to get close to 100% coverage
  • No false positives
  • Requires ability to run the program
  • Precise understanding of impact, possibly unknown cause

Static Source Code Analysis
• The source has a lot of knowledge embedded
  • Extract the good parts, ignore the remainder
• Many types of analysis
  • Different ways to approach the problem
  • Different goals:
    • Find defects
    • Enhance run-time analysis
    • Gain insight into code

Static Source Code Analysis
• Type qualifier checking
• Pattern recognition
• Source rewriting
• Model checking
• Simulated execution
  • Flow sensitive?
  • Context sensitive?

Simulated Execution
• Flow sensitive:

      bar() { free(g_p); }
      baz() { free(g_p); }
      foo() {
              if (x)
                      bar();
              else
                      baz();
      }

• Context sensitive:

      if (use_malloc)
              p = malloc();
      /* … */
      if (use_malloc)
              free(p);

• How much state to track? (see the sketch below)
  • Exponential number of paths
  • Loops
  • Heap is unbounded
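
A minimal illustration of the path explosion mentioned above (mine, not from the deck): every independent branch doubles the number of paths a simulator must consider, so exhaustive enumeration cannot scale.

    /* Illustrative sketch: k independent branches give 2^k paths. */
    void paths(int a, int b, int c)
    {
            int x, y, z;
            if (a) x = 1; else x = 2;       /* 2 paths so far */
            if (b) y = 1; else y = 2;       /* 4 paths */
            if (c) z = 1; else z = 2;       /* 8 paths */
            /* a function with 30 such branches has over a billion paths */
    }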

Advantages
• Automatic
  • Should be capable of scanning the entire source base on a regular basis
  • Infrastructure should be able to adapt to changing code bases
• Complete
  • Looks at all the code, including edge conditions and error handling

Challenges
• Parsing
  • Any non-trivial analysis requires parsing the code
  • Nobody writes standard C; they write [name of compiler] C
  • Many extensions are not documented (if they're even intentional)
  • Hard to analyze if you can't read it

Parsing Fun

    struct s {                      // parenthesis what?
            unsigned int *array0[3];
            ()unsigned int *array1[3];
            ( unsigned int *array2[3)];
            ( unsigned int *array3[3];
    };
    {       // lvalue cast?
            const int x = 4;
            (int)x = 2;
    }
    {       // what's a static cast?
            static int x;
            static char *p = (static int *)&x;
    }
    {       // no, that's not =
            int reg @ 0x1234abcd;
    }
    {       // goto over initializer
            goto LABEL;
            string s;
    LABEL:
            cout << s << endl;
    }
    {       // bad C. but in C++?
            char *c = 42;
    }
    {       // init struct with int?
            struct s { int x; int y; };
            struct s z = 0;
    }
    {       // new constant types
            int binary = 11010101b;
    }

Challenges
• What properties to look for?
• Can only analyze what we can see
  • Linkage affects run-time behavior
  • RPC, COM, function pointers (sketch below)
• False positives
• False negatives
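
A hedged sketch of the function-pointer problem (the names are hypothetical, not from the slides): the target of an indirect call is chosen at run time, so a static analyzer may not see what the callee does with the data.

    /* Hypothetical: lookup_handler() picks the callee at run time, so a
       static analyzer may not know which function receives user_buf. */
    void (*handler)(char *) = lookup_handler("on_input");
    handler(user_buf);      /* taint flows into... which function? */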

Test generation
• Use source code analysis to find edge cases faster
• Precisely directed test cases

A window into the black box
• Occasionally we have some of the source
  • XML, JPEG, PNG libraries
• Use source code analysis to discover properties about the API (sketch below)
  • Which functions allocate memory? Free memory?
  • Write to a buffer? How big?
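
A sketch of what such derived API properties might look like, written as annotations on declarations; the function names and annotation spellings are illustrative assumptions, not a real library's interface.

    struct image;
    struct image *img_open(const char *path);      /* ALLOC(RETURN): allocates    */
    void img_close(struct image *img);             /* FREE(ARG0): frees argument  */
    size_t img_read_row(struct image *img,
                        unsigned char *buf,        /* writes to ARG1,             */
                        size_t buflen);            /*   at most ARG2 bytes        */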

Security
• Static source analysis pros:
  • Thorough
  • Reveals root cause and path
  • Data flow tracking (?)
• Cons:
  • Unable to understand impact
  • Some data dependencies are just too complicated

Security
• What qualifies as a defect?
  • Every strcpy()? (see the sketch below)
• What properties can we determine statically?
  • Buffer overruns
  • Integer overflows
  • Race conditions
  • Memory leaks
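
A small example (mine, not the deck's) of why "every strcpy()" is the wrong bar: a checker that tracks buffer and source lengths can separate the harmless call from the overrun.

    #include <stdlib.h>
    #include <string.h>

    void copy_demo(void)
    {
            char dst[16];
            strcpy(dst, "short");           /* fine: 6 bytes into a 16-byte buffer */

            char *home = getenv("HOME");    /* attacker-influenced length */
            if (home != NULL)
                    strcpy(dst, home);      /* defect: may overrun dst */
    }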

SQL Injection Example

    name = read_form_entry("NAME");
    res = run_query("select * from user where name = '%s'", name);

• How do we know name is bad?
• How do we know run_query will behave incorrectly?
• Configuration:
  • read_form_entry : USERDATA(RETURN)
  • run_query : TRUST(ALLARGS)

SQL Injection Example

    csv = read_form_entries();
    otherval = p = strchr(csv, ',');
    *p++ = 0;
    name = p;
    p = strchr(p, ',');
    *p++ = 0;
    res = run_query("select * from user where name = '%s'", name);

• Configuration:
  • read_form_entry : USERDATA(RETURN)
  • run_query : TRUST(ALLARGS)
  • strchr : COPY(ARG0, RETURN)

SQL Injection Example

    csv = read_form_entries();
    otherval = p = strchr(csv, ',');
    *p++ = 0;
    name = p;
    p = strchr(p, ',');
    *p++ = 0;
    name = escape_sql(name);
    res = run_query("select * from user where name = '%s'", name);

• Configuration:
  • read_form_entry : USERDATA(RETURN)
  • run_query : TRUST(ALLARGS)
  • strchr : COPY(ARG0, RETURN)
  • escape_sql : OKDATA(RETURN)

SQL Injection Example

    csv = read_form_entries();
    otherval = p = strchr(csv, ',');
    *p++ = 0;
    name = p;
    p = strchr(p, ',');
    *p++ = 0;
    if (!validate_name(name))
            return (EINVAL);
    res = run_query("select * from user where name = '%s'", name);

• Configuration:
  • read_form_entry : USERDATA(RETURN)
  • run_query : TRUST(ALLARGS)
  • strchr : COPY(ARG0, RETURN)
  • validate_name : 0(RETURN) => USERDATA(ARG0); 1(RETURN) => OKDATA(ARG0)

Configuration
• Tedious to annotate by hand
  • Some developers may get it wrong more than right
• Can use statistics to derive correct function pairings (sketch below)
  • Start at read, recv, …
  • Assume return values are tainted
  • Assume all functions trust input
  • Converges quickly
• We haven't verified escape_sql works correctly
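
A sketch of how that bootstrapping might look, in the annotation notation used above; the seed entries restate the slides, while the "derived" lines are illustrative assumptions about what the statistics would infer.

    read   : USERDATA(RETURN)       /* seed: known input source */
    recv   : USERDATA(RETURN)       /* seed */
    run_query : TRUST(ALLARGS)      /* initial assumption: sinks trust input */
    /* Iterating over call sites, a function that usually passes tainted
       data through gets a propagation summary: */
    strchr : COPY(ARG0, RETURN)     /* derived */
    /* ...and one whose result routinely reaches trusted sinks becomes a
       candidate sanitizer, which is why escape_sql remains unverified: */
    escape_sql : OKDATA(RETURN)     /* derived, not proven correct */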

Exploit?

    if (issetugid() == 0)
            errx(1, "Improperly installed");
    str = getenv("NUMTHREADS");
    n = atoi(str);
    n *= sizeof(widget);
    ptr = malloc(n);
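
Two defects lurk here: getenv() may return NULL (atoi(NULL) crashes), and the multiply can overflow so malloc() returns a too-small buffer that is later overrun. A hedged fix; the checks are my addition, not part of the slide:

    str = getenv("NUMTHREADS");
    if (str == NULL)                        /* atoi(NULL) would crash */
            errx(1, "NUMTHREADS not set");
    n = atoi(str);
    if (n <= 0 || (size_t)n > SIZE_MAX / sizeof(widget))
            errx(1, "bad NUMTHREADS");      /* blocks the multiply overflow */
    ptr = malloc(n * sizeof(widget));       /* SIZE_MAX is from <stdint.h> */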

Soundness
• Many tools cut corners
  • Pointer analysis is hard
  • Two choices: leave some bugs behind or get swamped by false positives
  • Delicate balance
• Still very good at catching the low-hanging fruit and finding dangerous constructs

Cost of False Positives and False Negatives
• False positives are costly
  • If uncontrolled, can easily sap more time from development or auditing than the analysis saves
  • Over time, the trend is to 100% false positives
  • Important consideration for tool adoption
• Cost of false negatives is harder to estimate
  • False sense of security
  • If you start spending less time testing, you're in for a nasty surprise
• Analysis should help focus and direct audits, not replace them
• Some properties can be verified; in the general case it's impossible

Build or Buy?
• Build
  • Find those bugs you are especially interested in
  • Hard, hard
  • Users are never happy
  • How much do 5-10 developers cost per year?
• Buy
  • Maybe not a perfect fit
  • Checks many additional properties
  • Generality allows migration to other projects

Understanding the Tool
• The average analysis tool doesn't think like a developer
• Error messages may require interpretation
  • "What do you mean 'n' could be 4294967295?" (sketch below)
  • Too much information to present all of it; what can be omitted?
• Best solution is regular exposure
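
An example (mine, not the deck's) of where such a report can come from: unsigned arithmetic wraps around, so a value the developer "knows" is small really can be 4294967295 (UINT_MAX) on one path.

    unsigned int n = item_count();  /* hypothetical; returns 0 on empty input */
    n = n - 1;                      /* if item_count() == 0, n wraps to 4294967295 */
    memcpy(dst, src, n);            /* the tool reports a gigantic copy */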

Usage (Care and feeding of your source code analysis tool)
• The tool is static; your usage should not be
• Most effective with regular usage
  • You can't fix everything the week before the release
• Adapt checkers to unique problems
• Simplify code where possible to eliminate false positives
  • But don't try to outsmart the tool

Organizational Deployment
• Developers are not always incentivized to use the tools available
  • More work
  • Accountability!
• Needs an internal champion
• Maintenance
  • Verify all code going out the door is being checked

Round up
• Static source code analysis can augment other forms of analysis
• Mostly confined to developers (source is needed), but adoption is slow or lacking in many organizations
• Much like secure programming, performing security analysis requires dedication and patience

The End
• Thanks
  • RECon
  • Coverity
• Questions?

Ted Unangst – Coverity