As Published In
Oracle Magazine
May/June 2009

COMMENT: All Secure


Good Reporting, Good Governance

By Mary Ann Davidson

Measuring “How are we doing?” in Oracle Software Security Assurance

Most of us are list makers or “list takers.” When we were kids, it was summer reading lists and back-to-school lists. Now that we’re adults, it’s New Year’s resolutions, grocery lists, and life goals. The only thing that changes is that we use more gadgets (a smartphone) to create lists than we did in days of yore (the back of an envelope). Lists work by helping you prioritize (Must, Want, Like) or sequence (1, 2, 3 . . .) what needs doing, and they help you track whether you did those things or not. That last bit is important, especially when you manage work not entirely under your control or must show someone else what you have done, both of which are elements of governance.

Oracle has recently put more governance structures around Oracle Software Security Assurance. A very sharp manager on my team wanted to know how consistently and broadly teams across all the Oracle development groups (including teams we’ve added via acquisition) were following our secure development practices. He wanted to measure progress—clearly, a brand-new acquisition cannot change overnight to do things The Oracle Way. In particular, he wanted to identify teams that were slower to adopt Oracle Software Security Assurance so we could target them for more assistance. The Security Assurance Compliance Scorecard he developed is a tool that enables us to manage what needs doing, in priority order, and how well we are doing it. What none of us expected was how powerful the idea of governance—of what we say we do, intend to do, and are doing—would turn out to be. Report cards, I had forgotten, are another kind of list. 

Next Steps

LEARN more about Oracle Software Security Assurance

READ more Davidson
blogs.oracle.com/maryanndavidson

Our first success was the mandatory class on secure development practices that almost everyone in development has to take. Developers are seemingly always busy: there is always another release to plan, code, test, and deliver, so mandatory training is hard to fit in. Mind you, we already tracked who had to take the class, who had taken it, and who had passed, and we reminded groups often that they needed to “get on with it.” What kicked the program into high gear was collating all that information into a bar chart and presenting it to senior management, who could then see each development group’s training status, along with which groups had the highest completion rates and which had the lowest. When we first created the bar chart, a few groups were “noncompliant” (below 75 percent complete), most were “making progress” (between 75 percent and 95 percent complete), and a few were “compliant” (at or above 95 percent complete). Within six weeks of presenting the chart to senior management, all but one group was compliant. That outlying group created an action plan, and six weeks later it, too, was compliant. Governance—a management-level synopsis of “Are we doing what we say we are doing?”—works. (So does reporting leaders and laggards to the boss.)
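
To make the bucketing concrete, here is a minimal, purely illustrative sketch in Python (not Oracle’s actual reporting tooling; the group names and completion rates are invented) that maps each development group’s training-completion percentage to the three categories above and prints a summary one could chart:

# Illustrative only: bucket each group's training-completion rate into the
# categories described above -- "noncompliant" (< 75%), "making progress"
# (75-95%), and "compliant" (>= 95%) -- and print a simple status summary.

def compliance_status(completion_pct: float) -> str:
    """Map a completion percentage to a scorecard category."""
    if completion_pct >= 95:
        return "compliant"
    if completion_pct >= 75:
        return "making progress"
    return "noncompliant"

# Hypothetical group names and completion rates, for illustration only.
training_completion = {
    "Group A": 98.0,
    "Group B": 82.5,
    "Group C": 71.0,
}

# List the laggards first, as a report to management might.
for group, pct in sorted(training_completion.items(), key=lambda kv: kv[1]):
    print(f"{group}: {pct:.1f}% complete -> {compliance_status(pct)}")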

Training is merely one element of the larger Security Assurance Compliance Scorecard, in which we track compliance by development group for major elements of Oracle Software Security Assurance, such as use of automated tools, secure configuration, and integration into the critical patch update process. We are also breaking down each element into smaller compliance steps, both for reporting consistency and to measure incremental progress. The scorecard is presented left to right—the Oracle Software Security Assurance elements that development groups generally adopt first are on the left, and the “later elements” are on the right. We provide not only a key (each group’s degree of compliance is color-coded) but also an attachment describing how teams can become compliant. The report—which goes to the highest levels of the company—has had a galvanizing effect on development groups: everyone wants to be an A student (or solid green). That said, the “report to management” is intended to be “Here’s how we’re doing,” not (primarily) “This team is bad.”

Good governance is really just managing well, and that includes giving people the tools they need to do so. My team strives for absolute accuracy in reporting, and we ensure that development security leads receive the report—and have a chance to raise any concerns—before it goes to senior management. We want the people doing the work to be the first ones to see their report card, and we also highlight to senior management the groups that have really improved and the individuals who have made that happen. As my mom always says, “You catch more flies with honey than with vinegar.”

Mary Ann Davidson
is the chief security officer of Oracle, responsible for secure development practices and security evaluations and assessments. She represents Oracle on the board of directors of the Information Technology Information Sharing and Analysis Center (IT-ISAC), has served on the U.S. Defense Science Board, and is on the editorial review board of SC Magazine.

