
InfoSec Triads: Cost/time/functionality

2010/06/30
Triad: Cost, Functionality, Time

Continuing InfoSanity’s recent (and unexplainable) focus on triads in previous posts, this one looks at the relationship between cost, time and functionality. For the purposes of this discussion, assume a scenario of introducing a new product/service or adding new capabilities to an existing service.

In an ideal world all projects would have enough resources, and realistic enough timescales, to develop all required functionality to the highest level of quality. In the real world this is rarely achievable under external constraints, so in any project compromises are inevitable.

The theory holds that, with a given set of resources, only a finite amount of functionality can be developed. It stands to reason, then, that additional functionality can be added to a project by extending the length of the project or by adding resources (although the Mythical Man Month and Dilbert may refute this simplistic theory).

Under ever-tightening economic conditions and increasing competition, functionality is stripped from the service in order to reduce costs and/or development and implementation time. As security is often seen by the wider business as a nicety rather than a necessity, security features are commonly the first to be dropped, or the security of the features that remain is reduced.

Despite what infosec professionals (myself included) may like to think, reducing security to meet business or market drivers isn’t necessarily a bad thing, provided that the benefit gained is proportionate to the additional risks introduced, and that those risks are acceptable to the business and/or client. However, in a world of increasing regulatory compliance this can prove a false economy, as it is almost universally more costly to bolt additional security onto an existing solution than it is to bake the required security into the design and development phases.

And if anyone tells you that a less secure solution is temporary and will be rectified at a later date… Don’t believe them 🙂

— Andrew Waite

Categories: InfoSec
  1. 2010/07/01 at 15:45

    When customers (internal/external) talk to me about risk, I always throw this question at them: How many root compromises that result in significant theft of confidential information, proprietary information or money (as applicable) are going to be acceptable to the CEO/stock holders this year?

    When it comes to security, I hate this triangle with a passion. I can barely live with implementing functionality in phases (it’s a good thing and yes, it does make sense from a management standpoint – it’s just me), but to implement security in increments is just wrong.

    If a software development manager senses that management is starting to put time and cost ahead of security, it’s very important that he document the threats and risks by painting various realistic scenarios that include the impact to the organization and its customers of the theft, modification or insertion of specific information. Don’t just say the app’s going to get hacked, because people just blow that off – talk about short- and long-term tangible consequences.

    I would really have to think twice about continuing to work for an organization that found the risk associated with identity theft to employees or customers to be “acceptable.” The same thing goes for other people’s money.

    Cheers

    “…(and unexplainable)…” – That’s funny Andrew. 😉

  2. 2010/07/01 at 16:14

    Mister Reiner :

    I would really have to think twice about continuing to work for an organization that found the risk associated with identity theft to employees or customers to be “acceptable.” The same thing goes for other people’s money.

    I agree, and this is what I had intended by ‘ensuring risks are proportionate to benefits’. In hindsight I should/could have been clearer. The kind of scenario I envisioned, where security may (acceptably) lapse in the face of external business pressures, was at the level of weighing many man-hours spent ensuring that an internal web app with no access to confidential information or PII isn’t susceptible to XSS against getting the project out the door and re-allocating resources to other projects.

    Again, in an ideal world I’d want no security issues at all, but business is far from a perfect world.

    I like your point about not just saying ‘it can be hacked…’; as you say, people’s eyes roll back and the point gets ignored, but trying to fully explain the issue to a non-security tech with terms like ‘buffer overflow’ or ‘CSRF’ doesn’t improve the situation. If the company has an internal audit function then VA or pentesting will obviously pick up the issue you are concerned about, but if security levels are being impacted by cost/resource concerns, will these activities still be undertaken? And if you (as an infosec resource) take it upon yourself to test without instruction, then at best your results won’t be appreciated; at worst you may face disciplinary issues for using company resources on activities not sanctioned by senior management.

    Not an ideal scenario, but I think if we can fully understand the aspects that make the issue less than ideal, then we will be better equipped to deal with the business drivers and company politics. Or can you propose a better solution to the problem?

    • 2010/07/02 at 02:52

      Let me try and explain myself a little better…

      I agree with you that technical terms will definitely go over people’s heads. It would be better to say things like:

      – The hacker will have full administrative access to the application and be able to view all of our customers’ information.

      – The hacker will be able to create fraudulent transactions that we will be unable to detect because we are not incorporating (fill in the blank) security feature.

      – If a hacker is able to steal a customer’s credentials, he will be able to log in from any computer rather than the customer’s registered computer that utilizes our special key fob USB authentication device.

      – Once a customer’s name, date of birth and social security number are compromised and we notify the customer, trends have shown that companies typically lose 15-20% of their customers over the next 2-3 months over fears that our security is inadequate (I just made those numbers up).

      – If our developmental formula for transparent aluminum gets into the hands of our competitors and they are able to figure out how to solidify the medium before we do, they will be able to patent our research and we will have nothing.

      – If we deploy this application before it’s fully tested against SQL injection attacks, a hacker may be able to gain full access to the database.
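      That last point can be made concrete for a non-security audience with a short sketch. The snippet below (Python with the standard-library sqlite3 module, chosen purely for illustration; the table, credentials and function names are made up) shows the classic bypass: string-built SQL lets an attacker log in with no valid password, while a parameterised query does not.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(name, password):
    # String concatenation: attacker-controlled input becomes part of the SQL.
    query = ("SELECT * FROM users WHERE name = '%s' AND password = '%s'"
             % (name, password))
    return conn.execute(query).fetchall()

def login_safe(name, password):
    # Parameterised query: input is treated as data, never as SQL.
    query = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchall()

# The injected quote turns the password check into a tautology:
# ... AND password = '' OR '1'='1'
print(login_vulnerable("alice", "' OR '1'='1"))  # returns alice's row
print(login_safe("alice", "' OR '1'='1"))        # returns []
```

      Showing the two calls side by side tends to land better with management than the phrase ‘SQL injection’ on its own.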

  3. 2010/07/02 at 09:46

    Mister Reiner :

    Let me try and explain myself a little better…

    That works, and I like your examples. I think we’re both coming at the problem from the same angle, but I’ve done a poor job of describing my intentions (I shouldn’t post replies whilst fighting fires; it never comes across right…)

    You got me excited though; I was just about to ask what sources you were using for your figures, as they could help in similar discussions elsewhere, and then you point out you made them up (but then 87.284% of all stats are made up on the spot).

    • 2010/07/03 at 02:38

      It’s all good 🙂

      Sorry about the numbers, ha ha. I’m sure that the real figures are a well-guarded secret.
