Search Results

Keyword: ‘triad’

InfoSec Triads: Cost/Time/Functionality

Triad: Cost, Functionality, Time


Following InfoSanity’s recent (and unexplainable) focus on triads in previous posts is the relationship between cost, time and functionality. For the purpose of this discussion assume a scenario for introducing a new product/service or adding new capabilities to an existing service.

In an ideal world all projects would have enough resources and realistic timescales to develop all required functionality to the highest level of quality. However, in the real world this is rarely achievable when working within external constraints, so in any project compromises are inevitable.

The theory holds that with a given set of resources, only a finite amount of functionality can be developed. It follows that additional functionality can be added to a project by extending the project's length or adding more resources (although The Mythical Man-Month and Dilbert may refute this simplistic theory).

Within ever-tightening economic conditions and increasing competition, functionality is stripped from the service in order to reduce costs and/or development and implementation time. As security is often seen by the wider business as a nicety rather than a necessity, security features are commonly the first to be dropped, or the security of the features still implemented is reduced.

Despite what infosec professionals (myself included) may like to think, reducing security to meet business or market drivers isn't necessarily a bad thing, provided that the benefit gained is proportionate to the additional risks introduced, and those risks are acceptable to the business and/or client. However, in an increasingly regulated world this can prove a false economy, as it is almost universally more costly to implement additional security on top of an existing solution than it is to bake the required security into the design and development phases.

And if anyone tells you that a less secure solution is temporary and will be rectified at a later date… Don’t believe them 🙂

— Andrew Waite

Categories: InfoSec

InfoSec Triads: Security/Functionality/Ease-of-use

Triad: Security, Functionality, Ease of Use


Following from the introduction of the C.I.A. Triangle, another triangle is used to help explain the relationship between the concepts of security, functionality and ease of use. A triangle is used because an increase or decrease in any one of the factors has an impact on the other two.

As an example, increasing the amount of functionality in an application will also increase the surface area that a malicious user can attack when attempting to find an exploitable weakness.

The trade-off between security and ease of use is commonly encountered in the real world, and often causes friction between users and those responsible for maintaining security. Microsoft had long been targeted by the security community for allowing everyday users to operate the system with administrative or system-level permissions, which meant that any exploit targeting a userland application was immediately given full rights. When Microsoft tried to limit this by forcing users to specifically request elevated privileges via User Account Control (UAC), there were a high number of complaints from users who weren't happy with the extra actions required to complete tasks. As a result, many instructions and guides were created to teach users how to disable the UAC functionality; increasing ease of use and decreasing the steps needed to perform some tasks, but at the expense of disabling an improved security system.

A recent blog post discussing the security of the Windows operating system states, quite colorfully, that:

Windows is an open cesspool to anyone developing applications. Developers can store information anywhere in the registry and store executable components anywhere in the file system – this includes overwriting existing registry entries and files. They can also write “hooks” to intercept, monitor and replace operating system calls to do fancy things. While all of this is great from a functionality standpoint, it’s also the main reason why Windows can never be secured.

Leaving aside the bias and hyperbole of the above: rightly or wrongly, developers are able to write to the filesystem and registry, and to hook API calls, in order to provide the functionality expected and requested by end users. From this standpoint no functional operating system will ever be 100% secure; every system, and ultimately every user, must settle on a compromise between acceptable functionality and usability, and acceptable security.



I’d been looking for this Dilbert strip when writing the post, and just came across it now, enjoy:

Dilbert - Security trade-off


Categories: InfoSec

InfoSec Triads: C.I.A.

Triad: Confidentiality, Integrity, Availability


Information security is a far-reaching and often all-encompassing topic, but at its core information security and the protection of digital assets can be reduced to three central attributes: Confidentiality, Integrity and Availability, often referred to as the CIA triangle (not to be confused with the US Central Intelligence Agency).

Each factor provides a different and complementary protection to data, and all three must be sufficiently preserved to maintain the usefulness of the information and information systems being protected.


Maintaining data’s confidentiality requires ensuring that only those users and/or systems authorised to access the stored information are able to do so.

As a result, this aspect is often what first comes to mind when thinking about information security: the act of preventing those that shouldn’t be able to access the data from doing so. Some of the most commonly understood security systems and processes fall into this category, for example requiring user authentication in the form of a password, or preventing remote access to a restricted resource with a firewall. Removing technology from the process, this is the equivalent of using a lock on the office filing cabinet.
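As a minimal sketch of the password example above (not any particular system's implementation; the function names are illustrative), Python's standard library can derive and verify a salted password hash, so the plaintext password never needs to be stored:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a storable hash from a password using PBKDF2-HMAC-SHA256."""
    salt = salt or os.urandom(16)  # a random salt defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored):
    """Re-derive the hash from the attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("letmein", salt, stored))                       # False
```

Only the salt and digest are kept, so even an attacker who reads the stored credentials cannot trivially recover the password, keeping the data confidential from all but authorised users.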


Maintaining data integrity involves ensuring that the data remains correct, whilst in storage or transit, and that only authorised changes are made to the data.

At a high level, data integrity can be protected using similar controls to those enforcing data confidentiality discussed above: if the data can only be accessed by those authorised to view and modify it, then its integrity is protected from outsiders. Unfortunately this is only part of the story, as it only protects against the data’s integrity being compromised by a malicious third party. Data integrity can also be compromised by an authorised user changing the data in error, a program handling the data could contain a programming or logic flaw resulting in it changing the data in a way other than desired, or a hardware error could result in the stored data becoming corrupt.

Hashing algorithms like MD5 or SHA1 can be used to determine if the contents of a file have been changed, but this cannot determine whether the file has been changed correctly. This highlights one of the key problems within the realm of information security: while data integrity falls under the remit of security, there are many different factors (and in the business world many different individuals and/or departments) that can have a direct impact on data integrity, and the overall protection is only as strong as the weakest link.
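As a sketch of the idea (using SHA-256 here rather than the older MD5/SHA1, which are no longer considered collision-resistant), a baseline digest can be recorded and re-checked later to detect any change to a file:

```python
import hashlib
import os
import tempfile

def file_digest(path, algorithm="sha256"):
    """Hash a file in fixed-size chunks so large files need not fit in memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Demonstration with a throwaway file: record a baseline, tamper, re-check.
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.write(b"quarterly figures: 1000")
baseline = file_digest(path)

with open(path, "wb") as f:
    f.write(b"quarterly figures: 9000")  # any change, malicious or accidental
assert file_digest(path) != baseline  # the digest shows *that* the file changed
# ...but nothing in the hash says whether the change was authorised or correct.
os.remove(path)
```

Note the limitation mirrors the point above: the digest flags that a change happened, not whether it was a legitimate edit, a user error, or corruption.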


Maintaining availability of both systems and information is crucial for most IT professionals to continue in gainful employment. As a result a lot of tasks geared towards ensuring systems availability are already incorporated into most business practices, including frequent backups and maintaining standby systems to replace production units in the event of a failure.

Availability of data can be attacked by a malicious user, application or script deleting the data itself. A newer form of attack, ransomware, follows a general trend of monetizing computer attacks: it denies legitimate users access to their data by cryptographically protecting it with a key known only to the malicious parties, who then attempt to extort money from the victim in return for the key to unlock the data.

Alternatively, denial of service (DoS) attacks prevent legitimate use of a service by consuming all of the server’s resources. Like ransomware, DoS attacks can be monetized by a malicious party, either before an incident, with the potential victim required to pay up to prevent the attack from occurring in the first place, or by being forced to pay to stop an attack already in progress. Online betting sites were among the first businesses to be threatened in this way. The IACIS has a good paper on the topic of cyberextortion [pdf].

— Andrew Waite

Categories: InfoSec

2010: A Review

Originally I wasn’t planning on reviewing this year, as I didn’t think that much had happened, but during some end-of-year housekeeping I came across the InfoSanity review of 2009 and wanted to keep the trend going. In keeping with last year’s review, I’ll start with the non-technical (again on pain of death 😉 ); wedding plans are going strong, so I should be a married man in early 2011.

Back to the technical: Despite my initial concerns; the site, blog and research environment are still here and still growing. To all those who’ve read, contributed and (most importantly) told me I’m wrong over the past year (you know who you are), thank you.

Lab Environment(s): To complement the home lab established in 2009, 2010 saw the introduction of a hosted virtual lab which has provided the opportunity to easily try new (and old) technologies in the real world. As part of this InfoSanity has setup (and in some cases also removed) instances of honeyd, Dionaea, Amun and Kippo. These systems have also resulted in some new utilities being developed and released as I worked through various findings.

Whilst standing on the shoulders of giants (thanks Markus), some of the findings from the InfoSanity environment are now available publicly. Although I really must complete both automating the process and including findings from other systems, 2011’s to-do list is already growing.

Public Speaking: For some reason I’ve still been asked to talk in public about topics I find fascinating; so thanks to the Disaster Protocol team for having me on the show. I felt it was a great discussion of honeypot technologies and infosec in general, and from feedback I’ve had others seem to agree.

Trying new things: Whilst trying to grow and mature over the year InfoSanity tried a few different themes and topics, some worked, like basic ssh hardening guidelines (potentially more to come in new year) and some didn’t, like the ‘Infosec Triads’ series. But if you don’t stretch yourself you’ll stop learning, so expect more posts that don’t quite work in 2011.

Friends, contact and groups: As with last year, the best part of 2010 has definitely been the people I’ve either continued talking to and/or working with and those I’ve met for the first time. 2010 saw a growth spurt in local and online groups I’ve been involved in, including the start of NEBytes, ToonCon and the Kippo User Group. There are also a huge number of awesome groups which I don’t get as much time to get involved with as I’d like; EH-Net, Group51, DissectingTheHack, Exotic Liability…the list goes on.

2011?: Who knows? Every time I try to make plans or predictions the Sky Fairies and Flying Spaghetti Monsters mock me, so I won’t try to make any. But whatever the outcome, I’m not expecting a letup in the pace, and can already see some exciting new opportunities on the horizon.

Another decade down, and a new year of opportunity ahead. See you all in 2011.

–Andrew Waite

Categories: Uncategorized