Archive for February, 2010

Month of PHP bugs 2010

Following in the now well-established form of a ‘Month of X Bugs’, the Month of PHP Bugs has just opened its call for papers for a second run, to update and expand on its successful month in 2007.

I’ll admit that I largely ignored the original Month of PHP Bugs (MOPB); at the time I had just made the decision to stop coding in PHP and try a more mature language. I had found PHP a very simple language to learn and code in, but as a result I also found it a very simple language to code very badly in. (I’ve since found that a bad coder can code badly in any language, which is why I gave up the developer career path.)

However, this month’s SuperMondays event changed my perspective slightly. Lorna Jane gave a great presentation on using PHP to provide a web services architecture, and at first glance it looks like PHP has improved and matured significantly since I last used it. For those interested, Lorna’s talk was recorded and is available here, and Lorna’s own take on the event can be found here.

So while I’m not in a position to contribute to the month’s releases, I will be paying closer attention to the resources released this time around. If you think you can contribute, the organizers have posted a list of accepted topics:

Accepted Topics/Articles

  • New vulnerability in PHP [1] (not simple safe_mode, open_basedir bypass vulnerabilities)
  • New vulnerability in PHP related software [1] (popular 3rd party PHP extensions/patches)
  • Explain a single topic of PHP application security in detail (such as guidelines on how to store passwords)
  • Explain a complicated vulnerability in/attack against a PHP widespread application [1]
  • Explain a complicated topic of attacking PHP (e.g. explain how to exploit heap overflows in PHP’s heap implementation)
  • Explain how to attack encrypted PHP applications
  • Release of a new open source PHP security tool
  • Other topics related to PHP or PHP application security

[1] Articles about new vulnerabilities should mention possible fixes or mitigations.
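On the password-storage topic above: the accepted guidance boils down to never storing plaintext or bare unsalted hashes, but instead deriving a salted, deliberately slow hash and comparing it in constant time. A minimal sketch of the idea (shown in Python purely for illustration; the function names are my own, not from any MOPB article):

```python
import hashlib
import hmac
import os

def hash_password(password, iterations=100_000):
    """Return (salt, digest) using salted PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)  # unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, stored, iterations=100_000):
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, stored)

# Usage: store salt + digest (and the iteration count), never the password
salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))
print(verify_password("wrong guess", salt, stored))
```

The salt defeats precomputed rainbow tables, and the iteration count makes brute-forcing each guess expensive; both parameters should be stored alongside the digest so they can be raised later.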

And prizes are available for the best submissions:

Place — Prize
1. 1000 EUR + Syscan ticket + CodeScan PHP license
2. 750 EUR + Syscan ticket
3. 500 EUR + Syscan ticket
4. 250 EUR + Syscan ticket
5.–6. CodeScan PHP license
7.–16. Amazon coupon of 65 USD / 50 EUR

So what are you waiting for? Get contributing…

–Andrew Waite

Book Review: Virtualization for Security

2010/02/27

After having this on my shelf and desk for what seems like an eternity, I have finally managed to finish Virtualization for Security: Including Sandboxing, Disaster Recovery, High Availability, Forensic Analysis and Honeypotting. Despite it having one of the longest titles in the history of publishing, the name is justified: the book covers a lot of topics and subject matter. The chapters are:

  1. An Introduction to Virtualization
  2. Choosing the right solution for the task
  3. Building a sandbox
  4. Configuring the virtual machine
  5. Honeypotting
  6. Malware analysis
  7. Application testing
  8. Fuzzing
  9. Forensic analysis
  10. Disaster recovery
  11. High availability: reset to good
  12. Best of both worlds: Dual booting
  13. Protection in untrusted environments
  14. Training

Firstly, if you’re not security focused, don’t let the title put you off picking this up. While some of the chapters are infosec-specific, a lot of the material is more general and could be applied to any IT system; the chapters on DR, HA and dual booting are good examples of this.

Undoubtedly the range of content is one of the book’s biggest draws. I felt like a kid in a sweet shop when I first read the contents and had a quick flick through; I just couldn’t decide where to start. This feeling continued as I read each chapter: different ideas and options that I hadn’t tried were mentioned and discussed, resulting in me scribbling another note on my to-do list or putting the book down entirely while I turned my lab on to try something.

The real gem of information for me was the sandboxing chapter, one of the topics that persuaded me to purchase the book in the first place. Considering that one of the book’s authors is Carsten Willems, the creator of CWSandbox, it shouldn’t be too surprising that this chapter covers sandboxing well. The chapter also covers creating a LiveCD for sandbox testing; while very useful in the context it was explained in, it was one of several parts of the book where my brain started to hurt from an overload of possible uses.

As you might have already guessed, the range of topics is also one of the book’s biggest weaknesses: there just isn’t enough space to cover each topic in sufficient depth. I felt this most in the topics I’m more proficient with; the honeypotting chapter does a great job of explaining the technology and methodology, but I was left wanting more. The disappointment was lessened on topics that I have less (or no) experience with, as all the material was new to me.

Overall I really liked the book. It provides an excellent foundation to the major uses of virtualisation within the infosec field and, perhaps more importantly, leaves the reader (at least it did me) enthusiastic to research and test beyond the contents of the book. The material won’t make you an expert, but if you want to extend your range of skills there are definitely worse options available.

–Andrew Waite

Categories: InfoSec, Reading

Random 419

2010/02/27

I want to say thank you to everyone who has supported this site and blog, but it is closing down as I am now rich thanks to the Central Bank of Nigeria. No, seriously, they sent me an email and everything….

Okay, maybe not, but it’s been a while since I’ve seen a 419 (advance fee fraud) email slip through to my inbox, so I thought I’d share. Originally I had planned to critique different parts of the email, but I still can’t believe people fall for these, so instead I’ll just share the ‘wealth’ for all.

This is to congratulate you for scaling through the hurdles of screening by the board of directors of this payment task force. Your payment file was approved and the instruction was given us to release your payment and activate your ATM card for use.

The first batch of your card which contains 1,000.000.00 MILLION U.S. DOLLARS  has been activated and is the total fund loaded inside the card. Your fund which is in total 10,000.000.00 MILLION U.S. DOLLARS  will come in batches of 1,000.000.00 MILLION U.S. DOLLARS  and this is the first batch.

Your payment would be sent to you via UPS or FedEx, Because we have signed a contract with them which should expired by MARCH 30th 2010 Below are few list of tracking numbers you can track from UPS website( to confirm people like you who have received their payment successfully.

JOHNNY ALMANTE ==============1Z2X59394198080570
CAROL R BUCZYNSKI ==============1Z2X59394197862530
KARIMA EMELIA TAYLOR ==============1Z2X59394198591527
LISA LAIRD ==============1Z2X59394196641913
POLLY SHAYKIN ==============1Z2X59394198817702

Good news, We wish to let you know that everything concerning your ATM CARD payment despatch is ready in this office and we have a meeting with the house (Federal government of Nigeria) we informed them that your fund should not cost you any thing because is your money (Your Crad). Moreover, we have an agreement with them that you should pay only delivering of your card which is  82 U.S. DOLLARS by FedEx or UPS Delivering Company.

However, you have only three working days to send this 82 U.S. DOLLARS  for the delivering of your card, if we don’t hear from you with the payment information; the Federal Government will cancel the card.

This is the paying information that you will use and send the fee through western union money transfer.

Address: Lagos-Nigeria
Question: 82
Answer: yes

I wait the payment information to enable us proceed for the delivering of your card.


Do I really need to suggest that anyone ignore similar ‘opportunities’ that may reach their inbox?

Additionally, if you want to find out more, or want a good laugh at the expense of these ‘con-men’, take a trip over to the excellent 419Eater site; those guys (and gals) do great work.

–Andrew Waite

Categories: Uncategorized

Direct Access at NEBytes

Tonight was the second NEBytes event, and after the launch event I was looking forward to it. Unfortunately the turnout wasn’t as good as the first time: 56 were registered, but I counted only approximately 22 in the audience. The topic I was most interested in was a discussion of Microsoft’s Direct Access (DA), billed as an ‘evolution in remote access capabilities’. Being a security guy, this obviously piqued my interest.

Tonight’s speaker covering DA was Dr Dan Oliver, managing director at Sa-V. Before I start I want to state that I have no prior knowledge of DA, and my entire understanding comes from tonight’s presentation/sales-pitch by Dan; if anyone with more knowledge wants to point out any inaccuracies in my understanding or thoughts, I’d more than welcome getting a better understanding of the technology.

DA is an ‘alternative’ to VPNs (discussed more later) for a Microsoft environment. The premise is that it provides seamless access to core resources whether a user is in the office or mobile. The requirements are fairly steep and, as Dan discussed on several occasions, may be a stumbling block for an organisation wanting to implement DA immediately. These are (some of) the requirements:

  • At least one Windows 2008 R2 server for AD and DNS services
  • A Certificate Authority
  • Recent, high-end client OS: Windows 7, Ultimate or Enterprise SKU only.
  • IPv6 capable clients (DA will work with IPv6 to IPv4 technologies)

As few organisations have a complete Win7 roll-out, and even fewer have the resources available to roll out the higher-end editions, Dan was asked why the requirement exists. Answer: ‘Microsoft want to sell new versions, sorry’.

With DA pitched as an alternative to VPNs, at numerous points in the presentation there was a comparison between the two solutions, and to me the sales pitch for DA seemed schizophrenic. Dan kept switching between DA being a complete replacement for current VPN solutions, and DA being suitable only for access to lower-priority services and data, with organisations perhaps preferring to remain with VPNs for more sensitive data. At this point I couldn’t help thinking: why add DA to the environment if you’re still going to have VPN technologies as well? This was especially the case as Dan stated (and I can’t verify) that Microsoft do not intend to stop providing VPN functionality in their technologies.

From a usability and support perspective DA is recommended because it does not require additional authentication to create a secure connection to ‘internal’ services. Apparently having to provide a username/password (with RSA token/smartcard/etc.) to establish a VPN connection is beyond the capabilities of the average user.

One aspect that I did agree with (and which will be familiar if you listen to Exotic Liability) is the concept of re-perimeterisation: the idea that the traditional perimeter of assets internal to a firewall is no longer sufficient to protect resources in the modern environment, and that the modern perimeter is wherever data and users are, not tied to a particular geographical location or network segment. However, rather than the perimeter expanding to incorporate any end-user device that may access or store sensitive data, Dan claimed that DA would shrink the perimeter to include only the data centre, effectively no longer being concerned with the security of the client system (be it desktop, laptop, etc.).

This point made me very concerned about the DA model: if the client machine has seamless, always-on access to ‘internal’ corporate services and systems, I would be even more concerned for the security of the end-user machine. If a virus/trojan/worm infects the system with the same access as the user account, then it too has seamless, always-on access to the same internal services. I’m hoping this weakness is only down to my understanding of the technology, as it seems like a gaping hole. If anyone can shed any light on this aspect of DA, I’d appreciate some additional pointers to help clear up my understanding.

At this point I still can’t see an advantage to implementing DA over more established alternatives. My gut feeling is that DA will either become ubiquitous over the coming years or disappear without making an impact. Given that it doesn’t play nicely with the most widely deployed MS technologies, let alone ’nix or OS X clients, and that the strict requirements make a roll-out expensive, I expect the latter, but I’ve been wrong before.

At this point I decided to make a speedy exit from the event (after enjoying some rather good pizza), as the second session was dev-based (Dynamic Consumption in C# 4.0, Oliver Sturm) and I definitely fit in the ‘IT Pro’ camp of the NEBytes audience.

Despite my misgivings about the DA presentation, I still enjoyed the event and look forward to the next. If you were at either of the events, please let the organisers know your thoughts and ideas for future events by completing this (very) short survey. Thanks, guys.

— Andrew Waite

Categories: Event, MS Windows, Presentation

New Projects Section

2010/02/17

The core InfoSanity site has just (in the last 24 hours) had the first of several planned refreshes go live. In this case it is a section of the site dedicated to the code and tools released as part of the research carried out by InfoSanity. No new content yet, but it has served as a nice reminder of some of the intended features still incomplete in existing projects; hopefully updates will be coming soon.

The start of the section can be found here, alternatively just navigate from the site’s menu. For those feeling lazy, a sneak peek:

— Andrew Waite

Categories: Tool-Kit