
Clouds in BlackHat’s conference

Being on the other side of the pond I wasn't able to attend Black Hat, but I have been keeping a keen eye on the conference materials and talk recordings released after the conference's close. As I've recently been researching the latest buzz around Cloud Computing, I was naturally drawn first to the talks with Cloud computing as a topic.

First up is Kostya Kortchinsky's Cloudburst: Hacking 3D (and Breaking Out of VMware). This presentation details an exploit vector for breaking out of the guest environment and executing arbitrary code on the underlying host. Kortchinsky clearly knows his stuff, but I'll admit most of his talk goes well above my head. For reasons touched on below I think this is a virtualisation issue rather than a Cloud issue, with 'Cloud' likely added to the title to cash in on the current buzz. Either way, the bottom line is that guest escape is rapidly moving from theoretical threat to practical attack vector, and is something that should be considered when designing any system, network or architecture.

Secondly, the Sensepost team do a great job of explaining security issues new or prevalent to Cloud architecture with Clobbering the Cloud!, and include some great (read: humorous) images to help illustrate their points. I especially like the idea of building and sharing trojaned/backdoored machine images and waiting for the unsuspecting to take advantage of your generosity 🙂 The videos used within the actual presentation are available direct from the Sensepost site, here.

Taking away the award for longest talk title related to Cloud Computing is Cloud Computing Models and Vulnerabilities: Raining on the Trendy New Parade. This talk discusses the three components of the cloud 'stack': Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS).

I love the definition used for cloud computing, or more accurately the statement of what Cloud Computing is NOT:

  • Virtualisation
  • Remote Backup
  • Most of the stuff called cloud computing
  • And: ‘If you’re not re-writing your software, it’s not Cloud Computing’

From my previous research into Cloud Computing I feel that a lot of the security concerns often raised are not new or unique to the Cloud, and that well-established, basic best practice will defend against the issues. The speakers of this presentation seem to be of a similar mind, but suggest that the early big players in this market are not necessarily doing all in their power; for example, something as basic as logging and audit trails isn't fully available within the current on-market solutions.

Likewise, depending on a Cloud provider's contracts and EULA, clients of cloud services may not be able to fully control the security testing of 'their' environment. Some providers forbid 'malicious' traffic being targeted at their architecture and platforms, which could limit or remove the ability to perform fully comprehensive penetration testing; depending on location, market and data, such testing may be a legal or regulatory requirement.

Whilst not related to the Black Hat conference, I read an article on datacentreknowledge.com in which Rackspace claims that the Cloud is going to spell the end of shared hosting as we know it. In my view this can only be a PR fluff piece: anyone that understands hosted services, even those selling Cloud services themselves, agrees that however you rate the benefits of Cloud architecture it is not, and cannot be, a silver bullet to solve all the world's IT problems, leaving a market for traditional architectures.

If the Cloud is here to stay, so is everything else. Regardless of an individual IT professional's personal opinion of Cloud computing, it must be fully understood and measured on technical merits alongside existing solutions to provide best value and ROI; implementing any solution based on 'religious' arguments is not in the best interests of any business.

Andrew Waite

Categories: Uncategorized

CloudCamp Lightning talks

2009/08/01

Last week's CloudCamp in Newcastle started off with a series of lightning talks: five minutes on a topic of the speaker's choice.

Simon @ Amazon

Simon focused on security issues arising from implementing service provision on Cloud architecture, starting off by suggesting that most cloud implementations don't consider security issues until after the initial implementation. It was also proposed that a lot of the security concerns are psychological: people feel less confident in the security of their systems if they don't control the physical hardware, but sufficient security can be achieved by following best practice at the other layers of the system architecture. To assist, Amazon's cloud provision denies access to all network ports by default.
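That deny-by-default stance means every open port is an explicit decision. As a modern-day illustration (boto3 didn't exist back then, and the group ID and CIDR below are hypothetical), opening a single port in an EC2 security group looks something like this:

```python
import boto3  # AWS SDK for Python

ec2 = boto3.client("ec2")

# Everything is closed until a rule like this explicitly opens it.
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",  # hypothetical security group
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 22,              # allow SSH...
        "ToPort": 22,
        "IpRanges": [{"CidrIp": "203.0.113.0/24"}],  # ...from one trusted range only
    }],
)
```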

Gehan Munasighe @ Flexiscale

Gehan discussed provisioning cloud systems in more general terms. Cloud services are not virtualisation, but virtualisation is an integral component of a functional cloud offering. The goal of a cloud provider, and the benefit to a client hosting within the cloud, is that the client should not notice or be aware of any system failure within the cloud.

Stewart Townsend @ Sun

This presentation contained nearly every buzzword related to the Cloud, whilst trying to prove that the buzzwords aren't important. The benefits provided by a Cloud environment are low cost, increased agility and greater efficiency. Stewart claims that the technology required for Cloud systems is simple; the roadblocks to Cloud implementation are often developers and deployers, and in some cases out-dated corporate policy.

Matt Deacon @ Microsoft

Cloud computing is required for progressive enterprises. The computing industry is currently an industry in transition, but this transition will likely not be fully realised for another 20 years.

Steve Caughey @ Arjuna

Steve started out by detailing some universal laws of computing. In addition to the well-known Moore's Law, which states that processor power doubles every 18 months, the law of storage states that disk capacity doubles every 12 months, and Gilder's Law states that network bandwidth doubles every 6 months. These increases mean that the geography and physical location of resources become less important over time, allowing businesses to take advantage of economies of scale, but this must be tempered by consideration of local legal requirements.
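Those doubling periods compound quickly. A quick back-of-the-envelope calculation (using only the doubling periods quoted above) shows why bandwidth outpaces everything else over a decade:

```python
# Growth multiplier after `years`, given a doubling period in months.
def growth(doubling_months: float, years: float = 10) -> float:
    return 2 ** (years * 12 / doubling_months)

for name, months in [("CPU (Moore)", 18), ("Storage", 12), ("Bandwidth (Gilder)", 6)]:
    print(f"{name}: x{growth(months):,.0f} over 10 years")
# CPU: ~x102, storage: ~x1,024, bandwidth: ~x1,048,576.
# The bandwidth multiplier dwarfs the others, which underpins the claim
# that the physical location of resources matters less over time.
```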

Ross Cooney @ EmailCloud

Ross discussed a use for Cloud services that he calls 'Bootstrap & Transition'. The theory is that by utilising Cloud services in the short term, start-ups and new services can be instantiated without the initial capital expenditure commonly associated with new IT environments. Once the business is stable and return on investment can be proven, the service can be transitioned back to in-house hardware, both to increase control of the service and to increase the potential client base, as some businesses currently do not trust the cloud model. Alternatively, if the venture proves unsuccessful the stakeholders can walk away without penalty or outstanding debt.
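The economics behind 'Bootstrap & Transition' reduce to a simple break-even calculation. The figures below are entirely made up for illustration:

```python
# All numbers are hypothetical, purely to illustrate the trade-off.
capex = 20_000.0          # up-front cost of buying in-house hardware
cloud_monthly = 900.0     # pay-as-you-go cloud bill
inhouse_monthly = 200.0   # running costs once you own the kit

# Cloud avoids the capex but costs more per month; find the crossover point
# at which transitioning back in-house starts to pay off.
breakeven = capex / (cloud_monthly - inhouse_monthly)
print(f"Cloud wins for roughly the first {breakeven:.0f} months")  # ~29 months
```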

Andrew Waite

Categories: Cloud

CloudCamp sound bites

2009/07/30

Same story as my previous post on the event: I'm still trying to fully digest all of the information and ideas presented. Whilst I research further I thought I'd share some of the comments and soundbites (mostly paraphrased) I took note of during the event, which are currently bouncing around my head.

(If any of the speakers feel these are mis-quoted or out of context, please let me know)

Reading back through my notes, I find it interesting that most of these could relate equally well to any form of IT-based service, feeding back into my original feeling that cloud computing isn't especially new, but is simply the evolution of other shared IT frameworks (mainframes, multi-user systems, etc.). Which brings me nicely to my first quote:

The ideas and technologies behind cloud computing aren’t new; it is the billing model that is innovative and creating opportunities.

Use multiple cloud providers to ensure tolerance to failure

Balance the cost of a failure against the cost of mitigating the risk

Run a business/service expecting failures to happen, and plan accordingly

Contractual SLAs are not insurance against failure

Security issues related to Cloud computing aren’t new or worse than security issues within traditional architectures, they’re just more visible

Traditional systems don't scale well within a cloud architecture

Today's architecture and system components will evolve to be more efficient within a cloud-based environment

The cost of failure is often the biggest cost of IT systems

Traditional licensing models for OSes and applications need to evolve to match the requirements of cloud-based services

And finally, which was said with a wry smile:

Cloud computing is good news for consultants

Andrew Waite

Categories: Cloud

Initial thoughts from CloudCamp

2009/07/29

Tonight was the second CloudCamp event in the North East of England, and my first serious look at cloud computing. I really enjoyed the event and believe I received excellent value from attendance, so thanks to all those who helped run the event, presented and discussed aspects of the field with me during the breakout sessions.

My head is still spinning with new ideas and understanding as a result of the event, so I'll try to keep this brief; it should also act as a semi-disclaimer for future postings regarding cloud computing.

Before the event my understanding of cloud computing was very cursory, and I was very dubious of both its implementation and its actual value to an organisation. As such I attended the event in an effort to gain a greater insight into this new buzzword in service provisioning, either to join the bandwagon and take advantage of the Cloud's potential, or to be able to argue against adoption with a more reasoned argument than 'I don't like it'.

For this goal the event was perfect for my needs, as I now have a better understanding of what Cloud computing is (and isn't) and have been able to answer some of my fundamental questions.

Short and sweet was the intention of this post, so I'll finish with a quote (paraphrased) from the event which has in some ways changed my outlook on Cloud computing, and more specifically on the ability to secure a Cloud:

Security issues related to Cloud computing aren’t new or worse than security issues within traditional architectures, they’re just more visible.

Andrew Waite

Categories: Cloud, InfoSec

OWASP at Northumbria Uni – June 2010

June 16th marked the first time the Open Web Application Security Project’s (OWASP) Leeds/Northern Chapter ran an event at Northumbria University, meaning it was the first time I was able to attend. Jason Alexander started off events with a brief overview of OWASP and the projects the group is involved with.

ENISA Common Assurance Maturity Model (CAMM) Project

Colin Watson did a good job of explaining the work he and others have been doing. The project has released two documents, which Colin discussed: the Cloud Computing Risk Assessment [.pdf] and the Cloud Computing Information Assurance Framework [.pdf]. Don't be put off by the focus on 'Cloud'; whilst this was the focus and reasoning behind the work at the start of the project, the information and processes Colin describes could easily be applied to any IT environment, and at first glance seem to be well worth a read.

Open Source Security Myths

Next up, David Anumudu gave a somewhat brave talk considering the audience, discussing and (potentially) debunking the assumption that open source software is more secure than its closed-source competitors. David picked on the now famous phrase from The Cathedral and the Bazaar: 'Given enough eyeballs, all bugs are shallow'. David argues that whilst this is true and reasonable, it only works in practice if all the eyeballs have both the incentive and the skills to effectively audit the code for bugs, something that is rarely discussed. A cited example of insecurities in prominent open source software was the MD6 hashing algorithm, introduced at Crypto 2008, which despite being designed and developed by a very clued-up team still had a critical flaw in its implementation.

My ultimate takeaway from this talk was that software's licensing model has no direct impact on the security and vulnerabilities of any codebase; only the development model and the developers themselves have any real impact.

SSL/TLS – Just when you thought it was safe to return

Arron Finnon (Finux) gave a great presentation on vulnerabilities and weaknesses in the implementation of SSL protection. Arron argues that most problems with SSL are actually related to the implementation rather than the methodology itself, and that despite the high profile of problems related to SSL, most techies still don't 'get' it; and most users, regardless of awareness training, will continue to blindly click through the certificate warning prompts.

Several of Moxie Marlinspike's tools were discussed, mainly SSLStrip and SSLSniff. I was aware of both tools but hadn't tried them out in my own lab yet; after Arron's discussion of the problem and their capabilities, this is definitely something I intend to rectify shortly. Especially when combined with other SSL issues, including the SSL renegotiation attack and the Null Prefix [.pdf] attack, issues with SSL can be deadly to an environment.
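Most of these attacks ultimately rely on a client (or user) that doesn't properly verify what it's talking to. For reference, here's a minimal sketch of a correctly verifying TLS client in modern Python; this is not one of Moxie's tools, just the baseline checking that the attacks subvert:

```python
import socket
import ssl

host = "example.com"

# create_default_context() enables certificate chain AND hostname verification;
# sslstrip-style attacks work precisely when one of these checks is skipped,
# or when the user clicks through the resulting warning.
ctx = ssl.create_default_context()

with socket.create_connection((host, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        print(tls.version(), tls.getpeercert()["subject"])
```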

My main takeaway from this talk was that SSL isn't as secure as some would claim, and that when planning to defend against these attack vectors we need to stop thinking 'what if' and start working towards 'what when'.

AppSensor – Self aware web app

Colin Watson came back to the front to discuss the work currently being undertaken by the AppSensor project. The idea behind the project is to create web applications that are 'self aware' to some extent, enabling any user making 'suspicious' web requests to be limited or disconnected before they can damage the target system. It works on the premise that the application can identify and react to a malicious user in fewer requests than the user needs to find and exploit a vulnerability.

The identification comes from watching a collection of red flags and tripwires built throughout the system, from simply looking for X failed log-in attempts to real-time trend analysis looking for an unusual increase in requests for particular functionality. A lot of the potential indicators and traps reminded me of an old post on the Application Security Street Fighter blog covering the use of honeytokens to identify malicious activity, which I've covered previously.
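As a toy illustration of the detection-point idea (my own sketch, not AppSensor's actual API; the threshold, window and lock-out action are arbitrary):

```python
import time
from collections import defaultdict

THRESHOLD = 5   # detection-point hits before we react
WINDOW = 300    # sliding window, in seconds

_hits: dict[str, list[float]] = defaultdict(list)

def record_detection(user: str) -> bool:
    """Record one suspicious event (failed login, odd parameter, honeytoken
    touched...) and return True once the user crosses the threshold."""
    now = time.time()
    _hits[user] = [t for t in _hits[user] if now - t < WINDOW]
    _hits[user].append(now)
    return len(_hits[user]) >= THRESHOLD

# e.g. in a login handler:
# if record_detection(username):
#     lock_account(username)  # react before the attacker finds a real hole
```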

Summary

Overall I really enjoyed the event. I'm hoping that the Leeds/Northern OWASP chapter decides to run more events in Newcastle, but if not, it's convinced me that the events are worth the time and cost of travelling down to the other locations. It's always good to discuss infosec topics face to face with some really knowledgeable people.

–Andrew Waite

ReportSpammers.net

I was recently pointed towards www.reportspammers.net, which is a good resource for all things spam related and is steadily increasing the quantity and quality of the information available. As much as I like the statistics that can be gathered from honeypot systems, live and real stats are even better, and the data utilised by Report Spammers is taken from the email clusters run by Email Cloud.

One of the first resources released was the global map showing active spam sources (static image below); it is updated hourly, and the fully interactive version can be found here.

[Image: 'Where are spammers' global map]

In addition to the global map, Report Spammers also lists the most recent spamvertised sites seen on its mail clusters. I'm undecided on the 'name and shame' methodology due to the risk of false positives, but if you're looking for examples of spamvertised sites it will prove a good resource (and one I intend to delve deeper into next time I'm bored). Just beware: sites that actively advertise via spam are rarely places you want to point your home browser at. You have been warned.

If you want a resource to explain spam and the business model behind it, Report Spammers could be a good starting point. It even has the ability to explain spam to non-infosec types that still think spam comes in tins. Keep this in mind next time you need to run another information security awareness campaign.

— Andrew Waite

Categories: InfoSec, Tool-Kit

NEBytes Launch Event

Last night (2010-01-20) I had the pleasure of attending the launch event for NEBytes.

"North East Bytes (NEBytes) is a User Group covering the North East and Cumbrian regions of the United Kingdom. We have technical meetings covering Development and IT Pro topics every month." — from the NEBytes About page

SharePoint 2010

The launch event was run in conjunction with the SharePoint User Group UK (SUGUK), so it was no surprise when the first topic of the night covered SharePoint 2010, delivered very enthusiastically by Steve Smith. I've got no experience with SharePoint so can't comment too much on the material, but from the architectural changes I got the impression that 2010 may be more secure than previous versions, as the back-end is becoming more segmented, with different parts of the whole having discrete, dedicated databases. Whilst it might not limit the threat of a vulnerability, it should reduce the exposure in the event of a single breach.

Steve also highlighted that there is some very granular accountability logging, in that every part of the application and every piece of data receives a unique 'Correlation ID'. The scenarios highlighted suggested that this allows for in-depth debugging to determine the exact nature of a crash or system failure; by the same token it should provide some good forensic pointers when investigating a potential compromise or breach.
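The general pattern is worth stealing even outside SharePoint. As a minimal sketch in Python (my own illustration of the idea, not SharePoint's mechanism): mint one ID per request and stamp it on every log line that request produces:

```python
import logging
import uuid

logging.basicConfig(format="%(correlation_id)s %(message)s", level=logging.INFO)
log = logging.getLogger(__name__)

def handle_request(payload: str) -> None:
    # One ID is minted per request and follows it through every component,
    # so a crash (or an investigation) can be traced end to end.
    cid = str(uuid.uuid4())
    rlog = logging.LoggerAdapter(log, {"correlation_id": cid})
    rlog.info("request received")
    # ... pass `cid` to downstream services / database calls ...
    rlog.info("request complete")

handle_request("example")
```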

Again viewing the material from a security standpoint, I was concerned that Steve's walkthrough defaulted to less secure options: NTLM authentication rather than Kerberos, and communication not encrypted with SSL. One of Steve's recommendations did concern me: to participate in the Customer Experience Improvement Program. Whilst I've got no evidence to support it, I'm always nervous about passing debugging and troubleshooting information to a third party; you never know what information might leak with it.

Silverlight

The second session of the night was Silverlight, covered by Mike Taulty (it's worth pointing out that this session came after a decent quantity of freely provided pizza and sandwiches). As with SharePoint I had no prior experience of Silverlight other than hearing various people complain about it via Twitter, so I found the talk really informative. For those that don't know, Silverlight is designed to be a cross-browser, cross-platform 'unified framework' (providing your browser/platform supports Silverlight…).

From a developer and designer perspective Silverlight must be great; the built-in functionality provides access to capabilities that I could only dream about when I was looking at becoming a dev in a past life. The integration between Visual Studio for coding and Blend for design was equally impressive.

Again I viewed the talk's content from a security perspective. Mike pressed the point that Silverlight runs within a tightly controlled sandbox to limit functionality and provide added security. For example, code can make HTTP[S] connections out from the browsing machine, but is limited to the same origin as the code, or to cross-domain targets that explicitly allow requests from that origin.

However, Silverlight applications can be installed locally in 'Trusted' mode, which relaxes the restrictions imposed by the sandbox. Before installing the app, the sandbox will inform the user that the app is to be 'trusted' and warn of the implications. This is great, as we all know users read these things before clicking Next when wanting to get to the promised videos of cute kitties… I did query this point with Mike after the presentation and he, rightly, pointed out that any application installed locally would have the ability to access all the resources that aren't protected when in trusted mode. I agree with Mike, but I'm concerned that the average Joe User will think 'OK, it's only a browser plugin' (not that this is the case anyway), where they might be more cautious if a website asked them to install a full-blown application. Users have been conditioned to install plugins to get the web experience they expect (Flash, etc.).

Hyper-V

The final talk was actually the one I was most interested in at the start of the night, and was presented by James O'Neil. In the end I was disappointed; unlike the other topics I didn't get too much that was new to me from the session, I'm guessing because virtualisation solutions are something I encounter on a regular basis. The only real take-away from the talk was that James gets my Urgh! award for using the phrase 'private cloud infrastructure' without cracking a smile at the same time.

Summary

The night was great, so a big thanks to the guys that set up and ran the event (with costs coming out of their own pockets too). The event was free, the topics and speakers were high quality and, to top it off, there were some fairly impressive giveaways as well, from the usual stickers and pens to boxed Win7 Ultimate packs.

If you’re a dev or IT professional, I’d definitely recommend getting down to the next event.

— Andrew Waite

Categories: Event

2009: A review

Well, the year is nearly over and it seems everyone is in a reflective mood, so I thought I'd join in. And I'm glad I did: I didn't realise just how turbulent a year I've had. I'd better (on pain of death) start with the non-technical, as it is around 12 months since I got engaged to my long-time girlfriend.

Back to the technical: the InfoSanity blog went live in February with the first post. Originally I was far from confident that I would be able to keep up blogging, as I had a 'fear' of social media and web 2.0, but nearly a year on I'm still here and, despite some peaks and troughs, posting articles regularly. I've found it a great platform for getting ideas out of my head and into practice; hopefully I've managed to be of benefit to others in the process.

Lab environment: February was also when I purchased the server for my virtual lab environment. This has got to be the best buy of the year, providing a solid framework for testing and experimenting with everything else I have done this year. Lab environments also seem to be one of the areas that gathers a lot of interest from others; the two posts discussing configuration of virtual networks and guest systems were InfoSanity's most popular posts this year by a good margin. In the process of improving my lab environment I also read Thomas Wilhelm's Professional Penetration Testing book and reviewed it for the Ethical Hacker Network, for which I'm indebted to Don for organising.

Wireless: included in my long list of purchases this year was an Alfa AWUS036H wireless card and a BU-353 GPS receiver. This resulted in a basic attempt to write a utility to create maps from the results of wardriving with Kismet; whilst the short development time of the project was enjoyable, it was promptly shelved once people introduced me to Jabra's excellent giskismet. It also resulted in the creation of the still-to-be-field-tested, James Bond-esque warwalking case.

Honeypots: whilst I had had a Nepenthes honeypot system running before the turn of the year, I hadn't really worked with it in earnest until the first post on the subject in February, and the subsequent statistics utilities. These posts also became the topic for my first experience of public speaking, for the local (and rapidly expanding) technical group SuperMondays. As the technology has improved, the honeypot system has recently been migrated over to Nepenthes' spiritual successor, Dionaea. Over the year I have also had the pleasure and privilege of talking with Markus Koetter (lead dev of Nepenthes and Dionaea) and Lukas Rist (lead dev of Glastopf); these guys *really* know their stuff.
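For a flavour of the kind of statistics utilities mentioned above, something like the snippet below is a reasonable starting point (the flat log format is hypothetical; real Nepenthes/Dionaea output differs, and Dionaea can log to SQLite):

```python
import re
from collections import Counter

# Hypothetical flat log: we only assume each attack line contains
# the attacker's IPv4 address somewhere on it.
IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

counts: Counter[str] = Counter()
with open("honeypot.log") as fh:
    for line in fh:
        match = IP_RE.search(line)
        if match:
            counts[match.group(0)] += 1

for ip, hits in counts.most_common(10):  # ten noisiest attackers
    print(f"{ip}\t{hits}")
```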

Public speaking: as mentioned above, I gave my first public talk for SuperMondays, discussing Nepenthes honeypots and the information that can be gathered from them. Unfortunately (or thankfully) there is only limited footage available for the session, as the camera's battery ran out of juice. My second session was for a group of Northumbria University's Forensics and Ethical Hacking students as an 'expert speaker', and I still think they mistook me for someone else. This time a recording was available thanks to a couple of the students; full review and audio available here. My public speaking is still far from perfect, coming out at a rapid-fire pace, but I'm over my initial dread and actually quite enjoy it. Hopefully there'll be additional opportunities in the future.

Friends and contacts: throughout the year I have ended up in contact with some excellent and interesting people; from real-world networking events like SuperMondays and CloudCamp, old-school discussions in forums (EH-Net) and IRC channels, to the '2.0' of Twitter (@infosanity btw). Along with good debates and discussions I'd also like to think I've made some good friendships; too many people to name (and most wouldn't want to be associated 😉 ) but you know who you are.

So that's the year in brief, with a couple of smaller activities along the way, from investigating newly released attack vectors to trying my hand at lock picking. In hindsight it has been one hell of a year, and with some of the side projects in the pipeline I'm expecting 2010 to be even better. Onwards and upwards.

— Andrew Waite

Categories: Honeypot, InfoSec, Lab, Presentation

SuperMondays – Barcamp style

2009/10/26

This month's SuperMondays was a deviation from the usual format; rather than a speaker followed by Q&A, the event was run in a similar format to a Barcamp. This meant that there were several simultaneous conversations ongoing at any one time, with attendees floating between discussions and chipping in as appropriate.


For my part, the first talk I attended was on cloud computing, which regular readers will know is something I've spent some time looking at recently. The general consensus was that cloud may be the future, but no one was willing to place their critical data in the cloud just yet.

Second up was a discussion on encryption. This discussion started slowly; whilst there were several people present, most had some interest in encryption and wanted to learn more from those more knowledgeable. The basic outcome: encryption is something you want to be doing for critical data.
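For anyone at that 'wanting to learn more' stage, the barrier to entry is lower than the discussion suggested. A minimal sketch using Python's cryptography library (my choice of example, not something shown at the event) encrypts data at rest in a handful of lines:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generate a key once and store it separately from the data it protects;
# anyone holding the key can decrypt everything.
key = Fernet.generate_key()
f = Fernet(key)

token = f.encrypt(b"critical data")  # authenticated encryption (AES + HMAC)
print(f.decrypt(token))              # b'critical data'
```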

The third and final discussion I got to was a comparison of open vs closed source development. In all honesty I was expecting an argument, with plenty of MS bashing all around. The discussion was remarkably calm and impartial, with a general consensus of 'both have their place, depending on circumstances'.

Some of the other talks included web development frameworks, a demo of Google Wave and a discussion of requirements for new start-ups.

Overall I think the event worked well, with some interesting discussions, but I do think I prefer the more traditional format. At least from the talks I attended, I don't think those new to a topic would have walked away with any usable information; likewise the 'knowledgeable' attendees likely didn't hear anything to change their opinions or beliefs.

There were some interesting announcements, including that which can't be discussed (hint: if you want the inside scoop, some stuff gets announced at SuperMondays events before being released into the public domain, shhh!).

  • SuperChristmas has now been organised in partnership with other local networking groups, December 17th for all those in need of additional festivities.
  • North East Blog Directory: as part of SuperMondays the group is compiling a list of local technical blogs.
  • SuperMondays Google Groups: The Google Groups section for SuperMondays is starting to pick up pace. If you want to keep up to date with the group, suggest a topic or generally discuss the event, sign up and join in.

That’s all for this month, as usual thanks for a good night and see you all at the next one.

Andrew Waite

Categories: SuperMondays

BCS Exit Survey

Sorry for the non-security-related rant. I recently received my renewal reminder from the BCS, and I've been increasingly disappointed with the 'advantages' of being a member. Whilst I don't like not being a member of a professional body for my craft, I simply cannot justify the cost any longer. I don't like being negative, but my response to a question on the exit survey says it all:

What, if anything, do you feel BCS could be doing to better serve its members?

Primarily: better regional events. Most (all?) events are located in London, making them infeasible for members in other regions of the country. When I joined as a member there were several good events, covering a wide range of topics, held by my local groups. My local branch (Newcastle) has not run a decent event in excess of 12 months and currently does not have ANY events organised for the future (using newcastle.bcs.org as a source and point of contact).

Alternative groups in the area (SuperMondays, CloudCamp NE, among others) are free of charge and provide significantly better events, networking opportunities and information than the BCS alternatives. Taking geographical location out of the equation, the quality of discussion on the BCS's online forums is limited, infrequent and in most cases superficial. It seems most members do not view the forums as a good source of information or discussion.

The last event I attended finished with a presentation and Q&A session by Rachael Burnett, at the time president of the BCS. For the head of the organisation, Rachael appeared out of touch with the real-world industry, a situation that I've seen mirrored in the organisation as a whole in my experience.

When I was starting my career, the information provided by the newsletters, email announcements, etc. from the BCS was valuable. Lately, however, the articles have been dated, with me already having received the information from another source, in some cases weeks before the BCS version. As a result the BCS emails now receive little more than a cursory glance before being deleted.

I'm aware that there is work in progress to provide a local branch of the YPG in my region. Whilst I sincerely hope this is successful, I do not have high hopes for its success, and after several years of paying membership without seeing any real benefit, this move is too little too late for me.

There is a hugely active and skilled computing profession in the North East of England, but the BCS seems to completely ignore the region and fails (from my experience) to provide any benefit to the region or the region's members; either that, or the BCS is equally out of touch and poorly serving the UK's IT community as a whole.

Andrew Waite

Categories: Uncategorized