Nov 25 2009

ENISA Cloud Computing Risk Assessment

Category: Cloud computing | DISC @ 4:22 pm


Cloud Security and Privacy: An Enterprise Perspective on Risks and Compliance

The ENISA (European Network and Information Security Agency) released the Cloud Computing Risk Assessment document.

The document does well by including a focus on SMEs (Small and Medium sized Enterprises) because, as the report says, “Given the reduced cost and flexibility it brings, a migration to cloud computing is compelling for many SMEs”.

Three initial standout items for me are:

1. The document’s stated Risk Number One is Lock-In. “This makes it extremely difficult for a customer to migrate from one provider to another, or to migrate data and services to or from an in-house IT environment. Furthermore, cloud providers may have an incentive to prevent (directly or indirectly) the portability of their customers services and data.”

Remember that the document identified SMEs as a major market for cloud computing. What can they do about the lock-in? Let’s see what the document says:

The document identifies SaaS lock-in:

Customer data is typically stored in a custom database schema designed by the SaaS provider. Most SaaS providers offer API calls to read (and thereby ‘export’) data records. However, if the provider does not offer a readymade data ‘export’ routine, the customer will need to develop a program to extract their data and write it to file ready for import to another provider. It should be noted that there are few formal agreements on the structure of business records (e.g., a customer record at one SaaS provider may have different fields than at another provider), although there are common underlying file formats for the export and import of data, e.g., XML. The new provider can normally help with this work at a negotiated cost. However, if the data is to be brought back in-house, the customer will need to write import routines that take care of any required data mapping unless the CP offers such a routine. As customers will evaluate this aspect before making important migration decisions, it is in the long-term business interest of CPs to make data portability as easy, complete and cost-effective as possible.

And what about PaaS lock-in?

PaaS lock-in occurs at both the API layer (ie, platform specific API calls) and at the component level. For example, the PaaS provider may offer a highly efficient back-end data store. Not only must the customer develop code using the custom APIs offered by the provider, but they must also code data access routines in a way that is compatible with the back-end data store. This code will not necessarily be portable across PaaS providers, even if a seemingly compatible API is offered, as the data access model may be different (e.g., relational v hashing).

In each case, the ENISA document says that the customer must develop code to get around the lock-in, in order to bridge APIs and to bridge data formats. However, SMEs generally do not have developers on staff to write this code. “Writing code” is not usually an option for an SME. I know: I worked for an EDI service provider that serviced SMEs in Europe, and we would provide the code development services for the SMEs when they needed data transformation done at the client side.
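To make concrete what “writing code” means here, the sketch below (hypothetical table and key names, not any particular PaaS provider's API) shows how the same “look up a customer” operation differs between a relational back end and a key/value (hash-based) store, which is exactly the data-access gap the ENISA text describes:

```python
import sqlite3

# --- Relational back end: the data-access routine is SQL against a schema. ---
def get_customer_relational(conn, customer_id):
    row = conn.execute(
        "SELECT id, name, email FROM customers WHERE id = ?", (customer_id,)
    ).fetchone()
    return {"id": row[0], "name": row[1], "email": row[2]} if row else {}

# --- Key/value back end (stand-in for a provider's hash-based store). ---
# The data-access routine is a key lookup; the "schema" lives in application code.
kv_store = {}

def get_customer_kv(customer_id):
    return kv_store.get("customer:" + customer_id, {})

# Tiny demo: the same logical query, two incompatible data-access routines.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id TEXT, name TEXT, email TEXT)")
conn.execute("INSERT INTO customers VALUES ('42', 'Acme Ltd', 'info@acme.example')")
kv_store["customer:42"] = {"id": "42", "name": "Acme Ltd", "email": "info@acme.example"}

print(get_customer_relational(conn, "42"))
print(get_customer_kv("42"))
```

Moving between providers means rewriting routines like these, not just changing a connection string.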

But there is another answer. This bridging is the job of a Cloud Service Broker. The Cloud Service Broker addresses the cloud lock-in problem head-on by bridging APIs and bridging data formats (which, as the ENISA document mentions, are often XML). It is unreasonable to expect an SME to write custom code to bridge together cloud APIs when an off-the-shelf Cloud Service Broker can do the job for them with no coding involved, while providing value-added services such as monitoring the cloud provider’s availability, encrypting data before it goes up to the cloud provider, and scanning data for privacy leaks. Read the Cloud Service Broker White Paper here.
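As a rough illustration of the bridging itself (the field names and XML layouts below are invented for the example, not any real provider's export format), mapping an exported customer record from one SaaS schema onto another can be as simple as a field-by-field translation:

```python
import xml.etree.ElementTree as ET

# Hypothetical export from provider A; provider B expects different field names.
EXPORTED = ("<customer><cust_id>42</cust_id><full_name>Acme Ltd</full_name>"
            "<mail>info@acme.example</mail></customer>")

# Mapping between the two (made-up) record schemas.
FIELD_MAP = {"cust_id": "id", "full_name": "name", "mail": "email"}

def convert(record_xml):
    src = ET.fromstring(record_xml)
    dst = ET.Element("Customer")          # provider B's record element
    for a_field, b_field in FIELD_MAP.items():
        ET.SubElement(dst, b_field).text = src.findtext(a_field, default="")
    return ET.tostring(dst, encoding="unicode")

print(convert(EXPORTED))
# <Customer><id>42</id><name>Acme Ltd</name><email>info@acme.example</email></Customer>
```

A Cloud Service Broker packages this kind of mapping, along with the monitoring, encryption and scanning mentioned above, so the SME never has to write or maintain it.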

2. “Customers should not be tempted to use custom implementations of authentication, authorisation and accounting (AAA) as these can become weak if not properly implemented.”

Yes! I totally agree. There is already a tendency to look at Amazon’s HMAC-signature-over-QueryString authentication scheme and implement something similar but not exactly like it. For example, an organization may decide, “Let’s do like Amazon does and make sure all incoming REST requests to our PaaS service are signed by a trusted client using HMAC authentication”, but omit to include any timestamp in the signed data. I can certainly imagine this, because it happened all the time in the SOA / Web Services world (an organization would decide, “Let’s make sure requests are signed using XML Signature by trusted clients”, but leave the system open to a simple capture-replay attack). Cloud PaaS providers should not make these same mistakes.
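To illustrate the point (a minimal sketch of the general HMAC-over-query-string pattern, not Amazon’s actual signing scheme; the parameter names and shared secret are invented), the timestamp must be part of the signed data and the server must check it, otherwise a captured request can be replayed indefinitely:

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

SECRET = b"shared-secret"      # hypothetical shared key, provisioned out of band
MAX_AGE_SECONDS = 300          # reject anything older than five minutes

def sign_request(params):
    params = dict(params, Timestamp=str(int(time.time())))   # timestamp is signed too
    canonical = urlencode(sorted(params.items()))             # canonical parameter order
    sig = hmac.new(SECRET, canonical.encode(), hashlib.sha256).hexdigest()
    return dict(params, Signature=sig)

def verify_request(params):
    supplied = params.get("Signature", "")
    unsigned = {k: v for k, v in params.items() if k != "Signature"}
    canonical = urlencode(sorted(unsigned.items()))
    expected = hmac.new(SECRET, canonical.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(supplied, expected):
        return False                                          # tampered or wrong key
    age = time.time() - int(unsigned.get("Timestamp", "0"))
    return 0 <= age <= MAX_AGE_SECONDS                        # stale requests are refused

signed = sign_request({"Action": "ListWidgets", "ClientId": "sme-42"})
print(verify_request(signed))   # True now; the same request replayed later is rejected
```

Production schemes typically also sign the HTTP method and path and include a nonce, but the core point stands: leave the timestamp out of the signature and replay protection disappears.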

3. STRIDE and DREAD
Lastly, the document’s approach of examining the system in terms of data-at-rest and data-in-motion, identifying risks at each point (such as information disclosure, eavesdropping, or Denial-of-Service), then applying a probability and impact to each risk, is very reminiscent of the “STRIDE and DREAD” model. However, I do not see the STRIDE and DREAD model mentioned anywhere in the document. I know it’s a bit long in the tooth now, and has been finessed a bit since the initial book, but it’s still a good approach. It would have been worth mentioning here, since it’s clearly an inspiration.
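For readers who have not met it, DREAD rates each threat on Damage, Reproducibility, Exploitability, Affected users and Discoverability and averages the result to rank risks; a minimal sketch with invented ratings:

```python
# DREAD rates each threat 1-10 on five factors and averages them into one score.
FACTORS = ("damage", "reproducibility", "exploitability",
           "affected_users", "discoverability")

def dread_score(ratings):
    return sum(ratings[f] for f in FACTORS) / len(FACTORS)

# Illustrative ratings for an "eavesdropping on data-in-motion" threat.
threat = {"damage": 7, "reproducibility": 8, "exploitability": 6,
          "affected_users": 8, "discoverability": 5}
print(dread_score(threat))   # 6.8 -> used to rank this risk against the others
```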

Read the source entry…


Tags: Application programming interface, Business, Cloud computing, Platform as a service, Service-oriented architecture, Small and medium enterprises, Software as a service, Web service


Oct 16 2009

Web Services and Security

Category: Cloud computing, Information Security | DISC @ 4:01 pm

Cloud Security and Privacy

Because of the financial incentive, malicious software threats are real, and attackers are using the web to gain access to corporate data. Targeted malicious software is used to steal intellectual property and other confidential data, which is then sold on the black market for financial gain. With the use of social media in the corporate arena, organizations need a web services use policy to ensure employees use the internet for business and comply with company web use policies. Having an effective web use policy makes business sense, and implementing it efficiently is not only due diligence but also assists with compliance. After implementation, the key to the success of a web use policy is to monitor its effectiveness on a regular basis.


Hosted web security services operate at the internet level, intercepting viruses, spyware and other threats before they get anywhere near your network. These days, once malicious software has infected your gateway node, the attacker is home free and it is basically game over. The way to fight this is to use hosted web security services, which are transparent to users and stop malware before it reaches the corporate network.

Things to look for in hosted web security services are protection, content control, email security, recovery and multilayer protection:

Protect your corporation with anti-virus, anti-spam, and anti-spyware
Content control of images, URL filtering and enterprise instant messaging; all web requests are checked against the policy
Secure email with encryption
Archive email for recovery
Multilayer protection against known and unknown threats including mobile user protection

Web Security Anti-Virus, Anti-Spyware – stops web-borne spyware and viruses before they infiltrate your network, protecting your business from information theft and costly diminished network performance.

Web Filtering – enables you to block access to unwanted websites by URL, allowing you to control Internet use and enforce acceptable Internet usage policies
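As a rough sketch of what URL filtering boils down to (the category data and policy below are invented for illustration; real services rely on large, continuously updated categorization feeds), each web request is checked against the acceptable-use policy before it is allowed through:

```python
from urllib.parse import urlparse

# Hypothetical category data; real services use large, continuously updated feeds.
URL_CATEGORIES = {
    "intranet.example.com": "business",
    "news.example.net": "news",
    "casino.example.org": "gambling",
}
BLOCKED_CATEGORIES = {"gambling", "malware", "adult"}   # from the acceptable-use policy

def allow_request(url):
    host = urlparse(url).hostname or ""
    category = URL_CATEGORIES.get(host, "uncategorised")
    return category not in BLOCKED_CATEGORIES

print(allow_request("https://intranet.example.com/hr"))   # True  (allowed)
print(allow_request("http://casino.example.org/slots"))   # False (blocked)
```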


Download a free guide for the following hosted solutions

Hosted email solution
Hosted email archiving
Hosted web monitoring
Hosted online backup

Tags: archive email, boundary encryption, content control, email archiving, email solution, image control, Malicious Software, Malware, multilayer protection, online backup, Spyware, url filtering, web filtering, web monitoring, web security


Sep 10 2009

Way beyond the edge and de-perimeterization

Category: Cloud computing, Information Security | DISC @ 2:59 pm

Image: how a firewall works (by pittigliani2005, via Flickr)

The term de-perimeterization has been around for almost a decade, and the industry is finally taking it seriously because of the popularity of virtualization and cloud computing. Is it time for businesses to embrace de-perimeterization?

De-perimeterization is a double-edged sword for the industry: it creates scalable options for operations and huge challenges for safeguarding assets beyond the edge. One of the major advantages of de-perimeterization is that users can access corporate information over the internet from anywhere, which makes it hard to draw the line where the edge begins and where it ends; all you basically need is a functional laptop with an internet connection. On the other hand, de-perimeterization poses a great challenge because of the possibility of viruses, spyware and worms spreading into your internally protected infrastructure.

In a de-perimeterized environment, security attributes shall follow the data, wherever the data may go or reside.

A security architecture in which the firewall was considered a very effective perimeter defense has been weakened by virtualization and cloud computing. In the early days of firewall defense, an organization only needed to open a few necessary protocols and ports to do business: internet-accessible systems were located on the DMZ, and communication was initiated from the corporate network out to the internet. Now there is a whole slew of protocols and ports that need to be open to communicate with applications in the cloud. As corporate applications move out of the organization’s network into the cloud, the effectiveness of the firewall diminishes.

Defense in depth is required for additional protection of data because, as new threats emerge, the firewall cannot be the only layer of security. The key to securing a de-perimeterized environment is to push security into each layer of the infrastructure, including the application and the data. Data is protected at every layer to ensure confidentiality, integrity and availability (CIA). Various techniques can be used to safeguard data, including data-level authentication. The idea of data-level authentication is that data is encrypted with specific privileges; when the data moves, those privileges move with it.
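A minimal sketch of that idea, using the third-party `cryptography` package and a made-up policy format (a real implementation would use proper key management and bind the policy to the ciphertext more rigorously): the access privileges are sealed inside the same encrypted envelope as the data, so they travel wherever the data goes.

```python
import json
from cryptography.fernet import Fernet   # third-party 'cryptography' package

key = Fernet.generate_key()               # in practice, held by a key-management service
box = Fernet(key)

def wrap(data, allowed_roles):
    """Encrypt the data together with the privileges that govern it."""
    envelope = {"policy": {"allowed_roles": allowed_roles}, "payload": data}
    return box.encrypt(json.dumps(envelope).encode())

def unwrap(token, role):
    envelope = json.loads(box.decrypt(token).decode())
    if role not in envelope["policy"]["allowed_roles"]:
        raise PermissionError("role not permitted by the data's own policy")
    return envelope["payload"]

blob = wrap("Q3 payroll figures", allowed_roles=["finance"])
print(unwrap(blob, role="finance"))       # permitted
# unwrap(blob, role="marketing")          # raises PermissionError
```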


Endpoint security is relevant in today’s business environment, especially for laptops and mobile devices. Agents on laptops and mobile devices use pull/push techniques to enforce the relevant security policies. Different policies are applied depending on the location of the device: the security policy determines which resources are available and what data needs to be encrypted at that location.
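A minimal sketch of the location-dependent lookup such an agent might perform (the locations and policy values are invented for illustration):

```python
# Hypothetical policy table: what the agent enforces depends on the device's location.
POLICIES = {
    "corporate_lan": {"encrypt_local_data": False,
                      "allowed_resources": ["file_shares", "erp", "email"]},
    "home":          {"encrypt_local_data": True,
                      "allowed_resources": ["email", "vpn"]},
    "public_wifi":   {"encrypt_local_data": True,
                      "allowed_resources": ["vpn"]},
}

def policy_for(location):
    # Unknown locations fall back to the most restrictive policy.
    return POLICIES.get(location, POLICIES["public_wifi"])

print(policy_for("home"))         # at home: encrypt local data, allow email and VPN
print(policy_for("coffee_shop"))  # unknown network: strictest policy applies
```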

When corporate applications and important data reside in the cloud, the SLA should be written to protect the availability of the application and the confidentiality of the data. Organizations should do their own business continuity planning so they are not totally dependent on the cloud service provider; for example, back up your important data or use a remote backup service where all stored data is encrypted.


Cloud Security and Privacy: An Enterprise Perspective on Risks and Compliance


Download a free guide for the following cloud computing applications

Hosted email solution
Hosted email archiving
Hosted web monitoring
Hosted online backup



Tags: business continuity, Cloud computing, cloud computing article, cloud computing concerns, cloud computing email, cloud computing hosting, cloud computing information, cloud computing security, cloud computing services, cloud security, cloud services, de-perimeterizations, DMZ, iso assessment


Jul 07 2009

Cloud Computing Pros and Cons

Category: Cloud computing | DISC @ 6:19 pm

Cloud Application Architectures: Building Applications and Infrastructure in the Cloud

Cloud computing is the future of computing: it provides common business applications online that run from a web browser and is built on virtual servers located on the internet. The basic idea behind cloud computing is the accessibility of applications and data from any location, as long as you are connected to the internet. Cloud computing makes the laptop the most essential tool for getting the job done.

For example, hosted email (SaaS) security provides safeguards at the internet level, eliminating spam and malware before they reach your internal network infrastructure. Hosted email provides centralized security with built-in redundancy, failover and business continuity, while easing network and security administration. It is about time to expand the corporate perimeter beyond the firewall, and one of the major benefits of cloud computing is giving organizations the capability to implement security controls at the internet level and eliminate threats before they reach the internal network.

An online backup service is another example of software as a service (SaaS) which provides users with an online system for backing up and storing computer files.

Cloud computing incorporates several different types of computing, including:
 software as a service (SaaS)
 platform as a service (PaaS)
 infrastructure as a service (IaaS)

It is a range of technologies that have come together to deliver scalable, tailored and virtualized IT resources and applications over the Internet.

Cloud computing has several benefits and potential risks, which you may want to know about before signing a contract with a cloud vendor.



Cloud Computing benefits

  • Users can avoid capital expenditure on hardware, software, and other peripheral services, when they pay a provider only for the utilities they use;

  • Consumption is billed as a utility or subscription with little or no upfront cost;

  • Immediate access to a broad range of applications that might otherwise be out of reach, due to:

      • Lower barriers to entry;

      • Shared infrastructure, and therefore lower costs;

      • Lower management overhead.

  • Users will have the option to terminate a contract at any time, avoiding return on investment risk and uncertainty.

  • Greater flexibility and availability of ‘shared’ information, enabling collaboration from anywhere in the world – with an internet connection.


Cloud computing associated risks

  • Cloud computing does not allow users to physically possess the storage of their data, which leaves responsibility for data storage and control in the hands of the provider;

  • Cloud Computing could limit the freedom of users and make them dependent on the cloud computing provider;

  • Privileged user access – how do you control who has access to what information?

  • Security of sensitive and personal information lies with the vendor. How do you explain this to your customers when their data is compromised without sounding like you’re ‘passing the buck’?

  • From a business continuity stand point, can you rely on each vendor to have adequate resilience arrangements in place?

  • Long-term viability — ask what will happen to data if the company goes out of business; how will data be returned and in what format?



The complexities of cloud computing will introduce new risks, and complexity is the enemy of security. Organizations and end users should be mindful of this security principle before introducing this new variable into their risk equation. As a consumer you need to watch out and research your potential risks before buying this service, and consider getting a comprehensive security assessment from a neutral third party before committing to a cloud vendor.

Recommended books on cloud computing


Tags: Cloud computing, cloud computing article, cloud computing benefits, cloud computing concerns, cloud computing email, cloud computing hosting, cloud computing information, cloud computing network, cloud computing platform, cloud computing risks, cloud computing security, cloud computing services, cloud computing solutions, cloud security, cloud services, Infrastructure as a service, Platform as a service


Jun 04 2009

Virtualization and compliance

Category: Cloud computing, Virtualization | DISC @ 1:04 am


The core technology behind cloud computing is virtualization. Organizations that do not want to jump into cloud computing because of its inherent risks can take a shot at virtualization in their own data centers. Virtualization can be used to reduce hardware and utility costs. An organization with 100 servers can consolidate them onto 10 physical machines, each supporting 10 virtual systems, which reduces not only the size of the data center but also hardware costs, along with huge utility bill savings.

Virtualization was first used to increase efficiency and cost savings, and it is now turning into a centralized management initiative for many organizations. With centralized management, patches, virus and spam filter updates, and new policies can be pushed to endpoints from a central management console. Policies can be used to impose lockout periods, filter USB devices and initiate backup routines, and they can take effect immediately or the next time the user checks in with the server.

The way virtualization works is that the OS sits on a hypervisor (often open source) that provides hardware abstraction, so device drivers become largely irrelevant to the guest. With the OS image backed up at the management console, virtualization allows seamless failover and high availability for desktops and servers.

As I mentioned earlier, virtualization allows policies to be enforced on endpoints (desktops). As we know, compliance drives the security agenda. If these policies are granular enough to be mapped to existing regulations and standards (SOX, PCI and HIPAA), then a virtualization solution can be used to implement compliance controls on endpoints. It is quite all right if the mapping is not 100%; that is where compensating controls come into play. Compliance with these various regulations and standards is not a one-time process. As a matter of fact, standards and regulations change over time due to different threats and requirements. True security requires nonstop assessment, remediation and policy changes as needed.
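A minimal sketch of that mapping idea (the policy names and control descriptions below are illustrative only, not an authoritative PCI or HIPAA mapping): check which required controls the pushed endpoint policies already cover and flag the rest as candidates for compensating controls.

```python
# Illustrative endpoint policies that the central console can push.
ENFORCED_POLICIES = {"screen_lock_15min", "usb_block", "nightly_backup"}

# Illustrative (not authoritative) mapping of required controls to policies.
CONTROL_MAP = {
    "PCI: idle session timeout":          "screen_lock_15min",
    "PCI: removable media policy":        "usb_block",
    "HIPAA: data backup plan":            "nightly_backup",
    "PCI: render stored data unreadable": "disk_encryption",
}

covered = {control for control, policy in CONTROL_MAP.items()
           if policy in ENFORCED_POLICIES}
gaps = set(CONTROL_MAP) - covered    # candidates for compensating controls

print("covered:", sorted(covered))
print("needs a compensating control:", sorted(gaps))
```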


Tags: Cloud computing, Data center, Health Insurance Portability and Accountability Act, hipaa, Hypervisor, Open source, PCI, Security, sox, Virtualization


Apr 02 2009

Cloud computing and security

Category: Cloud computing | DISC @ 5:55 pm
Image: cloud computing architecture (https://commons.wikimedia.org/wiki/File:Cloud_comp_architettura.png)

Cloud computing provides common business applications online that run from a web browser and is built on virtual servers located on the internet. The main security and privacy concern for users is who has access to their data at the various cloud computing locations, and what will happen if their data is exposed to an unauthorized user. Perhaps the bigger question is: can end users trust the service provider with their confidential and private data?

“Customers must demand transparency, avoiding vendors that refuse to provide detailed information on security programs. Ask questions related to the qualifications of policy makers, architects, coders and operators; risk-control processes and technical mechanisms; and the level of testing that’s been done to verify that service and control processes are functioning as intended, and that vendors can identify unanticipated vulnerabilities.”

Three categories of cloud computing technologies:

• Infrastructure as a Service (IaaS)
• Platform as a Service (PaaS)
• Software as a Service (SaaS)

Cloud computing offers lots of new services, which increase the exposure and add new risk factors. Much depends on application vulnerabilities, which can end up exposing data, and on the cloud service provider having transparent policies spelling out responsibilities, which will increase end user trust. Cloud computing will eventually be used by criminals to achieve their objectives. Transparent policies will help to sort out legal compliance issues and to decide whether the responsibility for a security breach lies on the end user’s or the service provider’s shoulders.

The complexities of cloud computing will introduce new risks, and complexity is the enemy of security. Organizations and end users should be mindful of this security principle before introducing this new variable into their risk equation. As a consumer you need to watch out and research your potential risks before buying this service, and consider getting a comprehensive security assessment from a neutral third party before committing to a cloud vendor.

Possible risks involved in cloud computing

• Complete data segregation
• Complete mediation
• Separation of duties
• Regulatory compliance (SOX, HIPAA, NIST, PCI)
• User Access
• Physical Location of data
• Availability of data
• Recovery of data
• Investigative & forensic support
• Viability and longevity of the provider
• Economy of mechanism

Continue reading “Cloud computing and security”

Tags: Cloud computing, cloudcomputing, compliance, Computer security, iaas, IBM, Information Privacy, Infrastructure as a service, paas, Platform as a service, Policy, privacy, saas, Security, security assessment, Security Breach, Services

