Monthly Archives: August 2011
Exploring the internal cloud
Using cloud-based infrastructure services doesn’t always mean sending your precious data to a third-party provider. While services such as Amazon EC2 are great for SMBs or enterprises with occasional workload spikes, sometimes organizations want the benefits of a cloud with more control, flexibility and security assurances than a third party can provide.
That’s why some companies looking to adopt a shared-services operating model are choosing to build their own internal clouds. Here we cover the differences between internal and public clouds, what an internal cloud can do for your organization, and what you should do to prepare for building your own.
What’s the difference?
When most people think of cloud computing, they think of the most visible examples, such as Amazon EC2, Google App Engine and Microsoft Azure. Those services are public clouds, which means they operate outside of customers’ firewalls. The resources that power the clouds are owned by the companies that operate them, not by the customers.
Perhaps the biggest feature of public clouds is that they give customers a way to avoid purchasing and managing certain hardware and software. That’s why they’re so attractive to organizations that don’t have the budget or internal resources to make this capital expenditure.
An internal cloud, on the other hand, is owned by the company and operates behind the company firewall. An internal cloud doesn’t let companies ditch hardware altogether. Instead, it’s all about creating dynamically available resources based on a highly virtualized, tightly integrated, converged infrastructure.
What’s the better model?
Both internal and public clouds can be excellent choices, depending on a company’s unique requirements. But internal clouds are becoming popular because they offer a degree of flexibility, compliance, security, transparency and control that public clouds typically don’t.
Public cloud: easy startup, less control
To understand the benefits of internal clouds, let’s first look at the advantages and disadvantages of public clouds. On the plus side, public clouds are convenient and easy to use. The infrastructure is already set up, so you don’t have to worry about how to build it. You simply go to the provider’s web site, order a service, and pay for only what you use.
On the other hand, public clouds don’t afford the control that customers have within their own firewalls. That means the provider may not use the exact security, privacy, and compliance mechanisms your business requires. Although you can choose from a menu of services and service-level agreements (SLAs), your pricing options are limited. The biggest worry, however, is that when something goes wrong, the fix is out of your hands.
Internal cloud: more control, more service delivery options
An internal cloud is an attractive alternative because it offers many benefits of public clouds without many of the drawbacks. Your organization can stick to its tried-and-true security mechanisms. You can maintain specific compliance procedures that might be required for your industry. You can keep the benefits of the IT infrastructure you’ve already built. And you can fine-tune every part of your internal cloud without feeling beholden to a third party.
Another benefit of internal clouds: Companies can create new revenue streams by offering the internal cloud to external companies.
One drawback is that, depending on an IT organization’s maturity and existing infrastructure, building an internal cloud may call for additional capital and skills. But for many companies, the ability to serve multiple business units (BUs) and fulfill multiple SLAs quickly and consistently provides plenty of justification for the expense.
How do we prepare?
Building an internal cloud is a bigger endeavor than signing up for an external cloud service. But for organizations that require a high degree of security, control, flexibility and cross-BU standardization, it’s a smarter option.
As with any major IT initiative, the first step is careful planning. Experts recommend that organizations planning an internal cloud follow these six steps before implementation:
- Rally stakeholders around a common vision.
- Create a common understanding of concepts and trends.
- Investigate best practices.
- Mobilize customer teams for fast decision-making.
- Identify your strategic cloud-related initiatives.
- Lay out the next steps in a project roadmap.
Get a jumpstart with TFP — call us now for a free consultation! Tel: 603-8060 0088
Article extracted from: http://www.informationweek.com/news/storage/systems/229402739
By Chandler Harris, InformationWeek
An early access program will allow select enterprises to use the HP P6000 Enterprise EVA before the official unveiling of the upgraded product.
Hewlett-Packard on Tuesday confirmed its next-generation Enterprise Virtual Array (EVA) is coming in June. In the meantime, HP is offering an early access program that allows select enterprises to use the HP P6000 Enterprise EVA before the official unveiling of the upgraded product.
A “couple hundred” customers, most of them existing EVA users, will get early access to HP’s mainstream enterprise block-access storage array, said Tom Joyce, vice president of marketing for HP Storage, in an interview.
“The reason we decided to do it this way is because the EVA is one of the most successful storage products in history, with up to 100,000 out there,” Joyce said. “We have a very significant customer base and a lot of folks are looking forward to the new product.”
The EVA is part of HP’s Converged Infrastructure portfolio that includes virtualization, unification, and management of storage, networking, and compute resources. The HP EVA is typically deployed for core enterprise applications from messaging to enterprise resource planning.
The HP P6000 EVA is the company’s fifth generation EVA and will feature increased capacity and performance improvements. The HP P6000 EVA will include a 6 Gbps Serial Attached SCSI (SAS) back-end, support for standard SAS hard disk drives, and 8 Gbps Fibre Channel connectivity. It will have new software features for increased functionality, but HP isn’t disclosing details yet.
The EVA is HP’s mainstream enterprise block-access storage array, with an estimated 100,000 deployed. After HP acquired 3PAR last year, it said there would be two more generations of EVA to fit into HP’s Converged Infrastructure product portfolio. The upcoming EVA upgrade is the first of these.
While the two offerings overlap, Joyce said they are far enough apart to meet different enterprise needs. He offered a car analogy: the EVA is the Ford F-150 truck, while the 3PAR product is the Ferrari.
“Many customers decide to go with the EVA for its functionality, simplicity, and economic value for general storage use cases,” Joyce said. “If an organization is primarily services oriented, they can go with 3PAR.”
The EVA upgrade is part of HP’s ongoing Converged Infrastructure strategy. HP’s acquisition of storage provider 3PAR last year strengthened its position against archrival Cisco, which has been strong in the converged data center and cloud computing markets. HP has said its converged infrastructure strategy centers on its storage, server, and networking portfolio, and 3PAR helped beef up the storage side of that operation.