XIOLOGIX DELIVERS VIRTUALIZED INFRASTRUCTURE SOLUTIONS
Virtualization enables Cloud computing because it makes it easier to move server workloads around. So whether you’re looking to virtualize your server infrastructure in your own data center, move some – or all – of your workloads into the Cloud, or just need some help in developing or refining your Cloud strategy, Xiologix can help you plan, implement, and support your virtualized infrastructure.
The VMware Cross-Cloud Architecture
See how VMware’s Cross-Cloud Architecture helps you avoid cloud silos, giving you both freedom and control in IT infrastructure.
VMware Cloud on AWS Overview
Currently in Technology Preview, VMware Cloud on AWS is a vSphere-based cloud service. The new service will bring our enterprise-class Software-Defined Data Center (SDDC) software to the AWS cloud. Customers will be able to run any application across vSphere-based private, public and hybrid cloud environments. It will be delivered, sold and supported by VMware as an on-demand, elastically scalable service, and customers will be able to leverage the global footprint and breadth of services from AWS. The service will integrate the capabilities of our flagship compute, storage and network virtualization products (vSphere, Virtual SAN and NSX) along with vCenter management, and optimize them to run on next-generation elastic, bare-metal AWS infrastructure. This will enable customers to rapidly deploy secure, enterprise-grade AWS cloud-based resources that are operationally consistent with vSphere-based clouds. The result is a comprehensive turnkey service that works seamlessly with both on-premises private clouds and advanced AWS services.
1 + 1 = 3 : VMware + Veeam = Better Together
Ratmir Timashev, Veeam’s Chief Executive Officer, and Carl Eschenbach, VMware’s President and COO, discuss the partnership of their two companies and the strength of their services together on stage at VeeamON 2015. Learn more about VMware and Veeam at: https://www.veeam.com/vmware-vsphere-solutions.html
Runecast – What it is
Runecast – Software-defined Expertise for VMware vSphere-based infrastructures
VMware Network Virtualization: The Story So Far with Bruce Davie
Bruce Davie, CTO of Networking, introduces VMware NSX network virtualization and discusses the use cases and success stories so far. Recorded at Tech Field Day Extra at Interop 2016 on May 4, 2016. For more information, please visit http://VMware.com/products/NSX/ or http://TechFieldDay.com/event/eilv16/
Brocade VNF Manager can prevent virtual network services sprawl
Brocade announced today the availability of its virtual network function (VNF) Manager. The product is a commercial version of OpenStack Tacker, an OpenStack-led project designed to make it easier to deploy and operate virtual network services. The initiative is compatible with the European Telecommunications Standards Institute (ETSI) Network Functions Virtualization (NFV) Framework.

For those not familiar with NFV, the technology allows organizations to run network services as virtual functions instead of requiring a single appliance per function. NFV has many cost benefits, as it reduces the overall hardware that needs to be purchased and managed. More important, it gives network services the same level of agility as virtual servers and storage. Infrastructure agility is a core requirement of becoming a digital company, and NFV enables that at the network level. (ZK Research, 2015)

Brocade’s announcement of VNF Manager may not seem like an overly sexy announcement, but it is an important one for both service providers and enterprises. Historically, the topic of NFV has been primarily linked to network operators for use cases such as service chaining and on-demand service creation, but NFV is now something enterprises are looking at.

In 2015, ZK Research conducted a survey asking businesses where they are in terms of enterprise NFV. Sixty-one percent were somewhere between the researching phase and deployment. The other 39 percent said they had no plans. But the industry is early in the technology cycle, so I certainly expect to see more organizations embrace NFV as it matures. (ZK Research, 2015)

As NFV plans move from the testing phase into large-scale deployment, a product like Brocade’s VNF Manager becomes extremely important for the long-term manageability of the network services.

To understand the potential problem, think back to the early days of server virtualization. Initially, the technology was used to consolidate servers. Instead of having 10 workloads on 10 servers, each using 5 percent of the overall capacity, run them all on one server and push the utilization of the single server to 50 percent.

However, over time, server virtualization became increasingly popular for use cases other than consolidation. Application developers, Q/A departments and other groups started using virtualization as a faster way to deploy a server instead of having to physically procure hardware. For the infrastructure team, the self-service model seemed ideal because the groups that needed servers could just provision their own without having to order, deploy and connect a physical box.

Over time, though, the expansion of virtual servers caused an unforeseen problem. So many virtual machines were being created that no one really knew how many virtual servers had been deployed, who owned them or even whether they were still being used. I recall a conversation with a CTO from a mid-size bank who told me he had twice as many virtual servers in his company as he had physical servers prior to the deployment of VMware. The explosion of virtual servers was known as “virtual machine sprawl” and was a significant problem for years until VMware developed the tools necessary to manage large-scale virtual server environments.

Avoiding NFV sprawl

Without a tool such as Brocade’s VNF Manager, network operations teams risk running into NFV sprawl as virtual network services proliferate across the company. The server industry had to go through significant pain before the management tools were developed, so it’s good to see Brocade being proactive about NFV management to prevent its customers from going through similar pain.

Also, VNF Manager brings NFV together with software-defined networking (SDN). Customers can use VNF Manager to instantiate the virtual network functions and load them with an initial configuration. The VNFs can then be mounted and managed using Brocade’s SDN controller through its southbound interfaces. This enables the lifecycle of VNFs to be orchestrated and aligned with SDN initiatives.

VNF Manager is offered as a free download from the Brocade website. The download is a full-featured version that comes with a 60-day license, as well as 60 days of free technical assistance center (TAC) support to help customers get the product up and running. After the 60 days, customers will need to purchase either a one- or three-year license. Brocade offers different bundles for operations teams versus developers, as well as professional services and in-person training sessions to help ensure its customers are successful with the product.

If you’re one of the 61 percent of organizations considering NFV, my recommendation is to be aggressive with the technology, as there are tremendous cost and operational benefits. Just make sure you have the proper management tools in place before deploying it.
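The sprawl problem described above is, at bottom, an inventory problem: knowing what exists, who owns it, and whether it is still in use. As a minimal illustration only (this is not Brocade's product, and the record format is made up), a short Python sketch that flags virtual instances with no recorded owner or no recent activity:

```python
from datetime import date, timedelta

# Hypothetical inventory records: name, owner, last-used date.
# Field names are illustrative, not from any vendor API.
inventory = [
    {"name": "vnf-fw-01", "owner": "netops", "last_used": date(2016, 5, 1)},
    {"name": "vnf-lb-02", "owner": None,     "last_used": date(2016, 4, 28)},
    {"name": "vm-qa-17",  "owner": "qa",     "last_used": date(2015, 9, 3)},
]

def find_sprawl(records, today, stale_after_days=90):
    """Flag instances with no owner, or unused for longer than the window."""
    cutoff = today - timedelta(days=stale_after_days)
    return [r["name"] for r in records
            if r["owner"] is None or r["last_used"] < cutoff]

print(find_sprawl(inventory, today=date(2016, 5, 4)))
```

Running this flags the ownerless load balancer and the long-idle QA machine; a real deployment would pull the same facts from the orchestration layer rather than a hand-built list, which is exactly the bookkeeping a VNF manager automates.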
Companies high on virtualization despite fears of security breaches
Companies are feeling more comfortable with the cloud, virtualization and even software-defined data centers than ever before, despite their fears about security breaches, according to a study due out this month by technology companies HyTrust and Intel. While no one thinks security problems will go away, companies are willing to tolerate the risk in the name of agility, flexibility and lower costs.

Some 62 percent of executives, network administrators and engineers surveyed expect more adoption of SDDC in 2016, which can quantifiably drive up virtualization and server optimization, while 65 percent predict that these implementations will be faster.

Still, there are no illusions about security. A quarter of those surveyed say security will still be an obstacle, and 54 percent predict more breaches this year. In fact, security concerns are the No. 1 reason that 47 percent of respondents avoid virtualization, according to the report. They have good reason for concern. A single point of failure in a virtualized platform, such as a hack into the hypervisor software that sits just above the hardware and acts like a shared kernel for everything on top of it, has the potential to compromise an entire network, not just a single system.

“There’s a strong desire, especially by senior-level executives, to move forward with these projects because there are tangible benefits,” says Eric Chiu, president and co-founder of HyTrust. The opportunity to increase agility, revenues and profits trumps making the virtual environment safer, he adds.

Meanwhile, in the IT department, staff tends to focus on what they know how to protect, not necessarily what they need to protect, according to a Kaspersky Lab report. Only a third of organizations surveyed possess strong knowledge of the virtualized solutions that they use, and around one quarter have either a weak understanding of them or none at all.

Dave Shackleford knows this all too well. He teaches a week-long course on virtualization and cloud security for the SANS Institute. By the end of the first day, he usually realizes that 90 percent of the students, a broad mix of system and virtualization/cloud administrators, network engineers and architects, have very little idea of exactly what they’re up against when it comes to securing virtual infrastructure.

“You’ve got organizations out there that are 90 percent virtualized, which means your whole data center is running in a box out of your storage environment. Nobody is thinking about it this way,” says Shackleford, who is also CEO of Voodoo Security. “It’s not uncommon to go into even really big, mature enterprises and find an enormous number of security controls that they’re unaware of or being overlooked in one way or another” in the virtual environment, he adds.

Adding to the confusion, virtualization has caused a shift in IT responsibilities in many organizations, says Greg Young, research vice president at Gartner. The data center usually includes teams trained in network and server ops, but virtualization projects are typically being led by the server team. “The network security issues are things they haven’t had to deal with before,” Young says.

The average cost of a data breach in a virtualized environment tops $800,000, according to Kaspersky Lab, and remediation costs push the average closer to $1 million – nearly double the cost of an attack on physical infrastructure.

Companies don’t see technology as the sole answer to these security problems just yet, according to the HyTrust survey.
About 44 percent of survey-takers criticize the lack of solutions from current vendors, the immaturity of vendors or new vendor offerings, or issues with cross-platform interoperability. Even as vendors like Illumio, Catbird, CloudPassage and Bracket Computing emerge with fixes to some virtualization security problems, companies can’t afford to wait for the next security solution.

“If you’re 50 percent virtualized today, in two years you’re going to be 70 percent to 90 percent virtualized, and it’s not going to get any easier to add security,” Shackleford says. “If you start moving things out to Amazon or Azure or any big cloud provider, you want to have your security at least thought through or ideally in place before you get there, where you’re going to have even less control than you may have had to date.”

Four steps toward a more secure environment

These security pros agree that companies can indeed have a secure virtual environment today if they can gain a clear picture of their virtual infrastructure, use some of the technology and security tools they already have, and better align technology and security in the organization.

1. Get a grip on your virtual infrastructure

“You can have very good security just through planning – taking the steps and making sure the safeguards are there,” Young says. This starts with inventory management. “The security team needs to get the lay of the land with regards to virtualization,” Shackleford says. “You need to try to get a handle on where hypervisors are, where management consoles are, what’s in-house, where it lives, and what the operational processes are around maintaining those. Next, define standards for locking them down. If nothing else, at least lock down the hypervisors,” Shackleford adds. Major vendors like VMware and Microsoft have guides to help you, as does the Center for Internet Security.

2. Rethink the way you look at data and storage

People seriously need to think about their environment as a set of files, Shackleford says. “It’s a very big shift for security professionals to realize that your whole data center runs from your SAN – your storage network. So they need to at least get familiar with the types of controls that they’ve put in place.”

Vendors are also rethinking their security postures and welcoming third parties who can provide security fixes. “The problem before was, could I apply fine-grained network security to my virtualized environment, and in the past the network ops people said, ‘Absolutely not. We can’t support it,’” says Chris King, vice president in the networking and security business unit at VMware. “Now there are technologies available that will enable them to revisit that request and that can now cut the common thread in [these] breaches: once an attacker is inside, they’re stuck in that compartment and have to break through another wall in order to attack.”

3. Encrypt the data

It’s top of mind these days, but many companies are still not encrypting, Chiu says. “There’s this outdated thought process, which is ‘if it’s within my four walls, then I don’t need to worry about it,’ but that’s definitely not the case. You need to at least encrypt all customer data and all intellectual property wherever it is in your environment,” Chiu says. “Of course the cloud makes finding it worse because you don’t know for sure where that data is – but encrypting all that data should be a fundamental principle.”

4. Coordinate security and infrastructure teams early on

There needs to be alignment and coordination between security and infrastructure teams at the beginning of virtualization projects, Chiu says. “It’s a lot easier to build in security controls and requirements in the beginning than to bolt something on later.” Security also needs to map to the requirements of the organization for the next several years, he adds. “Does the company plan to virtualize PCI data, HC data, move to a shared environment where business units and application tiers are all going to get collapsed together? All those things matter because your requirements are going to be different.”

This story, “Companies high on virtualization despite fears of security breaches,” was originally published by CSO.
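Step 1 above boils down to two loops: enumerate the hypervisors you have, then check each against the lockdown standard you defined. As a minimal sketch of that audit loop in Python (the host names and setting names are hypothetical, loosely modeled on common hypervisor hardening guidance, not taken from any vendor's actual guide):

```python
# Illustrative hardening baseline; keys and values are assumptions
# standing in for whatever your lockdown standard actually specifies.
BASELINE = {
    "lockdown_mode": True,   # restrict direct access to the hypervisor
    "ssh_enabled": False,    # no interactive shell on the host
    "ntp_configured": True,  # reliable timestamps for audit logs
}

# Hypothetical per-host configuration, as gathered during inventory.
hosts = {
    "esx-prod-01": {"lockdown_mode": True,  "ssh_enabled": False, "ntp_configured": True},
    "esx-lab-02":  {"lockdown_mode": False, "ssh_enabled": True,  "ntp_configured": True},
}

def audit(hosts, baseline):
    """Return {host: [settings that deviate from the baseline]}."""
    return {name: [k for k, v in baseline.items() if cfg.get(k) != v]
            for name, cfg in hosts.items()}

for host, gaps in audit(hosts, BASELINE).items():
    print(host, "OK" if not gaps else "deviations: " + ", ".join(gaps))
```

In practice the `hosts` dictionary would be populated from the management console's API rather than typed in, but the comparison against a written-down baseline is the part that matters: without the baseline there is nothing to audit against.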
Virtualization vs. Cloud Computing: What’s the Difference?
Is virtualization right for your business? How about cloud computing? Don’t know the difference? That’s OK — most non-IT folks don’t either. The word “cloud” is often thrown around as an umbrella term, while “virtualization” is often confused with cloud computing. Although the two technologies are similar, they are not interchangeable, and the difference is significant enough to affect your business decisions. Here is a guide to help demystify the tech behind the jargon.

What is virtualization?

In a nutshell, virtualization is software that separates physical infrastructure to create various dedicated resources. It is the fundamental technology that powers cloud computing.

“Virtualization software makes it possible to run multiple operating systems and multiple applications on the same server at the same time,” said Mike Adams, director of product marketing at VMware, a pioneer in virtualization and cloud software and services. “It enables businesses to reduce IT costs while increasing the efficiency, utilization and flexibility of their existing computer hardware.”

The technology behind virtualization is known as a virtual machine monitor (VMM), or virtual manager, which separates compute environments from the actual physical infrastructure. Virtualization makes servers, workstations, storage and other systems independent of the physical hardware layer, said John Livesay, vice president of InfraNet, a network infrastructure services provider. “This is done by installing a hypervisor on top of the hardware layer, where the systems are then installed.”

How is virtualization different from cloud computing?

Essentially, virtualization differs from cloud computing because virtualization is software that manipulates hardware, while cloud computing refers to a service that results from that manipulation. “Virtualization is a foundational element of cloud computing and helps deliver on the value of cloud computing,” Adams said.
“Cloud computing is the delivery of shared computing resources, software or data — as a service and on-demand through the Internet.”

Most of the confusion occurs because virtualization and cloud computing work together to provide different types of services, as is the case with private clouds. The cloud can, and most often does, include virtualization products to deliver the compute service, said Rick Philips, vice president of compute solutions at IT firm Weidenhammer. “The difference is that a true cloud provides self-service capability, elasticity, automated management, scalability and pay-as-you-go service that is not inherent in virtualization.”

What are the advantages of a virtualized environment over the cloud?

To best understand the advantages of virtualization, consider the difference between private and public clouds. “Private cloud computing means the client owns or leases the hardware and software that provides the consumption model,” Livesay said. With public cloud computing, users pay for resources based on usage. “You pay for resources as you go, as you consume them, from a [vendor] that is providing such resources to multiple clients, often in a co-tenant scenario.”

A private cloud, in its own virtualized environment, gives users the best of both worlds. It can give users more control and the flexibility of managing their own systems, while providing the consumption benefits of cloud computing, Livesay said. On the other hand, a public cloud is an environment open to many users, built to serve multi-tenanted requirements, Philips said. “There are some risks associated here,” he said, such as having bad neighbors and potential latency in performance.

In contrast, with virtualization, companies can maintain and secure their own “castle,” Philips said. This “castle” provides the following benefits:

Maximize resources — Virtualization can reduce the number of physical systems you need to acquire, and you can get more value out of the servers. Most traditionally built systems are underutilized; virtualization allows maximum use of the hardware investment.

Multiple systems — With virtualization, you can also run multiple types of applications and even run different operating systems for those applications on the same physical hardware.

IT budget integration — When you use virtualization, management, administration and all the attendant requirements of managing your own infrastructure remain a direct cost of your IT operation.

How do you know if your business needs a virtualization solution?

Determining whether virtualization is the best solution for a business requires an in-depth analysis of the organization’s specific needs and requirements. “Some of the items we discuss with customers when they are evaluating private cloud — virtualization — versus cloud computing include who is going to be providing the support and how challenging integration with other systems will be,” Livesay said.

You should also consider costs — total cost of ownership (TCO), operational expenditures (OPEX) and capital expenditures (CAPEX) — how much management the business can and wants to do, scalability requirements, security needs and how much feature development can be expected, Livesay said. “Generally speaking, businesses that work more on an OPEX model, that have less IT staff and fewer security concerns, are more cloud oriented,” Livesay said. “Businesses that need greater control for integration and security, or that work more on a CAPEX model, would lean toward virtualization.”

How do businesses know if they should use a true cloud solution?

While virtualization is the best solution for some organizations, a cloud solution offers several benefits that are more suitable for other businesses.
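The "maximize resources" benefit is easy to put in numbers: if each workload uses only a small fraction of a host's capacity, packing them onto shared hosts cuts the host count while raising utilization. A small Python sketch of that consolidation arithmetic, using illustrative figures (the classic case of ten workloads each at 5 percent of one server's capacity):

```python
import math

def hosts_needed(workload_utils, target_util=0.5):
    """Hosts required if per-workload utilization (as fractions of one
    host's capacity) is packed up to a target utilization ceiling."""
    total = sum(workload_utils)
    return max(1, math.ceil(total / target_util))

# Ten workloads at 5% each: one consolidated host at ~50% utilization
# replaces ten hosts idling at 5%.
loads = [0.05] * 10
print(hosts_needed(loads))
print(f"combined utilization: {sum(loads):.0%}")
```

This deliberately ignores memory, I/O and failover headroom, which real capacity planning must also account for; the point is only that aggregate utilization, not workload count, drives the hardware requirement.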
Philips said cloud solutions are best for businesses with the following needs:

Outsourced IT — The day-to-day administration, care and feeding of supporting systems move away from you to the service provider. This could free up internal IT resources for higher-value business support and allow you to put IT budget dollars toward efforts that advance your business.

Quick setup — Cloud startup is relatively quick and easy. Plus, servers, appliances and software perpetual licenses go away when you use such a service.

Pay-as-you-go — An example can be found in the Software-as-a-Service (SaaS) applications available today that allow the off-loading of basic IT requirements to cloud service providers. You pay for what you need and use, but you do not have to continue to invest in many of the products used to support the network and systems, such as spam/anti-virus, encryption, data archiving, email services and off-site storage.

Scalability — By using the cloud, you can also temporarily scale your IT capacity by off-loading high-demand compute requirements to an outside provider. As a result, as mentioned above, you pay for only what you need and use, only at the time when you need it.

Keep in mind, however, that virtualization and cloud services are not end-all, be-all solutions. Like any other technology or service a business adopts, things can always change. “While cloud computing and virtualization each have their own benefits, they are not competing approaches,” Adams said. “We view cloud computing as an evolution of virtualization. Customers that virtualize their hardware servers may adopt cloud computing over time for increased self-service, scale, service delivery levels and agility.”

What should businesses look for in a virtualization provider?

Businesses considering virtualization should think about the following questions, Adams said:

Is it a tried and tested solution? Research the vendor’s track record of product innovation, success and customer adoption.

Is there a vision and public roadmap for the solution? You want to understand how the solution will advance and how it will help your business in the long run.

What type of ecosystem support exists for the solution? It’s imperative that the vendor work with key business and industry-specific independent software vendors (ISVs), as well as a wide range of resellers, service providers and system integrators.

Does the solution support openness and choice? As your business grows, you want the flexibility to evolve your products and processes, and the ability to incorporate other technologies over time.

Originally published on BusinessNewsDaily.