VMware first announced its concept of the virtual data center operating system (VDC-OS) at VMworld 2008. Paul Maritz, VMware Inc.’s CEO at the time, took the stage and began to share his vision of a software-defined data center. Maritz is no longer in the driver’s seat, but this destination of a virtual data center operating system and the software-defined data center is finally coming into view.
At VMworld 2013, the concept of a VDC-OS took two big steps forward. Learn more about this in my latest TechTarget article “Not lost, just recalculating: VMware’s route to a VDC-OS has been long,” then come back here and leave a comment.
Not all hypervisors have reached a level of parity in features, functionality and performance (regardless of what some marketing campaigns might say). However, the virtualization heavyweights are beginning to see real competition, and they realize that the gaps between the leading hypervisors are closing quickly. Given these narrowing feature gaps, how will we compare hypervisors in the future?
As the hypervisor battle evens out, I foresee a kind of stalemate. Vendors will struggle to differentiate their products from the competition, and the short attention span of IT pros will move to areas that provide greater value.
What can this mean for your organization and your long-term IT strategies? For more on this topic, read my TechTarget article “As feature gaps narrow, how will we compare hypervisors in the future?”
There is a lot of talk about the software-defined data center lately, though the shift from hardware to software has been going on since the first logical partitions appeared in mainframe computing. Surprisingly, this shift continues to sneak up on people. When that happens, there can be power struggles and confusion as to who’s responsible for what. Many technology disciplines — networking teams, for example — still see themselves as focused on managing hardware rather than software and fail to see that virtualization has moved us beyond these distinctions.
What a networking device does, its core functionality, has been abstracted and is taking a new form, but will network administrators embrace this change? I go into more detail on this topic in my latest TechTarget article “Virtualization widens schism between server and networking teams.” Please take a minute to read that, then come back here and tell me what you think.
In these times of lean staffing budgets, some IT professionals are hesitant to dive too deep into a new automation tool, worried they could actually automate themselves out of a job.
I came into technology as a Perl programmer and UNIX administrator. I spent years writing Common Gateway Interface (CGI) scripts for websites and automating routine tasks. As I moved along in my career, the ability to script and automate tasks was one of my top priorities when interviewing prospective employees. In my opinion, the ability to leverage an automation tool is a sign of a mature engineer. So why are so many people hesitant to embrace automation tools?
An automation tool can be anything from a scripting language to an application that allows you to build a workflow of tasks to be executed as a single action. Some tools are easy to learn, and others are a little more daunting to master. However, it may not be the learning curve that keeps many people from leveraging these tools. While some may simply not appreciate the value of automation, others are truly apprehensive about the end result. Some fear that if they can do their job in half the time, they or one of their co-workers may no longer be necessary. That could not be further from the truth.
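To make the “workflow of tasks executed as a single action” idea concrete, here is a minimal Python sketch. The task names and the `run_workflow` helper are hypothetical illustrations, not the API of any real automation product:

```python
# A minimal sketch of chaining routine admin tasks into one workflow.
# Each "task" is just a function that appends its result to a shared report.

def check_disk(report):
    # In a real tool, this might call out to df or a monitoring API.
    report.append("disk checked")
    return report

def rotate_logs(report):
    # Placeholder for archiving or truncating old log files.
    report.append("logs rotated")
    return report

def email_summary(report):
    # Placeholder for mailing the nightly report to the team.
    report.append("summary sent")
    return report

def run_workflow(tasks):
    """Execute a list of task functions in order, as a single action."""
    report = []
    for task in tasks:
        report = task(report)
    return report

if __name__ == "__main__":
    nightly = [check_disk, rotate_logs, email_summary]
    print(run_workflow(nightly))
```

Even a toy like this shows the payoff: once the routine is captured in code, running it takes one command instead of three manual steps, and the sequence never gets done out of order.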
I cannot remember anyone automating themselves out of a job. Gaining valuable skills only makes you more valuable, not less. I love this topic, and I take a deeper look at it in my article “Don’t fear an automation tool — it may be your best bet at job security.”