Weblog of Mark Vaughn, an IT professional and vExpert specializing in enterprise architecture, virtualization, web architecture and general technology evangelism


Orchestration: The Marriage of IT and Business

With a graduate degree in business and 20 years of experience in IT, I am often frustrated to see these two organizations at odds with one another. While it is nearly impossible to conduct business today without IT, IT does not exist just to make pretty lights blink in the data center. IT exists to serve the business and, ultimately, to aid the business in achieving its goals. These should be complementary goals, not contrary ones.

For the IT strategists and engineers who “get” this concept, there are two core technologies you need to be exploring. The first is automation, and the second is orchestration. Automation will make routine processes more reliable and their outcomes more predictable. Automation is not easy. It can often be much more difficult to automate a task than to simply execute the steps individually, but the outcome is more valuable.

The second concept, orchestration, is where IT and the business meet. With orchestration, you add intelligence to automation, allowing tasks to be triggered by predefined conditions. These can be technological conditions or even business conditions.
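To make the idea concrete, here is a minimal sketch of condition-triggered orchestration. All of the names, thresholds and tasks below are hypothetical illustrations, not any specific product’s API: the point is simply that predefined conditions (technological or business) decide when automated tasks fire.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Rule:
    """Pairs a predefined condition with an automated task."""
    condition: Callable[[dict], bool]  # evaluates the current state
    task: Callable[[], str]            # the automated action to run

@dataclass
class Orchestrator:
    rules: list[Rule] = field(default_factory=list)

    def evaluate(self, state: dict) -> list[str]:
        """Run every automated task whose condition the state satisfies."""
        return [rule.task() for rule in self.rules if rule.condition(state)]

orchestrator = Orchestrator(rules=[
    # Technological condition: CPU pressure triggers a scale-out task.
    Rule(condition=lambda s: s["cpu_percent"] > 80,
         task=lambda: "provision additional web server"),
    # Business condition: month-end close triggers extra reporting capacity.
    Rule(condition=lambda s: s["day_of_month"] >= 28,
         task=lambda: "scale up reporting cluster"),
])

actions = orchestrator.evaluate({"cpu_percent": 85, "day_of_month": 30})
print(actions)  # both conditions are met, so both tasks fire
```

Notice that the second rule has nothing to do with infrastructure health; it encodes a business event. That is the meeting point of IT and the business that orchestration enables.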

To end 2013, I wrote a two-part series on these topics for TechTarget. The first article, “Car assembly plants can teach a valuable lesson in IT automation,” deals with automation and a great experience I had at a GM plant. The second article, “IT orchestration can help bridge gaps to unite divided business units,” builds on the previous work on automation.

I hope that you will take a minute to read these articles, and then come back here and leave a comment to let me know what you think.

Heads Up! Avoiding Seagull Consultants

In my 20+ years in technology, I spent roughly 16 as a customer and the last four as a consultant. Those can be two very different roles, even while dealing with very similar challenges and solutions. However, they do not have to be that different. If you know what to look for, you can find consultants who are focused on your business, and not just their technology.

Have you ever heard the term “seagull consultant”? It is a humorous term, but one that can be outright terrifying for your business. To learn more about this term, and how to select the right consultant for your business, read my latest TechTarget article “Meeting long-term business goals, avoiding seagull consultants”. If you would like, feel free to come back here and leave a comment. I would love to hear your thoughts on the topic.

VDC-OS: Is it finally here?

VMware first announced its concept of the virtual data center operating system (VDC-OS) at VMworld 2008. Paul Maritz, VMware Inc.’s CEO at the time, took the stage and began to share his vision of a software-defined data center. Maritz is no longer in the driver’s seat, but this destination of a virtual data center operating system and the software-defined data center is finally coming into view.

At VMworld 2013, the concept of a VDC-OS took two big steps forward. Learn more about this in my latest TechTarget article “Not lost, just recalculating: VMware’s route to a VDC-OS has been long”, then come back here and leave a comment.

The Changing Hypervisor Role

Not all hypervisors have reached a level of parity in features, functionality and performance (regardless of what some marketing campaigns might say). However, the virtualization heavyweights are beginning to see real competition, and they realize that the gaps between the leading hypervisors are closing quickly. Given these narrowing feature gaps, how will we compare hypervisors in the future?

As the hypervisor battle evens out, I foresee a kind of stalemate. Vendors will struggle to differentiate their products from the competition, and the short attention span of IT pros will move to areas that provide greater value.

What can this mean for your organization and your long-term IT strategies? For more on this topic, read my TechTarget article “As feature gaps narrow, how will we compare hypervisors in the future?“.

Software Defined Networking: Can’t We All Just Get Along?

There is a lot of talk about the software-defined data center lately, though the shift from hardware to software has been going on since the first logical partitions appeared in mainframe computing. Surprisingly, this shift continues to sneak up on people. When that happens, there can be power struggles and confusion as to who’s responsible for what. Many technology disciplines — networking teams, for example — still see themselves as focused on managing hardware rather than software and fail to see that virtualization has moved us beyond these distinctions.

What a networking device does, its core functionality, has been abstracted and is taking a new form…but will network administrators embrace this change? I go into more detail on this topic in my latest TechTarget article “Virtualization widens schism between server and networking teams”. Please take a minute to read that, then come back here and tell me what you think.

Don’t Fear Automation, Embrace It

In these times of lean staffing budgets, some IT professionals are hesitant to dive too deep into a new automation tool, worried they could actually automate themselves out of a job.

I came into technology as a Perl programmer and UNIX administrator. I spent years writing common gateway interface (CGI) scripts for websites and automating routine tasks. As I moved along in my career, the ability to script and automate tasks was one of my top priorities when interviewing prospective employees. In my opinion, the ability to leverage an automation tool is a sign of a mature engineer. So why are so many people hesitant to embrace automation tools?

An automation tool can be anything from a scripting language to an application that allows you to build a workflow of tasks to be executed as a single action. Some tools are easy to learn, and others are a little more daunting to master. However, it may not be the learning curve that keeps many people from leveraging these tools. While some may simply not appreciate the value of automation, others are truly apprehensive about the end result. Some fear that if they can do their job in half the time, they or one of their co-workers may no longer be necessary. That could not be further from the truth.
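The “workflow of tasks executed as a single action” idea can be sketched in a few lines. The task names below are hypothetical placeholders for whatever your routine actually requires; the point is that individually executed steps become one repeatable, predictable action.

```python
def snapshot_vm(name: str) -> str:
    # Placeholder for taking a pre-change snapshot of a VM.
    return f"snapshot taken for {name}"

def apply_patches(name: str) -> str:
    # Placeholder for applying the current patch set.
    return f"patches applied to {name}"

def verify_services(name: str) -> str:
    # Placeholder for post-change health checks.
    return f"services verified on {name}"

def maintenance_workflow(vm_name: str) -> list[str]:
    """Executes the individual steps, in order, as one action."""
    steps = [snapshot_vm, apply_patches, verify_services]
    return [step(vm_name) for step in steps]

print(maintenance_workflow("web01"))
```

Once the steps live in a workflow like this, the order can never be fat-fingered and the outcome is the same every time, which is exactly the reliability argument for automation above.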

I cannot remember anyone automating themselves out of a job. Gaining valuable skills only makes you more valuable, not less. I love this topic, and I take a deeper look at it in my article “Don’t fear an automation tool — it may be your best bet at job security”.

The Storage Landscape Is Changing

Virtualization transformed data centers and restructured the IT hardware market. In this time of change, startups seized the opportunity to carve out a niche for products like virtualization-specific storage. But are these newcomers like Nutanix and Fusion-io here to stay or will they struggle to compete as established companies catch up with storage innovations of their own?

For a long time, it appeared storage vendors were growing complacent. A few interesting features would pop up from time to time, and performance was steadily improving, but there were few exciting breakthroughs. Users weren’t demanding new features, and vendors weren’t making it a priority to deliver storage innovations. Virtualization changed that tired routine.

In many ways, it is now the storage vendors who are knocking down technology walls and enabling new technologies to flourish. I discuss this topic more in my TechTarget article “Virtualization storage innovations challenge market leaders”. Please give it a read and come back here to leave any comments.

Virtualization Paying Off?

Several years ago, server virtualization rolled into the data center with all of the outrageous promises and unbelievable claims of a sideshow barker. Experts and vendors claimed it was going to improve server efficiency, shrink your infrastructure, slash bloated power bills, make cumbersome administrative tasks disappear and cure the common cold. The sales pitch was smooth and we all bought in, but has virtualization fulfilled the promises?

Did your infrastructure shrink when you implemented virtualization?

I want to hear more from you about this. Take a minute to read my TechTarget article “Virtualization improved server efficiency, but did it meet the hype?”, then come back here and contribute to the conversation.

Beware: Storage Sizing Ahead

Managing data growth continues to be a struggle, and organizations are beginning to outgrow that first storage array they bought a few years ago. As they do, some are in for a big surprise. For years, the focus has been on adding storage capacity. In fact, the development of a storage strategy is still referred to as a “sizing” exercise. Today, however, the challenge is accessing that huge amount of data in an acceptable amount of time, and the size or capacity of a drive has little or no correlation to its performance. An IT administrator who focuses too narrowly on adding storage capacity can end up with an array that can hold all the data but can’t support the IOPS demanded by applications.
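The capacity-versus-IOPS gap is easy to show with back-of-the-envelope arithmetic. All of the figures below are hypothetical round numbers (drive size, per-spindle IOPS, workload requirements), chosen only to illustrate how differently the two sizing exercises can come out:

```python
import math

# Hypothetical workload requirements.
required_capacity_tb = 100
required_iops = 20_000

# Hypothetical drive: a large 7.2K RPM nearline spindle.
drive_capacity_tb = 4
drive_iops = 75  # rough order-of-magnitude figure for one such spindle

drives_for_capacity = math.ceil(required_capacity_tb / drive_capacity_tb)
drives_for_iops = math.ceil(required_iops / drive_iops)

print(drives_for_capacity)  # 25 drives hold all the data...
print(drives_for_iops)      # ...but 267 are needed to serve the IOPS
```

Sizing on capacity alone leaves this array roughly ten times short on IOPS, which is exactly the “big surprise” described above.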

If you are considering a storage upgrade, it is critical that you understand how this can impact your organization. I cover this in more detail in my TechTarget article “Adding storage capacity can actually hurt IOPS”. Please take a minute to read the article, then come back here and leave a comment to contribute to the conversation.

Server To Go

Desktop hypervisors, such as VMware Workstation and Parallels Desktop, open up a world of management and troubleshooting possibilities for server virtualization admins.

Whether you are new to server virtualization or a seasoned veteran, there is a very good chance that your first hands-on experience with the technology was in the form of a desktop tool such as VMware Workstation, VMware Fusion, Parallels or even Windows Virtual PC. You probably installed it as a chance to kick the virtual tires or maybe to aid in a major operating system change.

Regardless of the reason, for many, the virtualization journey began with a desktop hypervisor. In fact, I don’t think we give enough credit to just how great a role these desktop tools play in the world of server virtualization.

Desktop hypervisors may provide more value than you realize, and no IT admin has a good excuse not to be running one. For more on this topic, check out my TechTarget article “Why virtualization admins and desktop hypervisors should be BFFs”, then come back here and leave any comments that you may have.
