Archive for August, 2008

Emerging Technology Is More Painful Than Management Wants To Believe

Virtualization Critical Evaluation – Chapter 05

There is a trend in the Information Technology (IT) industry that is well known, well understood, and yet well hidden at the same time. Not because it is held under non-disclosure agreement, not because it is protected by patent or copyright, but because no one wants to acknowledge it. The fact is that the concept of faster, better, cheaper is foolishness in the IT industry. Moreover, it is an excuse, not a strategy. Real work, real quality in design and development, takes effort and, exactly what management does not want to hear, real time. Good work always takes good time to do.

Want an example? VoIP (Voice over IP). It is horrible, to be honest. The voice quality is one step short of ridiculous; the quality and consistency are nothing compared to good old analog. I am now on my third vendor for VoIP, and considering looking for a fourth, if I can even find one. And I bet you thought I was going to use virtualization as the example of a technology before its time? Well, give me a few minutes, maybe I will.

Compromise is the dominant theory of development in the IT industry today. I dare anyone reading this blog to say otherwise. I dare anyone to debate this issue in an honest, rational, and objective manner. Change context for a minute: where are the 5, 10, 15, or even 20 year strategic plans for product development? How is the IT industry really going to go green, and not just pay lip service to the concept as it has for decades? The IT goals of 20 years ago have not been very effective, now have they? The paperless office never really got off the ground, did it? For years, laser printers have generated a greater volume of printed pages than all the book publishing houses, worldwide, ever did. Want an example? Purchase a new automobile; there are still somewhere between 25 and 40 pages printed before you get out of a dealership. Where did we go wrong? Such great ideas, such poor implementation? Why? It all comes down to compromise of ideals, goals, and objectives. Why is it that all the vendors have roadmaps that are only 12 or 18 months long? Because no one is thinking long-term; no one wants to make a real commitment in a specific direction. Edison's lab made what, some 6,000 attempts to get a better light bulb? I doubt the typical technology firm that develops a new PCI device or USB device does more than 100 code builds, once alpha code is locked, before market release.

The IT industry, in reference to product and solution development, has a mob mentality. Baby steps all the time, not leaps and bounds. This is not to say that the IT industry has not done some wonderful things. It has. Computing technology has revolutionized many fields of science and technology, in ways that even 40 or just 20 years ago would have been closer to a Star Trek episode, pick your favorite series and insert it here, than anything even I could have dreamed up at the time, and believe me, I have some crazy ideas. Just ask my friends! But am I making my point? How many great ideas have been scrapped because someone was unwilling to wait for the solution to be realized? Or never wanted to commit resources, financial and temporal, to the solution when it was nothing but an idea? How many ideas never made it to a napkin in a coffee shop because time and resources were impossible to get? Or, and this is frightful, how many solutions have been strangled in the validation and certification phases because someone, somewhere in the chain of command, was unwilling to wait? Because someone refused to believe in potential, because someone could not see short-term profit? Imagine if Edison, Tesla, or even von Braun, von Neumann, Babbage, and, sorry, almost forgot, Einstein had project timelines that strangled their explorations of thought, never mind prototyping in a lab.

I experienced this first hand as a very young technical support resource. I was assigned to an evaluation team for a new software application; this was about 15 years ago. We worked day and night to improve the application in question. Feedback to the developers, feedback to us from the end-user population that was doing alpha testing, then beta testing, and even, finally, release candidate testing for the application; it was wonderful to feel progress, that a quality solution was near completion. However, there was not enough time; we just did not move fast enough, it seemed. For whatever reason, even though real progress was made, we were not making everyone happy. The timeline for release had been predetermined by management more than a year before the first line of code was typed. This was a new experience for me. I was taught, both at home and in school, that you do the job right, you do the right thing. Meaning, in my innocent, rose-colored-glasses view of the world, the IT industry world, that if the application was not quite done, you delayed release, you got it right, and then, and only then, you released it. Quality was the key to success, the key to true massive profitability. I was quite wrong, or so it seemed at the time.

The lead project manager walked into the end-user test lab early one morning, with no warning, and began red-lining the project test plan. Days and weeks of regression, system integration, and component validation testing just disappeared from the master timeline. We would miss real bugs and real issues by doing this. I was about halfway done with my breakfast when I got the news from others on the evaluation team. Why was I eating breakfast? Well, I had been up all night chasing a nasty bug in the code, trying to isolate the issue so the developers could move forward sooner rather than later. When I happened to see the project manager, after he had done the nasty to the master timeline, I asked…What is this? Are we closing the project? The reply was…No. We are selectively shifting features to the planned version 2.0, rather than the version 1.0 release. Of course, being young and lacking tact in political scenarios, I asked…What about those features that were agreed upon with the end-users, our customers, our clients, before we started? How will you explain that significant features are really there, but not enabled because they are incomplete, when we are so close to being done? The answer, and I am being explicit, was…The application must be released in 15 days; we will never train the end-users on the additional features that were not validated, and we will never acknowledge that some features have been dropped from the 1.0 release. I was in shock, I was confused, I felt betrayed. This decision just did not make sense. Why? We still had some time, but they had shortened the total timeline, some 5 days ahead of the original planned release. What the heck!

Now, you may be asking why I told this story from so long ago. Because I now realize, 15 years later, that this theory of product release is so ingrained into the IT industry, at all levels, that it is killing the industry. Management in the IT industry is under so much pressure to make things happen, on a strict mathematical schedule, with no exceptions, no flexibility. Thus all the true creative effort and the artistic aspect of idea development and design are dying out. The ugly aspect of this is that quality is something you get with version 3.0, which actually costs the customer or client even more. Look at Windows. Was it not 3.1 that really was functional to any reasonable degree? What did early adopters do? Spend a ton of money on Windows 1.0 or 2.0?

When was the last time we had any true, knock-your-socks-off, quantum leap in the IT industry? 15 years ago, or 25 years, or more? Is it not true? Nothing new under the sun should be the slogan for the IT industry. Tell me I am wrong? I am not bitter about this; I am not even surprised by it…any more. I was at a technical conference recently, sponsored for the most part by one of the big three hardware vendors. Which one it was is not significant to my discussion here, but what every technical session screamed at me was, yes, you guessed it, compromise. In ideas, in design, in implementation; and the attempts to compare these just-average products to the competition only reinforced how all three vendors are in lock-step with each other, with solutions so close in capacity and function that picking one over the other is almost insignificant. Of course, we all know the one that offers the cheapest cost will be declared the best, quasi faster and better. Great, faster, better, cheaper is back!

A number of things have contributed to this. Out-sourcing: why own when you can leverage? The lowering of educational standards: hey, expecting results above and beyond the average is not fair; you might damage some below-average student's esteem, rather than encourage improvement and achievement? The lack of large firms willing to develop talent and create careers, versus stealing talent, only to let it go when out-sourced? Why invent, when you can purchase? The Japanese still work according to 20-year or longer timelines; they expect achievement, but they also commit to technologies that seem logical in reference to maturity in future years, not quarters. Ask General Motors. They have had more than 30 years to get something on the table, to really change the world, and they have failed, and a 100-year firm is all but dust. Look at Toyota. They are only just now peaking on plans established more than 30 years ago. Just imagine what the computing industry would be like if that type of effort were made. Don't like automobiles as an example? What about fuel? Brazil has done better than most countries along the same idea; did someone yell sugarcane?

And how does this have anything to do with virtualization? It is simple and easy to see, if you take the time to look. We have so much computing power compared to the past, cores upon cores, that we over-purchased, over-scaled, and under-used it to the point that an entire new segment of the IT infrastructure was created and now dominates said IT industry, and it is called virtualization. What is really stupid is that it is hypervisor virtualization, not application instance virtualization, that dominates now. Why? Because we want to achieve faster, better, cheaper, of course! Hypervisors are the result of the faster, better, cheaper mind set in virtualization. Virtualization should have resulted only in flexible environments, not utilization redirection. If all of those project managers, developers, designers, etc., years ago, had taken just that extra bit of time and effort to do something right beforehand, then there would be no afterwards, no emergence of virtualization as we know it today. No outrageous cost avoidance, because the environment would have been lean and mean. No zealous endorsement of…faster, better, cheaper. Well, at least not in the IT industry.
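The over-purchased, under-used argument is easy to put in numbers. Here is a minimal back-of-the-envelope sketch; the utilization figures are my own hypothetical assumptions, not from any vendor survey:

```python
# Hypothetical consolidation math illustrating the "over-purchased,
# under-used" claim. All numbers below are assumptions for illustration.

def consolidation_ratio(avg_utilization: float, target_utilization: float) -> float:
    """How many lightly loaded physical servers could, in principle, share
    one virtualization host driven to target_utilization."""
    return target_utilization / avg_utilization

# Assume each physical server idles along at 8% average CPU utilization,
# and we are willing to run a consolidated host at 64%.
ratio = consolidation_ratio(0.08, 0.64)
print(f"{ratio:.0f} servers per host")  # prints "8 servers per host"
```

The point of the sketch: if servers had been sized to their real workloads in the first place, there would be far less idle capacity for a hypervisor to reclaim.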

I am putting my rose-colored glasses on eBay. I have just enough faith in the future that there must be someone out there that needs them…I hope.

August 22nd, 2008

Was Greene Kicked to the Curb or What?

Virtualization Critical Evaluation – Chapter 03

EMC, and to an extent VMware, think they are in trouble. Why else would Greene be kicked to the curb? I have refrained from comment in this blog about Greene until now. This has been a decision based on perspective, meaning that, since I have not been the CEO of one of the most significant firms of the last 10 years, nor walked in Greene's shoes, I feel it is not my place to comment on her, as an individual, in any personal context. This is difficult, because I feel her view of the virtualization industry has in no small part created some issues for VMware. No one, I believe, could deny the fact that VMware, as a company, has been very careful for years to say, we are not, really, part of EMC. I remember being corrected by several VMware employees, or should I say associates, since employee seems to be a dirty word, that EMC is nothing more than a necessary evil to protect VMware from others, did someone say Microsoft? What the heck does that mean? VMware is not really part of EMC? EMC paid hard cash for VMware, so to speak. Since when is purchase not ownership?

But regardless of your perspective on the EMC ownership of VMware, the negative impact of this philosophical view of non-ownership of VMware is real, painful in fact. When you are an EMC customer and use VMware Virtual Infrastructure, or a VMware customer and use EMC storage solutions, the internal infighting between EMC and VMware over the last few years has been nothing but frustrating. The disjointed nature of the relationship has done nothing to benefit EMC or VMware. I know this from personal experience, as well as from quite a few friends in the industry that work in or with the virtualization scope, and the two entities. EMC says tomato, VMware says tomahto? That is the last thing anyone wants to hear during architectural design meetings. Or even worse: after you get EMC and VMware to agree during design efforts, to be up at 3 am on a weekend, working on a nasty storage processor issue with EMC behind your virtualization infrastructure, only to hear the VMware technical resource on the same conference call say…ah, why did you do that, we don't support that…whatever that is…that should just never happen between EMC and VMware. Fortune 10, or even Fortune 50, nay, Fortune 100 firms have absolutely no patience for this type of bull, cough, zero tolerance for this type of foolishness, and again, I know this from direct experience.

So why was Greene kicked to the curb, if in fact that is what happened? Because I am not completely sure that is what happened. What is obvious? Is it that EMC and VMware routinely not agreeing seems a big enough issue to cause such a shake-up? I say maybe it was. Never mind the fact that VMware stock is slipping; never mind the fact that VMware has laid a few eggs that stink. ESX 3i, yes 3i, was a great idea, but either immature or marketed wrong, the classification is your choice. ESXi has not yet emerged as the Hyper-V killer it should have been? It is not an enterprise solution, yet. A fact that I have made known in the past. VCB was a great idea, but it just completely failed at any scale approaching enterprise needs. I do not understand how a storage technology firm cannot create a backup solution that works at scale. That is core to their competitive advantage! It happened because VMware ignored EMC? Yes, ignored; worse, VMware kept thinking VCB must work for all customers, not just EMC customers. Maybe EMC will get it right with Avamar and VMware ESXi. That remains to be seen as well, but it seems to have more potential than VCB did, at a minimum.

I think the real reason that Greene was kicked to the curb was much darker. In fact, it goes to the core of VMware management direction and policy. VMware does not know if it is an Enterprise-client firm or a small mom-and-pop firm. VMware is struggling with its identity when it should be 100% focused on its product development, and improvement, I said improvement, of its core business, not biased to innovation. Some say VMware has lost its way. That VMware is no longer listening to its non-Enterprise, smaller customers. That VMware is defending itself from Hyper-V by ignoring its smaller customers? Well, to be honest, that is exactly what I believe Greene was focused on, the non-Enterprise customers, because of how VMware talks, walks, and explains itself. I get this impression based on what, and how, she communicated to the entire VMware organization. The scary thing is that small customers, in part, become bigger customers because they think and act like Enterprise customers; with hard work, a bit of luck, and thinking strategically, not tactically alone, any firm can become an Enterprise-scale entity.

What is my evidence? In short, VMware does not listen to its enterprise customers, or has not for the last few years in any consistent manner. VMware does not sound like a strategically thinking company, even today at times, in how they present their new products or new features. They still think like a smallish customer organization. Unfortunately, scale is everything, and profitability in competition with Microsoft is scale, scale, scale, and market share, market share, market share. Thus, VMware must reassert a two-channel marketing plan, something that VMware has struggled with in the past. I have experienced this first hand. Growing faster than lightning has been confusing for VMware, but that is no excuse for VMware not catering to its enterprise customers the way it must to survive. It is the big-scale, large-infrastructure firms that are going to allow VMware to survive. With the current economic situation that the United States faces in 2009, it is the major players of scale that will have the resources and goals to continue with virtualization at the pace and scope that VMware must have to continue to be successful.

This is not to say that the smaller customers are not important. In fact, if VMware changes its cost model, which I believe must happen as well, to take some of the sting out of purchasing VMware solutions, small customers will continue to be significant, and will help avoid de-facto acceptance of Hyper-V. But smallish customers cannot continue to dominate VMware thinking in reference to product design and evolution. It is just too easy for big customers, Enterprise scale, to go to Microsoft, which thinks big, does big, and is at its core focused on its cash cows, so it addresses Enterprise customers' concerns with the ease and expectation enterprise customers demand. EMC sees this, and so, I think, this is why Greene is sitting on the curb. The choice of the new CEO for VMware just screams…VMware is an Enterprise-friendly firm, really we are, believe us, we are listening, well, we are now. The question is, just who, still, is listening, and who has already drunk the Hyper-V-flavored Kool-Aid?

August 15th, 2008

