Desktop Virtualization… The Right Way (Politics and Puppy Chow)

Politics and dog food… some might say they go hand-in-hand (especially if you watched any coverage about the healthcare debate). But politics and dog food are also relevant in most organizations, especially when undertaking a massive restructuring in the way you deliver desktops to users.

Desktop virtualization is not something you can just turn on one day; it takes planning. Some organizations decide to implement only a small-scale, limited virtual desktop environment, but in doing so they forfeit the anticipated improvements. Instead of having a centrally managed desktop environment capable of supporting users in all geographies, the small-scale implementation ends up creating more complexity, as two different types of desktop environments must now be supported: traditional and virtual.

If the desktop virtualization solution is to succeed in the long run, it requires collaboration between multiple IT groups: network engineers, desktop administrators, server specialists, application experts, and the support team. In many organizations, these teams each have their own objectives and responsibilities. Taking on a desktop virtualization project requires time, resources, and commitment, a combination that is unlikely to come together organically.

To be done correctly, a desktop virtualization initiative must have executive-level buy-in. Only when the executives are on board with the new initiative will all of the pieces fall into place, which include:

  • IT Collaboration: Once the executives make desktop virtualization a corporate imperative, the IT groups must forgo the day-to-day politics of their own silos and work together to try and come up with a solution that meets the objectives of the business. This will be difficult and involve breaking down the typical barriers between groups, but with a mandate from the highest levels within the organization, the groups will have no alternative but to work together on a common goal.
  • Funding: Desktop virtualization requires the purchase of additional data center hardware, the levels of which are based on the scale, configuration and virtual desktop types delivered. Most departments do not have the financial resources to create a best-of-breed solution for the users, which results in a fragmented solution not meeting the organization’s expectations. However, with the executive buy-in, funding must be made available to upgrade the infrastructure to support the new environment. This funding must include new server hardware, storage infrastructure, management tools, and networking optimizations.
  • Users and Change: Most users hate change, especially when it comes to their desktop. Change often means downtime, broken applications, or lost files. However, this resistance can be mitigated by having users hear firsthand from the executives why the initiative is being undertaken, what the expectations are, and how the users can help make the project a success. In fact, executives should be the first to embrace the new environment, configured the same way users will be expected to work. By watching the leaders eat the puppy chow, users are more inclined to believe that the new solution is the right thing for them.

With all of the talk about dog food, I think I’ll buy some stock in Purina Puppy Chow.

This is Part 7 in the Desktop Virtualization… The Right Way series.

Daniel Feller
Lead Architect – Worldwide Consulting Solutions
Citrix Systems, Inc.
Blog: Virtualize My Desktop


I’m Working When I’m Idle

I’m working when I’m idle; well, from my desktop’s perspective, that is the case. We all know that when a desktop starts, when a user logs on, when a user is working, and when a user logs off, that user has an impact on the system resources: CPU, memory, and disk. When the user is idle, we expect CPU and disk to idle as well. But is this accurate? Take a look at my perfmon graph from my desktop for almost an entire workday:

As you can see, when I’m working, my CPU and disk activity increase. CPU climbs to 25-30% and disk activity jumps to roughly 8 write IOPS with minimal read IOPS. This is all as expected. However, when I’m idle, my CPU utilization does drop, but not to 0%. My disk IOPS also drops, but not to 0. In fact, my write IOPS only drops to 4.
Why? I’m idle. Doesn’t that mean I’m not consuming resources?  Time to put on my Sherlock Holmes hat and find out.

First, I had to look at the applications I was running:  Outlook, TweetDeck, Yahoo IM, Firefox, Word, OneNote.

Second, I had to understand what those applications do, which was easy because I use these applications every day.  TweetDeck is refreshing my Twitter feeds every few minutes.  Outlook is downloading and syncing my email every few minutes.  Firefox had 5 different pages open, many of which had constantly updating data.  Are we starting to see something interesting yet?

Evidently, my dear Watson, even though I’m idle, my applications are not. My applications are still active. My applications are still using CPU resources. My applications are still storing information to disk. My operating system is also still running processes (services, event logging, etc.), which have an impact on resources as well.

This is an important realization: your users still consume a fair amount of resources while at lunch or in meetings. So when trying to calculate the number of IOPS a single hypervisor server will generate, you need to determine what percentage of desktops on that server are in each stage:

  • Bootup
  • Logon
  • Working
  • Idle
  • Logoff

By calculating this out, you will get a better idea of what your storage must be capable of supporting.
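As a back-of-the-envelope illustration, the stage-weighted calculation can be sketched as follows. Only the working and idle write-IOPS values echo the perfmon observations above; every other per-stage figure is an assumption for illustration, so substitute measurements from your own environment:

```python
# A rough sizing sketch based on the stages above. All per-stage IOPS values
# are assumptions for illustration; only the "working" and "idle" write-IOPS
# numbers come from the perfmon observations in this post. Measure your own
# environment before sizing storage.

STAGE_IOPS = {
    "bootup": 80,   # assumption: boot storms are brief but intense
    "logon": 40,    # assumption
    "working": 8,   # ~8 write IOPS observed while actively working
    "idle": 4,      # ~4 write IOPS observed while idle
    "logoff": 25,   # assumption
}

def estimate_host_iops(total_desktops, stage_fractions):
    """Estimate aggregate IOPS for one hypervisor host, given the fraction
    of its desktops currently in each stage."""
    if abs(sum(stage_fractions.values()) - 1.0) > 1e-6:
        raise ValueError("stage fractions must sum to 1.0")
    return sum(total_desktops * fraction * STAGE_IOPS[stage]
               for stage, fraction in stage_fractions.items())

# Midday mix: most users working, a third idle (lunch/meetings), a few logons.
midday = {"bootup": 0.0, "logon": 0.05, "working": 0.60, "idle": 0.35, "logoff": 0.0}
iops = estimate_host_iops(100, midday)  # roughly 820 IOPS with these assumptions
```

Note how much the idle population still contributes: even at lunchtime, those desktops are far from free.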

Now if I could only get my boss to believe that I’m working while I’m sleeping, we would be golden.

Desktop Virtualization… The Right Way (Migration)

We’ve followed all of the best practices, completed a proper analysis and design, and are ready to start moving users to their brand new virtual desktops. But not so fast. We need to make sure we have the proper plan in place, or else we will end up with incorrect applications, confused users, or lost files. A migration plan must be put into place that provides the following for the users.

  • Personalization Synchronization: A user’s traditional desktop has become polluted with numerous application installs, updates, and patches. Blindly transferring these settings into the new environment will have unforeseen results. An alternate approach is to start with a clean environment for every user and only then migrate the required settings (for example, Outlook signatures, browser favorites, etc.). By identifying the settings beforehand, the migration process can be validated and the settings can be tested to make sure they remain compatible with the new system. By the way, there are solutions out there (like AppSense) that can help simplify this aspect of the migration.
  • Data Synchronization: In addition to the user’s settings, any data resident on the local desktop must be transferred to a location accessible by the virtual desktop, preferably a network share. Although users have the tendency to store data anywhere on their desktop, analysis should quickly identify the two likely spots: My Documents or a folder on the local drive. When a user is ready for migration to the virtual desktop, the data is moved to the network share. Once the move is complete, the user should start working from the virtual desktop immediately. Even though these locations often remain accessible from the traditional desktop, continuing to work there would typically result in poorer performance, slow file access, and potential contention issues. Because of this, it is advisable to stay within the virtual desktop once the user has been migrated.
  • End User Support: A migration is going to have an impact on the users. Acknowledging this fact allows a proper support structure to be put into place beforehand. The support team should be prepared for the typical issues encountered during a migration. The common questions and issues must be documented and communicated to all users who are about to undergo migration, and these materials should be kept in an easy-to-find location. But these steps alone are not enough. During the first week of migration, there will be a flood of user issues and questions. If a thorough User Acceptance Test has been completed, many of these challenges will already have been identified and a valuable FAQ created. The support team needs the tools and training to assist users in a timely manner, and it is much more effective if it can see both the user’s endpoint device and the user’s virtual desktop, which is possible with GoToAssist. This gives support full visibility into the user’s challenges.
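To make the data-synchronization step concrete, here is a minimal sketch that copies a user’s local document folder to a share location and verifies every file before cut-over. The function name, the paths, and the verification rule are all hypothetical assumptions for illustration; a real migration would typically use robocopy or a dedicated migration tool:

```python
# Hypothetical sketch: copy a user's local documents to a network share and
# verify each file byte-for-byte before the user is cut over to the virtual
# desktop. Real migrations would typically use robocopy or a migration tool.
import filecmp
import shutil
from pathlib import Path

def migrate_user_data(local_dir, share_dir):
    """Copy local_dir into share_dir, verify the copy, and return the file count."""
    src, dst = Path(local_dir), Path(share_dir)
    dst.mkdir(parents=True, exist_ok=True)
    for item in src.rglob("*"):
        target = dst / item.relative_to(src)
        if item.is_dir():
            target.mkdir(parents=True, exist_ok=True)
        else:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(item, target)  # copy2 preserves timestamps/metadata
            # Never consider the local copy retired until the transfer verifies.
            if not filecmp.cmp(item, target, shallow=False):
                raise IOError(f"verification failed for {item}")
    return sum(1 for p in dst.rglob("*") if p.is_file())
```

The verification pass matters: the rule above that users must move to the virtual desktop immediately only works if the migrated copy is known to be complete.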

The point to remember is that the migration plan must not be set in stone. As the first users are migrated, gaps in the process will come to light. The process must be flexible to accommodate unforeseen challenges along the way. Changes to the process must be communicated and followed by the rollout team. And finally, once a user’s data/settings are migrated, they MUST move to the virtual desktop. If not, expect data/personalization discrepancies between the physical and virtual desktop worlds.


Desktop Virtualization… The Right Way (User Experience)

The user discussion doesn’t end with an understanding of the topology. Desktop virtualization architecture will only get a user so far: access to a virtualized desktop. If the virtualized desktop does not provide the required experience in different scenarios, users will find ways of reverting to their traditional model, or find a way to make life very difficult for you, the architect of this less-than-stellar solution.

Winning these users back is challenging, as the bad perceptions must be changed, and that takes time. Many of the missteps with regard to the user experience stem from improper analysis and planning. In order to have an environment that is aligned with the user community, understanding the following items is critical.

  • Network Impact: Desktop virtualization requires a network connection, either temporary or permanent depending on the virtual desktop model selected. Understanding the network impact is not a trivial task and will never yield exact numbers, because users do different things: typing, printing, browsing, Flash video, WMV video, online Facebook games, etc. However, the Performance Assessment and Bandwidth Analysis white paper should help you understand the impact of each activity and allow an architect to plan appropriately.
  • Peripherals: One of the beauties of a traditional desktop is it is customizable with peripherals: printers, scanners, webcams, and external drives. These requirements must be understood and supported, but not at the expense of security. For example, should users be able to copy data from the data center to a personal USB storage drive? This might be construed as a security hole. What about allowing a user to copy a file from the USB drive to the data center? This might put the data center at risk for viruses or malware. The justification for certain devices must be determined, but regardless of the outcome, proper security procedures must be put into place.
  • Resources: Users who are not given the proper amount of dedicated resources (CPU and memory) are left with either a desktop experience that is unusable due to constant delays and sluggish responses from competing resource requests, or a desktop with ample resources that costs the business significant amounts of money in unused, idle hardware. Although it is easier to allocate one resource configuration for every user, users have different requirements and should be given different configurations. It is usually better to create three or four resource configurations, for example for Light, Normal, and Power users. With proper analysis of the requirements, users can be placed into one of these few defined configurations.
  • Mobility: A user’s requirement for offline mobility plays an important part in the overall analysis. This one requirement significantly limits the possibilities for the user with respect to the most appropriate FlexCast model. Many desktop virtualization models require an active network connection, which is not guaranteed for the mobile user. Identifying this group of users allows for the design of an offline model of desktop virtualization.
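The tiered resource configurations mentioned above lend themselves to a small sizing sketch. The tier names follow the post, but the vCPU and memory allocations and the hypervisor overhead below are assumptions for illustration only, not vendor guidance:

```python
# Illustrative resource tiers; the allocations and overhead figure are
# assumptions, not official sizing guidance. Adjust to your assessment data.

CONFIGS = {
    "light":  {"vcpus": 1, "ram_gb": 1.5},
    "normal": {"vcpus": 1, "ram_gb": 2.0},
    "power":  {"vcpus": 2, "ram_gb": 4.0},
}

def host_memory_needed(user_counts, overhead_gb=4.0):
    """Total RAM (GB) one host needs for a mix of users, plus hypervisor overhead."""
    return overhead_gb + sum(CONFIGS[tier]["ram_gb"] * count
                             for tier, count in user_counts.items())

mix = {"light": 40, "normal": 50, "power": 10}
ram = host_memory_needed(mix)  # 204.0 GB with these assumed allocations
```

Running the same arithmetic with a one-size-fits-all "power" configuration for all 100 users shows why tiering pays: the flat allocation needs roughly twice the memory.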

These are some of the most important things to understand regarding the users and their experience expectations. If a user believes they are allowed to use a webcam within their virtual desktop and it does not work, that user now has a bad perception. The experience matters to the user, so it must matter to the architect.

Desktop Virtualization… The Right Way (User Topology)

One office with one type of desktop… Easy. Hundreds of offices with any type and age of desktops… Difficult but not impossible.

Most organizations find themselves in the difficult camp. A user’s desktop can be completely different (in terms of hardware, resources, applications, and configuration) from that of the person sitting next to them doing a similar job. As the environment includes users from different departments, in different offices, with different requirements, it becomes clear that understanding an organization’s user topology is critical before one can create a desktop virtualization solution.

In previous blogs, I’ve discussed how understanding the underlying standards, applications and storms plays an important role in creating a successful virtual desktop design. The fourth requirement is to understand the organization’s user topology. More specifically, one must get a grasp of the endpoints and user locations.

First, the endpoints. Most organizations follow a 3-5 year desktop refresh cycle, so at a minimum there will be one hardware configuration for each year of the cycle (in actuality, there will likely be many, many more configurations). Desktops that are less than 2-3 years old have hardware configurations that can easily support Windows 7 and the latest applications, and these newer desktops have more virtual desktop options than an endpoint that is 5+ years old. For example, newer desktops have the processing power to support the Local Streamed Desktop FlexCast model instead of the hosted VM-based VDI desktop model.

With Local Streamed Desktop, the desktop is still virtualized and centrally managed, the desktop still receives the virtualized applications, and the users still have their personalized settings applied. The difference is that instead of using resources on a physical server in the data center, the local desktop’s resources are used. Because local desktop resources are consumed, fewer data center servers are required to support the same number of users.

This is but one example of how understanding the endpoints helps determine the type of virtual desktop a user requires. However, knowing the endpoints is only one aspect of the user topology. The second aspect, the user’s location, also plays an important role in selecting the most appropriate virtual desktop.

Certain desktops require a high-speed connection to the infrastructure while other options can allow slower networks with higher latency. By assessing the user locations and the connections to the data center, the proper solution can be put into place to support the virtual desktop FlexCast model.

  • Hosted shared desktop: Can be used on networks with low speeds and high latency
  • Hosted VM-based VDI desktop: Can be used on networks with low speeds and high latency
  • Hosted blade PCs: Can be used on networks with low speeds and high latency
  • Streamed local desktop: Requires a fast, low latency network to the physical desktop for optimal performance
  • Virtual Apps to Installed Desktops: Can be used on networks with low speed and high latency. If application streaming is used (as compared to hosted applications), slower networks will delay application startup time, but users have the ability to work disconnected.
  • Local VM-based desktop (not yet available): Can be used on networks with low speed and high latency, although the slower the network the longer it will take to sync the image to the endpoint. Images can be tens of GBs in size. But once delivered to the end point, all communication remains local to the desktop.
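The network tolerances above can be folded into a simple screening function. This is a sketch of the selection logic only; the model names follow the list, the grouping is an assumption drawn from it rather than an official compatibility matrix, and treating the local VM-based model as WAN-tolerant assumes the initial image sync can be scheduled off-hours:

```python
# Screening sketch based on the network tolerances listed above. The grouping
# is an assumption drawn from this list, not an official compatibility matrix.

WAN_TOLERANT = {
    "Hosted shared desktop",
    "Hosted VM-based VDI desktop",
    "Hosted blade PC",
    "Virtual apps to installed desktops",
    "Local VM-based desktop",  # initial image sync is slow over a WAN
}
LAN_ONLY = {"Streamed local desktop"}  # needs a fast, low-latency network

def candidate_models(fast_low_latency):
    """Return the FlexCast models worth shortlisting for a user location."""
    models = WAN_TOLERANT | LAN_ONLY if fast_low_latency else WAN_TOLERANT
    return sorted(models)
```

A screening pass like this is only the first cut; the endpoint's age and processing power then narrow the shortlist further, as described above.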

When deciding on the appropriate virtual desktop type, the endpoint and the user’s location matter. Without taking both into account, a user might end up with a fast virtual desktop that takes 5 minutes to start Microsoft Word. Gather all the information before deciding on your virtual desktop type.

Desktop Virtualization… The Right Way (Standards)

Total power often leads to corruption. No, I’m not talking about business or politics; I’m talking about desktops. Have you been in a meeting where people talk about giving users admin rights to workstations? I have two words for you… Be afraid… Be very afraid… OK, that was five words, but the point is clear. Be afraid.

Many of the challenges with the traditional, distributed desktop operating environment stem from the lack of standard definitions and enforcement. Most organizations strive for a secured and locked-down desktop environment, but over time users are granted exceptions. Throughout the months and years, those exceptions become the new de facto standard.

Now, users have local admin rights. Thousands of unique applications are installed throughout the organization. Every desktop configuration is unique. This is an almost impossible situation for any IT organization to support. This environment did not happen overnight; it took time. Standards slipped because it was simply easier and faster to circumvent them than to troubleshoot the issue. Because of the lack of standards, the environment is so convoluted and complex that it is excruciatingly difficult to make any changes or updates without causing mass confusion.

That being said, can these types of organizations still use desktop virtualization? Yes. And they will see many of the benefits of desktop virtualization that have been discussed over and over again. It will just be more difficult to achieve than for an organization that has desktop standards in place and actively followed.

Many organizations look at desktop virtualization as being the solution that simplifies the desktop operating environment. It is not a cure-all; desktop virtualization is an enabler.

If done to the fullest extent, desktop virtualization enables better IT management. It can allow an organization to discard the bad habits of the past and replace them with best practices that help IT survive and succeed within an increasingly complex computing environment. In order to simplify the management of the desktop, reduce desktop operating costs, and achieve desktop virtualization success, the organization must have alignment in terms of:

  • User rights: Users must have enough rights to do their jobs, but this does not mean users should be local administrators. IT must be able to provide users with the correct applications and resources when requested. If modifications are required, IT must be able to accommodate them in a reasonable amount of time. If IT is unable to meet the agreed-upon time frames, alternatives must be made available so users can continue to be productive. This might require an open, temporary virtual desktop playground area where users can run these applications until IT integrates them into the mix, as I discussed in a previous blog about a virtual desktop playground.
  • Applications: Allowing users to install their own applications into the corporate desktop image increases complexity and reduces the security of the system. IT has no visibility into the application and is unable to plan upgrades, updates, or hardware refreshes. These applications could open up holes in the infrastructure that others could exploit. The organization must gain control of its applications if it is going to become more flexible.
  • Operating Procedures: IT must deliver the resources users require in an adequate amount of time. This involves the development of new IT processes and ways of working. If a user requires an application, IT must find a way of either incorporating the application into the environment, or finding the user an acceptable alternative while working within the confines of the corporate standards.

Simply moving to desktop virtualization will help solve some of these challenges, but if you want to make a significant improvement in the way IT is seen within your organization, there must be a new approach. Without a clear definition of the operating standards, moving to a desktop virtualization solution will result in many of the same challenges observed with the traditional, distributed desktop operating model. Chaos. Except this time it will be virtual chaos.

Your primary desktop is a…

Fill in the blank if you will. There are many people who are super excited about the upcoming release of the latest tablet PCs (iPad, Slate, etc.). I recently received a comment from someone on Facebook related to a previous blog saying that the iPad Will Not Replace Your Desktop. The comment basically said:

“Does the iPad and like devices need to be fully functional to be successful? How many people have more than one mobile device, like a laptop and a netbook?”

That is an interesting question. But I’m starting to wonder whether we need both a laptop and an iPad. Do we need both a laptop and a netbook? Depending on what you do, the iPad or the netbook could potentially replace your laptop. As I see it, most users have a smartphone and a main work computer; for many, that is a laptop, because they require a larger form-factor device while not in their office. But what if we did the following:

  • Main computer: Thin client
  • Mobile computer: iPad/Netbook
  • Ultra-mobile computer: Smartphone

If we have Citrix Receiver on all of these devices, we access the same applications/data/environment.

Think about all of the problems we hear about with laptops: stolen, dropped, lost, expensive, etc. If we went down the virtual desktop route, stolen, broken, or lost laptops would not be a problem, because your data would be in the data center with your virtual desktop. So why use a laptop?

Is it possible that tablets and netbooks could mean that those of us with laptops can toss them away? If the tablet or netbook provides us with a connection to a virtual desktop from anywhere, why would we need the laptop’s functionality?

Of course, this won’t work for everyone; some people will need a laptop. But what we will see in the coming months and years is a much more diverse endpoint environment. We know this is coming, so it is a good idea to start planning how you will integrate all of these endpoints into your infrastructure while still keeping the environments secure.

My virtual desktop journey