User guide

© 2011 Cisco Systems, Inc. All rights reserved. This document is Cisco Public Information. Cisco Validated Design Page 107
Software components
Cisco UCS firmware 1.3(1i)
XenServer 5.6, XenCenter 5.6
XenDesktop 4
Windows 7 32-bit, 1 vCPU, 1.5 GB of memory, 30 GB per virtual machine
6.4 Testing Methodology
All validation testing was conducted on-site in the Cisco labs with joint support from both Citrix and NetApp
resources. The testing focused on the entire virtual desktop lifecycle, capturing metrics during desktop boot-up,
user logon, user workload execution (also referred to as steady state), and user logoff for both the Hosted Shared
and Hosted VDI models. Test metrics were gathered from the hypervisor, virtual desktop, storage, and load
generation software to assess the overall success of an individual test cycle. A test cycle was considered passing
only if all metrics were within the permissible thresholds noted as success criteria. Tests were conducted a total of
three times for each hardware configuration, and results were found to be relatively consistent from one test to
the next.
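The pass/fail logic described above can be sketched as follows. This is a hypothetical illustration only: the metric names, threshold values, and sample run data are invented for the example and are not the actual success criteria from this test plan.

```python
# Hypothetical sketch of the per-cycle pass/fail check: a cycle passes
# only if every gathered metric is within its permissible threshold.
# Metric names and limits below are illustrative, not from the test plan.

SUCCESS_CRITERIA = {
    "hypervisor_cpu_pct": 90.0,   # max average hypervisor CPU, percent
    "logon_time_s": 30.0,         # max user logon time, seconds
    "storage_latency_ms": 20.0,   # max storage response time, ms
}

def cycle_passes(metrics: dict) -> bool:
    """A test cycle passes only if all metrics meet the success criteria."""
    return all(
        metrics.get(name, float("inf")) <= limit
        for name, limit in SUCCESS_CRITERIA.items()
    )

# Three runs per hardware configuration, as in the methodology.
runs = [
    {"hypervisor_cpu_pct": 85.2, "logon_time_s": 22.1, "storage_latency_ms": 12.4},
    {"hypervisor_cpu_pct": 86.0, "logon_time_s": 23.5, "storage_latency_ms": 13.1},
    {"hypervisor_cpu_pct": 84.7, "logon_time_s": 21.8, "storage_latency_ms": 12.9},
]
print(all(cycle_passes(r) for r in runs))  # True: all three cycles pass
```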
6.4.1 Load Generation
Within each test environment, load generators were used to place demand on the system, simulating multiple
users accessing the XenDesktop environment and executing a typical end-user workflow. To generate load within
the environment, an auxiliary software application was required to establish the end-user connection to the
XenDesktop environment, provide unique user credentials, initiate the workload, and evaluate the end-user
experience. Based on the environment design, different load generators were used for the Hosted VDI and
Hosted Shared environments.
In the Hosted VDI environment, an internal Citrix automated test tool was used to generate end-user connections
into the environment and record performance metrics through an agent running on the core XenDesktop
infrastructure components. In the Hosted Shared environment, the standard Login VSI launcher was used to
simulate multiple users making direct connections to the shared desktop of the XenApp servers via an ICA
connection.
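The launcher workflow described above — staggered logons with unique per-user credentials — can be sketched in outline. This is a minimal stand-in, not the actual Citrix test tool or Login VSI launcher API; the connection step is a stub, and the user-naming scheme and stagger interval are assumptions for illustration.

```python
# Illustrative sketch of a load generator launching sessions with unique
# credentials at a staggered interval. connect_session() is a stub; a real
# launcher would open an ICA connection and start the workload.
import threading
import time

def connect_session(username: str, results: dict) -> None:
    """Stand-in for an ICA/XenDesktop connection plus workload start."""
    results[username] = "connected"  # a real tool would record timings here

def launch_users(count: int, interval_s: float = 0.01) -> dict:
    results: dict = {}
    threads = []
    for i in range(count):
        t = threading.Thread(
            target=connect_session, args=(f"vsi_user{i:03d}", results)
        )
        threads.append(t)
        t.start()
        time.sleep(interval_s)  # stagger logons rather than connecting all at once
    for t in threads:
        t.join()
    return results

sessions = launch_users(5)
print(len(sessions))  # 5 unique user sessions connected
```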
6.4.2 User Workload Simulation: Login VSI from Login Consultants
One of the most critical factors in validating a XenDesktop deployment is identifying a real-world user workload
that is easy for customers to replicate and standardized across platforms, so that customers can realistically
reference it for a variety of worker tasks. To accurately represent a real-world user workload, third-party tools from
Login Consultants were used throughout the Hosted Shared and Hosted VDI testing. These tools have the added
benefit of measuring in-session response time, providing an objective way to gauge the expected user
experience for individual desktops throughout large-scale testing, including login storms.
The Login Virtual Session Indexer (Login Consultants VSI 2.1) methodology, designed for benchmarking Server
Based Computing (SBC) and Virtual Desktop Infrastructure (VDI) environments, is completely platform and
protocol independent and therefore allows customers to easily replicate the testing results in their own
environments. Login VSI calculates an index based on the number of simultaneous sessions that can be run on a
single machine.
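The idea behind such an index can be sketched as follows: ramp up sessions and record the highest count at which per-session response time stays acceptable. This is a toy model, not the actual Login VSI calculation; the threshold value and the response-time curve are invented for illustration.

```python
# Toy sketch of a session index: find the highest number of simultaneous
# sessions for which response time stays under an acceptable threshold.
# The threshold and the saturation model below are illustrative only.

THRESHOLD_MS = 2000  # hypothetical acceptable response time

def response_time_ms(active_sessions: int) -> float:
    """Toy model: response time grows as the host saturates."""
    return 500 + 15 * active_sessions ** 1.5

def max_supported_sessions(limit: int = 200) -> int:
    """Ramp sessions one at a time; stop at the first threshold breach."""
    supported = 0
    for n in range(1, limit + 1):
        if response_time_ms(n) <= THRESHOLD_MS:
            supported = n
        else:
            break
    return supported

print(max_supported_sessions())  # 21 with this toy model
```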
Login VSI simulates a medium-heavy workload user (intensive knowledge worker) running generic applications
such as Microsoft Office 2007, Internet Explorer (including Flash applets), and Adobe Acrobat Reader. (Note: For
the purposes of this test, applications were installed locally, not streamed or hosted on XenApp.) Like real users,
the scripted session leaves multiple applications open at the same time. Every session averages about 20%
minimal user activity, similar to real-world usage. Note that during each 12-minute loop, users open and close files
a couple of times per minute, which is probably more intensive than most users' behavior.
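The loop structure described above can be sketched schematically: a 12-minute cycle with a couple of file open/close actions per minute against concurrently open applications, with roughly 20% of each minute spent on active work and the rest as idle think time. The application list, action rate, and timings here are assumptions for illustration, not the actual Login VSI script.

```python
# Schematic sketch of the simulated 12-minute workload loop: a couple of
# file open/close actions per minute across concurrently open applications,
# with ~20% of each minute active and the remainder idle think time.
# All names and timings are illustrative, not the real Login VSI script.
import random

LOOP_MINUTES = 12
ACTIONS_PER_MINUTE = 2        # "a couple of times per minute"
ACTIVE_FRACTION = 0.20        # ~20% activity, ~80% think time

OPEN_APPS = ["Outlook", "Word", "Excel", "Internet Explorer", "Acrobat Reader"]

def run_loop(rng: random.Random) -> list:
    """Return (minute, application, active_seconds) events for one loop."""
    active_s_per_min = 60 * ACTIVE_FRACTION          # ~12 s of work per minute
    per_action_s = active_s_per_min / ACTIONS_PER_MINUTE
    events = []
    for minute in range(LOOP_MINUTES):
        for _ in range(ACTIONS_PER_MINUTE):
            app = rng.choice(OPEN_APPS)              # apps stay open concurrently
            events.append((minute, app, per_action_s))
        # the remaining ~48 s of each minute is idle "think time"
    return events

events = run_loop(random.Random(0))
print(len(events))  # 24 open/close actions over the 12-minute loop
```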
The following outlines the automated Login VSI simulated user workflows that were used for this validation testing: