Technical data
DATA CENTER and CAMPUS NETWORKS DEPLOYMENT GUIDE
Deploying Brocade Networks with Microsoft Lync Server 2010 36 of 52
Headquarters
All the main Microsoft Lync Server 2010 servers were located at the headquarters, which included two Front End
Servers, two Director Servers, one Monitoring Server, and two Edge Servers. The two Edge Servers were placed in
the DMZ to service external clients connecting to the environment.
Microsoft core services, such as Active Directory and SQL Server, were also deployed at the headquarters. SQL
Server 2008 was configured using Microsoft Clustering Services to provide fault tolerance. The SQL Server 2008
database resided on an enterprise-class HP storage system. The SQL Server 2008 servers used Brocade Fibre
Channel (FC) Host Bus Adapters (HBAs) connected to a Brocade 5100 Switch. The infrastructure included Brocade
FastIron SX 800 Switches for Layer 2/3 services and a pair of Brocade ServerIron ADX 1000 switches providing
hardware-based load balancing for the Edge Servers, Front End Servers, and Director Servers.
ISP
The Brocade MLX was used to simulate an Internet Service Provider (ISP). A Brocade FastIron SX was configured for
OSPF to direct traffic to the appropriate site.
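As a sketch, enabling OSPF on a FastIron switch follows the general pattern below. The interface number, IP address, and area are illustrative placeholders, not values from the test bed:

```
router ospf
 area 0
!
interface ethernet 1/1
 ip address 192.168.10.1 255.255.255.0
 ip ospf area 0
```

With each site's routed interfaces placed in the appropriate area, OSPF advertises the site subnets so the simulated ISP can direct traffic to the correct destination.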
Branch Offices
Each branch office used a Brocade FCX switch for both Layer 2 and Layer 3 services. In addition, each site was
configured with a different latency based on its distance from the headquarters; the following link was used to
estimate the latencies between sites. A 15 percent packet loss between the San Jose and New York sites simulated
the extreme end of packet loss; in most cases, 5 percent is the maximum packet loss experienced with a reliable
service provider. Even with 15 percent packet loss, the quality of voice and High Definition video calls was not
affected. However, video is more susceptible to latency, which can cause the audio and video streams to drift out
of sync. In addition, SBAs were placed in Austin and San Francisco.
The following were the latencies and packet loss between the sites. Different packet losses were simulated to
observe the effect on calls.
• San Jose to New York: 50 ms latency with 0–15 percent packet loss
• San Jose to Austin: 25 ms latency with 0–15 percent packet loss
• San Jose to Seattle: 5 ms latency with 0–15 percent packet loss
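The guide does not state which impairment tool was used, but latency and loss figures like those above can be reproduced on a Linux host acting as a WAN emulator using `tc` with the netem qdisc (`eth0` is a placeholder interface name; the commands require root privileges):

```shell
# San Jose <-> New York link: 50 ms delay with 15% packet loss
tc qdisc add dev eth0 root netem delay 50ms loss 15%

# Step the loss down without recreating the qdisc, e.g. to 5%
tc qdisc change dev eth0 root netem delay 50ms loss 5%

# Remove the impairment when testing is complete
tc qdisc del dev eth0 root
```

Varying only the `loss` parameter while holding `delay` constant makes it straightforward to observe the effect of each packet-loss level on call quality, as was done in this test.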
Hardware and Equipment
Server Roles
• FE-1: Front-End Microsoft Lync Server 2010 server
• FE-2: Front-End Microsoft Lync Server 2010 server
• Dir-1: Director Microsoft Lync Server 2010 server
• Dir-2: Director Microsoft Lync Server 2010 server
• Edge-1: Edge Microsoft Lync Server 2010 server
• Edge-2: Edge Microsoft Lync Server 2010 server
• Mon-1: Monitoring Microsoft Lync Server 2010 server
• SQL-1: Back End SQL Server
• SQL-2: Back End SQL Server
• DC1: Domain Controller
• AUSBOA: Microsoft Lync Server 2010 SBA
• SFBOA: Microsoft Lync Server 2010 SBA
Software Requirements
• All servers: Microsoft Windows Server 2008 R2 operating system
• Microsoft SQL Server 2008: SQL Server instances, using Microsoft Clustering Services
• Microsoft Lync Server 2010 with the most recent patches at the time of writing