Category: vSphere

  • VMware VCP 4 – Additional Study Material

For those of you wanting to study for and pass the VCP 410 exam, you need to know this isn’t something that’s easily done. I have heard it mentioned that VMware expect 50% of people taking the exam for the first time to fail it, although I haven’t managed to verify that figure myself. I do know that VMware are very keen to ensure that those holding the VCP have earned it the right way, and I believe they have it right in requiring people to attend an accredited course. Of course, the course on its own only gets you so far, and that’s hopefully where this post can help.

Passing the VCP 410 exam has been on my agenda for a while. Having got over the main hurdle of the required course (I sat the VMware ICM course at the 360GSP VMware Academy), I next had to ensure I had enough in the way of additional material.

For my study prep I have been using the following books, in both physical and Kindle editions (I own some of these books twice, once in each format).

Mastering VMware vSphere 4
VCP Study Guide
VCP 4 Review Guide
vSphere 4 Implementation
Maximum vSphere
HA & DRS
VMware vSphere Design
VMware vSphere 4 Admin
vSphere for Dummies

Each book gets dipped into when I need to read up on something, be it home lab environments in Maximum vSphere or some light-hearted banter in the vSphere for Dummies book.

    (more…)

  • vSphere Remote Management Applications for the iPad and iPhone

As most of you are aware, VMware released the vSphere Client for the iPad back in March. The client itself is fairly limited in what it offers, but as a freebie it isn’t too bad. I wanted to see, however, how it compared to other vSphere apps for the iPad\iPhone.

    With that in mind I downloaded or purchased three vSphere Remote Management Applications.

    1. VMware vSphere Client for iPad

    2. Nym Networks iDatacenter

    3. Project Eureka LLC’s iVMControl

    (more…)

  • Free utilities for VMware administration

Having just finished my VMware Install, Configure and Manage (ICM) course, I wanted to offer my classmates some advice on free utilities to make their lives easier when it came to managing their vSphere environments.

As the class are aware of my blog, I have decided to mention the utilities here as well, so that they can grab the download links for themselves and so that anyone else who is interested can do the same.

There are a number of decent free utilities out there that people aren’t aware of unless they actually go out and look for them. Below are some of the utilities I use and recommend to others; the list is by no means complete and will be updated over time.

    (more…)

  • Renaming a datastore fails with “The name ‘datastore_name’ already exists”

In my recent experiments with various NAS\SAN solutions I came across an issue where my usual name for my iSCSI datastore suddenly failed when I tried to add it into my ESXi environment.

Having decided to do some more SAN\NAS testing using several different solutions, I made the decision to power down all of my VMs so that I was only using the single Iometer VM. At the same time I also decided to power down two of my ESXi servers so that I was using similar resources across all tested platforms. With the ESXi servers powered down I then had to move all of the VMs to different storage, so using Veeam’s FastSCP I copied the VMs across from my existing Openfiler environment to my main IX4 rather than my ESXi-dedicated one.

    (more…)

  • Active Directory Web Services Event Log Errors – vCenter Server (4.1u1)

I am going through a project at home to centralise my ESXi and Windows event logs using Splunk and Snare, and part of that process was going through my event logs to try and fix any niggling issues.

Going through my vCenter event logs I discovered some errors in the ADWS logs, so I did a Google search and came across a post from Gregg Robertson over on his site that resolved the issue.

    (more…)

  • Veeam FastSCP Windows 64bit Fix

Having installed Veeam FastSCP onto my Windows 7 Ultimate (64-bit) workstation, I was getting frustrated with the error message "Retrieving the COM class factory for component with CLSID {5F1555F0-0DBB-47F6-B10B-0AB0E1C1D8CE} failed due to the following error: 800700c1". A quick Google search found the answer for me.

Looking into the fix I found here, I realised that it’s a tad long-winded and bloated (no disrespect intended, as it is the fix). So I asked myself whether I actually needed all of that for the fix to work. The answer is no.

The required step is running corflags "C:\Program Files (x86)\Veeam\Veeam Backup and FastSCP\VeeamShell.exe" /32BIT+, which resolves the issue where FastSCP on any 64-bit Windows OS throws the above error when trying to copy files between datastores. Rather than installing the entire SDK just for that, I decided to see what would happen if I simply took the corflags.exe file and ran it on its own; the result is that the program now works.
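If you need to apply the same fix on more than one machine, the whole thing can be scripted; the Python sketch below just wraps the corflags call. The paths are assumptions based on a default FastSCP install and on corflags.exe having been copied to a local tools folder, and it should be run from an elevated prompt.

```python
import subprocess
from pathlib import Path

# Assumed locations - adjust to wherever corflags.exe was copied and
# wherever FastSCP is installed on your machine.
CORFLAGS = Path(r"C:\Tools\corflags.exe")
VEEAM_SHELL = Path(r"C:\Program Files (x86)\Veeam\Veeam Backup and FastSCP\VeeamShell.exe")

def force_32bit(exe: Path) -> None:
    """Mark a .NET executable to always run as a 32-bit process (corflags /32BIT+)."""
    result = subprocess.run(
        [str(CORFLAGS), str(exe), "/32BIT+"],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    result.check_returncode()  # raise if corflags reported an error

if __name__ == "__main__":
    force_32bit(VEEAM_SHELL)
```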

    (more…)

  • Hardware Status not displaying on vSphere Client – Fix

Today I reinstalled my vCenter server so that I could manage both of my vSphere hosts centrally. The installation itself went as smoothly as possible, but there was one small blip.

When connecting to the vCenter server I discovered that I couldn’t browse to the Hardware Status page, but if I connected directly to the host with the vSphere Client I had no issues.

A Google search later I found that this had been an issue back in 2009, but I hadn’t really found much since. The methods to fix it differed slightly for me, probably down to the fact that I had installed vCenter onto a Windows Server 2008 R2 SP1 installation.

    To fix the issue on a 2008 R2 installation do the following.

    1. On the vCenter server go to Start – Run – ADSI Edit

2. Select Connect to… and ensure the following is set under the connection settings:

– Connection name: vCenter

– Connection Point: Distinguished Name is selected, with the name “dc=virtualcenter,dc=vmware,dc=int”

– Computer: Server name: localhost

– Port: 389

– Click OK to connect

3. Once in ADSI Edit, browse to the container that holds CN=VIMWEBSVC, then right-click CN=VIMWEBSVC and choose Properties.

4. Scroll down to vmw-vc-URL. You can see that this is populated with a DNS name; we are going to change it to an IP address.

5. Click OK and exit out of ADSI Edit.
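If you would rather script the change than click through ADSI Edit, the same attribute can be modified over LDAP against the local instance on port 389. The sketch below uses the Python ldap3 library; the DN and the new URL are placeholders, so check the actual location of CN=VIMWEBSVC and the existing vmw-vc-URL value in ADSI Edit first.

```python
from ldap3 import Server, Connection, MODIFY_REPLACE

# Placeholders - confirm the real DN of CN=VIMWEBSVC and the current
# vmw-vc-URL value in ADSI Edit before running anything like this.
VIMWEBSVC_DN = "cn=VIMWEBSVC,dc=virtualcenter,dc=vmware,dc=int"  # assumed location
NEW_URL = "https://192.168.0.10/sdk"  # keep the existing URL format, only swap the hostname for the IP

server = Server("localhost", port=389)
# Run locally on the vCenter server; add user/password if an anonymous bind is refused.
conn = Connection(server, auto_bind=True)

# Read the current value first so there is something to roll back to.
conn.search(VIMWEBSVC_DN, "(objectClass=*)", attributes=["vmw-vc-URL"])
print("current:", conn.entries)

# Replace the DNS-based URL with the IP-based one.
conn.modify(VIMWEBSVC_DN, {"vmw-vc-URL": [(MODIFY_REPLACE, [NEW_URL])]})
print("modify result:", conn.result)

conn.unbind()
```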

At this stage I then launched services.msc and restarted the VMware VirtualCenter Server and VMware VirtualCenter Management Webservices services.

However, when trying to restart the VMware VirtualCenter Server service (which requires a restart of the Webservices service anyway), the VMware VirtualCenter Management Webservices service wouldn’t restart; at this point a simple reboot resolved the issue.

Once the vCenter server had restarted (and I had reconnected via the vSphere Client) I went back to the Hardware Status page, which was now displaying correctly.

    All fixed 🙂

  • Iomega IX4 v Openfiler Performance Testing

Running my own home-based lab, I had struggled to work out which storage solution was going to be the best for me. I had multiple choices for the type of storage I could use, as I own the following storage-enabled\capable hardware: a Buffalo TeraStation Pro 2, an Iomega IX4-200d 2TB and an HP MicroServer running Openfiler 2.3.

Over the last couple of weeks I have been carrying out various tests to see which device I would be using as my NAS\SAN solution and which device would end up being the location for my Veeam backups.

All three devices run software RAID (although I am about to try and fit an IBM M1015 SAS\SATA controller into my HP MicroServer, with the Advanced Feature Key to allow RAID 5 and 50), so both the Iomega and the HP were similar where RAID types were concerned. The TeraStation is an already operational device with existing data on it and could only be tested using NFS; it was never really in contention as a SAN\NAS device for ESXi.

What I wasn’t sure about was whether I would be better off using RAID 0, 5 or 10 (obviously I am aware of the resilience issues with RAID 0, but I do have to consider its performance, as I want to run a small VMware View lab here as well). On top of the RAID decision, there was also the question of whether to go down the iSCSI or the NFS route.

    Having read a number of informative blog and forum posts I knew that to satisfy my own thirst for knowledge I was going to have to perform my own lab testing.

    Lab Setup

OS TYPE: Windows XP SP3 VM on ESXi 4.1 using a 40GB thick-provisioned disk
CPU Count \ RAM: 1 vCPU, 512MB RAM
ESXi HOST: Lenovo TS200, 16GB RAM, 1 x Xeon X3440 @ 2.5GHz (a single ESXi 4.1 host with a single running Iometer VM was used during testing).

    STORAGE TYPE

Iomega IX4-200d 2TB NAS, 4 x 500GB: JBOD – iSCSI, JBOD – NFS, RAID 10 – iSCSI, RAID 10 – NFS, RAID 5 – iSCSI and finally RAID 5 – NFS ** Software RAID only **

Buffalo TeraStation Pro 2, 4 x 1500GB: RAID 5 – NFS (this is an existing storage device with existing data on it, so I could only test with NFS and the existing RAID set; the device isn’t iSCSI enabled).

HP MicroServer, 2GB RAM, 4 x 1500GB drives plus the original server’s 160GB disk for the Openfiler OS install: RAID 5 – iSCSI, RAID 5 – NFS, RAID 10 – iSCSI, RAID 10 – NFS, RAID 0 – iSCSI and finally RAID 0 – NFS.

    Storage Hardware: Software based iSCSI and NFS.

Networking: Netgear GS724T 24-port gigabit Ethernet switch

    Iometer Test Script

    To allow for consistent results throughout the testing, the following test criteria were followed:

1. A single Windows XP SP3 VM running Iometer was used to measure performance across the three platforms.

2. I utilised the Iometer script that can be found via the VMTN Storage Performance thread here; the test script itself was downloaded from here.

The Iometer script tests the following:

    TEST NAME: Max Throughput-100%Read

    size,% of size,% reads,% random,delay,burst,align,reply

    32768,100,100,0,0,1,0,0

    TEST NAME: RealLife-60%Rand-65%Read

    size,% of size,% reads,% random,delay,burst,align,reply

    8192,100,65,60,0,1,0,0

    TEST NAME: Max Throughput-50%Read

    size,% of size,% reads,% random,delay,burst,align,reply

    32768,100,50,0,0,1,0,0

    TEST NAME: Random-8k-70%Read

    size,% of size,% reads,% random,delay,burst,align,reply

    8192,100,70,100,0,1,0,0

    Two runs for each configuration were performed to consolidate results.
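Consolidating here just means combining the two runs for each configuration. A quick Python sketch of averaging the headline Iometer counters (IOPS, MB/s and average response time) per test is below; the figures are placeholders rather than results from the actual tests.

```python
# Consolidate two Iometer runs per configuration by averaging the headline
# counters for each test. The figures below are placeholders only.
runs = {
    "Max Throughput-100%Read": [
        {"iops": 2510.0, "mbps": 78.4, "avg_resp_ms": 23.8},
        {"iops": 2475.0, "mbps": 77.3, "avg_resp_ms": 24.2},
    ],
    "RealLife-60%Rand-65%Read": [
        {"iops": 410.0, "mbps": 3.2, "avg_resp_ms": 145.0},
        {"iops": 398.0, "mbps": 3.1, "avg_resp_ms": 149.5},
    ],
}

def consolidate(samples):
    """Average each counter across the runs for one test."""
    keys = samples[0].keys()
    return {k: round(sum(s[k] for s in samples) / len(samples), 1) for k in keys}

for test, samples in runs.items():
    print(test, consolidate(samples))
```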

    Lab Results

After a long week or so (not only did I have to test each device twice, I also had to move the VM between devices, which took up time) I came up with the following results.

    Iomega IX4-200D Results

    Openfiler 2.3 Results

    TeraStation Pro II Results

    Conclusions

Having looked at the results, the overall position is clear: the Iomega IX4-200D is going to be my Veeam backup destination, whilst my HP MicroServer is going to be my centralised storage host for ESXi. I now have to decide whether to go for the RAID 0 or RAID 10 iSCSI approach, as they offer the best performance; at this stage I am tempted to go for RAID 10, however, because the disks in the server aren’t new. Over the next few months I will see how reliable the solution is and take it from there.

One thing I can add, however, is that over the next couple of days I will be attempting to fit my M1015 RAID controller and seeing how that performs; once it is fitted I will redo the Openfiler tests and post an update.

  • Openfiler (running on HP Microserver) or IX4 as my iSCSI Share?

So I have to make a decision. I am unsure of the speed of the IX4, and on top of that I am only using the 2TB version for iSCSI.

My MicroServer has been running Openfiler for the last couple of weeks (nothing configured, just installed). Like the IX4 it’s running software RAID; I have 5 SATA drives in the MicroServer, 4 of which (4 x 1500GB) are soft-RAIDed as RAID 5.

    I am considering using the IX4 as my backup target for Veeam Backup instead.

People’s thoughts?

  • My HomeLab – Setup Part 1

It’s begun: over the weekend I started to put together my-homelab.

In order to build up the environment detailed on my Home Labs page, I used 3 of my Lenovo TS200 servers: 2 of them with the Xeon X3440 processor (ESXi) and 1 with the Xeon X3460 processor (Hyper-V). These are all quad-core, eight-threaded, single-processor tower servers, into which I’ve installed 16GB RAM for each ESXi host and 8GB RAM for the Hyper-V host. Only the Hyper-V server has disks installed: 4 x 750GB SATA drives in a hardware RAID 5 setup (courtesy of the M1015 and Advanced Feature Key) plus an additional 250GB OS disk. The remaining 2 TS200s run ESXi from the internal USB slot.

    Building the Environment

Over the weekend I finally had everything I needed to put my environment together. I wired up, plugged in and powered up a total of 9 devices that will be used in my home lab:

    3 Lenovo TS200 Servers
    1 Iomega IX4-200d 2TB NAS
    1 HP 8 Port KVM
    1 Netgear GS724T Switch
    1 HP 19in Monitor
    1 Management PC (QX9650 based gaming rig that’s been retired for 6 months)
    1 HP MicroServer

Using the instructions found in the article “Installing ESXi 4.1 to USB Flash Drive”, I pre-provisioned my 2GB Cruzer Blade USB keys with ESXi and installed them straight into the servers (you have to love VMware Player).

An additional step in configuring the environment was to ensure that the IP addressing was logical; because I will be running the entire server infrastructure on my home network, I needed to make sure that I didn’t run out of network addresses (or, more importantly, use DHCP addresses in the server pool).

I have configured the network as follows:

192.168.x.2 – 192.168.x.99 – Server and networking infrastructure
192.168.x.100 – 192.168.x.150 – Workstations (DHCP)
172.16.x.10 – 172.16.x.20 – iSCSI traffic (and management PC)
10.x.x.10 – 10.x.x.20 – vMotion traffic

The 172.16.x.x and 10.x.x.x networks are going to be VLANed off to isolate their traffic from the rest of the network.
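As a quick sanity check on the addressing plan, a short Python sketch like the one below can tell you which range a given address falls into. The concrete subnets are examples only, since the real third octets are masked above; substitute your own.

```python
import ipaddress

# Example ranges only - substitute your own octets for the masked ones above.
LAB_RANGES = {
    "infrastructure": (ipaddress.ip_address("192.168.1.2"), ipaddress.ip_address("192.168.1.99")),
    "workstations":   (ipaddress.ip_address("192.168.1.100"), ipaddress.ip_address("192.168.1.150")),
    "iscsi":          (ipaddress.ip_address("172.16.1.10"), ipaddress.ip_address("172.16.1.20")),
    "vmotion":        (ipaddress.ip_address("10.0.0.10"), ipaddress.ip_address("10.0.0.20")),
}

def role_of(ip: str) -> str:
    """Return which lab range an address falls into, or 'unassigned'."""
    addr = ipaddress.ip_address(ip)
    for role, (low, high) in LAB_RANGES.items():
        if low <= addr <= high:
            return role
    return "unassigned"

print(role_of("192.168.1.10"))  # infrastructure
print(role_of("172.16.1.11"))   # iscsi
print(role_of("10.0.0.15"))     # vmotion
```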

    Building the Storage

Due to my failure to increase the disk capacity of my Iomega IX4-200d unit, I have had to throw in an additional storage device, so I have changed the role my MicroServer was going to play (it was going to be a backup server running Microsoft DPM). With that in mind I have installed Openfiler on the MicroServer; a nice and easy installation compared to NexentaStor, which failed to install due to my lack of a CD drive (I am using the 5th SATA port for another drive).

Both NAS devices will be configured for iSCSI services and will be connected to both ESXi servers.

I’ll point to the excellent post on the TechHead site on configuring Openfiler for use with vSphere.

The specifics for the lab are that both the Openfiler and IX4-200d devices will be connected to the storage LAN (172.16.x.x) and not the main VM LAN. The IX4 will be used for the VMs, whilst the Openfiler storage will be used for the VM backups that I’ll be doing later.

The Active Directory domain controller will also be installed directly onto the IX4, whilst the vCenter server will be installed onto the Hyper-V server (utilising DAS storage).

    Installing the Active Directory Domain Controller

VMware’s vCenter Server requires Windows Active Directory as a means of authentication, which means we need a domain controller for the lab. Steering clear of best practice (which calls for at least two domain controllers for resilience), I am just going to install one for the moment. I sized the DC VM to be fairly small: 1 vCPU, 512MB RAM, a 40GB hard disk (thin provisioned) and 1 vNIC connecting to the lab LAN.

    The setup was a standard Windows Server 2008 R2 install, followed by Windows Updates before running dcpromo.

    Host “WIN-DC01”
    Domain “MY-HOME.LAB”.
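As a quick post-dcpromo sanity check, something like the following Python sketch can confirm the new DC is answering on the usual ports before anything else gets pointed at it. The hostname is derived from the lab naming above, and the port list is deliberately minimal rather than a full AD port audit.

```python
import socket

DC_HOST = "WIN-DC01.my-home.lab"  # matches the lab naming above
PORTS = {53: "DNS", 88: "Kerberos", 389: "LDAP", 445: "SMB"}

def check_port(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for port, service in PORTS.items():
    state = "open" if check_port(DC_HOST, port) else "closed/unreachable"
    print(f"{service:<8} (tcp/{port}): {state}")
```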

    Next up is the vCenter server. We’ll continue with that journey soon.