
Archive for the ‘Exchange Server’ Category

I’ve had my Lenovo Ideapad Yoga 13 for a little over a year now. Generally, I’m very happy with it.  It has two internal SSDs, 8GB RAM and an Intel Core i7 processor.  Windows 8.1 runs very nicely on it.  I use the Ideapad for my day-to-day work as well as running test labs in Hyper-V.  Memory is generally my main limitation with Hyper-V, but mostly I can starve the VMs of RAM as performance isn’t a key issue for me for demos and/or testing purposes.  [As an aside, Exchange 2013 is a complete resource hog and won’t run nicely unless you give each machine at least 4GB of RAM, which makes running a DAG near impossible for me.]  Recently, I noticed that my disk latency (average response time) on the SSD that I run the VMs from was really high (around 11,000ms).  OK, I was running 3 VMs simultaneously, but still!  So I downloaded AS SSD Benchmark to see how my SSD was performing.  The overall result was 438, which is not great compared with what others have posted online for the same SSD.
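If you want to watch the latency figure directly rather than relying on Task Manager, a quick way is to sample the built-in PhysicalDisk performance counters from Powershell (a sketch; the disk instance names on your machine will differ):

```powershell
# Sample average disk response time every 5 seconds, 10 samples in total.
# CookedValue is in seconds, so 0.011 equates to 11ms.
Get-Counter -Counter '\PhysicalDisk(*)\Avg. Disk sec/Transfer' `
    -SampleInterval 5 -MaxSamples 10 |
    ForEach-Object {
        foreach ($s in $_.CounterSamples) {
            '{0} : {1:N1} ms' -f $s.InstanceName, ($s.CookedValue * 1000)
        }
    }
```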

[Screenshot: AS SSD Benchmark result, Crucial M4-CT256M4SSD3, 15.07.2014 8:31 a.m.]

After some deep thinking (i.e. staring idly into space over a coffee), the idea struck me that BitLocker might be the culprit.  So I disabled BitLocker for that drive and tried again. The difference was significant (around 20%) without being remarkable.  Interestingly, the read times before and after were almost identical.  The write times were where the difference was appreciable.
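If you want to rule BitLocker in or out on your own machine, the BitLocker cmdlets that ship with Windows 8/8.1 make the before/after comparison easy (a sketch; “D:” is just a placeholder for whichever drive you benchmark):

```powershell
# Check the current encryption/protection state of each volume
Get-BitLockerVolume |
    Format-Table MountPoint, VolumeStatus, ProtectionStatus, EncryptionPercentage

# Fully decrypt the test drive. Note that merely suspending protection
# leaves the data encrypted, so it won't change the benchmark result.
# Decryption can take a long time on a large drive.
Disable-BitLocker -MountPoint "D:"
```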

[Screenshot: AS SSD Benchmark result after disabling BitLocker, Crucial M4-CT256M4SSD3, 15.07.2014 4:29 p.m.]

The disk is still performing slowly compared with others online.  I checked my other SSD (a Samsung) and it was also slow, so the conclusion I’ve reached is that there must be some other factor (controller?) causing the slowness.  It would be interesting to hear what others with Ideapads are seeing, or if you have any ideas on how to improve performance.  Windows 8.1 is apparently optimised for SSD use, so I haven’t found any silver bullet for speeding things up.

I helped a customer out with a problem recently.  They had been running Windows Server Backup on an Exchange 2013 CU3 server.  The backups had been configured as “VSS Full” and the scheduled jobs were showing as completing successfully.  The only issue was that the mailbox database transaction logs were not being deleted/removed by the backup job.  I had a close look at the configured backup options and they looked to be fine.

The only obvious sign that something was wrong (other than the fact that the logs were not being cleared) was that the backup status (Get-MailboxDatabase -Status | fl name, *backup) was not showing a date/time stamp next to LastFullBackup.  Instead the date/time stamp indicated that a copy backup had taken place (LastCopyBackup).
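For reference, the check looks something like this (the telltale symptom being that LastFullBackup stays empty while LastCopyBackup keeps updating):

```powershell
# Show the last backup timestamps for each mailbox database
Get-MailboxDatabase -Status |
    Format-List Name, LastFullBackup, LastCopyBackup, LastIncrementalBackup
```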

After a fair bit of research I found the following had worked for some other people in the same position:

  1. Open REGEDIT and browse to:

HKEY_LOCAL_MACHINE\Software\Microsoft\ExchangeServer\v15\Replay\Parameters

  2. Add a new DWORD value named EnableVSSWriter, and set its value to 0.
  3. Exit Registry Editor and then restart the Microsoft Exchange Replication service.

This worked for my customer too!  It doesn’t appear to be well documented, so I thought I would share it here.
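If you would rather not drive REGEDIT by hand, the same change can be scripted from an elevated Powershell prompt on the Exchange server:

```powershell
# Disable the Exchange Replication service's VSS writer
# (value 0 = disabled), per the registry steps above
$key = "HKLM:\Software\Microsoft\ExchangeServer\v15\Replay\Parameters"
New-ItemProperty -Path $key -Name EnableVSSWriter -Value 0 `
    -PropertyType DWord -Force

# Restart the Microsoft Exchange Replication service to pick up the change
Restart-Service MSExchangeRepl
```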

I had a requirement recently to try and find out where and when a whole lot of mailboxes were hidden from the GAL.  Yes, fingering some poor sucker for the blame is an immensely satisfying task, isn’t it?  I’ve found that an effective way to do this is to query the AD replication metadata for the attribute concerned (in this case ‘msexchhidefromaddresslists’).  The replication metadata will provide you with the date/time for when the attribute value was last changed as well as the name of the DC where the last change was made.  From there you can search the Security Event Log on the DC in question for the audit events corresponding to the change.  This of course assumes that you have Audit Directory Service Changes switched on.

Typically, I would use the excellent Repadmin.exe command line tool to query the replication metadata, e.g.:

Repadmin /showobjmeta MyDC1 "CN=MyUser,OU=User Accounts,DC=contoso,DC=com"

However, in this case someone had already reversed most of the changes (i.e. unhidden the mailboxes) and I needed to query a large number of objects to find some others that were still hidden, hoping that some of them would have a common date/time stamp.  For this Repadmin.exe would work, but would be hopelessly inefficient.  And what (I said to myself) is the best method for performing bulk operations such as this?  Yes, that’s right:  Powershell to the rescue!

After some Googling, I found an excellent code snippet from Powershell MVP Brandon Shell that hooks into the underlying .NET class to expose the replication metadata.  His is the clever bit (that’s why he’s paid the big bucks) – I’ve basically just done the wrapper to perform a bulk query and output the results to a CSV file.  Here’s the script for your enjoyment.

 

#########################################################
#
# Name: BulkReportReplicationMetadata.ps1
# Author: Tony Murray
# Version: 2.0
# Date: 27/03/2014
# Comment: PowerShell 2.0 script to find change times
# for an individual AD attribute using replication metadata
# 
# Some bits borrowed from: Get-ADObjectReplicationMetadata.ps1
# Brandon Shell (www.bsonposh.com)
#
#########################################################

# import the AD module
ipmo ActiveDirectory

# Define variables
$domain = (get-addomain).dnsroot # Use the current AD domain
$property = "msexchhidefromaddresslists" # This is the AD attribute we are interested in
$outfile = "c:\csv\outfile.csv" # CSV output file

# Blow away the existing file if it exists
if (test-path $outfile) {remove-item $outfile}

# We will build our own CSV rather than work with export-csv
$header = "samaccountname,modified,dc"
Add-Content -Value $header -Path $outfile

$sb = "OU=Standard User Accounts,DC=contoso,DC=com" # Search base for where our mailbox users live
$fl = "(&(homemdb=*)(msexchhidefromaddresslists=TRUE))" # LDAP filter to find our users
$users = Get-ADUser -LDAPFilter $fl -searchbase $sb

# Create the directory context and pick a DC once, outside the loop
# Sets context to Domain for System.DirectoryServices.ActiveDirectory.DomainController
$context = new-object System.DirectoryServices.ActiveDirectory.DirectoryContext("Domain",$domain)
# .NET class that returns a Domain Controller for the specified context
$dc = [System.DirectoryServices.ActiveDirectory.DomainController]::FindOne($context)

# Loop through our list of users
foreach ($user in $users) {

    $objectDN = $user.distinguishedname # used for finding the replication metadata
    $name = $user.samaccountname # Just for info
    # GetReplicationMetadata returns metadata from the DC for the DN specified
    $meta = $dc.GetReplicationMetadata($objectDN)
    # Get the last time the attribute value was changed
    $ctime = $meta | %{$_.$property.LastOriginatingChangeTime}
    # Get the DC that the change was made on
    $dcon = $meta | %{$_.$property.OriginatingServer}
    # Build the values to write to the output file
    $line =  "`"$name`",`"$ctime`",`"$dcon`""
    # Write the line to the output file
    Add-Content -Value $line -Path $outfile

} # end foreach

The shrewd amongst you would ask why I didn’t query the Exchange (2010 in this case) audit log for this information. The answer is that I did, but couldn’t find the relevant audit entries. The Exchange audit events are only captured if the Exchange tools (EMC/EMS/ECP) were used to perform the change. In my case the changes had been made in bulk, probably using the AD cmdlets.
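For completeness, this is roughly how I went looking in the Exchange 2010 admin audit log (requires admin audit logging to be enabled, and it only catches changes made through the Exchange tools; the dates here are just placeholders):

```powershell
# Find Set-Mailbox calls that touched the hidden-from-GAL flag
Search-AdminAuditLog -Cmdlets Set-Mailbox `
    -Parameters HiddenFromAddressListsEnabled `
    -StartDate "01/02/2014" -EndDate "27/03/2014" |
    Format-List RunDate, Caller, CmdletName, ObjectModified
```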

Today’s post is another short one.  It’s a Powershell one-liner to find all the Primary SMTP address suffixes in use by the mailboxes in your Exchange Org.

In this example I know that my default suffix is “contoso.com”, but I want to find out what others are being used as primary:

get-mailbox -ResultSize unlimited | ? {$_.primarySMTPaddress -notmatch "@contoso\.com$" } `
| select @{l="SMTPSuffix";expression={$_.primarysmtpaddress.tostring().Split("@")[1]}} -Unique

The output (which will take a while as there is a lot of post-processing in the pipeline) looks like this:

SMTPSuffix
_______
fabrikam.com
northwindtraders.com
fourthcoffee.com

Enjoy!

 

Recently I was doing some testing with a new Exchange 2010 Receive Connector and wanted a method to check how many messages it was processing.  I came up with the following Powershell snippet that seems to work well.

$i = 0
do {
    $now = get-date
    (Get-MessageTrackingLog -ResultSize unlimited -Start "11/10/2012 3:00PM" -End $now -Server MYSERVER `
    | ? {$_.connectorid -eq "MYSERVER\SMTP Relay"}).count
    sleep 30
    $i = $i + 1
    $i
} 
until ($i -eq 100) 

The script uses the “do until” method to query the message tracking logs on a specific server at 30 second intervals for instances of the Receive Connector and displays the count.  It does this a hundred times (or until you stop the script).

Quest Software make it hard to love them sometimes.  When they made Quest Quick Connect Express for Active Directory available at no cost it was a real boon for anyone wanting to synchronise objects from AD to AD (or AD LDS instances).  In particular it offered a great free method of achieving GAL Sync between two Exchange Organisations, the likes of which have not been seen since the days of Microsoft’s Identity Integration Feature Pack (IIFP – a cut down version of MIIS/ILM/FIM). I thought this was smart, strategic thinking on Quest’s part: make the sync engine available with basic functionality to get everyone used to the product and then generate revenue through add-on licences for other data sources (generic LDAP, SQL, Oracle, etc.).  Sadly, the strategic approach seems to have been thrown out in the (mistaken) belief that charging for the AD connector will bring in more revenue.  Hopefully Dell (Quest’s new owner) will hear the howls of derision and bring back the free version.

Now that I’ve got that off my chest, what are the options left for (free) GAL Sync?  Well, if you have a copy of the Quest One Quick Connect Sync Engine version 4.7 or 5.0, you can still use these to achieve GAL Sync free of charge.  The current version of the Sync Engine (5.1) has had the AD DS/AD LDS connectors disabled, so if you download that you will need to purchase a Quest One Quick Connect Express for Active Directory licence to get the old functionality back.

It doesn’t look like version 5.0 of the Sync Engine is available on the Quest web site, but you can still download version 4.7.  To get there you need to register for the Quest One Quick Connect Express for AD trial version and you will then see the download options for the Sync Engine.  The Step-by-Step Guide that I originally wrote was for version 4.7 and is still available:

http://www.open-a-socket.com/index.php/2011/01/06/quest-activeroles-quick-connect-express-gal-sync-step-by-step-guide/

If you have version 5.0 downloaded somewhere, consider yourself lucky – and hold on to it!

If you are in a cross-forest mailbox migration scenario and use Exchange message classifications, this script might be useful to you.  If you plan to have the same message classifications in the target Exchange Organisation then you will want the classification IDs to match.  Without this, there is potential for the classification on migrated mail items not to be recognised.  For example, if you forward or reply to a migrated message and do not change the classification, then the classification will match the source Exchange Organisation and will not be recognised (even if the names match).  To avoid this scenario it is important to ensure that the classification IDs are the same in the source and target environment.

The first step is to export the message classifications in the source Exchange Organisation using the Export-OutlookClassification.ps1 in the \Scripts folder in the Exchange installation path.  This creates an XML file for you to use for the import.  Once you have the file, copy it to the target Exchange Organisation and run the script below.
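The export side is a one-liner in the source organisation’s Exchange Management Shell ($exscripts is the EMS variable that points at the \Scripts folder in the Exchange installation path):

```powershell
# Run in the SOURCE Exchange Organisation, from the Exchange Management Shell
cd $exscripts
.\Export-OutlookClassification.ps1 > C:\xml\Classifications_E2010.xml
```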

#########################################################
#
# Name: Import-MessageClassification.ps1
# Author: Tony Murray
# Version: 0.1
# Date: 17 May 2012
# Comment:
# PowerShell script to import Exchange 2010 message
# classifications created in one forest to another forest.
#
# Uses xml file created from the
# Export-OutlookClassification.ps1 script in the source
# forest
#
# This method preserves the ClassificationID, which can
# be beneficial in cross-forest migration scenarios
#
##########################################################

$classfile = "C:\xml\Classifications_E2010.xml"
[xml]$mcs = Get-Content $classfile

foreach ($mc in $mcs.classifications.classification) {
    $name = $mc.name
    $dname = $mc.description
    $clid = $mc.guid
    New-MessageClassification -Name $name -DisplayName $dname `
    -ClassificationID $clid -SenderDescription $name

    # Tidy up variables used in foreach loop
    Clear-Variable -ErrorAction SilentlyContinue -Name mc
    Clear-Variable -ErrorAction SilentlyContinue -Name name
    Clear-Variable -ErrorAction SilentlyContinue -Name dname
    Clear-Variable -ErrorAction SilentlyContinue -Name clid
} # end foreach

# Tidy up global variables
Clear-Variable -ErrorAction SilentlyContinue -Name classfile
Clear-Variable -ErrorAction SilentlyContinue -Name mcs

The other day I was doing some troubleshooting on a DAG member in a remote site.  I needed to get a picture of the copy and replay queues for the server over a period of time. To do this I wrote a small script to poll the queues at 60 second intervals over a 24 hour period.  The output is in CSV format to allow the results to be examined/graphed using Excel.  I thought it might be useful to others.

#########################################################
#
# Name: Get-QueueLength.ps1
# Author: Tony Murray
# Version: 1.0
# Date: 25/01/2012
# Comment: PowerShell script to output DAG database
# queue lengths to file
#
#########################################################

$outfile = "c:\QueueLength.csv"

$server = "MyExchangeServer"

$head = "Date,Time,Database,CopyQLength,ReplayQLength"

if (Test-Path $OutFile) {Remove-Item $outfile}

Add-Content -Value $head -Path $outfile

$i = 0
do {
    $dat = Get-Date -Format d
    $tim = get-date -Format HH:mm
    $stats = Get-MailboxDatabaseCopyStatus -Server $server
    foreach ($stat in $stats) {
        $dba = $stat.databasename
        $clen = $stat.CopyQueueLength
        $rlen = $stat.ReplayQueueLength
        $line = "$dat,$tim,$dba,$clen,$rlen"
        Write-Host $line
        Write-Host $i
        Add-Content -Value $line -Path $Outfile
    } # End foreach
    $i = $i + 1
    Start-Sleep -Seconds 60
    } # End of Do
While ($i -le 1439)

I came across an anomaly with the Exchange 2007 Export-Mailbox cmdlet at a customer site recently.  It created a major inconvenience for some bulk mailbox exports, so I thought I would share it here.  Basically, I was able to generate two different search results depending on whether or not I specified a PST file as target.  I’ve since managed to reproduce the behaviour in my own test lab, so the problem appears to be generic and not limited to the specific customer’s environment.

This is what my test environment looks like:

Mailbox Server = Windows Server 2008 SP2 with Exchange Server 2007 SP2 RU5

Export workstation = Windows 7 SP1 with Outlook 2010 and Exchange Server 2007 SP2 RU5 Management Tools

My goal was to export all items that contain the string [blah] (including the square brackets) to a target.  If I specify a PST file as the target then all items that contain the specified string are exported to the PST file as expected, e.g.

Export-Mailbox -Identity c781e3a3-1e08-40a7-abab-ba71b9dddc0b -AllContentKeywords "[blah]" `
-DeleteContent:$false -DeleteAssociatedMessages:$false -PSTFolderPath $pstpath -Confirm:$false

However if I specify a folder in another mailbox as the target and use the same search string then items matching [blah] are copied to the target as well as all items matching blah (i.e. without the square brackets), e.g.

Export-Mailbox -Identity c781e3a3-1e08-40a7-abab-ba71b9dddc0b -AllContentKeywords "[blah]" `
-TargetFolder "EM" -TargetMailbox d4aa986b-c33c-4a89-9e08-1a3ceb5c796e `
-DeleteContent:$false -DeleteAssociatedMessages:$false -Confirm:$false

As you can see, the string passed to the AllContentKeywords parameter is exactly the same in both examples, but the result is different. 

I haven’t yet found a reasonable explanation for why this is happening, but it seems that the search behaviour is different depending on whether or not the cmdlet includes the option to export to PST.  For example, a straight delete using Export-Mailbox (i.e. no target at all) will also match both [blah] and blah:

Export-Mailbox -Identity c781e3a3-1e08-40a7-abab-ba71b9dddc0b -AllContentKeywords "[blah]" `
-DeleteContent:$true -DeleteAssociatedMessages:$true -Confirm:$false

However combining the delete option with an export to PST will just match on [blah].

My guess is that the PST option somehow causes the cmdlet to use a different search method (or index?).  When the PST option isn’t used the cmdlet simply ignores the square brackets (and I guess any other special characters).  I haven’t yet found a way to escape the special characters to ensure they are included in all searches.  If anyone knows how to do this, please let me know.

If you’re looking for a Global Address List synchronisation solution for Exchange that simply uses Powershell, look no further than this excellent script from fellow MVP Carol Wapshere.

http://www.wapshere.com/missmiis/a-galsync-powershell-script

The script doesn’t leverage the DirSync control (and hence doesn’t use deltas), which means that it isn’t perhaps as efficient as some of the full-blown solutions out there, but it has the beauty of simplicity!  Another advantage is that it doesn’t require any expensive infrastructure components – unlike most solutions that need at least one dedicated server and a database.

It also works with a variety of Exchange versions!

The script is likely to be most useful for SMEs during migration scenarios. Larger organisations or those looking for something long-term are more likely to invest in a more comprehensive solution such as FIM, SimpleSync or Quest ActiveRoles QuickConnect.

I encourage you to check it out.