
Troubleshooting fast growing transactional logs – Part 2


I am continuing from the previous post, "Troubleshooting fast growing transactional logs".

In my experience, most of the cases where the transactional log drives fill up quickly are caused by ActiveSync-enabled devices.

Microsoft has provided a very good script for finding the devices that are causing issues. Below is the link to that script, which I have combined with my own scripts; together they make it very easy to find the problematic devices/users.

http://blogs.technet.com/b/exchange/archive/2012/01/31/a-script-to-troubleshoot-issues-with-exchange-activesync.aspx
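
The report script drives Log Parser 2.2 under the hood, so before running anything it is worth checking that both Log Parser and the downloaded ActiveSyncReport.ps1 are where the scripts below expect them. A minimal sketch; the paths match the ones used later in this post, so adjust them for your own layout:

# Sanity check: are the two prerequisites in place?
# Paths are the ones assumed in this post - change them to match your setup.
$logParser = "C:\Program Files (x86)\Log Parser 2.2\LogParser.exe"
$easScript = "C:\Activesync\ActiveSyncReport.ps1"

if (-not (Test-Path $logParser)) { Write-Warning "Log Parser 2.2 not found at $logParser" }
if (-not (Test-Path $easScript)) { Write-Warning "ActiveSyncReport.ps1 not found at $easScript" }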

Here are the scripts I have used along with the one above.

1. Extract the data from all the CAS servers (this maps a drive to each CAS server, extracts the logs, and then removes the mapping).

You have to input the date for which you want to extract the data.

—————————————————————————————————————————–
Function Active ($CAS, $date, $hits) {
    # Map the CAS server's E$ share as V:, run the report against its IIS logs, then remove the mapping
    $net = New-Object -ComObject WScript.Network
    $net.MapNetworkDrive("V:", "\\$CAS\e$")

    .\ActiveSyncReport.ps1 -IISLog "V:\LogFiles\W3SVC1" -LogparserExec "C:\Program Files (x86)\Log Parser 2.2\LogParser.exe" -ActiveSyncOutputFolder C:\EASReports -Date $date -MinimumHits $hits -ActiveSyncOutputPrefix $CAS

    $net.RemoveNetworkDrive("V:")
}

$time = "03-08-2012"
$Shoot = "1500"

Active CASP01 $time $Shoot
Active CASP02 $time $Shoot
Active CASP03 $time $Shoot

————————————————————————————————————————————————–
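
One small refinement I would suggest for the function above: if ActiveSyncReport.ps1 fails part-way through, the V: mapping is left behind and the next run fails at MapNetworkDrive. A try/finally around the call (a sketch with the same parameters as above) makes the clean-up unconditional:

Function Active ($CAS, $date, $hits) {
    $net = New-Object -ComObject WScript.Network
    $net.MapNetworkDrive("V:", "\\$CAS\e$")
    try {
        .\ActiveSyncReport.ps1 -IISLog "V:\LogFiles\W3SVC1" -LogparserExec "C:\Program Files (x86)\Log Parser 2.2\LogParser.exe" -ActiveSyncOutputFolder C:\EASReports -Date $date -MinimumHits $hits -ActiveSyncOutputPrefix $CAS
    }
    finally {
        # Runs even if the report script throws, so V: never stays mapped
        $net.RemoveNetworkDrive("V:")
    }
}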

2. Now, if you know the database that is having issues:

You have to input the same date as in the above script, along with the CMS and database name.

$date = "03-08-2012"

$input1 = "C:\EASReports" + "\" + "EASyncOutputReport-casp01_" + $date + "_Minimum_Hits_of_1500.csv"
$input2 = "C:\EASReports" + "\" + "EASyncOutputReport-casp02_" + $date + "_Minimum_Hits_of_1500.csv"
$input3 = "C:\EASReports" + "\" + "EASyncOutputReport-casp03_" + $date + "_Minimum_Hits_of_1500.csv"

$server = "CMSP02"
$database = $server + "\SG11\DB11_AM"

# Report from CASP01: list the flagged users whose mailbox sits on the suspect database
$data = Import-Csv $input1
foreach ($i in $data)
{
    $mailbox = Get-Mailbox $i.user | Where-Object {$_.servername -eq $server -and $_.database -eq $database}
    $mailbox | Select-Object Name, ServerName
}

# Same check against the CASP02 report
$data = Import-Csv $input2
foreach ($i in $data)
{
    $mailbox = Get-Mailbox $i.user | Where-Object {$_.servername -eq $server -and $_.database -eq $database}
    $mailbox | Select-Object Name, ServerName
}

# Same check against the CASP03 report
$data = Import-Csv $input3
foreach ($i in $data)
{
    $mailbox = Get-Mailbox $i.user | Where-Object {$_.servername -eq $server -and $_.database -eq $database}
    $mailbox | Select-Object Name, ServerName
}
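
The three loops above only differ in the input file, so when I want a record of the matches I run a consolidated variant instead and keep the output in a CSV. A sketch using the same variables as above; C:\EASReports\ProblemUsers.csv is just an example output path:

# Consolidated variant: check every report file and keep the matches in a CSV
$problem = foreach ($file in $input1, $input2, $input3)
{
    foreach ($row in (Import-Csv $file))
    {
        Get-Mailbox $row.user | Where-Object {$_.servername -eq $server -and $_.database -eq $database}
    }
}

$problem | Select-Object Name, ServerName, Database | Export-Csv C:\EASReports\ProblemUsers.csv -NoTypeInformation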

Note: Put all the scripts in the same folder (I have put them in C:\Activesync). Change the scripts according to the environment you are supporting.
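
Once the problematic users are identified, it also helps to look at the partnerships on those mailboxes before deciding what to do with the device. Get-ActiveSyncDeviceStatistics gives that view; a sketch where "problem.user" is a placeholder for a name returned by the script above:

# List the ActiveSync devices partnered with a flagged mailbox
# ("problem.user" is a placeholder - use a name from the report output above)
Get-ActiveSyncDeviceStatistics -Mailbox "problem.user" | Select-Object DeviceType, DeviceModel, DeviceUserAgent, LastSuccessSync | Format-Table -AutoSize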


