My ULS Query

Anyone who has worked with SharePoint *should* be familiar with the ULS log, where SharePoint can log nearly every operation. Recent versions can also catalog a portion of those log files into a view called ULSTraceLog in the usage and health database, which makes searching much easier.

SharePoint logs these records in UTC.

However, sometimes you need to find the exact log file on the SharePoint server based on an event in the database, and depending on how your log files roll over, this can be a challenge.

Below is my work-in-progress query to convert the UTC time to local time and then further convert it to 12-hour time. It is written with CTEs so I could get more familiar with them; it seems to work fine, but I don't know the performance implications of chaining two CTEs:

WITH CTE_UlsLogFile
AS
(
    SELECT TOP 100
         [LogTime]
        -- Shift the stored UTC value to the server's current local offset
        ,CONVERT(datetime, SWITCHOFFSET(CONVERT(datetimeoffset, [ULSTraceLog].LogTime), DATENAME(TzOffset, SYSDATETIMEOFFSET()))) AS LogTimeLocal
        ,[MachineName]
        ,[ProcessName]
        ,[ThreadID]
        ,[Area]
        ,[Category]
        ,[Level]
        ,[EventID]
        ,[Message]
        ,[CorrelationId]
        ,[RowCreatedTime]
    FROM [DEV_SP_CFG_UsageAndHealth].[dbo].[ULSTraceLog]
    ORDER BY LogTime DESC
),
CTE_ULSLogFile_InLocal
AS
(
    SELECT
         *
        -- Build a 12-hour display string: date portion from style 20, hh:mi and AM/PM from style 9
        ,SUBSTRING(CONVERT(varchar(max), LogTimeLocal, 20), 0, 12)
            + SUBSTRING(CONVERT(varchar(20), [LogTimeLocal], 9), 13, 5)
            + ' '
            + SUBSTRING(CONVERT(varchar(30), [LogTimeLocal], 9), 25, 2) AS LogTimeLocal12Hour
    FROM CTE_UlsLogFile
)
SELECT * FROM CTE_ULSLogFile_InLocal
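
Once you have the local time window (and the CorrelationId) from the query above, something like the following PowerShell sketch, run on the SharePoint server itself, can pull the matching entries straight out of the ULS log files. The time window and correlation value here are placeholders, so swap in whatever the query returned:

# Minimal sketch, assuming it runs in the SharePoint Management Shell on a farm server.
# Replace the window and correlation ID with values returned by the query above.
$start = Get-Date "2013-07-04 05:45"
$end   = $start.AddMinutes(15)

Get-SPLogEvent -StartTime $start -EndTime $end |
    Where-Object { $_.Correlation -eq "00000000-0000-0000-0000-000000000000" } |
    Format-Table Timestamp, Area, Category, Level, Message -AutoSize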

 

Comments or thoughts please!


Report Builder 3.0–Application with the same identity already installed

Just a quick note: if you are trying to use Report Builder, deployed via ClickOnce, in multiple environments (dev, test, prod), you will have to clean the ClickOnce cache by running the following from a command line:

rundll32 dfshim CleanOnlineAppCache
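
If you bounce between environments often, a tiny PowerShell wrapper saves retyping it; the function name here is just my own convenience, and it simply invokes the same command shown above:

# Hypothetical convenience wrapper around the rundll32 command above.
function Clear-ClickOnceCache {
    & rundll32 dfshim CleanOnlineAppCache
}

Clear-ClickOnceCache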

 

Thanks to:

http://www.wolffhaven45.com/blog/sccm/report-builder-3-0-fails-to-install/

FarmCredentialManagement.aspx

While building out yet another SharePoint 2013 development environment, I decided to change a number of service accounts and delete the old ones from Active Directory. Everything seemed fine until I tried to go to the http://centralAdmin:5555/_admin/FarmCredentialManagement.aspx page to add some other ones. Instead of the page, I got an error message that logged the following to the ULS:

Application error when access /_admin/FarmCredentialManagement.aspx, Error=Some or all identity references could not be translated.

I used

Get-SPManagedAccount

to show me all the managed accounts, and it quickly became obvious which accounts had been deleted in AD but not cleaned up from the farm.

UserName             PasswordExpiration   AutomaticChange ChangeSchedule
--------             ------------------   --------------- --------------
LABS\svcSP.FarmAdmin 7/4/2013 5:49:43 AM  False
LABS\svcSP.MySite…                        False
LABS\svcSP.Search…   7/4/2013 5:49:44 AM  False
LABS\svcSP.Services  7/4/2013 5:49:45 AM  False
LABS\svcSP.Portal…                        False
LABS\svcSP.MySite_AP 7/17/2013 3:08:12 AM False
LABS\svcSP.Portal_AP 7/17/2013 3:08:19 AM False
LABS\svcSP.Search_AP 7/17/2013 3:16:07 AM False

A quick check identified the broken ones (all on one line):

Get-SPManagedAccount | ? {$_.PasswordExpiration -eq $null}

UserName             PasswordExpiration   AutomaticChange ChangeSchedule
--------             ------------------   --------------- --------------
LABS\svcSP.MySite…                        False
LABS\svcSP.Portal…                        False

 

And a further pipe to Remove-SPManagedAccount deleted them (all on one line):

Get-SPManagedAccount | ? {$_.PasswordExpiration -eq $null} | Remove-SPManagedAccount
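
If you would rather eyeball the list before anything is deleted, a slightly more cautious variant of the same idea (same cmdlets, just staged) might look like this:

# Review-then-remove sketch: list the broken accounts first, then delete with a confirmation prompt.
$broken = Get-SPManagedAccount | Where-Object { $_.PasswordExpiration -eq $null }
$broken | Format-Table UserName, PasswordExpiration, AutomaticChange -AutoSize
$broken | Remove-SPManagedAccount -Confirm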

Hope that helps anyone…

SharePoint Snippets

 

Following along with my previous post about PowerShell snippets, here is a post with my own SharePoint snippets, some of which are “PowerShelly.”

View SharePoint 2010 Developer Dashboard Setting:

[Microsoft.SharePoint.Administration.SPWebService]::ContentService.DeveloperDashboardSettings

Change SharePoint 2010 Developer Dashboard setting (supports On, Off, OnDemand; each statement must be on a single line):

[Microsoft.SharePoint.Administration.SPWebService]::ContentService.DeveloperDashboardSettings.DisplayLevel = ([Enum]::Parse([Microsoft.SharePoint.Administration.SPDeveloperDashboardLevel], "Off"));
[Microsoft.SharePoint.Administration.SPWebService]::ContentService.DeveloperDashboardSettings.Update()
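
A small wrapper function makes this easier to reuse. This is just my own sketch around the same calls; the function name is mine, not a SharePoint cmdlet:

# Hypothetical helper around the snippet above; accepts On, Off, or OnDemand.
function Set-DeveloperDashboardLevel {
    param([ValidateSet("On","Off","OnDemand")][string]$Level)

    $settings = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.DeveloperDashboardSettings
    $settings.DisplayLevel = [Enum]::Parse([Microsoft.SharePoint.Administration.SPDeveloperDashboardLevel], $Level)
    $settings.Update()
}

# Example: show the dashboard only when requested from the page.
Set-DeveloperDashboardLevel -Level OnDemand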

Reliably Deploying SharePoint 2010 Taxonomies

On a recent project for Slalom Consulting, my employer, we made use of SharePoint 2010 managed metadata, including groups, term sets, and terms. As we began to create new environments, one of the more laborious tasks was populating the Taxonomy Term Store through the term store manager page, accessible via Central Administration or the related site collection (“_layouts/termstoremanager.aspx”). The terms are loaded into the Taxonomy Term Store via a CSV file that adheres to a specific format, a sample of which you can see via the URL of your SharePoint Central Administration site: http://centraladminurl/_layouts/1033/ImportTermSet.csv
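
For reference, the header row of that sample file looks roughly like the following; I am working from memory here, so treat the ImportTermSet.csv you download from the link above as the authoritative format, and the "Location" row is purely illustrative:

"Term Set Name","Term Set Description","LCID","Available for Tagging","Term Description","Level 1 Term","Level 2 Term","Level 3 Term","Level 4 Term","Level 5 Term","Level 6 Term","Level 7 Term"
"Location","Offices we operate from",,TRUE,,"North America","United States","Seattle",,,,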

To manually import a file, you must first create a group, then click on that group and choose “Import Term Set”:

[Image: the “Import Term Set” option on a term group]

You are then presented with a dialog box where you can select a CSV file that adheres to the required format to import:

[Image: the Import Term Set dialog for selecting a CSV file]

This process is fine if you only have one or two term sets to load, but as you continue to build out a solution and have to repeat it over and over again, it becomes tedious. The following is my PowerShell script, which loops through all the .csv files in the same directory as the script. The caution is that the script deletes the terms each time, so any site columns you have created from the term set will need to be adjusted if you delete and recreate it. It is, however, useful while setting up environments during the development process as you create site collections. Use with caution. :) I am also very much open to feedback on the PowerShell script and how to improve it.

try
{
    # Load the SharePoint snap-in if the script is not already running in the SharePoint 2010 Management Shell
    if (((Get-PSSnapin | ForEach-Object { $_.Name }) -contains "Microsoft.SharePoint.PowerShell") -eq $false)
    {
        Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction Stop
    }
}
catch
{
    Write-Host -ForegroundColor "red" "This script only works on machines with SharePoint 2010 installed.`n"
    Exit
}

function Pause ($Message = "Press any key to continue...")
{
    # The ReadKey functionality is only supported at the console (not in the ISE)
    if (!$psISE)
    {
        Write-Host -NoNewLine $Message
        $null = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
        Write-Host ""
    }
}

function AddCSVToTermSet([string] $termGroup, [string] $termSetCSVPath)
{
    $termStore = $session.TermStores[$mmdServiceName]

    # Create the term group if it does not already exist
    if ($termStore.Groups[$termGroup] -eq $null)
    {
        $group = $termStore.CreateGroup($termGroup)
        $termStore.CommitAll()
    }
    else
    {
        $group = $termStore.Groups[$termGroup]
    }

    # Import the taxonomy through an instance of ImportManager
    $Importer = $termStore.GetImportManager()
    $varImported = ""
    $varMessages = ""
    $reader = [System.IO.File]::OpenText($termSetCSVPath)

    # Syntax: ImportTermSet(TermGroup, TextReader, out isImported, out errorMessage)
    $Importer.ImportTermSet($group, $reader, ([REF] $varImported), ([REF] $varMessages))

    # Release the file handle
    $reader.Close()
}

$termGroup = "Location"
$centralAdminUrl = $(Get-SPWebApplication -IncludeCentralAdministration | Where-Object { $_.IsAdministrationWebApplication } | Select-Object -Property Url).Url
$session = Get-SPTaxonomySession -Site $centralAdminUrl
$mmdServiceName = $(((Get-SPFarm).ServiceProxies | Where-Object { $_.TypeName -eq "Managed Metadata Web Service Proxy" }).ApplicationProxies | Select-Object -Property DisplayName).DisplayName

# Delete all previous term sets in the group (skipped on a first run, when the group does not exist yet).
# Use with caution: any site columns created from these term sets will lose their relationship when the terms are deleted.
$termStore = $session.TermStores[$mmdServiceName]
if ($termStore.Groups[$termGroup] -ne $null)
{
    $termStore.Groups[$termGroup].TermSets | ForEach-Object { $_.Delete() }
    $termStore.Groups[$termGroup].Delete()
    $termStore.CommitAll()
}

# Process every CSV file in the current directory (run the script from the folder that contains them)
$files = Get-ChildItem "." | Where-Object { $_.extension -eq ".csv" }
foreach ($csv in $files)
{
    AddCSVToTermSet $termGroup $csv.FullName
}

Pause
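
To use it, I drop the script into a folder alongside the term set CSV files and run it from the SharePoint 2010 Management Shell in that folder; the paths and file names below are just placeholders for illustration:

# Example layout (names are hypothetical):
#   C:\Deploy\Taxonomy\Import-TermSets.ps1   <- the script above
#   C:\Deploy\Taxonomy\Location.csv          <- term set CSVs in the ImportTermSet.csv format
cd C:\Deploy\Taxonomy
.\Import-TermSets.ps1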