July 2010 - Posts

I posted a while back on how to do your Managed Property mappings in PowerShell, so I wanted to follow up with how to add search scopes next.  I have to give props to the SDK team because they have done a pretty good job documenting all of these PowerShell commands.  They even provide examples, which I like a lot.  When trying out commands to create scopes and their associated rules, I ran into a few things that I wanted to share.  You will want to put all of these commands into a .ps1 script file.  You can create your script file in PowerShell ISE or just use Notepad.  We start out by getting a reference to the search application.

$searchapp = Get-SPEnterpriseSearchServiceApplication "Search Service Application"

At this point, we are ready to create a new scope.  It’s pretty simple.  To create a scope, we use the New-SPEnterpriseSearchQueryScope command.  Here is my command to create a scope called My Scope.  One thing to note here is that DisplayInAdminUI is a required parameter.  It turns out you can create hidden scopes with PowerShell even though you can’t do it through the UI.

$scope = New-SPEnterpriseSearchQueryScope -Name "My Scope" -Description "My scope created in PowerShell" -SearchApplication $searchapp -DisplayInAdminUI $true


We now want to create scope rules for this scope.  I assign the resulting object from the above command into a variable called $scope so that we can pass it to the scope parameter of the New-SPEnterpriseSearchQueryScopeRule command.  When creating a scope rule through the UI, you have four choices: All Content, Property Query, Content Source, and Web Address.  We’ll start with the simplest one, All Content.  For this, we just specify a RuleType value of AllContent.  All commands to create new scope rules require a URL.  It’s not entirely clear what this URL does since it’s not something you enter in the UI when you create a rule.  The SDK simply states that it “specifies the results URL that is associated with the query rule.”  I guess you could give it a path to the results page in your search center.  For now, I just specify the path to the server and it seems to work.

New-SPEnterpriseSearchQueryScopeRule -RuleType AllContent -url http://sp2010 -scope $scope


Next, we’ll create a scope rule using the PropertyQuery rule type.  It requires a few more parameters.  The ManagedProperty parameter specifies the name of your managed property.  In my example, I am going to use a property called Color.  The PropertyValue parameter specifies the value.  I want to see products that are red, so I’ll specify Red here.  With any scope rule, you can specify whether the results from the rule should be Included, Required, or Excluded.  We specify this with the FilterBehavior parameter.  This parameter is required when using this RuleType (even though the SDK says it is optional).  Also, when using a property query, the SearchApplication parameter is required too, so just pass it the value $searchapp and it will work fine.  Here is the command.

New-SPEnterpriseSearchQueryScopeRule -RuleType PropertyQuery -ManagedProperty Color -PropertyValue Red -FilterBehavior Include -url http://sp2010 -scope $scope -SearchApplication $searchapp


When you execute the command it gives you some basic info about the rule you set up as well as an estimated count.

Setting up a scope rule with a RuleType of Url took me a bit longer to figure out.  This is because there is a parameter called UrlScopeRuleType, and neither the SDK nor the Get-Help command said what values it was expecting.  I had to do some digging.  I did some reflecting and finally found an enum that had the answer.  The values it wants are Folder, HostName, or Domain.  This of course makes sense when you go back and look at the UI and see the parameters you specify there.


The other parameter you need to know about here is MatchingString.  You specify the value of the folder, hostname, or domain you want to use.  In this case, I am setting up a rule for a particular subsite.

New-SPEnterpriseSearchQueryScopeRule -RuleType Url -MatchingString http://sp2010/bcs -UrlScopeRuleType Folder -FilterBehavior Include -url http://sp2010 -scope $scope


So now we can create rules for all content, a web address, and a property query.  However, if you have ever set up a scope before, you know there is one more type.  That type is a content source.  The SDK didn’t have this type listed, so I looked around in Reflector again and found that we could specify a value of ContentSource for the RuleType parameter.  However, when I tried to specify that value, it didn’t work.  I took a look at the code and discovered that there is no code implemented to create a content source scope rule.  After doing some experimenting in PowerShell, I did discover the answer, but I’ll save that for the next post where I will show you how I figured it out.

Remember, you can add multiple rules at a time to one scope.  Just put them all in one script file and run it.  Also, if you need to delete your scope, you can use the Remove-SPEnterpriseSearchQueryScope command, but you have to pass it an actual scope object, which you can get with the Get-SPEnterpriseSearchQueryScope command.  Here is how I deleted my scopes as I was testing.

$searchapp = Get-SPEnterpriseSearchServiceApplication "Search Service Application"
Get-SPEnterpriseSearchQueryScope -Identity "My Scope" -SearchApplication $searchapp | Remove-SPEnterpriseSearchQueryScope
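
Putting it all together, here is the whole scope-creation script from this post in one .ps1 file.  Everything here is just the commands from above combined.

$searchapp = Get-SPEnterpriseSearchServiceApplication "Search Service Application"
$scope = New-SPEnterpriseSearchQueryScope -Name "My Scope" -Description "My scope created in PowerShell" -SearchApplication $searchapp -DisplayInAdminUI $true
New-SPEnterpriseSearchQueryScopeRule -RuleType AllContent -url http://sp2010 -scope $scope
New-SPEnterpriseSearchQueryScopeRule -RuleType PropertyQuery -ManagedProperty Color -PropertyValue Red -FilterBehavior Include -url http://sp2010 -scope $scope -SearchApplication $searchapp
New-SPEnterpriseSearchQueryScopeRule -RuleType Url -MatchingString http://sp2010/bcs -UrlScopeRuleType Folder -FilterBehavior Include -url http://sp2010 -scope $scope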

You can do so much in SharePoint with PowerShell.  This is just one more thing I won’t have to configure manually any more.  The SDK does a great job documenting all of the commands out there (although I would like to see that info on the UrlScopeRuleType parameter added some time :) ).  Try some of them out and you’ll be amazed at what you can accomplish.

The first thing I do when creating a new project in Visual Studio (regardless of type) is change the project name, assembly name, and default namespace.  I like for the names of all of these to be consistent.  However, I have noticed that in Silverlight you must make changes in five different places for everything to work; otherwise, you will get errors.  I’m no Silverlight expert, but I thought this post would be useful for people like me who only dabble in it from time to time.

When you create your new Silverlight project, right click on the project name and bring up its properties.  Go ahead and change the default namespace and assembly name just like you would in any other project.


In my case, I change from a namespace of SilverlightApplication1 to DotNetMafia.Silverlight.Test.  Here you can see the new assembly name and default namespace.  There are two other options to set here, but we will have to come back to them.

At this point, we want to correct the namespace in the existing classes.  Let’s start with App.xaml.


I have highlighted the section I need to change: x:Class.  I’ll change its namespace to DotNetMafia.Silverlight.Test like we see below.
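
After the change, the root element of my App.xaml looks something like this (the xmlns declarations are the standard ones Visual Studio generates).

<Application x:Class="DotNetMafia.Silverlight.Test.App"
             xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
             xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">
    <Application.Resources>
    </Application.Resources>
</Application>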


Now, we just need to change the namespace in the code behind file App.xaml.cs.


Now, we have to do the same thing to MainPage.xaml.  Change the namespace here on x:Class as well.


Then, you will change the namespace of the code behind file MainPage.xaml.cs just like we did before.


At this point, your code will compile.  However, it will not run.  If you try to debug it, you will likely get a blank page, and if you are using Internet Explorer, you will probably see a script error in the status bar.


Clicking on the error, you can see the following details.


I have to say they have really improved the way you view script errors.  Here is the text of the error.

Message: Unhandled Error in Silverlight Application
Code: 2103   
Category: InitializeError      
Message: Invalid or malformed application: Check manifest    

Line: 54
Char: 13
Code: 0

Remember, I said we had to change an additional setting in the project properties?


This is the problem here.  We changed the startup object’s namespace but never updated the project properties to match.  You should be able to pick the new namespace of your startup object from the list.  This is also a good time to change the name of your .XAP file if you are so inclined.  Here is what my project properties look like when I am done.


At this point your application should compile and run.  Now I can view my beautiful Hello World Silverlight application.


Great app, huh?  Anyhow, I hope this helps should you encounter the error above or run into issues changing your namespace.  It’s pretty simple to do, but you’ll definitely get errors if you don’t make all of your changes.

UPDATED 8/3/2011: DocId

Enterprise Search is one of my favorite SharePoint topics to speak about.  Often in my talks, I use various keyword queries to display results that people often end up asking me about.  Today, I thought it would be useful to show you some of the most useful built-in keywords that you can use to troubleshoot search results or even build custom scopes with.  I see people post in the forums all the time about how to do many of the queries I have in the list below, so I figure this will be a good post to refer people to.  You can use pretty much all of these with SharePoint 2010 or MOSS 2007 right out of the box (once you crawl).

The ContentSource keyword is incredibly powerful.  You can use it to verify that results exist for a given content source.  For example, if you were having issues crawling a file share and you wanted to see whether any results were present, you could use this keyword.  Start by identifying the content source in question on your content sources page in your Search Service Application.


Note the spelling of your content source exactly.  If I wanted to return everything from the file share, I would issue this query.

ContentSource:"File Share"

On my server I get results like this.


Keep in mind that you can combine these keywords with other ones.  For example if I only want accounting documents on the file share, it would look like this.

Accounting ContentSource:"File Share"


To query all documents on the SharePoint site itself, you could use a query against the Local SharePoint Sites (Local Office SharePoint Sites on MOSS 2007) content source.  Be warned that this could return a large number of results.

ContentSource:"Local SharePoint Sites"

Another property more people are familiar with is IsDocument.  This handy property is a great way to filter out items that are not documents (i.e., folders, list items, and pages).  You pass a value of 1 for true.  In the example below, I return all documents on my SharePoint server.
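
IsDocument:1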



If you have spent time building custom scopes, you might be interested in the Scope keyword.  This makes it easy to query everything in your scope at once.  You can use this to build advanced queries or just for simple troubleshooting of the scope.  In this example, I have a scope with rules only to show items from a BCS content source with a property called Color set to red.

Scope:"Red Products"


A lot of times people ask questions about how to just show results for a given site or site collection.  Maybe even a specific document library.  You can do that with contextual search, but you can also do that with the Site keyword as well.  Simply pass it the URL to the specific point on your site and it will filter the results.  For example, if I just want to see results from my ECM subsite, I would use the following query.
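
Assuming my ECM subsite lives at http://sp2010/ecm, the query would look like this.

Site:"http://sp2010/ecm"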



Sometimes you might want to see all documents a particular user modified.  The author keyword is a great way to find things when you want to see all documents from someone that recently left the company.  For example, to see all of the documents by our fictitious CFO, Christina Murphy, we would use this query.

Author:"Christina Murphy"


If you know you only want to see documents of a certain content type, why not query for it? You can with the ContentType keyword.  For example to search all documents of my custom type called Custom Document, I would use the following query.

ContentType:"Custom Document"


You can also write queries based on list template such as a document library, task list, or your own custom list template with the ContentClass keyword.  Simply use any known content class with the keyword (here is a list).  For example, to search only items in a task list, I would issue the following query.
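
Assuming the standard content class for task list items, STS_ListItem_Tasks, the query would look like this.

ContentClass:STS_ListItem_Tasks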



Want to see a query that only has Excel spreadsheets? No problem.  Use the FileType keyword.  Don’t include the period on the extension.  You can use multiple FileType keywords together to span multiple types (i.e.: xlsx and xls).
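
For example, to see both Excel formats at once, the query might look like this.

FileType:xlsx FileType:xls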



Another keyword I will cover is Write.  You use it to query documents based on modification date.  You can’t take full advantage of this keyword in MOSS 2007, but in SharePoint 2010 you can do some cool things by using the new operators >, <, >=, and <=.  This allows you to write queries to see all documents written since the beginning of the month, for example.  Your date should be in quotes, and make sure there is no space between the keyword and the date itself.
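
For example, to see everything modified since the beginning of the month (here, July 2010), the query would look like this.

Write>="7/1/2010"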



You can also query by Document Id using the DocId keyword.  Here is an example looking for an item with a Document Id of YY7PPZHWQVY7-5-3.
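
DocId:YY7PPZHWQVY7-5-3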



These are all keywords I use on a regular basis.  Remember, you can combine most of these together to provide very narrow search results.  You could also make use of these by writing your own advanced search control.  I hope these queries are useful the next time you use Enterprise Search.  If you can think of any other good ones I left out, please leave a comment.

I am excited to say that I am returning to Oklahoma City next Monday (7/26) to speak about using PowerShell with SharePoint.  This fun talk will show you just enough PowerShell to be dangerous.  You will learn the basics of using it, how to write scripts, and even see how to build a cmdlet.  The talk is at OKC CoCo at 6 pm.  I look forward to seeing you all then.

The reason I write this post today is because my initial searches on the Internet provided me with nothing on the topic.  I was hoping to find a reference in the SDK but I didn’t have any luck.  What I want to do is set a default column value on an existing folder so that new items in that folder automatically inherit that value.  It’s actually pretty easy to do once you know what the class is called in the API.  I did some digging and discovered that the class is MetadataDefaults.  It can be found in Microsoft.Office.DocumentManagement.dll.  Note: if you can’t find it in the GAC, this DLL is in the 14/CONFIG/BIN folder and not the 14/ISAPI folder.  Add a reference to this DLL in your project.  In my case, I am building a console application, but you might put this in an event receiver or workflow.

In my example today, I have simple custom folder and document content types.  I have one shared site column called DocumentType.  I have a document library with each of these content types registered.  In my document library, I have a folder named Test, and I want to set its default column values using code.  Here is what it looks like.  Start by getting a reference to the list in question.  This assumes you already have an SPWeb object.  In my case, I have created it and it is called site.

SPList customDocumentLibrary = site.Lists["CustomDocuments"];

You then pass the SPList object to the MetadataDefaults constructor.

MetadataDefaults columnDefaults = new MetadataDefaults(customDocumentLibrary);

Now I just need to get my SPFolder object in question and pass it to the SetFieldDefault method.  This takes an SPFolder object, a string with the name of the SPField to set the default on, and finally the value of the default (in my case “Memo”).

SPFolder testFolder = customDocumentLibrary.RootFolder.SubFolders["Test"];

columnDefaults.SetFieldDefault(testFolder, "DocumentType", "Memo");

You can set multiple defaults here.  When you’re done, you will need to call .Update().
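
columnDefaults.Update();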


Here is what it all looks like together.

using (SPSite siteCollection = new SPSite("http://sp2010/sites/ECMSource"))
{
    using (SPWeb site = siteCollection.OpenWeb())
    {
        SPList customDocumentLibrary = site.Lists["CustomDocuments"];
        MetadataDefaults columnDefaults = new MetadataDefaults(customDocumentLibrary);

        SPFolder testFolder = customDocumentLibrary.RootFolder.SubFolders["Test"];
        columnDefaults.SetFieldDefault(testFolder, "DocumentType", "Memo");
        columnDefaults.Update();
    }
}


You can verify that your property was set correctly on the Change Default Column Values page in your list settings.


This is something that I could see used a lot in an ItemEventReceiver attached to a folder to do metadata inheritance.  Whenever the user changed the value of the folder’s property, you could have it update the default.  Your code might look something like this.

columnDefaults.SetFieldDefault(properties.ListItem.Folder, "MyField", properties.ListItem["MyField"].ToString());
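
Fleshed out a bit, the event receiver might look something like the sketch below.  The class name and the MyField column are hypothetical, and don’t forget the Update() call.

public class FolderDefaultsReceiver : SPItemEventReceiver
{
    public override void ItemUpdated(SPItemEventProperties properties)
    {
        base.ItemUpdated(properties);

        // Only update defaults when the changed item is a folder
        if (properties.ListItem.Folder != null)
        {
            MetadataDefaults columnDefaults = new MetadataDefaults(properties.List);
            columnDefaults.SetFieldDefault(properties.ListItem.Folder, "MyField", properties.ListItem["MyField"].ToString());
            columnDefaults.Update();
        }
    }
}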

This is a great way to keep the child items updated any time the value of a folder’s property changes.  I’m also wondering if this can be done via CAML.  I tried saving a site template, but after importing I got an error on the default values page.  I’ll keep looking and let you know what I find out.

Yesterday, I showed you how to deploy a regular web part to MOSS 2007 / WSS3 that was built and packaged in Visual Studio 2010.  Today, we can take that a step further and take advantage of the new Visual Web Part and deploy it the same way.  If you remember, a Visual Web Part is nothing more than a glorified user control.  To get started, create a new empty SharePoint project or use an existing one.  If you need assistance with that, look at yesterday’s post.  Then, go ahead and create a new Visual Web Part.  The user control Visual Studio creates has many references to SharePoint 2010 DLLs that we simply do not need (or cannot use).  These must be removed.  Here is what it looks like when we start.

<%@ Assembly Name="$SharePoint.Project.AssemblyFullName$" %>

<%@ Assembly Name="Microsoft.Web.CommandUI, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

<%@ Register Tagprefix="SharePoint" Namespace="Microsoft.SharePoint.WebControls" Assembly="Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

<%@ Register Tagprefix="Utilities" Namespace="Microsoft.SharePoint.Utilities" Assembly="Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

<%@ Register Tagprefix="asp" Namespace="System.Web.UI" Assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" %>

<%@ Import Namespace="Microsoft.SharePoint" %>

<%@ Register Tagprefix="WebPartPages" Namespace="Microsoft.SharePoint.WebPartPages" Assembly="Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

<%@ Control Language="C#" AutoEventWireup="true" CodeBehind="VisualWebPart1UserControl.ascx.cs" Inherits="WSSWebPart.VisualWebPart1.VisualWebPart1UserControl" %> 

Remove any reference to a SharePoint version 14 DLL.  Then I’m just going to add a simple label to demonstrate our user control.  Here is what it looks like after the changes.

<%@ Assembly Name="$SharePoint.Project.AssemblyFullName$" %>

<%@ Register Tagprefix="asp" Namespace="System.Web.UI" Assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" %>

<%@ Import Namespace="Microsoft.SharePoint" %>

<%@ Control Language="C#" AutoEventWireup="true" CodeBehind="VisualWebPart1UserControl.ascx.cs" Inherits="WSSWebPart.VisualWebPart1.VisualWebPart1UserControl" %>

<asp:Label ID="MyLabel" runat="server" Text="Hello, world!  Visual Web Part compiled in Visual Studio 2010!" /> 

You can replace the version 14 references with version 12 references if you really need them.  However, I find that most of the time I am really only using standard ASP.NET controls, so they are unnecessary.  That is all you have to do.  Assuming you started with the solution from yesterday, you can package the project and install the .wsp file on your SharePoint 2007 server using STSADM.  If you created a new project, don’t forget to remove the SharePointProductVersion attribute in the Package properties (discussed in yesterday’s post).  Here is what my Visual Web Part looks like running on SharePoint 2007.


It’s pretty simple to do.  I am able to leverage the simplicity of the Visual Web Part and take advantage of Visual Studio 2010 building my .wsp file.  The more I work with Visual Studio 2010, the more I realize I can use SharePoint Project Items in previous versions of SharePoint with just a little bit of work.

That’s a mouthful.  I always suspected it was possible to use Visual Studio 2010 to package up my SharePoint web parts and other artifacts into a solution (.wsp file) and turn around and deploy that code to MOSS 2007.  Today I gave it a try, and it actually works pretty well.  This post will show you how to do it.  I will remind you that you won’t be able to take advantage of any of the automatic deployment and debugging features built into Visual Studio 2010 and SharePoint 2010, but you will have a nice solution file that was built automatically without having to use a third-party tool like WSPBuilder.  You can then take the solution package and deploy it to SharePoint with stsadm.

I will start off by using the SharePoint 2010 Empty Project Template.  Now, unfortunately, the wizard that starts this project has a dependency on SharePoint 2010.  It simply won’t run without it.  However, if you happen to already have a copy of a project that has been created, you can open an existing SharePoint 2010 project template on a computer that does not have SharePoint installed.  I have attached a copy of my Visual Studio solution for you to use as a starting point if you need it. 

Once I have my project open, I proceed to create a web part as shown below in the Solution Explorer.


I don’t have to make many modifications to the class.  I just add some simple “Hello, World!” code to it.
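
The code itself isn’t the interesting part, but for reference, a minimal web part like the one I am describing might look something like this (the class name is whatever Visual Studio generated for your project).

public class WebPart1 : System.Web.UI.WebControls.WebParts.WebPart
{
    protected override void CreateChildControls()
    {
        // Nothing fancy - just prove the web part runs on WSS3
        this.Controls.Add(new System.Web.UI.LiteralControl("Hello, World!"));
    }
}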


We’re targeting SharePoint version 3, so that means we need to change some references.  All of the DLLs in SharePoint 2010 are version 14.  We need version 12 DLLs.  So what you will need to do is go get a copy of Microsoft.SharePoint.dll (and possibly Microsoft.SharePoint.Security.dll) from your version 3 SharePoint farm.  We then need to remove the reference to the version 14 DLLs.  Click on Microsoft.SharePoint.dll and Microsoft.SharePoint.Security.dll and remove them from the solution.  We then add our version 12 DLLs to the references list and we’re ready to compile our web part for WSS3. 


At this point, I will remind you of the caveats of pursuing this completely unsupported approach.  Obviously, you need to make sure you are only using API calls from version 3.  Using a class that is new in SharePoint 2010 is obviously not going to work.

Build your project and it should compile successfully.  Now, I figured everything would work at this point, but I discovered one thing that I had to change in this process.  The SharePoint 2010 solution schema has a new SharePointProductVersion attribute on the Solution element.  WSS3 does not like this.


Luckily, I discovered that if we delete the value from the property window, it actually removes the attribute.


Simply remove the value there and we are ready to package the project.


When this is complete, you can browse the file system and find your .wsp file in the bin folder.  Copy the .wsp file to the SharePoint 2007 server if you aren’t already on it.  Then add and deploy the package with stsadm.  At this point, you will have a feature that you can activate on your SharePoint 2007 server.  Go to site collection features and activate it.


The web part should now be in the Web Part Gallery.  Now edit a page and add a new web part.  Your web part should be in the group labeled Custom, assuming you haven’t changed it.


We can now verify that the web part code works on the page.


As you can see it’s really pretty easy to build a WSS3 web part in Visual Studio 2010 and deploy it.  We lose some of the cool VS2010/SP2010 integration features of course, but the fact it builds the package for us is a huge win.  Not to mention, upgrading our code to work in SharePoint 2010 later will be pretty easy since all we have to do is change our references from the version 12 to the version 14 DLLs.  I’ve only covered how to do a web part here today.  I suspect other SharePoint Project Items will work as well.  I’ll try them out soon and let you know how they work.  As a reminder, I have attached a copy of my solution to this post for you to use in case you don’t have SharePoint 2010 installed anywhere.  Give it a try and let me know if it works for you.

I’m filling in tonight and excited to be speaking about PowerShell.  Kyle and I did this talk in Houston at SharePoint Saturday and it was a lot of fun.  In this talk you’ll learn just enough to be dangerous when using PowerShell with SharePoint 2010.  Come on out and see all the new ways you can mess up your SharePoint server from the command line.  We’re at a new location tonight, Tulsa Tech Riverside Campus, room A-144.  See you around 6:00 pm.

I had the opportunity to give two talks at NWA TechFest yesterday in Rogers, AR.  It was a pretty good event organized by the one and only @DavidWalker.  I got to meet a lot of new people and talk about all sorts of things related to SharePoint.  There were plenty of non-SharePoint people to talk to there too.  As promised, I am making my slides available and they are already up on SlideShare.net.

How the SharePoint 2010 Business Connectivity Services will change your life

Introduction to SharePoint 2010 Enterprise Search

Thanks for coming and I’m looking forward to seeing a lot of you at Tulsa TechFest this fall.  Also, I’m speaking at the Tulsa SharePoint Interest Group this Monday about PowerShell, so come on by.

If you build a lot of virtual SharePoint environments, you might find yourself needing some test users in Active Directory to demonstrate various things such as people search or the new social features.  Sure, you can create these users by hand, but that’s very tedious.  I decided to look for a programmatic way to do it.  After searching the web, I found various approaches, but many of them were antiquated, using things like .vbs files.  I wanted something a bit more modern.  I wanted something in PowerShell.  I stumbled upon a post by Todd Klindt which set me in the right direction.  For my needs though, I needed to expand this approach just a little bit more.  I want to demonstrate the Organization Browser, so I need to set the user’s manager, title, and department properties.  Setting the manager property turned out to add a little bit of complexity.

Before we can work with Active Directory in PowerShell, we have to import the Active Directory module.  The module is large enough that you actually get a progress indicator while it loads.  Load it by typing the command below.

Import-Module ActiveDirectory

To do this, I will create a .csv file that has all of my users but one.  I ran into one minor issue putting all of my users in it.  If you specify the manager property on the PowerShell command we use, it requires a value and it has to be an existing user.  This proved to be a problem for my fictitious CEO who did not have a manager.  So let’s create my CEO first.  His name is John Williams (real original I know).  We will also use this as an opportunity to look at the various parameters to the New-ADUser command which creates our new account in Active Directory.

New-ADUser -SamAccountName "john.williams" -UserPrincipalName "john.williams@sharepoint.local" -Name "John Williams" -DisplayName "John Williams" -GivenName "John" -SurName "Williams" -Title "CEO" -Department "Executive" -Path "OU=Test Users,DC=sharepoint,DC=local" -AccountPassword (ConvertTo-SecureString "test41;" -AsPlainText -force) -Enabled $True -PasswordNeverExpires $True -PassThru

The New-ADUser commandlet has a lot of options.  Only a few are required but we need to specify a few more so that when these accounts are imported into the profile store all fields are fully populated.  Some of the information may seem redundant but it has to be set.  When you create a new user through the Active Directory Users and Computers MMC snapin, it takes care of a lot of these defaults for you.  When you create an account with PowerShell though, you have to set it yourself.

Let’s look at the parameters now.  SamAccountName is the traditional NT4 login name that you have come to think of.  When logging in with a DOMAIN\USERNAME, it is the USERNAME.  UserPrincipalName is not required, but it is the Windows 2000 style login that looks like an E-mail address.  Name is required and this is the name of the actual object in Active Directory (typically the user’s full name).  DisplayName is optional, but it should be specified because the User Profile Store in SharePoint uses it.  If you don’t set the Display Name, none of your imported users in SharePoint will show a name.  Many of the other parameters correspond to property names in Active Directory.  GivenName is the first name.  SurName is the last name.  AccountPassword is the user’s password and has to be specified using the ConvertTo-SecureString commandlet.  In this case, I am using the password test41;

You might have noticed I skipped a few parameters.  The rest really are completely optional.  I just specified them because I want them to show up in People Search.  This includes Title, Department, and Manager.  I want my test users in a specific OU in my Active Directory, so I specify the path in LDAP notation with the Path parameter.  Lastly, you need to enable the account with the Enabled parameter, and for test accounts I recommend using PasswordNeverExpires.

If you want more information on how to use New-ADUser, don’t forget you can use the Get-Help commandlet like this:

Get-Help New-ADUser

Now we’ll look at my CSV file.   Here is what mine looks like.  I simply started at the top of the org chart and added the users down the tree.

SamAccountName,name,GivenName,SurName,Title,Department,Manager
christina.murphy,Christina Murphy,Christina,Murphy,CFO,Accounting,john.williams
frank.alcock,Frank Alcock,Frank,Alcock,CIO,Information Technology,john.williams
anna.stevenson,Anna Stevenson,Anna,Stevenson,Chief Legal Counsel,Legal,john.williams
joy.williams,Joy Williams,Joy,Williams,Director,Human Resources,john.williams
craig.johnson,Craig Johnson,Craig,Johnson,Accountant,Accounts Receivable,christina.murphy
jennifer.evans,Jennifer Evans,Jennifer,Evans,Accountant,Accounts Payable,christina.murphy
michael.adams,Michael Adams,Michael,Adams,IT Director,Information Technology,frank.alcock
chris.white,Chris White,Chris,White,Administrator,Help Desk,michael.adams
binh.le,Binh Le,Binh,Le,Technician,Help Desk,chris.white
paul.smith,Paul Smith,Paul,Smith,Director of Application Development,Application Development,frank.alcock
jose.cuervo,Jose Cuervo,Jose,Cuervo,Developer,Application Development,paul.smith
preet.ramakrishnan,Preet Ramakrishnan,Preet,Ramakrishnan,Junior Programmer,Application Development,paul.smith
richard.jackson,Richard Jackson,Richard,Jackson,Team Lead,Application Development,paul.smith

The way PowerShell reads CSV files is quite cool.  It automatically recognizes the header columns in the document and exposes them as properties that you can access with $_.  For example, the name column in the CSV file can be accessed with $_.name.  If you want to set other properties on AD user accounts, you can simply add them to your CSV file and pass them along in your PowerShell command.  To import the CSV file into PowerShell, use the Import-Csv command.

Import-Csv .\users.csv
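
If the full listing is hard to read, you can also spot-check just a few columns.  This is an optional step (it assumes users.csv is in the current directory):

Import-Csv .\users.csv | Select-Object SamAccountName, Title, Manager | Format-Table -AutoSize

If a column comes back empty, the header row probably doesn’t match the property name you expected.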

Executing this command by itself will display what it read from your file so that you can verify everything looks correct.  Here is what it looks like.


If you are happy with the way it looks, we can go to the next step and use the ForEach-Object cmdlet to create a new user for each row found in the CSV file.  I’ll show what the rest of the script (contained in a .ps1 file) looks like and then I’ll explain what is going on.

Import-Csv .\users.csv | ForEach-Object {
$userprincipalname = $_.SamAccountName + "@sharepoint.local"
New-ADUser -SamAccountName $_.SamAccountName -UserPrincipalName $userprincipalname -Name $_.name -DisplayName $_.name -GivenName $_.GivenName -SurName $_.SurName -Manager $_.Manager -Title $_.Title -Department $_.Department -Path "OU=Test Users,DC=sharepoint,DC=local" -AccountPassword (ConvertTo-SecureString "test41;" -AsPlainText -Force) -Enabled $True -PasswordNeverExpires $True -PassThru }

We start by doing the import and then piping its output to ForEach-Object.  Remember, when the CSV is imported, the column names (specified in the first row of the file) automatically become properties that can be accessed with $_.  For example, to get the user’s department I would use $_.Department.  As I mentioned above, I wanted to set the user principal name, so I build it by concatenating the SamAccountName from the CSV file with my domain name, @sharepoint.local.  I then pass each value to the corresponding parameter as you see above.  The command executes once for each row in the file until all of my accounts are created.

That’s really all there is to it.  Save your script as a .ps1 file and execute it in PowerShell.  If all goes well, you should see a screen that is similar to the one below.


When building your own user import file, remember that if you are setting the Manager property, the manager’s account has to exist in Active Directory before you set that property on employee accounts.  I hope this PowerShell information is useful.  Feel free to use my fictitious employees, and if you add some to the list, feel free to share them. :-)
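
If you don’t want to hand-sort your CSV file so that managers always come first, one workaround (a sketch, not part of my original script) is to create all of the accounts without the Manager property and then make a second pass with Set-ADUser once every account exists:

Import-Csv .\users.csv | ForEach-Object {
New-ADUser -SamAccountName $_.SamAccountName -Name $_.name -Path "OU=Test Users,DC=sharepoint,DC=local" -AccountPassword (ConvertTo-SecureString "test41;" -AsPlainText -Force) -Enabled $True }

Import-Csv .\users.csv | ForEach-Object {
Set-ADUser -Identity $_.SamAccountName -Manager $_.Manager }

The second loop only runs after the first completes, so by that point every manager account already exists.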

UPDATE: I found a bug in my script.  I had left out the Title parameter.

When my laptop was stolen, I was forced to rebuild a number of virtual machines since my new external hard disk had not come in yet for me to back them up.  On my previous laptop, I was running VirtualBox 3.1.8 and my SharePoint 2010 images worked great.  When I got my new laptop, I decided to grab the latest version (at the time, 3.2.4).  I noticed immediately that the product had been rebranded from Sun VirtualBox to Oracle VirtualBox.  I did have a few existing virtual hard disks in various formats (.vhd and .vmdk), so I created new virtual machines and attached them as normal.  I tried to boot each one only to immediately receive a BSoD with a STOP error of 0x0000007F.  At some point during this process, 3.2.6 came out, so I decided to upgrade.  No change.  After doing some research, I discovered the problem was related to the hard disk controller.  I looked at my settings and realized that VirtualBox 3.2 had added a SATA controller and the hard disks were attached to it.  I finally tracked down one of my existing configuration files to confirm that my hard disks had previously been attached as IDE (PIIX4, to be specific).  I reattached the hard disks to IDE and they booted fine.
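
If you prefer the command line, the same reattachment can be done with VBoxManage; here is a sketch, where "SP2010" and sp2010.vhd are hypothetical names for your virtual machine and disk file:

VBoxManage storagectl "SP2010" --name "IDE" --add ide --controller PIIX4

VBoxManage storageattach "SP2010" --storagectl "IDE" --port 0 --device 0 --type hdd --medium sp2010.vhd

The first command adds an IDE controller with the PIIX4 chipset; the second attaches the existing virtual hard disk to that controller.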

On my new SharePoint 2010 RTM image (Windows Server 2008 R2 x64), I noticed it wasn’t performing very well.  I wondered if it had something to do with the hard disk controller, but I quickly ruled that out.  What I discovered is that CPU utilization was almost always maxed out on the guest (the host OS was fine).  The executables sqlserver.exe, owstimer.exe, and various w3wp.exe processes were consuming most of the CPU.  I figured it had some jobs that needed to run and let it sit overnight, and the CPU utilization appeared to go down by morning.  However, as soon as I sent a single page request, the CPUs were pegged at 100% again.  The VM was pretty much unusable.  I did some research and couldn’t find anything that helped.  In a last ditch effort, I uninstalled 3.2.6 and went back and found version 3.1.8.  I installed it, attached the same virtual hard disks, and the CPU utilization issue has been gone ever since.  During the process I built another separate virtual machine and it had the same issue as well.  Maybe it is specific to my setup, but for now I am avoiding version 3.2.6 of VirtualBox.  I had never had a single issue with the product before, so this was a shock to me.  I’m wondering if anyone else has had issues.  I will continue to use the product, but I will be skeptical of any upgrades until I see some mention of this issue.

David Walker is at it again!  This time it is a TechFest in Northwest Arkansas this week on July 8th.  I’m going to be doing two talks there: the first is about the BCS and the second is about Enterprise Search.  If you’re from Tulsa, sorry, you’ve already seen my BCS talk, but I won’t be offended if you go see someone else.  Just tell them how great my BCS talk is.  :-)  After that, I’ll be talking about Enterprise Search in 2010.  This intro talk will show you how to get started with Enterprise Search, and most of the take-aways from the session also apply to MOSS 2007.  So if you can make it, come on by to the event.  It should be a good one.