Wednesday, November 28, 2012

SharePoint item-level rights and a fluent interface

Fluent interfaces are a common thing nowadays, and everyone who has ever used LINQ knows about them. Other products use them as well. But have you ever tried to write one on your own? Lately I did, and I found that there are different ways to do it, depending on the situation you are starting from.

My scenario was that I wanted to implement a solution for setting item permissions on a SharePoint list item. You need elevated privileges to do that if you are not an admin. Usually, in the ItemAdded event receiver, you do it like this:

var item = properties.ListItem;
var web = properties.Web;

SPSecurity.RunWithElevatedPrivileges(delegate()
{
   using (SPSite oSite = new SPSite(web.Site.Url))
   {
     using (SPWeb oWeb = oSite.OpenWeb())
     {
       SPList oList = oWeb.Lists.GetList(item.ParentList.ID, false);
       SPListItem oItem = oList.Items.GetItemById(item.ID);

       oItem.BreakInheritedRights();
      
       SetRights(oWeb, oItem);
       
       oItem.Update();
     }
   }
});

You get the item and the web from the event properties. But inside the delegate you must get new instances of the web, the list and the item, because the outer references still run in the low-privilege context. The method BreakInheritedRights is an extension method that calls BreakRoleInheritance on the item and removes all current role assignments. The method SetRights (not shown) sets the actual rights on the item. I needed to do this for several lists, and the code above was too obscure and hard to understand. So I decided to implement something like this:

var item = properties.ListItem;
var web = properties.Web;

SecurityHelper.RunWithElevatedPrivileges()
        .OnSite(web.Site.Url)
        .OnListItem(item.ParentList.ID, item.ID)
        .Execute(setItemPermissions);

This seemed more readable to me. I was in the lucky situation that this functionality was entirely new, so I chose the easiest way to do it: creating a set of classes that build up the language. I came up with the following classes:

public class SecurityHelper
{
  public static SecurityHelper RunWithElevatedPrivileges() { ... }
  public RunWithElevatedPrivilegesSiteContext OnSite(string url) { ... }
}


public class RunWithElevatedPrivilegesSiteContext
{
  public void Execute(Action<SPWeb> actionToRunElevated) { ... }
  public RunWithElevatedPrivilegesListItemContext OnListItem(Guid listId, int itemId) { ... }
}

public class RunWithElevatedPrivilegesListItemContext
{
  public void Execute(Action<SPWeb, SPListItem> executeOnListItem) { ... }
}

The static RunWithElevatedPrivileges method is the entry point. Calling its OnSite method creates a site context for executing elevated code, modeled by the class RunWithElevatedPrivilegesSiteContext, which gets the site URL passed in. Using its Execute method we can run elevated code that only needs the web. If you want a list item in context as well, you call the OnListItem method to get a RunWithElevatedPrivilegesListItemContext instance. This lets you pass in a callback expecting a web and a list item, which the Execute method obtains in the following way:

public void Execute(Action<SPWeb, SPListItem> executeOnListItem)
{
  SPSecurity.RunWithElevatedPrivileges(delegate()
  {
    using (SPSite site = new SPSite(_siteUrl))
    {
      using (SPWeb web = site.OpenWeb())
      {
        SPList list = web.Lists.GetList(_listId, false);
        SPListItem item = list.Items.GetItemById(_itemId);

        executeOnListItem(web, item);
      }
    }
  });
}
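
For completeness: the builder methods themselves only capture state for the Execute calls. A sketch of how that wiring could look - the field names match the Execute method above, everything else is an assumption about how one might fill in the stubs:

public class SecurityHelper
{
  public static SecurityHelper RunWithElevatedPrivileges()
  {
    return new SecurityHelper();
  }

  public RunWithElevatedPrivilegesSiteContext OnSite(string url)
  {
    return new RunWithElevatedPrivilegesSiteContext(url);
  }
}

public class RunWithElevatedPrivilegesSiteContext
{
  private readonly string _siteUrl;

  public RunWithElevatedPrivilegesSiteContext(string siteUrl)
  {
    _siteUrl = siteUrl;
  }

  // Execute(Action<SPWeb>) follows the same pattern as the list item variant above.

  public RunWithElevatedPrivilegesListItemContext OnListItem(Guid listId, int itemId)
  {
    return new RunWithElevatedPrivilegesListItemContext(_siteUrl, listId, itemId);
  }
}

public class RunWithElevatedPrivilegesListItemContext
{
  private readonly string _siteUrl;
  private readonly Guid _listId;
  private readonly int _itemId;

  public RunWithElevatedPrivilegesListItemContext(string siteUrl, Guid listId, int itemId)
  {
    _siteUrl = siteUrl;
    _listId = listId;
    _itemId = itemId;
  }

  // The Execute method shown above lives here.
}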

If things get more complex, you should prefer to define interfaces that describe your language and implement them on your classes, or you can use extension methods like LINQ does.

Happy fluent coding!



Thursday, September 20, 2012

BASTA Follow-Up

For everyone who followed my sessions at the BASTA, and especially for everyone who could not, here are the slides and samples.

TFS Express Slides

NoSQL vs. SQL Slides and Examples

The promised little agile tools that are missing in TFS Express are also available for download. A few known limitations:

  • Works only with the English version of the Scrum template.
  • All operations run synchronously - occasional waiting times (especially at startup) can occur.
  • The software is provided "as is" - use at your own risk.
If you have questions about it, feel free to get in touch.

Have fun with it!

HTTP 503 Error in TFS

The night before my TFS Express talk at the BASTA conference I faced an ugly error with my TFS Express installation. The installation ran fine, I could start the management console and access the database, but trying to access the website at localhost:8080/tfs yielded a 503 error.

A quick Google query showed that there is plenty of discussion on the web about this topic, for example at StackOverflow or on MSDN. Some posts state that they never found a solution; others show a long list of possible things to try - nothing worked for me.

By luck I found the solution to my problem. I had used RavenDb on my machine before, and it runs on port 8080 as well. Due to a different problem in Raven I remembered that it does some urlacl stuff by reserving a urlacl for the URL http://+:8080. This reservation assigns the logged-on user the right to access this URL. If you face a 503 error when trying to use RavenDb with an IIS deployment, you should run the following command line:


netsh http delete urlacl http://+:8080/


It seems that running Raven from the console creates a urlacl reservation, which in turn makes TFS stop working. The solution to my problem was similar to the one mentioned on the RavenDb website: run the above command line and TFS works fine - although I assume RavenDb will no longer work properly then.
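
To see which reservations currently exist on a machine before deleting anything, you can list them first:

netsh http show urlacl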

This is neither a RavenDb mistake nor a TFS mistake - you should just configure these servers to run on different ports if they run concurrently.

Anyway, this is one possible solution out of a thousand, and perhaps it saves someone some valuable time. Happy serving...




Sunday, September 16, 2012

Creating a child task for a backlog item with TFS API

Because TFS Express lacks the agile planning tools of the full version, I wrote a small tool that makes creating tasks as children of a backlog item more enjoyable. It took me a while to figure it out, so I am sharing my result.

First you need a reference to the team project collection:

var tfs = new TfsTeamProjectCollection(new Uri("https://your.tfspreview.com"));
tfs.EnsureAuthenticated();

var workItemStore = tfs.GetService<WorkItemStore>();

Then the new task work item must be created. To instantiate a work item you must pass its type, which you get from the Project class.

var project = workItemStore.Projects["NameOfYourTeamProject"];
var taskType = project.WorkItemTypes["Task"];
var task = new WorkItem(taskType);
task.Title = "A new task"; 
task.State = "To Do";      
task.Save();

Next we need to create a parent/child link. For that we need a WorkItemLinkTypeEnd instance, which we get from the WorkItemLinkType that the work item store provides:

var linkType = workItemStore.WorkItemLinkTypes["System.LinkTypes.Hierarchy"];
var taskLink = new WorkItemLink(linkType.ReverseEnd, task.Id);

The taskLink instance is a parent/child work item link whose reverse end (where it points to, i.e. the child) refers to the id of the task work item we just created. To connect the other end to the parent, the backlog item, we load it and add the link to its Links collection:

var backlogItem = workItemStore.GetWorkItem(backlogItemId);
backlogItem.Links.Add(taskLink);
backlogItem.Save();

That's it - the new task appears as a child of the backlog item (for example on the taskboard). Hope this saves someone some time :-)
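
If you need this in more than one place, the steps can be wrapped into a small helper. Here is a sketch along the lines of the snippets above (the project name, the "To Do" state and the link usage are taken from them, the method itself is illustrative):

public static WorkItem CreateChildTask(WorkItemStore workItemStore, string projectName,
                                       int backlogItemId, string title)
{
  // create the task work item in its initial state
  var project = workItemStore.Projects[projectName];
  var task = new WorkItem(project.WorkItemTypes["Task"]) { Title = title, State = "To Do" };
  task.Save();

  // link it as a child of the backlog item
  var linkType = workItemStore.WorkItemLinkTypes["System.LinkTypes.Hierarchy"];
  var backlogItem = workItemStore.GetWorkItem(backlogItemId);
  backlogItem.Links.Add(new WorkItemLink(linkType.ReverseEnd, task.Id));
  backlogItem.Save();

  return task;
}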

Tuesday, August 21, 2012

Portable code between Silverlight, WPF and more

I have given some talks about multi-targeting between WPF and Silverlight. The usual approach to achieve this was to have one project per platform and use file linking to avoid duplicated code. This worked quite well, but it was a bit of a hassle.

With .NET 4.5 Microsoft gave developers a new tool to solve the problem: portable libraries! There is a new project template called Portable Class Library. If you select it, a dialog box appears asking you which platforms should be targeted:


The selected settings can later be changed using the project properties. The class library can be directly referenced in a Silverlight project as well as in WPF, which was not possible before: an assembly targeted to the .NET Framework could not be referenced from Silverlight.

Some drawbacks...

Like every nice feature, this one has its drawbacks as well. In a PCL project you can only use the subset of the framework that is supported by all targeted platforms; MSDN has a list of the supported features. A PCL project cannot reference platform-specific assemblies - it is limited to other PCL assemblies.
A major thing I miss, compared to the old-school approach of multiple DLLs with shared files, is that you can no longer use compiler directives for platform-specific code. Neither can you effectively determine what platform your code is running on. On the one hand that is the whole idea of PCL; on the other hand it feels like a limitation to me.

And a conclusion...

I think it all comes down to careful planning and using the right tool for the right case. You cannot discard the multi-assembly approach completely if you want to get the most out of multi-targeting. But PCLs are a handy way to keep truly common code - the code that does not need to know the platform it is running on - in a common location. As suggested on MSDN, platform-specific code can then be implemented in platform-specific assemblies that inherit from your PCL base classes. This leads to a mixed mode between the old and the new approach.
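
To illustrate that mixed mode: the platform-neutral logic and an abstract hook live in the PCL, and each platform assembly that references it supplies the platform-specific part. A minimal sketch - the class names and the file-writing example are made up for illustration:

// In the Portable Class Library: common logic plus an abstract hook.
public abstract class TextStorageBase
{
  public void SaveText(string name, string content)
  {
    if (string.IsNullOrEmpty(name))
      throw new ArgumentException("A name is required.", "name");

    // everything above is platform-neutral; the actual writing is not
    WritePlatformSpecific(name, content);
  }

  protected abstract void WritePlatformSpecific(string name, string content);
}

// In the WPF-specific assembly, which references the PCL:
public class WpfTextStorage : TextStorageBase
{
  protected override void WritePlatformSpecific(string name, string content)
  {
    System.IO.File.WriteAllText(name, content);
  }
}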

Monday, July 30, 2012

VS 2012: Cannot show Test Explorer

For quite some time I had been working with VS 2012 RC and everything was fine. But all of a sudden I could not open the Test Explorer anymore. I got a MEF error message as shown below:


The main error message was

The composition produces a single composition error. The root cause is provided below. Review the CompositionException.Errors property for more detailed information. 
1) Value cannot be null.
Parameter name: testPlatform
This was followed by further error messages. I tried to repair the Visual Studio installation and installed different test platform adapters, but nothing helped. I found the solution at Stack Overflow, but well hidden further down, so I repeat it here:

Under C:\Users\Tobias\AppData\Local\Microsoft\VisualStudio\11.0\ComponentModelCache delete the Microsoft.VisualStudio.Default.cache file. That did the trick for me!

Thanks GertGregers and happy testing.

Friday, July 13, 2012

A self-made agile taskboard for TFS Express

In the early announcements of TFS Express it was stated that it would contain the agile taskboard, but not the sprint / backlog planning facilities. But neither in the beta release nor in the RC could you see the taskboard.

According to a tweet from Buck Hodges, the taskboard will not be included in TFS Express as a trade-off for being free. While this is sad, because the taskboard is great, I understand that some features need to be cut from the free version - and the features that are left are still a great deal!

But because the taskboard is great, and because I thought it could not be that hard, I started to write my own. Currently it is a WPF application and not a website, because I am more familiar with WPF than with ASP.NET MVC. And after a few hours I had my first prototype working!


It is not as nice as the one from Microsoft and it lacks some of the features - but it is a starting point. Here is what it can do for you:

  • Configure your TFS / team project / iteration in a config file (no UI for that)
  • Displays backlog items as rows and the corresponding tasks in the states "To Do", "In Progress" and "Done"
  • You can enter the remaining effort in the text box in the lower right corner (saved on LostFocus)
  • You can drag from "To Do" -> "In Progress", "In Progress" -> "Done", "To Do" -> "Done" and "Done" -> "To Do" ("Done" -> "In Progress" is forbidden)
  • If you drag to "Done" it sets the remaining effort to 0
Here are the limitations:
  • If you enter a wrong TFS URL / team project / iteration the app will crash
  • All operations are synchronous, so there is a delay at start and on every drag
  • You cannot drag from "Done" to "In Progress" because a remaining effort would be required again
  • No validation errors are shown if an operation fails
If you want to try it, feel free to download it. I have not battle-tested this much, so it comes without any warranty, as is, etc. Use it at your own risk.

If you like it or have any suggestions, let me know. In time I will also release the source code.

Happy Daily Scrum-ing...

Friday, June 29, 2012

A flexible way for validation in Silverlight

There are numerous ways to do validation in Silverlight. Plenty of blog posts have been written about all of them, so I will just provide a list with links and then I will focus on the approach I implemented recently.


Binding property / Mechanism / Description:

ValidatesOnExceptions (mechanism: exceptions)
  • Easy to implement
  • Synchronous
  • Only one error per property at a time

ValidatesOnDataErrors (mechanism: IDataErrorInfo)
  • Must not throw exceptions in the property setter
  • Synchronous
  • Only one error per property at a time

ValidatesOnNotifyDataErrors (mechanism: INotifyDataErrorInfo)
  • Supports asynchronous validation
  • Can deliver multiple errors per property at a time
  • Enabled in the binding by default

Data annotations (mechanism: attributes)
  • Easy to implement
  • Multiple validation rules per property

The INotifyDataErrorInfo approach provides the most flexibility. The fact that it is enabled by default saves you a lot of time if you have many views that currently do not use any validation rules. But implementing the interface looks like a lot of work, so I started thinking about how to make things easy.

How things should work

My idea was to write validation code like this:

    public String Description
    {
      get
      {
        return Get(() => Description);
      }
      set
      {
        Set(() => Description, value);
      }
    }

    public IEnumerable<ValidationError> Validate_Description()
    {
      if (Description.Length < 20)
        yield return new ValidationError("Description is too short.");
    }

The Get() / Set() methods are defined in our view model base class. I wanted to use the same approach for validation that we use for executing commands through methods prefixed with Execute_.

How to get things to work

The whole magic happens in the view model base class. It is the one that implements the INotifyDataErrorInfo interface:

public abstract class ViewModelBase :  INotifyPropertyChanged, INotifyDataErrorInfo
{
    public event EventHandler<DataErrorsChangedEventArgs> ErrorsChanged;

    public virtual bool HasErrors
    {
      get { return Get(() => HasErrors); }
      set { Set(() => HasErrors, value); }
    }


    public IEnumerable GetErrors(string propertyName)
    {
      
    }

    protected virtual void OnValidationChanged(String PropertyName)
    {
      if (ErrorsChanged != null)
        ErrorsChanged(this, new DataErrorsChangedEventArgs(PropertyName));
    }
}

In the constructor of the class we scan for methods that start with the prefix Validate_ to create validation rules for them. To do so, we define an interface IValidationRule and a class RelayValidator - very similar to the ICommand / RelayCommand implementation:

public interface IValidationRule
{
  IEnumerable<ValidationError> GetValidationErrors(Object Value);
}

public class RelayValidator : IValidationRule
{
  private Func<object, IEnumerable<ValidationError>> ValidateFunction;

  public RelayValidator(Func<Object, IEnumerable<ValidationError>> Validator)
  {
    this.ValidateFunction = Validator;
  }

  public IEnumerable<ValidationError> GetValidationErrors(Object Value)
  {
    return ValidateFunction(Value);
  }
}

The object parameter passed to the GetValidationErrors method is the property value to be validated. Now we can build up a Dictionary<String, List<IValidationRule>> with the property name as key and a list of validation rules for that property as value (I wanted to be able to have multiple validation rules per property later).

var ValidateMethodNames = 
  this.GetType().GetMethods()
      .Where(m => m.Name.StartsWith(VALIDATE_PREFIX))
      .Select(m => m.Name.StripLeft(VALIDATE_PREFIX.Length));

var result = ValidateMethodNames
      .ToDictionary(
        name => name, 
        name => new List<IValidationRule>() 
        { 
          new RelayValidator(x => GetValidationErrors(name, x)) 
        }
      );

The function GetValidationErrors is defined in the base class as well:

private IEnumerable<ValidationError> GetValidationErrors(
                                                   String PropertyName, Object PropertyValue)
{
  var validateMethodInfo = this.GetType().GetMethod(VALIDATE_PREFIX + PropertyName);
  if (validateMethodInfo == null)
    return null;

  return (IEnumerable<ValidationError>)
          validateMethodInfo.Invoke(this,
          validateMethodInfo.GetParameters().Length == 1 ? new[] { PropertyValue } : null);
}

The method takes the passed-in property name, looks up the corresponding Validate_ method via reflection and invokes it. One little trick is applied here: the Validate_Description method shown above does not take a parameter, because in the view model we can access the value directly. So we check whether the method expects a parameter and, if it does not, we simply do not pass one.

With this in place we can easily implement the GetErrors method, where m_ValidationRules is the dictionary constructed above.

public IEnumerable GetErrors(string propertyName)
{
  if (propertyName.IsNullOrEmpty())
    return null;

  if (m_ValidationRules == null)
    return null;

  if (!m_ValidationRules.ContainsKey(propertyName))
    return null;

  var rules = m_ValidationRules[propertyName];
  var result = rules.SelectMany(r => r.GetValidationErrors(Get<Object>(propertyName)))
                    .Select(e => new ValidationError(propertyName, e.ErrorMessage));

  // Update HasErrors property

  return result;
}

As mentioned before, the Get<Object>(propertyName) method is part of the view model and delivers the value of a property given its name. It is based on a dictionary as well.
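
That plumbing is not part of this post, but a minimal sketch of how such a dictionary-backed Get / Set pair might look inside the base class (an assumption, not the actual implementation) could be:

// Sketch: dictionary-backed property store inside the view model base class.
private readonly Dictionary<string, object> m_Values = new Dictionary<string, object>();

protected T Get<T>(Expression<Func<T>> property)
{
  return Get<T>(((MemberExpression)property.Body).Member.Name);
}

protected T Get<T>(string propertyName)
{
  object value;
  return m_Values.TryGetValue(propertyName, out value) ? (T)value : default(T);
}

protected void Set<T>(Expression<Func<T>> property, T value)
{
  var propertyName = ((MemberExpression)property.Body).Member.Name;
  m_Values[propertyName] = value;

  if (PropertyChanged != null)
    PropertyChanged(this, new PropertyChangedEventArgs(propertyName));

  // re-validate the property and notify the binding system
  OnValidationChanged(propertyName);
}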

The one thing we have missed so far is the HasErrors property. Of course we want it to be set automatically according to the validation errors. So we introduce a list of names of properties that are in an erroneous state. Each time we encounter an error we add the property name to the list, otherwise we remove it. If the list contains any element, HasErrors must be true; if it is empty, it must be false. This is done by the following code, to be inserted into the above listing at the comment:

var propertyHasErrors = result != null && result.Count() > 0;

if (propertyHasErrors)
{
  if (!m_ErrorProperties.Contains(propertyName))
    m_ErrorProperties.Add(propertyName);
}
else
{
  m_ErrorProperties.Remove(propertyName);
}

HasErrors = m_ErrorProperties.Count != 0;

Further steps

At first there is quite some code to write to get things working. But once you have it, you can do some other neat things. For example, you can define attributes for validation:

[AttributeUsage(AttributeTargets.Property, AllowMultiple = false, Inherited = true)]
public class IsNotNullValidation : Attribute, IValidationRule
{
  public IEnumerable<ValidationError> GetValidationErrors(object Value)
  {
    var StringValue = Value as String;

    if (StringValue != null && StringValue.IsNullOrEmpty())
    {
      yield return new ValidationError("The value must not be empty.");
    }

    if (Value == null)
      yield return new ValidationError("The value must not be empty.");
  }
}

You can stack them on top of your properties. The only thing left to do is to scan not only for prefixed methods but for attributes as well, and add them to your dictionary.
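
A sketch of what that additional scan in the base class constructor might look like, merging attribute-based rules into the m_ValidationRules dictionary built above (illustrative, not the original code):

// Collect IValidationRule attributes per property and merge them into the dictionary.
foreach (var property in this.GetType().GetProperties())
{
  var attributeRules = property.GetCustomAttributes(true)
                               .OfType<IValidationRule>()
                               .ToList();

  if (attributeRules.Count == 0)
    continue;

  List<IValidationRule> rules;
  if (!m_ValidationRules.TryGetValue(property.Name, out rules))
  {
    rules = new List<IValidationRule>();
    m_ValidationRules[property.Name] = rules;
  }

  rules.AddRange(attributeRules);
}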

Happy validating!

Tuesday, June 26, 2012

Be careful with (optional) params[]

I just came across an issue with the C# params keyword. It allows you to pass an arbitrary number of arguments to a function. Within the function the parameters are available as an array, which may also be empty.

The original signature looked like this:
public String GetFormattedMessage(int messageId, params object[] placeholderValues)
{...}

The method looks up the message with the given id in a database. The message might contain placeholders into which the passed values must be inserted. Because a message can have any number of placeholders, the values are passed as a params array (yes, the method also handles the case of too many / too few parameters :-))


Calls to the method looked like this:
// The operation {0} succeeded
var msg = GetFormattedMessage(4711, "load customers");
// Customer with first name {0} and last name {1} not found
var msg = GetFormattedMessage(4712, "John", "Smith");
// No service listening on port {0}
var msg = GetFormattedMessage(4713, 8881);


The new requirement was to be able to hand in a default message in case the given message id could not be found, for example due to a missing database update.


So the new signature was:

public String GetFormattedMessage(int messageId, string defaultMessage, params object[] placeholderValues)
{...}

Here comes the problem with the overloaded method: the first two of the above calls immediately started calling the new overload. In all cases where the first placeholder argument is a string - and these are the majority - the second overload is chosen, because the string placeholder value is a better match for the string defaultMessage parameter than for the params array.

I could not find a nice solution for that. Here were the two options I considered:

  • Introduce a new type for the string defaultMessage parameter like so:
    public String GetFormattedMessage(int messageId, DefaultMessage defaultMessage, params object[] placeholderValues)
  • Do not overload but introduce a new method with a distinct name:
    public String GetFormattedMessageWithDefault(int messageId, string defaultMessage, params object[] placeholderValues)
I chose option two because it seemed more explicit to me. I am not happy with it, but I could not think of another solution - if anyone has thoughts on this, I would be glad to hear them. If not, keep it in mind to avoid such a situation. Better yet, avoid optional parameters anyway.
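
For completeness, a sketch of option one: wrapping the default message in its own small type keeps overload resolution away from the placeholder values. The type and the call are illustrative, not the code I shipped:

// A tiny wrapper type that makes the intent explicit at the call site.
public class DefaultMessage
{
  public string Text { get; private set; }

  public DefaultMessage(string text)
  {
    Text = text;
  }
}

public String GetFormattedMessage(int messageId, DefaultMessage defaultMessage,
                                  params object[] placeholderValues)
{...}

// A string placeholder can no longer be mistaken for the default message:
// var msg = GetFormattedMessage(4711, new DefaultMessage("The operation {0} failed"), "load customers");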

Happy coding!

Friday, June 15, 2012

A better world with a better switch statement

It is well known that switch statements should, in most cases, be replaced by polymorphism in object-oriented languages (refer to Clean Code by Robert Martin, Code Smell G23).

But you have to stay pragmatic as well, and sometimes the switch statement is an option. But why must its C# syntax look like this? Why is it so inconsistent with the rest of the language? Why do we have to write break?

I imagine a switch-statement like this:

switch (selector)
{
  case (value)
  { ... }

  case (other value || still other value)
  { ... }

  case (new List<>() { a, b, c })
  { ... }

  default 
  { ... }
}

So no ':' that reminds me of labels. No break after each case - use the well-known block delimiters instead. Write the argument to the case keyword in parentheses, just like with if, while, for and the others. Do not stack case labels, but use the known logical operators / lists to combine cases.
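
For comparison, expressing "other value || still other value" in today's C# means stacking case labels, colons and explicit breaks (the handler methods are just placeholders):

switch (selector)
{
  case 1:
    Handle();
    break;

  case 2:
  case 3:
    HandleOther();
    break;

  default:
    HandleDefault();
    break;
}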

Would this not be nice?



Wednesday, June 13, 2012

Get hold of the view in the view model

The Problem


Triggered by a tweet from Vaughn Vernon about how to get a reference to the view in a view model, I decided to post the solution I have used for this.

In general it is not good practice, but I have needed it sometimes as well (mostly because some part of the system was not designed with MVVM in mind).

The Solution


I defined an interface for the view and let the view implement it. The interface contains a reference to the view model, again through an interface. The view model resides in the view's DataContext, so we route the property to it:

public interface IView
{ 
  IViewModel ViewModel { get; set; }
}

public partial class View : UserControl, IView
{ 
  public IViewModel ViewModel 
  { 
    get { return DataContext as IViewModel; }
    set { this.DataContext = value; } 
  }
}

The interface for the view model contains a reference to the view through its interface.

public interface IViewModel
{ 
    IView View { get; set; }
}

public class ViewModel : VMBase, IViewModel
{ 
    public IView View { get; set; }
}

Now in the XAML of the view I wire up the view model

<UserControl>
    <UserControl.DataContext>
        <local:ViewModel />
    </UserControl.DataContext>
</UserControl>

In the Loaded event of the view I set the reference back to the view model:

public void Control_Loaded(Object sender, RoutedEventArgs e)
{
  if (ViewModel != null)
    ViewModel.View = this;
}

And that's it.

What I like about this solution is that your view / view model know each other only by means of an interface, which gives you a certain amount of decoupling. But it is quite a bit of additional code...

The Drawbacks

As I mentioned earlier, it is generally not a good practice to do this. Mostly it points to the fact that the system is not MVVM-able - but sometimes reality knocks at the door...
A problem with this approach is timing. If you need the view reference, say, in the constructor of your view model, you won't be successful with the above technique. But you can use it in commands or any time after the view was loaded, which was always sufficient for me.
Finally, I don't claim that this is a general-purpose solution for all cases in which one might want to use this - it is just what did the job for me.

Maybe it might be helpful to someone else...





Tuesday, June 5, 2012

Unit Testing and static methods

Miško has known since 2008 what I stumbled over today: you have a hard time if you try to test things that use static classes / members.

My case was similar to this one:

public static class SomeFactory
{
  // does nasty things such as service calls
  public static ISomeObject Create() { ... }
}

public class ThisIsWhatIWantToTest
{
  public void AMethod()
  {
    // ...
    var someObject = SomeFactory.Create();
    // ...
  }
}


I wanted to write tests for AMethod, but I had a hard time. Actually I could not manage to find a seam to work around the service call easily.

Another thing I noticed along the way is that statics obscure the dependencies in the code. This is not a good thing, because it makes the code harder to understand. If you have dependencies, they should be clearly visible for the sake of clarity.
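
A sketch of the kind of refactoring that would have given me a seam: hide the factory behind an interface and inject it, so a test can substitute a fake (the names are illustrative, not production code):

public interface ISomeFactory
{
  ISomeObject Create();
}

// The production implementation can still delegate to the static factory.
public class DefaultSomeFactory : ISomeFactory
{
  public ISomeObject Create() { return SomeFactory.Create(); }
}

public class ThisIsWhatIWantToTest
{
  private readonly ISomeFactory factory;

  public ThisIsWhatIWantToTest(ISomeFactory factory)
  {
    this.factory = factory;
  }

  public void AMethod()
  {
    // ...
    var someObject = factory.Create();
    // ...
  }
}

// In a test: new ThisIsWhatIWantToTest(new FakeSomeFactory()) avoids the real service call.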

I will very carefully consider the use of statics from now on.

Cheers,
Tobias.

Monday, June 4, 2012

Incremental Builds with TFS

A very handy TFS feature is kind of well hidden in the settings. The setting is so "tiny" that I did not find it for a long time. But first things first.

The task: set up a team build that compiles only those assemblies that have been modified since the last build. You can call this an incremental build.

The whole magic is the "Clean Workspace" parameter on the build definition's Process tab. Here you can choose how the build handles its workspace. You have three choices:

  1. All: Delete sources & binaries, then get all and build all. This is a complete rebuild and the default.
  2. Outputs: Delete the binaries but keep the sources. Gets only the sources that have changed (incremental get). Recompiles all the sources so you get a full set of binaries.
  3. None: Delete nothing. Gets only the sources that have changed and rebuilds only assemblies that have changed. 

Option three is the one that does the trick. If you trigger a build, then change something and trigger another build, your drop location will hold two folders. You will see all the files in both folders, but don't be disappointed: all files are copied to the drop location, but mind the modified dates - they are not all the same.


This way you can easily figure out what has actually changed. This comes in handy if you want to ship hotfixes that contain only the files that have really changed, to keep installing the hotfix quick.

Have a nice build ;-)
Tobias.

Tuesday, May 29, 2012

Code Quality with Visual Studio and TFS

Better late than never: here are the slides and samples from my talk "Steigerung der Codequalität mit Visual Studio und TFS".

If you have questions or suggestions, feel free to get in touch.

Have fun and good luck with clean coding :-)

Monday, May 21, 2012

Limits of COM Interop in Silverlight 5?

You can read a whole lot of posts about the P/Invoke capabilities of Silverlight 5. For some weird reasons I needed to initiate an OLE drag and drop from Silverlight to Visual Basic 6.

I thought I would use P/Invoke to call the DoDragDrop method in Ole32.dll, which is used under the hood by the equally named WinForms function. It has the following signature:

[DllImport("ole32.dll")]
static extern int DoDragDrop(IDataObject pDataObject, IDropSource pDropSource,
   int dwOKEffect, int[] pdwEffect);

Even though there is an IDataObject interface in Silverlight, it is not the matching one for this COM call. So I decided to marshal it by hand, as well as the IDropSource interface. This is where the trouble starts.

The IDataObject interface looks like this (at least this is what I think is right):

  [ComImport]
  [InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
  [Guid("0000010E-0000-0000-C000-000000000046")]
  public interface IDataObject
  {
    void GetData([In] ref FORMATETC format, out STGMEDIUM medium);
    void GetDataHere([In] ref FORMATETC format, ref STGMEDIUM medium);
    [PreserveSig]
    int QueryGetData([In] ref FORMATETC format);
    [PreserveSig]
    int GetCanonicalFormatEtc([In] ref FORMATETC formatIn, out FORMATETC formatOut);
    void SetData([In] ref FORMATETC formatIn, [In] ref STGMEDIUM medium,
      bool release);

    IEnumFORMATETC EnumFormatEtc(DATADIR direction);
    [PreserveSig]
    int DAdvise([In] ref FORMATETC pFormatetc, ADVF advf, IAdviseSink adviseSink, out int connection);
    void DUnadvise(int connection);
    [PreserveSig]
    int EnumDAdvise(out IEnumSTATDATA enumAdvise);
  }

As you can see, there are lots of other types and interfaces involved. All in all, nine further interfaces, six structs and six enumerations are needed merely to define this interface. I did all of it, but when I tried to run the code I got the following error:



The error message is "Invalid managed / unmanaged type combination" - which is referred to on Stack Overflow with the solution that the struct layout must be set to sequential. I felt that this was not my problem, but I tried it - with no luck.

What makes me wonder is the second part of the error message. It tells us that marshalling from and to COM interface pointers is not supported. As far as I can see this is a limitation of P/Invoke in Silverlight - but I can find no reference elsewhere on the net. All P/Invoke samples call rather simple functions that accept only primitive data types.

Anyway I have uploaded the whole source code so anyone who feels lucky today can give it a try...

Friday, April 27, 2012

Silverlight DataBinding and NULL

I recently noticed that if you have an ObservableCollection<Something> in Silverlight bound to an ItemsControl with an explicit DataTemplate attached to it, and you insert a NULL reference into the collection, you get the following error message:


Zeile: 1587
Fehler: Unhandled Error in Silverlight Application
Code: 4004   
Category: ManagedRuntimeError      
Message: System.Collections.Generic.KeyNotFoundException: Der angegebene Schlüssel war nicht im Wörterbuch angegeben.
   at System.Collections.Generic.Dictionary`2.get_Item(TKey key)
   at System.Windows.ResourceManagerWrapper.GetResourceForUri(Uri xamlUri, Type componentType)
    



While this seems cryptic at first, it makes perfect sense: for the type of object you add there is nothing in the resource dictionaries to display it - hence the error.

The ugly thing is that the error causes Silverlight to crash completely. A registered ApplicationUnhandledException handler that just sets e.Handled = true does not seem to help either. So if you add something to a bound collection, always check for null.
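
A simple guard at the place where items enter the bound collection is enough - a sketch, with Something and Items as placeholder names:

// Guard against null before it reaches the bound ObservableCollection.
public void AddItem(Something item)
{
  if (item == null)
    return; // or log / throw, depending on what a null means here

  Items.Add(item);
}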

Maybe this saves some of you some time...

Cheers,
Tobias.

Saturday, March 31, 2012

Team Build Customizations

Better late than never - here are the demos from my talk about TFS Team Build customizations, for everyone who is planning customizations of their own.

If you have questions, just ask - otherwise happy building :-)

Build Workflow Demos

Monday, January 30, 2012

Closures in C#

Closures originally come from functional programming languages. Some concepts of those languages have found their way into C#, and with lambda expressions, for example, closures arrived as well.

Here is a small example:

public void DoSomething(int i)
{
  var x = 2 * i;
}

private Action GetAction(int initial)
{
  var i = 2 * initial;
  return () => DoSomething(i);
}

public void CallAction()
{
  Action a = GetAction(5);
  a();
}

The variable i in GetAction is local and is captured by the lambda expression () => DoSomething(i). When the action is invoked in CallAction, i is actually no longer in scope at all. Because of the closure, the delegate can still access it.

So far so good. Now a small example where this feature cost me a little debugging session, reduced to a simplified form:

private List<Action> actions = new List<Action>();

private void CreateActions()
{
  for (int i = 0; i < 10; i++)
  {
    var param = i * 2;
    AddAction(() => DoSomething(param));
  }
}

private void AddAction(Action a)
{
  if (!actions.Contains(a))
    actions.Add(a);
}

public void DoSomething(object i)
{
  var x = 2 * (int)i;
}

Now the actions are first stored in a list, but only if the same action is not already contained in it.

The prize question: how many entries does the actions list contain after CreateActions has run?

The correct answer is: 10. Although the expression () => DoSomething(param) always looks the same, because of the closure it is a different Action every time, so the condition !actions.Contains(a) never kicks in.

With the following variant the number of entries in the list is 1:

private void CreateActions2()
{
  for (int i = 0; i < 10; i++)
  {
    AddAction(() => DoSomething(1));
  }
}

So far so clear - here nothing from the outer scope is captured at all.
But this variant also produces just one entry in the actions list:

private object context = 1;
private void CreateActions3()
{
  for (int i = 0; i < 10; i++)
  {
    context = 2 * i;
    AddAction(() => DoSomething(context));
  }
}

Here the variable context is a reference type, and the closure only captures the address of the variable.
Once you write it down this plainly, it is actually quite logical :-)
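
One consequence worth keeping in mind: because the single stored delegate reads the context field only when it is invoked, it sees whatever value was assigned last (assuming the actions list was empty before the call):

CreateActions3();

// context was last set to 2 * 9 = 18 inside the loop,
// so the one stored action now calls DoSomething(18).
actions[0]();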

Thursday, January 5, 2012

A BranchCreatedEvent and TFS Extensibility

There are many extension and integration points in TFS. A very fine one is the event system. We can hook into it either by defining a web service that gets called by TFS - this works fine with WCF - or by writing a server-side plugin for the event we are interested in, i.e. a class that implements the ISubscriber interface, and deploying it to TFS.

Most documentation found on the web is written for TFS 2010, but as of now everything runs well on my TFS vNext installation.

The BranchCreatedEvent
There are a lot of useful events, and with TFS vNext the list got even longer compared to version 2010. I wanted to do something every time a branch is created, so I was looking for a BranchCreatedEvent. But surprise: there is no such event, neither in TFS 2010 nor in TFS vNext. A suggested solution was to create a service that polls for new branches. I am not happy with this approach, but there seems to be no other way. So I started thinking about where to put my polling service.
One option was to create a long-running WCF service. I discarded that, because I was not sure whether the service would restart once the app pool was recycled. Another option was to write a custom Windows service. Here I was concerned that I would end up with a new service for every new requirement. So I decided to implement a plugin-based service, to have one spot where new features can be added.

A plugin-based Task Scheduler for TFS
So I wanted to have a task scheduler that could be extended through plugins. And surprise again: there is already such a thing in TFS, called the TFS Job Agent - the thing that is also responsible for initiating the event processing. So all I had to do was find out how to implement a plugin for this agent.

Implementing a custom TFS Job Agent Job
Information about how to accomplish this was a little harder to find. This one put me on track, and other nice information can be found here.

Here is how the story goes:

Create a new class library solution and add the following references:
  • Microsoft.TeamFoundation.Client [GAC]
  • Microsoft.TeamFoundation.Common [GAC]
  • Microsoft.TeamFoundation.Framework.Server [C:\Program Files\Microsoft Team Foundation Server Dev11\Application Tier\TFSJobAgent\]
Add a class implementing the ITeamFoundationJobExtension interface:

    public class MyFirstJob : ITeamFoundationJobExtension 
    {
        public TeamFoundationJobExecutionResult Run(TeamFoundationRequestContext requestContext, TeamFoundationJobDefinition jobDefinition, DateTime queueTime, out string resultMessage)
        {
            resultMessage = "Successfully created my first job";
            return TeamFoundationJobExecutionResult.Succeeded;
        }
    }

The following code is needed to register the job and get it executed every 30 seconds:

var tfsConfigServerUri = new Uri(String.Format("http://localhost:8080/tfs"));
var tfsConfigServer = TfsConfigurationServerFactory.GetConfigurationServer(tfsConfigServerUri);
var service = tfsConfigServer.GetService<ITeamFoundationJobService>();

var definition = new TeamFoundationJobDefinition(
                    new Guid("E5B15F37-1B19-4014-B354-B6CA3DA908E7"),
                    "My First Job",
                    "Lab.TFSJob.FirstTry.MyFirstJob",
                    null,
                    TeamFoundationJobEnabledState.Enabled);

var schedule = new TeamFoundationJobSchedule(new DateTime(2012, 1, 5, 9, 0, 0), 30);
definition.Schedule.Add(schedule);
                
service.UpdateJob(definition);

To queue the job initially you can add the following line:

var Result = service.QueueJobNow(definition, false);

The DLL must be deployed to the %ProgramFiles%\Microsoft Team Foundation Server Dev11\Application Tier\TFSJobAgent\plugins\ folder. After this, the job agent service must be restarted once, otherwise the assembly will not be loaded. You can debug your job by attaching to the TFSJobAgent.exe process on the TFS machine.

Now all that is left is some logic to check whether new branches were created between two polls, and you are done!
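
A sketch of how that polling logic inside Run could look, diffing the branch paths found now against those remembered from the previous poll. The use of QueryRootBranchObjects is my assumption; persisting the known branches and reacting to a new one are left out:

// Detect branches created since the last poll by comparing branch paths.
private static readonly HashSet<string> knownBranches = new HashSet<string>();

private static IEnumerable<string> FindNewBranches(TfsTeamProjectCollection collection)
{
  var versionControl = collection.GetService<VersionControlServer>();

  var currentBranches = versionControl.QueryRootBranchObjects(RecursionType.Full)
                                      .Select(b => b.Properties.RootItem.Item)
                                      .ToList();

  var newBranches = currentBranches.Where(path => !knownBranches.Contains(path)).ToList();

  knownBranches.Clear();
  foreach (var path in currentBranches)
    knownBranches.Add(path);

  return newBranches;
}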

Enjoy!