
Actor Model

An actor is an isolated, independent unit of compute and state with single-threaded execution. The actor pattern is a computational model for concurrent or distributed systems in which a large number of these actors can execute simultaneously and independently of each other. Actors can communicate with each other and they can create more actors.

The concurrency models we have considered so far have the notion of shared state in common. Shared state can be accessed by multiple threads at the same time and must thus be protected, either by locking or by using transactions.

We now take a look at an entirely different approach, one without the notion of shared state. State is still mutable; however, it is exclusively coupled to the single entities that are allowed to alter it, the so-called actors.

An actor is a computational entity that, in response to a message it receives, can concurrently:

  • send a finite number of messages to other actors
  • create a finite number of new actors
  • designate the behavior to be used for the next message it receives.

There is no assumed sequence to the above actions and they could be carried out in parallel.

[Figure: Actor model]

For communication, the actor model uses asynchronous message passing.
When implementing the actor model, it is important to adhere to the set of rules defined by the original idea. First and foremost, actors must not share any state. This forbids actors from passing references, pointers, or any other kind of shared data as part of a message. Only immutable data and the addresses (i.e. “names”) of actors should be sent.

Message passing between actors is often enriched with a few more guarantees than a purely best-effort style. Most implementations ensure that two messages sent from one actor to another arrive in the order they were sent. Messaging is always asynchronous, and the interleaving of incoming messages sent by multiple actors is indeterminate.
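To make this concrete, here is a minimal, framework-free sketch of an actor in C# (the CounterActor type and its integer messages are purely illustrative): a private mailbox, state that only the actor touches, and a single consumer loop that processes messages one at a time in arrival order.

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class CounterActor
{
    // The mailbox: senders only enqueue messages, they never touch the actor's state
    private readonly BlockingCollection<int> mailbox = new BlockingCollection<int>();

    // State owned exclusively by this actor
    private int count;

    public CounterActor()
    {
        // A single consumer loop guarantees single-threaded access to the state
        Task.Factory.StartNew(() =>
        {
            foreach (var increment in mailbox.GetConsumingEnumerable())
            {
                count += increment;
                Console.WriteLine("Count is now {0}", count);
            }
        }, TaskCreationOptions.LongRunning);
    }

    // Sending is asynchronous from the caller's point of view: it only enqueues the message
    public void Send(int increment)
    {
        mailbox.Add(increment);
    }
}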

Frameworks for the Actor Model

There are many frameworks for developing a distributed system based on the actor model. The most popular include Akka, Orleans, Service Fabric Reliable Actors, and TPL Dataflow.

The TPL Dataflow model promotes actor-based programming by providing in-process message passing for coarse-grained dataflow and pipelining tasks. These dataflow components are useful when you have multiple operations that must communicate with one another asynchronously or when you want to process data as it becomes available. The TPL Dataflow Library provides a foundation for message passing and for parallelizing CPU-intensive and I/O-intensive applications that need high throughput and low latency. It also gives you explicit control over how data is buffered and moves around the system.
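As a quick illustration (the block names and messages below are mine, not from any official sample), a minimal two-stage pipeline built from a TransformBlock and an ActionBlock looks like this:

using System;
using System.Threading.Tasks.Dataflow; // NuGet package: System.Threading.Tasks.Dataflow

class DataflowDemo
{
    static void Main()
    {
        // Each block owns its own state and, by default, processes one message at a time
        var toUpper = new TransformBlock<string, string>(s => s.ToUpperInvariant());
        var print = new ActionBlock<string>(s => Console.WriteLine(s));

        // Link the blocks into a pipeline and propagate completion downstream
        toUpper.LinkTo(print, new DataflowLinkOptions { PropagateCompletion = true });

        // Posting is message passing; the sender never blocks on downstream processing
        toUpper.Post("hello");
        toUpper.Post("dataflow");

        toUpper.Complete();
        print.Completion.Wait();
    }
}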

Service Fabric Reliable Actors is an implementation of the actor design pattern. Although the actor design pattern can be a good fit for a number of distributed-systems problems and scenarios, careful consideration must be given to the constraints of the pattern and of the framework implementing it (a minimal Reliable Actor sketch follows the guidance list below).

As general guidance, consider the actor pattern to model your problem or scenario if:

  • Your problem space involves a large number (thousands or more) of small, independent, and isolated units of state and logic.
  • You want to work with single-threaded objects that do not require significant interaction from external components, including querying state across a set of actors.
  • Your actor instances won’t block callers with unpredictable delays by issuing I/O operations.
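As a rough, hedged sketch of what such an actor looks like with the Reliable Actors API (the counter actor, its interface, and the state key below are illustrative, not taken from the text above):

using System.Threading.Tasks;
using Microsoft.ServiceFabric.Actors;
using Microsoft.ServiceFabric.Actors.Runtime;

// The actor's contract; clients talk to it only through asynchronous proxy calls
public interface ICounterActor : IActor
{
    Task<int> IncrementAsync();
}

[StatePersistence(StatePersistence.Persisted)]
internal class CounterActor : Actor, ICounterActor
{
    public CounterActor(ActorService actorService, ActorId actorId)
        : base(actorService, actorId)
    {
    }

    public async Task<int> IncrementAsync()
    {
        // Turn-based concurrency: only one message is processed at a time,
        // so this read-modify-write needs no explicit locking
        int count = await StateManager.GetOrAddStateAsync("count", 0);
        count++;
        await StateManager.SetStateAsync("count", count);
        return count;
    }
}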

WPF Multithreading

WPF has the concept of the UI thread and background (or secondary) threads. Code on a background thread is not allowed to interact with UI elements (controls, etc.) owned by the UI thread. So what happens when you need to add and remove items in a collection from a background thread? That is pretty common in applications that must keep a list updated. When the collection raises its change notification events, the binding system and other listeners execute code on the UI thread.

Before WPF 4.5, we had to marshal every update through the Dispatcher to make this work with an observable collection, or end up writing a thread-safe observable collection ourselves (often also exposing an AddRange method, which ObservableCollection does not provide).

WPF 4.5 includes a number of targeted performance and capability features, one of which is cross-thread collection change notification. Enabling it is as simple as adding a lock object and a single method call. Compared with the manual dispatching approach, this can be a real performance win, not to mention it saves you some coding time.

Use BindingOperations.EnableCollectionSynchronization to enable this cross-thread access to observable collections.

So inside your view model you call this method and provide the lock used to synchronize the collection across threads. Once you call BindingOperations.EnableCollectionSynchronization, the WPF data binding engine participates in the locking. The default behavior is to acquire a lock on the object specified in that call, but you also have the option of using more complex locking schemes (see the callback sketch at the end of this section).

Something like:

BindingOperations.EnableCollectionSynchronization(this.items, itemsSyncLock);

where:

private readonly object itemsSyncLock = new object();

private readonly ObservableCollection<string> items = new ObservableCollection<string>();

Now, when you have to add data to the collection from a background thread, simply lock the sync object and update it:

lock (itemsSyncLock)
{
// Once locked, you can manipulate the collection safely from another thread
items.Add(value);
}

This means you no longer need an IDispatcher abstraction (or manual Dispatcher.Invoke calls) in your application.

There are also other binding properties, such as IsAsync (to retrieve the data asynchronously) and Delay (to wait a specified time before the binding source is updated).
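For the more complex locking schemes mentioned above, EnableCollectionSynchronization has an overload that takes a CollectionSynchronizationCallback. A minimal sketch using a ReaderWriterLockSlim (the field and method names are mine):

using System;
using System.Collections;
using System.Threading;
using System.Windows.Data;

private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();

// WPF invokes this callback around every read or write it performs on the bound collection
private void OnCollectionAccess(IEnumerable collection, object context, Action accessMethod, bool writeAccess)
{
    if (writeAccess)
    {
        rwLock.EnterWriteLock();
        try { accessMethod(); }
        finally { rwLock.ExitWriteLock(); }
    }
    else
    {
        rwLock.EnterReadLock();
        try { accessMethod(); }
        finally { rwLock.ExitReadLock(); }
    }
}

// In the view model constructor, register the callback instead of a plain lock object
// (the context argument can be null):
BindingOperations.EnableCollectionSynchronization(this.items, null, OnCollectionAccess);

Your own background writers then enter the write lock on the same ReaderWriterLockSlim before touching the collection.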

Sandbox AppDomain Creation

Application domains provide a unit of isolation for the common language runtime. Creating a sandbox environment to isolate a code base lets you load and unload assemblies on demand, which makes it possible to unload unwanted assemblies and avoid memory leaks, e.g. when hosting the Roslyn compiler.

Security is an important aspect when you allow third-party assemblies, such as plugins, to run within your application. With evidence, you can control the predefined permission sets granted to code running in the child app domain.

Code snippet

private PermissionSet GetPermissionSet()
{
    // create evidence of type zone
    var ev = new Evidence();
    ev.AddHostEvidence(new Zone(SecurityZone.MyComputer));

    // return the PermissionSet specific to that type of zone
    var permissions = SecurityManager.GetStandardSandbox(ev);

    if (permissions != null)
    {
        permissions.AddPermission(new SecurityPermission(SecurityPermissionFlag.Execution));
        permissions.AddPermission(new EnvironmentPermission(PermissionState.Unrestricted));
        permissions.AddPermission(new FileIOPermission(PermissionState.Unrestricted));
    }

    return permissions;
}

private AppDomain CreateSandbox()
{
    var ps = GetPermissionSet();

    var setup = new AppDomainSetup
    {
        ApplicationBase = AppDomain.CurrentDomain.BaseDirectory,
        ApplicationName = DomainName,
        DisallowBindingRedirects = true,
        DisallowCodeDownload = true,
        DisallowPublisherPolicy = true
    };

    var trusted = new[]
    {
        typeof (EngineFactory).Assembly.Evidence.GetHostEvidence<StrongName>()
    };

    var domain = AppDomain.CreateDomain(DomainName, AppDomain.CurrentDomain.Evidence, setup, ps, trusted);
    return domain;
}
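A hypothetical usage sketch, assuming EngineFactory derives from MarshalByRefObject so that only a proxy crosses the domain boundary (the surrounding code is mine, not part of the snippet above):

var sandbox = CreateSandbox();
try
{
    // CreateInstanceAndUnwrap returns a transparent proxy to the object living in the sandbox
    var factory = (EngineFactory)sandbox.CreateInstanceAndUnwrap(
        typeof(EngineFactory).Assembly.FullName,
        typeof(EngineFactory).FullName);

    // ... use the factory; all the assemblies it loads stay inside the sandboxed domain ...
}
finally
{
    // Unloading the domain releases every assembly it loaded
    AppDomain.Unload(sandbox);
}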

Host a website programmatically, including the app pool

The PowerShell script below reads the sample JSON file and sets up your website based on the information in it.

#Powershell to create web site host from metadata
function Get-Service-Name
{
 param(
 [parameter(Mandatory=$false)] 
 [ValidateNotNullOrEmpty()] 
 [object] $component 
 )
 
 return [System.IO.Path]::GetFileNameWithoutExtension($component.solutionPath)
}

function Get-WebHost-Descriptor
{
 param(
 [parameter(Mandatory=$false)] 
 [ValidateNotNullOrEmpty()] 
 [object] $component 
 )
 $webDescription = $component.hostDescription;
 
 return $webDescription ;
}

function Create-WebHost
{
 param(
 [parameter(Mandatory=$false)] 
 [Object] $component, 
 [bool] $recreateHost = $false
 )

 process
 {
 $webDescription = Get-WebHost-Descriptor $component

 $websitePath = "IIS:Sites$($webDescription.Name)";
 if ($webDescription.appPool)
 {
 $webAppPoolPath = "IIS:AppPools$($webDescription.appPool.Name)"
 }

 if ($recreateHost)
 {
 if (Test-Path $websitePath -PathType Container)
 {
 Write-Verbose "Removing site $($webDescription.Name)"
 Remove-Item $websitePath -Confirm:$false -Force -Recurse 
 if ($webAppPoolPath)
 { 
 Write-Verbose "Removing appPool $($webDescription.appPool.Name)"
 Remove-Item $webAppPoolPath -Confirm:$false -Force -Recurse 
 }
 }
 }

 if ($webAppPoolPath -and -Not (Test-Path $webAppPoolPath -PathType Container))
 {
 Write-Verbose "Creating app pool $($webDescription.appPool.Name)"
 $appPool = New-Item $webAppPoolPath
 $appPool | Set-ItemProperty -Name "managedRuntimeVersion" -Value "v4.0"
 if ($webDescription.appPool.User)
 {
 Write-Verbose "Setting app pool $($webDescription.appPool.Name) identity"
 $appPool.processModel.userName = $webDescription.appPool.User;
 $appPool.processModel.password = $webDescription.appPool.Password;
 $appPool.processModel.identityType = 3;
 $appPool | Set-Item
 $appPool.Stop();
 $appPool.Start();
 }
 }
 
 if (-Not (Test-Path $websitePath -PathType Container))
 {
 New-Item -ItemType Directory -Force -Path $webDescription.PhisicalPath > $null
 
 $bindings =$webDescription.Protocols | % {$i=0}{ @{protocol=$_;bindingInformation="$($webDescription.BindingInformations[$i++])"}} | ? {-Not [string]::IsNullOrWhiteSpace($_.bindingInformation)}
 
 Write-Verbose "Creating web site $($webDescription.Name)"
 New-Website -Name $webDescription.Name -Port $bindings[0].bindingInformation.Replace(":","") -PhysicalPath $webDescription.PhisicalPath > $null

 #$a = New-Item $websitePath -bindings ($b) -physicalPath $webDescription.PhisicalPath

 $bindings | Select -skip 1 | % {New-ItemProperty -path $websitePath -name bindings -value @{protocol=$_.protocol.ToString();bindingInformation=$_.bindingInformation.ToString()}}
 #$bindings | Select -skip 1 | % {New-WebBinding -Name $webDescription.Name -IPAddress "*" -Port $_.bindingInformation.Replace(":","").Replace("*","") -Protocol $_.protocol > $null}
 Set-ItemProperty $websitePath -name EnabledProtocols -Value (($webDescription.Protocols | select -uniq) -join ',')
 if ($webAppPoolPath)
 {
 Set-ItemProperty $websitePath -Name "applicationPool" -Value $webDescription.appPool.Name
 }
 }

 $webAppPath = "$websitePath$(Get-Service-Name $component)"

 if (-Not (Test-Path $webAppPath -PathType Container))
 {
 Write-Verbose "Creating web app $webAppPath"
 New-Item $webAppPath -physicalPath $webDescription.PhisicalPath -type Application > $null
 if ($webDescription.Protocols -and $webDescription.Protocols.Count -gt 0)
 {
 $protocols = (($webDescription.Protocols | select -uniq) -join ',')
 Write-Verbose "Setting EnabledProtocols on WebApp $protocols"
 Set-ItemProperty $webAppPath -name enabledProtocols -Value (($webDescription.Protocols | select -uniq) -join ',')
 if ($webAppPoolPath)
 {
 Set-ItemProperty $webAppPath -Name "applicationPool" -Value $webDescription.appPool.Name
 }
 }
 }
 return @{Name=$webDescription.Name}
 }
}


# Example to create web site from json definition

 [string] $json = '[
 {
 "type": "Web",
 "hostDescription": {
 "BindingInformations": [":323:", "3808:*"],
 "Name": "YourWebsiteName",
 "Protocols": ["http", "net.tcp"],
 "PhisicalPath": "c:\inetpub\wwwroot\",
 "solutionPath": "YourWebDirectoryName",
 "appPool": {
 "Name": "YourAppPoolName",
 "User": "AccountForAppPool",
 "Password": "PasswordForAppPool"
 }
 }
 }
 ]'

$components = ($json | ConvertFrom-Json)
$firstWebSite = $components[0];
Write-Host $firstWebSite
Create-WebHost $firstWebSite -recreateHost $true

Validate SSIS package on server

SSIS server validation


  • Any change in database schema can break the associated SSIS packages.
  • Generally, each change owner is responsible for identifying the dependent components to include in the change scope.
  • But what about missed dependencies, e.g. a recent production issue?
  • Such broken SSIS packages can be identified in pre-prod only if pre-prod is a well-controlled, identical copy of the production environment.
  • A process is required to identify packages impacted by dependency changes such as database schema changes.
  • Proactive validation of SSIS packages can help to identify such issues early and reduce system downtime.

High level architecture


The solution is capable of validating all SSIS packages for SQL Server 2008 onwards. Instead of publishing the full source code, below are the pseudo-code and SQL scripts used to validate the packages.

  1. Create a C# console project that takes the name of the server you want to validate. The app calls the web service to validate that SQL Server instance (i.e. validate the SSIS packages on that server).
  2. Create a Web API project and expose the following methods on a JobController (a sketch of the controller surface follows this list):
    1. Validate – starts the validation process by dynamically creating a SQL Agent job on the target server
    2. ValidationFinished – invoked by the job when validation has finished
    3. GetValidationStatus – returns the status of the SQL Agent job currently validating the server
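A hypothetical sketch of that controller surface in ASP.NET Web API (routes, parameters, and bodies below are illustrative only):

using System.Web.Http;

public class JobController : ApiController
{
    [HttpPost]
    public IHttpActionResult Validate(string serverName)
    {
        // create the SQL Agent validation job dynamically on the target server
        return Ok();
    }

    [HttpPost]
    public IHttpActionResult ValidationFinished(string serverName)
    {
        // called back by the job once all packages on the server have been validated
        return Ok();
    }

    [HttpGet]
    public IHttpActionResult GetValidationStatus(string serverName)
    {
        // report the status of the SQL Agent job currently validating the server
        return Ok();
    }
}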

The Validate method works as follows:

  • Get the SQL Server version:

private Version GetServerVersion(string serverName)
{
    var connectionString = string.Format("Server={0};Database=master;Trusted_Connection=True;", serverName);
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        return new Version(connection.ServerVersion);
    }
}
  • Create an IProcessor interface and implement it for SQL Server 2008 and 2012, since each version validates packages differently (a minimal sketch follows this list).
  • Based on the server version, resolve the appropriate concrete implementation.
  • The validation engine creates the validation job on the target SQL Server. For each proxy account, the script creates a job step that triggers a network-share package running under that proxy account (for details see createValidationJob.sql).
  • Each validation step takes the packages scheduled under the current account, validates them, and writes the results back to the central server; at the end it extracts the job history and deletes the job.
  • The network-deployed SSIS package uses the same engine, selecting the right IProcessor implementation, since validation is done differently on 2008 and 2012 servers.
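A minimal sketch of the version-specific processor selection (the interface and class names are illustrative; SQL Server 2012 reports engine version 11.x):

using System;

public interface IProcessor
{
    void ValidatePackages(string serverName);
}

public class Ssis2008Processor : IProcessor
{
    public void ValidatePackages(string serverName)
    {
        // 2008: shell out to DTEXEC with the /Validate switch for each package
    }
}

public class Ssis2012Processor : IProcessor
{
    public void ValidatePackages(string serverName)
    {
        // 2012 onwards: call the SSISDB catalog.validate_project procedure on the target server
    }
}

public static class ProcessorFactory
{
    public static IProcessor Create(Version serverVersion)
    {
        // SQL Server 2012 is engine version 11; anything older falls back to the 2008 path
        return serverVersion.Major >= 11
            ? (IProcessor)new Ssis2012Processor()
            : new Ssis2008Processor();
    }
}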

The common scripts used for both servers are here (download and rename doc to zip)

Common-Scripts

The SQL Server 2008 validation is done via DTEXEC with the /Validate switch (download and rename doc to zip)

SQL 2008 Scripts
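As a hedged illustration of what the 2008 step amounts to (the package path is made up), validation can be driven by launching dtexec with the /Validate switch and checking its exit code:

using System.Diagnostics;

var startInfo = new ProcessStartInfo
{
    FileName = "dtexec",
    // /File points at the package on the network share; /Validate stops after the validation phase
    Arguments = "/File \"\\\\fileshare\\packages\\MyPackage.dtsx\" /Validate",
    UseShellExecute = false,
    RedirectStandardOutput = true
};

using (var process = Process.Start(startInfo))
{
    string output = process.StandardOutput.ReadToEnd();
    process.WaitForExit();
    bool packageIsValid = process.ExitCode == 0;   // dtexec reports success with exit code 0
}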

The SQL Server 2012 validation is done by executing the SSISDB catalog.validate_project stored procedure (download and rename doc to zip)

SQL 2012 Scripts
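And a hedged sketch of the 2012 path from C#, assuming the standard catalog.validate_project parameters (@folder_name, @project_name, @validate_type, @validation_id); the folder and project names are made up, and your catalog version may require additional arguments:

using System.Data;
using System.Data.SqlClient;

using (var connection = new SqlConnection("Server=TargetServer;Database=SSISDB;Trusted_Connection=True;"))
using (var command = new SqlCommand("catalog.validate_project", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.AddWithValue("@folder_name", "MyFolder");
    command.Parameters.AddWithValue("@project_name", "MyProject");
    command.Parameters.AddWithValue("@validate_type", "F");          // full validation
    var validationId = command.Parameters.Add("@validation_id", SqlDbType.BigInt);
    validationId.Direction = ParameterDirection.Output;

    connection.Open();
    command.ExecuteNonQuery();                                        // kicks off the validation run
}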

A centralized server keeps the status of the SSIS validation runs.


The front end to display the data was built with Angular 2 on top of the API.