Code coverage on an Angular application

Posted on Updated on

The goal of this post is to discuss code coverage on an Angular application.
We define a set of steps that let you add code coverage to the test run of an Angular application:
1. Define the Karma configuration of the application (browser, etc.).
2. Add a Karma coverage reporter.
3. Add a build script; we write a PowerShell script that wires this behavior into the build.

For development I suggest you download Visual Studio Code (https://code.visualstudio.com/) and create an Angular application (https://angular.io/guide/quickstart).

We will also talk about continuous builds.
After creating the application, modify the karma.conf.js file by adding the requirements in each section.

basePath: '',
frameworks: ['jasmine', '@angular/cli'],
plugins: [
  require('karma-phantomjs-launcher'),
  // ...
],

Register reporters

reporters: ['progress', 'kjhtml', 'coverage'],

Modify the preprocessors node:
preprocessors: {
  // source files, that you wanna generate coverage for
  // do not include tests or libraries
  // (these files will be instrumented by Istanbul)
  'src/**/*.js': ['coverage']
},

Add the coverageReporter node:
coverageReporter: {
  type: 'html',
  dir: 'coverage/'
},

Remove the Chrome browser and add PhantomJS; this headless browser lets your tests run on a build agent without blocking.
browsers: ['PhantomJS'],

Set the singleRun attribute to true.
singleRun: true

Full file:

// Karma configuration file, see link for more information
// https://karma-runner.github.io/1.0/config/configuration-file.html

module.exports = function (config) {
  config.set({
    basePath: '',
    frameworks: ['jasmine', '@angular/cli'],
    plugins: [
      // plugins required by the frameworks and reporters configured below
      require('karma-jasmine'),
      require('karma-phantomjs-launcher'),
      require('karma-jasmine-html-reporter'),
      require('karma-coverage'),
      require('karma-coverage-istanbul-reporter'),
      require('@angular/cli/plugins/karma')
    ],
    client: {
      clearContext: false // leave Jasmine Spec Runner output visible in browser
    },
    coverageIstanbulReporter: {
      reports: ['html', 'lcovonly'],
      fixWebpackSourcePaths: true
    },
    angularCli: {
      environment: 'dev'
    },
    reporters: ['progress', 'kjhtml', 'coverage'],
    preprocessors: {
      // source files, that you wanna generate coverage for
      // do not include tests or libraries
      // (these files will be instrumented by Istanbul)
      'src/**/*.js': ['coverage']
    },
    coverageReporter: {
      type: 'html',
      dir: 'coverage/'
    },
    port: 9876,
    colors: true,
    logLevel: config.LOG_INFO,
    autoWatch: true,
    browsers: ['PhantomJS'],
    singleRun: true
  });
};
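If you also want the test run to fail when coverage drops too low, the karma-coverage reporter supports an optional check section; this is not part of the original configuration, and the thresholds below are illustrative:

```javascript
// Optional extension of the coverageReporter node shown above:
// fail 'ng test' when global coverage falls below the given thresholds.
coverageReporter: {
  type: 'html',
  dir: 'coverage/',
  check: {
    global: { statements: 80, branches: 80, functions: 80, lines: 80 }
  }
},
```

This is useful on a continuous build, where a falling coverage number should break the build rather than go unnoticed.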

Don’t forget to uncomment the relevant lines of polyfills.ts (PhantomJS needs these core-js polyfills):

/** IE9, IE10 and IE11 requires all of the following polyfills. **/
import 'core-js/es6/symbol';
import 'core-js/es6/object';
import 'core-js/es6/function';
import 'core-js/es6/parse-int';
import 'core-js/es6/parse-float';
import 'core-js/es6/number';
import 'core-js/es6/math';
import 'core-js/es6/string';
import 'core-js/es6/date';
import 'core-js/es6/array';
import 'core-js/es6/regexp';
import 'core-js/es6/map';
import 'core-js/es6/weak-map';
import 'core-js/es6/set';

Now we write the script that launches the tests with code coverage:

param (
  [string] $rootApplication
)

function Test-Front([string] $rootApplication)
{
  Write-Host 'Angular Front - Begin testing' -Verbose

  if ([string]::IsNullOrEmpty($rootApplication))
  {
    throw [System.ArgumentNullException] "rootApplication"
  }

  Set-Location -Path $rootApplication

  Write-Host 'Launch of command : npm i --save-dev karma-phantomjs-launcher'
  npm i --save-dev karma-phantomjs-launcher

  Write-Host 'Launch of command : ng test'
  Write-Host 'Angular tests are hosted by the PhantomJS browser; you can check the configuration in karma.conf.js'
  ng test --code-coverage

  Write-Host 'Angular Front - End testing' -Verbose
}

Test-Front -rootApplication "$rootApplication"

ng test --code-coverage: this command launches the test execution and generates the code coverage.

After execution, the result is a coverage folder containing the metrics of the application (similar to the dist folder produced by the build process).

Below is a screenshot describing the results.


VSTS and build agent on Premise


After a DryRun migration to VSTS, you have to ensure that your builds run correctly. Each build server has specific capabilities, so it can be hard to rebuild your build agent on Azure if you don't have infrastructure as code (ARM templates, etc.), or to try with a hosted agent.
In order to validate the migration of your build layer easily, the practice is to define a connection between your VSTS platform and your on-premise build agent, before using a hosted build agent.

For information, below is a description of each agent available on VSTS.

• Hosted VS2017 if your team uses Visual Studio 2017.
• Hosted Linux if your team uses development tools on Ubuntu.
• Hosted macOS Preview if your team uses development tools on macOS.
• Hosted if your team uses Visual Studio 2013 or Visual Studio 2015.

I suggest you follow these steps:

1. First, go to your DryRun build board: https://test-dryrun.visualstudio.com/_admin/_AgentPool

2. Download the agent for Windows; this action downloads the agent file vsts-agent-win7-x64-2.124.0.zip


3. Connect to your on-premise server (not an Azure server for this use case), create a VSTS directory under C:\, copy your agent file into it, and unzip the file into this directory.

4. Add the proxy file into this directory.

5. Add the following environment variables with the right values:

  • http_proxy
  • https_proxy
  • no_proxy

6. Unzip your agent file; the result is below.


7. Run PowerShell ISE as administrator.

8. Go to your agent directory; in our case it is test, which contains all the agents of the test VSTS platform.

cd C:\vsts\test
PS C:\vsts\test> .\config.cmd

9. Provide the following answer for each question.

Q1. Enter server URL?
Enter the address of your DryRun account: https://test-dryrun.visualstudio.com
Q2. Enter authentication type (press enter for PAT)?
Press the Enter key.
Q3. Enter personal access token?
Follow the steps below in order to generate a token.

For this, go to the security section.


Click on the security section.


Click on the Add button in order to add a new token with full access.


Copy and paste the token into the PowerShell ISE command prompt.


After that, the on-premise server tries to connect to your VSTS in order to create and configure the agent.

Q4. Enter agent pool (press enter for default)?
Define default as the value.


Q5. Enter agent name (press enter for YourServer-1)?
Define YourServer-1 as the value.


Q6. Enter work folder (press enter for _work)?
Define _work as the value.


Q7. Enter run agent as service? (Y/N) (press enter for N)?
Enter Y as the response.


Q8. Enter User account to use for the service (press enter for NT AUTHORITY\NETWORK SERVICE)?
Press Enter for this question.

Below is the list of answers for each question.


After that, we can go to the agent pool in order to ensure that our agent is correctly configured.


How to migrate to VSTS step by step


With the multiple Team Foundation Server platforms (2015, 2017, etc.) and the generalization of the VSTS platform, a lot of companies want to move from TFS 2015 to VSTS without passing through the 2017 step.
However, this constraint adds another step: you have to restore your database on TFS 2017 before going to VSTS.
This post will describe the steps related to this use case.

Here is a post about the differences between TFS and VSTS: link

Remark: Before the final migration, you pass through a DryRun step; the latter lets you check the state of your migration.

Below are the migration steps:

1. Detach your collection on TFS 2015.

2. Back up your collection on your TFS 2015 (I suggest you use the administration console to back up and restore the data layer).


3. Restore your collection on TFS 2017.

Remark: The TfsMigrator tool handles migration from TFS 2017 to VSTS, so it is required to pass through this temporary step.

Download the TfsMigrator tool from here: link. TfsMigrator is a console application with a command for each technical need: we always begin by validating the migration, the next step is preparing it, and we finish by importing it.

Remark: The prepare process always executes the validate process first.

4. Run the validate command:

TfsMigrator validate /collection:http://yourServer/yourCollectionName

You have to correct all errors before running the migration; warnings are not blocking, you can ignore them.

For example, if you have errors about matching identity and group SIDs, I suggest you run TFSSecurity (C:\Program Files\Microsoft Team Foundation Server 15.0\Tools):


TFSSecurity.exe /a+ Identity "397c326b-b97c-4510-8271-75aac13de7a9\\" Read sid:S-1-9-XXXXXXXXXX-XXXXXXXXXX-XXXXXXXXXX-XXXXXXXXXX-0-0-0-0-3 ALLOW /collection:https://YourServer:8080/tfs/YourCollection

Run the validate command again until the validation process passes.
After your migration is validated, run the prepare command to prepare it.

Below is a screenshot describing the command.

tfsmigrator prepare /collection:http://yourServer/yourCollectionName /tenantDomainName:yourDomain.com /accountregion:WEU

  • Enter your collection's URL
  • Enter the tenant domain name

Remark: With the new version, ensure that you have added accountregion:WEU.

Below is the set of files generated and modified by the previous steps.


Now we can view the content of the JSON file and modify the attribute values.

For each attribute we have a set of actions to execute in order to define its value.


We have defined test as the account name; with this name you can check your migration at this URL: https://test-dryrun.visualstudio.com/.

We define the DryRun flag in order to indicate that the result is a test version of the migration.
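Putting these attributes together, a DryRun import file might look like the sketch below; the field names are illustrative assumptions, and the authoritative file is the one generated by the prepare command:

```json
{
  "Source": {
    "Location": "<SAS URL of the Azure storage container>",
    "Files": { "Dacpac": "Tfs_test.dacpac" }
  },
  "Target": { "Name": "test" },
  "Properties": { "ImportType": "DryRun" }
}
```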

We generate the dacpac file in the root of TfsMigrator, so we indicate the path of the dacpac file there.

To generate the dacpac file corresponding to your database, SqlPackage is the tool for this operation.

Go to C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin.

Just before executing SqlPackage to generate the dacpac file, detach the collection.

Below is the associated command; set the values of the server and database, and ensure that you have the target file in the output:


SqlPackage.exe /sourceconnectionstring:"Data Source=YourServer;Initial Catalog=YourDatabase;Integrated Security=True" /targetFile:c:\DACPAC\Tfs_test.dacpac /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:St…



After generating the dacpac file, we copy the latter into the migration directory, under TfsMigrator.

Now we define the storage account URL; for this we follow the steps below:

Download Microsoft Azure Storage Explorer and connect to your subscription.

Open Storage accounts under your subscription.

Open Blob containers.

Right-click and create a specific container under Blob Containers.


Define a shared access signature with an expiry time of at least 1 month.


Ensure that you have a good expiry date and all the authorizations.

Click on the Create button.


Click on the Copy button in order to copy the signature, and paste it into your JSON file as the storage URL in the Location attribute.

Upload the identity map file and the dacpac file with Azure Storage Explorer by clicking on the Upload button.

Below are the two uploaded files.



Ensure below that we have imported all the files.

Now open a command prompt and run the import command in order to finalize the migration.

tfsmigrator import /importFile:import.json

After running it you have this screen.


You can also follow the migration screen on the board.







How to override the summary of your release or build?


If you want to add an additional panel to the summary of your release instance,
I suggest you add a new markdown file and upload the latter with a VSO specific task.
Aggregate this development into a custom task: link.

Below are the details of the development:

1. Begin by adding your markdown template file; put the latter into the task directory.

2. Add your markdown data file into the same directory.

3. Below is the content of the two files.

The template is (the template file ensures that we load the same template with the same keys for every release):

# Section name #
- Artefact 1 : @@Artefact1@@
- Artefact 2 : @@Artefact2@@

The data file is (the data file is published on the server for every release):

# Section name #
- Artefact 1 : @@Artefact1@@
- Artefact 2 : @@Artefact2@@

Remark: These two files have the same content.
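The substitution that Wrapper.ps1 performs on these files can be sketched in JavaScript; the artefact values here are hypothetical:

```javascript
// Build the template in memory (same content as the template file above).
const template = [
  '# Section name #',
  '- Artefact 1 : @@Artefact1@@',
  '- Artefact 2 : @@Artefact2@@'
].join('\n');

// Replace every @@Key@@ token with its value; unknown tokens are left untouched.
function fillTemplate(tpl, values) {
  return tpl.replace(/@@(\w+)@@/g, (token, key) =>
    key in values ? values[key] : token);
}

const readme = fillTemplate(template, {
  Artefact1: '12 tests passed', // hypothetical value
  Artefact2: '85% coverage'     // hypothetical value
});
```

Because unknown tokens are preserved, the file can be filled in several passes, exactly as the PowerShell script does with two successive Replace calls.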

4. Now we add the Wrapper.ps1 file that executes the backend processing; below is the content of the file.
For our case, we decided to get data from the testing layer.

param (
  [string] $artefact1,
  [string] $artefact2
)

This function lets you access the VSTS or TFS platform without entering credentials:

function Get-AccessToken
{
	Write-Verbose -Verbose "Getting Personal Access Token for the Run"
	$vssEndPoint = Get-ServiceEndPoint -Name "SystemVssConnection" -Context $distributedTaskContext
	$personalAccessToken = $vssEndPoint.Authorization.Parameters.AccessToken
	if (!$personalAccessToken)
	{
		throw "Could not extract personal access token. Exiting"
	}
	$personalAccessToken
}

Load the markdown template file, access the data, and write the data into the data file:

$template = (Get-Content $PSScriptRoot\readmeTemplate.md)

#replace artefact
$template.Replace("@@Artefact1@@", $artefact1) | Set-Content $PSScriptRoot\readme.md

#get access
$token = Get-AccessToken
$returnObj = @{}
$url = "$($env:SYSTEM_TEAMFOUNDATIONSERVERURI)$($env:SYSTEM_TEAMPROJECT)/_apis/test/runs?api-version=1.0"
$response = Invoke-RestMethod -Method Get -Uri $url -Headers @{Authorization = "Bearer $token"}

if ($response -eq $null)
{
	Write-Host "We don't have data..."
}

(Get-Content $PSScriptRoot\readme.md).Replace("@@Artefact2@@", $artefact2) | Set-Content $PSScriptRoot\readme.md
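For clarity, the REST URL that the script assembles from the two environment variables can be sketched as follows; the collection URI and project name are illustrative:

```javascript
// Mirror of the URL built in the PowerShell above:
// $(SYSTEM_TEAMFOUNDATIONSERVERURI)$(SYSTEM_TEAMPROJECT)/_apis/test/runs?api-version=1.0
function buildTestRunsUrl(collectionUri, project, apiVersion = '1.0') {
  return `${collectionUri}${project}/_apis/test/runs?api-version=${apiVersion}`;
}

const url = buildTestRunsUrl('https://test-dryrun.visualstudio.com/', 'MyProject');
```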


Upload the specific section into the summary with a VSO task:

Write-Host "##vso[task.uploadsummary]$PSScriptRoot\readme.md"

Documentation about VSO tasks: link

Below is a screenshot describing the result.


Access to VSTS Releases from C# in preview version



This post describes some lines of code that let you access the VSTS platform (Visual Studio Team Services). We describe the fundamental steps to access the new objects and services, and we begin by describing the packages that you have to download from NuGet in order to develop the access layers.

The benefit of this post is to easily identify the list of packages and classes that you have to know in order to access the data, without forgetting that we are on a preview version.

First, begin by adding these packages; note that we add the Microsoft.VisualStudio.Services.Release.Client package in its preview version.

Below is the full list of packages that you have to download:

- Microsoft.AspNet.WebApi.Client with version = 5.2.2
- Microsoft.AspNet.WebApi.Core with version = 5.2.2
- Microsoft.IdentityModel.Clients.ActiveDirectory with version = 3.13.5
- Microsoft.TeamFoundation.DistributedTask.Common.Contracts with version = 15.120.0-preview
- Microsoft.TeamFoundationServer.Client with version = 15.120.0-preview
- Microsoft.TeamFoundationServer.ExtendedClient with version = 15.120.0-preview
- Microsoft.Tpl.Dataflow with version = 4.5.24
- Microsoft.VisualStudio.Services.Client with version = 15.120.0-preview
- Microsoft.VisualStudio.Services.InteractiveClient with version = 15.120.0-preview
- Microsoft.VisualStudio.Services.Release.Client with version = 15.120.0-preview
- Newtonsoft.Json with version = 8.0.3
- System.IdentityModel.Tokens.Jwt with version =
- WindowsAzure.ServiceBus with version = 3.3.2

Below is source code that lets you create a release definition. We define a VssClientCredentials object to ensure the connection to your VSTS, so you need to register your login and your token.

If you wish to get a token, go to the security panel of your VSTS and create a basic token with a retention of 90 days.


We create the VssClientCredentials with the credentials; to establish the connection with VSTS we define a VssConnection object based on the credentials.

ReleaseHttpClient is the service that lets you access releases; it is based on the ReleaseHttpClientBase abstract class.

ReleaseHttpClientBase lets you create, delete, update, and modify your definitions.

In this sample we create a release definition.

VssClientCredentials credentialsDestination = new VssClientCredentials(
    new VssBasicCredential(ConfigurationManager.AppSettings["loginForMigration"],
        ConfigurationManager.AppSettings["tokenForMigration"])); // the token key name is illustrative

using (VssConnection connectionDestination = new VssConnection(serverUrlDestination, credentialsDestination))
using (var releaseHttpClient = connectionDestination.GetClient<ReleaseHttpClient>())
{
    try
    {
        Task<ReleaseDefinition> releaseDefinitionTask = releaseHttpClient.CreateReleaseDefinitionAsync(releaseDefinition, projectName);
    }
    catch (Exception)
    {
        throw;
    }
}

After describing the call to CreateReleaseDefinitionAsync, let's talk about the business rules that the definition has to respect.

Constraints on VSTS :

  • The definition must have an Id.
  • The definition must have a unique Name.
  • On each environment of your release you have to define a RetentionPolicy with three properties: DaysToKeep, RetainBuild and ReleasesToKeep.
  • There are three types of agents; in order to stay coherent with the TFS platform, we use the AgentDeploymentInput type as the agent.

Remark: In future evolutions we recommend using deployment groups, i.e. replacing the AgentBasedDeployPhase type with the MachineGroupBasedDeployPhase type.

foreach (var environment in definition.Environments)
{
    environment.RetentionPolicy = new EnvironmentRetentionPolicy
    {
        DaysToKeep = daysToKeep,
        RetainBuild = true,
        ReleasesToKeep = releasesToKeep
    };

    var agentDeploymentInput = new AgentDeploymentInput();

    if (agentName == AgentNameDefaultConstant)
    {
        agentDeploymentInput.QueueId = 1;
    }

    DeployPhase deployPhase = new AgentBasedDeployPhase
    {
        Name = deployPhaseName,
        Rank = deployPhaseRank,
        DeploymentInput = agentDeploymentInput
    };

    // Adjust tasks of the step
    if (environment.DeployStep != null /* && ... (condition elided in the original) */)
    {
        deployPhase.WorkflowTasks = .....; // elided in the original
    }
}

All tasks are contained in the DeployPhase object.


Document your TFS extension before publishing – Markdown


This post is here to help you improve the quality of extensions published on the marketplace. We have a lot of extensions without documentation, which makes them very hard to use, and we can add documentation easily by defining markdown.
We have a set of steps to follow in order to document an extension.

1. Create your markdown file: add readme.md to your library; if you have one extension library with a set of tasks, I suggest you have one markdown file for all the tasks.

Remark: Your file doesn't need to be copied to the content; leave the default options.


Below are samples of common markdown syntax:

# This is an H1
## This is an H2
###### This is an H6
*Italic characters*
_Italic characters_
**bold characters**
__bold characters__
~~strikethrough text~~
This is [an example](http://www.amayestech.com/) inline link.

You can also install a markdown extension in order to edit it easily:


After installing Markdown Editor, I can view the rendering of my file; note the small M symbol on the file.


The complete documentation is here: https://code.visualstudio.com/docs/languages/markdown

2. Register your readme.md file in the extension manifest by adding the section of code below.


3. For this sample we add images to readme.md; these images must be copied to the img directory (content / do not copy)
and registered in the extension manifest.
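As a sketch, the manifest sections involved could look like this, assuming the standard vss-extension.json schema; the paths are illustrative:

```json
{
  "content": {
    "details": { "path": "readme.md" }
  },
  "files": [
    { "path": "img", "addressable": true }
  ]
}
```

The addressable flag is what makes the images reachable from the rendered readme.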


Now, after the packaging and publishing actions, if we go to the marketplace manager we can view the documentation of the extension.


Package and publish your extension TFS 2015 VNEXT


To speak about the solution, we explain that there are two aspects to set up, packaging and publishing; we begin our sample with packaging.


For the packaging we follow the steps below in order to construct our package; the result of this step is a vsix file.

1. Set up Node.js

2. Install the vset tool with this command: npm install vset --save-dev


3. Run the Node.js command prompt tool (C:\Windows\System32\cmd.exe /k "C:\Program Files (x86)\nodejs\nodevars.bat")

4. Move to the directory of your extension

5. Run the vset package command


Your vsix is generated in your root directory


For information, you can inspect the content of your extension by unzipping your vsix.



Before publishing our extension it is possible to visit the different extensions available in the cloud; it is possible to download them and install them on an on-premise version of TFS, in order to reuse them.

We choose to publish our extension on the Azure marketplace.

We follow these steps: first we create a publisher, then we upload the extension.

1. Go to https://marketplace.visualstudio.com/manage/publishers/
2. Choose to create a publisher by completing a unique Id

After that:
3. Upload your extension by dragging and dropping your vsix

4. Correct and adapt your manifest information


5. Now it's OK and my vsix is uploaded


6. Share the extension with my Azure account; it is possible to update my version by clicking on the Update button.


7. Go to the TFS admin section, select the extension tab, and click on my target extension.



8. In Build VNEXT, find my created extension.


Create your extension TFS 2015 VNEXT


Start by downloading a template project; you have two project templates, one dedicated to the integration part, another dedicated to the realization of custom build or release tasks.

In this post we will realize an extension whose aim is to start an application pool, without going into the specifics of that operation.

For the back-end part we will combine our extension with a PowerShell script.

Begin by opening the Visual Studio Extensions explorer.

Install the following template:


Below is a detail of the different directories of your solution.

Most of these directories will be deleted afterwards.


Now we will start developing our own extension.

First, delete the files that we don't use:

  1. Remove the test directory
  2. Remove the typings directory
  3. Delete the file app.ts

Then:
  1. Add a Sample directory
  2. Create a PowerShell file Sample.ps1
  3. Create a task manifest file task.json
  4. Create a logo in the directory

Below is the output of the created project.


  1. Edit the task.json by defining your layout based on the controls and groups concepts; inputs are your controls, typed with the type property, and are grouped into groups by using the group property (below is a sample of grouping; we have three groups).


Below is the task.json file after modification.


We have another section related to calling the code behind; for our project the code behind is PowerShell, but it can be another type.

  1. Implement your code behind and match it with your layout arguments.

Ensure that the ps1 arguments match the json arguments, and in the execution section of your json, ensure that you have referenced your target file.
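A minimal task.json sketch following these conventions is shown below; the task name, input, group and script path are illustrative, and the real file also needs fields such as id and version:

```json
{
  "name": "StartAppPool",
  "friendlyName": "Start application pool",
  "category": "Deploy",
  "groups": [
    { "name": "advanced", "displayName": "Advanced", "isExpanded": false }
  ],
  "inputs": [
    {
      "name": "appPoolName",
      "type": "string",
      "label": "Application pool name",
      "required": true,
      "groupName": "advanced"
    }
  ],
  "execution": {
    "PowerShell": {
      "target": "$(currentDirectory)\\Sample.ps1",
      "workingDirectory": "$(currentDirectory)"
    }
  }
}
```

Here the appPoolName input is what the Sample.ps1 script must accept as a parameter of the same name.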



  1. For debugging after implementing, open PowerShell ISE; you can find it at C:\Windows\System32\WindowsPowerShell\v1.0\powershell_ise.exe

Ensure that you have the right to debug by executing this script, and just press F5.


  1. Register your extension in the manifest.



MCT Summit


The 2016 MCT Summit in Sydney, Australia invites Microsoft Certified Trainers from Australia, New Zealand and beyond to a two-day event co-hosted by the MCT community and Microsoft. MCT events are your best opportunity to interact with your peers and engage directly with Microsoft. Come learn about the future of Microsoft learning, the MCT Program, how to teach a specific course or technology, or how to become a better trainer.

Microsoft’s Worldwide MCT/MCP Program Manager, Patrick Thomas will be here from Microsoft Corporation, Redmond, USA as well as Chee Sing Chen, Microsoft Asia Pacific Learning Experiences Director will be in attendance and you will have the ability to find out more about Microsoft’s MCT Program Strategy and Vision.

Registration is now open and Agenda details are available HERE


Welcome to AEA for Aghilas


Cool, I am now part of the AEA committee.

Dear Aghilas,

Welcome to the Association of Enterprise Architects (AEA). As a new member, you have joined forces with EA professionals worldwide who are working together to advance the profession of Enterprise Architecture and promote professional excellence.

Your username for the AEA is aghilas.yakoub and your Membership Number is ………

To sign into the AEA website please go to: https://www.globalaea.org

After logging in you will be able to manage your Profile by editing your bio, managing preferences and email subscriptions to blogs and forums.

You will also be able to access content and features, such as blogs, photo gallery, networks, and our AEA community via messaging, connections, groups, chapters, Work Groups, and Forums. You will be able to connect with fellow members and share ideas and expertise.

We hope you enjoy our online community and look forward to your participation!

Association of Enterprise Architects (AEA)