Month: November 2017

VSTS and a build agent on-premises


After a dry-run migration to VSTS, you have to ensure that your builds still run correctly. Each build server has specific capabilities, so it can be hard to reconstruct your build agent on Azure if you don’t have infrastructure as code (ARM templates, etc.) or cannot rely on a hosted agent.
To validate the migration of your build layer easily, a good practice is to set up a connection between your VSTS platform and your on-premises build agent.

Before using a hosted build agent, here is a description of each agent available on VSTS:

• Hosted VS2017 if your team uses Visual Studio 2017.
• Hosted Linux if your team uses development tools on Ubuntu.
• Hosted macOS Preview if your team uses development tools on macOS.
• Hosted if your team uses Visual Studio 2013 or Visual Studio 2015.

I suggest you follow these steps:

1. First, go to your dry-run agent pool page: https://test-dryrun.visualstudio.com/_admin/_AgentPool

2. Download the agent for Windows; this downloads the file vsts-agent-win7-x64-2.124.0.zip.


3. Connect to your on-premises server (not an Azure server for this use case), create a VSTS directory under C:\, copy your agent file there, and unzip it into this directory.
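This step can also be scripted; below is a minimal PowerShell sketch, assuming the agent zip sits in your Downloads folder and PowerShell 5.0 or later is available (for Expand-Archive). The paths are assumptions to adapt:

```powershell
# Create the agent directory (path is an assumption; adapt it to your server).
New-Item -ItemType Directory -Path 'C:\vsts\test' -Force | Out-Null

# Extract the downloaded agent archive into it.
Expand-Archive -Path "$env:USERPROFILE\Downloads\vsts-agent-win7-x64-2.124.0.zip" `
               -DestinationPath 'C:\vsts\test' -Force
```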

4. Add the proxy file into this directory.

5. Add the following environment variables with the right values:

  • http_proxy
  • https_proxy
  • no_proxy
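These variables can be set machine-wide from an elevated PowerShell prompt; the proxy addresses below are placeholders to replace with your company's values:

```powershell
# Placeholder proxy values - replace them with your company's proxy settings.
[Environment]::SetEnvironmentVariable('http_proxy',  'http://yourproxy:8080', 'Machine')
[Environment]::SetEnvironmentVariable('https_proxy', 'http://yourproxy:8080', 'Machine')
[Environment]::SetEnvironmentVariable('no_proxy',    'localhost,127.0.0.1',   'Machine')
```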

6. Unzip your agent file; below is the result.


7. Run PowerShell ISE as administrator.

8. Go to your specific directory; in our case it is test, which contains all the agents of the test VSTS platform:

cd C:\vsts\test
PS C:\vsts\test> .\config.cmd

9. Provide the following answer for each question.

Q1. Enter server URL?
Enter the address of your dry-run account: https://test-dryrun.visualstudio.com
Q2. Enter authentication type (press enter for PAT)?
Press Enter.
Q3. Enter personal access token?
Follow the steps below to generate a token.

To generate a token, open your profile menu and click on the Security section.

Click on the Add button to create a new token with full access.


Copy and paste the token into the PowerShell ISE command prompt.


After that, the on-premises server connects to your VSTS account in order to create and configure the agent.

Q4. Enter agent pool (press enter for default)?
Keep default as the value.

Q5. Enter agent name (press enter for YourServer-1)?
Keep YourServer-1 as the value.

Q6. Enter work folder (press enter for _work)?
Keep _work as the value.

Q7. Enter run agent as service? (Y/N) (press enter for N)?
Enter Y.

Q8. Enter User account to use for the service (press enter for NT AUTHORITY\NETWORK SERVICE)?
Press Enter for this question.

Below is the list of answers for each question.

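As an alternative to answering the prompts one by one, the 2.x agent also accepts the answers as command-line options; below is a sketch using the answers above, assuming your agent version supports unattended configuration (check .\config.cmd --help for the exact option names on your version; &lt;yourPAT&gt; stays a placeholder):

```powershell
# Unattended configuration sketch - values mirror the interactive answers above.
.\config.cmd --unattended `
    --url https://test-dryrun.visualstudio.com `
    --auth pat --token <yourPAT> `
    --pool default `
    --agent YourServer-1 `
    --work _work `
    --runAsService --windowsLogonAccount 'NT AUTHORITY\NETWORK SERVICE'
```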

After that, we can go to the agent pool to ensure that our agent is correctly configured.


How to migrate to VSTS step by step


With the multiple Team Foundation Server versions (2015, 2017, etc.) and the generalization of the VSTS platform, many companies want to move from TFS 2015 to VSTS without going through the 2017 step.
However, this constraint adds another step: restoring your database on TFS 2017 before going to VSTS.
This post describes the steps for this use case.

Here is a post about the differences between TFS and VSTS: link

Remark : before the final migration, you go through a dry-run step, which lets you check the state of your migration.

Below are the migration steps:

1. Detach your collection on TFS 2015.

2. Back up your collection on TFS 2015 (I suggest using the administration console to back up and restore the data layer).

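If you prefer scripting the backup of step 2 over the administration console, a copy-only backup can be taken with sqlcmd; the server, collection database, and backup path below are assumptions to adapt:

```powershell
# Assumed server name, collection database, and backup path - adapt them.
sqlcmd -S YourSqlServer -Q "BACKUP DATABASE [Tfs_YourCollection] TO DISK = N'C:\Backups\Tfs_YourCollection.bak' WITH COPY_ONLY, INIT"
```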

3. Restore your collection on TFS 2017.

Remark : the TfsMigrator tool only supports migrating from TFS 2017 to VSTS, so this temporary step is required.

Download the TfsMigrator tool from here: link. TfsMigrator is a console application with a command for each technical need: we always begin by validating the migration, the next step is preparing, and we finish by importing.

Remark : the prepare process always executes the validate process first.

4. Run the validate command:

TfsMigrator validate /collection:http://yourServer/yourCollectionName

You have to correct all errors before running the migration; warnings are not blocking and can be ignored.

For example, if you have errors about mismatches between identities and group SIDs, I suggest running TFSSecurity (C:\Program Files\Microsoft Team Foundation Server 15.0\Tools):


TFSSecurity.exe /a+ Identity "397c326b-b97c-4510-8271-75aac13de7a9\\" Read sid:S-1-9-XXXXXXXXXX-XXXXXXXXXX-XXXXXXXXXX-XXXXXXXXXX-0-0-0-0-3 ALLOW /collection:https://YourServer:8080/tfs/YourCollection

Run the validate command again until the validation process passes.
Once your migration validates, run the prepare command to prepare it.

Below is the command:

tfsmigrator prepare /collection:http://yourServer/yourCollectionName /tenantDomainName:yourDomain.com /accountregion:WEU
  • Enter your collection’s URL
  • Enter your tenant domain name

Remark : with the new version, ensure that you have added /accountregion:WEU.

Below is the set of files generated and modified by the previous steps.


Now we can view the content of the JSON file and modify its attribute values.

For each attribute, there is a set of actions to execute in order to define its value.


We defined test as the account name; with this name you can check your migration at this URL: https://test-dryrun.visualstudio.com/.

We set the DryRun flag to indicate that the result is a test version of the migration.

We generate the DACPAC file in the root of TfsMigrator, so we indicate the path of the DACPAC file there.
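For reference, the generated import specification has roughly the following shape; the attribute names below are illustrative and should be checked against the file TfsMigrator actually generated for you:

```json
{
  "Source": {
    "Location": "<SAS URL of your Azure storage container>",
    "Files": {
      "Dacpac": "Tfs_test.dacpac"
    }
  },
  "Target": {
    "Name": "test"
  },
  "ImportType": "DryRun"
}
```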

To generate the DACPAC file corresponding to your database, SqlPackage is the tool for this operation.

Go to this directory: C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin.

Just before executing SqlPackage to generate the DACPAC file, detach the collection.

Below is the associated command; set the values of the server and database, and ensure that you get the target file as output:

 


SqlPackage.exe /sourceconnectionstring:"Data Source=YourServer;Initial Catalog=YourDatabase;Integrated Security=True" /targetFile:c:\DACPAC\Tfs_test.dacpac /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:Storage=Memory

 

After generating the DACPAC file, copy it into the migration directory, under TfsMigrator.

Now we define the storage account URL; for this, follow the steps below:

Download Microsoft Azure Storage Explorer and connect to your subscription.

Open Storage Accounts under your subscription.

Open Blob Containers.

Right-click and create a dedicated container under Blob Containers.


Define a shared access signature with an expiry time of at least one month.


Ensure that you have the right expiry date and all the required authorizations.

Click on the Create button.


Click on the Copy button, then paste the SAS URL into your JSON file as the storage URL in the Location attribute.

Upload the identity map file and the DACPAC file with Azure Storage Explorer by clicking the Upload button.

Below are the two uploaded files.

 


Ensure that all the files have been uploaded.

Now open a command prompt and run the import command to finalize the migration:


tfsmigrator import /importFile:import.json

After running it, you get this screen.


You can also follow the migration screen on the board.


How to override the summary of your release or build?



If you want to add an additional panel to the summary of your release instance,
I suggest adding a new markdown file and uploading it with a specific VSO task.
Package this development into a custom task: link.

Below are the details of the development:

1. Begin by adding your markdown template file; put it into the task directory.

2. Add your markdown data file into the same directory.

3. Below is the content of the two files.

The template file (it ensures that we load the same template with the same keys for every release):

# Section name #
- Artefact 1 : @@Artefact1@@
- Artefact 2 : @@Artefact2@@

The data file (it is published on the server for every release):

# Section name #
- Artefact 1 : @@Artefact1@@
- Artefact 2 : @@Artefact2@@

Remark : these two files initially have the same content.

4. Now add a Wrapper.ps1 file that executes the backend processing; below is the content of the file.
In our case, we decided to get data from the testing layer.

[CmdletBinding()]
param(
  # VARIABLES TAKEN FROM THE VSTS EXTENSION
  [string] $artefact1,
  [string] $artefact2
)

This function lets you access the VSTS or TFS platform without entering credentials:

function Get-AccessToken
{
	Write-Verbose -Verbose "Getting Personal Access Token for the Run" 
	$vssEndPoint = Get-ServiceEndPoint -Name "SystemVssConnection" -Context $distributedTaskContext
	$personalAccessToken = $vssEndpoint.Authorization.Parameters.AccessToken
	if (!$personalAccessToken) 
	{ 
		throw "Could not extract personal access token. Exiting."
	} 
	$personalAccessToken
}

Load the markdown template file, fetch the data, and write the data into the data file:

$template = (Get-Content $PSScriptRoot\readmeTemplate.md)

#replace artefact
$template.Replace("@@Artefact1@@", $artefact1) | Set-Content $PSScriptRoot\readme.md

#get access
$token = Get-AccessToken
$returnObj = @{}
$url = "$($env:SYSTEM_TEAMFOUNDATIONSERVERURI)$($env:SYSTEM_TEAMPROJECT)/_apis/test/runs?api-version=1.0"
$response = Invoke-RestMethod -Method Get -Uri $url -Headers @{Authorization = "Bearer $token"}

if ($null -eq $response)
{
	Write-Host "No test run data returned..."
}
else
{
	# ... additional processing of $response ...
	(Get-Content $PSScriptRoot\readme.md).Replace("@@Artefact2@@", $artefact2) | Set-Content $PSScriptRoot\readme.md
}

Upload the specific section into the summary with a VSO logging command:

Write-Host "##vso[task.uploadsummary]$PSScriptRoot\readme.md"

Documentation about VSO logging commands: link

Below is a screen describing the result.
