Compare commits

340 Commits

Author SHA1 Message Date
Hai Cao
31bee67f00 bump STS (#23148) 2023-05-16 14:40:54 -07:00
Alan Ren
e2949d4494 add securable settings (#22936) (#23141)
* wip

* Update typings

* nullable

* update test service

* support securables

* update test data

* fix issues

* fix build failure

* update test mocks

* fix typo

* fix reference

* fix findobjectdialog issue

* update SearchResultItem type

* fix table component perf issue

* hide effective permission for server role

* hide effective permission for app role and db role

* vbump sts and fix a couple issues

* STS update and UI update

* fix user login display issue

* vbump sts
2023-05-15 15:35:47 -07:00
Cheena Malhotra
b6bd726066 Support advanced options in command line arguments (#23104) (#23124) 2023-05-12 13:51:15 -07:00
Benjin Dubishar
0fe638974d Fixing issue where sqlcmdvars wouldn't load from publish profile in ADS (#23116) (#23121)
* fixing issue where sqlcmdvars wouldn't load from publish profile in ADS

* in -> of
2023-05-12 13:50:26 -07:00
Benjin Dubishar
f364e52079 Fix deploy/generatePlan/saveProfile when no sqlcmdvars are defined (#23112) (#23120)
* fix deploy/generate when no sqlcmdvars are defined

* saveProfile
2023-05-12 13:50:02 -07:00
Maddy
84143d3caf remove the access point (#23105) (#23114) 2023-05-11 19:31:57 -07:00
Kim Santiago
0cfaf69647 fix sqlcmd variables not getting loaded correctly in vscode (#23103) (#23108) 2023-05-11 19:31:34 -07:00
Aasim Khan
343d878457 Adding telemetry to ads OE filter (#23089) (#23106)
* Adding telemetry to ads oe filter

* Fixing prop names

* fixing prop name

* Fixing localized strings

* Update src/sql/azdata.proposed.d.ts



* Update src/sql/workbench/contrib/objectExplorer/browser/serverTreeView.ts



---------

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
2023-05-11 19:31:07 -07:00
Cheena Malhotra
7d0cfc4f99 Support migrating credentials to new format (#23088) (#23095) 2023-05-11 18:41:11 -07:00
Barbara Valdez
f13406570e update workbench file and fix relative link not working in markdown (#23109) (#23113) 2023-05-11 16:52:05 -07:00
Benjin Dubishar
c44ecf56f2 Fixing bug where SQLCMD vars weren't getting JSONified (#23082) (#23096)
* changing param for sqlcmdvars back to Record since JSON.stringify doesn't handle Maps (see the sketch after this entry)

* swapping over savePublishProfile
2023-05-10 23:12:12 -07:00
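A side note on the JSON.stringify/Map point above — this is standard JavaScript/TypeScript behavior, shown here as a minimal illustrative sketch rather than repository code:

```typescript
// Minimal sketch (not repository code): JSON.stringify serializes plain
// objects (Record<string, string>) but silently drops Map entries.
const asMap = new Map<string, string>([["Env", "Prod"], ["Schema", "dbo"]]);
const asRecord: Record<string, string> = { Env: "Prod", Schema: "dbo" };

console.log(JSON.stringify(asMap));    // "{}" -- Map contents are lost
console.log(JSON.stringify(asRecord)); // {"Env":"Prod","Schema":"dbo"}

// One workaround is converting the Map before serializing:
console.log(JSON.stringify(Object.fromEntries(asMap))); // {"Env":"Prod","Schema":"dbo"}
```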
Aasim Khan
f10fc9b56d Adding back OE filtering in May release. (#23070) 2023-05-10 08:41:26 -07:00
Benjin Dubishar
72acd2af83 Bumping SQL Tools Service (#23046) (#23051) 2023-05-09 14:09:15 -07:00
Cheena Malhotra
6112eabc3c Fixes import wizard to work with enabled SQL auth provider (#23004) (#23052) 2023-05-09 11:12:19 -07:00
Lewis Sanchez
27bac701bb Use notification prompt before running a command (#23035) (#23048)
* Use notification prompt before running a command

https://github.com/microsoft/vscode/pull/179702/commits

* Removes unused declarations
2023-05-09 11:10:46 -07:00
Cheena Malhotra
0bcd010d9a Fixes build folder compilation + enable linux .deb files (#23006) (#23042) 2023-05-09 11:10:14 -07:00
Aasim Khan
bca671bc3f disabling async tree by default (#23037) 2023-05-09 11:09:33 -07:00
Aasim Khan
706ba6c974 Adding filtering dialog and action to OE (#22937) (#23036)
* Adding init change

* Adding filter cache in OE

* Adding more filtering changes

* Fixed stuff with dialog

* Fixing filter

* Adding support for connections

* Fixed stuff

* filtering

* Fixing date

* Filters

* Removing is filtering supported

* Removing contracts

* Fixing filters

* Fixing cache

* Adding some accessibility changes

* Reverting some more changes to pull in changes from main

* Adding comments

* Fixing boolean operators

* Fixing stuff

* Fixing stuff

* Fixing error handling and making dialog generic

* Fixing more stuff

* Making filter a generic dialog

* adding erase icon

* removing floating promises

* Fixing compile issue

* Adding support for choice filter with different and actual value.

* Adding null checks

* Adding durability type fix

* Fixing filtering for providers that do not play well with empty filter properties
2023-05-09 11:07:03 -07:00
Kim Santiago
8e38295691 vbump sql projects to 1.1.1 (#23029) (#23039) 2023-05-08 16:44:44 -07:00
Alan Ren
a97d882e3c fix #174264 (#174845) (#23027) (#23034)
Co-authored-by: Sandeep Somavarapu <sasomava@microsoft.com>
2023-05-08 15:49:00 -07:00
Aasim Khan
f605a3c644 Fixing node retention after disconnect (#23020) 2023-05-08 10:13:53 -07:00
Aasim Khan
6fa948adbe Fixing delete server group error when we try to focus on root (#23019) 2023-05-08 09:46:40 -07:00
Alex Hsu
4b9147a6a0 Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230508154231869. (#23026) 2023-05-08 09:23:56 -07:00
Raymond Truong
6684dbb78c [SQL Migration] Add storage/VM connectivity validation (#22982)
* Implement storage account connectivity check for SQL VM targets

* Add missing break statement

* Address PR comments
2023-05-08 08:42:40 -07:00
Aasim Khan
3ae97b81f2 Fixing the error showing up in console log (#23018) 2023-05-08 08:29:39 -07:00
Alex Ma
ff3f6d53c7 Langpack source update for May release (#23014) 2023-05-07 19:58:41 -07:00
Benjin Dubishar
1620b3b374 Fixing deleteDatabaseReference test for vscode-mssql (#23008) 2023-05-07 13:08:04 -07:00
Alan Ren
688b0c0819 add label for database dropdown (#23015) 2023-05-07 10:27:30 -07:00
Alan Ren
718b149e84 introduce fieldset component (#23005) 2023-05-06 21:36:08 -07:00
Sakshi Sharma
feed449d97 Update default folder structure option in VSCode (#23002) 2023-05-05 16:50:32 -07:00
Cheena Malhotra
77c8b3bda1 Validate MSAL library is enabled (#23000) 2023-05-05 16:44:04 -07:00
Benjin Dubishar
127a2d2e2f Updating SqlProjects readme to have absolute github link for image (#23001) 2023-05-05 15:53:02 -07:00
Alex Ma
898bb73a34 Revert new connection string format (#22997) 2023-05-05 13:41:40 -07:00
Cory Rivera
27e0d67dec Add context menu entries for deleting a database (#22948) 2023-05-05 12:12:35 -07:00
Cheena Malhotra
0dc05a6a4c Hide tenant dropdown from Connection Dialog (#22973) 2023-05-05 10:40:00 -07:00
Cory Rivera
876a4a24f6 Bump SQL Tools dependency to 4.7.0.28 (#22983) 2023-05-05 10:39:34 -07:00
Charles Gagnon
c3bf85f026 Create separate ScriptableDialogBase (#22974)
* Create separate ScriptableDialogBase

* more
2023-05-05 09:17:51 -07:00
Cheena Malhotra
9af7a049e6 Revert build folder update to fix it properly (#22981)
* Revert "Disable publishing crash reports temporarily (#22950)"

This reverts commit 13a791d14e.

* Revert "Compile build folder (#22811)"

This reverts commit 2c07c09d0d.
2023-05-05 08:50:23 -07:00
Alan Ren
70e756b82d fix duplicate required indicator (#22976) 2023-05-04 21:53:42 -07:00
Alan Ren
b37df9b6ad fix tab style in hc mode (#22975) 2023-05-04 21:53:24 -07:00
Charles Gagnon
88197a5162 Fix validation errors in package.json when clause (#22972) 2023-05-04 16:19:52 -07:00
Aasim Khan
302855e4a4 Fixing async server tree error handling and removing timeout. (#22955)
* Fixing async server tree issues and removing timeout

* removing empty results for connection errors

* Fixing error message fetching

* Update src/sql/workbench/services/objectExplorer/browser/asyncServerTreeDataSource.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

---------

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
2023-05-04 15:50:37 -07:00
Alan Ren
86cd0003fe add required indicator (#22971) 2023-05-04 15:23:22 -07:00
Benjin Dubishar
94745a69f5 Bumping Tools Service dependency (#22963) 2023-05-04 11:45:00 -07:00
Benjin Dubishar
8a56e0c0cd Bumping azdata dependency (#22961) 2023-05-04 11:33:43 -07:00
Alan Ren
8a5387d97a remove the rename db context menu (#22962) 2023-05-04 10:52:07 -07:00
Alex Hsu
47893e005c Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230504154225133. (#22960) 2023-05-04 09:29:04 -07:00
Alex Ma
e76e6df49c [Loc] final update to localization before code complete (#22956) 2023-05-03 17:16:12 -07:00
Cheena Malhotra
13a791d14e Disable publishing crash reports temporarily (#22950) 2023-05-03 16:20:01 -07:00
Charles Gagnon
c6549dd6f4 Remove vs code engine version for whoisactive extension (#22952) 2023-05-03 16:07:00 -07:00
Karl Burtram
1ddcce5a75 Bump STS to 4.7.0.26 for User w Login fix (#22934) 2023-05-03 14:10:30 -07:00
AkshayMata
ced2f7938f Bump sql-migration version (#22946)
Co-authored-by: Akshay Mata <akma@microsoft.com>
2023-05-03 13:43:43 -07:00
Raymond Truong
0d2ed6e517 [SQL Migration] Improve IR registration experience (#22926)
* Add registration instructions to IR page

* Clean up

* Typo

* Fix typo

* Replace link with aka.ms link

* Refactor + implement regenerate auth keys

* Update strings and clean up comments

* Fix sqlMigrationServiceDetailsDialog

* Fix sqlMigrationServiceDetailsDialog width

* Extract helpers to utils

* Add IR registration instructions to sqlMigrationServiceDetailsDialog

* Update SHIR description slightly
2023-05-03 16:43:00 -04:00
Kim Santiago
844ed758a5 Fix .publish.xml file extension not being used on mac (#22939) 2023-05-03 13:18:42 -07:00
Kim Santiago
8f37ea8746 fix macros not getting replaced in sql project item scripts (#22945) 2023-05-03 13:18:02 -07:00
Sakshi Sharma
55d652198c Fix schema comparison failure for Azure synapse (#22938) 2023-05-03 13:03:40 -07:00
AkshayMata
a8a88ccbeb [SQL-Migration] Improve log migrations telemetry (#22927)
- Bucketized errors to track top errors
- Created separate login migration-specific error to improve monitoring (see the sketch after this entry)

---------

Co-authored-by: Akshay Mata <akma@microsoft.com>
2023-05-03 10:35:44 -07:00
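The error bucketization mentioned above can be illustrated with a small sketch (hypothetical bucket names, patterns, and helper, not the extension's actual telemetry code): raw error messages are normalized into a bounded set of buckets so telemetry can count the top error categories instead of unbounded distinct strings.

```typescript
// Illustrative sketch only: map raw error messages to a fixed set of buckets.
const buckets: Array<{ name: string; pattern: RegExp }> = [
	{ name: "LoginNotFound", pattern: /login .* does not exist/i },
	{ name: "PermissionDenied", pattern: /permission was denied/i },
	{ name: "Timeout", pattern: /timeout/i },
];

function bucketizeError(message: string): string {
	return buckets.find(b => b.pattern.test(message))?.name ?? "Other";
}

// Count occurrences per bucket, which is what gets reported to telemetry.
const counts = new Map<string, number>();
for (const msg of ["Login 'app_user' does not exist", "Connection timeout expired"]) {
	const bucket = bucketizeError(msg);
	counts.set(bucket, (counts.get(bucket) ?? 0) + 1);
}
console.log(counts); // Map { 'LoginNotFound' => 1, 'Timeout' => 1 }
```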
Cheena Malhotra
2c07c09d0d Compile build folder (#22811)
* Compile build folder

* Fix build compile issues (#22813)

* Revert changes

* Update gulp-shell

* Test

* Update

* Update modifiers

* Try reverting minimist

* Generates linux deb artifact (#22922)

* Remove deb files that were brought in with the latest merge.

* Add debian back to linux gulp file

* Remove async from anonymous function.

* Remove run core integration tests build step in pipeline

* Revert "Remove async from anonymous function."

This reverts commit 7ad1ce2942954fce58939b9965343b46b9311a7e.

* Revert "Add debian back to linux gulp file"

This reverts commit 96b7c0f0995c8024ef67ed886da34255a5caa325.

* Revert "Remove deb files that were brought in with the latest merge."

This reverts commit bf3aae233b8da1f9111a149a96d77cc78d376094.

* Removes dependency checks

* Fix dependency gen errors

* Reenable "Build Deb" step

* Reenable publish deb

* Run core integration tests

* Revert "Run core integration tests"

This reverts commit 7cafbada194feebe771862af796fb3416b5dd686.

* Revert "Try reverting minimist"

This reverts commit 38fd843c1d5c33318a92f4bbc7057e951c5a9f71.

* Disable code coverage step in the interim

---------

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
Co-authored-by: Lewis Sanchez <87730006+lewis-sanchez@users.noreply.github.com>
2023-05-02 19:32:46 -07:00
Alex Ma
32c99ab1a6 [Loc] Localization update for 5-2-2023 (Second to last before code complete) (#22940) 2023-05-02 17:36:20 -07:00
Kim Santiago
ab44d205d0 add telemetry for saving publish profiles (#22933) 2023-05-02 16:09:18 -07:00
Kim Santiago
e57fc9f202 shorten 'Save Profile as...' button text to 'Save as...' (#22932)
* shorten 'Save Profile as...' button text to 'Save as...'

* uppercase
2023-05-02 15:37:16 -07:00
Raymond Truong
6de9c5e1ae [SQL Migration] Add support for assessing XEvent session files (#22210)
* Template

* Refactor

* Update strings

* Clean up

* Add clear button

* Clean up

* Fix typo and use aka.ms link

* Refactor to use GroupContainer

* Remove dialog and clean up common strings

* Fix previous/forward behavior

* Make group container default to collapsed

* Clean up

* Slightly reword string

* Add https to aka.ms link
2023-05-02 10:45:13 -04:00
Cheena Malhotra
1f3a514c90 Add launch config for VS Code selfhost test provider (#22913) 2023-05-01 20:27:37 -07:00
Alex Ma
7edd48fb11 added new password reset number (#22920) 2023-05-01 18:28:28 -07:00
Alex Ma
88c9e0b492 [Loc] update to XLF on 5-1-2023 prior to code complete (#22921) 2023-05-01 17:20:51 -07:00
Kim Santiago
a56109dad7 vbump data workspace (#22914) 2023-05-01 15:50:31 -07:00
Charles Gagnon
457365537c Add License section to extension READMEs (#22912)
* Add License section to extension READMEs

* vbump
2023-05-01 13:51:51 -07:00
Charles Gagnon
a650f268f7 Remove sqlservernotebook extension (#22903) 2023-05-01 13:44:14 -07:00
Alan Ren
6a7899281a remove auth type from user (#22905)
* remove auth type from user

* vbump sts
2023-05-01 13:29:55 -07:00
AkshayMata
af6f9089f7 [SQL-Migration] Improve error message for failed migration service download (#22846)
Add troubleshooting links to the error message shown when SQL-Migration fails to download the MigrationService, as seen in issue #22558

---------

Co-authored-by: Akshay Mata <akma@microsoft.com>
2023-05-01 13:24:35 -07:00
Cheena Malhotra
7b2a07befd Respect 'showDashboard' disabled by default (#22907) 2023-05-01 11:55:38 -07:00
Cheena Malhotra
ea6bb41f45 Allow 'ApplicationName' to be specified for MSSQL connections (#22890) 2023-05-01 10:55:05 -07:00
Cheena Malhotra
f4952c76b8 Handle no matching account scenario (#22908) 2023-05-01 10:54:42 -07:00
Charles Gagnon
24e67a1cbd Remove publisher link and fix license link (#22904) 2023-05-01 09:44:56 -07:00
Charles Gagnon
519a42c5b3 Fix missing placeholder warning for infoboxes (#22898) 2023-05-01 09:30:02 -07:00
Benjin Dubishar
6b1dd0e468 Changing "files" to "sqlObjectScripts" to be more accurate (#22899)
* Changing "files" to "sqlObjectScripts" to be more accurate

* fixing func call
2023-05-01 08:31:13 -07:00
Aasim Khan
b86463ee71 Adding filtering to OE Service (#22900)
* Adding filtering to OE Service

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>

* Update src/sql/workbench/services/objectExplorer/browser/serverTreeRenderer.ts

Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>

* Fixing compile errors

* Fixing circular dependency

---------

Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>
2023-04-29 17:22:38 -04:00
Cheena Malhotra
e26937b101 Touch up MSAL errors (#22906) 2023-04-28 21:22:26 -07:00
Cheena Malhotra
ed8149599c Update error for ignored tenants (#22881) 2023-04-28 16:19:29 -07:00
Benjin Dubishar
29ff6ca16c Adding Move, Exclude, and Rename support for folders (#22867)
* Adding exclude folder and base for move folder

* checkpoint

* rename

* Fixing up tests

* Adding exclude test to projController

* Adding tests

* fixing order of service.moveX() calls

* Updating move() order in sqlproj service

* PR feedback

* unskipping

* reskipping test

* Fixing folder move conditional

* updating comments
2023-04-28 16:05:38 -07:00
Kim Santiago
934d8ff8fa Add nupkg database reference option for sql projects in vscode (#22882)
* add nupkg db ref option in vscode

* add telemetry for nupkg db ref

* update comment
2023-04-28 14:57:22 -07:00
Alan Ren
468b3e4f06 update shortcut keys for parse query (#22896) 2023-04-28 14:45:28 -07:00
Alan Ren
15d26b7f9a fix the sqlservices sample extension (#22893) 2023-04-28 14:45:20 -07:00
Alan Ren
4f53d76eb5 User Management - Support new object types: Server Role, Application Role and Database Role (#22889)
* server role dialogs

* dialogs for other types

* refactor

* find object dialog

* script button

* refactoring

* fix issues

* fix title

* vbump sts

* remove language from links
2023-04-28 12:05:20 -07:00
Kim Santiago
ba09248483 fix build errors from sql projects test file (#22894) 2023-04-28 11:57:54 -07:00
Sakshi Sharma
757067b132 Required changes to make sql projects extension work with vscode (#22847)
* Update CreateProject api

* More updates

* Fix a few comments

* Address comments

* Remove package.json changes

* Fix error

* Fix testUtil
2023-04-28 10:27:59 -07:00
Cheena Malhotra
c04b8af1d2 Prompt user to refresh account credentials for AADSTS70043 and AADSTS50173 error codes (#22853) 2023-04-27 20:44:22 -07:00
Lewis Sanchez
942786c2a7 Enables previously skipped tests (#22884) 2023-04-27 17:20:12 -07:00
brian-harris
fe32180c71 SQL-Migration: enable cross subscription service migration (#22876)
* x subscription migration support

* refresh after cutover

* fix irregular service status load behavior

* queue service status requests, fix typo

* add migationTargetServerName helper method

* save context before api call
2023-04-27 16:28:32 -07:00
Charles Gagnon
65f8915b7e Remove azurehybridtoolkit extension (#22879) 2023-04-27 13:56:14 -07:00
Karl Burtram
109d428d8c Bump STS to pickup latest User management updates (#22880) 2023-04-27 11:32:22 -07:00
Aasim Khan
9c68043137 Adding OE filtering interfaces. (#22738)
* Adding interfaces for tree filtering

* fixing type

* Adding enums in sqlhost

* Fixing comment

* Fixed filters lol

* Fixing some contract definitions

* Removing flag

* Update src/sql/azdata.proposed.d.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Removing filters

* Removing filters2

* Fixing enum

* Fixing interface and enum names

* Fixing interface name

* Update src/sql/azdata.proposed.d.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Removing type from filter

* Adding is null and is not null operators

* Fixing enums

* Update src/sql/azdata.proposed.d.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Update src/sql/azdata.proposed.d.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Update src/sql/azdata.proposed.d.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

---------

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
2023-04-27 13:15:11 -04:00
Alex Hsu
c4b19597b6 Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230427154831319. (#22874) 2023-04-27 09:38:01 -07:00
Benjin Dubishar
62255fe4dd no longer filtering to well-known database sources (#22864) 2023-04-26 13:31:15 -07:00
Kim Santiago
abff6a0a34 fix schema compare diff editor colors not being reversed after merge (#22852) 2023-04-26 09:25:07 -07:00
Lewis Sanchez
1fcba44772 Removes package-lock.json (#22855) 2023-04-25 21:49:42 -07:00
Charles Gagnon
5ba8369cb0 Fix Azure rest calls not working (#22854) 2023-04-25 19:29:32 -07:00
Kim Santiago
d551f5170d add telemetry for add database reference quickpick (#22848) 2023-04-25 13:52:48 -07:00
Cheena Malhotra
24af5db4a2 Fix resetting tenant when dropdown is not visible (#22845) 2023-04-25 12:53:58 -07:00
Charles Gagnon
64dd4f0904 Fix IConnectionProfile to have same options key when generated from ConnectionProfile (#22840) 2023-04-25 09:52:51 -07:00
Cheena Malhotra
a887bb199e Fix username to respect existing values (#22837) 2023-04-24 15:28:44 -07:00
Charles Gagnon
167ef2fea8 Finish up no-unsafe-assignment fixes in azurecore (#22836)
* Finish up no-unsafe-assignment fixes in azurecore

* Add link to typings

* Remove unused

* Ignore webpack file
2023-04-24 14:13:00 -07:00
Kim Santiago
8616c5948b Update vscode-mssql.d.ts to match what's in the vscode-mssql repo (#22830)
* update vscode-mssql.d.ts

* update extensions that need updates because of the vscode-mssql.d.ts changes

* remove skip

* fix sql projects tests failing because vscode-mssql couldn't be found
2023-04-24 13:40:05 -07:00
Alex Ma
c42418d89c Update to langpack base source file, and fix to xlf generation. (#22829)
* update to build lib and langpack base files

* fix for gulp localization xlf task and xlf update

* fix to yarn
2023-04-24 11:33:13 -07:00
Charles Gagnon
7f388dd420 Add azurecore HTTP typings (#22828)
* Add azurecore HTTP typings

* undo + spelling fix
2023-04-24 10:39:02 -07:00
Erin Stellato
e5e8824d34 Minor updates to readme (#22827)
* Minor updates to readme

* removed extraneous URL

removed ?view=sql-server-ver16 from docs URL
2023-04-24 10:51:44 -04:00
Cheena Malhotra
2247d5de88 Prevent reconnects for mssql provider (#22825) 2023-04-21 16:56:40 -07:00
Benjin Dubishar
c581b0285d Bumping SqlToolsService to 4.7.0.17 (#22823)
* Bumping STS

* Whoops

* Upgrading STS to 4.7.0.17
2023-04-21 16:44:52 -07:00
Cheena Malhotra
db03525b10 Fix launch.json (#22822) 2023-04-21 14:34:05 -07:00
Charles Gagnon
3bd85a5798 Move stringifying of request body for azure REST calls (#22820)
* Move stringifying of request body for azure REST calls

* spelling

* Remove unused
2023-04-21 13:27:11 -07:00
Charles Gagnon
ec81fc89bb Fix CI workflow (#22819) 2023-04-21 13:26:39 -07:00
Charles Gagnon
651f1ed85b Few more no-unsafe-assignment cleanups (#22818) 2023-04-21 11:27:03 -07:00
Kim Santiago
41e6f3b84b fix accessibility issue for open dialog location radio button (#22812)
* fix accessibility issue where location radio button showed as required

* fix ariaLabel
2023-04-21 09:53:02 -07:00
Lewis Sanchez
c8618d39fe Update distro commit hash (#22810) 2023-04-20 21:41:51 -07:00
Cheena Malhotra
f83815cecd Enable Sql Authentication Provider by default (targeting May release) (#22213) 2023-04-20 17:55:56 -07:00
Benjin Dubishar
2142c706b0 Improving error message when projects fail to load (#22786)
* Improving message when project fails to load

* Cleaning up string
2023-04-20 14:00:43 -07:00
Cheena Malhotra
8613176817 Cache access tokens in local cache file to prevent MSAL throttling (#22763) 2023-04-20 13:55:30 -07:00
Karl Burtram
0bdb35d9ab Enable proposed APIs for copilot (#22805) 2023-04-20 12:51:55 -07:00
Benjin Dubishar
a9e359f58f Bumping STS to 4.7.0.15 (#22796)
* Bumping STS

* Whoops
2023-04-20 10:43:28 -07:00
Kim Santiago
b98ac1d211 change value to default value in sqlcmd variable prompt (#22787) 2023-04-20 10:18:48 -07:00
Karl Burtram
a9951e977c Disable DEB publish step (#22795)
* Disable DEB publishing

* Revert "Reenable DEB build (#22794)"

This reverts commit 9132695d61.
2023-04-20 08:35:14 -07:00
Karl Burtram
9132695d61 Reenable DEB build (#22794) 2023-04-20 07:36:05 -07:00
Karl Burtram
e7d3d047ec Merge from vscode merge-base (#22780)
* Revert "Revert "Merge from vscode merge-base (#22769)" (#22779)"

This reverts commit 47a1745180.

* Fix notebook download task

* Remove done call from extensions-ci
2023-04-19 21:48:46 -07:00
Alan Ren
decbe8dded simplify object management feature APIs (#22781) 2023-04-19 19:26:29 -07:00
Raymond Truong
34d092a7dd vbump extension (#22784) 2023-04-19 13:56:49 -07:00
Alan Ren
35829acd00 fix error message steal focus issue (#22782)
* fix steal focus issue

* update comment
2023-04-19 11:39:45 -07:00
Kim Santiago
39a28c5f51 Add nupkg option to add database reference dialog (#22772)
* add nupkg option to add database reference dialog

* Add required indicator

* only show nupkg radio button for SDK-style projects

* fix enable ok button

* hookup

* fix typo
2023-04-19 10:09:15 -07:00
Kim Santiago
c29bb27d9e fix sqlcmd var table in publish dialog (#22770)
* fix sqlcmd var table in publish dialog

* const

* remove unused _value
2023-04-19 09:17:34 -07:00
Cheena Malhotra
ba694a0558 Register mysql, pgsql with mssql as default 'sql' kernel providers (#22729) 2023-04-18 22:07:02 -07:00
Karl Burtram
47a1745180 Revert "Merge from vscode merge-base (#22769)" (#22779)
This reverts commit 6bd0a17d3c.
2023-04-18 21:44:05 -07:00
Lewis Sanchez
6bd0a17d3c Merge from vscode merge-base (#22769)
* Merge from vscode merge-base

* Turn off basic checks

* Enable compilation, unit, and integration tests
2023-04-18 18:28:58 -07:00
Cheena Malhotra
6186358001 Rename 'body' to 'data' to prevent breaking change (#22761) 2023-04-18 18:04:51 -07:00
Kim Santiago
4709eab293 make UserDatabaseReferenceProjectEntry class (#22768) 2023-04-18 13:23:09 -07:00
Kim Santiago
2dcbdc9c63 Handle nukpg database references in project.ts (#22762)
* changes in project.ts for adding nupkg database references

* Add tests

* more tests

* fix comment

* remove it.only
2023-04-18 11:11:42 -07:00
Alex Ma
b69e87df15 Connection URI with complete options (finalized) (#22735)
* Connection URI made to include every option available instead of basic details (#22045)

* Revert "Merge remote-tracking branch 'origin' into feat/connectionUri"

This reverts commit 11b2d31bf99e216daee823f732254f69a017fee1, reversing
changes made to 36e4db8c0744f81565efdfd2f56a3ae3c0026896.

* Revert "Revert "Merge remote-tracking branch 'origin' into feat/connectionUri""

This reverts commit f439673c2693e1144c52e04c14e82cd8566c13a6.

* Added changes and fixes for feat connectionuri (#22706)

* add title generation at start

* added await to refreshConnectionTreeTitles
2023-04-18 11:08:48 -07:00
Karl Burtram
a9bc34acf0 Bump STS to 4.7.0.12 for Login script fix (#22766) 2023-04-18 09:37:29 -07:00
Karl Burtram
48cc5e53bd Bump STS to 4.7.0.12 for Login script fix (#22766) 2023-04-18 09:36:57 -07:00
Aasim Khan
fe086dc778 Fixing adding connections on a new ads install (#22760)
* Fixing stuff

* Fixing connection groups as well
2023-04-18 08:39:53 -07:00
Aasim Khan
b7d24dcecd Cleaning up some code in Async Server Tree. (#22732) 2023-04-17 23:23:41 -07:00
Kim Santiago
938a8bffbe Add AddNugetPackageReferenceRequest for sql projects (#22757)
* Add AddnugetPackageReferenceRequest for sql projects

* update comment

* fix whitespace
2023-04-17 13:31:17 -07:00
Benjin Dubishar
02e61d1598 Swapping Record usage to Map in SQL Projects (#22758)
* Changing SqlCmdVars from Record to Map

* Converting the rest

* Updating tests

* more cleanup

* Updating test to use new test creation API
2023-04-17 12:56:39 -07:00
Charles Gagnon
4a4580e9ef More azurecore cleanup (#22748)
* More azurecore cleanup

* convert to any

* Fix name
2023-04-17 10:29:49 -07:00
Alan Ren
f66d3349b9 add tooltip for dashboard learn more button (#22744) 2023-04-17 08:19:59 -07:00
Charles Gagnon
2d25c8626f Fix typings (#22747) 2023-04-16 16:36:47 -07:00
Karl Burtram
89797e2e94 Bump STS to 4.7.0.11 (#22746) 2023-04-14 18:45:15 -07:00
Alan Ren
1f57a10f7f auto focus message area (#22743) 2023-04-14 16:30:47 -07:00
Hai Cao
5de4f205b1 Add more description to queryEditor.nullBackground color setting (#22739)
* add desc to disable null bgcolor

* add more desc to queryEditor.nullBackground

* move color outside of the loc string

* comment
2023-04-14 16:22:16 -07:00
Benjin Dubishar
b1c2cc1740 Converting remaining services to use runWithErrorHandling() (#22720)
* Converting remaining services to use `runWithErrorHandling()`

* Updating sqlops-dataprotocolclient to 1.3.3

* upgrading dataprotocol and swapping to that baseService

* Adding async to make thenable -> promise conversion happy

---------

Co-authored-by: Alan Ren <alanren@microsoft.com>
2023-04-14 16:08:07 -07:00
Cheena Malhotra
47bf7efd4a Reset IV/Key if MSAL cache file decryption fails (#22733) 2023-04-14 15:45:37 -07:00
Alan Ren
46bc19bcb8 adjust the default max column width (#22740) 2023-04-14 15:29:12 -07:00
Alan Ren
9456285c65 support scripting in object management dialogs (#22429)
* user management - scripting

* remove confirmation

* update sts

* update string
2023-04-14 13:52:06 -07:00
Sakshi Sharma
d69e5b97df Update SC dialog to save/read file structure to/from schema compare file (#22727)
* Read/Send ExtractTarget information from/to STS

* Remove comment

* Cleanup comment and update azdata dependency
2023-04-14 11:47:59 -07:00
Kim Santiago
18a541b0a6 removed unused data sources code from publish dialog (#22722) 2023-04-14 11:44:46 -07:00
Aasim Khan
8d9ddebd98 Adding restart ads notification when async server tree is toggled (#22726) 2023-04-14 07:54:28 +01:00
Cheena Malhotra
537df7cbac Fix file upload (#22725) 2023-04-13 20:51:03 -07:00
Cheena Malhotra
bb3ddc7364 Fix advanced options for CMS and MSSQL (#22708) 2023-04-13 19:21:42 -07:00
Sakshi Sharma
91ea2b43d6 Save publish profile in Publish UI workflow (#22700)
* Add profile section in Publish project UI

* Move publish profile row below Publish Target

* Add contract for savePublishProfile and SaveProfileAs button functionality

* Make the DacFx contract functional

* Send values from UI to DacFx service call

* Fix build error

* Address comment, remove print statements

* Address comments

* Set correct connection string

* Fix functionality for rename, exclude, and delete publish profiles. Add new profile to the tree and sqlproj.

* Address comment to update alignment of button

* Address comments

* Update button to use title casing
2023-04-13 17:08:24 -07:00
Kim Santiago
3deb163210 replace deprecated onDidClick() handler in sql projects (#22721) 2023-04-13 16:27:59 -07:00
Cheena Malhotra
87571b2706 Support placeholder text for connection dialog options (#22693) 2023-04-13 15:08:12 -07:00
Benjin Dubishar
733359de57 Updating yarn lock for ML (#22719) 2023-04-13 14:00:38 -07:00
Cory Rivera
8aade56214 Use column names as keys for table data in SQL cell outputs (#22688) 2023-04-13 13:26:12 -07:00
Benjin Dubishar
8202e1ec4e Changing dacFxService and schemaCompareService to use shared logic for error handling (#22718)
* Move helper function to base class

* Switching DacFxService over

* Thenable -> Promise

* Converting SchemaCompareService
2023-04-13 12:44:13 -07:00
Kim Santiago
35e1d63871 fix project listed twice when using multi root workspaces (#22705)
* fix project listed twice when using multi root workspaces

* uppercase
2023-04-13 09:49:14 -07:00
Kim Santiago
c78cc6e534 vbump sql database projects to 1.1.0 (#22707)
vbump after the April release
2023-04-12 20:27:23 -07:00
Cheena Malhotra
728f70bf6e Append Endpoint suffix (#22704) 2023-04-12 20:24:52 -07:00
Christopher Suh
897f026d1e Firewall Dialog Account/Tenant Selection (#22695)
* preselect account in firewall dialog from connection details

* cleanup

* fix element reference

* add initial tenant selection

* fix compile & cleanup

* pr comments

* change to private
2023-04-12 15:59:31 -07:00
Christopher Suh
62c2fd3290 add ms graph endpoint for us gov (#22574) 2023-04-12 14:58:58 -07:00
erpett
0310b132a9 Update changelog for 1.43 and update readme with evergreen links (#22699) 2023-04-12 14:37:13 -07:00
Sakshi Sharma
12011209f3 Update STS to bring in save publish profile changes (and others) (#22684) 2023-04-12 12:32:54 -07:00
Charles Gagnon
df88d881c5 Remove disposable from connection (#22687)
* Remove disposable from connection

* Remove from group
2023-04-11 15:27:01 -07:00
Cheena Malhotra
219bdabfb2 Upgrade xml2js to v0.5.0 + migration to @vscode/vsce + migration to @azure/storage-blob (#22664) 2023-04-11 15:13:43 -07:00
Aasim Khan
577d99e790 Fixing dup connections (#22670) 2023-04-11 14:16:59 -07:00
Benjin Dubishar
4d3d74e3da Fix for issue where bulk-adding scripts could perform a rescan after each script (#22665)
* Changed bulk script add to delay reloading the file list until the end of the operation (see the sketch after this entry).

* Adding style name to sqlproj typing file

* vBump to 1.0.1
2023-04-10 16:28:53 -07:00
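The batching approach described above can be sketched as follows (hypothetical helper names addScript and refreshFileList, not the extension's actual API): add every script first, then rescan the project once.

```typescript
// Illustrative sketch only: add scripts in bulk and refresh the project's
// file list once at the end, instead of triggering a full rescan per script.
async function addSqlScriptsInBulk(
	scriptPaths: string[],
	addScript: (path: string) => Promise<void>,
	refreshFileList: () => Promise<void>
): Promise<void> {
	for (const path of scriptPaths) {
		await addScript(path); // no per-script refresh here
	}
	await refreshFileList(); // single rescan for the whole batch
}
```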
Christopher Suh
6857718cb3 Fix http request format (#22618)
* fix http request format

* encode to utf 8 and add body for put requests

* encode proxy request body

* change ?.data to ?.body

* add note for textencoder

* change content-type to application/json
2023-04-10 14:36:35 -07:00
Christopher Suh
e7a5ea6ee8 Add No Resources Found label under azure browse (#22660)
* Add No Resources Found label under azure browse

* update localize message

* add period
2023-04-10 10:27:01 -07:00
Raymond Truong
b6f1b949d7 [SQL Migration] Miscellaneous UI improvements from feedback (#22592)
* Hide more info for assessment issues without help links

* Add info box about blob container folders

* WIP - reuse create DMS dialog for IR registration

* Revert "Add info box about blob container folders"

This reverts commit 30b8892ea7918841a6466b59058181062d367ba5.

* Add help link to target platform selection page explaining Azure SQL offerings

* Revert "WIP - reuse create DMS dialog for IR registration"

This reverts commit 5fac6b5c7148b2520cc42ce9fad549cde28baba2.

* Don't show storage account warning banner for DB scenario

* Vbump extension and migration service

* Test - fix http request format from chsuh/fixFormat

* Add instructions for table mapping and schema migration

* Revert "Test - fix http request format from chsuh/fixFormat"

This reverts commit 4992603532e98dff3b7ba6f04ba9304d173fc5ad.
2023-04-10 10:20:39 -07:00
brian-harris
a60d6107b4 SQL-Migration: improve SQL DB table selection ux to include missing tables (#22659)
* add missing target tables ux

* fix number formatting
2023-04-07 16:00:12 -07:00
Cory Rivera
0412ba194b Add more error handling for python installation (#22650) 2023-04-07 14:55:17 -07:00
Charles Gagnon
6a2ec12a35 Fix some no unsafe assignment errors in azurecore (#22643) 2023-04-06 16:50:24 -07:00
Aasim Khan
5ae0e3a503 Removed parent override (#22648) 2023-04-06 15:26:50 -07:00
Alan Ren
2ee7c2649d reduce toolbar space (#22647)
* reduce toolbar space

* feedback
2023-04-06 14:56:48 -07:00
Leila Lali
a62d8f8960 Adding telemetry config to client config to send to STS (#22644)
* Adding telemetry config to client config so it can be sent to STS to monitor updates
2023-04-06 08:18:23 -07:00
Drew Skwiers-Koballa
755a3e9e00 fixing setting description in sql projects readme (#22640) 2023-04-05 17:18:52 -07:00
Drew Skwiers-Koballa
6447e92870 updating SQL projects extension readme (#22632) 2023-04-05 13:01:51 -07:00
Charles Gagnon
df1accf918 Improve missing placeholder warning message (#22630)
* Improve missing placeholder warning message

* stringify
2023-04-05 11:42:26 -07:00
brian-harris
a30719c471 add tooltips to explain migrations list columns (#22598) 2023-04-05 08:16:49 -07:00
Benjin Dubishar
efd489ecde Adding patch to rename backup file when it already exists (#22613) 2023-04-04 13:20:12 -07:00
Benjin Dubishar
cec349d2a4 Fix issue where errors during project update were swallowed (#22612)
* adding errors

* Fixing bug where errors occurring during updating project got swallowed

* removing unnecessary "vscode"
2023-04-04 13:19:07 -07:00
Aasim Khan
958a3f85e5 Enabling async server tree in insiders (#22596) 2023-04-03 17:15:52 -07:00
erpett
b2f131a8ba version bump 1.44 (#22599) 2023-04-03 15:23:47 -07:00
Alan Ren
071d76bc94 fix the empty schema issue (#22594) 2023-04-03 13:57:51 -07:00
Alan Ren
61b3285eaf make table keyboard shortcuts configurable (#22582)
* make table keyboard shortcuts configurable

* fix error

* new slickgrid version

* add comment

* tree grid
2023-04-03 13:21:00 -07:00
Aasim Khan
38a3312cb6 Fixing edited connections not working for root in Async Server Tree (#22580)
* Fixing edited connections not working for root

* Fixing comment
2023-04-03 11:58:19 -07:00
Benjin Dubishar
15ecdc8653 Fixing smattering of issues regarding updating projects for cross-plat (#22575)
* Fixing smattering of issues regarding updating projects for cross-plat

* Adding both options for modal

* Removing quickpick
2023-04-03 10:10:31 -07:00
Benjin Dubishar
267a830775 Disabling revert button when no changes to SQLCMD vars at publish time (#22552)
* disabling revert SQLCMD var value button when no changes to values; changing string

* Updating docstring

* Updating behavior to account for SQLCMD vars defined in publish profiles
2023-04-03 10:09:06 -07:00
Alex Hsu
55820e94f9 Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230403154147337. (#22586) 2023-04-03 10:00:25 -07:00
Alex Hsu
59c5ca605c Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230402154501941. (#22581) 2023-04-03 08:29:45 -07:00
Alan Ren
2a14562ec7 update parse query action (#22577)
* update parse query action

* revert export notebook change

* comment
2023-03-31 18:24:47 -07:00
Alex Ma
0f21ecd531 Add fix to enter press on last row of Edit Data. (#22351)
* added fix to enter on null row for edit data

* added fix to displayValue after new row is added

* added suggested changes and fixes
2023-03-31 18:09:34 -07:00
Cory Rivera
1dbb2211e3 Use blue icon for dashboard New Notebook task (#22568) 2023-03-31 16:06:50 -07:00
Hai Cao
490ea592f2 bump STS (#22576) 2023-03-31 15:27:47 -07:00
Kim Santiago
2280c54d4d vbump sql projects to 1.0.0 (#22553)
* vbump sql projects to 1.0.0

* set preview to false
2023-03-31 13:35:43 -07:00
Christopher Suh
ccd5775093 Update remaining Axios calls (#22525)
* initial commit adding put and delete operations

* change response.data => response.body

* add client response interface

* add error to interface

* add reqHeaders
2023-03-31 12:22:31 -07:00
Cheena Malhotra
152eb32278 Fix saving profile from Azure tree nodes (#22518) 2023-03-31 11:58:49 -07:00
Cheena Malhotra
87b3ac1c05 Bump STS version (#22564) 2023-03-31 11:32:21 -07:00
brian-harris
887053c604 SQL-Migration: add retry migration prompt (#22555)
* add retry migration prompt

* updating review comments

* update context menu position to match toolbar
2023-03-31 10:25:39 -07:00
Alex Ma
e948b4e842 Langpack source and XLF update for April 2023 Release (#22554) 2023-03-31 09:41:43 -07:00
Aasim Khan
d49ff85afc Fixing connection without saved password not connecting in Async Server Tree (#22535) 2023-03-31 09:19:01 -07:00
Aasim Khan
a1acaf2096 Adding caching in OE service (#22539) 2023-03-31 09:18:36 -07:00
Sakshi Sharma
9d8006562d Fix project name validation (#22547)
* Fix project name validation

* Add/update tests

* Address comments

* Fix error
2023-03-31 08:46:58 -07:00
Alan Ren
6cc5e9a70d fix object management dialog's validation issue (#22556)
* fix validation issue

* make message optional

* fix errors
2023-03-31 07:59:38 -07:00
Kim Santiago
e2d4d07c0b fix console error when trying to drag a sql project node (#22551)
* fix console error when trying to drag a sql project node

* fix typo

* update comment
2023-03-30 17:27:01 -07:00
Raymond Truong
3ca583760f [SQL Migration] Add additional checks for multiple database backups in same folder (#22483)
* Add info box about blob container folders

* Add warning banner if non-unique backup locations

* WIP - Add offline scenario

* Clean up

* Address PR comments
2023-03-30 17:19:59 -07:00
Kim Santiago
8e82f62164 cleanup sql projects readme (#22538)
* cleanup sql projects readme

* update to use code blocks for commands
2023-03-30 15:56:17 -07:00
Aasim Khan
4ff16885c1 Cleaning up update children logic in async server tree (#22550) 2023-03-30 15:17:45 -07:00
Christopher Suh
2e1689b44d Add warning message when pii logging enabled (#22533)
* add warning message when pii logging enabled

* fix typo

* Update extensions/azurecore/src/localizedConstants.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* update warning message

---------

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
2023-03-30 14:44:25 -07:00
Alex Hsu
92f9b11012 Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230330154329299. (#22545) 2023-03-30 14:34:28 -07:00
Cory Rivera
621f390eef Add New Notebook task to server and database dashboards. (#22548) 2023-03-30 12:41:03 -07:00
Kim Santiago
0feabc9a6f use fsPath instead of path when calculating relative path to sqlproj (#22534) 2023-03-30 10:01:06 -07:00
Alan Ren
ee2e59243a revert changes for testing (#22544) 2023-03-30 08:37:40 -07:00
Alan Ren
ff603fa911 handle table designer errors (#22527)
* handle table designer error

* log to console
2023-03-29 16:37:15 -07:00
Kim Santiago
4458a5bd57 move publish interfaces out of sqldbproj.d.ts (#22521) 2023-03-29 16:27:23 -07:00
Charles Gagnon
3320bb55c2 Update privacy documentation (#22520)
* Update privacy documentation

* move
2023-03-29 15:40:01 -07:00
Benjin Dubishar
edc2c5e200 Surfacing better error messages about SQLCMD var names (#22509)
* Surfacing better error messages about SQLCMD var names

* correcting docstring

* adding space to join char
2023-03-29 14:23:03 -07:00
Aasim Khan
2a9705c495 Fixing async server tree (#22511)
* Changing look of new OE

* Fixing styling

* Fixing moving of connected profile

* Fixing drag and drop where treenodes delete connection nodes

* Fixing Deleting and disconnecting in AsyncServerTree

* Adding constant for OE timeout

* Updated interfaces

* Removing test compilation errors

* Fixing most events in async server tree

* Fixing connection pane styling

* Fixing find node function

* Fixing some more operations

* Fixing some ops

* All operations done

* Fixed active connections

* Fixed data source

* Adding support for setting parents

* code cleanup

* Fixing icon styling issues

* Fix errors

* Fixing comment

* Fixing spacing

* Adding explanation to OE service.

* Reverting server delegate changes

* Reverting styling

* reverting more styling change

* reverting more styling

* Fixing has children

* Fixing drag and drop to tree nodes

* fixing drag and drop

* reverting timing

* fixing drag and drop

* cleaning some code

* Fixed server and group moving

* spell check

* consolidating some logic

* Fixed whitespace

* fixing moving to root group
2023-03-29 13:59:35 -07:00
Austin Bryan
7ecbbdf398 Fixing setTargetServerName in SqlMigration extension. Previously, when a case was entered, execution fell through and also ran the code in the following cases (#22477) 2023-03-29 13:44:29 -07:00
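The bug described above is the classic missing-break fall-through in a switch statement; a minimal TypeScript sketch of the pattern and its fix (illustrative names only, not the actual extension code):

```typescript
// Illustrative sketch only: without `break`, entering one case falls through
// and also runs the code in the cases below it.
enum TargetType { SqlVm, SqlMi, SqlDb }

function setTargetServerNameExample(target: TargetType): string {
	let name = "";
	switch (target) {
		case TargetType.SqlVm:
			name = "vm-target";
			break; // omitting this break would let SqlVm also run the SqlMi/SqlDb branches
		case TargetType.SqlMi:
			name = "mi-target";
			break;
		case TargetType.SqlDb:
			name = "db-target";
			break;
	}
	return name;
}

console.log(setTargetServerNameExample(TargetType.SqlVm)); // "vm-target"
```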
Raymond Truong
4867a3747c [SQL Migration] Add storage/MI connectivity validation (#22410)
* wip

* Add SQL VM POC

* Undo azurecore changes

* Add warning banner instead of blocking on next

* Fix warning banner behavior

* Add private endpoint support

* Fix navigation issue

* Add offline scenario

* Address PR comments

* Fix merge conflicts
2023-03-29 12:48:22 -07:00
Cheena Malhotra
e70865ff20 Send telemetry with Auth Library when adding/refreshing account (#22506) 2023-03-29 11:56:56 -07:00
Christopher Suh
87fb3f9b86 Add kusto endpoints for non-public clouds (#22515)
* add kusto endpoints for non-public clouds

* fix endpoints
2023-03-29 11:42:38 -07:00
Alan Ren
e01e3b0e8e update sts (#22516) 2023-03-29 10:45:06 -07:00
Alex Hsu
09b0488978 Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230329154135186. (#22513) 2023-03-29 09:34:39 -07:00
Charles Gagnon
2ccd7405c0 Promote some proposed typings (#22508) 2023-03-29 09:06:44 -07:00
brian-harris
ef02e2bfce SQL-Migration: add new migration monitoring data to migration details (#22460)
* add new migration details

* move migration target type enum to utils

* address review feedback, refactor, text update

* fix variable name

* limit and filter migrations list to mi/vm/db
2023-03-29 07:48:30 -07:00
Alan Ren
afafee844c fix notebook serialization issue for non-mssql providers (#22504)
* use mssql as notebook serialization provider

* comment

* use fallback
2023-03-28 21:34:38 -07:00
Hai Cao
83e35ad7f8 fix hover for cells with null value (#22507) 2023-03-28 20:19:15 -07:00
Aasim Khan
f60bd1335c Adding promises and operation timeouts to fix race conditions and infinite loading in OE (#22475)
* Adding promises and operation timeouts to fix race conditions

* cleaning up logic

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Fixing promise type

* Reverting back to old error logic

* Making onsessioncreated async

* Removed polling and converted to event based

* removing connection variable out of promise

* Combining promises

* Update src/sql/workbench/services/objectExplorer/browser/treeUpdateUtils.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Fixing error messages and localizing user facing errors

* Fixing error message

* localizing config

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Fixing comment

---------

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
2023-03-28 18:36:45 -07:00
Charles Gagnon
5c6ea2890a Switch "Install" to "Download" for downloaded extensions (#22498) 2023-03-28 15:35:43 -07:00
Kim Santiago
19a9611407 vbump sql database projects (#22501) 2023-03-28 15:13:38 -07:00
Benjin Dubishar
15c0c68e44 SQL Database Projects backend swap to DacFx/Tools Service (#22451)
* Move call to STS move api into project.ts (#22071)

* move call to STS move api into project.ts

* remove undefined

* Remove convert to sdk style code (#22081)

* remove convert to sdk style code

* remove from package.json

* Merging latest from main (#22097)

* [SQL-Migration] Login migrations telemetry (#22038)

This PR enhances telemetry for login migrations in the following ways:

- Add details for starting migration (number of logins migrating, type of logins)
- Log migration result (number of errors per step, duration of each step, type of logins, whether a system error occurred)
- Add the sql-migration extension to our telemetry
- Add details when trying to connect to target
- Track clicking "done" from the wizard
- Fix bucketizing for navigating telemetry in the login migration wizard

Sample Kusto query for the new telemetry:
RawEventsADS
| where EventName contains 'sql-migration'
| extend view = tostring(Properties['view'])
| extend action = tostring(Properties['action'])
| extend buttonPressed = tostring(Properties['buttonpressed'])
| extend pageTitle = tostring(Properties['pagetitle'])
| extend adsVersion = tostring(Properties['common.adsversion'])
| extend targetType = tostring(Properties['targettype'])
| extend tenantId = tostring(Properties['tenantid'])
| extend subscriptionId = tostring(Properties['subscriptionid'])
| where view contains "login"
//| where adsVersion contains "1.42.0-insider"
| where ClientTimestamp >= ago(18h)
| project EventName, ClientTimestamp, SessionId, view, pageTitle, action, buttonPressed, targetType
, tenantId, subscriptionId
, adsVersion, OSVersion, Properties

* Add Secure Enclaves dropdown with customizable Advanced options (#22019)

* Update extension READMEs (#22079)

* Fix query-history README images (#22084)

* [Loc] update to mssql and sql-migration xlf files (#22087)

* [Loc] small fix to Portuguese lcl file (#22088)

* [Loc] small fix to Portuguese lcl file

* remove newline

* Adding None bindings to the sqlProjects service (#22085)

* Adding None bindings

* updating names of None bindings

---------

Co-authored-by: AkshayMata <akam520@gmail.com>
Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>
Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
Co-authored-by: Alex Ma <alma1@microsoft.com>

* Swap add and update sqlcmd variables in sql projects to use STS apis (#22086)

* delete sqlcmd variable working

* undo add

* remove variable from add and update sqlcmd variable apis

* hookup add and edit sqlcmd variable

* update vscode-mssql.d.ts

* move add and edit to project.ts

* update STS and tests

* move delete sqlcmd variable to project.ts (#22104)

* add test for add and edit sqlcmd variable (#22107)

* Swapping property access to STS (#22115)

* checkpoint

* Adding sqlproj property bindings

* Swapping out property loading and setting

* consolidating to this.sqlProjService

* Update dacpac reference to use STS api (#22116)

* Update add dacpac reference to use STS api

* remove changes for project ref

* validate unset settings from add database reference dialog

* update one more place getting sqlprojservice

* addressing comments

* fix path for dacpac reference (#22143)

* Swap add project reference to call STS (#22148)

* initial changes for swapping add project reference

* fix include path

* move common logic to helper function

* read sqlcmd variables from STS (#22147)

* Swapping .sqlproj- and crossplat compatibility-related functions to use STS (#22146)

* Supporting roundtrip

* Updating sqlproj style checks and cross-platform compatibility to use STS

* removing unnecessary awaits

* Fixing assertions

* Adding roundtrip update confirmations

* test cleanup

* cleaning up comment; localizing error

* Swap add system db reference (#22171)

* swap adding system database references

* fix tests

* remove only in test

* Read database references from STS (#22179)

* Read database references from STS

* fix system dacpac names

* fix project reference name

* Swap changeTargetPlatform to call STS (#22172)

* swap changeTargetPlatform to call STS

* Address comments

* De-duplicating enum for SystemDatabase (#22196)

* Deduping SystemDatabase enum

* simplifying enum refs

* Removing the now-unused imports code from SqlProjects (#22199)

* Removing unused importTargets entries

* whitespace; also to retrigger github checks on correct branch

* Hooking in Move() for Nones and PublishProfiles (#22200)

* Swap delete database reference to call STS (#22215)

* initial changes

* update contracts

* remove unnecessary info from SystemDatabaseReferenceProjectEntry

* uppercase master and msdb

* cleanup

* update test

* update comment

* undo change in projectController.ts

* remove unused system dacpac helper functions (#22204)

* more cleanup of project.ts (#22242)

* fix a couple database reference tests (#22248)

* Organizing sqlcmd variable and db reference code (#22284)

* organize database references and sqlcmd variable functions

* separate database reference tests

* Script and folder get + add support (#22166)

* Initial sqlobjectscripts

* adding mock filter

* test fixing

* another test passing

* swapping pre/post/none checkpoint

* awaiters

* convert addExistingItem

* swapping folders

* removing print

* stripping out project folder and file readers

* adding some regions

* Updating sqlproj style checks and cross-platform compatibility to use STS

* Updating sqlproj style checks and cross-platform compatibility to use STS

* added type property to tree items

* projectController swapovers

* removing imported targets

* Deleting the last of the TS XML parsing!

* Removing old functions

* renamed readNoneScripts to readNoneItems

* fixing path passed to STS calls

* remove system dacpac tests that were moved to DacFx (#22299)

* fix error when opening file after adding a file to sql project (#22309)

* fix error when opening file after adding a file to sql project

* remove unused import

* fix exclude for table and externalStreamingJob (#22310)

* add reload project (#22313)

* set DSP from STS (#22320)

* fix adding post-deployment script and existing item (#22317)

* Test cleanup for .sqlproj content operations (#22330)

* Fixing up tests

* sqlproj content operations tests

* remove only

* Cleanup

* Correcting collation

* cleanup constants.ts (#22336)

* fix folders not showing in project tree (#22319)

* Fix project controller tests (#22327)

* fixing ProjectController tests after swap

* remove only from database reference tests

* change system dbs back to lowercase in sql projects (#22353)

* Bump tools service

* Updated yarn.lock file

* pass SystemDacpacsLocation when building legacy style sql projects (#22329)

* Benjin/fix types (#22365)

* Updated yarn.lock file

* Fixing types

* fix projectController tests (#22375)

* Fixing the deletion flow for files and folders (#22366)

* checkpoint

* PR feedback

* Fixing up SDK-style operations project test group (#22377)

* Fixing up SDK-style project test group

* Removing .only

* Fixing up database reference tests (#22380)

* Fixing DB reference test group

* Extra cleanup

* removing only

* Consolidating None and PublishProfile; lighting up test (#22382)

* Lighting up project property tests (#22395)

* Checkpoint

* Lighting up project property tests

* removing timeout

* Fixing buildHelper test (#22400)

* Unskipping roundtrip test (#22397)

* Refactoring database references to split literalVariable from databaseName (#22412)

* refactoring database references to split databaseVariableLiteralValue out from databaseName

* renaming more properties

* Removing branch in entry population

* removing only

* Fixing baselines for delete test

* PR feedback

* Fixing up ProjectTree tests (#22419)

* Fixing up projectTree tests

* remove only

* Updating projectController exclude test (#22413)

* Updating test

* moving filtering for external folders to readFolders() method

* Removing EntryType import

* fix ups (#22435)

* adding extra info for test failure

* hide exclude folder from context menu until it's supported (#22454)

* Adding current test name to generated folder to avoid conflicts (#22478)

* Adding current test name to generated folder to avoid conflicts

* passing correct test parameter in

* Adding trimming and entropy

* Deleting unused baselines (#22497)

* Replacing addToProject() with addSqlObjectScripts() (#22489)

* checkpoint

* Fixing test

* Updating file scraper function to filter only to .sql files (no folders, no txt)

* changing var names to reflect that the lists only contain .sql scripts

---------

Co-authored-by: Kim Santiago <31145923+kisantia@users.noreply.github.com>
Co-authored-by: AkshayMata <akam520@gmail.com>
Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>
Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
Co-authored-by: Alex Ma <alma1@microsoft.com>
2023-03-28 13:39:57 -07:00
Kim Santiago
a7ecff77dd turn on noUnusedParameters in Schema Compare (#22452)
* turn on noUnusedParameters in Schema Compare

* use _click

* Add names to unused parameters in test files
2023-03-28 11:17:14 -07:00
Kim Santiago
929184514e fix undefined error when opening Create Project from Database dialog (#22480) 2023-03-28 11:16:53 -07:00
Aasim Khan
3aee3f006a Fixing icon styling in tree (#22490) 2023-03-28 09:23:15 -07:00
Alan Ren
e15b612166 update manage extension action (#22488) 2023-03-27 20:40:26 -07:00
Alex Ma
fd8d8b0a27 [Loc] fix for webpack for sql-migration and update to XLFs two weeks prior to code complete. (#22484) 2023-03-27 17:07:14 -07:00
Cory Rivera
97eb69477e Remove unused files in sql and extensions folders (#22444) 2023-03-27 16:40:32 -07:00
junierch
e741fa0bbd telemetry for tde user actions (#22474)
* telemetry for user actions

* remove unused action

* try catch around admin function
2023-03-27 17:48:47 -04:00
Aasim Khan
e80c6f2dcb Fixing async tree styling (#22458) 2023-03-27 13:16:23 -07:00
Alan Ren
115062c0fc bring back ads edit (#22472) 2023-03-27 11:27:41 -07:00
Karl Burtram
7377feeb22 Bump STS to pick-up Intellicode fix (#22462) 2023-03-25 11:52:29 -07:00
Alan Ren
86e6b42bdc select the row on double click (#22456)
* select row on double click

* skip row number column
2023-03-24 20:10:05 -07:00
Alan Ren
35cb233851 support default action for OE node (#22455)
* support default action for OE node

* fix floating promise
2023-03-24 20:09:48 -07:00
Kim Santiago
3d2a531976 add hover text to icon buttons in sql projects extension (#22453) 2023-03-24 16:01:30 -07:00
Kim Santiago
8fdcafcca7 fix aria label for name input box in create project from db dialog (#22449) 2023-03-24 15:21:44 -07:00
Kim Santiago
c8d0edf048 add aria labels to a couple radio groups in sql database projects (#22446) 2023-03-24 15:21:33 -07:00
Christopher Suh
738987c4d5 Show message indicating that ADAL is being deprecated (#22432)
* show message indicating that ADAL is being deprecated

* pr comments update

* added settings deprecation message

* fix deprecation message
2023-03-24 14:42:45 -07:00
Kim Santiago
31ac636222 make screenreader announce task status (#22434) 2023-03-24 11:14:10 -07:00
Cheena Malhotra
d5c495f05a Address Secure enclaves feedback to show required indicator optionally (#22428) 2023-03-24 09:57:32 -07:00
Karl Burtram
ad6c202e34 Enable contained and Windows user types (#22440) 2023-03-24 09:11:16 -07:00
Cheena Malhotra
8d49b15b53 Skip forceRefresh for full (owning) tenant (#22421) 2023-03-23 21:01:03 -07:00
Charles Gagnon
dfc6469c9d Enable strict null in MSSQL extension (#22433) 2023-03-23 16:49:48 -07:00
Alan Ren
59ad572800 fix strict null issues (#22430) 2023-03-23 15:19:23 -07:00
Christopher Suh
00897fc513 Replace Axios calls with HttpClient (#22417)
* replace axios calls with httpClient

* add latest files

* fix url
2023-03-23 13:47:25 -07:00
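Because the HttpClient used by this change is internal to the codebase, the following is only a hedged sketch of the kind of swap involved, using Node's built-in https module in place of axios; the helper name is an assumption.

import * as https from 'https';

// Roughly what `const { data } = await axios.get(url)` becomes without the axios dependency.
function httpGet(url: string): Promise<string> {
    return new Promise((resolve, reject) => {
        https.get(url, res => {
            let body = '';
            res.on('data', (chunk: Buffer) => { body += chunk; });
            res.on('end', () => resolve(body));
        }).on('error', reject);
    });
}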
Charles Gagnon
57c35ca255 More MSSQL strict null (#22420) 2023-03-23 11:13:04 -07:00
Charles Gagnon
2805f9f499 More MSSQL strict null (#22402) 2023-03-23 07:04:18 -07:00
Cheena Malhotra
43f97f4f56 Handle out of sync extension activations for encryption keys updated event (#22415) 2023-03-22 22:23:03 -07:00
Kim Santiago
0741e18533 vbump dacpac extension after release (#22418) 2023-03-22 17:47:56 -07:00
Christopher Suh
14ea5e9dd7 fix logic (#22405) 2023-03-22 12:57:49 -07:00
Charles Gagnon
ed37ad315f Clean up buildConnectionInfo (#22407)
* Clean up buildConnectionInfo

* Add test and cleanup
2023-03-22 12:54:29 -07:00
Cory Rivera
9f435e271a Establish connection after running New Notebook from Azure data explorer view (#22408) 2023-03-22 12:53:03 -07:00
erpett
94e779ef0d Changes for release (#22409) 2023-03-22 12:49:50 -07:00
Charles Gagnon
ca4722360a More strict SSL proxy setting fixes (#22404) 2023-03-22 12:45:51 -07:00
Cheena Malhotra
a3e77c674c Address console warnings with new connection dialog (#22293) 2023-03-22 12:31:25 -07:00
Cheena Malhotra
94b3261276 Notify STS when encryption keys are updated in azurecore (#22384) 2023-03-22 11:46:30 -07:00
Karl Burtram
1e4800a60c Support adding Windows users (#22399)
* Support adding Windows users

* Bump STS
2023-03-22 09:53:20 -07:00
Kim Santiago
3b68aaed72 change publish options to use declarative table (#22398) 2023-03-22 09:44:27 -07:00
Charles Gagnon
d7bd87fac0 Fix incorrect default for proxy setting when downloading STS (#22396) 2023-03-22 09:28:32 -07:00
Vsevolod Kukol
e3135aca4c Preserve name and group when using Connection String (#22341) 2023-03-22 07:55:18 -07:00
Kim Santiago
21bb7eb482 add aria label for checkboxes in declarative table (#22393) 2023-03-21 16:35:28 -07:00
Charles Gagnon
62ece298cc Some MSSQL strict null check fixes (#22383) 2023-03-21 16:10:57 -07:00
Kim Santiago
2be49a9911 switch exclude object types table to use declarative table component (#22390)
* switch exclude options to use declarative table component

* add to disposableListeners
2023-03-21 14:09:31 -07:00
Alan Ren
21f271671d new drop object request (#22387)
* simplify drop object requests

* update sts

* pr comments
2023-03-21 10:51:55 -07:00
Hai Cao
ffc7f05c10 Add background color for null cell in query editor result grid (#22370) 2023-03-20 15:55:57 -07:00
Aasim Khan
f9e72b0d93 Fixing connection group change for connections (#22379) 2023-03-20 15:37:47 -07:00
Charles Gagnon
7bf11118af Enable datavirtualization tests (#22378)
* Enable datavirtualization tests

* Remove unneeded
2023-03-20 14:45:13 -07:00
Charles Gagnon
47ce587fef Update ads-extension-telemetry to 3.0.1 (#22374) 2023-03-20 12:48:31 -07:00
Alex Hsu
b06398b32b Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230318154256685. (#22368) 2023-03-20 09:43:04 -07:00
Alan Ren
aa47729f90 fix find parent node issue (#22356)
* fix find parent node issue

* sts update

* fix errors

* pr comments and a fix
2023-03-20 08:20:11 -07:00
Benjin Dubishar
9d16a48dee Bump tools service (#22361) 2023-03-17 15:11:39 -07:00
Benjin Dubishar
0b3a9ca0f0 Updated yarn.lock file (#22363) 2023-03-17 15:11:11 -07:00
Alex Hsu
54e86e19a3 Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230317154311330. (#22359) 2023-03-17 09:30:47 -07:00
Cheena Malhotra
4b02c26a52 MSAL cache encryption + log improvements (#22335) 2023-03-16 16:53:16 -07:00
Kim Santiago
931c44ac41 add ariaLive to publish options description (#22338)
* add ariaLive to publish options description

* use correct string and update typings

* update type

* remove comment update

* create type AriaLiveType

* update MockInputBoxComponent with AriaLiveValue

* update azdata-test and add comment
2023-03-16 16:08:24 -07:00
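The "create type AriaLiveType" bullet suggests the ariaLive property was narrowed from a plain string to the values the ARIA spec defines. A plausible shape, offered only as an assumption rather than the actual azdata typings:

// Assumed definition for illustration; the real one lives in the azdata/azdata-test typings.
export type AriaLiveType = 'polite' | 'assertive' | 'off';

export interface InputBoxLikeProperties {
    ariaLive?: AriaLiveType;
}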
Alan Ren
20cf2489a2 support rename for login, user and a few other types (#22331)
* rename object

* add comment

* use URN property

* vbump STS

* revert loc string change

* fix name check

* pr comments
2023-03-16 15:00:07 -07:00
Alex Hsu
f5628ed8e3 Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230316192338900. (#22350) 2023-03-16 12:33:19 -07:00
Alan Ren
1d99060443 update the display name (#22345) 2023-03-16 11:49:17 -07:00
Alex Hsu
8406ce8f93 Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230315234519077. (#22339) 2023-03-15 17:07:25 -07:00
Charles Gagnon
8c20e827ad Remove error messages from error events (#22337) 2023-03-15 15:19:16 -07:00
dependabot[bot]
a2d66a288c Bump webpack from 5.73.0 to 5.76.0 (#22326)
Bumps [webpack](https://github.com/webpack/webpack) from 5.73.0 to 5.76.0.
- [Release notes](https://github.com/webpack/webpack/releases)
- [Commits](https://github.com/webpack/webpack/compare/v5.73.0...v5.76.0)

---
updated-dependencies:
- dependency-name: webpack
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-15 15:04:52 -07:00
junierch
cb66ef349e focus on tde edit button (#22314) 2023-03-15 13:30:34 -04:00
Alex Ma
8e6f747336 [Loc] SQL xlf update (Added node expansion labels) (#22328) 2023-03-14 17:15:10 -07:00
Alan Ren
91f64df29f update yarn.lock (#22324) 2023-03-14 14:00:44 -07:00
Aasim Khan
f0a5d296bf Fixing new connections not expanding in object explorer. (#22307)
* Fixing new connection node not expanding in OE

* Fixing new connections not expanding and fixing expand request not resolving because of some provider error.

* Fixing test

* Adding a setting for node expansion timeout

* Not saving when loading tree based connections

* Adding some logs

* Removing special casing for mssql provider

* Missing providers

* Adding user toast for node expansion timeout

* Adding notification service to test

* Fixing node type for mssql

* remove polling

* Fixing onNodeStatus

* Fixing stuff

* consolidating functions

* Consolidating resolve logic

* removing extra try catch

* code cleanup

* adding size checks

* Removing commented code

* Ignoring errors from other sessions and nodepaths.
2023-03-14 10:50:46 -07:00
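Several bullets above ("Adding a setting for node expansion timeout", "Adding user toast for node expansion timeout") describe racing the expand request against a configurable timeout and telling the user when it loses. A minimal TypeScript sketch of that pattern, with hypothetical names rather than the real Object Explorer implementation:

// Resolves with the expand result, or with undefined plus a user notification if the timeout wins.
async function expandWithTimeout<T>(
    expand: Promise<T>,
    timeoutMs: number,
    notify: (message: string) => void
): Promise<T | undefined> {
    let timer: ReturnType<typeof setTimeout> | undefined;
    const timeout = new Promise<undefined>(resolve => {
        timer = setTimeout(() => resolve(undefined), timeoutMs);
    });
    try {
        const result = await Promise.race([expand, timeout]);
        if (result === undefined) {
            notify(`Node expansion did not complete within ${timeoutMs} ms.`);
        }
        return result;
    } finally {
        if (timer !== undefined) {
            clearTimeout(timer);
        }
    }
}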
Alex Ma
ef99e67cfe [Loc] update to XLFS for 3-13-2023 (#22318) 2023-03-13 17:18:30 -07:00
Karl Burtram
efc5938141 Fix casing of OE icon files to match source (#22315) 2023-03-13 15:04:15 -07:00
brian-harris
7c41d45e66 add SHIR registration for create service for sqldb (#22308) 2023-03-13 12:36:56 -07:00
Cheena Malhotra
8539b63a5c Upgrade http(s)-proxy-agent dependent npm packages (#22306) 2023-03-13 11:05:59 -07:00
Alan Ren
653293aad9 fix action bar size (#22301) 2023-03-10 14:39:45 -08:00
Alan Ren
7f3461c026 use latest STS version (#22297) 2023-03-10 11:36:46 -08:00
Cheena Malhotra
c163830bc4 Use Preferred username over email as 'user' property (#22288) 2023-03-09 20:25:07 -08:00
Cheena Malhotra
4c3f622250 Update @azure/msal-node to v1.16.0 (#22287) 2023-03-09 17:09:05 -08:00
Alex Ma
344ca478ef Fix for Edit Data grid newline on multiline strings (#22269)
* Fix for newline replacement

* added proper null regex and null character check

* removed cell class

* removed unnecessary function

* revert back to old check

* change onbeforeeditcell

* small fix to typo in onBeforeEditCell

* made changes based on feedback

* added comments clarifying isDBCellValue
2023-03-09 16:10:31 -08:00
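The "proper null regex and null character check" bullet hints at sanitizing cell text before editing. An illustrative TypeScript sketch of that kind of cleanup (a hypothetical helper, not the actual Edit Data code):

// Strips embedded NUL characters and normalizes line endings for display in the grid.
function sanitizeCellValue(raw: string): string {
    const withoutNulls = raw.replace(/\u0000/g, '');
    return withoutNulls.replace(/\r\n|\r/g, '\n');
}

// sanitizeCellValue('line1\r\nline2\u0000') === 'line1\nline2'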
Cheena Malhotra
f51fe75397 Null & Error handling in Azure core (#22259) 2023-03-09 14:34:39 -08:00
Kim Santiago
0bbc290790 fix accessibility issues in update project from db dialog (#22274)
* fix accessibility issues in update project from db dialog

* remove aria label on radio buttons
2023-03-09 14:21:43 -08:00
Kim Santiago
394d88445a undo remove project change in data workspace (#22283) 2023-03-09 14:14:39 -08:00
Alan Ren
6ace579986 user management fixes (#22282)
* fix a few user management bugs

* revert user dialog change
2023-03-09 14:11:04 -08:00
brian-harris
39f90afe6c add delete migration method in migration list and details pages (#22243)
* add delete migration, fix stale token detection

* address pr comments, fix api error response format
2023-03-09 13:52:48 -08:00
Kim Santiago
12a5bd0ef2 update target platform text for azure synapse sql pool (#22265) 2023-03-09 10:07:12 -08:00
Cheena Malhotra
d3d594e826 Revert "Update @azure/msal-node to 1.16.0 (#22255)" (#22268)
This reverts commit 87defc0367.
2023-03-09 09:27:30 -08:00
Kim Santiago
ab68c3060c Add required property in missing places in sql projects (#22253)
* add required property for missing places in sql projects

* remove comma
2023-03-09 08:45:21 -08:00
Alan Ren
ec515a3b23 allow nothing to be selected as initial state (#22260) 2023-03-09 08:44:35 -08:00
Maddy
5fc69a0445 Fix find highlight on cell edits (#22080)
* update the rendered text onCellSourceChange

* fix test

* fix highlight in split mode

* update corresponding test

* update hasEditor with getEditor

* update event

* add comment
2023-03-08 22:26:07 -08:00
Cheena Malhotra
87defc0367 Update @azure/msal-node to 1.16.0 (#22255) 2023-03-08 20:25:37 -08:00
Cheena Malhotra
cc52eb9dd8 Include App Path in service launch arguments (#22233) 2023-03-08 18:44:17 -08:00
Charles Gagnon
8e5d89ef94 Fix datavirtualization wizard opening to wrong database (#22232)
* Fix datavirtualization wizard opening to wrong database

* vBump
2023-03-08 14:27:19 -08:00
Aasim Khan
9fddefc8fb Fixing execution plan not finding provider (#22238) 2023-03-08 12:55:49 -08:00
Aasim Khan
84c9a03bf9 Fixing icon for group by schema (#22212) 2023-03-08 12:13:49 -08:00
Kim Santiago
954d521a83 Fix delete database reference request (#22214) 2023-03-08 11:03:44 -08:00
Charles Gagnon
3937f62ce4 Fix query history storage folder creation (#22229) 2023-03-08 10:51:14 -08:00
Charles Gagnon
31fd467bec Add backup and restore keywords to sql grammar (#22226)
* Add BACKUP, STATS and CHECKSUM to keywords list

* LOG and REPLACE
2023-03-08 10:38:40 -08:00
Alan Ren
d19aad9b37 remove commands (#22218) 2023-03-08 08:10:11 -08:00
Alex Hsu
c78c2815cc Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230307165240996. (#22191) 2023-03-07 09:42:54 -08:00
Benjin Dubishar
14b0e508cb Bumping sql tools service (#22188) 2023-03-07 09:59:22 -07:00
Charles Gagnon
a575eb1b87 Fix telemetry property names (#22190) 2023-03-07 08:31:24 -08:00
Karl Burtram
b4d5a08199 Bump STS to 4.6.0.1 (#22184) 2023-03-06 20:41:49 -08:00
Aasim Khan
d3f4f0daa4 Adding light contrast theme (#22028)
* Adding light contrast theme

* fixing oe icons

* Fixing more files

* Converting ep files to support hc light

* Revert "Copy Headers for Selected Columns (#21622)"

This reverts commit f74d6f6d9b.

* Adding more css rules

* Fixing modal

* Fixing azure icons
2023-03-06 17:31:27 -08:00
Alan Ren
29c1f5edd0 disable the checkbox for schemas owned by user (#22178) 2023-03-06 15:14:46 -08:00
Alex Ma
f25599119e [Loc] Massive fix for March Langpack (#22180) 2023-03-06 15:04:31 -08:00
Karl Burtram
8a71302aea Only support 'user with login' in dialog (#22175) 2023-03-06 14:32:30 -08:00
erpett
bf753309e7 Updating package.json to show the next build number now that the release branch has split for 1.42 (#22169) 2023-03-06 12:01:27 -08:00
Charles Gagnon
a657aa6cb5 Remove hasEditor from ICellEditorProvider (#22145) 2023-03-06 11:02:41 -08:00
3411 changed files with 158318 additions and 86658 deletions


@@ -3,6 +3,7 @@
**/extensions/**/*.d.ts
**/extensions/**/build/**
**/extensions/**/colorize-fixtures/**
**/extensions/azurecore/extension.webpack.config.js
**/extensions/css-language-features/server/test/pathCompletionFixtures/**
**/extensions/html-language-features/server/lib/jquery.d.ts
**/extensions/html-language-features/server/src/test/pathCompletionFixtures/**


@@ -770,6 +770,7 @@
"chart.js",
"plotly.js",
"angular2-grid",
"kburtram-query-plan",
"html-to-image",
"turndown",
"gridstack",
@@ -1146,6 +1147,7 @@
"extensions/azuremonitor/src/prompts/**",
"extensions/azuremonitor/src/typings/findRemove.d.ts",
"extensions/kusto/src/prompts/**",
"extensions/mssql/src/hdfs/webhdfs.ts",
"extensions/mssql/src/prompts/**",
"extensions/mssql/src/typings/bufferStreamReader.d.ts",
"extensions/mssql/src/typings/findRemove.d.ts",


@@ -5,5 +5,3 @@
* Ensure that the code is up-to-date with the `main` branch.
* Include a description of the proposed changes and how to test them.
-->
This PR fixes #

.github/workflows/bad-tag.yml

@@ -0,0 +1,22 @@
name: Bad Tag
on:
create
jobs:
main:
runs-on: ubuntu-latest
if: github.event.ref == '1.999.0'
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: "microsoft/vscode-github-triage-actions"
ref: stable
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Run Bad Tag
uses: ./actions/tag-alert
with:
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}
tag-name: '1.999.0'

.github/workflows/basic.yml

@@ -0,0 +1,177 @@
name: Basic checks
on:
push:
branches:
- main
pull_request:
branches:
- main
jobs:
main:
if: github.ref != 'refs/heads/main'
name: Compilation, Unit and Integration Tests
runs-on: ubuntu-latest
timeout-minutes: 40
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v3
# TODO: rename azure-pipelines/linux/xvfb.init to github-actions
- name: Setup Build Environment
run: |
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
- uses: actions/setup-node@v3
with:
node-version: 16
- name: Compute node modules cache key
id: nodeModulesCacheKey
run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
- name: Cache node modules
id: cacheNodeModules
uses: actions/cache@v3
with:
path: "**/node_modules"
key: ${{ runner.os }}-cacheNodeModules23-${{ steps.nodeModulesCacheKey.outputs.value }}
restore-keys: ${{ runner.os }}-cacheNodeModules23-
- name: Get yarn cache directory path
id: yarnCacheDirPath
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
run: echo "::set-output name=dir::$(yarn cache dir)"
- name: Cache yarn directory
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
uses: actions/cache@v3
with:
path: ${{ steps.yarnCacheDirPath.outputs.dir }}
key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
restore-keys: ${{ runner.os }}-yarnCacheDir-
- name: Execute yarn
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
env:
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
run: yarn --frozen-lockfile --network-timeout 180000
- name: Compile and Download
run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64"
- name: Run Unit Tests
id: electron-unit-tests
run: DISPLAY=:10 ./scripts/test.sh
- name: Run Integration Tests (Electron)
id: electron-integration-tests
run: DISPLAY=:10 ./scripts/test-integration.sh
# {{SQL CARBON TODO}} Bring back "Hygiene and Layering" and "Warm up node modules cache"
# hygiene:
# if: github.ref != 'refs/heads/main'
# name: Hygiene and Layering
# runs-on: ubuntu-latest
# timeout-minutes: 40
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# steps:
# - uses: actions/checkout@v3
# - uses: actions/setup-node@v3
# with:
# node-version: 16
# - name: Compute node modules cache key
# id: nodeModulesCacheKey
# run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
# - name: Cache node modules
# id: cacheNodeModules
# uses: actions/cache@v3
# with:
# path: "**/node_modules"
# key: ${{ runner.os }}-cacheNodeModules23-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-cacheNodeModules23-
# - name: Get yarn cache directory path
# id: yarnCacheDirPath
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# run: echo "::set-output name=dir::$(yarn cache dir)"
# - name: Cache yarn directory
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# uses: actions/cache@v3
# with:
# path: ${{ steps.yarnCacheDirPath.outputs.dir }}
# key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-yarnCacheDir-
# - name: Execute yarn
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# env:
# PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
# ELECTRON_SKIP_BINARY_DOWNLOAD: 1
# run: yarn --frozen-lockfile --network-timeout 180000
# - name: Run Hygiene Checks
# run: yarn gulp hygiene
# - name: Run Valid Layers Checks
# run: yarn valid-layers-check
# - name: Compile /build/
# run: yarn --cwd build compile
# - name: Check clean git state
# run: ./.github/workflows/check-clean-git-state.sh
# - name: Run eslint
# run: yarn eslint
# - name: Run vscode-dts Compile Checks
# run: yarn vscode-dts-compile-check
# - name: Run Trusted Types Checks
# run: yarn tsec-compile-check
# warm-cache:
# name: Warm up node modules cache
# if: github.ref == 'refs/heads/main'
# runs-on: ubuntu-latest
# timeout-minutes: 40
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# steps:
# - uses: actions/checkout@v3
# - uses: actions/setup-node@v3
# with:
# node-version: 16
# - name: Compute node modules cache key
# id: nodeModulesCacheKey
# run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
# - name: Cache node modules
# id: cacheNodeModules
# uses: actions/cache@v3
# with:
# path: "**/node_modules"
# key: ${{ runner.os }}-cacheNodeModules23-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-cacheNodeModules23-
# - name: Get yarn cache directory path
# id: yarnCacheDirPath
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# run: echo "::set-output name=dir::$(yarn cache dir)"
# - name: Cache yarn directory
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# uses: actions/cache@v3
# with:
# path: ${{ steps.yarnCacheDirPath.outputs.dir }}
# key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-yarnCacheDir-
# - name: Execute yarn
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# env:
# PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
# ELECTRON_SKIP_BINARY_DOWNLOAD: 1
# run: yarn --frozen-lockfile --network-timeout 180000


@@ -1,27 +1,29 @@
name: CI
on:
push:
branches:
- main
- release/*
pull_request:
branches:
- main
- release/*
on: workflow_dispatch
# on:
# push:
# branches:
# - main
# - release/*
# pull_request:
# branches:
# - main
# - release/*
jobs:
windows:
name: Windows
runs-on: windows-2019
timeout-minutes: 30
runs-on: windows-2022
timeout-minutes: 60
env:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v3
- uses: actions/setup-node@v2
- uses: actions/setup-node@v3
with:
node-version: 16
@@ -230,13 +232,13 @@ jobs:
hygiene:
name: Hygiene and Layering
runs-on: ubuntu-latest
timeout-minutes: 30
timeout-minutes: 40
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v3
- uses: actions/setup-node@v2
- uses: actions/setup-node@v3
with:
node-version: 16
@@ -248,8 +250,8 @@ jobs:
uses: actions/cache@v2
with:
path: "**/node_modules"
key: ${{ runner.os }}-cacheNodeModules14-${{ steps.nodeModulesCacheKey.outputs.value }}
restore-keys: ${{ runner.os }}-cacheNodeModules14-
key: ${{ runner.os }}-cacheNodeModules23-${{ steps.nodeModulesCacheKey.outputs.value }}
restore-keys: ${{ runner.os }}-cacheNodeModules23-
- name: Get yarn cache directory path
id: yarnCacheDirPath
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
@@ -273,59 +275,27 @@ jobs:
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
run: yarn --frozen-lockfile --network-timeout 180000
- name: Download Playwright
run: yarn playwright-install
- name: Run Hygiene Checks
run: yarn gulp hygiene
- name: Run Valid Layers Checks
run: yarn valid-layers-check
# - name: Run Monaco Editor Checks {{SQL CARBON EDIT}} Remove Monaco checks
# run: yarn monaco-compile-check
- name: Compile /build/
run: yarn --cwd build compile
- name: Check clean git state
run: ./.github/workflows/check-clean-git-state.sh
- name: Run eslint
run: yarn eslint
# {{SQL CARBON EDIT}} Don't need this
# - name: Run Monaco Editor Checks
# run: yarn monaco-compile-check
# {{SQL CARBON EDIT}} Don't need this
# - name: Run vscode-dts Compile Checks
# run: yarn vscode-dts-compile-check
- name: Run Trusted Types Checks
run: yarn tsec-compile-check
# - name: Editor Distro & ESM Bundle {{SQL CARBON EDIT}} Remove Monaco checks
# run: yarn gulp editor-esm-bundle
# - name: Typings validation prep {{SQL CARBON EDIT}} Remove Monaco checks
# run: |
# mkdir typings-test
# - name: Typings validation {{SQL CARBON EDIT}} Remove Monaco checks
# working-directory: ./typings-test
# run: |
# yarn init -yp
# ../node_modules/.bin/tsc --init
# echo "import '../out-monaco-editor-core';" > a.ts
# ../node_modules/.bin/tsc --noEmit
# - name: Webpack Editor {{SQL CARBON EDIT}} Remove Monaco checks
# working-directory: ./test/monaco
# run: yarn run bundle
# - name: Compile Editor Tests {{SQL CARBON EDIT}} Remove Monaco checks
# working-directory: ./test/monaco
# run: yarn run compile
# - name: Download Playwright {{SQL CARBON EDIT}} Remove Monaco checks
# run: yarn playwright-install
# - name: Run Editor Tests {{SQL CARBON EDIT}} Remove Monaco checks
# timeout-minutes: 5
# working-directory: ./test/monaco
# run: yarn test

.nvmrc

@@ -0,0 +1 @@
16.14


@@ -4,6 +4,6 @@
"recommendations": [
"dbaeumer.vscode-eslint",
"EditorConfig.EditorConfig",
"redhat.vscode-yaml"
"ms-vscode.vscode-selfhost-test-provider"
]
}

.vscode/launch.json

@@ -10,7 +10,98 @@
"args": [
"hygiene"
]
},
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code Git Tests",
"runtimeExecutable": "${execPath}",
"args": [
"/tmp/my4g9l",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/git",
"--extensionTestsPath=${workspaceFolder}/extensions/git/out/test"
],
"outFiles": [
"${workspaceFolder}/extensions/git/out/**/*.js"
],
"presentation": {
"group": "5_tests",
"order": 6
}
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code Github Tests",
"runtimeExecutable": "${execPath}",
"args": [
"${workspaceFolder}/extensions/github/testWorkspace",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/github",
"--extensionTestsPath=${workspaceFolder}/extensions/github/out/test"
],
"outFiles": [
"${workspaceFolder}/extensions/github/out/**/*.js"
],
"presentation": {
"group": "5_tests",
"order": 6
}
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code API Tests (single folder)",
"runtimeExecutable": "${execPath}",
"args": [
// "${workspaceFolder}", // Uncomment for running out of sources.
"${workspaceFolder}/extensions/vscode-api-tests/testWorkspace",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-api-tests",
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-api-tests/out/singlefolder-tests",
"--disable-extensions"
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "5_tests",
"order": 3
}
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code API Tests (workspace)",
"runtimeExecutable": "${execPath}",
"args": [
"${workspaceFolder}/extensions/vscode-api-tests/testworkspace.code-workspace",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-api-tests",
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-api-tests/out/workspace-tests"
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "5_tests",
"order": 4
}
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code Tokenizer Tests",
"runtimeExecutable": "${execPath}",
"args": [
"${workspaceFolder}/extensions/vscode-colorize-tests/test",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-colorize-tests",
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-colorize-tests/out"
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "5_tests"
}
},
{
"type": "node",
"request": "launch",
@@ -19,8 +110,18 @@
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.bat",
},
"osx": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
},
"runtimeArgs": [
"--no-cached-data"
"--inspect=5875",
"--no-cached-data",
"--crash-reporter-directory=${workspaceFolder}/.profile-oss/crashes",
// for general runtime freezes: https://github.com/microsoft/vscode/issues/127861#issuecomment-904144910
"--disable-features=CalculateNativeWinOcclusion",
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
@@ -53,6 +154,9 @@
"runtimeArgs": [
"--inspect=5875",
"--no-cached-data",
"--crash-reporter-directory=${workspaceFolder}/.profile-oss/crashes",
// for general runtime freezes: https://github.com/microsoft/vscode/issues/127861#issuecomment-904144910
"--disable-features=CalculateNativeWinOcclusion",
],
"webRoot": "${workspaceFolder}",
"cascadeTerminateToConfigurations": [
@@ -108,6 +212,18 @@
"group": "2_attach"
}
},
{
"type": "node",
"request": "attach",
"name": "Attach to CLI Process",
"port": 5874,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "2_attach"
}
},
{
"type": "pwa-chrome",
"request": "attach",
@@ -128,6 +244,49 @@
"order": 2
}
},
{
"type": "node",
"request": "attach",
"name": "Attach to Search Process",
"port": 5876,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "2_attach"
}
},
{
"type": "node",
"request": "attach",
"name": "Attach to Pty Host Process",
"port": 5877,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "2_attach"
}
},
{
/* Added for "VS Code Selfhost Test Provider" extension support */
"type": "pwa-chrome",
"request": "attach",
"name": "Attach to VS Code",
"browserAttachLocation": "workspace",
"port": 9222,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "2_attach",
"hidden": true
},
"resolveSourceMapLocations": [
"${workspaceFolder}/out/**/*.js"
],
"perScriptSourcemaps": "yes"
},
{
"type": "node",
"request": "launch",


@@ -7,7 +7,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"May 2022\""
"value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"July 2022\""
},
{
"kind": 1,


@@ -7,7 +7,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-remotehub repo:microsoft/vscode-remote-repositories-github repo:microsoft/vscode-livepreview repo:microsoft/vscode-python repo:microsoft/vscode-jupyter repo:microsoft/vscode-jupyter-internal repo:microsoft/vscode-unpkg\n\n$MILESTONE=milestone:\"April 2022\""
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-remotehub repo:microsoft/vscode-remote-repositories-github repo:microsoft/vscode-livepreview repo:microsoft/vscode-python repo:microsoft/vscode-jupyter repo:microsoft/vscode-jupyter-internal repo:microsoft/vscode-unpkg\n\n$MILESTONE=milestone:\"July 2022\""
},
{
"kind": 1,


@@ -7,7 +7,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$inbox -label:\"needs more info\" sort:created-desc"
"value": "$inbox -label:\"info-needed\" sort:created-desc"
},
{
"kind": 2,


@@ -7,7 +7,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remotehub repo:microsoft/vscode-remote-repositories-github repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-livepreview repo:microsoft/vscode-python repo:microsoft/vscode-jupyter repo:microsoft/vscode-jupyter-internal\n\n$MILESTONE=milestone:\"April 2022\"\n\n$MINE=assignee:@me"
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remotehub repo:microsoft/vscode-remote-repositories-github repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-livepreview repo:microsoft/vscode-python repo:microsoft/vscode-jupyter repo:microsoft/vscode-jupyter-internal\n\n$MILESTONE=milestone:\"July 2022\"\n\n$MINE=assignee:@me"
},
{
"kind": 1,
@@ -147,7 +147,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed author:@me sort:updated-asc label:bug -label:unreleased -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:needs-triage -label:verification-found"
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed author:@me sort:updated-asc label:bug -label:unreleased -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:triage-needed -label:verification-found"
},
{
"kind": 1,


@@ -7,7 +7,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-unpkg repo:microsoft/vscode-references-view repo:microsoft/vscode-anycode repo:microsoft/vscode-hexeditor repo:microsoft/vscode-extension-telemetry repo:microsoft/vscode-livepreview repo:microsoft/vscode-remotehub repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remote-repositories-github repo:microsoft/monaco-editor repo:microsoft/vscode-vsce\n\n// current milestone name\n$milestone=milestone:\"May 2022\""
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-unpkg repo:microsoft/vscode-references-view repo:microsoft/vscode-anycode repo:microsoft/vscode-hexeditor repo:microsoft/vscode-extension-telemetry repo:microsoft/vscode-livepreview repo:microsoft/vscode-remotehub repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remote-repositories-github repo:microsoft/monaco-editor repo:microsoft/vscode-vsce\n\n// current milestone name\n$milestone=milestone:\"July 2022\""
},
{
"kind": 1,
@@ -114,7 +114,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repos assignee:@me is:open label:\"needs more info\"",
"value": "$repos assignee:@me is:open label:\"needs more info\""
},
{
"kind": 1,


@@ -12,7 +12,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-jupyter repo:microsoft/vscode-python\n$milestone=milestone:\"March 2022\""
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-jupyter repo:microsoft/vscode-python\n$milestone=milestone:\"May 2022\""
},
{
"kind": 1,


@@ -7,7 +7,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-dev milestone:\"December 2021\" is:open"
"value": "repo:microsoft/vscode-dev milestone:\"May 2022\" is:open"
},
{
"kind": 2,
@@ -32,11 +32,11 @@
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-remote-repositories-github milestone:\"December 2021\" is:open"
"value": "repo:microsoft/vscode-remote-repositories-github milestone:\"May 2022\" is:open"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-remotehub milestone:\"December 2021\" is:open"
"value": "repo:microsoft/vscode-remotehub milestone:\"May 2022\" is:open"
}
]
]


@@ -67,6 +67,13 @@
}
],
"git.ignoreLimitWarning": true,
"git.branchProtection": [
"main",
"release/*"
],
"git.branchProtectionPrompt": "alwaysCommitToNewBranch",
"git.branchRandomName.enable": true,
"git.mergeEditor": true,
"remote.extensionKind": {
"msjsdiag.debugger-for-chrome": "workspace"
},


@@ -1,5 +1,98 @@
# Change Log
## Version 1.43.0
* Release number: 1.43.0
* Release date: April 12, 2023
### What's new in 1.43.0
| New Item | Details |
| --- | --- |
| Connection | Added notation for required properties (e.g. Attestation protocol and Attestation URL) when Secure Enclaves are enabled |
| SQL Database Projects extension | General Availability |
| SQL Database Projects extension | Move and rename files within Database Projects view |
| SQL Database Projects extension | SQLCMD variables available for editing in Database Projects view |
| Object Explorer | Double-clicking on a user or table in Object Explorer will open the designer for the object |
| Query Editor | Added a Parse button to the Query Editor toolbar for parsing queries before execution |
| Query Results | Added support to select a row in query results via double click |
### Bug fixes in 1.43.0
| New Item | Details |
| --- | --- |
| Connection | Added support for linked accounts with same username but different domains |
| Connection | Fixed issue with incorrect cache write path |
| Connection | Added ability to include optional name and grouping when creating a new connection using a connection string |
| Connection | Updating username in MSSQL connections to use Preferred username for the display name |
| Connection | Fixed issue with encoding for OSX keychain on macOS |
| Connection | Added support for Azure MFA and Sql Authentication Provider on Linux |
| Dataverse | Addressed error generated when expanding the database node for a Dataverse database in Object Explorer |
| IntelliCode extension | Fixed error that occurred when launching Azure Data Studio with Visual Studio Code IntelliCode extension installed |
| PostgreSQL extension | Implemented support for exporting query results on Apple M1 from a notebook |
| SQL Database Projects extension | Added Accessibility Fixes related to screen reader, label names, and improved focus when navigating |
For a full list of bug fixes addressed for the April 2023 release, visit the [April 2023 Release on GitHub](https://github.com/microsoft/azuredatastudio/milestone/99?closed=1).
## Version 1.42.0
* Release number: 1.42.0
* Release date: March 22, 2023
### What's new in 1.42.0
| New Item | Details |
| --- | --- |
| ARM64 Support for macOS | Implemented native arm64 SqlToolsService support for arm64 Windows and macOS. |
| Connection | Changed the icon under Linked Accounts when adding a new Azure account. |
| Connection | Introduced support for the Command Timeout connection property. |
| Connection | Added support for all three connection encryption options: Strict, Mandatory, and Optional. |
| Connection | Introduced HostNameInCertificate connection property under Security on the Advanced tab, for server with a certificate configured. |
| Connection | Added new advanced option in the Connection dialog to support Secure Enclaves. |
| Connection | Introduced a new setting, Mssql Enable Sql Authentication Provider to allow connections to be maintained without the concern of losing access token lifetime or getting dropped by server. Access tokens will be refreshed internally by the SqlClient driver whenever they are found to be expired. This option is disabled by default. |
| Connection | Added support for connections to Microsoft Dataverse using the TDS endpoint. |
| Connection | Introduced additional error reporting for Azure connections. |
| Connection | Introduced support for change password. |
| Connection | Added support for encryption options for Arc SQL Managed Instance when server certificates are not installed. |
| Deployment | Moved the New Deployment option from the Connections breadcrumb to the File Menu. |
| Object Explorer | Introduced ability to group objects in Object Explorer by database schema. This applies to all MSSQL connections when enabled or disabled. |
| Object Explorer | Introduced a new option to allow a custom timeout to be configured for Object Explorer. Within Settings, enable Mssql > Object Explorer: Expand Timeout. |
| Query Results | Added option to disable special handling for JSON strings. |
### Bug fixes in 1.42.0
| New Item | Details |
| --- | --- |
| Accessibility | Updated server group color display to improve visibility and contrast. |
| Backup | Addressed inability to select "Backup Set" checkbox. |
| Connection | Removed refresh action for connections which are disconnected. |
| Connection | Fixed issue with MSAL not properly set for connections. |
| Connection | Added ability to delete a server group if no connections exist for it. |
| Connection | Added connection display name to the Delete Connection dialog. |
| Connection | Azure connections with "Do not save" for the server group are no longer added to the default server group list. |
| Connection | Improved error handling in the connection dialog. |
| Connection | Fixed issue where saved passwords were not retained for Azure SQL connections. |
| Connection | Improved method to retrieve database access when connecting to Azure SQL. |
| Connection | Improved connection experience for cloud users. |
| Connection | Improved account and tenant selection when connecting to Azure SQL in the connection dialog. |
| Deployment | Improved narration for deployment wizard. |
| Installation | Updated default install location for the Windows on ARM installer. |
| MySQL Extension | Addressed issue where dialog boxes in the MySQL connection pane were not editable. |
| Notebooks | Fixed issue with updating the relative path in a Notebook cell. |
| Notebooks | Fixed issue that caused internal notebook links to break when editing characters in the page. |
| Notebooks | Addressed error thrown when opening a Notebook via a link. |
| Object Explorer | Fixed issue with Object Explorer node not expanding. |
| Query Editor | Fixed database dropdown list for contained users to display correctly. |
| Query Editor | Addressed issue where database dropdown list was not ordered the same as in Object Explorer. |
| Query Editor | Added the ability to properly escape special characters when they exist in object names. |
| Query Editor | Fixed issue which caused query timer to continue to run even though execution was complete. |
| Query Plan Viewer | Addressed an issue where a query plan would not render when opened via a URL. |
| Query Results | Improved precision formatting for datetimeoffset data type. |
For a full list of bug fixes addressed for the March 2023 release, visit the [bugs and issues list on GitHub](https://github.com/microsoft/azuredatastudio/milestone/95?closed=1).
#### Known issues
For a list of the current known issues, visit the [issues list on GitHub](https://github.com/microsoft/azuredatastudio/issues?q=is%3Aissue).
## Version 1.41.2
* Release date: February 10, 2023
* Release status: General Availability

PRIVACY.md

@@ -0,0 +1,3 @@
# Data Collection
The software may collect information about you and your use of the software and send it to Microsoft. Microsoft may use this information to provide services and improve our products and services. You may turn off the telemetry as described in the repository. There are also some features in the software that may enable you and Microsoft to collect data from users of your applications. If you use these features, you must comply with applicable law, including providing appropriate notices to users of your applications together with a copy of Microsoft's privacy statement. Our privacy statement is located at <https://go.microsoft.com/fwlink/?LinkID=824704>. You can learn more about data collection and use in the help documentation and our privacy statement. Your use of the software operates as your consent to these practices.


@@ -4,7 +4,7 @@
[![Build Status](https://dev.azure.com/ms/azuredatastudio/_apis/build/status/AzureDataStudio-Localization-CI?branchName=main)](https://dev.azure.com/ms/azuredatastudio/_build/latest?definitionId=453&branchName=main)
[![Twitter Follow](https://img.shields.io/twitter/follow/azuredatastudio?style=social)](https://twitter.com/azuredatastudio)
Azure Data Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux.
Azure Data Studio is a data management and development tool with connectivity to popular cloud and on-premises databases. Azure Data Studio supports Windows, macOS, and Linux, with immediate capability to connect to Azure SQL and SQL Server. Browse the [extension library](wiki/List-of-Extensions) for additional database support options including MySQL, PostreSQL, and MongoDB.
## **Download the latest Azure Data Studio release**
@@ -18,18 +18,18 @@ Azure Data Studio is a data management tool that enables you to work with SQL Se
| |.rpm |[64 bit][linux-rpm] |
|Mac |.zip |[Universal][osx-universal]&emsp;[Intel Chip][osx-zip]&emsp;[Apple Silicon][osx-arm64] |
[win-user]: https://go.microsoft.com/fwlink/?linkid=2222768
[win-system]: https://go.microsoft.com/fwlink/?linkid=2222769
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2223104
[win-user-arm64]: https://go.microsoft.com/fwlink/?linkid=2222660
[win-system-arm64]: https://go.microsoft.com/fwlink/?linkid=2222849
[win-zip-arm64]: https://go.microsoft.com/fwlink/?linkid=2222850
[osx-universal]: https://go.microsoft.com/fwlink/?linkid=2222873
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2222874
[osx-arm64]: https://go.microsoft.com/fwlink/?linkid=2222680
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2222918
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2223105
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2222875
[win-user]: https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-user/stable
[win-system]: https://azuredatastudio-update.azurewebsites.net/latest/win32-x64/stable
[win-zip]: https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-archive/stable
[win-user-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/win32-arm64-user/stable
[win-system-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/win32-arm64/stable
[win-zip-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/win32-arm64-archive/stable
[linux-zip]: https://azuredatastudio-update.azurewebsites.net/latest/linux-x64/stable
[linux-deb]: https://azuredatastudio-update.azurewebsites.net/latest/linux-deb-x64/stable
[linux-rpm]: https://azuredatastudio-update.azurewebsites.net/latest/linux-rpm-x64/stable
[osx-universal]: https://azuredatastudio-update.azurewebsites.net/latest/darwin-universal/stable
[osx-zip]: https://azuredatastudio-update.azurewebsites.net/latest/darwin/stable
[osx-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/darwin-arm64/stable
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
@@ -58,10 +58,9 @@ Go to our [download page](https://aka.ms/getazuredatastudio) for more specific i
[in-osx-zip]: https://azuredatastudio-update.azurewebsites.net/latest/darwin/insider
[in-osx-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/darwin-arm64/insider
See the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) for additional details of what's in this release.
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
Please visit our [download page](https://aka.ms/getazuredatastudio) for more specific installation instructions.
Check out the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) or [release notes](https://learn.microsoft.com/sql/azure-data-studio/release-notes-azure-data-studio) for additional details of what's in the each release.
The [Azure Data Studio documentation](https://learn.microsoft.com/sql/azure-data-studio) includes complete details on getting started, connection quickstarts, and feature tutorials.
## **Feature Highlights**
@@ -95,8 +94,12 @@ This project has adopted the [Microsoft Open Source Code of Conduct](https://ope
## Localization
Azure Data Studio is localized into 10 languages: French, Italian, German, Spanish, Simplified Chinese, Traditional Chinese, Japanese, Korean, Russian, and Portuguese (Brazil). The language packs are available in the Extension Manager marketplace. Simply, search for the specific language using the extension marketplace and install. Once you install the selected language, Azure Data Studio will prompt you to restart with the new language.
## Telemetry
Azure Data Studio collects telemetry data, which is used to help understand how to improve the product. For example, this usage data helps to debug issues, such as slow start-up times, and to prioritize new features. While we appreciate the insights this data provides, we also know that not everyone wants to send usage data and you can disable telemetry as described in the [disable telemetry reporting](https://aka.ms/ads-disable-telemetry) documentation.
## Privacy Statement
The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.microsoft.com/privacystatement) describes the privacy statement of this software.
The [Microsoft Privacy Statement](https://go.microsoft.com/fwlink/?LinkID=824704) describes the privacy statement of this software.
## Contributions and "Thank You"
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:

File diff suppressed because it is too large.


@@ -1 +1 @@
2022-10-06T02:27:18.022Z
2022-07-19T07:55:26.168Z

build/.gitignore

@@ -0,0 +1 @@
.yarnrc


@@ -139,6 +139,14 @@ vscode-encrypt/binding.gyp
vscode-encrypt/README.md
!vscode-encrypt/build/Release/vscode-encrypt-native.node
vscode-policy-watcher/build/**
vscode-policy-watcher/.husky/**
vscode-policy-watcher/src/**
vscode-policy-watcher/binding.gyp
vscode-policy-watcher/README.md
vscode-policy-watcher/index.d.ts
!vscode-policy-watcher/build/Release/vscode-policy-watcher.node
vscode-windows-ca-certs/**/*
!vscode-windows-ca-certs/package.json
!vscode-windows-ca-certs/**/*.node


@@ -4,6 +4,7 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
exports.OctoKitIssue = exports.OctoKit = void 0;
const core_1 = require("@actions/core");
const github_1 = require("@actions/github");
const child_process_1 = require("child_process");
@@ -20,7 +21,6 @@ class OctoKit {
}
async *query(query) {
const q = query.q + ` repo:${this.params.owner}/${this.params.repo}`;
console.log(`Querying for ${q}:`);
const options = this.octokit.search.issuesAndPullRequests.endpoint.merge({
...query,
q,
@@ -41,19 +41,17 @@ class OctoKit {
};
for await (const pageResponse of this.octokit.paginate.iterator(options)) {
await timeout();
await utils_1.logRateLimit(this.token);
await (0, utils_1.logRateLimit)(this.token);
const page = pageResponse.data;
console.log(`Page ${++pageNum}: ${page.map(({ number }) => number).join(' ')}`);
yield page.map((issue) => new OctoKitIssue(this.token, this.params, this.octokitIssueToIssue(issue)));
}
}
async createIssue(owner, repo, title, body) {
core_1.debug(`Creating issue \`${title}\` on ${owner}/${repo}`);
(0, core_1.debug)(`Creating issue \`${title}\` on ${owner}/${repo}`);
if (!this.options.readonly)
await this.octokit.issues.create({ owner, repo, title, body });
}
octokitIssueToIssue(issue) {
var _a, _b, _c, _d, _e, _f;
return {
author: { name: issue.user.login, isGitHubApp: issue.user.type === 'Bot' },
body: issue.body,
@@ -64,8 +62,8 @@ class OctoKit {
locked: issue.locked,
numComments: issue.comments,
reactions: issue.reactions,
assignee: (_b = (_a = issue.assignee) === null || _a === void 0 ? void 0 : _a.login) !== null && _b !== void 0 ? _b : (_d = (_c = issue.assignees) === null || _c === void 0 ? void 0 : _c[0]) === null || _d === void 0 ? void 0 : _d.login,
milestoneId: (_f = (_e = issue.milestone) === null || _e === void 0 ? void 0 : _e.number) !== null && _f !== void 0 ? _f : null,
assignee: issue.assignee?.login ?? issue.assignees?.[0]?.login,
milestoneId: issue.milestone?.number ?? null,
createdAt: +new Date(issue.created_at),
updatedAt: +new Date(issue.updated_at),
closedAt: issue.closed_at ? +new Date(issue.closed_at) : undefined,
@@ -73,10 +71,10 @@ class OctoKit {
}
async hasWriteAccess(user) {
if (user.name in this.writeAccessCache) {
core_1.debug('Got permissions from cache for ' + user);
(0, core_1.debug)('Got permissions from cache for ' + user);
return this.writeAccessCache[user.name];
}
core_1.debug('Fetching permissions for ' + user);
(0, core_1.debug)('Fetching permissions for ' + user);
const permissions = (await this.octokit.repos.getCollaboratorPermissionLevel({
...this.params,
username: user.name,
@@ -96,14 +94,14 @@ class OctoKit {
}
}
async createLabel(name, color, description) {
core_1.debug('Creating label ' + name);
(0, core_1.debug)('Creating label ' + name);
if (!this.options.readonly)
await this.octokit.issues.createLabel({ ...this.params, color, description, name });
else
this.mockLabels.add(name);
}
async deleteLabel(name) {
core_1.debug('Deleting label ' + name);
(0, core_1.debug)('Deleting label ' + name);
try {
if (!this.options.readonly)
await this.octokit.issues.deleteLabel({ ...this.params, name });
@@ -116,7 +114,7 @@ class OctoKit {
}
}
async readConfig(path) {
core_1.debug('Reading config at ' + path);
(0, core_1.debug)('Reading config at ' + path);
const repoPath = `.github/${path}.json`;
const data = (await this.octokit.repos.getContents({ ...this.params, path: repoPath })).data;
if ('type' in data && data.type === 'file') {
@@ -128,10 +126,10 @@ class OctoKit {
throw Error('Found directory at config path when expecting file' + JSON.stringify(data));
}
async releaseContainsCommit(release, commit) {
if (utils_1.getInput('commitReleasedDebuggingOverride')) {
if ((0, utils_1.getInput)('commitReleasedDebuggingOverride')) {
return true;
}
return new Promise((resolve, reject) => child_process_1.exec(`git -C ./repo merge-base --is-ancestor ${commit} ${release}`, (err) => !err || err.code === 1 ? resolve(!err) : reject(err)));
return new Promise((resolve, reject) => (0, child_process_1.exec)(`git -C ./repo merge-base --is-ancestor ${commit} ${release}`, (err) => !err || err.code === 1 ? resolve(!err) : reject(err)));
}
}
exports.OctoKit = OctoKit;
@@ -142,7 +140,7 @@ class OctoKitIssue extends OctoKit {
this.issueData = issueData;
}
async addAssignee(assignee) {
core_1.debug('Adding assignee ' + assignee + ' to ' + this.issueData.number);
(0, core_1.debug)('Adding assignee ' + assignee + ' to ' + this.issueData.number);
if (!this.options.readonly) {
await this.octokit.issues.addAssignees({
...this.params,
@@ -152,7 +150,7 @@ class OctoKitIssue extends OctoKit {
}
}
async closeIssue() {
core_1.debug('Closing issue ' + this.issueData.number);
(0, core_1.debug)('Closing issue ' + this.issueData.number);
if (!this.options.readonly)
await this.octokit.issues.update({
...this.params,
@@ -161,16 +159,15 @@ class OctoKitIssue extends OctoKit {
});
}
async lockIssue() {
core_1.debug('Locking issue ' + this.issueData.number);
(0, core_1.debug)('Locking issue ' + this.issueData.number);
if (!this.options.readonly)
await this.octokit.issues.lock({ ...this.params, issue_number: this.issueData.number });
}
async getIssue() {
if (isIssue(this.issueData)) {
core_1.debug('Got issue data from query result ' + this.issueData.number);
(0, core_1.debug)('Got issue data from query result ' + this.issueData.number);
return this.issueData;
}
console.log('Fetching issue ' + this.issueData.number);
const issue = (await this.octokit.issues.get({
...this.params,
issue_number: this.issueData.number,
@@ -179,7 +176,7 @@ class OctoKitIssue extends OctoKit {
return (this.issueData = this.octokitIssueToIssue(issue));
}
async postComment(body) {
core_1.debug(`Posting comment ${body} on ${this.issueData.number}`);
(0, core_1.debug)(`Posting comment ${body} on ${this.issueData.number}`);
if (!this.options.readonly)
await this.octokit.issues.createComment({
...this.params,
@@ -188,7 +185,7 @@ class OctoKitIssue extends OctoKit {
});
}
async deleteComment(id) {
core_1.debug(`Deleting comment ${id} on ${this.issueData.number}`);
(0, core_1.debug)(`Deleting comment ${id} on ${this.issueData.number}`);
if (!this.options.readonly)
await this.octokit.issues.deleteComment({
owner: this.params.owner,
@@ -197,7 +194,7 @@ class OctoKitIssue extends OctoKit {
});
}
async setMilestone(milestoneId) {
core_1.debug(`Setting milestone for ${this.issueData.number} to ${milestoneId}`);
(0, core_1.debug)(`Setting milestone for ${this.issueData.number} to ${milestoneId}`);
if (!this.options.readonly)
await this.octokit.issues.update({
...this.params,
@@ -206,7 +203,7 @@ class OctoKitIssue extends OctoKit {
});
}
async *getComments(last) {
core_1.debug('Fetching comments for ' + this.issueData.number);
(0, core_1.debug)('Fetching comments for ' + this.issueData.number);
const response = this.octokit.paginate.iterator(this.octokit.issues.listComments.endpoint.merge({
...this.params,
issue_number: this.issueData.number,
@@ -223,7 +220,7 @@ class OctoKitIssue extends OctoKit {
}
}
async addLabel(name) {
core_1.debug(`Adding label ${name} to ${this.issueData.number}`);
(0, core_1.debug)(`Adding label ${name} to ${this.issueData.number}`);
if (!(await this.repoHasLabel(name))) {
throw Error(`Action could not execute becuase label ${name} is not defined.`);
}
@@ -235,7 +232,7 @@ class OctoKitIssue extends OctoKit {
});
}
async removeLabel(name) {
core_1.debug(`Removing label ${name} from ${this.issueData.number}`);
(0, core_1.debug)(`Removing label ${name} from ${this.issueData.number}`);
try {
if (!this.options.readonly)
await this.octokit.issues.removeLabel({
@@ -246,14 +243,12 @@ class OctoKitIssue extends OctoKit {
}
catch (err) {
if (err.status === 404) {
console.log(`Label ${name} not found on issue`);
return;
}
throw err;
}
}
async getClosingInfo() {
var _a;
if ((await this.getIssue()).open) {
return;
}
@@ -267,13 +262,12 @@ class OctoKitIssue extends OctoKit {
for (const timelineEvent of timelineEvents) {
if (timelineEvent.event === 'closed') {
closingCommit = {
hash: (_a = timelineEvent.commit_id) !== null && _a !== void 0 ? _a : undefined,
hash: timelineEvent.commit_id ?? undefined,
timestamp: +new Date(timelineEvent.created_at),
};
}
}
}
console.log(`Got ${closingCommit} as closing commit of ${this.issueData.number}`);
return closingCommit;
}
}
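The hunks above (and the matching testbed.js hunks further down) show the effect of recompiling these build actions for a newer TypeScript/JavaScript target: the downlevel helpers with _a/_b temporaries and void 0 checks are replaced by native optional chaining and nullish coalescing in the emitted output. A small TypeScript illustration of the same pattern:

interface Issue {
    assignee?: { login: string };
    assignees?: { login: string }[];
    milestone?: { number: number };
}

function summarize(issue: Issue) {
    // With an older emit target these lines compile into the verbose _a/_b form seen in the old output;
    // with a modern target the source-level operators survive into the emitted JavaScript.
    const assignee = issue.assignee?.login ?? issue.assignees?.[0]?.login;
    const milestoneId = issue.milestone?.number ?? null;
    return { assignee, milestoneId };
}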


@@ -25,7 +25,6 @@ export class OctoKit implements GitHub {
async *query(query: Query): AsyncIterableIterator<GitHubIssue[]> {
const q = query.q + ` repo:${this.params.owner}/${this.params.repo}`
console.log(`Querying for ${q}:`)
const options = this.octokit.search.issuesAndPullRequests.endpoint.merge({
...query,
@@ -50,7 +49,6 @@ export class OctoKit implements GitHub {
await timeout()
await logRateLimit(this.token)
const page: Array<Octokit.SearchIssuesAndPullRequestsResponseItemsItem> = pageResponse.data
console.log(`Page ${++pageNum}: ${page.map(({ number }) => number).join(' ')}`)
yield page.map(
(issue) => new OctoKitIssue(this.token, this.params, this.octokitIssueToIssue(issue)),
)
@@ -199,7 +197,6 @@ export class OctoKitIssue extends OctoKit implements GitHubIssue {
return this.issueData
}
console.log('Fetching issue ' + this.issueData.number)
const issue = (
await this.octokit.issues.get({
...this.params,
@@ -286,7 +283,6 @@ export class OctoKitIssue extends OctoKit implements GitHubIssue {
})
} catch (err) {
if (err.status === 404) {
console.log(`Label ${name} not found on issue`)
return
}
throw err
@@ -314,7 +310,6 @@ export class OctoKitIssue extends OctoKit implements GitHubIssue {
}
}
}
console.log(`Got ${closingCommit} as closing commit of ${this.issueData.number}`)
return closingCommit
}
}

View File

@@ -4,17 +4,18 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
exports.TestbedIssue = exports.Testbed = void 0;
class Testbed {
constructor(config) {
var _a, _b, _c, _d, _e;
this.config = {
globalLabels: (_a = config === null || config === void 0 ? void 0 : config.globalLabels) !== null && _a !== void 0 ? _a : [],
configs: (_b = config === null || config === void 0 ? void 0 : config.configs) !== null && _b !== void 0 ? _b : {},
writers: (_c = config === null || config === void 0 ? void 0 : config.writers) !== null && _c !== void 0 ? _c : [],
releasedCommits: (_d = config === null || config === void 0 ? void 0 : config.releasedCommits) !== null && _d !== void 0 ? _d : [],
queryRunner: (_e = config === null || config === void 0 ? void 0 : config.queryRunner) !== null && _e !== void 0 ? _e : async function* () {
yield [];
},
globalLabels: config?.globalLabels ?? [],
configs: config?.configs ?? {},
writers: config?.writers ?? [],
releasedCommits: config?.releasedCommits ?? [],
queryRunner: config?.queryRunner ??
async function* () {
yield [];
},
};
}
async *query(query) {
@@ -47,16 +48,15 @@ class Testbed {
exports.Testbed = Testbed;
class TestbedIssue extends Testbed {
constructor(globalConfig, issueConfig) {
var _a, _b, _c;
super(globalConfig);
issueConfig = issueConfig !== null && issueConfig !== void 0 ? issueConfig : {};
issueConfig.comments = (_a = issueConfig === null || issueConfig === void 0 ? void 0 : issueConfig.comments) !== null && _a !== void 0 ? _a : [];
issueConfig.labels = (_b = issueConfig === null || issueConfig === void 0 ? void 0 : issueConfig.labels) !== null && _b !== void 0 ? _b : [];
issueConfig = issueConfig ?? {};
issueConfig.comments = issueConfig?.comments ?? [];
issueConfig.labels = issueConfig?.labels ?? [];
issueConfig.issue = {
author: { name: 'JacksonKearl' },
body: 'issue body',
locked: false,
numComments: ((_c = issueConfig === null || issueConfig === void 0 ? void 0 : issueConfig.comments) === null || _c === void 0 ? void 0 : _c.length) || 0,
numComments: issueConfig?.comments?.length || 0,
number: 1,
open: true,
title: 'issue title',
@@ -90,7 +90,7 @@ class TestbedIssue extends Testbed {
}
async postComment(body, author) {
this.issueConfig.comments.push({
author: { name: author !== null && author !== void 0 ? author : 'bot' },
author: { name: author ?? 'bot' },
body,
id: Math.random(),
timestamp: +new Date(),
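Worth noting in the testbed diff: most defaults moved to ??, while numComments still uses || 0. The two operators differ in that ?? only falls back on null/undefined, whereas || also falls back on 0, '' and false. A small TypeScript illustration:

const comments: string[] | undefined = undefined;
const withNullish = comments ?? [];       // [] (?? fires only for null/undefined)
const count = comments?.length || 0;      // 0  (undefined?.length is undefined; || coerces it to 0)
console.log(withNullish.length, count);   // 0 0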

View File

@@ -8,15 +8,15 @@ const core = require("@actions/core");
const github_1 = require("@actions/github");
const octokit_1 = require("../api/octokit");
const utils_1 = require("../utils/utils");
const token = utils_1.getRequiredInput('token');
const label = utils_1.getRequiredInput('label');
const token = (0, utils_1.getRequiredInput)('token');
const label = (0, utils_1.getRequiredInput)('label');
async function main() {
const pr = new octokit_1.OctoKitIssue(token, github_1.context.repo, { number: github_1.context.issue.number });
pr.addLabel(label);
}
main()
.then(() => utils_1.logRateLimit(token))
.then(() => (0, utils_1.logRateLimit)(token))
.catch(async (error) => {
core.setFailed(error.message);
await utils_1.logErrorToIssue(error.message, true, token);
await (0, utils_1.logErrorToIssue)(error.message, true, token);
});

View File

@@ -1,24 +1,25 @@
{
"name": "github-actions",
"version": "1.0.0",
"description": "GitHub Actions",
"scripts": {
"test": "mocha -r ts-node/register **/*.test.ts",
"build": "tsc -p ./tsconfig.json",
"lint": "eslint -c .eslintrc --fix --ext .ts .",
"watch-typecheck": "tsc --watch"
},
"repository": {
"type": "git",
"url": "git+https://github.com/microsoft/azuredatastudio.git"
},
"keywords": [],
"author": "",
"dependencies": {
"@actions/core": "^1.2.6",
"@actions/github": "^2.1.1",
"axios": "^0.21.4",
"name": "github-actions",
"version": "1.0.0",
"description": "GitHub Actions",
"scripts": {
"test": "mocha -r ts-node/register **/*.test.ts",
"build": "tsc -p ./tsconfig.json",
"compile": "tsc -p ./tsconfig.json",
"lint": "eslint -c .eslintrc --fix --ext .ts .",
"watch-typecheck": "tsc --watch"
},
"repository": {
"type": "git",
"url": "git+https://github.com/microsoft/azuredatastudio.git"
},
"keywords": [],
"author": "",
"dependencies": {
"@actions/core": "^1.2.6",
"@actions/github": "^2.1.1",
"axios": "^0.21.4",
"ts-node": "^8.6.2",
"typescript": "^3.8.3"
}
}
}

View File

@@ -4,13 +4,16 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
exports.logErrorToIssue = exports.logRateLimit = exports.daysAgoToHumanReadbleDate = exports.daysAgoToTimestamp = exports.loadLatestRelease = exports.normalizeIssue = exports.getRequiredInput = exports.getInput = void 0;
const core = require("@actions/core");
const github_1 = require("@actions/github");
const axios_1 = require("axios");
const octokit_1 = require("../api/octokit");
exports.getInput = (name) => core.getInput(name) || undefined;
exports.getRequiredInput = (name) => core.getInput(name, { required: true });
exports.normalizeIssue = (issue) => {
const getInput = (name) => core.getInput(name) || undefined;
exports.getInput = getInput;
const getRequiredInput = (name) => core.getInput(name, { required: true });
exports.getRequiredInput = getRequiredInput;
const normalizeIssue = (issue) => {
const { body, title } = issue;
const isBug = body.includes('bug_report_template') || /Issue Type:.*Bug.*/.test(body);
const isFeatureRequest = body.includes('feature_request_template') || /Issue Type:.*Feature Request.*/.test(body);
@@ -33,23 +36,25 @@ exports.normalizeIssue = (issue) => {
issueType: isBug ? 'bug' : isFeatureRequest ? 'feature_request' : 'unknown',
};
};
exports.loadLatestRelease = async (quality) => (await axios_1.default.get(`https://vscode-update.azurewebsites.net/api/update/darwin/${quality}/latest`)).data;
exports.daysAgoToTimestamp = (days) => +new Date(Date.now() - days * 24 * 60 * 60 * 1000);
exports.daysAgoToHumanReadbleDate = (days) => new Date(Date.now() - days * 24 * 60 * 60 * 1000).toISOString().replace(/\.\d{3}\w$/, '');
exports.logRateLimit = async (token) => {
exports.normalizeIssue = normalizeIssue;
const loadLatestRelease = async (quality) => (await axios_1.default.get(`https://vscode-update.azurewebsites.net/api/update/darwin/${quality}/latest`)).data;
exports.loadLatestRelease = loadLatestRelease;
const daysAgoToTimestamp = (days) => +new Date(Date.now() - days * 24 * 60 * 60 * 1000);
exports.daysAgoToTimestamp = daysAgoToTimestamp;
const daysAgoToHumanReadbleDate = (days) => new Date(Date.now() - days * 24 * 60 * 60 * 1000).toISOString().replace(/\.\d{3}\w$/, '');
exports.daysAgoToHumanReadbleDate = daysAgoToHumanReadbleDate;
const logRateLimit = async (token) => {
const usageData = (await new github_1.GitHub(token).rateLimit.get()).data.resources;
['core', 'graphql', 'search'].forEach(async (category) => {
const usage = 1 - usageData[category].remaining / usageData[category].limit;
const message = `Usage at ${usage} for ${category}`;
if (usage > 0) {
console.log(message);
}
if (usage > 0.5) {
await exports.logErrorToIssue(message, false, token);
await (0, exports.logErrorToIssue)(message, false, token);
}
});
};
exports.logErrorToIssue = async (message, ping, token) => {
exports.logRateLimit = logRateLimit;
const logErrorToIssue = async (message, ping, token) => {
// Attempt to wait out abuse detection timeout if present
await new Promise((resolve) => setTimeout(resolve, 10000));
const dest = github_1.context.repo.repo === 'vscode-internalbacklog'
@@ -70,3 +75,4 @@ ${JSON.stringify(github_1.context, null, 2).replace(/<!--/gu, '<@--').replace(/-
-->
`);
};
exports.logErrorToIssue = logErrorToIssue;
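The reshuffle in utils.js (exports.foo = ... turning into const foo = ...; exports.foo = foo;, plus the exports.a = exports.b = ... = void 0 preamble) is the CommonJS shape newer tsc emits for export const declarations. A rough source-vs-emit sketch; the emitted form in the comments is an approximation of tsc output, and the bodies are illustrative rather than the repo's actual logic:

// utils-example.ts
export const getInput = (name: string): string | undefined => process.env[name] || undefined;
export const getRequiredInput = (name: string): string => {
    const value = process.env[name];
    if (!value) { throw new Error(`Missing required input ${name}`); }
    return value;
};
// Approximate CommonJS emit (shape only):
//   exports.getRequiredInput = exports.getInput = void 0;
//   const getInput = (name) => process.env[name] || undefined;
//   exports.getInput = getInput;
//   const getRequiredInput = (name) => { /* ... */ };
//   exports.getRequiredInput = getRequiredInput;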

View File

@@ -58,13 +58,10 @@ export const daysAgoToHumanReadbleDate = (days: number) =>
new Date(Date.now() - days * 24 * 60 * 60 * 1000).toISOString().replace(/\.\d{3}\w$/, '')
export const logRateLimit = async (token: string) => {
const usageData = (await new GitHub(token).rateLimit.get()).data.resources
;(['core', 'graphql', 'search'] as const).forEach(async (category) => {
const usageData = (await new GitHub(token).rateLimit.get()).data.resources;
(['core', 'graphql', 'search'] as const).forEach(async (category) => {
const usage = 1 - usageData[category].remaining / usageData[category].limit
const message = `Usage at ${usage} for ${category}`
if (usage > 0) {
console.log(message)
}
if (usage > 0.5) {
await logErrorToIssue(message, false, token)
}
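The utils.ts hunk above only relocates a defensive semicolon: a line beginning with ( would otherwise be parsed as a call on the previous expression, so the code needs either a trailing ; on the prior statement (new form) or a leading ; before the parenthesized expression (old form). For example:

const getUsage = () => ({ core: 1, search: 2 });
const usageData = getUsage()
;(['core', 'search'] as const).forEach((category) => {
    // Without the leading ';' (or a ';' after getUsage()), the previous line would parse as getUsage()([...]).forEach(...).
    console.log(category, usageData[category]);
});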

View File

@@ -285,6 +285,8 @@ node-fetch@^2.3.0:
version "2.6.7"
resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.7.tgz#24de9fba827e3b4ae44dc8b20256a379160052ad"
integrity sha512-ZjMPFEfVx5j+y2yF35Kzx5sF7kDzxuDj6ziH4FFbOp87zKDZNx8yExJIb05OGF4Nlt9IHFIMBkRl41VdvcNdbQ==
dependencies:
whatwg-url "^5.0.0"
npm-run-path@^2.0.0:
version "2.0.2"
@@ -371,6 +373,11 @@ strip-eof@^1.0.0:
resolved "https://registry.yarnpkg.com/strip-eof/-/strip-eof-1.0.0.tgz#bb43ff5598a6eb05d89b59fcd129c983313606bf"
integrity sha1-u0P/VZim6wXYm1n80SnJgzE2Br8=
tr46@~0.0.3:
version "0.0.3"
resolved "https://registry.yarnpkg.com/tr46/-/tr46-0.0.3.tgz#8184fd347dac9cdc185992f3a6622e14b9d9ab6a"
integrity sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw==
ts-node@^8.6.2:
version "8.9.0"
resolved "https://registry.yarnpkg.com/ts-node/-/ts-node-8.9.0.tgz#d7bf7272dcbecd3a2aa18bd0b96c7d2f270c15d4"
@@ -388,9 +395,9 @@ tunnel@0.0.6, tunnel@^0.0.6:
integrity sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg==
typescript@^3.8.3:
version "3.8.3"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-3.8.3.tgz#409eb8544ea0335711205869ec458ab109ee1061"
integrity sha512-MYlEfn5VrLNsgudQTVJeNaQFUAI7DkhnOjdpAp4T+ku1TfQClewlbSuTVHiA+8skNBgaf02TL/kLOvig4y3G8w==
version "3.9.10"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-3.9.10.tgz#70f3910ac7a51ed6bef79da7800690b19bf778b8"
integrity sha512-w6fIxVE/H1PkLKcCPsFqKE7Kv7QUwhU8qQY2MueZXWx5cPZdwFupLgKK3vntcK98BtNHZtAF4LA/yl2a7k8R6Q==
universal-user-agent@^4.0.0:
version "4.0.1"
@@ -411,6 +418,19 @@ uuid@^8.3.2:
resolved "https://registry.yarnpkg.com/uuid/-/uuid-8.3.2.tgz#80d5b5ced271bb9af6c445f21a1a04c606cefbe2"
integrity sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==
webidl-conversions@^3.0.0:
version "3.0.1"
resolved "https://registry.yarnpkg.com/webidl-conversions/-/webidl-conversions-3.0.1.tgz#24534275e2a7bc6be7bc86611cc16ae0a5654871"
integrity sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ==
whatwg-url@^5.0.0:
version "5.0.0"
resolved "https://registry.yarnpkg.com/whatwg-url/-/whatwg-url-5.0.0.tgz#966454e8765462e37644d3626f6742ce8b70965d"
integrity sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw==
dependencies:
tr46 "~0.0.3"
webidl-conversions "^3.0.0"
which@^1.2.9:
version "1.3.1"
resolved "https://registry.yarnpkg.com/which/-/which-1.3.1.tgz#a45043d54f5805316da8d62f9f50918d3da70b0a"

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const path = require("path");
@@ -14,7 +14,7 @@ shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
// Add `package.json` and `yarn.lock` files
for (let dir of dirs) {
for (const dir of dirs) {
const packageJsonPath = path.join(ROOT, dir, 'package.json');
const packageJson = JSON.parse(fs.readFileSync(packageJsonPath).toString());
const relevantPackageJsonSections = {
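computeNodeModulesCacheKey.js above feeds .cachesalt, the .yarnrc files and each package.json into a running shasum to derive the key later consumed by the Cache@2 / Restore node_modules cache steps in the pipelines below. A hedged sketch of the same idea; computeCacheKey, the sha1 choice and the file list are illustrative, not the script's exact inputs:

import * as crypto from 'crypto';
import * as fs from 'fs';
import * as path from 'path';

function computeCacheKey(root: string, files: string[]): string {
    const shasum = crypto.createHash('sha1');
    for (const file of files) {
        // Any change to these inputs changes the key and invalidates the cached node_modules archive.
        shasum.update(fs.readFileSync(path.join(root, file)));
    }
    return shasum.digest('hex');
}

// e.g. computeCacheKey(process.cwd(), ['.yarnrc', 'package.json', 'yarn.lock'])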

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as fs from 'fs';
import * as path from 'path';
import * as crypto from 'crypto';
@@ -19,7 +17,7 @@ shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
// Add `package.json` and `yarn.lock` files
for (let dir of dirs) {
for (const dir of dirs) {
const packageJsonPath = path.join(ROOT, dir, 'package.json');
const packageJson = JSON.parse(fs.readFileSync(packageJsonPath).toString());
const relevantPackageJsonSections = {

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const crypto = require("crypto");
@@ -130,7 +130,6 @@ function getEnv(name) {
return result;
}
async function main() {
var _a;
const [, , product, os, arch, unprocessedType, fileName, filePath] = process.argv;
// getPlatform needs the unprocessedType
const platform = getPlatform(product, os, arch, unprocessedType);
@@ -169,7 +168,7 @@ async function main() {
console.log('Blob successfully uploaded to Azure storage.');
})
];
const shouldUploadToMooncake = /true/i.test((_a = process.env['VSCODE_PUBLISH_TO_MOONCAKE']) !== null && _a !== void 0 ? _a : 'true');
const shouldUploadToMooncake = /true/i.test(process.env['VSCODE_PUBLISH_TO_MOONCAKE'] ?? 'true');
if (shouldUploadToMooncake) {
const mooncakeCredential = new identity_1.ClientSecretCredential(process.env['AZURE_MOONCAKE_TENANT_ID'], process.env['AZURE_MOONCAKE_CLIENT_ID'], process.env['AZURE_MOONCAKE_CLIENT_SECRET']);
const mooncakeBlobServiceClient = new storage_blob_1.BlobServiceClient(`https://vscode.blob.core.chinacloudapi.cn`, mooncakeCredential, storagePipelineOptions);
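The mooncake toggle above defaults to on: /true/i.test(value ?? 'true') is true when the variable is unset or set to any string containing "true" (the regex is unanchored), and false for values such as '0' or 'false'. A quick check of those semantics:

const isEnabled = (value: string | undefined) => /true/i.test(value ?? 'true');
console.log(isEnabled(undefined)); // true  (unset defaults to publishing)
console.log(isEnabled('TRUE'));    // true  (case-insensitive)
console.log(isEnabled('false'));   // false
console.log(isEnabled('untrue'));  // true  (unanchored regex matches the substring)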

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as fs from 'fs';
import { Readable } from 'stream';
import * as crypto from 'crypto';

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const identity_1 = require("@azure/identity");
const cosmos_1 = require("@azure/cosmos");
@@ -19,12 +19,11 @@ function getEnv(name) {
return result;
}
async function main() {
var _a, _b, _c;
const [, , _version] = process.argv;
const quality = getEnv('VSCODE_QUALITY');
const commit = ((_a = process.env['VSCODE_DISTRO_COMMIT']) === null || _a === void 0 ? void 0 : _a.trim()) || getEnv('BUILD_SOURCEVERSION');
const commit = process.env['VSCODE_DISTRO_COMMIT']?.trim() || getEnv('BUILD_SOURCEVERSION');
const queuedBy = getEnv('BUILD_QUEUEDBY');
const sourceBranch = ((_b = process.env['VSCODE_DISTRO_REF']) === null || _b === void 0 ? void 0 : _b.trim()) || getEnv('BUILD_SOURCEBRANCH');
const sourceBranch = process.env['VSCODE_DISTRO_REF']?.trim() || getEnv('BUILD_SOURCEBRANCH');
const version = _version + (quality === 'stable' ? '' : `-${quality}`);
console.log('Creating build...');
console.log('Quality:', quality);
@@ -35,7 +34,7 @@ async function main() {
timestamp: (new Date()).getTime(),
version,
isReleased: false,
private: Boolean((_c = process.env['VSCODE_DISTRO_REF']) === null || _c === void 0 ? void 0 : _c.trim()),
private: Boolean(process.env['VSCODE_DISTRO_REF']?.trim()),
sourceBranch,
queuedBy,
assets: [],
@@ -44,7 +43,7 @@ async function main() {
const aadCredentials = new identity_1.ClientSecretCredential(process.env['AZURE_TENANT_ID'], process.env['AZURE_CLIENT_ID'], process.env['AZURE_CLIENT_SECRET']);
const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], aadCredentials });
const scripts = client.database('builds').container(quality).scripts;
await (0, retry_1.retry)(() => scripts.storedProcedure('createBuild').execute('', [Object.assign(Object.assign({}, build), { _partitionKey: '' })]));
await (0, retry_1.retry)(() => scripts.storedProcedure('createBuild').execute('', [{ ...build, _partitionKey: '' }]));
}
main().then(() => {
console.log('Build successfully created');
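In createBuild.js the downlevelled Object.assign(Object.assign({}, build), { _partitionKey: '' }) becomes a plain object spread; both build the same document handed to the createBuild stored procedure. A small equivalence check:

const build = { id: 'abc123', version: '1.80.0-insider', isReleased: false };
const viaSpread = { ...build, _partitionKey: '' };
const viaAssign = Object.assign(Object.assign({}, build), { _partitionKey: '' });
console.log(JSON.stringify(viaSpread) === JSON.stringify(viaAssign)); // true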

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import { ClientSecretCredential } from '@azure/identity';
import { CosmosClient } from '@azure/cosmos';
import { retry } from './retry';

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const path = require("path");

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as fs from 'fs';
import * as path from 'path';

View File

@@ -6,10 +6,10 @@
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const crypto = require("crypto");
const azure = require("azure-storage");
const mime = require("mime");
const minimist = require("minimist");
const documentdb_1 = require("documentdb");
const storage_blob_1 = require("@azure/storage-blob");
// {{SQL CARBON EDIT}}
if (process.argv.length < 9) {
console.error('Usage: node publish.js <product_quality> <platform> <file_type> <file_name> <version> <is_update> <file> [commit_id]');
@@ -104,21 +104,23 @@ function createOrUpdate(commit, quality, platform, type, release, asset, isUpdat
});
}));
}
async function assertContainer(blobService, quality) {
await new Promise((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
async function assertContainer(containerClient) {
let containerResponse = await containerClient.createIfNotExists({ access: 'blob' });
return containerResponse && !!containerResponse.errorCode;
}
async function doesAssetExist(blobService, quality, blobName) {
const existsResult = await new Promise((c, e) => blobService.doesBlobExist(quality, blobName, (err, r) => err ? e(err) : c(r)));
return existsResult.exists;
}
async function uploadBlob(blobService, quality, blobName, file) {
const blobOptions = {
contentSettings: {
contentType: mime.lookup(file),
cacheControl: 'max-age=31536000, public'
async function uploadBlob(blobClient, file) {
const result = await blobClient.uploadFile(file, {
blobHTTPHeaders: {
blobContentType: mime.lookup(file),
blobCacheControl: 'max-age=31536000, public'
}
};
await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c()));
});
if (result && !result.errorCode) {
console.log(`Blobs uploaded successfully, response status: ${result?._response?.status}`);
}
else {
console.error(`Blobs failed to upload, response status: ${result?._response?.status}, errorcode: ${result?.errorCode}`);
}
}
async function publish(commit, quality, platform, type, name, version, _isUpdate, file, opts) {
const isUpdate = _isUpdate === 'true';
@@ -142,54 +144,62 @@ async function publish(commit, quality, platform, type, name, version, _isUpdate
console.log('SHA256:', sha256hash);
const blobName = commit + '/' + name;
const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2'];
const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2'])
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
await assertContainer(blobService, quality);
const blobExists = await doesAssetExist(blobService, quality, blobName);
if (blobExists) {
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
console.log('Uploading blobs to Azure storage...');
await uploadBlob(blobService, quality, blobName, file);
console.log('Blobs successfully uploaded.');
const config = await getConfig(quality);
console.log('Quality config:', config);
const asset = {
platform: platform,
type: type,
url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
hash: sha1hash,
sha256hash,
size
};
// Remove this if we ever need to rollback fast updates for windows
if (/win32/.test(platform)) {
asset.supportsFastUpdate = true;
}
console.log('Asset:', JSON.stringify(asset, null, ' '));
// {{SQL CARBON EDIT}}
// Insiders: nightly build from main
const isReleased = (((quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
(quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))) &&
/Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy));
const release = {
id: commit,
timestamp: (new Date()).getTime(),
version,
isReleased: isReleased,
sourceBranch,
queuedBy,
assets: [],
updates: {}
};
if (!opts['upload-only']) {
release.assets.push(asset);
if (isUpdate) {
release.updates[platform] = type;
const storageKey = process.env['AZURE_STORAGE_ACCESS_KEY_2'];
const connectionString = `DefaultEndpointsProtocol=https;AccountName=${storageAccount};AccountKey=${storageKey};EndpointSuffix=core.windows.net`;
let blobServiceClient = storage_blob_1.BlobServiceClient.fromConnectionString(connectionString, {
retryOptions: {
maxTries: 20,
retryPolicyType: storage_blob_1.StorageRetryPolicyType.EXPONENTIAL
}
});
let containerClient = blobServiceClient.getContainerClient(quality);
if (await assertContainer(containerClient)) {
const blobClient = containerClient.getBlockBlobClient(blobName);
const blobExists = await blobClient.exists();
if (blobExists) {
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
console.log('Uploading blobs to Azure storage...');
await uploadBlob(blobClient, file);
const config = await getConfig(quality);
console.log('Quality config:', config);
const asset = {
platform: platform,
type: type,
url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
hash: sha1hash,
sha256hash,
size
};
// Remove this if we ever need to rollback fast updates for windows
if (/win32/.test(platform)) {
asset.supportsFastUpdate = true;
}
console.log('Asset:', JSON.stringify(asset, null, ' '));
// {{SQL CARBON EDIT}}
// Insiders: nightly build from main
const isReleased = (((quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
(quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))) &&
/Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy));
const release = {
id: commit,
timestamp: (new Date()).getTime(),
version,
isReleased: isReleased,
sourceBranch,
queuedBy,
assets: [],
updates: {}
};
if (!opts['upload-only']) {
release.assets.push(asset);
if (isUpdate) {
release.updates[platform] = type;
}
}
await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
}
await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
}
const RETRY_TIMES = 10;
async function retry(fn) {
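The publish.js rewrite above swaps the legacy azure-storage SDK for @azure/storage-blob: the connection-string BlobServiceClient carries the retry policy that ExponentialRetryPolicyFilter(20) used to provide, createIfNotExists replaces createContainerIfNotExists, blobClient.exists() replaces doesBlobExist, and uploadFile with blobHTTPHeaders replaces createBlockBlobFromLocalFile. A condensed sketch of that flow; uploadAsset is a hypothetical wrapper, while the environment variable names follow the script above:

import * as mime from 'mime';
import { BlobServiceClient, StorageRetryPolicyType } from '@azure/storage-blob';

async function uploadAsset(quality: string, blobName: string, file: string): Promise<void> {
    const account = process.env['AZURE_STORAGE_ACCOUNT_2'];
    const key = process.env['AZURE_STORAGE_ACCESS_KEY_2'];
    const connectionString = `DefaultEndpointsProtocol=https;AccountName=${account};AccountKey=${key};EndpointSuffix=core.windows.net`;
    const serviceClient = BlobServiceClient.fromConnectionString(connectionString, {
        retryOptions: { maxTries: 20, retryPolicyType: StorageRetryPolicyType.EXPONENTIAL }
    });
    const containerClient = serviceClient.getContainerClient(quality);
    // Container is created with public blob access, as in the script above.
    await containerClient.createIfNotExists({ access: 'blob' });
    const blobClient = containerClient.getBlockBlobClient(blobName);
    if (await blobClient.exists()) {
        console.log(`Blob ${quality}/${blobName} already exists, not publishing again.`);
        return;
    }
    await blobClient.uploadFile(file, {
        blobHTTPHeaders: {
            blobContentType: mime.lookup(file),
            blobCacheControl: 'max-age=31536000, public'
        }
    });
    console.log('Blob successfully uploaded to Azure storage.');
}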

View File

@@ -8,10 +8,10 @@
import * as fs from 'fs';
import { Readable } from 'stream';
import * as crypto from 'crypto';
import * as azure from 'azure-storage';
import * as mime from 'mime';
import * as minimist from 'minimist';
import { DocumentClient, NewDocument } from 'documentdb';
import { BlobServiceClient, BlockBlobClient, ContainerClient, StorageRetryPolicyType } from '@azure/storage-blob';
// {{SQL CARBON EDIT}}
if (process.argv.length < 9) {
@@ -127,24 +127,23 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
}));
}
async function assertContainer(blobService: azure.BlobService, quality: string): Promise<void> {
await new Promise<void>((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
async function assertContainer(containerClient: ContainerClient): Promise<boolean> {
let containerResponse = await containerClient.createIfNotExists({ access: 'blob' });
return containerResponse && !!containerResponse.errorCode;
}
async function doesAssetExist(blobService: azure.BlobService, quality: string, blobName: string): Promise<boolean | undefined> {
const existsResult = await new Promise<azure.BlobService.BlobResult>((c, e) => blobService.doesBlobExist(quality, blobName, (err, r) => err ? e(err) : c(r)));
return existsResult.exists;
}
async function uploadBlob(blobService: azure.BlobService, quality: string, blobName: string, file: string): Promise<void> {
const blobOptions: azure.BlobService.CreateBlockBlobRequestOptions = {
contentSettings: {
contentType: mime.lookup(file),
cacheControl: 'max-age=31536000, public'
async function uploadBlob(blobClient: BlockBlobClient, file: string): Promise<void> {
const result = await blobClient.uploadFile(file, {
blobHTTPHeaders: {
blobContentType: mime.lookup(file),
blobCacheControl: 'max-age=31536000, public'
}
};
await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c()));
});
if (result && !result.errorCode) {
console.log(`Blobs uploaded successfully, response status: ${result?._response?.status}`);
} else {
console.error(`Blobs failed to upload, response status: ${result?._response?.status}, errorcode: ${result?.errorCode}`)
}
}
interface PublishOptions {
@@ -180,74 +179,78 @@ async function publish(commit: string, quality: string, platform: string, type:
const blobName = commit + '/' + name;
const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2']!;
const storageKey = process.env['AZURE_STORAGE_ACCESS_KEY_2']!;
const connectionString = `DefaultEndpointsProtocol=https;AccountName=${storageAccount};AccountKey=${storageKey};EndpointSuffix=core.windows.net`;
const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2']!)
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
await assertContainer(blobService, quality);
const blobExists = await doesAssetExist(blobService, quality, blobName);
if (blobExists) {
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
console.log('Uploading blobs to Azure storage...');
await uploadBlob(blobService, quality, blobName, file);
console.log('Blobs successfully uploaded.');
const config = await getConfig(quality);
console.log('Quality config:', config);
const asset: Asset = {
platform: platform,
type: type,
url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
hash: sha1hash,
sha256hash,
size
};
// Remove this if we ever need to rollback fast updates for windows
if (/win32/.test(platform)) {
asset.supportsFastUpdate = true;
}
console.log('Asset:', JSON.stringify(asset, null, ' '));
// {{SQL CARBON EDIT}}
// Insiders: nightly build from main
const isReleased = (
(
(quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
(quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))
) &&
/Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy)
);
const release = {
id: commit,
timestamp: (new Date()).getTime(),
version,
isReleased: isReleased,
sourceBranch,
queuedBy,
assets: [] as Array<Asset>,
updates: {} as any
};
if (!opts['upload-only']) {
release.assets.push(asset);
if (isUpdate) {
release.updates[platform] = type;
let blobServiceClient = BlobServiceClient.fromConnectionString(connectionString, {
retryOptions: {
maxTries: 20,
retryPolicyType: StorageRetryPolicyType.EXPONENTIAL
}
}
});
await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
let containerClient = blobServiceClient.getContainerClient(quality);
if (await assertContainer(containerClient)) {
const blobClient = containerClient.getBlockBlobClient(blobName);
const blobExists = await blobClient.exists();
if (blobExists) {
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
console.log('Uploading blobs to Azure storage...');
await uploadBlob(blobClient, file);
const config = await getConfig(quality);
console.log('Quality config:', config);
const asset: Asset = {
platform: platform,
type: type,
url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
hash: sha1hash,
sha256hash,
size
};
// Remove this if we ever need to rollback fast updates for windows
if (/win32/.test(platform)) {
asset.supportsFastUpdate = true;
}
console.log('Asset:', JSON.stringify(asset, null, ' '));
// {{SQL CARBON EDIT}}
// Insiders: nightly build from main
const isReleased = (
(
(quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
(quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))
) &&
/Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy)
);
const release = {
id: commit,
timestamp: (new Date()).getTime(),
version,
isReleased: isReleased,
sourceBranch,
queuedBy,
assets: [] as Array<Asset>,
updates: {} as any
};
if (!opts['upload-only']) {
release.assets.push(asset);
if (isUpdate) {
release.updates[platform] = type;
}
}
await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
}
}
const RETRY_TIMES = 10;

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const identity_1 = require("@azure/identity");
const cosmos_1 = require("@azure/cosmos");

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import { ClientSecretCredential } from '@azure/identity';
import { CosmosClient } from '@azure/cosmos';
import { retry } from './retry';

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
exports.retry = void 0;
async function retry(fn) {

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
export async function retry<T>(fn: () => Promise<T>): Promise<T> {
let lastError: Error | undefined;
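retry.ts only surfaces its signature and the lastError declaration here; a body compatible with that signature might look like the following sketch (an assumption about shape, not the repo's actual implementation; the retry count and back-off are placeholders):

export async function retry<T>(fn: () => Promise<T>, retries = 10): Promise<T> {
    let lastError: Error | undefined;
    for (let attempt = 1; attempt <= retries; attempt++) {
        try {
            return await fn();
        } catch (err) {
            lastError = err as Error;
            console.warn(`Attempt ${attempt} failed: ${lastError.message}`);
            // Simple linear back-off between attempts.
            await new Promise((resolve) => setTimeout(resolve, attempt * 1000));
        }
    }
    throw lastError;
}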

View File

@@ -35,25 +35,66 @@ steps:
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
mkdir -p .build
node build/azure-pipelines/common/computeNodeModulesCacheKey.js x64 $ENABLE_TERRAPIN > .build/yarnlockhash
displayName: Prepare yarn cache flags
- task: Cache@2
inputs:
key: "nodeModules | $(Agent.OS) | .build/yarnlockhash"
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
displayName: Restore node_modules cache
- script: |
set -e
tar -xzf .build/node_modules_cache/cache.tgz
displayName: Extract node_modules cache
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
npm install -g node-gyp@latest
node-gyp --version
displayName: Update node-gyp
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
npx https://aka.ms/enablesecurefeed standAlone
timeoutInMinutes: 5
retryCountOnTaskFailure: 3
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
displayName: Switch to Terrapin packages
- script: |
set -e
export npm_config_arch=$(VSCODE_ARCH)
export npm_config_node_gyp=$(which node-gyp)
for i in {1..3}; do # try 3 times, for Terrapin
yarn --cwd build --frozen-lockfile --check-files && break
yarn --frozen-lockfile --check-files && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
displayName: Install build dependencies
env:
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
GITHUB_TOKEN: "$(github-distro-mixin-password)"
displayName: Install dependencies
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
- download: current
artifact: unsigned_vscode_client_darwin_$(VSCODE_ARCH)_archive

View File

@@ -1,270 +1,213 @@
parameters:
- name: VSCODE_QUALITY
type: string
- name: VSCODE_RUN_UNIT_TESTS
type: boolean
- name: VSCODE_RUN_INTEGRATION_TESTS
type: boolean
- name: VSCODE_RUN_SMOKE_TESTS
type: boolean
steps:
- task: NodeTool@0
inputs:
versionSpec: "16.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password,macos-developer-certificate,macos-developer-certificate-key"
- task: DownloadPipelineArtifact@2
inputs:
artifact: Compilation
path: $(Build.ArtifactStagingDirectory)
displayName: Download compilation output
- script: |
set -e
tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
displayName: Extract compilation output
# Set up the credentials to retrieve distro repo and setup git persona
# to create a merge commit for when we merge distro into oss
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
git checkout FETCH_HEAD
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
mkdir -p .build
node build/azure-pipelines/common/computeNodeModulesCacheKey.js $VSCODE_ARCH $ENABLE_TERRAPIN > .build/yarnlockhash
displayName: Prepare yarn cache flags
- task: Cache@2
inputs:
key: "nodeModules | $(Agent.OS) | .build/yarnlockhash"
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
displayName: Restore node_modules cache
- script: |
set -e
tar -xzf .build/node_modules_cache/cache.tgz
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Extract node_modules cache
- script: |
set -e
npm install -g node-gyp@latest
node-gyp --version
displayName: Update node-gyp
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
npx https://aka.ms/enablesecurefeed standAlone
timeoutInMinutes: 5
retryCountOnTaskFailure: 3
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
displayName: Switch to Terrapin packages
- script: |
set -e
export npm_config_arch=$(VSCODE_ARCH)
export npm_config_node_gyp=$(which node-gyp)
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile --check-files && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
env:
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
GITHUB_TOKEN: "$(github-distro-mixin-password)"
displayName: Install dependencies
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
# This script brings in the right resources (images, icons, etc) based on the quality (insiders, stable, exploration)
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-darwin-$(VSCODE_ARCH)-min-ci
displayName: Build client
- script: |
set -e
node build/azure-pipelines/mixin --server
displayName: Mix in server quality
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-darwin-$(VSCODE_ARCH)-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-darwin-$(VSCODE_ARCH)-min-ci
displayName: Build Server
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn npm-run-all -lp "electron $(VSCODE_ARCH)" "playwright-install"
displayName: Download Electron and Playwright
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
# Setting hardened entitlements is a requirement for:
# * Running tests on Big Sur (because Big Sur has additional security precautions)
- script: |
set -e
security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
VSCODE_ARCH=$(VSCODE_ARCH) DEBUG=electron-osx-sign* node build/darwin/sign.js
displayName: Set Hardened Entitlements
- ${{ if eq(parameters.VSCODE_RUN_UNIT_TESTS, true) }}:
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
./scripts/test.sh --tfs "Unit Tests"
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
- script: |
set -e
./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
- script: |
set -e
yarn test-node
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
- script: |
set -e
yarn test-node --build
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
- script: |
set -e
DEBUG=*browser* yarn test-browser-no-install --sequential --browser chromium --browser webkit --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser, Chromium & Webkit)
timeoutInMinutes: 30
- script: |
set -e
DEBUG=*browser* yarn test-browser-no-install --sequential --build --browser chromium --browser webkit --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser, Chromium & Webkit)
timeoutInMinutes: 30
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
- script: |
set -e
yarn test-node --build
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin-$(VSCODE_ARCH)" \
./scripts/test-web-integration.sh --browser webkit
displayName: Run integration tests (Browser, Webkit)
timeoutInMinutes: 20
- script: |
set -e
DEBUG=*browser* yarn test-browser-no-install --sequential --build --browser chromium --browser webkit --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser, Chromium & Webkit)
timeoutInMinutes: 30
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
./scripts/test-remote-integration.sh
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
- ${{ if eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true) }}:
- script: |
set -e
yarn gulp \
compile-extension:configuration-editing \
compile-extension:css-language-features-server \
compile-extension:emmet \
compile-extension:git \
compile-extension:github-authentication \
compile-extension:html-language-features-server \
compile-extension:ipynb \
compile-extension:json-language-features-server \
compile-extension:markdown-language-features-server \
compile-extension:markdown-language-features \
compile-extension-media \
compile-extension:microsoft-authentication \
compile-extension:typescript-language-features \
compile-extension:vscode-api-tests \
compile-extension:vscode-colorize-tests \
compile-extension:vscode-notebook-tests \
compile-extension:vscode-test-resolver
displayName: Build integration tests
- script: |
set -e
ps -ef
displayName: Diagnostics before smoke test run
continueOnError: true
condition: succeededOrFailed()
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
./scripts/test-integration.sh --tfs "Integration Tests"
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin-$(VSCODE_ARCH)" \
yarn smoketest-no-compile --web --tracing --headless
timeoutInMinutes: 10
displayName: Run smoke tests (Browser, Chromium)
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
yarn smoketest-no-compile --tracing --build "$APP_ROOT/$APP_NAME"
timeoutInMinutes: 20
displayName: Run smoke tests (Electron)
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin-$(VSCODE_ARCH)" \
./scripts/test-web-integration.sh --browser webkit
displayName: Run integration tests (Browser, Webkit)
timeoutInMinutes: 20
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
yarn smoketest-no-compile --tracing --remote --build "$APP_ROOT/$APP_NAME"
timeoutInMinutes: 20
displayName: Run smoke tests (Remote)
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
./scripts/test-remote-integration.sh
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
- script: |
set -e
ps -ef
displayName: Diagnostics after smoke test run
continueOnError: true
condition: succeededOrFailed()
- ${{ if eq(parameters.VSCODE_RUN_SMOKE_TESTS, true) }}:
- script: |
set -e
ps -ef
displayName: Diagnostics before smoke test run
continueOnError: true
condition: succeededOrFailed()
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-macos-$(VSCODE_ARCH)
targetPath: .build/crashes
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
yarn --cwd test/smoke compile
displayName: Compile smoke tests
# In order to properly symbolify above crash reports
# (if any), we need the compiled native modules too
- task: PublishPipelineArtifact@0
inputs:
artifactName: node-modules-macos-$(VSCODE_ARCH)
targetPath: node_modules
displayName: "Publish Node Modules"
continueOnError: true
condition: failed()
- script: |
set -e
yarn smoketest-no-compile --tracing
timeoutInMinutes: 20
displayName: Run smoke tests (Electron)
- task: PublishPipelineArtifact@0
inputs:
artifactName: logs-macos-$(VSCODE_ARCH)-$(System.JobAttempt)
targetPath: .build/logs
displayName: "Publish Log Files"
continueOnError: true
condition: failed()
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
yarn smoketest-no-compile --tracing --build "$APP_ROOT/$APP_NAME"
timeoutInMinutes: 20
displayName: Run smoke tests (Electron)
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin-$(VSCODE_ARCH)" \
yarn smoketest-no-compile --web --tracing --headless
timeoutInMinutes: 20
displayName: Run smoke tests (Browser, Chromium)
- script: |
set -e
yarn gulp compile-extension:vscode-test-resolver
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
yarn smoketest-no-compile --tracing --remote --build "$APP_ROOT/$APP_NAME"
timeoutInMinutes: 20
displayName: Run smoke tests (Remote)
- script: |
set -e
ps -ef
displayName: Diagnostics after smoke test run
continueOnError: true
condition: succeededOrFailed()
- ${{ if or(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
- task: PublishPipelineArtifact@0
inputs:
targetPath: .build/crashes
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: crash-dump-macos-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: crash-dump-macos-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: crash-dump-macos-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
# In order to properly symbolify above crash reports
# (if any), we need the compiled native modules too
- task: PublishPipelineArtifact@0
inputs:
targetPath: node_modules
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: node-modules-macos-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: node-modules-macos-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: node-modules-macos-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Node Modules"
continueOnError: true
condition: failed()
- task: PublishPipelineArtifact@0
inputs:
targetPath: .build/logs
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: logs-macos-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: logs-macos-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: logs-macos-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Log Files"
continueOnError: true
condition: succeededOrFailed()
- task: PublishTestResults@2
displayName: Publish Tests Results

View File

@@ -51,6 +51,50 @@ steps:
set -e
tar -xzf .build/node_modules_cache/cache.tgz
displayName: Extract node_modules cache
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
npm install -g node-gyp@latest
node-gyp --version
displayName: Update node-gyp
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
npx https://aka.ms/enablesecurefeed standAlone
timeoutInMinutes: 5
retryCountOnTaskFailure: 3
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
displayName: Switch to Terrapin packages
- script: |
set -e
export npm_config_arch=$(VSCODE_ARCH)
export npm_config_node_gyp=$(which node-gyp)
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile --check-files && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
env:
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
GITHUB_TOKEN: "$(github-distro-mixin-password)"
displayName: Install dependencies
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
- script: |
set -e

View File

@@ -1,50 +1,73 @@
parameters:
- name: VSCODE_PUBLISH
type: boolean
- name: VSCODE_QUALITY
type: string
- name: VSCODE_RUN_UNIT_TESTS
type: boolean
- name: VSCODE_RUN_INTEGRATION_TESTS
type: boolean
- name: VSCODE_RUN_SMOKE_TESTS
type: boolean
steps:
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- checkout: self
fetchDepth: 1
retryCountOnTaskFailure: 3
- task: NodeTool@0
inputs:
versionSpec: "16.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password,macos-developer-certificate,macos-developer-certificate-key"
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password,macos-developer-certificate,macos-developer-certificate-key"
- task: DownloadPipelineArtifact@2
inputs:
artifact: Compilation
path: $(Build.ArtifactStagingDirectory)
displayName: Download compilation output
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: DownloadPipelineArtifact@2
inputs:
artifact: Compilation
path: $(Build.ArtifactStagingDirectory)
displayName: Download compilation output
- script: |
set -e
tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
displayName: Extract compilation output
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
displayName: Extract compilation output
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
git checkout FETCH_HEAD
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
git checkout FETCH_HEAD
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
mkdir -p .build
@@ -64,13 +87,6 @@ steps:
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Extract node_modules cache
- script: |
set -e
npm install -g node-gyp@latest
node-gyp --version
displayName: Update node-gyp
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
npx https://aka.ms/enablesecurefeed standAlone
@@ -107,115 +123,142 @@ steps:
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
# This script brings in the right resources (images, icons, etc) based on the quality (insiders, stable, exploration)
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
# This script brings in the right resources (images, icons, etc) based on the quality (insiders, stable, exploration)
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-darwin-$(VSCODE_ARCH)-min-ci
displayName: Build client
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-darwin-$(VSCODE_ARCH)-min-ci
displayName: Build client
- script: |
set -e
node build/azure-pipelines/mixin --server
displayName: Mix in server quality
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
node build/azure-pipelines/mixin --server
displayName: Mix in server quality
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-darwin-$(VSCODE_ARCH)-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-darwin-$(VSCODE_ARCH)-min-ci
displayName: Build Server
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-darwin-$(VSCODE_ARCH)-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-darwin-$(VSCODE_ARCH)-min-ci
displayName: Build Server
# Setting hardened entitlements is a requirement for:
# * Apple notarization
# * Running tests on Big Sur (because Big Sur has additional security precautions)
- script: |
set -e
security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
VSCODE_ARCH=$(VSCODE_ARCH) DEBUG=electron-osx-sign* node build/darwin/sign.js
displayName: Set Hardened Entitlements
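# The script above creates a throwaway build keychain, imports the Apple developer certificate
# fetched from Key Vault, and grants codesign access to it so build/darwin/sign.js can apply
# the hardened-runtime entitlements non-interactively.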
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp "transpile-client" "transpile-extensions"
displayName: Transpile
- script: |
set -e
pushd $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH) && zip -r -X -y $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip * && popd
displayName: Archive build
- ${{ if or(eq(parameters.VSCODE_RUN_UNIT_TESTS, true), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
- template: product-build-darwin-test.yml
parameters:
VSCODE_QUALITY: ${{ parameters.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: ${{ parameters.VSCODE_RUN_UNIT_TESTS }}
VSCODE_RUN_INTEGRATION_TESTS: ${{ parameters.VSCODE_RUN_INTEGRATION_TESTS }}
VSCODE_RUN_SMOKE_TESTS: ${{ parameters.VSCODE_RUN_SMOKE_TESTS }}
- script: |
set -e
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
# Setting hardened entitlements is a requirement for:
# * Apple notarization
# * Running tests on Big Sur (because Big Sur has additional security precautions)
- script: |
set -e
security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
VSCODE_ARCH=$(VSCODE_ARCH) DEBUG=electron-osx-sign* node build/darwin/sign.js
displayName: Set Hardened Entitlements
# package Remote Extension Host
pushd .. && mv vscode-reh-darwin-$(VSCODE_ARCH) vscode-server-darwin-$(VSCODE_ARCH) && zip -Xry vscode-server-darwin-$(VSCODE_ARCH).zip vscode-server-darwin-$(VSCODE_ARCH) && popd
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- script: |
set -e
pushd $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH) && zip -r -X -y $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip * && popd
displayName: Archive build
# package Remote Extension Host (Web)
pushd .. && mv vscode-reh-web-darwin-$(VSCODE_ARCH) vscode-server-darwin-$(VSCODE_ARCH)-web && zip -Xry vscode-server-darwin-$(VSCODE_ARCH)-web.zip vscode-server-darwin-$(VSCODE_ARCH)-web && popd
displayName: Prepare to publish servers
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- script: |
set -e
- publish: $(Agent.BuildDirectory)/VSCode-darwin-$(VSCODE_ARCH).zip
artifact: unsigned_vscode_client_darwin_$(VSCODE_ARCH)_archive
displayName: Publish client archive
# package Remote Extension Host
pushd .. && mv vscode-reh-darwin-$(VSCODE_ARCH) vscode-server-darwin-$(VSCODE_ARCH) && zip -Xry vscode-server-darwin-$(VSCODE_ARCH).zip vscode-server-darwin-$(VSCODE_ARCH) && popd
- publish: $(Agent.BuildDirectory)/vscode-server-darwin-$(VSCODE_ARCH).zip
artifact: vscode_server_darwin_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish server archive
# package Remote Extension Host (Web)
pushd .. && mv vscode-reh-web-darwin-$(VSCODE_ARCH) vscode-server-darwin-$(VSCODE_ARCH)-web && zip -Xry vscode-server-darwin-$(VSCODE_ARCH)-web.zip vscode-server-darwin-$(VSCODE_ARCH)-web && popd
displayName: Prepare to publish servers
- publish: $(Agent.BuildDirectory)/vscode-server-darwin-$(VSCODE_ARCH)-web.zip
artifact: vscode_web_darwin_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish web server archive
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (client)
inputs:
BuildDropPath: $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
PackageName: Visual Studio Code
- task: AzureCLI@2
inputs:
azureSubscription: "vscode-builds-subscription"
scriptType: pscore
scriptLocation: inlineScript
addSpnToEnvironment: true
inlineScript: |
Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]$env:tenantId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID]$env:servicePrincipalId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET;issecret=true]$env:servicePrincipalKey"
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- publish: $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (client)
artifact: vscode_client_darwin_$(VSCODE_ARCH)_sbom
- script: |
set -e
AZURE_STORAGE_ACCOUNT="ticino" \
AZURE_TENANT_ID="$(AZURE_TENANT_ID)" \
AZURE_CLIENT_ID="$(AZURE_CLIENT_ID)" \
AZURE_CLIENT_SECRET="$(AZURE_CLIENT_SECRET)" \
VSCODE_ARCH="$(VSCODE_ARCH)" \
node build/azure-pipelines/upload-configuration
displayName: Upload configuration (for Bing settings search)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))
continueOnError: true
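# upload-configuration publishes the settings metadata that backs Bing's settings search; it runs
# only from the x64 build and is best-effort (continueOnError), so a failed upload never fails the job.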
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (server)
inputs:
BuildDropPath: $(agent.builddirectory)/vscode-server-darwin-$(VSCODE_ARCH)
PackageName: Visual Studio Code Server
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (client)
inputs:
BuildDropPath: $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
PackageName: Visual Studio Code
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- publish: $(agent.builddirectory)/vscode-server-darwin-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (server)
artifact: vscode_server_darwin_$(VSCODE_ARCH)_sbom
- publish: $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (client)
artifact: vscode_client_darwin_$(VSCODE_ARCH)_sbom
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- publish: $(Agent.BuildDirectory)/VSCode-darwin-$(VSCODE_ARCH).zip
artifact: unsigned_vscode_client_darwin_$(VSCODE_ARCH)_archive
displayName: Publish client archive
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (server)
inputs:
BuildDropPath: $(agent.builddirectory)/vscode-server-darwin-$(VSCODE_ARCH)
PackageName: Visual Studio Code Server
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- publish: $(Agent.BuildDirectory)/vscode-server-darwin-$(VSCODE_ARCH).zip
artifact: vscode_server_darwin_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish server archive
- publish: $(agent.builddirectory)/vscode-server-darwin-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (server)
artifact: vscode_server_darwin_$(VSCODE_ARCH)_sbom
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- publish: $(Agent.BuildDirectory)/vscode-server-darwin-$(VSCODE_ARCH)-web.zip
artifact: vscode_web_darwin_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish web server archive
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- task: AzureCLI@2
inputs:
azureSubscription: "vscode-builds-subscription"
scriptType: pscore
scriptLocation: inlineScript
addSpnToEnvironment: true
inlineScript: |
Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]$env:tenantId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID]$env:servicePrincipalId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET;issecret=true]$env:servicePrincipalKey"
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- script: |
set -e
AZURE_STORAGE_ACCOUNT="ticino" \
AZURE_TENANT_ID="$(AZURE_TENANT_ID)" \
AZURE_CLIENT_ID="$(AZURE_CLIENT_ID)" \
AZURE_CLIENT_SECRET="$(AZURE_CLIENT_SECRET)" \
VSCODE_ARCH="$(VSCODE_ARCH)" \
node build/azure-pipelines/upload-configuration
displayName: Upload configuration (for Bing settings search)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
continueOnError: true


@@ -112,18 +112,19 @@ steps:
displayName: Run unit tests
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin-x64
APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-darwin" \
./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run core integration tests
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
# {{SQL CARBON TODO}} Reenable "Run Core Integration Tests"
# - script: |
# # Figure out the full absolute path of the product we just built
# # including the remote server and configure the integration tests
# # to run with these builds instead of running out of sources.
# set -e
# APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin-x64
# APP_NAME="`ls $APP_ROOT | head -n 1`"
# INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
# VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-darwin" \
# ./scripts/test-integration.sh --build --tfs "Integration Tests"
# displayName: Run core integration tests
# condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e


@@ -4,9 +4,7 @@ pool:
trigger:
branches:
include: ["main", "release/*"]
pr:
branches:
include: ["main", "release/*"]
pr: none
steps:
- task: NodeTool@0


@@ -118,7 +118,9 @@ steps:
- script: |
set -e
docker run -e VSCODE_QUALITY -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:alpine-$(VSCODE_ARCH) /root/vscode/build/azure-pipelines/linux/scripts/install-remote-dependencies.sh
docker run -e VSCODE_QUALITY -e GITHUB_TOKEN -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:alpine-$(VSCODE_ARCH) /root/vscode/build/azure-pipelines/linux/scripts/install-remote-dependencies.sh
env:
GITHUB_TOKEN: "$(github-distro-mixin-password)"
displayName: Prebuild
- script: |


@@ -0,0 +1,272 @@
parameters:
- name: VSCODE_QUALITY
type: string
- name: VSCODE_RUN_UNIT_TESTS
type: boolean
- name: VSCODE_RUN_INTEGRATION_TESTS
type: boolean
- name: VSCODE_RUN_SMOKE_TESTS
type: boolean
steps:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn npm-run-all -lp "electron $(VSCODE_ARCH)" "playwright-install"
displayName: Download Electron and Playwright
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
sudo apt-get update
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libgbm1
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
displayName: Setup build environment
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
ELECTRON_ROOT=.build/electron
sudo chown root $APP_ROOT/chrome-sandbox
sudo chown root $ELECTRON_ROOT/chrome-sandbox
sudo chmod 4755 $APP_ROOT/chrome-sandbox
sudo chmod 4755 $ELECTRON_ROOT/chrome-sandbox
stat $APP_ROOT/chrome-sandbox
stat $ELECTRON_ROOT/chrome-sandbox
displayName: Change setuid helper binary permission
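# Chromium's setuid sandbox helper must be root-owned with mode 4755; without this both the
# packaged app and the downloaded Electron would refuse to launch with the sandbox enabled.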
- ${{ if eq(parameters.VSCODE_RUN_UNIT_TESTS, true) }}:
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests"
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
- script: |
set -e
yarn test-node
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
- script: |
set -e
DEBUG=*browser* yarn test-browser-no-install --browser chromium --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser, Chromium)
timeoutInMinutes: 15
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
- script: |
set -e
yarn test-node --build
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
- script: |
set -e
DEBUG=*browser* yarn test-browser-no-install --build --browser chromium --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser, Chromium)
timeoutInMinutes: 15
- ${{ if eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true) }}:
- script: |
set -e
yarn gulp \
compile-extension:configuration-editing \
compile-extension:css-language-features-server \
compile-extension:emmet \
compile-extension:git \
compile-extension:github-authentication \
compile-extension:html-language-features-server \
compile-extension:ipynb \
compile-extension:json-language-features-server \
compile-extension:markdown-language-features-server \
compile-extension:markdown-language-features \
compile-extension-media \
compile-extension:microsoft-authentication \
compile-extension:typescript-language-features \
compile-extension:vscode-api-tests \
compile-extension:vscode-colorize-tests \
compile-extension:vscode-notebook-tests \
compile-extension:vscode-test-resolver
displayName: Build integration tests
- ${{ if eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true) }}:
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
DISPLAY=:10 ./scripts/test-integration.sh --tfs "Integration Tests"
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
- script: |
set -e
./scripts/test-web-integration.sh --browser chromium
displayName: Run integration tests (Browser, Chromium)
timeoutInMinutes: 20
- script: |
set -e
./scripts/test-remote-integration.sh
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_APP_NAME="$APP_NAME" \
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
./scripts/test-web-integration.sh --browser chromium
displayName: Run integration tests (Browser, Chromium)
timeoutInMinutes: 20
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_APP_NAME="$APP_NAME" \
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
./scripts/test-remote-integration.sh
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
- ${{ if eq(parameters.VSCODE_RUN_SMOKE_TESTS, true) }}:
- script: |
set -e
ps -ef
cat /proc/sys/fs/inotify/max_user_watches
lsof | wc -l
displayName: Diagnostics before smoke test run (processes, max_user_watches, number of opened file handles)
continueOnError: true
condition: succeededOrFailed()
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
yarn --cwd test/smoke compile
displayName: Compile smoke tests
- script: |
set -e
yarn smoketest-no-compile --tracing
timeoutInMinutes: 20
displayName: Run smoke tests (Electron)
- script: |
set -e
yarn smoketest-no-compile --web --tracing --headless --electronArgs="--disable-dev-shm-usage"
timeoutInMinutes: 20
displayName: Run smoke tests (Browser, Chromium)
- script: |
set -e
yarn gulp compile-extension:vscode-test-resolver
yarn smoketest-no-compile --remote --tracing
timeoutInMinutes: 20
displayName: Run smoke tests (Remote)
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
APP_PATH=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
yarn smoketest-no-compile --tracing --build "$APP_PATH"
timeoutInMinutes: 20
displayName: Run smoke tests (Electron)
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
yarn smoketest-no-compile --web --tracing --headless --electronArgs="--disable-dev-shm-usage"
timeoutInMinutes: 20
displayName: Run smoke tests (Browser, Chromium)
- script: |
set -e
yarn gulp compile-extension:vscode-test-resolver
APP_PATH=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
yarn smoketest-no-compile --tracing --remote --build "$APP_PATH"
timeoutInMinutes: 20
displayName: Run smoke tests (Remote)
- script: |
set -e
ps -ef
cat /proc/sys/fs/inotify/max_user_watches
lsof | wc -l
displayName: Diagnostics after smoke test run (processes, max_user_watches, number of opened file handles)
continueOnError: true
condition: succeededOrFailed()
- ${{ if or(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
- task: PublishPipelineArtifact@0
inputs:
targetPath: .build/crashes
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: crash-dump-linux-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: crash-dump-linux-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: crash-dump-linux-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
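# The ${{ if }}/${{ elseif }}/${{ else }} expressions are resolved at template expansion time, so each
# job variant publishes its crash dumps under a name that records which suite ran and the job attempt.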
# In order to properly symbolify above crash reports
# (if any), we need the compiled native modules too
- task: PublishPipelineArtifact@0
inputs:
targetPath: node_modules
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: node-modules-linux-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: node-modules-linux-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: node-modules-linux-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Node Modules"
continueOnError: true
condition: failed()
- task: PublishPipelineArtifact@0
inputs:
targetPath: .build/logs
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: logs-linux-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: logs-linux-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: logs-linux-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Log Files"
continueOnError: true
condition: succeededOrFailed()
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: succeededOrFailed()


@@ -1,79 +1,113 @@
parameters:
- name: VSCODE_PUBLISH
type: boolean
- name: VSCODE_QUALITY
type: string
- name: VSCODE_RUN_UNIT_TESTS
type: boolean
- name: VSCODE_RUN_INTEGRATION_TESTS
type: boolean
- name: VSCODE_RUN_SMOKE_TESTS
type: boolean
steps:
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- checkout: self
fetchDepth: 1
retryCountOnTaskFailure: 3
- task: NodeTool@0
inputs:
versionSpec: "16.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password,ESRP-PKI,esrp-aad-username,esrp-aad-password"
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password,ESRP-PKI,esrp-aad-username,esrp-aad-password"
- task: DownloadPipelineArtifact@2
inputs:
artifact: Compilation
path: $(Build.ArtifactStagingDirectory)
displayName: Download compilation output
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: DownloadPipelineArtifact@2
inputs:
artifact: Compilation
path: $(Build.ArtifactStagingDirectory)
displayName: Download compilation output
- task: DownloadPipelineArtifact@2
inputs:
artifact: reh_node_modules-$(VSCODE_ARCH)
path: $(Build.ArtifactStagingDirectory)
displayName: Download server build dependencies
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'armhf'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: DownloadPipelineArtifact@2
inputs:
artifact: reh_node_modules-$(VSCODE_ARCH)
path: $(Build.ArtifactStagingDirectory)
displayName: Download server build dependencies
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'armhf'))
- script: |
set -e
# Start X server
/etc/init.d/xvfb start
# Start dbus session
DBUS_LAUNCH_RESULT=$(sudo dbus-daemon --config-file=/usr/share/dbus-1/system.conf --print-address)
echo "##vso[task.setvariable variable=DBUS_SESSION_BUS_ADDRESS]$DBUS_LAUNCH_RESULT"
displayName: Setup system services
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
# Start X server
/etc/init.d/xvfb start
# Start dbus session
DBUS_LAUNCH_RESULT=$(sudo dbus-daemon --config-file=/usr/share/dbus-1/system.conf --print-address)
echo "##vso[task.setvariable variable=DBUS_SESSION_BUS_ADDRESS]$DBUS_LAUNCH_RESULT"
displayName: Setup system services
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
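# Xvfb supplies the X display the desktop tests run against (DISPLAY :10), and exporting the
# dbus-daemon address lets processes started by the tests find a session bus.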
- script: |
set -e
tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
displayName: Extract compilation output
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
displayName: Extract compilation output
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
git checkout FETCH_HEAD
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
git checkout FETCH_HEAD
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
mkdir -p .build
node build/azure-pipelines/common/computeNodeModulesCacheKey.js $VSCODE_ARCH $ENABLE_TERRAPIN > .build/yarnlockhash
displayName: Prepare yarn cache flags
- task: Cache@2
inputs:
key: "nodeModules | $(Agent.OS) | .build/yarnlockhash"
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
displayName: Restore node_modules cache
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- task: Cache@2
inputs:
key: "genericNodeModules | $(Agent.OS) | .build/yarnlockhash"
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
displayName: Restore node_modules cache
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: Cache@2
inputs:
key: "nodeModules | $(Agent.OS) | .build/yarnlockhash"
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
displayName: Restore node_modules cache
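# The cache key comes from computeNodeModulesCacheKey.js (presumably a hash of the yarn lockfiles
# plus $VSCODE_ARCH and $ENABLE_TERRAPIN); on a hit, NODE_MODULES_RESTORED lets the install and
# archive steps below be skipped.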
- script: |
set -e
@@ -91,6 +125,7 @@ steps:
- script: |
set -e
node build/npm/setupBuildYarnrc
for i in {1..3}; do # try 3 times, for Terrapin
yarn --cwd build --frozen-lockfile --check-files && break
if [ $i -eq 3 ]; then
@@ -103,7 +138,16 @@ steps:
- script: |
set -e
export npm_config_arch=$(NPM_ARCH)
if [ "$NPM_ARCH" = "armv7l" ]; then
# node-gyp has no target_arch="armv7l"; which ARM variant gets
# compiled is decided by the CC macros instead.
# Mapping value is based on
# https://github.com/nodejs/node/blob/0903515e126c2697042d6546c6aa4b72e1a4b33e/configure.py#L49-L50
export npm_config_arch="arm"
else
export npm_config_arch=$(NPM_ARCH)
fi
if [ -z "$CC" ] || [ -z "$CXX" ]; then
# Download clang based on chromium revision used by vscode
@@ -143,12 +187,13 @@ steps:
displayName: Install dependencies
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
rm -rf remote/node_modules
tar -xzf $(Build.ArtifactStagingDirectory)/reh_node_modules-$(VSCODE_ARCH).tar.gz --directory $(Build.SourcesDirectory)/remote
displayName: Extract server node_modules output
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'armhf'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
rm -rf remote/node_modules
tar -xzf $(Build.ArtifactStagingDirectory)/reh_node_modules-$(VSCODE_ARCH).tar.gz --directory $(Build.SourcesDirectory)/remote
displayName: Extract server node_modules output
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'armhf'))
- script: |
set -e
@@ -158,267 +203,136 @@ steps:
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-linux-$(VSCODE_ARCH)-min-ci
displayName: Build
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-linux-$(VSCODE_ARCH)-min-ci
displayName: Build
- script: |
set -e
node build/azure-pipelines/mixin --server
displayName: Mix in server quality
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
node build/azure-pipelines/mixin --server
displayName: Mix in server quality
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-linux-$(VSCODE_ARCH)-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-linux-$(VSCODE_ARCH)-min-ci
displayName: Build Server
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-linux-$(VSCODE_ARCH)-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-linux-$(VSCODE_ARCH)-min-ci
displayName: Build Server
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn npm-run-all -lp "electron $(VSCODE_ARCH)" "playwright-install"
displayName: Download Electron and Playwright
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp "transpile-client" "transpile-extensions"
displayName: Transpile
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
ELECTRON_ROOT=.build/electron
sudo chown root $APP_ROOT/chrome-sandbox
sudo chown root $ELECTRON_ROOT/chrome-sandbox
sudo chmod 4755 $APP_ROOT/chrome-sandbox
sudo chmod 4755 $ELECTRON_ROOT/chrome-sandbox
stat $APP_ROOT/chrome-sandbox
stat $ELECTRON_ROOT/chrome-sandbox
displayName: Change setuid helper binary permission
- ${{ if or(eq(parameters.VSCODE_RUN_UNIT_TESTS, true), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
- template: product-build-linux-client-test.yml
parameters:
VSCODE_QUALITY: ${{ parameters.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: ${{ parameters.VSCODE_RUN_UNIT_TESTS }}
VSCODE_RUN_INTEGRATION_TESTS: ${{ parameters.VSCODE_RUN_INTEGRATION_TESTS }}
VSCODE_RUN_SMOKE_TESTS: ${{ parameters.VSCODE_RUN_SMOKE_TESTS }}
- script: |
set -e
./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- script: |
set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-deb"
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-rpm"
displayName: Build deb, rpm packages
- script: |
set -e
yarn test-node --build
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- script: |
set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-prepare-snap"
displayName: Prepare snap package
- script: |
set -e
DEBUG=*browser* yarn test-browser-no-install --build --browser chromium --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser, Chromium)
timeoutInMinutes: 15
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- task: UseDotNet@2
inputs:
version: 2.x
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_APP_NAME="$APP_NAME" \
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- task: EsrpClientTool@1
displayName: Download ESRPClient
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
./scripts/test-web-integration.sh --browser chromium
displayName: Run integration tests (Browser, Chromium)
timeoutInMinutes: 20
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- script: |
set -e
node build/azure-pipelines/common/sign "$(esrpclient.toolpath)/$(esrpclient.toolname)" rpm $(ESRP-PKI) $(esrp-aad-username) $(esrp-aad-password) .build/linux/rpm '*.rpm'
displayName: Codesign rpm
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_APP_NAME="$APP_NAME" \
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
./scripts/test-remote-integration.sh
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- script: |
set -e
VSCODE_ARCH="$(VSCODE_ARCH)" \
./build/azure-pipelines/linux/prepare-publish.sh
displayName: Prepare for Publish
- script: |
set -e
ps -ef
cat /proc/sys/fs/inotify/max_user_watches
lsof | wc -l
displayName: Diagnostics before smoke test run (processes, max_user_watches, number of opened file handles)
continueOnError: true
condition: and(succeededOrFailed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (client)
inputs:
BuildDropPath: $(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
PackageName: Visual Studio Code
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
yarn smoketest-no-compile --web --tracing --headless --electronArgs="--disable-dev-shm-usage"
timeoutInMinutes: 10
displayName: Run smoke tests (Browser, Chromium)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (client)
artifact: vscode_client_linux_$(VSCODE_ARCH)_sbom
- script: |
set -e
APP_PATH=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
yarn smoketest-no-compile --tracing --build "$APP_PATH"
timeoutInMinutes: 20
displayName: Run smoke tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (server)
inputs:
BuildDropPath: $(agent.builddirectory)/vscode-server-linux-$(VSCODE_ARCH)
PackageName: Visual Studio Code Server
- script: |
set -e
APP_PATH=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
yarn smoketest-no-compile --tracing --remote --build "$APP_PATH"
timeoutInMinutes: 20
displayName: Run smoke tests (Remote)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(agent.builddirectory)/vscode-server-linux-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (server)
artifact: vscode_server_linux_$(VSCODE_ARCH)_sbom
- script: |
set -e
ps -ef
cat /proc/sys/fs/inotify/max_user_watches
lsof | wc -l
displayName: Diagnostics after smoke test run (processes, max_user_watches, number of opened file handles)
continueOnError: true
condition: and(succeededOrFailed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(DEB_PATH)
artifact: vscode_client_linux_$(VSCODE_ARCH)_deb-package
displayName: Publish deb package
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-linux-$(VSCODE_ARCH)
targetPath: .build/crashes
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(RPM_PATH)
artifact: vscode_client_linux_$(VSCODE_ARCH)_rpm-package
displayName: Publish rpm package
# In order to properly symbolify above crash reports
# (if any), we need the compiled native modules too
- task: PublishPipelineArtifact@0
inputs:
artifactName: node-modules-linux-$(VSCODE_ARCH)
targetPath: node_modules
displayName: "Publish Node Modules"
continueOnError: true
condition: failed()
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(TARBALL_PATH)
artifact: vscode_client_linux_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish client archive
- task: PublishPipelineArtifact@0
inputs:
artifactName: logs-linux-$(VSCODE_ARCH)-$(System.JobAttempt)
targetPath: .build/logs
displayName: "Publish Log Files"
continueOnError: true
condition: and(failed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH).tar.gz
artifact: vscode_server_linux_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish server archive
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: and(succeededOrFailed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH)-web.tar.gz
artifact: vscode_web_linux_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish web server archive
- script: |
set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-deb"
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-rpm"
displayName: Build deb, rpm packages
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- script: |
set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-prepare-snap"
displayName: Prepare snap package
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: UseDotNet@2
inputs:
version: 2.x
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: EsrpClientTool@1
displayName: Download ESRPClient
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- script: |
set -e
node build/azure-pipelines/common/sign "$(esrpclient.toolpath)/$(esrpclient.toolname)" rpm $(ESRP-PKI) $(esrp-aad-username) $(esrp-aad-password) .build/linux/rpm '*.rpm'
displayName: Codesign rpm
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- script: |
set -e
VSCODE_ARCH="$(VSCODE_ARCH)" \
./build/azure-pipelines/linux/prepare-publish.sh
displayName: Prepare for Publish
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(DEB_PATH)
artifact: vscode_client_linux_$(VSCODE_ARCH)_deb-package
displayName: Publish deb package
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(RPM_PATH)
artifact: vscode_client_linux_$(VSCODE_ARCH)_rpm-package
displayName: Publish rpm package
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(TARBALL_PATH)
artifact: vscode_client_linux_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish client archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH).tar.gz
artifact: vscode_server_linux_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish server archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH)-web.tar.gz
artifact: vscode_web_linux_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish web server archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: PublishPipelineArtifact@0
displayName: "Publish Pipeline Artifact"
inputs:
artifactName: "snap-$(VSCODE_ARCH)"
targetPath: .build/linux/snap-tarball
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (client)
inputs:
BuildDropPath: $(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
PackageName: Visual Studio Code
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (client)
artifact: vscode_client_linux_$(VSCODE_ARCH)_sbom
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (server)
inputs:
BuildDropPath: $(agent.builddirectory)/vscode-server-linux-$(VSCODE_ARCH)
PackageName: Visual Studio Code Server
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(agent.builddirectory)/vscode-server-linux-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (server)
artifact: vscode_server_linux_$(VSCODE_ARCH)_sbom
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- task: PublishPipelineArtifact@0
displayName: "Publish Pipeline Artifact"
inputs:
artifactName: "snap-$(VSCODE_ARCH)"
targetPath: .build/linux/snap-tarball


@@ -1,49 +1,58 @@
parameters:
- name: VSCODE_QUALITY
type: string
steps:
- task: NodeTool@0
inputs:
versionSpec: "16.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password,ESRP-PKI,esrp-aad-username,esrp-aad-password"
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password,ESRP-PKI,esrp-aad-username,esrp-aad-password"
- task: Docker@1
displayName: "Pull Docker image"
inputs:
azureSubscriptionEndpoint: "vscode-builds-subscription"
azureContainerRegistry: vscodehub.azurecr.io
command: "Run an image"
imageName: "vscode-linux-build-agent:centos7-devtoolset8-arm64"
containerCommand: uname
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: Docker@1
displayName: "Pull Docker image"
inputs:
azureSubscriptionEndpoint: "vscode-builds-subscription"
azureContainerRegistry: vscodehub.azurecr.io
command: "Run an image"
imageName: "vscode-linux-build-agent:centos7-devtoolset8-arm64"
containerCommand: uname
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
git checkout FETCH_HEAD
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
git checkout FETCH_HEAD
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
set -e
@@ -61,17 +70,19 @@ steps:
GITHUB_TOKEN: "$(github-distro-mixin-password)"
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
- script: docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
displayName: Register Docker QEMU
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
displayName: Register Docker QEMU
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
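# Registering qemu-user-static (binfmt_misc) lets this x64 agent execute the arm64 CentOS image
# used in the next step to install the remote/ dependencies for the arm64 server build.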
- script: |
set -e
docker run -e VSCODE_QUALITY -e GITHUB_TOKEN -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:centos7-devtoolset8-arm64 /root/vscode/build/azure-pipelines/linux/scripts/install-remote-dependencies.sh
displayName: Install dependencies via qemu
env:
GITHUB_TOKEN: "$(github-distro-mixin-password)"
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
docker run -e VSCODE_QUALITY -e GITHUB_TOKEN -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:centos7-devtoolset8-arm64 /root/vscode/build/azure-pipelines/linux/scripts/install-remote-dependencies.sh
displayName: Install dependencies via qemu
env:
GITHUB_TOKEN: "$(github-distro-mixin-password)"
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
- script: |
set -e


@@ -136,48 +136,49 @@ steps:
displayName: Run core integration tests
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'), ne(variables['EXTENSIONS_ONLY'], 'true'))
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the unit tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
NO_CLEANUP=1 \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-linux-x64" \
DISPLAY=:10 ./scripts/test-extensions-unit.sh --build --tfs "Extension Unit Tests"
displayName: Run Extension Unit Tests (Continue on Error)
continueOnError: true
condition: and(succeeded(), and(eq(variables['RUN_TESTS'], 'true'), eq(variables['EXTENSION_UNIT_TESTS_FAIL_ON_ERROR'], 'false')))
# {{SQL CARBON TODO}} Reenable "Run Extension Unit Tests (Continue on Error)" and "Run Extension Unit Tests (Fail on Error)" and "Archive Logs"
# - script: |
# # Figure out the full absolute path of the product we just built
# # including the remote server and configure the unit tests
# # to run with these builds instead of running out of sources.
# set -e
# APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
# APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
# INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
# NO_CLEANUP=1 \
# VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-linux-x64" \
# DISPLAY=:10 ./scripts/test-extensions-unit.sh --build --tfs "Extension Unit Tests"
# displayName: Run Extension Unit Tests (Continue on Error)
# continueOnError: true
# condition: and(succeeded(), and(eq(variables['RUN_TESTS'], 'true'), eq(variables['EXTENSION_UNIT_TESTS_FAIL_ON_ERROR'], 'false')))
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the unit tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
NO_CLEANUP=1 \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-linux-x64" \
DISPLAY=:10 ./scripts/test-extensions-unit.sh --build --tfs "Extension Unit Tests"
displayName: Run Extension Unit Tests (Fail on Error)
condition: and(succeeded(), and(eq(variables['RUN_TESTS'], 'true'), ne(variables['EXTENSION_UNIT_TESTS_FAIL_ON_ERROR'], 'false')))
# - script: |
# # Figure out the full absolute path of the product we just built
# # including the remote server and configure the unit tests
# # to run with these builds instead of running out of sources.
# set -e
# APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
# APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
# INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
# NO_CLEANUP=1 \
# VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-linux-x64" \
# DISPLAY=:10 ./scripts/test-extensions-unit.sh --build --tfs "Extension Unit Tests"
# displayName: Run Extension Unit Tests (Fail on Error)
# condition: and(succeeded(), and(eq(variables['RUN_TESTS'], 'true'), ne(variables['EXTENSION_UNIT_TESTS_FAIL_ON_ERROR'], 'false')))
- bash: |
set -e
mkdir -p $(Build.ArtifactStagingDirectory)/logs/linux-x64
cd /tmp
for folder in adsuser*/
do
folder=${folder%/}
# Only archive directories we want for debugging purposes
tar -czvf $(Build.ArtifactStagingDirectory)/logs/linux-x64/$folder.tar.gz $folder/User $folder/logs
done
displayName: Archive Logs
continueOnError: true
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
# - bash: |
# set -e
# mkdir -p $(Build.ArtifactStagingDirectory)/logs/linux-x64
# cd /tmp
# for folder in adsuser*/
# do
# folder=${folder%/}
# # Only archive directories we want for debugging purposes
# tar -czvf $(Build.ArtifactStagingDirectory)/logs/linux-x64/$folder.tar.gz $folder/User $folder/logs
# done
# displayName: Archive Logs
# continueOnError: true
# condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e


@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const json = require("gulp-json-editor");
const buffer = require('gulp-buffer');
@@ -43,7 +43,7 @@ async function mixinClient(quality) {
else {
fancyLog(ansiColors.blue('[mixin]'), 'Inheriting OSS built-in extensions', builtInExtensions.map(e => e.name));
}
return Object.assign(Object.assign({ webBuiltInExtensions: originalProduct.webBuiltInExtensions }, o), { builtInExtensions });
return { webBuiltInExtensions: originalProduct.webBuiltInExtensions, ...o, builtInExtensions };
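// Equivalent to the nested Object.assign form it replaces: later spreads win, so fields from `o`
// override the seeded webBuiltInExtensions and the explicit builtInExtensions overrides both.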
}))
.pipe(productJsonFilter.restore)
.pipe(es.mapSync((f) => {
@@ -64,7 +64,7 @@ function mixinServer(quality) {
fancyLog(ansiColors.blue('[mixin]'), `Mixing in server:`);
const originalProduct = JSON.parse(fs.readFileSync(path.join(__dirname, '..', '..', 'product.json'), 'utf8'));
const serverProductJson = JSON.parse(fs.readFileSync(serverProductJsonPath, 'utf8'));
fs.writeFileSync('product.json', JSON.stringify(Object.assign(Object.assign({}, originalProduct), serverProductJson), undefined, '\t'));
fs.writeFileSync('product.json', JSON.stringify({ ...originalProduct, ...serverProductJson }, undefined, '\t'));
fancyLog(ansiColors.blue('[mixin]'), 'product.json', ansiColors.green('✔︎'));
}
function main() {


@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as json from 'gulp-json-editor';
const buffer = require('gulp-buffer');
import * as filter from 'gulp-filter';


@@ -0,0 +1,59 @@
steps:
- checkout: self
fetchDepth: 1
retryCountOnTaskFailure: 3
- task: NodeTool@0
inputs:
versionSpec: "16.x"
- script: |
mkdir -p .build
node build/azure-pipelines/common/computeNodeModulesCacheKey.js $VSCODE_ARCH $ENABLE_TERRAPIN > .build/yarnlockhash
displayName: Prepare yarn cache flags
- task: Cache@2
inputs:
key: "genericNodeModules | $(Agent.OS) | .build/yarnlockhash"
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
displayName: Restore node_modules cache
- script: |
set -e
tar -xzf .build/node_modules_cache/cache.tgz
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Extract node_modules cache
- script: |
set -e
npx https://aka.ms/enablesecurefeed standAlone
timeoutInMinutes: 5
retryCountOnTaskFailure: 3
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
displayName: Switch to Terrapin packages
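# Terrapin (aka.ms/enablesecurefeed) appears to route package restores through Microsoft's vetted
# internal feed; the 3-attempt yarn loop below guards against transient failures when it is enabled.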
- script: |
set -e
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile --check-files && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
env:
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
GITHUB_TOKEN: "$(github-distro-mixin-password)"
displayName: Install dependencies
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
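The template above keys the node_modules cache on the output of computeNodeModulesCacheKey.js, which is passed $VSCODE_ARCH and $ENABLE_TERRAPIN. A minimal TypeScript sketch of how such a key could be derived; the script's internals are not shown in this diff, so hashing yarn.lock here is an assumption based on the "yarnlockhash" file name:

```typescript
// Hypothetical sketch only: derive a cache key from the lockfile plus the
// inputs the pipeline passes on the command line (arch, Terrapin flag).
import { createHash } from 'crypto';
import { readFileSync } from 'fs';

function computeCacheKey(arch: string, enableTerrapin: string): string {
  const hash = createHash('sha256');
  hash.update(arch);
  hash.update(enableTerrapin);
  hash.update(readFileSync('yarn.lock')); // assumption: lockfile contents drive invalidation
  return hash.digest('hex');
}

process.stdout.write(computeCacheKey(process.argv[2] ?? '', process.argv[3] ?? ''));
```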

View File

@@ -0,0 +1,189 @@
trigger:
- main
- release/*
pr:
branches:
include: ["main", "release/*"]
variables:
- name: Codeql.SkipTaskAutoInjection
value: true
- name: skipComponentGovernanceDetection
value: true
- name: ENABLE_TERRAPIN
value: false
- name: VSCODE_CIBUILD
value: ${{ in(variables['Build.Reason'], 'IndividualCI', 'BatchedCI') }}
- name: VSCODE_PUBLISH
value: false
- name: VSCODE_QUALITY
value: oss
- name: VSCODE_STEP_ON_IT
value: false
jobs:
- ${{ if ne(variables['VSCODE_CIBUILD'], true) }}:
- job: Compile
displayName: Compile & Hygiene
pool: vscode-1es-vscode-linux-20.04
timeoutInMinutes: 30
variables:
VSCODE_ARCH: x64
steps:
- template: product-compile.yml
parameters:
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
- job: Linuxx64UnitTest
displayName: Linux (Unit Tests)
pool: vscode-1es-vscode-linux-20.04
timeoutInMinutes: 30
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: true
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- job: Linuxx64IntegrationTest
displayName: Linux (Integration Tests)
pool: vscode-1es-vscode-linux-20.04
timeoutInMinutes: 30
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: true
VSCODE_RUN_SMOKE_TESTS: false
- job: Linuxx64SmokeTest
displayName: Linux (Smoke Tests)
pool: vscode-1es-vscode-linux-20.04
timeoutInMinutes: 30
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: true
- ${{ if eq(variables['VSCODE_CIBUILD'], true) }}:
- job: Linuxx64MaintainNodeModulesCache
displayName: Linux (Maintain node_modules cache)
pool: vscode-1es-vscode-linux-20.04
timeoutInMinutes: 30
variables:
VSCODE_ARCH: x64
steps:
- template: product-build-pr-cache.yml
# - job: macOSUnitTest
# displayName: macOS (Unit Tests)
# pool:
# vmImage: macOS-latest
# timeoutInMinutes: 60
# variables:
# BUILDSECMON_OPT_IN: true
# VSCODE_ARCH: x64
# steps:
# - template: darwin/product-build-darwin.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: true
# VSCODE_RUN_INTEGRATION_TESTS: false
# VSCODE_RUN_SMOKE_TESTS: false
# - job: macOSIntegrationTest
# displayName: macOS (Integration Tests)
# pool:
# vmImage: macOS-latest
# timeoutInMinutes: 60
# variables:
# BUILDSECMON_OPT_IN: true
# VSCODE_ARCH: x64
# steps:
# - template: darwin/product-build-darwin.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: false
# VSCODE_RUN_INTEGRATION_TESTS: true
# VSCODE_RUN_SMOKE_TESTS: false
# - job: macOSSmokeTest
# displayName: macOS (Smoke Tests)
# pool:
# vmImage: macOS-latest
# timeoutInMinutes: 60
# variables:
# BUILDSECMON_OPT_IN: true
# VSCODE_ARCH: x64
# steps:
# - template: darwin/product-build-darwin.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: false
# VSCODE_RUN_INTEGRATION_TESTS: false
# VSCODE_RUN_SMOKE_TESTS: true
# - job: WindowsUnitTests
# displayName: Windows (Unit Tests)
# pool: vscode-1es-vscode-windows-2019
# timeoutInMinutes: 60
# variables:
# VSCODE_ARCH: x64
# steps:
# - template: win32/product-build-win32.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: true
# VSCODE_RUN_INTEGRATION_TESTS: false
# VSCODE_RUN_SMOKE_TESTS: false
# - job: WindowsIntegrationTests
# displayName: Windows (Integration Tests)
# pool: vscode-1es-vscode-windows-2019
# timeoutInMinutes: 60
# variables:
# VSCODE_ARCH: x64
# steps:
# - template: win32/product-build-win32.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: false
# VSCODE_RUN_INTEGRATION_TESTS: true
# VSCODE_RUN_SMOKE_TESTS: false
# - job: WindowsSmokeTests
# displayName: Windows (Smoke Tests)
# pool: vscode-1es-vscode-windows-2019
# timeoutInMinutes: 60
# variables:
# VSCODE_ARCH: x64
# steps:
# - template: win32/product-build-win32.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: false
# VSCODE_RUN_INTEGRATION_TESTS: false
# VSCODE_RUN_SMOKE_TESTS: true
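In the new OSS CI pipeline above, VSCODE_CIBUILD is derived from Build.Reason and every job is wrapped in a template expression on that variable: the compile and Linux test jobs run for non-CI triggers such as pull requests, while the node_modules cache-maintenance job runs only for CI triggers (the macOS and Windows jobs are commented out). An illustrative TypeScript model of that gating; Azure Pipelines itself resolves the ${{ }} expressions at template-expansion time:

```typescript
// Illustrative model of the job gating in the pipeline above; not executed by
// Azure Pipelines, which expands the ${{ if ... }} blocks before queuing.
type BuildReason = 'IndividualCI' | 'BatchedCI' | 'PullRequest' | 'Manual';

function plannedJobs(reason: BuildReason): string[] {
  const isCiBuild = reason === 'IndividualCI' || reason === 'BatchedCI';
  return isCiBuild
    ? ['Linuxx64MaintainNodeModulesCache']
    : ['Compile', 'Linuxx64UnitTest', 'Linuxx64IntegrationTest', 'Linuxx64SmokeTest'];
}

console.log(plannedJobs('PullRequest')); // compile plus unit/integration/smoke test jobs
```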

View File

@@ -8,6 +8,10 @@ schedules:
- main
- joao/web
trigger:
branches:
include: ["main", "release/*"]
parameters:
- name: VSCODE_DISTRO_REF
displayName: Distro Ref (Private build)
@@ -104,9 +108,11 @@ variables:
- name: VSCODE_BUILD_STAGE_WINDOWS
value: ${{ or(eq(parameters.VSCODE_BUILD_WIN32, true), eq(parameters.VSCODE_BUILD_WIN32_32BIT, true), eq(parameters.VSCODE_BUILD_WIN32_ARM64, true)) }}
- name: VSCODE_BUILD_STAGE_LINUX
value: ${{ or(eq(parameters.VSCODE_BUILD_LINUX, true), eq(parameters.VSCODE_BUILD_LINUX_ARMHF, true), eq(parameters.VSCODE_BUILD_LINUX_ARM64, true), eq(parameters.VSCODE_BUILD_LINUX_ALPINE, true), eq(parameters.VSCODE_BUILD_LINUX_ALPINE_ARM64, true), eq(parameters.VSCODE_BUILD_WEB, true)) }}
value: ${{ or(eq(parameters.VSCODE_BUILD_LINUX, true), eq(parameters.VSCODE_BUILD_LINUX_ARMHF, true), eq(parameters.VSCODE_BUILD_LINUX_ARM64, true), eq(parameters.VSCODE_BUILD_LINUX_ALPINE, true), eq(parameters.VSCODE_BUILD_LINUX_ALPINE_ARM64, true)) }}
- name: VSCODE_BUILD_STAGE_MACOS
value: ${{ or(eq(parameters.VSCODE_BUILD_MACOS, true), eq(parameters.VSCODE_BUILD_MACOS_ARM64, true)) }}
- name: VSCODE_BUILD_STAGE_WEB
value: ${{ eq(parameters.VSCODE_BUILD_WEB, true) }}
- name: VSCODE_CIBUILD
value: ${{ in(variables['Build.Reason'], 'IndividualCI', 'BatchedCI') }}
- name: VSCODE_PUBLISH
@@ -162,6 +168,8 @@ stages:
VSCODE_ARCH: x64
steps:
- template: product-compile.yml
parameters:
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_WINDOWS'], true)) }}:
- stage: Windows
@@ -169,13 +177,60 @@ stages:
- Compile
pool: vscode-1es-windows
jobs:
- ${{ if eq(parameters.VSCODE_BUILD_WIN32, true) }}:
- ${{ if eq(variables['VSCODE_CIBUILD'], true) }}:
- job: WindowsUnitTests
displayName: Unit Tests
timeoutInMinutes: 60
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: true
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- job: WindowsIntegrationTests
displayName: Integration Tests
timeoutInMinutes: 60
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: true
VSCODE_RUN_SMOKE_TESTS: false
- job: WindowsSmokeTests
displayName: Smoke Tests
timeoutInMinutes: 60
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: true
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WIN32, true)) }}:
- job: Windows
timeoutInMinutes: 120
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_INTEGRATION_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_SMOKE_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WIN32_32BIT, true)) }}:
- job: Windows32
@@ -184,6 +239,12 @@ stages:
VSCODE_ARCH: ia32
steps:
- template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_INTEGRATION_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_SMOKE_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WIN32_ARM64, true)) }}:
- job: WindowsARM64
@@ -192,6 +253,12 @@ stages:
VSCODE_ARCH: arm64
steps:
- template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_LINUX'], true)) }}:
- stage: LinuxServerDependencies
@@ -206,13 +273,17 @@ stages:
NPM_ARCH: x64
steps:
- template: linux/product-build-linux-server.yml
parameters:
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
- ${{ if eq(parameters.VSCODE_BUILD_LINUX, true) }}:
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX_ARM64, true)) }}:
- job: arm64
variables:
VSCODE_ARCH: arm64
steps:
- template: linux/product-build-linux-server.yml
parameters:
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_LINUX'], true)) }}:
- stage: Linux
@@ -221,7 +292,54 @@ stages:
- LinuxServerDependencies
pool: vscode-1es-linux
jobs:
- ${{ if eq(parameters.VSCODE_BUILD_LINUX, true) }}:
- ${{ if eq(variables['VSCODE_CIBUILD'], true) }}:
- job: Linuxx64UnitTest
displayName: Unit Tests
container: vscode-bionic-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: true
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- job: Linuxx64IntegrationTest
displayName: Integration Tests
container: vscode-bionic-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: true
VSCODE_RUN_SMOKE_TESTS: false
- job: Linuxx64SmokeTest
displayName: Smoke Tests
container: vscode-bionic-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: true
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX, true)) }}:
- job: Linuxx64
container: vscode-bionic-x64
variables:
@@ -230,6 +348,12 @@ stages:
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_INTEGRATION_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_SMOKE_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX, true), ne(variables['VSCODE_PUBLISH'], 'false')) }}:
- job: LinuxSnap
@@ -249,6 +373,12 @@ stages:
NPM_ARCH: armv7l
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
# TODO@joaomoreno: We don't ship ARM snaps for now
- ${{ if and(false, eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX_ARMHF, true)) }}:
@@ -269,6 +399,12 @@ stages:
NPM_ARCH: arm64
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
# TODO@joaomoreno: We don't ship ARM snaps for now
- ${{ if and(false, eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX_ARM64, true)) }}:
@@ -296,13 +432,6 @@ stages:
steps:
- template: linux/product-build-alpine.yml
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WEB, true)) }}:
- job: LinuxWeb
variables:
VSCODE_ARCH: x64
steps:
- template: web/product-build-web.yml
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_MACOS'], true)) }}:
- stage: macOS
dependsOn:
@@ -312,20 +441,76 @@ stages:
variables:
BUILDSECMON_OPT_IN: true
jobs:
- ${{ if eq(parameters.VSCODE_BUILD_MACOS, true) }}:
- job: macOSTest
- ${{ if eq(variables['VSCODE_CIBUILD'], true) }}:
- job: macOSUnitTest
displayName: Unit Tests
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin-test.yml
- ${{ if eq(variables['VSCODE_CIBUILD'], false) }}:
- job: macOS
- template: darwin/product-build-darwin.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: true
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- job: macOSIntegrationTest
displayName: Integration Tests
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: true
VSCODE_RUN_SMOKE_TESTS: false
- job: macOSSmokeTest
displayName: Smoke Tests
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: true
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_MACOS, true)) }}:
- job: macOS
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- ${{ if eq(parameters.VSCODE_STEP_ON_IT, false) }}:
- job: macOSTest
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_INTEGRATION_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_SMOKE_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
- ${{ if eq(variables['VSCODE_PUBLISH'], true) }}:
- job: macOSSign
dependsOn:
- macOS
@@ -342,7 +527,14 @@ stages:
VSCODE_ARCH: arm64
steps:
- template: darwin/product-build-darwin.yml
- ${{ if eq(variables['VSCODE_CIBUILD'], false) }}:
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- ${{ if eq(variables['VSCODE_PUBLISH'], true) }}:
- job: macOSARM64Sign
dependsOn:
- macOSARM64
@@ -362,7 +554,8 @@ stages:
VSCODE_ARCH: universal
steps:
- template: darwin/product-build-darwin-universal.yml
- ${{ if eq(variables['VSCODE_CIBUILD'], false) }}:
- ${{ if eq(variables['VSCODE_PUBLISH'], true) }}:
- job: macOSUniversalSign
dependsOn:
- macOSUniversal
@@ -372,6 +565,19 @@ stages:
steps:
- template: darwin/product-build-darwin-sign.yml
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_WEB'], true)) }}:
- stage: Web
dependsOn:
- Compile
pool: vscode-1es-linux
jobs:
- ${{ if eq(parameters.VSCODE_BUILD_WEB, true) }}:
- job: Web
variables:
VSCODE_ARCH: x64
steps:
- template: web/product-build-web.yml
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), ne(variables['VSCODE_PUBLISH'], 'false')) }}:
- stage: Publish
dependsOn:

View File

@@ -1,39 +1,47 @@
parameters:
- name: VSCODE_QUALITY
type: string
steps:
- task: NodeTool@0
inputs:
versionSpec: "16.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password"
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password"
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
git checkout FETCH_HEAD
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
git checkout FETCH_HEAD
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
mkdir -p .build
@@ -94,11 +102,12 @@ steps:
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
# Mixin must run before optimize, because the CSS loader will inline small SVGs
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
# Mixin must run before optimize, because the CSS loader will inline small SVGs
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
@@ -107,59 +116,65 @@ steps:
GITHUB_TOKEN: "$(github-distro-mixin-password)"
displayName: Compile & Hygiene
- script: |
set -e
yarn --cwd test/smoke compile
yarn --cwd test/integration/browser compile
displayName: Compile test suites
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
yarn --cwd test/smoke compile
yarn --cwd test/integration/browser compile
displayName: Compile test suites
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: AzureCLI@2
inputs:
azureSubscription: "vscode-builds-subscription"
scriptType: pscore
scriptLocation: inlineScript
addSpnToEnvironment: true
inlineScript: |
Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]$env:tenantId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID]$env:servicePrincipalId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET;issecret=true]$env:servicePrincipalKey"
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: AzureCLI@2
inputs:
azureSubscription: "vscode-builds-subscription"
scriptType: pscore
scriptLocation: inlineScript
addSpnToEnvironment: true
inlineScript: |
Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]$env:tenantId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID]$env:servicePrincipalId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET;issecret=true]$env:servicePrincipalKey"
- script: |
set -e
AZURE_STORAGE_ACCOUNT="ticino" \
AZURE_TENANT_ID="$(AZURE_TENANT_ID)" \
AZURE_CLIENT_ID="$(AZURE_CLIENT_ID)" \
AZURE_CLIENT_SECRET="$(AZURE_CLIENT_SECRET)" \
node build/azure-pipelines/upload-sourcemaps
displayName: Upload sourcemaps
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
AZURE_STORAGE_ACCOUNT="ticino" \
AZURE_TENANT_ID="$(AZURE_TENANT_ID)" \
AZURE_CLIENT_ID="$(AZURE_CLIENT_ID)" \
AZURE_CLIENT_SECRET="$(AZURE_CLIENT_SECRET)" \
node build/azure-pipelines/upload-sourcemaps
displayName: Upload sourcemaps
- script: |
set -e
./build/azure-pipelines/common/extract-telemetry.sh
displayName: Extract Telemetry
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
./build/azure-pipelines/common/extract-telemetry.sh
displayName: Extract Telemetry
- script: |
set -e
tar -cz --ignore-failed-read -f $(Build.ArtifactStagingDirectory)/compilation.tar.gz .build out-* test/integration/browser/out test/smoke/out test/automation/out
displayName: Compress compilation artifact
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
tar -cz --ignore-failed-read -f $(Build.ArtifactStagingDirectory)/compilation.tar.gz .build out-* test/integration/browser/out test/smoke/out test/automation/out
displayName: Compress compilation artifact
- task: PublishPipelineArtifact@1
inputs:
targetPath: $(Build.ArtifactStagingDirectory)/compilation.tar.gz
artifactName: Compilation
displayName: Publish compilation artifact
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: PublishPipelineArtifact@1
inputs:
targetPath: $(Build.ArtifactStagingDirectory)/compilation.tar.gz
artifactName: Compilation
displayName: Publish compilation artifact
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn download-builtin-extensions-cg
displayName: Built-in extensions component details
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn download-builtin-extensions-cg
displayName: Built-in extensions component details
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
inputs:
sourceScanPath: $(Build.SourcesDirectory)
continueOnError: true
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
inputs:
sourceScanPath: $(Build.SourcesDirectory)
continueOnError: true

View File

@@ -46,6 +46,7 @@ $stages = @(
if ($env:VSCODE_BUILD_STAGE_WINDOWS -eq 'True') { 'Windows' }
if ($env:VSCODE_BUILD_STAGE_LINUX -eq 'True') { 'Linux' }
if ($env:VSCODE_BUILD_STAGE_MACOS -eq 'True') { 'macOS' }
if ($env:VSCODE_BUILD_STAGE_WEB -eq 'True') { 'Web' }
)
do {

View File

@@ -109,6 +109,7 @@ steps:
if ($env:VSCODE_BUILD_STAGE_WINDOWS -eq 'True') { 'Windows' }
if ($env:VSCODE_BUILD_STAGE_LINUX -eq 'True') { 'Linux' }
if ($env:VSCODE_BUILD_STAGE_MACOS -eq 'True') { 'macOS' }
if ($env:VSCODE_BUILD_STAGE_WEB -eq 'True') { 'Web' }
)
Write-Host "Stages to check: $stages"

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const cp = require("child_process");
let tag = '';

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as cp from 'child_process';
let tag = '';

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const cp = require("child_process");

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as fs from 'fs';
import * as cp from 'child_process';
import * as path from 'path';

View File

@@ -103,7 +103,7 @@ stages:
steps:
- template: linux/sql-product-build-linux.yml
parameters:
extensionsToUnitTest: ["admin-tool-ext-win", "agent", "azcli", "azurecore", "cms", "dacpac", "data-workspace", "import", "machine-learning", "notebook", "resource-deployment", "schema-compare", "sql-bindings", "sql-database-projects"]
extensionsToUnitTest: ["admin-tool-ext-win", "agent", "azcli", "azurecore", "cms", "dacpac", "datavirtualization", "data-workspace", "import", "machine-learning", "notebook", "resource-deployment", "schema-compare", "sql-bindings", "sql-database-projects"]
timeoutInMinutes: 90
- stage: Windows

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const es = require("event-stream");
const Vinyl = require("vinyl");

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as es from 'event-stream';
import * as Vinyl from 'vinyl';
import * as vfs from 'vinyl-fs';

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
exports.getSettingsSearchBuildId = exports.shouldSetupSettingsSearch = void 0;
const path = require("path");
@@ -52,7 +52,7 @@ function generateVSCodeConfigurationTask() {
const timer = setTimeout(() => {
codeProc.kill();
reject(new Error('export-default-configuration process timed out'));
}, 12 * 1000);
}, 60 * 1000);
codeProc.on('error', err => {
clearTimeout(timer);
reject(err);

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as path from 'path';
import * as os from 'os';
import * as cp from 'child_process';
@@ -63,7 +61,7 @@ function generateVSCodeConfigurationTask(): Promise<string | undefined> {
const timer = setTimeout(() => {
codeProc.kill();
reject(new Error('export-default-configuration process timed out'));
}, 12 * 1000);
}, 60 * 1000);
codeProc.on('error', err => {
clearTimeout(timer);

View File

@@ -1,14 +1,16 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const es = require("event-stream");
const vfs = require("vinyl-fs");
const merge = require("gulp-merge-json");
const gzip = require("gulp-gzip");
const identity_1 = require("@azure/identity");
const path = require("path");
const fs_1 = require("fs");
const azure = require('gulp-azure-storage');
const commit = process.env['VSCODE_DISTRO_COMMIT'] || process.env['BUILD_SOURCEVERSION'];
const credential = new identity_1.ClientSecretCredential(process.env['AZURE_TENANT_ID'], process.env['AZURE_CLIENT_ID'], process.env['AZURE_CLIENT_SECRET']);
@@ -18,8 +20,8 @@ function main() {
.pipe(merge({
fileName: 'combined.nls.metadata.json',
jsonSpace: '',
concatArrays: true,
edit: (parsedJson, file) => {
let key;
if (file.base === 'out-vscode-web-min') {
return { vscode: parsedJson };
}
@@ -63,7 +65,11 @@ function main() {
break;
}
}
key = 'vscode.' + file.relative.split('/')[0];
// Get extension id and use that as the key
const folderPath = path.join(file.base, file.relative.split('/')[0]);
const manifest = (0, fs_1.readFileSync)(path.join(folderPath, 'package.json'), 'utf-8');
const manifestJson = JSON.parse(manifest);
const key = manifestJson.publisher + '.' + manifestJson.name;
return { [key]: parsedJson };
},
}))

View File

@@ -3,14 +3,14 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as es from 'event-stream';
import * as Vinyl from 'vinyl';
import * as vfs from 'vinyl-fs';
import * as merge from 'gulp-merge-json';
import * as gzip from 'gulp-gzip';
import { ClientSecretCredential } from '@azure/identity';
import path = require('path');
import { readFileSync } from 'fs';
const azure = require('gulp-azure-storage');
const commit = process.env['VSCODE_DISTRO_COMMIT'] || process.env['BUILD_SOURCEVERSION'];
@@ -33,8 +33,8 @@ function main(): Promise<void> {
.pipe(merge({
fileName: 'combined.nls.metadata.json',
jsonSpace: '',
concatArrays: true,
edit: (parsedJson, file) => {
let key;
if (file.base === 'out-vscode-web-min') {
return { vscode: parsedJson };
}
@@ -82,7 +82,12 @@ function main(): Promise<void> {
break;
}
}
key = 'vscode.' + file.relative.split('/')[0];
// Get extension id and use that as the key
const folderPath = path.join(file.base, file.relative.split('/')[0]);
const manifest = readFileSync(path.join(folderPath, 'package.json'), 'utf-8');
const manifestJson = JSON.parse(manifest);
const key = manifestJson.publisher + '.' + manifestJson.name;
return { [key]: parsedJson };
},
}))
@@ -113,4 +118,3 @@ main().catch(err => {
console.error(err);
process.exit(1);
});
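Both the compiled and the TypeScript versions of the NLS-metadata upload above change how an extension's translation bundle is keyed: instead of hard-coding a 'vscode.'-prefixed folder name, the extension's own package.json is read and the key becomes publisher.name. A condensed TypeScript sketch of just that lookup:

```typescript
// Condensed from the change above: derive the extension id used as the
// metadata key from the extension's manifest rather than its folder name.
import * as path from 'path';
import { readFileSync } from 'fs';

function extensionKey(base: string, relative: string): string {
  const folderPath = path.join(base, relative.split('/')[0]);
  const manifest = JSON.parse(readFileSync(path.join(folderPath, 'package.json'), 'utf-8'));
  return `${manifest.publisher}.${manifest.name}`; // e.g. a "publisher.name" id like "vscode.git"
}
```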

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const path = require("path");
const es = require("event-stream");

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as path from 'path';
import * as es from 'event-stream';
import * as Vinyl from 'vinyl';

View File

@@ -163,6 +163,7 @@ steps:
cd $ROOT && tar --owner=0 --group=0 -czf $WEB_TARBALL_PATH $WEB_BUILD_NAME
displayName: Prepare for publish
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-web.tar.gz
artifact: vscode_web_linux_standalone_archive-unsigned

View File

@@ -0,0 +1,247 @@
parameters:
- name: VSCODE_QUALITY
type: string
- name: VSCODE_RUN_UNIT_TESTS
type: boolean
- name: VSCODE_RUN_INTEGRATION_TESTS
type: boolean
- name: VSCODE_RUN_SMOKE_TESTS
type: boolean
steps:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn npm-run-all -lp "electron $(VSCODE_ARCH)" "playwright-install" }
displayName: Download Electron and Playwright
- ${{ if eq(parameters.VSCODE_RUN_UNIT_TESTS, true) }}:
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn electron $(VSCODE_ARCH) }
exec { .\scripts\test.bat --tfs "Unit Tests" }
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn test-node }
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node test/unit/browser/index.js --sequential --browser chromium --browser firefox --tfs "Browser Unit Tests" }
displayName: Run unit tests (Browser, Chromium & Firefox)
timeoutInMinutes: 20
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn electron $(VSCODE_ARCH) }
exec { .\scripts\test.bat --build --tfs "Unit Tests" }
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn test-node --build }
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn test-browser-no-install --sequential --build --browser chromium --browser firefox --tfs "Browser Unit Tests" }
displayName: Run unit tests (Browser, Chromium & Firefox)
timeoutInMinutes: 20
- ${{ if eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true) }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn gulp `
compile-extension:configuration-editing `
compile-extension:css-language-features-server `
compile-extension:emmet `
compile-extension:git `
compile-extension:github-authentication `
compile-extension:html-language-features-server `
compile-extension:ipynb `
compile-extension:json-language-features-server `
compile-extension:markdown-language-features-server `
compile-extension:markdown-language-features `
compile-extension-media `
compile-extension:microsoft-authentication `
compile-extension:typescript-language-features `
compile-extension:vscode-api-tests `
compile-extension:vscode-colorize-tests `
compile-extension:vscode-notebook-tests `
compile-extension:vscode-test-resolver `
}
displayName: Build integration tests
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { .\scripts\test-integration.bat --tfs "Integration Tests" }
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { .\scripts\test-web-integration.bat --browser firefox }
displayName: Run integration tests (Browser, Firefox)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { .\scripts\test-remote-integration.bat }
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\scripts\test-integration.bat --build --tfs "Integration Tests" }
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-web-win32-$(VSCODE_ARCH)"; .\scripts\test-web-integration.bat --browser firefox }
displayName: Run integration tests (Browser, Firefox)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\scripts\test-remote-integration.bat }
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
- ${{ if eq(parameters.VSCODE_RUN_SMOKE_TESTS, true) }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
exec {.\build\azure-pipelines\win32\listprocesses.bat }
displayName: Diagnostics before smoke test run
continueOnError: true
condition: succeededOrFailed()
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn --cwd test/smoke compile }
displayName: Compile smoke tests
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn smoketest-no-compile --tracing }
displayName: Run smoke tests (Electron)
timeoutInMinutes: 20
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
exec { yarn smoketest-no-compile --tracing --build "$AppRoot" }
displayName: Run smoke tests (Electron)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-web-win32-$(VSCODE_ARCH)"
exec { yarn smoketest-no-compile --web --tracing --headless }
displayName: Run smoke tests (Browser, Chromium)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"
exec { yarn gulp compile-extension:vscode-test-resolver }
exec { yarn smoketest-no-compile --tracing --remote --build "$AppRoot" }
displayName: Run smoke tests (Remote)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
exec {.\build\azure-pipelines\win32\listprocesses.bat }
displayName: Diagnostics after smoke test run
continueOnError: true
condition: succeededOrFailed()
- ${{ if or(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
- task: PublishPipelineArtifact@0
inputs:
targetPath: .build\crashes
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: crash-dump-windows-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: crash-dump-windows-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: crash-dump-windows-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
# In order to properly symbolify above crash reports
# (if any), we need the compiled native modules too
- task: PublishPipelineArtifact@0
inputs:
targetPath: node_modules
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: node-modules-windows-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: node-modules-windows-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: node-modules-windows-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Node Modules"
continueOnError: true
condition: failed()
- task: PublishPipelineArtifact@0
inputs:
targetPath: .build\logs
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: logs-windows-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: logs-windows-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: logs-windows-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Log Files"
continueOnError: true
condition: succeededOrFailed()
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: succeededOrFailed()
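The publish tasks at the end of the Windows test template pick an artifact-name suffix from which test flavors the job ran, via ${{ if / elseif / else }}. The selection reduces to the following (illustrative TypeScript, evaluated here at runtime rather than at template expansion):

```typescript
// Illustrative only: the suffix logic used for the crash-dump, node-modules
// and log artifacts in the Windows test template above.
function artifactName(kind: 'crash-dump' | 'node-modules' | 'logs',
                      arch: string, jobAttempt: number,
                      runIntegration: boolean, runSmoke: boolean): string {
  const suffix = runIntegration && !runSmoke ? '-integration'
    : !runIntegration && runSmoke ? '-smoke'
    : '';
  return `${kind}-windows-${arch}${suffix}-${jobAttempt}`;
}

// Example: the integration-test job publishes "crash-dump-windows-x64-integration-1".
console.log(artifactName('crash-dump', 'x64', 1, true, false));
```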

View File

@@ -1,4 +1,21 @@
parameters:
- name: VSCODE_PUBLISH
type: boolean
- name: VSCODE_QUALITY
type: string
- name: VSCODE_RUN_UNIT_TESTS
type: boolean
- name: VSCODE_RUN_INTEGRATION_TESTS
type: boolean
- name: VSCODE_RUN_SMOKE_TESTS
type: boolean
steps:
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- checkout: self
fetchDepth: 1
retryCountOnTaskFailure: 3
- task: NodeTool@0
inputs:
versionSpec: "16.x"
@@ -8,51 +25,58 @@ steps:
versionSpec: "3.x"
addToPath: true
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password,ESRP-PKI,esrp-aad-username,esrp-aad-password"
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password,ESRP-PKI,esrp-aad-username,esrp-aad-password"
- task: DownloadPipelineArtifact@2
inputs:
artifact: Compilation
path: $(Build.ArtifactStagingDirectory)
displayName: Download compilation output
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: DownloadPipelineArtifact@2
inputs:
artifact: Compilation
path: $(Build.ArtifactStagingDirectory)
displayName: Download compilation output
- task: ExtractFiles@1
displayName: Extract compilation output
inputs:
archiveFilePatterns: "$(Build.ArtifactStagingDirectory)/compilation.tar.gz"
cleanDestinationFolder: false
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
"machine github.com`nlogin vscode`npassword $(github-distro-mixin-password)" | Out-File "$env:USERPROFILE\_netrc" -Encoding ASCII
exec { git config user.email "vscode@microsoft.com" }
exec { git config user.name "VSCode" }
displayName: Prepare tooling
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $(VSCODE_DISTRO_REF) }
Write-Host "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
exec { git checkout FETCH_HEAD }
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro") }
displayName: Merge distro
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: ExtractFiles@1
displayName: Extract compilation output
inputs:
archiveFilePatterns: "$(Build.ArtifactStagingDirectory)/compilation.tar.gz"
cleanDestinationFolder: false
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
"machine github.com`nlogin vscode`npassword $(github-distro-mixin-password)" | Out-File "$env:USERPROFILE\_netrc" -Encoding ASCII
exec { git config user.email "vscode@microsoft.com" }
exec { git config user.name "VSCode" }
displayName: Prepare tooling
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $(VSCODE_DISTRO_REF) }
Write-Host "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
exec { git checkout FETCH_HEAD }
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro") }
displayName: Merge distro
- powershell: |
if (!(Test-Path ".build")) { New-Item -Path ".build" -ItemType Directory }
"$(VSCODE_ARCH)" | Out-File -Encoding ascii -NoNewLine .build\arch
"$env:ENABLE_TERRAPIN" | Out-File -Encoding ascii -NoNewLine .build\terrapin
node build/azure-pipelines/common/computeNodeModulesCacheKey.js > .build/yarnlockhash
@@ -104,291 +128,176 @@ steps:
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build/azure-pipelines/mixin }
displayName: Mix in quality
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build/azure-pipelines/mixin }
displayName: Mix in quality
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn npm-run-all -lp "electron $(VSCODE_ARCH)" }
displayName: Download Electron
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build\lib\policies }
displayName: Generate Group Policy definitions
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-min-ci" }
echo "##vso[task.setvariable variable=CodeSigningFolderPath]$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)"
displayName: Build
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "transpile-client" "transpile-extensions" }
displayName: Transpile
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-inno-updater" }
displayName: Prepare Package
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-min-ci" }
echo "##vso[task.setvariable variable=CodeSigningFolderPath]$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)"
displayName: Build
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build/azure-pipelines/mixin --server }
displayName: Mix in quality
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-inno-updater" }
displayName: Prepare Package
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-reh-win32-$(VSCODE_ARCH)-min-ci" }
exec { yarn gulp "vscode-reh-web-win32-$(VSCODE_ARCH)-min-ci" }
echo "##vso[task.setvariable variable=CodeSigningFolderPath]$(CodeSigningFolderPath),$(agent.builddirectory)/vscode-reh-win32-$(VSCODE_ARCH)"
displayName: Build Server
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build/azure-pipelines/mixin --server }
displayName: Mix in quality
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn npm-run-all -lp "playwright-install" }
displayName: Download Playwright
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-reh-win32-$(VSCODE_ARCH)-min-ci" }
exec { yarn gulp "vscode-reh-web-win32-$(VSCODE_ARCH)-min-ci" }
echo "##vso[task.setvariable variable=CodeSigningFolderPath]$(CodeSigningFolderPath),$(agent.builddirectory)/vscode-reh-win32-$(VSCODE_ARCH)"
displayName: Build Server
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn electron $(VSCODE_ARCH) }
exec { .\scripts\test.bat --build --tfs "Unit Tests" }
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if or(eq(parameters.VSCODE_RUN_UNIT_TESTS, true), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
- template: product-build-win32-test.yml
parameters:
VSCODE_QUALITY: ${{ parameters.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: ${{ parameters.VSCODE_RUN_UNIT_TESTS }}
VSCODE_RUN_INTEGRATION_TESTS: ${{ parameters.VSCODE_RUN_INTEGRATION_TESTS }}
VSCODE_RUN_SMOKE_TESTS: ${{ parameters.VSCODE_RUN_SMOKE_TESTS }}
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn test-node --build }
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- task: UseDotNet@2
inputs:
version: 3.x
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn test-browser-no-install --sequential --build --browser chromium --browser firefox --tfs "Browser Unit Tests" }
displayName: Run unit tests (Browser, Chromium & Firefox)
timeoutInMinutes: 20
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- task: EsrpClientTool@1
displayName: Download ESRPClient
- powershell: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\scripts\test-integration.bat --build --tfs "Integration Tests" }
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$EsrpClientTool = (gci -directory -filter EsrpClientTool_* $(Agent.RootDirectory)\_tasks | Select-Object -last 1).FullName
$EsrpCliZip = (gci -recurse -filter esrpcli.*.zip $EsrpClientTool | Select-Object -last 1).FullName
mkdir -p $(Agent.TempDirectory)\esrpcli
Expand-Archive -Path $EsrpCliZip -DestinationPath $(Agent.TempDirectory)\esrpcli
$EsrpCliDllPath = (gci -recurse -filter esrpcli.dll $(Agent.TempDirectory)\esrpcli | Select-Object -last 1).FullName
echo "##vso[task.setvariable variable=EsrpCliDllPath]$EsrpCliDllPath"
displayName: Find ESRP CLI
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-web-win32-$(VSCODE_ARCH)"; .\scripts\test-web-integration.bat --browser firefox }
displayName: Run integration tests (Browser, Firefox)
timeoutInMinutes: 20
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build\azure-pipelines\common\sign $env:EsrpCliDllPath windows $(ESRP-PKI) $(esrp-aad-username) $(esrp-aad-password) $(CodeSigningFolderPath) '*.dll,*.exe,*.node' }
displayName: Codesign
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\scripts\test-remote-integration.bat }
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-archive" }
displayName: Package archive
- powershell: |
. build/azure-pipelines/win32/exec.ps1
exec {.\build\azure-pipelines\win32\listprocesses.bat }
displayName: Diagnostics before smoke test run
continueOnError: true
condition: and(succeededOrFailed(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:ESRPPKI = "$(ESRP-PKI)"
$env:ESRPAADUsername = "$(esrp-aad-username)"
$env:ESRPAADPassword = "$(esrp-aad-password)"
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-system-setup" --sign }
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-user-setup" --sign }
displayName: Package setups
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-web-win32-$(VSCODE_ARCH)"
exec { yarn smoketest-no-compile --web --tracing --headless }
displayName: Run smoke tests (Browser, Chromium)
timeoutInMinutes: 10
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
.\build\azure-pipelines\win32\prepare-publish.ps1
displayName: Publish
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
exec { yarn smoketest-no-compile --tracing --build "$AppRoot" }
displayName: Run smoke tests (Electron)
timeoutInMinutes: 20
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (client)
inputs:
BuildDropPath: $(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)
PackageName: Visual Studio Code
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"
exec { yarn smoketest-no-compile --tracing --remote --build "$AppRoot" }
displayName: Run smoke tests (Remote)
timeoutInMinutes: 20
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (client)
artifact: vscode_client_win32_$(VSCODE_ARCH)_sbom
- powershell: |
. build/azure-pipelines/win32/exec.ps1
exec {.\build\azure-pipelines\win32\listprocesses.bat }
displayName: Diagnostics after smoke test run
continueOnError: true
condition: and(succeededOrFailed(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (server)
inputs:
BuildDropPath: $(agent.builddirectory)/vscode-server-win32-$(VSCODE_ARCH)
PackageName: Visual Studio Code Server
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-windows-$(VSCODE_ARCH)
targetPath: .build\crashes
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(agent.builddirectory)/vscode-server-win32-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (server)
artifact: vscode_server_win32_$(VSCODE_ARCH)_sbom
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))
# In order to properly symbolify above crash reports
# (if any), we need the compiled native modules too
- task: PublishPipelineArtifact@0
inputs:
artifactName: node-modules-windows-$(VSCODE_ARCH)
targetPath: node_modules
displayName: "Publish Node Modules"
continueOnError: true
condition: failed()
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\archive\$(ARCHIVE_NAME)
artifact: vscode_client_win32_$(VSCODE_ARCH)_archive
displayName: Publish archive
- task: PublishPipelineArtifact@0
inputs:
artifactName: logs-windows-$(VSCODE_ARCH)-$(System.JobAttempt)
targetPath: .build\logs
displayName: "Publish Log Files"
continueOnError: true
condition: and(failed(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\system-setup\$(SYSTEM_SETUP_NAME)
artifact: vscode_client_win32_$(VSCODE_ARCH)_setup
displayName: Publish system setup
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: and(succeededOrFailed(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\user-setup\$(USER_SETUP_NAME)
artifact: vscode_client_win32_$(VSCODE_ARCH)_user-setup
displayName: Publish user setup
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: UseDotNet@2
inputs:
version: 3.x
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(System.DefaultWorkingDirectory)\.build\vscode-server-win32-$(VSCODE_ARCH).zip
artifact: vscode_server_win32_$(VSCODE_ARCH)_archive
displayName: Publish server archive
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))
- task: EsrpClientTool@1
displayName: Download ESRPClient
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$EsrpClientTool = (gci -directory -filter EsrpClientTool_* $(Agent.RootDirectory)\_tasks | Select-Object -last 1).FullName
$EsrpCliZip = (gci -recurse -filter esrpcli.*.zip $EsrpClientTool | Select-Object -last 1).FullName
mkdir -p $(Agent.TempDirectory)\esrpcli
Expand-Archive -Path $EsrpCliZip -DestinationPath $(Agent.TempDirectory)\esrpcli
$EsrpCliDllPath = (gci -recurse -filter esrpcli.dll $(Agent.TempDirectory)\esrpcli | Select-Object -last 1).FullName
echo "##vso[task.setvariable variable=EsrpCliDllPath]$EsrpCliDllPath"
displayName: Find ESRP CLI
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build\azure-pipelines\common\sign $env:EsrpCliDllPath windows $(ESRP-PKI) $(esrp-aad-username) $(esrp-aad-password) $(CodeSigningFolderPath) '*.dll,*.exe,*.node' }
displayName: Codesign
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-archive" }
displayName: Package archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:ESRPPKI = "$(ESRP-PKI)"
$env:ESRPAADUsername = "$(esrp-aad-username)"
$env:ESRPAADPassword = "$(esrp-aad-password)"
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-system-setup" --sign }
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-user-setup" --sign }
displayName: Package setups
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
.\build\azure-pipelines\win32\prepare-publish.ps1
displayName: Publish
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\archive\$(ARCHIVE_NAME)
artifact: vscode_client_win32_$(VSCODE_ARCH)_archive
displayName: Publish archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\system-setup\$(SYSTEM_SETUP_NAME)
artifact: vscode_client_win32_$(VSCODE_ARCH)_setup
displayName: Publish system setup
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\user-setup\$(USER_SETUP_NAME)
artifact: vscode_client_win32_$(VSCODE_ARCH)_user-setup
displayName: Publish user setup
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(System.DefaultWorkingDirectory)\.build\vscode-server-win32-$(VSCODE_ARCH).zip
artifact: vscode_server_win32_$(VSCODE_ARCH)_archive
displayName: Publish server archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- publish: $(System.DefaultWorkingDirectory)\.build\vscode-server-win32-$(VSCODE_ARCH)-web.zip
artifact: vscode_web_win32_$(VSCODE_ARCH)_archive
displayName: Publish web server archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (client)
inputs:
BuildDropPath: $(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)
PackageName: Visual Studio Code
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (client)
artifact: vscode_client_win32_$(VSCODE_ARCH)_sbom
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (server)
inputs:
BuildDropPath: $(agent.builddirectory)/vscode-server-win32-$(VSCODE_ARCH)
PackageName: Visual Studio Code Server
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- publish: $(agent.builddirectory)/vscode-server-win32-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (server)
artifact: vscode_server_win32_$(VSCODE_ARCH)_sbom
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
- publish: $(System.DefaultWorkingDirectory)\.build\vscode-server-win32-$(VSCODE_ARCH)-web.zip
artifact: vscode_web_win32_$(VSCODE_ARCH)_archive
displayName: Publish web server archive
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const vscode_universal_bundler_1 = require("vscode-universal-bundler");
const cross_spawn_promise_1 = require("@malept/cross-spawn-promise");
@@ -71,7 +71,7 @@ async function main() {
outAppPath,
force: true
});
let productJson = await fs.readJson(productJsonPath);
const productJson = await fs.readJson(productJsonPath);
Object.assign(productJson, {
darwinUniversalAssetId: 'darwin-universal'
});

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import { makeUniversalApp } from 'vscode-universal-bundler';
import { spawn } from '@malept/cross-spawn-promise';
import * as fs from 'fs-extra';
@@ -80,7 +78,7 @@ async function main() {
force: true
});
let productJson = await fs.readJson(productJsonPath);
const productJson = await fs.readJson(productJsonPath);
Object.assign(productJson, {
darwinUniversalAssetId: 'darwin-universal'
});

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const codesign = require("electron-osx-sign");
const path = require("path");
@@ -40,14 +40,26 @@ async function main() {
identity: '99FM488X57',
'gatekeeper-assess': false
};
const appOpts = Object.assign(Object.assign({}, defaultOpts), {
const appOpts = {
...defaultOpts,
// TODO(deepak1556): Incorrectly declared type in electron-osx-sign
ignore: (filePath) => {
return filePath.includes(gpuHelperAppName) ||
filePath.includes(rendererHelperAppName);
} });
const gpuHelperOpts = Object.assign(Object.assign({}, defaultOpts), { app: path.join(appFrameworkPath, gpuHelperAppName), entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist'), 'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist') });
const rendererHelperOpts = Object.assign(Object.assign({}, defaultOpts), { app: path.join(appFrameworkPath, rendererHelperAppName), entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist'), 'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist') });
}
};
const gpuHelperOpts = {
...defaultOpts,
app: path.join(appFrameworkPath, gpuHelperAppName),
entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist'),
'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist'),
};
const rendererHelperOpts = {
...defaultOpts,
app: path.join(appFrameworkPath, rendererHelperAppName),
entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist'),
'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist'),
};
// Only overwrite plist entries for x64 and arm64 builds,
// universal will get its copy from the x64 build.
if (arch !== 'universal') {

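The hunk above replaces the TypeScript down-level output `Object.assign(Object.assign({}, defaultOpts), { ... })` with plain object spread when building the per-helper signing options. A minimal sketch of the equivalence, using a hypothetical `app` field rather than the real electron-osx-sign option set:

// Both forms produce the same merged options object; only the syntax differs.
const defaultOpts = { identity: '99FM488X57', 'gatekeeper-assess': false };

// Down-levelled form previously checked in (emitted by older TypeScript targets):
const mergedOld = Object.assign(Object.assign({}, defaultOpts), { app: 'Example Helper.app' });

// Spread form now used in the updated script:
const mergedNew = { ...defaultOpts, app: 'Example Helper.app' };

console.log(JSON.stringify(mergedOld) === JSON.stringify(mergedNew)); // true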
View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as codesign from 'electron-osx-sign';
import * as path from 'path';
import * as util from '../lib/util';

View File

@@ -52,7 +52,6 @@ module.exports.unicodeFilter = [
'!extensions/typescript-language-features/test-workspace/**',
'!extensions/vscode-api-tests/testWorkspace/**',
'!extensions/vscode-api-tests/testWorkspace2/**',
'!extensions/vscode-custom-editor-tests/test-workspace/**',
'!extensions/**/dist/**',
'!extensions/**/out/**',
'!extensions/**/snippets/**',
@@ -93,7 +92,6 @@ module.exports.indentationFilter = [
'!extensions/markdown-math/notebook-out/**',
'!extensions/vscode-api-tests/testWorkspace/**',
'!extensions/vscode-api-tests/testWorkspace2/**',
'!extensions/vscode-custom-editor-tests/test-workspace/**',
'!build/monaco/**',
'!build/win32/**',
@@ -146,7 +144,6 @@ module.exports.indentationFilter = [
'!extensions/admin-tool-ext-win/license/**',
'!extensions/resource-deployment/notebooks/**',
'!extensions/mssql/notebooks/**',
'!extensions/azurehybridtoolkit/notebooks/**',
'!extensions/integration-tests/testData/**',
'!extensions/arc/src/controller/generated/**',
'!extensions/sql-database-projects/resources/templates/*.xml',
@@ -198,7 +195,6 @@ module.exports.copyrightFilter = [
'!src/vs/editor/test/node/classification/typescript-test.ts',
// {{SQL CARBON EDIT}} Except for stuff in our code that doesn't use our copyright
'!extensions/azurehybridtoolkit/notebooks/**',
'!extensions/azuremonitor/src/prompts/**',
'!extensions/import/flatfileimportservice/**',
'!extensions/kusto/src/prompts/**',

View File

@@ -17,34 +17,41 @@ const compilation = require('./lib/compilation');
const monacoapi = require('./lib/monaco-api');
const fs = require('fs');
let root = path.dirname(__dirname);
let sha1 = util.getVersion(root);
let semver = require('./monaco/package.json').version;
let headerVersion = semver + '(' + sha1 + ')';
const root = path.dirname(__dirname);
const sha1 = util.getVersion(root);
const semver = require('./monaco/package.json').version;
const headerVersion = semver + '(' + sha1 + ')';
// Build
let editorEntryPoints = [
const editorEntryPoints = [
{
name: 'vs/editor/editor.main',
include: [],
exclude: ['vs/css', 'vs/nls'],
prepend: ['out-editor-build/vs/css.js', 'out-editor-build/vs/nls.js'],
prepend: [
{ path: 'out-editor-build/vs/css.js', amdModuleId: 'vs/css' },
{ path: 'out-editor-build/vs/nls.js', amdModuleId: 'vs/nls' }
],
},
{
name: 'vs/base/common/worker/simpleWorker',
include: ['vs/editor/common/services/editorSimpleWorker'],
prepend: ['vs/loader.js'],
append: ['vs/base/worker/workerMain'],
exclude: ['vs/nls'],
prepend: [
{ path: 'vs/loader.js' },
{ path: 'vs/nls.js', amdModuleId: 'vs/nls' },
{ path: 'vs/base/worker/workerMain.js' }
],
dest: 'vs/base/worker/workerMain.js'
}
];
let editorResources = [
const editorResources = [
'out-editor-build/vs/base/browser/ui/codicons/**/*.ttf'
];
let BUNDLED_FILE_HEADER = [
const BUNDLED_FILE_HEADER = [
'/*!-----------------------------------------------------------',
' * Copyright (c) Microsoft Corporation. All rights reserved.',
' * Version: ' + headerVersion,
@@ -109,12 +116,6 @@ const createESMSourcesAndResourcesTask = task.define('extract-editor-esm', () =>
'inlineEntryPoint:0.ts',
'inlineEntryPoint:1.ts',
'vs/loader.js',
'vs/nls.ts',
'vs/nls.build.js',
'vs/nls.d.ts',
'vs/css.js',
'vs/css.build.js',
'vs/css.d.ts',
'vs/base/worker/workerMain.ts',
],
renames: {
@@ -224,7 +225,7 @@ const appendJSToESMImportsTask = task.define('append-js-to-esm-imports', () => {
result.push(line);
continue;
}
let modifiedLine = (
const modifiedLine = (
line
.replace(/^import(.*)\'([^']+)\'/, `import$1'$2.js'`)
.replace(/^export \* from \'([^']+)\'/, `export * from '$1.js'`)
@@ -239,10 +240,10 @@ const appendJSToESMImportsTask = task.define('append-js-to-esm-imports', () => {
* @param {string} contents
*/
function toExternalDTS(contents) {
let lines = contents.split(/\r\n|\r|\n/);
const lines = contents.split(/\r\n|\r|\n/);
let killNextCloseCurlyBrace = false;
for (let i = 0; i < lines.length; i++) {
let line = lines[i];
const line = lines[i];
if (killNextCloseCurlyBrace) {
if ('}' === line) {
@@ -316,7 +317,7 @@ const finalEditorResourcesTask = task.define('final-editor-resources', () => {
// package.json
gulp.src('build/monaco/package.json')
.pipe(es.through(function (data) {
let json = JSON.parse(data.contents.toString());
const json = JSON.parse(data.contents.toString());
json.private = false;
data.contents = Buffer.from(JSON.stringify(json, null, ' '));
this.emit('data', data);
@@ -360,10 +361,10 @@ const finalEditorResourcesTask = task.define('final-editor-resources', () => {
return;
}
let relativePathToMap = path.relative(path.join(data.relative), path.join('min-maps', data.relative + '.map'));
const relativePathToMap = path.relative(path.join(data.relative), path.join('min-maps', data.relative + '.map'));
let strContents = data.contents.toString();
let newStr = '//# sourceMappingURL=' + relativePathToMap.replace(/\\/g, '/');
const newStr = '//# sourceMappingURL=' + relativePathToMap.replace(/\\/g, '/');
strContents = strContents.replace(/\/\/# sourceMappingURL=[^ ]+$/, newStr);
data.contents = Buffer.from(strContents);
@@ -483,13 +484,13 @@ function createTscCompileTask(watch) {
cwd: path.join(__dirname, '..'),
// stdio: [null, 'pipe', 'inherit']
});
let errors = [];
let reporter = createReporter('monaco');
const errors = [];
const reporter = createReporter('monaco');
/** @type {NodeJS.ReadWriteStream | undefined} */
let report;
// eslint-disable-next-line no-control-regex
let magic = /[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g; // https://stackoverflow.com/questions/25245716/remove-all-ansi-colors-styles-from-strings
const magic = /[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g; // https://stackoverflow.com/questions/25245716/remove-all-ansi-colors-styles-from-strings
child.stdout.on('data', data => {
let str = String(data);
@@ -502,12 +503,12 @@ function createTscCompileTask(watch) {
report.end();
} else if (str) {
let match = /(.*\(\d+,\d+\): )(.*: )(.*)/.exec(str);
const match = /(.*\(\d+,\d+\): )(.*: )(.*)/.exec(str);
if (match) {
// trying to massage the message so that it matches the gulp-tsb error messages
// e.g. src/vs/base/common/strings.ts(663,5): error TS2322: Type '1234' is not assignable to type 'string'.
let fullpath = path.join(root, match[1]);
let message = match[3];
const fullpath = path.join(root, match[1]);
const message = match[3];
reporter(fullpath + message);
} else {
reporter(str);

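A detail from the hunk above: createTscCompileTask strips ANSI colour sequences from the tsc child-process output before parsing error locations, so the `file(line,col): error TS...` regex still matches when the compiler emits coloured output. A small stand-alone sketch of that per-line handling (the real task feeds a gulp reporter; here it is just console.error):

const path = require('path');

// Regex from the hunk above (see stackoverflow.com/questions/25245716): matches ANSI colour/style escapes.
const ansiEscapes = /[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g;

// Hypothetical stand-alone version of the per-line handling in createTscCompileTask.
function reportTscLine(rawLine, root, reporter = console.error) {
  const str = String(rawLine).replace(ansiEscapes, '');
  const match = /(.*\(\d+,\d+\): )(.*: )(.*)/.exec(str);
  if (match) {
    // Massage "src/foo.ts(663,5): error TS2322: <message>" so it matches the gulp-tsb style.
    const fullpath = path.join(root, match[1]);
    const message = match[3];
    reporter(fullpath + message);
  } else if (str) {
    reporter(str);
  }
}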
View File

@@ -56,6 +56,7 @@ const compilations = [
'json-language-features/client/tsconfig.json',
'json-language-features/server/tsconfig.json',
'markdown-language-features/preview-src/tsconfig.json',
'markdown-language-features/server/tsconfig.json',
'markdown-language-features/tsconfig.json',
'markdown-math/tsconfig.json',
'merge-conflict/tsconfig.json',
@@ -63,12 +64,13 @@ const compilations = [
'npm/tsconfig.json',
'php-language-features/tsconfig.json',
'search-result/tsconfig.json',
'references-view/tsconfig.json',
'simple-browser/tsconfig.json',
'typescript-language-features/test-workspace/tsconfig.json',
'typescript-language-features/tsconfig.json',
'vscode-api-tests/tsconfig.json',
'vscode-colorize-tests/tsconfig.json',
'vscode-custom-editor-tests/tsconfig.json',
'vscode-notebook-tests/tsconfig.json',
'vscode-test-resolver/tsconfig.json'
];
*/
@@ -93,7 +95,7 @@ const tasks = compilations.map(function (tsconfigFile) {
const baseUrl = getBaseUrl(out);
let headerId, headerOut;
let index = relativeDirname.indexOf('/');
const index = relativeDirname.indexOf('/');
if (index < 0) {
headerId = 'microsoft.' + relativeDirname; // {{SQL CARBON EDIT}}
headerOut = 'out';
@@ -102,9 +104,9 @@ const tasks = compilations.map(function (tsconfigFile) {
headerOut = relativeDirname.substr(index + 1) + '/out';
}
function createPipeline(build, emitError) {
function createPipeline(build, emitError, transpileOnly) {
const nlsDev = require('vscode-nls-dev');
const tsb = require('gulp-tsb');
const tsb = require('./lib/tsb');
const sourcemaps = require('gulp-sourcemaps');
const reporter = createReporter('extensions');
@@ -112,7 +114,7 @@ const tasks = compilations.map(function (tsconfigFile) {
overrideOptions.inlineSources = Boolean(build);
overrideOptions.base = path.dirname(absolutePath);
const compilation = tsb.create(absolutePath, overrideOptions, false, err => reporter(err.toString()));
const compilation = tsb.create(absolutePath, overrideOptions, { verbose: false, transpileOnly, transpileOnlyIncludesDts: transpileOnly }, err => reporter(err.toString()));
const pipeline = function () {
const input = es.through();
@@ -154,6 +156,16 @@ const tasks = compilations.map(function (tsconfigFile) {
const cleanTask = task.define(`clean-extension-${name}`, util.rimraf(out));
const transpileTask = task.define(`transpile-extension:${name}`, task.series(cleanTask, () => {
const pipeline = createPipeline(false, true, true);
const nonts = gulp.src(src, srcOpts).pipe(filter(['**', '!**/*.ts']));
const input = es.merge(nonts, pipeline.tsProjectSrc());
return input
.pipe(pipeline())
.pipe(gulp.dest(out));
}));
const compileTask = task.define(`compile-extension:${name}`, task.series(cleanTask, () => {
const pipeline = createPipeline(false, true);
const nonts = gulp.src(src, srcOpts).pipe(filter(['**', '!**/*.ts']));
@@ -186,12 +198,16 @@ const tasks = compilations.map(function (tsconfigFile) {
}));
// Tasks
gulp.task(transpileTask);
gulp.task(compileTask);
gulp.task(watchTask);
return { compileTask, watchTask, compileBuildTask };
return { transpileTask, compileTask, watchTask, compileBuildTask };
});
const transpileExtensionsTask = task.define('transpile-extensions', task.parallel(...tasks.map(t => t.transpileTask)));
gulp.task(transpileExtensionsTask);
const compileExtensionsTask = task.define('compile-extensions', task.parallel(...tasks.map(t => t.compileTask)));
gulp.task(compileExtensionsTask);
exports.compileExtensionsTask = compileExtensionsTask;

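The hunk above adds a transpile-only variant of each extension's compile task, wired through the in-tree ./lib/tsb module with transpileOnly and transpileOnlyIncludesDts enabled. A rough sketch of the stream shape only, under the assumption that a placeholder compile() stands in for that tsb pipeline (the real createPipeline also handles NLS metadata and sourcemaps):

const gulp = require('gulp');
const es = require('event-stream');
const filter = require('gulp-filter');

// Hypothetical skeleton: copy non-TypeScript assets through untouched and merge them
// with the transpiled TypeScript, mirroring the nonts/pipeline merge in the hunk above.
// `compile` is a stand-in for the ./lib/tsb pipeline created with transpileOnly: true.
function transpileExtension(src, srcOpts, out, compile) {
  const nonts = gulp.src(src, srcOpts).pipe(filter(['**', '!**/*.ts']));
  const ts = gulp.src(src, srcOpts).pipe(filter(['**/*.ts'])).pipe(compile());
  return es.merge(nonts, ts)
    .pipe(gulp.dest(out));
}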
View File

@@ -16,7 +16,7 @@ function checkPackageJSON(actualPath) {
const actual = require(path.join(__dirname, '..', actualPath));
const rootPackageJSON = require('../package.json');
const checkIncluded = (set1, set2) => {
for (let depName in set1) {
for (const depName in set1) {
if (depName === 'typescript') {
continue;
}

View File

@@ -11,7 +11,7 @@ require('events').EventEmitter.defaultMaxListeners = 100;
const gulp = require('gulp');
const util = require('./lib/util');
const task = require('./lib/task');
const { compileTask, watchTask, compileApiProposalNamesTask, watchApiProposalNamesTask } = require('./lib/compilation');
const { transpileTask, compileTask, watchTask, compileApiProposalNamesTask, watchApiProposalNamesTask } = require('./lib/compilation');
const { monacoTypecheckTask/* , monacoTypecheckWatchTask */ } = require('./gulpfile.editor');
const { compileExtensionsTask, watchExtensionsTask, compileExtensionMediaTask } = require('./gulpfile.extensions');
@@ -19,6 +19,10 @@ const { compileExtensionsTask, watchExtensionsTask, compileExtensionMediaTask }
gulp.task(compileApiProposalNamesTask);
gulp.task(watchApiProposalNamesTask);
// Transpile only
const transpileClientTask = task.define('transpile-client', task.series(util.rimraf('out'), util.buildWebNodePaths('out'), transpileTask('src', 'out')));
gulp.task(transpileClientTask);
// Fast compile for development time
const compileClientTask = task.define('compile-client', task.series(util.rimraf('out'), util.buildWebNodePaths('out'), compileApiProposalNamesTask, compileTask('src', 'out', false)));
gulp.task(compileClientTask);

View File

@@ -116,7 +116,7 @@ const serverEntryPoints = [
name: 'vs/workbench/api/node/extensionHostProcess',
exclude: ['vs/css', 'vs/nls']
},
{
{
name: 'vs/platform/files/node/watcher/watcherMain',
exclude: ['vs/css', 'vs/nls']
},
@@ -135,7 +135,7 @@ try {
// Include workbench web
...vscodeWebEntryPoints
];
];
} catch (err) {
serverWithWebEntryPoints = [
// Include all of server
@@ -356,14 +356,14 @@ function copyConfigTask(folder) {
const json = require('gulp-json-editor');
return gulp.src(['remote/pkg-package.json'], { base: 'remote' })
.pipe(rename(path => path.basename += '.' + folder))
.pipe(json(obj => {
const pkg = obj.pkg;
pkg.scripts = pkg.scripts && pkg.scripts.map(p => path.join(destination, p));
pkg.assets = pkg.assets && pkg.assets.map(p => path.join(destination, p));
return obj;
}))
.pipe(vfs.dest('out-vscode-reh-pkg'));
.pipe(rename(path => path.basename += '.' + folder))
.pipe(json(obj => {
const pkg = obj.pkg;
pkg.scripts = pkg.scripts && pkg.scripts.map(p => path.join(destination, p));
pkg.assets = pkg.assets && pkg.assets.map(p => path.join(destination, p));
return obj;
}))
.pipe(vfs.dest('out-vscode-reh-pkg'));
};
}

View File

@@ -13,7 +13,7 @@ const ext = require('./lib/extensions');
const loc = require('./lib/locFunc');
const task = require('./lib/task');
const glob = require('glob');
const vsce = require('vsce');
const vsce = require('@vscode/vsce');
const mkdirp = require('mkdirp');
const rename = require('gulp-rename');
const fs = require('fs');
@@ -103,17 +103,22 @@ gulp.task('package-external-extensions', task.series(
return { name: extensionName, path: extensionPath };
})
.filter(element => ext.vscodeExternalExtensions.indexOf(element.name) === -1) // VS Code external extensions are bundled into ADS so no need to create a normal VSIX for them
.map(element => {
const pkgJson = require(path.join(element.path, 'package.json'));
const vsixDirectory = path.join(root, '.build', 'extensions');
mkdirp.sync(vsixDirectory);
const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
console.info('Creating vsix for ' + element.path + ' result:' + packagePath);
return vsce.createVSIX({
cwd: element.path,
packagePath: packagePath,
useYarn: true
});
.map(async element => {
try {
const pkgJson = require(path.join(element.path, 'package.json'));
const vsixDirectory = path.join(root, '.build', 'extensions');
mkdirp.sync(vsixDirectory);
const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
console.info(`Creating vsix for ${element.path}, result: ${packagePath}`);
return await vsce.createVSIX({
cwd: element.path,
packagePath: packagePath,
useYarn: false
});
} catch (e) {
console.error(`Failed to create vsix for ${element.path}, error occurred: ${e}`);
throw e;
}
});
// Wait for all the initial VSIXes to be completed before making the VS Code ones since we'll be overwriting
// values in the package.json for those.
@@ -124,50 +129,54 @@ gulp.task('package-external-extensions', task.series(
// the package.json. It doesn't handle more complex tasks such as replacing localized strings.
const vscodeVsixes = glob.sync('.build/external/extensions/*/package.vscode.json')
.map(async vscodeManifestRelativePath => {
const vscodeManifestFullPath = path.join(root, vscodeManifestRelativePath);
const packageDir = path.dirname(vscodeManifestFullPath);
const packageManifestPath = path.join(packageDir, 'package.json');
const json = require('gulp-json-editor');
const packageJsonStream = gulp.src(packageManifestPath) // Create stream for the original package.json
.pipe(json(data => {
// And now use gulp-json-editor to modify the contents
const updateData = JSON.parse(fs.readFileSync(vscodeManifestFullPath)); // Read in the set of values to replace from package.vscode.json
Object.keys(updateData).forEach(key => {
if (key !== 'contributes') {
data[key] = updateData[key];
try {
const vscodeManifestFullPath = path.join(root, vscodeManifestRelativePath);
const packageDir = path.dirname(vscodeManifestFullPath);
const packageManifestPath = path.join(packageDir, 'package.json');
const json = require('gulp-json-editor');
const packageJsonStream = gulp.src(packageManifestPath) // Create stream for the original package.json
.pipe(json(data => {
// And now use gulp-json-editor to modify the contents
const updateData = JSON.parse(fs.readFileSync(vscodeManifestFullPath)); // Read in the set of values to replace from package.vscode.json
Object.keys(updateData).forEach(key => {
if (key !== 'contributes') {
data[key] = updateData[key];
}
});
if (data.contributes?.menus) {
// Remove ADS-only menus. This is a subset of the menus listed in https://github.com/microsoft/azuredatastudio/blob/main/src/vs/workbench/api/common/menusExtensionPoint.ts
// More can be added to the list as needed.
['objectExplorer/item/context', 'dataExplorer/context', 'dashboard/toolbar'].forEach(menu => {
delete data.contributes.menus[menu];
});
}
});
if (data.contributes?.menus) {
// Remove ADS-only menus. This is a subset of the menus listed in https://github.com/microsoft/azuredatastudio/blob/main/src/vs/workbench/api/common/menusExtensionPoint.ts
// More can be added to the list as needed.
['objectExplorer/item/context', 'dataExplorer/context', 'dashboard/toolbar'].forEach(menu => {
delete data.contributes.menus[menu];
});
}
// Add any configuration properties from the package.vscode.json
// Currently only supports bringing over properties in the first config object found and doesn't support modifying the title
if (updateData.contributes?.configuration[0]?.properties) {
Object.keys(updateData.contributes.configuration[0].properties).forEach(key => {
data.contributes.configuration[0].properties[key] = updateData.contributes.configuration[0].properties[key];
});
}
// Add any configuration properties from the package.vscode.json
// Currently only supports bringing over properties in the first config object found and doesn't support modifying the title
if (updateData.contributes?.configuration[0]?.properties) {
Object.keys(updateData.contributes.configuration[0].properties).forEach(key => {
data.contributes.configuration[0].properties[key] = updateData.contributes.configuration[0].properties[key];
});
}
return data;
}, { beautify: false }))
.pipe(gulp.dest(packageDir));
await new Promise(resolve => packageJsonStream.on('finish', resolve)); // Wait for the files to finish being updated before packaging
const pkgJson = JSON.parse(fs.readFileSync(packageManifestPath));
const vsixDirectory = path.join(root, '.build', 'extensions');
const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
console.info('Creating vsix for ' + packageDir + ' result:' + packagePath);
return vsce.createVSIX({
cwd: packageDir,
packagePath: packagePath,
useYarn: true
});
return data;
}, { beautify: false }))
.pipe(gulp.dest(packageDir));
await new Promise(resolve => packageJsonStream.on('finish', resolve)); // Wait for the files to finish being updated before packaging
const pkgJson = JSON.parse(fs.readFileSync(packageManifestPath));
const vsixDirectory = path.join(root, '.build', 'extensions');
const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
console.info(`Creating vsix for ${packageDir} result: ${packagePath}`);
return await vsce.createVSIX({
cwd: packageDir,
packagePath: packagePath,
useYarn: false
});
} catch (e) {
console.error(`Failed to create vsix for ${packageDir}, error occurred: ${e}`);
throw e;
}
});
return Promise.all(vscodeVsixes);
})
));
@@ -179,17 +188,22 @@ gulp.task('package-langpacks', task.series(
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
}).map(element => {
const pkgJson = require(path.join(element.path, 'package.json'));
const vsixDirectory = path.join(root, '.build', 'langpacks');
mkdirp.sync(vsixDirectory);
const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
console.info('Creating vsix for ' + element.path + ' result:' + packagePath);
return vsce.createVSIX({
cwd: element.path,
packagePath: packagePath,
useYarn: true
});
}).map(async element => {
try {
const pkgJson = require(path.join(element.path, 'package.json'));
const vsixDirectory = path.join(root, '.build', 'langpacks');
mkdirp.sync(vsixDirectory);
const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
console.info('Creating vsix for ' + element.path + ' result:' + packagePath);
return await vsce.createVSIX({
cwd: element.path,
packagePath: packagePath,
useYarn: false
});
} catch (e) {
console.error(`Failed to create vsix for ${element.path}, error occurred: ${e}`);
throw e;
}
});
return Promise.all(vsixes);

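The packaging hunks above move from the deprecated vsce package to @vscode/vsce, switch useYarn off, and wrap each createVSIX call in try/catch so a failing extension reports which folder broke before rejecting the task. A condensed sketch of that per-extension step, with the call shape taken directly from the hunk:

const path = require('path');
const mkdirp = require('mkdirp');
const vsce = require('@vscode/vsce');

// Hypothetical helper mirroring the loop above: package one extension folder into
// <vsixDirectory>/<name>-<version>.vsix and rethrow with context on failure.
async function packageExtension(extensionPath, vsixDirectory) {
  try {
    const pkgJson = require(path.join(extensionPath, 'package.json'));
    mkdirp.sync(vsixDirectory);
    const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
    console.info(`Creating vsix for ${extensionPath}, result: ${packagePath}`);
    return await vsce.createVSIX({ cwd: extensionPath, packagePath: packagePath, useYarn: false });
  } catch (e) {
    console.error(`Failed to create vsix for ${extensionPath}, error occurred: ${e}`);
    throw e;
  }
}

Because the map callbacks are now async and the results are awaited via Promise.all, one failed VSIX rejects the whole gulp task instead of disappearing as an unhandled promise.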
View File

@@ -64,7 +64,6 @@ const vscodeResources = [
'out-build/vs/base/browser/ui/codicons/codicon/**',
'out-build/vs/base/parts/sandbox/electron-browser/preload.js',
'out-build/vs/platform/environment/node/userDataPath.js',
'out-build/vs/platform/extensions/node/extensionHostStarterWorkerMain.js',
'out-build/vs/workbench/browser/media/*-theme.css',
'out-build/vs/workbench/contrib/debug/**/*.json',
'out-build/vs/workbench/contrib/externalTerminal/**/*.scpt',
@@ -76,7 +75,7 @@ const vscodeResources = [
'out-build/vs/workbench/contrib/tasks/**/*.json',
'out-build/vs/platform/files/**/*.exe',
'out-build/vs/platform/files/**/*.md',
'out-build/vs/code/electron-browser/workbench/**',
'out-build/vs/code/electron-sandbox/workbench/**',
'out-build/vs/code/electron-browser/sharedProcess/sharedProcess.js',
'out-build/vs/code/electron-sandbox/issue/issueReporter.js',
'out-build/sql/**/*.{svg,png,cur,html}',
@@ -124,7 +123,6 @@ const extensionsFilter = filter([
'**/asde-deployment.xlf',
'**/azcli.xlf',
'**/azurecore.xlf',
'**/azurehybridtoolkit.xlf',
'**/azuremonitor.xlf',
'**/cms.xlf',
'**/dacpac.xlf',
@@ -134,7 +132,6 @@ const extensionsFilter = filter([
'**/import.xlf',
'**/kusto.xlf',
'**/machine-learning.xlf',
'**/Microsoft.sqlservernotebook.xlf',
'**/mssql.xlf',
'**/notebook.xlf',
'**/profiler.xlf',
@@ -187,9 +184,9 @@ gulp.task(core);
* @return {Object} A map of paths to checksums.
*/
function computeChecksums(out, filenames) {
let result = {};
const result = {};
filenames.forEach(function (filename) {
let fullPath = path.join(process.cwd(), out, filename);
const fullPath = path.join(process.cwd(), out, filename);
result[filename] = computeChecksum(fullPath);
});
return result;
@@ -202,9 +199,9 @@ function computeChecksums(out, filenames) {
* @return {string} The checksum for `filename`.
*/
function computeChecksum(filename) {
let contents = fs.readFileSync(filename);
const contents = fs.readFileSync(filename);
let hash = crypto
const hash = crypto
.createHash('md5')
.update(contents)
.digest('base64')
@@ -230,8 +227,8 @@ function packageTask(platform, arch, sourceFolderName, destinationFolderName, op
'vs/workbench/workbench.desktop.main.js',
'vs/workbench/workbench.desktop.main.css',
'vs/workbench/api/node/extensionHostProcess.js',
'vs/code/electron-browser/workbench/workbench.html',
'vs/code/electron-browser/workbench/workbench.js'
'vs/code/electron-sandbox/workbench/workbench.html',
'vs/code/electron-sandbox/workbench/workbench.js'
]);
const src = gulp.src(out + '/**', { base: '.' })
@@ -322,6 +319,7 @@ function packageTask(platform, arch, sourceFolderName, destinationFolderName, op
all = es.merge(all, gulp.src('resources/linux/code.png', { base: '.' }));
} else if (platform === 'darwin') {
const shortcut = gulp.src('resources/darwin/bin/code.sh')
.pipe(replace('@@APPNAME@@', product.applicationName))
.pipe(rename('bin/code'));
all = es.merge(all, shortcut);
@@ -366,10 +364,14 @@ function packageTask(platform, arch, sourceFolderName, destinationFolderName, op
result = es.merge(result, gulp.src('resources/win32/VisualElementsManifest.xml', { base: 'resources/win32' })
.pipe(rename(product.nameShort + '.VisualElementsManifest.xml')));
result = es.merge(result, gulp.src('.build/policies/win32/**', { base: '.build/policies/win32' })
.pipe(rename(f => f.dirname = `policies/${f.dirname}`)));
} else if (platform === 'linux') {
result = es.merge(result, gulp.src('resources/linux/bin/code.sh', { base: '.' })
.pipe(replace('@@PRODNAME@@', product.nameLong))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@APPNAME@@', product.applicationName))
.pipe(rename('bin/' + product.applicationName)));
}
@@ -523,7 +525,7 @@ gulp.task(task.define(
gulp.task('vscode-translations-pull', function () {
return es.merge([...i18n.defaultLanguages, ...i18n.extraLanguages].map(language => {
let includeDefault = !!innoSetupConfig[language.id].defaultInfo;
const includeDefault = !!innoSetupConfig[language.id].defaultInfo;
return i18n.pullSetupXlfFiles(apiHostname, apiName, apiToken, language, includeDefault).pipe(vfs.dest(`../vscode-translations-import/${language.id}/setup`));
}));
});

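The hunk above also tidies the checksum helpers used by the packaging task: each shipped entry point is hashed with MD5 and the digests are keyed by relative path. A minimal sketch (the hunk is truncated right after digest('base64'), so any further normalisation of the digest string is omitted here):

const crypto = require('crypto');
const fs = require('fs');
const path = require('path');

// Compute an MD5 digest of one file, base64-encoded, as in the hunk above.
function computeChecksum(filename) {
  const contents = fs.readFileSync(filename);
  return crypto.createHash('md5').update(contents).digest('base64');
}

// Map each filename (relative to `out`) to its checksum.
function computeChecksums(out, filenames) {
  const result = {};
  for (const filename of filenames) {
    result[filename] = computeChecksum(path.join(process.cwd(), out, filename));
  }
  return result;
}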
View File

@@ -16,6 +16,9 @@ const task = require('./lib/task');
const packageJson = require('../package.json');
const product = require('../product.json');
const rpmDependenciesGenerator = require('./linux/rpm/dependencies-generator');
const debianDependenciesGenerator = require('./linux/debian/dependencies-generator');
const sysrootInstaller = require('./linux/debian/install-sysroot');
const debianRecommendedDependencies = require('./linux/debian/dep-lists').recommendedDeps;
const path = require('path');
const root = path.dirname(__dirname);
const commit = util.getVersion(root);
@@ -75,12 +78,16 @@ function prepareDebPackage(arch) {
let size = 0;
const control = code.pipe(es.through(
function (f) { size += f.isDirectory() ? 4096 : f.contents.length; },
function () {
async function () {
const that = this;
const sysroot = await sysrootInstaller.getSysroot(debArch);
const dependencies = debianDependenciesGenerator.getDependencies(binaryDir, product.applicationName, debArch, sysroot);
gulp.src('resources/linux/debian/control.template', { base: '.' })
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@VERSION@@', packageJson.version + '-' + linuxPackageRevision))
.pipe(replace('@@ARCHITECTURE@@', debArch))
.pipe(replace('@@DEPENDS@@', dependencies.join(', ')))
.pipe(replace('@@RECOMMENDS@@', debianRecommendedDependencies.join(', ')))
.pipe(replace('@@INSTALLEDSIZE@@', Math.ceil(size / 1024)))
.pipe(rename('DEBIAN/control'))
.pipe(es.through(function (f) { that.emit('data', f); }, function () { that.emit('end'); }));
@@ -212,7 +219,7 @@ function buildRpmPackage(arch) {
return shell.task([
'mkdir -p ' + destination,
'HOME="$(pwd)/' + destination + '" fakeroot rpmbuild -bb ' + rpmBuildPath + '/SPECS/' + product.applicationName + '.spec --target=' + rpmArch,
'HOME="$(pwd)/' + destination + '" rpmbuild -bb ' + rpmBuildPath + '/SPECS/' + product.applicationName + '.spec --target=' + rpmArch,
'cp "' + rpmOut + '/$(ls ' + rpmOut + ')" ' + destination + '/'
]);
}

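The Linux hunk above makes the Debian control step asynchronous: it resolves a sysroot for the target architecture, computes the Depends list from the built binaries, and substitutes the results into resources/linux/debian/control.template. A stand-alone sketch with the same call shapes (the ./linux/debian modules are in-tree helpers; their signatures are taken from the hunk, not from a published package):

const gulp = require('gulp');
const replace = require('gulp-replace');
const rename = require('gulp-rename');
// In-tree modules; call shapes assumed from the hunk above.
const sysrootInstaller = require('./linux/debian/install-sysroot');
const debianDependenciesGenerator = require('./linux/debian/dependencies-generator');
const debianRecommendedDependencies = require('./linux/debian/dep-lists').recommendedDeps;

// Hypothetical stand-alone version of the control-file step.
async function buildControlFile(binaryDir, applicationName, version, debArch, installedSizeKb) {
  const sysroot = await sysrootInstaller.getSysroot(debArch);
  const dependencies = debianDependenciesGenerator.getDependencies(binaryDir, applicationName, debArch, sysroot);
  return gulp.src('resources/linux/debian/control.template', { base: '.' })
    .pipe(replace('@@NAME@@', applicationName))
    .pipe(replace('@@VERSION@@', version))
    .pipe(replace('@@ARCHITECTURE@@', debArch))
    .pipe(replace('@@DEPENDS@@', dependencies.join(', ')))
    .pipe(replace('@@RECOMMENDS@@', debianRecommendedDependencies.join(', ')))
    .pipe(replace('@@INSTALLEDSIZE@@', String(installedSizeKb)))
    .pipe(rename('DEBIAN/control'));
}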
View File

@@ -208,7 +208,7 @@ function packageTask(sourceFolderName, destinationFolderName) {
gulp.src('resources/server/code-512.png', { base: 'resources/server' })
);
let all = es.merge(
const all = es.merge(
packageJsonStream,
license,
sources,
@@ -218,7 +218,7 @@ function packageTask(sourceFolderName, destinationFolderName) {
pwaicons
);
let result = all
const result = all
.pipe(util.skipDirectories())
.pipe(util.fixWin32DirectoryPermissions());

View File

@@ -53,7 +53,7 @@ function hygiene(some, linting = true) {
const m = /([^\t\n\r\x20-\x7E⊃⊇✔✓🎯⚠🛑🔴🚗🚙🚕🎉✨❗⇧⌥⌘×÷¦⋯…↑↓→→←↔⟷·•●◆▼⟪⟫┌└├⏎↩√φ]+)/g.exec(line);
if (m) {
console.error(
file.relative + `(${i + 1},${m.index + 1}): Unexpected unicode character: "${m[0]}". To suppress, use // allow-any-unicode-next-line`
file.relative + `(${i + 1},${m.index + 1}): Unexpected unicode character: "${m[0]}" (charCode: ${m[0].charCodeAt(0)}). To suppress, use // allow-any-unicode-next-line`
);
errorCount++;
}
@@ -115,8 +115,8 @@ function hygiene(some, linting = true) {
})
.then(
(result) => {
let original = result.src.replace(/\r\n/gm, '\n');
let formatted = result.dest.replace(/\r\n/gm, '\n');
const original = result.src.replace(/\r\n/gm, '\n');
const formatted = result.dest.replace(/\r\n/gm, '\n');
if (original !== formatted) {
console.error(

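Two small things in the hygiene hunk: the unicode check now prints the offending character's charCode to make invisible characters easier to track down, and the formatting comparison normalises CRLF on both sides before diffing. A tiny sketch of the latter:

// Normalise line endings before comparing the original source with the formatter output,
// so CRLF-only differences are not reported as formatting errors.
function isFormatted(originalSource, formattedSource) {
  const original = originalSource.replace(/\r\n/gm, '\n');
  const formatted = formattedSource.replace(/\r\n/gm, '\n');
  return original === formatted;
}

// isFormatted('const a = 1;\r\n', 'const a = 1;\n') === true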
View File

@@ -1,15 +0,0 @@
{
"compilerOptions": {
"module": "commonjs",
"target": "es2017",
"jsx": "preserve",
"checkJs": true
},
"include": [
"**/*.js"
],
"exclude": [
"node_modules",
"**/node_modules/*"
]
}

View File

@@ -1,8 +1,8 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
exports.createAsar = void 0;
const path = require("path");
@@ -81,7 +81,7 @@ function createAsar(folderPath, unpackGlobs, destFilename) {
out.push(file.contents);
}
}, function () {
let finish = () => {
const finish = () => {
{
const headerPickle = pickle.createEmpty();
headerPickle.writeString(JSON.stringify(filesystem.header));

View File

@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as path from 'path';
import * as es from 'event-stream';
const pickle = require('chromium-pickle-js');
@@ -98,7 +96,7 @@ export function createAsar(folderPath: string, unpackGlobs: string[], destFilena
}
}, function () {
let finish = () => {
const finish = () => {
{
const headerPickle = pickle.createEmpty();
headerPickle.writeString(JSON.stringify(filesystem.header));

View File

@@ -45,8 +45,7 @@ function isUpToDate(extension) {
}
}
function syncMarketplaceExtension(extension) {
var _a;
const galleryServiceUrl = (_a = productjson.extensionsGallery) === null || _a === void 0 ? void 0 : _a.serviceUrl;
const galleryServiceUrl = productjson.extensionsGallery?.serviceUrl;
const source = ansiColors.blue(galleryServiceUrl ? '[marketplace]' : '[github]');
if (isUpToDate(extension)) {
log(source, `${extension.name}@${extension.version}`, ansiColors.green('✔︎'));
@@ -98,12 +97,12 @@ function writeControlFile(control) {
fs.writeFileSync(controlFilePath, JSON.stringify(control, null, 2));
}
function getBuiltInExtensions() {
log('Syncronizing built-in extensions...');
log('Synchronizing built-in extensions...');
log(`You can manage built-in extensions with the ${ansiColors.cyan('--builtin')} flag`);
const control = readControlFile();
const streams = [];
for (const extension of [...builtInExtensions, ...webBuiltInExtensions]) {
let controlState = control[extension.name] || 'marketplace';
const controlState = control[extension.name] || 'marketplace';
control[extension.name] = controlState;
streams.push(syncExtension(extension, controlState));
}

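The last hunk shown replaces the compiler-generated `(_a = ...) === null || _a === void 0 ? ... : ...` pattern with native optional chaining when reading the gallery service URL. The two forms are equivalent; a throwaway sketch with a hypothetical product shape:

const productjson = { extensionsGallery: undefined }; // hypothetical shape, for illustration only

// Down-levelled output previously checked in:
var _a;
const oldStyle = (_a = productjson.extensionsGallery) === null || _a === void 0 ? void 0 : _a.serviceUrl;

// Native optional chaining now used directly:
const newStyle = productjson.extensionsGallery?.serviceUrl;

console.log(oldStyle === newStyle); // true: both are undefined when extensionsGallery is missing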
Some files were not shown because too many files have changed in this diff.