Compare commits

193 Commits

Author SHA1 Message Date
Hai Cao
31bee67f00 bump STS (#23148) 2023-05-16 14:40:54 -07:00
Alan Ren
e2949d4494 add securable settings (#22936) (#23141)
* wip

* Update typings

* nullable

* update test service

* support securables

* update test data

* fix issues

* fix build failure

* update test mocks

* fix typo

* fix reference

* fix findobjectdialog issue

* update SearchResultItem type

* fix table component perf issue

* hide effective permission for server role

* hide effective permission for app role and db role

* vbump sts and fix a couple issues

* STS update and UI update

* fix user login display issue

* vbump sts
2023-05-15 15:35:47 -07:00
Cheena Malhotra
b6bd726066 Support advanced options in command line arguments (#23104) (#23124) 2023-05-12 13:51:15 -07:00
Benjin Dubishar
0fe638974d Fixing issue where sqlcmdvars wouldn't load from publish profile in ADS (#23116) (#23121)
* fixing issue where sqlcmdvars wouldn't load from publish profile in ADS

* in -> of
2023-05-12 13:50:26 -07:00
Benjin Dubishar
f364e52079 Fix deploy/generatePlan/saveProfile when no sqlcmdvars are defined (#23112) (#23120)
* fix deploy/generate when no sqlcmdvars are defined

* saveProfile
2023-05-12 13:50:02 -07:00
Maddy
84143d3caf remove the access point (#23105) (#23114) 2023-05-11 19:31:57 -07:00
Kim Santiago
0cfaf69647 fix sqlcmd variables not getting loaded correctly in vscode (#23103) (#23108) 2023-05-11 19:31:34 -07:00
Aasim Khan
343d878457 Adding telemetry to ads OE filter (#23089) (#23106)
* Adding telemetry to ads oe filter

* Fixing prop names

* fixing prop name

* Fixing localized strings

* Update src/sql/azdata.proposed.d.ts

* Update src/sql/workbench/contrib/objectExplorer/browser/serverTreeView.ts

---------

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
2023-05-11 19:31:07 -07:00
Cheena Malhotra
7d0cfc4f99 Support migrating credentials to new format (#23088) (#23095) 2023-05-11 18:41:11 -07:00
Barbara Valdez
f13406570e update workbench file and fix relative link not working in markdown (#23109) (#23113) 2023-05-11 16:52:05 -07:00
Benjin Dubishar
c44ecf56f2 Fixing bug where SQLCMD vars weren't getting JSONified (#23082) (#23096)
* changing param for sqlcmdvars back to Record since Json.stringify doesn't handle Maps

* swapping over savePublishProfile
2023-05-10 23:12:12 -07:00
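The commit above hinges on a JavaScript quirk: `JSON.stringify` serializes a `Map` as an empty object, so a plain `Record` is needed when the value must survive serialization. A minimal sketch of the behavior (variable names are illustrative, not from the ADS codebase):

```typescript
// JSON.stringify ignores Map entries: any Map serializes to "{}".
const asMap = new Map<string, string>([["Env", "prod"], ["Schema", "dbo"]]);
console.log(JSON.stringify(asMap)); // "{}"

// A plain Record (object) round-trips as expected.
const asRecord: Record<string, string> = { Env: "prod", Schema: "dbo" };
console.log(JSON.stringify(asRecord)); // {"Env":"prod","Schema":"dbo"}

// One way to bridge the two: convert the Map to a Record before stringifying.
function mapToRecord(m: Map<string, string>): Record<string, string> {
  return Object.fromEntries(m.entries());
}
console.log(JSON.stringify(mapToRecord(asMap))); // {"Env":"prod","Schema":"dbo"}
```

This is why code that keeps `Map` internally often converts to a `Record` at the serialization boundary rather than passing the `Map` to `JSON.stringify` directly.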
Aasim Khan
f10fc9b56d Adding back OE filtering in May release. (#23070) 2023-05-10 08:41:26 -07:00
Benjin Dubishar
72acd2af83 Bumping SQL Tools Service (#23046) (#23051) 2023-05-09 14:09:15 -07:00
Cheena Malhotra
6112eabc3c Fixes import wizard to work with enabled SQL auth provider (#23004) (#23052) 2023-05-09 11:12:19 -07:00
Lewis Sanchez
27bac701bb Use notification prompt before running a command (#23035) (#23048)
* Use notification prompt before running a command

https://github.com/microsoft/vscode/pull/179702/commits

* Removes unused declarations
2023-05-09 11:10:46 -07:00
Cheena Malhotra
0bcd010d9a Fixes build folder compilation + enable linux .deb files (#23006) (#23042) 2023-05-09 11:10:14 -07:00
Aasim Khan
bca671bc3f disabling async tree by default (#23037) 2023-05-09 11:09:33 -07:00
Aasim Khan
706ba6c974 Adding filtering dialog and action to OE (#22937) (#23036)
* Adding init change

* Adding filter cache in OE

* Adding more filtering changes

* Fixed stuff with dialog

* Fixing filter

* Adding support for connections

* Fixed stuff

* filtering

* Fixing date

* Filters

* Removing is filtering supported

* Removing contracts

* Fixing filters

* Fixing cache

* Adding some accessibility changes

* Reverting some more changes to pull in changes from the main

* Adding comments

* Fixing boolean operators

* Fixing stuff

* Fixing stuff

* Fixing error handling and making dialog generic

* Fixing more stuff

* Making filter a generic dialog

* adding erase icon

* removing floating promises

* Fixing compile issue

* Adding support for choice filter with different and actual value.

* Adding null checks

* Adding durability type fix

* Fixing filtering for providers that do not play well with empty filter properties
2023-05-09 11:07:03 -07:00
Kim Santiago
8e38295691 vbump sql projects to 1.1.1 (#23029) (#23039) 2023-05-08 16:44:44 -07:00
Alan Ren
a97d882e3c fix #174264 (#174845) (#23027) (#23034)
Co-authored-by: Sandeep Somavarapu <sasomava@microsoft.com>
2023-05-08 15:49:00 -07:00
Aasim Khan
f605a3c644 Fixing node retention after disconnect (#23020) 2023-05-08 10:13:53 -07:00
Aasim Khan
6fa948adbe Fixing delete server group error when we try to focus on root (#23019) 2023-05-08 09:46:40 -07:00
Alex Hsu
4b9147a6a0 Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230508154231869. (#23026) 2023-05-08 09:23:56 -07:00
Raymond Truong
6684dbb78c [SQL Migration] Add storage/VM connectivity validation (#22982)
* Implement storage account connectivity check for SQL VM targets

* Add missing break statement

* Address PR comments
2023-05-08 08:42:40 -07:00
Aasim Khan
3ae97b81f2 Fixing the error showing up in console log (#23018) 2023-05-08 08:29:39 -07:00
Alex Ma
ff3f6d53c7 Langpack source update for May release (#23014) 2023-05-07 19:58:41 -07:00
Benjin Dubishar
1620b3b374 Fixing deleteDatabaseReference test for vscode-mssql (#23008) 2023-05-07 13:08:04 -07:00
Alan Ren
688b0c0819 add label for database dropdown (#23015) 2023-05-07 10:27:30 -07:00
Alan Ren
718b149e84 introduce fieldset component (#23005) 2023-05-06 21:36:08 -07:00
Sakshi Sharma
feed449d97 Update default folder structure option in VSCode (#23002) 2023-05-05 16:50:32 -07:00
Cheena Malhotra
77c8b3bda1 Validate MSAL library is enabled (#23000) 2023-05-05 16:44:04 -07:00
Benjin Dubishar
127a2d2e2f Updating SqlProjects readme to have absolute github link for image (#23001) 2023-05-05 15:53:02 -07:00
Alex Ma
898bb73a34 Revert new connection string format (#22997) 2023-05-05 13:41:40 -07:00
Cory Rivera
27e0d67dec Add context menu entries for deleting a database (#22948) 2023-05-05 12:12:35 -07:00
Cheena Malhotra
0dc05a6a4c Hide tenant dropdown from Connection Dialog (#22973) 2023-05-05 10:40:00 -07:00
Cory Rivera
876a4a24f6 Bump SQL Tools dependency to 4.7.0.28 (#22983) 2023-05-05 10:39:34 -07:00
Charles Gagnon
c3bf85f026 Create separate ScriptableDialogBase (#22974)
* Create separate ScriptableDialogBase

* more
2023-05-05 09:17:51 -07:00
Cheena Malhotra
9af7a049e6 Revert build folder update to fix it properly (#22981)
* Revert "Disable publishing crash reports temporarily (#22950)"

This reverts commit 13a791d14e.

* Revert "Compile build folder (#22811)"

This reverts commit 2c07c09d0d.
2023-05-05 08:50:23 -07:00
Alan Ren
70e756b82d fix duplicate required indicator (#22976) 2023-05-04 21:53:42 -07:00
Alan Ren
b37df9b6ad fix tab style in hc mode (#22975) 2023-05-04 21:53:24 -07:00
Charles Gagnon
88197a5162 Fix validation errors in package.json when clause (#22972) 2023-05-04 16:19:52 -07:00
Aasim Khan
302855e4a4 Fixing async server tree error handling and removing timeout. (#22955)
* Fixing async server tree issues and removing timeout

* removing empty results for connection errors

* Fixing error message fetching

* Update src/sql/workbench/services/objectExplorer/browser/asyncServerTreeDataSource.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

---------

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
2023-05-04 15:50:37 -07:00
Alan Ren
86cd0003fe add required indicator (#22971) 2023-05-04 15:23:22 -07:00
Benjin Dubishar
94745a69f5 Bumping Tools Service dependency (#22963) 2023-05-04 11:45:00 -07:00
Benjin Dubishar
8a56e0c0cd Bumping azdata dependency (#22961) 2023-05-04 11:33:43 -07:00
Alan Ren
8a5387d97a remove the rename db context menu (#22962) 2023-05-04 10:52:07 -07:00
Alex Hsu
47893e005c Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230504154225133. (#22960) 2023-05-04 09:29:04 -07:00
Alex Ma
e76e6df49c [Loc] final update to localization before code complete (#22956) 2023-05-03 17:16:12 -07:00
Cheena Malhotra
13a791d14e Disable publishing crash reports temporarily (#22950) 2023-05-03 16:20:01 -07:00
Charles Gagnon
c6549dd6f4 Remove vs code engine version for whoisactive extension (#22952) 2023-05-03 16:07:00 -07:00
Karl Burtram
1ddcce5a75 Bump STS to 4.7.0.26 for User w Login fix (#22934) 2023-05-03 14:10:30 -07:00
AkshayMata
ced2f7938f Bump sql-migration version (#22946)
Co-authored-by: Akshay Mata <akma@microsoft.com>
2023-05-03 13:43:43 -07:00
Raymond Truong
0d2ed6e517 [SQL Migration] Improve IR registration experience (#22926)
* Add registration instructions to IR page

* Clean up

* Typo

* Fix typo

* Replace link with aka.ms link

* Refactor + implement regenerate auth keys

* Update strings and clean up comments

* Fix sqlMigrationServiceDetailsDialog

* Fix sqlMigrationServiceDetailsDialog width

* Extract helpers to utils

* Add IR registration instructions to sqlMigrationServiceDetailsDialog

* Update SHIR description slightly
2023-05-03 16:43:00 -04:00
Kim Santiago
844ed758a5 Fix .publish.xml file extension not being used on mac (#22939) 2023-05-03 13:18:42 -07:00
Kim Santiago
8f37ea8746 fix macros not getting replaced in sql project item scripts (#22945) 2023-05-03 13:18:02 -07:00
Sakshi Sharma
55d652198c Fix schema comparison failure for Azure synapse (#22938) 2023-05-03 13:03:40 -07:00
AkshayMata
a8a88ccbeb [SQL-Migration] Improve log migrations telemetry (#22927)
- Bucketized errors to track top errors
- Created separate login migration specific error to improve monitoring

---------

Co-authored-by: Akshay Mata <akma@microsoft.com>
2023-05-03 10:35:44 -07:00
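The telemetry notes above describe bucketizing raw error messages so the top failure classes can be ranked. A generic sketch of the idea — the bucketing rules here are invented for illustration and are not the actual SQL Migration implementation:

```typescript
// Illustrative error bucketing: collapse variable details (numbers,
// quoted values) into a stable bucket key so identical failure classes
// aggregate under one counter.
function bucketizeError(message: string): string {
  return message
    .replace(/\d+/g, "<n>")          // numbers -> placeholder
    .replace(/'[^']*'/g, "<value>")  // quoted values -> placeholder
    .toLowerCase();
}

const counts = new Map<string, number>();
for (const msg of [
  "Login failed for user 'sa'",
  "Login failed for user 'admin'",
  "Timeout after 30 seconds",
]) {
  const key = bucketizeError(msg);
  counts.set(key, (counts.get(key) ?? 0) + 1);
}
console.log(counts.get("login failed for user <value>")); // 2
```

Sorting the map by count then surfaces the most frequent error classes for monitoring.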
Cheena Malhotra
2c07c09d0d Compile build folder (#22811)
* Compile build folder

* Fix build compile issues (#22813)

* Revert changes

* Update gulp-shell

* Test

* Update

* Update modifiers

* Try reverting minimist

* Generates linux deb artifact (#22922)

* Remove deb files that were brought in with the latest merge.

* Add debian back to linux gulp file

* Remove async from anonymous function.

* Remove run core integration tests build step in pipeline

* Revert "Remove async from anonymous function."

This reverts commit 7ad1ce2942954fce58939b9965343b46b9311a7e.

* Revert "Add debian back to linux gulp file"

This reverts commit 96b7c0f0995c8024ef67ed886da34255a5caa325.

* Revert "Remove deb files that were brought in with the latest merge."

This reverts commit bf3aae233b8da1f9111a149a96d77cc78d376094.

* Removes dependency checks

* Fix dependency gen errors

* Reenable "Build Deb" step

* Reenable publish deb

* Run core integration tests

* Revert "Run core integration tests"

This reverts commit 7cafbada194feebe771862af796fb3416b5dd686.

* Revert "Try reverting minimist"

This reverts commit 38fd843c1d5c33318a92f4bbc7057e951c5a9f71.

* Disable code coverage step in the interim

---------

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
Co-authored-by: Lewis Sanchez <87730006+lewis-sanchez@users.noreply.github.com>
2023-05-02 19:32:46 -07:00
Alex Ma
32c99ab1a6 [Loc] Localization update for 5-2-2023 (Second to last before code complete) (#22940) 2023-05-02 17:36:20 -07:00
Kim Santiago
ab44d205d0 add telemetry for saving publish profiles (#22933) 2023-05-02 16:09:18 -07:00
Kim Santiago
e57fc9f202 shorten 'Save Profile as...' button text to 'Save as...' (#22932)
* shorten 'Save Profile as...' button text to 'Save as...'

* uppercase
2023-05-02 15:37:16 -07:00
Raymond Truong
6de9c5e1ae [SQL Migration] Add support for assessing XEvent session files (#22210)
* Template

* Refactor

* Update strings

* Clean up

* Add clear button

* Clean up

* Fix typo and use aka.ms link

* Refactor to use GroupContainer

* Remove dialog and clean up common strings

* Fix previous/forward behavior

* Make group container default to collapsed

* Clean up

* Slightly reword string

* Add https to aka.ms link
2023-05-02 10:45:13 -04:00
Cheena Malhotra
1f3a514c90 Add launch config for VS Code selfhost test provider (#22913) 2023-05-01 20:27:37 -07:00
Alex Ma
7edd48fb11 added new password reset number (#22920) 2023-05-01 18:28:28 -07:00
Alex Ma
88c9e0b492 [Loc] update to XLF on 5-1-2023 prior to code complete (#22921) 2023-05-01 17:20:51 -07:00
Kim Santiago
a56109dad7 vbump data workspace (#22914) 2023-05-01 15:50:31 -07:00
Charles Gagnon
457365537c Add License section to extension READMEs (#22912)
* Add License section to extension READMEs

* vbump
2023-05-01 13:51:51 -07:00
Charles Gagnon
a650f268f7 Remove sqlservernotebook extension (#22903) 2023-05-01 13:44:14 -07:00
Alan Ren
6a7899281a remove auth type from user (#22905)
* remove auth type from user

* vbump sts
2023-05-01 13:29:55 -07:00
AkshayMata
af6f9089f7 [SQL-Migration] Improve error message for failed migration service download (#22846)
Add troubleshooting links to error message when SQL-Migration fails to download the MigrationService as seen in this issue: #22558

---------

Co-authored-by: Akshay Mata <akma@microsoft.com>
2023-05-01 13:24:35 -07:00
Cheena Malhotra
7b2a07befd Respect 'showDashboard' disabled by default (#22907) 2023-05-01 11:55:38 -07:00
Cheena Malhotra
ea6bb41f45 Allow 'ApplicationName' to be specified for MSSQL connections (#22890) 2023-05-01 10:55:05 -07:00
Cheena Malhotra
f4952c76b8 Handle no matching account scenario (#22908) 2023-05-01 10:54:42 -07:00
Charles Gagnon
24e67a1cbd Remove publisher link and fix license link (#22904) 2023-05-01 09:44:56 -07:00
Charles Gagnon
519a42c5b3 Fix missing placeholder warning for infoboxes (#22898) 2023-05-01 09:30:02 -07:00
Benjin Dubishar
6b1dd0e468 Changing "files" to "sqlObjectScripts" to be more accurate (#22899)
* Changing "files" to "sqlObjectScripts" to be more accurate

* fixing func call
2023-05-01 08:31:13 -07:00
Aasim Khan
b86463ee71 Adding filtering to OE Service (#22900)
* Adding filtering to OE Service

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>

* Update src/sql/workbench/services/objectExplorer/browser/serverTreeRenderer.ts

Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>

* Update src/sql/workbench/services/objectExplorer/browser/objectExplorerService.ts

Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>

* Fixing compile errors

* Fixing circular dependency

---------

Co-authored-by: Cheena Malhotra <13396919+cheenamalhotra@users.noreply.github.com>
2023-04-29 17:22:38 -04:00
Cheena Malhotra
e26937b101 Touch up MSAL errors (#22906) 2023-04-28 21:22:26 -07:00
Cheena Malhotra
ed8149599c Update error for ignored tenants (#22881) 2023-04-28 16:19:29 -07:00
Benjin Dubishar
29ff6ca16c Adding Move, Exclude, and Rename support for folders (#22867)
* Adding exclude folder and base for move folder

* checkpoint

* rename

* Fixing up tests

* Adding exclude test to projController

* Adding tests

* fixing order of service.moveX() calls

* Updating move() order in sqlproj service

* PR feedback

* unskipping

* reskipping test

* Fixing folder move conditional

* updating comments
2023-04-28 16:05:38 -07:00
Kim Santiago
934d8ff8fa Add nupkg database reference option for sql projects in vscode (#22882)
* add nupkg db ref option in vscode

* add telemetry for nupkg db ref

* update comment
2023-04-28 14:57:22 -07:00
Alan Ren
468b3e4f06 update shortcut keys for parse query (#22896) 2023-04-28 14:45:28 -07:00
Alan Ren
15d26b7f9a fix the sqlservices sample extension (#22893) 2023-04-28 14:45:20 -07:00
Alan Ren
4f53d76eb5 User Management - Support new object types: Server Role, Application Role and Database Role (#22889)
* server role dialogs

* dialogs for other types

* refactor

* find object dialog

* script button

* refactoring

* fix issues

* fix title

* vbump sts

* remove language from links
2023-04-28 12:05:20 -07:00
Kim Santiago
ba09248483 fix build errors from sql projects test file (#22894) 2023-04-28 11:57:54 -07:00
Sakshi Sharma
757067b132 Required changes to make sql projects extension work with vscode (#22847)
* Update CreateProject api

* More updates

* Fix a few comments

* Address comments

* Remove package.json changes

* Fix error

* Fix testUtil
2023-04-28 10:27:59 -07:00
Cheena Malhotra
c04b8af1d2 Prompt user to refresh account credentials for AADSTS70043 and AADSTS50173 error codes (#22853) 2023-04-27 20:44:22 -07:00
Lewis Sanchez
942786c2a7 Enables previously skipped tests (#22884) 2023-04-27 17:20:12 -07:00
brian-harris
fe32180c71 SQL-Migration: enable cross subscription service migration (#22876)
* x subscription migration support

* refresh after cutover

* fix service irregular status load behavior

* queue service status requests, fix typo

* add migationTargetServerName helper method

* save context before api call
2023-04-27 16:28:32 -07:00
Charles Gagnon
65f8915b7e Remove azurehybridtoolkit extension (#22879) 2023-04-27 13:56:14 -07:00
Karl Burtram
109d428d8c Bump STS to pickup latest User management updates (#22880) 2023-04-27 11:32:22 -07:00
Aasim Khan
9c68043137 Adding OE filtering interfaces. (#22738)
* Adding interfaces for tree filtering

* fixing type

* Adding enums in sqlhost

* Fixing comment

* Fixed filters lol

* Fixing some contract definitions

* Removing flag

* Update src/sql/azdata.proposed.d.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Removing filters

* Removing filters2

* Fixing enum

* Fixing interface and enum names

* Fixing interface name

* Update src/sql/azdata.proposed.d.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Removing type from filter

* Adding is null and is not null operators

* Fixing enums

* Update src/sql/azdata.proposed.d.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Update src/sql/azdata.proposed.d.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* Update src/sql/azdata.proposed.d.ts

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

---------

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
2023-04-27 13:15:11 -04:00
Alex Hsu
c4b19597b6 Juno: check in to lego/hb_04604851-bac4-4681-9f74-73de611d6e48_20230427154831319. (#22874) 2023-04-27 09:38:01 -07:00
Benjin Dubishar
62255fe4dd no longer filtering to well-known database sources (#22864) 2023-04-26 13:31:15 -07:00
Kim Santiago
abff6a0a34 fix schema compare diff editor colors not being reversed after merge (#22852) 2023-04-26 09:25:07 -07:00
Lewis Sanchez
1fcba44772 Removes package-lock.json (#22855) 2023-04-25 21:49:42 -07:00
Charles Gagnon
5ba8369cb0 Fix Azure rest calls not working (#22854) 2023-04-25 19:29:32 -07:00
Kim Santiago
d551f5170d add telemetry for add database reference quickpick (#22848) 2023-04-25 13:52:48 -07:00
Cheena Malhotra
24af5db4a2 Fix resetting tenant when dropdown is not visible (#22845) 2023-04-25 12:53:58 -07:00
Charles Gagnon
64dd4f0904 Fix IConnectionProfile to have same options key when generated from ConnectionProfile (#22840) 2023-04-25 09:52:51 -07:00
Cheena Malhotra
a887bb199e Fix username to respect existing values (#22837) 2023-04-24 15:28:44 -07:00
Charles Gagnon
167ef2fea8 Finish up no-unsafe-assignment fixes in azurecore (#22836)
* Finish up no-unsafe-assignment fixes in azurecore

* Add link to typings

* Remove unused

* Ignore webpack file
2023-04-24 14:13:00 -07:00
Kim Santiago
8616c5948b Update vscode-mssql.d.ts to match what's in the vscode-mssql repo (#22830)
* update vscode-mssql.d.ts

* update extensions that need updates because of the vscode-mssql.d.ts changes

* remove skip

* fix sql projects tests failing because vscode-mssql couldn't be found
2023-04-24 13:40:05 -07:00
Alex Ma
c42418d89c Update to langpack base source file, and fix to xlf generation. (#22829)
* update to build lib and langpack base files

* fix for gulp localization xlf task and xlf update

* fix to yarn
2023-04-24 11:33:13 -07:00
Charles Gagnon
7f388dd420 Add azurecore HTTP typings (#22828)
* Add azurecore HTTP typings

* undo + spelling fix
2023-04-24 10:39:02 -07:00
Erin Stellato
e5e8824d34 Minor updates to readme (#22827)
* Minor updates to readme

* removed extraneous URL

removed ?view=sql-server-ver16 from docs URL
2023-04-24 10:51:44 -04:00
Cheena Malhotra
2247d5de88 Prevent reconnects for mssql provider (#22825) 2023-04-21 16:56:40 -07:00
Benjin Dubishar
c581b0285d Bumping SqlToolsService to 4.7.0.17 (#22823)
* Bumping STS

* Whoops

* Upgrading STS to 4.7.0.17
2023-04-21 16:44:52 -07:00
Cheena Malhotra
db03525b10 Fix launch.json (#22822) 2023-04-21 14:34:05 -07:00
Charles Gagnon
3bd85a5798 Move stringifying of request body for azure REST calls (#22820)
* Move stringifying of request body for azure REST calls

* spelling

* Remove unused
2023-04-21 13:27:11 -07:00
Charles Gagnon
ec81fc89bb Fix CI workflow (#22819) 2023-04-21 13:26:39 -07:00
Charles Gagnon
651f1ed85b Few more no-unsafe-assignment cleanups (#22818) 2023-04-21 11:27:03 -07:00
Kim Santiago
41e6f3b84b fix accessibility issue for open dialog location radio button (#22812)
* fix accessibility issue where location radio button showed as required

* fix ariaLabel
2023-04-21 09:53:02 -07:00
Lewis Sanchez
c8618d39fe Update distro commit hash (#22810) 2023-04-20 21:41:51 -07:00
Cheena Malhotra
f83815cecd Enable Sql Authentication Provider by default (targeting May release) (#22213) 2023-04-20 17:55:56 -07:00
Benjin Dubishar
2142c706b0 Improving error message when projects fail to load (#22786)
* Improving message when project fails to load

* Cleaning up string
2023-04-20 14:00:43 -07:00
Cheena Malhotra
8613176817 Cache access tokens in local cache file to prevent MSAL throttling (#22763) 2023-04-20 13:55:30 -07:00
Karl Burtram
0bdb35d9ab Enable proposed APIs for copilot (#22805) 2023-04-20 12:51:55 -07:00
Benjin Dubishar
a9e359f58f Bumping STS to 4.7.0.15 (#22796)
* Bumping STS

* Whoops
2023-04-20 10:43:28 -07:00
Kim Santiago
b98ac1d211 change value to default value in sqlcmd variable prompt (#22787) 2023-04-20 10:18:48 -07:00
Karl Burtram
a9951e977c Disable DEB publish step (#22795)
* Disable DEB publishing

* Revert "Reenable DEB build (#22794)"

This reverts commit 9132695d61.
2023-04-20 08:35:14 -07:00
Karl Burtram
9132695d61 Reenable DEB build (#22794) 2023-04-20 07:36:05 -07:00
Karl Burtram
e7d3d047ec Merge from vscode merge-base (#22780)
* Revert "Revert "Merge from vscode merge-base (#22769)" (#22779)"

This reverts commit 47a1745180.

* Fix notebook download task

* Remove done call from extensions-ci
2023-04-19 21:48:46 -07:00
Alan Ren
decbe8dded simplify object management feature APIs (#22781) 2023-04-19 19:26:29 -07:00
Raymond Truong
34d092a7dd vbump extension (#22784) 2023-04-19 13:56:49 -07:00
Alan Ren
35829acd00 fix error message steal focus issue (#22782)
* fix steal focus issue

* update comment
2023-04-19 11:39:45 -07:00
Kim Santiago
39a28c5f51 Add nupkg option to add database reference dialog (#22772)
* add nupkg option to add database reference dialog

* Add required indicator

* only show nupkg radio button for SDK-style projects

* fix enable ok button

* hookup

* fix typo
2023-04-19 10:09:15 -07:00
Kim Santiago
c29bb27d9e fix sqlcmd table var table in publish dialog (#22770)
* fix sqlcmd var table in publish dialog

* const

* remove unused _value
2023-04-19 09:17:34 -07:00
Cheena Malhotra
ba694a0558 Register mysql, pgsql with mssql as default 'sql' kernel providers (#22729) 2023-04-18 22:07:02 -07:00
Karl Burtram
47a1745180 Revert "Merge from vscode merge-base (#22769)" (#22779)
This reverts commit 6bd0a17d3c.
2023-04-18 21:44:05 -07:00
Lewis Sanchez
6bd0a17d3c Merge from vscode merge-base (#22769)
* Merge from vscode merge-base

* Turn off basic checks

* Enable compilation, unit, and integration tests
2023-04-18 18:28:58 -07:00
Cheena Malhotra
6186358001 Rename 'body' to 'data' to prevent breaking change (#22761) 2023-04-18 18:04:51 -07:00
Kim Santiago
4709eab293 make UserDatabaseReferenceProjectEntry class (#22768) 2023-04-18 13:23:09 -07:00
Kim Santiago
2dcbdc9c63 Handle nupkg database references in project.ts (#22762)
* changes in project.ts for adding nupkg database references

* Add tests

* more tests

* fix comment

* remove it.only
2023-04-18 11:11:42 -07:00
Alex Ma
b69e87df15 Connection URI with complete options (finalized) (#22735)
* Connection URI made to include every option available instead of basic details (#22045)

* Revert "Merge remote-tracking branch 'origin' into feat/connectionUri"

This reverts commit 11b2d31bf99e216daee823f732254f69a017fee1, reversing
changes made to 36e4db8c0744f81565efdfd2f56a3ae3c0026896.

* Revert "Revert "Merge remote-tracking branch 'origin' into feat/connectionUri""

This reverts commit f439673c2693e1144c52e04c14e82cd8566c13a6.

* Added changes and fixes for feat connectionuri (#22706)

* add title generation at start

* added await to refreshConnectionTreeTitles
2023-04-18 11:08:48 -07:00
Karl Burtram
a9bc34acf0 Bump STS to 4.7.0.12 for Login script fix (#22766) 2023-04-18 09:37:29 -07:00
Karl Burtram
48cc5e53bd Bump STS to 4.7.0.12 for Login script fix (#22766) 2023-04-18 09:36:57 -07:00
Aasim Khan
fe086dc778 Fixing adding connections on a new ads install (#22760)
* Fixing stuff

* Fixing connection groups as well
2023-04-18 08:39:53 -07:00
Aasim Khan
b7d24dcecd Cleaning up some code in Async Server Tree. (#22732) 2023-04-17 23:23:41 -07:00
Kim Santiago
938a8bffbe Add AddNugetPackageReferenceRequest for sql projects (#22757)
* Add AddNugetPackageReferenceRequest for sql projects

* update comment

* fix whitespace
2023-04-17 13:31:17 -07:00
Benjin Dubishar
02e61d1598 Swapping Record usage to Map in SQL Projects (#22758)
* Changing SqlCmdVars from Record to Map

* Converting the rest

* Updating tests

* more cleanup

* Updating test to use new test creation API
2023-04-17 12:56:39 -07:00
Charles Gagnon
4a4580e9ef More azurecore cleanup (#22748)
* More azurecore cleanup

* convert to any

* Fix name
2023-04-17 10:29:49 -07:00
Alan Ren
f66d3349b9 add tooltip for dashboard learn more button (#22744) 2023-04-17 08:19:59 -07:00
Charles Gagnon
2d25c8626f Fix typings (#22747) 2023-04-16 16:36:47 -07:00
Karl Burtram
89797e2e94 Bump STS to 4.7.0.11 (#22746) 2023-04-14 18:45:15 -07:00
Alan Ren
1f57a10f7f auto focus message area (#22743) 2023-04-14 16:30:47 -07:00
Hai Cao
5de4f205b1 Add more description to queryEditor.nullBackground color setting (#22739)
* add desc to disable null bgcolor

* add more desc to queryEditor.nullBackground

* move color outside of the loc string

* comment
2023-04-14 16:22:16 -07:00
Benjin Dubishar
b1c2cc1740 Converting remaining services to use runWithErrorHandling() (#22720)
* Converting remaining services to use `runWithErrorHandling()`

* Updating sqlops-dataprotocolclient to 1.3.3

* upgrading dataprotocol and swapping to that baseService

* Adding async to make thenable -> promise conversion happy

---------

Co-authored-by: Alan Ren <alanren@microsoft.com>
2023-04-14 16:08:07 -07:00
Cheena Malhotra
47bf7efd4a Reset IV/Key if MSAL cache file decryption fails (#22733) 2023-04-14 15:45:37 -07:00
Alan Ren
46bc19bcb8 adjust the default max column width (#22740) 2023-04-14 15:29:12 -07:00
Alan Ren
9456285c65 support scripting in object management dialogs (#22429)
* user management - scripting

* remove confirmation

* update sts

* update string
2023-04-14 13:52:06 -07:00
Sakshi Sharma
d69e5b97df Update SC dialog to save/read file structure to/from schema compare file (#22727)
* Read/Send ExtractTarget information from/to STS

* Remove comment

* Cleanup comment and update azdata dependency
2023-04-14 11:47:59 -07:00
Kim Santiago
18a541b0a6 removed unused data sources code from publish dialog (#22722) 2023-04-14 11:44:46 -07:00
Aasim Khan
8d9ddebd98 Adding restart ads notification when async server tree is toggled (#22726) 2023-04-14 07:54:28 +01:00
Cheena Malhotra
537df7cbac Fix file upload (#22725) 2023-04-13 20:51:03 -07:00
Cheena Malhotra
bb3ddc7364 Fix advanced options for CMS and MSSQL (#22708) 2023-04-13 19:21:42 -07:00
Sakshi Sharma
91ea2b43d6 Save publish profile in Publish UI workflow (#22700)
* Add profile section in Publish project UI

* Move publish profile row below Publish Target

* Add contract for savePublishProfile and SaveProfileAs button functionality

* Make the DacFx contract functional

* Send values from UI to DacFx service call

* Fix build error

* Address comment, remove print statements

* Address comments

* Set correct connection string

* Fix functionality for rename, exclude, delete publish profiles. Add new profile to the tree and sqlproj.

* Address comment to update alignment of button

* Address comments

* Update button to use title casing
2023-04-13 17:08:24 -07:00
Kim Santiago
3deb163210 replace deprecated onDidClick() handler in sql projects (#22721) 2023-04-13 16:27:59 -07:00
Cheena Malhotra
87571b2706 Support placeholder text for connection dialog options (#22693) 2023-04-13 15:08:12 -07:00
Benjin Dubishar
733359de57 Updating yarn lock for ML (#22719) 2023-04-13 14:00:38 -07:00
Cory Rivera
8aade56214 Use column names as keys for table data in SQL cell outputs (#22688) 2023-04-13 13:26:12 -07:00
Benjin Dubishar
8202e1ec4e Changing dacFxService and schemaCompareService to use shared logic for error handling (#22718)
* Move helper function to base class

* Switching DacFxService over

* Thenable -> Promise

* Converting SchemaCompareService
2023-04-13 12:44:13 -07:00
Kim Santiago
35e1d63871 fix project listed twice when using multi root workspaces (#22705)
* fix project listed twice when using multi root workspaces

* uppercase
2023-04-13 09:49:14 -07:00
Kim Santiago
c78cc6e534 vbump sql database projects to 1.1.0 (#22707)
vbump after the April release
2023-04-12 20:27:23 -07:00
Cheena Malhotra
728f70bf6e Append Endpoint suffix (#22704) 2023-04-12 20:24:52 -07:00
Christopher Suh
897f026d1e Firewall Dialog Account/Tenant Selection (#22695)
* preselect account in firewall dialog from connection details

* cleanup

* fix element reference

* add initial tenant selection

* fix compile & cleanup

* pr comments

* change to private
2023-04-12 15:59:31 -07:00
Christopher Suh
62c2fd3290 add ms graph endpoint for us gov (#22574) 2023-04-12 14:58:58 -07:00
erpett
0310b132a9 Update changelog for 1.43 and update readme with evergreen links (#22699) 2023-04-12 14:37:13 -07:00
Sakshi Sharma
12011209f3 Update STS to bring in save publish profile changes (and others) (#22684) 2023-04-12 12:32:54 -07:00
Charles Gagnon
df88d881c5 Remove disposable from connection (#22687)
* Remove disposable from connection

* Remove from group
2023-04-11 15:27:01 -07:00
Cheena Malhotra
219bdabfb2 Upgrade xml2js to v0.5.0 + migration to @vscode/vsce + migration to @azure/storage-blob (#22664) 2023-04-11 15:13:43 -07:00
Aasim Khan
577d99e790 Fixing dup connections (#22670) 2023-04-11 14:16:59 -07:00
Benjin Dubishar
4d3d74e3da Fix for issue where bulk-adding scripts could perform a rescan after each script (#22665)
* Changed bulk script add to delay reloading file list until end of operation.

* Adding style name to sqlproj typing file

* vBump to 1.0.1
2023-04-10 16:28:53 -07:00
Christopher Suh
6857718cb3 Fix http request format (#22618)
* fix http request format

* encode to utf 8 and add body for put requests

* encode proxy request body

* change ?.data to ?.body

* add note for textencoder

* change content-type to application/json
2023-04-10 14:36:35 -07:00
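The bullets above describe UTF-8-encoding the request body and labeling it `application/json` via `TextEncoder`. A hedged sketch of that pattern — the `prepareRequest` helper is hypothetical, not the actual ADS proxy code:

```typescript
// Hypothetical helper illustrating the pattern from the commit: serialize
// the body to JSON, encode the string to UTF-8 bytes with TextEncoder,
// and attach an application/json content type.
function prepareRequest(body: unknown): { headers: Record<string, string>; payload: Uint8Array } {
  const json = JSON.stringify(body);
  const payload = new TextEncoder().encode(json); // UTF-8 bytes
  return {
    headers: { "Content-Type": "application/json" },
    payload,
  };
}

const req = prepareRequest({ name: "résumé" });
// For non-ASCII content the byte length exceeds the string length,
// which is why explicit encoding matters for Content-Length and transport.
console.log(req.payload.length);
```

Sending the raw string instead of UTF-8 bytes can corrupt non-ASCII payloads or mismatch the declared content length, which is the class of bug the commit addresses.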
Christopher Suh
e7a5ea6ee8 Add No Resources Found label under azure browse (#22660)
* Add No Resources Found label under azure browse

* update localize message

* add period
2023-04-10 10:27:01 -07:00
Raymond Truong
b6f1b949d7 [SQL Migration] Miscellaneous UI improvements from feedback (#22592)
* Hide more info for assessment issues without help links

* Add info box about blob container folders

* WIP - reuse create DMS dialog for IR registration

* Revert "Add info box about blob container folders"

This reverts commit 30b8892ea7918841a6466b59058181062d367ba5.

* Add help link to target platform selection page explaining Azure SQL offerings

* Revert "WIP - reuse create DMS dialog for IR registration"

This reverts commit 5fac6b5c7148b2520cc42ce9fad549cde28baba2.

* Don't show storage account warning banner for DB scenario

* Vbump extension and migration service

* Test - fix http request format from chsuh/fixFormat

* Add instructions for table mapping and schema migration

* Revert "Test - fix http request format from chsuh/fixFormat"

This reverts commit 4992603532e98dff3b7ba6f04ba9304d173fc5ad.
2023-04-10 10:20:39 -07:00
brian-harris
a60d6107b4 SQL-Migration: improve SQL DB table selection ux to include missing tables (#22659)
* add missing target tables ux

* fix number formatting
2023-04-07 16:00:12 -07:00
Cory Rivera
0412ba194b Add more error handling for python installation (#22650) 2023-04-07 14:55:17 -07:00
Charles Gagnon
6a2ec12a35 Fix some no unsafe assignment errors in azurecore (#22643) 2023-04-06 16:50:24 -07:00
Aasim Khan
5ae0e3a503 Removed parent override (#22648) 2023-04-06 15:26:50 -07:00
Alan Ren
2ee7c2649d reduce toolbar space (#22647)
* reduce toolbar space

* feedback
2023-04-06 14:56:48 -07:00
Leila Lali
a62d8f8960 Adding telemetry config to client config to send to STS (#22644)
* Adding telemetry config to client config so they can be sent to STS to monitor updates
2023-04-06 08:18:23 -07:00
Drew Skwiers-Koballa
755a3e9e00 fixing setting description in sql projects readme (#22640) 2023-04-05 17:18:52 -07:00
Drew Skwiers-Koballa
6447e92870 updating SQL projects extension readme (#22632) 2023-04-05 13:01:51 -07:00
Charles Gagnon
df1accf918 Improve missing placeholder warning message (#22630)
* Improve missing placeholder warning message

* stringify
2023-04-05 11:42:26 -07:00
brian-harris
a30719c471 add tooltips to explain migrations list columns (#22598) 2023-04-05 08:16:49 -07:00
Benjin Dubishar
efd489ecde Adding patch to rename backup file when it already exists (#22613) 2023-04-04 13:20:12 -07:00
Benjin Dubishar
cec349d2a4 Fix issue where errors during project update were swallowed (#22612)
* adding errors

* Fixing bug where errors occurring during updating project got swallowed

* removing unnecessary "vscode"
2023-04-04 13:19:07 -07:00
Aasim Khan
958a3f85e5 Enabling async server tree in insiders (#22596) 2023-04-03 17:15:52 -07:00
erpett
b2f131a8ba version bump 1.44 (#22599) 2023-04-03 15:23:47 -07:00
Alan Ren
071d76bc94 fix the empty schema issue (#22594) 2023-04-03 13:57:51 -07:00
Alan Ren
61b3285eaf make table keyboard shortcuts configurable (#22582)
* make table keyboard shortcuts configurable

* fix error

* new slickgrid version

* add comment

* tree grid
2023-04-03 13:21:00 -07:00
Aasim Khan
38a3312cb6 Fixing edited connections not working for root in Async Server Tree (#22580)
* Fixing edited connections not working for root

* Fixing comment
2023-04-03 11:58:19 -07:00
3145 changed files with 138456 additions and 73839 deletions

View File

@@ -3,6 +3,7 @@
 **/extensions/**/*.d.ts
 **/extensions/**/build/**
 **/extensions/**/colorize-fixtures/**
+**/extensions/azurecore/extension.webpack.config.js
 **/extensions/css-language-features/server/test/pathCompletionFixtures/**
 **/extensions/html-language-features/server/lib/jquery.d.ts
 **/extensions/html-language-features/server/src/test/pathCompletionFixtures/**

View File

@@ -770,6 +770,7 @@
 "chart.js",
 "plotly.js",
 "angular2-grid",
+"kburtram-query-plan",
 "html-to-image",
 "turndown",
 "gridstack",
@@ -1146,6 +1147,7 @@
 "extensions/azuremonitor/src/prompts/**",
 "extensions/azuremonitor/src/typings/findRemove.d.ts",
 "extensions/kusto/src/prompts/**",
+"extensions/mssql/src/hdfs/webhdfs.ts",
 "extensions/mssql/src/prompts/**",
 "extensions/mssql/src/typings/bufferStreamReader.d.ts",
 "extensions/mssql/src/typings/findRemove.d.ts",

View File

@@ -5,5 +5,3 @@
 * Ensure that the code is up-to-date with the `main` branch.
 * Include a description of the proposed changes and how to test them.
 -->
-This PR fixes #

.github/workflows/bad-tag.yml vendored Normal file
View File

@@ -0,0 +1,22 @@
name: Bad Tag
on:
  create
jobs:
  main:
    runs-on: ubuntu-latest
    if: github.event.ref == '1.999.0'
    steps:
      - name: Checkout Actions
        uses: actions/checkout@v2
        with:
          repository: "microsoft/vscode-github-triage-actions"
          ref: stable
          path: ./actions
      - name: Install Actions
        run: npm install --production --prefix ./actions
      - name: Run Bad Tag
        uses: ./actions/tag-alert
        with:
          token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}
          tag-name: '1.999.0'

.github/workflows/basic.yml vendored Normal file
View File

@@ -0,0 +1,177 @@
name: Basic checks
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
jobs:
  main:
    if: github.ref != 'refs/heads/main'
    name: Compilation, Unit and Integration Tests
    runs-on: ubuntu-latest
    timeout-minutes: 40
    env:
      GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    steps:
      - uses: actions/checkout@v3
      # TODO: rename azure-pipelines/linux/xvfb.init to github-actions
      - name: Setup Build Environment
        run: |
          sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
          sudo chmod +x /etc/init.d/xvfb
          sudo update-rc.d xvfb defaults
          sudo service xvfb start
      - uses: actions/setup-node@v3
        with:
          node-version: 16
      - name: Compute node modules cache key
        id: nodeModulesCacheKey
        run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
      - name: Cache node modules
        id: cacheNodeModules
        uses: actions/cache@v3
        with:
          path: "**/node_modules"
          key: ${{ runner.os }}-cacheNodeModules23-${{ steps.nodeModulesCacheKey.outputs.value }}
          restore-keys: ${{ runner.os }}-cacheNodeModules23-
      - name: Get yarn cache directory path
        id: yarnCacheDirPath
        if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
        run: echo "::set-output name=dir::$(yarn cache dir)"
      - name: Cache yarn directory
        if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
        uses: actions/cache@v3
        with:
          path: ${{ steps.yarnCacheDirPath.outputs.dir }}
          key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
          restore-keys: ${{ runner.os }}-yarnCacheDir-
      - name: Execute yarn
        if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
        env:
          PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
          ELECTRON_SKIP_BINARY_DOWNLOAD: 1
        run: yarn --frozen-lockfile --network-timeout 180000
      - name: Compile and Download
        run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64"
      - name: Run Unit Tests
        id: electron-unit-tests
        run: DISPLAY=:10 ./scripts/test.sh
      - name: Run Integration Tests (Electron)
        id: electron-integration-tests
        run: DISPLAY=:10 ./scripts/test-integration.sh
  # {{SQL CARBON TODO}} Bring back "Hygiene and Layering" and "Warm up node modules cache"
  # hygiene:
  #   if: github.ref != 'refs/heads/main'
  #   name: Hygiene and Layering
  #   runs-on: ubuntu-latest
  #   timeout-minutes: 40
  #   env:
  #     GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  #   steps:
  #     - uses: actions/checkout@v3
  #     - uses: actions/setup-node@v3
  #       with:
  #         node-version: 16
  #     - name: Compute node modules cache key
  #       id: nodeModulesCacheKey
  #       run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
  #     - name: Cache node modules
  #       id: cacheNodeModules
  #       uses: actions/cache@v3
  #       with:
  #         path: "**/node_modules"
  #         key: ${{ runner.os }}-cacheNodeModules23-${{ steps.nodeModulesCacheKey.outputs.value }}
  #         restore-keys: ${{ runner.os }}-cacheNodeModules23-
  #     - name: Get yarn cache directory path
  #       id: yarnCacheDirPath
  #       if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
  #       run: echo "::set-output name=dir::$(yarn cache dir)"
  #     - name: Cache yarn directory
  #       if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
  #       uses: actions/cache@v3
  #       with:
  #         path: ${{ steps.yarnCacheDirPath.outputs.dir }}
  #         key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
  #         restore-keys: ${{ runner.os }}-yarnCacheDir-
  #     - name: Execute yarn
  #       if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
  #       env:
  #         PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
  #         ELECTRON_SKIP_BINARY_DOWNLOAD: 1
  #       run: yarn --frozen-lockfile --network-timeout 180000
  #     - name: Run Hygiene Checks
  #       run: yarn gulp hygiene
  #     - name: Run Valid Layers Checks
  #       run: yarn valid-layers-check
  #     - name: Compile /build/
  #       run: yarn --cwd build compile
  #     - name: Check clean git state
  #       run: ./.github/workflows/check-clean-git-state.sh
  #     - name: Run eslint
  #       run: yarn eslint
  #     - name: Run vscode-dts Compile Checks
  #       run: yarn vscode-dts-compile-check
  #     - name: Run Trusted Types Checks
  #       run: yarn tsec-compile-check
  # warm-cache:
  #   name: Warm up node modules cache
  #   if: github.ref == 'refs/heads/main'
  #   runs-on: ubuntu-latest
  #   timeout-minutes: 40
  #   env:
  #     GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  #   steps:
  #     - uses: actions/checkout@v3
  #     - uses: actions/setup-node@v3
  #       with:
  #         node-version: 16
  #     - name: Compute node modules cache key
  #       id: nodeModulesCacheKey
  #       run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
  #     - name: Cache node modules
  #       id: cacheNodeModules
  #       uses: actions/cache@v3
  #       with:
  #         path: "**/node_modules"
  #         key: ${{ runner.os }}-cacheNodeModules23-${{ steps.nodeModulesCacheKey.outputs.value }}
  #         restore-keys: ${{ runner.os }}-cacheNodeModules23-
  #     - name: Get yarn cache directory path
  #       id: yarnCacheDirPath
  #       if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
  #       run: echo "::set-output name=dir::$(yarn cache dir)"
  #     - name: Cache yarn directory
  #       if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
  #       uses: actions/cache@v3
  #       with:
  #         path: ${{ steps.yarnCacheDirPath.outputs.dir }}
  #         key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
  #         restore-keys: ${{ runner.os }}-yarnCacheDir-
  #     - name: Execute yarn
  #       if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
  #       env:
  #         PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
  #         ELECTRON_SKIP_BINARY_DOWNLOAD: 1
  #       run: yarn --frozen-lockfile --network-timeout 180000

View File

@@ -1,27 +1,29 @@
 name: CI
-on:
-  push:
-    branches:
-      - main
-      - release/*
-  pull_request:
-    branches:
-      - main
-      - release/*
+on: workflow_dispatch
+# on:
+#   push:
+#     branches:
+#       - main
+#       - release/*
+#   pull_request:
+#     branches:
+#       - main
+#       - release/*
 jobs:
   windows:
     name: Windows
-    runs-on: windows-2019
+    runs-on: windows-2022
-    timeout-minutes: 30
+    timeout-minutes: 60
     env:
       CHILD_CONCURRENCY: "1"
       GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
     steps:
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v3
-      - uses: actions/setup-node@v2
+      - uses: actions/setup-node@v3
         with:
           node-version: 16
@@ -230,13 +232,13 @@ jobs:
   hygiene:
     name: Hygiene and Layering
     runs-on: ubuntu-latest
-    timeout-minutes: 30
+    timeout-minutes: 40
     env:
       GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
     steps:
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v3
-      - uses: actions/setup-node@v2
+      - uses: actions/setup-node@v3
         with:
           node-version: 16
@@ -248,8 +250,8 @@ jobs:
         uses: actions/cache@v2
         with:
           path: "**/node_modules"
-          key: ${{ runner.os }}-cacheNodeModules14-${{ steps.nodeModulesCacheKey.outputs.value }}
+          key: ${{ runner.os }}-cacheNodeModules23-${{ steps.nodeModulesCacheKey.outputs.value }}
-          restore-keys: ${{ runner.os }}-cacheNodeModules14-
+          restore-keys: ${{ runner.os }}-cacheNodeModules23-
       - name: Get yarn cache directory path
         id: yarnCacheDirPath
         if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
@@ -273,59 +275,27 @@ jobs:
           ELECTRON_SKIP_BINARY_DOWNLOAD: 1
         run: yarn --frozen-lockfile --network-timeout 180000
-      - name: Download Playwright
-        run: yarn playwright-install
       - name: Run Hygiene Checks
         run: yarn gulp hygiene
       - name: Run Valid Layers Checks
         run: yarn valid-layers-check
+      # - name: Run Monaco Editor Checks {{SQL CARBON EDIT}} Remove Monaco checks
+      #   run: yarn monaco-compile-check
       - name: Compile /build/
         run: yarn --cwd build compile
-      - name: Check clean git state
-        run: ./.github/workflows/check-clean-git-state.sh
       - name: Run eslint
         run: yarn eslint
-      # {{SQL CARBON EDIT}} Don't need this
-      # - name: Run Monaco Editor Checks
-      #   run: yarn monaco-compile-check
       # {{SQL CARBON EDIT}} Don't need this
       # - name: Run vscode-dts Compile Checks
       #   run: yarn vscode-dts-compile-check
       - name: Run Trusted Types Checks
         run: yarn tsec-compile-check
-      # - name: Editor Distro & ESM Bundle {{SQL CARBON EDIT}} Remove Monaco checks
-      #   run: yarn gulp editor-esm-bundle
-      # - name: Typings validation prep {{SQL CARBON EDIT}} Remove Monaco checks
-      #   run: |
-      #     mkdir typings-test
-      # - name: Typings validation {{SQL CARBON EDIT}} Remove Monaco checks
-      #   working-directory: ./typings-test
-      #   run: |
-      #     yarn init -yp
-      #     ../node_modules/.bin/tsc --init
-      #     echo "import '../out-monaco-editor-core';" > a.ts
-      #     ../node_modules/.bin/tsc --noEmit
-      # - name: Webpack Editor {{SQL CARBON EDIT}} Remove Monaco checks
-      #   working-directory: ./test/monaco
-      #   run: yarn run bundle
-      # - name: Compile Editor Tests {{SQL CARBON EDIT}} Remove Monaco checks
-      #   working-directory: ./test/monaco
-      #   run: yarn run compile
-      # - name: Download Playwright {{SQL CARBON EDIT}} Remove Monaco checks
-      #   run: yarn playwright-install
-      # - name: Run Editor Tests {{SQL CARBON EDIT}} Remove Monaco checks
-      #   timeout-minutes: 5
-      #   working-directory: ./test/monaco
-      #   run: yarn test

.nvmrc Normal file
View File

@@ -0,0 +1 @@
16.14

View File

@@ -4,6 +4,6 @@
   "recommendations": [
     "dbaeumer.vscode-eslint",
     "EditorConfig.EditorConfig",
-    "redhat.vscode-yaml"
+    "ms-vscode.vscode-selfhost-test-provider"
   ]
 }

.vscode/launch.json vendored
View File

@@ -10,7 +10,98 @@
       "args": [
         "hygiene"
       ]
     },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code Git Tests",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        "/tmp/my4g9l",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/git",
+        "--extensionTestsPath=${workspaceFolder}/extensions/git/out/test"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/extensions/git/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests",
+        "order": 6
+      }
+    },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code Github Tests",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        "${workspaceFolder}/extensions/github/testWorkspace",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/github",
+        "--extensionTestsPath=${workspaceFolder}/extensions/github/out/test"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/extensions/github/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests",
+        "order": 6
+      }
+    },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code API Tests (single folder)",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        // "${workspaceFolder}", // Uncomment for running out of sources.
+        "${workspaceFolder}/extensions/vscode-api-tests/testWorkspace",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-api-tests",
+        "--extensionTestsPath=${workspaceFolder}/extensions/vscode-api-tests/out/singlefolder-tests",
+        "--disable-extensions"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests",
+        "order": 3
+      }
+    },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code API Tests (workspace)",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        "${workspaceFolder}/extensions/vscode-api-tests/testworkspace.code-workspace",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-api-tests",
+        "--extensionTestsPath=${workspaceFolder}/extensions/vscode-api-tests/out/workspace-tests"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests",
+        "order": 4
+      }
+    },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code Tokenizer Tests",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        "${workspaceFolder}/extensions/vscode-colorize-tests/test",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-colorize-tests",
+        "--extensionTestsPath=${workspaceFolder}/extensions/vscode-colorize-tests/out"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests"
+      }
+    },
     {
       "type": "node",
       "request": "launch",
@@ -19,8 +110,18 @@
       "windows": {
         "runtimeExecutable": "${workspaceFolder}/scripts/sql.bat",
       },
+      "osx": {
+        "runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
+      },
+      "linux": {
+        "runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
+      },
       "runtimeArgs": [
-        "--no-cached-data"
+        "--inspect=5875",
+        "--no-cached-data",
+        "--crash-reporter-directory=${workspaceFolder}/.profile-oss/crashes",
+        // for general runtime freezes: https://github.com/microsoft/vscode/issues/127861#issuecomment-904144910
+        "--disable-features=CalculateNativeWinOcclusion",
       ],
       "outFiles": [
         "${workspaceFolder}/out/**/*.js"
@@ -53,6 +154,9 @@
       "runtimeArgs": [
         "--inspect=5875",
         "--no-cached-data",
+        "--crash-reporter-directory=${workspaceFolder}/.profile-oss/crashes",
+        // for general runtime freezes: https://github.com/microsoft/vscode/issues/127861#issuecomment-904144910
+        "--disable-features=CalculateNativeWinOcclusion",
       ],
       "webRoot": "${workspaceFolder}",
       "cascadeTerminateToConfigurations": [
@@ -108,6 +212,18 @@
         "group": "2_attach"
       }
     },
+    {
+      "type": "node",
+      "request": "attach",
+      "name": "Attach to CLI Process",
+      "port": 5874,
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "2_attach"
+      }
+    },
     {
       "type": "pwa-chrome",
       "request": "attach",
@@ -128,6 +244,49 @@
         "order": 2
       }
     },
+    {
+      "type": "node",
+      "request": "attach",
+      "name": "Attach to Search Process",
+      "port": 5876,
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "2_attach"
+      }
+    },
+    {
+      "type": "node",
+      "request": "attach",
+      "name": "Attach to Pty Host Process",
+      "port": 5877,
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "2_attach"
+      }
+    },
+    {
+      /* Added for "VS Code Selfhost Test Provider" extension support */
+      "type": "pwa-chrome",
+      "request": "attach",
+      "name": "Attach to VS Code",
+      "browserAttachLocation": "workspace",
+      "port": 9222,
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "2_attach",
+        "hidden": true
+      },
+      "resolveSourceMapLocations": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "perScriptSourcemaps": "yes"
+    },
     {
       "type": "node",
       "request": "launch",

View File

@@ -7,7 +7,7 @@
 {
   "kind": 2,
   "language": "github-issues",
-  "value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"May 2022\""
+  "value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"July 2022\""
 },
 {
   "kind": 1,

View File

@@ -7,7 +7,7 @@
 {
   "kind": 2,
   "language": "github-issues",
-  "value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-remotehub repo:microsoft/vscode-remote-repositories-github repo:microsoft/vscode-livepreview repo:microsoft/vscode-python repo:microsoft/vscode-jupyter repo:microsoft/vscode-jupyter-internal repo:microsoft/vscode-unpkg\n\n$MILESTONE=milestone:\"April 2022\""
+  "value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-remotehub repo:microsoft/vscode-remote-repositories-github repo:microsoft/vscode-livepreview repo:microsoft/vscode-python repo:microsoft/vscode-jupyter repo:microsoft/vscode-jupyter-internal repo:microsoft/vscode-unpkg\n\n$MILESTONE=milestone:\"July 2022\""
 },
 {
   "kind": 1,

View File

@@ -7,7 +7,7 @@
 {
   "kind": 2,
   "language": "github-issues",
-  "value": "$inbox -label:\"needs more info\" sort:created-desc"
+  "value": "$inbox -label:\"info-needed\" sort:created-desc"
 },
 {
   "kind": 2,

View File

@@ -7,7 +7,7 @@
 {
   "kind": 2,
   "language": "github-issues",
-  "value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remotehub repo:microsoft/vscode-remote-repositories-github repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-livepreview repo:microsoft/vscode-python repo:microsoft/vscode-jupyter repo:microsoft/vscode-jupyter-internal\n\n$MILESTONE=milestone:\"April 2022\"\n\n$MINE=assignee:@me"
+  "value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remotehub repo:microsoft/vscode-remote-repositories-github repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-livepreview repo:microsoft/vscode-python repo:microsoft/vscode-jupyter repo:microsoft/vscode-jupyter-internal\n\n$MILESTONE=milestone:\"July 2022\"\n\n$MINE=assignee:@me"
 },
 {
   "kind": 1,
@@ -147,7 +147,7 @@
 {
   "kind": 2,
   "language": "github-issues",
-  "value": "$REPOS $MILESTONE -$MINE is:issue is:closed author:@me sort:updated-asc label:bug -label:unreleased -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:needs-triage -label:verification-found"
+  "value": "$REPOS $MILESTONE -$MINE is:issue is:closed author:@me sort:updated-asc label:bug -label:unreleased -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:triage-needed -label:verification-found"
 },
 {
   "kind": 1,

View File

@@ -7,7 +7,7 @@
{ {
"kind": 2, "kind": 2,
"language": "github-issues", "language": "github-issues",
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-unpkg repo:microsoft/vscode-references-view repo:microsoft/vscode-anycode repo:microsoft/vscode-hexeditor repo:microsoft/vscode-extension-telemetry repo:microsoft/vscode-livepreview repo:microsoft/vscode-remotehub repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remote-repositories-github repo:microsoft/monaco-editor repo:microsoft/vscode-vsce\n\n// current milestone name\n$milestone=milestone:\"May 2022\"" "value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-unpkg repo:microsoft/vscode-references-view repo:microsoft/vscode-anycode repo:microsoft/vscode-hexeditor repo:microsoft/vscode-extension-telemetry repo:microsoft/vscode-livepreview repo:microsoft/vscode-remotehub repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remote-repositories-github repo:microsoft/monaco-editor repo:microsoft/vscode-vsce\n\n// current milestone name\n$milestone=milestone:\"July 2022\""
}, },
{ {
"kind": 1, "kind": 1,
@@ -114,7 +114,7 @@
 {
   "kind": 2,
   "language": "github-issues",
-  "value": "$repos assignee:@me is:open label:\"needs more info\"",
+  "value": "$repos assignee:@me is:open label:\"needs more info\""
 },
 {
   "kind": 1,

View File

@@ -12,7 +12,7 @@
 {
   "kind": 2,
   "language": "github-issues",
-  "value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-jupyter repo:microsoft/vscode-python\n$milestone=milestone:\"March 2022\""
+  "value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-jupyter repo:microsoft/vscode-python\n$milestone=milestone:\"May 2022\""
 },
 {
   "kind": 1,

View File

@@ -7,7 +7,7 @@
 {
   "kind": 2,
   "language": "github-issues",
-  "value": "repo:microsoft/vscode-dev milestone:\"December 2021\" is:open"
+  "value": "repo:microsoft/vscode-dev milestone:\"May 2022\" is:open"
 },
 {
   "kind": 2,
@@ -32,11 +32,11 @@
 {
   "kind": 2,
   "language": "github-issues",
-  "value": "repo:microsoft/vscode-remote-repositories-github milestone:\"December 2021\" is:open"
+  "value": "repo:microsoft/vscode-remote-repositories-github milestone:\"May 2022\" is:open"
 },
 {
   "kind": 2,
   "language": "github-issues",
-  "value": "repo:microsoft/vscode-remotehub milestone:\"December 2021\" is:open"
+  "value": "repo:microsoft/vscode-remotehub milestone:\"May 2022\" is:open"
 }
 ]

View File

@@ -67,6 +67,13 @@
   }
 ],
 "git.ignoreLimitWarning": true,
+"git.branchProtection": [
+  "main",
+  "release/*"
+],
+"git.branchProtectionPrompt": "alwaysCommitToNewBranch",
+"git.branchRandomName.enable": true,
+"git.mergeEditor": true,
 "remote.extensionKind": {
   "msjsdiag.debugger-for-chrome": "workspace"
 },

View File

@@ -1,5 +1,38 @@
# Change Log
## Version 1.43.0
* Release number: 1.43.0
* Release date: April 12, 2023
### What's new in 1.43.0
| New Item | Details |
| --- | --- |
| Connection | Added notation for required properties (e.g. Attestation protocol and Attestation URL) when Secure Enclaves are enabled |
| SQL Database Projects extension | General Availability |
| SQL Database Projects extension | Move and rename files within Database Projects view |
| SQL Database Projects extension | SQLCMD variables available for editing in Database Projects view |
| Object Explorer | Double-clicking on a user or table in Object Explorer will open the designer for the object |
| Query Editor | Added a Parse button to the Query Editor toolbar for parsing queries before execution |
| Query Results | Added support to select a row in query results via double click |
### Bug fixes in 1.43.0
| New Item | Details |
| --- | --- |
| Connection | Added support for linked accounts with same username but different domains |
| Connection | Fixed issue with incorrect cache write path |
| Connection | Added ability to include optional name and grouping when creating a new connection using a connection string |
| Connection | Updating username in MSSQL connections to use Preferred username for the display name |
| Connection | Fixed issue with encoding for OSX keychain on macOS |
| Connection | Added support for Azure MFA and Sql Authentication Provider on Linux |
| Dataverse | Addressed error generated when expanding the database node for a Dataverse database in Object Explorer |
| IntelliCode extension | Fixed error that occurred when launching Azure Data Studio with Visual Studio Code IntelliCode extension installed |
| PostgreSQL extension | Implemented support for exporting query results on Apple M1 from a notebook |
| SQL Database Projects extension | Added Accessibility Fixes related to screen reader, label names, and improved focus when navigating |
For a full list of bug fixes addressed for the April 2023 release, visit the [April 2023 Release on GitHub](https://github.com/microsoft/azuredatastudio/milestone/99?closed=1).
## Version 1.42.0
* Release number: 1.42.0
* Release date: March 22, 2023

View File

@@ -4,7 +4,7 @@
[![Build Status](https://dev.azure.com/ms/azuredatastudio/_apis/build/status/AzureDataStudio-Localization-CI?branchName=main)](https://dev.azure.com/ms/azuredatastudio/_build/latest?definitionId=453&branchName=main) [![Build Status](https://dev.azure.com/ms/azuredatastudio/_apis/build/status/AzureDataStudio-Localization-CI?branchName=main)](https://dev.azure.com/ms/azuredatastudio/_build/latest?definitionId=453&branchName=main)
[![Twitter Follow](https://img.shields.io/twitter/follow/azuredatastudio?style=social)](https://twitter.com/azuredatastudio) [![Twitter Follow](https://img.shields.io/twitter/follow/azuredatastudio?style=social)](https://twitter.com/azuredatastudio)
Azure Data Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux. Azure Data Studio is a data management and development tool with connectivity to popular cloud and on-premises databases. Azure Data Studio supports Windows, macOS, and Linux, with immediate capability to connect to Azure SQL and SQL Server. Browse the [extension library](wiki/List-of-Extensions) for additional database support options including MySQL, PostreSQL, and MongoDB.
## **Download the latest Azure Data Studio release** ## **Download the latest Azure Data Studio release**
@@ -18,18 +18,18 @@ Azure Data Studio is a data management tool that enables you to work with SQL Se
| |.rpm |[64 bit][linux-rpm] |
|Mac |.zip |[Universal][osx-universal]&emsp;[Intel Chip][osx-zip]&emsp;[Apple Silicon][osx-arm64] |
-[win-user]: https://go.microsoft.com/fwlink/?linkid=2228644
-[win-system]: https://go.microsoft.com/fwlink/?linkid=2228645
-[win-zip]: https://go.microsoft.com/fwlink/?linkid=2228646
-[win-user-arm64]: https://go.microsoft.com/fwlink/?linkid=2229004
-[win-system-arm64]: https://go.microsoft.com/fwlink/?linkid=2228647
-[win-zip-arm64]: https://go.microsoft.com/fwlink/?linkid=2229005
-[osx-universal]: https://go.microsoft.com/fwlink/?linkid=2228649
-[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2228179
-[osx-arm64]: https://go.microsoft.com/fwlink/?linkid=2228648
-[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2229006
-[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2228650
-[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2228180
+[win-user]: https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-user/stable
+[win-system]: https://azuredatastudio-update.azurewebsites.net/latest/win32-x64/stable
+[win-zip]: https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-archive/stable
+[win-user-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/win32-arm64-user/stable
+[win-system-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/win32-arm64/stable
+[win-zip-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/win32-arm64-archive/stable
+[linux-zip]: https://azuredatastudio-update.azurewebsites.net/latest/linux-x64/stable
+[linux-deb]: https://azuredatastudio-update.azurewebsites.net/latest/linux-deb-x64/stable
+[linux-rpm]: https://azuredatastudio-update.azurewebsites.net/latest/linux-rpm-x64/stable
+[osx-universal]: https://azuredatastudio-update.azurewebsites.net/latest/darwin-universal/stable
+[osx-zip]: https://azuredatastudio-update.azurewebsites.net/latest/darwin/stable
+[osx-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/darwin-arm64/stable
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
@@ -58,10 +58,9 @@ Go to our [download page](https://aka.ms/getazuredatastudio) for more specific i
[in-osx-zip]: https://azuredatastudio-update.azurewebsites.net/latest/darwin/insider
[in-osx-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/darwin-arm64/insider
+Please visit our [download page](https://aka.ms/getazuredatastudio) for more specific installation instructions.
-See the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) for additional details of what's in this release.
+Check out the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) or [release notes](https://learn.microsoft.com/sql/azure-data-studio/release-notes-azure-data-studio) for additional details of what's in each release.
-Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
+The [Azure Data Studio documentation](https://learn.microsoft.com/sql/azure-data-studio) includes complete details on getting started, connection quickstarts, and feature tutorials.
## **Feature Highlights**

File diff suppressed because it is too large.


@@ -1 +1 @@
-2022-10-06T02:27:18.022Z
+2022-07-19T07:55:26.168Z

build/.gitignore (new file)

@@ -0,0 +1 @@
+.yarnrc


@@ -139,6 +139,14 @@ vscode-encrypt/binding.gyp
vscode-encrypt/README.md
!vscode-encrypt/build/Release/vscode-encrypt-native.node
+vscode-policy-watcher/build/**
+vscode-policy-watcher/.husky/**
+vscode-policy-watcher/src/**
+vscode-policy-watcher/binding.gyp
+vscode-policy-watcher/README.md
+vscode-policy-watcher/index.d.ts
+!vscode-policy-watcher/build/Release/vscode-policy-watcher.node
vscode-windows-ca-certs/**/*
!vscode-windows-ca-certs/package.json
!vscode-windows-ca-certs/**/*.node


@@ -4,6 +4,7 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
+exports.OctoKitIssue = exports.OctoKit = void 0;
const core_1 = require("@actions/core");
const github_1 = require("@actions/github");
const child_process_1 = require("child_process");
@@ -20,7 +21,6 @@ class OctoKit {
}
async *query(query) {
const q = query.q + ` repo:${this.params.owner}/${this.params.repo}`;
-console.log(`Querying for ${q}:`);
const options = this.octokit.search.issuesAndPullRequests.endpoint.merge({
...query,
q,
@@ -41,19 +41,17 @@ class OctoKit {
};
for await (const pageResponse of this.octokit.paginate.iterator(options)) {
await timeout();
-await utils_1.logRateLimit(this.token);
+await (0, utils_1.logRateLimit)(this.token);
const page = pageResponse.data;
-console.log(`Page ${++pageNum}: ${page.map(({ number }) => number).join(' ')}`);
yield page.map((issue) => new OctoKitIssue(this.token, this.params, this.octokitIssueToIssue(issue)));
}
}
async createIssue(owner, repo, title, body) {
-core_1.debug(`Creating issue \`${title}\` on ${owner}/${repo}`);
+(0, core_1.debug)(`Creating issue \`${title}\` on ${owner}/${repo}`);
if (!this.options.readonly)
await this.octokit.issues.create({ owner, repo, title, body });
}
octokitIssueToIssue(issue) {
-var _a, _b, _c, _d, _e, _f;
return {
author: { name: issue.user.login, isGitHubApp: issue.user.type === 'Bot' },
body: issue.body,
@@ -64,8 +62,8 @@ class OctoKit {
locked: issue.locked,
numComments: issue.comments,
reactions: issue.reactions,
-assignee: (_b = (_a = issue.assignee) === null || _a === void 0 ? void 0 : _a.login) !== null && _b !== void 0 ? _b : (_d = (_c = issue.assignees) === null || _c === void 0 ? void 0 : _c[0]) === null || _d === void 0 ? void 0 : _d.login,
+assignee: issue.assignee?.login ?? issue.assignees?.[0]?.login,
-milestoneId: (_f = (_e = issue.milestone) === null || _e === void 0 ? void 0 : _e.number) !== null && _f !== void 0 ? _f : null,
+milestoneId: issue.milestone?.number ?? null,
createdAt: +new Date(issue.created_at),
updatedAt: +new Date(issue.updated_at),
closedAt: issue.closed_at ? +new Date(issue.closed_at) : undefined,
@@ -73,10 +71,10 @@ class OctoKit {
}
async hasWriteAccess(user) {
if (user.name in this.writeAccessCache) {
-core_1.debug('Got permissions from cache for ' + user);
+(0, core_1.debug)('Got permissions from cache for ' + user);
return this.writeAccessCache[user.name];
}
-core_1.debug('Fetching permissions for ' + user);
+(0, core_1.debug)('Fetching permissions for ' + user);
const permissions = (await this.octokit.repos.getCollaboratorPermissionLevel({
...this.params,
username: user.name,
@@ -96,14 +94,14 @@ class OctoKit {
}
}
async createLabel(name, color, description) {
-core_1.debug('Creating label ' + name);
+(0, core_1.debug)('Creating label ' + name);
if (!this.options.readonly)
await this.octokit.issues.createLabel({ ...this.params, color, description, name });
else
this.mockLabels.add(name);
}
async deleteLabel(name) {
-core_1.debug('Deleting label ' + name);
+(0, core_1.debug)('Deleting label ' + name);
try {
if (!this.options.readonly)
await this.octokit.issues.deleteLabel({ ...this.params, name });
@@ -116,7 +114,7 @@ class OctoKit {
}
}
async readConfig(path) {
-core_1.debug('Reading config at ' + path);
+(0, core_1.debug)('Reading config at ' + path);
const repoPath = `.github/${path}.json`;
const data = (await this.octokit.repos.getContents({ ...this.params, path: repoPath })).data;
if ('type' in data && data.type === 'file') {
@@ -128,10 +126,10 @@ class OctoKit {
throw Error('Found directory at config path when expecting file' + JSON.stringify(data));
}
async releaseContainsCommit(release, commit) {
-if (utils_1.getInput('commitReleasedDebuggingOverride')) {
+if ((0, utils_1.getInput)('commitReleasedDebuggingOverride')) {
return true;
}
-return new Promise((resolve, reject) => child_process_1.exec(`git -C ./repo merge-base --is-ancestor ${commit} ${release}`, (err) => !err || err.code === 1 ? resolve(!err) : reject(err)));
+return new Promise((resolve, reject) => (0, child_process_1.exec)(`git -C ./repo merge-base --is-ancestor ${commit} ${release}`, (err) => !err || err.code === 1 ? resolve(!err) : reject(err)));
}
}
exports.OctoKit = OctoKit;
@@ -142,7 +140,7 @@ class OctoKitIssue extends OctoKit {
this.issueData = issueData;
}
async addAssignee(assignee) {
-core_1.debug('Adding assignee ' + assignee + ' to ' + this.issueData.number);
+(0, core_1.debug)('Adding assignee ' + assignee + ' to ' + this.issueData.number);
if (!this.options.readonly) {
await this.octokit.issues.addAssignees({
...this.params,
@@ -152,7 +150,7 @@ class OctoKitIssue extends OctoKit {
}
}
async closeIssue() {
-core_1.debug('Closing issue ' + this.issueData.number);
+(0, core_1.debug)('Closing issue ' + this.issueData.number);
if (!this.options.readonly)
await this.octokit.issues.update({
...this.params,
@@ -161,16 +159,15 @@ class OctoKitIssue extends OctoKit {
});
}
async lockIssue() {
-core_1.debug('Locking issue ' + this.issueData.number);
+(0, core_1.debug)('Locking issue ' + this.issueData.number);
if (!this.options.readonly)
await this.octokit.issues.lock({ ...this.params, issue_number: this.issueData.number });
}
async getIssue() {
if (isIssue(this.issueData)) {
-core_1.debug('Got issue data from query result ' + this.issueData.number);
+(0, core_1.debug)('Got issue data from query result ' + this.issueData.number);
return this.issueData;
}
-console.log('Fetching issue ' + this.issueData.number);
const issue = (await this.octokit.issues.get({
...this.params,
issue_number: this.issueData.number,
@@ -179,7 +176,7 @@ class OctoKitIssue extends OctoKit {
return (this.issueData = this.octokitIssueToIssue(issue));
}
async postComment(body) {
-core_1.debug(`Posting comment ${body} on ${this.issueData.number}`);
+(0, core_1.debug)(`Posting comment ${body} on ${this.issueData.number}`);
if (!this.options.readonly)
await this.octokit.issues.createComment({
...this.params,
@@ -188,7 +185,7 @@ class OctoKitIssue extends OctoKit {
});
}
async deleteComment(id) {
-core_1.debug(`Deleting comment ${id} on ${this.issueData.number}`);
+(0, core_1.debug)(`Deleting comment ${id} on ${this.issueData.number}`);
if (!this.options.readonly)
await this.octokit.issues.deleteComment({
owner: this.params.owner,
@@ -197,7 +194,7 @@ class OctoKitIssue extends OctoKit {
});
}
async setMilestone(milestoneId) {
-core_1.debug(`Setting milestone for ${this.issueData.number} to ${milestoneId}`);
+(0, core_1.debug)(`Setting milestone for ${this.issueData.number} to ${milestoneId}`);
if (!this.options.readonly)
await this.octokit.issues.update({
...this.params,
@@ -206,7 +203,7 @@ class OctoKitIssue extends OctoKit {
});
}
async *getComments(last) {
-core_1.debug('Fetching comments for ' + this.issueData.number);
+(0, core_1.debug)('Fetching comments for ' + this.issueData.number);
const response = this.octokit.paginate.iterator(this.octokit.issues.listComments.endpoint.merge({
...this.params,
issue_number: this.issueData.number,
@@ -223,7 +220,7 @@ class OctoKitIssue extends OctoKit {
}
}
async addLabel(name) {
-core_1.debug(`Adding label ${name} to ${this.issueData.number}`);
+(0, core_1.debug)(`Adding label ${name} to ${this.issueData.number}`);
if (!(await this.repoHasLabel(name))) {
throw Error(`Action could not execute becuase label ${name} is not defined.`);
}
@@ -235,7 +232,7 @@ class OctoKitIssue extends OctoKit {
});
}
async removeLabel(name) {
-core_1.debug(`Removing label ${name} from ${this.issueData.number}`);
+(0, core_1.debug)(`Removing label ${name} from ${this.issueData.number}`);
try {
if (!this.options.readonly)
await this.octokit.issues.removeLabel({
@@ -246,14 +243,12 @@ class OctoKitIssue extends OctoKit {
}
catch (err) {
if (err.status === 404) {
-console.log(`Label ${name} not found on issue`);
return;
}
throw err;
}
}
async getClosingInfo() {
-var _a;
if ((await this.getIssue()).open) {
return;
}
@@ -267,13 +262,12 @@ class OctoKitIssue extends OctoKit {
for (const timelineEvent of timelineEvents) {
if (timelineEvent.event === 'closed') {
closingCommit = {
-hash: (_a = timelineEvent.commit_id) !== null && _a !== void 0 ? _a : undefined,
+hash: timelineEvent.commit_id ?? undefined,
timestamp: +new Date(timelineEvent.created_at),
};
}
}
}
-console.log(`Got ${closingCommit} as closing commit of ${this.issueData.number}`);
return closingCommit;
}
}
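A recurring mechanical change in this compiled file is rewriting `core_1.debug(...)` as `(0, core_1.debug)(...)` — the emit style newer TypeScript compilers use for calls to imported functions. The sketch below is illustrative only (the names are hypothetical, not from the diff): the comma operator evaluates to the bare function, so the call does not pass the module namespace object as `this`, which matches ES module semantics.

```javascript
// Hypothetical module-like object standing in for a CommonJS namespace import.
const mod = {
  prefix: 'mod:',
  tag(s) {
    // `this` refers to `mod` only when called as a method on it.
    return (this && this.prefix ? this.prefix : 'bare:') + s;
  },
};

// Method call: `this` inside tag() is `mod`.
const direct = mod.tag('x');

// Comma-operator call, as in the emitted `(0, core_1.debug)(...)`:
// the expression `(0, mod.tag)` yields the bare function, so `this` is
// no longer `mod` at the call site.
const indirect = (0, mod.tag)('x');
```

Here `direct` carries the `mod:` prefix while `indirect` falls back to `bare:`, showing that the indirect call drops the `this` binding.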


@@ -25,7 +25,6 @@ export class OctoKit implements GitHub {
async *query(query: Query): AsyncIterableIterator<GitHubIssue[]> {
const q = query.q + ` repo:${this.params.owner}/${this.params.repo}`
-console.log(`Querying for ${q}:`)
const options = this.octokit.search.issuesAndPullRequests.endpoint.merge({
...query,
@@ -50,7 +49,6 @@ export class OctoKit implements GitHub {
await timeout()
await logRateLimit(this.token)
const page: Array<Octokit.SearchIssuesAndPullRequestsResponseItemsItem> = pageResponse.data
-console.log(`Page ${++pageNum}: ${page.map(({ number }) => number).join(' ')}`)
yield page.map(
(issue) => new OctoKitIssue(this.token, this.params, this.octokitIssueToIssue(issue)),
)
@@ -199,7 +197,6 @@ export class OctoKitIssue extends OctoKit implements GitHubIssue {
return this.issueData
}
-console.log('Fetching issue ' + this.issueData.number)
const issue = (
await this.octokit.issues.get({
...this.params,
@@ -286,7 +283,6 @@ export class OctoKitIssue extends OctoKit implements GitHubIssue {
})
} catch (err) {
if (err.status === 404) {
-console.log(`Label ${name} not found on issue`)
return
}
throw err
@@ -314,7 +310,6 @@ export class OctoKitIssue extends OctoKit implements GitHubIssue {
}
}
}
-console.log(`Got ${closingCommit} as closing commit of ${this.issueData.number}`)
return closingCommit
}
}


@@ -4,17 +4,18 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
+exports.TestbedIssue = exports.Testbed = void 0;
class Testbed {
constructor(config) {
-var _a, _b, _c, _d, _e;
this.config = {
-globalLabels: (_a = config === null || config === void 0 ? void 0 : config.globalLabels) !== null && _a !== void 0 ? _a : [],
+globalLabels: config?.globalLabels ?? [],
-configs: (_b = config === null || config === void 0 ? void 0 : config.configs) !== null && _b !== void 0 ? _b : {},
+configs: config?.configs ?? {},
-writers: (_c = config === null || config === void 0 ? void 0 : config.writers) !== null && _c !== void 0 ? _c : [],
+writers: config?.writers ?? [],
-releasedCommits: (_d = config === null || config === void 0 ? void 0 : config.releasedCommits) !== null && _d !== void 0 ? _d : [],
+releasedCommits: config?.releasedCommits ?? [],
-queryRunner: (_e = config === null || config === void 0 ? void 0 : config.queryRunner) !== null && _e !== void 0 ? _e : async function* () {
-yield [];
-},
+queryRunner: config?.queryRunner ??
+async function* () {
+yield [];
+},
};
}
async *query(query) {
@@ -47,16 +48,15 @@ class Testbed {
exports.Testbed = Testbed;
class TestbedIssue extends Testbed {
constructor(globalConfig, issueConfig) {
-var _a, _b, _c;
super(globalConfig);
-issueConfig = issueConfig !== null && issueConfig !== void 0 ? issueConfig : {};
+issueConfig = issueConfig ?? {};
-issueConfig.comments = (_a = issueConfig === null || issueConfig === void 0 ? void 0 : issueConfig.comments) !== null && _a !== void 0 ? _a : [];
+issueConfig.comments = issueConfig?.comments ?? [];
-issueConfig.labels = (_b = issueConfig === null || issueConfig === void 0 ? void 0 : issueConfig.labels) !== null && _b !== void 0 ? _b : [];
+issueConfig.labels = issueConfig?.labels ?? [];
issueConfig.issue = {
author: { name: 'JacksonKearl' },
body: 'issue body',
locked: false,
-numComments: ((_c = issueConfig === null || issueConfig === void 0 ? void 0 : issueConfig.comments) === null || _c === void 0 ? void 0 : _c.length) || 0,
+numComments: issueConfig?.comments?.length || 0,
number: 1,
open: true,
title: 'issue title',
@@ -90,7 +90,7 @@ class TestbedIssue extends Testbed {
}
async postComment(body, author) {
this.issueConfig.comments.push({
-author: { name: author !== null && author !== void 0 ? author : 'bot' },
+author: { name: author ?? 'bot' },
body,
id: Math.random(),
timestamp: +new Date(),
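Most defaults in this file were rewritten with the nullish coalescing operator `??`, yet `numComments` keeps `|| 0`. The distinction matters: `??` falls back only on `null`/`undefined`, while `||` also falls back on any falsy value such as `0` or `''`. A minimal sketch, with a made-up `issueConfig` value not taken from the diff:

```javascript
// An issue config whose comments array exists but is empty (length 0).
const issueConfig = { comments: [] };

const viaOr = issueConfig.comments?.length || 10;      // 0 is falsy -> falls back to 10
const viaNullish = issueConfig.comments?.length ?? 10; // 0 is not nullish -> stays 0

// When the value really is missing, both operators behave the same.
const missing = undefined;
const viaOr2 = missing?.length || 10;
const viaNullish2 = missing?.length ?? 10;
```

For `numComments` the falsy fallback value is the same as the real value (`0`), so `|| 0` is harmless there; for `queryRunner` and the label/comment arrays, `??` is the safer choice because an intentionally empty value should not be replaced.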


@@ -8,15 +8,15 @@ const core = require("@actions/core");
const github_1 = require("@actions/github");
const octokit_1 = require("../api/octokit");
const utils_1 = require("../utils/utils");
-const token = utils_1.getRequiredInput('token');
+const token = (0, utils_1.getRequiredInput)('token');
-const label = utils_1.getRequiredInput('label');
+const label = (0, utils_1.getRequiredInput)('label');
async function main() {
const pr = new octokit_1.OctoKitIssue(token, github_1.context.repo, { number: github_1.context.issue.number });
pr.addLabel(label);
}
main()
-.then(() => utils_1.logRateLimit(token))
+.then(() => (0, utils_1.logRateLimit)(token))
.catch(async (error) => {
core.setFailed(error.message);
-await utils_1.logErrorToIssue(error.message, true, token);
+await (0, utils_1.logErrorToIssue)(error.message, true, token);
});


@@ -1,24 +1,25 @@
{
"name": "github-actions",
"version": "1.0.0",
"description": "GitHub Actions",
"scripts": {
"test": "mocha -r ts-node/register **/*.test.ts",
"build": "tsc -p ./tsconfig.json",
+"compile": "tsc -p ./tsconfig.json",
"lint": "eslint -c .eslintrc --fix --ext .ts .",
"watch-typecheck": "tsc --watch"
},
"repository": {
"type": "git",
"url": "git+https://github.com/microsoft/azuredatastudio.git"
},
"keywords": [],
"author": "",
"dependencies": {
"@actions/core": "^1.2.6",
"@actions/github": "^2.1.1",
"axios": "^0.21.4",
"ts-node": "^8.6.2",
"typescript": "^3.8.3"
}
}


@@ -4,13 +4,16 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
+exports.logErrorToIssue = exports.logRateLimit = exports.daysAgoToHumanReadbleDate = exports.daysAgoToTimestamp = exports.loadLatestRelease = exports.normalizeIssue = exports.getRequiredInput = exports.getInput = void 0;
const core = require("@actions/core");
const github_1 = require("@actions/github");
const axios_1 = require("axios");
const octokit_1 = require("../api/octokit");
-exports.getInput = (name) => core.getInput(name) || undefined;
-exports.getRequiredInput = (name) => core.getInput(name, { required: true });
-exports.normalizeIssue = (issue) => {
+const getInput = (name) => core.getInput(name) || undefined;
+exports.getInput = getInput;
+const getRequiredInput = (name) => core.getInput(name, { required: true });
+exports.getRequiredInput = getRequiredInput;
+const normalizeIssue = (issue) => {
const { body, title } = issue;
const isBug = body.includes('bug_report_template') || /Issue Type:.*Bug.*/.test(body);
const isFeatureRequest = body.includes('feature_request_template') || /Issue Type:.*Feature Request.*/.test(body);
@@ -33,23 +36,25 @@ exports.normalizeIssue = (issue) => {
issueType: isBug ? 'bug' : isFeatureRequest ? 'feature_request' : 'unknown',
};
};
-exports.loadLatestRelease = async (quality) => (await axios_1.default.get(`https://vscode-update.azurewebsites.net/api/update/darwin/${quality}/latest`)).data;
-exports.daysAgoToTimestamp = (days) => +new Date(Date.now() - days * 24 * 60 * 60 * 1000);
-exports.daysAgoToHumanReadbleDate = (days) => new Date(Date.now() - days * 24 * 60 * 60 * 1000).toISOString().replace(/\.\d{3}\w$/, '');
-exports.logRateLimit = async (token) => {
+exports.normalizeIssue = normalizeIssue;
+const loadLatestRelease = async (quality) => (await axios_1.default.get(`https://vscode-update.azurewebsites.net/api/update/darwin/${quality}/latest`)).data;
+exports.loadLatestRelease = loadLatestRelease;
+const daysAgoToTimestamp = (days) => +new Date(Date.now() - days * 24 * 60 * 60 * 1000);
+exports.daysAgoToTimestamp = daysAgoToTimestamp;
+const daysAgoToHumanReadbleDate = (days) => new Date(Date.now() - days * 24 * 60 * 60 * 1000).toISOString().replace(/\.\d{3}\w$/, '');
+exports.daysAgoToHumanReadbleDate = daysAgoToHumanReadbleDate;
+const logRateLimit = async (token) => {
const usageData = (await new github_1.GitHub(token).rateLimit.get()).data.resources;
['core', 'graphql', 'search'].forEach(async (category) => {
const usage = 1 - usageData[category].remaining / usageData[category].limit;
const message = `Usage at ${usage} for ${category}`;
-if (usage > 0) {
-console.log(message);
-}
if (usage > 0.5) {
-await exports.logErrorToIssue(message, false, token);
+await (0, exports.logErrorToIssue)(message, false, token);
}
});
};
-exports.logErrorToIssue = async (message, ping, token) => {
+exports.logRateLimit = logRateLimit;
+const logErrorToIssue = async (message, ping, token) => {
// Attempt to wait out abuse detection timeout if present
await new Promise((resolve) => setTimeout(resolve, 10000));
const dest = github_1.context.repo.repo === 'vscode-internalbacklog'
@@ -70,3 +75,4 @@ ${JSON.stringify(github_1.context, null, 2).replace(/<!--/gu, '<@--').replace(/-
-->
`);
};
+exports.logErrorToIssue = logErrorToIssue;
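Throughout this file the emitted `exports.getInput = (name) => ...` assignments became `const getInput = ...; exports.getInput = getInput;`, the export style newer TypeScript compilers generate. One observable difference between the two shapes — shown here as an assumption-laden sketch, not something the diff itself states — is function name inference: an arrow assigned directly to a property of `exports` gets an empty `name`, while one declared as a `const` first keeps its identifier as its `name`.

```javascript
// Older emit style: arrow assigned straight onto the exports object.
// Member-expression assignment does not trigger name inference.
const exportsOld = {};
exportsOld.getInput = (name) => name || undefined;

// Newer emit style: declare first, then attach to exports.
// The const declaration gives the arrow an inferred name.
const getInput = (name) => name || undefined;
const exportsNew = { getInput };

// Both styles behave identically when called.
const sameResult = exportsOld.getInput('token') === exportsNew.getInput('token');
```

Named functions also show up more usefully in stack traces and debugger views, which is one practical reason to prefer the newer shape.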


@@ -58,13 +58,10 @@ export const daysAgoToHumanReadbleDate = (days: number) =>
new Date(Date.now() - days * 24 * 60 * 60 * 1000).toISOString().replace(/\.\d{3}\w$/, '')
export const logRateLimit = async (token: string) => {
-const usageData = (await new GitHub(token).rateLimit.get()).data.resources
+const usageData = (await new GitHub(token).rateLimit.get()).data.resources;
-;(['core', 'graphql', 'search'] as const).forEach(async (category) => {
+(['core', 'graphql', 'search'] as const).forEach(async (category) => {
const usage = 1 - usageData[category].remaining / usageData[category].limit
const message = `Usage at ${usage} for ${category}`
-if (usage > 0) {
-console.log(message)
-}
if (usage > 0.5) {
await logErrorToIssue(message, false, token)
}


@@ -285,6 +285,8 @@ node-fetch@^2.3.0:
version "2.6.7"
resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.7.tgz#24de9fba827e3b4ae44dc8b20256a379160052ad"
integrity sha512-ZjMPFEfVx5j+y2yF35Kzx5sF7kDzxuDj6ziH4FFbOp87zKDZNx8yExJIb05OGF4Nlt9IHFIMBkRl41VdvcNdbQ==
+dependencies:
+whatwg-url "^5.0.0"
npm-run-path@^2.0.0:
version "2.0.2"
@@ -371,6 +373,11 @@ strip-eof@^1.0.0:
resolved "https://registry.yarnpkg.com/strip-eof/-/strip-eof-1.0.0.tgz#bb43ff5598a6eb05d89b59fcd129c983313606bf"
integrity sha1-u0P/VZim6wXYm1n80SnJgzE2Br8=
+tr46@~0.0.3:
+version "0.0.3"
+resolved "https://registry.yarnpkg.com/tr46/-/tr46-0.0.3.tgz#8184fd347dac9cdc185992f3a6622e14b9d9ab6a"
+integrity sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw==
ts-node@^8.6.2:
version "8.9.0"
resolved "https://registry.yarnpkg.com/ts-node/-/ts-node-8.9.0.tgz#d7bf7272dcbecd3a2aa18bd0b96c7d2f270c15d4"
@@ -388,9 +395,9 @@ tunnel@0.0.6, tunnel@^0.0.6:
integrity sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg== integrity sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg==
typescript@^3.8.3: typescript@^3.8.3:
version "3.8.3" version "3.9.10"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-3.8.3.tgz#409eb8544ea0335711205869ec458ab109ee1061" resolved "https://registry.yarnpkg.com/typescript/-/typescript-3.9.10.tgz#70f3910ac7a51ed6bef79da7800690b19bf778b8"
integrity sha512-MYlEfn5VrLNsgudQTVJeNaQFUAI7DkhnOjdpAp4T+ku1TfQClewlbSuTVHiA+8skNBgaf02TL/kLOvig4y3G8w== integrity sha512-w6fIxVE/H1PkLKcCPsFqKE7Kv7QUwhU8qQY2MueZXWx5cPZdwFupLgKK3vntcK98BtNHZtAF4LA/yl2a7k8R6Q==
universal-user-agent@^4.0.0: universal-user-agent@^4.0.0:
version "4.0.1" version "4.0.1"
@@ -411,6 +418,19 @@ uuid@^8.3.2:
resolved "https://registry.yarnpkg.com/uuid/-/uuid-8.3.2.tgz#80d5b5ced271bb9af6c445f21a1a04c606cefbe2" resolved "https://registry.yarnpkg.com/uuid/-/uuid-8.3.2.tgz#80d5b5ced271bb9af6c445f21a1a04c606cefbe2"
integrity sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg== integrity sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==
webidl-conversions@^3.0.0:
version "3.0.1"
resolved "https://registry.yarnpkg.com/webidl-conversions/-/webidl-conversions-3.0.1.tgz#24534275e2a7bc6be7bc86611cc16ae0a5654871"
integrity sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ==
whatwg-url@^5.0.0:
version "5.0.0"
resolved "https://registry.yarnpkg.com/whatwg-url/-/whatwg-url-5.0.0.tgz#966454e8765462e37644d3626f6742ce8b70965d"
integrity sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw==
dependencies:
tr46 "~0.0.3"
webidl-conversions "^3.0.0"
which@^1.2.9: which@^1.2.9:
version "1.3.1" version "1.3.1"
resolved "https://registry.yarnpkg.com/which/-/which-1.3.1.tgz#a45043d54f5805316da8d62f9f50918d3da70b0a" resolved "https://registry.yarnpkg.com/which/-/which-1.3.1.tgz#a45043d54f5805316da8d62f9f50918d3da70b0a"

View File

@@ -1,8 +1,8 @@
-"use strict";
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
+'use strict';
 Object.defineProperty(exports, "__esModule", { value: true });
 const fs = require("fs");
 const path = require("path");
@@ -14,7 +14,7 @@ shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
 shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
 shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
 // Add `package.json` and `yarn.lock` files
-for (let dir of dirs) {
+for (const dir of dirs) {
     const packageJsonPath = path.join(ROOT, dir, 'package.json');
     const packageJson = JSON.parse(fs.readFileSync(packageJsonPath).toString());
     const relevantPackageJsonSections = {

View File

@@ -3,8 +3,6 @@
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
-
 import * as fs from 'fs';
 import * as path from 'path';
 import * as crypto from 'crypto';
@@ -19,7 +17,7 @@ shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
 shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
 // Add `package.json` and `yarn.lock` files
-for (let dir of dirs) {
+for (const dir of dirs) {
 	const packageJsonPath = path.join(ROOT, dir, 'package.json');
 	const packageJson = JSON.parse(fs.readFileSync(packageJsonPath).toString());
 	const relevantPackageJsonSections = {

View File

@@ -1,8 +1,8 @@
-"use strict";
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
+'use strict';
 Object.defineProperty(exports, "__esModule", { value: true });
 const fs = require("fs");
 const crypto = require("crypto");
@@ -130,7 +130,6 @@ function getEnv(name) {
     return result;
 }
 async function main() {
-    var _a;
     const [, , product, os, arch, unprocessedType, fileName, filePath] = process.argv;
     // getPlatform needs the unprocessedType
     const platform = getPlatform(product, os, arch, unprocessedType);
@@ -169,7 +168,7 @@ async function main() {
             console.log('Blob successfully uploaded to Azure storage.');
         })
     ];
-    const shouldUploadToMooncake = /true/i.test((_a = process.env['VSCODE_PUBLISH_TO_MOONCAKE']) !== null && _a !== void 0 ? _a : 'true');
+    const shouldUploadToMooncake = /true/i.test(process.env['VSCODE_PUBLISH_TO_MOONCAKE'] ?? 'true');
     if (shouldUploadToMooncake) {
         const mooncakeCredential = new identity_1.ClientSecretCredential(process.env['AZURE_MOONCAKE_TENANT_ID'], process.env['AZURE_MOONCAKE_CLIENT_ID'], process.env['AZURE_MOONCAKE_CLIENT_SECRET']);
         const mooncakeBlobServiceClient = new storage_blob_1.BlobServiceClient(`https://vscode.blob.core.chinacloudapi.cn`, mooncakeCredential, storagePipelineOptions);

View File
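The `?? 'true'` change above is the TypeScript compiler no longer downlevelling nullish coalescing into the `var _a; (_a = ...) !== null && _a !== void 0 ? _a : ...` pattern. A small sketch of why the two forms are equivalent — and why `??` differs from `||` — using a hypothetical `shouldUpload` helper modeled on the diff:

```typescript
// `flag ?? 'true'` substitutes the default only for null/undefined, which is
// exactly what the old downlevelled `(_a = flag) !== null && _a !== void 0 ? _a : 'true'` did.
function shouldUpload(flag: string | undefined): boolean {
	return /true/i.test(flag ?? 'true')
}

console.log(shouldUpload(undefined)) // unset env var: default 'true' applies
console.log(shouldUpload('false'))   // explicit opt-out is respected
console.log(shouldUpload(''))        // '' is NOT nullish, so no default kicks in
```

With `||` instead of `??`, the empty-string case would silently fall back to the default, which is the subtle difference the compiled output preserves.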

@@ -3,8 +3,6 @@
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
-
 import * as fs from 'fs';
 import { Readable } from 'stream';
 import * as crypto from 'crypto';

View File

@@ -1,8 +1,8 @@
-"use strict";
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
+'use strict';
 Object.defineProperty(exports, "__esModule", { value: true });
 const identity_1 = require("@azure/identity");
 const cosmos_1 = require("@azure/cosmos");
@@ -19,12 +19,11 @@ function getEnv(name) {
     return result;
 }
 async function main() {
-    var _a, _b, _c;
     const [, , _version] = process.argv;
     const quality = getEnv('VSCODE_QUALITY');
-    const commit = ((_a = process.env['VSCODE_DISTRO_COMMIT']) === null || _a === void 0 ? void 0 : _a.trim()) || getEnv('BUILD_SOURCEVERSION');
+    const commit = process.env['VSCODE_DISTRO_COMMIT']?.trim() || getEnv('BUILD_SOURCEVERSION');
     const queuedBy = getEnv('BUILD_QUEUEDBY');
-    const sourceBranch = ((_b = process.env['VSCODE_DISTRO_REF']) === null || _b === void 0 ? void 0 : _b.trim()) || getEnv('BUILD_SOURCEBRANCH');
+    const sourceBranch = process.env['VSCODE_DISTRO_REF']?.trim() || getEnv('BUILD_SOURCEBRANCH');
     const version = _version + (quality === 'stable' ? '' : `-${quality}`);
     console.log('Creating build...');
     console.log('Quality:', quality);
@@ -35,7 +34,7 @@ async function main() {
         timestamp: (new Date()).getTime(),
         version,
         isReleased: false,
-        private: Boolean((_c = process.env['VSCODE_DISTRO_REF']) === null || _c === void 0 ? void 0 : _c.trim()),
+        private: Boolean(process.env['VSCODE_DISTRO_REF']?.trim()),
         sourceBranch,
         queuedBy,
         assets: [],
@@ -44,7 +43,7 @@ async function main() {
     const aadCredentials = new identity_1.ClientSecretCredential(process.env['AZURE_TENANT_ID'], process.env['AZURE_CLIENT_ID'], process.env['AZURE_CLIENT_SECRET']);
     const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], aadCredentials });
     const scripts = client.database('builds').container(quality).scripts;
-    await (0, retry_1.retry)(() => scripts.storedProcedure('createBuild').execute('', [Object.assign(Object.assign({}, build), { _partitionKey: '' })]));
+    await (0, retry_1.retry)(() => scripts.storedProcedure('createBuild').execute('', [{ ...build, _partitionKey: '' }]));
 }
 main().then(() => {
     console.log('Build successfully created');

View File
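The `createBuild` change above likewise replaces a compiled `Object.assign` chain with a native object spread when attaching the Cosmos DB partition key. For a shallow merge the two forms are interchangeable, which a short sketch shows — `build` here is a stand-in object, not the real build document:

```typescript
// Stand-in for the build document the script sends to the stored procedure.
const build = { id: 'abc123', version: '1.44.0-insider', isReleased: false }

// Old compiled output: nested Object.assign calls.
const viaAssign = Object.assign(Object.assign({}, build), { _partitionKey: '' })
// New output: native spread, same shallow merge.
const viaSpread = { ...build, _partitionKey: '' }

console.log(JSON.stringify(viaAssign) === JSON.stringify(viaSpread)) // identical results
```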

@@ -3,8 +3,6 @@
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
-
 import { ClientSecretCredential } from '@azure/identity';
 import { CosmosClient } from '@azure/cosmos';
 import { retry } from './retry';

View File

@@ -1,8 +1,8 @@
-"use strict";
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
+'use strict';
 Object.defineProperty(exports, "__esModule", { value: true });
 const fs = require("fs");
 const path = require("path");

View File

@@ -3,8 +3,6 @@
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
-
 import * as fs from 'fs';
 import * as path from 'path';

View File

@@ -6,10 +6,10 @@
 Object.defineProperty(exports, "__esModule", { value: true });
 const fs = require("fs");
 const crypto = require("crypto");
-const azure = require("azure-storage");
 const mime = require("mime");
 const minimist = require("minimist");
 const documentdb_1 = require("documentdb");
+const storage_blob_1 = require("@azure/storage-blob");
 // {{SQL CARBON EDIT}}
 if (process.argv.length < 9) {
     console.error('Usage: node publish.js <product_quality> <platform> <file_type> <file_name> <version> <is_update> <file> [commit_id]');
@@ -104,21 +104,23 @@ function createOrUpdate(commit, quality, platform, type, release, asset, isUpdat
         });
     }));
 }
-async function assertContainer(blobService, quality) {
-    await new Promise((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
+async function assertContainer(containerClient) {
+    let containerResponse = await containerClient.createIfNotExists({ access: 'blob' });
+    return containerResponse && !!containerResponse.errorCode;
 }
-async function doesAssetExist(blobService, quality, blobName) {
-    const existsResult = await new Promise((c, e) => blobService.doesBlobExist(quality, blobName, (err, r) => err ? e(err) : c(r)));
-    return existsResult.exists;
-}
-async function uploadBlob(blobService, quality, blobName, file) {
-    const blobOptions = {
-        contentSettings: {
-            contentType: mime.lookup(file),
-            cacheControl: 'max-age=31536000, public'
+async function uploadBlob(blobClient, file) {
+    const result = await blobClient.uploadFile(file, {
+        blobHTTPHeaders: {
+            blobContentType: mime.lookup(file),
+            blobCacheControl: 'max-age=31536000, public'
         }
-    };
-    await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c()));
+    });
+    if (result && !result.errorCode) {
+        console.log(`Blobs uploaded successfully, response status: ${result?._response?.status}`);
+    }
+    else {
+        console.error(`Blobs failed to upload, response status: ${result?._response?.status}, errorcode: ${result?.errorCode}`);
+    }
 }
 async function publish(commit, quality, platform, type, name, version, _isUpdate, file, opts) {
     const isUpdate = _isUpdate === 'true';
@@ -142,54 +144,62 @@ async function publish(commit, quality, platform, type, name, version, _isUpdate
     console.log('SHA256:', sha256hash);
     const blobName = commit + '/' + name;
     const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2'];
-    const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2'])
-        .withFilter(new azure.ExponentialRetryPolicyFilter(20));
-    await assertContainer(blobService, quality);
-    const blobExists = await doesAssetExist(blobService, quality, blobName);
-    if (blobExists) {
-        console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
-        return;
-    }
-    console.log('Uploading blobs to Azure storage...');
-    await uploadBlob(blobService, quality, blobName, file);
-    console.log('Blobs successfully uploaded.');
-    const config = await getConfig(quality);
-    console.log('Quality config:', config);
-    const asset = {
-        platform: platform,
-        type: type,
-        url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
-        hash: sha1hash,
-        sha256hash,
-        size
-    };
-    // Remove this if we ever need to rollback fast updates for windows
-    if (/win32/.test(platform)) {
-        asset.supportsFastUpdate = true;
-    }
-    console.log('Asset:', JSON.stringify(asset, null, ' '));
-    // {{SQL CARBON EDIT}}
-    // Insiders: nightly build from main
-    const isReleased = (((quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
-        (quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))) &&
-        /Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy));
-    const release = {
-        id: commit,
-        timestamp: (new Date()).getTime(),
-        version,
-        isReleased: isReleased,
-        sourceBranch,
-        queuedBy,
-        assets: [],
-        updates: {}
-    };
-    if (!opts['upload-only']) {
-        release.assets.push(asset);
-        if (isUpdate) {
-            release.updates[platform] = type;
-        }
-    }
-    await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
+    const storageKey = process.env['AZURE_STORAGE_ACCESS_KEY_2'];
+    const connectionString = `DefaultEndpointsProtocol=https;AccountName=${storageAccount};AccountKey=${storageKey};EndpointSuffix=core.windows.net`;
+    let blobServiceClient = storage_blob_1.BlobServiceClient.fromConnectionString(connectionString, {
+        retryOptions: {
+            maxTries: 20,
+            retryPolicyType: storage_blob_1.StorageRetryPolicyType.EXPONENTIAL
+        }
+    });
+    let containerClient = blobServiceClient.getContainerClient(quality);
+    if (await assertContainer(containerClient)) {
+        const blobClient = containerClient.getBlockBlobClient(blobName);
+        const blobExists = await blobClient.exists();
+        if (blobExists) {
+            console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
+            return;
+        }
+        console.log('Uploading blobs to Azure storage...');
+        await uploadBlob(blobClient, file);
+        const config = await getConfig(quality);
+        console.log('Quality config:', config);
+        const asset = {
+            platform: platform,
+            type: type,
+            url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
+            hash: sha1hash,
+            sha256hash,
+            size
+        };
+        // Remove this if we ever need to rollback fast updates for windows
+        if (/win32/.test(platform)) {
+            asset.supportsFastUpdate = true;
+        }
+        console.log('Asset:', JSON.stringify(asset, null, ' '));
+        // {{SQL CARBON EDIT}}
+        // Insiders: nightly build from main
+        const isReleased = (((quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
+            (quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))) &&
+            /Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy));
+        const release = {
+            id: commit,
+            timestamp: (new Date()).getTime(),
+            version,
+            isReleased: isReleased,
+            sourceBranch,
+            queuedBy,
+            assets: [],
+            updates: {}
+        };
+        if (!opts['upload-only']) {
+            release.assets.push(asset);
+            if (isUpdate) {
+                release.updates[platform] = type;
+            }
+        }
+        await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
+    }
 }
 const RETRY_TIMES = 10;
 async function retry(fn) {

View File
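The publish scripts above migrate from the legacy callback-based `azure-storage` SDK to `@azure/storage-blob`, whose client is constructed from a connection string assembled out of the two storage env vars. A dependency-free sketch of that assembly and of parsing it back apart (the account name and key are made up; the real values come from `AZURE_STORAGE_ACCOUNT_2` and `AZURE_STORAGE_ACCESS_KEY_2`):

```typescript
const storageAccount = 'examplestorageacct' // hypothetical
const storageKey = 'examplekey=='           // hypothetical; base64 keys often end in '='

// Same shape the publish script builds:
const connectionString =
	`DefaultEndpointsProtocol=https;AccountName=${storageAccount};` +
	`AccountKey=${storageKey};EndpointSuffix=core.windows.net`

// Split back into settings: on ';', then on the FIRST '=' only, since the
// base64 account key may itself contain '=' padding.
const settings: Record<string, string> = {}
for (const pair of connectionString.split(';')) {
	const eq = pair.indexOf('=')
	settings[pair.slice(0, eq)] = pair.slice(eq + 1)
}

console.log(settings['AccountKey']) // padding survives the round trip
```

Splitting only on the first `=` is the detail that makes this round-trip safe for real account keys.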

@@ -8,10 +8,10 @@
 import * as fs from 'fs';
 import { Readable } from 'stream';
 import * as crypto from 'crypto';
-import * as azure from 'azure-storage';
 import * as mime from 'mime';
 import * as minimist from 'minimist';
 import { DocumentClient, NewDocument } from 'documentdb';
+import { BlobServiceClient, BlockBlobClient, ContainerClient, StorageRetryPolicyType } from '@azure/storage-blob';
 // {{SQL CARBON EDIT}}
 if (process.argv.length < 9) {
@@ -127,24 +127,23 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
 	}));
 }
-async function assertContainer(blobService: azure.BlobService, quality: string): Promise<void> {
-	await new Promise<void>((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
+async function assertContainer(containerClient: ContainerClient): Promise<boolean> {
+	let containerResponse = await containerClient.createIfNotExists({ access: 'blob' });
+	return containerResponse && !!containerResponse.errorCode;
 }
-async function doesAssetExist(blobService: azure.BlobService, quality: string, blobName: string): Promise<boolean | undefined> {
-	const existsResult = await new Promise<azure.BlobService.BlobResult>((c, e) => blobService.doesBlobExist(quality, blobName, (err, r) => err ? e(err) : c(r)));
-	return existsResult.exists;
-}
-
-async function uploadBlob(blobService: azure.BlobService, quality: string, blobName: string, file: string): Promise<void> {
-	const blobOptions: azure.BlobService.CreateBlockBlobRequestOptions = {
-		contentSettings: {
-			contentType: mime.lookup(file),
-			cacheControl: 'max-age=31536000, public'
+async function uploadBlob(blobClient: BlockBlobClient, file: string): Promise<void> {
+	const result = await blobClient.uploadFile(file, {
+		blobHTTPHeaders: {
+			blobContentType: mime.lookup(file),
+			blobCacheControl: 'max-age=31536000, public'
 		}
-	};
-
-	await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c()));
+	});
+	if (result && !result.errorCode) {
+		console.log(`Blobs uploaded successfully, response status: ${result?._response?.status}`);
+	} else {
+		console.error(`Blobs failed to upload, response status: ${result?._response?.status}, errorcode: ${result?.errorCode}`)
+	}
 }
 interface PublishOptions {
@@ -180,74 +179,78 @@ async function publish(commit: string, quality: string, platform: string, type:
 	const blobName = commit + '/' + name;
 	const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2']!;
-	const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2']!)
-		.withFilter(new azure.ExponentialRetryPolicyFilter(20));
-	await assertContainer(blobService, quality);
-	const blobExists = await doesAssetExist(blobService, quality, blobName);
-	if (blobExists) {
-		console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
-		return;
-	}
-	console.log('Uploading blobs to Azure storage...');
-	await uploadBlob(blobService, quality, blobName, file);
-	console.log('Blobs successfully uploaded.');
-	const config = await getConfig(quality);
-	console.log('Quality config:', config);
-	const asset: Asset = {
-		platform: platform,
-		type: type,
-		url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
-		hash: sha1hash,
-		sha256hash,
-		size
-	};
-	// Remove this if we ever need to rollback fast updates for windows
-	if (/win32/.test(platform)) {
-		asset.supportsFastUpdate = true;
-	}
-	console.log('Asset:', JSON.stringify(asset, null, ' '));
-	// {{SQL CARBON EDIT}}
-	// Insiders: nightly build from main
-	const isReleased = (
-		(
-			(quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
-			(quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))
-		) &&
-		/Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy)
-	);
-	const release = {
-		id: commit,
-		timestamp: (new Date()).getTime(),
-		version,
-		isReleased: isReleased,
-		sourceBranch,
-		queuedBy,
-		assets: [] as Array<Asset>,
-		updates: {} as any
-	};
-	if (!opts['upload-only']) {
-		release.assets.push(asset);
-		if (isUpdate) {
-			release.updates[platform] = type;
-		}
-	}
-	await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
+	const storageKey = process.env['AZURE_STORAGE_ACCESS_KEY_2']!;
+	const connectionString = `DefaultEndpointsProtocol=https;AccountName=${storageAccount};AccountKey=${storageKey};EndpointSuffix=core.windows.net`;
+	let blobServiceClient = BlobServiceClient.fromConnectionString(connectionString, {
+		retryOptions: {
+			maxTries: 20,
+			retryPolicyType: StorageRetryPolicyType.EXPONENTIAL
+		}
+	});
+	let containerClient = blobServiceClient.getContainerClient(quality);
+	if (await assertContainer(containerClient)) {
+		const blobClient = containerClient.getBlockBlobClient(blobName);
+		const blobExists = await blobClient.exists();
+		if (blobExists) {
+			console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
+			return;
+		}
+		console.log('Uploading blobs to Azure storage...');
+		await uploadBlob(blobClient, file);
+		const config = await getConfig(quality);
+		console.log('Quality config:', config);
+		const asset: Asset = {
+			platform: platform,
+			type: type,
+			url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
+			hash: sha1hash,
+			sha256hash,
+			size
+		};
+		// Remove this if we ever need to rollback fast updates for windows
+		if (/win32/.test(platform)) {
+			asset.supportsFastUpdate = true;
+		}
+		console.log('Asset:', JSON.stringify(asset, null, ' '));
+		// {{SQL CARBON EDIT}}
+		// Insiders: nightly build from main
+		const isReleased = (
+			(
+				(quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
+				(quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))
+			) &&
+			/Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy)
+		);
+		const release = {
+			id: commit,
+			timestamp: (new Date()).getTime(),
+			version,
+			isReleased: isReleased,
+			sourceBranch,
+			queuedBy,
+			assets: [] as Array<Asset>,
+			updates: {} as any
+		};
+		if (!opts['upload-only']) {
+			release.assets.push(asset);
+			if (isUpdate) {
+				release.updates[platform] = type;
+			}
+		}
+		await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
+	}
 }
 const RETRY_TIMES = 10;

View File

@@ -1,8 +1,8 @@
-"use strict";
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
+'use strict';
 Object.defineProperty(exports, "__esModule", { value: true });
 const identity_1 = require("@azure/identity");
 const cosmos_1 = require("@azure/cosmos");

View File

@@ -3,8 +3,6 @@
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
-
 import { ClientSecretCredential } from '@azure/identity';
 import { CosmosClient } from '@azure/cosmos';
 import { retry } from './retry';

View File

@@ -1,8 +1,8 @@
-"use strict";
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
+'use strict';
 Object.defineProperty(exports, "__esModule", { value: true });
 exports.retry = void 0;
 async function retry(fn) {

View File

@@ -3,8 +3,6 @@
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
-
 export async function retry<T>(fn: () => Promise<T>): Promise<T> {
 	let lastError: Error | undefined;

View File

@@ -35,25 +35,66 @@ steps:
       git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
     displayName: Merge distro
+  - script: |
+      mkdir -p .build
+      node build/azure-pipelines/common/computeNodeModulesCacheKey.js x64 $ENABLE_TERRAPIN > .build/yarnlockhash
+    displayName: Prepare yarn cache flags
+  - task: Cache@2
+    inputs:
+      key: "nodeModules | $(Agent.OS) | .build/yarnlockhash"
+      path: .build/node_modules_cache
+      cacheHitVar: NODE_MODULES_RESTORED
+    displayName: Restore node_modules cache
+  - script: |
+      set -e
+      tar -xzf .build/node_modules_cache/cache.tgz
+    displayName: Extract node_modules cache
+    condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
+  - script: |
+      set -e
+      npm install -g node-gyp@latest
+      node-gyp --version
+    displayName: Update node-gyp
+    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
   - script: |
       set -e
       npx https://aka.ms/enablesecurefeed standAlone
     timeoutInMinutes: 5
     retryCountOnTaskFailure: 3
-    condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
+    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
     displayName: Switch to Terrapin packages
   - script: |
       set -e
+      export npm_config_arch=$(VSCODE_ARCH)
+      export npm_config_node_gyp=$(which node-gyp)
       for i in {1..3}; do # try 3 times, for Terrapin
-        yarn --cwd build --frozen-lockfile --check-files && break
+        yarn --frozen-lockfile --check-files && break
         if [ $i -eq 3 ]; then
           echo "Yarn failed too many times" >&2
           exit 1
         fi
         echo "Yarn failed $i, trying again..."
       done
-    displayName: Install build dependencies
+    env:
+      ELECTRON_SKIP_BINARY_DOWNLOAD: 1
+      PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
+      GITHUB_TOKEN: "$(github-distro-mixin-password)"
+    displayName: Install dependencies
+    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
+  - script: |
+      set -e
+      node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
+      mkdir -p .build/node_modules_cache
+      tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
+    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
+    displayName: Create node_modules archive
   - download: current
     artifact: unsigned_vscode_client_darwin_$(VSCODE_ARCH)_archive
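
The cache steps added above key the restored `node_modules` on a hash written by `computeNodeModulesCacheKey.js`. A hypothetical sketch of that idea (the function name, inputs, and hashing scheme here are assumptions for illustration, not the real script): hash the lockfile contents together with the target architecture, so the cache invalidates whenever either changes.

```typescript
import { createHash } from "crypto";

// Hypothetical sketch: derive a deterministic cache key from the target
// arch plus the contents of every lockfile. Any change to either input
// yields a different key, which forces a fresh `yarn` install.
function computeCacheKey(arch: string, lockfiles: Map<string, string>): string {
	const hash = createHash("sha256");
	hash.update(arch);
	// sort paths so the key is stable regardless of enumeration order
	for (const path of [...lockfiles.keys()].sort()) {
		hash.update(path);
		hash.update(lockfiles.get(path)!);
	}
	return hash.digest("hex");
}
```

Identical arch and lockfile contents always reproduce the same key, which is what lets `Cache@2` match a previous run's archive.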

View File

@@ -1,270 +1,213 @@
+parameters:
+  - name: VSCODE_QUALITY
+    type: string
+  - name: VSCODE_RUN_UNIT_TESTS
+    type: boolean
+  - name: VSCODE_RUN_INTEGRATION_TESTS
+    type: boolean
+  - name: VSCODE_RUN_SMOKE_TESTS
+    type: boolean
 steps:
-  - task: NodeTool@0
-    inputs:
-      versionSpec: "16.x"
-  - task: AzureKeyVault@1
-    displayName: "Azure Key Vault: Get Secrets"
-    inputs:
-      azureSubscription: "vscode-builds-subscription"
-      KeyVaultName: vscode
-      SecretsFilter: "github-distro-mixin-password,macos-developer-certificate,macos-developer-certificate-key"
-  - task: DownloadPipelineArtifact@2
-    inputs:
-      artifact: Compilation
-      path: $(Build.ArtifactStagingDirectory)
-    displayName: Download compilation output
-  - script: |
-      set -e
-      tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
-    displayName: Extract compilation output
-  # Set up the credentials to retrieve distro repo and setup git persona
-  # to create a merge commit for when we merge distro into oss
-  - script: |
-      set -e
-      cat << EOF > ~/.netrc
-      machine github.com
-      login vscode
-      password $(github-distro-mixin-password)
-      EOF
-      git config user.email "vscode@microsoft.com"
-      git config user.name "VSCode"
-    displayName: Prepare tooling
-  - script: |
-      set -e
-      git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
-      echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
-      git checkout FETCH_HEAD
-    condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
-    displayName: Checkout override commit
-  - script: |
-      set -e
-      git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
-    displayName: Merge distro
-  - script: |
-      mkdir -p .build
-      node build/azure-pipelines/common/computeNodeModulesCacheKey.js $VSCODE_ARCH $ENABLE_TERRAPIN > .build/yarnlockhash
-    displayName: Prepare yarn cache flags
-  - task: Cache@2
-    inputs:
-      key: "nodeModules | $(Agent.OS) | .build/yarnlockhash"
-      path: .build/node_modules_cache
-      cacheHitVar: NODE_MODULES_RESTORED
-    displayName: Restore node_modules cache
-  - script: |
-      set -e
-      tar -xzf .build/node_modules_cache/cache.tgz
-    condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
-    displayName: Extract node_modules cache
-  - script: |
-      set -e
-      npm install -g node-gyp@latest
-      node-gyp --version
-    displayName: Update node-gyp
-    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
-  - script: |
-      set -e
-      npx https://aka.ms/enablesecurefeed standAlone
-    timeoutInMinutes: 5
-    retryCountOnTaskFailure: 3
-    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
-    displayName: Switch to Terrapin packages
-  - script: |
-      set -e
-      export npm_config_arch=$(VSCODE_ARCH)
-      export npm_config_node_gyp=$(which node-gyp)
-      for i in {1..3}; do # try 3 times, for Terrapin
-        yarn --frozen-lockfile --check-files && break
-        if [ $i -eq 3 ]; then
-          echo "Yarn failed too many times" >&2
-          exit 1
-        fi
-        echo "Yarn failed $i, trying again..."
-      done
-    env:
-      ELECTRON_SKIP_BINARY_DOWNLOAD: 1
-      PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
-      GITHUB_TOKEN: "$(github-distro-mixin-password)"
-    displayName: Install dependencies
-    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
-  - script: |
-      set -e
-      node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
-      mkdir -p .build/node_modules_cache
-      tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
-    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
-    displayName: Create node_modules archive
-  # This script brings in the right resources (images, icons, etc) based on the quality (insiders, stable, exploration)
-  - script: |
-      set -e
-      node build/azure-pipelines/mixin
-    displayName: Mix in quality
-  - script: |
-      set -e
-      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
-        yarn gulp vscode-darwin-$(VSCODE_ARCH)-min-ci
-    displayName: Build client
-  - script: |
-      set -e
-      node build/azure-pipelines/mixin --server
-    displayName: Mix in server quality
-  - script: |
-      set -e
-      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
-        yarn gulp vscode-reh-darwin-$(VSCODE_ARCH)-min-ci
-      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
-        yarn gulp vscode-reh-web-darwin-$(VSCODE_ARCH)-min-ci
-    displayName: Build Server
   - script: |
       set -e
       VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
        yarn npm-run-all -lp "electron $(VSCODE_ARCH)" "playwright-install"
     displayName: Download Electron and Playwright
-    condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
-  # Setting hardened entitlements is a requirement for:
-  # * Running tests on Big Sur (because Big Sur has additional security precautions)
-  - script: |
-      set -e
-      security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
-      security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
-      security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
-      echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
-      security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
-      security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
-      VSCODE_ARCH=$(VSCODE_ARCH) DEBUG=electron-osx-sign* node build/darwin/sign.js
-    displayName: Set Hardened Entitlements
-  - script: |
-      set -e
-      ./scripts/test.sh --build --tfs "Unit Tests"
-    displayName: Run unit tests (Electron)
-    timeoutInMinutes: 15
-  - script: |
-      set -e
-      yarn test-node --build
-    displayName: Run unit tests (node.js)
-    timeoutInMinutes: 15
-  - script: |
-      set -e
-      DEBUG=*browser* yarn test-browser-no-install --sequential --build --browser chromium --browser webkit --tfs "Browser Unit Tests"
-    displayName: Run unit tests (Browser, Chromium & Webkit)
-    timeoutInMinutes: 30
-  - script: |
-      # Figure out the full absolute path of the product we just built
-      # including the remote server and configure the integration tests
-      # to run with these builds instead of running out of sources.
-      set -e
-      APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
-      APP_NAME="`ls $APP_ROOT | head -n 1`"
-      INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
-      VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
-      ./scripts/test-integration.sh --build --tfs "Integration Tests"
-    displayName: Run integration tests (Electron)
-    timeoutInMinutes: 20
-  - script: |
-      set -e
-      VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin-$(VSCODE_ARCH)" \
-      ./scripts/test-web-integration.sh --browser webkit
-    displayName: Run integration tests (Browser, Webkit)
-    timeoutInMinutes: 20
-  - script: |
-      set -e
-      APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
-      APP_NAME="`ls $APP_ROOT | head -n 1`"
-      INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
-      VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
-      ./scripts/test-remote-integration.sh
-    displayName: Run integration tests (Remote)
-    timeoutInMinutes: 20
-  - script: |
-      set -e
-      ps -ef
-    displayName: Diagnostics before smoke test run
-    continueOnError: true
-    condition: succeededOrFailed()
-  - script: |
-      set -e
-      VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin-$(VSCODE_ARCH)" \
-      yarn smoketest-no-compile --web --tracing --headless
-    timeoutInMinutes: 10
-    displayName: Run smoke tests (Browser, Chromium)
-  - script: |
-      set -e
-      APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
-      APP_NAME="`ls $APP_ROOT | head -n 1`"
-      yarn smoketest-no-compile --tracing --build "$APP_ROOT/$APP_NAME"
-    timeoutInMinutes: 20
-    displayName: Run smoke tests (Electron)
-  - script: |
-      set -e
-      APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
-      APP_NAME="`ls $APP_ROOT | head -n 1`"
-      VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
-      yarn smoketest-no-compile --tracing --remote --build "$APP_ROOT/$APP_NAME"
-    timeoutInMinutes: 20
-    displayName: Run smoke tests (Remote)
-  - script: |
-      set -e
-      ps -ef
-    displayName: Diagnostics after smoke test run
-    continueOnError: true
-    condition: succeededOrFailed()
-  - task: PublishPipelineArtifact@0
-    inputs:
-      artifactName: crash-dump-macos-$(VSCODE_ARCH)
-      targetPath: .build/crashes
-    displayName: "Publish Crash Reports"
-    continueOnError: true
-    condition: failed()
-  # In order to properly symbolify above crash reports
-  # (if any), we need the compiled native modules too
-  - task: PublishPipelineArtifact@0
-    inputs:
-      artifactName: node-modules-macos-$(VSCODE_ARCH)
-      targetPath: node_modules
-    displayName: "Publish Node Modules"
-    continueOnError: true
-    condition: failed()
-  - task: PublishPipelineArtifact@0
-    inputs:
-      artifactName: logs-macos-$(VSCODE_ARCH)-$(System.JobAttempt)
-      targetPath: .build/logs
-    displayName: "Publish Log Files"
-    continueOnError: true
-    condition: failed()
+  - ${{ if eq(parameters.VSCODE_RUN_UNIT_TESTS, true) }}:
+    - ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
+      - script: |
+          set -e
+          ./scripts/test.sh --tfs "Unit Tests"
+        displayName: Run unit tests (Electron)
+        timeoutInMinutes: 15
+      - script: |
+          set -e
+          yarn test-node
+        displayName: Run unit tests (node.js)
+        timeoutInMinutes: 15
+      - script: |
+          set -e
+          DEBUG=*browser* yarn test-browser-no-install --sequential --browser chromium --browser webkit --tfs "Browser Unit Tests"
+        displayName: Run unit tests (Browser, Chromium & Webkit)
+        timeoutInMinutes: 30
+    - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+      - script: |
+          set -e
+          ./scripts/test.sh --build --tfs "Unit Tests"
+        displayName: Run unit tests (Electron)
+        timeoutInMinutes: 15
+      - script: |
+          set -e
+          yarn test-node --build
+        displayName: Run unit tests (node.js)
+        timeoutInMinutes: 15
+      - script: |
+          set -e
+          DEBUG=*browser* yarn test-browser-no-install --sequential --build --browser chromium --browser webkit --tfs "Browser Unit Tests"
+        displayName: Run unit tests (Browser, Chromium & Webkit)
+        timeoutInMinutes: 30
+  - ${{ if eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true) }}:
+    - script: |
+        set -e
+        yarn gulp \
+          compile-extension:configuration-editing \
+          compile-extension:css-language-features-server \
+          compile-extension:emmet \
+          compile-extension:git \
+          compile-extension:github-authentication \
+          compile-extension:html-language-features-server \
+          compile-extension:ipynb \
+          compile-extension:json-language-features-server \
+          compile-extension:markdown-language-features-server \
+          compile-extension:markdown-language-features \
+          compile-extension-media \
+          compile-extension:microsoft-authentication \
+          compile-extension:typescript-language-features \
+          compile-extension:vscode-api-tests \
+          compile-extension:vscode-colorize-tests \
+          compile-extension:vscode-notebook-tests \
+          compile-extension:vscode-test-resolver
+      displayName: Build integration tests
+    - ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
+      - script: |
+          ./scripts/test-integration.sh --tfs "Integration Tests"
+        displayName: Run integration tests (Electron)
+        timeoutInMinutes: 20
+    - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+      - script: |
+          # Figure out the full absolute path of the product we just built
+          # including the remote server and configure the integration tests
+          # to run with these builds instead of running out of sources.
+          set -e
+          APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
+          APP_NAME="`ls $APP_ROOT | head -n 1`"
+          INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
+          VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
+          ./scripts/test-integration.sh --build --tfs "Integration Tests"
+        displayName: Run integration tests (Electron)
+        timeoutInMinutes: 20
+      - script: |
+          set -e
+          VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin-$(VSCODE_ARCH)" \
+          ./scripts/test-web-integration.sh --browser webkit
+        displayName: Run integration tests (Browser, Webkit)
+        timeoutInMinutes: 20
+      - script: |
+          set -e
+          APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
+          APP_NAME="`ls $APP_ROOT | head -n 1`"
+          INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
+          VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
+          ./scripts/test-remote-integration.sh
+        displayName: Run integration tests (Remote)
+        timeoutInMinutes: 20
+  - ${{ if eq(parameters.VSCODE_RUN_SMOKE_TESTS, true) }}:
+    - script: |
+        set -e
+        ps -ef
+      displayName: Diagnostics before smoke test run
+      continueOnError: true
+      condition: succeededOrFailed()
+    - ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
+      - script: |
+          set -e
+          yarn --cwd test/smoke compile
+        displayName: Compile smoke tests
+      - script: |
+          set -e
+          yarn smoketest-no-compile --tracing
+        timeoutInMinutes: 20
+        displayName: Run smoke tests (Electron)
+    - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+      - script: |
+          set -e
+          APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
+          APP_NAME="`ls $APP_ROOT | head -n 1`"
+          yarn smoketest-no-compile --tracing --build "$APP_ROOT/$APP_NAME"
+        timeoutInMinutes: 20
+        displayName: Run smoke tests (Electron)
+      - script: |
+          set -e
+          VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin-$(VSCODE_ARCH)" \
+          yarn smoketest-no-compile --web --tracing --headless
+        timeoutInMinutes: 20
+        displayName: Run smoke tests (Browser, Chromium)
+      - script: |
+          set -e
+          yarn gulp compile-extension:vscode-test-resolver
+          APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
+          APP_NAME="`ls $APP_ROOT | head -n 1`"
+          VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin-$(VSCODE_ARCH)" \
+          yarn smoketest-no-compile --tracing --remote --build "$APP_ROOT/$APP_NAME"
+        timeoutInMinutes: 20
+        displayName: Run smoke tests (Remote)
+    - script: |
+        set -e
+        ps -ef
+      displayName: Diagnostics after smoke test run
+      continueOnError: true
+      condition: succeededOrFailed()
+  - ${{ if or(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
+    - task: PublishPipelineArtifact@0
+      inputs:
+        targetPath: .build/crashes
+        ${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
+          artifactName: crash-dump-macos-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
+        ${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
+          artifactName: crash-dump-macos-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
+        ${{ else }}:
+          artifactName: crash-dump-macos-$(VSCODE_ARCH)-$(System.JobAttempt)
+      displayName: "Publish Crash Reports"
+      continueOnError: true
+      condition: failed()
+    # In order to properly symbolify above crash reports
+    # (if any), we need the compiled native modules too
+    - task: PublishPipelineArtifact@0
+      inputs:
+        targetPath: node_modules
+        ${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
+          artifactName: node-modules-macos-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
+        ${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
+          artifactName: node-modules-macos-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
+        ${{ else }}:
+          artifactName: node-modules-macos-$(VSCODE_ARCH)-$(System.JobAttempt)
+      displayName: "Publish Node Modules"
+      continueOnError: true
+      condition: failed()
+    - task: PublishPipelineArtifact@0
+      inputs:
+        targetPath: .build/logs
+        ${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
+          artifactName: logs-macos-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
+        ${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
+          artifactName: logs-macos-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
+        ${{ else }}:
+          artifactName: logs-macos-$(VSCODE_ARCH)-$(System.JobAttempt)
+      displayName: "Publish Log Files"
+      continueOnError: true
+      condition: succeededOrFailed()
   - task: PublishTestResults@2
     displayName: Publish Tests Results

View File

@@ -51,6 +51,50 @@ steps:
       set -e
       tar -xzf .build/node_modules_cache/cache.tgz
     displayName: Extract node_modules cache
+    condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
+  - script: |
+      set -e
+      npm install -g node-gyp@latest
+      node-gyp --version
+    displayName: Update node-gyp
+    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
+  - script: |
+      set -e
+      npx https://aka.ms/enablesecurefeed standAlone
+    timeoutInMinutes: 5
+    retryCountOnTaskFailure: 3
+    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
+    displayName: Switch to Terrapin packages
+  - script: |
+      set -e
+      export npm_config_arch=$(VSCODE_ARCH)
+      export npm_config_node_gyp=$(which node-gyp)
+      for i in {1..3}; do # try 3 times, for Terrapin
+        yarn --frozen-lockfile --check-files && break
+        if [ $i -eq 3 ]; then
+          echo "Yarn failed too many times" >&2
+          exit 1
+        fi
+        echo "Yarn failed $i, trying again..."
+      done
+    env:
+      ELECTRON_SKIP_BINARY_DOWNLOAD: 1
+      PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
+      GITHUB_TOKEN: "$(github-distro-mixin-password)"
+    displayName: Install dependencies
+    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
+  - script: |
+      set -e
+      node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
+      mkdir -p .build/node_modules_cache
+      tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
+    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
+    displayName: Create node_modules archive
   - script: |
       set -e
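
The "Create node_modules archive" step above feeds `tar --files-from` a list written by `listNodeModules.js`. A hypothetical sketch of producing such a list (the function name and the filtering rules are illustrative assumptions, not the real script): keep only `node_modules` entries, drop transient cache directories, and emit one path per line in a stable order.

```typescript
// Hypothetical sketch: build the newline-separated file list that
// `tar --files-from` consumes. Filtering rules here are assumptions.
function buildArchiveList(entries: string[]): string {
	return entries
		.filter(e => e.startsWith("node_modules/") && !e.endsWith(".cache"))
		.sort() // stable ordering keeps the archive reproducible
		.join("\n");
}
```

Sorting the paths means two runs over the same tree produce byte-identical lists, which keeps the resulting cache archive deterministic.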

View File

@@ -1,50 +1,73 @@
+parameters:
+  - name: VSCODE_PUBLISH
+    type: boolean
+  - name: VSCODE_QUALITY
+    type: string
+  - name: VSCODE_RUN_UNIT_TESTS
+    type: boolean
+  - name: VSCODE_RUN_INTEGRATION_TESTS
+    type: boolean
+  - name: VSCODE_RUN_SMOKE_TESTS
+    type: boolean
 steps:
+  - ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
+    - checkout: self
+      fetchDepth: 1
+      retryCountOnTaskFailure: 3
   - task: NodeTool@0
     inputs:
       versionSpec: "16.x"
-  - task: AzureKeyVault@1
-    displayName: "Azure Key Vault: Get Secrets"
-    inputs:
-      azureSubscription: "vscode-builds-subscription"
-      KeyVaultName: vscode
-      SecretsFilter: "github-distro-mixin-password,macos-developer-certificate,macos-developer-certificate-key"
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - task: AzureKeyVault@1
+      displayName: "Azure Key Vault: Get Secrets"
+      inputs:
+        azureSubscription: "vscode-builds-subscription"
+        KeyVaultName: vscode
+        SecretsFilter: "github-distro-mixin-password,macos-developer-certificate,macos-developer-certificate-key"
-  - task: DownloadPipelineArtifact@2
-    inputs:
-      artifact: Compilation
-      path: $(Build.ArtifactStagingDirectory)
-    displayName: Download compilation output
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - task: DownloadPipelineArtifact@2
+      inputs:
+        artifact: Compilation
+        path: $(Build.ArtifactStagingDirectory)
+      displayName: Download compilation output
-  - script: |
-      set -e
-      tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
-    displayName: Extract compilation output
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
+      displayName: Extract compilation output
-  - script: |
-      set -e
-      cat << EOF > ~/.netrc
-      machine github.com
-      login vscode
-      password $(github-distro-mixin-password)
-      EOF
-      git config user.email "vscode@microsoft.com"
-      git config user.name "VSCode"
-    displayName: Prepare tooling
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        cat << EOF > ~/.netrc
+        machine github.com
+        login vscode
+        password $(github-distro-mixin-password)
+        EOF
+        git config user.email "vscode@microsoft.com"
+        git config user.name "VSCode"
+      displayName: Prepare tooling
-  - script: |
-      set -e
-      git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
-      echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
-      git checkout FETCH_HEAD
-    condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
-    displayName: Checkout override commit
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
+        echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
+        git checkout FETCH_HEAD
+      condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
+      displayName: Checkout override commit
-  - script: |
-      set -e
-      git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
-    displayName: Merge distro
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
+      displayName: Merge distro
   - script: |
       mkdir -p .build
@@ -64,13 +87,6 @@ steps:
     condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
     displayName: Extract node_modules cache
-  - script: |
-      set -e
-      npm install -g node-gyp@latest
-      node-gyp --version
-    displayName: Update node-gyp
-    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
   - script: |
       set -e
       npx https://aka.ms/enablesecurefeed standAlone
@@ -107,115 +123,142 @@ steps:
     condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
     displayName: Create node_modules archive
-  # This script brings in the right resources (images, icons, etc) based on the quality (insiders, stable, exploration)
-  - script: |
-      set -e
-      node build/azure-pipelines/mixin
-    displayName: Mix in quality
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    # This script brings in the right resources (images, icons, etc) based on the quality (insiders, stable, exploration)
+    - script: |
+        set -e
+        node build/azure-pipelines/mixin
+      displayName: Mix in quality
-  - script: |
-      set -e
-      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
-        yarn gulp vscode-darwin-$(VSCODE_ARCH)-min-ci
-    displayName: Build client
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+          yarn gulp vscode-darwin-$(VSCODE_ARCH)-min-ci
+      displayName: Build client
-  - script: |
-      set -e
-      node build/azure-pipelines/mixin --server
-    displayName: Mix in server quality
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        node build/azure-pipelines/mixin --server
+      displayName: Mix in server quality
-  - script: |
-      set -e
-      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
-        yarn gulp vscode-reh-darwin-$(VSCODE_ARCH)-min-ci
-      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
-        yarn gulp vscode-reh-web-darwin-$(VSCODE_ARCH)-min-ci
-    displayName: Build Server
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+          yarn gulp vscode-reh-darwin-$(VSCODE_ARCH)-min-ci
+        VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+          yarn gulp vscode-reh-web-darwin-$(VSCODE_ARCH)-min-ci
+      displayName: Build Server
+  - ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+          yarn gulp "transpile-client" "transpile-extensions"
+      displayName: Transpile
+  - ${{ if or(eq(parameters.VSCODE_RUN_UNIT_TESTS, true), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
+    - template: product-build-darwin-test.yml
+      parameters:
+        VSCODE_QUALITY: ${{ parameters.VSCODE_QUALITY }}
+        VSCODE_RUN_UNIT_TESTS: ${{ parameters.VSCODE_RUN_UNIT_TESTS }}
+        VSCODE_RUN_INTEGRATION_TESTS: ${{ parameters.VSCODE_RUN_INTEGRATION_TESTS }}
+        VSCODE_RUN_SMOKE_TESTS: ${{ parameters.VSCODE_RUN_SMOKE_TESTS }}
-  # Setting hardened entitlements is a requirement for:
-  # * Apple notarization
-  # * Running tests on Big Sur (because Big Sur has additional security precautions)
-  - script: |
-      set -e
-      security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
-      security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
-      security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
-      echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
-      security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
-      security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
-      VSCODE_ARCH=$(VSCODE_ARCH) DEBUG=electron-osx-sign* node build/darwin/sign.js
-    displayName: Set Hardened Entitlements
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    # Setting hardened entitlements is a requirement for:
+    # * Apple notarization
+    # * Running tests on Big Sur (because Big Sur has additional security precautions)
+    - script: |
+        set -e
+        security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
+        security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
+        security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
+        echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
+        security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
+        security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
+        VSCODE_ARCH=$(VSCODE_ARCH) DEBUG=electron-osx-sign* node build/darwin/sign.js
+      displayName: Set Hardened Entitlements
-  - script: |
-      set -e
-      pushd $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH) && zip -r -X -y $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip * && popd
-    displayName: Archive build
+  - ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
+    - script: |
+        set -e
+        pushd $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH) && zip -r -X -y $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip * && popd
+      displayName: Archive build
-  - script: |
-      set -e
-      # package Remote Extension Host
-      pushd .. && mv vscode-reh-darwin-$(VSCODE_ARCH) vscode-server-darwin-$(VSCODE_ARCH) && zip -Xry vscode-server-darwin-$(VSCODE_ARCH).zip vscode-server-darwin-$(VSCODE_ARCH) && popd
-      # package Remote Extension Host (Web)
-      pushd .. && mv vscode-reh-web-darwin-$(VSCODE_ARCH) vscode-server-darwin-$(VSCODE_ARCH)-web && zip -Xry vscode-server-darwin-$(VSCODE_ARCH)-web.zip vscode-server-darwin-$(VSCODE_ARCH)-web && popd
-    displayName: Prepare to publish servers
+  - ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
+    - script: |
+        set -e
+        # package Remote Extension Host
+        pushd .. && mv vscode-reh-darwin-$(VSCODE_ARCH) vscode-server-darwin-$(VSCODE_ARCH) && zip -Xry vscode-server-darwin-$(VSCODE_ARCH).zip vscode-server-darwin-$(VSCODE_ARCH) && popd
+        # package Remote Extension Host (Web)
+        pushd .. && mv vscode-reh-web-darwin-$(VSCODE_ARCH) vscode-server-darwin-$(VSCODE_ARCH)-web && zip -Xry vscode-server-darwin-$(VSCODE_ARCH)-web.zip vscode-server-darwin-$(VSCODE_ARCH)-web && popd
+      displayName: Prepare to publish servers
-  - publish: $(Agent.BuildDirectory)/VSCode-darwin-$(VSCODE_ARCH).zip
-    artifact: unsigned_vscode_client_darwin_$(VSCODE_ARCH)_archive
-    displayName: Publish client archive
-  - publish: $(Agent.BuildDirectory)/vscode-server-darwin-$(VSCODE_ARCH).zip
-    artifact: vscode_server_darwin_$(VSCODE_ARCH)_archive-unsigned
-    displayName: Publish server archive
-  - publish: $(Agent.BuildDirectory)/vscode-server-darwin-$(VSCODE_ARCH)-web.zip
-    artifact: vscode_web_darwin_$(VSCODE_ARCH)_archive-unsigned
-    displayName: Publish web server archive
-  - task: AzureCLI@2
-    inputs:
-      azureSubscription: "vscode-builds-subscription"
-      scriptType: pscore
-      scriptLocation: inlineScript
-      addSpnToEnvironment: true
-      inlineScript: |
-        Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]$env:tenantId"
-        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID]$env:servicePrincipalId"
-        Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET;issecret=true]$env:servicePrincipalKey"
-  - script: |
-      set -e
-      AZURE_STORAGE_ACCOUNT="ticino" \
-      AZURE_TENANT_ID="$(AZURE_TENANT_ID)" \
-      AZURE_CLIENT_ID="$(AZURE_CLIENT_ID)" \
-      AZURE_CLIENT_SECRET="$(AZURE_CLIENT_SECRET)" \
-      VSCODE_ARCH="$(VSCODE_ARCH)" \
-      node build/azure-pipelines/upload-configuration
-    displayName: Upload configuration (for Bing settings search)
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))
-    continueOnError: true
-  - task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
-    displayName: Generate SBOM (client)
-    inputs:
-      BuildDropPath: $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
-      PackageName: Visual Studio Code
-    condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
-  - publish: $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)/_manifest
-    displayName: Publish SBOM (client)
-    artifact: vscode_client_darwin_$(VSCODE_ARCH)_sbom
-    condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
-  - task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
-    displayName: Generate SBOM (server)
-    inputs:
-      BuildDropPath: $(agent.builddirectory)/vscode-server-darwin-$(VSCODE_ARCH)
-      PackageName: Visual Studio Code Server
+  - ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
+    - task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
+      displayName: Generate SBOM (client)
+      inputs:
+        BuildDropPath: $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
+        PackageName: Visual Studio Code
+  - ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
+    - publish: $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)/_manifest
+      displayName: Publish SBOM (client)
+      artifact: vscode_client_darwin_$(VSCODE_ARCH)_sbom
+  - ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
+    - task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
+      displayName: Generate SBOM (server)
+      inputs:
+        BuildDropPath: $(agent.builddirectory)/vscode-server-darwin-$(VSCODE_ARCH)
+        PackageName: Visual Studio Code Server
+  - ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
+    - publish: $(agent.builddirectory)/vscode-server-darwin-$(VSCODE_ARCH)/_manifest
+      displayName: Publish SBOM (server)
+      artifact: vscode_server_darwin_$(VSCODE_ARCH)_sbom
+  - ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
+    - publish: $(Agent.BuildDirectory)/VSCode-darwin-$(VSCODE_ARCH).zip
+      artifact: unsigned_vscode_client_darwin_$(VSCODE_ARCH)_archive
+      displayName: Publish client archive
+  - ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
+    - publish: $(Agent.BuildDirectory)/vscode-server-darwin-$(VSCODE_ARCH).zip
+      artifact: vscode_server_darwin_$(VSCODE_ARCH)_archive-unsigned
+      displayName: Publish server archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(agent.builddirectory)/vscode-server-darwin-$(VSCODE_ARCH)/_manifest - ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
displayName: Publish SBOM (server) - publish: $(Agent.BuildDirectory)/vscode-server-darwin-$(VSCODE_ARCH)-web.zip
artifact: vscode_server_darwin_$(VSCODE_ARCH)_sbom artifact: vscode_web_darwin_$(VSCODE_ARCH)_archive-unsigned
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false')) displayName: Publish web server archive
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- task: AzureCLI@2
inputs:
azureSubscription: "vscode-builds-subscription"
scriptType: pscore
scriptLocation: inlineScript
addSpnToEnvironment: true
inlineScript: |
Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]$env:tenantId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID]$env:servicePrincipalId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET;issecret=true]$env:servicePrincipalKey"
- ${{ if and(eq(parameters.VSCODE_PUBLISH, true), eq(parameters.VSCODE_RUN_UNIT_TESTS, false), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
- script: |
set -e
AZURE_STORAGE_ACCOUNT="ticino" \
AZURE_TENANT_ID="$(AZURE_TENANT_ID)" \
AZURE_CLIENT_ID="$(AZURE_CLIENT_ID)" \
AZURE_CLIENT_SECRET="$(AZURE_CLIENT_SECRET)" \
VSCODE_ARCH="$(VSCODE_ARCH)" \
node build/azure-pipelines/upload-configuration
displayName: Upload configuration (for Bing settings search)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
continueOnError: true

View File

@@ -112,18 +112,19 @@ steps:
    displayName: Run unit tests
    condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
-  - script: |
-      # Figure out the full absolute path of the product we just built
-      # including the remote server and configure the integration tests
-      # to run with these builds instead of running out of sources.
-      set -e
-      APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin-x64
-      APP_NAME="`ls $APP_ROOT | head -n 1`"
-      INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
-      VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-darwin" \
-      ./scripts/test-integration.sh --build --tfs "Integration Tests"
-    displayName: Run core integration tests
-    condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
+  # {{SQL CARBON TODO}} Reenable "Run Core Integration Tests"
+  # - script: |
+  #     # Figure out the full absolute path of the product we just built
+  #     # including the remote server and configure the integration tests
+  #     # to run with these builds instead of running out of sources.
+  #     set -e
+  #     APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin-x64
+  #     APP_NAME="`ls $APP_ROOT | head -n 1`"
+  #     INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
+  #     VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-darwin" \
+  #     ./scripts/test-integration.sh --build --tfs "Integration Tests"
+  #   displayName: Run core integration tests
+  #   condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
  - script: |
      set -e

View File

@@ -4,9 +4,7 @@ pool:
trigger:
  branches:
    include: ["main", "release/*"]
-pr:
-  branches:
-    include: ["main", "release/*"]
+pr: none
steps:
  - task: NodeTool@0

View File

@@ -118,7 +118,9 @@ steps:
  - script: |
      set -e
-      docker run -e VSCODE_QUALITY -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:alpine-$(VSCODE_ARCH) /root/vscode/build/azure-pipelines/linux/scripts/install-remote-dependencies.sh
+      docker run -e VSCODE_QUALITY -e GITHUB_TOKEN -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:alpine-$(VSCODE_ARCH) /root/vscode/build/azure-pipelines/linux/scripts/install-remote-dependencies.sh
+    env:
+      GITHUB_TOKEN: "$(github-distro-mixin-password)"
    displayName: Prebuild
  - script: |

View File
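The hunk above adds `-e GITHUB_TOKEN` to the `docker run` invocation alongside an `env:` block on the step; with no `=value`, `-e` forwards the named variable from the calling environment into the container. A small sketch of that forwarding semantics, using `env -i` to stand in for the container's clean environment (the token value is a placeholder, not from the pipeline):

```shell
# Stand-in for the secret the pipeline maps via env: GITHUB_TOKEN: "$(github-distro-mixin-password)"
export GITHUB_TOKEN="dummy-secret"

# A clean environment (like a container started without -e) does not see the token:
empty=$(env -i sh -c 'printf "%s" "${GITHUB_TOKEN:-unset}"')

# Forwarding only the named variable -- what `docker run -e GITHUB_TOKEN` does:
forwarded=$(env -i GITHUB_TOKEN="$GITHUB_TOKEN" sh -c 'printf "%s" "$GITHUB_TOKEN"')

echo "$empty $forwarded"   # -> unset dummy-secret
```

Keeping the secret in the step's `env:` block (rather than inlining it in the command) also keeps it out of the logged command line.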

@@ -0,0 +1,272 @@
parameters:
- name: VSCODE_QUALITY
type: string
- name: VSCODE_RUN_UNIT_TESTS
type: boolean
- name: VSCODE_RUN_INTEGRATION_TESTS
type: boolean
- name: VSCODE_RUN_SMOKE_TESTS
type: boolean
steps:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn npm-run-all -lp "electron $(VSCODE_ARCH)" "playwright-install"
displayName: Download Electron and Playwright
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
sudo apt-get update
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libgbm1
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
displayName: Setup build environment
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
ELECTRON_ROOT=.build/electron
sudo chown root $APP_ROOT/chrome-sandbox
sudo chown root $ELECTRON_ROOT/chrome-sandbox
sudo chmod 4755 $APP_ROOT/chrome-sandbox
sudo chmod 4755 $ELECTRON_ROOT/chrome-sandbox
stat $APP_ROOT/chrome-sandbox
stat $ELECTRON_ROOT/chrome-sandbox
displayName: Change setuid helper binary permission
- ${{ if eq(parameters.VSCODE_RUN_UNIT_TESTS, true) }}:
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests"
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
- script: |
set -e
yarn test-node
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
- script: |
set -e
DEBUG=*browser* yarn test-browser-no-install --browser chromium --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser, Chromium)
timeoutInMinutes: 15
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
- script: |
set -e
yarn test-node --build
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
- script: |
set -e
DEBUG=*browser* yarn test-browser-no-install --build --browser chromium --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser, Chromium)
timeoutInMinutes: 15
- ${{ if eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true) }}:
- script: |
set -e
yarn gulp \
compile-extension:configuration-editing \
compile-extension:css-language-features-server \
compile-extension:emmet \
compile-extension:git \
compile-extension:github-authentication \
compile-extension:html-language-features-server \
compile-extension:ipynb \
compile-extension:json-language-features-server \
compile-extension:markdown-language-features-server \
compile-extension:markdown-language-features \
compile-extension-media \
compile-extension:microsoft-authentication \
compile-extension:typescript-language-features \
compile-extension:vscode-api-tests \
compile-extension:vscode-colorize-tests \
compile-extension:vscode-notebook-tests \
compile-extension:vscode-test-resolver
displayName: Build integration tests
- ${{ if eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true) }}:
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
DISPLAY=:10 ./scripts/test-integration.sh --tfs "Integration Tests"
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
- script: |
set -e
./scripts/test-web-integration.sh --browser chromium
displayName: Run integration tests (Browser, Chromium)
timeoutInMinutes: 20
- script: |
set -e
./scripts/test-remote-integration.sh
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_APP_NAME="$APP_NAME" \
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
./scripts/test-web-integration.sh --browser chromium
displayName: Run integration tests (Browser, Chromium)
timeoutInMinutes: 20
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_APP_NAME="$APP_NAME" \
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
./scripts/test-remote-integration.sh
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
- ${{ if eq(parameters.VSCODE_RUN_SMOKE_TESTS, true) }}:
- script: |
set -e
ps -ef
cat /proc/sys/fs/inotify/max_user_watches
lsof | wc -l
displayName: Diagnostics before smoke test run (processes, max_user_watches, number of opened file handles)
continueOnError: true
condition: succeededOrFailed()
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
yarn --cwd test/smoke compile
displayName: Compile smoke tests
- script: |
set -e
yarn smoketest-no-compile --tracing
timeoutInMinutes: 20
displayName: Run smoke tests (Electron)
- script: |
set -e
yarn smoketest-no-compile --web --tracing --headless --electronArgs="--disable-dev-shm-usage"
timeoutInMinutes: 20
displayName: Run smoke tests (Browser, Chromium)
- script: |
set -e
yarn gulp compile-extension:vscode-test-resolver
yarn smoketest-no-compile --remote --tracing
timeoutInMinutes: 20
displayName: Run smoke tests (Remote)
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
APP_PATH=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
yarn smoketest-no-compile --tracing --build "$APP_PATH"
timeoutInMinutes: 20
displayName: Run smoke tests (Electron)
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
yarn smoketest-no-compile --web --tracing --headless --electronArgs="--disable-dev-shm-usage"
timeoutInMinutes: 20
displayName: Run smoke tests (Browser, Chromium)
- script: |
set -e
yarn gulp compile-extension:vscode-test-resolver
APP_PATH=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
yarn smoketest-no-compile --tracing --remote --build "$APP_PATH"
timeoutInMinutes: 20
displayName: Run smoke tests (Remote)
- script: |
set -e
ps -ef
cat /proc/sys/fs/inotify/max_user_watches
lsof | wc -l
displayName: Diagnostics after smoke test run (processes, max_user_watches, number of opened file handles)
continueOnError: true
condition: succeededOrFailed()
- ${{ if or(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
- task: PublishPipelineArtifact@0
inputs:
targetPath: .build/crashes
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: crash-dump-linux-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: crash-dump-linux-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: crash-dump-linux-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
# In order to properly symbolify above crash reports
# (if any), we need the compiled native modules too
- task: PublishPipelineArtifact@0
inputs:
targetPath: node_modules
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: node-modules-linux-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: node-modules-linux-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: node-modules-linux-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Node Modules"
continueOnError: true
condition: failed()
- task: PublishPipelineArtifact@0
inputs:
targetPath: .build/logs
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: logs-linux-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: logs-linux-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: logs-linux-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Log Files"
continueOnError: true
condition: succeededOrFailed()
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: succeededOrFailed()

View File
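The recurring change across these templates is replacing runtime `condition:` checks (driven by variables such as `VSCODE_PUBLISH` and `VSCODE_STEP_ON_IT`) with compile-time `${{ if }}` template expressions driven by typed parameters. A minimal sketch of the difference (the step contents here are illustrative, not taken from the diff):

```yaml
parameters:
  - name: RUN_TESTS
    type: boolean

steps:
  # Runtime check: the step is always rendered into the job and is
  # skipped at execution time when the condition evaluates to false.
  - script: echo "runtime-gated step"
    condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))

  # Compile-time check: when the parameter is false, the step is never
  # rendered into the pipeline at all.
  - ${{ if eq(parameters.RUN_TESTS, true) }}:
    - script: echo "compile-time-gated step"
```

Template expressions are expanded when the pipeline is compiled, so excluded steps never appear in the job plan, whereas `condition:`-gated steps show up as skipped.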

@@ -1,79 +1,113 @@
+parameters:
+  - name: VSCODE_PUBLISH
+    type: boolean
+  - name: VSCODE_QUALITY
+    type: string
+  - name: VSCODE_RUN_UNIT_TESTS
+    type: boolean
+  - name: VSCODE_RUN_INTEGRATION_TESTS
+    type: boolean
+  - name: VSCODE_RUN_SMOKE_TESTS
+    type: boolean
steps:
+  - ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
+    - checkout: self
+      fetchDepth: 1
+      retryCountOnTaskFailure: 3
  - task: NodeTool@0
    inputs:
      versionSpec: "16.x"
-  - task: AzureKeyVault@1
-    displayName: "Azure Key Vault: Get Secrets"
-    inputs:
-      azureSubscription: "vscode-builds-subscription"
-      KeyVaultName: vscode
-      SecretsFilter: "github-distro-mixin-password,ESRP-PKI,esrp-aad-username,esrp-aad-password"
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - task: AzureKeyVault@1
+      displayName: "Azure Key Vault: Get Secrets"
+      inputs:
+        azureSubscription: "vscode-builds-subscription"
+        KeyVaultName: vscode
+        SecretsFilter: "github-distro-mixin-password,ESRP-PKI,esrp-aad-username,esrp-aad-password"
-  - task: DownloadPipelineArtifact@2
-    inputs:
-      artifact: Compilation
-      path: $(Build.ArtifactStagingDirectory)
-    displayName: Download compilation output
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - task: DownloadPipelineArtifact@2
+      inputs:
+        artifact: Compilation
+        path: $(Build.ArtifactStagingDirectory)
+      displayName: Download compilation output
-  - task: DownloadPipelineArtifact@2
-    inputs:
-      artifact: reh_node_modules-$(VSCODE_ARCH)
-      path: $(Build.ArtifactStagingDirectory)
-    displayName: Download server build dependencies
-    condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'armhf'))
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - task: DownloadPipelineArtifact@2
+      inputs:
+        artifact: reh_node_modules-$(VSCODE_ARCH)
+        path: $(Build.ArtifactStagingDirectory)
+      displayName: Download server build dependencies
+      condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'armhf'))
-  - script: |
-      set -e
-      # Start X server
-      /etc/init.d/xvfb start
-      # Start dbus session
-      DBUS_LAUNCH_RESULT=$(sudo dbus-daemon --config-file=/usr/share/dbus-1/system.conf --print-address)
-      echo "##vso[task.setvariable variable=DBUS_SESSION_BUS_ADDRESS]$DBUS_LAUNCH_RESULT"
-    displayName: Setup system services
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        # Start X server
+        /etc/init.d/xvfb start
+        # Start dbus session
+        DBUS_LAUNCH_RESULT=$(sudo dbus-daemon --config-file=/usr/share/dbus-1/system.conf --print-address)
+        echo "##vso[task.setvariable variable=DBUS_SESSION_BUS_ADDRESS]$DBUS_LAUNCH_RESULT"
+      displayName: Setup system services
+      condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
-  - script: |
-      set -e
-      tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
-    displayName: Extract compilation output
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
+      displayName: Extract compilation output
-  - script: |
-      set -e
-      cat << EOF > ~/.netrc
-      machine github.com
-      login vscode
-      password $(github-distro-mixin-password)
-      EOF
-      git config user.email "vscode@microsoft.com"
-      git config user.name "VSCode"
-    displayName: Prepare tooling
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        cat << EOF > ~/.netrc
+        machine github.com
+        login vscode
+        password $(github-distro-mixin-password)
+        EOF
+        git config user.email "vscode@microsoft.com"
+        git config user.name "VSCode"
+      displayName: Prepare tooling
-  - script: |
-      set -e
-      git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
-      echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
-      git checkout FETCH_HEAD
-    condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
-    displayName: Checkout override commit
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
+        echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
+        git checkout FETCH_HEAD
+      condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
+      displayName: Checkout override commit
-  - script: |
-      set -e
-      git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
-    displayName: Merge distro
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
+      displayName: Merge distro
  - script: |
      mkdir -p .build
      node build/azure-pipelines/common/computeNodeModulesCacheKey.js $VSCODE_ARCH $ENABLE_TERRAPIN > .build/yarnlockhash
    displayName: Prepare yarn cache flags
-  - task: Cache@2
-    inputs:
-      key: "nodeModules | $(Agent.OS) | .build/yarnlockhash"
-      path: .build/node_modules_cache
-      cacheHitVar: NODE_MODULES_RESTORED
-    displayName: Restore node_modules cache
+  - ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
+    - task: Cache@2
+      inputs:
+        key: "genericNodeModules | $(Agent.OS) | .build/yarnlockhash"
+        path: .build/node_modules_cache
+        cacheHitVar: NODE_MODULES_RESTORED
+      displayName: Restore node_modules cache
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - task: Cache@2
+      inputs:
+        key: "nodeModules | $(Agent.OS) | .build/yarnlockhash"
+        path: .build/node_modules_cache
+        cacheHitVar: NODE_MODULES_RESTORED
+      displayName: Restore node_modules cache
  - script: |
      set -e
@@ -91,6 +125,7 @@ steps:
  - script: |
      set -e
+      node build/npm/setupBuildYarnrc
      for i in {1..3}; do # try 3 times, for Terrapin
        yarn --cwd build --frozen-lockfile --check-files && break
        if [ $i -eq 3 ]; then
@@ -103,7 +138,16 @@ steps:
  - script: |
      set -e
-      export npm_config_arch=$(NPM_ARCH)
+      if [ "$NPM_ARCH" = "armv7l" ]; then
+        # There is no target_arch="armv7l" supported by node_gyp,
+        # arm versions for compilation are decided based on the CC
+        # macros.
+        # Mapping value is based on
+        # https://github.com/nodejs/node/blob/0903515e126c2697042d6546c6aa4b72e1a4b33e/configure.py#L49-L50
+        export npm_config_arch="arm"
+      else
+        export npm_config_arch=$(NPM_ARCH)
+      fi
      if [ -z "$CC" ] || [ -z "$CXX" ]; then
        # Download clang based on chromium revision used by vscode
@@ -143,12 +187,13 @@ steps:
    displayName: Install dependencies
    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
-  - script: |
-      set -e
-      rm -rf remote/node_modules
-      tar -xzf $(Build.ArtifactStagingDirectory)/reh_node_modules-$(VSCODE_ARCH).tar.gz --directory $(Build.SourcesDirectory)/remote
-    displayName: Extract server node_modules output
-    condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'armhf'))
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        rm -rf remote/node_modules
+        tar -xzf $(Build.ArtifactStagingDirectory)/reh_node_modules-$(VSCODE_ARCH).tar.gz --directory $(Build.SourcesDirectory)/remote
+      displayName: Extract server node_modules output
+      condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'armhf'))
  - script: |
      set -e
@@ -158,267 +203,136 @@ steps:
    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
    displayName: Create node_modules archive
-  - script: |
-      set -e
-      node build/azure-pipelines/mixin
-    displayName: Mix in quality
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        node build/azure-pipelines/mixin
+      displayName: Mix in quality
-  - script: |
-      set -e
-      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
-      yarn gulp vscode-linux-$(VSCODE_ARCH)-min-ci
-    displayName: Build
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+        yarn gulp vscode-linux-$(VSCODE_ARCH)-min-ci
+      displayName: Build
-  - script: |
-      set -e
-      node build/azure-pipelines/mixin --server
-    displayName: Mix in server quality
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        node build/azure-pipelines/mixin --server
+      displayName: Mix in server quality
-  - script: |
-      set -e
-      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
-      yarn gulp vscode-reh-linux-$(VSCODE_ARCH)-min-ci
-      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
-      yarn gulp vscode-reh-web-linux-$(VSCODE_ARCH)-min-ci
-    displayName: Build Server
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+        yarn gulp vscode-reh-linux-$(VSCODE_ARCH)-min-ci
+        VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+        yarn gulp vscode-reh-web-linux-$(VSCODE_ARCH)-min-ci
+      displayName: Build Server
-  - script: |
-      set -e
-      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
-      yarn npm-run-all -lp "electron $(VSCODE_ARCH)" "playwright-install"
-    displayName: Download Electron and Playwright
+  - ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+        yarn gulp "transpile-client" "transpile-extensions"
+      displayName: Transpile
-  - script: |
-      set -e
-      APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
-      ELECTRON_ROOT=.build/electron
-      sudo chown root $APP_ROOT/chrome-sandbox
-      sudo chown root $ELECTRON_ROOT/chrome-sandbox
-      sudo chmod 4755 $APP_ROOT/chrome-sandbox
-      sudo chmod 4755 $ELECTRON_ROOT/chrome-sandbox
-      stat $APP_ROOT/chrome-sandbox
-      stat $ELECTRON_ROOT/chrome-sandbox
-    displayName: Change setuid helper binary permission
+  - ${{ if or(eq(parameters.VSCODE_RUN_UNIT_TESTS, true), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
+    - template: product-build-linux-client-test.yml
+      parameters:
+        VSCODE_QUALITY: ${{ parameters.VSCODE_QUALITY }}
+        VSCODE_RUN_UNIT_TESTS: ${{ parameters.VSCODE_RUN_UNIT_TESTS }}
+        VSCODE_RUN_INTEGRATION_TESTS: ${{ parameters.VSCODE_RUN_INTEGRATION_TESTS }}
+        VSCODE_RUN_SMOKE_TESTS: ${{ parameters.VSCODE_RUN_SMOKE_TESTS }}
-  - script: |
-      set -e
-      ./scripts/test.sh --build --tfs "Unit Tests"
-    displayName: Run unit tests (Electron)
-    timeoutInMinutes: 15
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - script: |
+        set -e
+        yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-deb"
+        yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-rpm"
+      displayName: Build deb, rpm packages
-  - script: |
-      set -e
-      yarn test-node --build
-    displayName: Run unit tests (node.js)
-    timeoutInMinutes: 15
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - script: |
+        set -e
+        yarn gulp "vscode-linux-$(VSCODE_ARCH)-prepare-snap"
+      displayName: Prepare snap package
-  - script: |
-      set -e
-      DEBUG=*browser* yarn test-browser-no-install --build --browser chromium --tfs "Browser Unit Tests"
-    displayName: Run unit tests (Browser, Chromium)
-    timeoutInMinutes: 15
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - task: UseDotNet@2
+      inputs:
+        version: 2.x
-  - script: |
-      # Figure out the full absolute path of the product we just built
-      # including the remote server and configure the integration tests
-      # to run with these builds instead of running out of sources.
-      set -e
-      APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
-      APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
-      INTEGRATION_TEST_APP_NAME="$APP_NAME" \
-      INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
-      VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
-      ./scripts/test-integration.sh --build --tfs "Integration Tests"
-    displayName: Run integration tests (Electron)
-    timeoutInMinutes: 20
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - task: EsrpClientTool@1
+      displayName: Download ESRPClient
-  - script: |
-      set -e
-      VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
-      ./scripts/test-web-integration.sh --browser chromium
-    displayName: Run integration tests (Browser, Chromium)
-    timeoutInMinutes: 20
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - script: |
+        set -e
+        node build/azure-pipelines/common/sign "$(esrpclient.toolpath)/$(esrpclient.toolname)" rpm $(ESRP-PKI) $(esrp-aad-username) $(esrp-aad-password) .build/linux/rpm '*.rpm'
+      displayName: Codesign rpm
-  - script: |
-      set -e
-      APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
-      APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
-      INTEGRATION_TEST_APP_NAME="$APP_NAME" \
-      INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
-      VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
-      ./scripts/test-remote-integration.sh
-    displayName: Run integration tests (Remote)
-    timeoutInMinutes: 20
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - script: |
+        set -e
+        VSCODE_ARCH="$(VSCODE_ARCH)" \
+        ./build/azure-pipelines/linux/prepare-publish.sh
+      displayName: Prepare for Publish
-  - script: |
-      set -e
-      ps -ef
-      cat /proc/sys/fs/inotify/max_user_watches
-      lsof | wc -l
-    displayName: Diagnostics before smoke test run (processes, max_user_watches, number of opened file handles)
-    continueOnError: true
-    condition: and(succeededOrFailed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
+      displayName: Generate SBOM (client)
+      inputs:
+        BuildDropPath: $(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
+        PackageName: Visual Studio Code
-  - script: |
-      set -e
-      VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
-      yarn smoketest-no-compile --web --tracing --headless --electronArgs="--disable-dev-shm-usage"
-    timeoutInMinutes: 10
-    displayName: Run smoke tests (Browser, Chromium)
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - publish: $(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)/_manifest
+      displayName: Publish SBOM (client)
+      artifact: vscode_client_linux_$(VSCODE_ARCH)_sbom
-  - script: |
-      set -e
-      APP_PATH=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
-      yarn smoketest-no-compile --tracing --build "$APP_PATH"
-    timeoutInMinutes: 20
-    displayName: Run smoke tests (Electron)
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
+      displayName: Generate SBOM (server)
+      inputs:
+        BuildDropPath: $(agent.builddirectory)/vscode-server-linux-$(VSCODE_ARCH)
+        PackageName: Visual Studio Code Server
-  - script: |
-      set -e
-      APP_PATH=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
-      VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
-      yarn smoketest-no-compile --tracing --remote --build "$APP_PATH"
-    timeoutInMinutes: 20
-    displayName: Run smoke tests (Remote)
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - publish: $(agent.builddirectory)/vscode-server-linux-$(VSCODE_ARCH)/_manifest
+      displayName: Publish SBOM (server)
+      artifact: vscode_server_linux_$(VSCODE_ARCH)_sbom
-  - script: |
-      set -e
-      ps -ef
-      cat /proc/sys/fs/inotify/max_user_watches
-      lsof | wc -l
-    displayName: Diagnostics after smoke test run (processes, max_user_watches, number of opened file handles)
-    continueOnError: true
-    condition: and(succeededOrFailed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - publish: $(DEB_PATH)
+      artifact: vscode_client_linux_$(VSCODE_ARCH)_deb-package
+      displayName: Publish deb package
-  - task: PublishPipelineArtifact@0
-    inputs:
-      artifactName: crash-dump-linux-$(VSCODE_ARCH)
-      targetPath: .build/crashes
-    displayName: "Publish Crash Reports"
-    continueOnError: true
-    condition: failed()
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - publish: $(RPM_PATH)
+      artifact: vscode_client_linux_$(VSCODE_ARCH)_rpm-package
+      displayName: Publish rpm package
-  # In order to properly symbolify above crash reports
-  # (if any), we need the compiled native modules too
-  - task: PublishPipelineArtifact@0
-    inputs:
-      artifactName: node-modules-linux-$(VSCODE_ARCH)
-      targetPath: node_modules
-    displayName: "Publish Node Modules"
-    continueOnError: true
-    condition: failed()
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - publish: $(TARBALL_PATH)
+      artifact: vscode_client_linux_$(VSCODE_ARCH)_archive-unsigned
+      displayName: Publish client archive
-  - task: PublishPipelineArtifact@0
-    inputs:
-      artifactName: logs-linux-$(VSCODE_ARCH)-$(System.JobAttempt)
-      targetPath: .build/logs
-    displayName: "Publish Log Files"
-    continueOnError: true
-    condition: and(failed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH).tar.gz
+      artifact: vscode_server_linux_$(VSCODE_ARCH)_archive-unsigned
+      displayName: Publish server archive
-  - task: PublishTestResults@2
-    displayName: Publish Tests Results
-    inputs:
-      testResultsFiles: "*-results.xml"
+  - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
+    - publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH)-web.tar.gz
+      artifact: vscode_web_linux_$(VSCODE_ARCH)_archive-unsigned
+      displayName: Publish web server archive
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: and(succeededOrFailed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
set -e - task: PublishPipelineArtifact@0
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-deb" displayName: "Publish Pipeline Artifact"
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-rpm" inputs:
displayName: Build deb, rpm packages artifactName: "snap-$(VSCODE_ARCH)"
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false')) targetPath: .build/linux/snap-tarball
- script: |
set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-prepare-snap"
displayName: Prepare snap package
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: UseDotNet@2
inputs:
version: 2.x
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: EsrpClientTool@1
displayName: Download ESRPClient
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- script: |
set -e
node build/azure-pipelines/common/sign "$(esrpclient.toolpath)/$(esrpclient.toolname)" rpm $(ESRP-PKI) $(esrp-aad-username) $(esrp-aad-password) .build/linux/rpm '*.rpm'
displayName: Codesign rpm
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- script: |
set -e
VSCODE_ARCH="$(VSCODE_ARCH)" \
./build/azure-pipelines/linux/prepare-publish.sh
displayName: Prepare for Publish
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(DEB_PATH)
artifact: vscode_client_linux_$(VSCODE_ARCH)_deb-package
displayName: Publish deb package
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(RPM_PATH)
artifact: vscode_client_linux_$(VSCODE_ARCH)_rpm-package
displayName: Publish rpm package
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(TARBALL_PATH)
artifact: vscode_client_linux_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish client archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH).tar.gz
artifact: vscode_server_linux_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish server archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH)-web.tar.gz
artifact: vscode_web_linux_$(VSCODE_ARCH)_archive-unsigned
displayName: Publish web server archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: PublishPipelineArtifact@0
displayName: "Publish Pipeline Artifact"
inputs:
artifactName: "snap-$(VSCODE_ARCH)"
targetPath: .build/linux/snap-tarball
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (client)
inputs:
BuildDropPath: $(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
PackageName: Visual Studio Code
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (client)
artifact: vscode_client_linux_$(VSCODE_ARCH)_sbom
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
displayName: Generate SBOM (server)
inputs:
BuildDropPath: $(agent.builddirectory)/vscode-server-linux-$(VSCODE_ARCH)
PackageName: Visual Studio Code Server
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(agent.builddirectory)/vscode-server-linux-$(VSCODE_ARCH)/_manifest
displayName: Publish SBOM (server)
artifact: vscode_server_linux_$(VSCODE_ARCH)_sbom
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))


@@ -1,49 +1,58 @@
+parameters:
+  - name: VSCODE_QUALITY
+    type: string
 steps:
   - task: NodeTool@0
     inputs:
       versionSpec: "16.x"
-  - task: AzureKeyVault@1
-    displayName: "Azure Key Vault: Get Secrets"
-    inputs:
-      azureSubscription: "vscode-builds-subscription"
-      KeyVaultName: vscode
-      SecretsFilter: "github-distro-mixin-password,ESRP-PKI,esrp-aad-username,esrp-aad-password"
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - task: AzureKeyVault@1
+      displayName: "Azure Key Vault: Get Secrets"
+      inputs:
+        azureSubscription: "vscode-builds-subscription"
+        KeyVaultName: vscode
+        SecretsFilter: "github-distro-mixin-password,ESRP-PKI,esrp-aad-username,esrp-aad-password"
-  - task: Docker@1
-    displayName: "Pull Docker image"
-    inputs:
-      azureSubscriptionEndpoint: "vscode-builds-subscription"
-      azureContainerRegistry: vscodehub.azurecr.io
-      command: "Run an image"
-      imageName: "vscode-linux-build-agent:centos7-devtoolset8-arm64"
-      containerCommand: uname
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - task: Docker@1
+      displayName: "Pull Docker image"
+      inputs:
+        azureSubscriptionEndpoint: "vscode-builds-subscription"
+        azureContainerRegistry: vscodehub.azurecr.io
+        command: "Run an image"
+        imageName: "vscode-linux-build-agent:centos7-devtoolset8-arm64"
+        containerCommand: uname
+      condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
-  - script: |
-      set -e
-      cat << EOF > ~/.netrc
-      machine github.com
-      login vscode
-      password $(github-distro-mixin-password)
-      EOF
-      git config user.email "vscode@microsoft.com"
-      git config user.name "VSCode"
-    displayName: Prepare tooling
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        cat << EOF > ~/.netrc
+        machine github.com
+        login vscode
+        password $(github-distro-mixin-password)
+        EOF
+        git config user.email "vscode@microsoft.com"
+        git config user.name "VSCode"
+      displayName: Prepare tooling
-  - script: |
-      set -e
-      git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
-      echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
-      git checkout FETCH_HEAD
-    condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
-    displayName: Checkout override commit
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
+        echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
+        git checkout FETCH_HEAD
+      condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
+      displayName: Checkout override commit
-  - script: |
-      set -e
-      git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
-    displayName: Merge distro
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
+      displayName: Merge distro
   - script: |
       set -e
@@ -61,17 +70,19 @@ steps:
       GITHUB_TOKEN: "$(github-distro-mixin-password)"
     condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
-  - script: docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
-    displayName: Register Docker QEMU
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
+      displayName: Register Docker QEMU
+      condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
-  - script: |
-      set -e
-      docker run -e VSCODE_QUALITY -e GITHUB_TOKEN -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:centos7-devtoolset8-arm64 /root/vscode/build/azure-pipelines/linux/scripts/install-remote-dependencies.sh
-    displayName: Install dependencies via qemu
-    env:
-      GITHUB_TOKEN: "$(github-distro-mixin-password)"
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
+  - ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
+    - script: |
+        set -e
+        docker run -e VSCODE_QUALITY -e GITHUB_TOKEN -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:centos7-devtoolset8-arm64 /root/vscode/build/azure-pipelines/linux/scripts/install-remote-dependencies.sh
+      displayName: Install dependencies via qemu
+      env:
+        GITHUB_TOKEN: "$(github-distro-mixin-password)"
+      condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
   - script: |
       set -e
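The "Checkout override commit" step in this template leans on git's FETCH_HEAD ref: fetch an arbitrary ref, record the commit it resolved to via an Azure DevOps logging command, then check it out detached. A minimal standalone sketch of that pattern, using a throwaway local repository in place of the real VSCODE_MIXIN_REPO remote (the repo contents here are illustrative):

```shell
#!/bin/sh
# Sketch of the "Checkout override commit" pattern: fetch a ref, capture the
# commit it resolved to via FETCH_HEAD, then check that commit out detached.
# A throwaway local repo stands in for the real distro remote (assumption).
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email ci@example.com
git config user.name ci
echo hello > file.txt
git add file.txt
git commit -qm "init"

# Fetch a ref (here: HEAD of the same repo); git records it in FETCH_HEAD
git fetch -q "$repo" HEAD
commit=$(git rev-parse FETCH_HEAD)

# Azure DevOps logging command: exposes the sha to later pipeline steps
echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$commit"

git checkout -q FETCH_HEAD   # detached HEAD at the fetched commit
```

The `ne(variables.VSCODE_DISTRO_REF, ' ')` guard in the pipeline means the fetch only runs when a private-build ref was actually supplied.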


@@ -136,48 +136,49 @@ steps:
   displayName: Run core integration tests
   condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'), ne(variables['EXTENSIONS_ONLY'], 'true'))
+# {{SQL CARBON TODO}} Reenable "Run Extension Unit Tests (Continue on Error)" and "Run Extension Unit Tests (Fail on Error)" and "Archive Logs"
-- script: |
-    # Figure out the full absolute path of the product we just built
-    # including the remote server and configure the unit tests
-    # to run with these builds instead of running out of sources.
-    set -e
-    APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
-    APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
-    INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
-    NO_CLEANUP=1 \
-    VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-linux-x64" \
-    DISPLAY=:10 ./scripts/test-extensions-unit.sh --build --tfs "Extension Unit Tests"
-  displayName: Run Extension Unit Tests (Continue on Error)
-  continueOnError: true
-  condition: and(succeeded(), and(eq(variables['RUN_TESTS'], 'true'), eq(variables['EXTENSION_UNIT_TESTS_FAIL_ON_ERROR'], 'false')))
-- script: |
-    # Figure out the full absolute path of the product we just built
-    # including the remote server and configure the unit tests
-    # to run with these builds instead of running out of sources.
-    set -e
-    APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
-    APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
-    INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
-    NO_CLEANUP=1 \
-    VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-linux-x64" \
-    DISPLAY=:10 ./scripts/test-extensions-unit.sh --build --tfs "Extension Unit Tests"
-  displayName: Run Extension Unit Tests (Fail on Error)
-  condition: and(succeeded(), and(eq(variables['RUN_TESTS'], 'true'), ne(variables['EXTENSION_UNIT_TESTS_FAIL_ON_ERROR'], 'false')))
-- bash: |
-    set -e
-    mkdir -p $(Build.ArtifactStagingDirectory)/logs/linux-x64
-    cd /tmp
-    for folder in adsuser*/
-    do
-      folder=${folder%/}
-      # Only archive directories we want for debugging purposes
-      tar -czvf $(Build.ArtifactStagingDirectory)/logs/linux-x64/$folder.tar.gz $folder/User $folder/logs
-    done
-  displayName: Archive Logs
-  continueOnError: true
-  condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
+# - script: |
+#     # Figure out the full absolute path of the product we just built
+#     # including the remote server and configure the unit tests
+#     # to run with these builds instead of running out of sources.
+#     set -e
+#     APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
+#     APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
+#     INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
+#     NO_CLEANUP=1 \
+#     VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-linux-x64" \
+#     DISPLAY=:10 ./scripts/test-extensions-unit.sh --build --tfs "Extension Unit Tests"
+#   displayName: Run Extension Unit Tests (Continue on Error)
+#   continueOnError: true
+#   condition: and(succeeded(), and(eq(variables['RUN_TESTS'], 'true'), eq(variables['EXTENSION_UNIT_TESTS_FAIL_ON_ERROR'], 'false')))
+# - script: |
+#     # Figure out the full absolute path of the product we just built
+#     # including the remote server and configure the unit tests
+#     # to run with these builds instead of running out of sources.
+#     set -e
+#     APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
+#     APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
+#     INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
+#     NO_CLEANUP=1 \
+#     VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-linux-x64" \
+#     DISPLAY=:10 ./scripts/test-extensions-unit.sh --build --tfs "Extension Unit Tests"
+#   displayName: Run Extension Unit Tests (Fail on Error)
+#   condition: and(succeeded(), and(eq(variables['RUN_TESTS'], 'true'), ne(variables['EXTENSION_UNIT_TESTS_FAIL_ON_ERROR'], 'false')))
+# - bash: |
+#     set -e
+#     mkdir -p $(Build.ArtifactStagingDirectory)/logs/linux-x64
+#     cd /tmp
+#     for folder in adsuser*/
+#     do
+#       folder=${folder%/}
+#       # Only archive directories we want for debugging purposes
+#       tar -czvf $(Build.ArtifactStagingDirectory)/logs/linux-x64/$folder.tar.gz $folder/User $folder/logs
+#     done
+#   displayName: Archive Logs
+#   continueOnError: true
+#   condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
 - script: |
     set -e


@@ -1,8 +1,8 @@
+"use strict";
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
 Object.defineProperty(exports, "__esModule", { value: true });
 const json = require("gulp-json-editor");
 const buffer = require('gulp-buffer');
@@ -43,7 +43,7 @@ async function mixinClient(quality) {
     else {
         fancyLog(ansiColors.blue('[mixin]'), 'Inheriting OSS built-in extensions', builtInExtensions.map(e => e.name));
     }
-    return Object.assign(Object.assign({ webBuiltInExtensions: originalProduct.webBuiltInExtensions }, o), { builtInExtensions });
+    return { webBuiltInExtensions: originalProduct.webBuiltInExtensions, ...o, builtInExtensions };
 }))
     .pipe(productJsonFilter.restore)
     .pipe(es.mapSync((f) => {
@@ -64,7 +64,7 @@ function mixinServer(quality) {
     fancyLog(ansiColors.blue('[mixin]'), `Mixing in server:`);
     const originalProduct = JSON.parse(fs.readFileSync(path.join(__dirname, '..', '..', 'product.json'), 'utf8'));
     const serverProductJson = JSON.parse(fs.readFileSync(serverProductJsonPath, 'utf8'));
-    fs.writeFileSync('product.json', JSON.stringify(Object.assign(Object.assign({}, originalProduct), serverProductJson), undefined, '\t'));
+    fs.writeFileSync('product.json', JSON.stringify({ ...originalProduct, ...serverProductJson }, undefined, '\t'));
     fancyLog(ansiColors.blue('[mixin]'), 'product.json', ansiColors.green('✔︎'));
 }
 function main() {


@@ -3,8 +3,6 @@
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
 import * as json from 'gulp-json-editor';
 const buffer = require('gulp-buffer');
 import * as filter from 'gulp-filter';


@@ -0,0 +1,59 @@
steps:
- checkout: self
fetchDepth: 1
retryCountOnTaskFailure: 3
- task: NodeTool@0
inputs:
versionSpec: "16.x"
- script: |
mkdir -p .build
node build/azure-pipelines/common/computeNodeModulesCacheKey.js $VSCODE_ARCH $ENABLE_TERRAPIN > .build/yarnlockhash
displayName: Prepare yarn cache flags
- task: Cache@2
inputs:
key: "genericNodeModules | $(Agent.OS) | .build/yarnlockhash"
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
displayName: Restore node_modules cache
- script: |
set -e
tar -xzf .build/node_modules_cache/cache.tgz
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Extract node_modules cache
- script: |
set -e
npx https://aka.ms/enablesecurefeed standAlone
timeoutInMinutes: 5
retryCountOnTaskFailure: 3
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
displayName: Switch to Terrapin packages
- script: |
set -e
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile --check-files && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
env:
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
GITHUB_TOKEN: "$(github-distro-mixin-password)"
displayName: Install dependencies
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
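The "Install dependencies" step in the template above retries yarn up to three times, a workaround for flaky feeds during the Terrapin switch. The same pattern extracted into a reusable shell function (the function name and the self-failing demo command are illustrative, not part of the pipeline):

```shell
#!/bin/sh
# Retry a command up to N times, as the pipeline's "Install dependencies"
# step does for `yarn --frozen-lockfile --check-files`.
# retry_n and the counter-file demo are illustrative names (assumption).
retry_n() {
  attempts=$1; shift
  i=1
  while true; do
    "$@" && return 0              # success: stop retrying
    if [ "$i" -ge "$attempts" ]; then
      echo "command failed too many times" >&2
      return 1
    fi
    echo "attempt $i failed, trying again..."
    i=$((i + 1))
  done
}

# Demo: a command that fails twice, then succeeds on the third try
counter=$(mktemp)
echo 0 > "$counter"
flaky() {
  n=$(cat "$counter")
  n=$((n + 1))
  echo "$n" > "$counter"
  [ "$n" -ge 3 ]                  # only succeeds from the third call on
}
retry_n 3 flaky
```

Keeping the retry in the script (rather than `retryCountOnTaskFailure`) lets the step bail out with a clear "Yarn failed too many times" message instead of re-running the whole task.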


@@ -0,0 +1,189 @@
trigger:
- main
- release/*
pr:
branches:
include: ["main", "release/*"]
variables:
- name: Codeql.SkipTaskAutoInjection
value: true
- name: skipComponentGovernanceDetection
value: true
- name: ENABLE_TERRAPIN
value: false
- name: VSCODE_CIBUILD
value: ${{ in(variables['Build.Reason'], 'IndividualCI', 'BatchedCI') }}
- name: VSCODE_PUBLISH
value: false
- name: VSCODE_QUALITY
value: oss
- name: VSCODE_STEP_ON_IT
value: false
jobs:
- ${{ if ne(variables['VSCODE_CIBUILD'], true) }}:
- job: Compile
displayName: Compile & Hygiene
pool: vscode-1es-vscode-linux-20.04
timeoutInMinutes: 30
variables:
VSCODE_ARCH: x64
steps:
- template: product-compile.yml
parameters:
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
- job: Linuxx64UnitTest
displayName: Linux (Unit Tests)
pool: vscode-1es-vscode-linux-20.04
timeoutInMinutes: 30
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: true
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- job: Linuxx64IntegrationTest
displayName: Linux (Integration Tests)
pool: vscode-1es-vscode-linux-20.04
timeoutInMinutes: 30
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: true
VSCODE_RUN_SMOKE_TESTS: false
- job: Linuxx64SmokeTest
displayName: Linux (Smoke Tests)
pool: vscode-1es-vscode-linux-20.04
timeoutInMinutes: 30
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: true
- ${{ if eq(variables['VSCODE_CIBUILD'], true) }}:
- job: Linuxx64MaintainNodeModulesCache
displayName: Linux (Maintain node_modules cache)
pool: vscode-1es-vscode-linux-20.04
timeoutInMinutes: 30
variables:
VSCODE_ARCH: x64
steps:
- template: product-build-pr-cache.yml
# - job: macOSUnitTest
# displayName: macOS (Unit Tests)
# pool:
# vmImage: macOS-latest
# timeoutInMinutes: 60
# variables:
# BUILDSECMON_OPT_IN: true
# VSCODE_ARCH: x64
# steps:
# - template: darwin/product-build-darwin.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: true
# VSCODE_RUN_INTEGRATION_TESTS: false
# VSCODE_RUN_SMOKE_TESTS: false
# - job: macOSIntegrationTest
# displayName: macOS (Integration Tests)
# pool:
# vmImage: macOS-latest
# timeoutInMinutes: 60
# variables:
# BUILDSECMON_OPT_IN: true
# VSCODE_ARCH: x64
# steps:
# - template: darwin/product-build-darwin.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: false
# VSCODE_RUN_INTEGRATION_TESTS: true
# VSCODE_RUN_SMOKE_TESTS: false
# - job: macOSSmokeTest
# displayName: macOS (Smoke Tests)
# pool:
# vmImage: macOS-latest
# timeoutInMinutes: 60
# variables:
# BUILDSECMON_OPT_IN: true
# VSCODE_ARCH: x64
# steps:
# - template: darwin/product-build-darwin.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: false
# VSCODE_RUN_INTEGRATION_TESTS: false
# VSCODE_RUN_SMOKE_TESTS: true
# - job: WindowsUnitTests
# displayName: Windows (Unit Tests)
# pool: vscode-1es-vscode-windows-2019
# timeoutInMinutes: 60
# variables:
# VSCODE_ARCH: x64
# steps:
# - template: win32/product-build-win32.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: true
# VSCODE_RUN_INTEGRATION_TESTS: false
# VSCODE_RUN_SMOKE_TESTS: false
# - job: WindowsIntegrationTests
# displayName: Windows (Integration Tests)
# pool: vscode-1es-vscode-windows-2019
# timeoutInMinutes: 60
# variables:
# VSCODE_ARCH: x64
# steps:
# - template: win32/product-build-win32.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: false
# VSCODE_RUN_INTEGRATION_TESTS: true
# VSCODE_RUN_SMOKE_TESTS: false
# - job: WindowsSmokeTests
# displayName: Windows (Smoke Tests)
# pool: vscode-1es-vscode-windows-2019
# timeoutInMinutes: 60
# variables:
# VSCODE_ARCH: x64
# steps:
# - template: win32/product-build-win32.yml
# parameters:
# VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
# VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
# VSCODE_RUN_UNIT_TESTS: false
# VSCODE_RUN_INTEGRATION_TESTS: false
# VSCODE_RUN_SMOKE_TESTS: true
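The node_modules cache in the PR template earlier in this compare is keyed on the output of build/azure-pipelines/common/computeNodeModulesCacheKey.js, which is fed $VSCODE_ARCH and $ENABLE_TERRAPIN. A rough shell equivalent of the idea, hashing the lockfile together with the inputs that change what gets installed; the exact inputs the real script hashes are an assumption here:

```shell
#!/bin/sh
# Illustrative sketch of a node_modules cache key: hash the lockfile plus the
# parameters that affect the install (arch, feed flag). The real
# computeNodeModulesCacheKey.js may hash different inputs (assumption).
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in lockfile for the demo
printf 'left-pad@^1.0.0:\n  version "1.3.0"\n' > yarn.lock

VSCODE_ARCH=x64
ENABLE_TERRAPIN=false

# Concatenate lockfile + arch + feed flag, then hash -> stable cache key
key=$( { cat yarn.lock; echo "$VSCODE_ARCH"; echo "$ENABLE_TERRAPIN"; } | sha256sum | cut -d ' ' -f 1 )

mkdir -p .build
echo "$key" > .build/yarnlockhash
```

Because the key changes whenever any hashed input changes, a lockfile bump or an arch switch produces a cache miss and a fresh `yarn` install, while identical inputs restore the tarred node_modules directly.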


@@ -8,6 +8,10 @@ schedules:
- main - main
- joao/web - joao/web
trigger:
branches:
include: ["main", "release/*"]
parameters: parameters:
- name: VSCODE_DISTRO_REF - name: VSCODE_DISTRO_REF
displayName: Distro Ref (Private build) displayName: Distro Ref (Private build)
@@ -104,9 +108,11 @@ variables:
- name: VSCODE_BUILD_STAGE_WINDOWS - name: VSCODE_BUILD_STAGE_WINDOWS
value: ${{ or(eq(parameters.VSCODE_BUILD_WIN32, true), eq(parameters.VSCODE_BUILD_WIN32_32BIT, true), eq(parameters.VSCODE_BUILD_WIN32_ARM64, true)) }} value: ${{ or(eq(parameters.VSCODE_BUILD_WIN32, true), eq(parameters.VSCODE_BUILD_WIN32_32BIT, true), eq(parameters.VSCODE_BUILD_WIN32_ARM64, true)) }}
- name: VSCODE_BUILD_STAGE_LINUX - name: VSCODE_BUILD_STAGE_LINUX
value: ${{ or(eq(parameters.VSCODE_BUILD_LINUX, true), eq(parameters.VSCODE_BUILD_LINUX_ARMHF, true), eq(parameters.VSCODE_BUILD_LINUX_ARM64, true), eq(parameters.VSCODE_BUILD_LINUX_ALPINE, true), eq(parameters.VSCODE_BUILD_LINUX_ALPINE_ARM64, true), eq(parameters.VSCODE_BUILD_WEB, true)) }} value: ${{ or(eq(parameters.VSCODE_BUILD_LINUX, true), eq(parameters.VSCODE_BUILD_LINUX_ARMHF, true), eq(parameters.VSCODE_BUILD_LINUX_ARM64, true), eq(parameters.VSCODE_BUILD_LINUX_ALPINE, true), eq(parameters.VSCODE_BUILD_LINUX_ALPINE_ARM64, true)) }}
- name: VSCODE_BUILD_STAGE_MACOS - name: VSCODE_BUILD_STAGE_MACOS
value: ${{ or(eq(parameters.VSCODE_BUILD_MACOS, true), eq(parameters.VSCODE_BUILD_MACOS_ARM64, true)) }} value: ${{ or(eq(parameters.VSCODE_BUILD_MACOS, true), eq(parameters.VSCODE_BUILD_MACOS_ARM64, true)) }}
- name: VSCODE_BUILD_STAGE_WEB
value: ${{ eq(parameters.VSCODE_BUILD_WEB, true) }}
- name: VSCODE_CIBUILD - name: VSCODE_CIBUILD
value: ${{ in(variables['Build.Reason'], 'IndividualCI', 'BatchedCI') }} value: ${{ in(variables['Build.Reason'], 'IndividualCI', 'BatchedCI') }}
- name: VSCODE_PUBLISH - name: VSCODE_PUBLISH
@@ -162,6 +168,8 @@ stages:
VSCODE_ARCH: x64 VSCODE_ARCH: x64
steps: steps:
- template: product-compile.yml - template: product-compile.yml
parameters:
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_WINDOWS'], true)) }}: - ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_WINDOWS'], true)) }}:
- stage: Windows - stage: Windows
@@ -169,13 +177,60 @@ stages:
- Compile - Compile
pool: vscode-1es-windows pool: vscode-1es-windows
jobs: jobs:
- ${{ if eq(parameters.VSCODE_BUILD_WIN32, true) }}: - ${{ if eq(variables['VSCODE_CIBUILD'], true) }}:
- job: WindowsUnitTests
displayName: Unit Tests
timeoutInMinutes: 60
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: true
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- job: WindowsIntegrationTests
displayName: Integration Tests
timeoutInMinutes: 60
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: true
VSCODE_RUN_SMOKE_TESTS: false
- job: WindowsSmokeTests
displayName: Smoke Tests
timeoutInMinutes: 60
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: true
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WIN32, true)) }}:
- job: Windows - job: Windows
timeoutInMinutes: 120 timeoutInMinutes: 120
variables: variables:
VSCODE_ARCH: x64 VSCODE_ARCH: x64
steps: steps:
- template: win32/product-build-win32.yml - template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_INTEGRATION_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_SMOKE_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WIN32_32BIT, true)) }}: - ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WIN32_32BIT, true)) }}:
- job: Windows32 - job: Windows32
@@ -184,6 +239,12 @@ stages:
VSCODE_ARCH: ia32 VSCODE_ARCH: ia32
steps: steps:
- template: win32/product-build-win32.yml - template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_INTEGRATION_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_SMOKE_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WIN32_ARM64, true)) }}: - ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WIN32_ARM64, true)) }}:
- job: WindowsARM64 - job: WindowsARM64
@@ -192,6 +253,12 @@ stages:
VSCODE_ARCH: arm64 VSCODE_ARCH: arm64
steps: steps:
- template: win32/product-build-win32.yml - template: win32/product-build-win32.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_LINUX'], true)) }}: - ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_LINUX'], true)) }}:
- stage: LinuxServerDependencies - stage: LinuxServerDependencies
@@ -206,13 +273,17 @@ stages:
NPM_ARCH: x64 NPM_ARCH: x64
steps: steps:
- template: linux/product-build-linux-server.yml - template: linux/product-build-linux-server.yml
parameters:
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
- ${{ if eq(parameters.VSCODE_BUILD_LINUX, true) }}: - ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX_ARM64, true)) }}:
- job: arm64 - job: arm64
variables: variables:
VSCODE_ARCH: arm64 VSCODE_ARCH: arm64
steps: steps:
- template: linux/product-build-linux-server.yml - template: linux/product-build-linux-server.yml
parameters:
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_LINUX'], true)) }}:
- stage: Linux
@@ -221,7 +292,54 @@ stages:
- LinuxServerDependencies
pool: vscode-1es-linux
jobs:
-- ${{ if eq(parameters.VSCODE_BUILD_LINUX, true) }}:
+- ${{ if eq(variables['VSCODE_CIBUILD'], true) }}:
- job: Linuxx64UnitTest
displayName: Unit Tests
container: vscode-bionic-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: true
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- job: Linuxx64IntegrationTest
displayName: Integration Tests
container: vscode-bionic-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: true
VSCODE_RUN_SMOKE_TESTS: false
- job: Linuxx64SmokeTest
displayName: Smoke Tests
container: vscode-bionic-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: true
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX, true)) }}:
- job: Linuxx64
container: vscode-bionic-x64
variables:
@@ -230,6 +348,12 @@ stages:
DISPLAY: ":10"
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_INTEGRATION_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_SMOKE_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX, true), ne(variables['VSCODE_PUBLISH'], 'false')) }}:
- job: LinuxSnap
@@ -249,6 +373,12 @@ stages:
NPM_ARCH: armv7l
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
# TODO@joaomoreno: We don't ship ARM snaps for now
- ${{ if and(false, eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX_ARMHF, true)) }}:
@@ -269,6 +399,12 @@ stages:
NPM_ARCH: arm64
steps:
- template: linux/product-build-linux-client.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
# TODO@joaomoreno: We don't ship ARM snaps for now
- ${{ if and(false, eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX_ARM64, true)) }}:
@@ -296,13 +432,6 @@ stages:
steps:
- template: linux/product-build-alpine.yml
-- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WEB, true)) }}:
-- job: LinuxWeb
-variables:
-VSCODE_ARCH: x64
-steps:
-- template: web/product-build-web.yml
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_MACOS'], true)) }}:
- stage: macOS
dependsOn:
@@ -312,20 +441,76 @@ stages:
variables:
BUILDSECMON_OPT_IN: true
jobs:
-- ${{ if eq(parameters.VSCODE_BUILD_MACOS, true) }}:
+- ${{ if eq(variables['VSCODE_CIBUILD'], true) }}:
-- job: macOSTest
+- job: macOSUnitTest
+displayName: Unit Tests
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
-- template: darwin/product-build-darwin-test.yml
+- template: darwin/product-build-darwin.yml
-- ${{ if eq(variables['VSCODE_CIBUILD'], false) }}:
-- job: macOS
+parameters:
+VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
+VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
+VSCODE_RUN_UNIT_TESTS: true
+VSCODE_RUN_INTEGRATION_TESTS: false
+VSCODE_RUN_SMOKE_TESTS: false
- job: macOSIntegrationTest
displayName: Integration Tests
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: true
VSCODE_RUN_SMOKE_TESTS: false
- job: macOSSmokeTest
displayName: Smoke Tests
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: true
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_MACOS, true)) }}:
- job: macOS
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- ${{ if eq(parameters.VSCODE_STEP_ON_IT, false) }}:
- job: macOSTest
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_INTEGRATION_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
VSCODE_RUN_SMOKE_TESTS: ${{ eq(parameters.VSCODE_STEP_ON_IT, false) }}
- ${{ if eq(variables['VSCODE_PUBLISH'], true) }}:
- job: macOSSign
dependsOn:
- macOS
@@ -342,7 +527,14 @@ stages:
VSCODE_ARCH: arm64
steps:
- template: darwin/product-build-darwin.yml
-- ${{ if eq(variables['VSCODE_CIBUILD'], false) }}:
+parameters:
VSCODE_PUBLISH: ${{ variables.VSCODE_PUBLISH }}
VSCODE_QUALITY: ${{ variables.VSCODE_QUALITY }}
VSCODE_RUN_UNIT_TESTS: false
VSCODE_RUN_INTEGRATION_TESTS: false
VSCODE_RUN_SMOKE_TESTS: false
- ${{ if eq(variables['VSCODE_PUBLISH'], true) }}:
- job: macOSARM64Sign
dependsOn:
- macOSARM64
@@ -362,7 +554,8 @@ stages:
VSCODE_ARCH: universal
steps:
- template: darwin/product-build-darwin-universal.yml
-- ${{ if eq(variables['VSCODE_CIBUILD'], false) }}:
+- ${{ if eq(variables['VSCODE_PUBLISH'], true) }}:
- job: macOSUniversalSign
dependsOn:
- macOSUniversal
@@ -372,6 +565,19 @@ stages:
steps:
- template: darwin/product-build-darwin-sign.yml
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_WEB'], true)) }}:
- stage: Web
dependsOn:
- Compile
pool: vscode-1es-linux
jobs:
- ${{ if eq(parameters.VSCODE_BUILD_WEB, true) }}:
- job: Web
variables:
VSCODE_ARCH: x64
steps:
- template: web/product-build-web.yml
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), ne(variables['VSCODE_PUBLISH'], 'false')) }}:
- stage: Publish
dependsOn:


@@ -1,39 +1,47 @@
parameters:
- name: VSCODE_QUALITY
type: string
steps:
- task: NodeTool@0
inputs:
versionSpec: "16.x"
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
SecretsFilter: "github-distro-mixin-password"
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $VSCODE_DISTRO_REF
echo "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
git checkout FETCH_HEAD
condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
displayName: Checkout override commit
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
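The repeated `${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}` guards above make each branded-build step conditional on the quality parameter. A minimal illustrative model of that inclusion logic (the step names and `buildSteps` helper are hypothetical, not part of the pipeline):

```javascript
// Illustrative model of the template change: steps that only apply to
// branded (non-'oss') builds are included when quality is not 'oss'.
function buildSteps(quality) {
  const steps = [{ task: 'NodeTool@0' }]; // always present
  if (quality !== 'oss') {
    // mirrors the `${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:` guard
    steps.push({ task: 'AzureKeyVault@1' }, { script: 'prepare tooling' });
  }
  return steps;
}

console.log(buildSteps('oss').length); // prints 1
```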
- script: |
mkdir -p .build
@@ -94,11 +102,12 @@ steps:
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
# Mixin must run before optimize, because the CSS loader will inline small SVGs
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
@@ -107,59 +116,65 @@ steps:
GITHUB_TOKEN: "$(github-distro-mixin-password)"
displayName: Compile & Hygiene
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
yarn --cwd test/smoke compile
yarn --cwd test/integration/browser compile
displayName: Compile test suites
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: AzureCLI@2
inputs:
azureSubscription: "vscode-builds-subscription"
scriptType: pscore
scriptLocation: inlineScript
addSpnToEnvironment: true
inlineScript: |
Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]$env:tenantId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_ID]$env:servicePrincipalId"
Write-Host "##vso[task.setvariable variable=AZURE_CLIENT_SECRET;issecret=true]$env:servicePrincipalKey"
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
AZURE_STORAGE_ACCOUNT="ticino" \
AZURE_TENANT_ID="$(AZURE_TENANT_ID)" \
AZURE_CLIENT_ID="$(AZURE_CLIENT_ID)" \
AZURE_CLIENT_SECRET="$(AZURE_CLIENT_SECRET)" \
node build/azure-pipelines/upload-sourcemaps
displayName: Upload sourcemaps
-condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
./build/azure-pipelines/common/extract-telemetry.sh
displayName: Extract Telemetry
-condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
tar -cz --ignore-failed-read -f $(Build.ArtifactStagingDirectory)/compilation.tar.gz .build out-* test/integration/browser/out test/smoke/out test/automation/out
displayName: Compress compilation artifact
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: PublishPipelineArtifact@1
inputs:
targetPath: $(Build.ArtifactStagingDirectory)/compilation.tar.gz
artifactName: Compilation
displayName: Publish compilation artifact
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn download-builtin-extensions-cg
displayName: Built-in extensions component details
+- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
inputs:
sourceScanPath: $(Build.SourcesDirectory)
continueOnError: true


@@ -46,6 +46,7 @@ $stages = @(
if ($env:VSCODE_BUILD_STAGE_WINDOWS -eq 'True') { 'Windows' }
if ($env:VSCODE_BUILD_STAGE_LINUX -eq 'True') { 'Linux' }
if ($env:VSCODE_BUILD_STAGE_MACOS -eq 'True') { 'macOS' }
+if ($env:VSCODE_BUILD_STAGE_WEB -eq 'True') { 'Web' }
)
do {


@@ -109,6 +109,7 @@ steps:
if ($env:VSCODE_BUILD_STAGE_WINDOWS -eq 'True') { 'Windows' }
if ($env:VSCODE_BUILD_STAGE_LINUX -eq 'True') { 'Linux' }
if ($env:VSCODE_BUILD_STAGE_MACOS -eq 'True') { 'macOS' }
+if ($env:VSCODE_BUILD_STAGE_WEB -eq 'True') { 'Web' }
)
Write-Host "Stages to check: $stages"
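The PowerShell snippet above builds the stage list by emitting a stage name only when its environment flag is `'True'`, with `'Web'` newly added. The same selection logic, re-expressed as an illustrative sketch (the sample `env` values are hypothetical):

```javascript
// Include a stage name only when its corresponding flag is the string 'True'.
const env = { VSCODE_BUILD_STAGE_MACOS: 'True', VSCODE_BUILD_STAGE_WEB: 'True' };

const stages = [
  ['VSCODE_BUILD_STAGE_WINDOWS', 'Windows'],
  ['VSCODE_BUILD_STAGE_LINUX', 'Linux'],
  ['VSCODE_BUILD_STAGE_MACOS', 'macOS'],
  ['VSCODE_BUILD_STAGE_WEB', 'Web'], // the flag added in this change
]
  .filter(([flag]) => env[flag] === 'True')
  .map(([, name]) => name);

console.log(stages); // prints [ 'macOS', 'Web' ]
```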


@@ -1,8 +1,8 @@
+"use strict";
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const cp = require("child_process");
let tag = '';


@@ -3,8 +3,6 @@
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
import * as cp from 'child_process';
let tag = '';


@@ -1,8 +1,8 @@
+"use strict";
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const cp = require("child_process");


@@ -3,8 +3,6 @@
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
import * as fs from 'fs';
import * as cp from 'child_process';
import * as path from 'path';


@@ -1,8 +1,8 @@
+"use strict";
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const es = require("event-stream");
const Vinyl = require("vinyl");


@@ -3,8 +3,6 @@
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
import * as es from 'event-stream';
import * as Vinyl from 'vinyl';
import * as vfs from 'vinyl-fs';


@@ -1,8 +1,8 @@
+"use strict";
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
exports.getSettingsSearchBuildId = exports.shouldSetupSettingsSearch = void 0;
const path = require("path");
@@ -52,7 +52,7 @@ function generateVSCodeConfigurationTask() {
const timer = setTimeout(() => {
codeProc.kill();
reject(new Error('export-default-configuration process timed out'));
-}, 12 * 1000);
+}, 60 * 1000);
codeProc.on('error', err => {
clearTimeout(timer);
reject(err);


@@ -3,8 +3,6 @@
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
import * as path from 'path';
import * as os from 'os';
import * as cp from 'child_process';
@@ -63,7 +61,7 @@ function generateVSCodeConfigurationTask(): Promise<string | undefined> {
const timer = setTimeout(() => {
codeProc.kill();
reject(new Error('export-default-configuration process timed out'));
-}, 12 * 1000);
+}, 60 * 1000);
codeProc.on('error', err => {
clearTimeout(timer);


@@ -1,14 +1,16 @@
+"use strict";
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const es = require("event-stream");
const vfs = require("vinyl-fs");
const merge = require("gulp-merge-json");
const gzip = require("gulp-gzip");
const identity_1 = require("@azure/identity");
+const path = require("path");
+const fs_1 = require("fs");
const azure = require('gulp-azure-storage');
const commit = process.env['VSCODE_DISTRO_COMMIT'] || process.env['BUILD_SOURCEVERSION'];
const credential = new identity_1.ClientSecretCredential(process.env['AZURE_TENANT_ID'], process.env['AZURE_CLIENT_ID'], process.env['AZURE_CLIENT_SECRET']);
@@ -18,8 +20,8 @@ function main() {
.pipe(merge({
fileName: 'combined.nls.metadata.json',
jsonSpace: '',
+concatArrays: true,
edit: (parsedJson, file) => {
-let key;
if (file.base === 'out-vscode-web-min') {
return { vscode: parsedJson };
}
@@ -63,7 +65,11 @@ function main() {
break;
}
}
-key = 'vscode.' + file.relative.split('/')[0];
+// Get extension id and use that as the key
+const folderPath = path.join(file.base, file.relative.split('/')[0]);
+const manifest = (0, fs_1.readFileSync)(path.join(folderPath, 'package.json'), 'utf-8');
+const manifestJson = JSON.parse(manifest);
+const key = manifestJson.publisher + '.' + manifestJson.name;
return { [key]: parsedJson };
},
}))


@@ -3,14 +3,14 @@
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
import * as es from 'event-stream';
import * as Vinyl from 'vinyl';
import * as vfs from 'vinyl-fs';
import * as merge from 'gulp-merge-json';
import * as gzip from 'gulp-gzip';
import { ClientSecretCredential } from '@azure/identity';
+import path = require('path');
+import { readFileSync } from 'fs';
const azure = require('gulp-azure-storage');
const commit = process.env['VSCODE_DISTRO_COMMIT'] || process.env['BUILD_SOURCEVERSION'];
@@ -33,8 +33,8 @@ function main(): Promise<void> {
.pipe(merge({
fileName: 'combined.nls.metadata.json',
jsonSpace: '',
+concatArrays: true,
edit: (parsedJson, file) => {
-let key;
if (file.base === 'out-vscode-web-min') {
return { vscode: parsedJson };
}
@@ -82,7 +82,12 @@ function main(): Promise<void> {
break;
}
}
-key = 'vscode.' + file.relative.split('/')[0];
+// Get extension id and use that as the key
+const folderPath = path.join(file.base, file.relative.split('/')[0]);
+const manifest = readFileSync(path.join(folderPath, 'package.json'), 'utf-8');
+const manifestJson = JSON.parse(manifest);
+const key = manifestJson.publisher + '.' + manifestJson.name;
return { [key]: parsedJson };
},
}))
@@ -113,4 +118,3 @@ main().catch(err => {
console.error(err);
process.exit(1);
});
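The NLS-metadata change replaces the hard-coded `'vscode.' + folderName` key with an id read from each extension's own `package.json` (`publisher.name`). A minimal sketch of that derivation, using an inline manifest string in place of the `readFileSync` call (the `vscode`/`git` values are illustrative, not from the diff):

```javascript
// Derive the bundle key as <publisher>.<name> from an extension manifest,
// mirroring the new edit-callback logic in upload-nls-metadata.
const manifest = JSON.stringify({ publisher: 'vscode', name: 'git' });
const manifestJson = JSON.parse(manifest);
const key = manifestJson.publisher + '.' + manifestJson.name;

console.log(key); // prints "vscode.git"
```

Keying on the declared publisher and name makes the output stable even when an extension's folder name differs from its marketplace id.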


@@ -1,8 +1,8 @@
+"use strict";
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const path = require("path");
const es = require("event-stream");


@@ -3,8 +3,6 @@
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
-'use strict';
import * as path from 'path';
import * as es from 'event-stream';
import * as Vinyl from 'vinyl';


@@ -163,6 +163,7 @@ steps:
cd $ROOT && tar --owner=0 --group=0 -czf $WEB_TARBALL_PATH $WEB_BUILD_NAME
displayName: Prepare for publish
+condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-web.tar.gz
artifact: vscode_web_linux_standalone_archive-unsigned


@@ -0,0 +1,247 @@
parameters:
- name: VSCODE_QUALITY
type: string
- name: VSCODE_RUN_UNIT_TESTS
type: boolean
- name: VSCODE_RUN_INTEGRATION_TESTS
type: boolean
- name: VSCODE_RUN_SMOKE_TESTS
type: boolean
steps:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn npm-run-all -lp "electron $(VSCODE_ARCH)" "playwright-install" }
displayName: Download Electron and Playwright
- ${{ if eq(parameters.VSCODE_RUN_UNIT_TESTS, true) }}:
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn electron $(VSCODE_ARCH) }
exec { .\scripts\test.bat --tfs "Unit Tests" }
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn test-node }
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node test/unit/browser/index.js --sequential --browser chromium --browser firefox --tfs "Browser Unit Tests" }
displayName: Run unit tests (Browser, Chromium & Firefox)
timeoutInMinutes: 20
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn electron $(VSCODE_ARCH) }
exec { .\scripts\test.bat --build --tfs "Unit Tests" }
displayName: Run unit tests (Electron)
timeoutInMinutes: 15
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn test-node --build }
displayName: Run unit tests (node.js)
timeoutInMinutes: 15
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn test-browser-no-install --sequential --build --browser chromium --browser firefox --tfs "Browser Unit Tests" }
displayName: Run unit tests (Browser, Chromium & Firefox)
timeoutInMinutes: 20
- ${{ if eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true) }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn gulp `
compile-extension:configuration-editing `
compile-extension:css-language-features-server `
compile-extension:emmet `
compile-extension:git `
compile-extension:github-authentication `
compile-extension:html-language-features-server `
compile-extension:ipynb `
compile-extension:json-language-features-server `
compile-extension:markdown-language-features-server `
compile-extension:markdown-language-features `
compile-extension-media `
compile-extension:microsoft-authentication `
compile-extension:typescript-language-features `
compile-extension:vscode-api-tests `
compile-extension:vscode-colorize-tests `
compile-extension:vscode-notebook-tests `
compile-extension:vscode-test-resolver `
}
displayName: Build integration tests
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { .\scripts\test-integration.bat --tfs "Integration Tests" }
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { .\scripts\test-web-integration.bat --browser firefox }
displayName: Run integration tests (Browser, Firefox)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { .\scripts\test-remote-integration.bat }
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\scripts\test-integration.bat --build --tfs "Integration Tests" }
displayName: Run integration tests (Electron)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-web-win32-$(VSCODE_ARCH)"; .\scripts\test-web-integration.bat --browser firefox }
displayName: Run integration tests (Browser, Firefox)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\scripts\test-remote-integration.bat }
displayName: Run integration tests (Remote)
timeoutInMinutes: 20
- ${{ if eq(parameters.VSCODE_RUN_SMOKE_TESTS, true) }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
exec {.\build\azure-pipelines\win32\listprocesses.bat }
displayName: Diagnostics before smoke test run
continueOnError: true
condition: succeededOrFailed()
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn --cwd test/smoke compile }
displayName: Compile smoke tests
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn smoketest-no-compile --tracing }
displayName: Run smoke tests (Electron)
timeoutInMinutes: 20
- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
exec { yarn smoketest-no-compile --tracing --build "$AppRoot" }
displayName: Run smoke tests (Electron)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-web-win32-$(VSCODE_ARCH)"
exec { yarn smoketest-no-compile --web --tracing --headless }
displayName: Run smoke tests (Browser, Chromium)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"
exec { yarn gulp compile-extension:vscode-test-resolver }
exec { yarn smoketest-no-compile --tracing --remote --build "$AppRoot" }
displayName: Run smoke tests (Remote)
timeoutInMinutes: 20
- powershell: |
. build/azure-pipelines/win32/exec.ps1
exec {.\build\azure-pipelines\win32\listprocesses.bat }
displayName: Diagnostics after smoke test run
continueOnError: true
condition: succeededOrFailed()
- ${{ if or(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
- task: PublishPipelineArtifact@0
inputs:
targetPath: .build\crashes
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: crash-dump-windows-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: crash-dump-windows-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: crash-dump-windows-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
# In order to properly symbolify above crash reports
# (if any), we need the compiled native modules too
- task: PublishPipelineArtifact@0
inputs:
targetPath: node_modules
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: node-modules-windows-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: node-modules-windows-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: node-modules-windows-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Node Modules"
continueOnError: true
condition: failed()
- task: PublishPipelineArtifact@0
inputs:
targetPath: .build\logs
${{ if and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, false)) }}:
artifactName: logs-windows-$(VSCODE_ARCH)-integration-$(System.JobAttempt)
${{ elseif and(eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, false), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
artifactName: logs-windows-$(VSCODE_ARCH)-smoke-$(System.JobAttempt)
${{ else }}:
artifactName: logs-windows-$(VSCODE_ARCH)-$(System.JobAttempt)
displayName: "Publish Log Files"
continueOnError: true
condition: succeededOrFailed()
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: succeededOrFailed()
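For context, the `${{ if eq(parameters.…) }}` blocks throughout the template above are Azure Pipelines compile-time expressions: each guarded step list is spliced into `steps` only when the boolean template parameter evaluates true, so skipped steps never appear in the expanded job at all. A minimal illustrative sketch (the file names `tests.yml` and `build.yml` are hypothetical, not part of this PR):

```yaml
# tests.yml — a step template gated by a boolean parameter
parameters:
  - name: RUN_UNIT_TESTS
    type: boolean

steps:
  - ${{ if eq(parameters.RUN_UNIT_TESTS, true) }}:
    - script: yarn test-node
      displayName: Run unit tests (node.js)

# build.yml — the consumer passes the parameter explicitly:
# steps:
#   - template: tests.yml
#     parameters:
#       RUN_UNIT_TESTS: true
```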


@@ -1,4 +1,21 @@
parameters:
- name: VSCODE_PUBLISH
type: boolean
- name: VSCODE_QUALITY
type: string
- name: VSCODE_RUN_UNIT_TESTS
type: boolean
- name: VSCODE_RUN_INTEGRATION_TESTS
type: boolean
- name: VSCODE_RUN_SMOKE_TESTS
type: boolean
steps:
- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
  - checkout: self
    fetchDepth: 1
    retryCountOnTaskFailure: 3
- task: NodeTool@0
  inputs:
    versionSpec: "16.x"
@@ -8,51 +25,58 @@ steps:
    versionSpec: "3.x"
    addToPath: true

- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
  - task: AzureKeyVault@1
    displayName: "Azure Key Vault: Get Secrets"
    inputs:
      azureSubscription: "vscode-builds-subscription"
      KeyVaultName: vscode
      SecretsFilter: "github-distro-mixin-password,ESRP-PKI,esrp-aad-username,esrp-aad-password"

- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: Compilation
      path: $(Build.ArtifactStagingDirectory)
    displayName: Download compilation output

- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
  - task: ExtractFiles@1
    displayName: Extract compilation output
    inputs:
      archiveFilePatterns: "$(Build.ArtifactStagingDirectory)/compilation.tar.gz"
      cleanDestinationFolder: false

- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      "machine github.com`nlogin vscode`npassword $(github-distro-mixin-password)" | Out-File "$env:USERPROFILE\_netrc" -Encoding ASCII
      exec { git config user.email "vscode@microsoft.com" }
      exec { git config user.name "VSCode" }
    displayName: Prepare tooling

- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      exec { git fetch https://github.com/$(VSCODE_MIXIN_REPO).git $(VSCODE_DISTRO_REF) }
      Write-Host "##vso[task.setvariable variable=VSCODE_DISTRO_COMMIT;]$(git rev-parse FETCH_HEAD)"
      exec { git checkout FETCH_HEAD }
    condition: and(succeeded(), ne(variables.VSCODE_DISTRO_REF, ' '))
    displayName: Checkout override commit

- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      exec { git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro") }
    displayName: Merge distro

- powershell: |
    if (!(Test-Path ".build")) { New-Item -Path ".build" -ItemType Directory }
    "$(VSCODE_ARCH)" | Out-File -Encoding ascii -NoNewLine .build\arch
    "$env:ENABLE_TERRAPIN" | Out-File -Encoding ascii -NoNewLine .build\terrapin
    node build/azure-pipelines/common/computeNodeModulesCacheKey.js > .build/yarnlockhash
@@ -104,291 +128,176 @@ steps:
    condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
  displayName: Create node_modules archive

- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      exec { node build/azure-pipelines/mixin }
    displayName: Mix in quality

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      exec { node build\lib\policies }
    displayName: Generate Group Policy definitions

- ${{ if eq(parameters.VSCODE_QUALITY, 'oss') }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      $env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
      exec { yarn gulp "transpile-client" "transpile-extensions" }
    displayName: Transpile

- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      $env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
      exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-min-ci" }
      echo "##vso[task.setvariable variable=CodeSigningFolderPath]$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)"
    displayName: Build

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      $env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
      exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-inno-updater" }
    displayName: Prepare Package

- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      exec { node build/azure-pipelines/mixin --server }
    displayName: Mix in quality
    condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))

- ${{ if ne(parameters.VSCODE_QUALITY, 'oss') }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      $env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
      exec { yarn gulp "vscode-reh-win32-$(VSCODE_ARCH)-min-ci" }
      exec { yarn gulp "vscode-reh-web-win32-$(VSCODE_ARCH)-min-ci" }
      echo "##vso[task.setvariable variable=CodeSigningFolderPath]$(CodeSigningFolderPath),$(agent.builddirectory)/vscode-reh-win32-$(VSCODE_ARCH)"
    displayName: Build Server
    condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))

- ${{ if or(eq(parameters.VSCODE_RUN_UNIT_TESTS, true), eq(parameters.VSCODE_RUN_INTEGRATION_TESTS, true), eq(parameters.VSCODE_RUN_SMOKE_TESTS, true)) }}:
  - template: product-build-win32-test.yml
    parameters:
      VSCODE_QUALITY: ${{ parameters.VSCODE_QUALITY }}
      VSCODE_RUN_UNIT_TESTS: ${{ parameters.VSCODE_RUN_UNIT_TESTS }}
      VSCODE_RUN_INTEGRATION_TESTS: ${{ parameters.VSCODE_RUN_INTEGRATION_TESTS }}
      VSCODE_RUN_SMOKE_TESTS: ${{ parameters.VSCODE_RUN_SMOKE_TESTS }}

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - task: UseDotNet@2
    inputs:
      version: 3.x
    condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - task: EsrpClientTool@1
    displayName: Download ESRPClient

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      $EsrpClientTool = (gci -directory -filter EsrpClientTool_* $(Agent.RootDirectory)\_tasks | Select-Object -last 1).FullName
      $EsrpCliZip = (gci -recurse -filter esrpcli.*.zip $EsrpClientTool | Select-Object -last 1).FullName
      mkdir -p $(Agent.TempDirectory)\esrpcli
      Expand-Archive -Path $EsrpCliZip -DestinationPath $(Agent.TempDirectory)\esrpcli
      $EsrpCliDllPath = (gci -recurse -filter esrpcli.dll $(Agent.TempDirectory)\esrpcli | Select-Object -last 1).FullName
      echo "##vso[task.setvariable variable=EsrpCliDllPath]$EsrpCliDllPath"
    displayName: Find ESRP CLI

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      exec { node build\azure-pipelines\common\sign $env:EsrpCliDllPath windows $(ESRP-PKI) $(esrp-aad-username) $(esrp-aad-password) $(CodeSigningFolderPath) '*.dll,*.exe,*.node' }
    displayName: Codesign

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-archive" }
    displayName: Package archive

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      $env:ESRPPKI = "$(ESRP-PKI)"
      $env:ESRPAADUsername = "$(esrp-aad-username)"
      $env:ESRPAADPassword = "$(esrp-aad-password)"
      exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-system-setup" --sign }
      exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-user-setup" --sign }
    displayName: Package setups

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - powershell: |
      . build/azure-pipelines/win32/exec.ps1
      $ErrorActionPreference = "Stop"
      .\build\azure-pipelines\win32\prepare-publish.ps1
    displayName: Publish

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
    displayName: Generate SBOM (client)
    inputs:
      BuildDropPath: $(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)
      PackageName: Visual Studio Code

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - publish: $(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)/_manifest
    displayName: Publish SBOM (client)
    artifact: vscode_client_win32_$(VSCODE_ARCH)_sbom

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - task: AzureArtifacts.manifest-generator-task.manifest-generator-task.ManifestGeneratorTask@0
    displayName: Generate SBOM (server)
    inputs:
      BuildDropPath: $(agent.builddirectory)/vscode-server-win32-$(VSCODE_ARCH)
      PackageName: Visual Studio Code Server
    condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - publish: $(agent.builddirectory)/vscode-server-win32-$(VSCODE_ARCH)/_manifest
    displayName: Publish SBOM (server)
    artifact: vscode_server_win32_$(VSCODE_ARCH)_sbom
    condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\archive\$(ARCHIVE_NAME)
    artifact: vscode_client_win32_$(VSCODE_ARCH)_archive
    displayName: Publish archive

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\system-setup\$(SYSTEM_SETUP_NAME)
    artifact: vscode_client_win32_$(VSCODE_ARCH)_setup
    displayName: Publish system setup

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\user-setup\$(USER_SETUP_NAME)
    artifact: vscode_client_win32_$(VSCODE_ARCH)_user-setup
    displayName: Publish user setup
    condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - publish: $(System.DefaultWorkingDirectory)\.build\vscode-server-win32-$(VSCODE_ARCH).zip
    artifact: vscode_server_win32_$(VSCODE_ARCH)_archive
    displayName: Publish server archive
    condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))

- ${{ if eq(parameters.VSCODE_PUBLISH, true) }}:
  - publish: $(System.DefaultWorkingDirectory)\.build\vscode-server-win32-$(VSCODE_ARCH)-web.zip
    artifact: vscode_web_win32_$(VSCODE_ARCH)_archive
    displayName: Publish web server archive
    condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))


@@ -1,8 +1,8 @@
+"use strict";
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
 Object.defineProperty(exports, "__esModule", { value: true });
 const vscode_universal_bundler_1 = require("vscode-universal-bundler");
 const cross_spawn_promise_1 = require("@malept/cross-spawn-promise");
@@ -71,7 +71,7 @@ async function main() {
         outAppPath,
         force: true
     });
-    let productJson = await fs.readJson(productJsonPath);
+    const productJson = await fs.readJson(productJsonPath);
     Object.assign(productJson, {
         darwinUniversalAssetId: 'darwin-universal'
     });


@@ -3,8 +3,6 @@
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
-
 import { makeUniversalApp } from 'vscode-universal-bundler';
 import { spawn } from '@malept/cross-spawn-promise';
 import * as fs from 'fs-extra';
@@ -80,7 +78,7 @@ async function main() {
 		force: true
 	});
-	let productJson = await fs.readJson(productJsonPath);
+	const productJson = await fs.readJson(productJsonPath);
 	Object.assign(productJson, {
 		darwinUniversalAssetId: 'darwin-universal'
 	});


@@ -1,8 +1,8 @@
+"use strict";
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
 Object.defineProperty(exports, "__esModule", { value: true });
 const codesign = require("electron-osx-sign");
 const path = require("path");
@@ -40,14 +40,26 @@ async function main() {
         identity: '99FM488X57',
         'gatekeeper-assess': false
     };
-    const appOpts = Object.assign(Object.assign({}, defaultOpts), {
+    const appOpts = {
+        ...defaultOpts,
         // TODO(deepak1556): Incorrectly declared type in electron-osx-sign
         ignore: (filePath) => {
             return filePath.includes(gpuHelperAppName) ||
                 filePath.includes(rendererHelperAppName);
-        } });
-    const gpuHelperOpts = Object.assign(Object.assign({}, defaultOpts), { app: path.join(appFrameworkPath, gpuHelperAppName), entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist'), 'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist') });
-    const rendererHelperOpts = Object.assign(Object.assign({}, defaultOpts), { app: path.join(appFrameworkPath, rendererHelperAppName), entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist'), 'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist') });
+        }
+    };
+    const gpuHelperOpts = {
+        ...defaultOpts,
+        app: path.join(appFrameworkPath, gpuHelperAppName),
+        entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist'),
+        'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist'),
+    };
+    const rendererHelperOpts = {
+        ...defaultOpts,
+        app: path.join(appFrameworkPath, rendererHelperAppName),
+        entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist'),
+        'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist'),
+    };
     // Only overwrite plist entries for x64 and arm64 builds,
     // universal will get its copy from the x64 build.
    if (arch !== 'universal') {
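The signing hunk above swaps TypeScript's down-leveled `Object.assign(Object.assign({}, ...), ...)` output for native object spread. The two forms build equivalent objects; a minimal standalone sketch (option names here are illustrative, not the real signing config):

```javascript
// Both forms copy defaultOpts into a fresh object and then layer on
// overrides; neither mutates defaultOpts.
const defaultOpts = { identity: 'EXAMPLE-ID', 'gatekeeper-assess': false };

// Down-leveled output from older TypeScript compile targets:
const legacy = Object.assign(Object.assign({}, defaultOpts), { app: 'Helper.app' });

// Native spread, as in the refactored code:
const modern = { ...defaultOpts, app: 'Helper.app' };

console.log(JSON.stringify(legacy) === JSON.stringify(modern)); // prints true
```

Spread also reads better when the overrides span several lines, which is why the multi-property helper options benefit most from the rewrite.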


@@ -3,8 +3,6 @@
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
-
 import * as codesign from 'electron-osx-sign';
 import * as path from 'path';
 import * as util from '../lib/util';


@@ -52,7 +52,6 @@ module.exports.unicodeFilter = [
	'!extensions/typescript-language-features/test-workspace/**',
	'!extensions/vscode-api-tests/testWorkspace/**',
	'!extensions/vscode-api-tests/testWorkspace2/**',
-	'!extensions/vscode-custom-editor-tests/test-workspace/**',
	'!extensions/**/dist/**',
	'!extensions/**/out/**',
	'!extensions/**/snippets/**',
@@ -93,7 +92,6 @@ module.exports.indentationFilter = [
	'!extensions/markdown-math/notebook-out/**',
	'!extensions/vscode-api-tests/testWorkspace/**',
	'!extensions/vscode-api-tests/testWorkspace2/**',
-	'!extensions/vscode-custom-editor-tests/test-workspace/**',
	'!build/monaco/**',
	'!build/win32/**',
@@ -146,7 +144,6 @@ module.exports.indentationFilter = [
	'!extensions/admin-tool-ext-win/license/**',
	'!extensions/resource-deployment/notebooks/**',
	'!extensions/mssql/notebooks/**',
-	'!extensions/azurehybridtoolkit/notebooks/**',
	'!extensions/integration-tests/testData/**',
	'!extensions/arc/src/controller/generated/**',
	'!extensions/sql-database-projects/resources/templates/*.xml',
@@ -198,7 +195,6 @@ module.exports.copyrightFilter = [
	'!src/vs/editor/test/node/classification/typescript-test.ts',
	// {{SQL CARBON EDIT}} Except for stuff in our code that doesn't use our copyright
-	'!extensions/azurehybridtoolkit/notebooks/**',
	'!extensions/azuremonitor/src/prompts/**',
	'!extensions/import/flatfileimportservice/**',
	'!extensions/kusto/src/prompts/**',


@@ -17,34 +17,41 @@ const compilation = require('./lib/compilation');
 const monacoapi = require('./lib/monaco-api');
 const fs = require('fs');
-let root = path.dirname(__dirname);
-let sha1 = util.getVersion(root);
-let semver = require('./monaco/package.json').version;
-let headerVersion = semver + '(' + sha1 + ')';
+const root = path.dirname(__dirname);
+const sha1 = util.getVersion(root);
+const semver = require('./monaco/package.json').version;
+const headerVersion = semver + '(' + sha1 + ')';
 // Build
-let editorEntryPoints = [
+const editorEntryPoints = [
 	{
 		name: 'vs/editor/editor.main',
 		include: [],
 		exclude: ['vs/css', 'vs/nls'],
-		prepend: ['out-editor-build/vs/css.js', 'out-editor-build/vs/nls.js'],
+		prepend: [
+			{ path: 'out-editor-build/vs/css.js', amdModuleId: 'vs/css' },
+			{ path: 'out-editor-build/vs/nls.js', amdModuleId: 'vs/nls' }
+		],
 	},
 	{
 		name: 'vs/base/common/worker/simpleWorker',
 		include: ['vs/editor/common/services/editorSimpleWorker'],
-		prepend: ['vs/loader.js'],
-		append: ['vs/base/worker/workerMain'],
+		exclude: ['vs/nls'],
+		prepend: [
+			{ path: 'vs/loader.js' },
+			{ path: 'vs/nls.js', amdModuleId: 'vs/nls' },
+			{ path: 'vs/base/worker/workerMain.js' }
+		],
 		dest: 'vs/base/worker/workerMain.js'
 	}
 ];
-let editorResources = [
+const editorResources = [
 	'out-editor-build/vs/base/browser/ui/codicons/**/*.ttf'
 ];
-let BUNDLED_FILE_HEADER = [
+const BUNDLED_FILE_HEADER = [
 	'/*!-----------------------------------------------------------',
 	' * Copyright (c) Microsoft Corporation. All rights reserved.',
 	' * Version: ' + headerVersion,
@@ -109,12 +116,6 @@ const createESMSourcesAndResourcesTask = task.define('extract-editor-esm', () =>
 			'inlineEntryPoint:0.ts',
 			'inlineEntryPoint:1.ts',
 			'vs/loader.js',
-			'vs/nls.ts',
-			'vs/nls.build.js',
-			'vs/nls.d.ts',
-			'vs/css.js',
-			'vs/css.build.js',
-			'vs/css.d.ts',
 			'vs/base/worker/workerMain.ts',
 		],
 		renames: {
@@ -224,7 +225,7 @@ const appendJSToESMImportsTask = task.define('append-js-to-esm-imports', () => {
 			result.push(line);
 			continue;
 		}
-		let modifiedLine = (
+		const modifiedLine = (
 			line
 				.replace(/^import(.*)\'([^']+)\'/, `import$1'$2.js'`)
 				.replace(/^export \* from \'([^']+)\'/, `export * from '$1.js'`)
@@ -239,10 +240,10 @@ const appendJSToESMImportsTask = task.define('append-js-to-esm-imports', () => {
 * @param {string} contents
 */
function toExternalDTS(contents) {
-	let lines = contents.split(/\r\n|\r|\n/);
+	const lines = contents.split(/\r\n|\r|\n/);
 	let killNextCloseCurlyBrace = false;
 	for (let i = 0; i < lines.length; i++) {
-		let line = lines[i];
+		const line = lines[i];
 		if (killNextCloseCurlyBrace) {
 			if ('}' === line) {
@@ -316,7 +317,7 @@ const finalEditorResourcesTask = task.define('final-editor-resources', () => {
 		// package.json
 		gulp.src('build/monaco/package.json')
 			.pipe(es.through(function (data) {
-				let json = JSON.parse(data.contents.toString());
+				const json = JSON.parse(data.contents.toString());
 				json.private = false;
 				data.contents = Buffer.from(JSON.stringify(json, null, ' '));
 				this.emit('data', data);
@@ -360,10 +361,10 @@ const finalEditorResourcesTask = task.define('final-editor-resources', () => {
 				return;
 			}
-			let relativePathToMap = path.relative(path.join(data.relative), path.join('min-maps', data.relative + '.map'));
+			const relativePathToMap = path.relative(path.join(data.relative), path.join('min-maps', data.relative + '.map'));
 			let strContents = data.contents.toString();
-			let newStr = '//# sourceMappingURL=' + relativePathToMap.replace(/\\/g, '/');
+			const newStr = '//# sourceMappingURL=' + relativePathToMap.replace(/\\/g, '/');
 			strContents = strContents.replace(/\/\/# sourceMappingURL=[^ ]+$/, newStr);
 			data.contents = Buffer.from(strContents);
@@ -483,13 +484,13 @@ function createTscCompileTask(watch) {
 		cwd: path.join(__dirname, '..'),
 		// stdio: [null, 'pipe', 'inherit']
 	});
-	let errors = [];
-	let reporter = createReporter('monaco');
+	const errors = [];
+	const reporter = createReporter('monaco');
 	/** @type {NodeJS.ReadWriteStream | undefined} */
 	let report;
 	// eslint-disable-next-line no-control-regex
-	let magic = /[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g; // https://stackoverflow.com/questions/25245716/remove-all-ansi-colors-styles-from-strings
+	const magic = /[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g; // https://stackoverflow.com/questions/25245716/remove-all-ansi-colors-styles-from-strings
 	child.stdout.on('data', data => {
 		let str = String(data);
@@ -502,12 +503,12 @@ function createTscCompileTask(watch) {
 			report.end();
 		} else if (str) {
-			let match = /(.*\(\d+,\d+\): )(.*: )(.*)/.exec(str);
+			const match = /(.*\(\d+,\d+\): )(.*: )(.*)/.exec(str);
 			if (match) {
 				// trying to massage the message so that it matches the gulp-tsb error messages
 				// e.g. src/vs/base/common/strings.ts(663,5): error TS2322: Type '1234' is not assignable to type 'string'.
-				let fullpath = path.join(root, match[1]);
-				let message = match[3];
+				const fullpath = path.join(root, match[1]);
+				const message = match[3];
 				reporter(fullpath + message);
 			} else {
 				reporter(str);


@@ -56,6 +56,7 @@ const compilations = [
 	'json-language-features/client/tsconfig.json',
 	'json-language-features/server/tsconfig.json',
 	'markdown-language-features/preview-src/tsconfig.json',
+	'markdown-language-features/server/tsconfig.json',
 	'markdown-language-features/tsconfig.json',
 	'markdown-math/tsconfig.json',
 	'merge-conflict/tsconfig.json',
@@ -63,12 +64,13 @@ const compilations = [
 	'npm/tsconfig.json',
 	'php-language-features/tsconfig.json',
 	'search-result/tsconfig.json',
+	'references-view/tsconfig.json',
 	'simple-browser/tsconfig.json',
 	'typescript-language-features/test-workspace/tsconfig.json',
 	'typescript-language-features/tsconfig.json',
 	'vscode-api-tests/tsconfig.json',
 	'vscode-colorize-tests/tsconfig.json',
-	'vscode-custom-editor-tests/tsconfig.json',
+	'vscode-notebook-tests/tsconfig.json',
 	'vscode-test-resolver/tsconfig.json'
 ];
 */
@@ -93,7 +95,7 @@ const tasks = compilations.map(function (tsconfigFile) {
 	const baseUrl = getBaseUrl(out);
 	let headerId, headerOut;
-	let index = relativeDirname.indexOf('/');
+	const index = relativeDirname.indexOf('/');
 	if (index < 0) {
 		headerId = 'microsoft.' + relativeDirname; // {{SQL CARBON EDIT}}
 		headerOut = 'out';
@@ -102,9 +104,9 @@ const tasks = compilations.map(function (tsconfigFile) {
 		headerOut = relativeDirname.substr(index + 1) + '/out';
 	}
-	function createPipeline(build, emitError) {
+	function createPipeline(build, emitError, transpileOnly) {
 		const nlsDev = require('vscode-nls-dev');
-		const tsb = require('gulp-tsb');
+		const tsb = require('./lib/tsb');
 		const sourcemaps = require('gulp-sourcemaps');
 		const reporter = createReporter('extensions');
@@ -112,7 +114,7 @@ const tasks = compilations.map(function (tsconfigFile) {
 		overrideOptions.inlineSources = Boolean(build);
 		overrideOptions.base = path.dirname(absolutePath);
-		const compilation = tsb.create(absolutePath, overrideOptions, false, err => reporter(err.toString()));
+		const compilation = tsb.create(absolutePath, overrideOptions, { verbose: false, transpileOnly, transpileOnlyIncludesDts: transpileOnly }, err => reporter(err.toString()));
 		const pipeline = function () {
 			const input = es.through();
@@ -154,6 +156,16 @@ const tasks = compilations.map(function (tsconfigFile) {
 	const cleanTask = task.define(`clean-extension-${name}`, util.rimraf(out));
+	const transpileTask = task.define(`transpile-extension:${name}`, task.series(cleanTask, () => {
+		const pipeline = createPipeline(false, true, true);
+		const nonts = gulp.src(src, srcOpts).pipe(filter(['**', '!**/*.ts']));
+		const input = es.merge(nonts, pipeline.tsProjectSrc());
+		return input
+			.pipe(pipeline())
+			.pipe(gulp.dest(out));
+	}));
 	const compileTask = task.define(`compile-extension:${name}`, task.series(cleanTask, () => {
 		const pipeline = createPipeline(false, true);
 		const nonts = gulp.src(src, srcOpts).pipe(filter(['**', '!**/*.ts']));
@@ -186,12 +198,16 @@ const tasks = compilations.map(function (tsconfigFile) {
 	}));
 	// Tasks
+	gulp.task(transpileTask);
 	gulp.task(compileTask);
 	gulp.task(watchTask);
-	return { compileTask, watchTask, compileBuildTask };
+	return { transpileTask, compileTask, watchTask, compileBuildTask };
 });
+const transpileExtensionsTask = task.define('transpile-extensions', task.parallel(...tasks.map(t => t.transpileTask)));
+gulp.task(transpileExtensionsTask);
 const compileExtensionsTask = task.define('compile-extensions', task.parallel(...tasks.map(t => t.compileTask)));
 gulp.task(compileExtensionsTask);
exports.compileExtensionsTask = compileExtensionsTask;
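The new `transpile-extension:<name>` tasks above reuse the clean-then-build `task.series` pattern, and `transpile-extensions` fans them out with `task.parallel`. A hypothetical standalone sketch of that composition (these tiny helpers stand in for the real `./lib/task` module, and the step functions are illustrative):

```javascript
// series: run steps one after another; parallel: start them all and wait.
const series = (...fns) => async () => { for (const fn of fns) { await fn(); } };
const parallel = (...fns) => () => Promise.all(fns.map(fn => fn()));

const log = [];
const clean = async () => { log.push('clean'); };
const transpile = async () => { log.push('transpile'); };

// transpile-extension:<name> ~ series(cleanTask, transpileStep)
const transpileExtension = series(clean, transpile);
// transpile-extensions ~ parallel over every extension's transpile task
const run = parallel(transpileExtension, transpileExtension)();

run.then(() => console.log(log.join(',')));
```

The design point is that each per-extension task owns its clean step, so the parallel aggregate needs no ordering knowledge of its children.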


@@ -16,7 +16,7 @@ function checkPackageJSON(actualPath) {
 	const actual = require(path.join(__dirname, '..', actualPath));
 	const rootPackageJSON = require('../package.json');
 	const checkIncluded = (set1, set2) => {
-		for (let depName in set1) {
+		for (const depName in set1) {
 			if (depName === 'typescript') {
 				continue;
 			}


@@ -11,7 +11,7 @@ require('events').EventEmitter.defaultMaxListeners = 100;
 const gulp = require('gulp');
 const util = require('./lib/util');
 const task = require('./lib/task');
-const { compileTask, watchTask, compileApiProposalNamesTask, watchApiProposalNamesTask } = require('./lib/compilation');
+const { transpileTask, compileTask, watchTask, compileApiProposalNamesTask, watchApiProposalNamesTask } = require('./lib/compilation');
 const { monacoTypecheckTask/* , monacoTypecheckWatchTask */ } = require('./gulpfile.editor');
 const { compileExtensionsTask, watchExtensionsTask, compileExtensionMediaTask } = require('./gulpfile.extensions');
@@ -19,6 +19,10 @@ const { compileExtensionsTask, watchExtensionsTask, compileExtensionMediaTask }
 gulp.task(compileApiProposalNamesTask);
 gulp.task(watchApiProposalNamesTask);
+// Transpile only
+const transpileClientTask = task.define('transpile-client', task.series(util.rimraf('out'), util.buildWebNodePaths('out'), transpileTask('src', 'out')));
+gulp.task(transpileClientTask);
 // Fast compile for development time
 const compileClientTask = task.define('compile-client', task.series(util.rimraf('out'), util.buildWebNodePaths('out'), compileApiProposalNamesTask, compileTask('src', 'out', false)));
 gulp.task(compileClientTask);


@@ -116,7 +116,7 @@ const serverEntryPoints = [
 		name: 'vs/workbench/api/node/extensionHostProcess',
 		exclude: ['vs/css', 'vs/nls']
 	},
 	{
 		name: 'vs/platform/files/node/watcher/watcherMain',
 		exclude: ['vs/css', 'vs/nls']
 	},
@@ -135,7 +135,7 @@ try {
 		// Include workbench web
 		...vscodeWebEntryPoints
 	];
 } catch (err) {
 	serverWithWebEntryPoints = [
 		// Include all of server
@@ -356,14 +356,14 @@ function copyConfigTask(folder) {
 		const json = require('gulp-json-editor');
 		return gulp.src(['remote/pkg-package.json'], { base: 'remote' })
 			.pipe(rename(path => path.basename += '.' + folder))
 			.pipe(json(obj => {
 				const pkg = obj.pkg;
 				pkg.scripts = pkg.scripts && pkg.scripts.map(p => path.join(destination, p));
 				pkg.assets = pkg.assets && pkg.assets.map(p => path.join(destination, p));
 				return obj;
 			}))
 			.pipe(vfs.dest('out-vscode-reh-pkg'));
 	};
 }


@@ -13,7 +13,7 @@ const ext = require('./lib/extensions');
const loc = require('./lib/locFunc'); const loc = require('./lib/locFunc');
const task = require('./lib/task'); const task = require('./lib/task');
const glob = require('glob'); const glob = require('glob');
const vsce = require('vsce'); const vsce = require('@vscode/vsce');
const mkdirp = require('mkdirp'); const mkdirp = require('mkdirp');
const rename = require('gulp-rename'); const rename = require('gulp-rename');
const fs = require('fs'); const fs = require('fs');
@@ -103,17 +103,22 @@ gulp.task('package-external-extensions', task.series(
return { name: extensionName, path: extensionPath }; return { name: extensionName, path: extensionPath };
}) })
.filter(element => ext.vscodeExternalExtensions.indexOf(element.name) === -1) // VS Code external extensions are bundled into ADS so no need to create a normal VSIX for them .filter(element => ext.vscodeExternalExtensions.indexOf(element.name) === -1) // VS Code external extensions are bundled into ADS so no need to create a normal VSIX for them
.map(element => { .map(async element => {
const pkgJson = require(path.join(element.path, 'package.json')); try {
const vsixDirectory = path.join(root, '.build', 'extensions'); const pkgJson = require(path.join(element.path, 'package.json'));
mkdirp.sync(vsixDirectory); const vsixDirectory = path.join(root, '.build', 'extensions');
const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`); mkdirp.sync(vsixDirectory);
console.info('Creating vsix for ' + element.path + ' result:' + packagePath); const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
return vsce.createVSIX({ console.info(`Creating vsix for ${element.path}, result: ${packagePath}`);
cwd: element.path, return await vsce.createVSIX({
packagePath: packagePath, cwd: element.path,
useYarn: true packagePath: packagePath,
}); useYarn: false
});
} catch (e) {
console.error(`Failed to create vsix for ${element.path}, error occurred: ${e}`);
throw e;
}
}); });
// Wait for all the initial VSIXes to be completed before making the VS Code ones since we'll be overwriting // Wait for all the initial VSIXes to be completed before making the VS Code ones since we'll be overwriting
// values in the package.json for those. // values in the package.json for those.
@@ -124,50 +129,54 @@ gulp.task('package-external-extensions', task.series(
// the package.json. It doesn't handle more complex tasks such as replacing localized strings. // the package.json. It doesn't handle more complex tasks such as replacing localized strings.
const vscodeVsixes = glob.sync('.build/external/extensions/*/package.vscode.json') const vscodeVsixes = glob.sync('.build/external/extensions/*/package.vscode.json')
.map(async vscodeManifestRelativePath => { .map(async vscodeManifestRelativePath => {
const vscodeManifestFullPath = path.join(root, vscodeManifestRelativePath); try {
const packageDir = path.dirname(vscodeManifestFullPath); const vscodeManifestFullPath = path.join(root, vscodeManifestRelativePath);
const packageManifestPath = path.join(packageDir, 'package.json'); const packageDir = path.dirname(vscodeManifestFullPath);
const json = require('gulp-json-editor'); const packageManifestPath = path.join(packageDir, 'package.json');
+		const json = require('gulp-json-editor');
 		const packageJsonStream = gulp.src(packageManifestPath) // Create stream for the original package.json
 			.pipe(json(data => {
 				// And now use gulp-json-editor to modify the contents
 				const updateData = JSON.parse(fs.readFileSync(vscodeManifestFullPath)); // Read in the set of values to replace from package.vscode.json
 				Object.keys(updateData).forEach(key => {
 					if (key !== 'contributes') {
 						data[key] = updateData[key];
 					}
 				});
 				if (data.contributes?.menus) {
 					// Remove ADS-only menus. This is a subset of the menus listed in https://github.com/microsoft/azuredatastudio/blob/main/src/vs/workbench/api/common/menusExtensionPoint.ts
 					// More can be added to the list as needed.
 					['objectExplorer/item/context', 'dataExplorer/context', 'dashboard/toolbar'].forEach(menu => {
 						delete data.contributes.menus[menu];
 					});
 				}
 				// Add any configuration properties from the package.vscode.json
 				// Currently only supports bringing over properties in the first config object found and doesn't support modifying the title
 				if (updateData.contributes?.configuration[0]?.properties) {
 					Object.keys(updateData.contributes.configuration[0].properties).forEach(key => {
 						data.contributes.configuration[0].properties[key] = updateData.contributes.configuration[0].properties[key];
 					});
 				}
 				return data;
 			}, { beautify: false }))
 			.pipe(gulp.dest(packageDir));
 		await new Promise(resolve => packageJsonStream.on('finish', resolve)); // Wait for the files to finish being updated before packaging
 		const pkgJson = JSON.parse(fs.readFileSync(packageManifestPath));
 		const vsixDirectory = path.join(root, '.build', 'extensions');
 		const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
-		console.info('Creating vsix for ' + packageDir + ' result:' + packagePath);
-		return vsce.createVSIX({
+		console.info(`Creating vsix for ${packageDir} result: ${packagePath}`);
+		return await vsce.createVSIX({
 			cwd: packageDir,
 			packagePath: packagePath,
-			useYarn: true
+			useYarn: false
 		});
+	} catch (e) {
+		console.error(`Failed to create vsix for ${packageDir}, error occurred: ${e}`);
+		throw e;
+	}
 	});
 	return Promise.all(vscodeVsixes);
 })
 ));
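The core of the package.json rewrite above can be sketched without gulp: overlay every top-level key from the VS Code manifest except `contributes`, then strip the menu contribution points that only exist in Azure Data Studio. The helper name `mergeVsCodePackage` and the sample data are illustrative, not part of the build scripts:

```javascript
// Sketch of the package.json rewrite performed by the gulp task above.
// ADS_ONLY_MENUS mirrors the list in the diff; mergeVsCodePackage is a
// hypothetical helper name, not a function from the repository.
const ADS_ONLY_MENUS = ['objectExplorer/item/context', 'dataExplorer/context', 'dashboard/toolbar'];

function mergeVsCodePackage(data, updateData) {
	// Overlay every top-level key from the VS Code manifest except `contributes`.
	Object.keys(updateData).forEach(key => {
		if (key !== 'contributes') {
			data[key] = updateData[key];
		}
	});
	// Drop menu contribution points that only exist in Azure Data Studio.
	if (data.contributes?.menus) {
		ADS_ONLY_MENUS.forEach(menu => delete data.contributes.menus[menu]);
	}
	return data;
}
```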
@@ -179,17 +188,22 @@ gulp.task('package-langpacks', task.series(
 		const extensionPath = path.dirname(path.join(root, manifestPath));
 		const extensionName = path.basename(extensionPath);
 		return { name: extensionName, path: extensionPath };
-	}).map(element => {
-		const pkgJson = require(path.join(element.path, 'package.json'));
-		const vsixDirectory = path.join(root, '.build', 'langpacks');
-		mkdirp.sync(vsixDirectory);
-		const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
-		console.info('Creating vsix for ' + element.path + ' result:' + packagePath);
-		return vsce.createVSIX({
-			cwd: element.path,
-			packagePath: packagePath,
-			useYarn: true
-		});
+	}).map(async element => {
+		try {
+			const pkgJson = require(path.join(element.path, 'package.json'));
+			const vsixDirectory = path.join(root, '.build', 'langpacks');
+			mkdirp.sync(vsixDirectory);
+			const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
+			console.info('Creating vsix for ' + element.path + ' result:' + packagePath);
+			return await vsce.createVSIX({
+				cwd: element.path,
+				packagePath: packagePath,
+				useYarn: false
+			});
+		} catch (e) {
+			console.error(`Failed to create vsix for ${element.path}, error occurred: ${e}`);
+			throw e;
+		}
 	});
 	return Promise.all(vsixes);
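The pattern introduced in this hunk, an `async` mapper that logs and rethrows, keeps the `Promise.all` rejection while still attributing the failure to a specific extension. A minimal sketch under invented names (`buildOne` stands in for `vsce.createVSIX`):

```javascript
// Minimal sketch of the log-and-rethrow pattern from the hunk above.
// buildOne and buildAll are illustrative names, not build-script functions.
async function buildOne(name) {
	if (name === 'bad') throw new Error('packaging failed');
	return `${name}.vsix`;
}

async function buildAll(names) {
	const tasks = names.map(async name => {
		try {
			return await buildOne(name);
		} catch (e) {
			// Without the rethrow, Promise.all would resolve with `undefined`
			// for the failed entry and the build would silently "succeed".
			console.error(`Failed to create vsix for ${name}: ${e}`);
			throw e;
		}
	});
	return Promise.all(tasks);
}
```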


@@ -64,7 +64,6 @@ const vscodeResources = [
 	'out-build/vs/base/browser/ui/codicons/codicon/**',
 	'out-build/vs/base/parts/sandbox/electron-browser/preload.js',
 	'out-build/vs/platform/environment/node/userDataPath.js',
-	'out-build/vs/platform/extensions/node/extensionHostStarterWorkerMain.js',
 	'out-build/vs/workbench/browser/media/*-theme.css',
 	'out-build/vs/workbench/contrib/debug/**/*.json',
 	'out-build/vs/workbench/contrib/externalTerminal/**/*.scpt',
@@ -76,7 +75,7 @@ const vscodeResources = [
 	'out-build/vs/workbench/contrib/tasks/**/*.json',
 	'out-build/vs/platform/files/**/*.exe',
 	'out-build/vs/platform/files/**/*.md',
-	'out-build/vs/code/electron-browser/workbench/**',
+	'out-build/vs/code/electron-sandbox/workbench/**',
 	'out-build/vs/code/electron-browser/sharedProcess/sharedProcess.js',
 	'out-build/vs/code/electron-sandbox/issue/issueReporter.js',
 	'out-build/sql/**/*.{svg,png,cur,html}',
@@ -124,7 +123,6 @@ const extensionsFilter = filter([
 	'**/asde-deployment.xlf',
 	'**/azcli.xlf',
 	'**/azurecore.xlf',
-	'**/azurehybridtoolkit.xlf',
 	'**/azuremonitor.xlf',
 	'**/cms.xlf',
 	'**/dacpac.xlf',
@@ -134,7 +132,6 @@ const extensionsFilter = filter([
 	'**/import.xlf',
 	'**/kusto.xlf',
 	'**/machine-learning.xlf',
-	'**/Microsoft.sqlservernotebook.xlf',
 	'**/mssql.xlf',
 	'**/notebook.xlf',
 	'**/profiler.xlf',
@@ -187,9 +184,9 @@ gulp.task(core);
  * @return {Object} A map of paths to checksums.
  */
 function computeChecksums(out, filenames) {
-	let result = {};
+	const result = {};
 	filenames.forEach(function (filename) {
-		let fullPath = path.join(process.cwd(), out, filename);
+		const fullPath = path.join(process.cwd(), out, filename);
 		result[filename] = computeChecksum(fullPath);
 	});
 	return result;
@@ -202,9 +199,9 @@ function computeChecksums(out, filenames) {
  * @return {string} The checksum for `filename`.
  */
 function computeChecksum(filename) {
-	let contents = fs.readFileSync(filename);
-	let hash = crypto
+	const contents = fs.readFileSync(filename);
+	const hash = crypto
 		.createHash('md5')
 		.update(contents)
 		.digest('base64')
@@ -230,8 +227,8 @@ function packageTask(platform, arch, sourceFolderName, destinationFolderName, op
 		'vs/workbench/workbench.desktop.main.js',
 		'vs/workbench/workbench.desktop.main.css',
 		'vs/workbench/api/node/extensionHostProcess.js',
-		'vs/code/electron-browser/workbench/workbench.html',
-		'vs/code/electron-browser/workbench/workbench.js'
+		'vs/code/electron-sandbox/workbench/workbench.html',
+		'vs/code/electron-sandbox/workbench/workbench.js'
 	]);
 	const src = gulp.src(out + '/**', { base: '.' })
@@ -322,6 +319,7 @@ function packageTask(platform, arch, sourceFolderName, destinationFolderName, op
 		all = es.merge(all, gulp.src('resources/linux/code.png', { base: '.' }));
 	} else if (platform === 'darwin') {
 		const shortcut = gulp.src('resources/darwin/bin/code.sh')
+			.pipe(replace('@@APPNAME@@', product.applicationName))
 			.pipe(rename('bin/code'));
 		all = es.merge(all, shortcut);
@@ -366,10 +364,14 @@ function packageTask(platform, arch, sourceFolderName, destinationFolderName, op
 		result = es.merge(result, gulp.src('resources/win32/VisualElementsManifest.xml', { base: 'resources/win32' })
 			.pipe(rename(product.nameShort + '.VisualElementsManifest.xml')));
+		result = es.merge(result, gulp.src('.build/policies/win32/**', { base: '.build/policies/win32' })
+			.pipe(rename(f => f.dirname = `policies/${f.dirname}`)));
 	} else if (platform === 'linux') {
 		result = es.merge(result, gulp.src('resources/linux/bin/code.sh', { base: '.' })
 			.pipe(replace('@@PRODNAME@@', product.nameLong))
-			.pipe(replace('@@NAME@@', product.applicationName))
+			.pipe(replace('@@APPNAME@@', product.applicationName))
 			.pipe(rename('bin/' + product.applicationName)));
 	}
@@ -523,7 +525,7 @@ gulp.task(task.define(
 gulp.task('vscode-translations-pull', function () {
 	return es.merge([...i18n.defaultLanguages, ...i18n.extraLanguages].map(language => {
-		let includeDefault = !!innoSetupConfig[language.id].defaultInfo;
+		const includeDefault = !!innoSetupConfig[language.id].defaultInfo;
 		return i18n.pullSetupXlfFiles(apiHostname, apiName, apiToken, language, includeDefault).pipe(vfs.dest(`../vscode-translations-import/${language.id}/setup`));
 	}));
 });


@@ -16,6 +16,9 @@ const task = require('./lib/task');
 const packageJson = require('../package.json');
 const product = require('../product.json');
 const rpmDependenciesGenerator = require('./linux/rpm/dependencies-generator');
+const debianDependenciesGenerator = require('./linux/debian/dependencies-generator');
+const sysrootInstaller = require('./linux/debian/install-sysroot');
+const debianRecommendedDependencies = require('./linux/debian/dep-lists').recommendedDeps;
 const path = require('path');
 const root = path.dirname(__dirname);
 const commit = util.getVersion(root);
@@ -75,12 +78,16 @@ function prepareDebPackage(arch) {
 	let size = 0;
 	const control = code.pipe(es.through(
 		function (f) { size += f.isDirectory() ? 4096 : f.contents.length; },
-		function () {
+		async function () {
 			const that = this;
+			const sysroot = await sysrootInstaller.getSysroot(debArch);
+			const dependencies = debianDependenciesGenerator.getDependencies(binaryDir, product.applicationName, debArch, sysroot);
 			gulp.src('resources/linux/debian/control.template', { base: '.' })
 				.pipe(replace('@@NAME@@', product.applicationName))
 				.pipe(replace('@@VERSION@@', packageJson.version + '-' + linuxPackageRevision))
 				.pipe(replace('@@ARCHITECTURE@@', debArch))
+				.pipe(replace('@@DEPENDS@@', dependencies.join(', ')))
+				.pipe(replace('@@RECOMMENDS@@', debianRecommendedDependencies.join(', ')))
 				.pipe(replace('@@INSTALLEDSIZE@@', Math.ceil(size / 1024)))
 				.pipe(rename('DEBIAN/control'))
 				.pipe(es.through(function (f) { that.emit('data', f); }, function () { that.emit('end'); }));
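The DEBIAN/control file above is rendered by plain token substitution over a template, and the new `@@DEPENDS@@`/`@@RECOMMENDS@@` fields follow the same pattern. A sketch of that substitution without gulp; `renderTemplate` and the sample template text are invented for illustration:

```javascript
// Illustrative @@TOKEN@@ substitution, mimicking the gulp-replace chain above.
function renderTemplate(template, values) {
	// Unknown tokens are left untouched so missing values are easy to spot.
	return template.replace(/@@([A-Z]+)@@/g, (match, key) =>
		key in values ? String(values[key]) : match);
}

const control = renderTemplate(
	'Package: @@NAME@@\nArchitecture: @@ARCHITECTURE@@\nDepends: @@DEPENDS@@\n',
	{ NAME: 'azuredatastudio', ARCHITECTURE: 'amd64', DEPENDS: ['libc6', 'libgtk-3-0'].join(', ') }
);
```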
@@ -212,7 +219,7 @@ function buildRpmPackage(arch) {
 	return shell.task([
 		'mkdir -p ' + destination,
-		'HOME="$(pwd)/' + destination + '" fakeroot rpmbuild -bb ' + rpmBuildPath + '/SPECS/' + product.applicationName + '.spec --target=' + rpmArch,
+		'HOME="$(pwd)/' + destination + '" rpmbuild -bb ' + rpmBuildPath + '/SPECS/' + product.applicationName + '.spec --target=' + rpmArch,
 		'cp "' + rpmOut + '/$(ls ' + rpmOut + ')" ' + destination + '/'
 	]);
 }


@@ -208,7 +208,7 @@ function packageTask(sourceFolderName, destinationFolderName) {
 		gulp.src('resources/server/code-512.png', { base: 'resources/server' })
 	);
-	let all = es.merge(
+	const all = es.merge(
 		packageJsonStream,
 		license,
 		sources,
@@ -218,7 +218,7 @@ function packageTask(sourceFolderName, destinationFolderName) {
 		pwaicons
 	);
-	let result = all
+	const result = all
 		.pipe(util.skipDirectories())
 		.pipe(util.fixWin32DirectoryPermissions());


@@ -53,7 +53,7 @@ function hygiene(some, linting = true) {
 		const m = /([^\t\n\r\x20-\x7E⊃⊇✔✓🎯⚠🛑🔴🚗🚙🚕🎉✨❗⇧⌥⌘×÷¦⋯…↑↓→→←↔⟷·•●◆▼⟪⟫┌└├⏎↩√φ]+)/g.exec(line);
 		if (m) {
 			console.error(
-				file.relative + `(${i + 1},${m.index + 1}): Unexpected unicode character: "${m[0]}". To suppress, use // allow-any-unicode-next-line`
+				file.relative + `(${i + 1},${m.index + 1}): Unexpected unicode character: "${m[0]}" (charCode: ${m[0].charCodeAt(0)}). To suppress, use // allow-any-unicode-next-line`
 			);
 			errorCount++;
 		}
@@ -115,8 +115,8 @@ function hygiene(some, linting = true) {
 		})
 		.then(
 			(result) => {
-				let original = result.src.replace(/\r\n/gm, '\n');
-				let formatted = result.dest.replace(/\r\n/gm, '\n');
+				const original = result.src.replace(/\r\n/gm, '\n');
+				const formatted = result.dest.replace(/\r\n/gm, '\n');
 				if (original !== formatted) {
 					console.error(
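The improved hygiene diagnostic above appends the offending character's code unit, which disambiguates visually identical glyphs. A reduced sketch of the check, with the allow-list shortened to a few characters for illustration (`findUnexpectedUnicode` is an invented name):

```javascript
// Reduced sketch of the hygiene unicode check; the real allow-list is far longer.
function findUnexpectedUnicode(line) {
	// Anything outside tab/newline/printable-ASCII and a small allow-list is flagged.
	const m = /([^\t\n\r\x20-\x7E✔✓…]+)/g.exec(line);
	if (!m) return null;
	return `(${m.index + 1}): Unexpected unicode character: "${m[0]}" (charCode: ${m[0].charCodeAt(0)})`;
}
```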


@@ -1,15 +0,0 @@
-{
-	"compilerOptions": {
-		"module": "commonjs",
-		"target": "es2017",
-		"jsx": "preserve",
-		"checkJs": true
-	},
-	"include": [
-		"**/*.js"
-	],
-	"exclude": [
-		"node_modules",
-		"**/node_modules/*"
-	]
-}


@@ -1,8 +1,8 @@
+"use strict";
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-'use strict';
 Object.defineProperty(exports, "__esModule", { value: true });
 exports.createAsar = void 0;
 const path = require("path");
@@ -81,7 +81,7 @@ function createAsar(folderPath, unpackGlobs, destFilename) {
 			out.push(file.contents);
 		}
 	}, function () {
-		let finish = () => {
+		const finish = () => {
 			{
 				const headerPickle = pickle.createEmpty();
 				headerPickle.writeString(JSON.stringify(filesystem.header));


@@ -3,8 +3,6 @@
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
-
-'use strict';
 import * as path from 'path';
 import * as es from 'event-stream';
 const pickle = require('chromium-pickle-js');
@@ -98,7 +96,7 @@ export function createAsar(folderPath: string, unpackGlobs: string[], destFilena
 		}
 	}, function () {
-		let finish = () => {
+		const finish = () => {
 			{
 				const headerPickle = pickle.createEmpty();
 				headerPickle.writeString(JSON.stringify(filesystem.header));


@@ -45,8 +45,7 @@ function isUpToDate(extension) {
 	}
 }
 function syncMarketplaceExtension(extension) {
-	var _a;
-	const galleryServiceUrl = (_a = productjson.extensionsGallery) === null || _a === void 0 ? void 0 : _a.serviceUrl;
+	const galleryServiceUrl = productjson.extensionsGallery?.serviceUrl;
 	const source = ansiColors.blue(galleryServiceUrl ? '[marketplace]' : '[github]');
 	if (isUpToDate(extension)) {
 		log(source, `${extension.name}@${extension.version}`, ansiColors.green('✔︎'));
@@ -98,12 +97,12 @@ function writeControlFile(control) {
 	fs.writeFileSync(controlFilePath, JSON.stringify(control, null, 2));
 }
 function getBuiltInExtensions() {
-	log('Syncronizing built-in extensions...');
+	log('Synchronizing built-in extensions...');
 	log(`You can manage built-in extensions with the ${ansiColors.cyan('--builtin')} flag`);
 	const control = readControlFile();
 	const streams = [];
 	for (const extension of [...builtInExtensions, ...webBuiltInExtensions]) {
-		let controlState = control[extension.name] || 'marketplace';
+		const controlState = control[extension.name] || 'marketplace';
 		control[extension.name] = controlState;
 		streams.push(syncExtension(extension, controlState));
 	}
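The `var _a` / `void 0` sequence deleted in this hunk is just TypeScript's down-level compilation of optional chaining; compiling for a newer ES target lets the `?.` form survive into the emitted JavaScript. The two spellings are equivalent (the `productjson` value here is sample data, not the real product.json):

```javascript
// Comparing the down-leveled output being deleted with the native form it becomes.
const productjson = { extensionsGallery: { serviceUrl: 'https://example.test/gallery' } }; // sample data

// Down-leveled ES2017-style output (the form removed by the diff):
var _a;
const compiled = (_a = productjson.extensionsGallery) === null || _a === void 0 ? void 0 : _a.serviceUrl;

// Native ES2020 optional chaining (the form the diff keeps):
const native = productjson.extensionsGallery?.serviceUrl;
```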


@@ -136,14 +136,14 @@ function writeControlFile(control: IControlFile): void {
 }
 export function getBuiltInExtensions(): Promise<void> {
-	log('Syncronizing built-in extensions...');
+	log('Synchronizing built-in extensions...');
 	log(`You can manage built-in extensions with the ${ansiColors.cyan('--builtin')} flag`);
 	const control = readControlFile();
 	const streams: Stream[] = [];
 	for (const extension of [...builtInExtensions, ...webBuiltInExtensions]) {
-		let controlState = control[extension.name] || 'marketplace';
+		const controlState = control[extension.name] || 'marketplace';
 		control[extension.name] = controlState;
 		streams.push(syncExtension(extension, controlState));


@@ -18,7 +18,6 @@ const token = process.env['VSCODE_MIXIN_PASSWORD'] || process.env['GITHUB_TOKEN'
 const contentBasePath = 'raw.githubusercontent.com';
 const contentFileNames = ['package.json', 'package-lock.json', 'yarn.lock'];
 async function downloadExtensionDetails(extension) {
-	var _a, _b, _c;
 	const extensionLabel = `${extension.name}@${extension.version}`;
 	const repository = url.parse(extension.repo).path.substr(1);
 	const repositoryContentBaseUrl = `https://${token ? `${token}@` : ''}${contentBasePath}/${repository}/v${extension.version}`;
@@ -56,11 +55,11 @@ async function downloadExtensionDetails(extension) {
 		}
 	}
 	// Validation
-	if (!((_a = results.find(r => r.fileName === 'package.json')) === null || _a === void 0 ? void 0 : _a.body)) {
+	if (!results.find(r => r.fileName === 'package.json')?.body) {
 		// throw new Error(`The "package.json" file could not be found for the built-in extension - ${extensionLabel}`);
 	}
-	if (!((_b = results.find(r => r.fileName === 'package-lock.json')) === null || _b === void 0 ? void 0 : _b.body) &&
-		!((_c = results.find(r => r.fileName === 'yarn.lock')) === null || _c === void 0 ? void 0 : _c.body)) {
+	if (!results.find(r => r.fileName === 'package-lock.json')?.body &&
+		!results.find(r => r.fileName === 'yarn.lock')?.body) {
 		// throw new Error(`The "package-lock.json"/"yarn.lock" could not be found for the built-in extension - ${extensionLabel}`);
 	}
 }

Some files were not shown because too many files have changed in this diff.