Compare commits

..

75 Commits

Author SHA1 Message Date
Charles Gagnon
d296b6397e Fix HDFS node for Integrated auth (#12906) (#12907)
* Fix some HDFS issues

* Undo other changes
2020-10-13 15:10:11 -07:00
Vasu Bhog
05615c796d Fix connection when changing kernel from Kusto to SQL (#12881) (#12887)
* Fix Kusto to SQL kernel connection change

* Updated Fix - removes kernel alias mapping while ensuring multi kusto notebooks work properly

* Fix tests
2020-10-12 10:52:24 -07:00
Aasim Khan
e60b01ac00 Adding sql vm and sql db notebooks to october (#12880)
* SQL VM deployments (#12144)

* Added sql vm deployment option

* Added more fields for sql vm deployments

* created basic sqlvm deployment. Mostly hardcoded

* added string to package.nls

* added poc deployments for sql vm

* Made some changes in the notebook that was mentioned in PR

* Added scaffolding for azure sql vm wizard.

* code cleanups

* added some async logic

* added loading component

* fixed loader code

* completed page2 of wizard

* added some more required fields.

* added some more fields

* added network settings page

* added sql server settings page

* added azure signin support and sql server settings page

* added some helper methods in wizard code

* added some fixes

* fixed azure and vm setting page
added validation in azure setting page

* added changes for the notebook variable

* validations and other bug fixes

* commenting sql storage optimization dropdown

* cleanedup wizard base page

* reversing  vm image list to display newer images first

* cleaning model code

* added validations for network setting

* Completed summary page
fixed the code position
some additional field validations

* fixed networking page

* - fixed an error with vm size model variable
- removed byol images because it was not working with az sql vm
- Fixed vm size display names in dropdown

* added double quotes to some localized strings

* added some space inside strings

* -Added live validations
-Restyled network component
-Added required to regions
-Some bug fixes

* -redesigned summary page
-localized some strings

* Fixed summary page section titles

* -Fixed validations on sql server settings page
-Fixed some fields on Summary Page

* corrected onleave validation
using array for error messages
using Promise.all

* Fixed bug on network settings dropdowns when user does not have existing resource to populate them

* Change resource deployment display name
Added Ninar's iteration of the notebook
Changed RDP check box label
Surfacing API errors to user
Filtering regions based on Azure VM regions and user's subscription region
Made form validation async
Displaying new checkbox on network page when dropdowns empty
Fixed a small bug in SQL auth form validation
Made summary single item per row and fixed the gaps in spacing
Fixed validations in vm page
Checking if vm name already exists on azure

* Fixed sql vm eula
Fixed sql vm description
Added hyperlink for more info on vm sizes

* Replaced loading component with dropdown loaders.

* localized string
Fixed a bug in network settings page

* Added additional filtering

* added reverse to image list

* Fixing some merge related issues

* Fixed conflicts

* sql db deployments into main (WIP) (#12767)

* added my resource-deployment

* changed notebook message

* Add more advanced properties for spark job submission dialog (#12732)

* Add more advanced properties for spark job submission dialog

* Add queue

* Revert "Add more advanced properties for spark job submission dialog (#12732)"

This reverts commit e6a7e86ddbe70b39660098a8ebd9ded2a1c5530c.

* Changes made for simplification

* changed error messages

* tags added

* tags removed due to redundancy

* Update package.json

* Update resourceTypePickerDialog.ts

* changes based on feedback

* activateRealTimeValidation removed

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* adding tags to sql vm

* added register navigation for Azure settings page

* simplified check

Co-authored-by: Alex Ma <alma1@microsoft.com>
Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
2020-10-11 13:06:41 -07:00
Charles Gagnon
b68cdbeebe Update HDFS mount path (#12865) (#12866) 2020-10-09 15:49:21 -07:00
Charles Gagnon
7429407029 [Port] Sync up arc and azdata extensions with main (#12810)
* Sync up arc and azdata extensions with main

* capture 'this' to use retrieveVariable as callback (#12828)

* capture 'this' to use retrieveVariable as callback

* remove change not needed for #12082

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-08 16:03:27 -07:00
Barbara Valdez
6adeffbc8e replace pip in notebook (#12808) (#12827) 2020-10-08 15:23:30 -07:00
Chris LaFreniere
8a078d2d68 default to relative links in images and links (#12802) (#12813) 2020-10-08 12:30:11 -07:00
Charles Gagnon
eadac3af3a Fix arc strings (#12803) 2020-10-07 20:37:59 -07:00
Charles Gagnon
8e8d9b5f59 port c679d5e1f0 (#12780)
Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-07 20:36:51 -07:00
Aasim Khan
93e806cca1 Aasim/release1.23/resource filter (#12796)
* Added categories and search based filtering to the resource dialog. (#12658)

* added filtering to the resource type along with a new component.

* -Added caching of cards
-Removed unused component props
-localized tags
-limited the scope of list items

* Made some changes in the PR

* - Added Iot Category to SQL edge
- Moved category names to constants
- Moved localization strings to localized constants
- Made filtering logic more concise
- Changed how category list is generated
--Category list can now be ordered
-Added back event generation for selectedCard

* Fixed bugs, and some additional changes
-Fixed radiogroup height to avoid the movement of options below it
-Restoring the focus back to the search and listview components
- Added focus behaviour for listview
- Fixed a typo in comment

* Made categories an Enum

* Added localized string

* localized category string
converted categories to enum.

* made the filtering logic more concise.

* returning string if no localized string formed
removed unnecessary returns

* fixed the filtering tag logic
resetting search when category is changed

* removing the iot tag from sql edge deployment

* made filtering logic more concise
made enum const

* added vscode list

* some cleanup

* Some PR changes
- Made PR camelcase
- added comments to SQL
- removed unnecessary export

* -Some PR related changes
-Removing unsupported style property
-scoping down css and removing unused ones.

* Fixed a comment text

* Fixed typings for listview event

* Adding tags to azure sql deployment
2020-10-07 14:55:09 -07:00
Charles Gagnon
98ed0d5274 cherry-picked from b8de69dfac (#12777)
Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-07 10:35:10 -07:00
Chris LaFreniere
7bca43524e Notebooks: WYSIWYG Add Redo, Fix Shortcuts (#12752) (#12784)
* Add redo and out/indent

* Check for active cell before doing shortcut

* PR feedback

* Remove unnecessary parameter
2020-10-07 10:01:46 -07:00
Charles Gagnon
a8c983519e Save username/password for BDC HDFS connections (#12667) (#12778)
* Save username/password for BDC HDFS connections

* comment
2020-10-06 21:51:04 -07:00
Charles Gagnon
ac6ef2639f Port 807a4ae8c4 (#12747) 2020-10-06 13:41:27 -07:00
Barbara Valdez
35957cc283 Fix search for pinned notebooks (#12719) (#12766)
* fix search for pinned notebooks

* fix filtering when verifying that a search folder is not a subdirectory from the current folder queries path

* Show book node on pinned notebooks search results

* fix parent node on pinned notebooks search results

* fix search for pinned notebook and modify how pinned notebooks are stored in workspace

* update format of pinned notebooks for users that used the september release version

* removed unused functions

* Address PR comments

* fix parent node for legacy version of jupyter books

* remove cast from book path
2020-10-06 13:38:39 -07:00
Charles Gagnon
b054295eac Add additional logging to spark command failures (#12706) (#12761) 2020-10-06 11:47:06 -07:00
Charles Gagnon
5b7a7c9865 Fix HDFS node to only show up for BDC connections (#12714) (#12762) 2020-10-06 11:36:40 -07:00
Charles Gagnon
867faae14f [Port] Improved behavior for accepting EULA. (#12453) (#12749)
* Improved behavior for accepting EULA. (#12453)

* working version of overloading "select" button

* promptForEula to use showErrorMessage

* make parameter optional in promptForEula

* remove test code

* PR feedback

* eula to EULA

* minor fix

* Fix compile error

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-05 18:52:16 -07:00
Charles Gagnon
4c6b606c82 use selected subscriptions (#12691) (#12741)
* working version

* pr feedback

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-05 18:51:26 -07:00
Monica Gupta
d5daaf918d Fix notebook issue when creating Kusto notebooks 2nd time after launching ADS (#12700) (#12750)
* Fix notebook issue

* Removed not required code

Co-authored-by: Monica Gupta <mogupt@microsoft.com>

Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-10-05 15:56:19 -07:00
Charles Gagnon
72d48bda61 Allow non-admin BDC connections to see BDC features (#12663) (#12737)
* Add handling for non-admin BDC users

* Bump STS

* Fix HDFS root node commands

* remove nested awaits

* colon
2020-10-05 15:55:23 -07:00
Charles Gagnon
93156ccf04 cherry-pick 7bfea07b9b (#12742)
Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-05 15:39:48 -07:00
Udeesha Gautam
781c7de772 ML extension - revised button component (#12674) (#12746)
* Revert "Revert "ML extension updates  (#11817)" (#12645)"

This reverts commit 34a6200a47.

* Modified button template and renamed infoButton ElementRef

* fix rendering issue

* Minor code cleanup.

* add clean up previous button logic

Co-authored-by: Alan Ren <alanren@microsoft.com>

Co-authored-by: Hale Rankin <harankin@microsoft.com>
Co-authored-by: Alan Ren <alanren@microsoft.com>
2020-10-05 13:58:01 -07:00
Charles Gagnon
41e8b73ac4 vBump notebooks to get latest CU6 version of book (#12683) (#12739) 2020-10-05 13:41:16 -07:00
Udeesha Gautam
61254c7298 Updating SqltoolsService Version to Pick DacFx changes (#12743)
Co-authored-by: Benjin Dubishar <benjin.dubishar@gmail.com>
2020-10-05 13:06:44 -07:00
Charles Gagnon
5f59fa021c Fix checkbox change event not firing on enter press (#12703) (#12735)
* Fix checkbox change event not firing

* Add comment
2020-10-05 12:53:59 -07:00
Charles Gagnon
1f65216889 Port bf9646ba98 (#12738) 2020-10-05 12:52:46 -07:00
Charles Gagnon
c801d46814 Fix root group name check (#12660) (#12736) 2020-10-05 12:51:23 -07:00
Alan Ren
6c85cf2bdd update preview feature notification (#12723) (#12734) 2020-10-05 12:48:28 -07:00
Aasim Khan
9067204979 Aasim/release1.23/importfixes (#12721)
* Fixing import getting stuck on step 4  (#12677)

* Getting the proper attribute during column modification
Exposing errors of change column settings and stopping import if they occur

* removing extra space

* Added a comment for error handling

* Fixed a test error that was caused by insufficient null checks.

* removing unnecessary return

* version bump of flat file services (#12686)
2020-10-02 15:17:55 -07:00
Karl Burtram
ac6bc56c4e Bump ADS to 1.23.0 2020-10-02 14:54:53 -07:00
Charles Gagnon
1b5c54dd8c revert grid streaming changes (#12650) (#12652)
(cherry picked from commit cf9754f627)

Co-authored-by: Lucy Zhang <luczhan@microsoft.com>
2020-09-28 21:49:49 -07:00
Aditya Bist
4082170522 bump version for hotfix (#12592) 2020-09-25 21:10:34 -07:00
Alan Ren
5ecf1c6e6f bump sts version (#12636) (#12638) 2020-09-25 14:59:04 -07:00
Charles Gagnon
6de11c8107 Fix undefined error in server tree data source (#12616) (#12617)
* Fix undefined error in server tree data source

* Add comment

(cherry picked from commit 1ea33d83bf)
2020-09-25 13:43:47 -07:00
Monica Gupta
76d7b0a9fe Addressed comments (#12618)
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-09-24 17:16:23 -07:00
Alan Ren
ce4c3e9586 clone the object to be modified (#12583) (#12590) 2020-09-23 13:42:44 -07:00
Alan Ren
5190bf376c escape the value for display (#12547) (#12571) 2020-09-22 14:50:41 -07:00
Udeesha Gautam
77b9a708df fix the reference error due to extra $ in default variable (#12524) 2020-09-21 10:21:23 -07:00
Udeesha Gautam
a4ee871b88 Port/db project fixes (#12521)
* Update default values and example text when dropdown value changes (#12493)

* remove option to add reference to same database (#12495)

Co-authored-by: Kim Santiago <31145923+kisantia@users.noreply.github.com>
2020-09-20 21:09:46 -07:00
Charles Gagnon
3f4e19fc08 Arc good ARC bad (#12499) (#12511)
Co-authored-by: Chris LaFreniere <40371649+chlafreniere@users.noreply.github.com>
2020-09-20 11:44:30 -07:00
Barbara Valdez
571fca6de5 In-Viewlet Notebooks Search (#12455) (#12514)
* fix search

* Add sql carbon tags to vs files

Co-authored-by: chlafreniere <hichise@gmail.com>
Co-authored-by: abist <adbist@microsoft.com>

Co-authored-by: chlafreniere <hichise@gmail.com>
Co-authored-by: abist <adbist@microsoft.com>
2020-09-19 18:13:10 -07:00
Barbara Valdez
5a2fdc4034 Add warning message for users using the new version of jupyter book (#12496) (#12500)
* Add warning message for users

* Address pr comments
2020-09-18 20:15:12 -07:00
Chris LaFreniere
cc6d84e7f6 Notebooks: Fix Grids Not Rendering when Unsaved Notebook Reloaded (#12483) (#12498)
* Clear Output and fix output change

* Fix tests after forced clear + append output
2020-09-18 20:14:45 -07:00
Vasu Bhog
99e11d2e22 Fix PySpark kernel connection change (#12494) (#12497) 2020-09-18 20:10:37 -07:00
Charles Gagnon
9a85123e21 Revert BDC deployment back to using old azdata check (#12470) (#12474) 2020-09-18 18:46:22 -07:00
Lucy Zhang
56669db6b6 update resultSet in data provider (#12478) (#12486) 2020-09-18 18:36:40 -07:00
Udeesha Gautam
8782eeb32f Port/ml fixes (#12491)
* change to allow refresh and delete correctly (#12477)

* add table name to models that are imported (#12445)
2020-09-18 17:44:58 -07:00
Charles Gagnon
7f3d5bac0a start with eulaCheckButton hidden (#12427) (#12458)
* start with eulaCheckButton hidden

* reset buttons on card select

* remove testcode

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-09-18 11:52:32 -07:00
Charles Gagnon
7a1e0a7d2e Fix resource deployment text field validation (#12421) (#12457) 2020-09-18 11:22:23 -07:00
Alan Ren
681ecbd946 fix the legacy card style issue (#12428) (#12442)
* fix the legacy card style issue

* replace the card class
2020-09-18 11:14:02 -07:00
Vasu Bhog
e7798a8e32 Fix Spark kernel connections and switch from Kusto to Spark kernels (#12436) (#12441)
* Fix connection dialog for Spark and issue when switching from Kusto to Spark

* Address comments
2020-09-17 21:32:03 -07:00
Aasim Khan
b158180ef4 Added portal link for Azure SQL (#12425) 2020-09-17 17:37:41 -07:00
Aditya Bist
7ad9da7fda fix connection dialog indentation (#12414) 2020-09-17 15:55:54 -07:00
Charles Gagnon
94e2016a16 Port updates for removing EULA acceptance checkbox from Arc deployments (#12409)
* controller dropdown field to SQL MIAA and Postgres deployment. (#12217)

* saving first draft

* throw if no controllers

* cleanup

* bug fixes

* bug fixes and caching controller access

* pr comments and bug fixes.

* fixes

* fixes

* comment fix

* remove debug prints

* comment fixes

* remove debug logs

* inputValueTransformer returns string|Promise

* PR feedback

* pr fixes

* remove _ from protected fields

* anonymous to full methods

* small fixes

(cherry picked from commit 9cf80113fc)

* fix option sources (#12387)


(cherry picked from commit fca8b85a72)

* Remove azdata eula acceptance from arc deployments (#12292)

* saving to switch tasks

* activate to exports in extApi

* working version - cleanup pending

* improve messages

* apply pr feedback from a different review

* remove unneeded strings

* redo apiService

* remove async from getVersionFromOutput

* remove _ prefix from protected fields

* error message fix

* throw specific errors from azdata extension

* arrow methods to regular methods

* pr feedback

* expand azdata extension api

* pr feedback

* remove unused var

* pr feedback

(cherry picked from commit ba44a2f02e)

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-09-17 15:05:02 -07:00
Aditya Bist
21bb577da8 fix maximize bug (#12335) 2020-09-17 14:18:53 -07:00
Udeesha Gautam
5e8325ba28 marking intermittent test failure as unstable (#12402) (#12407) 2020-09-17 13:36:13 -07:00
Aasim Khan
25b7ccade3 Added awaits to change column setting (#12315) 2020-09-17 13:28:21 -07:00
Barbara Valdez
57940c581c Update Windows command and minor update to installation cell (#12361) (#12400)
* Fix windows command and minor update to installation cell

* Add expand_section field on the first section of the book
2020-09-17 13:17:50 -07:00
Chris LaFreniere
82f9e4e24b Notebooks: Fast update WYSIWYG support for source update (#12289) (#12399)
* Fast update WYSIWYG support for source update

* Do bracket matching over hardcoding line offsets
2020-09-17 13:17:03 -07:00
Hale Rankin
3e22fcfd2d 12360 Notebook UI - Mac/Win fix for Select all. (#12383) (#12397)
* 12360 Notebook UI - Mac/Win fix for Select all.

* Fix for ctrl key selecting all in windows

* Fix undo as well

* preventDefault to prevent confusing behavior

Co-authored-by: chlafreniere <hichise@gmail.com>

Co-authored-by: chlafreniere <hichise@gmail.com>
2020-09-17 12:18:47 -07:00
Lucy Zhang
0bc81e1078 Fix notebook table rendering with multiple code cells (#12363) (#12391)
* create unique query runner for each cell

* use cellUri instead of cellId to identify runner

* disconnect each query runner connection

* remove queryrunners size check
2020-09-17 10:32:11 -07:00
Barbara Valdez
7b6328dccf Fix highlight issue (#12278) (#12362)
* Fix highlight issue

* Address PR comments
2020-09-16 13:48:14 -07:00
Vasu Bhog
05124273ea Fix Notebook Kusto Kernel Consistency (#12256) (#12352)
* fix kusto notebook consistency

* Address undefined
2020-09-16 12:08:28 -07:00
Lucy Zhang
b1d4444522 Fix notebook cancel query bug (#12300) (#12351)
* fix undefined query runner error

* store connection id

* revert sqlSessionManager change
2020-09-16 12:07:38 -07:00
Alan Ren
4ee2d369cf vbump sql-db-proj extension (#12336) (#12354)
* vbump sql-db-proj extension (#12336)

* update sqlproj dependency version (#12359)

Co-authored-by: Udeesha Gautam <46980425+udeeshagautam@users.noreply.github.com>
2020-09-16 11:47:51 -07:00
Charles Gagnon
fb28b69bb0 Fix component items in declarative table not showing (#12330) (#12331)
(cherry picked from commit 4dd04cb250)
2020-09-16 11:43:01 -07:00
Chris LaFreniere
f2709c7100 Watch for on load event (#12309) (#12346) 2020-09-16 00:43:23 -07:00
Chris LaFreniere
3476f5ae38 Add newline after caption (#12276) (#12340) 2020-09-15 23:07:18 -07:00
Chris LaFreniere
b937fdee7a 12284 Removed custom CSS that positioned editor text beneath overlapping layers. Text is now selectable. (#12312) (#12339)
Co-authored-by: Hale Rankin <harankin@microsoft.com>
2020-09-15 23:04:30 -07:00
Chris LaFreniere
dd9ac2e362 Add headingStyle atx option (#12286) (#12338) 2020-09-15 22:50:46 -07:00
Alan Ren
403ff6cfec remove data-workspace dependency (#12321) (#12327) 2020-09-15 17:05:29 -07:00
Udeesha Gautam
4a6226974e adding icon for add new and open project (#12265) (#12324) 2020-09-15 16:28:42 -07:00
Charles Gagnon
6a2c47f511 Disable resource viewer (#12291) (#12298)
* Disable resource viewer

* comment

* Remove unused

(cherry picked from commit 95b76f08f2)
2020-09-15 16:13:53 -07:00
Aditya Bist
3d9a316f4b bump vscode version (#12258) 2020-09-14 14:53:26 -07:00
6456 changed files with 76242 additions and 442232 deletions


@@ -12,10 +12,6 @@
   {
     "file": "build\\actions\\AutoMerge\\dist\\index.js",
     "_justification": "False positive from webpacked code"
-  },
-  {
-    "file": ".devcontainer\\devcontainer.json",
-    "_justification": "Local development environment - not used in production"
   }
 ]
}

.devcontainer/Dockerfile (new file): 121 lines

@@ -0,0 +1,121 @@
#-------------------------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See https://go.microsoft.com/fwlink/?linkid=2090316 for license information.
#-------------------------------------------------------------------------------------------------------------
FROM mcr.microsoft.com/vscode/devcontainers/typescript-node:0-12
ARG TARGET_DISPLAY=":1"
# VNC options
ARG MAX_VNC_RESOLUTION=1920x1080x16
ARG TARGET_VNC_RESOLUTION=1920x1080
ARG TARGET_VNC_DPI=72
ARG TARGET_VNC_PORT=5901
ARG VNC_PASSWORD="vscode"
# noVNC (VNC web client) options
ARG INSTALL_NOVNC="true"
ARG NOVNC_VERSION=1.1.0
ARG TARGET_NOVNC_PORT=6080
ARG WEBSOCKETIFY_VERSION=0.9.0
# Firefox is useful for testing things like browser launch events, but optional
ARG INSTALL_FIREFOX="false"
# Expected non-root username from base image
ARG USERNAME=node
# Core environment variables for X11, VNC, and fluxbox
ENV DBUS_SESSION_BUS_ADDRESS="autolaunch:" \
MAX_VNC_RESOLUTION="${MAX_VNC_RESOLUTION}" \
VNC_RESOLUTION="${TARGET_VNC_RESOLUTION}" \
VNC_DPI="${TARGET_VNC_DPI}" \
VNC_PORT="${TARGET_VNC_PORT}" \
NOVNC_PORT="${TARGET_NOVNC_PORT}" \
DISPLAY="${TARGET_DISPLAY}" \
LANG="en_US.UTF-8" \
LANGUAGE="en_US.UTF-8" \
VISUAL="nano" \
EDITOR="nano"
# Configure apt and install packages
RUN apt-get update \
&& export DEBIAN_FRONTEND=noninteractive \
#
# Install the Cascadia Code fonts - https://github.com/microsoft/cascadia-code
&& curl -sSL https://github.com/microsoft/cascadia-code/releases/download/v2004.30/CascadiaCode_2004.30.zip -o /tmp/cascadia-fonts.zip \
&& unzip /tmp/cascadia-fonts.zip -d /tmp/cascadia-fonts \
&& mkdir -p /usr/share/fonts/truetype/cascadia \
&& mv /tmp/cascadia-fonts/ttf/* /usr/share/fonts/truetype/cascadia/ \
&& rm -rf /tmp/cascadia-fonts.zip /tmp/cascadia-fonts \
#
# Install X11, fluxbox and VS Code dependencies
&& apt-get -y install --no-install-recommends \
xvfb \
x11vnc \
fluxbox \
dbus-x11 \
x11-utils \
x11-xserver-utils \
xdg-utils \
fbautostart \
xterm \
eterm \
gnome-terminal \
gnome-keyring \
seahorse \
nautilus \
libx11-dev \
libxkbfile-dev \
libsecret-1-dev \
libnotify4 \
libnss3 \
libxss1 \
libasound2 \
xfonts-base \
xfonts-terminus \
fonts-noto \
fonts-wqy-microhei \
fonts-droid-fallback \
vim-tiny \
nano \
#
# [Optional] Install noVNC
&& if [ "${INSTALL_NOVNC}" = "true" ]; then \
mkdir -p /usr/local/novnc \
&& curl -sSL https://github.com/novnc/noVNC/archive/v${NOVNC_VERSION}.zip -o /tmp/novnc-install.zip \
&& unzip /tmp/novnc-install.zip -d /usr/local/novnc \
&& cp /usr/local/novnc/noVNC-${NOVNC_VERSION}/vnc_lite.html /usr/local/novnc/noVNC-${NOVNC_VERSION}/index.html \
&& rm /tmp/novnc-install.zip \
&& curl -sSL https://github.com/novnc/websockify/archive/v${WEBSOCKETIFY_VERSION}.zip -o /tmp/websockify-install.zip \
&& unzip /tmp/websockify-install.zip -d /usr/local/novnc \
&& apt-get -y install --no-install-recommends python-numpy \
&& ln -s /usr/local/novnc/websockify-${WEBSOCKETIFY_VERSION} /usr/local/novnc/noVNC-${NOVNC_VERSION}/utils/websockify \
&& rm /tmp/websockify-install.zip; \
fi \
#
# [Optional] Install Firefox
&& if [ "${INSTALL_FIREFOX}" = "true" ]; then \
apt-get -y install --no-install-recommends firefox-esr; \
fi \
#
# Clean up
&& apt-get autoremove -y \
&& apt-get clean -y \
&& rm -rf /var/lib/apt/lists/*
COPY bin/init-dev-container.sh /usr/local/share/
COPY bin/set-resolution /usr/local/bin/
COPY fluxbox/* /root/.fluxbox/
COPY fluxbox/* /home/${USERNAME}/.fluxbox/
# Update privs, owners of config files
RUN mkdir -p /var/run/dbus /root/.vnc /home/${USERNAME}/.vnc \
&& touch /root/.Xmodmap /home/${USERNAME}/.Xmodmap \
&& echo "${VNC_PASSWORD}" | tee /root/.vnc/passwd > /home/${USERNAME}/.vnc/passwd \
&& chown -R ${USERNAME}:${USERNAME} /home/${USERNAME}/.Xmodmap /home/${USERNAME}/.fluxbox /home/${USERNAME}/.vnc \
&& chmod +x /usr/local/share/init-dev-container.sh /usr/local/bin/set-resolution
ENTRYPOINT ["/usr/local/share/init-dev-container.sh"]
CMD ["sleep", "infinity"]
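For context, a minimal sketch of building this image by hand; it assumes the Dockerfile above is saved at .devcontainer/Dockerfile, and the image tag is a made-up placeholder (any of the ARGs can be overridden the same way):

# Hypothetical local build of the dev-container image (tag name is illustrative)
docker build \
  --build-arg TARGET_VNC_RESOLUTION=1920x1080 \
  --build-arg INSTALL_FIREFOX=true \
  -t code-oss-devcontainer \
  -f .devcontainer/Dockerfile .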


@@ -1,8 +1,8 @@
 # Code - OSS Development Container
-This repository includes configuration for a development container for working with Code - OSS in an isolated local container or using [GitHub Codespaces](https://github.com/features/codespaces).
+This repository includes configuration for a development container for working with Code - OSS in an isolated local container or using [Visual Studio Codespaces](https://aka.ms/vso).
-> **Tip:** The default VNC password is `vscode`. The VNC server runs on port `5901` with a web client at `6080`. For better performance, we recommend using a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Applications like the macOS Screen Sharing app will not perform as well.
+> **Tip:** The default VNC password is `vscode`. The VNC server runs on port `5901` with a web client at `6080`. For better performance, we recommend using a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Applications like the macOS Screen Sharing app will not perform as well. [Chicken](https://sourceforge.net/projects/chicken/) is a good macOS alternative.
 ## Quick start - local
@@ -30,41 +30,25 @@ Anything you start in VS Code or the integrated terminal will appear here.
 Next: **[Try it out!](#try-it)**
-## Quick start - GitHub Codespaces
-> **IMPORTANT:** The current free user beta for GitHub Codespaces uses a "Basic" sized codespace which does not have enough RAM to run a full build of VS Code and will be considerably slower during codespace start and running VS Code. You'll soon be able to use a "Standard" sized codespace (4-core, 8GB) that will be better suited for this purpose (along with even larger sizes should you need it).
-1. From the [microsoft/vscode GitHub repository](https://github.com/microsoft/vscode), click on the **Code** dropdown, select **Open with Codespaces**, and the **New codespace**
-   > Note that you will not see these options if you are not in the beta yet.
-2. After the codespace is up and running in your browser, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and select **View: Show Remote Explorer**.
-3. You should see port `6080` under **Forwarded Ports**. Select the line and click on the globe icon to open it in a browser tab.
-   > If you do not see port `6080`, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd>, select **Forward a Port** and enter port `6080`.
-4. In the new tab, you should see noVNC. Click **Connect** and enter `vscode` as the password.
-Anything you start in VS Code or the integrated terminal will appear here.
-Next: **[Try it out!](#try-it)**
-### Using VS Code with GitHub Codespaces
-You will likely see better performance when accessing the codespace you created from VS Code since you can use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Here's how to do it.
-1. [Create a codespace](#quick-start---github-codespaces) if you have not already.
-2. Set up [VS Code for use with GitHub Codespaces](https://docs.github.com/github/developing-online-with-codespaces/using-codespaces-in-visual-studio-code)
-3. After the VS Code is up and running, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd>, choose **Codespaces: Connect to Codespace**, and select the codespace you created.
-4. After you've connected to the codespace, use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
-5. Anything you start in VS Code or the integrated terminal will appear here.
-Next: **[Try it out!](#try-it)**
+## Quick start - Codespaces
+> Note that the Codespaces browser-based editor cannot currently access the desktop environment in this container (due to a [missing feature](https://github.com/MicrosoftDocs/vsonline/issues/117)). We recommend using Visual Studio Code from the desktop to connect instead in the near term.
+1. Install [Visual Studio Code Stable](https://code.visualstudio.com/) or [Insiders](https://code.visualstudio.com/insiders/) and the [Visual Studio Codespaces](https://aka.ms/vscs-ext-vscode) extension.
+   ![Image of VS Codespaces extension](https://microsoft.github.io/vscode-remote-release/images/codespaces-extn.png)
+   > Note that the Visual Studio Codespaces extension requires the Visual Studio Code distribution of Code - OSS.
+2. Sign in by pressing <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and selecting **Codespaces: Sign In**. You may also need to use the **Codespaces: Create Plan** if you do not have a plan. See the [Codespaces docs](https://aka.ms/vso-docs/vscode) for details.
+3. Press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and select **Codespaces: Create New Codespace**.
+4. Use default settings (which should include **Standard** 4 core, 8 GB RAM Codespace), select a plan, and then enter the repository URL `https://github.com/microsoft/vscode` (or a branch or PR URL) in the input box when prompted.
+5. After the container is running, open a web browser and go to [http://localhost:6080](http://localhost:6080) or use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
+6. Anything you start in VS Code or the integrated terminal will appear here.
 ## Try it!
@@ -81,9 +65,7 @@ To start working with Code - OSS, follow these steps:
    bash scripts/code.sh
    ```
-   Note that a previous run of `yarn install` will already be cached, so this step should simply pick up any recent differences.
-2. After the build is complete, open a web browser or a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to the desktop environment as described in the quick start and enter `vscode` as the password.
+2. After the build is complete, open a web browser and go to [http://localhost:6080](http://localhost:6080) or use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
 3. You should now see Code - OSS!


@@ -0,0 +1,91 @@
#!/bin/bash
NONROOT_USER=node
LOG=/tmp/container-init.log
# Execute the command if not already running
startInBackgroundIfNotRunning()
{
log "Starting $1."
echo -e "\n** $(date) **" | sudoIf tee -a /tmp/$1.log > /dev/null
if ! pidof $1 > /dev/null; then
keepRunningInBackground "$@"
while ! pidof $1 > /dev/null; do
sleep 1
done
log "$1 started."
else
echo "$1 is already running." | sudoIf tee -a /tmp/$1.log > /dev/null
log "$1 is already running."
fi
}
# Keep command running in background
keepRunningInBackground()
{
($2 sh -c "while :; do echo [\$(date)] Process started.; $3; echo [\$(date)] Process exited!; sleep 5; done 2>&1" | sudoIf tee -a /tmp/$1.log > /dev/null & echo "$!" | sudoIf tee /tmp/$1.pid > /dev/null)
}
# Use sudo to run as root when required
sudoIf()
{
if [ "$(id -u)" -ne 0 ]; then
sudo "$@"
else
"$@"
fi
}
# Use sudo to run as non-root user if not already running
sudoUserIf()
{
if [ "$(id -u)" -eq 0 ]; then
sudo -u ${NONROOT_USER} "$@"
else
"$@"
fi
}
# Log messages
log()
{
echo -e "[$(date)] $@" | sudoIf tee -a $LOG > /dev/null
}
log "** SCRIPT START **"
# Start dbus.
log 'Running "/etc/init.d/dbus start".'
if [ -f "/var/run/dbus/pid" ] && ! pidof dbus-daemon > /dev/null; then
sudoIf rm -f /var/run/dbus/pid
fi
sudoIf /etc/init.d/dbus start 2>&1 | sudoIf tee -a /tmp/dbus-daemon-system.log > /dev/null
while ! pidof dbus-daemon > /dev/null; do
sleep 1
done
# Set up Xvfb.
startInBackgroundIfNotRunning "Xvfb" sudoIf "Xvfb ${DISPLAY:-:1} +extension RANDR -screen 0 ${MAX_VNC_RESOLUTION:-1920x1080x16}"
# Start fluxbox as a light weight window manager.
startInBackgroundIfNotRunning "fluxbox" sudoUserIf "dbus-launch startfluxbox"
# Start x11vnc
startInBackgroundIfNotRunning "x11vnc" sudoIf "x11vnc -display ${DISPLAY:-:1} -rfbport ${VNC_PORT:-5901} -localhost -no6 -xkb -shared -forever -passwdfile $HOME/.vnc/passwd"
# Set resolution
/usr/local/bin/set-resolution ${VNC_RESOLUTION:-1280x720} ${VNC_DPI:-72}
# Spin up noVNC if installed and not running.
if [ -d "/usr/local/novnc" ] && [ "$(ps -ef | grep /usr/local/novnc/noVNC*/utils/launch.sh | grep -v grep)" = "" ]; then
keepRunningInBackground "noVNC" sudoIf "/usr/local/novnc/noVNC*/utils/launch.sh --listen ${NOVNC_PORT:-6080} --vnc localhost:${VNC_PORT:-5901}"
log "noVNC started."
else
log "noVNC is already running or not installed."
fi
# Run whatever was passed in
log "Executing \"$@\"."
"$@"
log "** SCRIPT EXIT **"
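As a usage note, the helpers above can supervise additional processes the same way the script starts Xvfb, fluxbox, and x11vnc; the service name and command below are hypothetical:

# Hypothetical example: keep another process running under the non-root user
startInBackgroundIfNotRunning "myservice" sudoUserIf "myservice --port 8080"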


@@ -0,0 +1,25 @@
#!/bin/bash
RESOLUTION=${1:-${VNC_RESOLUTION:-1920x1080}}
DPI=${2:-${VNC_DPI:-72}}
if [ -z "$1" ]; then
echo -e "**Current Settings **\n"
xrandr
echo -n -e "\nEnter new resolution (WIDTHxHEIGHT, blank for ${RESOLUTION}, Ctrl+C to abort).\n> "
read NEW_RES
if [ "${NEW_RES}" != "" ]; then
RESOLUTION=${NEW_RES}
fi
if [ -z "$2" ]; then
echo -n -e "\nEnter new DPI (blank for ${DPI}, Ctrl+C to abort).\n> "
read NEW_DPI
if [ "${NEW_DPI}" != "" ]; then
DPI=${NEW_DPI}
fi
fi
fi
xrandr --fb ${RESOLUTION} --dpi ${DPI} > /dev/null 2>&1
echo -e "\n**New Settings **\n"
xrandr
echo
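For illustration, two ways this helper might be invoked (the values are arbitrary): passing a resolution and DPI applies them directly, while running it with no arguments prints the current settings and prompts for new values:

# Illustrative invocations
/usr/local/bin/set-resolution 1920x1080 96
/usr/local/bin/set-resolution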


@@ -1 +0,0 @@
*.manifest


@@ -1,15 +0,0 @@
#!/bin/bash
# This file establishes a baseline for the repository before any steps in the "prepare.sh"
# are run. It's just a find command that filters out a few things we don't need to watch.
set -e
SCRIPT_PATH="$(cd "$(dirname $0)" && pwd)"
SOURCE_FOLDER="${1:-"."}"
cd "${SOURCE_FOLDER}"
echo "[$(date)] Generating ""before"" manifest..."
find -L . -not -path "*/.git/*" -and -not -path "${SCRIPT_PATH}/*.manifest" -type f > "${SCRIPT_PATH}/before.manifest"
echo "[$(date)] Done!"


@@ -1,28 +0,0 @@
#!/bin/bash
# This file simply wraps the docker build command used to build the image with the
# cached result of the commands from "prepare.sh" and pushes it to the specified
# container image registry.
set -e
SCRIPT_PATH="$(cd "$(dirname $0)" && pwd)"
CONTAINER_IMAGE_REPOSITORY="$1"
BRANCH="${2:-"master"}"
if [ "${CONTAINER_IMAGE_REPOSITORY}" = "" ]; then
echo "Container repository not specified!"
exit 1
fi
TAG="branch-${BRANCH//\//-}"
echo "[$(date)] ${BRANCH} => ${TAG}"
cd "${SCRIPT_PATH}/../.."
echo "[$(date)] Starting image build..."
docker build -t ${CONTAINER_IMAGE_REPOSITORY}:"${TAG}" -f "${SCRIPT_PATH}/cache.Dockerfile" .
echo "[$(date)] Image build complete."
echo "[$(date)] Pushing image..."
docker push ${CONTAINER_IMAGE_REPOSITORY}:"${TAG}"
echo "[$(date)] Done!"
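For illustration, the script expects a container image repository and an optional branch (defaulting to master); the registry below is a fictional placeholder:

# Hypothetical invocation (registry name is made up)
./build-image.sh myregistry.azurecr.io/vscode-devcontainer-cache master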


@@ -1,21 +0,0 @@
#!/bin/bash
# This file is used to archive off a copy of any differences in the source tree into another location
# in the image. Once the codespace is up, this will be restored into its proper location (which is
# quick and happens in parallel with other startup activities)
set -e
SCRIPT_PATH="$(cd "$(dirname $0)" && pwd)"
SOURCE_FOLDER="${1:-"."}"
CACHE_FOLDER="${2:-"/usr/local/etc/devcontainer-cache"}"
echo "[$(date)] Starting cache operation..."
cd "${SOURCE_FOLDER}"
echo "[$(date)] Determining diffs..."
find -L . -not -path "*/.git/*" -and -not -path "${SCRIPT_PATH}/*.manifest" -type f > "${SCRIPT_PATH}/after.manifest"
grep -Fxvf "${SCRIPT_PATH}/before.manifest" "${SCRIPT_PATH}/after.manifest" > "${SCRIPT_PATH}/cache.manifest"
echo "[$(date)] Archiving diffs..."
mkdir -p "${CACHE_FOLDER}"
tar -cf "${CACHE_FOLDER}/cache.tar" --totals --files-from "${SCRIPT_PATH}/cache.manifest"
echo "[$(date)] Done! $(du -h "${CACHE_FOLDER}/cache.tar")"
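To make the caching idea concrete, here is a small self-contained sketch (not from the repository) of the same before/after manifest technique: snapshot the file list, do some work, diff the manifests with grep -Fxvf, and archive only what is new:

# Toy illustration of the manifest-diff caching technique (file names are made up)
find -L . -not -path "*/.git/*" -type f > before.manifest
touch generated-by-build.txt   # stand-in for files produced by a build step
find -L . -not -path "*/.git/*" -type f > after.manifest
grep -Fxvf before.manifest after.manifest > cache.manifest
tar -cf cache.tar --totals --files-from cache.manifest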


@@ -1,14 +0,0 @@
# This dockerfile is used to build up from a base image to create an image with cached results of running "prepare.sh".
# Other image contents: https://github.com/microsoft/vscode-dev-containers/blob/master/repository-containers/images/github.com/microsoft/vscode/.devcontainer/base.Dockerfile
FROM mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:dev
ARG USERNAME=node
COPY --chown=${USERNAME}:${USERNAME} . /repo-source-tmp/
RUN mkdir /usr/local/etc/devcontainer-cache \
&& chown ${USERNAME} /usr/local/etc/devcontainer-cache /repo-source-tmp \
&& su ${USERNAME} -c "\
cd /repo-source-tmp \
&& .devcontainer/cache/before-cache.sh \
&& .devcontainer/prepare.sh \
&& .devcontainer/cache/cache-diff.sh" \
&& rm -rf /repo-source-tmp


@@ -1,23 +0,0 @@
#!/bin/bash
# This file restores the results of the "prepare.sh" into their proper locations
# once the container has been created. It runs as a postCreateCommand which
# in GitHub Codespaces occurs in parallel with other startup activities and does not
# really add to the overall startup time given how quick the operation ends up being.
set -e
SOURCE_FOLDER="$(cd "${1:-"."}" && pwd)"
CACHE_FOLDER="${2:-"/usr/local/etc/devcontainer-cache"}"
if [ ! -d "${CACHE_FOLDER}" ]; then
echo "No cache folder found."
exit 0
fi
echo "[$(date)] Expanding $(du -h "${CACHE_FOLDER}/cache.tar") file to ${SOURCE_FOLDER}..."
cd "${SOURCE_FOLDER}"
tar -xf "${CACHE_FOLDER}/cache.tar"
rm -f "${CACHE_FOLDER}/cache.tar"
echo "[$(date)] Done!"


@@ -1,30 +1,45 @@
 {
   "name": "Code - OSS",
-  // Image contents: https://github.com/microsoft/vscode-dev-containers/blob/master/repository-containers/images/github.com/microsoft/vscode/.devcontainer/base.Dockerfile
-  "image": "mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:branch-master",
-  "workspaceMount": "source=${localWorkspaceFolder},target=/home/node/workspace/vscode,type=bind,consistency=cached",
-  "workspaceFolder": "/home/node/workspace/vscode",
+  "build": {
+    "dockerfile": "Dockerfile",
+    "args": {
+      "MAX_VNC_RESOLUTION": "1920x1080x16",
+      "TARGET_VNC_RESOLUTION": "1280x768",
+      "TARGET_VNC_PORT": "5901",
+      "TARGET_NOVNC_PORT": "6080",
+      "VNC_PASSWORD": "vscode",
+      "INSTALL_FIREFOX": "true"
+    }
+  },
   "overrideCommand": false,
-  "runArgs": [ "--init", "--security-opt", "seccomp=unconfined"],
+  "runArgs": [
+    "--init",
+    // seccomp=unconfined is required for Chrome sandboxing
+    "--security-opt", "seccomp=unconfined"
+  ],
   "settings": {
+    // zsh is also available
     "terminal.integrated.shell.linux": "/bin/bash",
     "resmon.show.battery": false,
-    "resmon.show.cpufreq": false
+    "resmon.show.cpufreq": false,
+    "remote.extensionKind": {
+      "ms-vscode.js-debug-nightly": "workspace",
+      "msjsdiag.debugger-for-chrome": "workspace"
+    },
+    "debug.chrome.useV3": true
   },
-  // noVNC, VNC, debug ports
-  "forwardPorts": [6080, 5901, 9222],
+  // noVNC, VNC ports
+  "forwardPorts": [6080, 5901],
   "extensions": [
     "dbaeumer.vscode-eslint",
-    "mutantdino.resourcemonitor"
+    "EditorConfig.EditorConfig",
+    "msjsdiag.debugger-for-chrome",
+    "mutantdino.resourcemonitor",
+    "GitHub.vscode-pull-request-github"
   ],
+  // Optionally loads a cached yarn install for the repo
+  "postCreateCommand": ".devcontainer/cache/restore-diff.sh",
   "remoteUser": "node"
 }


@@ -0,0 +1,9 @@
[app] (name=code-oss-dev)
[Position] (CENTER) {0 0}
[Maximized] {yes}
[Dimensions] {100% 100%}
[end]
[transient] (role=GtkFileChooserDialog)
[Position] (CENTER) {0 0}
[Dimensions] {70% 70%}
[end]


@@ -0,0 +1,9 @@
session.menuFile: ~/.fluxbox/menu
session.keyFile: ~/.fluxbox/keys
session.styleFile: /usr/share/fluxbox/styles//Squared_for_Debian
session.configVersion: 13
session.screen0.workspaces: 1
session.screen0.workspacewarping: false
session.screen0.toolbar.widthPercent: 100
session.screen0.strftimeFormat: %d %b, %a %02k:%M:%S
session.screen0.toolbar.tools: prevworkspace, workspacename, nextworkspace, clock, prevwindow, nextwindow, iconbar, systemtray


@@ -0,0 +1,16 @@
[begin] ( Code - OSS Development Container )
[exec] (File Manager) { nautilus ~ } <>
[exec] (Terminal) {/usr/bin/gnome-terminal --working-directory=~ } <>
[exec] (Start Code - OSS) { x-terminal-emulator -T "Code - OSS Build" -e bash /workspaces/vscode*/scripts/code.sh } <>
[submenu] (System >) {}
[exec] (Set Resolution) { x-terminal-emulator -T "Set Resolution" -e bash /usr/local/bin/set-resolution } <>
[exec] (Passwords and Keys) { seahorse } <>
[exec] (Top) { x-terminal-emulator -T "Top" -e /usr/bin/top } <>
[exec] (Editres) {editres} <>
[exec] (Xfontsel) {xfontsel} <>
[exec] (Xkill) {xkill} <>
[exec] (Xrefresh) {xrefresh} <>
[end]
[config] (Configuration >)
[workspaces] (Workspaces >)
[end]


@@ -1,10 +0,0 @@
#!/bin/bash
# This file contains the steps that should be run when creating the intermediary image that contains
# contents that should be in the image by default. It will be used to build up from the base image
# to create an image that speeds up first time use of the dev container by "caching" the results
# of these commands. Developers can still run these commands without an issue once the container is
# up, but only differences will be processed which also speeds up the first time these operations occur.
yarn install
yarn electron


@@ -5,7 +5,7 @@
 **/vs/loader.js
 **/insane/**
 **/marked/**
-**/semver/**
+**/markjs/**
 **/test/**/*.js
 **/node_modules/**
 **/vscode-api-tests/testWorkspace/**


@@ -7,8 +7,7 @@
 },
 "plugins": [
   "@typescript-eslint",
-  "jsdoc",
-  "mocha"
+  "jsdoc"
 ],
 "rules": {
   "constructor-super": "warn",
@@ -42,7 +41,6 @@
   "no-var": "warn",
   "jsdoc/no-types": "warn",
   "semi": "off",
-  "mocha/no-exclusive-tests": "warn",
   "@typescript-eslint/semi": "warn",
   "@typescript-eslint/naming-convention": [
     "warn",
@@ -546,8 +544,7 @@
   "vscode-textmate",
   "vscode-oniguruma",
   "iconv-lite-umd",
-  "tas-client-umd",
-  "jschardet"
+  "semver-umd"
 ]
 },
 {
@@ -608,11 +605,7 @@
   "**/{vs,sql}/editor/**",
   "**/{vs,sql}/workbench/{common,browser,electron-sandbox}/**",
   "**/{vs,sql}/workbench/api/{common,browser,electron-sandbox}/**",
-  "**/{vs,sql}/workbench/services/**/{common,browser,electron-sandbox}/**",
-  "vscode-textmate",
-  "vscode-oniguruma",
-  "iconv-lite-umd",
-  "jschardet"
+  "**/{vs,sql}/workbench/services/**/{common,browser,electron-sandbox}/**"
 ]
 },
 {
@@ -740,11 +733,7 @@
   "angular2-grid",
   "html-query-plan",
   "turndown",
-  "mark.js",
-  "vscode-textmate",
-  "vscode-oniguruma",
-  "iconv-lite-umd",
-  "jschardet"
+  "mark.js"
 ]
 },
 {
@@ -773,11 +762,7 @@
   "**/{vs,sql}/workbench/{common,browser,electron-sandbox}/**",
   "**/{vs,sql}/workbench/api/{common,browser,electron-sandbox}/**",
   "**/{vs,sql}/workbench/services/**/{common,browser,electron-sandbox}/**",
-  "**/{vs,sql}/workbench/contrib/**/{common,browser,electron-sandbox}/**",
-  "vscode-textmate",
-  "vscode-oniguruma",
-  "iconv-lite-umd",
-  "jschardet"
+  "**/{vs,sql}/workbench/contrib/**/{common,browser,electron-sandbox}/**"
 ]
 },
 {
@@ -1039,7 +1024,6 @@
   "collapse",
   "create",
   "delete",
-  "discover",
   "dispose",
   "edit",
   "end",

.github/CODEOWNERS (vendored): 16 lines

@@ -1,16 +0,0 @@
# Lines starting with '#' are comments.
# Each line is a file pattern followed by one or more owners.
# Syntax can be found here: https://docs.github.com/free-pro-team@latest/github/creating-cloning-and-archiving-repositories/about-code-owners#codeowners-syntax
/extensions/admin-tool-ext-win @Charles-Gagnon
/extensions/arc/ @Charles-Gagnon
/extensions/azdata/ @Charles-Gagnon
/extensions/big-data-cluster/ @Charles-Gagnon
/extensions/dacpac/ @kisantia
/extensions/query-history/ @Charles-Gagnon
/extensions/resource-deployment/ @Charles-Gagnon
/extensions/schema-compare/ @kisantia
/extensions/sql-database-projects/ @Benjin @kisantia
/extensions/mssql/config.json @Charles-Gagnon @alanrenmsft @kburtram
/src/sql/*.d.ts @alanrenmsft @Charles-Gagnon

.github/commands.yml (vendored): 17 lines

@@ -1,12 +1,11 @@
 {
   perform: true,
-  commands:
-  [
-    {
-      type: "label",
-      name: "Needs Logs",
-      action: "comment",
-      comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.\n\nTo find your logs:\n\n- Open command palette (Click **View** -> **Command Palette**)\n- Run the command: **`Developer: Open Logs Folder`**\n\nThis will open the log file locally. Please include renderer.log",
-    },
-  ],
+  commands: [
+    {
+      type: 'label',
+      name: 'Needs Logs',
+      action: 'comment',
+      comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.\n\nTo find your logs:\n\n- Open command palette (Click **View** -> **Command Palette**)\n- Run the command: **`Developer: Open Logs Folder`**\n\nThis will open the log file locally. Please include renderer.log"
+    }
+  ]
 }


@@ -1,27 +0,0 @@
Needs Logs:
comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.
There are two types of logs to collect:
**Console Logs**
- Open Developer Tools (Help -> Toggle Developer Tools)
- Click the **Console** tab
- Click in the log area and select all text (CTRL+A)
- Save this text into a file named console.log and attach it to this issue.
**Application Logs**
- Open command palette (Click **View** -> **Command Palette**)
- Run the command: **`Developer: Open Logs Folder`**
- This will open the log folder locally. Please zip up this folder and attach it to the issue."


@@ -1,5 +1,5 @@
 {
   perform: true,
   whenCreatedByTeam: true,
-  comment: "Thanks for submitting this issue. Please also check if it is already covered by an existing one, like:\n${potentialDuplicates}",
+  comment: "Thanks for submitting this issue. Please also check if it is already covered by an existing one, like:\n${potentialDuplicates}"
 }


@@ -1,9 +0,0 @@
{
"notebook": [
"claudiaregio",
"rchiodo",
"greazer",
"donjayamanne",
"jilljac"
]
}


@@ -17,51 +17,48 @@ jobs:
       CHILD_CONCURRENCY: "1"
       GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
     steps:
       - uses: actions/checkout@v2.2.0
       # TODO: rename azure-pipelines/linux/xvfb.init to github-actions
       - run: |
           sudo apt-get update
           sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libkrb5-dev # {{SQL CARBON EDIT}} add kerberos dep
           sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
           sudo chmod +x /etc/init.d/xvfb
           sudo update-rc.d xvfb defaults
           sudo service xvfb start
         name: Setup Build Environment
       - uses: actions/setup-node@v1
         with:
-          node-version: 12
+          node-version: 10
       # TODO: cache node modules
-      # Increase timeout to get around latency issues when fetching certain packages
-      - run: |
-          yarn config set network-timeout 300000
-          yarn --frozen-lockfile
+      - run: yarn --frozen-lockfile
         name: Install Dependencies
       - run: yarn electron x64
         name: Download Electron
       - run: yarn gulp hygiene
         name: Run Hygiene Checks
       - run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
         name: Run Strict Compile Options
       # - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
       #   name: Run Monaco Editor Checks
       - run: yarn valid-layers-check
         name: Run Valid Layers Checks
       - run: yarn compile
         name: Compile Sources
       # - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
       #   name: Download Built-in Extensions
       - run: DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests" --coverage --runGlob "**/sql/**/*.test.js"
         name: Run Unit Tests (Electron)
       - run: DISPLAY=:10 ./scripts/test-extensions-unit.sh
         name: Run Extension Unit Tests (Electron)
       # {{SQL CARBON EDIT}} Add coveralls. We merge first to get around issue where parallel builds weren't being combined correctly
       - run: node test/combineCoverage
         name: Combine code coverage files
       - name: Upload Code Coverage
         uses: coverallsapp/github-action@v1.1.1
         with:
           github-token: ${{ secrets.GITHUB_TOKEN }}
-          path-to-lcov: "test/coverage/lcov.info"
+          path-to-lcov: 'test/coverage/lcov.info'
 # Fails with cryptic error (e.g. https://github.com/microsoft/vscode/pull/90292/checks?check_run_id=433681926#step:13:9)
 # - run: DISPLAY=:10 yarn test-browser --browser chromium
@@ -75,34 +72,31 @@ jobs:
       CHILD_CONCURRENCY: "1"
       GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
     steps:
       - uses: actions/checkout@v2.2.0
       - uses: actions/setup-node@v1
         with:
-          node-version: 12
+          node-version: 10
       - uses: actions/setup-python@v1
         with:
-          python-version: "2.x"
+          python-version: '2.x'
-      # Increase timeout to get around latency issues when fetching certain packages
-      - run: |
-          yarn config set network-timeout 300000
-          yarn --frozen-lockfile
+      - run: yarn --frozen-lockfile
         name: Install Dependencies
       - run: yarn electron
         name: Download Electron
       - run: yarn gulp hygiene
         name: Run Hygiene Checks
       - run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
         name: Run Strict Compile Options
       # - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
       #   name: Run Monaco Editor Checks
       - run: yarn valid-layers-check
         name: Run Valid Layers Checks
       - run: yarn compile
         name: Compile Sources
       # - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
       #   name: Download Built-in Extensions
       - run: .\scripts\test.bat --tfs "Unit Tests"
         name: Run Unit Tests (Electron)
 # - run: yarn test-browser --browser chromium {{SQL CARBON EDIT}} disable for now @TODO @anthonydresser
 #   name: Run Unit Tests (Browser)
 # - run: .\scripts\test-integration.bat --tfs "Integration Tests" {{SQL CARBON EDIT}} remove step
@@ -114,31 +108,28 @@ jobs:
       CHILD_CONCURRENCY: "1"
       GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
     steps:
       - uses: actions/checkout@v2.2.0
       - uses: actions/setup-node@v1
         with:
-          node-version: 12
+          node-version: 10
-      # Increase timeout to get around latency issues when fetching certain packages
-      - run: |
-          yarn config set network-timeout 300000
-          yarn --frozen-lockfile
+      - run: yarn --frozen-lockfile
         name: Install Dependencies
       - run: yarn electron x64
         name: Download Electron
       - run: yarn gulp hygiene
         name: Run Hygiene Checks
       - run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
         name: Run Strict Compile Options
       # - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
       #   name: Run Monaco Editor Checks
       - run: yarn valid-layers-check
         name: Run Valid Layers Checks
       - run: yarn compile
         name: Compile Sources
       # - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
       #   name: Download Built-in Extensions
       - run: ./scripts/test.sh --tfs "Unit Tests"
         name: Run Unit Tests (Electron)
 # - run: yarn test-browser --browser chromium --browser webkit
 #   name: Run Unit Tests (Browser)
 # - run: ./scripts/test-integration.sh --tfs "Integration Tests"


@@ -3,42 +3,44 @@ name: "Code Scanning - Action"
 on:
   push:
   schedule:
-    - cron: "0 0 * * 0"
+    - cron: '0 0 * * 0'
 jobs:
   CodeQL-Build:
     strategy:
       fail-fast: false
     # CodeQL runs on ubuntu-latest, windows-latest, and macos-latest
     runs-on: ubuntu-latest
     steps:
       - name: Checkout repository
         uses: actions/checkout@v2
       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL
         uses: github/codeql-action/init@v1
         # Override language selection by uncommenting this and choosing your languages
         # with:
         #   languages: go, javascript, csharp, python, cpp, java
       # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
       # If this step fails, then you should remove it and run the build manually (see below).
       - name: Autobuild
         uses: github/codeql-action/autobuild@v1
       # Command-line programs to run using the OS shell.
       # 📚 https://git.io/JvXDl
       # ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
       #    and modify them (or add more) to build your code if your project
       #    uses a compiled language
       #- run: |
       #   make bootstrap
       #   make release
       - name: Perform CodeQL Analysis
         uses: github/codeql-action/analyze@v1


@@ -1,50 +0,0 @@
name: "Deep Classifier: Runner"
on:
schedule:
- cron: 0 * * * *
repository_dispatch:
types: [trigger-deep-classifier-runner]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: "microsoft/vscode-github-triage-actions"
ref: v40
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Additional Dependencies
# Pulls in a bunch of other packages that arent needed for the rest of the actions
run: npm install @azure/storage-blob@12.1.1
- name: "Run Classifier: Scraper"
uses: ./actions/classifier-deep/apply/fetch-sources
with:
# slightly overlapping to protect against issues slipping through the cracks if a run is delayed
from: 80
until: 5
configPath: classifier
blobContainerName: vscode-issue-classifier
blobStorageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
- name: Set up Python 3.7
uses: actions/setup-python@v1
with:
python-version: 3.7
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install --upgrade numpy scipy scikit-learn joblib nltk simpletransformers torch torchvision
- name: "Run Classifier: Generator"
run: python ./actions/classifier-deep/apply/generate-labels/main.py
- name: "Run Classifier: Labeler"
uses: ./actions/classifier-deep/apply/apply-labels
with:
configPath: classifier
allowLabels: "needs more info|new release"
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}


@@ -1,27 +0,0 @@
name: "Deep Classifier: Scraper"
on:
repository_dispatch:
types: [trigger-deep-classifier-scraper]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: "microsoft/vscode-github-triage-actions"
ref: v40
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Additional Dependencies
# Pulls in a bunch of other packages that arent needed for the rest of the actions
run: npm install @azure/storage-blob@12.1.1
- name: "Run Classifier: Scraper"
uses: ./actions/classifier-deep/train/fetch-issues
with:
blobContainerName: vscode-issue-classifier
blobStorageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
token: ${{secrets.ISSUE_SCRAPER_TOKEN}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}


@@ -1,40 +0,0 @@
name: VS Code Repo Dev Container Cache Image Generation
on:
push:
# Currently doing this for master, but could be done for PRs as well
branches:
- "master"
# Only updates to these files result in changes to installed packages, so skip otherwise
paths:
- "**/package-lock.json"
- "**/yarn.lock"
jobs:
devcontainer:
name: Generate cache image
runs-on: ubuntu-latest
steps:
- name: Checkout
id: checkout
uses: actions/checkout@v2
- name: Azure CLI login
id: az_login
uses: azure/login@v1
with:
creds: ${{ secrets.AZ_ACR_CREDS }}
- name: Build and push
id: build_and_push
run: |
set -e
ACR_REGISTRY_NAME=$(echo ${{ secrets.CONTAINER_IMAGE_REGISTRY }} | grep -oP '(.+)(?=\.azurecr\.io)')
az acr login --name $ACR_REGISTRY_NAME
GIT_BRANCH=$(echo "${{ github.ref }}" | grep -oP 'refs/(heads|tags)/\K(.+)')
if [ "$GIT_BRANCH" == "" ]; then GIT_BRANCH=master; fi
.devcontainer/cache/build-cache-image.sh "${{ secrets.CONTAINER_IMAGE_REGISTRY }}/public/vscode/devcontainers/repos/microsoft/vscode" "${GIT_BRANCH}"
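The build-and-push script above leans on `grep -oP` lookaheads to pull the ACR registry name out of the registry hostname and the branch name out of `github.ref`. For readers less familiar with that Perl-regex syntax, here is a rough, hypothetical TypeScript equivalent of the two extractions; the input values are made up for illustration, since the workflow itself reads them from secrets and the Actions context:

```ts
// Stand-ins for the workflow's secret/context values (illustrative only).
const containerImageRegistry = "exampleregistry.azurecr.io";
const githubRef = "refs/heads/master";

// Mirrors: grep -oP '(.+)(?=\.azurecr\.io)'  -> everything before ".azurecr.io"
const acrRegistryName = containerImageRegistry.match(/.+(?=\.azurecr\.io)/)?.[0] ?? "";

// Mirrors: grep -oP 'refs/(heads|tags)/\K(.+)'  -> the branch or tag name after the prefix,
// with the same fallback to "master" the shell script applies when nothing matches.
const gitBranch = githubRef.match(/refs\/(?:heads|tags)\/(.+)/)?.[1] ?? "master";

console.log(acrRegistryName, gitBranch); // "exampleregistry" "master"
```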


@@ -1,27 +0,0 @@
name: Latest Release Monitor
on:
schedule:
- cron: 0/5 * * * *
repository_dispatch:
types: [trigger-latest-release-monitor]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: "microsoft/vscode-github-triage-actions"
path: ./actions
ref: v40
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Storage Module
run: npm install @azure/storage-blob@12.1.1
- name: Run Latest Release Monitor
uses: ./actions/latest-release-monitor
with:
storageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}


@@ -1,15 +0,0 @@
name: On Label
on:
issues:
types: [labeled]
jobs:
processLabelAction:
name: Process Label Action
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Process Label Action
uses: hramos/label-actions@v1
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}

.nvmrc Normal file

@@ -0,0 +1 @@
10

.vscode/launch.json vendored

@@ -41,7 +41,10 @@
"port": 5876, "port": 5876,
"outFiles": [ "outFiles": [
"${workspaceFolder}/out/**/*.js" "${workspaceFolder}/out/**/*.js"
] ],
"presentation": {
"hidden": true,
}
}, },
{ {
"type": "node", "type": "node",
@@ -138,12 +141,9 @@
        }
      },
      {
-       "type": "pwa-chrome",
+       "type": "chrome",
        "request": "launch",
-       "outFiles": [],
-       "perScriptSourcemaps": "yes",
-       "name": "VS Code (Web, Chrome)",
+       "name": "Launch ADS (Web, Chrome) (TBD)",
+       "outFiles": [],
        "url": "http://localhost:8080",
        "preLaunchTask": "Run web",
        "presentation": {
@@ -154,8 +154,6 @@
      {
        "type": "pwa-msedge",
        "request": "launch",
-       "outFiles": [],
-       "perScriptSourcemaps": "yes",
        "name": "VS Code (Web, Edge)",
        "url": "http://localhost:8080",
        "pauseForSourceMap": false,
@@ -195,7 +193,7 @@
        }
      },
      {
-       "type": "pwa-node",
+       "type": "node",
        "request": "launch",
        "name": "Run Unit Tests",
        "program": "${workspaceFolder}/test/unit/electron/index.js",
@@ -214,41 +212,6 @@
"outFiles": [ "outFiles": [
"${workspaceFolder}/out/**/*.js" "${workspaceFolder}/out/**/*.js"
], ],
"cascadeTerminateToConfigurations": [
"Attach to VS Code"
],
"env": {
"MOCHA_COLORS": "true"
},
"presentation": {
"hidden": true
}
},
{
"type": "pwa-node",
"request": "launch",
"name": "Run Unit Tests For Current File",
"program": "${workspaceFolder}/test/unit/electron/index.js",
"runtimeExecutable": "${workspaceFolder}/.build/electron/Code - OSS.app/Contents/MacOS/Electron",
"windows": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/Code - OSS.exe"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/code-oss"
},
"cascadeTerminateToConfigurations": [
"Attach to VS Code"
],
"outputCapture": "std",
"args": [
"--remote-debugging-port=9222",
"--run",
"${relativeFile}"
],
"cwd": "${workspaceFolder}",
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"env": { "env": {
"MOCHA_COLORS": "true" "MOCHA_COLORS": "true"
}, },
@@ -352,17 +315,6 @@
"group": "1_vscode", "group": "1_vscode",
"order": 2 "order": 2
} }
},
{
"name": "Debug Unit Tests (Current File)",
"configurations": [
"Attach to VS Code",
"Run Unit Tests For Current File"
],
"presentation": {
"group": "1_vscode",
"order": 2
}
    }
  ]
}


@@ -8,7 +8,7 @@
  {
    "kind": 2,
    "language": "github-issues",
-   "value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"November 2020\"",
+   "value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"August 2020\"",
    "editable": true
  },
  {


@@ -1,110 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "#### Macros",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server\n\n$MILESTONE=milestone:\"November 2020\"",
"editable": false
},
{
"kind": 1,
"language": "markdown",
"value": "# Preparation",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Pull Requests on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:pr is:open",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Feature Requests Missing Labels",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Testing",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:open label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed label:feature-request label:verification-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Verification",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Candidates",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE label:candidate",
"editable": true
}
]


@@ -1,767 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "## Config",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$since=2020-10-01",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode\n\nQuery exceeds the maximum result. Run the query manually: `is:issue is:open closed:>2020-10-01`",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "//repo:microsoft/vscode is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "//repo:microsoft/vscode is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-remote-release",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-remote-release is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-remote-release is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-editor",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-docs",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-docs is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-docs is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-js-debug",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-js-debug is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-js-debug is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# language-server-protocol",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/language-server-protocol is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/language-server-protocol is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-eslint",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-eslint is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-eslint is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-css-languageservice",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-css-languageservice is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-css-languageservice is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-test",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-test is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-test is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-pull-request-github"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-pull-request-github is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-test is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-chrome-debug (deprecated)",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-chrome-debug is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-chrome-debug is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-chrome-debug-core",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-chrome-debug-core is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-chrome-debug-core is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-debugadapter-node",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-debugadapter-node is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-debugadapter-node is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-emmet-helper",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-emmet-helper is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-emmet-helper is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-extension-vscode\n\nDeprecated",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-vscode is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-vscode is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-extension-samples",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-samples is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-samples is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-filewatcher-windows",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-filewatcher-windows is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-filewatcher-windows is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-generator-code",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-generator-code is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-generator-code is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-html-languageservice",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-html-languageservice is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-html-languageservice is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-jshint",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-jshint is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-jshint is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-json-languageservice",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-json-languageservice is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-json-languageservice is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-languageserver-node",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-languageserver-node is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-languageserver-node is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-loader",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-loader is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-loader is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-mono-debug",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-mono-debug is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-mono-debug is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-node-debug",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-node-debug2",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug2 is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug2 is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-recipes",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-recipes is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-recipes is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-textmate",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-textmate is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-textmate is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-themes",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-themes is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-themes is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-vsce",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-vsce is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-vsce is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-website",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-website is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-website is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-windows-process-tree",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-windows-process-tree is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-windows-process-tree is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# debug-adapter-protocol",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/debug-adapter-protocol is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/debug-adapter-protocol is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# inno-updater",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/inno-updater is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/inno-updater is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# language-server-protocol-inspector",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/language-server-protocol-inspector is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/language-server-protocol-inspector is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-languages",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-languages is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-languages is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-typescript",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-typescript is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-typescript is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-css",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-css is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-css is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-json",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-json is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-json is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-html",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-html is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-html is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-editor-webpack-plugin",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor-webpack-plugin is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor-webpack-plugin is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# node-jsonc-parser",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/node-jsonc-parser is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/node-jsonc-parser is:issue created:>$since",
"editable": true
}
]


@@ -1,26 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "### Categorizing Issues\n\nEach issue must have a type label. Most type labels are grey, some are yellow. Bugs are grey with a touch of red.",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open is:issue assignee:@me -label:\"needs more info\" -label:bug -label:feature-request -label:under-discussion -label:debt -label:*question -label:upstream -label:electron -label:engineering -label:plan-item ",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "### Feature Areas\n\nEach issue should be assigned to a feature area",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open is:issue assignee:@me -label:L10N -label:VIM -label:api -label:api-finalization -label:api-proposal -label:authentication -label:breadcrumbs -label:callhierarchy -label:code-lens -label:color-palette -label:comments -label:config -label:context-keys -label:css-less-scss -label:custom-editors -label:debug -label:debug-console -label:dialogs -label:diff-editor -label:dropdown -label:editor -label:editor-RTL -label:editor-autoclosing -label:editor-autoindent -label:editor-bracket-matching -label:editor-clipboard -label:editor-code-actions -label:editor-color-picker -label:editor-columnselect -label:editor-commands -label:editor-comments -label:editor-contrib -label:editor-core -label:editor-drag-and-drop -label:editor-error-widget -label:editor-find -label:editor-folding -label:editor-highlight -label:editor-hover -label:editor-indent-detection -label:editor-indent-guides -label:editor-input -label:editor-input-IME -label:editor-insets -label:editor-minimap -label:editor-multicursor -label:editor-parameter-hints -label:editor-render-whitespace -label:editor-rendering -label:editor-scrollbar -label:editor-symbols -label:editor-synced-region -label:editor-textbuffer -label:editor-theming -label:editor-wordnav -label:editor-wrapping -label:emmet -label:error-list -label:explorer-custom -label:extension-host -label:extension-recommendations -label:extensions -label:extensions-development -label:file-decorations -label:file-encoding -label:file-explorer -label:file-glob -label:file-guess-encoding -label:file-io -label:file-watcher -label:font-rendering -label:formatting -label:git -label:github -label:gpu -label:grammar -label:grid-view -label:html -label:i18n -label:icon-brand -label:icons-product -label:install-update -label:integrated-terminal -label:integrated-terminal-conpty -label:integrated-terminal-links -label:integrated-terminal-rendering -label:integrated-terminal-winpty -label:intellisense-config -label:ipc -label:issue-bot -label:issue-reporter -label:javascript -label:json -label:keybindings -label:keybindings-editor -label:keyboard-layout -label:label-provider -label:languages-basic -label:languages-diagnostics -label:languages-guessing -label:layout -label:lcd-text-rendering -label:list -label:log -label:markdown -label:marketplace -label:menus -label:merge-conflict -label:notebook -label:outline -label:output -label:perf -label:perf-bloat -label:perf-startup -label:php -label:portable-mode -label:proxy -label:quick-pick -label:references-viewlet -label:release-notes -label:remote -label:remote-explorer -label:rename -label:sandbox -label:scm -label:screencast-mode -label:search -label:search-api -label:search-editor -label:search-replace -label:semantic-tokens -label:settings-editor -label:settings-sync -label:settings-sync-server -label:shared-process -label:simple-file-dialog -label:smart-select -label:snap -label:snippets -label:splitview -label:suggest -label:sync-error-handling -label:tasks -label:telemetry -label:themes -label:timeline -label:timeline-git -label:titlebar -label:tokenization -label:touch/pointer -label:trackpad/scroll -label:tree -label:typescript -label:undo-redo -label:uri -label:ux -label:variable-resolving -label:vscode-build -label:vscode-website -label:web -label:webview -label:workbench-actions -label:workbench-cli -label:workbench-diagnostics -label:workbench-dnd -label:workbench-editor-grid -label:workbench-editors -label:workbench-electron -label:workbench-feedback -label:workbench-history 
-label:workbench-hot-exit -label:workbench-hover -label:workbench-launch -label:workbench-link -label:workbench-multiroot -label:workbench-notifications -label:workbench-os-integration -label:workbench-rapid-render -label:workbench-run-as-admin -label:workbench-state -label:workbench-status -label:workbench-tabs -label:workbench-touchbar -label:workbench-views -label:workbench-welcome -label:workbench-window -label:workbench-zen -label:workspace-edit -label:workspace-symbols -label:zoom",
"editable": true
}
]


@@ -1,16 +1,4 @@
[
{
"kind": 1,
"language": "markdown",
"value": "## tl;dr: Triage Inbox\n\nAll inbox issues but not those that need more information. These issues need to be triaged, e.g assigned to a user or ask for more information",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox -label:\"needs more info\"",
"editable": true
},
  {
    "kind": 1,
    "language": "markdown",
@@ -18,9 +6,9 @@
"editable": true "editable": true
}, },
{ {
"kind": 1, "kind": 2,
"language": "markdown", "language": "github-issues",
"value": "New issues or pull requests submitted by the community are initially triaged by an [automatic classification bot](https://github.com/microsoft/vscode-github-triage-actions/tree/master/classifier-deep). Issues that the bot does not correctly triage are then triaged by a team member. The team rotates the inbox tracker on a weekly basis.\n\nA [mirror](https://github.com/JacksonKearl/testissues/issues) of the VS Code issue stream is available with details about how the bot classifies issues, including feature-area classifications and confidence ratings. Per-category confidence thresholds and feature-area ownership data is maintained in [.github/classifier.json](https://github.com/microsoft/vscode/blob/master/.github/classifier.json). \n\n💡 The bot is being run through a GitHub action that runs every 30 minutes. Give the bot the opportunity to classify an issue before doing it manually.\n\n### Inbox Tracking\n\nThe inbox tracker is responsible for the [global inbox](https://github.com/Microsoft/vscode/issues?utf8=%E2%9C%93&q=is%3Aopen+no%3Aassignee+-label%3Afeature-request+-label%3Atestplan-item+-label%3Aplan-item) containing all **open issues and pull requests** that\n- are neither **feature requests** nor **test plan items** nor **plan items** and\n- have **no owner assignment**.\n\nThe **inbox tracker** may perform any step described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) but its main responsibility is to route issues to the actual feature area owner.\n\nFeature area owners track the **feature area inbox** containing all **open issues and pull requests** that\n- are personally assigned to them and are not assigned to any milestone\n- are labeled with their feature area label and are not assigned to any milestone.\nThis secondary triage may involve any of the steps described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) and results in a fully triaged or closed issue.\n\nThe [github triage extension](https://github.com/microsoft/vscode-github-triage-extension) can be used to assist with triaging — it provides a \"Command Palette\"-style list of triaging actions like assignment, labeling, and triggers for various bot actions.", "value": "$inbox=repo:microsoft/vscode is:open no:assignee -label:feature-request -label:testplan-item -label:plan-item ",
"editable": true "editable": true
}, },
{ {
@@ -32,19 +20,31 @@
  {
    "kind": 1,
    "language": "markdown",
"value": "New issues or pull requests submitted by the community are initially triaged by an [automatic classification bot](https://github.com/microsoft/vscode-github-triage-actions/tree/master/classifier-deep). Issues that the bot does not correctly triage are then triaged by a team member. The team rotates the inbox tracker on a weekly basis.\n\nA [mirror](https://github.com/JacksonKearl/testissues/issues) of the VS Code issue stream is available with details about how the bot classifies issues, including feature-area classifications and confidence ratings. Per-category confidence thresholds and feature-area ownership data is maintained in [.github/classifier.json](https://github.com/microsoft/vscode/blob/master/.github/classifier.json). \n\n💡 The bot is being run through a GitHub action that runs every 30 minutes. Give the bot the opportunity to classify an issue before doing it manually.\n\n### Inbox Tracking\n\nThe inbox tracker is responsible for the [global inbox](https://github.com/microsoft/vscode/issues?utf8=%E2%9C%93&q=is%3Aopen+no%3Aassignee+-label%3Afeature-request+-label%3Atestplan-item+-label%3Aplan-item) containing all **open issues and pull requests** that\n- are neither **feature requests** nor **test plan items** nor **plan items** and\n- have **no owner assignment**.\n\nThe **inbox tracker** may perform any step described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) but its main responsibility is to route issues to the actual feature area owner.\n\nFeature area owners track the **feature area inbox** containing all **open issues and pull requests** that\n- are personally assigned to them and are not assigned to any milestone\n- are labeled with their feature area label and are not assigned to any milestone.\nThis secondary triage may involve any of the steps described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) and results in a fully triaged or closed issue.\n\nThe [github triage extension](https://github.com/microsoft/vscode-github-triage-extension) can be used to assist with triaging — it provides a \"Command Palette\"-style list of triaging actions like assignment, labeling, and triggers for various bot actions.", "value": "New issues or pull requests submitted by the community are initially triaged by an [automatic classification bot](https://github.com/microsoft/vscode-github-triage-actions/tree/master/classifier-deep). Issues that the bot does not correctly triage are then triaged by a team member. The team rotates the inbox tracker on a weekly basis.\n\nA [mirror](https://github.com/JacksonKearl/testissues/issues) of the VS Code issue stream is available with details about how the bot classifies issues, including feature-area classifications and confidence ratings. Per-category confidence thresholds and feature-area ownership data is maintained in [.github/classifier.json](https://github.com/microsoft/vscode/blob/master/.github/classifier.json). \n\n💡 The bot is being run through a GitHub action that runs every 30 minutes. 
Give the bot the opportunity to classify an issue before doing it manually.\n\n### Inbox Tracking\n\nThe inbox tracker is responsible for the [global inbox](https://github.com/Microsoft/vscode/issues?utf8=%E2%9C%93&q=is%3Aopen+no%3Aassignee+-label%3Afeature-request+-label%3Atestplan-item+-label%3Aplan-item) containing all **open issues and pull requests** that\n- are neither **feature requests** nor **test plan items** nor **plan items** and\n- have **no owner assignment**.\n\nThe **inbox tracker** may perform any step described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) but its main responsibility is to route issues to the actual feature area owner.\n\nFeature area owners track the **feature area inbox** containing all **open issues and pull requests** that\n- are personally assigned to them and are not assigned to any milestone\n- are labeled with their feature area label and are not assigned to any milestone.\nThis secondary triage may involve any of the steps described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) and results in a fully triaged or closed issue.\n\nThe [github triage extension](https://github.com/microsoft/vscode-github-triage-extension) can be used to assist with triaging — it provides a \"Command Palette\"-style list of triaging actions like assignment, labeling, and triggers for various bot actions.",
"editable": true "editable": true
}, },
{ {
"kind": 1, "kind": 1,
"language": "markdown", "language": "markdown",
"value": "## All Inbox Items\n\nAll issues that have no assignee and that have neither **feature requests** nor **test plan items** nor **plan items**.", "value": "## Triage Inbox\n\nAll inbox issues but not those that need more information. These issues need to be triaged, e.g assigned to a user or ask for more information",
"editable": true "editable": true
}, },
{ {
"kind": 2, "kind": 2,
"language": "github-issues", "language": "github-issues",
"value": "$inbox", "value": "$inbox -label:\"needs more info\" -label:emmet",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Inbox\n\nAll issues that have no assignee and that have neither **feature requests** nor **test plan items** nor **plan items**.",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox -label:emmet",
"editable": true "editable": true
} }
] ]


@@ -1,206 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "#### Macros",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server\n\n$MILESTONE=milestone:\"November 2020\"\n\n$MINE=assignee:@me",
"editable": false
},
{
"kind": 1,
"language": "markdown",
"value": "# Preparation",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Pull Requests on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:pr is:open",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Feature Requests Missing Labels",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:open author:@me label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request label:verification-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Testing",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -assignee:@me -label:verified label:feature-request label:verification-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Fixing",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Bugs",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Verification",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## My Issues (verification-steps-needed)",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug label:verification-steps-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## My Issues (verification-found)",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Issues filed by me",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed author:@me sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Issues filed by others",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -author:@me sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Release Notes",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode $MILESTONE is:issue is:closed label:feature-request -label:on-release-notes",
"editable": true
}
]


@@ -8,7 +8,7 @@
  {
    "kind": 2,
    "language": "github-issues",
-   "value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-internalbacklog\n\n// current milestone name\n$milestone=milestone:\"November 2020\"",
+   "value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks\n\n// current milestone name\n$milestone=milestone:\"August 2020\"",
    "editable": true
  },
  {


@@ -14,7 +14,7 @@
  {
    "kind": 2,
    "language": "github-issues",
-   "value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"October 2020\"",
+   "value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"July 2020\"",
    "editable": true
  },
  {
@@ -38,7 +38,7 @@
  {
    "kind": 2,
    "language": "github-issues",
-   "value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate -author:@me -assignee:@me label:bug -label:verified -author:@me -author:aeschli -author:alexdima -author:alexr00 -author:bpasero -author:chrisdias -author:chrmarti -author:connor4312 -author:dbaeumer -author:deepak1556 -author:eamodio -author:egamma -author:gregvanl -author:isidorn -author:JacksonKearl -author:joaomoreno -author:jrieken -author:lramos15 -author:lszomoru -author:meganrogge -author:misolori -author:mjbvz -author:rebornix -author:RMacfarlane -author:roblourens -author:sana-ajani -author:sandy081 -author:sbatten -author:Tyriar -author:weinand",
+   "value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate -author:@me -assignee:@me label:bug -label:verified -author:@me -author:aeschli -author:alexdima -author:alexr00 -author:bpasero -author:chrisdias -author:chrmarti -author:connor4312 -author:dbaeumer -author:deepak1556 -author:eamodio -author:egamma -author:gregvanl -author:isidorn -author:JacksonKearl -author:joaomoreno -author:jrieken -author:lramos15 -author:lszomoru -author:misolori -author:mjbvz -author:rebornix -author:RMacfarlane -author:roblourens -author:sana-ajani -author:sandy081 -author:sbatten -author:Tyriar -author:weinand",
    "editable": false
  },
  {


@@ -1,101 +0,0 @@
# Query: .innerHTML =
# Flags: CaseSensitive WordMatch
# Including: src/vs/**/*.{t,j}s
# Excluding: *.test.ts, **/test/**
# ContextLines: 3
12 results - 9 files
src/vs/base/browser/dom.ts:
1359 );
1360
1361 const html = _ttpSafeInnerHtml?.createHTML(value, options) ?? insane(value, options);
1362: node.innerHTML = html as unknown as string;
1363 }
src/vs/base/browser/markdownRenderer.ts:
272 };
273
274 if (_ttpInsane) {
275: element.innerHTML = _ttpInsane.createHTML(renderedMarkdown, insaneOptions) as unknown as string;
276 } else {
277: element.innerHTML = insane(renderedMarkdown, insaneOptions);
278 }
279
280 // signal that async code blocks can be now be inserted
src/vs/editor/browser/core/markdownRenderer.ts:
88
89 const element = document.createElement('span');
90
91: element.innerHTML = MarkdownRenderer._ttpTokenizer
92 ? MarkdownRenderer._ttpTokenizer.createHTML(value, tokenization) as unknown as string
93 : tokenizeToString(value, tokenization);
94
src/vs/editor/browser/view/domLineBreaksComputer.ts:
107 allCharOffsets[i] = tmp[0];
108 allVisibleColumns[i] = tmp[1];
109 }
110: containerDomNode.innerHTML = sb.build();
111
112 containerDomNode.style.position = 'absolute';
113 containerDomNode.style.top = '10000';
src/vs/editor/browser/view/viewLayer.ts:
512 }
513 const lastChild = <HTMLElement>this.domNode.lastChild;
514 if (domNodeIsEmpty || !lastChild) {
515: this.domNode.innerHTML = newLinesHTML;
516 } else {
517 lastChild.insertAdjacentHTML('afterend', newLinesHTML);
518 }
533 if (ViewLayerRenderer._ttPolicy) {
534 invalidLinesHTML = ViewLayerRenderer._ttPolicy.createHTML(invalidLinesHTML) as unknown as string;
535 }
536: hugeDomNode.innerHTML = invalidLinesHTML;
537
538 for (let i = 0; i < ctx.linesLength; i++) {
539 const line = ctx.lines[i];
src/vs/editor/browser/widget/diffEditorWidget.ts:
2157
2158 let domNode = document.createElement('div');
2159 domNode.className = `view-lines line-delete ${MOUSE_CURSOR_TEXT_CSS_CLASS_NAME}`;
2160: domNode.innerHTML = sb.build();
2161 Configuration.applyFontInfoSlow(domNode, fontInfo);
2162
2163 let marginDomNode = document.createElement('div');
2164 marginDomNode.className = 'inline-deleted-margin-view-zone';
2165: marginDomNode.innerHTML = marginHTML.join('');
2166 Configuration.applyFontInfoSlow(marginDomNode, fontInfo);
2167
2168 return {
src/vs/editor/standalone/browser/colorizer.ts:
40 let text = domNode.firstChild ? domNode.firstChild.nodeValue : '';
41 domNode.className += ' ' + theme;
42 let render = (str: string) => {
43: domNode.innerHTML = str;
44 };
45 return this.colorize(modeService, text || '', mimeType, options).then(render, (err) => console.error(err));
46 }
src/vs/workbench/contrib/notebook/browser/view/renderers/cellRenderer.ts:
580 const element = DOM.$('div', { style });
581
582 const linesHtml = this.getRichTextLinesAsHtml(model, modelRange, colorMap);
583: element.innerHTML = linesHtml as unknown as string;
584 return element;
585 }
586
src/vs/workbench/contrib/notebook/browser/view/renderers/webviewPreloads.ts:
375 addMouseoverListeners(outputNode, outputId);
376 const content = data.content;
377 if (content.type === RenderOutputType.Html) {
378: outputNode.innerHTML = content.htmlContent;
379 cellOutputContainer.appendChild(outputNode);
380 domEval(outputNode);
381 } else if (preloadErrs.some(e => !!e)) {
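The hits above (dom.ts, markdownRenderer.ts, viewLayer.ts, and the notebook renderers) share one pattern: the HTML is produced by a sanitizer or tokenizer and, when a Trusted Types policy object is available, wrapped with `createHTML` before being assigned to `innerHTML`. A simplified sketch of that guard follows; the policy name and the placeholder `sanitize` function are inventions for illustration, not the repository's actual implementations:

```ts
// Placeholder sanitizer, standing in for the sanitizers used in the hits above.
// Deliberately naive -- for illustration only, not a real sanitizer.
const sanitize = (html: string): string => html.replace(/<script[\s\S]*?<\/script>/gi, '');

// Create a Trusted Types policy when the browser exposes the API (typed loosely here).
const ttPolicy = (window as any).trustedTypes?.createPolicy('exampleSafeInnerHtml', {
  createHTML: (value: string) => sanitize(value),
});

function setSafeInnerHtml(node: HTMLElement, value: string): void {
  // Prefer the policy when Trusted Types is enforced; otherwise fall back to the plain sanitizer.
  const html = ttPolicy?.createHTML(value) ?? sanitize(value);
  node.innerHTML = html as unknown as string;
}
```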

.vscode/searches/es6.code-search vendored Normal file

@@ -0,0 +1,73 @@
# Query: @deprecated ES6
# Flags: CaseSensitive WordMatch
# ContextLines: 2
14 results - 4 files
src/vs/base/browser/dom.ts:
81 };
82
83: /** @deprecated ES6 - use classList*/
84 export const hasClass: (node: HTMLElement | SVGElement, className: string) => boolean = _classList.hasClass.bind(_classList);
85: /** @deprecated ES6 - use classList*/
86 export const addClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.addClass.bind(_classList);
87: /** @deprecated ES6 - use classList*/
88 export const addClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.addClasses.bind(_classList);
89: /** @deprecated ES6 - use classList*/
90 export const removeClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.removeClass.bind(_classList);
91: /** @deprecated ES6 - use classList*/
92 export const removeClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.removeClasses.bind(_classList);
93: /** @deprecated ES6 - use classList*/
94 export const toggleClass: (node: HTMLElement | SVGElement, className: string, shouldHaveIt?: boolean) => void = _classList.toggleClass.bind(_classList);
95
src/vs/base/common/arrays.ts:
401
402 /**
403: * @deprecated ES6: use `Array.findIndex`
404 */
405 export function firstIndex<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean): number {
417
418 /**
419: * @deprecated ES6: use `Array.find`
420 */
421 export function first<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean, notFoundValue: T): T;
568
569 /**
570: * @deprecated ES6: use `Array.find`
571 */
572 export function find<T>(arr: ArrayLike<T>, predicate: (value: T, index: number, arr: ArrayLike<T>) => any): T | undefined {
src/vs/base/common/objects.ts:
115
116 /**
117: * @deprecated ES6
118 */
119 export function assign<T>(destination: T): T;
src/vs/base/common/strings.ts:
15
16 /**
17: * @deprecated ES6: use `String.padStart`
18 */
19 export function pad(n: number, l: number, char: string = '0'): string {
146
147 /**
148: * @deprecated ES6: use `String.startsWith`
149 */
150 export function startsWith(haystack: string, needle: string): boolean {
167
168 /**
169: * @deprecated ES6: use `String.endsWith`
170 */
171 export function endsWith(haystack: string, needle: string): boolean {
861
862 /**
863: * @deprecated ES6
864 */
865 export function repeat(s: string, count: number): string {
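Every helper flagged in this search has a native ES6+ replacement, which is exactly what the deprecation comments point to. A short illustrative sketch of those replacements (example code, not taken from the repository):

```ts
// dom.ts hasClass/addClass/removeClass/toggleClass -> Element.classList
const node = document.createElement('div');
node.classList.add('monaco-editor');
node.classList.toggle('focused', true);
const focused = node.classList.contains('focused');

// arrays.ts firstIndex/first/find -> Array.prototype.findIndex / Array.prototype.find
const items = [1, 2, 3, 4];
const firstEven = items.find(n => n % 2 === 0);           // 2
const firstEvenIndex = items.findIndex(n => n % 2 === 0); // 1

// strings.ts pad/startsWith/endsWith/repeat -> native String methods
const padded = '7'.padStart(2, '0');                      // "07"
const matchesExt = 'foo.ts'.startsWith('foo') && 'foo.ts'.endsWith('.ts');
const rule = '-'.repeat(10);

// objects.ts assign -> Object.assign or object spread
const merged = { ...{ a: 1 }, ...{ b: 2 } };

console.log(focused, firstEven, firstEvenIndex, padded, matchesExt, rule, merged);
```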


@@ -2,52 +2,18 @@
# Flags: RegExp
# ContextLines: 2

-8 results - 4 files
+2 results - 2 files

src/vs/base/browser/ui/tree/asyncDataTree.ts:
-  241    } : () => 'treeitem',
-  242    isChecked: options.accessibilityProvider!.isChecked ? (e) => {
-  243:     return !!(options.accessibilityProvider?.isChecked!(e.element as T));
-  244    } : undefined,
-  245    getAriaLabel(e) {
+  243    } : () => 'treeitem',
+  244    isChecked: options.accessibilityProvider!.isChecked ? (e) => {
+  245:     return !!(options.accessibilityProvider?.isChecked!(e.element as T));
+  246    } : undefined,
+  247    getAriaLabel(e) {

-src/vs/platform/list/browser/listService.ts:
-  463
-  464    if (typeof options?.openOnSingleClick !== 'boolean' && options?.configurationService) {
-  465:     this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
-  466      this._register(options?.configurationService.onDidChangeConfiguration(() => {
-  467:       this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
-  468    }));
-  469    } else {
+src/vs/workbench/contrib/debug/browser/debugConfigurationManager.ts:
+  254
+  255    return debugDynamicExtensions.map(e => {
+  256:     const type = e.contributes?.debuggers![0].type!;
+  257      return {
+  258        label: this.getDebuggerLabel(type)!,
src/vs/workbench/contrib/notebook/browser/notebookEditorWidget.ts:
1526
1527 await this._ensureActiveKernel();
1528: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, undefined);
1529 }
1530
1535
1536 await this._ensureActiveKernel();
1537: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, undefined);
1538 }
1539
1553
1554 await this._ensureActiveKernel();
1555: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1556 }
1557
1567
1568 await this._ensureActiveKernel();
1569: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1570 }
1571
src/vs/workbench/contrib/webview/electron-browser/iframeWebviewElement.ts:
89 .then(() => this._resourceRequestManager.ensureReady())
90 .then(() => {
91: this.element?.contentWindow!.postMessage({ channel, args: data }, '*');
92 });
93 }
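This search targets optional chaining immediately followed by a non-null assertion (`foo?.bar!(...)`), a combination that compiles but throws at runtime when the asserted member is actually undefined. Below is a hedged sketch of the unsafe shape next to a safer one, loosely modeled on the `isChecked` hit above; the interface and names are simplified placeholders, not the repository's real types:

```ts
interface AccessibilityProviderLike<T> {
  isChecked?: (element: T) => boolean;
}

// The flagged pattern: `?.` short-circuits on a missing provider, but the `!` then
// asserts isChecked exists -- if it does not, this throws a TypeError at the call.
function isCheckedUnsafe<T>(provider: AccessibilityProviderLike<T> | undefined, element: T): boolean {
  return !!(provider?.isChecked!(element));
}

// Safer: keep chaining through the call so a missing isChecked simply yields
// undefined, which !! turns into false.
function isCheckedSafe<T>(provider: AccessibilityProviderLike<T> | undefined, element: T): boolean {
  return !!provider?.isChecked?.(element);
}
```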


@@ -73,9 +73,6 @@
      },
      "gulp.autoDetect": "off",
      "files.insertFinalNewline": true,
-     "[plaintext]": {
-       "files.insertFinalNewline": false,
-     },
      "[typescript]": {
        "editor.defaultFormatter": "vscode.typescript-language-features"
      },


@@ -1,3 +1,3 @@
-disturl "https://electronjs.org/headers"
-target "9.4.3"
+disturl "https://atom.io/download/electron"
+target "9.2.1"
runtime "electron"


@@ -1,115 +1,5 @@
# Change Log
## Version 1.26.1
* Release date: February 25, 2021
* Release status: General Availability
* Fixes https://github.com/microsoft/azuredatastudio/issues/14382
## Version 1.26.0
* Release date: February 22, 2021
* Release status: General Availability
* Added edit Jupyter book UI support
* Improved Jupyter server start-up time by 50% on windows
* Extension Updates:
* Azure Arc
* PG dashboard enhancements
* Multi-controller support
* MIAA Dashboard will no longer prompt for SQL Server connection immediately upon opening
* Azure Data CLI
* Kusto
* Machine Learning
* Profiler
* Server Reports
* Schema Compare
* SQL Server Dacpac
* SQL Database Projects
* Bug Fixes
## Version 1.25.3
* Release date: February 10, 2021
* Release status: General Availability
* Update Electron to 9.4.3 to incorporate critical upstream fixes
## Version 1.25.2
* Release date: January 22, 2021
* Release status: General Availability
* Fixes https://github.com/microsoft/azuredatastudio/issues/13899
## Version 1.25.1
* Release date: December 10, 2020
* Release status: General Availability
* Fixes https://github.com/microsoft/azuredatastudio/issues/13751
## Version 1.25.0
* Release date: December 8, 2020
* Release status: General Availability
* Kusto extension improvements
* SQL Project extension improvements
* Notebook improvements
* Azure Browse Connections Preview performance improvements
* Bug Fixes
## Version 1.24.0
* Release date: November 12, 2020
* Release status: General Availability
* SQL Project improvements
* Notebook improvements, including in WYSIWYG editor enhancements
* Azure Arc improvements
* Azure SQL Deployment UX improvements
* Azure Browse Connections Preview
* Bug Fixes
## Version 1.23.0
* Release date: October 14, 2020
* Release status: General Availability
* Added deployments of Azure SQL DB and VM
* Added PowerShell kernel results streaming support
* Added improvements to SQL Database Projects extension
* Bug Fixes
* Extension Updates:
* SQL Server Import
* Machine Learning
* Schema Compare
* Kusto
* SQL Assessment
* SQL Database Projects
* Azure Arc
* azdata
## Version 1.22.1
* Release date: September 30, 2020
* Release status: General Availability
* Fix bug #12615 Active connection filter doesn't untoggle | [#12615](https://github.com/microsoft/azuredatastudio/issues/12615)
* Fix bug #12572 Edit Data grid doesn't escape special characters | [#12572](https://github.com/microsoft/azuredatastudio/issues/12572)
* Fix bug #12570 Dashboard Explorer table doesn't escape special characters | [#12570](https://github.com/microsoft/azuredatastudio/issues/12570)
* Fix bug #12582 Delete row on Edit Data fails | [#12582](https://github.com/microsoft/azuredatastudio/issues/12582)
* Fix bug #12646 SQL Notebooks: Cells being treated isolated | [#12646](https://github.com/microsoft/azuredatastudio/issues/12646)
## Version 1.22.0
* Release date: September 22, 2020
* Release status: General Availability
* New Notebook Features
* Supports brand new text cell editing experience based on rich text formatting and seamless conversion to markdown, also known as WYSIWYG toolbar (What You See Is What You Get)
* Supports Kusto kernel
* Supports pinning of notebooks
* Added support for new version of Jupyter Books
* Improved Jupyter Shortcuts
* Introduced perf loading improvements
* Added Azure Arc extension - Users can try out Azure Arc public preview through Azure Data Studio. This includes:
* Deploy data controller
* Deploy Postgres
* Deploy Managed Instance for Azure Arc
* Connect to data controller
* Access data service dashboards
* Azure Arc Jupyter Book
* Added new deployment options
* Azure SQL Database Edge
* (Edge will require Azure SQL Edge Deployment Extension)
* Added SQL Database Projects extension - The SQL Database Projects extension brings project-based database development to Azure Data Studio. In this preview release, SQL projects can be created and published from Azure Data Studio.
* Added Kusto (KQL) extension - Brings native Kusto experiences in Azure Data Studio for data exploration and data analytics against massive amount of real-time streaming data stored in Azure Data Explorer. This preview release supports connecting and browsing Azure Data Explorer clusters, writing KQL queries as well as authoring notebooks with Kusto kernel.
* SQL Server Import extension GA - Announcing the GA of the SQL Server Import extension, features no longer in preview. This extension facilitates importing csv/txt files. Learn more about the extension in [this article](sql-server-import-extension.md).
* Resolved [bugs and issues](https://github.com/microsoft/azuredatastudio/issues?q=is%3Aissue+milestone%3A%22September+2020+Release%22+is%3Aclosed).
## Version 1.21.0
* Release date: August 12, 2020
* Release status: General Availability

View File

@@ -19,7 +19,7 @@ Azure Data Studio is a data management tool that enables you to work with SQL Se
| [Linux DEB][linux-deb] | | [Linux DEB][linux-deb] |
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions. Go to our [download page](https://aka.ms/azuredatastudio) for more specific instructions.
## Try out the latest insiders build from `main`: ## Try out the latest insiders build from `main`:
- [Windows User Installer - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-user/insider) - [Windows User Installer - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-user/insider)
@@ -29,8 +29,6 @@ Go to our [download page](https://aka.ms/getazuredatastudio) for more specific i
- [Linux TAR.GZ - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/linux-x64/insider) - [Linux TAR.GZ - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/linux-x64/insider)
See the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) for additional details of what's in this release. See the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) for additional details of what's in this release.
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
## **Feature Highlights** ## **Feature Highlights**
@@ -131,10 +129,10 @@ Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the [Source EULA](LICENSE.txt). Licensed under the [Source EULA](LICENSE.txt).
[win-user]: https://go.microsoft.com/fwlink/?linkid=2154985 [win-user]: https://go.microsoft.com/fwlink/?linkid=2138608
[win-system]: https://go.microsoft.com/fwlink/?linkid=2155159 [win-system]: https://go.microsoft.com/fwlink/?linkid=2138704
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2155221 [win-zip]: https://go.microsoft.com/fwlink/?linkid=2138705
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2155096 [osx-zip]: https://go.microsoft.com/fwlink/?linkid=2138609
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2154986 [linux-zip]: https://go.microsoft.com/fwlink/?linkid=2138706
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2155222 [linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2138507
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2155223 [linux-deb]: https://go.microsoft.com/fwlink/?linkid=2138508

View File

@@ -1526,6 +1526,30 @@ END OF primeng NOTICES AND INFORMATION
%% process-nextick-args NOTICES AND INFORMATION BEGIN HERE %% process-nextick-args NOTICES AND INFORMATION BEGIN HERE
========================================= =========================================
# Copyright (c) 2015 Calvin Metcalf
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
**THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.**
=========================================
END OF process-nextick-args NOTICES AND INFORMATION
%% pty.js NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) 2012-2015, Christopher Jeffrey (https://github.com/chjj/) Copyright (c) 2012-2015, Christopher Jeffrey (https://github.com/chjj/)
Permission is hereby granted, free of charge, to any person obtaining a copy Permission is hereby granted, free of charge, to any person obtaining a copy

View File

@@ -1,3 +0,0 @@
* text eol=lf
*.exe binary
*.dll binary

View File

@@ -8,10 +8,6 @@
**/LICENSE **/LICENSE
**/CONTRIBUTORS **/CONTRIBUTORS
**/docs/**
**/example/**
**/examples/**
jschardet/index.js jschardet/index.js
jschardet/src/** jschardet/src/**
jschardet/dist/jschardet.js jschardet/dist/jschardet.js

View File

@@ -15,9 +15,9 @@
"keywords": [], "keywords": [],
"author": "", "author": "",
"dependencies": { "dependencies": {
"@actions/core": "^1.2.6", "@actions/core": "^1.2.3",
"@actions/github": "^2.1.1", "@actions/github": "^2.1.1",
"axios": "^0.21.1", "axios": "^0.19.2",
"ts-node": "^8.6.2", "ts-node": "^8.6.2",
"typescript": "^3.8.3" "typescript": "^3.8.3"
} }

View File

@@ -2,10 +2,10 @@
# yarn lockfile v1 # yarn lockfile v1
"@actions/core@^1.2.6": "@actions/core@^1.2.3":
version "1.2.6" version "1.2.3"
resolved "https://registry.yarnpkg.com/@actions/core/-/core-1.2.6.tgz#a78d49f41a4def18e88ce47c2cac615d5694bf09" resolved "https://registry.yarnpkg.com/@actions/core/-/core-1.2.3.tgz#e844b4fa0820e206075445079130868f95bfca95"
integrity sha512-ZQYitnqiyBc3D+k7LsgSBmMDVkOVidaagDG7j3fOym77jNunWRuYx7VSHa9GNfFZh+zh61xsCjRj4JxMZlDqTA== integrity sha512-Wp4xnyokakM45Uuj4WLUxdsa8fJjKVl1fDTsPbTEcTcuu0Nb26IPQbOtjmnfaCPGcaoPOOqId8H9NapZ8gii4w==
"@actions/github@^2.1.1": "@actions/github@^2.1.1":
version "2.1.1" version "2.1.1"
@@ -144,12 +144,12 @@ atob-lite@^2.0.0:
resolved "https://registry.yarnpkg.com/atob-lite/-/atob-lite-2.0.0.tgz#0fef5ad46f1bd7a8502c65727f0367d5ee43d696" resolved "https://registry.yarnpkg.com/atob-lite/-/atob-lite-2.0.0.tgz#0fef5ad46f1bd7a8502c65727f0367d5ee43d696"
integrity sha1-D+9a1G8b16hQLGVyfwNn1e5D1pY= integrity sha1-D+9a1G8b16hQLGVyfwNn1e5D1pY=
axios@^0.21.1: axios@^0.19.2:
version "0.21.1" version "0.19.2"
resolved "https://registry.yarnpkg.com/axios/-/axios-0.21.1.tgz#22563481962f4d6bde9a76d516ef0e5d3c09b2b8" resolved "https://registry.yarnpkg.com/axios/-/axios-0.19.2.tgz#3ea36c5d8818d0d5f8a8a97a6d36b86cdc00cb27"
integrity sha512-dKQiRHxGD9PPRIUNIWvZhPTPpl1rf/OxTYKsqKUDjBwYylTvV7SjSHJb9ratfyzM6wCdLCOYLzs73qpg5c4iGA== integrity sha512-fjgm5MvRHLhx+osE2xoekY70AhARk3a6hkN+3Io1jc00jtquGvxYlKlsFUhmUET0V5te6CcZI7lcv2Ym61mjHA==
dependencies: dependencies:
follow-redirects "^1.10.0" follow-redirects "1.5.10"
before-after-hook@^2.0.0: before-after-hook@^2.0.0:
version "2.1.0" version "2.1.0"
@@ -177,6 +177,13 @@ cross-spawn@^6.0.0:
shebang-command "^1.2.0" shebang-command "^1.2.0"
which "^1.2.9" which "^1.2.9"
debug@=3.1.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/debug/-/debug-3.1.0.tgz#5bb5a0672628b64149566ba16819e61518c67261"
integrity sha512-OX8XqP7/1a9cqkxYw2yXss15f26NKWBpDXQd0/uK/KPqdQhxbPa994hnzjcE2VqQpDslf55723cKPUOGSmMY3g==
dependencies:
ms "2.0.0"
deprecation@^2.0.0, deprecation@^2.3.1: deprecation@^2.0.0, deprecation@^2.3.1:
version "2.3.1" version "2.3.1"
resolved "https://registry.yarnpkg.com/deprecation/-/deprecation-2.3.1.tgz#6368cbdb40abf3373b525ac87e4a260c3a700919" resolved "https://registry.yarnpkg.com/deprecation/-/deprecation-2.3.1.tgz#6368cbdb40abf3373b525ac87e4a260c3a700919"
@@ -207,10 +214,12 @@ execa@^1.0.0:
signal-exit "^3.0.0" signal-exit "^3.0.0"
strip-eof "^1.0.0" strip-eof "^1.0.0"
follow-redirects@^1.10.0: follow-redirects@1.5.10:
version "1.13.1" version "1.5.10"
resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.13.1.tgz#5f69b813376cee4fd0474a3aba835df04ab763b7" resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.5.10.tgz#7b7a9f9aea2fdff36786a94ff643ed07f4ff5e2a"
integrity sha512-SSG5xmZh1mkPGyKzjZP8zLjltIfpW32Y5QpdNJyjcfGxK3qo3NDDkZOZSFiGn1A6SclQxY9GzEwAHQ3dmYRWpg== integrity sha512-0V5l4Cizzvqt5D44aTXbFZz+FtyXV1vrDN6qrelxtfYQKW0KO0W2T/hkE8xvGa/540LkZlkaUjO4ailYTFtHVQ==
dependencies:
debug "=3.1.0"
get-stream@^4.0.0: get-stream@^4.0.0:
version "4.1.0" version "4.1.0"
@@ -266,6 +275,11 @@ make-error@^1.1.1:
resolved "https://registry.yarnpkg.com/make-error/-/make-error-1.3.6.tgz#2eb2e37ea9b67c4891f684a1394799af484cf7a2" resolved "https://registry.yarnpkg.com/make-error/-/make-error-1.3.6.tgz#2eb2e37ea9b67c4891f684a1394799af484cf7a2"
integrity sha512-s8UhlNe7vPKomQhC1qFelMokr/Sc3AgNbso3n74mVPA5LTZwkB9NlXf4XPamLxJE8h0gh73rM94xvwRT2CVInw== integrity sha512-s8UhlNe7vPKomQhC1qFelMokr/Sc3AgNbso3n74mVPA5LTZwkB9NlXf4XPamLxJE8h0gh73rM94xvwRT2CVInw==
ms@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/ms/-/ms-2.0.0.tgz#5608aeadfc00be6c2901df5f9861788de0d597c8"
integrity sha1-VgiurfwAvmwpAd9fmGF4jeDVl8g=
nice-try@^1.0.4: nice-try@^1.0.4:
version "1.0.5" version "1.0.5"
resolved "https://registry.yarnpkg.com/nice-try/-/nice-try-1.0.5.tgz#a3378a7696ce7d223e88fc9b764bd7ef1089e366" resolved "https://registry.yarnpkg.com/nice-try/-/nice-try-1.0.5.tgz#a3378a7696ce7d223e88fc9b764bd7ef1089e366"

View File

@@ -24,7 +24,7 @@ const files = [
]; ];
async function main() { async function main() {
return new Promise<void>((resolve, reject) => { return new Promise((resolve, reject) => {
const stream = vfs.src(files, { base: '.build', allowEmpty: true }) const stream = vfs.src(files, { base: '.build', allowEmpty: true })
.pipe(es.through(file => { .pipe(es.through(file => {
const filePath = path.join(process.env.BUILD_ARTIFACTSTAGINGDIRECTORY!, const filePath = path.join(process.env.BUILD_ARTIFACTSTAGINGDIRECTORY!,

View File

@@ -53,7 +53,7 @@ async function uploadBlob(blobService: azure.BlobService, quality: string, blobN
} }
}; };
await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c())); await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c()));
} }
function getEnv(name: string): string { function getEnv(name: string): string {

View File

@@ -17,7 +17,7 @@ const fileNames = [
]; ];
async function assertContainer(blobService: azure.BlobService, container: string): Promise<void> { async function assertContainer(blobService: azure.BlobService, container: string): Promise<void> {
await new Promise<void>((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c())); await new Promise((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
} }
async function doesBlobExist(blobService: azure.BlobService, container: string, blobName: string): Promise<boolean | undefined> { async function doesBlobExist(blobService: azure.BlobService, container: string, blobName: string): Promise<boolean | undefined> {
@@ -33,7 +33,7 @@ async function uploadBlob(blobService: azure.BlobService, container: string, blo
} }
}; };
await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c())); await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c()));
} }
async function publish(commit: string, files: readonly string[]): Promise<void> { async function publish(commit: string, files: readonly string[]): Promise<void> {
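
Several build-script hunks in this range add an explicit <void> type argument to new Promise(...). A short sketch of the distinction (an assumed example, not repository code): starting around TypeScript 4.1 the resolver's parameter is no longer optional, so a bare resolve() only type-checks when the promise is explicitly Promise<void>:

// Assumed illustration of the Promise<void> annotation used in these scripts.
function waitFor(run: (done: (err?: Error) => void) => void): Promise<void> {
    // Without the <void> type argument the resolver is typed as (value: unknown) => void,
    // so the bare c() below would fail to compile ("Expected 1 arguments, but got 0").
    return new Promise<void>((c, e) => run(err => (err ? e(err) : c())));
}

// Usage sketch mirroring the callback-style Azure wrappers above (uploadDone is a stand-in API).
declare function uploadDone(callback: (err?: Error) => void): void;
const finished: Promise<void> = waitFor(uploadDone);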

View File

@@ -43,7 +43,6 @@ function createDefaultConfig(quality: string): Config {
} }
function getConfig(quality: string): Promise<Config> { function getConfig(quality: string): Promise<Config> {
console.log(`Getting config for quality ${quality}`);
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] }); const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/config'; const collection = 'dbs/builds/colls/config';
const query = { const query = {
@@ -53,13 +52,13 @@ function getConfig(quality: string): Promise<Config> {
] ]
}; };
return retry(() => new Promise<Config>((c, e) => { return new Promise<Config>((c, e) => {
client.queryDocuments(collection, query, { enableCrossPartitionQuery: true }).toArray((err, results) => { client.queryDocuments(collection, query, { enableCrossPartitionQuery: true }).toArray((err, results) => {
if (err && err.code !== 409) { return e(err); } if (err && err.code !== 409) { return e(err); }
c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0] as any as Config); c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0] as any as Config);
}); });
})); });
} }
interface Asset { interface Asset {
@@ -87,7 +86,6 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
updateTries++; updateTries++;
return new Promise<void>((c, e) => { return new Promise<void>((c, e) => {
console.log(`Querying existing documents to update...`);
client.queryDocuments(collection, updateQuery, { enableCrossPartitionQuery: true }).toArray((err, results) => { client.queryDocuments(collection, updateQuery, { enableCrossPartitionQuery: true }).toArray((err, results) => {
if (err) { return e(err); } if (err) { return e(err); }
if (results.length !== 1) { return e(new Error('No documents')); } if (results.length !== 1) { return e(new Error('No documents')); }
@@ -103,7 +101,6 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
release.updates[platform] = type; release.updates[platform] = type;
} }
console.log(`Replacing existing document with updated version`);
client.replaceDocument(release._self, release, err => { client.replaceDocument(release._self, release, err => {
if (err && err.code === 409 && updateTries < 5) { return c(update()); } if (err && err.code === 409 && updateTries < 5) { return c(update()); }
if (err) { return e(err); } if (err) { return e(err); }
@@ -115,8 +112,7 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
}); });
} }
return retry(() => new Promise<void>((c, e) => { return new Promise<void>((c, e) => {
console.log(`Attempting to create document`);
client.createDocument(collection, release, err => { client.createDocument(collection, release, err => {
if (err && err.code === 409) { return c(update()); } if (err && err.code === 409) { return c(update()); }
if (err) { return e(err); } if (err) { return e(err); }
@@ -124,11 +120,11 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
console.log('Build successfully published.'); console.log('Build successfully published.');
c(); c();
}); });
})); });
} }
async function assertContainer(blobService: azure.BlobService, quality: string): Promise<void> { async function assertContainer(blobService: azure.BlobService, quality: string): Promise<void> {
await new Promise<void>((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c())); await new Promise((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
} }
async function doesAssetExist(blobService: azure.BlobService, quality: string, blobName: string): Promise<boolean | undefined> { async function doesAssetExist(blobService: azure.BlobService, quality: string, blobName: string): Promise<boolean | undefined> {
@@ -144,7 +140,7 @@ async function uploadBlob(blobService: azure.BlobService, quality: string, blobN
} }
}; };
await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c())); await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c()));
} }
interface PublishOptions { interface PublishOptions {
@@ -192,6 +188,7 @@ async function publish(commit: string, quality: string, platform: string, type:
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`); console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return; return;
} }
console.log('Uploading blobs to Azure storage...'); console.log('Uploading blobs to Azure storage...');
await uploadBlob(blobService, quality, blobName, file); await uploadBlob(blobService, quality, blobName, file);
@@ -250,22 +247,6 @@ async function publish(commit: string, quality: string, platform: string, type:
await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate); await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
} }
const RETRY_TIMES = 10;
async function retry<T>(fn: () => Promise<T>): Promise<T> {
for (let run = 1; run <= RETRY_TIMES; run++) {
try {
return await fn();
} catch (err) {
if (!/ECONNRESET/.test(err.message)) {
throw err;
}
console.log(`Caught error ${err} - ${run}/${RETRY_TIMES}`);
}
}
throw new Error('Retried too many times');
}
function main(): void { function main(): void {
const commit = process.env['BUILD_SOURCEVERSION']; const commit = process.env['BUILD_SOURCEVERSION'];

View File

@@ -1,26 +1,24 @@
steps: steps:
- task: NodeTool@0 - task: NodeTool@0
inputs: inputs:
versionSpec: "12.14.1" versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3 # {{SQL CARBON EDIT}} update version - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3 # {{SQL CARBON EDIT}} update version
inputs: inputs:
versionSpec: "1.x" versionSpec: "1.x"
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules # {{SQL CARBON EDIT}}
inputs: inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock' keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules' targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache
- script: | - script: |
CHILD_CONCURRENCY=1 yarn --frozen-lockfile CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install Dependencies displayName: Install Dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true')) condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules # {{SQL CARBON EDIT}}
inputs: inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock' keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules' targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -35,21 +33,21 @@ steps:
# yarn monaco-compile-check # yarn monaco-compile-check
# displayName: Run Monaco Editor Checks # displayName: Run Monaco Editor Checks
- script: | - script: |
yarn valid-layers-check yarn valid-layers-check
displayName: Run Valid Layers Checks displayName: Run Valid Layers Checks
- script: | - script: |
yarn compile yarn compile
displayName: Compile Sources displayName: Compile Sources
# - script: | {{SQL CARBON EDIT}} remove step # - script: | {{SQL CARBON EDIT}} remove step
# yarn download-builtin-extensions # yarn download-builtin-extensions
# displayName: Download Built-in Extensions # displayName: Download Built-in Extensions
- script: | - script: |
./scripts/test.sh --tfs "Unit Tests" ./scripts/test.sh --tfs "Unit Tests"
displayName: Run Unit Tests (Electron) displayName: Run Unit Tests (Electron)
# - script: | {{SQL CARBON EDIT}} disable # - script: | {{SQL CARBON EDIT}} disable
# yarn test-browser --browser chromium --browser webkit --browser firefox --tfs "Browser Unit Tests" # yarn test-browser --browser chromium --browser webkit --browser firefox --tfs "Browser Unit Tests"
@@ -59,17 +57,17 @@ steps:
# ./scripts/test-integration.sh --tfs "Integration Tests" # ./scripts/test-integration.sh --tfs "Integration Tests"
# displayName: Run Integration Tests (Electron) # displayName: Run Integration Tests (Electron)
- task: PublishPipelineArtifact@0 - task: PublishPipelineArtifact@0
inputs: inputs:
artifactName: crash-dump-macos artifactName: crash-dump-macos
targetPath: .build/crashes targetPath: .build/crashes
displayName: "Publish Crash Reports" displayName: 'Publish Crash Reports'
continueOnError: true continueOnError: true
condition: failed() condition: failed()
- task: PublishTestResults@2 - task: PublishTestResults@2
displayName: Publish Tests Results displayName: Publish Tests Results
inputs: inputs:
testResultsFiles: "*-results.xml" testResultsFiles: '*-results.xml'
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results" searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed() condition: succeededOrFailed()

View File

@@ -1,337 +1,269 @@
steps: steps:
- script: | - script: |
mkdir -p .build mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin displayName: Prepare cache flag
displayName: Prepare compilation cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs: inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin" keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min" targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: "npm-vscode" vstsFeed: 'npm-vscode'
platformIndependent: true platformIndependent: true
alias: "Compilation" alias: 'Compilation'
- script: | - script: |
set -e set -e
exit 1 exit 1
displayName: Check RestoreCache displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true')) condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0 - task: NodeTool@0
inputs: inputs:
versionSpec: "12.14.1" versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2 - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs: inputs:
versionSpec: "1.x" versionSpec: "1.x"
- task: AzureKeyVault@1 - task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets" displayName: 'Azure Key Vault: Get Secrets'
inputs: inputs:
azureSubscription: "vscode-builds-subscription" azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode KeyVaultName: vscode
- script: | - script: |
set -e set -e
cat << EOF > ~/.netrc cat << EOF > ~/.netrc
machine github.com machine github.com
login vscode login vscode
password $(github-distro-mixin-password) password $(github-distro-mixin-password)
EOF EOF
git config user.email "vscode@microsoft.com" git config user.email "vscode@microsoft.com"
git config user.name "VSCode" git config user.name "VSCode"
displayName: Prepare tooling displayName: Prepare tooling
- script: | - script: |
set -e set -e
sudo xcode-select -s /Applications/Xcode_12.2.app git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
displayName: Switch to Xcode 12 git fetch distro
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64')) git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: | - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
set -e inputs:
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git" keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
git fetch distro targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
git merge $(node -p "require('./package.json').distro") vstsFeed: 'npm-vscode'
displayName: Merge distro
- script: | - script: |
npx https://aka.ms/enablesecurefeed standAlone set -e
displayName: Switch to Terrapin packages CHILD_CONCURRENCY=1 yarn --frozen-lockfile
timeoutInMinutes: 5 displayName: Install dependencies
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true')) condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: | - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
echo -n $(VSCODE_ARCH) > .build/arch inputs:
displayName: Prepare yarn cache flags keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1 - script: |
inputs: set -e
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock" yarn postinstall
targetfolder: "**/node_modules, !**/node_modules/**/node_modules" displayName: Run postinstall scripts
vstsFeed: "npm-vscode" condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: | - script: |
set -e set -e
npm install -g node-gyp@7.1.0 node build/azure-pipelines/mixin
node-gyp --version displayName: Mix in quality
displayName: Update node-gyp
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: | - script: |
set -e set -e
export npm_config_arch=$(VSCODE_ARCH) VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
export npm_config_node_gyp=$(which node-gyp) yarn gulp vscode-darwin-min-ci
export SDKROOT=/Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.0.sdk VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
export CHILD_CONCURRENCY="1" yarn gulp vscode-reh-darwin-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-darwin-min-ci
displayName: Build
for i in {1..3}; do # try 3 times, for Terrapin - script: |
yarn --frozen-lockfile && break set -e
if [ $i -eq 3 ]; then ./scripts/test.sh --build --tfs "Unit Tests"
echo "Yarn failed too many times" >&2 displayName: Run unit tests (Electron)
exit 1 condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
fi
echo "Yarn failed $i, trying again..."
done
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1 - script: |
inputs: set -e
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock" yarn test-browser --build --browser chromium --browser webkit --browser firefox --tfs "Browser Unit Tests"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules" displayName: Run unit tests (Browser)
vstsFeed: "npm-vscode" condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: | - script: |
set -e # Figure out the full absolute path of the product we just built
export npm_config_arch=$(VSCODE_ARCH) # including the remote server and configure the integration tests
export npm_config_node_gyp=$(which node-gyp) # to run with these builds instead of running out of sources.
export SDKROOT=/Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.0.sdk set -e
ls /Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/ APP_ROOT=$(agent.builddirectory)/VSCode-darwin
yarn postinstall APP_NAME="`ls $APP_ROOT | head -n 1`"
displayName: Run postinstall scripts INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
condition: and(succeeded(), eq(variables['CacheRestored'], 'true')) VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - script: |
set -e set -e
export npm_config_arch=$(VSCODE_ARCH) VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
export npm_config_node_gyp=$(which node-gyp) ./resources/server/test/test-web-integration.sh --browser webkit
export npm_config_build_from_source=true displayName: Run integration tests (Browser)
export SDKROOT=/Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.0.sdk condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
ls /Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/
yarn electron-rebuild
cd ./node_modules/keytar
node-gyp rebuild
displayName: Rebuild native modules for ARM64
condition: eq(variables['VSCODE_ARCH'], 'arm64')
- script: | - script: |
set -e set -e
node build/azure-pipelines/mixin APP_ROOT=$(agent.builddirectory)/VSCode-darwin
displayName: Mix in quality APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
./resources/server/test/test-remote-integration.sh
displayName: Run remote integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - script: |
set -e set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \ APP_ROOT=$(agent.builddirectory)/VSCode-darwin
yarn gulp vscode-darwin-$(VSCODE_ARCH)-min-ci APP_NAME="`ls $APP_ROOT | head -n 1`"
displayName: Build yarn smoketest --build "$APP_ROOT/$APP_NAME"
continueOnError: true
displayName: Run smoke tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - script: |
set -e set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \ VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
yarn gulp vscode-reh-darwin-min-ci yarn smoketest --web --headless
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \ continueOnError: true
yarn gulp vscode-reh-web-darwin-min-ci displayName: Run smoke tests (Browser)
displayName: Build reh condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
- script: | - task: PublishPipelineArtifact@0
set -e inputs:
yarn electron $(VSCODE_ARCH) artifactName: crash-dump-macos
displayName: Download Electron targetPath: .build/crashes
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false')) displayName: 'Publish Crash Reports'
continueOnError: true
condition: failed()
- script: | - task: PublishTestResults@2
set -e displayName: Publish Tests Results
security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain inputs:
security default-keychain -s $(agent.tempdirectory)/buildagent.keychain testResultsFiles: '*-results.xml'
security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12 condition: succeededOrFailed()
security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
VSCODE_ARCH="$(VSCODE_ARCH)" DEBUG=electron-osx-sign* node build/darwin/sign.js
displayName: Set Hardened Entitlements
- script: | - script: |
set -e set -e
./scripts/test.sh --build --tfs "Unit Tests" security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
displayName: Run unit tests (Electron) security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false')) security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
DEBUG=electron-osx-sign* node build/darwin/sign.js
displayName: Set Hardened Entitlements
- script: | - script: |
set -e set -e
yarn test-browser --build --browser chromium --browser webkit --browser firefox --tfs "Browser Unit Tests" pushd $(agent.builddirectory)/VSCode-darwin && zip -r -X -y $(agent.builddirectory)/VSCode-darwin.zip * && popd
displayName: Run unit tests (Browser) displayName: Archive build
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
# Figure out the full absolute path of the product we just built inputs:
# including the remote server and configure the integration tests ConnectedServiceName: 'ESRP CodeSign'
# to run with these builds instead of running out of sources. FolderPath: '$(agent.builddirectory)'
set -e Pattern: 'VSCode-darwin.zip'
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH) signConfigType: inlineSignParams
APP_NAME="`ls $APP_ROOT | head -n 1`" inlineOperation: |
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \ [
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \ {
./scripts/test-integration.sh --build --tfs "Integration Tests" "keyCode": "CP-401337-Apple",
displayName: Run integration tests (Electron) "operationSetCode": "MacAppDeveloperSign",
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false')) "parameters": [
{
"parameterName": "Hardening",
"parameterValue": "--options=runtime"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Codesign
- script: | - script: |
set -e zip -d $(agent.builddirectory)/VSCode-darwin.zip "*.pkg"
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \ displayName: Clean Archive
./resources/server/test/test-web-integration.sh --browser webkit
displayName: Run integration tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - script: |
set -e APP_ROOT=$(agent.builddirectory)/VSCode-darwin
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH) APP_NAME="`ls $APP_ROOT | head -n 1`"
APP_NAME="`ls $APP_ROOT | head -n 1`" BUNDLE_IDENTIFIER=$(node -p "require(\"$APP_ROOT/$APP_NAME/Contents/Resources/app/product.json\").darwinBundleIdentifier")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \ echo "##vso[task.setvariable variable=BundleIdentifier]$BUNDLE_IDENTIFIER"
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \ displayName: Export bundle identifier
./resources/server/test/test-remote-integration.sh
displayName: Run remote integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
set -e inputs:
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH) ConnectedServiceName: 'ESRP CodeSign'
APP_NAME="`ls $APP_ROOT | head -n 1`" FolderPath: '$(agent.builddirectory)'
yarn smoketest --build "$APP_ROOT/$APP_NAME" Pattern: 'VSCode-darwin.zip'
continueOnError: true signConfigType: inlineSignParams
displayName: Run smoke tests (Electron) inlineOperation: |
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false')) [
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppNotarize",
"parameters": [
{
"parameterName": "BundleId",
"parameterValue": "$(BundleIdentifier)"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Notarization
- script: | - script: |
set -e set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \ APP_ROOT=$(agent.builddirectory)/VSCode-darwin
yarn smoketest --web --headless APP_NAME="`ls $APP_ROOT | head -n 1`"
continueOnError: true "$APP_ROOT/$APP_NAME/Contents/Resources/app/bin/code" --export-default-configuration=.build
displayName: Run smoke tests (Browser) displayName: Verify start after signing (export configuration)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: PublishPipelineArtifact@0 - script: |
inputs: set -e
artifactName: crash-dump-macos-$(VSCODE_ARCH) VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
targetPath: .build/crashes AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
displayName: "Publish Crash Reports" AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
continueOnError: true AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
condition: failed() ./build/azure-pipelines/darwin/publish.sh
displayName: Publish
- task: PublishTestResults@2 - script: |
displayName: Publish Tests Results AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
inputs: yarn gulp upload-vscode-configuration
testResultsFiles: "*-results.xml" displayName: Upload configuration (for Bing settings search)
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results" continueOnError: true
condition: succeededOrFailed()
- script: | - task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
set -e displayName: 'Component Detection'
pushd $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH) && zip -r -X -y $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip * && popd continueOnError: true
displayName: Archive build
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: "ESRP CodeSign"
FolderPath: "$(agent.builddirectory)"
Pattern: "VSCode-darwin-$(VSCODE_ARCH).zip"
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppDeveloperSign",
"parameters": [
{
"parameterName": "Hardening",
"parameterValue": "--options=runtime"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Codesign
- script: |
zip -d $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip "*.pkg"
displayName: Clean Archive
- script: |
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
BUNDLE_IDENTIFIER=$(node -p "require(\"$APP_ROOT/$APP_NAME/Contents/Resources/app/product.json\").darwinBundleIdentifier")
echo "##vso[task.setvariable variable=BundleIdentifier]$BUNDLE_IDENTIFIER"
displayName: Export bundle identifier
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: "ESRP CodeSign"
FolderPath: "$(agent.builddirectory)"
Pattern: "VSCode-darwin-$(VSCODE_ARCH).zip"
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppNotarize",
"parameters": [
{
"parameterName": "BundleId",
"parameterValue": "$(BundleIdentifier)"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Notarization
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
"$APP_ROOT/$APP_NAME/Contents/Resources/app/bin/code" --export-default-configuration=.build
displayName: Verify start after signing (export configuration)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_ARCH="$(VSCODE_ARCH)" \
./build/azure-pipelines/darwin/publish.sh
displayName: Publish
- script: |
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
VSCODE_ARCH="$(VSCODE_ARCH)" \
yarn gulp upload-vscode-configuration
displayName: Upload configuration (for Bing settings search)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
continueOnError: true
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
continueOnError: true

View File

@@ -1,27 +1,19 @@
#!/usr/bin/env bash #!/usr/bin/env bash
set -e set -e
# Publish DEB
case $VSCODE_ARCH in
x64) ASSET_ID="darwin" ;;
arm64) ASSET_ID="darwin-arm64" ;;
esac
# publish the build # publish the build
node build/azure-pipelines/common/createAsset.js \ node build/azure-pipelines/common/createAsset.js \
"$ASSET_ID" \ darwin \
archive \ archive \
"VSCode-$ASSET_ID.zip" \ "VSCode-darwin-$VSCODE_QUALITY.zip" \
../VSCode-darwin-$VSCODE_ARCH.zip ../VSCode-darwin.zip
if [ "$VSCODE_ARCH" == "x64" ]; then # package Remote Extension Host
# package Remote Extension Host pushd .. && mv vscode-reh-darwin vscode-server-darwin && zip -Xry vscode-server-darwin.zip vscode-server-darwin && popd
pushd .. && mv vscode-reh-darwin vscode-server-darwin && zip -Xry vscode-server-darwin.zip vscode-server-darwin && popd
# publish Remote Extension Host # publish Remote Extension Host
node build/azure-pipelines/common/createAsset.js \ node build/azure-pipelines/common/createAsset.js \
server-darwin \ server-darwin \
archive-unsigned \ archive-unsigned \
"vscode-server-darwin.zip" \ "vscode-server-darwin.zip" \
../vscode-server-darwin.zip ../vscode-server-darwin.zip
fi

View File

@@ -12,7 +12,6 @@ steps:
displayName: Prepare cache flag displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Compiled Files
inputs: inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality' keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min' targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
@@ -50,7 +49,7 @@ steps:
password $(github-distro-mixin-password) password $(github-distro-mixin-password)
EOF EOF
git config user.email "sqltools@service.microsoft.com" git config user.email "andresse@microsoft.com"
git config user.name "AzureDataStudio" git config user.name "AzureDataStudio"
displayName: Prepare tooling displayName: Prepare tooling
@@ -62,7 +61,6 @@ steps:
displayName: Merge distro displayName: Merge distro
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules
inputs: inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock' keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules' targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -77,7 +75,6 @@ steps:
condition: and(succeeded(), ne(variables['CacheRestored'], 'true')) condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules
inputs: inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock' keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules' targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -98,7 +95,9 @@ steps:
- script: | - script: |
set -e set -e
yarn gulp package-rebuild-extensions yarn gulp package-rebuild-extensions
yarn gulp vscode-darwin-x64-min-ci yarn gulp vscode-darwin-min-ci
yarn gulp vscode-reh-darwin-min-ci
yarn gulp vscode-reh-web-darwin-min-ci
displayName: Build displayName: Build
env: env:
VSCODE_MIXIN_PASSWORD: $(github-distro-mixin-password) VSCODE_MIXIN_PASSWORD: $(github-distro-mixin-password)
@@ -114,7 +113,7 @@ steps:
# including the remote server and configure the integration tests # including the remote server and configure the integration tests
# to run with these builds instead of running out of sources. # to run with these builds instead of running out of sources.
set -e set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin-x64 APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`" APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \ INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-darwin" \ VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-darwin" \
@@ -124,25 +123,25 @@ steps:
- script: | - script: |
set -e set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin-x64 APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`" APP_NAME="`ls $APP_ROOT | head -n 1`"
yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots" --log "$(build.artifactstagingdirectory)/logs/darwin/smoke.log" yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots"
displayName: Run smoke tests (Electron) displayName: Run smoke tests (Electron)
continueOnError: true continueOnError: true
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true')) condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
# - script: | - script: |
# set -e set -e
# node ./node_modules/playwright/install.js node ./node_modules/playwright/install.js
# VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-web-darwin" \ VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-web-darwin" \
# yarn smoketest --web --headless --screenshots "$(build.artifactstagingdirectory)/smokeshots" yarn smoketest --web --headless --screenshots "$(build.artifactstagingdirectory)/smokeshots"
# displayName: Run smoke tests (Browser) displayName: Run smoke tests (Browser)
# continueOnError: true continueOnError: true
# condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true')) condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: | - script: |
set -e set -e
pushd ../azuredatastudio-darwin-x64 pushd ../azuredatastudio-darwin
ls ls
echo "Cleaning the application" echo "Cleaning the application"
@@ -172,7 +171,7 @@ steps:
- script: | - script: |
set -e set -e
mkdir -p .build/darwin/archive mkdir -p .build/darwin/archive
pushd ../azuredatastudio-darwin-x64 pushd ../azuredatastudio-darwin
ditto -c -k --keepParent *.app $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin.zip ditto -c -k --keepParent *.app $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin.zip
popd popd
displayName: 'Archive (no signing)' displayName: 'Archive (no signing)'
@@ -181,7 +180,7 @@ steps:
- script: | - script: |
set -e set -e
mkdir -p .build/darwin/archive mkdir -p .build/darwin/archive
pushd ../azuredatastudio-darwin-x64 pushd ../azuredatastudio-darwin
ditto -c -k --keepParent *.app $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin-unsigned.zip ditto -c -k --keepParent *.app $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin-unsigned.zip
popd popd
displayName: 'Archive' displayName: 'Archive'
@@ -202,7 +201,7 @@ steps:
testResultsFiles: 'test-results.xml' testResultsFiles: 'test-results.xml'
searchFolder: '$(Build.SourcesDirectory)' searchFolder: '$(Build.SourcesDirectory)'
continueOnError: true continueOnError: true
condition: and(succeededOrFailed(), eq(variables['RUN_TESTS'], 'true')) condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- task: PublishCodeCoverageResults@1 - task: PublishCodeCoverageResults@1
displayName: 'Publish code coverage from $(Build.SourcesDirectory)/.build/coverage/cobertura-coverage.xml' displayName: 'Publish code coverage from $(Build.SourcesDirectory)/.build/coverage/cobertura-coverage.xml'

View File

@@ -9,9 +9,9 @@ pr:
include: ['main', 'release/*'] include: ['main', 'release/*']
steps: steps:
- task: NodeTool@0 - task: NodeTool@0
inputs: inputs:
versionSpec: "12.14.1" versionSpec: "12.14.1"
- task: AzureKeyVault@1 - task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets' displayName: 'Azure Key Vault: Get Secrets'
@@ -19,8 +19,8 @@ steps:
azureSubscription: 'azuredatastudio-adointegration' azureSubscription: 'azuredatastudio-adointegration'
KeyVaultName: ado-secrets KeyVaultName: ado-secrets
- script: | - script: |
set -e set -e
cat << EOF > ~/.netrc cat << EOF > ~/.netrc
machine github.com machine github.com
@@ -28,7 +28,7 @@ steps:
password $(github-distro-mixin-password) password $(github-distro-mixin-password)
EOF EOF
git config user.email "sqltools@service.microsoft.com" git config user.email "andresse@microsoft.com"
git config user.name "AzureDataStudio" git config user.name "AzureDataStudio"
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git" git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
@@ -37,9 +37,9 @@ steps:
# Push main branch into oss/master # Push main branch into oss/master
git push distro origin/main:refs/heads/oss/master git push distro origin/main:refs/heads/oss/master
# Push every release branch into oss/release # Push every release branch into oss/release
git for-each-ref --format="%(refname:short)" refs/remotes/origin/release/* | sed 's/^origin\/\(.*\)$/\0:refs\/heads\/oss\/\1/' | xargs git push distro git for-each-ref --format="%(refname:short)" refs/remotes/origin/release/* | sed 's/^origin\/\(.*\)$/\0:refs\/heads\/oss\/\1/' | xargs git push distro
git merge $(node -p "require('./package.json').distro") git merge $(node -p "require('./package.json').distro")
displayName: Sync & Merge Distro displayName: Sync & Merge Distro

View File

@@ -22,7 +22,7 @@ steps:
     password $(github-distro-mixin-password)
     EOF
-    git config user.email "sqltools@service.microsoft.com"
+    git config user.email "andresse@microsoft.com"
     git config user.name "AzureDataStudio"
   displayName: Prepare tooling
@@ -34,7 +34,6 @@ steps:
   displayName: Merge distro
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
-  displayName: Restore Cache - Node Modules
   inputs:
     keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
     targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -49,7 +48,6 @@ steps:
   condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
-  displayName: Save Cache - Node Modules
   inputs:
     keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
     targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'


@@ -1,40 +1,40 @@
 pool:
-  vmImage: "Ubuntu-16.04"
+  vmImage: 'Ubuntu-16.04'
 trigger:
   branches:
-    include: ["main"]
+    include: ['main']
 pr:
   branches:
-    include: ["main"]
+    include: ['main']
 steps:
 - task: NodeTool@0
   inputs:
     versionSpec: "12.14.1"
 - task: AzureKeyVault@1
-  displayName: "Azure Key Vault: Get Secrets"
+  displayName: 'Azure Key Vault: Get Secrets'
   inputs:
-    azureSubscription: "vscode-builds-subscription"
+    azureSubscription: 'vscode-builds-subscription'
     KeyVaultName: vscode
 - script: |
     set -e
     cat << EOF > ~/.netrc
     machine github.com
     login vscode
     password $(github-distro-mixin-password)
     EOF
     git config user.email "vscode@microsoft.com"
     git config user.name "VSCode"
-    git checkout origin/electron-11.x.y
+    git checkout origin/electron-x.y.z
     git merge origin/master
     # Push master branch into exploration branch
-    git push origin HEAD:electron-11.x.y
+    git push origin HEAD:electron-x.y.z
   displayName: Sync & Merge Exploration


@@ -1,5 +0,0 @@
#!/usr/bin/env bash
set -e
echo "Installing remote dependencies"
(cd remote && rm -rf node_modules && yarn)


@@ -1,28 +0,0 @@
#!/usr/bin/env bash
set -e
REPO="$(pwd)"
ROOT="$REPO/.."
PLATFORM_LINUX="linux-alpine"
# Publish Remote Extension Host
LEGACY_SERVER_BUILD_NAME="vscode-reh-$PLATFORM_LINUX"
SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX"
SERVER_TARBALL_FILENAME="vscode-server-$PLATFORM_LINUX.tar.gz"
SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
rm -rf $ROOT/vscode-server-*.tar.*
(cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
# Publish Remote Extension Host (Web)
LEGACY_SERVER_BUILD_NAME="vscode-reh-web-$PLATFORM_LINUX"
SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX-web"
SERVER_TARBALL_FILENAME="vscode-server-$PLATFORM_LINUX-web.tar.gz"
SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
rm -rf $ROOT/vscode-server-*.tar.*
(cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX-web" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
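Viewed apart from the deleted Alpine publish script above, the pattern it repeats is: rename the remote-extension-host output folder, pack it into a root-owned tarball, and hand the tarball to createAsset.js. A minimal sketch of that pattern follows; the publish_server_build helper is hypothetical and only illustrates the steps the original script performs inline.

#!/usr/bin/env bash
set -e
REPO="$(pwd)"
ROOT="$REPO/.."
PLATFORM="linux-alpine"

publish_server_build() {
  # $1 = folder name emitted by the build, $2 = published build name, $3 = asset id
  local legacy="$1" name="$2" asset="$3"
  local tarball_path="$ROOT/$name.tar.gz"
  rm -rf "$ROOT"/vscode-server-*.tar.*
  # --owner/--group keep archive entries root-owned regardless of the agent user.
  (cd "$ROOT" && mv "$legacy" "$name" && tar --owner=0 --group=0 -czf "$tarball_path" "$name")
  node build/azure-pipelines/common/createAsset.js "$asset" archive-unsigned "$name.tar.gz" "$tarball_path"
}

publish_server_build "vscode-reh-$PLATFORM" "vscode-server-$PLATFORM" "server-$PLATFORM"
publish_server_build "vscode-reh-web-$PLATFORM" "vscode-server-$PLATFORM-web" "server-$PLATFORM-web"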


@@ -1,99 +1,97 @@
 steps:
 - script: |
     set -e
     sudo apt-get update
     sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libkrb5-dev #{{SQL CARBON EDIT}} add kerberos dep
     sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
     sudo chmod +x /etc/init.d/xvfb
     sudo update-rc.d xvfb defaults
     sudo service xvfb start
 - task: NodeTool@0
   inputs:
     versionSpec: "12.14.1"
 - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3
   inputs:
     versionSpec: "1.x"
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
-  displayName: Restore Cache - Node Modules # {{SQL CARBON EDIT}}
-  inputs:
-    keyfile: "build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock"
-    targetfolder: "**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules"
-    vstsFeed: "npm-cache" # {{SQL CARBON EDIT}} update build cache
+  inputs:
+    keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
+    targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
+    vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache
 - script: |
     CHILD_CONCURRENCY=1 yarn --frozen-lockfile
   displayName: Install Dependencies
   condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
-  displayName: Save Cache - Node Modules # {{SQL CARBON EDIT}}
-  inputs:
-    keyfile: "build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock"
-    targetfolder: "**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules"
-    vstsFeed: "npm-cache" # {{SQL CARBON EDIT}} update build cache
+  inputs:
+    keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
+    targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
+    vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache
   condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
 - script: |
     yarn electron x64
   displayName: Download Electron
 - script: |
     yarn gulp hygiene
   displayName: Run Hygiene Checks
 - script: | # {{SQL CARBON EDIT}} add strict null check
     yarn strict-vscode
   displayName: Run Strict Null Check
 # - script: | {{SQL CARBON EDIT}} remove monaco editor checks
 # yarn monaco-compile-check
 # displayName: Run Monaco Editor Checks
 - script: |
     yarn valid-layers-check
   displayName: Run Valid Layers Checks
 - script: |
     yarn compile
   displayName: Compile Sources
 # - script: | {{SQL CARBON EDIT}} remove step
 # yarn download-builtin-extensions
 # displayName: Download Built-in Extensions
 - script: |
     DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests"
   displayName: Run Unit Tests (Electron)
 # - script: | {{SQL CARBON EDIT}} disable
 # DISPLAY=:10 yarn test-browser --browser chromium --tfs "Browser Unit Tests"
 # displayName: Run Unit Tests (Browser)
 # - script: | {{SQL CARBON EDIT}} disable
 # DISPLAY=:10 ./scripts/test-integration.sh --tfs "Integration Tests"
 # displayName: Run Integration Tests (Electron)
 # - task: PublishPipelineArtifact@0
 # inputs:
 # artifactName: crash-dump-linux
 # targetPath: .build/crashes
 # displayName: 'Publish Crash Reports'
 # condition: succeededOrFailed()
 - task: PublishPipelineArtifact@0
   inputs:
     artifactName: crash-dump-linux
     targetPath: .build/crashes
-  displayName: "Publish Crash Reports"
+  displayName: 'Publish Crash Reports'
   continueOnError: true
   condition: failed()
 - task: PublishTestResults@2
   displayName: Publish Tests Results
   inputs:
-    testResultsFiles: "*-results.xml"
-    searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
+    testResultsFiles: '*-results.xml'
+    searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
   condition: succeededOrFailed()
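The Electron unit-test step in this file relies on the Xvfb service configured at the top; isolated, the headless-display setup is approximately the following, with the paths and display number taken from the diff above.

#!/usr/bin/env bash
set -e
# Install the X virtual framebuffer plus native build deps so Electron can run without a real display.
sudo apt-get update
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libkrb5-dev
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
# Later steps point the tests at the virtual display:
DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests"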


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -1,135 +0,0 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin
displayName: Prepare compilation cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: Docker@1
displayName: "Pull image"
inputs:
azureSubscriptionEndpoint: "vscode-builds-subscription"
azureContainerRegistry: vscodehub.azurecr.io
command: "Run an image"
imageName: "vscode-linux-build-agent:alpine"
containerCommand: uname
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
npx https://aka.ms/enablesecurefeed standAlone
displayName: Switch to Terrapin packages
timeoutInMinutes: 5
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
- script: |
echo -n "alpine" > .build/arch
displayName: Prepare yarn cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
- script: |
set -e
export CHILD_CONCURRENCY="1"
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
docker run -e VSCODE_QUALITY -e CHILD_CONCURRENCY=1 -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:alpine /root/vscode/build/azure-pipelines/linux/alpine/install-dependencies.sh
displayName: Prebuild
- script: |
set -e
yarn gulp vscode-reh-linux-alpine-min-ci
yarn gulp vscode-reh-web-linux-alpine-min-ci
displayName: Build
- script: |
set -e
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
./build/azure-pipelines/linux/alpine/publish.sh
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
continueOnError: true
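The deleted Alpine job above installs dependencies through a small retry loop (the "try 3 times, for Terrapin" comment); extracted as a standalone sketch it looks roughly like this.

#!/usr/bin/env bash
set -e
export CHILD_CONCURRENCY="1"
# Retry the install a few times; the substituted package feed can fail transiently.
for i in {1..3}; do
  yarn --frozen-lockfile && break
  if [ "$i" -eq 3 ]; then
    echo "Yarn failed too many times" >&2
    exit 1
  fi
  echo "Yarn failed $i, trying again..."
done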


@@ -0,0 +1,115 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'npm-vscode'
platformIndependent: true
alias: 'Compilation'
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- task: Docker@1
displayName: 'Pull image'
inputs:
azureSubscriptionEndpoint: 'vscode-builds-subscription'
azureContainerRegistry: vscodehub.azurecr.io
command: 'Run an image'
imageName: 'vscode-linux-build-agent:$(VSCODE_ARCH)'
containerCommand: uname
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
- script: |
set -e
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
CHILD_CONCURRENCY=1 ./build/azure-pipelines/linux/multiarch/$(VSCODE_ARCH)/prebuild.sh
displayName: Prebuild
- script: |
set -e
./build/azure-pipelines/linux/multiarch/$(VSCODE_ARCH)/build.sh
displayName: Build
- script: |
set -e
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
./build/azure-pipelines/linux/multiarch/$(VSCODE_ARCH)/publish.sh
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true
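The new multiarch template above pushes all platform-specific work into per-architecture scripts chosen by $(VSCODE_ARCH). Stripped of the surrounding pipeline tasks, the dispatch amounts to something like the sketch below; the secret values are placeholders and the error message is illustrative.

#!/usr/bin/env bash
set -e
# VSCODE_ARCH is provided by the pipeline variables (e.g. armhf, arm64, alpine).
ARCH="${VSCODE_ARCH:?VSCODE_ARCH must be set}"
CHILD_CONCURRENCY=1 "./build/azure-pipelines/linux/multiarch/$ARCH/prebuild.sh"
"./build/azure-pipelines/linux/multiarch/$ARCH/build.sh"
AZURE_DOCUMENTDB_MASTERKEY="<docdb-key>" \
AZURE_STORAGE_ACCESS_KEY_2="<storage-key>" \
VSCODE_MIXIN_PASSWORD="<mixin-password>" \
  "./build/azure-pipelines/linux/multiarch/$ARCH/publish.sh"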


@@ -1,231 +1,200 @@
steps: steps:
- script: | - script: |
mkdir -p .build mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin displayName: Prepare cache flag
displayName: Prepare compilation cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs: inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin" keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min" targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: "npm-vscode" vstsFeed: 'npm-vscode'
platformIndependent: true platformIndependent: true
alias: "Compilation" alias: 'Compilation'
- script: | - script: |
set -e set -e
exit 1 exit 1
displayName: Check RestoreCache displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true')) condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0 - task: NodeTool@0
inputs: inputs:
versionSpec: "12.14.1" versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2 - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs: inputs:
versionSpec: "1.x" versionSpec: "1.x"
- task: AzureKeyVault@1 - task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets" displayName: 'Azure Key Vault: Get Secrets'
inputs: inputs:
azureSubscription: "vscode-builds-subscription" azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode KeyVaultName: vscode
- script: | - script: |
set -e set -e
cat << EOF > ~/.netrc cat << EOF > ~/.netrc
machine github.com machine github.com
login vscode login vscode
password $(github-distro-mixin-password) password $(github-distro-mixin-password)
EOF EOF
git config user.email "vscode@microsoft.com" git config user.email "vscode@microsoft.com"
git config user.name "VSCode" git config user.name "VSCode"
displayName: Prepare tooling displayName: Prepare tooling
- script: | - script: |
set -e set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git" git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro git fetch distro
git merge $(node -p "require('./package.json').distro") git merge $(node -p "require('./package.json').distro")
displayName: Merge distro displayName: Merge distro
- script: | - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
npx https://aka.ms/enablesecurefeed standAlone inputs:
displayName: Switch to Terrapin packages keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
timeoutInMinutes: 5 targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true')) vstsFeed: 'npm-vscode'
- script: | - script: |
echo -n $(VSCODE_ARCH) > .build/arch set -e
displayName: Prepare yarn cache flags CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs: inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock" keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: "**/node_modules, !**/node_modules/**/node_modules" targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: "npm-vscode" vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: | - script: |
set -e set -e
export npm_config_arch=$(NPM_ARCH) yarn postinstall
export CHILD_CONCURRENCY="1" displayName: Run postinstall scripts
for i in {1..3}; do # try 3 times, for Terrapin condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
yarn --frozen-lockfile && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1 - script: |
inputs: set -e
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock" node build/azure-pipelines/mixin
targetfolder: "**/node_modules, !**/node_modules/**/node_modules" displayName: Mix in quality
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: | - script: |
set -e set -e
yarn postinstall VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
displayName: Run postinstall scripts yarn gulp vscode-linux-x64-min-ci
condition: and(succeeded(), eq(variables['CacheRestored'], 'true')) VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-linux-x64-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-linux-x64-min-ci
displayName: Build
- script: | - script: |
set -e set -e
node build/azure-pipelines/mixin service xvfb start
displayName: Mix in quality displayName: Start xvfb
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - script: |
set -e set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \ DISPLAY=:10 ./scripts/test.sh --build --tfs "Unit Tests"
yarn gulp vscode-linux-$(VSCODE_ARCH)-min-ci displayName: Run unit tests (Electron)
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \ condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
yarn gulp vscode-reh-linux-$(VSCODE_ARCH)-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-linux-$(VSCODE_ARCH)-min-ci
displayName: Build
- script: | - script: |
set -e set -e
service xvfb start DISPLAY=:10 yarn test-browser --build --browser chromium --tfs "Browser Unit Tests"
displayName: Start xvfb displayName: Run unit tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false')) condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - script: |
set -e # Figure out the full absolute path of the product we just built
DISPLAY=:10 ./scripts/test.sh --build --tfs "Unit Tests" # including the remote server and configure the integration tests
displayName: Run unit tests (Electron) # to run with these builds instead of running out of sources.
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false')) set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-x64" \
DISPLAY=:10 ./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - script: |
set -e set -e
DISPLAY=:10 yarn test-browser --build --browser chromium --tfs "Browser Unit Tests" VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-x64" \
displayName: Run unit tests (Browser) DISPLAY=:10 ./resources/server/test/test-web-integration.sh --browser chromium
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false')) displayName: Run integration tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - script: |
# Figure out the full absolute path of the product we just built set -e
# including the remote server and configure the integration tests APP_ROOT=$(agent.builddirectory)/VSCode-linux-x64
# to run with these builds instead of running out of sources. APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
set -e INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH) VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-x64" \
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName") DISPLAY=:10 ./resources/server/test/test-remote-integration.sh
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \ displayName: Run remote integration tests (Electron)
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \ condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
DISPLAY=:10 ./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: | - task: PublishPipelineArtifact@0
set -e inputs:
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \ artifactName: crash-dump-linux
DISPLAY=:10 ./resources/server/test/test-web-integration.sh --browser chromium targetPath: .build/crashes
displayName: Run integration tests (Browser) displayName: 'Publish Crash Reports'
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false')) continueOnError: true
condition: failed()
- script: | - task: PublishTestResults@2
set -e displayName: Publish Tests Results
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH) inputs:
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName") testResultsFiles: '*-results.xml'
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \ searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \ condition: succeededOrFailed()
DISPLAY=:10 ./resources/server/test/test-remote-integration.sh
displayName: Run remote integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: PublishPipelineArtifact@0 - script: |
inputs: set -e
artifactName: "crash-dump-linux-$(VSCODE_ARCH)" yarn gulp "vscode-linux-x64-build-deb"
targetPath: .build/crashes yarn gulp "vscode-linux-x64-build-rpm"
displayName: "Publish Crash Reports" yarn gulp "vscode-linux-x64-prepare-snap"
continueOnError: true displayName: Build packages
condition: failed()
- task: PublishTestResults@2 - task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
displayName: Publish Tests Results inputs:
inputs: ConnectedServiceName: 'ESRP CodeSign'
testResultsFiles: "*-results.xml" FolderPath: '.build/linux/rpm/x86_64'
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results" Pattern: '*.rpm'
condition: and(succeededOrFailed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false')) signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-450779-Pgp",
"operationSetCode": "LinuxSign",
"parameters": [ ],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
displayName: Codesign rpm
- script: | - script: |
set -e set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-deb" AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-rpm" AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
displayName: Build deb, rpm packages VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
./build/azure-pipelines/linux/publish.sh
displayName: Publish
- script: | - task: PublishPipelineArtifact@0
set -e displayName: 'Publish Pipeline Artifact'
yarn gulp "vscode-linux-$(VSCODE_ARCH)-prepare-snap" inputs:
displayName: Prepare snap package artifactName: snap-x64
targetPath: .build/linux/snap-tarball
# needed for code signing - task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
- task: UseDotNet@2 displayName: 'Component Detection'
displayName: "Install .NET Core SDK 2.x" continueOnError: true
inputs:
version: 2.x
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: "ESRP CodeSign"
FolderPath: ".build/linux/rpm"
Pattern: "*.rpm"
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-450779-Pgp",
"operationSetCode": "LinuxSign",
"parameters": [ ],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
displayName: Codesign rpm
- script: |
set -e
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
VSCODE_ARCH="$(VSCODE_ARCH)" \
./build/azure-pipelines/linux/publish.sh
displayName: Publish
- task: PublishPipelineArtifact@0
displayName: "Publish Pipeline Artifact"
inputs:
artifactName: "snap-$(VSCODE_ARCH)"
targetPath: .build/linux/snap-tarball
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
continueOnError: true


@@ -4,10 +4,11 @@ REPO="$(pwd)"
 ROOT="$REPO/.."
 # Publish tarball
-PLATFORM_LINUX="linux-$VSCODE_ARCH"
+PLATFORM_LINUX="linux-x64"
 BUILDNAME="VSCode-$PLATFORM_LINUX"
+BUILD="$ROOT/$BUILDNAME"
 BUILD_VERSION="$(date +%s)"
-[ -z "$VSCODE_QUALITY" ] && TARBALL_FILENAME="code-$VSCODE_ARCH-$BUILD_VERSION.tar.gz" || TARBALL_FILENAME="code-$VSCODE_QUALITY-$VSCODE_ARCH-$BUILD_VERSION.tar.gz"
+[ -z "$VSCODE_QUALITY" ] && TARBALL_FILENAME="code-$BUILD_VERSION.tar.gz" || TARBALL_FILENAME="code-$VSCODE_QUALITY-$BUILD_VERSION.tar.gz"
 TARBALL_PATH="$ROOT/$TARBALL_FILENAME"
 rm -rf $ROOT/code-*.tar.*
@@ -27,26 +28,16 @@ rm -rf $ROOT/vscode-server-*.tar.*
 node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
 # Publish DEB
-case $VSCODE_ARCH in
-  x64) DEB_ARCH="amd64" ;;
-  *) DEB_ARCH="$VSCODE_ARCH" ;;
-esac
-PLATFORM_DEB="linux-deb-$VSCODE_ARCH"
+PLATFORM_DEB="linux-deb-x64"
+DEB_ARCH="amd64"
 DEB_FILENAME="$(ls $REPO/.build/linux/deb/$DEB_ARCH/deb/)"
 DEB_PATH="$REPO/.build/linux/deb/$DEB_ARCH/deb/$DEB_FILENAME"
 node build/azure-pipelines/common/createAsset.js "$PLATFORM_DEB" package "$DEB_FILENAME" "$DEB_PATH"
 # Publish RPM
-case $VSCODE_ARCH in
-  x64) RPM_ARCH="x86_64" ;;
-  armhf) RPM_ARCH="armv7hl" ;;
-  arm64) RPM_ARCH="aarch64" ;;
-  *) RPM_ARCH="$VSCODE_ARCH" ;;
-esac
-PLATFORM_RPM="linux-rpm-$VSCODE_ARCH"
+PLATFORM_RPM="linux-rpm-x64"
+RPM_ARCH="x86_64"
 RPM_FILENAME="$(ls $REPO/.build/linux/rpm/$RPM_ARCH/ | grep .rpm)"
 RPM_PATH="$REPO/.build/linux/rpm/$RPM_ARCH/$RPM_FILENAME"
@@ -55,6 +46,6 @@ node build/azure-pipelines/common/createAsset.js "$PLATFORM_RPM" package "$RPM_F
 # Publish Snap
 # Pack snap tarball artifact, in order to preserve file perms
 mkdir -p $REPO/.build/linux/snap-tarball
-SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-$VSCODE_ARCH.tar.gz"
+SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-x64.tar.gz"
 rm -rf $SNAP_TARBALL_PATH
 (cd .build/linux && tar -czf $SNAP_TARBALL_PATH snap)
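The arch-aware version on one side of this diff maps the pipeline architecture onto the Debian and RPM package architectures before locating the built packages; that mapping, pulled out on its own, is approximately:

#!/usr/bin/env bash
set -e
VSCODE_ARCH="${VSCODE_ARCH:-x64}"
case $VSCODE_ARCH in
  x64) DEB_ARCH="amd64" ;;
  *) DEB_ARCH="$VSCODE_ARCH" ;;
esac
case $VSCODE_ARCH in
  x64) RPM_ARCH="x86_64" ;;
  armhf) RPM_ARCH="armv7hl" ;;
  arm64) RPM_ARCH="aarch64" ;;
  *) RPM_ARCH="$VSCODE_ARCH" ;;
esac
echo "deb arch: $DEB_ARCH, rpm arch: $RPM_ARCH"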


@@ -1,56 +1,52 @@
 steps:
 - task: NodeTool@0
   inputs:
     versionSpec: "12.14.1"
 - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
   inputs:
     versionSpec: "1.x"
 - task: AzureKeyVault@1
-  displayName: "Azure Key Vault: Get Secrets"
+  displayName: 'Azure Key Vault: Get Secrets'
   inputs:
-    azureSubscription: "vscode-builds-subscription"
+    azureSubscription: 'vscode-builds-subscription'
     KeyVaultName: vscode
 - task: DownloadPipelineArtifact@0
-  displayName: "Download Pipeline Artifact"
+  displayName: 'Download Pipeline Artifact'
   inputs:
-    artifactName: snap-$(VSCODE_ARCH)
+    artifactName: snap-x64
     targetPath: .build/linux/snap-tarball
 - script: |
     set -e
     # Get snapcraft version
     snapcraft --version
     # Make sure we get latest packages
     sudo apt-get update
     sudo apt-get upgrade -y
     # Define variables
     REPO="$(pwd)"
-    SNAP_ROOT="$REPO/.build/linux/snap/$(VSCODE_ARCH)"
+    SNAP_ROOT="$REPO/.build/linux/snap/x64"
     # Install build dependencies
     (cd build && yarn)
     # Unpack snap tarball artifact, in order to preserve file perms
-    SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-$(VSCODE_ARCH).tar.gz"
+    SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-x64.tar.gz"
     (cd .build/linux && tar -xzf $SNAP_TARBALL_PATH)
     # Create snap package
     BUILD_VERSION="$(date +%s)"
-    SNAP_FILENAME="code-$VSCODE_QUALITY-$(VSCODE_ARCH)-$BUILD_VERSION.snap"
+    SNAP_FILENAME="code-$VSCODE_QUALITY-$BUILD_VERSION.snap"
     SNAP_PATH="$SNAP_ROOT/$SNAP_FILENAME"
-    case $(VSCODE_ARCH) in
-      x64) SNAPCRAFT_TARGET_ARGS="" ;;
-      *) SNAPCRAFT_TARGET_ARGS="--target-arch $(VSCODE_ARCH)" ;;
-    esac
-    (cd $SNAP_ROOT/code-* && sudo --preserve-env snapcraft snap $SNAPCRAFT_TARGET_ARGS --output "$SNAP_PATH")
+    (cd $SNAP_ROOT/code-* && sudo --preserve-env snapcraft snap --output "$SNAP_PATH")
     # Publish snap package
     AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
     AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
-    node build/azure-pipelines/common/createAsset.js "linux-snap-$(VSCODE_ARCH)" package "$SNAP_FILENAME" "$SNAP_PATH"
+    node build/azure-pipelines/common/createAsset.js "linux-snap-x64" package "$SNAP_FILENAME" "$SNAP_PATH"
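The snap job keeps file permissions intact by shipping the snap directory between jobs as a tarball and only then calling snapcraft. A condensed sketch of that flow, with the architecture fixed to x64 as in one side of the diff; VSCODE_QUALITY is assumed to be set in the environment.

#!/usr/bin/env bash
set -e
REPO="$(pwd)"
SNAP_ROOT="$REPO/.build/linux/snap/x64"
# Unpack the tarball produced by the build job to preserve file perms.
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-x64.tar.gz"
(cd .build/linux && tar -xzf "$SNAP_TARBALL_PATH")
# Build the .snap package from the unpacked snapcraft project.
BUILD_VERSION="$(date +%s)"
SNAP_FILENAME="code-$VSCODE_QUALITY-$BUILD_VERSION.snap"
SNAP_PATH="$SNAP_ROOT/$SNAP_FILENAME"
(cd "$SNAP_ROOT"/code-* && sudo --preserve-env snapcraft snap --output "$SNAP_PATH")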


@@ -9,7 +9,6 @@ steps:
   displayName: Prepare cache flag
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
-  displayName: Restore Cache - Compiled Files
   inputs:
     keyfile: 'build/.cachesalt, .build/commit, .build/quality'
     targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
@@ -46,7 +45,7 @@ steps:
     password $(github-distro-mixin-password)
     EOF
-    git config user.email "sqltools@service.microsoft.com"
+    git config user.email "andresse@microsoft.com"
     git config user.name "AzureDataStudio"
   displayName: Prepare tooling
@@ -58,7 +57,6 @@ steps:
   displayName: Merge distro
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
-  displayName: Restore Cache - Node Modules
   inputs:
     keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
     targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -73,7 +71,6 @@ steps:
   condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
-  displayName: Save Cache - Node Modules
   inputs:
     keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
     targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -94,7 +91,8 @@ steps:
 - script: |
     set -e
     yarn gulp vscode-linux-x64-min-ci
-    yarn gulp vscode-web-min-ci
+    yarn gulp vscode-reh-linux-x64-min-ci
+    yarn gulp vscode-reh-web-linux-x64-min-ci
   displayName: Build
   env:
     VSCODE_MIXIN_PASSWORD: $(github-distro-mixin-password)
@@ -136,8 +134,7 @@ steps:
     set -e
     APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
     APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
-    export INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
-    export NO_CLEANUP=1
+    INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
     DISPLAY=:10 node ./scripts/test-extensions-unit.js ${{ extension }}
   displayName: 'Run ${{ extension }} Stable Extension Unit Tests'
   condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
@@ -152,21 +149,6 @@ steps:
   continueOnError: true
   condition: and(succeeded(), eq(variables['RUN_UNSTABLE_TESTS'], 'true'))
- - bash: |
-     set -e
-     mkdir -p $(Build.ArtifactStagingDirectory)/logs/linux-x64
-     cd /tmp
-     for folder in adsuser*/
-     do
-       folder=${folder%/}
-       # Only archive directories we want for debugging purposes
-       tar -czvf $(Build.ArtifactStagingDirectory)/logs/linux-x64/$folder.tar.gz $folder/User $folder/logs
-     done
-   displayName: Archive Logs
-   continueOnError: true
-   condition: succeededOrFailed()
 - script: |
     set -e
     yarn gulp vscode-linux-x64-build-deb
@@ -235,11 +217,10 @@ steps:
     testResultsFiles: '*.xml'
     searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
   continueOnError: true
-  condition: and(succeededOrFailed(), eq(variables['RUN_TESTS'], 'true'))
+  condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
 - task: PublishBuildArtifacts@1
   displayName: 'Publish Artifact: drop'
-  condition: succeededOrFailed()
 - task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
   displayName: 'Component Detection'
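The removed Archive Logs step above gathered per-user Azure Data Studio data directories from /tmp for debugging; as a standalone script it was essentially the following. The staging-directory variable is the shell form of $(Build.ArtifactStagingDirectory), taken here as an assumption about the agent environment.

#!/usr/bin/env bash
set -e
STAGING="${BUILD_ARTIFACTSTAGINGDIRECTORY:?}"
mkdir -p "$STAGING/logs/linux-x64"
cd /tmp
for folder in adsuser*/
do
  folder=${folder%/}
  # Only archive directories we want for debugging purposes
  tar -czvf "$STAGING/logs/linux-x64/$folder.tar.gz" "$folder/User" "$folder/logs"
done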


@@ -2,201 +2,143 @@ trigger: none
pr: none pr: none
schedules: schedules:
- cron: "0 5 * * Mon-Fri" - cron: "0 5 * * Mon-Fri"
displayName: Mon-Fri at 7:00 displayName: Mon-Fri at 7:00
branches: branches:
include: include:
- master - master
resources: resources:
containers: containers:
- container: vscode-x64 - container: vscode-x64
image: vscodehub.azurecr.io/vscode-linux-build-agent:x64 image: vscodehub.azurecr.io/vscode-linux-build-agent:x64
endpoint: VSCodeHub endpoint: VSCodeHub
- container: vscode-arm64 - container: snapcraft
image: vscodehub.azurecr.io/vscode-linux-build-agent:stretch-arm64 image: snapcore/snapcraft:stable
endpoint: VSCodeHub
- container: vscode-armhf
image: vscodehub.azurecr.io/vscode-linux-build-agent:stretch-armhf
endpoint: VSCodeHub
- container: snapcraft
image: snapcore/snapcraft:stable
stages: stages:
- stage: Compile - stage: Compile
jobs: jobs:
- job: Compile - job: Compile
pool:
vmImage: "Ubuntu-16.04"
container: vscode-x64
variables:
VSCODE_ARCH: x64
steps:
- template: product-compile.yml
- stage: Windows
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool: pool:
vmImage: VS2017-Win2016 vmImage: 'Ubuntu-16.04'
jobs: container: vscode-x64
- job: Windows steps:
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true')) - template: product-compile.yml
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
- job: Windows32 - stage: Windows
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32_32BIT'], 'true')) dependsOn:
timeoutInMinutes: 90 - Compile
variables: condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
VSCODE_ARCH: ia32 pool:
steps: vmImage: VS2017-Win2016
- template: win32/product-build-win32.yml jobs:
- job: Windows
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true'))
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
- job: WindowsARM64 - job: Windows32
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32_ARM64'], 'true')) condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32_32BIT'], 'true'))
timeoutInMinutes: 90 variables:
variables: VSCODE_ARCH: ia32
VSCODE_ARCH: arm64 steps:
steps: - template: win32/product-build-win32.yml
- template: win32/product-build-win32.yml
- stage: Linux - job: WindowsARM64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32_ARM64'], 'true'))
variables:
VSCODE_ARCH: arm64
steps:
- template: win32/product-build-win32-arm64.yml
- stage: Linux
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: 'Ubuntu-16.04'
jobs:
- job: Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: vscode-x64
steps:
- template: linux/product-build-linux.yml
- job: LinuxSnap
dependsOn: dependsOn:
- Compile - Linux
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false')) condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
pool: container: snapcraft
vmImage: "Ubuntu-16.04" steps:
jobs: - template: linux/snap-build-linux.yml
- job: Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: vscode-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
steps:
- template: linux/product-build-linux.yml
- job: LinuxSnap - job: LinuxArmhf
dependsOn: condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARMHF'], 'true'))
- Linux variables:
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true')) VSCODE_ARCH: armhf
container: snapcraft steps:
variables: - template: linux/product-build-linux-multiarch.yml
VSCODE_ARCH: x64
steps:
- template: linux/snap-build-linux.yml
- job: LinuxArmhf - job: LinuxArm64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARMHF'], 'true')) condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARM64'], 'true'))
container: vscode-armhf variables:
variables: VSCODE_ARCH: arm64
VSCODE_ARCH: armhf steps:
NPM_ARCH: armv7l - template: linux/product-build-linux-multiarch.yml
steps:
- template: linux/product-build-linux.yml
- job: LinuxSnapArmhf - job: LinuxAlpine
dependsOn: condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ALPINE'], 'true'))
- LinuxArmhf variables:
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARMHF'], 'true')) VSCODE_ARCH: alpine
container: snapcraft steps:
variables: - template: linux/product-build-linux-multiarch.yml
VSCODE_ARCH: armhf
steps:
- template: linux/snap-build-linux.yml
- job: LinuxArm64 - job: LinuxWeb
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARM64'], 'true')) condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'))
container: vscode-arm64 variables:
variables: VSCODE_ARCH: x64
VSCODE_ARCH: arm64 steps:
NPM_ARCH: arm64 - template: web/product-build-web.yml
steps:
- template: linux/product-build-linux.yml
- job: LinuxSnapArm64 - stage: macOS
dependsOn: dependsOn:
- LinuxArm64 - Compile
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARM64'], 'true')) condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
container: snapcraft pool:
variables: vmImage: macOS-latest
VSCODE_ARCH: arm64 jobs:
steps: - job: macOS
- template: linux/snap-build-linux.yml condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'))
steps:
- template: darwin/product-build-darwin.yml
- job: LinuxAlpine - stage: Mooncake
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ALPINE'], 'true')) dependsOn:
steps: - Windows
- template: linux/product-build-alpine.yml - Linux
- macOS
condition: and(succeededOrFailed(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: 'Ubuntu-16.04'
jobs:
- job: SyncMooncake
displayName: Sync Mooncake
steps:
- template: sync-mooncake.yml
- job: LinuxWeb - stage: Publish
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true')) dependsOn:
variables: - Windows
VSCODE_ARCH: x64 - Linux
steps: - macOS
- template: web/product-build-web.yml condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'), or(eq(variables['VSCODE_RELEASE'], 'true'), and(or(eq(variables['VSCODE_QUALITY'], 'insider'), eq(variables['VSCODE_QUALITY'], 'exploration')), eq(variables['Build.Reason'], 'Schedule'))))
pool:
- stage: macOS vmImage: 'Ubuntu-16.04'
dependsOn: jobs:
- Compile - job: BuildService
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false')) displayName: Build Service
pool: steps:
vmImage: macOS-latest - template: release.yml
jobs:
- job: macOS
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
- stage: macOSARM64
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: macOS-latest
jobs:
- job: macOSARM64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS_ARM64'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: arm64
steps:
- template: darwin/product-build-darwin.yml
- stage: Mooncake
dependsOn:
- Windows
- Linux
- macOS
- macOSARM64
condition: and(succeededOrFailed(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: "Ubuntu-16.04"
jobs:
- job: SyncMooncake
displayName: Sync Mooncake
steps:
- template: sync-mooncake.yml
- stage: Publish
dependsOn:
- Windows
- Linux
- macOS
- macOSARM64
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'), or(eq(variables['VSCODE_RELEASE'], 'true'), and(or(eq(variables['VSCODE_QUALITY'], 'insider'), eq(variables['VSCODE_QUALITY'], 'exploration')), eq(variables['Build.Reason'], 'Schedule'))))
pool:
vmImage: "Ubuntu-16.04"
jobs:
- job: BuildService
displayName: Build Service
steps:
- template: release.yml


@@ -1,161 +1,142 @@
steps: steps:
- script: | - script: |
mkdir -p .build mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin displayName: Prepare cache flag
displayName: Prepare compilation cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs: inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin" keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min" targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: "npm-vscode" vstsFeed: 'npm-vscode'
platformIndependent: true platformIndependent: true
alias: "Compilation" alias: 'Compilation'
dryRun: true dryRun: true
- task: NodeTool@0 - task: NodeTool@0
inputs: inputs:
versionSpec: "12.14.1" versionSpec: "12.14.1"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true')) condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2 - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs: inputs:
versionSpec: "1.x" versionSpec: "1.x"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true')) condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: AzureKeyVault@1 - task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets" displayName: 'Azure Key Vault: Get Secrets'
inputs: inputs:
azureSubscription: "vscode-builds-subscription" azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode KeyVaultName: vscode
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true')) condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: | - script: |
set -e set -e
cat << EOF > ~/.netrc cat << EOF > ~/.netrc
machine github.com machine github.com
login vscode login vscode
password $(github-distro-mixin-password) password $(github-distro-mixin-password)
EOF EOF
git config user.email "vscode@microsoft.com" git config user.email "vscode@microsoft.com"
git config user.name "VSCode" git config user.name "VSCode"
displayName: Prepare tooling displayName: Prepare tooling
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true')) condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: | - script: |
set -e set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git" git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro git fetch distro
git merge $(node -p "require('./package.json').distro") git merge $(node -p "require('./package.json').distro")
displayName: Merge distro displayName: Merge distro
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true')) condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: | - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
npx https://aka.ms/enablesecurefeed standAlone inputs:
-  displayName: Switch to Terrapin packages
-  timeoutInMinutes: 5
-  condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
-- script: |
-    echo -n $(VSCODE_ARCH) > .build/arch
-  displayName: Prepare yarn cache flags
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
   inputs:
-    keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
-    targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
-    vstsFeed: "npm-vscode"
+    keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
+    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
+    vstsFeed: 'npm-vscode'
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
 - script: |
     set -e
-    export CHILD_CONCURRENCY="1"
-    for i in {1..3}; do # try 3 times, for Terrapin
-      yarn --frozen-lockfile && break
-      if [ $i -eq 3 ]; then
-        echo "Yarn failed too many times" >&2
-        exit 1
-      fi
-      echo "Yarn failed $i, trying again..."
-    done
+    CHILD_CONCURRENCY=1 yarn --frozen-lockfile
   displayName: Install dependencies
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), ne(variables['CacheRestored'], 'true'))
 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
   inputs:
-    keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
-    targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
-    vstsFeed: "npm-vscode"
+    keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
+    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
+    vstsFeed: 'npm-vscode'
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), ne(variables['CacheRestored'], 'true'))
 - script: |
     set -e
     yarn postinstall
   displayName: Run postinstall scripts
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), eq(variables['CacheRestored'], 'true'))
 # Mixin must run before optimize, because the CSS loader will
 # inline small SVGs
 - script: |
     set -e
     node build/azure-pipelines/mixin
   displayName: Mix in quality
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
 - script: |
     set -e
     yarn gulp hygiene
     yarn monaco-compile-check
     yarn valid-layers-check
   displayName: Run hygiene, monaco compile & valid layers checks
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
 - script: |
     set -
     ./build/azure-pipelines/common/extract-telemetry.sh
   displayName: Extract Telemetry
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
 - script: |
     set -e
     AZURE_WEBVIEW_STORAGE_ACCESS_KEY="$(vscode-webview-storage-key)" \
     ./build/azure-pipelines/common/publish-webview.sh
   displayName: Publish Webview
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
 - script: |
     set -e
     yarn gulp compile-build
     yarn gulp compile-extensions-build
     yarn gulp minify-vscode
-    yarn gulp vscode-reh-linux-x64-min
-    yarn gulp vscode-reh-web-linux-x64-min
+    yarn gulp minify-vscode-reh
+    yarn gulp minify-vscode-reh-web
   displayName: Compile
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
 - script: |
     set -e
     AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
     node build/azure-pipelines/upload-sourcemaps
   displayName: Upload sourcemaps
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
 - script: |
     set -e
     VERSION=`node -p "require(\"./package.json\").version"`
     AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
     node build/azure-pipelines/common/createBuild.js $VERSION
   displayName: Create build
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
   inputs:
-    keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
-    targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
-    vstsFeed: "npm-vscode"
+    keyfile: 'build/.cachesalt, .build/commit, .build/quality'
+    targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
+    vstsFeed: 'npm-vscode'
     platformIndependent: true
-    alias: "Compilation"
+    alias: 'Compilation'
   condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))

View File
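The hunk above drops the Terrapin-specific steps but keeps the restore/install/save caching pattern that recurs throughout these pipelines: a RestoreCache task keyed on the lockfiles (plus the .build/arch and .build/terrapin flag files in the older variant), an install step that only runs on a cache miss, and a SaveCache task that writes node_modules back under the same key. A minimal sketch of that pattern, reusing the task names, keyfile/targetfolder globs, and the CacheRestored variable from the steps above; the stripped-down install script is illustrative, and the conditions assume the restore task sets CacheRestored on a hit, as the pipelines above do:

steps:
# Restore node_modules keyed on the lockfiles and cache-salt files; any change produces a new key.
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
  inputs:
    keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
    vstsFeed: 'npm-vscode'
# Run the install only when the restore step reported a miss.
- script: CHILD_CONCURRENCY=1 yarn --frozen-lockfile
  displayName: Install dependencies
  condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
# Save the freshly populated node_modules under the same key for the next run.
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
  inputs:
    keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
    vstsFeed: 'npm-vscode'
  condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))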

@@ -2,82 +2,82 @@
 trigger:
   branches:
-    include: ["refs/tags/*"]
+    include: ['refs/tags/*']
 pr: none
 steps:
 - task: NodeTool@0
   inputs:
     versionSpec: "12.14.1"
 - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
   inputs:
     versionSpec: "1.x"
 - bash: |
     TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
     CHANNEL="G1C14HJ2F"
     if [ "$TAG_VERSION" == "1.999.0" ]; then
       MESSAGE="<!here>. Someone pushed 1.999.0 tag. Please delete it ASAP from remote and local."
       curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
       -H 'Content-type: application/json; charset=utf-8' \
       --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE"'"}' \
       https://slack.com/api/chat.postMessage
       exit 1
     fi
   displayName: Check 1.999.0 tag
 - bash: |
     # Install build dependencies
     (cd build && yarn)
     node build/azure-pipelines/publish-types/check-version.js
   displayName: Check version
 - bash: |
     git config --global user.email "vscode@microsoft.com"
     git config --global user.name "VSCode"
     git clone https://$(GITHUB_TOKEN)@github.com/DefinitelyTyped/DefinitelyTyped.git --depth=1
     node build/azure-pipelines/publish-types/update-types.js
     TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
     cd DefinitelyTyped
     git diff --color | cat
     git add -A
     git status
     git checkout -b "vscode-types-$TAG_VERSION"
     git commit -m "VS Code $TAG_VERSION Extension API"
     git push origin "vscode-types-$TAG_VERSION"
   displayName: Push update to DefinitelyTyped
 - bash: |
     TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
     CHANNEL="G1C14HJ2F"
     MESSAGE="DefinitelyTyped/DefinitelyTyped#vscode-types-$TAG_VERSION created. Endgame master, please open this link, examine changes and create a PR:"
     LINK="https://github.com/DefinitelyTyped/DefinitelyTyped/compare/vscode-types-$TAG_VERSION?quick_pull=1&body=Updating%20VS%20Code%20Extension%20API.%20See%20https%3A%2F%2Fgithub.com%2Fmicrosoft%2Fvscode%2Fissues%2F70175%20for%20details."
     MESSAGE2="[@eamodio, @jrieken, @kmaetzel, @egamma]. Please review and merge PR to publish @types/vscode."
     curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
     -H 'Content-type: application/json; charset=utf-8' \
     --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE"'"}' \
     https://slack.com/api/chat.postMessage
     curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
     -H 'Content-type: application/json; charset=utf-8' \
     --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$LINK"'"}' \
     https://slack.com/api/chat.postMessage
     curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
     -H 'Content-type: application/json; charset=utf-8' \
     --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE2"'"}' \
     https://slack.com/api/chat.postMessage
   displayName: Send message on Slack

View File

@@ -1,22 +1,22 @@
 steps:
 - task: NodeTool@0
   inputs:
     versionSpec: "10.x"
 - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
   inputs:
     versionSpec: "1.x"
 - task: AzureKeyVault@1
-  displayName: "Azure Key Vault: Get Secrets"
+  displayName: 'Azure Key Vault: Get Secrets'
   inputs:
-    azureSubscription: "vscode-builds-subscription"
+    azureSubscription: 'vscode-builds-subscription'
     KeyVaultName: vscode
 - script: |
     set -e
     (cd build ; yarn)
     AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
     node build/azure-pipelines/common/releaseBuild.js

View File

@@ -17,7 +17,7 @@ jobs:
 - template: sql-product-compile.yml
 - job: macOS
-  condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), ne(variables['VSCODE_QUALITY'], 'saw'))
+  condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'))
   pool:
     vmImage: macOS-latest
   dependsOn:
@@ -27,7 +27,7 @@ jobs:
   timeoutInMinutes: 180
 - job: macOS_Signing
-  condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), eq(variables['signed'], true), ne(variables['VSCODE_QUALITY'], 'saw'))
+  condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), eq(variables['signed'], true))
   pool:
     vmImage: macOS-latest
   dependsOn:
@@ -50,7 +50,7 @@ jobs:
   timeoutInMinutes: 70
 - job: LinuxWeb
-  condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'), ne(variables['VSCODE_QUALITY'], 'saw'))
+  condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'))
   pool:
     vmImage: 'Ubuntu-16.04'
   container: linux-x64
@@ -61,15 +61,15 @@ jobs:
   steps:
   - template: web/sql-product-build-web.yml
-# - job: Docker
-#   condition: and(succeeded(), eq(variables['VSCODE_BUILD_DOCKER'], 'true'))
-#   pool:
-#     vmImage: 'Ubuntu-16.04'
-#   container: linux-x64
-#   dependsOn:
-#   - Linux
-#   steps:
-#   - template: docker/sql-product-build-docker.yml
+- job: Docker
+  condition: and(succeeded(), eq(variables['VSCODE_BUILD_DOCKER'], 'true'))
+  pool:
+    vmImage: 'Ubuntu-16.04'
+  container: linux-x64
+  dependsOn:
+  - Linux
+  steps:
+  - template: docker/sql-product-build-docker.yml
 - job: Windows
   condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true'))
@@ -98,7 +98,7 @@ jobs:
   dependsOn:
   - macOS
   - Linux
-# - Docker
+  - Docker
   - Windows
   - Windows_Test
   - LinuxWeb

View File

@@ -6,7 +6,6 @@ steps:
   displayName: Prepare cache flag
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
-  displayName: Restore Cache - Compiled Files
   inputs:
     keyfile: 'build/.cachesalt, .build/commit, .build/quality'
     targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
@@ -37,7 +36,7 @@ steps:
     password $(github-distro-mixin-password)
     EOF
-    git config user.email "sqltools@service.microsoft.com"
+    git config user.email "andresse@microsoft.com"
     git config user.name "AzureDataStudio"
   displayName: Prepare tooling
   condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
@@ -51,7 +50,6 @@ steps:
   condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
-  displayName: Restore Cache - Node Modules
   inputs:
     keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
     targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -64,7 +62,6 @@ steps:
   condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
-  displayName: Save Cache - Node Modules
   inputs:
     keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
     targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -99,8 +96,8 @@ steps:
     yarn gulp compile-build
     yarn gulp compile-extensions-build
     yarn gulp minify-vscode
-    yarn gulp vscode-reh-linux-x64-min
-    yarn gulp vscode-reh-web-linux-x64-min
+    yarn gulp minify-vscode-reh
+    yarn gulp minify-vscode-reh-web
   displayName: Compile
   condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
@@ -126,7 +123,6 @@ steps:
   displayName: 'Publish Artifact: drop'
 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
-  displayName: Save Cache - Compiled Files
   inputs:
     keyfile: 'build/.cachesalt, .build/commit, .build/quality'
     targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'

View File

@@ -1,24 +1,24 @@
 steps:
 - task: NodeTool@0
   inputs:
     versionSpec: "12.14.1"
 - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
   inputs:
     versionSpec: "1.x"
 - task: AzureKeyVault@1
-  displayName: "Azure Key Vault: Get Secrets"
+  displayName: 'Azure Key Vault: Get Secrets'
   inputs:
-    azureSubscription: "vscode-builds-subscription"
+    azureSubscription: 'vscode-builds-subscription'
     KeyVaultName: vscode
 - script: |
     set -e
     (cd build ; yarn)
     AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
     AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
     MOONCAKE_STORAGE_ACCESS_KEY="$(vscode-mooncake-storage-key)" \
     node build/azure-pipelines/common/sync-mooncake.js "$VSCODE_QUALITY"

View File

@@ -1,35 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const path = require("path");
const es = require("event-stream");
const vfs = require("vinyl-fs");
const util = require("../lib/util");
const filter = require("gulp-filter");
const gzip = require("gulp-gzip");
const azure = require('gulp-azure-storage');
const root = path.dirname(path.dirname(__dirname));
const commit = util.getVersion(root);
function main() {
return vfs.src('**', { cwd: '../vscode-web', base: '../vscode-web', dot: true })
.pipe(filter(f => !f.isDirectory()))
.pipe(gzip({ append: false }))
.pipe(es.through(function (data) {
console.log('Uploading CDN file:', data.relative); // debug
this.emit('data', data);
}))
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
container: process.env.VSCODE_QUALITY,
prefix: commit + '/',
contentSettings: {
contentEncoding: 'gzip',
cacheControl: 'max-age=31536000, public'
}
}));
}
main();

View File

@@ -1,40 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as path from 'path';
import * as es from 'event-stream';
import * as Vinyl from 'vinyl';
import * as vfs from 'vinyl-fs';
import * as util from '../lib/util';
import * as filter from 'gulp-filter';
import * as gzip from 'gulp-gzip';
const azure = require('gulp-azure-storage');
const root = path.dirname(path.dirname(__dirname));
const commit = util.getVersion(root);
function main() {
return vfs.src('**', { cwd: '../vscode-web', base: '../vscode-web', dot: true })
.pipe(filter(f => !f.isDirectory()))
.pipe(gzip({ append: false }))
.pipe(es.through(function (data: Vinyl) {
console.log('Uploading CDN file:', data.relative); // debug
this.emit('data', data);
}))
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
container: process.env.VSCODE_QUALITY,
prefix: commit + '/',
contentSettings: {
contentEncoding: 'gzip',
cacheControl: 'max-age=31536000, public'
}
}));
}
main();

View File

@@ -1,132 +1,106 @@
 steps:
 - script: |
     mkdir -p .build
     echo -n $BUILD_SOURCEVERSION > .build/commit
     echo -n $VSCODE_QUALITY > .build/quality
-    echo -n $ENABLE_TERRAPIN > .build/terrapin
-  displayName: Prepare compilation cache flag
+  displayName: Prepare cache flag
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
   inputs:
-    keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
-    targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
-    vstsFeed: "npm-vscode"
+    keyfile: 'build/.cachesalt, .build/commit, .build/quality'
+    targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
+    vstsFeed: 'npm-vscode'
     platformIndependent: true
-    alias: "Compilation"
+    alias: 'Compilation'
 - script: |
     set -e
     exit 1
   displayName: Check RestoreCache
   condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
 - task: NodeTool@0
   inputs:
     versionSpec: "12.14.1"
 - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
   inputs:
     versionSpec: "1.x"
 - task: AzureKeyVault@1
-  displayName: "Azure Key Vault: Get Secrets"
+  displayName: 'Azure Key Vault: Get Secrets'
   inputs:
-    azureSubscription: "vscode-builds-subscription"
+    azureSubscription: 'vscode-builds-subscription'
     KeyVaultName: vscode
 - script: |
     set -e
     cat << EOF > ~/.netrc
     machine github.com
     login vscode
     password $(github-distro-mixin-password)
     EOF
     git config user.email "vscode@microsoft.com"
     git config user.name "VSCode"
   displayName: Prepare tooling
 - script: |
     set -e
     git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
     git fetch distro
     git merge $(node -p "require('./package.json').distro")
   displayName: Merge distro
-- script: |
-    npx https://aka.ms/enablesecurefeed standAlone
-  displayName: Switch to Terrapin packages
-  timeoutInMinutes: 5
-  condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
-- script: |
-    echo -n "web" > .build/arch
-  displayName: Prepare yarn cache flag
-- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
-  inputs:
-    keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
-    targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
-    vstsFeed: "npm-vscode"
+# - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
+#   inputs:
+#     keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
+#     targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
+#     vstsFeed: 'npm-vscode'
 - script: |
     set -e
-    export CHILD_CONCURRENCY="1"
-    for i in {1..3}; do # try 3 times, for Terrapin
-      yarn --frozen-lockfile && break
-      if [ $i -eq 3 ]; then
-        echo "Yarn failed too many times" >&2
-        exit 1
-      fi
-      echo "Yarn failed $i, trying again..."
-    done
+    CHILD_CONCURRENCY=1 yarn --frozen-lockfile
   displayName: Install dependencies
-  condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
-- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
-  inputs:
-    keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
-    targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
-    vstsFeed: "npm-vscode"
-  condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
-- script: |
-    set -e
-    yarn postinstall
-  displayName: Run postinstall scripts
-  condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
+# condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
+# - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
+#   inputs:
+#     keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
+#     targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
+#     vstsFeed: 'npm-vscode'
+# condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
+# - script: |
+#     set -e
+#     yarn postinstall
+#   displayName: Run postinstall scripts
+#   condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
 - script: |
     set -e
     node build/azure-pipelines/mixin
   displayName: Mix in quality
 - script: |
     set -e
     VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
     yarn gulp vscode-web-min-ci
   displayName: Build
-- script: |
-    set -e
-    AZURE_STORAGE_ACCOUNT="$(web-storage-account)" \
-    AZURE_STORAGE_ACCESS_KEY="$(web-storage-key)" \
-    node build/azure-pipelines/upload-cdn.js
-  displayName: Upload to CDN
 # upload only the workbench.web.api.js source maps because
 # we just compiled these bits in the previous step and the
 # general task to upload source maps has already been run
 - script: |
     set -e
     AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
     node build/azure-pipelines/upload-sourcemaps out-vscode-web-min out-vscode-web-min/vs/workbench/workbench.web.api.js.map
   displayName: Upload sourcemaps (Web)
 - script: |
     set -e
     AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
     AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
     VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
     ./build/azure-pipelines/web/publish.sh
   displayName: Publish

View File

@@ -6,7 +6,6 @@ steps:
   displayName: Prepare cache flag
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
-  displayName: Restore Cache - Compiled Files
   inputs:
     keyfile: 'build/.cachesalt, .build/commit, .build/quality'
     targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
@@ -43,7 +42,7 @@ steps:
     password $(github-distro-mixin-password)
     EOF
-    git config user.email "sqltools@service.microsoft.com"
+    git config user.email "andresse@microsoft.com"
     git config user.name "AzureDataStudio"
   displayName: Prepare tooling
@@ -55,7 +54,6 @@ steps:
   displayName: Merge distro
 # - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
-#   displayName: Restore Cache - Node Modules
 #   inputs:
 #     keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
 #     targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
@@ -68,7 +66,6 @@ steps:
 #   condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
 # - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
-#   displayName: Save Cache - Node Modules
 #   inputs:
 #     keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
 #     targetfolder: '**/node_modules, !**/node_modules/**/node_modules'

View File

@@ -1,7 +1,6 @@
 <?xml version="1.0" encoding="utf-8"?>
 <configuration>
   <packageSources>
-    <clear/>
     <add key="ESRP" value="https://microsoft.pkgs.visualstudio.com/_packaging/ESRP/nuget/v3/index.json" />
   </packageSources>
 </configuration>

View File

@@ -1,4 +1,4 @@
 <?xml version="1.0" encoding="utf-8"?>
 <packages>
-  <package id="Microsoft.ESRPClient" version="1.2.47" />
+  <package id="Microsoft.ESRPClient" version="1.2.25" />
 </packages>

View File

@@ -1,82 +1,80 @@
 steps:
 - task: NodeTool@0
   inputs:
     versionSpec: "12.14.1"
 - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3 # {{SQL CARBON EDIT}} update version
   inputs:
     versionSpec: "1.x"
 - task: UsePythonVersion@0
   inputs:
-    versionSpec: "2.x"
+    versionSpec: '2.x'
     addToPath: true
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
-  displayName: Restore Cache - Node Modules # {{SQL CARBON EDIT}}
   inputs:
-    keyfile: "build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock"
-    targetfolder: "**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules"
-    vstsFeed: "npm-cache" # {{SQL CARBON EDIT}} update build cache
+    keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
+    targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
+    vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache
 - powershell: |
     yarn --frozen-lockfile
   env:
     CHILD_CONCURRENCY: "1"
   displayName: Install Dependencies
   condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
-  displayName: Save Cache - Node Modules # {{SQL CARBON EDIT}}
   inputs:
-    keyfile: "build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock"
-    targetfolder: "**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules"
-    vstsFeed: "npm-cache" # {{SQL CARBON EDIT}} update build cache
+    keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
+    targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
+    vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache
   condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
 - powershell: |
     yarn electron
   displayName: Download Electron
 # - powershell: | {{SQL CARBON EDIT}} remove editor check
 #     yarn monaco-compile-check
 #   displayName: Run Monaco Editor Checks
 - script: |
     yarn valid-layers-check
   displayName: Run Valid Layers Checks
 - powershell: |
     yarn compile
   displayName: Compile Sources
 # - powershell: | {{SQL CARBON EDIT}} remove step
 #     yarn download-builtin-extensions
 #   displayName: Download Built-in Extensions
 - powershell: |
     .\scripts\test.bat --tfs "Unit Tests"
   displayName: Run Unit Tests (Electron)
 # - powershell: | {{SQL CARBON EDIT}} disable
 #     yarn test-browser --browser chromium --browser firefox --tfs "Browser Unit Tests"
 #   displayName: Run Unit Tests (Browser)
 # - powershell: | {{SQL CARBON EDIT}} disable
 #     .\scripts\test-integration.bat --tfs "Integration Tests"
 #   displayName: Run Integration Tests (Electron)
 - task: PublishPipelineArtifact@0
-  displayName: "Publish Crash Reports"
+  displayName: 'Publish Crash Reports'
   inputs:
     artifactName: crash-dump-windows
     targetPath: .build\crashes
   continueOnError: true
   condition: failed()
 - task: PublishTestResults@2
   displayName: Publish Tests Results
   inputs:
-    testResultsFiles: "*-results.xml"
-    searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
+    testResultsFiles: '*-results.xml'
+    searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
   condition: succeededOrFailed()

View File

@@ -12,9 +12,9 @@ $ServerZipLocation = "$Repo\.build\win32-$Arch\server"
 $ServerZip = "$ServerZipLocation\azuredatastudio-server-win32-$Arch.zip"
 # Create server archive
-# New-Item $ServerZipLocation -ItemType Directory # this will throw even when success for we don't want to exec this
+New-Item $ServerZipLocation -ItemType Directory # this will throw even when success for we don't want to exec this
 $global:LASTEXITCODE = 0
-# exec { Rename-Item -Path $LegacyServer -NewName $ServerName } "Rename Item"
-# exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r } "Zip Server"
+exec { Rename-Item -Path $LegacyServer -NewName $ServerName } "Rename Item"
+exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r } "Zip Server"
 exec { node build/azure-pipelines/common/copyArtifacts.js } "Copy Artifacts"

View File

@@ -1,14 +1,14 @@
-param ($CertBase64)
-$ErrorActionPreference = "Stop"
-$CertBytes = [System.Convert]::FromBase64String($CertBase64)
-$CertCollection = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2Collection
-$CertCollection.Import($CertBytes, $null, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable)
-$CertStore = New-Object System.Security.Cryptography.X509Certificates.X509Store("My","LocalMachine")
-$CertStore.Open("ReadWrite")
-$CertStore.AddRange($CertCollection)
-$CertStore.Close()
-$ESRPAuthCertificateSubjectName = $CertCollection[0].Subject
+Param(
+  [string]$AuthCertificateBase64,
+  [string]$AuthCertificateKey
+)
+# Import auth certificate
+$AuthCertificateFileName = [System.IO.Path]::GetTempFileName()
+$AuthCertificateBytes = [Convert]::FromBase64String($AuthCertificateBase64)
+[IO.File]::WriteAllBytes($AuthCertificateFileName, $AuthCertificateBytes)
+$AuthCertificate = Import-PfxCertificate -FilePath $AuthCertificateFileName -CertStoreLocation Cert:\LocalMachine\My -Password (ConvertTo-SecureString $AuthCertificateKey -AsPlainText -Force)
+rm $AuthCertificateFileName
+$ESRPAuthCertificateSubjectName = $AuthCertificate.Subject
 Write-Output ("##vso[task.setvariable variable=ESRPAuthCertificateSubjectName;]$ESRPAuthCertificateSubjectName")

View File
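The rewrite above changes this script from importing a raw certificate collection into the LocalMachine store to importing a password-protected PFX, then exporting the certificate's subject name to later steps through a task.setvariable logging command. For context, a minimal sketch of how the Windows pipeline in this compare invokes it; the secret names are the ones used later in these steps, and whether downstream steps actually read the exported variable is an assumption:

- powershell: |
    $ErrorActionPreference = "Stop"
    # Imports the ESRP auth certificate into Cert:\LocalMachine\My and emits
    # ##vso[task.setvariable variable=ESRPAuthCertificateSubjectName;]... for the rest of the job.
    .\build\azure-pipelines\win32\import-esrp-auth-cert.ps1 -AuthCertificateBase64 $(esrp-auth-certificate) -AuthCertificateKey $(esrp-auth-certificate-key)
  displayName: Import ESRP Auth Certificate
# Subsequent steps could then reference the subject name as $(ESRPAuthCertificateSubjectName).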

@@ -0,0 +1,190 @@
steps:
- powershell: |
mkdir .build -ea 0
"$env:BUILD_SOURCEVERSION" | Out-File -Encoding ascii -NoNewLine .build\commit
"$env:VSCODE_QUALITY" | Out-File -Encoding ascii -NoNewLine .build\quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'npm-vscode'
platformIndependent: true
alias: 'Compilation'
- powershell: |
$ErrorActionPreference = "Stop"
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: UsePythonVersion@0
inputs:
versionSpec: '2.x'
addToPath: true
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
"machine github.com`nlogin vscode`npassword $(github-distro-mixin-password)" | Out-File "$env:USERPROFILE\_netrc" -Encoding ASCII
exec { git config user.email "vscode@microsoft.com" }
exec { git config user.name "VSCode" }
mkdir .build -ea 0
"$(VSCODE_ARCH)" | Out-File -Encoding ascii -NoNewLine .build\arch
displayName: Prepare tooling
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git" }
exec { git fetch distro }
exec { git merge $(node -p "require('./package.json').distro") }
displayName: Merge distro
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/arch, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:npm_config_arch="$(VSCODE_ARCH)"
$env:CHILD_CONCURRENCY="1"
exec { yarn --frozen-lockfile }
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .build/arch, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn postinstall }
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build/azure-pipelines/mixin }
displayName: Mix in quality
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-min-ci" }
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-code-helper" }
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-inno-updater" }
displayName: Build
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)'
Pattern: '*.dll,*.exe,*.node'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolSign",
"parameters": [
{
"parameterName": "OpusName",
"parameterValue": "VS Code"
},
{
"parameterName": "OpusInfo",
"parameterValue": "https://code.visualstudio.com/"
},
{
"parameterName": "Append",
"parameterValue": "/as"
},
{
"parameterName": "FileDigest",
"parameterValue": "/fd \"SHA256\""
},
{
"parameterName": "PageHash",
"parameterValue": "/NPH"
},
{
"parameterName": "TimeStamp",
"parameterValue": "/tr \"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer\" /td sha256"
}
],
"toolName": "sign",
"toolVersion": "1.0"
},
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolVerify",
"parameters": [
{
"parameterName": "VerifyAll",
"parameterValue": "/all"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
- task: NuGetCommand@2
displayName: Install ESRPClient.exe
inputs:
restoreSolution: 'build\azure-pipelines\win32\ESRPClient\packages.config'
feedsToUse: config
nugetConfigPath: 'build\azure-pipelines\win32\ESRPClient\NuGet.config'
externalFeedCredentials: 3fc0b7f7-da09-4ae7-a9c8-d69824b1819b
restoreDirectory: packages
- task: ESRPImportCertTask@1
displayName: Import ESRP Request Signing Certificate
inputs:
ESRP: 'ESRP CodeSign'
- powershell: |
$ErrorActionPreference = "Stop"
.\build\azure-pipelines\win32\import-esrp-auth-cert.ps1 -AuthCertificateBase64 $(esrp-auth-certificate) -AuthCertificateKey $(esrp-auth-certificate-key)
displayName: Import ESRP Auth Certificate
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:AZURE_STORAGE_ACCESS_KEY_2 = "$(vscode-storage-key)"
$env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
.\build\azure-pipelines\win32\publish.ps1
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true

View File

@@ -1,272 +1,252 @@
 steps:
 - powershell: |
     mkdir .build -ea 0
     "$env:BUILD_SOURCEVERSION" | Out-File -Encoding ascii -NoNewLine .build\commit
     "$env:VSCODE_QUALITY" | Out-File -Encoding ascii -NoNewLine .build\quality
-    "$env:ENABLE_TERRAPIN" | Out-File -Encoding ascii -NoNewLine .build\terrapin
-  displayName: Prepare compilation cache flags
+  displayName: Prepare cache flag
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
   inputs:
-    keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
-    targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
-    vstsFeed: "npm-vscode"
+    keyfile: 'build/.cachesalt, .build/commit, .build/quality'
+    targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
+    vstsFeed: 'npm-vscode'
     platformIndependent: true
-    alias: "Compilation"
+    alias: 'Compilation'
 - powershell: |
     $ErrorActionPreference = "Stop"
     exit 1
   displayName: Check RestoreCache
   condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
 - task: NodeTool@0
   inputs:
     versionSpec: "12.14.1"
 - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
   inputs:
     versionSpec: "1.x"
 - task: UsePythonVersion@0
   inputs:
-    versionSpec: "2.x"
+    versionSpec: '2.x'
     addToPath: true
 - task: AzureKeyVault@1
-  displayName: "Azure Key Vault: Get Secrets"
+  displayName: 'Azure Key Vault: Get Secrets'
   inputs:
-    azureSubscription: "vscode-builds-subscription"
+    azureSubscription: 'vscode-builds-subscription'
     KeyVaultName: vscode
 - powershell: |
     . build/azure-pipelines/win32/exec.ps1
     $ErrorActionPreference = "Stop"
     "machine github.com`nlogin vscode`npassword $(github-distro-mixin-password)" | Out-File "$env:USERPROFILE\_netrc" -Encoding ASCII
     exec { git config user.email "vscode@microsoft.com" }
     exec { git config user.name "VSCode" }
+    mkdir .build -ea 0
+    "$(VSCODE_ARCH)" | Out-File -Encoding ascii -NoNewLine .build\arch
   displayName: Prepare tooling
 - powershell: |
     . build/azure-pipelines/win32/exec.ps1
     $ErrorActionPreference = "Stop"
     exec { git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git" }
     exec { git fetch distro }
     exec { git merge $(node -p "require('./package.json').distro") }
   displayName: Merge distro
-- script: |
-    npx https://aka.ms/enablesecurefeed standAlone
-  displayName: Switch to Terrapin packages
-  timeoutInMinutes: 5
-  condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
-- powershell: |
-    "$(VSCODE_ARCH)" | Out-File -Encoding ascii -NoNewLine .build\arch
-  displayName: Prepare yarn cache flags
 - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
   inputs:
-    keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
-    targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
-    vstsFeed: "npm-vscode"
+    keyfile: 'build/.cachesalt, .build/arch, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
+    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
+    vstsFeed: 'npm-vscode'
 - powershell: |
     . build/azure-pipelines/win32/exec.ps1
-    . build/azure-pipelines/win32/retry.ps1
     $ErrorActionPreference = "Stop"
     $env:npm_config_arch="$(VSCODE_ARCH)"
     $env:CHILD_CONCURRENCY="1"
-    retry { exec { yarn --frozen-lockfile } }
+    exec { yarn --frozen-lockfile }
   displayName: Install dependencies
   condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
 - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
   inputs:
-    keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
-    targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
-    vstsFeed: "npm-vscode"
+    keyfile: 'build/.cachesalt, .build/arch, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
+    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
+    vstsFeed: 'npm-vscode'
   condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
 - powershell: |
     . build/azure-pipelines/win32/exec.ps1
     $ErrorActionPreference = "Stop"
     exec { yarn postinstall }
   displayName: Run postinstall scripts
   condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
 - powershell: |
     . build/azure-pipelines/win32/exec.ps1
     $ErrorActionPreference = "Stop"
     exec { node build/azure-pipelines/mixin }
   displayName: Mix in quality
 - powershell: |
     . build/azure-pipelines/win32/exec.ps1
     $ErrorActionPreference = "Stop"
     $env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
-    exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-min-ci" }
-    exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-code-helper" }
-    exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-inno-updater" }
-    echo "##vso[task.setvariable variable=CodeSigningFolderPath]$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)"
+    exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-min-ci" }
+    exec { yarn gulp "vscode-reh-win32-$env:VSCODE_ARCH-min-ci" }
+    exec { yarn gulp "vscode-reh-web-win32-$env:VSCODE_ARCH-min-ci" }
+    exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-code-helper" }
+    exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-inno-updater" }
   displayName: Build
-- powershell: |
-    . build/azure-pipelines/win32/exec.ps1
-    $ErrorActionPreference = "Stop"
-    $env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
-    exec { yarn gulp "vscode-reh-win32-$(VSCODE_ARCH)-min-ci" }
-    exec { yarn gulp "vscode-reh-web-win32-$(VSCODE_ARCH)-min-ci" }
-    echo "##vso[task.setvariable variable=CodeSigningFolderPath]$(CodeSigningFolderPath),$(agent.builddirectory)/vscode-reh-win32-$(VSCODE_ARCH)"
-  displayName: Build Server
-  condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))
 - powershell: |
     . build/azure-pipelines/win32/exec.ps1
     $ErrorActionPreference = "Stop"
     exec { yarn electron $(VSCODE_ARCH) }
     exec { .\scripts\test.bat --build --tfs "Unit Tests" }
   displayName: Run unit tests (Electron)
-  condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
+  condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
 - powershell: |
     . build/azure-pipelines/win32/exec.ps1
     $ErrorActionPreference = "Stop"
     exec { yarn test-browser --build --browser chromium --browser firefox --tfs "Browser Unit Tests" }
   displayName: Run unit tests (Browser)
-  condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
+  condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
 - powershell: |
     # Figure out the full absolute path of the product we just built
     # including the remote server and configure the integration tests
     # to run with these builds instead of running out of sources.
     . build/azure-pipelines/win32/exec.ps1
     $ErrorActionPreference = "Stop"
     $AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
     $AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
     $AppNameShort = $AppProductJson.nameShort
     exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\scripts\test-integration.bat --build --tfs "Integration Tests" }
   displayName: Run integration tests (Electron)
-  condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
+  condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
 - powershell: |
     . build/azure-pipelines/win32/exec.ps1
     $ErrorActionPreference = "Stop"
     exec { $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-web-win32-$(VSCODE_ARCH)"; .\resources\server\test\test-web-integration.bat --browser firefox }
   displayName: Run integration tests (Browser)
-  condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
+  condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
 - powershell: |
     . build/azure-pipelines/win32/exec.ps1
     $ErrorActionPreference = "Stop"
     $AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
     $AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
     $AppNameShort = $AppProductJson.nameShort
     exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\resources\server\test\test-remote-integration.bat }
   displayName: Run remote integration tests (Electron)
-  condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
+  condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
 - task: PublishPipelineArtifact@0
   inputs:
     artifactName: crash-dump-windows-$(VSCODE_ARCH)
     targetPath: .build\crashes
-  displayName: "Publish Crash Reports"
+  displayName: 'Publish Crash Reports'
   continueOnError: true
   condition: failed()
 - task: PublishTestResults@2
   displayName: Publish Tests Results
   inputs:
-    testResultsFiles: "*-results.xml"
-    searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
-  condition: and(succeededOrFailed(), ne(variables['VSCODE_ARCH'], 'arm64'))
+    testResultsFiles: '*-results.xml'
+    searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
+  condition: succeededOrFailed()
 - task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
   inputs:
-    ConnectedServiceName: "ESRP CodeSign"
-    FolderPath: "$(CodeSigningFolderPath)"
-    Pattern: "*.dll,*.exe,*.node"
+    ConnectedServiceName: 'ESRP CodeSign'
+    FolderPath: '$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH),$(agent.builddirectory)/vscode-reh-win32-$(VSCODE_ARCH)'
+    Pattern: '*.dll,*.exe,*.node'
     signConfigType: inlineSignParams
     inlineOperation: |
       [
        {
         "keyCode": "CP-230012",
         "operationSetCode": "SigntoolSign",
         "parameters": [
         {
          "parameterName": "OpusName",
          "parameterValue": "VS Code"
         },
         {
          "parameterName": "OpusInfo",
          "parameterValue": "https://code.visualstudio.com/"
         },
         {
          "parameterName": "Append",
          "parameterValue": "/as"
         },
         {
          "parameterName": "FileDigest",
          "parameterValue": "/fd \"SHA256\""
         },
         {
          "parameterName": "PageHash",
          "parameterValue": "/NPH"
         },
         {
          "parameterName": "TimeStamp",
          "parameterValue": "/tr \"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer\" /td sha256"
         }
         ],
         "toolName": "sign",
         "toolVersion": "1.0"
        },
        {
         "keyCode": "CP-230012",
         "operationSetCode": "SigntoolVerify",
         "parameters": [
         {
          "parameterName": "VerifyAll",
          "parameterValue": "/all"
         }
         ],
         "toolName": "sign",
         "toolVersion": "1.0"
        }
       ]
     SessionTimeout: 120
 - task: NuGetCommand@2
   displayName: Install ESRPClient.exe
   inputs:
     restoreSolution: 'build\azure-pipelines\win32\ESRPClient\packages.config'
     feedsToUse: config
     nugetConfigPath: 'build\azure-pipelines\win32\ESRPClient\NuGet.config'
     externalFeedCredentials: 'ESRP Nuget'
     restoreDirectory: packages
 - task: ESRPImportCertTask@1
   displayName: Import ESRP Request Signing Certificate
   inputs:
     ESRP: 'ESRP CodeSign'
 - powershell: |
     $ErrorActionPreference = "Stop"
     .\build\azure-pipelines\win32\import-esrp-auth-cert.ps1 -AuthCertificateBase64 $(esrp-auth-certificate) -AuthCertificateKey $(esrp-auth-certificate-key)
   displayName: Import ESRP Auth Certificate
externalFeedCredentials: "ESRP Nuget"
restoreDirectory: packages
- task: ESRPImportCertTask@1 - powershell: |
displayName: Import ESRP Request Signing Certificate . build/azure-pipelines/win32/exec.ps1
inputs: $ErrorActionPreference = "Stop"
ESRP: "ESRP CodeSign" $env:AZURE_STORAGE_ACCESS_KEY_2 = "$(vscode-storage-key)"
$env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
.\build\azure-pipelines\win32\publish.ps1
displayName: Publish
- task: PowerShell@2 - task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
inputs: displayName: 'Component Detection'
targetType: filePath continueOnError: true
filePath: .\build\azure-pipelines\win32\import-esrp-auth-cert.ps1
arguments: "$(ESRP-SSL-AADAuth)"
displayName: Import ESRP Auth Certificate
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:AZURE_STORAGE_ACCESS_KEY_2 = "$(vscode-storage-key)"
$env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
.\build\azure-pipelines\win32\publish.ps1
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
continueOnError: true
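Each scripted step in both versions dot-sources build/azure-pipelines/win32/exec.ps1 and runs its native commands inside exec { ... }, so that a non-zero exit code from yarn or a .bat test runner fails the step instead of being silently ignored. The helper itself is not part of this diff; the sketch below shows the usual shape of such a wrapper, assuming its only job is to check $LastExitCode, and it may differ from the repository's actual script.

function exec {
    # Minimal sketch of an exec.ps1-style helper (assumed; the real
    # build/azure-pipelines/win32/exec.ps1 may differ in details).
    param(
        [Parameter(Mandatory = $true)]
        [scriptblock] $cmd
    )
    # Run the block; native commands (yarn, *.bat runners) set $LastExitCode.
    & $cmd
    if ($LastExitCode -ne 0) {
        # Throwing fails the PowerShell step, and therefore the pipeline job.
        throw "Command failed (exit code $LastExitCode): $cmd"
    }
}

# Usage mirroring the steps above, e.g.:
# exec { yarn test-browser --build --browser chromium --tfs "Browser Unit Tests" }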

Some files were not shown because too many files have changed in this diff.