Compare commits

..

27 Commits

Author SHA1 Message Date
Karl Burtram
13e3627627 Update the path to the smoke test (#15139) 2021-04-14 16:51:10 -07:00
Aditya Bist
e2111fe493 Links handling in commands (#15118) (#15137)
* format

* update external options type

* add command flag for command

* Allow commands in Notebooks

Co-authored-by: chgagnon <chgagnon@microsoft.com>
(cherry picked from commit e151668c81)
2021-04-14 15:35:15 -07:00
Aditya Bist
7b46269b44 add flag for proxy (#15120) (#15126)
* add flag for proxy

* update distro hash

* Bump distro hash

* Bump distro

Co-authored-by: kburtram <karlb@microsoft.com>
Co-authored-by: chgagnon <chgagnon@microsoft.com>
(cherry picked from commit b6bdb68596)
2021-04-14 10:22:17 -07:00
Aditya Bist
69b9a19634 fixed css for font family (#15119) (#15124)
(cherry picked from commit 90868bc0ad)
2021-04-13 21:15:48 -07:00
Aditya Bist
ebe835ec99 Remote CLI fixes (#15117) (#15125)
* fixes from vscode

* update distro

* fix distro

* fix hygiene

* reorder hygiene

* Update distro

Co-authored-by: chgagnon <chgagnon@microsoft.com>
(cherry picked from commit 1b78008258)
2021-04-13 21:15:22 -07:00
Justin M
d4e25f4d89 Bumped version of Kusto to 0.5.3 for SqlToolService 92 (#15116) 2021-04-13 13:54:50 -07:00
Charles Gagnon
809d4de862 Only reveal parent node when finding notebook item (#15091) (#15103)
(cherry picked from commit 42cd650147)
2021-04-12 16:46:30 -07:00
Chris LaFreniere
e1aae951a3 Change to fspath from resource tostring (#15069) (#15078) 2021-04-09 14:21:29 -07:00
Sakshi Sharma
bdd99bd0d8 Fix deploy data to be presented on dashboard (#15073) (#15080)
* Fix deploy data to be presented on dashboard

* Bump sql-db-project extensions version

* Address comment
2021-04-09 13:26:56 -07:00
Barbara Valdez
c06bfad916 fix search loading animation (#15063) (#15066) 2021-04-08 16:45:06 -07:00
Maddy
1c91f7971e Select active notebook in viewlet when opened from command palette (#15027) (#15065)
* await initialized

* fixes specific for windows

* address comments

* update comment
2021-04-08 16:44:08 -07:00
Barbara Valdez
8c9037fbdf fix anchor links in wysiwyg (#14950) (#15054)
* fix anchor links in wysiwyg
2021-04-08 12:51:20 -07:00
Udeesha Gautam
d029bb9602 Fixing an issue where backup launch was freezing ADS and restore launch was not doing anything. This repros in the absence of any connections. (#15041) (#15044) 2021-04-08 09:06:58 -07:00
Udeesha Gautam
f0558714a4 Fix for Bug #15020 dashboard update with multiproject (#15029) (#15042)
* Fix to enable two projects showing only their data even if opening together

* Fixing a typo

* Taking in PR comments
2021-04-07 18:37:42 -07:00
Charles Gagnon
49147305e8 Fix event property name (#15026) (#15031)
(cherry picked from commit 1c66499910)
2021-04-07 16:15:34 -07:00
Barbara Valdez
519012c690 bump sqlservernotebook version (#15025) (#15039) 2021-04-07 15:51:20 -07:00
Udeesha Gautam
ff05bc2b03 Add message when no history exists on projects dashboard (#15002) (#15019)
* Add message when no history exists on projects dashboard

* Bump version for sql db projects

* Update text, add refresh button

* Remove commented code

Co-authored-by: Sakshi Sharma <57200045+SakshiS-harma@users.noreply.github.com>
2021-04-07 13:22:00 -07:00
Charles Gagnon
b54beb6e7a Add required minimum version for azdata extension (#15010) (#15015)
* Add check for minimum required azdata version

* cleanup

* remove unused

* comment

* param comment

* Fix tests
2021-04-07 10:53:08 -07:00
Kim Santiago
dbdcc3d20a fix schema compare dropdown not selecting correct db when multiple active connections (#14999) (#15012)
* fix schema compare dropdown not selecting correct db when multiple active connections

* fix when no username so default is used
2021-04-06 19:34:55 -07:00
Justin M
cf48776710 Increased required version for Kusto from 1.27 to 1.28 (#15001) 2021-04-06 14:05:37 -07:00
Vasu Bhog
309f750b92 See and Edit Selected Links in Callout Dialog (#14987) (#15000)
* Add URL label to linkCallout

* add test for file link
2021-04-06 13:26:51 -07:00
Charles Gagnon
8c92af3016 [Port] Update minimum required azdata version (#14998)
* Update minimum required azdata version to latest release (#14995)

* Update minimum required azdata engine version - Arc (#14996)

* vBump Arc/Azdata

* undo vbump

* vBump
2021-04-06 11:25:26 -07:00
Charles Gagnon
9b117da9cb Sign vulkan-1 DLL (#14976) (#14983)
* Sign DLLs

* Only sign vulkan-1
2021-04-06 10:34:38 -07:00
Charles Gagnon
d12a7b81fd Port 43db30d1da (#14986)
Co-authored-by: Maddy <12754347+MaddyDev@users.noreply.github.com>
2021-04-06 10:13:35 -07:00
Alan Ren
3eb705e77b redraw table header (#14963) (#14988) 2021-04-05 19:49:07 -07:00
Charles Gagnon
b421b19b73 Arc updates for March release (#14970) (#14974)
* Updated Postgres Spec for where to find engine version, removed calling -ev in edit commands (#14735)

* Added spec.engine.version, took out calling engine version with edit calls

* Added text in the wrong place

* missed updates

* PR fix

* Update Arc Postgres troubleshooting notebook

Co-authored-by: Brian Bergeron <brberger@microsoft.com>

* Remove AzdataSession from azdata commands (#14856)

* remove session

* Add in controller-context support

* Revert "Add in controller-context support"

This reverts commit 3b39b968efbf6054041cb01cb2d8443532643a82.

* Add azdataContext to login

* Undo book change

* Undo change correctly

* Add controller context support (#14862)

* remove session

* Add in controller-context support

* Add params to fake

* Fix tests

* Add info and placeholder for controller URL/name (#14887)

* Add info and placeholder for controller URL

* add period + update name

* update memento and allow editing of namespace/URL

* vBump

* vBump

* Fix tests

Co-authored-by: nasc17 <69922333+nasc17@users.noreply.github.com>
Co-authored-by: Brian Bergeron <brian.e.bergeron@gmail.com>
Co-authored-by: Brian Bergeron <brberger@microsoft.com>

Co-authored-by: nasc17 <69922333+nasc17@users.noreply.github.com>
Co-authored-by: Brian Bergeron <brian.e.bergeron@gmail.com>
Co-authored-by: Brian Bergeron <brberger@microsoft.com>
2021-04-05 13:00:26 -07:00
Charles Gagnon
124f7ca887 Remove survey link (#14971) 2021-04-05 12:49:19 -07:00
4355 changed files with 288999 additions and 472408 deletions


@@ -32,17 +32,17 @@ Next: **[Try it out!](#try-it)**
## Quick start - GitHub Codespaces
> **IMPORTANT:** You need to use a "Standard" sized codespace or larger (4-core, 8GB) since VS Code needs 6GB of RAM to compile. This is now the default for GitHub Codespaces, but do not downgrade to "Basic" unless you do not intend to compile.
> **IMPORTANT:** The current free user beta for GitHub Codespaces uses a "Basic" sized codespace which does not have enough RAM to run a full build of VS Code and will be considerably slower during codespace start and running VS Code. You'll soon be able to use a "Standard" sized codespace (4-core, 8GB) that will be better suited for this purpose (along with even larger sizes should you need it).
1. From the [microsoft/vscode GitHub repository](https://github.com/microsoft/vscode), click on the **Code** dropdown, select **Open with Codespaces**, and then **New codespace**
> Note that you will not see these options if you are not in the beta yet.
2. After the codespace is up and running in your browser, press <kbd>F1</kbd> and select **Ports: Focus on Ports View**.
2. After the codespace is up and running in your browser, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and select **View: Show Remote Explorer**.
3. You should see port `6080` under **Forwarded Ports**. Select the line and click on the globe icon to open it in a browser tab.
> If you do not see port `6080`, press <kbd>F1</kbd>, select **Forward a Port** and enter port `6080`.
> If you do not see port `6080`, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd>, select **Forward a Port** and enter port `6080`.
4. In the new tab, you should see noVNC. Click **Connect** and enter `vscode` as the password.
@@ -58,7 +58,7 @@ You will likely see better performance when accessing the codespace you created
2. Set up [VS Code for use with GitHub Codespaces](https://docs.github.com/github/developing-online-with-codespaces/using-codespaces-in-visual-studio-code)
3. After the VS Code is up and running, press <kbd>F1</kbd>, choose **Codespaces: Connect to Codespace**, and select the codespace you created.
3. After the VS Code is up and running, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd>, choose **Codespaces: Connect to Codespace**, and select the codespace you created.
4. After you've connected to the codespace, use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
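If you prefer the command line to the browser flow, the same port forwarding can be done with the GitHub CLI. A minimal sketch, assuming the `gh` CLI is installed and authenticated (the codespace name is a placeholder):

```bash
# List available codespaces to find the one created above.
gh codespace list

# Forward the noVNC port (6080) from the codespace to the local machine.
gh codespace ports forward 6080:6080 --codespace <your-codespace-name>
```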


@@ -8,7 +8,7 @@ set -e
SCRIPT_PATH="$(cd "$(dirname $0)" && pwd)"
CONTAINER_IMAGE_REPOSITORY="$1"
BRANCH="${2:-"main"}"
BRANCH="${2:-"master"}"
if [ "${CONTAINER_IMAGE_REPOSITORY}" = "" ]; then
echo "Container repository not specified!"


@@ -2,7 +2,7 @@
"name": "Code - OSS",
// Image contents: https://github.com/microsoft/vscode-dev-containers/blob/master/repository-containers/images/github.com/microsoft/vscode/.devcontainer/base.Dockerfile
"image": "mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:branch-main",
"image": "mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:branch-master",
"workspaceMount": "source=${localWorkspaceFolder},target=/home/node/workspace/vscode,type=bind,consistency=cached",
"workspaceFolder": "/home/node/workspace/vscode",


@@ -13,6 +13,4 @@
**/extensions/**/out/**
**/extensions/**/build/**
**/extensions/markdown-language-features/media/**
**/extensions/markdown-language-features/notebook-out/**
**/extensions/typescript-basics/test/colorize-fixtures/**
**/extensions/**/dist/**

File diff suppressed because it is too large.


@@ -8,18 +8,12 @@ assignees: ''
---
<!-- ⚠️⚠️ Do Not Delete This! bug_report_template ⚠️⚠️ -->
<!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ -->
<!-- 🔎 Search existing issues to avoid creating duplicates. -->
<!-- 🧪 Test using the latest Insiders build to see if your issue has already been fixed: https://github.com/Microsoft/azuredatastudio#try-out-the-latest-insiders-build-from-main -->
<!-- 💡 Instead of creating your report here, use 'Report Issue' from the 'Help' menu in Azure Data Studio to pre-fill useful information. -->
<!-- Please search existing issues to avoid creating duplicates. -->
<!-- Also please test using the latest insiders build to make sure your issue has not already been fixed. -->
<!-- Use Help > Report Issue to prefill these. -->
- Azure Data Studio Version:
- OS Version:
Steps to Reproduce:
1.
2.
<!-- 🔧 Launch with `azuredatastudio --disable-extensions` to check. -->
Does this issue occur when all extensions are disabled?: Yes/No
<!-- 📣 Issues caused by an extension need to be reported directly to the extension publisher. The 'Help > Report Issue' dialog can assist with this. -->

.github/copycat.yml

@@ -0,0 +1,5 @@
{
perform: false,
target_owner: 'anthonydresser',
target_repo: 'testissues'
}


@@ -11,296 +11,161 @@ on:
- release/*
jobs:
windows:
name: Windows
runs-on: windows-latest
timeout-minutes: 30
env:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2.2.0
- uses: actions/setup-node@v2
with:
node-version: 12
- uses: actions/setup-python@v2
with:
python-version: "2.x"
# {{SQL CARBON EDIT}} Skip caching for now
# - name: Compute node modules cache key
# id: nodeModulesCacheKey
# run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
# - name: Cache node_modules archive
# id: cacheNodeModules
# uses: actions/cache@v2
# with:
# path: ".build/node_modules_cache"
# key: "${{ runner.os }}-cacheNodeModulesArchive-${{ steps.nodeModulesCacheKey.outputs.value }}"
# - name: Extract node_modules archive
# if: ${{ steps.cacheNodeModules.outputs.cache-hit == 'true' }}
# run: 7z.exe x .build/node_modules_cache/cache.7z -aos
# - name: Get yarn cache directory path
# id: yarnCacheDirPath
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# run: echo "::set-output name=dir::$(yarn cache dir)"
# - name: Cache yarn directory
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# uses: actions/cache@v2
# with:
# path: ${{ steps.yarnCacheDirPath.outputs.dir }}
# key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-yarnCacheDir-
- name: Execute yarn
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }} {{SQL CARBON EDIT}} Skipping caching for now
env:
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
run: yarn --frozen-lockfile --network-timeout 180000
# - name: Create node_modules archive {{SQL CARBON EDIT}} Skip caching for now
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# run: |
# mkdir -Force .build
# node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
# mkdir -Force .build/node_modules_cache
# 7z.exe a .build/node_modules_cache/cache.7z -mx3 `@.build/node_modules_list.txt
- name: Compile and Download
run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64" # {{SQL CARBON EDIT}} Remove unused options playwright-install download-builtin-extensions
- name: Run Unit Tests (Electron)
run: .\scripts\test.bat
# - name: Run Unit Tests (Browser) {{SQL CARBON EDIT}} disable for now
# run: yarn test-browser --browser chromium
# - name: Run Integration Tests (Electron) {{SQL CARBON EDIT}} disable for now
# run: .\scripts\test-integration.bat
linux:
name: Linux
runs-on: ubuntu-latest
timeout-minutes: 30
env:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2.2.0
# TODO: rename azure-pipelines/linux/xvfb.init to github-actions
- name: Setup Build Environment
run: |
- run: |
sudo apt-get update
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libgbm1 libkrb5-dev # {{SQL CARBON EDIT}} add kerberos dep
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libkrb5-dev # {{SQL CARBON EDIT}} add kerberos dep
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
- uses: actions/setup-node@v2
name: Setup Build Environment
- uses: actions/setup-node@v1
with:
node-version: 12
# {{SQL CARBON EDIT}} Skip caching for now
# - name: Compute node modules cache key
# id: nodeModulesCacheKey
# run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
# - name: Cache node modules
# id: cacheNodeModules
# uses: actions/cache@v2
# with:
# path: "**/node_modules"
# key: ${{ runner.os }}-cacheNodeModules13-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-cacheNodeModules13-
# - name: Get yarn cache directory path
# id: yarnCacheDirPath
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# run: echo "::set-output name=dir::$(yarn cache dir)"
# - name: Cache yarn directory
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# uses: actions/cache@v2
# with:
# path: ${{ steps.yarnCacheDirPath.outputs.dir }}
# key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-yarnCacheDir-
- name: Execute yarn
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }} {{SQL CARBON EDIT}} Skip caching for now
env:
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
run: yarn --frozen-lockfile --network-timeout 180000
- name: Compile and Download
run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64" playwright-install download-builtin-extensions
- name: Run Unit Tests (Electron)
id: electron-unit-tests
run: DISPLAY=:10 ./scripts/test.sh --coverage --runGlob "**/sql/**/*.test.js" # {{SQL CARBON EDIT}} Run only our tests with coverage
- name: Run Extension Unit Tests (Electron)
id: electron-extension-unit-tests
run: DISPLAY=:10 ./scripts/test-extensions-unit.sh
# {{SQL CARBON EDIT}} Add coveralls. We merge first to get around issue where parallel builds weren't being combined correctly
- name: Combine code coverage files
run: node test/combineCoverage
# TODO: cache node modules
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron x64
name: Download Electron
- run: yarn gulp hygiene
name: Run Hygiene Checks
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
name: Run Strict Compile Options
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
# name: Run Monaco Editor Checks
- run: yarn valid-layers-check
name: Run Valid Layers Checks
- run: yarn compile
name: Compile Sources
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
# name: Download Built-in Extensions
- run: DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests" --coverage --runGlob "**/sql/**/*.test.js"
name: Run Unit Tests (Electron)
- run: DISPLAY=:10 ./scripts/test-extensions-unit.sh
name: Run Extension Unit Tests (Electron)
# {{SQL CARBON EDIT}} Add coveralls. We merge first to get around issue where parallel builds weren't being combined correctly
- run: node test/combineCoverage
name: Combine code coverage files
- name: Upload Code Coverage
uses: coverallsapp/github-action@v1.1.1
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
path-to-lcov: "test/coverage/lcov.info"
# - name: Run Unit Tests (Browser) {{SQL CARBON EDIT}} Skip for now
# id: browser-unit-tests
# run: DISPLAY=:10 yarn test-browser --browser chromium
# Fails with cryptic error (e.g. https://github.com/microsoft/vscode/pull/90292/checks?check_run_id=433681926#step:13:9)
# - run: DISPLAY=:10 yarn test-browser --browser chromium
# name: Run Unit Tests (Browser)
# - run: DISPLAY=:10 ./scripts/test-integration.sh --tfs "Integration Tests" {{SQL CARBON EDIT}} remove step
# name: Run Integration Tests (Electron)
# - name: Run Integration Tests (Electron) {{SQL CARBON EDIT}} Skip for now
# id: electron-integration-tests
# run: DISPLAY=:10 ./scripts/test-integration.sh
darwin:
name: macOS
runs-on: macos-latest
timeout-minutes: 30
windows:
runs-on: windows-2016
env:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2.2.0
- uses: actions/setup-node@v2
- uses: actions/setup-node@v1
with:
node-version: 12
- uses: actions/setup-python@v1
with:
python-version: "2.x"
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron
name: Download Electron
- run: yarn gulp hygiene
name: Run Hygiene Checks
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
name: Run Strict Compile Options
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
# name: Run Monaco Editor Checks
- run: yarn valid-layers-check
name: Run Valid Layers Checks
- run: yarn compile
name: Compile Sources
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
# name: Download Built-in Extensions
- run: .\scripts\test.bat --tfs "Unit Tests"
name: Run Unit Tests (Electron)
# - run: yarn test-browser --browser chromium {{SQL CARBON EDIT}} disable for now @TODO @anthonydresser
# name: Run Unit Tests (Browser)
# - run: .\scripts\test-integration.bat --tfs "Integration Tests" {{SQL CARBON EDIT}} remove step
# name: Run Integration Tests (Electron)
# {{SQL CARBON EDIT}} Skip caching for now
# - name: Compute node modules cache key
# id: nodeModulesCacheKey
# run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
# - name: Cache node modules
# id: cacheNodeModules
# uses: actions/cache@v2
# with:
# path: "**/node_modules"
# key: ${{ runner.os }}-cacheNodeModules13-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-cacheNodeModules13-
# - name: Get yarn cache directory path
# id: yarnCacheDirPath
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# run: echo "::set-output name=dir::$(yarn cache dir)"
# - name: Cache yarn directory
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# uses: actions/cache@v2
# with:
# path: ${{ steps.yarnCacheDirPath.outputs.dir }}
# key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-yarnCacheDir-
- name: Execute yarn
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
env:
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
run: yarn --frozen-lockfile --network-timeout 180000
- name: Compile and Download
run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64" playwright-install download-builtin-extensions
- name: Run Unit Tests (Electron)
run: DISPLAY=:10 ./scripts/test.sh
# - name: Run Unit Tests (Browser) {{SQL CARBON EDIT}} Skip for now
# run: DISPLAY=:10 yarn test-browser --browser chromium
# - name: Run Integration Tests (Electron) {{SQL CARBON EDIT}} Skip for now
# run: DISPLAY=:10 ./scripts/test-integration.sh
hygiene:
name: Hygiene and Layering
runs-on: ubuntu-latest
timeout-minutes: 30
darwin:
runs-on: macos-latest
env:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v2
- uses: actions/checkout@v2.2.0
- uses: actions/setup-node@v1
with:
node-version: 12
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron x64
name: Download Electron
- run: yarn gulp hygiene
name: Run Hygiene Checks
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
name: Run Strict Compile Options
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
# name: Run Monaco Editor Checks
- run: yarn valid-layers-check
name: Run Valid Layers Checks
- run: yarn compile
name: Compile Sources
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
# name: Download Built-in Extensions
- run: ./scripts/test.sh --tfs "Unit Tests"
name: Run Unit Tests (Electron)
# - run: yarn test-browser --browser chromium --browser webkit
# name: Run Unit Tests (Browser)
# - run: ./scripts/test-integration.sh --tfs "Integration Tests"
# name: Run Integration Tests (Electron)
- name: Compute node modules cache key
id: nodeModulesCacheKey
run: echo "::set-output name=value::$(node build/azure-pipelines/common/sql-computeNodeModulesCacheKey.js)"
- name: Cache node modules
id: cacheNodeModules
uses: actions/cache@v2
with:
path: "**/node_modules"
key: ${{ runner.os }}-cacheNodeModules13-${{ steps.nodeModulesCacheKey.outputs.value }}
restore-keys: ${{ runner.os }}-cacheNodeModules13-
- name: Get yarn cache directory path
id: yarnCacheDirPath
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
run: echo "::set-output name=dir::$(yarn cache dir)"
- name: Cache yarn directory
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
uses: actions/cache@v2
with:
path: ${{ steps.yarnCacheDirPath.outputs.dir }}
key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
restore-keys: ${{ runner.os }}-yarnCacheDir-
- name: Setup Build Environment # {{SQL CARBON EDIT}} Add step to install required packages if we need to run yarn
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
run: |
sudo apt-get update
sudo apt-get install -y libkrb5-dev
- name: Execute yarn
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
env:
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
run: yarn --frozen-lockfile --network-timeout 180000
- name: Run Hygiene Checks
run: yarn gulp hygiene
- name: Run Valid Layers Checks
run: yarn valid-layers-check
- name: Run Strict Compile Options # {{SQL CARBON EDIT}} add step
run: yarn strict-vscode
# - name: Run Monaco Editor Checks {{SQL CARBON EDIT}} Remove Monaco checks
# run: yarn monaco-compile-check
- name: Run Trusted Types Checks
run: yarn tsec-compile-check
# - name: Editor Distro & ESM Bundle {{SQL CARBON EDIT}} Remove Monaco checks
# run: yarn gulp editor-esm-bundle
# - name: Typings validation prep {{SQL CARBON EDIT}} Remove Monaco checks
# run: |
# mkdir typings-test
# - name: Typings validation {{SQL CARBON EDIT}} Remove Monaco checks
# working-directory: ./typings-test
# run: |
# yarn init -yp
# ../node_modules/.bin/tsc --init
# echo "import '../out-monaco-editor-core';" > a.ts
# ../node_modules/.bin/tsc --noEmit
# - name: Webpack Editor {{SQL CARBON EDIT}} Remove Monaco checks
# working-directory: ./test/monaco
# run: yarn run bundle
# - name: Compile Editor Tests {{SQL CARBON EDIT}} Remove Monaco checks
# working-directory: ./test/monaco
# run: yarn run compile
# - name: Download Playwright {{SQL CARBON EDIT}} Remove Monaco checks
# run: yarn playwright-install
# - name: Run Editor Tests {{SQL CARBON EDIT}} Remove Monaco checks
# timeout-minutes: 5
# working-directory: ./test/monaco
# run: yarn test
# monaco:
# runs-on: ubuntu-latest
# env:
# CHILD_CONCURRENCY: "1"
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# steps:
# - uses: actions/checkout@v2.2.0
# # TODO: rename azure-pipelines/linux/xvfb.init to github-actions
# - run: |
# sudo apt-get update
# sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libgbm1
# sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
# sudo chmod +x /etc/init.d/xvfb
# sudo update-rc.d xvfb defaults
# sudo service xvfb start
# name: Setup Build Environment
# - uses: actions/setup-node@v1
# with:
# node-version: 10
# - run: yarn --frozen-lockfile
# name: Install Dependencies
# - run: yarn monaco-compile-check
# name: Run Monaco Editor Checks
# - run: yarn gulp editor-esm-bundle
# name: Editor Distro & ESM Bundle
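The `sql-computeNodeModulesCacheKey.js` script referenced in the hygiene job above is not part of this diff. Conceptually, a node_modules cache key only needs to change when the dependency inputs change; a hedged shell sketch of an equivalent computation (the real script may hash different inputs):

```bash
# Derive a cache key from the Node version plus the lockfile, so the
# cached node_modules is invalidated when either changes.
key=$( { node --version; cat yarn.lock; } | sha256sum | cut -d' ' -f1 )
echo "::set-output name=value::${key}"
```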

.github/workflows/codeql.yml

@@ -0,0 +1,44 @@
name: "Code Scanning - Action"
on:
push:
schedule:
- cron: "0 0 * * 0"
jobs:
CodeQL-Build:
strategy:
fail-fast: false
# CodeQL runs on ubuntu-latest, windows-latest, and macos-latest
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
# Override language selection by uncommenting this and choosing your languages
# with:
# languages: go, javascript, csharp, python, cpp, java
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below).
- name: Autobuild
uses: github/codeql-action/autobuild@v1
# Command-line programs to run using the OS shell.
# 📚 https://git.io/JvXDl
# ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
# and modify them (or add more) to build your code if your project
# uses a compiled language
#- run: |
# make bootstrap
# make release
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1


@@ -0,0 +1,50 @@
name: "Deep Classifier: Runner"
on:
schedule:
- cron: 0 * * * *
repository_dispatch:
types: [trigger-deep-classifier-runner]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: "microsoft/vscode-github-triage-actions"
ref: v40
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Additional Dependencies
# Pulls in a bunch of other packages that aren't needed for the rest of the actions
run: npm install @azure/storage-blob@12.1.1
- name: "Run Classifier: Scraper"
uses: ./actions/classifier-deep/apply/fetch-sources
with:
# slightly overlapping to protect against issues slipping through the cracks if a run is delayed
from: 80
until: 5
configPath: classifier
blobContainerName: vscode-issue-classifier
blobStorageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
- name: Set up Python 3.7
uses: actions/setup-python@v1
with:
python-version: 3.7
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install --upgrade numpy scipy scikit-learn joblib nltk simpletransformers torch torchvision
- name: "Run Classifier: Generator"
run: python ./actions/classifier-deep/apply/generate-labels/main.py
- name: "Run Classifier: Labeler"
uses: ./actions/classifier-deep/apply/apply-labels
with:
configPath: classifier
allowLabels: "needs more info|new release"
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}


@@ -0,0 +1,27 @@
name: "Deep Classifier: Scraper"
on:
repository_dispatch:
types: [trigger-deep-classifier-scraper]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: "microsoft/vscode-github-triage-actions"
ref: v40
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Additional Dependencies
# Pulls in a bunch of other packages that aren't needed for the rest of the actions
run: npm install @azure/storage-blob@12.1.1
- name: "Run Classifier: Scraper"
uses: ./actions/classifier-deep/train/fetch-issues
with:
blobContainerName: vscode-issue-classifier
blobStorageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
token: ${{secrets.ISSUE_SCRAPER_TOKEN}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}


@@ -0,0 +1,40 @@
name: VS Code Repo Dev Container Cache Image Generation
on:
push:
# Currently doing this for master, but could be done for PRs as well
branches:
- "master"
# Only updates to these files result in changes to installed packages, so skip otherwise
paths:
- "**/package-lock.json"
- "**/yarn.lock"
jobs:
devcontainer:
name: Generate cache image
runs-on: ubuntu-latest
steps:
- name: Checkout
id: checkout
uses: actions/checkout@v2
- name: Azure CLI login
id: az_login
uses: azure/login@v1
with:
creds: ${{ secrets.AZ_ACR_CREDS }}
- name: Build and push
id: build_and_push
run: |
set -e
ACR_REGISTRY_NAME=$(echo ${{ secrets.CONTAINER_IMAGE_REGISTRY }} | grep -oP '(.+)(?=\.azurecr\.io)')
az acr login --name $ACR_REGISTRY_NAME
GIT_BRANCH=$(echo "${{ github.ref }}" | grep -oP 'refs/(heads|tags)/\K(.+)')
if [ "$GIT_BRANCH" == "" ]; then GIT_BRANCH=master; fi
.devcontainer/cache/build-cache-image.sh "${{ secrets.CONTAINER_IMAGE_REGISTRY }}/public/vscode/devcontainers/repos/microsoft/vscode" "${GIT_BRANCH}"
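The two `grep -oP` extractions in the script above can be checked in isolation; a quick sketch with made-up sample values:

```bash
# Registry name: everything before ".azurecr.io" (the lookahead keeps the suffix out of the match).
echo "myregistry.azurecr.io" | grep -oP '(.+)(?=\.azurecr\.io)'   # prints: myregistry

# Branch name: strip the "refs/heads/" or "refs/tags/" prefix (\K resets the match start).
echo "refs/heads/master" | grep -oP 'refs/(heads|tags)/\K(.+)'    # prints: master
```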


@@ -0,0 +1,27 @@
name: Latest Release Monitor
on:
schedule:
- cron: 0/5 * * * *
repository_dispatch:
types: [trigger-latest-release-monitor]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: "microsoft/vscode-github-triage-actions"
path: ./actions
ref: v40
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Storage Module
run: npm install @azure/storage-blob@12.1.1
- name: Run Latest Release Monitor
uses: ./actions/latest-release-monitor
with:
storageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}

.github/workflows/on-issue-open.yml

@@ -0,0 +1,24 @@
name: On Issue Open
on:
issues:
types: [opened]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2.2.0
with:
repository: 'microsoft/azuredatastudio'
ref: main
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions/build/actions
- name: Run CopyCat
uses: ./actions/build/actions/copycat
with:
token: ${{secrets.TRIAGE_PAT}}
owner: anthonydresser
repo: testissues

.gitignore

@@ -5,8 +5,26 @@ Thumbs.db
node_modules/
.build/
extensions/**/dist/
/out*/
/extensions/**/out/
out/
out-build/
out-editor/
out-editor-src/
out-editor-build/
out-editor-esm/
out-editor-esm-bundle/
out-editor-min/
out-monaco-editor-core/
out-vscode/
out-vscode-min/
out-vscode-reh/
out-vscode-reh-min/
out-vscode-reh-pkg/
out-vscode-reh-web/
out-vscode-reh-web-min/
out-vscode-reh-web-pkg/
out-vscode-web/
out-vscode-web-min/
out-vscode-web-pkg/
src/vs/server
resources/server
build/node_modules


@@ -3,6 +3,7 @@
// for the documentation about the extensions.json format
"recommendations": [
"dbaeumer.vscode-eslint",
"EditorConfig.EditorConfig"
"EditorConfig.EditorConfig",
"msjsdiag.debugger-for-chrome"
]
}

.vscode/launch.json

@@ -66,147 +66,10 @@
}
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code Emmet Tests",
"runtimeExecutable": "${execPath}",
"args": [
"${workspaceFolder}/extensions/emmet/test-fixtures",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/emmet",
"--extensionTestsPath=${workspaceFolder}/extensions/emmet/out/test"
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "5_tests",
"order": 6
}
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code Git Tests",
"runtimeExecutable": "${execPath}",
"args": [
"/tmp/my4g9l",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/git",
"--extensionTestsPath=${workspaceFolder}/extensions/git/out/test"
],
"outFiles": [
"${workspaceFolder}/extensions/git/out/**/*.js"
],
"presentation": {
"group": "5_tests",
"order": 6
}
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code API Tests (single folder)",
"runtimeExecutable": "${execPath}",
"args": [
// "${workspaceFolder}", // Uncomment for running out of sources.
"${workspaceFolder}/extensions/vscode-api-tests/testWorkspace",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-api-tests",
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-api-tests/out/singlefolder-tests",
"--disable-extensions"
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "5_tests",
"order": 3
}
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code API Tests (workspace)",
"runtimeExecutable": "${execPath}",
"args": [
"${workspaceFolder}/extensions/vscode-api-tests/testworkspace.code-workspace",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-api-tests",
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-api-tests/out/workspace-tests"
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "5_tests",
"order": 4
}
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code Tokenizer Tests",
"runtimeExecutable": "${execPath}",
"args": [
"${workspaceFolder}/extensions/vscode-colorize-tests/test",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-colorize-tests",
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-colorize-tests/out"
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "5_tests",
"order": 5
}
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code Notebook Tests",
"runtimeExecutable": "${execPath}",
"args": [
"${workspaceFolder}/extensions/vscode-notebook-tests/test",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-notebook-tests",
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-notebook-tests/out"
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "5_tests",
"order": 6
}
},
{
"type": "extensionHost",
"request": "launch",
"name": "VS Code Custom Editor Tests",
"runtimeExecutable": "${execPath}",
"args": [
"${workspaceFolder}/extensions/vscode-custom-editor-tests/test-workspace",
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-custom-editor-tests",
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-custom-editor-tests/out/test"
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "5_tests",
"order": 6
}
},
{
"type": "pwa-chrome",
"type": "chrome",
"request": "attach",
"name": "Attach to azuredatastudio",
"browserAttachLocation": "workspace",
"port": 9222,
"trace": true,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"resolveSourceMapLocations": [
"${workspaceFolder}/out/**/*.js"
],
"perScriptSourcemaps": "yes"
"port": 9222
},
{
"type": "pwa-chrome",
@@ -242,7 +105,8 @@
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"browserLaunchLocation": "workspace"
"browserLaunchLocation": "workspace",
"preLaunchTask": "Ensure Prelaunch Dependencies",
},
{
"type": "node",
@@ -277,6 +141,7 @@
"type": "pwa-chrome",
"request": "launch",
"outFiles": [],
"outFiles": [],
"perScriptSourcemaps": "yes",
"name": "VS Code (Web, Chrome)",
"url": "http://localhost:8080",
@@ -350,7 +215,7 @@
"${workspaceFolder}/out/**/*.js"
],
"cascadeTerminateToConfigurations": [
"Attach to azuredatastudio"
"Attach to VS Code"
],
"env": {
"MOCHA_COLORS": "true"
@@ -364,15 +229,15 @@
"request": "launch",
"name": "Run Unit Tests For Current File",
"program": "${workspaceFolder}/test/unit/electron/index.js",
"runtimeExecutable": "${workspaceFolder}/.build/electron/Azure Data Studio.app/Contents/MacOS/Electron",
"runtimeExecutable": "${workspaceFolder}/.build/electron/Code - OSS.app/Contents/MacOS/Electron",
"windows": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio.exe"
"runtimeExecutable": "${workspaceFolder}/.build/electron/Code - OSS.exe"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio"
"runtimeExecutable": "${workspaceFolder}/.build/electron/code-oss"
},
"cascadeTerminateToConfigurations": [
"Attach to azuredatastudio"
"Attach to VS Code"
],
"outputCapture": "std",
"args": [
@@ -386,6 +251,9 @@
],
"env": {
"MOCHA_COLORS": "true"
},
"presentation": {
"hidden": true
}
},
{
@@ -488,7 +356,7 @@
{
"name": "Debug Unit Tests (Current File)",
"configurations": [
"Attach to azuredatastudio",
"Attach to VS Code",
"Run Unit Tests For Current File"
],
"presentation": {


@@ -8,7 +8,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"April 2021\"",
"value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"November 2020\"",
"editable": true
},
{


@@ -2,106 +2,109 @@
{
"kind": 1,
"language": "markdown",
"value": "#### Macros"
"value": "#### Macros",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-remotehub\n\n$MILESTONE=milestone:\"April 2021\""
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server\n\n$MILESTONE=milestone:\"November 2020\"",
"editable": false
},
{
"kind": 1,
"language": "markdown",
"value": "# Preparation"
"value": "# Preparation",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Pull Requests on the Milestone"
"value": "## Open Pull Requests on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:pr is:open"
"value": "$REPOS $MILESTONE is:pr is:open",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues on the Milestone"
"value": "## Open Issues on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item"
"value": "$REPOS $MILESTONE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Feature Requests Missing Labels"
"value": "## Feature Requests Missing Labels",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate"
"value": "$REPOS $MILESTONE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Testing"
"value": "# Testing",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items"
"value": "## Test Plan Items",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:open label:testplan-item"
"value": "$REPOS $MILESTONE is:issue is:open label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed"
"value": "## Verification Needed",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed label:feature-request label:verification-needed -label:verified"
"value": "$REPOS $MILESTONE is:issue is:closed label:feature-request label:verification-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Verification"
},
{
"kind": 1,
"language": "markdown",
"value": "## Verifiable Fixes"
"value": "# Verification",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found -label:z-author-verified -label:unreleased"
"value": "$REPOS $MILESTONE is:issue is:closed sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Unreleased Fixes"
"value": "# Candidates",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found -label:z-author-verified label:unreleased"
},
{
"kind": 1,
"language": "markdown",
"value": "# Candidates"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:open label:candidate"
"value": "$REPOS $MILESTONE label:candidate",
"editable": true
}
]


@@ -176,20 +176,17 @@
{
"kind": 1,
"language": "markdown",
"value": "# vscode-pull-request-github",
"editable": true
"value": "# vscode-pull-request-github"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-pull-request-github is:issue closed:>$since",
"editable": true
"value": "repo:microsoft/vscode-pull-request-github is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-test is:issue created:>$since",
"editable": true
"value": "repo:microsoft/vscode-test is:issue created:>$since"
},
{
"kind": 1,


@@ -3,28 +3,24 @@
"kind": 1,
"language": "markdown",
"value": "### Categorizing Issues\n\nEach issue must have a type label. Most type labels are grey, some are yellow. Bugs are grey with a touch of red.",
"editable": true,
"outputs": []
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open is:issue assignee:@me -label:\"needs more info\" -label:bug -label:feature-request -label:under-discussion -label:debt -label:*question -label:upstream -label:electron -label:engineering -label:plan-item ",
"editable": true,
"outputs": []
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "### Feature Areas\n\nEach issue should be assigned to a feature area",
"editable": true,
"outputs": []
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open is:issue assignee:@me -label:L10N -label:VIM -label:api -label:api-finalization -label:api-proposal -label:authentication -label:breadcrumbs -label:callhierarchy -label:code-lens -label:color-palette -label:comments -label:config -label:context-keys -label:css-less-scss -label:custom-editors -label:debug -label:debug-console -label:dialogs -label:diff-editor -label:dropdown -label:editor -label:editor-RTL -label:editor-autoclosing -label:editor-autoindent -label:editor-bracket-matching -label:editor-clipboard -label:editor-code-actions -label:editor-color-picker -label:editor-columnselect -label:editor-commands -label:editor-comments -label:editor-contrib -label:editor-core -label:editor-drag-and-drop -label:editor-error-widget -label:editor-find -label:editor-folding -label:editor-highlight -label:editor-hover -label:editor-indent-detection -label:editor-indent-guides -label:editor-input -label:editor-input-IME -label:editor-insets -label:editor-minimap -label:editor-multicursor -label:editor-parameter-hints -label:editor-render-whitespace -label:editor-rendering -label:editor-scrollbar -label:editor-symbols -label:editor-synced-region -label:editor-textbuffer -label:editor-theming -label:editor-wordnav -label:editor-wrapping -label:emmet -label:error-list -label:explorer-custom -label:extension-host -label:extension-recommendations -label:extensions -label:extensions-development -label:file-decorations -label:file-encoding -label:file-explorer -label:file-glob -label:file-guess-encoding -label:file-io -label:file-watcher -label:font-rendering -label:formatting -label:git -label:github -label:gpu -label:grammar -label:grid-view -label:html -label:i18n -label:icon-brand -label:icons-product -label:install-update -label:integrated-terminal -label:integrated-terminal-conpty -label:integrated-terminal-links -label:integrated-terminal-rendering -label:integrated-terminal-winpty -label:intellisense-config -label:ipc -label:issue-bot -label:issue-reporter -label:javascript -label:json -label:keybindings -label:keybindings-editor -label:keyboard-layout -label:label-provider -label:languages-basic -label:languages-diagnostics -label:languages-guessing -label:layout -label:lcd-text-rendering -label:list -label:log -label:markdown -label:marketplace -label:menus -label:merge-conflict -label:notebook -label:outline -label:output -label:perf -label:perf-bloat -label:perf-startup -label:php -label:portable-mode -label:proxy -label:quick-pick -label:references-viewlet -label:release-notes -label:remote -label:remote-explorer -label:rename -label:sandbox -label:scm -label:screencast-mode -label:search -label:search-api -label:search-editor -label:search-replace -label:semantic-tokens -label:settings-editor -label:settings-sync -label:settings-sync-server -label:shared-process -label:simple-file-dialog -label:smart-select -label:snap -label:snippets -label:splitview -label:suggest -label:sync-error-handling -label:tasks -label:telemetry -label:themes -label:timeline -label:timeline-git -label:titlebar -label:tokenization -label:touch/pointer -label:trackpad/scroll -label:tree -label:typescript -label:undo-redo -label:uri -label:ux -label:variable-resolving -label:vscode-build -label:vscode-website -label:web -label:webview -label:workbench-actions -label:workbench-cli -label:workbench-diagnostics -label:workbench-dnd -label:workbench-editor-grid -label:workbench-editors -label:workbench-electron -label:workbench-feedback -label:workbench-history 
-label:workbench-hot-exit -label:workbench-hover -label:workbench-launch -label:workbench-link -label:workbench-multiroot -label:workbench-notifications -label:workbench-os-integration -label:workbench-rapid-render -label:workbench-run-as-admin -label:workbench-state -label:workbench-status -label:workbench-tabs -label:workbench-touchbar -label:workbench-views -label:workbench-welcome -label:workbench-window -label:workbench-zen -label:workspace-edit -label:workspace-symbols -label:zoom",
"editable": true,
"outputs": []
"editable": true
}
]


@@ -18,9 +18,9 @@
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox=repo:microsoft/vscode is:open no:assignee -label:feature-request -label:testplan-item -label:plan-item ",
"kind": 1,
"language": "markdown",
"value": "New issues or pull requests submitted by the community are initially triaged by an [automatic classification bot](https://github.com/microsoft/vscode-github-triage-actions/tree/master/classifier-deep). Issues that the bot does not correctly triage are then triaged by a team member. The team rotates the inbox tracker on a weekly basis.\n\nA [mirror](https://github.com/JacksonKearl/testissues/issues) of the VS Code issue stream is available with details about how the bot classifies issues, including feature-area classifications and confidence ratings. Per-category confidence thresholds and feature-area ownership data is maintained in [.github/classifier.json](https://github.com/microsoft/vscode/blob/master/.github/classifier.json). \n\n💡 The bot is being run through a GitHub action that runs every 30 minutes. Give the bot the opportunity to classify an issue before doing it manually.\n\n### Inbox Tracking\n\nThe inbox tracker is responsible for the [global inbox](https://github.com/Microsoft/vscode/issues?utf8=%E2%9C%93&q=is%3Aopen+no%3Aassignee+-label%3Afeature-request+-label%3Atestplan-item+-label%3Aplan-item) containing all **open issues and pull requests** that\n- are neither **feature requests** nor **test plan items** nor **plan items** and\n- have **no owner assignment**.\n\nThe **inbox tracker** may perform any step described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) but its main responsibility is to route issues to the actual feature area owner.\n\nFeature area owners track the **feature area inbox** containing all **open issues and pull requests** that\n- are personally assigned to them and are not assigned to any milestone\n- are labeled with their feature area label and are not assigned to any milestone.\nThis secondary triage may involve any of the steps described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) and results in a fully triaged or closed issue.\n\nThe [github triage extension](https://github.com/microsoft/vscode-github-triage-extension) can be used to assist with triaging — it provides a \"Command Palette\"-style list of triaging actions like assignment, labeling, and triggers for various bot actions.",
"editable": true
},
{
@@ -32,7 +32,7 @@
{
"kind": 1,
"language": "markdown",
"value": "New issues or pull requests submitted by the community are initially triaged by an [automatic classification bot](https://github.com/microsoft/vscode-github-triage-actions/tree/master/classifier-deep). Issues that the bot does not correctly triage are then triaged by a team member. The team rotates the inbox tracker on a weekly basis.\n\nA [mirror](https://github.com/JacksonKearl/testissues/issues) of the VS Code issue stream is available with details about how the bot classifies issues, including feature-area classifications and confidence ratings. Per-category confidence thresholds and feature-area ownership data is maintained in [.github/classifier.json](https://github.com/microsoft/vscode/blob/main/.github/classifier.json). \n\n💡 The bot is being run through a GitHub action that runs every 30 minutes. Give the bot the opportunity to classify an issue before doing it manually.\n\n### Inbox Tracking\n\nThe inbox tracker is responsible for the [global inbox](https://github.com/microsoft/vscode/issues?utf8=%E2%9C%93&q=is%3Aopen+no%3Aassignee+-label%3Afeature-request+-label%3Atestplan-item+-label%3Aplan-item) containing all **open issues and pull requests** that\n- are neither **feature requests** nor **test plan items** nor **plan items** and\n- have **no owner assignment**.\n\nThe **inbox tracker** may perform any step described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) but its main responsibility is to route issues to the actual feature area owner.\n\nFeature area owners track the **feature area inbox** containing all **open issues and pull requests** that\n- are personally assigned to them and are not assigned to any milestone\n- are labeled with their feature area label and are not assigned to any milestone.\nThis secondary triage may involve any of the steps described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) and results in a fully triaged or closed issue.\n\nThe [github triage extension](https://github.com/microsoft/vscode-github-triage-extension) can be used to assist with triaging — it provides a \"Command Palette\"-style list of triaging actions like assignment, labeling, and triggers for various bot actions.",
"value": "New issues or pull requests submitted by the community are initially triaged by an [automatic classification bot](https://github.com/microsoft/vscode-github-triage-actions/tree/master/classifier-deep). Issues that the bot does not correctly triage are then triaged by a team member. The team rotates the inbox tracker on a weekly basis.\n\nA [mirror](https://github.com/JacksonKearl/testissues/issues) of the VS Code issue stream is available with details about how the bot classifies issues, including feature-area classifications and confidence ratings. Per-category confidence thresholds and feature-area ownership data is maintained in [.github/classifier.json](https://github.com/microsoft/vscode/blob/master/.github/classifier.json). \n\n💡 The bot is being run through a GitHub action that runs every 30 minutes. Give the bot the opportunity to classify an issue before doing it manually.\n\n### Inbox Tracking\n\nThe inbox tracker is responsible for the [global inbox](https://github.com/microsoft/vscode/issues?utf8=%E2%9C%93&q=is%3Aopen+no%3Aassignee+-label%3Afeature-request+-label%3Atestplan-item+-label%3Aplan-item) containing all **open issues and pull requests** that\n- are neither **feature requests** nor **test plan items** nor **plan items** and\n- have **no owner assignment**.\n\nThe **inbox tracker** may perform any step described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) but its main responsibility is to route issues to the actual feature area owner.\n\nFeature area owners track the **feature area inbox** containing all **open issues and pull requests** that\n- are personally assigned to them and are not assigned to any milestone\n- are labeled with their feature area label and are not assigned to any milestone.\nThis secondary triage may involve any of the steps described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) and results in a fully triaged or closed issue.\n\nThe [github triage extension](https://github.com/microsoft/vscode-github-triage-extension) can be used to assist with triaging — it provides a \"Command Palette\"-style list of triaging actions like assignment, labeling, and triggers for various bot actions.",
"editable": true
},
{


@@ -2,181 +2,205 @@
{
"kind": 1,
"language": "markdown",
"value": "#### Macros"
"value": "#### Macros",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remotehub\n\n$MILESTONE=milestone:\"April 2021\"\n\n$MINE=assignee:@me"
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server\n\n$MILESTONE=milestone:\"November 2020\"\n\n$MINE=assignee:@me",
"editable": false
},
{
"kind": 1,
"language": "markdown",
"value": "# Preparation"
"value": "# Preparation",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Pull Requests on the Milestone"
"value": "## Open Pull Requests on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:pr is:open"
"value": "$REPOS $MILESTONE $MINE is:pr is:open",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues on the Milestone"
"value": "## Open Issues on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item"
"value": "$REPOS $MILESTONE $MINE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Feature Requests Missing Labels"
"value": "## Feature Requests Missing Labels",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate"
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items"
"value": "## Test Plan Items",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:open author:@me label:testplan-item"
"value": "$REPOS $MILESTONE is:issue is:open author:@me label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed"
"value": "## Verification Needed",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request label:verification-needed"
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request label:verification-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Testing"
"value": "# Testing",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items"
"value": "## Test Plan Items",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:testplan-item"
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed"
"value": "## Verification Needed",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -assignee:@me -label:verified -label:z-author-verified label:feature-request label:verification-needed"
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -assignee:@me -label:verified label:feature-request label:verification-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Fixing"
"value": "# Fixing",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues"
"value": "## Open Issues",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open -label:endgame-plan -label:testplan-item -label:iteration-plan"
"value": "$REPOS $MILESTONE $MINE is:issue is:open",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Bugs"
"value": "## Open Bugs",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug"
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Verification"
"value": "# Verification",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## My Issues (verification-steps-needed)"
"value": "## My Issues (verification-steps-needed)",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue label:bug label:verification-steps-needed"
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug label:verification-steps-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## My Issues (verification-found)"
"value": "## My Issues (verification-found)",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue label:bug label:verification-found"
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Issues filed by me"
"value": "## Issues filed by me",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed author:@me sort:updated-asc label:bug -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found"
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed author:@me sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Issues filed from outside team"
"value": "## Issues filed by others",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed sort:updated-asc label:bug -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found -author:aeschli -author:alexdima -author:alexr00 -author:AmandaSilver -author:bamurtaugh -author:bpasero -author:btholt -author:chrisdias -author:chrmarti -author:Chuxel -author:connor4312 -author:dbaeumer -author:deepak1556 -author:devinvalenciano -author:digitarald -author:eamodio -author:egamma -author:fiveisprime -author:gregvanl -author:isidorn -author:ItalyPaleAle -author:JacksonKearl -author:joaomoreno -author:jrieken -author:kieferrm -author:lszomoru -author:meganrogge -author:misolori -author:mjbvz -author:ornellaalt -author:orta -author:rebornix -author:RMacfarlane -author:roblourens -author:rzhao271 -author:sana-ajani -author:sandy081 -author:sbatten -author:stevencl -author:Tyriar -author:weinand -author:TylerLeonhardt -author:lramos15"
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -author:@me sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Issues filed by others"
"value": "# Release Notes",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -author:@me sort:updated-asc label:bug -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found"
},
{
"kind": 1,
"language": "markdown",
"value": "# Release Notes"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode $MILESTONE $MINE is:issue is:closed label:feature-request -label:on-release-notes"
"value": "repo:microsoft/vscode $MILESTONE is:issue is:closed label:feature-request -label:on-release-notes",
"editable": true
}
]

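The $REPOS/$MILESTONE/$MINE cells above act as simple textual macros that later query cells interpolate. A minimal sketch of that expansion (an illustration only, not the github-issues extension's actual parser):

// Sketch: expand $NAME=value macro definitions into a github-issues query string.
function expandMacros(definitions: string, query: string): string {
	const macros = new Map<string, string>();
	for (const line of definitions.split('\n')) {
		const match = /^(\$[A-Z]+)=(.*)$/.exec(line.trim());
		if (match) {
			macros.set(match[1], match[2]);
		}
	}
	// Replace longer names first so $MILESTONE is not clobbered by a shorter prefix.
	const names = [...macros.keys()].sort((a, b) => b.length - a.length);
	let expanded = query;
	for (const name of names) {
		expanded = expanded.split(name).join(macros.get(name)!);
	}
	return expanded;
}

const defs = '$REPOS=repo:microsoft/vscode\n$MILESTONE=milestone:"April 2021"\n$MINE=assignee:@me';
console.log(expandMacros(defs, '$REPOS $MILESTONE $MINE is:pr is:open'));
// -> repo:microsoft/vscode milestone:"April 2021" assignee:@me is:pr is:open
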
View File

@@ -8,7 +8,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-internalbacklog\n\n// current milestone name\n$milestone=milestone:\"April 2021\"",
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-internalbacklog\n\n// current milestone name\n$milestone=milestone:\"November 2020\"",
"editable": true
},
{
@@ -21,7 +21,7 @@
"kind": 2,
"language": "github-issues",
"value": "$repos $milestone assignee:@me is:open",
"editable": true
"editable": false
},
{
"kind": 1,
@@ -81,25 +81,7 @@
"kind": 2,
"language": "github-issues",
"value": "$repos assignee:@me is:open milestone:\"Backlog Candidates\"",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "### Personal Inbox\n",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "\n#### Missing Type label",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos assignee:@me is:open type:issue -label:bug -label:\"needs more info\" -label:feature-request -label:under-discussion -label:debt -label:plan-item -label:upstream",
"editable": true
"editable": false
},
{
"kind": 1,
@@ -111,6 +93,6 @@
"kind": 2,
"language": "github-issues",
"value": "$repos assignee:@me is:open label:\"needs more info\"",
"editable": true
"editable": false
}
]

View File

@@ -1,44 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "## Papercuts\n\nThis notebook serves as an ongoing collection of papercut issues that we encounter while dogfooding. With that in mind only promote issues that really turn you off, e.g. issues that make you want to stop using VS Code or its extensions. To mark an issue (bug, feature-request, etc.) as papercut add the labels: `papercut :drop_of_blood:`",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## All Papercuts\n\nThese are all papercut issues that we encounter while dogfooding vscode or extensions that we author.",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open -label:notebook label:\"papercut :drop_of_blood:\"",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Native Notebook",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open label:notebook label:\"papercut :drop_of_blood:\"",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "### My Papercuts",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open assignee:@me label:\"papercut :drop_of_blood:\"",
"editable": true
}
]

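The queries above key off the `papercut :drop_of_blood:` label. Purely as an illustration (the owner, repo, and helper name are hypothetical, not taken from this diff), applying that label with @octokit/rest might look like:

import { Octokit } from '@octokit/rest';

const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

// Hypothetical helper: tag an issue as a papercut so the queries above pick it up.
async function markAsPapercut(issueNumber: number) {
	await octokit.rest.issues.addLabels({
		owner: 'microsoft',
		repo: 'vscode',
		issue_number: issueNumber,
		labels: ['papercut :drop_of_blood:'],
	});
}
// e.g. await markAsPapercut(12345);
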
View File

@@ -14,7 +14,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"March 2021\"",
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"October 2020\"",
"editable": true
},
{
@@ -38,7 +38,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate -author:@me -assignee:@me label:bug -label:verified -author:@me -author:aeschli -author:alexdima -author:alexr00 -author:bpasero -author:chrisdias -author:chrmarti -author:connor4312 -author:dbaeumer -author:deepak1556 -author:eamodio -author:egamma -author:gregvanl -author:isidorn -author:JacksonKearl -author:joaomoreno -author:jrieken -author:lramos15 -author:lszomoru -author:meganrogge -author:misolori -author:mjbvz -author:rebornix -author:RMacfarlane -author:roblourens -author:sana-ajani -author:sandy081 -author:sbatten -author:Tyriar -author:weinand -author:rzhao271 -author:kieferrm -author:TylerLeonhardt -author:bamurtaugh",
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate -author:@me -assignee:@me label:bug -label:verified -author:@me -author:aeschli -author:alexdima -author:alexr00 -author:bpasero -author:chrisdias -author:chrmarti -author:connor4312 -author:dbaeumer -author:deepak1556 -author:eamodio -author:egamma -author:gregvanl -author:isidorn -author:JacksonKearl -author:joaomoreno -author:jrieken -author:lramos15 -author:lszomoru -author:meganrogge -author:misolori -author:mjbvz -author:rebornix -author:RMacfarlane -author:roblourens -author:sana-ajani -author:sandy081 -author:sbatten -author:Tyriar -author:weinand",
"editable": false
},
{

View File

@@ -79,7 +79,5 @@
"[typescript]": {
"editor.defaultFormatter": "vscode.typescript-language-features"
},
"typescript.tsc.autoDetect": "off",
"notebook.experimental.useMarkdownRenderer": true,
"testing.autoRun.mode": "rerun",
"typescript.tsc.autoDetect": "off"
}

.vscode/tasks.json
View File

@@ -4,11 +4,10 @@
{
"type": "npm",
"script": "watch-clientd",
"label": "Core - Build",
"label": "Build VS Code Core",
"isBackground": true,
"presentation": {
"reveal": "never",
"group": "buildWatchers"
"reveal": "never"
},
"problemMatcher": {
"owner": "typescript",
@@ -31,11 +30,10 @@
{
"type": "npm",
"script": "watch-extensionsd",
"label": "Ext - Build",
"label": "Build VS Code Extensions",
"isBackground": true,
"presentation": {
"reveal": "never",
"group": "buildWatchers"
"reveal": "never"
},
"problemMatcher": {
"owner": "typescript",
@@ -56,87 +54,43 @@
}
},
{
"type": "npm",
"script": "watch-extension-mediad",
"label": "Ext Media - Build",
"isBackground": true,
"presentation": {
"reveal": "never",
"group": "buildWatchers"
},
"problemMatcher": {
"owner": "typescript",
"applyTo": "closedDocuments",
"fileLocation": [
"absolute"
],
"pattern": {
"regexp": "Error: ([^(]+)\\((\\d+|\\d+,\\d+|\\d+,\\d+,\\d+,\\d+)\\): (.*)$",
"file": 1,
"location": 2,
"message": 3
},
"background": {
"beginsPattern": "Starting compilation",
"endsPattern": "Finished compilation"
}
}
},
{
"label": "VS Code - Build",
"label": "Build VS Code",
"dependsOn": [
"Core - Build",
"Ext - Build",
"Ext Media - Build",
"Build VS Code Core",
"Build VS Code Extensions"
],
"group": {
"kind": "build",
"isDefault": true
},
"problemMatcher": []
}
},
{
"type": "npm",
"script": "kill-watch-clientd",
"label": "Kill Core - Build",
"label": "Kill Build VS Code Core",
"group": "build",
"presentation": {
"reveal": "never",
"group": "buildKillers"
"reveal": "never"
},
"problemMatcher": "$tsc"
},
{
"type": "npm",
"script": "kill-watch-extensionsd",
"label": "Kill Ext - Build",
"label": "Kill Build VS Code Extensions",
"group": "build",
"presentation": {
"reveal": "never",
"group": "buildKillers"
"reveal": "never"
},
"problemMatcher": "$tsc"
},
{
"type": "npm",
"script": "kill-watch-extension-mediad",
"label": "Kill Ext Media - Build",
"group": "build",
"presentation": {
"reveal": "never",
"group": "buildKillers"
},
"problemMatcher": "$tsc"
},
{
"label": "Kill VS Code - Build",
"label": "Kill Build VS Code",
"dependsOn": [
"Kill Core - Build",
"Kill Ext - Build",
"Kill Ext Media - Build",
"Kill Build VS Code Core",
"Kill Build VS Code Extensions"
],
"group": "build",
"problemMatcher": []
"group": "build"
},
{
"type": "npm",
@@ -155,7 +109,7 @@
{
"type": "npm",
"script": "watch-webd",
"label": "Web Ext - Build",
"label": "Build Web Extensions",
"group": "build",
"isBackground": true,
"presentation": {
@@ -182,7 +136,7 @@
{
"type": "npm",
"script": "kill-watch-webd",
"label": "Kill Web Ext - Build",
"label": "Kill Build Web Extensions",
"group": "build",
"presentation": {
"reveal": "never"
@@ -263,7 +217,7 @@
"base": "$tsc",
"applyTo": "allDocuments",
"owner": "tsec"
}
},
],
"group": "build",
"label": "npm: tsec-compile-check",

View File

@@ -1,3 +1,3 @@
disturl "https://electronjs.org/headers"
target "12.0.7"
target "9.4.3"
runtime "electron"

View File

@@ -1,41 +1,5 @@
# Change Log
## Version 1.30.0
* Release date: June 17, 2021
* Release status: General Availability
* New Notebook Features:
* Show book's notebook TOC title in pinned notebooks view
* Add new book icon
* Update Python to 3.8.10
* Query Editor Features:
    * Added a filtering/sorting feature for the query result grid in the query editor and notebooks; the feature can be invoked from the column headers. Note that this feature is only available when preview features are enabled
    * Added a status bar item that shows a summary of the selected cells when they contain multiple numeric values
* Extension Updates:
* SQL Database Projects
* Machine Learning
* Bug Fixes
  * Fixed WYSIWYG table cell adding a new line in the table cell
## Version 1.29.0
* Release date: May 19, 2021
* Release status: General Availability
* New Notebook Features:
    * Added a run with parameters option.
* Extension Updates:
* SQL Database Projects
* Schema Compare
* Bug Fixes
## Version 1.28.0
* Release date: April 16, 2021
* Release status: General Availability
* New Notebook Features:
* Added Add Notebook and Remove Notebook commands
* Extension Updates:
* SQL Database Projects
* Schema Compare
* Bug Fixes
## Version 1.27.0
* Release date: March 17, 2021
* Release status: General Availability
@@ -49,7 +13,7 @@
* Arc
* SQL Database Projects
* ASDE Deployment
* Bug Fixes
## Version 1.26.1
* Release date: February 25, 2021
@@ -603,7 +567,7 @@ The May release is focused on stabilization and bug fixes leading up to the Buil
* Announcing **Redgate SQL Search** extension available in Extension Manager
* Community Localization available for 10 languages: **German, Spanish, French, Italian, Japanese, Korean, Portuguese, Russian, Simplified Chinese and Traditional Chinese!**
* Reduced telemetry collection, improved [opt-out](https://github.com/Microsoft/azuredatastudio/wiki/How-to-Disable-Telemetry-Reporting) experience and in-product links to [Privacy Statement](https://privacy.microsoft.com/privacystatement)
* Reduced telemetry collection, improved [opt-out](https://github.com/Microsoft/azuredatastudio/wiki/How-to-Disable-Telemetry-Reporting) experience and in-product links to [Privacy Statement](https://privacy.microsoft.com/en-us/privacystatement)
* Extension Manager has improved Marketplace experience to easily discover community extensions
* SQL Agent extension Jobs and Job History view improvement
* Updates for **whoisactive** and **Server Reports** extensions

View File

@@ -65,7 +65,7 @@ This project has adopted the [Microsoft Open Source Code of Conduct](https://ope
Azure Data Studio is localized into 10 languages: French, Italian, German, Spanish, Simplified Chinese, Traditional Chinese, Japanese, Korean, Russian, and Portuguese (Brazil). The language packs are available in the Extension Manager marketplace. Simply search for the specific language in the extension marketplace and install it. Once you install the selected language, Azure Data Studio will prompt you to restart with the new language.
## Privacy Statement
The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.microsoft.com/privacystatement) describes the privacy statement of this software.
The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.microsoft.com/en-us/privacystatement) describes the privacy statement of this software.
## Contributions and "Thank You"
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:
@@ -131,10 +131,10 @@ Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the [Source EULA](LICENSE.txt).
[win-user]: https://go.microsoft.com/fwlink/?linkid=2165736
[win-system]: https://go.microsoft.com/fwlink/?linkid=2165737
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2165838
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2165942
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2165841
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2165842
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2165738
[win-user]: https://go.microsoft.com/fwlink/?linkid=2157460
[win-system]: https://go.microsoft.com/fwlink/?linkid=2157459
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2157458
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2157456
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2157353
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2157248
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2157352

View File

@@ -1,41 +0,0 @@
<!-- BEGIN MICROSOFT SECURITY.MD V0.0.5 BLOCK -->
## Security
Microsoft takes the security of our software products and services seriously, which includes all source code repositories managed through our GitHub organizations, which include [Microsoft](https://github.com/Microsoft), [Azure](https://github.com/Azure), [DotNet](https://github.com/dotnet), [AspNet](https://github.com/aspnet), [Xamarin](https://github.com/xamarin), and [our GitHub organizations](https://opensource.microsoft.com/).
If you believe you have found a security vulnerability in any Microsoft-owned repository that meets [Microsoft's definition of a security vulnerability](https://docs.microsoft.com/en-us/previous-versions/tn-archive/cc751383(v=technet.10)), please report it to us as described below.
## Reporting Security Issues
**Please do not report security vulnerabilities through public GitHub issues.**
Instead, please report them to the Microsoft Security Response Center (MSRC) at [https://msrc.microsoft.com/create-report](https://msrc.microsoft.com/create-report).
If you prefer to submit without logging in, send email to [secure@microsoft.com](mailto:secure@microsoft.com). If possible, encrypt your message with our PGP key; please download it from the [Microsoft Security Response Center PGP Key page](https://www.microsoft.com/en-us/msrc/pgp-key-msrc).
You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Additional information can be found at [microsoft.com/msrc](https://www.microsoft.com/msrc).
Please include the requested information listed below (as much as you can provide) to help us better understand the nature and scope of the possible issue:
* Type of issue (e.g. buffer overflow, SQL injection, cross-site scripting, etc.)
* Full paths of source file(s) related to the manifestation of the issue
* The location of the affected source code (tag/branch/commit or direct URL)
* Any special configuration required to reproduce the issue
* Step-by-step instructions to reproduce the issue
* Proof-of-concept or exploit code (if possible)
* Impact of the issue, including how an attacker might exploit the issue
This information will help us triage your report more quickly.
If you are reporting for a bug bounty, more complete reports can contribute to a higher bounty award. Please visit our [Microsoft Bug Bounty Program](https://microsoft.com/msrc/bounty) page for more details about our active programs.
## Preferred Languages
We prefer all communications to be in English.
## Policy
Microsoft follows the principle of [Coordinated Vulnerability Disclosure](https://www.microsoft.com/en-us/msrc/cvd).
<!-- END MICROSOFT SECURITY.MD BLOCK -->

View File

@@ -12,7 +12,7 @@ expressly granted herein, whether by implication, estoppel or otherwise.
angular2-grid: https://github.com/BTMorton/angular2-grid
angular2-slickgrid: https://github.com/Microsoft/angular2-slickgrid
applicationinsights: https://github.com/Microsoft/ApplicationInsights-node.js
axios: https://github.com/axios/axios
bootstrap: https://github.com/twbs/bootstrap
chart.js: https://github.com/Timer/chartjs
chokidar: https://github.com/paulmillr/chokidar
@@ -29,7 +29,6 @@ expressly granted herein, whether by implication, estoppel or otherwise.
gc-signals: https://github.com/Microsoft/node-gc-signals
getmac: https://github.com/bevry/getmac
graceful-fs: https://github.com/isaacs/node-graceful-fs
gridstack: https://github.com/gridstack/gridstack.js
html-query-plan: https://github.com/JustinPealing/html-query-plan
http-proxy-agent: https://github.com/TooTallNate/node-https-proxy-agent
https-proxy-agent: https://github.com/TooTallNate/node-https-proxy-agent
@@ -494,32 +493,6 @@ IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
=========================================
END OF graceful-fs NOTICES AND INFORMATION
%% gridstack NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
Copyright (c) 2014-2020 Alain Dumesny, Dylan Weiss, Pavel Reznikov
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
=========================================
END OF gridstack NOTICES AND INFORMATION
%% html-query-plan NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)

View File

@@ -20,8 +20,3 @@ jobs:
vmImage: macOS-latest
steps:
- template: build/azure-pipelines/darwin/continuous-build-darwin.yml
trigger:
branches:
exclude:
- electron-11.x.y

View File

@@ -1 +1 @@
2021-04-07T03:52:18.011Z
2020-04-29T05:20:58.491Z

View File

@@ -1,6 +1,4 @@
# cleanup rules for node modules, .gitignore style
# native node modules
# cleanup rules for native node modules, .gitignore style
nan/**
*/node_modules/nan/**
@@ -48,7 +46,6 @@ spdlog/build/**
spdlog/deps/**
spdlog/src/**
spdlog/test/**
spdlog/*.yml
!spdlog/build/Release/*.node
jschardet/dist/**
@@ -75,7 +72,6 @@ node-pty/build/**
node-pty/src/**
node-pty/tools/**
node-pty/deps/**
node-pty/scripts/**
!node-pty/build/Release/*.exe
!node-pty/build/Release/*.dll
!node-pty/build/Release/*.node
@@ -104,11 +100,13 @@ kerberos/build/**
# END SQL Modules
nsfw/binding.gyp
nsfw/build/**
nsfw/src/**
nsfw/includes/**
!nsfw/build/Release/*.node
vscode-nsfw/binding.gyp
vscode-nsfw/build/**
vscode-nsfw/src/**
vscode-nsfw/openpa/**
vscode-nsfw/includes/**
!vscode-nsfw/build/Release/*.node
!vscode-nsfw/**/*.a
vsda/build/**
vsda/ci/**
@@ -119,55 +117,8 @@ vsda/README.md
vsda/targets
!vsda/build/Release/vsda.node
vscode-encrypt/build/**
vscode-encrypt/src/**
vscode-encrypt/vendor/**
vscode-encrypt/.gitignore
vscode-encrypt/binding.gyp
vscode-encrypt/README.md
!vscode-encrypt/build/Release/vscode-encrypt-native.node
vscode-windows-ca-certs/**/*
!vscode-windows-ca-certs/package.json
!vscode-windows-ca-certs/**/*.node
node-addon-api/**/*
# other node modules
**/docs/**
**/example/**
**/examples/**
**/test/**
**/tests/**
**/History.md
**/CHANGELOG.md
**/README.md
**/readme.md
**/readme.markdown
**/*.ts
!typescript/**/*.d.ts
jschardet/dist/**
es6-promise/lib/**
vscode-textmate/webpack.config.js
# {{SQL CARBON EDIT }} We need more than just zone-node.js
# zone.js/dist/**
# !zone.js/dist/zone-node.js
# https://github.com/xtermjs/xterm.js/issues/3137
xterm/src/**
xterm/tsconfig.all.json
# https://github.com/xtermjs/xterm.js/issues/3138
xterm-addon-*/src/**
xterm-addon-*/fixtures/**
xterm-addon-*/out/**
xterm-addon-*/out-test/**

View File

@@ -0,0 +1,15 @@
name: Copycat
description: Copy all new issues to a different repo
inputs:
token:
description: GitHub token with issue, comment, and label read/write permissions to both repos
default: ${{ github.token }}
owner:
description: account/organization that owns the destination repo (the microsoft part of microsoft/vscode)
required: true
repo:
description: name of the destination repo (the vscode part of microsoft/vscode)
required: true
runs:
using: 'node12'
main: 'index.js'

View File

@@ -0,0 +1,19 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
class CopyCat {
constructor(github, owner, repo) {
this.github = github;
this.owner = owner;
this.repo = repo;
}
async run() {
const issue = await this.github.getIssue();
console.log(`Mirroring issue \`${issue.title}\` to ${this.owner}/${this.repo}`);
await this.github.createIssue(this.owner, this.repo, issue.title, issue.body.replace(/@|#|issues/g, '-'));
}
}
exports.CopyCat = CopyCat;

View File

@@ -0,0 +1,21 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import { GitHubIssue } from '../api/api'
export class CopyCat {
constructor(private github: GitHubIssue, private owner: string, private repo: string) {}
async run() {
const issue = await this.github.getIssue()
console.log(`Mirroring issue \`${issue.title}\` to ${this.owner}/${this.repo}`)
await this.github.createIssue(
this.owner,
this.repo,
issue.title,
issue.body.replace(/@|#|issues/g, '-'),
)
}
}

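The `issue.body.replace(/@|#|issues/g, '-')` call above strips @-mentions, #-references, and issue URLs so the mirrored copy cannot ping users or auto-link back to the source issue. A standalone check of that behavior:

// Standalone check of the body sanitization used by CopyCat.run above.
const body = 'Fixes #123, /cc @someone, see https://github.com/microsoft/vscode/issues/123';
console.log(body.replace(/@|#|issues/g, '-'));
// -> Fixes -123, /cc -someone, see https://github.com/microsoft/vscode/-/123
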
View File

@@ -0,0 +1,21 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const core = require("@actions/core");
const github_1 = require("@actions/github");
const octokit_1 = require("../api/octokit");
const utils_1 = require("../utils/utils");
const copyCat_1 = require("./copyCat");
const token = utils_1.getRequiredInput('token');
const main = async () => {
await new copyCat_1.CopyCat(new octokit_1.OctoKitIssue(token, github_1.context.repo, { number: github_1.context.issue.number }), utils_1.getRequiredInput('owner'), utils_1.getRequiredInput('repo')).run();
};
main()
.then(() => utils_1.logRateLimit(token))
.catch(async (error) => {
core.setFailed(error.message);
await utils_1.logErrorToIssue(error.message, true, token);
});

View File

@@ -0,0 +1,27 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as core from '@actions/core'
import { context } from '@actions/github'
import { OctoKitIssue } from '../api/octokit'
import { getRequiredInput, logErrorToIssue, logRateLimit } from '../utils/utils'
import { CopyCat } from './copyCat'
const token = getRequiredInput('token')
const main = async () => {
await new CopyCat(
new OctoKitIssue(token, context.repo, { number: context.issue.number }),
getRequiredInput('owner'),
getRequiredInput('repo'),
).run()
}
main()
.then(() => logRateLimit(token))
.catch(async (error) => {
core.setFailed(error.message)
await logErrorToIssue(error.message, true, token)
})

View File

@@ -0,0 +1,2 @@
node_modules/
*.js

View File

@@ -1,25 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const path = require("path");
const crypto = require("crypto");
const { dirs } = require('../../npm/dirs');
const ROOT = path.join(__dirname, '../../../');
const shasum = crypto.createHash('sha1');
shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
// Add `yarn.lock` files
for (let dir of dirs) {
const yarnLockPath = path.join(ROOT, dir, 'yarn.lock');
shasum.update(fs.readFileSync(yarnLockPath));
}
// Add any other command line arguments
for (let i = 2; i < process.argv.length; i++) {
shasum.update(process.argv[i]);
}
process.stdout.write(shasum.digest('hex'));

View File

@@ -1,32 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as fs from 'fs';
import * as path from 'path';
import * as crypto from 'crypto';
const { dirs } = require('../../npm/dirs');
const ROOT = path.join(__dirname, '../../../');
const shasum = crypto.createHash('sha1');
shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
// Add `yarn.lock` files
for (let dir of dirs) {
const yarnLockPath = path.join(ROOT, dir, 'yarn.lock');
shasum.update(fs.readFileSync(yarnLockPath));
}
// Add any other command line arguments
for (let i = 2; i < process.argv.length; i++) {
shasum.update(process.argv[i]);
}
process.stdout.write(shasum.digest('hex'));

View File

@@ -1,41 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const vfs = require("vinyl-fs");
const path = require("path");
const es = require("event-stream");
const fs = require("fs");
const files = [
'.build/langpacks/**/*.vsix',
'.build/extensions/**/*.vsix',
'.build/win32-x64/**/*.{exe,zip}',
'.build/linux/sha256hashes.txt',
'.build/linux/deb/amd64/deb/*.deb',
'.build/linux/rpm/x86_64/*.rpm',
'.build/linux/server/*',
'.build/linux/archive/*',
'.build/docker/*',
'.build/darwin/*',
'.build/version.json' // version information
];
async function main() {
return new Promise((resolve, reject) => {
const stream = vfs.src(files, { base: '.build', allowEmpty: true })
.pipe(es.through(file => {
const filePath = path.join(process.env.BUILD_ARTIFACTSTAGINGDIRECTORY,
// Preserve intermediate directories after .build folder
file.path.substr(path.resolve('.build').length + 1));
fs.mkdirSync(path.dirname(filePath), { recursive: true });
fs.renameSync(file.path, filePath);
}));
stream.on('end', () => resolve());
stream.on('error', e => reject(e));
});
}
main().catch(err => {
console.error(err);
process.exit(1);
});

View File

@@ -11,7 +11,6 @@ import * as es from 'event-stream';
import * as fs from 'fs';
const files = [
'.build/langpacks/**/*.vsix', // langpacks
'.build/extensions/**/*.vsix', // external extensions
'.build/win32-x64/**/*.{exe,zip}', // windows binaries
'.build/linux/sha256hashes.txt', // linux hashes

View File

@@ -1,94 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const crypto = require("crypto");
const azure = require("azure-storage");
const mime = require("mime");
const cosmos_1 = require("@azure/cosmos");
const retry_1 = require("./retry");
if (process.argv.length !== 6) {
console.error('Usage: node createAsset.js PLATFORM TYPE NAME FILE');
process.exit(-1);
}
function hashStream(hashName, stream) {
return new Promise((c, e) => {
const shasum = crypto.createHash(hashName);
stream
.on('data', shasum.update.bind(shasum))
.on('error', e)
.on('close', () => c(shasum.digest('hex')));
});
}
async function doesAssetExist(blobService, quality, blobName) {
const existsResult = await new Promise((c, e) => blobService.doesBlobExist(quality, blobName, (err, r) => err ? e(err) : c(r)));
return existsResult.exists;
}
async function uploadBlob(blobService, quality, blobName, filePath, fileName) {
const blobOptions = {
contentSettings: {
contentType: mime.lookup(filePath),
contentDisposition: `attachment; filename="${fileName}"`,
cacheControl: 'max-age=31536000, public'
}
};
await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c()));
}
function getEnv(name) {
const result = process.env[name];
if (typeof result === 'undefined') {
throw new Error('Missing env: ' + name);
}
return result;
}
async function main() {
const [, , platform, type, fileName, filePath] = process.argv;
const quality = getEnv('VSCODE_QUALITY');
const commit = getEnv('BUILD_SOURCEVERSION');
console.log('Creating asset...');
const stat = await new Promise((c, e) => fs.stat(filePath, (err, stat) => err ? e(err) : c(stat)));
const size = stat.size;
console.log('Size:', size);
const stream = fs.createReadStream(filePath);
const [sha1hash, sha256hash] = await Promise.all([hashStream('sha1', stream), hashStream('sha256', stream)]);
console.log('SHA1:', sha1hash);
console.log('SHA256:', sha256hash);
const blobName = commit + '/' + fileName;
const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2'];
const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2'])
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
const blobExists = await doesAssetExist(blobService, quality, blobName);
if (blobExists) {
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
console.log('Uploading blobs to Azure storage...');
await uploadBlob(blobService, quality, blobName, filePath, fileName);
console.log('Blobs successfully uploaded.');
const asset = {
platform,
type,
url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
hash: sha1hash,
sha256hash,
size
};
// Remove this if we ever need to rollback fast updates for windows
if (/win32/.test(platform)) {
asset.supportsFastUpdate = true;
}
console.log('Asset:', JSON.stringify(asset, null, ' '));
const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const scripts = client.database('builds').container(quality).scripts;
await (0, retry_1.retry)(() => scripts.storedProcedure('createAsset').execute('', [commit, asset, true]));
}
main().then(() => {
console.log('Asset successfully created');
process.exit(0);
}, err => {
console.error(err);
process.exit(1);
});

View File

@@ -11,7 +11,6 @@ import * as crypto from 'crypto';
import * as azure from 'azure-storage';
import * as mime from 'mime';
import { CosmosClient } from '@azure/cosmos';
import { retry } from './retry';
interface Asset {
platform: string;
@@ -122,7 +121,7 @@ async function main(): Promise<void> {
const client = new CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT']!, key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const scripts = client.database('builds').container(quality).scripts;
await retry(() => scripts.storedProcedure('createAsset').execute('', [commit, asset, true]));
await scripts.storedProcedure('createAsset').execute('', [commit, asset, true]);
}
main().then(() => {

View File

@@ -1,51 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const cosmos_1 = require("@azure/cosmos");
const retry_1 = require("./retry");
if (process.argv.length !== 3) {
console.error('Usage: node createBuild.js VERSION');
process.exit(-1);
}
function getEnv(name) {
const result = process.env[name];
if (typeof result === 'undefined') {
throw new Error('Missing env: ' + name);
}
return result;
}
async function main() {
const [, , _version] = process.argv;
const quality = getEnv('VSCODE_QUALITY');
const commit = getEnv('BUILD_SOURCEVERSION');
const queuedBy = getEnv('BUILD_QUEUEDBY');
const sourceBranch = getEnv('BUILD_SOURCEBRANCH');
const version = _version + (quality === 'stable' ? '' : `-${quality}`);
console.log('Creating build...');
console.log('Quality:', quality);
console.log('Version:', version);
console.log('Commit:', commit);
const build = {
id: commit,
timestamp: (new Date()).getTime(),
version,
isReleased: false,
sourceBranch,
queuedBy,
assets: [],
updates: {}
};
const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const scripts = client.database('builds').container(quality).scripts;
await (0, retry_1.retry)(() => scripts.storedProcedure('createBuild').execute('', [Object.assign(Object.assign({}, build), { _partitionKey: '' })]));
}
main().then(() => {
console.log('Build successfully created');
process.exit(0);
}, err => {
console.error(err);
process.exit(1);
});

View File

@@ -6,7 +6,6 @@
'use strict';
import { CosmosClient } from '@azure/cosmos';
import { retry } from './retry';
if (process.argv.length !== 3) {
console.error('Usage: node createBuild.js VERSION');
@@ -49,7 +48,7 @@ async function main(): Promise<void> {
const client = new CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT']!, key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const scripts = client.database('builds').container(quality).scripts;
await retry(() => scripts.storedProcedure('createBuild').execute('', [{ ...build, _partitionKey: '' }]));
await scripts.storedProcedure('createBuild').execute('', [{ ...build, _partitionKey: '' }]);
}
main().then(() => {

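Both the createAsset.ts and createBuild.ts hunks above drop the `retry(...)` wrapper around the Cosmos DB stored-procedure call. The wrapper being removed matches the inline retry defined later in the old publish.js; a typed sketch of that pattern (up to 10 attempts, retrying only on transient ECONNRESET errors):

// Sketch of the retry pattern these hunks remove (mirrors the inline retry
// in the old publish.js further down in this diff).
const RETRY_TIMES = 10;

async function retry<T>(fn: () => Promise<T>): Promise<T> {
	for (let run = 1; run <= RETRY_TIMES; run++) {
		try {
			return await fn();
		} catch (err: any) {
			if (!/ECONNRESET/.test(err.message)) {
				throw err; // not transient: fail immediately
			}
			console.log(`Caught error ${err} - ${run}/${RETRY_TIMES}`);
		}
	}
	throw new Error('Retried too many times');
}

// e.g. await retry(() => scripts.storedProcedure('createBuild').execute('', [build]));
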
View File

@@ -10,8 +10,8 @@ git clone --depth 1 https://github.com/Microsoft/vscode-node-debug2.git
git clone --depth 1 https://github.com/Microsoft/vscode-node-debug.git
git clone --depth 1 https://github.com/Microsoft/vscode-html-languageservice.git
git clone --depth 1 https://github.com/Microsoft/vscode-json-languageservice.git
node $BUILD_SOURCESDIRECTORY/node_modules/.bin/vscode-telemetry-extractor --sourceDir $BUILD_SOURCESDIRECTORY --excludedDir $BUILD_SOURCESDIRECTORY/extensions --outputDir . --applyEndpoints
node $BUILD_SOURCESDIRECTORY/node_modules/.bin/vscode-telemetry-extractor --config $BUILD_SOURCESDIRECTORY/build/azure-pipelines/common/telemetry-config.json -o .
node $BUILD_SOURCESDIRECTORY/build/node_modules/.bin/vscode-telemetry-extractor --sourceDir $BUILD_SOURCESDIRECTORY --excludedDir $BUILD_SOURCESDIRECTORY/extensions --outputDir . --applyEndpoints
node $BUILD_SOURCESDIRECTORY/build/node_modules/.bin/vscode-telemetry-extractor --config $BUILD_SOURCESDIRECTORY/build/azure-pipelines/common/telemetry-config.json -o .
mkdir -p $BUILD_SOURCESDIRECTORY/.build/telemetry
mv declarations-resolved.json $BUILD_SOURCESDIRECTORY/.build/telemetry/telemetry-core.json
mv config-resolved.json $BUILD_SOURCESDIRECTORY/.build/telemetry/telemetry-extensions.json

View File

@@ -1,14 +0,0 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const path = require("path");
const retry_1 = require("./retry");
const { installBrowsersWithProgressBar } = require('playwright/lib/install/installer');
const playwrightPath = path.dirname(require.resolve('playwright'));
async function install() {
await retry_1.retry(() => installBrowsersWithProgressBar(playwrightPath));
}
install();

View File

@@ -1,40 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const path = require("path");
if (process.argv.length !== 3) {
console.error('Usage: node listNodeModules.js OUTPUT_FILE');
process.exit(-1);
}
const ROOT = path.join(__dirname, '../../../');
function findNodeModulesFiles(location, inNodeModules, result) {
const entries = fs.readdirSync(path.join(ROOT, location));
for (const entry of entries) {
const entryPath = `${location}/${entry}`;
if (/(^\/out)|(^\/src$)|(^\/.git$)|(^\/.build$)/.test(entryPath)) {
continue;
}
let stat;
try {
stat = fs.statSync(path.join(ROOT, entryPath));
}
catch (err) {
continue;
}
if (stat.isDirectory()) {
findNodeModulesFiles(entryPath, inNodeModules || (entry === 'node_modules'), result);
}
else {
if (inNodeModules) {
result.push(entryPath.substr(1));
}
}
}
}
const result = [];
findNodeModulesFiles('', false, result);
fs.writeFileSync(process.argv[2], result.join('\n') + '\n');

View File

@@ -1,46 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as fs from 'fs';
import * as path from 'path';
if (process.argv.length !== 3) {
console.error('Usage: node listNodeModules.js OUTPUT_FILE');
process.exit(-1);
}
const ROOT = path.join(__dirname, '../../../');
function findNodeModulesFiles(location: string, inNodeModules: boolean, result: string[]) {
const entries = fs.readdirSync(path.join(ROOT, location));
for (const entry of entries) {
const entryPath = `${location}/${entry}`;
if (/(^\/out)|(^\/src$)|(^\/.git$)|(^\/.build$)/.test(entryPath)) {
continue;
}
let stat: fs.Stats;
try {
stat = fs.statSync(path.join(ROOT, entryPath));
} catch (err) {
continue;
}
if (stat.isDirectory()) {
findNodeModulesFiles(entryPath, inNodeModules || (entry === 'node_modules'), result);
} else {
if (inNodeModules) {
result.push(entryPath.substr(1));
}
}
}
}
const result: string[] = [];
findNodeModulesFiles('', false, result);
fs.writeFileSync(process.argv[2], result.join('\n') + '\n');

View File

@@ -1,71 +0,0 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const azure = require("azure-storage");
const mime = require("mime");
const minimist = require("minimist");
const path_1 = require("path");
const fileNames = [
'fake.html',
'host.js',
'index.html',
'main.js',
'service-worker.js'
];
async function assertContainer(blobService, container) {
await new Promise((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}
async function doesBlobExist(blobService, container, blobName) {
const existsResult = await new Promise((c, e) => blobService.doesBlobExist(container, blobName, (err, r) => err ? e(err) : c(r)));
return existsResult.exists;
}
async function uploadBlob(blobService, container, blobName, file) {
const blobOptions = {
contentSettings: {
contentType: mime.lookup(file),
cacheControl: 'max-age=31536000, public'
}
};
await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c()));
}
async function publish(commit, files) {
console.log('Publishing...');
console.log('Commit:', commit);
const storageAccount = process.env['AZURE_WEBVIEW_STORAGE_ACCOUNT'];
const blobService = azure.createBlobService(storageAccount, process.env['AZURE_WEBVIEW_STORAGE_ACCESS_KEY'])
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
await assertContainer(blobService, commit);
for (const file of files) {
const blobName = (0, path_1.basename)(file);
const blobExists = await doesBlobExist(blobService, commit, blobName);
if (blobExists) {
console.log(`Blob ${commit}, ${blobName} already exists, not publishing again.`);
continue;
}
console.log('Uploading blob to Azure storage...');
await uploadBlob(blobService, commit, blobName, file);
}
console.log('Blobs successfully uploaded.');
}
function main() {
const commit = process.env['BUILD_SOURCEVERSION'];
if (!commit) {
console.warn('Skipping publish due to missing BUILD_SOURCEVERSION');
return;
}
const opts = minimist(process.argv.slice(2));
const [directory] = opts._;
const files = fileNames.map(fileName => (0, path_1.join)(directory, fileName));
publish(commit, files).catch(err => {
console.error(err);
process.exit(1);
});
}
if (process.argv.length < 3) {
console.error('Usage: node publish.js <directory>');
process.exit(-1);
}
main();

View File

@@ -1,224 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const crypto = require("crypto");
const azure = require("azure-storage");
const mime = require("mime");
const minimist = require("minimist");
const documentdb_1 = require("documentdb");
// {{SQL CARBON EDIT}}
if (process.argv.length < 9) {
console.error('Usage: node publish.js <product_quality> <platform> <file_type> <file_name> <version> <is_update> <file> [commit_id]');
process.exit(-1);
}
function hashStream(hashName, stream) {
return new Promise((c, e) => {
const shasum = crypto.createHash(hashName);
stream
.on('data', shasum.update.bind(shasum))
.on('error', e)
.on('close', () => c(shasum.digest('hex')));
});
}
function createDefaultConfig(quality) {
return {
id: quality,
frozen: false
};
}
function getConfig(quality) {
console.log(`Getting config for quality ${quality}`);
const client = new documentdb_1.DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/config';
const query = {
query: `SELECT TOP 1 * FROM c WHERE c.id = @quality`,
parameters: [
{ name: '@quality', value: quality }
]
};
return retry(() => new Promise((c, e) => {
client.queryDocuments(collection, query, { enableCrossPartitionQuery: true }).toArray((err, results) => {
if (err && err.code !== 409) {
return e(err);
}
c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0]);
});
}));
}
function createOrUpdate(commit, quality, platform, type, release, asset, isUpdate) {
const client = new documentdb_1.DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/' + quality;
const updateQuery = {
query: 'SELECT TOP 1 * FROM c WHERE c.id = @id',
parameters: [{ name: '@id', value: commit }]
};
let updateTries = 0;
function update() {
updateTries++;
return new Promise((c, e) => {
console.log(`Querying existing documents to update...`);
client.queryDocuments(collection, updateQuery, { enableCrossPartitionQuery: true }).toArray((err, results) => {
if (err) {
return e(err);
}
if (results.length !== 1) {
return e(new Error('No documents'));
}
const release = results[0];
release.assets = [
...release.assets.filter((a) => !(a.platform === platform && a.type === type)),
asset
];
if (isUpdate) {
release.updates[platform] = type;
}
console.log(`Replacing existing document with updated version`);
client.replaceDocument(release._self, release, err => {
if (err && err.code === 409 && updateTries < 5) {
return c(update());
}
if (err) {
return e(err);
}
console.log('Build successfully updated.');
c();
});
});
});
}
return retry(() => new Promise((c, e) => {
console.log(`Attempting to create document`);
client.createDocument(collection, release, err => {
if (err && err.code === 409) {
return c(update());
}
if (err) {
return e(err);
}
console.log('Build successfully published.');
c();
});
}));
}
async function assertContainer(blobService, quality) {
await new Promise((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}
async function doesAssetExist(blobService, quality, blobName) {
const existsResult = await new Promise((c, e) => blobService.doesBlobExist(quality, blobName, (err, r) => err ? e(err) : c(r)));
return existsResult.exists;
}
async function uploadBlob(blobService, quality, blobName, file) {
const blobOptions = {
contentSettings: {
contentType: mime.lookup(file),
cacheControl: 'max-age=31536000, public'
}
};
await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c()));
}
async function publish(commit, quality, platform, type, name, version, _isUpdate, file, opts) {
const isUpdate = _isUpdate === 'true';
const queuedBy = process.env['BUILD_QUEUEDBY'];
const sourceBranch = process.env['BUILD_SOURCEBRANCH'];
console.log('Publishing...');
console.log('Quality:', quality);
console.log('Platform:', platform);
console.log('Type:', type);
console.log('Name:', name);
console.log('Version:', version);
console.log('Commit:', commit);
console.log('Is Update:', isUpdate);
console.log('File:', file);
const stat = await new Promise((c, e) => fs.stat(file, (err, stat) => err ? e(err) : c(stat)));
const size = stat.size;
console.log('Size:', size);
const stream = fs.createReadStream(file);
const [sha1hash, sha256hash] = await Promise.all([hashStream('sha1', stream), hashStream('sha256', stream)]);
console.log('SHA1:', sha1hash);
console.log('SHA256:', sha256hash);
const blobName = commit + '/' + name;
const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2'];
const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2'])
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
await assertContainer(blobService, quality);
const blobExists = await doesAssetExist(blobService, quality, blobName);
if (blobExists) {
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
console.log('Uploading blobs to Azure storage...');
await uploadBlob(blobService, quality, blobName, file);
console.log('Blobs successfully uploaded.');
const config = await getConfig(quality);
console.log('Quality config:', config);
const asset = {
platform: platform,
type: type,
url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
hash: sha1hash,
sha256hash,
size
};
// Remove this if we ever need to rollback fast updates for windows
if (/win32/.test(platform)) {
asset.supportsFastUpdate = true;
}
console.log('Asset:', JSON.stringify(asset, null, ' '));
// {{SQL CARBON EDIT}}
// Insiders: nightly build from main
const isReleased = (((quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
(quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))) &&
/Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy));
const release = {
id: commit,
timestamp: (new Date()).getTime(),
version,
isReleased: isReleased,
sourceBranch,
queuedBy,
assets: [],
updates: {}
};
if (!opts['upload-only']) {
release.assets.push(asset);
if (isUpdate) {
release.updates[platform] = type;
}
}
await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
}
const RETRY_TIMES = 10;
async function retry(fn) {
for (let run = 1; run <= RETRY_TIMES; run++) {
try {
return await fn();
}
catch (err) {
if (!/ECONNRESET/.test(err.message)) {
throw err;
}
console.log(`Caught error ${err} - ${run}/${RETRY_TIMES}`);
}
}
throw new Error('Retried too many times');
}
function main() {
const commit = process.env['BUILD_SOURCEVERSION'];
if (!commit) {
console.warn('Skipping publish due to missing BUILD_SOURCEVERSION');
return;
}
const opts = minimist(process.argv.slice(2), {
boolean: ['upload-only']
});
const [quality, platform, type, name, version, _isUpdate, file] = opts._;
publish(commit, quality, platform, type, name, version, _isUpdate, file, opts).catch(err => {
console.error(err);
process.exit(1);
});
}
main();
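
Editor's note: the createOrUpdate flow above is optimistic concurrency against the builds database. createDocument is attempted first; a 409 conflict means another platform's job already created the release document for this commit, so update() re-reads the document, merges the new asset in, and replaces it, retrying up to five times if a concurrent writer races it. A minimal sketch of the same pattern against the newer @azure/cosmos SDK follows (a sketch only: the partition key is assumed to equal the document id, and the etag precondition is an addition, not something the script above does):

import { Container } from '@azure/cosmos';

// Sketch only: first writer wins via create(); on a 409 conflict, re-read,
// merge the asset, and replace with an etag precondition so a racing writer
// forces another pass instead of being silently overwritten.
async function createOrUpdateRelease(container: Container, release: any, asset: any): Promise<void> {
  try {
    await container.items.create(release);
    return;
  } catch (err: any) {
    if (err.code !== 409) { throw err; } // anything but a conflict is fatal
  }
  for (let attempt = 0; attempt < 5; attempt++) {
    // Partition key assumed to equal the document id (the commit hash).
    const { resource: existing } = await container.item(release.id, release.id).read<any>();
    existing.assets = [
      ...existing.assets.filter((a: any) => !(a.platform === asset.platform && a.type === asset.type)),
      asset
    ];
    try {
      await container.item(release.id, release.id).replace(existing, {
        accessCondition: { type: 'IfMatch', condition: existing._etag }
      });
      return;
    } catch (err: any) {
      if (err.code !== 412) { throw err; } // 412: etag changed under us, retry
    }
  }
  throw new Error('Gave up after repeated write conflicts');
}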

View File

@@ -1,91 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const documentdb_1 = require("documentdb");
function createDefaultConfig(quality) {
return {
id: quality,
frozen: false
};
}
function getConfig(quality) {
const client = new documentdb_1.DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/config';
const query = {
query: `SELECT TOP 1 * FROM c WHERE c.id = @quality`,
parameters: [
{ name: '@quality', value: quality }
]
};
return new Promise((c, e) => {
client.queryDocuments(collection, query).toArray((err, results) => {
if (err && err.code !== 409) {
return e(err);
}
c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0]);
});
});
}
function doRelease(commit, quality) {
const client = new documentdb_1.DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/' + quality;
const query = {
query: 'SELECT TOP 1 * FROM c WHERE c.id = @id',
parameters: [{ name: '@id', value: commit }]
};
let updateTries = 0;
function update() {
updateTries++;
return new Promise((c, e) => {
client.queryDocuments(collection, query).toArray((err, results) => {
if (err) {
return e(err);
}
if (results.length !== 1) {
return e(new Error('No documents'));
}
const release = results[0];
release.isReleased = true;
client.replaceDocument(release._self, release, err => {
if (err && err.code === 409 && updateTries < 5) {
return c(update());
}
if (err) {
return e(err);
}
console.log('Build successfully updated.');
c();
});
});
});
}
return update();
}
async function release(commit, quality) {
const config = await getConfig(quality);
console.log('Quality config:', config);
if (config.frozen) {
console.log(`Skipping release because quality ${quality} is frozen.`);
return;
}
await doRelease(commit, quality);
}
function env(name) {
const result = process.env[name];
if (!result) {
throw new Error(`Skipping release due to missing env: ${name}`);
}
return result;
}
async function main() {
const commit = env('BUILD_SOURCEVERSION');
const quality = env('VSCODE_QUALITY');
await release(commit, quality);
}
main().catch(err => {
console.error(err);
process.exit(1);
});
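
Editor's note: this legacy DocumentDB script flips isReleased with a read-modify-replace loop, retrying up to five times when a concurrent replace comes back 409. Its replacement below hands the same flip to a releaseBuild stored procedure, which makes the update atomic on the server. Newer @azure/cosmos releases also expose a partial-update API that removes the loop entirely; a hedged sketch (partition key again assumed to equal the document id):

import { Container, PatchOperation } from '@azure/cosmos';

// Sketch only: set /isReleased server-side; no read/replace round-trip and
// no 409 retry loop needed.
async function markReleased(container: Container, commit: string): Promise<void> {
  const operations: PatchOperation[] = [
    { op: 'set', path: '/isReleased', value: true }
  ];
  await container.item(commit, commit).patch(operations);
}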

View File

@@ -1,50 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const cosmos_1 = require("@azure/cosmos");
const retry_1 = require("./retry");
function getEnv(name) {
const result = process.env[name];
if (typeof result === 'undefined') {
throw new Error('Missing env: ' + name);
}
return result;
}
function createDefaultConfig(quality) {
return {
id: quality,
frozen: false
};
}
async function getConfig(client, quality) {
const query = `SELECT TOP 1 * FROM c WHERE c.id = "${quality}"`;
const res = await client.database('builds').container('config').items.query(query).fetchAll();
if (res.resources.length === 0) {
return createDefaultConfig(quality);
}
return res.resources[0];
}
async function main() {
const commit = getEnv('BUILD_SOURCEVERSION');
const quality = getEnv('VSCODE_QUALITY');
const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const config = await getConfig(client, quality);
console.log('Quality config:', config);
if (config.frozen) {
console.log(`Skipping release because quality ${quality} is frozen.`);
return;
}
console.log(`Releasing build ${commit}...`);
const scripts = client.database('builds').container(quality).scripts;
await (0, retry_1.retry)(() => scripts.storedProcedure('releaseBuild').execute('', [commit]));
}
main().then(() => {
console.log('Build successfully released');
process.exit(0);
}, err => {
console.error(err);
process.exit(1);
});
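
Editor's note: one difference from the DocumentDB script it replaces is that getConfig here splices quality directly into the SQL text, whereas the old version bound it as a query parameter. quality comes from a pipeline variable, so the practical risk is low, but the parameterized form works just as well with this SDK; a sketch:

import { Container, SqlQuerySpec } from '@azure/cosmos';

// Sketch only: the same TOP 1 lookup, with the value bound as a parameter
// instead of spliced into the query text.
async function getConfigParameterized(container: Container, quality: string): Promise<any> {
  const query: SqlQuerySpec = {
    query: 'SELECT TOP 1 * FROM c WHERE c.id = @quality',
    parameters: [{ name: '@quality', value: quality }]
  };
  const res = await container.items.query(query).fetchAll();
  return res.resources.length === 0 ? { id: quality, frozen: false } : res.resources[0];
}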

View File

@@ -6,7 +6,6 @@
'use strict';
import { CosmosClient } from '@azure/cosmos';
import { retry } from './retry';
function getEnv(name: string): string {
const result = process.env[name];
@@ -59,7 +58,7 @@ async function main(): Promise<void> {
console.log(`Releasing build ${commit}...`);
const scripts = client.database('builds').container(quality).scripts;
await retry(() => scripts.storedProcedure('releaseBuild').execute('', [commit]));
await scripts.storedProcedure('releaseBuild').execute('', [commit]);
}
main().then(() => {

View File

@@ -1,25 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
exports.retry = void 0;
async function retry(fn) {
for (let run = 1; run <= 10; run++) {
try {
return await fn();
}
catch (err) {
if (!/ECONNRESET/.test(err.message)) {
throw err;
}
const millis = (Math.random() * 200) + (50 * Math.pow(1.5, run));
console.log(`Failed with ECONNRESET, retrying in ${millis}ms...`);
// maximum delay is 10th retry: ~3 seconds
await new Promise(c => setTimeout(c, millis));
}
}
throw new Error('Retried too many times');
}
exports.retry = retry;

View File

@@ -1,26 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
export async function retry<T>(fn: () => Promise<T>): Promise<T> {
for (let run = 1; run <= 10; run++) {
try {
return await fn();
} catch (err) {
if (!/ECONNRESET/.test(err.message)) {
throw err;
}
const millis = (Math.random() * 200) + (50 * Math.pow(1.5, run));
console.log(`Failed with ECONNRESET, retrying in ${millis}ms...`);
// maximum delay is 10th retry: ~3 seconds
await new Promise(c => setTimeout(c, millis));
}
}
throw new Error('Retried too many times');
}
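
Editor's note: the backoff here is Math.random() * 200 + 50 * 1.5^run milliseconds, so roughly 75-275ms on the first retry, growing to about 2.9-3.1s by the tenth (50 * 1.5^10 ≈ 2883ms), for a bit under 10 seconds of cumulative waiting in the worst case (≈8.5s deterministic plus up to 2s of jitter). A hypothetical caller (fetchBuildMetadata is an assumed helper, not from this diff):

import { retry } from './retry';

// fetchBuildMetadata is an assumed helper, declared here only so the sketch
// type-checks; it stands in for any flaky network call.
declare function fetchBuildMetadata(commit: string): Promise<unknown>;

// Sketch only: retry() resolves with the first attempt that does not throw,
// retries only on ECONNRESET, and rethrows any other error immediately.
async function example(commit: string): Promise<void> {
  const metadata = await retry(() => fetchBuildMetadata(commit));
  console.log('Fetched within at most 10 attempts:', metadata);
}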

View File

@@ -1,47 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const path = require("path");
const crypto = require("crypto");
const ROOT = path.join(__dirname, '../../../');
function findFiles(location, pattern, result) {
const entries = fs.readdirSync(path.join(ROOT, location));
for (const entry of entries) {
const entryPath = `${location}/${entry}`;
let stat;
try {
stat = fs.statSync(path.join(ROOT, entryPath));
}
catch (err) {
continue;
}
if (stat.isDirectory()) {
findFiles(entryPath, pattern, result);
}
else {
if (stat.isFile() && entry.endsWith(pattern)) {
result.push(path.join(ROOT, entryPath));
}
}
}
}
const shasum = crypto.createHash('sha1');
/**
* Create a SHA-1 hash of all the files that can cause packages to change or be re-downloaded.
*/
shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
// Adding all yarn.lock files into sha sum.
const result = [];
findFiles('', 'yarn.lock', result);
result.forEach(f => shasum.update(fs.readFileSync(f)));
// Add any other command line arguments
for (let i = 2; i < process.argv.length; i++) {
shasum.update(process.argv[i]);
}
process.stdout.write(shasum.digest('hex'));

View File

@@ -1,54 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as fs from 'fs';
import * as path from 'path';
import * as crypto from 'crypto';
const ROOT = path.join(__dirname, '../../../');
function findFiles(location: string, pattern: string, result: string[]) {
const entries = fs.readdirSync(path.join(ROOT, location));
for (const entry of entries) {
const entryPath = `${location}/${entry}`;
let stat: fs.Stats;
try {
stat = fs.statSync(path.join(ROOT, entryPath));
} catch (err) {
continue;
}
if (stat.isDirectory()) {
findFiles(entryPath, pattern, result);
} else {
if (stat.isFile() && entry.endsWith(pattern)) {
result.push(path.join(ROOT, entryPath));
}
}
}
}
const shasum = crypto.createHash('sha1');
/**
* Create a SHA-1 hash of all the files that can cause packages to change or be re-downloaded.
*/
shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
// Adding all yarn.lock files into sha sum.
const result: string[] = [];
findFiles('', 'yarn.lock', result);
result.forEach(f => shasum.update(fs.readFileSync(f)));
// Add any other command line arguments
for (let i = 2; i < process.argv.length; i++) {
shasum.update(process.argv[i]);
}
process.stdout.write(shasum.digest('hex'));
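
Editor's note: the script's only output is the hex digest on stdout, which the pipelines below capture with shell redirection (node build/azure-pipelines/common/computeNodeModulesCacheKey.js $VSCODE_ARCH $ENABLE_TERRAPIN > .build/yarnlockhash) and feed into the cache task's key. A hedged Node equivalent of that consumer:

import { execFileSync } from 'child_process';
import * as fs from 'fs';

// Sketch only: capture the digest from stdout and persist it where the cache
// task expects its key material; the arguments mirror the pipeline YAML below.
const key = execFileSync('node', [
  'build/azure-pipelines/common/computeNodeModulesCacheKey.js',
  process.env['VSCODE_ARCH'] || '',
  process.env['ENABLE_TERRAPIN'] || ''
], { encoding: 'utf8' });
fs.mkdirSync('.build', { recursive: true });
fs.writeFileSync('.build/yarnlockhash', key);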

View File

@@ -1,87 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const url = require("url");
const azure = require("azure-storage");
const mime = require("mime");
const cosmos_1 = require("@azure/cosmos");
const retry_1 = require("./retry");
function log(...args) {
console.log(...[`[${new Date().toISOString()}]`, ...args]);
}
function error(...args) {
console.error(...[`[${new Date().toISOString()}]`, ...args]);
}
if (process.argv.length < 3) {
error('Usage: node sync-mooncake.js <quality>');
process.exit(-1);
}
async function sync(commit, quality) {
log(`Synchronizing Mooncake assets for ${quality}, ${commit}...`);
const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const container = client.database('builds').container(quality);
const query = `SELECT TOP 1 * FROM c WHERE c.id = "${commit}"`;
const res = await container.items.query(query, {}).fetchAll();
if (res.resources.length !== 1) {
throw new Error(`No builds found for ${commit}`);
}
const build = res.resources[0];
log(`Found build for ${commit}, with ${build.assets.length} assets`);
const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2'];
const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2'])
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
const mooncakeBlobService = azure.createBlobService(storageAccount, process.env['MOONCAKE_STORAGE_ACCESS_KEY'], `${storageAccount}.blob.core.chinacloudapi.cn`)
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
// mooncake is fussy and far away, this is needed!
blobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
mooncakeBlobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
for (const asset of build.assets) {
try {
const blobPath = url.parse(asset.url).path;
if (!blobPath) {
throw new Error(`Failed to parse URL: ${asset.url}`);
}
const blobName = blobPath.replace(/^\/\w+\//, '');
log(`Found ${blobName}`);
if (asset.mooncakeUrl) {
log(` Already in Mooncake ✔️`);
continue;
}
const readStream = blobService.createReadStream(quality, blobName, undefined);
const blobOptions = {
contentSettings: {
contentType: mime.lookup(blobPath),
cacheControl: 'max-age=31536000, public'
}
};
const writeStream = mooncakeBlobService.createWriteStreamToBlockBlob(quality, blobName, blobOptions, undefined);
log(` Uploading to Mooncake...`);
await new Promise((c, e) => readStream.pipe(writeStream).on('finish', c).on('error', e));
log(` Updating build in DB...`);
const mooncakeUrl = `${process.env['MOONCAKE_CDN_URL']}${blobPath}`;
await (0, retry_1.retry)(() => container.scripts.storedProcedure('setAssetMooncakeUrl')
.execute('', [commit, asset.platform, asset.type, mooncakeUrl]));
log(` Done ✔️`);
}
catch (err) {
error(err);
}
}
log(`All done ✔️`);
}
function main() {
const commit = process.env['BUILD_SOURCEVERSION'];
if (!commit) {
error('Skipping publish due to missing BUILD_SOURCEVERSION');
return;
}
const quality = process.argv[2];
sync(commit, quality).catch(err => {
error(err);
process.exit(1);
});
}
main();
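
Editor's note: the copy loop above pipes a read stream from the primary storage account into a write stream on the Mooncake (Azure China) account and awaits 'finish'. Because pipe() returns its destination, the single .on('error') in the original ends up attached only to the write side; a small helper that also surfaces read-side failures (an assumption, not part of this diff):

import { Readable, Writable } from 'stream';

// Sketch only: resolve when the destination finishes, reject if either side
// of the pipe emits an error.
function pipeToCompletion(source: Readable, destination: Writable): Promise<void> {
  return new Promise((resolve, reject) => {
    source.on('error', reject);
    destination.on('error', reject);
    source.pipe(destination).on('finish', () => resolve());
  });
}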

View File

@@ -9,7 +9,6 @@ import * as url from 'url';
import * as azure from 'azure-storage';
import * as mime from 'mime';
import { CosmosClient } from '@azure/cosmos';
import { retry } from './retry';
function log(...args: any[]) {
console.log(...[`[${new Date().toISOString()}]`, ...args]);
@@ -100,8 +99,8 @@ async function sync(commit: string, quality: string): Promise<void> {
log(` Updating build in DB...`);
const mooncakeUrl = `${process.env['MOONCAKE_CDN_URL']}${blobPath}`;
await retry(() => container.scripts.storedProcedure('setAssetMooncakeUrl')
.execute('', [commit, asset.platform, asset.type, mooncakeUrl]));
await container.scripts.storedProcedure('setAssetMooncakeUrl')
.execute('', [commit, asset.platform, asset.type, mooncakeUrl]);
log(` Done ✔️`);
} catch (err) {

View File

@@ -8,11 +8,5 @@
<true/>
<key>com.apple.security.cs.allow-dyld-environment-variables</key>
<true/>
<key>com.apple.security.device.audio-input</key>
<true/>
<key>com.apple.security.device.camera</key>
<true/>
<key>com.apple.security.automation.apple-events</key>
<true/>
</dict>
</plist>

View File

@@ -1,7 +1,7 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "12.18.3"
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3 # {{SQL CARBON EDIT}} update version
inputs:

View File

@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
<true/>
<key>com.apple.security.cs.disable-library-validation</key>
<true/>
</dict>
</plist>

View File

@@ -1,129 +0,0 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "14.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
pushd build \
&& yarn \
&& npm install -g typescript \
&& tsc azure-pipelines/common/createAsset.ts \
&& popd
displayName: Restore modules for just build folder and compile it
- download: current
artifact: vscode-darwin-$(VSCODE_ARCH)
displayName: Download $(VSCODE_ARCH) artifact
- script: |
set -e
unzip $(Pipeline.Workspace)/vscode-darwin-$(VSCODE_ARCH)/VSCode-darwin-$(VSCODE_ARCH).zip -d $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
mv $(Pipeline.Workspace)/vscode-darwin-$(VSCODE_ARCH)/VSCode-darwin-$(VSCODE_ARCH).zip $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip
displayName: Unzip & move
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: "ESRP CodeSign"
FolderPath: "$(agent.builddirectory)"
Pattern: "VSCode-darwin-$(VSCODE_ARCH).zip"
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppDeveloperSign",
"parameters": [
{
"parameterName": "Hardening",
"parameterValue": "--options=runtime"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Codesign
- script: |
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
BUNDLE_IDENTIFIER=$(node -p "require(\"$APP_ROOT/$APP_NAME/Contents/Resources/app/product.json\").darwinBundleIdentifier")
echo "##vso[task.setvariable variable=BundleIdentifier]$BUNDLE_IDENTIFIER"
displayName: Export bundle identifier
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: "ESRP CodeSign"
FolderPath: "$(agent.builddirectory)"
Pattern: "VSCode-darwin-$(VSCODE_ARCH).zip"
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppNotarize",
"parameters": [
{
"parameterName": "BundleId",
"parameterValue": "$(BundleIdentifier)"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Notarization
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
"$APP_ROOT/$APP_NAME/Contents/Resources/app/bin/code" --export-default-configuration=.build
displayName: Verify start after signing (export configuration)
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))
- script: |
set -e
# For legacy purposes, arch for x64 is just 'darwin'
case $VSCODE_ARCH in
x64) ASSET_ID="darwin" ;;
arm64) ASSET_ID="darwin-arm64" ;;
universal) ASSET_ID="darwin-universal" ;;
esac
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
node build/azure-pipelines/common/createAsset.js \
"$ASSET_ID" \
archive \
"VSCode-$ASSET_ID.zip" \
../VSCode-darwin-$(VSCODE_ARCH).zip
displayName: Publish Clients

View File

@@ -1,7 +1,32 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin
displayName: Prepare compilation cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "14.x"
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
@@ -9,23 +34,9 @@ steps:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: DownloadPipelineArtifact@2
inputs:
artifact: Compilation
path: $(Build.ArtifactStagingDirectory)
displayName: Download compilation output
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'universal'))
- script: |
set -e
tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
displayName: Extract compilation output
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'universal'))
# Set up the credentials to retrieve distro repo and setup git persona
# to create a merge commit for when we merge distro into oss
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
@@ -44,47 +55,40 @@ steps:
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
mkdir -p .build
node build/azure-pipelines/common/computeNodeModulesCacheKey.js $VSCODE_ARCH $ENABLE_TERRAPIN > .build/yarnlockhash
npx https://aka.ms/enablesecurefeed standAlone
displayName: Switch to Terrapin packages
timeoutInMinutes: 5
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
- script: |
echo -n $(VSCODE_ARCH) > .build/arch
displayName: Prepare yarn cache flags
- task: Cache@2
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
key: 'nodeModules | $(Agent.OS) | .build/yarnlockhash'
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
displayName: Restore node_modules cache
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
- script: |
set -e
tar -xzf .build/node_modules_cache/cache.tgz
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Extract node_modules cache
- script: |
set -e
npm install -g node-gyp@latest
npm install -g node-gyp@7.1.0
node-gyp --version
displayName: Update node-gyp
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
- script: |
set -e
npx https://aka.ms/enablesecurefeed standAlone
timeoutInMinutes: 5
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
displayName: Switch to Terrapin packages
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
export npm_config_arch=$(VSCODE_ARCH)
export npm_config_node_gyp=$(which node-gyp)
export npm_config_build_from_source=true
export SDKROOT=/Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.0.sdk
export CHILD_CONCURRENCY="1"
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile && break
@@ -94,21 +98,39 @@ steps:
fi
echo "Yarn failed $i, trying again..."
done
env:
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
displayName: Install dependencies
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
export npm_config_arch=$(VSCODE_ARCH)
export npm_config_node_gyp=$(which node-gyp)
export SDKROOT=/Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.0.sdk
ls /Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
export npm_config_arch=$(VSCODE_ARCH)
export npm_config_node_gyp=$(which node-gyp)
export npm_config_build_from_source=true
export SDKROOT=/Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.0.sdk
ls /Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/
yarn electron-rebuild
cd ./node_modules/keytar
node-gyp rebuild
displayName: Rebuild native modules for ARM64
condition: eq(variables['VSCODE_ARCH'], 'arm64')
# This script brings in the right resources (images, icons, etc) based on the quality (insiders, stable, exploration)
- script: |
set -e
node build/azure-pipelines/mixin
@@ -118,8 +140,7 @@ steps:
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-darwin-$(VSCODE_ARCH)-min-ci
displayName: Build client
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'universal'))
displayName: Build
- script: |
set -e
@@ -127,39 +148,15 @@ steps:
yarn gulp vscode-reh-darwin-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-darwin-min-ci
displayName: Build Server
displayName: Build reh
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn npm-run-all -lp "electron $(VSCODE_ARCH)" "playwright-install"
displayName: Download Electron and Playwright
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'universal'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
yarn electron $(VSCODE_ARCH)
displayName: Download Electron
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- download: current
artifact: vscode-darwin-x64
displayName: Download x64 artifact
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'universal'))
- download: current
artifact: vscode-darwin-arm64
displayName: Download arm64 artifact
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'universal'))
- script: |
set -e
cp $(Pipeline.Workspace)/vscode-darwin-x64/VSCode-darwin-x64.zip $(agent.builddirectory)/VSCode-darwin-x64.zip
cp $(Pipeline.Workspace)/vscode-darwin-arm64/VSCode-darwin-arm64.zip $(agent.builddirectory)/VSCode-darwin-arm64.zip
unzip $(agent.builddirectory)/VSCode-darwin-x64.zip -d $(agent.builddirectory)/VSCode-darwin-x64
unzip $(agent.builddirectory)/VSCode-darwin-arm64.zip -d $(agent.builddirectory)/VSCode-darwin-arm64
DEBUG=* node build/darwin/create-universal-app.js
displayName: Create Universal App
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'universal'))
# Setting hardened entitlements is a requirement for:
# * Apple notarization
# * Running tests on Big Sur (because Big Sur has additional security precautions)
- script: |
set -e
security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
@@ -168,27 +165,19 @@ steps:
echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
VSCODE_ARCH=$(VSCODE_ARCH) DEBUG=electron-osx-sign* node build/darwin/sign.js
VSCODE_ARCH="$(VSCODE_ARCH)" DEBUG=electron-osx-sign* node build/darwin/sign.js
displayName: Set Hardened Entitlements
- script: |
set -e
./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
timeoutInMinutes: 7
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
yarn test-browser --build --browser chromium --browser webkit --browser firefox --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser)
timeoutInMinutes: 7
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
yarn --cwd test/integration/browser compile
displayName: Compile integration tests
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
@@ -202,15 +191,13 @@ steps:
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
timeoutInMinutes: 10
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
./resources/server/test/test-web-integration.sh --browser webkit
displayName: Run integration tests (Browser)
timeoutInMinutes: 10
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
@@ -221,39 +208,22 @@ steps:
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
./resources/server/test/test-remote-integration.sh
displayName: Run remote integration tests (Electron)
timeoutInMinutes: 7
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
yarn --cwd test/smoke compile
displayName: Compile smoke tests
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
yarn smoketest-no-compile --build "$APP_ROOT/$APP_NAME"
timeoutInMinutes: 5
yarn smoketest --build "$APP_ROOT/$APP_NAME"
continueOnError: true
displayName: Run smoke tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
yarn smoketest-no-compile --build "$APP_ROOT/$APP_NAME" --remote
timeoutInMinutes: 5
displayName: Run smoke tests (Remote)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
yarn smoketest-no-compile --web --headless
timeoutInMinutes: 5
yarn smoketest --web --headless
continueOnError: true
displayName: Run smoke tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
@@ -270,13 +240,79 @@ steps:
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: and(succeededOrFailed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: succeededOrFailed()
- script: |
set -e
pushd $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH) && zip -r -X -y $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip * && popd
displayName: Archive build
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: "ESRP CodeSign"
FolderPath: "$(agent.builddirectory)"
Pattern: "VSCode-darwin-$(VSCODE_ARCH).zip"
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppDeveloperSign",
"parameters": [
{
"parameterName": "Hardening",
"parameterValue": "--options=runtime"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Codesign
- script: |
zip -d $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip "*.pkg"
displayName: Clean Archive
- script: |
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
BUNDLE_IDENTIFIER=$(node -p "require(\"$APP_ROOT/$APP_NAME/Contents/Resources/app/product.json\").darwinBundleIdentifier")
echo "##vso[task.setvariable variable=BundleIdentifier]$BUNDLE_IDENTIFIER"
displayName: Export bundle identifier
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: "ESRP CodeSign"
FolderPath: "$(agent.builddirectory)"
Pattern: "VSCode-darwin-$(VSCODE_ARCH).zip"
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppNotarize",
"parameters": [
{
"parameterName": "BundleId",
"parameterValue": "$(BundleIdentifier)"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Notarization
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
"$APP_ROOT/$APP_NAME/Contents/Resources/app/bin/code" --export-default-configuration=.build
displayName: Verify start after signing (export configuration)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
- script: |
set -e
@@ -284,24 +320,9 @@ steps:
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_ARCH="$(VSCODE_ARCH)" ./build/azure-pipelines/darwin/publish-server.sh
displayName: Publish Servers
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/VSCode-darwin-$(VSCODE_ARCH).zip
artifact: vscode-darwin-$(VSCODE_ARCH)
displayName: Publish client archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-server-darwin.zip
artifact: vscode-server-darwin-$(VSCODE_ARCH)
displayName: Publish server archive
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-server-darwin-web.zip
artifact: vscode-server-darwin-$(VSCODE_ARCH)-web
displayName: Publish web server archive
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))
VSCODE_ARCH="$(VSCODE_ARCH)" \
./build/azure-pipelines/darwin/publish.sh
displayName: Publish
- script: |
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
@@ -310,3 +331,7 @@ steps:
displayName: Upload configuration (for Bing settings search)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
continueOnError: true
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
continueOnError: true

View File

@@ -1,6 +1,19 @@
#!/usr/bin/env bash
set -e
# Publish DEB
case $VSCODE_ARCH in
x64) ASSET_ID="darwin" ;;
arm64) ASSET_ID="darwin-arm64" ;;
esac
# publish the build
node build/azure-pipelines/common/createAsset.js \
"$ASSET_ID" \
archive \
"VSCode-$ASSET_ID.zip" \
../VSCode-darwin-$VSCODE_ARCH.zip
if [ "$VSCODE_ARCH" == "x64" ]; then
# package Remote Extension Host
pushd .. && mv vscode-reh-darwin vscode-server-darwin && zip -Xry vscode-server-darwin.zip vscode-server-darwin && popd

View File

@@ -5,15 +5,26 @@ steps:
certSecureFile: 'osx_signing_key.p12'
condition: eq(variables['signed'], true)
- task: DownloadPipelineArtifact@2
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Compiled Files
inputs:
artifact: Compilation
displayName: Download compilation output
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'BuildCache'
platformIndependent: true
alias: 'Compilation'
- script: |
set -e
tar -xzf $(Pipeline.Workspace)/compilation.tar.gz
displayName: Extract compilation output
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
@@ -50,23 +61,12 @@ steps:
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
mkdir -p .build
node build/azure-pipelines/common/sql-computeNodeModulesCacheKey.js > .build/yarnlockhash
displayName: Prepare yarn cache key
- task: Cache@2
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules
inputs:
key: 'nodeModules | $(Agent.OS) | .build/yarnlockhash'
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
- script: |
set -e
tar -xzf .build/node_modules_cache/cache.tgz
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Extract node_modules archive
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'BuildCache'
- script: |
set -e
@@ -74,21 +74,21 @@ steps:
displayName: Install dependencies
env:
GITHUB_TOKEN: $(github-distro-mixin-password)
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'BuildCache'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
@@ -122,16 +122,11 @@ steps:
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e
yarn gulp compile-extensions
displayName: Compile Extensions
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin-x64
APP_NAME="`ls $APP_ROOT | head -n 1`"
yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots" --log "$(build.artifactstagingdirectory)/logs/darwin/smoke.log" --extensionsDir "$(build.sourcesdirectory)/extensions"
yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots" --log "$(build.artifactstagingdirectory)/logs/darwin/smoke.log"
displayName: Run smoke tests (Electron)
continueOnError: true
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))

View File

@@ -1,41 +1,41 @@
pool:
vmImage: 'Ubuntu-18.04'
vmImage: 'Ubuntu-16.04'
trigger:
branches:
include: ["main", "release/*"]
include: ['main', 'release/*']
pr:
branches:
include: ["main", "release/*"]
include: ['main', 'release/*']
steps:
- task: NodeTool@0
inputs:
versionSpec: "14.x"
versionSpec: "12.14.1"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'azuredatastudio-adointegration'
KeyVaultName: ado-secrets
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
cat << EOF > ~/.netrc
machine github.com
login azuredatastudio
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
git config user.email "sqltools@service.microsoft.com"
git config user.name "AzureDataStudio"
git remote add distro "https://github.com/$VSCODE_MIXIN_REPO.git"
git fetch distro
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
# Push main branch into oss/main
git push distro origin/main:refs/heads/oss/main
# Push main branch into oss/master
git push distro origin/main:refs/heads/oss/master
# Push every release branch into oss/release
git for-each-ref --format="%(refname:short)" refs/remotes/origin/release/* | sed 's/^origin\/\(.*\)$/\0:refs\/heads\/oss\/\1/' | xargs git push distro

View File

@@ -1,14 +1,10 @@
#Download base image ubuntu 21.04
FROM ubuntu:21.04
ENV TZ=America/Los_Angeles
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
#Download base image ubuntu 16.04
FROM ubuntu:16.04
# Update Software repository
RUN apt-get update && apt-get upgrade -y
RUN apt-get update
RUN apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 \
libkrb5-dev git apt-transport-https ca-certificates curl gnupg-agent software-properties-common \
libnss3 libasound2 make gcc libx11-dev fakeroot rpm libgconf-2-4 libunwind8 g++ python3-dev python3-pip
RUN apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus libgtk-3-0
ADD ./ /opt/ads-server

View File

@@ -7,7 +7,7 @@ SERVER_BUILD_NAME="azuredatastudio-server-$PLATFORM_LINUX"
# create docker
mkdir -p $REPO/.build/docker
docker build -t azuredatastudio-server -f $REPO/build/azure-pipelines/docker/Dockerfile $ROOT/$SERVER_BUILD_NAME-web
docker build -t azuredatastudio-server -f $REPO/build/azure-pipelines/docker/Dockerfile $ROOT/$SERVER_BUILD_NAME
docker save azuredatastudio-server | gzip > $REPO/.build/docker/azuredatastudio-server-docker.tar.gz
node build/azure-pipelines/common/copyArtifacts.js

View File

@@ -79,7 +79,7 @@ steps:
set -e
for f in $(Build.SourcesDirectory)/.build/drop/linux/server/*.tar.gz
do
tar -C $(Build.SourcesDirectory)/../ -zxvf $f
tar -C $(agent.builddirectory) -zxvf $f
rm $f
done
displayName: Unzip artifacts

View File

@@ -11,7 +11,7 @@ pr:
steps:
- task: NodeTool@0
inputs:
versionSpec: "14.x"
versionSpec: "12.14.1"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
@@ -31,10 +31,10 @@ steps:
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
git checkout origin/electron-12.x.y
git merge origin/main
git checkout origin/electron-11.x.y
git merge origin/master
# Push main branch into exploration branch
git push origin HEAD:electron-12.x.y
# Push master branch into exploration branch
git push origin HEAD:electron-11.x.y
displayName: Sync & Merge Exploration

View File

@@ -1,17 +1,17 @@
#Download base image ubuntu 18.04
FROM ubuntu:18.04
#Download base image ubuntu 16.04
FROM ubuntu:16.04
# Update Software repository
RUN apt-get update && apt-get upgrade -y
RUN apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 \
libkrb5-dev git apt-transport-https ca-certificates curl gnupg-agent software-properties-common \
libnss3 libasound2 make gcc libx11-dev fakeroot rpm libgconf-2-4 libunwind8 g++-4.9
libnss3 libasound2 make gcc libx11-dev fakeroot rpm libgconf-2-4 libunwind8 g++-4.8
RUN rm /usr/bin/gcc
RUN rm /usr/bin/g++
RUN ln -s /usr/bin/gcc-4.9 /usr/bin/gcc
RUN ln -s /usr/bin/g++-4.9 /usr/bin/g++
RUN ln -s /usr/bin/gcc-4.8 /usr/bin/gcc
RUN ln -s /usr/bin/g++-4.8 /usr/bin/g++
#docker
RUN curl -fsSL https://download.docker.com/linux/ubuntu/gpg | apt-key add -

View File

@@ -22,7 +22,7 @@ SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX-web"
SERVER_TARBALL_FILENAME="vscode-server-$PLATFORM_LINUX-web.tar.gz"
SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
rm -rf $ROOT/vscode-server-*-web.tar.*
rm -rf $ROOT/vscode-server-*.tar.*
(cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX-web" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"

View File

@@ -10,21 +10,16 @@ steps:
- task: NodeTool@0
inputs:
versionSpec: "12.18.3"
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3
inputs:
versionSpec: "1.x"
- script: |
mkdir -p .build
node build/azure-pipelines/common/computeNodeModulesCacheKey.js > .build/yarnlockhash
displayName: Prepare yarn cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules # {{SQL CARBON EDIT}}
inputs:
keyfile: ".yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
keyfile: "build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules"
vstsFeed: "npm-cache" # {{SQL CARBON EDIT}} update build cache
@@ -36,7 +31,7 @@ steps:
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules # {{SQL CARBON EDIT}}
inputs:
keyfile: ".yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
keyfile: "build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules"
vstsFeed: "npm-cache" # {{SQL CARBON EDIT}} update build cache
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))

View File

@@ -14,4 +14,13 @@ TARBALL_PATH="$REPO/.build/linux/archive/$TARBALL_FILENAME"
rm -rf $ROOT/code-*.tar.*
(cd $ROOT && tar -czf $TARBALL_PATH $BUILDNAME)
# Publish Remote Extension Host
LEGACY_SERVER_BUILD_NAME="azuredatastudio-reh-$PLATFORM_LINUX"
SERVER_BUILD_NAME="azuredatastudio-server-$PLATFORM_LINUX"
SERVER_TARBALL_FILENAME="azuredatastudio-server-$PLATFORM_LINUX.tar.gz"
SERVER_TARBALL_PATH="$REPO/.build/linux/server/$SERVER_TARBALL_FILENAME"
rm -rf $ROOT/azuredatastudio-server-*.tar.*
(cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
node build/azure-pipelines/common/copyArtifacts.js

View File

@@ -1,7 +1,28 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin
displayName: Prepare compilation cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "14.x"
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
@@ -13,17 +34,6 @@ steps:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: DownloadPipelineArtifact@2
inputs:
artifact: Compilation
path: $(Build.ArtifactStagingDirectory)
displayName: Download compilation output
- script: |
set -e
tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
displayName: Extract compilation output
- task: Docker@1
displayName: "Pull image"
inputs:
@@ -35,6 +45,7 @@ steps:
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
@@ -47,36 +58,30 @@ steps:
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
mkdir -p .build
node build/azure-pipelines/common/computeNodeModulesCacheKey.js "alpine" $ENABLE_TERRAPIN > .build/yarnlockhash
npx https://aka.ms/enablesecurefeed standAlone
displayName: Switch to Terrapin packages
timeoutInMinutes: 5
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
- script: |
echo -n "alpine" > .build/arch
displayName: Prepare yarn cache flags
- task: Cache@2
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
key: 'nodeModules | $(Agent.OS) | .build/yarnlockhash'
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
displayName: Restore node_modules cache
- script: |
set -e
tar -xzf .build/node_modules_cache/cache.tgz
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Extract node_modules cache
- script: |
set -e
npx https://aka.ms/enablesecurefeed standAlone
timeoutInMinutes: 5
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
displayName: Switch to Terrapin packages
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
- script: |
set -e
export CHILD_CONCURRENCY="1"
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile && break
if [ $i -eq 3 ]; then
@@ -85,19 +90,21 @@ steps:
fi
echo "Yarn failed $i, trying again..."
done
env:
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
displayName: Install dependencies
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
@@ -106,7 +113,7 @@ steps:
- script: |
set -e
docker run -e VSCODE_QUALITY -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:alpine /root/vscode/build/azure-pipelines/linux/alpine/install-dependencies.sh
docker run -e VSCODE_QUALITY -e CHILD_CONCURRENCY=1 -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:alpine /root/vscode/build/azure-pipelines/linux/alpine/install-dependencies.sh
displayName: Prebuild
- script: |
@@ -122,14 +129,7 @@ steps:
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
./build/azure-pipelines/linux/alpine/publish.sh
displayName: Publish
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-server-linux-alpine.tar.gz
artifact: vscode-server-linux-alpine
displayName: Publish server archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-server-linux-alpine-web.tar.gz
artifact: vscode-server-linux-alpine-web
displayName: Publish web server archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
continueOnError: true

View File

@@ -1,7 +1,28 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin
displayName: Prepare compilation cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "14.x"
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
@@ -13,17 +34,6 @@ steps:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: DownloadPipelineArtifact@2
inputs:
artifact: Compilation
path: $(Build.ArtifactStagingDirectory)
displayName: Download compilation output
- script: |
set -e
tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
displayName: Extract compilation output
- script: |
set -e
cat << EOF > ~/.netrc
@@ -38,49 +48,31 @@ steps:
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
mkdir -p .build
node build/azure-pipelines/common/computeNodeModulesCacheKey.js $VSCODE_ARCH $ENABLE_TERRAPIN > .build/yarnlockhash
npx https://aka.ms/enablesecurefeed standAlone
displayName: Switch to Terrapin packages
timeoutInMinutes: 5
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
- script: |
echo -n $(VSCODE_ARCH) > .build/arch
displayName: Prepare yarn cache flags
- task: Cache@2
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
key: 'nodeModules | $(Agent.OS) | .build/yarnlockhash'
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
displayName: Restore node_modules cache
- script: |
set -e
tar -xzf .build/node_modules_cache/cache.tgz
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Extract node_modules cache
- script: |
set -e
npx https://aka.ms/enablesecurefeed standAlone
timeoutInMinutes: 5
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
displayName: Switch to Terrapin packages
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
- script: |
set -e
export npm_config_arch=$(NPM_ARCH)
export npm_config_build_from_source=true
if [ -z "$CC" ] || [ -z "$CXX" ]; then
export CC=$(which gcc-5)
export CXX=$(which g++-5)
fi
if [ "$VSCODE_ARCH" == "x64" ]; then
export VSCODE_REMOTE_CC=$(which gcc-4.8)
export VSCODE_REMOTE_CXX=$(which g++-4.8)
fi
export CHILD_CONCURRENCY="1"
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile && break
if [ $i -eq 3 ]; then
@@ -89,19 +81,21 @@ steps:
fi
echo "Yarn failed $i, trying again..."
done
env:
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
displayName: Install dependencies
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
@@ -112,41 +106,28 @@ steps:
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-linux-$(VSCODE_ARCH)-min-ci
displayName: Build
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-linux-$(VSCODE_ARCH)-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-linux-$(VSCODE_ARCH)-min-ci
displayName: Build Server
displayName: Build
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn npm-run-all -lp "electron $(VSCODE_ARCH)" "playwright-install"
displayName: Download Electron and Playwright
service xvfb start
displayName: Start xvfb
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
DISPLAY=:10 ./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
timeoutInMinutes: 7
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
DISPLAY=:10 yarn test-browser --build --browser chromium --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser)
timeoutInMinutes: 7
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
yarn --cwd test/integration/browser compile
displayName: Compile integration tests
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
@@ -156,12 +137,10 @@ steps:
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_APP_NAME="$APP_NAME" \
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
DISPLAY=:10 ./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
timeoutInMinutes: 10
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
@@ -169,19 +148,16 @@ steps:
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
DISPLAY=:10 ./resources/server/test/test-web-integration.sh --browser chromium
displayName: Run integration tests (Browser)
timeoutInMinutes: 10
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_APP_NAME="$APP_NAME" \
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
DISPLAY=:10 ./resources/server/test/test-remote-integration.sh
displayName: Run remote integration tests (Electron)
timeoutInMinutes: 7
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: PublishPipelineArtifact@0
@@ -204,20 +180,17 @@ steps:
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-deb"
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-rpm"
displayName: Build deb, rpm packages
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- script: |
set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-prepare-snap"
displayName: Prepare snap package
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
# needed for code signing
- task: UseDotNet@2
displayName: "Install .NET Core SDK 2.x"
inputs:
version: 2.x
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
@@ -237,7 +210,6 @@ steps:
]
SessionTimeout: 120
displayName: Codesign rpm
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- script: |
set -e
@@ -247,31 +219,13 @@ steps:
VSCODE_ARCH="$(VSCODE_ARCH)" \
./build/azure-pipelines/linux/publish.sh
displayName: Publish
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(DEB_PATH)
artifact: vscode-linux-deb-$(VSCODE_ARCH)
displayName: Publish deb package
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(RPM_PATH)
artifact: vscode-linux-rpm-$(VSCODE_ARCH)
displayName: Publish rpm package
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH).tar.gz
artifact: vscode-server-linux-$(VSCODE_ARCH)
displayName: Publish server archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH)-web.tar.gz
artifact: vscode-server-linux-$(VSCODE_ARCH)-web
displayName: Publish web server archive
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: PublishPipelineArtifact@0
displayName: "Publish Pipeline Artifact"
inputs:
artifactName: "snap-$(VSCODE_ARCH)"
targetPath: .build/linux/snap-tarball
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
continueOnError: true

View File

@@ -26,17 +26,6 @@ rm -rf $ROOT/vscode-server-*.tar.*
node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
# Publish Remote Extension Host (Web)
LEGACY_SERVER_BUILD_NAME="vscode-reh-web-$PLATFORM_LINUX"
SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX-web"
SERVER_TARBALL_FILENAME="vscode-server-$PLATFORM_LINUX-web.tar.gz"
SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
rm -rf $ROOT/vscode-server-*-web.tar.*
(cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX-web" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
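# Note: '--owner=0 --group=0' in the tar invocation above normalizes file
# ownership inside the archive, so extracted files are root-owned instead of
# inheriting the build agent's UID/GID.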
# Publish DEB
case $VSCODE_ARCH in
x64) DEB_ARCH="amd64" ;;
@@ -69,7 +58,3 @@ mkdir -p $REPO/.build/linux/snap-tarball
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-$VSCODE_ARCH.tar.gz"
rm -rf $SNAP_TARBALL_PATH
(cd .build/linux && tar -czf $SNAP_TARBALL_PATH snap)
# Export DEB_PATH, RPM_PATH
echo "##vso[task.setvariable variable=DEB_PATH]$DEB_PATH"
echo "##vso[task.setvariable variable=RPM_PATH]$RPM_PATH"

View File

@@ -1,7 +1,7 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "14.x"
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
@@ -48,17 +48,9 @@ steps:
x64) SNAPCRAFT_TARGET_ARGS="" ;;
*) SNAPCRAFT_TARGET_ARGS="--target-arch $(VSCODE_ARCH)" ;;
esac
(cd $SNAP_ROOT/code-* && sudo --preserve-env snapcraft prime $SNAPCRAFT_TARGET_ARGS && snap pack prime --compression=lzo --filename="$SNAP_PATH")
(cd $SNAP_ROOT/code-* && sudo --preserve-env snapcraft snap $SNAPCRAFT_TARGET_ARGS --output "$SNAP_PATH")
# Publish snap package
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
node build/azure-pipelines/common/createAsset.js "linux-snap-$(VSCODE_ARCH)" package "$SNAP_FILENAME" "$SNAP_PATH"
# Export SNAP_PATH
echo "##vso[task.setvariable variable=SNAP_PATH]$SNAP_PATH"
- publish: $(SNAP_PATH)
artifact: vscode-linux-snap-$(VSCODE_ARCH)
displayName: Publish snap package
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

View File

@@ -2,6 +2,27 @@ parameters:
extensionsToUnitTest: []
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Compiled Files
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'BuildCache'
platformIndependent: true
alias: 'Compilation'
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.13.0"
@@ -17,16 +38,6 @@ steps:
KeyVaultName: ado-secrets
SecretsFilter: 'github-distro-mixin-password'
- task: DownloadPipelineArtifact@2
inputs:
artifact: Compilation
displayName: Download compilation output
- script: |
set -e
tar -xzf $(Pipeline.Workspace)/compilation.tar.gz
displayName: Extract compilation output
- script: |
set -e
cat << EOF > ~/.netrc
@@ -46,23 +57,12 @@ steps:
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
mkdir -p .build
node build/azure-pipelines/common/sql-computeNodeModulesCacheKey.js > .build/yarnlockhash
displayName: Prepare yarn cache key
- task: Cache@2
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules
inputs:
key: 'nodeModules | $(Agent.OS) | .build/yarnlockhash'
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
- script: |
set -e
tar -xzf .build/node_modules_cache/cache.tgz
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Extract node_modules archive
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'BuildCache'
- script: |
set -e
@@ -70,21 +70,21 @@ steps:
displayName: Install dependencies
env:
GITHUB_TOKEN: $(github-distro-mixin-password)
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'BuildCache'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
@@ -94,6 +94,7 @@ steps:
- script: |
set -e
yarn gulp vscode-linux-x64-min-ci
yarn gulp vscode-web-min-ci
displayName: Build
env:
VSCODE_MIXIN_PASSWORD: $(github-distro-mixin-password)
@@ -105,11 +106,6 @@ steps:
yarn gulp package-external-extensions
displayName: Package External extensions
- script: |
set -e
yarn gulp package-langpacks
displayName: Package Langpacks
- script: |
set -e
service xvfb start
@@ -135,19 +131,16 @@ steps:
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the unit tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
NO_CLEANUP=1 \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-linux-x64" \
DISPLAY=:10 ./scripts/test-extensions-unit.sh --build --tfs "Extension Unit Tests"
displayName: 'Run Stable Extension Unit Tests'
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- ${{each extension in parameters.extensionsToUnitTest}}:
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
export INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME"
export NO_CLEANUP=1
DISPLAY=:10 node ./scripts/test-extensions-unit.js ${{ extension }}
displayName: 'Run ${{ extension }} Stable Extension Unit Tests'
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e
@@ -169,6 +162,7 @@ steps:
# Only archive directories we want for debugging purposes
tar -czvf $(Build.ArtifactStagingDirectory)/logs/linux-x64/$folder.tar.gz $folder/User $folder/logs
done
displayName: Archive Logs
continueOnError: true
condition: succeededOrFailed()
@@ -194,7 +188,7 @@ steps:
inputs:
ConnectedServiceName: 'Code Signing'
FolderPath: '$(Build.SourcesDirectory)/.build'
Pattern: 'extensions/*.vsix,langpacks/*.vsix'
Pattern: 'extensions/*.vsix'
signConfigType: inlineSignParams
inlineOperation: |
[
@@ -219,7 +213,7 @@ steps:
}
]
SessionTimeout: 120
displayName: 'Signing Extensions and Langpacks'
displayName: 'Signing Extensions'
condition: and(succeeded(), eq(variables['signed'], true))
- script: |
@@ -243,13 +237,6 @@ steps:
continueOnError: true
condition: and(succeededOrFailed(), eq(variables['RUN_TESTS'], 'true'))
- task: PublishBuildArtifacts@1
displayName: 'Publish Artifact: crash reports'
inputs:
PathtoPublish: '$(Build.SourcesDirectory)/.build/crashes'
ArtifactName: crashes
condition: and(succeededOrFailed(), eq(variables['RUN_TESTS'], 'true'))
- task: PublishBuildArtifacts@1
displayName: 'Publish Artifact: drop'
condition: succeededOrFailed()

View File

@@ -1,3 +1,4 @@
trigger: none
pr: none
schedules:
@@ -5,110 +6,13 @@ schedules:
displayName: Mon-Fri at 7:00
branches:
include:
- main
parameters:
- name: VSCODE_QUALITY
displayName: Quality
type: string
default: insider
values:
- exploration
- insider
- stable
- name: ENABLE_TERRAPIN
displayName: "Enable Terrapin"
type: boolean
default: true
- name: VSCODE_BUILD_WIN32
displayName: "🎯 Windows x64"
type: boolean
default: true
- name: VSCODE_BUILD_WIN32_32BIT
displayName: "🎯 Windows ia32"
type: boolean
default: true
- name: VSCODE_BUILD_WIN32_ARM64
displayName: "🎯 Windows arm64"
type: boolean
default: true
- name: VSCODE_BUILD_LINUX
displayName: "🎯 Linux x64"
type: boolean
default: true
- name: VSCODE_BUILD_LINUX_ARM64
displayName: "🎯 Linux arm64"
type: boolean
default: true
- name: VSCODE_BUILD_LINUX_ARMHF
displayName: "🎯 Linux armhf"
type: boolean
default: true
- name: VSCODE_BUILD_LINUX_ALPINE
displayName: "🎯 Alpine Linux"
type: boolean
default: true
- name: VSCODE_BUILD_MACOS
displayName: "🎯 macOS x64"
type: boolean
default: true
- name: VSCODE_BUILD_MACOS_ARM64
displayName: "🎯 macOS arm64"
type: boolean
default: true
- name: VSCODE_BUILD_MACOS_UNIVERSAL
displayName: "🎯 macOS universal"
type: boolean
default: true
- name: VSCODE_BUILD_WEB
displayName: "🎯 Web"
type: boolean
default: true
- name: VSCODE_PUBLISH
displayName: "Publish to builds.code.visualstudio.com"
type: boolean
default: true
- name: VSCODE_RELEASE
displayName: "Release build if successful"
type: boolean
default: false
- name: VSCODE_COMPILE_ONLY
displayName: "Run Compile stage exclusively"
type: boolean
default: false
- name: VSCODE_STEP_ON_IT
displayName: "Skip tests"
type: boolean
default: false
variables:
- name: ENABLE_TERRAPIN
value: ${{ eq(parameters.ENABLE_TERRAPIN, true) }}
- name: VSCODE_QUALITY
value: ${{ parameters.VSCODE_QUALITY }}
- name: VSCODE_BUILD_STAGE_WINDOWS
value: ${{ or(eq(parameters.VSCODE_BUILD_WIN32, true), eq(parameters.VSCODE_BUILD_WIN32_32BIT, true), eq(parameters.VSCODE_BUILD_WIN32_ARM64, true)) }}
- name: VSCODE_BUILD_STAGE_LINUX
value: ${{ or(eq(parameters.VSCODE_BUILD_LINUX, true), eq(parameters.VSCODE_BUILD_LINUX_ARMHF, true), eq(parameters.VSCODE_BUILD_LINUX_ARM64, true), eq(parameters.VSCODE_BUILD_LINUX_ALPINE, true), eq(parameters.VSCODE_BUILD_WEB, true)) }}
- name: VSCODE_BUILD_STAGE_MACOS
value: ${{ or(eq(parameters.VSCODE_BUILD_MACOS, true), eq(parameters.VSCODE_BUILD_MACOS_ARM64, true)) }}
- name: VSCODE_CIBUILD
value: ${{ in(variables['Build.Reason'], 'IndividualCI', 'BatchedCI') }}
- name: VSCODE_PUBLISH
value: ${{ and(eq(parameters.VSCODE_PUBLISH, true), eq(variables.VSCODE_CIBUILD, false)) }}
- name: VSCODE_SCHEDULEDBUILD
value: ${{ eq(variables['Build.Reason'], 'Schedule') }}
- name: VSCODE_STEP_ON_IT
value: ${{ eq(parameters.VSCODE_STEP_ON_IT, true) }}
- name: VSCODE_BUILD_MACOS_UNIVERSAL
value: ${{ and(eq(variables['VSCODE_PUBLISH'], true), eq(parameters.VSCODE_BUILD_MACOS, true), eq(parameters.VSCODE_BUILD_MACOS_ARM64, true), eq(parameters.VSCODE_BUILD_MACOS_UNIVERSAL, true)) }}
- master
resources:
containers:
- container: vscode-x64
image: vscodehub.azurecr.io/vscode-linux-build-agent:bionic-x64
image: vscodehub.azurecr.io/vscode-linux-build-agent:x64
endpoint: VSCodeHub
options: --user 0:0
- container: vscode-arm64
image: vscodehub.azurecr.io/vscode-linux-build-agent:stretch-arm64
endpoint: VSCodeHub
@@ -122,216 +26,177 @@ stages:
- stage: Compile
jobs:
- job: Compile
pool: compile
pool:
vmImage: "Ubuntu-16.04"
container: vscode-x64
variables:
VSCODE_ARCH: x64
steps:
- template: product-compile.yml
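# Note: the ${{ if }} guards below are template expressions evaluated when the
# pipeline is compiled, so excluded stages and jobs never appear in the run at
# all, unlike the runtime 'condition:' checks they replace.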
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_WINDOWS'], true)) }}:
- stage: Windows
dependsOn:
- Compile
pool:
vmImage: VS2017-Win2016
jobs:
- stage: Windows
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: VS2017-Win2016
jobs:
- job: Windows
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
- ${{ if eq(parameters.VSCODE_BUILD_WIN32, true) }}:
- job: Windows
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
- job: Windows32
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32_32BIT'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: ia32
steps:
- template: win32/product-build-win32.yml
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WIN32_32BIT, true)) }}:
- job: Windows32
timeoutInMinutes: 90
variables:
VSCODE_ARCH: ia32
steps:
- template: win32/product-build-win32.yml
- job: WindowsARM64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32_ARM64'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: arm64
steps:
- template: win32/product-build-win32.yml
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WIN32_ARM64, true)) }}:
- job: WindowsARM64
timeoutInMinutes: 90
variables:
VSCODE_ARCH: arm64
steps:
- template: win32/product-build-win32.yml
- stage: Linux
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: "Ubuntu-16.04"
jobs:
- job: Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: vscode-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
steps:
- template: linux/product-build-linux.yml
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_LINUX'], true)) }}:
- stage: Linux
dependsOn:
- Compile
pool:
vmImage: "Ubuntu-18.04"
jobs:
- ${{ if eq(parameters.VSCODE_BUILD_LINUX, true) }}:
- job: Linux
container: vscode-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
steps:
- template: linux/product-build-linux.yml
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX, true)) }}:
- job: LinuxSnap
dependsOn:
- Linux
container: snapcraft
variables:
VSCODE_ARCH: x64
steps:
- template: linux/snap-build-linux.yml
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX_ARMHF, true)) }}:
- job: LinuxArmhf
container: vscode-armhf
variables:
VSCODE_ARCH: armhf
NPM_ARCH: armv7l
steps:
- template: linux/product-build-linux.yml
# TODO@joaomoreno: We don't ship ARM snaps for now
- ${{ if and(false, eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX_ARMHF, true)) }}:
- job: LinuxSnapArmhf
dependsOn:
- LinuxArmhf
container: snapcraft
variables:
VSCODE_ARCH: armhf
steps:
- template: linux/snap-build-linux.yml
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX_ARM64, true)) }}:
- job: LinuxArm64
container: vscode-arm64
variables:
VSCODE_ARCH: arm64
NPM_ARCH: arm64
steps:
- template: linux/product-build-linux.yml
# TODO@joaomoreno: We don't ship ARM snaps for now
- ${{ if and(false, eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX_ARM64, true)) }}:
- job: LinuxSnapArm64
dependsOn:
- LinuxArm64
container: snapcraft
variables:
VSCODE_ARCH: arm64
steps:
- template: linux/snap-build-linux.yml
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_LINUX_ALPINE, true)) }}:
- job: LinuxAlpine
steps:
- template: linux/product-build-alpine.yml
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_WEB, true)) }}:
- job: LinuxWeb
variables:
VSCODE_ARCH: x64
steps:
- template: web/product-build-web.yml
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), eq(variables['VSCODE_BUILD_STAGE_MACOS'], true)) }}:
- stage: macOS
dependsOn:
- Compile
pool:
vmImage: macOS-latest
jobs:
- ${{ if eq(parameters.VSCODE_BUILD_MACOS, true) }}:
- job: macOS
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
- ${{ if ne(variables['VSCODE_PUBLISH'], 'false') }}:
- job: macOSSign
dependsOn:
- macOS
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin-sign.yml
- ${{ if and(eq(variables['VSCODE_CIBUILD'], false), eq(parameters.VSCODE_BUILD_MACOS_ARM64, true)) }}:
- job: macOSARM64
timeoutInMinutes: 90
variables:
VSCODE_ARCH: arm64
steps:
- template: darwin/product-build-darwin.yml
- ${{ if ne(variables['VSCODE_PUBLISH'], 'false') }}:
- job: macOSARM64Sign
dependsOn:
- macOSARM64
timeoutInMinutes: 90
variables:
VSCODE_ARCH: arm64
steps:
- template: darwin/product-build-darwin-sign.yml
- ${{ if eq(variables['VSCODE_BUILD_MACOS_UNIVERSAL'], true) }}:
- job: macOSUniversal
dependsOn:
- macOS
- macOSARM64
timeoutInMinutes: 90
variables:
VSCODE_ARCH: universal
steps:
- template: darwin/product-build-darwin.yml
- ${{ if ne(variables['VSCODE_PUBLISH'], 'false') }}:
- job: macOSUniversalSign
dependsOn:
- macOSUniversal
timeoutInMinutes: 90
variables:
VSCODE_ARCH: universal
steps:
- template: darwin/product-build-darwin-sign.yml
- ${{ if and(eq(variables['VSCODE_PUBLISH'], true), eq(parameters.VSCODE_COMPILE_ONLY, false)) }}:
- stage: Mooncake
dependsOn:
- ${{ if eq(variables['VSCODE_BUILD_STAGE_WINDOWS'], true) }}:
- Windows
- ${{ if eq(variables['VSCODE_BUILD_STAGE_LINUX'], true) }}:
- job: LinuxSnap
dependsOn:
- Linux
- ${{ if eq(variables['VSCODE_BUILD_STAGE_MACOS'], true) }}:
- macOS
condition: succeededOrFailed()
pool:
vmImage: "Ubuntu-18.04"
jobs:
- job: SyncMooncake
displayName: Sync Mooncake
steps:
- template: sync-mooncake.yml
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: snapcraft
variables:
VSCODE_ARCH: x64
steps:
- template: linux/snap-build-linux.yml
- ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), or(eq(parameters.VSCODE_RELEASE, true), and(in(parameters.VSCODE_QUALITY, 'insider', 'exploration'), eq(variables['VSCODE_SCHEDULEDBUILD'], true)))) }}:
- stage: Release
dependsOn:
- ${{ if eq(variables['VSCODE_BUILD_STAGE_WINDOWS'], true) }}:
- Windows
- ${{ if eq(variables['VSCODE_BUILD_STAGE_LINUX'], true) }}:
- Linux
- ${{ if eq(variables['VSCODE_BUILD_STAGE_MACOS'], true) }}:
- macOS
pool:
vmImage: "Ubuntu-18.04"
jobs:
- job: ReleaseBuild
displayName: Release Build
steps:
- template: release.yml
- job: LinuxArmhf
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARMHF'], 'true'))
container: vscode-armhf
variables:
VSCODE_ARCH: armhf
NPM_ARCH: armv7l
steps:
- template: linux/product-build-linux.yml
- job: LinuxSnapArmhf
dependsOn:
- LinuxArmhf
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARMHF'], 'true'))
container: snapcraft
variables:
VSCODE_ARCH: armhf
steps:
- template: linux/snap-build-linux.yml
- job: LinuxArm64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARM64'], 'true'))
container: vscode-arm64
variables:
VSCODE_ARCH: arm64
NPM_ARCH: arm64
steps:
- template: linux/product-build-linux.yml
- job: LinuxSnapArm64
dependsOn:
- LinuxArm64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARM64'], 'true'))
container: snapcraft
variables:
VSCODE_ARCH: arm64
steps:
- template: linux/snap-build-linux.yml
- job: LinuxAlpine
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ALPINE'], 'true'))
steps:
- template: linux/product-build-alpine.yml
- job: LinuxWeb
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'))
variables:
VSCODE_ARCH: x64
steps:
- template: web/product-build-web.yml
- stage: macOS
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: macOS-latest
jobs:
- job: macOS
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
- stage: macOSARM64
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: macOS-latest
jobs:
- job: macOSARM64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS_ARM64'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: arm64
steps:
- template: darwin/product-build-darwin.yml
- stage: Mooncake
dependsOn:
- Windows
- Linux
- macOS
- macOSARM64
condition: and(succeededOrFailed(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: "Ubuntu-16.04"
jobs:
- job: SyncMooncake
displayName: Sync Mooncake
steps:
- template: sync-mooncake.yml
- stage: Publish
dependsOn:
- Windows
- Linux
- macOS
- macOSARM64
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'), or(eq(variables['VSCODE_RELEASE'], 'true'), and(or(eq(variables['VSCODE_QUALITY'], 'insider'), eq(variables['VSCODE_QUALITY'], 'exploration')), eq(variables['Build.Reason'], 'Schedule'))))
pool:
vmImage: "Ubuntu-16.04"
jobs:
- job: BuildService
displayName: Build Service
steps:
- template: release.yml

View File

@@ -1,17 +1,36 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin
displayName: Prepare compilation cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
dryRun: true
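# Note: with 'dryRun: true' the RestoreCache task only probes for the cache
# and sets 'CacheExists-Compilation' (checked by the conditions below) rather
# than downloading it.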
- task: NodeTool@0
inputs:
versionSpec: "14.x"
versionSpec: "12.14.1"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
@@ -24,47 +43,36 @@ steps:
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
mkdir -p .build
node build/azure-pipelines/common/computeNodeModulesCacheKey.js $VSCODE_ARCH $ENABLE_TERRAPIN > .build/yarnlockhash
npx https://aka.ms/enablesecurefeed standAlone
displayName: Switch to Terrapin packages
timeoutInMinutes: 5
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
- script: |
echo -n $(VSCODE_ARCH) > .build/arch
displayName: Prepare yarn cache flags
# using `genericNodeModules` instead of `nodeModules` here to avoid sharing the cache with builds running inside containers
- task: Cache@2
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
key: 'genericNodeModules | $(Agent.OS) | .build/yarnlockhash'
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
displayName: Restore node_modules cache
- script: |
set -e
tar -xzf .build/node_modules_cache/cache.tgz
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Extract node_modules cache
- script: |
set -e
npx https://aka.ms/enablesecurefeed standAlone
timeoutInMinutes: 5
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
displayName: Switch to Terrapin packages
- script: |
set -e
sudo apt update -y
sudo apt install -y build-essential pkg-config libx11-dev libx11-xcb-dev libxkbfile-dev libsecret-1-dev libnotify-bin
displayName: Install build tools
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
export CHILD_CONCURRENCY="1"
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile && break
if [ $i -eq 3 ]; then
@@ -73,50 +81,67 @@ steps:
fi
echo "Yarn failed $i, trying again..."
done
env:
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
displayName: Install dependencies
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), eq(variables['CacheRestored'], 'true'))
# Mixin must run before optimize, because the CSS loader will inline small SVGs
# Mixin must run before optimize, because the CSS loader will
# inline small SVGs
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
yarn npm-run-all -lp core-ci extensions-ci hygiene eslint valid-layers-check
displayName: Compile & Hygiene
- script: |
set -e
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
node build/azure-pipelines/upload-sourcemaps
displayName: Upload sourcemaps
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
yarn gulp hygiene
yarn monaco-compile-check
yarn valid-layers-check
displayName: Run hygiene, monaco compile & valid layers checks
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
./build/azure-pipelines/common/extract-telemetry.sh
displayName: Extract Telemetry
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
AZURE_WEBVIEW_STORAGE_ACCESS_KEY="$(vscode-webview-storage-key)" \
./build/azure-pipelines/common/publish-webview.sh
displayName: Publish Webview
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
yarn gulp compile-build
yarn gulp compile-extensions-build
yarn gulp minify-vscode
yarn gulp vscode-reh-linux-x64-min
yarn gulp vscode-reh-web-linux-x64-min
displayName: Compile
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
node build/azure-pipelines/upload-sourcemaps
displayName: Upload sourcemaps
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
@@ -124,28 +149,13 @@ steps:
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
node build/azure-pipelines/common/createBuild.js $VERSION
displayName: Create build
condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
# we gotta tarball everything in order to preserve file permissions
- script: |
set -e
tar -czf $(Build.ArtifactStagingDirectory)/compilation.tar.gz .build out-*
displayName: Compress compilation artifact
- task: PublishPipelineArtifact@1
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
targetPath: $(Build.ArtifactStagingDirectory)/compilation.tar.gz
artifactName: Compilation
displayName: Publish compilation artifact
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn download-builtin-extensions-cg
displayName: Built-in extensions component details
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
inputs:
sourceScanPath: $(Build.SourcesDirectory)
continueOnError: true
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))

View File

@@ -0,0 +1,2 @@
node_modules/
*.js

View File

@@ -1,36 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const cp = require("child_process");
let tag = '';
try {
tag = cp
.execSync('git describe --tags `git rev-list --tags --max-count=1`')
.toString()
.trim();
if (!isValidTag(tag)) {
throw Error(`Invalid tag ${tag}`);
}
}
catch (err) {
console.error(err);
console.error('Failed to update types');
process.exit(1);
}
function isValidTag(t) {
if (t.split('.').length !== 3) {
return false;
}
const [major, minor, bug] = t.split('.');
// Only release for tags like 1.34.0
if (bug !== '0') {
return false;
}
if (isNaN(parseInt(major, 10)) || isNaN(parseInt(minor, 10))) {
return false;
}
return true;
}
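// For illustration: isValidTag('1.34.0') returns true, while '1.34.1'
// (non-zero patch) and 'v1.34.0' (non-numeric major) both return false.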

View File

@@ -9,7 +9,7 @@ pr: none
steps:
- task: NodeTool@0
inputs:
versionSpec: "14.x"
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
@@ -61,7 +61,7 @@ steps:
TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
CHANNEL="G1C14HJ2F"
MESSAGE="DefinitelyTyped/DefinitelyTyped#vscode-types-$TAG_VERSION created. Endgame champion, please open this link, examine changes and create a PR:"
MESSAGE="DefinitelyTyped/DefinitelyTyped#vscode-types-$TAG_VERSION created. Endgame master, please open this link, examine changes and create a PR:"
LINK="https://github.com/DefinitelyTyped/DefinitelyTyped/compare/vscode-types-$TAG_VERSION?quick_pull=1&body=Updating%20VS%20Code%20Extension%20API.%20See%20https%3A%2F%2Fgithub.com%2Fmicrosoft%2Fvscode%2Fissues%2F70175%20for%20details."
MESSAGE2="[@eamodio, @jrieken, @kmaetzel, @egamma]. Please review and merge PR to publish @types/vscode."

View File

@@ -1,72 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const cp = require("child_process");
const path = require("path");
let tag = '';
try {
tag = cp
.execSync('git describe --tags `git rev-list --tags --max-count=1`')
.toString()
.trim();
const dtsUri = `https://raw.githubusercontent.com/microsoft/vscode/${tag}/src/vs/vscode.d.ts`;
const outPath = path.resolve(process.cwd(), 'DefinitelyTyped/types/vscode/index.d.ts');
cp.execSync(`curl ${dtsUri} --output ${outPath}`);
updateDTSFile(outPath, tag);
console.log(`Done updating vscode.d.ts at ${outPath}`);
}
catch (err) {
console.error(err);
console.error('Failed to update types');
process.exit(1);
}
function updateDTSFile(outPath, tag) {
const oldContent = fs.readFileSync(outPath, 'utf-8');
const newContent = getNewFileContent(oldContent, tag);
fs.writeFileSync(outPath, newContent);
}
function repeat(str, times) {
const result = new Array(times);
for (let i = 0; i < times; i++) {
result[i] = str;
}
return result.join('');
}
function convertTabsToSpaces(str) {
return str.replace(/\t/gm, value => repeat(' ', value.length));
}
function getNewFileContent(content, tag) {
const oldheader = [
`/*---------------------------------------------------------------------------------------------`,
` * Copyright (c) Microsoft Corporation. All rights reserved.`,
` * Licensed under the Source EULA. See License.txt in the project root for license information.`,
` *--------------------------------------------------------------------------------------------*/`
].join('\n');
return convertTabsToSpaces(getNewFileHeader(tag) + content.slice(oldheader.length));
}
function getNewFileHeader(tag) {
const [major, minor] = tag.split('.');
const shorttag = `${major}.${minor}`;
const header = [
`// Type definitions for Visual Studio Code ${shorttag}`,
`// Project: https://github.com/microsoft/vscode`,
`// Definitions by: Visual Studio Code Team, Microsoft <https://github.com/Microsoft>`,
`// Definitions: https://github.com/DefinitelyTyped/DefinitelyTyped`,
``,
`/*---------------------------------------------------------------------------------------------`,
` * Copyright (c) Microsoft Corporation. All rights reserved.`,
` * Licensed under the Source EULA.`,
` * See https://github.com/Microsoft/vscode/blob/main/LICENSE.txt for license information.`,
` *--------------------------------------------------------------------------------------------*/`,
``,
`/**`,
` * Type Definition for Visual Studio Code ${shorttag} Extension API`,
` * See https://code.visualstudio.com/api for more information`,
` */`
].join('\n');
return header;
}
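// For illustration: a tag such as '1.52.0' produces shorttag '1.52', so the
// generated header reads "Type definitions for Visual Studio Code 1.52".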

View File

@@ -72,7 +72,7 @@ function getNewFileHeader(tag: string) {
`/*---------------------------------------------------------------------------------------------`,
` * Copyright (c) Microsoft Corporation. All rights reserved.`,
` * Licensed under the Source EULA.`,
` * See https://github.com/Microsoft/vscode/blob/main/LICENSE.txt for license information.`,
` * See https://github.com/Microsoft/vscode/blob/master/LICENSE.txt for license information.`,
` *--------------------------------------------------------------------------------------------*/`,
``,
`/**`,

View File

@@ -1,7 +1,7 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "14.x"
versionSpec: "10.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:

View File

@@ -1,13 +1,13 @@
resources:
containers:
- container: linux-x64
image: sqltoolscontainers.azurecr.io/linux-build-agent:3
image: sqltoolscontainers.azurecr.io/linux-build-agent:2
endpoint: ContainerRegistry
jobs:
- job: Compile
pool:
vmImage: 'Ubuntu-18.04'
vmImage: 'Ubuntu-16.04'
container: linux-x64
steps:
- script: |
@@ -15,7 +15,6 @@ jobs:
echo "##vso[build.addbuildtag]$(VSCODE_QUALITY)"
displayName: Add Quality Build Tag
- template: sql-product-compile.yml
timeoutInMinutes: 120
- job: macOS
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), ne(variables['VSCODE_QUALITY'], 'saw'))
@@ -25,7 +24,7 @@ jobs:
- Compile
steps:
- template: darwin/sql-product-build-darwin.yml
timeoutInMinutes: 90
timeoutInMinutes: 180
- job: macOS_Signing
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), eq(variables['signed'], true), ne(variables['VSCODE_QUALITY'], 'saw'))
@@ -40,15 +39,37 @@ jobs:
- job: Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
pool:
vmImage: 'Ubuntu-18.04'
vmImage: 'Ubuntu-16.04'
container: linux-x64
dependsOn:
- Compile
steps:
- template: linux/sql-product-build-linux.yml
parameters:
extensionsToUnitTest: ["admin-tool-ext-win", "agent", "azcli", "azdata", "azurecore", "cms", "dacpac", "data-workspace", "import", "machine-learning", "notebook", "resource-deployment", "schema-compare", "sql-database-projects"]
timeoutInMinutes: 90
extensionsToUnitTest: ["admin-tool-ext-win", "agent", "azdata", "azurecore", "cms", "dacpac", "import", "schema-compare", "notebook", "resource-deployment", "machine-learning", "sql-database-projects", "data-workspace"]
timeoutInMinutes: 70
- job: LinuxWeb
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'), ne(variables['VSCODE_QUALITY'], 'saw'))
pool:
vmImage: 'Ubuntu-16.04'
container: linux-x64
variables:
VSCODE_ARCH: x64
dependsOn:
- Compile
steps:
- template: web/sql-product-build-web.yml
# - job: Docker
# condition: and(succeeded(), eq(variables['VSCODE_BUILD_DOCKER'], 'true'))
# pool:
# vmImage: 'Ubuntu-16.04'
# container: linux-x64
# dependsOn:
# - Linux
# steps:
# - template: docker/sql-product-build-docker.yml
- job: Windows
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true'))
@@ -58,7 +79,7 @@ jobs:
- Compile
steps:
- template: win32/sql-product-build-win32.yml
timeoutInMinutes: 90
timeoutInMinutes: 70
- job: Windows_Test
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true'))
@@ -69,17 +90,18 @@ jobs:
- Windows
steps:
- template: win32/sql-product-test-win32.yml
timeoutInMinutes: 90
- job: Release
condition: and(succeeded(), or(eq(variables['VSCODE_RELEASE'], 'true'), and(eq(variables['VSCODE_QUALITY'], 'insider'), eq(variables['Build.Reason'], 'Schedule'))))
pool:
vmImage: 'Ubuntu-18.04'
vmImage: 'Ubuntu-16.04'
dependsOn:
- macOS
- Linux
# - Docker
- Windows
- Windows_Test
- LinuxWeb
- macOS_Signing
steps:
- template: sql-release.yml

View File

@@ -1,4 +1,19 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Compiled Files
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'BuildCache'
platformIndependent: true
alias: 'Compilation'
- task: NodeTool@0
inputs:
versionSpec: "12.13.0"
@@ -12,6 +27,7 @@ steps:
inputs:
azureSubscription: 'ClientToolsInfra_670062 (88d5392f-a34f-4769-b405-f597fc533613)'
KeyVaultName: ado-secrets
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- script: |
set -e
@@ -24,6 +40,7 @@ steps:
git config user.email "sqltools@service.microsoft.com"
git config user.name "AzureDataStudio"
displayName: Prepare tooling
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- script: |
set -e
@@ -31,44 +48,34 @@ steps:
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- script: |
mkdir -p .build
node build/azure-pipelines/common/sql-computeNodeModulesCacheKey.js > .build/yarnlockhash
displayName: Prepare yarn cache key
- task: Cache@2
inputs:
key: 'nodeModules | $(Agent.OS) | .build/yarnlockhash'
path: .build/node_modules_cache
cacheHitVar: NODE_MODULES_RESTORED
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules
- script: |
set -e
tar -xzf .build/node_modules_cache/cache.tgz
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Extract node_modules archive
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'BuildCache'
- script: |
set -e
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install dependencies
condition: and(succeeded(), ne(variables['NODE_MODULES_RESTORED'], 'true'))
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
mkdir -p .build/node_modules_cache
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
displayName: Create node_modules archive
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'BuildCache'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['NODE_MODULES_RESTORED'], 'true'))
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
# Mixin must run before optimize, because the CSS loader will
# inline small SVGs
@@ -76,6 +83,7 @@ steps:
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- script: |
set -e
@@ -84,14 +92,17 @@ steps:
yarn strict-vscode
yarn valid-layers-check
displayName: Run hygiene, eslint
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
yarn gulp compile-build
yarn gulp compile-extensions-build
yarn gulp minify-vscode
yarn gulp vscode-reh-linux-x64-min
yarn gulp vscode-reh-web-linux-x64-min
displayName: Compile
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- script: |
set -e
@@ -99,6 +110,7 @@ steps:
AZURE_STORAGE_ACCESS_KEY="$(sourcemap-storage-key)" \
node build/azure-pipelines/upload-sourcemaps
displayName: Upload sourcemaps
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
@@ -113,13 +125,12 @@ steps:
- task: PublishBuildArtifacts@1
displayName: 'Publish Artifact: drop'
- script: |
set -e
tar -czf $(Build.ArtifactStagingDirectory)/compilation.tar.gz .build out-*
displayName: Compress compilation artifact
- task: PublishPipelineArtifact@1
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Compiled Files
inputs:
targetPath: $(Build.ArtifactStagingDirectory)/compilation.tar.gz
artifactName: Compilation
displayName: Publish compilation artifact
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'BuildCache'
platformIndependent: true
alias: 'Compilation'
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))

View File

@@ -1,29 +0,0 @@
resources:
containers:
- container: linux-x64
image: sqltoolscontainers.azurecr.io/web-build-image:1
endpoint: ContainerRegistry
jobs:
- job: LinuxWeb
pool:
vmImage: 'Ubuntu-18.04'
container: linux-x64
variables:
VSCODE_ARCH: x64
steps:
- template: web/sql-product-build-web.yml
timeoutInMinutes: 90
- job: Docker
pool:
vmImage: 'Ubuntu-18.04'
container: linux-x64
dependsOn:
- LinuxWeb
steps:
- template: docker/sql-product-build-docker.yml
timeoutInMinutes: 90
trigger: none
pr: none

View File

@@ -1,7 +1,7 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "14.x"
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:

View File

@@ -2,54 +2,56 @@
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const path = require("path");
const es = require("event-stream");
const vfs = require("vinyl-fs");
const util = require("../lib/util");
// @ts-ignore
const deps = require("../lib/dependencies");
const path = require('path');
const es = require('event-stream');
const azure = require('gulp-azure-storage');
const vfs = require('vinyl-fs');
const util = require('../lib/util');
const root = path.dirname(path.dirname(__dirname));
const commit = util.getVersion(root);
// optionally allow to pass in explicit base/maps to upload
const [, , base, maps] = process.argv;
function src(base, maps = `${base}/**/*.map`) {
return vfs.src(maps, { base })
.pipe(es.mapSync((f) => {
f.path = `${f.base}/core/${f.relative}`;
return f;
}));
}
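// The mapSync above rewrites each source map's path under a 'core/' folder,
// so client maps end up at <commit>/core/<relative> in the 'sourcemaps'
// container once azure.upload applies its commit prefix below.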
const fetch = function (base, maps = `${base}/**/*.map`) {
return vfs.src(maps, { base })
.pipe(es.mapSync(f => {
f.path = `${f.base}/core/${f.relative}`;
return f;
}));
};
function main() {
const sources = [];
// vscode client maps (default)
if (!base) {
const vs = src('out-vscode-min'); // client source-maps only
sources.push(vs);
const productionDependencies = deps.getProductionDependencies(root);
const productionDependenciesSrc = productionDependencies.map(d => path.relative(root, d.path)).map(d => `./${d}/**/*.map`);
const nodeModules = vfs.src(productionDependenciesSrc, { base: '.' })
.pipe(util.cleanNodeModules(path.join(root, 'build', '.moduleignore')));
sources.push(nodeModules);
const extensionsOut = vfs.src(['.build/extensions/**/*.js.map', '!**/node_modules/**'], { base: '.build' });
sources.push(extensionsOut);
}
// specific client base/maps
else {
sources.push(src(base, maps));
}
return es.merge(...sources)
.pipe(es.through(function (data) {
console.log('Uploading Sourcemap', data.relative); // debug
this.emit('data', data);
}))
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
container: 'sourcemaps',
prefix: commit + '/'
}));
const sources = [];
// vscode client maps (default)
if (!base) {
const vs = fetch('out-vscode-min'); // client source-maps only
sources.push(vs);
const extensionsOut = vfs.src(['.build/extensions/**/*.js.map', '!**/node_modules/**'], { base: '.build' });
sources.push(extensionsOut);
}
// specific client base/maps
else {
sources.push(fetch(base, maps));
}
return es.merge(...sources)
.pipe(es.through(function (data) {
console.log('Uploading Sourcemap', data.relative); // debug
this.emit('data', data);
}))
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
container: 'sourcemaps',
prefix: commit + '/'
}));
}
main();
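// Usage, per the argv handling above: node build/azure-pipelines/upload-sourcemaps [base] [maps]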

View File

@@ -1,67 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as path from 'path';
import * as es from 'event-stream';
import * as Vinyl from 'vinyl';
import * as vfs from 'vinyl-fs';
import * as util from '../lib/util';
// @ts-ignore
import * as deps from '../lib/dependencies';
const azure = require('gulp-azure-storage');
const root = path.dirname(path.dirname(__dirname));
const commit = util.getVersion(root);
// optionally allow to pass in explicit base/maps to upload
const [, , base, maps] = process.argv;
function src(base: string, maps = `${base}/**/*.map`) {
return vfs.src(maps, { base })
.pipe(es.mapSync((f: Vinyl) => {
f.path = `${f.base}/core/${f.relative}`;
return f;
}));
}
function main() {
const sources = [];
// vscode client maps (default)
if (!base) {
const vs = src('out-vscode-min'); // client source-maps only
sources.push(vs);
const productionDependencies: { name: string, path: string, version: string }[] = deps.getProductionDependencies(root);
const productionDependenciesSrc = productionDependencies.map(d => path.relative(root, d.path)).map(d => `./${d}/**/*.map`);
const nodeModules = vfs.src(productionDependenciesSrc, { base: '.' })
.pipe(util.cleanNodeModules(path.join(root, 'build', '.moduleignore')));
sources.push(nodeModules);
const extensionsOut = vfs.src(['.build/extensions/**/*.js.map', '!**/node_modules/**'], { base: '.build' });
sources.push(extensionsOut);
}
// specific client base/maps
else {
sources.push(src(base, maps));
}
return es.merge(...sources)
.pipe(es.through(function (data: Vinyl) {
console.log('Uploading Sourcemap', data.relative); // debug
this.emit('data', data);
}))
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
container: 'sourcemaps',
prefix: commit + '/'
}));
}
main();

Some files were not shown because too many files have changed in this diff.