Compare commits

..

44 Commits

Author SHA1 Message Date
Charles Gagnon
1b5c54dd8c revert grid streaming changes (#12650) (#12652)
(cherry picked from commit cf9754f627)

Co-authored-by: Lucy Zhang <luczhan@microsoft.com>
2020-09-28 21:49:49 -07:00
Aditya Bist
4082170522 bump version for hotfix (#12592) 2020-09-25 21:10:34 -07:00
Alan Ren
5ecf1c6e6f bump sts version (#12636) (#12638) 2020-09-25 14:59:04 -07:00
Charles Gagnon
6de11c8107 Fix undefined error in server tree data source (#12616) (#12617)
* Fix undefined error in server tree data source

* Add comment

(cherry picked from commit 1ea33d83bf)
2020-09-25 13:43:47 -07:00
Monica Gupta
76d7b0a9fe Addressed comments (#12618)
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-09-24 17:16:23 -07:00
Alan Ren
ce4c3e9586 clone the object to be modified (#12583) (#12590) 2020-09-23 13:42:44 -07:00
Alan Ren
5190bf376c escape the value for display (#12547) (#12571) 2020-09-22 14:50:41 -07:00
Udeesha Gautam
77b9a708df fix the reference error due to extra $ in default variable (#12524) 2020-09-21 10:21:23 -07:00
Udeesha Gautam
a4ee871b88 Port/db project fixes (#12521)
* Update default values and example text when dropdown value changes (#12493)

* remove option to add reference to same database (#12495)

Co-authored-by: Kim Santiago <31145923+kisantia@users.noreply.github.com>
2020-09-20 21:09:46 -07:00
Charles Gagnon
3f4e19fc08 Arc good ARC bad (#12499) (#12511)
Co-authored-by: Chris LaFreniere <40371649+chlafreniere@users.noreply.github.com>
2020-09-20 11:44:30 -07:00
Barbara Valdez
571fca6de5 In-Viewlet Notebooks Search (#12455) (#12514)
* fix search

* Add sql carbon tags to vs files

Co-authored-by: chlafreniere <hichise@gmail.com>
Co-authored-by: abist <adbist@microsoft.com>

Co-authored-by: chlafreniere <hichise@gmail.com>
Co-authored-by: abist <adbist@microsoft.com>
2020-09-19 18:13:10 -07:00
Barbara Valdez
5a2fdc4034 Add warning message for users using the new version of jupyter book (#12496) (#12500)
* Add warning message for users

* Address pr comments
2020-09-18 20:15:12 -07:00
Chris LaFreniere
cc6d84e7f6 Notebooks: Fix Grids Not Rendering when Unsaved Notebook Reloaded (#12483) (#12498)
* Clear Output and fix output change

* Fix tests after forced clear + append output
2020-09-18 20:14:45 -07:00
Vasu Bhog
99e11d2e22 Fix PySpark kernel connection change (#12494) (#12497) 2020-09-18 20:10:37 -07:00
Charles Gagnon
9a85123e21 Revert BDC deployment back to using old azdata check (#12470) (#12474) 2020-09-18 18:46:22 -07:00
Lucy Zhang
56669db6b6 update resultSet in data provider (#12478) (#12486) 2020-09-18 18:36:40 -07:00
Udeesha Gautam
8782eeb32f Port/ml fixes (#12491)
* change to allow refresh and delete correctly (#12477)

* add table name to models that are imported (#12445)
2020-09-18 17:44:58 -07:00
Charles Gagnon
7f3d5bac0a start with eulaCheckButton hidden (#12427) (#12458)
* start with eulaCheckButton hidden

* reset buttons on card select

* remove testcode

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-09-18 11:52:32 -07:00
Charles Gagnon
7a1e0a7d2e Fix resource deployment text field validation (#12421) (#12457) 2020-09-18 11:22:23 -07:00
Alan Ren
681ecbd946 fix the legacy card style issue (#12428) (#12442)
* fix the legacy card style issue

* replace the card class
2020-09-18 11:14:02 -07:00
Vasu Bhog
e7798a8e32 Fix Spark kernel connections and switch from Kusto to Spark kernels (#12436) (#12441)
* Fix connection dialog for Spark and issue when switching from Kusto to Spark

* Address comments
2020-09-17 21:32:03 -07:00
Aasim Khan
b158180ef4 Added portal link for Azure SQL (#12425) 2020-09-17 17:37:41 -07:00
Aditya Bist
7ad9da7fda fix connection dialog indentation (#12414) 2020-09-17 15:55:54 -07:00
Charles Gagnon
94e2016a16 Port updates for removing EULA acceptance checkbox from Arc deployments (#12409)
* controller dropdown field to SQL MIAA and Postgres deployment. (#12217)

* saving first draft

* throw if no controllers

* cleanup

* bug fixes

* bug fixes and caching controller access

* pr comments and bug fixes.

* fixes

* fixes

* comment fix

* remove debug prints

* comment fixes

* remove debug logs

* inputValueTransformer returns string|Promise

* PR feedback

* pr fixes

* remove _ from protected fields

* anonymous to full methods

* small fixes

(cherry picked from commit 9cf80113fc)

* fix option sources (#12387)


(cherry picked from commit fca8b85a72)

* Remove azdata eula acceptance from arc deployments (#12292)

* saving to switch tasks

* activate to exports in extApi

* working version - cleanup pending

* improve messages

* apply pr feedback from a different review

* remove unneeded strings

* redo apiService

* remove async from getVersionFromOutput

* remove _ prefix from protected fields

* error message fix

* throw specif errors from azdata extension

* arrow methods to regular methods

* pr feedback

* expand azdata extension api

* pr feedback

* remove unused var

* pr feedback

(cherry picked from commit ba44a2f02e)

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-09-17 15:05:02 -07:00
Aditya Bist
21bb577da8 fix maximize bug (#12335) 2020-09-17 14:18:53 -07:00
Udeesha Gautam
5e8325ba28 marking intermittent test failure as unstable (#12402) (#12407) 2020-09-17 13:36:13 -07:00
Aasim Khan
25b7ccade3 Added awaits to change column setting (#12315) 2020-09-17 13:28:21 -07:00
Barbara Valdez
57940c581c Update Windows command and minor update to installation cell (#12361) (#12400)
* Fix windows command and minor update to installation cell

* Add expand_section field on the first section of the book
2020-09-17 13:17:50 -07:00
Chris LaFreniere
82f9e4e24b Notebooks: Fast update WYSIWYG support for source update (#12289) (#12399)
* Fast update WYSIWYG support for source update

* Do bracket matching over hardcoding line offsets
2020-09-17 13:17:03 -07:00
Hale Rankin
3e22fcfd2d 12360 Notebook UI - Mac/Win fix for Select all. (#12383) (#12397)
* 12360 Notebook UI - Mac/Win fix for Select all.

* Fix for ctrl key selecting all in windows

* Fix undo as well

* preventDefault to prevent confusing behavior

Co-authored-by: chlafreniere <hichise@gmail.com>

Co-authored-by: chlafreniere <hichise@gmail.com>
2020-09-17 12:18:47 -07:00
Lucy Zhang
0bc81e1078 Fix notebook table rendering with multiple code cells (#12363) (#12391)
* create unique query runner for each cell

* use cellUri instead of cellId to identify runner

* disconnect each query runner connection

* remove queryrunners size check
2020-09-17 10:32:11 -07:00
Barbara Valdez
7b6328dccf Fix highlight issue (#12278) (#12362)
* Fix highlight issue

* Address PR comments
2020-09-16 13:48:14 -07:00
Vasu Bhog
05124273ea Fix Notebook Kusto Kernel Consistency (#12256) (#12352)
* fix kusto notebook consistency

* Address undefined
2020-09-16 12:08:28 -07:00
Lucy Zhang
b1d4444522 Fix notebook cancel query bug (#12300) (#12351)
* fix undefined query runner error

* store connection id

* revert sqlSessionManager change
2020-09-16 12:07:38 -07:00
Alan Ren
4ee2d369cf vbump sql-db-proj extension (#12336) (#12354)
* vbump sql-db-proj extension (#12336)

* update sqlproj dependency version (#12359)

Co-authored-by: Udeesha Gautam <46980425+udeeshagautam@users.noreply.github.com>
2020-09-16 11:47:51 -07:00
Charles Gagnon
fb28b69bb0 Fix component items in declarative table not showing (#12330) (#12331)
(cherry picked from commit 4dd04cb250)
2020-09-16 11:43:01 -07:00
Chris LaFreniere
f2709c7100 Watch for on load event (#12309) (#12346) 2020-09-16 00:43:23 -07:00
Chris LaFreniere
3476f5ae38 Add newline after caption (#12276) (#12340) 2020-09-15 23:07:18 -07:00
Chris LaFreniere
b937fdee7a 12284 Removed custom CSS that positioned editor text beneath overlapping layers. Text is now selectable. (#12312) (#12339)
Co-authored-by: Hale Rankin <harankin@microsoft.com>
2020-09-15 23:04:30 -07:00
Chris LaFreniere
dd9ac2e362 Add heasdingStyle atx option (#12286) (#12338) 2020-09-15 22:50:46 -07:00
Alan Ren
403ff6cfec remove data-workspace dependency (#12321) (#12327) 2020-09-15 17:05:29 -07:00
Udeesha Gautam
4a6226974e adding icon for add new and open project (#12265) (#12324) 2020-09-15 16:28:42 -07:00
Charles Gagnon
6a2c47f511 Disable resource viewer (#12291) (#12298)
* Disable resource viewer

* comment

* Remove unused

(cherry picked from commit 95b76f08f2)
2020-09-15 16:13:53 -07:00
Aditya Bist
3d9a316f4b bump vscode version (#12258) 2020-09-14 14:53:26 -07:00
8031 changed files with 214726 additions and 694587 deletions


@@ -12,10 +12,6 @@
{
"file": "build\\actions\\AutoMerge\\dist\\index.js",
"_justification": "False positive from webpacked code"
},
{
"file": ".devcontainer\\devcontainer.json",
"_justification": "Local development environment - not used in production"
}
]
}

.devcontainer/Dockerfile (new file, 121 lines)

@@ -0,0 +1,121 @@
#-------------------------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See https://go.microsoft.com/fwlink/?linkid=2090316 for license information.
#-------------------------------------------------------------------------------------------------------------
FROM mcr.microsoft.com/vscode/devcontainers/typescript-node:0-12
ARG TARGET_DISPLAY=":1"
# VNC options
ARG MAX_VNC_RESOLUTION=1920x1080x16
ARG TARGET_VNC_RESOLUTION=1920x1080
ARG TARGET_VNC_DPI=72
ARG TARGET_VNC_PORT=5901
ARG VNC_PASSWORD="vscode"
# noVNC (VNC web client) options
ARG INSTALL_NOVNC="true"
ARG NOVNC_VERSION=1.1.0
ARG TARGET_NOVNC_PORT=6080
ARG WEBSOCKETIFY_VERSION=0.9.0
# Firefox is useful for testing things like browser launch events, but optional
ARG INSTALL_FIREFOX="false"
# Expected non-root username from base image
ARG USERNAME=node
# Core environment variables for X11, VNC, and fluxbox
ENV DBUS_SESSION_BUS_ADDRESS="autolaunch:" \
MAX_VNC_RESOLUTION="${MAX_VNC_RESOLUTION}" \
VNC_RESOLUTION="${TARGET_VNC_RESOLUTION}" \
VNC_DPI="${TARGET_VNC_DPI}" \
VNC_PORT="${TARGET_VNC_PORT}" \
NOVNC_PORT="${TARGET_NOVNC_PORT}" \
DISPLAY="${TARGET_DISPLAY}" \
LANG="en_US.UTF-8" \
LANGUAGE="en_US.UTF-8" \
VISUAL="nano" \
EDITOR="nano"
# Configure apt and install packages
RUN apt-get update \
&& export DEBIAN_FRONTEND=noninteractive \
#
# Install the Cascadia Code fonts - https://github.com/microsoft/cascadia-code
&& curl -sSL https://github.com/microsoft/cascadia-code/releases/download/v2004.30/CascadiaCode_2004.30.zip -o /tmp/cascadia-fonts.zip \
&& unzip /tmp/cascadia-fonts.zip -d /tmp/cascadia-fonts \
&& mkdir -p /usr/share/fonts/truetype/cascadia \
&& mv /tmp/cascadia-fonts/ttf/* /usr/share/fonts/truetype/cascadia/ \
&& rm -rf /tmp/cascadia-fonts.zip /tmp/cascadia-fonts \
#
# Install X11, fluxbox and VS Code dependencies
&& apt-get -y install --no-install-recommends \
xvfb \
x11vnc \
fluxbox \
dbus-x11 \
x11-utils \
x11-xserver-utils \
xdg-utils \
fbautostart \
xterm \
eterm \
gnome-terminal \
gnome-keyring \
seahorse \
nautilus \
libx11-dev \
libxkbfile-dev \
libsecret-1-dev \
libnotify4 \
libnss3 \
libxss1 \
libasound2 \
xfonts-base \
xfonts-terminus \
fonts-noto \
fonts-wqy-microhei \
fonts-droid-fallback \
vim-tiny \
nano \
#
# [Optional] Install noVNC
&& if [ "${INSTALL_NOVNC}" = "true" ]; then \
mkdir -p /usr/local/novnc \
&& curl -sSL https://github.com/novnc/noVNC/archive/v${NOVNC_VERSION}.zip -o /tmp/novnc-install.zip \
&& unzip /tmp/novnc-install.zip -d /usr/local/novnc \
&& cp /usr/local/novnc/noVNC-${NOVNC_VERSION}/vnc_lite.html /usr/local/novnc/noVNC-${NOVNC_VERSION}/index.html \
&& rm /tmp/novnc-install.zip \
&& curl -sSL https://github.com/novnc/websockify/archive/v${WEBSOCKETIFY_VERSION}.zip -o /tmp/websockify-install.zip \
&& unzip /tmp/websockify-install.zip -d /usr/local/novnc \
&& apt-get -y install --no-install-recommends python-numpy \
&& ln -s /usr/local/novnc/websockify-${WEBSOCKETIFY_VERSION} /usr/local/novnc/noVNC-${NOVNC_VERSION}/utils/websockify \
&& rm /tmp/websockify-install.zip; \
fi \
#
# [Optional] Install Firefox
&& if [ "${INSTALL_FIREFOX}" = "true" ]; then \
apt-get -y install --no-install-recommends firefox-esr; \
fi \
#
# Clean up
&& apt-get autoremove -y \
&& apt-get clean -y \
&& rm -rf /var/lib/apt/lists/*
COPY bin/init-dev-container.sh /usr/local/share/
COPY bin/set-resolution /usr/local/bin/
COPY fluxbox/* /root/.fluxbox/
COPY fluxbox/* /home/${USERNAME}/.fluxbox/
# Update privs, owners of config files
RUN mkdir -p /var/run/dbus /root/.vnc /home/${USERNAME}/.vnc \
&& touch /root/.Xmodmap /home/${USERNAME}/.Xmodmap \
&& echo "${VNC_PASSWORD}" | tee /root/.vnc/passwd > /home/${USERNAME}/.vnc/passwd \
&& chown -R ${USERNAME}:${USERNAME} /home/${USERNAME}/.Xmodmap /home/${USERNAME}/.fluxbox /home/${USERNAME}/.vnc \
&& chmod +x /usr/local/share/init-dev-container.sh /usr/local/bin/set-resolution
ENTRYPOINT ["/usr/local/share/init-dev-container.sh"]
CMD ["sleep", "infinity"]


@@ -1,8 +1,8 @@
# Code - OSS Development Container
This repository includes configuration for a development container for working with Code - OSS in an isolated local container or using [GitHub Codespaces](https://github.com/features/codespaces).
This repository includes configuration for a development container for working with Code - OSS in an isolated local container or using [Visual Studio Codespaces](https://aka.ms/vso).
> **Tip:** The default VNC password is `vscode`. The VNC server runs on port `5901` with a web client at `6080`. For better performance, we recommend using a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Applications like the macOS Screen Sharing app will not perform as well.
> **Tip:** The default VNC password is `vscode`. The VNC server runs on port `5901` with a web client at `6080`. For better performance, we recommend using a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Applications like the macOS Screen Sharing app will not perform as well. [Chicken](https://sourceforge.net/projects/chicken/) is a good macOS alternative.
## Quick start - local
@@ -30,41 +30,25 @@ Anything you start in VS Code or the integrated terminal will appear here.
Next: **[Try it out!](#try-it)**
## Quick start - GitHub Codespaces
## Quick start - Codespaces
> **IMPORTANT:** The current free user beta for GitHub Codespaces uses a "Basic" sized codespace which does not have enough RAM to run a full build of VS Code and will be considerably slower during codespace start and running VS Code. You'll soon be able to use a "Standard" sized codespace (4-core, 8GB) that will be better suited for this purpose (along with even larger sizes should you need it).
>Note that the Codespaces browser-based editor cannot currently access the desktop environment in this container (due to a [missing feature](https://github.com/MicrosoftDocs/vsonline/issues/117)). We recommend using Visual Studio Code from the desktop to connect instead in the near term.
1. From the [microsoft/vscode GitHub repository](https://github.com/microsoft/vscode), click on the **Code** dropdown, select **Open with Codespaces**, and the **New codespace**
1. Install [Visual Studio Code Stable](https://code.visualstudio.com/) or [Insiders](https://code.visualstudio.com/insiders/) and the [Visual Studio Codespaces](https://aka.ms/vscs-ext-vscode) extension.
> Note that you will not see these options if you are not in the beta yet.
![Image of VS Codespaces extension](https://microsoft.github.io/vscode-remote-release/images/codespaces-extn.png)
2. After the codespace is up and running in your browser, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and select **View: Show Remote Explorer**.
> Note that the Visual Studio Codespaces extension requires the Visual Studio Code distribution of Code - OSS.
3. You should see port `6080` under **Forwarded Ports**. Select the line and click on the globe icon to open it in a browser tab.
2. Sign in by pressing <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and selecting **Codespaces: Sign In**. You may also need to use the **Codespaces: Create Plan** if you do not have a plan. See the [Codespaces docs](https://aka.ms/vso-docs/vscode) for details.
> If you do not see port `6080`, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd>, select **Forward a Port** and enter port `6080`.
3. Press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and select **Codespaces: Create New Codespace**.
4. In the new tab, you should see noVNC. Click **Connect** and enter `vscode` as the password.
4. Use default settings (which should include **Standard** 4 core, 8 GB RAM Codespace), select a plan, and then enter the repository URL `https://github.com/microsoft/vscode` (or a branch or PR URL) in the input box when prompted.
Anything you start in VS Code or the integrated terminal will appear here.
5. After the container is running, open a web browser and go to [http://localhost:6080](http://localhost:6080) or use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
Next: **[Try it out!](#try-it)**
### Using VS Code with GitHub Codespaces
You will likely see better performance when accessing the codespace you created from VS Code since you can use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Here's how to do it.
1. [Create a codespace](#quick-start---github-codespaces) if you have not already.
2. Set up [VS Code for use with GitHub Codespaces](https://docs.github.com/github/developing-online-with-codespaces/using-codespaces-in-visual-studio-code)
3. After VS Code is up and running, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd>, choose **Codespaces: Connect to Codespace**, and select the codespace you created.
4. After you've connected to the codespace, use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
5. Anything you start in VS Code or the integrated terminal will appear here.
Next: **[Try it out!](#try-it)**
6. Anything you start in VS Code or the integrated terminal will appear here.
## Try it!
@@ -81,9 +65,7 @@ To start working with Code - OSS, follow these steps:
bash scripts/code.sh
```
Note that a previous run of `yarn install` will already be cached, so this step should simply pick up any recent differences.
2. After the build is complete, open a web browser or a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to the desktop environnement as described in the quick start and enter `vscode` as the password.
2. After the build is complete, open a web browser and go to [http://localhost:6080](http://localhost:6080) or use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
3. You should now see Code - OSS!
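
For reference, a minimal sketch of the connection flow the README above describes, using the default ports and password it documents (noVNC on `6080`, VNC on `5901`, password `vscode`); `xdg-open` and `vncviewer` are placeholder commands for whatever browser or VNC client you prefer:

```bash
# Browser route: open the noVNC web client forwarded by the container
xdg-open http://localhost:6080    # any browser works; enter "vscode" when prompted

# Native client route: connect a VNC viewer directly to the VNC server
vncviewer localhost:5901          # placeholder client; password is "vscode"

# Inside the container, build and launch Code - OSS as described above
yarn install                      # mostly cached by the image; picks up recent differences
bash scripts/code.sh
```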


@@ -0,0 +1,91 @@
#!/bin/bash
NONROOT_USER=node
LOG=/tmp/container-init.log
# Execute the command if it is not already running
startInBackgroundIfNotRunning()
{
log "Starting $1."
echo -e "\n** $(date) **" | sudoIf tee -a /tmp/$1.log > /dev/null
if ! pidof $1 > /dev/null; then
keepRunningInBackground "$@"
while ! pidof $1 > /dev/null; do
sleep 1
done
log "$1 started."
else
echo "$1 is already running." | sudoIf tee -a /tmp/$1.log > /dev/null
log "$1 is already running."
fi
}
# Keep command running in background
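# Arguments: $1 = name used for /tmp/$1.log and /tmp/$1.pid, $2 = privilege wrapper (sudoIf or sudoUserIf), $3 = command to keep re-running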
keepRunningInBackground()
{
($2 sh -c "while :; do echo [\$(date)] Process started.; $3; echo [\$(date)] Process exited!; sleep 5; done 2>&1" | sudoIf tee -a /tmp/$1.log > /dev/null & echo "$!" | sudoIf tee /tmp/$1.pid > /dev/null)
}
# Use sudo to run as root when required
sudoIf()
{
if [ "$(id -u)" -ne 0 ]; then
sudo "$@"
else
"$@"
fi
}
# Run as the non-root user, using sudo if currently running as root
sudoUserIf()
{
if [ "$(id -u)" -eq 0 ]; then
sudo -u ${NONROOT_USER} "$@"
else
"$@"
fi
}
# Log messages
log()
{
echo -e "[$(date)] $@" | sudoIf tee -a $LOG > /dev/null
}
log "** SCRIPT START **"
# Start dbus.
log 'Running "/etc/init.d/dbus start".'
if [ -f "/var/run/dbus/pid" ] && ! pidof dbus-daemon > /dev/null; then
sudoIf rm -f /var/run/dbus/pid
fi
sudoIf /etc/init.d/dbus start 2>&1 | sudoIf tee -a /tmp/dbus-daemon-system.log > /dev/null
while ! pidof dbus-daemon > /dev/null; do
sleep 1
done
# Set up Xvfb.
startInBackgroundIfNotRunning "Xvfb" sudoIf "Xvfb ${DISPLAY:-:1} +extension RANDR -screen 0 ${MAX_VNC_RESOLUTION:-1920x1080x16}"
# Start fluxbox as a lightweight window manager.
startInBackgroundIfNotRunning "fluxbox" sudoUserIf "dbus-launch startfluxbox"
# Start x11vnc
startInBackgroundIfNotRunning "x11vnc" sudoIf "x11vnc -display ${DISPLAY:-:1} -rfbport ${VNC_PORT:-5901} -localhost -no6 -xkb -shared -forever -passwdfile $HOME/.vnc/passwd"
# Set resolution
/usr/local/bin/set-resolution ${VNC_RESOLUTION:-1280x720} ${VNC_DPI:-72}
# Spin up noVNC if installed and not running.
if [ -d "/usr/local/novnc" ] && [ "$(ps -ef | grep /usr/local/novnc/noVNC*/utils/launch.sh | grep -v grep)" = "" ]; then
keepRunningInBackground "noVNC" sudoIf "/usr/local/novnc/noVNC*/utils/launch.sh --listen ${NOVNC_PORT:-6080} --vnc localhost:${VNC_PORT:-5901}"
log "noVNC started."
else
log "noVNC is already running or not installed."
fi
# Run whatever was passed in
log "Executing \"$@\"."
"$@"
log "** SCRIPT EXIT **"


@@ -0,0 +1,25 @@
#!/bin/bash
RESOLUTION=${1:-${VNC_RESOLUTION:-1920x1080}}
DPI=${2:-${VNC_DPI:-72}}
if [ -z "$1" ]; then
echo -e "**Current Settings **\n"
xrandr
echo -n -e "\nEnter new resolution (WIDTHxHEIGHT, blank for ${RESOLUTION}, Ctrl+C to abort).\n> "
read NEW_RES
if [ "${NEW_RES}" != "" ]; then
RESOLUTION=${NEW_RES}
fi
if [ -z "$2" ]; then
echo -n -e "\nEnter new DPI (blank for ${DPI}, Ctrl+C to abort).\n> "
read NEW_DPI
if [ "${NEW_DPI}" != "" ]; then
DPI=${NEW_DPI}
fi
fi
fi
xrandr --fb ${RESOLUTION} --dpi ${DPI} > /dev/null 2>&1
echo -e "\n**New Settings **\n"
xrandr
echo
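
For reference, a usage sketch of the `set-resolution` script above: the resolution and DPI map to the `$1` and `$2` parameters it declares, and the values shown are only examples; with no arguments it prompts interactively after printing the current `xrandr` settings.

```bash
# Non-interactive: pass resolution and DPI as positional arguments (example values)
/usr/local/bin/set-resolution 1920x1080 96

# Interactive: prompt for a new resolution and DPI, defaulting to VNC_RESOLUTION / VNC_DPI
/usr/local/bin/set-resolution
```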


@@ -1 +0,0 @@
*.manifest


@@ -1,15 +0,0 @@
#!/bin/bash
# This file establishes a baseline for the repository before any steps in "prepare.sh"
# are run. It's just a find command that filters out a few things we don't need to watch.
set -e
SCRIPT_PATH="$(cd "$(dirname $0)" && pwd)"
SOURCE_FOLDER="${1:-"."}"
cd "${SOURCE_FOLDER}"
echo "[$(date)] Generating ""before"" manifest..."
find -L . -not -path "*/.git/*" -and -not -path "${SCRIPT_PATH}/*.manifest" -type f > "${SCRIPT_PATH}/before.manifest"
echo "[$(date)] Done!"


@@ -1,28 +0,0 @@
#!/bin/bash
# This file simply wraps the docker build command used to build the image with the
# cached result of the commands from "prepare.sh" and pushes it to the specified
# container image registry.
set -e
SCRIPT_PATH="$(cd "$(dirname $0)" && pwd)"
CONTAINER_IMAGE_REPOSITORY="$1"
BRANCH="${2:-"master"}"
if [ "${CONTAINER_IMAGE_REPOSITORY}" = "" ]; then
echo "Container repository not specified!"
exit 1
fi
TAG="branch-${BRANCH//\//-}"
echo "[$(date)] ${BRANCH} => ${TAG}"
cd "${SCRIPT_PATH}/../.."
echo "[$(date)] Starting image build..."
docker build -t ${CONTAINER_IMAGE_REPOSITORY}:"${TAG}" -f "${SCRIPT_PATH}/cache.Dockerfile" .
echo "[$(date)] Image build complete."
echo "[$(date)] Pushing image..."
docker push ${CONTAINER_IMAGE_REPOSITORY}:"${TAG}"
echo "[$(date)] Done!"


@@ -1,21 +0,0 @@
#!/bin/bash
# This file is used to archive off a copy of any differences in the source tree into another location
# in the image. Once the codespace is up, this will be restored into its proper location (which is
# quick and happens in parallel with other startup activities)
set -e
SCRIPT_PATH="$(cd "$(dirname $0)" && pwd)"
SOURCE_FOLDER="${1:-"."}"
CACHE_FOLDER="${2:-"/usr/local/etc/devcontainer-cache"}"
echo "[$(date)] Starting cache operation..."
cd "${SOURCE_FOLDER}"
echo "[$(date)] Determining diffs..."
find -L . -not -path "*/.git/*" -and -not -path "${SCRIPT_PATH}/*.manifest" -type f > "${SCRIPT_PATH}/after.manifest"
grep -Fxvf "${SCRIPT_PATH}/before.manifest" "${SCRIPT_PATH}/after.manifest" > "${SCRIPT_PATH}/cache.manifest"
echo "[$(date)] Archiving diffs..."
mkdir -p "${CACHE_FOLDER}"
tar -cf "${CACHE_FOLDER}/cache.tar" --totals --files-from "${SCRIPT_PATH}/cache.manifest"
echo "[$(date)] Done! $(du -h "${CACHE_FOLDER}/cache.tar")"


@@ -1,14 +0,0 @@
# This dockerfile is used to build up from a base image to create an image with cached results of running "prepare.sh".
# Other image contents: https://github.com/microsoft/vscode-dev-containers/blob/master/repository-containers/images/github.com/microsoft/vscode/.devcontainer/base.Dockerfile
FROM mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:dev
ARG USERNAME=node
COPY --chown=${USERNAME}:${USERNAME} . /repo-source-tmp/
RUN mkdir /usr/local/etc/devcontainer-cache \
&& chown ${USERNAME} /usr/local/etc/devcontainer-cache /repo-source-tmp \
&& su ${USERNAME} -c "\
cd /repo-source-tmp \
&& .devcontainer/cache/before-cache.sh \
&& .devcontainer/prepare.sh \
&& .devcontainer/cache/cache-diff.sh" \
&& rm -rf /repo-source-tmp


@@ -1,23 +0,0 @@
#!/bin/bash
# This file restores the results of the "prepare.sh" into their proper locations
# once the container has been created. It runs as a postCreateCommand which
# in GitHub Codespaces occurs in parallel with other startup activities and does not
# really add to the overall startup time given how quick the operation ends up being.
set -e
SOURCE_FOLDER="$(cd "${1:-"."}" && pwd)"
CACHE_FOLDER="${2:-"/usr/local/etc/devcontainer-cache"}"
if [ ! -d "${CACHE_FOLDER}" ]; then
echo "No cache folder found."
exit 0
fi
echo "[$(date)] Expanding $(du -h "${CACHE_FOLDER}/cache.tar") file to ${SOURCE_FOLDER}..."
cd "${SOURCE_FOLDER}"
tar -xf "${CACHE_FOLDER}/cache.tar"
rm -f "${CACHE_FOLDER}/cache.tar"
echo "[$(date)] Done!"


@@ -1,30 +1,45 @@
{
"name": "Code - OSS",
// Image contents: https://github.com/microsoft/vscode-dev-containers/blob/master/repository-containers/images/github.com/microsoft/vscode/.devcontainer/base.Dockerfile
"image": "mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:branch-master",
"workspaceMount": "source=${localWorkspaceFolder},target=/home/node/workspace/vscode,type=bind,consistency=cached",
"workspaceFolder": "/home/node/workspace/vscode",
"build": {
"dockerfile": "Dockerfile",
"args": {
"MAX_VNC_RESOLUTION": "1920x1080x16",
"TARGET_VNC_RESOLUTION": "1280x768",
"TARGET_VNC_PORT": "5901",
"TARGET_NOVNC_PORT": "6080",
"VNC_PASSWORD": "vscode",
"INSTALL_FIREFOX": "true"
}
},
"overrideCommand": false,
"runArgs": [ "--init", "--security-opt", "seccomp=unconfined"],
"runArgs": [
"--init",
// seccomp=unconfined is required for Chrome sandboxing
"--security-opt", "seccomp=unconfined"
],
"settings": {
// zsh is also available
"terminal.integrated.shell.linux": "/bin/bash",
"resmon.show.battery": false,
"resmon.show.cpufreq": false
"resmon.show.cpufreq": false,
"remote.extensionKind": {
"ms-vscode.js-debug-nightly": "workspace",
"msjsdiag.debugger-for-chrome": "workspace"
},
"debug.chrome.useV3": true
},
// noVNC, VNC, debug ports
"forwardPorts": [6080, 5901, 9222],
// noVNC, VNC ports
"forwardPorts": [6080, 5901],
"extensions": [
"dbaeumer.vscode-eslint",
"mutantdino.resourcemonitor"
"EditorConfig.EditorConfig",
"msjsdiag.debugger-for-chrome",
"mutantdino.resourcemonitor",
"GitHub.vscode-pull-request-github"
],
// Optionally loads a cached yarn install for the repo
"postCreateCommand": ".devcontainer/cache/restore-diff.sh",
"remoteUser": "node"
}


@@ -0,0 +1,9 @@
[app] (name=code-oss-dev)
[Position] (CENTER) {0 0}
[Maximized] {yes}
[Dimensions] {100% 100%}
[end]
[transient] (role=GtkFileChooserDialog)
[Position] (CENTER) {0 0}
[Dimensions] {70% 70%}
[end]


@@ -0,0 +1,9 @@
session.menuFile: ~/.fluxbox/menu
session.keyFile: ~/.fluxbox/keys
session.styleFile: /usr/share/fluxbox/styles//Squared_for_Debian
session.configVersion: 13
session.screen0.workspaces: 1
session.screen0.workspacewarping: false
session.screen0.toolbar.widthPercent: 100
session.screen0.strftimeFormat: %d %b, %a %02k:%M:%S
session.screen0.toolbar.tools: prevworkspace, workspacename, nextworkspace, clock, prevwindow, nextwindow, iconbar, systemtray


@@ -0,0 +1,16 @@
[begin] ( Code - OSS Development Container )
[exec] (File Manager) { nautilus ~ } <>
[exec] (Terminal) {/usr/bin/gnome-terminal --working-directory=~ } <>
[exec] (Start Code - OSS) { x-terminal-emulator -T "Code - OSS Build" -e bash /workspaces/vscode*/scripts/code.sh } <>
[submenu] (System >) {}
[exec] (Set Resolution) { x-terminal-emulator -T "Set Resolution" -e bash /usr/local/bin/set-resolution } <>
[exec] (Passwords and Keys) { seahorse } <>
[exec] (Top) { x-terminal-emulator -T "Top" -e /usr/bin/top } <>
[exec] (Editres) {editres} <>
[exec] (Xfontsel) {xfontsel} <>
[exec] (Xkill) {xkill} <>
[exec] (Xrefresh) {xrefresh} <>
[end]
[config] (Configuration >)
[workspaces] (Workspaces >)
[end]


@@ -1,10 +0,0 @@
#!/bin/bash
# This file contains the steps that should be run when creating the intermediary image that contains
# contents that should be in the image by default. It will be used to build up from the base image
# to create an image that speeds up first time use of the dev container by "caching" the results
# of these commands. Developers can still run these commands without an issue once the container is
# up, but only differences will be processed which also speeds up the first time these operations occur.
yarn install
yarn electron


@@ -5,7 +5,7 @@
**/vs/loader.js
**/insane/**
**/marked/**
**/semver/**
**/markjs/**
**/test/**/*.js
**/node_modules/**
**/vscode-api-tests/testWorkspace/**


@@ -7,8 +7,7 @@
},
"plugins": [
"@typescript-eslint",
"jsdoc",
"mocha"
"jsdoc"
],
"rules": {
"constructor-super": "warn",
@@ -42,7 +41,6 @@
"no-var": "warn",
"jsdoc/no-types": "warn",
"semi": "off",
"mocha/no-exclusive-tests": "warn",
"@typescript-eslint/semi": "warn",
"@typescript-eslint/naming-convention": [
"warn",
@@ -546,8 +544,7 @@
"vscode-textmate",
"vscode-oniguruma",
"iconv-lite-umd",
"tas-client-umd",
"jschardet"
"semver-umd"
]
},
{
@@ -608,11 +605,7 @@
"**/{vs,sql}/editor/**",
"**/{vs,sql}/workbench/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/api/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/services/**/{common,browser,electron-sandbox}/**",
"vscode-textmate",
"vscode-oniguruma",
"iconv-lite-umd",
"jschardet"
"**/{vs,sql}/workbench/services/**/{common,browser,electron-sandbox}/**"
]
},
{
@@ -740,11 +733,7 @@
"angular2-grid",
"html-query-plan",
"turndown",
"mark.js",
"vscode-textmate",
"vscode-oniguruma",
"iconv-lite-umd",
"jschardet"
"mark.js"
]
},
{
@@ -773,11 +762,7 @@
"**/{vs,sql}/workbench/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/api/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/services/**/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/contrib/**/{common,browser,electron-sandbox}/**",
"vscode-textmate",
"vscode-oniguruma",
"iconv-lite-umd",
"jschardet"
"**/{vs,sql}/workbench/contrib/**/{common,browser,electron-sandbox}/**"
]
},
{
@@ -1039,7 +1024,6 @@
"collapse",
"create",
"delete",
"discover",
"dispose",
"edit",
"end",

.github/CODEOWNERS (vendored, 16 lines removed)

@@ -1,16 +0,0 @@
# Lines starting with '#' are comments.
# Each line is a file pattern followed by one or more owners.
# Syntax can be found here: https://docs.github.com/free-pro-team@latest/github/creating-cloning-and-archiving-repositories/about-code-owners#codeowners-syntax
/extensions/admin-tool-ext-win @Charles-Gagnon
/extensions/arc/ @Charles-Gagnon
/extensions/azdata/ @Charles-Gagnon
/extensions/big-data-cluster/ @Charles-Gagnon
/extensions/dacpac/ @kisantia
/extensions/query-history/ @Charles-Gagnon
/extensions/resource-deployment/ @Charles-Gagnon
/extensions/schema-compare/ @kisantia
/extensions/sql-database-projects/ @Benjin @kisantia
/extensions/mssql/config.json @Charles-Gagnon @alanrenmsft @kburtram
/src/sql/*.d.ts @alanrenmsft @Charles-Gagnon

.github/commands.yml (vendored)

@@ -1,12 +1,11 @@
{
perform: true,
commands:
[
{
type: "label",
name: "Needs Logs",
action: "comment",
comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.\n\nTo find your logs:\n\n- Open command palette (Click **View** -> **Command Palette**)\n- Run the command: **`Developer: Open Logs Folder`**\n\nThis will open the log file locally. Please include renderer.log",
},
],
commands: [
{
type: 'label',
name: 'Needs Logs',
action: 'comment',
comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.\n\nTo find your logs:\n\n- Open command palette (Click **View** -> **Command Palette**)\n- Run the command: **`Developer: Open Logs Folder`**\n\nThis will open the log file locally. Please include renderer.log"
}
]
}


@@ -1,27 +0,0 @@
Needs Logs:
comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.
There are two types of logs to collect:
**Console Logs**
- Open Developer Tools (Help -> Toggle Developer Tools)
- Click the **Console** tab
- Click in the log area and select all text (CTRL+A)
- Save this text into a file named console.log and attach it to this issue.
**Application Logs**
- Open command palette (Click **View** -> **Command Palette**)
- Run the command: **`Developer: Open Logs Folder`**
- This will open the log folder locally. Please zip up this folder and attach it to the issue."


@@ -1,5 +1,5 @@
{
perform: true,
whenCreatedByTeam: true,
comment: "Thanks for submitting this issue. Please also check if it is already covered by an existing one, like:\n${potentialDuplicates}",
comment: "Thanks for submitting this issue. Please also check if it is already covered by an existing one, like:\n${potentialDuplicates}"
}


@@ -1,9 +0,0 @@
{
"notebook": [
"claudiaregio",
"rchiodo",
"greazer",
"donjayamanne",
"jilljac"
]
}


@@ -17,51 +17,48 @@ jobs:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2.2.0
# TODO: rename azure-pipelines/linux/xvfb.init to github-actions
- run: |
sudo apt-get update
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libkrb5-dev # {{SQL CARBON EDIT}} add kerberos dep
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
name: Setup Build Environment
- uses: actions/setup-node@v1
with:
node-version: 12
# TODO: cache node modules
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron x64
name: Download Electron
- run: yarn gulp hygiene
name: Run Hygiene Checks
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
name: Run Strict Compile Options
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
# name: Run Monaco Editor Checks
- run: yarn valid-layers-check
name: Run Valid Layers Checks
- run: yarn compile
name: Compile Sources
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
# name: Download Built-in Extensions
- run: DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests" --coverage --runGlob "**/sql/**/*.test.js"
name: Run Unit Tests (Electron)
- run: DISPLAY=:10 ./scripts/test-extensions-unit.sh
name: Run Extension Unit Tests (Electron)
# {{SQL CARBON EDIT}} Add coveralls. We merge first to get around issue where parallel builds weren't being combined correctly
- run: node test/combineCoverage
name: Combine code coverage files
- name: Upload Code Coverage
uses: coverallsapp/github-action@v1.1.1
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
path-to-lcov: "test/coverage/lcov.info"
- uses: actions/checkout@v2.2.0
# TODO: rename azure-pipelines/linux/xvfb.init to github-actions
- run: |
sudo apt-get update
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libkrb5-dev # {{SQL CARBON EDIT}} add kerberos dep
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
name: Setup Build Environment
- uses: actions/setup-node@v1
with:
node-version: 10
# TODO: cache node modules
- run: yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron x64
name: Download Electron
- run: yarn gulp hygiene
name: Run Hygiene Checks
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
name: Run Strict Compile Options
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
# name: Run Monaco Editor Checks
- run: yarn valid-layers-check
name: Run Valid Layers Checks
- run: yarn compile
name: Compile Sources
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
# name: Download Built-in Extensions
- run: DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests" --coverage --runGlob "**/sql/**/*.test.js"
name: Run Unit Tests (Electron)
- run: DISPLAY=:10 ./scripts/test-extensions-unit.sh
name: Run Extension Unit Tests (Electron)
# {{SQL CARBON EDIT}} Add coveralls. We merge first to get around issue where parallel builds weren't being combined correctly
- run: node test/combineCoverage
name: Combine code coverage files
- name: Upload Code Coverage
uses: coverallsapp/github-action@v1.1.1
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
path-to-lcov: 'test/coverage/lcov.info'
# Fails with cryptic error (e.g. https://github.com/microsoft/vscode/pull/90292/checks?check_run_id=433681926#step:13:9)
# - run: DISPLAY=:10 yarn test-browser --browser chromium
@@ -75,34 +72,31 @@ jobs:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2.2.0
- uses: actions/setup-node@v1
with:
node-version: 12
- uses: actions/setup-python@v1
with:
python-version: "2.x"
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron
name: Download Electron
- run: yarn gulp hygiene
name: Run Hygiene Checks
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
name: Run Strict Compile Options
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
# name: Run Monaco Editor Checks
- run: yarn valid-layers-check
name: Run Valid Layers Checks
- run: yarn compile
name: Compile Sources
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
# name: Download Built-in Extensions
- run: .\scripts\test.bat --tfs "Unit Tests"
name: Run Unit Tests (Electron)
- uses: actions/checkout@v2.2.0
- uses: actions/setup-node@v1
with:
node-version: 10
- uses: actions/setup-python@v1
with:
python-version: '2.x'
- run: yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron
name: Download Electron
- run: yarn gulp hygiene
name: Run Hygiene Checks
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
name: Run Strict Compile Options
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
# name: Run Monaco Editor Checks
- run: yarn valid-layers-check
name: Run Valid Layers Checks
- run: yarn compile
name: Compile Sources
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
# name: Download Built-in Extensions
- run: .\scripts\test.bat --tfs "Unit Tests"
name: Run Unit Tests (Electron)
# - run: yarn test-browser --browser chromium {{SQL CARBON EDIT}} disable for now @TODO @anthonydresser
# name: Run Unit Tests (Browser)
# - run: .\scripts\test-integration.bat --tfs "Integration Tests" {{SQL CARBON EDIT}} remove step
@@ -114,31 +108,28 @@ jobs:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2.2.0
- uses: actions/setup-node@v1
with:
node-version: 12
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron x64
name: Download Electron
- run: yarn gulp hygiene
name: Run Hygiene Checks
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
name: Run Strict Compile Options
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
# name: Run Monaco Editor Checks
- run: yarn valid-layers-check
name: Run Valid Layers Checks
- run: yarn compile
name: Compile Sources
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
# name: Download Built-in Extensions
- run: ./scripts/test.sh --tfs "Unit Tests"
name: Run Unit Tests (Electron)
- uses: actions/checkout@v2.2.0
- uses: actions/setup-node@v1
with:
node-version: 10
- run: yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron x64
name: Download Electron
- run: yarn gulp hygiene
name: Run Hygiene Checks
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
name: Run Strict Compile Options
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
# name: Run Monaco Editor Checks
- run: yarn valid-layers-check
name: Run Valid Layers Checks
- run: yarn compile
name: Compile Sources
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
# name: Download Built-in Extensions
- run: ./scripts/test.sh --tfs "Unit Tests"
name: Run Unit Tests (Electron)
# - run: yarn test-browser --browser chromium --browser webkit
# name: Run Unit Tests (Browser)
# - run: ./scripts/test-integration.sh --tfs "Integration Tests"


@@ -3,42 +3,44 @@ name: "Code Scanning - Action"
on:
push:
schedule:
- cron: "0 0 * * 0"
- cron: '0 0 * * 0'
jobs:
CodeQL-Build:
strategy:
fail-fast: false
# CodeQL runs on ubuntu-latest, windows-latest, and macos-latest
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v2
- name: Checkout repository
uses: actions/checkout@v2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
# Override language selection by uncommenting this and choosing your languages
# with:
# languages: go, javascript, csharp, python, cpp, java
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
# Override language selection by uncommenting this and choosing your languages
# with:
# languages: go, javascript, csharp, python, cpp, java
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below).
- name: Autobuild
uses: github/codeql-action/autobuild@v1
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below).
- name: Autobuild
uses: github/codeql-action/autobuild@v1
# Command-line programs to run using the OS shell.
# 📚 https://git.io/JvXDl
# Command-line programs to run using the OS shell.
# 📚 https://git.io/JvXDl
# ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
# and modify them (or add more) to build your code if your project
# uses a compiled language
# ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
# and modify them (or add more) to build your code if your project
# uses a compiled language
#- run: |
# make bootstrap
# make release
#- run: |
# make bootstrap
# make release
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1


@@ -1,50 +0,0 @@
name: "Deep Classifier: Runner"
on:
schedule:
- cron: 0 * * * *
repository_dispatch:
types: [trigger-deep-classifier-runner]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: "microsoft/vscode-github-triage-actions"
ref: v40
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Additional Dependencies
# Pulls in a bunch of other packages that aren't needed for the rest of the actions
run: npm install @azure/storage-blob@12.1.1
- name: "Run Classifier: Scraper"
uses: ./actions/classifier-deep/apply/fetch-sources
with:
# slightly overlapping to protect against issues slipping through the cracks if a run is delayed
from: 80
until: 5
configPath: classifier
blobContainerName: vscode-issue-classifier
blobStorageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
- name: Set up Python 3.7
uses: actions/setup-python@v1
with:
python-version: 3.7
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install --upgrade numpy scipy scikit-learn joblib nltk simpletransformers torch torchvision
- name: "Run Classifier: Generator"
run: python ./actions/classifier-deep/apply/generate-labels/main.py
- name: "Run Classifier: Labeler"
uses: ./actions/classifier-deep/apply/apply-labels
with:
configPath: classifier
allowLabels: "needs more info|new release"
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}


@@ -1,27 +0,0 @@
name: "Deep Classifier: Scraper"
on:
repository_dispatch:
types: [trigger-deep-classifier-scraper]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: "microsoft/vscode-github-triage-actions"
ref: v40
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Additional Dependencies
# Pulls in a bunch of other packages that aren't needed for the rest of the actions
run: npm install @azure/storage-blob@12.1.1
- name: "Run Classifier: Scraper"
uses: ./actions/classifier-deep/train/fetch-issues
with:
blobContainerName: vscode-issue-classifier
blobStorageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
token: ${{secrets.ISSUE_SCRAPER_TOKEN}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}


@@ -1,40 +0,0 @@
name: VS Code Repo Dev Container Cache Image Generation
on:
push:
# Currently doing this for master, but could be done for PRs as well
branches:
- "master"
# Only updates to these files result in changes to installed packages, so skip otherwise
paths:
- "**/package-lock.json"
- "**/yarn.lock"
jobs:
devcontainer:
name: Generate cache image
runs-on: ubuntu-latest
steps:
- name: Checkout
id: checkout
uses: actions/checkout@v2
- name: Azure CLI login
id: az_login
uses: azure/login@v1
with:
creds: ${{ secrets.AZ_ACR_CREDS }}
- name: Build and push
id: build_and_push
run: |
set -e
ACR_REGISTRY_NAME=$(echo ${{ secrets.CONTAINER_IMAGE_REGISTRY }} | grep -oP '(.+)(?=\.azurecr\.io)')
az acr login --name $ACR_REGISTRY_NAME
GIT_BRANCH=$(echo "${{ github.ref }}" | grep -oP 'refs/(heads|tags)/\K(.+)')
if [ "$GIT_BRANCH" == "" ]; then GIT_BRANCH=master; fi
.devcontainer/cache/build-cache-image.sh "${{ secrets.CONTAINER_IMAGE_REGISTRY }}/public/vscode/devcontainers/repos/microsoft/vscode" "${GIT_BRANCH}"


@@ -1,27 +0,0 @@
name: Latest Release Monitor
on:
schedule:
- cron: 0/5 * * * *
repository_dispatch:
types: [trigger-latest-release-monitor]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: "microsoft/vscode-github-triage-actions"
path: ./actions
ref: v40
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Storage Module
run: npm install @azure/storage-blob@12.1.1
- name: Run Latest Release Monitor
uses: ./actions/latest-release-monitor
with:
storageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}


@@ -1,15 +0,0 @@
name: On Label
on:
issues:
types: [labeled]
jobs:
processLabelAction:
name: Process Label Action
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Process Label Action
uses: hramos/label-actions@v1
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}

.nvmrc (new file)

@@ -0,0 +1 @@
10

.vscode/launch.json (vendored)

@@ -41,7 +41,10 @@
"port": 5876,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
],
"presentation": {
"hidden": true,
}
},
{
"type": "node",
@@ -138,12 +141,9 @@
}
},
{
"type": "pwa-chrome",
"type": "chrome",
"request": "launch",
"outFiles": [],
"outFiles": [],
"perScriptSourcemaps": "yes",
"name": "VS Code (Web, Chrome)",
"name": "Launch ADS (Web, Chrome) (TBD)",
"url": "http://localhost:8080",
"preLaunchTask": "Run web",
"presentation": {
@@ -154,8 +154,6 @@
{
"type": "pwa-msedge",
"request": "launch",
"outFiles": [],
"perScriptSourcemaps": "yes",
"name": "VS Code (Web, Edge)",
"url": "http://localhost:8080",
"pauseForSourceMap": false,
@@ -195,7 +193,7 @@
}
},
{
"type": "pwa-node",
"type": "node",
"request": "launch",
"name": "Run Unit Tests",
"program": "${workspaceFolder}/test/unit/electron/index.js",
@@ -214,41 +212,6 @@
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"cascadeTerminateToConfigurations": [
"Attach to VS Code"
],
"env": {
"MOCHA_COLORS": "true"
},
"presentation": {
"hidden": true
}
},
{
"type": "pwa-node",
"request": "launch",
"name": "Run Unit Tests For Current File",
"program": "${workspaceFolder}/test/unit/electron/index.js",
"runtimeExecutable": "${workspaceFolder}/.build/electron/Code - OSS.app/Contents/MacOS/Electron",
"windows": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/Code - OSS.exe"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/code-oss"
},
"cascadeTerminateToConfigurations": [
"Attach to VS Code"
],
"outputCapture": "std",
"args": [
"--remote-debugging-port=9222",
"--run",
"${relativeFile}"
],
"cwd": "${workspaceFolder}",
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"env": {
"MOCHA_COLORS": "true"
},
@@ -352,17 +315,6 @@
"group": "1_vscode",
"order": 2
}
},
{
"name": "Debug Unit Tests (Current File)",
"configurations": [
"Attach to VS Code",
"Run Unit Tests For Current File"
],
"presentation": {
"group": "1_vscode",
"order": 2
}
}
]
}


@@ -8,7 +8,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"November 2020\"",
"value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"August 2020\"",
"editable": true
},
{


@@ -1,110 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "#### Macros",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server\n\n$MILESTONE=milestone:\"November 2020\"",
"editable": false
},
{
"kind": 1,
"language": "markdown",
"value": "# Preparation",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Pull Requests on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:pr is:open",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Feature Requests Missing Labels",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Testing",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:open label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed label:feature-request label:verification-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Verification",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Candidates",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE label:candidate",
"editable": true
}
]


@@ -1,767 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "## Config",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$since=2020-10-01",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode\n\nQuery exceeds the maximum result. Run the query manually: `is:issue is:open closed:>2020-10-01`",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "//repo:microsoft/vscode is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "//repo:microsoft/vscode is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-remote-release",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-remote-release is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-remote-release is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-editor",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-docs",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-docs is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-docs is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-js-debug",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-js-debug is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-js-debug is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# language-server-protocol",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/language-server-protocol is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/language-server-protocol is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-eslint",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-eslint is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-eslint is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-css-languageservice",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-css-languageservice is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-css-languageservice is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-test",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-test is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-test is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-pull-request-github"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-pull-request-github is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-test is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-chrome-debug (deprecated)",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-chrome-debug is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-chrome-debug is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-chrome-debug-core",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-chrome-debug-core is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-chrome-debug-core is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-debugadapter-node",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-debugadapter-node is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-debugadapter-node is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-emmet-helper",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-emmet-helper is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-emmet-helper is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-extension-vscode\n\nDeprecated",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-vscode is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-vscode is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-extension-samples",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-samples is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-samples is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-filewatcher-windows",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-filewatcher-windows is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-filewatcher-windows is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-generator-code",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-generator-code is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-generator-code is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-html-languageservice",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-html-languageservice is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-html-languageservice is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-jshint",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-jshint is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-jshint is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-json-languageservice",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-json-languageservice is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-json-languageservice is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-languageserver-node",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-languageserver-node is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-languageserver-node is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-loader",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-loader is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-loader is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-mono-debug",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-mono-debug is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-mono-debug is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-node-debug",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-node-debug2",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug2 is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug2 is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-recipes",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-recipes is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-recipes is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-textmate",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-textmate is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-textmate is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-themes",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-themes is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-themes is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-vsce",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-vsce is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-vsce is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-website",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-website is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-website is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-windows-process-tree",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-windows-process-tree is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-windows-process-tree is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# debug-adapter-protocol",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/debug-adapter-protocol is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/debug-adapter-protocol is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# inno-updater",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/inno-updater is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/inno-updater is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# language-server-protocol-inspector",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/language-server-protocol-inspector is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/language-server-protocol-inspector is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-languages",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-languages is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-languages is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-typescript",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-typescript is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-typescript is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-css",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-css is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-css is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-json",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-json is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-json is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-html",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-html is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-html is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-editor-webpack-plugin",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor-webpack-plugin is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor-webpack-plugin is:issue created:>$since",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# node-jsonc-parser",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/node-jsonc-parser is:issue closed:>$since",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/node-jsonc-parser is:issue created:>$since",
"editable": true
}
]
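The JSON above is the raw cell list of a GitHub Issue Notebook: `kind: 1` cells hold markdown headings, `kind: 2` cells hold `github-issues` queries, and `$since` is defined once in the first query cell and referenced by the later ones. A hedged TypeScript shape for these cells, with field names taken from the JSON above (the interface itself is illustrative, not the extension's declared schema):

```ts
// Illustrative cell shape for the raw notebook JSON above.
interface NotebookCell {
	kind: 1 | 2;                              // 1 = markdown cell, 2 = query cell
	language: 'markdown' | 'github-issues';
	value: string;                            // markdown text or a github-issues query
	editable?: boolean;
}

const cells: NotebookCell[] = [
	{ kind: 2, language: 'github-issues', value: '$since=2020-10-01', editable: true },
	{ kind: 1, language: 'markdown', value: '# vscode-docs', editable: true },
	{ kind: 2, language: 'github-issues', value: 'repo:microsoft/vscode-docs is:issue closed:>$since' },
];
```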

View File

@@ -1,26 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "### Categorizing Issues\n\nEach issue must have a type label. Most type labels are grey, some are yellow. Bugs are grey with a touch of red.",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open is:issue assignee:@me -label:\"needs more info\" -label:bug -label:feature-request -label:under-discussion -label:debt -label:*question -label:upstream -label:electron -label:engineering -label:plan-item ",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "### Feature Areas\n\nEach issue should be assigned to a feature area",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open is:issue assignee:@me -label:L10N -label:VIM -label:api -label:api-finalization -label:api-proposal -label:authentication -label:breadcrumbs -label:callhierarchy -label:code-lens -label:color-palette -label:comments -label:config -label:context-keys -label:css-less-scss -label:custom-editors -label:debug -label:debug-console -label:dialogs -label:diff-editor -label:dropdown -label:editor -label:editor-RTL -label:editor-autoclosing -label:editor-autoindent -label:editor-bracket-matching -label:editor-clipboard -label:editor-code-actions -label:editor-color-picker -label:editor-columnselect -label:editor-commands -label:editor-comments -label:editor-contrib -label:editor-core -label:editor-drag-and-drop -label:editor-error-widget -label:editor-find -label:editor-folding -label:editor-highlight -label:editor-hover -label:editor-indent-detection -label:editor-indent-guides -label:editor-input -label:editor-input-IME -label:editor-insets -label:editor-minimap -label:editor-multicursor -label:editor-parameter-hints -label:editor-render-whitespace -label:editor-rendering -label:editor-scrollbar -label:editor-symbols -label:editor-synced-region -label:editor-textbuffer -label:editor-theming -label:editor-wordnav -label:editor-wrapping -label:emmet -label:error-list -label:explorer-custom -label:extension-host -label:extension-recommendations -label:extensions -label:extensions-development -label:file-decorations -label:file-encoding -label:file-explorer -label:file-glob -label:file-guess-encoding -label:file-io -label:file-watcher -label:font-rendering -label:formatting -label:git -label:github -label:gpu -label:grammar -label:grid-view -label:html -label:i18n -label:icon-brand -label:icons-product -label:install-update -label:integrated-terminal -label:integrated-terminal-conpty -label:integrated-terminal-links -label:integrated-terminal-rendering -label:integrated-terminal-winpty -label:intellisense-config -label:ipc -label:issue-bot -label:issue-reporter -label:javascript -label:json -label:keybindings -label:keybindings-editor -label:keyboard-layout -label:label-provider -label:languages-basic -label:languages-diagnostics -label:languages-guessing -label:layout -label:lcd-text-rendering -label:list -label:log -label:markdown -label:marketplace -label:menus -label:merge-conflict -label:notebook -label:outline -label:output -label:perf -label:perf-bloat -label:perf-startup -label:php -label:portable-mode -label:proxy -label:quick-pick -label:references-viewlet -label:release-notes -label:remote -label:remote-explorer -label:rename -label:sandbox -label:scm -label:screencast-mode -label:search -label:search-api -label:search-editor -label:search-replace -label:semantic-tokens -label:settings-editor -label:settings-sync -label:settings-sync-server -label:shared-process -label:simple-file-dialog -label:smart-select -label:snap -label:snippets -label:splitview -label:suggest -label:sync-error-handling -label:tasks -label:telemetry -label:themes -label:timeline -label:timeline-git -label:titlebar -label:tokenization -label:touch/pointer -label:trackpad/scroll -label:tree -label:typescript -label:undo-redo -label:uri -label:ux -label:variable-resolving -label:vscode-build -label:vscode-website -label:web -label:webview -label:workbench-actions -label:workbench-cli -label:workbench-diagnostics -label:workbench-dnd -label:workbench-editor-grid -label:workbench-editors -label:workbench-electron -label:workbench-feedback -label:workbench-history 
-label:workbench-hot-exit -label:workbench-hover -label:workbench-launch -label:workbench-link -label:workbench-multiroot -label:workbench-notifications -label:workbench-os-integration -label:workbench-rapid-render -label:workbench-run-as-admin -label:workbench-state -label:workbench-status -label:workbench-tabs -label:workbench-touchbar -label:workbench-views -label:workbench-welcome -label:workbench-window -label:workbench-zen -label:workspace-edit -label:workspace-symbols -label:zoom",
"editable": true
}
]

View File

@@ -1,16 +1,4 @@
[
{
"kind": 1,
"language": "markdown",
"value": "## tl;dr: Triage Inbox\n\nAll inbox issues but not those that need more information. These issues need to be triaged, e.g assigned to a user or ask for more information",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox -label:\"needs more info\"",
"editable": true
},
{
"kind": 1,
"language": "markdown",
@@ -18,9 +6,9 @@
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "New issues or pull requests submitted by the community are initially triaged by an [automatic classification bot](https://github.com/microsoft/vscode-github-triage-actions/tree/master/classifier-deep). Issues that the bot does not correctly triage are then triaged by a team member. The team rotates the inbox tracker on a weekly basis.\n\nA [mirror](https://github.com/JacksonKearl/testissues/issues) of the VS Code issue stream is available with details about how the bot classifies issues, including feature-area classifications and confidence ratings. Per-category confidence thresholds and feature-area ownership data is maintained in [.github/classifier.json](https://github.com/microsoft/vscode/blob/master/.github/classifier.json). \n\n💡 The bot is being run through a GitHub action that runs every 30 minutes. Give the bot the opportunity to classify an issue before doing it manually.\n\n### Inbox Tracking\n\nThe inbox tracker is responsible for the [global inbox](https://github.com/Microsoft/vscode/issues?utf8=%E2%9C%93&q=is%3Aopen+no%3Aassignee+-label%3Afeature-request+-label%3Atestplan-item+-label%3Aplan-item) containing all **open issues and pull requests** that\n- are neither **feature requests** nor **test plan items** nor **plan items** and\n- have **no owner assignment**.\n\nThe **inbox tracker** may perform any step described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) but its main responsibility is to route issues to the actual feature area owner.\n\nFeature area owners track the **feature area inbox** containing all **open issues and pull requests** that\n- are personally assigned to them and are not assigned to any milestone\n- are labeled with their feature area label and are not assigned to any milestone.\nThis secondary triage may involve any of the steps described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) and results in a fully triaged or closed issue.\n\nThe [github triage extension](https://github.com/microsoft/vscode-github-triage-extension) can be used to assist with triaging — it provides a \"Command Palette\"-style list of triaging actions like assignment, labeling, and triggers for various bot actions.",
"kind": 2,
"language": "github-issues",
"value": "$inbox=repo:microsoft/vscode is:open no:assignee -label:feature-request -label:testplan-item -label:plan-item ",
"editable": true
},
{
@@ -32,19 +20,31 @@
{
"kind": 1,
"language": "markdown",
"value": "New issues or pull requests submitted by the community are initially triaged by an [automatic classification bot](https://github.com/microsoft/vscode-github-triage-actions/tree/master/classifier-deep). Issues that the bot does not correctly triage are then triaged by a team member. The team rotates the inbox tracker on a weekly basis.\n\nA [mirror](https://github.com/JacksonKearl/testissues/issues) of the VS Code issue stream is available with details about how the bot classifies issues, including feature-area classifications and confidence ratings. Per-category confidence thresholds and feature-area ownership data is maintained in [.github/classifier.json](https://github.com/microsoft/vscode/blob/master/.github/classifier.json). \n\n💡 The bot is being run through a GitHub action that runs every 30 minutes. Give the bot the opportunity to classify an issue before doing it manually.\n\n### Inbox Tracking\n\nThe inbox tracker is responsible for the [global inbox](https://github.com/microsoft/vscode/issues?utf8=%E2%9C%93&q=is%3Aopen+no%3Aassignee+-label%3Afeature-request+-label%3Atestplan-item+-label%3Aplan-item) containing all **open issues and pull requests** that\n- are neither **feature requests** nor **test plan items** nor **plan items** and\n- have **no owner assignment**.\n\nThe **inbox tracker** may perform any step described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) but its main responsibility is to route issues to the actual feature area owner.\n\nFeature area owners track the **feature area inbox** containing all **open issues and pull requests** that\n- are personally assigned to them and are not assigned to any milestone\n- are labeled with their feature area label and are not assigned to any milestone.\nThis secondary triage may involve any of the steps described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) and results in a fully triaged or closed issue.\n\nThe [github triage extension](https://github.com/microsoft/vscode-github-triage-extension) can be used to assist with triaging — it provides a \"Command Palette\"-style list of triaging actions like assignment, labeling, and triggers for various bot actions.",
"value": "New issues or pull requests submitted by the community are initially triaged by an [automatic classification bot](https://github.com/microsoft/vscode-github-triage-actions/tree/master/classifier-deep). Issues that the bot does not correctly triage are then triaged by a team member. The team rotates the inbox tracker on a weekly basis.\n\nA [mirror](https://github.com/JacksonKearl/testissues/issues) of the VS Code issue stream is available with details about how the bot classifies issues, including feature-area classifications and confidence ratings. Per-category confidence thresholds and feature-area ownership data is maintained in [.github/classifier.json](https://github.com/microsoft/vscode/blob/master/.github/classifier.json). \n\n💡 The bot is being run through a GitHub action that runs every 30 minutes. Give the bot the opportunity to classify an issue before doing it manually.\n\n### Inbox Tracking\n\nThe inbox tracker is responsible for the [global inbox](https://github.com/Microsoft/vscode/issues?utf8=%E2%9C%93&q=is%3Aopen+no%3Aassignee+-label%3Afeature-request+-label%3Atestplan-item+-label%3Aplan-item) containing all **open issues and pull requests** that\n- are neither **feature requests** nor **test plan items** nor **plan items** and\n- have **no owner assignment**.\n\nThe **inbox tracker** may perform any step described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) but its main responsibility is to route issues to the actual feature area owner.\n\nFeature area owners track the **feature area inbox** containing all **open issues and pull requests** that\n- are personally assigned to them and are not assigned to any milestone\n- are labeled with their feature area label and are not assigned to any milestone.\nThis secondary triage may involve any of the steps described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) and results in a fully triaged or closed issue.\n\nThe [github triage extension](https://github.com/microsoft/vscode-github-triage-extension) can be used to assist with triaging — it provides a \"Command Palette\"-style list of triaging actions like assignment, labeling, and triggers for various bot actions.",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## All Inbox Items\n\nAll issues that have no assignee and that have neither **feature requests** nor **test plan items** nor **plan items**.",
"value": "## Triage Inbox\n\nAll inbox issues but not those that need more information. These issues need to be triaged, e.g assigned to a user or ask for more information",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox",
"value": "$inbox -label:\"needs more info\" -label:emmet",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Inbox\n\nAll issues that have no assignee and that have neither **feature requests** nor **test plan items** nor **plan items**.",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox -label:emmet",
"editable": true
}
]

View File

@@ -1,206 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "#### Macros",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server\n\n$MILESTONE=milestone:\"November 2020\"\n\n$MINE=assignee:@me",
"editable": false
},
{
"kind": 1,
"language": "markdown",
"value": "# Preparation",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Pull Requests on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:pr is:open",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues on the Milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Feature Requests Missing Labels",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:open author:@me label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request label:verification-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Testing",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:testplan-item",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -assignee:@me -label:verified label:feature-request label:verification-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Fixing",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Bugs",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Verification",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## My Issues (verification-steps-needed)",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug label:verification-steps-needed",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## My Issues (verification-found)",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Issues filed by me",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed author:@me sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Issues filed by others",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -author:@me sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "# Release Notes",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode $MILESTONE is:issue is:closed label:feature-request -label:on-release-notes",
"editable": true
}
]
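The first query cell above defines `$REPOS`, `$MILESTONE` and `$MINE` as macros that later cells splice into their searches (`$REPOS $MILESTONE $MINE is:pr is:open`). A rough sketch of that expansion, purely illustrative and not the vscode-github-issue-notebooks extension's actual implementation:

```ts
// Illustrative only: approximate macro expansion for $NAME=value definitions.
function expandMacros(query: string, macroCell: string): string {
	const macros = new Map<string, string>();
	for (const line of macroCell.split('\n')) {
		const match = line.match(/^\$([A-Za-z_]+)=(.*)$/);
		if (match) {
			macros.set('$' + match[1], match[2].trim());
		}
	}
	let result = query;
	for (const [name, value] of macros) {
		result = result.split(name).join(value);
	}
	return result;
}

// Example: the placeholders are replaced by the definitions from the macro cell.
const macroCell = '$MILESTONE=milestone:"November 2020"\n$MINE=assignee:@me';
console.log(expandMacros('$MILESTONE $MINE is:pr is:open', macroCell));
// -> milestone:"November 2020" assignee:@me is:pr is:open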

View File

@@ -8,7 +8,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-internalbacklog\n\n// current milestone name\n$milestone=milestone:\"November 2020\"",
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks\n\n// current milestone name\n$milestone=milestone:\"August 2020\"",
"editable": true
},
{

View File

@@ -14,7 +14,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"October 2020\"",
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"July 2020\"",
"editable": true
},
{
@@ -38,7 +38,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate -author:@me -assignee:@me label:bug -label:verified -author:@me -author:aeschli -author:alexdima -author:alexr00 -author:bpasero -author:chrisdias -author:chrmarti -author:connor4312 -author:dbaeumer -author:deepak1556 -author:eamodio -author:egamma -author:gregvanl -author:isidorn -author:JacksonKearl -author:joaomoreno -author:jrieken -author:lramos15 -author:lszomoru -author:meganrogge -author:misolori -author:mjbvz -author:rebornix -author:RMacfarlane -author:roblourens -author:sana-ajani -author:sandy081 -author:sbatten -author:Tyriar -author:weinand",
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate -author:@me -assignee:@me label:bug -label:verified -author:@me -author:aeschli -author:alexdima -author:alexr00 -author:bpasero -author:chrisdias -author:chrmarti -author:connor4312 -author:dbaeumer -author:deepak1556 -author:eamodio -author:egamma -author:gregvanl -author:isidorn -author:JacksonKearl -author:joaomoreno -author:jrieken -author:lramos15 -author:lszomoru -author:misolori -author:mjbvz -author:rebornix -author:RMacfarlane -author:roblourens -author:sana-ajani -author:sandy081 -author:sbatten -author:Tyriar -author:weinand",
"editable": false
},
{

View File

@@ -1,101 +0,0 @@
# Query: .innerHTML =
# Flags: CaseSensitive WordMatch
# Including: src/vs/**/*.{t,j}s
# Excluding: *.test.ts, **/test/**
# ContextLines: 3
12 results - 9 files
src/vs/base/browser/dom.ts:
1359 );
1360
1361 const html = _ttpSafeInnerHtml?.createHTML(value, options) ?? insane(value, options);
1362: node.innerHTML = html as unknown as string;
1363 }
src/vs/base/browser/markdownRenderer.ts:
272 };
273
274 if (_ttpInsane) {
275: element.innerHTML = _ttpInsane.createHTML(renderedMarkdown, insaneOptions) as unknown as string;
276 } else {
277: element.innerHTML = insane(renderedMarkdown, insaneOptions);
278 }
279
280 // signal that async code blocks can be now be inserted
src/vs/editor/browser/core/markdownRenderer.ts:
88
89 const element = document.createElement('span');
90
91: element.innerHTML = MarkdownRenderer._ttpTokenizer
92 ? MarkdownRenderer._ttpTokenizer.createHTML(value, tokenization) as unknown as string
93 : tokenizeToString(value, tokenization);
94
src/vs/editor/browser/view/domLineBreaksComputer.ts:
107 allCharOffsets[i] = tmp[0];
108 allVisibleColumns[i] = tmp[1];
109 }
110: containerDomNode.innerHTML = sb.build();
111
112 containerDomNode.style.position = 'absolute';
113 containerDomNode.style.top = '10000';
src/vs/editor/browser/view/viewLayer.ts:
512 }
513 const lastChild = <HTMLElement>this.domNode.lastChild;
514 if (domNodeIsEmpty || !lastChild) {
515: this.domNode.innerHTML = newLinesHTML;
516 } else {
517 lastChild.insertAdjacentHTML('afterend', newLinesHTML);
518 }
533 if (ViewLayerRenderer._ttPolicy) {
534 invalidLinesHTML = ViewLayerRenderer._ttPolicy.createHTML(invalidLinesHTML) as unknown as string;
535 }
536: hugeDomNode.innerHTML = invalidLinesHTML;
537
538 for (let i = 0; i < ctx.linesLength; i++) {
539 const line = ctx.lines[i];
src/vs/editor/browser/widget/diffEditorWidget.ts:
2157
2158 let domNode = document.createElement('div');
2159 domNode.className = `view-lines line-delete ${MOUSE_CURSOR_TEXT_CSS_CLASS_NAME}`;
2160: domNode.innerHTML = sb.build();
2161 Configuration.applyFontInfoSlow(domNode, fontInfo);
2162
2163 let marginDomNode = document.createElement('div');
2164 marginDomNode.className = 'inline-deleted-margin-view-zone';
2165: marginDomNode.innerHTML = marginHTML.join('');
2166 Configuration.applyFontInfoSlow(marginDomNode, fontInfo);
2167
2168 return {
src/vs/editor/standalone/browser/colorizer.ts:
40 let text = domNode.firstChild ? domNode.firstChild.nodeValue : '';
41 domNode.className += ' ' + theme;
42 let render = (str: string) => {
43: domNode.innerHTML = str;
44 };
45 return this.colorize(modeService, text || '', mimeType, options).then(render, (err) => console.error(err));
46 }
src/vs/workbench/contrib/notebook/browser/view/renderers/cellRenderer.ts:
580 const element = DOM.$('div', { style });
581
582 const linesHtml = this.getRichTextLinesAsHtml(model, modelRange, colorMap);
583: element.innerHTML = linesHtml as unknown as string;
584 return element;
585 }
586
src/vs/workbench/contrib/notebook/browser/view/renderers/webviewPreloads.ts:
375 addMouseoverListeners(outputNode, outputId);
376 const content = data.content;
377 if (content.type === RenderOutputType.Html) {
378: outputNode.innerHTML = content.htmlContent;
379 cellOutputContainer.appendChild(outputNode);
380 domEval(outputNode);
381 } else if (preloadErrs.some(e => !!e)) {
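Every hit in the search above funnels an HTML string into `innerHTML`, usually behind a Trusted Types policy (`_ttpSafeInnerHtml`, `_ttpInsane`, `_ttPolicy`) with the `insane` sanitizer as a fallback. A minimal sketch of that pattern, assuming a hypothetical `sanitize()` helper in place of `insane` and an `as any` escape hatch because older DOM typings omit `trustedTypes`:

```ts
// Sketch only: sanitize() stands in for the sanitizer used in the results above.
declare function sanitize(html: string): string;

// Create the policy once; browsers without Trusted Types return undefined here.
const ttPolicy = (window as any).trustedTypes?.createPolicy('safeInnerHtml', {
	createHTML: (value: string): string => sanitize(value),
});

export function setSafeInnerHtml(node: HTMLElement, value: string): void {
	// Prefer a policy-created TrustedHTML value when enforcement is on,
	// otherwise fall back to the plain sanitized string.
	const html = ttPolicy?.createHTML(value) ?? sanitize(value);
	node.innerHTML = html as unknown as string;
}
```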

73
.vscode/searches/es6.code-search vendored Normal file
View File

@@ -0,0 +1,73 @@
# Query: @deprecated ES6
# Flags: CaseSensitive WordMatch
# ContextLines: 2
14 results - 4 files
src/vs/base/browser/dom.ts:
81 };
82
83: /** @deprecated ES6 - use classList*/
84 export const hasClass: (node: HTMLElement | SVGElement, className: string) => boolean = _classList.hasClass.bind(_classList);
85: /** @deprecated ES6 - use classList*/
86 export const addClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.addClass.bind(_classList);
87: /** @deprecated ES6 - use classList*/
88 export const addClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.addClasses.bind(_classList);
89: /** @deprecated ES6 - use classList*/
90 export const removeClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.removeClass.bind(_classList);
91: /** @deprecated ES6 - use classList*/
92 export const removeClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.removeClasses.bind(_classList);
93: /** @deprecated ES6 - use classList*/
94 export const toggleClass: (node: HTMLElement | SVGElement, className: string, shouldHaveIt?: boolean) => void = _classList.toggleClass.bind(_classList);
95
src/vs/base/common/arrays.ts:
401
402 /**
403: * @deprecated ES6: use `Array.findIndex`
404 */
405 export function firstIndex<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean): number {
417
418 /**
419: * @deprecated ES6: use `Array.find`
420 */
421 export function first<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean, notFoundValue: T): T;
568
569 /**
570: * @deprecated ES6: use `Array.find`
571 */
572 export function find<T>(arr: ArrayLike<T>, predicate: (value: T, index: number, arr: ArrayLike<T>) => any): T | undefined {
src/vs/base/common/objects.ts:
115
116 /**
117: * @deprecated ES6
118 */
119 export function assign<T>(destination: T): T;
src/vs/base/common/strings.ts:
15
16 /**
17: * @deprecated ES6: use `String.padStart`
18 */
19 export function pad(n: number, l: number, char: string = '0'): string {
146
147 /**
148: * @deprecated ES6: use `String.startsWith`
149 */
150 export function startsWith(haystack: string, needle: string): boolean {
167
168 /**
169: * @deprecated ES6: use `String.endsWith`
170 */
171 export function endsWith(haystack: string, needle: string): boolean {
861
862 /**
863: * @deprecated ES6
864 */
865 export function repeat(s: string, count: number): string {
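Each `@deprecated ES6` marker above points at a native replacement for one of the helpers. A short illustrative sketch of the mappings (the values and class names here are examples, not code from the repo):

```ts
const arr = [1, 2, 3, 4];

// arrays.firstIndex / arrays.first  ->  Array.prototype.findIndex / find
const idx = arr.findIndex(n => n > 2);        // 2
const firstBig = arr.find(n => n > 2);        // 3

// strings.pad                        ->  String.prototype.padStart
const padded = String(7).padStart(3, '0');    // '007'

// strings.startsWith / endsWith      ->  native String methods
const ok = 'haystack'.startsWith('hay') && 'haystack'.endsWith('stack');

// strings.repeat                     ->  String.prototype.repeat
const rule = '-'.repeat(10);

// dom.addClass / removeClass / toggleClass  ->  Element.classList
document.body.classList.add('dark');
document.body.classList.toggle('compact', false);

// objects.assign                     ->  Object.assign
const merged = Object.assign({}, { a: 1 }, { b: 2 });

console.log(idx, firstBig, padded, ok, rule, merged);
```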

View File

@@ -2,52 +2,18 @@
# Flags: RegExp
# ContextLines: 2
8 results - 4 files
2 results - 2 files
src/vs/base/browser/ui/tree/asyncDataTree.ts:
241 } : () => 'treeitem',
242 isChecked: options.accessibilityProvider!.isChecked ? (e) => {
243: return !!(options.accessibilityProvider?.isChecked!(e.element as T));
244 } : undefined,
245 getAriaLabel(e) {
243 } : () => 'treeitem',
244 isChecked: options.accessibilityProvider!.isChecked ? (e) => {
245: return !!(options.accessibilityProvider?.isChecked!(e.element as T));
246 } : undefined,
247 getAriaLabel(e) {
src/vs/platform/list/browser/listService.ts:
463
464 if (typeof options?.openOnSingleClick !== 'boolean' && options?.configurationService) {
465: this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
466 this._register(options?.configurationService.onDidChangeConfiguration(() => {
467: this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
468 }));
469 } else {
src/vs/workbench/contrib/notebook/browser/notebookEditorWidget.ts:
1526
1527 await this._ensureActiveKernel();
1528: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, undefined);
1529 }
1530
1535
1536 await this._ensureActiveKernel();
1537: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, undefined);
1538 }
1539
1553
1554 await this._ensureActiveKernel();
1555: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1556 }
1557
1567
1568 await this._ensureActiveKernel();
1569: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1570 }
1571
src/vs/workbench/contrib/webview/electron-browser/iframeWebviewElement.ts:
89 .then(() => this._resourceRequestManager.ensureReady())
90 .then(() => {
91: this.element?.contentWindow!.postMessage({ channel, args: data }, '*');
92 });
93 }
src/vs/workbench/contrib/debug/browser/debugConfigurationManager.ts:
254
255 return debugDynamicExtensions.map(e => {
256: const type = e.contributes?.debuggers![0].type!;
257 return {
258 label: this.getDebuggerLabel(type)!,

View File

@@ -73,9 +73,6 @@
},
"gulp.autoDetect": "off",
"files.insertFinalNewline": true,
"[plaintext]": {
"files.insertFinalNewline": false,
},
"[typescript]": {
"editor.defaultFormatter": "vscode.typescript-language-features"
},

View File

@@ -1,3 +1,3 @@
disturl "https://electronjs.org/headers"
target "9.4.3"
disturl "https://atom.io/download/electron"
target "9.2.1"
runtime "electron"

View File

@@ -1,130 +1,5 @@
# Change Log
## Version 1.27.0
* Release date: March 17, 2021
* Release status: General Availability
* New Notebook Features:
* Added create book dialog
* Extension Updates:
* Import
* Dacpac
* Machine Learning
* SQL Assessment
* Arc
* SQL Database Projects
* ASDE Deployment
* Bug Fixes
## Version 1.26.1
* Release date: February 25, 2021
* Release status: General Availability
* Fixes https://github.com/microsoft/azuredatastudio/issues/14382
## Version 1.26.0
* Release date: February 22, 2021
* Release status: General Availability
* Added edit Jupyter book UI support
* Improved Jupyter server start-up time by 50% on Windows
* Extension Updates:
* Azure Arc
* PG dashboard enhancements
* Multi-controller support
* MIAA Dashboard will no longer prompt for SQL Server connection immediately upon opening
* Azure Data CLI
* Kusto
* Machine Learning
* Profiler
* Server Reports
* Schema Compare
* SQL Server Dacpac
* SQL Database Projects
* Bug Fixes
## Version 1.25.3
* Release date: February 10, 2021
* Release status: General Availability
* Update Electron to 9.4.3 to incorporate critical upstream fixes
## Version 1.25.2
* Release date: January 22, 2021
* Release status: General Availability
* Fixes https://github.com/microsoft/azuredatastudio/issues/13899
## Version 1.25.1
* Release date: December 10, 2020
* Release status: General Availability
* Fixes https://github.com/microsoft/azuredatastudio/issues/13751
## Version 1.25.0
* Release date: December 8, 2020
* Release status: General Availability
* Kusto extension improvements
* SQL Project extension improvements
* Notebook improvements
* Azure Browse Connections Preview performance improvements
* Bug Fixes
## Version 1.24.0
* Release date: November 12, 2020
* Release status: General Availability
* SQL Project improvements
* Notebook improvements, including in WYSIWYG editor enhancements
* Azure Arc improvements
* Azure SQL Deployment UX improvements
* Azure Browse Connections Preview
* Bug Fixes
## Version 1.23.0
* Release date: October 14, 2020
* Release status: General Availability
* Added deployments of Azure SQL DB and VM
* Added PowerShell kernel results streaming support
* Added improvements to SQL Database Projects extension
* Bug Fixes
* Extension Updates:
* SQL Server Import
* Machine Learning
* Schema Compare
* Kusto
* SQL Assessment
* SQL Database Projects
* Azure Arc
* azdata
## Version 1.22.1
* Release date: September 30, 2020
* Release status: General Availability
* Fix bug #12615 Active connection filter doesn't untoggle | [#12615](https://github.com/microsoft/azuredatastudio/issues/12615)
* Fix bug #12572 Edit Data grid doesn't escape special characters | [#12572](https://github.com/microsoft/azuredatastudio/issues/12572)
* Fix bug #12570 Dashboard Explorer table doesn't escape special characters | [#12570](https://github.com/microsoft/azuredatastudio/issues/12570)
* Fix bug #12582 Delete row on Edit Data fails | [#12582](https://github.com/microsoft/azuredatastudio/issues/12582)
* Fix bug #12646 SQL Notebooks: Cells being treated isolated | [#12646](https://github.com/microsoft/azuredatastudio/issues/12646)
## Version 1.22.0
* Release date: September 22, 2020
* Release status: General Availability
* New Notebook Features
* Supports a brand-new text cell editing experience based on rich text formatting and seamless conversion to markdown, also known as the WYSIWYG (What You See Is What You Get) toolbar
* Supports Kusto kernel
* Supports pinning of notebooks
* Added support for new version of Jupyter Books
* Improved Jupyter Shortcuts
* Introduced perf loading improvements
* Added Azure Arc extension - Users can try out Azure Arc public preview through Azure Data Studio. This includes:
* Deploy data controller
* Deploy Postgres
* Deploy Managed Instance for Azure Arc
* Connect to data controller
* Access data service dashboards
* Azure Arc Jupyter Book
* Added new deployment options
* Azure SQL Database Edge
* (Edge will require Azure SQL Edge Deployment Extension)
* Added SQL Database Projects extension - The SQL Database Projects extension brings project-based database development to Azure Data Studio. In this preview release, SQL projects can be created and published from Azure Data Studio.
* Added Kusto (KQL) extension - Brings native Kusto experiences in Azure Data Studio for data exploration and data analytics against massive amounts of real-time streaming data stored in Azure Data Explorer. This preview release supports connecting to and browsing Azure Data Explorer clusters, writing KQL queries, as well as authoring notebooks with the Kusto kernel.
* SQL Server Import extension GA - Announcing the GA of the SQL Server Import extension; its features are no longer in preview. This extension facilitates importing csv/txt files. Learn more about the extension in [this article](sql-server-import-extension.md).
* Resolved [bugs and issues](https://github.com/microsoft/azuredatastudio/issues?q=is%3Aissue+milestone%3A%22September+2020+Release%22+is%3Aclosed).
## Version 1.21.0
* Release date: August 12, 2020
* Release status: General Availability

View File

@@ -19,7 +19,7 @@ Azure Data Studio is a data management tool that enables you to work with SQL Se
| [Linux DEB][linux-deb] |
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
Go to our [download page](https://aka.ms/azuredatastudio) for more specific instructions.
## Try out the latest insiders build from `main`:
- [Windows User Installer - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-user/insider)
@@ -29,8 +29,6 @@ Go to our [download page](https://aka.ms/getazuredatastudio) for more specific i
- [Linux TAR.GZ - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/linux-x64/insider)
See the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) for additional details of what's in this release.
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
## **Feature Highlights**
@@ -131,10 +129,10 @@ Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the [Source EULA](LICENSE.txt).
[win-user]: https://go.microsoft.com/fwlink/?linkid=2157460
[win-system]: https://go.microsoft.com/fwlink/?linkid=2157459
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2157458
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2157456
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2157353
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2157248
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2157352
[win-user]: https://go.microsoft.com/fwlink/?linkid=2138608
[win-system]: https://go.microsoft.com/fwlink/?linkid=2138704
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2138705
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2138609
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2138706
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2138507
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2138508

View File

@@ -12,7 +12,7 @@ expressly granted herein, whether by implication, estoppel or otherwise.
angular2-grid: https://github.com/BTMorton/angular2-grid
angular2-slickgrid: https://github.com/Microsoft/angular2-slickgrid
applicationinsights: https://github.com/Microsoft/ApplicationInsights-node.js
axios: https://github.com/axios/axios
axios: https://github.com/axios/axios
bootstrap: https://github.com/twbs/bootstrap
chart.js: https://github.com/Timer/chartjs
chokidar: https://github.com/paulmillr/chokidar
@@ -39,7 +39,7 @@ expressly granted herein, whether by implication, estoppel or otherwise.
jschardet: https://github.com/aadsm/jschardet
jupyter-powershell: https://github.com/vors/jupyter-powershell
JupyterLab: https://github.com/jupyterlab/jupyterlab
keytar: https://github.com/atom/node-keytar
keytar: https://github.com/atom/node-keytar
make-error: https://github.com/JsCommunity/make-error
mark.js: https://github.com/julmot/mark.js
minimist: https://github.com/substack/minimist
@@ -54,8 +54,7 @@ expressly granted herein, whether by implication, estoppel or otherwise.
primeng: https://github.com/primefaces/primeng
process-nextick-args: https://github.com/calvinmetcalf/process-nextick-args
pty.js: https://github.com/chjj/pty.js
pyzmq: https://github.com/zeromq/pyzmq
qs: https://github.com/ljharb/qs
qs: https://github.com/ljharb/qs
reflect-metadata: https://github.com/rbuckton/reflect-metadata
request: https://github.com/request/request
rxjs: https://github.com/ReactiveX/RxJS
@@ -1527,6 +1526,30 @@ END OF primeng NOTICES AND INFORMATION
%% process-nextick-args NOTICES AND INFORMATION BEGIN HERE
=========================================
# Copyright (c) 2015 Calvin Metcalf
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
**THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.**
=========================================
END OF process-nextick-args NOTICES AND INFORMATION
%% pty.js NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) 2012-2015, Christopher Jeffrey (https://github.com/chjj/)
Permission is hereby granted, free of charge, to any person obtaining a copy
@@ -1549,40 +1572,6 @@ THE SOFTWARE.
=========================================
END OF pty.js NOTICES AND INFORMATION
%% PyZMQ NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) 2009-2012, Brian Granger, Min Ragan-Kelley
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of PyZMQ nor the names of its contributors may be used to
endorse or promote products derived from this software without specific prior
written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
=========================================
END OF pyzmq NOTICES AND INFORMATION
%% reflect-metadata NOTICES AND INFORMATION BEGIN HERE
=========================================
Apache License

View File

@@ -1,3 +0,0 @@
* text eol=lf
*.exe binary
*.dll binary

View File

@@ -8,10 +8,6 @@
**/LICENSE
**/CONTRIBUTORS
**/docs/**
**/example/**
**/examples/**
jschardet/index.js
jschardet/src/**
jschardet/dist/jschardet.js

View File

@@ -15,9 +15,9 @@
"keywords": [],
"author": "",
"dependencies": {
"@actions/core": "^1.2.6",
"@actions/core": "^1.2.3",
"@actions/github": "^2.1.1",
"axios": "^0.21.1",
"axios": "^0.19.2",
"ts-node": "^8.6.2",
"typescript": "^3.8.3"
}

View File

@@ -2,10 +2,10 @@
# yarn lockfile v1
"@actions/core@^1.2.6":
version "1.2.6"
resolved "https://registry.yarnpkg.com/@actions/core/-/core-1.2.6.tgz#a78d49f41a4def18e88ce47c2cac615d5694bf09"
integrity sha512-ZQYitnqiyBc3D+k7LsgSBmMDVkOVidaagDG7j3fOym77jNunWRuYx7VSHa9GNfFZh+zh61xsCjRj4JxMZlDqTA==
"@actions/core@^1.2.3":
version "1.2.3"
resolved "https://registry.yarnpkg.com/@actions/core/-/core-1.2.3.tgz#e844b4fa0820e206075445079130868f95bfca95"
integrity sha512-Wp4xnyokakM45Uuj4WLUxdsa8fJjKVl1fDTsPbTEcTcuu0Nb26IPQbOtjmnfaCPGcaoPOOqId8H9NapZ8gii4w==
"@actions/github@^2.1.1":
version "2.1.1"
@@ -144,12 +144,12 @@ atob-lite@^2.0.0:
resolved "https://registry.yarnpkg.com/atob-lite/-/atob-lite-2.0.0.tgz#0fef5ad46f1bd7a8502c65727f0367d5ee43d696"
integrity sha1-D+9a1G8b16hQLGVyfwNn1e5D1pY=
axios@^0.21.1:
version "0.21.1"
resolved "https://registry.yarnpkg.com/axios/-/axios-0.21.1.tgz#22563481962f4d6bde9a76d516ef0e5d3c09b2b8"
integrity sha512-dKQiRHxGD9PPRIUNIWvZhPTPpl1rf/OxTYKsqKUDjBwYylTvV7SjSHJb9ratfyzM6wCdLCOYLzs73qpg5c4iGA==
axios@^0.19.2:
version "0.19.2"
resolved "https://registry.yarnpkg.com/axios/-/axios-0.19.2.tgz#3ea36c5d8818d0d5f8a8a97a6d36b86cdc00cb27"
integrity sha512-fjgm5MvRHLhx+osE2xoekY70AhARk3a6hkN+3Io1jc00jtquGvxYlKlsFUhmUET0V5te6CcZI7lcv2Ym61mjHA==
dependencies:
follow-redirects "^1.10.0"
follow-redirects "1.5.10"
before-after-hook@^2.0.0:
version "2.1.0"
@@ -177,6 +177,13 @@ cross-spawn@^6.0.0:
shebang-command "^1.2.0"
which "^1.2.9"
debug@=3.1.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/debug/-/debug-3.1.0.tgz#5bb5a0672628b64149566ba16819e61518c67261"
integrity sha512-OX8XqP7/1a9cqkxYw2yXss15f26NKWBpDXQd0/uK/KPqdQhxbPa994hnzjcE2VqQpDslf55723cKPUOGSmMY3g==
dependencies:
ms "2.0.0"
deprecation@^2.0.0, deprecation@^2.3.1:
version "2.3.1"
resolved "https://registry.yarnpkg.com/deprecation/-/deprecation-2.3.1.tgz#6368cbdb40abf3373b525ac87e4a260c3a700919"
@@ -207,10 +214,12 @@ execa@^1.0.0:
signal-exit "^3.0.0"
strip-eof "^1.0.0"
follow-redirects@^1.10.0:
version "1.13.1"
resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.13.1.tgz#5f69b813376cee4fd0474a3aba835df04ab763b7"
integrity sha512-SSG5xmZh1mkPGyKzjZP8zLjltIfpW32Y5QpdNJyjcfGxK3qo3NDDkZOZSFiGn1A6SclQxY9GzEwAHQ3dmYRWpg==
follow-redirects@1.5.10:
version "1.5.10"
resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.5.10.tgz#7b7a9f9aea2fdff36786a94ff643ed07f4ff5e2a"
integrity sha512-0V5l4Cizzvqt5D44aTXbFZz+FtyXV1vrDN6qrelxtfYQKW0KO0W2T/hkE8xvGa/540LkZlkaUjO4ailYTFtHVQ==
dependencies:
debug "=3.1.0"
get-stream@^4.0.0:
version "4.1.0"
@@ -266,6 +275,11 @@ make-error@^1.1.1:
resolved "https://registry.yarnpkg.com/make-error/-/make-error-1.3.6.tgz#2eb2e37ea9b67c4891f684a1394799af484cf7a2"
integrity sha512-s8UhlNe7vPKomQhC1qFelMokr/Sc3AgNbso3n74mVPA5LTZwkB9NlXf4XPamLxJE8h0gh73rM94xvwRT2CVInw==
ms@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/ms/-/ms-2.0.0.tgz#5608aeadfc00be6c2901df5f9861788de0d597c8"
integrity sha1-VgiurfwAvmwpAd9fmGF4jeDVl8g=
nice-try@^1.0.4:
version "1.0.5"
resolved "https://registry.yarnpkg.com/nice-try/-/nice-try-1.0.5.tgz#a3378a7696ce7d223e88fc9b764bd7ef1089e366"

View File

@@ -24,7 +24,7 @@ const files = [
];
async function main() {
return new Promise<void>((resolve, reject) => {
return new Promise((resolve, reject) => {
const stream = vfs.src(files, { base: '.build', allowEmpty: true })
.pipe(es.through(file => {
const filePath = path.join(process.env.BUILD_ARTIFACTSTAGINGDIRECTORY!,

View File

@@ -53,7 +53,7 @@ async function uploadBlob(blobService: azure.BlobService, quality: string, blobN
}
};
await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c()));
await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c()));
}
function getEnv(name: string): string {


@@ -17,7 +17,7 @@ const fileNames = [
];
async function assertContainer(blobService: azure.BlobService, container: string): Promise<void> {
await new Promise<void>((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
await new Promise((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}
async function doesBlobExist(blobService: azure.BlobService, container: string, blobName: string): Promise<boolean | undefined> {
@@ -33,7 +33,7 @@ async function uploadBlob(blobService: azure.BlobService, container: string, blo
}
};
await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c()));
await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c()));
}
async function publish(commit: string, files: readonly string[]): Promise<void> {
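
For context: the change repeated across the files above toggles the explicit type argument on the Promise constructor (new Promise(...) versus new Promise<void>(...)). A plausible motivation, not stated in the diff itself, is the TypeScript 4.1+ strictness rule that resolve() may only be called with no value when the promise's type parameter includes void. A minimal, self-contained sketch of that rule (illustrative only, not code from this repo):

    // Illustration of the <void> annotation: resolve() is called with no argument,
    // so the promise type must include void, either via the explicit type argument
    // below or via a Promise<void> return-type annotation that supplies it contextually.
    function delay(ms: number): Promise<void> {
        return new Promise<void>((resolve) => {
            setTimeout(() => resolve(), ms);
        });
    }

    async function main(): Promise<void> {
        await delay(100); // waits ~100ms, then continues
    }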


@@ -43,7 +43,6 @@ function createDefaultConfig(quality: string): Config {
}
function getConfig(quality: string): Promise<Config> {
console.log(`Getting config for quality ${quality}`);
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/config';
const query = {
@@ -53,13 +52,13 @@ function getConfig(quality: string): Promise<Config> {
]
};
return retry(() => new Promise<Config>((c, e) => {
return new Promise<Config>((c, e) => {
client.queryDocuments(collection, query, { enableCrossPartitionQuery: true }).toArray((err, results) => {
if (err && err.code !== 409) { return e(err); }
c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0] as any as Config);
});
}));
});
}
interface Asset {
@@ -87,7 +86,6 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
updateTries++;
return new Promise<void>((c, e) => {
console.log(`Querying existing documents to update...`);
client.queryDocuments(collection, updateQuery, { enableCrossPartitionQuery: true }).toArray((err, results) => {
if (err) { return e(err); }
if (results.length !== 1) { return e(new Error('No documents')); }
@@ -103,7 +101,6 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
release.updates[platform] = type;
}
console.log(`Replacing existing document with updated version`);
client.replaceDocument(release._self, release, err => {
if (err && err.code === 409 && updateTries < 5) { return c(update()); }
if (err) { return e(err); }
@@ -115,8 +112,7 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
});
}
return retry(() => new Promise<void>((c, e) => {
console.log(`Attempting to create document`);
return new Promise<void>((c, e) => {
client.createDocument(collection, release, err => {
if (err && err.code === 409) { return c(update()); }
if (err) { return e(err); }
@@ -124,11 +120,11 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
console.log('Build successfully published.');
c();
});
}));
});
}
async function assertContainer(blobService: azure.BlobService, quality: string): Promise<void> {
await new Promise<void>((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
await new Promise((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}
async function doesAssetExist(blobService: azure.BlobService, quality: string, blobName: string): Promise<boolean | undefined> {
@@ -144,7 +140,7 @@ async function uploadBlob(blobService: azure.BlobService, quality: string, blobN
}
};
await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c()));
await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c()));
}
interface PublishOptions {
@@ -192,6 +188,7 @@ async function publish(commit: string, quality: string, platform: string, type:
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
console.log('Uploading blobs to Azure storage...');
await uploadBlob(blobService, quality, blobName, file);
@@ -250,22 +247,6 @@ async function publish(commit: string, quality: string, platform: string, type:
await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
}
const RETRY_TIMES = 10;
async function retry<T>(fn: () => Promise<T>): Promise<T> {
for (let run = 1; run <= RETRY_TIMES; run++) {
try {
return await fn();
} catch (err) {
if (!/ECONNRESET/.test(err.message)) {
throw err;
}
console.log(`Caught error ${err} - ${run}/${RETRY_TIMES}`);
}
}
throw new Error('Retried too many times');
}
function main(): void {
const commit = process.env['BUILD_SOURCEVERSION'];
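
For reference, the retry helper shown in this file's hunks wraps the DocumentDB calls so that transient connection resets are retried while every other error fails fast. The sketch below reconstructs that pattern from the hunk above; the usage line and the fetchConfig name are hypothetical placeholders, not functions from the repo.

    const RETRY_TIMES = 10;

    // Retry only on connection resets (ECONNRESET); any other error is rethrown immediately.
    async function retry<T>(fn: () => Promise<T>): Promise<T> {
        for (let run = 1; run <= RETRY_TIMES; run++) {
            try {
                return await fn();
            } catch (err) {
                if (!/ECONNRESET/.test((err as Error).message)) {
                    throw err; // non-transient failure: give up right away
                }
                console.log(`Caught error ${err} - ${run}/${RETRY_TIMES}`);
            }
        }
        throw new Error('Retried too many times');
    }

    // Hypothetical usage: wrap a flaky network call so the whole operation is retried.
    // const config = await retry(() => fetchConfig(quality));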


@@ -1,26 +1,24 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3 # {{SQL CARBON EDIT}} update version
inputs:
versionSpec: "1.x"
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules # {{SQL CARBON EDIT}}
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache
- script: |
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install Dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install Dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules # {{SQL CARBON EDIT}}
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -35,21 +33,21 @@ steps:
# yarn monaco-compile-check
# displayName: Run Monaco Editor Checks
- script: |
yarn valid-layers-check
displayName: Run Valid Layers Checks
- script: |
yarn valid-layers-check
displayName: Run Valid Layers Checks
- script: |
yarn compile
displayName: Compile Sources
- script: |
yarn compile
displayName: Compile Sources
# - script: | {{SQL CARBON EDIT}} remove step
# yarn download-builtin-extensions
# displayName: Download Built-in Extensions
- script: |
./scripts/test.sh --tfs "Unit Tests"
displayName: Run Unit Tests (Electron)
- script: |
./scripts/test.sh --tfs "Unit Tests"
displayName: Run Unit Tests (Electron)
# - script: | {{SQL CARBON EDIT}} disable
# yarn test-browser --browser chromium --browser webkit --browser firefox --tfs "Browser Unit Tests"
@@ -59,17 +57,17 @@ steps:
# ./scripts/test-integration.sh --tfs "Integration Tests"
# displayName: Run Integration Tests (Electron)
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-macos
targetPath: .build/crashes
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-macos
targetPath: .build/crashes
displayName: 'Publish Crash Reports'
continueOnError: true
condition: failed()
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: succeededOrFailed()
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()


@@ -1,337 +1,269 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin
displayName: Prepare compilation cache flags
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'npm-vscode'
platformIndependent: true
alias: 'Compilation'
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- script: |
set -e
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
sudo xcode-select -s /Applications/Xcode_12.2.app
displayName: Switch to Xcode 12
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
- script: |
npx https://aka.ms/enablesecurefeed standAlone
displayName: Switch to Terrapin packages
timeoutInMinutes: 5
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
- script: |
set -e
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
echo -n $(VSCODE_ARCH) > .build/arch
displayName: Prepare yarn cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
npm install -g node-gyp@7.1.0
node-gyp --version
displayName: Update node-gyp
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
export npm_config_arch=$(VSCODE_ARCH)
export npm_config_node_gyp=$(which node-gyp)
export SDKROOT=/Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.0.sdk
export CHILD_CONCURRENCY="1"
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-darwin-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-darwin-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-darwin-min-ci
displayName: Build
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn test-browser --build --browser chromium --browser webkit --browser firefox --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
export npm_config_arch=$(VSCODE_ARCH)
export npm_config_node_gyp=$(which node-gyp)
export SDKROOT=/Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.0.sdk
ls /Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
export npm_config_arch=$(VSCODE_ARCH)
export npm_config_node_gyp=$(which node-gyp)
export npm_config_build_from_source=true
export SDKROOT=/Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.0.sdk
ls /Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/
yarn electron-rebuild
cd ./node_modules/keytar
node-gyp rebuild
displayName: Rebuild native modules for ARM64
condition: eq(variables['VSCODE_ARCH'], 'arm64')
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
./resources/server/test/test-web-integration.sh --browser webkit
displayName: Run integration tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
./resources/server/test/test-remote-integration.sh
displayName: Run remote integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-darwin-$(VSCODE_ARCH)-min-ci
displayName: Build
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`"
yarn smoketest --build "$APP_ROOT/$APP_NAME"
continueOnError: true
displayName: Run smoke tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-darwin-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-darwin-min-ci
displayName: Build reh
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
yarn smoketest --web --headless
continueOnError: true
displayName: Run smoke tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
yarn electron $(VSCODE_ARCH)
displayName: Download Electron
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-macos
targetPath: .build/crashes
displayName: 'Publish Crash Reports'
continueOnError: true
condition: failed()
- script: |
set -e
security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
VSCODE_ARCH="$(VSCODE_ARCH)" DEBUG=electron-osx-sign* node build/darwin/sign.js
displayName: Set Hardened Entitlements
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()
- script: |
set -e
./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
DEBUG=electron-osx-sign* node build/darwin/sign.js
displayName: Set Hardened Entitlements
- script: |
set -e
yarn test-browser --build --browser chromium --browser webkit --browser firefox --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
pushd $(agent.builddirectory)/VSCode-darwin && zip -r -X -y $(agent.builddirectory)/VSCode-darwin.zip * && popd
displayName: Archive build
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '$(agent.builddirectory)'
Pattern: 'VSCode-darwin.zip'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppDeveloperSign",
"parameters": [
{
"parameterName": "Hardening",
"parameterValue": "--options=runtime"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Codesign
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
./resources/server/test/test-web-integration.sh --browser webkit
displayName: Run integration tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
zip -d $(agent.builddirectory)/VSCode-darwin.zip "*.pkg"
displayName: Clean Archive
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
./resources/server/test/test-remote-integration.sh
displayName: Run remote integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`"
BUNDLE_IDENTIFIER=$(node -p "require(\"$APP_ROOT/$APP_NAME/Contents/Resources/app/product.json\").darwinBundleIdentifier")
echo "##vso[task.setvariable variable=BundleIdentifier]$BUNDLE_IDENTIFIER"
displayName: Export bundle identifier
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
yarn smoketest --build "$APP_ROOT/$APP_NAME"
continueOnError: true
displayName: Run smoke tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '$(agent.builddirectory)'
Pattern: 'VSCode-darwin.zip'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppNotarize",
"parameters": [
{
"parameterName": "BundleId",
"parameterValue": "$(BundleIdentifier)"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Notarization
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
yarn smoketest --web --headless
continueOnError: true
displayName: Run smoke tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`"
"$APP_ROOT/$APP_NAME/Contents/Resources/app/bin/code" --export-default-configuration=.build
displayName: Verify start after signing (export configuration)
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-macos-$(VSCODE_ARCH)
targetPath: .build/crashes
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
./build/azure-pipelines/darwin/publish.sh
displayName: Publish
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: succeededOrFailed()
- script: |
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
yarn gulp upload-vscode-configuration
displayName: Upload configuration (for Bing settings search)
continueOnError: true
- script: |
set -e
pushd $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH) && zip -r -X -y $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip * && popd
displayName: Archive build
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: "ESRP CodeSign"
FolderPath: "$(agent.builddirectory)"
Pattern: "VSCode-darwin-$(VSCODE_ARCH).zip"
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppDeveloperSign",
"parameters": [
{
"parameterName": "Hardening",
"parameterValue": "--options=runtime"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Codesign
- script: |
zip -d $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip "*.pkg"
displayName: Clean Archive
- script: |
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
BUNDLE_IDENTIFIER=$(node -p "require(\"$APP_ROOT/$APP_NAME/Contents/Resources/app/product.json\").darwinBundleIdentifier")
echo "##vso[task.setvariable variable=BundleIdentifier]$BUNDLE_IDENTIFIER"
displayName: Export bundle identifier
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: "ESRP CodeSign"
FolderPath: "$(agent.builddirectory)"
Pattern: "VSCode-darwin-$(VSCODE_ARCH).zip"
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppNotarize",
"parameters": [
{
"parameterName": "BundleId",
"parameterValue": "$(BundleIdentifier)"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 60
displayName: Notarization
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
APP_NAME="`ls $APP_ROOT | head -n 1`"
"$APP_ROOT/$APP_NAME/Contents/Resources/app/bin/code" --export-default-configuration=.build
displayName: Verify start after signing (export configuration)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_ARCH="$(VSCODE_ARCH)" \
./build/azure-pipelines/darwin/publish.sh
displayName: Publish
- script: |
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
VSCODE_ARCH="$(VSCODE_ARCH)" \
yarn gulp upload-vscode-configuration
displayName: Upload configuration (for Bing settings search)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
continueOnError: true
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
continueOnError: true
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true


@@ -1,27 +1,19 @@
#!/usr/bin/env bash
set -e
# Publish DEB
case $VSCODE_ARCH in
x64) ASSET_ID="darwin" ;;
arm64) ASSET_ID="darwin-arm64" ;;
esac
# publish the build
node build/azure-pipelines/common/createAsset.js \
"$ASSET_ID" \
darwin \
archive \
"VSCode-$ASSET_ID.zip" \
../VSCode-darwin-$VSCODE_ARCH.zip
"VSCode-darwin-$VSCODE_QUALITY.zip" \
../VSCode-darwin.zip
if [ "$VSCODE_ARCH" == "x64" ]; then
# package Remote Extension Host
pushd .. && mv vscode-reh-darwin vscode-server-darwin && zip -Xry vscode-server-darwin.zip vscode-server-darwin && popd
# package Remote Extension Host
pushd .. && mv vscode-reh-darwin vscode-server-darwin && zip -Xry vscode-server-darwin.zip vscode-server-darwin && popd
# publish Remote Extension Host
node build/azure-pipelines/common/createAsset.js \
server-darwin \
archive-unsigned \
"vscode-server-darwin.zip" \
../vscode-server-darwin.zip
fi
# publish Remote Extension Host
node build/azure-pipelines/common/createAsset.js \
server-darwin \
archive-unsigned \
"vscode-server-darwin.zip" \
../vscode-server-darwin.zip


@@ -12,7 +12,6 @@ steps:
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Compiled Files
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
@@ -50,7 +49,7 @@ steps:
password $(github-distro-mixin-password)
EOF
git config user.email "sqltools@service.microsoft.com"
git config user.email "andresse@microsoft.com"
git config user.name "AzureDataStudio"
displayName: Prepare tooling
@@ -62,7 +61,6 @@ steps:
displayName: Merge distro
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -77,7 +75,6 @@ steps:
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -98,7 +95,9 @@ steps:
- script: |
set -e
yarn gulp package-rebuild-extensions
yarn gulp vscode-darwin-x64-min-ci
yarn gulp vscode-darwin-min-ci
yarn gulp vscode-reh-darwin-min-ci
yarn gulp vscode-reh-web-darwin-min-ci
displayName: Build
env:
VSCODE_MIXIN_PASSWORD: $(github-distro-mixin-password)
@@ -114,7 +113,7 @@ steps:
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin-x64
APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-darwin" \
@@ -124,25 +123,25 @@ steps:
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin-x64
APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`"
yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots" --log "$(build.artifactstagingdirectory)/logs/darwin/smoke.log"
yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots"
displayName: Run smoke tests (Electron)
continueOnError: true
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
# - script: |
# set -e
# node ./node_modules/playwright/install.js
# VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-web-darwin" \
# yarn smoketest --web --headless --screenshots "$(build.artifactstagingdirectory)/smokeshots"
# displayName: Run smoke tests (Browser)
# continueOnError: true
# condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e
node ./node_modules/playwright/install.js
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-web-darwin" \
yarn smoketest --web --headless --screenshots "$(build.artifactstagingdirectory)/smokeshots"
displayName: Run smoke tests (Browser)
continueOnError: true
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e
pushd ../azuredatastudio-darwin-x64
pushd ../azuredatastudio-darwin
ls
echo "Cleaning the application"
@@ -172,7 +171,7 @@ steps:
- script: |
set -e
mkdir -p .build/darwin/archive
pushd ../azuredatastudio-darwin-x64
pushd ../azuredatastudio-darwin
ditto -c -k --keepParent *.app $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin.zip
popd
displayName: 'Archive (no signing)'
@@ -181,7 +180,7 @@ steps:
- script: |
set -e
mkdir -p .build/darwin/archive
pushd ../azuredatastudio-darwin-x64
pushd ../azuredatastudio-darwin
ditto -c -k --keepParent *.app $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin-unsigned.zip
popd
displayName: 'Archive'
@@ -202,7 +201,7 @@ steps:
testResultsFiles: 'test-results.xml'
searchFolder: '$(Build.SourcesDirectory)'
continueOnError: true
condition: and(succeededOrFailed(), eq(variables['RUN_TESTS'], 'true'))
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- task: PublishCodeCoverageResults@1
displayName: 'Publish code coverage from $(Build.SourcesDirectory)/.build/coverage/cobertura-coverage.xml'


@@ -9,9 +9,9 @@ pr:
include: ['main', 'release/*']
steps:
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
@@ -19,8 +19,8 @@ steps:
azureSubscription: 'azuredatastudio-adointegration'
KeyVaultName: ado-secrets
- script: |
set -e
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
@@ -28,7 +28,7 @@ steps:
password $(github-distro-mixin-password)
EOF
git config user.email "sqltools@service.microsoft.com"
git config user.email "andresse@microsoft.com"
git config user.name "AzureDataStudio"
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
@@ -37,9 +37,9 @@ steps:
# Push main branch into oss/master
git push distro origin/main:refs/heads/oss/master
# Push every release branch into oss/release
git for-each-ref --format="%(refname:short)" refs/remotes/origin/release/* | sed 's/^origin\/\(.*\)$/\0:refs\/heads\/oss\/\1/' | xargs git push distro
# Push every release branch into oss/release
git for-each-ref --format="%(refname:short)" refs/remotes/origin/release/* | sed 's/^origin\/\(.*\)$/\0:refs\/heads\/oss\/\1/' | xargs git push distro
git merge $(node -p "require('./package.json').distro")
git merge $(node -p "require('./package.json').distro")
displayName: Sync & Merge Distro
displayName: Sync & Merge Distro


@@ -22,7 +22,7 @@ steps:
password $(github-distro-mixin-password)
EOF
git config user.email "sqltools@service.microsoft.com"
git config user.email "andresse@microsoft.com"
git config user.name "AzureDataStudio"
displayName: Prepare tooling
@@ -34,7 +34,6 @@ steps:
displayName: Merge distro
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -49,7 +48,6 @@ steps:
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'


@@ -1,40 +1,40 @@
pool:
vmImage: "Ubuntu-16.04"
vmImage: 'Ubuntu-16.04'
trigger:
branches:
include: ["main"]
include: ['main']
pr:
branches:
include: ["main"]
include: ['main']
steps:
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- script: |
set -e
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
git checkout origin/electron-11.x.y
git merge origin/master
git checkout origin/electron-x.y.z
git merge origin/master
# Push master branch into exploration branch
git push origin HEAD:electron-11.x.y
# Push master branch into exploration branch
git push origin HEAD:electron-x.y.z
displayName: Sync & Merge Exploration
displayName: Sync & Merge Exploration


@@ -1,5 +0,0 @@
#!/usr/bin/env bash
set -e
echo "Installing remote dependencies"
(cd remote && rm -rf node_modules && yarn)


@@ -1,28 +0,0 @@
#!/usr/bin/env bash
set -e
REPO="$(pwd)"
ROOT="$REPO/.."
PLATFORM_LINUX="linux-alpine"
# Publish Remote Extension Host
LEGACY_SERVER_BUILD_NAME="vscode-reh-$PLATFORM_LINUX"
SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX"
SERVER_TARBALL_FILENAME="vscode-server-$PLATFORM_LINUX.tar.gz"
SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
rm -rf $ROOT/vscode-server-*.tar.*
(cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
# Publish Remote Extension Host (Web)
LEGACY_SERVER_BUILD_NAME="vscode-reh-web-$PLATFORM_LINUX"
SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX-web"
SERVER_TARBALL_FILENAME="vscode-server-$PLATFORM_LINUX-web.tar.gz"
SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
rm -rf $ROOT/vscode-server-*.tar.*
(cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX-web" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"


@@ -1,99 +1,97 @@
steps:
- script: |
set -e
sudo apt-get update
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libkrb5-dev #{{SQL CARBON EDIT}} add kerberos dep
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
- script: |
set -e
sudo apt-get update
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libkrb5-dev #{{SQL CARBON EDIT}} add kerberos dep
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3
inputs:
versionSpec: "1.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3
inputs:
versionSpec: "1.x"
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules # {{SQL CARBON EDIT}}
inputs:
keyfile: "build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules"
vstsFeed: "npm-cache" # {{SQL CARBON EDIT}} update build cache
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache
- script: |
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install Dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install Dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules # {{SQL CARBON EDIT}}
inputs:
keyfile: "build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules"
vstsFeed: "npm-cache" # {{SQL CARBON EDIT}} update build cache
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
yarn electron x64
displayName: Download Electron
- script: |
yarn electron x64
displayName: Download Electron
- script: |
yarn gulp hygiene
displayName: Run Hygiene Checks
- script: |
yarn gulp hygiene
displayName: Run Hygiene Checks
- script: | # {{SQL CARBON EDIT}} add strict null check
yarn strict-vscode
displayName: Run Strict Null Check
- script: | # {{SQL CARBON EDIT}} add strict null check
yarn strict-vscode
displayName: Run Strict Null Check
# - script: | {{SQL CARBON EDIT}} remove monaco editor checks
# yarn monaco-compile-check
# displayName: Run Monaco Editor Checks
# - script: | {{SQL CARBON EDIT}} remove monaco editor checks
# yarn monaco-compile-check
# displayName: Run Monaco Editor Checks
- script: |
yarn valid-layers-check
displayName: Run Valid Layers Checks
- script: |
yarn valid-layers-check
displayName: Run Valid Layers Checks
- script: |
yarn compile
displayName: Compile Sources
- script: |
yarn compile
displayName: Compile Sources
# - script: | {{SQL CARBON EDIT}} remove step
# yarn download-builtin-extensions
# displayName: Download Built-in Extensions
# - script: | {{SQL CARBON EDIT}} remove step
# yarn download-builtin-extensions
# displayName: Download Built-in Extensions
- script: |
DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests"
displayName: Run Unit Tests (Electron)
- script: |
DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests"
displayName: Run Unit Tests (Electron)
# - script: | {{SQL CARBON EDIT}} disable
# DISPLAY=:10 yarn test-browser --browser chromium --tfs "Browser Unit Tests"
# displayName: Run Unit Tests (Browser)
# - script: | {{SQL CARBON EDIT}} disable
# DISPLAY=:10 yarn test-browser --browser chromium --tfs "Browser Unit Tests"
# displayName: Run Unit Tests (Browser)
# - script: | {{SQL CARBON EDIT}} disable
# DISPLAY=:10 ./scripts/test-integration.sh --tfs "Integration Tests"
# displayName: Run Integration Tests (Electron)
# - script: | {{SQL CARBON EDIT}} disable
# DISPLAY=:10 ./scripts/test-integration.sh --tfs "Integration Tests"
# displayName: Run Integration Tests (Electron)
# - task: PublishPipelineArtifact@0
# inputs:
# artifactName: crash-dump-linux
# targetPath: .build/crashes
# displayName: 'Publish Crash Reports'
# condition: succeededOrFailed()
# - task: PublishPipelineArtifact@0
# inputs:
# artifactName: crash-dump-linux
# targetPath: .build/crashes
# displayName: 'Publish Crash Reports'
# condition: succeededOrFailed()
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-linux
targetPath: .build/crashes
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-linux
targetPath: .build/crashes
displayName: 'Publish Crash Reports'
continueOnError: true
condition: failed()
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: succeededOrFailed()
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -0,0 +1,3 @@
#!/usr/bin/env bash
set -e
echo 'noop'


@@ -1,135 +0,0 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin
displayName: Prepare compilation cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: Docker@1
displayName: "Pull image"
inputs:
azureSubscriptionEndpoint: "vscode-builds-subscription"
azureContainerRegistry: vscodehub.azurecr.io
command: "Run an image"
imageName: "vscode-linux-build-agent:alpine"
containerCommand: uname
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
npx https://aka.ms/enablesecurefeed standAlone
displayName: Switch to Terrapin packages
timeoutInMinutes: 5
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
- script: |
echo -n "alpine" > .build/arch
displayName: Prepare yarn cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
- script: |
set -e
export CHILD_CONCURRENCY="1"
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
docker run -e VSCODE_QUALITY -e CHILD_CONCURRENCY=1 -v $(pwd):/root/vscode -v ~/.netrc:/root/.netrc vscodehub.azurecr.io/vscode-linux-build-agent:alpine /root/vscode/build/azure-pipelines/linux/alpine/install-dependencies.sh
displayName: Prebuild
- script: |
set -e
yarn gulp vscode-reh-linux-alpine-min-ci
yarn gulp vscode-reh-web-linux-alpine-min-ci
displayName: Build
- script: |
set -e
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
./build/azure-pipelines/linux/alpine/publish.sh
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
continueOnError: true


@@ -0,0 +1,115 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'npm-vscode'
platformIndependent: true
alias: 'Compilation'
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- task: Docker@1
displayName: 'Pull image'
inputs:
azureSubscriptionEndpoint: 'vscode-builds-subscription'
azureContainerRegistry: vscodehub.azurecr.io
command: 'Run an image'
imageName: 'vscode-linux-build-agent:$(VSCODE_ARCH)'
containerCommand: uname
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
- script: |
set -e
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
CHILD_CONCURRENCY=1 ./build/azure-pipelines/linux/multiarch/$(VSCODE_ARCH)/prebuild.sh
displayName: Prebuild
- script: |
set -e
./build/azure-pipelines/linux/multiarch/$(VSCODE_ARCH)/build.sh
displayName: Build
- script: |
set -e
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
./build/azure-pipelines/linux/multiarch/$(VSCODE_ARCH)/publish.sh
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true


@@ -1,231 +1,200 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin
displayName: Prepare compilation cache flags
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'npm-vscode'
platformIndependent: true
alias: 'Compilation'
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
npx https://aka.ms/enablesecurefeed standAlone
displayName: Switch to Terrapin packages
timeoutInMinutes: 5
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
- script: |
echo -n $(VSCODE_ARCH) > .build/arch
displayName: Prepare yarn cache flags
- script: |
set -e
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
export npm_config_arch=$(NPM_ARCH)
export CHILD_CONCURRENCY="1"
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-linux-x64-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-linux-x64-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-linux-x64-min-ci
displayName: Build
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
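# xvfb provides a virtual X display (consumed as DISPLAY=:10 by the test steps below)
# so Electron-based tests can run on a headless agent.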
service xvfb start
displayName: Start xvfb
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-linux-$(VSCODE_ARCH)-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-linux-$(VSCODE_ARCH)-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-linux-$(VSCODE_ARCH)-min-ci
displayName: Build
- script: |
set -e
DISPLAY=:10 ./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
service xvfb start
displayName: Start xvfb
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
DISPLAY=:10 yarn test-browser --build --browser chromium --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
DISPLAY=:10 ./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-x64" \
DISPLAY=:10 ./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
DISPLAY=:10 yarn test-browser --build --browser chromium --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-x64" \
DISPLAY=:10 ./resources/server/test/test-web-integration.sh --browser chromium
displayName: Run integration tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
DISPLAY=:10 ./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-x64" \
DISPLAY=:10 ./resources/server/test/test-remote-integration.sh
displayName: Run remote integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
DISPLAY=:10 ./resources/server/test/test-web-integration.sh --browser chromium
displayName: Run integration tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-linux
targetPath: .build/crashes
displayName: 'Publish Crash Reports'
continueOnError: true
condition: failed()
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
DISPLAY=:10 ./resources/server/test/test-remote-integration.sh
displayName: Run remote integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()
- task: PublishPipelineArtifact@0
inputs:
artifactName: "crash-dump-linux-$(VSCODE_ARCH)"
targetPath: .build/crashes
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
- script: |
set -e
yarn gulp "vscode-linux-x64-build-deb"
yarn gulp "vscode-linux-x64-build-rpm"
yarn gulp "vscode-linux-x64-prepare-snap"
displayName: Build packages
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: and(succeededOrFailed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '.build/linux/rpm/x86_64'
Pattern: '*.rpm'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-450779-Pgp",
"operationSetCode": "LinuxSign",
"parameters": [ ],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
displayName: Codesign rpm
- script: |
set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-deb"
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-rpm"
displayName: Build deb, rpm packages
- script: |
set -e
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
./build/azure-pipelines/linux/publish.sh
displayName: Publish
- script: |
set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-prepare-snap"
displayName: Prepare snap package
- task: PublishPipelineArtifact@0
displayName: 'Publish Pipeline Artifact'
inputs:
artifactName: snap-x64
targetPath: .build/linux/snap-tarball
# needed for code signing
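# (the ESRP CodeSigning task below is driven by the .NET-based Microsoft.ESRPClient,
# referenced in packages.config, so the SDK must be present on the agent)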
- task: UseDotNet@2
displayName: "Install .NET Core SDK 2.x"
inputs:
version: 2.x
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: "ESRP CodeSign"
FolderPath: ".build/linux/rpm"
Pattern: "*.rpm"
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-450779-Pgp",
"operationSetCode": "LinuxSign",
"parameters": [ ],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
displayName: Codesign rpm
- script: |
set -e
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
VSCODE_ARCH="$(VSCODE_ARCH)" \
./build/azure-pipelines/linux/publish.sh
displayName: Publish
- task: PublishPipelineArtifact@0
displayName: "Publish Pipeline Artifact"
inputs:
artifactName: "snap-$(VSCODE_ARCH)"
targetPath: .build/linux/snap-tarball
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
continueOnError: true
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true

@@ -4,10 +4,11 @@ REPO="$(pwd)"
ROOT="$REPO/.."
# Publish tarball
PLATFORM_LINUX="linux-$VSCODE_ARCH"
PLATFORM_LINUX="linux-x64"
BUILDNAME="VSCode-$PLATFORM_LINUX"
BUILD="$ROOT/$BUILDNAME"
BUILD_VERSION="$(date +%s)"
[ -z "$VSCODE_QUALITY" ] && TARBALL_FILENAME="code-$VSCODE_ARCH-$BUILD_VERSION.tar.gz" || TARBALL_FILENAME="code-$VSCODE_QUALITY-$VSCODE_ARCH-$BUILD_VERSION.tar.gz"
[ -z "$VSCODE_QUALITY" ] && TARBALL_FILENAME="code-$BUILD_VERSION.tar.gz" || TARBALL_FILENAME="code-$VSCODE_QUALITY-$BUILD_VERSION.tar.gz"
TARBALL_PATH="$ROOT/$TARBALL_FILENAME"
rm -rf $ROOT/code-*.tar.*
@@ -27,26 +28,16 @@ rm -rf $ROOT/vscode-server-*.tar.*
node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
# Publish DEB
case $VSCODE_ARCH in
x64) DEB_ARCH="amd64" ;;
*) DEB_ARCH="$VSCODE_ARCH" ;;
esac
PLATFORM_DEB="linux-deb-$VSCODE_ARCH"
PLATFORM_DEB="linux-deb-x64"
DEB_ARCH="amd64"
DEB_FILENAME="$(ls $REPO/.build/linux/deb/$DEB_ARCH/deb/)"
DEB_PATH="$REPO/.build/linux/deb/$DEB_ARCH/deb/$DEB_FILENAME"
node build/azure-pipelines/common/createAsset.js "$PLATFORM_DEB" package "$DEB_FILENAME" "$DEB_PATH"
# Publish RPM
case $VSCODE_ARCH in
x64) RPM_ARCH="x86_64" ;;
armhf) RPM_ARCH="armv7hl" ;;
arm64) RPM_ARCH="aarch64" ;;
*) RPM_ARCH="$VSCODE_ARCH" ;;
esac
PLATFORM_RPM="linux-rpm-$VSCODE_ARCH"
PLATFORM_RPM="linux-rpm-x64"
RPM_ARCH="x86_64"
RPM_FILENAME="$(ls $REPO/.build/linux/rpm/$RPM_ARCH/ | grep .rpm)"
RPM_PATH="$REPO/.build/linux/rpm/$RPM_ARCH/$RPM_FILENAME"
@@ -55,6 +46,6 @@ node build/azure-pipelines/common/createAsset.js "$PLATFORM_RPM" package "$RPM_F
# Publish Snap
# Pack snap tarball artifact, in order to preserve file perms
mkdir -p $REPO/.build/linux/snap-tarball
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-$VSCODE_ARCH.tar.gz"
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-x64.tar.gz"
rm -rf $SNAP_TARBALL_PATH
(cd .build/linux && tar -czf $SNAP_TARBALL_PATH snap)
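# Illustrative sketch (not part of publish.sh): the per-architecture handling added
# above reduces to two small mappings; the helper names are invented here, and any
# architecture not listed passes through unchanged.
deb_arch_for() {
  case "$1" in
    x64) echo "amd64" ;;
    *)   echo "$1" ;;
  esac
}
rpm_arch_for() {
  case "$1" in
    x64)   echo "x86_64" ;;
    armhf) echo "armv7hl" ;;
    arm64) echo "aarch64" ;;
    *)     echo "$1" ;;
  esac
}
# e.g. rpm_arch_for arm64   # -> aarch64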

@@ -1,56 +1,52 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- task: DownloadPipelineArtifact@0
displayName: "Download Pipeline Artifact"
inputs:
artifactName: snap-$(VSCODE_ARCH)
targetPath: .build/linux/snap-tarball
- task: DownloadPipelineArtifact@0
displayName: 'Download Pipeline Artifact'
inputs:
artifactName: snap-x64
targetPath: .build/linux/snap-tarball
- script: |
set -e
- script: |
set -e
# Get snapcraft version
snapcraft --version
# Get snapcraft version
snapcraft --version
# Make sure we get latest packages
sudo apt-get update
sudo apt-get upgrade -y
# Make sure we get latest packages
sudo apt-get update
sudo apt-get upgrade -y
# Define variables
REPO="$(pwd)"
SNAP_ROOT="$REPO/.build/linux/snap/$(VSCODE_ARCH)"
# Define variables
REPO="$(pwd)"
SNAP_ROOT="$REPO/.build/linux/snap/x64"
# Install build dependencies
(cd build && yarn)
# Install build dependencies
(cd build && yarn)
# Unpack snap tarball artifact, in order to preserve file perms
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-$(VSCODE_ARCH).tar.gz"
(cd .build/linux && tar -xzf $SNAP_TARBALL_PATH)
# Unpack snap tarball artifact, in order to preserve file perms
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-x64.tar.gz"
(cd .build/linux && tar -xzf $SNAP_TARBALL_PATH)
# Create snap package
BUILD_VERSION="$(date +%s)"
SNAP_FILENAME="code-$VSCODE_QUALITY-$(VSCODE_ARCH)-$BUILD_VERSION.snap"
SNAP_PATH="$SNAP_ROOT/$SNAP_FILENAME"
case $(VSCODE_ARCH) in
x64) SNAPCRAFT_TARGET_ARGS="" ;;
*) SNAPCRAFT_TARGET_ARGS="--target-arch $(VSCODE_ARCH)" ;;
esac
(cd $SNAP_ROOT/code-* && sudo --preserve-env snapcraft snap $SNAPCRAFT_TARGET_ARGS --output "$SNAP_PATH")
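# For non-x64 targets snapcraft is asked to cross-build via --target-arch; x64 builds
# natively, so no extra flag is passed.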
# Create snap package
BUILD_VERSION="$(date +%s)"
SNAP_FILENAME="code-$VSCODE_QUALITY-$BUILD_VERSION.snap"
SNAP_PATH="$SNAP_ROOT/$SNAP_FILENAME"
(cd $SNAP_ROOT/code-* && sudo --preserve-env snapcraft snap --output "$SNAP_PATH")
# Publish snap package
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
node build/azure-pipelines/common/createAsset.js "linux-snap-$(VSCODE_ARCH)" package "$SNAP_FILENAME" "$SNAP_PATH"
# Publish snap package
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
node build/azure-pipelines/common/createAsset.js "linux-snap-x64" package "$SNAP_FILENAME" "$SNAP_PATH"

@@ -9,7 +9,6 @@ steps:
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Compiled Files
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
@@ -46,7 +45,7 @@ steps:
password $(github-distro-mixin-password)
EOF
git config user.email "sqltools@service.microsoft.com"
git config user.email "andresse@microsoft.com"
git config user.name "AzureDataStudio"
displayName: Prepare tooling
@@ -58,7 +57,6 @@ steps:
displayName: Merge distro
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -73,7 +71,6 @@ steps:
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -94,7 +91,8 @@ steps:
- script: |
set -e
yarn gulp vscode-linux-x64-min-ci
yarn gulp vscode-web-min-ci
yarn gulp vscode-reh-linux-x64-min-ci
yarn gulp vscode-reh-web-linux-x64-min-ci
displayName: Build
env:
VSCODE_MIXIN_PASSWORD: $(github-distro-mixin-password)
@@ -136,8 +134,7 @@ steps:
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
export INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
export NO_CLEANUP=1
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
DISPLAY=:10 node ./scripts/test-extensions-unit.js ${{ extension }}
displayName: 'Run ${{ extension }} Stable Extension Unit Tests'
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
@@ -152,21 +149,6 @@ steps:
continueOnError: true
condition: and(succeeded(), eq(variables['RUN_UNSTABLE_TESTS'], 'true'))
- bash: |
set -e
mkdir -p $(Build.ArtifactStagingDirectory)/logs/linux-x64
cd /tmp
for folder in adsuser*/
do
folder=${folder%/}
# Only archive directories we want for debugging purposes
tar -czvf $(Build.ArtifactStagingDirectory)/logs/linux-x64/$folder.tar.gz $folder/User $folder/logs
done
displayName: Archive Logs
continueOnError: true
condition: succeededOrFailed()
- script: |
set -e
yarn gulp vscode-linux-x64-build-deb
@@ -235,11 +217,10 @@ steps:
testResultsFiles: '*.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
continueOnError: true
condition: and(succeededOrFailed(), eq(variables['RUN_TESTS'], 'true'))
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- task: PublishBuildArtifacts@1
displayName: 'Publish Artifact: drop'
condition: succeededOrFailed()
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'

@@ -2,201 +2,143 @@ trigger: none
pr: none
schedules:
- cron: "0 5 * * Mon-Fri"
displayName: Mon-Fri at 7:00
branches:
include:
- master
- cron: "0 5 * * Mon-Fri"
displayName: Mon-Fri at 7:00
branches:
include:
- master
resources:
containers:
- container: vscode-x64
image: vscodehub.azurecr.io/vscode-linux-build-agent:x64
endpoint: VSCodeHub
- container: vscode-arm64
image: vscodehub.azurecr.io/vscode-linux-build-agent:stretch-arm64
endpoint: VSCodeHub
- container: vscode-armhf
image: vscodehub.azurecr.io/vscode-linux-build-agent:stretch-armhf
endpoint: VSCodeHub
- container: snapcraft
image: snapcore/snapcraft:stable
- container: vscode-x64
image: vscodehub.azurecr.io/vscode-linux-build-agent:x64
endpoint: VSCodeHub
- container: snapcraft
image: snapcore/snapcraft:stable
stages:
- stage: Compile
jobs:
- job: Compile
pool:
vmImage: "Ubuntu-16.04"
container: vscode-x64
variables:
VSCODE_ARCH: x64
steps:
- template: product-compile.yml
- stage: Windows
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
- stage: Compile
jobs:
- job: Compile
pool:
vmImage: VS2017-Win2016
jobs:
- job: Windows
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
vmImage: 'Ubuntu-16.04'
container: vscode-x64
steps:
- template: product-compile.yml
- job: Windows32
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32_32BIT'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: ia32
steps:
- template: win32/product-build-win32.yml
- stage: Windows
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: VS2017-Win2016
jobs:
- job: Windows
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true'))
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
- job: WindowsARM64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32_ARM64'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: arm64
steps:
- template: win32/product-build-win32.yml
- job: Windows32
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32_32BIT'], 'true'))
variables:
VSCODE_ARCH: ia32
steps:
- template: win32/product-build-win32.yml
- stage: Linux
- job: WindowsARM64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32_ARM64'], 'true'))
variables:
VSCODE_ARCH: arm64
steps:
- template: win32/product-build-win32-arm64.yml
- stage: Linux
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: 'Ubuntu-16.04'
jobs:
- job: Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: vscode-x64
steps:
- template: linux/product-build-linux.yml
- job: LinuxSnap
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: "Ubuntu-16.04"
jobs:
- job: Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: vscode-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
steps:
- template: linux/product-build-linux.yml
- Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: snapcraft
steps:
- template: linux/snap-build-linux.yml
- job: LinuxSnap
dependsOn:
- Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: snapcraft
variables:
VSCODE_ARCH: x64
steps:
- template: linux/snap-build-linux.yml
- job: LinuxArmhf
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARMHF'], 'true'))
variables:
VSCODE_ARCH: armhf
steps:
- template: linux/product-build-linux-multiarch.yml
- job: LinuxArmhf
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARMHF'], 'true'))
container: vscode-armhf
variables:
VSCODE_ARCH: armhf
NPM_ARCH: armv7l
steps:
- template: linux/product-build-linux.yml
- job: LinuxArm64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARM64'], 'true'))
variables:
VSCODE_ARCH: arm64
steps:
- template: linux/product-build-linux-multiarch.yml
- job: LinuxSnapArmhf
dependsOn:
- LinuxArmhf
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARMHF'], 'true'))
container: snapcraft
variables:
VSCODE_ARCH: armhf
steps:
- template: linux/snap-build-linux.yml
- job: LinuxAlpine
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ALPINE'], 'true'))
variables:
VSCODE_ARCH: alpine
steps:
- template: linux/product-build-linux-multiarch.yml
- job: LinuxArm64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARM64'], 'true'))
container: vscode-arm64
variables:
VSCODE_ARCH: arm64
NPM_ARCH: arm64
steps:
- template: linux/product-build-linux.yml
- job: LinuxWeb
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'))
variables:
VSCODE_ARCH: x64
steps:
- template: web/product-build-web.yml
- job: LinuxSnapArm64
dependsOn:
- LinuxArm64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARM64'], 'true'))
container: snapcraft
variables:
VSCODE_ARCH: arm64
steps:
- template: linux/snap-build-linux.yml
- stage: macOS
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: macOS-latest
jobs:
- job: macOS
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'))
steps:
- template: darwin/product-build-darwin.yml
- job: LinuxAlpine
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ALPINE'], 'true'))
steps:
- template: linux/product-build-alpine.yml
- stage: Mooncake
dependsOn:
- Windows
- Linux
- macOS
condition: and(succeededOrFailed(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: 'Ubuntu-16.04'
jobs:
- job: SyncMooncake
displayName: Sync Mooncake
steps:
- template: sync-mooncake.yml
- job: LinuxWeb
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'))
variables:
VSCODE_ARCH: x64
steps:
- template: web/product-build-web.yml
- stage: macOS
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: macOS-latest
jobs:
- job: macOS
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: x64
steps:
- template: darwin/product-build-darwin.yml
- stage: macOSARM64
dependsOn:
- Compile
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: macOS-latest
jobs:
- job: macOSARM64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS_ARM64'], 'true'))
timeoutInMinutes: 90
variables:
VSCODE_ARCH: arm64
steps:
- template: darwin/product-build-darwin.yml
- stage: Mooncake
dependsOn:
- Windows
- Linux
- macOS
- macOSARM64
condition: and(succeededOrFailed(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'))
pool:
vmImage: "Ubuntu-16.04"
jobs:
- job: SyncMooncake
displayName: Sync Mooncake
steps:
- template: sync-mooncake.yml
- stage: Publish
dependsOn:
- Windows
- Linux
- macOS
- macOSARM64
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'), or(eq(variables['VSCODE_RELEASE'], 'true'), and(or(eq(variables['VSCODE_QUALITY'], 'insider'), eq(variables['VSCODE_QUALITY'], 'exploration')), eq(variables['Build.Reason'], 'Schedule'))))
pool:
vmImage: "Ubuntu-16.04"
jobs:
- job: BuildService
displayName: Build Service
steps:
- template: release.yml
- stage: Publish
dependsOn:
- Windows
- Linux
- macOS
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'), or(eq(variables['VSCODE_RELEASE'], 'true'), and(or(eq(variables['VSCODE_QUALITY'], 'insider'), eq(variables['VSCODE_QUALITY'], 'exploration')), eq(variables['Build.Reason'], 'Schedule'))))
pool:
vmImage: 'Ubuntu-16.04'
jobs:
- job: BuildService
displayName: Build Service
steps:
- template: release.yml

@@ -1,161 +1,142 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin
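# These flag files (plus build/.cachesalt) feed the keyfile of the Restore/SaveCache
# tasks below, so compiled output is only reused for the same commit, quality and
# Terrapin setting.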
displayName: Prepare compilation cache flag
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
dryRun: true
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'npm-vscode'
platformIndependent: true
alias: 'Compilation'
dryRun: true
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
npx https://aka.ms/enablesecurefeed standAlone
displayName: Switch to Terrapin packages
timeoutInMinutes: 5
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
echo -n $(VSCODE_ARCH) > .build/arch
displayName: Prepare yarn cache flags
- script: |
set -e
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
export CHILD_CONCURRENCY="1"
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), eq(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), ne(variables['CacheRestored'], 'true'))
# Mixin must run before optimize, because the CSS loader will
# inline small SVGs
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn gulp hygiene
yarn monaco-compile-check
yarn valid-layers-check
displayName: Run hygiene, monaco compile & valid layers checks
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
# Mixin must run before optimize, because the CSS loader will
# inline small SVGs
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
./build/azure-pipelines/common/extract-telemetry.sh
displayName: Extract Telemetry
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
yarn gulp hygiene
yarn monaco-compile-check
yarn valid-layers-check
displayName: Run hygiene, monaco compile & valid layers checks
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
AZURE_WEBVIEW_STORAGE_ACCESS_KEY="$(vscode-webview-storage-key)" \
./build/azure-pipelines/common/publish-webview.sh
displayName: Publish Webview
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
./build/azure-pipelines/common/extract-telemetry.sh
displayName: Extract Telemetry
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
yarn gulp compile-build
yarn gulp compile-extensions-build
yarn gulp minify-vscode
yarn gulp minify-vscode-reh
yarn gulp minify-vscode-reh-web
displayName: Compile
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
AZURE_WEBVIEW_STORAGE_ACCESS_KEY="$(vscode-webview-storage-key)" \
./build/azure-pipelines/common/publish-webview.sh
displayName: Publish Webview
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
node build/azure-pipelines/upload-sourcemaps
displayName: Upload sourcemaps
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
yarn gulp compile-build
yarn gulp compile-extensions-build
yarn gulp minify-vscode
yarn gulp vscode-reh-linux-x64-min
yarn gulp vscode-reh-web-linux-x64-min
displayName: Compile
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
VERSION=`node -p "require(\"./package.json\").version"`
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
node build/azure-pipelines/common/createBuild.js $VERSION
displayName: Create build
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
node build/azure-pipelines/upload-sourcemaps
displayName: Upload sourcemaps
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e
VERSION=`node -p "require(\"./package.json\").version"`
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
node build/azure-pipelines/common/createBuild.js $VERSION
displayName: Create build
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'npm-vscode'
platformIndependent: true
alias: 'Compilation'
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))

@@ -2,82 +2,82 @@
trigger:
branches:
include: ["refs/tags/*"]
include: ['refs/tags/*']
pr: none
steps:
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- bash: |
TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
CHANNEL="G1C14HJ2F"
- bash: |
TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
CHANNEL="G1C14HJ2F"
if [ "$TAG_VERSION" == "1.999.0" ]; then
MESSAGE="<!here>. Someone pushed 1.999.0 tag. Please delete it ASAP from remote and local."
curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-H 'Content-type: application/json; charset=utf-8' \
--data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE"'"}' \
https://slack.com/api/chat.postMessage
exit 1
fi
displayName: Check 1.999.0 tag
- bash: |
# Install build dependencies
(cd build && yarn)
node build/azure-pipelines/publish-types/check-version.js
displayName: Check version
- bash: |
git config --global user.email "vscode@microsoft.com"
git config --global user.name "VSCode"
git clone https://$(GITHUB_TOKEN)@github.com/DefinitelyTyped/DefinitelyTyped.git --depth=1
node build/azure-pipelines/publish-types/update-types.js
TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
cd DefinitelyTyped
git diff --color | cat
git add -A
git status
git checkout -b "vscode-types-$TAG_VERSION"
git commit -m "VS Code $TAG_VERSION Extension API"
git push origin "vscode-types-$TAG_VERSION"
displayName: Push update to DefinitelyTyped
- bash: |
TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
CHANNEL="G1C14HJ2F"
MESSAGE="DefinitelyTyped/DefinitelyTyped#vscode-types-$TAG_VERSION created. Endgame master, please open this link, examine changes and create a PR:"
LINK="https://github.com/DefinitelyTyped/DefinitelyTyped/compare/vscode-types-$TAG_VERSION?quick_pull=1&body=Updating%20VS%20Code%20Extension%20API.%20See%20https%3A%2F%2Fgithub.com%2Fmicrosoft%2Fvscode%2Fissues%2F70175%20for%20details."
MESSAGE2="[@eamodio, @jrieken, @kmaetzel, @egamma]. Please review and merge PR to publish @types/vscode."
if [ "$TAG_VERSION" == "1.999.0" ]; then
MESSAGE="<!here>. Someone pushed 1.999.0 tag. Please delete it ASAP from remote and local."
curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-H 'Content-type: application/json; charset=utf-8' \
--data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE"'"}' \
https://slack.com/api/chat.postMessage
curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-H 'Content-type: application/json; charset=utf-8' \
--data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$LINK"'"}' \
https://slack.com/api/chat.postMessage
exit 1
fi
displayName: Check 1.999.0 tag
curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-H 'Content-type: application/json; charset=utf-8' \
--data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE2"'"}' \
https://slack.com/api/chat.postMessage
- bash: |
# Install build dependencies
(cd build && yarn)
node build/azure-pipelines/publish-types/check-version.js
displayName: Check version
displayName: Send message on Slack
- bash: |
git config --global user.email "vscode@microsoft.com"
git config --global user.name "VSCode"
git clone https://$(GITHUB_TOKEN)@github.com/DefinitelyTyped/DefinitelyTyped.git --depth=1
node build/azure-pipelines/publish-types/update-types.js
TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
cd DefinitelyTyped
git diff --color | cat
git add -A
git status
git checkout -b "vscode-types-$TAG_VERSION"
git commit -m "VS Code $TAG_VERSION Extension API"
git push origin "vscode-types-$TAG_VERSION"
displayName: Push update to DefinitelyTyped
- bash: |
TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
CHANNEL="G1C14HJ2F"
MESSAGE="DefinitelyTyped/DefinitelyTyped#vscode-types-$TAG_VERSION created. Endgame master, please open this link, examine changes and create a PR:"
LINK="https://github.com/DefinitelyTyped/DefinitelyTyped/compare/vscode-types-$TAG_VERSION?quick_pull=1&body=Updating%20VS%20Code%20Extension%20API.%20See%20https%3A%2F%2Fgithub.com%2Fmicrosoft%2Fvscode%2Fissues%2F70175%20for%20details."
MESSAGE2="[@eamodio, @jrieken, @kmaetzel, @egamma]. Please review and merge PR to publish @types/vscode."
curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-H 'Content-type: application/json; charset=utf-8' \
--data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE"'"}' \
https://slack.com/api/chat.postMessage
curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-H 'Content-type: application/json; charset=utf-8' \
--data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$LINK"'"}' \
https://slack.com/api/chat.postMessage
curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-H 'Content-type: application/json; charset=utf-8' \
--data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE2"'"}' \
https://slack.com/api/chat.postMessage
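# Illustrative sketch (not in the original script): the Slack notifications above all
# share one curl shape; a helper like this captures it, assuming the token is available
# as the SLACK_TOKEN environment variable.
post_to_slack() {
  local channel="$1" text="$2"
  curl -X POST -H "Authorization: Bearer $SLACK_TOKEN" \
    -H 'Content-type: application/json; charset=utf-8' \
    --data '{"channel":"'"$channel"'", "link_names": true, "text":"'"$text"'"}' \
    https://slack.com/api/chat.postMessage
}
# e.g. post_to_slack "$CHANNEL" "$MESSAGE2"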
displayName: Send message on Slack

@@ -1,22 +1,22 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.x"
- task: NodeTool@0
inputs:
versionSpec: "10.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- script: |
set -e
- script: |
set -e
(cd build ; yarn)
(cd build ; yarn)
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
node build/azure-pipelines/common/releaseBuild.js
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
node build/azure-pipelines/common/releaseBuild.js

@@ -17,7 +17,7 @@ jobs:
- template: sql-product-compile.yml
- job: macOS
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), ne(variables['VSCODE_QUALITY'], 'saw'))
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'))
pool:
vmImage: macOS-latest
dependsOn:
@@ -27,7 +27,7 @@ jobs:
timeoutInMinutes: 180
- job: macOS_Signing
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), eq(variables['signed'], true), ne(variables['VSCODE_QUALITY'], 'saw'))
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), eq(variables['signed'], true))
pool:
vmImage: macOS-latest
dependsOn:
@@ -50,7 +50,7 @@ jobs:
timeoutInMinutes: 70
- job: LinuxWeb
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'), ne(variables['VSCODE_QUALITY'], 'saw'))
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'))
pool:
vmImage: 'Ubuntu-16.04'
container: linux-x64
@@ -61,15 +61,15 @@ jobs:
steps:
- template: web/sql-product-build-web.yml
# - job: Docker
# condition: and(succeeded(), eq(variables['VSCODE_BUILD_DOCKER'], 'true'))
# pool:
# vmImage: 'Ubuntu-16.04'
# container: linux-x64
# dependsOn:
# - Linux
# steps:
# - template: docker/sql-product-build-docker.yml
- job: Docker
condition: and(succeeded(), eq(variables['VSCODE_BUILD_DOCKER'], 'true'))
pool:
vmImage: 'Ubuntu-16.04'
container: linux-x64
dependsOn:
- Linux
steps:
- template: docker/sql-product-build-docker.yml
- job: Windows
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true'))
@@ -98,7 +98,7 @@ jobs:
dependsOn:
- macOS
- Linux
# - Docker
- Docker
- Windows
- Windows_Test
- LinuxWeb

@@ -6,7 +6,6 @@ steps:
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Compiled Files
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
@@ -37,7 +36,7 @@ steps:
password $(github-distro-mixin-password)
EOF
git config user.email "sqltools@service.microsoft.com"
git config user.email "andresse@microsoft.com"
git config user.name "AzureDataStudio"
displayName: Prepare tooling
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
@@ -51,7 +50,6 @@ steps:
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -64,7 +62,6 @@ steps:
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
@@ -99,8 +96,8 @@ steps:
yarn gulp compile-build
yarn gulp compile-extensions-build
yarn gulp minify-vscode
yarn gulp vscode-reh-linux-x64-min
yarn gulp vscode-reh-web-linux-x64-min
yarn gulp minify-vscode-reh
yarn gulp minify-vscode-reh-web
displayName: Compile
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
@@ -126,7 +123,6 @@ steps:
displayName: 'Publish Artifact: drop'
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Compiled Files
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'

@@ -1,24 +1,24 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- script: |
set -e
- script: |
set -e
(cd build ; yarn)
(cd build ; yarn)
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
MOONCAKE_STORAGE_ACCESS_KEY="$(vscode-mooncake-storage-key)" \
node build/azure-pipelines/common/sync-mooncake.js "$VSCODE_QUALITY"
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
MOONCAKE_STORAGE_ACCESS_KEY="$(vscode-mooncake-storage-key)" \
node build/azure-pipelines/common/sync-mooncake.js "$VSCODE_QUALITY"

@@ -1,35 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const path = require("path");
const es = require("event-stream");
const vfs = require("vinyl-fs");
const util = require("../lib/util");
const filter = require("gulp-filter");
const gzip = require("gulp-gzip");
const azure = require('gulp-azure-storage');
const root = path.dirname(path.dirname(__dirname));
const commit = util.getVersion(root);
function main() {
return vfs.src('**', { cwd: '../vscode-web', base: '../vscode-web', dot: true })
.pipe(filter(f => !f.isDirectory()))
.pipe(gzip({ append: false }))
.pipe(es.through(function (data) {
console.log('Uploading CDN file:', data.relative); // debug
this.emit('data', data);
}))
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
container: process.env.VSCODE_QUALITY,
prefix: commit + '/',
contentSettings: {
contentEncoding: 'gzip',
cacheControl: 'max-age=31536000, public'
}
}));
}
main();

@@ -1,40 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as path from 'path';
import * as es from 'event-stream';
import * as Vinyl from 'vinyl';
import * as vfs from 'vinyl-fs';
import * as util from '../lib/util';
import * as filter from 'gulp-filter';
import * as gzip from 'gulp-gzip';
const azure = require('gulp-azure-storage');
const root = path.dirname(path.dirname(__dirname));
const commit = util.getVersion(root);
function main() {
return vfs.src('**', { cwd: '../vscode-web', base: '../vscode-web', dot: true })
.pipe(filter(f => !f.isDirectory()))
.pipe(gzip({ append: false }))
.pipe(es.through(function (data: Vinyl) {
console.log('Uploading CDN file:', data.relative); // debug
this.emit('data', data);
}))
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
container: process.env.VSCODE_QUALITY,
prefix: commit + '/',
contentSettings: {
contentEncoding: 'gzip',
cacheControl: 'max-age=31536000, public'
}
}));
}
main();

@@ -1,132 +1,106 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
echo -n $ENABLE_TERRAPIN > .build/terrapin
displayName: Prepare compilation cache flag
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'npm-vscode'
platformIndependent: true
alias: 'Compilation'
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
displayName: Prepare tooling
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
npx https://aka.ms/enablesecurefeed standAlone
displayName: Switch to Terrapin packages
timeoutInMinutes: 5
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
# - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
# inputs:
# keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: 'npm-vscode'
- script: |
echo -n "web" > .build/arch
displayName: Prepare yarn cache flag
- script: |
set -e
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install dependencies
# condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
# - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
# inputs:
# keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: 'npm-vscode'
# condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
export CHILD_CONCURRENCY="1"
for i in {1..3}; do # try 3 times, for Terrapin
yarn --frozen-lockfile && break
if [ $i -eq 3 ]; then
echo "Yarn failed too many times" >&2
exit 1
fi
echo "Yarn failed $i, trying again..."
done
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
# - script: |
# set -e
# yarn postinstall
# displayName: Run postinstall scripts
# condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
yarn postinstall
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-web-min-ci
displayName: Build
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
# upload only the workbench.web.api.js source maps because
# we just compiled these bits in the previous step and the
# general task to upload source maps has already been run
- script: |
set -e
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
node build/azure-pipelines/upload-sourcemaps out-vscode-web-min out-vscode-web-min/vs/workbench/workbench.web.api.js.map
displayName: Upload sourcemaps (Web)
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-web-min-ci
displayName: Build
- script: |
set -e
AZURE_STORAGE_ACCOUNT="$(web-storage-account)" \
AZURE_STORAGE_ACCESS_KEY="$(web-storage-key)" \
node build/azure-pipelines/upload-cdn.js
displayName: Upload to CDN
# upload only the workbench.web.api.js source maps because
# we just compiled these bits in the previous step and the
# general task to upload source maps has already been run
- script: |
set -e
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
node build/azure-pipelines/upload-sourcemaps out-vscode-web-min out-vscode-web-min/vs/workbench/workbench.web.api.js.map
displayName: Upload sourcemaps (Web)
- script: |
set -e
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
./build/azure-pipelines/web/publish.sh
displayName: Publish
- script: |
set -e
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
./build/azure-pipelines/web/publish.sh
displayName: Publish

@@ -6,7 +6,6 @@ steps:
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Compiled Files
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
@@ -43,7 +42,7 @@ steps:
password $(github-distro-mixin-password)
EOF
git config user.email "sqltools@service.microsoft.com"
git config user.email "andresse@microsoft.com"
git config user.name "AzureDataStudio"
displayName: Prepare tooling
@@ -55,7 +54,6 @@ steps:
displayName: Merge distro
# - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
# displayName: Restore Cache - Node Modules
# inputs:
# keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
@@ -68,7 +66,6 @@ steps:
# condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
# - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
# displayName: Save Cache - Node Modules
# inputs:
# keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'

@@ -1,7 +1,6 @@
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<packageSources>
<clear/>
<add key="ESRP" value="https://microsoft.pkgs.visualstudio.com/_packaging/ESRP/nuget/v3/index.json" />
</packageSources>
</configuration>
</configuration>

@@ -1,4 +1,4 @@
<?xml version="1.0" encoding="utf-8"?>
<packages>
<package id="Microsoft.ESRPClient" version="1.2.47" />
<package id="Microsoft.ESRPClient" version="1.2.25" />
</packages>

@@ -1,82 +1,80 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3 # {{SQL CARBON EDIT}} update version
inputs:
versionSpec: "1.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3 # {{SQL CARBON EDIT}} update version
inputs:
versionSpec: "1.x"
- task: UsePythonVersion@0
inputs:
versionSpec: "2.x"
addToPath: true
- task: UsePythonVersion@0
inputs:
versionSpec: '2.x'
addToPath: true
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
displayName: Restore Cache - Node Modules # {{SQL CARBON EDIT}}
inputs:
keyfile: "build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules"
vstsFeed: "npm-cache" # {{SQL CARBON EDIT}} update build cache
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache
- powershell: |
yarn --frozen-lockfile
env:
CHILD_CONCURRENCY: "1"
displayName: Install Dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- powershell: |
yarn --frozen-lockfile
env:
CHILD_CONCURRENCY: "1"
displayName: Install Dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
displayName: Save Cache - Node Modules # {{SQL CARBON EDIT}}
inputs:
keyfile: "build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules"
vstsFeed: "npm-cache" # {{SQL CARBON EDIT}} update build cache
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- powershell: |
yarn electron
displayName: Download Electron
- powershell: |
yarn electron
displayName: Download Electron
# - powershell: | {{SQL CARBON EDIT}} remove editor check
# yarn monaco-compile-check
# displayName: Run Monaco Editor Checks
# - powershell: | {{SQL CARBON EDIT}} remove editor check
# yarn monaco-compile-check
# displayName: Run Monaco Editor Checks
- script: |
yarn valid-layers-check
displayName: Run Valid Layers Checks
- script: |
yarn valid-layers-check
displayName: Run Valid Layers Checks
- powershell: |
yarn compile
displayName: Compile Sources
- powershell: |
yarn compile
displayName: Compile Sources
# - powershell: | {{SQL CARBON EDIT}} remove step
# yarn download-builtin-extensions
# displayName: Download Built-in Extensions
# - powershell: | {{SQL CARBON EDIT}} remove step
# yarn download-builtin-extensions
# displayName: Download Built-in Extensions
- powershell: |
.\scripts\test.bat --tfs "Unit Tests"
displayName: Run Unit Tests (Electron)
- powershell: |
.\scripts\test.bat --tfs "Unit Tests"
displayName: Run Unit Tests (Electron)
# - powershell: | {{SQL CARBON EDIT}} disable
# yarn test-browser --browser chromium --browser firefox --tfs "Browser Unit Tests"
# displayName: Run Unit Tests (Browser)
# - powershell: | {{SQL CARBON EDIT}} disable
# yarn test-browser --browser chromium --browser firefox --tfs "Browser Unit Tests"
# displayName: Run Unit Tests (Browser)
# - powershell: | {{SQL CARBON EDIT}} disable
# .\scripts\test-integration.bat --tfs "Integration Tests"
# displayName: Run Integration Tests (Electron)
# - powershell: | {{SQL CARBON EDIT}} disable
# .\scripts\test-integration.bat --tfs "Integration Tests"
# displayName: Run Integration Tests (Electron)
- task: PublishPipelineArtifact@0
displayName: "Publish Crash Reports"
inputs:
artifactName: crash-dump-windows
targetPath: .build\crashes
continueOnError: true
condition: failed()
- task: PublishPipelineArtifact@0
displayName: 'Publish Crash Reports'
inputs:
artifactName: crash-dump-windows
targetPath: .build\crashes
continueOnError: true
condition: failed()
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: succeededOrFailed()
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()

@@ -12,9 +12,9 @@ $ServerZipLocation = "$Repo\.build\win32-$Arch\server"
$ServerZip = "$ServerZipLocation\azuredatastudio-server-win32-$Arch.zip"
# Create server archive
# New-Item $ServerZipLocation -ItemType Directory # this can report an error even on success, so we don't wrap it in exec
New-Item $ServerZipLocation -ItemType Directory # this can report an error even on success, so we don't wrap it in exec
$global:LASTEXITCODE = 0
# exec { Rename-Item -Path $LegacyServer -NewName $ServerName } "Rename Item"
# exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r } "Zip Server"
exec { Rename-Item -Path $LegacyServer -NewName $ServerName } "Rename Item"
exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r } "Zip Server"
exec { node build/azure-pipelines/common/copyArtifacts.js } "Copy Artifacts"

@@ -1,14 +1,14 @@
param ($CertBase64)
$ErrorActionPreference = "Stop"
Param(
[string]$AuthCertificateBase64,
[string]$AuthCertificateKey
)
$CertBytes = [System.Convert]::FromBase64String($CertBase64)
$CertCollection = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2Collection
$CertCollection.Import($CertBytes, $null, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable)
# Import auth certificate
$AuthCertificateFileName = [System.IO.Path]::GetTempFileName()
$AuthCertificateBytes = [Convert]::FromBase64String($AuthCertificateBase64)
[IO.File]::WriteAllBytes($AuthCertificateFileName, $AuthCertificateBytes)
$AuthCertificate = Import-PfxCertificate -FilePath $AuthCertificateFileName -CertStoreLocation Cert:\LocalMachine\My -Password (ConvertTo-SecureString $AuthCertificateKey -AsPlainText -Force)
rm $AuthCertificateFileName
$ESRPAuthCertificateSubjectName = $AuthCertificate.Subject
$CertStore = New-Object System.Security.Cryptography.X509Certificates.X509Store("My","LocalMachine")
$CertStore.Open("ReadWrite")
$CertStore.AddRange($CertCollection)
$CertStore.Close()
$ESRPAuthCertificateSubjectName = $CertCollection[0].Subject
Write-Output ("##vso[task.setvariable variable=ESRPAuthCertificateSubjectName;]$ESRPAuthCertificateSubjectName")
Write-Output ("##vso[task.setvariable variable=ESRPAuthCertificateSubjectName;]$ESRPAuthCertificateSubjectName")
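
Note: the reworked script above takes the certificate material as two parameters, imports the PFX into Cert:\LocalMachine\My, and surfaces its subject name as the ESRPAuthCertificateSubjectName pipeline variable. For reference, the Windows build steps later in this comparison invoke it as shown below (a usage sketch only; the $(esrp-auth-certificate) and $(esrp-auth-certificate-key) values are pipeline secrets resolved from Key Vault at queue time).

    # Invocation as it appears in the product-build steps of this comparison.
    .\build\azure-pipelines\win32\import-esrp-auth-cert.ps1 `
        -AuthCertificateBase64 $(esrp-auth-certificate) `
        -AuthCertificateKey $(esrp-auth-certificate-key)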

@@ -0,0 +1,190 @@
steps:
- powershell: |
mkdir .build -ea 0
"$env:BUILD_SOURCEVERSION" | Out-File -Encoding ascii -NoNewLine .build\commit
"$env:VSCODE_QUALITY" | Out-File -Encoding ascii -NoNewLine .build\quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'npm-vscode'
platformIndependent: true
alias: 'Compilation'
- powershell: |
$ErrorActionPreference = "Stop"
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: UsePythonVersion@0
inputs:
versionSpec: '2.x'
addToPath: true
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
"machine github.com`nlogin vscode`npassword $(github-distro-mixin-password)" | Out-File "$env:USERPROFILE\_netrc" -Encoding ASCII
exec { git config user.email "vscode@microsoft.com" }
exec { git config user.name "VSCode" }
mkdir .build -ea 0
"$(VSCODE_ARCH)" | Out-File -Encoding ascii -NoNewLine .build\arch
displayName: Prepare tooling
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git" }
exec { git fetch distro }
exec { git merge $(node -p "require('./package.json').distro") }
displayName: Merge distro
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/arch, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:npm_config_arch="$(VSCODE_ARCH)"
$env:CHILD_CONCURRENCY="1"
exec { yarn --frozen-lockfile }
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .build/arch, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn postinstall }
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build/azure-pipelines/mixin }
displayName: Mix in quality
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-min-ci" }
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-code-helper" }
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-inno-updater" }
displayName: Build
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)'
Pattern: '*.dll,*.exe,*.node'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolSign",
"parameters": [
{
"parameterName": "OpusName",
"parameterValue": "VS Code"
},
{
"parameterName": "OpusInfo",
"parameterValue": "https://code.visualstudio.com/"
},
{
"parameterName": "Append",
"parameterValue": "/as"
},
{
"parameterName": "FileDigest",
"parameterValue": "/fd \"SHA256\""
},
{
"parameterName": "PageHash",
"parameterValue": "/NPH"
},
{
"parameterName": "TimeStamp",
"parameterValue": "/tr \"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer\" /td sha256"
}
],
"toolName": "sign",
"toolVersion": "1.0"
},
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolVerify",
"parameters": [
{
"parameterName": "VerifyAll",
"parameterValue": "/all"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
- task: NuGetCommand@2
displayName: Install ESRPClient.exe
inputs:
restoreSolution: 'build\azure-pipelines\win32\ESRPClient\packages.config'
feedsToUse: config
nugetConfigPath: 'build\azure-pipelines\win32\ESRPClient\NuGet.config'
externalFeedCredentials: 3fc0b7f7-da09-4ae7-a9c8-d69824b1819b
restoreDirectory: packages
- task: ESRPImportCertTask@1
displayName: Import ESRP Request Signing Certificate
inputs:
ESRP: 'ESRP CodeSign'
- powershell: |
$ErrorActionPreference = "Stop"
.\build\azure-pipelines\win32\import-esrp-auth-cert.ps1 -AuthCertificateBase64 $(esrp-auth-certificate) -AuthCertificateKey $(esrp-auth-certificate-key)
displayName: Import ESRP Auth Certificate
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:AZURE_STORAGE_ACCESS_KEY_2 = "$(vscode-storage-key)"
$env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
.\build\azure-pipelines\win32\publish.ps1
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true
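
Note: the PowerShell steps in this file repeatedly dot-source build/azure-pipelines/win32/exec.ps1 and wrap external commands in exec { ... }. That helper is not part of this diff; a minimal sketch of such a wrapper, assuming its only job is to fail the step when the wrapped native command exits non-zero, could look like the following (hypothetical, not the actual exec.ps1 contents).

    # Hypothetical sketch of an exec-style helper; the real exec.ps1 is not shown in this diff.
    function exec {
        param(
            [scriptblock]$Command,
            [string]$ErrorMessage = "Command failed"
        )
        & $Command
        if ($LASTEXITCODE -ne 0) {
            throw "$ErrorMessage (exit code $LASTEXITCODE): $Command"
        }
    }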

@@ -1,272 +1,252 @@
steps:
- powershell: |
mkdir .build -ea 0
"$env:BUILD_SOURCEVERSION" | Out-File -Encoding ascii -NoNewLine .build\commit
"$env:VSCODE_QUALITY" | Out-File -Encoding ascii -NoNewLine .build\quality
"$env:ENABLE_TERRAPIN" | Out-File -Encoding ascii -NoNewLine .build\terrapin
displayName: Prepare compilation cache flags
- powershell: |
mkdir .build -ea 0
"$env:BUILD_SOURCEVERSION" | Out-File -Encoding ascii -NoNewLine .build\commit
"$env:VSCODE_QUALITY" | Out-File -Encoding ascii -NoNewLine .build\quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: "build/.cachesalt, .build/commit, .build/quality, .build/terrapin"
targetfolder: ".build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min"
vstsFeed: "npm-vscode"
platformIndependent: true
alias: "Compilation"
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'npm-vscode'
platformIndependent: true
alias: 'Compilation'
- powershell: |
$ErrorActionPreference = "Stop"
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- powershell: |
$ErrorActionPreference = "Stop"
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: NodeTool@0
inputs:
versionSpec: "12.14.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: UsePythonVersion@0
inputs:
versionSpec: "2.x"
addToPath: true
- task: UsePythonVersion@0
inputs:
versionSpec: '2.x'
addToPath: true
- task: AzureKeyVault@1
displayName: "Azure Key Vault: Get Secrets"
inputs:
azureSubscription: "vscode-builds-subscription"
KeyVaultName: vscode
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
"machine github.com`nlogin vscode`npassword $(github-distro-mixin-password)" | Out-File "$env:USERPROFILE\_netrc" -Encoding ASCII
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
"machine github.com`nlogin vscode`npassword $(github-distro-mixin-password)" | Out-File "$env:USERPROFILE\_netrc" -Encoding ASCII
exec { git config user.email "vscode@microsoft.com" }
exec { git config user.name "VSCode" }
displayName: Prepare tooling
exec { git config user.email "vscode@microsoft.com" }
exec { git config user.name "VSCode" }
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git" }
exec { git fetch distro }
exec { git merge $(node -p "require('./package.json').distro") }
displayName: Merge distro
mkdir .build -ea 0
"$(VSCODE_ARCH)" | Out-File -Encoding ascii -NoNewLine .build\arch
displayName: Prepare tooling
- script: |
npx https://aka.ms/enablesecurefeed standAlone
displayName: Switch to Terrapin packages
timeoutInMinutes: 5
condition: and(succeeded(), eq(variables['ENABLE_TERRAPIN'], 'true'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git" }
exec { git fetch distro }
exec { git merge $(node -p "require('./package.json').distro") }
displayName: Merge distro
- powershell: |
"$(VSCODE_ARCH)" | Out-File -Encoding ascii -NoNewLine .build\arch
displayName: Prepare yarn cache flags
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/arch, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:npm_config_arch="$(VSCODE_ARCH)"
$env:CHILD_CONCURRENCY="1"
exec { yarn --frozen-lockfile }
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
. build/azure-pipelines/win32/retry.ps1
$ErrorActionPreference = "Stop"
$env:npm_config_arch="$(VSCODE_ARCH)"
$env:CHILD_CONCURRENCY="1"
retry { exec { yarn --frozen-lockfile } }
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .build/arch, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: ".build/arch, .build/terrapin, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock"
targetfolder: "**/node_modules, !**/node_modules/**/node_modules"
vstsFeed: "npm-vscode"
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn postinstall }
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn postinstall }
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build/azure-pipelines/mixin }
displayName: Mix in quality
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build/azure-pipelines/mixin }
displayName: Mix in quality
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-min-ci" }
exec { yarn gulp "vscode-reh-win32-$env:VSCODE_ARCH-min-ci" }
exec { yarn gulp "vscode-reh-web-win32-$env:VSCODE_ARCH-min-ci" }
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-code-helper" }
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-inno-updater" }
displayName: Build
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-min-ci" }
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-code-helper" }
exec { yarn gulp "vscode-win32-$(VSCODE_ARCH)-inno-updater" }
echo "##vso[task.setvariable variable=CodeSigningFolderPath]$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)"
displayName: Build
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn electron $(VSCODE_ARCH) }
exec { .\scripts\test.bat --build --tfs "Unit Tests" }
displayName: Run unit tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-reh-win32-$(VSCODE_ARCH)-min-ci" }
exec { yarn gulp "vscode-reh-web-win32-$(VSCODE_ARCH)-min-ci" }
echo "##vso[task.setvariable variable=CodeSigningFolderPath]$(CodeSigningFolderPath),$(agent.builddirectory)/vscode-reh-win32-$(VSCODE_ARCH)"
displayName: Build Server
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn test-browser --build --browser chromium --browser firefox --tfs "Browser Unit Tests" }
displayName: Run unit tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn electron $(VSCODE_ARCH) }
exec { .\scripts\test.bat --build --tfs "Unit Tests" }
displayName: Run unit tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- powershell: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\scripts\test-integration.bat --build --tfs "Integration Tests" }
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn test-browser --build --browser chromium --browser firefox --tfs "Browser Unit Tests" }
displayName: Run unit tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-web-win32-$(VSCODE_ARCH)"; .\resources\server\test\test-web-integration.bat --browser firefox }
displayName: Run integration tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- powershell: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\scripts\test-integration.bat --build --tfs "Integration Tests" }
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\resources\server\test\test-remote-integration.bat }
displayName: Run remote integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-web-win32-$(VSCODE_ARCH)"; .\resources\server\test\test-web-integration.bat --browser firefox }
displayName: Run integration tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-windows-$(VSCODE_ARCH)
targetPath: .build\crashes
displayName: 'Publish Crash Reports'
continueOnError: true
condition: failed()
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$AppRoot = "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\vscode-reh-win32-$(VSCODE_ARCH)"; .\resources\server\test\test-remote-integration.bat }
displayName: Run remote integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-windows-$(VSCODE_ARCH)
targetPath: .build\crashes
displayName: "Publish Crash Reports"
continueOnError: true
condition: failed()
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH),$(agent.builddirectory)/vscode-reh-win32-$(VSCODE_ARCH)'
Pattern: '*.dll,*.exe,*.node'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolSign",
"parameters": [
{
"parameterName": "OpusName",
"parameterValue": "VS Code"
},
{
"parameterName": "OpusInfo",
"parameterValue": "https://code.visualstudio.com/"
},
{
"parameterName": "Append",
"parameterValue": "/as"
},
{
"parameterName": "FileDigest",
"parameterValue": "/fd \"SHA256\""
},
{
"parameterName": "PageHash",
"parameterValue": "/NPH"
},
{
"parameterName": "TimeStamp",
"parameterValue": "/tr \"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer\" /td sha256"
}
],
"toolName": "sign",
"toolVersion": "1.0"
},
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolVerify",
"parameters": [
{
"parameterName": "VerifyAll",
"parameterValue": "/all"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: "*-results.xml"
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
condition: and(succeededOrFailed(), ne(variables['VSCODE_ARCH'], 'arm64'))
- task: NuGetCommand@2
displayName: Install ESRPClient.exe
inputs:
restoreSolution: 'build\azure-pipelines\win32\ESRPClient\packages.config'
feedsToUse: config
nugetConfigPath: 'build\azure-pipelines\win32\ESRPClient\NuGet.config'
externalFeedCredentials: 'ESRP Nuget'
restoreDirectory: packages
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: "ESRP CodeSign"
FolderPath: "$(CodeSigningFolderPath)"
Pattern: "*.dll,*.exe,*.node"
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolSign",
"parameters": [
{
"parameterName": "OpusName",
"parameterValue": "VS Code"
},
{
"parameterName": "OpusInfo",
"parameterValue": "https://code.visualstudio.com/"
},
{
"parameterName": "Append",
"parameterValue": "/as"
},
{
"parameterName": "FileDigest",
"parameterValue": "/fd \"SHA256\""
},
{
"parameterName": "PageHash",
"parameterValue": "/NPH"
},
{
"parameterName": "TimeStamp",
"parameterValue": "/tr \"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer\" /td sha256"
}
],
"toolName": "sign",
"toolVersion": "1.0"
},
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolVerify",
"parameters": [
{
"parameterName": "VerifyAll",
"parameterValue": "/all"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
- task: ESRPImportCertTask@1
displayName: Import ESRP Request Signing Certificate
inputs:
ESRP: 'ESRP CodeSign'
- task: NuGetCommand@2
displayName: Install ESRPClient.exe
inputs:
restoreSolution: 'build\azure-pipelines\win32\ESRPClient\packages.config'
feedsToUse: config
nugetConfigPath: 'build\azure-pipelines\win32\ESRPClient\NuGet.config'
externalFeedCredentials: "ESRP Nuget"
restoreDirectory: packages
- powershell: |
$ErrorActionPreference = "Stop"
.\build\azure-pipelines\win32\import-esrp-auth-cert.ps1 -AuthCertificateBase64 $(esrp-auth-certificate) -AuthCertificateKey $(esrp-auth-certificate-key)
displayName: Import ESRP Auth Certificate
- task: ESRPImportCertTask@1
displayName: Import ESRP Request Signing Certificate
inputs:
ESRP: "ESRP CodeSign"
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:AZURE_STORAGE_ACCESS_KEY_2 = "$(vscode-storage-key)"
$env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
.\build\azure-pipelines\win32\publish.ps1
displayName: Publish
- task: PowerShell@2
inputs:
targetType: filePath
filePath: .\build\azure-pipelines\win32\import-esrp-auth-cert.ps1
arguments: "$(ESRP-SSL-AADAuth)"
displayName: Import ESRP Auth Certificate
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:AZURE_STORAGE_ACCESS_KEY_2 = "$(vscode-storage-key)"
$env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
.\build\azure-pipelines\win32\publish.ps1
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: "Component Detection"
continueOnError: true
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true
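
Note: one change in this file is that the dependency-install step now also dot-sources build/azure-pipelines/win32/retry.ps1 and runs retry { exec { yarn --frozen-lockfile } }, mirroring the three-attempt loop used for Terrapin in the shell-based jobs. The helper itself is not included in the diff; a minimal sketch of a fixed-attempt retry wrapper, assuming three attempts with a short pause between them, might be:

    # Hypothetical sketch of a retry helper; the real retry.ps1 is not shown in this diff.
    function retry {
        param(
            [scriptblock]$Command,
            [int]$Attempts = 3,
            [int]$DelaySeconds = 10
        )
        for ($i = 1; $i -le $Attempts; $i++) {
            try {
                & $Command
                return
            } catch {
                if ($i -eq $Attempts) { throw }
                Write-Host "Attempt $i failed, retrying in $DelaySeconds seconds..."
                Start-Sleep -Seconds $DelaySeconds
            }
        }
    }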

Some files were not shown because too many files have changed in this diff.