Mirror of https://github.com/ckaczor/azuredatastudio.git (synced 2026-02-17 18:46:43 -05:00)

Compare commits (29 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 5b3b97f4a3 | |
| | 6d25299eef | |
| | 26fe9932f2 | |
| | 2acd37a1b1 | |
| | 85786b48c3 | |
| | acd6257bae | |
| | f007d707b6 | |
| | b118b4bc7a | |
| | f3edece70b | |
| | 4de376c357 | |
| | 0753b63ad0 | |
| | 0523190fbb | |
| | de994972db | |
| | 3b06473c49 | |
| | bb75143282 | |
| | 3857f11dc9 | |
| | 0918d93a18 | |
| | 25d96d041e | |
| | 0dc88501cf | |
| | bac7eccbaf | |
| | 585d609ebb | |
| | 168385b6f1 | |
| | eb4612100d | |
| | d89ce8f9ec | |
| | d84dd31491 | |
| | b6632547a2 | |
| | 79669f073c | |
| | e078e3bc48 | |
| | 72d035be98 | |
@@ -12,10 +12,6 @@
		{
			"file": "build\\actions\\AutoMerge\\dist\\index.js",
			"_justification": "False positive from webpacked code"
		},
		{
			"file": ".devcontainer\\devcontainer.json",
			"_justification": "Local development environment - not used in production"
		}
	]
}
@@ -1,100 +0,0 @@
# Code - OSS Development Container

This repository includes configuration for a development container for working with Code - OSS in an isolated local container or using [GitHub Codespaces](https://github.com/features/codespaces).

> **Tip:** The default VNC password is `vscode`. The VNC server runs on port `5901` with a web client at `6080`. For better performance, we recommend using a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Applications like the macOS Screen Sharing app will not perform as well.

## Quick start - local

1. Install Docker Desktop or Docker for Linux on your local machine. (See [docs](https://aka.ms/vscode-remote/containers/getting-started) for additional details.)

2. **Important**: Docker needs at least **4 Cores and 6 GB of RAM (8 GB recommended)** to run a full build. If you are on macOS, or are using the old Hyper-V engine on Windows, update these values for Docker Desktop by right-clicking on the Docker status bar item and going to **Preferences/Settings > Resources > Advanced**.

   > **Note:** The [Resource Monitor](https://marketplace.visualstudio.com/items?itemName=mutantdino.resourcemonitor) extension is included in the container so you can keep an eye on CPU/Memory in the status bar.

3. Install [Visual Studio Code Stable](https://code.visualstudio.com/) or [Insiders](https://code.visualstudio.com/insiders/) and the [Remote - Containers](https://aka.ms/vscode-remote/download/containers) extension.

   > Note that the Remote - Containers extension requires the Visual Studio Code distribution of Code - OSS. See the [FAQ](https://aka.ms/vscode-remote/faq/license) for details.

4. Press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and select **Remote - Containers: Open Repository in Container...**.

   > **Tip:** While you can use your local source tree instead, operations like `yarn install` can be slow on macOS or when using the Hyper-V engine on Windows. We recommend the "open repository" approach instead since it uses a "named volume" rather than the local filesystem.

5. Type `https://github.com/microsoft/vscode` (or a branch or PR URL) in the input box and press <kbd>Enter</kbd>.

6. After the container is running, open a web browser and go to [http://localhost:6080](http://localhost:6080), or use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.

Anything you start in VS Code or the integrated terminal will appear here.

Next: **[Try it out!](#try-it)**

## Quick start - GitHub Codespaces

> **IMPORTANT:** You need to use a "Standard" sized codespace or larger (4-core, 8 GB), since VS Code needs 6 GB of RAM to compile. This is now the default for GitHub Codespaces, but do not downgrade to "Basic" unless you do not intend to compile.

1. From the [microsoft/vscode GitHub repository](https://github.com/microsoft/vscode), click on the **Code** dropdown, select **Open with Codespaces**, and then **New codespace**.

   > Note that you will not see these options if you are not in the beta yet.

2. After the codespace is up and running in your browser, press <kbd>F1</kbd> and select **Ports: Focus on Ports View**.

3. You should see port `6080` under **Forwarded Ports**. Select the line and click on the globe icon to open it in a browser tab.

   > If you do not see port `6080`, press <kbd>F1</kbd>, select **Forward a Port** and enter port `6080`.

4. In the new tab, you should see noVNC. Click **Connect** and enter `vscode` as the password.

Anything you start in VS Code or the integrated terminal will appear here.

Next: **[Try it out!](#try-it)**

### Using VS Code with GitHub Codespaces

You will likely see better performance when accessing the codespace you created from VS Code, since you can use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Here's how to do it.

1. [Create a codespace](#quick-start---github-codespaces) if you have not already.

2. Set up [VS Code for use with GitHub Codespaces](https://docs.github.com/github/developing-online-with-codespaces/using-codespaces-in-visual-studio-code).

3. After VS Code is up and running, press <kbd>F1</kbd>, choose **Codespaces: Connect to Codespace**, and select the codespace you created.

4. After you've connected to the codespace, use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.

5. Anything you start in VS Code or the integrated terminal will appear here.

Next: **[Try it out!](#try-it)**

## Try it!

This container uses the [Fluxbox](http://fluxbox.org/) window manager to keep things lean. **Right-click on the desktop** to see menu options. It works with GNOME and GTK applications, so other tools can be installed if needed.

Note that you can also set the resolution from the command line by typing `set-resolution`.

To start working with Code - OSS, follow these steps:

1. In your local VS Code, open a terminal (<kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>\`</kbd>) and type the following commands:

   ```bash
   yarn install
   bash scripts/code.sh
   ```

   Note that a previous run of `yarn install` will already be cached, so this step should simply pick up any recent differences.

2. After the build is complete, open a web browser or a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to the desktop environment as described in the quick start and enter `vscode` as the password.

3. You should now see Code - OSS!

Next, let's try debugging.

1. Shut down Code - OSS by clicking the box in the upper right corner of the Code - OSS window through your browser or VNC viewer.

2. Go to your local VS Code client, and use the Run / Debug view to launch the **VS Code** configuration. (It is typically the default, so you can likely just press <kbd>F5</kbd>.)

   > **Note:** If launching times out, you can increase the value of `timeout` in the "VS Code", "Attach Main Process", "Attach Extension Host", and "Attach to Shared Process" configurations in [launch.json](../.vscode/launch.json). However, running `scripts/code.sh` first will set up Electron, which will usually solve timeout issues.

3. After a bit, Code - OSS will appear with the debugger attached!

Enjoy!
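As a quick reference for step 6 of the local quick start, the connection can also be made from a terminal. This is an illustrative sketch only; it assumes a Linux host and an installed VNC client such as TigerVNC's `vncviewer`, neither of which is provided by this repository:

```bash
# Illustrative only: reach the container desktop described above.
xdg-open http://localhost:6080   # noVNC web client (use `open` on macOS)
vncviewer localhost:5901         # native VNC viewer; password: vscode
```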
1 .devcontainer/cache/.gitignore vendored
@@ -1 +0,0 @@
*.manifest
15 .devcontainer/cache/before-cache.sh vendored
@@ -1,15 +0,0 @@
#!/bin/bash

# This file establishes a baseline for the repository before any steps in "prepare.sh"
# are run. It's just a find command that filters out a few things we don't need to watch.

set -e

SCRIPT_PATH="$(cd "$(dirname $0)" && pwd)"
SOURCE_FOLDER="${1:-"."}"

cd "${SOURCE_FOLDER}"
echo "[$(date)] Generating ""before"" manifest..."
find -L . -not -path "*/.git/*" -and -not -path "${SCRIPT_PATH}/*.manifest" -type f > "${SCRIPT_PATH}/before.manifest"
echo "[$(date)] Done!"
28 .devcontainer/cache/build-cache-image.sh vendored
@@ -1,28 +0,0 @@
#!/bin/bash

# This file simply wraps the docker build command used to build the image with the
# cached result of the commands from "prepare.sh" and pushes it to the specified
# container image registry.

set -e

SCRIPT_PATH="$(cd "$(dirname $0)" && pwd)"
CONTAINER_IMAGE_REPOSITORY="$1"
BRANCH="${2:-"main"}"

if [ "${CONTAINER_IMAGE_REPOSITORY}" = "" ]; then
    echo "Container repository not specified!"
    exit 1
fi

TAG="branch-${BRANCH//\//-}"
echo "[$(date)] ${BRANCH} => ${TAG}"
cd "${SCRIPT_PATH}/../.."

echo "[$(date)] Starting image build..."
docker build -t ${CONTAINER_IMAGE_REPOSITORY}:"${TAG}" -f "${SCRIPT_PATH}/cache.Dockerfile" .
echo "[$(date)] Image build complete."

echo "[$(date)] Pushing image..."
docker push ${CONTAINER_IMAGE_REPOSITORY}:"${TAG}"
echo "[$(date)] Done!"
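For reference, a hypothetical invocation of the script above; the registry name is a placeholder, not something defined in this repository:

```bash
# Hypothetical usage: build and push the cache image for the "main" branch.
# "myregistry.azurecr.io/vscode-devcontainer-cache" is a made-up repository name.
./.devcontainer/cache/build-cache-image.sh myregistry.azurecr.io/vscode-devcontainer-cache main
# The script derives TAG="branch-main", then effectively runs:
#   docker build -t myregistry.azurecr.io/vscode-devcontainer-cache:branch-main -f .devcontainer/cache/cache.Dockerfile .
#   docker push  myregistry.azurecr.io/vscode-devcontainer-cache:branch-main
```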
21 .devcontainer/cache/cache-diff.sh vendored
@@ -1,21 +0,0 @@
#!/bin/bash

# This file is used to archive off a copy of any differences in the source tree into another location
# in the image. Once the codespace is up, this will be restored into its proper location (which is
# quick and happens in parallel with other startup activities).

set -e

SCRIPT_PATH="$(cd "$(dirname $0)" && pwd)"
SOURCE_FOLDER="${1:-"."}"
CACHE_FOLDER="${2:-"/usr/local/etc/devcontainer-cache"}"

echo "[$(date)] Starting cache operation..."
cd "${SOURCE_FOLDER}"
echo "[$(date)] Determining diffs..."
find -L . -not -path "*/.git/*" -and -not -path "${SCRIPT_PATH}/*.manifest" -type f > "${SCRIPT_PATH}/after.manifest"
grep -Fxvf "${SCRIPT_PATH}/before.manifest" "${SCRIPT_PATH}/after.manifest" > "${SCRIPT_PATH}/cache.manifest"
echo "[$(date)] Archiving diffs..."
mkdir -p "${CACHE_FOLDER}"
tar -cf "${CACHE_FOLDER}/cache.tar" --totals --files-from "${SCRIPT_PATH}/cache.manifest"
echo "[$(date)] Done! $(du -h "${CACHE_FOLDER}/cache.tar")"
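The before/after manifest pair is what makes this cache incremental: only files that appear after the baseline end up in `cache.tar`. A minimal sketch of the same `grep -Fxvf` technique, with made-up file names:

```bash
# Toy illustration of the manifest-diff step used by before-cache.sh and cache-diff.sh.
find -L . -type f > before.manifest          # baseline snapshot of the tree
touch generated-by-prepare.txt               # stand-in for whatever the expensive step creates
find -L . -type f > after.manifest           # snapshot after the expensive step
# -F fixed strings, -x whole-line match, -v invert match, -f read patterns from file:
grep -Fxvf before.manifest after.manifest    # prints only the newly created paths
```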
14 .devcontainer/cache/cache.Dockerfile vendored
@@ -1,14 +0,0 @@
# This dockerfile is used to build up from a base image to create an image with cached results of running "prepare.sh".
# Other image contents: https://github.com/microsoft/vscode-dev-containers/blob/master/repository-containers/images/github.com/microsoft/vscode/.devcontainer/base.Dockerfile
FROM mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:dev

ARG USERNAME=node
COPY --chown=${USERNAME}:${USERNAME} . /repo-source-tmp/
RUN mkdir /usr/local/etc/devcontainer-cache \
    && chown ${USERNAME} /usr/local/etc/devcontainer-cache /repo-source-tmp \
    && su ${USERNAME} -c "\
        cd /repo-source-tmp \
        && .devcontainer/cache/before-cache.sh \
        && .devcontainer/prepare.sh \
        && .devcontainer/cache/cache-diff.sh" \
    && rm -rf /repo-source-tmp
23 .devcontainer/cache/restore-diff.sh vendored
@@ -1,23 +0,0 @@
#!/bin/bash

# This file restores the results of "prepare.sh" into their proper locations
# once the container has been created. It runs as a postCreateCommand, which
# in GitHub Codespaces occurs in parallel with other startup activities and does not
# really add to the overall startup time given how quick the operation ends up being.

set -e

SOURCE_FOLDER="$(cd "${1:-"."}" && pwd)"
CACHE_FOLDER="${2:-"/usr/local/etc/devcontainer-cache"}"

if [ ! -d "${CACHE_FOLDER}" ]; then
    echo "No cache folder found."
    exit 0
fi

echo "[$(date)] Expanding $(du -h "${CACHE_FOLDER}/cache.tar") file to ${SOURCE_FOLDER}..."
cd "${SOURCE_FOLDER}"
tar -xf "${CACHE_FOLDER}/cache.tar"
rm -f "${CACHE_FOLDER}/cache.tar"
echo "[$(date)] Done!"
@@ -1,30 +0,0 @@
{
    "name": "Code - OSS",

    // Image contents: https://github.com/microsoft/vscode-dev-containers/blob/master/repository-containers/images/github.com/microsoft/vscode/.devcontainer/base.Dockerfile
    "image": "mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:branch-main",

    "workspaceMount": "source=${localWorkspaceFolder},target=/home/node/workspace/vscode,type=bind,consistency=cached",
    "workspaceFolder": "/home/node/workspace/vscode",
    "overrideCommand": false,
    "runArgs": [ "--init", "--security-opt", "seccomp=unconfined"],

    "settings": {
        "terminal.integrated.shell.linux": "/bin/bash",
        "resmon.show.battery": false,
        "resmon.show.cpufreq": false
    },

    // noVNC, VNC, debug ports
    "forwardPorts": [6080, 5901, 9222],

    "extensions": [
        "dbaeumer.vscode-eslint",
        "mutantdino.resourcemonitor"
    ],

    // Optionally loads a cached yarn install for the repo
    "postCreateCommand": ".devcontainer/cache/restore-diff.sh",

    "remoteUser": "node"
}
@@ -1,10 +0,0 @@
#!/bin/bash

# This file contains the steps that should be run when creating the intermediary image that contains
# contents that should be in the image by default. It will be used to build up from the base image
# to create an image that speeds up first time use of the dev container by "caching" the results
# of these commands. Developers can still run these commands without an issue once the container is
# up, but only differences will be processed, which also speeds up the first time these operations occur.

yarn install
yarn electron
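The cache.Dockerfile above chains these scripts inside the image build. The same sequence, sketched outside Docker under the assumption of a local checkout (paths are illustrative):

```bash
# Sketch of the sequence cache.Dockerfile executes, run directly in a checkout.
cd /path/to/azuredatastudio                                      # illustrative repository root
.devcontainer/cache/before-cache.sh .                            # write before.manifest
.devcontainer/prepare.sh                                         # yarn install + yarn electron
.devcontainer/cache/cache-diff.sh . /tmp/devcontainer-cache      # archive what prepare.sh added
# Later, in a fresh tree (normally done by the postCreateCommand):
.devcontainer/cache/restore-diff.sh . /tmp/devcontainer-cache    # expand cache.tar back into place
```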
@@ -5,16 +5,11 @@
**/vs/loader.js
**/insane/**
**/marked/**
**/semver/**
**/test/**/*.js
**/node_modules/**
**/vscode-api-tests/testWorkspace/**
**/vscode-api-tests/testWorkspace2/**
**/extensions/**/out/**
**/extensions/**/build/**
**/big-data-cluster/src/bigDataCluster/controller/apiGenerated.ts
**/big-data-cluster/src/bigDataCluster/controller/clusterApiGenerated2.ts
**/extensions/markdown-language-features/media/**
**/extensions/markdown-language-features/notebook-out/**
**/extensions/typescript-basics/test/colorize-fixtures/**
**/extensions/**/dist/**
1924 .eslintrc.json
File diff suppressed because it is too large
21 .github/CODEOWNERS vendored
@@ -1,21 +0,0 @@
# Lines starting with '#' are comments.
# Each line is a file pattern followed by one or more owners.
# Syntax can be found here: https://docs.github.com/free-pro-team@latest/github/creating-cloning-and-archiving-repositories/about-code-owners#codeowners-syntax

/extensions/admin-tool-ext-win @Charles-Gagnon
/extensions/arc/ @Charles-Gagnon @swells @candiceye
/extensions/azcli/ @Charles-Gagnon @swells @candiceye
/extensions/azdata/ @Charles-Gagnon @swells @candiceye
/extensions/big-data-cluster/ @Charles-Gagnon
/extensions/dacpac/ @kisantia
/extensions/notebook @azure-data-studio-notebook-devs
/extensions/query-history/ @Charles-Gagnon
/extensions/resource-deployment/ @Charles-Gagnon
/extensions/schema-compare/ @kisantia
/extensions/sql-database-projects/ @Benjin @kisantia
/extensions/mssql/config.json @Charles-Gagnon @alanrenmsft @kburtram

/src/sql/*.d.ts @alanrenmsft @Charles-Gagnon
/src/sql/workbench/browser/modelComponents @Charles-Gagnon @alanrenmsft
/src/sql/workbench/api @Charles-Gagnon @alanrenmsft
/src/sql/**/notebook @azure-data-studio-notebook-devs
14 .github/ISSUE_TEMPLATE/bug_report.md vendored
@@ -8,18 +8,12 @@ assignees: ''
---
<!-- ⚠️⚠️ Do Not Delete This! bug_report_template ⚠️⚠️ -->
<!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ -->
<!-- 🔎 Search existing issues to avoid creating duplicates. -->
<!-- 🧪 Test using the latest Insiders build to see if your issue has already been fixed: https://github.com/Microsoft/azuredatastudio#try-out-the-latest-insiders-build-from-main -->
<!-- 💡 Instead of creating your report here, use 'Report Issue' from the 'Help' menu in Azure Data Studio to pre-fill useful information. -->
<!-- Please search existing issues to avoid creating duplicates. -->
<!-- Also please test using the latest insiders build to make sure your issue has not already been fixed. -->

<!-- Use Help > Report Issue to prefill these. -->
- Azure Data Studio Version:
- OS Version:

Steps to Reproduce:

1.
2.

<!-- 🔧 Launch with `azuredatastudio --disable-extensions` to check. -->
Does this issue occur when all extensions are disabled?: Yes/No

<!-- 📣 Issues caused by an extension need to be reported directly to the extension publisher. The 'Help > Report Issue' dialog can assist with this. -->
5 .github/classifier.yml vendored
@@ -8,13 +8,11 @@
    Area - Acquisition: [],
    Area - Azure: [],
    Area - Backup\Restore: [],
    Area - Big Data Cluster: [ charles-gagnon ],
    Area - Charting\Insights: [],
    Area - Connection: [ ],
    Area - Connection: [ charles-gagnon ],
    Area - DacFX: [],
    Area - Dashboard: [],
    Area - Data Explorer: [],
    Area - Data Virtualization: [ charles-gagnon ],
    Area - Edit Data: [],
    Area - Extensibility: [],
    Area - External Table: [],
@@ -24,7 +22,6 @@
    Area - Notebooks: [ chlafreniere ],
    Area - Performance: [],
    Area - Query Editor: [ anthonydresser ],
    Area - Query History: [ charles-gagnon ],
    Area - Query Plan: [],
    Area - Reliability: [],
    Area - Resource Deployment: [],
17 .github/commands.yml vendored
@@ -1,12 +1,11 @@
{
    perform: true,
    commands:
        [
            {
                type: "label",
                name: "Needs Logs",
                action: "comment",
                comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.\n\nTo find your logs:\n\n- Open command palette (Click **View** -> **Command Palette**)\n- Run the command: **`Developer: Open Logs Folder`**\n\nThis will open the log file locally. Please include renderer.log",
            },
        ],
    commands: [
        {
            type: 'label',
            name: 'Needs Logs',
            action: 'comment',
            comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.\n\nTo find your logs:\n\n- Open command palette (Click **View** -> **Command Palette**)\n- Run the command: **`Developer: Open Logs Folder`**\n\nThis will open the log file locally. Please include renderer.log"
        }
    ]
}
5 .github/copycat.yml vendored Normal file
@@ -0,0 +1,5 @@
{
    perform: false,
    target_owner: 'anthonydresser',
    target_repo: 'testissues'
}
27 .github/label-actions.yml vendored
@@ -1,27 +0,0 @@
Needs Logs:
  comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.


    There are two types of logs to collect:


    **Console Logs**


    - Open Developer Tools (Help -> Toggle Developer Tools)

    - Click the **Console** tab

    - Click in the log area and select all text (CTRL+A)

    - Save this text into a file named console.log and attach it to this issue.


    **Application Logs**


    - Open command palette (Click **View** -> **Command Palette**)

    - Run the command: **`Developer: Open Logs Folder`**

    - This will open the log folder locally. Please zip up this folder and attach it to the issue."
2 .github/pull_request_template.md vendored
@@ -2,7 +2,7 @@
* Read our Pull Request guidelines:
https://github.com/Microsoft/azuredatastudio/wiki/How-to-Contribute#pull-requests.
* Associate an issue with the Pull Request.
* Ensure that the code is up-to-date with the `main` branch.
* Ensure that the code is up-to-date with the `master` branch.
* Include a description of the proposed changes and how to test them.
-->
2 .github/similarity.yml vendored
@@ -1,5 +1,5 @@
{
    perform: true,
    whenCreatedByTeam: true,
    comment: "Thanks for submitting this issue. Please also check if it is already covered by an existing one, like:\n${potentialDuplicates}",
    comment: "Thanks for submitting this issue. Please also check if it is already covered by an existing one, like:\n${potentialDuplicates}"
}
9 .github/subscribers.json vendored
@@ -1,9 +0,0 @@
{
    "notebook": [
        "claudiaregio",
        "rchiodo",
        "greazer",
        "donjayamanne",
        "jilljac"
    ]
}
428 .github/workflows/ci.yml vendored
@@ -3,312 +3,162 @@ name: CI
|
||||
on:
|
||||
push:
|
||||
branches:
|
||||
- main
|
||||
- master
|
||||
- release/*
|
||||
pull_request:
|
||||
branches:
|
||||
- main
|
||||
- master
|
||||
- release/*
|
||||
|
||||
jobs:
|
||||
windows:
|
||||
name: Windows
|
||||
runs-on: windows-latest
|
||||
timeout-minutes: 30
|
||||
env:
|
||||
CHILD_CONCURRENCY: "1"
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
steps:
|
||||
- uses: actions/checkout@v2.2.0
|
||||
|
||||
- uses: actions/setup-node@v2
|
||||
with:
|
||||
node-version: 12
|
||||
|
||||
- uses: actions/setup-python@v2
|
||||
with:
|
||||
python-version: "2.x"
|
||||
|
||||
# {{SQL CARBON EDIT}} Skip caching for now
|
||||
# - name: Compute node modules cache key
|
||||
# id: nodeModulesCacheKey
|
||||
# run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
|
||||
# - name: Cache node_modules archive
|
||||
# id: cacheNodeModules
|
||||
# uses: actions/cache@v2
|
||||
# with:
|
||||
# path: ".build/node_modules_cache"
|
||||
# key: "${{ runner.os }}-cacheNodeModulesArchive-${{ steps.nodeModulesCacheKey.outputs.value }}"
|
||||
# - name: Extract node_modules archive
|
||||
# if: ${{ steps.cacheNodeModules.outputs.cache-hit == 'true' }}
|
||||
# run: 7z.exe x .build/node_modules_cache/cache.7z -aos
|
||||
# - name: Get yarn cache directory path
|
||||
# id: yarnCacheDirPath
|
||||
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
# run: echo "::set-output name=dir::$(yarn cache dir)"
|
||||
# - name: Cache yarn directory
|
||||
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
# uses: actions/cache@v2
|
||||
# with:
|
||||
# path: ${{ steps.yarnCacheDirPath.outputs.dir }}
|
||||
# key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
|
||||
# restore-keys: ${{ runner.os }}-yarnCacheDir-
|
||||
|
||||
- name: Execute yarn
|
||||
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }} {{SQL CARBON EDIT}} Skipping caching for now
|
||||
env:
|
||||
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
|
||||
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
|
||||
run: yarn --frozen-lockfile --network-timeout 180000
|
||||
# - name: Create node_modules archive {{SQL CARBON EDIT}} Skip caching for now
|
||||
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
# run: |
|
||||
# mkdir -Force .build
|
||||
# node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
|
||||
# mkdir -Force .build/node_modules_cache
|
||||
# 7z.exe a .build/node_modules_cache/cache.7z -mx3 `@.build/node_modules_list.txt
|
||||
|
||||
- name: Compile and Download
|
||||
run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64" # {{SQL CARBON EDIT}} Remove unused options playwright-install download-builtin-extensions
|
||||
|
||||
- name: Run Unit Tests (Electron)
|
||||
run: .\scripts\test.bat
|
||||
|
||||
# - name: Run Unit Tests (Browser) {{SQL CARBON EDIT}} disable for now
|
||||
# run: yarn test-browser --browser chromium
|
||||
|
||||
# - name: Run Integration Tests (Electron) {{SQL CARBON EDIT}} disable for now
|
||||
# run: .\scripts\test-integration.bat
|
||||
|
||||
linux:
|
||||
name: Linux
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 30
|
||||
env:
|
||||
CHILD_CONCURRENCY: "1"
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
steps:
|
||||
- uses: actions/checkout@v2.2.0
|
||||
- uses: actions/checkout@v1
|
||||
# TODO: rename azure-pipelines/linux/xvfb.init to github-actions
|
||||
- run: |
|
||||
sudo apt-get update
|
||||
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libkrb5-dev # {{SQL CARBON EDIT}} add kerberos dep
|
||||
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
|
||||
sudo chmod +x /etc/init.d/xvfb
|
||||
sudo update-rc.d xvfb defaults
|
||||
sudo service xvfb start
|
||||
name: Setup Build Environment
|
||||
- uses: actions/setup-node@v1
|
||||
with:
|
||||
node-version: 10
|
||||
# TODO: cache node modules
|
||||
- run: yarn --frozen-lockfile
|
||||
name: Install Dependencies
|
||||
- run: yarn electron x64
|
||||
name: Download Electron
|
||||
- run: yarn gulp hygiene
|
||||
name: Run Hygiene Checks
|
||||
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
|
||||
name: Run Strict Compile Options
|
||||
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
|
||||
# name: Run Monaco Editor Checks
|
||||
- run: yarn valid-layers-check
|
||||
name: Run Valid Layers Checks
|
||||
- run: yarn compile
|
||||
name: Compile Sources
|
||||
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
|
||||
# name: Download Built-in Extensions
|
||||
- run: DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests" --coverage --runGlob "**/sql/**/*.test.js"
|
||||
name: Run Unit Tests (Electron)
|
||||
- run: DISPLAY=:10 ./scripts/test-extensions-unit.sh
|
||||
name: Run Extension Unit Tests (Electron)
|
||||
# {{SQL CARBON EDIT}} Add coveralls. We merge first to get around issue where parallel builds weren't being combined correctly
|
||||
- run: |
|
||||
mkdir .build/coverage-combined
|
||||
cat .build/coverage-single/lcov.info ./extensions/admin-tool-ext-win/coverage/lcov.info ./extensions/agent/coverage/lcov.info ./extensions/azurecore/coverage/lcov.info ./extensions/cms/coverage/lcov.info ./extensions/dacpac/coverage/lcov.info ./extensions/schema-compare/coverage/lcov.info ./extensions/notebook/coverage/lcov.info ./extensions/resource-deployment/coverage/lcov.info ./extensions/machine-learning/coverage/lcov.info > .build/coverage-combined/lcov.info
|
||||
name: Merge coverage reports
|
||||
- name: Upload Code Coverage
|
||||
uses: coverallsapp/github-action@v1.1.1
|
||||
with:
|
||||
github-token: ${{ secrets.GITHUB_TOKEN }}
|
||||
path-to-lcov: '.build/coverage-combined/lcov.info'
|
||||
|
||||
# TODO: rename azure-pipelines/linux/xvfb.init to github-actions
|
||||
- name: Setup Build Environment
|
||||
run: |
|
||||
sudo apt-get update
|
||||
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libgbm1 libkrb5-dev # {{SQL CARBON EDIT}} add kerberos dep
|
||||
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
|
||||
sudo chmod +x /etc/init.d/xvfb
|
||||
sudo update-rc.d xvfb defaults
|
||||
sudo service xvfb start
|
||||
# Fails with cryptic error (e.g. https://github.com/microsoft/vscode/pull/90292/checks?check_run_id=433681926#step:13:9)
|
||||
# - run: DISPLAY=:10 yarn test-browser --browser chromium
|
||||
# name: Run Unit Tests (Browser)
|
||||
# - run: DISPLAY=:10 ./scripts/test-integration.sh --tfs "Integration Tests" {{SQL CARBON EDIT}} remove step
|
||||
# name: Run Integration Tests (Electron)
|
||||
|
||||
- uses: actions/setup-node@v2
|
||||
with:
|
||||
node-version: 12
|
||||
# {{SQL CARBON EDIT}} Skip caching for now
|
||||
# - name: Compute node modules cache key
|
||||
# id: nodeModulesCacheKey
|
||||
# run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
|
||||
# - name: Cache node modules
|
||||
# id: cacheNodeModules
|
||||
# uses: actions/cache@v2
|
||||
# with:
|
||||
# path: "**/node_modules"
|
||||
# key: ${{ runner.os }}-cacheNodeModules13-${{ steps.nodeModulesCacheKey.outputs.value }}
|
||||
# restore-keys: ${{ runner.os }}-cacheNodeModules13-
|
||||
# - name: Get yarn cache directory path
|
||||
# id: yarnCacheDirPath
|
||||
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
# run: echo "::set-output name=dir::$(yarn cache dir)"
|
||||
# - name: Cache yarn directory
|
||||
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
# uses: actions/cache@v2
|
||||
# with:
|
||||
# path: ${{ steps.yarnCacheDirPath.outputs.dir }}
|
||||
# key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
|
||||
# restore-keys: ${{ runner.os }}-yarnCacheDir-
|
||||
- name: Execute yarn
|
||||
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }} {{SQL CARBON EDIT}} Skip caching for now
|
||||
env:
|
||||
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
|
||||
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
|
||||
run: yarn --frozen-lockfile --network-timeout 180000
|
||||
|
||||
- name: Compile and Download
|
||||
run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64" playwright-install download-builtin-extensions
|
||||
|
||||
- name: Run Unit Tests (Electron)
|
||||
id: electron-unit-tests
|
||||
run: DISPLAY=:10 ./scripts/test.sh --coverage --runGlob "**/sql/**/*.test.js" # {{SQL CARBON EDIT}} Run only our tests with coverage
|
||||
|
||||
- name: Run Extension Unit Tests (Electron)
|
||||
id: electron-extension-unit-tests
|
||||
run: DISPLAY=:10 ./scripts/test-extensions-unit.sh
|
||||
|
||||
# {{SQL CARBON EDIT}} Add coveralls. We merge first to get around issue where parallel builds weren't being combined correctly
|
||||
- name: Combine code coverage files
|
||||
run: node test/combineCoverage
|
||||
- name: Upload Code Coverage
|
||||
uses: coverallsapp/github-action@v1.1.1
|
||||
with:
|
||||
github-token: ${{ secrets.GITHUB_TOKEN }}
|
||||
path-to-lcov: "test/coverage/lcov.info"
|
||||
|
||||
# - name: Run Unit Tests (Browser) {{SQL CARBON EDIT}} Skip for now
|
||||
# id: browser-unit-tests
|
||||
# run: DISPLAY=:10 yarn test-browser --browser chromium
|
||||
|
||||
# - name: Run Integration Tests (Electron) {{SQL CARBON EDIT}} Skip for now
|
||||
# id: electron-integration-tests
|
||||
# run: DISPLAY=:10 ./scripts/test-integration.sh
|
||||
windows:
|
||||
runs-on: windows-2016
|
||||
env:
|
||||
CHILD_CONCURRENCY: "1"
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
steps:
|
||||
- uses: actions/checkout@v1
|
||||
- uses: actions/setup-node@v1
|
||||
with:
|
||||
node-version: 10
|
||||
- uses: actions/setup-python@v1
|
||||
with:
|
||||
python-version: '2.x'
|
||||
- run: yarn --frozen-lockfile
|
||||
name: Install Dependencies
|
||||
- run: yarn electron
|
||||
name: Download Electron
|
||||
- run: yarn gulp hygiene
|
||||
name: Run Hygiene Checks
|
||||
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
|
||||
name: Run Strict Compile Options
|
||||
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
|
||||
# name: Run Monaco Editor Checks
|
||||
- run: yarn valid-layers-check
|
||||
name: Run Valid Layers Checks
|
||||
- run: yarn compile
|
||||
name: Compile Sources
|
||||
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
|
||||
# name: Download Built-in Extensions
|
||||
- run: .\scripts\test.bat --tfs "Unit Tests"
|
||||
name: Run Unit Tests (Electron)
|
||||
# - run: yarn test-browser --browser chromium {{SQL CARBON EDIT}} disable for now @TODO @anthonydresser
|
||||
# name: Run Unit Tests (Browser)
|
||||
# - run: .\scripts\test-integration.bat --tfs "Integration Tests" {{SQL CARBON EDIT}} remove step
|
||||
# name: Run Integration Tests (Electron)
|
||||
|
||||
darwin:
|
||||
name: macOS
|
||||
runs-on: macos-latest
|
||||
timeout-minutes: 30
|
||||
env:
|
||||
CHILD_CONCURRENCY: "1"
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
steps:
|
||||
- uses: actions/checkout@v2.2.0
|
||||
- uses: actions/checkout@v1
|
||||
- uses: actions/setup-node@v1
|
||||
with:
|
||||
node-version: 10
|
||||
- run: yarn --frozen-lockfile
|
||||
name: Install Dependencies
|
||||
- run: yarn electron x64
|
||||
name: Download Electron
|
||||
- run: yarn gulp hygiene
|
||||
name: Run Hygiene Checks
|
||||
- run: yarn strict-vscode # {{SQL CARBON EDIT}} add step
|
||||
name: Run Strict Compile Options
|
||||
# - run: yarn monaco-compile-check {{SQL CARBON EDIT}} remove step
|
||||
# name: Run Monaco Editor Checks
|
||||
- run: yarn valid-layers-check
|
||||
name: Run Valid Layers Checks
|
||||
- run: yarn compile
|
||||
name: Compile Sources
|
||||
# - run: yarn download-builtin-extensions {{SQL CARBON EDIT}} remove step
|
||||
# name: Download Built-in Extensions
|
||||
- run: ./scripts/test.sh --tfs "Unit Tests"
|
||||
name: Run Unit Tests (Electron)
|
||||
# - run: yarn test-browser --browser chromium --browser webkit
|
||||
# name: Run Unit Tests (Browser)
|
||||
# - run: ./scripts/test-integration.sh --tfs "Integration Tests"
|
||||
# name: Run Integration Tests (Electron)
|
||||
|
||||
- uses: actions/setup-node@v2
|
||||
with:
|
||||
node-version: 12
|
||||
|
||||
# {{SQL CARBON EDIT}} Skip caching for now
|
||||
# - name: Compute node modules cache key
|
||||
# id: nodeModulesCacheKey
|
||||
# run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
|
||||
# - name: Cache node modules
|
||||
# id: cacheNodeModules
|
||||
# uses: actions/cache@v2
|
||||
# with:
|
||||
# path: "**/node_modules"
|
||||
# key: ${{ runner.os }}-cacheNodeModules13-${{ steps.nodeModulesCacheKey.outputs.value }}
|
||||
# restore-keys: ${{ runner.os }}-cacheNodeModules13-
|
||||
# - name: Get yarn cache directory path
|
||||
# id: yarnCacheDirPath
|
||||
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
# run: echo "::set-output name=dir::$(yarn cache dir)"
|
||||
# - name: Cache yarn directory
|
||||
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
# uses: actions/cache@v2
|
||||
# with:
|
||||
# path: ${{ steps.yarnCacheDirPath.outputs.dir }}
|
||||
# key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
|
||||
# restore-keys: ${{ runner.os }}-yarnCacheDir-
|
||||
- name: Execute yarn
|
||||
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
env:
|
||||
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
|
||||
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
|
||||
run: yarn --frozen-lockfile --network-timeout 180000
|
||||
|
||||
- name: Compile and Download
|
||||
run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64" playwright-install download-builtin-extensions
|
||||
|
||||
# This is required for keytar unittests, otherwise we hit
|
||||
# https://github.com/atom/node-keytar/issues/76
|
||||
- name: Create temporary keychain
|
||||
run: |
|
||||
security create-keychain -p pwd $RUNNER_TEMP/buildagent.keychain
|
||||
security default-keychain -s $RUNNER_TEMP/buildagent.keychain
|
||||
security unlock-keychain -p pwd $RUNNER_TEMP/buildagent.keychain
|
||||
|
||||
- name: Run Unit Tests (Electron)
|
||||
run: DISPLAY=:10 ./scripts/test.sh
|
||||
|
||||
# - name: Run Unit Tests (Browser) {{SQL CARBON EDIT}} Skip for now
|
||||
# run: DISPLAY=:10 yarn test-browser --browser chromium
|
||||
|
||||
# - name: Run Integration Tests (Electron) {{SQL CARBON EDIT}} Skip for now
|
||||
# run: DISPLAY=:10 ./scripts/test-integration.sh
|
||||
|
||||
hygiene:
|
||||
name: Hygiene and Layering
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 30
|
||||
env:
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
steps:
|
||||
- uses: actions/checkout@v2
|
||||
|
||||
- uses: actions/setup-node@v2
|
||||
with:
|
||||
node-version: 12
|
||||
|
||||
- name: Compute node modules cache key
|
||||
id: nodeModulesCacheKey
|
||||
run: echo "::set-output name=value::$(node build/azure-pipelines/common/sql-computeNodeModulesCacheKey.js)"
|
||||
- name: Cache node modules
|
||||
id: cacheNodeModules
|
||||
uses: actions/cache@v2
|
||||
with:
|
||||
path: "**/node_modules"
|
||||
key: ${{ runner.os }}-cacheNodeModules13-${{ steps.nodeModulesCacheKey.outputs.value }}
|
||||
restore-keys: ${{ runner.os }}-cacheNodeModules13-
|
||||
- name: Get yarn cache directory path
|
||||
id: yarnCacheDirPath
|
||||
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
run: echo "::set-output name=dir::$(yarn cache dir)"
|
||||
- name: Cache yarn directory
|
||||
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
uses: actions/cache@v2
|
||||
with:
|
||||
path: ${{ steps.yarnCacheDirPath.outputs.dir }}
|
||||
key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
|
||||
restore-keys: ${{ runner.os }}-yarnCacheDir-
|
||||
- name: Setup Build Environment # {{SQL CARBON EDIT}} Add step to install required packages if we need to run yarn
|
||||
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
run: |
|
||||
sudo apt-get update
|
||||
sudo apt-get install -y libkrb5-dev
|
||||
- name: Execute yarn
|
||||
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
|
||||
env:
|
||||
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
|
||||
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
|
||||
run: yarn --frozen-lockfile --network-timeout 180000
|
||||
|
||||
- name: Run Hygiene Checks
|
||||
run: yarn gulp hygiene
|
||||
|
||||
- name: Run Valid Layers Checks
|
||||
run: yarn valid-layers-check
|
||||
|
||||
- name: Run Strict Compile Options # {{SQL CARBON EDIT}} add step
|
||||
run: yarn strict-vscode
|
||||
|
||||
# - name: Run Monaco Editor Checks {{SQL CARBON EDIT}} Remove Monaco checks
|
||||
# run: yarn monaco-compile-check
|
||||
|
||||
- name: Run Trusted Types Checks
|
||||
run: yarn tsec-compile-check
|
||||
|
||||
# - name: Editor Distro & ESM Bundle {{SQL CARBON EDIT}} Remove Monaco checks
|
||||
# run: yarn gulp editor-esm-bundle
|
||||
|
||||
# - name: Typings validation prep {{SQL CARBON EDIT}} Remove Monaco checks
|
||||
# run: |
|
||||
# mkdir typings-test
|
||||
|
||||
# - name: Typings validation {{SQL CARBON EDIT}} Remove Monaco checks
|
||||
# working-directory: ./typings-test
|
||||
# run: |
|
||||
# yarn init -yp
|
||||
# ../node_modules/.bin/tsc --init
|
||||
# echo "import '../out-monaco-editor-core';" > a.ts
|
||||
# ../node_modules/.bin/tsc --noEmit
|
||||
|
||||
# - name: Webpack Editor {{SQL CARBON EDIT}} Remove Monaco checks
|
||||
# working-directory: ./test/monaco
|
||||
# run: yarn run bundle
|
||||
|
||||
# - name: Compile Editor Tests {{SQL CARBON EDIT}} Remove Monaco checks
|
||||
# working-directory: ./test/monaco
|
||||
# run: yarn run compile
|
||||
|
||||
# - name: Download Playwright {{SQL CARBON EDIT}} Remove Monaco checks
|
||||
# run: yarn playwright-install
|
||||
|
||||
# - name: Run Editor Tests {{SQL CARBON EDIT}} Remove Monaco checks
|
||||
# timeout-minutes: 5
|
||||
# working-directory: ./test/monaco
|
||||
# run: yarn test
|
||||
# monaco:
|
||||
# runs-on: ubuntu-latest
|
||||
# env:
|
||||
# CHILD_CONCURRENCY: "1"
|
||||
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
# steps:
|
||||
# - uses: actions/checkout@v1
|
||||
# # TODO: rename azure-pipelines/linux/xvfb.init to github-actions
|
||||
# - run: |
|
||||
# sudo apt-get update
|
||||
# sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libgbm1
|
||||
# sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
|
||||
# sudo chmod +x /etc/init.d/xvfb
|
||||
# sudo update-rc.d xvfb defaults
|
||||
# sudo service xvfb start
|
||||
# name: Setup Build Environment
|
||||
# - uses: actions/setup-node@v1
|
||||
# with:
|
||||
# node-version: 10
|
||||
# - run: yarn --frozen-lockfile
|
||||
# name: Install Dependencies
|
||||
# - run: yarn monaco-compile-check
|
||||
# name: Run Monaco Editor Checks
|
||||
# - run: yarn gulp editor-esm-bundle
|
||||
# name: Editor Distro & ESM Bundle
|
||||
|
||||
24 .github/workflows/on-issue-open.yml vendored Normal file
@@ -0,0 +1,24 @@
name: On Issue Open
on:
  issues:
    types: [opened]

jobs:
  main:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Actions
        uses: actions/checkout@v2
        with:
          repository: 'microsoft/azuredatastudio'
          ref: master
          path: ./actions
      - name: Install Actions
        run: npm install --production --prefix ./actions/build/actions

      - name: Run CopyCat
        uses: ./actions/build/actions/copycat
        with:
          token: ${{secrets.TRIAGE_PAT}}
          owner: anthonydresser
          repo: testissues
15 .github/workflows/on-label.yml vendored
@@ -1,15 +0,0 @@
name: On Label
on:
  issues:
    types: [labeled]

jobs:
  processLabelAction:
    name: Process Label Action
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Process Label Action
        uses: hramos/label-actions@v1
        with:
          repo-token: ${{ secrets.GITHUB_TOKEN }}
2 .github/workflows/on-pr-open.yml vendored
@@ -12,7 +12,7 @@ jobs:
        uses: actions/checkout@v2
        with:
          repository: 'microsoft/azuredatastudio'
          ref: main
          ref: master
          path: ./actions
      - name: Install Actions
        run: npm install --production --prefix ./actions/build/actions
21 .gitignore vendored
@@ -5,8 +5,25 @@ Thumbs.db
node_modules/
.build/
extensions/**/dist/
/out*/
/extensions/**/out/
out/
out-build/
out-editor/
out-editor-src/
out-editor-build/
out-editor-esm/
out-editor-esm-bundle/
out-editor-min/
out-monaco-editor-core/
out-vscode/
out-vscode-min/
out-vscode-reh/
out-vscode-reh-min/
out-vscode-reh-pkg/
out-vscode-reh-web/
out-vscode-reh-web-min/
out-vscode-reh-web-pkg/
out-vscode-web/
out-vscode-web-min/
src/vs/server
resources/server
build/node_modules
3 .vscode/extensions.json vendored
@@ -3,6 +3,7 @@
    // for the documentation about the extensions.json format
    "recommendations": [
        "dbaeumer.vscode-eslint",
        "EditorConfig.EditorConfig"
        "EditorConfig.EditorConfig",
        "msjsdiag.debugger-for-chrome"
    ]
}
264 .vscode/launch.json vendored
@@ -19,15 +19,16 @@
|
||||
"timeout": 30000,
|
||||
"port": 5870,
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js",
|
||||
"${workspaceFolder}/extensions/*/out/**/*.js"
|
||||
]
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
],
|
||||
"presentation": {
|
||||
"hidden": true
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "pwa-chrome",
|
||||
"request": "attach",
|
||||
"name": "Attach to Shared Process",
|
||||
"timeout": 30000,
|
||||
"port": 9222,
|
||||
"urlFilter": "*sharedProcess.html*",
|
||||
"presentation": {
|
||||
@@ -41,7 +42,10 @@
|
||||
"port": 5876,
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
]
|
||||
],
|
||||
"presentation": {
|
||||
"hidden": true,
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "node",
|
||||
@@ -56,7 +60,6 @@
|
||||
"type": "node",
|
||||
"request": "attach",
|
||||
"name": "Attach to Main Process",
|
||||
"timeout": 30000,
|
||||
"port": 5875,
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
@@ -66,147 +69,11 @@
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "extensionHost",
|
||||
"request": "launch",
|
||||
"name": "VS Code Emmet Tests",
|
||||
"runtimeExecutable": "${execPath}",
|
||||
"args": [
|
||||
"${workspaceFolder}/extensions/emmet/test-fixtures",
|
||||
"--extensionDevelopmentPath=${workspaceFolder}/extensions/emmet",
|
||||
"--extensionTestsPath=${workspaceFolder}/extensions/emmet/out/test"
|
||||
],
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
],
|
||||
"presentation": {
|
||||
"group": "5_tests",
|
||||
"order": 6
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "extensionHost",
|
||||
"request": "launch",
|
||||
"name": "VS Code Git Tests",
|
||||
"runtimeExecutable": "${execPath}",
|
||||
"args": [
|
||||
"/tmp/my4g9l",
|
||||
"--extensionDevelopmentPath=${workspaceFolder}/extensions/git",
|
||||
"--extensionTestsPath=${workspaceFolder}/extensions/git/out/test"
|
||||
],
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/extensions/git/out/**/*.js"
|
||||
],
|
||||
"presentation": {
|
||||
"group": "5_tests",
|
||||
"order": 6
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "extensionHost",
|
||||
"request": "launch",
|
||||
"name": "VS Code API Tests (single folder)",
|
||||
"runtimeExecutable": "${execPath}",
|
||||
"args": [
|
||||
// "${workspaceFolder}", // Uncomment for running out of sources.
|
||||
"${workspaceFolder}/extensions/vscode-api-tests/testWorkspace",
|
||||
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-api-tests",
|
||||
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-api-tests/out/singlefolder-tests",
|
||||
"--disable-extensions"
|
||||
],
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
],
|
||||
"presentation": {
|
||||
"group": "5_tests",
|
||||
"order": 3
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "extensionHost",
|
||||
"request": "launch",
|
||||
"name": "VS Code API Tests (workspace)",
|
||||
"runtimeExecutable": "${execPath}",
|
||||
"args": [
|
||||
"${workspaceFolder}/extensions/vscode-api-tests/testworkspace.code-workspace",
|
||||
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-api-tests",
|
||||
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-api-tests/out/workspace-tests"
|
||||
],
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
],
|
||||
"presentation": {
|
||||
"group": "5_tests",
|
||||
"order": 4
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "extensionHost",
|
||||
"request": "launch",
|
||||
"name": "VS Code Tokenizer Tests",
|
||||
"runtimeExecutable": "${execPath}",
|
||||
"args": [
|
||||
"${workspaceFolder}/extensions/vscode-colorize-tests/test",
|
||||
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-colorize-tests",
|
||||
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-colorize-tests/out"
|
||||
],
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
],
|
||||
"presentation": {
|
||||
"group": "5_tests",
|
||||
"order": 5
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "extensionHost",
|
||||
"request": "launch",
|
||||
"name": "VS Code Notebook Tests",
|
||||
"runtimeExecutable": "${execPath}",
|
||||
"args": [
|
||||
"${workspaceFolder}/extensions/vscode-notebook-tests/test",
|
||||
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-notebook-tests",
|
||||
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-notebook-tests/out"
|
||||
],
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
],
|
||||
"presentation": {
|
||||
"group": "5_tests",
|
||||
"order": 6
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "extensionHost",
|
||||
"request": "launch",
|
||||
"name": "VS Code Custom Editor Tests",
|
||||
"runtimeExecutable": "${execPath}",
|
||||
"args": [
|
||||
"${workspaceFolder}/extensions/vscode-custom-editor-tests/test-workspace",
|
||||
"--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-custom-editor-tests",
|
||||
"--extensionTestsPath=${workspaceFolder}/extensions/vscode-custom-editor-tests/out/test"
|
||||
],
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
],
|
||||
"presentation": {
|
||||
"group": "5_tests",
|
||||
"order": 6
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "pwa-chrome",
|
||||
"type": "chrome",
|
||||
"request": "attach",
|
||||
"name": "Attach to azuredatastudio",
|
||||
"browserAttachLocation": "workspace",
|
||||
"port": 9222,
|
||||
"trace": true,
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
],
|
||||
"resolveSourceMapLocations": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
],
|
||||
"perScriptSourcemaps": "yes"
|
||||
"timeout": 50000,
|
||||
"port": 9222
|
||||
},
|
||||
{
|
||||
"type": "pwa-chrome",
|
||||
@@ -224,19 +91,16 @@
|
||||
"port": 9222,
|
||||
"timeout": 20000,
|
||||
"env": {
|
||||
"VSCODE_EXTHOST_WILL_SEND_SOCKET": null,
|
||||
"VSCODE_SKIP_PRELAUNCH": "1"
|
||||
"VSCODE_EXTHOST_WILL_SEND_SOCKET": null
|
||||
},
|
||||
"cleanUp": "wholeBrowser",
|
||||
"breakOnLoad": false,
|
||||
"urlFilter": "*workbench.html*",
|
||||
"runtimeArgs": [
|
||||
"--inspect=5875",
|
||||
"--no-cached-data",
|
||||
],
|
||||
"webRoot": "${workspaceFolder}",
|
||||
"cascadeTerminateToConfigurations": [
|
||||
"Attach to Extension Host"
|
||||
],
|
||||
// Settings for js-debug:
|
||||
"userDataDir": false,
|
||||
"pauseForSourceMap": false,
|
||||
"outFiles": [
|
||||
@@ -245,10 +109,37 @@
|
||||
"browserLaunchLocation": "workspace"
|
||||
},
|
||||
{
|
||||
"type": "node",
|
||||
"type": "chrome",
|
||||
"request": "launch",
|
||||
"name": "Launch azuredatastudio with new notebook command",
|
||||
"windows": {
|
||||
"runtimeExecutable": "${workspaceFolder}/scripts/sql.bat"
|
||||
},
|
||||
"osx": {
|
||||
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
|
||||
},
|
||||
"linux": {
|
||||
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
|
||||
},
|
||||
"urlFilter": "*index.html*",
|
||||
"runtimeArgs": [
|
||||
"--inspect=5875",
|
||||
"--command=notebook.command.new"
|
||||
],
|
||||
"skipFiles": [
|
||||
"**/winjs*.js"
|
||||
],
|
||||
"webRoot": "${workspaceFolder}",
|
||||
"timeout": 45000
|
||||
},
|
||||
{
|
||||
"type": "chrome",
|
||||
"request": "launch",
|
||||
"name": "Launch ADS (Web) (TBD)",
|
||||
"program": "${workspaceFolder}/resources/web/code-web.js",
|
||||
"runtimeExecutable": "yarn",
|
||||
"runtimeArgs": [
|
||||
"web"
|
||||
],
|
||||
"presentation": {
|
||||
"group": "0_vscode",
|
||||
"order": 2
|
||||
@@ -274,11 +165,9 @@
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "pwa-chrome",
|
||||
"type": "chrome",
|
||||
"request": "launch",
|
||||
"outFiles": [],
|
||||
"perScriptSourcemaps": "yes",
|
||||
"name": "VS Code (Web, Chrome)",
|
||||
"name": "Launch ADS (Web, Chrome) (TBD)",
|
||||
"url": "http://localhost:8080",
|
||||
"preLaunchTask": "Run web",
|
||||
"presentation": {
|
||||
@@ -286,20 +175,6 @@
|
||||
"order": 3
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "pwa-msedge",
|
||||
"request": "launch",
|
||||
"outFiles": [],
|
||||
"perScriptSourcemaps": "yes",
|
||||
"name": "VS Code (Web, Edge)",
|
||||
"url": "http://localhost:8080",
|
||||
"pauseForSourceMap": false,
|
||||
"preLaunchTask": "Run web",
|
||||
"presentation": {
|
||||
"group": "0_vscode",
|
||||
"order": 3
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "node",
|
||||
"request": "launch",
|
||||
@@ -330,7 +205,7 @@
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "pwa-node",
|
||||
"type": "node",
|
||||
"request": "launch",
|
||||
"name": "Run Unit Tests",
|
||||
"program": "${workspaceFolder}/test/unit/electron/index.js",
|
||||
@@ -349,9 +224,6 @@
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
],
|
||||
"cascadeTerminateToConfigurations": [
|
||||
"Attach to azuredatastudio"
|
||||
],
|
||||
"env": {
|
||||
"MOCHA_COLORS": "true"
|
||||
},
|
||||
@@ -359,35 +231,6 @@
|
||||
"hidden": true
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "pwa-node",
|
||||
"request": "launch",
|
||||
"name": "Run Unit Tests For Current File",
|
||||
"program": "${workspaceFolder}/test/unit/electron/index.js",
|
||||
"runtimeExecutable": "${workspaceFolder}/.build/electron/Azure Data Studio.app/Contents/MacOS/Electron",
|
||||
"windows": {
|
||||
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio.exe"
|
||||
},
|
||||
"linux": {
|
||||
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio"
|
||||
},
|
||||
"cascadeTerminateToConfigurations": [
|
||||
"Attach to azuredatastudio"
|
||||
],
|
||||
"outputCapture": "std",
|
||||
"args": [
|
||||
"--remote-debugging-port=9222",
|
||||
"--run",
|
||||
"${relativeFile}"
|
||||
],
|
||||
"cwd": "${workspaceFolder}",
|
||||
"outFiles": [
|
||||
"${workspaceFolder}/out/**/*.js"
|
||||
],
|
||||
"env": {
|
||||
"MOCHA_COLORS": "true"
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "chrome",
|
||||
"request": "launch",
|
||||
@@ -438,14 +281,12 @@
|
||||
},
|
||||
{
|
||||
"name": "Azure Data Studio",
|
||||
"stopAll": true,
|
||||
"configurations": [
|
||||
"Launch azuredatastudio",
|
||||
"Attach to Main Process",
|
||||
"Attach to Extension Host",
|
||||
"Attach to Shared Process",
|
||||
],
|
||||
"preLaunchTask": "Ensure Prelaunch Dependencies",
|
||||
"presentation": {
|
||||
"group": "0_vscode",
|
||||
"order": 1
|
||||
@@ -484,17 +325,6 @@
|
||||
"group": "1_vscode",
|
||||
"order": 2
|
||||
}
|
||||
},
|
||||
{
|
||||
"name": "Debug Unit Tests (Current File)",
|
||||
"configurations": [
|
||||
"Attach to azuredatastudio",
|
||||
"Run Unit Tests For Current File"
|
||||
],
|
||||
"presentation": {
|
||||
"group": "1_vscode",
|
||||
"order": 2
|
||||
}
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
38 .vscode/notebooks/api.github-issues vendored
@@ -1,38 +0,0 @@
[
  {
    "kind": 1,
    "language": "markdown",
    "value": "#### Config",
    "editable": true
  },
  {
    "kind": 2,
    "language": "github-issues",
    "value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"April 2021\"",
    "editable": true
  },
  {
    "kind": 1,
    "language": "markdown",
    "value": "### Finalization",
    "editable": true
  },
  {
    "kind": 2,
    "language": "github-issues",
    "value": "$repo $milestone label:api-finalization",
    "editable": true
  },
  {
    "kind": 1,
    "language": "markdown",
    "value": "### Proposals",
    "editable": true
  },
  {
    "kind": 2,
    "language": "github-issues",
    "value": "$repo $milestone is:open label:api-proposal ",
    "editable": true
  }
]
107 .vscode/notebooks/endgame.github-issues vendored
@@ -1,107 +0,0 @@
|
||||
[
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "#### Macros"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-remotehub\n\n$MILESTONE=milestone:\"April 2021\""
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# Preparation"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Open Pull Requests on the Milestone"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE is:pr is:open"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Open Issues on the Milestone"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Feature Requests Missing Labels"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# Testing"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Test Plan Items"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE is:issue is:open label:testplan-item"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Verification Needed"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE is:issue is:closed label:feature-request label:verification-needed -label:verified"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# Verification"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Verifiable Fixes"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE is:issue is:closed sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found -label:z-author-verified -label:unreleased"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Unreleased Fixes"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE is:issue is:closed sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found -label:z-author-verified label:unreleased"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# Candidates"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE is:open label:candidate"
|
||||
}
|
||||
]
|
||||
770  .vscode/notebooks/grooming-delta.github-issues  (vendored)
@@ -1,770 +0,0 @@
|
||||
[
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Config",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$since=2020-10-01",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode\n\nQuery exceeds the maximum result. Run the query manually: `is:issue is:open closed:>2020-10-01`",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "//repo:microsoft/vscode is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "//repo:microsoft/vscode is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-remote-release",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-remote-release is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-remote-release is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# monaco-editor",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-editor is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-editor is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-docs",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-docs is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-docs is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-js-debug",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-js-debug is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-js-debug is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# language-server-protocol",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/language-server-protocol is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/language-server-protocol is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-eslint",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-eslint is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-eslint is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-css-languageservice",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-css-languageservice is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-css-languageservice is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-test",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-test is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-test is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-pull-request-github",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-pull-request-github is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-test is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-chrome-debug (deprecated)",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-chrome-debug is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-chrome-debug is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-chrome-debug-core",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-chrome-debug-core is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-chrome-debug-core is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-debugadapter-node",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-debugadapter-node is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-debugadapter-node is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-emmet-helper",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-emmet-helper is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-emmet-helper is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-extension-vscode\n\nDeprecated",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-extension-vscode is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-extension-vscode is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-extension-samples",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-extension-samples is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-extension-samples is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-filewatcher-windows",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-filewatcher-windows is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-filewatcher-windows is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-generator-code",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-generator-code is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-generator-code is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-html-languageservice",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-html-languageservice is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-html-languageservice is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-jshint",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-jshint is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-jshint is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-json-languageservice",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-json-languageservice is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-json-languageservice is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-languageserver-node",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-languageserver-node is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-languageserver-node is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-loader",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-loader is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-loader is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-mono-debug",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-mono-debug is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-mono-debug is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-node-debug",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-node-debug is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-node-debug is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-node-debug2",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-node-debug2 is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-node-debug2 is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-recipes",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-recipes is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-recipes is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-textmate",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-textmate is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-textmate is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-themes",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-themes is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-themes is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-vsce",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-vsce is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-vsce is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-website",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-website is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-website is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# vscode-windows-process-tree",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-windows-process-tree is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode-windows-process-tree is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# debug-adapter-protocol",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/debug-adapter-protocol is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/debug-adapter-protocol is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# inno-updater",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/inno-updater is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/inno-updater is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# language-server-protocol-inspector",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/language-server-protocol-inspector is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/language-server-protocol-inspector is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# monaco-languages",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-languages is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-languages is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# monaco-typescript",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-typescript is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-typescript is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# monaco-css",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-css is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-css is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# monaco-json",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-json is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-json is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# monaco-html",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-html is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-html is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# monaco-editor-webpack-plugin",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-editor-webpack-plugin is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/monaco-editor-webpack-plugin is:issue created:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# node-jsonc-parser",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/node-jsonc-parser is:issue closed:>$since",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/node-jsonc-parser is:issue created:>$since",
|
||||
"editable": true
|
||||
}
|
||||
]
|
||||
30  .vscode/notebooks/grooming.github-issues  (vendored)
@@ -1,30 +0,0 @@
|
||||
[
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "### Categorizing Issues\n\nEach issue must have a type label. Most type labels are grey, some are yellow. Bugs are grey with a touch of red.",
|
||||
"editable": true,
|
||||
"outputs": []
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode is:open is:issue assignee:@me -label:\"needs more info\" -label:bug -label:feature-request -label:under-discussion -label:debt -label:*question -label:upstream -label:electron -label:engineering -label:plan-item ",
|
||||
"editable": true,
|
||||
"outputs": []
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "### Feature Areas\n\nEach issue should be assigned to a feature area",
|
||||
"editable": true,
|
||||
"outputs": []
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode is:open is:issue assignee:@me -label:L10N -label:VIM -label:api -label:api-finalization -label:api-proposal -label:authentication -label:breadcrumbs -label:callhierarchy -label:code-lens -label:color-palette -label:comments -label:config -label:context-keys -label:css-less-scss -label:custom-editors -label:debug -label:debug-console -label:dialogs -label:diff-editor -label:dropdown -label:editor -label:editor-RTL -label:editor-autoclosing -label:editor-autoindent -label:editor-bracket-matching -label:editor-clipboard -label:editor-code-actions -label:editor-color-picker -label:editor-columnselect -label:editor-commands -label:editor-comments -label:editor-contrib -label:editor-core -label:editor-drag-and-drop -label:editor-error-widget -label:editor-find -label:editor-folding -label:editor-highlight -label:editor-hover -label:editor-indent-detection -label:editor-indent-guides -label:editor-input -label:editor-input-IME -label:editor-insets -label:editor-minimap -label:editor-multicursor -label:editor-parameter-hints -label:editor-render-whitespace -label:editor-rendering -label:editor-scrollbar -label:editor-symbols -label:editor-synced-region -label:editor-textbuffer -label:editor-theming -label:editor-wordnav -label:editor-wrapping -label:emmet -label:error-list -label:explorer-custom -label:extension-host -label:extension-recommendations -label:extensions -label:extensions-development -label:file-decorations -label:file-encoding -label:file-explorer -label:file-glob -label:file-guess-encoding -label:file-io -label:file-watcher -label:font-rendering -label:formatting -label:git -label:github -label:gpu -label:grammar -label:grid-view -label:html -label:i18n -label:icon-brand -label:icons-product -label:install-update -label:integrated-terminal -label:integrated-terminal-conpty -label:integrated-terminal-links -label:integrated-terminal-rendering -label:integrated-terminal-winpty -label:intellisense-config -label:ipc -label:issue-bot -label:issue-reporter -label:javascript -label:json -label:keybindings -label:keybindings-editor -label:keyboard-layout -label:label-provider -label:languages-basic -label:languages-diagnostics -label:languages-guessing -label:layout -label:lcd-text-rendering -label:list -label:log -label:markdown -label:marketplace -label:menus -label:merge-conflict -label:notebook -label:outline -label:output -label:perf -label:perf-bloat -label:perf-startup -label:php -label:portable-mode -label:proxy -label:quick-pick -label:references-viewlet -label:release-notes -label:remote -label:remote-explorer -label:rename -label:sandbox -label:scm -label:screencast-mode -label:search -label:search-api -label:search-editor -label:search-replace -label:semantic-tokens -label:settings-editor -label:settings-sync -label:settings-sync-server -label:shared-process -label:simple-file-dialog -label:smart-select -label:snap -label:snippets -label:splitview -label:suggest -label:sync-error-handling -label:tasks -label:telemetry -label:themes -label:timeline -label:timeline-git -label:titlebar -label:tokenization -label:touch/pointer -label:trackpad/scroll -label:tree -label:typescript -label:undo-redo -label:uri -label:ux -label:variable-resolving -label:vscode-build -label:vscode-website -label:web -label:webview -label:workbench-actions -label:workbench-cli -label:workbench-diagnostics -label:workbench-dnd -label:workbench-editor-grid -label:workbench-editors -label:workbench-electron -label:workbench-feedback -label:workbench-history 
-label:workbench-hot-exit -label:workbench-hover -label:workbench-launch -label:workbench-link -label:workbench-multiroot -label:workbench-notifications -label:workbench-os-integration -label:workbench-rapid-render -label:workbench-run-as-admin -label:workbench-state -label:workbench-status -label:workbench-tabs -label:workbench-touchbar -label:workbench-views -label:workbench-welcome -label:workbench-window -label:workbench-zen -label:workspace-edit -label:workspace-symbols -label:zoom",
|
||||
"editable": true,
|
||||
"outputs": []
|
||||
}
|
||||
]
|
||||
50  .vscode/notebooks/inbox.github-issues  (vendored)
@@ -1,50 +0,0 @@
|
||||
[
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## tl;dr: Triage Inbox\n\nAll inbox issues but not those that need more information. These issues need to be triaged, e.g assigned to a user or ask for more information",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$inbox -label:\"needs more info\"",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "##### `Config`: defines the inbox query",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$inbox=repo:microsoft/vscode is:open no:assignee -label:feature-request -label:testplan-item -label:plan-item ",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Inbox tracking and Issue triage",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "New issues or pull requests submitted by the community are initially triaged by an [automatic classification bot](https://github.com/microsoft/vscode-github-triage-actions/tree/master/classifier-deep). Issues that the bot does not correctly triage are then triaged by a team member. The team rotates the inbox tracker on a weekly basis.\n\nA [mirror](https://github.com/JacksonKearl/testissues/issues) of the VS Code issue stream is available with details about how the bot classifies issues, including feature-area classifications and confidence ratings. Per-category confidence thresholds and feature-area ownership data is maintained in [.github/classifier.json](https://github.com/microsoft/vscode/blob/main/.github/classifier.json). \n\n💡 The bot is being run through a GitHub action that runs every 30 minutes. Give the bot the opportunity to classify an issue before doing it manually.\n\n### Inbox Tracking\n\nThe inbox tracker is responsible for the [global inbox](https://github.com/microsoft/vscode/issues?utf8=%E2%9C%93&q=is%3Aopen+no%3Aassignee+-label%3Afeature-request+-label%3Atestplan-item+-label%3Aplan-item) containing all **open issues and pull requests** that\n- are neither **feature requests** nor **test plan items** nor **plan items** and\n- have **no owner assignment**.\n\nThe **inbox tracker** may perform any step described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) but its main responsibility is to route issues to the actual feature area owner.\n\nFeature area owners track the **feature area inbox** containing all **open issues and pull requests** that\n- are personally assigned to them and are not assigned to any milestone\n- are labeled with their feature area label and are not assigned to any milestone.\nThis secondary triage may involve any of the steps described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) and results in a fully triaged or closed issue.\n\nThe [github triage extension](https://github.com/microsoft/vscode-github-triage-extension) can be used to assist with triaging — it provides a \"Command Palette\"-style list of triaging actions like assignment, labeling, and triggers for various bot actions.",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## All Inbox Items\n\nAll issues that have no assignee and that have neither **feature requests** nor **test plan items** nor **plan items**.",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$inbox",
|
||||
"editable": true
|
||||
}
|
||||
]
|
||||
182  .vscode/notebooks/my-endgame.github-issues  (vendored)
@@ -1,182 +0,0 @@
|
||||
[
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "#### Macros"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remotehub\n\n$MILESTONE=milestone:\"April 2021\"\n\n$MINE=assignee:@me"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# Preparation"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Open Pull Requests on the Milestone"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE $MINE is:pr is:open"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Open Issues on the Milestone"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE $MINE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Feature Requests Missing Labels"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Test Plan Items"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE is:issue is:open author:@me label:testplan-item"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Verification Needed"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request label:verification-needed"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# Testing"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Test Plan Items"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:testplan-item"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Verification Needed"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -assignee:@me -label:verified -label:z-author-verified label:feature-request label:verification-needed"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# Fixing"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Open Issues"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE $MINE is:issue is:open -label:endgame-plan -label:testplan-item -label:iteration-plan"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Open Bugs"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# Verification"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## My Issues (verification-steps-needed)"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE $MINE is:issue label:bug label:verification-steps-needed"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## My Issues (verification-found)"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE $MINE is:issue label:bug label:verification-found"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Issues filed by me"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed author:@me sort:updated-asc label:bug -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Issues filed from outside team"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed sort:updated-asc label:bug -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found -author:aeschli -author:alexdima -author:alexr00 -author:AmandaSilver -author:bamurtaugh -author:bpasero -author:btholt -author:chrisdias -author:chrmarti -author:Chuxel -author:connor4312 -author:dbaeumer -author:deepak1556 -author:devinvalenciano -author:digitarald -author:eamodio -author:egamma -author:fiveisprime -author:gregvanl -author:isidorn -author:ItalyPaleAle -author:JacksonKearl -author:joaomoreno -author:jrieken -author:kieferrm -author:lszomoru -author:meganrogge -author:misolori -author:mjbvz -author:ornellaalt -author:orta -author:rebornix -author:RMacfarlane -author:roblourens -author:rzhao271 -author:sana-ajani -author:sandy081 -author:sbatten -author:stevencl -author:Tyriar -author:weinand -author:TylerLeonhardt -author:lramos15"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Issues filed by others"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -author:@me sort:updated-asc label:bug -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found"
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "# Release Notes"
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode $MILESTONE $MINE is:issue is:closed label:feature-request -label:on-release-notes"
|
||||
}
|
||||
]
|
||||
116  .vscode/notebooks/my-work.github-issues  (vendored)
@@ -1,116 +0,0 @@
|
||||
[
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "##### `Config`: This should be changed every month/milestone",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-internalbacklog\n\n// current milestone name\n$milestone=milestone:\"April 2021\"",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "github-issues",
|
||||
"value": "## Milestone Work",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos $milestone assignee:@me is:open",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "github-issues",
|
||||
"value": "## Bugs, Debt, Features...",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "#### My Bugs",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos assignee:@me is:open label:bug",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "#### Debt & Engineering",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos assignee:@me is:open label:debt OR $repos assignee:@me is:open label:engineering",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "#### Performance 🐌 🔜 🏎",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos assignee:@me is:open label:perf OR $repos assignee:@me is:open label:perf-startup OR $repos assignee:@me is:open label:perf-bloat OR $repos assignee:@me is:open label:freeze-slow-crash-leak",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "#### Feature Requests",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos assignee:@me is:open label:feature-request milestone:Backlog sort:reactions-+1-desc",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos assignee:@me is:open milestone:\"Backlog Candidates\"",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "### Personal Inbox\n",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "\n#### Missing Type label",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos assignee:@me is:open type:issue -label:bug -label:\"needs more info\" -label:feature-request -label:under-discussion -label:debt -label:plan-item -label:upstream",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "#### Not Actionable",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos assignee:@me is:open label:\"needs more info\"",
|
||||
"editable": true
|
||||
}
|
||||
]
|
||||
44  .vscode/notebooks/papercuts.github-issues  (vendored)
@@ -1,44 +0,0 @@
|
||||
[
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Papercuts\n\nThis notebook serves as an ongoing collection of papercut issues that we encounter while dogfooding. With that in mind only promote issues that really turn you off, e.g. issues that make you want to stop using VS Code or its extensions. To mark an issue (bug, feature-request, etc.) as papercut add the labels: `papercut :drop_of_blood:`",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## All Papercuts\n\nThese are all papercut issues that we encounter while dogfooding vscode or extensions that we author.",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode is:open -label:notebook label:\"papercut :drop_of_blood:\"",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "## Native Notebook",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode is:open label:notebook label:\"papercut :drop_of_blood:\"",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "### My Papercuts",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "repo:microsoft/vscode is:open assignee:@me label:\"papercut :drop_of_blood:\"",
|
||||
"editable": true
|
||||
}
|
||||
]
|
||||
56  .vscode/notebooks/verification.github-issues  (vendored)
@@ -1,56 +0,0 @@
|
||||
[
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "### Bug Verification Queries\n\nBefore shipping we want to verify _all_ bugs. That means when a bug is fixed we check that the fix actually works. It's always best to start with bugs that you have filed and the proceed with bugs that have been filed from users outside the development team. ",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "#### Config: update list of `repos` and the `milestone`",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"March 2021\"",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "### Bugs You Filed",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate author:@me",
|
||||
"editable": false
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "### Bugs From Outside",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate -author:@me -assignee:@me label:bug -label:verified -author:@me -author:aeschli -author:alexdima -author:alexr00 -author:bpasero -author:chrisdias -author:chrmarti -author:connor4312 -author:dbaeumer -author:deepak1556 -author:eamodio -author:egamma -author:gregvanl -author:isidorn -author:JacksonKearl -author:joaomoreno -author:jrieken -author:lramos15 -author:lszomoru -author:meganrogge -author:misolori -author:mjbvz -author:rebornix -author:RMacfarlane -author:roblourens -author:sana-ajani -author:sandy081 -author:sbatten -author:Tyriar -author:weinand -author:rzhao271 -author:kieferrm -author:TylerLeonhardt -author:bamurtaugh",
|
||||
"editable": false
|
||||
},
|
||||
{
|
||||
"kind": 1,
|
||||
"language": "markdown",
|
||||
"value": "### All",
|
||||
"editable": true
|
||||
},
|
||||
{
|
||||
"kind": 2,
|
||||
"language": "github-issues",
|
||||
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate",
|
||||
"editable": false
|
||||
}
|
||||
]
|
||||
101  .vscode/searches/TrustedTypes.code-search  (vendored)
@@ -1,101 +0,0 @@
|
||||
# Query: .innerHTML =
|
||||
# Flags: CaseSensitive WordMatch
|
||||
# Including: src/vs/**/*.{t,j}s
|
||||
# Excluding: *.test.ts, **/test/**
|
||||
# ContextLines: 3
|
||||
|
||||
12 results - 9 files
|
||||
|
||||
src/vs/base/browser/dom.ts:
|
||||
1359 );
|
||||
1360
|
||||
1361 const html = _ttpSafeInnerHtml?.createHTML(value, options) ?? insane(value, options);
|
||||
1362: node.innerHTML = html as unknown as string;
|
||||
1363 }
|
||||
|
||||
src/vs/base/browser/markdownRenderer.ts:
|
||||
272 };
|
||||
273
|
||||
274 if (_ttpInsane) {
|
||||
275: element.innerHTML = _ttpInsane.createHTML(renderedMarkdown, insaneOptions) as unknown as string;
|
||||
276 } else {
|
||||
277: element.innerHTML = insane(renderedMarkdown, insaneOptions);
|
||||
278 }
|
||||
279
|
||||
280 // signal that async code blocks can be now be inserted
|
||||
|
||||
src/vs/editor/browser/core/markdownRenderer.ts:
|
||||
88
|
||||
89 const element = document.createElement('span');
|
||||
90
|
||||
91: element.innerHTML = MarkdownRenderer._ttpTokenizer
|
||||
92 ? MarkdownRenderer._ttpTokenizer.createHTML(value, tokenization) as unknown as string
|
||||
93 : tokenizeToString(value, tokenization);
|
||||
94
|
||||
|
||||
src/vs/editor/browser/view/domLineBreaksComputer.ts:
|
||||
107 allCharOffsets[i] = tmp[0];
|
||||
108 allVisibleColumns[i] = tmp[1];
|
||||
109 }
|
||||
110: containerDomNode.innerHTML = sb.build();
|
||||
111
|
||||
112 containerDomNode.style.position = 'absolute';
|
||||
113 containerDomNode.style.top = '10000';
|
||||
|
||||
src/vs/editor/browser/view/viewLayer.ts:
|
||||
512 }
|
||||
513 const lastChild = <HTMLElement>this.domNode.lastChild;
|
||||
514 if (domNodeIsEmpty || !lastChild) {
|
||||
515: this.domNode.innerHTML = newLinesHTML;
|
||||
516 } else {
|
||||
517 lastChild.insertAdjacentHTML('afterend', newLinesHTML);
|
||||
518 }
|
||||
|
||||
533 if (ViewLayerRenderer._ttPolicy) {
|
||||
534 invalidLinesHTML = ViewLayerRenderer._ttPolicy.createHTML(invalidLinesHTML) as unknown as string;
|
||||
535 }
|
||||
536: hugeDomNode.innerHTML = invalidLinesHTML;
|
||||
537
|
||||
538 for (let i = 0; i < ctx.linesLength; i++) {
|
||||
539 const line = ctx.lines[i];
|
||||
|
||||
src/vs/editor/browser/widget/diffEditorWidget.ts:
|
||||
2157
|
||||
2158 let domNode = document.createElement('div');
|
||||
2159 domNode.className = `view-lines line-delete ${MOUSE_CURSOR_TEXT_CSS_CLASS_NAME}`;
|
||||
2160: domNode.innerHTML = sb.build();
|
||||
2161 Configuration.applyFontInfoSlow(domNode, fontInfo);
|
||||
2162
|
||||
2163 let marginDomNode = document.createElement('div');
|
||||
2164 marginDomNode.className = 'inline-deleted-margin-view-zone';
|
||||
2165: marginDomNode.innerHTML = marginHTML.join('');
|
||||
2166 Configuration.applyFontInfoSlow(marginDomNode, fontInfo);
|
||||
2167
|
||||
2168 return {
|
||||
|
||||
src/vs/editor/standalone/browser/colorizer.ts:
|
||||
40 let text = domNode.firstChild ? domNode.firstChild.nodeValue : '';
|
||||
41 domNode.className += ' ' + theme;
|
||||
42 let render = (str: string) => {
|
||||
43: domNode.innerHTML = str;
|
||||
44 };
|
||||
45 return this.colorize(modeService, text || '', mimeType, options).then(render, (err) => console.error(err));
|
||||
46 }
|
||||
|
||||
src/vs/workbench/contrib/notebook/browser/view/renderers/cellRenderer.ts:
|
||||
580 const element = DOM.$('div', { style });
|
||||
581
|
||||
582 const linesHtml = this.getRichTextLinesAsHtml(model, modelRange, colorMap);
|
||||
583: element.innerHTML = linesHtml as unknown as string;
|
||||
584 return element;
|
||||
585 }
|
||||
586
|
||||
|
||||
src/vs/workbench/contrib/notebook/browser/view/renderers/webviewPreloads.ts:
|
||||
375 addMouseoverListeners(outputNode, outputId);
|
||||
376 const content = data.content;
|
||||
377 if (content.type === RenderOutputType.Html) {
|
||||
378: outputNode.innerHTML = content.htmlContent;
|
||||
379 cellOutputContainer.appendChild(outputNode);
|
||||
380 domEval(outputNode);
|
||||
381 } else if (preloadErrs.some(e => !!e)) {
|
||||
86  .vscode/searches/es6.code-search  (vendored, new file)
@@ -0,0 +1,86 @@
|
||||
# Query: @deprecated ES6
|
||||
# Flags: CaseSensitive WordMatch
|
||||
# ContextLines: 2
|
||||
|
||||
16 results - 5 files
|
||||
|
||||
src/vs/base/browser/dom.ts:
|
||||
81 };
|
||||
82
|
||||
83: /** @deprecated ES6 - use classList*/
|
||||
84 export const hasClass: (node: HTMLElement | SVGElement, className: string) => boolean = _classList.hasClass.bind(_classList);
|
||||
85: /** @deprecated ES6 - use classList*/
|
||||
86 export const addClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.addClass.bind(_classList);
|
||||
87: /** @deprecated ES6 - use classList*/
|
||||
88 export const addClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.addClasses.bind(_classList);
|
||||
89: /** @deprecated ES6 - use classList*/
|
||||
90 export const removeClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.removeClass.bind(_classList);
|
||||
91: /** @deprecated ES6 - use classList*/
|
||||
92 export const removeClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.removeClasses.bind(_classList);
|
||||
93: /** @deprecated ES6 - use classList*/
|
||||
94 export const toggleClass: (node: HTMLElement | SVGElement, className: string, shouldHaveIt?: boolean) => void = _classList.toggleClass.bind(_classList);
|
||||
95
|
||||
|
||||
src/vs/base/common/arrays.ts:
|
||||
401
|
||||
402 /**
|
||||
403: * @deprecated ES6: use `Array.findIndex`
|
||||
404 */
|
||||
405 export function firstIndex<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean): number {
|
||||
|
||||
417
|
||||
418 /**
|
||||
419: * @deprecated ES6: use `Array.find`
|
||||
420 */
|
||||
421 export function first<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean, notFoundValue: T): T;
|
||||
|
||||
560
|
||||
561 /**
|
||||
562: * @deprecated ES6: use `Array.find`
|
||||
563 */
|
||||
564 export function find<T>(arr: ArrayLike<T>, predicate: (value: T, index: number, arr: ArrayLike<T>) => any): T | undefined {
|
||||
|
||||
src/vs/base/common/map.ts:
|
||||
11
|
||||
12 /**
|
||||
13: * @deprecated ES6: use `[...SetOrMap.values()]`
|
||||
14 */
|
||||
15 export function values<V = any>(set: Set<V>): V[];
|
||||
|
||||
22
|
||||
23 /**
|
||||
24: * @deprecated ES6: use `[...map.keys()]`
|
||||
25 */
|
||||
26 export function keys<K, V>(map: Map<K, V>): K[] {
|
||||
|
||||
src/vs/base/common/objects.ts:
|
||||
115
|
||||
116 /**
|
||||
117: * @deprecated ES6
|
||||
118 */
|
||||
119 export function assign<T>(destination: T): T;
|
||||
|
||||
src/vs/base/common/strings.ts:
|
||||
15
|
||||
16 /**
|
||||
17: * @deprecated ES6: use `String.padStart`
|
||||
18 */
|
||||
19 export function pad(n: number, l: number, char: string = '0'): string {
|
||||
|
||||
146
|
||||
147 /**
|
||||
148: * @deprecated ES6: use `String.startsWith`
|
||||
149 */
|
||||
150 export function startsWith(haystack: string, needle: string): boolean {
|
||||
|
||||
167
|
||||
168 /**
|
||||
169: * @deprecated ES6: use `String.endsWith`
|
||||
170 */
|
||||
171 export function endsWith(haystack: string, needle: string): boolean {
|
||||
|
||||
853
|
||||
854 /**
|
||||
855: * @deprecated ES6
|
||||
856 */
|
||||
857 export function repeat(s: string, count: number): string {
|
||||
58  .vscode/searches/ts36031.code-search  (vendored)
@@ -2,52 +2,18 @@
# Flags: RegExp
# ContextLines: 2

8 results - 4 files
2 results - 2 files

src/vs/base/browser/ui/tree/asyncDataTree.ts:
241 } : () => 'treeitem',
242 isChecked: options.accessibilityProvider!.isChecked ? (e) => {
243: return !!(options.accessibilityProvider?.isChecked!(e.element as T));
244 } : undefined,
245 getAriaLabel(e) {
243 } : () => 'treeitem',
244 isChecked: options.accessibilityProvider!.isChecked ? (e) => {
245: return !!(options.accessibilityProvider?.isChecked!(e.element as T));
246 } : undefined,
247 getAriaLabel(e) {

src/vs/platform/list/browser/listService.ts:
463
464 if (typeof options?.openOnSingleClick !== 'boolean' && options?.configurationService) {
465: this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
466 this._register(options?.configurationService.onDidChangeConfiguration(() => {
467: this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
468 }));
469 } else {

src/vs/workbench/contrib/notebook/browser/notebookEditorWidget.ts:
1526
1527 await this._ensureActiveKernel();
1528: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, undefined);
1529 }
1530

1535
1536 await this._ensureActiveKernel();
1537: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, undefined);
1538 }
1539

1553
1554 await this._ensureActiveKernel();
1555: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1556 }
1557

1567
1568 await this._ensureActiveKernel();
1569: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1570 }
1571

src/vs/workbench/contrib/webview/electron-browser/iframeWebviewElement.ts:
89 .then(() => this._resourceRequestManager.ensureReady())
90 .then(() => {
91: this.element?.contentWindow!.postMessage({ channel, args: data }, '*');
92 });
93 }

src/vs/workbench/contrib/debug/browser/debugConfigurationManager.ts:
254
255 return debugDynamicExtensions.map(e => {
256: const type = e.contributes?.debuggers![0].type!;
257 return {
258 label: this.getDebuggerLabel(type)!,
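
The `ts36031.code-search` diff above tracks call sites that follow an optional chain with a non-null assertion (`foo?.bar!(…)`). A minimal sketch of the pattern and one possible rewrite, using made-up names rather than the actual classes involved:

```typescript
// Illustrative only: the interface and provider below are invented to mirror
// the `accessibilityProvider?.isChecked!(…)` call sites listed above.
interface AccessibilityProvider<T> {
	isChecked?(element: T): boolean;
}

declare const provider: AccessibilityProvider<string> | undefined;

// Tracked pattern (kept as a comment): an optional chain immediately followed
// by a non-null assertion on the optional member.
//
//     const checked = !!(provider?.isChecked!('item'));
//
// The `!` claims `isChecked` is always present, which contradicts the `?.`
// that was added because it might not be. Letting the optional chain
// short-circuit the call drops the assertion without changing the result:
const checked = !!(provider?.isChecked?.('item'));
```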
.vscode/settings.json (10 changed lines, vendored)
@@ -26,7 +26,6 @@
"test/automation/out/**": true,
"test/integration/browser/out/**": true,
"src/vs/base/test/node/uri.test.data.txt": true,
"src/vs/workbench/test/browser/api/extHostDocumentData.test.perf-data.ts": true,
"src/vs/server": false
},
"lcov.path": [
@@ -73,13 +72,8 @@
},
"gulp.autoDetect": "off",
"files.insertFinalNewline": true,
"[plaintext]": {
"files.insertFinalNewline": false,
},
"[typescript]": {
"[typescript]": {
"editor.defaultFormatter": "vscode.typescript-language-features"
},
"typescript.tsc.autoDetect": "off",
"notebook.experimental.useMarkdownRenderer": true,
"testing.autoRun.mode": "rerun",
"typescript.tsc.autoDetect": "off"
}
.vscode/tasks.json (204 changed lines, vendored)
@@ -3,140 +3,33 @@
|
||||
"tasks": [
|
||||
{
|
||||
"type": "npm",
|
||||
"script": "watch-clientd",
|
||||
"label": "Core - Build",
|
||||
"isBackground": true,
|
||||
"presentation": {
|
||||
"reveal": "never",
|
||||
"group": "buildWatchers"
|
||||
},
|
||||
"problemMatcher": {
|
||||
"owner": "typescript",
|
||||
"applyTo": "closedDocuments",
|
||||
"fileLocation": [
|
||||
"absolute"
|
||||
],
|
||||
"pattern": {
|
||||
"regexp": "Error: ([^(]+)\\((\\d+|\\d+,\\d+|\\d+,\\d+,\\d+,\\d+)\\): (.*)$",
|
||||
"file": 1,
|
||||
"location": 2,
|
||||
"message": 3
|
||||
},
|
||||
"background": {
|
||||
"beginsPattern": "Starting compilation",
|
||||
"endsPattern": "Finished compilation"
|
||||
}
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "npm",
|
||||
"script": "watch-extensionsd",
|
||||
"label": "Ext - Build",
|
||||
"isBackground": true,
|
||||
"presentation": {
|
||||
"reveal": "never",
|
||||
"group": "buildWatchers"
|
||||
},
|
||||
"problemMatcher": {
|
||||
"owner": "typescript",
|
||||
"applyTo": "closedDocuments",
|
||||
"fileLocation": [
|
||||
"absolute"
|
||||
],
|
||||
"pattern": {
|
||||
"regexp": "Error: ([^(]+)\\((\\d+|\\d+,\\d+|\\d+,\\d+,\\d+,\\d+)\\): (.*)$",
|
||||
"file": 1,
|
||||
"location": 2,
|
||||
"message": 3
|
||||
},
|
||||
"background": {
|
||||
"beginsPattern": "Starting compilation",
|
||||
"endsPattern": "Finished compilation"
|
||||
}
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "npm",
|
||||
"script": "watch-extension-mediad",
|
||||
"label": "Ext Media - Build",
|
||||
"isBackground": true,
|
||||
"presentation": {
|
||||
"reveal": "never",
|
||||
"group": "buildWatchers"
|
||||
},
|
||||
"problemMatcher": {
|
||||
"owner": "typescript",
|
||||
"applyTo": "closedDocuments",
|
||||
"fileLocation": [
|
||||
"absolute"
|
||||
],
|
||||
"pattern": {
|
||||
"regexp": "Error: ([^(]+)\\((\\d+|\\d+,\\d+|\\d+,\\d+,\\d+,\\d+)\\): (.*)$",
|
||||
"file": 1,
|
||||
"location": 2,
|
||||
"message": 3
|
||||
},
|
||||
"background": {
|
||||
"beginsPattern": "Starting compilation",
|
||||
"endsPattern": "Finished compilation"
|
||||
}
|
||||
}
|
||||
},
|
||||
{
|
||||
"label": "VS Code - Build",
|
||||
"dependsOn": [
|
||||
"Core - Build",
|
||||
"Ext - Build",
|
||||
"Ext Media - Build",
|
||||
],
|
||||
"script": "watchd",
|
||||
"label": "Build VS Code",
|
||||
"group": {
|
||||
"kind": "build",
|
||||
"isDefault": true
|
||||
},
|
||||
"problemMatcher": []
|
||||
},
|
||||
{
|
||||
"type": "npm",
|
||||
"script": "kill-watch-clientd",
|
||||
"label": "Kill Core - Build",
|
||||
"group": "build",
|
||||
"isBackground": true,
|
||||
"presentation": {
|
||||
"reveal": "never",
|
||||
"group": "buildKillers"
|
||||
"reveal": "never"
|
||||
},
|
||||
"problemMatcher": "$tsc"
|
||||
},
|
||||
{
|
||||
"type": "npm",
|
||||
"script": "kill-watch-extensionsd",
|
||||
"label": "Kill Ext - Build",
|
||||
"group": "build",
|
||||
"presentation": {
|
||||
"reveal": "never",
|
||||
"group": "buildKillers"
|
||||
},
|
||||
"problemMatcher": "$tsc"
|
||||
},
|
||||
{
|
||||
"type": "npm",
|
||||
"script": "kill-watch-extension-mediad",
|
||||
"label": "Kill Ext Media - Build",
|
||||
"group": "build",
|
||||
"presentation": {
|
||||
"reveal": "never",
|
||||
"group": "buildKillers"
|
||||
},
|
||||
"problemMatcher": "$tsc"
|
||||
},
|
||||
{
|
||||
"label": "Kill VS Code - Build",
|
||||
"dependsOn": [
|
||||
"Kill Core - Build",
|
||||
"Kill Ext - Build",
|
||||
"Kill Ext Media - Build",
|
||||
],
|
||||
"group": "build",
|
||||
"problemMatcher": []
|
||||
"problemMatcher": {
|
||||
"owner": "typescript",
|
||||
"applyTo": "closedDocuments",
|
||||
"fileLocation": [
|
||||
"absolute"
|
||||
],
|
||||
"pattern": {
|
||||
"regexp": "Error: ([^(]+)\\((\\d+|\\d+,\\d+|\\d+,\\d+,\\d+,\\d+)\\): (.*)$",
|
||||
"file": 1,
|
||||
"location": 2,
|
||||
"message": 3
|
||||
},
|
||||
"background": {
|
||||
"beginsPattern": "Starting compilation",
|
||||
"endsPattern": "Finished compilation"
|
||||
}
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "npm",
|
||||
@@ -154,35 +47,8 @@
|
||||
},
|
||||
{
|
||||
"type": "npm",
|
||||
"script": "watch-webd",
|
||||
"label": "Web Ext - Build",
|
||||
"group": "build",
|
||||
"isBackground": true,
|
||||
"presentation": {
|
||||
"reveal": "never"
|
||||
},
|
||||
"problemMatcher": {
|
||||
"owner": "typescript",
|
||||
"applyTo": "closedDocuments",
|
||||
"fileLocation": [
|
||||
"absolute"
|
||||
],
|
||||
"pattern": {
|
||||
"regexp": "Error: ([^(]+)\\((\\d+|\\d+,\\d+|\\d+,\\d+,\\d+,\\d+)\\): (.*)$",
|
||||
"file": 1,
|
||||
"location": 2,
|
||||
"message": 3
|
||||
},
|
||||
"background": {
|
||||
"beginsPattern": "Starting compilation",
|
||||
"endsPattern": "Finished compilation"
|
||||
}
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "npm",
|
||||
"script": "kill-watch-webd",
|
||||
"label": "Kill Web Ext - Build",
|
||||
"script": "kill-watchd",
|
||||
"label": "Kill Build VS Code",
|
||||
"group": "build",
|
||||
"presentation": {
|
||||
"reveal": "never"
|
||||
@@ -223,7 +89,7 @@
|
||||
},
|
||||
{
|
||||
"type": "shell",
|
||||
"command": "yarn web --no-launch",
|
||||
"command": "yarn web -- --no-launch",
|
||||
"label": "Run web",
|
||||
"isBackground": true,
|
||||
"problemMatcher": {
|
||||
@@ -246,28 +112,6 @@
|
||||
"source": "eslint",
|
||||
"base": "$eslint-stylish"
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "shell",
|
||||
"command": "node build/lib/preLaunch.js",
|
||||
"label": "Ensure Prelaunch Dependencies",
|
||||
"presentation": {
|
||||
"reveal": "silent"
|
||||
}
|
||||
},
|
||||
{
|
||||
"type": "npm",
|
||||
"script": "tsec-compile-check",
|
||||
"problemMatcher": [
|
||||
{
|
||||
"base": "$tsc",
|
||||
"applyTo": "allDocuments",
|
||||
"owner": "tsec"
|
||||
}
|
||||
],
|
||||
"group": "build",
|
||||
"label": "npm: tsec-compile-check",
|
||||
"detail": "node_modules/tsec/bin/tsec -p src/tsconfig.json --noEmit"
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
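
The watch tasks above all share the same TypeScript `problemMatcher`, whose `pattern.regexp` pulls the file, location, and message out of compiler output of the form `Error: <file>(<line>[,<col>]): <message>`, with capture groups 1, 2, and 3 mapped to `file`, `location`, and `message` in the task definition. A small sketch of that mapping; the sample output line is invented for illustration:

```typescript
// The regexp below is the problemMatcher pattern from the tasks above,
// written as a TypeScript regex literal instead of a JSON-escaped string.
const matcherPattern = /Error: ([^(]+)\((\d+|\d+,\d+|\d+,\d+,\d+,\d+)\): (.*)$/;

// Hypothetical watcher output line, made up for this sketch.
const sampleLine = "Error: src/vs/base/common/strings.ts(17,4): Unexpected token.";
const match = matcherPattern.exec(sampleLine);

if (match) {
	const [, file, location, message] = match;
	console.log({ file, location, message });
	// -> { file: 'src/vs/base/common/strings.ts', location: '17,4', message: 'Unexpected token.' }
}
```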
.yarnrc (4 changed lines)
@@ -1,3 +1,3 @@
disturl "https://electronjs.org/headers"
target "12.0.7"
disturl "https://atom.io/download/electron"
target "7.2.4"
runtime "electron"
CHANGELOG.md (239 changed lines)
@@ -1,242 +1,5 @@
|
||||
# Change Log
|
||||
|
||||
## Version 1.31.1
|
||||
* Release date: July 29, 2021
|
||||
* Release status: General Availability
|
||||
## Hotfix Release
|
||||
- Fix for [#16436 Database Connection Toolbar Missing](https://github.com/microsoft/azuredatastudio/issues/16436)
|
||||
|
||||
## Version 1.31.0
|
||||
* Release date: July 21, 2021
|
||||
* Release status: General Availability
|
||||
* New Notebook Features:
|
||||
* WYSIWYG link improvements
|
||||
* Extension Updates:
|
||||
* Import
|
||||
* SandDance
|
||||
* SQL Database Projects
|
||||
* Bug Fixes
|
||||
* Accessibility bug fixes
|
||||
|
||||
## Version 1.30.0
|
||||
* Release date: June 17, 2021
|
||||
* Release status: General Availability
|
||||
* New Notebook Features:
|
||||
* Show book's notebook TOC title in pinned notebooks view
|
||||
* Add new book icon
|
||||
* Update Python to 3.8.10
|
||||
* Query Editor Features:
|
||||
* Added a filtering/sorting feature for the query result grid in the query editor and notebooks; it can be invoked from the column headers. Note that this feature is only available when preview features are enabled
|
||||
* Added a status bar item to show summary of the selected cells if there are multiple numeric values
|
||||
* Extension Updates:
|
||||
* SQL Database Projects
|
||||
* Machine Learning
|
||||
* Bug Fixes
|
||||
* Fix WYSIWYG Table cell adding new line in table cell
|
||||
|
||||
## Version 1.29.0
|
||||
* Release date: May 19, 2021
|
||||
* Release status: General Availability
|
||||
* New Notebook Features:
|
||||
* Added a run with parameters option.
|
||||
* Extension Updates:
|
||||
* SQL Database Projects
|
||||
* Schema Compare
|
||||
* Bug Fixes
|
||||
|
||||
## Version 1.28.0
|
||||
* Release date: April 16, 2021
|
||||
* Release status: General Availability
|
||||
* New Notebook Features:
|
||||
* Added Add Notebook and Remove Notebook commands
|
||||
* Extension Updates:
|
||||
* SQL Database Projects
|
||||
* Schema Compare
|
||||
* Bug Fixes
|
||||
|
||||
## Version 1.27.0
|
||||
* Release date: March 17, 2021
|
||||
* Release status: General Availability
|
||||
* New Notebook Features:
|
||||
* Added create book dialog
|
||||
* Extension Updates:
|
||||
* Import
|
||||
* Dacpac
|
||||
* Machine Learning
|
||||
* SQL Assessment
|
||||
* Arc
|
||||
* SQL Database Projects
|
||||
* ASDE Deployment
|
||||
* Bug Fixes
|
||||
|
||||
## Version 1.26.1
|
||||
* Release date: February 25, 2021
|
||||
* Release status: General Availability
|
||||
* Fixes https://github.com/microsoft/azuredatastudio/issues/14382
|
||||
|
||||
## Version 1.26.0
|
||||
* Release date: February 22, 2021
|
||||
* Release status: General Availability
|
||||
* Added edit Jupyter book UI support
|
||||
* Improved Jupyter server start-up time by 50% on Windows
|
||||
* Extension Updates:
|
||||
* Azure Arc
|
||||
* PG dashboard enhancements
|
||||
* Multi-controller support
|
||||
* MIAA Dashboard will no longer prompt for SQL Server connection immediately upon opening
|
||||
* Azure Data CLI
|
||||
* Kusto
|
||||
* Machine Learning
|
||||
* Profiler
|
||||
* Server Reports
|
||||
* Schema Compare
|
||||
* SQL Server Dacpac
|
||||
* SQL Database Projects
|
||||
* Bug Fixes
|
||||
|
||||
## Version 1.25.3
|
||||
* Release date: February 10, 2021
|
||||
* Release status: General Availability
|
||||
* Update Electron to 9.4.3 to incorporate critical upstream fixes
|
||||
|
||||
## Version 1.25.2
|
||||
* Release date: January 22, 2021
|
||||
* Release status: General Availability
|
||||
* Fixes https://github.com/microsoft/azuredatastudio/issues/13899
|
||||
|
||||
## Version 1.25.1
|
||||
* Release date: December 10, 2020
|
||||
* Release status: General Availability
|
||||
* Fixes https://github.com/microsoft/azuredatastudio/issues/13751
|
||||
|
||||
## Version 1.25.0
|
||||
* Release date: December 8, 2020
|
||||
* Release status: General Availability
|
||||
* Kusto extension improvements
|
||||
* SQL Project extension improvements
|
||||
* Notebook improvements
|
||||
* Azure Browse Connections Preview performance improvements
|
||||
* Bug Fixes
|
||||
|
||||
## Version 1.24.0
|
||||
* Release date: November 12, 2020
|
||||
* Release status: General Availability
|
||||
* SQL Project improvements
|
||||
* Notebook improvements, including in WYSIWYG editor enhancements
|
||||
* Azure Arc improvements
|
||||
* Azure SQL Deployment UX improvements
|
||||
* Azure Browse Connections Preview
|
||||
* Bug Fixes
|
||||
|
||||
## Version 1.23.0
|
||||
* Release date: October 14, 2020
|
||||
* Release status: General Availability
|
||||
* Added deployments of Azure SQL DB and VM
|
||||
* Added PowerShell kernel results streaming support
|
||||
* Added improvements to SQL Database Projects extension
|
||||
* Bug Fixes
|
||||
* Extension Updates:
|
||||
* SQL Server Import
|
||||
* Machine Learning
|
||||
* Schema Compare
|
||||
* Kusto
|
||||
* SQL Assessment
|
||||
* SQL Database Projects
|
||||
* Azure Arc
|
||||
* azdata
|
||||
|
||||
## Version 1.22.1
|
||||
* Release date: September 30, 2020
|
||||
* Release status: General Availability
|
||||
* Fix bug #12615 Active connection filter doesn't untoggle | [#12615](https://github.com/microsoft/azuredatastudio/issues/12615)
|
||||
* Fix bug #12572 Edit Data grid doesn't escape special characters | [#12572](https://github.com/microsoft/azuredatastudio/issues/12572)
|
||||
* Fix bug #12570 Dashboard Explorer table doesn't escape special characters | [#12570](https://github.com/microsoft/azuredatastudio/issues/12570)
|
||||
* Fix bug #12582 Delete row on Edit Data fails | [#12582](https://github.com/microsoft/azuredatastudio/issues/12582)
|
||||
* Fix bug #12646 SQL Notebooks: Cells being treated isolated | [#12646](https://github.com/microsoft/azuredatastudio/issues/12646)
|
||||
|
||||
## Version 1.22.0
|
||||
* Release date: September 22, 2020
|
||||
* Release status: General Availability
|
||||
* New Notebook Features
|
||||
* Supports brand new text cell editing experience based on rich text formatting and seamless conversion to markdown, also known as WYSIWYG toolbar (What You See Is What You Get)
|
||||
* Supports Kusto kernel
|
||||
* Supports pinning of notebooks
|
||||
* Added support for new version of Jupyter Books
|
||||
* Improved Jupyter Shortcuts
|
||||
* Introduced perf loading improvements
|
||||
* Added Azure Arc extension - Users can try out Azure Arc public preview through Azure Data Studio. This includes:
|
||||
* Deploy data controller
|
||||
* Deploy Postgres
|
||||
* Deploy Managed Instance for Azure Arc
|
||||
* Connect to data controller
|
||||
* Access data service dashboards
|
||||
* Azure Arc Jupyter Book
|
||||
* Added new deployment options
|
||||
* Azure SQL Database Edge
|
||||
* (Edge will require Azure SQL Edge Deployment Extension)
|
||||
* Added SQL Database Projects extension - The SQL Database Projects extension brings project-based database development to Azure Data Studio. In this preview release, SQL projects can be created and published from Azure Data Studio.
|
||||
* Added Kusto (KQL) extension - Brings native Kusto experiences in Azure Data Studio for data exploration and data analytics against massive amount of real-time streaming data stored in Azure Data Explorer. This preview release supports connecting and browsing Azure Data Explorer clusters, writing KQL queries as well as authoring notebooks with Kusto kernel.
|
||||
* SQL Server Import extension GA - Announcing the GA of the SQL Server Import extension, features no longer in preview. This extension facilitates importing csv/txt files. Learn more about the extension in [this article](sql-server-import-extension.md).
|
||||
* Resolved [bugs and issues](https://github.com/microsoft/azuredatastudio/issues?q=is%3Aissue+milestone%3A%22September+2020+Release%22+is%3Aclosed).
|
||||
|
||||
## Version 1.21.0
|
||||
* Release date: August 12, 2020
|
||||
* Release status: General Availability
|
||||
* New Notebook Features
|
||||
* Move cell locations changed
|
||||
* Added action to convert cells to Text Cell or Code cell
|
||||
* Jupyter Books picker to open Jupyter Books directly from Github
|
||||
* Search bar added to Notebooks Viewlet for searching through Jupyter Books
|
||||
* Address issues in [August 2020 Milestone](https://github.com/microsoft/azuredatastudio/milestone/59?closed=1)
|
||||
|
||||
## Version 1.20.1
|
||||
* Release date: July 17, 2020
|
||||
* Release status: General Availability
|
||||
* Fix bug #11372 Object Explorer drag-and-drop table incorrectly wraps table names [#11372](https://github.com/microsoft/azuredatastudio/issues/11372)
|
||||
* Fix bug #11356 Dark theme is now the default theme [#11356](https://github.com/microsoft/azuredatastudio/issues/11356)
|
||||
* Known Issues:
|
||||
* Some users have reported connection errors from the new Microsoft.Data.SqlClient v2.0.0 included in this release. Users have found [following these instructions](https://github.com/microsoft/azuredatastudio/issues/11367#issuecomment-659614111) to successfully connect. This issue was caused by a client driver update which fixed an issue where TLS encryption wasn't enforced correctly. See https://github.com/dotnet/SqlClient/blob/master/release-notes/2.0/2.0.0.md#breaking-changes-1 and https://docs.microsoft.com/en-us/sql/relational-databases/native-client/features/using-encryption-without-validation for more information.
|
||||
|
||||
## Version 1.20.0
|
||||
* Release date: July 15, 2020
|
||||
* Release status: General Availability
|
||||
* Feature Tour
|
||||
* New Notebook Features
|
||||
* Header support in Markdown Toolbar
|
||||
* Side-by-side Markdown preview in Text Cells
|
||||
* Drag and drop columns and tables into Query Editor
|
||||
* Azure Account icon added to Activity Bar
|
||||
* Address issues in [July 2020 Milestone](https://github.com/microsoft/azuredatastudio/milestone/57?closed=1)
|
||||
* Bug fixes
|
||||
|
||||
## Version 1.19.0
|
||||
* Release date: June 15, 2020
|
||||
* Release status: General Availability
|
||||
* Address issues in https://github.com/microsoft/azuredatastudio/milestone/55?closed=1
|
||||
* Bug fixes
|
||||
|
||||
## Version 1.18.1
|
||||
* Release date: May 27, 2020
|
||||
* Release status: General Availability
|
||||
* Hotfix for https://github.com/microsoft/azuredatastudio/issues/10538
|
||||
* Hotfix for https://github.com/microsoft/azuredatastudio/issues/10537
|
||||
|
||||
## Version 1.18.0
|
||||
* Release date: May 20, 2020
|
||||
* Release status: General Availability
|
||||
* Announcing Redgate SQL Prompt extension - This extension lets you manage formatting styles directly within Azure Data Studio, so you can create and edit your styles without leaving the IDE.
|
||||
* Announcing the new machine learning extension. This extension enables you to:
|
||||
* Manage Python and R packages with SQL Server machine learning services with Azure Data Studio.
|
||||
* Use ONNX model to make predictions in Azure SQL Edge.
|
||||
* View ONNX models in an Azure SQL Edge database.
|
||||
* Import ONNX models from a file or Azure Machine Learning into Azure SQL Edge database.
|
||||
* Create a notebook to run experiments.
|
||||
* New notebook features:
|
||||
* Added new Python dependencies wizard
|
||||
* Improvements to the notebook markdown toolbar
|
||||
* Added support for parameterization for Always Encrypted - Allows you to run queries that insert, update or filter by encrypted database columns.
|
||||
* Bug fixes
|
||||
|
||||
## Version 1.17.1
|
||||
* Release date: April 29, 2020
|
||||
* Release status: General Availability
|
||||
@@ -621,7 +384,7 @@ The May release is focused on stabilization and bug fixes leading up to the Buil
|
||||
|
||||
* Announcing **Redgate SQL Search** extension available in Extension Manager
|
||||
* Community Localization available for 10 languages: **German, Spanish, French, Italian, Japanese, Korean, Portuguese, Russian, Simplified Chinese and Traditional Chinese!**
|
||||
* Reduced telemetry collection, improved [opt-out](https://github.com/Microsoft/azuredatastudio/wiki/How-to-Disable-Telemetry-Reporting) experience and in-product links to [Privacy Statement](https://privacy.microsoft.com/privacystatement)
|
||||
* Reduced telemetry collection, improved [opt-out](https://github.com/Microsoft/azuredatastudio/wiki/How-to-Disable-Telemetry-Reporting) experience and in-product links to [Privacy Statement](https://privacy.microsoft.com/en-us/privacystatement)
|
||||
* Extension Manager has improved Marketplace experience to easily discover community extensions
|
||||
* SQL Agent extension Jobs and Job History view improvement
|
||||
* Updates for **whoisactive** and **Server Reports** extensions
|
||||
|
||||
README.md (45 changed lines)
@@ -1,7 +1,7 @@
|
||||
# Azure Data Studio
|
||||
|
||||
[](https://gitter.im/Microsoft/sqlopsstudio?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
|
||||
[](https://dev.azure.com/azuredatastudio/azuredatastudio/_build/latest?definitionId=4&branchName=main)
|
||||
[](https://dev.azure.com/azuredatastudio/azuredatastudio/_build/latest?definitionId=4&branchName=master)
|
||||
[](https://twitter.com/azuredatastudio)
|
||||
|
||||
Azure Data Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux.
|
||||
@@ -19,18 +19,16 @@ Azure Data Studio is a data management tool that enables you to work with SQL Se
|
||||
| [Linux DEB][linux-deb] |
|
||||
|
||||
|
||||
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
|
||||
Go to our [download page](https://aka.ms/azuredatastudio) for more specific instructions.
|
||||
|
||||
## Try out the latest insiders build from `main`:
|
||||
## Try out the latest insiders build from `master`:
|
||||
- [Windows User Installer - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-user/insider)
|
||||
- [Windows System Installer - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64/insider)
|
||||
- [Windows ZIP - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-archive/insider)
|
||||
- [macOS ZIP - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/darwin/insider)
|
||||
- [Linux TAR.GZ - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/linux-x64/insider)
|
||||
|
||||
See the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) for additional details of what's in this release.
|
||||
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
|
||||
|
||||
See the [change log](https://github.com/Microsoft/azuredatastudio/blob/master/CHANGELOG.md) for additional details of what's in this release.
|
||||
|
||||
## **Feature Highlights**
|
||||
|
||||
@@ -49,7 +47,7 @@ Go to our [download page](https://aka.ms/getazuredatastudio) for more specific i
|
||||
|
||||
Here are some of these features in action.
|
||||
|
||||
<img src='https://github.com/Microsoft/azuredatastudio/blob/main/docs/overview_screen.jpg' width='800px'>
|
||||
<img src='https://github.com/Microsoft/azuredatastudio/blob/master/docs/overview_screen.jpg' width='800px'>
|
||||
|
||||
## Contributing
|
||||
If you are interested in fixing issues and contributing directly to the code base,
|
||||
@@ -62,10 +60,12 @@ please see the document [How to Contribute](https://github.com/Microsoft/azureda
|
||||
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
|
||||
|
||||
## Localization
|
||||
Azure Data Studio is localized into 10 languages: French, Italian, German, Spanish, Simplified Chinese, Traditional Chinese, Japanese, Korean, Russian, and Portuguese (Brazil). The language packs are available in the Extension Manager marketplace. Simply, search for the specific language using the extension marketplace and install. Once you install the selected language, Azure Data Studio will prompt you to restart with the new language.
|
||||
Azure Data Studio localization is now open for community contributions. You can contribute to localization for both software and docs. https://aka.ms/SQLOpsStudioLoc
|
||||
|
||||
Localization is now opened for 10 languages: French, Italian, German, Spanish, Simplified Chinese, Traditional Chinese, Japanese, Korean, Russian, and Portuguese (Brazil). Help us make Azure Data Studio available in your language!
|
||||
|
||||
## Privacy Statement
|
||||
The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.microsoft.com/privacystatement) describes the privacy statement of this software.
|
||||
The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.microsoft.com/en-us/privacystatement) describes the privacy statement of this software.
|
||||
|
||||
## Contributions and "Thank You"
|
||||
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:
|
||||
@@ -122,8 +122,19 @@ We would like to thank all our users who raised issues, and in particular the fo
|
||||
* SebastianPfliegel `Remove sqlExtensionHelp (#312)`
|
||||
* olljanat for `Implemented npm version check (#314)`
|
||||
* Adam Machanic for helping with the `whoisactive` extension
|
||||
* All community localization contributors:
|
||||
* French: Adrien Clerbois, ANAS BELABBES, Antoine Griffard, Arian Papillon, Eric Macarez, Eric Van Thorre, Jérémy LANDON, Matthias GROSPERRIN, Maxime COQUEREL, Olivier Guinart, thierry DEMAN-BARCELÒ, Thomas Potier
|
||||
* Italian: Aldo Donetti, Alessandro Alpi, Andrea Dottor, Bruni Luca, Gianluca Hotz, Luca Nardi, Luigi Bruno, Marco Dal Pino, Mirco Vanini, Pasquale Ceglie, Riccardo Cappello, Sergio Govoni, Stefano Demiliani
|
||||
* German: Anna Henke-Gunvaldson, Ben Weissman, David Ullmer, J.M. ., Kai Modo, Konstantin Staschill, Kostja Klein, Lennart Trunk, Markus Ehrenmüller-Jensen, Mascha Kroenlein, Matthias Knoll, Mourad Louha, Thomas Hütter, Wolfgang Straßer
|
||||
* Spanish: Alberto Poblacion, Andy Gonzalez, Carlos Mendible, Christian Araujo, Daniel D, Eickhel Mendoza, Ernesto Cardenas, Ivan Toledo Ivanovic, Fran Diaz, JESUS GIL, Jorge Serrano Pérez, José Saturnino Pimentel Juárez, Mauricio Hidalgo, Pablo Iglesias, Rikhardo Estrada Rdez, Thierry DEMAN, YOLANDA CUESTA ALTIERI
|
||||
* Japanese: Fujio Kojima, Kazushi KAMEGAWA, Masayoshi Yamada, Masayuki Ozawa, Seiji Momoto, Takashi Kanai, Takayoshi Tanaka, Yoshihisa Ozaki, 庄垣内治
|
||||
* Chinese (simplified): DAN YE, Joel Yang, Lynne Dong, Ryan(Yu) Zhang, Sheng Jiang, Wei Zhang, Zhiliang Xu
|
||||
* Chinese (Traditional): Bruce Chen, Chiayi Yen, Kevin Yang, Winnie Lin, 保哥 Will, 謝政廷
|
||||
* Korean: Do-Kyun Kim, Evelyn Kim, Helen Jung, Hong Jmee, jeongwoo choi, Jun Hyoung Lee, Jungsun Kim정선, Justin Yoo, Kavrith mucha, Kiwoong Youm, MinGyu Ju, MVP_JUNO BEA, Sejun Kim, SOONMAN KWON, sung man ko, Yeongrak Choi, younggun kim, Youngjae Kim, 소영 이
|
||||
* Russian: Andrey Veselov, Anton Fontanov, Anton Savin, Elena Ostrovskaia, Igor Babichev, Maxim Zelensky, Rodion Fedechkin, Tasha T, Vladimir Zyryanov
|
||||
* Portuguese Brazil: Daniel de Sousa, Diogo Duarte, Douglas Correa, Douglas Eccker, José Emanuel Mendes, Marcelo Fernandes, Marcondes Alexandre, Roberto Fonseca, Rodrigo Crespi
|
||||
|
||||
And of course, we'd like to thank the authors of all upstream dependencies. Please see a full list in the [ThirdPartyNotices.txt](https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/ThirdPartyNotices.txt)
|
||||
And of course, we'd like to thank the authors of all upstream dependencies. Please see a full list in the [ThirdPartyNotices.txt](https://raw.githubusercontent.com/Microsoft/azuredatastudio/master/ThirdPartyNotices.txt)
|
||||
|
||||
## License
|
||||
|
||||
@@ -131,10 +142,10 @@ Copyright (c) Microsoft Corporation. All rights reserved.
|
||||
|
||||
Licensed under the [Source EULA](LICENSE.txt).
|
||||
|
||||
[win-user]: https://go.microsoft.com/fwlink/?linkid=2168181
|
||||
[win-system]: https://go.microsoft.com/fwlink/?linkid=2168180
|
||||
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2168436
|
||||
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2168435
|
||||
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2168338
|
||||
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2168271
|
||||
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2168339
|
||||
[win-user]: https://go.microsoft.com/fwlink/?linkid=2127556
|
||||
[win-system]: https://go.microsoft.com/fwlink/?linkid=2127555
|
||||
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2127476
|
||||
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2127554
|
||||
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2127553
|
||||
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2127552
|
||||
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2127551
|
||||
|
||||
SECURITY.md (41 changed lines)
@@ -1,41 +0,0 @@
|
||||
<!-- BEGIN MICROSOFT SECURITY.MD V0.0.5 BLOCK -->
|
||||
|
||||
## Security
|
||||
|
||||
Microsoft takes the security of our software products and services seriously, which includes all source code repositories managed through our GitHub organizations, which include [Microsoft](https://github.com/Microsoft), [Azure](https://github.com/Azure), [DotNet](https://github.com/dotnet), [AspNet](https://github.com/aspnet), [Xamarin](https://github.com/xamarin), and [our GitHub organizations](https://opensource.microsoft.com/).
|
||||
|
||||
If you believe you have found a security vulnerability in any Microsoft-owned repository that meets [Microsoft's definition of a security vulnerability](https://docs.microsoft.com/en-us/previous-versions/tn-archive/cc751383(v=technet.10)), please report it to us as described below.
|
||||
|
||||
## Reporting Security Issues
|
||||
|
||||
**Please do not report security vulnerabilities through public GitHub issues.**
|
||||
|
||||
Instead, please report them to the Microsoft Security Response Center (MSRC) at [https://msrc.microsoft.com/create-report](https://msrc.microsoft.com/create-report).
|
||||
|
||||
If you prefer to submit without logging in, send email to [secure@microsoft.com](mailto:secure@microsoft.com). If possible, encrypt your message with our PGP key; please download it from the [Microsoft Security Response Center PGP Key page](https://www.microsoft.com/en-us/msrc/pgp-key-msrc).
|
||||
|
||||
You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Additional information can be found at [microsoft.com/msrc](https://www.microsoft.com/msrc).
|
||||
|
||||
Please include the requested information listed below (as much as you can provide) to help us better understand the nature and scope of the possible issue:
|
||||
|
||||
* Type of issue (e.g. buffer overflow, SQL injection, cross-site scripting, etc.)
|
||||
* Full paths of source file(s) related to the manifestation of the issue
|
||||
* The location of the affected source code (tag/branch/commit or direct URL)
|
||||
* Any special configuration required to reproduce the issue
|
||||
* Step-by-step instructions to reproduce the issue
|
||||
* Proof-of-concept or exploit code (if possible)
|
||||
* Impact of the issue, including how an attacker might exploit the issue
|
||||
|
||||
This information will help us triage your report more quickly.
|
||||
|
||||
If you are reporting for a bug bounty, more complete reports can contribute to a higher bounty award. Please visit our [Microsoft Bug Bounty Program](https://microsoft.com/msrc/bounty) page for more details about our active programs.
|
||||
|
||||
## Preferred Languages
|
||||
|
||||
We prefer all communications to be in English.
|
||||
|
||||
## Policy
|
||||
|
||||
Microsoft follows the principle of [Coordinated Vulnerability Disclosure](https://www.microsoft.com/en-us/msrc/cvd).
|
||||
|
||||
<!-- END MICROSOFT SECURITY.MD BLOCK -->
|
||||
@@ -12,7 +12,7 @@ expressly granted herein, whether by implication, estoppel or otherwise.
|
||||
angular2-grid: https://github.com/BTMorton/angular2-grid
|
||||
angular2-slickgrid: https://github.com/Microsoft/angular2-slickgrid
|
||||
applicationinsights: https://github.com/Microsoft/ApplicationInsights-node.js
|
||||
axios: https://github.com/axios/axios
|
||||
axios: https://github.com/axios/axios
|
||||
bootstrap: https://github.com/twbs/bootstrap
|
||||
chart.js: https://github.com/Timer/chartjs
|
||||
chokidar: https://github.com/paulmillr/chokidar
|
||||
@@ -29,8 +29,6 @@ expressly granted herein, whether by implication, estoppel or otherwise.
|
||||
gc-signals: https://github.com/Microsoft/node-gc-signals
|
||||
getmac: https://github.com/bevry/getmac
|
||||
graceful-fs: https://github.com/isaacs/node-graceful-fs
|
||||
gridstack: https://github.com/gridstack/gridstack.js
|
||||
html-to-image: https://github.com/bubkoo/html-to-image
|
||||
html-query-plan: https://github.com/JustinPealing/html-query-plan
|
||||
http-proxy-agent: https://github.com/TooTallNate/node-https-proxy-agent
|
||||
https-proxy-agent: https://github.com/TooTallNate/node-https-proxy-agent
|
||||
@@ -41,9 +39,8 @@ expressly granted herein, whether by implication, estoppel or otherwise.
|
||||
jschardet: https://github.com/aadsm/jschardet
|
||||
jupyter-powershell: https://github.com/vors/jupyter-powershell
|
||||
JupyterLab: https://github.com/jupyterlab/jupyterlab
|
||||
keytar: https://github.com/atom/node-keytar
|
||||
keytar: https://github.com/atom/node-keytar
|
||||
make-error: https://github.com/JsCommunity/make-error
|
||||
mark.js: https://github.com/julmot/mark.js
|
||||
minimist: https://github.com/substack/minimist
|
||||
moment: https://github.com/moment/moment
|
||||
native-keymap: https://github.com/Microsoft/node-native-keymap
|
||||
@@ -56,8 +53,7 @@ expressly granted herein, whether by implication, estoppel or otherwise.
|
||||
primeng: https://github.com/primefaces/primeng
|
||||
process-nextick-args: https://github.com/calvinmetcalf/process-nextick-args
|
||||
pty.js: https://github.com/chjj/pty.js
|
||||
pyzmq: https://github.com/zeromq/pyzmq
|
||||
qs: https://github.com/ljharb/qs
|
||||
qs: https://github.com/ljharb/qs
|
||||
reflect-metadata: https://github.com/rbuckton/reflect-metadata
|
||||
request: https://github.com/request/request
|
||||
rxjs: https://github.com/ReactiveX/RxJS
|
||||
@@ -67,8 +63,6 @@ expressly granted herein, whether by implication, estoppel or otherwise.
|
||||
svg.js: https://github.com/svgdotjs/svg.js
|
||||
systemjs: https://github.com/systemjs/systemjs
|
||||
temp-write: https://github.com/sindresorhus/temp-write
|
||||
turndown: https://github.com/domchristie/turndown
|
||||
turndown-plugin-gfm: https://github.com/domchristie/turndown-plugin-gfm
|
||||
underscore: https://github.com/jashkenas/underscore
|
||||
v8-profiler: https://github.com/node-inspector/v8-profiler
|
||||
vscode: https://github.com/microsoft/vscode
|
||||
@@ -78,8 +72,6 @@ expressly granted herein, whether by implication, estoppel or otherwise.
|
||||
vscode-ripgrep: https://github.com/roblourens/vscode-ripgrep
|
||||
vscode-textmate: https://github.com/Microsoft/vscode-textmate
|
||||
winreg: https://github.com/fresc81/node-winreg
|
||||
xmldom: https://github.com/xmldom/xmldom
|
||||
xml-formatter: https://github.com/chrisbottin/xml-formatter
|
||||
xterm: https://github.com/sourcelair/xterm.js
|
||||
yargs: https://github.com/yargs/yargs
|
||||
yauzl: https://github.com/thejoshwolfe/yauzl
|
||||
@@ -495,58 +487,6 @@ IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
|
||||
=========================================
|
||||
END OF graceful-fs NOTICES AND INFORMATION
|
||||
|
||||
%% gridstack NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
The MIT License (MIT)
|
||||
|
||||
Copyright (c) 2014-2020 Alain Dumesny, Dylan Weiss, Pavel Reznikov
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
||||
=========================================
|
||||
END OF gridstack NOTICES AND INFORMATION
|
||||
|
||||
%% html-to-image NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
MIT License
|
||||
|
||||
Copyright (c) 2017 W.Y.
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
||||
=========================================
|
||||
END OF html-to-image NOTICES AND INFORMATION
|
||||
|
||||
%% html-query-plan NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
The MIT License (MIT)
|
||||
@@ -1312,32 +1252,6 @@ ISC © Julien Fontanet
|
||||
=========================================
|
||||
END OF make-error NOTICES AND INFORMATION
|
||||
|
||||
%% mark.js NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
The MIT License (MIT)
|
||||
|
||||
Copyright (c) 2014–2019 Julian Kühnel
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
||||
=========================================
|
||||
END OF mark.js NOTICES AND INFORMATION
|
||||
|
||||
%% minimist NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
This software is released under the MIT license:
|
||||
@@ -1581,6 +1495,30 @@ END OF primeng NOTICES AND INFORMATION
|
||||
|
||||
%% process-nextick-args NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
# Copyright (c) 2015 Calvin Metcalf
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
**THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.**
|
||||
=========================================
|
||||
END OF process-nextick-args NOTICES AND INFORMATION
|
||||
|
||||
%% pty.js NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
Copyright (c) 2012-2015, Christopher Jeffrey (https://github.com/chjj/)
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
@@ -1603,40 +1541,6 @@ THE SOFTWARE.
|
||||
=========================================
|
||||
END OF pty.js NOTICES AND INFORMATION
|
||||
|
||||
%% PyZMQ NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
Copyright (c) 2009-2012, Brian Granger, Min Ragan-Kelley
|
||||
|
||||
All rights reserved.
|
||||
|
||||
Redistribution and use in source and binary forms, with or without
|
||||
modification, are permitted provided that the following conditions are met:
|
||||
|
||||
Redistributions of source code must retain the above copyright notice, this
|
||||
list of conditions and the following disclaimer.
|
||||
|
||||
Redistributions in binary form must reproduce the above copyright notice, this
|
||||
list of conditions and the following disclaimer in the documentation and/or
|
||||
other materials provided with the distribution.
|
||||
|
||||
Neither the name of PyZMQ nor the names of its contributors may be used to
|
||||
endorse or promote products derived from this software without specific prior
|
||||
written permission.
|
||||
|
||||
|
||||
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
|
||||
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
|
||||
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
|
||||
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
|
||||
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
|
||||
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
|
||||
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
|
||||
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
|
||||
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
|
||||
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
|
||||
=========================================
|
||||
END OF pyzmq NOTICES AND INFORMATION
|
||||
|
||||
%% reflect-metadata NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
Apache License
|
||||
@@ -2096,58 +2000,6 @@ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLI
|
||||
=========================================
|
||||
END OF temp-write NOTICES AND INFORMATION
|
||||
|
||||
%% turndown NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
MIT License
|
||||
|
||||
Copyright (c) 2017 Dom Christie
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
||||
=========================================
|
||||
END OF turndown NOTICES AND INFORMATION
|
||||
|
||||
%% turndown-plugin-gfm NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
MIT License
|
||||
|
||||
Copyright (c) 2017 Dom Christie
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
||||
=========================================
|
||||
END OF turndown-plugin-gfm NOTICES AND INFORMATION
|
||||
|
||||
%% underscore NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
Copyright (c) 2009-2017 Jeremy Ashkenas, DocumentCloud and Investigative
|
||||
@@ -2373,51 +2225,6 @@ EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
|
||||
=========================================
|
||||
END OF winreg NOTICES AND INFORMATION
|
||||
|
||||
%% xmldom NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
The MIT License (MIT)
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy of
|
||||
this software and associated documentation files (the "Software"), to deal in the
|
||||
Software without restriction, including without limitation the rights to use, copy,
|
||||
modify, merge, publish, distribute, sublicense, and/or sell copies of the Software,
|
||||
and to permit persons to whom the Software is furnished to do so, subject to the
|
||||
following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all copies
|
||||
or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
|
||||
INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
|
||||
PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE
|
||||
FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
|
||||
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
|
||||
THE SOFTWARE.
|
||||
=========================================
|
||||
END OF xmldom NOTICES AND INFORMATION
|
||||
|
||||
%% xml-formatter NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
The MIT License (MIT)
|
||||
|
||||
Copyright 2019 Chris Bottin (https://github.com/chrisbottin)
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy of this software
|
||||
and associated documentation files (the "Software"), to deal in the Software without restriction,
|
||||
including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense,
|
||||
and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do
|
||||
so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT
|
||||
LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
|
||||
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
|
||||
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
|
||||
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
|
||||
=========================================
|
||||
END OF xml-formatter NOTICES AND INFORMATION
|
||||
|
||||
%% xterm NOTICES AND INFORMATION BEGIN HERE
|
||||
=========================================
|
||||
Copyright (c) 2014-2016, SourceLair Private Company (https://www.sourcelair.com)
|
||||
|
||||
@@ -1,5 +1,5 @@
trigger:
- main
- master
- release/*

jobs:
@@ -20,8 +20,3 @@ jobs:
vmImage: macOS-latest
steps:
- template: build/azure-pipelines/darwin/continuous-build-darwin.yml

trigger:
branches:
exclude:
- electron-11.x.y

@@ -1 +1 @@
2021-04-07T03:52:18.011Z
2020-04-29T05:20:58.491Z
build/.gitattributes (3 changed lines, vendored)
@@ -1,3 +0,0 @@
* text eol=lf
*.exe binary
*.dll binary
@@ -1,173 +0,0 @@
|
||||
# cleanup rules for node modules, .gitignore style
|
||||
|
||||
# native node modules
|
||||
|
||||
nan/**
|
||||
*/node_modules/nan/**
|
||||
|
||||
fsevents/binding.gyp
|
||||
fsevents/fsevents.cc
|
||||
fsevents/build/**
|
||||
fsevents/src/**
|
||||
fsevents/test/**
|
||||
!fsevents/**/*.node
|
||||
|
||||
vscode-sqlite3/binding.gyp
|
||||
vscode-sqlite3/benchmark/**
|
||||
vscode-sqlite3/cloudformation/**
|
||||
vscode-sqlite3/deps/**
|
||||
vscode-sqlite3/test/**
|
||||
vscode-sqlite3/build/**
|
||||
vscode-sqlite3/src/**
|
||||
!vscode-sqlite3/build/Release/*.node
|
||||
|
||||
windows-mutex/binding.gyp
|
||||
windows-mutex/build/**
|
||||
windows-mutex/src/**
|
||||
!windows-mutex/**/*.node
|
||||
|
||||
native-keymap/binding.gyp
native-keymap/build/**
native-keymap/src/**
native-keymap/deps/**
!native-keymap/build/Release/*.node

native-is-elevated/binding.gyp
native-is-elevated/build/**
native-is-elevated/src/**
native-is-elevated/deps/**
!native-is-elevated/build/Release/*.node

native-watchdog/binding.gyp
native-watchdog/build/**
native-watchdog/src/**
!native-watchdog/build/Release/*.node

spdlog/binding.gyp
spdlog/build/**
spdlog/deps/**
spdlog/src/**
spdlog/test/**
spdlog/*.yml
!spdlog/build/Release/*.node

jschardet/dist/**

windows-foreground-love/binding.gyp
windows-foreground-love/build/**
windows-foreground-love/src/**
!windows-foreground-love/**/*.node

windows-process-tree/binding.gyp
windows-process-tree/build/**
windows-process-tree/src/**
!windows-process-tree/**/*.node

keytar/binding.gyp
keytar/build/**
keytar/src/**
keytar/script/**
keytar/node_modules/**
!keytar/**/*.node

node-pty/binding.gyp
node-pty/build/**
node-pty/src/**
node-pty/tools/**
node-pty/deps/**
node-pty/scripts/**
!node-pty/build/Release/*.exe
!node-pty/build/Release/*.dll
!node-pty/build/Release/*.node

# START SQL Modules

@angular/**/src/**
@angular/**/testing/**

angular2-grid/components/**
angular2-grid/directives/**
angular2-grid/interfaces/**
angular2-grid/modules/**

angular2-slickgrid/.vscode/**
angular2-slickgrid/components/**
angular2-slickgrid/examples/**

jquery-ui/external/**
jquery-ui/demos/**

slickgrid/node_modules/**
slickgrid/examples/**

kerberos/build/**

# END SQL Modules

nsfw/binding.gyp
nsfw/build/**
nsfw/src/**
nsfw/includes/**
!nsfw/build/Release/*.node

vsda/build/**
vsda/ci/**
vsda/src/**
vsda/.gitignore
vsda/binding.gyp
vsda/README.md
vsda/targets
!vsda/build/Release/vsda.node

vscode-encrypt/build/**
vscode-encrypt/src/**
vscode-encrypt/vendor/**
vscode-encrypt/.gitignore
vscode-encrypt/binding.gyp
vscode-encrypt/README.md
!vscode-encrypt/build/Release/vscode-encrypt-native.node

vscode-windows-ca-certs/**/*
!vscode-windows-ca-certs/package.json
!vscode-windows-ca-certs/**/*.node

node-addon-api/**/*

# other node modules

**/docs/**
**/example/**
**/examples/**
**/test/**
**/tests/**

**/History.md
**/CHANGELOG.md
**/README.md
**/readme.md
**/readme.markdown

**/*.ts
!typescript/**/*.d.ts

jschardet/dist/**

es6-promise/lib/**

vscode-textmate/webpack.config.js

# {{SQL CARBON EDIT }} We need more than just zone-node.js
# zone.js/dist/**
# !zone.js/dist/zone-node.js

# https://github.com/xtermjs/xterm.js/issues/3137
xterm/src/**
xterm/tsconfig.all.json

# https://github.com/xtermjs/xterm.js/issues/3138
xterm-addon-*/src/**
xterm-addon-*/fixtures/**
xterm-addon-*/out/**
xterm-addon-*/out-test/**

124 build/.nativeignore Normal file
@@ -0,0 +1,124 @@
# cleanup rules for native node modules, .gitignore style

nan/**
*/node_modules/nan/**

fsevents/binding.gyp
fsevents/fsevents.cc
fsevents/build/**
fsevents/src/**
fsevents/test/**
!fsevents/**/*.node

vscode-sqlite3/binding.gyp
vscode-sqlite3/benchmark/**
vscode-sqlite3/cloudformation/**
vscode-sqlite3/deps/**
vscode-sqlite3/test/**
vscode-sqlite3/build/**
vscode-sqlite3/src/**
!vscode-sqlite3/build/Release/*.node

windows-mutex/binding.gyp
windows-mutex/build/**
windows-mutex/src/**
!windows-mutex/**/*.node

native-keymap/binding.gyp
native-keymap/build/**
native-keymap/src/**
native-keymap/deps/**
!native-keymap/build/Release/*.node

native-is-elevated/binding.gyp
native-is-elevated/build/**
native-is-elevated/src/**
native-is-elevated/deps/**
!native-is-elevated/build/Release/*.node

native-watchdog/binding.gyp
native-watchdog/build/**
native-watchdog/src/**
!native-watchdog/build/Release/*.node

spdlog/binding.gyp
spdlog/build/**
spdlog/deps/**
spdlog/src/**
spdlog/test/**
!spdlog/build/Release/*.node

jschardet/dist/**

windows-foreground-love/binding.gyp
windows-foreground-love/build/**
windows-foreground-love/src/**
!windows-foreground-love/**/*.node

windows-process-tree/binding.gyp
windows-process-tree/build/**
windows-process-tree/src/**
!windows-process-tree/**/*.node

keytar/binding.gyp
keytar/build/**
keytar/src/**
keytar/script/**
keytar/node_modules/**
!keytar/**/*.node

node-pty/binding.gyp
node-pty/build/**
node-pty/src/**
node-pty/tools/**
node-pty/deps/**
!node-pty/build/Release/*.exe
!node-pty/build/Release/*.dll
!node-pty/build/Release/*.node

# START SQL Modules

@angular/**/src/**
@angular/**/testing/**

angular2-grid/components/**
angular2-grid/directives/**
angular2-grid/interfaces/**
angular2-grid/modules/**

angular2-slickgrid/.vscode/**
angular2-slickgrid/components/**
angular2-slickgrid/examples/**

jquery-ui/external/**
jquery-ui/demos/**

slickgrid/node_modules/**
slickgrid/examples/**

kerberos/build/**

# END SQL Modules

vscode-nsfw/binding.gyp
vscode-nsfw/build/**
vscode-nsfw/src/**
vscode-nsfw/openpa/**
vscode-nsfw/includes/**
!vscode-nsfw/build/Release/*.node
!vscode-nsfw/**/*.a

vsda/build/**
vsda/ci/**
vsda/src/**
vsda/.gitignore
vsda/binding.gyp
vsda/README.md
vsda/targets
!vsda/build/Release/vsda.node

vscode-windows-ca-certs/**/*
!vscode-windows-ca-certs/package.json
!vscode-windows-ca-certs/**/*.node

node-addon-api/**/*
@@ -1,31 +0,0 @@
# cleanup rules for web node modules, .gitignore style

**/*.txt
**/*.json
**/*.md
**/*.d.ts
**/*.js.map
**/LICENSE
**/CONTRIBUTORS

**/docs/**
**/example/**
**/examples/**

jschardet/index.js
jschardet/src/**
jschardet/dist/jschardet.js

vscode-textmate/webpack.config.js

xterm/src/**

xterm-addon-search/src/**
xterm-addon-search/out/**
xterm-addon-search/fixtures/**

xterm-addon-unicode11/src/**
xterm-addon-unicode11/out/**

xterm-addon-webgl/src/**
xterm-addon-webgl/out/**
15 build/actions/copycat/action.yml Normal file
@@ -0,0 +1,15 @@
name: Copycat
description: Copy all new issues to a different repo
inputs:
  token:
    description: GitHub token with issue, comment, and label read/write permissions to both repos
    default: ${{ github.token }}
  owner:
    description: account/organization that owns the destination repo (the microsoft part of microsoft/vscode)
    required: true
  repo:
    description: name of the destination repo (the vscode part of microsoft/vscode)
    required: true
runs:
  using: 'node12'
  main: 'index.js'
19 build/actions/copycat/copyCat.js Normal file
@@ -0,0 +1,19 @@
"use strict";
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
class CopyCat {
    constructor(github, owner, repo) {
        this.github = github;
        this.owner = owner;
        this.repo = repo;
    }
    async run() {
        const issue = await this.github.getIssue();
        console.log(`Mirroring issue \`${issue.title}\` to ${this.owner}/${this.repo}`);
        await this.github.createIssue(this.owner, this.repo, issue.title, issue.body.replace(/@|#|issues/g, '-'));
    }
}
exports.CopyCat = CopyCat;
21 build/actions/copycat/copyCat.ts Normal file
@@ -0,0 +1,21 @@
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/

import { GitHubIssue } from '../api/api'

export class CopyCat {
    constructor(private github: GitHubIssue, private owner: string, private repo: string) {}

    async run() {
        const issue = await this.github.getIssue()
        console.log(`Mirroring issue \`${issue.title}\` to ${this.owner}/${this.repo}`)
        await this.github.createIssue(
            this.owner,
            this.repo,
            issue.title,
            issue.body.replace(/@|#|issues/g, '-'),
        )
    }
}
21 build/actions/copycat/index.js Normal file
@@ -0,0 +1,21 @@
"use strict";
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const core = require("@actions/core");
const github_1 = require("@actions/github");
const octokit_1 = require("../api/octokit");
const utils_1 = require("../utils/utils");
const copyCat_1 = require("./copyCat");
const token = utils_1.getRequiredInput('token');
const main = async () => {
    await new copyCat_1.CopyCat(new octokit_1.OctoKitIssue(token, github_1.context.repo, { number: github_1.context.issue.number }), utils_1.getRequiredInput('owner'), utils_1.getRequiredInput('repo')).run();
};
main()
    .then(() => utils_1.logRateLimit(token))
    .catch(async (error) => {
    core.setFailed(error.message);
    await utils_1.logErrorToIssue(error.message, true, token);
});
27 build/actions/copycat/index.ts Normal file
@@ -0,0 +1,27 @@
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/

import * as core from '@actions/core'
import { context } from '@actions/github'
import { OctoKitIssue } from '../api/octokit'
import { getRequiredInput, logErrorToIssue, logRateLimit } from '../utils/utils'
import { CopyCat } from './copyCat'

const token = getRequiredInput('token')

const main = async () => {
    await new CopyCat(
        new OctoKitIssue(token, context.repo, { number: context.issue.number }),
        getRequiredInput('owner'),
        getRequiredInput('repo'),
    ).run()
}

main()
    .then(() => logRateLimit(token))
    .catch(async (error) => {
        core.setFailed(error.message)
        await logErrorToIssue(error.message, true, token)
    })
@@ -15,9 +15,9 @@
  "keywords": [],
  "author": "",
  "dependencies": {
    "@actions/core": "^1.2.6",
    "@actions/core": "^1.2.3",
    "@actions/github": "^2.1.1",
    "axios": "^0.21.1",
    "axios": "^0.19.2",
    "ts-node": "^8.6.2",
    "typescript": "^3.8.3"
  }

@@ -2,10 +2,10 @@
# yarn lockfile v1


"@actions/core@^1.2.6":
  version "1.2.6"
  resolved "https://registry.yarnpkg.com/@actions/core/-/core-1.2.6.tgz#a78d49f41a4def18e88ce47c2cac615d5694bf09"
  integrity sha512-ZQYitnqiyBc3D+k7LsgSBmMDVkOVidaagDG7j3fOym77jNunWRuYx7VSHa9GNfFZh+zh61xsCjRj4JxMZlDqTA==
"@actions/core@^1.2.3":
  version "1.2.3"
  resolved "https://registry.yarnpkg.com/@actions/core/-/core-1.2.3.tgz#e844b4fa0820e206075445079130868f95bfca95"
  integrity sha512-Wp4xnyokakM45Uuj4WLUxdsa8fJjKVl1fDTsPbTEcTcuu0Nb26IPQbOtjmnfaCPGcaoPOOqId8H9NapZ8gii4w==

"@actions/github@^2.1.1":
  version "2.1.1"
@@ -144,12 +144,12 @@ atob-lite@^2.0.0:
  resolved "https://registry.yarnpkg.com/atob-lite/-/atob-lite-2.0.0.tgz#0fef5ad46f1bd7a8502c65727f0367d5ee43d696"
  integrity sha1-D+9a1G8b16hQLGVyfwNn1e5D1pY=

axios@^0.21.1:
  version "0.21.1"
  resolved "https://registry.yarnpkg.com/axios/-/axios-0.21.1.tgz#22563481962f4d6bde9a76d516ef0e5d3c09b2b8"
  integrity sha512-dKQiRHxGD9PPRIUNIWvZhPTPpl1rf/OxTYKsqKUDjBwYylTvV7SjSHJb9ratfyzM6wCdLCOYLzs73qpg5c4iGA==
axios@^0.19.2:
  version "0.19.2"
  resolved "https://registry.yarnpkg.com/axios/-/axios-0.19.2.tgz#3ea36c5d8818d0d5f8a8a97a6d36b86cdc00cb27"
  integrity sha512-fjgm5MvRHLhx+osE2xoekY70AhARk3a6hkN+3Io1jc00jtquGvxYlKlsFUhmUET0V5te6CcZI7lcv2Ym61mjHA==
  dependencies:
    follow-redirects "^1.10.0"
    follow-redirects "1.5.10"

before-after-hook@^2.0.0:
  version "2.1.0"
@@ -177,6 +177,13 @@ cross-spawn@^6.0.0:
    shebang-command "^1.2.0"
    which "^1.2.9"

debug@=3.1.0:
  version "3.1.0"
  resolved "https://registry.yarnpkg.com/debug/-/debug-3.1.0.tgz#5bb5a0672628b64149566ba16819e61518c67261"
  integrity sha512-OX8XqP7/1a9cqkxYw2yXss15f26NKWBpDXQd0/uK/KPqdQhxbPa994hnzjcE2VqQpDslf55723cKPUOGSmMY3g==
  dependencies:
    ms "2.0.0"

deprecation@^2.0.0, deprecation@^2.3.1:
  version "2.3.1"
  resolved "https://registry.yarnpkg.com/deprecation/-/deprecation-2.3.1.tgz#6368cbdb40abf3373b525ac87e4a260c3a700919"
@@ -207,10 +214,12 @@ execa@^1.0.0:
    signal-exit "^3.0.0"
    strip-eof "^1.0.0"

follow-redirects@^1.10.0:
  version "1.13.1"
  resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.13.1.tgz#5f69b813376cee4fd0474a3aba835df04ab763b7"
  integrity sha512-SSG5xmZh1mkPGyKzjZP8zLjltIfpW32Y5QpdNJyjcfGxK3qo3NDDkZOZSFiGn1A6SclQxY9GzEwAHQ3dmYRWpg==
follow-redirects@1.5.10:
  version "1.5.10"
  resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.5.10.tgz#7b7a9f9aea2fdff36786a94ff643ed07f4ff5e2a"
  integrity sha512-0V5l4Cizzvqt5D44aTXbFZz+FtyXV1vrDN6qrelxtfYQKW0KO0W2T/hkE8xvGa/540LkZlkaUjO4ailYTFtHVQ==
  dependencies:
    debug "=3.1.0"

get-stream@^4.0.0:
  version "4.1.0"
@@ -266,15 +275,20 @@ make-error@^1.1.1:
  resolved "https://registry.yarnpkg.com/make-error/-/make-error-1.3.6.tgz#2eb2e37ea9b67c4891f684a1394799af484cf7a2"
  integrity sha512-s8UhlNe7vPKomQhC1qFelMokr/Sc3AgNbso3n74mVPA5LTZwkB9NlXf4XPamLxJE8h0gh73rM94xvwRT2CVInw==

ms@2.0.0:
  version "2.0.0"
  resolved "https://registry.yarnpkg.com/ms/-/ms-2.0.0.tgz#5608aeadfc00be6c2901df5f9861788de0d597c8"
  integrity sha1-VgiurfwAvmwpAd9fmGF4jeDVl8g=

nice-try@^1.0.4:
  version "1.0.5"
  resolved "https://registry.yarnpkg.com/nice-try/-/nice-try-1.0.5.tgz#a3378a7696ce7d223e88fc9b764bd7ef1089e366"
  integrity sha512-1nh45deeb5olNY7eX82BkPO7SSxR5SSYJiPTrTdFUVYwAl8CKMA5N9PjTYkHiRjisVcxcQ1HXdLhx2qxxJzLNQ==

node-fetch@^2.3.0:
  version "2.6.1"
  resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.1.tgz#045bd323631f76ed2e2b55573394416b639a0052"
  integrity sha512-V4aYg89jEoVRxRb2fJdAg8FHvI7cEyYdVAh94HH0UIK8oJxUfkjlDQN9RbMx+bEjP7+ggMiFRprSti032Oipxw==
  version "2.6.0"
  resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.0.tgz#e633456386d4aa55863f676a7ab0daa8fdecb0fd"
  integrity sha512-8dG4H5ujfvFiqDmVu9fQ5bOHUC15JMjMY/Zumv26oOvvVJjM67KF8koCWIabKQ1GJIa9r2mMZscBq/TbdOcmNA==

npm-run-path@^2.0.0:
  version "2.0.2"

2 build/azure-pipelines/common/.gitignore vendored Normal file
@@ -0,0 +1,2 @@
node_modules/
*.js
@@ -1,25 +0,0 @@
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const path = require("path");
const crypto = require("crypto");
const { dirs } = require('../../npm/dirs');
const ROOT = path.join(__dirname, '../../../');
const shasum = crypto.createHash('sha1');
shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
// Add `yarn.lock` files
for (let dir of dirs) {
    const yarnLockPath = path.join(ROOT, dir, 'yarn.lock');
    shasum.update(fs.readFileSync(yarnLockPath));
}
// Add any other command line arguments
for (let i = 2; i < process.argv.length; i++) {
    shasum.update(process.argv[i]);
}
process.stdout.write(shasum.digest('hex'));
@@ -1,32 +0,0 @@
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/

'use strict';

import * as fs from 'fs';
import * as path from 'path';
import * as crypto from 'crypto';
const { dirs } = require('../../npm/dirs');

const ROOT = path.join(__dirname, '../../../');

const shasum = crypto.createHash('sha1');

shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));

// Add `yarn.lock` files
for (let dir of dirs) {
    const yarnLockPath = path.join(ROOT, dir, 'yarn.lock');
    shasum.update(fs.readFileSync(yarnLockPath));
}

// Add any other command line arguments
for (let i = 2; i < process.argv.length; i++) {
    shasum.update(process.argv[i]);
}

process.stdout.write(shasum.digest('hex'));
@@ -1,41 +0,0 @@
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const vfs = require("vinyl-fs");
const path = require("path");
const es = require("event-stream");
const fs = require("fs");
const files = [
    '.build/langpacks/**/*.vsix',
    '.build/extensions/**/*.vsix',
    '.build/win32-x64/**/*.{exe,zip}',
    '.build/linux/sha256hashes.txt',
    '.build/linux/deb/amd64/deb/*.deb',
    '.build/linux/rpm/x86_64/*.rpm',
    '.build/linux/server/*',
    '.build/linux/archive/*',
    '.build/docker/*',
    '.build/darwin/*',
    '.build/version.json' // version information
];
async function main() {
    return new Promise((resolve, reject) => {
        const stream = vfs.src(files, { base: '.build', allowEmpty: true })
            .pipe(es.through(file => {
            const filePath = path.join(process.env.BUILD_ARTIFACTSTAGINGDIRECTORY,
            //Preserve intermediate directories after .build folder
            file.path.substr(path.resolve('.build').length + 1));
            fs.mkdirSync(path.dirname(filePath), { recursive: true });
            fs.renameSync(file.path, filePath);
        }));
        stream.on('end', () => resolve());
        stream.on('error', e => reject(e));
    });
}
main().catch(err => {
    console.error(err);
    process.exit(1);
});
@@ -11,7 +11,6 @@ import * as es from 'event-stream';
import * as fs from 'fs';

const files = [
    '.build/langpacks/**/*.vsix', // langpacks
    '.build/extensions/**/*.vsix', // external extensions
    '.build/win32-x64/**/*.{exe,zip}', // windows binaries
    '.build/linux/sha256hashes.txt', // linux hashes
@@ -25,7 +24,7 @@ const files = [
];

async function main() {
    return new Promise<void>((resolve, reject) => {
    return new Promise((resolve, reject) => {
        const stream = vfs.src(files, { base: '.build', allowEmpty: true })
            .pipe(es.through(file => {
                const filePath = path.join(process.env.BUILD_ARTIFACTSTAGINGDIRECTORY!,

@@ -1,94 +0,0 @@
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const crypto = require("crypto");
const azure = require("azure-storage");
const mime = require("mime");
const cosmos_1 = require("@azure/cosmos");
const retry_1 = require("./retry");
if (process.argv.length !== 6) {
    console.error('Usage: node createAsset.js PLATFORM TYPE NAME FILE');
    process.exit(-1);
}
function hashStream(hashName, stream) {
    return new Promise((c, e) => {
        const shasum = crypto.createHash(hashName);
        stream
            .on('data', shasum.update.bind(shasum))
            .on('error', e)
            .on('close', () => c(shasum.digest('hex')));
    });
}
async function doesAssetExist(blobService, quality, blobName) {
    const existsResult = await new Promise((c, e) => blobService.doesBlobExist(quality, blobName, (err, r) => err ? e(err) : c(r)));
    return existsResult.exists;
}
async function uploadBlob(blobService, quality, blobName, filePath, fileName) {
    const blobOptions = {
        contentSettings: {
            contentType: mime.lookup(filePath),
            contentDisposition: `attachment; filename="${fileName}"`,
            cacheControl: 'max-age=31536000, public'
        }
    };
    await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c()));
}
function getEnv(name) {
    const result = process.env[name];
    if (typeof result === 'undefined') {
        throw new Error('Missing env: ' + name);
    }
    return result;
}
async function main() {
    const [, , platform, type, fileName, filePath] = process.argv;
    const quality = getEnv('VSCODE_QUALITY');
    const commit = getEnv('BUILD_SOURCEVERSION');
    console.log('Creating asset...');
    const stat = await new Promise((c, e) => fs.stat(filePath, (err, stat) => err ? e(err) : c(stat)));
    const size = stat.size;
    console.log('Size:', size);
    const stream = fs.createReadStream(filePath);
    const [sha1hash, sha256hash] = await Promise.all([hashStream('sha1', stream), hashStream('sha256', stream)]);
    console.log('SHA1:', sha1hash);
    console.log('SHA256:', sha256hash);
    const blobName = commit + '/' + fileName;
    const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2'];
    const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2'])
        .withFilter(new azure.ExponentialRetryPolicyFilter(20));
    const blobExists = await doesAssetExist(blobService, quality, blobName);
    if (blobExists) {
        console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
        return;
    }
    console.log('Uploading blobs to Azure storage...');
    await uploadBlob(blobService, quality, blobName, filePath, fileName);
    console.log('Blobs successfully uploaded.');
    const asset = {
        platform,
        type,
        url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
        hash: sha1hash,
        sha256hash,
        size
    };
    // Remove this if we ever need to rollback fast updates for windows
    if (/win32/.test(platform)) {
        asset.supportsFastUpdate = true;
    }
    console.log('Asset:', JSON.stringify(asset, null, ' '));
    const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
    const scripts = client.database('builds').container(quality).scripts;
    await (0, retry_1.retry)(() => scripts.storedProcedure('createAsset').execute('', [commit, asset, true]));
}
main().then(() => {
    console.log('Asset successfully created');
    process.exit(0);
}, err => {
    console.error(err);
    process.exit(1);
});
@@ -11,7 +11,6 @@ import * as crypto from 'crypto';
import * as azure from 'azure-storage';
import * as mime from 'mime';
import { CosmosClient } from '@azure/cosmos';
import { retry } from './retry';

interface Asset {
    platform: string;
@@ -54,7 +53,7 @@ async function uploadBlob(blobService: azure.BlobService, quality: string, blobN
    }
};

    await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c()));
    await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c()));
}

function getEnv(name: string): string {
@@ -122,7 +121,7 @@ async function main(): Promise<void> {

    const client = new CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT']!, key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
    const scripts = client.database('builds').container(quality).scripts;
    await retry(() => scripts.storedProcedure('createAsset').execute('', [commit, asset, true]));
    await scripts.storedProcedure('createAsset').execute('', [commit, asset, true]);
}

main().then(() => {

@@ -1,51 +0,0 @@
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const cosmos_1 = require("@azure/cosmos");
const retry_1 = require("./retry");
if (process.argv.length !== 3) {
    console.error('Usage: node createBuild.js VERSION');
    process.exit(-1);
}
function getEnv(name) {
    const result = process.env[name];
    if (typeof result === 'undefined') {
        throw new Error('Missing env: ' + name);
    }
    return result;
}
async function main() {
    const [, , _version] = process.argv;
    const quality = getEnv('VSCODE_QUALITY');
    const commit = getEnv('BUILD_SOURCEVERSION');
    const queuedBy = getEnv('BUILD_QUEUEDBY');
    const sourceBranch = getEnv('BUILD_SOURCEBRANCH');
    const version = _version + (quality === 'stable' ? '' : `-${quality}`);
    console.log('Creating build...');
    console.log('Quality:', quality);
    console.log('Version:', version);
    console.log('Commit:', commit);
    const build = {
        id: commit,
        timestamp: (new Date()).getTime(),
        version,
        isReleased: false,
        sourceBranch,
        queuedBy,
        assets: [],
        updates: {}
    };
    const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
    const scripts = client.database('builds').container(quality).scripts;
    await (0, retry_1.retry)(() => scripts.storedProcedure('createBuild').execute('', [Object.assign(Object.assign({}, build), { _partitionKey: '' })]));
}
main().then(() => {
    console.log('Build successfully created');
    process.exit(0);
}, err => {
    console.error(err);
    process.exit(1);
});
@@ -6,7 +6,6 @@
'use strict';

import { CosmosClient } from '@azure/cosmos';
import { retry } from './retry';

if (process.argv.length !== 3) {
    console.error('Usage: node createBuild.js VERSION');
@@ -49,7 +48,7 @@ async function main(): Promise<void> {

    const client = new CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT']!, key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
    const scripts = client.database('builds').container(quality).scripts;
    await retry(() => scripts.storedProcedure('createBuild').execute('', [{ ...build, _partitionKey: '' }]));
    await scripts.storedProcedure('createBuild').execute('', [{ ...build, _partitionKey: '' }]);
}

main().then(() => {

@@ -10,10 +10,10 @@ git clone --depth 1 https://github.com/Microsoft/vscode-node-debug2.git
git clone --depth 1 https://github.com/Microsoft/vscode-node-debug.git
git clone --depth 1 https://github.com/Microsoft/vscode-html-languageservice.git
git clone --depth 1 https://github.com/Microsoft/vscode-json-languageservice.git
node $BUILD_SOURCESDIRECTORY/node_modules/.bin/vscode-telemetry-extractor --sourceDir $BUILD_SOURCESDIRECTORY --excludedDir $BUILD_SOURCESDIRECTORY/extensions --outputDir . --applyEndpoints
node $BUILD_SOURCESDIRECTORY/node_modules/.bin/vscode-telemetry-extractor --config $BUILD_SOURCESDIRECTORY/build/azure-pipelines/common/telemetry-config.json -o .
$BUILD_SOURCESDIRECTORY/build/node_modules/.bin/vscode-telemetry-extractor --sourceDir $BUILD_SOURCESDIRECTORY --excludedDir $BUILD_SOURCESDIRECTORY/extensions --outputDir . --applyEndpoints
$BUILD_SOURCESDIRECTORY/build/node_modules/.bin/vscode-telemetry-extractor --config $BUILD_SOURCESDIRECTORY/build/azure-pipelines/common/telemetry-config.json -o .
mkdir -p $BUILD_SOURCESDIRECTORY/.build/telemetry
mv declarations-resolved.json $BUILD_SOURCESDIRECTORY/.build/telemetry/telemetry-core.json
mv config-resolved.json $BUILD_SOURCESDIRECTORY/.build/telemetry/telemetry-extensions.json
cd ..
rm -rf extraction
rm -rf extraction

@@ -1,14 +0,0 @@
"use strict";
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const path = require("path");
const retry_1 = require("./retry");
const { installBrowsersWithProgressBar } = require('playwright/lib/install/installer');
const playwrightPath = path.dirname(require.resolve('playwright'));
async function install() {
    await retry_1.retry(() => installBrowsersWithProgressBar(playwrightPath));
}
install();
@@ -1,40 +0,0 @@
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const path = require("path");
if (process.argv.length !== 3) {
    console.error('Usage: node listNodeModules.js OUTPUT_FILE');
    process.exit(-1);
}
const ROOT = path.join(__dirname, '../../../');
function findNodeModulesFiles(location, inNodeModules, result) {
    const entries = fs.readdirSync(path.join(ROOT, location));
    for (const entry of entries) {
        const entryPath = `${location}/${entry}`;
        if (/(^\/out)|(^\/src$)|(^\/.git$)|(^\/.build$)/.test(entryPath)) {
            continue;
        }
        let stat;
        try {
            stat = fs.statSync(path.join(ROOT, entryPath));
        }
        catch (err) {
            continue;
        }
        if (stat.isDirectory()) {
            findNodeModulesFiles(entryPath, inNodeModules || (entry === 'node_modules'), result);
        }
        else {
            if (inNodeModules) {
                result.push(entryPath.substr(1));
            }
        }
    }
}
const result = [];
findNodeModulesFiles('', false, result);
fs.writeFileSync(process.argv[2], result.join('\n') + '\n');
@@ -1,46 +0,0 @@
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/

'use strict';

import * as fs from 'fs';
import * as path from 'path';

if (process.argv.length !== 3) {
    console.error('Usage: node listNodeModules.js OUTPUT_FILE');
    process.exit(-1);
}

const ROOT = path.join(__dirname, '../../../');

function findNodeModulesFiles(location: string, inNodeModules: boolean, result: string[]) {
    const entries = fs.readdirSync(path.join(ROOT, location));
    for (const entry of entries) {
        const entryPath = `${location}/${entry}`;

        if (/(^\/out)|(^\/src$)|(^\/.git$)|(^\/.build$)/.test(entryPath)) {
            continue;
        }

        let stat: fs.Stats;
        try {
            stat = fs.statSync(path.join(ROOT, entryPath));
        } catch (err) {
            continue;
        }

        if (stat.isDirectory()) {
            findNodeModulesFiles(entryPath, inNodeModules || (entry === 'node_modules'), result);
        } else {
            if (inNodeModules) {
                result.push(entryPath.substr(1));
            }
        }
    }
}

const result: string[] = [];
findNodeModulesFiles('', false, result);
fs.writeFileSync(process.argv[2], result.join('\n') + '\n');
@@ -1,71 +0,0 @@
"use strict";
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const azure = require("azure-storage");
const mime = require("mime");
const minimist = require("minimist");
const path_1 = require("path");
const fileNames = [
    'fake.html',
    'host.js',
    'index.html',
    'main.js',
    'service-worker.js'
];
async function assertContainer(blobService, container) {
    await new Promise((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}
async function doesBlobExist(blobService, container, blobName) {
    const existsResult = await new Promise((c, e) => blobService.doesBlobExist(container, blobName, (err, r) => err ? e(err) : c(r)));
    return existsResult.exists;
}
async function uploadBlob(blobService, container, blobName, file) {
    const blobOptions = {
        contentSettings: {
            contentType: mime.lookup(file),
            cacheControl: 'max-age=31536000, public'
        }
    };
    await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c()));
}
async function publish(commit, files) {
    console.log('Publishing...');
    console.log('Commit:', commit);
    const storageAccount = process.env['AZURE_WEBVIEW_STORAGE_ACCOUNT'];
    const blobService = azure.createBlobService(storageAccount, process.env['AZURE_WEBVIEW_STORAGE_ACCESS_KEY'])
        .withFilter(new azure.ExponentialRetryPolicyFilter(20));
    await assertContainer(blobService, commit);
    for (const file of files) {
        const blobName = (0, path_1.basename)(file);
        const blobExists = await doesBlobExist(blobService, commit, blobName);
        if (blobExists) {
            console.log(`Blob ${commit}, ${blobName} already exists, not publishing again.`);
            continue;
        }
        console.log('Uploading blob to Azure storage...');
        await uploadBlob(blobService, commit, blobName, file);
    }
    console.log('Blobs successfully uploaded.');
}
function main() {
    const commit = process.env['BUILD_SOURCEVERSION'];
    if (!commit) {
        console.warn('Skipping publish due to missing BUILD_SOURCEVERSION');
        return;
    }
    const opts = minimist(process.argv.slice(2));
    const [directory] = opts._;
    const files = fileNames.map(fileName => (0, path_1.join)(directory, fileName));
    publish(commit, files).catch(err => {
        console.error(err);
        process.exit(1);
    });
}
if (process.argv.length < 3) {
    console.error('Usage: node publish.js <directory>');
    process.exit(-1);
}
main();
@@ -17,7 +17,7 @@ const fileNames = [
];

async function assertContainer(blobService: azure.BlobService, container: string): Promise<void> {
    await new Promise<void>((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
    await new Promise((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}

async function doesBlobExist(blobService: azure.BlobService, container: string, blobName: string): Promise<boolean | undefined> {
@@ -33,7 +33,7 @@ async function uploadBlob(blobService: azure.BlobService, container: string, blo
    }
};

    await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c()));
    await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c()));
}

async function publish(commit: string, files: readonly string[]): Promise<void> {

@@ -1,224 +0,0 @@
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const crypto = require("crypto");
const azure = require("azure-storage");
const mime = require("mime");
const minimist = require("minimist");
const documentdb_1 = require("documentdb");
// {{SQL CARBON EDIT}}
if (process.argv.length < 9) {
    console.error('Usage: node publish.js <product_quality> <platform> <file_type> <file_name> <version> <is_update> <file> [commit_id]');
    process.exit(-1);
}
function hashStream(hashName, stream) {
    return new Promise((c, e) => {
        const shasum = crypto.createHash(hashName);
        stream
            .on('data', shasum.update.bind(shasum))
            .on('error', e)
            .on('close', () => c(shasum.digest('hex')));
    });
}
function createDefaultConfig(quality) {
    return {
        id: quality,
        frozen: false
    };
}
function getConfig(quality) {
    console.log(`Getting config for quality ${quality}`);
    const client = new documentdb_1.DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
    const collection = 'dbs/builds/colls/config';
    const query = {
        query: `SELECT TOP 1 * FROM c WHERE c.id = @quality`,
        parameters: [
            { name: '@quality', value: quality }
        ]
    };
    return retry(() => new Promise((c, e) => {
        client.queryDocuments(collection, query, { enableCrossPartitionQuery: true }).toArray((err, results) => {
            if (err && err.code !== 409) {
                return e(err);
            }
            c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0]);
        });
    }));
}
function createOrUpdate(commit, quality, platform, type, release, asset, isUpdate) {
    const client = new documentdb_1.DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
    const collection = 'dbs/builds/colls/' + quality;
    const updateQuery = {
        query: 'SELECT TOP 1 * FROM c WHERE c.id = @id',
        parameters: [{ name: '@id', value: commit }]
    };
    let updateTries = 0;
    function update() {
        updateTries++;
        return new Promise((c, e) => {
            console.log(`Querying existing documents to update...`);
            client.queryDocuments(collection, updateQuery, { enableCrossPartitionQuery: true }).toArray((err, results) => {
                if (err) {
                    return e(err);
                }
                if (results.length !== 1) {
                    return e(new Error('No documents'));
                }
                const release = results[0];
                release.assets = [
                    ...release.assets.filter((a) => !(a.platform === platform && a.type === type)),
                    asset
                ];
                if (isUpdate) {
                    release.updates[platform] = type;
                }
                console.log(`Replacing existing document with updated version`);
                client.replaceDocument(release._self, release, err => {
                    if (err && err.code === 409 && updateTries < 5) {
                        return c(update());
                    }
                    if (err) {
                        return e(err);
                    }
                    console.log('Build successfully updated.');
                    c();
                });
            });
        });
    }
    return retry(() => new Promise((c, e) => {
        console.log(`Attempting to create document`);
        client.createDocument(collection, release, err => {
            if (err && err.code === 409) {
                return c(update());
            }
            if (err) {
                return e(err);
            }
            console.log('Build successfully published.');
            c();
        });
    }));
}
async function assertContainer(blobService, quality) {
    await new Promise((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}
async function doesAssetExist(blobService, quality, blobName) {
    const existsResult = await new Promise((c, e) => blobService.doesBlobExist(quality, blobName, (err, r) => err ? e(err) : c(r)));
    return existsResult.exists;
}
async function uploadBlob(blobService, quality, blobName, file) {
    const blobOptions = {
        contentSettings: {
            contentType: mime.lookup(file),
            cacheControl: 'max-age=31536000, public'
        }
    };
    await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c()));
}
async function publish(commit, quality, platform, type, name, version, _isUpdate, file, opts) {
    const isUpdate = _isUpdate === 'true';
    const queuedBy = process.env['BUILD_QUEUEDBY'];
    const sourceBranch = process.env['BUILD_SOURCEBRANCH'];
    console.log('Publishing...');
    console.log('Quality:', quality);
    console.log('Platform:', platform);
    console.log('Type:', type);
    console.log('Name:', name);
    console.log('Version:', version);
    console.log('Commit:', commit);
    console.log('Is Update:', isUpdate);
    console.log('File:', file);
    const stat = await new Promise((c, e) => fs.stat(file, (err, stat) => err ? e(err) : c(stat)));
    const size = stat.size;
    console.log('Size:', size);
    const stream = fs.createReadStream(file);
    const [sha1hash, sha256hash] = await Promise.all([hashStream('sha1', stream), hashStream('sha256', stream)]);
    console.log('SHA1:', sha1hash);
    console.log('SHA256:', sha256hash);
    const blobName = commit + '/' + name;
    const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2'];
    const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2'])
        .withFilter(new azure.ExponentialRetryPolicyFilter(20));
    await assertContainer(blobService, quality);
    const blobExists = await doesAssetExist(blobService, quality, blobName);
    if (blobExists) {
        console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
        return;
    }
    console.log('Uploading blobs to Azure storage...');
    await uploadBlob(blobService, quality, blobName, file);
    console.log('Blobs successfully uploaded.');
    const config = await getConfig(quality);
    console.log('Quality config:', config);
    const asset = {
        platform: platform,
        type: type,
        url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
        hash: sha1hash,
        sha256hash,
        size
    };
    // Remove this if we ever need to rollback fast updates for windows
    if (/win32/.test(platform)) {
        asset.supportsFastUpdate = true;
    }
    console.log('Asset:', JSON.stringify(asset, null, ' '));
    // {{SQL CARBON EDIT}}
    // Insiders: nightly build from main
    const isReleased = (((quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
        (quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))) &&
        /Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy));
    const release = {
        id: commit,
        timestamp: (new Date()).getTime(),
        version,
        isReleased: isReleased,
        sourceBranch,
        queuedBy,
        assets: [],
        updates: {}
    };
    if (!opts['upload-only']) {
        release.assets.push(asset);
        if (isUpdate) {
            release.updates[platform] = type;
        }
    }
    await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
}
const RETRY_TIMES = 10;
async function retry(fn) {
    for (let run = 1; run <= RETRY_TIMES; run++) {
        try {
            return await fn();
        }
        catch (err) {
            if (!/ECONNRESET/.test(err.message)) {
                throw err;
            }
            console.log(`Caught error ${err} - ${run}/${RETRY_TIMES}`);
        }
    }
    throw new Error('Retried too many times');
}
function main() {
    const commit = process.env['BUILD_SOURCEVERSION'];
    if (!commit) {
        console.warn('Skipping publish due to missing BUILD_SOURCEVERSION');
        return;
    }
    const opts = minimist(process.argv.slice(2), {
        boolean: ['upload-only']
    });
    const [quality, platform, type, name, version, _isUpdate, file] = opts._;
    publish(commit, quality, platform, type, name, version, _isUpdate, file, opts).catch(err => {
        console.error(err);
        process.exit(1);
    });
}
main();
@@ -43,7 +43,6 @@ function createDefaultConfig(quality: string): Config {
}

function getConfig(quality: string): Promise<Config> {
    console.log(`Getting config for quality ${quality}`);
    const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
    const collection = 'dbs/builds/colls/config';
    const query = {
@@ -53,13 +52,13 @@ function getConfig(quality: string): Promise<Config> {
        ]
    };

    return retry(() => new Promise<Config>((c, e) => {
    return new Promise<Config>((c, e) => {
        client.queryDocuments(collection, query, { enableCrossPartitionQuery: true }).toArray((err, results) => {
            if (err && err.code !== 409) { return e(err); }

            c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0] as any as Config);
        });
    }));
    });
}

interface Asset {
@@ -87,7 +86,6 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
        updateTries++;

        return new Promise<void>((c, e) => {
            console.log(`Querying existing documents to update...`);
            client.queryDocuments(collection, updateQuery, { enableCrossPartitionQuery: true }).toArray((err, results) => {
                if (err) { return e(err); }
                if (results.length !== 1) { return e(new Error('No documents')); }
@@ -103,7 +101,6 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
                    release.updates[platform] = type;
                }

                console.log(`Replacing existing document with updated version`);
                client.replaceDocument(release._self, release, err => {
                    if (err && err.code === 409 && updateTries < 5) { return c(update()); }
                    if (err) { return e(err); }
@@ -115,8 +112,7 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
        });
    }

    return retry(() => new Promise<void>((c, e) => {
        console.log(`Attempting to create document`);
    return new Promise<void>((c, e) => {
        client.createDocument(collection, release, err => {
            if (err && err.code === 409) { return c(update()); }
            if (err) { return e(err); }
@@ -124,11 +120,11 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
            console.log('Build successfully published.');
            c();
        });
    }));
    });
}

async function assertContainer(blobService: azure.BlobService, quality: string): Promise<void> {
    await new Promise<void>((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
    await new Promise((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}

async function doesAssetExist(blobService: azure.BlobService, quality: string, blobName: string): Promise<boolean | undefined> {
@@ -144,7 +140,7 @@ async function uploadBlob(blobService: azure.BlobService, quality: string, blobN
    }
};

    await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c()));
    await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, file, blobOptions, err => err ? e(err) : c()));
}

interface PublishOptions {
@@ -192,6 +188,7 @@ async function publish(commit: string, quality: string, platform: string, type:
        console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
        return;
    }

    console.log('Uploading blobs to Azure storage...');

    await uploadBlob(blobService, quality, blobName, file);
@@ -219,10 +216,10 @@ async function publish(commit: string, quality: string, platform: string, type:
    console.log('Asset:', JSON.stringify(asset, null, ' '));

    // {{SQL CARBON EDIT}}
    // Insiders: nightly build from main
    // Insiders: nightly build from master
    const isReleased = (
        (
            (quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
            (quality === 'insider' && /^master$|^refs\/heads\/master$/.test(sourceBranch)) ||
            (quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))
        ) &&
        /Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy)
@@ -250,22 +247,6 @@ async function publish(commit: string, quality: string, platform: string, type:
    await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
}

const RETRY_TIMES = 10;
async function retry<T>(fn: () => Promise<T>): Promise<T> {
    for (let run = 1; run <= RETRY_TIMES; run++) {
        try {
            return await fn();
        } catch (err) {
            if (!/ECONNRESET/.test(err.message)) {
                throw err;
            }
            console.log(`Caught error ${err} - ${run}/${RETRY_TIMES}`);
        }
    }

    throw new Error('Retried too many times');
}

function main(): void {
    const commit = process.env['BUILD_SOURCEVERSION'];

@@ -1,91 +0,0 @@
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const documentdb_1 = require("documentdb");
function createDefaultConfig(quality) {
    return {
        id: quality,
        frozen: false
    };
}
function getConfig(quality) {
    const client = new documentdb_1.DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
    const collection = 'dbs/builds/colls/config';
    const query = {
        query: `SELECT TOP 1 * FROM c WHERE c.id = @quality`,
        parameters: [
            { name: '@quality', value: quality }
        ]
    };
    return new Promise((c, e) => {
        client.queryDocuments(collection, query).toArray((err, results) => {
            if (err && err.code !== 409) {
                return e(err);
            }
            c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0]);
        });
    });
}
function doRelease(commit, quality) {
    const client = new documentdb_1.DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
    const collection = 'dbs/builds/colls/' + quality;
    const query = {
        query: 'SELECT TOP 1 * FROM c WHERE c.id = @id',
        parameters: [{ name: '@id', value: commit }]
    };
    let updateTries = 0;
    function update() {
        updateTries++;
        return new Promise((c, e) => {
            client.queryDocuments(collection, query).toArray((err, results) => {
                if (err) {
                    return e(err);
                }
                if (results.length !== 1) {
                    return e(new Error('No documents'));
                }
                const release = results[0];
                release.isReleased = true;
                client.replaceDocument(release._self, release, err => {
                    if (err && err.code === 409 && updateTries < 5) {
                        return c(update());
                    }
                    if (err) {
                        return e(err);
                    }
                    console.log('Build successfully updated.');
                    c();
                });
            });
        });
    }
    return update();
}
async function release(commit, quality) {
    const config = await getConfig(quality);
    console.log('Quality config:', config);
    if (config.frozen) {
        console.log(`Skipping release because quality ${quality} is frozen.`);
        return;
    }
    await doRelease(commit, quality);
}
function env(name) {
    const result = process.env[name];
    if (!result) {
        throw new Error(`Skipping release due to missing env: ${name}`);
    }
    return result;
}
async function main() {
    const commit = env('BUILD_SOURCEVERSION');
    const quality = env('VSCODE_QUALITY');
    await release(commit, quality);
}
main().catch(err => {
    console.error(err);
    process.exit(1);
});
@@ -1,50 +0,0 @@
/*---------------------------------------------------------------------------------------------
 *  Copyright (c) Microsoft Corporation. All rights reserved.
 *  Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const cosmos_1 = require("@azure/cosmos");
const retry_1 = require("./retry");
function getEnv(name) {
    const result = process.env[name];
    if (typeof result === 'undefined') {
        throw new Error('Missing env: ' + name);
    }
    return result;
}
function createDefaultConfig(quality) {
    return {
        id: quality,
        frozen: false
    };
}
async function getConfig(client, quality) {
    const query = `SELECT TOP 1 * FROM c WHERE c.id = "${quality}"`;
    const res = await client.database('builds').container('config').items.query(query).fetchAll();
    if (res.resources.length === 0) {
        return createDefaultConfig(quality);
    }
    return res.resources[0];
}
async function main() {
    const commit = getEnv('BUILD_SOURCEVERSION');
    const quality = getEnv('VSCODE_QUALITY');
    const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
    const config = await getConfig(client, quality);
    console.log('Quality config:', config);
    if (config.frozen) {
        console.log(`Skipping release because quality ${quality} is frozen.`);
        return;
    }
    console.log(`Releasing build ${commit}...`);
    const scripts = client.database('builds').container(quality).scripts;
    await (0, retry_1.retry)(() => scripts.storedProcedure('releaseBuild').execute('', [commit]));
}
main().then(() => {
    console.log('Build successfully released');
    process.exit(0);
}, err => {
    console.error(err);
    process.exit(1);
});
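The compiled script above interpolates the quality value straight into the SQL text. As a hedged aside, @azure/cosmos also accepts the parameterized form that the older documentdb version used; a sketch of the same lookup with a bound parameter, assuming the same builds/config container layout:

import { CosmosClient, SqlQuerySpec } from '@azure/cosmos';

async function getConfigParameterized(client: CosmosClient, quality: string) {
    // Same TOP 1 lookup as above, but with @quality bound instead of string interpolation.
    const querySpec: SqlQuerySpec = {
        query: 'SELECT TOP 1 * FROM c WHERE c.id = @quality',
        parameters: [{ name: '@quality', value: quality }]
    };
    const res = await client.database('builds').container('config').items.query(querySpec).fetchAll();
    return res.resources.length === 0 ? { id: quality, frozen: false } : res.resources[0];
}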
@@ -6,7 +6,6 @@
'use strict';

import { CosmosClient } from '@azure/cosmos';
import { retry } from './retry';

function getEnv(name: string): string {
    const result = process.env[name];
@@ -59,7 +58,7 @@ async function main(): Promise<void> {
    console.log(`Releasing build ${commit}...`);

    const scripts = client.database('builds').container(quality).scripts;
    await retry(() => scripts.storedProcedure('releaseBuild').execute('', [commit]));
    await scripts.storedProcedure('releaseBuild').execute('', [commit]);
}

main().then(() => {
@@ -1,25 +0,0 @@
/*---------------------------------------------------------------------------------------------
 *  Copyright (c) Microsoft Corporation. All rights reserved.
 *  Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
exports.retry = void 0;
async function retry(fn) {
    for (let run = 1; run <= 10; run++) {
        try {
            return await fn();
        }
        catch (err) {
            if (!/ECONNRESET/.test(err.message)) {
                throw err;
            }
            const millis = (Math.random() * 200) + (50 * Math.pow(1.5, run));
            console.log(`Failed with ECONNRESET, retrying in ${millis}ms...`);
            // maximum delay is 10th retry: ~3 seconds
            await new Promise(c => setTimeout(c, millis));
        }
    }
    throw new Error('Retried too many times');
}
exports.retry = retry;
@@ -1,26 +0,0 @@
/*---------------------------------------------------------------------------------------------
 *  Copyright (c) Microsoft Corporation. All rights reserved.
 *  Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/

'use strict';

export async function retry<T>(fn: () => Promise<T>): Promise<T> {
    for (let run = 1; run <= 10; run++) {
        try {
            return await fn();
        } catch (err) {
            if (!/ECONNRESET/.test(err.message)) {
                throw err;
            }

            const millis = (Math.random() * 200) + (50 * Math.pow(1.5, run));
            console.log(`Failed with ECONNRESET, retrying in ${millis}ms...`);

            // maximum delay is 10th retry: ~3 seconds
            await new Promise(c => setTimeout(c, millis));
        }
    }

    throw new Error('Retried too many times');
}
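For reference, the jittered backoff above works out to roughly 75–275 ms on the first retry and about 2.9–3.1 s on the tenth (50 · 1.5^run plus up to 200 ms of jitter), which matches the inline comment. A minimal usage sketch, assuming a hypothetical flaky network call purely for illustration:

import { retry } from './retry';

async function downloadManifest(url: string): Promise<string> {
    // retry() only swallows ECONNRESET; any other failure is rethrown immediately.
    return retry(async () => {
        const res = await fetch(url);               // hypothetical network call used for illustration
        if (!res.ok) {
            throw new Error(`HTTP ${res.status}`);  // not ECONNRESET, so this is not retried
        }
        return res.text();
    });
}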
@@ -1,47 +0,0 @@
/*---------------------------------------------------------------------------------------------
 *  Copyright (c) Microsoft Corporation. All rights reserved.
 *  Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const path = require("path");
const crypto = require("crypto");
const ROOT = path.join(__dirname, '../../../');
function findFiles(location, pattern, result) {
    const entries = fs.readdirSync(path.join(ROOT, location));
    for (const entry of entries) {
        const entryPath = `${location}/${entry}`;
        let stat;
        try {
            stat = fs.statSync(path.join(ROOT, entryPath));
        }
        catch (err) {
            continue;
        }
        if (stat.isDirectory()) {
            findFiles(entryPath, pattern, result);
        }
        else {
            if (stat.isFile() && entry.endsWith(pattern)) {
                result.push(path.join(ROOT, entryPath));
            }
        }
    }
}
const shasum = crypto.createHash('sha1');
/**
 * Creating a sha hash of all the files that can cause packages to change/redownload.
 */
shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
// Adding all yarn.lock files into sha sum.
const result = [];
findFiles('', 'yarn.lock', result);
result.forEach(f => shasum.update(fs.readFileSync(f)));
// Add any other command line arguments
for (let i = 2; i < process.argv.length; i++) {
    shasum.update(process.argv[i]);
}
process.stdout.write(shasum.digest('hex'));
@@ -1,54 +0,0 @@
/*---------------------------------------------------------------------------------------------
 *  Copyright (c) Microsoft Corporation. All rights reserved.
 *  Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/

'use strict';

import * as fs from 'fs';
import * as path from 'path';
import * as crypto from 'crypto';

const ROOT = path.join(__dirname, '../../../');

function findFiles(location: string, pattern: string, result: string[]) {
    const entries = fs.readdirSync(path.join(ROOT, location));

    for (const entry of entries) {
        const entryPath = `${location}/${entry}`;
        let stat: fs.Stats;
        try {
            stat = fs.statSync(path.join(ROOT, entryPath));
        } catch (err) {
            continue;
        }
        if (stat.isDirectory()) {
            findFiles(entryPath, pattern, result);
        } else {
            if (stat.isFile() && entry.endsWith(pattern)) {
                result.push(path.join(ROOT, entryPath));
            }
        }
    }
}

const shasum = crypto.createHash('sha1');

/**
 * Creating a sha hash of all the files that can cause packages to change/redownload.
 */
shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));

// Adding all yarn.lock files into sha sum.
const result: string[] = [];
findFiles('', 'yarn.lock', result);
result.forEach(f => shasum.update(fs.readFileSync(f)));

// Add any other command line arguments
for (let i = 2; i < process.argv.length; i++) {
    shasum.update(process.argv[i]);
}

process.stdout.write(shasum.digest('hex'));
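As the pipeline definitions later in this diff show, the digest printed by this script is redirected into .build/yarnlockhash and becomes part of the node_modules cache key, with extra arguments (target arch, Terrapin flag) folded into the hash. A small sketch of invoking it from a build helper, assuming it is run from the repository root:

import { execFileSync } from 'child_process';
import { writeFileSync, mkdirSync } from 'fs';

// Any extra argv values change the digest, so different configurations
// get distinct cache entries.
const hash = execFileSync('node', [
    'build/azure-pipelines/common/computeNodeModulesCacheKey.js',
    process.env['VSCODE_ARCH'] ?? 'x64',
    process.env['ENABLE_TERRAPIN'] ?? 'false'
], { encoding: 'utf8' });

mkdirSync('.build', { recursive: true });
writeFileSync('.build/yarnlockhash', hash);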
228  build/azure-pipelines/common/symbols.ts  (new file)
@@ -0,0 +1,228 @@
|
||||
/*---------------------------------------------------------------------------------------------
|
||||
* Copyright (c) Microsoft Corporation. All rights reserved.
|
||||
* Licensed under the Source EULA. See License.txt in the project root for license information.
|
||||
*--------------------------------------------------------------------------------------------*/
|
||||
|
||||
'use strict';
|
||||
|
||||
import * as request from 'request';
|
||||
import { createReadStream, createWriteStream, unlink, mkdir } from 'fs';
|
||||
import * as github from 'github-releases';
|
||||
import { join } from 'path';
|
||||
import { tmpdir } from 'os';
|
||||
import { promisify } from 'util';
|
||||
|
||||
const BASE_URL = 'https://rink.hockeyapp.net/api/2/';
|
||||
const HOCKEY_APP_TOKEN_HEADER = 'X-HockeyAppToken';
|
||||
|
||||
export interface IVersions {
|
||||
app_versions: IVersion[];
|
||||
}
|
||||
|
||||
export interface IVersion {
|
||||
id: number;
|
||||
version: string;
|
||||
}
|
||||
|
||||
export interface IApplicationAccessor {
|
||||
accessToken: string;
|
||||
appId: string;
|
||||
}
|
||||
|
||||
export interface IVersionAccessor extends IApplicationAccessor {
|
||||
id: string;
|
||||
}
|
||||
|
||||
enum Platform {
|
||||
WIN_32 = 'win32-ia32',
|
||||
WIN_64 = 'win32-x64',
|
||||
LINUX_64 = 'linux-x64',
|
||||
MAC_OS = 'darwin-x64'
|
||||
}
|
||||
|
||||
function symbolsZipName(platform: Platform, electronVersion: string, insiders: boolean): string {
|
||||
return `${insiders ? 'insiders' : 'stable'}-symbols-v${electronVersion}-${platform}.zip`;
|
||||
}
|
||||
|
||||
const SEED = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
|
||||
async function tmpFile(name: string): Promise<string> {
|
||||
let res = '';
|
||||
for (let i = 0; i < 8; i++) {
|
||||
res += SEED.charAt(Math.floor(Math.random() * SEED.length));
|
||||
}
|
||||
|
||||
const tmpParent = join(tmpdir(), res);
|
||||
|
||||
await promisify(mkdir)(tmpParent);
|
||||
|
||||
return join(tmpParent, name);
|
||||
}
|
||||
|
||||
function getVersions(accessor: IApplicationAccessor): Promise<IVersions> {
|
||||
return asyncRequest<IVersions>({
|
||||
url: `${BASE_URL}/apps/${accessor.appId}/app_versions`,
|
||||
method: 'GET',
|
||||
headers: {
|
||||
[HOCKEY_APP_TOKEN_HEADER]: accessor.accessToken
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
function createVersion(accessor: IApplicationAccessor, version: string): Promise<IVersion> {
|
||||
return asyncRequest<IVersion>({
|
||||
url: `${BASE_URL}/apps/${accessor.appId}/app_versions/new`,
|
||||
method: 'POST',
|
||||
headers: {
|
||||
[HOCKEY_APP_TOKEN_HEADER]: accessor.accessToken
|
||||
},
|
||||
formData: {
|
||||
bundle_version: version
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
function updateVersion(accessor: IVersionAccessor, symbolsPath: string) {
|
||||
return asyncRequest<IVersions>({
|
||||
url: `${BASE_URL}/apps/${accessor.appId}/app_versions/${accessor.id}`,
|
||||
method: 'PUT',
|
||||
headers: {
|
||||
[HOCKEY_APP_TOKEN_HEADER]: accessor.accessToken
|
||||
},
|
||||
formData: {
|
||||
dsym: createReadStream(symbolsPath)
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
function asyncRequest<T>(options: request.UrlOptions & request.CoreOptions): Promise<T> {
|
||||
return new Promise<T>((resolve, reject) => {
|
||||
request(options, (error, _response, body) => {
|
||||
if (error) {
|
||||
reject(error);
|
||||
} else {
|
||||
resolve(JSON.parse(body));
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
function downloadAsset(repository: any, assetName: string, targetPath: string, electronVersion: string) {
|
||||
return new Promise((resolve, reject) => {
|
||||
repository.getReleases({ tag_name: `v${electronVersion}` }, (err: any, releases: any) => {
|
||||
if (err) {
|
||||
reject(err);
|
||||
} else {
|
||||
const asset = releases[0].assets.filter((asset: any) => asset.name === assetName)[0];
|
||||
if (!asset) {
|
||||
reject(new Error(`Asset with name ${assetName} not found`));
|
||||
} else {
|
||||
repository.downloadAsset(asset, (err: any, reader: any) => {
|
||||
if (err) {
|
||||
reject(err);
|
||||
} else {
|
||||
const writer = createWriteStream(targetPath);
|
||||
writer.on('error', reject);
|
||||
writer.on('close', resolve);
|
||||
reader.on('error', reject);
|
||||
|
||||
reader.pipe(writer);
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
interface IOptions {
|
||||
repository: string;
|
||||
platform: Platform;
|
||||
versions: { code: string; insiders: boolean; electron: string; };
|
||||
access: { hockeyAppToken: string; hockeyAppId: string; githubToken: string };
|
||||
}
|
||||
|
||||
async function ensureVersionAndSymbols(options: IOptions) {
|
||||
|
||||
// Check version does not exist
|
||||
console.log(`HockeyApp: checking for existing version ${options.versions.code} (${options.platform})`);
|
||||
const versions = await getVersions({ accessToken: options.access.hockeyAppToken, appId: options.access.hockeyAppId });
|
||||
if (!Array.isArray(versions.app_versions)) {
|
||||
throw new Error(`Unexpected response: ${JSON.stringify(versions)}`);
|
||||
}
|
||||
|
||||
if (versions.app_versions.some(v => v.version === options.versions.code)) {
|
||||
console.log(`HockeyApp: Returning without uploading symbols because version ${options.versions.code} (${options.platform}) was already found`);
|
||||
return;
|
||||
}
|
||||
|
||||
// Download symbols for platform and electron version
|
||||
const symbolsName = symbolsZipName(options.platform, options.versions.electron, options.versions.insiders);
|
||||
const symbolsPath = await tmpFile('symbols.zip');
|
||||
console.log(`HockeyApp: downloading symbols ${symbolsName} for electron ${options.versions.electron} (${options.platform}) into ${symbolsPath}`);
|
||||
await downloadAsset(new (github as any)({ repo: options.repository, token: options.access.githubToken }), symbolsName, symbolsPath, options.versions.electron);
|
||||
|
||||
// Create version
|
||||
console.log(`HockeyApp: creating new version ${options.versions.code} (${options.platform})`);
|
||||
const version = await createVersion({ accessToken: options.access.hockeyAppToken, appId: options.access.hockeyAppId }, options.versions.code);
|
||||
|
||||
// Upload symbols
|
||||
console.log(`HockeyApp: uploading symbols for version ${options.versions.code} (${options.platform})`);
|
||||
await updateVersion({ id: String(version.id), accessToken: options.access.hockeyAppToken, appId: options.access.hockeyAppId }, symbolsPath);
|
||||
|
||||
// Cleanup
|
||||
await promisify(unlink)(symbolsPath);
|
||||
}
|
||||
|
||||
// Environment
|
||||
const pakage = require('../../../package.json');
|
||||
const product = require('../../../product.json');
|
||||
const repository = product.electronRepository;
|
||||
const electronVersion = require('../../lib/electron').getElectronVersion();
|
||||
const insiders = product.quality !== 'stable';
|
||||
let codeVersion = pakage.version;
|
||||
if (insiders) {
|
||||
codeVersion = `${codeVersion}-insider`;
|
||||
}
|
||||
const githubToken = process.argv[2];
|
||||
const hockeyAppToken = process.argv[3];
|
||||
const is64 = process.argv[4] === 'x64';
|
||||
const hockeyAppId = process.argv[5];
|
||||
|
||||
if (process.argv.length !== 6) {
|
||||
throw new Error(`HockeyApp: Unexpected number of arguments. Got ${process.argv}`);
|
||||
}
|
||||
|
||||
let platform: Platform;
|
||||
if (process.platform === 'darwin') {
|
||||
platform = Platform.MAC_OS;
|
||||
} else if (process.platform === 'win32') {
|
||||
platform = is64 ? Platform.WIN_64 : Platform.WIN_32;
|
||||
} else {
|
||||
platform = Platform.LINUX_64;
|
||||
}
|
||||
|
||||
// Create version and upload symbols in HockeyApp
|
||||
if (repository && codeVersion && electronVersion && (product.quality === 'stable' || product.quality === 'insider')) {
|
||||
ensureVersionAndSymbols({
|
||||
repository,
|
||||
platform,
|
||||
versions: {
|
||||
code: codeVersion,
|
||||
insiders,
|
||||
electron: electronVersion
|
||||
},
|
||||
access: {
|
||||
githubToken,
|
||||
hockeyAppToken,
|
||||
hockeyAppId
|
||||
}
|
||||
}).then(() => {
|
||||
console.log('HockeyApp: done');
|
||||
}).catch(error => {
|
||||
console.error(`HockeyApp: error ${error} (AppID: ${hockeyAppId})`);
|
||||
|
||||
return process.exit(1);
|
||||
});
|
||||
} else {
|
||||
console.log(`HockeyApp: skipping due to unexpected context (repository: ${repository}, codeVersion: ${codeVersion}, electronVersion: ${electronVersion}, quality: ${product.quality})`);
|
||||
}
|
||||
@@ -1,87 +0,0 @@
/*---------------------------------------------------------------------------------------------
 *  Copyright (c) Microsoft Corporation. All rights reserved.
 *  Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const url = require("url");
const azure = require("azure-storage");
const mime = require("mime");
const cosmos_1 = require("@azure/cosmos");
const retry_1 = require("./retry");
function log(...args) {
    console.log(...[`[${new Date().toISOString()}]`, ...args]);
}
function error(...args) {
    console.error(...[`[${new Date().toISOString()}]`, ...args]);
}
if (process.argv.length < 3) {
    error('Usage: node sync-mooncake.js <quality>');
    process.exit(-1);
}
async function sync(commit, quality) {
    log(`Synchronizing Mooncake assets for ${quality}, ${commit}...`);
    const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
    const container = client.database('builds').container(quality);
    const query = `SELECT TOP 1 * FROM c WHERE c.id = "${commit}"`;
    const res = await container.items.query(query, {}).fetchAll();
    if (res.resources.length !== 1) {
        throw new Error(`No builds found for ${commit}`);
    }
    const build = res.resources[0];
    log(`Found build for ${commit}, with ${build.assets.length} assets`);
    const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2'];
    const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2'])
        .withFilter(new azure.ExponentialRetryPolicyFilter(20));
    const mooncakeBlobService = azure.createBlobService(storageAccount, process.env['MOONCAKE_STORAGE_ACCESS_KEY'], `${storageAccount}.blob.core.chinacloudapi.cn`)
        .withFilter(new azure.ExponentialRetryPolicyFilter(20));
    // mooncake is fussy and far away, this is needed!
    blobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
    mooncakeBlobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
    for (const asset of build.assets) {
        try {
            const blobPath = url.parse(asset.url).path;
            if (!blobPath) {
                throw new Error(`Failed to parse URL: ${asset.url}`);
            }
            const blobName = blobPath.replace(/^\/\w+\//, '');
            log(`Found ${blobName}`);
            if (asset.mooncakeUrl) {
                log(` Already in Mooncake ✔️`);
                continue;
            }
            const readStream = blobService.createReadStream(quality, blobName, undefined);
            const blobOptions = {
                contentSettings: {
                    contentType: mime.lookup(blobPath),
                    cacheControl: 'max-age=31536000, public'
                }
            };
            const writeStream = mooncakeBlobService.createWriteStreamToBlockBlob(quality, blobName, blobOptions, undefined);
            log(` Uploading to Mooncake...`);
            await new Promise((c, e) => readStream.pipe(writeStream).on('finish', c).on('error', e));
            log(` Updating build in DB...`);
            const mooncakeUrl = `${process.env['MOONCAKE_CDN_URL']}${blobPath}`;
            await (0, retry_1.retry)(() => container.scripts.storedProcedure('setAssetMooncakeUrl')
                .execute('', [commit, asset.platform, asset.type, mooncakeUrl]));
            log(` Done ✔️`);
        }
        catch (err) {
            error(err);
        }
    }
    log(`All done ✔️`);
}
function main() {
    const commit = process.env['BUILD_SOURCEVERSION'];
    if (!commit) {
        error('Skipping publish due to missing BUILD_SOURCEVERSION');
        return;
    }
    const quality = process.argv[2];
    sync(commit, quality).catch(err => {
        error(err);
        process.exit(1);
    });
}
main();
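The blob name above is derived with the legacy url.parse(...).path. A hedged sketch of the same derivation using the WHATWG URL API (an alternative on our part, not what the script ships), reusing the same container-stripping regex:

// e.g. https://<account>.blob.core.windows.net/insider/<commit>/VSCode-darwin.zip
function blobNameFromAssetUrl(assetUrl: string): string {
    const { pathname } = new URL(assetUrl);   // "/insider/<commit>/VSCode-darwin.zip"
    return pathname.replace(/^\/\w+\//, '');  // strip the leading container segment
}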
@@ -9,7 +9,6 @@ import * as url from 'url';
import * as azure from 'azure-storage';
import * as mime from 'mime';
import { CosmosClient } from '@azure/cosmos';
import { retry } from './retry';

function log(...args: any[]) {
    console.log(...[`[${new Date().toISOString()}]`, ...args]);
@@ -100,8 +99,8 @@ async function sync(commit: string, quality: string): Promise<void> {

    log(` Updating build in DB...`);
    const mooncakeUrl = `${process.env['MOONCAKE_CDN_URL']}${blobPath}`;
    await retry(() => container.scripts.storedProcedure('setAssetMooncakeUrl')
        .execute('', [commit, asset.platform, asset.type, mooncakeUrl]));
    await container.scripts.storedProcedure('setAssetMooncakeUrl')
        .execute('', [commit, asset.platform, asset.type, mooncakeUrl]);

    log(` Done ✔️`);
} catch (err) {
@@ -1,18 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
	<dict>
		<key>com.apple.security.cs.allow-jit</key>
		<true/>
		<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
		<true/>
		<key>com.apple.security.cs.allow-dyld-environment-variables</key>
		<true/>
		<key>com.apple.security.device.audio-input</key>
		<true/>
		<key>com.apple.security.device.camera</key>
		<true/>
		<key>com.apple.security.automation.apple-events</key>
		<true/>
	</dict>
</plist>
@@ -1,26 +1,24 @@
|
||||
steps:
|
||||
- task: NodeTool@0
|
||||
inputs:
|
||||
versionSpec: "12.18.3"
|
||||
- task: NodeTool@0
|
||||
inputs:
|
||||
versionSpec: "12.13.0"
|
||||
|
||||
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3 # {{SQL CARBON EDIT}} update version
|
||||
inputs:
|
||||
versionSpec: "1.x"
|
||||
|
||||
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
|
||||
displayName: Restore Cache - Node Modules # {{SQL CARBON EDIT}}
|
||||
inputs:
|
||||
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
|
||||
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
|
||||
vstsFeed: 'npm-cache' # {{SQL CARBON EDIT}} update build cache
|
||||
|
||||
- script: |
|
||||
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
|
||||
displayName: Install Dependencies
|
||||
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
|
||||
- script: |
|
||||
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
|
||||
displayName: Install Dependencies
|
||||
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
|
||||
|
||||
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
|
||||
displayName: Save Cache - Node Modules # {{SQL CARBON EDIT}}
|
||||
inputs:
|
||||
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
|
||||
targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
|
||||
@@ -31,45 +29,52 @@ steps:
|
||||
yarn electron x64
|
||||
displayName: Download Electron
|
||||
|
||||
# - script: | {{SQL CARBON EDIT}} remove editor checks
|
||||
- script: |
|
||||
yarn gulp hygiene
|
||||
displayName: Run Hygiene Checks
|
||||
|
||||
- script: | # {{SQL CARBON EDIT}} add step
|
||||
yarn strict-vscode
|
||||
displayName: Run Strict Null Check.
|
||||
|
||||
# - script: | {{SQL CARBON EDIT}} remove step
|
||||
# yarn monaco-compile-check
|
||||
# displayName: Run Monaco Editor Checks
|
||||
|
||||
- script: |
|
||||
yarn valid-layers-check
|
||||
displayName: Run Valid Layers Checks
|
||||
- script: |
|
||||
yarn valid-layers-check
|
||||
displayName: Run Valid Layers Checks
|
||||
|
||||
- script: |
|
||||
yarn compile
|
||||
displayName: Compile Sources
|
||||
- script: |
|
||||
yarn compile
|
||||
displayName: Compile Sources
|
||||
|
||||
# - script: | {{SQL CARBON EDIT}} remove step
|
||||
# yarn download-builtin-extensions
|
||||
# displayName: Download Built-in Extensions
|
||||
|
||||
- script: |
|
||||
./scripts/test.sh --tfs "Unit Tests"
|
||||
displayName: Run Unit Tests (Electron)
|
||||
- script: |
|
||||
./scripts/test.sh --tfs "Unit Tests"
|
||||
displayName: Run Unit Tests (Electron)
|
||||
|
||||
# - script: | {{SQL CARBON EDIT}} disable
|
||||
# yarn test-browser --browser chromium --browser webkit --browser firefox --tfs "Browser Unit Tests"
|
||||
# yarn test-browser --browser chromium --browser webkit --browser firefox
|
||||
# displayName: Run Unit Tests (Browser)
|
||||
|
||||
# - script: | {{SQL CARBON EDIT}} disable
|
||||
# ./scripts/test-integration.sh --tfs "Integration Tests"
|
||||
# displayName: Run Integration Tests (Electron)
|
||||
|
||||
- task: PublishPipelineArtifact@0
|
||||
inputs:
|
||||
artifactName: crash-dump-macos
|
||||
targetPath: .build/crashes
|
||||
displayName: "Publish Crash Reports"
|
||||
continueOnError: true
|
||||
condition: failed()
|
||||
# - task: PublishPipelineArtifact@0
|
||||
# inputs:
|
||||
# artifactName: crash-dump-macos
|
||||
# targetPath: .build/crashes
|
||||
# displayName: 'Publish Crash Reports'
|
||||
# condition: succeededOrFailed()
|
||||
|
||||
- task: PublishTestResults@2
|
||||
displayName: Publish Tests Results
|
||||
inputs:
|
||||
testResultsFiles: "*-results.xml"
|
||||
searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
|
||||
condition: succeededOrFailed()
|
||||
- task: PublishTestResults@2
|
||||
displayName: Publish Tests Results
|
||||
inputs:
|
||||
testResultsFiles: '*-results.xml'
|
||||
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
|
||||
condition: succeededOrFailed()
|
||||
|
||||
6  build/azure-pipelines/darwin/entitlements.plist  (new file)
@@ -0,0 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
	<dict>
	</dict>
</plist>
@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
	<dict>
		<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
		<true/>
		<key>com.apple.security.cs.disable-library-validation</key>
		<true/>
	</dict>
</plist>
@@ -1,129 +0,0 @@
|
||||
steps:
|
||||
- task: NodeTool@0
|
||||
inputs:
|
||||
versionSpec: "14.x"
|
||||
|
||||
- task: AzureKeyVault@1
|
||||
displayName: "Azure Key Vault: Get Secrets"
|
||||
inputs:
|
||||
azureSubscription: "vscode-builds-subscription"
|
||||
KeyVaultName: vscode
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
cat << EOF > ~/.netrc
|
||||
machine github.com
|
||||
login vscode
|
||||
password $(github-distro-mixin-password)
|
||||
EOF
|
||||
|
||||
git config user.email "vscode@microsoft.com"
|
||||
git config user.name "VSCode"
|
||||
displayName: Prepare tooling
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
|
||||
displayName: Merge distro
|
||||
|
||||
- script: |
|
||||
pushd build \
|
||||
&& yarn \
|
||||
&& npm install -g typescript \
|
||||
&& tsc azure-pipelines/common/createAsset.ts \
|
||||
&& popd
|
||||
displayName: Restore modules for just build folder and compile it
|
||||
|
||||
- download: current
|
||||
artifact: vscode-darwin-$(VSCODE_ARCH)
|
||||
displayName: Download $(VSCODE_ARCH) artifact
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
unzip $(Pipeline.Workspace)/vscode-darwin-$(VSCODE_ARCH)/VSCode-darwin-$(VSCODE_ARCH).zip -d $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
|
||||
mv $(Pipeline.Workspace)/vscode-darwin-$(VSCODE_ARCH)/VSCode-darwin-$(VSCODE_ARCH).zip $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip
|
||||
displayName: Unzip & move
|
||||
|
||||
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
|
||||
inputs:
|
||||
ConnectedServiceName: "ESRP CodeSign"
|
||||
FolderPath: "$(agent.builddirectory)"
|
||||
Pattern: "VSCode-darwin-$(VSCODE_ARCH).zip"
|
||||
signConfigType: inlineSignParams
|
||||
inlineOperation: |
|
||||
[
|
||||
{
|
||||
"keyCode": "CP-401337-Apple",
|
||||
"operationSetCode": "MacAppDeveloperSign",
|
||||
"parameters": [
|
||||
{
|
||||
"parameterName": "Hardening",
|
||||
"parameterValue": "--options=runtime"
|
||||
}
|
||||
],
|
||||
"toolName": "sign",
|
||||
"toolVersion": "1.0"
|
||||
}
|
||||
]
|
||||
SessionTimeout: 60
|
||||
displayName: Codesign
|
||||
|
||||
- script: |
|
||||
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
|
||||
APP_NAME="`ls $APP_ROOT | head -n 1`"
|
||||
BUNDLE_IDENTIFIER=$(node -p "require(\"$APP_ROOT/$APP_NAME/Contents/Resources/app/product.json\").darwinBundleIdentifier")
|
||||
echo "##vso[task.setvariable variable=BundleIdentifier]$BUNDLE_IDENTIFIER"
|
||||
displayName: Export bundle identifier
|
||||
|
||||
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
|
||||
inputs:
|
||||
ConnectedServiceName: "ESRP CodeSign"
|
||||
FolderPath: "$(agent.builddirectory)"
|
||||
Pattern: "VSCode-darwin-$(VSCODE_ARCH).zip"
|
||||
signConfigType: inlineSignParams
|
||||
inlineOperation: |
|
||||
[
|
||||
{
|
||||
"keyCode": "CP-401337-Apple",
|
||||
"operationSetCode": "MacAppNotarize",
|
||||
"parameters": [
|
||||
{
|
||||
"parameterName": "BundleId",
|
||||
"parameterValue": "$(BundleIdentifier)"
|
||||
}
|
||||
],
|
||||
"toolName": "sign",
|
||||
"toolVersion": "1.0"
|
||||
}
|
||||
]
|
||||
SessionTimeout: 60
|
||||
displayName: Notarization
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
|
||||
APP_NAME="`ls $APP_ROOT | head -n 1`"
|
||||
"$APP_ROOT/$APP_NAME/Contents/Resources/app/bin/code" --export-default-configuration=.build
|
||||
displayName: Verify start after signing (export configuration)
|
||||
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
|
||||
# For legacy purposes, arch for x64 is just 'darwin'
|
||||
case $VSCODE_ARCH in
|
||||
x64) ASSET_ID="darwin" ;;
|
||||
arm64) ASSET_ID="darwin-arm64" ;;
|
||||
universal) ASSET_ID="darwin-universal" ;;
|
||||
esac
|
||||
|
||||
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
|
||||
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
|
||||
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
|
||||
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
|
||||
node build/azure-pipelines/common/createAsset.js \
|
||||
"$ASSET_ID" \
|
||||
archive \
|
||||
"VSCode-$ASSET_ID.zip" \
|
||||
../VSCode-darwin-$(VSCODE_ARCH).zip
|
||||
displayName: Publish Clients
|
||||
@@ -1,312 +1,263 @@
|
||||
steps:
|
||||
- task: NodeTool@0
|
||||
inputs:
|
||||
versionSpec: "14.x"
|
||||
- script: |
|
||||
mkdir -p .build
|
||||
echo -n $BUILD_SOURCEVERSION > .build/commit
|
||||
echo -n $VSCODE_QUALITY > .build/quality
|
||||
displayName: Prepare cache flag
|
||||
|
||||
- task: AzureKeyVault@1
|
||||
displayName: "Azure Key Vault: Get Secrets"
|
||||
inputs:
|
||||
azureSubscription: "vscode-builds-subscription"
|
||||
KeyVaultName: vscode
|
||||
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
|
||||
inputs:
|
||||
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
|
||||
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
|
||||
vstsFeed: 'npm-vscode'
|
||||
platformIndependent: true
|
||||
alias: 'Compilation'
|
||||
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: Compilation
|
||||
path: $(Build.ArtifactStagingDirectory)
|
||||
displayName: Download compilation output
|
||||
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'universal'))
|
||||
- script: |
|
||||
set -e
|
||||
exit 1
|
||||
displayName: Check RestoreCache
|
||||
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
tar -xzf $(Build.ArtifactStagingDirectory)/compilation.tar.gz
|
||||
displayName: Extract compilation output
|
||||
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'universal'))
|
||||
- task: NodeTool@0
|
||||
inputs:
|
||||
versionSpec: "12.13.0"
|
||||
|
||||
# Set up the credentials to retrieve distro repo and setup git persona
|
||||
# to create a merge commit for when we merge distro into oss
|
||||
- script: |
|
||||
set -e
|
||||
cat << EOF > ~/.netrc
|
||||
machine github.com
|
||||
login vscode
|
||||
password $(github-distro-mixin-password)
|
||||
EOF
|
||||
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
|
||||
inputs:
|
||||
versionSpec: "1.x"
|
||||
|
||||
git config user.email "vscode@microsoft.com"
|
||||
git config user.name "VSCode"
|
||||
displayName: Prepare tooling
|
||||
- task: AzureKeyVault@1
|
||||
displayName: 'Azure Key Vault: Get Secrets'
|
||||
inputs:
|
||||
azureSubscription: 'vscode-builds-subscription'
|
||||
KeyVaultName: vscode
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
sudo xcode-select -s /Applications/Xcode_12.2.app
|
||||
displayName: Switch to Xcode 12
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'arm64'))
|
||||
- script: |
|
||||
set -e
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
git pull --no-rebase https://github.com/$(VSCODE_MIXIN_REPO).git $(node -p "require('./package.json').distro")
|
||||
displayName: Merge distro
|
||||
cat << EOF > ~/.netrc
|
||||
machine github.com
|
||||
login vscode
|
||||
password $(github-distro-mixin-password)
|
||||
EOF
|
||||
|
||||
- script: |
|
||||
mkdir -p .build
|
||||
node build/azure-pipelines/common/computeNodeModulesCacheKey.js $VSCODE_ARCH $ENABLE_TERRAPIN > .build/yarnlockhash
|
||||
displayName: Prepare yarn cache flags
|
||||
git config user.email "vscode@microsoft.com"
|
||||
git config user.name "VSCode"
|
||||
displayName: Prepare tooling
|
||||
|
||||
- task: Cache@2
|
||||
inputs:
|
||||
key: 'nodeModules | $(Agent.OS) | .build/yarnlockhash'
|
||||
path: .build/node_modules_cache
|
||||
cacheHitVar: NODE_MODULES_RESTORED
|
||||
displayName: Restore node_modules cache
|
||||
- script: |
|
||||
set -e
|
||||
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
|
||||
git fetch distro
|
||||
git merge $(node -p "require('./package.json').distro")
|
||||
displayName: Merge distro
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
tar -xzf .build/node_modules_cache/cache.tgz
|
||||
condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
|
||||
displayName: Extract node_modules cache
|
||||
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
|
||||
inputs:
|
||||
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
|
||||
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
|
||||
vstsFeed: 'npm-vscode'
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
npm install -g node-gyp@latest
|
||||
node-gyp --version
|
||||
displayName: Update node-gyp
|
||||
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
|
||||
- script: |
|
||||
set -e
|
||||
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
|
||||
displayName: Install dependencies
|
||||
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
npx https://aka.ms/enablesecurefeed standAlone
|
||||
timeoutInMinutes: 5
|
||||
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'), eq(variables['ENABLE_TERRAPIN'], 'true'))
|
||||
displayName: Switch to Terrapin packages
|
||||
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
|
||||
inputs:
|
||||
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
|
||||
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
|
||||
vstsFeed: 'npm-vscode'
|
||||
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
export npm_config_arch=$(VSCODE_ARCH)
|
||||
export npm_config_node_gyp=$(which node-gyp)
|
||||
export npm_config_build_from_source=true
|
||||
export SDKROOT=/Applications/Xcode_12.2.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.0.sdk
|
||||
- script: |
|
||||
set -e
|
||||
yarn postinstall
|
||||
displayName: Run postinstall scripts
|
||||
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
|
||||
|
||||
for i in {1..3}; do # try 3 times, for Terrapin
|
||||
yarn --frozen-lockfile && break
|
||||
if [ $i -eq 3 ]; then
|
||||
echo "Yarn failed too many times" >&2
|
||||
exit 1
|
||||
fi
|
||||
echo "Yarn failed $i, trying again..."
|
||||
done
|
||||
env:
|
||||
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
|
||||
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
|
||||
displayName: Install dependencies
|
||||
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
|
||||
- script: |
|
||||
set -e
|
||||
node build/azure-pipelines/mixin
|
||||
displayName: Mix in quality
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
|
||||
mkdir -p .build/node_modules_cache
|
||||
tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
|
||||
condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
|
||||
displayName: Create node_modules archive
|
||||
- script: |
|
||||
set -e
|
||||
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
|
||||
yarn gulp vscode-darwin-min-ci
|
||||
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
|
||||
yarn gulp vscode-reh-darwin-min-ci
|
||||
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
|
||||
yarn gulp vscode-reh-web-darwin-min-ci
|
||||
displayName: Build
|
||||
|
||||
# This script brings in the right resources (images, icons, etc) based on the quality (insiders, stable, exploration)
|
||||
- script: |
|
||||
set -e
|
||||
node build/azure-pipelines/mixin
|
||||
displayName: Mix in quality
|
||||
- script: |
|
||||
set -e
|
||||
./scripts/test.sh --build --tfs "Unit Tests"
|
||||
displayName: Run unit tests (Electron)
|
||||
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
|
||||
yarn gulp vscode-darwin-$(VSCODE_ARCH)-min-ci
|
||||
displayName: Build client
|
||||
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'universal'))
|
||||
- script: |
|
||||
set -e
|
||||
yarn test-browser --build --browser chromium --browser webkit --browser firefox
|
||||
displayName: Run unit tests (Browser)
|
||||
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
|
||||
yarn gulp vscode-reh-darwin-min-ci
|
||||
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
|
||||
yarn gulp vscode-reh-web-darwin-min-ci
|
||||
displayName: Build Server
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
|
||||
- script: |
|
||||
# Figure out the full absolute path of the product we just built
|
||||
# including the remote server and configure the integration tests
|
||||
# to run with these builds instead of running out of sources.
|
||||
set -e
|
||||
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
|
||||
APP_NAME="`ls $APP_ROOT | head -n 1`"
|
||||
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
|
||||
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
|
||||
./scripts/test-integration.sh --build --tfs "Integration Tests"
|
||||
displayName: Run integration tests (Electron)
|
||||
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
|
||||
yarn npm-run-all -lp "electron $(VSCODE_ARCH)" "playwright-install"
|
||||
displayName: Download Electron and Playwright
|
||||
condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'universal'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
- script: |
|
||||
set -e
|
||||
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
|
||||
APP_NAME="`ls $APP_ROOT | head -n 1`"
|
||||
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
|
||||
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
|
||||
./resources/server/test/test-remote-integration.sh
|
||||
displayName: Run remote integration tests (Electron)
|
||||
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
|
||||
- download: current
|
||||
artifact: vscode-darwin-x64
|
||||
displayName: Download x64 artifact
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'universal'))
|
||||
- script: |
|
||||
set -e
|
||||
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
|
||||
./resources/server/test/test-web-integration.sh --browser webkit
|
||||
displayName: Run integration tests (Browser)
|
||||
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
|
||||
- download: current
|
||||
artifact: vscode-darwin-arm64
|
||||
displayName: Download arm64 artifact
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'universal'))
|
||||
- script: |
|
||||
set -e
|
||||
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
|
||||
APP_NAME="`ls $APP_ROOT | head -n 1`"
|
||||
yarn smoketest --build "$APP_ROOT/$APP_NAME"
|
||||
continueOnError: true
|
||||
displayName: Run smoke tests (Electron)
|
||||
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
cp $(Pipeline.Workspace)/vscode-darwin-x64/VSCode-darwin-x64.zip $(agent.builddirectory)/VSCode-darwin-x64.zip
|
||||
cp $(Pipeline.Workspace)/vscode-darwin-arm64/VSCode-darwin-arm64.zip $(agent.builddirectory)/VSCode-darwin-arm64.zip
|
||||
unzip $(agent.builddirectory)/VSCode-darwin-x64.zip -d $(agent.builddirectory)/VSCode-darwin-x64
|
||||
unzip $(agent.builddirectory)/VSCode-darwin-arm64.zip -d $(agent.builddirectory)/VSCode-darwin-arm64
|
||||
DEBUG=* node build/darwin/create-universal-app.js
|
||||
displayName: Create Universal App
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'universal'))
|
||||
- script: |
|
||||
set -e
|
||||
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
|
||||
yarn smoketest --web --headless
|
||||
continueOnError: true
|
||||
displayName: Run smoke tests (Browser)
|
||||
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
|
||||
# Setting hardened entitlements is a requirement for:
|
||||
# * Apple notarization
|
||||
# * Running tests on Big Sur (because Big Sur has additional security precautions)
|
||||
- script: |
|
||||
set -e
|
||||
security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
|
||||
security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
|
||||
security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
|
||||
echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
|
||||
security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
|
||||
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
|
||||
VSCODE_ARCH=$(VSCODE_ARCH) DEBUG=electron-osx-sign* node build/darwin/sign.js
|
||||
displayName: Set Hardened Entitlements
|
||||
- task: PublishPipelineArtifact@0
|
||||
inputs:
|
||||
artifactName: crash-dump-macos
|
||||
targetPath: .build/crashes
|
||||
displayName: 'Publish Crash Reports'
|
||||
condition: succeededOrFailed()
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
./scripts/test.sh --build --tfs "Unit Tests"
|
||||
displayName: Run unit tests (Electron)
|
||||
timeoutInMinutes: 7
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
- script: |
|
||||
set -e
|
||||
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
|
||||
APP_NAME="`ls $APP_ROOT | head -n 1`"
|
||||
HELPER_APP_NAME="`echo $APP_NAME | sed -e 's/^Visual Studio //;s/\.app$//'`"
|
||||
APP_FRAMEWORK_PATH="$APP_ROOT/$APP_NAME/Contents/Frameworks"
|
||||
security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
|
||||
security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
|
||||
security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
|
||||
echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
|
||||
security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
|
||||
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
|
||||
codesign -s 99FM488X57 --deep --force --options runtime --entitlements build/azure-pipelines/darwin/entitlements.plist "$APP_ROOT"/*.app
|
||||
codesign -s 99FM488X57 --force --options runtime --entitlements build/azure-pipelines/darwin/helper-gpu-entitlements.plist "$APP_FRAMEWORK_PATH/$HELPER_APP_NAME Helper (GPU).app"
|
||||
codesign -s 99FM488X57 --force --options runtime --entitlements build/azure-pipelines/darwin/helper-plugin-entitlements.plist "$APP_FRAMEWORK_PATH/$HELPER_APP_NAME Helper (Plugin).app"
|
||||
codesign -s 99FM488X57 --force --options runtime --entitlements build/azure-pipelines/darwin/helper-renderer-entitlements.plist "$APP_FRAMEWORK_PATH/$HELPER_APP_NAME Helper (Renderer).app"
|
||||
displayName: Set Hardened Entitlements
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
yarn test-browser --build --browser chromium --browser webkit --browser firefox --tfs "Browser Unit Tests"
|
||||
displayName: Run unit tests (Browser)
|
||||
timeoutInMinutes: 7
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
- script: |
|
||||
set -e
|
||||
pushd $(agent.builddirectory)/VSCode-darwin && zip -r -X -y $(agent.builddirectory)/VSCode-darwin.zip * && popd
|
||||
displayName: Archive build
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
yarn --cwd test/integration/browser compile
|
||||
displayName: Compile integration tests
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
|
||||
inputs:
|
||||
ConnectedServiceName: 'ESRP CodeSign'
|
||||
FolderPath: '$(agent.builddirectory)'
|
||||
Pattern: 'VSCode-darwin.zip'
|
||||
signConfigType: inlineSignParams
|
||||
inlineOperation: |
|
||||
[
|
||||
{
|
||||
"keyCode": "CP-401337-Apple",
|
||||
"operationSetCode": "MacAppDeveloperSign",
|
||||
"parameters": [
|
||||
{
|
||||
"parameterName": "Hardening",
|
||||
"parameterValue": "--options=runtime"
|
||||
}
|
||||
],
|
||||
"toolName": "sign",
|
||||
"toolVersion": "1.0"
|
||||
}
|
||||
]
|
||||
SessionTimeout: 60
|
||||
displayName: Codesign
|
||||
|
||||
- script: |
|
||||
# Figure out the full absolute path of the product we just built
|
||||
# including the remote server and configure the integration tests
|
||||
# to run with these builds instead of running out of sources.
|
||||
set -e
|
||||
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
|
||||
APP_NAME="`ls $APP_ROOT | head -n 1`"
|
||||
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
|
||||
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
|
||||
./scripts/test-integration.sh --build --tfs "Integration Tests"
|
||||
displayName: Run integration tests (Electron)
|
||||
timeoutInMinutes: 10
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
- script: |
|
||||
zip -d $(agent.builddirectory)/VSCode-darwin.zip "*.pkg"
|
||||
displayName: Clean Archive
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
|
||||
./resources/server/test/test-web-integration.sh --browser webkit
|
||||
displayName: Run integration tests (Browser)
|
||||
timeoutInMinutes: 10
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
- script: |
|
||||
set -e
|
||||
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
|
||||
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
|
||||
node build/azure-pipelines/common/createAsset.js darwin-unnotarized archive "VSCode-darwin-$VSCODE_QUALITY.zip" $(agent.builddirectory)/VSCode-darwin.zip
|
||||
displayName: Publish Unnotarized Build
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
|
||||
APP_NAME="`ls $APP_ROOT | head -n 1`"
|
||||
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
|
||||
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
|
||||
./resources/server/test/test-remote-integration.sh
|
||||
displayName: Run remote integration tests (Electron)
|
||||
timeoutInMinutes: 7
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
- script: |
|
||||
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
|
||||
APP_NAME="`ls $APP_ROOT | head -n 1`"
|
||||
BUNDLE_IDENTIFIER=$(node -p "require(\"$APP_ROOT/$APP_NAME/Contents/Resources/app/product.json\").darwinBundleIdentifier")
|
||||
echo "##vso[task.setvariable variable=BundleIdentifier]$BUNDLE_IDENTIFIER"
|
||||
displayName: Export bundle identifier
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
yarn --cwd test/smoke compile
|
||||
displayName: Compile smoke tests
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
|
||||
inputs:
|
||||
ConnectedServiceName: 'ESRP CodeSign'
|
||||
FolderPath: '$(agent.builddirectory)'
|
||||
Pattern: 'VSCode-darwin.zip'
|
||||
signConfigType: inlineSignParams
|
||||
inlineOperation: |
|
||||
[
|
||||
{
|
||||
"keyCode": "CP-401337-Apple",
|
||||
"operationSetCode": "MacAppNotarize",
|
||||
"parameters": [
|
||||
{
|
||||
"parameterName": "BundleId",
|
||||
"parameterValue": "$(BundleIdentifier)"
|
||||
}
|
||||
],
|
||||
"toolName": "sign",
|
||||
"toolVersion": "1.0"
|
||||
}
|
||||
]
|
||||
SessionTimeout: 60
|
||||
displayName: Notarization
|
||||
|
||||
- script: |
|
||||
set -e
|
||||
APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
|
||||
APP_NAME="`ls $APP_ROOT | head -n 1`"
|
||||
yarn smoketest-no-compile --build "$APP_ROOT/$APP_NAME"
|
||||
timeoutInMinutes: 5
|
||||
displayName: Run smoke tests (Electron)
|
||||
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
|
||||
- script: |
|
||||
set -e
|
||||
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
|
||||
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
|
||||
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
|
||||
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
|
||||
VSCODE_HOCKEYAPP_TOKEN="$(vscode-hockeyapp-token)" \
|
||||
./build/azure-pipelines/darwin/publish.sh
|
||||
displayName: Publish
|
- script: |
    set -e
    APP_ROOT=$(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
    APP_NAME="`ls $APP_ROOT | head -n 1`"
    VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-darwin" \
    yarn smoketest-no-compile --build "$APP_ROOT/$APP_NAME" --remote
  timeoutInMinutes: 5
  displayName: Run smoke tests (Remote)
  condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))

- script: |
    set -e
    VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-darwin" \
    yarn smoketest-no-compile --web --headless
  timeoutInMinutes: 5
  displayName: Run smoke tests (Browser)
  condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))

- task: PublishPipelineArtifact@0
  inputs:
    artifactName: crash-dump-macos-$(VSCODE_ARCH)
    targetPath: .build/crashes
  displayName: "Publish Crash Reports"
  continueOnError: true
  condition: failed()

- task: PublishTestResults@2
  displayName: Publish Tests Results
  inputs:
    testResultsFiles: "*-results.xml"
    searchFolder: "$(Build.ArtifactStagingDirectory)/test-results"
  condition: and(succeededOrFailed(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))

- script: |
    set -e
    pushd $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH) && zip -r -X -y $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip * && popd
  displayName: Archive build
  condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

- script: |
    set -e
    VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
    AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
    AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
    AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
    VSCODE_ARCH="$(VSCODE_ARCH)" ./build/azure-pipelines/darwin/publish-server.sh
  displayName: Publish Servers
  condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))

- publish: $(Agent.BuildDirectory)/VSCode-darwin-$(VSCODE_ARCH).zip
  artifact: vscode-darwin-$(VSCODE_ARCH)
  displayName: Publish client archive
  condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

- publish: $(Agent.BuildDirectory)/vscode-server-darwin.zip
  artifact: vscode-server-darwin-$(VSCODE_ARCH)
  displayName: Publish server archive
  condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))

- publish: $(Agent.BuildDirectory)/vscode-server-darwin-web.zip
  artifact: vscode-server-darwin-$(VSCODE_ARCH)-web
  displayName: Publish web server archive
  condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))

- script: |
    AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
    VSCODE_ARCH="$(VSCODE_ARCH)" \
    yarn gulp upload-vscode-configuration
  displayName: Upload configuration (for Bing settings search)
  condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
  continueOnError: true

- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
  displayName: 'Component Detection'
  continueOnError: true
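To reproduce the smoke-test steps above outside the pipeline, here is a minimal sketch, assuming a locally built `VSCode-darwin-x64` client and a `vscode-reh-darwin` server build sitting next to the repository checkout (both paths are illustrative, not taken from this diff):

```bash
#!/usr/bin/env bash
set -e
# Resolve the single .app bundle in the build output, mirroring the pipeline's
# APP_ROOT / APP_NAME resolution, then hand it to the smoke-test runner.
APP_ROOT="$PWD/../VSCode-darwin-x64"        # assumed local build output folder
APP_NAME="$(ls "$APP_ROOT" | head -n 1)"    # the produced .app bundle
VSCODE_REMOTE_SERVER_PATH="$PWD/../vscode-reh-darwin" \
  yarn smoketest-no-compile --build "$APP_ROOT/$APP_NAME" --remote
```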
@@ -1,14 +0,0 @@
#!/usr/bin/env bash
set -e

if [ "$VSCODE_ARCH" == "x64" ]; then
  # package Remote Extension Host
  pushd .. && mv vscode-reh-darwin vscode-server-darwin && zip -Xry vscode-server-darwin.zip vscode-server-darwin && popd

  # publish Remote Extension Host
  node build/azure-pipelines/common/createAsset.js \
    server-darwin \
    archive-unsigned \
    "vscode-server-darwin.zip" \
    ../vscode-server-darwin.zip
fi
27  build/azure-pipelines/darwin/publish.sh  Executable file
@@ -0,0 +1,27 @@
#!/usr/bin/env bash
set -e

# publish the build
node build/azure-pipelines/common/createAsset.js \
  darwin \
  archive \
  "VSCode-darwin-$VSCODE_QUALITY.zip" \
  ../VSCode-darwin.zip

# package Remote Extension Host
pushd .. && mv vscode-reh-darwin vscode-server-darwin && zip -Xry vscode-server-darwin.zip vscode-server-darwin && popd

# publish Remote Extension Host
node build/azure-pipelines/common/createAsset.js \
  server-darwin \
  archive-unsigned \
  "vscode-server-darwin.zip" \
  ../vscode-server-darwin.zip

# publish hockeyapp symbols
# node build/azure-pipelines/common/symbols.js "$VSCODE_MIXIN_PASSWORD" "$VSCODE_HOCKEYAPP_TOKEN" x64 "$VSCODE_HOCKEYAPP_ID_MACOS"
# Skip hockey app because build failure.
# https://github.com/microsoft/vscode/issues/90491

# upload configuration
yarn gulp upload-vscode-configuration
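Both the 14-line script deleted just above and the new `publish.sh` call `createAsset.js` with the same four positional arguments. From the call sites they appear to be platform, asset type, blob name, and local file path; the helper itself is not part of this diff, so that reading is an assumption. Annotated for reference:

```bash
# Positional arguments (inferred from the call sites, not from the helper itself):
#   1. platform    e.g. darwin, server-darwin
#   2. asset type  e.g. archive, archive-unsigned
#   3. blob name   the name the asset is published under
#   4. local path  the file on disk to upload
node build/azure-pipelines/common/createAsset.js \
  server-darwin \
  archive-unsigned \
  "vscode-server-darwin.zip" \
  ../vscode-server-darwin.zip
```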
@@ -1,82 +0,0 @@
steps:
- task: InstallAppleCertificate@2
  displayName: 'Install developer certificate'
  inputs:
    certSecureFile: 'osx_signing_key.p12'
  condition: eq(variables['signed'], true)

- task: DownloadBuildArtifacts@0
  displayName: 'Download Build Artifacts'
  inputs:
    downloadType: specific
    itemPattern: 'drop/darwin/archive/azuredatastudio-darwin-unsigned.zip'
    downloadPath: '$(Build.SourcesDirectory)/.build/'

- script: |
    pushd $(Build.SourcesDirectory)/.build/drop/darwin/archive
    mv azuredatastudio-darwin-unsigned.zip azuredatastudio-darwin.zip
  displayName: 'Rename the file'

- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
  displayName: 'ESRP CodeSigning'
  inputs:
    ConnectedServiceName: 'Code Signing'
    FolderPath: '$(Build.SourcesDirectory)/.build/drop/darwin/archive'
    Pattern: 'azuredatastudio-darwin.zip'
    signConfigType: inlineSignParams
    inlineOperation: |
      [
        {
          "keyCode": "CP-401337-Apple",
          "operationCode": "MacAppDeveloperSign",
          "parameters": {
            "Hardening": "Enable"
          },
          "toolName": "sign",
          "toolVersion": "1.0"
        }
      ]
    SessionTimeout: 90
  condition: and(succeeded(), eq(variables['signed'], true))

- script: |
    zip -d $(Build.SourcesDirectory)/.build/drop/darwin/archive/azuredatastudio-darwin.zip "*.pkg"
  displayName: Clean Archive
  condition: and(succeeded(), eq(variables['signed'], true))

- task: EsrpCodeSigning@1
  displayName: 'ESRP Notarization'
  inputs:
    ConnectedServiceName: 'Code Signing'
    FolderPath: '$(Build.SourcesDirectory)/.build/drop/darwin/archive'
    Pattern: 'azuredatastudio-darwin.zip'
    signConfigType: inlineSignParams
    inlineOperation: |
      [
        {
          "KeyCode": "CP-401337-Apple",
          "OperationCode": "MacAppNotarize",
          "Parameters": {
            "BundleId": "com.microsoft.azuredatastudio-$(VSCODE_QUALITY)"
          },
          "ToolName": "sign",
          "ToolVersion": "1.0"
        }
      ]
    SessionTimeout: 120
  condition: and(succeeded(), eq(variables['signed'], true))

- task: CopyFiles@2
  displayName: 'Copy Files to: $(Build.ArtifactStagingDirectory)/darwin/archive'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/.build/drop/darwin/archive'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/darwin/archive'

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  condition: always()

- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
  displayName: 'Component Detection'
  inputs:
    failOnAlert: true
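The deleted signing template above follows the usual macOS release sequence: ESRP `MacAppDeveloperSign` with the hardened runtime enabled, `zip -d` to strip stray `*.pkg` entries, then `MacAppNotarize` against the quality-specific bundle id. Once a signed and notarized zip has been unpacked locally, the result can be sanity-checked with Apple's own tooling; a sketch, with the `.app` name being illustrative:

```bash
# Verify the code signature (including nested binaries) and the Gatekeeper assessment
codesign --verify --deep --strict --verbose=2 "Azure Data Studio.app"
spctl --assess --type execute --verbose "Azure Data Studio.app"
```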
@@ -5,23 +5,33 @@ steps:
    certSecureFile: 'osx_signing_key.p12'
  condition: eq(variables['signed'], true)

- task: DownloadPipelineArtifact@2
- script: |
    mkdir -p .build
    echo -n $BUILD_SOURCEVERSION > .build/commit
    echo -n $VSCODE_QUALITY > .build/quality
  displayName: Prepare cache flag

- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
  inputs:
    artifact: Compilation
  displayName: Download compilation output
    keyfile: 'build/.cachesalt, .build/commit, .build/quality'
    targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
    vstsFeed: 'BuildCache'
    platformIndependent: true
    alias: 'Compilation'

- script: |
    set -e
    tar -xzf $(Pipeline.Workspace)/compilation.tar.gz
  displayName: Extract compilation output
    exit 1
  displayName: Check RestoreCache
  condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))

- task: NodeTool@0
  inputs:
    versionSpec: "12.13.0"
    versionSpec: '10.15.3'

- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3
  inputs:
    versionSpec: "1.x"
    versionSpec: '1.x'

- task: AzureKeyVault@1
  displayName: 'Azure Key Vault: Get Secrets'
@@ -39,7 +49,7 @@ steps:
    password $(github-distro-mixin-password)
    EOF

    git config user.email "sqltools@service.microsoft.com"
    git config user.email "andresse@microsoft.com"
    git config user.name "AzureDataStudio"
  displayName: Prepare tooling

@@ -50,23 +60,11 @@ steps:
    git merge $(node -p "require('./package.json').distro")
  displayName: Merge distro

- script: |
    mkdir -p .build
    node build/azure-pipelines/common/sql-computeNodeModulesCacheKey.js > .build/yarnlockhash
  displayName: Prepare yarn cache key

- task: Cache@2
  displayName: Restore Cache - Node Modules
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
  inputs:
    key: 'nodeModules | $(Agent.OS) | .build/yarnlockhash'
    path: .build/node_modules_cache
    cacheHitVar: NODE_MODULES_RESTORED

- script: |
    set -e
    tar -xzf .build/node_modules_cache/cache.tgz
  condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
  displayName: Extract node_modules archive
    keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
    targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
    vstsFeed: 'BuildCache'

- script: |
    set -e
@@ -74,21 +72,20 @@
  displayName: Install dependencies
  env:
    GITHUB_TOKEN: $(github-distro-mixin-password)
  condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
  condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))

- script: |
    set -e
    node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
    mkdir -p .build/node_modules_cache
    tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
  condition: and(succeeded(), ne(variables.NODE_MODULES_RESTORED, 'true'))
  displayName: Create node_modules archive
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
  inputs:
    keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock, !samples/**/yarn.lock'
    targetfolder: '**/node_modules, !**/node_modules/**/node_modules, !samples/**/node_modules'
    vstsFeed: 'BuildCache'
  condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))

- script: |
    set -e
    yarn postinstall
  displayName: Run postinstall scripts
  condition: and(succeeded(), eq(variables.NODE_MODULES_RESTORED, 'true'))
  condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))

- script: |
    set -e
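The caching hunks above reduce to one flow: hash the yarn lockfiles into a key, try to restore a `node_modules` archive under that key, and only run a full install (then re-archive) on a miss. A local approximation of that flow, assuming `yarn` is the install command and that the two helper scripts behave as their names suggest:

```bash
set -e
mkdir -p .build
# Key the cache on the lockfile contents
node build/azure-pipelines/common/sql-computeNodeModulesCacheKey.js > .build/yarnlockhash

if [ -f .build/node_modules_cache/cache.tgz ]; then
  # Cache hit: unpack the previously archived node_modules trees, then rerun lifecycle scripts
  tar -xzf .build/node_modules_cache/cache.tgz
  yarn postinstall
else
  # Cache miss: install, then archive exactly the module folders the helper lists
  yarn                                   # assumed install command; not shown in this hunk
  node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
  mkdir -p .build/node_modules_cache
  tar -czf .build/node_modules_cache/cache.tgz --files-from .build/node_modules_list.txt
fi
```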
@@ -98,56 +95,37 @@ steps:
- script: |
    set -e
    yarn gulp package-rebuild-extensions
    yarn gulp vscode-darwin-x64-min-ci
    yarn gulp vscode-darwin-min-ci
    yarn gulp vscode-reh-darwin-min-ci
    yarn gulp vscode-reh-web-darwin-min-ci
  displayName: Build
  env:
    VSCODE_MIXIN_PASSWORD: $(github-distro-mixin-password)

- script: |
    set -e
    ./scripts/test.sh --build --coverage --reporter mocha-junit-reporter --tfs "Unit Tests"
    ./scripts/test.sh --build --coverage --reporter mocha-junit-reporter
  displayName: Run unit tests
  condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))

- script: |
    # Figure out the full absolute path of the product we just built
    # including the remote server and configure the integration tests
    # to run with these builds instead of running out of sources.
    set -e
    APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin-x64
    APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin
    APP_NAME="`ls $APP_ROOT | head -n 1`"
    INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
    VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-darwin" \
    ./scripts/test-integration.sh --build --tfs "Integration Tests"
  displayName: Run integration tests (Electron)
  condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))

- script: |
    set -e
    yarn gulp compile-extensions
  displayName: Compile Extensions

- script: |
    set -e
    APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin-x64
    APP_NAME="`ls $APP_ROOT | head -n 1`"
    yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots" --log "$(build.artifactstagingdirectory)/logs/darwin/smoke.log" --extensionsDir "$(build.sourcesdirectory)/extensions"
    yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots"
  displayName: Run smoke tests (Electron)
  continueOnError: true
  condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))

# - script: |
#     set -e
#     node ./node_modules/playwright/install.js
#     VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-web-darwin" \
#     yarn smoketest --web --headless --screenshots "$(build.artifactstagingdirectory)/smokeshots"
#   displayName: Run smoke tests (Browser)
#   continueOnError: true
#   condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))

- script: |
    set -e
    pushd ../azuredatastudio-darwin-x64
    pushd ../azuredatastudio-darwin
    ls

    echo "Cleaning the application"
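The integration-test step in the hunk above points the test scripts at the freshly built client and server instead of running out of sources. To do the same against a local build, a sketch (the build output folder name differs between the two sides of this diff, `azuredatastudio-darwin-x64` versus `azuredatastudio-darwin`, so use whichever your gulp target produced):

```bash
set -e
APP_ROOT="$PWD/../azuredatastudio-darwin-x64"   # or ../azuredatastudio-darwin
APP_NAME="$(ls "$APP_ROOT" | head -n 1)"
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME/Contents/MacOS/Electron" \
VSCODE_REMOTE_SERVER_PATH="$PWD/../azuredatastudio-reh-darwin" \
  ./scripts/test-integration.sh --build --tfs "Integration Tests"
```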
@@ -177,19 +155,58 @@ steps:
- script: |
    set -e
    mkdir -p .build/darwin/archive
    pushd ../azuredatastudio-darwin-x64
    pushd ../azuredatastudio-darwin
    ditto -c -k --keepParent *.app $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin.zip
    popd
  displayName: 'Archive (no signing)'
  condition: and(succeeded(), eq(variables['signed'], false))
  displayName: 'Archive'

- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
  displayName: 'ESRP CodeSigning'
  inputs:
    ConnectedServiceName: 'Code Signing'
    FolderPath: '$(Build.SourcesDirectory)/.build/darwin/archive'
    Pattern: 'azuredatastudio-darwin.zip'
    signConfigType: inlineSignParams
    inlineOperation: |
      [
        {
          "keyCode": "CP-401337-Apple",
          "operationCode": "MacAppDeveloperSign",
          "parameters": {
            "Hardening": "Enable"
          },
          "toolName": "sign",
          "toolVersion": "1.0"
        }
      ]
    SessionTimeout: 90
  condition: and(succeeded(), eq(variables['signed'], true))

- script: |
    set -e
    mkdir -p .build/darwin/archive
    pushd ../azuredatastudio-darwin-x64
    ditto -c -k --keepParent *.app $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin-unsigned.zip
    popd
  displayName: 'Archive'
    zip -d $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin.zip "*.pkg"
  displayName: Clean Archive
  condition: and(succeeded(), eq(variables['signed'], true))

- task: EsrpCodeSigning@1
  displayName: 'ESRP Notarization'
  inputs:
    ConnectedServiceName: 'Code Signing'
    FolderPath: '$(Build.SourcesDirectory)/.build/darwin/archive'
    Pattern: 'azuredatastudio-darwin.zip'
    signConfigType: inlineSignParams
    inlineOperation: |
      [
        {
          "KeyCode": "CP-401337-Apple",
          "OperationCode": "MacAppNotarize",
          "Parameters": {
            "BundleId": "com.microsoft.azuredatastudio-$(VSCODE_QUALITY)"
          },
          "ToolName": "sign",
          "ToolVersion": "1.0"
        }
      ]
    SessionTimeout: 120
  condition: and(succeeded(), eq(variables['signed'], true))

- script: |
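Both archive variants in the hunk above use `ditto -c -k --keepParent`, the conventional way to zip a `.app` bundle on macOS: `-c -k` creates a PKZip archive and `--keepParent` keeps the bundle folder itself as the top-level entry. The later `zip -d ... "*.pkg"` removes any installer packages from the signed archive before notarization. The same archive can be produced by hand; a sketch with an illustrative output path:

```bash
set -e
cd ../azuredatastudio-darwin-x64   # assumed build output folder, as in the unsigned path above
# -c create an archive, -k use zip format, --keepParent embed the .app folder as the top-level entry
ditto -c -k --keepParent *.app /tmp/azuredatastudio-darwin.zip
```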
@@ -207,7 +224,7 @@ steps:
    testResultsFiles: 'test-results.xml'
    searchFolder: '$(Build.SourcesDirectory)'
  continueOnError: true
  condition: and(succeededOrFailed(), eq(variables['RUN_TESTS'], 'true'))
  condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))

- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage from $(Build.SourcesDirectory)/.build/coverage/cobertura-coverage.xml'
@@ -1,45 +1,45 @@
pool:
  vmImage: 'Ubuntu-18.04'
  vmImage: 'Ubuntu-16.04'

trigger:
  branches:
    include: ["main", "release/*"]
    include: ['master', 'release/*']
pr:
  branches:
    include: ["main", "release/*"]
    include: ['master', 'release/*']

steps:
- task: NodeTool@0
  inputs:
    versionSpec: "14.x"
- task: NodeTool@0
  inputs:
    versionSpec: "12.13.0"

- task: AzureKeyVault@1
  displayName: "Azure Key Vault: Get Secrets"
  inputs:
    azureSubscription: "vscode-builds-subscription"
    KeyVaultName: vscode
- task: AzureKeyVault@1
  displayName: 'Azure Key Vault: Get Secrets'
  inputs:
    azureSubscription: 'azuredatastudio-adointegration'
    KeyVaultName: ado-secrets

- script: |
    set -e
- script: |
    set -e

    cat << EOF > ~/.netrc
    machine github.com
    login vscode
    password $(github-distro-mixin-password)
    EOF
    cat << EOF > ~/.netrc
    machine github.com
    login azuredatastudio
    password $(github-distro-mixin-password)
    EOF

    git config user.email "vscode@microsoft.com"
    git config user.name "VSCode"
    git config user.email "andresse@microsoft.com"
    git config user.name "AzureDataStudio"

    git remote add distro "https://github.com/$VSCODE_MIXIN_REPO.git"
    git fetch distro
    git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
    git fetch distro

    # Push main branch into oss/main
    git push distro origin/main:refs/heads/oss/main
    # Push master branch into oss/master
    git push distro origin/master:refs/heads/oss/master

    # Push every release branch into oss/release
    git for-each-ref --format="%(refname:short)" refs/remotes/origin/release/* | sed 's/^origin\/\(.*\)$/\0:refs\/heads\/oss\/\1/' | xargs git push distro
    # Push every release branch into oss/release
    git for-each-ref --format="%(refname:short)" refs/remotes/origin/release/* | sed 's/^origin\/\(.*\)$/\0:refs\/heads\/oss\/\1/' | xargs git push distro

    git merge $(node -p "require('./package.json').distro")
    git merge $(node -p "require('./package.json').distro")

  displayName: Sync & Merge Distro
  displayName: Sync & Merge Distro
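In both versions of the sync script above, the `git for-each-ref | sed | xargs git push` line turns every `origin/release/*` ref into a refspec of the form `origin/release/<x>:refs/heads/oss/release/<x>` and pushes them to the distro remote in one call. The refspecs can be previewed without pushing; a sketch assuming the same remote layout:

```bash
# Print the refspecs the pipeline would push, e.g.
#   origin/release/1.30:refs/heads/oss/release/1.30
git for-each-ref --format="%(refname:short)" refs/remotes/origin/release/* \
  | sed 's/^origin\/\(.*\)$/\0:refs\/heads\/oss\/\1/'
```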
@@ -1,14 +1,10 @@
#Download base image ubuntu 21.04
FROM ubuntu:21.04
ENV TZ=America/Los_Angeles
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
#Download base image ubuntu 16.04
FROM ubuntu:16.04

# Update Software repository
RUN apt-get update && apt-get upgrade -y
RUN apt-get update

RUN apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 \
	libkrb5-dev git apt-transport-https ca-certificates curl gnupg-agent software-properties-common \
	libnss3 libasound2 make gcc libx11-dev fakeroot rpm libgconf-2-4 libunwind8 g++ python3-dev python3-pip
RUN apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus libgtk-3-0

ADD ./ /opt/ads-server
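A minimal sketch of how this Dockerfile might be exercised locally; the image tag, Dockerfile path, and shell command are illustrative and not part of this diff:

```bash
# Build from the repository root (ADD ./ /opt/ads-server copies the whole tree into the image)
docker build -t ads-server-build -f <path-to-this-Dockerfile> .
# Open a shell in the image to inspect the copied sources under /opt/ads-server
docker run --rm -it ads-server-build bash
```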
Some files were not shown because too many files have changed in this diff.