Compare commits

..

3 Commits

Author SHA1 Message Date
Karl Burtram
088cac030f Merge branch 'master' into release/0.30 2018-06-18 13:07:21 -07:00
Karl Burtram
bd3c293f94 Merge branch 'master' into release/0.30 2018-06-18 10:28:15 -07:00
Karl Burtram
17db0b7d09 Revert SlickGrid to 2.3.16 to workaround Edit Data TAB issue 2018-06-12 16:37:42 -07:00
25901 changed files with 776550 additions and 2773066 deletions

View File

@@ -1,25 +0,0 @@
{
"tool": "Credential Scanner",
"suppressions": [
{
"file": "src\\vs\\base\\test\\common\\uri.test.ts",
"_justification": "External code"
},
{
"file": "build\\actions\\AutoLabel\\dist\\index.js",
"_justification": "False positive from webpacked code"
},
{
"file": "build\\actions\\AutoMerge\\dist\\index.js",
"_justification": "False positive from webpacked code"
},
{
"file": ".devcontainer\\devcontainer.json",
"_justification": "Local development environment - not used in production"
},
{
"file": "extensions\\asde-deployment\\notebooks\\edge\\deploy-sql-edge-remote.ipynb",
"_justification": "Deployment Notebook - usernames/passwords are entered by user"
}
]
}

View File

@@ -1,105 +0,0 @@
# Code - OSS Development Container
[![Open in Remote - Containers](https://img.shields.io/static/v1?label=Remote%20-%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/microsoft/vscode)
This repository includes configuration for a development container for working with Code - OSS in a local container or using [GitHub Codespaces](https://github.com/features/codespaces).
> **Tip:** The default VNC password is `vscode`. The VNC server runs on port `5901` and a web client is available on port `6080`.
## Quick start - local
If you already have VS Code and Docker installed, you can click the badge above or [here](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/microsoft/vscode) to get started. Clicking these links will cause VS Code to automatically install the Remote - Containers extension if needed, clone the source code into a container volume, and spin up a dev container for use.
1. Install Docker Desktop or Docker for Linux on your local machine. (See [docs](https://aka.ms/vscode-remote/containers/getting-started) for additional details.)
2. **Important**: Docker needs at least **4 Cores and 8 GB of RAM** to run a full build. If you are on macOS, or are using the old Hyper-V engine for Windows, update these values for Docker Desktop by right-clicking on the Docker status bar item and going to **Preferences/Settings > Resources > Advanced**.
> **Note:** The [Resource Monitor](https://marketplace.visualstudio.com/items?itemName=mutantdino.resourcemonitor) extension is included in the container so you can keep an eye on CPU/Memory in the status bar.
3. Install [Visual Studio Code Stable](https://code.visualstudio.com/) or [Insiders](https://code.visualstudio.com/insiders/) and the [Remote - Containers](https://aka.ms/vscode-remote/download/containers) extension.
![Image of Remote - Containers extension](https://microsoft.github.io/vscode-remote-release/images/remote-containers-extn.png)
> **Note:** The Remote - Containers extension requires the Visual Studio Code distribution of Code - OSS. See the [FAQ](https://aka.ms/vscode-remote/faq/license) for details.
4. Press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> or <kbd>F1</kbd> and select **Remote-Containers: Clone Repository in Container Volume...**.
> **Tip:** While you can use your local source tree instead, operations like `yarn install` can be slow on macOS or when using the Hyper-V engine on Windows. We recommend the "clone repository in container" approach instead, since it uses a named volume rather than the local filesystem.
5. Type `https://github.com/microsoft/vscode` (or a branch or PR URL) in the input box and press <kbd>Enter</kbd>.
6. After the container is running, open a web browser and go to [http://localhost:6080](http://localhost:6080), or use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
Anything you start in VS Code, or the integrated terminal, will appear here.
Next: **[Try it out!](#try-it)**
## Quick start - GitHub Codespaces
1. From the [microsoft/vscode GitHub repository](https://github.com/microsoft/vscode), click on the **Code** dropdown, select **Open with Codespaces**, and then click on **New codespace**. If prompted, select the **Standard** machine size (which is also the default).
> **Note:** You will not see these options within GitHub if you are not in the Codespaces beta.
2. After the codespace is up and running in your browser, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> or <kbd>F1</kbd> and select **Ports: Focus on Ports View**.
3. You should see **VNC web client (6080)** in the list of ports. Select the line and click on the globe icon to open it in a browser tab.
> **Tip:** If you do not see the port, <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> or <kbd>F1</kbd>, select **Forward a Port** and enter port `6080`.
4. In the new tab, you should see noVNC. Click **Connect** and enter `vscode` as the password.
Anything you start in VS Code, or the integrated terminal, will appear here.
Next: **[Try it out!](#try-it)**
### Using VS Code with GitHub Codespaces
You may see improved VNC responsiveness when accessing a codespace from the VS Code client, since you can use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Here's how to do it.
1. Install [Visual Studio Code Stable](https://code.visualstudio.com/) or [Insiders](https://code.visualstudio.com/insiders/) and the [GitHub Codespaces extension](https://marketplace.visualstudio.com/items?itemName=GitHub.codespaces).
> **Note:** The GitHub Codespaces extension requires the Visual Studio Code distribution of Code - OSS.
2. After VS Code is up and running, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> or <kbd>F1</kbd>, choose **Codespaces: Create New Codespace**, and use the following settings:
- `microsoft/vscode` for the repository.
- Select any branch (e.g. **main**) - you can select a different one later.
- Choose **Standard** (4-core, 8GB) as the size.
3. After you have connected to the codespace, you can use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
> **Tip:** You may also need to change your VNC client's **Picture Quality** setting to **High** to get a full color desktop.
4. Anything you start in VS Code, or the integrated terminal, will appear here.
Next: **[Try it out!](#try-it)**
## Try it!
This container uses the [Fluxbox](http://fluxbox.org/) window manager to keep things lean. **Right-click on the desktop** to see menu options. It works with GNOME and GTK applications, so other tools can be installed if needed.
> **Note:** You can also set the resolution from the command line by typing `set-resolution`.
To start working with Code - OSS, follow these steps:
1. In your local VS Code client, open a terminal (<kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>\`</kbd>) and type the following commands:
```bash
yarn install
bash scripts/code.sh
```
2. After the build is complete, open a web browser or a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to the desktop environment as described in the quick start and enter `vscode` as the password.
3. You should now see Code - OSS!
Next, let's try debugging.
1. Shut down Code - OSS by clicking the box in the upper right corner of the Code - OSS window through your browser or VNC viewer.
2. Go to your local VS Code client, and use the **Run / Debug** view to launch the **VS Code** configuration. (Typically the default, so you can likely just press <kbd>F5</kbd>).
> **Note:** If launching times out, you can increase the value of `timeout` in the "VS Code", "Attach Main Process", "Attach Extension Host", and "Attach to Shared Process" configurations in [launch.json](../.vscode/launch.json). However, running `scripts/code.sh` first will set up Electron which will usually solve timeout issues.
3. After a bit, Code - OSS will appear with the debugger attached!
Enjoy!

View File

@@ -1,15 +0,0 @@
#!/usr/bin/env bash
# This file establishes a baseline for the repository before any of the steps in "prepare.sh"
# are run. It's just a find command that filters out a few things we don't need to watch.
set -e
SOURCE_FOLDER="${1:-"."}"
CACHE_FOLDER="${2:-"$HOME/.devcontainer-cache"}"
cd "${SOURCE_FOLDER}"
echo "[$(date)] Generating ""before"" manifest..."
mkdir -p "${CACHE_FOLDER}"
find -L . -not -path "*/.git/*" -and -not -path "${CACHE_FOLDER}/*.manifest" -type f > "${CACHE_FOLDER}/before.manifest"
echo "[$(date)] Done!"

View File

@@ -1,28 +0,0 @@
#!/bin/bash
# This file simply wraps the docker build command to build an image that includes
# a cache.tar file with the result of "prepare.sh" inside of it. See cache.Dockerfile
# for the steps that are actually taken to do this.
set -e
SCRIPT_PATH="$(cd $(dirname "${BASH_SOURCE[0]}") && pwd)"
CONTAINER_IMAGE_REPOSITORY="$1"
BRANCH="${2:-"main"}"
if [ "${CONTAINER_IMAGE_REPOSITORY}" = "" ]; then
echo "Container repository not specified!"
exit 1
fi
TAG="branch-${BRANCH//\//-}"
echo "[$(date)] ${BRANCH} => ${TAG}"
cd "${SCRIPT_PATH}/../.."
echo "[$(date)] Starting image build and push..."
export DOCKER_BUILDKIT=1
docker buildx create --use --name vscode-dev-containers
docker run --privileged --rm tonistiigi/binfmt --install all
docker buildx build --push --platform linux/amd64,linux/arm64 -t ${CONTAINER_IMAGE_REPOSITORY}:"${TAG}" -f "${SCRIPT_PATH}/cache.Dockerfile" .
echo "[$(date)] Done!"

View File

@@ -1,23 +0,0 @@
#!/usr/bin/env bash
# This file is used to archive off a copy of any differences in the source tree into another location
# in the image. Once the codespace / container is up, this will be restored into its proper location.
set -e
SOURCE_FOLDER="${1:-"."}"
CACHE_FOLDER="${2:-"$HOME/.devcontainer-cache"}"
if [ ! -d "${CACHE_FOLDER}" ]; then
echo "No cache folder found. Be sure to run before-cache.sh to set one up."
exit 1
fi
echo "[$(date)] Starting cache operation..."
cd "${SOURCE_FOLDER}"
echo "[$(date)] Determining diffs..."
find -L . -not -path "*/.git/*" -and -not -path "${CACHE_FOLDER}/*.manifest" -type f > "${CACHE_FOLDER}/after.manifest"
grep -Fxvf "${CACHE_FOLDER}/before.manifest" "${CACHE_FOLDER}/after.manifest" > "${CACHE_FOLDER}/cache.manifest"
echo "[$(date)] Archiving diffs..."
tar -cf "${CACHE_FOLDER}/cache.tar" --totals --files-from "${CACHE_FOLDER}/cache.manifest"
echo "[$(date)] Done! $(du -h "${CACHE_FOLDER}/cache.tar")"

View File

@@ -1,24 +0,0 @@
# This Dockerfile builds up from a base image to create an image containing a cache.tar file with the results of running "prepare.sh".
# Other image contents: https://github.com/microsoft/vscode-dev-containers/blob/master/repository-containers/images/github.com/microsoft/vscode/.devcontainer/base.Dockerfile
# This first stage generates cache.tar
FROM mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:dev as cache
ARG USERNAME=node
ARG CACHE_FOLDER="/home/${USERNAME}/.devcontainer-cache"
COPY --chown=${USERNAME}:${USERNAME} . /repo-source-tmp/
RUN mkdir -p ${CACHE_FOLDER} && chown ${USERNAME} ${CACHE_FOLDER} /repo-source-tmp \
&& su ${USERNAME} -c "\
cd /repo-source-tmp \
&& .devcontainer/cache/before-cache.sh . ${CACHE_FOLDER} \
&& .devcontainer/prepare.sh . ${CACHE_FOLDER} \
&& .devcontainer/cache/cache-diff.sh . ${CACHE_FOLDER}"
# This second stage starts fresh and just copies in cache.tar from the previous stage. The related
# devcontainer.json file is then set up to have postCreateCommand fire restore-diff.sh to expand it.
FROM mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:dev as dev-container
ARG USERNAME=node
ARG CACHE_FOLDER="/home/${USERNAME}/.devcontainer-cache"
RUN mkdir -p "${CACHE_FOLDER}" \
&& chown "${USERNAME}:${USERNAME}" "${CACHE_FOLDER}" \
&& su ${USERNAME} -c "git config --global codespaces-theme.hide-status 1"
COPY --from=cache ${CACHE_FOLDER}/cache.tar ${CACHE_FOLDER}/
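A single-platform sketch of what the buildx wrapper earlier in this diff does, assuming the Dockerfile is saved as `.devcontainer/cache/cache.Dockerfile` and using a placeholder repository and tag.

```bash
export DOCKER_BUILDKIT=1
# Build only the final stage, which carries just cache.tar on top of the base image.
docker build \
  --target dev-container \
  -t myregistry.azurecr.io/ads-dev-cache:branch-main \
  -f .devcontainer/cache/cache.Dockerfile .
```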

View File

@@ -1,29 +0,0 @@
#!/usr/bin/env bash
# This file expands the cache.tar file in the image that contains the results of "prepare.sh"
# on top of the source tree. It runs as a postCreateCommand, which fires after the container/codespace
# is already up, at the point where you would typically run a command like "yarn install".
set -e
SOURCE_FOLDER="$(cd "${1:-"."}" && pwd)"
CACHE_FOLDER="${2:-"$HOME/.devcontainer-cache"}"
if [ ! -d "${CACHE_FOLDER}" ]; then
echo "No cache folder found."
exit 0
fi
echo "[$(date)] Expanding $(du -h "${CACHE_FOLDER}/cache.tar") file to ${SOURCE_FOLDER}..."
cd "${SOURCE_FOLDER}"
# Ensure user/group is correct if the UID/GID was changed for some reason
echo "+1000 +$(id -u)" > "${CACHE_FOLDER}/cache-owner-map"
echo "+1000 +$(id -g)" > "${CACHE_FOLDER}/cache-group-map"
# Untar to workspace folder, preserving permissions and order, but mapping GID/UID if required
tar --owner-map="${CACHE_FOLDER}/cache-owner-map" --group-map="${CACHE_FOLDER}/cache-group-map" -xpsf "${CACHE_FOLDER}/cache.tar"
rm -rf "${CACHE_FOLDER}"
echo "[$(date)] Done!"
# Change ownership of chrome-sandbox
sudo chown root .build/electron/chrome-sandbox
sudo chmod 4755 .build/electron/chrome-sandbox
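A manual invocation sketch; in practice this runs automatically as the `postCreateCommand` declared in the devcontainer.json below, and the arguments shown are just the script's defaults.

```bash
# Expand cache.tar over the workspace, then fix up chrome-sandbox ownership.
bash .devcontainer/cache/restore-diff.sh "$PWD" "$HOME/.devcontainer-cache"
```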

View File

@@ -1,40 +0,0 @@
{
"name": "Code - OSS",
// Image contents: https://github.com/microsoft/vscode-dev-containers/blob/master/repository-containers/images/github.com/microsoft/vscode/.devcontainer/base.Dockerfile
"image": "mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:branch-main",
"overrideCommand": false,
"runArgs": [ "--init", "--security-opt", "seccomp=unconfined", "--shm-size=1g"],
"settings": {
"resmon.show.battery": false,
"resmon.show.cpufreq": false
},
// noVNC, VNC
"forwardPorts": [6080, 5901],
"portsAttributes": {
"6080": {
"label": "VNC web client (noVNC)",
"onAutoForward": "silent"
},
"5901": {
"label": "VNC TCP port",
"onAutoForward": "silent"
}
},
"extensions": [
"dbaeumer.vscode-eslint",
"mutantdino.resourcemonitor"
],
// Optionally loads a cached yarn install for the repo
"postCreateCommand": ".devcontainer/cache/restore-diff.sh",
"remoteUser": "node",
"hostRequirements": {
"memory": "8gb"
}
}

View File

@@ -1,9 +0,0 @@
#!/usr/bin/env bash
# This file contains the steps that should be run when building a "cache" image with contents that should be
# layered directly **on top of the source tree** once a dev container is created. This avoids having to run
# long-running commands like "yarn install" from the ground up. Developers can (and should) still run these commands
# after the actual dev container is created, but only the differences will be processed.
yarn install
yarn electron

View File

@@ -1,4 +1,4 @@
# EditorConfig is awesome: https://EditorConfig.org
# EditorConfig is awesome: http://EditorConfig.org
# top-most EditorConfig file
root = true
@@ -6,6 +6,7 @@ root = true
# Tab indentation
[*]
indent_style = tab
indent_size = 4
trim_trailing_whitespace = true
# The indent size used in the `package.json` file cannot be changed

View File

@@ -1,59 +0,0 @@
**/build/*/**/*.js
**/dist/**/*.js
**/extensions/**/*.d.ts
**/extensions/**/build/**
**/extensions/**/colorize-fixtures/**
**/extensions/css-language-features/server/test/pathCompletionFixtures/**
**/extensions/html-language-features/server/lib/jquery.d.ts
**/extensions/html-language-features/server/src/test/pathCompletionFixtures/**
**/extensions/markdown-language-features/media/**
**/extensions/markdown-language-features/notebook-out/**
**/extensions/markdown-math/notebook-out/**
**/extensions/notebook-renderers/renderer-out/index.js
**/extensions/simple-browser/media/index.js
**/extensions/typescript-language-features/test-workspace/**
**/vs/nls.build.js
**/vs/nls.js
**/vs/css.build.js
**/vs/css.js
**/vs/loader.js
**/dompurify/**
**/marked/**
**/semver/**
**/test/**/*.js
**/node_modules/**
**/extensions/**/out/**
**/extensions/**/build/**
/extensions/big-data-cluster/src/bigDataCluster/controller/apiGenerated.ts
/extensions/big-data-cluster/src/bigDataCluster/controller/clusterApiGenerated2.ts
**/extensions/**/colorize-fixtures/**
**/extensions/html-language-features/server/lib/jquery.d.ts
/extensions/markdown-language-features/media/**
/extensions/markdown-language-features/notebook-out/**
**/extensions/markdown-math/notebook-out/**
**/extensions/typescript-basics/test/colorize-fixtures/**
**/extensions/**/dist/**
/extensions/types
/extensions/typescript-language-features/test-workspace/**
/test/automation/out
/resources/web/code-web.js
**/extensions/vscode-api-tests/testWorkspace/**
**/extensions/vscode-api-tests/testWorkspace2/**
**/fixtures/**
**/node_modules/**
**/out-*/**/*.js
**/out-editor-*/**
**/out/**/*.js
**/src/**/dompurify.js
**/src/**/marked.js
**/src/**/semver.js
**/src/typings/**/*.d.ts
**/src/vs/*/**/*.d.ts
**/src/vs/base/test/common/filters.perf.data.js
**/src/vs/css.build.js
**/src/vs/css.js
**/src/vs/loader.js
**/src/vs/nls.build.js
**/src/vs/nls.js
**/test/unit/assert.js
**/typings/**

19
.eslintrc Normal file
View File

@@ -0,0 +1,19 @@
{
"env": {
"node": true,
"es6": true
},
"rules": {
"no-console": 0,
"no-cond-assign": 0,
"no-unused-vars": 1,
"no-extra-semi": "warn",
"semi": "warn"
},
"extends": "eslint:recommended",
"parserOptions": {
"ecmaFeatures": {
"experimentalObjectRestSpread": true
}
}
}

File diff suppressed because it is too large

View File

@@ -1,22 +0,0 @@
{
"parser": "@typescript-eslint/parser",
"parserOptions": {
"ecmaVersion": 6,
"sourceType": "module",
"project": "./tsconfig.sql.json"
},
"plugins": [
"@typescript-eslint",
"jsdoc"
],
"rules": {
"no-cond-assign": 2,
"@typescript-eslint/no-floating-promises": [
"error",
{
"ignoreVoid": true
}
],
"jsdoc/check-param-names": "error"
}
}

View File

@@ -1,25 +0,0 @@
# https://git-scm.com/docs/git-blame#Documentation/git-blame.txt---ignore-revs-fileltfilegt
# mjbvz: Fix spacing
13f4f052582bcec3d6c6c6a70d995c9dee2cac13
# mjbvz: Add script to run build with noImplicitOverride
ae1452eea678f5266ef513f22dacebb90955d6c9
# alexdima: Revert "bump version"
537ba0ef1791c090bb18bc68d727816c0451c117
# alexdima: bump version
387a0dcb82df729e316ca2518a9ed81a75482b18
# joaomoreno: add ghooks dev dependency
0dfc06e0f9de5925de792cdf9f0e6597bb25908f
# mjbvz: organize imports
494cbbd02d67e87727ec885f98d19551aa33aad1
a3cb14be7f2cceadb17adf843675b1a59537dbbd
ee1655a82ebdfd38bf8792088a6602c69f7bbd94
# jrieken: new eslint-rule
4a130c40ed876644ed8af2943809d08221375408

4
.gitattributes vendored
View File

@@ -6,6 +6,4 @@ ThirdPartyNotices.txt eol=crlf
*.bat eol=crlf
*.cmd eol=crlf
*.ps1 eol=lf
*.sh eol=lf
*.rtf -text
**/*.json linguist-language=jsonc
*.sh eol=lf

22
.github/CODEOWNERS vendored
View File

@@ -1,22 +0,0 @@
# Lines starting with '#' are comments.
# Each line is a file pattern followed by one or more owners.
# Syntax can be found here: https://docs.github.com/free-pro-team@latest/github/creating-cloning-and-archiving-repositories/about-code-owners#codeowners-syntax
/extensions/admin-tool-ext-win @Charles-Gagnon
/extensions/arc/ @Charles-Gagnon @swells @candiceye
/extensions/azcli/ @Charles-Gagnon @swells @candiceye
/extensions/azurecore/ @cssuh @cheenamalhotra
/extensions/big-data-cluster/ @Charles-Gagnon
/extensions/dacpac/ @kisantia
/extensions/notebook @azure-data-studio-notebook-devs
/extensions/query-history/ @Charles-Gagnon
/extensions/resource-deployment/ @Charles-Gagnon
/extensions/schema-compare/ @kisantia
/extensions/sql-bindings/ @vasubhog @Charles-Gagnon @lucyzhang929 @chlafreniere @MaddyDev
/extensions/sql-database-projects/ @Benjin @kisantia
/extensions/mssql/config.json @Charles-Gagnon @alanrenmsft @kburtram
/src/sql/*.d.ts @alanrenmsft @Charles-Gagnon
/src/sql/workbench/browser/modelComponents @Charles-Gagnon @alanrenmsft
/src/sql/workbench/api @Charles-Gagnon @alanrenmsft
/src/sql/**/notebook @azure-data-studio-notebook-devs

View File

@@ -1,25 +0,0 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---
<!-- ⚠️⚠️ Do Not Delete This! bug_report_template ⚠️⚠️ -->
<!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ -->
<!-- 🔎 Search existing issues to avoid creating duplicates. -->
<!-- 🧪 Test using the latest Insiders build to see if your issue has already been fixed: https://github.com/Microsoft/azuredatastudio#try-out-the-latest-insiders-build-from-main -->
<!-- 💡 Instead of creating your report here, use 'Report Issue' from the 'Help' menu in Azure Data Studio to pre-fill useful information. -->
- Azure Data Studio Version:
- OS Version:
Steps to Reproduce:
1.
2.
<!-- 🔧 Launch with `azuredatastudio --disable-extensions` to check. -->
Does this issue occur when all extensions are disabled?: Yes/No
<!-- 📣 Issues caused by an extension need to be reported directly to the extension publisher. The 'Help > Report Issue' dialog can assist with this. -->

View File

@@ -1 +0,0 @@
blank_issues_enabled: false

View File

@@ -1,20 +0,0 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: ''
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution or feature you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

View File

@@ -1,39 +0,0 @@
{
perform: true,
alwaysRequireAssignee: false,
labelsRequiringAssignee: [],
defaultLabel: 'Triage: Needed',
defaultAssignee: '',
autoAssignees: {
Area - Acquisition: [],
Area - Azure: [],
Area - Backup\Restore: [],
Area - Big Data Cluster: [ charles-gagnon ],
Area - Charting\Insights: [],
Area - Connection: [ ],
Area - DacFX: [],
Area - Dashboard: [],
Area - Data Explorer: [],
Area - Data Virtualization: [ charles-gagnon ],
Area - Edit Data: [],
Area - Extensibility: [],
Area - External Table: [],
Area - Fundamentals: [],
Area - Language Service: [ charles-gagnon ],
Area - Localization: [],
Area - Notebooks: [ chlafreniere ],
Area - Performance: [],
Area - Query Editor: [ anthonydresser ],
Area - Query History: [ charles-gagnon ],
Area - Query Plan: [],
Area - Reliability: [],
Area - Resource Deployment: [],
Area - Schema Compare: [],
Area - Shell: [],
Area - SQL Agent: [],
Area - SQL Import: [],
Area - SQL Profiler: [],
Area - SQL 2019: [],
Area - SSMS Integration: []
}
}

12
.github/commands.yml vendored
View File

@@ -1,12 +0,0 @@
{
perform: true,
commands:
[
{
type: "label",
name: "Needs Logs",
action: "comment",
comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.\n\nTo find your logs:\n\n- Open command palette (Click **View** -> **Command Palette**)\n- Run the command: **`Developer: Open Logs Folder`**\n\nThis will open the log file locally. Please include renderer.log",
},
],
}

View File

@@ -1,51 +0,0 @@
# actions for Needs Logs label
Needs Logs:
comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.
First open the Settings page, find the `Mssql: Tracing Level` setting and change that to `All` then restart ADS and repro your issue.
Next there are two types of logs to collect:
**Console Logs**
- Open Developer Tools (Help -> Toggle Developer Tools)
- Click the **Console** tab
- Click in the log area and select all text (CTRL+A)
- Save this text into a file named console.log and attach it to this issue.
- Developer Tools can be closed via Help -> Toggle Developer Tools
**Application Logs**
- Open command palette (Click **View** -> **Command Palette**)
- Run the command: **`Developer: Open Logs Folder`**
- This will open the log folder locally. Please zip up this folder and attach it to the issue."
# actions for Needs Logs - Azure label
Needs Logs - Azure:
comment: "We need more info to debug your Azure Active Directory issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.
- In the settings menu, find the setting titled `Azure: Logging Level` and select the `Verbose` option
- Run the process that produces your error
- Open command palette (Click **View** -> **Command Palette**)
- Run the command: **`Developer: Open Logs Folder`**
- Follow this path to find the Azure Accounts log file: `[default log folder]/exthost1/output_logging_[earliest timestamp]/#-Azure Acounts.log`
- Please attach the Azure-Accounts.log file to the issue."
# actions for Out of Scope label
Out of Scope:
comment: "Thank you for your feedback! This feature is currently out of scope and we do not plan to work on it in a currently planned release. We will close this issue to keep our backlog focused on requests that we are planning to work on. Please note that users can continue to vote and comment on closed issues, which we encourage as it helps us understand user interest and can provide more details about why a feature is requested."
close: true

6
.github/locker.yml vendored
View File

@@ -1,6 +0,0 @@
{
daysAfterClose: 45,
daysSinceLastUpdate: 3,
ignoredLabels: ['A11y_ADS_OctTestPass', 'A11y_ADS_Schema_Dacpac_Backup', 'A11y_AzureDataStudio', 'A11yExclusion', 'A11yMAS', 'A11yResolved: Will Not Fix', 'A11yTCS'],
perform: true
}

View File

@@ -1,7 +0,0 @@
[
"kenvanhyning",
"kburtram",
"udeeshagautam",
"qifahs",
"chlafreniere"
]

View File

@@ -1,6 +0,0 @@
{
daysUntilClose: 7,
needsMoreInfoLabel: 'needs more info',
perform: true,
closeComment: "This issue has been closed automatically because it needs more information and has not had recent activity in the last 7 days. If you have more info to help resolve the issue, leave a comment"
}

View File

@@ -1,3 +0,0 @@
# Add 'repo' label to any root file changes
Port Request:
- '**/*'

View File

@@ -1,9 +0,0 @@
<!-- Thank you for submitting a Pull Request. Please:
* Read our Pull Request guidelines:
https://github.com/Microsoft/azuredatastudio/wiki/How-to-Contribute#pull-requests.
* Associate an issue with the Pull Request.
* Ensure that the code is up-to-date with the `main` branch.
* Include a description of the proposed changes and how to test them.
-->
This PR fixes #

View File

@@ -1,5 +0,0 @@
{
perform: true,
whenCreatedByTeam: true,
comment: "Thanks for submitting this issue. Please also check if it is already covered by an existing one, like:\n${potentialDuplicates}",
}

6
.github/stale.yml vendored
View File

@@ -1,6 +0,0 @@
{
perform: true,
label: 'Stale PR',
daysSinceLastUpdate: 7,
ignoredLabels: ['Do Not Merge']
}

View File

@@ -1,2 +0,0 @@
{
}

View File

@@ -1,331 +0,0 @@
name: CI
on:
push:
branches:
- main
- release/*
pull_request:
branches:
- main
- release/*
jobs:
windows:
name: Windows
runs-on: windows-2019
timeout-minutes: 30
env:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v2
with:
node-version: 16
- uses: actions/setup-python@v2
with:
python-version: "2.x"
# {{SQL CARBON EDIT}} Skip caching for now
# - name: Compute node modules cache key
# id: nodeModulesCacheKey
# run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
# - name: Cache node_modules archive
# id: cacheNodeModules
# uses: actions/cache@v2
# with:
# path: ".build/node_modules_cache"
# key: "${{ runner.os }}-cacheNodeModulesArchive-${{ steps.nodeModulesCacheKey.outputs.value }}"
# - name: Extract node_modules archive
# if: ${{ steps.cacheNodeModules.outputs.cache-hit == 'true' }}
# run: 7z.exe x .build/node_modules_cache/cache.7z -aos
# - name: Get yarn cache directory path
# id: yarnCacheDirPath
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# run: echo "::set-output name=dir::$(yarn cache dir)"
# - name: Cache yarn directory
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# uses: actions/cache@v2
# with:
# path: ${{ steps.yarnCacheDirPath.outputs.dir }}
# key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-yarnCacheDir-
- name: Execute yarn
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }} {{SQL CARBON EDIT}} Skipping caching for now
env:
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
run: yarn --frozen-lockfile --network-timeout 180000
# - name: Create node_modules archive {{SQL CARBON EDIT}} Skip caching for now
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# run: |
# mkdir -Force .build
# node build/azure-pipelines/common/listNodeModules.js .build/node_modules_list.txt
# mkdir -Force .build/node_modules_cache
# 7z.exe a .build/node_modules_cache/cache.7z -mx3 `@.build/node_modules_list.txt
- name: Compile and Download
run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64" # {{SQL CARBON EDIT}} Remove unused options playwright-install download-builtin-extensions
- name: Run Core Unit Tests # {{SQL CARBON EDIT}} Rename to core for clarity
run: .\scripts\test.bat
# - name: Run Unit Tests (Browser) {{SQL CARBON EDIT}} disable for now
# run: yarn test-browser --browser chromium
# {{SQL CARBON EDIT}} Rename to core for clarity
# - name: Run Core Integration Tests {{SQL CARBON EDIT}} disable for now
# run: .\scripts\test-integration.bat
linux:
name: Linux
runs-on: ubuntu-latest
timeout-minutes: 30
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2.2.0
# TODO: rename azure-pipelines/linux/xvfb.init to github-actions
- name: Setup Build Environment
run: |
sudo apt-get update
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0 libgbm1 libkrb5-dev # {{SQL CARBON EDIT}} add kerberos dep
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
- uses: actions/setup-node@v2
with:
node-version: 16
# {{SQL CARBON EDIT}} Skip caching for now
# - name: Compute node modules cache key
# id: nodeModulesCacheKey
# run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
# - name: Cache node modules
# id: cacheNodeModules
# uses: actions/cache@v2
# with:
# path: "**/node_modules"
# key: ${{ runner.os }}-cacheNodeModules14-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-cacheNodeModules14-
# - name: Get yarn cache directory path
# id: yarnCacheDirPath
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# run: echo "::set-output name=dir::$(yarn cache dir)"
# - name: Cache yarn directory
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# uses: actions/cache@v2
# with:
# path: ${{ steps.yarnCacheDirPath.outputs.dir }}
# key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-yarnCacheDir-
- name: Execute yarn
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }} {{SQL CARBON EDIT}} Skip caching for now
env:
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
run: yarn --frozen-lockfile --network-timeout 180000
# Don't inline source maps so that we generate code coverage for ts files
- name: Compile and Download
run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64" # {{SQL CARBON EDIT}} Remove unused options playwright-install download-builtin-extensions
env:
SQL_NO_INLINE_SOURCEMAP: 1
- name: Run Core Unit Tests # {{SQL CARBON EDIT}} Rename to core for clarity
id: electron-unit-tests
run: DISPLAY=:10 ./scripts/test.sh --runGlob "**/sql/**/*.test.js" --coverage
- name: Run Extension Unit Tests # {{SQL CARBON EDIT}} Rename to extension for clarity
id: electron-extension-unit-tests
run: DISPLAY=:10 ./scripts/test-extensions-unit.sh
# {{SQL CARBON EDIT}} Add coveralls. We merge first to get around issue where parallel builds weren't being combined correctly
- name: Combine code coverage files
run: node test/combineCoverage
- name: Upload Code Coverage
uses: coverallsapp/github-action@v1.1.1
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
path-to-lcov: "test/coverage/lcov.info"
# - name: Run Unit Tests (Browser) {{SQL CARBON EDIT}} Skip for now
# id: browser-unit-tests
# run: DISPLAY=:10 yarn test-browser --browser chromium
# {{SQL CARBON EDIT}} Rename to core for clarity
# - name: Run Core Integration Tests {{SQL CARBON EDIT}} Skip for now
# id: electron-integration-tests
# run: DISPLAY=:10 ./scripts/test-integration.sh
darwin:
name: macOS
runs-on: macos-latest
timeout-minutes: 30
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2.2.0
- uses: actions/setup-node@v2
with:
node-version: 16
# {{SQL CARBON EDIT}} Skip caching for now
# - name: Compute node modules cache key
# id: nodeModulesCacheKey
# run: echo "::set-output name=value::$(node build/azure-pipelines/common/computeNodeModulesCacheKey.js)"
# - name: Cache node modules
# id: cacheNodeModules
# uses: actions/cache@v2
# with:
# path: "**/node_modules"
# key: ${{ runner.os }}-cacheNodeModules14-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-cacheNodeModules14-
# - name: Get yarn cache directory path
# id: yarnCacheDirPath
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# run: echo "::set-output name=dir::$(yarn cache dir)"
# - name: Cache yarn directory
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
# uses: actions/cache@v2
# with:
# path: ${{ steps.yarnCacheDirPath.outputs.dir }}
# key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
# restore-keys: ${{ runner.os }}-yarnCacheDir-
- name: Execute yarn
# if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }} {{SQL CARBON EDIT}} Skip caching for now
env:
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
run: yarn --frozen-lockfile --network-timeout 180000
- name: Compile and Download
run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64" # {{SQL CARBON EDIT}} Remove unused options playwright-install download-builtin-extensions
# This is required for keytar unittests, otherwise we hit
# https://github.com/atom/node-keytar/issues/76
- name: Create temporary keychain
run: |
security create-keychain -p pwd $RUNNER_TEMP/buildagent.keychain
security default-keychain -s $RUNNER_TEMP/buildagent.keychain
security unlock-keychain -p pwd $RUNNER_TEMP/buildagent.keychain
- name: Run Core Unit Tests # {{SQL CARBON EDIT}} Rename to core for clarity
run: DISPLAY=:10 ./scripts/test.sh
# - name: Run Unit Tests (Browser) {{SQL CARBON EDIT}} Skip for now
# run: DISPLAY=:10 yarn test-browser --browser chromium
# {{SQL CARBON EDIT}} Rename to core for clarity
# - name: Run Core Integration Tests {{SQL CARBON EDIT}} Skip for now
# run: DISPLAY=:10 ./scripts/test-integration.sh
hygiene:
name: Hygiene and Layering
runs-on: ubuntu-latest
timeout-minutes: 30
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v2
with:
node-version: 16
- name: Compute node modules cache key
id: nodeModulesCacheKey
run: echo "::set-output name=value::$(node build/azure-pipelines/common/sql-computeNodeModulesCacheKey.js)"
- name: Cache node modules
id: cacheNodeModules
uses: actions/cache@v2
with:
path: "**/node_modules"
key: ${{ runner.os }}-cacheNodeModules14-${{ steps.nodeModulesCacheKey.outputs.value }}
restore-keys: ${{ runner.os }}-cacheNodeModules14-
- name: Get yarn cache directory path
id: yarnCacheDirPath
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
run: echo "::set-output name=dir::$(yarn cache dir)"
- name: Cache yarn directory
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
uses: actions/cache@v2
with:
path: ${{ steps.yarnCacheDirPath.outputs.dir }}
key: ${{ runner.os }}-yarnCacheDir-${{ steps.nodeModulesCacheKey.outputs.value }}
restore-keys: ${{ runner.os }}-yarnCacheDir-
- name: Setup Build Environment # {{SQL CARBON EDIT}} Add step to install required packages if we need to run yarn
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
run: |
sudo apt-get update
sudo apt-get install -y libkrb5-dev
- name: Execute yarn
if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
env:
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD: 1
ELECTRON_SKIP_BINARY_DOWNLOAD: 1
run: yarn --frozen-lockfile --network-timeout 180000
- name: Run Hygiene Checks
run: yarn gulp hygiene
- name: Run Valid Layers Checks
run: yarn valid-layers-check
# - name: Run Monaco Editor Checks {{SQL CARBON EDIT}} Remove Monaco checks
# run: yarn monaco-compile-check
- name: Compile /build/
run: yarn --cwd build compile
- name: Run eslint
run: yarn eslint
# {{SQL CARBON EDIT}} Don't need this
# - name: Run Monaco Editor Checks
# run: yarn monaco-compile-check
# {{SQL CARBON EDIT}} Don't need this
# - name: Run vscode-dts Compile Checks
# run: yarn vscode-dts-compile-check
- name: Run Trusted Types Checks
run: yarn tsec-compile-check
# - name: Editor Distro & ESM Bundle {{SQL CARBON EDIT}} Remove Monaco checks
# run: yarn gulp editor-esm-bundle
# - name: Typings validation prep {{SQL CARBON EDIT}} Remove Monaco checks
# run: |
# mkdir typings-test
# - name: Typings validation {{SQL CARBON EDIT}} Remove Monaco checks
# working-directory: ./typings-test
# run: |
# yarn init -yp
# ../node_modules/.bin/tsc --init
# echo "import '../out-monaco-editor-core';" > a.ts
# ../node_modules/.bin/tsc --noEmit
# - name: Webpack Editor {{SQL CARBON EDIT}} Remove Monaco checks
# working-directory: ./test/monaco
# run: yarn run bundle
# - name: Compile Editor Tests {{SQL CARBON EDIT}} Remove Monaco checks
# working-directory: ./test/monaco
# run: yarn run compile
# - name: Download Playwright {{SQL CARBON EDIT}} Remove Monaco checks
# run: yarn playwright-install
# - name: Run Editor Tests {{SQL CARBON EDIT}} Remove Monaco checks
# timeout-minutes: 5
# working-directory: ./test/monaco
# run: yarn test
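For reference, a rough local approximation of the Linux job's build-and-test steps, lifted directly from the workflow above; it assumes an Ubuntu machine with the listed apt packages installed and an X server already running on display :10.

```bash
export PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD=1 ELECTRON_SKIP_BINARY_DOWNLOAD=1
yarn --frozen-lockfile --network-timeout 180000
SQL_NO_INLINE_SOURCEMAP=1 yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64"
DISPLAY=:10 ./scripts/test.sh --runGlob "**/sql/**/*.test.js" --coverage   # core unit tests
DISPLAY=:10 ./scripts/test-extensions-unit.sh                              # extension unit tests
node test/combineCoverage                                                  # merge coverage files
```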

View File

@@ -1,15 +0,0 @@
name: On Label
on:
issues:
types: [labeled]
jobs:
processLabelAction:
name: Process Label Action
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Process Label Action
uses: hramos/label-actions@v1
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}

View File

@@ -1,23 +0,0 @@
name: On PR Open
on:
pull_request:
branches:
- release/**
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: 'microsoft/azuredatastudio'
ref: main
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions/build/actions
- name: Run Port Labeler
uses: ./actions/build/actions/auto-labeler
with:
label: "Port Request"

20
.gitignore vendored
View File

@@ -1,19 +1,17 @@
.DS_Store
.cache
npm-debug.log
Thumbs.db
node_modules/
.build/
extensions/**/dist/
/out*/
/extensions/**/out/
out/
out-build/
out-editor/
out-editor-esm/
out-editor-min/
out-monaco-editor-core/
out-vscode/
out-vscode-min/
build/node_modules
coverage/
test_data/
test-results/
yarn-error.log
*.vsix
vscode.lsif
vscode.db
/.profile-oss
*.orig
yarn-error.log

View File

@@ -1,6 +0,0 @@
{
"project": "src/tsconfig.json",
"source": "./package.json",
"package": "package.json",
"out": "vscode.lsif"
}

1
.nvmrc Normal file
View File

@@ -0,0 +1 @@
8.9.2

58
.travis.yml Normal file
View File

@@ -0,0 +1,58 @@
sudo: false
language: cpp
os:
- linux
- osx
cache:
directories:
- $HOME/.cache/yarn
notifications:
email: false
webhooks:
- http://vscode-probot.westus.cloudapp.azure.com:3450/travis/notifications
- http://vscode-test-probot.westus.cloudapp.azure.com:3450/travis/notifications
addons:
apt:
sources:
- ubuntu-toolchain-r-test
packages:
- gcc-4.9
- g++-4.9
- gcc-4.9-multilib
- g++-4.9-multilib
- zip
- libgtk2.0-0
- libx11-dev
- libxkbfile-dev
- libsecret-1-dev
before_install:
- git submodule update --init --recursive
- nvm install 8.9.1
- nvm use 8.9.1
- npm i -g yarn
# - npm config set python `which python`
- if [ $TRAVIS_OS_NAME == "linux" ]; then
export CXX="g++-4.9" CC="gcc-4.9" DISPLAY=:99.0;
sh -e /etc/init.d/xvfb start;
sleep 3;
fi
# Make npm logs less verbose
# - npm config set depth 0
# - npm config set loglevel warn
install:
- yarn
script:
- node_modules/.bin/gulp electron --silent
- node_modules/.bin/gulp compile --silent --max_old_space_size=4096
- node_modules/.bin/gulp optimize-vscode --silent --max_old_space_size=4096
- if [[ "$TRAVIS_OS_NAME" == "linux" ]]; then ./scripts/test.sh --coverage --reporter dot; else ./scripts/test.sh --reporter dot; fi
after_success:
- if [[ "$TRAVIS_OS_NAME" == "linux" ]]; then node_modules/.bin/coveralls < .build/coverage/lcov.info; fi

View File

@@ -1,61 +0,0 @@
{
"type": "array",
"items": {
"oneOf": [
{
"type": "object",
"required": [
"name",
"prependLicenseText"
],
"properties": {
"name": {
"type": "string",
"description": "The name of the dependency"
},
"fullLicenseText": {
"type": "array",
"description": "The complete license text of the dependency",
"items": {
"type": "string"
}
},
"prependLicenseText": {
"type": "array",
"description": "A piece of text to prepend to the auto-detected license text of the dependency",
"items": {
"type": "string"
}
}
}
},
{
"type": "object",
"required": [
"name",
"fullLicenseText"
],
"properties": {
"name": {
"type": "string",
"description": "The name of the dependency"
},
"fullLicenseText": {
"type": "array",
"description": "The complete license text of the dependency",
"items": {
"type": "string"
}
},
"prependLicenseText": {
"type": "array",
"description": "A piece of text to prepend to the auto-detected license text of the dependency",
"items": {
"type": "string"
}
}
}
}
]
}
}

View File

@@ -2,8 +2,8 @@
// See https://go.microsoft.com/fwlink/?LinkId=827846
// for the documentation about the extensions.json format
"recommendations": [
"eg2.tslint",
"dbaeumer.vscode-eslint",
"EditorConfig.EditorConfig",
"redhat.vscode-yaml"
"msjsdiag.debugger-for-chrome"
]
}

430
.vscode/launch.json vendored
View File

@@ -9,281 +9,70 @@
"stopOnEntry": true,
"args": [
"hygiene"
]
},
{
"type": "node",
"request": "launch",
"name": "Launch Azure Data Studio",
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.bat",
},
"runtimeArgs": [
"--no-cached-data"
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "0_ads"
}
},
{
"type": "pwa-chrome",
"request": "launch",
"name": "Launch ADS & Debug Renderer",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.bat"
},
"osx": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
},
"port": 9222,
"timeout": 30000,
"env": {
"VSCODE_EXTHOST_WILL_SEND_SOCKET": null,
"VSCODE_SKIP_PRELAUNCH": "1"
},
"cleanUp": "wholeBrowser",
"urlFilter": "*workbench.html*",
"runtimeArgs": [
"--inspect=5875",
"--no-cached-data",
],
"webRoot": "${workspaceFolder}",
"cascadeTerminateToConfigurations": [
"Attach to Extension Host"
],
"userDataDir": "${workspaceFolder}/.profile-oss",
"pauseForSourceMap": false,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"browserLaunchLocation": "workspace",
"presentation": {
"group": "1_debug",
"order": 2
}
"cwd": "${workspaceFolder}"
},
{
"type": "node",
"request": "attach",
"restart": true,
"name": "Attach to Extension Host",
"timeout": 30000,
"protocol": "inspector",
"port": 5870,
"restart": true,
"outFiles": [
"${workspaceFolder}/out/**/*.js",
"${workspaceFolder}/extensions/*/out/**/*.js"
],
"presentation": {
"group": "2_attach"
}
"${workspaceFolder}/out/**/*.js"
]
},
{
"type": "pwa-chrome",
"type": "node",
"request": "attach",
"name": "Attach to Shared Process",
"timeout": 30000,
"port": 9222,
"urlFilter": "*sharedProcess.html*",
"presentation": {
"group": "2_attach"
}
"protocol": "inspector",
"port": 5871,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
},
{
"type": "node",
"request": "attach",
"protocol": "inspector",
"name": "Attach to Search Process",
"port": 5876,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
},
{
"type": "node",
"request": "attach",
"name": "Attach to CLI Process",
"protocol": "inspector",
"port": 5874,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
},
{
"type": "node",
"request": "attach",
"name": "Attach to Main Process",
"timeout": 30000,
"protocol": "inspector",
"port": 5875,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"presentation": {
"group": "2_attach"
}
]
},
{
"type": "pwa-chrome",
"type": "chrome",
"request": "attach",
"name": "Attach to Renderer",
"browserAttachLocation": "workspace",
"port": 9222,
"timeout": 30000,
"trace": true,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"resolveSourceMapLocations": [
"${workspaceFolder}/out/**/*.js"
],
"perScriptSourcemaps": "yes",
"presentation": {
"group": "2_attach",
"order": 2
}
},
{
"type": "node",
"request": "launch",
"name": "Run Smoke Tests",
"program": "${workspaceFolder}/test/smoke/test/index.js",
"cwd": "${workspaceFolder}/test/smoke",
"env": {
"BUILD_ARTIFACTSTAGINGDIRECTORY": "${workspaceFolder}"
},
"presentation": {
"group": "3_tests",
"order": 5
}
},
{
"type": "pwa-node",
"request": "launch",
"name": "Run Core Unit Tests",
"program": "${workspaceFolder}/test/unit/electron/index.js",
"runtimeExecutable": "${workspaceFolder}/.build/electron/Azure Data Studio.app/Contents/MacOS/Electron",
"windows": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio.exe"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio"
},
"outputCapture": "std",
"args": [
"--remote-debugging-port=9222"
],
"cwd": "${workspaceFolder}",
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"cascadeTerminateToConfigurations": [
"Attach to Renderer"
],
"env": {
"MOCHA_COLORS": "true"
},
"presentation": {
"group": "3_tests",
"order": 1
}
},
{
"type": "pwa-node",
"request": "launch",
"name": "Run Core Unit Tests (Current File)",
"program": "${workspaceFolder}/test/unit/electron/index.js",
"runtimeExecutable": "${workspaceFolder}/.build/electron/Azure Data Studio.app/Contents/MacOS/Electron",
"windows": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio.exe"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio"
},
"cascadeTerminateToConfigurations": [
"Attach to Renderer"
],
"outputCapture": "std",
"args": [
"--remote-debugging-port=9222",
"--run",
"${relativeFile}"
],
"cwd": "${workspaceFolder}",
"outFiles": [
"${workspaceFolder}/out/**/*.js"
],
"env": {
"MOCHA_COLORS": "true"
},
"presentation": {
"group": "3_tests",
"order": 2
}
"name": "Attach to sqlops",
"port": 9222
},
{
"type": "chrome",
"request": "launch",
"name": "Run Extension Unit Tests",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/test-extensions-unit.bat"
},
"osx": {
"runtimeExecutable": "${workspaceFolder}/scripts/test-extensions-unit.sh"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/scripts/test-extensions-unit.sh"
},
"webRoot": "${workspaceFolder}",
"timeout": 45000,
"presentation": {
"group": "3_tests",
"order": 3
}
},
{
"type": "chrome",
"request": "launch",
"name": "Run Extension Integration Tests",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql-test-integration.bat"
},
"osx": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql-test-integration.sh"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql-test-integration.sh"
},
"webRoot": "${workspaceFolder}",
"timeout": 45000,
"presentation": {
"group": "3_tests",
"order": 4
}
},
{
"type": "node",
"request": "launch",
"name": "Launch Azure Data Studio (Web) (TBD)",
"program": "${workspaceFolder}/resources/web/code-web.js",
"presentation": {
"group": "4_web"
}
},
{
"type": "pwa-chrome",
"request": "launch",
"outFiles": [],
"perScriptSourcemaps": "yes",
"name": "Launch Azure Data Studio (Web, Chrome)",
"url": "http://localhost:8080",
"preLaunchTask": "Run web",
"presentation": {
"group": "4_web"
}
},
{
"type": "pwa-msedge",
"request": "launch",
"outFiles": [],
"perScriptSourcemaps": "yes",
"name": "Launch Azure Data Studio (Web, Edge)",
"url": "http://localhost:8080",
"pauseForSourceMap": false,
"preLaunchTask": "Run web",
"presentation": {
"group": "4_web"
}
},
{
"name": "Run Sample Resource Deployment Extension",
"type": "sqlopsExtensionHost",
"request": "launch",
"name": "Launch sqlops",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.bat"
},
@@ -293,137 +82,58 @@
"linux": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
},
"args": [
"--extensionDevelopmentPath=${workspaceRoot}/samples/sample-resource-deployment"
"urlFilter": "*index.html*",
"runtimeArgs": [
"--inspect=5875"
],
"outFiles": [
"${workspaceRoot}/samples/sample-resource-deployment/out/**/*.js"
"skipFiles": [
"**/winjs*.js"
],
"preLaunchTask": "Watch sample-resource-deployment",
"presentation": {
"group": "5_samples"
},
"timeout": 30000
"webRoot": "${workspaceFolder}",
"timeout": 15000
},
{
"name": "Run Sample Notebook Provider Extension",
"type": "sqlopsExtensionHost",
"type": "node",
"request": "launch",
"name": "Unit Tests",
"protocol": "inspector",
"program": "${workspaceFolder}/node_modules/mocha/bin/_mocha",
"runtimeExecutable": "${workspaceFolder}/.build/electron/SQL Operations Studio.app/Contents/MacOS/Electron",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.bat"
},
"osx": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
"runtimeExecutable": "${workspaceFolder}/.build/electron/sqlops.exe"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
"runtimeExecutable": "${workspaceFolder}/.build/electron/sqlops"
},
"stopOnEntry": false,
"args": [
"--extensionDevelopmentPath=${workspaceRoot}/samples/sample-notebook-provider"
"--delay",
"--timeout",
"2000"
],
"outFiles": [
"${workspaceRoot}/samples/sample-notebook-provider/out/**/*.js"
],
"preLaunchTask": "Watch sample-notebook-provider",
"presentation": {
"group": "5_samples"
"cwd": "${workspaceFolder}",
"env": {
"ELECTRON_RUN_AS_NODE": "true"
},
"timeout": 30000
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
}
],
"compounds": [
{
"name": "Launch ADS & Debug Renderer and Extension Host",
"name": "Debug sqlops Main and Renderer",
"configurations": [
"Launch ADS & Debug Renderer",
"Attach to Extension Host"
],
"presentation": {
"group": "1_debug",
"order": 1
}
"Launch sqlops",
"Attach to Main Process"
]
},
{
"name": "Launch ADS & Debug Extension Host",
"name": "Search and Renderer processes",
"configurations": [
"Launch Azure Data Studio",
"Attach to Extension Host"
],
"presentation": {
"group": "1_debug",
"order": 3
}
},
{
"name": "Launch ADS & Debug Main, Renderer and Extension Host",
"configurations": [
"Launch ADS & Debug Renderer",
"Attach to Main Process",
"Attach to Extension Host"
],
"presentation": {
"group": "1_debug",
"order": 4
}
},
{
"name": "Launch ADS & Debug All",
"stopAll": true,
"configurations": [
"Launch Azure Data Studio",
"Attach to Main Process",
"Attach to Extension Host",
"Attach to Shared Process",
"Attach to Renderer"
],
"preLaunchTask": "Ensure Prelaunch Dependencies",
"presentation": {
"group": "1_debug",
"order": 5
}
},
{
"name": "Attach to Renderer and Extension Host",
"configurations": [
"Attach to Renderer",
"Attach to Extension Host"
],
"presentation": {
"group": "2_attach",
"order": 1
}
},
{
"name": "Debug Core Unit Tests",
"configurations": [
"Attach to Renderer",
"Run Core Unit Tests"
],
"presentation": {
"group": "3_tests",
"order": 6
}
},
{
"name": "Debug Extension Unit Tests",
"configurations": [
"Attach to Extension Host",
"Run Extension Unit Tests"
],
"presentation": {
"group": "3_tests"
}
},
{
"name": "Debug Core Unit Tests (Current File)",
"configurations": [
"Attach to Renderer",
"Run Core Unit Tests (Current File)"
],
"presentation": {
"group": "3_tests",
"order": 8
}
"Launch sqlops",
"Attach to Search Process"
]
}
]
}
}

View File

@@ -1,32 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "#### Config"
},
{
"kind": 2,
"language": "github-issues",
"value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"May 2022\""
},
{
"kind": 1,
"language": "markdown",
"value": "### Finalization"
},
{
"kind": 2,
"language": "github-issues",
"value": "$repo $milestone label:api-finalization"
},
{
"kind": 1,
"language": "markdown",
"value": "### Proposals"
},
{
"kind": 2,
"language": "github-issues",
"value": "$repo $milestone is:open label:api-proposal sort:created-asc"
}
]

View File

@@ -1,137 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "#### Macros"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-remotehub repo:microsoft/vscode-remote-repositories-github repo:microsoft/vscode-livepreview repo:microsoft/vscode-python repo:microsoft/vscode-jupyter repo:microsoft/vscode-jupyter-internal repo:microsoft/vscode-unpkg\n\n$MILESTONE=milestone:\"April 2022\""
},
{
"kind": 1,
"language": "markdown",
"value": "# Preparation"
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Pull Requests on the Milestone"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:pr is:open"
},
{
"kind": 1,
"language": "markdown",
"value": "## Unverified Older Insiders-Released Issues"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS -$MILESTONE is:issue is:closed label:bug label:insiders-released -label:verified -label:*duplicate -label:*as-designed -label:z-author-verified -label:on-testplan"
},
{
"kind": 1,
"language": "markdown",
"value": "## Unverified Older Insiders-Released Feature Requests"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS -$MILESTONE is:issue is:closed label:feature-request label:insiders-released -label:on-testplan -label:verified -label:*duplicate"
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues on the Milestone"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item"
},
{
"kind": 1,
"language": "markdown",
"value": "## Feature Requests Missing Labels"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate"
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Test Plan Items without milestone"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:open label:testplan-item no:milestone"
},
{
"kind": 1,
"language": "markdown",
"value": "# Testing"
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS is:issue is:open label:testplan-item"
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed label:verification-needed -label:verified"
},
{
"kind": 1,
"language": "markdown",
"value": "# Verification"
},
{
"kind": 1,
"language": "markdown",
"value": "## Verifiable Fixes"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found -label:z-author-verified -label:unreleased"
},
{
"kind": 1,
"language": "markdown",
"value": "## Unreleased Fixes"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:issue is:closed sort:updated-asc label:bug -label:verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found -label:z-author-verified label:unreleased"
},
{
"kind": 1,
"language": "markdown",
"value": "# Candidates"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE is:open label:candidate"
}
]

View File

@@ -1,667 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "## Config"
},
{
"kind": 2,
"language": "github-issues",
"value": "$since=2021-10-01"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode\n\nQuery exceeds the maximum result. Run the query manually: `is:issue is:open closed:>2021-10-01`"
},
{
"kind": 2,
"language": "github-issues",
"value": "//repo:microsoft/vscode is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "//repo:microsoft/vscode is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-remote-release"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-remote-release is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-remote-release is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-editor"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-docs"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-docs is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-docs is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-js-debug"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-js-debug is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-js-debug is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# language-server-protocol"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/language-server-protocol is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/language-server-protocol is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-eslint"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-eslint is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-eslint is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-css-languageservice"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-css-languageservice is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-css-languageservice is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-test"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-test is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-test is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-pull-request-github"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-pull-request-github is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-test is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-chrome-debug-core"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-chrome-debug-core is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-chrome-debug-core is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-debugadapter-node"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-debugadapter-node is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-debugadapter-node is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-emmet-helper"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-emmet-helper is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-emmet-helper is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-extension-vscode\n\nDeprecated"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-vscode is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-vscode is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-extension-samples"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-samples is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-extension-samples is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-filewatcher-windows"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-filewatcher-windows is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-filewatcher-windows is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-generator-code"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-generator-code is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-generator-code is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-html-languageservice"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-html-languageservice is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-html-languageservice is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-json-languageservice"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-json-languageservice is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-json-languageservice is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-languageserver-node"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-languageserver-node is:issue closed:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": ""
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-languageserver-node is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-loader"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-loader is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-loader is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-mono-debug"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-mono-debug is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-mono-debug is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-node-debug"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-node-debug2"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug2 is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-node-debug2 is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-recipes"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-recipes is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-recipes is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-textmate"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-textmate is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-textmate is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-themes"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-themes is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-themes is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-vsce"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-vsce is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-vsce is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-website"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-website is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-website is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-windows-process-tree"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-windows-process-tree is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-windows-process-tree is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# debug-adapter-protocol"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/debug-adapter-protocol is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/debug-adapter-protocol is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# inno-updater"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/inno-updater is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/inno-updater is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-languages"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-languages is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-languages is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-typescript"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-typescript is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-typescript is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-css"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-css is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-css is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-json"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-json is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-json is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-html"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-html is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-html is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# monaco-editor-webpack-plugin"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor-webpack-plugin is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/monaco-editor-webpack-plugin is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# node-jsonc-parser"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/node-jsonc-parser is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/node-jsonc-parser is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-jupyter"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-jupyter is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-jupyter is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-python"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-python is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-python is:issue created:>$since"
},
{
"kind": 1,
"language": "markdown",
"value": "# vscode-livepreview"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-livepreview is:issue closed:>$since"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-livepreview is:issue created:>$since"
}
]

View File

@@ -1,47 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "## tl;dr: Triage Inbox\n\nAll inbox issues but not those that need more information. These issues need to be triaged, e.g assigned to a user or ask for more information"
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox -label:\"needs more info\" sort:created-desc"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode label:triage-needed is:open"
},
{
"kind": 1,
"language": "markdown",
"value": "##### `Config`: defines the inbox query"
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox=repo:microsoft/vscode is:open no:assignee -label:feature-request -label:testplan-item -label:plan-item "
},
{
"kind": 1,
"language": "markdown",
"value": "## Inbox tracking and Issue triage"
},
{
"kind": 1,
"language": "markdown",
"value": "New issues or pull requests submitted by the community are initially triaged by an [automatic classification bot](https://github.com/microsoft/vscode-github-triage-actions/tree/master/classifier-deep). Issues that the bot does not correctly triage are then triaged by a team member. The team rotates the inbox tracker on a weekly basis.\n\nA [mirror](https://github.com/JacksonKearl/testissues/issues) of the VS Code issue stream is available with details about how the bot classifies issues, including feature-area classifications and confidence ratings. Per-category confidence thresholds and feature-area ownership data is maintained in [.github/classifier.json](https://github.com/microsoft/vscode/blob/main/.github/classifier.json). \n\n💡 The bot is being run through a GitHub action that runs every 30 minutes. Give the bot the opportunity to classify an issue before doing it manually.\n\n### Inbox Tracking\n\nThe inbox tracker is responsible for the [global inbox](https://github.com/microsoft/vscode/issues?utf8=%E2%9C%93&q=is%3Aopen+no%3Aassignee+-label%3Afeature-request+-label%3Atestplan-item+-label%3Aplan-item) containing all **open issues and pull requests** that\n- are neither **feature requests** nor **test plan items** nor **plan items** and\n- have **no owner assignment**.\n\nThe **inbox tracker** may perform any step described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) but its main responsibility is to route issues to the actual feature area owner.\n\nFeature area owners track the **feature area inbox** containing all **open issues and pull requests** that\n- are personally assigned to them and are not assigned to any milestone\n- are labeled with their feature area label and are not assigned to any milestone.\nThis secondary triage may involve any of the steps described in our [issue triaging documentation](https://github.com/microsoft/vscode/wiki/Issues-Triaging) and results in a fully triaged or closed issue.\n\nThe [github triage extension](https://github.com/microsoft/vscode-github-triage-extension) can be used to assist with triaging — it provides a \"Command Palette\"-style list of triaging actions like assignment, labeling, and triggers for various bot actions."
},
{
"kind": 1,
"language": "markdown",
"value": "## All Inbox Items\n\nAll issues that have no assignee and that have neither **feature requests** nor **test plan items** nor **plan items**."
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox"
}
]
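
For reference, the `$inbox` macro in the `Config` cell above is substituted into the queries that use it, so the final `$inbox` cell is equivalent to spelling the query out in full. A minimal illustrative cell (not part of the notebook file above) would be:

{
	"kind": 2,
	"language": "github-issues",
	"value": "repo:microsoft/vscode is:open no:assignee -label:feature-request -label:testplan-item -label:plan-item"
}

Likewise, a feature-area inbox as described in the triage notes would add the area label and a milestone filter; the label name here is only an example:

{
	"kind": 2,
	"language": "github-issues",
	"value": "repo:microsoft/vscode is:open label:notebook no:milestone"
}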

View File

@@ -1,182 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "#### Macros"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remotehub repo:microsoft/vscode-remote-repositories-github repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-livepreview repo:microsoft/vscode-python repo:microsoft/vscode-jupyter repo:microsoft/vscode-jupyter-internal\n\n$MILESTONE=milestone:\"April 2022\"\n\n$MINE=assignee:@me"
},
{
"kind": 1,
"language": "markdown",
"value": "# Preparation"
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Pull Requests on the Milestone"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:pr is:open"
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues on the Milestone"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open -label:iteration-plan -label:endgame-plan -label:testplan-item"
},
{
"kind": 1,
"language": "markdown",
"value": "## Feature Requests Missing Labels"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request -label:verification-needed -label:on-testplan -label:verified -label:*duplicate"
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS is:issue is:open author:@me label:testplan-item"
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:closed label:feature-request label:verification-needed -label:verified"
},
{
"kind": 1,
"language": "markdown",
"value": "# Testing"
},
{
"kind": 1,
"language": "markdown",
"value": "## Test Plan Items"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MINE is:issue is:open label:testplan-item"
},
{
"kind": 1,
"language": "markdown",
"value": "## Verification Needed"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -assignee:@me -label:verified -label:z-author-verified label:feature-request label:verification-needed"
},
{
"kind": 1,
"language": "markdown",
"value": "# Fixing"
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Issues"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open -label:endgame-plan -label:testplan-item -label:iteration-plan"
},
{
"kind": 1,
"language": "markdown",
"value": "## Open Bugs"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue is:open label:bug"
},
{
"kind": 1,
"language": "markdown",
"value": "# Verification"
},
{
"kind": 1,
"language": "markdown",
"value": "## My Issues (verification-steps-needed)"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue label:bug label:verification-steps-needed"
},
{
"kind": 1,
"language": "markdown",
"value": "## My Issues (verification-found)"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE $MINE is:issue label:bug label:verification-found"
},
{
"kind": 1,
"language": "markdown",
"value": "## Issues filed by me"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed author:@me sort:updated-asc label:bug -label:unreleased -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:needs-triage -label:verification-found"
},
{
"kind": 1,
"language": "markdown",
"value": "## Issues filed from outside team"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed sort:updated-asc label:bug -label:unreleased -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found -author:aeschli -author:alexdima -author:alexr00 -author:AmandaSilver -author:bamurtaugh -author:bpasero -author:chrisdias -author:chrmarti -author:Chuxel -author:claudiaregio -author:connor4312 -author:dbaeumer -author:deepak1556 -author:devinvalenciano -author:digitarald -author:DonJayamanne -author:dynamicwebpaige -author:eamodio -author:egamma -author:fiveisprime -author:greazer -author:gregvanl -author:hediet -author:IanMatthewHuff -author:isidorn -author:ItalyPaleAle -author:JacksonKearl -author:joaomoreno -author:joyceerhl -author:jrieken -author:karrtikr-author:kieferrm -author:lramos15 -author:lszomoru -author:meganrogge -author:misolori -author:mjbvz -author:ornellaalt -author:orta -author:rchiodo -author:rebornix -author:roblourens -author:rzhao271 -author:sana-ajani -author:sandy081 -author:sbatten -author:stevencl -author:tanhakabir -author:TylerLeonhardt -author:Tyriar -author:weinand -author:kimadeline -author:amunger"
},
{
"kind": 1,
"language": "markdown",
"value": "## Issues filed by others"
},
{
"kind": 2,
"language": "github-issues",
"value": "$REPOS $MILESTONE -$MINE is:issue is:closed -author:@me sort:updated-asc label:bug -label:unreleased -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found"
},
{
"kind": 1,
"language": "markdown",
"value": "# Release Notes"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode $MILESTONE $MINE is:issue is:closed label:feature-request -label:on-release-notes"
}
]

File diff suppressed because one or more lines are too long

View File

@@ -1,44 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "## Papercuts\n\nThis notebook serves as an ongoing collection of papercut issues that we encounter while dogfooding. With that in mind only promote issues that really turn you off, e.g. issues that make you want to stop using VS Code or its extensions. To mark an issue (bug, feature-request, etc.) as papercut add the labels: `papercut :drop_of_blood:`",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## All Papercuts\n\nThese are all papercut issues that we encounter while dogfooding vscode or extensions that we author.",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open -label:notebook label:\"papercut :drop_of_blood:\"",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Native Notebook",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open label:notebook label:\"papercut :drop_of_blood:\"",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "### My Papercuts",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode is:open assignee:@me label:\"papercut :drop_of_blood:\"",
"editable": true
}
]

View File

@@ -1,47 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "### Bug Verification Queries\n\nBefore shipping we want to verify _all_ bugs. That means when a bug is fixed we check that the fix actually works. It's always best to start with bugs that you have filed and the proceed with bugs that have been filed from users outside the development team. "
},
{
"kind": 1,
"language": "markdown",
"value": "#### Config: update list of `repos` and the `milestone`"
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-dev repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-jupyter repo:microsoft/vscode-python\n$milestone=milestone:\"March 2022\""
},
{
"kind": 1,
"language": "markdown",
"value": "### Bugs You Filed"
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate author:@me"
},
{
"kind": 1,
"language": "markdown",
"value": "### Bugs From Outside"
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate -author:@me -assignee:@me label:bug -label:verified -author:@me -author:aeschli -author:alexdima -author:alexr00 -author:bpasero -author:chrisdias -author:chrmarti -author:connor4312 -author:dbaeumer -author:deepak1556 -author:eamodio -author:egamma -author:gregvanl -author:isidorn -author:JacksonKearl -author:joaomoreno -author:jrieken -author:lramos15 -author:lszomoru -author:meganrogge -author:misolori -author:mjbvz -author:rebornix -author:RMacfarlane -author:roblourens -author:sana-ajani -author:sandy081 -author:sbatten -author:Tyriar -author:weinand -author:rzhao271 -author:kieferrm -author:TylerLeonhardt -author:bamurtaugh -author:hediet -author:joyceerhl -author:rchiodo -author:IanMatthewHuff"
},
{
"kind": 1,
"language": "markdown",
"value": "### All"
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate"
}
]

View File

@@ -1,42 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "# vscode.dev repo"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-dev milestone:\"December 2021\" is:open"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-dev milestone:\"Backlog\" is:open"
},
{
"kind": 1,
"language": "markdown",
"value": "# VS Code repo"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode label:vscode.dev is:open"
},
{
"kind": 1,
"language": "markdown",
"value": "# GitHub Repositories repos"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-remote-repositories-github milestone:\"December 2021\" is:open"
},
{
"kind": 2,
"language": "github-issues",
"value": "repo:microsoft/vscode-remotehub milestone:\"December 2021\" is:open"
}
]

View File

@@ -1,167 +0,0 @@
# Query: strict-null
76 results - 44 files
src\vs\base\browser\ui\tree\compressedObjectTreeModel.ts:
455: return null; // {{SQL CARBON EDIT}} strict-null-check
465: return null; // {{SQL CARBON EDIT}} strict-null-check
src\vs\platform\actions\common\menuService.ts:
97: const toggledExpression: ContextKeyExpression = (item.command.toggled as { condition: ContextKeyExpression }).condition || item.command.toggled as ContextKeyExpression; // {{SQL CARBON EDIT}} strict-null-checks
src\vs\platform\clipboard\browser\clipboardService.ts:
57: return undefined; // {{SQL CARBON EDIT}} strict-null-checks
src\vs\platform\dialogs\electron-main\dialogs.ts:
123: return undefined; // {{SQL CARBON EDIT}} strict-null-check
src\vs\platform\driver\electron-main\driver.ts:
214: const driver = instantiationService.createInstance(Driver as any, windowServer, { verbose }) as Driver; // {{SQL CARBON EDIT}} strict-null-check...i guess?
src\vs\platform\extensionManagement\node\extensionManagementService.ts:
558: return undefined; // {{SQL CARBON EDIT}} strict-null-checks
src\vs\platform\quickinput\browser\pickerQuickAccess.ts:
216: active: activePick as T || additionalActivePick as T // {{SQL CARBON EDIT}} strict-null-checks
src\vs\workbench\api\browser\mainThreadLanguageFeatures.ts:
90: return undefined; // {{SQL CARBON EDIT}} strict-null-checks
600: return undefined; // {{SQL CARBON EDIT}} strict-null-check
610: return undefined; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\api\common\extHost.api.impl.ts:
538: alignment = alignmentOrOptions as number; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\api\common\extHostComments.ts:
410: return undefined; // {{SQL CARBON EDIT}} @anthonydresser strict-null-check
src\vs\workbench\api\common\extHostTask.ts:
583: return undefined; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\api\common\extHostTerminalService.ts:
279: this._onProcessExit.fire(e === void 0 ? undefined : e as number); // {{SQL CARBON EDIT}} strict-null-checks
283: this._pty.onDidOverrideDimensions(e => this._onProcessOverrideDimensions.fire(e ? { cols: e.columns, rows: e.rows } : undefined)); // {{SQL CARBONEDIT}} strict-null-checks
src\vs\workbench\browser\actions\workspaceCommands.ts:
87: return undefined; // {{SQL CARBON EDIT}} @anthonydresser strict-null-check
120: return undefined; // {{SQL CARBON EDIT}} @anthonydresser strict-null-check
src\vs\workbench\browser\parts\editor\editorGroupView.ts:
827: return undefined; // {{SQL CARBON EDIT}} strict-null-checks
src\vs\workbench\browser\parts\panel\panelPart.ts:
151: (id: string, focus?: boolean) => <unknown>this.openPanel(id, focus) as Promise<IPaneComposite | undefined>, // {{SQL CARBON EDIT}} strict-null-checks
src\vs\workbench\browser\parts\sidebar\sidebarPart.ts:
59: return undefined; // {{SQL CARBON EDIT}} strict-null-check
64: return undefined; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\common\editor\editorGroup.ts:
388: return undefined; // {{SQL CARBON EDIT}} strict-null-check
406: return undefined; // not found {{SQL CARBON EDIT}} strict-null-check
433: return undefined; // not found {{SQL CARBON EDIT}} strict-null-check
456: return undefined; // not found {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\contrib\callHierarchy\browser\callHierarchyPeek.ts:
377: const root = <ITreeNode<callHTree.Call, FuzzyScore>>this._tree.getNode(model).children[0]; // {{SQL CARBON EDIT}} strict-null-checks
src\vs\workbench\contrib\customEditor\browser\customEditorInput.ts:
230: return undefined; // {{SQL CARBON EDIT}} strict-null-checks
src\vs\workbench\contrib\customEditor\browser\customEditors.ts:
169: return undefined; // {{SQL CARBON EDIT}} strict-nulls
468: return undefined; // {{SQL CARBON EDIT}} Strict-null-checks
493: return undefined; // {{SQL CARBON EDIT}} Strict-null-checks
505: return undefined; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\contrib\extensions\browser\extensionsActions.ts:
2203: return (<IExtensionsConfigContent>json.parse(content.value.toString()) || {}) as IExtensionsConfigContent; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\contrib\extensions\test\electron-browser\extensionRecommendationsService.test.ts:
508: instantiationService.stub(IStorageService, <any>{ // {{SQL CARBON EDIT}} strict-null-checks?
src\vs\workbench\contrib\files\common\explorerService.ts:
393: const configSortOrder = configuration?.explorer?.sortOrder || SortOrder.Default; // {{SQL CARBON EDIT}} strict-null-checks?
src\vs\workbench\contrib\notebook\browser\notebookEditor.ts:
475: return undefined; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\contrib\notebook\browser\notebookService.ts:
204: return undefined; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\contrib\notebook\browser\contrib\notebookActions.ts:
412: return undefined; // {{SQL CARBON EDIT}} strict-null-check
417: return undefined; // {{SQL CARBON EDIT}} strict-null-check
479: return undefined; // {{SQL CARBON EDIT}} strict-null-check
484: return undefined; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\contrib\notebook\test\testNotebookEditor.ts:
186: return undefined; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\contrib\remote\browser\remote.ts:
544: return undefined; // {{SQL CARBON EDIT}} strict-null-check
563: return undefined; // {{SQL CARBON EDIT}} strict-null-check;
src\vs\workbench\contrib\remote\browser\tunnelView.ts:
589: const node: ITunnelItem | null = treeEvent.element as ITunnelItem | null; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\contrib\search\browser\anythingQuickAccess.ts:
631: return undefined; // {{SQL CARBON EDIT}} strict-null
636: return undefined; // {{SQL CARBON EDIT}} strict-null
641: return undefined; // {{SQL CARBON EDIT}} strict-null
651: return undefined; // {{SQL CARBON EDIT}} strict-null
663: return undefined; // {{SQL CARBON EDIT}} strict-null
668: return undefined; // {{SQL CARBON EDIT}} strict-null
698: return undefined; // {{SQL CARBON EDIT}} strict-null
src\vs\workbench\contrib\searchEditor\browser\searchEditor.ts:
335: return undefined; // {{SQL CARBON EDIT}} strict-null-checks
src\vs\workbench\contrib\searchEditor\browser\searchEditorInput.ts:
121: if ((await this.headerModel).isDisposed() || (await this.contentsModel).isDisposed()) { return undefined; } // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\contrib\tasks\browser\abstractTaskService.ts:
565: return undefined; // {{SQL CARBON EDIT}} strict-null-checks
586: return undefined; // {{SQL CARBON EDIT}} strict-null-checks
src\vs\workbench\contrib\tasks\browser\taskQuickPick.ts:
204: return undefined; // {{SQL CARBON EDIT}} strict-null-checks
207: return undefined; // {{SQL CARBON EDIT}} strict-null-checks
src\vs\workbench\contrib\webview\browser\webviewWorkbenchService.ts:
148: return undefined; // {{SQL CARBON EDIT}} strict-null-checks
src\vs\workbench\electron-browser\desktop.main.ts:
283: return undefined; // {{SQL CARBON EDIT}} @anthonydresser strict-null-check
src\vs\workbench\services\dialogs\browser\simpleFileDialog.ts:
496: return undefined; // {{SQL CARBON EDIT}} @anthonydresser strict-null-check
502: return undefined; // {{SQL CARBON EDIT}} @anthonydresser strict-null-check
src\vs\workbench\services\dialogs\electron-browser\fileDialogService.ts:
127: return undefined; // {{SQL CARBON EDIT}} strict-null-check
151: return undefined; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\services\extensions\common\abstractExtensionService.ts:
235: result.push(new ExtensionPointContribution<T>(desc, desc.contributes[extPoint.name])); // {{SQL CARBON EDIT}} strict-null-checks
376: value: desc.contributes[extensionPoint.name], // {{SQL CARBON EDIT}} strict-null-checks
src\vs\workbench\services\textfile\browser\textFileService.ts:
221: return undefined; // user canceled // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\services\textfile\common\textFileEditorModel.ts:
611: if ((this.saveSequentializer as TaskSequentializer).hasPending()) { // {{SQL CARBON EDIT}} strict-null-check
619: (this.saveSequentializer as TaskSequentializer).cancelPending(); // {{SQL CARBON EDIT}} strict-null-check
622: return (this.saveSequentializer as TaskSequentializer).setNext(() => this.doSave(options)); // {{SQL CARBON EDIT}} strict-null-check
633: return (this.saveSequentializer as TaskSequentializer).setPending(versionId, (async () => { // {{SQL CARBON EDIT}} strict-null-checks
667: return undefined; // {{SQL CARBON EDIT}} @anthonydresser strict-null-check
672: return undefined; // {{SQL CARBON EDIT}} @anthonydresser strict-null-check
src\vs\workbench\services\textfile\common\textfiles.ts:
421: isDirty(): boolean; // {{SQL CARBON EDIT}} strict-null-check
src\vs\workbench\services\themes\browser\workbenchThemeService.ts:
248: Theme(), initializeFileIconTheme(), initializeProductIconTheme()]) as Promise<[IWorkbenchColorTheme | null, IWorkbenchFileIconTheme | null, IWorkbenchProductIconTheme | null]>; // {{SQL CARBON EDIT}} strict-null-checks maybe?
src\vs\workbench\services\workspaces\browser\abstractWorkspaceEditingService.ts:
56: return undefined; // canceled {{SQL CARBON EDIT}} strict-null-checks

View File

@@ -1,53 +0,0 @@
# Query: \\w+\\?\\.\\w+![(.[]
# Flags: RegExp
# ContextLines: 2
8 results - 4 files
src/vs/base/browser/ui/tree/asyncDataTree.ts:
241 } : () => 'treeitem',
242 isChecked: options.accessibilityProvider!.isChecked ? (e) => {
243: return !!(options.accessibilityProvider?.isChecked!(e.element as T));
244 } : undefined,
245 getAriaLabel(e) {
src/vs/platform/list/browser/listService.ts:
463
464 if (typeof options?.openOnSingleClick !== 'boolean' && options?.configurationService) {
465: this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
466 this._register(options?.configurationService.onDidChangeConfiguration(() => {
467: this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
468 }));
469 } else {
src/vs/workbench/contrib/notebook/browser/notebookEditorWidget.ts:
1526
1527 await this._ensureActiveKernel();
1528: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, undefined);
1529 }
1530
1535
1536 await this._ensureActiveKernel();
1537: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, undefined);
1538 }
1539
1553
1554 await this._ensureActiveKernel();
1555: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1556 }
1557
1567
1568 await this._ensureActiveKernel();
1569: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1570 }
1571
src/vs/workbench/contrib/webview/electron-browser/iframeWebviewElement.ts:
89 .then(() => this._resourceRequestManager.ensureReady())
90 .then(() => {
91: this.element?.contentWindow!.postMessage({ channel, args: data }, '*');
92 });
93 }

File diff suppressed because it is too large

74
.vscode/settings.json vendored
View File

@@ -1,17 +1,17 @@
{
"editor.insertSpaces": false,
"files.eol": "\n",
"files.trimTrailingWhitespace": true,
"files.exclude": {
".git": true,
".build": true,
".profile-oss": true,
"**/.DS_Store": true,
"build/**/*.js": {
"when": "$(basename).ts"
}
},
"files.associations": {
"cglicenses.json": "jsonc"
"OSSREADME.json": "jsonc"
},
"search.exclude": {
"**/node_modules": true,
@@ -22,12 +22,9 @@
"out-vscode/**": true,
"i18n/**": true,
"extensions/**/out/**": true,
"test/smoke/out/**": true,
"test/automation/out/**": true,
"test/integration/browser/out/**": true,
"src/vs/base/test/node/uri.test.data.txt": true,
"src/vs/workbench/api/test/browser/extHostDocumentData.test.perf-data.ts": true
"test/smoke/out/**": true
},
"tslint.enable": true,
"lcov.path": [
"./.build/coverage/lcov.info",
"./.build/coverage-single/lcov.info"
@@ -41,66 +38,5 @@
}
}
],
"eslint.options": {
"rulePaths": [
"./build/lib/eslint"
]
},
"typescript.tsdk": "node_modules/typescript/lib",
"npm.exclude": "**/extensions/**",
"npm.packageManager": "yarn",
"emmet.excludeLanguages": [],
"typescript.preferences.importModuleSpecifier": "non-relative",
"typescript.preferences.quoteStyle": "single",
"json.schemas": [
{
"fileMatch": [
"cgmanifest.json"
],
"url": "https://json.schemastore.org/component-detection-manifest.json"
},
{
"fileMatch": [
"cglicenses.json"
],
"url": "./.vscode/cglicenses.schema.json"
}
],
"git.ignoreLimitWarning": true,
"remote.extensionKind": {
"msjsdiag.debugger-for-chrome": "workspace"
},
"gulp.autoDetect": "off",
"files.insertFinalNewline": true,
"[plaintext]": {
"files.insertFinalNewline": false
},
"[typescript]": {
"editor.defaultFormatter": "vscode.typescript-language-features",
"editor.formatOnSave": true
},
"[javascript]": {
"editor.defaultFormatter": "vscode.typescript-language-features",
"editor.formatOnSave": true
},
"typescript.tsc.autoDetect": "off",
"testing.autoRun.mode": "rerun",
"conventionalCommits.scopes": [
"tree",
"scm",
"grid",
"splitview",
"table",
"list",
"git",
"sash"
],
"editor.quickSuggestions": {
"other": "inline",
"comments": "inline",
"strings": "inline"
},
"yaml.schemas": {
"https://raw.githubusercontent.com/microsoft/azure-pipelines-vscode/master/service-schema.json": "build/azure-pipelines/**/*.yml"
},
"typescript.tsdk": "node_modules/typescript/lib"
}

View File

@@ -1,40 +0,0 @@
{
// Each snippet is defined under a snippet name and has a scope, prefix, body and
// description. The scope defines in which languages the snippet is applicable. The prefix is what is
// used to trigger the snippet and the body will be expanded and inserted. Possible variables are:
// $1, $2 for tab stops, $0 for the final cursor position, and ${1:label}, ${2:another} for placeholders.
// Placeholders with the same ids are connected.
// Example:
"MSFT Copyright Header": {
"scope": "javascript,typescript,css",
"prefix": [
"header",
"stub",
"copyright"
],
"body": [
"/*---------------------------------------------------------------------------------------------",
" * Copyright (c) Microsoft Corporation. All rights reserved.",
" * Licensed under the Source EULA. See License.txt in the project root for license information.",
" *--------------------------------------------------------------------------------------------*/",
"",
"$0"
],
"description": "Insert Copyright Statement"
},
"TS -> Inject Service": {
"scope": "typescript",
"description": "Constructor Injection Pattern",
"prefix": "@inject",
"body": "@$1 private readonly _$2: ${1},$0"
},
"TS -> Event & Emitter": {
"scope": "typescript",
"prefix": "emitter",
"description": "Add emitter and event properties",
"body": [
"private readonly _onDid$1 = new Emitter<$2>();",
"readonly onDid$1: Event<$2> = this._onDid$1.event;"
],
}
}
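
The header comment mentions `${1:label}`-style placeholders, but none of the entries above use them. As a purely hypothetical sketch (the snippet name, prefix, and body are invented for illustration and are not part of this file), such an entry would follow the same scope/prefix/body shape, with a repeated `$1` mirroring the first placeholder:

// Hypothetical entry, illustration only: a labeled placeholder with a default value,
// plus a repeated $1 that is filled in together with the first occurrence.
"TS -> Guard Clause": {
	"scope": "typescript",
	"prefix": "guard",
	"description": "Insert a guard clause (illustrative example)",
	"body": [
		"if (!${1:value}) {",
		"\tthrow new Error('$1 is required');",
		"}",
		"$0"
	]
}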

210
.vscode/tasks.json vendored
View File

@@ -3,119 +3,8 @@
"tasks": [
{
"type": "npm",
"script": "watch-clientd",
"label": "Core - Build",
"isBackground": true,
"presentation": {
"reveal": "never",
"group": "buildWatchers",
"close": false
},
"problemMatcher": {
"owner": "typescript",
"applyTo": "closedDocuments",
"fileLocation": [
"absolute"
],
"pattern": {
"regexp": "Error: ([^(]+)\\((\\d+|\\d+,\\d+|\\d+,\\d+,\\d+,\\d+)\\): (.*)$",
"file": 1,
"location": 2,
"message": 3
},
"background": {
"beginsPattern": "Starting compilation...",
"endsPattern": "Finished compilation with"
}
}
},
{
"type": "npm",
"script": "watch-extensionsd",
"label": "Ext - Build",
"isBackground": true,
"presentation": {
"reveal": "never",
"group": "buildWatchers",
"close": false
},
"problemMatcher": {
"owner": "typescript",
"applyTo": "closedDocuments",
"fileLocation": [
"absolute"
],
"pattern": {
"regexp": "Error: ([^(]+)\\((\\d+|\\d+,\\d+|\\d+,\\d+,\\d+,\\d+)\\): (.*)$",
"file": 1,
"location": 2,
"message": 3
},
"background": {
"beginsPattern": "Starting compilation",
"endsPattern": "Finished compilation"
}
}
},
{
"label": "VS Code - Build",
"dependsOn": [
"Core - Build",
"Ext - Build"
],
"group": {
"kind": "build",
"isDefault": true
},
"problemMatcher": []
},
{
"type": "npm",
"script": "kill-watch-clientd",
"label": "Kill Core - Build",
"group": "build",
"presentation": {
"reveal": "never",
"group": "buildKillers",
"close": true
},
"problemMatcher": "$tsc"
},
{
"type": "npm",
"script": "kill-watch-extensionsd",
"label": "Kill Ext - Build",
"group": "build",
"presentation": {
"reveal": "never",
"group": "buildKillers",
"close": true
},
"problemMatcher": "$tsc"
},
{
"label": "Kill VS Code - Build",
"dependsOn": [
"Kill Core - Build",
"Kill Ext - Build"
],
"group": "build",
"problemMatcher": []
},
{
"label": "Restart VS Code - Build",
"dependsOn": [
"Kill VS Code - Build",
"VS Code - Build"
],
"group": "build",
"dependsOrder": "sequence",
"problemMatcher": []
},
{
"type": "npm",
"script": "watch-webd",
"label": "Web Ext - Build",
"script": "watch",
"label": "Build VS Code",
"group": "build",
"isBackground": true,
"presentation": {
@@ -140,14 +29,12 @@
}
},
{
"type": "npm",
"script": "kill-watch-webd",
"label": "Kill Web Ext - Build",
"group": "build",
"presentation": {
"reveal": "never"
},
"problemMatcher": "$tsc"
"type": "gulp",
"task": "tslint",
"label": "Run tslint",
"problemMatcher": [
"$tslint5"
]
},
{
"label": "Run tests",
@@ -172,91 +59,14 @@
"problemMatcher": []
},
{
"type": "npm",
"script": "electron",
"type": "gulp",
"task": "electron",
"label": "Download electron"
},
{
"type": "gulp",
"task": "hygiene",
"problemMatcher": []
},
{
"type": "shell",
"command": "./scripts/code-server.sh",
"windows": {
"command": ".\\scripts\\code-server.bat"
},
"args": ["--no-launch", "--connection-token", "dev-token", "--port", "8080"],
"label": "Run code server",
"isBackground": true,
"problemMatcher": {
"pattern": {
"regexp": ""
},
"background": {
"beginsPattern": ".*node .*",
"endsPattern": "Web UI available at .*"
}
},
"presentation": {
"reveal": "never"
}
},
{
"type": "npm",
"script": "eslint",
"problemMatcher": {
"source": "eslint",
"base": "$eslint-stylish"
}
},
{
"type": "shell",
"command": "node build/lib/preLaunch.js",
"label": "Ensure Prelaunch Dependencies",
"presentation": {
"reveal": "silent",
"close": true
}
},
{
"type": "npm",
"script": "tsec-compile-check",
"problemMatcher": [
{
"base": "$tsc",
"applyTo": "allDocuments",
"owner": "tsec"
}
],
"group": "build",
"label": "npm: tsec-compile-check",
"detail": "node_modules/tsec/bin/tsec -p src/tsconfig.json --noEmit"
},
{
"type": "npm",
"script": "watch",
"label": "Watch sample-resource-deployment",
"path": "./samples/sample-resource-deployment/package.json",
"problemMatcher": "$tsc-watch",
"isBackground": true,
"presentation": {
"reveal": "never"
},
"group": "build"
},
{
"type": "npm",
"script": "watch",
"label": "Watch sample-notebook-provider",
"path": "./samples/sample-notebook-provider/package.json",
"problemMatcher": "$tsc-watch",
"isBackground": true,
"presentation": {
"reveal": "never"
},
"group": "build"
}
]
}

View File

@@ -1,4 +1,3 @@
disturl "https://electronjs.org/headers"
target "19.1.8"
disturl "https://atom.io/download/electron"
target "1.7.12"
runtime "electron"
build_from_source "true"

File diff suppressed because it is too large

1
CODE_OF_CONDUCT.md Normal file
View File

@@ -0,0 +1 @@
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.

View File

@@ -1,13 +1,13 @@
## Contributing Issues
### Before Submitting an Issue
First, please do a search in [open issues](https://github.com/Microsoft/azuredatastudio/issues) to see if the issue or feature request has already been filed. Use this [query](https://github.com/Microsoft/azuredatastudio/issues?q=is%3Aopen+is%3Aissue+label%3Afeature-request+sort%3Areactions-%2B1-desc) to search for the most popular feature requests.
First, please do a search in [open issues](https://github.com/Microsoft/sqlopsstudio/issues) to see if the issue or feature request has already been filed. Use this [query](https://github.com/Microsoft/sqlopsstudio/issues?q=is%3Aopen+is%3Aissue+label%3Afeature-request+sort%3Areactions-%2B1-desc) to search for the most popular feature requests.
If you find your issue already exists, make relevant comments and add your [reaction](https://github.com/blog/2119-add-reactions-to-pull-requests-issues-and-comments). Use a reaction in place of a "+1" comment.
:+1: - upvote
👍 - upvote
:-1: - downvote
👎 - downvote
If you cannot find an existing issue that describes your bug or feature, submit an issue using the guidelines below.
@@ -18,33 +18,29 @@ File a single issue per problem and feature request.
* Do not enumerate multiple bugs or feature requests in the same issue.
* Do not add your issue as a comment to an existing issue unless it's for the identical input. Many issues look similar, but have different causes.
The more information you can provide, the more likely someone will be successful at reproducing the issue and finding a fix.
The more information you can provide, the more likely someone will be successful reproducing the issue and finding a fix.
The built-in tool for reporting an issue, which you can access by using `Report Issue` in Azure Data Studio's Help menu, can help streamline this process by automatically providing the version of Azure Data Studio, all your installed extensions, and your system info.
Please include the following with each issue.
Please include the following with each issue.
* Version of SQL Ops Studio
* Version of Azure Data Studio (formerly SQL Operations Studio)
> **Tip:** You can easily create an issue using `Report Issues` from SQL Operations Studio Help menu.
* Your operating system
* Reproducible steps (1... 2... 3...) and what you expected versus what you actually saw.
* Images, animations, or a link to a video.
* A code snippet that demonstrates the issue or a link to a code repository we can easily pull down onto our machine to recreate the issue.
> **Tip:** You can easily create an issue using `Report Issues` from Azure Data Studio Help menu.
* Reproducible steps (1... 2... 3...) and what you expected versus what you actually saw.
* Images, animations, or a link to a video.
* A code snippet that demonstrates the issue or a link to a code repository we can easily pull down onto our machine to recreate the issue.
> **Note:** Because we need to copy and paste the code snippet, including a code snippet as a media file (i.e. .gif) is not sufficient.
> **Note:** Because we need to copy and paste the code snippet, including a code snippet as a media file (i.e. .gif) is not sufficient.
* Errors in the Dev Tools Console (Help | Toggle Developer Tools)
Please remember to do the following:
* Search the issue repository to see if there exists a duplicate.
* Simplify your scripts around the issue so we can better isolate the problem.
* Search the issue repository to see if there exists a duplicate.
* Simplify your scripts around the issue so we can better isolate the problem.
Don't feel bad if we can't reproduce the issue and ask for more information!
## Contributing Fixes
If you are interested in fixing issues and contributing directly to the code base,
please see the document [How to Contribute](https://github.com/Microsoft/azuredatastudio/wiki/How-to-Contribute).
please see the document [How to Contribute](https://github.com/Microsoft/sqlopsstudio/wiki/How-to-Contribute).

View File

@@ -1,6 +1,6 @@
MICROSOFT SOFTWARE LICENSE TERMS
MICROSOFT AZURE DATA STUDIO
MICROSOFT SQL OPERATIONS STUDIO
Microsoft Corporation ("Microsoft") grants you a nonexclusive, perpetual,
royalty-free right to use, copy, and modify the software code provided by us

1196
OSSREADME.json Normal file

File diff suppressed because it is too large

153
README.md
View File

@@ -1,69 +1,27 @@
# Azure Data Studio
# SQL Operations Studio
[![Join the chat at https://gitter.im/Microsoft/sqlopsstudio](https://badges.gitter.im/Microsoft/sqlopsstudio.svg)](https://gitter.im/Microsoft/sqlopsstudio?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[![Build Status](https://dev.azure.com/ms/azuredatastudio/_apis/build/status/AzureDataStudio-Localization-CI?branchName=main)](https://dev.azure.com/ms/azuredatastudio/_build/latest?definitionId=453&branchName=main)
[![Twitter Follow](https://img.shields.io/twitter/follow/azuredatastudio?style=social)](https://twitter.com/azuredatastudio)
Azure Data Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux.
SQL Operations Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux.
## **Download the latest Azure Data Studio release**
**Download SQL Operations Studio May Public Preview**
|Platform |Type |Download |
| --------|-----------------|----------------------- |
|Windows |User Installer |[64 bit][win-user]&emsp;[ARM][win-user-arm64] |
| |System Installer |[64 bit][win-system]&emsp;[ARM][win-system-arm64] |
| |.zip |[64 bit][win-zip]&emsp;[ARM][win-zip-arm64] |
|Linux |.tar.gz |[64 bit][linux-zip] |
| |.deb |[64 bit][linux-deb] |
| |.rpm |[64 bit][linux-rpm] |
|Mac |.zip |[Universal][osx-universal]&emsp;[Intel Chip][osx-zip]&emsp;[Apple Silicon][osx-arm64] |
Platform | Link
-- | --
Windows Setup Installer | https://go.microsoft.com/fwlink/?linkid=873386
Windows ZIP | https://go.microsoft.com/fwlink/?linkid=873387
macOS ZIP | https://go.microsoft.com/fwlink/?linkid=873388
Linux TAR.GZ | https://go.microsoft.com/fwlink/?linkid=873389
Linux RPM | https://go.microsoft.com/fwlink/?linkid=873390
Linux DEB | https://go.microsoft.com/fwlink/?linkid=873391
[win-user]: https://go.microsoft.com/fwlink/?linkid=2222768
[win-system]: https://go.microsoft.com/fwlink/?linkid=2222769
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2223104
[win-user-arm64]: https://go.microsoft.com/fwlink/?linkid=2222660
[win-system-arm64]: https://go.microsoft.com/fwlink/?linkid=2222849
[win-zip-arm64]: https://go.microsoft.com/fwlink/?linkid=2222850
[osx-universal]: https://go.microsoft.com/fwlink/?linkid=2222873
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2222874
[osx-arm64]: https://go.microsoft.com/fwlink/?linkid=2222680
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2222918
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2223105
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2222875
Go to our [download page](https://aka.ms/sqlopsstudio) for more specific instructions.
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
Try out the latest insiders build from `master` at https://github.com/Microsoft/sqlopsstudio/releases.
## Try out the latest insiders build from `main` branch:
See the [change log](https://github.com/Microsoft/sqlopsstudio/blob/master/CHANGELOG.md) for additional details of what's in this release.
|Platform |Type |Download - Insiders Build |
| --------|-----------------|----------------------- |
|Windows |User Installer |[64 bit][in-win-user]&emsp;[ARM][in-win-user-arm64] |
| |System Installer |[64 bit][in-win-system]&emsp;[ARM][in-win-system-arm64] |
| |.zip |[64 bit][in-win-zip]&emsp;[ARM][in-win-zip-arm64] |
|Linux |.tar.gz |[64 bit][in-linux-zip] |
| |.deb |[64 bit][in-linux-deb] |
| |.rpm |[64 bit][in-linux-rpm] |
|Mac |.zip |[Universal][in-osx-universal]&emsp;[Intel Chip][in-osx-zip]&emsp;[Apple Silicon][in-osx-arm64] |
[in-win-user]: https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-user/insider
[in-win-system]: https://azuredatastudio-update.azurewebsites.net/latest/win32-x64/insider
[in-win-zip]: https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-archive/insider
[in-win-user-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/win32-arm64-user/insider
[in-win-system-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/win32-arm64/insider
[in-win-zip-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/win32-arm64-archive/insider
[in-linux-zip]:https://azuredatastudio-update.azurewebsites.net/latest/linux-x64/insider
[in-linux-deb]:https://azuredatastudio-update.azurewebsites.net/latest/linux-deb-x64/insider
[in-linux-rpm]:https://azuredatastudio-update.azurewebsites.net/latest/linux-rpm-x64/insider
[in-osx-universal]: https://azuredatastudio-update.azurewebsites.net/latest/darwin-universal/insider
[in-osx-zip]: https://azuredatastudio-update.azurewebsites.net/latest/darwin/insider
[in-osx-arm64]: https://azuredatastudio-update.azurewebsites.net/latest/darwin-arm64/insider
See the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) for additional details of what's in this release.
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
## **Feature Highlights**
**Feature Highlights**
- Cross-Platform DB management for Windows, macOS and Linux with simple XCopy deployment
- SQL Server Connection Management with Connection Dialog, Server Groups, Azure Integration and Registered Servers
@@ -76,85 +34,60 @@ Go to our [download page](https://aka.ms/getazuredatastudio) for more specific i
- Task History window to view current task execution status, completion results with error messages and task T-SQL scripting
- Scripting support to generate CREATE, SELECT, ALTER and DROP statements for database objects
- Workspaces with full Git integration and Find In Files support for managing T-SQL script libraries
- Modern lightweight shell with theming, user settings, full-screen support, integrated terminal and numerous other features

Here are some of these features in action.

<img src='https://github.com/Microsoft/azuredatastudio/blob/main/docs/overview_screen.jpg' width='800px'>
## Contributing
If you are interested in fixing issues and contributing directly to the code base,
please see the document [How to Contribute](https://github.com/Microsoft/azuredatastudio/wiki/How-to-Contribute), which covers the following:

* [How to build and run from source](https://github.com/Microsoft/azuredatastudio/wiki/How-to-Contribute#Build-and-Run-From-Source)
* [The development workflow, including debugging and running tests](https://github.com/Microsoft/azuredatastudio/wiki/How-to-Contribute#development-workflow)
* [Submitting pull requests](https://github.com/Microsoft/azuredatastudio/wiki/How-to-Contribute#pull-requests)
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
## Localization
Azure Data Studio is localized into 10 languages: French, Italian, German, Spanish, Simplified Chinese, Traditional Chinese, Japanese, Korean, Russian, and Portuguese (Brazil). The language packs are available in the Extension Manager marketplace. Simply search for the specific language in the extension marketplace and install it. Once you install the selected language, Azure Data Studio will prompt you to restart with the new language.
## Privacy Statement
The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.microsoft.com/privacystatement) describes the privacy practices of this software.
## Contributions and "Thank You"
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:
* eulercamposbarros for `Prevent connections from moving on click (#7528)`
* AlexFsmn for `Fixed issue where task icons got hidden if text was too long`
* jamesrod817 for `Tempdb (#7022)`
* dzsquared for `fix(snippets): ads parenthesis to sqlcreateindex snippet #7020`
* devmattrick for `Update row count as updates are received #6642`
* mottykohn for `In Message panel onclick scroll to line #6417`
* Stevoni for `Corrected Keyboard Shortcut Execution Issue #5480`
* yamatoya for `fix the format #4899`
* GeoffYoung for `Fix sqlDropColumn description #4422`
* AlexFsmn for `Added context menu for DBs in explorer view to backup & restore db. #2277`
* sadedil for `Missing feature request: Save as XML #3729`
* gbritton1 for `Removed reference to object explorer #3463`
* Tarig0 for `Add Routine_Type to CreateStoredProc fixes #3257 (#3286)`
* oltruong for `typo fix #3025'`
* Thomas-S-B for `Removed unnecessary IErrorDetectionStrategy #749`
* Thomas-S-B for `Simplified code #750`
* rdaniels6813 for `Add query plan theme support #3031`
* Ruturaj123 for `Fixed some typos and grammatical errors #3027`
* PromoFaux for `Use emoji shortcodes in CONTRIBUTING.md instead of <20> #3009`
* ckaczor for `Fix: DATETIMEOFFSET data types should be ISO formatted #714`
* hi-im-T0dd for `Fixed sync issue with my forked master so this commit is correct #2948`
* hi-im-T0dd for `Fixed when right clicking and selecting Manage-correct name displays #2794`
* philoushka for `center the icon #2760`
* anthonypants for `Typo #2775`
* kstolte for `Fix Invalid Configuration in Launch.json #2789`
* kstolte for `Fixing a reference to SQL Ops Studio #2788`
* AlexFsmn `Feature: Ability to add connection name #2332`
* AlexFsmn `Disabled connection name input when connecting to a server. #2566`
* SebastianPfliegel `Added more saveAsCsv options #2099`
* ianychoi `Fixes a typo: Mimunum -> Minimum #1994`
* AlexFsmn `Fixed bug where proper file extension wasn't appended to the filename. #2151`
* AlexFsmn `Added functionality for adding any file to import wizard #2329`
* AlexFsmn `Fixed background issue when copying a chart to clipboard #2215`
* AlexFsmn `Fixed problem where vertical charts didn't display labels correctly. #2263`
* AlexFsmn `Fixed Initial values for charts to match visuals #2266`
* AlexFsmn `Renamed chart option labels #2264`
* AlexFsmn `Added feature for the opening file after exporting to CSV/XLS/JSON & query files #2216`
* AlexFsmn `Get Connection String should copy to clipboard #2175`
* lanceklinger `Fix for double-clicking column handle in results table #1504`
* westerncj for `Removed duplicate contribution from README.md (#753)`
* ntovas for `Fix for duplicate extensions shown in "Save File" dialog. (#779)`
* SebastianPfliegel for `Add cursor snippet (#475)`
* mikaoelitiana for the fix: `revert README and CONTRIBUTING after last VSCode merge (#574)`
* alextercete for `Reinstate menu item to install from VSIX (#682)`
* alextercete for `Fix "No extension gallery service configured" error (#427)`
* mwiedemeyer for `Fix #58: Default sort order for DB size widget (#111)`
* AlexTroshkin for `Show disconnect in context menu only when connectionProfile connected (#150)`
* AlexTroshkin for `Fix #138: Invalid syntax color highlighting (identity not highlighting) (#140))`
* stebet for `Fix #153: Fixing sql snippets that failed on a DB with a case-sensitive collation. (#152)`
* SebastianPfliegel `Remove sqlExtensionHelp (#312)`
* olljanat for `Implemented npm version check (#314)`
* Adam Machanic for helping with the `whoisactive` extension
* All community localization contributors
* French: Adrien Clerbois, ANAS BELABBES, Antoine Griffard, Arian Papillon, Eric Macarez, Eric Van Thorre, Jérémy LANDON, Matthias GROSPERRIN, Maxime COQUEREL, Olivier Guinart, thierry DEMAN-BARCELÒ, Thomas Potier
* Italian: Aldo Donetti, Alessandro Alpi, Andrea Dottor, Bruni Luca, Gianluca Hotz, Luca Nardi, Luigi Bruno, Marco Dal Pino, Mirco Vanini, Pasquale Ceglie, Riccardo Cappello, Sergio Govoni, Stefano Demiliani
* German: Anna Henke-Gunvaldson, Ben Weissman, David Ullmer, J.M. ., Kai Modo, Konstantin Staschill, Kostja Klein, Lennart Trunk, Markus Ehrenmüller-Jensen, Mascha Kroenlein, Matthias Knoll, Mourad Louha, Thomas Hütter, Wolfgang Straßer
* Spanish: Alberto Poblacion, Andy Gonzalez, Carlos Mendible, Christian Araujo, Daniel D, Eickhel Mendoza, Ernesto Cardenas, Ivan Toledo Ivanovic, Fran Diaz, JESUS GIL, Jorge Serrano Pérez, José Saturnino Pimentel Juárez, Mauricio Hidalgo, Pablo Iglesias, Rikhardo Estrada Rdez, Thierry DEMAN, YOLANDA CUESTA ALTIERI
* Japanese: Fujio Kojima, Kazushi KAMEGAWA, Masayoshi Yamada, Masayuki Ozawa , Seiji Momoto, Takashi Kanai, Takayoshi Tanaka, Yoshihisa Ozaki, 庄垣内治
* Chinese (simplified): DAN YE, Joel Yang, Lynne Dong, RyanYu Zhang, Sheng Jiang, Wei Zhang, Zhiliang Xu
* Chinese (Traditional): Bruce Chen, Chiayi Yen, Kevin Yang, Winnie Lin, 保哥 Will, 謝政廷
* Korean: Do-Kyun Kim, Evelyn Kim, Helen Jung, Hong Jmee, jeongwoo choi, Jun Hyoung Lee, Jungsun Kim정선, Justin Yoo, Kavrith mucha, Kiwoong Youm, MinGyu Ju, MVP_JUNO BEA, Sejun Kim, SOONMAN KWON, sung man ko, Yeongrak Choi, younggun kim, Youngjae Kim, 소영 이
* Russian: Andrey Veselov, Anton Fontanov, Anton Savin, Elena Ostrovskaia, Igor Babichev, Maxim Zelensky, Rodion Fedechkin, Tasha T, Vladimir Zyryanov
* Portuguese Brazil: Daniel de Sousa, Diogo Duarte, Douglas Correa, Douglas Eccker, José Emanuel Mendes, Marcelo Fernandes, Marcondes Alexandre, Roberto Fonseca, Rodrigo Crespi
And of course, we'd like to thank the authors of all upstream dependencies. Please see a full list in the [ThirdPartyNotices.txt](https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/ThirdPartyNotices.txt)
## License


@@ -1,41 +0,0 @@
<!-- BEGIN MICROSOFT SECURITY.MD V0.0.5 BLOCK -->
## Security
Microsoft takes the security of our software products and services seriously, which includes all source code repositories managed through our GitHub organizations, which include [Microsoft](https://github.com/Microsoft), [Azure](https://github.com/Azure), [DotNet](https://github.com/dotnet), [AspNet](https://github.com/aspnet), [Xamarin](https://github.com/xamarin), and [our GitHub organizations](https://opensource.microsoft.com/).
If you believe you have found a security vulnerability in any Microsoft-owned repository that meets [Microsoft's definition of a security vulnerability](https://docs.microsoft.com/en-us/previous-versions/tn-archive/cc751383(v=technet.10)), please report it to us as described below.
## Reporting Security Issues
**Please do not report security vulnerabilities through public GitHub issues.**
Instead, please report them to the Microsoft Security Response Center (MSRC) at [https://msrc.microsoft.com/create-report](https://msrc.microsoft.com/create-report).
If you prefer to submit without logging in, send email to [secure@microsoft.com](mailto:secure@microsoft.com). If possible, encrypt your message with our PGP key; please download it from the [Microsoft Security Response Center PGP Key page](https://www.microsoft.com/en-us/msrc/pgp-key-msrc).
You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Additional information can be found at [microsoft.com/msrc](https://www.microsoft.com/msrc).
Please include the requested information listed below (as much as you can provide) to help us better understand the nature and scope of the possible issue:
* Type of issue (e.g. buffer overflow, SQL injection, cross-site scripting, etc.)
* Full paths of source file(s) related to the manifestation of the issue
* The location of the affected source code (tag/branch/commit or direct URL)
* Any special configuration required to reproduce the issue
* Step-by-step instructions to reproduce the issue
* Proof-of-concept or exploit code (if possible)
* Impact of the issue, including how an attacker might exploit the issue
This information will help us triage your report more quickly.
If you are reporting for a bug bounty, more complete reports can contribute to a higher bounty award. Please visit our [Microsoft Bug Bounty Program](https://microsoft.com/msrc/bounty) page for more details about our active programs.
## Preferred Languages
We prefer all communications to be in English.
## Policy
Microsoft follows the principle of [Coordinated Vulnerability Disclosure](https://www.microsoft.com/en-us/msrc/cvd).
<!-- END MICROSOFT SECURITY.MD BLOCK -->

File diff suppressed because it is too large

appveyor.yml

@@ -0,0 +1,19 @@
environment:
  ELECTRON_RUN_AS_NODE: 1
  VSCODE_BUILD_VERBOSE: true

cache:
  - '%LOCALAPPDATA%\Yarn\cache'

install:
  - ps: Install-Product node 8.9.1 x64

build_script:
  - yarn
  - .\node_modules\.bin\gulp electron
  - npm run compile

test_script:
  - node --version
  - .\scripts\test.bat
  - .\scripts\test-integration.bat


@@ -1,22 +0,0 @@
trigger:
- main
- release/*

jobs:
- job: Windows
  pool:
    vmImage: VS2017-Win2016
  steps:
  - template: build/azure-pipelines/win32/continuous-build-win32.yml

- job: Linux
  pool:
    vmImage: 'Ubuntu-16.04'
  steps:
  - template: build/azure-pipelines/linux/continuous-build-linux.yml

- job: macOS
  pool:
    vmImage: macOS-latest
  steps:
  - template: build/azure-pipelines/darwin/continuous-build-darwin.yml


@@ -1 +0,0 @@
2022-10-06T02:27:18.022Z


@@ -1,3 +0,0 @@
* text eol=lf
*.exe binary
*.dll binary


@@ -1,185 +0,0 @@
# cleanup rules for node modules, .gitignore style
# native node modules
nan/**
*/node_modules/nan/**
fsevents/binding.gyp
fsevents/fsevents.cc
fsevents/build/**
fsevents/src/**
fsevents/test/**
!fsevents/**/*.node
@vscode/sqlite3/binding.gyp
@vscode/sqlite3/benchmark/**
@vscode/sqlite3/cloudformation/**
@vscode/sqlite3/deps/**
@vscode/sqlite3/test/**
@vscode/sqlite3/build/**
@vscode/sqlite3/src/**
!@vscode/sqlite3/build/Release/*.node
windows-mutex/binding.gyp
windows-mutex/build/**
windows-mutex/src/**
!windows-mutex/**/*.node
native-keymap/binding.gyp
native-keymap/build/**
native-keymap/src/**
native-keymap/deps/**
!native-keymap/build/Release/*.node
native-is-elevated/binding.gyp
native-is-elevated/build/**
native-is-elevated/src/**
native-is-elevated/deps/**
!native-is-elevated/build/Release/*.node
native-watchdog/binding.gyp
native-watchdog/build/**
native-watchdog/src/**
!native-watchdog/build/Release/*.node
spdlog/binding.gyp
spdlog/build/**
spdlog/deps/**
spdlog/src/**
spdlog/test/**
spdlog/*.yml
!spdlog/build/Release/*.node
jschardet/dist/**
windows-foreground-love/binding.gyp
windows-foreground-love/build/**
windows-foreground-love/src/**
!windows-foreground-love/**/*.node
windows-process-tree/binding.gyp
windows-process-tree/build/**
windows-process-tree/src/**
!windows-process-tree/**/*.node
keytar/binding.gyp
keytar/build/**
keytar/src/**
keytar/script/**
keytar/node_modules/**
!keytar/**/*.node
node-pty/binding.gyp
node-pty/build/**
node-pty/src/**
node-pty/tools/**
node-pty/deps/**
node-pty/scripts/**
!node-pty/build/Release/*.exe
!node-pty/build/Release/*.dll
!node-pty/build/Release/*.node
# START SQL Modules
@angular/**/src/**
@angular/**/testing/**
angular2-grid/components/**
angular2-grid/directives/**
angular2-grid/interfaces/**
angular2-grid/modules/**
angular2-slickgrid/.vscode/**
angular2-slickgrid/components/**
angular2-slickgrid/examples/**
jquery-ui/external/**
jquery-ui/demos/**
slickgrid/node_modules/**
slickgrid/examples/**
kerberos/build/**
# END SQL Modules
nsfw/binding.gyp
nsfw/build/**
nsfw/src/**
nsfw/includes/**
!nsfw/build/Release/*.node
vscode-nsfw/binding.gyp
vscode-nsfw/build/**
vscode-nsfw/src/**
vscode-nsfw/includes/**
!vscode-nsfw/build/Release/*.node
@parcel/watcher/binding.gyp
@parcel/watcher/build/**
@parcel/watcher/prebuilds/**
@parcel/watcher/src/**
!@parcel/watcher/build/Release/*.node
vsda/build/**
vsda/ci/**
vsda/src/**
vsda/.gitignore
vsda/binding.gyp
vsda/README.md
vsda/targets
!vsda/build/Release/vsda.node
vscode-encrypt/build/**
vscode-encrypt/src/**
vscode-encrypt/vendor/**
vscode-encrypt/.gitignore
vscode-encrypt/binding.gyp
vscode-encrypt/README.md
!vscode-encrypt/build/Release/vscode-encrypt-native.node
vscode-windows-ca-certs/**/*
!vscode-windows-ca-certs/package.json
!vscode-windows-ca-certs/**/*.node
node-addon-api/**/*
# other node modules
**/docs/**
**/example/**
**/examples/**
**/test/**
**/tests/**
**/History.md
**/CHANGELOG.md
**/README.md
**/readme.md
**/readme.markdown
**/*.ts
!typescript/**/*.d.ts
jschardet/dist/**
es6-promise/lib/**
vscode-textmate/webpack.config.js
# {{SQL CARBON EDIT }} We need more than just zone-node.js
# zone.js/dist/**
# !zone.js/dist/zone-node.js
# https://github.com/xtermjs/xterm.js/issues/3137
xterm/src/**
xterm/tsconfig.all.json
# https://github.com/xtermjs/xterm.js/issues/3138
xterm-addon-*/src/**
xterm-addon-*/fixtures/**
xterm-addon-*/out/**
xterm-addon-*/out-test/**


@@ -1,34 +0,0 @@
# cleanup rules for web node modules, .gitignore style
**/*.txt
**/*.json
**/*.md
**/*.d.ts
**/*.js.map
**/LICENSE
**/CONTRIBUTORS
**/docs/**
**/example/**
**/examples/**
jschardet/index.js
jschardet/src/**
jschardet/dist/jschardet.js
vscode-textmate/webpack.config.js
xterm/src/**
xterm-addon-search/src/**
xterm-addon-search/out/**
xterm-addon-search/fixtures/**
xterm-addon-unicode11/src/**
xterm-addon-unicode11/out/**
xterm-addon-webgl/src/**
xterm-addon-webgl/out/**
# This makes sure the model is included in the package
!@vscode/vscode-languagedetection/model/**


@@ -1,17 +0,0 @@
{
"env": {
"commonjs": true,
"es6": true,
"node": true
},
"extends": "eslint:recommended",
"globals": {
"Atomics": "readonly",
"SharedArrayBuffer": "readonly"
},
"parserOptions": {
"ecmaVersion": 2018
},
"rules": {
}
}


@@ -1,2 +0,0 @@
node_modules
*.js.map


@@ -1,6 +0,0 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });


@@ -1,96 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
export interface GitHub {
query(query: Query): AsyncIterableIterator<GitHubIssue[]>
hasWriteAccess(user: User): Promise<boolean>
repoHasLabel(label: string): Promise<boolean>
createLabel(label: string, color: string, description: string): Promise<void>
deleteLabel(label: string): Promise<void>
readConfig(path: string): Promise<any>
createIssue(owner: string, repo: string, title: string, body: string): Promise<void>
releaseContainsCommit(release: string, commit: string): Promise<boolean>
}
export interface GitHubIssue extends GitHub {
getIssue(): Promise<Issue>
postComment(body: string): Promise<void>
deleteComment(id: number): Promise<void>
getComments(last?: boolean): AsyncIterableIterator<Comment[]>
closeIssue(): Promise<void>
lockIssue(): Promise<void>
setMilestone(milestoneId: number): Promise<void>
addLabel(label: string): Promise<void>
removeLabel(label: string): Promise<void>
addAssignee(assignee: string): Promise<void>
getClosingInfo(): Promise<{ hash: string | undefined; timestamp: number } | undefined>
}
type SortVar =
| 'comments'
| 'reactions'
| 'reactions-+1'
| 'reactions--1'
| 'reactions-smile'
| 'reactions-thinking_face'
| 'reactions-heart'
| 'reactions-tada'
| 'interactions'
| 'created'
| 'updated'
type SortOrder = 'asc' | 'desc'
export type Reactions = {
'+1': number
'-1': number
laugh: number
hooray: number
confused: number
heart: number
rocket: number
eyes: number
}
export interface User {
name: string
isGitHubApp?: boolean
}
export interface Comment {
author: User
body: string
id: number
timestamp: number
}
export interface Issue {
author: User
body: string
title: string
labels: string[]
open: boolean
locked: boolean
number: number
numComments: number
reactions: Reactions
milestoneId: number | null
assignee?: string
createdAt: number
updatedAt: number
closedAt?: number
}
export interface Query {
q: string
sort?: SortVar
order?: SortOrder
}
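The interfaces above are the seam between the bot logic and GitHub itself: both the live Octokit wrapper and the in-memory testbed later in this diff implement `GitHub`/`GitHubIssue`. As a rough, hypothetical sketch (not part of the diff; it only assumes the file is importable as `./api`), a task that sweeps stale bugs could be written against nothing but these interfaces:

```typescript
import { GitHub } from './api'

// Marks open bugs as stale when they have not been updated in 30 days.
// `client` can be any implementation of the GitHub interface (OctoKit, Testbed, ...).
async function labelStaleBugs(client: GitHub): Promise<void> {
	const cutoff = Date.now() - 30 * 24 * 60 * 60 * 1000
	// query() yields pages of issues matching a GitHub search query
	for await (const page of client.query({ q: 'is:open label:bug', sort: 'updated', order: 'asc' })) {
		for (const issue of page) {
			const data = await issue.getIssue()
			if (data.updatedAt > cutoff) continue // touched recently, skip
			await issue.addLabel('stale')
			await issue.postComment('This issue has not been updated in 30 days and was marked stale.')
		}
	}
}
```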


@@ -1,293 +0,0 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const core_1 = require("@actions/core");
const github_1 = require("@actions/github");
const child_process_1 = require("child_process");
const utils_1 = require("../utils/utils");
class OctoKit {
constructor(token, params, options = { readonly: false }) {
this.token = token;
this.params = params;
this.options = options;
// when in readonly mode, record labels just-created so as to not throw unnecessary errors
this.mockLabels = new Set();
this.writeAccessCache = {};
this.octokit = new github_1.GitHub(token);
}
async *query(query) {
const q = query.q + ` repo:${this.params.owner}/${this.params.repo}`;
console.log(`Querying for ${q}:`);
const options = this.octokit.search.issuesAndPullRequests.endpoint.merge({
...query,
q,
per_page: 100,
headers: { Accept: 'application/vnd.github.squirrel-girl-preview+json' },
});
let pageNum = 0;
const timeout = async () => {
if (pageNum < 2) {
/* pass */
}
else if (pageNum < 4) {
await new Promise((resolve) => setTimeout(resolve, 3000));
}
else {
await new Promise((resolve) => setTimeout(resolve, 30000));
}
};
for await (const pageResponse of this.octokit.paginate.iterator(options)) {
await timeout();
await utils_1.logRateLimit(this.token);
const page = pageResponse.data;
console.log(`Page ${++pageNum}: ${page.map(({ number }) => number).join(' ')}`);
yield page.map((issue) => new OctoKitIssue(this.token, this.params, this.octokitIssueToIssue(issue)));
}
}
async createIssue(owner, repo, title, body) {
core_1.debug(`Creating issue \`${title}\` on ${owner}/${repo}`);
if (!this.options.readonly)
await this.octokit.issues.create({ owner, repo, title, body });
}
octokitIssueToIssue(issue) {
var _a, _b, _c, _d, _e, _f;
return {
author: { name: issue.user.login, isGitHubApp: issue.user.type === 'Bot' },
body: issue.body,
number: issue.number,
title: issue.title,
labels: issue.labels.map((label) => label.name),
open: issue.state === 'open',
locked: issue.locked,
numComments: issue.comments,
reactions: issue.reactions,
assignee: (_b = (_a = issue.assignee) === null || _a === void 0 ? void 0 : _a.login) !== null && _b !== void 0 ? _b : (_d = (_c = issue.assignees) === null || _c === void 0 ? void 0 : _c[0]) === null || _d === void 0 ? void 0 : _d.login,
milestoneId: (_f = (_e = issue.milestone) === null || _e === void 0 ? void 0 : _e.number) !== null && _f !== void 0 ? _f : null,
createdAt: +new Date(issue.created_at),
updatedAt: +new Date(issue.updated_at),
closedAt: issue.closed_at ? +new Date(issue.closed_at) : undefined,
};
}
async hasWriteAccess(user) {
if (user.name in this.writeAccessCache) {
core_1.debug('Got permissions from cache for ' + user);
return this.writeAccessCache[user.name];
}
core_1.debug('Fetching permissions for ' + user);
const permissions = (await this.octokit.repos.getCollaboratorPermissionLevel({
...this.params,
username: user.name,
})).data.permission;
return (this.writeAccessCache[user.name] = permissions === 'admin' || permissions === 'write');
}
async repoHasLabel(name) {
try {
await this.octokit.issues.getLabel({ ...this.params, name });
return true;
}
catch (err) {
if (err.status === 404) {
return this.options.readonly && this.mockLabels.has(name);
}
throw err;
}
}
async createLabel(name, color, description) {
core_1.debug('Creating label ' + name);
if (!this.options.readonly)
await this.octokit.issues.createLabel({ ...this.params, color, description, name });
else
this.mockLabels.add(name);
}
async deleteLabel(name) {
core_1.debug('Deleting label ' + name);
try {
if (!this.options.readonly)
await this.octokit.issues.deleteLabel({ ...this.params, name });
}
catch (err) {
if (err.status === 404) {
return;
}
throw err;
}
}
async readConfig(path) {
core_1.debug('Reading config at ' + path);
const repoPath = `.github/${path}.json`;
const data = (await this.octokit.repos.getContents({ ...this.params, path: repoPath })).data;
if ('type' in data && data.type === 'file') {
if (data.encoding === 'base64' && data.content) {
return JSON.parse(Buffer.from(data.content, 'base64').toString('utf-8'));
}
throw Error(`Could not read contents "${data.content}" in encoding "${data.encoding}"`);
}
throw Error('Found directory at config path when expecting file' + JSON.stringify(data));
}
async releaseContainsCommit(release, commit) {
if (utils_1.getInput('commitReleasedDebuggingOverride')) {
return true;
}
return new Promise((resolve, reject) => child_process_1.exec(`git -C ./repo merge-base --is-ancestor ${commit} ${release}`, (err) => !err || err.code === 1 ? resolve(!err) : reject(err)));
}
}
exports.OctoKit = OctoKit;
class OctoKitIssue extends OctoKit {
constructor(token, params, issueData, options = { readonly: false }) {
super(token, params, options);
this.params = params;
this.issueData = issueData;
}
async addAssignee(assignee) {
core_1.debug('Adding assignee ' + assignee + ' to ' + this.issueData.number);
if (!this.options.readonly) {
await this.octokit.issues.addAssignees({
...this.params,
issue_number: this.issueData.number,
assignees: [assignee],
});
}
}
async closeIssue() {
core_1.debug('Closing issue ' + this.issueData.number);
if (!this.options.readonly)
await this.octokit.issues.update({
...this.params,
issue_number: this.issueData.number,
state: 'closed',
});
}
async lockIssue() {
core_1.debug('Locking issue ' + this.issueData.number);
if (!this.options.readonly)
await this.octokit.issues.lock({ ...this.params, issue_number: this.issueData.number });
}
async getIssue() {
if (isIssue(this.issueData)) {
core_1.debug('Got issue data from query result ' + this.issueData.number);
return this.issueData;
}
console.log('Fetching issue ' + this.issueData.number);
const issue = (await this.octokit.issues.get({
...this.params,
issue_number: this.issueData.number,
mediaType: { previews: ['squirrel-girl'] },
})).data;
return (this.issueData = this.octokitIssueToIssue(issue));
}
async postComment(body) {
core_1.debug(`Posting comment ${body} on ${this.issueData.number}`);
if (!this.options.readonly)
await this.octokit.issues.createComment({
...this.params,
issue_number: this.issueData.number,
body,
});
}
async deleteComment(id) {
core_1.debug(`Deleting comment ${id} on ${this.issueData.number}`);
if (!this.options.readonly)
await this.octokit.issues.deleteComment({
owner: this.params.owner,
repo: this.params.repo,
comment_id: id,
});
}
async setMilestone(milestoneId) {
core_1.debug(`Setting milestone for ${this.issueData.number} to ${milestoneId}`);
if (!this.options.readonly)
await this.octokit.issues.update({
...this.params,
issue_number: this.issueData.number,
milestone: milestoneId,
});
}
async *getComments(last) {
core_1.debug('Fetching comments for ' + this.issueData.number);
const response = this.octokit.paginate.iterator(this.octokit.issues.listComments.endpoint.merge({
...this.params,
issue_number: this.issueData.number,
per_page: 100,
...(last ? { per_page: 1, page: (await this.getIssue()).numComments } : {}),
}));
for await (const page of response) {
yield page.data.map((comment) => ({
author: { name: comment.user.login, isGitHubApp: comment.user.type === 'Bot' },
body: comment.body,
id: comment.id,
timestamp: +new Date(comment.created_at),
}));
}
}
async addLabel(name) {
core_1.debug(`Adding label ${name} to ${this.issueData.number}`);
if (!(await this.repoHasLabel(name))) {
throw Error(`Action could not execute because label ${name} is not defined.`);
}
if (!this.options.readonly)
await this.octokit.issues.addLabels({
...this.params,
issue_number: this.issueData.number,
labels: [name],
});
}
async removeLabel(name) {
core_1.debug(`Removing label ${name} from ${this.issueData.number}`);
try {
if (!this.options.readonly)
await this.octokit.issues.removeLabel({
...this.params,
issue_number: this.issueData.number,
name,
});
}
catch (err) {
if (err.status === 404) {
console.log(`Label ${name} not found on issue`);
return;
}
throw err;
}
}
async getClosingInfo() {
var _a;
if ((await this.getIssue()).open) {
return;
}
const options = this.octokit.issues.listEventsForTimeline.endpoint.merge({
...this.params,
issue_number: this.issueData.number,
});
let closingCommit;
for await (const event of this.octokit.paginate.iterator(options)) {
const timelineEvents = event.data;
for (const timelineEvent of timelineEvents) {
if (timelineEvent.event === 'closed') {
closingCommit = {
hash: (_a = timelineEvent.commit_id) !== null && _a !== void 0 ? _a : undefined,
timestamp: +new Date(timelineEvent.created_at),
};
}
}
}
console.log(`Got ${closingCommit} as closing commit of ${this.issueData.number}`);
return closingCommit;
}
}
exports.OctoKitIssue = OctoKitIssue;
function isIssue(object) {
const isIssue = 'author' in object &&
'body' in object &&
'title' in object &&
'labels' in object &&
'open' in object &&
'locked' in object &&
'number' in object &&
'numComments' in object &&
'reactions' in object &&
'milestoneId' in object;
return isIssue;
}


@@ -1,336 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import { debug } from '@actions/core'
import { GitHub as GitHubAPI } from '@actions/github'
import { Octokit } from '@octokit/rest'
import { exec } from 'child_process'
import { getInput, logRateLimit } from '../utils/utils'
import { Comment, GitHub, GitHubIssue, Issue, Query, User } from './api'
export class OctoKit implements GitHub {
protected octokit: GitHubAPI
// when in readonly mode, record labels just-created so as to not throw unnecessary errors
protected mockLabels: Set<string> = new Set()
constructor(
private token: string,
protected params: { repo: string; owner: string },
protected options: { readonly: boolean } = { readonly: false },
) {
this.octokit = new GitHubAPI(token)
}
async *query(query: Query): AsyncIterableIterator<GitHubIssue[]> {
const q = query.q + ` repo:${this.params.owner}/${this.params.repo}`
console.log(`Querying for ${q}:`)
const options = this.octokit.search.issuesAndPullRequests.endpoint.merge({
...query,
q,
per_page: 100,
headers: { Accept: 'application/vnd.github.squirrel-girl-preview+json' },
})
let pageNum = 0
const timeout = async () => {
if (pageNum < 2) {
/* pass */
} else if (pageNum < 4) {
await new Promise((resolve) => setTimeout(resolve, 3000))
} else {
await new Promise((resolve) => setTimeout(resolve, 30000))
}
}
for await (const pageResponse of this.octokit.paginate.iterator(options)) {
await timeout()
await logRateLimit(this.token)
const page: Array<Octokit.SearchIssuesAndPullRequestsResponseItemsItem> = pageResponse.data
console.log(`Page ${++pageNum}: ${page.map(({ number }) => number).join(' ')}`)
yield page.map(
(issue) => new OctoKitIssue(this.token, this.params, this.octokitIssueToIssue(issue)),
)
}
}
async createIssue(owner: string, repo: string, title: string, body: string): Promise<void> {
debug(`Creating issue \`${title}\` on ${owner}/${repo}`)
if (!this.options.readonly) await this.octokit.issues.create({ owner, repo, title, body })
}
protected octokitIssueToIssue(
issue: Octokit.IssuesGetResponse | Octokit.SearchIssuesAndPullRequestsResponseItemsItem,
): Issue {
return {
author: { name: issue.user.login, isGitHubApp: issue.user.type === 'Bot' },
body: issue.body,
number: issue.number,
title: issue.title,
labels: (issue.labels as Octokit.IssuesGetLabelResponse[]).map((label) => label.name),
open: issue.state === 'open',
locked: (issue as any).locked,
numComments: issue.comments,
reactions: (issue as any).reactions,
assignee: issue.assignee?.login ?? (issue as any).assignees?.[0]?.login,
milestoneId: issue.milestone?.number ?? null,
createdAt: +new Date(issue.created_at),
updatedAt: +new Date(issue.updated_at),
closedAt: issue.closed_at ? +new Date((issue.closed_at as unknown) as string) : undefined,
}
}
private writeAccessCache: Record<string, boolean> = {}
async hasWriteAccess(user: User): Promise<boolean> {
if (user.name in this.writeAccessCache) {
debug('Got permissions from cache for ' + user)
return this.writeAccessCache[user.name]
}
debug('Fetching permissions for ' + user)
const permissions = (
await this.octokit.repos.getCollaboratorPermissionLevel({
...this.params,
username: user.name,
})
).data.permission
return (this.writeAccessCache[user.name] = permissions === 'admin' || permissions === 'write')
}
async repoHasLabel(name: string): Promise<boolean> {
try {
await this.octokit.issues.getLabel({ ...this.params, name })
return true
} catch (err) {
if (err.status === 404) {
return this.options.readonly && this.mockLabels.has(name)
}
throw err
}
}
async createLabel(name: string, color: string, description: string): Promise<void> {
debug('Creating label ' + name)
if (!this.options.readonly)
await this.octokit.issues.createLabel({ ...this.params, color, description, name })
else this.mockLabels.add(name)
}
async deleteLabel(name: string): Promise<void> {
debug('Deleting label ' + name)
try {
if (!this.options.readonly) await this.octokit.issues.deleteLabel({ ...this.params, name })
} catch (err) {
if (err.status === 404) {
return
}
throw err
}
}
async readConfig(path: string): Promise<any> {
debug('Reading config at ' + path)
const repoPath = `.github/${path}.json`
const data = (await this.octokit.repos.getContents({ ...this.params, path: repoPath })).data
if ('type' in data && data.type === 'file') {
if (data.encoding === 'base64' && data.content) {
return JSON.parse(Buffer.from(data.content, 'base64').toString('utf-8'))
}
throw Error(`Could not read contents "${data.content}" in encoding "${data.encoding}"`)
}
throw Error('Found directory at config path when expecting file' + JSON.stringify(data))
}
async releaseContainsCommit(release: string, commit: string): Promise<boolean> {
if (getInput('commitReleasedDebuggingOverride')) {
return true
}
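// `git merge-base --is-ancestor` exits with 0 when `commit` is an ancestor of `release`
// and with 1 when it is not; both resolve the promise below, while any other git failure rejects it.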
return new Promise((resolve, reject) =>
exec(`git -C ./repo merge-base --is-ancestor ${commit} ${release}`, (err) =>
!err || err.code === 1 ? resolve(!err) : reject(err),
),
)
}
}
export class OctoKitIssue extends OctoKit implements GitHubIssue {
constructor(
token: string,
protected params: { repo: string; owner: string },
private issueData: { number: number } | Issue,
options: { readonly: boolean } = { readonly: false },
) {
super(token, params, options)
}
async addAssignee(assignee: string): Promise<void> {
debug('Adding assignee ' + assignee + ' to ' + this.issueData.number)
if (!this.options.readonly) {
await this.octokit.issues.addAssignees({
...this.params,
issue_number: this.issueData.number,
assignees: [assignee],
})
}
}
async closeIssue(): Promise<void> {
debug('Closing issue ' + this.issueData.number)
if (!this.options.readonly)
await this.octokit.issues.update({
...this.params,
issue_number: this.issueData.number,
state: 'closed',
})
}
async lockIssue(): Promise<void> {
debug('Locking issue ' + this.issueData.number)
if (!this.options.readonly)
await this.octokit.issues.lock({ ...this.params, issue_number: this.issueData.number })
}
async getIssue(): Promise<Issue> {
if (isIssue(this.issueData)) {
debug('Got issue data from query result ' + this.issueData.number)
return this.issueData
}
console.log('Fetching issue ' + this.issueData.number)
const issue = (
await this.octokit.issues.get({
...this.params,
issue_number: this.issueData.number,
mediaType: { previews: ['squirrel-girl'] },
})
).data
return (this.issueData = this.octokitIssueToIssue(issue))
}
async postComment(body: string): Promise<void> {
debug(`Posting comment ${body} on ${this.issueData.number}`)
if (!this.options.readonly)
await this.octokit.issues.createComment({
...this.params,
issue_number: this.issueData.number,
body,
})
}
async deleteComment(id: number): Promise<void> {
debug(`Deleting comment ${id} on ${this.issueData.number}`)
if (!this.options.readonly)
await this.octokit.issues.deleteComment({
owner: this.params.owner,
repo: this.params.repo,
comment_id: id,
})
}
async setMilestone(milestoneId: number) {
debug(`Setting milestone for ${this.issueData.number} to ${milestoneId}`)
if (!this.options.readonly)
await this.octokit.issues.update({
...this.params,
issue_number: this.issueData.number,
milestone: milestoneId,
})
}
async *getComments(last?: boolean): AsyncIterableIterator<Comment[]> {
debug('Fetching comments for ' + this.issueData.number)
const response = this.octokit.paginate.iterator(
this.octokit.issues.listComments.endpoint.merge({
...this.params,
issue_number: this.issueData.number,
per_page: 100,
...(last ? { per_page: 1, page: (await this.getIssue()).numComments } : {}),
}),
)
for await (const page of response) {
yield (page.data as Octokit.IssuesListCommentsResponseItem[]).map((comment) => ({
author: { name: comment.user.login, isGitHubApp: comment.user.type === 'Bot' },
body: comment.body,
id: comment.id,
timestamp: +new Date(comment.created_at),
}))
}
}
async addLabel(name: string): Promise<void> {
debug(`Adding label ${name} to ${this.issueData.number}`)
if (!(await this.repoHasLabel(name))) {
throw Error(`Action could not execute because label ${name} is not defined.`)
}
if (!this.options.readonly)
await this.octokit.issues.addLabels({
...this.params,
issue_number: this.issueData.number,
labels: [name],
})
}
async removeLabel(name: string): Promise<void> {
debug(`Removing label ${name} from ${this.issueData.number}`)
try {
if (!this.options.readonly)
await this.octokit.issues.removeLabel({
...this.params,
issue_number: this.issueData.number,
name,
})
} catch (err) {
if (err.status === 404) {
console.log(`Label ${name} not found on issue`)
return
}
throw err
}
}
async getClosingInfo(): Promise<{ hash: string | undefined; timestamp: number } | undefined> {
if ((await this.getIssue()).open) {
return
}
const options = this.octokit.issues.listEventsForTimeline.endpoint.merge({
...this.params,
issue_number: this.issueData.number,
})
let closingCommit: { hash: string | undefined; timestamp: number } | undefined
for await (const event of this.octokit.paginate.iterator(options)) {
const timelineEvents = event.data as Octokit.IssuesListEventsForTimelineResponseItem[]
for (const timelineEvent of timelineEvents) {
if (timelineEvent.event === 'closed') {
closingCommit = {
hash: timelineEvent.commit_id ?? undefined,
timestamp: +new Date(timelineEvent.created_at),
}
}
}
}
console.log(`Got ${closingCommit} as closing commit of ${this.issueData.number}`)
return closingCommit
}
}
function isIssue(object: any): object is Issue {
const isIssue =
'author' in object &&
'body' in object &&
'title' in object &&
'labels' in object &&
'open' in object &&
'locked' in object &&
'number' in object &&
'numComments' in object &&
'reactions' in object &&
'milestoneId' in object
return isIssue
}
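Both classes honour the `readonly` option, which turns every write into a no-op while `mockLabels` keeps later `repoHasLabel()` checks consistent. A minimal dry-run sketch (hypothetical usage, not part of the diff; the owner, repo and label names are placeholders):

```typescript
import { OctoKit } from './octokit'

// Preview a label setup without mutating the repository.
async function previewLabelSetup(token: string): Promise<void> {
	const gh = new OctoKit(token, { owner: 'microsoft', repo: 'azuredatastudio' }, { readonly: true })
	await gh.createLabel('triage-needed', 'd73a4a', 'Issue needs triage') // recorded in mockLabels only
	console.log(await gh.repoHasLabel('triage-needed')) // true, even though nothing was created upstream
}
```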


@@ -1,123 +0,0 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
class Testbed {
constructor(config) {
var _a, _b, _c, _d, _e;
this.config = {
globalLabels: (_a = config === null || config === void 0 ? void 0 : config.globalLabels) !== null && _a !== void 0 ? _a : [],
configs: (_b = config === null || config === void 0 ? void 0 : config.configs) !== null && _b !== void 0 ? _b : {},
writers: (_c = config === null || config === void 0 ? void 0 : config.writers) !== null && _c !== void 0 ? _c : [],
releasedCommits: (_d = config === null || config === void 0 ? void 0 : config.releasedCommits) !== null && _d !== void 0 ? _d : [],
queryRunner: (_e = config === null || config === void 0 ? void 0 : config.queryRunner) !== null && _e !== void 0 ? _e : async function* () {
yield [];
},
};
}
async *query(query) {
for await (const page of this.config.queryRunner(query)) {
yield page.map((issue) => issue instanceof TestbedIssue ? issue : new TestbedIssue(this.config, issue));
}
}
async createIssue(_owner, _repo, _title, _body) {
// pass...
}
async readConfig(path) {
return JSON.parse(JSON.stringify(this.config.configs[path]));
}
async hasWriteAccess(user) {
return this.config.writers.includes(user.name);
}
async repoHasLabel(label) {
return this.config.globalLabels.includes(label);
}
async createLabel(label, _color, _description) {
this.config.globalLabels.push(label);
}
async deleteLabel(labelToDelete) {
this.config.globalLabels = this.config.globalLabels.filter((label) => label !== labelToDelete);
}
async releaseContainsCommit(_release, commit) {
return this.config.releasedCommits.includes(commit);
}
}
exports.Testbed = Testbed;
class TestbedIssue extends Testbed {
constructor(globalConfig, issueConfig) {
var _a, _b, _c;
super(globalConfig);
issueConfig = issueConfig !== null && issueConfig !== void 0 ? issueConfig : {};
issueConfig.comments = (_a = issueConfig === null || issueConfig === void 0 ? void 0 : issueConfig.comments) !== null && _a !== void 0 ? _a : [];
issueConfig.labels = (_b = issueConfig === null || issueConfig === void 0 ? void 0 : issueConfig.labels) !== null && _b !== void 0 ? _b : [];
issueConfig.issue = {
author: { name: 'JacksonKearl' },
body: 'issue body',
locked: false,
numComments: ((_c = issueConfig === null || issueConfig === void 0 ? void 0 : issueConfig.comments) === null || _c === void 0 ? void 0 : _c.length) || 0,
number: 1,
open: true,
title: 'issue title',
assignee: undefined,
reactions: {
'+1': 0,
'-1': 0,
confused: 0,
eyes: 0,
heart: 0,
hooray: 0,
laugh: 0,
rocket: 0,
},
closedAt: undefined,
createdAt: +new Date(),
updatedAt: +new Date(),
...issueConfig.issue,
};
this.issueConfig = issueConfig;
}
async addAssignee(assignee) {
this.issueConfig.issue.assignee = assignee;
}
async setMilestone(milestoneId) {
this.issueConfig.issue.milestoneId = milestoneId;
}
async getIssue() {
const labels = [...this.issueConfig.labels];
return { ...this.issueConfig.issue, labels };
}
async postComment(body, author) {
this.issueConfig.comments.push({
author: { name: author !== null && author !== void 0 ? author : 'bot' },
body,
id: Math.random(),
timestamp: +new Date(),
});
}
async deleteComment(id) {
this.issueConfig.comments = this.issueConfig.comments.filter((comment) => comment.id !== id);
}
async *getComments(last) {
yield last
? [this.issueConfig.comments[this.issueConfig.comments.length - 1]]
: this.issueConfig.comments;
}
async addLabel(label) {
this.issueConfig.labels.push(label);
}
async removeLabel(labelToDelete) {
this.issueConfig.labels = this.issueConfig.labels.filter((label) => label !== labelToDelete);
}
async closeIssue() {
this.issueConfig.issue.open = false;
}
async lockIssue() {
this.issueConfig.issue.locked = true;
}
async getClosingInfo() {
return this.issueConfig.closingCommit;
}
}
exports.TestbedIssue = TestbedIssue;


@@ -1,170 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import { Comment, GitHub, GitHubIssue, Issue, Query, User } from './api'
type TestbedConfig = {
globalLabels: string[]
configs: Record<string, any>
writers: string[]
releasedCommits: string[]
queryRunner: (query: Query) => AsyncIterableIterator<(TestbedIssueConstructorArgs | TestbedIssue)[]>
}
export type TestbedConstructorArgs = Partial<TestbedConfig>
export class Testbed implements GitHub {
public config: TestbedConfig
constructor(config?: TestbedConstructorArgs) {
this.config = {
globalLabels: config?.globalLabels ?? [],
configs: config?.configs ?? {},
writers: config?.writers ?? [],
releasedCommits: config?.releasedCommits ?? [],
queryRunner:
config?.queryRunner ??
async function* () {
yield []
},
}
}
async *query(query: Query): AsyncIterableIterator<GitHubIssue[]> {
for await (const page of this.config.queryRunner(query)) {
yield page.map((issue) =>
issue instanceof TestbedIssue ? issue : new TestbedIssue(this.config, issue),
)
}
}
async createIssue(_owner: string, _repo: string, _title: string, _body: string): Promise<void> {
// pass...
}
async readConfig(path: string): Promise<any> {
return JSON.parse(JSON.stringify(this.config.configs[path]))
}
async hasWriteAccess(user: User): Promise<boolean> {
return this.config.writers.includes(user.name)
}
async repoHasLabel(label: string): Promise<boolean> {
return this.config.globalLabels.includes(label)
}
async createLabel(label: string, _color: string, _description: string): Promise<void> {
this.config.globalLabels.push(label)
}
async deleteLabel(labelToDelete: string): Promise<void> {
this.config.globalLabels = this.config.globalLabels.filter((label) => label !== labelToDelete)
}
async releaseContainsCommit(_release: string, commit: string): Promise<boolean> {
return this.config.releasedCommits.includes(commit)
}
}
type TestbedIssueConfig = {
issue: Omit<Issue, 'labels'>
comments: Comment[]
labels: string[]
closingCommit: { hash: string | undefined; timestamp: number } | undefined
}
export type TestbedIssueConstructorArgs = Partial<Omit<TestbedIssueConfig, 'issue'>> & {
issue?: Partial<Omit<Issue, 'labels'>>
}
export class TestbedIssue extends Testbed implements GitHubIssue {
public issueConfig: TestbedIssueConfig
constructor(globalConfig?: TestbedConstructorArgs, issueConfig?: TestbedIssueConstructorArgs) {
super(globalConfig)
issueConfig = issueConfig ?? {}
issueConfig.comments = issueConfig?.comments ?? []
issueConfig.labels = issueConfig?.labels ?? []
issueConfig.issue = {
author: { name: 'JacksonKearl' },
body: 'issue body',
locked: false,
numComments: issueConfig?.comments?.length || 0,
number: 1,
open: true,
title: 'issue title',
assignee: undefined,
reactions: {
'+1': 0,
'-1': 0,
confused: 0,
eyes: 0,
heart: 0,
hooray: 0,
laugh: 0,
rocket: 0,
},
closedAt: undefined,
createdAt: +new Date(),
updatedAt: +new Date(),
...issueConfig.issue,
}
this.issueConfig = issueConfig as TestbedIssueConfig
}
async addAssignee(assignee: string): Promise<void> {
this.issueConfig.issue.assignee = assignee
}
async setMilestone(milestoneId: number): Promise<void> {
this.issueConfig.issue.milestoneId = milestoneId
}
async getIssue(): Promise<Issue> {
const labels = [...this.issueConfig.labels]
return { ...this.issueConfig.issue, labels }
}
async postComment(body: string, author?: string): Promise<void> {
this.issueConfig.comments.push({
author: { name: author ?? 'bot' },
body,
id: Math.random(),
timestamp: +new Date(),
})
}
async deleteComment(id: number): Promise<void> {
this.issueConfig.comments = this.issueConfig.comments.filter((comment) => comment.id !== id)
}
async *getComments(last?: boolean): AsyncIterableIterator<Comment[]> {
yield last
? [this.issueConfig.comments[this.issueConfig.comments.length - 1]]
: this.issueConfig.comments
}
async addLabel(label: string): Promise<void> {
this.issueConfig.labels.push(label)
}
async removeLabel(labelToDelete: string): Promise<void> {
this.issueConfig.labels = this.issueConfig.labels.filter((label) => label !== labelToDelete)
}
async closeIssue(): Promise<void> {
this.issueConfig.issue.open = false
}
async lockIssue(): Promise<void> {
this.issueConfig.issue.locked = true
}
async getClosingInfo(): Promise<{ hash: string | undefined; timestamp: number } | undefined> {
return this.issueConfig.closingCommit
}
}
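The testbed lets the actions be unit-tested without any network access. A sketch of how it might be exercised under the repository's mocha setup (hypothetical test, not part of the diff; only Node's built-in `assert` is assumed):

```typescript
import * as assert from 'assert'
import { TestbedIssue } from './testbed'

describe('TestbedIssue', () => {
	it('records labels and comments in memory', async () => {
		const issue = new TestbedIssue({}, { issue: { title: 'Sample crash report' } })
		await issue.addLabel('bug')
		await issue.postComment('Thanks for the report!')

		assert.deepStrictEqual((await issue.getIssue()).labels, ['bug'])
		for await (const comments of issue.getComments(true)) {
			assert.strictEqual(comments[0].body, 'Thanks for the report!')
		}
	})
})
```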


@@ -1,12 +0,0 @@
name: 'PR Labeler'
description: 'Automatically add a Label to a PR'
inputs:
  token:
    description: GitHub token with issue, comment, and label read/write permissions
    default: ${{ github.token }}
  label:
    description: Github label to add to the PR
    required: true
runs:
  using: 'node12'
  main: 'index.js'


@@ -1,22 +0,0 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const core = require("@actions/core");
const github_1 = require("@actions/github");
const octokit_1 = require("../api/octokit");
const utils_1 = require("../utils/utils");
const token = utils_1.getRequiredInput('token');
const label = utils_1.getRequiredInput('label');
async function main() {
const pr = new octokit_1.OctoKitIssue(token, github_1.context.repo, { number: github_1.context.issue.number });
pr.addLabel(label);
}
main()
.then(() => utils_1.logRateLimit(token))
.catch(async (error) => {
core.setFailed(error.message);
await utils_1.logErrorToIssue(error.message, true, token);
});


@@ -1,26 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as core from '@actions/core'
import { context } from '@actions/github'
import { OctoKitIssue } from '../api/octokit'
import { getRequiredInput, logErrorToIssue, logRateLimit } from '../utils/utils'
const token = getRequiredInput('token');
const label = getRequiredInput('label');
async function main() {
const pr = new OctoKitIssue(token, context.repo, { number: context.issue.number });
pr.addLabel(label);
}
main()
.then(() => logRateLimit(token))
.catch(async (error) => {
core.setFailed(error.message)
await logErrorToIssue(error.message, true, token)
})


@@ -1,24 +0,0 @@
{
"name": "github-actions",
"version": "1.0.0",
"description": "GitHub Actions",
"scripts": {
"test": "mocha -r ts-node/register **/*.test.ts",
"build": "tsc -p ./tsconfig.json",
"lint": "eslint -c .eslintrc --fix --ext .ts .",
"watch-typecheck": "tsc --watch"
},
"repository": {
"type": "git",
"url": "git+https://github.com/microsoft/azuredatastudio.git"
},
"keywords": [],
"author": "",
"dependencies": {
"@actions/core": "^1.2.6",
"@actions/github": "^2.1.1",
"axios": "^0.21.4",
"ts-node": "^8.6.2",
"typescript": "^3.8.3"
}
}


@@ -1,19 +0,0 @@
{
"compilerOptions": {
"target": "es2019",
"strict": true,
"module": "commonjs",
"moduleResolution": "node",
"removeComments": false,
"resolveJsonModule": true,
"lib": [
"es2020"
],
},
"include": [
"./**/*.ts"
],
"exclude": [
"node_modules"
]
}


@@ -1,72 +0,0 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const core = require("@actions/core");
const github_1 = require("@actions/github");
const axios_1 = require("axios");
const octokit_1 = require("../api/octokit");
exports.getInput = (name) => core.getInput(name) || undefined;
exports.getRequiredInput = (name) => core.getInput(name, { required: true });
exports.normalizeIssue = (issue) => {
const { body, title } = issue;
const isBug = body.includes('bug_report_template') || /Issue Type:.*Bug.*/.test(body);
const isFeatureRequest = body.includes('feature_request_template') || /Issue Type:.*Feature Request.*/.test(body);
const cleanse = (str) => str
.toLowerCase()
.replace(/<!--.*?-->/gu, '')
.replace(/.* version: .*/gu, '')
.replace(/issue type: .*/gu, '')
.replace(/<details>(.|\s)*?<\/details>/gu, '')
.replace(/vs ?code/gu, '')
.replace(/we have written.*please paste./gu, '')
.replace(/steps to reproduce:/gu, '')
.replace(/does this issue occur when all extensions are disabled.*/gu, '')
.replace(/```(.|\s)*?```/gu, '')
.replace(/!?\[.*?\]\(.*?\)/gu, '')
.replace(/\s+/gu, ' ');
return {
body: cleanse(body),
title: cleanse(title),
issueType: isBug ? 'bug' : isFeatureRequest ? 'feature_request' : 'unknown',
};
};
exports.loadLatestRelease = async (quality) => (await axios_1.default.get(`https://vscode-update.azurewebsites.net/api/update/darwin/${quality}/latest`)).data;
exports.daysAgoToTimestamp = (days) => +new Date(Date.now() - days * 24 * 60 * 60 * 1000);
exports.daysAgoToHumanReadbleDate = (days) => new Date(Date.now() - days * 24 * 60 * 60 * 1000).toISOString().replace(/\.\d{3}\w$/, '');
exports.logRateLimit = async (token) => {
const usageData = (await new github_1.GitHub(token).rateLimit.get()).data.resources;
['core', 'graphql', 'search'].forEach(async (category) => {
const usage = 1 - usageData[category].remaining / usageData[category].limit;
const message = `Usage at ${usage} for ${category}`;
if (usage > 0) {
console.log(message);
}
if (usage > 0.5) {
await exports.logErrorToIssue(message, false, token);
}
});
};
exports.logErrorToIssue = async (message, ping, token) => {
// Attempt to wait out abuse detection timeout if present
await new Promise((resolve) => setTimeout(resolve, 10000));
const dest = github_1.context.repo.repo === 'vscode-internalbacklog'
? { repo: 'vscode-internalbacklog', issue: 974 }
: { repo: 'vscode', issue: 93814 };
return new octokit_1.OctoKitIssue(token, { owner: 'Microsoft', repo: dest.repo }, { number: dest.issue })
.postComment(`
Workflow: ${github_1.context.workflow}
Error: ${message}
Issue: ${ping ? `${github_1.context.repo.owner}/${github_1.context.repo.repo}#` : ''}${github_1.context.issue.number}
Repo: ${github_1.context.repo.owner}/${github_1.context.repo.repo}
<!-- Context:
${JSON.stringify(github_1.context, null, 2).replace(/<!--/gu, '<@--').replace(/-->/gu, '--@>')}
-->
`);
};


@@ -1,95 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as core from '@actions/core'
import { context, GitHub } from '@actions/github'
import axios from 'axios'
import { OctoKitIssue } from '../api/octokit'
import { Issue } from '../api/api'
export const getInput = (name: string) => core.getInput(name) || undefined
export const getRequiredInput = (name: string) => core.getInput(name, { required: true })
export const normalizeIssue = (
issue: Issue,
): { body: string; title: string; issueType: 'bug' | 'feature_request' | 'unknown' } => {
const { body, title } = issue
const isBug = body.includes('bug_report_template') || /Issue Type:.*Bug.*/.test(body)
const isFeatureRequest =
body.includes('feature_request_template') || /Issue Type:.*Feature Request.*/.test(body)
const cleanse = (str: string) =>
str
.toLowerCase()
.replace(/<!--.*?-->/gu, '')
.replace(/.* version: .*/gu, '')
.replace(/issue type: .*/gu, '')
.replace(/<details>(.|\s)*?<\/details>/gu, '')
.replace(/vs ?code/gu, '')
.replace(/we have written.*please paste./gu, '')
.replace(/steps to reproduce:/gu, '')
.replace(/does this issue occur when all extensions are disabled.*/gu, '')
.replace(/```(.|\s)*?```/gu, '')
.replace(/!?\[.*?\]\(.*?\)/gu, '')
.replace(/\s+/gu, ' ')
return {
body: cleanse(body),
title: cleanse(title),
issueType: isBug ? 'bug' : isFeatureRequest ? 'feature_request' : 'unknown',
}
}
export interface Release {
productVersion: string
timestamp: number
version: string
}
export const loadLatestRelease = async (quality: 'stable' | 'insider'): Promise<Release | undefined> =>
(await axios.get(`https://vscode-update.azurewebsites.net/api/update/darwin/${quality}/latest`)).data
export const daysAgoToTimestamp = (days: number): number => +new Date(Date.now() - days * 24 * 60 * 60 * 1000)
export const daysAgoToHumanReadbleDate = (days: number) =>
new Date(Date.now() - days * 24 * 60 * 60 * 1000).toISOString().replace(/\.\d{3}\w$/, '')
export const logRateLimit = async (token: string) => {
const usageData = (await new GitHub(token).rateLimit.get()).data.resources
;(['core', 'graphql', 'search'] as const).forEach(async (category) => {
const usage = 1 - usageData[category].remaining / usageData[category].limit
const message = `Usage at ${usage} for ${category}`
if (usage > 0) {
console.log(message)
}
if (usage > 0.5) {
await logErrorToIssue(message, false, token)
}
})
}
export const logErrorToIssue = async (message: string, ping: boolean, token: string): Promise<void> => {
// Attempt to wait out abuse detection timeout if present
await new Promise((resolve) => setTimeout(resolve, 10000))
const dest =
context.repo.repo === 'vscode-internalbacklog'
? { repo: 'vscode-internalbacklog', issue: 974 }
: { repo: 'vscode', issue: 93814 }
return new OctoKitIssue(token, { owner: 'Microsoft', repo: dest.repo }, { number: dest.issue })
.postComment(`
Workflow: ${context.workflow}
Error: ${message}
Issue: ${ping ? `${context.repo.owner}/${context.repo.repo}#` : ''}${context.issue.number}
Repo: ${context.repo.owner}/${context.repo.repo}
<!-- Context:
${JSON.stringify(context, null, 2).replace(/<!--/gu, '<@--').replace(/-->/gu, '--@>')}
-->
`)
}
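The logRateLimit helper above reports how much of each GitHub rate-limit bucket has been consumed as the fraction 1 - remaining / limit, logs anything above zero, and only escalates to an issue comment past 50%. A minimal, self-contained sketch of just that threshold logic follows; the sample numbers are invented for illustration, and in the real helper the escalation path is logErrorToIssue rather than a console message.

// Stand-alone illustration of the usage math in logRateLimit.
// The shape mirrors the `resources` object of GitHub's rate-limit API;
// the numbers below are made up for the example.
interface RateLimitBucket {
	limit: number;
	remaining: number;
}

const sampleResources: Record<'core' | 'graphql' | 'search', RateLimitBucket> = {
	core: { limit: 5000, remaining: 1200 },    // 76% used -> logged and escalated
	graphql: { limit: 5000, remaining: 4800 }, // 4% used  -> logged only
	search: { limit: 30, remaining: 30 },      // 0% used  -> nothing happens
};

for (const category of ['core', 'graphql', 'search'] as const) {
	const { limit, remaining } = sampleResources[category];
	const usage = 1 - remaining / limit;
	if (usage > 0) {
		console.log(`Usage at ${usage} for ${category}`);
	}
	if (usage > 0.5) {
		// In the real helper this is where logErrorToIssue posts a comment.
		console.log(`-> would escalate: ${category} is past the 50% threshold`);
	}
}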

View File

@@ -1,436 +0,0 @@
# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
# yarn lockfile v1
"@actions/core@^1.2.6":
version "1.9.1"
resolved "https://registry.yarnpkg.com/@actions/core/-/core-1.9.1.tgz#97c0201b1f9856df4f7c3a375cdcdb0c2a2f750b"
integrity sha512-5ad+U2YGrmmiw6du20AQW5XuWo7UKN2052FjSV7MX+Wfjf8sCqcsZe62NfgHys4QI4/Y+vQvLKYL8jWtA1ZBTA==
dependencies:
"@actions/http-client" "^2.0.1"
uuid "^8.3.2"
"@actions/github@^2.1.1":
version "2.1.1"
resolved "https://registry.yarnpkg.com/@actions/github/-/github-2.1.1.tgz#bcabedff598196d953f58ba750d5e75549a75142"
integrity sha512-kAgTGUx7yf5KQCndVeHSwCNZuDBvPyxm5xKTswW2lofugeuC1AZX73nUUVDNaysnM9aKFMHv9YCdVJbg7syEyA==
dependencies:
"@actions/http-client" "^1.0.3"
"@octokit/graphql" "^4.3.1"
"@octokit/rest" "^16.43.1"
"@actions/http-client@^1.0.3":
version "1.0.8"
resolved "https://registry.yarnpkg.com/@actions/http-client/-/http-client-1.0.8.tgz#8bd76e8eca89dc8bcf619aa128eba85f7a39af45"
integrity sha512-G4JjJ6f9Hb3Zvejj+ewLLKLf99ZC+9v+yCxoYf9vSyH+WkzPLB2LuUtRMGNkooMqdugGBFStIKXOuvH1W+EctA==
dependencies:
tunnel "0.0.6"
"@actions/http-client@^2.0.1":
version "2.0.1"
resolved "https://registry.yarnpkg.com/@actions/http-client/-/http-client-2.0.1.tgz#873f4ca98fe32f6839462a6f046332677322f99c"
integrity sha512-PIXiMVtz6VvyaRsGY268qvj57hXQEpsYogYOu2nrQhlf+XCGmZstmuZBbAybUl1nQGnvS1k1eEsQ69ZoD7xlSw==
dependencies:
tunnel "^0.0.6"
"@octokit/auth-token@^2.4.0":
version "2.4.0"
resolved "https://registry.yarnpkg.com/@octokit/auth-token/-/auth-token-2.4.0.tgz#b64178975218b99e4dfe948253f0673cbbb59d9f"
integrity sha512-eoOVMjILna7FVQf96iWc3+ZtE/ZT6y8ob8ZzcqKY1ibSQCnu4O/B7pJvzMx5cyZ/RjAff6DAdEb0O0Cjcxidkg==
dependencies:
"@octokit/types" "^2.0.0"
"@octokit/endpoint@^6.0.1":
version "6.0.1"
resolved "https://registry.yarnpkg.com/@octokit/endpoint/-/endpoint-6.0.1.tgz#16d5c0e7a83e3a644d1ddbe8cded6c3d038d31d7"
integrity sha512-pOPHaSz57SFT/m3R5P8MUu4wLPszokn5pXcB/pzavLTQf2jbU+6iayTvzaY6/BiotuRS0qyEUkx3QglT4U958A==
dependencies:
"@octokit/types" "^2.11.1"
is-plain-object "^3.0.0"
universal-user-agent "^5.0.0"
"@octokit/graphql@^4.3.1":
version "4.3.1"
resolved "https://registry.yarnpkg.com/@octokit/graphql/-/graphql-4.3.1.tgz#9ee840e04ed2906c7d6763807632de84cdecf418"
integrity sha512-hCdTjfvrK+ilU2keAdqNBWOk+gm1kai1ZcdjRfB30oA3/T6n53UVJb7w0L5cR3/rhU91xT3HSqCd+qbvH06yxA==
dependencies:
"@octokit/request" "^5.3.0"
"@octokit/types" "^2.0.0"
universal-user-agent "^4.0.0"
"@octokit/plugin-paginate-rest@^1.1.1":
version "1.1.2"
resolved "https://registry.yarnpkg.com/@octokit/plugin-paginate-rest/-/plugin-paginate-rest-1.1.2.tgz#004170acf8c2be535aba26727867d692f7b488fc"
integrity sha512-jbsSoi5Q1pj63sC16XIUboklNw+8tL9VOnJsWycWYR78TKss5PVpIPb1TUUcMQ+bBh7cY579cVAWmf5qG+dw+Q==
dependencies:
"@octokit/types" "^2.0.1"
"@octokit/plugin-request-log@^1.0.0":
version "1.0.0"
resolved "https://registry.yarnpkg.com/@octokit/plugin-request-log/-/plugin-request-log-1.0.0.tgz#eef87a431300f6148c39a7f75f8cfeb218b2547e"
integrity sha512-ywoxP68aOT3zHCLgWZgwUJatiENeHE7xJzYjfz8WI0goynp96wETBF+d95b8g/uL4QmS6owPVlaxiz3wyMAzcw==
"@octokit/plugin-rest-endpoint-methods@2.4.0":
version "2.4.0"
resolved "https://registry.yarnpkg.com/@octokit/plugin-rest-endpoint-methods/-/plugin-rest-endpoint-methods-2.4.0.tgz#3288ecf5481f68c494dd0602fc15407a59faf61e"
integrity sha512-EZi/AWhtkdfAYi01obpX0DF7U6b1VRr30QNQ5xSFPITMdLSfhcBqjamE3F+sKcxPbD7eZuMHu3Qkk2V+JGxBDQ==
dependencies:
"@octokit/types" "^2.0.1"
deprecation "^2.3.1"
"@octokit/request-error@^1.0.2":
version "1.2.1"
resolved "https://registry.yarnpkg.com/@octokit/request-error/-/request-error-1.2.1.tgz#ede0714c773f32347576c25649dc013ae6b31801"
integrity sha512-+6yDyk1EES6WK+l3viRDElw96MvwfJxCt45GvmjDUKWjYIb3PJZQkq3i46TwGwoPD4h8NmTrENmtyA1FwbmhRA==
dependencies:
"@octokit/types" "^2.0.0"
deprecation "^2.0.0"
once "^1.4.0"
"@octokit/request-error@^2.0.0":
version "2.0.0"
resolved "https://registry.yarnpkg.com/@octokit/request-error/-/request-error-2.0.0.tgz#94ca7293373654400fbb2995f377f9473e00834b"
integrity sha512-rtYicB4Absc60rUv74Rjpzek84UbVHGHJRu4fNVlZ1mCcyUPPuzFfG9Rn6sjHrd95DEsmjSt1Axlc699ZlbDkw==
dependencies:
"@octokit/types" "^2.0.0"
deprecation "^2.0.0"
once "^1.4.0"
"@octokit/request@^5.2.0", "@octokit/request@^5.3.0":
version "5.4.2"
resolved "https://registry.yarnpkg.com/@octokit/request/-/request-5.4.2.tgz#74f8e5bbd39dc738a1b127629791f8ad1b3193ee"
integrity sha512-zKdnGuQ2TQ2vFk9VU8awFT4+EYf92Z/v3OlzRaSh4RIP0H6cvW1BFPXq4XYvNez+TPQjqN+0uSkCYnMFFhcFrw==
dependencies:
"@octokit/endpoint" "^6.0.1"
"@octokit/request-error" "^2.0.0"
"@octokit/types" "^2.11.1"
deprecation "^2.0.0"
is-plain-object "^3.0.0"
node-fetch "^2.3.0"
once "^1.4.0"
universal-user-agent "^5.0.0"
"@octokit/rest@^16.43.1":
version "16.43.1"
resolved "https://registry.yarnpkg.com/@octokit/rest/-/rest-16.43.1.tgz#3b11e7d1b1ac2bbeeb23b08a17df0b20947eda6b"
integrity sha512-gfFKwRT/wFxq5qlNjnW2dh+qh74XgTQ2B179UX5K1HYCluioWj8Ndbgqw2PVqa1NnVJkGHp2ovMpVn/DImlmkw==
dependencies:
"@octokit/auth-token" "^2.4.0"
"@octokit/plugin-paginate-rest" "^1.1.1"
"@octokit/plugin-request-log" "^1.0.0"
"@octokit/plugin-rest-endpoint-methods" "2.4.0"
"@octokit/request" "^5.2.0"
"@octokit/request-error" "^1.0.2"
atob-lite "^2.0.0"
before-after-hook "^2.0.0"
btoa-lite "^1.0.0"
deprecation "^2.0.0"
lodash.get "^4.4.2"
lodash.set "^4.3.2"
lodash.uniq "^4.5.0"
octokit-pagination-methods "^1.1.0"
once "^1.4.0"
universal-user-agent "^4.0.0"
"@octokit/types@^2.0.0", "@octokit/types@^2.0.1", "@octokit/types@^2.11.1":
version "2.12.1"
resolved "https://registry.yarnpkg.com/@octokit/types/-/types-2.12.1.tgz#4a26b4a85ec121043d3b0745b5798f9d8fd968ca"
integrity sha512-LRLR1tjbcCfAmUElvTmMvLEzstpx6Xt/aQVTg2xvd+kHA2Ekp1eWl5t+gU7bcwjXHYEAzh4hH4WH+kS3vh+wRw==
dependencies:
"@types/node" ">= 8"
"@types/node@>= 8":
version "13.13.2"
resolved "https://registry.yarnpkg.com/@types/node/-/node-13.13.2.tgz#160d82623610db590a64e8ca81784e11117e5a54"
integrity sha512-LB2R1Oyhpg8gu4SON/mfforE525+Hi/M1ineICEDftqNVTyFg1aRIeGuTvXAoWHc4nbrFncWtJgMmoyRvuGh7A==
arg@^4.1.0:
version "4.1.3"
resolved "https://registry.yarnpkg.com/arg/-/arg-4.1.3.tgz#269fc7ad5b8e42cb63c896d5666017261c144089"
integrity sha512-58S9QDqG0Xx27YwPSt9fJxivjYl432YCwfDMfZ+71RAqUrZef7LrKQZ3LHLOwCS4FLNBplP533Zx895SeOCHvA==
atob-lite@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/atob-lite/-/atob-lite-2.0.0.tgz#0fef5ad46f1bd7a8502c65727f0367d5ee43d696"
integrity sha1-D+9a1G8b16hQLGVyfwNn1e5D1pY=
axios@^0.21.4:
version "0.21.4"
resolved "https://registry.yarnpkg.com/axios/-/axios-0.21.4.tgz#c67b90dc0568e5c1cf2b0b858c43ba28e2eda575"
integrity sha512-ut5vewkiu8jjGBdqpM44XxjuCjq9LAKeHVmoVfHVzy8eHgxxq8SbAVQNovDA8mVi05kP0Ea/n/UzcSHcTJQfNg==
dependencies:
follow-redirects "^1.14.0"
before-after-hook@^2.0.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/before-after-hook/-/before-after-hook-2.1.0.tgz#b6c03487f44e24200dd30ca5e6a1979c5d2fb635"
integrity sha512-IWIbu7pMqyw3EAJHzzHbWa85b6oud/yfKYg5rqB5hNE8CeMi3nX+2C2sj0HswfblST86hpVEOAb9x34NZd6P7A==
btoa-lite@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/btoa-lite/-/btoa-lite-1.0.0.tgz#337766da15801210fdd956c22e9c6891ab9d0337"
integrity sha1-M3dm2hWAEhD92VbCLpxokaudAzc=
buffer-from@^1.0.0:
version "1.1.1"
resolved "https://registry.yarnpkg.com/buffer-from/-/buffer-from-1.1.1.tgz#32713bc028f75c02fdb710d7c7bcec1f2c6070ef"
integrity sha512-MQcXEUbCKtEo7bhqEs6560Hyd4XaovZlO/k9V3hjVUF/zwW7KBVdSK4gIt/bzwS9MbR5qob+F5jusZsb0YQK2A==
cross-spawn@^6.0.0:
version "6.0.5"
resolved "https://registry.yarnpkg.com/cross-spawn/-/cross-spawn-6.0.5.tgz#4a5ec7c64dfae22c3a14124dbacdee846d80cbc4"
integrity sha512-eTVLrBSt7fjbDygz805pMnstIs2VTBNkRm0qxZd+M7A5XDdxVRWO5MxGBXZhjY4cqLYLdtrGqRf8mBPmzwSpWQ==
dependencies:
nice-try "^1.0.4"
path-key "^2.0.1"
semver "^5.5.0"
shebang-command "^1.2.0"
which "^1.2.9"
deprecation@^2.0.0, deprecation@^2.3.1:
version "2.3.1"
resolved "https://registry.yarnpkg.com/deprecation/-/deprecation-2.3.1.tgz#6368cbdb40abf3373b525ac87e4a260c3a700919"
integrity sha512-xmHIy4F3scKVwMsQ4WnVaS8bHOx0DmVwRywosKhaILI0ywMDWPtBSku2HNxRvF7jtwDRsoEwYQSfbxj8b7RlJQ==
diff@^4.0.1:
version "4.0.2"
resolved "https://registry.yarnpkg.com/diff/-/diff-4.0.2.tgz#60f3aecb89d5fae520c11aa19efc2bb982aade7d"
integrity sha512-58lmxKSA4BNyLz+HHMUzlOEpg09FV+ev6ZMe3vJihgdxzgcwZ8VoEEPmALCZG9LmqfVoNMMKpttIYTVG6uDY7A==
end-of-stream@^1.1.0:
version "1.4.4"
resolved "https://registry.yarnpkg.com/end-of-stream/-/end-of-stream-1.4.4.tgz#5ae64a5f45057baf3626ec14da0ca5e4b2431eb0"
integrity sha512-+uw1inIHVPQoaVuHzRyXd21icM+cnt4CzD5rW+NC1wjOUSTOs+Te7FOv7AhN7vS9x/oIyhLP5PR1H+phQAHu5Q==
dependencies:
once "^1.4.0"
execa@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/execa/-/execa-1.0.0.tgz#c6236a5bb4df6d6f15e88e7f017798216749ddd8"
integrity sha512-adbxcyWV46qiHyvSp50TKt05tB4tK3HcmF7/nxfAdhnox83seTDbwnaqKO4sXRy7roHAIFqJP/Rw/AuEbX61LA==
dependencies:
cross-spawn "^6.0.0"
get-stream "^4.0.0"
is-stream "^1.1.0"
npm-run-path "^2.0.0"
p-finally "^1.0.0"
signal-exit "^3.0.0"
strip-eof "^1.0.0"
follow-redirects@^1.14.0:
version "1.14.8"
resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.14.8.tgz#016996fb9a11a100566398b1c6839337d7bfa8fc"
integrity sha512-1x0S9UVJHsQprFcEC/qnNzBLcIxsjAV905f/UkQxbclCsoTWlacCNOpQa/anodLl2uaEKFhfWOvM2Qg77+15zA==
get-stream@^4.0.0:
version "4.1.0"
resolved "https://registry.yarnpkg.com/get-stream/-/get-stream-4.1.0.tgz#c1b255575f3dc21d59bfc79cd3d2b46b1c3a54b5"
integrity sha512-GMat4EJ5161kIy2HevLlr4luNjBgvmj413KaQA7jt4V8B4RDsfpHk7WQ9GVqfYyyx8OS/L66Kox+rJRNklLK7w==
dependencies:
pump "^3.0.0"
is-plain-object@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/is-plain-object/-/is-plain-object-3.0.0.tgz#47bfc5da1b5d50d64110806c199359482e75a928"
integrity sha512-tZIpofR+P05k8Aocp7UI/2UTa9lTJSebCXpFFoR9aibpokDj/uXBsJ8luUu0tTVYKkMU6URDUuOfJZ7koewXvg==
dependencies:
isobject "^4.0.0"
is-stream@^1.1.0:
version "1.1.0"
resolved "https://registry.yarnpkg.com/is-stream/-/is-stream-1.1.0.tgz#12d4a3dd4e68e0b79ceb8dbc84173ae80d91ca44"
integrity sha1-EtSj3U5o4Lec6428hBc66A2RykQ=
isexe@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/isexe/-/isexe-2.0.0.tgz#e8fbf374dc556ff8947a10dcb0572d633f2cfa10"
integrity sha1-6PvzdNxVb/iUehDcsFctYz8s+hA=
isobject@^4.0.0:
version "4.0.0"
resolved "https://registry.yarnpkg.com/isobject/-/isobject-4.0.0.tgz#3f1c9155e73b192022a80819bacd0343711697b0"
integrity sha512-S/2fF5wH8SJA/kmwr6HYhK/RI/OkhD84k8ntalo0iJjZikgq1XFvR5M8NPT1x5F7fBwCG3qHfnzeP/Vh/ZxCUA==
lodash.get@^4.4.2:
version "4.4.2"
resolved "https://registry.yarnpkg.com/lodash.get/-/lodash.get-4.4.2.tgz#2d177f652fa31e939b4438d5341499dfa3825e99"
integrity sha1-LRd/ZS+jHpObRDjVNBSZ36OCXpk=
lodash.set@^4.3.2:
version "4.3.2"
resolved "https://registry.yarnpkg.com/lodash.set/-/lodash.set-4.3.2.tgz#d8757b1da807dde24816b0d6a84bea1a76230b23"
integrity sha1-2HV7HagH3eJIFrDWqEvqGnYjCyM=
lodash.uniq@^4.5.0:
version "4.5.0"
resolved "https://registry.yarnpkg.com/lodash.uniq/-/lodash.uniq-4.5.0.tgz#d0225373aeb652adc1bc82e4945339a842754773"
integrity sha1-0CJTc662Uq3BvILklFM5qEJ1R3M=
macos-release@^2.2.0:
version "2.3.0"
resolved "https://registry.yarnpkg.com/macos-release/-/macos-release-2.3.0.tgz#eb1930b036c0800adebccd5f17bc4c12de8bb71f"
integrity sha512-OHhSbtcviqMPt7yfw5ef5aghS2jzFVKEFyCJndQt2YpSQ9qRVSEv2axSJI1paVThEu+FFGs584h/1YhxjVqajA==
make-error@^1.1.1:
version "1.3.6"
resolved "https://registry.yarnpkg.com/make-error/-/make-error-1.3.6.tgz#2eb2e37ea9b67c4891f684a1394799af484cf7a2"
integrity sha512-s8UhlNe7vPKomQhC1qFelMokr/Sc3AgNbso3n74mVPA5LTZwkB9NlXf4XPamLxJE8h0gh73rM94xvwRT2CVInw==
nice-try@^1.0.4:
version "1.0.5"
resolved "https://registry.yarnpkg.com/nice-try/-/nice-try-1.0.5.tgz#a3378a7696ce7d223e88fc9b764bd7ef1089e366"
integrity sha512-1nh45deeb5olNY7eX82BkPO7SSxR5SSYJiPTrTdFUVYwAl8CKMA5N9PjTYkHiRjisVcxcQ1HXdLhx2qxxJzLNQ==
node-fetch@^2.3.0:
version "2.6.7"
resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.6.7.tgz#24de9fba827e3b4ae44dc8b20256a379160052ad"
integrity sha512-ZjMPFEfVx5j+y2yF35Kzx5sF7kDzxuDj6ziH4FFbOp87zKDZNx8yExJIb05OGF4Nlt9IHFIMBkRl41VdvcNdbQ==
npm-run-path@^2.0.0:
version "2.0.2"
resolved "https://registry.yarnpkg.com/npm-run-path/-/npm-run-path-2.0.2.tgz#35a9232dfa35d7067b4cb2ddf2357b1871536c5f"
integrity sha1-NakjLfo11wZ7TLLd8jV7GHFTbF8=
dependencies:
path-key "^2.0.0"
octokit-pagination-methods@^1.1.0:
version "1.1.0"
resolved "https://registry.yarnpkg.com/octokit-pagination-methods/-/octokit-pagination-methods-1.1.0.tgz#cf472edc9d551055f9ef73f6e42b4dbb4c80bea4"
integrity sha512-fZ4qZdQ2nxJvtcasX7Ghl+WlWS/d9IgnBIwFZXVNNZUmzpno91SX5bc5vuxiuKoCtK78XxGGNuSCrDC7xYB3OQ==
once@^1.3.1, once@^1.4.0:
version "1.4.0"
resolved "https://registry.yarnpkg.com/once/-/once-1.4.0.tgz#583b1aa775961d4b113ac17d9c50baef9dd76bd1"
integrity sha1-WDsap3WWHUsROsF9nFC6753Xa9E=
dependencies:
wrappy "1"
os-name@^3.1.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/os-name/-/os-name-3.1.0.tgz#dec19d966296e1cd62d701a5a66ee1ddeae70801"
integrity sha512-h8L+8aNjNcMpo/mAIBPn5PXCM16iyPGjHNWo6U1YO8sJTMHtEtyczI6QJnLoplswm6goopQkqc7OAnjhWcugVg==
dependencies:
macos-release "^2.2.0"
windows-release "^3.1.0"
p-finally@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/p-finally/-/p-finally-1.0.0.tgz#3fbcfb15b899a44123b34b6dcc18b724336a2cae"
integrity sha1-P7z7FbiZpEEjs0ttzBi3JDNqLK4=
path-key@^2.0.0, path-key@^2.0.1:
version "2.0.1"
resolved "https://registry.yarnpkg.com/path-key/-/path-key-2.0.1.tgz#411cadb574c5a140d3a4b1910d40d80cc9f40b40"
integrity sha1-QRyttXTFoUDTpLGRDUDYDMn0C0A=
pump@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/pump/-/pump-3.0.0.tgz#b4a2116815bde2f4e1ea602354e8c75565107a64"
integrity sha512-LwZy+p3SFs1Pytd/jYct4wpv49HiYCqd9Rlc5ZVdk0V+8Yzv6jR5Blk3TRmPL1ft69TxP0IMZGJ+WPFU2BFhww==
dependencies:
end-of-stream "^1.1.0"
once "^1.3.1"
semver@^5.5.0:
version "5.7.1"
resolved "https://registry.yarnpkg.com/semver/-/semver-5.7.1.tgz#a954f931aeba508d307bbf069eff0c01c96116f7"
integrity sha512-sauaDf/PZdVgrLTNYHRtpXa1iRiKcaebiKQ1BJdpQlWH2lCvexQdX55snPFyK7QzpudqbCI0qXFfOasHdyNDGQ==
shebang-command@^1.2.0:
version "1.2.0"
resolved "https://registry.yarnpkg.com/shebang-command/-/shebang-command-1.2.0.tgz#44aac65b695b03398968c39f363fee5deafdf1ea"
integrity sha1-RKrGW2lbAzmJaMOfNj/uXer98eo=
dependencies:
shebang-regex "^1.0.0"
shebang-regex@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/shebang-regex/-/shebang-regex-1.0.0.tgz#da42f49740c0b42db2ca9728571cb190c98efea3"
integrity sha1-2kL0l0DAtC2yypcoVxyxkMmO/qM=
signal-exit@^3.0.0:
version "3.0.3"
resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-3.0.3.tgz#a1410c2edd8f077b08b4e253c8eacfcaf057461c"
integrity sha512-VUJ49FC8U1OxwZLxIbTTrDvLnf/6TDgxZcK8wxR8zs13xpx7xbG60ndBlhNrFi2EMuFRoeDoJO7wthSLq42EjA==
source-map-support@^0.5.17:
version "0.5.19"
resolved "https://registry.yarnpkg.com/source-map-support/-/source-map-support-0.5.19.tgz#a98b62f86dcaf4f67399648c085291ab9e8fed61"
integrity sha512-Wonm7zOCIJzBGQdB+thsPar0kYuCIzYvxZwlBa87yi/Mdjv7Tip2cyVbLj5o0cFPN4EVkuTwb3GDDyUx2DGnGw==
dependencies:
buffer-from "^1.0.0"
source-map "^0.6.0"
source-map@^0.6.0:
version "0.6.1"
resolved "https://registry.yarnpkg.com/source-map/-/source-map-0.6.1.tgz#74722af32e9614e9c287a8d0bbde48b5e2f1a263"
integrity sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==
strip-eof@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/strip-eof/-/strip-eof-1.0.0.tgz#bb43ff5598a6eb05d89b59fcd129c983313606bf"
integrity sha1-u0P/VZim6wXYm1n80SnJgzE2Br8=
ts-node@^8.6.2:
version "8.9.0"
resolved "https://registry.yarnpkg.com/ts-node/-/ts-node-8.9.0.tgz#d7bf7272dcbecd3a2aa18bd0b96c7d2f270c15d4"
integrity sha512-rwkXfOs9zmoHrV8xE++dmNd6ZIS+nmHHCxcV53ekGJrxFLMbp+pizpPS07ARvhwneCIECPppOwbZHvw9sQtU4w==
dependencies:
arg "^4.1.0"
diff "^4.0.1"
make-error "^1.1.1"
source-map-support "^0.5.17"
yn "3.1.1"
tunnel@0.0.6, tunnel@^0.0.6:
version "0.0.6"
resolved "https://registry.yarnpkg.com/tunnel/-/tunnel-0.0.6.tgz#72f1314b34a5b192db012324df2cc587ca47f92c"
integrity sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg==
typescript@^3.8.3:
version "3.8.3"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-3.8.3.tgz#409eb8544ea0335711205869ec458ab109ee1061"
integrity sha512-MYlEfn5VrLNsgudQTVJeNaQFUAI7DkhnOjdpAp4T+ku1TfQClewlbSuTVHiA+8skNBgaf02TL/kLOvig4y3G8w==
universal-user-agent@^4.0.0:
version "4.0.1"
resolved "https://registry.yarnpkg.com/universal-user-agent/-/universal-user-agent-4.0.1.tgz#fd8d6cb773a679a709e967ef8288a31fcc03e557"
integrity sha512-LnST3ebHwVL2aNe4mejI9IQh2HfZ1RLo8Io2HugSif8ekzD1TlWpHpColOB/eh8JHMLkGH3Akqf040I+4ylNxg==
dependencies:
os-name "^3.1.0"
universal-user-agent@^5.0.0:
version "5.0.0"
resolved "https://registry.yarnpkg.com/universal-user-agent/-/universal-user-agent-5.0.0.tgz#a3182aa758069bf0e79952570ca757de3579c1d9"
integrity sha512-B5TPtzZleXyPrUMKCpEHFmVhMN6EhmJYjG5PQna9s7mXeSqGTLap4OpqLl5FCEFUI3UBmllkETwKf/db66Y54Q==
dependencies:
os-name "^3.1.0"
uuid@^8.3.2:
version "8.3.2"
resolved "https://registry.yarnpkg.com/uuid/-/uuid-8.3.2.tgz#80d5b5ced271bb9af6c445f21a1a04c606cefbe2"
integrity sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==
which@^1.2.9:
version "1.3.1"
resolved "https://registry.yarnpkg.com/which/-/which-1.3.1.tgz#a45043d54f5805316da8d62f9f50918d3da70b0a"
integrity sha512-HxJdYWq1MTIQbJ3nw0cqssHoTNU267KlrDuGZ1WYlxDStUtKUhOaJmh112/TZmHxxUfuJqPXSOm7tDyas0OSIQ==
dependencies:
isexe "^2.0.0"
windows-release@^3.1.0:
version "3.3.0"
resolved "https://registry.yarnpkg.com/windows-release/-/windows-release-3.3.0.tgz#dce167e9f8be733f21c849ebd4d03fe66b29b9f0"
integrity sha512-2HetyTg1Y+R+rUgrKeUEhAG/ZuOmTrI1NBb3ZyAGQMYmOJjBBPe4MTodghRkmLJZHwkuPi02anbeGP+Zf401LQ==
dependencies:
execa "^1.0.0"
wrappy@1:
version "1.0.2"
resolved "https://registry.yarnpkg.com/wrappy/-/wrappy-1.0.2.tgz#b5243d8f3ec1aa35f1364605bc0d1036e30ab69f"
integrity sha1-tSQ9jz7BqjXxNkYFvA0QNuMKtp8=
yn@3.1.1:
version "3.1.1"
resolved "https://registry.yarnpkg.com/yn/-/yn-3.1.1.tgz#1e87401a09d767c1d5eab26a6e4c185182d2eb50"
integrity sha512-Ux4ygGWsu2c7isFWe8Yu1YluJmqVhxqK2cLXNQA5AcC3QfbGNpM7fu0Y8b/z16pXLnFxZYvWhd3fhBY9DLmC6Q==

View File

@@ -1,21 +0,0 @@
{
"codebaseName": "vscode-client",
"ppe": false,
"notificationAliases": [
"sbatten@microsoft.com"
],
"codebaseAdmins": [
"REDMOND\\stbatt",
"REDMOND\\monacotools",
],
"instanceUrl": "https://msazure.visualstudio.com/defaultcollection",
"projectName": "One",
"areaPath": "One\\VSCode\\Visual Studio Code Client",
"iterationPath": "One",
"notifyAlways": true,
"tools": [
"BinSkim",
"CredScan",
"CodeQL"
]
}

View File

@@ -1,34 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const path = require("path");
const crypto = require("crypto");
const { dirs } = require('../../npm/dirs');
const ROOT = path.join(__dirname, '../../../');
const shasum = crypto.createHash('sha1');
shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
// Add `package.json` and `yarn.lock` files
for (let dir of dirs) {
const packageJsonPath = path.join(ROOT, dir, 'package.json');
const packageJson = JSON.parse(fs.readFileSync(packageJsonPath).toString());
const relevantPackageJsonSections = {
dependencies: packageJson.dependencies,
devDependencies: packageJson.devDependencies,
optionalDependencies: packageJson.optionalDependencies,
resolutions: packageJson.resolutions
};
shasum.update(JSON.stringify(relevantPackageJsonSections));
const yarnLockPath = path.join(ROOT, dir, 'yarn.lock');
shasum.update(fs.readFileSync(yarnLockPath));
}
// Add any other command line arguments
for (let i = 2; i < process.argv.length; i++) {
shasum.update(process.argv[i]);
}
process.stdout.write(shasum.digest('hex'));

View File

@@ -1,42 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as fs from 'fs';
import * as path from 'path';
import * as crypto from 'crypto';
const { dirs } = require('../../npm/dirs');
const ROOT = path.join(__dirname, '../../../');
const shasum = crypto.createHash('sha1');
shasum.update(fs.readFileSync(path.join(ROOT, 'build/.cachesalt')));
shasum.update(fs.readFileSync(path.join(ROOT, '.yarnrc')));
shasum.update(fs.readFileSync(path.join(ROOT, 'remote/.yarnrc')));
// Add `package.json` and `yarn.lock` files
for (let dir of dirs) {
const packageJsonPath = path.join(ROOT, dir, 'package.json');
const packageJson = JSON.parse(fs.readFileSync(packageJsonPath).toString());
const relevantPackageJsonSections = {
dependencies: packageJson.dependencies,
devDependencies: packageJson.devDependencies,
optionalDependencies: packageJson.optionalDependencies,
resolutions: packageJson.resolutions
};
shasum.update(JSON.stringify(relevantPackageJsonSections));
const yarnLockPath = path.join(ROOT, dir, 'yarn.lock');
shasum.update(fs.readFileSync(yarnLockPath));
}
// Add any other command line arguments
for (let i = 2; i < process.argv.length; i++) {
shasum.update(process.argv[i]);
}
process.stdout.write(shasum.digest('hex'));
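The script above folds build/.cachesalt, the two .yarnrc files, and each directory's dependency-related package.json sections plus its yarn.lock into a single SHA-1 that serves as the node_modules cache key. As a reduced sketch of the same idea for one directory only (the './some-dir' path is illustrative, not a real directory in the repository):

// Reduced illustration of the cache-key approach above: hash only the
// dependency-relevant parts of one package.json plus its yarn.lock.
import * as fs from 'fs';
import * as path from 'path';
import * as crypto from 'crypto';

function cacheKeyFor(dir: string): string {
	const shasum = crypto.createHash('sha1');
	const packageJson = JSON.parse(fs.readFileSync(path.join(dir, 'package.json'), 'utf8'));
	// Only the sections that affect installed dependencies feed the hash,
	// so unrelated package.json edits do not invalidate the cache.
	shasum.update(JSON.stringify({
		dependencies: packageJson.dependencies,
		devDependencies: packageJson.devDependencies,
		optionalDependencies: packageJson.optionalDependencies,
		resolutions: packageJson.resolutions
	}));
	shasum.update(fs.readFileSync(path.join(dir, 'yarn.lock')));
	return shasum.digest('hex');
}

console.log(cacheKeyFor('./some-dir'));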

View File

@@ -1,42 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const vfs = require("vinyl-fs");
const path = require("path");
const es = require("event-stream");
const fs = require("fs");
const files = [
'.build/langpacks/**/*.vsix',
'.build/extensions/**/*.vsix',
'.build/win32-x64/**/*.{exe,zip}',
'.build/win32-arm64/**/*.{exe,zip}',
'.build/linux/sha256hashes.txt',
'.build/linux/deb/amd64/deb/*.deb',
'.build/linux/rpm/x86_64/*.rpm',
'.build/linux/server/*',
'.build/linux/archive/*',
'.build/docker/*',
'.build/darwin/*',
'.build/version.json' // version information
];
async function main() {
return new Promise((resolve, reject) => {
const stream = vfs.src(files, { base: '.build', allowEmpty: true })
.pipe(es.through(file => {
const filePath = path.join(process.env.BUILD_ARTIFACTSTAGINGDIRECTORY,
//Preserve intermediate directories after .build folder
file.path.substr(path.resolve('.build').length + 1));
fs.mkdirSync(path.dirname(filePath), { recursive: true });
fs.renameSync(file.path, filePath);
}));
stream.on('end', () => resolve());
stream.on('error', e => reject(e));
});
}
main().catch(err => {
console.error(err);
process.exit(1);
});

View File

@@ -1,47 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as vfs from 'vinyl-fs';
import * as path from 'path';
import * as es from 'event-stream';
import * as fs from 'fs';
const files = [
'.build/langpacks/**/*.vsix', // langpacks
'.build/extensions/**/*.vsix', // external extensions
'.build/win32-x64/**/*.{exe,zip}', // windows x64 binaries
'.build/win32-arm64/**/*.{exe,zip}', // windows arm64 binaries
'.build/linux/sha256hashes.txt', // linux hashes
'.build/linux/deb/amd64/deb/*.deb', // linux debs
'.build/linux/rpm/x86_64/*.rpm', // linux rpms
'.build/linux/server/*', // linux server
'.build/linux/archive/*', // linux archive
'.build/docker/*', // docker images
'.build/darwin/*', // darwin binaries
'.build/version.json' // version information
];
async function main() {
return new Promise<void>((resolve, reject) => {
const stream = vfs.src(files, { base: '.build', allowEmpty: true })
.pipe(es.through(file => {
const filePath = path.join(process.env.BUILD_ARTIFACTSTAGINGDIRECTORY!,
//Preserve intermediate directories after .build folder
file.path.substr(path.resolve('.build').length + 1));
fs.mkdirSync(path.dirname(filePath), { recursive: true });
fs.renameSync(file.path, filePath);
}));
stream.on('end', () => resolve());
stream.on('error', e => reject(e));
});
}
main().catch(err => {
console.error(err);
process.exit(1);
});

View File

@@ -1,217 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const crypto = require("crypto");
const storage_blob_1 = require("@azure/storage-blob");
const mime = require("mime");
const cosmos_1 = require("@azure/cosmos");
const identity_1 = require("@azure/identity");
const retry_1 = require("./retry");
if (process.argv.length !== 8) {
console.error('Usage: node createAsset.js PRODUCT OS ARCH TYPE NAME FILE');
process.exit(-1);
}
// Contains all of the logic for mapping details to our actual product names in CosmosDB
function getPlatform(product, os, arch, type) {
switch (os) {
case 'win32':
switch (product) {
case 'client': {
const asset = arch === 'ia32' ? 'win32' : `win32-${arch}`;
switch (type) {
case 'archive':
return `${asset}-archive`;
case 'setup':
return asset;
case 'user-setup':
return `${asset}-user`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
}
case 'server':
if (arch === 'arm64') {
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
return arch === 'ia32' ? 'server-win32' : `server-win32-${arch}`;
case 'web':
if (arch === 'arm64') {
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
return arch === 'ia32' ? 'server-win32-web' : `server-win32-${arch}-web`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
case 'alpine':
switch (product) {
case 'server':
return `server-alpine-${arch}`;
case 'web':
return `server-alpine-${arch}-web`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
case 'linux':
switch (type) {
case 'snap':
return `linux-snap-${arch}`;
case 'archive-unsigned':
switch (product) {
case 'client':
return `linux-${arch}`;
case 'server':
return `server-linux-${arch}`;
case 'web':
return arch === 'standalone' ? 'web-standalone' : `server-linux-${arch}-web`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
case 'deb-package':
return `linux-deb-${arch}`;
case 'rpm-package':
return `linux-rpm-${arch}`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
case 'darwin':
switch (product) {
case 'client':
if (arch === 'x64') {
return 'darwin';
}
return `darwin-${arch}`;
case 'server':
if (arch === 'x64') {
return 'server-darwin';
}
return `server-darwin-${arch}`;
case 'web':
if (arch === 'x64') {
return 'server-darwin-web';
}
return `server-darwin-${arch}-web`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
}
// Contains all of the logic for mapping types to our actual types in CosmosDB
function getRealType(type) {
switch (type) {
case 'user-setup':
return 'setup';
case 'deb-package':
case 'rpm-package':
return 'package';
default:
return type;
}
}
function hashStream(hashName, stream) {
return new Promise((c, e) => {
const shasum = crypto.createHash(hashName);
stream
.on('data', shasum.update.bind(shasum))
.on('error', e)
.on('close', () => c(shasum.digest('hex')));
});
}
function getEnv(name) {
const result = process.env[name];
if (typeof result === 'undefined') {
throw new Error('Missing env: ' + name);
}
return result;
}
async function main() {
var _a;
const [, , product, os, arch, unprocessedType, fileName, filePath] = process.argv;
// getPlatform needs the unprocessedType
const platform = getPlatform(product, os, arch, unprocessedType);
const type = getRealType(unprocessedType);
const quality = getEnv('VSCODE_QUALITY');
const commit = process.env['VSCODE_DISTRO_COMMIT'] || getEnv('BUILD_SOURCEVERSION');
console.log('Creating asset...');
const stat = await new Promise((c, e) => fs.stat(filePath, (err, stat) => err ? e(err) : c(stat)));
const size = stat.size;
console.log('Size:', size);
const stream = fs.createReadStream(filePath);
const [sha1hash, sha256hash] = await Promise.all([hashStream('sha1', stream), hashStream('sha256', stream)]);
console.log('SHA1:', sha1hash);
console.log('SHA256:', sha256hash);
const blobName = commit + '/' + fileName;
const storagePipelineOptions = { retryOptions: { retryPolicyType: storage_blob_1.StorageRetryPolicyType.EXPONENTIAL, maxTries: 6, tryTimeoutInMs: 10 * 60 * 1000 } };
const credential = new identity_1.ClientSecretCredential(process.env['AZURE_TENANT_ID'], process.env['AZURE_CLIENT_ID'], process.env['AZURE_CLIENT_SECRET']);
const blobServiceClient = new storage_blob_1.BlobServiceClient(`https://vscode.blob.core.windows.net`, credential, storagePipelineOptions);
const containerClient = blobServiceClient.getContainerClient(quality);
const blobClient = containerClient.getBlockBlobClient(blobName);
const blobExists = await blobClient.exists();
if (blobExists) {
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
const blobOptions = {
blobHTTPHeaders: {
blobContentType: mime.lookup(filePath),
blobContentDisposition: `attachment; filename="${fileName}"`,
blobCacheControl: 'max-age=31536000, public'
}
};
const uploadPromises = [
(0, retry_1.retry)(async () => {
await blobClient.uploadFile(filePath, blobOptions);
console.log('Blob successfully uploaded to Azure storage.');
})
];
const shouldUploadToMooncake = /true/i.test((_a = process.env['VSCODE_PUBLISH_TO_MOONCAKE']) !== null && _a !== void 0 ? _a : 'true');
if (shouldUploadToMooncake) {
const mooncakeCredential = new identity_1.ClientSecretCredential(process.env['AZURE_MOONCAKE_TENANT_ID'], process.env['AZURE_MOONCAKE_CLIENT_ID'], process.env['AZURE_MOONCAKE_CLIENT_SECRET']);
const mooncakeBlobServiceClient = new storage_blob_1.BlobServiceClient(`https://vscode.blob.core.chinacloudapi.cn`, mooncakeCredential, storagePipelineOptions);
const mooncakeContainerClient = mooncakeBlobServiceClient.getContainerClient(quality);
const mooncakeBlobClient = mooncakeContainerClient.getBlockBlobClient(blobName);
uploadPromises.push((0, retry_1.retry)(async () => {
await mooncakeBlobClient.uploadFile(filePath, blobOptions);
console.log('Blob successfully uploaded to Mooncake Azure storage.');
}));
console.log('Uploading blobs to Azure storage and Mooncake Azure storage...');
}
else {
console.log('Uploading blobs to Azure storage...');
}
await Promise.all(uploadPromises);
console.log('All blobs successfully uploaded.');
const assetUrl = `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`;
const blobPath = new URL(assetUrl).pathname;
const mooncakeUrl = `${process.env['MOONCAKE_CDN_URL']}${blobPath}`;
const asset = {
platform,
type,
url: assetUrl,
hash: sha1hash,
mooncakeUrl,
sha256hash,
size
};
// Remove this if we ever need to rollback fast updates for windows
if (/win32/.test(platform)) {
asset.supportsFastUpdate = true;
}
console.log('Asset:', JSON.stringify(asset, null, ' '));
const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], aadCredentials: credential });
const scripts = client.database('builds').container(quality).scripts;
await (0, retry_1.retry)(() => scripts.storedProcedure('createAsset').execute('', [commit, asset, true]));
console.log(` Done ✔️`);
}
main().then(() => {
console.log('Asset successfully created');
process.exit(0);
}, err => {
console.error(err);
process.exit(1);
});

View File

@@ -1,260 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as fs from 'fs';
import { Readable } from 'stream';
import * as crypto from 'crypto';
import { BlobServiceClient, BlockBlobParallelUploadOptions, StoragePipelineOptions, StorageRetryPolicyType } from '@azure/storage-blob';
import * as mime from 'mime';
import { CosmosClient } from '@azure/cosmos';
import { ClientSecretCredential } from '@azure/identity';
import { retry } from './retry';
interface Asset {
platform: string;
type: string;
url: string;
mooncakeUrl?: string;
hash: string;
sha256hash: string;
size: number;
supportsFastUpdate?: boolean;
}
if (process.argv.length !== 8) {
console.error('Usage: node createAsset.js PRODUCT OS ARCH TYPE NAME FILE');
process.exit(-1);
}
// Contains all of the logic for mapping details to our actual product names in CosmosDB
function getPlatform(product: string, os: string, arch: string, type: string): string {
switch (os) {
case 'win32':
switch (product) {
case 'client': {
const asset = arch === 'ia32' ? 'win32' : `win32-${arch}`;
switch (type) {
case 'archive':
return `${asset}-archive`;
case 'setup':
return asset;
case 'user-setup':
return `${asset}-user`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
}
case 'server':
if (arch === 'arm64') {
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
return arch === 'ia32' ? 'server-win32' : `server-win32-${arch}`;
case 'web':
if (arch === 'arm64') {
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
return arch === 'ia32' ? 'server-win32-web' : `server-win32-${arch}-web`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
case 'alpine':
switch (product) {
case 'server':
return `server-alpine-${arch}`;
case 'web':
return `server-alpine-${arch}-web`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
case 'linux':
switch (type) {
case 'snap':
return `linux-snap-${arch}`;
case 'archive-unsigned':
switch (product) {
case 'client':
return `linux-${arch}`;
case 'server':
return `server-linux-${arch}`;
case 'web':
return arch === 'standalone' ? 'web-standalone' : `server-linux-${arch}-web`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
case 'deb-package':
return `linux-deb-${arch}`;
case 'rpm-package':
return `linux-rpm-${arch}`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
case 'darwin':
switch (product) {
case 'client':
if (arch === 'x64') {
return 'darwin';
}
return `darwin-${arch}`;
case 'server':
if (arch === 'x64') {
return 'server-darwin';
}
return `server-darwin-${arch}`;
case 'web':
if (arch === 'x64') {
return 'server-darwin-web';
}
return `server-darwin-${arch}-web`;
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
default:
throw new Error(`Unrecognized: ${product} ${os} ${arch} ${type}`);
}
}
// Contains all of the logic for mapping types to our actual types in CosmosDB
function getRealType(type: string) {
switch (type) {
case 'user-setup':
return 'setup';
case 'deb-package':
case 'rpm-package':
return 'package';
default:
return type;
}
}
function hashStream(hashName: string, stream: Readable): Promise<string> {
return new Promise<string>((c, e) => {
const shasum = crypto.createHash(hashName);
stream
.on('data', shasum.update.bind(shasum))
.on('error', e)
.on('close', () => c(shasum.digest('hex')));
});
}
function getEnv(name: string): string {
const result = process.env[name];
if (typeof result === 'undefined') {
throw new Error('Missing env: ' + name);
}
return result;
}
async function main(): Promise<void> {
const [, , product, os, arch, unprocessedType, fileName, filePath] = process.argv;
// getPlatform needs the unprocessedType
const platform = getPlatform(product, os, arch, unprocessedType);
const type = getRealType(unprocessedType);
const quality = getEnv('VSCODE_QUALITY');
const commit = process.env['VSCODE_DISTRO_COMMIT'] || getEnv('BUILD_SOURCEVERSION');
console.log('Creating asset...');
const stat = await new Promise<fs.Stats>((c, e) => fs.stat(filePath, (err, stat) => err ? e(err) : c(stat)));
const size = stat.size;
console.log('Size:', size);
const stream = fs.createReadStream(filePath);
const [sha1hash, sha256hash] = await Promise.all([hashStream('sha1', stream), hashStream('sha256', stream)]);
console.log('SHA1:', sha1hash);
console.log('SHA256:', sha256hash);
const blobName = commit + '/' + fileName;
const storagePipelineOptions: StoragePipelineOptions = { retryOptions: { retryPolicyType: StorageRetryPolicyType.EXPONENTIAL, maxTries: 6, tryTimeoutInMs: 10 * 60 * 1000 } };
const credential = new ClientSecretCredential(process.env['AZURE_TENANT_ID']!, process.env['AZURE_CLIENT_ID']!, process.env['AZURE_CLIENT_SECRET']!);
const blobServiceClient = new BlobServiceClient(`https://vscode.blob.core.windows.net`, credential, storagePipelineOptions);
const containerClient = blobServiceClient.getContainerClient(quality);
const blobClient = containerClient.getBlockBlobClient(blobName);
const blobExists = await blobClient.exists();
if (blobExists) {
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
const blobOptions: BlockBlobParallelUploadOptions = {
blobHTTPHeaders: {
blobContentType: mime.lookup(filePath),
blobContentDisposition: `attachment; filename="${fileName}"`,
blobCacheControl: 'max-age=31536000, public'
}
};
const uploadPromises: Promise<void>[] = [
retry(async () => {
await blobClient.uploadFile(filePath, blobOptions);
console.log('Blob successfully uploaded to Azure storage.');
})
];
const shouldUploadToMooncake = /true/i.test(process.env['VSCODE_PUBLISH_TO_MOONCAKE'] ?? 'true');
if (shouldUploadToMooncake) {
const mooncakeCredential = new ClientSecretCredential(process.env['AZURE_MOONCAKE_TENANT_ID']!, process.env['AZURE_MOONCAKE_CLIENT_ID']!, process.env['AZURE_MOONCAKE_CLIENT_SECRET']!);
const mooncakeBlobServiceClient = new BlobServiceClient(`https://vscode.blob.core.chinacloudapi.cn`, mooncakeCredential, storagePipelineOptions);
const mooncakeContainerClient = mooncakeBlobServiceClient.getContainerClient(quality);
const mooncakeBlobClient = mooncakeContainerClient.getBlockBlobClient(blobName);
uploadPromises.push(retry(async () => {
await mooncakeBlobClient.uploadFile(filePath, blobOptions);
console.log('Blob successfully uploaded to Mooncake Azure storage.');
}));
console.log('Uploading blobs to Azure storage and Mooncake Azure storage...');
} else {
console.log('Uploading blobs to Azure storage...');
}
await Promise.all(uploadPromises);
console.log('All blobs successfully uploaded.');
const assetUrl = `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`;
const blobPath = new URL(assetUrl).pathname;
const mooncakeUrl = `${process.env['MOONCAKE_CDN_URL']}${blobPath}`;
const asset: Asset = {
platform,
type,
url: assetUrl,
hash: sha1hash,
mooncakeUrl,
sha256hash,
size
};
// Remove this if we ever need to rollback fast updates for windows
if (/win32/.test(platform)) {
asset.supportsFastUpdate = true;
}
console.log('Asset:', JSON.stringify(asset, null, ' '));
const client = new CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT']!, aadCredentials: credential });
const scripts = client.database('builds').container(quality).scripts;
await retry(() => scripts.storedProcedure('createAsset').execute('', [commit, asset, true]));
console.log(` Done ✔️`);
}
main().then(() => {
console.log('Asset successfully created');
process.exit(0);
}, err => {
console.error(err);
process.exit(1);
});
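For reference, the record that createAsset hands to the createAsset stored procedure has the shape of the Asset interface at the top of this file. A hypothetical example of such a record follows; every value is invented, the URLs use placeholder domains, and supportsFastUpdate is set only because the platform matches /win32/, exactly as in the code above.

// Hypothetical example of the asset document written to the builds database.
// The interface mirrors the Asset interface above; all values are made up.
interface Asset {
	platform: string;
	type: string;
	url: string;
	mooncakeUrl?: string;
	hash: string;       // SHA1 of the uploaded file
	sha256hash: string; // SHA256 of the uploaded file
	size: number;       // file size in bytes
	supportsFastUpdate?: boolean;
}

const exampleAsset: Asset = {
	platform: 'win32-x64-user',          // client + win32 + x64 + user-setup
	type: 'setup',                       // getRealType('user-setup')
	url: 'https://example-cdn.invalid/stable/abc123/ExampleSetup-x64.exe',
	mooncakeUrl: 'https://example-mooncake.invalid/stable/abc123/ExampleSetup-x64.exe',
	hash: 'da39a3ee5e6b4b0d3255bfef95601890afd80709',
	sha256hash: 'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855',
	size: 91234567,
	supportsFastUpdate: true             // platform matches /win32/
};

console.log(JSON.stringify(exampleAsset, null, 2));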

View File

@@ -1,55 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const identity_1 = require("@azure/identity");
const cosmos_1 = require("@azure/cosmos");
const retry_1 = require("./retry");
if (process.argv.length !== 3) {
console.error('Usage: node createBuild.js VERSION');
process.exit(-1);
}
function getEnv(name) {
const result = process.env[name];
if (typeof result === 'undefined') {
throw new Error('Missing env: ' + name);
}
return result;
}
async function main() {
var _a, _b, _c;
const [, , _version] = process.argv;
const quality = getEnv('VSCODE_QUALITY');
const commit = ((_a = process.env['VSCODE_DISTRO_COMMIT']) === null || _a === void 0 ? void 0 : _a.trim()) || getEnv('BUILD_SOURCEVERSION');
const queuedBy = getEnv('BUILD_QUEUEDBY');
const sourceBranch = ((_b = process.env['VSCODE_DISTRO_REF']) === null || _b === void 0 ? void 0 : _b.trim()) || getEnv('BUILD_SOURCEBRANCH');
const version = _version + (quality === 'stable' ? '' : `-${quality}`);
console.log('Creating build...');
console.log('Quality:', quality);
console.log('Version:', version);
console.log('Commit:', commit);
const build = {
id: commit,
timestamp: (new Date()).getTime(),
version,
isReleased: false,
private: Boolean((_c = process.env['VSCODE_DISTRO_REF']) === null || _c === void 0 ? void 0 : _c.trim()),
sourceBranch,
queuedBy,
assets: [],
updates: {}
};
const aadCredentials = new identity_1.ClientSecretCredential(process.env['AZURE_TENANT_ID'], process.env['AZURE_CLIENT_ID'], process.env['AZURE_CLIENT_SECRET']);
const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], aadCredentials });
const scripts = client.database('builds').container(quality).scripts;
await (0, retry_1.retry)(() => scripts.storedProcedure('createBuild').execute('', [Object.assign(Object.assign({}, build), { _partitionKey: '' })]));
}
main().then(() => {
console.log('Build successfully created');
process.exit(0);
}, err => {
console.error(err);
process.exit(1);
});

View File

@@ -1,64 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import { ClientSecretCredential } from '@azure/identity';
import { CosmosClient } from '@azure/cosmos';
import { retry } from './retry';
if (process.argv.length !== 3) {
console.error('Usage: node createBuild.js VERSION');
process.exit(-1);
}
function getEnv(name: string): string {
const result = process.env[name];
if (typeof result === 'undefined') {
throw new Error('Missing env: ' + name);
}
return result;
}
async function main(): Promise<void> {
const [, , _version] = process.argv;
const quality = getEnv('VSCODE_QUALITY');
const commit = process.env['VSCODE_DISTRO_COMMIT']?.trim() || getEnv('BUILD_SOURCEVERSION');
const queuedBy = getEnv('BUILD_QUEUEDBY');
const sourceBranch = process.env['VSCODE_DISTRO_REF']?.trim() || getEnv('BUILD_SOURCEBRANCH');
const version = _version + (quality === 'stable' ? '' : `-${quality}`);
console.log('Creating build...');
console.log('Quality:', quality);
console.log('Version:', version);
console.log('Commit:', commit);
const build = {
id: commit,
timestamp: (new Date()).getTime(),
version,
isReleased: false,
private: Boolean(process.env['VSCODE_DISTRO_REF']?.trim()),
sourceBranch,
queuedBy,
assets: [],
updates: {}
};
const aadCredentials = new ClientSecretCredential(process.env['AZURE_TENANT_ID']!, process.env['AZURE_CLIENT_ID']!, process.env['AZURE_CLIENT_SECRET']!);
const client = new CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT']!, aadCredentials });
const scripts = client.database('builds').container(quality).scripts;
await retry(() => scripts.storedProcedure('createBuild').execute('', [{ ...build, _partitionKey: '' }]));
}
main().then(() => {
console.log('Build successfully created');
process.exit(0);
}, err => {
console.error(err);
process.exit(1);
});

View File

@@ -1,19 +0,0 @@
#!/usr/bin/env bash
set -e
cd $BUILD_STAGINGDIRECTORY
mkdir extraction
cd extraction
git clone --depth 1 https://github.com/microsoft/vscode-extension-telemetry.git
git clone --depth 1 https://github.com/microsoft/vscode-chrome-debug-core.git
git clone --depth 1 https://github.com/microsoft/vscode-node-debug2.git
git clone --depth 1 https://github.com/microsoft/vscode-node-debug.git
git clone --depth 1 https://github.com/microsoft/vscode-html-languageservice.git
git clone --depth 1 https://github.com/microsoft/vscode-json-languageservice.git
node $BUILD_SOURCESDIRECTORY/node_modules/.bin/vscode-telemetry-extractor --sourceDir $BUILD_SOURCESDIRECTORY --excludedDir $BUILD_SOURCESDIRECTORY/extensions --outputDir . --applyEndpoints
node $BUILD_SOURCESDIRECTORY/node_modules/.bin/vscode-telemetry-extractor --config $BUILD_SOURCESDIRECTORY/build/azure-pipelines/common/telemetry-config.json -o .
mkdir -p $BUILD_SOURCESDIRECTORY/.build/telemetry
mv declarations-resolved.json $BUILD_SOURCESDIRECTORY/.build/telemetry/telemetry-core.json
mv config-resolved.json $BUILD_SOURCESDIRECTORY/.build/telemetry/telemetry-extensions.json
cd ..
rm -rf extraction

View File

@@ -1,12 +0,0 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const retry_1 = require("./retry");
const { installDefaultBrowsersForNpmInstall } = require('playwright-core/lib/server');
async function install() {
await (0, retry_1.retry)(() => installDefaultBrowsersForNpmInstall());
}
install();

View File

@@ -1,13 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import { retry } from './retry';
const { installDefaultBrowsersForNpmInstall } = require('playwright-core/lib/server');
async function install() {
await retry(() => installDefaultBrowsersForNpmInstall());
}
install();
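The retry helper imported here (and by the other scripts in this directory) is not part of this diff. As a rough idea of the kind of wrapper being relied on, a hypothetical stand-in might look like the following; the real ./retry module may differ in attempt count, backoff strategy, and error handling.

// Hypothetical stand-in for the './retry' helper used by these scripts.
// Retries an async operation a fixed number of times with a fixed delay.
// The attempt count and delay below are assumptions, not the repository's values.
async function retry<T>(fn: () => Promise<T>, attempts = 5, delayMs = 3000): Promise<T> {
	let lastError: unknown;
	for (let i = 1; i <= attempts; i++) {
		try {
			return await fn();
		} catch (err) {
			lastError = err;
			if (i < attempts) {
				console.log(`Attempt ${i} failed, retrying in ${delayMs}ms...`);
				await new Promise(resolve => setTimeout(resolve, delayMs));
			}
		}
	}
	throw lastError;
}

// Usage mirroring the install script above:
// await retry(() => installDefaultBrowsersForNpmInstall());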

View File

@@ -1,40 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const path = require("path");
if (process.argv.length !== 3) {
console.error('Usage: node listNodeModules.js OUTPUT_FILE');
process.exit(-1);
}
const ROOT = path.join(__dirname, '../../../');
function findNodeModulesFiles(location, inNodeModules, result) {
const entries = fs.readdirSync(path.join(ROOT, location));
for (const entry of entries) {
const entryPath = `${location}/${entry}`;
if (/(^\/out)|(^\/src$)|(^\/.git$)|(^\/.build$)/.test(entryPath)) {
continue;
}
let stat;
try {
stat = fs.statSync(path.join(ROOT, entryPath));
}
catch (err) {
continue;
}
if (stat.isDirectory()) {
findNodeModulesFiles(entryPath, inNodeModules || (entry === 'node_modules'), result);
}
else {
if (inNodeModules) {
result.push(entryPath.substr(1));
}
}
}
}
const result = [];
findNodeModulesFiles('', false, result);
fs.writeFileSync(process.argv[2], result.join('\n') + '\n');

Some files were not shown because too many files have changed in this diff.