Mirror of https://github.com/ckaczor/azuredatastudio.git, synced 2026-02-23 18:47:06 -05:00.

Compare commits (20 commits)
| Author | SHA1 | Date |
|---|---|---|
|  | 65fb22cc7c |  |
|  | ce93729d5c |  |
|  | eaf3482a64 |  |
|  | a430fd2592 |  |
|  | 9577b70e37 |  |
|  | 6555fff09f |  |
|  | 7c1d98ad06 |  |
|  | 7aa00b80b0 |  |
|  | 3d65519595 |  |
|  | 2308b2cb3d |  |
|  | 4d86a62c24 |  |
|  | 3f51fa42fd |  |
|  | 97881ce62a |  |
|  | c1fede2c75 |  |
|  | 7c34261fd2 |  |
|  | aa05f77ef2 |  |
|  | d737ed796c |  |
|  | 8d3187c511 |  |
|  | 6abc5f2287 |  |
|  | ac4f53ed0a |  |
```diff
@@ -16,10 +16,6 @@
 		{
 			"file": ".devcontainer\\devcontainer.json",
 			"_justification": "Local development environment - not used in production"
-		},
-		{
-			"file": "extensions\\asde-deployment\\notebooks\\edge\\deploy-sql-edge-remote.ipynb",
-			"_justification": "Deployment Notebook - usernames/passwords are entered by user"
 		}
 	]
 }
```
````diff
@@ -1,14 +1,14 @@
 # Code - OSS Development Container
 
-This repository includes configuration for a development container for working with Code - OSS in a local container or using [GitHub Codespaces](https://github.com/features/codespaces).
+This repository includes configuration for a development container for working with Code - OSS in an isolated local container or using [GitHub Codespaces](https://github.com/features/codespaces).
 
-> **Tip:** The default VNC password is `vscode`. The VNC server runs on port `5901` and a web client is available on port `6080`.
+> **Tip:** The default VNC password is `vscode`. The VNC server runs on port `5901` with a web client at `6080`. For better performance, we recommend using a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Applications like the macOS Screen Sharing app will not perform as well.
 
 ## Quick start - local
 
 1. Install Docker Desktop or Docker for Linux on your local machine. (See [docs](https://aka.ms/vscode-remote/containers/getting-started) for additional details.)
 
-2. **Important**: Docker needs at least **4 Cores and 6 GB of RAM (8 GB recommended)** to run a full build. If you are on macOS, or are using the old Hyper-V engine for Windows, update these values for Docker Desktop by right-clicking on the Docker status bar item and going to **Preferences/Settings > Resources > Advanced**.
+2. **Important**: Docker needs at least **4 Cores and 6 GB of RAM (8 GB recommended)** to run full build. If you on macOS, or using the old Hyper-V engine for Windows, update these values for Docker Desktop by right-clicking on the Docker status bar item, going to **Preferences/Settings > Resources > Advanced**.
 
 > **Note:** The [Resource Monitor](https://marketplace.visualstudio.com/items?itemName=mutantdino.resourcemonitor) extension is included in the container so you can keep an eye on CPU/Memory in the status bar.
 
@@ -16,56 +16,53 @@ This repository includes configuration for a development container for working w
 
 
 
-> **Note:** The Remote - Containers extension requires the Visual Studio Code distribution of Code - OSS. See the [FAQ](https://aka.ms/vscode-remote/faq/license) for details.
+> Note that the Remote - Containers extension requires the Visual Studio Code distribution of Code - OSS. See the [FAQ](https://aka.ms/vscode-remote/faq/license) for details.
 
-4. Press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> or <kbd>F1</kbd> and select **Remote-Containers: Clone Repository in Container Volume...**.
+4. Press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and select **Remote - Containers: Open Repository in Container...**.
 
-> **Tip:** While you can use your local source tree instead, operations like `yarn install` can be slow on macOS or when using the Hyper-V engine on Windows. We recommend the "clone repository in container" approach instead since it uses "named volume" rather than the local filesystem.
+> **Tip:** While you can use your local source tree instead, operations like `yarn install` can be slow on macOS or using the Hyper-V engine on Windows. We recommend the "open repository" approach instead since it uses "named volume" rather than the local filesystem.
 
 5. Type `https://github.com/microsoft/vscode` (or a branch or PR URL) in the input box and press <kbd>Enter</kbd>.
 
-6. After the container is running, open a web browser and go to [http://localhost:6080](http://localhost:6080), or use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
+6. After the container is running, open a web browser and go to [http://localhost:6080](http://localhost:6080) or use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
 
-Anything you start in VS Code, or the integrated terminal, will appear here.
+Anything you start in VS Code or the integrated terminal will appear here.
 
 Next: **[Try it out!](#try-it)**
 
 ## Quick start - GitHub Codespaces
 
-1. From the [microsoft/vscode GitHub repository](https://github.com/microsoft/vscode), click on the **Code** dropdown, select **Open with Codespaces**, and then click on **New codespace**. If prompted, select the **Standard** machine size (which is also the default).
-
-> **Note:** You will not see these options within GitHub if you are not in the Codespaces beta.
-
-2. After the codespace is up and running in your browser, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> or <kbd>F1</kbd> and select **Ports: Focus on Ports View**.
-
-3. You should see **VNC web client (6080)** under in the list of ports. Select the line and click on the globe icon to open it in a browser tab.
-
-> **Tip:** If you do not see the port, <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> or <kbd>F1</kbd>, select **Forward a Port** and enter port `6080`.
+> **IMPORTANT:** You need to use a "Standard" sized codespace or larger (4-core, 8GB) since VS Code needs 6GB of RAM to compile. This is now the default for GitHub Codespaces, but do not downgrade to "Basic" unless you do not intend to compile.
+
+1. From the [microsoft/vscode GitHub repository](https://github.com/microsoft/vscode), click on the **Code** dropdown, select **Open with Codespaces**, and the **New codespace**
+
+> Note that you will not see these options if you are not in the beta yet.
+
+2. After the codespace is up and running in your browser, press <kbd>F1</kbd> and select **Ports: Focus on Ports View**.
+
+3. You should see port `6080` under **Forwarded Ports**. Select the line and click on the globe icon to open it in a browser tab.
+
+> If you do not see port `6080`, press <kbd>F1</kbd>, select **Forward a Port** and enter port `6080`.
 
 4. In the new tab, you should see noVNC. Click **Connect** and enter `vscode` as the password.
 
-Anything you start in VS Code, or the integrated terminal, will appear here.
+Anything you start in VS Code or the integrated terminal will appear here.
 
 Next: **[Try it out!](#try-it)**
 
 ### Using VS Code with GitHub Codespaces
 
-You may see improved VNC responsiveness when accessing a codespace from VS Code client since you can use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Here's how to do it.
+You will likely see better performance when accessing the codespace you created from VS Code since you can use a[VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Here's how to do it.
 
-1. Install [Visual Studio Code Stable](https://code.visualstudio.com/) or [Insiders](https://code.visualstudio.com/insiders/) and the the [GitHub Codespaces extension](https://marketplace.visualstudio.com/items?itemName=GitHub.codespaces).
+1. [Create a codespace](#quick-start---github-codespaces) if you have not already.
 
-> **Note:** The GitHub Codespaces extension requires the Visual Studio Code distribution of Code - OSS.
+2. Set up [VS Code for use with GitHub Codespaces](https://docs.github.com/github/developing-online-with-codespaces/using-codespaces-in-visual-studio-code)
 
-2. After the VS Code is up and running, press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> or <kbd>F1</kbd>, choose **Codespaces: Create New Codespace**, and use the following settings:
-   - `microsoft/vscode` for the repository.
-   - Select any branch (e.g. **main**) - you select a different one later.
-   - Choose **Standard** (4-core, 8GB) as the size.
+3. After the VS Code is up and running, press <kbd>F1</kbd>, choose **Codespaces: Connect to Codespace**, and select the codespace you created.
 
-4. After you have connected to the codespace, you can use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
+4. After you've connected to the codespace, use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
 
-> **Tip:** You may also need change your VNC client's **Picture Quaility** setting to **High** to get a full color desktop.
-
-5. Anything you start in VS Code, or the integrated terminal, will appear here.
+5. Anything you start in VS Code or the integrated terminal will appear here.
 
 Next: **[Try it out!](#try-it)**
 
@@ -73,18 +70,20 @@ Next: **[Try it out!](#try-it)**
 
 This container uses the [Fluxbox](http://fluxbox.org/) window manager to keep things lean. **Right-click on the desktop** to see menu options. It works with GNOME and GTK applications, so other tools can be installed if needed.
 
-> **Note:** You can also set the resolution from the command line by typing `set-resolution`.
+Note you can also set the resolution from the command line by typing `set-resolution`.
 
 To start working with Code - OSS, follow these steps:
 
-1. In your local VS Code client, open a terminal (<kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>\`</kbd>) and type the following commands:
+1. In your local VS Code, open a terminal (<kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>\`</kbd>) and type the following commands:
 
 	```bash
 	yarn install
 	bash scripts/code.sh
 	```
 
-2. After the build is complete, open a web browser or a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to the desktop environment as described in the quick start and enter `vscode` as the password.
+Note that a previous run of `yarn install` will already be cached, so this step should simply pick up any recent differences.
+
+2. After the build is complete, open a web browser or a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to the desktop environnement as described in the quick start and enter `vscode` as the password.
 
 3. You should now see Code - OSS!
 
@@ -92,7 +91,7 @@ Next, let's try debugging.
 
 1. Shut down Code - OSS by clicking the box in the upper right corner of the Code - OSS window through your browser or VNC viewer.
 
-2. Go to your local VS Code client, and use the **Run / Debug** view to launch the **VS Code** configuration. (Typically the default, so you can likely just press <kbd>F5</kbd>).
+2. Go to your local VS Code client, and use Run / Debug view to launch the **VS Code** configuration. (Typically the default, so you can likely just press <kbd>F5</kbd>).
 
 > **Note:** If launching times out, you can increase the value of `timeout` in the "VS Code", "Attach Main Process", "Attach Extension Host", and "Attach to Shared Process" configurations in [launch.json](../.vscode/launch.json). However, running `scripts/code.sh` first will set up Electron which will usually solve timeout issues.
 
````
```diff
@@ -3,26 +3,20 @@
 
 	// Image contents: https://github.com/microsoft/vscode-dev-containers/blob/master/repository-containers/images/github.com/microsoft/vscode/.devcontainer/base.Dockerfile
 	"image": "mcr.microsoft.com/vscode/devcontainers/repos/microsoft/vscode:branch-main",
 
+	"workspaceMount": "source=${localWorkspaceFolder},target=/home/node/workspace/vscode,type=bind,consistency=cached",
+	"workspaceFolder": "/home/node/workspace/vscode",
 	"overrideCommand": false,
 	"runArgs": [ "--init", "--security-opt", "seccomp=unconfined"],
 
 	"settings": {
+		"terminal.integrated.shell.linux": "/bin/bash",
 		"resmon.show.battery": false,
 		"resmon.show.cpufreq": false
 	},
 
-	// noVNC, VNC
-	"forwardPorts": [6080, 5901],
-	"portsAttributes": {
-		"6080": {
-			"label": "VNC web client (noVNC)",
-			"onAutoForward": "silent"
-		},
-		"5901": {
-			"label": "VNC TCP port",
-			"onAutoForward": "silent"
-		}
-	},
+	// noVNC, VNC, debug ports
+	"forwardPorts": [6080, 5901, 9222],
 
 	"extensions": [
 		"dbaeumer.vscode-eslint",
```
```diff
@@ -12,8 +12,6 @@
 **/vscode-api-tests/testWorkspace2/**
 **/extensions/**/out/**
 **/extensions/**/build/**
-**/big-data-cluster/src/bigDataCluster/controller/apiGenerated.ts
-**/big-data-cluster/src/bigDataCluster/controller/clusterApiGenerated2.ts
 **/extensions/markdown-language-features/media/**
 **/extensions/markdown-language-features/notebook-out/**
 **/extensions/typescript-basics/test/colorize-fixtures/**
```
```diff
@@ -104,7 +104,6 @@
 			"restrictions": [
 				"assert",
 				"sinon",
-				"sinon-test",
 				"vs/nls",
 				"**/{vs,sql}/base/common/**",
 				"**/{vs,sql}/base/test/common/**"
@@ -142,7 +141,6 @@
 			"restrictions": [
 				"assert",
 				"sinon",
-				"sinon-test",
 				"vs/nls",
 				"**/{vs,sql}/base/{common,browser}/**",
 				"**/{vs,sql}/base/test/{common,browser}/**",
@@ -222,7 +220,6 @@
 				"assert",
 				"typemoq",
 				"sinon",
-				"sinon-test",
 				"vs/nls",
 				"azdata",
 				"**/{vs,sql}/base/common/**",
@@ -295,7 +292,6 @@
 				"typemoq",
 				"sinon",
 				"azdata",
-				"sinon-test",
 				"vs/nls",
 				"**/{vs,sql}/base/{common,browser}/**",
 				"**/{vs,sql}/base/test/{common,browser}/**",
@@ -319,7 +315,6 @@
 			"restrictions": [
 				"assert",
 				"sinon",
-				"sinon-test",
 				"vs/nls",
 				"**/{vs,sql}/base/common/**",
 				"**/{vs,sql}/platform/*/common/**",
@@ -343,7 +338,6 @@
 			"restrictions": [
 				"assert",
 				"sinon",
-				"sinon-test",
 				"vs/nls",
 				"**/{vs,sql}/base/{common,browser}/**",
 				"**/{vs,sql}/platform/*/{common,browser}/**",
@@ -367,7 +361,6 @@
 			"restrictions": [
 				"assert",
 				"sinon",
-				"sinon-test",
 				"vs/nls",
 				"**/{vs,sql}/base/common/**",
 				"**/{vs,sql}/platform/*/common/**",
@@ -394,7 +387,6 @@
 			"restrictions": [
 				"assert",
 				"sinon",
-				"sinon-test",
 				"vs/nls",
 				"**/{vs,sql}/base/{common,browser}/**",
 				"**/{vs,sql}/platform/*/{common,browser}/**",
@@ -409,7 +401,6 @@
 			"restrictions": [
 				"assert",
 				"sinon",
-				"sinon-test",
 				"vs/nls",
 				"**/{vs,sql}/base/{common,browser}/**",
 				"**/{vs,sql}/base/test/{common,browser}/**",
@@ -532,7 +523,7 @@
 				"**/{vs,sql}/platform/**",
 				"**/{vs,sql}/editor/**",
 				"**/{vs,sql}/workbench/{common,browser,node,electron-sandbox,electron-browser}/**",
-				"vs/workbench/contrib/files/browser/editors/fileEditorInput",
+				"vs/workbench/contrib/files/common/editors/fileEditorInput",
 				"**/{vs,sql}/workbench/services/**",
 				"**/{vs,sql}/workbench/test/**",
 				"*" // node modules
@@ -590,9 +581,7 @@
 				"iconv-lite-umd",
 				"jschardet",
 				"@angular/*",
-				"rxjs/**",
-				"sanitize-html",
-				"ansi_up"
+				"rxjs/**"
 			]
 		},
 		{
@@ -749,12 +738,12 @@
 				"rxjs/**",
 				"ng2-charts",
 				"chart.js",
-				"plotly.js",
+				"plotly.js-dist-min",
 				"angular2-grid",
-				"kburtram-query-plan",
-				"html-to-image",
+				"html-query-plan",
 				"turndown",
 				"gridstack",
+				"gridstack/**",
 				"mark.js",
 				"vscode-textmate",
 				"vscode-oniguruma",
@@ -967,7 +956,6 @@
 				"**/{vs,sql}/**",
 				"assert",
 				"sinon",
-				"sinon-test",
 				"crypto",
 				"vscode",
 				"typemoq",
@@ -1003,7 +991,6 @@
 				"assert",
 				"typemoq",
 				"sinon",
-				"sinon-test",
 				"crypto",
 				"xterm*",
 				"azdata"
@@ -1016,7 +1003,6 @@
 				"assert",
 				"typemoq",
 				"sinon",
-				"sinon-test",
 				"crypto",
 				"xterm*"
 			]
@@ -1054,7 +1040,6 @@
 			"vscode-dts-cancellation": "warn",
 			"vscode-dts-use-thenable": "warn",
 			"vscode-dts-region-comments": "warn",
-			"vscode-dts-vscode-in-comments": "warn",
 			"vscode-dts-provider-naming": [
 				"warn",
 				{
```
```diff
@@ -10,7 +10,6 @@
 		"jsdoc"
 	],
 	"rules": {
-		"no-cond-assign": 2,
 		"@typescript-eslint/no-floating-promises": [
 			"error",
 			{
```
**.github/CODEOWNERS** (8 changes, vendored)

```diff
@@ -3,11 +3,10 @@
 # Syntax can be found here: https://docs.github.com/free-pro-team@latest/github/creating-cloning-and-archiving-repositories/about-code-owners#codeowners-syntax
 
 /extensions/admin-tool-ext-win @Charles-Gagnon
-/extensions/arc/ @Charles-Gagnon @swells @candiceye
-/extensions/azcli/ @Charles-Gagnon @swells @candiceye
+/extensions/arc/ @Charles-Gagnon
+/extensions/azdata/ @Charles-Gagnon
 /extensions/big-data-cluster/ @Charles-Gagnon
 /extensions/dacpac/ @kisantia
-/extensions/notebook @azure-data-studio-notebook-devs
 /extensions/query-history/ @Charles-Gagnon
 /extensions/resource-deployment/ @Charles-Gagnon
 /extensions/schema-compare/ @kisantia
@@ -15,6 +14,3 @@
 /extensions/mssql/config.json @Charles-Gagnon @alanrenmsft @kburtram
 
 /src/sql/*.d.ts @alanrenmsft @Charles-Gagnon
-/src/sql/workbench/browser/modelComponents @Charles-Gagnon @alanrenmsft
-/src/sql/workbench/api @Charles-Gagnon @alanrenmsft
-/src/sql/**/notebook @azure-data-studio-notebook-devs
```
**.github/label-actions.yml** (9 changes, vendored)

```diff
@@ -1,4 +1,3 @@
-# actions for Needs Logs label
 Needs Logs:
   comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.
 
@@ -26,11 +25,3 @@ There are two types of logs to collect:
 - Run the command: **`Developer: Open Logs Folder`**
 
 - This will open the log folder locally. Please zip up this folder and attach it to the issue."
-
-# actions for Out of Scope label
-Out of Scope:
-  comment: "Thank you for opening this suggestion! This enhancement is not planned in our
-medium-term roadmap. The issue is being closed to reduce active issues to focus on
-enhancements that are being considered for an upcoming release. We will review closed issues
-with the 'Out of Scope' label when doing long-term planning."
-  close: true
```
**.github/subscribers.json** (3 changes, vendored)

```diff
@@ -4,7 +4,6 @@
 	"rchiodo",
 	"greazer",
 	"donjayamanne",
-	"jilljac",
-	"IanMatthewHuff"
+	"jilljac"
 ]
 }
```
**.github/workflows/ci.yml** (12 changes, vendored)

```diff
@@ -204,14 +204,6 @@ jobs:
       - name: Compile and Download
         run: yarn npm-run-all --max_old_space_size=4095 -lp compile "electron x64" playwright-install download-builtin-extensions
 
-      # This is required for keytar unittests, otherwise we hit
-      # https://github.com/atom/node-keytar/issues/76
-      - name: Create temporary keychain
-        run: |
-          security create-keychain -p pwd $RUNNER_TEMP/buildagent.keychain
-          security default-keychain -s $RUNNER_TEMP/buildagent.keychain
-          security unlock-keychain -p pwd $RUNNER_TEMP/buildagent.keychain
-
       - name: Run Unit Tests (Electron)
         run: DISPLAY=:10 ./scripts/test.sh
 
@@ -243,6 +235,7 @@ jobs:
         with:
           path: "**/node_modules"
           key: ${{ runner.os }}-cacheNodeModules13-${{ steps.nodeModulesCacheKey.outputs.value }}
+          restore-keys: ${{ runner.os }}-cacheNodeModules13-
       - name: Get yarn cache directory path
         id: yarnCacheDirPath
         if: ${{ steps.cacheNodeModules.outputs.cache-hit != 'true' }}
@@ -278,9 +271,6 @@ jobs:
       # - name: Run Monaco Editor Checks {{SQL CARBON EDIT}} Remove Monaco checks
       #   run: yarn monaco-compile-check
 
-      - name: Compile /build/
-        run: yarn --cwd build compile
-
       - name: Run Trusted Types Checks
         run: yarn tsec-compile-check
 
```
**.lgtm/javascript-queries/promises.ql** (33 lines, new file)

```diff
@@ -0,0 +1,33 @@
+/**
+ * @name No floating promises
+ * @kind problem
+ * @problem.severity error
+ * @id js/experimental/floating-promise
+ */
+import javascript
+
+private predicate isEscapingPromise(PromiseDefinition promise) {
+  exists (DataFlow::Node escape | promise.flowsTo(escape) |
+    escape = any(DataFlow::InvokeNode invk).getAnArgument()
+    or
+    escape = any(DataFlow::FunctionNode fun).getAReturn()
+    or
+    escape = any(ThrowStmt t).getExpr().flow()
+    or
+    escape = any(GlobalVariable v).getAnAssignedExpr().flow()
+    or
+    escape = any(DataFlow::PropWrite write).getRhs()
+    or
+    exists(WithStmt with, Assignment assign |
+      with.mayAffect(assign.getLhs()) and
+      assign.getRhs().flow() = escape
+    )
+  )
+}
+
+from PromiseDefinition promise
+where
+  not exists(promise.getAMethodCall(any(string m | m = "then" or m = "catch" or m = "finally"))) and
+  not exists (AwaitExpr e | promise.flowsTo(e.getOperand().flow())) and
+  not isEscapingPromise(promise)
+select promise, "This promise appears to be a floating promise"
```
556
.vscode/launch.json
vendored
556
.vscode/launch.json
vendored
@@ -4,25 +4,214 @@
     {
       "type": "node",
       "request": "launch",
-      "name": "Launch Azure Data Studio",
-      "runtimeExecutable": "${workspaceFolder}/scripts/sql.sh",
-      "windows": {
-        "runtimeExecutable": "${workspaceFolder}/scripts/sql.bat",
-      },
-      "runtimeArgs": [
-        "--no-cached-data"
+      "name": "Gulp Build",
+      "program": "${workspaceFolder}/node_modules/gulp/bin/gulp.js",
+      "stopOnEntry": true,
+      "args": [
+        "hygiene"
+      ]
+    },
+    {
+      "type": "node",
+      "request": "attach",
+      "restart": true,
+      "name": "Attach to Extension Host",
+      "timeout": 30000,
+      "port": 5870,
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js",
+        "${workspaceFolder}/extensions/*/out/**/*.js"
+      ]
+    },
+    {
+      "type": "pwa-chrome",
+      "request": "attach",
+      "name": "Attach to Shared Process",
+      "timeout": 30000,
+      "port": 9222,
+      "urlFilter": "*sharedProcess.html*",
+      "presentation": {
+        "hidden": true
+      }
+    },
+    {
+      "type": "node",
+      "request": "attach",
+      "name": "Attach to Search Process",
+      "port": 5876,
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ]
+    },
+    {
+      "type": "node",
+      "request": "attach",
+      "name": "Attach to CLI Process",
+      "port": 5874,
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ]
+    },
+    {
+      "type": "node",
+      "request": "attach",
+      "name": "Attach to Main Process",
+      "timeout": 30000,
+      "port": 5875,
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "hidden": true,
+      }
+    },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code Emmet Tests",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        "${workspaceFolder}/extensions/emmet/test-fixtures",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/emmet",
+        "--extensionTestsPath=${workspaceFolder}/extensions/emmet/out/test"
       ],
       "outFiles": [
         "${workspaceFolder}/out/**/*.js"
       ],
       "presentation": {
-        "group": "0_ads"
+        "group": "5_tests",
+        "order": 6
+      }
+    },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code Git Tests",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        "/tmp/my4g9l",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/git",
+        "--extensionTestsPath=${workspaceFolder}/extensions/git/out/test"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/extensions/git/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests",
+        "order": 6
+      }
+    },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code API Tests (single folder)",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        // "${workspaceFolder}", // Uncomment for running out of sources.
+        "${workspaceFolder}/extensions/vscode-api-tests/testWorkspace",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-api-tests",
+        "--extensionTestsPath=${workspaceFolder}/extensions/vscode-api-tests/out/singlefolder-tests",
+        "--disable-extensions"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests",
+        "order": 3
+      }
+    },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code API Tests (workspace)",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        "${workspaceFolder}/extensions/vscode-api-tests/testworkspace.code-workspace",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-api-tests",
+        "--extensionTestsPath=${workspaceFolder}/extensions/vscode-api-tests/out/workspace-tests"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests",
+        "order": 4
+      }
+    },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code Tokenizer Tests",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        "${workspaceFolder}/extensions/vscode-colorize-tests/test",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-colorize-tests",
+        "--extensionTestsPath=${workspaceFolder}/extensions/vscode-colorize-tests/out"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests",
+        "order": 5
+      }
+    },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code Notebook Tests",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        "${workspaceFolder}/extensions/vscode-notebook-tests/test",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-notebook-tests",
+        "--extensionTestsPath=${workspaceFolder}/extensions/vscode-notebook-tests/out"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests",
+        "order": 6
+      }
+    },
+    {
+      "type": "extensionHost",
+      "request": "launch",
+      "name": "VS Code Custom Editor Tests",
+      "runtimeExecutable": "${execPath}",
+      "args": [
+        "${workspaceFolder}/extensions/vscode-custom-editor-tests/test-workspace",
+        "--extensionDevelopmentPath=${workspaceFolder}/extensions/vscode-custom-editor-tests",
+        "--extensionTestsPath=${workspaceFolder}/extensions/vscode-custom-editor-tests/out/test"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests",
+        "order": 6
       }
     },
+    {
+      "type": "pwa-chrome",
+      "request": "attach",
+      "name": "Attach to azuredatastudio",
+      "browserAttachLocation": "workspace",
+      "port": 9222,
+      "trace": true,
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "resolveSourceMapLocations": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "perScriptSourcemaps": "yes"
+    },
     {
       "type": "pwa-chrome",
       "request": "launch",
-      "name": "Launch ADS & Debug Renderer",
+      "name": "Launch azuredatastudio",
       "windows": {
         "runtimeExecutable": "${workspaceFolder}/scripts/sql.bat"
       },
@@ -33,7 +222,7 @@
         "runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
       },
       "port": 9222,
-      "timeout": 30000,
+      "timeout": 20000,
       "env": {
         "VSCODE_EXTHOST_WILL_SEND_SOCKET": null,
         "VSCODE_SKIP_PRELAUNCH": "1"
@@ -53,89 +242,97 @@
       "outFiles": [
         "${workspaceFolder}/out/**/*.js"
       ],
-      "browserLaunchLocation": "workspace",
-      "presentation": {
-        "group": "1_debug",
-        "order": 2
-      }
+      "browserLaunchLocation": "workspace"
     },
     {
       "type": "node",
-      "request": "attach",
-      "restart": true,
-      "name": "Attach to Extension Host",
-      "timeout": 30000,
-      "port": 5870,
-      "outFiles": [
-        "${workspaceFolder}/out/**/*.js",
-        "${workspaceFolder}/extensions/*/out/**/*.js"
-      ],
+      "request": "launch",
+      "name": "Launch ADS (Web) (TBD)",
+      "program": "${workspaceFolder}/resources/web/code-web.js",
       "presentation": {
-        "group": "2_attach"
-      }
-    },
-    {
-      "type": "pwa-chrome",
-      "request": "attach",
-      "name": "Attach to Shared Process",
-      "timeout": 30000,
-      "port": 9222,
-      "urlFilter": "*sharedProcess.html*",
-      "presentation": {
-        "group": "2_attach"
-      }
-    },
-    {
-      "type": "node",
-      "request": "attach",
-      "name": "Attach to Main Process",
-      "timeout": 30000,
-      "port": 5875,
-      "outFiles": [
-        "${workspaceFolder}/out/**/*.js"
-      ],
-      "presentation": {
-        "group": "2_attach"
-      }
-    },
-    {
-      "type": "pwa-chrome",
-      "request": "attach",
-      "name": "Attach to Renderer",
-      "browserAttachLocation": "workspace",
-      "port": 9222,
-      "timeout": 30000,
-      "trace": true,
-      "outFiles": [
-        "${workspaceFolder}/out/**/*.js"
-      ],
-      "resolveSourceMapLocations": [
-        "${workspaceFolder}/out/**/*.js"
-      ],
-      "perScriptSourcemaps": "yes",
-      "presentation": {
-        "group": "2_attach",
+        "group": "0_vscode",
         "order": 2
       }
     },
     {
       "type": "node",
       "request": "launch",
-      "name": "Run Smoke Tests",
+      "name": "Main Process",
+      "runtimeExecutable": "${workspaceFolder}/scripts/code.sh",
+      "windows": {
+        "runtimeExecutable": "${workspaceFolder}/scripts/code.bat",
+      },
+      "runtimeArgs": [
+        "--no-cached-data"
+      ],
+      "outFiles": [
+        "${workspaceFolder}/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "1_vscode",
+        "order": 1
+      }
+    },
+    {
+      "type": "pwa-chrome",
+      "request": "launch",
+      "outFiles": [],
+      "perScriptSourcemaps": "yes",
+      "name": "VS Code (Web, Chrome)",
+      "url": "http://localhost:8080",
+      "preLaunchTask": "Run web",
+      "presentation": {
+        "group": "0_vscode",
+        "order": 3
+      }
+    },
+    {
+      "type": "pwa-msedge",
+      "request": "launch",
+      "outFiles": [],
+      "perScriptSourcemaps": "yes",
+      "name": "VS Code (Web, Edge)",
+      "url": "http://localhost:8080",
+      "pauseForSourceMap": false,
+      "preLaunchTask": "Run web",
+      "presentation": {
+        "group": "0_vscode",
+        "order": 3
+      }
+    },
+    {
+      "type": "node",
+      "request": "launch",
+      "name": "Git Unit Tests",
+      "program": "${workspaceFolder}/extensions/git/node_modules/mocha/bin/_mocha",
+      "stopOnEntry": false,
+      "cwd": "${workspaceFolder}/extensions/git",
+      "outFiles": [
+        "${workspaceFolder}/extensions/git/out/**/*.js"
+      ],
+      "presentation": {
+        "group": "5_tests",
+        "order": 10
+      }
+    },
+    {
+      "type": "node",
+      "request": "launch",
+      "name": "Launch Smoke Test",
       "program": "${workspaceFolder}/test/smoke/test/index.js",
       "cwd": "${workspaceFolder}/test/smoke",
       "env": {
         "BUILD_ARTIFACTSTAGINGDIRECTORY": "${workspaceFolder}"
       },
       "presentation": {
-        "group": "3_tests",
-        "order": 5
+        "group": "5_tests",
+        "order": 8
       }
     },
     {
       "type": "pwa-node",
       "request": "launch",
-      "name": "Run Core Unit Tests",
+      "name": "Run Unit Tests",
       "program": "${workspaceFolder}/test/unit/electron/index.js",
       "runtimeExecutable": "${workspaceFolder}/.build/electron/Azure Data Studio.app/Contents/MacOS/Electron",
       "windows": {
@@ -153,20 +350,19 @@
         "${workspaceFolder}/out/**/*.js"
       ],
       "cascadeTerminateToConfigurations": [
-        "Attach to Renderer"
+        "Attach to azuredatastudio"
       ],
       "env": {
         "MOCHA_COLORS": "true"
       },
       "presentation": {
-        "group": "3_tests",
-        "order": 1
+        "hidden": true
       }
     },
     {
       "type": "pwa-node",
       "request": "launch",
-      "name": "Run Core Unit Tests (Current File)",
+      "name": "Run Unit Tests For Current File",
       "program": "${workspaceFolder}/test/unit/electron/index.js",
       "runtimeExecutable": "${workspaceFolder}/.build/electron/Azure Data Studio.app/Contents/MacOS/Electron",
       "windows": {
@@ -176,7 +372,7 @@
         "runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio"
       },
       "cascadeTerminateToConfigurations": [
-        "Attach to Renderer"
+        "Attach to azuredatastudio"
       ],
       "outputCapture": "std",
       "args": [
@@ -190,10 +386,6 @@
       ],
       "env": {
         "MOCHA_COLORS": "true"
-      },
-      "presentation": {
-        "group": "3_tests",
-        "order": 2
       }
     },
     {
@@ -210,11 +402,7 @@
         "runtimeExecutable": "${workspaceFolder}/scripts/test-extensions-unit.sh"
       },
       "webRoot": "${workspaceFolder}",
-      "timeout": 45000,
-      "presentation": {
-        "group": "3_tests",
-        "order": 3
-      }
+      "timeout": 45000
     },
     {
       "type": "chrome",
@@ -230,164 +418,82 @@
         "runtimeExecutable": "${workspaceFolder}/scripts/sql-test-integration.sh"
       },
       "webRoot": "${workspaceFolder}",
-      "timeout": 45000,
-      "presentation": {
-        "group": "3_tests",
-        "order": 4
-      }
+      "timeout": 45000
     },
-    {
-      "type": "node",
-      "request": "launch",
-      "name": "Launch Azure Data Studio (Web) (TBD)",
-      "program": "${workspaceFolder}/resources/web/code-web.js",
-      "presentation": {
-        "group": "4_web"
-      }
-    },
-    {
-      "type": "pwa-chrome",
-      "request": "launch",
-      "outFiles": [],
-      "perScriptSourcemaps": "yes",
-      "name": "Launch Azure Data Studio (Web, Chrome)",
-      "url": "http://localhost:8080",
-      "preLaunchTask": "Run web",
-      "presentation": {
-        "group": "4_web"
-      }
-    },
-    {
-      "type": "pwa-msedge",
-      "request": "launch",
-      "outFiles": [],
-      "perScriptSourcemaps": "yes",
-      "name": "Launch Azure Data Studio (Web, Edge)",
-      "url": "http://localhost:8080",
-      "pauseForSourceMap": false,
-      "preLaunchTask": "Run web",
-      "presentation": {
-        "group": "4_web"
-      }
-    },
-    {
-      "name": "Run Sample Resource Deployment Extension",
-      "type": "sqlopsExtensionHost",
-      "request": "launch",
-      "windows": {
-        "runtimeExecutable": "${workspaceFolder}/scripts/sql.bat"
-      },
-      "osx": {
-        "runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
-      },
-      "linux": {
-        "runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
-      },
-      "args": [
-        "--extensionDevelopmentPath=${workspaceRoot}/samples/sample-resource-deployment"
-      ],
-      "outFiles": [
-        "${workspaceRoot}/samples/sample-resource-deployment/out/**/*.js"
-      ],
-      "preLaunchTask": "Watch sample-resource-deployment",
-      "presentation": {
-        "group": "5_samples"
-      },
-      "timeout": 30000
-    }
   ],
   "compounds": [
     {
-      "name": "Launch ADS & Debug Renderer and Extension Host",
+      "name": "Debug Unit Tests",
       "configurations": [
-        "Launch ADS & Debug Renderer",
-        "Attach to Extension Host"
-      ],
-      "presentation": {
-        "group": "1_debug",
-        "order": 1
-      }
-    },
-    {
-      "name": "Launch ADS & Debug Extension Host",
-      "configurations": [
-        "Launch Azure Data Studio",
-        "Attach to Extension Host"
-      ],
-      "presentation": {
-        "group": "1_debug",
-        "order": 3
-      }
-    },
-    {
-      "name": "Launch ADS & Debug Main, Renderer and Extension Host",
-      "configurations": [
-        "Launch ADS & Debug Renderer",
-        "Attach to Main Process",
-        "Attach to Extension Host"
-      ],
-      "presentation": {
-        "group": "1_debug",
-        "order": 4
-      }
-    },
-    {
-      "name": "Launch ADS & Debug All",
-      "stopAll": true,
-      "configurations": [
-        "Launch Azure Data Studio",
-        "Attach to Main Process",
-        "Attach to Extension Host",
-        "Attach to Shared Process",
-        "Attach to Renderer"
-      ],
-      "preLaunchTask": "Ensure Prelaunch Dependencies",
-      "presentation": {
-        "group": "1_debug",
-        "order": 5
-      }
-    },
-    {
-      "name": "Attach to Renderer and Extension Host",
-      "configurations": [
-        "Attach to Renderer",
-        "Attach to Extension Host"
-      ],
-      "presentation": {
-        "group": "2_attach",
-        "order": 1
-      }
-    },
-    {
-      "name": "Debug Core Unit Tests",
-      "configurations": [
-        "Attach to Renderer",
-        "Run Core Unit Tests"
-      ],
-      "presentation": {
-        "group": "3_tests",
-        "order": 6
-      }
+        "Attach to azuredatastudio",
+        "Run Unit Tests"
+      ]
     },
     {
       "name": "Debug Extension Unit Tests",
       "configurations": [
         "Attach to Extension Host",
         "Run Extension Unit Tests"
+      ]
+    },
+    {
+      "name": "Azure Data Studio",
+      "stopAll": true,
+      "configurations": [
+        "Launch azuredatastudio",
+        "Attach to Main Process",
+        "Attach to Extension Host",
+        "Attach to Shared Process",
       ],
+      "preLaunchTask": "Ensure Prelaunch Dependencies",
       "presentation": {
-        "group": "3_tests"
+        "group": "0_vscode",
+        "order": 1
       }
     },
     {
-      "name": "Debug Core Unit Tests (Current File)",
+      "name": "Debug azuredatastudio Main, Renderer & Extension Host",
       "configurations": [
-        "Attach to Renderer",
-        "Run Core Unit Tests (Current File)"
+        "Launch azuredatastudio",
+        "Attach to Main Process",
+        "Attach to Extension Host"
       ],
       "presentation": {
-        "group": "3_tests",
-        "order": 8
+        "group": "1_vscode",
+        "order": 3
+      }
+    },
+    {
+      "name": "Debug Renderer and Extension Host processes",
+      "configurations": [
+        "Launch azuredatastudio",
+        "Attach to Extension Host"
+      ],
+      "presentation": {
+        "group": "1_vscode",
+        "order": 2
+      }
+    },
+    {
+      "name": "Attach Renderer and Extension Host",
+      "configurations": [
+        "Attach to azuredatastudio",
+        "Attach to Extension Host"
+      ],
+      "presentation": {
+        "group": "1_vscode",
+        "order": 2
+      }
+    },
+    {
+      "name": "Debug Unit Tests (Current File)",
+      "configurations": [
+        "Attach to azuredatastudio",
+        "Run Unit Tests For Current File"
+      ],
+      "presentation": {
+        "group": "1_vscode",
+        "order": 2
       }
     }
   ]
18 changes: .vscode/notebooks/api.github-issues (vendored)
@@ -2,31 +2,37 @@
   {
     "kind": 1,
     "language": "markdown",
-    "value": "#### Config"
+    "value": "#### Config",
+    "editable": true
   },
   {
     "kind": 2,
     "language": "github-issues",
-    "value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"June 2021\""
+    "value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"April 2021\"",
+    "editable": true
   },
   {
     "kind": 1,
     "language": "markdown",
-    "value": "### Finalization"
+    "value": "### Finalization",
+    "editable": true
   },
   {
     "kind": 2,
     "language": "github-issues",
-    "value": "$repo $milestone label:api-finalization"
+    "value": "$repo $milestone label:api-finalization",
+    "editable": true
   },
   {
     "kind": 1,
     "language": "markdown",
-    "value": "### Proposals"
+    "value": "### Proposals",
+    "editable": true
   },
   {
     "kind": 2,
     "language": "github-issues",
-    "value": "$repo $milestone is:open label:api-proposal "
+    "value": "$repo $milestone is:open label:api-proposal ",
+    "editable": true
   }
 ]
2 changes: .vscode/notebooks/endgame.github-issues (vendored)
@@ -7,7 +7,7 @@
   {
     "kind": 2,
     "language": "github-issues",
-    "value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-remotehub\n\n$MILESTONE=milestone:\"May 2021\""
+    "value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-emmet-helper repo:microsoft/vscode-remotehub\n\n$MILESTONE=milestone:\"April 2021\""
   },
   {
     "kind": 1,
4 changes: .vscode/notebooks/my-endgame.github-issues (vendored)
@@ -7,7 +7,7 @@
   {
     "kind": 2,
     "language": "github-issues",
-    "value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remotehub\n\n$MILESTONE=milestone:\"May 2021\"\n\n$MINE=assignee:@me"
+    "value": "$REPOS=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-js-debug repo:microsoft/vscode-remote-release repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-settings-sync-server repo:microsoft/vscode-remotehub\n\n$MILESTONE=milestone:\"April 2021\"\n\n$MINE=assignee:@me"
   },
   {
     "kind": 1,
@@ -157,7 +157,7 @@
   {
     "kind": 2,
     "language": "github-issues",
-    "value": "$REPOS $MILESTONE -$MINE is:issue is:closed sort:updated-asc label:bug -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found -author:aeschli -author:alexdima -author:alexr00 -author:AmandaSilver -author:bamurtaugh -author:bpasero -author:btholt -author:chrisdias -author:chrmarti -author:Chuxel -author:connor4312 -author:dbaeumer -author:deepak1556 -author:devinvalenciano -author:digitarald -author:eamodio -author:egamma -author:fiveisprime -author:gregvanl -author:isidorn -author:ItalyPaleAle -author:JacksonKearl -author:joaomoreno -author:jrieken -author:kieferrm -author:lszomoru -author:meganrogge -author:misolori -author:mjbvz -author:ornellaalt -author:orta -author:rebornix -author:RMacfarlane -author:roblourens -author:rzhao271 -author:sana-ajani -author:sandy081 -author:sbatten -author:stevencl -author:Tyriar -author:weinand -author:TylerLeonhardt -author:lramos15 -author:hediet"
+    "value": "$REPOS $MILESTONE -$MINE is:issue is:closed sort:updated-asc label:bug -label:verified -label:z-author-verified -label:on-testplan -label:*duplicate -label:duplicate -label:invalid -label:*as-designed -label:error-telemetry -label:verification-steps-needed -label:verification-found -author:aeschli -author:alexdima -author:alexr00 -author:AmandaSilver -author:bamurtaugh -author:bpasero -author:btholt -author:chrisdias -author:chrmarti -author:Chuxel -author:connor4312 -author:dbaeumer -author:deepak1556 -author:devinvalenciano -author:digitarald -author:eamodio -author:egamma -author:fiveisprime -author:gregvanl -author:isidorn -author:ItalyPaleAle -author:JacksonKearl -author:joaomoreno -author:jrieken -author:kieferrm -author:lszomoru -author:meganrogge -author:misolori -author:mjbvz -author:ornellaalt -author:orta -author:rebornix -author:RMacfarlane -author:roblourens -author:rzhao271 -author:sana-ajani -author:sandy081 -author:sbatten -author:stevencl -author:Tyriar -author:weinand -author:TylerLeonhardt -author:lramos15"
   },
   {
     "kind": 1,
52 changes: .vscode/notebooks/my-work.github-issues (vendored)
@@ -2,17 +2,20 @@
   {
     "kind": 1,
     "language": "markdown",
-    "value": "##### `Config`: This should be changed every month/milestone"
+    "value": "##### `Config`: This should be changed every month/milestone",
+    "editable": true
   },
   {
     "kind": 2,
     "language": "github-issues",
-    "value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-internalbacklog\n\n// current milestone name\n$milestone=milestone:\"June 2021\""
+    "value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks repo:microsoft/vscode-internalbacklog\n\n// current milestone name\n$milestone=milestone:\"April 2021\"",
+    "editable": true
   },
   {
     "kind": 1,
-    "language": "markdown",
-    "value": "## Milestone Work"
+    "language": "github-issues",
+    "value": "## Milestone Work",
+    "editable": true
   },
   {
     "kind": 2,
@@ -22,48 +25,57 @@
   },
   {
     "kind": 1,
-    "language": "markdown",
-    "value": "## Bugs, Debt, Features..."
+    "language": "github-issues",
+    "value": "## Bugs, Debt, Features...",
+    "editable": true
   },
   {
     "kind": 1,
     "language": "markdown",
-    "value": "#### My Bugs"
+    "value": "#### My Bugs",
+    "editable": true
   },
   {
     "kind": 2,
     "language": "github-issues",
-    "value": "$repos assignee:@me is:open label:bug"
+    "value": "$repos assignee:@me is:open label:bug",
+    "editable": true
   },
   {
     "kind": 1,
     "language": "markdown",
-    "value": "#### Debt & Engineering"
+    "value": "#### Debt & Engineering",
+    "editable": true
   },
   {
     "kind": 2,
     "language": "github-issues",
-    "value": "$repos assignee:@me is:open label:debt OR $repos assignee:@me is:open label:engineering"
+    "value": "$repos assignee:@me is:open label:debt OR $repos assignee:@me is:open label:engineering",
+    "editable": true
   },
   {
     "kind": 1,
     "language": "markdown",
-    "value": "#### Performance 🐌 🔜 🏎"
+    "value": "#### Performance 🐌 🔜 🏎",
+    "editable": true
   },
   {
     "kind": 2,
     "language": "github-issues",
-    "value": "$repos assignee:@me is:open label:perf OR $repos assignee:@me is:open label:perf-startup OR $repos assignee:@me is:open label:perf-bloat OR $repos assignee:@me is:open label:freeze-slow-crash-leak"
+    "value": "$repos assignee:@me is:open label:perf OR $repos assignee:@me is:open label:perf-startup OR $repos assignee:@me is:open label:perf-bloat OR $repos assignee:@me is:open label:freeze-slow-crash-leak",
+    "editable": true
   },
   {
     "kind": 1,
     "language": "markdown",
-    "value": "#### Feature Requests"
+    "value": "#### Feature Requests",
+    "editable": true
   },
   {
     "kind": 2,
@@ -74,22 +86,26 @@
   {
     "kind": 1,
     "language": "markdown",
-    "value": "### Personal Inbox\n"
+    "value": "### Personal Inbox\n",
+    "editable": true
   },
   {
     "kind": 1,
     "language": "markdown",
-    "value": "\n#### Missing Type label"
+    "value": "\n#### Missing Type label",
+    "editable": true
   },
   {
     "kind": 2,
     "language": "github-issues",
-    "value": "$repos assignee:@me is:open type:issue -label:bug -label:\"needs more info\" -label:feature-request -label:under-discussion -label:debt -label:plan-item -label:upstream"
+    "value": "$repos assignee:@me is:open type:issue -label:bug -label:\"needs more info\" -label:feature-request -label:under-discussion -label:debt -label:plan-item -label:upstream",
+    "editable": true
   },
   {
     "kind": 1,
     "language": "markdown",
-    "value": "#### Not Actionable"
+    "value": "#### Not Actionable",
+    "editable": true
   },
   {
     "kind": 2,
.vscode/tasks.json (vendored, 65 changes)
@@ -55,11 +55,39 @@
 				}
 			}
 		},
+		{
+			"type": "npm",
+			"script": "watch-extension-mediad",
+			"label": "Ext Media - Build",
+			"isBackground": true,
+			"presentation": {
+				"reveal": "never",
+				"group": "buildWatchers"
+			},
+			"problemMatcher": {
+				"owner": "typescript",
+				"applyTo": "closedDocuments",
+				"fileLocation": [
+					"absolute"
+				],
+				"pattern": {
+					"regexp": "Error: ([^(]+)\\((\\d+|\\d+,\\d+|\\d+,\\d+,\\d+,\\d+)\\): (.*)$",
+					"file": 1,
+					"location": 2,
+					"message": 3
+				},
+				"background": {
+					"beginsPattern": "Starting compilation",
+					"endsPattern": "Finished compilation"
+				}
+			}
+		},
 		{
 			"label": "VS Code - Build",
 			"dependsOn": [
 				"Core - Build",
-				"Ext - Build"
+				"Ext - Build",
+				"Ext Media - Build",
 			],
 			"group": {
 				"kind": "build",
@@ -74,8 +102,7 @@
 			"group": "build",
 			"presentation": {
 				"reveal": "never",
-				"group": "buildKillers",
-				"close": true
+				"group": "buildKillers"
 			},
 			"problemMatcher": "$tsc"
 		},
@@ -86,8 +113,18 @@
 			"group": "build",
 			"presentation": {
 				"reveal": "never",
-				"group": "buildKillers",
-				"close": true
+				"group": "buildKillers"
+			},
+			"problemMatcher": "$tsc"
+		},
+		{
+			"type": "npm",
+			"script": "kill-watch-extension-mediad",
+			"label": "Kill Ext Media - Build",
+			"group": "build",
+			"presentation": {
+				"reveal": "never",
+				"group": "buildKillers"
 			},
 			"problemMatcher": "$tsc"
 		},
@@ -95,7 +132,8 @@
 			"label": "Kill VS Code - Build",
 			"dependsOn": [
 				"Kill Core - Build",
-				"Kill Ext - Build"
+				"Kill Ext - Build",
+				"Kill Ext Media - Build",
 			],
 			"group": "build",
 			"problemMatcher": []
@@ -214,8 +252,7 @@
 			"command": "node build/lib/preLaunch.js",
 			"label": "Ensure Prelaunch Dependencies",
 			"presentation": {
-				"reveal": "silent",
-				"close": true
+				"reveal": "silent"
 			}
 		},
 		{
@@ -231,18 +268,6 @@
 			"group": "build",
 			"label": "npm: tsec-compile-check",
 			"detail": "node_modules/tsec/bin/tsec -p src/tsconfig.json --noEmit"
-		},
-		{
-			"type": "npm",
-			"script": "watch",
-			"label": "Watch sample-resource-deployment",
-			"path": "./samples/sample-resource-deployment/package.json",
-			"problemMatcher": "$tsc-watch",
-			"isBackground": true,
-			"presentation": {
-				"reveal": "never"
-			},
-			"group": "build"
 		}
 	]
 }
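An aside on the `problemMatcher.pattern.regexp` in the "Ext Media - Build" task above: once the JSON string escaping is undone (`\\(` becomes `\(`, `\\d` becomes `\d`), it is an ordinary regular expression whose capture groups 1, 2, and 3 feed the `file`, `location`, and `message` fields declared next to it. A quick sanity check, sketched in Python with a made-up error line for illustration:

```python
import re

# The pattern from the "Ext Media - Build" problemMatcher, with the
# JSON escaping removed ("\\(" -> "\(", "\\d" -> "\d").
pattern = re.compile(r"Error: ([^(]+)\((\d+|\d+,\d+|\d+,\d+,\d+,\d+)\): (.*)$")

# A hypothetical compiler error line, just to exercise the groups.
m = pattern.search("Error: /src/vs/base/common/foo.ts(12,5): Cannot find name 'bar'.")

# Groups 1/2/3 line up with the "file": 1, "location": 2, "message": 3
# indices in the task definition; the alternation accepts "line",
# "line,column", or a full four-number range as the location.
print(m.group(1))  # /src/vs/base/common/foo.ts
print(m.group(2))  # 12,5
print(m.group(3))  # Cannot find name 'bar'.
```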
.yarnrc (2 changes)
@@ -1,3 +1,3 @@
 disturl "https://electronjs.org/headers"
-target "12.0.9"
+target "12.0.7"
 runtime "electron"
CHANGELOG.md (89 changes)
@@ -1,96 +1,11 @@
 # Change Log

-## Version 1.33.1
-* Release date: Nov 4, 2021
-* Release status: General Availability
-
-## Hotfix release
-- Fix for [#16535 Unable to See Saved Connections in Restricted Mode](https://github.com/microsoft/azuredatastudio/issues/17535)
-- Fix for [#17579 Can't type in Notebook code cell after editing text cell](https://github.com/microsoft/azuredatastudio/issues/17579)
-
-| Platform |
-| --------------------------------------- |
-| [Windows User Installer][win-user] |
-| [Windows System Installer][win-system] |
-| [Windows ZIP][win-zip] |
-| [macOS ZIP][osx-zip] |
-| [Linux TAR.GZ][linux-zip] |
-| [Linux RPM][linux-rpm] |
-| [Linux DEB][linux-deb] |
-
-[win-user]: https://go.microsoft.com/fwlink/?linkid=2176805
-[win-system]: https://go.microsoft.com/fwlink/?linkid=2175910
-[win-zip]: https://go.microsoft.com/fwlink/?linkid=2176806
-[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2176807
-[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2176505
-[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2176005
-[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2176006
-
-## Version 1.33.0
-* Release date: October 27, 2021
-* Release status: General Availability
-## What's new in this version
-* New Notebook Features:
-  * Notebook Views
-  * Split cell support
-  * Keyboard shortcuts for Markdown Toolbar Cells
-    * Ctrl/Cmd + B = Bold Text
-    * Ctrl/Cmd + I = Italicize Text
-    * Ctrl/Cmd + U = Underline Text
-    * Ctrl/Cmd + Shift + K = Add Code Block
-    * Ctrl/Cmd + Shift + H = Highlight Text
-  * Book improvements
-    * Add a new section
-    * Drag and Drop
-
-* Extension Updates:
-  * Import
-  * Langpacks
-  * Schema Compare
-  * Sql Database Projects
-
-* Bug Fixes
-  * Notebook linking improvements
-  * Horizontal Scrollbar improvement (when word wrap is off in MD Splitview / MD mode) in Notebooks
-  * Vertical Scrollbar improvement for MD Splitview in Notebooks
-
-## Version 1.32.0
-* Release date: August 18, 2021
-* Release status: General Availability
-* Extension Updates:
-  * Arc/Az CLI extensions - Azure Arc extension now uses Azure CLI instead of Azure Data CLI for deploying and interacting with Azure Arc instances
-  * Langpacks
-  * SQL Database Projects
-  * Azure Monitor
-  * Machine Learning
-
-## Version 1.31.1
-* Release date: July 29, 2021
-* Release status: General Availability
-## Hotfix Release
-- Fix for [#16436 Database Connection Toolbar Missing](https://github.com/microsoft/azuredatastudio/issues/16436)
-
-## Version 1.31.0
-* Release date: July 21, 2021
-* Release status: General Availability
-* New Notebook Features:
-  * WYSIWYG link improvements
-* Extension Updates:
-  * Import
-  * SandDance
-  * SQL Database Projects
-* Bug Fixes
-  * Accessibility bug fixes
-
 ## Version 1.30.0
 * Release date: June 17, 2021
 * Release status: General Availability
 * New Notebook Features:
   * Show book's notebook TOC title in pinned notebooks view
   * Add new book icon
   * Update Python to 3.8.10
 * Query Editor Features:
   * Added filtering/sorting feature for query result grid in query editor and notebook, the feature can be invoked from the column headers. Note that this feature is only available when you enable the preview features
@@ -99,7 +14,7 @@
 * SQL Database Projects
 * Machine Learning
 * Bug Fixes
 * Fix WYSIWYG Table cell adding new line in table cell

 ## Version 1.29.0
 * Release date: May 19, 2021
README.md (14 changes)
@@ -131,10 +131,10 @@ Copyright (c) Microsoft Corporation. All rights reserved.

 Licensed under the [Source EULA](LICENSE.txt).

-[win-user]: https://go.microsoft.com/fwlink/?linkid=2176805
-[win-system]: https://go.microsoft.com/fwlink/?linkid=2175910
-[win-zip]: https://go.microsoft.com/fwlink/?linkid=2176806
-[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2176807
-[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2176505
-[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2176005
-[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2176006
+[win-user]: https://go.microsoft.com/fwlink/?linkid=2165736
+[win-system]: https://go.microsoft.com/fwlink/?linkid=2165737
+[win-zip]: https://go.microsoft.com/fwlink/?linkid=2165838
+[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2165942
+[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2165841
+[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2165842
+[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2165738
@@ -86,125 +86,6 @@ expressly granted herein, whether by implication, estoppel or otherwise.

 Microsoft PROSE SDK: https://microsoft.github.io/prose

-atom/language-clojure version 0.22.7 (https://github.com/atom/language-clojure)
-atom/language-coffee-script version 0.49.3 (https://github.com/atom/language-coffee-script)
-atom/language-css version 0.44.4 (https://github.com/atom/language-css)
-atom/language-java version 0.32.1 (https://github.com/atom/language-java)
-atom/language-sass version 0.62.1 (https://github.com/atom/language-sass)
-atom/language-shellscript version 0.26.0 (https://github.com/atom/language-shellscript)
-atom/language-xml version 0.35.2 (https://github.com/atom/language-xml)
-better-go-syntax version 1.0.0 (https://github.com/jeff-hykin/better-go-syntax/ )
-Colorsublime-Themes version 0.1.0 (https://github.com/Colorsublime/Colorsublime-Themes)
-daaain/Handlebars version 1.8.0 (https://github.com/daaain/Handlebars)
-dart-lang/dart-syntax-highlight (https://github.com/dart-lang/dart-syntax-highlight)
-davidrios/pug-tmbundle (https://github.com/davidrios/pug-tmbundle)
-definitelytyped (https://github.com/DefinitelyTyped/DefinitelyTyped)
-demyte/language-cshtml version 0.3.0 (https://github.com/demyte/language-cshtml)
-Document Object Model version 4.0.0 (https://www.w3.org/DOM/)
-dotnet/csharp-tmLanguage version 0.1.0 (https://github.com/dotnet/csharp-tmLanguage)
-expand-abbreviation version 0.5.8 (https://github.com/emmetio/expand-abbreviation)
-fadeevab/make.tmbundle (https://github.com/fadeevab/make.tmbundle)
-freebroccolo/atom-language-swift (https://github.com/freebroccolo/atom-language-swift)
-HTML 5.1 W3C Working Draft version 08 October 2015 (http://www.w3.org/TR/2015/WD-html51-20151008/)
-Ikuyadeu/vscode-R version 1.3.0 (https://github.com/Ikuyadeu/vscode-R)
-insane version 2.6.2 (https://github.com/bevacqua/insane)
-Ionic documentation version 1.2.4 (https://github.com/ionic-team/ionic-site)
-ionide/ionide-fsgrammar (https://github.com/ionide/ionide-fsgrammar)
-jeff-hykin/cpp-textmate-grammar version 1.12.11 (https://github.com/jeff-hykin/cpp-textmate-grammar)
-jeff-hykin/cpp-textmate-grammar version 1.15.5 (https://github.com/jeff-hykin/cpp-textmate-grammar)
-js-beautify version 1.6.8 (https://github.com/beautify-web/js-beautify)
-JuliaEditorSupport/atom-language-julia version 0.21.0 (https://github.com/JuliaEditorSupport/atom-language-julia)
-Jxck/assert version 1.0.0 (https://github.com/Jxck/assert)
-language-docker (https://github.com/moby/moby)
-language-less version 0.34.2 (https://github.com/atom/language-less)
-language-php version 0.46.2 (https://github.com/atom/language-php)
-MagicStack/MagicPython version 1.1.1 (https://github.com/MagicStack/MagicPython)
-marked version 1.1.0 (https://github.com/markedjs/marked)
-mdn-data version 1.1.12 (https://github.com/mdn/data)
-microsoft/TypeScript-TmLanguage version 0.0.1 (https://github.com/microsoft/TypeScript-TmLanguage)
-microsoft/vscode-JSON.tmLanguage (https://github.com/microsoft/vscode-JSON.tmLanguage)
-microsoft/vscode-markdown-tm-grammar version 1.0.0 (https://github.com/microsoft/vscode-markdown-tm-grammar)
-microsoft/vscode-mssql version 1.9.0 (https://github.com/microsoft/vscode-mssql)
-mmims/language-batchfile version 0.7.6 (https://github.com/mmims/language-batchfile)
-NVIDIA/cuda-cpp-grammar (https://github.com/NVIDIA/cuda-cpp-grammar)
-PowerShell/EditorSyntax version 1.0.0 (https://github.com/PowerShell/EditorSyntax)
-rust-syntax version 0.4.3 (https://github.com/dustypomerleau/rust-syntax)
-seti-ui version 0.1.0 (https://github.com/jesseweed/seti-ui)
-shaders-tmLanguage version 0.1.0 (https://github.com/tgjones/shaders-tmLanguage)
-textmate/asp.vb.net.tmbundle (https://github.com/textmate/asp.vb.net.tmbundle)
-textmate/c.tmbundle (https://github.com/textmate/c.tmbundle)
-textmate/diff.tmbundle (https://github.com/textmate/diff.tmbundle)
-textmate/git.tmbundle (https://github.com/textmate/git.tmbundle)
-textmate/groovy.tmbundle (https://github.com/textmate/groovy.tmbundle)
-textmate/html.tmbundle (https://github.com/textmate/html.tmbundle)
-textmate/ini.tmbundle (https://github.com/textmate/ini.tmbundle)
-textmate/javascript.tmbundle (https://github.com/textmate/javascript.tmbundle)
-textmate/lua.tmbundle (https://github.com/textmate/lua.tmbundle)
-textmate/markdown.tmbundle (https://github.com/textmate/markdown.tmbundle)
-textmate/perl.tmbundle (https://github.com/textmate/perl.tmbundle)
-textmate/ruby.tmbundle (https://github.com/textmate/ruby.tmbundle)
-textmate/yaml.tmbundle (https://github.com/textmate/yaml.tmbundle)
-TypeScript-TmLanguage version 0.1.8 (https://github.com/microsoft/TypeScript-TmLanguage)
-TypeScript-TmLanguage version 1.0.0 (https://github.com/microsoft/TypeScript-TmLanguage)
-Unicode version 12.0.0 (https://home.unicode.org/)
-vscode-codicons version 0.0.14 (https://github.com/microsoft/vscode-codicons)
-vscode-logfile-highlighter version 2.11.0 (https://github.com/emilast/vscode-logfile-highlighter)
-vscode-swift version 0.0.1 (https://github.com/owensd/vscode-swift)
-Web Background Synchronization (https://github.com/WICG/background-sync)
-
-
-%% atom/language-clojure NOTICES AND INFORMATION BEGIN HERE
-=========================================
-Copyright (c) 2014 GitHub Inc.
-
-Permission is hereby granted, free of charge, to any person obtaining
-a copy of this software and associated documentation files (the
-"Software"), to deal in the Software without restriction, including
-without limitation the rights to use, copy, modify, merge, publish,
-distribute, sublicense, and/or sell copies of the Software, and to
-permit persons to whom the Software is furnished to do so, subject to
-the following conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
-LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
-OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
-WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-
-This package was derived from a TextMate bundle located at
-https://github.com/mmcgrana/textmate-clojure and distributed under the
-following license, located in `LICENSE.md`:
-
-The MIT License (MIT)
-
-Copyright (c) 2010- Mark McGranaghan
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-SOFTWARE.
-=========================================
-END OF atom/language-clojure NOTICES AND INFORMATION
-
 %% angular NOTICES AND INFORMATION BEGIN HERE

 Copyright (c) 2014-2017 Google, Inc. http://angular.io
@@ -679,63 +560,6 @@ THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLI
 =========================================
 END OF http-proxy-agent NOTICES AND INFORMATION

-%% dart-lang/dart-syntax-highlight NOTICES AND INFORMATION BEGIN HERE
-=========================================
-Copyright 2020, the Dart project authors.
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions are
-met:
-
-    * Redistributions of source code must retain the above copyright
-      notice, this list of conditions and the following disclaimer.
-    * Redistributions in binary form must reproduce the above
-      copyright notice, this list of conditions and the following
-      disclaimer in the documentation and/or other materials provided
-      with the distribution.
-    * Neither the name of Google LLC nor the names of its
-      contributors may be used to endorse or promote products derived
-      from this software without specific prior written permission.
-
-THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
-"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
-LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
-A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
-OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
-SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
-LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
-DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
-THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-=========================================
-END OF dart-lang/dart-syntax-highlight NOTICES AND INFORMATION
-
-%% davidrios/pug-tmbundle NOTICES AND INFORMATION BEGIN HERE
-=========================================
-The MIT License (MIT)
-
-Copyright (c) 2016 David Rios
-
-Permission is hereby granted, free of charge, to any person obtaining a copy of
-this software and associated documentation files (the "Software"), to deal in
-the Software without restriction, including without limitation the rights to
-use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
-the Software, and to permit persons to whom the Software is furnished to do so,
-subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
-FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
-COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
-IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
-CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
-=========================================
-END OF davidrios/pug-tmbundle NOTICES AND INFORMATION
-
 %% iconv-lite NOTICES AND INFORMATION BEGIN HERE
 =========================================
 Copyright (c) 2011 Alexander Shtuchkin
@@ -1662,61 +1486,6 @@ THE SOFTWARE.
 =========================================
 END OF node-pty NOTICES AND INFORMATION

-%% JuliaEditorSupport/atom-language-julia NOTICES AND INFORMATION BEGIN HERE
-=========================================
-The atom-language-julia package is licensed under the MIT "Expat" License:
-
-> Copyright (c) 2015
->
-> Permission is hereby granted, free of charge, to any person obtaining
-> a copy of this software and associated documentation files (the
-> "Software"), to deal in the Software without restriction, including
-> without limitation the rights to use, copy, modify, merge, publish,
-> distribute, sublicense, and/or sell copies of the Software, and to
-> permit persons to whom the Software is furnished to do so, subject to
-> the following conditions:
->
-> The above copyright notice and this permission notice shall be
-> included in all copies or substantial portions of the Software.
->
-> THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-> EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-> MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
-> IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
-> CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
-> TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
-> SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
-=========================================
-END OF JuliaEditorSupport/atom-language-julia NOTICES AND INFORMATION
-
-%% Jxck/assert NOTICES AND INFORMATION BEGIN HERE
-=========================================
-The MIT License (MIT)
-
-Copyright (c) 2011 Jxck
-
-Originally from node.js (http://nodejs.org)
-Copyright Joyent, Inc.
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
-=========================================
-END OF Jxck/assert NOTICES AND INFORMATION
-
 %% nsfw NOTICES AND INFORMATION BEGIN HERE
 =========================================
 The MIT License (MIT)
@@ -20,3 +20,8 @@ jobs:
     vmImage: macOS-latest
     steps:
     - template: build/azure-pipelines/darwin/continuous-build-darwin.yml

+trigger:
+  branches:
+    exclude:
+    - electron-11.x.y
@@ -1 +1 @@
-2021-11-19T02:27:18.022Z
+2021-04-07T03:52:18.011Z
@@ -17,7 +17,7 @@
   "dependencies": {
     "@actions/core": "^1.2.6",
     "@actions/github": "^2.1.1",
-    "axios": "^0.21.4",
+    "axios": "^0.21.1",
     "ts-node": "^8.6.2",
     "typescript": "^3.8.3"
   }
@@ -144,12 +144,12 @@ atob-lite@^2.0.0:
   resolved "https://registry.yarnpkg.com/atob-lite/-/atob-lite-2.0.0.tgz#0fef5ad46f1bd7a8502c65727f0367d5ee43d696"
   integrity sha1-D+9a1G8b16hQLGVyfwNn1e5D1pY=

-axios@^0.21.4:
-  version "0.21.4"
-  resolved "https://registry.yarnpkg.com/axios/-/axios-0.21.4.tgz#c67b90dc0568e5c1cf2b0b858c43ba28e2eda575"
-  integrity sha512-ut5vewkiu8jjGBdqpM44XxjuCjq9LAKeHVmoVfHVzy8eHgxxq8SbAVQNovDA8mVi05kP0Ea/n/UzcSHcTJQfNg==
+axios@^0.21.1:
+  version "0.21.1"
+  resolved "https://registry.yarnpkg.com/axios/-/axios-0.21.1.tgz#22563481962f4d6bde9a76d516ef0e5d3c09b2b8"
+  integrity sha512-dKQiRHxGD9PPRIUNIWvZhPTPpl1rf/OxTYKsqKUDjBwYylTvV7SjSHJb9ratfyzM6wCdLCOYLzs73qpg5c4iGA==
   dependencies:
-    follow-redirects "^1.14.0"
+    follow-redirects "^1.10.0"

 before-after-hook@^2.0.0:
   version "2.1.0"
@@ -207,10 +207,10 @@ execa@^1.0.0:
   signal-exit "^3.0.0"
   strip-eof "^1.0.0"

-follow-redirects@^1.14.0:
-  version "1.14.3"
-  resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.14.3.tgz#6ada78118d8d24caee595595accdc0ac6abd022e"
-  integrity sha512-3MkHxknWMUtb23apkgz/83fDoe+y+qr0TdgacGIA7bew+QLBo3vdgEN2xEsuXNivpFy4CyDhBBZnNZOtalmenw==
+follow-redirects@^1.10.0:
+  version "1.13.1"
+  resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.13.1.tgz#5f69b813376cee4fd0474a3aba835df04ab763b7"
+  integrity sha512-SSG5xmZh1mkPGyKzjZP8zLjltIfpW32Y5QpdNJyjcfGxK3qo3NDDkZOZSFiGn1A6SclQxY9GzEwAHQ3dmYRWpg==

 get-stream@^4.0.0:
   version "4.1.0"
@@ -5,101 +5,15 @@
 'use strict';
 Object.defineProperty(exports, "__esModule", { value: true });
 const fs = require("fs");
-const url = require("url");
 const crypto = require("crypto");
 const azure = require("azure-storage");
 const mime = require("mime");
 const cosmos_1 = require("@azure/cosmos");
 const retry_1 = require("./retry");
-if (process.argv.length !== 8) {
-    console.error('Usage: node createAsset.js PRODUCT OS ARCH TYPE NAME FILE');
+if (process.argv.length !== 6) {
+    console.error('Usage: node createAsset.js PLATFORM TYPE NAME FILE');
     process.exit(-1);
 }
-// Contains all of the logic for mapping details to our actual product names in CosmosDB
-function getPlatform(product, os, arch, type) {
-    switch (os) {
-        case 'win32':
-            switch (product) {
-                case 'client':
-                    const asset = arch === 'ia32' ? 'win32' : `win32-${arch}`;
-                    switch (type) {
-                        case 'archive':
-                            return `${asset}-archive`;
-                        case 'setup':
-                            return asset;
-                        case 'user-setup':
-                            return `${asset}-user`;
-                        default:
-                            throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-                    }
-                case 'server':
-                    if (arch === 'arm64') {
-                        throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-                    }
-                    return arch === 'ia32' ? 'server-win32' : `server-win32-${arch}`;
-                case 'web':
-                    if (arch === 'arm64') {
-                        throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-                    }
-                    return arch === 'ia32' ? 'server-win32-web' : `server-win32-${arch}-web`;
-                default:
-                    throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-            }
-        case 'linux':
-            switch (type) {
-                case 'snap':
-                    return `linux-snap-${arch}`;
-                case 'archive-unsigned':
-                    switch (product) {
-                        case 'client':
-                            return `linux-${arch}`;
-                        case 'server':
-                            return `server-linux-${arch}`;
-                        case 'web':
-                            return arch === 'standalone' ? 'web-standalone' : `server-linux-${arch}-web`;
-                        default:
-                            throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-                    }
-                case 'deb-package':
-                    return `linux-deb-${arch}`;
-                case 'rpm-package':
-                    return `linux-rpm-${arch}`;
-                default:
-                    throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-            }
-        case 'darwin':
-            switch (product) {
-                case 'client':
-                    if (arch === 'x64') {
-                        return 'darwin';
-                    }
-                    return `darwin-${arch}`;
-                case 'server':
-                    return 'server-darwin';
-                case 'web':
-                    if (arch !== 'x64') {
-                        throw `What should the platform be?: ${product} ${os} ${arch} ${type}`;
-                    }
-                    return 'server-darwin-web';
-                default:
-                    throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-            }
-        default:
-            throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-    }
-}
-// Contains all of the logic for mapping types to our actual types in CosmosDB
-function getRealType(type) {
-    switch (type) {
-        case 'user-setup':
-            return 'setup';
-        case 'deb-package':
-        case 'rpm-package':
-            return 'package';
-        default:
-            return type;
-    }
-}
 function hashStream(hashName, stream) {
     return new Promise((c, e) => {
         const shasum = crypto.createHash(hashName);
@@ -131,10 +45,7 @@ function getEnv(name) {
     return result;
 }
 async function main() {
-    const [, , product, os, arch, unprocessedType, fileName, filePath] = process.argv;
-    // getPlatform needs the unprocessedType
-    const platform = getPlatform(product, os, arch, unprocessedType);
-    const type = getRealType(unprocessedType);
+    const [, , platform, type, fileName, filePath] = process.argv;
     const quality = getEnv('VSCODE_QUALITY');
     const commit = getEnv('BUILD_SOURCEVERSION');
     console.log('Creating asset...');
@@ -154,27 +65,14 @@ async function main() {
         console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
         return;
     }
-    const mooncakeBlobService = azure.createBlobService(storageAccount, process.env['MOONCAKE_STORAGE_ACCESS_KEY'], `${storageAccount}.blob.core.chinacloudapi.cn`)
-        .withFilter(new azure.ExponentialRetryPolicyFilter(20));
-    // mooncake is fussy and far away, this is needed!
-    blobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
-    mooncakeBlobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
-    console.log('Uploading blobs to Azure storage and Mooncake Azure storage...');
-    await retry_1.retry(() => Promise.all([
-        uploadBlob(blobService, quality, blobName, filePath, fileName),
-        uploadBlob(mooncakeBlobService, quality, blobName, filePath, fileName)
-    ]));
+    console.log('Uploading blobs to Azure storage...');
+    await uploadBlob(blobService, quality, blobName, filePath, fileName);
     console.log('Blobs successfully uploaded.');
-    // TODO: Understand if blobName and blobPath are the same and replace blobPath with blobName if so.
-    const assetUrl = `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`;
-    const blobPath = url.parse(assetUrl).path;
-    const mooncakeUrl = `${process.env['MOONCAKE_CDN_URL']}${blobPath}`;
     const asset = {
         platform,
         type,
-        url: assetUrl,
+        url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
         hash: sha1hash,
-        mooncakeUrl,
         sha256hash,
         size
     };
@@ -185,8 +83,7 @@ async function main() {
     console.log('Asset:', JSON.stringify(asset, null, '  '));
     const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
     const scripts = client.database('builds').container(quality).scripts;
-    await retry_1.retry(() => scripts.storedProcedure('createAsset').execute('', [commit, asset, true]));
-    console.log(`  Done ✔️`);
+    await (0, retry_1.retry)(() => scripts.storedProcedure('createAsset').execute('', [commit, asset, true]));
 }
 main().then(() => {
     console.log('Asset successfully created');
@@ -6,7 +6,6 @@
 'use strict';
 
 import * as fs from 'fs';
-import * as url from 'url';
 import { Readable } from 'stream';
 import * as crypto from 'crypto';
 import * as azure from 'azure-storage';
@@ -25,98 +24,11 @@ interface Asset {
	supportsFastUpdate?: boolean;
 }
 
-if (process.argv.length !== 8) {
-	console.error('Usage: node createAsset.js PRODUCT OS ARCH TYPE NAME FILE');
+if (process.argv.length !== 6) {
+	console.error('Usage: node createAsset.js PLATFORM TYPE NAME FILE');
 	process.exit(-1);
 }
 
-// Contains all of the logic for mapping details to our actual product names in CosmosDB
-function getPlatform(product: string, os: string, arch: string, type: string): string {
-	switch (os) {
-		case 'win32':
-			switch (product) {
-				case 'client':
-					const asset = arch === 'ia32' ? 'win32' : `win32-${arch}`;
-					switch (type) {
-						case 'archive':
-							return `${asset}-archive`;
-						case 'setup':
-							return asset;
-						case 'user-setup':
-							return `${asset}-user`;
-						default:
-							throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-					}
-				case 'server':
-					if (arch === 'arm64') {
-						throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-					}
-					return arch === 'ia32' ? 'server-win32' : `server-win32-${arch}`;
-				case 'web':
-					if (arch === 'arm64') {
-						throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-					}
-					return arch === 'ia32' ? 'server-win32-web' : `server-win32-${arch}-web`;
-				default:
-					throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-			}
-		case 'linux':
-			switch (type) {
-				case 'snap':
-					return `linux-snap-${arch}`;
-				case 'archive-unsigned':
-					switch (product) {
-						case 'client':
-							return `linux-${arch}`;
-						case 'server':
-							return `server-linux-${arch}`;
-						case 'web':
-							return arch === 'standalone' ? 'web-standalone' : `server-linux-${arch}-web`;
-						default:
-							throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-					}
-				case 'deb-package':
-					return `linux-deb-${arch}`;
-				case 'rpm-package':
-					return `linux-rpm-${arch}`;
-				default:
-					throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-			}
-		case 'darwin':
-			switch (product) {
-				case 'client':
-					if (arch === 'x64') {
-						return 'darwin';
-					}
-					return `darwin-${arch}`;
-				case 'server':
-					return 'server-darwin';
-				case 'web':
-					if (arch !== 'x64') {
-						throw `What should the platform be?: ${product} ${os} ${arch} ${type}`;
-					}
-					return 'server-darwin-web';
-				default:
-					throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-			}
-		default:
-			throw `Unrecognized: ${product} ${os} ${arch} ${type}`;
-	}
-}
-
-// Contains all of the logic for mapping types to our actual types in CosmosDB
-function getRealType(type: string) {
-	switch (type) {
-		case 'user-setup':
-			return 'setup';
-		case 'deb-package':
-		case 'rpm-package':
-			return 'package';
-		default:
-			return type;
-	}
-}
-
 function hashStream(hashName: string, stream: Readable): Promise<string> {
	return new Promise<string>((c, e) => {
		const shasum = crypto.createHash(hashName);
@@ -156,10 +68,7 @@ function getEnv(name: string): string {
 }
 
 async function main(): Promise<void> {
-	const [, , product, os, arch, unprocessedType, fileName, filePath] = process.argv;
-	// getPlatform needs the unprocessedType
-	const platform = getPlatform(product, os, arch, unprocessedType);
-	const type = getRealType(unprocessedType);
+	const [, , platform, type, fileName, filePath] = process.argv;
	const quality = getEnv('VSCODE_QUALITY');
	const commit = getEnv('BUILD_SOURCEVERSION');
 
@@ -189,33 +98,17 @@ async function main(): Promise<void> {
		return;
	}
 
-	const mooncakeBlobService = azure.createBlobService(storageAccount, process.env['MOONCAKE_STORAGE_ACCESS_KEY']!, `${storageAccount}.blob.core.chinacloudapi.cn`)
-		.withFilter(new azure.ExponentialRetryPolicyFilter(20));
-
-	// mooncake is fussy and far away, this is needed!
-	blobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
-	mooncakeBlobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
-
-	console.log('Uploading blobs to Azure storage and Mooncake Azure storage...');
-
-	await retry(() => Promise.all([
-		uploadBlob(blobService, quality, blobName, filePath, fileName),
-		uploadBlob(mooncakeBlobService, quality, blobName, filePath, fileName)
-	]));
+	console.log('Uploading blobs to Azure storage...');
+
+	await uploadBlob(blobService, quality, blobName, filePath, fileName);
 
	console.log('Blobs successfully uploaded.');
 
-	// TODO: Understand if blobName and blobPath are the same and replace blobPath with blobName if so.
-	const assetUrl = `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`;
-	const blobPath = url.parse(assetUrl).path;
-	const mooncakeUrl = `${process.env['MOONCAKE_CDN_URL']}${blobPath}`;
-
	const asset: Asset = {
		platform,
		type,
-		url: assetUrl,
+		url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
		hash: sha1hash,
-		mooncakeUrl,
		sha256hash,
		size
	};
@@ -230,8 +123,6 @@ async function main(): Promise<void> {
	const client = new CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT']!, key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
	const scripts = client.database('builds').container(quality).scripts;
	await retry(() => scripts.storedProcedure('createAsset').execute('', [commit, asset, true]));
-
-	console.log(`  Done ✔️`);
 }
 
 main().then(() => {
@@ -40,7 +40,7 @@ async function main() {
     };
     const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
     const scripts = client.database('builds').container(quality).scripts;
-    await retry_1.retry(() => scripts.storedProcedure('createBuild').execute('', [Object.assign(Object.assign({}, build), { _partitionKey: '' })]));
+    await (0, retry_1.retry)(() => scripts.storedProcedure('createBuild').execute('', [Object.assign(Object.assign({}, build), { _partitionKey: '' })]));
 }
 main().then(() => {
     console.log('Build successfully created');
@@ -4,12 +4,12 @@ set -e
 cd $BUILD_STAGINGDIRECTORY
 mkdir extraction
 cd extraction
-git clone --depth 1 https://github.com/microsoft/vscode-extension-telemetry.git
-git clone --depth 1 https://github.com/microsoft/vscode-chrome-debug-core.git
-git clone --depth 1 https://github.com/microsoft/vscode-node-debug2.git
-git clone --depth 1 https://github.com/microsoft/vscode-node-debug.git
-git clone --depth 1 https://github.com/microsoft/vscode-html-languageservice.git
-git clone --depth 1 https://github.com/microsoft/vscode-json-languageservice.git
+git clone --depth 1 https://github.com/Microsoft/vscode-extension-telemetry.git
+git clone --depth 1 https://github.com/Microsoft/vscode-chrome-debug-core.git
+git clone --depth 1 https://github.com/Microsoft/vscode-node-debug2.git
+git clone --depth 1 https://github.com/Microsoft/vscode-node-debug.git
+git clone --depth 1 https://github.com/Microsoft/vscode-html-languageservice.git
+git clone --depth 1 https://github.com/Microsoft/vscode-json-languageservice.git
 node $BUILD_SOURCESDIRECTORY/node_modules/.bin/vscode-telemetry-extractor --sourceDir $BUILD_SOURCESDIRECTORY --excludedDir $BUILD_SOURCESDIRECTORY/extensions --outputDir . --applyEndpoints
 node $BUILD_SOURCESDIRECTORY/node_modules/.bin/vscode-telemetry-extractor --config $BUILD_SOURCESDIRECTORY/build/azure-pipelines/common/telemetry-config.json -o .
 mkdir -p $BUILD_SOURCESDIRECTORY/.build/telemetry
@@ -39,7 +39,7 @@ async function publish(commit, files) {
         .withFilter(new azure.ExponentialRetryPolicyFilter(20));
     await assertContainer(blobService, commit);
     for (const file of files) {
-        const blobName = path_1.basename(file);
+        const blobName = (0, path_1.basename)(file);
         const blobExists = await doesBlobExist(blobService, commit, blobName);
         if (blobExists) {
             console.log(`Blob ${commit}, ${blobName} already exists, not publishing again.`);
@@ -58,7 +58,7 @@ function main() {
     }
     const opts = minimist(process.argv.slice(2));
     const [directory] = opts._;
-    const files = fileNames.map(fileName => path_1.join(directory, fileName));
+    const files = fileNames.map(fileName => (0, path_1.join)(directory, fileName));
     publish(commit, files).catch(err => {
         console.error(err);
         process.exit(1);
@@ -39,7 +39,7 @@ async function main() {
     }
     console.log(`Releasing build ${commit}...`);
     const scripts = client.database('builds').container(quality).scripts;
-    await retry_1.retry(() => scripts.storedProcedure('releaseBuild').execute('', [commit]));
+    await (0, retry_1.retry)(() => scripts.storedProcedure('releaseBuild').execute('', [commit]));
 }
 main().then(() => {
     console.log('Build successfully released');
build/azure-pipelines/common/sync-mooncake.js (new file, 87 lines)
@@ -0,0 +1,87 @@
+/*---------------------------------------------------------------------------------------------
+ * Copyright (c) Microsoft Corporation. All rights reserved.
+ * Licensed under the Source EULA. See License.txt in the project root for license information.
+ *--------------------------------------------------------------------------------------------*/
+'use strict';
+Object.defineProperty(exports, "__esModule", { value: true });
+const url = require("url");
+const azure = require("azure-storage");
+const mime = require("mime");
+const cosmos_1 = require("@azure/cosmos");
+const retry_1 = require("./retry");
+function log(...args) {
+    console.log(...[`[${new Date().toISOString()}]`, ...args]);
+}
+function error(...args) {
+    console.error(...[`[${new Date().toISOString()}]`, ...args]);
+}
+if (process.argv.length < 3) {
+    error('Usage: node sync-mooncake.js <quality>');
+    process.exit(-1);
+}
+async function sync(commit, quality) {
+    log(`Synchronizing Mooncake assets for ${quality}, ${commit}...`);
+    const client = new cosmos_1.CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT'], key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
+    const container = client.database('builds').container(quality);
+    const query = `SELECT TOP 1 * FROM c WHERE c.id = "${commit}"`;
+    const res = await container.items.query(query, {}).fetchAll();
+    if (res.resources.length !== 1) {
+        throw new Error(`No builds found for ${commit}`);
+    }
+    const build = res.resources[0];
+    log(`Found build for ${commit}, with ${build.assets.length} assets`);
+    const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2'];
+    const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2'])
+        .withFilter(new azure.ExponentialRetryPolicyFilter(20));
+    const mooncakeBlobService = azure.createBlobService(storageAccount, process.env['MOONCAKE_STORAGE_ACCESS_KEY'], `${storageAccount}.blob.core.chinacloudapi.cn`)
+        .withFilter(new azure.ExponentialRetryPolicyFilter(20));
+    // mooncake is fussy and far away, this is needed!
+    blobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
+    mooncakeBlobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
+    for (const asset of build.assets) {
+        try {
+            const blobPath = url.parse(asset.url).path;
+            if (!blobPath) {
+                throw new Error(`Failed to parse URL: ${asset.url}`);
+            }
+            const blobName = blobPath.replace(/^\/\w+\//, '');
+            log(`Found ${blobName}`);
+            if (asset.mooncakeUrl) {
+                log(`  Already in Mooncake ✔️`);
+                continue;
+            }
+            const readStream = blobService.createReadStream(quality, blobName, undefined);
+            const blobOptions = {
+                contentSettings: {
+                    contentType: mime.lookup(blobPath),
+                    cacheControl: 'max-age=31536000, public'
+                }
+            };
+            const writeStream = mooncakeBlobService.createWriteStreamToBlockBlob(quality, blobName, blobOptions, undefined);
+            log(`  Uploading to Mooncake...`);
+            await new Promise((c, e) => readStream.pipe(writeStream).on('finish', c).on('error', e));
+            log(`  Updating build in DB...`);
+            const mooncakeUrl = `${process.env['MOONCAKE_CDN_URL']}${blobPath}`;
+            await (0, retry_1.retry)(() => container.scripts.storedProcedure('setAssetMooncakeUrl')
+                .execute('', [commit, asset.platform, asset.type, mooncakeUrl]));
+            log(`  Done ✔️`);
+        }
+        catch (err) {
+            error(err);
+        }
+    }
+    log(`All done ✔️`);
+}
+function main() {
+    const commit = process.env['BUILD_SOURCEVERSION'];
+    if (!commit) {
+        error('Skipping publish due to missing BUILD_SOURCEVERSION');
+        return;
+    }
+    const quality = process.argv[2];
+    sync(commit, quality).catch(err => {
+        error(err);
+        process.exit(1);
+    });
+}
+main();
build/azure-pipelines/common/sync-mooncake.ts (new file, 131 lines)
@@ -0,0 +1,131 @@
+/*---------------------------------------------------------------------------------------------
+ * Copyright (c) Microsoft Corporation. All rights reserved.
+ * Licensed under the Source EULA. See License.txt in the project root for license information.
+ *--------------------------------------------------------------------------------------------*/
+
+'use strict';
+
+import * as url from 'url';
+import * as azure from 'azure-storage';
+import * as mime from 'mime';
+import { CosmosClient } from '@azure/cosmos';
+import { retry } from './retry';
+
+function log(...args: any[]) {
+	console.log(...[`[${new Date().toISOString()}]`, ...args]);
+}
+
+function error(...args: any[]) {
+	console.error(...[`[${new Date().toISOString()}]`, ...args]);
+}
+
+if (process.argv.length < 3) {
+	error('Usage: node sync-mooncake.js <quality>');
+	process.exit(-1);
+}
+
+interface Build {
+	assets: Asset[];
+}
+
+interface Asset {
+	platform: string;
+	type: string;
+	url: string;
+	mooncakeUrl: string;
+	hash: string;
+	sha256hash: string;
+	size: number;
+	supportsFastUpdate?: boolean;
+}
+
+async function sync(commit: string, quality: string): Promise<void> {
+	log(`Synchronizing Mooncake assets for ${quality}, ${commit}...`);
+
+	const client = new CosmosClient({ endpoint: process.env['AZURE_DOCUMENTDB_ENDPOINT']!, key: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
+	const container = client.database('builds').container(quality);
+
+	const query = `SELECT TOP 1 * FROM c WHERE c.id = "${commit}"`;
+	const res = await container.items.query<Build>(query, {}).fetchAll();
+
+	if (res.resources.length !== 1) {
+		throw new Error(`No builds found for ${commit}`);
+	}
+
+	const build = res.resources[0];
+
+	log(`Found build for ${commit}, with ${build.assets.length} assets`);
+
+	const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2']!;
+
+	const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2']!)
+		.withFilter(new azure.ExponentialRetryPolicyFilter(20));
+
+	const mooncakeBlobService = azure.createBlobService(storageAccount, process.env['MOONCAKE_STORAGE_ACCESS_KEY']!, `${storageAccount}.blob.core.chinacloudapi.cn`)
+		.withFilter(new azure.ExponentialRetryPolicyFilter(20));
+
+	// mooncake is fussy and far away, this is needed!
+	blobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
+	mooncakeBlobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
+
+	for (const asset of build.assets) {
+		try {
+			const blobPath = url.parse(asset.url).path;
+
+			if (!blobPath) {
+				throw new Error(`Failed to parse URL: ${asset.url}`);
+			}
+
+			const blobName = blobPath.replace(/^\/\w+\//, '');
+
+			log(`Found ${blobName}`);
+
+			if (asset.mooncakeUrl) {
+				log(`  Already in Mooncake ✔️`);
+				continue;
+			}
+
+			const readStream = blobService.createReadStream(quality, blobName, undefined!);
+			const blobOptions: azure.BlobService.CreateBlockBlobRequestOptions = {
+				contentSettings: {
+					contentType: mime.lookup(blobPath),
+					cacheControl: 'max-age=31536000, public'
+				}
+			};
+
+			const writeStream = mooncakeBlobService.createWriteStreamToBlockBlob(quality, blobName, blobOptions, undefined);
+
+			log(`  Uploading to Mooncake...`);
+			await new Promise((c, e) => readStream.pipe(writeStream).on('finish', c).on('error', e));
+
+			log(`  Updating build in DB...`);
+			const mooncakeUrl = `${process.env['MOONCAKE_CDN_URL']}${blobPath}`;
+			await retry(() => container.scripts.storedProcedure('setAssetMooncakeUrl')
+				.execute('', [commit, asset.platform, asset.type, mooncakeUrl]));
+
+			log(`  Done ✔️`);
+		} catch (err) {
+			error(err);
+		}
+	}
+
+	log(`All done ✔️`);
+}
+
+function main(): void {
+	const commit = process.env['BUILD_SOURCEVERSION'];
+
+	if (!commit) {
+		error('Skipping publish due to missing BUILD_SOURCEVERSION');
+		return;
+	}
+
+	const quality = process.argv[2];
+
+	sync(commit, quality).catch(err => {
+		error(err);
+		process.exit(1);
+	});
+}
+
+main();
@@ -35,13 +35,13 @@ steps:
|
|||||||
displayName: Restore modules for just build folder and compile it
|
displayName: Restore modules for just build folder and compile it
|
||||||
|
|
||||||
- download: current
|
- download: current
|
||||||
artifact: unsigned_vscode_client_darwin_$(VSCODE_ARCH)_archive
|
artifact: vscode-darwin-$(VSCODE_ARCH)
|
||||||
displayName: Download $(VSCODE_ARCH) artifact
|
displayName: Download $(VSCODE_ARCH) artifact
|
||||||
|
|
||||||
- script: |
|
- script: |
|
||||||
set -e
|
set -e
|
||||||
unzip $(Pipeline.Workspace)/unsigned_vscode_client_darwin_$(VSCODE_ARCH)_archive/VSCode-darwin-$(VSCODE_ARCH).zip -d $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
|
unzip $(Pipeline.Workspace)/vscode-darwin-$(VSCODE_ARCH)/VSCode-darwin-$(VSCODE_ARCH).zip -d $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH)
|
||||||
mv $(Pipeline.Workspace)/unsigned_vscode_client_darwin_$(VSCODE_ARCH)_archive/VSCode-darwin-$(VSCODE_ARCH).zip $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip
|
mv $(Pipeline.Workspace)/vscode-darwin-$(VSCODE_ARCH)/VSCode-darwin-$(VSCODE_ARCH).zip $(agent.builddirectory)/VSCode-darwin-$(VSCODE_ARCH).zip
|
||||||
displayName: Unzip & move
|
displayName: Unzip & move
|
||||||
|
|
||||||
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
|
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
|
||||||
@@ -108,18 +108,22 @@ steps:
     condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'arm64'))

   - script: |
+      set -e

       # For legacy purposes, arch for x64 is just 'darwin'
       case $VSCODE_ARCH in
         x64) ASSET_ID="darwin" ;;
         arm64) ASSET_ID="darwin-arm64" ;;
         universal) ASSET_ID="darwin-universal" ;;
       esac
-      echo "##vso[task.setvariable variable=ASSET_ID]$ASSET_ID"
-    displayName: Set asset id variable
-
-  - script: mv $(agent.builddirectory)/VSCode-darwin-x64.zip $(agent.builddirectory)/VSCode-darwin.zip
-    displayName: Rename x64 build to it's legacy name
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
-
-  - publish: $(Agent.BuildDirectory)/VSCode-$(ASSET_ID).zip
-    artifact: vscode_client_darwin_$(VSCODE_ARCH)_archive
+      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+      AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
+      AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
+      AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
+      node build/azure-pipelines/common/createAsset.js \
+        "$ASSET_ID" \
+        archive \
+        "VSCode-$ASSET_ID.zip" \
+        ../VSCode-darwin-$(VSCODE_ARCH).zip
+    displayName: Publish Clients
@@ -138,19 +138,19 @@ steps:
     condition: and(succeeded(), ne(variables['VSCODE_ARCH'], 'universal'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))

   - download: current
-    artifact: unsigned_vscode_client_darwin_x64_archive
+    artifact: vscode-darwin-x64
     displayName: Download x64 artifact
     condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'universal'))

   - download: current
-    artifact: unsigned_vscode_client_darwin_arm64_archive
+    artifact: vscode-darwin-arm64
     displayName: Download arm64 artifact
     condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'universal'))

   - script: |
       set -e
-      cp $(Pipeline.Workspace)/unsigned_vscode_client_darwin_x64_archive/VSCode-darwin-x64.zip $(agent.builddirectory)/VSCode-darwin-x64.zip
-      cp $(Pipeline.Workspace)/unsigned_vscode_client_darwin_arm64_archive/VSCode-darwin-arm64.zip $(agent.builddirectory)/VSCode-darwin-arm64.zip
+      cp $(Pipeline.Workspace)/vscode-darwin-x64/VSCode-darwin-x64.zip $(agent.builddirectory)/VSCode-darwin-x64.zip
+      cp $(Pipeline.Workspace)/vscode-darwin-arm64/VSCode-darwin-arm64.zip $(agent.builddirectory)/VSCode-darwin-arm64.zip
       unzip $(agent.builddirectory)/VSCode-darwin-x64.zip -d $(agent.builddirectory)/VSCode-darwin-x64
       unzip $(agent.builddirectory)/VSCode-darwin-arm64.zip -d $(agent.builddirectory)/VSCode-darwin-arm64
       DEBUG=* node build/darwin/create-universal-app.js
@@ -280,27 +280,26 @@ steps:

   - script: |
       set -e
-
-      # package Remote Extension Host
-      pushd .. && mv vscode-reh-darwin vscode-server-darwin && zip -Xry vscode-server-darwin.zip vscode-server-darwin && popd
-
-      # package Remote Extension Host (Web)
-      pushd .. && mv vscode-reh-web-darwin vscode-server-darwin-web && zip -Xry vscode-server-darwin-web.zip vscode-server-darwin-web && popd
-    displayName: Prepare to publish servers
+      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+      AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
+      AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
+      AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
+      VSCODE_ARCH="$(VSCODE_ARCH)" ./build/azure-pipelines/darwin/publish-server.sh
+    displayName: Publish Servers
     condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))

   - publish: $(Agent.BuildDirectory)/VSCode-darwin-$(VSCODE_ARCH).zip
-    artifact: unsigned_vscode_client_darwin_$(VSCODE_ARCH)_archive
+    artifact: vscode-darwin-$(VSCODE_ARCH)
     displayName: Publish client archive
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

   - publish: $(Agent.BuildDirectory)/vscode-server-darwin.zip
-    artifact: vscode_server_darwin_$(VSCODE_ARCH)_archive-unsigned
+    artifact: vscode-server-darwin-$(VSCODE_ARCH)
     displayName: Publish server archive
     condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))

   - publish: $(Agent.BuildDirectory)/vscode-server-darwin-web.zip
-    artifact: vscode_web_darwin_$(VSCODE_ARCH)_archive-unsigned
+    artifact: vscode-server-darwin-$(VSCODE_ARCH)-web
     displayName: Publish web server archive
     condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))
@@ -309,5 +308,5 @@ steps:
       VSCODE_ARCH="$(VSCODE_ARCH)" \
       yarn gulp upload-vscode-configuration
     displayName: Upload configuration (for Bing settings search)
-    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), ne(variables['VSCODE_PUBLISH'], 'false'))
+    condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
     continueOnError: true
build/azure-pipelines/darwin/publish-server.sh (new executable file, 14 lines)
@@ -0,0 +1,14 @@
+#!/usr/bin/env bash
+set -e
+
+if [ "$VSCODE_ARCH" == "x64" ]; then
+	# package Remote Extension Host
+	pushd .. && mv vscode-reh-darwin vscode-server-darwin && zip -Xry vscode-server-darwin.zip vscode-server-darwin && popd
+
+	# publish Remote Extension Host
+	node build/azure-pipelines/common/createAsset.js \
+		server-darwin \
+		archive-unsigned \
+		"vscode-server-darwin.zip" \
+		../vscode-server-darwin.zip
+fi
@@ -61,7 +61,6 @@ steps:
       key: 'nodeModules | $(Agent.OS) | .build/yarnlockhash'
       path: .build/node_modules_cache
       cacheHitVar: NODE_MODULES_RESTORED
-    continueOnError: true

   - script: |
       set -e
build/azure-pipelines/linux/alpine/publish.sh (new executable file, 28 lines)
@@ -0,0 +1,28 @@
+#!/usr/bin/env bash
+set -e
+REPO="$(pwd)"
+ROOT="$REPO/.."
+
+PLATFORM_LINUX="linux-alpine"
+
+# Publish Remote Extension Host
+LEGACY_SERVER_BUILD_NAME="vscode-reh-$PLATFORM_LINUX"
+SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX"
+SERVER_TARBALL_FILENAME="vscode-server-$PLATFORM_LINUX.tar.gz"
+SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
+
+rm -rf $ROOT/vscode-server-*.tar.*
+(cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
+
+node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
+
+# Publish Remote Extension Host (Web)
+LEGACY_SERVER_BUILD_NAME="vscode-reh-web-$PLATFORM_LINUX"
+SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX-web"
+SERVER_TARBALL_FILENAME="vscode-server-$PLATFORM_LINUX-web.tar.gz"
+SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
+
+rm -rf $ROOT/vscode-server-*-web.tar.*
+(cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
+
+node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX-web" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
@@ -117,37 +117,19 @@ steps:

   - script: |
       set -e
-      REPO="$(pwd)"
-      ROOT="$REPO/.."
-
-      PLATFORM_LINUX="linux-alpine"
-
-      # Publish Remote Extension Host
-      LEGACY_SERVER_BUILD_NAME="vscode-reh-$PLATFORM_LINUX"
-      SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX"
-      SERVER_TARBALL_FILENAME="vscode-server-$PLATFORM_LINUX.tar.gz"
-      SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
-
-      rm -rf $ROOT/vscode-server-*.tar.*
-      (cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
-
-      # Publish Remote Extension Host (Web)
-      LEGACY_SERVER_BUILD_NAME="vscode-reh-web-$PLATFORM_LINUX"
-      SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX-web"
-      SERVER_TARBALL_FILENAME="vscode-server-$PLATFORM_LINUX-web.tar.gz"
-      SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
-
-      rm -rf $ROOT/vscode-server-*-web.tar.*
-      (cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
-    displayName: Prepare for publish
+      AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
+      AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
+      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+      ./build/azure-pipelines/linux/alpine/publish.sh
+    displayName: Publish
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

   - publish: $(Agent.BuildDirectory)/vscode-server-linux-alpine.tar.gz
-    artifact: vscode_server_linux_alpine_archive-unsigned
+    artifact: vscode-server-linux-alpine
     displayName: Publish server archive
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

   - publish: $(Agent.BuildDirectory)/vscode-server-linux-alpine-web.tar.gz
-    artifact: vscode_web_linux_alpine_archive-unsigned
+    artifact: vscode-server-linux-alpine-web
     displayName: Publish web server archive
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
@@ -245,32 +245,27 @@ steps:
       AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
       VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
       VSCODE_ARCH="$(VSCODE_ARCH)" \
-      ./build/azure-pipelines/linux/prepare-publish.sh
-    displayName: Prepare for Publish
+      ./build/azure-pipelines/linux/publish.sh
+    displayName: Publish
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

   - publish: $(DEB_PATH)
-    artifact: vscode_client_linux_$(VSCODE_ARCH)_deb-package
+    artifact: vscode-linux-deb-$(VSCODE_ARCH)
     displayName: Publish deb package
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

   - publish: $(RPM_PATH)
-    artifact: vscode_client_linux_$(VSCODE_ARCH)_rpm-package
+    artifact: vscode-linux-rpm-$(VSCODE_ARCH)
     displayName: Publish rpm package
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

-  - publish: $(TARBALL_PATH)
-    artifact: vscode_client_linux_$(VSCODE_ARCH)_archive-unsigned
-    displayName: Publish client archive
-    condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
-
   - publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH).tar.gz
-    artifact: vscode_server_linux_$(VSCODE_ARCH)_archive-unsigned
+    artifact: vscode-server-linux-$(VSCODE_ARCH)
     displayName: Publish server archive
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

   - publish: $(Agent.BuildDirectory)/vscode-server-linux-$(VSCODE_ARCH)-web.tar.gz
-    artifact: vscode_web_linux_$(VSCODE_ARCH)_archive-unsigned
+    artifact: vscode-server-linux-$(VSCODE_ARCH)-web
     displayName: Publish web server archive
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
@@ -13,6 +13,8 @@ TARBALL_PATH="$ROOT/$TARBALL_FILENAME"
 rm -rf $ROOT/code-*.tar.*
 (cd $ROOT && tar -czf $TARBALL_PATH $BUILDNAME)

+node build/azure-pipelines/common/createAsset.js "$PLATFORM_LINUX" archive-unsigned "$TARBALL_FILENAME" "$TARBALL_PATH"
+
 # Publish Remote Extension Host
 LEGACY_SERVER_BUILD_NAME="vscode-reh-$PLATFORM_LINUX"
 SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX"
@@ -22,6 +24,8 @@ SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
 rm -rf $ROOT/vscode-server-*.tar.*
 (cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)

+node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
+
 # Publish Remote Extension Host (Web)
 LEGACY_SERVER_BUILD_NAME="vscode-reh-web-$PLATFORM_LINUX"
 SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX-web"
@@ -31,6 +35,8 @@ SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
 rm -rf $ROOT/vscode-server-*-web.tar.*
 (cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar --owner=0 --group=0 -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)

+node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX-web" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
+
 # Publish DEB
 case $VSCODE_ARCH in
   x64) DEB_ARCH="amd64" ;;
@@ -41,6 +47,8 @@ PLATFORM_DEB="linux-deb-$VSCODE_ARCH"
 DEB_FILENAME="$(ls $REPO/.build/linux/deb/$DEB_ARCH/deb/)"
 DEB_PATH="$REPO/.build/linux/deb/$DEB_ARCH/deb/$DEB_FILENAME"

+node build/azure-pipelines/common/createAsset.js "$PLATFORM_DEB" package "$DEB_FILENAME" "$DEB_PATH"
+
 # Publish RPM
 case $VSCODE_ARCH in
   x64) RPM_ARCH="x86_64" ;;
@@ -53,6 +61,8 @@ PLATFORM_RPM="linux-rpm-$VSCODE_ARCH"
 RPM_FILENAME="$(ls $REPO/.build/linux/rpm/$RPM_ARCH/ | grep .rpm)"
 RPM_PATH="$REPO/.build/linux/rpm/$RPM_ARCH/$RPM_FILENAME"

+node build/azure-pipelines/common/createAsset.js "$PLATFORM_RPM" package "$RPM_FILENAME" "$RPM_PATH"
+
 # Publish Snap
 # Pack snap tarball artifact, in order to preserve file perms
 mkdir -p $REPO/.build/linux/snap-tarball
@@ -63,4 +73,3 @@ rm -rf $SNAP_TARBALL_PATH
 # Export DEB_PATH, RPM_PATH
 echo "##vso[task.setvariable variable=DEB_PATH]$DEB_PATH"
 echo "##vso[task.setvariable variable=RPM_PATH]$RPM_PATH"
-echo "##vso[task.setvariable variable=TARBALL_PATH]$TARBALL_PATH"
@@ -50,11 +50,15 @@ steps:
       esac
       (cd $SNAP_ROOT/code-* && sudo --preserve-env snapcraft prime $SNAPCRAFT_TARGET_ARGS && snap pack prime --compression=lzo --filename="$SNAP_PATH")

+      # Publish snap package
+      AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
+      AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
+      node build/azure-pipelines/common/createAsset.js "linux-snap-$(VSCODE_ARCH)" package "$SNAP_FILENAME" "$SNAP_PATH"
+
       # Export SNAP_PATH
       echo "##vso[task.setvariable variable=SNAP_PATH]$SNAP_PATH"
-    displayName: Prepare for publish

   - publish: $(SNAP_PATH)
-    artifact: vscode_client_linux_$(VSCODE_ARCH)_snap
+    artifact: vscode-linux-snap-$(VSCODE_ARCH)
     displayName: Publish snap package
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
@@ -57,7 +57,6 @@ steps:
       key: 'nodeModules | $(Agent.OS) | .build/yarnlockhash'
       path: .build/node_modules_cache
       cacheHitVar: NODE_MODULES_RESTORED
-    continueOnError: true

   - script: |
       set -e
@@ -172,7 +171,7 @@ steps:
       done
     displayName: Archive Logs
     continueOnError: true
-    condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
+    condition: succeededOrFailed()

   - script: |
       set -e
@@ -188,7 +187,7 @@ steps:
     displayName: 'Install .NET Core sdk for signing'
     inputs:
       packageType: sdk
-      version: 5.0.x
+      version: 2.1.x
       installationPath: $(Agent.ToolsDirectory)/dotnet

   - task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
@@ -86,8 +86,6 @@ variables:
     value: ${{ eq(parameters.ENABLE_TERRAPIN, true) }}
   - name: VSCODE_QUALITY
     value: ${{ parameters.VSCODE_QUALITY }}
-  - name: VSCODE_RELEASE
-    value: ${{ parameters.VSCODE_RELEASE }}
   - name: VSCODE_BUILD_STAGE_WINDOWS
     value: ${{ or(eq(parameters.VSCODE_BUILD_WIN32, true), eq(parameters.VSCODE_BUILD_WIN32_32BIT, true), eq(parameters.VSCODE_BUILD_WIN32_ARM64, true)) }}
   - name: VSCODE_BUILD_STAGE_LINUX
@@ -303,30 +301,37 @@ stages:
       steps:
         - template: darwin/product-build-darwin-sign.yml

-  - ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), ne(variables['VSCODE_PUBLISH'], 'false')) }}:
-    - stage: Publish
+  - ${{ if and(eq(variables['VSCODE_PUBLISH'], true), eq(parameters.VSCODE_COMPILE_ONLY, false)) }}:
+    - stage: Mooncake
       dependsOn:
-        - Compile
+        - ${{ if eq(variables['VSCODE_BUILD_STAGE_WINDOWS'], true) }}:
+          - Windows
+        - ${{ if eq(variables['VSCODE_BUILD_STAGE_LINUX'], true) }}:
+          - Linux
+        - ${{ if eq(variables['VSCODE_BUILD_STAGE_MACOS'], true) }}:
+          - macOS
+      condition: succeededOrFailed()
       pool:
         vmImage: "Ubuntu-18.04"
-      variables:
-        - name: BUILDS_API_URL
-          value: $(System.CollectionUri)$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/
       jobs:
-        - job: PublishBuild
-          timeoutInMinutes: 180
-          displayName: Publish Build
+        - job: SyncMooncake
+          displayName: Sync Mooncake
           steps:
-            - template: product-publish.yml
+            - template: sync-mooncake.yml

-  - ${{ if or(eq(parameters.VSCODE_RELEASE, true), and(in(parameters.VSCODE_QUALITY, 'insider', 'exploration'), eq(variables['VSCODE_SCHEDULEDBUILD'], true))) }}:
+  - ${{ if and(eq(parameters.VSCODE_COMPILE_ONLY, false), or(eq(parameters.VSCODE_RELEASE, true), and(in(parameters.VSCODE_QUALITY, 'insider', 'exploration'), eq(variables['VSCODE_SCHEDULEDBUILD'], true)))) }}:
     - stage: Release
       dependsOn:
-        - Publish
+        - ${{ if eq(variables['VSCODE_BUILD_STAGE_WINDOWS'], true) }}:
+          - Windows
+        - ${{ if eq(variables['VSCODE_BUILD_STAGE_LINUX'], true) }}:
+          - Linux
+        - ${{ if eq(variables['VSCODE_BUILD_STAGE_MACOS'], true) }}:
+          - macOS
       pool:
         vmImage: "Ubuntu-18.04"
       jobs:
         - job: ReleaseBuild
           displayName: Release Build
           steps:
-            - template: product-release.yml
+            - template: release.yml
@@ -118,6 +118,14 @@ steps:
     displayName: Publish Webview
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))

+  - script: |
+      set -e
+      VERSION=`node -p "require(\"./package.json\").version"`
+      AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
+      node build/azure-pipelines/common/createBuild.js $VERSION
+    displayName: Create build
+    condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
+
   # we gotta tarball everything in order to preserve file permissions
   - script: |
       set -e
@@ -1,114 +0,0 @@
-. build/azure-pipelines/win32/exec.ps1
-$ErrorActionPreference = 'Stop'
-$ProgressPreference = 'SilentlyContinue'
-$ARTIFACT_PROCESSED_WILDCARD_PATH = "$env:PIPELINE_WORKSPACE/artifacts_processed_*/artifacts_processed_*"
-$ARTIFACT_PROCESSED_FILE_PATH = "$env:PIPELINE_WORKSPACE/artifacts_processed_$env:SYSTEM_STAGEATTEMPT/artifacts_processed_$env:SYSTEM_STAGEATTEMPT.txt"
-
-function Get-PipelineArtifact {
-	param($Name = '*')
-	try {
-		$res = Invoke-RestMethod "$($env:BUILDS_API_URL)artifacts?api-version=6.0" -Headers @{
-			Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"
-		} -MaximumRetryCount 5 -RetryIntervalSec 1
-
-		if (!$res) {
-			return
-		}
-
-		$res.value | Where-Object { $_.name -Like $Name }
-	} catch {
-		Write-Warning $_
-	}
-}
-
-# This set will keep track of which artifacts have already been processed
-$set = [System.Collections.Generic.HashSet[string]]::new()
-
-if (Test-Path $ARTIFACT_PROCESSED_WILDCARD_PATH) {
-	# Grab the latest artifact_processed text file and load all assets already processed from that.
-	# This means that the latest artifact_processed_*.txt file has all of the contents of the previous ones.
-	# Note: The kusto-like syntax only works in PS7+ and only in scripts, not at the REPL.
-	Get-ChildItem $ARTIFACT_PROCESSED_WILDCARD_PATH
-	| Sort-Object
-	| Select-Object -Last 1
-	| Get-Content
-	| ForEach-Object {
-		$set.Add($_) | Out-Null
-		Write-Host "Already processed artifact: $_"
-	}
-}
-
-# Create the artifact file that will be used for this run
-New-Item -Path $ARTIFACT_PROCESSED_FILE_PATH -Force | Out-Null
-
-# Determine which stages we need to watch
-$stages = @(
-	if ($env:VSCODE_BUILD_STAGE_WINDOWS -eq 'True') { 'Windows' }
-	if ($env:VSCODE_BUILD_STAGE_LINUX -eq 'True') { 'Linux' }
-	if ($env:VSCODE_BUILD_STAGE_MACOS -eq 'True') { 'macOS' }
-)
-
-do {
-	Start-Sleep -Seconds 10
-
-	$artifacts = Get-PipelineArtifact -Name 'vscode_*'
-	if (!$artifacts) {
-		continue
-	}
-
-	$artifacts | ForEach-Object {
-		$artifactName = $_.name
-		if($set.Add($artifactName)) {
-			Write-Host "Processing artifact: '$artifactName. Downloading from: $($_.resource.downloadUrl)"
-
-			try {
-				Invoke-RestMethod $_.resource.downloadUrl -OutFile "$env:AGENT_TEMPDIRECTORY/$artifactName.zip" -Headers @{
-					Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"
-				} -MaximumRetryCount 5 -RetryIntervalSec 1 | Out-Null
-
-				Expand-Archive -Path "$env:AGENT_TEMPDIRECTORY/$artifactName.zip" -DestinationPath $env:AGENT_TEMPDIRECTORY | Out-Null
-			} catch {
-				Write-Warning $_
-				$set.Remove($artifactName) | Out-Null
-				continue
-			}
-
-			$null,$product,$os,$arch,$type = $artifactName -split '_'
-			$asset = Get-ChildItem -rec "$env:AGENT_TEMPDIRECTORY/$artifactName"
-			Write-Host "Processing artifact with the following values:"
-			# turning in into an object just to log nicely
-			@{
-				product = $product
-				os = $os
-				arch = $arch
-				type = $type
-				asset = $asset.Name
-			} | Format-Table
-
-			exec { node build/azure-pipelines/common/createAsset.js $product $os $arch $type $asset.Name $asset.FullName }
-			$artifactName >> $ARTIFACT_PROCESSED_FILE_PATH
-		}
-	}
-
-	# Get the timeline and see if it says the other stage completed
-	try {
-		$timeline = Invoke-RestMethod "$($env:BUILDS_API_URL)timeline?api-version=6.0" -Headers @{
-			Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"
-		} -MaximumRetryCount 5 -RetryIntervalSec 1
-	} catch {
-		Write-Warning $_
-		continue
-	}
-
-	foreach ($stage in $stages) {
-		$otherStageFinished = $timeline.records | Where-Object { $_.name -eq $stage -and $_.type -eq 'stage' -and $_.state -eq 'completed' }
-		if (!$otherStageFinished) {
-			break
-		}
-	}
-
-	$artifacts = Get-PipelineArtifact -Name 'vscode_*'
-	$artifactsStillToProcess = $artifacts.Count -ne $set.Count
-} while (!$otherStageFinished -or $artifactsStillToProcess)
-
-Write-Host "Processed $($set.Count) artifacts."
@@ -1,89 +0,0 @@
-steps:
-  - task: NodeTool@0
-    inputs:
-      versionSpec: "12.x"
-
-  - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
-    inputs:
-      versionSpec: "1.x"
-
-  - task: AzureKeyVault@1
-    displayName: "Azure Key Vault: Get Secrets"
-    inputs:
-      azureSubscription: "vscode-builds-subscription"
-      KeyVaultName: vscode
-
-  - pwsh: |
-      . build/azure-pipelines/win32/exec.ps1
-      cd build
-      exec { yarn }
-    displayName: Install dependencies
-
-  - download: current
-    patterns: '**/artifacts_processed_*.txt'
-    displayName: Download all artifacts_processed text files
-
-  - pwsh: |
-      . build/azure-pipelines/win32/exec.ps1
-
-      if (Test-Path "$(Pipeline.Workspace)/artifacts_processed_*/artifacts_processed_*.txt") {
-        Write-Host "Artifacts already processed so a build must have already been created."
-        return
-      }
-
-      $env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
-      $VERSION = node -p "require('./package.json').version"
-      Write-Host "Creating build with version: $VERSION"
-      exec { node build/azure-pipelines/common/createBuild.js $VERSION }
-    displayName: Create build if it hasn't been created before
-
-  - pwsh: |
-      $env:VSCODE_MIXIN_PASSWORD = "$(github-distro-mixin-password)"
-      $env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
-      $env:AZURE_STORAGE_ACCESS_KEY = "$(ticino-storage-key)"
-      $env:AZURE_STORAGE_ACCESS_KEY_2 = "$(vscode-storage-key)"
-      $env:MOONCAKE_STORAGE_ACCESS_KEY = "$(vscode-mooncake-storage-key)"
-      build/azure-pipelines/product-publish.ps1
-    env:
-      SYSTEM_ACCESSTOKEN: $(System.AccessToken)
-    displayName: Process artifacts
-
-  - publish: $(Pipeline.Workspace)/artifacts_processed_$(System.StageAttempt)/artifacts_processed_$(System.StageAttempt).txt
-    artifact: artifacts_processed_$(System.StageAttempt)
-    displayName: Publish what artifacts were published for this stage attempt
-
-  - pwsh: |
-      $ErrorActionPreference = 'Stop'
-
-      # Determine which stages we need to watch
-      $stages = @(
-        if ($env:VSCODE_BUILD_STAGE_WINDOWS -eq 'True') { 'Windows' }
-        if ($env:VSCODE_BUILD_STAGE_LINUX -eq 'True') { 'Linux' }
-        if ($env:VSCODE_BUILD_STAGE_MACOS -eq 'True') { 'macOS' }
-      )
-      Write-Host "Stages to check: $stages"
-
-      # Get the timeline and see if it says the other stage completed
-      $timeline = Invoke-RestMethod "$($env:BUILDS_API_URL)timeline?api-version=6.0" -Headers @{
-        Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"
-      } -MaximumRetryCount 5 -RetryIntervalSec 1
-
-      $failedStages = @()
-      foreach ($stage in $stages) {
-        $didStageFail = $timeline.records | Where-Object {
-          $_.name -eq $stage -and $_.type -eq 'stage' -and $_.result -ne 'succeeded' -and $_.result -ne 'succeededWithIssues'
-        }
-
-        if ($didStageFail) {
-          $failedStages += $stage
-        } else {
-          Write-Host "'$stage' did not fail."
-        }
-      }
-
-      if ($failedStages.Length) {
-        throw "Failed stages: $($failedStages -join ', '). This stage will now fail so that it is easier to retry failed jobs."
-      }
-    env:
-      SYSTEM_ACCESSTOKEN: $(System.AccessToken)
-    displayName: Determine if stage should succeed
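The "Determine if stage should succeed" step above treats a watched stage as failed unless its timeline `result` is `succeeded` or `succeededWithIssues`, then throws if any stage failed. A small JavaScript sketch of that partitioning (names are illustrative):

```javascript
// Sketch of the failed-stage filter from the PowerShell above: a stage
// fails when a timeline record with its name has type 'stage' and a
// result other than 'succeeded' / 'succeededWithIssues'.
function failedStages(stages, records) {
  return stages.filter((stage) =>
    records.some(
      (r) =>
        r.name === stage &&
        r.type === 'stage' &&
        r.result !== 'succeeded' &&
        r.result !== 'succeededWithIssues'
    )
  );
}

const records = [
  { name: 'Windows', type: 'stage', result: 'succeeded' },
  { name: 'Linux', type: 'stage', result: 'failed' },
  { name: 'macOS', type: 'stage', result: 'succeededWithIssues' },
];
console.log(failedStages(['Windows', 'Linux', 'macOS'], records)); // [ 'Linux' ]
```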
@@ -11,25 +11,25 @@ steps:
     inputs:
       versionSpec: "14.x"
 
-  - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3
+  - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
     inputs:
       versionSpec: "1.x"
 
-  # - bash: |
-  #     TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
-  #     CHANNEL="G1C14HJ2F"
+  - bash: |
+      TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
+      CHANNEL="G1C14HJ2F"
 
-  #     if [ "$TAG_VERSION" == "1.999.0" ]; then
-  #       MESSAGE="<!here>. Someone pushed 1.999.0 tag. Please delete it ASAP from remote and local."
+      if [ "$TAG_VERSION" == "1.999.0" ]; then
+        MESSAGE="<!here>. Someone pushed 1.999.0 tag. Please delete it ASAP from remote and local."
 
-  #       curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-  #         -H 'Content-type: application/json; charset=utf-8' \
-  #         --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE"'"}' \
-  #         https://slack.com/api/chat.postMessage
+        curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
+          -H 'Content-type: application/json; charset=utf-8' \
+          --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE"'"}' \
+          https://slack.com/api/chat.postMessage
 
-  #       exit 1
-  #     fi
-  #   displayName: Check 1.999.0 tag
+        exit 1
+      fi
+    displayName: Check 1.999.0 tag
 
   - bash: |
       # Install build dependencies
@@ -37,55 +37,47 @@ steps:
       node build/azure-pipelines/publish-types/check-version.js
     displayName: Check version
 
-  # {{SQL CARBON EDIT}} Modify to fit our own scenario - specifically currently we need to use a fork of the repo since we don't
-  # have an account with push access to DT
   - bash: |
-      git config --global user.email "azuredatastudio@microsoft.com"
-      git config --global user.name "Azure Data Studio"
+      git config --global user.email "vscode@microsoft.com"
+      git config --global user.name "VSCode"
 
-      git clone https://$(GITHUB_TOKEN)@$(REPO) --depth=1
+      git clone https://$(GITHUB_TOKEN)@github.com/DefinitelyTyped/DefinitelyTyped.git --depth=1
       node build/azure-pipelines/publish-types/update-types.js
 
      TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
 
       cd DefinitelyTyped
 
-      # Sync up to latest from the DT repo
-      git remote add upstream https://github.com/DefinitelyTyped/DefinitelyTyped.git
-      git fetch upstream
-      git merge upstream/master
-      git push origin
-
       git diff --color | cat
       git add -A
       git status
-      git checkout -b "azdata-types-$TAG_VERSION"
-      git commit -m "Azure Data Studio $TAG_VERSION Extension API"
-      git push origin "azdata-types-$TAG_VERSION"
+      git checkout -b "vscode-types-$TAG_VERSION"
+      git commit -m "VS Code $TAG_VERSION Extension API"
+      git push origin "vscode-types-$TAG_VERSION"
 
     displayName: Push update to DefinitelyTyped
 
-  # - bash: |
-  #     TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
-  #     CHANNEL="G1C14HJ2F"
+  - bash: |
+      TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
+      CHANNEL="G1C14HJ2F"
 
-  #     MESSAGE="DefinitelyTyped/DefinitelyTyped#vscode-types-$TAG_VERSION created. Endgame champion, please open this link, examine changes and create a PR:"
-  #     LINK="https://github.com/DefinitelyTyped/DefinitelyTyped/compare/vscode-types-$TAG_VERSION?quick_pull=1&body=Updating%20VS%20Code%20Extension%20API.%20See%20https%3A%2F%2Fgithub.com%2Fmicrosoft%2Fvscode%2Fissues%2F70175%20for%20details."
-  #     MESSAGE2="[@eamodio, @jrieken, @kmaetzel, @egamma]. Please review and merge PR to publish @types/vscode."
+      MESSAGE="DefinitelyTyped/DefinitelyTyped#vscode-types-$TAG_VERSION created. Endgame champion, please open this link, examine changes and create a PR:"
+      LINK="https://github.com/DefinitelyTyped/DefinitelyTyped/compare/vscode-types-$TAG_VERSION?quick_pull=1&body=Updating%20VS%20Code%20Extension%20API.%20See%20https%3A%2F%2Fgithub.com%2Fmicrosoft%2Fvscode%2Fissues%2F70175%20for%20details."
+      MESSAGE2="[@eamodio, @jrieken, @kmaetzel, @egamma]. Please review and merge PR to publish @types/vscode."
 
-  #     curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-  #       -H 'Content-type: application/json; charset=utf-8' \
-  #       --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE"'"}' \
-  #       https://slack.com/api/chat.postMessage
+      curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
+        -H 'Content-type: application/json; charset=utf-8' \
+        --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE"'"}' \
+        https://slack.com/api/chat.postMessage
 
-  #     curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-  #       -H 'Content-type: application/json; charset=utf-8' \
-  #       --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$LINK"'"}' \
-  #       https://slack.com/api/chat.postMessage
+      curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
+        -H 'Content-type: application/json; charset=utf-8' \
+        --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$LINK"'"}' \
+        https://slack.com/api/chat.postMessage
 
-  #     curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-  #       -H 'Content-type: application/json; charset=utf-8' \
-  #       --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE2"'"}' \
-  #       https://slack.com/api/chat.postMessage
+      curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
+        -H 'Content-type: application/json; charset=utf-8' \
+        --data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE2"'"}' \
+        https://slack.com/api/chat.postMessage
 
-  #   displayName: Send message on Slack
+    displayName: Send message on Slack
@@ -13,11 +13,11 @@ try {
 		.execSync('git describe --tags `git rev-list --tags --max-count=1`')
 		.toString()
 		.trim();
-	const dtsUri = `https://raw.githubusercontent.com/microsoft/azuredatastudio/${tag}/src/sql/azdata.d.ts`; // {{SQL CARBON EDIT}} Use our typings
-	const outPath = path.resolve(process.cwd(), 'DefinitelyTyped/types/azdata/index.d.ts'); // {{SQL CARBON EDIT}} Use our typings
+	const dtsUri = `https://raw.githubusercontent.com/microsoft/vscode/${tag}/src/vs/vscode.d.ts`;
+	const outPath = path.resolve(process.cwd(), 'DefinitelyTyped/types/vscode/index.d.ts');
 	cp.execSync(`curl ${dtsUri} --output ${outPath}`);
 	updateDTSFile(outPath, tag);
-	console.log(`Done updating azdata.d.ts at ${outPath}`); // {{SQL CARBON EDIT}} Use our typings
+	console.log(`Done updating vscode.d.ts at ${outPath}`);
 }
 catch (err) {
 	console.error(err);
@@ -51,25 +51,21 @@ function getNewFileContent(content, tag) {
 function getNewFileHeader(tag) {
 	const [major, minor] = tag.split('.');
 	const shorttag = `${major}.${minor}`;
-	// {{SQL CARBON EDIT}} Use our own header
 	const header = [
-		`// Type definitions for Azure Data Studio ${shorttag}`,
-		`// Project: https://github.com/microsoft/azuredatastudio`,
-		`// Definitions by: Charles Gagnon <https://github.com/Charles-Gagnon>`,
-		`// Alan Ren: <https://github.com/alanrenmsft>`,
-		`// Karl Burtram: <https://github.com/kburtram>`,
-		`// Ken Van Hyning: <https://github.com/kenvanhyning>`,
+		`// Type definitions for Visual Studio Code ${shorttag}`,
+		`// Project: https://github.com/microsoft/vscode`,
+		`// Definitions by: Visual Studio Code Team, Microsoft <https://github.com/Microsoft>`,
 		`// Definitions: https://github.com/DefinitelyTyped/DefinitelyTyped`,
 		``,
 		`/*---------------------------------------------------------------------------------------------`,
 		` * Copyright (c) Microsoft Corporation. All rights reserved.`,
 		` * Licensed under the Source EULA.`,
-		` * See https://github.com/microsoft/azuredatastudio/blob/main/LICENSE.txt for license information.`,
+		` * See https://github.com/Microsoft/vscode/blob/main/LICENSE.txt for license information.`,
 		` *--------------------------------------------------------------------------------------------*/`,
 		``,
 		`/**`,
-		` * Type Definition for Azure Data Studio ${shorttag} Extension API`,
-		` * See https://docs.microsoft.com/sql/azure-data-studio/extensibility-apis for more information`,
+		` * Type Definition for Visual Studio Code ${shorttag} Extension API`,
+		` * See https://code.visualstudio.com/api for more information`,
 		` */`
 	].join('\n');
 	return header;
@@ -16,13 +16,13 @@ try {
 		.toString()
 		.trim();
 
-	const dtsUri = `https://raw.githubusercontent.com/microsoft/azuredatastudio/${tag}/src/sql/azdata.d.ts`; // {{SQL CARBON EDIT}} Use our typings
-	const outPath = path.resolve(process.cwd(), 'DefinitelyTyped/types/azdata/index.d.ts'); // {{SQL CARBON EDIT}} Use our typings
+	const dtsUri = `https://raw.githubusercontent.com/microsoft/vscode/${tag}/src/vs/vscode.d.ts`;
+	const outPath = path.resolve(process.cwd(), 'DefinitelyTyped/types/vscode/index.d.ts');
 	cp.execSync(`curl ${dtsUri} --output ${outPath}`);
 
 	updateDTSFile(outPath, tag);
 
-	console.log(`Done updating azdata.d.ts at ${outPath}`); // {{SQL CARBON EDIT}} Use our typings
+	console.log(`Done updating vscode.d.ts at ${outPath}`);
 } catch (err) {
 	console.error(err);
 	console.error('Failed to update types');
@@ -63,25 +63,21 @@ function getNewFileHeader(tag: string) {
 	const [major, minor] = tag.split('.');
 	const shorttag = `${major}.${minor}`;
 
-	// {{SQL CARBON EDIT}} Use our own header
 	const header = [
-		`// Type definitions for Azure Data Studio ${shorttag}`,
-		`// Project: https://github.com/microsoft/azuredatastudio`,
-		`// Definitions by: Charles Gagnon <https://github.com/Charles-Gagnon>`,
-		`// Alan Ren: <https://github.com/alanrenmsft>`,
-		`// Karl Burtram: <https://github.com/kburtram>`,
-		`// Ken Van Hyning: <https://github.com/kenvanhyning>`,
+		`// Type definitions for Visual Studio Code ${shorttag}`,
+		`// Project: https://github.com/microsoft/vscode`,
+		`// Definitions by: Visual Studio Code Team, Microsoft <https://github.com/Microsoft>`,
 		`// Definitions: https://github.com/DefinitelyTyped/DefinitelyTyped`,
 		``,
 		`/*---------------------------------------------------------------------------------------------`,
 		` * Copyright (c) Microsoft Corporation. All rights reserved.`,
 		` * Licensed under the Source EULA.`,
-		` * See https://github.com/microsoft/azuredatastudio/blob/main/LICENSE.txt for license information.`,
+		` * See https://github.com/Microsoft/vscode/blob/main/LICENSE.txt for license information.`,
 		` *--------------------------------------------------------------------------------------------*/`,
 		``,
 		`/**`,
-		` * Type Definition for Azure Data Studio ${shorttag} Extension API`,
-		` * See https://docs.microsoft.com/sql/azure-data-studio/extensibility-apis for more information`,
+		` * Type Definition for Visual Studio Code ${shorttag} Extension API`,
+		` * See https://code.visualstudio.com/api for more information`,
 		` */`
 	].join('\n');
 
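Both `getNewFileHeader` variants above derive a "short tag" (major.minor) from the full git tag before interpolating it into the generated header. Isolated as a tiny JavaScript sketch:

```javascript
// Derive the major.minor short tag used in the type-definition header,
// mirroring the `tag.split('.')` logic in the update-types scripts above.
function getShortTag(tag) {
  const [major, minor] = tag.split('.');
  return `${major}.${minor}`;
}

console.log(getShortTag('1.30.2')); // 1.30
console.log(getShortTag('1.57.0')); // 1.57
```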
@@ -2,7 +2,7 @@ resources:
   containers:
     - container: linux-x64
       image: sqltoolscontainers.azurecr.io/linux-build-agent:3
-      endpoint: SqlToolsContainers
+      endpoint: ContainerRegistry
 
 jobs:
 - job: Compile
@@ -20,7 +20,7 @@ jobs:
 - job: macOS
   condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), ne(variables['VSCODE_QUALITY'], 'saw'))
   pool:
-    vmImage: 'macOS-10.15'
+    vmImage: macOS-latest
   dependsOn:
   - Compile
   steps:
@@ -30,7 +30,7 @@ jobs:
 - job: macOS_Signing
   condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), eq(variables['signed'], true), ne(variables['VSCODE_QUALITY'], 'saw'))
   pool:
-    vmImage: 'macOS-10.15'
+    vmImage: macOS-latest
   dependsOn:
   - macOS
   steps:
@@ -47,13 +47,13 @@ jobs:
   steps:
   - template: linux/sql-product-build-linux.yml
     parameters:
-      extensionsToUnitTest: ["admin-tool-ext-win", "agent", "azcli", "azurecore", "cms", "dacpac", "data-workspace", "import", "machine-learning", "notebook", "resource-deployment", "schema-compare", "sql-database-projects"]
+      extensionsToUnitTest: ["admin-tool-ext-win", "agent", "azcli", "azdata", "azurecore", "cms", "dacpac", "data-workspace", "import", "machine-learning", "notebook", "resource-deployment", "schema-compare", "sql-database-projects"]
     timeoutInMinutes: 90
 
 - job: Windows
   condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true'))
   pool:
-    vmImage: 'windows-2019'
+    vmImage: VS2017-Win2016
   dependsOn:
   - Compile
   steps:
@@ -79,13 +79,19 @@ steps:
 
   - script: |
       set -e
-      yarn npm-run-all -lp core-ci extensions-ci hygiene eslint valid-layers-check
-    displayName: Compile & Hygiene
+      yarn sqllint
+      yarn gulp hygiene
+      yarn strict-vscode
+      yarn valid-layers-check
+    displayName: Run hygiene, eslint
+    condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
 
   - script: |
       set -e
-      yarn npm-run-all -lp sqllint extensions-lint strict-vscode
-    displayName: SQL Hygiene
+      yarn gulp compile-build
+      yarn gulp compile-extensions-build
+      yarn gulp minify-vscode
+    displayName: Compile
 
   - script: |
       set -e
@@ -1,8 +1,8 @@
 resources:
   containers:
     - container: linux-x64
-      image: sqltoolscontainers.azurecr.io/web-build-image:2
-      endpoint: SqlToolsContainers
+      image: sqltoolscontainers.azurecr.io/web-build-image:1
+      endpoint: ContainerRegistry
 
 jobs:
 - job: LinuxWeb
24	build/azure-pipelines/sync-mooncake.yml	Normal file
@@ -0,0 +1,24 @@
+steps:
+  - task: NodeTool@0
+    inputs:
+      versionSpec: "14.x"
+
+  - task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
+    inputs:
+      versionSpec: "1.x"
+
+  - task: AzureKeyVault@1
+    displayName: "Azure Key Vault: Get Secrets"
+    inputs:
+      azureSubscription: "vscode-builds-subscription"
+      KeyVaultName: vscode
+
+  - script: |
+      set -e
+
+      (cd build ; yarn)
+
+      AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
+      AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
+      MOONCAKE_STORAGE_ACCESS_KEY="$(vscode-mooncake-storage-key)" \
+      node build/azure-pipelines/common/sync-mooncake.js "$VSCODE_QUALITY"
@@ -10,11 +10,6 @@ RUN apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xv
 	libkrb5-dev git apt-transport-https ca-certificates curl gnupg-agent software-properties-common \
 	libnss3 libasound2 make gcc libx11-dev fakeroot rpm libgconf-2-4 libunwind8 g++ python
 
-# Set the Chrome repo and install Chrome.
-RUN curl -sSL https://dl.google.com/linux/linux_signing_key.pub | apt-key add - \
-	&& echo "deb https://dl.google.com/linux/chrome/deb/ stable main" > /etc/apt/sources.list.d/google-chrome.list \
-	&& apt-get update && apt-get install -y google-chrome-stable
-
 #docker
 RUN curl -fsSL https://download.docker.com/linux/ubuntu/gpg | apt-key add -
 RUN apt-key fingerprint 0EBFCD88
@@ -119,19 +119,13 @@ steps:
 
   - script: |
       set -e
-      REPO="$(pwd)"
-      ROOT="$REPO/.."
-
-      WEB_BUILD_NAME="vscode-web"
-      WEB_TARBALL_FILENAME="vscode-web.tar.gz"
-      WEB_TARBALL_PATH="$ROOT/$WEB_TARBALL_FILENAME"
-
-      rm -rf $ROOT/vscode-web.tar.*
-
-      cd $ROOT && tar --owner=0 --group=0 -czf $WEB_TARBALL_PATH $WEB_BUILD_NAME
-    displayName: Prepare for publish
+      AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
+      AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
+      VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
+      ./build/azure-pipelines/web/publish.sh
+    displayName: Publish
 
   - publish: $(Agent.BuildDirectory)/vscode-web.tar.gz
-    artifact: vscode_web_linux_standalone_archive-unsigned
+    artifact: vscode-web-standalone
     displayName: Publish web archive
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
15	build/azure-pipelines/web/publish.sh	Executable file
@@ -0,0 +1,15 @@
+#!/usr/bin/env bash
+set -e
+REPO="$(pwd)"
+ROOT="$REPO/.."
+
+# Publish Web Client
+WEB_BUILD_NAME="vscode-web"
+WEB_TARBALL_FILENAME="vscode-web.tar.gz"
+WEB_TARBALL_PATH="$ROOT/$WEB_TARBALL_FILENAME"
+
+rm -rf $ROOT/vscode-web.tar.*
+
+(cd $ROOT && tar --owner=0 --group=0 -czf $WEB_TARBALL_PATH $WEB_BUILD_NAME)
+
+node build/azure-pipelines/common/createAsset.js web-standalone archive-unsigned "$WEB_TARBALL_FILENAME" "$WEB_TARBALL_PATH"
@@ -43,7 +43,6 @@ steps:
     path: .build/node_modules_cache
     cacheHitVar: NODE_MODULES_RESTORED
     displayName: Restore Cache - Node Modules
-    continueOnError: true
 
   - script: |
       set -e
@@ -81,7 +80,6 @@ steps:
   - script: |
       set -e
       yarn sqllint
-      yarn extensions-lint
       yarn gulp hygiene
       yarn strict-vscode
       yarn valid-layers-check
@@ -100,20 +98,6 @@ steps:
       yarn gulp vscode-reh-web-linux-x64-min
     displayName: Compile
 
-  - script: |
-      set -e
-      yarn gulp compile-extensions
-    displayName: Compile Extensions
-
-  - script: |
-      set -e
-      node ./node_modules/playwright/install.js
-      APP_ROOT=$(Agent.BuildDirectory)/vscode-reh-web-linux-x64
-      xvfb-run yarn smoketest --build "$(Agent.BuildDirectory)/vscode-reh-web-linux-x64" --web --headless --screenshots "$(Build.ArtifactStagingDirectory)/smokeshots" --log "$(Build.ArtifactStagingDirectory)/logs/web/smoke.log"
-    displayName: Run smoke tests (Browser)
-    continueOnError: true
-    condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
-
   # - script: |
   #   set -e
   #   AZURE_STORAGE_ACCOUNT="$(sourcemap-storage-account)" \
@@ -171,7 +155,7 @@ steps:
     displayName: 'Install .NET Core sdk for signing'
     inputs:
       packageType: sdk
-      version: 5.0.x
+      version: 2.1.x
     installationPath: $(Agent.ToolsDirectory)/dotnet
 
   - task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
@@ -295,31 +295,31 @@ steps:
       $env:AZURE_STORAGE_ACCESS_KEY_2 = "$(vscode-storage-key)"
       $env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
       $env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
-      .\build\azure-pipelines\win32\prepare-publish.ps1
+      .\build\azure-pipelines\win32\publish.ps1
     displayName: Publish
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
 
-  - publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\archive\$(ARCHIVE_NAME)
-    artifact: vscode_client_win32_$(VSCODE_ARCH)_archive
+  - publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\archive\VSCode-win32-$(VSCODE_ARCH).zip
+    artifact: vscode-win32-$(VSCODE_ARCH)
     displayName: Publish archive
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
 
-  - publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\system-setup\$(SYSTEM_SETUP_NAME)
-    artifact: vscode_client_win32_$(VSCODE_ARCH)_setup
+  - publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\system-setup\VSCodeSetup.exe
+    artifact: vscode-win32-$(VSCODE_ARCH)-setup
     displayName: Publish system setup
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
 
-  - publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\user-setup\$(USER_SETUP_NAME)
-    artifact: vscode_client_win32_$(VSCODE_ARCH)_user-setup
+  - publish: $(System.DefaultWorkingDirectory)\.build\win32-$(VSCODE_ARCH)\user-setup\VSCodeSetup.exe
+    artifact: vscode-win32-$(VSCODE_ARCH)-user-setup
     displayName: Publish user setup
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'))
 
   - publish: $(System.DefaultWorkingDirectory)\.build\vscode-server-win32-$(VSCODE_ARCH).zip
-    artifact: vscode_server_win32_$(VSCODE_ARCH)_archive
+    artifact: vscode-server-win32-$(VSCODE_ARCH)
     displayName: Publish server archive
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
 
   - publish: $(System.DefaultWorkingDirectory)\.build\vscode-server-win32-$(VSCODE_ARCH)-web.zip
-    artifact: vscode_web_win32_$(VSCODE_ARCH)_archive
+    artifact: vscode-server-win32-$(VSCODE_ARCH)-web
     displayName: Publish web server archive
     condition: and(succeeded(), ne(variables['VSCODE_PUBLISH'], 'false'), ne(variables['VSCODE_ARCH'], 'arm64'))
@@ -13,31 +13,24 @@ $Zip = "$Repo\.build\win32-$Arch\archive\VSCode-win32-$Arch.zip"
 $LegacyServer = "$Root\vscode-reh-win32-$Arch"
 $Server = "$Root\vscode-server-win32-$Arch"
 $ServerZip = "$Repo\.build\vscode-server-win32-$Arch.zip"
-$LegacyWeb = "$Root\vscode-reh-web-win32-$Arch"
-$Web = "$Root\vscode-server-win32-$Arch-web"
-$WebZip = "$Repo\.build\vscode-server-win32-$Arch-web.zip"
 $Build = "$Root\VSCode-win32-$Arch"
 
 # Create server archive
 if ("$Arch" -ne "arm64") {
 	exec { xcopy $LegacyServer $Server /H /E /I }
 	exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r }
-	exec { xcopy $LegacyWeb $Web /H /E /I }
-	exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $WebZip $Web -r }
 }
 
 # get version
 $PackageJson = Get-Content -Raw -Path "$Build\resources\app\package.json" | ConvertFrom-Json
 $Version = $PackageJson.version
 
-$ARCHIVE_NAME = "VSCode-win32-$Arch-$Version.zip"
-$SYSTEM_SETUP_NAME = "VSCodeSetup-$Arch-$Version.exe"
-$USER_SETUP_NAME = "VSCodeUserSetup-$Arch-$Version.exe"
-
-# Set variables for upload
-Move-Item $Zip "$Repo\.build\win32-$Arch\archive\$ARCHIVE_NAME"
-Write-Host "##vso[task.setvariable variable=ARCHIVE_NAME]$ARCHIVE_NAME"
-Move-Item $SystemExe "$Repo\.build\win32-$Arch\system-setup\$SYSTEM_SETUP_NAME"
-Write-Host "##vso[task.setvariable variable=SYSTEM_SETUP_NAME]$SYSTEM_SETUP_NAME"
-Move-Item $UserExe "$Repo\.build\win32-$Arch\user-setup\$USER_SETUP_NAME"
-Write-Host "##vso[task.setvariable variable=USER_SETUP_NAME]$USER_SETUP_NAME"
+$AssetPlatform = if ("$Arch" -eq "ia32") { "win32" } else { "win32-$Arch" }
+
+exec { node build/azure-pipelines/common/createAsset.js "$AssetPlatform-archive" archive "VSCode-win32-$Arch-$Version.zip" $Zip }
+exec { node build/azure-pipelines/common/createAsset.js "$AssetPlatform" setup "VSCodeSetup-$Arch-$Version.exe" $SystemExe }
+exec { node build/azure-pipelines/common/createAsset.js "$AssetPlatform-user" setup "VSCodeUserSetup-$Arch-$Version.exe" $UserExe }
+
+if ("$Arch" -ne "arm64") {
+	exec { node build/azure-pipelines/common/createAsset.js "server-$AssetPlatform" archive "vscode-server-win32-$Arch.zip" $ServerZip }
+}
@@ -27,7 +27,7 @@ steps:
|
|||||||
- powershell: |
|
- powershell: |
|
||||||
. build/azure-pipelines/win32/exec.ps1
|
. build/azure-pipelines/win32/exec.ps1
|
||||||
$ErrorActionPreference = "Stop"
|
$ErrorActionPreference = "Stop"
|
||||||
exec { tar -xf $(Pipeline.Workspace)/compilation.tar.gz }
|
exec { tar --force-local -xzf $(Pipeline.Workspace)/compilation.tar.gz }
|
||||||
displayName: Extract compilation output
|
displayName: Extract compilation output
|
||||||
|
|
||||||
- powershell: |
|
- powershell: |
|
||||||
@@ -57,7 +57,6 @@ steps:
|
|||||||
path: .build/node_modules_cache
|
path: .build/node_modules_cache
|
||||||
cacheHitVar: NODE_MODULES_RESTORED
|
cacheHitVar: NODE_MODULES_RESTORED
|
||||||
displayName: Restore Cache - Node Modules
|
displayName: Restore Cache - Node Modules
|
||||||
continueOnError: true
|
|
||||||
|
|
||||||
- powershell: |
|
- powershell: |
|
||||||
. build/azure-pipelines/win32/exec.ps1
|
. build/azure-pipelines/win32/exec.ps1
|
||||||
|
|||||||
@@ -23,7 +23,7 @@ async function main() {
|
|||||||
const outAppPath = path.join(buildDir, `VSCode-darwin-${arch}`, appName);
|
const outAppPath = path.join(buildDir, `VSCode-darwin-${arch}`, appName);
|
||||||
const productJsonPath = path.resolve(outAppPath, 'Contents', 'Resources', 'app', 'product.json');
|
const productJsonPath = path.resolve(outAppPath, 'Contents', 'Resources', 'app', 'product.json');
|
||||||
const infoPlistPath = path.resolve(outAppPath, 'Contents', 'Info.plist');
|
const infoPlistPath = path.resolve(outAppPath, 'Contents', 'Info.plist');
|
||||||
await vscode_universal_1.makeUniversalApp({
|
await (0, vscode_universal_1.makeUniversalApp)({
|
||||||
x64AppPath,
|
x64AppPath,
|
||||||
arm64AppPath,
|
arm64AppPath,
|
||||||
x64AsarPath,
|
x64AsarPath,
|
||||||
|
|||||||
@@ -51,7 +51,7 @@ module.exports.indentationFilter = [
|
|||||||
'!test/monaco/out/**',
|
'!test/monaco/out/**',
|
||||||
'!test/smoke/out/**',
|
'!test/smoke/out/**',
|
||||||
'!extensions/typescript-language-features/test-workspace/**',
|
'!extensions/typescript-language-features/test-workspace/**',
|
||||||
'!extensions/markdown-math/notebook-out/**',
|
'!extensions/notebook-markdown-extensions/notebook-out/**',
|
||||||
'!extensions/vscode-api-tests/testWorkspace/**',
|
'!extensions/vscode-api-tests/testWorkspace/**',
|
||||||
'!extensions/vscode-api-tests/testWorkspace2/**',
|
'!extensions/vscode-api-tests/testWorkspace2/**',
|
||||||
'!extensions/vscode-custom-editor-tests/test-workspace/**',
|
'!extensions/vscode-custom-editor-tests/test-workspace/**',
|
||||||
@@ -89,7 +89,7 @@ module.exports.indentationFilter = [
|
|||||||
'!**/*.dockerfile',
|
'!**/*.dockerfile',
|
||||||
'!extensions/markdown-language-features/media/*.js',
|
'!extensions/markdown-language-features/media/*.js',
|
||||||
'!extensions/markdown-language-features/notebook-out/*.js',
|
'!extensions/markdown-language-features/notebook-out/*.js',
|
||||||
'!extensions/markdown-math/notebook-out/*.js',
|
'!extensions/notebook-markdown-extensions/notebook-out/*.js',
|
||||||
'!extensions/simple-browser/media/*.js',
|
'!extensions/simple-browser/media/*.js',
|
||||||
];
|
];
|
||||||
|
|
||||||
@@ -119,7 +119,7 @@ module.exports.copyrightFilter = [
|
|||||||
'!resources/completions/**',
|
'!resources/completions/**',
|
||||||
'!extensions/configuration-editing/build/inline-allOf.ts',
|
'!extensions/configuration-editing/build/inline-allOf.ts',
|
||||||
'!extensions/markdown-language-features/media/highlight.css',
|
'!extensions/markdown-language-features/media/highlight.css',
|
||||||
'!extensions/markdown-math/notebook-out/**',
|
'!extensions/notebook-markdown-extensions/notebook-out/**',
|
||||||
'!extensions/html-language-features/server/src/modes/typescript/*',
|
'!extensions/html-language-features/server/src/modes/typescript/*',
|
||||||
'!extensions/*/server/bin/*',
|
'!extensions/*/server/bin/*',
|
||||||
'!src/vs/editor/test/node/classification/typescript-test.ts',
|
'!src/vs/editor/test/node/classification/typescript-test.ts',
|
||||||
|
|||||||
@@ -14,7 +14,7 @@ const i18n = require('./lib/i18n');
|
|||||||
const standalone = require('./lib/standalone');
|
const standalone = require('./lib/standalone');
|
||||||
const cp = require('child_process');
|
const cp = require('child_process');
|
||||||
const compilation = require('./lib/compilation');
|
const compilation = require('./lib/compilation');
|
||||||
const monacoapi = require('./lib/monaco-api');
|
const monacoapi = require('./monaco/api');
|
||||||
const fs = require('fs');
|
const fs = require('fs');
|
||||||
|
|
||||||
let root = path.dirname(__dirname);
|
let root = path.dirname(__dirname);
|
||||||
@@ -49,7 +49,7 @@ let BUNDLED_FILE_HEADER = [
|
|||||||
' * Copyright (c) Microsoft Corporation. All rights reserved.',
|
' * Copyright (c) Microsoft Corporation. All rights reserved.',
|
||||||
' * Version: ' + headerVersion,
|
' * Version: ' + headerVersion,
|
||||||
' * Released under the Source EULA',
|
' * Released under the Source EULA',
|
||||||
' * https://github.com/microsoft/vscode/blob/main/LICENSE.txt',
|
' * https://github.com/Microsoft/vscode/blob/master/LICENSE.txt',
|
||||||
' *-----------------------------------------------------------*/',
|
' *-----------------------------------------------------------*/',
|
||||||
''
|
''
|
||||||
].join('\n');
|
].join('\n');
|
||||||
@@ -279,7 +279,7 @@ const finalEditorResourcesTask = task.define('final-editor-resources', () => {
|
|||||||
// version.txt
|
// version.txt
|
||||||
gulp.src('build/monaco/version.txt')
|
gulp.src('build/monaco/version.txt')
|
||||||
.pipe(es.through(function (data) {
|
.pipe(es.through(function (data) {
|
||||||
data.contents = Buffer.from(`monaco-editor-core: https://github.com/microsoft/vscode/tree/${sha1}`);
|
data.contents = Buffer.from(`monaco-editor-core: https://github.com/Microsoft/vscode/tree/${sha1}`);
|
||||||
this.emit('data', data);
|
this.emit('data', data);
|
||||||
}))
|
}))
|
||||||
.pipe(gulp.dest('out-monaco-editor-core')),
|
.pipe(gulp.dest('out-monaco-editor-core')),
|
||||||
|
|||||||
@@ -8,6 +8,7 @@ require('events').EventEmitter.defaultMaxListeners = 100;
|
|||||||
|
|
||||||
const gulp = require('gulp');
|
const gulp = require('gulp');
|
||||||
const path = require('path');
|
const path = require('path');
|
||||||
|
const child_process = require('child_process');
|
||||||
const nodeUtil = require('util');
|
const nodeUtil = require('util');
|
||||||
const es = require('event-stream');
|
const es = require('event-stream');
|
||||||
const filter = require('gulp-filter');
|
const filter = require('gulp-filter');
|
||||||
@@ -19,6 +20,8 @@ const glob = require('glob');
|
|||||||
const root = path.dirname(__dirname);
|
const root = path.dirname(__dirname);
|
||||||
const commit = util.getVersion(root);
|
const commit = util.getVersion(root);
|
||||||
const plumber = require('gulp-plumber');
|
const plumber = require('gulp-plumber');
|
||||||
|
const fancyLog = require('fancy-log');
|
||||||
|
const ansiColors = require('ansi-colors');
|
||||||
const ext = require('./lib/extensions');
|
const ext = require('./lib/extensions');
|
||||||
|
|
||||||
const extensionsPath = path.join(path.dirname(__dirname), 'extensions');
|
const extensionsPath = path.join(path.dirname(__dirname), 'extensions');
|
||||||
@@ -56,7 +59,6 @@ const compilations = glob.sync('**/tsconfig.json', {
|
|||||||
// 'json-language-features/server/tsconfig.json',
|
// 'json-language-features/server/tsconfig.json',
|
||||||
// 'markdown-language-features/preview-src/tsconfig.json',
|
// 'markdown-language-features/preview-src/tsconfig.json',
|
||||||
// 'markdown-language-features/tsconfig.json',
|
// 'markdown-language-features/tsconfig.json',
|
||||||
// 'markdown-math/tsconfig.json',
|
|
||||||
// 'merge-conflict/tsconfig.json',
|
// 'merge-conflict/tsconfig.json',
|
||||||
// 'microsoft-authentication/tsconfig.json',
|
// 'microsoft-authentication/tsconfig.json',
|
||||||
// 'npm/tsconfig.json',
|
// 'npm/tsconfig.json',
|
||||||
@@ -205,17 +207,45 @@ gulp.task(compileExtensionsBuildLegacyTask);
|
|||||||
|
|
||||||
//#region Extension media
|
//#region Extension media
|
||||||
|
|
||||||
const compileExtensionMediaTask = task.define('compile-extension-media', () => ext.buildExtensionMedia(false));
|
// Additional projects to webpack. These typically build code for webviews
|
||||||
|
const webpackMediaConfigFiles = [
|
||||||
|
'markdown-language-features/webpack.config.js',
|
||||||
|
'simple-browser/webpack.config.js',
|
||||||
|
];
|
||||||
|
|
||||||
|
// Additional projects to run esbuild on. These typically build code for webviews
|
||||||
|
const esbuildMediaScripts = [
|
||||||
|
'markdown-language-features/esbuild.js',
|
||||||
|
'notebook-markdown-extensions/esbuild.js',
|
||||||
|
];
|
||||||
|
|
||||||
|
const compileExtensionMediaTask = task.define('compile-extension-media', () => buildExtensionMedia(false));
|
||||||
gulp.task(compileExtensionMediaTask);
|
gulp.task(compileExtensionMediaTask);
|
||||||
exports.compileExtensionMediaTask = compileExtensionMediaTask;
|
exports.compileExtensionMediaTask = compileExtensionMediaTask;
|
||||||
|
|
||||||
const watchExtensionMedia = task.define('watch-extension-media', () => ext.buildExtensionMedia(true));
|
const watchExtensionMedia = task.define('watch-extension-media', () => buildExtensionMedia(true));
|
||||||
gulp.task(watchExtensionMedia);
|
gulp.task(watchExtensionMedia);
|
||||||
exports.watchExtensionMedia = watchExtensionMedia;
|
exports.watchExtensionMedia = watchExtensionMedia;
|
||||||
|
|
||||||
const compileExtensionMediaBuildTask = task.define('compile-extension-media-build', () => ext.buildExtensionMedia(false, '.build/extensions'));
|
const compileExtensionMediaBuildTask = task.define('compile-extension-media-build', () => buildExtensionMedia(false, '.build/extensions'));
|
||||||
gulp.task(compileExtensionMediaBuildTask);
|
gulp.task(compileExtensionMediaBuildTask);
|
||||||
|
|
||||||
|
async function buildExtensionMedia(isWatch, outputRoot) {
|
||||||
|
const webpackConfigLocations = webpackMediaConfigFiles.map(p => {
|
||||||
|
return {
|
||||||
|
configPath: path.join(extensionsPath, p),
|
||||||
|
outputRoot: outputRoot ? path.join(root, outputRoot, path.dirname(p)) : undefined
|
||||||
|
};
|
||||||
|
});
|
||||||
|
return Promise.all([
|
||||||
|
webpackExtensions('webpacking extension media', isWatch, webpackConfigLocations),
|
||||||
|
esbuildExtensions('esbuilding extension media', isWatch, esbuildMediaScripts.map(p => ({
|
||||||
|
script: path.join(extensionsPath, p),
|
||||||
|
outputRoot: outputRoot ? path.join(root, outputRoot, path.dirname(p)) : undefined
|
||||||
|
}))),
|
||||||
|
]);
|
||||||
|
}
|
||||||
|
|
||||||
//#endregion
|
//#endregion
|
||||||
|
|
||||||
//#region Azure Pipelines
|
//#region Azure Pipelines
|
||||||
@@ -290,5 +320,121 @@ async function buildWebExtensions(isWatch) {
|
|||||||
path.join(extensionsPath, '**', 'extension-browser.webpack.config.js'),
|
path.join(extensionsPath, '**', 'extension-browser.webpack.config.js'),
|
||||||
{ ignore: ['**/node_modules'] }
|
{ ignore: ['**/node_modules'] }
|
||||||
);
|
);
|
||||||
return ext.webpackExtensions('packaging web extension', isWatch, webpackConfigLocations.map(configPath => ({ configPath })));
|
return webpackExtensions('packaging web extension', isWatch, webpackConfigLocations.map(configPath => ({ configPath })));
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* @param {string} taskName
|
||||||
|
* @param {boolean} isWatch
|
||||||
|
* @param {{ configPath: string, outputRoot?: boolean}} webpackConfigLocations
|
||||||
|
*/
|
||||||
|
async function webpackExtensions(taskName, isWatch, webpackConfigLocations) {
|
||||||
|
const webpack = require('webpack');
|
||||||
|
|
||||||
|
const webpackConfigs = [];
|
||||||
|
|
||||||
|
for (const { configPath, outputRoot } of webpackConfigLocations) {
|
||||||
|
const configOrFnOrArray = require(configPath);
|
||||||
|
function addConfig(configOrFn) {
|
||||||
|
let config;
|
||||||
|
if (typeof configOrFn === 'function') {
|
||||||
|
config = configOrFn({}, {});
|
||||||
|
webpackConfigs.push(config);
|
||||||
|
} else {
|
||||||
|
config = configOrFn;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (outputRoot) {
|
||||||
|
config.output.path = path.join(outputRoot, path.relative(path.dirname(configPath), config.output.path));
|
||||||
|
}
|
||||||
|
|
||||||
|
webpackConfigs.push(configOrFn);
|
||||||
|
}
|
||||||
|
addConfig(configOrFnOrArray);
|
||||||
|
}
|
||||||
|
function reporter(fullStats) {
|
||||||
|
if (Array.isArray(fullStats.children)) {
|
||||||
|
for (const stats of fullStats.children) {
|
||||||
|
const outputPath = stats.outputPath;
|
||||||
|
if (outputPath) {
|
||||||
|
const relativePath = path.relative(extensionsPath, outputPath).replace(/\\/g, '/');
|
||||||
|
const match = relativePath.match(/[^\/]+(\/server|\/client)?/);
|
||||||
|
fancyLog(`Finished ${ansiColors.green(taskName)} ${ansiColors.cyan(match[0])} with ${stats.errors.length} errors.`);
|
||||||
|
}
|
||||||
|
if (Array.isArray(stats.errors)) {
|
||||||
|
stats.errors.forEach(error => {
|
||||||
|
fancyLog.error(error);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
if (Array.isArray(stats.warnings)) {
|
||||||
|
stats.warnings.forEach(warning => {
|
||||||
|
fancyLog.warn(warning);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return new Promise((resolve, reject) => {
|
||||||
|
if (isWatch) {
|
||||||
|
webpack(webpackConfigs).watch({}, (err, stats) => {
|
||||||
|
if (err) {
|
||||||
|
reject();
|
||||||
|
} else {
|
||||||
|
reporter(stats.toJson());
|
||||||
|
}
|
||||||
|
});
|
||||||
|
} else {
|
||||||
|
webpack(webpackConfigs).run((err, stats) => {
|
||||||
|
if (err) {
|
||||||
|
fancyLog.error(err);
|
||||||
|
reject();
|
||||||
|
} else {
|
||||||
|
reporter(stats.toJson());
|
||||||
|
resolve();
|
||||||
|
}
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* @param {string} taskName
|
||||||
|
* @param {boolean} isWatch
|
||||||
|
* @param {{ script: string, outputRoot?: string }}} scripts
|
||||||
|
*/
|
||||||
|
async function esbuildExtensions(taskName, isWatch, scripts) {
|
||||||
|
function reporter(/** @type {string} */ stdError, /** @type {string} */script) {
|
||||||
|
const matches = (stdError || '').match(/\> (.+): error: (.+)?/g);
|
||||||
|
fancyLog(`Finished ${ansiColors.green(taskName)} ${script} with ${matches ? matches.length : 0} errors.`);
|
||||||
|
for (const match of matches || []) {
|
||||||
|
fancyLog.error(match);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
const tasks = scripts.map(({ script, outputRoot }) => {
|
||||||
|
return new Promise((resolve, reject) => {
|
||||||
|
const args = [script];
|
||||||
|
if (isWatch) {
|
||||||
|
args.push('--watch');
|
||||||
|
}
|
||||||
|
if (outputRoot) {
|
||||||
|
args.push('--outputRoot', outputRoot);
|
||||||
|
}
|
||||||
|
const proc = child_process.execFile(process.argv[0], args, {}, (error, _stdout, stderr) => {
|
||||||
|
if (error) {
|
||||||
|
return reject(error);
|
||||||
|
}
|
||||||
|
reporter(stderr, script);
|
||||||
|
if (stderr) {
|
||||||
|
return reject();
|
||||||
|
}
|
||||||
|
return resolve();
|
||||||
|
});
|
||||||
|
|
||||||
|
proc.stdout.on('data', (data) => {
|
||||||
|
fancyLog(`${ansiColors.green(taskName)}: ${data.toString('utf8')}`);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
});
|
||||||
|
return Promise.all(tasks);
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -15,8 +15,6 @@ const task = require('./lib/task');
|
|||||||
const glob = require('glob');
|
const glob = require('glob');
|
||||||
const vsce = require('vsce');
|
const vsce = require('vsce');
|
||||||
const mkdirp = require('mkdirp');
|
const mkdirp = require('mkdirp');
|
||||||
const rename = require('gulp-rename');
|
|
||||||
const fs = require('fs');
|
|
||||||
|
|
||||||
gulp.task('fmt', () => formatStagedFiles());
|
gulp.task('fmt', () => formatStagedFiles());
|
||||||
const formatFiles = (some) => {
|
const formatFiles = (some) => {
|
||||||
@@ -96,14 +94,12 @@ const root = path.dirname(__dirname);
|
|||||||
|
|
||||||
gulp.task('package-external-extensions', task.series(
|
gulp.task('package-external-extensions', task.series(
|
||||||
task.define('bundle-external-extensions-build', () => ext.packageExternalExtensionsStream().pipe(gulp.dest('.build/external'))),
|
task.define('bundle-external-extensions-build', () => ext.packageExternalExtensionsStream().pipe(gulp.dest('.build/external'))),
|
||||||
task.define('create-external-extension-vsix-build', async () => {
|
task.define('create-external-extension-vsix-build', () => {
|
||||||
const vsixes = glob.sync('.build/external/extensions/*/package.json').map(manifestPath => {
|
const vsixes = glob.sync('.build/external/extensions/*/package.json').map(manifestPath => {
|
||||||
const extensionPath = path.dirname(path.join(root, manifestPath));
|
const extensionPath = path.dirname(path.join(root, manifestPath));
|
||||||
const extensionName = path.basename(extensionPath);
|
const extensionName = path.basename(extensionPath);
|
||||||
return { name: extensionName, path: extensionPath };
|
return { name: extensionName, path: extensionPath };
|
||||||
})
|
}).map(element => {
|
||||||
.filter(element => ext.vscodeExternalExtensions.indexOf(element.name) === -1) // VS Code external extensions are bundled into ADS so no need to create a normal VSIX for them
|
|
||||||
.map(element => {
|
|
||||||
const pkgJson = require(path.join(element.path, 'package.json'));
|
const pkgJson = require(path.join(element.path, 'package.json'));
|
||||||
const vsixDirectory = path.join(root, '.build', 'extensions');
|
const vsixDirectory = path.join(root, '.build', 'extensions');
|
||||||
mkdirp.sync(vsixDirectory);
|
mkdirp.sync(vsixDirectory);
|
||||||
@@ -115,46 +111,8 @@ gulp.task('package-external-extensions', task.series(
|
|||||||
useYarn: true
|
useYarn: true
|
||||||
});
|
});
|
||||||
});
|
});
|
||||||
// Wait for all the initial VSIXes to be completed before making the VS Code ones since we'll be overwriting
|
|
||||||
// values in the package.json for those.
|
|
||||||
await Promise.all(vsixes);
|
|
||||||
|
|
||||||
// Go through and find the extensions which build separate versions of themselves for VS Code.
|
return Promise.all(vsixes);
|
||||||
// This is currently a pretty simplistic process, essentially just replacing certain values in
|
|
||||||
// the package.json. It doesn't handle more complex tasks such as replacing localized strings.
|
|
||||||
const vscodeVsixes = glob.sync('.build/external/extensions/*/package.vscode.json')
|
|
||||||
.map(async vscodeManifestRelativePath => {
|
|
||||||
const vscodeManifestFullPath = path.join(root, vscodeManifestRelativePath);
|
|
||||||
const packageDir = path.dirname(vscodeManifestFullPath);
|
|
||||||
const packageManifestPath = path.join(packageDir, 'package.json');
|
|
||||||
const json = require('gulp-json-editor');
|
|
||||||
const packageJsonStream = gulp.src(packageManifestPath) // Create stream for the original package.json
|
|
||||||
.pipe(json(data => { // And now use gulp-json-editor to modify the contents
|
|
||||||
const updateData = JSON.parse(fs.readFileSync(vscodeManifestFullPath)); // Read in the set of values to replace from package.vscode.json
|
|
||||||
Object.keys(updateData).forEach(key => {
|
|
||||||
data[key] = updateData[key];
|
|
||||||
});
|
|
||||||
// Remove ADS-only menus. This is a subset of the menus listed in https://github.com/microsoft/azuredatastudio/blob/main/src/vs/workbench/api/common/menusExtensionPoint.ts
|
|
||||||
// More can be added to the list as needed.
|
|
||||||
['objectExplorer/item/context', 'dataExplorer/context', 'dashboard/toolbar'].forEach(menu => {
|
|
||||||
delete data.contributes.menus[menu];
|
|
||||||
});
|
|
||||||
return data;
|
|
||||||
}, { beautify: false }))
|
|
||||||
.pipe(gulp.dest(packageDir));
|
|
||||||
await new Promise(resolve => packageJsonStream.on('finish', resolve)); // Wait for the files to finish being updated before packaging
|
|
||||||
const pkgJson = JSON.parse(fs.readFileSync(packageManifestPath));
|
|
||||||
const vsixDirectory = path.join(root, '.build', 'extensions');
|
|
||||||
const packagePath = path.join(vsixDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
|
|
||||||
console.info('Creating vsix for ' + packageDir + ' result:' + packagePath);
|
|
||||||
return vsce.createVSIX({
|
|
||||||
cwd: packageDir,
|
|
||||||
packagePath: packagePath,
|
|
||||||
useYarn: true
|
|
||||||
});
|
|
||||||
});
|
|
||||||
|
|
||||||
return Promise.all(vscodeVsixes);
|
|
||||||
})
|
})
|
||||||
));
|
));
|
||||||
|
|
||||||
|
|||||||
@@ -112,32 +112,33 @@ gulp.task(optimizeVSCodeTask);
|
|||||||
|
|
||||||
// List of ADS extension XLF files that we want to put into the English resource folder.
|
// List of ADS extension XLF files that we want to put into the English resource folder.
|
||||||
const extensionsFilter = filter([
|
const extensionsFilter = filter([
|
||||||
'**/admin-tool-ext-win.xlf',
|
"**/admin-tool-ext-win.xlf",
|
||||||
'**/agent.xlf',
|
"**/agent.xlf",
|
||||||
'**/arc.xlf',
|
"**/arc.xlf",
|
||||||
'**/asde-deployment.xlf',
|
"**/asde-deployment.xlf",
|
||||||
'**/azurecore.xlf',
|
"**/azdata.xlf",
|
||||||
'**/azurehybridtoolkit.xlf',
|
"**/azurecore.xlf",
|
||||||
'**/big-data-cluster.xlf',
|
"**/azurehybridtoolkit.xlf",
|
||||||
'**/cms.xlf',
|
"**/big-data-cluster.xlf",
|
||||||
'**/dacpac.xlf',
|
"**/cms.xlf",
|
||||||
'**/data-workspace.xlf',
|
"**/dacpac.xlf",
|
||||||
'**/import.xlf',
|
"**/data-workspace.xlf",
|
||||||
'**/kusto.xlf',
|
"**/import.xlf",
|
||||||
'**/machine-learning.xlf',
|
"**/kusto.xlf",
|
||||||
'**/Microsoft.sqlservernotebook.xlf',
|
"**/machine-learning.xlf",
|
||||||
'**/mssql.xlf',
|
"**/Microsoft.sqlservernotebook.xlf",
|
||||||
'**/notebook.xlf',
|
"**/mssql.xlf",
|
||||||
'**/profiler.xlf',
|
"**/notebook.xlf",
|
||||||
'**/query-history.xlf',
|
"**/profiler.xlf",
|
||||||
'**/resource-deployment.xlf',
|
"**/query-history.xlf",
|
||||||
'**/schema-compare.xlf',
|
"**/resource-deployment.xlf",
|
||||||
'**/server-report.xlf',
|
"**/schema-compare.xlf",
|
||||||
'**/sql-assessment.xlf',
|
"**/server-report.xlf",
|
||||||
'**/sql-database-projects.xlf',
|
"**/sql-assessment.xlf",
|
||||||
'**/sql-migration.xlf',
|
"**/sql-database-projects.xlf",
|
||||||
'**/xml-language-features.xlf'
|
"**/sql-migration.xlf",
|
||||||
]);
|
"**/xml-language-features.xlf"
|
||||||
|
])
|
||||||
|
|
||||||
// Copy ADS extension XLFs into English resource folder.
|
// Copy ADS extension XLFs into English resource folder.
|
||||||
const importExtensionsTask = task.define('import-extensions-xlfs', function () {
|
const importExtensionsTask = task.define('import-extensions-xlfs', function () {
|
||||||
@@ -148,7 +149,7 @@ const importExtensionsTask = task.define('import-extensions-xlfs', function () {
|
|||||||
)
|
)
|
||||||
.pipe(vfs.dest(`./resources/xlf/en`));
|
.pipe(vfs.dest(`./resources/xlf/en`));
|
||||||
});
|
});
|
||||||
gulp.task(importExtensionsTask);
|
gulp.task(importExtensionsTask)
|
||||||
// {{SQL CARBON EDIT}} end
|
// {{SQL CARBON EDIT}} end
|
||||||
|
|
||||||
const sourceMappingURLBase = `https://sqlopsbuilds.blob.core.windows.net/sourcemaps/${commit}`;
|
const sourceMappingURLBase = `https://sqlopsbuilds.blob.core.windows.net/sourcemaps/${commit}`;
|
||||||
@@ -282,14 +283,7 @@ function packageTask(platform, arch, sourceFolderName, destinationFolderName, op
|
|||||||
.pipe(jsFilter)
|
.pipe(jsFilter)
|
||||||
.pipe(util.rewriteSourceMappingURL(sourceMappingURLBase))
|
.pipe(util.rewriteSourceMappingURL(sourceMappingURLBase))
|
||||||
.pipe(jsFilter.restore)
|
.pipe(jsFilter.restore)
|
||||||
.pipe(createAsar(path.join(process.cwd(), 'node_modules'), [
|
.pipe(createAsar(path.join(process.cwd(), 'node_modules'), ['**/*.node', '**/vscode-ripgrep/bin/*', '**/node-pty/build/Release/*', '**/*.wasm'], 'node_modules.asar'));
|
||||||
'**/*.node',
|
|
||||||
'**/vscode-ripgrep/bin/*',
|
|
||||||
'**/node-pty/build/Release/*',
|
|
||||||
'**/node-pty/lib/worker/conoutSocketWorker.js',
|
|
||||||
'**/node-pty/lib/shared/conout.js',
|
|
||||||
'**/*.wasm'
|
|
||||||
], 'node_modules.asar'));
|
|
||||||
|
|
||||||
let all = es.merge(
|
let all = es.merge(
|
||||||
packageJsonStream,
|
packageJsonStream,
|
||||||
@@ -445,6 +439,8 @@ BUILD_TARGETS.forEach(buildTarget => {
|
|||||||
}
|
}
|
||||||
});
|
});
|
||||||
|
|
||||||
|
// Transifex Localizations
|
||||||
|
|
||||||
const innoSetupConfig = {
|
const innoSetupConfig = {
|
||||||
'zh-cn': { codePage: 'CP936', defaultInfo: { name: 'Simplified Chinese', id: '$0804', } },
|
'zh-cn': { codePage: 'CP936', defaultInfo: { name: 'Simplified Chinese', id: '$0804', } },
|
||||||
'zh-tw': { codePage: 'CP950', defaultInfo: { name: 'Traditional Chinese', id: '$0404' } },
|
'zh-tw': { codePage: 'CP950', defaultInfo: { name: 'Traditional Chinese', id: '$0404' } },
|
||||||
@@ -460,8 +456,6 @@ const innoSetupConfig = {
|
|||||||
'tr': { codePage: 'CP1254' }
|
'tr': { codePage: 'CP1254' }
|
||||||
};
|
};
|
||||||
|
|
||||||
// Transifex Localizations
|
|
||||||
|
|
||||||
const apiHostname = process.env.TRANSIFEX_API_URL;
|
const apiHostname = process.env.TRANSIFEX_API_URL;
|
||||||
const apiName = process.env.TRANSIFEX_API_NAME;
|
const apiName = process.env.TRANSIFEX_API_NAME;
|
||||||
const apiToken = process.env.TRANSIFEX_API_TOKEN;
|
const apiToken = process.env.TRANSIFEX_API_TOKEN;
|
||||||
@@ -492,12 +486,12 @@ const vscodeTranslationsExport = task.define(
|
|||||||
'vscode-translations-export',
|
'vscode-translations-export',
|
||||||
task.series(
|
task.series(
|
||||||
compileBuildTask,
|
compileBuildTask,
|
||||||
compileLocalizationExtensionsBuildTask, // {{SQL CARBON EDIT}} now include all extensions in ADS, not just a subset. (replaces 'compileExtensionsBuildTask' here).
|
compileLocalizationExtensionsBuildTask, // {{SQL CARBON EDIT}} now include all extensions in ADS, not just a subset. (replaces "compileExtensionsBuildTask" here).
|
||||||
optimizeVSCodeTask,
|
optimizeVSCodeTask,
|
||||||
function () {
|
function () {
|
||||||
const pathToMetadata = './out-vscode/nls.metadata.json';
|
const pathToMetadata = './out-vscode/nls.metadata.json';
|
||||||
const pathToExtensions = '.build/extensions/*';
|
const pathToExtensions = '.build/extensions/*';
|
||||||
const pathToSetup = 'build/win32/i18n/messages.en.isl';
|
const pathToSetup = 'build/win32/**/{Default.isl,messages.en.isl}';
|
||||||
|
|
||||||
return es.merge(
|
return es.merge(
|
||||||
gulp.src(pathToMetadata).pipe(i18n.createXlfFilesForCoreBundle()),
|
gulp.src(pathToMetadata).pipe(i18n.createXlfFilesForCoreBundle()),
|
||||||
@@ -507,7 +501,7 @@ const vscodeTranslationsExport = task.define(
|
|||||||
}
|
}
|
||||||
)
|
)
|
||||||
);
|
);
|
||||||
gulp.task(vscodeTranslationsExport);
|
gulp.task(vscodeTranslationsExport)
|
||||||
|
|
||||||
// {{SQL CARBON EDIT}} Localization gulp task, runs vscodeTranslationsExport and imports a subset of the generated XLFs into the folder.
|
// {{SQL CARBON EDIT}} Localization gulp task, runs vscodeTranslationsExport and imports a subset of the generated XLFs into the folder.
|
||||||
gulp.task(task.define(
|
gulp.task(task.define(
|
||||||
|
|||||||
@@ -18,8 +18,8 @@ const ansiColors = require("ansi-colors");
|
|||||||
const mkdirp = require('mkdirp');
|
const mkdirp = require('mkdirp');
|
||||||
const root = path.dirname(path.dirname(__dirname));
|
const root = path.dirname(path.dirname(__dirname));
|
||||||
const productjson = JSON.parse(fs.readFileSync(path.join(__dirname, '../../product.json'), 'utf8'));
|
const productjson = JSON.parse(fs.readFileSync(path.join(__dirname, '../../product.json'), 'utf8'));
|
||||||
const builtInExtensions = productjson.builtInExtensions || [];
|
const builtInExtensions = productjson.builtInExtensions;
|
||||||
const webBuiltInExtensions = productjson.webBuiltInExtensions || [];
|
const webBuiltInExtensions = productjson.webBuiltInExtensions;
|
||||||
const controlFilePath = path.join(os.homedir(), '.vscode-oss-dev', 'extensions', 'control.json');
|
const controlFilePath = path.join(os.homedir(), '.vscode-oss-dev', 'extensions', 'control.json');
|
||||||
 const ENABLE_LOGGING = !process.env['VSCODE_BUILD_BUILTIN_EXTENSIONS_SILENCE_PLEASE'];

 function log(...messages) {

@@ -36,8 +36,8 @@ export interface IExtensionDefinition {
 const root = path.dirname(path.dirname(__dirname));
 const productjson = JSON.parse(fs.readFileSync(path.join(__dirname, '../../product.json'), 'utf8'));
-const builtInExtensions = <IExtensionDefinition[]>productjson.builtInExtensions || [];
-const webBuiltInExtensions = <IExtensionDefinition[]>productjson.webBuiltInExtensions || [];
+const builtInExtensions = <IExtensionDefinition[]>productjson.builtInExtensions;
+const webBuiltInExtensions = <IExtensionDefinition[]>productjson.webBuiltInExtensions;
 const controlFilePath = path.join(os.homedir(), '.vscode-oss-dev', 'extensions', 'control.json');
 const ENABLE_LOGGING = !process.env['VSCODE_BUILD_BUILTIN_EXTENSIONS_SILENCE_PLEASE'];

@@ -1,7 +1,7 @@
 "use strict";
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
- * Licensed under the Source EULA. See License.txt in the project root for license information.
+ * Licensed under the MIT License. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
 Object.defineProperty(exports, "__esModule", { value: true });
 const got_1 = require("got");
@@ -12,8 +12,8 @@ const ansiColors = require("ansi-colors");
 const root = path.dirname(path.dirname(__dirname));
 const rootCG = path.join(root, 'extensionsCG');
 const productjson = JSON.parse(fs.readFileSync(path.join(__dirname, '../../product.json'), 'utf8'));
-const builtInExtensions = productjson.builtInExtensions || [];
-const webBuiltInExtensions = productjson.webBuiltInExtensions || [];
+const builtInExtensions = productjson.builtInExtensions;
+const webBuiltInExtensions = productjson.webBuiltInExtensions;
 const token = process.env['VSCODE_MIXIN_PASSWORD'] || process.env['GITHUB_TOKEN'] || undefined;
 const contentBasePath = 'raw.githubusercontent.com';
 const contentFileNames = ['package.json', 'package-lock.json', 'yarn.lock'];
@@ -25,7 +25,7 @@ async function downloadExtensionDetails(extension) {
     const promises = [];
     for (const fileName of contentFileNames) {
         promises.push(new Promise(resolve => {
-            got_1.default(`${repositoryContentBaseUrl}/${fileName}`)
+            (0, got_1.default)(`${repositoryContentBaseUrl}/${fileName}`)
                 .then(response => {
                 resolve({ fileName, body: response.rawBody });
             })
@@ -1,6 +1,6 @@
 /*---------------------------------------------------------------------------------------------
  * Copyright (c) Microsoft Corporation. All rights reserved.
- * Licensed under the Source EULA. See License.txt in the project root for license information.
+ * Licensed under the MIT License. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/

 import got from 'got';
@@ -13,8 +13,8 @@ import { IExtensionDefinition } from './builtInExtensions';
 const root = path.dirname(path.dirname(__dirname));
 const rootCG = path.join(root, 'extensionsCG');
 const productjson = JSON.parse(fs.readFileSync(path.join(__dirname, '../../product.json'), 'utf8'));
-const builtInExtensions = <IExtensionDefinition[]>productjson.builtInExtensions || [];
-const webBuiltInExtensions = <IExtensionDefinition[]>productjson.webBuiltInExtensions || [];
+const builtInExtensions = <IExtensionDefinition[]>productjson.builtInExtensions;
+const webBuiltInExtensions = <IExtensionDefinition[]>productjson.webBuiltInExtensions;
 const token = process.env['VSCODE_MIXIN_PASSWORD'] || process.env['GITHUB_TOKEN'] || undefined;

 const contentBasePath = 'raw.githubusercontent.com';
@@ -9,7 +9,7 @@ const es = require("event-stream");
 const fs = require("fs");
 const gulp = require("gulp");
 const path = require("path");
-const monacodts = require("./monaco-api");
+const monacodts = require("../monaco/api");
 const nls = require("./nls");
 const reporter_1 = require("./reporter");
 const util = require("./util");
@@ -17,7 +17,7 @@ const fancyLog = require("fancy-log");
 const ansiColors = require("ansi-colors");
 const os = require("os");
 const watch = require('./watch');
-const reporter = reporter_1.createReporter();
+const reporter = (0, reporter_1.createReporter)();
 function getTypeScriptCompilerOptions(src) {
     const rootDir = path.join(__dirname, `../../${src}`);
     let options = {};
@@ -9,7 +9,7 @@ import * as es from 'event-stream';
 import * as fs from 'fs';
 import * as gulp from 'gulp';
 import * as path from 'path';
-import * as monacodts from './monaco-api';
+import * as monacodts from '../monaco/api';
 import * as nls from './nls';
 import { createReporter } from './reporter';
 import * as util from './util';
@@ -21,7 +21,7 @@ module.exports = new class {
         const configs = context.options;
         for (const config of configs) {
             if (minimatch(context.getFilename(), config.target)) {
-                return utils_1.createImportRuleListener((node, value) => this._checkImport(context, config, node, value));
+                return (0, utils_1.createImportRuleListener)((node, value) => this._checkImport(context, config, node, value));
             }
         }
         return {};
@@ -29,7 +29,7 @@ module.exports = new class {
     _checkImport(context, config, node, path) {
         // resolve relative paths
         if (path[0] === '.') {
-            path = path_1.join(context.getFilename(), path);
+            path = (0, path_1.join)(context.getFilename(), path);
         }
         let restrictions;
         if (typeof config.restrictions === 'string') {
@@ -17,7 +17,7 @@ module.exports = new class {
         };
     }
     create(context) {
-        const fileDirname = path_1.dirname(context.getFilename());
+        const fileDirname = (0, path_1.dirname)(context.getFilename());
         const parts = fileDirname.split(/\\|\//);
         const ruleArgs = context.options[0];
         let config;
@@ -39,11 +39,11 @@ module.exports = new class {
             // nothing
             return {};
         }
-        return utils_1.createImportRuleListener((node, path) => {
+        return (0, utils_1.createImportRuleListener)((node, path) => {
            if (path[0] === '.') {
-                path = path_1.join(path_1.dirname(context.getFilename()), path);
+                path = (0, path_1.join)((0, path_1.dirname)(context.getFilename()), path);
            }
-            const parts = path_1.dirname(path).split(/\\|\//);
+            const parts = (0, path_1.dirname)(path).split(/\\|\//);
            for (let i = parts.length - 1; i >= 0; i--) {
                const part = parts[i];
                if (config.allowed.has(part)) {
@@ -20,10 +20,10 @@ module.exports = new class NoNlsInStandaloneEditorRule {
             || /vs(\/|\\)editor(\/|\\)editor.api/.test(fileName)
             || /vs(\/|\\)editor(\/|\\)editor.main/.test(fileName)
             || /vs(\/|\\)editor(\/|\\)editor.worker/.test(fileName)) {
-            return utils_1.createImportRuleListener((node, path) => {
+            return (0, utils_1.createImportRuleListener)((node, path) => {
                // resolve relative paths
                if (path[0] === '.') {
-                    path = path_1.join(context.getFilename(), path);
+                    path = (0, path_1.join)(context.getFilename(), path);
                }
                if (/vs(\/|\\)nls/.test(path)) {
                    context.report({
@@ -21,10 +21,10 @@ module.exports = new class NoNlsInStandaloneEditorRule {
             // the vs/editor folder is allowed to use the standalone editor
             return {};
         }
-        return utils_1.createImportRuleListener((node, path) => {
+        return (0, utils_1.createImportRuleListener)((node, path) => {
            // resolve relative paths
            if (path[0] === '.') {
-                path = path_1.join(context.getFilename(), path);
+                path = (0, path_1.join)(context.getFilename(), path);
            }
            if (/vs(\/|\\)editor(\/|\\)standalone(\/|\\)/.test(path)
                || /vs(\/|\\)editor(\/|\\)common(\/|\\)standalone(\/|\\)/.test(path)
@@ -2,124 +2,144 @@
  * Copyright (c) Microsoft Corporation. All rights reserved.
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/

 // FORKED FROM https://github.com/eslint/eslint/blob/b23ad0d789a909baf8d7c41a35bc53df932eaf30/lib/rules/no-unused-expressions.js
 // and added support for `OptionalCallExpression`, see https://github.com/facebook/create-react-app/issues/8107 and https://github.com/eslint/eslint/issues/12642

 /**
  * @fileoverview Flag expressions in statement position that do not side effect
  * @author Michael Ficarra
  */

 'use strict';
-Object.defineProperty(exports, "__esModule", { value: true });
 //------------------------------------------------------------------------------
 // Rule Definition
 //------------------------------------------------------------------------------

 module.exports = {
     meta: {
         type: 'suggestion',
+
         docs: {
             description: 'disallow unused expressions',
             category: 'Best Practices',
             recommended: false,
             url: 'https://eslint.org/docs/rules/no-unused-expressions'
         },
+
         schema: [
             {
                 type: 'object',
                 properties: {
                     allowShortCircuit: {
                         type: 'boolean',
                         default: false
                     },
                     allowTernary: {
                         type: 'boolean',
                         default: false
                     },
                     allowTaggedTemplates: {
                         type: 'boolean',
                         default: false
                     }
                 },
                 additionalProperties: false
             }
         ]
     },
+
     create(context) {
         const config = context.options[0] || {},
             allowShortCircuit = config.allowShortCircuit || false,
             allowTernary = config.allowTernary || false,
             allowTaggedTemplates = config.allowTaggedTemplates || false;
+
         // eslint-disable-next-line jsdoc/require-description
         /**
-         * @param node any node
-         * @returns whether the given node structurally represents a directive
+         * @param {ASTNode} node any node
+         * @returns {boolean} whether the given node structurally represents a directive
          */
         function looksLikeDirective(node) {
             return node.type === 'ExpressionStatement' &&
                 node.expression.type === 'Literal' && typeof node.expression.value === 'string';
         }
+
         // eslint-disable-next-line jsdoc/require-description
         /**
-         * @param predicate ([a] -> Boolean) the function used to make the determination
-         * @param list the input list
-         * @returns the leading sequence of members in the given list that pass the given predicate
+         * @param {Function} predicate ([a] -> Boolean) the function used to make the determination
+         * @param {a[]} list the input list
+         * @returns {a[]} the leading sequence of members in the given list that pass the given predicate
          */
         function takeWhile(predicate, list) {
             for (let i = 0; i < list.length; ++i) {
                 if (!predicate(list[i])) {
                     return list.slice(0, i);
                 }
             }
             return list.slice();
         }
+
         // eslint-disable-next-line jsdoc/require-description
         /**
-         * @param node a Program or BlockStatement node
-         * @returns the leading sequence of directive nodes in the given node's body
+         * @param {ASTNode} node a Program or BlockStatement node
+         * @returns {ASTNode[]} the leading sequence of directive nodes in the given node's body
          */
         function directives(node) {
             return takeWhile(looksLikeDirective, node.body);
         }
+
         // eslint-disable-next-line jsdoc/require-description
         /**
-         * @param node any node
-         * @param ancestors the given node's ancestors
-         * @returns whether the given node is considered a directive in its current position
+         * @param {ASTNode} node any node
+         * @param {ASTNode[]} ancestors the given node's ancestors
+         * @returns {boolean} whether the given node is considered a directive in its current position
          */
         function isDirective(node, ancestors) {
-            const parent = ancestors[ancestors.length - 1], grandparent = ancestors[ancestors.length - 2];
+            const parent = ancestors[ancestors.length - 1],
+                grandparent = ancestors[ancestors.length - 2];
+
             return (parent.type === 'Program' || parent.type === 'BlockStatement' &&
                 (/Function/u.test(grandparent.type))) &&
                 directives(parent).indexOf(node) >= 0;
         }

         /**
          * Determines whether or not a given node is a valid expression. Recurses on short circuit eval and ternary nodes if enabled by flags.
-         * @param node any node
-         * @returns whether the given node is a valid expression
+         * @param {ASTNode} node any node
+         * @returns {boolean} whether the given node is a valid expression
          */
         function isValidExpression(node) {
             if (allowTernary) {
+
                 // Recursive check for ternary and logical expressions
                 if (node.type === 'ConditionalExpression') {
                     return isValidExpression(node.consequent) && isValidExpression(node.alternate);
                 }
             }
+
             if (allowShortCircuit) {
                 if (node.type === 'LogicalExpression') {
                     return isValidExpression(node.right);
                 }
             }
+
             if (allowTaggedTemplates && node.type === 'TaggedTemplateExpression') {
                 return true;
             }
+
             return /^(?:Assignment|OptionalCall|Call|New|Update|Yield|Await)Expression$/u.test(node.type) ||
                 (node.type === 'UnaryExpression' && ['delete', 'void'].indexOf(node.operator) >= 0);
         }
+
         return {
             ExpressionStatement(node) {
                 if (!isValidExpression(node.expression) && !isDirective(node, context.getAncestors())) {
-                    context.report({ node: node, message: 'Expected an assignment or function call and instead saw an expression.' });
+                    context.report({ node, message: 'Expected an assignment or function call and instead saw an expression.' });
                 }
             }
         };
+
     }
 };
@@ -15,7 +15,7 @@ module.exports = new (_a = class TranslationRemind {
         };
     }
     create(context) {
-        return utils_1.createImportRuleListener((node, path) => this._checkImport(context, node, path));
+        return (0, utils_1.createImportRuleListener)((node, path) => this._checkImport(context, node, path));
     }
     _checkImport(context, node, path) {
         if (path !== TranslationRemind.NLS_MODULE) {
@@ -31,7 +31,7 @@ module.exports = new (_a = class TranslationRemind {
         let resourceDefined = false;
         let json;
         try {
-            json = fs_1.readFileSync('./build/lib/i18n.resources.json', 'utf8');
+            json = (0, fs_1.readFileSync)('./build/lib/i18n.resources.json', 'utf8');
         }
         catch (e) {
             console.error('[translation-remind rule]: File with resources to pull from Transifex was not found. Aborting translation resource check for newly defined workbench part/service.');
@@ -1,45 +0,0 @@
-"use strict";
-/*---------------------------------------------------------------------------------------------
- * Copyright (c) Microsoft Corporation. All rights reserved.
- * Licensed under the MIT License. See License.txt in the project root for license information.
- *--------------------------------------------------------------------------------------------*/
-module.exports = new class ApiVsCodeInComments {
-    constructor() {
-        this.meta = {
-            messages: {
-                comment: `Don't use the term 'vs code' in comments`
-            }
-        };
-    }
-    create(context) {
-        const sourceCode = context.getSourceCode();
-        return {
-            ['Program']: (_node) => {
-                for (const comment of sourceCode.getAllComments()) {
-                    if (comment.type !== 'Block') {
-                        continue;
-                    }
-                    if (!comment.range) {
-                        continue;
-                    }
-                    const startIndex = comment.range[0] + '/*'.length;
-                    const re = /vs code/ig;
-                    let match;
-                    while ((match = re.exec(comment.value))) {
-                        // Allow using 'VS Code' in quotes
-                        if (comment.value[match.index - 1] === `'` && comment.value[match.index + match[0].length] === `'`) {
-                            continue;
-                        }
-                        // Types for eslint seem incorrect
-                        const start = sourceCode.getLocFromIndex(startIndex + match.index);
-                        const end = sourceCode.getLocFromIndex(startIndex + match.index + match[0].length);
-                        context.report({
-                            messageId: 'comment',
-                            loc: { start, end }
-                        });
-                    }
-                }
-            }
-        };
-    }
-};
@@ -1,53 +0,0 @@
-/*---------------------------------------------------------------------------------------------
- * Copyright (c) Microsoft Corporation. All rights reserved.
- * Licensed under the Source EULA. See License.txt in the project root for license information.
- *--------------------------------------------------------------------------------------------*/
-
-import * as eslint from 'eslint';
-import type * as estree from 'estree';
-
-export = new class ApiVsCodeInComments implements eslint.Rule.RuleModule {
-
-    readonly meta: eslint.Rule.RuleMetaData = {
-        messages: {
-            comment: `Don't use the term 'vs code' in comments`
-        }
-    };
-
-    create(context: eslint.Rule.RuleContext): eslint.Rule.RuleListener {
-
-        const sourceCode = context.getSourceCode();
-
-        return {
-            ['Program']: (_node: any) => {
-
-                for (const comment of sourceCode.getAllComments()) {
-                    if (comment.type !== 'Block') {
-                        continue;
-                    }
-                    if (!comment.range) {
-                        continue;
-                    }
-
-                    const startIndex = comment.range[0] + '/*'.length;
-                    const re = /vs code/ig;
-                    let match: RegExpExecArray | null;
-                    while ((match = re.exec(comment.value))) {
-                        // Allow using 'VS Code' in quotes
-                        if (comment.value[match.index - 1] === `'` && comment.value[match.index + match[0].length] === `'`) {
-                            continue;
-                        }
-
-                        // Types for eslint seem incorrect
-                        const start = sourceCode.getLocFromIndex(startIndex + match.index) as any as estree.Position;
-                        const end = sourceCode.getLocFromIndex(startIndex + match.index + match[0].length) as any as estree.Position;
-                        context.report({
-                            messageId: 'comment',
-                            loc: { start, end }
-                        });
-                    }
-                }
-            }
-        };
-    }
-};
@@ -4,10 +4,9 @@
  * Licensed under the Source EULA. See License.txt in the project root for license information.
  *--------------------------------------------------------------------------------------------*/
 Object.defineProperty(exports, "__esModule", { value: true });
-exports.buildExtensionMedia = exports.webpackExtensions = exports.translatePackageJSON = exports.packageRebuildExtensionsStream = exports.cleanRebuildExtensions = exports.packageExternalExtensionsStream = exports.scanBuiltinExtensions = exports.packageMarketplaceExtensionsStream = exports.packageLocalExtensionsStream = exports.vscodeExternalExtensions = exports.fromMarketplace = exports.fromLocalNormal = exports.fromLocal = void 0;
+exports.translatePackageJSON = exports.packageRebuildExtensionsStream = exports.cleanRebuildExtensions = exports.packageExternalExtensionsStream = exports.scanBuiltinExtensions = exports.packageMarketplaceExtensionsStream = exports.packageLocalExtensionsStream = exports.fromMarketplace = exports.fromLocalNormal = exports.fromLocal = void 0;
 const es = require("event-stream");
 const fs = require("fs");
-const cp = require("child_process");
 const glob = require("glob");
 const gulp = require("gulp");
 const path = require("path");
@@ -24,7 +23,7 @@ const jsoncParser = require("jsonc-parser");
 const util = require('./util');
 const root = path.dirname(path.dirname(__dirname));
 const commit = util.getVersion(root);
-const sourceMappingURLBase = `https://sqlopsbuilds.blob.core.windows.net/sourcemaps/${commit}`; // {{SQL CARBON EDIT}}
+const sourceMappingURLBase = `https://sqlopsbuilds.blob.core.windows.net/sourcemaps/${commit}`;
 function minifyExtensionResources(input) {
     const jsonFilter = filter(['**/*.json', '**/*.code-snippets'], { restore: true });
     return input
@@ -145,7 +144,7 @@ function fromLocalWebpack(extensionPath, webpackConfigFileName) {
         console.error(packagedDependencies);
         result.emit('error', err);
     });
-    return result.pipe(stats_1.createStatsStream(path.basename(extensionPath)));
+    return result.pipe((0, stats_1.createStatsStream)(path.basename(extensionPath)));
 }
 function fromLocalNormal(extensionPath) {
     const result = es.through();
@@ -163,7 +162,7 @@ function fromLocalNormal(extensionPath) {
         es.readArray(files).pipe(result);
     })
         .catch(err => result.emit('error', err));
-    return result.pipe(stats_1.createStatsStream(path.basename(extensionPath)));
+    return result.pipe((0, stats_1.createStatsStream)(path.basename(extensionPath)));
 }
 exports.fromLocalNormal = fromLocalNormal;
 const baseHeaders = {
@@ -175,7 +174,7 @@ function fromMarketplace(extensionName, version, metadata) {
     const remote = require('gulp-remote-retry-src');
     const json = require('gulp-json-editor');
     const [, name] = extensionName.split('.');
-    const url = `https://sqlopsextensions.blob.core.windows.net/extensions/${name}/${name}-${version}.vsix`; // {{SQL CARBON EDIT}}
+    const url = `https://sqlopsextensions.blob.core.windows.net/extensions/${name}/${name}-${version}.vsix`;
     fancyLog('Downloading extension:', ansiColors.yellow(`${extensionName}@${version}`), '...');
     const options = {
         base: url,
@@ -216,6 +215,7 @@ const externalExtensions = [
     'arc',
     'asde-deployment',
     'azcli',
+    'azdata',
     'azurehybridtoolkit',
     'azuremonitor',
     'cms',
@@ -232,12 +232,6 @@ const externalExtensions = [
     'sql-database-projects',
     'sql-migration'
 ];
-/**
- * Extensions that are built into ADS but should be packaged externally as well for VS Code.
- */
-exports.vscodeExternalExtensions = [
-    'data-workspace'
-];
 // extensions that require a rebuild since they have native parts
 const rebuildExtensions = [
     'big-data-cluster',
@@ -347,7 +341,6 @@ function scanBuiltinExtensions(extensionsRoot, exclude = []) {
     }
 }
 exports.scanBuiltinExtensions = scanBuiltinExtensions;
-// {{SQL CARBON EDIT}} start
 function packageExternalExtensionsStream() {
     const extenalExtensionDescriptions = glob.sync('extensions/*/package.json')
         .map(manifestPath => {
@@ -355,7 +348,7 @@ function packageExternalExtensionsStream() {
         const extensionName = path.basename(extensionPath);
         return { name: extensionName, path: extensionPath };
     })
-        .filter(({ name }) => externalExtensions.indexOf(name) >= 0 || exports.vscodeExternalExtensions.indexOf(name) >= 0);
+        .filter(({ name }) => externalExtensions.indexOf(name) >= 0);
     const builtExtensions = extenalExtensionDescriptions.map(extension => {
         return fromLocal(extension.path, false)
             .pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
@@ -363,6 +356,7 @@ function packageExternalExtensionsStream() {
     return es.merge(builtExtensions);
 }
 exports.packageExternalExtensionsStream = packageExternalExtensionsStream;
+// {{SQL CARBON EDIT}} start
 function cleanRebuildExtensions(root) {
     return Promise.all(rebuildExtensions.map(async (e) => {
         await util2.rimraf(path.join(root, e))();
@@ -409,132 +403,3 @@ function translatePackageJSON(packageJSON, packageNLSPath) {
|
|||||||
return packageJSON;
|
return packageJSON;
|
||||||
}
|
}
|
||||||
exports.translatePackageJSON = translatePackageJSON;
|
exports.translatePackageJSON = translatePackageJSON;
|
||||||
const extensionsPath = path.join(root, 'extensions');
|
|
||||||
// Additional projects to webpack. These typically build code for webviews
|
|
||||||
const webpackMediaConfigFiles = [
|
|
||||||
'markdown-language-features/webpack.config.js',
|
|
||||||
'simple-browser/webpack.config.js',
|
|
||||||
];
|
|
||||||
// Additional projects to run esbuild on. These typically build code for webviews
|
|
||||||
const esbuildMediaScripts = [
|
|
||||||
'markdown-language-features/esbuild.js',
|
|
||||||
'markdown-math/esbuild.js',
|
|
||||||
];
|
|
||||||
async function webpackExtensions(taskName, isWatch, webpackConfigLocations) {
|
|
||||||
const webpack = require('webpack');
|
|
||||||
const webpackConfigs = [];
|
|
||||||
for (const { configPath, outputRoot } of webpackConfigLocations) {
|
|
||||||
const configOrFnOrArray = require(configPath);
|
|
||||||
function addConfig(configOrFn) {
|
|
||||||
let config;
|
|
||||||
if (typeof configOrFn === 'function') {
|
|
||||||
config = configOrFn({}, {});
|
|
||||||
webpackConfigs.push(config);
|
|
||||||
}
|
|
||||||
else {
|
|
||||||
config = configOrFn;
|
|
||||||
}
|
|
||||||
if (outputRoot) {
|
|
||||||
config.output.path = path.join(outputRoot, path.relative(path.dirname(configPath), config.output.path));
|
|
||||||
}
|
|
||||||
webpackConfigs.push(configOrFn);
|
|
||||||
}
|
|
||||||
addConfig(configOrFnOrArray);
|
|
||||||
}
|
|
||||||
function reporter(fullStats) {
|
|
||||||
if (Array.isArray(fullStats.children)) {
|
|
||||||
for (const stats of fullStats.children) {
|
|
||||||
const outputPath = stats.outputPath;
|
|
||||||
if (outputPath) {
|
|
||||||
const relativePath = path.relative(extensionsPath, outputPath).replace(/\\/g, '/');
|
|
||||||
const match = relativePath.match(/[^\/]+(\/server|\/client)?/);
|
|
||||||
fancyLog(`Finished ${ansiColors.green(taskName)} ${ansiColors.cyan(match[0])} with ${stats.errors.length} errors.`);
|
|
||||||
}
|
|
||||||
if (Array.isArray(stats.errors)) {
|
|
||||||
stats.errors.forEach((error) => {
|
|
||||||
fancyLog.error(error);
|
|
||||||
});
|
|
||||||
}
|
|
||||||
if (Array.isArray(stats.warnings)) {
|
|
||||||
stats.warnings.forEach((warning) => {
|
|
||||||
fancyLog.warn(warning);
|
|
||||||
});
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
return new Promise((resolve, reject) => {
|
|
||||||
if (isWatch) {
|
|
||||||
webpack(webpackConfigs).watch({}, (err, stats) => {
|
|
||||||
if (err) {
|
|
||||||
reject();
|
|
||||||
}
|
|
||||||
else {
|
|
||||||
reporter(stats.toJson());
|
|
||||||
}
|
|
||||||
});
|
|
||||||
}
|
|
||||||
else {
|
|
||||||
webpack(webpackConfigs).run((err, stats) => {
|
|
||||||
if (err) {
|
|
||||||
fancyLog.error(err);
|
|
||||||
reject();
|
|
||||||
}
|
|
||||||
else {
|
|
||||||
reporter(stats.toJson());
|
|
||||||
resolve();
|
|
||||||
}
|
|
||||||
});
|
|
||||||
}
|
|
||||||
});
|
|
||||||
}
|
|
||||||
exports.webpackExtensions = webpackExtensions;
|
|
||||||
async function esbuildExtensions(taskName, isWatch, scripts) {
|
|
||||||
function reporter(stdError, script) {
|
|
||||||
const matches = (stdError || '').match(/\> (.+): error: (.+)?/g);
|
|
||||||
fancyLog(`Finished ${ansiColors.green(taskName)} ${script} with ${matches ? matches.length : 0} errors.`);
|
|
||||||
for (const match of matches || []) {
|
|
||||||
fancyLog.error(match);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
const tasks = scripts.map(({ script, outputRoot }) => {
|
|
||||||
return new Promise((resolve, reject) => {
|
|
||||||
const args = [script];
|
|
||||||
if (isWatch) {
|
|
||||||
args.push('--watch');
|
|
||||||
}
|
|
||||||
if (outputRoot) {
|
|
||||||
args.push('--outputRoot', outputRoot);
|
|
||||||
}
|
|
||||||
const proc = cp.execFile(process.argv[0], args, {}, (error, _stdout, stderr) => {
|
|
||||||
if (error) {
|
|
||||||
return reject(error);
|
|
||||||
}
|
|
||||||
reporter(stderr, script);
|
|
||||||
if (stderr) {
|
|
||||||
return reject();
|
|
||||||
}
|
|
||||||
return resolve();
|
|
||||||
});
|
|
||||||
proc.stdout.on('data', (data) => {
|
|
||||||
fancyLog(`${ansiColors.green(taskName)}: ${data.toString('utf8')}`);
|
|
||||||
});
|
|
||||||
});
|
|
||||||
});
|
|
||||||
return Promise.all(tasks);
|
|
||||||
}
|
|
||||||
async function buildExtensionMedia(isWatch, outputRoot) {
|
|
||||||
return Promise.all([
|
|
||||||
webpackExtensions('webpacking extension media', isWatch, webpackMediaConfigFiles.map(p => {
|
|
||||||
return {
|
|
||||||
configPath: path.join(extensionsPath, p),
|
|
||||||
outputRoot: outputRoot ? path.join(root, outputRoot, path.dirname(p)) : undefined
|
|
||||||
};
|
|
||||||
})),
|
|
||||||
esbuildExtensions('esbuilding extension media', isWatch, esbuildMediaScripts.map(p => ({
|
|
||||||
script: path.join(extensionsPath, p),
|
|
||||||
outputRoot: outputRoot ? path.join(root, outputRoot, path.dirname(p)) : undefined
|
|
||||||
}))),
|
|
||||||
]);
|
|
||||||
}
|
|
||||||
exports.buildExtensionMedia = buildExtensionMedia;
|
|
||||||
|
|||||||
@@ -5,7 +5,6 @@
 
 import * as es from 'event-stream';
 import * as fs from 'fs';
-import * as cp from 'child_process';
 import * as glob from 'glob';
 import * as gulp from 'gulp';
 import * as path from 'path';
@@ -20,11 +19,10 @@ import * as fancyLog from 'fancy-log';
 import * as ansiColors from 'ansi-colors';
 const buffer = require('gulp-buffer');
 import * as jsoncParser from 'jsonc-parser';
-import webpack = require('webpack');
 const util = require('./util');
 const root = path.dirname(path.dirname(__dirname));
 const commit = util.getVersion(root);
-const sourceMappingURLBase = `https://sqlopsbuilds.blob.core.windows.net/sourcemaps/${commit}`; // {{SQL CARBON EDIT}}
+const sourceMappingURLBase = `https://sqlopsbuilds.blob.core.windows.net/sourcemaps/${commit}`;
 
 function minifyExtensionResources(input: Stream): Stream {
 const jsonFilter = filter(['**/*.json', '**/*.code-snippets'], { restore: true });
@@ -207,7 +205,7 @@ export function fromMarketplace(extensionName: string, version: string, metadata
 const json = require('gulp-json-editor') as typeof import('gulp-json-editor');
 
 const [, name] = extensionName.split('.');
-const url = `https://sqlopsextensions.blob.core.windows.net/extensions/${name}/${name}-${version}.vsix`; // {{SQL CARBON EDIT}}
+const url = `https://sqlopsextensions.blob.core.windows.net/extensions/${name}/${name}-${version}.vsix`;
 
 fancyLog('Downloading extension:', ansiColors.yellow(`${extensionName}@${version}`), '...');
 
@@ -252,6 +250,7 @@ const externalExtensions = [
 'arc',
 'asde-deployment',
 'azcli',
+'azdata',
 'azurehybridtoolkit',
 'azuremonitor',
 'cms',
@@ -269,13 +268,6 @@ const externalExtensions = [
 'sql-migration'
 ];
 
-/**
-* Extensions that are built into ADS but should be packaged externally as well for VS Code.
-*/
-export const vscodeExternalExtensions = [
-'data-workspace'
-];
-
 // extensions that require a rebuild since they have native parts
 const rebuildExtensions = [
 'big-data-cluster',
@@ -426,7 +418,6 @@ export function scanBuiltinExtensions(extensionsRoot: string, exclude: string[]
 }
 }
 
-// {{SQL CARBON EDIT}} start
 export function packageExternalExtensionsStream(): NodeJS.ReadWriteStream {
 const extenalExtensionDescriptions = (<string[]>glob.sync('extensions/*/package.json'))
 .map(manifestPath => {
@@ -434,7 +425,7 @@ export function packageExternalExtensionsStream(): NodeJS.ReadWriteStream {
 const extensionName = path.basename(extensionPath);
 return { name: extensionName, path: extensionPath };
 })
-.filter(({ name }) => externalExtensions.indexOf(name) >= 0 || vscodeExternalExtensions.indexOf(name) >= 0);
+.filter(({ name }) => externalExtensions.indexOf(name) >= 0);
 
 const builtExtensions = extenalExtensionDescriptions.map(extension => {
 return fromLocal(extension.path, false)
@@ -444,6 +435,7 @@ export function packageExternalExtensionsStream(): NodeJS.ReadWriteStream {
 return es.merge(builtExtensions);
 }
 
+// {{SQL CARBON EDIT}} start
 export function cleanRebuildExtensions(root: string): Promise<void> {
 return Promise.all(rebuildExtensions.map(async e => {
 await util2.rimraf(path.join(root, e))();
@@ -489,138 +481,3 @@ export function translatePackageJSON(packageJSON: string, packageNLSPath: string
 translate(packageJSON);
 return packageJSON;
 }
-
-const extensionsPath = path.join(root, 'extensions');
-
-// Additional projects to webpack. These typically build code for webviews
-const webpackMediaConfigFiles = [
-'markdown-language-features/webpack.config.js',
-'simple-browser/webpack.config.js',
-];
-
-// Additional projects to run esbuild on. These typically build code for webviews
-const esbuildMediaScripts = [
-'markdown-language-features/esbuild.js',
-'markdown-math/esbuild.js',
-];
-
-export async function webpackExtensions(taskName: string, isWatch: boolean, webpackConfigLocations: { configPath: string, outputRoot?: string }[]) {
-const webpack = require('webpack') as typeof import('webpack');
-
-const webpackConfigs: webpack.Configuration[] = [];
-
-for (const { configPath, outputRoot } of webpackConfigLocations) {
-const configOrFnOrArray = require(configPath);
-function addConfig(configOrFn: webpack.Configuration | Function) {
-let config;
-if (typeof configOrFn === 'function') {
-config = configOrFn({}, {});
-webpackConfigs.push(config);
-} else {
-config = configOrFn;
-}
-
-if (outputRoot) {
-config.output.path = path.join(outputRoot, path.relative(path.dirname(configPath), config.output.path));
-}
-
-webpackConfigs.push(configOrFn);
-}
-addConfig(configOrFnOrArray);
-}
-function reporter(fullStats: any) {
-if (Array.isArray(fullStats.children)) {
-for (const stats of fullStats.children) {
-const outputPath = stats.outputPath;
-if (outputPath) {
-const relativePath = path.relative(extensionsPath, outputPath).replace(/\\/g, '/');
-const match = relativePath.match(/[^\/]+(\/server|\/client)?/);
-fancyLog(`Finished ${ansiColors.green(taskName)} ${ansiColors.cyan(match![0])} with ${stats.errors.length} errors.`);
-}
-if (Array.isArray(stats.errors)) {
-stats.errors.forEach((error: any) => {
-fancyLog.error(error);
-});
-}
-if (Array.isArray(stats.warnings)) {
-stats.warnings.forEach((warning: any) => {
-fancyLog.warn(warning);
-});
-}
-}
-}
-}
-return new Promise<void>((resolve, reject) => {
-if (isWatch) {
-webpack(webpackConfigs).watch({}, (err, stats) => {
-if (err) {
-reject();
-} else {
-reporter(stats.toJson());
-}
-});
-} else {
-webpack(webpackConfigs).run((err, stats) => {
-if (err) {
-fancyLog.error(err);
-reject();
-} else {
-reporter(stats.toJson());
-resolve();
-}
-});
-}
-});
-}
-
-async function esbuildExtensions(taskName: string, isWatch: boolean, scripts: { script: string, outputRoot?: string }[]) {
-function reporter(stdError: string, script: string) {
-const matches = (stdError || '').match(/\> (.+): error: (.+)?/g);
-fancyLog(`Finished ${ansiColors.green(taskName)} ${script} with ${matches ? matches.length : 0} errors.`);
-for (const match of matches || []) {
-fancyLog.error(match);
-}
-}
-
-const tasks = scripts.map(({ script, outputRoot }) => {
-return new Promise<void>((resolve, reject) => {
-const args = [script];
-if (isWatch) {
-args.push('--watch');
-}
-if (outputRoot) {
-args.push('--outputRoot', outputRoot);
-}
-const proc = cp.execFile(process.argv[0], args, {}, (error, _stdout, stderr) => {
-if (error) {
-return reject(error);
-}
-reporter(stderr, script);
-if (stderr) {
-return reject();
-}
-return resolve();
-});
-
-proc.stdout!.on('data', (data) => {
-fancyLog(`${ansiColors.green(taskName)}: ${data.toString('utf8')}`);
-});
-});
-});
-return Promise.all(tasks);
-}
-
-export async function buildExtensionMedia(isWatch: boolean, outputRoot?: string) {
-return Promise.all([
-webpackExtensions('webpacking extension media', isWatch, webpackMediaConfigFiles.map(p => {
-return {
-configPath: path.join(extensionsPath, p),
-outputRoot: outputRoot ? path.join(root, outputRoot, path.dirname(p)) : undefined
-};
-})),
-esbuildExtensions('esbuilding extension media', isWatch, esbuildMediaScripts.map(p => ({
-script: path.join(extensionsPath, p),
-outputRoot: outputRoot ? path.join(root, outputRoot, path.dirname(p)) : undefined
-}))),
-]);
-}
@@ -4,13 +4,14 @@
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
 Object.defineProperty(exports, "__esModule", { value: true });
-exports.prepareIslFiles = exports.prepareI18nPackFiles = exports.i18nPackVersion = exports.createI18nFile = exports.prepareI18nFiles = exports.pullSetupXlfFiles = exports.findObsoleteResources = exports.pushXlfFiles = exports.createXlfFilesForIsl = exports.createXlfFilesForExtensions = exports.createXlfFilesForCoreBundle = exports.getResource = exports.processNlsFiles = exports.Limiter = exports.XLF = exports.Line = exports.externalExtensionsWithTranslations = exports.extraLanguages = exports.defaultLanguages = void 0;
+exports.prepareIslFiles = exports.prepareI18nPackFiles = exports.pullI18nPackFiles = exports.i18nPackVersion = exports.createI18nFile = exports.prepareI18nFiles = exports.pullSetupXlfFiles = exports.pullCoreAndExtensionsXlfFiles = exports.findObsoleteResources = exports.pushXlfFiles = exports.createXlfFilesForIsl = exports.createXlfFilesForExtensions = exports.createXlfFilesForCoreBundle = exports.getResource = exports.processNlsFiles = exports.Limiter = exports.XLF = exports.Line = exports.externalExtensionsWithTranslations = exports.extraLanguages = exports.defaultLanguages = void 0;
 const path = require("path");
 const fs = require("fs");
 const event_stream_1 = require("event-stream");
 const File = require("vinyl");
 const Is = require("is");
 const xml2js = require("xml2js");
+const glob = require("glob");
 const https = require("https");
 const gulp = require("gulp");
 const fancyLog = require("fancy-log");
@@ -109,16 +110,12 @@ class XLF {
 }
 toString() {
 this.appendHeader();
-const files = Object.keys(this.files).sort();
-for (const file of files) {
+for (let file in this.files) {
 this.appendNewLine(`<file original="${file}" source-language="en" datatype="plaintext"><body>`, 2);
-const items = this.files[file].sort((a, b) => {
-return a.id < b.id ? -1 : a.id > b.id ? 1 : 0;
-});
-for (const item of items) {
+for (let item of this.files[file]) {
 this.addStringItem(file, item);
 }
-this.appendNewLine('</body></file>');
+this.appendNewLine('</body></file>', 2);
 }
 this.appendFooter();
 return this.buffer.join('\r\n');
@@ -466,7 +463,7 @@ function processCoreBundleFormat(fileHeader, languages, json, emitter) {
 });
 }
 function processNlsFiles(opts) {
-return event_stream_1.through(function (file) {
+return (0, event_stream_1.through)(function (file) {
 let fileName = path.basename(file.path);
 if (fileName === 'nls.metadata.json') {
 let json = null;
@@ -524,17 +521,13 @@ function getResource(sourceFile) {
 }
 exports.getResource = getResource;
 function createXlfFilesForCoreBundle() {
-return event_stream_1.through(function (file) {
+return (0, event_stream_1.through)(function (file) {
 const basename = path.basename(file.path);
 if (basename === 'nls.metadata.json') {
 if (file.isBuffer()) {
 const xlfs = Object.create(null);
 const json = JSON.parse(file.contents.toString('utf8'));
-// {{SQL CARBON EDIT}} - Must sort the keys for easier translation.
-let sortedKeys = Object.keys(json.keys).sort();
-for (let i = 0; i < sortedKeys.length; i++) {
-let coreModule = sortedKeys[i];
-// {{SQL CARBON EDIT}} - End
+for (let coreModule in json.keys) {
 const projectResource = getResource(coreModule);
 const resource = projectResource.name;
 const project = projectResource.project;
@@ -579,7 +572,7 @@ function createXlfFilesForExtensions() {
 let counter = 0;
 let folderStreamEnded = false;
 let folderStreamEndEmitted = false;
-return event_stream_1.through(function (extensionFolder) {
+return (0, event_stream_1.through)(function (extensionFolder) {
 const folderStream = this;
 const stat = fs.statSync(extensionFolder.path);
 if (!stat.isDirectory()) {
@@ -597,7 +590,7 @@ function createXlfFilesForExtensions() {
 }
 return _xlf;
 }
-gulp.src([`.build/extensions/${extensionName}/package.nls.json`, `.build/extensions/${extensionName}/**/nls.metadata.json`], { allowEmpty: true }).pipe(event_stream_1.through(function (file) {
+gulp.src([`.build/extensions/${extensionName}/package.nls.json`, `.build/extensions/${extensionName}/**/nls.metadata.json`], { allowEmpty: true }).pipe((0, event_stream_1.through)(function (file) {
 if (file.isBuffer()) {
 const buffer = file.contents;
 const basename = path.basename(file.path);
@@ -656,14 +649,15 @@ function createXlfFilesForExtensions() {
 }
 exports.createXlfFilesForExtensions = createXlfFilesForExtensions;
 function createXlfFilesForIsl() {
-return event_stream_1.through(function (file) {
+return (0, event_stream_1.through)(function (file) {
 let projectName, resourceFile;
-if (path.basename(file.path) === 'messages.en.isl') {
+if (path.basename(file.path) === 'Default.isl') {
 projectName = setupProject;
-resourceFile = 'messages.xlf';
+resourceFile = 'setup_default.xlf';
 }
 else {
-throw new Error(`Unknown input file ${file.path}`);
+projectName = workbenchProject;
+resourceFile = 'setup_messages.xlf';
 }
 let xlf = new XLF(projectName), keys = [], messages = [];
 let model = new TextModel(file.contents.toString());
@@ -709,7 +703,7 @@ exports.createXlfFilesForIsl = createXlfFilesForIsl;
 function pushXlfFiles(apiHostname, username, password) {
 let tryGetPromises = [];
 let updateCreatePromises = [];
-return event_stream_1.through(function (file) {
+return (0, event_stream_1.through)(function (file) {
 const project = path.dirname(file.relative);
 const fileName = path.basename(file.path);
 const slug = fileName.substr(0, fileName.length - '.xlf'.length);
@@ -771,7 +765,7 @@ function getAllResources(project, apiHostname, username, password) {
 function findObsoleteResources(apiHostname, username, password) {
 let resourcesByProject = Object.create(null);
 resourcesByProject[extensionsProject] = [].concat(exports.externalExtensionsWithTranslations); // clone
-return event_stream_1.through(function (file) {
+return (0, event_stream_1.through)(function (file) {
 const project = path.dirname(file.relative);
 const fileName = path.basename(file.path);
 const slug = fileName.substr(0, fileName.length - '.xlf'.length);
@@ -911,6 +905,31 @@ function updateResource(project, slug, xlfFile, apiHostname, credentials) {
 request.end();
 });
 }
+// cache resources
+let _coreAndExtensionResources;
+function pullCoreAndExtensionsXlfFiles(apiHostname, username, password, language, externalExtensions) {
+if (!_coreAndExtensionResources) {
+_coreAndExtensionResources = [];
+// editor and workbench
+const json = JSON.parse(fs.readFileSync('./build/lib/i18n.resources.json', 'utf8'));
+_coreAndExtensionResources.push(...json.editor);
+_coreAndExtensionResources.push(...json.workbench);
+// extensions
+let extensionsToLocalize = Object.create(null);
+glob.sync('.build/extensions/**/*.nls.json').forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
+glob.sync('.build/extensions/*/node_modules/vscode-nls').forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
+Object.keys(extensionsToLocalize).forEach(extension => {
+_coreAndExtensionResources.push({ name: extension, project: extensionsProject });
+});
+if (externalExtensions) {
+for (let resourceName in externalExtensions) {
+_coreAndExtensionResources.push({ name: resourceName, project: extensionsProject });
+}
+}
+}
+return pullXlfFiles(apiHostname, username, password, language, _coreAndExtensionResources);
+}
+exports.pullCoreAndExtensionsXlfFiles = pullCoreAndExtensionsXlfFiles;
 function pullSetupXlfFiles(apiHostname, username, password, language, includeDefault) {
 let setupResources = [{ name: 'setup_messages', project: workbenchProject }];
 if (includeDefault) {
@@ -923,7 +942,7 @@ function pullXlfFiles(apiHostname, username, password, language, resources) {
 const credentials = `${username}:${password}`;
 let expectedTranslationsCount = resources.length;
 let translationsRetrieved = 0, called = false;
-return event_stream_1.readable(function (_count, callback) {
+return (0, event_stream_1.readable)(function (_count, callback) {
 // Mark end of stream when all resources were retrieved
 if (translationsRetrieved === expectedTranslationsCount) {
 return this.emit('end');
@@ -981,7 +1000,7 @@ function retrieveResource(language, resource, apiHostname, credentials) {
 }
 function prepareI18nFiles() {
 let parsePromises = [];
-return event_stream_1.through(function (xlf) {
+return (0, event_stream_1.through)(function (xlf) {
 let stream = this;
 let parsePromise = XLF.parse(xlf.contents.toString());
 parsePromises.push(parsePromise);
@@ -1021,16 +1040,20 @@ function createI18nFile(originalFilePath, messages) {
 }
 exports.createI18nFile = createI18nFile;
 exports.i18nPackVersion = '1.0.0'; // {{SQL CARBON EDIT}} Needed in locfunc.
+function pullI18nPackFiles(apiHostname, username, password, language, resultingTranslationPaths) {
+return pullCoreAndExtensionsXlfFiles(apiHostname, username, password, language, exports.externalExtensionsWithTranslations)
+.pipe(prepareI18nPackFiles(exports.externalExtensionsWithTranslations, resultingTranslationPaths, language.id === 'ps'));
+}
+exports.pullI18nPackFiles = pullI18nPackFiles;
 function prepareI18nPackFiles(externalExtensions, resultingTranslationPaths, pseudo = false) {
 let parsePromises = [];
 let mainPack = { version: exports.i18nPackVersion, contents: {} };
 let extensionsPacks = {};
 let errors = [];
-return event_stream_1.through(function (xlf) {
+return (0, event_stream_1.through)(function (xlf) {
-let project = path.basename(path.dirname(path.dirname(xlf.relative)));
+let project = path.basename(path.dirname(xlf.relative));
 let resource = path.basename(xlf.relative, '.xlf');
 let contents = xlf.contents.toString();
-log(`Found ${project}: ${resource}`);
 let parsePromise = pseudo ? XLF.parsePseudo(contents) : XLF.parse(contents);
 parsePromises.push(parsePromise);
 parsePromise.then(resolvedFiles => {
@@ -1088,12 +1111,15 @@ function prepareI18nPackFiles(externalExtensions, resultingTranslationPaths, pse
 exports.prepareI18nPackFiles = prepareI18nPackFiles;
 function prepareIslFiles(language, innoSetupConfig) {
|
||||||
let parsePromises = [];
|
let parsePromises = [];
|
||||||
return event_stream_1.through(function (xlf) {
|
return (0, event_stream_1.through)(function (xlf) {
|
||||||
let stream = this;
|
let stream = this;
|
||||||
let parsePromise = XLF.parse(xlf.contents.toString());
|
let parsePromise = XLF.parse(xlf.contents.toString());
|
||||||
parsePromises.push(parsePromise);
|
parsePromises.push(parsePromise);
|
||||||
parsePromise.then(resolvedFiles => {
|
parsePromise.then(resolvedFiles => {
|
||||||
resolvedFiles.forEach(file => {
|
resolvedFiles.forEach(file => {
|
||||||
|
if (path.basename(file.originalFilePath) === 'Default' && !innoSetupConfig.defaultInfo) {
|
||||||
|
return;
|
||||||
|
}
|
||||||
let translatedFile = createIslFile(file.originalFilePath, file.messages, language, innoSetupConfig);
|
let translatedFile = createIslFile(file.originalFilePath, file.messages, language, innoSetupConfig);
|
||||||
stream.queue(translatedFile);
|
stream.queue(translatedFile);
|
||||||
});
|
});
|
||||||
@@ -1129,9 +1155,20 @@ function createIslFile(originalFilePath, messages, language, innoSetup) {
|
|||||||
let key = sections[0];
|
let key = sections[0];
|
||||||
let translated = line;
|
let translated = line;
|
||||||
if (key) {
|
if (key) {
|
||||||
let translatedMessage = messages[key];
|
if (key === 'LanguageName') {
|
||||||
if (translatedMessage) {
|
translated = `${key}=${innoSetup.defaultInfo.name}`;
|
||||||
translated = `${key}=${translatedMessage}`;
|
}
|
||||||
|
else if (key === 'LanguageID') {
|
||||||
|
translated = `${key}=${innoSetup.defaultInfo.id}`;
|
||||||
|
}
|
||||||
|
else if (key === 'LanguageCodePage') {
|
||||||
|
translated = `${key}=${innoSetup.codePage.substr(2)}`;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
let translatedMessage = messages[key];
|
||||||
|
if (translatedMessage) {
|
||||||
|
translated = `${key}=${translatedMessage}`;
|
||||||
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
content.push(translated);
|
content.push(translated);
|
||||||
|
|||||||
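Many of the right-hand changes in the hunks above only rewrite calls such as `event_stream_1.through(...)` into `(0, event_stream_1.through)(...)`. This is the indirect-call form newer TypeScript compilers emit for imported bindings in CommonJS output; a minimal standalone sketch of the semantic difference (the `mod` object here is hypothetical, not part of the build code):

```javascript
// Evaluating `(0, mod.fn)` first yields a plain function reference, so the
// subsequent call no longer passes `mod` as `this`.
const mod = {
    whoAmI: function () {
        return this === mod ? 'bound to mod' : 'unbound';
    }
};

const direct = mod.whoAmI();        // method call: `this` is `mod`
const indirect = (0, mod.whoAmI)(); // indirect call: `this` is not `mod`

console.log(direct, indirect);
```

For the build scripts the two forms behave identically, which is why the diff shows them only as mechanical context churn around the real changes.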
@@ -10,6 +10,7 @@ import { through, readable, ThroughStream } from 'event-stream';
 import * as File from 'vinyl';
 import * as Is from 'is';
 import * as xml2js from 'xml2js';
+import * as glob from 'glob';
 import * as https from 'https';
 import * as gulp from 'gulp';
 import * as fancyLog from 'fancy-log';
@@ -30,6 +31,10 @@ export interface Language {
 
 export interface InnoSetup {
 	codePage: string; //code page for encoding (http://www.jrsoftware.org/ishelp/index.php?topic=langoptionssection)
+	defaultInfo?: {
+		name: string; // inno setup language name
+		id: string; // locale identifier (https://msdn.microsoft.com/en-us/library/dd318693.aspx)
+	};
 }
 
 export const defaultLanguages: Language[] = [
@@ -193,17 +198,14 @@ export class XLF {
 	public toString(): string {
 		this.appendHeader();
 
-		const files = Object.keys(this.files).sort();
-		for (const file of files) {
+		for (let file in this.files) {
 			this.appendNewLine(`<file original="${file}" source-language="en" datatype="plaintext"><body>`, 2);
-			const items = this.files[file].sort((a: Item, b: Item) => {
-				return a.id < b.id ? -1 : a.id > b.id ? 1 : 0;
-			});
-			for (const item of items) {
+			for (let item of this.files[file]) {
 				this.addStringItem(file, item);
 			}
-			this.appendNewLine('</body></file>');
+			this.appendNewLine('</body></file>', 2);
 		}
 
 		this.appendFooter();
 		return this.buffer.join('\r\n');
 	}
@@ -650,11 +652,7 @@ export function createXlfFilesForCoreBundle(): ThroughStream {
 		if (file.isBuffer()) {
 			const xlfs: Map<XLF> = Object.create(null);
 			const json: BundledFormat = JSON.parse((file.contents as Buffer).toString('utf8'));
-			// {{SQL CARBON EDIT}} - Must sort the keys for easier translation.
-			let sortedKeys = Object.keys(json.keys).sort();
-			for (let i = 0; i < sortedKeys.length; i++) {
-				let coreModule = sortedKeys[i];
-				// {{SQL CARBON EDIT}} - End
+			for (let coreModule in json.keys) {
 				const projectResource = getResource(coreModule);
 				const resource = projectResource.name;
 				const project = projectResource.project;
@@ -773,11 +771,12 @@ export function createXlfFilesForIsl(): ThroughStream {
 	return through(function (this: ThroughStream, file: File) {
 		let projectName: string,
 			resourceFile: string;
-		if (path.basename(file.path) === 'messages.en.isl') {
+		if (path.basename(file.path) === 'Default.isl') {
 			projectName = setupProject;
-			resourceFile = 'messages.xlf';
+			resourceFile = 'setup_default.xlf';
 		} else {
-			throw new Error(`Unknown input file ${file.path}`);
+			projectName = workbenchProject;
+			resourceFile = 'setup_messages.xlf';
 		}
 
 		let xlf = new XLF(projectName),
@@ -1045,6 +1044,35 @@ function updateResource(project: string, slug: string, xlfFile: File, apiHostnam
 	});
 }
 
+// cache resources
+let _coreAndExtensionResources: Resource[];
+
+export function pullCoreAndExtensionsXlfFiles(apiHostname: string, username: string, password: string, language: Language, externalExtensions?: Map<string>): NodeJS.ReadableStream {
+	if (!_coreAndExtensionResources) {
+		_coreAndExtensionResources = [];
+		// editor and workbench
+		const json = JSON.parse(fs.readFileSync('./build/lib/i18n.resources.json', 'utf8'));
+		_coreAndExtensionResources.push(...json.editor);
+		_coreAndExtensionResources.push(...json.workbench);
+
+		// extensions
+		let extensionsToLocalize = Object.create(null);
+		glob.sync('.build/extensions/**/*.nls.json').forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
+		glob.sync('.build/extensions/*/node_modules/vscode-nls').forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
+
+		Object.keys(extensionsToLocalize).forEach(extension => {
+			_coreAndExtensionResources.push({ name: extension, project: extensionsProject });
+		});
+
+		if (externalExtensions) {
+			for (let resourceName in externalExtensions) {
+				_coreAndExtensionResources.push({ name: resourceName, project: extensionsProject });
+			}
+		}
+	}
+	return pullXlfFiles(apiHostname, username, password, language, _coreAndExtensionResources);
+}
+
 export function pullSetupXlfFiles(apiHostname: string, username: string, password: string, language: Language, includeDefault: boolean): NodeJS.ReadableStream {
 	let setupResources = [{ name: 'setup_messages', project: workbenchProject }];
 	if (includeDefault) {
@@ -1176,16 +1204,20 @@ export interface TranslationPath {
 	resourceName: string;
 }
 
+export function pullI18nPackFiles(apiHostname: string, username: string, password: string, language: Language, resultingTranslationPaths: TranslationPath[]): NodeJS.ReadableStream {
+	return pullCoreAndExtensionsXlfFiles(apiHostname, username, password, language, externalExtensionsWithTranslations)
+		.pipe(prepareI18nPackFiles(externalExtensionsWithTranslations, resultingTranslationPaths, language.id === 'ps'));
+}
+
 export function prepareI18nPackFiles(externalExtensions: Map<string>, resultingTranslationPaths: TranslationPath[], pseudo = false): NodeJS.ReadWriteStream {
 	let parsePromises: Promise<ParsedXLF[]>[] = [];
 	let mainPack: I18nPack = { version: i18nPackVersion, contents: {} };
 	let extensionsPacks: Map<I18nPack> = {};
 	let errors: any[] = [];
 	return through(function (this: ThroughStream, xlf: File) {
-		let project = path.basename(path.dirname(path.dirname(xlf.relative)));
+		let project = path.basename(path.dirname(xlf.relative));
 		let resource = path.basename(xlf.relative, '.xlf');
 		let contents = xlf.contents.toString();
-		log(`Found ${project}: ${resource}`);
 		let parsePromise = pseudo ? XLF.parsePseudo(contents) : XLF.parse(contents);
 		parsePromises.push(parsePromise);
 		parsePromise.then(
@@ -1254,6 +1286,9 @@ export function prepareIslFiles(language: Language, innoSetupConfig: InnoSetup):
 			parsePromise.then(
 				resolvedFiles => {
 					resolvedFiles.forEach(file => {
+						if (path.basename(file.originalFilePath) === 'Default' && !innoSetupConfig.defaultInfo) {
+							return;
+						}
 						let translatedFile = createIslFile(file.originalFilePath, file.messages, language, innoSetupConfig);
 						stream.queue(translatedFile);
 					});
@@ -1288,9 +1323,17 @@ function createIslFile(originalFilePath: string, messages: Map<string>, language
 		let key = sections[0];
 		let translated = line;
 		if (key) {
-			let translatedMessage = messages[key];
-			if (translatedMessage) {
-				translated = `${key}=${translatedMessage}`;
+			if (key === 'LanguageName') {
+				translated = `${key}=${innoSetup.defaultInfo!.name}`;
+			} else if (key === 'LanguageID') {
+				translated = `${key}=${innoSetup.defaultInfo!.id}`;
+			} else if (key === 'LanguageCodePage') {
+				translated = `${key}=${innoSetup.codePage.substr(2)}`;
+			} else {
+				let translatedMessage = messages[key];
+				if (translatedMessage) {
+					translated = `${key}=${translatedMessage}`;
+				}
 			}
 		}
 
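The `createIslFile` change above special-cases three Inno Setup metadata keys (`LanguageName`, `LanguageID`, `LanguageCodePage`), filling them from the `InnoSetup` config instead of the translated messages map. A minimal sketch of that per-line logic, using hypothetical data (the real function also carries over comments and section headers unchanged):

```javascript
// Translate one `key=value` line of an .isl file: metadata keys come from the
// InnoSetup config; everything else is looked up in the messages map.
function translateIslLine(line, messages, innoSetup) {
    const sections = line.split('=');
    const key = sections[0];
    let translated = line;
    if (key) {
        if (key === 'LanguageName') {
            translated = `${key}=${innoSetup.defaultInfo.name}`;
        } else if (key === 'LanguageID') {
            translated = `${key}=${innoSetup.defaultInfo.id}`;
        } else if (key === 'LanguageCodePage') {
            // codePage looks like 'cp1252'; Inno Setup wants just '1252'
            translated = `${key}=${innoSetup.codePage.substr(2)}`;
        } else if (messages[key]) {
            translated = `${key}=${messages[key]}`;
        }
    }
    return translated;
}

// Hypothetical example data, not taken from the repository:
const innoSetup = { codePage: 'cp1252', defaultInfo: { name: 'Deutsch', id: '$0407' } };
const messages = { ButtonNext: '&Weiter' };
console.log(translateIslLine('LanguageCodePage=0', messages, innoSetup)); // LanguageCodePage=1252
console.log(translateIslLine('ButtonNext=&Next', messages, innoSetup));   // ButtonNext=&Weiter
```

This also explains the new `defaultInfo?` field on the `InnoSetup` interface and the guard that skips `Default.isl` when no `defaultInfo` is configured.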
@@ -199,7 +199,7 @@ const RULES = [
     ]
 }
 ];
-const TS_CONFIG_PATH = path_1.join(__dirname, '../../', 'src', 'tsconfig.json');
+const TS_CONFIG_PATH = (0, path_1.join)(__dirname, '../../', 'src', 'tsconfig.json');
 let hasErrors = false;
 function checkFile(program, sourceFile, rule) {
     checkNode(sourceFile);
@@ -250,8 +250,8 @@ function checkFile(program, sourceFile, rule) {
 }
 function createProgram(tsconfigPath) {
     const tsConfig = ts.readConfigFile(tsconfigPath, ts.sys.readFile);
-    const configHostParser = { fileExists: fs_1.existsSync, readDirectory: ts.sys.readDirectory, readFile: file => fs_1.readFileSync(file, 'utf8'), useCaseSensitiveFileNames: process.platform === 'linux' };
-    const tsConfigParsed = ts.parseJsonConfigFileContent(tsConfig.config, configHostParser, path_1.resolve(path_1.dirname(tsconfigPath)), { noEmit: true });
+    const configHostParser = { fileExists: fs_1.existsSync, readDirectory: ts.sys.readDirectory, readFile: file => (0, fs_1.readFileSync)(file, 'utf8'), useCaseSensitiveFileNames: process.platform === 'linux' };
+    const tsConfigParsed = ts.parseJsonConfigFileContent(tsConfig.config, configHostParser, (0, path_1.resolve)((0, path_1.dirname)(tsconfigPath)), { noEmit: true });
     const compilerHost = ts.createCompilerHost(tsConfigParsed.options, true);
     return ts.createProgram(tsConfigParsed.fileNames, tsConfigParsed.options, compilerHost);
 }
@@ -261,7 +261,7 @@ function createProgram(tsconfigPath) {
 const program = createProgram(TS_CONFIG_PATH);
 for (const sourceFile of program.getSourceFiles()) {
     for (const rule of RULES) {
-        if (minimatch_1.match([sourceFile.fileName], rule.target).length > 0) {
+        if ((0, minimatch_1.match)([sourceFile.fileName], rule.target).length > 0) {
            if (!rule.skip) {
                checkFile(program, sourceFile, rule);
            }
@@ -96,7 +96,7 @@ function modifyI18nPackFiles(existingTranslationFolder, resultingTranslationPath
     let mainPack = { version: i18n.i18nPackVersion, contents: {} };
     let extensionsPacks = {};
     let errors = [];
-    return event_stream_1.through(function (xlf) {
+    return (0, event_stream_1.through)(function (xlf) {
         let rawResource = path.basename(xlf.relative, '.xlf');
         let resource = rawResource.substring(0, rawResource.lastIndexOf('.'));
         let contents = xlf.contents.toString();
@@ -238,7 +238,7 @@ function refreshLangpacks() {
     }
     let packageJSON = JSON.parse(fs.readFileSync(path.join(locExtFolder, 'package.json')).toString());
     //processing extension fields, version and folder name must be changed manually.
-    packageJSON['name'] = packageJSON['name'].replace('vscode', textFields.nameText).toLowerCase();
+    packageJSON['name'] = packageJSON['name'].replace('vscode', textFields.nameText);
     packageJSON['displayName'] = packageJSON['displayName'].replace('Visual Studio Code', textFields.displayNameText);
     packageJSON['publisher'] = textFields.publisherText;
     packageJSON['license'] = textFields.licenseText;
@@ -353,11 +353,6 @@ function renameVscodeLangpacks() {
             console.log('vscode pack is not in ADS yet: ' + langId);
             continue;
         }
-        //Delete any erroneous zip files found in vscode folder.
-        let globZipArray = glob.sync(path.join(locVSCODEFolder, '*.zip'));
-        globZipArray.forEach(element => {
-            fs.unlinkSync(element);
-        });
         // Delete extension files in vscode language pack that are not in ADS.
         if (fs.existsSync(translationDataFolder)) {
             let totalExtensions = fs.readdirSync(path.join(translationDataFolder, 'extensions'));
@@ -371,9 +366,9 @@ function renameVscodeLangpacks() {
             }
         }
         //Get list of md files in ADS langpack, to copy to vscode langpack prior to renaming.
-        let globMDArray = glob.sync(path.join(locADSFolder, '*.md'));
+        let globArray = glob.sync(path.join(locADSFolder, '*.md'));
         //Copy files to vscode langpack, then remove the ADS langpack, and finally rename the vscode langpack to match the ADS one.
-        globMDArray.forEach(element => {
+        globArray.forEach(element => {
            fs.copyFileSync(element, path.join(locVSCODEFolder, path.parse(element).base));
        });
        rimraf.sync(locADSFolder);
@@ -256,7 +256,7 @@ export function refreshLangpacks(): Promise<void> {
 	}
 	let packageJSON = JSON.parse(fs.readFileSync(path.join(locExtFolder, 'package.json')).toString());
 	//processing extension fields, version and folder name must be changed manually.
-	packageJSON['name'] = packageJSON['name'].replace('vscode', textFields.nameText).toLowerCase();
+	packageJSON['name'] = packageJSON['name'].replace('vscode', textFields.nameText);
 	packageJSON['displayName'] = packageJSON['displayName'].replace('Visual Studio Code', textFields.displayNameText);
 	packageJSON['publisher'] = textFields.publisherText;
 	packageJSON['license'] = textFields.licenseText;
@@ -375,13 +375,6 @@ export function renameVscodeLangpacks(): Promise<void> {
 			console.log('vscode pack is not in ADS yet: ' + langId);
 			continue;
 		}
-
-		//Delete any erroneous zip files found in vscode folder.
-		let globZipArray = glob.sync(path.join(locVSCODEFolder, '*.zip'));
-		globZipArray.forEach(element => {
-			fs.unlinkSync(element);
-		});
-
 		// Delete extension files in vscode language pack that are not in ADS.
 		if (fs.existsSync(translationDataFolder)) {
 			let totalExtensions = fs.readdirSync(path.join(translationDataFolder, 'extensions'));
@@ -396,10 +389,10 @@ export function renameVscodeLangpacks(): Promise<void> {
 		}
 
 		//Get list of md files in ADS langpack, to copy to vscode langpack prior to renaming.
-		let globMDArray = glob.sync(path.join(locADSFolder, '*.md'));
+		let globArray = glob.sync(path.join(locADSFolder, '*.md'));
 
 		//Copy files to vscode langpack, then remove the ADS langpack, and finally rename the vscode langpack to match the ADS one.
-		globMDArray.forEach(element => {
+		globArray.forEach(element => {
 			fs.copyFileSync(element, path.join(locVSCODEFolder,path.parse(element).base));
 		});
 		rimraf.sync(locADSFolder);
@@ -53,8 +53,8 @@ define([], [${wrap + lines.map(l => indent + l).join(',\n') + wrap}]);`;
  * Returns a stream containing the patched JavaScript and source maps.
  */
 function nls() {
-    const input = event_stream_1.through();
-    const output = input.pipe(event_stream_1.through(function (f) {
+    const input = (0, event_stream_1.through)();
+    const output = input.pipe((0, event_stream_1.through)(function (f) {
         if (!f.sourceMap) {
             return this.emit('error', new Error(`File ${f.relative} does not have sourcemaps.`));
         }
@@ -72,7 +72,7 @@ function nls() {
         }
         _nls.patchFiles(f, typescript).forEach(f => this.emit('data', f));
     }));
-    return event_stream_1.duplex(input, output);
+    return (0, event_stream_1.duplex)(input, output);
 }
 exports.nls = nls;
 function isImportNode(ts, node) {
@@ -98,7 +98,7 @@ function toConcatStream(src, bundledFileHeader, sources, dest, fileContentMapper
     return es.readArray(treatedSources)
         .pipe(useSourcemaps ? util.loadSourcemaps() : es.through())
         .pipe(concat(dest))
-        .pipe(stats_1.createStatsStream(dest));
+        .pipe((0, stats_1.createStatsStream)(dest));
 }
 function toBundleStream(src, bundledFileHeader, bundles, fileContentMapper) {
     return es.merge(bundles.map(function (bundle) {
@@ -155,7 +155,7 @@ function optimizeTask(opts) {
         addComment: true,
         includeContent: true
     }))
-        .pipe(opts.languages && opts.languages.length ? i18n_1.processNlsFiles({
+        .pipe(opts.languages && opts.languages.length ? (0, i18n_1.processNlsFiles)({
         fileHeader: bundledFileHeader,
         languages: opts.languages
     }) : es.through())
@@ -179,7 +179,7 @@ function minifyTask(src, sourceMapBaseUrl) {
         sourcemap: 'external',
         outdir: '.',
         platform: 'node',
-        target: ['node14.16'],
+        target: ['node12.18'],
         write: false
     }).then(res => {
         const jsFile = res.outputFiles.find(f => /\.js$/.test(f.path));
@@ -256,7 +256,7 @@ export function minifyTask(src: string, sourceMapBaseUrl?: string): (cb: any) =>
 			sourcemap: 'external',
 			outdir: '.',
 			platform: 'node',
-			target: ['node14.16'],
+			target: ['node12.18'],
 			write: false
 		}).then(res => {
 			const jsFile = res.outputFiles.find(f => /\.js$/.test(f.path))!;
@@ -12,7 +12,7 @@ const yarn = process.platform === 'win32' ? 'yarn.cmd' : 'yarn';
 const rootDir = path.resolve(__dirname, '..', '..');
 function runProcess(command, args = []) {
     return new Promise((resolve, reject) => {
-        const child = child_process_1.spawn(command, args, { cwd: rootDir, stdio: 'inherit', env: process.env });
+        const child = (0, child_process_1.spawn)(command, args, { cwd: rootDir, stdio: 'inherit', env: process.env });
         child.on('exit', err => !err ? resolve() : process.exit(err !== null && err !== void 0 ? err : 1));
         child.on('error', reject);
     });
@@ -260,7 +260,7 @@ function transportCSS(module, enqueue, write) {
     }
     const filename = path.join(SRC_DIR, module);
    const fileContents = fs.readFileSync(filename).toString();
-    const inlineResources = 'base64'; // see https://github.com/microsoft/monaco-editor/issues/148
+    const inlineResources = 'base64'; // see https://github.com/Microsoft/monaco-editor/issues/148
     const newContents = _rewriteOrInlineUrls(fileContents, inlineResources === 'base64');
     write(module, newContents);
     return true;
@@ -302,7 +302,7 @@ function transportCSS(module: string, enqueue: (module: string) => void, write:
 
 	const filename = path.join(SRC_DIR, module);
 	const fileContents = fs.readFileSync(filename).toString();
-	const inlineResources = 'base64'; // see https://github.com/microsoft/monaco-editor/issues/148
+	const inlineResources = 'base64'; // see https://github.com/Microsoft/monaco-editor/issues/148
 
 	const newContents = _rewriteOrInlineUrls(fileContents, inlineResources === 'base64');
 	write(module, newContents);
@@ -241,9 +241,6 @@ function nodeOrChildIsBlack(node) {
|
|||||||
}
|
}
|
||||||
return false;
|
return false;
|
||||||
}
|
}
|
||||||
function isSymbolWithDeclarations(symbol) {
|
|
||||||
return !!(symbol && symbol.declarations);
|
|
||||||
}
|
|
||||||
function markNodes(ts, languageService, options) {
|
function markNodes(ts, languageService, options) {
|
||||||
const program = languageService.getProgram();
|
const program = languageService.getProgram();
|
||||||
if (!program) {
|
if (!program) {
|
||||||
@@ -416,7 +413,7 @@ function markNodes(ts, languageService, options) {
|
|||||||
if (symbolImportNode) {
|
if (symbolImportNode) {
|
||||||
setColor(symbolImportNode, 2 /* Black */);
|
setColor(symbolImportNode, 2 /* Black */);
|
||||||
}
|
}
|
||||||
if (isSymbolWithDeclarations(symbol) && !nodeIsInItsOwnDeclaration(nodeSourceFile, node, symbol)) {
|
if (symbol && !nodeIsInItsOwnDeclaration(nodeSourceFile, node, symbol)) {
|
||||||
for (let i = 0, len = symbol.declarations.length; i < len; i++) { // {{SQL CARBON EDIT}} Compile fixes
|
for (let i = 0, len = symbol.declarations.length; i < len; i++) { // {{SQL CARBON EDIT}} Compile fixes
|
||||||
const declaration = symbol.declarations[i]; // {{SQL CARBON EDIT}} Compile fixes
|
const declaration = symbol.declarations[i]; // {{SQL CARBON EDIT}} Compile fixes
|
||||||
if (ts.isSourceFile(declaration)) {
|
if (ts.isSourceFile(declaration)) {
|
||||||
@@ -689,7 +686,7 @@ function getRealNodeSymbol(ts, checker, node) {
|
|||||||
// get the aliased symbol instead. This allows for goto def on an import e.g.
|
// get the aliased symbol instead. This allows for goto def on an import e.g.
|
||||||
// import {A, B} from "mod";
|
// import {A, B} from "mod";
|
||||||
// to jump to the implementation directly.
|
// to jump to the implementation directly.
|
||||||
if (symbol && symbol.flags & ts.SymbolFlags.Alias && symbol.declarations && shouldSkipAlias(node, symbol.declarations[0])) { // {{SQL CARBON EDIT}} Compile fixes
|
if (symbol && symbol.flags & ts.SymbolFlags.Alias && shouldSkipAlias(node, symbol.declarations[0])) { // {{SQL CARBON EDIT}} Compile fixes
|
||||||
const aliased = checker.getAliasedSymbol(symbol);
|
const aliased = checker.getAliasedSymbol(symbol);
|
||||||
if (aliased.declarations) {
|
if (aliased.declarations) {
|
||||||
// We should mark the import as visited
|
// We should mark the import as visited
|
||||||
|
|||||||
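The hunks above remove a user-defined type-guard helper and fall back to plain truthiness checks plus non-null (`!`) assertions. For context, here is a minimal standalone sketch of the type-guard pattern being removed; the `Decl`/`Sym` interfaces are simplified stand-ins for the real TypeScript compiler-API types, not code from this repository:

```typescript
// Simplified stand-ins for ts.Declaration / ts.Symbol (assumption: compiler-API
// symbols expose an optional `declarations` array, as the diff suggests).
interface Decl { kind: string; }
interface Sym { name: string; declarations?: Decl[]; }

// A `symbol is ...` return type is a user-defined type guard: in a branch where
// it returned true, the compiler narrows `symbol` so that `symbol.declarations`
// is a non-optional array and no `!` assertion is needed.
function isSymbolWithDeclarations(symbol: Sym | undefined | null): symbol is Sym & { declarations: Decl[] } {
	return !!(symbol && symbol.declarations);
}

const withDecls: Sym = { name: 'a', declarations: [{ kind: 'VariableDeclaration' }] };
const withoutDecls: Sym = { name: 'b' };

console.log(isSymbolWithDeclarations(withDecls));    // → true
console.log(isSymbolWithDeclarations(withoutDecls)); // → false
```

Dropping the guard is why the new code needs the `symbol.declarations!` assertions seen under the `{{SQL CARBON EDIT}} Compile fixes` comments.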
@@ -323,10 +323,6 @@ function nodeOrChildIsBlack(node: ts.Node): boolean {
 	return false;
 }
 
-function isSymbolWithDeclarations(symbol: ts.Symbol | undefined | null): symbol is ts.Symbol & { declarations: ts.Declaration[] } {
-	return !!(symbol && symbol.declarations);
-}
-
 function markNodes(ts: typeof import('typescript'), languageService: ts.LanguageService, options: ITreeShakingOptions) {
 	const program = languageService.getProgram();
 	if (!program) {
@@ -534,7 +530,7 @@ function markNodes(ts: typeof import('typescript'), languageService: ts.Language
 			setColor(symbolImportNode, NodeColor.Black);
 		}
 
-		if (isSymbolWithDeclarations(symbol) && !nodeIsInItsOwnDeclaration(nodeSourceFile, node, symbol)) {
+		if (symbol && !nodeIsInItsOwnDeclaration(nodeSourceFile, node, symbol)) {
 			for (let i = 0, len = symbol.declarations!.length; i < len; i++) { // {{SQL CARBON EDIT}} Compile fixes
 				const declaration = symbol.declarations![i]; // {{SQL CARBON EDIT}} Compile fixes
 				if (ts.isSourceFile(declaration)) {
@@ -599,7 +595,7 @@ function markNodes(ts: typeof import('typescript'), languageService: ts.Language
 	}
 }
 
-function nodeIsInItsOwnDeclaration(nodeSourceFile: ts.SourceFile, node: ts.Node, symbol: ts.Symbol & { declarations: ts.Declaration[] }): boolean {
+function nodeIsInItsOwnDeclaration(nodeSourceFile: ts.SourceFile, node: ts.Node, symbol: ts.Symbol): boolean {
 	for (let i = 0, len = symbol.declarations!.length; i < len; i++) { // {{SQL CARBON EDIT}} Compile fixes
 		const declaration = symbol.declarations![i]; // {{SQL CARBON EDIT}} Compile fixes
 		const declarationSourceFile = declaration.getSourceFile();
@@ -842,7 +838,7 @@ function getRealNodeSymbol(ts: typeof import('typescript'), checker: ts.TypeChec
 	// get the aliased symbol instead. This allows for goto def on an import e.g.
 	// import {A, B} from "mod";
 	// to jump to the implementation directly.
-	if (symbol && symbol.flags & ts.SymbolFlags.Alias && symbol.declarations && shouldSkipAlias(node, symbol.declarations![0])) { // {{SQL CARBON EDIT}} Compile fixes
+	if (symbol && symbol.flags & ts.SymbolFlags.Alias && shouldSkipAlias(node, symbol.declarations![0])) { // {{SQL CARBON EDIT}} Compile fixes
 		const aliased = checker.getAliasedSymbol(symbol);
 		if (aliased.declarations) {
 			// We should mark the import as visited
build/lib/typings/gulp-bom.d.ts (vendored, 2 changes)
@@ -4,7 +4,7 @@ declare module "gulp-bom" {
 
 /**
  * This is required as per:
- * https://github.com/microsoft/TypeScript/issues/5073
+ * https://github.com/Microsoft/TypeScript/issues/5073
  */
 namespace f {}
 
build/lib/typings/gulp-flatmap.d.ts (vendored, 4 changes)
@@ -4,9 +4,9 @@ declare module 'gulp-flatmap' {
 
 /**
  * This is required as per:
- * https://github.com/microsoft/TypeScript/issues/5073
+ * https://github.com/Microsoft/TypeScript/issues/5073
  */
 namespace f {}
 
 export = f;
 }
build/lib/typings/vinyl.d.ts (vendored, 4 changes)
@@ -103,10 +103,10 @@ declare module "vinyl" {
 
 /**
  * This is required as per:
- * https://github.com/microsoft/TypeScript/issues/5073
+ * https://github.com/Microsoft/TypeScript/issues/5073
  */
 namespace File {}
 
 export = File;
 
 }
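Each of these vendored typings carries the same shape the "required as per issues/5073" comment points at: a (possibly empty) namespace merged onto the entity that is the `export =` target. A hedged, runnable sketch of that declaration-merging pattern in ordinary TypeScript follows; the `flatmap` name and the `version` member are illustrative, not taken from the vendored typings:

```typescript
// Function/namespace declaration merging: the namespace merges onto the
// function of the same name, so one entity is both callable and a container.
// In the vendored .d.ts files the namespace body is empty; only the merge
// itself matters for the `export =` workaround the comment references.
function flatmap(msg: string): string {
	return `flatmap(${msg})`;
}

namespace flatmap {
	// A member is added here only to make the merge observable.
	export const version = '0.0.0';
}

console.log(flatmap('hello'));  // → flatmap(hello)
console.log(flatmap.version);   // → 0.0.0
```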
Some files were not shown because too many files have changed in this diff.