Compare commits

..

16 Commits

Author SHA1 Message Date
Chris LaFreniere
4095037f25 Fix Notebook Viewlet Highlighting (#10868) (#10870)
* Stop double tree data provider registrations

* revert book finding logic, leverage currentBook
2020-06-10 22:56:57 -07:00
Karl Burtram
7091480c55 Update SQL Tools Service to include "use master" revert (#10834) (#10835) 2020-06-09 17:37:20 -07:00
Hale Rankin
b6b360de10 Fixed notebook toolbar icon issue in dark and hc-black themes. (#10802) (#10830) 2020-06-09 15:41:41 -07:00
Maddy
d6ba3c5d48 highlight correct notebook on navigation. (#10807) (#10812)
* highlight correct notebook on navigation.

* add back behavior of highlight on viewlet visible

* optimize bookViewer create

* createTreeView for every book
2020-06-08 20:02:25 -07:00
Aasim Khan
5de8b066e7 Fix for "Run Current Query" keybind no longer behaving as expected (#10538) (#10557) (#10804)
* -"Run current" command runs the entire selection instead of only first statement in the selection

* -brought some logic back from the  old code to correct the behavior

* -Added unit tests for runCurrent command

* -Fixed a comment

* - Added checks for run input parameters

* -Added some extra checks

* -Fixed some spacing issue

* Changed to using verify instead of a counter variable
2020-06-08 18:34:42 -07:00
Amir Omidi
9962363f4c Fixes the azure auth for postgresql (#10662) (#10753) 2020-06-08 13:55:52 -07:00
Hale Rankin
1c0a9a1849 10712, 10677 - Fixed focus indicators for both main toolbar and cell … (#10756) (#10785)
* 10712, 10677 - Fixed focus indicators for both main toolbar and cell toolbar.

* Removed duplicated line
2020-06-08 09:51:35 -07:00
Hale Rankin
a871b03550 Reduced margin between notebook cells. Increased space at top of page. (#10735) (#10783) 2020-06-08 09:20:53 -07:00
Hale Rankin
5e6e9fcc2f Removed style that was interfering with text selection within code cell. (#10769) (#10786) 2020-06-05 18:32:10 -07:00
Hale Rankin
f130ced71c Switched order of the two dropdowns in main toolbar in preview. (#10717) (#10784)
* Switched order of the two dropdowns in main toolbar in preview.

* Reverting this single line b/c it's out of scope for the PR.
2020-06-05 18:16:06 -07:00
Chris LaFreniere
d7d4c26780 Bring back hover state setting on cell model (#10758) (#10782) 2020-06-05 17:43:05 -07:00
Alan Ren
7b382794e2 reduce max memory (#10775) (#10778) 2020-06-05 15:32:28 -07:00
Cory Rivera
b6be7f1218 Add custom width to Python wizard to reduce wasted space. (#10705) (#10755)
* Also use double quotes for localize calls in the wizard.
2020-06-05 12:17:00 -07:00
Karl Burtram
ca7ee35993 Add Redgate Search to recommendation list (#10749) (#10764) 2020-06-05 09:05:21 -07:00
Chris LaFreniere
46192ae754 Change markdown cell to text cell (#10715) (#10744) 2020-06-04 15:00:12 -07:00
Chris LaFreniere
f5b0141763 Only show edit cell action for markdown cells (#10719) (#10743) 2020-06-04 14:36:52 -07:00
2258 changed files with 40680 additions and 90940 deletions


@@ -1,121 +0,0 @@
#-------------------------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See https://go.microsoft.com/fwlink/?linkid=2090316 for license information.
#-------------------------------------------------------------------------------------------------------------
FROM mcr.microsoft.com/vscode/devcontainers/typescript-node:0-10
ARG TARGET_DISPLAY=":1"
# VNC options
ARG MAX_VNC_RESOLUTION=1920x1080x16
ARG TARGET_VNC_RESOLUTION=1920x1080
ARG TARGET_VNC_DPI=72
ARG TARGET_VNC_PORT=5901
ARG VNC_PASSWORD="vscode"
# noVNC (VNC web client) options
ARG INSTALL_NOVNC="true"
ARG NOVNC_VERSION=1.1.0
ARG TARGET_NOVNC_PORT=6080
ARG WEBSOCKETIFY_VERSION=0.9.0
# Firefox is useful for testing things like browser launch events, but optional
ARG INSTALL_FIREFOX="false"
# Expected non-root username from base image
ARG USERNAME=node
# Core environment variables for X11, VNC, and fluxbox
ENV DBUS_SESSION_BUS_ADDRESS="autolaunch:" \
MAX_VNC_RESOLUTION="${MAX_VNC_RESOLUTION}" \
VNC_RESOLUTION="${TARGET_VNC_RESOLUTION}" \
VNC_DPI="${TARGET_VNC_DPI}" \
VNC_PORT="${TARGET_VNC_PORT}" \
NOVNC_PORT="${TARGET_NOVNC_PORT}" \
DISPLAY="${TARGET_DISPLAY}" \
LANG="en_US.UTF-8" \
LANGUAGE="en_US.UTF-8" \
VISUAL="nano" \
EDITOR="nano"
# Configure apt and install packages
RUN apt-get update \
&& export DEBIAN_FRONTEND=noninteractive \
#
# Install the Cascadia Code fonts - https://github.com/microsoft/cascadia-code
&& curl -sSL https://github.com/microsoft/cascadia-code/releases/download/v2004.30/CascadiaCode_2004.30.zip -o /tmp/cascadia-fonts.zip \
&& unzip /tmp/cascadia-fonts.zip -d /tmp/cascadia-fonts \
&& mkdir -p /usr/share/fonts/truetype/cascadia \
&& mv /tmp/cascadia-fonts/ttf/* /usr/share/fonts/truetype/cascadia/ \
&& rm -rf /tmp/cascadia-fonts.zip /tmp/cascadia-fonts \
#
# Install X11, fluxbox and VS Code dependencies
&& apt-get -y install --no-install-recommends \
xvfb \
x11vnc \
fluxbox \
dbus-x11 \
x11-utils \
x11-xserver-utils \
xdg-utils \
fbautostart \
xterm \
eterm \
gnome-terminal \
gnome-keyring \
seahorse \
nautilus \
libx11-dev \
libxkbfile-dev \
libsecret-1-dev \
libnotify4 \
libnss3 \
libxss1 \
libasound2 \
xfonts-base \
xfonts-terminus \
fonts-noto \
fonts-wqy-microhei \
fonts-droid-fallback \
vim-tiny \
nano \
#
# [Optional] Install noVNC
&& if [ "${INSTALL_NOVNC}" = "true" ]; then \
mkdir -p /usr/local/novnc \
&& curl -sSL https://github.com/novnc/noVNC/archive/v${NOVNC_VERSION}.zip -o /tmp/novnc-install.zip \
&& unzip /tmp/novnc-install.zip -d /usr/local/novnc \
&& cp /usr/local/novnc/noVNC-${NOVNC_VERSION}/vnc_lite.html /usr/local/novnc/noVNC-${NOVNC_VERSION}/index.html \
&& rm /tmp/novnc-install.zip \
&& curl -sSL https://github.com/novnc/websockify/archive/v${WEBSOCKETIFY_VERSION}.zip -o /tmp/websockify-install.zip \
&& unzip /tmp/websockify-install.zip -d /usr/local/novnc \
&& apt-get -y install --no-install-recommends python-numpy \
&& ln -s /usr/local/novnc/websockify-${WEBSOCKETIFY_VERSION} /usr/local/novnc/noVNC-${NOVNC_VERSION}/utils/websockify \
&& rm /tmp/websockify-install.zip; \
fi \
#
# [Optional] Install Firefox
&& if [ "${INSTALL_FIREFOX}" = "true" ]; then \
apt-get -y install --no-install-recommends firefox-esr; \
fi \
#
# Clean up
&& apt-get autoremove -y \
&& apt-get clean -y \
&& rm -rf /var/lib/apt/lists/*
COPY bin/init-dev-container.sh /usr/local/share/
COPY bin/set-resolution /usr/local/bin/
COPY fluxbox/* /root/.fluxbox/
COPY fluxbox/* /home/${USERNAME}/.fluxbox/
# Update privs, owners of config files
RUN mkdir -p /var/run/dbus /root/.vnc /home/${USERNAME}/.vnc \
&& touch /root/.Xmodmap /home/${USERNAME}/.Xmodmap \
&& echo "${VNC_PASSWORD}" | tee /root/.vnc/passwd > /home/${USERNAME}/.vnc/passwd \
&& chown -R ${USERNAME}:${USERNAME} /home/${USERNAME}/.Xmodmap /home/${USERNAME}/.fluxbox /home/${USERNAME}/.vnc \
&& chmod +x /usr/local/share/init-dev-container.sh /usr/local/bin/set-resolution
ENTRYPOINT ["/usr/local/share/init-dev-container.sh"]
CMD ["sleep", "infinity"]


@@ -1,82 +0,0 @@
# Code - OSS Development Container
This repository includes configuration for a development container for working with Code - OSS in an isolated local container or using [Visual Studio Codespaces](https://aka.ms/vso).
> **Tip:** The default VNC password is `vscode`. The VNC server runs on port `5901` with a web client at `6080`. For better performance, we recommend using a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/). Applications like the macOS Screen Sharing app will not perform as well. [Chicken](https://sourceforge.net/projects/chicken/) is a good macOS alternative.
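For example, once the container is up and the ports are forwarded, the desktop can be reached from the host like this (a sketch assuming `xdg-open` and a locally installed VNC viewer such as TigerVNC's `vncviewer`; the ports and password are the defaults described above):

```bash
# Open the noVNC web client in a browser (use `open` instead of `xdg-open` on macOS)
xdg-open http://localhost:6080

# Or point a native VNC viewer at the forwarded VNC port (password: vscode)
vncviewer localhost:5901
```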
## Quick start - local
1. Install Docker Desktop or Docker on your local machine. (See [docs](https://aka.ms/vscode-remote/containers/getting-started) for additional details.)
2. [Docker Desktop] If you are not using the new WSL2 Docker Desktop engine, increase the resources allocated to Docker Desktop to at least **4 Cores and 4 GB of RAM (8 GB recommended)**. Right-click on the Docker status bar item, go to **Preferences/Settings > Resources > Advanced** to do so.
> **Note:** The [Resource Monitor](https://marketplace.visualstudio.com/items?itemName=mutantdino.resourcemonitor) extension is included in the container so you can keep an eye on CPU/Memory in the status bar.
3. Install [Visual Studio Code Stable](https://code.visualstudio.com/) or [Insiders](https://code.visualstudio.com/insiders/) and the [Remote - Containers](https://aka.ms/vscode-remote/download/containers) extension.
![Image of Remote - Containers extension](https://microsoft.github.io/vscode-remote-release/images/remote-containers-extn.png)
> Note that the Remote - Containers extension requires the Visual Studio Code distribution of Code - OSS. See the [FAQ](https://aka.ms/vscode-remote/faq/license) for details.
4. Press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and select **Remote - Containers: Open Repository in Container...**.
> **Tip:** While you can use your local source tree instead, operations like `yarn install` can be slow on macOS or using the Hyper-V engine on Windows. We recommend the "open repository" approach instead since it uses a "named volume" rather than the local filesystem.
5. Type `https://github.com/microsoft/vscode` (or a branch or PR URL) in the input box and press <kbd>Enter</kbd>.
6. After the container is running, open a web browser and go to [http://localhost:6080](http://localhost:6080) or use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
Anything you start in VS Code or the integrated terminal will appear here.
Next: **[Try it out!](#try-it)**
## Quick start - Codespaces
>Note that the Codespaces browser-based editor cannot currently access the desktop environment in this container (due to a [missing feature](https://github.com/MicrosoftDocs/vsonline/issues/117)). We recommend using Visual Studio Code from the desktop to connect instead in the near term.
1. Install [Visual Studio Code Stable](https://code.visualstudio.com/) or [Insiders](https://code.visualstudio.com/insiders/) and the [Visual Studio Codespaces](https://aka.ms/vscs-ext-vscode) extension.
![Image of VS Codespaces extension](https://microsoft.github.io/vscode-remote-release/images/codespaces-extn.png)
> Note that the Visual Studio Codespaces extension requires the Visual Studio Code distribution of Code - OSS.
2. Sign in by pressing <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and selecting **Codespaces: Sign In**. You may also need to use the **Codespaces: Create Plan** command if you do not have a plan. See the [Codespaces docs](https://aka.ms/vso-docs/vscode) for details.
3. Press <kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>P</kbd> and select **Codespaces: Create New Codespace**.
4. Use default settings, select a plan, and then enter the repository URL `https://github.com/microsoft/vscode` (or a branch or PR URL) in the input box when prompted.
5. After the container is running, open a web browser and go to [http://localhost:6080](http://localhost:6080) or use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
6. Anything you start in VS Code or the integrated terminal will appear here.
## Try it!
This container uses the [Fluxbox](http://fluxbox.org/) window manager to keep things lean. **Right-click on the desktop** to see menu options. It works with GNOME and GTK applications, so other tools can be installed if needed.
Note you can also set the resolution from the command line by typing `set-resolution`.
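For instance (a sketch based on the `set-resolution` script included later in this change, which accepts an optional `WIDTHxHEIGHT` and DPI argument):

```bash
# Interactive: shows the current xrandr settings and prompts for new values
set-resolution

# Non-interactive: pass the resolution (and optionally the DPI) directly
set-resolution 1920x1080 96
```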
To start working with Code - OSS, follow these steps:
1. In your local VS Code, open a terminal (<kbd>Ctrl/Cmd</kbd> + <kbd>Shift</kbd> + <kbd>\`</kbd>) and type the following commands:
```bash
yarn install
bash scripts/code.sh
```
2. After the build is complete, open a web browser and go to [http://localhost:6080](http://localhost:6080) or use a [VNC Viewer](https://www.realvnc.com/en/connect/download/viewer/) to connect to `localhost:5901` and enter `vscode` as the password.
3. You should now see Code - OSS!
Next, let's try debugging.
1. Shut down Code - OSS by clicking the box in the upper right corner of the Code - OSS window through your browser or VNC viewer.
2. Go to your local VS Code client, and use Run / Debug view to launch the **VS Code** configuration. (Typically the default, so you can likely just press <kbd>F5</kbd>).
> **Note:** If launching times out, you can increase the value of `timeout` in the "VS Code", "Attach Main Process", "Attach Extension Host", and "Attach to Shared Process" configurations in [launch.json](../.vscode/launch.json). However, running `scripts/code.sh` first will set up Electron which will usually solve timeout issues.
3. After a bit, Code - OSS will appear with the debugger attached!
Enjoy!


@@ -1,91 +0,0 @@
#!/bin/bash
NONROOT_USER=node
LOG=/tmp/container-init.log
# Execute the command if not already running
startInBackgroundIfNotRunning()
{
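# $1 = process name (used for the pidof check and the log file); all arguments are forwarded to keepRunningInBackground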
log "Starting $1."
echo -e "\n** $(date) **" | sudoIf tee -a /tmp/$1.log > /dev/null
if ! pidof $1 > /dev/null; then
keepRunningInBackground "$@"
while ! pidof $1 > /dev/null; do
sleep 1
done
log "$1 started."
else
echo "$1 is already running." | sudoIf tee -a /tmp/$1.log > /dev/null
log "$1 is already running."
fi
}
# Keep command running in background
keepRunningInBackground()
{
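# $1 = log/pid file name, $2 = privilege wrapper (sudoIf or sudoUserIf), $3 = command to run and restart in a loop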
($2 sh -c "while :; do echo [\$(date)] Process started.; $3; echo [\$(date)] Process exited!; sleep 5; done 2>&1" | sudoIf tee -a /tmp/$1.log > /dev/null & echo "$!" | sudoIf tee /tmp/$1.pid > /dev/null)
}
# Use sudo to run as root when required
sudoIf()
{
if [ "$(id -u)" -ne 0 ]; then
sudo "$@"
else
"$@"
fi
}
# Use sudo to run as non-root user if not already running
sudoUserIf()
{
if [ "$(id -u)" -eq 0 ]; then
sudo -u ${NONROOT_USER} "$@"
else
"$@"
fi
}
# Log messages
log()
{
echo -e "[$(date)] $@" | sudoIf tee -a $LOG > /dev/null
}
log "** SCRIPT START **"
# Start dbus.
log 'Running "/etc/init.d/dbus start".'
if [ -f "/var/run/dbus/pid" ] && ! pidof dbus-daemon > /dev/null; then
sudoIf rm -f /var/run/dbus/pid
fi
sudoIf /etc/init.d/dbus start 2>&1 | sudoIf tee -a /tmp/dbus-daemon-system.log > /dev/null
while ! pidof dbus-daemon > /dev/null; do
sleep 1
done
# Set up Xvfb.
startInBackgroundIfNotRunning "Xvfb" sudoIf "Xvfb ${DISPLAY:-:1} +extension RANDR -screen 0 ${MAX_VNC_RESOLUTION:-1920x1080x16}"
# Start fluxbox as a lightweight window manager.
startInBackgroundIfNotRunning "fluxbox" sudoUserIf "dbus-launch startfluxbox"
# Start x11vnc
startInBackgroundIfNotRunning "x11vnc" sudoIf "x11vnc -display ${DISPLAY:-:1} -rfbport ${VNC_PORT:-5901} -localhost -no6 -xkb -shared -forever -passwdfile $HOME/.vnc/passwd"
# Set resolution
/usr/local/bin/set-resolution ${VNC_RESOLUTION:-1280x720} ${VNC_DPI:-72}
# Spin up noVNC if installed and not running.
if [ -d "/usr/local/novnc" ] && [ "$(ps -ef | grep /usr/local/novnc/noVNC*/utils/launch.sh | grep -v grep)" = "" ]; then
keepRunningInBackground "noVNC" sudoIf "/usr/local/novnc/noVNC*/utils/launch.sh --listen ${NOVNC_PORT:-6080} --vnc localhost:${VNC_PORT:-5901}"
log "noVNC started."
else
log "noVNC is already running or not installed."
fi
# Run whatever was passed in
log "Executing \"$@\"."
"$@"
log "** SCRIPT EXIT **"


@@ -1,25 +0,0 @@
#!/bin/bash
RESOLUTION=${1:-${VNC_RESOLUTION:-1920x1080}}
DPI=${2:-${VNC_DPI:-72}}
if [ -z "$1" ]; then
echo -e "**Current Settings **\n"
xrandr
echo -n -e "\nEnter new resolution (WIDTHxHEIGHT, blank for ${RESOLUTION}, Ctrl+C to abort).\n> "
read NEW_RES
if [ "${NEW_RES}" != "" ]; then
RESOLUTION=${NEW_RES}
fi
if [ -z "$2" ]; then
echo -n -e "\nEnter new DPI (blank for ${DPI}, Ctrl+C to abort).\n> "
read NEW_DPI
if [ "${NEW_DPI}" != "" ]; then
DPI=${NEW_DPI}
fi
fi
fi
xrandr --fb ${RESOLUTION} --dpi ${DPI} > /dev/null 2>&1
echo -e "\n**New Settings **\n"
xrandr
echo


@@ -1,41 +0,0 @@
{
"name": "Code - OSS",
"build": {
"dockerfile": "Dockerfile",
"args": {
"MAX_VNC_RESOLUTION": "1920x1080x16",
"TARGET_VNC_RESOLUTION": "1280x768",
"TARGET_VNC_PORT": "5901",
"TARGET_NOVNC_PORT": "6080",
"VNC_PASSWORD": "vscode",
"INSTALL_FIREFOX": "true"
}
},
"overrideCommand": false,
"runArgs": ["--init"],
"settings": {
// zsh is also available
"terminal.integrated.shell.linux": "/bin/bash",
"resmon.show.battery": false,
"resmon.show.cpufreq": false,
"remote.extensionKind": {
"ms-vscode.js-debug-nightly": "workspace",
"msjsdiag.debugger-for-chrome": "workspace"
},
"debug.chrome.useV3": true
},
// noVNC, VNC ports
"forwardPorts": [6080, 5901],
"extensions": [
"dbaeumer.vscode-eslint",
"EditorConfig.EditorConfig",
"msjsdiag.debugger-for-chrome",
"mutantdino.resourcemonitor",
"GitHub.vscode-pull-request-github"
],
"remoteUser": "node"
}


@@ -1,9 +0,0 @@
[app] (name=code-oss-dev)
[Position] (CENTER) {0 0}
[Maximized] {yes}
[Dimensions] {100% 100%}
[end]
[transient] (role=GtkFileChooserDialog)
[Position] (CENTER) {0 0}
[Dimensions] {70% 70%}
[end]


@@ -1,9 +0,0 @@
session.menuFile: ~/.fluxbox/menu
session.keyFile: ~/.fluxbox/keys
session.styleFile: /usr/share/fluxbox/styles//Squared_for_Debian
session.configVersion: 13
session.screen0.workspaces: 1
session.screen0.workspacewarping: false
session.screen0.toolbar.widthPercent: 100
session.screen0.strftimeFormat: %d %b, %a %02k:%M:%S
session.screen0.toolbar.tools: prevworkspace, workspacename, nextworkspace, clock, prevwindow, nextwindow, iconbar, systemtray


@@ -1,16 +0,0 @@
[begin] ( Code - OSS Development Container )
[exec] (File Manager) { nautilus ~ } <>
[exec] (Terminal) {/usr/bin/gnome-terminal --working-directory=~ } <>
[exec] (Start Code - OSS) { x-terminal-emulator -T "Code - OSS Build" -e bash /workspaces/vscode*/scripts/code.sh } <>
[submenu] (System >) {}
[exec] (Set Resolution) { x-terminal-emulator -T "Set Resolution" -e bash /usr/local/bin/set-resolution } <>
[exec] (Passwords and Keys) { seahorse } <>
[exec] (Top) { x-terminal-emulator -T "Top" -e /usr/bin/top } <>
[exec] (Editres) {editres} <>
[exec] (Xfontsel) {xfontsel} <>
[exec] (Xkill) {xkill} <>
[exec] (Xrefresh) {xrefresh} <>
[end]
[config] (Configuration >)
[workspaces] (Workspaces >)
[end]


@@ -42,15 +42,7 @@
"jsdoc/no-types": "warn",
"semi": "off",
"@typescript-eslint/semi": "warn",
"@typescript-eslint/naming-convention": [
"warn",
{
"selector": "class",
"format": [
"PascalCase"
]
}
],
"@typescript-eslint/class-name-casing": "warn",
"code-no-unused-expressions": [
"warn",
{
@@ -71,18 +63,13 @@
"browser": [
"common"
],
"electron-sandbox": [
"electron-main": [
"common",
"browser"
"node"
],
"electron-browser": [
"common",
"browser",
"node",
"electron-sandbox"
],
"electron-main": [
"common",
"node"
]
}
@@ -119,14 +106,6 @@
"rxjs/*"
]
},
{
"target": "**/{vs,sql}/base/electron-sandbox/**",
"restrictions": [
"vs/nls",
"vs/css!./**/*",
"**/{vs,sql}/base/{common,browser,electron-sandbox}/**"
]
},
{
"target": "**/{vs,sql}/base/node/**",
"restrictions": [
@@ -176,22 +155,13 @@
"*" // node modules
]
},
{
"target": "**/{vs,sql}/base/parts/*/electron-sandbox/**",
"restrictions": [
"vs/nls",
"vs/css!./**/*",
"**/{vs,sql}/base/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/base/parts/*/{common,browser,electron-sandbox}/**"
]
},
{
"target": "**/{vs,sql}/base/parts/*/electron-browser/**",
"restrictions": [
"vs/nls",
"vs/css!./**/*",
"**/{vs,sql}/base/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/base/parts/*/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/base/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/base/parts/*/{common,browser,node,electron-browser}/**",
"*" // node modules
]
},
@@ -221,10 +191,10 @@
"typemoq",
"sinon",
"vs/nls",
"azdata",
"azdata",
"**/{vs,sql}/base/common/**",
"**/{vs,sql}/base/parts/*/common/**",
"**/{vs,sql}/base/test/common/**",
"**/{vs,sql}/base/parts/*/common/**",
"**/{vs,sql}/platform/*/common/**",
"**/{vs,sql}/platform/*/test/common/**"
]
@@ -251,25 +221,15 @@
"*" // node modules
]
},
{
"target": "**/{vs,sql}/platform/*/electron-sandbox/**",
"restrictions": [
"vs/nls",
"vs/css!./**/*",
"**/{vs,sql}/base/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/base/parts/*/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/platform/*/{common,browser,electron-sandbox}/**"
]
},
{
"target": "**/{vs,sql}/platform/*/electron-browser/**",
"restrictions": [
"vs/nls",
"azdata",
"vs/css!./**/*",
"**/{vs,sql}/base/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/base/parts/*/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/platform/*/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/base/{common,browser,node}/**",
"**/{vs,sql}/base/parts/*/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/platform/*/{common,browser,node,electron-browser}/**",
"*" // node modules
]
},
@@ -482,34 +442,18 @@
"**/{vs,sql}/**/{common,worker}/**"
]
},
{
"target": "**/{vs,sql}/workbench/electron-sandbox/**",
"restrictions": [
"vs/nls",
"vs/css!./**/*",
"**/{vs,sql}/base/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/base/parts/*/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/platform/*/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/editor/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/editor/contrib/**", // editor/contrib is equivalent to /browser/ by convention
"**/{vs,sql}/workbench/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/api/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/services/*/{common,browser,electron-sandbox}/**"
]
},
{
"target": "**/{vs,sql}/workbench/electron-browser/**",
"restrictions": [
"vs/nls",
"vs/css!./**/*",
"**/{vs,sql}/base/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/base/parts/*/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/platform/*/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/editor/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/base/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/base/parts/*/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/platform/*/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/editor/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/editor/contrib/**", // editor/contrib is equivalent to /browser/ by convention
"**/{vs,sql}/workbench/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/api/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/services/*/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/{common,browser,node,electron-browser,api}/**",
"**/{vs,sql}/workbench/services/*/{common,browser,node,electron-browser}/**",
"*" // node modules
]
},
@@ -521,7 +465,7 @@
"**/{vs,sql}/base/**",
"**/{vs,sql}/platform/**",
"**/{vs,sql}/editor/**",
"**/{vs,sql}/workbench/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/{common,browser,node,electron-browser}/**",
"vs/workbench/contrib/files/common/editors/fileEditorInput",
"**/{vs,sql}/workbench/services/**",
"**/{vs,sql}/workbench/test/**",
@@ -542,9 +486,7 @@
"**/{vs,sql}/workbench/api/**/common/**",
"vs/workbench/contrib/files/common/editors/fileEditorInput", // this should be fine, it only accesses constants from contrib
"vscode-textmate",
"vscode-oniguruma",
"iconv-lite-umd",
"semver-umd"
"vscode-oniguruma"
]
},
{
@@ -595,30 +537,16 @@
"*" // node modules
]
},
{
"target": "**/{vs,sql}/workbench/services/**/electron-sandbox/**",
"restrictions": [
"vs/nls",
"vs/css!./**/*",
"**/{vs,sql}/base/**/{common,browser,worker,electron-sandbox}/**",
"**/{vs,sql}/platform/**/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/editor/**",
"**/{vs,sql}/workbench/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/api/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/services/**/{common,browser,electron-sandbox}/**"
]
},
{
"target": "**/{vs,sql}/workbench/services/**/electron-browser/**",
"restrictions": [
"vs/nls",
"vs/css!./**/*",
"**/{vs,sql}/base/**/{common,browser,worker,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/platform/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/base/**/{common,browser,worker,node,electron-browser}/**",
"**/{vs,sql}/platform/**/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/editor/**",
"**/{vs,sql}/workbench/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/api/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/services/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/{common,browser,node,electron-browser,api}/**",
"**/{vs,sql}/workbench/services/**/{common,browser,node,electron-browser}/**",
"*" // node modules
]
},
@@ -631,7 +559,7 @@
"**/{vs,sql}/base/**",
"**/{vs,sql}/platform/**",
"**/{vs,sql}/editor/**",
"**/{vs,sql}/workbench/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/workbench/services/**",
"**/{vs,sql}/workbench/contrib/**",
"**/{vs,sql}/workbench/test/**",
@@ -735,32 +663,17 @@
"*" // node modules
]
},
{
"target": "**/{vs,sql}/workbench/contrib/**/electron-sandbox/**",
"restrictions": [
"vs/nls",
"vs/css!./**/*",
"**/{vs,sql}/base/**/{common,browser,worker,electron-sandbox}/**",
"**/{vs,sql}/platform/**/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/editor/**",
"**/{vs,sql}/workbench/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/api/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/services/**/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/contrib/**/{common,browser,electron-sandbox}/**"
]
},
{
"target": "**/{vs,sql}/workbench/contrib/**/electron-browser/**",
"restrictions": [
"vs/nls",
"vs/css!./**/*",
"**/{vs,sql}/base/**/{common,browser,worker,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/platform/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/base/**/{common,browser,worker,node,electron-browser}/**",
"**/{vs,sql}/platform/**/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/editor/**",
"**/{vs,sql}/workbench/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/api/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/services/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/contrib/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/{common,browser,node,electron-browser,api}/**",
"**/{vs,sql}/workbench/services/**/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/workbench/contrib/**/{common,browser,node,electron-browser}/**",
"*" // node modules
]
},
@@ -780,10 +693,10 @@
"restrictions": [
"vs/nls",
"vs/css!./**/*",
"**/{vs,sql}/base/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/base/parts/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/platform/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/code/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/base/**/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/base/parts/**/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/platform/**/{common,browser,node,electron-browser}/**",
"**/{vs,sql}/code/**/{common,browser,node,electron-browser}/**",
"*" // node modules
]
},
@@ -811,54 +724,6 @@
"*" // node modules
]
},
{
"target": "**/src/{vs,sql}/workbench/workbench.common.main.ts",
"restrictions": [
"vs/nls",
"**/{vs,sql}/base/**/{common,browser}/**",
"**/{vs,sql}/base/parts/**/{common,browser}/**",
"**/{vs,sql}/platform/**/{common,browser}/**",
"**/{vs,sql}/editor/**",
"**/{vs,sql}/workbench/**/{common,browser}/**"
]
},
{
"target": "**/src/{vs,sql}/workbench/workbench.web.main.ts",
"restrictions": [
"vs/nls",
"**/{vs,sql}/base/**/{common,browser}/**",
"**/{vs,sql}/base/parts/**/{common,browser}/**",
"**/{vs,sql}/platform/**/{common,browser}/**",
"**/{vs,sql}/editor/**",
"**/{vs,sql}/workbench/**/{common,browser}/**",
"**/{vs,sql}/workbench/workbench.common.main"
]
},
{
"target": "**/src/{vs,sql}/workbench/workbench.sandbox.main.ts",
"restrictions": [
"vs/nls",
"**/{vs,sql}/base/**/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/base/parts/**/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/platform/**/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/editor/**",
"**/{vs,sql}/workbench/**/{common,browser,electron-sandbox}/**",
"**/{vs,sql}/workbench/workbench.common.main"
]
},
{
"target": "**/src/{vs,sql}/workbench/workbench.desktop.main.ts",
"restrictions": [
"vs/nls",
"**/{vs,sql}/base/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/base/parts/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/platform/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/editor/**",
"**/{vs,sql}/workbench/**/{common,browser,node,electron-sandbox,electron-browser}/**",
"**/{vs,sql}/workbench/workbench.common.main",
"**/{vs,sql}/workbench/workbench.sandbox.main"
]
},
{
"target": "**/extensions/**",
"restrictions": "**/*"


@@ -2,7 +2,7 @@
* Read our Pull Request guidelines:
https://github.com/Microsoft/azuredatastudio/wiki/How-to-Contribute#pull-requests.
* Associate an issue with the Pull Request.
* Ensure that the code is up-to-date with the `main` branch.
* Ensure that the code is up-to-date with the `master` branch.
* Include a description of the proposed changes and how to test them.
-->


@@ -3,11 +3,11 @@ name: CI
on:
push:
branches:
- main
- master
- release/*
pull_request:
branches:
- main
- master
- release/*
jobs:
@@ -17,7 +17,7 @@ jobs:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2.2.0
- uses: actions/checkout@v1
# TODO: rename azure-pipelines/linux/xvfb.init to github-actions
- run: |
sudo apt-get update
@@ -72,7 +72,7 @@ jobs:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2.2.0
- uses: actions/checkout@v1
- uses: actions/setup-node@v1
with:
node-version: 10
@@ -108,7 +108,7 @@ jobs:
CHILD_CONCURRENCY: "1"
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v2.2.0
- uses: actions/checkout@v1
- uses: actions/setup-node@v1
with:
node-version: 10
@@ -141,7 +141,7 @@ jobs:
# CHILD_CONCURRENCY: "1"
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# steps:
# - uses: actions/checkout@v2.2.0
# - uses: actions/checkout@v1
# # TODO: rename azure-pipelines/linux/xvfb.init to github-actions
# - run: |
# sudo apt-get update


@@ -1,46 +0,0 @@
name: "Code Scanning - Action"
on:
push:
schedule:
- cron: '0 0 * * 0'
jobs:
CodeQL-Build:
strategy:
fail-fast: false
# CodeQL runs on ubuntu-latest, windows-latest, and macos-latest
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
# Override language selection by uncommenting this and choosing your languages
# with:
# languages: go, javascript, csharp, python, cpp, java
# Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below).
- name: Autobuild
uses: github/codeql-action/autobuild@v1
# Command-line programs to run using the OS shell.
# 📚 https://git.io/JvXDl
# ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
# and modify them (or add more) to build your code if your project
# uses a compiled language
#- run: |
# make bootstrap
# make release
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1


@@ -8,10 +8,10 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2.2.0
uses: actions/checkout@v2
with:
repository: 'microsoft/azuredatastudio'
ref: main
ref: master
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions/build/actions


@@ -12,7 +12,7 @@ jobs:
uses: actions/checkout@v2
with:
repository: 'microsoft/azuredatastudio'
ref: main
ref: master
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions/build/actions


@@ -4,7 +4,6 @@
"recommendations": [
"dbaeumer.vscode-eslint",
"EditorConfig.EditorConfig",
"msjsdiag.debugger-for-chrome",
"ms-vscode.vscode-github-issue-notebooks"
"msjsdiag.debugger-for-chrome"
]
}

.vscode/launch.json

@@ -20,13 +20,15 @@
"port": 5870,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
],
"presentation": {
"hidden": true
}
},
{
"type": "pwa-chrome",
"request": "attach",
"name": "Attach to Shared Process",
"timeout": 30000,
"port": 9222,
"urlFilter": "*sharedProcess.html*",
"presentation": {
@@ -58,7 +60,6 @@
"type": "node",
"request": "attach",
"name": "Attach to Main Process",
"timeout": 30000,
"port": 5875,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
@@ -92,7 +93,6 @@
"env": {
"VSCODE_EXTHOST_WILL_SEND_SOCKET": null
},
"cleanUp": "wholeBrowser",
"breakOnLoad": false,
"urlFilter": "*workbench.html*",
"runtimeArgs": [
@@ -108,11 +108,38 @@
],
"browserLaunchLocation": "workspace"
},
{
"type": "chrome",
"request": "launch",
"name": "Launch azuredatastudio with new notebook command",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.bat"
},
"osx": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
},
"urlFilter": "*index.html*",
"runtimeArgs": [
"--inspect=5875",
"--command=notebook.command.new"
],
"skipFiles": [
"**/winjs*.js"
],
"webRoot": "${workspaceFolder}",
"timeout": 45000
},
{
"type": "chrome",
"request": "launch",
"name": "Launch ADS (Web) (TBD)",
"program": "${workspaceFolder}/resources/serverless/code-web.js",
"runtimeExecutable": "yarn",
"runtimeArgs": [
"web"
],
"presentation": {
"group": "0_vscode",
"order": 2
@@ -148,18 +175,6 @@
"order": 3
}
},
{
"type": "pwa-msedge",
"request": "launch",
"name": "VS Code (Web, Edge)",
"url": "http://localhost:8080",
"pauseForSourceMap": false,
"preLaunchTask": "Run web",
"presentation": {
"group": "0_vscode",
"order": 3
}
},
{
"type": "node",
"request": "launch",
@@ -266,7 +281,6 @@
},
{
"name": "Azure Data Studio",
"stopAll": true,
"configurations": [
"Launch azuredatastudio",
"Attach to Main Process",


@@ -1,38 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "##### `Config`: defines the inbox query",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox=repo:microsoft/vscode is:open no:assignee -label:feature-request -label:testplan-item -label:plan-item ",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Triage Inbox\n\nAll inbox issues but not those that need more information. These issues need to be triaged, e.g assigned to a user or ask for more information",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox -label:\"needs more info\" -label:emmet",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "## Inbox\n\nAll issues that have no assignee and that have neither **feature requests** nor **test plan items** nor **plan items**.",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$inbox",
"editable": false
}
]


@@ -1,98 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "##### `Config`: This should be changed every month/milestone",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks\n\n// current milestone name\n$milestone=milestone:\"June 2020\"",
"editable": true
},
{
"kind": 1,
"language": "github-issues",
"value": "## Milestone Work",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos $milestone assignee:@me is:open",
"editable": false
},
{
"kind": 1,
"language": "github-issues",
"value": "## Bugs, Debt, Features...",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "#### My Bugs",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos assignee:@me is:open label:bug",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "#### Debt & Engineering",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos assignee:@me is:open label:debt OR $repos assignee:@me is:open label:engineering",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "#### Performance 🐌 🔜 🏎",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos assignee:@me is:open label:perf OR $repos assignee:@me is:open label:perf-startup OR $repos assignee:@me is:open label:perf-bloat OR $repos assignee:@me is:open label:freeze-slow-crash-leak",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "#### Feature Requests",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos assignee:@me is:open label:feature-request milestone:Backlog sort:reactions-+1-desc",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos assignee:@me is:open milestone:\"Backlog Candidates\"",
"editable": false
},
{
"kind": 1,
"language": "markdown",
"value": "#### Not Actionable",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos assignee:@me is:open label:\"needs more info\"",
"editable": false
}
]


@@ -1,55 +0,0 @@
[
{
"kind": 1,
"language": "markdown",
"value": "### Bug Verification Queries\n\nBefore shipping we want to verify _all_ bugs. That means when a bug is fixed we check that the fix actually works. It's always best to start with bugs that you have filed and the proceed with bugs that have been filed from users outside the development team. ",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "#### Config: update list of `repos` and the `milestone`",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"June 2020\"",
"editable": true
},
{
"kind": 1,
"language": "markdown",
"value": "### Bugs You Filed",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate author:@me",
"editable": false
},
{
"kind": 1,
"language": "markdown",
"value": "### Bugs From Outside",
"editable": true
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate -author:@me -assignee:@me label:bug -label:verified -author:@me -author:aeschli -author:alexdima -author:alexr00 -author:bpasero -author:chrisdias -author:chrmarti -author:connor4312 -author:dbaeumer -author:deepak1556 -author:eamodio -author:egamma -author:gregvanl -author:isidorn -author:JacksonKearl -author:joaomoreno -author:jrieken -author:lramos15 -author:lszomoru -author:misolori -author:mjbvz -author:rebornix -author:RMacfarlane -author:roblourens -author:sana-ajani -author:sandy081 -author:sbatten -author:Tyriar -author:weinand",
"editable": false
},
{
"kind": 1,
"language": "markdown",
"value": "### All"
},
{
"kind": 2,
"language": "github-issues",
"value": "$repos $milestone is:closed -assignee:@me label:bug -label:verified -label:*duplicate",
"editable": false
}
]


@@ -79,8 +79,8 @@ src/vs/base/common/strings.ts:
170 */
171 export function endsWith(haystack: string, needle: string): boolean {
861
862 /**
863: * @deprecated ES6
864 */
865 export function repeat(s: string, count: number): string {
853
854 /**
855: * @deprecated ES6
856 */
857 export function repeat(s: string, count: number): string {


@@ -26,7 +26,6 @@
"test/automation/out/**": true,
"test/integration/browser/out/**": true,
"src/vs/base/test/node/uri.test.data.txt": true,
"src/vs/workbench/test/browser/api/extHostDocumentData.test.perf-data.ts": true,
"src/vs/server": false
},
"lcov.path": [
@@ -73,7 +72,7 @@
},
"gulp.autoDetect": "off",
"files.insertFinalNewline": true,
"[typescript]": {
"[typescript]": {
"editor.defaultFormatter": "vscode.typescript-language-features"
},
"typescript.tsc.autoDetect": "off"

.vscode/tasks.json

@@ -26,8 +26,8 @@
"message": 3
},
"background": {
"beginsPattern": "\\[watch-client\\].*Starting compilation",
"endsPattern": "\\[watch-client\\].*Finished compilation"
"beginsPattern": "Starting compilation",
"endsPattern": "Finished compilation"
}
}
},
@@ -55,43 +55,6 @@
},
"problemMatcher": "$tsc"
},
{
"type": "npm",
"script": "watch-webd",
"label": "Build Web Extensions",
"group": "build",
"isBackground": true,
"presentation": {
"reveal": "never"
},
"problemMatcher": {
"owner": "typescript",
"applyTo": "closedDocuments",
"fileLocation": [
"absolute"
],
"pattern": {
"regexp": "Error: ([^(]+)\\((\\d+|\\d+,\\d+|\\d+,\\d+,\\d+,\\d+)\\): (.*)$",
"file": 1,
"location": 2,
"message": 3
},
"background": {
"beginsPattern": "Starting compilation",
"endsPattern": "Finished compilation"
}
}
},
{
"type": "npm",
"script": "kill-watch-webd",
"label": "Kill Build Web Extensions",
"group": "build",
"presentation": {
"reveal": "never",
},
"problemMatcher": "$tsc"
},
{
"label": "Run tests",
"type": "shell",
@@ -126,7 +89,7 @@
},
{
"type": "shell",
"command": "yarn web --no-launch",
"command": "yarn web -- --no-launch",
"label": "Run web",
"isBackground": true,
"problemMatcher": {


@@ -1,3 +1,3 @@
disturl "https://atom.io/download/electron"
target "7.3.2"
target "7.2.4"
runtime "electron"


@@ -1,11 +1,5 @@
# Change Log
## Version 1.19.0
* Release date: June 15, 2020
* Release status: General Availability
* Address issues in https://github.com/microsoft/azuredatastudio/milestone/55?closed=1
* Bug fixes
## Version 1.18.1
* Release date: May 27, 2020
* Release status: General Availability


@@ -1,7 +1,7 @@
# Azure Data Studio
[![Join the chat at https://gitter.im/Microsoft/sqlopsstudio](https://badges.gitter.im/Microsoft/sqlopsstudio.svg)](https://gitter.im/Microsoft/sqlopsstudio?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[![Build Status](https://dev.azure.com/azuredatastudio/azuredatastudio/_apis/build/status/Azure%20Data%20Studio%20CI?branchName=main)](https://dev.azure.com/azuredatastudio/azuredatastudio/_build/latest?definitionId=4&branchName=main)
[![Build Status](https://dev.azure.com/azuredatastudio/azuredatastudio/_apis/build/status/Azure%20Data%20Studio%20CI?branchName=master)](https://dev.azure.com/azuredatastudio/azuredatastudio/_build/latest?definitionId=4&branchName=master)
[![Twitter Follow](https://img.shields.io/twitter/follow/azuredatastudio?style=social)](https://twitter.com/azuredatastudio)
Azure Data Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux.
@@ -21,14 +21,14 @@ Azure Data Studio is a data management tool that enables you to work with SQL Se
Go to our [download page](https://aka.ms/azuredatastudio) for more specific instructions.
## Try out the latest insiders build from `main`:
## Try out the latest insiders build from `master`:
- [Windows User Installer - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-user/insider)
- [Windows System Installer - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64/insider)
- [Windows ZIP - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-archive/insider)
- [macOS ZIP - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/darwin/insider)
- [Linux TAR.GZ - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/linux-x64/insider)
See the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) for additional details of what's in this release.
See the [change log](https://github.com/Microsoft/azuredatastudio/blob/master/CHANGELOG.md) for additional details of what's in this release.
## **Feature Highlights**
@@ -47,7 +47,7 @@ See the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHAN
Here are some of these features in action.
<img src='https://github.com/Microsoft/azuredatastudio/blob/main/docs/overview_screen.jpg' width='800px'>
<img src='https://github.com/Microsoft/azuredatastudio/blob/master/docs/overview_screen.jpg' width='800px'>
## Contributing
If you are interested in fixing issues and contributing directly to the code base,
@@ -121,7 +121,7 @@ We would like to thank all our users who raised issues, and in particular the fo
* olljanat for `Implemented npm version check (#314)`
* Adam Machanic for helping with the `whoisactive` extension
And of course, we'd like to thank the authors of all upstream dependencies. Please see a full list in the [ThirdPartyNotices.txt](https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/ThirdPartyNotices.txt)
And of course, we'd like to thank the authors of all upstream dependencies. Please see a full list in the [ThirdPartyNotices.txt](https://raw.githubusercontent.com/Microsoft/azuredatastudio/master/ThirdPartyNotices.txt)
## License
@@ -129,10 +129,10 @@ Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the [Source EULA](LICENSE.txt).
[win-user]: https://go.microsoft.com/fwlink/?linkid=2132348
[win-system]: https://go.microsoft.com/fwlink/?linkid=2132347
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2132518
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2132519
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2132349
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2132351
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2132350
[win-user]: https://go.microsoft.com/fwlink/?linkid=2127522
[win-system]: https://go.microsoft.com/fwlink/?linkid=2127432
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2127716
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2127431
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2127523
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2127433
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2127524


@@ -72,8 +72,6 @@ expressly granted herein, whether by implication, estoppel or otherwise.
vscode-ripgrep: https://github.com/roblourens/vscode-ripgrep
vscode-textmate: https://github.com/Microsoft/vscode-textmate
winreg: https://github.com/fresc81/node-winreg
xmldom: https://github.com/xmldom/xmldom
xml-formatter: https://github.com/chrisbottin/xml-formatter
xterm: https://github.com/sourcelair/xterm.js
yargs: https://github.com/yargs/yargs
yauzl: https://github.com/thejoshwolfe/yauzl
@@ -2227,51 +2225,6 @@ EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
=========================================
END OF winreg NOTICES AND INFORMATION
%% xmldom NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in the
Software without restriction, including without limitation the rights to use, copy,
modify, merge, publish, distribute, sublicense, and/or sell copies of the Software,
and to permit persons to whom the Software is furnished to do so, subject to the
following conditions:
The above copyright notice and this permission notice shall be included in all copies
or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE
FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
=========================================
END OF xmldom NOTICES AND INFORMATION
%% xml-formatter NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
Copyright 2019 Chris Bottin (https://github.com/chrisbottin)
Permission is hereby granted, free of charge, to any person obtaining a copy of this software
and associated documentation files (the "Software"), to deal in the Software without restriction,
including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense,
and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT
LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
=========================================
END OF xml-formatter NOTICES AND INFORMATION
%% xterm NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) 2014-2016, SourceLair Private Company (https://www.sourcelair.com)


@@ -1,5 +1,5 @@
trigger:
- main
- master
- release/*
jobs:


@@ -1,27 +0,0 @@
# cleanup rules for web node modules, .gitignore style
**/*.txt
**/*.json
**/*.md
**/*.d.ts
**/*.js.map
**/LICENSE
**/CONTRIBUTORS
jschardet/index.js
jschardet/src/**
jschardet/dist/jschardet.js
vscode-textmate/webpack.config.js
xterm/src/**
xterm-addon-search/src/**
xterm-addon-search/out/**
xterm-addon-search/fixtures/**
xterm-addon-unicode11/src/**
xterm-addon-unicode11/out/**
xterm-addon-webgl/src/**
xterm-addon-webgl/out/**


@@ -10,10 +10,10 @@ git clone --depth 1 https://github.com/Microsoft/vscode-node-debug2.git
git clone --depth 1 https://github.com/Microsoft/vscode-node-debug.git
git clone --depth 1 https://github.com/Microsoft/vscode-html-languageservice.git
git clone --depth 1 https://github.com/Microsoft/vscode-json-languageservice.git
node $BUILD_SOURCESDIRECTORY/build/node_modules/.bin/vscode-telemetry-extractor --sourceDir $BUILD_SOURCESDIRECTORY --excludedDir $BUILD_SOURCESDIRECTORY/extensions --outputDir . --applyEndpoints
node $BUILD_SOURCESDIRECTORY/build/node_modules/.bin/vscode-telemetry-extractor --config $BUILD_SOURCESDIRECTORY/build/azure-pipelines/common/telemetry-config.json -o .
$BUILD_SOURCESDIRECTORY/build/node_modules/.bin/vscode-telemetry-extractor --sourceDir $BUILD_SOURCESDIRECTORY --excludedDir $BUILD_SOURCESDIRECTORY/extensions --outputDir . --applyEndpoints
$BUILD_SOURCESDIRECTORY/build/node_modules/.bin/vscode-telemetry-extractor --config $BUILD_SOURCESDIRECTORY/build/azure-pipelines/common/telemetry-config.json -o .
mkdir -p $BUILD_SOURCESDIRECTORY/.build/telemetry
mv declarations-resolved.json $BUILD_SOURCESDIRECTORY/.build/telemetry/telemetry-core.json
mv config-resolved.json $BUILD_SOURCESDIRECTORY/.build/telemetry/telemetry-extensions.json
cd ..
rm -rf extraction
rm -rf extraction


@@ -216,10 +216,10 @@ async function publish(commit: string, quality: string, platform: string, type:
console.log('Asset:', JSON.stringify(asset, null, ' '));
// {{SQL CARBON EDIT}}
// Insiders: nightly build from main
// Insiders: nightly build from master
const isReleased = (
(
(quality === 'insider' && /^main$|^refs\/heads\/main$/.test(sourceBranch)) ||
(quality === 'insider' && /^master$|^refs\/heads\/master$/.test(sourceBranch)) ||
(quality === 'rc1' && /^release\/|^refs\/heads\/release\//.test(sourceBranch))
) &&
/Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy)


@@ -0,0 +1,228 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as request from 'request';
import { createReadStream, createWriteStream, unlink, mkdir } from 'fs';
import * as github from 'github-releases';
import { join } from 'path';
import { tmpdir } from 'os';
import { promisify } from 'util';
const BASE_URL = 'https://rink.hockeyapp.net/api/2/';
const HOCKEY_APP_TOKEN_HEADER = 'X-HockeyAppToken';
export interface IVersions {
app_versions: IVersion[];
}
export interface IVersion {
id: number;
version: string;
}
export interface IApplicationAccessor {
accessToken: string;
appId: string;
}
export interface IVersionAccessor extends IApplicationAccessor {
id: string;
}
enum Platform {
WIN_32 = 'win32-ia32',
WIN_64 = 'win32-x64',
LINUX_64 = 'linux-x64',
MAC_OS = 'darwin-x64'
}
function symbolsZipName(platform: Platform, electronVersion: string, insiders: boolean): string {
return `${insiders ? 'insiders' : 'stable'}-symbols-v${electronVersion}-${platform}.zip`;
}
const SEED = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
async function tmpFile(name: string): Promise<string> {
let res = '';
for (let i = 0; i < 8; i++) {
res += SEED.charAt(Math.floor(Math.random() * SEED.length));
}
const tmpParent = join(tmpdir(), res);
await promisify(mkdir)(tmpParent);
return join(tmpParent, name);
}
function getVersions(accessor: IApplicationAccessor): Promise<IVersions> {
return asyncRequest<IVersions>({
url: `${BASE_URL}/apps/${accessor.appId}/app_versions`,
method: 'GET',
headers: {
[HOCKEY_APP_TOKEN_HEADER]: accessor.accessToken
}
});
}
function createVersion(accessor: IApplicationAccessor, version: string): Promise<IVersion> {
return asyncRequest<IVersion>({
url: `${BASE_URL}/apps/${accessor.appId}/app_versions/new`,
method: 'POST',
headers: {
[HOCKEY_APP_TOKEN_HEADER]: accessor.accessToken
},
formData: {
bundle_version: version
}
});
}
function updateVersion(accessor: IVersionAccessor, symbolsPath: string) {
return asyncRequest<IVersions>({
url: `${BASE_URL}/apps/${accessor.appId}/app_versions/${accessor.id}`,
method: 'PUT',
headers: {
[HOCKEY_APP_TOKEN_HEADER]: accessor.accessToken
},
formData: {
dsym: createReadStream(symbolsPath)
}
});
}
function asyncRequest<T>(options: request.UrlOptions & request.CoreOptions): Promise<T> {
return new Promise<T>((resolve, reject) => {
request(options, (error, _response, body) => {
if (error) {
reject(error);
} else {
resolve(JSON.parse(body));
}
});
});
}
function downloadAsset(repository: any, assetName: string, targetPath: string, electronVersion: string) {
return new Promise((resolve, reject) => {
repository.getReleases({ tag_name: `v${electronVersion}` }, (err: any, releases: any) => {
if (err) {
reject(err);
} else {
const asset = releases[0].assets.filter((asset: any) => asset.name === assetName)[0];
if (!asset) {
reject(new Error(`Asset with name ${assetName} not found`));
} else {
repository.downloadAsset(asset, (err: any, reader: any) => {
if (err) {
reject(err);
} else {
const writer = createWriteStream(targetPath);
writer.on('error', reject);
writer.on('close', resolve);
reader.on('error', reject);
reader.pipe(writer);
}
});
}
}
});
});
}
interface IOptions {
repository: string;
platform: Platform;
versions: { code: string; insiders: boolean; electron: string; };
access: { hockeyAppToken: string; hockeyAppId: string; githubToken: string };
}
async function ensureVersionAndSymbols(options: IOptions) {
// Check version does not exist
console.log(`HockeyApp: checking for existing version ${options.versions.code} (${options.platform})`);
const versions = await getVersions({ accessToken: options.access.hockeyAppToken, appId: options.access.hockeyAppId });
if (!Array.isArray(versions.app_versions)) {
throw new Error(`Unexpected response: ${JSON.stringify(versions)}`);
}
if (versions.app_versions.some(v => v.version === options.versions.code)) {
console.log(`HockeyApp: Returning without uploading symbols because version ${options.versions.code} (${options.platform}) was already found`);
return;
}
// Download symbols for platform and electron version
const symbolsName = symbolsZipName(options.platform, options.versions.electron, options.versions.insiders);
const symbolsPath = await tmpFile('symbols.zip');
console.log(`HockeyApp: downloading symbols ${symbolsName} for electron ${options.versions.electron} (${options.platform}) into ${symbolsPath}`);
await downloadAsset(new (github as any)({ repo: options.repository, token: options.access.githubToken }), symbolsName, symbolsPath, options.versions.electron);
// Create version
console.log(`HockeyApp: creating new version ${options.versions.code} (${options.platform})`);
const version = await createVersion({ accessToken: options.access.hockeyAppToken, appId: options.access.hockeyAppId }, options.versions.code);
// Upload symbols
console.log(`HockeyApp: uploading symbols for version ${options.versions.code} (${options.platform})`);
await updateVersion({ id: String(version.id), accessToken: options.access.hockeyAppToken, appId: options.access.hockeyAppId }, symbolsPath);
// Cleanup
await promisify(unlink)(symbolsPath);
}
// Environment
const pakage = require('../../../package.json');
const product = require('../../../product.json');
const repository = product.electronRepository;
const electronVersion = require('../../lib/electron').getElectronVersion();
const insiders = product.quality !== 'stable';
let codeVersion = pakage.version;
if (insiders) {
codeVersion = `${codeVersion}-insider`;
}
const githubToken = process.argv[2];
const hockeyAppToken = process.argv[3];
const is64 = process.argv[4] === 'x64';
const hockeyAppId = process.argv[5];
if (process.argv.length !== 6) {
throw new Error(`HockeyApp: Unexpected number of arguments. Got ${process.argv}`);
}
let platform: Platform;
if (process.platform === 'darwin') {
platform = Platform.MAC_OS;
} else if (process.platform === 'win32') {
platform = is64 ? Platform.WIN_64 : Platform.WIN_32;
} else {
platform = Platform.LINUX_64;
}
// Create version and upload symbols in HockeyApp
if (repository && codeVersion && electronVersion && (product.quality === 'stable' || product.quality === 'insider')) {
ensureVersionAndSymbols({
repository,
platform,
versions: {
code: codeVersion,
insiders,
electron: electronVersion
},
access: {
githubToken,
hockeyAppToken,
hockeyAppId
}
}).then(() => {
console.log('HockeyApp: done');
}).catch(error => {
console.error(`HockeyApp: error ${error} (AppID: ${hockeyAppId})`);
return process.exit(1);
});
} else {
console.log(`HockeyApp: skipping due to unexpected context (repository: ${repository}, codeVersion: ${codeVersion}, electronVersion: ${electronVersion}, quality: ${product.quality})`);
}
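
A minimal sketch, added for illustration only (the interface and function names below are hypothetical), of the argument contract the script above enforces; the pipeline scripts later in this compare invoke it as symbols.js <githubToken> <hockeyAppToken> <arch> <hockeyAppId>.
// Hypothetical helper mirroring the process.argv handling in the script above.
interface SymbolsCliArgs {
	githubToken: string;
	hockeyAppToken: string;
	is64: boolean;
	hockeyAppId: string;
}
function parseSymbolsArgs(argv: string[]): SymbolsCliArgs {
	// argv layout: [node, symbols.js, githubToken, hockeyAppToken, arch, hockeyAppId]
	if (argv.length !== 6) {
		throw new Error(`Expected 4 arguments, got ${argv.length - 2}`);
	}
	return {
		githubToken: argv[2],
		hockeyAppToken: argv[3],
		is64: argv[4] === 'x64',
		hockeyAppId: argv[5]
	};
}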

View File

@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.cs.allow-jit</key>
<true/>
<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
<true/>
<key>com.apple.security.cs.allow-dyld-environment-variables</key>
<true/>
</dict>
</plist>

View File

@@ -29,7 +29,15 @@ steps:
yarn electron x64
displayName: Download Electron
# - script: | {{SQL CARBON EDIT}} remove editor checks
- script: |
yarn gulp hygiene
displayName: Run Hygiene Checks
- script: | # {{SQL CARBON EDIT}} add step
yarn strict-vscode
displayName: Run Strict Null Check.
# - script: | {{SQL CARBON EDIT}} remove step
# yarn monaco-compile-check
# displayName: Run Monaco Editor Checks
@@ -57,13 +65,12 @@ steps:
# ./scripts/test-integration.sh --tfs "Integration Tests"
# displayName: Run Integration Tests (Electron)
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-macos
targetPath: .build/crashes
displayName: 'Publish Crash Reports'
continueOnError: true
condition: failed()
# - task: PublishPipelineArtifact@0
# inputs:
# artifactName: crash-dump-macos
# targetPath: .build/crashes
# displayName: 'Publish Crash Reports'
# condition: succeededOrFailed()
- task: PublishTestResults@2
displayName: Publish Tests Results

View File

@@ -0,0 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
</dict>
</plist>

View File

@@ -157,18 +157,24 @@ steps:
artifactName: crash-dump-macos
targetPath: .build/crashes
displayName: 'Publish Crash Reports'
continueOnError: true
condition: failed()
condition: succeededOrFailed()
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`"
HELPER_APP_NAME="`echo $APP_NAME | sed -e 's/^Visual Studio //;s/\.app$//'`"
APP_FRAMEWORK_PATH="$APP_ROOT/$APP_NAME/Contents/Frameworks"
security create-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
security default-keychain -s $(agent.tempdirectory)/buildagent.keychain
security unlock-keychain -p pwd $(agent.tempdirectory)/buildagent.keychain
echo "$(macos-developer-certificate)" | base64 -D > $(agent.tempdirectory)/cert.p12
security import $(agent.tempdirectory)/cert.p12 -k $(agent.tempdirectory)/buildagent.keychain -P "$(macos-developer-certificate-key)" -T /usr/bin/codesign
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k pwd $(agent.tempdirectory)/buildagent.keychain
DEBUG=electron-osx-sign* node build/darwin/sign.js
codesign -s 99FM488X57 --deep --force --options runtime --entitlements build/azure-pipelines/darwin/entitlements.plist "$APP_ROOT"/*.app
codesign -s 99FM488X57 --force --options runtime --entitlements build/azure-pipelines/darwin/helper-gpu-entitlements.plist "$APP_FRAMEWORK_PATH/$HELPER_APP_NAME Helper (GPU).app"
codesign -s 99FM488X57 --force --options runtime --entitlements build/azure-pipelines/darwin/helper-plugin-entitlements.plist "$APP_FRAMEWORK_PATH/$HELPER_APP_NAME Helper (Plugin).app"
codesign -s 99FM488X57 --force --options runtime --entitlements build/azure-pipelines/darwin/helper-renderer-entitlements.plist "$APP_FRAMEWORK_PATH/$HELPER_APP_NAME Helper (Renderer).app"
displayName: Set Hardened Entitlements
- script: |
@@ -242,28 +248,16 @@ steps:
SessionTimeout: 60
displayName: Notarization
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`"
"$APP_ROOT/$APP_NAME/Contents/Resources/app/bin/code" --export-default-configuration=.build
displayName: Verify start after signing (export configuration)
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_HOCKEYAPP_TOKEN="$(vscode-hockeyapp-token)" \
./build/azure-pipelines/darwin/publish.sh
displayName: Publish
- script: |
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
yarn gulp upload-vscode-configuration
displayName: Upload configuration (for Bing settings search)
continueOnError: true
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true

View File

@@ -17,3 +17,11 @@ node build/azure-pipelines/common/createAsset.js \
archive-unsigned \
"vscode-server-darwin.zip" \
../vscode-server-darwin.zip
# publish hockeyapp symbols
# node build/azure-pipelines/common/symbols.js "$VSCODE_MIXIN_PASSWORD" "$VSCODE_HOCKEYAPP_TOKEN" x64 "$VSCODE_HOCKEYAPP_ID_MACOS"
# Skip HockeyApp because of a build failure.
# https://github.com/microsoft/vscode/issues/90491
# upload configuration
yarn gulp upload-vscode-configuration

View File

@@ -1,82 +0,0 @@
steps:
- task: InstallAppleCertificate@2
displayName: 'Install developer certificate'
inputs:
certSecureFile: 'osx_signing_key.p12'
condition: eq(variables['signed'], true)
- task: DownloadBuildArtifacts@0
displayName: 'Download Build Artifacts'
inputs:
downloadType: specific
itemPattern: 'drop/darwin/archive/azuredatastudio-darwin-unsigned.zip'
downloadPath: '$(Build.SourcesDirectory)/.build/'
- script: |
pushd $(Build.SourcesDirectory)/.build/drop/darwin/archive
mv azuredatastudio-darwin-unsigned.zip azuredatastudio-darwin.zip
displayName: 'Rename the file'
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
displayName: 'ESRP CodeSigning'
inputs:
ConnectedServiceName: 'Code Signing'
FolderPath: '$(Build.SourcesDirectory)/.build/drop/darwin/archive'
Pattern: 'azuredatastudio-darwin.zip'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationCode": "MacAppDeveloperSign",
"parameters": {
"Hardening": "Enable"
},
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 90
condition: and(succeeded(), eq(variables['signed'], true))
- script: |
zip -d $(Build.SourcesDirectory)/.build/drop/darwin/archive/azuredatastudio-darwin.zip "*.pkg"
displayName: Clean Archive
condition: and(succeeded(), eq(variables['signed'], true))
- task: EsrpCodeSigning@1
displayName: 'ESRP Notarization'
inputs:
ConnectedServiceName: 'Code Signing'
FolderPath: '$(Build.SourcesDirectory)/.build/drop/darwin/archive'
Pattern: 'azuredatastudio-darwin.zip'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"KeyCode": "CP-401337-Apple",
"OperationCode": "MacAppNotarize",
"Parameters": {
"BundleId": "com.microsoft.azuredatastudio-$(VSCODE_QUALITY)"
},
"ToolName": "sign",
"ToolVersion": "1.0"
}
]
SessionTimeout: 120
condition: and(succeeded(), eq(variables['signed'], true))
- task: CopyFiles@2
displayName: 'Copy Files to: $(Build.ArtifactStagingDirectory)/darwin/archive'
inputs:
SourceFolder: '$(Build.SourcesDirectory)/.build/drop/darwin/archive'
TargetFolder: '$(Build.ArtifactStagingDirectory)/darwin/archive'
- task: PublishBuildArtifacts@1
displayName: 'Publish Artifact: drop'
condition: always()
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
inputs:
failOnAlert: true

View File

@@ -171,16 +171,55 @@ steps:
pushd ../azuredatastudio-darwin
ditto -c -k --keepParent *.app $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin.zip
popd
displayName: 'Archive (no signing)'
condition: and(succeeded(), eq(variables['signed'], false))
displayName: 'Archive'
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
displayName: 'ESRP CodeSigning'
inputs:
ConnectedServiceName: 'Code Signing'
FolderPath: '$(Build.SourcesDirectory)/.build/darwin/archive'
Pattern: 'azuredatastudio-darwin.zip'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationCode": "MacAppDeveloperSign",
"parameters": {
"Hardening": "Enable"
},
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 90
condition: and(succeeded(), eq(variables['signed'], true))
- script: |
set -e
mkdir -p .build/darwin/archive
pushd ../azuredatastudio-darwin
ditto -c -k --keepParent *.app $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin-unsigned.zip
popd
displayName: 'Archive'
zip -d $(Build.SourcesDirectory)/.build/darwin/archive/azuredatastudio-darwin.zip "*.pkg"
displayName: Clean Archive
condition: and(succeeded(), eq(variables['signed'], true))
- task: EsrpCodeSigning@1
displayName: 'ESRP Notarization'
inputs:
ConnectedServiceName: 'Code Signing'
FolderPath: '$(Build.SourcesDirectory)/.build/darwin/archive'
Pattern: 'azuredatastudio-darwin.zip'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"KeyCode": "CP-401337-Apple",
"OperationCode": "MacAppNotarize",
"Parameters": {
"BundleId": "com.microsoft.azuredatastudio-$(VSCODE_QUALITY)"
},
"ToolName": "sign",
"ToolVersion": "1.0"
}
]
SessionTimeout: 120
condition: and(succeeded(), eq(variables['signed'], true))
- script: |

View File

@@ -3,10 +3,10 @@ pool:
trigger:
branches:
include: ['main', 'release/*']
include: ['master', 'release/*']
pr:
branches:
include: ['main', 'release/*']
include: ['master', 'release/*']
steps:
- task: NodeTool@0
@@ -34,8 +34,8 @@ steps:
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
# Push main branch into oss/master
git push distro origin/main:refs/heads/oss/master
# Push master branch into oss/master
git push distro origin/master:refs/heads/oss/master
# Push every release branch into oss/release
git for-each-ref --format="%(refname:short)" refs/remotes/origin/release/* | sed 's/^origin\/\(.*\)$/\0:refs\/heads\/oss\/\1/' | xargs git push distro

View File

@@ -3,10 +3,10 @@ pool:
trigger:
branches:
include: ['main']
include: ['master']
pr:
branches:
include: ['main']
include: ['master']
steps:
- task: NodeTool@0
@@ -31,10 +31,10 @@ steps:
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
git checkout origin/electron-x.y.z
git checkout origin/electron-8.0.x
git merge origin/master
# Push master branch into exploration branch
git push origin HEAD:electron-x.y.z
git push origin HEAD:electron-8.0.x
displayName: Sync & Merge Exploration

View File

@@ -81,14 +81,6 @@ steps:
# displayName: 'Publish Crash Reports'
# condition: succeededOrFailed()
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-linux
targetPath: .build/crashes
displayName: 'Publish Crash Reports'
continueOnError: true
condition: failed()
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:

View File

@@ -107,6 +107,7 @@ steps:
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
VSCODE_HOCKEYAPP_TOKEN="$(vscode-hockeyapp-token)" \
./build/azure-pipelines/linux/multiarch/$(VSCODE_ARCH)/publish.sh
displayName: Publish

View File

@@ -145,8 +145,7 @@ steps:
artifactName: crash-dump-linux
targetPath: .build/crashes
displayName: 'Publish Crash Reports'
continueOnError: true
condition: failed()
condition: succeededOrFailed()
- script: |
set -e
@@ -179,6 +178,7 @@ steps:
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
VSCODE_HOCKEYAPP_TOKEN="$(vscode-hockeyapp-token)" \
./build/azure-pipelines/linux/publish.sh
displayName: Publish

View File

@@ -27,6 +27,11 @@ rm -rf $ROOT/vscode-server-*.tar.*
node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
# Publish hockeyapp symbols
# node build/azure-pipelines/common/symbols.js "$VSCODE_MIXIN_PASSWORD" "$VSCODE_HOCKEYAPP_TOKEN" "x64" "$VSCODE_HOCKEYAPP_ID_LINUX64"
# Skip HockeyApp because of a build failure.
# https://github.com/microsoft/vscode/issues/90491
# Publish DEB
PLATFORM_DEB="linux-deb-x64"
DEB_ARCH="amd64"

View File

@@ -1,6 +1,3 @@
parameters:
extensionsToUnitTest: []
steps:
- script: |
mkdir -p .build
@@ -129,15 +126,14 @@ steps:
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- ${{each extension in parameters.extensionsToUnitTest}}:
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
DISPLAY=:10 node ./scripts/test-extensions-unit.js ${{ extension }}
displayName: 'Run ${{ extension }} Stable Extension Unit Tests'
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
DISPLAY=:10 ./scripts/test-extensions-unit.sh
displayName: 'Run Stable Extension Unit Tests'
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e
@@ -164,12 +160,12 @@ steps:
./build/azure-pipelines/linux/createDrop.sh
displayName: Create Drop
- script: |
set -e
shopt -s globstar
mkdir -p $(Build.ArtifactStagingDirectory)/test-results/coverage
cp --parents -r $(Build.SourcesDirectory)/extensions/*/coverage/** $(Build.ArtifactStagingDirectory)/test-results/coverage
displayName: Copy Coverage
- task: CopyFiles@2
displayName: 'Copy Extension Unit Test Coverage Files to: $(Build.ArtifactStagingDirectory)'
inputs:
SourceFolder: '$(Build.SourcesDirectory)/extensions'
Contents: '*/coverage/**'
TargetFolder: '$(Build.ArtifactStagingDirectory)/test-results/coverage'
- task: PublishTestResults@2
displayName: 'Publish Test Results test-results.xml'

View File

@@ -12,8 +12,6 @@ const es = require('event-stream');
const vfs = require('vinyl-fs');
const fancyLog = require('fancy-log');
const ansiColors = require('ansi-colors');
const fs = require('fs');
const path = require('path');
function main() {
const quality = process.env['VSCODE_QUALITY'];
@@ -23,7 +21,7 @@ function main() {
return;
}
const productJsonFilter = filter(f => f.relative === 'product.json', { restore: true });
const productJsonFilter = filter('**/product.json', { restore: true });
fancyLog(ansiColors.blue('[mixin]'), `Mixing in sources:`);
return vfs
@@ -31,32 +29,7 @@ function main() {
.pipe(filter(f => !f.isDirectory()))
.pipe(productJsonFilter)
.pipe(buffer())
.pipe(json(o => {
const ossProduct = JSON.parse(fs.readFileSync(path.join(__dirname, '..', '..', 'product.json'), 'utf8'));
let builtInExtensions = ossProduct.builtInExtensions;
if (Array.isArray(o.builtInExtensions)) {
fancyLog(ansiColors.blue('[mixin]'), 'Overwriting built-in extensions:', o.builtInExtensions.map(e => e.name));
builtInExtensions = o.builtInExtensions;
} else if (o.builtInExtensions) {
const include = o.builtInExtensions['include'] || [];
const exclude = o.builtInExtensions['exclude'] || [];
fancyLog(ansiColors.blue('[mixin]'), 'OSS built-in extensions:', builtInExtensions.map(e => e.name));
fancyLog(ansiColors.blue('[mixin]'), 'Including built-in extensions:', include.map(e => e.name));
fancyLog(ansiColors.blue('[mixin]'), 'Excluding built-in extensions:', exclude);
builtInExtensions = builtInExtensions.filter(ext => !include.find(e => e.name === ext.name) && !exclude.find(name => name === ext.name));
builtInExtensions = [...builtInExtensions, ...include];
fancyLog(ansiColors.blue('[mixin]'), 'Final built-in extensions:', builtInExtensions.map(e => e.name));
} else {
fancyLog(ansiColors.blue('[mixin]'), 'Inheriting OSS built-in extensions', builtInExtensions.map(e => e.name));
}
return { ...ossProduct, ...o, builtInExtensions };
}))
.pipe(json(o => Object.assign({}, require('../../product.json'), o)))
.pipe(productJsonFilter.restore)
.pipe(es.mapSync(function (f) {
fancyLog(ansiColors.blue('[mixin]'), f.relative, ansiColors.green('✔︎'));

View File

@@ -36,17 +36,6 @@ jobs:
steps:
- template: win32/product-build-win32.yml
- job: WindowsARM64
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'), eq(variables['VSCODE_BUILD_WIN32_ARM64'], 'true'))
pool:
vmImage: VS2017-Win2016
variables:
VSCODE_ARCH: arm64
dependsOn:
- Compile
steps:
- template: win32/product-build-win32-arm64.yml
- job: Linux
condition: and(succeeded(), eq(variables['VSCODE_COMPILE_ONLY'], 'false'), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
pool:
@@ -160,4 +149,4 @@ schedules:
displayName: Mon-Fri at 7:00
branches:
include:
- main
- master

View File

@@ -72,6 +72,29 @@ steps:
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), ne(variables['CacheRestored'], 'true'))
- script: |
set -e
yarn generate-github-config
displayName: Generate GitHub config
condition: succeeded()
env:
OSS_GITHUB_ID: "a5d3c261b032765a78de"
OSS_GITHUB_SECRET: $(oss-github-client-secret)
INSIDERS_GITHUB_ID: "31f02627809389d9f111"
INSIDERS_GITHUB_SECRET: $(insiders-github-client-secret)
STABLE_GITHUB_ID: "baa8a44b5e861d918709"
STABLE_GITHUB_SECRET: $(stable-github-client-secret)
EXPLORATION_GITHUB_ID: "94e8376d3a90429aeaea"
EXPLORATION_GITHUB_SECRET: $(exploration-github-client-secret)
VSO_GITHUB_ID: "3d4be8f37a0325b5817d"
VSO_GITHUB_SECRET: $(vso-github-client-secret)
VSO_PPE_GITHUB_ID: "eabf35024dc2e891a492"
VSO_PPE_GITHUB_SECRET: $(vso-ppe-github-client-secret)
VSO_DEV_GITHUB_ID: "84383ebd8a7c5f5efc5c"
VSO_DEV_GITHUB_SECRET: $(vso-dev-github-client-secret)
GITHUB_APP_ID: "Iv1.ae51e546bef24ff1"
GITHUB_APP_SECRET: $(github-app-client-secret)
- script: |
set -e
yarn postinstall

View File

@@ -26,16 +26,6 @@ jobs:
- template: darwin/sql-product-build-darwin.yml
timeoutInMinutes: 180
- job: macOS_Signing
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), eq(variables['signed'], true))
pool:
vmImage: macOS-latest
dependsOn:
- macOS
steps:
- template: darwin/sql-product-build-darwin-signing.yml
timeoutInMinutes: 60
- job: Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
pool:
@@ -45,22 +35,8 @@ jobs:
- Compile
steps:
- template: linux/sql-product-build-linux.yml
parameters:
extensionsToUnitTest: ["admin-tool-ext-win", "agent", "azurecore", "cms", "dacpac", "import", "schema-compare", "notebook", "resource-deployment", "machine-learning", "sql-database-projects"]
timeoutInMinutes: 70
- job: LinuxWeb
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'))
pool:
vmImage: 'Ubuntu-16.04'
container: linux-x64
variables:
VSCODE_ARCH: x64
dependsOn:
- Compile
steps:
- template: web/sql-product-build-web.yml
- job: Docker
condition: and(succeeded(), eq(variables['VSCODE_BUILD_DOCKER'], 'true'))
pool:
@@ -101,10 +77,15 @@ jobs:
- Docker
- Windows
- Windows_Test
- LinuxWeb
- macOS_Signing
steps:
- template: sql-release.yml
trigger: none
pr: none
schedules:
- cron: "0 5 * * Mon-Fri"
displayName: Mon-Fri at 7:00
branches:
include:
- master

View File

@@ -101,14 +101,6 @@ steps:
displayName: Compile
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- script: |
set -e
AZURE_STORAGE_ACCOUNT="$(sourcemap-storage-account)" \
AZURE_STORAGE_ACCESS_KEY="$(sourcemap-storage-key)" \
node build/azure-pipelines/upload-sourcemaps
displayName: Upload sourcemaps
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
set -e

View File

@@ -1,108 +0,0 @@
steps:
- script: |
mkdir -p .build
echo -n $BUILD_SOURCEVERSION > .build/commit
echo -n $VSCODE_QUALITY > .build/quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'BuildCache'
platformIndependent: true
alias: 'Compilation'
- script: |
set -e
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.13.0"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@3
inputs:
versionSpec: "1.x"
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'ClientToolsInfra_670062 (88d5392f-a34f-4769-b405-f597fc533613)'
KeyVaultName: ado-secrets
SecretsFilter: 'github-distro-mixin-password'
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login azuredatastudio
password $(github-distro-mixin-password)
EOF
git config user.email "andresse@microsoft.com"
git config user.name "AzureDataStudio"
displayName: Prepare tooling
- script: |
set -e
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
# - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
# inputs:
# keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: 'npm-vscode'
- script: |
set -e
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install dependencies
# condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
# - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
# inputs:
# keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: 'npm-vscode'
# condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
# - script: |
# set -e
# yarn postinstall
# displayName: Run postinstall scripts
# condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- script: |
set -e
node build/azure-pipelines/mixin
displayName: Mix in quality
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-web-min-ci
displayName: Build
# upload only the workbench.web.api.js source maps because
# we just compiled these bits in the previous step and the
# general task to upload source maps has already been run
- script: |
set -e
AZURE_STORAGE_ACCOUNT="$(sourcemap-storage-account)" \
AZURE_STORAGE_ACCESS_KEY="$(sourcemap-storage-key)" \
node build/azure-pipelines/upload-sourcemaps out-vscode-web-min out-vscode-web-min/vs/workbench/workbench.web.api.js.map
displayName: Upload sourcemaps (Web)
# - script: |
# set -e
# AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
# AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
# VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
# ./build/azure-pipelines/web/publish.sh
# displayName: Publish

View File

@@ -36,7 +36,15 @@ steps:
yarn electron
displayName: Download Electron
# - powershell: | {{SQL CARBON EDIT}} remove editor check
- script: |
yarn gulp hygiene
displayName: Run Hygiene Checks
- script: | # {{SQL CARBON EDIT}} add step
yarn strict-vscode
displayName: Run Strict Null Check
# - powershell: | {{SQL CARBON EDIT}} remove step
# yarn monaco-compile-check
# displayName: Run Monaco Editor Checks
@@ -64,13 +72,12 @@ steps:
# .\scripts\test-integration.bat --tfs "Integration Tests"
# displayName: Run Integration Tests (Electron)
- task: PublishPipelineArtifact@0
displayName: 'Publish Crash Reports'
inputs:
artifactName: crash-dump-windows
targetPath: .build\crashes
continueOnError: true
condition: failed()
# - task: PublishPipelineArtifact@0
# displayName: 'Publish Crash Reports'
# inputs:
# artifactName: crash-dump-windows
# targetPath: .build\crashes
# condition: succeededOrFailed()
- task: PublishTestResults@2
displayName: Publish Tests Results

View File

@@ -1,190 +0,0 @@
steps:
- powershell: |
mkdir .build -ea 0
"$env:BUILD_SOURCEVERSION" | Out-File -Encoding ascii -NoNewLine .build\commit
"$env:VSCODE_QUALITY" | Out-File -Encoding ascii -NoNewLine .build\quality
displayName: Prepare cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/commit, .build/quality'
targetfolder: '.build, out-build, out-vscode-min, out-vscode-reh-min, out-vscode-reh-web-min'
vstsFeed: 'npm-vscode'
platformIndependent: true
alias: 'Compilation'
- powershell: |
$ErrorActionPreference = "Stop"
exit 1
displayName: Check RestoreCache
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))
- task: NodeTool@0
inputs:
versionSpec: "12.13.0"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.x"
- task: UsePythonVersion@0
inputs:
versionSpec: '2.x'
addToPath: true
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
"machine github.com`nlogin vscode`npassword $(github-distro-mixin-password)" | Out-File "$env:USERPROFILE\_netrc" -Encoding ASCII
exec { git config user.email "vscode@microsoft.com" }
exec { git config user.name "VSCode" }
mkdir .build -ea 0
"$(VSCODE_ARCH)" | Out-File -Encoding ascii -NoNewLine .build\arch
displayName: Prepare tooling
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git" }
exec { git fetch distro }
exec { git merge $(node -p "require('./package.json').distro") }
displayName: Merge distro
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .build/arch, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:npm_config_arch="$(VSCODE_ARCH)"
$env:CHILD_CONCURRENCY="1"
exec { yarn --frozen-lockfile }
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .build/arch, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn postinstall }
displayName: Run postinstall scripts
condition: and(succeeded(), eq(variables['CacheRestored'], 'true'))
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { node build/azure-pipelines/mixin }
displayName: Mix in quality
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-min-ci" }
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-code-helper" }
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-inno-updater" }
displayName: Build
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)'
Pattern: '*.dll,*.exe,*.node'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolSign",
"parameters": [
{
"parameterName": "OpusName",
"parameterValue": "VS Code"
},
{
"parameterName": "OpusInfo",
"parameterValue": "https://code.visualstudio.com/"
},
{
"parameterName": "Append",
"parameterValue": "/as"
},
{
"parameterName": "FileDigest",
"parameterValue": "/fd \"SHA256\""
},
{
"parameterName": "PageHash",
"parameterValue": "/NPH"
},
{
"parameterName": "TimeStamp",
"parameterValue": "/tr \"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer\" /td sha256"
}
],
"toolName": "sign",
"toolVersion": "1.0"
},
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolVerify",
"parameters": [
{
"parameterName": "VerifyAll",
"parameterValue": "/all"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
- task: NuGetCommand@2
displayName: Install ESRPClient.exe
inputs:
restoreSolution: 'build\azure-pipelines\win32\ESRPClient\packages.config'
feedsToUse: config
nugetConfigPath: 'build\azure-pipelines\win32\ESRPClient\NuGet.config'
externalFeedCredentials: 3fc0b7f7-da09-4ae7-a9c8-d69824b1819b
restoreDirectory: packages
- task: ESRPImportCertTask@1
displayName: Import ESRP Request Signing Certificate
inputs:
ESRP: 'ESRP CodeSign'
- powershell: |
$ErrorActionPreference = "Stop"
.\build\azure-pipelines\win32\import-esrp-auth-cert.ps1 -AuthCertificateBase64 $(esrp-auth-certificate) -AuthCertificateKey $(esrp-auth-certificate-key)
displayName: Import ESRP Auth Certificate
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:AZURE_STORAGE_ACCESS_KEY_2 = "$(vscode-storage-key)"
$env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
.\build\azure-pipelines\win32\publish.ps1
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true

View File

@@ -154,8 +154,7 @@ steps:
artifactName: crash-dump-windows-$(VSCODE_ARCH)
targetPath: .build\crashes
displayName: 'Publish Crash Reports'
continueOnError: true
condition: failed()
condition: succeededOrFailed()
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
@@ -218,7 +217,7 @@ steps:
restoreSolution: 'build\azure-pipelines\win32\ESRPClient\packages.config'
feedsToUse: config
nugetConfigPath: 'build\azure-pipelines\win32\ESRPClient\NuGet.config'
externalFeedCredentials: 'ESRP Nuget'
externalFeedCredentials: 3fc0b7f7-da09-4ae7-a9c8-d69824b1819b
restoreDirectory: packages
- task: ESRPImportCertTask@1
@@ -236,6 +235,7 @@ steps:
$ErrorActionPreference = "Stop"
$env:AZURE_STORAGE_ACCESS_KEY_2 = "$(vscode-storage-key)"
$env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
$env:VSCODE_HOCKEYAPP_TOKEN = "$(vscode-hockeyapp-token)"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
.\build\azure-pipelines\win32\publish.ps1
displayName: Publish

View File

@@ -16,21 +16,22 @@ $ServerZip = "$Repo\.build\vscode-server-win32-$Arch.zip"
$Build = "$Root\VSCode-win32-$Arch"
# Create server archive
if ("$Arch" -ne "arm64") {
exec { xcopy $LegacyServer $Server /H /E /I }
exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r }
}
exec { xcopy $LegacyServer $Server /H /E /I }
exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r }
# get version
$PackageJson = Get-Content -Raw -Path "$Build\resources\app\package.json" | ConvertFrom-Json
$Version = $PackageJson.version
$AssetPlatform = if ("$Arch" -eq "ia32") { "win32" } else { "win32-$Arch" }
$AssetPlatform = if ("$Arch" -eq "ia32") { "win32" } else { "win32-x64" }
exec { node build/azure-pipelines/common/createAsset.js "$AssetPlatform-archive" archive "VSCode-win32-$Arch-$Version.zip" $Zip }
exec { node build/azure-pipelines/common/createAsset.js "$AssetPlatform" setup "VSCodeSetup-$Arch-$Version.exe" $SystemExe }
exec { node build/azure-pipelines/common/createAsset.js "$AssetPlatform-user" setup "VSCodeUserSetup-$Arch-$Version.exe" $UserExe }
exec { node build/azure-pipelines/common/createAsset.js "server-$AssetPlatform" archive "vscode-server-win32-$Arch.zip" $ServerZip }
if ("$Arch" -ne "arm64") {
exec { node build/azure-pipelines/common/createAsset.js "server-$AssetPlatform" archive "vscode-server-win32-$Arch.zip" $ServerZip }
}
# Skip HockeyApp because of a build failure.
# https://github.com/microsoft/vscode/issues/90491
# publish hockeyapp symbols
# $hockeyAppId = if ("$Arch" -eq "ia32") { "$env:VSCODE_HOCKEYAPP_ID_WIN32" } else { "$env:VSCODE_HOCKEYAPP_ID_WIN64" }
# exec { node build/azure-pipelines/common/symbols.js "$env:VSCODE_MIXIN_PASSWORD" "$env:VSCODE_HOCKEYAPP_TOKEN" "$Arch" $hockeyAppId }

View File

@@ -10,11 +10,11 @@ const path = require('path');
let window = null;
app.once('ready', () => {
window = new BrowserWindow({ width: 800, height: 600, webPreferences: { nodeIntegration: true, webviewTag: true, enableWebSQL: false, nativeWindowOpen: true } });
window = new BrowserWindow({ width: 800, height: 600, webPreferences: { nodeIntegration: true, webviewTag: true } });
window.setMenuBarVisibility(false);
window.loadURL(url.format({ pathname: path.join(__dirname, 'index.html'), protocol: 'file:', slashes: true }));
// window.webContents.openDevTools();
window.once('closed', () => window = null);
});
app.on('window-all-closed', () => app.quit());
app.on('window-all-closed', () => app.quit());

View File

@@ -1,61 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const codesign = require("electron-osx-sign");
const path = require("path");
const util = require("../lib/util");
const product = require("../../product.json");
async function main() {
const buildDir = process.env['AGENT_BUILDDIRECTORY'];
const tempDir = process.env['AGENT_TEMPDIRECTORY'];
if (!buildDir) {
throw new Error('$AGENT_BUILDDIRECTORY not set');
}
if (!tempDir) {
throw new Error('$AGENT_TEMPDIRECTORY not set');
}
const baseDir = path.dirname(__dirname);
const appRoot = path.join(buildDir, 'VSCode-darwin');
const appName = product.nameLong + '.app';
const appFrameworkPath = path.join(appRoot, appName, 'Contents', 'Frameworks');
const helperAppBaseName = product.nameShort;
const gpuHelperAppName = helperAppBaseName + ' Helper (GPU).app';
const pluginHelperAppName = helperAppBaseName + ' Helper (Plugin).app';
const rendererHelperAppName = helperAppBaseName + ' Helper (Renderer).app';
const defaultOpts = {
app: path.join(appRoot, appName),
platform: 'darwin',
entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'app-entitlements.plist'),
'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'app-entitlements.plist'),
hardenedRuntime: true,
'pre-auto-entitlements': false,
'pre-embed-provisioning-profile': false,
keychain: path.join(tempDir, 'buildagent.keychain'),
version: util.getElectronVersion(),
identity: '99FM488X57',
'gatekeeper-assess': false
};
const appOpts = Object.assign(Object.assign({}, defaultOpts), {
// TODO(deepak1556): Incorrectly declared type in electron-osx-sign
ignore: (filePath) => {
return filePath.includes(gpuHelperAppName) ||
filePath.includes(pluginHelperAppName) ||
filePath.includes(rendererHelperAppName);
} });
const gpuHelperOpts = Object.assign(Object.assign({}, defaultOpts), { app: path.join(appFrameworkPath, gpuHelperAppName), entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist'), 'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist') });
const pluginHelperOpts = Object.assign(Object.assign({}, defaultOpts), { app: path.join(appFrameworkPath, pluginHelperAppName), entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-plugin-entitlements.plist'), 'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-plugin-entitlements.plist') });
const rendererHelperOpts = Object.assign(Object.assign({}, defaultOpts), { app: path.join(appFrameworkPath, rendererHelperAppName), entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist'), 'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist') });
await codesign.signAsync(gpuHelperOpts);
await codesign.signAsync(pluginHelperOpts);
await codesign.signAsync(rendererHelperOpts);
await codesign.signAsync(appOpts);
}
if (require.main === module) {
main().catch(err => {
console.error(err);
process.exit(1);
});
}

View File

@@ -1,90 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as codesign from 'electron-osx-sign';
import * as path from 'path';
import * as util from '../lib/util';
import * as product from '../../product.json';
async function main(): Promise<void> {
const buildDir = process.env['AGENT_BUILDDIRECTORY'];
const tempDir = process.env['AGENT_TEMPDIRECTORY'];
if (!buildDir) {
throw new Error('$AGENT_BUILDDIRECTORY not set');
}
if (!tempDir) {
throw new Error('$AGENT_TEMPDIRECTORY not set');
}
const baseDir = path.dirname(__dirname);
const appRoot = path.join(buildDir, 'VSCode-darwin');
const appName = product.nameLong + '.app';
const appFrameworkPath = path.join(appRoot, appName, 'Contents', 'Frameworks');
const helperAppBaseName = product.nameShort;
const gpuHelperAppName = helperAppBaseName + ' Helper (GPU).app';
const pluginHelperAppName = helperAppBaseName + ' Helper (Plugin).app';
const rendererHelperAppName = helperAppBaseName + ' Helper (Renderer).app';
const defaultOpts: codesign.SignOptions = {
app: path.join(appRoot, appName),
platform: 'darwin',
entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'app-entitlements.plist'),
'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'app-entitlements.plist'),
hardenedRuntime: true,
'pre-auto-entitlements': false,
'pre-embed-provisioning-profile': false,
keychain: path.join(tempDir, 'buildagent.keychain'),
version: util.getElectronVersion(),
identity: '99FM488X57',
'gatekeeper-assess': false
};
const appOpts = {
...defaultOpts,
// TODO(deepak1556): Incorrectly declared type in electron-osx-sign
ignore: (filePath: string) => {
return filePath.includes(gpuHelperAppName) ||
filePath.includes(pluginHelperAppName) ||
filePath.includes(rendererHelperAppName);
}
};
const gpuHelperOpts: codesign.SignOptions = {
...defaultOpts,
app: path.join(appFrameworkPath, gpuHelperAppName),
entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist'),
'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-gpu-entitlements.plist'),
};
const pluginHelperOpts: codesign.SignOptions = {
...defaultOpts,
app: path.join(appFrameworkPath, pluginHelperAppName),
entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-plugin-entitlements.plist'),
'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-plugin-entitlements.plist'),
};
const rendererHelperOpts: codesign.SignOptions = {
...defaultOpts,
app: path.join(appFrameworkPath, rendererHelperAppName),
entitlements: path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist'),
'entitlements-inherit': path.join(baseDir, 'azure-pipelines', 'darwin', 'helper-renderer-entitlements.plist'),
};
await codesign.signAsync(gpuHelperOpts);
await codesign.signAsync(pluginHelperOpts);
await codesign.signAsync(rendererHelperOpts);
await codesign.signAsync(appOpts as any);
}
if (require.main === module) {
main().catch(err => {
console.error(err);
process.exit(1);
});
}

View File

@@ -127,7 +127,6 @@ const createESMSourcesAndResourcesTask = task.define('extract-editor-esm', () =>
const compileEditorESMTask = task.define('compile-editor-esm', () => {
const KEEP_PREV_ANALYSIS = false;
const FAIL_ON_PURPOSE = false;
console.log(`Launching the TS compiler at ${path.join(__dirname, '../out-editor-esm')}...`);
let result;
if (process.platform === 'win32') {
@@ -143,7 +142,7 @@ const compileEditorESMTask = task.define('compile-editor-esm', () => {
console.log(result.stdout.toString());
console.log(result.stderr.toString());
if (FAIL_ON_PURPOSE || result.status !== 0) {
if (result.status !== 0) {
console.log(`The TS Compilation failed, preparing analysis folder...`);
const destPath = path.join(__dirname, '../../vscode-monaco-editor-esm-analysis');
const keepPrevAnalysis = (KEEP_PREV_ANALYSIS && fs.existsSync(destPath));

View File

@@ -8,11 +8,9 @@ require('events').EventEmitter.defaultMaxListeners = 100;
const gulp = require('gulp');
const path = require('path');
const nodeUtil = require('util');
const tsb = require('gulp-tsb');
const es = require('event-stream');
const filter = require('gulp-filter');
const webpack = require('webpack');
const util = require('./lib/util');
const task = require('./lib/task');
const watcher = require('./lib/watch');
@@ -23,8 +21,6 @@ const nlsDev = require('vscode-nls-dev');
const root = path.dirname(__dirname);
const commit = util.getVersion(root);
const plumber = require('gulp-plumber');
const fancyLog = require('fancy-log');
const ansiColors = require('ansi-colors');
const ext = require('./lib/extensions');
const extensionsPath = path.join(path.dirname(__dirname), 'extensions');
@@ -40,7 +36,7 @@ const compilations = glob.sync('**/tsconfig.json', {
ignore: ['**/out/**', '**/node_modules/**']
});
const getBaseUrl = out => `https://sqlopsbuilds.blob.core.windows.net/sourcemaps/${commit}/${out}`;
const getBaseUrl = out => `https://ticino.blob.core.windows.net/sourcemaps/${commit}/${out}`;
const tasks = compilations.map(function (tsconfigFile) {
const absolutePath = path.join(extensionsPath, tsconfigFile);
@@ -177,78 +173,3 @@ const compileExtensionsBuildTask = task.define('compile-extensions-build', task.
gulp.task(compileExtensionsBuildTask);
exports.compileExtensionsBuildTask = compileExtensionsBuildTask;
const compileWebExtensionsTask = task.define('compile-web', () => buildWebExtensions(false));
gulp.task(compileWebExtensionsTask);
exports.compileWebExtensionsTask = compileWebExtensionsTask;
const watchWebExtensionsTask = task.define('watch-web', () => buildWebExtensions(true));
gulp.task(watchWebExtensionsTask);
exports.watchWebExtensionsTask = watchWebExtensionsTask;
async function buildWebExtensions(isWatch) {
const webpackConfigLocations = await nodeUtil.promisify(glob)(
path.join(extensionsPath, '**', 'extension-browser.webpack.config.js'),
{ ignore: ['**/node_modules'] }
);
const webpackConfigs = [];
for (const webpackConfigPath of webpackConfigLocations) {
const configOrFnOrArray = require(webpackConfigPath);
function addConfig(configOrFn) {
if (typeof configOrFn === 'function') {
webpackConfigs.push(configOrFn({}, {}));
} else {
webpackConfigs.push(configOrFn);
}
}
addConfig(configOrFnOrArray);
}
function reporter(fullStats) {
if (Array.isArray(fullStats.children)) {
for (const stats of fullStats.children) {
const outputPath = stats.outputPath;
if (outputPath) {
const relativePath = path.relative(extensionsPath, outputPath).replace(/\\/g, '/');
const match = relativePath.match(/[^\/]+(\/server|\/client)?/);
fancyLog(`Finished ${ansiColors.green('packaging web extension')} ${ansiColors.cyan(match[0])} with ${stats.errors.length} errors.`);
}
if (Array.isArray(stats.errors)) {
stats.errors.forEach(error => {
fancyLog.error(error);
});
}
if (Array.isArray(stats.warnings)) {
stats.warnings.forEach(warning => {
fancyLog.warn(warning);
});
}
}
}
}
return new Promise((resolve, reject) => {
if (isWatch) {
webpack(webpackConfigs).watch({}, (err, stats) => {
if (err) {
reject();
} else {
reporter(stats.toJson());
}
});
} else {
webpack(webpackConfigs).run((err, stats) => {
if (err) {
fancyLog.error(err);
reject();
} else {
reporter(stats.toJson());
resolve();
}
});
}
});
}

View File

@@ -60,7 +60,6 @@ const indentationFilter = [
// except specific folders
'!test/automation/out/**',
'!test/smoke/out/**',
'!extensions/typescript-language-features/test-workspace/**',
'!extensions/vscode-api-tests/testWorkspace/**',
'!extensions/vscode-api-tests/testWorkspace2/**',
'!build/monaco/**',
@@ -86,7 +85,7 @@ const indentationFilter = [
'!src/typings/**/*.d.ts',
'!extensions/**/*.d.ts',
'!**/*.{svg,exe,png,bmp,scpt,bat,cmd,cur,ttf,woff,eot,md,ps1,template,yaml,yml,d.ts.recipe,ico,icns,plist}',
'!build/{lib,download,darwin}/**/*.js',
'!build/{lib,download}/**/*.js',
'!build/**/*.sh',
'!build/azure-pipelines/**/*.js',
'!build/azure-pipelines/**/*.config',
@@ -109,7 +108,6 @@ const indentationFilter = [
'!extensions/sql-database-projects/src/test/baselines/*.xml',
'!extensions/sql-database-projects/src/test/baselines/*.json',
'!extensions/sql-database-projects/src/test/baselines/*.sqlproj',
'!extensions/sql-database-projects/BuildDirectory/SystemDacpacs/**',
'!extensions/big-data-cluster/src/bigDataCluster/controller/apiGenerated.ts',
'!extensions/big-data-cluster/src/bigDataCluster/controller/clusterApiGenerated2.ts',
'!resources/linux/snap/electron-launch'
@@ -144,7 +142,6 @@ const copyrightFilter = [
'!extensions/*/server/bin/*',
'!src/vs/editor/test/node/classification/typescript-test.ts',
'!scripts/code-web.js',
'!resources/serverless/code-web.js',
// {{SQL CARBON EDIT}}
'!extensions/notebook/src/intellisense/text.ts',
'!extensions/mssql/src/hdfs/webhdfs.ts',
@@ -177,8 +174,7 @@ const copyrightFilter = [
'!**/*.gif',
'!**/*.xlf',
'!**/*.dacpac',
'!**/*.bacpac',
'!**/*.py'
'!**/*.bacpac'
];
const jsHygieneFilter = [

View File

@@ -58,7 +58,6 @@ const nodeModules = [ // {{SQL CARBON EDIT}}
const vscodeEntryPoints = _.flatten([
buildfile.entrypoint('vs/workbench/workbench.desktop.main'),
buildfile.base,
buildfile.workerExtensionHost,
buildfile.workbenchDesktop,
buildfile.code
]);
@@ -70,23 +69,19 @@ const vscodeResources = [
'out-build/bootstrap.js',
'out-build/bootstrap-fork.js',
'out-build/bootstrap-amd.js',
'out-build/bootstrap-node.js',
'out-build/bootstrap-window.js',
'out-build/paths.js',
'out-build/vs/**/*.{svg,png,html}',
'!out-build/vs/code/browser/**/*.html',
'!out-build/vs/editor/standalone/**/*.svg',
'out-build/vs/base/common/performance.js',
'out-build/vs/base/node/languagePacks.js',
'out-build/vs/base/node/{stdForkStart.js,terminateProcess.sh,cpuUsage.sh,ps.sh}',
'out-build/vs/base/browser/ui/codicons/codicon/**',
'out-build/vs/base/parts/sandbox/electron-browser/preload.js',
'out-build/vs/workbench/browser/media/*-theme.css',
'out-build/vs/workbench/contrib/debug/**/*.json',
'out-build/vs/workbench/contrib/externalTerminal/**/*.scpt',
'out-build/vs/workbench/contrib/webview/browser/pre/*.js',
'out-build/vs/workbench/contrib/webview/electron-browser/pre/*.js',
'out-build/vs/workbench/services/extensions/worker/extensionHostWorkerMain.js',
'out-build/vs/**/markdown.css',
'out-build/vs/workbench/contrib/tasks/**/*.json',
'out-build/vs/platform/files/**/*.exe',
@@ -131,7 +126,7 @@ const optimizeVSCodeTask = task.define('optimize-vscode', task.series(
));
gulp.task(optimizeVSCodeTask);
const sourceMappingURLBase = `https://sqlopsbuilds.blob.core.windows.net/sourcemaps/${commit}`;
const sourceMappingURLBase = `https://ticino.blob.core.windows.net/sourcemaps/${commit}`;
const minifyVSCodeTask = task.define('minify-vscode', task.series(
optimizeVSCodeTask,
util.rimraf('out-vscode-min'),
@@ -189,7 +184,6 @@ function packageTask(platform, arch, sourceFolderName, destinationFolderName, op
const out = sourceFolderName;
const checksums = computeChecksums(out, [
'vs/base/parts/sandbox/electron-browser/preload.js',
'vs/workbench/workbench.desktop.main.js',
'vs/workbench/workbench.desktop.main.css',
'vs/workbench/services/extensions/node/extensionHostProcess.js',

View File

@@ -66,7 +66,6 @@ function buildWin32Setup(arch, target) {
return cb => {
const ia32AppId = target === 'system' ? product.win32AppId : product.win32UserAppId;
const x64AppId = target === 'system' ? product.win32x64AppId : product.win32x64UserAppId;
const arm64AppId = target === 'system' ? product.win32arm64AppId : product.win32arm64UserAppId;
const sourcePath = buildPath(arch);
const outputPath = setupDir(arch, target);
@@ -90,12 +89,12 @@ function buildWin32Setup(arch, target) {
ShellNameShort: product.win32ShellNameShort,
AppMutex: product.win32MutexName,
Arch: arch,
AppId: { 'ia32': ia32AppId, 'x64': x64AppId, 'arm64': arm64AppId }[arch],
IncompatibleTargetAppId: { 'ia32': product.win32AppId, 'x64': product.win32x64AppId, 'arm64': product.win32arm64AppId }[arch],
IncompatibleArchAppId: { 'ia32': x64AppId, 'x64': ia32AppId, 'arm64': ia32AppId }[arch],
AppId: arch === 'ia32' ? ia32AppId : x64AppId,
IncompatibleTargetAppId: arch === 'ia32' ? product.win32AppId : product.win32x64AppId,
IncompatibleArchAppId: arch === 'ia32' ? x64AppId : ia32AppId,
AppUserId: product.win32AppUserModelId,
ArchitecturesAllowed: { 'ia32': '', 'x64': 'x64', 'arm64': '' }[arch],
ArchitecturesInstallIn64BitMode: { 'ia32': '', 'x64': 'x64', 'arm64': '' }[arch],
ArchitecturesAllowed: arch === 'ia32' ? '' : 'x64',
ArchitecturesInstallIn64BitMode: arch === 'ia32' ? '' : 'x64',
SourceDir: sourcePath,
RepoDir: repoPath,
OutputDir: outputPath,
@@ -114,10 +113,8 @@ function defineWin32SetupTasks(arch, target) {
defineWin32SetupTasks('ia32', 'system');
defineWin32SetupTasks('x64', 'system');
defineWin32SetupTasks('arm64', 'system');
defineWin32SetupTasks('ia32', 'user');
defineWin32SetupTasks('x64', 'user');
defineWin32SetupTasks('arm64', 'user');
function archiveWin32Setup(arch) {
return cb => {
@@ -149,7 +146,6 @@ function updateIcon(executablePath) {
gulp.task(task.define('vscode-win32-ia32-inno-updater', task.series(copyInnoUpdater('ia32'), updateIcon(path.join(buildPath('ia32'), 'tools', 'inno_updater.exe')))));
gulp.task(task.define('vscode-win32-x64-inno-updater', task.series(copyInnoUpdater('x64'), updateIcon(path.join(buildPath('x64'), 'tools', 'inno_updater.exe')))));
gulp.task(task.define('vscode-win32-arm64-inno-updater', task.series(copyInnoUpdater('arm64'), updateIcon(path.join(buildPath('arm64'), 'tools', 'inno_updater.exe')))));
// CodeHelper.exe icon

View File

@@ -4,7 +4,7 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
exports.config = void 0;
exports.config = exports.getElectronVersion = void 0;
const fs = require("fs");
const path = require("path");
const vfs = require("vinyl-fs");
@@ -16,6 +16,12 @@ const electron = require('gulp-atom-electron');
const root = path.dirname(path.dirname(__dirname));
const product = JSON.parse(fs.readFileSync(path.join(root, 'product.json'), 'utf8'));
const commit = util.getVersion(root);
function getElectronVersion() {
const yarnrc = fs.readFileSync(path.join(root, '.yarnrc'), 'utf8');
const target = /^target "(.*)"$/m.exec(yarnrc)[1];
return target;
}
exports.getElectronVersion = getElectronVersion;
const darwinCreditsTemplate = product.darwinCredits && _.template(fs.readFileSync(path.join(root, product.darwinCredits), 'utf8'));
function darwinBundleDocumentType(extensions, icon) {
return {
@@ -27,7 +33,7 @@ function darwinBundleDocumentType(extensions, icon) {
};
}
exports.config = {
version: util.getElectronVersion(),
version: getElectronVersion(),
productAppName: product.nameLong,
companyName: 'Microsoft Corporation',
copyright: 'Copyright (C) 2019 Microsoft. All rights reserved',
@@ -67,7 +73,7 @@ function getElectron(arch) {
};
}
async function main(arch = process.arch) {
const version = util.getElectronVersion();
const version = getElectronVersion();
const electronPath = path.join(root, '.build', 'electron');
const versionFile = path.join(electronPath, 'version');
const isUpToDate = fs.existsSync(versionFile) && fs.readFileSync(versionFile, 'utf8') === `${version}`;

View File

@@ -19,6 +19,12 @@ const root = path.dirname(path.dirname(__dirname));
const product = JSON.parse(fs.readFileSync(path.join(root, 'product.json'), 'utf8'));
const commit = util.getVersion(root);
export function getElectronVersion(): string {
const yarnrc = fs.readFileSync(path.join(root, '.yarnrc'), 'utf8');
const target = /^target "(.*)"$/m.exec(yarnrc)![1];
return target;
}
const darwinCreditsTemplate = product.darwinCredits && _.template(fs.readFileSync(path.join(root, product.darwinCredits), 'utf8'));
function darwinBundleDocumentType(extensions: string[], icon: string) {
@@ -32,7 +38,7 @@ function darwinBundleDocumentType(extensions: string[], icon: string) {
}
export const config = {
version: util.getElectronVersion(),
version: getElectronVersion(),
productAppName: product.nameLong,
companyName: 'Microsoft Corporation',
copyright: 'Copyright (C) 2019 Microsoft. All rights reserved',
@@ -75,7 +81,7 @@ function getElectron(arch: string): () => NodeJS.ReadWriteStream {
}
async function main(arch = process.arch): Promise<void> {
const version = util.getElectronVersion();
const version = getElectronVersion();
const electronPath = path.join(root, '.build', 'electron');
const versionFile = path.join(electronPath, 'version');
const isUpToDate = fs.existsSync(versionFile) && fs.readFileSync(versionFile, 'utf8') === `${version}`;
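
As an illustration only: getElectronVersion above extracts the quoted value from the target line of the root .yarnrc. The sample contents and version number below are assumptions, not taken from the repository.
// Hypothetical .yarnrc contents; only the target line matters to the regex.
const sampleYarnrc = 'disturl "https://electronjs.org/headers"\ntarget "8.2.3"\nruntime "electron"';
const targetMatch = /^target "(.*)"$/m.exec(sampleYarnrc);
console.log(targetMatch && targetMatch[1]); // prints the assumed version, 8.2.3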

View File

@@ -13,10 +13,6 @@ module.exports = new class ApiLiteralOrTypes {
create(context) {
return {
['TSTypeAnnotation TSUnionType TSLiteralType']: (node) => {
var _a;
if (((_a = node.literal) === null || _a === void 0 ? void 0 : _a.type) === 'TSNullKeyword') {
return;
}
context.report({
node: node,
messageId: 'useEnum'

View File

@@ -15,9 +15,6 @@ export = new class ApiLiteralOrTypes implements eslint.Rule.RuleModule {
create(context: eslint.Rule.RuleContext): eslint.Rule.RuleListener {
return {
['TSTypeAnnotation TSUnionType TSLiteralType']: (node: any) => {
if (node.literal?.type === 'TSNullKeyword') {
return;
}
context.report({
node: node,
messageId: 'useEnum'

View File
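
For context, the ApiLiteralOrTypes rule above reports literal types used inside union types in API type annotations (messageId 'useEnum'); the removed branch was an exemption for null literals. A hedged illustration of the kind of declaration it targets — the interface and enum names here are hypothetical, not part of the codebase:

// A declaration the rule would flag: a union of string literals directly in
// a type annotation.
export interface TerminalOptions {
	cursorStyle?: 'block' | 'underline' | 'bar';
}

// The shape the 'useEnum' message asks for instead: a named enum.
export enum TerminalCursorStyle {
	Block = 'block',
	Underline = 'underline',
	Bar = 'bar'
}
export interface TerminalOptionsWithEnum {
	cursorStyle?: TerminalCursorStyle;
}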

@@ -4,7 +4,7 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
exports.translatePackageJSON = exports.packageRebuildExtensionsStream = exports.cleanRebuildExtensions = exports.packageExternalExtensionsStream = exports.scanBuiltinExtensions = exports.packageMarketplaceWebExtensionsStream = exports.packageMarketplaceExtensionsStream = exports.packageLocalWebExtensionsStream = exports.packageLocalExtensionsStream = exports.fromMarketplace = void 0;
exports.packageRebuildExtensionsStream = exports.cleanRebuildExtensions = exports.packageExternalExtensionsStream = exports.packageMarketplaceExtensionsStream = exports.packageLocalExtensionsStream = exports.fromMarketplace = void 0;
const es = require("event-stream");
const fs = require("fs");
const glob = require("glob");
@@ -27,8 +27,12 @@ const webpackGulp = require('webpack-stream');
const util = require('./util');
const root = path.dirname(path.dirname(__dirname));
const commit = util.getVersion(root);
const sourceMappingURLBase = `https://sqlopsbuilds.blob.core.windows.net/sourcemaps/${commit}`;
function minimizeLanguageJSON(input) {
const sourceMappingURLBase = `https://ticino.blob.core.windows.net/sourcemaps/${commit}`;
function fromLocal(extensionPath) {
const webpackFilename = path.join(extensionPath, 'extension.webpack.config.js');
const input = fs.existsSync(webpackFilename)
? fromLocalWebpack(extensionPath)
: fromLocalNormal(extensionPath);
const tmLanguageJsonFilter = filter('**/*.tmLanguage.json', { restore: true });
return input
.pipe(tmLanguageJsonFilter)
@@ -39,49 +43,12 @@ function minimizeLanguageJSON(input) {
}))
.pipe(tmLanguageJsonFilter.restore);
}
function updateExtensionPackageJSON(input, update) {
const packageJsonFilter = filter('extensions/*/package.json', { restore: true });
return input
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(es.mapSync((f) => {
const data = JSON.parse(f.contents.toString('utf8'));
f.contents = Buffer.from(JSON.stringify(update(data)));
return f;
}))
.pipe(packageJsonFilter.restore);
}
function fromLocal(extensionPath, forWeb) {
const webpackConfigFileName = forWeb ? 'extension-browser.webpack.config.js' : 'extension.webpack.config.js';
const isWebPacked = fs.existsSync(path.join(extensionPath, webpackConfigFileName));
let input = isWebPacked
? fromLocalWebpack(extensionPath, webpackConfigFileName)
: fromLocalNormal(extensionPath);
if (forWeb) {
input = updateExtensionPackageJSON(input, (data) => {
if (data.browser) {
data.main = data.browser;
}
data.extensionKind = ['web'];
return data;
});
}
else if (isWebPacked) {
input = updateExtensionPackageJSON(input, (data) => {
if (data.main) {
data.main = data.main.replace('/out/', '/dist/');
}
return data;
});
}
return minimizeLanguageJSON(input);
}
function fromLocalWebpack(extensionPath, webpackConfigFileName) {
function fromLocalWebpack(extensionPath) {
const result = es.through();
const packagedDependencies = [];
const packageJsonConfig = require(path.join(extensionPath, 'package.json'));
if (packageJsonConfig.dependencies) {
const webpackRootConfig = require(path.join(extensionPath, webpackConfigFileName));
const webpackRootConfig = require(path.join(extensionPath, 'extension.webpack.config.js'));
for (const key in webpackRootConfig.externals) {
if (key in packageJsonConfig.dependencies) {
packagedDependencies.push(key);
@@ -97,9 +64,30 @@ function fromLocalWebpack(extensionPath, webpackConfigFileName) {
base: extensionPath,
contents: fs.createReadStream(filePath)
}));
const filesStream = es.readArray(files);
// check for a webpack configuration files, then invoke webpack
// and merge its output with the files stream.
const webpackConfigLocations = glob.sync(path.join(extensionPath, '**', webpackConfigFileName), { ignore: ['**/node_modules'] });
// and merge its output with the files stream. also rewrite the package.json
// file to a new entry point
const webpackConfigLocations = glob.sync(path.join(extensionPath, '/**/extension.webpack.config.js'), { ignore: ['**/node_modules'] });
const packageJsonFilter = filter(f => {
if (path.basename(f.path) === 'package.json') {
// only modify package.json's next to the webpack file.
// to be safe, use existsSync instead of path comparison.
return fs.existsSync(path.join(path.dirname(f.path), 'extension.webpack.config.js'));
}
return false;
}, { restore: true });
const patchFilesStream = filesStream
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json((data) => {
if (data.main) {
// hardcoded entry point directory!
data.main = data.main.replace('/out/', '/dist/');
}
return data;
}))
.pipe(packageJsonFilter.restore);
const webpackStreams = webpackConfigLocations.map(webpackConfigPath => {
const webpackDone = (err, stats) => {
fancyLog(`Bundled extension: ${ansiColors.yellow(path.join(path.basename(extensionPath), path.relative(extensionPath, webpackConfigPath)))}...`);
@@ -133,7 +121,7 @@ function fromLocalWebpack(extensionPath, webpackConfigFileName) {
this.emit('data', data);
}));
});
es.merge(...webpackStreams, es.readArray(files))
es.merge(...webpackStreams, patchFilesStream)
// .pipe(es.through(function (data) {
// // debug
// console.log('out', data.path, data.contents.length);
@@ -221,8 +209,7 @@ const externalExtensions = [
'liveshare',
'sql-database-projects',
'machine-learning',
'sql-assessment',
'asde-deployment'
'sql-assessment'
];
// extensions that require a rebuild since they have native parts
const rebuildExtensions = [
@@ -240,32 +227,15 @@ function packageLocalExtensionsStream() {
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
.filter(({ name }) => externalExtensions.indexOf(name) === -1); // {{SQL CARBON EDIT}} Remove external Extensions with separate package
const nodeModules = gulp.src('extensions/node_modules/**', { base: '.' });
const localExtensions = localExtensionDescriptions.map(extension => {
return fromLocal(extension.path, false)
return fromLocal(extension.path)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
});
const nodeModules = gulp.src('extensions/node_modules/**', { base: '.' });
return es.merge(nodeModules, ...localExtensions)
.pipe(util2.setExecutableBit(['**/*.sh']));
}
exports.packageLocalExtensionsStream = packageLocalExtensionsStream;
function packageLocalWebExtensionsStream() {
const localExtensionDescriptions = glob.sync('extensions/*/package.json')
.filter(manifestPath => {
const packageJsonConfig = require(path.join(root, manifestPath));
return !packageJsonConfig.main || packageJsonConfig.browser;
})
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
});
return es.merge(...localExtensionDescriptions.map(extension => {
return fromLocal(extension.path, true)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
}));
}
exports.packageLocalWebExtensionsStream = packageLocalWebExtensionsStream;
function packageMarketplaceExtensionsStream() {
const extensions = builtInExtensions.map(extension => {
return fromMarketplace(extension.name, extension.version, extension.metadata)
@@ -275,54 +245,6 @@ function packageMarketplaceExtensionsStream() {
.pipe(util2.setExecutableBit(['**/*.sh']));
}
exports.packageMarketplaceExtensionsStream = packageMarketplaceExtensionsStream;
function packageMarketplaceWebExtensionsStream(builtInExtensions) {
const extensions = builtInExtensions
.map(extension => {
const input = fromMarketplace(extension.name, extension.version, extension.metadata)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
return updateExtensionPackageJSON(input, (data) => {
if (data.main) {
data.browser = data.main;
}
data.extensionKind = ['web'];
return data;
});
});
return es.merge(extensions);
}
exports.packageMarketplaceWebExtensionsStream = packageMarketplaceWebExtensionsStream;
function scanBuiltinExtensions(extensionsRoot, forWeb) {
const scannedExtensions = [];
const extensionsFolders = fs.readdirSync(extensionsRoot);
for (const extensionFolder of extensionsFolders) {
const packageJSONPath = path.join(extensionsRoot, extensionFolder, 'package.json');
if (!fs.existsSync(packageJSONPath)) {
continue;
}
let packageJSON = JSON.parse(fs.readFileSync(packageJSONPath).toString('utf8'));
const extensionKind = packageJSON['extensionKind'] || [];
if (forWeb && extensionKind.indexOf('web') === -1) {
continue;
}
const children = fs.readdirSync(path.join(extensionsRoot, extensionFolder));
const packageNLS = children.filter(child => child === 'package.nls.json')[0];
const readme = children.filter(child => /^readme(\.txt|\.md|)$/i.test(child))[0];
const changelog = children.filter(child => /^changelog(\.txt|\.md|)$/i.test(child))[0];
if (packageNLS) {
// temporary
packageJSON = translatePackageJSON(packageJSON, path.join(extensionsRoot, extensionFolder, packageNLS));
}
scannedExtensions.push({
extensionPath: extensionFolder,
packageJSON,
packageNLSPath: packageNLS ? path.join(extensionFolder, packageNLS) : undefined,
readmePath: readme ? path.join(extensionFolder, readme) : undefined,
changelogPath: changelog ? path.join(extensionFolder, changelog) : undefined,
});
}
return scannedExtensions;
}
exports.scanBuiltinExtensions = scanBuiltinExtensions;
function packageExternalExtensionsStream() {
const extenalExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
@@ -332,13 +254,13 @@ function packageExternalExtensionsStream() {
})
.filter(({ name }) => externalExtensions.indexOf(name) >= 0);
const builtExtensions = extenalExtensionDescriptions.map(extension => {
return fromLocal(extension.path, false)
return fromLocal(extension.path)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
});
return es.merge(builtExtensions);
}
exports.packageExternalExtensionsStream = packageExternalExtensionsStream;
// {{SQL CARBON EDIT}} start
// {{SQL CARBON EDIT}} - End
function cleanRebuildExtensions(root) {
return Promise.all(rebuildExtensions.map(async (e) => {
await util2.rimraf(path.join(root, e))();
@@ -354,34 +276,9 @@ function packageRebuildExtensionsStream() {
})
.filter(({ name }) => rebuildExtensions.indexOf(name) >= 0);
const builtExtensions = extenalExtensionDescriptions.map(extension => {
return fromLocal(extension.path, false)
return fromLocal(extension.path)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
});
return es.merge(builtExtensions);
}
exports.packageRebuildExtensionsStream = packageRebuildExtensionsStream;
// {{SQL CARBON EDIT}} end
function translatePackageJSON(packageJSON, packageNLSPath) {
const CharCode_PC = '%'.charCodeAt(0);
const packageNls = JSON.parse(fs.readFileSync(packageNLSPath).toString());
const translate = (obj) => {
for (let key in obj) {
const val = obj[key];
if (Array.isArray(val)) {
val.forEach(translate);
}
else if (val && typeof val === 'object') {
translate(val);
}
else if (typeof val === 'string' && val.charCodeAt(0) === CharCode_PC && val.charCodeAt(val.length - 1) === CharCode_PC) {
const translated = packageNls[val.substr(1, val.length - 2)];
if (translated) {
obj[key] = translated;
}
}
}
};
translate(packageJSON);
return packageJSON;
}
exports.translatePackageJSON = translatePackageJSON;

View File
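
The rewrite above replaces inline package.json patching in fromLocalWebpack with a reusable updateExtensionPackageJSON stream helper. A sketch of the underlying gulp pattern, assuming the same gulp-filter, gulp-buffer and event-stream helpers used in the file above; the function name here is illustrative:

import * as es from 'event-stream';
import * as File from 'vinyl';
const filter = require('gulp-filter');
const buffer = require('gulp-buffer');

// Patch every extensions/*/package.json flowing through the stream while
// leaving all other files untouched.
function updatePackageJSON(input: NodeJS.ReadWriteStream, update: (data: any) => any): NodeJS.ReadWriteStream {
	const packageJsonFilter = filter('extensions/*/package.json', { restore: true });
	return input
		.pipe(packageJsonFilter)          // only package.json files continue down this branch
		.pipe(buffer())                   // ensure contents are a Buffer before JSON.parse
		.pipe(es.mapSync((f: File) => {
			const data = JSON.parse(f.contents.toString('utf8'));
			f.contents = Buffer.from(JSON.stringify(update(data)));
			return f;
		}))
		.pipe(packageJsonFilter.restore); // re-merge the files that were filtered out
}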

@@ -26,10 +26,16 @@ const webpackGulp = require('webpack-stream');
const util = require('./util');
const root = path.dirname(path.dirname(__dirname));
const commit = util.getVersion(root);
const sourceMappingURLBase = `https://sqlopsbuilds.blob.core.windows.net/sourcemaps/${commit}`;
const sourceMappingURLBase = `https://ticino.blob.core.windows.net/sourcemaps/${commit}`;
function fromLocal(extensionPath: string): Stream {
const webpackFilename = path.join(extensionPath, 'extension.webpack.config.js');
const input = fs.existsSync(webpackFilename)
? fromLocalWebpack(extensionPath)
: fromLocalNormal(extensionPath);
function minimizeLanguageJSON(input: Stream): Stream {
const tmLanguageJsonFilter = filter('**/*.tmLanguage.json', { restore: true });
return input
.pipe(tmLanguageJsonFilter)
.pipe(buffer())
@@ -40,55 +46,13 @@ function minimizeLanguageJSON(input: Stream): Stream {
.pipe(tmLanguageJsonFilter.restore);
}
function updateExtensionPackageJSON(input: Stream, update: (data: any) => any): Stream {
const packageJsonFilter = filter('extensions/*/package.json', { restore: true });
return input
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(es.mapSync((f: File) => {
const data = JSON.parse(f.contents.toString('utf8'));
f.contents = Buffer.from(JSON.stringify(update(data)));
return f;
}))
.pipe(packageJsonFilter.restore);
}
function fromLocal(extensionPath: string, forWeb: boolean): Stream {
const webpackConfigFileName = forWeb ? 'extension-browser.webpack.config.js' : 'extension.webpack.config.js';
const isWebPacked = fs.existsSync(path.join(extensionPath, webpackConfigFileName));
let input = isWebPacked
? fromLocalWebpack(extensionPath, webpackConfigFileName)
: fromLocalNormal(extensionPath);
if (forWeb) {
input = updateExtensionPackageJSON(input, (data: any) => {
if (data.browser) {
data.main = data.browser;
}
data.extensionKind = ['web'];
return data;
});
} else if (isWebPacked) {
input = updateExtensionPackageJSON(input, (data: any) => {
if (data.main) {
data.main = data.main.replace('/out/', '/dist/');
}
return data;
});
}
return minimizeLanguageJSON(input);
}
function fromLocalWebpack(extensionPath: string, webpackConfigFileName: string): Stream {
function fromLocalWebpack(extensionPath: string): Stream {
const result = es.through();
const packagedDependencies: string[] = [];
const packageJsonConfig = require(path.join(extensionPath, 'package.json'));
if (packageJsonConfig.dependencies) {
const webpackRootConfig = require(path.join(extensionPath, webpackConfigFileName));
const webpackRootConfig = require(path.join(extensionPath, 'extension.webpack.config.js'));
for (const key in webpackRootConfig.externals) {
if (key in packageJsonConfig.dependencies) {
packagedDependencies.push(key);
@@ -106,13 +70,38 @@ function fromLocalWebpack(extensionPath: string, webpackConfigFileName: string):
contents: fs.createReadStream(filePath) as any
}));
const filesStream = es.readArray(files);
// check for a webpack configuration files, then invoke webpack
// and merge its output with the files stream.
// and merge its output with the files stream. also rewrite the package.json
// file to a new entry point
const webpackConfigLocations = (<string[]>glob.sync(
path.join(extensionPath, '**', webpackConfigFileName),
path.join(extensionPath, '/**/extension.webpack.config.js'),
{ ignore: ['**/node_modules'] }
));
const packageJsonFilter = filter(f => {
if (path.basename(f.path) === 'package.json') {
// only modify package.json's next to the webpack file.
// to be safe, use existsSync instead of path comparison.
return fs.existsSync(path.join(path.dirname(f.path), 'extension.webpack.config.js'));
}
return false;
}, { restore: true });
const patchFilesStream = filesStream
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json((data: any) => {
if (data.main) {
// hardcoded entry point directory!
data.main = data.main.replace('/out/', '/dist/');
}
return data;
}))
.pipe(packageJsonFilter.restore);
const webpackStreams = webpackConfigLocations.map(webpackConfigPath => {
const webpackDone = (err: any, stats: any) => {
@@ -154,7 +143,7 @@ function fromLocalWebpack(extensionPath: string, webpackConfigFileName: string):
}));
});
es.merge(...webpackStreams, es.readArray(files))
es.merge(...webpackStreams, patchFilesStream)
// .pipe(es.through(function (data) {
// // debug
// console.log('out', data.path, data.contents.length);
@@ -255,8 +244,7 @@ const externalExtensions = [
'liveshare',
'sql-database-projects',
'machine-learning',
'sql-assessment',
'asde-deployment'
'sql-assessment'
];
// extensions that require a rebuild since they have native parts
@@ -285,35 +273,16 @@ export function packageLocalExtensionsStream(): NodeJS.ReadWriteStream {
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
.filter(({ name }) => externalExtensions.indexOf(name) === -1); // {{SQL CARBON EDIT}} Remove external Extensions with separate package
const nodeModules = gulp.src('extensions/node_modules/**', { base: '.' });
const localExtensions = localExtensionDescriptions.map(extension => {
return fromLocal(extension.path, false)
return fromLocal(extension.path)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
});
const nodeModules = gulp.src('extensions/node_modules/**', { base: '.' });
return es.merge(nodeModules, ...localExtensions)
.pipe(util2.setExecutableBit(['**/*.sh']));
}
export function packageLocalWebExtensionsStream(): NodeJS.ReadWriteStream {
const localExtensionDescriptions = (<string[]>glob.sync('extensions/*/package.json'))
.filter(manifestPath => {
const packageJsonConfig = require(path.join(root, manifestPath));
return !packageJsonConfig.main || packageJsonConfig.browser;
})
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
});
return es.merge(...localExtensionDescriptions.map(extension => {
return fromLocal(extension.path, true)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
}));
}
export function packageMarketplaceExtensionsStream(): NodeJS.ReadWriteStream {
const extensions = builtInExtensions.map(extension => {
return fromMarketplace(extension.name, extension.version, extension.metadata)
@@ -324,63 +293,6 @@ export function packageMarketplaceExtensionsStream(): NodeJS.ReadWriteStream {
.pipe(util2.setExecutableBit(['**/*.sh']));
}
export function packageMarketplaceWebExtensionsStream(builtInExtensions: IBuiltInExtension[]): NodeJS.ReadWriteStream {
const extensions = builtInExtensions
.map(extension => {
const input = fromMarketplace(extension.name, extension.version, extension.metadata)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
return updateExtensionPackageJSON(input, (data: any) => {
if (data.main) {
data.browser = data.main;
}
data.extensionKind = ['web'];
return data;
});
});
return es.merge(extensions);
}
export interface IScannedBuiltinExtension {
extensionPath: string,
packageJSON: any,
packageNLSPath?: string,
readmePath?: string,
changelogPath?: string,
}
export function scanBuiltinExtensions(extensionsRoot: string, forWeb: boolean): IScannedBuiltinExtension[] {
const scannedExtensions: IScannedBuiltinExtension[] = [];
const extensionsFolders = fs.readdirSync(extensionsRoot);
for (const extensionFolder of extensionsFolders) {
const packageJSONPath = path.join(extensionsRoot, extensionFolder, 'package.json');
if (!fs.existsSync(packageJSONPath)) {
continue;
}
let packageJSON = JSON.parse(fs.readFileSync(packageJSONPath).toString('utf8'));
const extensionKind: string[] = packageJSON['extensionKind'] || [];
if (forWeb && extensionKind.indexOf('web') === -1) {
continue;
}
const children = fs.readdirSync(path.join(extensionsRoot, extensionFolder));
const packageNLS = children.filter(child => child === 'package.nls.json')[0];
const readme = children.filter(child => /^readme(\.txt|\.md|)$/i.test(child))[0];
const changelog = children.filter(child => /^changelog(\.txt|\.md|)$/i.test(child))[0];
if (packageNLS) {
// temporary
packageJSON = translatePackageJSON(packageJSON, path.join(extensionsRoot, extensionFolder, packageNLS));
}
scannedExtensions.push({
extensionPath: extensionFolder,
packageJSON,
packageNLSPath: packageNLS ? path.join(extensionFolder, packageNLS) : undefined,
readmePath: readme ? path.join(extensionFolder, readme) : undefined,
changelogPath: changelog ? path.join(extensionFolder, changelog) : undefined,
});
}
return scannedExtensions;
}
export function packageExternalExtensionsStream(): NodeJS.ReadWriteStream {
const extenalExtensionDescriptions = (<string[]>glob.sync('extensions/*/package.json'))
.map(manifestPath => {
@@ -391,14 +303,14 @@ export function packageExternalExtensionsStream(): NodeJS.ReadWriteStream {
.filter(({ name }) => externalExtensions.indexOf(name) >= 0);
const builtExtensions = extenalExtensionDescriptions.map(extension => {
return fromLocal(extension.path, false)
return fromLocal(extension.path)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
});
return es.merge(builtExtensions);
}
// {{SQL CARBON EDIT}} - End
// {{SQL CARBON EDIT}} start
export function cleanRebuildExtensions(root: string): Promise<void> {
return Promise.all(rebuildExtensions.map(async e => {
await util2.rimraf(path.join(root, e))();
@@ -415,32 +327,9 @@ export function packageRebuildExtensionsStream(): NodeJS.ReadWriteStream {
.filter(({ name }) => rebuildExtensions.indexOf(name) >= 0);
const builtExtensions = extenalExtensionDescriptions.map(extension => {
return fromLocal(extension.path, false)
return fromLocal(extension.path)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
});
return es.merge(builtExtensions);
}
// {{SQL CARBON EDIT}} end
export function translatePackageJSON(packageJSON: string, packageNLSPath: string) {
const CharCode_PC = '%'.charCodeAt(0);
const packageNls = JSON.parse(fs.readFileSync(packageNLSPath).toString());
const translate = (obj: any) => {
for (let key in obj) {
const val = obj[key];
if (Array.isArray(val)) {
val.forEach(translate);
} else if (val && typeof val === 'object') {
translate(val);
} else if (typeof val === 'string' && val.charCodeAt(0) === CharCode_PC && val.charCodeAt(val.length - 1) === CharCode_PC) {
const translated = packageNls[val.substr(1, val.length - 2)];
if (translated) {
obj[key] = translated;
}
}
}
};
translate(packageJSON);
return packageJSON;
}

View File
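
translatePackageJSON, shown in the removed hunks above, substitutes %key% placeholders in a package.json with strings from the extension's package.nls.json. A self-contained sketch of that substitution, assuming the NLS data has already been parsed into a key-to-string map:

// Walk a parsed package.json and replace "%some.key%" values with entries
// from package.nls.json (looked up without the surrounding % characters).
function translatePackageJSON(packageJSON: any, packageNls: { [key: string]: string }): any {
	const translate = (obj: any) => {
		for (const key in obj) {
			const val = obj[key];
			if (Array.isArray(val)) {
				val.forEach(translate);
			} else if (val && typeof val === 'object') {
				translate(val);
			} else if (typeof val === 'string' && val.startsWith('%') && val.endsWith('%')) {
				const translated = packageNls[val.slice(1, -1)];
				if (translated) {
					obj[key] = translated;
				}
			}
		}
	};
	translate(packageJSON);
	return packageJSON;
}

// e.g. translatePackageJSON({ displayName: '%ext.displayName%' }, { 'ext.displayName': 'My Extension' })
//      -> { displayName: 'My Extension' }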

@@ -16,7 +16,7 @@ const https = require("https");
const gulp = require("gulp");
const fancyLog = require("fancy-log");
const ansiColors = require("ansi-colors");
const iconv = require("iconv-lite-umd");
const iconv = require("iconv-lite");
const NUMBER_OF_CONCURRENT_DOWNLOADS = 4;
function log(message, ...rest) {
fancyLog(ansiColors.green('[i18n]'), message, ...rest);
@@ -101,158 +101,161 @@ class TextModel {
return this._lines;
}
}
class XLF {
constructor(project) {
this.project = project;
this.buffer = [];
this.files = Object.create(null);
this.numberOfMessages = 0;
}
toString() {
this.appendHeader();
for (let file in this.files) {
this.appendNewLine(`<file original="${file}" source-language="en" datatype="plaintext"><body>`, 2);
for (let item of this.files[file]) {
this.addStringItem(file, item);
}
this.appendNewLine('</body></file>', 2);
let XLF = /** @class */ (() => {
class XLF {
constructor(project) {
this.project = project;
this.buffer = [];
this.files = Object.create(null);
this.numberOfMessages = 0;
}
this.appendFooter();
return this.buffer.join('\r\n');
}
addFile(original, keys, messages) {
if (keys.length === 0) {
console.log('No keys in ' + original);
return;
}
if (keys.length !== messages.length) {
throw new Error(`Unmatching keys(${keys.length}) and messages(${messages.length}).`);
}
this.numberOfMessages += keys.length;
this.files[original] = [];
let existingKeys = new Set();
for (let i = 0; i < keys.length; i++) {
let key = keys[i];
let realKey;
let comment;
if (Is.string(key)) {
realKey = key;
comment = undefined;
}
else if (LocalizeInfo.is(key)) {
realKey = key.key;
if (key.comment && key.comment.length > 0) {
comment = key.comment.map(comment => encodeEntities(comment)).join('\r\n');
toString() {
this.appendHeader();
for (let file in this.files) {
this.appendNewLine(`<file original="${file}" source-language="en" datatype="plaintext"><body>`, 2);
for (let item of this.files[file]) {
this.addStringItem(file, item);
}
this.appendNewLine('</body></file>', 2);
}
if (!realKey || existingKeys.has(realKey)) {
continue;
this.appendFooter();
return this.buffer.join('\r\n');
}
addFile(original, keys, messages) {
if (keys.length === 0) {
console.log('No keys in ' + original);
return;
}
existingKeys.add(realKey);
let message = encodeEntities(messages[i]);
this.files[original].push({ id: realKey, message: message, comment: comment });
if (keys.length !== messages.length) {
throw new Error(`Unmatching keys(${keys.length}) and messages(${messages.length}).`);
}
this.numberOfMessages += keys.length;
this.files[original] = [];
let existingKeys = new Set();
for (let i = 0; i < keys.length; i++) {
let key = keys[i];
let realKey;
let comment;
if (Is.string(key)) {
realKey = key;
comment = undefined;
}
else if (LocalizeInfo.is(key)) {
realKey = key.key;
if (key.comment && key.comment.length > 0) {
comment = key.comment.map(comment => encodeEntities(comment)).join('\r\n');
}
}
if (!realKey || existingKeys.has(realKey)) {
continue;
}
existingKeys.add(realKey);
let message = encodeEntities(messages[i]);
this.files[original].push({ id: realKey, message: message, comment: comment });
}
}
addStringItem(file, item) {
if (!item.id || item.message === undefined || item.message === null) {
throw new Error(`No item ID or value specified: ${JSON.stringify(item)}. File: ${file}`);
}
if (item.message.length === 0) {
log(`Item with id ${item.id} in file ${file} has an empty message.`);
}
this.appendNewLine(`<trans-unit id="${item.id}">`, 4);
this.appendNewLine(`<source xml:lang="en">${item.message}</source>`, 6);
if (item.comment) {
this.appendNewLine(`<note>${item.comment}</note>`, 6);
}
this.appendNewLine('</trans-unit>', 4);
}
appendHeader() {
this.appendNewLine('<?xml version="1.0" encoding="utf-8"?>', 0);
this.appendNewLine('<xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2">', 0);
}
appendFooter() {
this.appendNewLine('</xliff>', 0);
}
appendNewLine(content, indent) {
let line = new Line(indent);
line.append(content);
this.buffer.push(line.toString());
}
}
addStringItem(file, item) {
if (!item.id || item.message === undefined || item.message === null) {
throw new Error(`No item ID or value specified: ${JSON.stringify(item)}. File: ${file}`);
}
if (item.message.length === 0) {
log(`Item with id ${item.id} in file ${file} has an empty message.`);
}
this.appendNewLine(`<trans-unit id="${item.id}">`, 4);
this.appendNewLine(`<source xml:lang="en">${item.message}</source>`, 6);
if (item.comment) {
this.appendNewLine(`<note>${item.comment}</note>`, 6);
}
this.appendNewLine('</trans-unit>', 4);
}
appendHeader() {
this.appendNewLine('<?xml version="1.0" encoding="utf-8"?>', 0);
this.appendNewLine('<xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2">', 0);
}
appendFooter() {
this.appendNewLine('</xliff>', 0);
}
appendNewLine(content, indent) {
let line = new Line(indent);
line.append(content);
this.buffer.push(line.toString());
}
}
XLF.parsePseudo = function (xlfString) {
return new Promise((resolve) => {
let parser = new xml2js.Parser();
let files = [];
parser.parseString(xlfString, function (_err, result) {
const fileNodes = result['xliff']['file'];
fileNodes.forEach(file => {
const originalFilePath = file.$.original;
const messages = {};
const transUnits = file.body[0]['trans-unit'];
if (transUnits) {
transUnits.forEach((unit) => {
const key = unit.$.id;
const val = pseudify(unit.source[0]['_'].toString());
if (key && val) {
messages[key] = decodeEntities(val);
}
});
files.push({ messages: messages, originalFilePath: originalFilePath, language: 'ps' });
}
});
resolve(files);
});
});
};
XLF.parse = function (xlfString) {
return new Promise((resolve, reject) => {
let parser = new xml2js.Parser();
let files = [];
parser.parseString(xlfString, function (err, result) {
if (err) {
reject(new Error(`XLF parsing error: Failed to parse XLIFF string. ${err}`));
}
const fileNodes = result['xliff']['file'];
if (!fileNodes) {
reject(new Error(`XLF parsing error: XLIFF file does not contain "xliff" or "file" node(s) required for parsing.`));
}
fileNodes.forEach((file) => {
const originalFilePath = file.$.original;
if (!originalFilePath) {
reject(new Error(`XLF parsing error: XLIFF file node does not contain original attribute to determine the original location of the resource file.`));
}
let language = file.$['target-language'];
if (!language) {
reject(new Error(`XLF parsing error: XLIFF file node does not contain target-language attribute to determine translated language.`));
}
const messages = {};
const transUnits = file.body[0]['trans-unit'];
if (transUnits) {
transUnits.forEach((unit) => {
const key = unit.$.id;
if (!unit.target) {
return; // No translation available
}
let val = unit.target[0];
if (typeof val !== 'string') {
val = val._;
}
if (key && val) {
messages[key] = decodeEntities(val);
}
else {
reject(new Error(`XLF parsing error: XLIFF file ${originalFilePath} does not contain full localization data. ID or target translation for one of the trans-unit nodes is not present.`));
}
});
files.push({ messages: messages, originalFilePath: originalFilePath, language: language.toLowerCase() });
}
});
resolve(files);
});
});
};
return XLF;
})();
exports.XLF = XLF;
XLF.parsePseudo = function (xlfString) {
return new Promise((resolve) => {
let parser = new xml2js.Parser();
let files = [];
parser.parseString(xlfString, function (_err, result) {
const fileNodes = result['xliff']['file'];
fileNodes.forEach(file => {
const originalFilePath = file.$.original;
const messages = {};
const transUnits = file.body[0]['trans-unit'];
if (transUnits) {
transUnits.forEach((unit) => {
const key = unit.$.id;
const val = pseudify(unit.source[0]['_'].toString());
if (key && val) {
messages[key] = decodeEntities(val);
}
});
files.push({ messages: messages, originalFilePath: originalFilePath, language: 'ps' });
}
});
resolve(files);
});
});
};
XLF.parse = function (xlfString) {
return new Promise((resolve, reject) => {
let parser = new xml2js.Parser();
let files = [];
parser.parseString(xlfString, function (err, result) {
if (err) {
reject(new Error(`XLF parsing error: Failed to parse XLIFF string. ${err}`));
}
const fileNodes = result['xliff']['file'];
if (!fileNodes) {
reject(new Error(`XLF parsing error: XLIFF file does not contain "xliff" or "file" node(s) required for parsing.`));
}
fileNodes.forEach((file) => {
const originalFilePath = file.$.original;
if (!originalFilePath) {
reject(new Error(`XLF parsing error: XLIFF file node does not contain original attribute to determine the original location of the resource file.`));
}
let language = file.$['target-language'];
if (!language) {
reject(new Error(`XLF parsing error: XLIFF file node does not contain target-language attribute to determine translated language.`));
}
const messages = {};
const transUnits = file.body[0]['trans-unit'];
if (transUnits) {
transUnits.forEach((unit) => {
const key = unit.$.id;
if (!unit.target) {
return; // No translation available
}
let val = unit.target[0];
if (typeof val !== 'string') {
val = val._;
}
if (key && val) {
messages[key] = decodeEntities(val);
}
else {
reject(new Error(`XLF parsing error: XLIFF file ${originalFilePath} does not contain full localization data. ID or target translation for one of the trans-unit nodes is not present.`));
}
});
files.push({ messages: messages, originalFilePath: originalFilePath, language: language.toLowerCase() });
}
});
resolve(files);
});
});
};
class Limiter {
constructor(maxDegreeOfParalellism) {
this.maxDegreeOfParalellism = maxDegreeOfParalellism;
@@ -1181,10 +1184,9 @@ function createIslFile(originalFilePath, messages, language, innoSetup) {
});
const basename = path.basename(originalFilePath);
const filePath = `${basename}.${language.id}.isl`;
const encoded = iconv.encode(Buffer.from(content.join('\r\n'), 'utf8').toString(), innoSetup.codePage);
return new File({
path: filePath,
contents: Buffer.from(encoded),
contents: iconv.encode(Buffer.from(content.join('\r\n'), 'utf8').toString(), innoSetup.codePage)
});
}
function encodeEntities(value) {

View File
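
The XLF class in the hunk above (only re-wrapped here by the compiler's class-to-IIFE downlevel) accumulates localize() keys and messages per source file and serializes them as an XLIFF 1.2 document. A hedged usage sketch, assuming the exported XLF class from the file above; the project and resource names are illustrative:

import { XLF } from './i18n';   // the exporter class shown in the hunks above

const xlf = new XLF('vscode-workbench');                  // translation project name
xlf.addFile(
	'vs/workbench/contrib/scm/browser/scmViewlet',         // "original" resource path
	['scm.title', { key: 'scm.refresh', comment: ['toolbar tooltip'] }],  // keys, optionally with comments
	['Source Control', 'Refresh']                           // English messages, one per key
);
const xliff: string = xlf.toString();                      // '<?xml version="1.0" ...' with one <file> element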

@@ -138,10 +138,6 @@
"name": "vs/workbench/contrib/relauncher",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/sash",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/scm",
"project": "vscode-workbench"
@@ -218,10 +214,6 @@
"name": "vs/workbench/contrib/userDataSync",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/views",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/actions",
"project": "vscode-workbench"
@@ -346,10 +338,6 @@
"name": "vs/workbench/services/userDataSync",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/views",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/timeline",
"project": "vscode-workbench"

View File

@@ -15,7 +15,7 @@ import * as https from 'https';
import * as gulp from 'gulp';
import * as fancyLog from 'fancy-log';
import * as ansiColors from 'ansi-colors';
import * as iconv from 'iconv-lite-umd';
import * as iconv from 'iconv-lite';
const NUMBER_OF_CONCURRENT_DOWNLOADS = 4;
@@ -1347,11 +1347,10 @@ function createIslFile(originalFilePath: string, messages: Map<string>, language
const basename = path.basename(originalFilePath);
const filePath = `${basename}.${language.id}.isl`;
const encoded = iconv.encode(Buffer.from(content.join('\r\n'), 'utf8').toString(), innoSetup.codePage);
return new File({
path: filePath,
contents: Buffer.from(encoded),
contents: iconv.encode(Buffer.from(content.join('\r\n'), 'utf8').toString(), innoSetup.codePage)
});
}

View File
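
The createIslFile change swaps iconv-lite-umd back for iconv-lite and inlines the encode call; either way the point is to encode a UTF-8 string into the Windows code page Inno Setup expects. A minimal sketch of that step, assuming iconv-lite and vinyl as used above; the helper name and 'cp1252' code page are illustrative:

import * as iconv from 'iconv-lite';
import * as File from 'vinyl';

// Encode translated .isl content into the installer language's code page
// (e.g. 'cp1252' for Western European) before handing it to gulp as a File.
function toIslFile(filePath: string, lines: string[], codePage: string): File {
	const content = lines.join('\r\n');
	return new File({
		path: filePath,
		contents: iconv.encode(content, codePage)   // returns a Buffer in the target encoding
	});
}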

@@ -130,14 +130,6 @@ const RULES = [
'lib.dom.d.ts' // no DOM
]
},
// Electron (sandbox)
{
target: '**/vs/**/electron-sandbox/**',
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
]
},
// Electron (renderer): skip
{
target: '**/{vs,sql}/**/electron-browser/**',

View File

@@ -143,15 +143,6 @@ const RULES = [
]
},
// Electron (sandbox)
{
target: '**/vs/**/electron-sandbox/**',
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
]
},
// Electron (renderer): skip
{
target: '**/{vs,sql}/**/electron-browser/**',

View File
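
These RULES entries drive the layer checker: each target glob selects compiled source files and constrains which ambient types and type definitions that layer may reference. A hedged sketch of how a file might be matched against the rule table, assuming minimatch as the glob matcher; the LayerRule interface and findApplicableRule helper are names invented for this example:

import * as minimatch from 'minimatch';

interface LayerRule {
	target: string;                    // glob over source file paths
	allowedTypes?: string[];           // ambient types this layer may reference
	disallowedDefinitions?: string[];  // .d.ts files this layer must not pull in
}

// First rule whose glob matches the file name wins, mirroring a simple
// top-to-bottom rule table.
function findApplicableRule(rules: LayerRule[], fileName: string): LayerRule | undefined {
	return rules.find(rule => minimatch.match([fileName], rule.target).length > 0);
}

// e.g. findApplicableRule(RULES, 'src/vs/base/electron-browser/window.ts')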

@@ -420,7 +420,7 @@ function markNodes(languageService, options) {
// (they can be the declaration of a module import)
continue;
}
if (options.shakeLevel === 2 /* ClassMembers */ && (ts.isClassDeclaration(declaration) || ts.isInterfaceDeclaration(declaration)) && !isLocalCodeExtendingOrInheritingFromDefaultLibSymbol(program, checker, declaration)) {
if (options.shakeLevel === 2 /* ClassMembers */ && (ts.isClassDeclaration(declaration) || ts.isInterfaceDeclaration(declaration))) {
enqueue_black(declaration.name);
for (let j = 0; j < declaration.members.length; j++) {
const member = declaration.members[j];
@@ -614,34 +614,6 @@ function generateResult(languageService, shakeLevel) {
}
//#endregion
//#region Utils
function isLocalCodeExtendingOrInheritingFromDefaultLibSymbol(program, checker, declaration) {
if (!program.isSourceFileDefaultLibrary(declaration.getSourceFile()) && declaration.heritageClauses) {
for (const heritageClause of declaration.heritageClauses) {
for (const type of heritageClause.types) {
const symbol = findSymbolFromHeritageType(checker, type);
if (symbol) {
const decl = symbol.valueDeclaration || (symbol.declarations && symbol.declarations[0]);
if (decl && program.isSourceFileDefaultLibrary(decl.getSourceFile())) {
return true;
}
}
}
}
}
return false;
}
function findSymbolFromHeritageType(checker, type) {
if (ts.isExpressionWithTypeArguments(type)) {
return findSymbolFromHeritageType(checker, type.expression);
}
if (ts.isIdentifier(type)) {
return getRealNodeSymbol(checker, type)[0];
}
if (ts.isPropertyAccessExpression(type)) {
return findSymbolFromHeritageType(checker, type.name);
}
return null;
}
/**
* Returns the node's symbol and the `import` node (if the symbol resolved from a different module)
*/

View File

@@ -536,7 +536,7 @@ function markNodes(languageService: ts.LanguageService, options: ITreeShakingOpt
continue;
}
if (options.shakeLevel === ShakeLevel.ClassMembers && (ts.isClassDeclaration(declaration) || ts.isInterfaceDeclaration(declaration)) && !isLocalCodeExtendingOrInheritingFromDefaultLibSymbol(program, checker, declaration)) {
if (options.shakeLevel === ShakeLevel.ClassMembers && (ts.isClassDeclaration(declaration) || ts.isInterfaceDeclaration(declaration))) {
enqueue_black(declaration.name!);
for (let j = 0; j < declaration.members.length; j++) {
@@ -752,36 +752,6 @@ function generateResult(languageService: ts.LanguageService, shakeLevel: ShakeLe
//#region Utils
function isLocalCodeExtendingOrInheritingFromDefaultLibSymbol(program: ts.Program, checker: ts.TypeChecker, declaration: ts.ClassDeclaration | ts.InterfaceDeclaration): boolean {
if (!program.isSourceFileDefaultLibrary(declaration.getSourceFile()) && declaration.heritageClauses) {
for (const heritageClause of declaration.heritageClauses) {
for (const type of heritageClause.types) {
const symbol = findSymbolFromHeritageType(checker, type);
if (symbol) {
const decl = symbol.valueDeclaration || (symbol.declarations && symbol.declarations[0]);
if (decl && program.isSourceFileDefaultLibrary(decl.getSourceFile())) {
return true;
}
}
}
}
}
return false;
}
function findSymbolFromHeritageType(checker: ts.TypeChecker, type: ts.ExpressionWithTypeArguments | ts.Expression | ts.PrivateIdentifier): ts.Symbol | null {
if (ts.isExpressionWithTypeArguments(type)) {
return findSymbolFromHeritageType(checker, type.expression);
}
if (ts.isIdentifier(type)) {
return getRealNodeSymbol(checker, type)[0];
}
if (ts.isPropertyAccessExpression(type)) {
return findSymbolFromHeritageType(checker, type.name);
}
return null;
}
/**
* Returns the node's symbol and the `import` node (if the symbol resolved from a different module)
*/

View File
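
The helper removed above kept all members of a class alive when it extends or implements something from TypeScript's default library, since those members can be reached through the lib contract even when no local code calls them. An illustrative example of the case it protects (this is not the checker itself, just the pattern it guards):

// A local class whose members are reached through the default-lib `Error`
// contract rather than by direct local calls.
class ConnectionError extends Error {
	constructor(public readonly host: string) {
		super(`could not connect to ${host}`);
		this.name = 'ConnectionError';
	}
	// Never invoked by name in this program, but the runtime uses it whenever
	// the error is stringified; member-level shaking must not remove it.
	toString(): string {
		return `${this.name}: ${this.message} (${this.host})`;
	}
}

export function fail(host: string): never {
	throw new ConnectionError(host);
}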

@@ -4,7 +4,7 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
exports.getElectronVersion = exports.streamToPromise = exports.versionStringToNumber = exports.filter = exports.rebase = exports.getVersion = exports.ensureDir = exports.rreddir = exports.rimraf = exports.stripSourceMappingURL = exports.loadSourcemaps = exports.cleanNodeModules = exports.skipDirectories = exports.toFileUri = exports.setExecutableBit = exports.fixWin32DirectoryPermissions = exports.incremental = void 0;
exports.streamToPromise = exports.versionStringToNumber = exports.filter = exports.rebase = exports.getVersion = exports.ensureDir = exports.rreddir = exports.rimraf = exports.stripSourceMappingURL = exports.loadSourcemaps = exports.cleanNodeModules = exports.skipDirectories = exports.toFileUri = exports.setExecutableBit = exports.fixWin32DirectoryPermissions = exports.incremental = void 0;
const es = require("event-stream");
const debounce = require("debounce");
const _filter = require("gulp-filter");
@@ -14,7 +14,6 @@ const fs = require("fs");
const _rimraf = require("rimraf");
const git = require("./git");
const VinylFile = require("vinyl");
const root = path.dirname(path.dirname(__dirname));
const NoCancellationToken = { isCancellationRequested: () => false };
function incremental(streamProvider, initial, supportsCancellation) {
const input = es.through();
@@ -138,7 +137,7 @@ function loadSourcemaps() {
version: '3',
names: [],
mappings: '',
sources: [f.relative],
sources: [f.relative.replace(/\\/g, '/')],
sourcesContent: [contents]
};
cb(undefined, f);
@@ -256,9 +255,3 @@ function streamToPromise(stream) {
});
}
exports.streamToPromise = streamToPromise;
function getElectronVersion() {
const yarnrc = fs.readFileSync(path.join(root, '.yarnrc'), 'utf8');
const target = /^target "(.*)"$/m.exec(yarnrc)[1];
return target;
}
exports.getElectronVersion = getElectronVersion;

View File

@@ -18,8 +18,6 @@ import * as VinylFile from 'vinyl';
import { ThroughStream } from 'through';
import * as sm from 'source-map';
const root = path.dirname(path.dirname(__dirname));
export interface ICancellationToken {
isCancellationRequested(): boolean;
}
@@ -186,7 +184,7 @@ export function loadSourcemaps(): NodeJS.ReadWriteStream {
version: '3',
names: [],
mappings: '',
sources: [f.relative],
sources: [f.relative.replace(/\\/g, '/')],
sourcesContent: [contents]
};
@@ -320,9 +318,3 @@ export function streamToPromise(stream: NodeJS.ReadWriteStream): Promise<void> {
stream.on('end', () => c());
});
}
export function getElectronVersion(): string {
const yarnrc = fs.readFileSync(path.join(root, '.yarnrc'), 'utf8');
const target = /^target "(.*)"$/m.exec(yarnrc)![1];
return target;
}

View File
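
loadSourcemaps, touched in both util hunks above, synthesizes an empty v3 source map for files that do not carry one, normalizing Windows path separators so the sources entry stays URL-friendly. A sketch of just that fallback, assuming a vinyl File with Buffer contents; the function name is illustrative:

import * as File from 'vinyl';

// Fallback used when a file carries no source map: attach an empty v3 map
// whose single source is the file itself, with Windows '\' normalized to '/'.
function attachEmptySourceMap(f: File & { sourceMap?: any }): void {
	const contents = (<Buffer>f.contents).toString('utf8');
	f.sourceMap = {
		version: '3',
		names: [],
		mappings: '',
		sources: [f.relative.replace(/\\/g, '/')],
		sourcesContent: [contents]
	};
}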

@@ -33,10 +33,9 @@ function yarnInstall(location, opts) {
yarnInstall('extensions'); // node modules shared by all extensions
if (!(process.platform === 'win32' && process.env['npm_config_arch'] === 'arm64')) {
yarnInstall('remote'); // node modules used by vscode server
yarnInstall('remote/web'); // node modules used by vscode web
}
yarnInstall('remote'); // node modules used by vscode server
yarnInstall('remote/web'); // node modules used by vscode web
const allExtensionFolders = fs.readdirSync('extensions');
const extensions = allExtensionFolders.filter(e => {
@@ -73,5 +72,3 @@ yarnInstall('test/automation'); // node modules required for smoketest
yarnInstall('test/smoke'); // node modules required for smoketest
yarnInstall('test/integration/browser'); // node modules required for integration
yarnInstallBuildDependencies(); // node modules for watching, specific to host node version, not electron
cp.execSync('git config pull.rebase true');

View File
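
postinstall.js runs yarn in each sub-folder of the repo that carries its own package.json. A sketch of the kind of helper it builds on, assuming child_process.execSync with a per-folder cwd (the real script may invoke yarn differently):

import * as cp from 'child_process';

// Run `yarn install` inside one sub-folder of the repo (extensions, remote, ...).
function yarnInstall(location: string, opts?: cp.ExecSyncOptions): void {
	const yarn = process.platform === 'win32' ? 'yarn.cmd' : 'yarn';
	console.log(`Installing dependencies in ${location}...`);
	cp.execSync(`${yarn} install`, { cwd: location, stdio: 'inherit', ...opts });
}

// yarnInstall('extensions');   // node modules shared by all extensions
// yarnInstall('remote');       // node modules used by the server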

@@ -35,12 +35,11 @@
"applicationinsights": "1.0.8",
"azure-storage": "^2.1.0",
"documentdb": "1.13.0",
"electron-osx-sign": "^0.4.16",
"github-releases": "^0.4.1",
"gulp-bom": "^1.0.0",
"gulp-sourcemaps": "^1.11.0",
"gulp-uglify": "^3.0.0",
"iconv-lite-umd": "0.6.7",
"iconv-lite": "0.4.23",
"mime": "^1.3.4",
"minimatch": "3.0.4",
"minimist": "^1.2.3",
@@ -49,9 +48,9 @@
"rollup-plugin-commonjs": "^10.1.0",
"rollup-plugin-node-resolve": "^5.2.0",
"terser": "4.3.8",
"typescript": "^4.0.0-dev.20200629",
"typescript": "^3.9.1-rc",
"vsce": "1.48.0",
"vscode-telemetry-extractor": "^1.6.0",
"vscode-telemetry-extractor": "^1.5.4",
"xml2js": "^0.4.17"
},
"scripts": {

View File

@@ -1,26 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
let TelemetryReporter = (function () {
function TelemetryReporter(extensionId, extensionVersion, key) {
}
TelemetryReporter.prototype.updateUserOptIn = function (key) {
};
TelemetryReporter.prototype.createAppInsightsClient = function (key) {
};
TelemetryReporter.prototype.getCommonProperties = function () {
};
TelemetryReporter.prototype.sendTelemetryEvent = function (eventName, properties, measurements) {
};
TelemetryReporter.prototype.dispose = function () {
};
TelemetryReporter.TELEMETRY_CONFIG_ID = 'telemetry';
TelemetryReporter.TELEMETRY_CONFIG_ENABLED_ID = 'enableTelemetry';
return TelemetryReporter;
}());
exports.default = TelemetryReporter;

View File

@@ -1,79 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
function format(message, args) {
let result;
// if (isPseudo) {
// // FF3B and FF3D is the Unicode zenkaku representation for [ and ]
// message = '\uFF3B' + message.replace(/[aouei]/g, '$&$&') + '\uFF3D';
// }
if (args.length === 0) {
result = message;
}
else {
result = message.replace(/\{(\d+)\}/g, function (match, rest) {
let index = rest[0];
let arg = args[index];
let replacement = match;
if (typeof arg === 'string') {
replacement = arg;
}
else if (typeof arg === 'number' || typeof arg === 'boolean' || arg === void 0 || arg === null) {
replacement = String(arg);
}
return replacement;
});
}
return result;
}
function localize(key, message) {
let args = [];
for (let _i = 2; _i < arguments.length; _i++) {
args[_i - 2] = arguments[_i];
}
return format(message, args);
}
function loadMessageBundle(file) {
return localize;
}
let MessageFormat;
(function (MessageFormat) {
MessageFormat["file"] = "file";
MessageFormat["bundle"] = "bundle";
MessageFormat["both"] = "both";
})(MessageFormat = exports.MessageFormat || (exports.MessageFormat = {}));
let BundleFormat;
(function (BundleFormat) {
// the nls.bundle format
BundleFormat["standalone"] = "standalone";
BundleFormat["languagePack"] = "languagePack";
})(BundleFormat = exports.BundleFormat || (exports.BundleFormat = {}));
exports.loadMessageBundle = loadMessageBundle;
function config(opts) {
if (opts) {
if (isString(opts.locale)) {
options.locale = opts.locale.toLowerCase();
options.language = options.locale;
resolvedLanguage = undefined;
resolvedBundles = Object.create(null);
}
if (opts.messageFormat !== undefined) {
options.messageFormat = opts.messageFormat;
}
if (opts.bundleFormat === BundleFormat.standalone && options.languagePackSupport === true) {
options.languagePackSupport = false;
}
}
isPseudo = options.locale === 'pseudo';
return loadMessageBundle;
}
exports.config = config;

View File
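
The deleted stub above implements the vscode-nls localize() surface without any bundle loading: format() simply splices the {0}, {1}, … arguments into the default message. A short usage sketch under that assumption; the key and argument values are illustrative:

import * as nls from 'vscode-nls';
const localize = nls.loadMessageBundle();

const msg = localize(
	'extension.download.failed',                    // key (ignored by the stub)
	'Failed to download {0} after {1} attempts.',   // default English message
	'mssql-tools',                                   // substituted for {0}
	3                                                // substituted for {1}
);
// msg === 'Failed to download mssql-tools after 3 attempts.'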

@@ -1,8 +1,7 @@
#define RootLicenseFileName FileExists(RepoDir + '\LICENSE.rtf') ? 'LICENSE.rtf' : 'LICENSE.txt'
#define LocalizedLanguageFile(Language = "") \
DirExists(RepoDir + "\licenses") && Language != "" \
? ('; LicenseFile: "' + RepoDir + '\licenses\LICENSE-' + Language + '.txt"') \
: '; LicenseFile: "' + RepoDir + '\' + RootLicenseFileName + '"'
: '; LicenseFile: "' + RepoDir + '\LICENSE.txt"'
[Setup]
AppId={#AppId}
@@ -57,9 +56,6 @@ Name: "russian"; MessagesFile: "compiler:Languages\Russian.isl,{#RepoDir}\build\
Name: "korean"; MessagesFile: "{#RepoDir}\build\win32\i18n\Default.ko.isl,{#RepoDir}\build\win32\i18n\messages.ko.isl" {#LocalizedLanguageFile("kor")}
Name: "simplifiedChinese"; MessagesFile: "{#RepoDir}\build\win32\i18n\Default.zh-cn.isl,{#RepoDir}\build\win32\i18n\messages.zh-cn.isl" {#LocalizedLanguageFile("chs")}
Name: "traditionalChinese"; MessagesFile: "{#RepoDir}\build\win32\i18n\Default.zh-tw.isl,{#RepoDir}\build\win32\i18n\messages.zh-tw.isl" {#LocalizedLanguageFile("cht")}
Name: "brazilianPortuguese"; MessagesFile: "compiler:Languages\BrazilianPortuguese.isl,{#RepoDir}\build\win32\i18n\messages.pt-br.isl" {#LocalizedLanguageFile("ptb")}
Name: "hungarian"; MessagesFile: "compiler:Languages\Hungarian.isl,{#RepoDir}\build\win32\i18n\messages.hu.isl" {#LocalizedLanguageFile("hun")}
Name: "turkish"; MessagesFile: "compiler:Languages\Turkish.isl,{#RepoDir}\build\win32\i18n\messages.tr.isl" {#LocalizedLanguageFile("trk")}
[InstallDelete]
Type: filesandordirs; Name: "{app}\resources\app\out"; Check: IsNotUpdate
@@ -155,7 +151,7 @@ begin
#endif
#if "user" == InstallTarget
#if "ia32" == Arch || "arm64" == Arch
#if "ia32" == Arch
#define IncompatibleArchRootKey "HKLM32"
#else
#define IncompatibleArchRootKey "HKLM64"
@@ -265,7 +261,7 @@ begin
end;
end;
// https://stackoverflow.com/a/23838239/261019
// http://stackoverflow.com/a/23838239/261019
procedure Explode(var Dest: TArrayOfString; Text: String; Separator: String);
var
i, p: Integer;

View File

@@ -420,11 +420,6 @@ acorn@^7.0.0:
resolved "https://registry.yarnpkg.com/acorn/-/acorn-7.1.0.tgz#949d36f2c292535da602283586c2477c57eb2d6c"
integrity sha512-kL5CuoXA/dgxlBbVrflsflzQ3PAas7RYZB52NOm/6839iVYJgKMJ3cQJD+t2i5+qFa8h3MDpEOJiS64E8JLnSQ==
agent-base@5:
version "5.1.1"
resolved "https://registry.yarnpkg.com/agent-base/-/agent-base-5.1.1.tgz#e8fb3f242959db44d63be665db7a8e739537a32c"
integrity sha512-TMeqbNl2fMW0nMjTEPOwe3J/PRFP4vqeoNuQMG0HlMrtm5QxKqdvAkZ1pRBQ/ulIyDD5Yq0nJ7YbdD8ey0TO3g==
ajv@^4.9.1:
version "4.11.8"
resolved "https://registry.yarnpkg.com/ajv/-/ajv-4.11.8.tgz#82ffb02b29e662ae53bdc20af15947706739c536"
@@ -639,11 +634,6 @@ balanced-match@^1.0.0:
resolved "https://registry.yarnpkg.com/balanced-match/-/balanced-match-1.0.0.tgz#89b4d199ab2bee49de164ea02b89ce462d71b767"
integrity sha1-ibTRmasr7kneFk6gK4nORi1xt2c=
base64-js@^1.2.3:
version "1.3.1"
resolved "https://registry.yarnpkg.com/base64-js/-/base64-js-1.3.1.tgz#58ece8cb75dd07e71ed08c736abc5fac4dbf8df1"
integrity sha512-mLQ4i2QO1ytvGWFWmcngKO//JXAQueZvwEKtjgQFM4jIK0kU+ytMfplL8j+n5mspOfjHwoAg+9yhb7BwAHm36g==
base@^0.11.1:
version "0.11.2"
resolved "https://registry.yarnpkg.com/base/-/base-0.11.2.tgz#7bde5ced145b6d551a90db87f83c558b4eb48a8f"
@@ -679,11 +669,6 @@ binary-search-bounds@2.0.3:
resolved "https://registry.yarnpkg.com/binary-search-bounds/-/binary-search-bounds-2.0.3.tgz#5ff8616d6dd2ca5388bc85b2d6266e2b9da502dc"
integrity sha1-X/hhbW3SylOIvIWy1iZuK52lAtw=
bluebird@^3.5.0:
version "3.7.2"
resolved "https://registry.yarnpkg.com/bluebird/-/bluebird-3.7.2.tgz#9f229c15be272454ffa973ace0dbee79a1b0c36f"
integrity sha512-XpNj6GDQzdfW+r2Wnn7xiSAd7TM3jzkxGXBGTtWKuSXv1xUV+azxAm8jdWZN06QTQk+2N2XB9jRDkvbmQmcRtg==
boolbase@~1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/boolbase/-/boolbase-1.0.0.tgz#68dff5fbe60c51eb37725ea9e3ed310dcc1e776e"
@@ -746,29 +731,11 @@ browserify-mime@~1.2.9:
resolved "https://registry.yarnpkg.com/browserify-mime/-/browserify-mime-1.2.9.tgz#aeb1af28de6c0d7a6a2ce40adb68ff18422af31f"
integrity sha1-rrGvKN5sDXpqLOQK22j/GEIq8x8=
buffer-alloc-unsafe@^1.1.0:
version "1.1.0"
resolved "https://registry.yarnpkg.com/buffer-alloc-unsafe/-/buffer-alloc-unsafe-1.1.0.tgz#bd7dc26ae2972d0eda253be061dba992349c19f0"
integrity sha512-TEM2iMIEQdJ2yjPJoSIsldnleVaAk1oW3DBVUykyOLsEsFmEc9kn+SFFPz+gl54KQNxlDnAwCXosOS9Okx2xAg==
buffer-alloc@^1.2.0:
version "1.2.0"
resolved "https://registry.yarnpkg.com/buffer-alloc/-/buffer-alloc-1.2.0.tgz#890dd90d923a873e08e10e5fd51a57e5b7cce0ec"
integrity sha512-CFsHQgjtW1UChdXgbyJGtnm+O/uLQeZdtbDo8mfUgYXCHSM1wgrVxXm6bSyrUuErEb+4sYVGCzASBRot7zyrow==
dependencies:
buffer-alloc-unsafe "^1.1.0"
buffer-fill "^1.0.0"
buffer-crc32@~0.2.3:
version "0.2.13"
resolved "https://registry.yarnpkg.com/buffer-crc32/-/buffer-crc32-0.2.13.tgz#0d333e3f00eac50aa1454abd30ef8c2a5d9a7242"
integrity sha1-DTM+PwDqxQqhRUq9MO+MKl2ackI=
buffer-fill@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/buffer-fill/-/buffer-fill-1.0.0.tgz#f8f78b76789888ef39f205cd637f68e702122b2c"
integrity sha1-+PeLdniYiO858gXNY39o5wISKyw=
buffer-from@^1.0.0:
version "1.1.1"
resolved "https://registry.yarnpkg.com/buffer-from/-/buffer-from-1.1.1.tgz#32713bc028f75c02fdb710d7c7bcec1f2c6070ef"
@@ -933,11 +900,6 @@ commander@^2.8.1:
resolved "https://registry.yarnpkg.com/commander/-/commander-2.19.0.tgz#f6198aa84e5b83c46054b94ddedbfed5ee9ff12a"
integrity sha512-6tvAOO+D6OENvRAh524Dh9jcfKTYDQAqvqezbCW82xj5X0pSrcpxtvRKHLG0yBY6SD7PSDrJaj+0AiOcKVd1Xg==
compare-version@^0.1.2:
version "0.1.2"
resolved "https://registry.yarnpkg.com/compare-version/-/compare-version-0.1.2.tgz#0162ec2d9351f5ddd59a9202cba935366a725080"
integrity sha1-AWLsLZNR9d3VmpICy6k1NmpyUIA=
component-emitter@^1.2.1:
version "1.2.1"
resolved "https://registry.yarnpkg.com/component-emitter/-/component-emitter-1.2.1.tgz#137918d6d78283f7df7a6b7c5a63e140e69425e6"
@@ -1041,14 +1003,14 @@ debug-fabulous@0.0.X:
lazy-debug-legacy "0.0.X"
object-assign "4.1.0"
debug@2.X, debug@^2.1.2, debug@^2.2.0, debug@^2.3.3, debug@^2.6.8:
debug@2.X, debug@^2.1.2, debug@^2.2.0, debug@^2.3.3:
version "2.6.9"
resolved "https://registry.yarnpkg.com/debug/-/debug-2.6.9.tgz#5d128515df134ff327e90a4c93f4e077a536341f"
integrity sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==
dependencies:
ms "2.0.0"
debug@4, debug@^4.1.1:
debug@^4.1.1:
version "4.1.1"
resolved "https://registry.yarnpkg.com/debug/-/debug-4.1.1.tgz#3b72260255109c6b589cee050f1d516139664791"
integrity sha512-pYAIzeRo8J6KPEaJ0VWOh5Pzkbw/RetuzehGM7QRRX5he4fPHx2rdKMB256ehJCkX+XRQm16eZLqLNS8RSZXZw==
@@ -1201,18 +1163,6 @@ ecc-jsbn@~0.1.1:
dependencies:
jsbn "~0.1.0"
electron-osx-sign@^0.4.16:
version "0.4.16"
resolved "https://registry.yarnpkg.com/electron-osx-sign/-/electron-osx-sign-0.4.16.tgz#0be8e579b2d9fa4c12d2a21f063898294b3434aa"
integrity sha512-ziMWfc3NmQlwnWLW6EaZq8nH2BWVng/atX5GWsGwhexJYpdW6hsg//MkAfRTRx1kR3Veiqkeiog1ibkbA4x0rg==
dependencies:
bluebird "^3.5.0"
compare-version "^0.1.2"
debug "^2.6.8"
isbinaryfile "^3.0.2"
minimist "^1.2.0"
plist "^3.0.1"
end-of-stream@^1.1.0:
version "1.4.4"
resolved "https://registry.yarnpkg.com/end-of-stream/-/end-of-stream-1.4.4.tgz#5ae64a5f45057baf3626ec14da0ca5e4b2431eb0"
@@ -1809,18 +1759,12 @@ http-signature@~1.2.0:
jsprim "^1.2.2"
sshpk "^1.7.0"
https-proxy-agent@^4.0.0:
version "4.0.0"
resolved "https://registry.yarnpkg.com/https-proxy-agent/-/https-proxy-agent-4.0.0.tgz#702b71fb5520a132a66de1f67541d9e62154d82b"
integrity sha512-zoDhWrkR3of1l9QAL8/scJZyLu8j/gBkcwcaQOZh7Gyh/+uJQzGVETdgT30akuwkpL8HTRfssqI3BZuV18teDg==
iconv-lite@0.4.23:
version "0.4.23"
resolved "https://registry.yarnpkg.com/iconv-lite/-/iconv-lite-0.4.23.tgz#297871f63be507adcfbfca715d0cd0eed84e9a63"
integrity sha512-neyTUVFtahjf0mB3dZT77u+8O0QB89jFdnBkd5P1JgYPbPaia3gXXOVL2fq8VyU2gMMD7SaN7QukTB/pmXYvDA==
dependencies:
agent-base "5"
debug "4"
iconv-lite-umd@0.6.7:
version "0.6.7"
resolved "https://registry.yarnpkg.com/iconv-lite-umd/-/iconv-lite-umd-0.6.7.tgz#ee437e34b30f15dc00ec93ea65065e672770777c"
integrity sha512-DT90zb7wL1B3I6DmYUMcfJeVdY19XigzDj5AtXbXEw9Jfi0+AVAxfn7ytvY7Xhr+GFn7nd7hPonapC37oo7iAQ==
safer-buffer ">= 2.1.2 < 3"
iconv-lite@^0.4.4:
version "0.4.24"
@@ -2060,13 +2004,6 @@ isarray@1.0.0, isarray@~1.0.0:
resolved "https://registry.yarnpkg.com/isarray/-/isarray-1.0.0.tgz#bb935d48582cba168c06834957a54a3e07124f11"
integrity sha1-u5NdSFgsuhaMBoNJV6VKPgcSTxE=
isbinaryfile@^3.0.2:
version "3.0.3"
resolved "https://registry.yarnpkg.com/isbinaryfile/-/isbinaryfile-3.0.3.tgz#5d6def3edebf6e8ca8cae9c30183a804b5f8be80"
integrity sha512-8cJBL5tTd2OS0dM4jz07wQd5g0dCCqIhUxPIGtZfa5L6hWlvV5MHTITy/DBAsF+Oe2LS1X3krBUhNwaGUWpWxw==
dependencies:
buffer-alloc "^1.2.0"
isexe@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/isexe/-/isexe-2.0.0.tgz#e8fbf374dc556ff8947a10dcb0572d633f2cfa10"
@@ -2459,9 +2396,9 @@ minizlib@^1.1.1:
minipass "^2.2.1"
mixin-deep@^1.2.0:
version "1.3.2"
resolved "https://registry.yarnpkg.com/mixin-deep/-/mixin-deep-1.3.2.tgz#1120b43dc359a785dce65b55b82e257ccf479566"
integrity sha512-WRoDn//mXBiJ1H40rqa3vH0toePwSsGb45iInWlTySa+Uu4k3tYUSxa2v1KqAiLtvlrSzaExqS1gtk96A9zvEA==
version "1.3.1"
resolved "https://registry.yarnpkg.com/mixin-deep/-/mixin-deep-1.3.1.tgz#a49e7268dce1a0d9698e45326c5626df3543d0fe"
integrity sha512-8ZItLHeEgaqEvd5lYBXfm4EZSFCX29Jb9K+lAHhDKzReKBQKj3R+7NOF6tjqYi9t4oI8VUfaWITJQm86wnXGNQ==
dependencies:
for-in "^1.0.2"
is-extendable "^1.0.1"
@@ -2783,15 +2720,6 @@ picomatch@^2.0.5:
resolved "https://registry.yarnpkg.com/picomatch/-/picomatch-2.0.7.tgz#514169d8c7cd0bdbeecc8a2609e34a7163de69f6"
integrity sha512-oLHIdio3tZ0qH76NybpeneBhYVj0QFTfXEFTc/B3zKQspYfYYkWYgFsmzo+4kvId/bQRcNkVeguI3y+CD22BtA==
plist@^3.0.1:
version "3.0.1"
resolved "https://registry.yarnpkg.com/plist/-/plist-3.0.1.tgz#a9b931d17c304e8912ef0ba3bdd6182baf2e1f8c"
integrity sha512-GpgvHHocGRyQm74b6FWEZZVRroHKE1I0/BTjAmySaohK+cUn+hZpbqXkc3KWgW3gQYkqcQej35FohcT0FRlkRQ==
dependencies:
base64-js "^1.2.3"
xmlbuilder "^9.0.7"
xmldom "0.1.x"
posix-character-classes@^0.1.0:
version "0.1.1"
resolved "https://registry.yarnpkg.com/posix-character-classes/-/posix-character-classes-0.1.1.tgz#01eac0fe3b5af71a2a6c02feabb8c1fef7e00eab"
@@ -2820,11 +2748,6 @@ process-nextick-args@~2.0.0:
resolved "https://registry.yarnpkg.com/process-nextick-args/-/process-nextick-args-2.0.0.tgz#a37d732f4271b4ab1ad070d35508e8290788ffaa"
integrity sha512-MtEC1TqN0EU5nephaJ4rAtThHtC86dNN9qCuEhtshvpVBkAW5ZO7BASN9REnF9eoXGcRub+pFuKEpOHE+HbEMw==
proxy-from-env@^1.1.0:
version "1.1.0"
resolved "https://registry.yarnpkg.com/proxy-from-env/-/proxy-from-env-1.1.0.tgz#e102f16ca355424865755d2c9e8ea4f24d58c3e2"
integrity sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==
pump@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/pump/-/pump-3.0.0.tgz#b4a2116815bde2f4e1ea602354e8c75565107a64"
@@ -3341,9 +3264,9 @@ string_decoder@~0.10.x:
integrity sha1-YuIDvEF2bGwoyfyEMB2rHFMQ+pQ=
stringstream@~0.0.4, stringstream@~0.0.5:
version "0.0.6"
resolved "https://registry.yarnpkg.com/stringstream/-/stringstream-0.0.6.tgz#7880225b0d4ad10e30927d167a1d6f2fd3b33a72"
integrity sha512-87GEBAkegbBcweToUrdzf3eLhWNg06FJTebl4BVJz/JgWy8CvEr9dRtX5qWphiynMSQlxxi+QqN0z5T32SLlhA==
version "0.0.5"
resolved "https://registry.yarnpkg.com/stringstream/-/stringstream-0.0.5.tgz#4e484cd4de5a0bbbee18e46307710a8a81621878"
integrity sha1-TkhM1N5aC7vuGORjB3EKioFiGHg=
strip-ansi@^3.0.0, strip-ansi@^3.0.1:
version "3.0.1"
@@ -3539,10 +3462,10 @@ typescript@^3.0.1:
resolved "https://registry.yarnpkg.com/typescript/-/typescript-3.5.3.tgz#c830f657f93f1ea846819e929092f5fe5983e977"
integrity sha512-ACzBtm/PhXBDId6a6sDJfroT2pOWt/oOnk4/dElG5G33ZL776N3Y6/6bKZJBFpd+b05F3Ct9qDjMeJmRWtE2/g==
typescript@^4.0.0-dev.20200629:
version "4.0.0-dev.20200629"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-4.0.0-dev.20200629.tgz#4631667ebffe3a340beee885a4bebe3a73b6f18e"
integrity sha512-c4DUu7KvTcx4x7V8sBWexYNkCfioiH1huOJL6WFAA8Oot0Gr/+PlKKDBS9fYjsadEv1JI1qboJKobwLQn0kQXw==
typescript@^3.9.1-rc:
version "3.9.1-rc"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-3.9.1-rc.tgz#81d5a5a0a597e224b6e2af8dffb46524b2eaf5f3"
integrity sha512-+cPv8L2Vd4KidCotqi2wjegBZ5n47CDRUu/QiLVu2YbeXAz78hIfcai9ziBiNI6JTGTVwUqXRug2UZxDcxhvFw==
typical@^4.0.0:
version "4.0.0"
@@ -3702,22 +3625,19 @@ vsce@1.48.0:
yauzl "^2.3.1"
yazl "^2.2.2"
vscode-ripgrep@^1.6.2:
version "1.6.2"
resolved "https://registry.yarnpkg.com/vscode-ripgrep/-/vscode-ripgrep-1.6.2.tgz#fb912c7465699f10ce0218a6676cc632c77369b4"
integrity sha512-jkZEWnQFcE+QuQFfxQXWcWtDafTmgkp3DjMKawDkajZwgnDlGKpFp15ybKrZNVTi1SLEF/12BzxYSZVVZ2XrkA==
dependencies:
https-proxy-agent "^4.0.0"
proxy-from-env "^1.1.0"
vscode-ripgrep@^1.5.6:
version "1.5.7"
resolved "https://registry.yarnpkg.com/vscode-ripgrep/-/vscode-ripgrep-1.5.7.tgz#acb6b548af488a4bca5d0f1bb5faf761343289ce"
integrity sha512-/Vsz/+k8kTvui0q3O74pif9FK0nKopgFTiGNVvxicZANxtSA8J8gUE9GQ/4dpi7D/2yI/YVORszwVskFbz46hQ==
vscode-telemetry-extractor@^1.6.0:
version "1.6.0"
resolved "https://registry.yarnpkg.com/vscode-telemetry-extractor/-/vscode-telemetry-extractor-1.6.0.tgz#e9d9c1d24863cce8d3d715f0287de3b31eb90c56"
integrity sha512-zSxvkbyAMa1lTRGIHfGg7gW2e9Sey+2zGYD19uNWCsVEfoXAr2NB6uzb0sNHtbZ2SSqxSePmFXzBAavsudT5fw==
vscode-telemetry-extractor@^1.5.4:
version "1.5.4"
resolved "https://registry.yarnpkg.com/vscode-telemetry-extractor/-/vscode-telemetry-extractor-1.5.4.tgz#bcb0d17667fa1b77715e3a3bf372ade18f846782"
integrity sha512-MN9LNPo0Rc6cy3sIWTAG97PTWkEKdRnP0VeYoS8vjKSNtG9CAsrUxHgFfYoHm2vNK/ijd0a4NzETyVGO2kT6hw==
dependencies:
command-line-args "^5.1.1"
ts-morph "^3.1.3"
vscode-ripgrep "^1.6.2"
vscode-ripgrep "^1.5.6"
vso-node-api@6.1.2-preview:
version "6.1.2-preview"
@@ -3780,21 +3700,11 @@ xmlbuilder@0.4.3:
resolved "https://registry.yarnpkg.com/xmlbuilder/-/xmlbuilder-0.4.3.tgz#c4614ba74e0ad196e609c9272cd9e1ddb28a8a58"
integrity sha1-xGFLp04K0ZbmCcknLNnh3bKKilg=
xmlbuilder@^9.0.7:
version "9.0.7"
resolved "https://registry.yarnpkg.com/xmlbuilder/-/xmlbuilder-9.0.7.tgz#132ee63d2ec5565c557e20f4c22df9aca686b10d"
integrity sha1-Ey7mPS7FVlxVfiD0wi35rKaGsQ0=
xmlbuilder@~9.0.1:
version "9.0.4"
resolved "https://registry.yarnpkg.com/xmlbuilder/-/xmlbuilder-9.0.4.tgz#519cb4ca686d005a8420d3496f3f0caeecca580f"
integrity sha1-UZy0ymhtAFqEINNJbz8MruzKWA8=
xmldom@0.1.x:
version "0.1.31"
resolved "https://registry.yarnpkg.com/xmldom/-/xmldom-0.1.31.tgz#b76c9a1bd9f0a9737e5a72dc37231cf38375e2ff"
integrity sha512-yS2uJflVQs6n+CyjHoaBmVSqIDevTAWrzMmjG1Gc7h1qQ7uVozNhEPJAwZXWyGQ/Gafo3fCwrcaokezLPupVyQ==
xtend@~4.0.1:
version "4.0.1"
resolved "https://registry.yarnpkg.com/xtend/-/xtend-4.0.1.tgz#a5c6d532be656e23db820efb943a1f04998d63af"


@@ -200,135 +200,6 @@
},
{
"name": "big-integer",
"prependLicenseText": [
"Copyright released to public domain"
]
},
{
// Reason: The license at https://github.com/justmoon/node-extend/blob/main/LICENSE
// cannot be found by the OSS tool automatically.
"name": "extend",
"fullLicenseText": [
"The MIT License (MIT)",
"",
"Copyright (c) 2014 Stefan Thomas",
"",
"Permission is hereby granted, free of charge, to any person obtaining",
"a copy of this software and associated documentation files (the",
"\"Software\"), to deal in the Software without restriction, including",
"without limitation the rights to use, copy, modify, merge, publish,",
"distribute, sublicense, and/or sell copies of the Software, and to",
"permit persons to whom the Software is furnished to do so, subject to",
"the following conditions:",
"",
"The above copyright notice and this permission notice shall be",
"included in all copies or substantial portions of the Software.",
"",
"THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND,",
"EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF",
"MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND",
"NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE",
"LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION",
"OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION",
"WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."
]
},
{
// Reason: The license at https://github.com/retep998/winapi-rs/blob/0.3/LICENSE-MIT
// cannot be found by the OSS tool automatically.
"name": "retep998/winapi-rs",
"fullLicenseText": [
"Copyright (c) 2015-2018 The winapi-rs Developers",
"",
"Permission is hereby granted, free of charge, to any person obtaining a copy",
"of this software and associated documentation files (the \"Software\"), to deal",
"in the Software without restriction, including without limitation the rights",
"to use, copy, modify, merge, publish, distribute, sublicense, and/or sell",
"copies of the Software, and to permit persons to whom the Software is",
"furnished to do so, subject to the following conditions:",
"",
"The above copyright notice and this permission notice shall be included in all",
"copies or substantial portions of the Software.",
"",
"THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR",
"IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,",
"FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE",
"AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER",
"LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,",
"OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE",
"SOFTWARE."
]
},
{
// Reason: The license at https://github.com/digitaldesignlabs/es6-promisify/blob/main/LICENSE
// cannot be found by the OSS tool automatically.
"name": "es6-promisify",
"fullLicenseText": [
"Copyright (c) 2014 Mike Hall / Digital Design Labs",
"",
"Permission is hereby granted, free of charge, to any person obtaining a copy",
"of this software and associated documentation files (the \"Software\"), to deal",
"in the Software without restriction, including without limitation the rights",
"to use, copy, modify, merge, publish, distribute, sublicense, and/or sell",
"copies of the Software, and to permit persons to whom the Software is",
"furnished to do so, subject to the following conditions:",
"",
"The above copyright notice and this permission notice shall be included in all",
"copies or substantial portions of the Software.",
"",
"THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR",
"IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,",
"FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE",
"AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER",
"LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,",
"OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE",
"SOFTWARE."
]
},
{
// Reason: The license at https://github.com/zkat/json-parse-better-errors/blob/latest/LICENSE.md
// cannot be found by the OSS tool automatically.
"name": "json-parse-better-errors",
"fullLicenseText": [
"Copyright 2017 Kat Marchán",
"",
"Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the",
"\"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute,",
"sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following",
"conditions:",
"",
"The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.",
"",
"THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE",
"WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS",
"OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR",
"OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."
]
},
{
// Reason: The license at https://github.com/time-rs/time/blob/main/LICENSE-MIT
// cannot be found by the OSS tool automatically.
"name": "time-rs/time",
"fullLicenseText": [
"Copyright (c) 2019 Jacob Pratt",
"",
"Permission is hereby granted, free of charge, to any person obtaining a copy",
"of this software and associated documentation files (the \"Software\"), to deal",
"in the Software without restriction, including without limitation the rights",
"to use, copy, modify, merge, publish, distribute, sublicense, and/or sell",
"copies of the Software, and to permit persons to whom the Software is",
"furnished to do so, subject to the following conditions:",
"",
"The above copyright notice and this permission notice shall be included in all",
"copies or substantial portions of the Software.",
"",
"THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR",
"IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,",
"FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE",
"AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER",
"LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,",
"OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE",
"SOFTWARE."
]
"prependLicenseText": ["Copyright released to public domain"]
}
]


@@ -60,12 +60,12 @@
"git": {
"name": "electron",
"repositoryUrl": "https://github.com/electron/electron",
"commitHash": "5f93e889020d279d5a9cd1ecab080ab467312447"
"commitHash": "0552e0d5de46ffa3b481d741f1db5c779e201565"
}
},
"isOnlyProductionDependency": true,
"license": "MIT",
"version": "7.3.2"
"version": "7.2.4"
},
{
"component": {
@@ -77,40 +77,6 @@
}
},
"isOnlyProductionDependency": true,
"licenseDetail": [
"Inno Setup License",
"==================",
"",
"Except where otherwise noted, all of the documentation and software included in the Inno Setup",
"package is copyrighted by Jordan Russell.",
"",
"Copyright (C) 1997-2020 Jordan Russell. All rights reserved.",
"Portions Copyright (C) 2000-2020 Martijn Laan. All rights reserved.",
"",
"This software is provided \"as-is,\" without any express or implied warranty. In no event shall the",
"author be held liable for any damages arising from the use of this software.",
"",
"Permission is granted to anyone to use this software for any purpose, including commercial",
"applications, and to alter and redistribute it, provided that the following conditions are met:",
"",
"1. All redistributions of source code files must retain all copyright notices that are currently in",
" place, and this list of conditions without modification.",
"",
"2. All redistributions in binary form must retain all occurrences of the above copyright notice and",
" web site addresses that are currently in place (for example, in the About boxes).",
"",
"3. The origin of this software must not be misrepresented; you must not claim that you wrote the",
" original software. If you use this software to distribute a product, an acknowledgment in the",
" product documentation would be appreciated but is not required.",
"",
"4. Modified versions in source or binary form must be plainly marked as such, and must not be",
" misrepresented as being the original software.",
"",
"",
"Jordan Russell",
"jr-2010 AT jrsoftware.org",
"https://jrsoftware.org/"
],
"version": "5.5.6"
},
{
@@ -533,7 +499,7 @@
"git": {
"name": "ripgrep",
"repositoryUrl": "https://github.com/BurntSushi/ripgrep",
"commitHash": "973de50c9ef451da2cfcdfa86f2b2711d8d6ff48"
"commitHash": "8a7db1a918e969b85cd933d8ed9fa5285b281ba4"
}
},
"isOnlyProductionDependency": true,


@@ -29,4 +29,4 @@ The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.micro
Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the [Source EULA](https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/LICENSE.txt).
Licensed under the [Source EULA](https://raw.githubusercontent.com/Microsoft/azuredatastudio/master/LICENSE.txt).


@@ -31,4 +31,4 @@ The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.micro
Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the [Source EULA](https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/LICENSE.txt).
Licensed under the [Source EULA](https://raw.githubusercontent.com/Microsoft/azuredatastudio/master/LICENSE.txt).


@@ -4,8 +4,7 @@
"relativeCoverageDir": "../../coverage",
"ignorePatterns": [
"**/node_modules/**",
"**/test/**",
"main.js"
"**/test/**"
],
"reports": [
"cobertura",


@@ -5,7 +5,7 @@
"version": "0.1.0",
"publisher": "Microsoft",
"preview": true,
"license": "https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/extensions/admin-tool-ext-win/license/Azure%20Data%20Studio%20Extension%20-%20Standalone%20(free)%20Use%20Terms.txt",
"license": "https://raw.githubusercontent.com/Microsoft/azuredatastudio/master/extensions/admin-tool-ext-win/license/Azure%20Data%20Studio%20Extension%20-%20Standalone%20(free)%20Use%20Terms.txt",
"icon": "images/extension.png",
"aiKey": "AIF-37eefaf0-8022-4671-a3fb-64752724682e",
"engines": {
@@ -74,7 +74,7 @@
]
},
"dependencies": {
"ads-extension-telemetry": "^1.0.0",
"ads-extension-telemetry": "github:Charles-Gagnon/ads-extension-telemetry#0.1.0",
"service-downloader": "0.2.1",
"vscode-nls": "^3.2.1"
},


@@ -8,7 +8,7 @@ import * as path from 'path';
import * as azdata from 'azdata';
import * as vscode from 'vscode';
import { TelemetryReporter, TelemetryViews } from './telemetry';
import { getTelemetryErrorType, buildSsmsMinCommandArgs, buildUrn, LaunchSsmsDialogParams, nodeTypeToUrnNameMapping } from './utils';
import { doubleEscapeSingleQuotes, backEscapeDoubleQuotes, getTelemetryErrorType } from './utils';
import { ChildProcess, exec } from 'child_process';
import { promises as fs } from 'fs';
@@ -17,6 +17,53 @@ const localize = nls.loadMessageBundle();
let exePath: string;
const runningProcesses: Map<number, ChildProcess> = new Map<number, ChildProcess>();
interface SmoMapping {
action: string;
urnName: string;
}
const nodeTypeToUrnNameMapping: { [oeNodeType: string]: SmoMapping } = {
'Database': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Database', urnName: 'Database' },
'Server': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Server', urnName: 'Server' },
'ServerLevelServerAudit': { action: 'sqla:AuditProperties', urnName: 'Audit' },
'ServerLevelCredential': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Credential', urnName: 'Credential' },
'ServerLevelServerRole': { action: 'sqla:ManageServerRole', urnName: 'Role' },
'ServerLevelServerAuditSpecification': { action: 'sqla:ServerAuditSpecificationProperties', urnName: 'ServerAuditSpecification' },
'ServerLevelLinkedServer': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.LinkedServer', urnName: 'LinkedServer' },
'Table': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Table', urnName: 'Table' },
'View': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.View', urnName: 'View' },
'Column': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Column', urnName: 'Column' },
'Index': { action: 'sqla:IndexProperties', urnName: 'Index' },
'Statistic': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Statistic', urnName: 'Statistic' },
'StoredProcedure': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.StoredProcedure', urnName: 'StoredProcedure' },
'ScalarValuedFunction': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedFunction', urnName: 'UserDefinedFunction' },
'TableValuedFunction': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedFunction', urnName: 'UserDefinedFunction' },
'AggregateFunction': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedFunction', urnName: 'UserDefinedFunction' },
'Synonym': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Synonym', urnName: 'Synonym' },
'Assembly': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.SqlAssembly', urnName: 'SqlAssembly' },
'UserDefinedDataType': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedDataType', urnName: 'UserDefinedDataType' },
'UserDefinedType': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedType', urnName: 'UserDefinedType' },
'UserDefinedTableType': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedTableType', urnName: 'UserDefinedTableType' },
'Sequence': { action: 'sqla:SequenceProperties', urnName: 'Sequence' },
'User': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.User', urnName: 'User' },
'DatabaseRole': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.DatabaseRole', urnName: 'Role' },
'ApplicationRole': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.ApplicationRole', urnName: 'ApplicationRole' },
'Schema': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Schema', urnName: 'Schema' },
'SecurityPolicy': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.SecurityPolicy', urnName: 'SecurityPolicy' },
'ServerLevelLogin': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Login', urnName: 'Login' },
};
// Params to pass to SsmsMin.exe; only an action and server are required - the rest are optional based on the
// action used. Exported for use in testing.
export interface LaunchSsmsDialogParams {
action: string;
server: string;
database?: string;
user?: string;
useAad?: boolean;
urn?: string;
}
export async function activate(context: vscode.ExtensionContext): Promise<void> {
// This is for Windows-specific support so do nothing on other platforms
if (process.platform === 'win32') {
@@ -178,3 +225,41 @@ async function launchSsmsDialog(action: string, connectionContext: azdata.Object
runningProcesses.set(proc.pid, proc);
}
/**
* Builds the command arguments to pass to SsmsMin.exe. Values are expected to already be escaped
* correctly for their usage - they will be further escaped as needed for command-line usage, but no
* additional escaping will occur.
* @param params The params used to build up the command parameter string
*/
export function buildSsmsMinCommandArgs(params: LaunchSsmsDialogParams): string {
return `-a "${backEscapeDoubleQuotes(params.action)}" \
-S "${backEscapeDoubleQuotes(params.server)}"\
${params.database ? ' -D "' + backEscapeDoubleQuotes(params.database) + '"' : ''}\
${params.user ? ' -U "' + backEscapeDoubleQuotes(params.user) + '"' : ''}\
${params.useAad === true ? ' -G' : ''}\
${params.urn ? ' -u "' + backEscapeDoubleQuotes(params.urn) + '"' : ''}`;
}
/**
* Builds the URN string for a given ObjectExplorerNode in the form understood by SsmsMin
* @param node The node to get the URN of
*/
export async function buildUrn(node: azdata.objectexplorer.ObjectExplorerNode): Promise<string> {
let urnNodes: string[] = [];
while (node) {
// Server is special since it's a connection node - always add it as the root
if (node.nodeType === 'Server') {
break;
}
else if (node.metadata && node.nodeType !== 'Folder') {
// SFC URN expects Name and Schema to be separate properties
const urnSegment = node.metadata.schema && node.metadata.schema !== '' ?
`${nodeTypeToUrnNameMapping[node.nodeType].urnName}[@Name='${doubleEscapeSingleQuotes(node.metadata.name)}' and @Schema='${doubleEscapeSingleQuotes(node.metadata.schema)}']` :
`${nodeTypeToUrnNameMapping[node.nodeType].urnName}[@Name='${doubleEscapeSingleQuotes(node.metadata.name)}']`;
urnNodes = [urnSegment].concat(urnNodes);
}
node = await node.getParent();
}
return ['Server'].concat(urnNodes).join('/');
}
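Similarly, a minimal hedged sketch of buildUrn (again not from the diff): assuming tableNode is an Object Explorer node for a table dbo.Person inside an AdventureWorks database, walking up to the Server node would be expected to produce a URN of the shape shown in the comment.
async function exampleUrn(tableNode: azdata.objectexplorer.ObjectExplorerNode): Promise<string> {
	// buildUrn walks tableNode -> Database -> Server via getParent() and joins the URN segments.
	const urn = await buildUrn(tableNode);
	// Expected shape: Server/Database[@Name='AdventureWorks']/Table[@Name='Person' and @Schema='dbo']
	return urn;
}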


@@ -5,7 +5,9 @@
import * as should from 'should';
import 'mocha';
import { doubleEscapeSingleQuotes, backEscapeDoubleQuotes, getTelemetryErrorType, buildSsmsMinCommandArgs, buildUrn, LaunchSsmsDialogParams } from '../utils';
import { buildSsmsMinCommandArgs, buildUrn, LaunchSsmsDialogParams } from '../main';
import { doubleEscapeSingleQuotes, backEscapeDoubleQuotes, getTelemetryErrorType } from '../utils';
import { ExtHostObjectExplorerNodeStub } from './stubs';
describe('buildSsmsMinCommandArgs Method Tests', () => {


@@ -3,8 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdata from 'azdata';
export interface IPackageInfo {
name: string;
version: string;
@@ -58,90 +56,3 @@ export function getTelemetryErrorType(msg: string): string {
return 'Other';
}
}
// Params to pass to SsmsMin.exe; only an action and server are required - the rest are optional based on the
// action used. Exported for use in testing.
export interface LaunchSsmsDialogParams {
action: string;
server: string;
database?: string;
user?: string;
useAad?: boolean;
urn?: string;
}
/**
* Builds the command arguments to pass to SsmsMin.exe. Values are expected to already be escaped
* correctly for their usage - they will be further escaped as needed for command-line usage, but no
* additional escaping will occur.
* @param params The params used to build up the command parameter string
*/
export function buildSsmsMinCommandArgs(params: LaunchSsmsDialogParams): string {
return `-a "${backEscapeDoubleQuotes(params.action)}" \
-S "${backEscapeDoubleQuotes(params.server)}"\
${params.database ? ' -D "' + backEscapeDoubleQuotes(params.database) + '"' : ''}\
${params.user ? ' -U "' + backEscapeDoubleQuotes(params.user) + '"' : ''}\
${params.useAad === true ? ' -G' : ''}\
${params.urn ? ' -u "' + backEscapeDoubleQuotes(params.urn) + '"' : ''}`;
}
interface SmoMapping {
action: string;
urnName: string;
}
export const nodeTypeToUrnNameMapping: { [oeNodeType: string]: SmoMapping } = {
'Database': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Database', urnName: 'Database' },
'Server': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Server', urnName: 'Server' },
'ServerLevelServerAudit': { action: 'sqla:AuditProperties', urnName: 'Audit' },
'ServerLevelCredential': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Credential', urnName: 'Credential' },
'ServerLevelServerRole': { action: 'sqla:ManageServerRole', urnName: 'Role' },
'ServerLevelServerAuditSpecification': { action: 'sqla:ServerAuditSpecificationProperties', urnName: 'ServerAuditSpecification' },
'ServerLevelLinkedServer': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.LinkedServer', urnName: 'LinkedServer' },
'Table': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Table', urnName: 'Table' },
'View': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.View', urnName: 'View' },
'Column': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Column', urnName: 'Column' },
'Index': { action: 'sqla:IndexProperties', urnName: 'Index' },
'Statistic': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Statistic', urnName: 'Statistic' },
'StoredProcedure': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.StoredProcedure', urnName: 'StoredProcedure' },
'ScalarValuedFunction': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedFunction', urnName: 'UserDefinedFunction' },
'TableValuedFunction': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedFunction', urnName: 'UserDefinedFunction' },
'AggregateFunction': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedFunction', urnName: 'UserDefinedFunction' },
'Synonym': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Synonym', urnName: 'Synonym' },
'Assembly': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.SqlAssembly', urnName: 'SqlAssembly' },
'UserDefinedDataType': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedDataType', urnName: 'UserDefinedDataType' },
'UserDefinedType': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedType', urnName: 'UserDefinedType' },
'UserDefinedTableType': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.UserDefinedTableType', urnName: 'UserDefinedTableType' },
'Sequence': { action: 'sqla:SequenceProperties', urnName: 'Sequence' },
'User': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.User', urnName: 'User' },
'DatabaseRole': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.DatabaseRole', urnName: 'Role' },
'ApplicationRole': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.ApplicationRole', urnName: 'ApplicationRole' },
'Schema': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Schema', urnName: 'Schema' },
'SecurityPolicy': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.SecurityPolicy', urnName: 'SecurityPolicy' },
'ServerLevelLogin': { action: 'sqla:Properties@Microsoft.SqlServer.Management.Smo.Login', urnName: 'Login' },
};
/**
* Builds the URN string for a given ObjectExplorerNode in the form understood by SsmsMin
* @param node The node to get the URN of
*/
export async function buildUrn(node: azdata.objectexplorer.ObjectExplorerNode): Promise<string> {
let urnNodes: string[] = [];
while (node) {
// Server is special since it's a connection node - always add it as the root
if (node.nodeType === 'Server') {
break;
}
else if (node.metadata && node.nodeType !== 'Folder') {
// SFC URN expects Name and Schema to be separate properties
const urnSegment = node.metadata.schema && node.metadata.schema !== '' ?
`${nodeTypeToUrnNameMapping[node.nodeType].urnName}[@Name='${doubleEscapeSingleQuotes(node.metadata.name)}' and @Schema='${doubleEscapeSingleQuotes(node.metadata.schema)}']` :
`${nodeTypeToUrnNameMapping[node.nodeType].urnName}[@Name='${doubleEscapeSingleQuotes(node.metadata.name)}']`;
urnNodes = [urnSegment].concat(urnNodes);
}
node = await node.getParent();
}
return ['Server'].concat(urnNodes).join('/');
}


@@ -192,12 +192,11 @@
resolved "https://registry.yarnpkg.com/@types/node/-/node-12.12.7.tgz#01e4ea724d9e3bd50d90c11fd5980ba317d8fa11"
integrity sha512-E6Zn0rffhgd130zbCbAr/JdXfXkoOUFAKNs/rF8qnafSJ8KYaA/j3oz7dcwal+lYjLA7xvdd5J4wdYpCTlP8+w==
ads-extension-telemetry@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/ads-extension-telemetry/-/ads-extension-telemetry-1.0.0.tgz#840b363a6ad958447819b9bc59fdad3e49de31a9"
integrity sha512-ouxZVECe4tsO0ek0dLdnAZEz1Lrytv1uLbbGZhRbZsHITsUYNjnkKnA471uWh0Dj80s+orvv49/j3/tNBDP/SQ==
"ads-extension-telemetry@github:Charles-Gagnon/ads-extension-telemetry#0.1.0":
version "0.1.0"
resolved "https://codeload.github.com/Charles-Gagnon/ads-extension-telemetry/tar.gz/70c2fea10e9ff6e329c4c5ec0b77017ada514b6d"
dependencies:
vscode-extension-telemetry "0.1.2"
vscode-extension-telemetry "0.1.1"
agent-base@4, agent-base@^4.3.0:
version "4.3.0"
@@ -225,30 +224,14 @@ append-transform@^2.0.0:
dependencies:
default-require-extensions "^3.0.0"
applicationinsights@1.4.0:
version "1.4.0"
resolved "https://registry.yarnpkg.com/applicationinsights/-/applicationinsights-1.4.0.tgz#e17e436427b6e273291055181e29832cca978644"
integrity sha512-TV8MYb0Kw9uE2cdu4V/UvTKdOABkX2+Fga9iDz0zqV7FLrNXfmAugWZmmdTx4JoynYkln3d5CUHY3oVSUEbfFw==
applicationinsights@1.0.8:
version "1.0.8"
resolved "https://registry.yarnpkg.com/applicationinsights/-/applicationinsights-1.0.8.tgz#db6e3d983cf9f9405fe1ee5ba30ac6e1914537b5"
integrity sha512-KzOOGdphOS/lXWMFZe5440LUdFbrLpMvh2SaRxn7BmiI550KAoSb2gIhiq6kJZ9Ir3AxRRztjhzif+e5P5IXIg==
dependencies:
cls-hooked "^4.2.2"
continuation-local-storage "^3.2.1"
diagnostic-channel "0.2.0"
diagnostic-channel-publishers "^0.3.2"
async-hook-jl@^1.7.6:
version "1.7.6"
resolved "https://registry.yarnpkg.com/async-hook-jl/-/async-hook-jl-1.7.6.tgz#4fd25c2f864dbaf279c610d73bf97b1b28595e68"
integrity sha512-gFaHkFfSxTjvoxDMYqDuGHlcRyUuamF8s+ZTtJdDzqjws4mCt7v0vuV79/E2Wr2/riMQgtG4/yUtXWs1gZ7JMg==
dependencies:
stack-chain "^1.3.7"
async-listener@^0.6.0:
version "0.6.10"
resolved "https://registry.yarnpkg.com/async-listener/-/async-listener-0.6.10.tgz#a7c97abe570ba602d782273c0de60a51e3e17cbc"
integrity sha512-gpuo6xOyF4D5DE5WvyqZdPA3NGhiT6Qf07l7DCB0wwDEsLvDIbCr6j9S5aj5Ch96dLace5tXVzWBZkxU/c5ohw==
dependencies:
semver "^5.3.0"
shimmer "^1.1.0"
diagnostic-channel-publishers "0.2.1"
zone.js "0.7.6"
async-retry@^1.2.3:
version "1.3.1"
@@ -304,15 +287,6 @@ chownr@^1.1.3:
resolved "https://registry.yarnpkg.com/chownr/-/chownr-1.1.4.tgz#6fc9d7b42d32a583596337666e7d08084da2cc6b"
integrity sha512-jJ0bqzaylmJtVnNgzTeSOs8DPavpbYgEr/b0YL8/2GO3xJEhInFmhKMUnEJQjZumK7KXGFhUy89PrsJWlakBVg==
cls-hooked@^4.2.2:
version "4.2.2"
resolved "https://registry.yarnpkg.com/cls-hooked/-/cls-hooked-4.2.2.tgz#ad2e9a4092680cdaffeb2d3551da0e225eae1908"
integrity sha512-J4Xj5f5wq/4jAvcdgoGsL3G103BtWpZrMo8NEinRltN+xpTZdI+M38pyQqhuFU/P792xkMFvnKSf+Lm81U1bxw==
dependencies:
async-hook-jl "^1.7.6"
emitter-listener "^1.0.1"
semver "^5.4.1"
color-convert@^1.9.0:
version "1.9.3"
resolved "https://registry.yarnpkg.com/color-convert/-/color-convert-1.9.3.tgz#bb71850690e1f136567de629d2d5471deda4c1e8"
@@ -335,14 +309,6 @@ concat-map@0.0.1:
resolved "https://registry.yarnpkg.com/concat-map/-/concat-map-0.0.1.tgz#d8a96bd77fd68df7793a73036a3ba0d5405d477b"
integrity sha1-2Klr13/Wjfd5OnMDajug1UBdR3s=
continuation-local-storage@^3.2.1:
version "3.2.1"
resolved "https://registry.yarnpkg.com/continuation-local-storage/-/continuation-local-storage-3.2.1.tgz#11f613f74e914fe9b34c92ad2d28fe6ae1db7ffb"
integrity sha512-jx44cconVqkCEEyLSKWwkvUXwO561jXMa3LPjTPsm5QR22PA0/mhe33FT4Xb5y74JDvt/Cq+5lm8S8rskLv9ZA==
dependencies:
async-listener "^0.6.0"
emitter-listener "^1.1.1"
convert-source-map@^1.7.0:
version "1.7.0"
resolved "https://registry.yarnpkg.com/convert-source-map/-/convert-source-map-1.7.0.tgz#17a2cb882d7f77d3490585e2ce6c524424a3a442"
@@ -397,10 +363,10 @@ default-require-extensions@^3.0.0:
dependencies:
strip-bom "^4.0.0"
diagnostic-channel-publishers@^0.3.2:
version "0.3.5"
resolved "https://registry.yarnpkg.com/diagnostic-channel-publishers/-/diagnostic-channel-publishers-0.3.5.tgz#a84a05fd6cc1d7619fdd17791c17e540119a7536"
integrity sha512-AOIjw4T7Nxl0G2BoBPhkQ6i7T4bUd9+xvdYizwvG7vVAM1dvr+SDrcUudlmzwH0kbEwdR2V1EcnKT0wAeYLQNQ==
diagnostic-channel-publishers@0.2.1:
version "0.2.1"
resolved "https://registry.yarnpkg.com/diagnostic-channel-publishers/-/diagnostic-channel-publishers-0.2.1.tgz#8e2d607a8b6d79fe880b548bc58cc6beb288c4f3"
integrity sha1-ji1geottef6IC1SLxYzGvrKIxPM=
diagnostic-channel@0.2.0:
version "0.2.0"
@@ -414,13 +380,6 @@ diff@3.5.0:
resolved "https://registry.yarnpkg.com/diff/-/diff-3.5.0.tgz#800c0dd1e0a8bfbc95835c202ad220fe317e5a12"
integrity sha512-A46qtFgd+g7pDZinpnwiRJtxbC1hpgf0uzP3iG89scHk0AUC7A1TGxf5OiiOUv/JMZR8GOt8hL900hV0bOy5xA==
emitter-listener@^1.0.1, emitter-listener@^1.1.1:
version "1.1.2"
resolved "https://registry.yarnpkg.com/emitter-listener/-/emitter-listener-1.1.2.tgz#56b140e8f6992375b3d7cb2cab1cc7432d9632e8"
integrity sha512-Bt1sBAGFHY9DKY+4/2cV6izcKJUf5T7/gkdmkxzX/qv9CcGH8xSwVRW5mtX03SWJtRTWSOpzCuWN9rBFYZepZQ==
dependencies:
shimmer "^1.2.0"
es6-promise@^4.0.3:
version "4.2.8"
resolved "https://registry.yarnpkg.com/es6-promise/-/es6-promise-4.2.8.tgz#4eb21594c972bc40553d276e510539143db53e0a"
@@ -853,11 +812,6 @@ service-downloader@0.2.1:
tmp "^0.0.33"
yauzl "^2.10.0"
shimmer@^1.1.0, shimmer@^1.2.0:
version "1.2.1"
resolved "https://registry.yarnpkg.com/shimmer/-/shimmer-1.2.1.tgz#610859f7de327b587efebf501fb43117f9aff337"
integrity sha512-sQTKC1Re/rM6XyFM6fIAGHRPVGvyXfgzIDvzoq608vM+jeyVD0Tu1E6Np0Kc2zAIFWIj963V2800iF/9LPieQw==
should-equal@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/should-equal/-/should-equal-2.0.0.tgz#6072cf83047360867e68e98b09d71143d04ee0c3"
@@ -912,11 +866,6 @@ source-map@^0.6.1:
resolved "https://registry.yarnpkg.com/source-map/-/source-map-0.6.1.tgz#74722af32e9614e9c287a8d0bbde48b5e2f1a263"
integrity sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==
stack-chain@^1.3.7:
version "1.3.7"
resolved "https://registry.yarnpkg.com/stack-chain/-/stack-chain-1.3.7.tgz#d192c9ff4ea6a22c94c4dd459171e3f00cea1285"
integrity sha1-0ZLJ/06moiyUxN1FkXHj8AzqEoU=
strip-ansi@^4.0.0:
version "4.0.0"
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-4.0.0.tgz#a8479022eb1ac368a871389b635262c505ee368f"
@@ -974,12 +923,12 @@ to-fast-properties@^2.0.0:
resolved "https://registry.yarnpkg.com/to-fast-properties/-/to-fast-properties-2.0.0.tgz#dc5e698cbd079265bc73e0377681a4e4e83f616e"
integrity sha1-3F5pjL0HkmW8c+A3doGk5Og/YW4=
vscode-extension-telemetry@0.1.2:
version "0.1.2"
resolved "https://registry.yarnpkg.com/vscode-extension-telemetry/-/vscode-extension-telemetry-0.1.2.tgz#049207f5453930888ff68ca925b07bab08f2c955"
integrity sha512-FSbaZKlIH3VKvBJsKw7v5bESWHXzltji2rtjaJeJglpQH4tfClzwHMzlMXUZGiblV++djEzb1gW8mb5E+wxFsg==
vscode-extension-telemetry@0.1.1:
version "0.1.1"
resolved "https://registry.yarnpkg.com/vscode-extension-telemetry/-/vscode-extension-telemetry-0.1.1.tgz#91387e06b33400c57abd48979b0e790415ae110b"
integrity sha512-TkKKG/B/J94DP5qf6xWB4YaqlhWDg6zbbqVx7Bz//stLQNnfE9XS1xm3f6fl24c5+bnEK0/wHgMgZYKIKxPeUA==
dependencies:
applicationinsights "1.4.0"
applicationinsights "1.0.8"
vscode-nls@^3.2.1:
version "3.2.5"
@@ -1023,3 +972,8 @@ yauzl@^2.10.0:
dependencies:
buffer-crc32 "~0.2.3"
fd-slicer "~1.1.0"
zone.js@0.7.6:
version "0.7.6"
resolved "https://registry.yarnpkg.com/zone.js/-/zone.js-0.7.6.tgz#fbbc39d3e0261d0986f1ba06306eb3aeb0d22009"
integrity sha1-+7w50+AmHQmG8boGMG6zrrDSIAk=


@@ -23,4 +23,4 @@ The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.micro
Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the [Source EULA](https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/LICENSE.txt).
Licensed under the [Source EULA](https://raw.githubusercontent.com/Microsoft/azuredatastudio/master/LICENSE.txt).


@@ -5,7 +5,7 @@
"version": "0.48.0",
"publisher": "Microsoft",
"preview": true,
"license": "https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/LICENSE.txt",
"license": "https://raw.githubusercontent.com/Microsoft/azuredatastudio/master/LICENSE.txt",
"icon": "images/extension.png",
"aiKey": "AIF-37eefaf0-8022-4671-a3fb-64752724682e",
"engines": {


@@ -4,6 +4,3 @@ out/**
extension.webpack.config.js
tsconfig.json
yarn.lock
coverConfig.json
*.vsix
coverage


@@ -2,42 +2,6 @@
Welcome to the Microsoft Azure Arc Extension for Azure Data Studio!
**This extension is only applicable to customers in the Azure Arc data services private preview.**
## Overview
This extension adds the following features to Azure Data Studio.
### Deployment Wizards
A GUI-based experience to deploy an Azure Arc data controller, as well as resources on an existing Azure Arc data controller. Current list of supported resources:
* SQL Managed Instance
* PostgreSQL server groups
### Management Dashboards
After connecting to an existing Azure Arc data controller in the **Azure Arc Controllers** view of the **Connections** viewlet, a list of the active resources registered to that controller is shown, which allows launching a dashboard for further management capabilities.
## Usage Guide
### Deployment Wizards
* After installing this extension, open the **Connections** viewlet
* Click on '...' in the top right corner of the viewlet panel and click on 'New Deployment...'.
* This opens a dialog box that shows several deployment tiles. This extension adds tiles for the supported resources listed above.
* Click on that tile, accept any license agreements, and choose the appropriate 'Resource Type' to install
* A required tools check will run; if any required tools are missing, instructions will be given for installing them
* Once the check has passed successfully, click the **Select** button
* This opens a new dialog where you can enter input parameters specific to the selected deployment and then open a notebook that does the actual deployment.
### Controller View/Management Dashboards
* The **Azure Arc Controllers** view can be found in the **Connections** viewlet
* You can create a new controller by clicking the + button in the view title bar. This will launch the deployment wizard detailed above
* You can connect to an existing controller by clicking the plug button in the view title bar or the **Connect Controller** button in the view area when no controllers are registered
* You will then be prompted for the connection information for the controller; if the connection succeeds, a node will be added to the tree for that controller
* Right-click on one of the nodes to launch a dashboard for that resource - either the Data Controller or an instance deployed on it
## Code of Conduct
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
@@ -50,4 +14,4 @@ The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.micro
Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the [Source EULA](https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/LICENSE.txt).
Licensed under the [Source EULA](https://raw.githubusercontent.com/Microsoft/azuredatastudio/master/LICENSE.txt).


@@ -1,23 +0,0 @@
{
"enabled": true,
"relativeSourcePath": "..",
"relativeCoverageDir": "../../coverage",
"ignorePatterns": [
"**/generated/**",
"**/node_modules/**",
"**/test/**",
"constants.js",
"localizedConstants.js",
"extension.js"
],
"reports": [
"cobertura",
"lcov",
"json"
],
"verbose": false,
"remapOptions": {
"basePath": "..",
"useAbsolutePaths": true
}
}


@@ -0,0 +1 @@
<svg width="26" height="26" xmlns="http://www.w3.org/2000/svg"> <g id="svg_1"> <polygon transform="rotate(90 12.967000007629395,8.000000000000002) " fill="#ffffff" points="7.549000024795532,-4.9200000166893005 5.464999999850988,-2.8359999656677246 16.300999641418457,8 5.464999999850988,18.836000442504883 7.549000024795532,20.920000076293945 20.4689998626709,8 " id="svg_2"/> <polygon transform="rotate(90 12.96700096130371,18.90800476074219) " fill="#ffffff" points="7.5490007400512695,5.988004416227341 5.465001106262207,8.07200500369072 16.301002502441406,18.908005446195602 5.465001106262207,29.744003981351852 7.5490007400512695,31.828003615140915 20.46900177001953,18.908005446195602 " id="svg_3"/> </g> </svg>


Some files were not shown because too many files have changed in this diff.