Insights generator project (#1066)

* InsightsGenerator project template files

* Add insights projects to SLN

* Setting up siggen class (#1003)

* Setting up siggen class

* fixed the learn method

* Refactoring code
Fixed compile errors

* renamed results to result

* Basic transformation logic (#1004)

* Fix a couple bugs and add a simple test (#1007)

* Fix a couple bugs and add a simple test

* More tests and bug fix

* Nara/workflow (#1006)

* added a queue processor

* ordered using statements

* Armemon/analytics (#1008)

* Basic transformation logic

* changed some structure of siggen

* added sum and average methods, as well as select rows by input name

* add insights to results

* min, max added

Co-authored-by: Karl Burtram <karlb@microsoft.com>
Co-authored-by: Aasim Khan <aasimkhan30@gmail.com>
Co-authored-by: Arslan Memon <armemon@microsoft.com>

* Added rules engine base implementation (#1005)

* Added rules engine base implementation

* update comments

* addressing comments

* adding template text to columnheaders object

* adding template text to columnheaders object

* fixing columnheaders class

* Added test

* Added Template Parser unit test in Test project

* Deleted unnecessary files and reverted the files that were modified by mistake

Co-authored-by: Jinjing Arima <jiarima@microsoft.com>

* Insights generator message handler placeholder (#1013)

* Aasim/insights/insight methods (#1014)

* Basic transformation logic

* changed some structure of siggen

* Added top and bottom insight functions

* Added top, bottom insights
Added tests for top, bottom insights

* Armemon/insights2 (#1011)

* max and min insightsperslice, and tests

* got rid of unnecessary function

* get indexes

Co-authored-by: Arslan Memon <armemon@microsoft.com>

* Armemon/insights2 (#1012)

* max and min insightsperslice, and tests

* got rid of unnecessary function

* get indexes

* learn for string input type

* add learn implementation

Co-authored-by: Arslan Memon <armemon@microsoft.com>

* Added Tests
Removed duplicate methods

Co-authored-by: Karl Burtram <karlb@microsoft.com>
Co-authored-by: arslan9955 <53170027+arslan9955@users.noreply.github.com>
Co-authored-by: Arslan Memon <armemon@microsoft.com>

* Armemon/insights2 (#1016)

* Basic transformation logic

* changed some structure of siggen

* Added top and bottom insight functions

* Added top, bottom insights
Added tests for top, bottom insights

* max and min insightsperslice, and tests

* got rid of unnecessary function

* get indexes

* learn for string input type

* add learn implementation

* add unique inputs

* fix merge error

* add to result

Co-authored-by: Karl Burtram <karlb@microsoft.com>
Co-authored-by: Aasim Khan <aasimkhan30@gmail.com>
Co-authored-by: Arslan Memon <armemon@microsoft.com>

* Added all the templates (#1015)

* Added all the templates
Added a method to find matched template

* Added a function to replace # and ## values in a template

* Added ReplaceHashesInTemplate call

* Added comments

* Updated the template txt

* Updated GetTopHeadersWithHash function to add #toplist

* Updated the logic per offline discussion with Hermineh

* Update request handler contract to take array (#1020)

* added rulesengine findmatchingtemplate (#1019)

* Add support for getting DacFx deploy options from a publish profile (#995)

* add support for getting options from a publish profile

* update comments

* set values for default options if they aren't specified in the publish profile

* addressing comments

* Updating to latest DacFx for a bug fix (#1010)

* added rulesengine findmatchingtemplate

* Update DacFx deploy and generate script with options (#998)

* update deploy and generate script to accept deployment options

* add tests

* add test with option set to true

* merge

* merge

* incorporated FindMatchedTemplate

Co-authored-by: Kim Santiago <31145923+kisantia@users.noreply.github.com>
Co-authored-by: Udeesha Gautam <46980425+udeeshagautam@users.noreply.github.com>

* Added logic for Insights Generator Service Handler (#1017)

* Added logic for Insights Generator Service Handler

* Fixed some logic

* Adding workflow test

* Update transform and add tests (#1024)

* Jiarima/fix rules engine logic (#1021)

* Added all the templates
Added a method to find matched template

* Added a function to replace # and ## values in a template

* Added ReplaceHashesInTemplate call

* Added comments

* Updated the template txt

* Updated GetTopHeadersWithHash function to add #toplist

* Updated the logic per offline discussion with Hermineh

* Update with the fixes

* Updated template and foreach conditions

* Added distinct

* Updated tests according to the logic change (#1026)

* Nara/remove queing (#1023)

* loc update (#914)

* loc update

* loc updates

* Add support for getting DacFx deploy options from a publish profile (#995)

* add support for getting options from a publish profile

* update comments

* set values for default options if they aren't specified in the publish profile

* addressing comments

* Updating to latest DacFx for a bug fix (#1010)

* Update DacFx deploy and generate script with options (#998)

* update deploy and generate script to accept deployment options

* add tests

* add test with option set to true

* intermediate check in for merge, transformed not working

* intermediate check in for merge, transformed not working

* added test case

* merged

Co-authored-by: khoiph1 <khoiph@microsoft.com>
Co-authored-by: Kim Santiago <31145923+kisantia@users.noreply.github.com>
Co-authored-by: Udeesha Gautam <46980425+udeeshagautam@users.noreply.github.com>

* Output data types from transform (#1029)

* Fix bug process input_g (#1030)

* Fixed the insight generator service (#1028)

* Jiarima/added more testings (#1031)

* Added another test
Updated ReplaceHashesInTemplate function to return string instead of Template

* Added third test

* Merged

* Reverted the workflow file to match with the one in hack/insights

* Bugs fixes to hook insights up to ADS (#1033)

* Bug fixes for hack insights (#1032)

* Fixed the minColumn index bug in Data Transformation
Fixed the template matching logic.

* Adding changes from PR

* Try to fix Workflow tests

* Readd workflow tests

* Fix template load location

Co-authored-by: Aasim Khan <aasimkhan30@gmail.com>
Co-authored-by: Nara <NaraVen@users.noreply.github.com>
Co-authored-by: arslan9955 <53170027+arslan9955@users.noreply.github.com>
Co-authored-by: Arslan Memon <armemon@microsoft.com>
Co-authored-by: gadudhbh <68879970+gadudhbh@users.noreply.github.com>
Co-authored-by: Jinjing Arima <jiarima@microsoft.com>
Co-authored-by: jiarima <68882862+jiarima@users.noreply.github.com>
Co-authored-by: Kim Santiago <31145923+kisantia@users.noreply.github.com>
Co-authored-by: Udeesha Gautam <46980425+udeeshagautam@users.noreply.github.com>
Co-authored-by: khoiph1 <khoiph@microsoft.com>
Author: Karl Burtram
Date: 2020-09-03 14:15:51 -07:00
Committed by: GitHub
Parent: 9784c3eaa2
Commit: 5cf5b59a0d
29 changed files with 2281 additions and 12 deletions

View File

@@ -114,6 +114,10 @@ Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "azure-pipelines", "azure-pi
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Microsoft.Kusto.ServiceLayer.UnitTests", "test\Microsoft.Kusto.ServiceLayer.UnitTests\Microsoft.Kusto.ServiceLayer.UnitTests.csproj", "{AFCDED82-B659-4BE1-86ED-0F4F8BC661AE}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Microsoft.InsightsGenerator", "src\Microsoft.InsightsGenerator\Microsoft.InsightsGenerator.csproj", "{7F2659DB-92C8-4823-AFB9-88BC1B6D959F}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Microsoft.InsightsGenerator.UnitTests", "test\Microsoft.InsightsGenerator.UnitTests\Microsoft.InsightsGenerator.UnitTests.csproj", "{BB7FF5B5-84E3-4F4B-A2A7-2CC4C75632E9}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Any CPU = Debug|Any CPU
@@ -265,6 +269,18 @@ Global
{AFCDED82-B659-4BE1-86ED-0F4F8BC661AE}.Integration|Any CPU.Build.0 = Debug|Any CPU
{AFCDED82-B659-4BE1-86ED-0F4F8BC661AE}.Release|Any CPU.ActiveCfg = Release|Any CPU
{AFCDED82-B659-4BE1-86ED-0F4F8BC661AE}.Release|Any CPU.Build.0 = Release|Any CPU
{7F2659DB-92C8-4823-AFB9-88BC1B6D959F}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{7F2659DB-92C8-4823-AFB9-88BC1B6D959F}.Debug|Any CPU.Build.0 = Debug|Any CPU
{7F2659DB-92C8-4823-AFB9-88BC1B6D959F}.Integration|Any CPU.ActiveCfg = Debug|Any CPU
{7F2659DB-92C8-4823-AFB9-88BC1B6D959F}.Integration|Any CPU.Build.0 = Debug|Any CPU
{7F2659DB-92C8-4823-AFB9-88BC1B6D959F}.Release|Any CPU.ActiveCfg = Release|Any CPU
{7F2659DB-92C8-4823-AFB9-88BC1B6D959F}.Release|Any CPU.Build.0 = Release|Any CPU
{BB7FF5B5-84E3-4F4B-A2A7-2CC4C75632E9}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{BB7FF5B5-84E3-4F4B-A2A7-2CC4C75632E9}.Debug|Any CPU.Build.0 = Debug|Any CPU
{BB7FF5B5-84E3-4F4B-A2A7-2CC4C75632E9}.Integration|Any CPU.ActiveCfg = Debug|Any CPU
{BB7FF5B5-84E3-4F4B-A2A7-2CC4C75632E9}.Integration|Any CPU.Build.0 = Debug|Any CPU
{BB7FF5B5-84E3-4F4B-A2A7-2CC4C75632E9}.Release|Any CPU.ActiveCfg = Release|Any CPU
{BB7FF5B5-84E3-4F4B-A2A7-2CC4C75632E9}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
@@ -296,6 +312,8 @@ Global
{0EC2B30C-0652-49AE-9594-85B3C3E9CA21} = {AB9CA2B8-6F70-431C-8A1D-67479D8A7BE4}
{E0C941C8-91F2-4BE1-8B79-AC88EDB78729} = {2BBD7364-054F-4693-97CD-1C395E3E84A9}
{AFCDED82-B659-4BE1-86ED-0F4F8BC661AE} = {AB9CA2B8-6F70-431C-8A1D-67479D8A7BE4}
{7F2659DB-92C8-4823-AFB9-88BC1B6D959F} = {2BBD7364-054F-4693-97CD-1C395E3E84A9}
{BB7FF5B5-84E3-4F4B-A2A7-2CC4C75632E9} = {AB9CA2B8-6F70-431C-8A1D-67479D8A7BE4}
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
SolutionGuid = {B31CDF4B-2851-45E5-8C5F-BE97125D9DD8}

View File

@@ -0,0 +1,26 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
namespace Microsoft.InsightsGenerator
{
public class DataArray
{
public enum DataType
{
Number,
String,
DateTime
}
public string[] ColumnNames { get; set; }
public string[] TransformedColumnNames { get; set; }
public DataType[] ColumnDataType { get; set; }
public object[][] Cells { get; set; }
}
}

View File

@@ -0,0 +1,164 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
namespace Microsoft.InsightsGenerator
{
public class DataTransformer
{
private class ColumnInfo
{
public int ColumnIndex { get; set; }
public int DistinctValues { get; set; }
public DataArray.DataType DataType { get; set; }
}
public DataArray Transform(DataArray array)
{
if (array == null || array.Cells == null || array.Cells.Length == 0)
{
return array;
}
DataArray.DataType[] columnDataType;
array.TransformedColumnNames = GetColumnLabels(array , out columnDataType);
array.ColumnDataType = columnDataType;
return array;
}
private string[] GetColumnLabels(DataArray array, out DataArray.DataType[] columnDataType)
{
columnDataType = new DataArray.DataType[array.ColumnNames.Length];
int columnCount = array.Cells[0].Length;
Dictionary<DataArray.DataType, List<ColumnInfo>> columnInfo = new Dictionary<DataArray.DataType, List<ColumnInfo>>();
for (int column = 0; column < columnCount; ++column)
{
int distinctValues;
DataArray.DataType dataType = GetColumnType(array, column, out distinctValues);
columnDataType[column] = dataType;
if (!columnInfo.ContainsKey(dataType))
{
columnInfo.Add(dataType, new List<ColumnInfo>());
}
columnInfo[dataType].Add(new ColumnInfo()
{
ColumnIndex = column,
DistinctValues = distinctValues,
DataType = dataType
});
}
bool containsDateTime = columnInfo.ContainsKey(DataArray.DataType.DateTime);
string[] labels = new string[columnCount];
if (containsDateTime)
{
List<ColumnInfo> dateColumns = columnInfo[DataArray.DataType.DateTime];
for (int i = 0; i < dateColumns.Count; ++i)
{
labels[dateColumns[i].ColumnIndex] = "input_t_" + i;
}
if (columnInfo.ContainsKey(DataArray.DataType.String))
{
List<ColumnInfo> stringColumns = columnInfo[DataArray.DataType.String];
for (int i = 0; i < stringColumns.Count; ++i)
{
labels[stringColumns[i].ColumnIndex] = "slicer_" + i;
}
}
}
else
{
if (columnInfo.ContainsKey(DataArray.DataType.String))
{
int maxDistinctValue = Int32.MaxValue;
int maxColumnIndex = -1;
int maxColumnLabelIndex = 0;
List<ColumnInfo> stringColumns = columnInfo[DataArray.DataType.String];
for (int i = 0; i < stringColumns.Count; ++i)
{
if (maxDistinctValue == Int32.MaxValue || maxDistinctValue < stringColumns[i].DistinctValues)
{
maxDistinctValue = stringColumns[i].DistinctValues;
maxColumnIndex = i;
maxColumnLabelIndex = stringColumns[i].ColumnIndex;
}
}
labels[maxColumnLabelIndex] = "input_g_0";
int adjustIndex = 0;
for (int i = 0; i < stringColumns.Count; ++i)
{
if (i != maxColumnIndex)
{
labels[stringColumns[i].ColumnIndex] = "slicer_" + (i - adjustIndex);
}
else
{
++adjustIndex;
}
}
}
}
if (columnInfo.ContainsKey(DataArray.DataType.Number))
{
List<ColumnInfo> numberColumns = columnInfo[DataArray.DataType.Number];
for (int i = 0; i < numberColumns.Count; ++i)
{
labels[numberColumns[i].ColumnIndex] = "output_" + i;
}
}
return labels;
}
private DataArray.DataType GetColumnType(DataArray array, int column, out int distinctValues)
{
// count number of distinct values
HashSet<object> values = new HashSet<object>();
for (int row = 0; row < array.Cells.Length; ++row)
{
if (!values.Contains(array.Cells[row][column]))
{
values.Add(array.Cells[row][column]);
}
}
distinctValues = values.Count;
// return the provided type if available
if (array.ColumnDataType != null && array.ColumnDataType.Length > column)
{
return array.ColumnDataType[column];
}
else
{
// determine the type from the first value in array
object firstValue = array.Cells[0][column];
string firstValueString = firstValue.ToString();
long longValue;
double doubleValue;
if (long.TryParse(firstValueString, out longValue) || double.TryParse(firstValueString, out doubleValue))
{
return DataArray.DataType.Number;
}
DateTime dateValue;
if (DateTime.TryParse(firstValueString, out dateValue))
{
return DataArray.DataType.DateTime;
}
return DataArray.DataType.String;
}
}
}
}
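
For reference, a minimal usage sketch (not part of this commit) of how the transformer above labels columns; the sample table and the expected labels are assumptions derived from the code in this file.

    // Hypothetical sketch: label a two-column table with DataTransformer.
    var data = new DataArray
    {
        ColumnNames = new[] { "Country", "Sales" },
        Cells = new object[][]
        {
            new object[] { "Canada", "100" },
            new object[] { "Brazil", "250" },
        }
    };
    new DataTransformer().Transform(data);
    // With one string column and one numeric column (and no DateTime column),
    // data.TransformedColumnNames becomes { "input_g_0", "output_0" } and
    // data.ColumnDataType becomes { String, Number }.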

View File

@@ -0,0 +1,21 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netstandard2.0</TargetFramework>
<EnableDefaultItems>false</EnableDefaultItems>
<EnableDefaultCompileItems>false</EnableDefaultCompileItems>
<EnableDefaultEmbeddedResourceItems>false</EnableDefaultEmbeddedResourceItems>
<EnableDefaultNoneItems>false</EnableDefaultNoneItems>
<GenerateAssemblyInfo>false</GenerateAssemblyInfo>
<DefineConstants>$(DefineConstants);TRACE</DefineConstants>
<PreserveCompilationContext>true</PreserveCompilationContext>
<DebugType>portable</DebugType>
</PropertyGroup>
<ItemGroup>
<Compile Include="**\*.cs" Exclude="**/obj/**/*.cs" />
</ItemGroup>
<ItemGroup>
<Content Include="Templates\Templates.txt">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
</ItemGroup>
</Project>

View File

@@ -0,0 +1,44 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System.Reflection;
using System.Runtime.CompilerServices;
using System.Runtime.InteropServices;
// General Information about an assembly is controlled through the following
// set of attributes. Change these attribute values to modify the information
// associated with an assembly.
[assembly: AssemblyTitle("Microsoft InsightsGenerator")]
[assembly: AssemblyDescription("Provides Microsoft InsightsGenerator services.")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("Microsoft")]
[assembly: AssemblyProduct("Microsoft InsightsGenerator")]
[assembly: AssemblyCopyright("© Microsoft Corporation. All rights reserved.")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
// Setting ComVisible to false makes the types in this assembly not visible
// to COM components. If you need to access a type in this assembly from
// COM, set the ComVisible attribute to true on that type.
[assembly: ComVisible(false)]
// The following GUID is for the ID of the typelib if this project is exposed to COM
[assembly: Guid("5b6bd4c4-7352-4762-9ad2-578b3fbd1685")]
// Version information for an assembly consists of the following four values:
//
// Major Version
// Minor Version
// Build Number
// Revision
//
// You can specify all the values or you can default the Build and Revision Numbers
// by using the '*' as shown below:
// [assembly: AssemblyVersion("1.0.*")]
[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]
[assembly: AssemblyInformationalVersion("1.0.0.0")]
[assembly: InternalsVisibleTo("Microsoft.InsightsGenerator.UnitTests")]

View File

@@ -0,0 +1,212 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Text.RegularExpressions;
namespace Microsoft.InsightsGenerator
{
public class RulesEngine
{
public static List<Template> Templates;
public static List<string> TopListHashHeaders = new List<string>{ "#top", "#averageSlice", "#topPerslice" , "#bottom"};
public RulesEngine()
{
Templates = GetTemplates();
}
public static ColumnHeaders TemplateParser(string templateContent)
{
ColumnHeaders ch = new ColumnHeaders();
var processedText = Regex.Replace(templateContent, @",|\\n", "");
ch.Template = templateContent;
List<string> keyvalue = processedText.Split(' ').Select(s => s.Trim()).ToList();
foreach (string s in keyvalue)
{
if (s.StartsWith("#"))
{
string headers = s.Substring(1, s.Length - 1);
if (headers.StartsWith("#"))
{
ch.DoubleHashValues.Add( "##" + headers.Substring(1, headers.Length - 1));
}
else
{
if (headers != "placeHolder")
{
ch.SingleHashValues.Add("#" + headers);
}
}
}
}
return ch;
}
/// <summary>
/// Find a matched template based on the single hash values
/// </summary>
/// <param name="singleHashHeaders"></param>
/// <returns></returns>
public static string FindMatchedTemplate(List<List<string>> singleHashHeaders, DataArray columnInfo)
{
var resultTemplate = new StringBuilder();
if (Templates == null)
{
Templates = GetTemplates();
}
var headersWithSingleHash = GetTopHeadersWithHash(singleHashHeaders);
foreach (var template in Templates)
{
var columnHeaders = TemplateParser(template.Content);
var singleHashFromTemplate = columnHeaders.SingleHashValues.Distinct();
var isMatched = true;
// all the values from the template need to be found in the input from SigGen
foreach (var hashFromTemplate in singleHashFromTemplate)
{
if (!headersWithSingleHash.Contains(hashFromTemplate.ToLower()))
{
isMatched = false;
break;
}
}
if (isMatched)
{
// Replace # and ## values in template with actual values
resultTemplate.AppendLine(ReplaceHashesInTemplate(singleHashHeaders, columnInfo, template) + "\n");
}
}
// An empty result means no matched template was found
return resultTemplate.ToString();
}
private static string ReplaceHashesInTemplate(List<List<string>>singleHashList, DataArray columnInfo, Template template)
{
StringBuilder modifiedTemp = new StringBuilder(template.Content);
// Replace single hash values
foreach (var line in singleHashList)
{
// Example of a single list (line):
// "top", "3", " input (value) %OfValue ", " input (value) %OfValue ", " input (value) %OfValue "
var headerInputs = line.ToArray();
string header = "#" + headerInputs[0];
if (TopListHashHeaders.Contains(header))
{
if(!modifiedTemp.ToString().Contains(header))
{
continue;
}
//First replace the header with the second value in the list
modifiedTemp.Replace(header, headerInputs[1]);
StringBuilder remainingStr = new StringBuilder();
for (int i = 2; i < headerInputs.Length; i++)
{
// Append the rest of the elements in the array, separated by new lines
remainingStr.AppendLine(headerInputs[i]);
}
modifiedTemp.Replace("#placeHolder", remainingStr.ToString());
}
else
{
modifiedTemp.Replace("#" + headerInputs[0], headerInputs[1]);
}
}
// Replace double hash values
var transformedColumnArray = columnInfo.TransformedColumnNames.ToArray();
var columnArray = columnInfo.ColumnNames.ToArray();
for (int p = 0; p < columnInfo.TransformedColumnNames.Length; p++)
{
modifiedTemp.Replace("##" + transformedColumnArray[p], columnArray[p]);
}
return modifiedTemp.ToString();
}
private static List<string> GetTopHeadersWithHash(List<List<string>> singleHashHeaders)
{
var topHeaderList = new List<string>();
foreach (var list in singleHashHeaders)
{
topHeaderList.Add("#" + list.First().ToLower());
}
return topHeaderList;
}
/// <summary>
/// Retrieve all the templates and template ids
/// </summary>
/// <returns>All the template and id combinations</returns>
public static List<Template> GetTemplates()
{
var templateHolder = new List<Template>();
string assemblyPath = System.Reflection.Assembly.GetExecutingAssembly().Location;
string assemblyDirectoryPath = System.IO.Path.GetDirectoryName(assemblyPath);
string templateFilePath = Path.Combine(assemblyDirectoryPath, "Templates", "templates.txt");
using (StreamReader streamReader = new StreamReader(templateFilePath, Encoding.UTF8))
{
int temId = 0;
var wholeText = streamReader.ReadToEnd();
var templateArray = wholeText.Split(new[] { "Template " }, StringSplitOptions.None).ToList();
foreach (var line in templateArray.Where(r => r != string.Empty))
{
var parts = line.Split(new[] { Environment.NewLine }, StringSplitOptions.None).ToList();
temId = int.Parse(parts[0]);
templateHolder.Add(new Template(temId, parts[1]));
}
return templateHolder;
}
}
}
/// <summary>
/// Template container to hold the template id + template body combination
/// </summary>
public class Template
{
public Template(int id, string content)
{
Id = id;
Content = content;
}
public int Id { get; set; }
public string Content { get; set; }
}
public class ColumnHeaders
{
public ColumnHeaders()
{
SingleHashValues = new List<string>();
DoubleHashValues = new List<string>();
Template = null;
}
public List<string> SingleHashValues { get; set; }
public List<string> DoubleHashValues { get; set; }
public string Template { get; set; }
}
}
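
For reference, a minimal sketch (not part of this commit) of how TemplateParser splits a template into single-hash and double-hash headers; the template string is the Template 9 text from Templates.txt later in this commit.

    // Hypothetical sketch: parse a template into its hash headers.
    string template = "There were #uniqueInputs ##input_g_0 (s), the top #top highest total ##output_0 were as follows:\\n #placeHolder";
    ColumnHeaders headers = RulesEngine.TemplateParser(template);
    // headers.SingleHashValues -> { "#uniqueInputs", "#top" }  (#placeHolder is excluded)
    // headers.DoubleHashValues -> { "##input_g_0", "##output_0" }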

View File

@@ -0,0 +1,462 @@
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Net.Sockets;
using System.Globalization;
namespace Microsoft.InsightsGenerator
{
class SignatureGenerator
{
private DataArray Table;
public SignatureGeneratorResult Result;
public SignatureGenerator(DataArray table)
{
this.Table = table;
Result = new SignatureGeneratorResult();
}
public SignatureGeneratorResult Learn()
{
var stringInputIndexes = new List<int>();
var timeInputIndexes = new List<int>();
var slicerIndexes = new List<int>();
var outputIndexes = new List<int>();
for (var i = 0; i < Table.TransformedColumnNames.Length; i++)
{
if (Table.TransformedColumnNames[i].Contains("input_g"))
{
stringInputIndexes.Add(i);
}
if (Table.TransformedColumnNames[i].Contains("input_t"))
{
timeInputIndexes.Add(i);
}
if (Table.TransformedColumnNames[i].Contains("slicer"))
{
slicerIndexes.Add(i);
}
if (Table.TransformedColumnNames[i].Contains("output"))
{
outputIndexes.Add(i);
}
}
foreach (int stringIndex in stringInputIndexes)
{
foreach (int outputIndex in outputIndexes)
{
ExecuteStringInputInsights(stringIndex, outputIndex);
foreach (int slicerIndex in slicerIndexes)
{
ExecuteStringInputSlicerInsights(stringIndex, outputIndex, slicerIndex);
}
}
}
return Result;
}
public void ExecuteStringInputInsights(int inputCol, int outputCol)
{
var n = Table.Cells.Length;
if (Table.Cells.Length >8) {
n = 3;
}
OverallAverageInsights(outputCol);
OverallBottomInsights(n, inputCol, outputCol);
OverallMaxInsights(outputCol);
OverallMinInsights(outputCol);
OverallSumInsights(outputCol);
OverallTopInsights(n, inputCol, outputCol);
UniqueInputsInsight(inputCol);
}
public void ExecuteStringInputSlicerInsights(int inputCol, int outputCol, int slicerCol)
{
var n = Table.Cells.Length;
if (Table.Cells.Length > 8)
{
n = 3;
}
if (Table.Cells.Length > 50)
{
n = 5;
}
SlicedMaxInsights(slicerCol, outputCol);
SlicedAverageInsights(slicerCol, outputCol);
SlicedBottomInsights(n, inputCol, slicerCol, outputCol);
SlicedSumInsights(slicerCol, outputCol);
SlicedPercentageInsights(slicerCol, outputCol);
SlicedSumInsights(slicerCol, outputCol);
SlicedMinInsights(slicerCol, outputCol);
SlicedTopInsights(n, inputCol, slicerCol, outputCol);
}
public void UniqueInputsInsight(int inputCol)
{
List<string> insight = new List<string>();
// Adding the insight identifier
insight.Add(SignatureGeneratorResult.uniqueInputsIdentifier);
var uniqueInputs = GetUniqueColumValues(inputCol);
insight.Add(uniqueInputs.Length.ToString());
insight.AddRange(uniqueInputs);
Result.Insights.Add(insight);
}
public void OverallTopInsights(long n, int inputColumn, int outputColumn)
{
List<string> insight = new List<string>();
// Adding the insight identifier
insight.Add(SignatureGeneratorResult.topInsightIdentifier);
insight.AddRange(GenericTop(Table.Cells, n, inputColumn, outputColumn));
Result.Insights.Add(insight);
}
public void SlicedTopInsights(long n, int inputColumn, int sliceColumn, int outputColumn)
{
List<string> insight = new List<string>();
// Adding the insight identifier
insight.Add(SignatureGeneratorResult.topSliceInsightIdentifier);
object[] slices = GetUniqueColumValues(sliceColumn);
insight.Add(slices.Length.ToString());
foreach (var slice in slices)
{
insight.Add(slice.ToString());
var sliceTable = CreateSliceBucket(sliceColumn, slice.ToString());
insight.AddRange(GenericTop(sliceTable, n, inputColumn, outputColumn));
}
Result.Insights.Add(insight);
}
public List<string> GenericTop(Object[][] table, long n, int inputColumn, int outputColumn)
{
List<string> insight = new List<string>();
Object[][] sortedTable = SortCellsByColumn(table, outputColumn);
double outputSum = CalculateColumnSum(sortedTable, outputColumn);
for (int i = sortedTable.Length - 1; i >= 0 && i >= sortedTable.Length - n; i--)
{
double percent = Percentage(Double.Parse(sortedTable[i][outputColumn].ToString()), outputSum);
string temp = String.Format("{0} ({1}) {2}%", sortedTable[i][inputColumn].ToString(), sortedTable[i][outputColumn].ToString(), percent);
insight.Add(temp);
}
// Adding the count of the result
insight.Insert(0, insight.Count.ToString());
return insight;
}
public void OverallBottomInsights(long n, int inputColumn, int outputColumn)
{
List<string> insight = new List<string>();
// Adding the insight identifier
insight.Add(SignatureGeneratorResult.bottomInsightIdentifier);
insight.AddRange(GenericBottom(Table.Cells, n, inputColumn, outputColumn));
Result.Insights.Add(insight);
}
public void SlicedBottomInsights(long n, int inputColumn, int sliceColumn, int outputColumn)
{
List<string> insight = new List<string>();
// Adding the insight identifier
insight.Add(SignatureGeneratorResult.bottomSliceInsightIdentifier);
object[] slices = GetUniqueColumValues(sliceColumn);
insight.Add(slices.Length.ToString());
foreach (var slice in slices)
{
insight.Add(slice.ToString());
var sliceTable = CreateSliceBucket(sliceColumn, slice.ToString());
insight.AddRange(GenericBottom(sliceTable, n, inputColumn, outputColumn));
}
Result.Insights.Add(insight);
}
public List<string> GenericBottom(Object[][] table, long n, int inputColumn, int outputColumn)
{
List<string> insight = new List<string>();
Object[][] sortedTable = SortCellsByColumn(table, outputColumn);
double outputSum = CalculateColumnSum(sortedTable, outputColumn);
for (int i = 0; i < n && i < sortedTable.Length; i++)
{
double percent = Percentage(Double.Parse(sortedTable[i][outputColumn].ToString()), outputSum);
string temp = String.Format("{0} ({1}) {2}%", sortedTable[i][inputColumn].ToString(), sortedTable[i][outputColumn].ToString(), percent);
insight.Add(temp);
}
// Adding the count of the result
insight.Insert(0, insight.Count.ToString());
return insight;
}
public void OverallAverageInsights(int colIndex)
{
var outputList = new List<string>();
outputList.Add(SignatureGeneratorResult.averageInsightIdentifier);
outputList.Add(CalculateColumnAverage(Table.Cells, colIndex).ToString());
Result.Insights.Add(outputList);
}
public void OverallSumInsights(int colIndex)
{
var outputList = new List<string>();
outputList.Add(SignatureGeneratorResult.sumInsightIdentifier);
outputList.Add(CalculateColumnSum(Table.Cells, colIndex).ToString());
Result.Insights.Add(outputList);
}
public void OverallMaxInsights(int colIndex)
{
var outputList = new List<string>();
outputList.Add(SignatureGeneratorResult.maxInsightIdentifier);
outputList.Add(CalculateColumnMax(Table.Cells, colIndex).ToString());
Result.Insights.Add(outputList);
}
public void OverallMinInsights(int colIndex)
{
var outputList = new List<string>();
outputList.Add(SignatureGeneratorResult.minInsightIdentifier);
outputList.Add(CalculateColumnMin(Table.Cells, colIndex).ToString());
Result.Insights.Add(outputList);
}
public void SlicedSumInsights(int sliceIndex, int colIndex)
{
var insight = new List<string>();
insight.Add(SignatureGeneratorResult.sumSliceInsightIdentifier);
object[] slices = GetUniqueColumValues(sliceIndex);
insight.Add(slices.Length.ToString());
foreach (var slice in slices)
{
insight.Add(slice.ToString());
var sliceTable = CreateSliceBucket(sliceIndex, slice.ToString());
insight.Add(CalculateColumnSum(sliceTable, colIndex).ToString());
}
Result.Insights.Add(insight);
}
public void SlicedMaxInsights(int sliceIndex, int colIndex)
{
var insight = new List<string>();
insight.Add(SignatureGeneratorResult.maxSliceInsightIdentifier);
object[] slices = GetUniqueColumValues(sliceIndex);
insight.Add(slices.Length.ToString());
foreach (var slice in slices)
{
insight.Add(slice.ToString());
var sliceTable = CreateSliceBucket(sliceIndex, slice.ToString());
insight.Add(CalculateColumnMax(sliceTable, colIndex).ToString());
}
Result.Insights.Add(insight);
}
public void SlicedMinInsights(int sliceIndex, int colIndex)
{
var insight = new List<string>();
insight.Add(SignatureGeneratorResult.minSliceInsightIdentifier);
object[] slices = GetUniqueColumValues(sliceIndex);
insight.Add(slices.Length.ToString());
foreach (var slice in slices)
{
insight.Add(slice.ToString());
var sliceTable = CreateSliceBucket(sliceIndex, slice.ToString());
insight.Add(CalculateColumnMin(sliceTable, colIndex).ToString());
}
Result.Insights.Add(insight);
}
public void SlicedAverageInsights(int sliceIndex, int colIndex)
{
var insight = new List<string>();
insight.Add(SignatureGeneratorResult.averageSliceInsightIdentifier);
object[] slices = GetUniqueColumValues(sliceIndex);
insight.Add(slices.Length.ToString());
foreach (var slice in slices)
{
insight.Add(slice.ToString());
var sliceTable = CreateSliceBucket(sliceIndex, slice.ToString());
insight.Add(CalculateColumnAverage(sliceTable, colIndex).ToString());
}
Result.Insights.Add(insight);
}
public void SlicedPercentageInsights(int sliceIndex, int colIndex)
{
var insight = new List<string>();
insight.Add(SignatureGeneratorResult.percentageSliceInsightIdentifier);
object[] slices = GetUniqueColumValues(sliceIndex);
insight.Add(slices.Length.ToString());
double totalSum = CalculateColumnSum(Table.Cells, colIndex);
foreach (var slice in slices)
{
insight.Add(slice.ToString());
var sliceTable = CreateSliceBucket(sliceIndex, slice.ToString());
double sliceSum = CalculateColumnSum(sliceTable, colIndex);
var percentagePerSlice = Percentage(sliceSum, totalSum);
insight.Add(percentagePerSlice.ToString());
}
Result.Insights.Add(insight);
}
private double CalculateColumnAverage(object[][] rows, int colIndex)
{
return Math.Round(CalculateColumnSum(rows, colIndex) / rows.Length, 2);
}
private double CalculateColumnSum(object[][] rows, int colIndex)
{
return Math.Round(rows.Sum(row => double.Parse(row[colIndex].ToString())), 2);
}
private double CalculateColumnPercentage(object[][] rows, int colIndex)
{
return rows.Sum(row => double.Parse(row[colIndex].ToString()));
}
private double CalculateColumnMin(object[][] rows, int colIndex)
{
return rows.Min(row => double.Parse(row[colIndex].ToString()));
}
private double CalculateColumnMax(object[][] rows, int colIndex)
{
return rows.Max(row => double.Parse(row[colIndex].ToString()));
}
private string[] GetUniqueColumValues(int colIndex)
{
return Table.Cells.Select(row => row[colIndex].ToString()).Distinct().ToArray();
}
public Object[][] CreateSliceBucket(int sliceColIndex, string sliceValue)
{
List<Object[]> slicedTable = new List<object[]>();
foreach (var row in Table.Cells)
{
if (row[sliceColIndex].Equals(sliceValue))
{
slicedTable.Add(DeepCloneRow(row));
}
}
return slicedTable.ToArray();
}
public object[][] SortCellsByColumn(Object[][] table, int colIndex)
{
var cellCopy = DeepCloneTable(table);
Comparer<Object> comparer = Comparer<Object>.Default;
switch (this.Table.ColumnDataType[colIndex])
{
case DataArray.DataType.Number:
Array.Sort<Object[]>(cellCopy, (x, y) => comparer.Compare(double.Parse(x[colIndex].ToString()), double.Parse(y[colIndex].ToString())));
break;
case DataArray.DataType.String:
Array.Sort<Object[]>(cellCopy, (x, y) => String.Compare(x[colIndex].ToString(), y[colIndex].ToString()));
break;
case DataArray.DataType.DateTime:
Array.Sort<Object[]>(cellCopy, (x, y) => DateTime.Compare(DateTime.Parse(x[colIndex].ToString()), DateTime.Parse(y[colIndex].ToString())));
break;
}
return cellCopy;
}
public Object[][] DeepCloneTable(object[][] table)
{
return table.Select(a => a.ToArray()).ToArray();
}
public Object[] DeepCloneRow(object[] row)
{
return row.Select(a => a).ToArray();
}
public double Percentage(double value, double sum)
{
return Math.Round((double)((value / sum) * 100), 2);
}
}
}
public class SignatureGeneratorResult
{
public SignatureGeneratorResult()
{
Insights = new List<List<string>>();
}
public List<List<string>> Insights { get; set; }
public static string topInsightIdentifier = "top";
public static string bottomInsightIdentifier = "bottom";
public static string topSliceInsightIdentifier = "topPerSlice";
public static string bottomSliceInsightIdentifier = "bottomPerSlice";
public static string averageInsightIdentifier = "average";
public static string sumInsightIdentifier = "sum";
public static string maxInsightIdentifier = "max";
public static string minInsightIdentifier = "min";
public static string averageSliceInsightIdentifier = "averagePerSlice";
public static string sumSliceInsightIdentifier = "sumPerSlice";
public static string percentageSliceInsightIdentifier = "percentagePerSlice";
public static string maxSliceInsightIdentifier = "maxPerSlice";
public static string minSliceInsightIdentifier = "minPerSlice";
public static string uniqueInputsIdentifier = "uniqueInputs";
}
/** General format of the insight output
* "time"/"string"
* "top", "3", " input (value) %OfValue ", " input (value) %OfValue ", " input (value) %OfValue "
* "top", "1", " input (value) %OfValue "
* "bottom", "3", " input (value) %OfValue ", " input (value) %OfValue ", " input (value) %OfValue "
* "average", "100"
* "mean", "100"
* "median", "100"
* "averageSlice", "#slice","nameofslice", "100", "nameofslice", "100", "nameofslice", "100"
* "topPerslice", "#slice", "nameofslice", "3", " input (value) %OfValue ", " input (value) %OfValue ", " input (value) %OfValue ",
* "nameofslice", "3", " input (value) %OfValue ", " input (value) %OfValue ", " input (value) %OfValue ",
* "nameofslice", "3", " input (value) %OfValue ", " input (value) %OfValue ", " input (value) %OfValue "
* ....
*
**/
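
For reference, a minimal sketch (not part of this commit) of driving the generator above on an already-transformed table; the column labels, data, and resulting insight rows are assumptions based on the code in this file.

    // Hypothetical sketch: generate insight signatures for a small table.
    var table = new DataArray
    {
        ColumnNames = new[] { "Country", "Sales" },
        TransformedColumnNames = new[] { "input_g_0", "output_0" },
        ColumnDataType = new[] { DataArray.DataType.String, DataArray.DataType.Number },
        Cells = new object[][]
        {
            new object[] { "Canada", "100" },
            new object[] { "Brazil", "250" },
        }
    };
    SignatureGeneratorResult result = new SignatureGenerator(table).Learn();
    // result.Insights then contains rows such as { "sum", "350" } and
    // { "top", "2", "Brazil (250) 71.43%", "Canada (100) 28.57%" }.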

View File

@@ -0,0 +1,52 @@
Template 9
There were #uniqueInputs ##input_g_0 (s), the top #top highest total ##output_0 were as follows:\n #placeHolder
Template 16
For the #slices ##SlicePar_GG_1(s), the percentage of ##OutPar_N_C_1 on #time were \n #stHData\n this was compared with #Etime where #ESlices ##SlicePar_GG_1\n #EstHData \n.
Template 17
For the #slices ##SlicePar_GG_1(s), the percentage of ##OutPar_N_C_1 for each group during the week of #time was \n #stHData\n, compared to the #Etime where #ESlices ##SlicePar_GG_1\n #EstHData \n.
Template 21
The data pattern indicates that #st have #volumeType ##OutPar_N_C_1 \n.
Template 22
The largest increase of ##OutPar_N_C_1 was on #time when it increased #increasesize% above the average value.
Template 23
The largest decrease in ##OutPar_N_C_1 was on #time when it fell below #reducesize% of the average value.
Template 24
The data trend indicates a consistent #trend over time. \n
Template 1
For the #averageSlice ##slicer_0, the volume of each is: #placeHolder
Template 2
The ##InPar_GG_1 that had largest of each ##SlicePar_GG_1 are (only top #n ##InPar_GG_1) \n
Template 3
#input had #TPer% for ##SlicePar_GG_1 #Slice \n"
Template 5
The ##InPar_GG_1 that had largest of each ##SlicePar_GG_1 are (only top #n ##InPar_GG_1) \n
Template 6
The largest and smallest ##SlicePar_GG_1 for the top #n ##InPar_GG_1 were as follows \n
Template 7
#inp had #maxSlice the largest ##SlicePar_GG_1 #maxPer% and #minSlice the smallest ##SlicePar_GG_1 #minPer% \n
Template 8
#inp had a total of #OutPar_N_C_1 ##OutPar_N_C_1 that constitutes #Per% \n \n
Template 10
There were #uniqueInputs ##InPar_GG_1(s), the total ##OutPar_N_C_1 were as follows:\n #st
Template 11
in the week #time the data contained #slices ##SlicePar_GG_1, the top #n ##SlicePar_GG_1 with highest ##OutPar_N_C_1 were as follows:\n #stHData this was compared with week #Etime with the following top #En of #Eslice ##SlicePar_GG_1\n #EstData\n
Template 12
in the week #time the data contained #slices ##SlicePar_GG_1, the top #n ##SlicePar_GG_1 with highest ##OutPar_N_C_1 were as follows:\n #stHData\n
Template 13
On day #time the data contained #slices ##SlicePar_GG_1, the top #n ##SlicePar_GG_1 with highest ##OutPar_N_C_1 were as follows:\n #stHData this was compared with day #Etime with the following top #n ##SlicePar_GG_1\n #EstHData\n
Template 14
in month #time the data contained #slices ##SlicePar_GG_1, the top #n ##SlicePar_GG_1 with highest ##OutPar_N_C_1 were as follows:\n #stData this was compared with month #Etime with the following top #n ##SlicePar_GG_1\n #EstData\n
Template 15
in month #time the data contained #slices ##SlicePar_GG_1, the top #n ##SlicePar_GG_1 with highest ##OutPar_N_C_1 were as follows:\n #stHData\n
Template 18
in the week #time the data contained #slices ##SlicePar_GG_1, the % of ##OutPar_N_C_1 for each ##SlicePar_GG_1 were as follows:\n #stHData this was compared with week #Etime with the following #En of #Eslice ##SlicePar_GG_1\n #EstHData \n
Template 19
in month #time the data contained #slices ##SlicePar_GG_1, the % of ##OutPar_N_C_1 for each ##SlicePar_GG_1 were as follows:\n #stData this was compared with month #Etime with the following #n ##SlicePar_GG_1\n #EstData\n
Template 20
in month #time the data contained #slices ##SlicePar_GG_1, the % of ##OutPar_N_C_1 for each ##SlicePar_GG_1 were as follows:\n #stData\n
Template 21
Looking at the pattern of the data it appears that #st has the #volumeType ##OutPar_N_C_1 \n
Template 22
The largest increase in ##OutPar_N_C_1 was observed on #time with an increase of #increasesize% of the average value of ##OutPar_N_C_1 \n
Template 23
The bottom #bottom total ##output_0 were as follows:\n #placeHolder

View File

@@ -0,0 +1,58 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
namespace Microsoft.InsightsGenerator
{
public class Workflow
{
public async Task<string> ProcessInputData(DataArray rulesData,
CancellationToken cancellationToken = new CancellationToken())
{
// added cancellationToken for potential future use
cancellationToken.ThrowIfCancellationRequested();
//Get the signature result
SignatureGenerator siggen = new SignatureGenerator(rulesData);
string insights = null;
await Task.Run(() =>
{
try
{
DataTransformer transformer = new DataTransformer();
transformer.Transform(rulesData);
SignatureGeneratorResult result = siggen.Learn();
// call the rules engine processor
if (result?.Insights == null)
{
// Console.WriteLine("Failure in generating insights, Input not recognized!");
}
else
{
insights = RulesEngine.FindMatchedTemplate(result.Insights, rulesData);
// Console.WriteLine(
// $"Good News! Insights generator has provided you the chart text: \n{insights}\n");
}
}
catch (Exception)
{
// Console.WriteLine(ex.ToString());
throw;
}
}, cancellationToken);
return insights;
}
}
}
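
For reference, a minimal end-to-end sketch (not part of this commit) of the workflow above; it assumes Templates\Templates.txt has been copied next to the assembly, and the input data is hypothetical.

    // Hypothetical sketch: transform, learn, and render insights in one call.
    // (Must run inside an async method.)
    var data = new DataArray
    {
        ColumnNames = new[] { "Country", "Sales" },
        Cells = new object[][]
        {
            new object[] { "Canada", "100" },
            new object[] { "Brazil", "250" },
        }
    };
    string insightText = await new Workflow().ProcessInputData(data);
    // insightText holds the template text rendered by RulesEngine.FindMatchedTemplate.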

View File

@@ -29,6 +29,7 @@ using Microsoft.SqlTools.ServiceLayer.Security;
using Microsoft.SqlTools.ServiceLayer.SqlAssessment;
using Microsoft.SqlTools.ServiceLayer.SqlContext;
using Microsoft.SqlTools.ServiceLayer.Workspace;
using Microsoft.SqlTools.ServiceLayer.InsightsGenerator;
namespace Microsoft.SqlTools.ServiceLayer
{
@@ -137,6 +138,9 @@ namespace Microsoft.SqlTools.ServiceLayer
NotebookConvertService.Instance.InitializeService(serviceHost);
serviceProvider.RegisterSingleService(NotebookConvertService.Instance);
InsightsGeneratorService.Instance.InitializeService(serviceHost);
serviceProvider.RegisterSingleService(InsightsGeneratorService.Instance);
InitializeHostedServices(serviceProvider, serviceHost);
serviceHost.ServiceProvider = serviceProvider;

View File

@@ -0,0 +1,46 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using Microsoft.SqlTools.Hosting.Protocol.Contracts;
using Microsoft.SqlTools.ServiceLayer.Utility;
using Microsoft.SqlTools.Utility;
namespace Microsoft.SqlTools.ServiceLayer.InsightsGenerator.Contracts
{
public class AccessibleChartData
{
public string[] Columns { get; set; }
public string[][] Rows { get; set; }
}
/// <summary>
/// Query insights generator parameters
/// </summary>
public class QueryInsightsGeneratorParams : GeneralRequestDetails
{
public AccessibleChartData Data { get; set; }
}
/// <summary>
/// Query insights generator result
/// </summary>
public class InsightsGeneratorResult : ResultStatus
{
public string InsightsText { get; set; }
}
/// <summary>
/// Query insights generator request type
/// </summary>
public class QueryInsightsGeneratorRequest
{
/// <summary>
/// Request definition
/// </summary>
public static readonly
RequestType<QueryInsightsGeneratorParams, InsightsGeneratorResult> Type =
RequestType<QueryInsightsGeneratorParams, InsightsGeneratorResult>.Create("insights/query");
}
}

View File

@@ -0,0 +1,78 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Threading.Tasks;
using Microsoft.SqlTools.Hosting.Protocol;
using Microsoft.SqlTools.ServiceLayer.Hosting;
using Microsoft.SqlTools.ServiceLayer.InsightsGenerator.Contracts;
using Microsoft.InsightsGenerator;
namespace Microsoft.SqlTools.ServiceLayer.InsightsGenerator
{
/// <summary>
/// Service responsible for generating natural-language insight text from query result data
/// </summary>
public class InsightsGeneratorService
{
/// <summary>
/// Singleton service instance
/// </summary>
private static Lazy<InsightsGeneratorService> instance
= new Lazy<InsightsGeneratorService>(() => new InsightsGeneratorService());
/// <summary>
/// Gets the singleton service instance
/// </summary>
public static InsightsGeneratorService Instance
{
get
{
return instance.Value;
}
}
/// <summary>
/// Initializes the service instance
/// </summary>
public void InitializeService(ServiceHost serviceHost)
{
// Insights Generator request handlers
serviceHost.SetRequestHandler(QueryInsightsGeneratorRequest.Type, HandleQueryInsightGeneratorRequest);
}
internal async Task HandleQueryInsightGeneratorRequest(QueryInsightsGeneratorParams parameters, RequestContext<InsightsGeneratorResult> requestContext)
{
Microsoft.InsightsGenerator.DataArray dataArray = new Microsoft.InsightsGenerator.DataArray(){
ColumnNames = parameters.Data.Columns,
Cells = parameters.Data.Rows
};
Workflow insightWorkFlow = new Workflow();
try
{
string insightText = await insightWorkFlow.ProcessInputData(dataArray);
insightText = insightText.Replace("\\n", "");
await requestContext.SendResult(new InsightsGeneratorResult()
{
InsightsText = insightText,
Success = true,
ErrorMessage = null
});
}
catch (Exception ex)
{
await requestContext.SendResult(new InsightsGeneratorResult()
{
Success = false,
ErrorMessage = ex.Message
});
}
}
}
}

View File

@@ -2321,13 +2321,13 @@
</trans-unit>
<trans-unit id="ExportBacpacTaskName">
<source>Export bacpac</source>
<target state="translated">BACPAC exportieren</target>
<target state="translated">BACPAC-Datei exportieren</target>
<note>
</note>
</trans-unit>
<trans-unit id="ImportBacpacTaskName">
<source>Import bacpac</source>
<target state="translated">BACPAC importieren</target>
<target state="translated">BACPAC-Datei importieren</target>
<note>
</note>
</trans-unit>
@@ -2391,6 +2391,36 @@
<note>.
Parameters: 0 - filePath (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidColumnEncryptionSetting">
<source>Invalid value '{0}' for ComlumEncryption. Valid values are 'Enabled' and 'Disabled'.</source>
<target state="translated">Ungültiger Wert "{0}" für ColumnEncryption. Gültige Werte sind "Enabled" und "Disabled".</target>
<note>.
Parameters: 0 - columnEncryptionSetting (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidEnclaveAttestationProtocol">
<source>Invalid value '{0}' for EnclaveAttestationProtocol. Valid values are 'AAS' and 'HGS'.</source>
<target state="translated">Ungültiger Wert "{0}" für EnclaveAttestationProtocol. Gültige Werte sind "AAS" und "HGS".</target>
<note>.
Parameters: 0 - enclaveAttestationProtocol (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidAlwaysEncryptedOptionCombination">
<source>The Attestation Protocol and Enclave Attestation URL requires Always Encrypted to be set to Enabled.</source>
<target state="translated">Für das Nachweisprotokoll und die Enclave-Nachweis-URL muss Always Encrypted auf "Enabled" festgelegt sein.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdExitOnError">
<source>An error was encountered during execution of batch. Exiting.</source>
<target state="translated">Fehler bei der Batchausführung. Der Vorgang wird beendet.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdUnsupportedToken">
<source>Encountered unsupported token {0}</source>
<target state="translated">Es wurde ein nicht unterstütztes Token "{0}" gefunden.</target>
<note>
</note>
</trans-unit>
</body>
</file>
</xliff>

View File

@@ -2391,6 +2391,36 @@
<note>.
Parameters: 0 - filePath (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidColumnEncryptionSetting">
<source>Invalid value '{0}' for ComlumEncryption. Valid values are 'Enabled' and 'Disabled'.</source>
<target state="translated">El valor "{0}" no es válido para ComlumEncryption. Los valores válidos son "Enabled" y "Disabled".</target>
<note>.
Parameters: 0 - columnEncryptionSetting (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidEnclaveAttestationProtocol">
<source>Invalid value '{0}' for EnclaveAttestationProtocol. Valid values are 'AAS' and 'HGS'.</source>
<target state="translated">El valor "{0}" no es válido para EnclaveAttestationProtocol. Los valores válidos son "AAS" y "HGS".</target>
<note>.
Parameters: 0 - enclaveAttestationProtocol (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidAlwaysEncryptedOptionCombination">
<source>The Attestation Protocol and Enclave Attestation URL requires Always Encrypted to be set to Enabled.</source>
<target state="translated">El protocolo de atestación y la dirección URL de atestación de enclave requieren que Always Encrypted esté habilitado.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdExitOnError">
<source>An error was encountered during execution of batch. Exiting.</source>
<target state="translated">Error durante la ejecución del lote. Saliendo.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdUnsupportedToken">
<source>Encountered unsupported token {0}</source>
<target state="translated">Se ha encontrado un token no admitido {0}</target>
<note>
</note>
</trans-unit>
</body>
</file>
</xliff>

View File

@@ -2391,6 +2391,36 @@
<note>.
Parameters: 0 - filePath (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidColumnEncryptionSetting">
<source>Invalid value '{0}' for ComlumEncryption. Valid values are 'Enabled' and 'Disabled'.</source>
<target state="translated">Valeur « {0} » non valide pour ComlumEncryption. Les valeurs valides sont « Enabled » et « Disabled ».</target>
<note>.
Parameters: 0 - columnEncryptionSetting (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidEnclaveAttestationProtocol">
<source>Invalid value '{0}' for EnclaveAttestationProtocol. Valid values are 'AAS' and 'HGS'.</source>
<target state="translated">Valeur « {0} » non valide pour EnclaveAttestationProtocol. Les valeurs valides sont « AAS » et « HGS ».</target>
<note>.
Parameters: 0 - enclaveAttestationProtocol (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidAlwaysEncryptedOptionCombination">
<source>The Attestation Protocol and Enclave Attestation URL requires Always Encrypted to be set to Enabled.</source>
<target state="translated">Le protocole d'attestation et l'URL d'attestation d'enclave exigent l'activation d'Always Encrypted.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdExitOnError">
<source>An error was encountered during execution of batch. Exiting.</source>
<target state="translated">Une erreur s'est produite durant l'exécution du lot. Fermeture en cours.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdUnsupportedToken">
<source>Encountered unsupported token {0}</source>
<target state="translated">Jeton non pris en charge détecté ({0})</target>
<note>
</note>
</trans-unit>
</body>
</file>
</xliff>

View File

@@ -2391,6 +2391,36 @@
<note>.
Parameters: 0 - filePath (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidColumnEncryptionSetting">
<source>Invalid value '{0}' for ComlumEncryption. Valid values are 'Enabled' and 'Disabled'.</source>
<target state="translated">Il valore '{0}' non è valido per ComlumEncryption. I valori validi sono 'Enabled' e 'Disabled'.</target>
<note>.
Parameters: 0 - columnEncryptionSetting (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidEnclaveAttestationProtocol">
<source>Invalid value '{0}' for EnclaveAttestationProtocol. Valid values are 'AAS' and 'HGS'.</source>
<target state="translated">Il valore '{0}' non è valido per EnclaveAttestationProtocol. I valori validi sono 'AAS' e 'HGS'.</target>
<note>.
Parameters: 0 - enclaveAttestationProtocol (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidAlwaysEncryptedOptionCombination">
<source>The Attestation Protocol and Enclave Attestation URL requires Always Encrypted to be set to Enabled.</source>
<target state="translated">Il protocollo di attestazione e l'URL di attestazione dell'enclave richiedono che Always Encrypted sia impostato su Abilitato.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdExitOnError">
<source>An error was encountered during execution of batch. Exiting.</source>
<target state="translated">Si è verificato un errore durante l'esecuzione del batch. Chiusura in corso.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdUnsupportedToken">
<source>Encountered unsupported token {0}</source>
<target state="translated">È stato rilevato il token {0} non supportato</target>
<note>
</note>
</trans-unit>
</body>
</file>
</xliff>

View File

@@ -2391,6 +2391,36 @@
<note>.
Parameters: 0 - filePath (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidColumnEncryptionSetting">
<source>Invalid value '{0}' for ComlumEncryption. Valid values are 'Enabled' and 'Disabled'.</source>
<target state="translated">ComlumEncryption の値 '{0}' が無効です。有効な値は 'Enabled' と 'Disabled' です。</target>
<note>.
Parameters: 0 - columnEncryptionSetting (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidEnclaveAttestationProtocol">
<source>Invalid value '{0}' for EnclaveAttestationProtocol. Valid values are 'AAS' and 'HGS'.</source>
<target state="translated">EnclaveAttestationProtocol の値 '{0}' が無効です。有効な値は、'AAS' と 'HGS' です。</target>
<note>.
Parameters: 0 - enclaveAttestationProtocol (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidAlwaysEncryptedOptionCombination">
<source>The Attestation Protocol and Enclave Attestation URL requires Always Encrypted to be set to Enabled.</source>
<target state="translated">構成証明プロトコルおよびエンクレーブ構成証明の URL では、Always Encrypted を Enabled に設定することが必要です。</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdExitOnError">
<source>An error was encountered during execution of batch. Exiting.</source>
<target state="translated">バッチの実行中にエラーが発生しました。終了します。</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdUnsupportedToken">
<source>Encountered unsupported token {0}</source>
<target state="translated">サポートされていないトークン {0} が見つかりました</target>
<note>
</note>
</trans-unit>
</body>
</file>
</xliff>

View File

@@ -1898,7 +1898,7 @@
</trans-unit>
<trans-unit id="EditDataValueTooLarge">
<source>Value {0} is too large to fit in column of type {1}</source>
<target state="translated">값 {0}이 너무 커서 {1} 유형의 열에 맞지 않습니다.</target>
<target state="translated">값 {0}이(가) 너무 커서 {1} 유형의 열에 맞지 않습니다.</target>
<note>.
Parameters: 0 - value (string), 1 - columnType (string) </note>
</trans-unit>
@@ -2391,6 +2391,36 @@
<note>.
Parameters: 0 - filePath (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidColumnEncryptionSetting">
<source>Invalid value '{0}' for ComlumEncryption. Valid values are 'Enabled' and 'Disabled'.</source>
<target state="translated">ComlumEncryption에 대한 '{0}' 값이 잘못되었습니다. 유효한 값은 'Enabled' 및 'Disabled'입니다.</target>
<note>.
Parameters: 0 - columnEncryptionSetting (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidEnclaveAttestationProtocol">
<source>Invalid value '{0}' for EnclaveAttestationProtocol. Valid values are 'AAS' and 'HGS'.</source>
<target state="translated">EnclaveAttestationProtocol에 대한 '{0}' 값이 잘못되었습니다. 유효한 값은 'AAS' 및 'HGS'입니다.</target>
<note>.
Parameters: 0 - enclaveAttestationProtocol (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidAlwaysEncryptedOptionCombination">
<source>The Attestation Protocol and Enclave Attestation URL requires Always Encrypted to be set to Enabled.</source>
<target state="translated">증명 프로토콜 및 enclave 증명 URL을 사용하려면 Always Encrypted를 Enabled(사용)로 설정해야 합니다.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdExitOnError">
<source>An error was encountered during execution of batch. Exiting.</source>
<target state="translated">일괄 처리를 실행하는 동안 오류가 발생하여 종료합니다.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdUnsupportedToken">
<source>Encountered unsupported token {0}</source>
<target state="translated">지원되지 않는 {0} 토큰이 발생했습니다.</target>
<note>
</note>
</trans-unit>
</body>
</file>
</xliff>

View File

@@ -2391,6 +2391,36 @@
<note>.
Parameters: 0 - filePath (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidColumnEncryptionSetting">
<source>Invalid value '{0}' for ComlumEncryption. Valid values are 'Enabled' and 'Disabled'.</source>
<target state="translated">Valor inválido '{0}' para ComlumEncryption. Os valores válidos são 'Habilitado' e 'Desabilitado'.</target>
<note>.
Parameters: 0 - columnEncryptionSetting (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidEnclaveAttestationProtocol">
<source>Invalid value '{0}' for EnclaveAttestationProtocol. Valid values are 'AAS' and 'HGS'.</source>
<target state="translated">Valor inválido '{0}' para EnclaveAttestationProtocol. Os valores válidos são 'AAS' e 'HGS'.</target>
<note>.
Parameters: 0 - enclaveAttestationProtocol (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidAlwaysEncryptedOptionCombination">
<source>The Attestation Protocol and Enclave Attestation URL requires Always Encrypted to be set to Enabled.</source>
<target state="translated">O Protocolo de Atestado e a URL de Atestado de Enclave exigem que a opção Always Encrypted seja definida como Habilitada.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdExitOnError">
<source>An error was encountered during execution of batch. Exiting.</source>
<target state="translated">Erro durante a execução do lote. Saindo.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdUnsupportedToken">
<source>Encountered unsupported token {0}</source>
<target state="translated">Token sem suporte encontrado {0}</target>
<note>
</note>
</trans-unit>
</body>
</file>
</xliff>

View File

@@ -2391,6 +2391,36 @@
<note>.
Parameters: 0 - filePath (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidColumnEncryptionSetting">
<source>Invalid value '{0}' for ComlumEncryption. Valid values are 'Enabled' and 'Disabled'.</source>
<target state="translated">Недопустимое значение "{0}" для ColumnEncryption. Допустимые значения: "Enabled" (Включено) и "Disabled" (Отключено).</target>
<note>.
Parameters: 0 - columnEncryptionSetting (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidEnclaveAttestationProtocol">
<source>Invalid value '{0}' for EnclaveAttestationProtocol. Valid values are 'AAS' and 'HGS'.</source>
<target state="translated">Недопустимое значение "{0}" для EnclaveAttestationProtocol. Допустимые значения: "AAS" и "HGS".</target>
<note>.
Parameters: 0 - enclaveAttestationProtocol (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidAlwaysEncryptedOptionCombination">
<source>The Attestation Protocol and Enclave Attestation URL requires Always Encrypted to be set to Enabled.</source>
<target state="translated">Протокол аттестации и URL аттестации анклава требуют задать для Always Encrypted значение Enabled (Включено).</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdExitOnError">
<source>An error was encountered during execution of batch. Exiting.</source>
<target state="translated">При выполнении пакета возникла ошибка. Выполняется выход.</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdUnsupportedToken">
<source>Encountered unsupported token {0}</source>
<target state="translated">Обнаружен неподдерживаемый токен {0}.</target>
<note>
</note>
</trans-unit>
</body>
</file>
</xliff>

View File

@@ -136,13 +136,13 @@
</trans-unit>
<trans-unit id="QueryServiceAffectedOneRow">
<source>(1 row affected)</source>
<target state="translated">1 行受到影响</target>
<target state="translated">(1 行受到影响)</target>
<note>
</note>
</trans-unit>
<trans-unit id="QueryServiceAffectedRows">
<source>({0} rows affected)</source>
<target state="translated">{0} 行受到影响</target>
<target state="translated">({0} 行受到影响)</target>
<note>.
Parameters: 0 - rows (long) </note>
</trans-unit>
@@ -1922,7 +1922,7 @@
</trans-unit>
<trans-unit id="StoredProcedureScriptParameterComment">
<source>-- TODO: Set parameter values here.</source>
<target state="translated">-- 待办事项在此处设置参数值</target>
<target state="translated">-- 待办事项: 在此处设置参数值</target>
<note>
</note>
</trans-unit>
@@ -2209,13 +2209,13 @@
</trans-unit>
<trans-unit id="CreateSessionFailed">
<source>Failed to create session: {0}</source>
<target state="translated">创建会话失败{0}</target>
<target state="translated">创建会话失败: {0}</target>
<note>.
Parameters: 0 - error (String) </note>
</trans-unit>
<trans-unit id="PauseSessionFailed">
<source>Failed to pause session: {0}</source>
<target state="translated">暂停会话失败{0}</target>
<target state="translated">暂停会话失败: {0}</target>
<note>.
Parameters: 0 - error (String) </note>
</trans-unit>
@@ -2321,13 +2321,13 @@
</trans-unit>
<trans-unit id="ExportBacpacTaskName">
<source>Export bacpac</source>
<target state="translated">导出 DACPAC</target>
<target state="translated">导出 BACPAC</target>
<note>
</note>
</trans-unit>
<trans-unit id="ImportBacpacTaskName">
<source>Import bacpac</source>
<target state="translated">导入 DACPAC</target>
<target state="translated">导入 BACPAC</target>
<note>
</note>
</trans-unit>
@@ -2391,6 +2391,36 @@
<note>.
Parameters: 0 - filePath (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidColumnEncryptionSetting">
<source>Invalid value '{0}' for ComlumEncryption. Valid values are 'Enabled' and 'Disabled'.</source>
<target state="translated">ComlumEncryption 的值“{0}”无效。有效值为 "Enabled" 和 "Disabled"。</target>
<note>.
Parameters: 0 - columnEncryptionSetting (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidEnclaveAttestationProtocol">
<source>Invalid value '{0}' for EnclaveAttestationProtocol. Valid values are 'AAS' and 'HGS'.</source>
<target state="translated">EnclaveAttestationProtocol 的值“{0}”无效。有效值为 "AAS" 和 "HGS"。</target>
<note>.
Parameters: 0 - enclaveAttestationProtocol (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidAlwaysEncryptedOptionCombination">
<source>The Attestation Protocol and Enclave Attestation URL requires Always Encrypted to be set to Enabled.</source>
<target state="translated">证明协议和 Enclave 证明 URL 需要将 Always Encrypted 设置为“启用”。</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdExitOnError">
<source>An error was encountered during execution of batch. Exiting.</source>
<target state="translated">在执行批处理期间遇到错误。正在退出。</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdUnsupportedToken">
<source>Encountered unsupported token {0}</source>
<target state="translated">遇到不支持的令牌 {0}</target>
<note>
</note>
</trans-unit>
</body>
</file>
</xliff>

View File

@@ -784,13 +784,13 @@
</trans-unit>
<trans-unit id="SchemaHierarchy_MasterKey">
<source>Master Key</source>
<target state="translated">主要索引鍵</target>
<target state="translated">主要金鑰</target>
<note>
</note>
</trans-unit>
<trans-unit id="SchemaHierarchy_MasterKeys">
<source>Master Keys</source>
<target state="translated">主要索引鍵</target>
<target state="translated">主要金鑰</target>
<note>
</note>
</trans-unit>
@@ -2391,6 +2391,36 @@
<note>.
Parameters: 0 - filePath (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidColumnEncryptionSetting">
<source>Invalid value '{0}' for ComlumEncryption. Valid values are 'Enabled' and 'Disabled'.</source>
<target state="translated">ComlumEncryption 的值 '{0}' 無效。有效值為 'Enabled' 及 'Disabled'。</target>
<note>.
Parameters: 0 - columnEncryptionSetting (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidEnclaveAttestationProtocol">
<source>Invalid value '{0}' for EnclaveAttestationProtocol. Valid values are 'AAS' and 'HGS'.</source>
<target state="translated">EnclaveAttestationProtocol 的值 '{0}' 無效。有效值為 'AAS' 及 'HGS'。</target>
<note>.
Parameters: 0 - enclaveAttestationProtocol (string) </note>
</trans-unit>
<trans-unit id="ConnectionServiceConnStringInvalidAlwaysEncryptedOptionCombination">
<source>The Attestation Protocol and Enclave Attestation URL requires Always Encrypted to be set to Enabled.</source>
<target state="translated">證明通訊協定與記憶體保護區證明 URL 需要將 Always Encrypted 設定為「啟用」。</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdExitOnError">
<source>An error was encountered during execution of batch. Exiting.</source>
<target state="translated">執行批次期間發生錯誤,結束中。</target>
<note>
</note>
</trans-unit>
<trans-unit id="SqlCmdUnsupportedToken">
<source>Encountered unsupported token {0}</source>
<target state="translated">發現不支援的權杖 {0}</target>
<note>
</note>
</trans-unit>
</body>
</file>
</xliff>
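The six resources added in each locale above localize new Always Encrypted validation errors in the connection service: ColumnEncryption must be 'Enabled' or 'Disabled', EnclaveAttestationProtocol must be 'AAS' or 'HGS', and an attestation protocol/URL pair is only valid when Always Encrypted is enabled. As a hedged illustration (assuming these settings map to the standard Microsoft.Data.SqlClient connection-string keywords, which this diff does not show), a combination that passes all three checks might look like:

// Hypothetical example; the exact keyword names the connection service accepts are defined
// elsewhere in the tools service. These are the standard SqlClient equivalents.
var connectionString =
    "Server=tcp:myserver.database.windows.net;Database=mydb;" +
    "Column Encryption Setting=Enabled;" +        // required when the next two options are present
    "Attestation Protocol=AAS;" +
    "Enclave Attestation Url=https://myattestation.attest.azure.net/attest/SgxEnclave";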

View File

@@ -34,6 +34,7 @@
<ProjectReference Include="../Microsoft.SqlTools.Credentials/Microsoft.SqlTools.Credentials.csproj" />
<ProjectReference Include="../Microsoft.SqlTools.ManagedBatchParser/Microsoft.SqlTools.ManagedBatchParser.csproj" />
<ProjectReference Include="../Microsoft.Kusto.ServiceLayer/Microsoft.Kusto.ServiceLayer.csproj" />
<ProjectReference Include="../Microsoft.InsightsGenerator/Microsoft.InsightsGenerator.csproj" />
</ItemGroup>
<ItemGroup>
<Content Include="$(PkgMicrosoft_SqlServer_DacFx)\lib\netstandard2.0\Microsoft.Data.Tools.Schema.SqlTasks.targets">

View File

@@ -0,0 +1,198 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using Xunit;
namespace Microsoft.InsightsGenerator.UnitTests
{
/// <summary>
/// DataTransformation tests
/// </summary>
public class DataTransformerTests
{
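// Column-naming convention asserted by these tests: one column becomes the primary input
// (input_t_* for date/time data, input_g_* for a grouping string), remaining string columns
// become slicer_*, and numeric columns become output_*.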
[Fact]
public void Tranform_NullInput()
{
DataTransformer transformer = new DataTransformer();
DataArray array = null;
array = transformer.Transform(array);
Assert.Null(array);
}
[Fact]
public void Tranform_TimeSlicerCount_DeduceTypes()
{
DataTransformer transformer = new DataTransformer();
object[][] cells = new object[2][];
cells[0] = new object[3] { "1/15/2020", "Redmond", 50 };
cells[1] = new object[3] { "1/25/2020", "Bellevue", 75 };
DataArray array = new DataArray()
{
ColumnNames = new string[] { "Date", "City", "Count" },
Cells = cells
};
array = transformer.Transform(array);
Assert.Equal(array.TransformedColumnNames[0], "input_t_0");
Assert.Equal(array.TransformedColumnNames[1], "slicer_0");
Assert.Equal(array.TransformedColumnNames[2], "output_0");
}
[Fact]
public void Tranform_TimeSlicerCount_ProvidedTypes()
{
DataTransformer transformer = new DataTransformer();
object[][] cells = new object[2][];
cells[0] = new object[3] { "1/15/2020", "Redmond", 50 };
cells[1] = new object[3] { "1/25/2020", "Bellevue", 75 };
DataArray array = new DataArray()
{
ColumnNames = new string[] { "Date", "City", "Count" },
ColumnDataType = new DataArray.DataType[] {
DataArray.DataType.String,
DataArray.DataType.DateTime,
DataArray.DataType.Number },
Cells = cells
};
array = transformer.Transform(array);
Assert.Equal(array.TransformedColumnNames[0], "slicer_0");
Assert.Equal(array.TransformedColumnNames[1], "input_t_0");
Assert.Equal(array.TransformedColumnNames[2], "output_0");
}
[Fact]
public void Tranform_TimeSlicerSlicerCount()
{
DataTransformer transformer = new DataTransformer();
object[][] cells = new object[5][];
cells[0] = new object[4] { "1/15/2020", "Redmond", "1st Street", 50 };
cells[1] = new object[4] { "1/25/2020", "Redmond", "2nd Street", 75 };
cells[2] = new object[4] { "1/10/2020", "Bellevue", "3rd Street", 125 };
cells[3] = new object[4] { "1/13/2020", "Bellevue", "4th Street", 55 };
cells[4] = new object[4] { "1/20/2020", "Bellevue", "5th Street", 95 };
DataArray array = new DataArray()
{
ColumnNames = new string[] { "Date", "City", "Address", "Count" },
Cells = cells
};
array = transformer.Transform(array);
Assert.Equal(array.TransformedColumnNames[0], "input_t_0");
Assert.Equal(array.TransformedColumnNames[1], "slicer_0");
Assert.Equal(array.TransformedColumnNames[2], "slicer_1");
Assert.Equal(array.TransformedColumnNames[3], "output_0");
}
[Fact]
public void Tranform_TimeSlicerCountSlicer()
{
DataTransformer transformer = new DataTransformer();
object[][] cells = new object[5][];
cells[0] = new object[4] { "1/15/2020", "1st Street", 50, "Redmond" };
cells[1] = new object[4] { "1/25/2020", "2nd Street", 75, "Redmond" };
cells[2] = new object[4] { "1/10/2020", "3rd Street", 125, "Bellevue" };
cells[3] = new object[4] { "1/13/2020", "4th Street", 55, "Bellevue" };
cells[4] = new object[4] { "1/20/2020", "5th Street", 95, "Bellevue" };
DataArray array = new DataArray()
{
ColumnNames = new string[] { "Date", "Address", "Count", "City" },
Cells = cells
};
array = transformer.Transform(array);
Assert.Equal(array.TransformedColumnNames[0], "input_t_0");
Assert.Equal(array.TransformedColumnNames[1], "slicer_0");
Assert.Equal(array.TransformedColumnNames[2], "output_0");
Assert.Equal(array.TransformedColumnNames[3], "slicer_1");
}
[Fact]
public void Tranform_TimeSlicerCountCount()
{
DataTransformer transformer = new DataTransformer();
object[][] cells = new object[2][];
cells[0] = new object[4] { "1/15/2020", "1st Street", 50, 110 };
cells[1] = new object[4] { "1/25/2020", "2nd Street", 75, 160 };
DataArray array = new DataArray()
{
ColumnNames = new string[] { "Date", "Adress", "Count1", "Count2" },
Cells = cells
};
array = transformer.Transform(array);
Assert.Equal(array.TransformedColumnNames[0], "input_t_0");
Assert.Equal(array.TransformedColumnNames[1], "slicer_0");
Assert.Equal(array.TransformedColumnNames[2], "output_0");
Assert.Equal(array.TransformedColumnNames[3], "output_1");
}
[Fact]
public void Tranform_GroupSlicerTime()
{
DataTransformer transformer = new DataTransformer();
object[][] cells = new object[2][];
cells[0] = new object[3] { "1st Street", "Redmond", 110 };
cells[1] = new object[3] { "2nd Street", "Bellevue", 160 };
DataArray array = new DataArray()
{
ColumnNames = new string[] { "Address", "City", "Count" },
Cells = cells
};
array = transformer.Transform(array);
Assert.Equal(array.TransformedColumnNames[0], "input_g_0");
Assert.Equal(array.TransformedColumnNames[1], "slicer_0");
Assert.Equal(array.TransformedColumnNames[2], "output_0");
}
[Fact]
public void Tranform_SlicewrGroupSlicerTime()
{
DataTransformer transformer = new DataTransformer();
object[][] cells = new object[2][];
cells[0] = new object[4] { "1st Street", "Redmond", "North", 110 };
cells[1] = new object[4] { "2nd Street", "Redmond", "East", 160 };
DataArray array = new DataArray()
{
ColumnNames = new string[] { "Address", "City", "Direction", "Count" },
Cells = cells
};
array = transformer.Transform(array);
Assert.Equal(array.TransformedColumnNames[0], "slicer_0");
Assert.Equal(array.TransformedColumnNames[1], "input_g_0");
Assert.Equal(array.TransformedColumnNames[2], "slicer_1");
Assert.Equal(array.TransformedColumnNames[3], "output_0");
}
[Fact]
public void Tranform_SlicerGroupTime()
{
DataTransformer transformer = new DataTransformer();
object[][] cells = new object[2][];
cells[0] = new object[3] { "1st Street", "Redmond", 110 };
cells[1] = new object[3] { "2nd Street", "Redmond", 160 };
DataArray array = new DataArray()
{
ColumnNames = new string[] { "Address", "City", "Count" },
Cells = cells
};
array = transformer.Transform(array);
Assert.Equal(array.TransformedColumnNames[0], "slicer_0");
Assert.Equal(array.TransformedColumnNames[1], "input_g_0");
Assert.Equal(array.TransformedColumnNames[2], "output_0");
}
}
}
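For readers skimming the diff, a minimal sketch of the shapes these tests exercise may help. This outline is inferred from the assertions above and is not the shipped Microsoft.InsightsGenerator definitions:

namespace Microsoft.InsightsGenerator
{
    // Hypothetical outline inferred from the tests; the real types live in the
    // Microsoft.InsightsGenerator project and may carry additional members.
    public class DataArray
    {
        public enum DataType { String, Number, DateTime }

        public string[] ColumnNames { get; set; }             // original column headers
        public DataType[] ColumnDataType { get; set; }        // optional; deduced by Transform when omitted
        public object[][] Cells { get; set; }                  // row-major table values
        public string[] TransformedColumnNames { get; set; }  // input_t_*/input_g_*, slicer_*, output_*
    }

    public class DataTransformer
    {
        // Stub only: the real Transform classifies each column and fills TransformedColumnNames;
        // it returns null when the input array is null (see Tranform_NullInput above).
        public DataArray Transform(DataArray array) => array;
    }
}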

View File

@@ -0,0 +1,26 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup Label="Configuration">
<OutputType>Exe</OutputType>
<TargetFramework>$(TestProjectsTargetFramework)</TargetFramework>
<GenerateAssemblyInfo>false</GenerateAssemblyInfo>
<DefineConstants>$(DefineConstants);TRACE</DefineConstants>
<IsPackable>false</IsPackable>
<ApplicationIcon />
<StartupObject />
</PropertyGroup>
<ItemGroup>
<PackageReference Include="System.Text.Encoding.CodePages" />
<PackageReference Include="Microsoft.NET.Test.Sdk" />
<PackageReference Include="Moq" />
<PackageReference Include="NUnit" />
<PackageReference Include="xunit" />
<PackageReference Include="xunit.runner.visualstudio" />
<PackageReference Include="Microsoft.Data.SqlClient" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="../../src/Microsoft.InsightsGenerator/Microsoft.InsightsGenerator.csproj" />
</ItemGroup>
<ItemGroup>
<Service Include="{82a7f48d-3b50-4b1e-b82e-3ada8210c358}" />
</ItemGroup>
</Project>

View File

@@ -0,0 +1,41 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System.Reflection;
using System.Runtime.InteropServices;
// General Information about an assembly is controlled through the following
// set of attributes. Change these attribute values to modify the information
// associated with an assembly.
[assembly: AssemblyTitle("Microsoft InsightsGenerator tests")]
[assembly: AssemblyDescription("")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("")]
[assembly: AssemblyProduct("Microsoft InsightsGenerator tests")]
[assembly: AssemblyCopyright("Copyright <20> 2020")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
// Setting ComVisible to false makes the types in this assembly not visible
// to COM components. If you need to access a type in this assembly from
// COM, set the ComVisible attribute to true on that type.
[assembly: ComVisible(false)]
// The following GUID is for the ID of the typelib if this project is exposed to COM
[assembly: Guid("68aa66d3-4d62-4ecf-85a0-3944256cb161")]
// Version information for an assembly consists of the following four values:
//
// Major Version
// Minor Version
// Build Number
// Revision
//
// You can specify all the values or you can default the Build and Revision Numbers
// by using the '*' as shown below:
// [assembly: AssemblyVersion("1.0.*")]
[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]

View File

@@ -0,0 +1,83 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using System.Linq;
using Xunit;
using static Microsoft.InsightsGenerator.RulesEngine;
namespace Microsoft.InsightsGenerator.UnitTests
{
/// <summary>
/// Rules Engine tests
/// </summary>
public class RulesEngineTests
{
[Fact]
public void TemplateParserTest()
{
ColumnHeaders headerForTemp8 = TemplateParser(@"#inp had a total of #OutPar_N_C_1 ##OutPar_N_C_1 that constitues #Per% \n \n");
ColumnHeaders headerForTemp16 = TemplateParser(@"For the #slices ##SlicePar_GG_1(s), the percentage of ##OutPar_N_C_1 on #time were \n #stHData\n this was compared with #Etime where #ESlices ##SlicePar_GG_1\n #EstHData \n.");
var expectedSingleHashValuesForTemp16 = new List<string>(new string[] { "#slices", "#time", "#stHData", "#Etime", "#ESlices", "#EstHData" });
var expectedDoubleHashValuesForTemp16 = new List<string>(new string[] { "##SlicePar_GG_1(s)", "##OutPar_N_C_1", "##SlicePar_GG_1" });
var expectedSingleHashValuesForTemp8 = new List<string>(new string[] { "#inp", "#OutPar_N_C_1", "#Per%" });
var expectedDoubleHashValuesForTemp8 = new List<string>(new string[] { "##OutPar_N_C_1" });
Assert.True(Enumerable.SequenceEqual(expectedSingleHashValuesForTemp8, headerForTemp8.SingleHashValues));
Assert.True(Enumerable.SequenceEqual(expectedDoubleHashValuesForTemp8, headerForTemp8.DoubleHashValues));
Assert.True(Enumerable.SequenceEqual(expectedSingleHashValuesForTemp16, headerForTemp16.SingleHashValues));
Assert.True(Enumerable.SequenceEqual(expectedDoubleHashValuesForTemp16, headerForTemp16.DoubleHashValues));
}
[Fact]
public void RulesEngineEndToEndTest()
{
// Create test input objects for test #1
var singleHashList1 = new List<List<string>>();
var list1_1 = new List<string>() { "uniqueinputs", "15" };
var list1_2 = new List<string>() { "top", "3", "China: 55%", "United States: 49%", "Japan: 37%" };
singleHashList1.Add(list1_1);
singleHashList1.Add(list1_2);
DataArray testArray1 = new DataArray();
testArray1.ColumnNames = new string[] { "Country", "Area" };
testArray1.TransformedColumnNames = new string[] { "input_g_0", "output_0" };
// Create test input objects for test #2
var singleHashList2 = new List<List<string>>();
var list2_1 = new List<string>() { "bottom", "5", "Apple: 30%", "Oragne: 28%", "Strawberry: 17%", "Pear: 13%", "Peach: 8%" };
singleHashList2.Add(list2_1);
DataArray testArray2 = new DataArray();
testArray2.ColumnNames = new string[] { "fruits" };
testArray2.TransformedColumnNames = new string[] { "output_0" };
// Create test input objects for test #3
var singleHashList3 = new List<List<string>>();
var list3_1 = new List<string>() { "averageSlice", "4", "Cow: 60%", "Dog: 28%", "Cat: 17%", "Mouse: 8%"};
singleHashList3.Add(list3_1);
DataArray testArray3 = new DataArray();
testArray3.ColumnNames = new string[] { "animals" };
testArray3.TransformedColumnNames = new string[] { "slicer_0" };
var returnedStr1 = $@"{RulesEngine.FindMatchedTemplate(singleHashList1, testArray1)}";
var returnedStr2 = $@"{RulesEngine.FindMatchedTemplate(singleHashList2, testArray2)}";
var returnedStr3 = $@"{RulesEngine.FindMatchedTemplate(singleHashList3, testArray3)}";
string expectedOutput1 = "There were 15 Country (s), the top 3 highest total Area were as follows:\\n China: 55%" + Environment.NewLine + "United States: 49%" + Environment.NewLine + "Japan: 37%" + Environment.NewLine + Environment.NewLine + Environment.NewLine;
string expectedOutput2 = "The top 5 lowest total fruits were as follows:\\n Apple: 30%" + Environment.NewLine + "Oragne: 28%" + Environment.NewLine + "Strawberry: 17%" + Environment.NewLine + "Pear: 13%" + Environment.NewLine + "Peach: 8%" + Environment.NewLine + Environment.NewLine + Environment.NewLine;
string expectedOutput3 = "For the 4 animals, the volume of each is: Cow: 60%" + Environment.NewLine + "Dog: 28%" + Environment.NewLine + "Cat: 17%" + Environment.NewLine + "Mouse: 8%" + Environment.NewLine + Environment.NewLine + Environment.NewLine;
Assert.True(string.Equals(returnedStr1, expectedOutput1));
Assert.True(string.Equals(returnedStr2, expectedOutput2));
Assert.True(string.Equals(returnedStr3, expectedOutput3));
}
}
}
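TemplateParserTest above drives RulesEngine.TemplateParser, which splits a template into single-hash and double-hash placeholders collected on a ColumnHeaders object. A minimal, hypothetical illustration of that idea with regular expressions follows; it is not the shipped parser, whose exact tokenization (for instance around the literal "\n" markers and trailing punctuation) may differ:

using System.Collections.Generic;
using System.Text.RegularExpressions;

public class ColumnHeadersSketch
{
    public List<string> SingleHashValues { get; } = new List<string>();
    public List<string> DoubleHashValues { get; } = new List<string>();

    // Collects ##-prefixed tokens first, then #-prefixed tokens that are not part of a ## token.
    public static ColumnHeadersSketch Parse(string template)
    {
        var headers = new ColumnHeadersSketch();
        foreach (Match m in Regex.Matches(template, @"##\S+"))
        {
            headers.DoubleHashValues.Add(m.Value);
        }
        foreach (Match m in Regex.Matches(template, @"(?<!#)#(?!#)\S+"))
        {
            headers.SingleHashValues.Add(m.Value);
        }
        return headers;
    }
}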

View File

@@ -0,0 +1,340 @@
using System;
using System.Collections.Generic;
using System.Text;
using Xunit;
namespace Microsoft.InsightsGenerator.UnitTests
{
public class SignatureGeneratorTests
{
[Fact]
public void TopTest()
{
var expectedTopInsight = @"top
3
China (455) 19.73%
Turkey (254) 11.01%
United States (188) 8.15%";
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.OverallTopInsights(3, 0, 1);
CompareInsightWithExpectedOutcome(sigGen.Result.Insights, expectedTopInsight);
}
[Fact]
public void TopSliceTest()
{
var expectedTopSliceInsight = @"topPerSlice
5
Category1
3
China (455) 34.89%
Turkey (254) 19.48%
United States (188) 14.42%
Category2
3
Japan (171) 91.94%
China (10) 5.38%
United States (3) 1.61%
Category3
3
United States (106) 15.5%
Brazil (91) 13.3%
Korea (61) 8.92%
Category4
3
United States (38) 38%
China (12) 12%
Korea (8) 8%
Category5
3
Korea (21) 65.62%
United States (6) 18.75%
Canada (3) 9.38%";
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.SlicedTopInsights(3, 0, 2, 1);
CompareInsightWithExpectedOutcome(sigGen.Result.Insights, expectedTopSliceInsight);
}
[Fact]
public void BottomTest()
{
var expectedBottomInsight = @"bottom
3
Korea (1) 0.04%
Germany (1) 0.04%
India (1) 0.04%";
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.OverallBottomInsights(3, 0, 1);
CompareInsightWithExpectedOutcome(sigGen.Result.Insights, expectedBottomInsight);
}
[Fact]
public void BottomSliceTest()
{
var expectedBottomSliceInsight = @"bottomPerSlice
5
Category1
3
Canada (17) 1.3%
United Kingdom (17) 1.3%
Vietnam (18) 1.38%
Category2
3
Germany (1) 0.54%
Korea (1) 0.54%
United States (3) 1.61%
Category3
3
France (12) 1.75%
United Kingdom (20) 2.92%
Vietnam (22) 3.22%
Category4
3
India (1) 1%
Japan (1) 1%
Canada (2) 2%
Category5
3
India (1) 3.12%
Japan (1) 3.12%
Canada (3) 9.38%";
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.SlicedBottomInsights(3, 0, 2, 1);
CompareInsightWithExpectedOutcome(sigGen.Result.Insights, expectedBottomSliceInsight);
}
[Fact]
public void AverageTest()
{
var expectedAverageInsight = @"average
42.7";
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.OverallAverageInsights(1);
CompareInsightWithExpectedOutcome(sigGen.Result.Insights, expectedAverageInsight);
}
[Fact]
public void SumTest()
{
var expectedSumInsight = @"sum
2306";
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.OverallSumInsights(1);
CompareInsightWithExpectedOutcome(sigGen.Result.Insights, expectedSumInsight);
}
[Fact]
public void SlicedSumTest()
{
var expectedSlicedSumInsight = @"sumPerSlice
5
Category1
1304
Category2
186
Category3
684
Category4
100
Category5
32";
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.SlicedSumInsights(2, 1);
CompareInsightWithExpectedOutcome(sigGen.Result.Insights, expectedSlicedSumInsight);
}
[Fact]
public void SlicedAverageTest()
{
var expectedSlicedAverageInsight = @"sumPerSlice
5
Category1
86.93
Category2
37.2
Category3
45.6
Category4
7.14
Category5
6.4";
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.SlicedAverageInsights(2, 1);
CompareInsightWithExpectedOutcome(sigGen.Result.Insights, expectedSlicedAverageInsight);
}
[Fact]
public void SlicedPercentageTest()
{
var expectedSlicedPercentageInsight = @"percentagePerSlice
5
Category1
56.55
Category2
8.07
Category3
29.66
Category4
4.34
Category5
1.39";
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.SlicedPercentageInsights(2, 1);
CompareInsightWithExpectedOutcome(sigGen.Result.Insights, expectedSlicedPercentageInsight);
}
[Fact]
public void MaxAndMinInsightsTest()
{
var expectedMaxAndMinInsight = @"max
455
min
1";
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.OverallMaxInsights(1);
sigGen.OverallMinInsights(1);
CompareInsightWithExpectedOutcome(sigGen.Result.Insights, expectedMaxAndMinInsight);
}
[Fact]
public void MaxAndMinSlicedInsightsTest()
{
string expectedMaxAndMinSlicedInsight = @"maxPerSlice
5
Category1
455
Category2
171
Category3
106
Category4
38
Category5
21
minPerSlice
5
Category1
17
Category2
1
Category3
12
Category4
1
Category5
1";
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.SlicedMaxInsights(2, 1);
sigGen.SlicedMinInsights(2, 1);
CompareInsightWithExpectedOutcome(sigGen.Result.Insights, expectedMaxAndMinSlicedInsight);
}
public void CompareInsightWithExpectedOutcome(List<List<string>> insights, string expectedOutcome)
{
List<string> stringedInsights = new List<string>();
foreach (List<string> insight in insights)
{
stringedInsights.Add(string.Join(Environment.NewLine, insight));
}
Assert.Equal(expectedOutcome, string.Join(Environment.NewLine, stringedInsights));
}
[Fact]
public void LearnTest()
{
SignatureGenerator sigGen = new SignatureGenerator(sampleDataArray(false));
sigGen.Learn();
foreach (List<string> list in sigGen.Result.Insights)
{
foreach (string str in list)
{
Console.WriteLine(str);
}
}
}
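// Builds the 54-row sample table (country, numeric count, category slice) shared by the tests above.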
public DataArray sampleDataArray(bool timeinput)
{
DataArray sample = new DataArray();
var inputDataType = DataArray.DataType.String;
if (timeinput)
{
inputDataType = DataArray.DataType.DateTime;
}
sample.ColumnNames = new string[] { "input_g_0", "output_0", "slicer_0" };
sample.ColumnDataType = new DataArray.DataType[] { inputDataType, DataArray.DataType.Number, DataArray.DataType.String };
string sampleTableString =
@"China 455 Category1
Turkey 254 Category1
United States 188 Category1
Japan 171 Category2
United States 106 Category3
Brazil 91 Category3
Thailand 67 Category1
Korea 61 Category3
Russia 61 Category1
China 60 Category3
Brazil 57 Category1
Germany 51 Category3
Turkey 49 Category3
Russia 45 Category3
Japan 44 Category3
United States 38 Category4
Thailand 37 Category3
India 36 Category3
Germany 35 Category1
France 33 Category1
India 31 Category1
Japan 28 Category1
Mexico 27 Category3
Canada 23 Category3
Mexico 22 Category1
Vietnam 22 Category3
Korea 21 Category1
Korea 21 Category5
United Kingdom 20 Category3
Vietnam 18 Category1
Canada 17 Category1
United Kingdom 17 Category1
China 12 Category4
France 12 Category3
China 10 Category2
Korea 8 Category4
Brazil 6 Category4
Russia 6 Category4
United States 6 Category5
France 5 Category4
Germany 5 Category4
United Kingdom 5 Category4
Thailand 4 Category4
Turkey 4 Category4
Canada 3 Category5
Mexico 3 Category4
United States 3 Category2
Canada 2 Category4
Germany 1 Category2
India 1 Category4
India 1 Category5
Japan 1 Category4
Japan 1 Category5
Korea 1 Category2";
string[] sampleRows = sampleTableString.Split(Environment.NewLine);
List<string[]> sampleRowList = new List<string[]>();
foreach (var row in sampleRows)
{
sampleRowList.Add(row.Split(" "));
}
var columnTypes = new string[] { "input_g_1", "output_1", "slicer_1" };
sample.Cells = sampleRowList.ToArray();
sample.TransformedColumnNames = columnTypes;
return sample;
}
}
}
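The SumTest and AverageTest expectations above can be re-derived by hand from the 54-row sample: the counts total 2306, and 2306 / 54 ≈ 42.7. A standalone check (not the production SignatureGenerator) that reproduces those two numbers:

using System;
using System.Linq;

class ExpectedValueCheck
{
    static void Main()
    {
        // The 54 count values from sampleDataArray, in table order.
        int[] counts =
        {
            455, 254, 188, 171, 106, 91, 67, 61, 61, 60, 57, 51, 49, 45, 44, 38, 37, 36,
            35, 33, 31, 28, 27, 23, 22, 22, 21, 21, 20, 18, 17, 17, 12, 12, 10, 8, 6, 6,
            6, 5, 5, 5, 4, 4, 3, 3, 3, 2, 1, 1, 1, 1, 1, 1
        };
        Console.WriteLine(counts.Sum());                    // 2306  (SumTest)
        Console.WriteLine(Math.Round(counts.Average(), 1)); // 42.7  (AverageTest)
    }
}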

View File

@@ -0,0 +1,95 @@
using System;
using System.Collections.Generic;
using System.Text;
using Xunit;
namespace Microsoft.InsightsGenerator.UnitTests
{
public class WorkFlowTests
{
[Fact]
public async void mainWorkFlowTest()
{
Workflow instance = new Workflow();
string insight = await instance.ProcessInputData(getSampleDataArray());
Assert.NotNull(insight);
}
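// Builds the raw Country / Count / Category table (header row plus 54 data rows) that drives the end-to-end run.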
public DataArray getSampleDataArray()
{
string sampleTableString =
@"Country Count Category
China 455 Category1
Turkey 254 Category1
United States 188 Category1
Japan 171 Category2
United States 106 Category3
Brazil 91 Category3
Thailand 67 Category1
Korea 61 Category3
Russia 61 Category1
China 60 Category3
Brazil 57 Category1
Germany 51 Category3
Turkey 49 Category3
Russia 45 Category3
Japan 44 Category3
United States 38 Category4
Thailand 37 Category3
India 36 Category3
Germany 35 Category1
France 33 Category1
India 31 Category1
Japan 28 Category1
Mexico 27 Category3
Canada 23 Category3
Mexico 22 Category1
Vietnam 22 Category3
Korea 21 Category1
Korea 21 Category5
United Kingdom 20 Category3
Vietnam 18 Category1
Canada 17 Category1
United Kingdom 17 Category1
China 12 Category4
France 12 Category3
China 10 Category2
Korea 8 Category4
Brazil 6 Category4
Russia 6 Category4
United States 6 Category5
France 5 Category4
Germany 5 Category4
United Kingdom 5 Category4
Thailand 4 Category4
Turkey 4 Category4
Canada 3 Category5
Mexico 3 Category4
United States 3 Category2
Canada 2 Category4
Germany 1 Category2
India 1 Category4
India 1 Category5
Japan 1 Category4
Japan 1 Category5
Korea 1 Category2";
string[] sampleRows = sampleTableString.Split(Environment.NewLine);
var columnNames = sampleRows[0].Split(" ");
List<string[]> sampleRowList = new List<string[]>();
for (int i = 1; i < sampleRows.Length; i++)
{
sampleRowList.Add(sampleRows[i].Split(" "));
}
DataArray result = new DataArray();
result.ColumnNames = columnNames;
result.Cells = sampleRowList.ToArray();
return result;
}
}
}
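Taken together, the workflow test above shows the public entry point the other pieces feed into. A hedged sketch of host code that could drive it (hypothetical caller; only the Workflow and DataArray members used by the test are assumed):

using System;
using System.Threading.Tasks;
using Microsoft.InsightsGenerator;

class InsightsHostSketch
{
    static async Task Main()
    {
        // Minimal two-row table in the same Country / Count / Category shape as the test data.
        var data = new DataArray
        {
            ColumnNames = new[] { "Country", "Count", "Category" },
            Cells = new object[][]
            {
                new object[] { "China", 455, "Category1" },
                new object[] { "Japan", 171, "Category2" },
            }
        };

        var workflow = new Workflow();
        string insight = await workflow.ProcessInputData(data); // natural-language insight text
        Console.WriteLine(insight);
    }
}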