Added new Kusto ServiceLayer (#1009)

* Copy smoModel some rename

* Copy entire service layer

* Building copy

* Fixing some references

* Launch profile

* Resolve namespace issues

* Compiling tests. Correct manifest.

* Fixing localization resources

* ReliableKustoClient

* Some trimming of extra code and Kusto code

* Kusto client creation in bindingContent

* Removing Smo and new Kusto classes

* More trimming

* Kusto schema hookup

* Solidifying DataSource abstraction

* Solidifying further

* Latest refactoring

* More refactoring

* Building and launching Kusto service layer

* Working model which enumerates databases

* Refactoring to pass IDataSource to all tree nodes

* Removing some dependencies on the context

* Working with tables and schema

* Comment checkin

* Refactoring to give out select script

* Query created and sent back to ADS

* Fix query generation

* Fix listing of databases

* Tunneling the query through.

* Successful query execution

* Return only results table

* Deleting Cms

* Delete DacFx

* Delete SchemaCompare and TaskServices

* Change build definition to not stop at launch

* Fix error after merge

* Save Kusto results in different formats (#935)

* save results as csv etc

* some fixes

Co-authored-by: Monica Gupta <mogupt@microsoft.com>

* 2407 Added OrderBy clause in KustoDataSource > GetDatabaseMetaData and GetColumnMetadata (#959)

* 2405 Defaulted Options when setting ServerInfo in ConnectionService > GetConnectionCompleteParams (#965)

* 2747 Fixed IsUnknownType error for Kusto (#989)

* 2747 Removed unused directives in Kusto > DbColumnWrapper. Refactored IsUnknownType to handle null DataTypeName

* 2747 Reverted IsUnknownType change in DbColumnWrapper. Changed DataTypeName to get value from ColumnType. Refactored SafeGetValue to type check before hard casting to reduce cast exceptions.

* Added EmbeddedResourceUseDependentUponConvention to Microsoft.Kusto.ServiceLayer.csproj. Also renamed DACfx to match Microsoft.SqlTools.ServiceLayer. Added to compile Exclude="**/obj/**/*.cs"

* Srahman cleanup sql code (#992)

* Removed Management and Security Service Code.

* Remove FileBrowser service

* Comment why we are using SqlServer library

* Remove SQL specific type definitions

* clean up formatter service (#996)

Co-authored-by: Monica Gupta <mogupt@microsoft.com>

* Code clean up and Kusto intellisense (#994)

* Code clean up and Kusto intellisense

* Addressed few comments

* Addressed few comments

* addressed comments

Co-authored-by: Monica Gupta <mogupt@microsoft.com>

* Return multiple tables for Kusto

* Changes required for Kusto manage dashboard (#1039)

* Changes required for manage dashboard

* Addressed comments

Co-authored-by: Monica Gupta <mogupt@microsoft.com>

* 2728 Kusto function support (#1038)

* loc update (#914)

* loc update

* loc updates

* 2728 moved ColumnInfo and KustoResultsReader to separate files. Added Folder and Function to TreeNode.cs

* 2728 Added FunctionInfo. Added Folder to ColumnInfo. Removed partial class from KustoResultsReader. Set Function.IsAlwaysLeaf=true in TreeNode.cs. In KustoDataSource changed tableMetadata type to TableMetaData. Added folder and function dictionaries. Refactored GetSchema function. Renamed GenerateColumnMetadataKey to GenerateMetadataKey

* 2728 Added FunctionInfo. Added Folder to ColumnInfo. Removed partial class from KustoResultsReader. Set Function.IsAlwaysLeaf=true in TreeNode.cs. In KustoDataSource changed tableMetadata type to TableMetaData. Added folder and function dictionaries. Refactored GetSchema function. Renamed GenerateColumnMetadataKey to GenerateMetadataKey

* 2728 Created new SqlConnection within using block. Refactored KustoDataSource > columnmetadata to sort on get instead of insert.

* 2728 Added GetFunctionInfo function to KustoDataSource.

* 2728 Reverted change to Microsoft.Kusto.ServiceLayer.csproj from merge

* 2728 Reverted change to SqlTools.ServiceLayer\Localization\transXliff

* 2728 Reverted change to sr.de.xlf and sr.zh-hans.xlf

* 2728 Refactored KustoDataSource Function folders to support subfolders

* 2728 Refactored KustoDataSource to use urn for folders, functions, and tables instead of name.

* Merge remote-tracking branch 'origin/main' into feature-ADE

# Conflicts:
#	Packages.props

* 2728 Moved metadata files into Metadata subdirectory. Added GenerateAlterFunction to IDataSource and DataSourceBase.

* 2728 Added summary information to SafeAdd in SystemExtensions. Renamed local variable in SetTableMetadata

* 2728 Moved SafeAdd from SystemExtensions to KustoQueryUtils. Added check when getting database schema to return existing records before querying again. Added AddRange function to KustoQueryUtils. Created SetFolderMetadataForFunctions method.

* 2728 Added DatabaseKeyPrefix to only return tables to a database for the dashboard. Added logic to store all database tables within the tableMetadata dictionary for the dashboard.

* 2728 Created TableInfo and moved info objects into Models directory. Refactored KustoDataSource to lazy load columns for tables. Refactored logic to load tables using cslschema instead of schema.

* 2728 Renamed LoadColumnSchema to GetTableSchema to be consistent.

Co-authored-by: khoiph1 <khoiph@microsoft.com>

* Addressed comments

Co-authored-by: Shafiq Rahman <srahman@microsoft.com>
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
Co-authored-by: Justin M <63619224+JustinMDotNet@users.noreply.github.com>
Co-authored-by: rkselfhost <rkselfhost@outlook.com>
Co-authored-by: khoiph1 <khoiph@microsoft.com>
This commit is contained in:
Monica Gupta
2020-08-12 15:34:38 -07:00
committed by GitHub
parent d2f5bfaa16
commit 148b6e398d
276 changed files with 75983 additions and 1 deletions


@@ -0,0 +1,44 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Represents a value returned from a read from a file stream. This is used to eliminate ref
/// parameters used in the read methods.
/// </summary>
public struct FileStreamReadResult
{
/// <summary>
/// The total length in bytes of the value (including the bytes used to store the length
/// of the value)
/// </summary>
/// <remarks>
/// Cell values are stored such that the length of the value is stored first, then the
/// value itself is stored. Eg, a string may be stored as 0x03 0x6C 0x6F 0x6C. Under this
/// system, the value would be "lol", the length would be 3, and the total length would be
/// 4 bytes.
/// </remarks>
public int TotalLength { get; set; }
/// <summary>
/// Value of the cell
/// </summary>
public DbCellValue Value { get; set; }
/// <summary>
/// Constructs a new FileStreamReadResult
/// </summary>
/// <param name="value">The value of the result, ready for consumption by a client</param>
/// <param name="totalLength">The number of bytes used to store the value's length and the value itself</param>
public FileStreamReadResult(DbCellValue value, int totalLength)
{
Value = value;
TotalLength = totalLength;
}
}
}
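The remarks above describe the length-prefixed cell layout that `TotalLength` accounts for: the length is stored first, then the value bytes. A minimal Python sketch of that layout (assuming, for illustration only, a single-byte length prefix as in the "lol" example; the real service buffer uses richer encodings per type):

```python
def write_cell(buf: bytearray, value: bytes) -> int:
    """Append one cell as <length><value>; return the total bytes written."""
    buf.append(len(value))          # hypothetical single-byte length prefix
    buf.extend(value)
    return 1 + len(value)

def read_cell(buf: bytes, offset: int):
    """Read one cell; returns (value, total_length), like FileStreamReadResult."""
    length = buf[offset]
    value = buf[offset + 1 : offset + 1 + length]
    return value, 1 + length

buf = bytearray()
write_cell(buf, b"lol")             # stored as 0x03 0x6C 0x6F 0x6C
value, total_length = read_cell(bytes(buf), 0)
# value == b"lol", total_length == 4
```

The total length (4) is the figure a caller would use to advance its file offset to the next cell.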


@@ -0,0 +1,22 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Interface for a factory that creates filesystem readers/writers
/// </summary>
public interface IFileStreamFactory
{
string CreateFile();
IFileStreamReader GetReader(string fileName);
IFileStreamWriter GetWriter(string fileName);
void DisposeFile(string fileName);
}
}


@@ -0,0 +1,19 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Interface for an object that reads from the filesystem
/// </summary>
public interface IFileStreamReader : IDisposable
{
IList<DbCellValue> ReadRow(long offset, long rowId, IEnumerable<DbColumnWrapper> columns);
}
}


@@ -0,0 +1,23 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Interface for an object that writes to a filesystem wrapper
/// </summary>
public interface IFileStreamWriter : IDisposable
{
int WriteRow(StorageDataReader dataReader);
void WriteRow(IList<DbCellValue> row, IList<DbColumnWrapper> columns);
void Seek(long offset);
void FlushBuffer();
}
}


@@ -0,0 +1,73 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.IO;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
using Microsoft.Kusto.ServiceLayer.SqlContext;
using Microsoft.Kusto.ServiceLayer.Utility;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Factory for creating a reader/writer pair that will read from the temporary buffer file
/// and output to a CSV file.
/// </summary>
public class SaveAsCsvFileStreamFactory : IFileStreamFactory
{
#region Properties
/// <summary>
/// Settings for query execution
/// </summary>
public QueryExecutionSettings QueryExecutionSettings { get; set; }
/// <summary>
/// Parameters for the save as CSV request
/// </summary>
public SaveResultsAsCsvRequestParams SaveRequestParams { get; set; }
#endregion
/// <summary>
/// File names are not meant to be created with this factory.
/// </summary>
/// <exception cref="NotImplementedException">Always thrown</exception>
[Obsolete]
public string CreateFile()
{
throw new NotImplementedException();
}
/// <summary>
/// Returns a new service buffer reader for reading results back in from the temporary buffer files, file share is ReadWrite to allow concurrent reads/writes to the file.
/// </summary>
/// <param name="fileName">Path to the temp buffer file</param>
/// <returns>Stream reader</returns>
public IFileStreamReader GetReader(string fileName)
{
return new ServiceBufferFileStreamReader(new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite), QueryExecutionSettings);
}
/// <summary>
/// Returns a new CSV writer for writing results to a CSV file, file share is ReadWrite to allow concurrent reads/writes to the file.
/// </summary>
/// <param name="fileName">Path to the CSV output file</param>
/// <returns>Stream writer</returns>
public IFileStreamWriter GetWriter(string fileName)
{
return new SaveAsCsvFileStreamWriter(new FileStream(fileName, FileMode.Create, FileAccess.ReadWrite, FileShare.ReadWrite), SaveRequestParams);
}
/// <summary>
/// Safely deletes the file
/// </summary>
/// <param name="fileName">Path to the file to delete</param>
public void DisposeFile(string fileName)
{
FileUtilities.SafeFileDelete(fileName);
}
}
}


@@ -0,0 +1,161 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Writer for writing rows of results to a CSV file
/// </summary>
public class SaveAsCsvFileStreamWriter : SaveAsStreamWriter
{
#region Member Variables
private readonly SaveResultsAsCsvRequestParams saveParams;
private bool headerWritten;
#endregion
/// <summary>
/// Constructor, stores the CSV specific request params locally, chains into the base
/// constructor
/// </summary>
/// <param name="stream">FileStream to access the CSV file output</param>
/// <param name="requestParams">CSV save as request parameters</param>
public SaveAsCsvFileStreamWriter(Stream stream, SaveResultsAsCsvRequestParams requestParams)
: base(stream, requestParams)
{
saveParams = requestParams;
}
/// <summary>
/// Writes a row of data as a CSV row. If this is the first row and the user has requested
/// it, the headers for the columns will be emitted as well.
/// </summary>
/// <param name="row">The data of the row to output to the file</param>
/// <param name="columns">
/// The entire list of columns for the result set. They will be filtered down as per the
/// request params.
/// </param>
public override void WriteRow(IList<DbCellValue> row, IList<DbColumnWrapper> columns)
{
char delimiter = ',';
if(!string.IsNullOrEmpty(saveParams.Delimiter))
{
// first char in string
delimiter = saveParams.Delimiter[0];
}
string lineSeperator = Environment.NewLine;
if(!string.IsNullOrEmpty(saveParams.LineSeperator))
{
lineSeperator = saveParams.LineSeperator;
}
char textIdentifier = '"';
if(!string.IsNullOrEmpty(saveParams.TextIdentifier))
{
// first char in string
textIdentifier = saveParams.TextIdentifier[0];
}
Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);
int codepage;
Encoding encoding;
try
{
if(int.TryParse(saveParams.Encoding, out codepage))
{
encoding = Encoding.GetEncoding(codepage);
}
else
{
encoding = Encoding.GetEncoding(saveParams.Encoding);
}
}
catch
{
// Fallback encoding when specified codepage is invalid
encoding = Encoding.GetEncoding("utf-8");
}
// Write out the header if we haven't already and the user chose to have it
if (saveParams.IncludeHeaders && !headerWritten)
{
// Build the string
var selectedColumns = columns.Skip(ColumnStartIndex ?? 0).Take(ColumnCount ?? columns.Count)
.Select(c => EncodeCsvField(c.ColumnName, delimiter, textIdentifier) ?? string.Empty);
string headerLine = string.Join(delimiter, selectedColumns);
// Encode it and write it out
byte[] headerBytes = encoding.GetBytes(headerLine + lineSeperator);
FileStream.Write(headerBytes, 0, headerBytes.Length);
headerWritten = true;
}
// Build the string for the row
var selectedCells = row.Skip(ColumnStartIndex ?? 0)
.Take(ColumnCount ?? columns.Count)
.Select(c => EncodeCsvField(c.DisplayValue, delimiter, textIdentifier));
string rowLine = string.Join(delimiter, selectedCells);
// Encode it and write it out
byte[] rowBytes = encoding.GetBytes(rowLine + lineSeperator);
FileStream.Write(rowBytes, 0, rowBytes.Length);
}
/// <summary>
/// Encodes a single field for inserting into a CSV record. The following rules are applied:
/// <list type="bullet">
/// <item><description>All text identifier characters (double quotes by default) are replaced with a pair of consecutive text identifier characters</description></item>
/// </list>
/// The entire field is also surrounded by a pair of text identifier characters if any of the following conditions are met:
/// <list type="bullet">
/// <item><description>The field begins or ends with a space</description></item>
/// <item><description>The field begins or ends with a tab</description></item>
/// <item><description>The field contains the delimiter character</description></item>
/// <item><description>The field contains the '\n' character</description></item>
/// <item><description>The field contains the '\r' character</description></item>
/// <item><description>The field contains the text identifier character</description></item>
/// </list>
/// </summary>
/// <param name="field">The field to encode</param>
/// <param name="delimiter">The character that separates fields</param>
/// <param name="textIdentifier">The character used to quote fields</param>
/// <returns>The CSV encoded version of the original field</returns>
internal static string EncodeCsvField(string field, char delimiter, char textIdentifier)
{
string strTextIdentifier = textIdentifier.ToString();
// Special case for nulls
if (field == null)
{
return "NULL";
}
// Whether this field has special characters which require it to be embedded in quotes
bool embedInQuotes = field.IndexOfAny(new[] { delimiter, '\r', '\n', textIdentifier }) >= 0 // Contains special characters
|| field.StartsWith(" ") || field.EndsWith(" ") // Start/Ends with space
|| field.StartsWith("\t") || field.EndsWith("\t"); // Starts/Ends with tab
//Replace all quotes in the original field with double quotes
string ret = field.Replace(strTextIdentifier, strTextIdentifier + strTextIdentifier);
if (embedInQuotes)
{
ret = strTextIdentifier + ret + strTextIdentifier;
}
return ret;
}
}
}


@@ -0,0 +1,73 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.IO;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
using Microsoft.Kusto.ServiceLayer.SqlContext;
using Microsoft.Kusto.ServiceLayer.Utility;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Factory for creating a reader/writer pair that will read from the temporary buffer file
/// and output to a Excel file.
/// </summary>
public class SaveAsExcelFileStreamFactory : IFileStreamFactory
{
#region Properties
/// <summary>
/// Settings for query execution
/// </summary>
public QueryExecutionSettings QueryExecutionSettings { get; set; }
/// <summary>
/// Parameters for the save as Excel request
/// </summary>
public SaveResultsAsExcelRequestParams SaveRequestParams { get; set; }
#endregion
/// <summary>
/// File names are not meant to be created with this factory.
/// </summary>
/// <exception cref="NotImplementedException">Always thrown</exception>
[Obsolete]
public string CreateFile()
{
throw new NotImplementedException();
}
/// <summary>
/// Returns a new service buffer reader for reading results back in from the temporary buffer files, file share is ReadWrite to allow concurrent reads/writes to the file.
/// </summary>
/// <param name="fileName">Path to the temp buffer file</param>
/// <returns>Stream reader</returns>
public IFileStreamReader GetReader(string fileName)
{
return new ServiceBufferFileStreamReader(new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite), QueryExecutionSettings);
}
/// <summary>
/// Returns a new Excel writer for writing results to an Excel file, file share is ReadWrite to allow concurrent reads/writes to the file.
/// </summary>
/// <param name="fileName">Path to the Excel output file</param>
/// <returns>Stream writer</returns>
public IFileStreamWriter GetWriter(string fileName)
{
return new SaveAsExcelFileStreamWriter(new FileStream(fileName, FileMode.Create, FileAccess.ReadWrite, FileShare.ReadWrite), SaveRequestParams);
}
/// <summary>
/// Safely deletes the file
/// </summary>
/// <param name="fileName">Path to the file to delete</param>
public void DisposeFile(string fileName)
{
FileUtilities.SafeFileDelete(fileName);
}
}
}


@@ -0,0 +1,87 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System.Collections.Generic;
using System.IO;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Writer for writing rows of results to an Excel file
/// </summary>
public class SaveAsExcelFileStreamWriter : SaveAsStreamWriter
{
#region Member Variables
private readonly SaveResultsAsExcelRequestParams saveParams;
private bool headerWritten;
private SaveAsExcelFileStreamWriterHelper helper;
private SaveAsExcelFileStreamWriterHelper.ExcelSheet sheet;
#endregion
/// <summary>
/// Constructor, stores the Excel specific request params locally, chains into the base
/// constructor
/// </summary>
/// <param name="stream">FileStream to access the Excel file output</param>
/// <param name="requestParams">Excel save as request parameters</param>
public SaveAsExcelFileStreamWriter(Stream stream, SaveResultsAsExcelRequestParams requestParams)
: base(stream, requestParams)
{
saveParams = requestParams;
helper = new SaveAsExcelFileStreamWriterHelper(stream);
sheet = helper.AddSheet();
}
/// <summary>
/// Writes a row of data as an Excel row. If this is the first row and the user has requested
/// it, the headers for the columns will be emitted as well.
/// </summary>
/// <param name="row">The data of the row to output to the file</param>
/// <param name="columns">
/// The entire list of columns for the result set. They will be filtered down as per the
/// request params.
/// </param>
public override void WriteRow(IList<DbCellValue> row, IList<DbColumnWrapper> columns)
{
int columnStart = ColumnStartIndex ?? 0;
int columnEnd = (ColumnEndIndex != null) ? ColumnEndIndex.Value + 1 : columns.Count;
// Write out the header if we haven't already and the user chose to have it
if (saveParams.IncludeHeaders && !headerWritten)
{
sheet.AddRow();
for (int i = columnStart; i < columnEnd; i++)
{
sheet.AddCell(columns[i].ColumnName);
}
headerWritten = true;
}
sheet.AddRow();
for (int i = columnStart; i < columnEnd; i++)
{
sheet.AddCell(row[i]);
}
}
private bool disposed;
override protected void Dispose(bool disposing)
{
if (disposed)
return;
sheet.Dispose();
helper.Dispose();
disposed = true;
base.Dispose(disposing);
}
}
}


@@ -0,0 +1,805 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
using System.Xml;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
// A xlsx file is a zip with specific folder structure.
// http://www.ecma-international.org/publications/standards/Ecma-376.htm
// The page number in the comments are based on
// ECMA-376, Fifth Edition, Part 1 - Fundamentals And Markup Language Reference
// Page 75, SpreadsheetML package structure
// |- [Content_Types].xml
// |- _rels
// |- .rels
// |- xl
// |- workbook.xml
// |- styles.xml
// |- _rels
// |- workbook.xml.rels
// |- worksheets
// |- sheet1.xml
/// <summary>
/// A helper class for writing xlsx files based on ECMA-376. It tries to be minimal,
/// both in implementation and runtime allocation.
/// </summary>
/// <example>
/// This sample shows how to use the class
/// <code>
/// public class TestClass
/// {
/// public static int Main()
/// {
/// using (Stream stream = File.Create("test.xlsx"))
/// using (var helper = new SaveAsExcelFileStreamWriterHelper(stream, false))
/// using (var sheet = helper.AddSheet())
/// {
/// sheet.AddRow();
/// sheet.AddCell("string");
/// }
/// }
/// }
/// </code>
/// </example>
internal sealed class SaveAsExcelFileStreamWriterHelper : IDisposable
{
/// <summary>
/// Represents an Excel sheet
/// </summary>
public sealed class ExcelSheet : IDisposable
{
// The Excel epoch is 1/1/1900, but Excel also accepts the nonexistent dates 1/0/1900
// and 2/29/1900, which is equivalent to setting the epoch back two days to 12/30/1899
// new DateTime(1899,12,30).Ticks
private const long ExcelEpochTick = 599264352000000000L;
// Excel cannot use dates before 1/0/1900, and dates before 3/1/1900 are
// off by one because of the phantom 2/29/1900;
// thus, any date before 3/1/1900 is written as a string
// new DateTime(1900,3,1).Ticks
private const long ExcelDateCutoffTick = 599317056000000000L;
// new TimeSpan(24,0,0).Ticks
private const long TicksPerDay = 864000000000L;
private XmlWriter writer;
private ReferenceManager referenceManager;
private bool hasOpenRowTag;
/// <summary>
/// Initializes a new instance of the ExcelSheet class.
/// </summary>
/// <param name="writer">XmlWriter to write the sheet data</param>
internal ExcelSheet(XmlWriter writer)
{
this.writer = writer;
writer.WriteStartDocument();
writer.WriteStartElement("worksheet", "http://schemas.openxmlformats.org/spreadsheetml/2006/main");
writer.WriteAttributeString("xmlns", "r", null, "http://schemas.openxmlformats.org/officeDocument/2006/relationships");
writer.WriteStartElement("sheetData");
referenceManager = new ReferenceManager(writer);
}
/// <summary>
/// Start a new row
/// </summary>
public void AddRow()
{
EndRowIfNeeded();
hasOpenRowTag = true;
referenceManager.AssureRowReference();
writer.WriteStartElement("row");
referenceManager.WriteAndIncreaseRowReference();
}
/// <summary>
/// Write a string cell
/// </summary>
/// <param name="value">string value to write</param>
public void AddCell(string value)
{
// string needs <c t="inlineStr"><is><t>string</t></is></c>
// This class uses inlineStr instead of more common shared string table
// to improve write performance and reduce implementation complexity
referenceManager.AssureColumnReference();
if (value == null)
{
AddCellEmpty();
return;
}
writer.WriteStartElement("c");
referenceManager.WriteAndIncreaseColumnReference();
writer.WriteAttributeString("t", "inlineStr");
writer.WriteStartElement("is");
writer.WriteStartElement("t");
writer.WriteValue(value);
writer.WriteEndElement();
writer.WriteEndElement();
writer.WriteEndElement();
}
/// <summary>
/// Write an object cell
/// </summary>
/// <remarks>
/// Writes the value as a boolean, number, or datetime when possible; otherwise falls back to the display string
/// </remarks>
/// <param name="dbCellValue">The cell value to write</param>
public void AddCell(DbCellValue dbCellValue)
{
object o = dbCellValue.RawObject;
if (dbCellValue.IsNull || o == null)
{
AddCellEmpty();
return;
}
switch (Type.GetTypeCode(o.GetType()))
{
case TypeCode.Boolean:
AddCell((bool)o);
break;
case TypeCode.Byte:
case TypeCode.Int16:
case TypeCode.Int32:
case TypeCode.Int64:
case TypeCode.Single:
case TypeCode.Double:
case TypeCode.Decimal:
AddCellBoxedNumber(o);
break;
case TypeCode.DateTime:
AddCell((DateTime)o);
break;
case TypeCode.String:
AddCell((string)o);
break;
default:
if (o is TimeSpan) //TimeSpan doesn't have TypeCode
{
AddCell((TimeSpan)o);
break;
}
AddCell(dbCellValue.DisplayValue);
break;
}
}
/// <summary>
/// Close the <row><sheetData><worksheet> tags and close the stream
/// </summary>
public void Dispose()
{
EndRowIfNeeded();
writer.WriteEndElement(); // <sheetData>
writer.WriteEndElement(); // <worksheet>
writer.Dispose();
}
/// <summary>
/// Write an empty cell
/// </summary>
/// This only increases the internal bookmark and doesn't actually write out anything.
private void AddCellEmpty()
{
referenceManager.IncreaseColumnReference();
}
/// <summary>
/// Write a bool cell.
/// </summary>
/// <param name="value">Boolean value to write</param>
private void AddCell(bool value)
{
// Excel FALSE: <c r="A1" t="b"><v>0</v></c>
// Excel TRUE: <c r="A1" t="b"><v>1</v></c>
referenceManager.AssureColumnReference();
writer.WriteStartElement("c");
referenceManager.WriteAndIncreaseColumnReference();
writer.WriteAttributeString("t", "b");
writer.WriteStartElement("v");
if (value)
{
writer.WriteValue("1"); //use string to avoid convert
}
else
{
writer.WriteValue("0");
}
writer.WriteEndElement();
writer.WriteEndElement();
}
/// <summary>
/// Write a TimeSpan cell.
/// </summary>
/// <param name="time"></param>
private void AddCell(TimeSpan time)
{
referenceManager.AssureColumnReference();
double excelDate = (double)time.Ticks / (double)TicksPerDay;
// The default hh:mm:ss format do not support more than 24 hours
// For that case, use the format string [h]:mm:ss
if (time.Ticks >= TicksPerDay)
{
AddCellDateTimeInternal(excelDate, Style.TimeMoreThan24Hours);
}
else
{
AddCellDateTimeInternal(excelDate, Style.Time);
}
}
/// <summary>
/// Write a DateTime cell.
/// </summary>
/// <param name="dateTime">Datetime</param>
/// <remark>
/// If the DateTime does not have a date part, it is written as a datetime and shown as time only.
/// If the DateTime is before 1900-03-01, it is saved as a string because Excel doesn't support such dates.
/// Otherwise it is saved as a datetime; if the time is 00:00:00, it is shown as yyyy-MM-dd.
/// The datetime is shown as yyyy-MM-dd HH:mm:ss in all other cases
/// </remark>
private void AddCell(DateTime dateTime)
{
referenceManager.AssureColumnReference();
long ticks = dateTime.Ticks;
Style style = Style.DateTime;
double excelDate;
if (ticks < TicksPerDay) //date empty, time only
{
style = Style.Time;
excelDate = ((double)ticks) / (double)TicksPerDay;
}
else if (ticks < ExcelDateCutoffTick) //before excel cut-off, use string
{
AddCell(dateTime.ToString("yyyy-MM-dd", System.Globalization.CultureInfo.InvariantCulture));
return;
}
else
{
if (ticks % TicksPerDay == 0) //time empty, date only
{
style = Style.Date;
}
excelDate = ((double)(ticks - ExcelEpochTick)) / (double)TicksPerDay;
}
AddCellDateTimeInternal(excelDate, style);
}
// number needs <c r="A1"><v>12.5</v></c>
private void AddCellBoxedNumber(object number)
{
referenceManager.AssureColumnReference();
writer.WriteStartElement("c");
referenceManager.WriteAndIncreaseColumnReference();
writer.WriteStartElement("v");
writer.WriteValue(number);
writer.WriteEndElement();
writer.WriteEndElement();
}
// datetime needs <c r="A1" s="2"><v>26012.451</v></c>
private void AddCellDateTimeInternal(double excelDate, Style style)
{
writer.WriteStartElement("c");
referenceManager.WriteAndIncreaseColumnReference();
writer.WriteStartAttribute("s");
writer.WriteValue((int)style);
writer.WriteEndAttribute();
writer.WriteStartElement("v");
writer.WriteValue(excelDate);
writer.WriteEndElement();
writer.WriteEndElement();
}
private void EndRowIfNeeded()
{
if (hasOpenRowTag)
{
writer.WriteEndElement(); // <row>
}
}
}
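The tick constants at the top of ExcelSheet encode Excel's adjusted epoch and the 1900-03-01 cutoff that AddCell(DateTime) checks. The same serial-date arithmetic, sketched in Python for clarity (the C# code works in raw ticks rather than datetime arithmetic):

```python
from datetime import datetime, timedelta

# Epoch shifted back two days to 1899-12-30 to absorb Excel's phantom
# 1900-01-00 and 1900-02-29 entries (matches ExcelEpochTick above)
EXCEL_EPOCH = datetime(1899, 12, 30)
EXCEL_CUTOFF = datetime(1900, 3, 1)   # earlier dates are written as strings instead

def to_excel_serial(dt: datetime) -> float:
    """Fractional days since the adjusted Excel epoch."""
    return (dt - EXCEL_EPOCH) / timedelta(days=1)

print(to_excel_serial(datetime(1900, 3, 1)))           # 61.0
print(to_excel_serial(datetime(2020, 8, 12, 12, 0)))   # 44055.5
```

Times of day become the fractional part (noon is .5), which is why a value under 1.0 is styled as time-only in the writer above.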
/// <summary>
/// Helper class to track the current cell reference.
/// </summary>
/// <remarks>
/// SpreadsheetML cell needs a reference attribute. (e.g. r="A1"). This class is used
/// to track the current cell reference.
/// </remarks>
internal class ReferenceManager
{
private int currColumn; // 0 is invalid, the first AddRow will set to 1
private int currRow = 1;
// In order to reduce allocation, current reference is saved in this array,
// and write to the XmlWriter through WriteChars.
// For example, when the reference has value AA15,
// The content of this array will be @AA15xxxxx, with currReferenceRowLength=2
// and currReferenceColumnLength=2
private char[] currReference = new char[3 + 7]; //maximal XFD1048576
private int currReferenceRowLength;
private int currReferenceColumnLength;
private XmlWriter writer;
/// <summary>
/// Initializes a new instance of the ReferenceManager class.
/// </summary>
/// <param name="writer">XmlWriter to write the reference attribute to.</param>
public ReferenceManager(XmlWriter writer)
{
this.writer = writer;
}
/// <summary>
/// Check that we have not written too many columns. (xlsx has a limit of 16384 columns)
/// </summary>
public void AssureColumnReference()
{
if (currColumn == 0)
{
throw new InvalidOperationException("AddRow must be called before AddCell");
}
if (currColumn > 16384)
{
throw new InvalidOperationException("max column number is 16384, see https://support.office.com/en-us/article/Excel-specifications-and-limits-1672b34d-7043-467e-8e27-269d656771c3");
}
}
/// <summary>
/// Write out the r="A1" attribute and increase the column number of internal bookmark
/// </summary>
public void WriteAndIncreaseColumnReference()
{
writer.WriteStartAttribute("r");
writer.WriteChars(currReference, 3 - currReferenceColumnLength, currReferenceRowLength + currReferenceColumnLength);
writer.WriteEndAttribute();
IncreaseColumnReference();
}
/// <summary>
/// Increase the column of internal bookmark.
/// </summary>
public void IncreaseColumnReference()
{
// This function changes the first three chars of the currReference array.
// The logic is simple: when a new row starts, the array is reset to @@A,
// where @='A'-1. At each increase, check whether the current reference ends in Z
// and roll over to AA if needed; since the maximum is 16384, or XFD, the code
// manipulates the array elements directly instead of looping
char[] reference = currReference;
currColumn++;
if ('Z' == reference[2]++)
{
reference[2] = 'A';
if (currReferenceColumnLength < 2)
{
currReferenceColumnLength = 2;
}
if ('Z' == reference[1]++)
{
reference[0]++;
reference[1] = 'A';
currReferenceColumnLength = 3;
}
}
}
/// <summary>
/// Checks that we have not written too many rows (xlsx has a limit of 1048576 rows).
/// </summary>
public void AssureRowReference()
{
if (currRow > 1048576)
{
throw new InvalidOperationException("max row number is 1048576, see https://support.office.com/en-us/article/Excel-specifications-and-limits-1672b34d-7043-467e-8e27-269d656771c3");
}
}
/// <summary>
/// Writes out the r="1" attribute and increases the row number of the internal bookmark.
/// </summary>
public void WriteAndIncreaseRowReference()
{
writer.WriteStartAttribute("r");
writer.WriteValue(currRow);
writer.WriteEndAttribute();
ResetColumnReference(); // This needs to be called before currRow is increased
currRow++;
}
// Reset the column reference.
// This resets the first three chars of the currReference array to '@@A'
// and the rest of the array to the string representation of the current row.
private void ResetColumnReference()
{
currColumn = 1;
currReference[0] = currReference[1] = (char)('A' - 1);
currReference[2] = 'A';
currReferenceColumnLength = 1;
string rowReference = XmlConvert.ToString(currRow);
currReferenceRowLength = rowReference.Length;
rowReference.CopyTo(0, currReference, 3, rowReference.Length);
}
}
private enum Style
{
Normal = 0,
Date = 1,
Time = 2,
DateTime = 3,
TimeMoreThan24Hours = 4,
}
private ZipArchive zipArchive;
private List<string> sheetNames = new List<string>();
private XmlWriterSettings writerSetting = new XmlWriterSettings()
{
CloseOutput = true,
};
/// <summary>
/// Initializes a new instance of the SaveAsExcelFileStreamWriterHelper class.
/// </summary>
/// <param name="stream">The stream the xlsx content is written to.</param>
public SaveAsExcelFileStreamWriterHelper(Stream stream)
{
zipArchive = new ZipArchive(stream, ZipArchiveMode.Create, false);
}
/// <summary>
/// Initializes a new instance of the SaveAsExcelFileStreamWriterHelper class.
/// </summary>
/// <param name="stream">The stream the xlsx content is written to.</param>
/// <param name="leaveOpen">true to leave the stream open after the
/// SaveAsExcelFileStreamWriterHelper object is disposed; otherwise, false.</param>
public SaveAsExcelFileStreamWriterHelper(Stream stream, bool leaveOpen)
{
zipArchive = new ZipArchive(stream, ZipArchiveMode.Create, leaveOpen);
}
/// <summary>
/// Add sheet inside the Xlsx file.
/// </summary>
/// <param name="sheetName">Sheet name</param>
/// <returns>ExcelSheet for writing the sheet content</returns>
/// <remarks>
/// When sheetName is null, sheet1, sheet2, ..., will be used.
/// The following characters are not allowed in sheetName:
/// '\', '/', '*', '[', ']', ':', '?'
/// </remarks>
public ExcelSheet AddSheet(string sheetName = null)
{
string sheetFileName = "sheet" + (sheetNames.Count + 1);
if (sheetName == null)
{
sheetName = sheetFileName;
}
EnsureValidSheetName(sheetName);
sheetNames.Add(sheetName);
XmlWriter sheetWriter = AddEntry($"xl/worksheets/{sheetFileName}.xml");
return new ExcelSheet(sheetWriter);
}
/// <summary>
/// Write out the rest of the xlsx files and release the resources used by the current instance
/// </summary>
public void Dispose()
{
WriteMinimalTemplate();
zipArchive.Dispose();
}
private XmlWriter AddEntry(string entryName)
{
ZipArchiveEntry entry = zipArchive.CreateEntry(entryName, CompressionLevel.Fastest);
return XmlWriter.Create(entry.Open(), writerSetting);
}
//ECMA-376 page 75
private void WriteMinimalTemplate()
{
WriteTopRel();
WriteWorkbook();
WriteStyle();
WriteContentType();
WriteWorkbookRel();
}
/// <summary>
/// write [Content_Types].xml
/// </summary>
/// <remarks>
/// This file needs to describe all the files in the zip.
/// </remarks>
private void WriteContentType()
{
using (XmlWriter xw = AddEntry("[Content_Types].xml"))
{
xw.WriteStartDocument();
xw.WriteStartElement("Types", "http://schemas.openxmlformats.org/package/2006/content-types");
xw.WriteStartElement("Default");
xw.WriteAttributeString("Extension", "rels");
xw.WriteAttributeString("ContentType", "application/vnd.openxmlformats-package.relationships+xml");
xw.WriteEndElement();
xw.WriteStartElement("Override");
xw.WriteAttributeString("PartName", "/xl/workbook.xml");
xw.WriteAttributeString("ContentType", "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet.main+xml");
xw.WriteEndElement();
xw.WriteStartElement("Override");
xw.WriteAttributeString("PartName", "/xl/styles.xml");
xw.WriteAttributeString("ContentType", "application/vnd.openxmlformats-officedocument.spreadsheetml.styles+xml");
xw.WriteEndElement();
for (int i = 1; i <= sheetNames.Count; ++i)
{
xw.WriteStartElement("Override");
xw.WriteAttributeString("PartName", "/xl/worksheets/sheet" + i + ".xml");
xw.WriteAttributeString("ContentType", "application/vnd.openxmlformats-officedocument.spreadsheetml.worksheet+xml");
xw.WriteEndElement();
}
xw.WriteEndElement();
xw.WriteEndDocument();
}
}
/// <summary>
/// Write _rels/.rels. This file only needs to reference the main workbook.
/// </summary>
private void WriteTopRel()
{
using (XmlWriter xw = AddEntry("_rels/.rels"))
{
xw.WriteStartDocument();
xw.WriteStartElement("Relationships", "http://schemas.openxmlformats.org/package/2006/relationships");
xw.WriteStartElement("Relationship");
xw.WriteAttributeString("Id", "rId1");
xw.WriteAttributeString("Type", "http://schemas.openxmlformats.org/officeDocument/2006/relationships/officeDocument");
xw.WriteAttributeString("Target", "xl/workbook.xml");
xw.WriteEndElement();
xw.WriteEndElement();
xw.WriteEndDocument();
}
}
private static char[] invalidSheetNameCharacters = new char[]
{
'\\', '/','*','[',']',':','?'
};
private void EnsureValidSheetName(string sheetName)
{
if (sheetName.IndexOfAny(invalidSheetNameCharacters) != -1)
{
throw new ArgumentException($"Invalid sheetName: {sheetName}");
}
if (sheetNames.IndexOf(sheetName) != -1)
{
throw new ArgumentException($"Duplicate sheetName: {sheetName}");
}
}
/// <summary>
/// Write xl/workbook.xml. This file references the sheets through the ids in xl/_rels/workbook.xml.rels.
/// </summary>
private void WriteWorkbook()
{
using (XmlWriter xw = AddEntry("xl/workbook.xml"))
{
xw.WriteStartDocument();
xw.WriteStartElement("workbook", "http://schemas.openxmlformats.org/spreadsheetml/2006/main");
xw.WriteAttributeString("xmlns", "r", null, "http://schemas.openxmlformats.org/officeDocument/2006/relationships");
xw.WriteStartElement("sheets");
for (int i = 1; i <= sheetNames.Count; i++)
{
xw.WriteStartElement("sheet");
xw.WriteAttributeString("name", sheetNames[i - 1]);
xw.WriteAttributeString("sheetId", i.ToString());
xw.WriteAttributeString("r", "id", null, "rId" + i);
xw.WriteEndElement();
}
xw.WriteEndDocument();
}
}
/// <summary>
/// Write xl/_rels/workbook.xml.rels. This file will have the paths of the style and sheets.
/// </summary>
private void WriteWorkbookRel()
{
using (XmlWriter xw = AddEntry("xl/_rels/workbook.xml.rels"))
{
xw.WriteStartDocument();
xw.WriteStartElement("Relationships", "http://schemas.openxmlformats.org/package/2006/relationships");
xw.WriteStartElement("Relationship");
xw.WriteAttributeString("Id", "rId0");
xw.WriteAttributeString("Type", "http://schemas.openxmlformats.org/officeDocument/2006/relationships/styles");
xw.WriteAttributeString("Target", "styles.xml");
xw.WriteEndElement();
for (int i = 1; i <= sheetNames.Count; i++)
{
xw.WriteStartElement("Relationship");
xw.WriteAttributeString("Id", "rId" + i);
xw.WriteAttributeString("Type", "http://schemas.openxmlformats.org/officeDocument/2006/relationships/worksheet");
xw.WriteAttributeString("Target", "worksheets/sheet" + i + ".xml");
xw.WriteEndElement();
}
xw.WriteEndElement();
xw.WriteEndDocument();
}
}
// Write the xl/styles.xml
private void WriteStyle()
{
// Style 0 is used for the general case, style 1 for date, style 2 for time,
// style 3 for datetime, and style 4 for durations over 24 hours; see the Style enum.
// Reference chain (indexes start at 0):
// <c> (in sheet1.xml) --> (by s) <cellXfs> --> (by xfId) <cellStyleXfs>
// --> (by numFmtId) <numFmts>
// That is, <c s="2"></c> references the third element of <cellXfs>,
// <xf numFmtId="167" xfId="0" applyNumberFormat="1"/>, which in turn
// references the numFmt by id and gets formatCode "hh:mm:ss".
using (XmlWriter xw = AddEntry("xl/styles.xml"))
{
xw.WriteStartElement("styleSheet", "http://schemas.openxmlformats.org/spreadsheetml/2006/main");
xw.WriteStartElement("numFmts");
xw.WriteAttributeString("count", "4");
xw.WriteStartElement("numFmt");
xw.WriteAttributeString("numFmtId", "166");
xw.WriteAttributeString("formatCode", "yyyy-mm-dd");
xw.WriteEndElement();
xw.WriteStartElement("numFmt");
xw.WriteAttributeString("numFmtId", "167");
xw.WriteAttributeString("formatCode", "hh:mm:ss");
xw.WriteEndElement();
xw.WriteStartElement("numFmt");
xw.WriteAttributeString("numFmtId", "168");
xw.WriteAttributeString("formatCode", "yyyy-mm-dd hh:mm:ss");
xw.WriteEndElement();
xw.WriteStartElement("numFmt");
xw.WriteAttributeString("numFmtId", "169");
xw.WriteAttributeString("formatCode", "[h]:mm:ss");
xw.WriteEndElement();
xw.WriteEndElement(); // numFmts
xw.WriteStartElement("fonts");
xw.WriteAttributeString("count", "1");
xw.WriteStartElement("font");
xw.WriteStartElement("sz");
xw.WriteAttributeString("val", "11");
xw.WriteEndElement();
xw.WriteStartElement("color");
xw.WriteAttributeString("theme", "1");
xw.WriteEndElement();
xw.WriteStartElement("name");
xw.WriteAttributeString("val", "Calibri");
xw.WriteEndElement();
xw.WriteStartElement("family");
xw.WriteAttributeString("val", "2");
xw.WriteEndElement();
xw.WriteStartElement("scheme");
xw.WriteAttributeString("val", "minor");
xw.WriteEndElement();
xw.WriteEndElement(); // font
xw.WriteEndElement(); // fonts
xw.WriteStartElement("fills");
xw.WriteAttributeString("count", "1");
xw.WriteStartElement("fill");
xw.WriteStartElement("patternFill");
xw.WriteAttributeString("patternType", "none");
xw.WriteEndElement(); // patternFill
xw.WriteEndElement(); // fill
xw.WriteEndElement(); // fills
xw.WriteStartElement("borders");
xw.WriteAttributeString("count", "1");
xw.WriteStartElement("border");
xw.WriteElementString("left", null);
xw.WriteElementString("right", null);
xw.WriteElementString("top", null);
xw.WriteElementString("bottom", null);
xw.WriteElementString("diagonal", null);
xw.WriteEndElement(); // border
xw.WriteEndElement(); // borders
xw.WriteStartElement("cellStyleXfs");
xw.WriteAttributeString("count", "1");
xw.WriteStartElement("xf");
xw.WriteAttributeString("numFmtId", "0");
xw.WriteAttributeString("fontId", "0");
xw.WriteAttributeString("fillId", "0");
xw.WriteAttributeString("borderId", "0");
xw.WriteEndElement(); // xf
xw.WriteEndElement(); // cellStyleXfs
xw.WriteStartElement("cellXfs");
xw.WriteAttributeString("count", "5");
xw.WriteStartElement("xf");
xw.WriteAttributeString("xfId", "0");
xw.WriteEndElement();
xw.WriteStartElement("xf");
xw.WriteAttributeString("numFmtId", "166");
xw.WriteAttributeString("xfId", "0");
xw.WriteAttributeString("applyNumberFormat", "1");
xw.WriteEndElement();
xw.WriteStartElement("xf");
xw.WriteAttributeString("numFmtId", "167");
xw.WriteAttributeString("xfId", "0");
xw.WriteAttributeString("applyNumberFormat", "1");
xw.WriteEndElement();
xw.WriteStartElement("xf");
xw.WriteAttributeString("numFmtId", "168");
xw.WriteAttributeString("xfId", "0");
xw.WriteAttributeString("applyNumberFormat", "1");
xw.WriteEndElement();
xw.WriteStartElement("xf");
xw.WriteAttributeString("numFmtId", "169");
xw.WriteAttributeString("xfId", "0");
xw.WriteAttributeString("applyNumberFormat", "1");
xw.WriteEndElement();
xw.WriteEndElement(); // cellXfs
xw.WriteStartElement("cellStyles");
xw.WriteAttributeString("count", "1");
xw.WriteStartElement("cellStyle");
xw.WriteAttributeString("name", "Normal");
xw.WriteAttributeString("builtinId", "0");
xw.WriteAttributeString("xfId", "0");
xw.WriteEndElement(); // cellStyle
xw.WriteEndElement(); // cellStyles
}
}
}
}
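The rollover logic in `IncreaseColumnReference` is bijective base-26 counting over 'A'–'Z'. As an illustrative sketch only (the C# code mutates a fixed three-char buffer in place rather than recomputing from the index), the same index-to-letters mapping can be written as:

```python
def column_letters(n: int) -> str:
    """Map a 1-based column index to its Excel letter reference:
    1 -> 'A', 26 -> 'Z', 27 -> 'AA', 16384 -> 'XFD'."""
    if not 1 <= n <= 16384:
        raise ValueError("xlsx columns run from 1 to 16384 (XFD)")
    letters = ""
    while n > 0:
        # bijective base-26: there is no zero digit, so shift by one first
        n, rem = divmod(n - 1, 26)
        letters = chr(ord("A") + rem) + letters
    return letters
```

This is why the helper can cap the column count at 16384 with a plain comparison: the letter form is derived, never parsed back.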


@@ -0,0 +1,71 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.IO;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
using Microsoft.Kusto.ServiceLayer.SqlContext;
using Microsoft.Kusto.ServiceLayer.Utility;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
public class SaveAsJsonFileStreamFactory : IFileStreamFactory
{
#region Properties
/// <summary>
/// Settings for query execution
/// </summary>
public QueryExecutionSettings QueryExecutionSettings { get; set; }
/// <summary>
/// Parameters for the save as JSON request
/// </summary>
public SaveResultsAsJsonRequestParams SaveRequestParams { get; set; }
#endregion
/// <summary>
/// File names are not meant to be created with this factory.
/// </summary>
/// <exception cref="InvalidOperationException">Thrown whenever called</exception>
[Obsolete]
public string CreateFile()
{
throw new InvalidOperationException();
}
/// <summary>
/// Returns a new service buffer reader for reading results back in from the temporary buffer files, file share is ReadWrite to allow concurrent reads/writes to the file.
/// </summary>
/// <param name="fileName">Path to the temp buffer file</param>
/// <returns>Stream reader</returns>
public IFileStreamReader GetReader(string fileName)
{
return new ServiceBufferFileStreamReader(new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite), QueryExecutionSettings);
}
/// <summary>
/// Returns a new JSON writer for writing results to a JSON file, file share is ReadWrite to allow concurrent reads/writes to the file.
/// </summary>
/// <param name="fileName">Path to the JSON output file</param>
/// <returns>Stream writer</returns>
public IFileStreamWriter GetWriter(string fileName)
{
return new SaveAsJsonFileStreamWriter(new FileStream(fileName, FileMode.Create, FileAccess.ReadWrite, FileShare.ReadWrite), SaveRequestParams);
}
/// <summary>
/// Safely deletes the file
/// </summary>
/// <param name="fileName">Path to the file to delete</param>
public void DisposeFile(string fileName)
{
FileUtilities.SafeFileDelete(fileName);
}
}
}


@@ -0,0 +1,111 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
using Newtonsoft.Json;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Writer for writing rows of results to a JSON file.
/// </summary>
/// <remarks>
/// This implements its own IDisposable because the cleanup logic closes the array that was
/// created when the writer was created. Since this behavior is different than the standard
/// file stream cleanup, the extra Dispose method was added.
/// </remarks>
public class SaveAsJsonFileStreamWriter : SaveAsStreamWriter, IDisposable
{
#region Member Variables
private readonly StreamWriter streamWriter;
private readonly JsonWriter jsonWriter;
#endregion
/// <summary>
/// Constructor, writes the header to the file, chains into the base constructor
/// </summary>
/// <param name="stream">FileStream to access the JSON file output</param>
/// <param name="requestParams">JSON save as request parameters</param>
public SaveAsJsonFileStreamWriter(Stream stream, SaveResultsRequestParams requestParams)
: base(stream, requestParams)
{
// Setup the internal state
streamWriter = new StreamWriter(stream);
jsonWriter = new JsonTextWriter(streamWriter);
jsonWriter.Formatting = Formatting.Indented;
// Write the header of the file
jsonWriter.WriteStartArray();
}
/// <summary>
/// Writes a row of data as a JSON object
/// </summary>
/// <param name="row">The data of the row to output to the file</param>
/// <param name="columns">
/// The entire list of columns for the result set. They will be filtered down as per the
/// request params.
/// </param>
public override void WriteRow(IList<DbCellValue> row, IList<DbColumnWrapper> columns)
{
// Write the header for the object
jsonWriter.WriteStartObject();
// Write the items out as properties
int columnStart = ColumnStartIndex ?? 0;
int columnEnd = (ColumnEndIndex != null) ? ColumnEndIndex.Value + 1 : columns.Count;
for (int i = columnStart; i < columnEnd; i++)
{
jsonWriter.WritePropertyName(columns[i].ColumnName);
if (row[i].RawObject == null)
{
jsonWriter.WriteNull();
}
else
{
// Try converting to column type
try
{
var value = Convert.ChangeType(row[i].DisplayValue, columns[i].DataType);
jsonWriter.WriteValue(value);
}
// Default column type as string
catch
{
jsonWriter.WriteValue(row[i].DisplayValue);
}
}
}
// Write the footer for the object
jsonWriter.WriteEndObject();
}
private bool disposed = false;
/// <summary>
/// Disposes the writer by closing up the array that contains the row objects
/// </summary>
protected override void Dispose(bool disposing)
{
if (disposed)
return;
if (disposing)
{
// Write the footer of the file
jsonWriter.WriteEndArray();
// This closes the underlying stream, so we needn't call close on the underlying stream explicitly
jsonWriter.Close();
}
disposed = true;
base.Dispose(disposing);
}
}
}
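The try-convert-then-fall-back rule in `WriteRow` above decides whether a cell lands in the JSON as a typed value or as its display string. A rough Python sketch of that rule (the function name and the `(name, type)` column shape are hypothetical, not the service's API):

```python
import json

def write_rows_as_json(rows, columns, col_start=None, col_end=None):
    """Serialize rows to a JSON array of objects: nulls stay null,
    values are converted to the column type when possible (like
    Convert.ChangeType), otherwise kept as their display string."""
    start = col_start if col_start is not None else 0
    end = col_end + 1 if col_end is not None else len(columns)
    out = []
    for row in rows:
        obj = {}
        for i in range(start, end):
            name, py_type = columns[i]
            raw = row[i]
            if raw is None:
                obj[name] = None
            else:
                try:
                    obj[name] = py_type(raw)   # try the column's type
                except (TypeError, ValueError):
                    obj[name] = str(raw)       # fall back to display value
        out.append(obj)
    return json.dumps(out, indent=2)
```

The fallback matters because Kusto dynamic columns can carry values that do not parse as the declared type; emitting the display string keeps the export lossless rather than failing the whole save.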


@@ -0,0 +1,123 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Abstract class for implementing writers that save results to a file. Stores some basic info
/// that all save-as writers need.
/// </summary>
public abstract class SaveAsStreamWriter : IFileStreamWriter
{
/// <summary>
/// Stores the internal state for the writer that will be necessary for any writer.
/// </summary>
/// <param name="stream">The stream that will be written to</param>
/// <param name="requestParams">The SaveAs request parameters</param>
protected SaveAsStreamWriter(Stream stream, SaveResultsRequestParams requestParams)
{
FileStream = stream;
var saveParams = requestParams;
if (requestParams.IsSaveSelection)
{
// ReSharper disable PossibleInvalidOperationException IsSaveSelection verifies these values exist
ColumnStartIndex = saveParams.ColumnStartIndex.Value;
ColumnEndIndex = saveParams.ColumnEndIndex.Value;
ColumnCount = saveParams.ColumnEndIndex.Value - saveParams.ColumnStartIndex.Value + 1;
// ReSharper restore PossibleInvalidOperationException
}
}
#region Properties
/// <summary>
/// Index of the first column to write to the output file
/// </summary>
protected int? ColumnStartIndex { get; private set; }
/// <summary>
/// Number of columns to write to the output file
/// </summary>
protected int? ColumnCount { get; private set; }
/// <summary>
/// Index of the last column to write to the output file
/// </summary>
protected int? ColumnEndIndex { get; private set; }
/// <summary>
/// The file stream to use to write the output file
/// </summary>
protected Stream FileStream { get; private set; }
#endregion
/// <summary>
/// Not implemented, do not use.
/// </summary>
[Obsolete]
public int WriteRow(StorageDataReader dataReader)
{
throw new InvalidOperationException("This type of writer is meant to write values from a list of cell values only.");
}
/// <summary>
/// Writes a row of data to the output file using the format provided by the implementing class.
/// </summary>
/// <param name="row">The row of data to output</param>
/// <param name="columns">The list of columns to output</param>
public abstract void WriteRow(IList<DbCellValue> row, IList<DbColumnWrapper> columns);
/// <summary>
/// Not implemented, do not use.
/// </summary>
[Obsolete]
public void Seek(long offset)
{
throw new InvalidOperationException("SaveAs writers are meant to be written once contiguously.");
}
/// <summary>
/// Flushes the file stream buffer
/// </summary>
public void FlushBuffer()
{
FileStream.Flush();
}
#region IDisposable Implementation
private bool disposed;
/// <summary>
/// Disposes the instance by flushing and closing the file stream
/// </summary>
/// <param name="disposing"></param>
protected virtual void Dispose(bool disposing)
{
if (disposed)
return;
if (disposing)
{
FileStream.Dispose();
}
disposed = true;
}
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
#endregion
}
}
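The nullable index handling above (`ColumnStartIndex ?? 0`, `ColumnEndIndex + 1 ?? columns.Count`) is what turns an inclusive saved selection into the half-open loop range used by the concrete writers. A minimal sketch of that arithmetic, assuming a missing selection means "all columns":

```python
def column_window(total_columns, start_index=None, end_index=None):
    """Resolve inclusive save-selection indices into a half-open range,
    the way the SaveAs writers do; None means no selection was made."""
    start = start_index if start_index is not None else 0
    end = end_index + 1 if end_index is not None else total_columns
    return start, end, end - start
```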


@@ -0,0 +1,71 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.IO;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
using Microsoft.Kusto.ServiceLayer.SqlContext;
using Microsoft.Kusto.ServiceLayer.Utility;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
public class SaveAsXmlFileStreamFactory : IFileStreamFactory
{
#region Properties
/// <summary>
/// Settings for query execution
/// </summary>
public QueryExecutionSettings QueryExecutionSettings { get; set; }
/// <summary>
/// Parameters for the save as XML request
/// </summary>
public SaveResultsAsXmlRequestParams SaveRequestParams { get; set; }
#endregion
/// <summary>
/// File names are not meant to be created with this factory.
/// </summary>
/// <exception cref="InvalidOperationException">Thrown whenever called</exception>
[Obsolete]
public string CreateFile()
{
throw new InvalidOperationException();
}
/// <summary>
/// Returns a new service buffer reader for reading results back in from the temporary buffer files, file share is ReadWrite to allow concurrent reads/writes to the file.
/// </summary>
/// <param name="fileName">Path to the temp buffer file</param>
/// <returns>Stream reader</returns>
public IFileStreamReader GetReader(string fileName)
{
return new ServiceBufferFileStreamReader(new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite), QueryExecutionSettings);
}
/// <summary>
/// Returns a new XML writer for writing results to an XML file; file share is ReadWrite to allow concurrent reads/writes to the file.
/// </summary>
/// <param name="fileName">Path to the XML output file</param>
/// <returns>Stream writer</returns>
public IFileStreamWriter GetWriter(string fileName)
{
return new SaveAsXmlFileStreamWriter(new FileStream(fileName, FileMode.Create, FileAccess.ReadWrite, FileShare.ReadWrite), SaveRequestParams);
}
/// <summary>
/// Safely deletes the file
/// </summary>
/// <param name="fileName">Path to the file to delete</param>
public void DisposeFile(string fileName)
{
FileUtilities.SafeFileDelete(fileName);
}
}
}


@@ -0,0 +1,142 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Xml;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Writer for writing rows of results to an XML file.
/// </summary>
/// <remarks>
/// This implements its own IDisposable because the cleanup logic closes the element that was
/// created when the writer was created. Since this behavior is different than the standard
/// file stream cleanup, the extra Dispose method was added.
/// </remarks>
public class SaveAsXmlFileStreamWriter : SaveAsStreamWriter, IDisposable
{
// Root element name for the output XML
private const string RootElementTag = "data";
// Item element name which will be used for every row
private const string ItemElementTag = "row";
#region Member Variables
private readonly XmlTextWriter xmlTextWriter;
#endregion
/// <summary>
/// Constructor, writes the header to the file, chains into the base constructor
/// </summary>
/// <param name="stream">FileStream to access the XML file output</param>
/// <param name="requestParams">XML save as request parameters</param>
public SaveAsXmlFileStreamWriter(Stream stream, SaveResultsAsXmlRequestParams requestParams)
: base(stream, requestParams)
{
// Setup the internal state
var encoding = GetEncoding(requestParams);
xmlTextWriter = new XmlTextWriter(stream, encoding);
xmlTextWriter.Formatting = requestParams.Formatted ? Formatting.Indented : Formatting.None;
//Start the document and the root element
xmlTextWriter.WriteStartDocument();
xmlTextWriter.WriteStartElement(RootElementTag);
}
/// <summary>
/// Writes a row of data as an XML object
/// </summary>
/// <param name="row">The data of the row to output to the file</param>
/// <param name="columns">
/// The entire list of columns for the result set. They will be filtered down as per the
/// request params.
/// </param>
public override void WriteRow(IList<DbCellValue> row, IList<DbColumnWrapper> columns)
{
// Write the header for the object
xmlTextWriter.WriteStartElement(ItemElementTag);
// Write the items out as properties
int columnStart = ColumnStartIndex ?? 0;
int columnEnd = ColumnEndIndex + 1 ?? columns.Count;
for (int i = columnStart; i < columnEnd; i++)
{
// Write the column name as item tag
xmlTextWriter.WriteStartElement(columns[i].ColumnName);
if (row[i].RawObject != null)
{
xmlTextWriter.WriteString(row[i].DisplayValue);
}
// End the item tag
xmlTextWriter.WriteEndElement();
}
// Write the footer for the object
xmlTextWriter.WriteEndElement();
}
/// <summary>
/// Gets the encoding for the XML file according to <paramref name="requestParams"/>.
/// </summary>
/// <param name="requestParams">XML save as request parameters</param>
/// <returns>The resolved encoding, or UTF-8 when the requested one is invalid</returns>
private Encoding GetEncoding(SaveResultsAsXmlRequestParams requestParams)
{
Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);
Encoding encoding;
try
{
if (int.TryParse(requestParams.Encoding, out var codepage))
{
encoding = Encoding.GetEncoding(codepage);
}
else
{
encoding = Encoding.GetEncoding(requestParams.Encoding);
}
}
catch
{
// Fallback encoding when specified codepage is invalid
encoding = Encoding.GetEncoding("utf-8");
}
return encoding;
}
private bool disposed = false;
/// <summary>
/// Disposes the writer by closing up the element that contains the row objects
/// </summary>
protected override void Dispose(bool disposing)
{
if (disposed)
return;
if (disposing)
{
// Write the footer of the file
xmlTextWriter.WriteEndElement();
xmlTextWriter.WriteEndDocument();
xmlTextWriter.Close();
xmlTextWriter.Dispose();
}
disposed = true;
base.Dispose(disposing);
}
}
}
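`GetEncoding` above tries the user-supplied value first as a numeric codepage, then as an encoding name, and falls back to UTF-8 on anything unrecognized. A Python approximation of that resolution order (Python's `cp` aliases stand in for .NET's codepage lookup, so exact coverage differs):

```python
import codecs

def resolve_encoding(spec, default="utf-8"):
    """Resolve a user-supplied encoding that may be a numeric codepage
    or an encoding name, falling back to UTF-8 on anything invalid."""
    try:
        if spec.isdigit():
            # approximate .NET codepage lookup with Python's cpXXX aliases
            return codecs.lookup("cp" + spec).name
        return codecs.lookup(spec).name
    except (LookupError, AttributeError):
        return codecs.lookup(default).name
```

Resolving up front and catching everything keeps a bad `Encoding` parameter from failing the save; the file is simply written as UTF-8 instead.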


@@ -0,0 +1,66 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System.IO;
using Microsoft.Kusto.ServiceLayer.SqlContext;
using Microsoft.Kusto.ServiceLayer.Utility;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Factory that creates file reader/writers that process rows in an internal, non-human readable file format
/// </summary>
public class ServiceBufferFileStreamFactory : IFileStreamFactory
{
#region Properties
/// <summary>
/// The settings for query execution
/// </summary>
public QueryExecutionSettings ExecutionSettings { get; set; }
#endregion
/// <summary>
/// Creates a new temporary file
/// </summary>
/// <returns>The name of the temporary file</returns>
public string CreateFile()
{
return Path.GetTempFileName();
}
/// <summary>
/// Creates a new <see cref="ServiceBufferFileStreamReader"/> for reading values back from
/// an SSMS formatted buffer file, file share is ReadWrite to allow concurrent reads/writes to the file.
/// </summary>
/// <param name="fileName">The file to read values from</param>
/// <returns>A <see cref="ServiceBufferFileStreamReader"/></returns>
public IFileStreamReader GetReader(string fileName)
{
return new ServiceBufferFileStreamReader(new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite), ExecutionSettings);
}
/// <summary>
/// Creates a new <see cref="ServiceBufferFileStreamWriter"/> for writing values out to an
/// SSMS formatted buffer file, file share is ReadWrite to allow concurrent reads/writes to the file.
/// </summary>
/// <param name="fileName">The file to write values to</param>
/// <returns>A <see cref="ServiceBufferFileStreamWriter"/></returns>
public IFileStreamWriter GetWriter(string fileName)
{
return new ServiceBufferFileStreamWriter(new FileStream(fileName, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite), ExecutionSettings);
}
/// <summary>
/// Disposes of a file created via this factory
/// </summary>
/// <param name="fileName">The file to dispose of</param>
public void DisposeFile(string fileName)
{
FileUtilities.SafeFileDelete(fileName);
}
}
}
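The factory's contract is a four-step lifecycle: create a temp file, write through a writer, read back through a reader, then safely delete. A toy Python analogue of that lifecycle (not the service's interfaces; names are illustrative):

```python
import os
import tempfile

class TempFileStreamFactory:
    """Toy analogue of ServiceBufferFileStreamFactory: hands out a temp
    file name, plus reader/writer handles over it, and deletes it safely."""

    def create_file(self):
        fd, path = tempfile.mkstemp()
        os.close(fd)  # like Path.GetTempFileName(): create, then hand back the name
        return path

    def get_writer(self, path):
        return open(path, "wb")

    def get_reader(self, path):
        return open(path, "rb")

    def dispose_file(self, path):
        try:
            os.remove(path)
        except OSError:
            pass  # "safe delete": ignore already-deleted or locked files
```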


@@ -0,0 +1,552 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using System.Data.SqlTypes;
using System.IO;
using System.Text;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
using Microsoft.Kusto.ServiceLayer.SqlContext;
using Microsoft.SqlTools.Utility;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Reader for service buffer formatted file streams
/// </summary>
public class ServiceBufferFileStreamReader : IFileStreamReader
{
#region Constants
private const int DefaultBufferSize = 8192;
private const string DateFormatString = "yyyy-MM-dd";
private const string TimeFormatString = "HH:mm:ss";
#endregion
#region Member Variables
private delegate FileStreamReadResult ReadMethod(long fileOffset, long rowId, DbColumnWrapper column);
private byte[] buffer;
private readonly QueryExecutionSettings executionSettings;
private readonly Stream fileStream;
private readonly Dictionary<Type, ReadMethod> readMethods;
#endregion
/// <summary>
/// Constructs a new ServiceBufferFileStreamReader and initializes its state
/// </summary>
/// <param name="stream">The filestream to read from</param>
/// <param name="settings">The query execution settings</param>
public ServiceBufferFileStreamReader(Stream stream, QueryExecutionSettings settings)
{
Validate.IsNotNull(nameof(stream), stream);
Validate.IsNotNull(nameof(settings), settings);
// Open file for reading/writing
if (!stream.CanRead || !stream.CanSeek)
{
throw new InvalidOperationException("Stream must be readable and seekable");
}
fileStream = stream;
executionSettings = settings;
// Create internal buffer
buffer = new byte[DefaultBufferSize];
// Create the methods that will be used to read back
readMethods = new Dictionary<Type, ReadMethod>
{
{typeof(string), (o, id, col) => ReadString(o, id)},
{typeof(short), (o, id, col) => ReadInt16(o, id)},
{typeof(int), (o, id, col) => ReadInt32(o, id)},
{typeof(long), (o, id, col) => ReadInt64(o, id)},
{typeof(byte), (o, id, col) => ReadByte(o, id)},
{typeof(char), (o, id, col) => ReadChar(o, id)},
{typeof(bool), (o, id, col) => ReadBoolean(o, id)},
{typeof(double), (o, id, col) => ReadDouble(o, id)},
{typeof(float), (o, id, col) => ReadSingle(o, id)},
{typeof(decimal), (o, id, col) => ReadDecimal(o, id)},
{typeof(DateTime), ReadDateTime},
{typeof(DateTimeOffset), (o, id, col) => ReadDateTimeOffset(o, id)},
{typeof(TimeSpan), (o, id, col) => ReadTimeSpan(o, id)},
{typeof(byte[]), (o, id, col) => ReadBytes(o, id)},
{typeof(Guid), (o, id, col) => ReadGuid(o, id)},
{typeof(SqlString), (o, id, col) => ReadString(o, id)},
{typeof(SqlInt16), (o, id, col) => ReadInt16(o, id)},
{typeof(SqlInt32), (o, id, col) => ReadInt32(o, id)},
{typeof(SqlInt64), (o, id, col) => ReadInt64(o, id)},
{typeof(SqlByte), (o, id, col) => ReadByte(o, id)},
{typeof(SqlBoolean), (o, id, col) => ReadBoolean(o, id)},
{typeof(SqlDouble), (o, id, col) => ReadDouble(o, id)},
{typeof(SqlSingle), (o, id, col) => ReadSingle(o, id)},
{typeof(SqlDecimal), (o, id, col) => ReadSqlDecimal(o, id)},
{typeof(SqlDateTime), ReadDateTime},
{typeof(SqlBytes), (o, id, col) => ReadBytes(o, id)},
{typeof(SqlBinary), (o, id, col) => ReadBytes(o, id)},
{typeof(SqlGuid), (o, id, col) => ReadGuid(o, id)},
{typeof(SqlMoney), (o, id, col) => ReadMoney(o, id)},
};
}
#region IFileStreamStorage Implementation
/// <summary>
/// Reads a row from the file, based on the columns provided
/// </summary>
/// <param name="fileOffset">Offset into the file where the row starts</param>
/// <param name="rowId">Internal ID of the row to set for all cells in this row</param>
/// <param name="columns">The columns that were encoded</param>
/// <returns>The objects from the row, ready for output to the client</returns>
public IList<DbCellValue> ReadRow(long fileOffset, long rowId, IEnumerable<DbColumnWrapper> columns)
{
// Initialize for the loop
long currentFileOffset = fileOffset;
List<DbCellValue> results = new List<DbCellValue>();
// Iterate over the columns
Type colType;
foreach (DbColumnWrapper column in columns)
{
colType = column.DataType;
// Use the right read function for the type to read the data from the file
ReadMethod readFunc;
if (!readMethods.TryGetValue(colType, out readFunc))
{
// Treat everything else as a string
readFunc = readMethods[typeof(string)];
}
FileStreamReadResult result = readFunc(currentFileOffset, rowId, column);
currentFileOffset += result.TotalLength;
results.Add(result.Value);
}
return results;
}
#endregion
#region Private Helpers
/// <summary>
/// Creates a new buffer that is of the specified length if the buffer is not already
/// at least as long as specified.
/// </summary>
/// <param name="newBufferLength">The minimum buffer size</param>
private void AssureBufferLength(int newBufferLength)
{
if (buffer.Length < newBufferLength)
{
buffer = new byte[newBufferLength];
}
}
/// <summary>
/// Reads the value of a cell from the file wrapper, checks to see if it null using
/// <paramref name="isNullFunc"/>, and converts it to the proper output type using
/// <paramref name="convertFunc"/>.
/// </summary>
/// <param name="offset">Offset into the file to read from</param>
/// <param name="rowId">Internal ID of the row to set on all cells in this row</param>
/// <param name="convertFunc">Function to use to convert the buffer to the target type</param>
/// <param name="isNullFunc">
/// If provided, this function will be used to determine if the value is null
/// </param>
/// <param name="toStringFunc">Optional function to use to convert the object to a string.</param>
/// <param name="setInvariantCultureDisplayValue">Optional parameter indicates whether the culture invariant display value should be provided.</param>
/// <typeparam name="T">The expected type of the cell. Used to keep the code honest</typeparam>
/// <returns>The object, a display value, and the total length of the value including its length prefix</returns>
private FileStreamReadResult ReadCellHelper<T>(long offset, long rowId,
Func<int, T> convertFunc,
Func<int, bool> isNullFunc = null,
Func<T, string> toStringFunc = null,
bool setInvariantCultureDisplayValue = false)
{
LengthResult length = ReadLength(offset);
DbCellValue result = new DbCellValue { RowId = rowId };
if (isNullFunc == null ? length.ValueLength == 0 : isNullFunc(length.TotalLength))
{
result.RawObject = null;
result.DisplayValue = SR.QueryServiceCellNull;
result.IsNull = true;
}
else
{
AssureBufferLength(length.ValueLength);
fileStream.Read(buffer, 0, length.ValueLength);
T resultObject = convertFunc(length.ValueLength);
result.RawObject = resultObject;
result.DisplayValue = toStringFunc == null ? result.RawObject.ToString() : toStringFunc(resultObject);
if (setInvariantCultureDisplayValue)
{
string icDisplayValue = string.Format(System.Globalization.CultureInfo.InvariantCulture, "{0}", result.RawObject);
// Only set the value when it is different from the DisplayValue to reduce the size of the result
//
if (icDisplayValue != result.DisplayValue)
{
result.InvariantCultureDisplayValue = icDisplayValue;
}
}
result.IsNull = false;
}
return new FileStreamReadResult(result, length.TotalLength);
}
/// <summary>
/// Reads a short from the file at the offset provided
/// </summary>
/// <param name="fileOffset">Offset into the file to read the short from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A short</returns>
internal FileStreamReadResult ReadInt16(long fileOffset, long rowId)
{
return ReadCellHelper(fileOffset, rowId, length => BitConverter.ToInt16(buffer, 0));
}
/// <summary>
/// Reads an int from the file at the offset provided
/// </summary>
/// <param name="fileOffset">Offset into the file to read the int from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>An int</returns>
internal FileStreamReadResult ReadInt32(long fileOffset, long rowId)
{
return ReadCellHelper(fileOffset, rowId, length => BitConverter.ToInt32(buffer, 0));
}
/// <summary>
/// Reads a long from the file at the offset provided
/// </summary>
/// <param name="fileOffset">Offset into the file to read the long from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A long</returns>
internal FileStreamReadResult ReadInt64(long fileOffset, long rowId)
{
return ReadCellHelper(fileOffset, rowId, length => BitConverter.ToInt64(buffer, 0));
}
/// <summary>
/// Reads a byte from the file at the offset provided
/// </summary>
/// <param name="fileOffset">Offset into the file to read the byte from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A byte</returns>
internal FileStreamReadResult ReadByte(long fileOffset, long rowId)
{
return ReadCellHelper(fileOffset, rowId, length => buffer[0]);
}
/// <summary>
/// Reads a char from the file at the offset provided
/// </summary>
/// <param name="fileOffset">Offset into the file to read the char from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A char</returns>
internal FileStreamReadResult ReadChar(long fileOffset, long rowId)
{
return ReadCellHelper(fileOffset, rowId, length => BitConverter.ToChar(buffer, 0));
}
/// <summary>
/// Reads a bool from the file at the offset provided
/// </summary>
/// <param name="fileOffset">Offset into the file to read the bool from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A bool</returns>
internal FileStreamReadResult ReadBoolean(long fileOffset, long rowId)
{
// Override the stringifier with numeric values if the user prefers that
return ReadCellHelper(fileOffset, rowId, length => buffer[0] == 0x1,
toStringFunc: val => executionSettings.DisplayBitAsNumber
? val ? "1" : "0"
: val.ToString());
}
/// <summary>
/// Reads a single from the file at the offset provided
/// </summary>
/// <param name="fileOffset">Offset into the file to read the single from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A single</returns>
internal FileStreamReadResult ReadSingle(long fileOffset, long rowId)
{
return ReadCellHelper(fileOffset, rowId, length => BitConverter.ToSingle(buffer, 0), setInvariantCultureDisplayValue: true);
}
/// <summary>
/// Reads a double from the file at the offset provided
/// </summary>
/// <param name="fileOffset">Offset into the file to read the double from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A double</returns>
internal FileStreamReadResult ReadDouble(long fileOffset, long rowId)
{
return ReadCellHelper(fileOffset, rowId, length => BitConverter.ToDouble(buffer, 0), setInvariantCultureDisplayValue: true);
}
/// <summary>
/// Reads a SqlDecimal from the file at the offset provided
/// </summary>
/// <param name="offset">Offset into the file to read the SqlDecimal from</param>
/// <returns>A SqlDecimal</returns>
internal FileStreamReadResult ReadSqlDecimal(long offset, long rowId)
{
return ReadCellHelper(offset, rowId, length =>
{
int[] arrInt32 = new int[(length - 3) / 4];
Buffer.BlockCopy(buffer, 3, arrInt32, 0, length - 3);
return new SqlDecimal(buffer[0], buffer[1], buffer[2] == 1, arrInt32);
}, setInvariantCultureDisplayValue: true);
}
/// <summary>
/// Reads a decimal from the file at the offset provided
/// </summary>
/// <param name="offset">Offset into the file to read the decimal from</param>
/// <returns>A decimal</returns>
internal FileStreamReadResult ReadDecimal(long offset, long rowId)
{
return ReadCellHelper(offset, rowId, length =>
{
int[] arrInt32 = new int[length / 4];
Buffer.BlockCopy(buffer, 0, arrInt32, 0, length);
return new decimal(arrInt32);
}, setInvariantCultureDisplayValue: true);
}
/// <summary>
/// Reads a DateTime from the file at the offset provided
/// </summary>
/// <param name="offset">Offset into the file to read the DateTime from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <param name="col">Column metadata, used for determining what precision to output</param>
/// <returns>A DateTime</returns>
internal FileStreamReadResult ReadDateTime(long offset, long rowId, DbColumnWrapper col)
{
return ReadCellHelper(offset, rowId, length =>
{
long ticks = BitConverter.ToInt64(buffer, 0);
return new DateTime(ticks);
}, null, dt =>
{
// CLR DateTime values are formatted with both date and time
return dt.ToString($"{DateFormatString} {TimeFormatString}");
});
}
/// <summary>
/// Reads a DateTimeOffset from the file at the offset provided
/// </summary>
/// <param name="offset">Offset into the file to read the DateTimeOffset from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A DateTimeOffset</returns>
internal FileStreamReadResult ReadDateTimeOffset(long offset, long rowId)
{
// DateTimeOffset is represented by DateTime.Ticks followed by TimeSpan.Ticks
// both as Int64 values
return ReadCellHelper(offset, rowId, length =>
{
long dtTicks = BitConverter.ToInt64(buffer, 0);
long dtOffset = BitConverter.ToInt64(buffer, 8);
return new DateTimeOffset(new DateTime(dtTicks), new TimeSpan(dtOffset));
}, null, dt =>
{
string formatString = $"{DateFormatString} {TimeFormatString}.fffffff zzz";
return dt.ToString(formatString);
});
}
/// <summary>
/// Reads a TimeSpan from the file at the offset provided
/// </summary>
/// <param name="offset">Offset into the file to read the TimeSpan from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A TimeSpan</returns>
internal FileStreamReadResult ReadTimeSpan(long offset, long rowId)
{
return ReadCellHelper(offset, rowId, length =>
{
long ticks = BitConverter.ToInt64(buffer, 0);
return new TimeSpan(ticks);
});
}
/// <summary>
/// Reads a string from the file at the offset provided
/// </summary>
/// <param name="offset">Offset into the file to read the string from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A string</returns>
internal FileStreamReadResult ReadString(long offset, long rowId)
{
return ReadCellHelper(offset, rowId, length =>
length > 0
? Encoding.Unicode.GetString(buffer, 0, length)
: string.Empty, totalLength => totalLength == 1);
}
/// <summary>
/// Reads bytes from the file at the offset provided
/// </summary>
/// <param name="offset">Offset into the file to read the bytes from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A byte array</returns>
internal FileStreamReadResult ReadBytes(long offset, long rowId)
{
return ReadCellHelper(offset, rowId, length =>
{
byte[] output = new byte[length];
Buffer.BlockCopy(buffer, 0, output, 0, length);
return output;
}, totalLength => totalLength == 1,
bytes =>
{
StringBuilder sb = new StringBuilder("0x");
foreach (byte b in bytes)
{
sb.AppendFormat("{0:X2}", b);
}
return sb.ToString();
});
}
/// <summary>
/// Reads the bytes that make up a GUID at the offset provided
/// </summary>
/// <param name="offset">Offset into the file to read the bytes from</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A system guid type object</returns>
internal FileStreamReadResult ReadGuid(long offset, long rowId)
{
return ReadCellHelper(offset, rowId, length =>
{
byte[] output = new byte[length];
Buffer.BlockCopy(buffer, 0, output, 0, length);
return new Guid(output);
}, totalLength => totalLength == 1);
}
/// <summary>
/// Reads a SqlMoney value from the file at the offset provided
/// </summary>
/// <param name="offset">Offset into the file to read the value</param>
/// <param name="rowId">Internal ID of the row that will be stored in the cell</param>
/// <returns>A sql money type object</returns>
internal FileStreamReadResult ReadMoney(long offset, long rowId)
{
return ReadCellHelper(offset, rowId, length =>
{
int[] arrInt32 = new int[length / 4];
Buffer.BlockCopy(buffer, 0, arrInt32, 0, length);
return new SqlMoney(new decimal(arrInt32));
});
}
/// <summary>
/// Reads the length of a field at the specified offset in the file
/// </summary>
/// <param name="offset">Offset into the file to read the field length from</param>
/// <returns>A LengthResult</returns>
private LengthResult ReadLength(long offset)
{
// read in length information
int lengthValue;
fileStream.Seek(offset, SeekOrigin.Begin);
int lengthLength = fileStream.Read(buffer, 0, 1);
if (buffer[0] != 0xFF)
{
// one byte is enough
lengthValue = Convert.ToInt32(buffer[0]);
}
else
{
// read in next 4 bytes
lengthLength += fileStream.Read(buffer, 0, 4);
// reconstruct the length
lengthValue = BitConverter.ToInt32(buffer, 0);
}
return new LengthResult { LengthLength = lengthLength, ValueLength = lengthValue };
}
#endregion
/// <summary>
/// Internal struct used for representing the length of a field from the file
/// </summary>
internal struct LengthResult
{
/// <summary>
/// How many bytes the length takes up
/// </summary>
public int LengthLength { get; set; }
/// <summary>
/// How many bytes the value takes up
/// </summary>
public int ValueLength { get; set; }
/// <summary>
/// <see cref="LengthLength"/> + <see cref="ValueLength"/>
/// </summary>
public int TotalLength => LengthLength + ValueLength;
}
#region IDisposable Implementation
private bool disposed;
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
protected virtual void Dispose(bool disposing)
{
if (disposed)
{
return;
}
if (disposing)
{
fileStream.Dispose();
}
disposed = true;
}
~ServiceBufferFileStreamReader()
{
Dispose(false);
}
#endregion
}
}
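Both the reader above and the writer that follows rely on the same length-prefix convention (see `ReadLength`/`WriteLength`): a single byte when the value length is below 0xFF, otherwise a 0xFF marker followed by a four-byte integer. The service assumes the little-endian layout that `BitConverter` produces on its target platforms. A minimal Python sketch of that convention, for illustration only (the service itself is C#):

```python
import struct

def write_length(n: int) -> bytes:
    """Encode a field length: one byte if it is below 0xFF,
    otherwise a 0xFF marker plus a little-endian int32."""
    if n < 0xFF:
        return bytes([n])
    return b"\xff" + struct.pack("<i", n)

def read_length(data: bytes, offset: int):
    """Decode a field length; returns (value_length, bytes_consumed)."""
    first = data[offset]
    if first != 0xFF:
        return first, 1
    return struct.unpack_from("<i", data, offset + 1)[0], 5
```

A length of 254 costs one byte on disk, while 255 and above cost five; the reader can always tell which form it is looking at from the first byte alone.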


@@ -0,0 +1,587 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using System.Data.SqlTypes;
using System.Diagnostics;
using System.IO;
using System.Text;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
using Microsoft.Kusto.ServiceLayer.SqlContext;
using Microsoft.Kusto.ServiceLayer.Utility;
using Microsoft.SqlTools.Utility;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Writer for service buffer formatted file streams
/// </summary>
public class ServiceBufferFileStreamWriter : IFileStreamWriter
{
private const int DefaultBufferLength = 8192;
#region Member Variables
private readonly Stream fileStream;
private readonly QueryExecutionSettings executionSettings;
private byte[] byteBuffer;
private readonly short[] shortBuffer;
private readonly int[] intBuffer;
private readonly long[] longBuffer;
private readonly char[] charBuffer;
private readonly double[] doubleBuffer;
private readonly float[] floatBuffer;
/// <summary>
/// Functions to use for writing various types to a file
/// </summary>
private readonly Dictionary<Type, Func<object, int>> writeMethods;
#endregion
/// <summary>
/// Constructs a new writer
/// </summary>
/// <param name="stream">The file wrapper to use as the underlying file stream</param>
/// <param name="settings">The query execution settings</param>
public ServiceBufferFileStreamWriter(Stream stream, QueryExecutionSettings settings)
{
Validate.IsNotNull(nameof(stream), stream);
Validate.IsNotNull(nameof(settings), settings);
// The stream must support writing and seeking
if (!stream.CanWrite || !stream.CanSeek)
{
throw new InvalidOperationException("Stream must be writable and seekable.");
}
fileStream = stream;
executionSettings = settings;
// create internal buffer
byteBuffer = new byte[DefaultBufferLength];
// Create internal buffers for blockcopy of contents to byte array
// Note: We create them now to avoid the overhead of creating a new array for every write call
shortBuffer = new short[1];
intBuffer = new int[1];
longBuffer = new long[1];
charBuffer = new char[1];
doubleBuffer = new double[1];
floatBuffer = new float[1];
// Define what methods to use to write a type to the file
writeMethods = new Dictionary<Type, Func<object, int>>
{
{typeof(string), val => WriteString((string) val)},
{typeof(short), val => WriteInt16((short) val)},
{typeof(int), val => WriteInt32((int) val)},
{typeof(long), val => WriteInt64((long) val)},
{typeof(byte), val => WriteByte((byte) val)},
{typeof(char), val => WriteChar((char) val)},
{typeof(bool), val => WriteBoolean((bool) val)},
{typeof(double), val => WriteDouble((double) val) },
{typeof(float), val => WriteSingle((float) val) },
{typeof(decimal), val => WriteDecimal((decimal) val) },
{typeof(DateTime), val => WriteDateTime((DateTime) val) },
{typeof(DateTimeOffset), val => WriteDateTimeOffset((DateTimeOffset) val) },
{typeof(TimeSpan), val => WriteTimeSpan((TimeSpan) val) },
{typeof(byte[]), val => WriteBytes((byte[]) val)},
{typeof(Guid), val => WriteGuid((Guid) val)},
{typeof(SqlString), val => WriteNullable((SqlString) val, obj => WriteString((string) obj))},
{typeof(SqlInt16), val => WriteNullable((SqlInt16) val, obj => WriteInt16((short) obj))},
{typeof(SqlInt32), val => WriteNullable((SqlInt32) val, obj => WriteInt32((int) obj))},
{typeof(SqlInt64), val => WriteNullable((SqlInt64) val, obj => WriteInt64((long) obj)) },
{typeof(SqlByte), val => WriteNullable((SqlByte) val, obj => WriteByte((byte) obj)) },
{typeof(SqlBoolean), val => WriteNullable((SqlBoolean) val, obj => WriteBoolean((bool) obj)) },
{typeof(SqlDouble), val => WriteNullable((SqlDouble) val, obj => WriteDouble((double) obj)) },
{typeof(SqlSingle), val => WriteNullable((SqlSingle) val, obj => WriteSingle((float) obj)) },
{typeof(SqlDecimal), val => WriteNullable((SqlDecimal) val, obj => WriteSqlDecimal((SqlDecimal) obj)) },
{typeof(SqlDateTime), val => WriteNullable((SqlDateTime) val, obj => WriteDateTime((DateTime) obj)) },
{typeof(SqlBytes), val => WriteNullable((SqlBytes) val, obj => WriteBytes((byte[]) obj)) },
{typeof(SqlBinary), val => WriteNullable((SqlBinary) val, obj => WriteBytes((byte[]) obj)) },
{typeof(SqlGuid), val => WriteNullable((SqlGuid) val, obj => WriteGuid((Guid) obj)) },
{typeof(SqlMoney), val => WriteNullable((SqlMoney) val, obj => WriteMoney((SqlMoney) obj)) }
};
}
#region IFileStreamWriter Implementation
/// <summary>
/// Writes an entire row to the file stream
/// </summary>
/// <param name="reader">A primed reader</param>
/// <returns>Number of bytes used to write the row</returns>
public int WriteRow(StorageDataReader reader)
{
// Read the values in from the db
object[] values = new object[reader.Columns.Length];
if (!reader.HasLongColumns)
{
// get all record values in one shot if there are no extra long fields
reader.GetValues(values);
}
// Loop over all the columns and write the values to the temp file
int rowBytes = 0;
for (int i = 0; i < reader.Columns.Length; i++)
{
DbColumnWrapper ci = reader.Columns[i];
if (reader.HasLongColumns)
{
if (reader.IsDBNull(i))
{
// Need special case for DBNull because
// reader.GetValue doesn't return DBNull in case of SqlXml and CLR type
values[i] = DBNull.Value;
}
else
{
if (ci.IsLong.HasValue && ci.IsLong.Value)
{
// this is a long field
if (ci.IsBytes)
{
values[i] = reader.GetBytesWithMaxCapacity(i, executionSettings.MaxCharsToStore);
}
else if (ci.IsChars)
{
int maxChars = ci.IsXml
? executionSettings.MaxXmlCharsToStore
: executionSettings.MaxCharsToStore;
values[i] = reader.GetCharsWithMaxCapacity(i, maxChars);
}
else if (ci.IsXml)
{
values[i] = reader.GetXmlWithMaxCapacity(i, executionSettings.MaxXmlCharsToStore);
}
else
{
// we should never get here
Debug.Assert(false);
}
}
else
{
// not a long field
values[i] = reader.GetValue(i);
}
}
}
// Get true type of the object
Type tVal = values[i].GetType();
// Write the object to a file
if (tVal == typeof(DBNull))
{
rowBytes += WriteNull();
}
else
{
if (ci.IsSqlVariant)
{
// serialize type information as a string before the value
string val = tVal.ToString();
rowBytes += WriteString(val);
}
// Use the appropriate writing method for the type
Func<object, int> writeMethod;
if (writeMethods.TryGetValue(tVal, out writeMethod))
{
rowBytes += writeMethod(values[i]);
}
else
{
rowBytes += WriteString(values[i].ToString());
}
}
}
// Flush the buffer after every row
FlushBuffer();
return rowBytes;
}
[Obsolete]
public void WriteRow(IList<DbCellValue> row, IList<DbColumnWrapper> columns)
{
throw new InvalidOperationException("This type of writer is meant to write values from a DbDataReader only.");
}
/// <summary>
/// Seeks to a given offset in the file, relative to the beginning of the file
/// </summary>
public void Seek(long offset)
{
fileStream.Seek(offset, SeekOrigin.Begin);
}
/// <summary>
/// Flushes the internal buffer to the file stream
/// </summary>
public void FlushBuffer()
{
fileStream.Flush();
}
#endregion
#region Private Helpers
/// <summary>
/// Writes null to the file as one 0x00 byte
/// </summary>
/// <returns>Number of bytes used to store the null</returns>
internal int WriteNull()
{
byteBuffer[0] = 0x00;
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 1);
}
/// <summary>
/// Writes a short to the file
/// </summary>
/// <returns>Number of bytes used to store the short</returns>
internal int WriteInt16(short val)
{
byteBuffer[0] = 0x02; // length
shortBuffer[0] = val;
Buffer.BlockCopy(shortBuffer, 0, byteBuffer, 1, 2);
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 3);
}
/// <summary>
/// Writes an int to the file
/// </summary>
/// <returns>Number of bytes used to store the int</returns>
internal int WriteInt32(int val)
{
byteBuffer[0] = 0x04; // length
intBuffer[0] = val;
Buffer.BlockCopy(intBuffer, 0, byteBuffer, 1, 4);
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 5);
}
/// <summary>
/// Writes a long to the file
/// </summary>
/// <returns>Number of bytes used to store the long</returns>
internal int WriteInt64(long val)
{
byteBuffer[0] = 0x08; // length
longBuffer[0] = val;
Buffer.BlockCopy(longBuffer, 0, byteBuffer, 1, 8);
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 9);
}
/// <summary>
/// Writes a char to the file
/// </summary>
/// <returns>Number of bytes used to store the char</returns>
internal int WriteChar(char val)
{
byteBuffer[0] = 0x02; // length
charBuffer[0] = val;
Buffer.BlockCopy(charBuffer, 0, byteBuffer, 1, 2);
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 3);
}
/// <summary>
/// Writes a bool to the file
/// </summary>
/// <returns>Number of bytes used to store the bool</returns>
internal int WriteBoolean(bool val)
{
byteBuffer[0] = 0x01; // length
byteBuffer[1] = (byte) (val ? 0x01 : 0x00);
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 2);
}
/// <summary>
/// Writes a byte to the file
/// </summary>
/// <returns>Number of bytes used to store the byte</returns>
internal int WriteByte(byte val)
{
byteBuffer[0] = 0x01; // length
byteBuffer[1] = val;
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 2);
}
/// <summary>
/// Writes a float to the file
/// </summary>
/// <returns>Number of bytes used to store the float</returns>
internal int WriteSingle(float val)
{
byteBuffer[0] = 0x04; // length
floatBuffer[0] = val;
Buffer.BlockCopy(floatBuffer, 0, byteBuffer, 1, 4);
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 5);
}
/// <summary>
/// Writes a double to the file
/// </summary>
/// <returns>Number of bytes used to store the double</returns>
internal int WriteDouble(double val)
{
byteBuffer[0] = 0x08; // length
doubleBuffer[0] = val;
Buffer.BlockCopy(doubleBuffer, 0, byteBuffer, 1, 8);
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 9);
}
/// <summary>
/// Writes a SqlDecimal to the file
/// </summary>
/// <returns>Number of bytes used to store the SqlDecimal</returns>
internal int WriteSqlDecimal(SqlDecimal val)
{
int[] arrInt32 = val.Data;
int iLen = 3 + (arrInt32.Length * 4);
int iTotalLen = WriteLength(iLen); // length
// precision
byteBuffer[0] = val.Precision;
// scale
byteBuffer[1] = val.Scale;
// positive
byteBuffer[2] = (byte)(val.IsPositive ? 0x01 : 0x00);
// data value
Buffer.BlockCopy(arrInt32, 0, byteBuffer, 3, iLen - 3);
iTotalLen += FileUtilities.WriteWithLength(fileStream, byteBuffer, iLen);
return iTotalLen; // len+data
}
/// <summary>
/// Writes a decimal to the file
/// </summary>
/// <returns>Number of bytes used to store the decimal</returns>
internal int WriteDecimal(decimal val)
{
int[] arrInt32 = decimal.GetBits(val);
int iLen = arrInt32.Length * 4;
int iTotalLen = WriteLength(iLen); // length
Buffer.BlockCopy(arrInt32, 0, byteBuffer, 0, iLen);
iTotalLen += FileUtilities.WriteWithLength(fileStream, byteBuffer, iLen);
return iTotalLen; // len+data
}
/// <summary>
/// Writes a DateTime to the file
/// </summary>
/// <returns>Number of bytes used to store the DateTime</returns>
public int WriteDateTime(DateTime dtVal)
{
return WriteInt64(dtVal.Ticks);
}
/// <summary>
/// Writes a DateTimeOffset to the file
/// </summary>
/// <returns>Number of bytes used to store the DateTimeOffset</returns>
internal int WriteDateTimeOffset(DateTimeOffset dtoVal)
{
// Write the length, which is 2 * sizeof(long) = 16 bytes
byteBuffer[0] = 0x10; // length (16)
// Write the two longs, the datetime and the offset
long[] longBufferOffset = new long[2];
longBufferOffset[0] = dtoVal.Ticks;
longBufferOffset[1] = dtoVal.Offset.Ticks;
Buffer.BlockCopy(longBufferOffset, 0, byteBuffer, 1, 16);
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 17);
}
/// <summary>
/// Writes a TimeSpan to the file
/// </summary>
/// <returns>Number of bytes used to store the TimeSpan</returns>
internal int WriteTimeSpan(TimeSpan timeSpan)
{
return WriteInt64(timeSpan.Ticks);
}
/// <summary>
/// Writes a string to the file
/// </summary>
/// <returns>Number of bytes used to store the string</returns>
internal int WriteString(string sVal)
{
Validate.IsNotNull(nameof(sVal), sVal);
int iTotalLen;
if (0 == sVal.Length) // special case of 0 length string
{
const int iLen = 5;
AssureBufferLength(iLen);
byteBuffer[0] = 0xFF;
byteBuffer[1] = 0x00;
byteBuffer[2] = 0x00;
byteBuffer[3] = 0x00;
byteBuffer[4] = 0x00;
iTotalLen = FileUtilities.WriteWithLength(fileStream, byteBuffer, 5);
}
else
{
// Convert to a unicode byte array
byte[] bytes = Encoding.Unicode.GetBytes(sVal);
// convert char array into byte array and write it out
iTotalLen = WriteLength(bytes.Length);
iTotalLen += FileUtilities.WriteWithLength(fileStream, bytes, bytes.Length);
}
return iTotalLen; // len+data
}
/// <summary>
/// Writes a byte[] to the file
/// </summary>
/// <returns>Number of bytes used to store the byte[]</returns>
internal int WriteBytes(byte[] bytesVal)
{
Validate.IsNotNull(nameof(bytesVal), bytesVal);
int iTotalLen;
if (bytesVal.Length == 0) // special case of 0 length byte array "0x"
{
AssureBufferLength(5);
byteBuffer[0] = 0xFF;
byteBuffer[1] = 0x00;
byteBuffer[2] = 0x00;
byteBuffer[3] = 0x00;
byteBuffer[4] = 0x00;
iTotalLen = FileUtilities.WriteWithLength(fileStream, byteBuffer, 5);
}
else
{
iTotalLen = WriteLength(bytesVal.Length);
iTotalLen += FileUtilities.WriteWithLength(fileStream, bytesVal, bytesVal.Length);
}
return iTotalLen; // len+data
}
/// <summary>
/// Stores a GUID value to the file by treating it as a byte array
/// </summary>
/// <param name="val">The GUID to write to the file</param>
/// <returns>Number of bytes written to the file</returns>
internal int WriteGuid(Guid val)
{
byte[] guidBytes = val.ToByteArray();
return WriteBytes(guidBytes);
}
/// <summary>
/// Stores a SqlMoney value to the file by treating it as a decimal
/// </summary>
/// <param name="val">The SqlMoney value to write to the file</param>
/// <returns>Number of bytes written to the file</returns>
internal int WriteMoney(SqlMoney val)
{
return WriteDecimal(val.Value);
}
/// <summary>
/// Creates a new buffer that is of the specified length if the buffer is not already
/// at least as long as specified.
/// </summary>
/// <param name="newBufferLength">The minimum buffer size</param>
private void AssureBufferLength(int newBufferLength)
{
if (newBufferLength > byteBuffer.Length)
{
byteBuffer = new byte[newBufferLength];
}
}
/// <summary>
/// Writes the length of the field using the appropriate number of bytes (i.e., 1 if the
/// length is &lt;255, 5 if the length is &gt;=255)
/// </summary>
/// <returns>Number of bytes used to store the length</returns>
private int WriteLength(int iLen)
{
if (iLen < 0xFF)
{
// fits in one byte of memory only need to write one byte
int iTmp = iLen & 0x000000FF;
byteBuffer[0] = Convert.ToByte(iTmp);
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 1);
}
// The length won't fit in 1 byte, so we need to use 1 byte to signify that the length
// is a full 4 bytes.
byteBuffer[0] = 0xFF;
// convert int32 into array of bytes
intBuffer[0] = iLen;
Buffer.BlockCopy(intBuffer, 0, byteBuffer, 1, 4);
return FileUtilities.WriteWithLength(fileStream, byteBuffer, 5);
}
/// <summary>
/// Writes a Nullable type (generally a Sql* type) to the file. The function provided by
/// <paramref name="valueWriteFunc"/> is used to write to the file if <paramref name="val"/>
/// is not null. <see cref="WriteNull"/> is used if <paramref name="val"/> is null.
/// </summary>
/// <param name="val">The value to write to the file</param>
/// <param name="valueWriteFunc">The function to use if val is not null</param>
/// <returns>Number of bytes used to write value to the file</returns>
private int WriteNullable(INullable val, Func<object, int> valueWriteFunc)
{
return val.IsNull ? WriteNull() : valueWriteFunc(val);
}
#endregion
#region IDisposable Implementation
private bool disposed;
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
protected virtual void Dispose(bool disposing)
{
if (disposed)
{
return;
}
if (disposing)
{
fileStream.Flush();
fileStream.Dispose();
}
disposed = true;
}
~ServiceBufferFileStreamWriter()
{
Dispose(false);
}
#endregion
}
}
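One subtlety in the writer above: a NULL cell and an empty string are both zero-length, so they are deliberately encoded differently. `WriteNull` emits a single 0x00 length byte (one byte total), while `WriteString` encodes an empty string with the five-byte length form 0xFF 00 00 00 00 (five bytes total); on the read side, `ReadString`'s `totalLength == 1` check is what tells the two apart. A hypothetical Python roundtrip of that scheme (not part of the service):

```python
def encode_string_cell(value):
    """NULL -> single 0x00 length byte; empty string -> five-byte
    zero length (0xFF 00 00 00 00); otherwise length prefix + UTF-16LE payload."""
    if value is None:
        return b"\x00"
    payload = value.encode("utf-16-le")
    if len(payload) == 0:
        return b"\xff\x00\x00\x00\x00"
    # Reuse the 1-or-5-byte length convention for non-empty payloads
    if len(payload) < 0xFF:
        return bytes([len(payload)]) + payload
    return b"\xff" + len(payload).to_bytes(4, "little") + payload

def decode_string_cell(data, offset=0):
    """Return (value, total_bytes). A total length of 1 means NULL."""
    first = data[offset]
    if first != 0xFF:
        length, header = first, 1
    else:
        length = int.from_bytes(data[offset + 1:offset + 5], "little")
        header = 5
    total = header + length
    if total == 1:
        return None, 1
    start = offset + header
    return data[start:start + length].decode("utf-16-le"), total
```

This is why `ReadString` and `ReadBytes` pass `totalLength => totalLength == 1` as their null check: it is the only encoding that occupies exactly one byte.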


@@ -0,0 +1,309 @@
//
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.Common;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using System.Xml;
using Microsoft.Kusto.ServiceLayer.QueryExecution.Contracts;
using Microsoft.SqlTools.Utility;
namespace Microsoft.Kusto.ServiceLayer.QueryExecution.DataStorage
{
/// <summary>
/// Wrapper around an IDataReader to perform some special operations more simply
/// </summary>
public class StorageDataReader
{
/// <summary>
/// Constructs a new wrapper around the provided reader
/// </summary>
/// <param name="reader">The reader to wrap around</param>
public StorageDataReader(IDataReader reader)
{
// Sanity check to make sure there is a data reader
Validate.IsNotNull(nameof(reader), reader);
DataReader = reader;
// Read the columns into a set of wrappers
List<DbColumnWrapper> columnList = new List<DbColumnWrapper>();
var rows = DataReader.GetSchemaTable().Rows;
foreach (DataRow row in rows)
{
columnList.Add(new DbColumnWrapper(row));
}
Columns = columnList.ToArray();
HasLongColumns = Columns.Any(column => column.IsLong.HasValue && column.IsLong.Value);
}
#region Properties
/// <summary>
/// All the columns that this reader currently contains
/// </summary>
public DbColumnWrapper[] Columns { get; private set; }
/// <summary>
/// The <see cref="DataReader"/> that will be read from
/// </summary>
public IDataReader DataReader { get; private set; }
/// <summary>
/// Whether or not any of the columns of this reader are 'long', such as nvarchar(max)
/// </summary>
public bool HasLongColumns { get; private set; }
#endregion
#region DbDataReader Methods
/// <summary>
/// Asynchronous wrapper over IDataReader.Read(). IDataReader does not expose a
/// ReadAsync(), so the synchronous Read() is dispatched to the thread pool.
/// </summary>
/// <param name="cancellationToken">The cancellation token to use for cancelling a query</param>
/// <returns>Task that resolves to true if another row was read, false otherwise</returns>
public Task<bool> ReadAsync(CancellationToken cancellationToken)
{
return Task.Run(() => DataReader.Read(), cancellationToken);
}
/// <summary>
/// Retrieves a value
/// </summary>
/// <param name="i">Column ordinal</param>
/// <returns>The value of the given column</returns>
public object GetValue(int i)
{
return DataReader.GetValue(i);
}
/// <summary>
/// Stores all values of the current row into the provided object array
/// </summary>
/// <param name="values">Where to store the values from this row</param>
public void GetValues(object[] values)
{
DataReader.GetValues(values);
}
/// <summary>
/// Whether or not the cell of the given column at the current row is a DBNull
/// </summary>
/// <param name="i">Column ordinal</param>
/// <returns>True if the cell is DBNull, false otherwise</returns>
public bool IsDBNull(int i)
{
return DataReader.IsDBNull(i);
}
#endregion
#region Public Methods
/// <summary>
/// Retrieves bytes with a maximum number of bytes to return
/// </summary>
/// <param name="iCol">Column ordinal</param>
/// <param name="maxNumBytesToReturn">Number of bytes to return at maximum</param>
/// <returns>Byte array</returns>
public byte[] GetBytesWithMaxCapacity(int iCol, int maxNumBytesToReturn)
{
if (maxNumBytesToReturn <= 0)
{
throw new ArgumentOutOfRangeException(nameof(maxNumBytesToReturn), SR.QueryServiceDataReaderByteCountInvalid);
}
//first, ask provider how much data it has and calculate the final # of bytes
//NOTE: -1 means that it doesn't know how much data it has
long neededLength;
long origLength = neededLength = GetBytes(iCol, 0, null, 0, 0);
if (neededLength == -1 || neededLength > maxNumBytesToReturn)
{
neededLength = maxNumBytesToReturn;
}
//get the data up to the maxNumBytesToReturn
byte[] bytesBuffer = new byte[neededLength];
GetBytes(iCol, 0, bytesBuffer, 0, (int)neededLength);
//see if server sent back more data than we should return
if (origLength == -1 || origLength > neededLength)
{
//pump the rest of data from the reader and discard it right away
long dataIndex = neededLength;
const int tmpBufSize = 100000;
byte[] tmpBuf = new byte[tmpBufSize];
while (GetBytes(iCol, dataIndex, tmpBuf, 0, tmpBufSize) == tmpBufSize)
{
dataIndex += tmpBufSize;
}
}
return bytesBuffer;
}
/// <summary>
/// Retrieves characters with a maximum number of chars to return
/// </summary>
/// <param name="iCol">Column ordinal</param>
/// <param name="maxCharsToReturn">Number of chars to return at maximum</param>
/// <returns>String</returns>
public string GetCharsWithMaxCapacity(int iCol, int maxCharsToReturn)
{
if (maxCharsToReturn <= 0)
{
throw new ArgumentOutOfRangeException(nameof(maxCharsToReturn), SR.QueryServiceDataReaderCharCountInvalid);
}
//first, ask provider how much data it has and calculate the final # of chars
//NOTE: -1 means that it doesn't know how much data it has
long neededLength;
long origLength = neededLength = GetChars(iCol, 0, null, 0, 0);
if (neededLength == -1 || neededLength > maxCharsToReturn)
{
neededLength = maxCharsToReturn;
}
Debug.Assert(neededLength < int.MaxValue);
//get the data up to maxCharsToReturn
char[] buffer = new char[neededLength];
if (neededLength > 0)
{
GetChars(iCol, 0, buffer, 0, (int)neededLength);
}
//see if server sent back more data than we should return
if (origLength == -1 || origLength > neededLength)
{
//pump the rest of data from the reader and discard it right away
long dataIndex = neededLength;
const int tmpBufSize = 100000;
char[] tmpBuf = new char[tmpBufSize];
while (GetChars(iCol, dataIndex, tmpBuf, 0, tmpBufSize) == tmpBufSize)
{
dataIndex += tmpBufSize;
}
}
string res = new string(buffer);
return res;
}
/// <summary>
/// Retrieves an XML value with a maximum number of chars to return
/// </summary>
/// <param name="iCol">Column ordinal</param>
/// <param name="maxCharsToReturn">Number of chars to return at maximum</param>
/// <returns>String</returns>
public string GetXmlWithMaxCapacity(int iCol, int maxCharsToReturn)
{
if (maxCharsToReturn <= 0)
{
throw new ArgumentOutOfRangeException(nameof(maxCharsToReturn), SR.QueryServiceDataReaderXmlCountInvalid);
}
// Truncate the string representation if it exceeds the requested maximum
string xml = GetValue(iCol)?.ToString();
if (xml != null && xml.Length > maxCharsToReturn)
{
xml = xml.Substring(0, maxCharsToReturn);
}
return xml;
}
#endregion
#region Private Helpers
private long GetBytes(int i, long dataIndex, byte[] buffer, int bufferIndex, int length)
{
return DataReader.GetBytes(i, dataIndex, buffer, bufferIndex, length);
}
private long GetChars(int i, long dataIndex, char[] buffer, int bufferIndex, int length)
{
return DataReader.GetChars(i, dataIndex, buffer, bufferIndex, length);
}
#endregion
/// <summary>
/// Internal class for writing strings with a maximum capacity
/// </summary>
/// <remarks>
/// This code is taken almost verbatim from Microsoft.SqlServer.Management.UI.Grid, SSMS
/// DataStorage, StorageDataReader class.
/// </remarks>
internal class StringWriterWithMaxCapacity : StringWriter
{
private bool stopWriting;
private int CurrentLength
{
get { return GetStringBuilder().Length; }
}
public StringWriterWithMaxCapacity(IFormatProvider formatProvider, int capacity) : base(formatProvider)
{
MaximumCapacity = capacity;
}
private int MaximumCapacity { get; set; }
public override void Write(char value)
{
if (stopWriting) { return; }
if (CurrentLength < MaximumCapacity)
{
base.Write(value);
}
else
{
stopWriting = true;
}
}
public override void Write(char[] buffer, int index, int count)
{
if (stopWriting) { return; }
int curLen = CurrentLength;
// 'count' is the number of characters to write starting at 'index',
// so the capacity check must not subtract the start offset
if (curLen + count > MaximumCapacity)
{
stopWriting = true;
count = MaximumCapacity - curLen;
if (count < 0)
{
count = 0;
}
}
base.Write(buffer, index, count);
}
public override void Write(string value)
{
// TextWriter.Write(string) treats null as a no-op; match that here
if (stopWriting || value == null) { return; }
int curLen = CurrentLength;
if (value.Length + curLen > MaximumCapacity)
{
stopWriting = true;
base.Write(value.Substring(0, MaximumCapacity - curLen));
}
else
{
base.Write(value);
}
}
}
}
}