Archive for the ‘SSAS’ Category

SSAS: The Operation Was Canceled Because of Locking Conflicts

I have many cubes and I had never seen this error. Every one of my cubes was processing on the same schedule it always had, yet users started getting the error: “The operation was canceled because of locking conflicts.” My ProcessFull jobs were hitting it and so were the users. What’s going on? My users are mostly on SSAS Tabular, so as far as I was concerned they are all read-only and could never take an exclusive lock. But they do take read locks, and some of them started getting “A lock could not be obtained.” What’s going on?

It’s pretty simple. All my jobs originally ran every hour from 6 AM to 7 PM, so every hour, on the hour, every job kicked off. It started as one ProcessFull per database, but over time the PowerPivot daily refresh in SharePoint was moved to the SSAS Tabular server as an hourly ProcessFull, which then became partitioned, and then “we need real time” turned it into every 10 minutes.

What’s happening?

Processing (reading in and recalculating a partition) needs to swap the old partition for the new one, but it must wait until long-running queries are finished. This takes a write lock.

Reads in progress hold read locks that block the write lock until they finish.

New read requests must wait on the pending write lock until it is released, so those users get an hourglass.

This is where the locking conflicts occur. SSAS looks at the outstanding read locks and waits until the timeout threshold has been reached, then cancels the blocked operations. My users were running very small MDX queries in the beginning, but the queries have grown complex as they drill down into more and more detail, producing bigger query plans that sit blocked behind the commit. On top of that, after a ProcessFull the query cache has to be rebuilt, so only the first user to run a given query is slow; the rest get a fast cached result, which left me trying to analyze seemingly random users.

First: stagger all processing start times (7:05, 7:10, and so on).

Next: use the <Parallel> tag.

This processes all the objects in a single transaction and performs one commit, so readers are blocked once per batch instead of once per object.
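If you drive processing from AMO (as in the Script Task in the ProcessAdd post below), the capture log will generate that same single-commit batch for you. A minimal sketch, assuming a placeholder instance name and example database names:

using Microsoft.AnalysisServices;

// sketch: process several databases as one <Batch>/<Parallel> with a single commit
Server svr = new Server();
svr.Connect(@"Data Source=localhost\TABULAR"); // placeholder instance

svr.CaptureXml = true; // record the commands instead of executing them one at a time
svr.Databases.FindByName("TraceData").Process(ProcessType.ProcessFull); // example names
svr.Databases.FindByName("Sales").Process(ProcessType.ProcessFull);
svr.CaptureXml = false;

// transactional = true, parallel = true: one batch, one commit,
// so readers are blocked once instead of once per database
XmlaResultCollection results = svr.ExecuteCaptureLog(true, true);
svr.Disconnect();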

Finally: the managers must make a tradeoff, either a reduced ProcessFull schedule or ProcessAdd. ProcessAdd adds a small partition with a small footprint that does not lock your main partition. ProcessAdd is more complicated because every update must go through as a delta: a fact of 8 that changes to 3 must be delta-processed as -5, with a ProcessFull later to true everything up.

This is a simple analysis, but if the problem continues you must look at using ProcessAdd. If your database is in a star or snowflake schema, ProcessAdd is very simple: you have a fact of $3000, an update of -$200 comes through, you add a new fact of -$200 so the sum is $2800, and you do a partition merge at the end of the day. Dimensions still have to be processed full.

There is a much more detailed description here.


SSAS Tabular Differential Processing (Process ADD)

In SSAS Tabular you have several options for getting close to real-time processing, much like ROLAP in SSAS Multidimensional. ROLAP is really a query that goes against the OLTP database directly, versus MOLAP, where data is pulled into SSAS and pre-aggregated.

SSAS 2012 Tabular is in-memory (excluding DirectQuery), so you have several choices. Full processing: all dimensions and facts are processed, which causes SSAS to keep a copy of your Tabular database for current users while it refreshes the tables (dimensions and facts) in memory. Your users don’t notice a thing, provided you have the memory to hold two copies of your SSAS Tabular database and the processing power to handle the current MDX queries and calculations while the new cube/Tabular database is built.

In a dedicated server environment where your users want a daily, hourly, or 5-minute refresh and you have a small SSAS cube, you will be fine as long as you take those two considerations (memory and CPU) into account.

For anything else you need to use partitioning and/or differential processing.

Partitioning means breaking the same dataset into segments (similar to SQL Server partitioning); differential processing (ProcessAdd) is the other option, and you can combine the two. Partitions (every SSAS Tabular table has at least one) take the table definition query and break it into subsets that use the same connection and columns but cover different ranges: behind them are multiple queries ranged on a date, a transaction #, and so on (see the sketch below). Differential processing works the same way but with a performance gain. For a partition to be processed and viewable it must be fully processed, and that triggers a recalculation if you have calculations spanning the other processed partitions in that table.
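Here is a sketch of what those range-bound partitions look like through AMO; the measure group and data source view are located the same way as in the Script Task below, and the table, column, and date boundary are hypothetical:

using Microsoft.AnalysisServices;

// sketch: split one Tabular table into two date-range partitions
static void CreateRangePartitions(MeasureGroup mg, DataSourceView dsv)
{
    Partition p2013 = mg.Partitions.Add("TraceData_2013");
    p2013.Source = new QueryBinding(dsv.DataSourceID,
        "SELECT * FROM [dbo].[PermTraceData] WHERE [StartTime] < '20140101'");
    p2013.StorageMode = StorageMode.InMemory;

    Partition p2014 = mg.Partitions.Add("TraceData_2014");
    p2014.Source = new QueryBinding(dsv.DataSourceID,
        "SELECT * FROM [dbo].[PermTraceData] WHERE [StartTime] >= '20140101'");
    p2014.StorageMode = StorageMode.InMemory;

    mg.Update(UpdateOptions.ExpandFull); // push the definitions; each partition still needs processing
}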

ProcessAdd creates a partition, but SSAS joins it into the query automatically, eliminating the need for any full processing (recalculating across all partitions in the Tabular table). The gain is instant inserts of new data without reprocessing. The drawback is that this is only easy when your data is never updated, and you are putting the work on SSAS to “merge” the data at query time, which slows MDX queries. Luckily there is a merge operation that folds all the differential partitions back into a daily or monthly partition.
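A minimal sketch of that merge through AMO, assuming the differential partitions follow the TraceData_yyyyMMddHHmmss naming used below and a hypothetical TraceData_Daily target partition:

using System.Collections.Generic;
using Microsoft.AnalysisServices;

// sketch: fold every timestamped ProcessAdd slice back into one daily partition
static void MergeDifferentials(MeasureGroup mg)
{
    Partition daily = mg.Partitions.FindByName("TraceData_Daily"); // hypothetical target
    var slices = new List<Partition>();
    foreach (Partition p in mg.Partitions)
    {
        if (p != daily && p.Name.StartsWith("TraceData_2")) // the timestamped differentials
            slices.Add(p);
    }
    foreach (Partition p in slices)
        daily.Merge(p); // moves the slice's data into daily and drops the slice
}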

For a database that is never updated, differential processing (Tabular ProcessAdd) is easy: get the max record id from your Tabular table and pull all source data greater than it.
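A sketch of that max-id handshake, using ADOMD.NET to ask the model for its high-water mark; the connection string, table, and column names are placeholders:

using System;
using Microsoft.AnalysisServices.AdomdClient;

// sketch: read the highest key already in the Tabular table, then bind the
// new ProcessAdd partition to everything above it
long maxId;
using (var conn = new AdomdConnection(@"Data Source=localhost\TABULAR;Catalog=TraceData"))
{
    conn.Open();
    var cmd = new AdomdCommand(
        "EVALUATE ROW(\"MaxId\", MAX(TraceData[RowNumber]))", conn); // DAX against the model
    using (var rdr = cmd.ExecuteReader())
    {
        rdr.Read();
        maxId = Convert.ToInt64(rdr[0]);
    }
}
String query = "SELECT * FROM [dbo].[PermTraceData] WHERE [RowNumber] > " + maxId;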

For a database that is updated, you need a data mart, or at least a source that gives you the old and new value of each column, so you can insert a delta row (new value minus old value).
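Sketched as code, the delta for an updated fact (hypothetical values matching the dollar example earlier on this page):

// sketch: an update in the source becomes a compensating insert in the cube
decimal oldAmount = 3000m;                  // value already summed into the cube
decimal newAmount = 2800m;                  // value after the update in the source
decimal deltaFact = newAmount - oldAmount;  // -200; the partition sums now yield 2800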

In this example I am demonstrating the no-update case. I’m using data from the ASTrace tool, which captures all SSAS activity from multiple servers.

Is_Processed is a bit flag that I update after I have staged and imported the data into my Tabular cube; my start and end transaction #s are the max in the Tabular table and the last current row in SSIS.
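After the partition processes, the flag gets flipped so the rows are not picked up again. An Execute SQL Task would normally do this; a C# equivalent, with a placeholder connection string:

using System.Data.SqlClient;

// sketch: mark the staged rows as consumed once the partition has processed
using (var conn = new SqlConnection("Server=localhost;Database=Sandbox;Integrated Security=true"))
using (var cmd = new SqlCommand(
    "UPDATE [dbo].[PermTraceData] SET [Is_Processed] = 1 WHERE [Is_Processed] = 0", conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}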

Each partition name uses the table name as a prefix and a _yyyyMMddHHmmss timestamp as a suffix.

Here is how I do it in an SSIS Script Task:

#region Namespaces
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using Microsoft.AnalysisServices;
#endregion

namespace ST_4c592bd54c674fb5a16c0569f836f12e
{

[Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{

public void Main()
{
Int32 iRowCount = Convert.ToInt32(Dts.Variables["User::RowCount"].Value);
if (iRowCount != 0) // skip if no rows are ready to process
{
bool fireAgain = true;
// start/end transaction #s staged by the package; the query binding below uses the Is_Processed flag instead
int iStartRow = (int)Dts.Variables["$Package::iStartRow"].Value;
int iEndRow = (int)Dts.Variables["$Package::iEndRow"].Value;
String sServerConn = Dts.Connections["Zenox.TraceDataSSAS"].ConnectionString;
Server svr = new Server();
Database db = new Database();
Cube cb = new Cube();
DataSourceView dsv;
MeasureGroup mg;

try
{
svr.Connect(sServerConn);
db = svr.Databases.FindByName("TraceData");
dsv = db.DataSourceViews.FindByName("Sandbox");
cb = db.Cubes.FindByName("Model");
mg = cb.MeasureGroups.FindByName("TraceData");

String sPartName = "TraceData_" + CreateSuffix();
Dts.Events.FireInformation(0, "Partition Name", sPartName, "", 0, ref fireAgain);
Partition part = mg.Partitions.Add(sPartName);
part.Source = new QueryBinding(dsv.DataSourceID, "SELECT [dbo].[PermTraceData].[RowNumber],[dbo].[PermTraceData].[EventClass],[dbo].[PermTraceData].[NTUserName],[dbo].[PermTraceData].[ApplicationName],[dbo].[PermTraceData].[StartTime],[dbo].[PermTraceData].[CurrentTime],[dbo].[PermTraceData].[Duration],[dbo].[PermTraceData].[DatabaseName],[dbo].[PermTraceData].[ObjectName],[dbo].[PermTraceData].[Error],[dbo].[PermTraceData].[CPUTime],[dbo].[PermTraceData].[NTDomainName],[dbo].[PermTraceData].[RequestParameters],[dbo].[PermTraceData].[RequestProperties],[dbo].[PermTraceData].[TextData],[dbo].[PermTraceData].[ServerName] FROM [dbo].[PermTraceData] WHERE [Is_Processed] = 0");
part.StorageMode = StorageMode.InMemory;
part.Update(UpdateOptions.ExpandFull); // push the new partition definition to the server
part.Process(ProcessType.ProcessData); // load the staged rows into the new partition
part.Process(ProcessType.ProcessAdd);
}
catch (Exception ex)
{
Dts.Events.FireError(0, "Process Cube Script", ex.Message + "\r" + ex.StackTrace, String.Empty, 0);
Dts.TaskResult = (int)ScriptResults.Failure;
}
}

Dts.TaskResult = (int)ScriptResults.Success;
}

private String CreateSuffix()
{
// zero-padded timestamp: yyyyMMddHHmmss
return DateTime.Now.ToString("yyyyMMddHHmmss");
}

#region ScriptResults declaration
/// <summary>
/// This enum provides a convenient shorthand within the scope of this class for setting the
/// result of the script.
///
/// This code was generated automatically.
/// </summary>
enum ScriptResults
{
Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
};
#endregion

}
}

Processing an SSAS Tabular Cube with PowerShell

PowerShell is perfect for simple processing of Tabular cubes. Here is how to do it.

SSAS 2012 Multidimensional vs Tabular

I’m still getting used to it: the days of cubes with many dimensions are going to migrate into Tabular. I will be the first to admit it’s hard to let my SQL, Oracle, and other data warehouses go. But using Tabular you can process data and handle bad data using DAX, and it lets your users actually use the data. It’s all free with Excel 2010: download PowerPivot and see what you can do with your old data marts, data warehouses, SQL databases, Oracle databases, Access, etc. You can use them all together and let your power users create dynamic PowerPivot sheets on their own. Using SharePoint 2010 you can look at usage and import a workbook into Tabular if its data needs to be updated on a schedule. Multidimensional still has a purpose, but with Tabular you can put everything in memory and get the fastest results. Any small data mart I build as Tabular; huge ones I build as Multidimensional and then Tabular.

http://www.slideshare.net/d501159/02-sql-server2012sp1blazingfastperformancel200
