Thursday, December 18, 2008

PPS Planning - Current Period

Over the past two years, one of the biggest mysteries for a lot of PPS Planning users has been where the current period, set in the Business Planning Modeler, is actually stored. This has been discussed before, and a number of posts on the PPS forum have clearly demonstrated how to get at it, but I have yet to see a full set of code for bringing this setting into your own environment. I use the current period in my hourly PPS updates from our ERP system, both to make sure we do not reload old data and to speed up the process. Part of my ETL process therefore interrogates the XML blob in the BizAppNodes table and returns the current period ID and label for each model. These records are inserted into a table that I keep up to date and then use in my ETL process. Here is the code:

DECLARE @xmlblob xml

SELECT @xmlblob = CAST(CAST(ba.xmlblob AS varbinary(MAX)) AS xml)
FROM BizAppNodes ba
WHERE VersionEndDateTime = '9999-12-31 00:00:00.000'
  AND BizAppNodeLabel = 'FinanceModel'

SELECT DISTINCT ModelLabel
     , CurrentPeriodId
     , T.[MonthLabel]
FROM
(
    SELECT DISTINCT
        tab.col.value('../../@Label', 'varchar(30)') AS ModelLabel,
        tab.col.value('@CurrentPeriodId', 'varchar(30)') AS CurrentPeriodId,
        tab.col.value('@EffectiveDate', 'varchar(30)') AS EffectiveDate
    FROM @xmlblob.nodes('/BizModelSite/Models/ArrayOfBizModel/BizModel/EffectiveDatedCurrentPeriods/EffectiveDatedCurrentPeriod') AS tab(col)
) AS TABLE1
LEFT OUTER JOIN [dbo].[D_Time] T ON T.[Month] = TABLE1.[CurrentPeriodId]

The output is the model name, period id and period label for each model in the application.
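
To keep these values on hand for the hourly run, I land them in a small tracking table and refresh it each cycle. The sketch below is purely illustrative: the dbo.ModelCurrentPeriod table and its columns are my own invented names, and it assumes the @xmlblob variable populated by the query above.

```sql
-- Hypothetical tracking table kept up to date by the ETL job
-- (all names here are illustrative, not shipped PPS objects).
IF OBJECT_ID('dbo.ModelCurrentPeriod') IS NULL
    CREATE TABLE dbo.ModelCurrentPeriod
    (
        ModelLabel      varchar(30) NOT NULL PRIMARY KEY,
        CurrentPeriodId varchar(30) NOT NULL,
        MonthLabel      varchar(30) NULL,
        LoadedAt        datetime    NOT NULL DEFAULT GETDATE()
    );

-- Refresh each run: remove the old snapshot, insert the latest one.
DELETE FROM dbo.ModelCurrentPeriod;

INSERT INTO dbo.ModelCurrentPeriod (ModelLabel, CurrentPeriodId, MonthLabel)
SELECT DISTINCT
    tab.col.value('../../@Label', 'varchar(30)'),
    tab.col.value('@CurrentPeriodId', 'varchar(30)'),
    NULL  -- MonthLabel can be resolved afterwards via the D_Time join
FROM @xmlblob.nodes('/BizModelSite/Models/ArrayOfBizModel/BizModel/EffectiveDatedCurrentPeriods/EffectiveDatedCurrentPeriod') AS tab(col);
```

The downstream ETL then only has to join to this table to know which period each model is sitting on.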

- Paul Steynberg

Friday, December 12, 2008

PPS Up Your Street

I have added a new link to another PPS blog by David Street. It makes for some good reading.

- Paul Steynberg

Wednesday, December 10, 2008

PerformancePoint Server SP2 - Feedback

Nick Barclay, Tim Kent and Sacha Tomey have all posted detailed blogs on what you can expect from PerformancePoint Server SP2, so I will not labour the point. However, here are some real live bits and bobs, as I installed SP2 (Beta) over a month ago under the following configuration:

  • PPS server: Windows Server 2008 and SQL Server 2008, x64, 4 x 6-core processors, 64 GB memory.
  • SQL server: Windows Server 2003 and SQL Server 2008, x64, 2 x 4-core processors, 64 GB memory.

We put a new server in for PPS but used an existing one for SQL. This will be upgraded within the next 3 months.

The install did not give any surprises. You must first install PPS RTM and then apply SP2, as expected. Something that did bite us was that you could use the PBM on the server but NOT from your desktop. My guys in networks are convinced it has something to do with the way credentials are double-hopped from the desktop to the PPS server to the SQL server. (They mumbled a whole bunch of stuff about Kerberos and AD, and how it would take days to figure out the problem.) To fix this we just changed the PPSPlanningWebService “ASP.NET impersonation” setting from Enabled to Disabled. Here is the error message you will find in your event viewer:

Date 14/11/2008 15:38

Log SQL Server (Current - 14/11/2008 15:38)

Source Logon

Message

Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'. Reason: Token-based server access validation failed with an infrastructure error. Check for previous errors. [CLIENT: [Ip address here]]

Date 14/11/2008 15:38

Log SQL Server (Current - 14/11/2008 15:38)

Source Logon

Message

Error: 18456, Severity: 14, State: 11.
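
The fix above amounts to a single IIS setting. Expressed in web.config terms, disabling ASP.NET impersonation for the Planning web service application looks something like the fragment below (a sketch of the relevant element only; your actual file will contain much more):

```xml
<!-- web.config for the PPSPlanningWebService application -->
<configuration>
  <system.web>
    <!-- With impersonation disabled the service runs as the application
         pool identity instead of forwarding the (double-hopped) caller
         credentials on to SQL Server. -->
    <identity impersonate="false" />
  </system.web>
</configuration>
```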

Considerations when Installing on Windows 2008.

You need to make sure that Web Server (IIS) is installed. If not, open Server Manager, click on “Add Roles” and install Web Server (IIS).

Once installed open up Server Manager and click on Roles.

Click on “Add Role Services”.

Install the following role services:

  • Application Development, all except “Server Side Includes”.
  • Under Management Tools, “IIS Management Console” and “IIS 6 Management Compatibility”.
  • Under Security, “Basic Authentication” and “Windows Authentication”.

Now install PPS RTM 64-bit. DO NOT run the Configuration Manager.

Install PPS SP2 64-bit.

Now you can run the Configuration Manager for PPS Planning. You must have SQL Server 2005 Cumulative Update 7 installed. We needed to install these files to ensure that we met the requirements:

  • SQLServer2005-KB949095-x64-ENU.exe
  • sqlncli_x64.exe
  • SQLServer2005_ADOMD_x64.exe
  • SQLServer2005_XMO_x64.exe

To register ASP.NET 2.0 Web Service Extensions in IIS:

  • Open cmd : Start -> Run -> cmd
  • Navigate to the correct folder and run:
  • C:\Windows\Microsoft.NET\Framework64\v2.0.50727\aspnet_regiis -ir

You can then start the Configuration Manager again and from here on it is stock standard as per normal.

We needed to update to SP2 so that we could get the performance benefits in Management Reporter. The improvements in performance are massive, HOWEVER they are still not good enough: it still takes way too long to produce our reports. So back to the drawing board for Microsoft, who are now performing a full evaluation of the product and working out how to improve it.

- Paul Steynberg

Friday, December 5, 2008

PerformancePoint Server - An ETL Tip

Anyone running a Financial Model within PPS Planning in all likelihood updates the actuals from an ERP System within the business. Under normal circumstances this will entail (amongst a myriad of other things) the following:

  • Synchronize MG Tables to the Stage Area.
  • By way of some ETL tool (Normally SSIS) bring in your actuals.
  • All Deletes, Inserts and Updates are then written into the MG Table with the appropriate BizSystemFlag. 200 for Inserts, 300 for updates and 400 for deletes.
  • The data is then loaded from staging to the RefDB.
  • The model is processed.

Now one of the biggest problems in this entire process is the time it takes to Synchronize the MG tables to the Staging database and then if you have an enormous number of records, the inserting of these to the RefDB. (We have 22 million records in our MG table).

So we have decided to shortcut the process. We created our own ETL SSIS packages to move the data into the MG tables in the Stage database. This method completely negates the effort of synchronizing the MG table to the staging database as we join across the 2 databases in order to detect any updates or inserts. As the data always comes from a LOB system we never do any deletes. For new records we just insert them into the Stage database with the BizSystemFlag of 200. For updates we fetch the existing record from the RefDB into the StageDB and insert an additional record with the BizSystemFlag of 300. You cannot insert a record of type 300 or 400 without the accompanying 100 record. If you do the load will fail and you will get errors on those records.
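
The detection logic can be sketched roughly as below. The table and column names are placeholders of my own (the real MG table carries many more columns); treat this as an illustration of the flag pattern rather than a drop-in script.

```sql
-- Illustrative sketch only: table and column names are assumptions.
-- New records from the LOB extract -> insert into the Stage MG table as 200.
INSERT INTO StageDB.dbo.MG_Measures (AccountId, EntityId, TimeId, Value, BizSystemFlag)
SELECT e.AccountId, e.EntityId, e.TimeId, e.Value, 200
FROM StageDB.dbo.ErpExtract e
LEFT JOIN RefDB.dbo.MG_Measures m
  ON m.AccountId = e.AccountId AND m.EntityId = e.EntityId AND m.TimeId = e.TimeId
WHERE m.AccountId IS NULL;

-- Changed records: first copy the existing Ref record into Stage as the
-- accompanying 100 record...
INSERT INTO StageDB.dbo.MG_Measures (AccountId, EntityId, TimeId, Value, BizSystemFlag)
SELECT m.AccountId, m.EntityId, m.TimeId, m.Value, 100
FROM RefDB.dbo.MG_Measures m
JOIN StageDB.dbo.ErpExtract e
  ON m.AccountId = e.AccountId AND m.EntityId = e.EntityId AND m.TimeId = e.TimeId
WHERE m.Value <> e.Value;

-- ...then insert the new value as the 300 (update) record against it.
INSERT INTO StageDB.dbo.MG_Measures (AccountId, EntityId, TimeId, Value, BizSystemFlag)
SELECT e.AccountId, e.EntityId, e.TimeId, e.Value, 300
FROM StageDB.dbo.ErpExtract e
JOIN RefDB.dbo.MG_Measures m
  ON m.AccountId = e.AccountId AND m.EntityId = e.EntityId AND m.TimeId = e.TimeId
WHERE m.Value <> e.Value;
```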

By adopting this methodology we have reduced our update of PPS to under 5 minutes and it is run every half hour so that our reporting will be up to date. Another advantage of not synchronizing is that your indexes on the MG table in the staging DB are not dropped. Saves a lot of time.

- Paul Steynberg

PerformancePoint Server - Balancing Your Actuals

If, like me, you are loading your actuals from your ERP system into PerformancePoint Server Planning, it helps to check your figures to make sure that they always balance. Now every good accountant will tell you that your trial balance must always balance to zero. This goes for each period and, of course, year to date. I have written some reports that self-balance our system, but in general here is the manual legwork behind it.

Remember that when loading from your ERP system, all Balance Sheet items are loaded as “CLO” for closing balance and all your Income Statement items are loaded as “NONE”. Your rules will calculate the Opening Balance “OPE” records and the Movement “MVT” records prior to you processing the model. To balance, you then just bring the data into a pivot table (standard connection to Analysis Services) and check that your totals come to zero. But it can get a bit confusing if you do not get your combinations of Flow and TimeDataView correct. So to check your Year to Date figure, you select TimeDataView as “YTD” and multi-select “NONE” and “CLO” for the flows. To check the period movement only, change the TimeDataView to “Periodic” but then set flow to “NONE” and “MVT”. Voila, it should all balance as per the example below. In this example the 1 and 2 series are Balance Sheet and the 3-9 series Income Statement. Also note that if you just add up the numbers you will not get to zero; the pivot table understands that some of them are signed and should be subtracted. Take a look here and here for a quick overview on this subject.


- Paul Steynberg

Thursday, December 4, 2008

PPS Management Reporter - Some Key Tables

While authoring some reports using the Management Reporter Designer from PerformancePoint Server, I had reason to start digging through the tables in order to make sense of the row definitions. During this little excavation I identified some tables that I thought might just be useful to somebody else if they knew this information upfront.

Firstly, it would appear that the report metadata is stored in a set of tables with the prefix “Control”. So, for example, the row definitions are stored across a number of tables, but the one that resembles the layout in the designer is “ControlRowDetail”. Listed below are the tables I needed to look into.

ControlCompany

Once you have connected to a Management Reporter database (known as a connection), you are then required to connect to an entity. When creating an Entity you have 2 “out of the box” sources: the Financial Data Mart 7.0 or PerformancePoint Server. This data is stored in the ControlCompany table. The connection data is stored in XML format and looks something like this:

<ArrayOfEntitySetting>
  <EntitySetting Name="Connection"><Value xsi:type="xsd:string" /></EntitySetting>
  <EntitySetting Name="Model"><Value xsi:type="xsd:string">Financial Reporting</Value></EntitySetting>
  <EntitySetting Name="FunctionalCurrency"><Value xsi:type="xsd:string">ZAR</Value></EntitySetting>
  <EntitySetting Name="Calendar"><Value xsi:type="xsd:string" /></EntitySetting>
  <EntitySetting Name="Address"><Value xsi:type="xsd:string">http://servername:46787</Value></EntitySetting>
  <EntitySetting Name="Application"><Value xsi:type="xsd:string">TheGroup</Value></EntitySetting>
  <EntitySetting Name="ModelSite"><Value xsi:type="xsd:string">FinanceModel</Value></EntitySetting>
  <EntitySetting Name="OLAPServer"><Value xsi:type="xsd:string">OlapServerName</Value></EntitySetting>
  <EntitySetting Name="OLAPDatabaseName"><Value xsi:type="xsd:string">TheGroup_FinanceModel</Value></EntitySetting>
  <EntitySetting Name="Cube"><Value xsi:type="xsd:string">Financial Reporting</Value></EntitySetting>
  <EntitySetting Name="CalendarHierarchy"><Value xsi:type="xsd:string">Financial Calendar</Value></EntitySetting>
</ArrayOfEntitySetting>

Associated with the Entity is also a Building Block Group. This is stored in the field SpecificationSetID.

ControlSpecificationSet

The ControlSpecificationSet table stores the Id, Name and Description of the Building Block Groups referred to in the ControlCompany table.

ControlRowMaster

By using the SpecificationSetId from the ControlSpecificationSet table one can then return the Id’s and descriptions of the Row Definitions for the Building Block Group selected.

Now, in order to get to the data I needed (i.e. which accounts make up the lines in the Row Definition), one has to look at several tables: ControlRowLinkMaster gives you the RowLinkId used by ControlRowCriteria, and you obtain the RowDetailId from ControlRowDetail. These are used to build a query on ControlRowCriteria; see the example below:

select *
from dbo.ControlRowCriteria
where rowlinkid = '76DCEA67-9069-46E1-9704-2F42A3E0BC72'   -- obtained from ControlRowLinkMaster
  and rowdetailid = 'E430DB82-081B-475A-B7E4-81AF5ECC3725' -- obtained from ControlRowDetail

You now have the dimensions and exactly what the criteria for each of those dimensions are for the row being looked at.

I needed to do this in order to generate a list of lowest level accounts from a report to make sure that EVERY account in my hierarchy was represented in the report at some point.
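
For that coverage check, the chain above can be stitched into a single query. This is only a sketch from my own digging: the key column names (SpecificationSetId, RowMasterId and so on) are assumptions, so verify them against your install before relying on it.

```sql
-- Walk Building Block Group -> Row Definitions -> rows -> criteria.
-- Join columns are assumptions; check them against your own database.
SELECT rm.Name AS RowDefinition,
       rd.RowDetailId,
       rc.*
FROM dbo.ControlSpecificationSet ss
JOIN dbo.ControlRowMaster     rm ON rm.SpecificationSetId = ss.SpecificationSetId
JOIN dbo.ControlRowDetail     rd ON rd.RowMasterId        = rm.RowMasterId
JOIN dbo.ControlRowLinkMaster rl ON rl.RowMasterId        = rm.RowMasterId
JOIN dbo.ControlRowCriteria   rc ON rc.RowLinkId   = rl.RowLinkId
                                AND rc.RowDetailId = rd.RowDetailId
WHERE ss.Name = 'MyBuildingBlockGroup';  -- hypothetical Building Block Group name
```

From the criteria returned you can then extract the account ranges and compare them against your full account list.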

I am sure that the same logic above can be applied to Column Definitions etc etc.

Other tables worth mentioning are those surrounding security which all start with the word “Security”. So to find your users you look at SecurityUser etc.

- Paul Steynberg

Wednesday, December 3, 2008

Dynamics AX 2009 - Inconsistent Data When Exporting to Excel from Grid

Sometime back I listed some bugs that we had found in Dynamics AX 2009. One of the bugs was exporting large record sets from the journal lines grids into Excel. See here.

Although we have not managed to get this fixed here is a workaround until Microsoft releases the fix.

1) Go to AOT – Forms - LedgerTransAccount [this is the form that corresponds to Chart of Account Details - Transactions ]

2) Expand the node to view the datasources. Select the datasource ‘LedgerTrans’

3) Click on the ‘Properties’ icon or alternatively, press ‘Alt+Enter’, to open up the Properties window.

4) The property ‘StartPosition’ is marked ‘Last’ by default. This results in the cursor being on the last record whenever the particular form opens. Change this property to ‘First’ so that the cursor will always be at the top of the grid for the mentioned form.

So whenever one opens this grid it will automatically navigate to the first record and will thus export correctly to Excel.

- Paul Steynberg

Back from The UK

Back at the office after a 10 day whirlwind tour of the UK. And boy was it cold. Landed in snow at Heathrow, and for most of the trip it bounced between 0 and -4 degrees C. Now for someone who barely owns a jersey this was quite something.

A big thanks to Tim and Sacha from Adatis Consulting for buying me a few beers and getting the chance to finally put faces to the names. We had some interesting discussions around the potential and direction of PerformancePoint Server and I still rate their blogs as some of the best in the business. Keep it up guys.

So it now looks quite imminent that my family and I will be relocating to the UK. Our second child will be born here at the end of February and the whole family should be settled in the UK by mid April 2009. I hope to continue contributing to the Finance/BI community, but if my posts become a bit scarce you know why.

- Paul Steynberg

Friday, November 21, 2008

Off to the UK

Some of you may know that I am relocating to the UK in the new year. The reasons are many but primarily for the personal safety of my family as South Africa is becoming quite a dangerous place to live.

So I’m off to London tomorrow for 10 days to meet with some prospects and I believe that it is pretty damn cold at the moment. Wish me luck!

- Paul Steynberg

Tuesday, November 18, 2008

PerformancePoint Server Planning and Management Reporter - Further Considerations

When I originally listed a few things that one should bear in mind when designing a PPS Planning Model which will ultimately be used for reporting via the Management Reporter I forgot to mention this little tit-bit of information.

One should be very careful when naming your Entity dimension members, especially if you need Management Reporter Designer to generate your reporting tree definitions automatically. As in our situation, it is quite possible to have a number of entities that have unique labels but duplicate names or descriptions. For example, you may have an entity called “Finance” in a number of your entity hierarchies pertaining to various companies or divisions within your structure. Because these are identified separately in PPS Planning by their distinct labels, the problem is not immediate. It becomes apparent when you try to import all your entities into a Management Reporter Reporting Tree: the system identifies that you have duplicate Unit Names and will not allow you to save the Reporting Tree. (Ours lit up like a red Xmas tree.)

How to fix this? Luckily it is quite easy to create your descriptions in SQL or Excel and then just copy and paste them into the Reporting Tree grid in the designer. All we did was concatenate the label and the name, with a hyphen in between, from SQL. This way you are assured of name and description uniqueness in your reporting structure. Alternatively, make sure your Entity dimension names are unique to start with.
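
The concatenation itself is a one-liner. A sketch against a hypothetical dimension member table (D_Entity and its column names are my own placeholders, not PPS objects):

```sql
-- Build unique descriptions as "Label - Name" for pasting into the
-- Reporting Tree grid in the designer.
SELECT MemberLabel,
       MemberLabel + ' - ' + MemberName AS ReportingTreeDescription
FROM dbo.D_Entity   -- hypothetical table holding the Entity dimension members
ORDER BY MemberLabel;
```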

- Paul Steynberg

Saturday, November 15, 2008

Interfaces to Dynamics AX 2009

A while back I penned about our dismal interlude into the BizTalk to Dynamics Application Interface Framework. I am led to believe that the problem was not with the AIF but with Dynamics and that it has been resolved. We however did not have any slack time in our very ambitious timetable to wait for it so had to take a different route.

I spent countless hours talking to people and browsing the Internet trying to find a robust way of creating interfaces and found absolutely nothing worth reading. Here is the route that I took.

I created a new Database specifically for interfaces into Dynamics AX. This Database is housed on the same server as our Dynamics Database. Within this database we then created a number of tables that are populated by our other systems within the organization. Based on our install I had 2 distinct types of interfaces. Invoices for Accounts Payable and Journals for the General Ledger. Most of the fields for both are shared anyway. Each system that was being integrated was assigned a "Source" code so that we would never lose track of where the data originated from. In a more complex environment I would suggest that you create a schema within the database for each source system. This will help with control and security.

Within Dynamics AX (DAX) we then created a new setup table and created a form called "External Interface Import Parameters". This table/form held the following information:

  • Import Type – A short identifier of the interface
  • Name – A full description of the external interface
  • External Server Name – The name of the server that housed the Interface Database being called.
  • Database Name – The name of the interface database mentioned above.
  • Journal Name – We set up a different journal name for each interface so that they could easily be identified when looking at the transactions in an account.
  • One Voucher Number Only – A bit switch. If set on this will create one voucher number for the entire interface run. If set off it will generate a new voucher number when it reaches a point of balance.
  • Stored Procedure – This is the name of the stored procedure within the database to call in order to get the data.
  • Write Back Table Name – The table that will be used to stamp back information once the interface has successfully run.
  • ODBC Source Name – We had to use ODBC connections due to the way the systems run from the client or from the batch server. (Some history here but it was necessary).
  • Batch Class Name – If this job was going to be called by the Batch server this is the name of the job to call.
  • Auto Post – If set on it will load and auto post the interface journal.
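
As a rough picture of the shape of that setup table: the real table is of course defined inside the AX AOT, so the SQL below is just an illustrative sketch with column types I have guessed for the purpose.

```sql
-- Illustrative shape of the "External Interface Import Parameters" table;
-- names and types are assumptions, not the actual AX definition.
CREATE TABLE dbo.ExternalInterfaceImportParameters
(
    ImportType          varchar(10) NOT NULL PRIMARY KEY, -- short identifier
    Name                varchar(60) NOT NULL,             -- full description
    ExternalServerName  varchar(50) NOT NULL,
    DatabaseName        varchar(50) NOT NULL,
    JournalName         varchar(10) NOT NULL,             -- one per interface
    OneVoucherOnly      bit         NOT NULL DEFAULT 0,
    StoredProcedure     varchar(60) NOT NULL,
    WriteBackTableName  varchar(60) NOT NULL,
    OdbcSourceName      varchar(50) NOT NULL,
    BatchClassName      varchar(60) NULL,                 -- only for batch jobs
    AutoPost            bit         NOT NULL DEFAULT 0
);
```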

This information is then used repeatedly throughout the interfaces. Here are the steps that all interfaces go through:

  1. An interface call is made either from a menu item by an operator or from the batch manager in DAX.
  2. Based on the Import Parameters mentioned above the job will then call a Stored Procedure on the Interface database and will pass it the Company code as the only parameter. This is often referred to as the DataAreaId in DAX.
  3. The Stored Procedure fires on the Server and passes back to the job all the records for the interface.
  4. DAX then creates the Journal based on the Journal Name and will either auto post or leave it as unposted.
  5. When complete it will then write the Journal Number and Date/Time Completed back to the table.
  6. The batch job will keep firing for the company until the stored procedure returns no records back.

So what do the Stored Procedures do?

All records that are new have a TransactionStatus of "NEW". Those in the process of being posted are stamped "WIP" and those completed are stamped "FIN". The stored procedure finds records that are "NEW" and then selects the top 1 record, summarized by your extraction criteria, into a table variable. As an example, our POS Journals are summarized by Company, Trading Date, Currency and ExtractID from the LOB system. This table variable is then used to join back to the table to return the result set. As part of this procedure we also stamp the records as "WIP" with the date/time they were sent to Dynamics.
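
A sketch of that pattern, using my POS example; the table, column and procedure names are invented for illustration, but the NEW/WIP stamping and the "one batch per call" shape are the point:

```sql
-- Illustrative sketch; object names are assumptions, not the real system.
CREATE PROCEDURE dbo.GetNextPosJournalBatch
    @DataAreaId nvarchar(4)   -- company code passed in by DAX
AS
BEGIN
    SET NOCOUNT ON;

    -- Pick one unit of work: the first NEW batch for this company,
    -- summarized by the extraction criteria.
    DECLARE @batch TABLE (TradingDate datetime, Currency nvarchar(3), ExtractId int);

    INSERT INTO @batch (TradingDate, Currency, ExtractId)
    SELECT TOP 1 TradingDate, Currency, ExtractId
    FROM dbo.PosJournalLines
    WHERE DataAreaId = @DataAreaId
      AND TransactionStatus = 'NEW'
    GROUP BY TradingDate, Currency, ExtractId
    ORDER BY TradingDate, ExtractId;

    -- Stamp the batch as in-progress before handing it over.
    UPDATE l
    SET TransactionStatus = 'WIP',
        SentToDynamics = GETDATE()
    FROM dbo.PosJournalLines l
    JOIN @batch b ON l.TradingDate = b.TradingDate
                 AND l.Currency    = b.Currency
                 AND l.ExtractId   = b.ExtractId
    WHERE l.DataAreaId = @DataAreaId;

    -- Return the batch; DAX keeps calling until this comes back empty.
    SELECT l.*
    FROM dbo.PosJournalLines l
    JOIN @batch b ON l.TradingDate = b.TradingDate
                 AND l.Currency    = b.Currency
                 AND l.ExtractId   = b.ExtractId
    WHERE l.DataAreaId = @DataAreaId;
END
```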

Each interface has been specifically written to meet the LOB criteria. I am busy working on a more generic system that will completely future-proof the business. Once the framework is in place no additional development will be required. The only work needed will be to set up a record in the Import Parameters table, set up a Journal Name and Number Sequence, and populate the interface tables.

I would be very interested to hear from other parties how they tackled this.

- Paul Steynberg

Monday, October 27, 2008

Dynamics AX Implementation Hours and Costs

When I started this Dynamics AX 2009 implementation process I hunted the net looking for a graph detailing the costs and cash outflows for the duration of the project. I could not find one, so had to rely on my past experiences and the vendor to budget. As we stand, just a few weeks away from the consultants walking off site, I predict that we will be bang on target: we should spend about 95% of our allocated budget. Having gone through this process I thought that I should share my weekly cost and hours graph with you. The graph is quite simple to read. The red line represents the weekly cumulative cost of the project relative to the total cost of the project. I have actuals to the week ending 24th October 2008 and budgeted figures to the end. The graph starts at 61%, which is the total cost of the software, training for my team and half a week's consulting time. The final costs are expressed as a percentage of the total cost. (Values excluded to protect innocent bystanders.)

Cost of Software (including first years maintenance) 58.36%
Training of myself and my team 1.93%
Consultants Time 39.71%

I have not included hidden costs such as training lunches, my staff time, material printing etc etc.

The blue line represents consultants hours by week.



- Paul Steynberg

Thursday, October 23, 2008

Finding Your Dynamics AX Lead Blocker

Dynamics AX 2009 does have some issues with SQL Server locks and quite often we are finding that people cannot access journal lines due to someone else blocking them. This then just hangs the client and it becomes unresponsive. Here is a quick method of finding the offender. The spid in SQL can be traced back to the user in Dynamics AX 2009 on the online users screen which can be accessed from "Administration". Each user that is working will have a SPID or sometimes several attached to his/her user name. Now you can call the lead blocker as you have a name and ask them to shut down AX.
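
If you just need the blocking chain quickly, SQL Server 2005 and later also expose it through the dynamic management views; a minimal sketch:

```sql
-- Sessions currently blocked, and the spid blocking each of them.
SELECT r.session_id          AS BlockedSpid,
       r.blocking_session_id AS BlockingSpid,
       r.wait_type,
       r.wait_time
FROM sys.dm_exec_requests r
WHERE r.blocking_session_id <> 0;
```

The blocking spid can then be matched to the AX user on the online users screen exactly as described above.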


If you do not have a tool such as Spotlight then create a view on the Master database that will show you the blockers. Here is the code for the view:

Create View vSysProcesses as
select
    p.spid,
    case when (p.blocked <> 0) then
        left(convert(varchar, b.spid) + ' - ' +
             case when (b.nt_username = '') then rtrim(b.loginame) else rtrim(b.nt_username) end -- UserName
             + ' on ' + rtrim(b.hostname) + ', using ' + rtrim(b.program_name), 80)
    else null end as [BlockedBy],
    case when (p.nt_username = '') then left(p.loginame, 15) else left(p.nt_username, 15) end as UserName,
    left(p.hostname, 15) as HostName,
    left(p.program_name, 35) as Program,
    left(isnull(d.name, ''), 20) as DB,
    left(p.cmd, 25) as Command,
    left(p.status, 15) as Status,
    p.login_time as LogInTime,
    p.last_batch as LastBatchTime,
    left(p.lastwaittype, 20) as [LastWaitType],
    p.Open_Tran as [OpenTran],
    p.cpu as [CPU],
    p.physical_io as [PhysicalIO],
    p.[memusage] as [MemUsage],
    p.dbid as [dbId],
    (((1.0 + p.cpu) / 8500.0) * ((1.0 + p.physical_io) / 1100.0) * ((1.0 + p.[memusage]) / 38.0))
        / (datediff(ss, p.login_time, GetDate())) as ResourceUsageFactor
from master.dbo.sysprocesses p
join master.dbo.sysdatabases d on p.dbid = d.dbid
left join master.dbo.sysprocesses b -- blocking
    on p.blocked = b.spid
where p.spid <> @@spid
  and p.status <> 'background'

Once you have created the view run this against the view and you will find the blockers:
use master
select * from dbo.vSysProcesses
order by blockedby desc


- Paul Steynberg

Monday, October 20, 2008

Spotlight on.....

At the beginning of my blogging career I penned the Financial System Manager’s Toolbox Series. This series covered amongst other essential products the likes of Speed SQL IDE and LiteSpeed from Quest Software. I have to add another product to this list called Spotlight also by Quest Software. We have found this product to be brilliant. Although specifically purchased to monitor our SQL instances it also has a wonderful Windows Operating System Dashboard. If it were not for Spotlight I am sure that we would have taken many more hours to find our Dynamics AX 2009 issue.

Spotlight gives you a visual interface into your systems inner workings and you can always see what is going on. The ability to also track back in time to see what was happening at a point in time is priceless. Here is an example of the dashboard for a SQL instance.


Other products are capable of doing this type of monitoring but we have just found this one to be easy and it looks really great.


- Paul Steynberg

Thursday, October 16, 2008

Is Your Dynamics AX 2009 Client Freezing?

Picture the scene. 16 weeks of hard work and implementation time behind you. 1000's of hours with users, management and consultants. We finally go live and...... The clients hang for no apparent reason. No detectable patterns whatsoever. The peasants were getting restless. Imagine Marie Antoinette standing in the middle of Paris with a piece of cake in her hand. Things were looking a bit bumpy.

So we got under the hood with the spanners and gauges trying to find the problem. We eventually narrowed it down to open cursors locking a table called "SysUtilElementsLog". This table is new to AX 2009 and is updated when the client is closed, or every 15 minutes. If you are experiencing the client hanging, I would suggest that you look for database locks. If they are locking the above table, here is the workaround from Microsoft. Back up the table prior to running this, as its contents will be lost. In essence one merely changes the table from being permanent to temporary.

1. Open the Application Object Tree (AOT).
2. Locate the Data Dictionary node in the AOT.
3. Expand the Data Dictionary node by clicking on the 'plus sign' located in front of Data Dictionary.
4. Locate the Tables node that exists within the Data Dictionary.
5. Expand the Tables node by clicking on the 'plus sign'
6. Scroll through the tables until you locate the SysUtilElementsLog table.
7. Right click on the SysUtilElementsLog table, and choose properties from the menu that is displayed.
8. Locate the property Temporary in the property window.
9. Change the Temporary property from No to Yes.
10. Save the changes by clicking on the disk icon in the AOT window.

- Paul Steynberg

Wednesday, October 15, 2008

Dynamics AX 2009 - ETL for PPS

Unlike our old system, Dynamics AX is well suited to the PerformancePoint Server Planning data requirements. Our old system stored the account movements in buckets by period, in a single row per year. In order to make this PPS friendly I had to use the SQL Server 2005 "unpivot" function. We also had to hard-code certain things into the extract scripts in order to make sense of it in PPS. As an example, here is the code I wrote. (It was beautifully structured and formatted until I pasted it into the blog.)
select
    case when code_fscs_dsg = 'A' then 'ACTUAL'
         when code_fscs_dsg = 'B' then 'BUDGET' end as ScenarioLabel,
    case when Period = 'Amt_Perd_01' then 'P1 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_02' then 'P2 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_03' then 'P3 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_04' then 'P4 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_05' then 'P5 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_06' then 'P6 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_07' then 'P7 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_08' then 'P8 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_09' then 'P9 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_10' then 'P10 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_11' then 'P11 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_12' then 'P12 FY' + cnt_yr_fscs
         when Period = 'Amt_Perd_14' then 'P12 FY' + cnt_yr_fscs end as Time_MonthLabel,
    right(rtrim(left(id_acct, 21)), 8) as AccountLabel,
    case when Period = 'Amt_Perd_14' then 'MANADJ' else 'INPUT' end as BusinessProcessLabel,
    left(id_acct, 13) as EntityLabel,
    'PERIODIC' as TimeDataView,
    code_curn_fscs as CurrencyLabel,
    case when substring(id_acct, 14, 1) >= '1' and substring(id_acct, 14, 1) <= '2'
         then 'CLO' else 'NONE' end as FlowLabel,
    substring(id_acct, 6, 2) as Country,
    Value,
    null as RowId,
    '' as RuleID,
    '' as ContextID,
    '' as AssignmentID,
    GETDATE() as CreateDateTime,
    GETDATE() as ChangeDateTime,
    0 as LoadingControlID,
    200 as [BizSystemFlag],
    '' as BizValidationStatus,
    '' as BizSystemErrorDetails
from
(
    select id_acct, code_fscs_dsg, cnt_yr_fscs, code_curn_fscs, fscs_curn_type,
           Amt_Perd_01, Amt_Perd_02, Amt_Perd_03, Amt_Perd_04, Amt_Perd_05,
           Amt_Perd_06, Amt_Perd_07, Amt_Perd_08, Amt_Perd_09, Amt_Perd_10,
           Amt_Perd_11, Amt_Perd_12, Amt_Perd_14
    from iMp30_afs
) p
unpivot
(
    Value for Period in (Amt_Perd_01, Amt_Perd_02, Amt_Perd_03, Amt_Perd_04, Amt_Perd_05,
                         Amt_Perd_06, Amt_Perd_07, Amt_Perd_08, Amt_Perd_09, Amt_Perd_10,
                         Amt_Perd_11, Amt_Perd_12, Amt_Perd_14)
) as Unpvt

As mentioned, Dynamics is better suited as it stores the balances by company, dimensions and accounts per day, so having to unpivot the data is now unnecessary. The information is stored in a table called LedgerBalancesDimTrans. I have had a cursory look at the AX to PPS Wizard and have it on good authority that it will take across your transactions by day. Now, I hate duplicating data, so I have written our extract to summarize the data by ledger period before loading it into PPS. For all this to make sense, one has to ensure that the descriptions captured in the Ledger Period table match those that you are going to use when setting up the PPS Calendar series. For example, I have used the format P1 FY2009 in PPS and must type this into the period description on the LedgerPeriod table in Dynamics. Now my extract will bring across movement by period in the format that I require for PPS. There are subsequent conversions to IDs etc, but I will not get into that now. Below is the much shortened and simpler extract.

SELECT A.DATAAREAID AS DATAAREAID,
       A.DIMENSION AS DIMENSION,
       A.ACCOUNTNUM AS ACCOUNTNUM,
       P.COMMENTARIES AS PERIOD,
       SUM(A.DEBITMST + A.CREDITMST) AS VALUE
FROM LEDGERBALANCESDIMTRANS A
JOIN DBO.LEDGERPERIOD P
  ON A.DATAAREAID = P.DATAAREAID
 AND A.PERIODCODE = P.PERIODCODE
 AND A.TRANSDATE BETWEEN P.PERIODSTART AND P.PERIODEND
WHERE P.PERIODSTART >= '2008-03-30'
GROUP BY A.DATAAREAID, A.DIMENSION, A.ACCOUNTNUM, P.COMMENTARIES
ORDER BY A.DATAAREAID, A.DIMENSION, A.ACCOUNTNUM, P.COMMENTARIES

This was also laid out in an orderly fashion but it appears that the blog editor does not respect my line feeds and tabs. I think you will agree that Dynamics AX 2009 lends itself to easy integration with PPS Planning.

- Paul Steynberg

Monday, October 13, 2008

TILADAX - Locked in Journal

Dynamics AX has a setting in the chart of accounts that allows one to restrict posting to certain accounts. By using this facility one can ensure that only entries from, say, a subsidiary ledger can be posted to an account. Anyone who has tried to reconcile a subsidiary account to the ledger control account will know that this is a good idea. During the setup process one would select certain ledger accounts to be control accounts. I will use the Accounts Payable module as an example. Under the setup of the Vendor Posting profiles you can select the summary accounts for each vendor group which will act as the control account in the ledger. The problem I found was that there is no surefire way of making sure that what is set up in the posting profiles is in fact then locked in journal in the COA. By missing this one could easily end up with a reconciliation nightmare later on. To this end I created a whole bunch of scripts that test all my posting profiles across Accounts Payable, Accounts Receivable, Bank, Tax and Fixed Assets against the COA to make sure that they are locked in journal.

Using my AP example I found the table that holds the accounts and then joined this to the COA table. Here is the script:

select *
from ( select distinct sumaccount as Accounts
       from dbo.VENDLEDGERACCOUNTS ) as a
left join dbo.LEDGERTABLE b
  on a.Accounts = b.AccountNum
where b.BlockedInJournal = 0 -- account exists but is not locked in journal
   or b.AccountNum is null   -- account is missing from the COA

This will test for either the account not being locked in journal or the account missing from the COA. (I did not join on DataAreaId as my COA is shared amongst all the companies).

In order to fully test your system you would also need to write a similar script for the following tables:

BankAccountTable
AssetLedgerAccounts
CustLedgerAccounts
LedgerIntercompany
TaxLedgerAccountGroup

The list is not exhaustive but should get you going.

You will have to use the pivot function on the inter company and tax tables to test all of them.
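As a rough sketch of what one of these similar scripts might look like, here is a hedged variant for BankAccountTable. The column name LEDGERACCOUNT is an assumption on my part and may differ in your AX schema, so verify it against the table before relying on this:

```sql
-- Hypothetical variant of the AP check for bank accounts.
-- LEDGERACCOUNT is an assumed column name - verify in your schema.
select *
from ( select distinct LEDGERACCOUNT as Accounts
       from dbo.BANKACCOUNTTABLE ) as a
left join dbo.LEDGERTABLE b
  on a.Accounts = b.AccountNum
where b.BlockedInJournal = 0
   or b.AccountNum is null
```

The same pattern applies to the other tables listed above; only the inner SELECT changes.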

- Paul Steynberg

Sunday, October 12, 2008

TILADAX Series

TILADAX - Things I learned about Dynamics AX, the Series. Over the past few months I have spent countless hours trying to find information on Dynamics AX. There is surprisingly little information out there, and most of what can be found tends towards either installation or development. A lot of what I have learned would have been helpful had it been published, and although it may now seem trivial, I am sure that someone else is looking for similar information.

To this end I am going to publish a whole bunch of TILADAX blogs. Some may appear ho-hum to the seasoned DAX consultant but if it can help at least one person then it was worth it. The first blog will be about the concept of "locked in journal" from the Chart of Accounts. I will try and bang it out tomorrow with a script that I used to check that the accounts that should be locked in journal are in fact locked.

- Paul Steynberg

Thursday, September 25, 2008

Setting the MaxBufferSize for Dynamics AX 2009

A while back I was bleating on about some changes to the MaxBufferSize to help out in some performance issues between the client and the AOS in Dynamics AX 2009. Someone pointed out to me yesterday that it may have been a tad helpful had I posted the procedures. My apologies and here they are. Changes must be made to both the AOS and the Client.

DISCLAIMER: I have made these changes on our system BUT I do not endorse them or suggest you do them without first speaking to MS. (That should keep the legal chaps happy). And as per MS please backup your registry before attempting any changes.

AOS Registry
Key name: [HKLM\SYSTEM\CurrentControlSet\Services\Dynamics Server\5.0\\]
Value name: maxbuffersize
Value type: REG_SZ
Value:

Client Registry
Key name: [HKCU\Software\Microsoft\Dynamics\5.0\Configuration\]
Value name: maxbuffersize
Value type: REG_SZ
Value:
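For repeatability across servers, both registry values could be captured in a .reg file. This is only a sketch: the AOS instance subkey ("01") and the value ("4096") below are placeholders I have invented, not recommended settings, and the same disclaimer as above applies.

```reg
Windows Registry Editor Version 5.00

; "01" (the AOS instance) and "4096" are placeholders - substitute your own
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Dynamics Server\5.0\01]
"maxbuffersize"="4096"

[HKEY_CURRENT_USER\Software\Microsoft\Dynamics\5.0\Configuration]
"maxbuffersize"="4096"
```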

AXC File
Add the following line to the AXC file manually:

maxbuffersize, text,0


Use Notepad to edit the AXC file. Do not change the format of this file. These changes will disappear if you change the configuration using the ConfigUtility.

Another source of restriction may also be the MaxRpcSize. I would suggest taking a look at this on the client and the AOS. You can find the setting on the AOS here:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Rpc

I would suggest reading up on this before making any changes.

I did notice an improvement when loading very large journals from our Excel templates.

- Paul Steynberg

Saturday, September 13, 2008

Dynamics AX 2009 - Half Way Review

I have spent the past 14 weeks with Dynamics AX 2009. In this time we have installed and configured it with a view to going live in 2 weeks time. I am reasonably happy with the product so far but, like all enterprise software solutions, one only really gets to grips with it once you go live. In these few weeks we have configured General Ledger (GL), Accounts Payable (AP), Accounts Receivable (AR) and Fixed Assets (FA). We have also written interfaces to and from our Stock & POS, Payroll, Bank Reconciliation and PO System. We are also in the process of finalising bespoke integration with Excel for uploading of journals and invoices.

Database Design

Believe me, I have asked the question, but no answer as yet. The design of the database flies in the face of database design principles. Very little attempt has been made to normalise the tables. The biggest sin of all is that the company code (DataAreaId, Char(4)) is stored against EVERY record in the entire database. This would have been the prime candidate for a surrogate integer key. Follow that closely with AccountNum and Dimensions and one can see the HUGE space savings and potential performance improvements. This design, in my experience, has 2 potential sources: one, the database was designed by front-end coders; two, the database was designed by someone who has little experience with large record sets. What is going to happen when my LedgerTrans table gets to 100 million rows? On that note, based on all the answers that I have received back from Microsoft, it would appear that Dynamics does not ship with a stock standard archiving solution. So this is what I'm thinking - "I have a database which is not normalised, check. My business processes approximately 40 million journal lines per annum, check. Dynamics does not ship with an archiving solution, check. Question - what will happen to performance and database size over the next 3-5 years?". The answer? Upgrade the DB to SQL Server 2008 with compression turned on. Talk about banking on technology improvements.

Something else that really baffles me is the apparent lack of referential integrity. There are also no stored procedures and very limited use of views. I understand that the product is also designed to run on Oracle but one has to ask the question: why does a company that has developed a database, and punted certain principles of design within that product group, then write a product that forsakes all of them?

Accounting and Interface Design Principles

I have wrestled hard trying to come to terms with the design principles and how Axapta was conceived. It is both ingenious and ludicrous at the same time. The concept that one needs to get under the belt very early is that you can do just about anything from anywhere when it comes to journals. One is able to debit a supplier in Accounts Payable and credit a customer in Accounts Receivable directly, etc. The system setup, through its posting profiles from all subsidiary ledgers, keeps the General Ledger in balance at all times. With such abilities comes the inevitable framework to flummox even some of the most seasoned accountants. When configuring the system you have to keep your wits about you and really focus on the requirements. I cannot help but think that somebody with excellent systems design talent sat down with a bunch of accountants and, when all was done and dusted, presented a system, not as the Royal Accounting Society would have done it, but rather as a technological showpiece. I like it, but it will take some time for the business to fully come to terms with it.

External Interfacing

Dynamics AX is somewhat of a framework and does allow a number of ways to interface to it. The most prominent being through Web Services, the Application Integration Framework (AIF) and natively using X++. If you have read my prior blogs you will know that we hoofed the AIF due to bugs. We have used X++ to interface from our LOB Systems. (When I get a chance I will pen a full article on how we achieved this). After our experience with the AIF and BizTalk we have not endeavoured to test the Web Services.

Reporting

Dynamics AX does not ship with adequate report writing capabilities. Period. 

FRx Reporter is an additional cost and we all know that it is on its way to the grave. PerformancePoint Management Reporter can be purchased BUT it has no standard direct data access at this point. My approach has been to keep the exact same PPS Financial Model as from our current ERP system and simply add the AX data to it after we go live. This way our accountants will have one source of reporting data, with current data as well as years of history.

Overall Opinion

To date nothing in AX has wowed me. I am impressed with the development framework, disappointed with the reporting and pretty much neutral on everything else. We are not sure how it is going to perform in the wild but time will tell. Both our AOS and SQL Servers are way over-spec'd so we do not really think that performance is going to be an issue. We followed the standard guidelines on setting them up. I do believe that we should move to SQL Server 2008 as soon as we are comfortable with it. (Read: wait for SP1.)

Tuesday, September 9, 2008

Dynamics AX 2009 - Bug List 1

For those of you who are looking to move to Dynamics AX 2009, here are some bugs/issues that you might find worth knowing about before you start.
Journal Grid Export to Excel

We found that if one did an inquiry which returned a large number of records for an account (say 10,000 records), they would be returned to the grid, BUT exporting them to Excel produced erroneous records. This only happens if you DO NOT navigate the grid prior to exporting the data to Excel. You can easily spot the problem as it stamps every record with exactly the same journal number, being the first one. If you navigate the grid first you will get the correct records in Excel; one only has to hit Ctrl-Home and then export. So there is a problem but also a very easy workaround.

Fixed Assets Movements Report

When running this report an exception is thrown. The reason: the variable fieldPrintFAInfo_FR is only initialized under the French configuration key, but is accessed under all configurations. The solution: add a check for the French configuration key in the getFromDialog() method.


Grid Sort Order

Now Microsoft are not totally convinced that this is a bug but I am going to stick to my guns on this one. Every single software product in the market place that uses grids and allows sorting by clicking on the heading sorts in ascending order first. All the grids in AX 2009 default to sorting in descending order first. MS inform me that this can be changed but I am not convinced that out-of-the-box sorting should be descending.

Unsaved Changes in the AOT Mark Objects as Changed

This is just downright annoying. If you go into the AOT and, for example, right click on Addresses under Forms and select Edit, the edit screen will appear. Now do nothing but close down the edit screen. You will notice that a red vertical strip will appear next to the object, and if you close and open the AOT the object has been marked as changed in the (usr) layer. This is valid for all objects in the AOT that we have looked at.

As we come across any others (I am sure we will), I will post them.
- Paul Steynberg

Friday, September 5, 2008

What's Bugging the AIF

Well, you heard it here first. The Application Integration Framework (AIF) for Dynamics AX 2009 was dropped by us due to its inability to process large volumes of data for journal imports. This issue was raised with Microsoft as a bug and credit must be given to them for a very speedy and concerted effort to find the source of the issue. I had a conference call with Michael Merz and a whole bunch of people from Microsoft land a few weeks ago just after we dropped the AIF. They undertook to do some in-house stress testing and report back as soon as they could. They have upheld that commitment.

I received confirmation this week that there is indeed a bug, not actually in the AIF but with the GL Service that is called in Dynamics. Apparently some sort of validation was being called at a line level that required some caching. This turned out to be less than optimal on larger sets of data.

A hot fix should be available at the end of this week.

- Paul Steynberg

Monday, August 18, 2008

Management Reporter Considerations

When using Management Reporter (MR) with PerformancePoint Server Planning you must take certain things into consideration. To start with, you need a Financial Model as your source for MR. This gives the entity wizard a few things that are required, such as currency and a calendar. It is also crucial that the calendar view selected for MR has Years and Months.

How you set up your dimensions is also very important, especially if you have specific reporting requirements. None of the dimension attributes are exposed to MR, so if you require them, create separate dimensions. For example, you may have set Country as an attribute of your Entities and you will not see it in MR. Create a new dimension called Country, however, and suddenly it will pop up in MR for use.

Another consideration when labeling your dimensions is that the hierarchies are not visible when exposed in MR. It may be a good idea to identify your dimensions in some manner so that one can tell parents from leaf-level members. You may also consider having an alternate hierarchy dimension for accounts that only shows posting-level accounts. This way you do not run the risk of adding already-totalled accounts more than once.

Just keeping these few things in mind when creating models for reporting through MR will make life a lot easier.

- Paul Steynberg

Friday, August 15, 2008

Performance Issues with PPS Management Reporter - Update

After months of testing PerformancePoint Server Management Reporter in its current state, I have concluded that it will not handle the volumes that our company requires. It is way too slow and will just frustrate the business. The problem lies with the whole design of the system. I did a full analysis of the MDX code and how the system operates and found that the client (on your machine) sends an MDX query to the Analysis Server for every combination of report line and entity specified in your reporting tree. So if you have a 50-line report and you are running it for 10 entities you will send out 500 MDX queries, one at a time. As you can imagine, this makes my ZX81 (with the 64Kb Ram Pack) look like Usain Bolt pumped up on steroids.

Do not despair, all is not lost. I had a very positive conversation last night with some members of the team working on this issue. I suspect that we could see two positives in the near future. The first being a MASSIVE improvement on SQL Server 2008. PPS should be certified to work on SQL Server 2008 by SP2 which is planned for December 2008. Another idea kicked around is changing the architecture to a service type environment on the server and use multi-threading and the power available. I would guess that this type of change would only be done for V2, so let's wait and see. As soon as I get my grubby paws on a copy of SP2 to test I will post an item on performance improvements.

- Paul Steynberg

Saturday, August 9, 2008

Should You Throw Apples at Windows?

I found myself looking around for a new laptop a few months ago. Around the same time I purchased a Canon DSLR and started a photography course. Everything pointed to a Mac being the choice for my hobby but Windows was required for my work. I found an article on running Windows on a Mac using Parallels and tested it out on a Mac that I borrowed from my company.

Well hold the phone people, we have a winner. I can say from first-hand experience that purchasing a MacBook Pro is the best thing I have done all year. I can switch between Windows and Mac OS without even thinking about it. All my Windows tools are still available and I can benefit from Aperture 2 on the Mac for my photography.

From what I can gather there are 3 products that allow you to run Windows on your Mac: Parallels, Fusion and Apple's own Boot Camp. I chose Parallels and have been very happy with the results. It has a mode called Coherence which presents Windows in such a way that your Windows applications look like they are running natively within Mac OS. In other words you do not get the Vista desktop in the traditional sense of the word; you get the Windows task bar loaded alongside your Mac task bar and you cannot tell that it is running on a virtual machine.

If you decide to go this route I suggest that you purchase a 4Gb memory kit from Kingston at around $100 and replace the 2x1Gb modules that come standard with the Mac. The performance increase is well worth the price.

Another good idea is to share your documents with the Mac outside the virtual machine. Also, once you have installed Vista and all your applications, take a snapshot. This way, if you mess something up in the future, you can restore the snapshot and still access your documents.

I have tested my system with no problems as yet. I can even terminal on to my PC at the office via a VPN.

The two biggest pleasant surprises for me have been the following:
  • If you have a number of applications open and, say, you have done some work on a new Excel spreadsheet but have not yet saved it, you can just quit Parallels, and when you bring it back up again the spreadsheet is waiting for you just as you left it.
  • If you receive e-mail attachments via your Mac, say a Word doc, and you double click it on your Mac, it associates it with Word in your VM and opens up your Vista session and presents the document in Word.
So to answer the original question, YES, you should throw Apples at Windows.

- Paul Steynberg

Thursday, July 31, 2008

BizTalk Dynamics AX AIF Adapter

A few days back I was blowing off some steam about the Dynamics AX AIF and the lack of documentation and know how. Our BizTalk consultant, Paul Somers, has done an excellent job of working it out and has posted a blog on his site giving the details thereof.

You will find 2 blogs to do with our issues and hopefully more to come.

Happy reading.


- Paul Steynberg

Wednesday, July 23, 2008

I Promised Myself ........

I promised myself when I started this blog that I would not use it as a soap box to expel the evil spirits of the day, but today I am going to make an exception.

So by now most of you are aware that I am implementing Dynamics AX 2009 and had decided early on in the project to use the Application Integration Framework (AIF) to interface to and from the system. I attended the Microsoft Convergence Conference in Orlando in March this year and after hearing Michael Merz speak, it was in the bag.

Back home during early negotiations with Microsoft and the suggested "Partner" we emphasized this point over and over. At no point did anyone stop me and say "but nobody here in South Africa knows how to use the AIF and certainly not with the BizTalk Adapter". So come time to install the AIF and the BizTalk Adapter, our Partner had no experienced skills base and everyone spun out trying to read the outdated documentation. In short it was not properly configured by the time my BizTalk consultant walked on site to write the interfaces.

It took us several working days to get the BizTalk Adapter to successfully pass XML documents into AX and get a return receipt. How that was done is just another whole story on its own. Once our BizTalk guru has documented the "undocumented" on his blog I will post a reference to it.

So now it works. We then start to pile on the volumes of our daily journals into the test DAX environment and it falls over. We eventually work out that if we chunk the data and send in lots of small journals it works. Our average daily journals into our current system from our LOB systems can vary from about 3000-9000 lines. We had to break them down to roughly 1000 lines each for the AIF to consume without rolling over on its back. To add to this my developer has written a cracker of an interface to bring in our payroll data as journals into DAX. This journal can be up to 6000 lines long and it loads perfectly but when he tried to validate it, it fell over with an out of memory error. In fact quite similar to the error that we were experiencing from the AIF side.

Well if you can just imagine, there was much wailing and gnashing of teeth. Here we had spent millions on an "Enterprise" solution and it fell over on a journal of a few thousand lines. I sat and contemplated how I was going to tell our Executive Committee on Friday that I had spent millions on the software, had consultants on site who charged like wounded buffalo, and the system was not capable of processing what our current system could. A rusty bread knife and the thought of me slowly perishing in a pool of my own blood suddenly looked appealing. Anyway, this passed quickly as I banged out my plight on the support site with Microsoft.

Well friends, it looks like AX has a default install that pretty much matches itself up with something like, say, QuickBooks; i.e. it looks good but don't ask it to handle volumes. It would appear that the default size for memory allocation in the AOS is set to 10 MB at installation. Changing this default buffer size requires changes in the AOS server registry and a new entry in the AXC file. Another issue is a restriction on the RPC, which also requires a registry setting to increase the MaxRpcSize. One can also increase the maxbuffersize of the client.

Now I have to ask the question, how can one market and sell Enterprise software and then install it with such limitations? I have gone through all the documentation provided with the product and also anything else that I have gleaned from the net and nowhere can I find reference to these settings.

Bad form Microsoft.

- Paul Steynberg

Wednesday, July 16, 2008

Talking in the Dark

Today has been quite a trying day for my team and me. The Dynamics consultants have configured the AX2009 AIF as best they could from the AX4.0 documentation. Now this in itself is quite something as they do have differences. The BizTalk Adapter was also configured and ready for our BizTalk expert who walked on site this morning. The day started off sterling with a thumbs up on our BizTalk installation. The rest of the day was however spent blundering around in the dark trying to get the 2 talking to each other. Have you ever tried to figure out what is going on with outdated documentation? I do not think that anything could possibly be more frustrating.

So at the end of the day we sit with 2 systems that appear to be properly configured but not able to communicate. (Sounds like my wife and I).

Anyway, here is the error that we get when we drop a document into the BizTalk inbound connector after it has tried to pass this to the AIF.



Event Type: Error
Event Source: Microsoft Dynamics AX 2009 adapter
Event Category: None
Event ID: 0
Date: 2008/07/16
Time: 01:27:07 PM
User: N/A
Computer: AUTHENTIX
Description: The adapter is unable to send the following message to Microsoft Dynamics AX:
MessageId: 77802122-69d3-4676-912d-d73d95090e3f
Error: http://schemas.microsoft.com/dynamics/2008/01/documents/Fault"> Request Failed. See the Exception Log for details.
--------------------------------------------------------------------------
Product Version : 5.0
Assembly Version : 5.0.0.0
Class : AdapterTransmitterEndpoint
Method : ProcessMessage
Subsystem : BizTalkDynamicsAdapter

So if anybody has seen this before your comments would be most welcome. When we find a solution I will post it.

- Paul Steynberg


Sunday, July 6, 2008

Is Talking Business Cheap?

As part of my Dynamics AX 2009 project we are evaluating the use of BizTalk as a mechanism to integrate data between our LOB systems and DAX. As we are a value retailer (Sounds cooler than calling us cheap-ass), it is in-bred in us to maximize our spend. (Also sounds cooler than milking it).

BizTalk is expensive, you cannot hide from this fact. A 2-processor license for the Standard Edition will set you back $17,000 and for the Enterprise Edition you are in for a stonking $70,000. My reference is here. I am sure that one can argue that this is before discounts etc. but let's just talk list price for this exercise. So what does the Enterprise Edition give you that costs $53,000 extra? You can go from a limit of 5 applications to unlimited. It also allows you to use more than 2 processors and it scales out better with failover clustering etc. Apart from that the functionality is pretty much the same. Now I would imagine that most medium to larger enterprises would probably be OK with the Standard Edition, depending on how many disparate systems are lying around.

On one of my late-night web blunderings, trying to find out more about this animal and potential alternatives, I came across something quite exciting. I found a company based in the UK by the name of Quality Componentware. These guys are BizTalk consultants and are Microsoft Gold Certified Partners. If one digs a bit deeper you will also find that they have written an alternative to BizTalk called SmartsIntegrator. The pricing? $499 per machine!!!! And it does not look that bad. I downloaded a trial version and had it installed and working on my PC within half an hour. Compare this with the day it took us to get a test version of BizTalk up and running.

After installing the product I browsed around and found it relatively intuitive. I created an inbound and an outbound file location, dropped a file into one side, and a second later it appeared on the other side as expected.

I am in no way endorsing this product and cannot find any customer references on it, BUT at the price surely it is worth investigating? I am certain that BizTalk will club it in terms of volume handling and pure functionality BUT does every business need that?

If you have heard about or worked with this product your comments would be most welcome.

- Paul Steynberg

Wednesday, July 2, 2008

Dynamics AX AIF

If you are looking for some quality information on the Dynamics AX Application Integration Framework I would seriously suggest that you visit Dianne Siebold's blog on the subject. Her blog focuses on the SDK documentation for Dynamics AX.

- Paul Steynberg

The X-Files

So why have I been so quiet for the past few weeks? Well, a lot has transpired. We have made the decision to replace our current ERP system with Microsoft's Dynamics AX. My team and I had exactly 14 weeks to implement it, counting from the day the first consultant walked onto site, which was about 10 days ago. So to say that the deadline is tight would be an understatement.

What does this have to do with the X-Files? Well apart from the new product being called AX, I have also been heavily exposed to XML. I would suggest reading the link to get a good understanding of XML but here is a quick description that sums it up for me. XML is an open standard which promotes self describing data to be transferred between points.
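To make "self describing" concrete, here is a tiny made-up fragment; the element and attribute names are mine for illustration, not from AX:

```xml
<!-- Made-up example: each value carries its own field names -->
<Journal Company="CONT" Date="2008-07-02">
  <Line Account="110100" Amount="100.00" Currency="USD" />
  <Line Account="840200" Amount="-100.00" Currency="USD" />
</Journal>
```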

Dynamics AX 2009 is the latest release from Microsoft and we have decided to jump right in and skip V4.0. Most techies reading the blog at this point have probably fainted as we all know what new releases from MS can be like, but know this. My team and I did a lot of homework and also attended the Orlando Convergence Conference. We discussed our concerns with some of the team leaders for AX. The consensus was that as we are using only standard core functionality within the application, the risks would be minimal.

One of the main reasons I wanted to go with AX2009 was the new Application Integration Framework (AIF) provided. This framework allows the flow of data to and from other systems in a robust fashion. I attended the session given by Michael Merz (Program Manager) on the AIF and was extremely impressed with the business capabilities. AX2009 ships with a bunch of new web services for data interchange and we are going to take advantage of this as a lot of data that ends up in our ERP system is generated from home written sub systems. The AIF uses XML files in order to receive and send data.

Reading up on the AIF was quite a task and had me up late at night for a week. This exercise led me to another product from Microsoft called BizTalk Server. AX2009 ships with the BizTalk connectors required to move data in and out of AX. The great thing about this whole "interface" layer (BizTalk and AIF) is that one never compromises the business logic at any point. We are now at a point where we are seriously considering using BizTalk Server to implement our system interfaces and from what I have seen and heard so far we will not be disappointed. One of the things that BizTalk does is allow disparate systems to communicate by converting data into formats that can be read by both sides.

Hopefully BizTalk and the AIF will allow us to meet our deadline with some time to spare. Unlike David Duchovny and his large lunged side-kick Gillian Anderson I have taken a bold step for mankind and hired my people based on talent and not good looks.

- Paul Steynberg

Sunday, June 22, 2008

The Financial System Manager's Toolbox (Part II)

A while back I promised to elaborate on Microsoft Analysis Services as part of a Financial Systems Manager's Toolbox, so here it is. If there were an Accounting God, this would be the tool that he would bestow upon his most loved people. SSAS is shipped with certain SQL Server versions. To see if your version includes it, follow this link:

http://www.microsoft.com/sql/prodinfo/features/compare-features.mspx

I cannot even begin to tell you how incredible this software is for accountants. (Well actually for most people but you get the idea). It allows one to take huge amounts of data and store it in such a way that you can retrieve aggregated data on any combination of dimensions FAST. Two words in that sentence are of importance. The first is dimension and the second is fast.

A dimension in SSAS terms is one of a possible number of ways to describe a number. Here is an example. I give you a number, say 100. That means nothing to you. Now I start to give you some dimensions and dimension members and suddenly the 100 makes sense. Here they are:

Dimensions and Member of the Dimension
Entity=Contoso Head Office
Department=Accounting
Account=Bank Charges
Scenario=Actual
Period=Y2009P03
Currency=USD

Now that you have the dimensions and the dimension members, the 100 suddenly makes sense. It is the actual bank charges (as opposed to the budgeted bank charges) for the accounting department at Contoso Head Office for accounting period 3 of year 2009, in dollars. Now imagine having millions of rows of data, each with these dimensions describing the number in the row. To try and make sense of this by hand would be madness. Along comes SSAS.

SSAS takes the data and builds OLAP cubes which pre-aggregate the data based on internal algorithms in order to allow the user to bring back information at every possible intersection of data with incredible speed.
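To illustrate, a single cell can be pulled from such a cube with an MDX query along these lines. This is only a sketch: the cube name [Finance] and the measure [Amount] are assumptions, and your own dimension and member names will differ:

```mdx
-- Hypothetical cube [Finance] and measure [Amount]
SELECT [Measures].[Amount] ON COLUMNS
FROM [Finance]
WHERE ( [Entity].[Contoso Head Office],
        [Department].[Accounting],
        [Account].[Bank Charges],
        [Scenario].[Actual],
        [Period].[Y2009P03],
        [Currency].[USD] )
```

Run from any MDX-capable client, this would return the single value at that intersection (the 100 in the example above).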

Here at my organisation I have built a few financial cubes off our underlying ERP system. These cubes allow the rapid slice and dicing of information from a number of delivery mechanisms. I prefer delivering the data in pivot tables in Excel linked directly to the cube. One can also use tools such as ProClarity and PPS Monitoring and Analytics. The uses are endless. PerformancePoint Server uses SSAS as the data point for all its components including Management Reporter, a product that we are rolling out in parallel with our new ERP System, Dynamics AX 2009.

Once you start using something like this you will know that the Accounting Gods are smiling upon you.

- Paul Steynberg

Friday, June 13, 2008

Negotiating in the IT World

Purchasing software is quite an ordeal, especially ERP software. Here are a few tips and notes from some very recent first-hand experience.

  1. ERP software vendors are no better than second-hand car salesmen. Listen well, but make sure that you get any promises in writing, or at least confirmation via e-mail. This has just saved me $45,000 and certain embarrassment. During the negotiation period they will promise everything and often state that "that functionality is standard". Check everything, because things often don't add up when you get to sign the contract.
  2. The RFP and negotiation stages always take months. Find out when each ERP vendor's financial year ends, and aim to sign the final deal within a few weeks of that date. They will throw in the CEO's grandmother for free if it means making target.
  3. Plan your capacity with care. It often does not pay to purchase double the required licenses now at a great discounted price if you are not going to hit the numbers for quite some time.
  4. A large portion of a major ERP project requires no software, so do not buy the software up front: it will sit on your desk staring at you while your annual license fees for a gazillion concurrent users go unused, and when you do go live you will suddenly find yourself having to pay the annual fee again a few days later. Use a test environment until you really have to install the live one.
  5. When one of the ERP vendors comes to visit during the negotiation stages, always subtly remind them that this is a competition. For example, when Oracle comes to visit, make sure the pen you remove from your top pocket to make notes is from SAP. But don't show the SAP people out of the lobby with the Oracle bunch waiting for you: this may sound effective, but they all know each other, and you don't want them to start talking to each other.

Above all else, as mentioned in one of my other posts, DO NOT believe all the white papers. Do your homework diligently, and if you are offered site visits, try to do them without the vendor present, as you can get some really honest answers.

- Paul Steynberg

Thursday, June 5, 2008

Management Reporter - Scheduling Reports Act II

Some time back I requested the ability to schedule reports from PPS Management Reporter, a feature missing from V1. I put together a quick spec for Microsoft of what would pacify us until they put a proper, full scheduler into a future release. This is what I requested as a minimum:

The command – ManagementReporter.exe

Flags/Parameters

-Generate (action to be done)
-Connection:DefaultConnection (connection created to the MR database)
-Logon:"Mr Price Group" (the entity that you wish to log into)
-BuildingBlockGroup:Default (building block within the logged-on install that you wish to use)
-Report:"\Mr Price\SSMP1 Income Statement with PMO Monthly" (the report name and the subdirectory it is stored in)
-Output:"Management Reporter" (could be "Management Reporter", "Reporting Definition Language" or "Printer")

So what did we get? In short, we get to use the same ReportDesigner.exe that is already in the Program Files directory. We also get 3 of the 6 switches/parameters requested. These are:

-Generate
-BuildingBlockGroup
-Report

The -Report flag does not, however, require any subdirectory: although the reports are stored in directories, their names MUST be unique.

Here is an example of the command syntax.

ReportDesigner.exe -Generate -BuildingBlockGroup:Default -Report:"SSAS4 Expenses Actual vs Budget Monthly"

We are extremely grateful for what we have received, but it does come with some prerequisites, as the Report Designer is actually opened and then closed once the report has completed.

  1. Remember to tick the "save my credentials" box on the Report Designer logon screen on the machine from which you are going to run the scheduled reports. Also make sure that the appropriate default connection and entity are selected; the scheduled reports will use these credentials, connection and entity.
  2. Any reports run should not have "Select units at run time" ticked on the Settings Tab under "Reporting unit selection".
  3. You can only output to Management Reporter, not to file or RDL. Even if you select these, the report will only be sent to the Report Viewer.
  4. You cannot run more than one report at a time. If one report is running, any subsequent attempt will find that the Report Designer is already open and report a success message even though it has done nothing.

The return codes that you can encounter are:

0=Success
1=UnknownBuildingBlockGroup
2=UnknownReport
3=ReportViewerExportError
4=ReportFailedToLoad
5=ReportAlreadyGenerating
6=ReportingServicesGenerateError
7=UseRowDefinitionFromTree
8=ReportDesignerBadRepositoryPermissions
9=UnknownRepositoryError
10=FailedToConnectToReportingServices
11=BeforeGenerateValidationError
12=AcceptSecurityWarningError
13=ValidateDataError
14=BuildingBlocksNotValidated
15=UnableToCreateTree
16=CompanyCredentials
17=EnsureCompleteCompanyCredentialsFailed
18=EngineException
19=FailedToOpenReport
255=UnknownError
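If you run these reports from a scheduler, those return codes are worth checking programmatically rather than by eye. Here is a small sketch of how one might do that in Python. The code table comes from the list above, but the wrapper function itself is my own illustrative invention, not part of the product:

```python
import subprocess

# Return codes documented for the Management Reporter command line (see list above).
MR_RETURN_CODES = {
    0: "Success", 1: "UnknownBuildingBlockGroup", 2: "UnknownReport",
    3: "ReportViewerExportError", 4: "ReportFailedToLoad",
    5: "ReportAlreadyGenerating", 6: "ReportingServicesGenerateError",
    7: "UseRowDefinitionFromTree", 8: "ReportDesignerBadRepositoryPermissions",
    9: "UnknownRepositoryError", 10: "FailedToConnectToReportingServices",
    11: "BeforeGenerateValidationError", 12: "AcceptSecurityWarningError",
    13: "ValidateDataError", 14: "BuildingBlocksNotValidated",
    15: "UnableToCreateTree", 16: "CompanyCredentials",
    17: "EnsureCompleteCompanyCredentialsFailed", 18: "EngineException",
    19: "FailedToOpenReport", 255: "UnknownError",
}

def describe(code):
    """Translate a Management Reporter exit code into its documented name."""
    return MR_RETURN_CODES.get(code, "Unrecognized code %d" % code)

def run_report(report_name, group="Default"):
    """Run one report and return (exit code, description).

    Illustrative only: must run on the machine where Report Designer is
    installed and configured; subprocess handles quoting of the list args.
    """
    code = subprocess.call([
        "ReportDesigner.exe", "-Generate",
        "-BuildingBlockGroup:%s" % group,
        "-Report:%s" % report_name,
    ])
    return code, describe(code)
```

A scheduled job could call `run_report` per report and log or alert on anything other than code 0, remembering the single-report-at-a-time caveat in point 4 above.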

Reports can be run from the command line, from Task Scheduler and from SQL Server Agent. Although this is not comparable to FRx report scheduling, it is a step in the right direction.

- Paul Steynberg

Wednesday, June 4, 2008

Management Reporter - Some Feedback

Thanks to Jill Carter from the Management Reporter team for a prompt reply to the questions I posed to them yesterday (refer to my last post).

It seems that the team tried to "group" the entities together and fire off fewer queries, but found that it was slower. Who would have thought?

There are apparently some improvements on SQL Server 2008 that we can look forward to. I have also suggested that they multi-thread the queries, but at this point it cannot be done as a hotfix, as it may destabilize the product. Watch this space.

- Paul Steynberg

Monday, June 2, 2008

Management Reporter - Fixed or Not Fixed, That is the Question

For those of you following my Management Reporter performance tragedy, here is Act II. I received my QFE. Now, it sounds ironic that I would be experiencing a performance issue with something called PerformancePoint, but what is more ironic is that the fix did not really fix the problem. I have seen a marked improvement in report generation times, BUT it is still not good enough.

Most of the reports halved in the time taken to run. A report with a large reporting tree, which did not return at all in the previous version (canned after 8 hours), took 23 minutes in the "fixed" version. The gains are excellent, but how does this stack up against the data? Not well: the 23-minute report only needed to fetch 3,310 rows of data from the underlying PPS cube. I ran the report a number of times in succession, hoping that the trick of "warming" the cache might be my salvation. Performance did improve, and the report got down to 12 minutes. Going from not rendering at all to just 12 minutes means the people at Microsoft did some serious work, and from a technical perspective one has to admire the effort. From an end user's perspective I still say "toooooooo sloooooooowwwww".

Surely it cannot take 12 minutes to return 3,310 rows of data from a technology such as OLAP, which was designed for fast and efficient retrieval. So what is the problem? To make sense of this I profiled the cube being connected to by MR, and what I found is quite interesting.

Reports are made up of a combination of rows, columns and reporting trees. It seems that for every row in the report, an MDX query is built to return the required data and then fired off to the server once for each entity in the entity reporting tree. Here is an example of the MDX query. I have replaced the parameters with the actual member names; they follow the query in the profiler in XML format. Take note of the entity reference ([Entity].[Entity Hierarchy].[SSMP1AA46]) towards the end of the query.

WITH
-- Column Sets and Members
-- Columns General: Define a set and calculated member for each unique filter criteria across all columns.
SET [Criteria480369GLScenarioCriteria] AS
{
([Scenario].[All Members].[Actual])
}
SET [Criteria480370GLTimeDataViewCriteria] AS
{
([TimeDataView].[All Members].[YTD])
}
SET [Criteria480371GLTimeCriteria] AS
{ [Time].[Financial Calendar].&[772007013] }
MEMBER [Time].[Financial Calendar].[Criteria480371GLTimeCriteria] AS SUM ([Criteria480371GLTimeCriteria])
SET [Criteria480372GLCurrencyCriteria] AS
{
([Currency].[All Members].[ZAR])
}
SET [Criteria480377GLScenarioCriteria] AS
{
([Scenario].[All Members].[Budget])
}
SET [Criteria480378GLTimeCriteria] AS
{ [Time].[Financial Calendar].&[772008025] }
MEMBER [Time].[Financial Calendar].[Criteria480378GLTimeCriteria] AS SUM ([Criteria480378GLTimeCriteria])
-- Column46363
SET [Column46363] AS
{
( Criteria480369GLScenarioCriteria, Criteria480370GLTimeDataViewCriteria, [Time].[Financial Calendar].[Criteria480371GLTimeCriteria], [Account].[DebitCreditMemberId].[All], Criteria480372GLCurrencyCriteria )
}
-- Column46364
SET [Column46364] AS
{
( Criteria480377GLScenarioCriteria, Criteria480370GLTimeDataViewCriteria, [Time].[Financial Calendar].[Criteria480378GLTimeCriteria], [Account].[DebitCreditMemberId].[All], Criteria480372GLCurrencyCriteria )
}
-- Column46365
SET [Column46365] AS
{
( Criteria480369GLScenarioCriteria, Criteria480370GLTimeDataViewCriteria, [Time].[Financial Calendar].[Criteria480378GLTimeCriteria], [Account].[DebitCreditMemberId].[All], Criteria480372GLCurrencyCriteria )
}
-- Row Sets and Members
MEMBER [ValueMember] AS [Measures].[Value]
MEMBER [ValueMemberWithSignReversal] AS [Measures].[Value] * IIF([Account].[MemberId].CurrentMember.Properties("DebitCreditMemberId") = "Credit", -1, 1), NON_EMPTY_BEHAVIOR=[Measures].[Value]
-- Row 61640
MEMBER [61640] AS
( [ValueMemberWithSignReversal],
([Account].[Masterpiece].[7XXXXXXX]),
([BusinessProcess].[Standard].[INPUT]),
([Flow].[All Members].[NONE]),
([Entity].[Entity Hierarchy].[SSMP1AA46])
)
-- Main Select Statement
SELECT
{ [Column46363], [Column46364], [Column46365] }
PROPERTIES MEMBER_CAPTION ON COLUMNS,
NON EMPTY
{ [61640] }
PROPERTIES MEMBER_CAPTION ON ROWS
FROM [FinancialBudgets]

This code returns 1 row of data.



So if you have 60 entities in your Reporting Tree Definition, this code is going to run exactly as it is 60 times, with only the row number and entity changing. So why not modify the MDX and send only one query through per report line, with the row set amended to include ALL entities? I tested my theory and modified the code to include 3 additional entities. The run time was the same, but it returned 3 rows of data.

Here is the part that is amended:

-- Row 38186
MEMBER [38186] AS
( [ValueMemberWithSignReversal],
([Account].[Masterpiece].[3XXXXXXX]),
([BusinessProcess].[Standard].[INPUT]),
([Flow].[All Members].[NONE]),
([Entity].[Entity Hierarchy].[SSMP1AA62])
)
MEMBER [38187] AS
( [ValueMemberWithSignReversal],
([Account].[Masterpiece].[3XXXXXXX]),
([BusinessProcess].[Standard].[INPUT]),
([Flow].[All Members].[NONE]),
([Entity].[Entity Hierarchy].[SSMP1AA63])
)
MEMBER [38188] AS
( [ValueMemberWithSignReversal],
([Account].[Masterpiece].[3XXXXXXX]),
([BusinessProcess].[Standard].[INPUT]),
([Flow].[All Members].[NONE]),
([Entity].[Entity Hierarchy].[SSMP1AA64])
)
-- Main Select Statement
SELECT
{ [Column19387], [Column19388], [Column19389] }
PROPERTIES MEMBER_CAPTION ON COLUMNS,
NON EMPTY
{ [38186],[38187],[38188] }
PROPERTIES MEMBER_CAPTION ON ROWS
FROM [FinancialBudgets]
I am curious as to why MS has elected to fire one MDX query per report row per entity, and I have put the question to them.

I did notice one additional feature with the hotfix. The "number of records processed" counter on the "Generate Report" screen now shows the count in increments of 100 as the rows are retrieved. In the previous version you only got the total at the end of the report, so you had no idea of the progress.

I will report back once I have more information from Microsoft.

- Paul Steynberg

Thursday, May 29, 2008

Management Reporter - Scheduling Reports

When I first got my hands on Management Reporter last year, I was very disappointed to find that you could not run reports unattended on a schedule. You had to be logged into the Report Designer and had to physically run the reports one at a time. As this would not do, I took my case to Jill Carter and her team while attending the AX conference in Orlando, and asked her whom I would have to beg, or over what broken glass I would have to crawl naked, to get this functionality.

She did not disappoint. They are in the final stages of a "solution", which they will make available to me in the next week. From what I can gather, it allows you to schedule reports via the command line, which means you can schedule them using Task Scheduler or SQL Server Agent. Either way, the news is good.

As soon as I have it in my grubby little paws I will test it and feed back.

- Paul Steynberg

Monday, May 26, 2008

The Rational Guide to PerformancePoint Server Planning - Not a Book Review

I back-ordered The Rational Guide to Planning with Microsoft Office PerformancePoint Server 2007 from Amazon and was probably one of the first to get my hands on it. I hang my head in shame, as it has taken me months to finish it. Not due to any shortcomings in the book, but rather because my 13-month-old daughter firmly believes that my wife and I were put on this planet to serve and play.

The book is written by Adrian Downes and Nick Barclay, and is assured to contain no bull by Microsoft's own Peter Bull, Principal Group Program Manager for PerformancePoint Server.

I would like to call this a book review, but cannot, as reading just the printed book would be like reading a murder mystery with the last chapters torn out. The authors casually point out that you can download bonus chapters from the website, but what they don't tell you is that this is where the gold is to be found. If you buy this book, you MUST download the additional 4 chapters.

The book is well laid out and easy to read. There are highlighted notes, tech tips and cautionary announcements throughout, and enough screen shots for illustration. Worth mentioning is Chapter 9, Integrating Business Data, which details the ETL process between LOB systems and PPS; judging from many posts on the forum, this is an area of concern for a lot of people.

Both authors are extremely active in the PPS community and this book shows their authority on the subject.

- Paul Steynberg

Thursday, May 22, 2008

Lies, Damned Lies and Statistics

This well-known saying is part of a phrase attributed to Benjamin Disraeli and popularized in the U.S. by Mark Twain: There are three kinds of lies: lies, damned lies, and statistics. The semi-ironic statement refers to the persuasive power of numbers, and succinctly describes how even accurate statistics can be used to bolster inaccurate arguments. (From Wikipedia)

So here I stand before you with the results of my survey: a resounding beating of the accountants by the techies, 17:2. (No Pareto Principle here.)

So do these statistics mean that I should abandon my accounting blog with a technical undertone? Absolutely not, because the statistics lie. I received a number of e-mails from respondents stating that the accounting view is what makes this blog different from most other technical blogs. So, on that note, I will continue as is, unabated by the statistics.

Thanks to all those who took the survey and took the time to contact me directly; I really appreciate it. I leave you with this parting comic strip from Dilbert. What a classic!


- Paul Steynberg

Wednesday, May 21, 2008

Management Reporter Live Recording

Hot off the press! A Live Meeting recording from Microsoft about PerformancePoint Server Management Reporter:

https://www107.livemeeting.com/cc/msevents/view?id=BI27AL&role=attend&pw=EDK389

On my way to check it out. Also, MS are about to give me a QFE for the performance issue. I will feed back on that as soon as I have results.

- Paul Steynberg

Monday, May 19, 2008

Shake the Trees Early

It doesn't matter how good your accounts department is: when you start shaking the trees, skeletons will start to fall. So rather start shaking those trees sooner than later, because the ensuing clean-up/cover-up could take some time.

You may remember from my previous posts that we are moving from our current ERP system to Dynamics AX 2009 (formerly 5.0, but recently rebranded to fall in with Microsoft's standards). As soon as we got the go-ahead, I put my hard hat on and ran into the tree, shoulder first.

So what do we mean by this? When one moves to a new ERP system, all the relevant data in the old one needs to be migrated across. The very first place to start looking is the subsidiary ledgers and control accounts. To illustrate the concept I am going to use Accounts Payable (AP) as my example. At any point in time, all the open vouchers in the system should tie up to the AP ledger, which in turn should balance to the General Ledger control account. It is also very important to work out early how you can isolate the open/unpaid vouchers in your old system with a SQL script. The outcome of this exercise is that you should be able to clean up your data and have a full understanding of what needs to be extracted for the conversion.

It does not help to only balance the subsidiary ledger. One should also look at the open/unpaid items and take a view on their validity. As a yardstick, I would suggest that any vouchers older than 6 months be reviewed and taken care of prior to go-live. In our situation I found a number of vouchers that were quite old, but when you take the debit and credit memos together they balance to zero for a specific vendor. It thus came to light that zero payments were not being made on these nett-offs, resulting in an unusually high number of open items.
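The netting and ageing checks I describe can be sketched in a few lines of Python. This is a toy illustration with invented field names and data, not our actual conversion script: group the open vouchers by vendor and flag vendors whose open items sum to zero (candidates for netting off before conversion), and separately flag anything older than roughly six months for review.

```python
from collections import defaultdict
from datetime import date

# Toy open-voucher extract: (vendor, voucher date, amount). Credits are negative.
open_vouchers = [
    ("Vendor A", date(2007, 11, 1),  500.00),
    ("Vendor A", date(2007, 11, 5), -500.00),   # credit memo netting off the above
    ("Vendor B", date(2008, 4, 20),  120.00),
    ("Vendor C", date(2006, 2, 10),   75.00),   # stale: well over 6 months old
]

def vendors_netting_to_zero(vouchers):
    """Vendors whose open debits and credits cancel out entirely."""
    totals = defaultdict(float)
    for vendor, _, amount in vouchers:
        totals[vendor] += amount
    return sorted(v for v, t in totals.items() if abs(t) < 0.005)

def stale_vouchers(vouchers, as_of, max_age_days=183):
    """Open vouchers older than roughly six months, for review before go-live."""
    return [(v, d, a) for v, d, a in vouchers if (as_of - d).days > max_age_days]

print(vendors_netting_to_zero(open_vouchers))                  # ['Vendor A']
print(len(stale_vouchers(open_vouchers, date(2008, 5, 26))))   # 3
```

In practice the same two queries run as SQL against the old ERP's voucher tables, but the logic is exactly this: sum per vendor to find net-zero sets, and age the remainder.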

By making these numbers and transactions available to the accountants we have managed to clear a significant number of vouchers which will now not have to be taken on in the future.

A parting word of caution: always look up when shaking the tree, as you need to side-step the falling elephants that everybody can see but nobody wants to talk about.

- Paul Steynberg