Wednesday, 27 July 2011

New Features in SQL Server Code-name Denali - CTP3

Microsoft are shipping some new technology in the upcoming version of SQL Server. Code-named "Apollo", it introduces two new features:
  • Columnstore Indexes
  • Vector-based query execution
Microsoft claims these two features can speed up Data Warehouse query processing time by a factor of between 10 and 100. Follow this link for a full description of these features. You can download the CTP3 of Denali here.
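For those curious what the syntax looks like, creating a columnstore index is only a small extension of the familiar index DDL. The table and column names below are made up purely for illustration:

    -- Hypothetical warehouse fact table; the columnstore index covers the
    -- columns most commonly scanned by reporting queries.
    CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_ColumnStore
    ON dbo.FactSales (SalesDateKey, StoreKey, ProductKey, Quantity, SalesAmount);

Bear in mind that, at least in the CTP, a table with a columnstore index becomes read-only, so it suits warehouse tables that are loaded in batches.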

- Paul Steynberg

Monday, 25 July 2011

SQL Server Project "Crescent" Demo

Follow this link to see a demonstration of SQL Server Project "Crescent". To quote: "a new immersive ad-hoc visualization tool that is part of SQL Server Code Name "Denali" Reporting Services. Project "Crescent" is designed with end users in mind to quickly, easily, and visually explore their data and answer ad-hoc questions in just a few clicks."

It sounds like exciting stuff and certainly plugs some gaps in the Microsoft stack.

- Paul Steynberg

Saturday, 15 November 2008

Interfaces to Dynamics AX 2009

A while back I penned a post about our dismal foray into the BizTalk to Dynamics Application Interface Framework. I am led to believe that the problem was not with the AIF but with Dynamics, and that it has since been resolved. We, however, did not have any slack in our very ambitious timetable to wait for it, so we had to take a different route.

I spent countless hours talking to people and browsing the Internet trying to find a robust way of creating interfaces and found absolutely nothing worth reading. Here is the route that I took.

I created a new Database specifically for interfaces into Dynamics AX. This Database is housed on the same server as our Dynamics Database. Within this database we then created a number of tables that are populated by our other systems within the organization. Based on our install I had two distinct types of interfaces: Invoices for Accounts Payable and Journals for the General Ledger. Most of the fields for both are shared anyway. Each system being integrated was assigned a "Source" code so that we would never lose track of where the data originated. In a more complex environment I would suggest that you create a schema within the database for each source system. This will help with control and security.
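As a rough, illustrative sketch only (these are not our actual table or column names), one of the General Ledger staging tables could look something like this:

    -- Illustrative staging table for General Ledger journal lines coming from
    -- the line-of-business systems. The Source column records the originating
    -- system, and TransactionStatus drives the NEW / WIP / FIN life cycle
    -- described further on.
    CREATE TABLE dbo.GLJournalStaging
    (
        StagingId          int IDENTITY(1,1) PRIMARY KEY,
        Source             varchar(10)   NOT NULL,  -- originating system code
        DataAreaId         varchar(4)    NOT NULL,  -- AX company code
        TradingDate        datetime      NOT NULL,
        AccountNum         varchar(20)   NOT NULL,
        CurrencyCode       varchar(3)    NOT NULL,
        Amount             numeric(19,4) NOT NULL,
        ExtractId          int           NOT NULL,  -- batch id from the LOB system
        TransactionStatus  varchar(3)    NOT NULL DEFAULT 'NEW',
        SentToDynamics     datetime      NULL,      -- stamped when handed to DAX
        JournalNum         varchar(20)   NULL,      -- written back after posting
        DateTimeCompleted  datetime      NULL
    );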

Within Dynamics AX (DAX) we then created a new setup table and a form called "External Interface Import Parameters". This table/form held the following information:

  • Import Type – A short identifier of the interface
  • Name – A full description of the external interface
  • External Server Name – The name of the server that housed the Interface Database being called.
  • Database Name – The name of the interface database mentioned above.
  • Journal Name – We set up a different journal name for each interface so that they could easily be identified when looking at the transactions in an account.
  • One Voucher Number Only – A bit switch. If set on this will create one voucher number for the entire interface run. If set off it will generate a new voucher number when it reaches a point of balance.
  • Stored Procedure – This is the name of the stored procedure within the database to call in order to get the data.
  • Write Back Table Name – The table that will be used to stamp back information once the interface has successfully run.
  • ODBC Source Name – We had to use ODBC connections due to the way the systems run from the client or from the batch server. (Some history here but it was necessary).
  • Batch Class Name – If this job was going to be called by the Batch server this is the name of the job to call.
  • Auto Post – If set on it will load and auto post the interface journal.

This information is then used repeatedly throughout the interfaces. Here are the steps that all interfaces go through:

  1. An interface call is made either from a menu item by an operator or from the batch manager in DAX.
  2. Based on the Import Parameters mentioned above the job will then call a Stored Procedure on the Interface database and will pass it the Company code as the only parameter. This is often referred to as the DataAreaId in DAX.
  3. The Stored Procedure fires on the Server and passes back to the job all the records for the interface.
  4. DAX then creates the Journal based on the Journal Name and will either auto post or leave it as unposted.
  5. When complete it will then write the Journal Number and Date/Time Completed back to the write-back table (a sketch of this write-back follows the list).
  6. The batch job will keep firing for the company until the stored procedure returns no records back.
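The write-back in step 5 is nothing more than an update against the staging rows that were marked as in progress. Something along these lines, using the column names from the illustrative staging table above:

    -- Stamp the rows that have just been posted with the journal number and
    -- the completion time, and mark them as finished ('FIN' in the status
    -- scheme described below).
    UPDATE dbo.GLJournalStaging
    SET    JournalNum        = @JournalNum,
           DateTimeCompleted = GETDATE(),
           TransactionStatus = 'FIN'
    WHERE  DataAreaId        = @DataAreaId
      AND  ExtractId         = @ExtractId
      AND  TransactionStatus = 'WIP';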

So what do the Stored Procedures do?

All records that are new have a TransactionStatus of "NEW". Those in the process of being posted are stamped "WIP" and those completed are stamped "FIN". The stored procedure finds records that are "NEW" and then selects the Top 1 record, summarized by your extraction criteria, into a table variable. As an example, our POS Journals are summarized by Company, Trading Date, Currency and ExtractID from the LOB System. This table variable is then used to join back to the table to return the result set. As part of this procedure we also stamp the records as "WIP" with the date/time they were sent to Dynamics.
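A much-simplified sketch of one of these extraction procedures, built on the illustrative staging table from earlier (the real procedures carry more columns and more validation), might look like this:

    CREATE PROCEDURE dbo.GetNextGLJournalBatch
        @DataAreaId varchar(4)  -- company code passed in by the DAX interface job
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Select the Top 1 "NEW" extract for the company, summarized by the
        -- extraction criteria, into a table variable.
        DECLARE @Batch TABLE
        (
            DataAreaId   varchar(4),
            TradingDate  datetime,
            CurrencyCode varchar(3),
            ExtractId    int
        );

        INSERT INTO @Batch (DataAreaId, TradingDate, CurrencyCode, ExtractId)
        SELECT TOP (1) DataAreaId, TradingDate, CurrencyCode, ExtractId
        FROM   dbo.GLJournalStaging
        WHERE  DataAreaId = @DataAreaId
          AND  TransactionStatus = 'NEW'
        GROUP BY DataAreaId, TradingDate, CurrencyCode, ExtractId
        ORDER BY TradingDate, ExtractId;

        -- Stamp the records as "WIP" with the date/time they were sent to DAX.
        UPDATE s
        SET    TransactionStatus = 'WIP',
               SentToDynamics    = GETDATE()
        FROM   dbo.GLJournalStaging AS s
        JOIN   @Batch AS b
          ON   b.DataAreaId   = s.DataAreaId
         AND   b.TradingDate  = s.TradingDate
         AND   b.CurrencyCode = s.CurrencyCode
         AND   b.ExtractId    = s.ExtractId
        WHERE  s.TransactionStatus = 'NEW';

        -- Join the table variable back to the staging table to return the
        -- result set for the journal lines in this batch.
        SELECT s.AccountNum, s.TradingDate, s.CurrencyCode, s.Amount, s.ExtractId
        FROM   dbo.GLJournalStaging AS s
        JOIN   @Batch AS b
          ON   b.DataAreaId   = s.DataAreaId
         AND   b.TradingDate  = s.TradingDate
         AND   b.CurrencyCode = s.CurrencyCode
         AND   b.ExtractId    = s.ExtractId
        WHERE  s.TransactionStatus = 'WIP';
    END

When the batch job calls the procedure again and there is nothing left marked "NEW", no rows come back and the run for that company stops, which is step 6 above.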

Each interface has been specifically written to meet the LOB criteria. I am busy working on a more generic system that will completely future-proof the business. Once the framework is in place no additional development will be required; the only work will be to set up a record in the Import Parameters table, set up a Journal Name and Number Sequence, and populate the interface tables.

I would be very interested to hear from other parties how they tackled this.

- Paul Steynberg

Monday, 20 October 2008

Spotlight on.....

At the beginning of my blogging career I penned the Financial System Manager’s Toolbox series. This series covered, amongst other essential products, the likes of Speed SQL IDE and LiteSpeed from Quest Software. I have to add another product to this list called Spotlight, also by Quest Software. We have found this product to be brilliant. Although specifically purchased to monitor our SQL instances, it also has a wonderful Windows Operating System dashboard. If it were not for Spotlight I am sure that we would have taken many more hours to find our Dynamics AX 2009 issue.

Spotlight gives you a visual interface into your systems' inner workings so you can always see what is going on. The ability to track back and see what was happening at any point in time is priceless. Here is an example of the dashboard for a SQL instance.


Other products are capable of doing this type of monitoring but we have just found this one to be easy and it looks really great.


- Paul Steynberg

Friday, 15 August 2008

Performance Issues with PPS Management Reporter - Update

After months of testing PerformancePoint Server Management Reporter in its current state, I have concluded that it will not handle the volumes that our company requires. It is way too slow and will just frustrate the business. The problem lies with the whole design of the system. I did a full analysis of the MDX code and how the system operates and found that the client (on your machine) sends an MDX query string to the Analysis Server for every combination of report line and entity specified in your reporting tree. So if you have a 50-line report and you are running it for 10 entities you will send out 500 MDX queries, one at a time. As you can imagine, this makes my ZX81 (with the 64Kb RAM pack) look like Usain Bolt pumped up on steroids.

Do not despair, all is not lost. I had a very positive conversation last night with some members of the team working on this issue. I suspect that we could see two positives in the near future. The first is a MASSIVE improvement on SQL Server 2008: PPS should be certified to work on SQL Server 2008 by SP2, which is planned for December 2008. Another idea kicked around is changing the architecture to a service-type environment on the server and using multi-threading and the power available. I would guess that this type of change would only be done for V2, so let's wait and see. As soon as I get my grubby paws on a copy of SP2 to test I will post an item on performance improvements.

- Paul Steynberg

Sunday, 11 May 2008

Leverage Your Existing Investment in Microsoft (Part I)

Under-utilisation (correctly spelled, see footnote 1) of one's current investment is probably one of the most common issues surrounding IT projects. This is especially true with Microsoft. Just as an example, when I speak to other Managers about Analysis Services I am most often greeted with a look typically sported by the lesser spotted stunned mullet. It is quite unbelievable how many companies install an ERP system with SQL Server as the underlying database and have no idea that it ships with some great additional features such as Analysis Services and Reporting Services. To check what your version ships with, go to: http://www.microsoft.com/sql/prodinfo/features/compare-features.mspx

Just as a quick FYI: Analysis Services is a framework that allows one to produce cubes for "slicing and dicing" one's data, often referred to as OLAP. Reporting Services is a tool that allows one to author and deliver reports from within a web environment. I am not going to go into much detail on these two gems (they will feature in a future blog), as the biggest culprit is the Office environment, Excel being the ringleader.

Excel, Excel, wherefore art thou: a product that seems as old and familiar as Bill Shakespeare himself. Yip, I’m back here again, harping on about Accountants and Excel, but you must understand, you just cannot separate them. So if this is true, then why do so many Financial Systems not ship with Excel tools as standard? I had this happen to me some years back with our current system. The original decision makers who attended the demonstrations on the product were amazed by how easily the information could be pulled into Excel. However, when it came time to actually implement this great functionality we were told that it costs "extra" and we were not licensed. Not just a little, but a lot extra.

Act II – In walks my stubborn alter ego, has a look at the product and says “We can write something like that for free”.

Evil Vendor – Ah, but our product is fully integrated with securities etc.

Stubborn Alter Ego – Big deal.

And so our Excel integration was born a few weeks later, with a bit of help from our Database guru. We only worked on this from time to time when we needed a break from our daily routine; if we had to work out the actual man-hours, we would estimate no more than 40 hours in total.

We now run seven Excel-based extracts from our Financial Systems: one for Fixed Assets, two for Accounts Payable and four for General Ledger. We benchmarked them against the products supplied by our vendor and found that on average they ran 20 times faster. Not kidding, 20 times faster. How was this achieved? There are only two questions to be answered: firstly, is the person making the query allowed access to the data, and secondly, where is the data?

The first question can be answered by finding the security-related tables within the database. One normally starts by finding the user table, which holds the user id, name and other details related to the user. From here you then link to the tables that associate the user with different entities and then profiles/menus/accounts and so on. You need to understand how the security is set up in order to make the connections. A little bit of digging and you will work it out.

Once you understand the security model you then need to find out where the required data lives. In the example of one of our Excel extracts, we extract the balances of certain accounts, chased down from a higher-level account, into several columns based on some criteria. Below is a screenshot of the criteria that the users would enter in order to get the information. This particular extract looks at the account master table in order to return the account description, and at the account balance table in order to retrieve the balances for the selected periods.



Most of the sheet is protected, which limits the user's input: everything in grey cannot be changed, only the white blocks can be. Each of the input fields has a comment attached to it for on-line help, and most of them are validated so that users cannot capture invalid information. In reality the users complete the input fields once and save the spreadsheet for repeated use.

Once all the fields are updated the user then clicks the “Run Extract” button. This button has some Visual Basic code behind it which is also password protected. Once clicked the code traverses through the input fields and passes them as parameters to a Stored Procedure in SQL Server which in turn returns a data set and populates the data sheets.

Here is a step by step explanation of how this particular extract works.

The amount columns 1 through 6 represent six different columns of data that can be extracted for each account. Each column can vary based on the fiscal year, the scenario (Actual/Budget/Forecast), the currency (a drop-down validated against the currency table), the currency type (also a validated drop-down) and the from and to periods. I have not had anyone ask for more than the six comparative columns yet, so one can assume that it is a safe number.

The accounts are then entered. If the account is a parent account you can specify the chase level, which is simply how many levels down from this account you want to display. The Show Levels switch, if set to "N", will only show the accounts at the bottom of the chase; if set to "Y", it will show the original account chased and all its subordinates down to the chase level. The New Sheet switch, if set to "N", will return the data set to the DataDefault sheet; if set to "Y", it will create a new sheet for each account line's data set that is fetched. Each sheet will be named with the account number.
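The chase-down itself is a natural fit for a recursive query. This is not our exact code, just a sketch against a hypothetical account master table (AccountNum, ParentAccountNum, Description) to show the idea:

    -- Walk the account hierarchy down from @AccountNum, at most @ChaseLevel
    -- levels deep.
    WITH AccountChase AS
    (
        SELECT a.AccountNum, a.ParentAccountNum, a.Description, 0 AS ChaseDepth
        FROM   dbo.AccountMaster AS a
        WHERE  a.AccountNum = @AccountNum

        UNION ALL

        SELECT c.AccountNum, c.ParentAccountNum, c.Description, p.ChaseDepth + 1
        FROM   dbo.AccountMaster AS c
        JOIN   AccountChase AS p
          ON   c.ParentAccountNum = p.AccountNum
        WHERE  p.ChaseDepth < @ChaseLevel
    )
    SELECT AccountNum, Description, ChaseDepth
    FROM   AccountChase
    -- One reading of the Show Levels switch: 'Y' shows every level chased,
    -- 'N' shows only the accounts at the bottom of the chase.
    WHERE  @ShowLevels = 'Y' OR ChaseDepth = @ChaseLevel;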

There is setup information higher up for the Server Name, Database Name and version number of this particular extract. I have not shown this for obvious security reasons.

Now, once the user hits the "Run Extract" button, the VB code collects the input data and builds a stored procedure call string that gets sent to the server and database named in the setup information.

The stored procedure accepts the parameters and then goes through a set of validation steps, starting with user access. When the user calls the stored procedure, their domain and user name are available for the procedure to compare against the security settings within the product. If the user is allowed to see this org and this account, the procedure continues; if not, we return an error message indicating a lack of security. As in our case, you may have to create a security link table that maps the full Windows domain name of the logged-in user to their user name within the Financial System. If your system is not Windows Security compatible, you may have a spare field in the user table in which to store the Windows login.
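In T-SQL the caller's DOMAIN\username is available through SUSER_SNAME(), so the access check at the top of the procedure can be as simple as the sketch below (the link and access tables are hypothetical names, not our real schema):

    DECLARE @FinUserId varchar(30);

    -- Map the Windows login of the caller to the Financial System user.
    SELECT @FinUserId = l.FinSystemUserId
    FROM   dbo.WindowsUserLink AS l
    WHERE  l.WindowsLogin = SUSER_SNAME();

    -- Refuse to continue if there is no mapping or no access to the org.
    IF @FinUserId IS NULL
       OR NOT EXISTS (SELECT 1
                      FROM   dbo.UserOrgAccess AS a
                      WHERE  a.FinSystemUserId = @FinUserId
                        AND  a.OrgId = @OrgId)
    BEGIN
        RAISERROR('You do not have access to this org/account.', 16, 1);
        RETURN;
    END

A similar EXISTS check against the account security tables covers the account side before the extract query runs.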

Once the data sets are returned, the VB script performs a number of quick steps, such as:

  • Format the number columns correctly
  • Correct the column sizes to fit the data
  • Put in some totals

Some steps performed before the data is returned are:

  • Clear the data area to be populated
  • Remove any formatting, including subtotals
  • Delete any sheets that were not in the original workbook. (We have made provision for specially named sheets that are never deleted, so the user can link them up for other purposes.)

Some of my other extracts are even more elaborate. For example, the AP Voucher extract allows one to return the entire vendor master into Excel for an org; by double-clicking a vendor line you return all of that vendor's vouchers in another sheet, and by double-clicking a voucher you can return the lines that make up the entire voucher, and drill even further down to the distributions. Our users just LOVE it: no need to navigate the application when you can just double-click in an Excel sheet.
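Behind the scenes each double-click simply fires another small stored procedure with the key of the row that was clicked. Conceptually the three levels are no more complicated than this (table and column names are hypothetical):

    -- Level 1: the vendor master for the org.
    SELECT VendorId, VendorName, PaymentTerms
    FROM   dbo.VendorMaster
    WHERE  OrgId = @OrgId;

    -- Level 2: all vouchers for the vendor that was double-clicked.
    SELECT VoucherNum, VoucherDate, InvoiceNum, Amount
    FROM   dbo.APVoucher
    WHERE  OrgId = @OrgId
      AND  VendorId = @VendorId;

    -- Level 3: the lines (and, one level further, the distributions) behind
    -- the chosen voucher.
    SELECT LineNum, AccountNum, Description, Amount
    FROM   dbo.APVoucherLine
    WHERE  OrgId = @OrgId
      AND  VoucherNum = @VoucherNum;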

The point of this blog was merely to demonstrate that we must seek out ways to maximise our Microsoft investment and to challenge the hold that some vendors appear to have over us. In my next session on this I will highlight Analysis Services and how this product can make a full-grown Accountant weak at the knees.

- Paul Steynberg

Footnote 1

An excerpt from a letter supposedly penned by John Cleese to the citizens of the United States of America: You should look up "revocation" in the Oxford English Dictionary. Then look up "aluminium". Check the pronunciation guide. You will be amazed at just how wrongly you have been pronouncing it. The letter 'U' will be reinstated in words such as 'favour' and 'neighbour', skipping the letter 'U' is nothing more than laziness on your part. Likewise, you will learn to spell 'doughnut' without skipping half the letters. You will end your love affair with the letter 'Z' (pronounced 'zed' not 'zee') and the suffix "ize" will be replaced by the suffix "ise". You will learn that the suffix 'burgh is pronounced 'burra' e.g. Edinburgh. You are welcome to respell Pittsburgh as 'Pittsberg' if you can't cope with correct pronunciation. Generally, you should raise your vocabulary to acceptable levels. Look up "vocabulary". Using the same twenty seven words interspersed with filler noises such as "like" and "you know" is an unacceptable and inefficient form of communication. Look up "interspersed". There will be no more 'bleeps' in the Jerry Springer show. If you're not old enough to cope with bad language then you shouldn't have chat shows. When you learn to develop your vocabulary then you won't have to use bad language as often.

A rebuttal to this can be found at http://www.authorsden.com/categories/article_top.asp?catid=19&id=26498

Thursday, 8 May 2008

The Financial System Manager's Toolbox (Part I)

A common question. What tools should a Financial Systems Manager have in his/her toolbox in order to be effective? I started to add up the bits and pieces that my team and I surround ourselves with and it turns out to be quite a lot. I suppose the list depends on what responsibilities the team is tasked with in each organisation but here we do pretty much everything from project management, help desk support, training, operational checks, backups, server updates, installs etc. So what do we use?

Excel - Without a doubt the cornerstone of every accountant's existence. Take this away and you risk imploding the known universe.

Word - Pretty obvious but essential for documenting. We make sure that every system is fully documented with backup and recovery plans. Hand in hand with documentation come screenshots, and this is where a tool such as Snag-It comes into play. A must-have in order to make your documentation "idiot" proof. Once you have finished your document it is a good idea to save it as a PDF so that it becomes difficult for others to tamper with it. Microsoft has an add-in for the Office suite which saves to PDF or XPS. You can find it here.

Text Editor - I use EditPad. It is especially useful when you receive and/or send files between PC-based technology and mainframes. It allows you to check record lengths, convert between Windows, UNIX and Mac formats, change case and a whole lot more. A very useful tool.

SQL Editor - The current version of SQL Server Management Studio does not provide IntelliSense or auto-complete. It also does not allow you to save your scripts in an orderly, searchable fashion. I use Speed SQL IDE from Quest Software (which is no longer supported) as it provides all the functionality required and saves versions of your scripts in a central repository which can be backed up daily. They have replaced it with Toad, which I do not like. You can also use a tool called SQL Prompt from Red-gate; it works within SQL Server Management Studio and Visual Studio.

Backups - All our Financial Systems have been consolidated onto our standard database, SQL Server. Some time back I started to run out of disk space and also of backup window time. We found a product, also from Quest Software, called LiteSpeed. Now this is a product that saved my bacon. Our backups of our main ERP system were taking 1.5 hours and occupying 80GB on disk. After installing LiteSpeed the backup time dropped to under 30 minutes and the file size to only 10GB, as it compresses on the fly while it backs up. An absolute winner: it saves you time and the cost of additional disk space.

Text File Searching - A number of older-style ERP systems produce files for every process within the business. These files are mainly text based, for example log files and spool files. In the past, if things did not go according to plan these files would contain error messages or warnings, and if the person in charge of the system was not aware of any problems they might go unchecked. These files were also produced during interfaces between systems, and the only way to make sure that the interfaces were successful was to trawl through the files and check that they were OK. A massive task if you have as many orgs as we do, with as many interfaces running daily. I could not find anything to do this job automatically back in 2000, so I specced my own application and had it written by a Delphi programmer. It still works today and saves my guys hours. One sets up a search location, a file mask for the files being searched, what you are looking for or not looking for, a location for the file to be copied or moved to, an SMS number and message if you need to SMS someone, and so on. It was written to run either as a service or interactively. Love it!

All the above tools combined assist me and my team in maintaining our Financial Systems environment to a very high standard. Maybe one or two of them are just what you are looking for.

In Part II I will touch on more specific tools such as MS Integration Services, Analysis Services and Reporting Services.

- Paul Steynberg