Monday, 12 October 2009

iXBRL - Don't Panic

Introduction

Companies in the UK will have to submit financials to Her Majesty’s Revenue and Customs (HMRC) via an online process within the next 2 years. I had a session with Ernst & Young about the language to be used, XBRL, and have also attended a workshop at HMRC titled “Company Tax Returns and Online Filing”. This is a summary of my findings and my personal views on the impact.

Overview of XBRL

XBRL stands for eXtensible Business Reporting Language. It is one of a family of XML-based languages that is becoming a standard means of communicating information between businesses and on the internet. (1). iXBRL, which stands for Inline eXtensible Business Reporting Language, is a slight derivative of this language that allows the data to be read by computers and viewed by humans from the same file.
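To make the “read by computers and viewed by humans” point concrete, here is a minimal sketch in Python showing how the tagged facts could be pulled out of an iXBRL file. The file name is hypothetical and the element names (ix:nonFraction, ix:nonNumeric) are those used in the inline XBRL draft specification; treat this as an illustration rather than a production parser.

```python
# Minimal sketch: pull the machine-readable facts out of an iXBRL file.
# The file name is illustrative; real documents should be checked against
# the inline XBRL specification and the taxonomy actually in use.
import xml.etree.ElementTree as ET

tree = ET.parse("accounts.xhtml")  # an iXBRL document is ordinary XHTML

for elem in tree.iter():
    # Inline XBRL facts are marked up with ix:nonFraction / ix:nonNumeric
    # elements; the visible text stays in the page for human readers.
    local_name = elem.tag.rsplit("}", 1)[-1]
    if local_name in ("nonFraction", "nonNumeric"):
        print(elem.get("name"), "=", "".join(elem.itertext()).strip())
```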

Impact

All companies filing tax returns after 31st March 2011 will have to comply with the requirement to submit all the data online. This will affect 3 submissions as follows:

  • CT600 – Currently can be submitted via XML and will not change.
  • Accounts – Must be submitted online in iXBRL format.
  • Tax Computations – Must be submitted online in iXBRL format.

What will not change is WHAT you submit, WHO is responsible for submitting it and WHEN it must be submitted, only HOW you submit it will change.

Four taxonomies will be supported by HMRC, but for the first 2 years of implementation only a limited subset of each will be required.

  • UK GAAP – full tag set: 4375; limited tag set: 1182
  • UK IFRS – full tag set: 3400; limited tag set: <1600
  • UK Common Data – full tag set: 900; limited tag set: 900
  • CT Computational – full tag set: 4548; limited tag set: 1350

You are not restricted to the limited tag set, but it is the minimum you are required to submit for the first 2 years. Thereafter the full tag set will be required.

The current versions of the taxonomies can be downloaded from http://www.xbrl.org/uk/Taxonomies/.

Companies House and HMRC issued a joint statement on 1st September 2009 stating that they are working together to align their filing services so that a single point of filing can be used.

Software

As with any technology challenge there will be many ways of solving it, ranging from ERP-supplier-based solutions right through to stand-alone file converters. Unfortunately iXBRL is not yet widely used, so I have not managed to find anything worth taking a look at yet. HMRC is busy reviewing and certifying a number of products, but when pushed they did not want to give any names or costs in case they were seen to favour one supplier over another. They did assure us that a number of suppliers would be ready for market in Q1 of 2010.

Here is a list of companies that I believe are working on a solution. Do not take this as gospel; it is based purely on my own googling.

www.savanet.net

www.corefiling.com

www.semansys.com

www.rivetsoftware.com

www.ibs.nl

www.edgarfilings.com

www.allocationsolutions.com

www.justsystems.com

Approach

Awareness of XBRL and its impact is at an acceptable level within the UK market. Based on the Q&A session with HMRC, the first 2 years are really going to be a phased, teething process, and companies will not be penalised or prosecuted if they don’t get it quite right. We also do not yet have any products to work with, as they are still being developed.

I would suggest a measured approach to online filing which starts with Finance obtaining the taxonomies they intend to file under and manually mapping the figures across from the financials, as sketched below. This, in my opinion, is going to be the largest portion of the work required. A deadline of end Q2 2010 for the mapping tables should be achievable.
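To illustrate what the mapping exercise might look like, here is a hedged sketch of a simple mapping table. The account names, figures and tag names are invented for illustration and are not actual UK GAAP taxonomy elements; the real mapping would be driven by the taxonomy Finance downloads.

```python
# A hedged sketch of the manual mapping exercise: tie each line of the
# statutory accounts to the taxonomy tag it will be reported under.
# All names and figures below are made up for illustration only.
tag_map = {
    "Turnover":                "uk-gaap-illustrative:Turnover",
    "Cost of sales":           "uk-gaap-illustrative:CostOfSales",
    "Administrative expenses": "uk-gaap-illustrative:AdministrativeExpenses",
}

financials = {
    "Turnover": 1250000,
    "Cost of sales": -800000,
    "Administrative expenses": -150000,
    "Exceptional items": -25000,        # deliberately left unmapped
}

for line_item, value in financials.items():
    tag = tag_map.get(line_item, "UNMAPPED - needs review by Finance")
    print(f"{line_item:<25} {value:>12,}  ->  {tag}")
```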

Get your IT department to keep tabs on the development of the various alternatives in the software space and then organise demonstrations and evaluations as and when possible. I estimate that we should be able to find something by Q2 2010. Based on all the input, I would at this point suggest that we look to purchasing an Office add-in that will allow us to continue with our current method of producing the financials and thus reduce the impact on the business.

Works Cited

1. XBRL International. What is XBRL. [Online] [Cited: October 9, 2009.] http://www.xbrl.org/WhatIsXBRL/.

- Paul Steynberg

Thursday, 8 October 2009

Solvency II - Data Requirements

Overview of Solvency II

Solvency II is a piece of legislation (a directive, to be exact) adopted by the European Parliament on 22nd April 2009 and is a fundamental review of the capital adequacy regime for European insurers and reinsurers. The planned effective date is October 2012, and it aims to establish a revised set of EU-wide capital requirements, valuation techniques and risk management standards that will replace the current Solvency I requirements. (2).

The full text of this legislation can be found here. Powers have been granted to CEIOPS to produce consultation papers and to engage with the industry in order to ensure uniformity and clarity.

IT Impact

A section of the proposed framework deals with Standards of Data Quality, which is outlined in a consultation paper from CEIOPS referred to as CP43. (3). Data, according to this paper, refers to all the information which is directly or indirectly needed to carry out a valuation of technical provisions, in particular enabling the use of appropriate actuarial and statistical methodologies in line with the underlying (re)insurance obligations, the undertaking’s specificities and the principle of proportionality. In the context of the Paper, data comprises numerical, census or classification information but not qualitative information. Assumptions are not regarded as data, but it is noted that the use of data is an important basis for the development of actuarial assumptions.

Whereas this Paper is focused on setting out advice in the context of a valuation of technical provisions, it is noted that the issue of data quality is also relevant in other areas of a solvency assessment, for example for the calculation of the Solvency Capital Requirement (SCR) using the standard formula or internal models.

From the materials I have been reading, and from a breakfast briefing (1) I attended, here are some issues that I would like to flag. The CP papers expand on the concept that data should be Accurate, Complete and Appropriate, and in doing so they highlight the following potential gaps.

Data Governance
  • Most IT policies focus on security. Substantial changes to policies will be required to ensure an equal focus on data quality.
  • As a group you will need to understand what data quality means within your context and how to measure it. This also needs documenting.
  • Clear delineation of responsibility versus ownership of the data is required, and this too needs documenting. In my view this would be split between the business units and IT.
  • You need to start looking at the technologies and tools required within your Group to ensure data quality.
Monitoring
  • The subtleties of the text indicate a focus on both transactional and non-transactional data.
  • Both regular and ad hoc data will require a degree of monitoring, with appropriate controls in place to deal with each.
  • A move away from focusing purely on the accuracy of the data towards its relevance and content is required.
Documentation
  • Additional documentation is required, with regular updating.
  • Decisions on data quality deviations now require specific documentation and approval.
In order to ensure accuracy, any data deficiencies should be rectified, with each adjustment justified and documented, and the adjustments should not overwrite the raw data. It was also hinted that a more detailed review of the data is required to ensure that it is valid and appropriate. In other words, the fact that data made it from your source systems through a multitude of layers to a warehouse does not absolve you from the responsibility of ensuring that the original data was correct. This leads me to believe that data profiling and review of individual pieces of data with appropriate monitoring tools will be required, for both external and internal data.
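As an illustration of the “justify, document and do not overwrite the raw data” principle, here is a minimal sketch of an adjustments journal held alongside the raw records. The record and field names are assumptions of mine, not anything prescribed by CP43.

```python
# Minimal sketch: corrections are held as separate, justified adjustment
# records and applied on top of the untouched source figures.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Adjustment:
    record_id: str        # key of the raw record being corrected
    field: str
    old_value: float
    new_value: float
    justification: str    # required: why the deficiency was rectified
    approved_by: str
    approved_on: date

# Hypothetical raw data, kept exactly as it arrived from the source system.
raw_claims = {"CLM-001": {"paid": 1200.0}, "CLM-002": {"paid": -50.0}}

adjustments = [
    Adjustment("CLM-002", "paid", -50.0, 50.0,
               "Negative paid amount keyed in error at source",
               "J. Smith", date(2009, 10, 1)),
]

def adjusted_view(raw, adjs):
    """Return the corrected data set without mutating the raw records."""
    view = {key: dict(fields) for key, fields in raw.items()}
    for adj in adjs:
        view[adj.record_id][adj.field] = adj.new_value
    return view

print(adjusted_view(raw_claims, adjustments))
```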

Addressing the issue of data quality will go to the core of your IT department and, if the hype is to be believed, will require substantial investment to ensure that your processes and tools are up to the task. This will have an impact on your IT strategy from both a technology and an approach point of view.

Audit Impact

Article 46 of the Directive Consolidated Text, referred to as Insurance and reinsurance (Solvency II) (recast), defines the responsibility of Internal Audit (IA) within the scope of Solvency II. (4). This article requires the usual stance of independence and specifically requires that IA provide an effective internal audit function, which includes an evaluation of the adequacy and effectiveness of the internal control system and other elements of the governance system. (This gets a bit circular, as the resolution passed under (18a) says that the governance system includes the risk management function, the compliance function, the internal audit function and the actuarial function, which implies that they audit the same system that they are part of.)

CP33 from CEIOPS deals with the advice on Governance and makes reference to the functions of Internal Audit. (5).

Although the requirements do not appear to be any different from IA’s business-as-usual functions, the function is required to submit a written report on its findings to the administrative or management body at least annually. It makes sense, then, to expect IA to have a full understanding of the Solvency II requirements.

Conclusions

Solvency II is going to have quite an impact on the way we do business as an insurance company, and it should be driven from the highest possible level in the company as it will require substantial resourcing. This initiative should not be driven from within one department.

Works Cited

1. Addressing the data and technology challenges of Solvency II. PriceWaterhouseCoopers. London : s.n., 2009.
2. Financial Services Authority. Insurance Risk Management: The Path to Solvency II. s.l. : Financial Services Authority, 2008.
3. CEIOPS. Consultation Paper No.43 - Technical Provisions - Article 85 f Standards for Data Quality. Frankfurt : CEIOPS, 2009.
4. European Parliament. Insurance and reinsurance (Solvency II) (recast). Strasbourg : s.n., 2009.
5. CEIOPS. Consultation Paper No.33 - Draft CEIOPS Advice for Level 2 Implementing Measures on Solvency II: System of Governance. Frankfurt : s.n., 2009.

Thursday, 1 October 2009

Who Still Trusts the Gorilla?

Where does a 500 pound gorilla sleep? Anywhere it wants. Enter left stage - Microsoft.

At the beginning of the year Microsoft canned development on PerformancePoint Server Planning and laid off a whole pile of people, some of them by e-mail; I know, because one of them was on site with me in the UK when he got the news that way. To me this was a pivotal moment in our relationship with Microsoft. A year ago nobody questioned Microsoft's commitment to software development and products. Today it is a very different story. A few weeks ago I was discussing the way forward for our ETL architecture with my current employer and we had two roads: either IBM DataStage or Microsoft SSIS. The most senior person in the room posed this question: "Are we sure that Microsoft SSIS will still be around in the future, given the demise of PPS?" A year ago that question would have elicited chuckles from the boardroom table as an obvious joke. Not anymore; it was a serious question which required follow-up.

I don't think that Microsoft has any idea how much damage they have done to their reputation in the market.

What do you think? If you have time please complete my on-line quick poll on this.

- Paul Steynberg

Monday, 28 September 2009

Gartner on Mid Market ERP Systems

Those of you who have followed my blog will know that prior to leaving my South African employer I implemented Dynamics AX 2009 (Axapta 5.0). The project was a raging success, delivered within budget and for considerably less than the alternatives on the list. It is always good to have your decisions vindicated later, and I came across this Gartner report which puts Dynamics AX in the top right-hand quadrant of the famous Gartner Magic Quadrant.

This report is definitely worth a read and was published in June 2009.

- Paul Steynberg

Saturday, 29 August 2009

A Novel Way of Dealing with Excel Models

Excel is both heaven and hell in the business world. It's a great calculator, modeller, reporter, you name it. With this incredible flexibility comes a trade-off: your data is stored in multiple locations throughout the company in an unstructured and often unsecured manner. This data is not visible to the enterprise and in most cases is not integrated into your reporting stack. Loads of re-capturing and data copying result in huge inefficiencies and, in many cases, errors.

My current employer embarked on a project 18 months ago to "black box" a number of these high-profile models. Without going into the gory details, let's just say that it did not work. I joined them in April of this year, when my family and I made the move to London, and was tasked firstly with evaluating the models to give an opinion on whether they were fit for purpose and could be supported into the future. Sadly they could not, and it was my unfortunate responsibility to stop the bleeding and bring the project to a conclusion. That was the easy part; the hard part was deciding what to do next, as the original issues around the Excel models were still relevant.

The obvious choice would be to take another look at the products shortlisted at great expense and find the next suitable candidate. I was not comfortable with this approach, and we did not want to try to rebuild the models in PerformancePoint as all development of the product has been discontinued by Microsoft. (Don't even get me started on this subject; I have bitten my tongue since they shook the market with that announcement in January 2009.)

I have come up with a rather "out of the box" approach which my team is in the process of designing at the moment. We needed to achieve a few objectives from the project:
  • The models should be flexible
  • They should have access control
  • Changes should follow appropriate change control
  • They should be auditable
  • The data should be stored centrally and available to other business systems
Package these requirements up with the new Solvency II requirements and SOX etc, and you have quite a tall order.

Of course the most flexible model is still Excel. So what if you could keep Excel and also apply security, access control, change control, auditing and so on? This led me on a journey of trying to find software that would give us this functionality. And guess what, it does exist, albeit as a software concept in its infancy. Gartner have even released a paper on it (March 11, 2008) under the heading "MarketScope for Spreadsheet Control Products, 2008". It was quite comforting to find out that we had made contact with all the major players prior to this report and that our findings were in line with Gartner's. The major players in this space are Cimcon, Prodiance, ClusterSeven and Finsbury.
They all have a presence in London; Cimcon and Prodiance are US-based suppliers whereas ClusterSeven and Finsbury are UK-based.

Now they all do pretty much the same thing but in slightly different ways. Essentially each product has 3 major functionality groups:
  • Discovery and Risk Assessment - Basically trawling through all your locations looking for spreadsheets, categorising them and creating a baseline.
  • Monitoring - Keeping track of your spreadsheets and keeping a full audit trail of all changes and versions.
  • Development Tools - Some are add-ins, others are applications that assist in developing your spreadsheets to minimise risk, errors and so on.
So, in a nutshell, you can see who did what to which spreadsheet over time and set up alerts, workflows and so on. You will also be able to see broken links between spreadsheets and a whole host of other interesting things that are out of the scope of this paper. We are in the process of getting RFPs from the suppliers and, although I have some opinions, I will reserve them until a later date.
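None of this is any vendor's actual product, but as a rough illustration of the monitoring idea, here is a minimal sketch that fingerprints watched workbooks and flags changes against a recorded baseline. The paths and the baseline file are hypothetical.

```python
# Minimal sketch of spreadsheet change monitoring: hash each watched workbook
# and compare it with the last recorded baseline so changes can be flagged.
import datetime
import hashlib
import json
import os

BASELINE_FILE = "spreadsheet_baselines.json"            # hypothetical
WATCHED = [r"\\fileserver\models\capital_model.xlsx"]   # hypothetical path

def fingerprint(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

baselines = {}
if os.path.exists(BASELINE_FILE):
    with open(BASELINE_FILE) as f:
        baselines = json.load(f)

for path in WATCHED:
    digest = fingerprint(path)
    if baselines.get(path) != digest:
        print(f"{datetime.datetime.now().isoformat()} CHANGED: {path}")
    baselines[path] = digest

with open(BASELINE_FILE, "w") as f:
    json.dump(baselines, f, indent=2)
```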

This piece of software should take care of most of the issues surrounding the Excel models. Now for centrally storing the data and making it available to the rest of the business. My team is in the process of gathering requirements for a framework that will allow us to upload many different data sets from these models into a SQL Server database and thus make them available to our reporting layer. This may sound quite simple, but building a framework that is scalable, supportable and flexible is no easy task.
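As a rough sketch of the upload idea, and not the framework my team is building, here is how a sheet of model outputs could be landed in a SQL Server staging table. The workbook, sheet, table and connection details are all assumptions, and the library choices (openpyxl, pyodbc) are simply convenient for the illustration.

```python
# Minimal sketch: read model outputs from a workbook and land them in a
# SQL Server staging table.  All names and connection details are invented.
import pyodbc
from openpyxl import load_workbook

wb = load_workbook(r"\\fileserver\models\capital_model.xlsx", data_only=True)
ws = wb["Outputs"]                      # hypothetical sheet of model results

rows = [(r[0].value, r[1].value, r[2].value)
        for r in ws.iter_rows(min_row=2, max_col=3)     # skip header row
        if r[0].value is not None]

conn = pyodbc.connect("DRIVER={SQL Server};SERVER=reporting;"
                      "DATABASE=ModelStore;Trusted_Connection=yes")
with conn:  # commits on success, rolls back on error
    cur = conn.cursor()
    cur.executemany(
        "INSERT INTO staging.ModelOutput (MetricName, PeriodEnd, Value) "
        "VALUES (?, ?, ?)", rows)
```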

Watch this space!

- Paul Steynberg

Friday, 13 March 2009

Management Reporter/Frx Roadmap

The big question has finally been answered: what is the future of Management Reporter, FRx Reporter, FRx Forecaster and Enterprise Reporting? Well, Microsoft has released this roadmap.

2010 - MR to replace FRx Reporter as the reporting tool of choice for Dynamics AX. This will coincide with the release of AX2010 (6.0) and it will be called MR V2.
2012 - MR to be released as V3 with AX2012 (7.0) and will now include Forecaster.
2014 - MR to be released as V4 with AX2014 (8.0) and will include functionality from Enterprise Reporting.

Personally I would not move from FRx Reporter until 2012 having learned a very hard lesson over the last year or so with the current version of MR.

Regards

Paul

Wednesday, 11 February 2009

Cannot Install Management Reporter Directly to SQL Server 2008

I found a bug some time back that did not allow one to install Management Reporter while linking directly to a SQL Server 2008 database. One had to install RTM, point it to a SQL Server 2005 database, and only once SP1 for MR (SP2 for PPS) was applied could you change the database to SQL Server 2008.

Just had confirmation from Microsoft that this will be fixed in SP2 for Management Reporter.

SP1 for MR was actually packaged with SP2 for PPS, so we will get this fix in SP3 for PPS, which will be released in the summer.

- Paul Steynberg


Wednesday, 14 January 2009

PerformancePoint Server Management Reporter and SQL 2008

Does Management Reporter (MR) work with SQL Server 2008? Yes, if you apply SP2, which Microsoft has recently released. I have tested this and seen huge improvements in performance. But that is not the reason for this post; something a lot more sinister is afoot.

I started the MR rollout to the business after upgrading all our pre-SP2 installs. We migrated the ManagementReporter database from the SQL Server 2005 environment to the SQL Server 2008 environment with no problems.

Here is the BUT. When you install MR on a clean machine you MUST, during the install process, give it a valid ManagementReporter database to connect to. Now, in order to install SP2 you must first install RTM. SP2 is designed to work with SQL Server 2008 but RTM is not, and it lets you know this in no uncertain terms. So you are left with a chicken-and-egg situation: you want to install SP2 to make it work with 2008, but because the database is on 2008 you cannot first install RTM.

Workaround: DO NOT delete any copy of the MR database that you still have on the network on SQL Server 2000/2005, or just install a database from the RTM version onto any SQL Server 2005 box. During the client install, point to this database in order to get RTM installed. Then apply SP2, create a new connection to the 2008 database and delete the old connection. Simple, but unfortunately necessary.

This has been raised as a bug and is in production.

- Paul Steynberg