Upgrading Axapta 3 to Dynamics AX 4


Just under a month ago, we completed an upgrade of our Axapta 3.0 SP5 KR3 system to Dynamics AX 4.0 SP1. For those who don't know, Microsoft Dynamics AX is the new name for Microsoft Business Solutions Axapta.

During the upgrade, we ran into many problems, so I thought I'd write up the highlights so others in the same position can benefit. I don't seek to condemn the product; in fact, I think the product in general is very good. However, I believe this information should be available to help others out. All of these problems were reported to Microsoft at least a month ago, and I've provided my solutions where applicable.

It's interesting to note that unlike previous releases, Dynamics AX 4.x was released in the USA quite some time ago, and was only commercially available in Europe once Service Pack 1 was available. Rumour has it that Service Pack 2 will be available later this year, possibly by the end of summer.

Background

Before I go into too much detail, I should explain our situation. Our company works in the distribution industry, with our head office in Brussels, and smaller offices located in the UK, France, Germany, The Netherlands, Italy, and so forth.

We work with customers predominantly within EMEA, but also from around the world.

Out of respect, I won't mention the name of the company I work for, but we are a Microsoft end-user, not a Microsoft partner.

We're in the process of migrating away from a prehistoric ERP system, and the largest component completed so far is the finance modules, of which we're currently using everything except inter-company accounting.

Upgrading customisations

Prior to the upgrade, I took a snapshot of our live system and froze the code-bed, essentially halting new customisations. On a separate platform, I performed the database upgrade and merged in our code from the CUS layer.

Not much has changed with the API, but since Microsoft acquired Axapta from Navision they've done a lot of work cleaning up the AOT. Because of this, most changes to our own code involved changing names of data-types.

The most prevalent data-type changes I noticed were related to addressing. For example, CountryId was renamed to AddressCountryRegionId, City to AddressCity, and so forth. Along with these data-type changes, variables, tables, and classes have changed to match the data-type names in keeping with traditional X++ best practises.
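
As a trivial illustration (the job and variable names here are my own, not part of the standard application), a declaration that used the old types now needs the new names:

[geshi lang=xpp]
static void SDB_RenamedTypesExample(Args _args)
{
    AddressCountryRegionId  countryRegionId;    // was CountryId in Axapta 3.0
    AddressCity             city;               // was City in Axapta 3.0
    ;

    countryRegionId = 'BE';
    city            = 'Brussels';

    info(strfmt("%1 (%2)", city, countryRegionId));
}
[/geshi]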

We were using \Data Dictionary\Tables\Country\Fields\ISOcode (now \Data Dictionary\Tables\AddressCountryRegion\Fields\ISOcode) to store the ISO 3166-1 alpha-3 country codes used by our legacy systems, since it had been decided that our country codes themselves would use the simpler two-letter ISO 3166-1 alpha-2 codes our newer systems use. Microsoft hadn't considered this, and changed the length of the extended data type CountryISOCode (now AddressCountryRegionISOCode) from 3 characters to 2.

Somewhat problematic for us was the removal of code that permitted us to hack a SOAP interface together. Having said that, though, the new code that uses .NET is much nicer. For details on this, please read my previous article, Using SOAP via .NET in Dynamics AX.

Furthermore, Microsoft have introduced code access security, requiring the CodeAccessPermission API to be used prior to calling out to DLL files, .NET assemblies, COM components, and even ordinary file IO.
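
As an example of what this means in practice, here's a minimal sketch of the file IO case as I understand it (the job name and file path are mine, and real code should also check for errors):

[geshi lang=xpp]
static void SDB_CodeAccessSecurityExample(Args _args)
{
    #File
    FileIOPermission    permission;
    TextIo              file;
    container           line;
    str                 fileName = 'C:\\Temp\\example.txt';
    ;

    // Assert the right to read this file before any file IO takes place
    permission = new FileIOPermission(fileName, #io_read);
    permission.assert();

    file = new TextIo(fileName, #io_read);
    line = file.read();
    info(con2str(line));

    // Revert the assertion as soon as the privileged operation is finished
    CodeAccessPermission::revertAssert();
}
[/geshi]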

We used a lot of @SYS labels within our code, and it seems that many of these were moved around while the GLS and DIP layers were merged into SYS. These took quite some time to track down.

Testing phase

Fortunately, I was able to spend a month going through a testing process that included our users and a senior consultant from our Microsoft partner. My methodology for the test was to take a snapshot of the live environment on a predefined day and create two duplicates that were “frozen in time”, like a normal test environment would be. One of them would remain running the old version, and the other would run the new version.

In this way, our staff were able to compare financial reports between the systems to make sure the data upgrade was accurate. Once this was complete, they could then perform identical daily operations in both systems, with the help and training of our Microsoft partner.

Our partner has also helped us reconfigure parts of the system where large changes have occurred.

License problems during upgrade

We had several problems during the upgrade because the license keys remaining in the database were, unsurprisingly, no longer valid. Initially, logging into the new system produced errors complaining about no access to start-up, and so forth.

Installing the new license key file and restarting the system solved the problem.

Index problems during data dictionary synchronisation

We saw problems with index creation during synchronisation because of duplicated data within the database. Most of this could probably be traced back to much older service pack levels of the 3.0 system, but careful removal of the duplicated rows solved the problem quickly.

In the database, we saw problems mostly with TaxIntervatDetail, TaxIntervatConfiguration and SalesPurchaseCycle tables.

Problem with post-synchronisation upgrade tool

We experienced a problem with the post-synchronisation upgrade related to upgrading the KMBSCParameters table. This table relates to balanced scorecard configuration, part of a module we don't have.

To solve this quickly, I manually edited the ReleaseUpdateJobStatus table to flag these jobs as completed so that the other upgrade jobs would continue normally.

Bugs creating references to .NET assemblies

This is a known issue from Microsoft: When attempting to create a reference to a .NET assembly (under the AOT node \References) without the Web MorphX Development Suite license key, the reference will be created but not saved. Within moments, the reference will disappear.

The work-around for this is to create the reference, then immediately export to a file. Importing the file again will save the reference as you'd expect.

The problem is removing the reference later, which again cannot be done without this license. When writing experimental code, the solution is to save the reference in an unused layer such as USP; if you need to remove it, you can shut the system down and delete the axusp.aod and axapd.aoi files from the server.

Bug in MAPI kernel class

If your code relies on the MAPI kernel class like ours did, expect problems. Within MAPI::findNext() there's a flaw that fails to return the next message ID from the server, even though the returned error code indicates success. This makes the class useless for reading mail from a MAPI profile.

Since MAPI is deprecated, and Microsoft seems to deprecate its mail technologies with every Exchange release, I'm writing a small POP3 client!
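
In case it helps anyone heading the same way, this is roughly the approach I have in mind: a sketch using the .NET socket classes through CLR interop, not finished code. The host, mailbox and password are placeholders, the System assembly needs to be referenced under \References, and proper error handling is left out:

[geshi lang=xpp]
static void SDB_Pop3Sketch(Args _args)
{
    System.Net.Sockets.TcpClient    tcpClient;
    System.IO.StreamReader          reader;
    System.IO.StreamWriter          writer;
    InteropPermission               permission;
    str                             response;
    ;

    // CLR interop is also covered by the new code access security
    permission = new InteropPermission(InteropKind::ClrInterop);
    permission.assert();

    // Connect to the POP3 server
    tcpClient = new System.Net.Sockets.TcpClient("pop.example.com", 110);
    reader    = new System.IO.StreamReader(tcpClient.GetStream());
    writer    = new System.IO.StreamWriter(tcpClient.GetStream());
    writer.set_AutoFlush(true);

    response = reader.ReadLine();                   // +OK greeting
    info(response);

    writer.WriteLine("USER mailbox@example.com");
    response = reader.ReadLine();
    info(response);

    writer.WriteLine("PASS secret");
    response = reader.ReadLine();
    info(response);

    writer.WriteLine("STAT");                       // message count and total size
    response = reader.ReadLine();
    info(response);

    writer.WriteLine("QUIT");
    tcpClient.Close();

    CodeAccessPermission::revertAssert();
}
[/geshi]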

Customer/Vendor exchange adjustment problems

We've found a number of flaws within the exchange adjustment routines for customers and vendors. In particular, these problems exist within the class CustVendExchAdjTrans and could probably fill a blog entry of their own. This code has been reworked and actually functions better than the old code, since it includes invoices that are closed in the future and allows you to perform a simulation of an adjustment.

Related to this new code that looks for future invoices, there are a few problems. Firstly, the query you specify when performing the adjustment is only applied to open invoices, not to invoices that will be closed in the future. If you specify a range of customers to apply the adjustment to, all invoices closed in the future will still be included.

There's a problem with the selection of the posting profile for accounts during adjustment of invoices closed in the future: it will always use a default “catch-all” posting profile, or fail to find one if you have no default profile configured. My fix, applied to the code at the end of \Classes\CustVendExchAdjTrans\adjust, follows (the original line has been commented out):

[geshi lang=xpp]
    this.postLedgerTrans(_regAmountMST,
                         this.accountNonrealLossProfit(_reverseTrans ? -_regAmountMST : _regAmountMST,
                                                       newCustVendTrans.CurrencyCode),
                         // this.sumAccount(custVendTable.AccountNum),
                         // Start of patch
                         this.sumAccount(custVendAC),
                         // End of patch
                         newCustVendTrans);
    }
}
[/geshi]

There's a divide by zero error that has existed since Axapta 3.0 in \Classes\CustVendExchAdjTrans\postLedgerTrans, around line 113. The following is an excerpt of that method highlighting the patch (original code is commented):

[geshi lang=xpp]
// amountMST = amountMST * 100 / totalAmountMST;
// Start of patch
if (totalAmountMST)
{
    amountMST = amountMST * 100 / totalAmountMST;
}
// End of patch
[/geshi]

Finally, \Classes\CustVendExchAdjTrans\exchAdjustTrans contains both a divide-by-zero bug and a hard-coded presumption that the company's default currency is USD. The following excerpt starts at line 20 of the method, with the original code commented out as usual:

[geshi lang=xpp]
// reverseAmountMST = Currency::amount((-custExchAdjustmentUnrealized *
//     custVendTransOpen.AmountCur / (custVendTrans.AmountCur - custVendTrans.SettleAmountCur)), 'USD');
// Start of patch
if ((custVendTrans.AmountCur - custVendTrans.SettleAmountCur) != 0)
{
    reverseAmountMST = Currency::amount((-custExchAdjustmentUnrealized *
                                         custVendTransOpen.AmountCur /
                                         (custVendTrans.AmountCur - custVendTrans.SettleAmountCur)),
                                        standardCurrency);
}
else
{
    reverseAmountMST = 0;
}
// End of patch
[/geshi]

Financial statement configuration problems

The configuration for financial statement reports (in the General Ledger menu) has changed for the better, but in the process the upgrade jobs don't handle the conversion of the old configuration very well.

For Axapta 3.0, accounts were configured in lists defining ranges between two ledger account numbers. In Dynamics AX, this is translated to one very long range field for use within a query, and in most cases the configuration must be rebuilt using the new methodology rather than relying on these range fields.

Since the standard length of this range string (data-type Criterias) is 100 characters, the upgrade job quite often runs out of space and truncates the range. As a temporary solution, to avoid having to rebuild the configuration, I increased the length of this data-type and wrote a small job to rebuild the new configuration slightly more intelligently:

[geshi lang=xpp]
static void SDB_RepairFinancialStatementRows(Args _args)
{
    DictType                    dictType;
    int                         maxLen;
    LedgerRowDefLine            ledgerRowDefLine;
    LedgerTableAlternativeTrans ledgerTableAlternativeTrans;
    LedgerTableInterval         ledgerTableInterval;
    str                         newCriteriaPos;
    str                         newCriteriaNeg;
    str                         newCriteria;
    int                         newCriteriaLen;
    int                         longestLen = 0;
    ;

    // To be safe, work out what the maximum length is *now*..
    dictType = new DictType(extendedTypeNum(DimensionsAccountCriteria));
    maxLen   = dictType.stringLen();

    info("Maximum criteria length is: " + int2str(maxLen));

    ttsbegin;

    // Probably not the best way of doing this, but anyway..
    while select forupdate * from ledgerRowDefLine
        where ((ledgerRowDefLine.AccountCriteria != "") &&
               (ledgerRowDefLine.Type == DimensionsLedgerDimensionType::Element))
    {
        // Reset the criteria..
        info("Existing criteria: " + ledgerRowDefLine.AccountCriteria);

        newCriteriaPos = "";
        newCriteriaNeg = "";

        while select * from ledgerTableAlternativeTrans
            where ((ledgerTableAlternativeTrans.ChartOfAccounts == ledgerRowDefLine.RowDefinition) &&
                   (ledgerTableAlternativeTrans.Txt == ledgerRowDefLine.Name))
            join ledgerTableInterval
                where ((ledgerTableInterval.AccountTableId == tablenum(LedgerTableAlternativeTrans)) &&
                       (ledgerTableInterval.AccountRecID == ledgerTableAlternativeTrans.RecId))
        {
            info(" ++ From: " + ledgerTableInterval.FromAccount + " To: " + ledgerTableInterval.ToAccount);

            // If this is an 'invert sign' we will 'not' the criteria
            if (ledgerTableInterval.ReverseSign)
            {
                // If we already have something in this field, add a delimiter
                if (newCriteriaNeg != "")
                {
                    newCriteriaNeg += ',';
                }

                newCriteriaNeg += '!';

                // Is the account the same?
                if (ledgerTableInterval.FromAccount == ledgerTableInterval.ToAccount)
                {
                    // Just add the account..
                    newCriteriaNeg += ledgerTableInterval.FromAccount;
                }
                else
                {
                    // Add the accounts as a range
                    newCriteriaNeg += (ledgerTableInterval.FromAccount + ".." + ledgerTableInterval.ToAccount);
                }
            }
            else
            {
                // If we already have something in this field, add a delimiter
                if (newCriteriaPos != "")
                {
                    newCriteriaPos += ',';
                }

                // Is the account the same?
                if (ledgerTableInterval.FromAccount == ledgerTableInterval.ToAccount)
                {
                    // Just add the account..
                    newCriteriaPos += ledgerTableInterval.FromAccount;
                }
                else
                {
                    // Add the accounts as a range
                    newCriteriaPos += (ledgerTableInterval.FromAccount + ".." + ledgerTableInterval.ToAccount);
                }
            }
        }

        // Add the positive and negative criteria
        if (newCriteriaPos && !newCriteriaNeg)
        {
            newCriteria = newCriteriaPos;
        }
        else if (!newCriteriaPos && newCriteriaNeg)
        {
            newCriteria = newCriteriaNeg;
        }
        else if (!newCriteriaPos && !newCriteriaNeg)
        {
            newCriteria = "";
        }
        else
        {
            newCriteria = newCriteriaPos + ',' + newCriteriaNeg;
        }

        // If we have hit the maximum criteria length, abort..
        newCriteriaLen = strlen(newCriteria);

        if (newCriteriaLen >= maxLen)
        {
            ttsabort;
            throw error("Criteria is too long to store (need at least " + int2str(newCriteriaLen) + " chars)");
        }

        // For statistics, remember the longest length we dealt with :)
        if (longestLen < newCriteriaLen)
        {
            longestLen = newCriteriaLen;
        }

        // Copy the new criteria over
        ledgerRowDefLine.AccountCriteria = newCriteria;

        // No criteria? Just leave it, it would be one account I think..
        if (ledgerRowDefLine.AccountCriteria == "")
        {
            info(" !! Skipped");
            continue;
        }

        // Update the row definition
        info(" == Transformed: " + ledgerRowDefLine.AccountCriteria);
        ledgerRowDefLine.update();
    }

    ttscommit;

    info("Complete! Longest criteria length was " + int2str(longestLen));
}
[/geshi]

You can download my job, including the adjusted data-type. Please note that this kludge worked for us but might not work for you, so test it first and don't blame me if it goes wrong.

The job will determine the maximum length of the range string, and if it cannot store the range it generates correctly it will tell you what size to expand your range string to.

Because a different data-type is erroneously used on the configuration forms than on the table itself, opening the financial statement configuration forms and saving them again will truncate the range. I left this alone, since I'd rather our accounts department rebuild the configuration properly than have a quasi-permanent kludge in the system.

Payment proposals fail to propose payments

Payment proposals for vendor payments up to a certain due date fail if the specified due date is prior to the system date. This worked in the previous version, so payments could now be missed.

The work-around for this is to change the system date to the requested due date prior to creating the proposal.
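
If you drive the proposal from code, the same trick can be wrapped in a small job. This is only a sketch of the idea; the job name and hard-coded due date are mine, and the actual proposal step is left to you:

[geshi lang=xpp]
static void SDB_ProposalSessionDateWorkaround(Args _args)
{
    date originalDate     = systemDateGet();
    date requestedDueDate = mkdate(31, 12, 2006);   // the due date you want to propose up to
    ;

    // Move the session date back so due dates before "today" are picked up
    systemDateSet(requestedDueDate);
    info(strfmt("Session date temporarily set to %1", requestedDueDate));

    // ... create the vendor payment proposal here, manually or from your own code ...

    // Restore the real session date afterwards
    systemDateSet(originalDate);
    info(strfmt("Session date restored to %1", originalDate));
}
[/geshi]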

Journal balances are not updated correctly within some journals

The balance fields in the top left-hand corner of some journal forms do not display the balance correctly until the journal is posted or validated. Obviously this is a minor bug, and the work-around is simply to revalidate the journal. The actual figures within the journal don't change; this is purely a display fault.

Double negative figures shown for negative debit/credit

In many places, the AmountCur data-type may display a double negative figure for a negative debit or negative credit value. For example, if “-10.00” was posted as a credit, it could be displayed as “--10.00” (which, read literally, turns it into a positive value of “10.00”).

This is most prominent for us within our customer/vendor transaction forms where negative debits/credits have been posted. Even more problematic is that when data is copied from these forms into Excel, it can produce incorrect results, as Excel interprets the double negative as a positive, just as you would expect.

Microsoft is aware of this issue; however, I'm told it also existed, to a lesser extent, in previous versions of Axapta 3.0.

Intrastat compression differs when Italian configuration is enabled

In our system configuration, we have Italian localisations enabled for the Italian company; however we produce Intrastat reports from our Belgian company.

Within the Intrastat form, a compression button is provided; however, behind the scenes there are in fact two compress buttons whose availability is toggled in the form's init() call. When the Italian configuration is enabled, you receive the Italian form of the button; otherwise the standard compression button is shown.

Microsoft presumes, incorrectly, that because the Italian localisation is enabled in the system configuration, we must be an Italian company. We have since disabled this code, which is easy to spot in \Forms\Intrastat\Methods\init.

Conclusion

We still have some outstanding but minor problems with the system, including discrepancies in how Dynamics AX now validates VAT numbers (Dutch VAT numbers require no country code prefix on the VAT number, for example), but none of our current problems are considered show-stoppers and we're happily using the new version of Dynamics AX.


Categories X++, Axapta

Comments

  1. Hi, I am also doing a code upgrade from 3.0 to 4.0. I am facing a problem while upgrading the BM classes (Benchmark classes) because they have been deleted in 4.0, so if you have an idea please let me know. Thanks, Deep
  2. (Author)

    Hi Deep, The benchmark classes were removed from Dynamics AX 4.0 completely; however, there is an external tool called the _Benchmark Toolkit for Microsoft Dynamics AX 4.0_ which is available via PartnerSource. I haven't yet been able to try this as I sadly don't have access to this site, but I understand it uses the standard _Visual Studio Team Suite_ and _Visual Studio Team Test Load Agent_ components. The old benchmark classes suffered from being a component within the very product they were benchmarking, and often didn't present an accurate picture of how the system was performing. If it's quick and dirty benchmarking and profiling you need, you can still use the _Code profiler_, which is still available in the development menu. If it's the upgrade itself that's not working, the first step would be to make sure you've upgraded to the latest Axapta 3.0 service pack first, and that you have the _"Keep update objects"_ configuration key enabled prior to any AOD synchronisation or data conversion. If you've used the old benchmark classes in your code, you will need to remove any references moving forwards. Cheers, - Simon
  3. Hi, I want to know the pitfalls when upgrading AX 3.0 to 4.0, especially with reports. Thank you
  4. (Author)

    Hi Sunee, You shouldn't expect too much trouble, although that depends heavily on how much you've customised, and what you've customised. For reports, the only major thing you may run into, other than what's outlined in this blog post, is that counting fields should be of type "Int64" (64-bit integer). You will need to recreate integer report fields that are based on record IDs and so forth to switch from 32-bit integers to 64-bit integers. If you have built reports that use actual SQL code rather than pure X++, you will need to keep an eye on the new security involved, or rewrite the report. If reports are important to you, you may find the SQL Reporting Services integration built into Dynamics AX 4.0 extremely useful! Good luck with your upgrade! - Simon
  5. Hi, I am upgrading from 3.0 SP2 to 4.0 SP1. A lot of our customizations depend on 3.0 SYS-layer classes like "SalesCreditLimit", "BMTablefill", etc., and they are no longer present in 4.0. My question is: how do I know which 4.0 classes replace the functionality of those classes that are no longer there, and how do I resolve this problem, given that a lot of customizations depend on these types of classes?
  6. (Author)

    Hi Naushad, You're in for a difficult time, I think, because a large number of changes were made in those areas, and many of them are undocumented. In fact Microsoft decided to remove large portions of existing documentation for classes and functions that once existed in _Axapta 3.0_ and as such this documentation no longer exists in _Dynamics AX 4.0_. Off the top of my head, I know that credit limits on sales orders were changed so limits are no longer calculated on the lines but off the header. This code has changed so much that you will need to find new points within the modified API. On the other hand, the BM* classes no longer exist, and all related functionality has been moved to software outside of _Dynamics AX_ (available from _PartnerSource_). I cannot comment on your specific customisations, but I think your best option is to run _Axapta 3.0_ and _Dynamics AX 4.0_ in parallel during your migration and hunt down the core API in the _3.0_ system and where it has changed in _4.0_. Fortunately if you do comparisons on your classes, you should still be able to select "sys (old)" layers if your upgrade has gone well, allowing you to see some of the _3.0_ code within the _4.0_ environment. A key thing I'm teaching junior developers is how to isolate customisations into a set of your own classes with your own APIs, and only hooking them into the standard functionality through one or two lines of code added into standard methods. This way, you will save yourself a lot of heartache in future upgrades, and I anticipate much has changed again in _Dynamics AX 2009_ (scheduled for release over the coming months). Good luck! - Simon

Commenting has expired for this article.