The 2017 Dynamics 365 Tech Conference Review by Scott Morley
AX and Azure Mashup
For those of us fortunate enough to attend the Dynamics 365 for Operations Tech Conference in Seattle last week, it was a great learning experience. Microsoft shared a ton of content, and I, personally, had the chance to catch up with some old friends and make some new ones.
If you weren't able to attend, or maybe missed out on some of the sessions because of conflicts, read on; I have shared what I got out of the sessions I attended. If there is something you feel I missed, or you want to share your favorite part of the conference, feel free to leave a comment!
The most prevalent topic discussed (even more than local business data) was definitely LCS. The Lifecycle Services portal (lcs.dynamics.com) plays such a critical role in everything D365 that if you haven't logged in there yet, you will soon. LCS is where you can track your project, deploy and monitor your environments, submit support requests, and even translate AX label files. It's included with your AX licensing, and it works with AX 2012 as well as Dynamics 365 for Operations. Microsoft is currently on a monthly update track for LCS, and the features being added are significant. If you've used LCS in the past but have not been in there recently, the updates you missed include the ability to deploy 2012 to an Azure ARM subscription, alerts for service health issues, and advanced SQL troubleshooting with SQL Now. On the roadmap, Microsoft will be adding self-service options for some of the common requests they currently receive around data refreshes and system restarts. One really important announcement is that LCS will be required to manage a local business data (on-premises) deployment of D365, which is all the more reason to log in now and check it out.
Speaking of the local business data deployment, there was quite a lot of discussion around the offering and how it will work. With an expected release in Q2, Microsoft will be offering three distinct deployment scenarios for D365: cloud only, cloud and edge, and local business data. The cloud offering is where we started, with everything living in Azure. Cloud and edge will offer the ability to run some functionality, like retail and manufacturing, with a local presence, allowing for offline survivability and improved performance. The cloud component will still be required for data consolidation and analytics, but edge systems can survive offline for up to 16 hours; beyond that, the synchronization tasks will likely be too great to catch up on and manual intervention will be needed. On a side note, the edge component is configured per company (DataAreaID), and certain functionality will be disabled in the cloud for an edge company to prevent data update conflicts.
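Since the edge boundary follows company scoping, it helps to remember how DataAreaID partitions business data in X++. A minimal sketch, assuming the standard CustTable and an illustrative account number:

```xpp
// Most business tables are striped by company through DataAreaID.
CustTable custTable;

// Scoped to the current company (the default behavior).
select firstOnly custTable
    where custTable.AccountNum == 'US-001'; // illustrative account

// Spanning companies requires an explicit crossCompany clause; this is
// the kind of per-company boundary an edge deployment relies on to
// avoid update conflicts.
select firstOnly crossCompany custTable
    where custTable.AccountNum == 'US-001';

info(strFmt("Found in company %1", custTable.DataAreaId));
```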
The last option, local business data, will run very much like an on-premises deployment of AX 2012, with the exception that LCS connectivity will be required. No business data will be moved to the cloud; however, patching, deployments, and the like will still go through LCS. Additionally, some of the more advanced BI/analytics and data consolidation features will not be available in the local business data offering. A single instance of AX is also a current limitation; that may change, but there was no discussion of when. This doesn't mean that if you go with a local business data deployment you are stuck there forever. The connection to Azure is always available, it just starts turned off. If you wish to move your data to the cloud at a later time to gain the additional benefits, it is a simple flip of a switch and the data will sync to Azure.
With all the great features, and now multiple deployment options, there has been a lot of interest in the upgrade story, and that was discussed in detail as well. Depending on where you start (2009 or 2012), the story is a little different, but at least now there is a story. If you are on a version older than 2009, don’t despair! A roadmap item is to extend the 2009 upgrade model to include older versions.
For 2009, the upgrade is really more of a migration. There is no code upgrade tool, and the data tools are all about selecting which master data to import. Transactions are not on the list for import, though schema customizations for the master data will be carried over. To make this work, Microsoft ported DIXF (Data Import/Export Framework) back to 2009, exposing a list of entities that map the data forward. The migration can be run for all legal entities, or you can select specific ones, meaning you can stagger the migration. Virtual companies cannot be ported over, as that feature is no longer available in D365; Power BI is the recommended path forward for roll-up reporting. Even though the tool will do the vast majority of the work, including data validation, there are still some limitations due to the functionality differences between 2009 and 2012. For example, dimensions will need to be manually mapped. Migration of Retail, WMS, and security data is still on the roadmap, so for now those would need to be handled manually. As output, the tool provides data packages that can be added to a project and deployed through LCS.
For 2012, the story is more complete. It is a true upgrade process that includes code and data, meaning that things like ElementIds will remain intact. The code upgrade component will automate some of the code conversion, especially on new objects; however, in the case of code that has been overlayered (modifications to SYS or ISV objects), some manual work is likely to be needed. This is because Microsoft is moving development to the extension model (more on this later), and overlayering will not be supported in D365 for long. The data upgrade is pretty straightforward, as the schema between 2012 and D365 is largely the same. In the same manner as a 2009-to-2012 upgrade, modifying or creating upgrade scripts to handle custom tables/fields will be required; a sketch of what that can look like follows below. As an added bonus, the upgrade analyzer will recommend places to reduce or eliminate unused data in order to shorten the upgrade.
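To give a flavor of what an upgrade script involves, here is a minimal, hypothetical sketch in the style of the ReleaseUpdateDB framework that 2012 data upgrades use. The class name, method, field, and default value are all mine, and the exact attribute set may differ in the D365 tooling, so treat this as illustration rather than a recipe:

```xpp
// Hypothetical sketch of a data upgrade script for a custom field.
// Assumes a custom field MyCustomField was added to SalesTable and
// needs a value populated during the upgrade. Verify the attribute
// names against the actual upgrade tooling before relying on this.
class ReleaseUpdateDB70_MyMod extends ReleaseUpdateDB
{
    [UpgradeScriptDescriptionAttribute('Populate MyCustomField on SalesTable'),
     UpgradeScriptTypeAttribute(ReleaseUpdateScriptType::StandardScript)]
    public void updateSalesTableMyCustomField()
    {
        SalesTable salesTable;

        ttsbegin;
        // Give legacy rows a sensible default where the field is empty.
        update_recordset salesTable
            setting MyCustomField = 'DEFAULT'
            where salesTable.MyCustomField == '';
        ttscommit;
    }
}
```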
If you are interested in working with either of these tools, the best bet is to contact your partner. The 2009 tool is currently in a preview program, with general availability expected in May. There is also a preview program for the 2012 version; however, it is still in early development and will only work with 2012 R3. GA for the 2012 tool is July, with support for versions of 2012 prior to R3 coming shortly after that.
For the developers present, there was a completely separate track available, as the most significant change in D365 (UI aside) is how code is written. In all previous versions of AX, overlayering was the way to modify existing objects. For example, if I added a field to SalesTable, the table moved up to a higher layer (like VAR or CUS) and AX would then only look at that instance of the object. This worked really well; however, it made hotfix application and upgrades a significant project, because all of the new Microsoft code needed to be compared against custom objects to see whether the new code needed to be merged in.

To resolve this, Microsoft has moved to the extension model, which means no system code is ever directly modified. Under the new model, when a new Microsoft change comes in, it gets applied to the base code without issue and the extensions still run. Of course, regression testing still needs to be done, but the code merge step is removed. Anyone familiar with event-driven programming has already been working with this type of model, but it is a significant change for AX.

Microsoft's plan is to lock all of the base code some time in 2018 so that no overlayering can happen at all. Currently, in the interest of supporting upgrades, overlayering is still offered in limited situations. Right now, all SYS objects are locked, so any modifications to those objects will need to be converted to the extension model for code to compile. This holds true for upgrade scenarios as well: the upgrade tool will not be able to identify what needs to change, so it will fail the object and a developer will need to fix it. The good news is that will be the last time a code merge has to happen. Unless you extend objects also extended by ISVs…but I digress. One last caveat: the triggers that drive the event model do not yet cover every kind of change that could be made with overlayering. Until Microsoft provides that final coverage, which they are working on, overlayering may be the only option. Something to keep in mind if you are undergoing a code conversion.
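To make the contrast concrete, here is a minimal sketch of the extension approach: instead of overlayering SalesTable.insert(), you subscribe to a data event. The class name and the logic inside the handler are my own illustration; the handler signature is the standard D365 one:

```xpp
// Custom logic runs when a record is inserted into SalesTable,
// without touching the SYS code. A Microsoft hotfix to SalesTable
// no longer forces a merge with this code.
class MySalesTableEventHandler
{
    [DataEventHandler(tableStr(SalesTable), DataEventType::Inserted)]
    public static void salesTable_onInserted(Common sender, DataEventArgs e)
    {
        SalesTable salesTable = sender as SalesTable;

        // Illustrative post-insert logic.
        info(strFmt("Sales order %1 was created", salesTable.SalesId));
    }
}
```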
The last topic that I can remember (there was a lot of information…) is integrations. With the move to a cloud model, certain types of integrations are no longer available; they simply can't work. Instead of trying to provide workarounds for specific situations, Microsoft has developed a whole new suite of integration tools based around the Common Data Model (CDM) and the Azure service that uses it, the Common Data Service (CDS). In a nutshell, the CDS exposes data entities that can be used to read or write data from almost any system, not just D365. By using Flow and Logic Apps/PowerApps, integrations can be automated and scheduled, allowing information to move seamlessly across systems. This is a topic way too big for this post, so if you are interested in learning more, head over to https://docs.microsoft.com/en-us/common-data-service/entity-reference/introduction
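Those same data entities are what D365 surfaces through its OData endpoint for external callers, and they are also directly usable from X++, where they behave much like views. A minimal read sketch; the entity and field names are assumptions on my part, so check the data entity list in your environment:

```xpp
// Data entities can be queried in X++ much like tables or views.
// CustCustomerEntity and its fields are assumed names; look up the
// real entities in your environment's data entity list.
CustCustomerEntity customer;

select firstOnly customer
    where customer.CustomerAccount == 'US-001'; // illustrative account

info(strFmt("Customer name: %1", customer.OrganizationName));
```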
As with any tech conference, a number of questions were asked that didn't really have answers yet. With all of the new innovations Microsoft is cramming into D365, this was to be expected, and it happened quite often. At times it felt like the most popular answer was "that's coming". Not always what we want to hear, but a lot better than "no". If you still have questions about what Dynamics 365 for Operations can do for you, ask your partner or reach out to the community. A lot of great discussions are happening on the AXUG forum (www.axug.com). The answer may still be "I don't know" or "that's coming", but as more and more people start experiencing D365, that will change.