Monday, May 20, 2019

Round-up of data consumption in Microsoft Dynamics 365 Customer Engagement



I thought I should round up the “Minimal copy” posts with some sort of conclusion, or rather a recap. The background is that a minimal copy of a production instance wound up at 4.3GB, which was later reduced by roughly 3GB.

The question behind this text is “What are we paying for?” when a subscription of Microsoft Dynamics 365 Customer Engagement is started. This has actually just changed, but I still have some questions. As of now, or when your subscription renews, you get 10GB of database capacity, 20GB of file capacity and 2GB of log capacity per subscription.
The first change is that “attachments to notes or emails in Dynamics 365 Customer Engagement applications and PowerApps” (quoting the licensing document) no longer count as database storage, which is a good thing. I don’t know how many customers have wanted to delete attachments, and there is even a solution for moving attachments to SharePoint, which is still a good idea since SharePoint is better at storing documents.

The next nice thing is that logs are counted as a separate store. I can’t say off the top of my head how far 2GB of logs will go, since it depends on how much auditing you do. The possible downside is that you don’t get any extra log storage from buying more Dynamics 365 Customer Engagement licenses (I will address this later).
Now, database storage. 10GB of database storage, what does that mean? Again from the licensing document: “Common Data Service Database Capacity stores and manages entity definitions and data. The tenant for Dynamics 365 Customer Engagement plan and application subscriptions includes by default 10GB for database capacity.”

Ok, entity definitions and data. This means we are paying for the data we put into the system and for the schema: not only our customizations but the standard schema as well. That might very well be reasonable, but I don’t think it’s what most customers think they’re paying for, so it’s good to know if you’re a partner, and good to know if you’re a customer.

This is where things get interesting to me, because a completely out-of-the-box new instance of Dynamics 365 v9 on-premise (with a Latin collation of some sort, since Finnish_Swedish isn’t working at the moment) is roughly 1.1GB, compared to around 0.6GB for Dynamics 365 v8.2. This roughly 1GB footprint for an empty instance also holds for Dynamics 365 Customer Engagement online, which I actually thought was a bit much when discussing it with support, but it turns out the vanilla no-data instance has doubled in size in the latest version. So when we look at 10GB of data per subscription, more than 1GB of it is taken by each instance we have. For example, with 1 production instance and 2 sandboxes (1 development and 1 test), 3GB of the capacity would be consumed just by the instances themselves. This is good to know, at least.
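To make the arithmetic concrete, here is a trivial sketch of how much of the 10GB database capacity is left for actual data once each instance’s empty baseline is subtracted. The 1.1GB default is the rough footprint observed in this post, not an official number:

```python
def free_database_gb(total_gb: float, instances: int,
                     baseline_gb: float = 1.1) -> float:
    """Capacity left for actual data after each instance's empty baseline.

    The 1.1GB default is the rough footprint of an empty v9 instance
    observed in this post; your tenant may differ.
    """
    return total_gb - instances * baseline_gb

# 1 production + 2 sandboxes against the 10GB base allocation
print(round(free_database_gb(10, 3), 1))  # 6.7
```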

Moving on with the “licensing experience” (ok, I don’t know if it’s still called that, but it’s too good a name not to use; it comes from a presentation of Dynamics CRM 2013): you get an additional 0.25GB of database storage and 2GB of file storage per Enterprise license. No log storage is added by having licenses. Extra storage is bought per GB and per type, and I haven’t been able to find a price for it, but that might be bad Google-fu on my part (sorry, Bing-fu doesn’t sound as catchy).
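The allocation rules above are easy to mis-remember, so here is a small sketch that turns them into numbers. The figures are the ones quoted in this post; check the current licensing guide before relying on them:

```python
def tenant_capacity(enterprise_licenses: int) -> dict:
    """Tenant-wide capacity based on the figures quoted in this post.

    Base allocation: 10GB database, 20GB file, 2GB log. Each Enterprise
    license adds 0.25GB database and 2GB file; log capacity is flat.
    """
    return {
        "database_gb": 10 + 0.25 * enterprise_licenses,
        "file_gb": 20 + 2 * enterprise_licenses,
        "log_gb": 2,
    }

# Example: a tenant with 20 Enterprise licenses
print(tenant_capacity(20))
# {'database_gb': 15.0, 'file_gb': 60, 'log_gb': 2}
```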

When I started digging into this, I was able to download a report from the Power Platform Admin Center (one of three or four admin centers at the moment) that showed which tables the data was consumed in. That report has since been removed: Microsoft moved the storage view, under a new name, to a new portal that doesn’t show as much information as it used to, yet at least. These portals and information flows tend to shift over time. The result, as of 2019-05-20, is that you can’t tell where your data is consumed. What the report didn’t show, though, was how much data was consumed by “not your data”, i.e. the system, so a completely empty instance will eat some 1GB of database storage, as I wrote earlier. And with the report removed, I can’t tell what a completely empty instance would show as consumed for, say, contacts.
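While the report existed it could be exported and sorted offline; below is a sketch of that kind of analysis. The CSV column names and figures are made up for illustration, not the actual report schema:

```python
import csv
import io

# Hypothetical export shaped like the removed per-table report;
# the column names and numbers are invented for this example.
sample = """TableName,SizeMB
AuditBase,850
ContactBase,400
contoso_paymentBase,1200
"""

def top_tables(report_csv: str, n: int = 3):
    """Return the n largest tables as (name, size_mb), biggest first."""
    rows = list(csv.DictReader(io.StringIO(report_csv)))
    rows.sort(key=lambda r: float(r["SizeMB"]), reverse=True)
    return [(r["TableName"], float(r["SizeMB"])) for r in rows[:n]]

print(top_tables(sample, 2))
# [('contoso_paymentBase', 1200.0), ('AuditBase', 850.0)]
```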

Conclusions 

This all boils down to the fact that there’s no real way of telling what you actually consume; we have to rely on Microsoft telling us what is in our instances, which in this case wasn’t entirely true, and I had to push back rather hard to make them accept that 4.3GB of data for an empty instance is not reasonable. It’s easy to suspect that something is wrong in an instance you would expect to be empty, but in an instance filled with data it’s quite the opposite.
In this case we made a minimal copy from production to an empty instance and started filling it with data from the source systems, intending to use that instance as the new production, so we cleared the “old production”. After a while we decided to redo the migration and made a minimal copy from the just-filled instance back to the “old production”, and when that was done, the sort-of-reliable point of comparison was erased. So for most cases you risk getting this issue in the instances you set up from another instance, and in most cases that shouldn’t be all your instances, even if it’s not unthinkable that it’s more than one.
The per-table data usage report that was removed will hopefully be re-introduced shortly.




Rickard Norström
Developer at CRM-Konsulterna
www.crmkonsulterna.se 

Wednesday, May 15, 2019

Minimal copy follow-up

I opened a support case about the minimal copy issue mentioned earlier. About a week later, after some rather heated discussions with support, a bit over 3GB was removed from each instance.

Talking to the support crew was at times interesting, to say the least. That being said, I hold no grudge whatsoever against the people I talked to, since it wasn't their work I was debating but rather the back end.

The first answer I got was that it was data from the instance I was copying to, since that wasn't empty when I started. There was no real explanation, though, of why I couldn't access any of that data to remove it.

Answer number two was that it was metadata and customizations, but when I asked why the development instance was a mere 1.3GB, that didn't really get answered either. I think 1.3GB is on the heavy side too, considering that a somewhat filled Dynamics 365 Customer Engagement 8.2 on-premise database is around 0.8GB, so I also asked how an "empty" instance can be almost twice the size of an on-premise one. After an hour-long screen-sharing session where I reset the test instance and then did a new minimal copy from production, which gave the same result as before and, I assume, killed the idea that old data was taking up the space, the case was still open.
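On-premise, at least, you don't have to guess: per-table sizes can be read straight from SQL Server. The sketch below just generates the T-SQL (standard SQL Server DMVs; run the output in SSMS against the organisation's *_MSCRM database):

```python
def largest_tables_sql(top_n: int = 10) -> str:
    """T-SQL listing the largest tables by reserved space, in MB."""
    return f"""
SELECT TOP ({top_n})
    t.name AS TableName,
    SUM(ps.reserved_page_count) * 8 / 1024.0 AS ReservedMB
FROM sys.dm_db_partition_stats AS ps
JOIN sys.tables AS t ON t.object_id = ps.object_id
GROUP BY t.name
ORDER BY ReservedMB DESC;
""".strip()

print(largest_tables_sql(5))
```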

At this time the Power Platform Admin Center wasn't available any longer for this instance, and I don't think it's back up yet either, so I could no longer see which tables had the biggest impact. The Dynamics admin center did tell me that the size was still on the large side, though.

After a little more than a week the size was considerably smaller: a little over 3GB per instance had been shaved off, and the case was closed. I didn't get any information on what data was removed, other than that scripts had been run on the back end to recalculate the size of the instances.

The moral of this is that it's a good idea to have a look at your instances after a minimal copy to see if they are around the 1GB mark, unless you have a shitload of customizations, of course. Unfortunately, the answers given by the support team aren't always correct, and it helps to have looked at how the databases behave. What annoys me most in all of this is that I twice got answers that didn't really tell the truth, and that Microsoft charges by the GB (in this case 6 of them) that were just air.

Rickard Norström
Developer at CRM-Konsulterna
www.crmkonsulterna.se 

Monday, April 29, 2019

Minimal copy of an instance, and what the result is

We've been working a bit with migration over the past couple of weeks, and as part of that we've done a few instance copies using "Minimal Copy Instance" from the Administration center. This resulted in a sandbox instance with 4.5GB of data, which seemed odd.

What we did was copy the production instance to a sandbox counterpart using the minimal copy option in the Administration center. According to Microsoft's documentation at https://docs.microsoft.com/en-us/dynamics365/customer-engagement/admin/copy-instance, the entities that are copied are the following:
  • BusinessUnit
  • ConnectionRole
  • Currency
  • DuplicateRule
  • DuplicateRuleCondition
  • EmailServerProfile
  • FieldPermission
  • FieldSecurityProfile
  • ImportMap
  • InternalAddress
  • Mailbox
  • Organization
  • Position
  • Report
  • Resource
  • ResourceGroup
  • Role
  • RollupField
  • SavedQuery
  • SLAKPIInstance
  • Solution
  • Subject
  • Team
  • TeamTemplate
  • Template
  • SystemUser
Now, I had a look at the storage report of the instances, and we have three of them: one development, one test and one production. These are 1.5GB, 4.5GB and 8.5GB respectively, which is sort of interesting since the test one should be completely empty of data.

Digging deeper, I had a look at what data is present, and there's quite a lot of audit logs according to the admin center, 850 MB of them, but when I look at the audit logs in the instance, I can only find 250MB. Looking at the entity list, audit logs aren't present there, making it even more confusing.

The top three biggest tables are contoso_paymentBase, ContactBase and AuditBase (contoso_paymentBase is really called something else, but it's a custom entity, or rather "contoso_payment" is). These should be the SQL tables behind the entities "contoso_payment", "Contact" and "Audit", but how can they be so large considering there are NO records in the first two, and the 250 MB in the Audit table that I can see doesn't explain 800 MB?
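One way to sanity-check a claim like "there are no records" is to count through the Web API. The sketch below only builds the query URL and leaves the actual call guarded, since it needs a live org and a token (both placeholders here); note that @odata.count is capped at 5000 records, so this shows whether a table is empty, not how big it is:

```python
import json
import urllib.request

def count_url(org_url: str, entity_set: str) -> str:
    """Build a Web API query asking for a record count ($count=true)."""
    return f"{org_url}/api/data/v9.1/{entity_set}?$count=true&$top=1"

def fetch_count(org_url: str, entity_set: str, token: str) -> int:
    """Fetch @odata.count for an entity set (capped at 5000 by the API)."""
    req = urllib.request.Request(
        count_url(org_url, entity_set),
        headers={"Authorization": f"Bearer {token}",
                 "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["@odata.count"]

if __name__ == "__main__":
    # Requires a real org URL and bearer token, e.g.:
    # fetch_count("https://contoso.crm.dynamics.com", "contacts", "<token>")
    print(count_url("https://contoso.crm.dynamics.com", "contacts"))
```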

I started a support case with Microsoft about this. At the first contact I was told to reset the instance, which isn't really what I want, since I want the metadata listed above plus all customizations. Microsoft was going to investigate and come back to me.

Rickard Norström
Developer at CRM-Konsulterna
www.crmkonsulterna.se 

Wednesday, April 24, 2019

SSIS error: "The binary code for the script is not found" on SQL Server

Hey, long time no see (if you're reading this). It's been quite hectic the past six months, so I haven't been doing much blogging, as you might have noticed, but now it's time again. This time it's not really Dynamics 365, but related, as integrations are one major thing I'm doing.
I've been working on an integration using SSIS for some time now, and we've finally gotten to the point where it's time to push to production. Then this error appeared.

To sum things up quickly: I have two packages that handle the integration, one per source system, and they read customer info and purchase info that our customer wants to see inside CR... ehm, Microsoft Dynamics 365 Customer Engagement (powered by Power Platform?). I will go through my steps, but the solution is really at the end of the post, so by all means scroll down.

They both worked when I ran them in Visual Studio in "debug mode", and the first one worked very well when deployed to the SQL Server that will orchestrate the integration.

When I deployed the second package, I got a validation error saying "The binary code for the script is not found", pointing at a script component in the package as "script component [5212]" (yes, I was lazy and didn't name it after what it did, which turned out to be a bad idea). I did what most developers do: searched the internet, and there are quite a few hits for the error.


I visited most of the top hits, but they didn't really help, since most of them address issues when running the packages inside Visual Studio, and that wasn't my problem: it worked in Visual Studio but not on the SQL Server. They all say roughly "change something in the script code, recompile the script and then recompile the package", with a few variants talking about pre-compiled code, which doesn't seem to be a factor any longer.

Eventually I deleted and recreated the script component, which of course made the entire package misbehave, since you're apparently not supposed to make changes to an SSIS package.
This didn't resolve anything either. One thing I did notice, though, is that the script component ID in the error message didn't match the ID of the component in the package, which looked rather strange.

The next step was to create a new solution, import the package and compile it. At this point I thought it was silly for both packages to have the same name when they shared a solution, so I renamed the package file; again I had been lazy and just left it called package.dtsx. After moving it to the server and installing it, the package validated and ran just fine.

Solution: don't be lazy, name things after what they do. Lesson learned: error messages in Microsoft's products aren't always brilliant, even outside Dynamics.

Rickard Norström
Developer at CRM-Konsulterna
www.crmkonsulterna.se