Friday, December 29, 2017

Debugging JavaScript in Dynamics 365 Outlook App

Here's a quick tip on how to debug your custom JavaScript code in the Dynamics 365 App for Outlook:
  1. Right-click in the page/form shown by the Dynamics 365 app and select the Debug option.
  2. Select Visual Studio as the debug tool to use.
  3. In Visual Studio, in the Exception Settings window, select all options/boxes for the JavaScript Runtime Exceptions option.
  4. In Visual Studio, press F5 to continue running the Outlook App. Perform an operation that leads to the exception/error in your code or to a "debugger" statement that you've added to the code.
  5. You can then proceed to step through the code, get variable values, etc.
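
If you're adding a "debugger" statement (step 4), it can be as simple as the following. This is a hypothetical form script; the function name and logic are placeholders, not from any actual app:

function onFormLoad() {
    debugger; // Visual Studio breaks here once it's attached as the script debugger
    // ... the custom logic you want to step through ...
}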

Tuesday, August 15, 2017

Identify differences between Dynamics 365 (CRM) instances

Goal

You want to identify differences (entities, fields, picklists, etc.) between two CRM Online instances (e.g., production and a sandbox).

Possible Solution

You might be able to follow the approach detailed on Develop 1's blog, but if your default solution zip file never appears or you get an error (as I do quite often these days with the default solution), then here's another approach.

Steps

Note that these steps are targeted toward CRM developers. That is, until someone writes an XrmToolBox plugin that does the same thing, or does it better.
  1. Export entities as solution zip files: The Dynamics 365/CRM SDK contains an article named "Work with Solutions" that provides source code samples for creating and exporting solutions. The first step in the diff process is to write an app that creates a temporary solution in the instance, adds an entity component to it, exports the solution (as a zip file) and then moves on to the next entity, and so on, until every entity (with its dependencies) is captured in its own solution zip file. (A sketch of steps 1 and 2 follows this list.)
  2. Extract the solutions to sub-components: Use the SolutionPackager tool that's included in the SDK to extract each entity solution zip file to individual sub-component files. You can use Excel to build a batch file that will run the SolutionPackager utility for each solution zip file, or better yet, run the SolutionPackager from within your app.
  3. Follow the steps above for a second CRM instance: Now that you have all entity components extracted on disk, you can run steps 1 and 2 again for a different CRM instance. Or run your app twice at the same time, with one instance pointed at production and the other at the sandbox.
  4. Use a file diff/comparison tool to identify differences: You can now use a diff tool such as Beyond Compare to diff the two sets of folders. This will reveal the metadata differences between the two instances.
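
Here's a rough C# sketch of steps 1 and 2, assuming the 2016-era SDK assemblies (Microsoft.Xrm.Sdk and Microsoft.Crm.Sdk.Messages). The solution naming scheme and paths are illustrative, not prescriptive:

using System;
using System.Diagnostics;
using System.IO;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

public static void ExportEntityAsSolution(IOrganizationService service, Guid publisherId,
    string entityLogicalName, Guid entityMetadataId, string outputFolder)
{
    // Step 1: create a temporary unmanaged solution holding a single entity.
    var solution = new Entity("solution");
    solution["uniquename"] = "tmp_" + entityLogicalName;
    solution["friendlyname"] = "Temp diff solution: " + entityLogicalName;
    solution["publisherid"] = new EntityReference("publisher", publisherId);
    solution["version"] = "1.0.0.0";
    service.Create(solution);

    // Add the entity component along with its required components.
    service.Execute(new AddSolutionComponentRequest
    {
        ComponentType = 1, // 1 = Entity
        ComponentId = entityMetadataId,
        SolutionUniqueName = "tmp_" + entityLogicalName,
        AddRequiredComponents = true
    });

    // Export the solution to a zip file on disk.
    var export = (ExportSolutionResponse)service.Execute(new ExportSolutionRequest
    {
        SolutionName = "tmp_" + entityLogicalName,
        Managed = false
    });
    string zipPath = Path.Combine(outputFolder, entityLogicalName + ".zip");
    File.WriteAllBytes(zipPath, export.ExportSolutionFile);

    // Step 2: extract the zip to sub-component files with SolutionPackager.
    Process.Start("SolutionPackager.exe",
        "/action:Extract /zipfile:\"" + zipPath + "\" /folder:\"" +
        Path.Combine(outputFolder, entityLogicalName) + "\"");
}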

Notes

  • By exporting entities along with their required components (that's an option when adding components to a solution programmatically), you'll end up extracting most components that relate to entities. However, you might also want to export other component types such as all processes (not only those that have entity dependencies), global picklists, web resources, etc.
  • It might take an hour or two for the app you create to export all entities as solutions.
  • You can add the extracted solution files into a source control system to keep track of changes over time.
  • You can index and search through the solution zip files or extracted files using a grep or indexing tool (e.g., dtSearch) to track down a component that might be giving you trouble while importing or exporting larger solutions.


Tuesday, June 20, 2017

Dynamics CRM workflow editor bug might be related to editable grids

The problem: While creating a workflow process in Dynamics 365 (CRM Online), the "Form Assistant" drop-down boxes (i.e., where you can select the "Operator" or dynamic values) were empty/blank.



What we tried: We tried these steps to troubleshoot the issue: 1) click into a variety of fields to see if the problem is related to specific types of fields (no, the options are blank for all field types), 2) clear browser cache, 3) try IE, Chrome and Edge, 4) determine whether the problem is happening for all entities.

Some clues: Regarding #4 above, the problem with the blank options when editing a workflow step occurred for some entity types but not all. So what was common between the entity forms where this bug occurred?

The fix: It turns out, at least for the CRM organization we're working with, that if there's an editable grid on the form, then the Form Assistant controls do not get populated. We removed the editable grid, saved, published and then tried editing the workflow step again. That resolved it: The expected operators, field choices, etc. are appearing in the workflow step edit page again.

This might be a coincidence, but this work-around is worth trying if you run into the same problem.

Monday, June 5, 2017

Prototype and fail early with cloud-based development

In these days of rapid and incremental software product releases, where development companies add features for our benefit but often before thoroughly testing them, it's crucial that we all approach projects with a prototyping mindset and find the bugs early, rather than assuming that every released feature will work as we progress through an application.

It used to be that when you bought a license to, for example, Microsoft SQL Server, you were confident that any problems you encountered were likely due to your own code.

These days, it's difficult to know whether a bug is due to something you're doing wrong or whether it's a "known issue" to the software manufacturer due to releasing functionality too quickly.

My latest example of this relates to Microsoft PowerApps. I created and finalized several screens and moved on to the last part of the project, which was simply to list records from a Common Data Service (CDS) entity that relate to a record selected on a previous screen. This is fully supported, and Microsoft provides examples of how to do this.

Then I ran into this "known issue": "For some connectors, the connection to the data source is lost if you modify the Items property."

In my case, I selected the connection to the CDS entity for a data table and went to apply a filter, and the data source disappeared. I thought I was doing something wrong, so I re-read the documentation on using filters, and as best I could tell I was using the functionality the way it was designed.

Then I searched online some more and eventually ran into the issue.

I'll likely find a work-around to this bug, probably by storing the related data on the main CDS entity itself. But this is another reminder to myself, and hopefully to you as well: when developing on cloud-based platforms, prototype as much of the functionality as possible before spending a lot of time polishing forms and screens, so that you can switch gears on your design when you run into an issue like the one I described.

Saturday, June 3, 2017

Checking for existence of record in Azure Logic Apps

I recently had the need to store data in a Microsoft Common Data Service (CDS) database, and using a Logic App seemed to fit the need in this case.

Project requirements:

- Run the integration (data copy) once per day
- Query Dynamics 365 (CRM) for Project records
- Insert or update records in the Project entity of the Common Data Service database

I'm probably missing something obvious, but that last step was more difficult to figure out than I thought. I created a "For each" to loop through the CRM records and then wanted to query the CDS entity to determine whether or not the record already exists. If the record exists, update it; if it doesn't, create it.

To query the CDS entity to determine whether the source CRM record exists, I used the CDS "Retrieve record" action. When running the Logic App, though, whenever a CDS record didn't exist, the Logic App stopped processing and marked the run as "Failed". Not finding a record really shouldn't be considered a failed job, but that's how Logic Apps handles this condition.

The way Scribe Online handles this type of query is to set a property (e.g., RecordsMatched) to indicate whether or not the record exists. You can use the property's value in IF..THEN formulas to change the runtime processing path.

In Logic Apps, I couldn't find a simple way to set a condition based on whether or not it found the CDS record. The solution I came up with was to check the HTTP status of the query to the CDS. A response of 200 means that the record exists and a 404 means the record does not exist.

For the "IF" statement in the Logic App (after querying for the CDS record), I switched to "Code view" in the Logic Apps Designer and set the expression to the following:

"expression": "@not(equals(outputs('Attempt_to_retrieve_Project_CDS_record')['statusCode'], 200))",

Here's how that change looks in the Designer view:



From there, I added an action to insert the CDS record if it doesn't exist or update it if it does exist.

This solution was confusing because I couldn't find any examples or documentation from Microsoft on how to deal with this common scenario. I found an article on dealing with "exceptions" using the runAfter setting for an action, but checking for a record and not finding it really isn't an exception; it's a simple operation that should, in my opinion, be easier to handle in Logic Apps. Maybe it is and I just missed it. If so, please let me know.

Monday, April 3, 2017

Using Azure from .NET Console App for Exception Notifications

Microsoft Azure helps out again...

I recently worked on a project where a .NET console app needed to run consistently throughout a weekend, without interruption. If the app threw an exception from which it could not recover, I wanted to be notified right away so I could log in and resolve the issue.

The server where I was asked to run the application did not have the ability to send email, so the solution I came up with was to send a message from the app to an Azure Service Bus queue and then use an Azure Logic App to read from the queue and send an email to my account.

Below is a C# method that will send a message to an Azure Service Bus queue:

// This method assumes the WindowsAzure.ServiceBus NuGet package
// (Microsoft.ServiceBus.Messaging) and a log4net-style "log" field.
using System;
using System.IO;
using System.Text;
using Microsoft.ServiceBus.Messaging;

public static void PostExceptionToAzureSBQueue(Exception ex)
{
  try
  {
    var connectionString = Settings1.Default.ExceptionServiceBusQueueConnectionString;
    var queueName = Settings1.Default.ExceptionServiceBusQueueName;

    var client = QueueClient.CreateFromConnectionString(connectionString, queueName);
    string msgText = "Data migration exception occurred.\n\n";

    // Send the exception text as a stream-backed message ("true" lets the message own the stream).
    var message = new BrokeredMessage(new MemoryStream(Encoding.UTF8.GetBytes(msgText + ex.ToString())), true);
    client.Send(message);
  }
  catch (Exception ex2)
  {
    // Never let a notification failure crash the app itself.
    log.WarnFormat("Error occurred attempting to send Azure Service Bus Queue message: Exception={0}", ex2.Message);
  }
}

The Azure Logic App only needs two steps: 1) get the queue message and 2) send an email via Office 365 Outlook. In the second step, simply place the "Content" data from the queue message into the Body field of the email. You can set the email priority to High and deliver the email to multiple recipients if desired.

This is another example of where Azure can help with general .NET development... even "old school" .NET console apps.

Tuesday, February 28, 2017

More progress on my Dynamics 365 (CRM) Metadata DIFF tool

I worked on one of my rainy day projects last weekend... a diff tool for Dynamics 365 CRM orgs.

The goal of the solution is to be able to view the major differences between snapshots of the same CRM org over time, or between two different orgs (online or on-prem).

The components include two Azure WebJobs and a SQL Server database. One WebJob queries the CRM orgs for metadata (see the list below, and the sketch that follows it) and the other is responsible for the diff operations and reporting the differences.

So far, the tool compares the following metadata and entity-based data:
  • Entity
  • Attribute
  • OptionSet (Global and entity-based)
  • One-to-many relationships
  • Many-to-many relationships
  • Business Unit
  • Connection Role
  • Field Permission
  • Mailbox
  • Organization
  • Plugin Assembly
  • Plugin Type
  • Process
  • Process Stage
  • Publisher
  • Queue
  • Role
  • Saved Query
  • Saved Query Visualization
  • Sdk Message Processing Step
  • Sdk Message Processing Step Image
  • Site Map
  • Solution
  • Solution Component
  • System Form
  • System User
  • System User Roles
  • Team
  • Template
  • User Form
  • User Query
  • User Query Visualization
  • User Settings
  • Web Resource
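
To give a sense of the metadata-gathering WebJob, here's a minimal sketch of the kind of query it runs (illustrative only, not the tool's actual code):

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

public static EntityMetadata[] RetrieveAllMetadata(IOrganizationService service)
{
  var request = new RetrieveAllEntitiesRequest
  {
    // Pull entities, attributes and relationships in a single call.
    EntityFilters = EntityFilters.Entity | EntityFilters.Attributes | EntityFilters.Relationships,
    RetrieveAsIfPublished = false
  };
  var response = (RetrieveAllEntitiesResponse)service.Execute(request);
  return response.EntityMetadata; // flattened into SQL tables for the diff WebJob
}
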
Next up for the tool is to make the deployment faster and easier. Currently, it's necessary to manually run T-SQL DDL scripts to create the database tables and stored procedures. The plan is to make that a single click using a simple ASP.NET app.

If your organization has the need for an automated CRM metadata tracking or diff process, please get in touch and I'll share some of my experiences in building the tool or, through Altriva, we can talk about sharing the code with you.

Monday, February 13, 2017

Twitter noise-reduction solution in Microsoft Azure is working well

I've been running my Azure-based Twitter "noise-reduction" solution for a few days now. Below are some observations.

First, a recap of what the solution is doing...

The solution runs in Azure to help identify new and unique tweets, eliminating retweets, duplicates and unwanted "noise" tweets by keyword. It sends the tweets that pass the filters to an email folder. I then review the tweets (about 75/day related to Dynamics 365 CRM and Azure) and drag them to another folder. An Azure Logic App reads from that folder and posts the tweets to this blog. So, it's mostly automated, and becomes better over time as it catches duplicate tweets.

This screenshot (Azure Portal) shows the solution services. The solution includes three Azure Logic Apps, a Function App with one function and Azure Storage queues and tables.

Why do this? First, I want to continue to learn how to use Azure Logic Apps, Azure Functions, Microsoft Flow, etc. Second, I have $150 in Azure credits and this seemed like a good use of some of them. And lastly, part of my job is to keep on top of what's happening with Dynamics 365, so creating an intelligent Twitter spy tool seemed like a reasonable way to meet that goal as well.


Lessons learned so far:

  • Cost: It costs about $0.75/day to run this solution in Azure. Besides the Azure services shown in the screenshot above, I have two Microsoft Flow workflows querying Twitter for the following hashtags:
    • #msdyncrm OR #dynamics365 OR #msdyn365 OR #dynamicscrm OR #dyn365 OR xrmtoolbox 
    • #azurefunctions OR #logicapps OR #webjobs OR #powerapps OR #microsoftflow OR #azuresql OR #sqlazure OR @logicappsio
So, it's a relatively inexpensive way to save me a lot of time reading about new CRM and Azure happenings.

  • Part of the solution is for a Logic App to auto-send a tweet about each new "Favorite Tweets" blog post. Within a few hours of each tweet, the linked post gets about 50 unique visitors, so the automated solution is leading to a respectable number of page views. That's not my goal, but if I ran a business, this solution could be used to provide timely and relevant content to site visitors.

  • It's reliable. The Azure Logic Apps, Functions, etc. that I created for the solution have run on time and without fail each day. I didn't really expect them to fail; it was just reassuring to see all green checkmarks in the run history.

I'm not sure what my next rainy day side project will be with Azure but I'm sure I'll come up with something soon.

Thursday, February 9, 2017

Building a String in an Azure Logic App

In an Azure Logic App, one way to build a string containing multiple values (for example, within a "For Each" loop) is to use an Azure Storage Blob. First, add an action that creates a blob. Then, in your For Each block, use a "Get blob content using path" action along with an "Update blob" action.

To make this work in a loop, though, you'll need to add the following property and value to the For Each block definition in the underlying Logic App code:

"operationOptions": "Sequential"

Example:

"forEach_email": {
       "type": "foreach",
       "foreach": "@body('email_filter')",
       "operationOptions": "Sequential",
       "..."
}

Without that setting, the blob you're building will very likely be incomplete. That's because Logic Apps runs For Each loops in parallel by default. So, by the time one iteration has retrieved the current blob contents and appended the next value, another iteration has done the same thing, and the result will be an unpredictable set of data.

The following For Each block from a Logic App demonstrates this technique.
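
In code view, the loop pairs a read of the blob with a write back to it. Here's a sketch; the action names and connector inputs are illustrative:

"forEach_email": {
  "type": "foreach",
  "foreach": "@body('email_filter')",
  "operationOptions": "Sequential",
  "actions": {
    "Get_blob_content_using_path": {
      "type": "ApiConnection",
      "...": "..."
    },
    "Update_blob": {
      "type": "ApiConnection",
      "runAfter": { "Get_blob_content_using_path": [ "Succeeded" ] },
      "inputs": {
        "body": "@concat(body('Get_blob_content_using_path'), item()?['Subject'], '\n')",
        "...": "..."
      }
    }
  }
}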



Note that it's certainly possible that I missed a feature in Logic Apps that helps to build a string, such as some sort of "Append to a String Variable" action, which would make my use of a blob for this look silly. If I did miss it, please leave a comment to set me straight. Otherwise, Logic Apps team: we need an "Append to a String Variable" action.   :)

Monday, February 6, 2017

Taming Twitter with Microsoft Azure

Twitter is noisy.

At least 50% of tweets are duplicates, retweets or auto-generated from old content. And another 25% are likely tweets that, if you're busy, you don't really care to see at the moment. Thus, trying to use Twitter as a source for learning and new information is often time-consuming and not worth the effort. For example, on the topic of Dynamics 365/CRM and Azure, people and bots around the world send out over 1,000 tweets per day -- far too many for most of us to view.

Twitter is useful.

On the other hand, there are tweets that are important (or at least interesting) to see. It's a great resource for keeping up-to-date on just about any topic.

Twitter can be tamed.

This blog post demonstrates one way to tame Twitter: to cast aside unwanted or duplicate tweets to get to the good stuff.

The solution described below utilizes various Microsoft Azure and Office 365 services to achieve the following goals:
  • Receive Tweets by e-mail that match a keyword or hashtag
  • Eliminate duplicate tweets, retweets and bot tweets
  • Eliminate tweets that contain a particular keyword (e.g., #jobs)
  • Categorize tweets within e-mail folders
  • Benefit from Twitter by significantly reducing the "noise"
  • Have full control over the tweet filtering process to further enhance the solution

Get Ready to Build

If you're like me and find benefit in using Twitter but find it too time-consuming to wade through the noisy tweets, and if you're a DIY type of person who would rather build a solution than buy one, then this tweet-taming solution will be worth the effort.


Solution Components

This solution utilizes the following components:
  • Microsoft Azure:
    • 1 Logic App
    • 1 Azure Function
    • 1 Azure Storage Queue
    • 2 Azure Storage Tables
  • Microsoft Office 365
    • 1+ Microsoft Flow workflows
    • Outlook (desktop or web)
Time to implement: About 1 hour

Cost per month: It depends. If your Twitter searches are relatively narrow, running this solution might be free or cost no more than a few dollars per month. To avoid a big bill, don't search Twitter (in your Flows) for keywords or hashtags that will return hundreds or thousands of tweets per query.

Prerequisites

You'll need a Microsoft Azure subscription, an Office 365 account (for Microsoft Flow and Outlook email) and a Twitter account for the Flow trigger to sign in with.

Solution Runtime Overview

In a nutshell, this tweet-taming solution works in this way:
  1. Microsoft Flow monitors Twitter for specific hashtags, keywords or phrases and posts matching tweets to an Azure Storage Queue.
  2. An Azure Logic App reads messages from the queue every 30 minutes and sends each tweet message to an Azure Function for analysis and storage.
  3. The Azure Function determines whether the tweet is a duplicate or contains an unwanted keyword. It does this by storing and matching tweet messages with Azure Storage Tables.
  4. The Azure Logic App conditionally sends the tweet message to an email account. An Office 365 email rule then moves tweets to specific folders based on keywords.

Note: In addition to Microsoft Flow, it's also possible to retrieve tweets with an Azure Logic App and other means (e.g., Twitter API). I chose to use Flow for this solution because it's easier to add additional tweet retrievers with Flow compared with creating separate Logic Apps to capture tweets. Plus, Flow is included with my Office 365 account, so using it does not use up my Azure Subscription credits.

Implementing the Solution


This section provides the steps for creating this tweet-taming solution.

Note: Since the procedures for creating and managing Azure services change often, and since there are lots of online resources available, I am not providing complete step-by-step instructions. For example, when I say to create an Azure Storage account, I don't provide the specific steps for doing that, but I believe I've provided sufficient instructions to understand and build the solution. It just might take some research on your part to complete a step that you haven't done before.

Create an Azure Resource Group for this solution

In the Azure Portal, start by creating a Resource Group in Azure to assign to the services that you'll be creating in the steps below. Using a Resource Group helps organize Azure services that are part of the same solution.

Create the Azure Storage Queue and Tables

  1. Create an Azure Storage account.
  2. In Microsoft Azure Storage Explorer (or a similarly capable tool), create a queue named "tweets" and two tables, one named "tweets" and the other named "unwantedTweetKeywords". There's no need to add additional columns to the tables.
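
If you'd rather script this step than click through Storage Explorer, here's a short C# sketch using the WindowsAzure.Storage NuGet package (run it inside your setup routine, and swap in your own connection string):

using Microsoft.WindowsAzure.Storage;

var account = CloudStorageAccount.Parse("<your storage connection string>");
account.CreateCloudQueueClient().GetQueueReference("tweets").CreateIfNotExists();
var tableClient = account.CreateCloudTableClient();
tableClient.GetTableReference("tweets").CreateIfNotExists();
tableClient.GetTableReference("unwantedTweetKeywords").CreateIfNotExists();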

Create the Azure Function

  1. Get the Azure Function files from the GitHub Gist location.
  2. Create an Azure Function App. Name it something like "TweetTamer".
  3. Create the Azure Function. Use the GenericWebHook-CSharp template. Name it "ProcessTweet".
  4. Create the Azure Function files to correspond with the files you retrieved from the provided Gist location. Your function should have three files: run.csx, function.json and project.json.
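
The Gist has the real code; at its heart, the duplicate check can be as simple as attempting a table insert and treating a conflict as "seen it before". A simplified C# sketch (the RowKey scheme here is made up for illustration):

using System;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public static bool IsDuplicateTweet(CloudTable tweetTable, string tweetText)
{
  // Normalize the text into a RowKey so near-identical tweets collide.
  string rowKey = new string(tweetText.ToLowerInvariant()
    .Where(char.IsLetterOrDigit).Take(40).ToArray());
  try
  {
    tweetTable.Execute(TableOperation.Insert(new TableEntity("Tweet", rowKey)));
    return false; // insert succeeded: first time we've seen this tweet
  }
  catch (StorageException e) when (e.RequestInformation.HttpStatusCode == 409)
  {
    return true; // 409 Conflict: the same key was stored earlier
  }
}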

Create the Azure Logic App

Create the Logic App as shown in the provided screenshots. Name it "ProcessTweets" or any name that you choose.

The "ProcessTweet" action shown below is where the Logic App calls the Azure Function. Construct the JSON string to pass to the function. If the return value from the function does not contain "ignore_tweet" then the tweet text is sent via email. The last step removes the Azure Storage queue message.

Create Rules In Outlook to Move Tweets

The solution is almost ready for use. One optional but recommended step is to create one or more email routing rules in your email application to move emails that start with "New Tweet" to a particular folder. Doing this will keep your inbox clear of tweets and will categorize them to put them into context.

Create Microsoft Flow workflows to receive Tweets and Send to Queue

The last step is to create one or more Microsoft Flow workflows to periodically query Twitter and send tweets to the Azure Storage queue that you created in a previous step.
  1. Log in to Microsoft Flow.
  2. Create a blank Flow.
  3. Search for "Twitter" and select the trigger named "Twitter - When a new tweet is posted"
  4. For the search text, enter one or more hashtags or keywords separated by the word "OR". Example: #msdyn365 OR #msdyncrm OR #dynamics365
  5. Click "Next Step" and search actions for "azure queues". Select the "Azure Queues" action then select "Azure Queues - Put a message on a queue".
  6. For the queue name, enter "tweets". For the message, select the "Tweet text" option from the dynamic content list. If you want to include a URL to the tweet, then see the image below for an example of how to construct the URL.
  7. Click "Create flow" to save the flow and then click "Done".




Twitter is Tamed!

Within 15 minutes of creating one or more of the Microsoft Flow workflows that retrieve tweets and send them to the Azure Storage queue, you should start receiving tweets in email.

Over time, the number of tweets you receive will decrease because you'll no longer receive duplicates, retweets or tweets containing the keywords you specify (see "Ongoing Maintenance" below for details).

The end result is that you can now benefit from Twitter without the noise!

Here's my own tweet about this blog post in Outlook online...



Ongoing Maintenance

Ignoring Tweets by hashtag or keyword

As you receive tweets in email, when you see a word or hashtag appear that you'd like to ignore, you can add the word or hashtag to the Azure Storage table "unwantedTweetKeywords". For the PartitionKey, enter "Keyword" and for the RowKey, enter the word or hashtag. Since Azure Storage tables don't allow the pound character ("#") in the RowKey column, use "HT" instead. For example, to ignore tweets that have #jobs in the text, enter "HTjobs" into the RowKey column. The Azure Function provided in this solution converts "HT" to the "#" character.
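
If you prefer code over Storage Explorer for this, here's a two-line C# sketch (unwantedKeywordsTable is assumed to be a CloudTable pointing at "unwantedTweetKeywords"):

using Microsoft.WindowsAzure.Storage.Table;

// "HT" stands in for "#", which isn't allowed in a RowKey.
unwantedKeywordsTable.Execute(TableOperation.InsertOrReplace(new TableEntity("Keyword", "HTjobs")));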

Other Ideas

Below are some other thoughts on using and extending this solution.
  • Within your email client, create a folder named "Tweets to Read" and use it to store tweets (links) that you want to review later.
  • Also within the email client, one idea is to create a folder for tweets that you want to share with others. You can set up a Microsoft Flow workflow to monitor a "To Share" folder and post the tweets on Slack or Yammer.




Monday, January 23, 2017

Bad data is fake news

The term "fake news" is being tossed around a lot lately. It points to the general opinion that it's becoming increasingly difficult to trust what we read online or even in our local newspapers.

Reading "alternative facts" from seemingly honest news sources is disturbing, but what's equally damaging is reading a news article that omits important details.

Likewise, in software business systems, such as Dynamics 365 and Salesforce, fake news is the result if those using the system aren't careful to consistently provide accurate and complete information.

Examples of bad data include duplicate records, inconsistent field entries (e.g., US, USA and United States in the country field), data in the wrong place (e.g., mixing customers with leads) and fat-fingered values that go unverified.

In addition to bad data, another common problem is missing data -- data that is omitted for a variety of reasons: it's too difficult to enter, takes too much time, not sure where the data goes, difficult to import, getting an error message, forgot to update it, etc.

Just like with fake news, consumers of the data will start to question the content of the records and reports they view. This will eventually erode confidence in the system and people will find alternative ways to do their jobs, which usually means creating another Excel file and storing the data on their local machine.

To prevent bad data from making its way into your Dynamics 365 (CRM) system and to keep it clean over time, Microsoft provides features in the base product that can help. Those features include duplicate detection (real-time and scheduled), form-based business rules and scripting for data validation, processes and plug-ins to validate, correct and complete data, and bulk import and deletion tools.

Just as the New York Times and the Wall Street Journal are careful about publishing accurate news stories to at least maintain their subscriber base (and avoid the "fake news" label), it's also essential for CIOs, system administrators and end-users to treat their business systems in the same way. Otherwise, once trust is lost, it's difficult to get back.

Wednesday, January 18, 2017

Azure Logic App to Report on Neglected Dynamics 365 (CRM) Cases

Scenario: You want to send an email to the owner of an unresolved CRM case record if more than 24 hours have passed since the case's modifiedon date/time.

Solution: Azure Logic App and Azure Function

Note: The Logic App references an Azure Function named GetHoursSinceUtcDateTime. You can find the function code and instructions here: https://gist.github.com/tdutch1/a14c60eed1a005c074f317b581cd6f01

Logic App


  • For the "Send an email" action, you can include fields from the CRM case. Consider providing a link to the case record in CRM.
  • In the "List records" action, set the query condition to "statuscode eq 0" to include only active cases.

Friday, January 13, 2017

Get Tweets in Email, Send to Yammer: Automated with Microsoft Flow

Automation success! I feel like a weight has been lifted from my shoulders. I used Microsoft Flow today to automate a few tasks that have been on my list for a while.

The Goals
  • Keep up with CRM happenings via Twitter without wading through hundreds of "noise" Tweets.
  • Receive a filtered set of Tweets into an Office 365 email folder for easy access and review.
  • Share my favorite Tweets (and the linked article) with others at Altriva.
The Solution
Microsoft Flow provides the ability to act on new Tweets based on keywords that you specify. I created Flows to trigger on Tweets with hashtags #MSDynCRM, #MSDyn365, #Dynamics365, etc. The Flows send the Tweet message and related details to my Office 365 email. In Outlook, I set up rules to move the emails to a specific inbound folder.

A few times each day, I go through the Tweets (as emails) and move any of them that I want to review further to a "To Read" folder.

After I've had a chance to review the Tweet and the related article, if I think my colleagues at Altriva might also want the information, I drag the Tweet (email) to a folder named "Post to Yammer". I created another Flow to periodically retrieve the emails in that folder, post the Tweet to Yammer and then delete the email.

Conclusion
I've been critical of Microsoft Flow lately, particularly regarding the several bugs I've run into when dealing with data from Dynamics 365 CRM, but it works very well for a lot of other types of tasks. If you find yourself doing the same repetitive tasks (copy/paste is a big red flag for this), then take a look at Microsoft Flow and see if you can connect the dots to automate things. It feels good when it's all working.

Tuesday, January 10, 2017

A fix for Azure Logic App and Dynamics 365 CRM connector

A project came up for me recently where the requirement was to insert data from a Dynamics 365 CRM Online organization into a Google Sheet, once per day. Using Azure's Logic Apps for this seemed like a perfect fit. Logic Apps provides a connector for CRM and Google Sheets, and it has a built-in way to run the app on a schedule.

I was able to put together the structure for this Logic App using the visual designer in just a few minutes. However, when I clicked to run the app, it failed with the following message:

Unable to process template language expressions in action 'Update_file' inputs at line '1' and column '11': 'The template language expression 'body('List_records')['msft_date']' cannot be evaluated because property 'msft_date' doesn't exist, available properties are '@odata.context, value'. Please see https://aka.ms/logicexpressions for usage details.'.


Fortunately, Logic Apps provides a "Code View" that allows you to view and edit the underlying code behind the Logic App. I searched for the problematic "msft_date" string and found it in the code here:

"body": "@{body('List_records')['msft_date']},\"@{item()?['msft_description']}\",@{item()?['msft_hours']}"

That text came from the Logic Apps designer.

After reading through the Workflow Definition Language for Logic Apps, it became apparent that the "@{body('List_records')...}" part of the syntax was not correct: body('List_records') refers to the entire query response (which, as the error says, exposes only @odata.context and value), while item()? refers to the current record inside the For Each loop.

The fix to this bug was to change that line to the following:

"body": "@{item()?['msft_date']},\"@{item()?['msft_description']}\",@{item()?['msft_hours']}"

After making that change and saving the app, it now runs fine. Data from CRM is making its way to a Google Sheet on a regular basis.

Lessons learned on this project include:
  • For a lot of tasks, Logic Apps works fine. But don't assume that it's a solid platform at this point. It is not SQL Server. It is clearly not going through the same level of QA that other Microsoft enterprise products go through.
  • Some of today's GA (general availability) cloud apps would've been considered Beta back in the day. Explanation: When I worked at Asymetrix (Paul Allen's first company after leaving Microsoft), the engineering, QA and support teams would almost get to the point of throwing punches in battles over product quality vs ship dates. I remember working past midnight on several occasions closing out all of my bugs (even small ones) after QA won the latest screaming match in the hallway. As a team, though, we were all working toward the same goals: feature-rich applications that were as solid as we could make them. With Logic Apps, I'll just say that I don't think the same battles are happening. Maybe they should. (Note that this opinion isn't coming from just this one bug I found, but several others.)
  • Get to know the Workflow Definition Language -- the underlying structure of a Logic App. The visual designer only presents a small amount of the functionality that Logic Apps offers. For example, there are data conversion and other types of functions available to enhance a Logic App.
  • Don't write off Logic Apps due to a few bad experiences. I was recently in a meeting where the general consensus was that Logic Apps needs another year before a lot of people will consider it for a "real world" project. I think that's wrong. It's useful for a lot of projects today, if you can live with occasionally fixing bugs introduced by the designer or getting creative to work around some limitations (e.g., currently there's no way to add rows to an Excel Online file).
If you've come up with ways to use Logic Apps, particularly with Dynamics 365 CRM, let me know. I'd love to hear about your experiences with it.