A Framework for Reusable Record-Triggered Flows

Pratibha Sundar
Salesforce Architects
8 min read · Jul 6, 2023


Introduction

Not long ago, I joined a project as a developer while it was already underway. As I began working on the Flow part of the project, the software engineering voice in my head started raising concerns. In particular, I noticed the following:

  • The flow names didn’t always accurately represent the entire functionality of the flow, especially when the flow had evolved over time. While it was possible to change the label to address this, the label and the API name would then differ.
  • One flow had lots of functionality packed into it, which made testing difficult. Whenever I had to modify one part of this flow, I had to regression test the entire flow to make sure nothing else was broken.
  • In some cases, the default Flow debug feature was not sufficient for me to test certain scenarios.
  • It was difficult for multiple team members to work on the same flow at the same time.

As I thought about these issues, an interesting question occurred to me. What if there were a framework that adapted to evolving business needs, scaled with minimal conflicts across team members, and followed the Automated principles of Salesforce Well-Architected? Alignment with these principles for designing automation means:

“using approaches that are consistent and predictable, and enabling teams to develop, test, deploy, and maintain the automations you design”.

This blog post explores a real-world use case of implementing a healthy automation framework for reusable record-triggered flows that addresses commonly encountered issues with flow naming, testing, and team-oriented development.

The flow framework

The reusable record-triggered flow framework is based on a parent flow and a series of subflows attached to the parent. It also addresses creating generic subflows that can be used across various flows.

Note: In this post, I apply the naming convention used in the Architect’s Guide to Record-Triggered Automation. Specifically, “before-save” and “after-save” flows refer to “fast field update” and “actions and related records” in Flow Builder, respectively.

The framework addresses the following areas:

  • Consistent naming of flows
  • Where to start a flow
  • Error handling
  • Minimizing conflict among team members working on the same flow
  • Improving debugging and testability for flows

One thing to note upfront is that before-save flows do not support subflows or Apex actions, so some elements of the framework cannot be applied to them. The Guide to Record-Triggered Automation recommends putting same-record field updates into before-save flow triggers. The Well-Architected Guide to Composable recommends moving away from large “monoflows” that orchestrate complex processes within a single flow. Hence, the design of before-save flow triggers requires careful consideration. It’s a good idea to keep before-save flow triggers as light as possible; if the business requirements call for complex processing, consider using their Apex trigger equivalents. The following diagram illustrates a record-triggered flow framework aligned with these guidelines.

Flow framework overview diagram
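
For complex before-save requirements, the Apex trigger equivalent mentioned above is a before-insert/before-update trigger. Here is a minimal sketch, assuming an illustrative Case field default; the object, field, and logic are placeholders, not part of the original project:

trigger CaseBeforeSave on Case (before insert, before update) {
    // In a before context, same-record field updates need no extra DML.
    for (Case c : Trigger.new) {
        // Placeholder for logic that would be hard to keep readable
        // in one large before-save flow.
        if (String.isBlank(c.Origin)) {
            c.Origin = 'Web';
        }
    }
}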

Consistent naming of flows

When you start to build a flow, one of the first questions to arise is what to name it. Often, as the Salesforce org evolves and more logic is added to the flow, the name originally given to the flow is no longer entirely accurate. A clearly defined naming convention like the one shown in the following table makes it possible for flows to evolve gracefully as the organization grows over the years without compromising on consistency or clarity in their names.
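
As an illustration, the Case flows shown later in this post follow a pattern along these lines (your exact convention may differ):

  • Before-save flow: <Object>: Before Save (for example, Case: Before Save)
  • After-save flow: <Object>: After Save (for example, Case: After Save)
  • Subflow: <Object>: <Short description of the action> (for example, Case: Email Manager on a Severe Case Update)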

Here is an example of what a before-save flow looks like when applying these naming conventions.

Before-save flow example

Here is an example of what an after-save flow looks like when applying these naming conventions.

After-save flow example

Where to start a flow

One of the biggest challenges when writing a flow is determining where to start, or deciding which flow the business logic should go in. Frequently, developers or admins end up creating a new flow every time a new requirement comes in. Over time, this leads to many flows on the same object, which can be difficult to maintain. To avoid this situation, follow the logic optimization patterns for Salesforce automation detailed in Well-Architected; specifically, ensure that:

  • Each flow serves a single, specific purpose
  • Flows are organized in a hierarchical structure consisting of a main flow and supporting subflows

Start with purpose-built subflows and relate them to either the before-save or the after-save flow for a given object. Because the flow naming convention starts each name with the object name, you can easily group flows by sorting them by name. This makes it easy to browse through all subflows defined for an object to see whether you can use an existing subflow before creating a new one. Only create a new subflow if none of the existing subflows meets the customization needs. As customizations evolve with business growth, the flow structure will also scale gracefully.

Recall that before-save flows do not support subflows, which makes them an exception to the pattern of organizing flows in a hierarchical structure. As mentioned in the Record-Triggered Automation guide, before-save is intentionally scoped to support only certain operations for performance reasons.

Error handling

Error handling in flows is an important, but often overlooked, exercise. As the Well-Architected patterns and anti-patterns for error handling put it, “Flows [that] do not use fault paths consistently or at all” are not well-architected. How can you follow this guidance in the real world without slowing development?

Create a generic error-handling subflow and connect it through fault connectors. The generic logic can be as simple as sending an error email to the org admin or the record owner (for example, the case owner for cases) when a DML operation on a record fails. Once exception-handling subflows are defined, error handling in a flow is as easy as plugging the appropriate subflow into a fault path wherever it’s needed (for example, in data operations, callouts, or critical processing logic).
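
If the notification needs more control than Flow’s built-in Send Email action provides, the generic subflow can call a small invocable Apex action. The sketch below is one possible shape; the class name, input fields, and subject line are illustrative assumptions, not part of the original framework:

public with sharing class FlowErrorNotifier {
    public class Request {
        @InvocableVariable(required=true) public String recipientEmail;
        @InvocableVariable(required=true) public String flowName;
        @InvocableVariable public String errorMessage;
    }

    // Called from a fault path or from the generic error-handling subflow.
    @InvocableMethod(label='Send Flow Error Email')
    public static void notify(List<Request> requests) {
        List<Messaging.SingleEmailMessage> mails = new List<Messaging.SingleEmailMessage>();
        for (Request r : requests) {
            Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
            mail.setToAddresses(new List<String>{ r.recipientEmail });
            mail.setSubject('Flow error in ' + r.flowName);
            mail.setPlainTextBody(r.errorMessage == null ? 'An unexpected error occurred.' : r.errorMessage);
            mails.add(mail);
        }
        Messaging.sendEmail(mails);
    }
}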

Multiple team members working on the same flow

When multiple team members are working on the same flow, you need the ability to handle merge conflicts. In a pro-code situation, you can rely on version control to prevent work from being overwritten. Let’s explore one approach to help low-code builders avoid overwriting each other’s work.

Whenever possible, each team member works on a subflow and tests it independently before it is added to the parent flow. With this approach, each team member can work on their user story in a subflow. Breaking the work into composable units in this way enables a team to work on a single flow with fewer conflicts. It is not always possible for team members to work in different subflows, especially with before-save flows. Hence, it is a good practice to establish a development process in which team members coordinate, compare, and manually merge changes before pushing them to the flow when deploying to UAT or production orgs.

Flow debugging and testing

Sometimes, business logic in flows can be mission-critical and complex. When such a flow is updated after it has gone live, it is important to test the modified flow across all applicable business scenarios before it is pushed to production. And of course, when a bug is reported during UAT or after production deployment, you want to be able to quickly track down the cause.

A good way to enable step-by-step run-time troubleshooting is to create debug statements in an Apex action that can be added to each node of a flow. You can avoid optimization anti-patterns mentioned in Well-Architected — Automated by creating a generic subflow containing the debug statement, which can be used by the parent flow.
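
Here is a minimal sketch of such a debug action, assuming a hypothetical class name and a simple text input (a production version might write to a custom log object instead):

public with sharing class FlowDebugLogger {
    // Drop this action between flow elements, or wrap it in a generic
    // subflow, to trace run-time values in the Apex debug log.
    @InvocableMethod(label='Log Flow Debug Message')
    public static void log(List<String> messages) {
        for (String message : messages) {
            System.debug(LoggingLevel.INFO, 'Flow debug: ' + message);
        }
    }
}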

It’s also a good practice to create test scripts every time a flow is created or updated so that testing is automated. Flow Builder lets you create flow tests with assertions out of the box, but those tests may not cover every scenario in a record’s life cycle. In such cases, you can use Apex actions and Apex test classes to automate testing. By making test scripts part of flow development, you reduce the risk of introducing undetected errors every time you create or update a flow.
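
As one example, an Apex test for the Case scenario in the next section could look roughly like this, assuming the before-save flow matches contacts on the Case SuppliedEmail field (the class and method names are illustrative):

@IsTest
private class CaseBeforeSaveFlowTest {
    @IsTest
    static void populatesContactWhenEmailMatches() {
        Contact existing = new Contact(LastName = 'Doe', Email = 'jane.doe@example.com');
        insert existing;

        Case emailCase = new Case(Subject = 'Email-to-case test', SuppliedEmail = 'jane.doe@example.com');

        Test.startTest();
        insert emailCase; // fires the record-triggered flows on Case
        Test.stopTest();

        Case saved = [SELECT ContactId FROM Case WHERE Id = :emailCase.Id];
        System.assertEquals(existing.Id, saved.ContactId, 'Before-save flow should link the matching contact');
    }
}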

Framework example

Let’s see how this framework works with a simple real-world example involving case creation. When a case is created, the goal is to auto-populate the case’s contact before the record is saved, by searching for a contact whose email address matches the one that came through Email-to-Case. Further, if a case with Severe priority is created or updated, a case manager is emailed. The All Flows list view looks like this:

All Flows list view snapshot

As you can see, because the naming convention was followed, all flows related to cases are listed one after the other. In this example, Case: Before Save is a before-save flow and Case: After Save is an after-save flow. The Case: Email Manager on a Severe Case Update subflow will be called from the Case: After Save flow.

Example flows in the Flow Builder interface

Now let’s check the Flow Trigger Explorer. Notice that the naming conventions have helped to standardize and simplify the Flow Trigger Explorer view.

View of Flow Trigger Explorer

Now imagine that the flow must support a more complex business process that requires greater granularity than just before-save and after-save. You can add an after-create flow for logic that applies only when records are created, an after-update flow for logic that applies only when records are updated, and keep the after-save flow for the logic common to both. With no support for subflows, the before-save flow has a tendency to become a “monoflow”. You can likewise split the before-save flow into before-save, before-create, and before-update flows to break one large flow into smaller, simpler ones.

Conclusion

This post described, and provided a brief example of, a reusable record-triggered flow framework that helps improve maintainability and scalability as your automation landscape grows. To learn more about designing well-architected automations, visit https://architect.salesforce.com/well-architected/easy/automated.
