
Flow Structural Conventions - Common Core

As detailed in the General Notes section, these conventions are heavily opinionated towards maintenance and scaling in large organizations. The conventions contain:

  • a "common core" set of structural conventions that apply everywhere
  • conventions for Record Triggered Flows specifically
  • conventions for Scheduled Flows specifically

Because they are triggered by the user and run outside of a specific record context, Screen Flows do not currently require structural adaptations beyond the common core specifications.

Common Core Conventions

On System-Level Design

  1. Do not do DMLs or Queries in Loops.

    Simpler: No pink squares in loops.

    DML is Data Manipulation Language. Basically it is what tells the database to change stuff. DML Operations include Insert, Update, Upsert, and Delete, which you should know from Data Loader or other such tools.

    Salesforce now actually warns you when you're doing this, but it still bears saying.

      Don’t do this

    You really must not do this because:
    • it can break your Flow. Salesforce will still try to optimize your DML operations, but it will often fail due to the changing context of the loop. This will result in you doing one query or update per record in your loop, which will send you straight into Governor Limit territory.
    • even if it doesn't break your Flow, it will be SLOW AS HELL, due to the overhead of all the operations you're doing
    • it's unmaintainable at best, because trying to figure out the interaction between X individual updates and all the possible automations you're triggering on the records you're updating or creating is nigh impossible.
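
    For admins who also read Apex, here is a minimal sketch of the bulkified pattern the Flow should mirror - assignments inside the loop, one single DML afterwards (the Contact query and field are placeholders for illustration):

        // Bulkification sketch (Apex analogy of the Flow pattern above).
        // Placeholder logic: stamp a Description on some already-queried Contacts.
        List<Contact> contacts = [SELECT Id, Description FROM Contact LIMIT 200];
        List<Contact> toUpdate = new List<Contact>();
        for (Contact c : contacts) {
            c.Description = 'Processed';   // Assignment element equivalent - no pink elements here
            toUpdate.add(c);               // add to a collection variable
        }
        update toUpdate;                   // one single DML, outside the loop
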
  2. All Pink (DML or Query) elements should have Error handling

    Error paths, or Fault Paths, are available in both Free-Form mode and Auto-Layout mode. In Free-Form mode, you need to handle all possible other paths before the Fault Path becomes available. In Auto-Layout mode, you can simply select Fault Path.

    Screen Flow? Throw a Screen, and display what situation could lead to this. Maybe also send the Admin an email explaining what happened.

    Screen Flow Error Handling

    Record-triggered Flow? Throw an email to the Apex Exception Email recipients, or emit a Custom Notification.
    Hell, better yet throw that logic into a Subflow and call it from wherever.

    (Note that if you are in a sandbox with email deliverability set to System Only, regular flow emails and email alerts will not get sent.)

    image-1644417882117.png

    Handling Errors this way allows you to:
    - not have your users presented with UNEXPECTED EXCEPTION - YOUR ADMIN DID THINGS BADLY
    - maybe deflect a few error messages, in case some things can be fixed by the user doing things differently
    - have a better understanding of how often Errors happen.
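
    If it helps to read it as code, a Fault Path is roughly the declarative equivalent of the Apex sketch below (the Case logic and the recipient address are placeholders):

        // Apex analogy of a Fault Path: catch the failed DML and notify,
        // instead of letting a raw "UNEXPECTED EXCEPTION" screen reach the user.
        List<Case> casesToEscalate = [SELECT Id, Priority FROM Case WHERE IsClosed = false LIMIT 10];
        for (Case c : casesToEscalate) {
            c.Priority = 'High';
        }
        try {
            update casesToEscalate;
        } catch (DmlException e) {
            Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
            mail.setToAddresses(new List<String>{ 'admin@example.com' });  // placeholder recipient
            mail.setSubject('Case escalation failed');
            mail.setPlainTextBody(e.getMessage());
            Messaging.sendEmail(new List<Messaging.SingleEmailMessage>{ mail });
        }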

    You want to supercharge your error handling? Audit Nebula Logger to see if it can suit your needs. With proper implementation (and knowledge of how to service it, remember that installed code is still code that requires maintenance), Nebula Logger will allow you to centralize all logs in your organization, and have proper notification when something happens - whether in Flow, APEX, or whatever.
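
    If you do go the Nebula Logger route, here is a minimal sketch of what a catch block can push to it, assuming the package's Logger class is available (from a Flow, you would call its invocable logging action on the Fault Path instead):

        // Nebula Logger sketch: log the failure, then persist the buffered entries.
        try {
            insert new Lead(LastName = 'Doe', Company = 'Acme');   // placeholder DML that might fail
        } catch (DmlException e) {
            Logger.error('Lead creation failed: ' + e.getMessage());
            Logger.saveLog();   // persists the log records
        }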

  3. Don't exit loops based on decision checks

    The Flow engine doesn't support that well and you will have weird and confusing issues if you ever go back to the main loop.

    image-1644417909572.png

    Don’t do this either - always finish the loop

    Issues include variables not being reset, DML errors if you do come back to the loop, and generally unpredictable situations.
    You can still do this if you absolutely NEVER come back to the loop, but it's bad design.

  4. Do not design Flows that will have long Wait elements

    This is often done by Admins coming from Workflow or Process Builder space, where you could just say "do that 1 week before contract end date" or "1 day after Opportunity closure". This design is sadly as outdated as the tools that permitted it.
    Doing this will have you exceed your Paused Interview limits, and actions just won't be carried out.

    A proper handling of "1 day before/after whenever", in Flow, is often via a Scheduled Flow.
    Scheduled Flows execute once daily (or more if you use plugins to allow it), check conditions, and execute based on these conditions. In the above case, you would be creating a Scheduled Flow that:

    • Queries all Contracts that have an End Date of TODAY() + 7 (one week from now)
    • Proceeds to loop over them and do whatever you need it to

    Despite it not being evident in the Salesforce Builder, there is a VERY big difference between putting criteria in the Scheduled Flow's Start element and doing an initial GET.
    - Putting criteria in the Start element offers fewer condition types, but effectively limits the scope of the Flow to only these records, which is great in big environments. It also fires one Flow Interview per record, and then bulkifies operations at the end - so doing an additional GET when you already have criteria in the Start element should be done only after due consideration.
    - Conversely, putting no criteria in the Start element and relying on an initial GET runs a single Flow Interview, and so will run less effectively on huge numbers of records, but does allow you to handle more complex selection criteria.
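
    For reference, the Get Records described above boils down to roughly this query - a sketch using the standard Contract.EndDate field, with the 7-day offset matching the "one week before end date" example:

        // Rough SOQL equivalent of the Scheduled Flow's record selection.
        List<Contract> endingInOneWeek = [
            SELECT Id, EndDate
            FROM Contract
            WHERE EndDate = :Date.today().addDays(7)
        ];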

  5. Do not Over-Optimize your Flows

    When Admins start becoming great at Flows, everything looks like a Flow.
    The issue with that is that sometimes, Admins will start building Flows that shouldn't be built because Users should be using standard features (yes, I know, convincing Users to change habits can be nigh impossible but is sometimes still the right path)... and sometimes, they will keep at building Flows that just should be APEX instead.

    If you are starting to hit CPU timeout errors, Flow Element Count errors, huge amounts of slowness... You're probably trying to shove things in Flow that should be something else instead.

    APEX has more tools than Flows, as do LWCs. Sometimes, admitting that Development is necessary is not a failure - it's just good design.
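
    When Development does become necessary, the usual bridge is an invocable Apex action that your Flow can still call. A minimal sketch, with made-up class name, label, and logic:

        // Invocable Apex action callable from Flow - the heavy lifting lives in Apex,
        // the orchestration stays in the Flow.
        public with sharing class ContractRenewalAction {
            @InvocableMethod(label='Flag Contracts For Renewal' description='Marks the passed Contracts as needing renewal follow-up.')
            public static void flagForRenewal(List<Id> contractIds) {
                List<Contract> toUpdate = new List<Contract>();
                for (Id contractId : contractIds) {
                    // Description is used as a stand-in field for the sketch
                    toUpdate.add(new Contract(Id = contractId, Description = 'Renewal follow-up needed'));
                }
                update toUpdate;
            }
        }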

On Flow-Specific Design

  1. Flows should have one easily identifiable Triggering Element

    This relates to the Naming Conventions.

    The Triggering Element depends on the Flow Type:

    • Record-Triggered Flows: the Record that triggers the DML.
    • Event-based Flows: a single event, as simple as possible.
    • Screen Flows: either a single recordId, a single sObject variable, or a single sObject list variable. In all cases, the Flow that is being called should query what it needs by itself, and output whatever is needed in its context.
    • Subflows: the rule can vary - it can be useful to pass multiple collections to a Subflow in order to avoid recurring queries on the same object. However, passing multiple single-record variables, or single text variables, to a Subflow generally indicates a design that is overly coupled with the main flow and should be more abstracted.

    image-1691666525835.png

    image-1644417867237.png

    A good example of Triggering Elements naming
  2. Fill in the descriptions.

    You'll thank yourself when you have to maintain it in two years.
    Descriptions should not be technical, but functional. A Consultant should be able to read your Flow and know what it does technically. The Descriptions should therefore explain what function the Flow provides within the given Domain (if applicable) of the configuration.

    image-1644417872437.png

    Descriptions shouldn’t be too technical.

  3. Don't use the "Set Fields manually" part of Update elements.

    Yes, it's possible. It's also bad practice. You should always rely on a record variable, which you Assign values to, before using Update with "use the values from a record variable". This is mainly for maintenance purposes (in 99% of cases you can safely ignore pink elements in maintenance to know where something is set), but it is also impactful when you do multi-record edits and you have to manipulate the record variable and store the resulting manipulation in a record collection variable.

    image-1691673305766.png
  4. Don't do DMLs or Queries in Loops. Simpler: No pink squares in loops.
    As covered under System-Level Design, to avoid doing this, ASSIGN each record that you want to process to a collection variable in your loop, then do the DML outside of the loop at the end of your Flow. *1

    image-1644417893729.png

    image-1691673360337.png

    Don’t do this
  5. If you have the same logic happening in various places, you should repackage this logic into a Sub-flow.
    This will avoid you having to modify the same thing in 6 places. It also makes the Flows easier to read. *2

  6. Do not design Flows that will have long Wait elements and will be called by thousands of records.
    You'll exceed your Paused Interview limits. This kind of use case should be a Scheduled Flow anyway.


Cross-Flow Design

  1. Try to pass only one Record variable or one Record collection to a single Flow or Subflow.

    See "Tie each Flow to a Domain" below.
    Initializing a lot of Record variables on run often points to you being able to split that subflow into different functions. Passing Records as the Triggering Element, and configuration information as variables is fine within reason.

    image-1644417918698.png

    In the example, the Pricebook2Id variable should be taken from the Order variable.

  2. Try to make Subflows that are as reusable as possible.

    A Subflow that does a lot of different actions will probably be single-use, and if you need a subpart of it in another logic, you will probably build it again, which may lead to higher technical debt.
    If at all possible, each Subflow should execute a single function, within a single Domain.
    Yes, this ties into "service-based architecture" - we did say Flows were code.

  3. Do not rely on implicit references.
    This is when you query a record, then fetch parent information via {MyRecord.ParentRecord__c.SomeField__c}. While this is useful, it’s also very prone to errors (specifically with fields like RecordType) and makes for wonky error messages if the User does not have access to one of the intermediary records.
    Do an explicit Query instead if possible, even if it is technically slower.

  4. Tie each Flow to a Domain

    This is also tied to the Naming Conventions. Note that in the example below, the Domain is the Object that the Flow lives on. One might say it is redundant with the Triggering Object, except Scheduled Flows and Screen Flows don't have this populated, and are often still linked to specific objects, hence the explicit link.

    Domains are definable as Stand-alone groupings of function which have a clear Responsible Persona.

    image-1644417943258.png

  5. Communication between Domains should ideally be handled via Events

    See Domain-Driven Design below.

    In short, if a Flow starts in Sales (actions that are taken when an Opportunity closes for example) and finishes in Invoicing (creates an invoice and notifies the people responsible for those invoices), this should be two separate Flows, each tied to a single Domain.

    Note that the Salesforce Event bus is mostly built for External Integrations.
    The amount of events we specify here is quite high, and as such on gigantic organisations it might not be best practice to handle things this way - you might want to rely on an external event bus instead.

    That being said if you are in fact an enterprise admin I expect you are considering the best usecase in every practice you implement, and as such this disclaimer is unnecessary.

    image-1644417931475.png
    Example of Event-Driven decoupling

    In this example, the Technician On Site flow signals that an Intervention has started, which triggers a Work Order related flow.
  6. Communication between Flows should ideally be handled via Events to avoid heavy coupling.
    In the example above, a Flow that starts in Sales should fire an “Opportunity Closed” event, which will be listened to by the “Start Invoicing” flow to trigger the actions tied to Invoicing.
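
    For illustration, publishing such an event from code looks like the sketch below; the Opportunity_Closed__e event and its OpportunityId__c field are hypothetical, and in a pure Flow design you would publish it with a Create Records element instead:

        // Decoupling sketch: the Sales side publishes a platform event,
        // and the Invoicing Flow is triggered by it instead of being called directly.
        Opportunity_Closed__e evt = new Opportunity_Closed__e(
            OpportunityId__c = '0061t00000AAAAAAA1'   // placeholder Id for the sketch
        );
        Database.SaveResult result = EventBus.publish(evt);
        System.debug('Event published: ' + result.isSuccess());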

  7. Avoid cascading Subflows, wherein one calls another one that calls another one.

    Unless the secondary subflows are basically fully abstract methods handling inputs from any possible Flow (like one that returns a collection from a multipicklist), you're adding complexity in maintenance which will be costly.

Domain-Driven Design

  1. Identify the Domain of your Flow when you create it.
    One flow should have exclusively one attributable Domain.

  2. If you need to modify elements from another Domain in your Flow, emit an Event matching the current status instead, and use that to start another flow which has said attributed Domain.

  3. Domains are definable as Stand-alone groupings of function which have a clear Responsible Persona.

image-1644417943258.png

High-level communication between Domains

 

Record-Triggered Flow Design

  1. Create only one Flow per Object and per Operation type - meaning one BEFORE Create, one BEFORE Update, one AFTER Create, etc.
    In all cases apart from BEFORE flows, you can leverage Subflows to ensure order of execution and ease of maintenance.
    For BEFORE Flows, for the moment, best practice is still to have a single Flow whose functions are split via Decision nodes (see the sketch after this list).

    1. In the case of BeforeUpdate, the "decisions" should be done in a vertical tree reminiscent of Process Builder, and the actual actions and logic to the right side. This is done to familiarize Admins with Flows if they open them, and to ease migrating to Subflows when possible.

    2. In the case of AfterUpdate, the "decisions" should be done in a vertical tree reminiscent of Process Builder, and the logic should be stored in Subflows whenever possible, and called on the right side of the decision.
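
    This mirrors the long-standing "one trigger per Object" convention on the Apex side. A minimal sketch of the same idea in code, where the comments stand in for your Decision branches or Subflows:

        // One trigger per Object, one clearly ordered branch per operation type -
        // the Apex equivalent of "one Flow per Object and per Operation type".
        trigger OpportunityTrigger on Opportunity (before update, after update) {
            switch on Trigger.operationType {
                when BEFORE_UPDATE {
                    // same-record field changes: no DML needed, like a BEFORE-save Flow
                }
                when AFTER_UPDATE {
                    // cross-object logic: one "Subflow-like" handler call per function
                }
            }
        }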

  2. Prioritize BEFORE-save operations whenever possible.
    This is more efficient in every way for the database, and avoids recurring SAVE operations.

  3. Try to leverage Sub-Flows in all designs, including Record-Triggered Flows, for the reasons outlined above.

  4. Flows that exclusively are Email handlers should be their own record-triggered flows.
    This is done because the conditions for evaluating Flows can impact how Email sending is handled. Another advantage is that you can turn off your “email handler” flows while making changes or testing your other flows.

  5. You may want to start out in Auto-Layout mode, which makes things easier for beginners and keeps them nice and neat, and then turn it off as needed since it doesn’t support everything you might want to do. You can switch it on and off as needed, so there’s no commitment involved in that decision.

On Delayed Actions

Flows allow you to do complex queries and loops as well as schedules. As such, there is virtually no reason to use Wait elements or delayed actions, unless said waits are for a platform event, or the delayed actions are relatively short.

Any action that is scheduled for a month in the future for example should instead set a flag on the record, and let a Scheduled Flow evaluate the records daily to see if they fit criteria for processing. If they do in fact fit criteria, then execute the action.

A great example of this is Birthday emails - instead of triggering an action that waits for a year, do a Scheduled Flow running daily on Contacts whose birthday it is. This makes it a lot easier to debug and see what’s going on.
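
As a sketch, the daily selection for the birthday example can be written as the query below - it assumes the standard Contact.Birthdate field and SOQL date functions, so adapt it to your own criteria:

    // Daily Scheduled Flow selection, expressed as SOQL: contacts whose birthday is today.
    Date today = Date.today();
    Integer birthMonth = today.month();
    Integer birthDay = today.day();
    List<Contact> birthdaysToday = [
        SELECT Id, FirstName, Email
        FROM Contact
        WHERE CALENDAR_MONTH(Birthdate) = :birthMonth
          AND DAY_IN_MONTH(Birthdate) = :birthDay
    ];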

*1  This is easy to work around for DML, but sometimes less so for Get Records. There are apex actions you can install to work around this issue, with the caveats mentioned above about installed actions. If you are building a screen flow, this is less of an issue.

*2 (Note that as of Spring 21, record-triggered flows do not yet support calling subflows. You may consider waiting to use them until they do.)