1. What Does This Page Cover?
Learn how you can efficiently split your larger tasks into smaller batches and process them at off-peak hours using batch workflows.
2. Availability
The batch workflows feature:
- Is available only in the paid plans of Creator, with a limited number of batch blocks for each plan (refer to the Things to Know section)
- Can only be created, enabled, and managed by the super admin, admins, and developers, while other users can benefit from the automated execution of repetitive tasks.
- Is available in all data centers
3. Overview
Batch processing is used to efficiently and periodically complete high-volume and repetitive data jobs by splitting the data at hand into manageable batches. Each batch is then processed separately in an asynchronous flow. This method requires minimal human interaction and can be run sequentially throughout the day or at customized intervals. Some common business processes that can use batch processing include generating reports, printing documents, or updating information at the end of the day.
4. Batch workflows in Creator
In Creator, workflows comprise a set of actions that can be configured to execute at specific instances to automate routines in your application. Batch workflows enable you to efficiently execute customized Deluge scripts on a large volume of data, either after a successful import or at scheduled intervals. You can split the bulk records into smaller batches of 10 to 1000 records each and, in the Execution section, configure the action to be run for every record in a batch block (a collection of grouped records). You'll then be shown the Before and After sections, where you can specify the actions to be triggered before and after the execution of the above Deluge script. For example, instead of processing every order as it occurs, you can configure a batch workflow to collect all orders at the end of each day and share them with the order fulfilment team (see the sketch after the points below). This splitting of data processing helps you overcome Deluge's execution limitations (the 10-50k statement limit) and ensures smooth processing of large datasets.
- You can select a batch size that optimizes the overall script execution time and resource utilization, thus balancing performance with processing efficiency.
- If an error occurs in one of the batches during the batch workflow execution, that particular batch alone will be reverted. This means that an error in one batch doesn't affect the successful execution of other batches in the same batch block.
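For instance, the end-of-day order example above could be handled by a script along the following lines. This is a minimal sketch, not the feature's exact configuration: the Orders form, its Order_Date field, and the fulfilment-team address are all assumptions for illustration, and a single fetch in a batch workflow returns at most 200 records.
    // Collect today's orders and send a single end-of-day summary email.
    // Form, field, and recipient names below are assumptions for illustration.
    todays_orders = Orders[Order_Date == zoho.currentdate];
    summary = "Orders received today: " + todays_orders.count();
    // Hypothetical recipient address.
    sendmail
    [
        from :zoho.adminuserid
        to :"fulfilment.team@example.com"
        subject :"End-of-day order summary"
        message :summary
    ]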
4.1 Examples of processes that batch workflows can automate
Batch workflows can be configured to process various types of data requests, some of which commonly include:
- Periodic billing - Calculate the total amount periodically for individual billing dates.
- Invoice generation - Generate sales invoices as per your requirements.
- Payroll processing - Mail monthly salary slips to each employee in your organization.
- Inventory management - Track product stock periodically so that you never run out of stock (see the sketch after this list).
- Fraud detection - Generate monthly credit card statements to help detect fraudulent transactions.
Thus, batch workflows increase operational efficiency, automate repetitive tasks, and maximize productivity for your business, all while reducing the need for manual supervision.
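As an illustration of the inventory example above, the script configured in a batch workflow could look something like the following minimal sketch. The Products form, its Stock_Level field, the reorder threshold, and the recipient address are all assumptions for illustration.
    // Fetch products below a hypothetical reorder threshold (a fetch returns at most 200 records).
    low_stock = Products[Stock_Level < 10];
    if(low_stock.count() > 0)
    {
        // Hypothetical recipient address.
        sendmail
        [
            from :zoho.adminuserid
            to :"inventory.team@example.com"
            subject :"Low stock alert"
            message :"Products below the reorder threshold: " + low_stock.count()
        ]
    }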
5. Stages in a Batch Workflow Execution
A batch workflow includes the following three stages, with the ability to configure individual actions for each stage.
Note: The Before and After Execution blocks are optional based on your requirements. However, you must configure the script to be executed in the During Execution block.
- Before Execution of Batch Workflow: You can define and initialize variables (which can also be used in the other two blocks) and configure the required actions to be carried out before the batch workflow execution begins. Examples of initialization actions include sending notifications before the process begins, making webhook calls to fetch stock from inventory and prepare a list from the response, and so on. Learn how
The Before Execution block is executed first when the batch workflow execution starts, and it is executed only once.
- During Execution of Batch Workflow: Here, you can configure a single action to be executed for every record in a batch block. This action is performed for each record while the batch workflow execution is in progress. For example, you can configure an action that calculates each employee's monthly salary, factoring in taxes, benefits, and more (see the sketch after this list). Learn how
The During Execution block is executed iteratively until all batch blocks in that batch workflow are completed.
- After Execution of Batch Workflow: You can configure actions to be executed after all batches have run to handle both success and failure scenarios, such as performing cleanup tasks, displaying the respective success or failure notifications, and handling errors. For example, you can configure an email notification to each employee about their monthly payslip after the batch workflow execution completes successfully. Learn how
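To make these stages concrete, here is a minimal Deluge sketch of the kind of logic each block could hold for the payroll example above. The Payroll form, its fields, the record lookup, and the shared counter variable are assumptions for illustration; exactly how the current record and variables are exposed to each block depends on your configuration.
    // Before Execution (runs once): initialize a counter shared with the other blocks.
    processed_count = 0;

    // During Execution (runs for every record in a batch block): compute one employee's pay.
    // 'emp' stands for the record being processed; it is fetched here by a hypothetical ID.
    emp = Payroll[Employee_ID == "EMP-001"];
    gross = emp.Basic_Pay + emp.Allowances;
    emp.Net_Salary = gross - emp.Tax_Deduction;    // assigning to a fetched record's field updates it
    processed_count = processed_count + 1;

    // After Execution (runs once, after all batches): notify the payroll team.
    // Hypothetical recipient address.
    sendmail
    [
        from :zoho.adminuserid
        to :"payroll.team@example.com"
        subject :"Monthly payroll batch completed"
        message :"Payslips generated for " + processed_count + " employees."
    ]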
6. Prerequisites
- You must have a form in your application based on which you can configure a batch workflow.
- You need to decide when to trigger the batch workflow and have an idea of the number of records to be processed in each batch, i.e., the batch size.
7. See How it Works
Let’s assume that your organization issues credit cards and has built a Creator application named Credit Card Management. You need to generate and send billing statements on the 1st of every month to your respective customers. To do so, you can schedule batch workflows to be run on the 1st of every month with the action configured to generate relevant statements. You can also utilize variables for dynamic calculations like deriving the total bill amounts for each customer. Upon successful execution, you can configure the monthly credit card billing statements to be sent as attachments via email.
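A minimal sketch of the statement calculation is given below, assuming a hypothetical Transactions form with Amount and Billed fields in the Credit Card Management app; in practice, the recipient address would come from the customer record and the statement would be sent as an attachment. The total is summed in a loop because aggregate functions other than count are not supported on fetched records in batch workflows.
    // Total the customer's unbilled transactions (a fetch returns at most 200 records).
    // Form, field, and recipient names are assumptions for illustration.
    pending = Transactions[Billed == false];
    total = 0.0;
    for each txn in pending
    {
        total = total + txn.Amount;
    }
    // Hypothetical recipient address.
    sendmail
    [
        from :zoho.adminuserid
        to :"customer@example.com"
        subject :"Your monthly credit card statement"
        message :"Your total for this billing cycle is " + total
    ]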
8. Use Cases
Case 1: After successful import of records
In the sales and marketing ecosystem, managing leads efficiently is paramount for driving conversions and increasing revenue growth. Let's assume you've created a Creator application named Leads Management.
Your organization receives a huge volume of leads from various sources, including imports from third-party platforms. Each lead requires a lead score calculation based on various attributes like engagement level, demographic data, and past interactions. Additionally, assigning the right lead owner based on territory, expertise, or workload is crucial for timely follow-ups and conversions. However, manually processing these tasks for a significant number of leads is time-consuming and prone to human errors.
Imagine that you're importing these leads into your application. To streamline this process, you can create a batch workflow to be executed after successfully importing the leads. You can configure actions to calculate the lead scores and assign lead owners to the imported leads.
Before Execution Block
- Perform initialization tasks, including notifications to relevant stakeholders about the commencement of lead processing.
During Execution Block
- Configure the batch size to handle leads in manageable chunks.
- Develop scripts to calculate lead scores based on predefined criteria and assign appropriate lead owners using territory mapping or workload distribution algorithms (see the sketch after this list).
After Execution Block
- Implement actions based on the success or failure of batch execution.
- Notify stakeholders about the completion of lead processing and provide insights into the processed leads, including statistics on lead scores and ownership assignments.
Thus, this feature enables automated processing of lead data in manageable batches, ensuring efficiency and accuracy.
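A minimal sketch of the During Execution logic for this use case is shown below. The Leads form, its fields (Engagement_Level, Region, Lead_Score, Lead_Owner), the scoring rule, and the territory mapping are all assumptions for illustration; 'lead' stands for the record processed in the current batch and is fetched here by a hypothetical ID.
    // Score a lead and assign an owner based on simple, hypothetical rules.
    lead = Leads[Lead_ID == "LD-1001"];
    score = 0;
    if(lead.Engagement_Level == "High")
    {
        score = score + 50;
    }
    if(lead.Region == "EMEA")
    {
        owner = "emea.sales@example.com";
    }
    else
    {
        owner = "global.sales@example.com";
    }
    lead.Lead_Score = score;
    lead.Lead_Owner = owner;    // two field updates, well within the per-record CRUD limit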
Case 2: Set custom schedule
Let's assume you manage an e-commerce system that receives orders throughout the day. You've created a Creator application named Order Management. As a marketing manager, you want to run a process every month that aggregates customer feedback from your application reports and generates a report on customer satisfaction. You can create a batch workflow, specify the data source, configure the required scripts to run at a specified time, and easily share the collected data with stakeholders.
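For example, the aggregation script could look something like the following minimal sketch, assuming a hypothetical Feedback form with a numeric Rating field in the Order Management app and a hypothetical stakeholder address. The average is computed in a loop because aggregate functions other than count are not supported on fetched records in batch workflows.
    // Aggregate ratings from fetched feedback entries (a fetch returns at most 200 records).
    feedback_entries = Feedback[Rating >= 1];
    total_rating = 0;
    for each entry in feedback_entries
    {
        total_rating = total_rating + entry.Rating;
    }
    average = 0.0;
    if(feedback_entries.count() > 0)
    {
        average = total_rating.toDecimal() / feedback_entries.count();
    }
    // Hypothetical recipient address.
    sendmail
    [
        from :zoho.adminuserid
        to :"stakeholders@example.com"
        subject :"Monthly customer satisfaction report"
        message :"Average rating this month: " + average
    ]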
9. Navigation Guide to Create a Batch Workflow
Navigate to the Solutions module and open the required application. Then click the Workflow tab at the top to navigate to the workflow dashboard. Here, you can select the batch workflows section and follow the steps mentioned here.
10. Things to Know
- You can configure batch workflows in the following cases — after successful import of your application's records or at a specified time.
- You can create multiple batch workflows based on the same form with different or identical conditions.
- All the batches in a batch block will be executed sequentially.
- You can track the status of batch workflow executions through the application logs section. It provides visibility into the queue status, in-progress executions, and completion status (success or failure).
- You can view error logs for the first 5 batch block failures, providing insight into the reasons for execution failures.
- If one or more batches in a block fail, only those batches will be reverted, and the batch workflow will continue executing the other batches. You can view each failed batch's failure message in the logs.
- You can execute batch workflows for batches in sizes of 10, 50, 100, 200, 500, and 1000 records.
- You can create up to 100 batch workflows at a time in your account.
- You can execute up to a certain number of batch block executions per account in a day based on your pricing plan.
- The execution timeout for each batch is set to 1 minute, i.e., if a batch block's size is configured as 1000 records, then the timeout for the execution of those 1000 records will be 1 minute.
- Similarly, the timeout for the Before and After Execution blocks is 30 seconds each.
11. Limitations
- This workflow is currently not supported for integration forms.
- Users can create up to 5 variables within a batch workflow.
- Only one batch workflow can be executed at a time for an account (in order of creation date). This means that additional batch workflows will be queued for execution, thereby ensuring fair resource allocation and preventing overload on system resources.
- You cannot trigger batch workflows for records imported by portal users in the customer portal.
Deluge limitations
- The "old" keyword is not supported in batch workflows.
- You can configure up to 10 CRUD actions (add, update, and delete) for each record in a batch workflow.
- You cannot fetch more than 200 records.
- An info message cannot exceed 50 KB; anything beyond this limit will be truncated.
- You can send up to a maximum of 3 email notifications with attachments.
- The maximum size of file attachments allowed is 15 MB. If the attachments exceed this limit, the emails will be delivered without them.
- If you want to get the content of a specified file or create a file object with the required content, the respective file size should be less than 2 MB.
- getUrl() and postUrl() tasks are currently not supported.
- You cannot assign values to global variables. You can still access their values, for example, in an info statement.
- The following Deluge operations are not supported in batch workflows.
- Fetching all values of a field
- Using aggregate functions (except count) on fetched records
- 'Call Function' task
- Built-in functions such as:
- matches()
- replaceAll()
- replaceFirst()
- replaceAllIgnoreCase()
- replaceFirstIgnoreCase()
- compress()
- Client functions
Related topics:
- Understand records
- Create and manage batch workflows