It’s a season of change, and nothing has changed more than the user experience. Learn about this update and all of the other updates our product team launched in August:
- A whole new experience
- Workbot updates
- Connector updates
- Platform updates
A whole new experience. Automate faster. Automate together.
We are excited to introduce the brand-new experience! It's designed for speed, simplifies the UX to reduce effort, and redefines collaboration. Our product, engineering, and design teams spent several months redesigning the experience from the ground up, and in the two weeks since launch we have received overwhelmingly positive feedback on how easy it makes it for first-time users to pick up the product and quickly create automations.
Check out the videos on our YouTube channel with highlights of this new experience and learn how it can help your team create automations faster, together.
Learn more about the new experience.
Get more work done, faster with custom apps in Microsoft Teams
With the rise of remote work, there has been a tremendous growth in usage of custom chatbot apps to drive productivity. The latest update to the Workbot Connector for Microsoft Teams allows you to build custom apps for automating tasks like approvals, escalations, and helpdesk requests.
Before this update, you could only install one Workbot, Workbot for MS Teams, in an organization. This single Workbot housed all of the bot functionality and couldn't be rebranded. It also limited your ability to develop new Workbot actions without a Teams sandbox.
Build a custom app for Teams to provide personalized experiences
Here are some of the benefits of building your own custom app:
- Control bot branding – Give each custom app its own logo and design
- Segment functionality – Create purpose-specific bots, like bots for approvals and bots to retrieve sales data
- Choose scopes – Control which data your bots can access
- Have governance without sandboxes – Your dev, staging, and production Workbots can coexist in a single Microsoft organization. Just swap connections when you're ready to promote a bot to production.
Build multiple purpose-specific bots
Each business unit can have its own bot to brand and control. Sales teams can build a SalesBot that speeds up deal approvals and helps log calls in your CRM, HR teams can build an HRBot that employees can ask for benefits information, and more.
Learn more here.
Increased security and wider adoption with the Workbot Connector for Slack
Earlier this year, Slack redesigned the bot user token with a revised permissions model called granular permissions. With granular permissions, you can specify the exact scopes needed for your app to function. This month, we added the ability to set up granular permissions to the Workbot Connector for Slack.
Increased security and wider adoption
Imagine that you built a Slack app for approving PTO. Before granular permissions, a Slack workspace admin looking to add this PTO bot would have been asked to approve access to everything from channel histories to user email addresses, information the bot doesn't need to function. This would make any admin nervous, and more security-conscious admins might reject the bot outright.
With granular permissions, developers can request only the data that the bot needs to access. The smaller scope of access leads to better security and wider adoption, as admins can rest assured that the bot won't access sensitive data. Read more about granular permissions in Slack here.
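To illustrate how much narrower granular scopes are, here is a short Python sketch that builds a Slack OAuth v2 install URL requesting only what a PTO bot needs. The client ID is a placeholder, and the scope names are examples of common granular bot scopes, not a prescription:

```python
from urllib.parse import urlencode

# Placeholder client ID for illustration only.
CLIENT_ID = "1234.5678"

def install_url(scopes):
    """Build a Slack OAuth v2 install URL requesting only the given bot scopes."""
    query = urlencode({"client_id": CLIENT_ID, "scope": ",".join(scopes)})
    return "https://slack.com/oauth/v2/authorize?" + query

# A PTO-approval bot only needs to post messages and handle slash commands,
# so it requests just these granular scopes instead of the legacy catch-all
# bot scope that granted broad workspace access.
pto_bot_scopes = ["chat:write", "commands"]
print(install_url(pto_bot_scopes))
```

When an admin reviews this install request, the consent screen lists only the two narrow permissions, rather than the sweeping access the legacy bot token implied.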
Migrate your old custom bots by late 2021
Slack requires all new custom bots to use the updated bot token with granular permissions. If you created your bot after February 2020, no action is needed on your part. If you created your bot in or before February 2020, you must migrate to granular permissions by late 2021 or risk having your bot delisted.
Learn more about migrating to the new scopes
Robust data pipelines and automations for Snowflake in minutes
We are excited to announce that Workato’s connector for Snowflake is among the first Snowflake Ready connectors. This means you can be even more confident in the connector’s security, performance and scalability, ease of use, and best practices for connection and data handling. Join the upcoming Product Hour to explore the possibilities.
Sign up for the Product Hour
Simple, adaptable data pipelines for Microsoft SQL Server
When loading data from API endpoints, files or databases into a Data Warehouse, like SQL Server, it is common to model and create the tables in the destination database before executing the load. This time-consuming process often requires continually updating the data pipeline and mappings to stay current as new fields get added to the source objects.
Not keeping up with the changes in the source schema leads to broken data pipelines and data loss from not including the newly added fields. And with the rising adoption of SaaS applications, the ability for a data pipeline to adapt to changing source schemas is increasingly important.
Workato makes the creation and maintenance of your data pipelines simple, fast, and easy. You can use Workato’s SQL Server connector to easily manage replication from cloud and on-prem apps.
Now, with the new “Replicate rows” action in the SQL Server connector, you can track schema changes in Salesforce, Workday, ServiceNow, JIRA, Marketo, and other SaaS apps and adjust your data pipelines to ensure no data is lost.
How it works:
1) Initial setup for the data pipeline
- Initially set up the data pipeline with the “Replicate rows” action in the SQL Server connector. Note that the “Replicate rows” action processes data in batches.
- When you run the recipe job for the first time, Workato automatically reads the object definition (e.g. opportunity object in Salesforce), and creates a table in SQL Server based on the following information you provide in the recipe steps:
- Table Name: A unique table name that will be the destination for the replicated data. If there is no existing table, Workato automatically creates the table with the right field types.
- Unique Key: The unique key in the table that will be used to determine new/updated data for UPSERT operations.
- Flatten Columns: When replicating an object that may have a hierarchical data structure (e.g. address columns for an account object), you have the option to replicate the data as is or flatten the structure to create a separate column for each element. For example, when you flatten an Address object: Address Line 1, Address Line 2, City, State, Zip, and Country will be created as separate columns.
2) Capturing changed data and schema
- When changed data is detected in the data source, Workato first inspects the new object definition against the existing SQL Server table schema.
- When a mismatch between the source object schema and the table definition in SQL Server is detected, the recipe job alters the table definition to match that of the source object.
- Next, the recipe job will sync the data for the updated object definitions to ensure no data or schema changes are lost.
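The flow above can be sketched in a few lines of Python, using SQLite as a stand-in for SQL Server. This is a simplified illustration of the create–alter–upsert pattern, not the connector's actual implementation; all columns default to TEXT here, whereas the real action infers proper field types:

```python
import sqlite3

def replicate_rows(conn, table, unique_key, rows):
    """Sketch of the 'Replicate rows' flow: create the destination table on the
    first run, add columns for any newly appeared source fields, then upsert
    the batch keyed on the unique key."""
    cur = conn.cursor()
    columns = sorted({field for row in rows for field in row})
    # 1) First run: create the destination table if it does not exist yet.
    cur.execute(
        f"CREATE TABLE IF NOT EXISTS {table} "
        f"({', '.join(c + ' TEXT' for c in columns)}, UNIQUE({unique_key}))"
    )
    # 2) Schema drift: add a column for any source field missing from the table.
    existing = {info[1] for info in cur.execute(f"PRAGMA table_info({table})")}
    for col in columns:
        if col not in existing:
            cur.execute(f"ALTER TABLE {table} ADD COLUMN {col} TEXT")
    # 3) Upsert each row so new records insert and changed records update.
    for row in rows:
        cols = list(row)
        placeholders = ", ".join("?" for _ in cols)
        updates = ", ".join(f"{c}=excluded.{c}" for c in cols if c != unique_key)
        cur.execute(
            f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders}) "
            f"ON CONFLICT({unique_key}) DO UPDATE SET {updates}",
            [row[c] for c in cols],
        )
    conn.commit()
```

Running the same pipeline again after the source object gains a new field (say, a new `stage` column on a Salesforce opportunity) would simply add the column and backfill it on the next batch, with no pipeline breakage or data loss.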
Learn more about SQL Server replication in our documentation
Loading data into NetSuite just got easier and faster
The latest throughput optimizations to the NetSuite connector make bulk data loads 50% faster.
We’ve also made loading records with references or lookups to other objects easier. Previously, in order to load new NetSuite records, many steps were required to find the internal NetSuite ID of related objects before the load could happen.
Load data into NetSuite with just the external ID
Let’s say you need to upsert Salesforce opportunities into NetSuite to track sales orders. Previously, many steps were required to find the internal NetSuite IDs that link related Accounts and Marketing Campaigns to new or updated NetSuite orders.
Now it’s as simple as loading the opportunity as-is with its external IDs. As you can see in the example below, the difference in recipe complexity is stark.
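To see why external IDs remove the lookup steps, consider this hypothetical payload builder. The field names loosely follow NetSuite reference conventions but are illustrative assumptions, not the connector's exact schema:

```python
def sales_order_payload(opportunity):
    """Build a NetSuite sales-order upsert payload keyed on the Salesforce
    opportunity ID. Related records are referenced directly by external ID,
    so no internal-ID lookup steps are needed before the load.
    Field names here are hypothetical, for illustration only."""
    return {
        # Upsert key: the Salesforce opportunity ID stored as the external ID.
        "externalId": opportunity["sf_opportunity_id"],
        # Related Account, referenced by its external ID instead of a looked-up
        # internal NetSuite ID.
        "entity": {"externalId": opportunity["sf_account_id"]},
        # Related Marketing Campaign, likewise referenced by external ID.
        "campaign": {"externalId": opportunity["sf_campaign_id"]},
        "amount": opportunity["amount"],
    }
```

The previous approach needed a search step per referenced record (one for the Account, one for the Campaign) before the upsert; here the whole mapping collapses into a single step.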
Read more about the updates to the NetSuite Connector
Optimize your business processes with the Celonis Connector
Celonis runs algorithms on event log data, like transaction logs in an ERP application or electronic patient records in a hospital. This “process mining” helps business leaders identify friction in their processes across sales, operations, procurement, and more. But while process mining provides a clearer view of inefficiencies, corrective action is still manual.
Add automation to process mining
Celonis can identify issues, but it takes Workato to automate the fixes to them. When Celonis identifies a process blocker, it sends an alert to Workato that triggers automations that unblock your business processes – without any human intervention.
Celonis and Workato work together to shorten procurement times
Let’s say you are a procurement manager at a company that handles thousands of requisitions each day using both Coupa and NetSuite for procurement. After adopting Celonis, you identify that the root cause of purchase order delays is price discrepancies between Coupa and NetSuite. POs are held up until each discrepancy is resolved, which often adds a week or more to the procurement process. Here’s a process you can use to shorten these delays.
1. When Celonis picks up a price mismatch in the PO during the procurement process, it alerts Workato.
2. Next, Workato connects to Coupa and NetSuite, does a comparison, and resolves the price discrepancy.
Read more about Workato’s Celonis Connector
Automate more of Workato’s platform with additional Platform APIs
Workato’s Platform APIs work hand-in-hand with RecipeOps to automate the managing and provisioning of assets, like recipes and jobs, used on the Workato platform. Platform APIs are often used to automate the migration of Workato packages from dev, to test, to production environments.
Manage even more Workato functionality programmatically by using the Workato API. You can now:
Automate Onboarding/Offboarding with the Custom Role API
Let’s say you’ve onboarded a new employee who needs access to Workato to build automations. Automate the setup of new users in Workato with the new Custom Role API. Only give them access to the folders that they need. When users leave your company, automatically deprovision them from all platforms with this API.
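A sketch of what that provisioning call might look like. The `/members` path and the payload fields are assumptions made for illustration; check the Workato Platform API reference for the exact endpoint and request body:

```python
import json
import urllib.request

BASE = "https://www.workato.com/api"

def provision_user_request(email, role_name, token):
    """Build (but do not send) a request that invites a collaborator with a
    custom role scoped to just the folders they need. The endpoint path and
    body fields are hypothetical, for illustration only."""
    body = json.dumps({"email": email, "role_name": role_name}).encode()
    return urllib.request.Request(
        f"{BASE}/members",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# Send with urllib.request.urlopen(req) once the path and fields are verified.
req = provision_user_request("new.hire@example.com", "Builder", "YOUR_API_TOKEN")
print(req.get_method(), req.full_url)
```

Offboarding is the mirror image: the same pattern with a delete call removes the departing user's access programmatically.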
Turn API Endpoints on and off when a system goes down
You’ve used API Platform to grant controlled access to SAP data. Suddenly, SAP goes down, but users are still trying to access it. You can now automatically stop traffic to the app. Once the system is back up, turn the endpoint back on programmatically.
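A minimal sketch of toggling an endpoint from a monitoring script. The enable/disable paths shown follow the pattern in Workato's Platform API documentation, but verify them against the current reference before relying on this:

```python
import urllib.request

BASE = "https://www.workato.com/api"

def toggle_endpoint_request(endpoint_id, enable, token):
    """Build (but do not send) the request that flips an API endpoint on or
    off. Path pattern assumed from Workato's Platform API docs; confirm the
    exact route in the current reference."""
    action = "enable" if enable else "disable"
    return urllib.request.Request(
        f"{BASE}/api_endpoints/{endpoint_id}/{action}",
        method="PUT",
        headers={"Authorization": f"Bearer {token}"},
    )

# When monitoring detects SAP is down, cut traffic to the endpoint;
# build the opposite request with enable=True once SAP recovers.
req = toggle_endpoint_request("12345", enable=False, token="YOUR_API_TOKEN")
print(req.get_method(), req.full_url)
```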
Optimize Recipe Development Lifecycle Management
You can now obtain the names and IDs of folders via API. This enables you to automate even more of the migration process between environments. The video below explains how Workato handles Recipe Lifecycle Management and CI/CD.
Learn more about Workato’s Platform APIs.
Control API calls and increase efficiency with configurable poll intervals
When you use a poll-based trigger, the default is to check the source application every 5 minutes for any updates.
This setup works for many use cases, but if the source application charges based on the number of compute hours used, you may want a longer interval between polls. Now you can pick from a list of interval options, ranging from 6 hours to 1 day, or even 30 days.
Reduce costs and stay under API limits
Let’s say you want to reduce the number of polls your recipe makes to Snowflake, since Snowflake’s pricing model is based on the number of compute hours used. Fewer polls also reduce database load by fetching larger batches of data at a slower cadence.
With control over the polling interval, you can reduce polling to once every few hours. This reduces costs because fewer API calls are made, increases efficiency because each poll loads a larger batch of data, and helps you stay under API limits.
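The arithmetic behind the savings is straightforward: stretching the interval from the 5-minute default to 6 hours cuts daily polls by a factor of 72.

```python
def polls_per_day(interval_minutes):
    """Number of trigger polls made in a 24-hour period at a given interval."""
    return (24 * 60) // interval_minutes

default = polls_per_day(5)        # 5-minute default interval
relaxed = polls_per_day(6 * 60)   # relaxed 6-hour interval
print(default, relaxed, default // relaxed)  # → 288 4 72
```

Each of those 4 daily polls then fetches a correspondingly larger batch, so total data moved stays the same while per-call overhead and metered compute drop.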
Learn more about changing the poll interval.
Stay up-to-date on product features
We hope you’ve enjoyed this edition of Product Updates! For all of the latest updates, please visit https://docs.workato.com/product-updates.html