23.9 Release Notes

Structured Planning: Enhanced View Mode Indication for Employee Record in WFP Templates

Enhanced view mode in Workforce Planning Templates now clearly displays non-editable employee details for improved usability.


With this release, we have enhanced the usability of the employee details interface in the Workforce Planning Templates when opened in View mode.

Previously, when you opened an employee record in a Workforce Planning template in View mode, the fields appeared editable, which was often confusing. With this enhancement, employee details are now clearly displayed as non-editable in View mode.

The following illustration shows the employee details screen with an improved display:

Dynamic Planning: Enhanced SSO Login for Dynamic Planning Report Links Based on Organization (Tenant)

The latest update to Dynamic Planning introduces a significant improvement to the SSO login process, making it easier for users to access shared Dynamic Planning Report links. SSO users are now taken to the relevant page associated with a clicked URL in a single click.

Previously, inactive SSO users were redirected to the login page when they clicked a unique URL from an email. Even after logging in via SSO or user credentials, they were then redirected to the task manager screen instead of the intended URL or page.

With the latest release, users clicking on a URL containing artifacts related to a specific product area or report will experience accurate redirection to the corresponding page after a successful login. This enhanced redirection is now based on the organization's (tenant's) ID and the obtained IDP profile. As a result, the SSO login experience becomes seamless and successful for all users.
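The sketch below (plain Python with placeholder tenant IDs, hosts, and URLs; not Planful's actual implementation) illustrates the general relay-state pattern described above: the clicked URL is remembered, the IdP is resolved from the organization (tenant) ID, and the user is redirected to the originally requested page after a successful login instead of a default landing screen.

    # Illustrative sketch only, not Planful's implementation.
    from urllib.parse import urlparse, parse_qs

    # Hypothetical tenant-to-IdP mapping.
    IDP_BY_TENANT = {"acme-corp": "https://idp.example.com/sso/acme-corp"}

    def handle_deep_link(clicked_url):
        """Build the login request, carrying the clicked URL as relay state."""
        tenant_id = parse_qs(urlparse(clicked_url).query)["tenant"][0]
        return {"idp": IDP_BY_TENANT[tenant_id], "relay_state": clicked_url}

    def post_login_redirect(login_ok, relay_state):
        """After authentication, land on the requested page, not a default screen."""
        return relay_state if login_ok else "/login"

    request = handle_deep_link("https://app.example.com/reports/42?tenant=acme-corp")
    print(post_login_redirect(True, request["relay_state"]))  # .../reports/42?tenant=acme-corp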

Dynamic Planning: Introduced the Enable Calculation Execution Status Notification for Views and Reports Flag

Receive real-time notifications for ongoing calculations in SpotlightXL, ensuring data consistency in collaborative efforts.


With this release, we have introduced a new flag, Enable Calculation Execution Status Notification for Views and Reports, in SpotlightXL. This flag gives users real-time notifications about ongoing calculations that may impact their views and reports.

Previously, there were instances where data inconsistencies could occur when two different users edited and tried to save or refresh changes simultaneously in the same view. This was possible because users could save or refresh data even while the calculation associated with the view was still in progress. 

With this new flag enabled, when one user saves data to a view, the calculation associated with that view is triggered. While that calculation is in progress, a second user who attempts to refresh or save data in the same view receives a notification. In fact, if any calculation is running on a model, a notification is displayed when a user refreshes or saves data in any view or report of that model, regardless of whether the calculation is linked to that specific view or report.

The following notification will appear when a user tries to save data to the same view:

The following notification appears when the user refreshes data in the same view:

To illustrate this, consider a scenario at XYZ Company where Sarah and Tom simultaneously add data to a shared view. Previously, both Sarah and Tom could save their data independently, even when the calculation associated with the view was still in progress. This could lead to data inconsistencies.

Now, if either Sarah or Tom attempts to save or refresh their data, a notification will inform them that the calculation associated with the view is currently in progress. They cannot save their changes until the ongoing calculation is completed. This ensures that the data remains consistent and accurate, eliminating any possibility of discrepancies.
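The following Python sketch is illustrative only (it is not SpotlightXL code) and mimics the behavior described above: a save triggers the model's calculation, and any save or refresh on that model while the calculation is running surfaces a notification instead of proceeding.

    # Minimal sketch of the notification behavior; not SpotlightXL code.
    import threading
    import time

    calc_running = {}  # model name -> is a calculation currently in progress?

    def save(model, view, user):
        if calc_running.get(model):
            print(f"{user}: calculation on '{model}' is in progress, cannot save '{view}' yet")
            return
        calc_running[model] = True  # saving triggers the model's calculation
        print(f"{user}: saved '{view}', calculation started on '{model}'")
        # Pretend the calculation finishes after two seconds.
        threading.Timer(2.0, calc_running.update, args=({model: False},)).start()

    def refresh(model, view, user):
        if calc_running.get(model):
            print(f"{user}: calculation on '{model}' is in progress, refresh of '{view}' is blocked")
            return
        print(f"{user}: refreshed '{view}'")

    save("Revenue", "Q3 Forecast", "Sarah")    # triggers the calculation
    refresh("Revenue", "Q3 Actuals", "Tom")    # notified: any view of the model is affected
    time.sleep(2.5)
    refresh("Revenue", "Q3 Actuals", "Tom")    # succeeds once the calculation has finished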

Note:
All power users can enable this flag.

In Practice: Enable Calculation Execution Status Notification for Views and Reports Flag

  1. Go to Manage > Application Administration > Application Settings.
  2. In the Calculation Property section, set Enable Calculation Execution Status Notification for Views and Reports to Yes.

Consolidation: Automatic Processing of Reclassification Journals with Cloud Scheduler

With the new Select All option, you can conveniently automate the processing of Reclassification journals with the Cloud Scheduler, eliminating manual effort.



With this release, you can auto-process Reclassification journals on a predefined schedule with the Cloud Scheduler. This enhancement reduces the manual effort of processing and posting Reclassification journals on a predefined schedule. We have also introduced a new option to select all journals available at the time of processing. Previously, you could only select the Reclassification journals listed in the Journals tab individually. All the selected journals are visible in the Selected Journals tab.

Now, the Select All option allows you to process all Reclassification journals for a predefined schedule, eliminating manual effort.

In Practice: How to schedule Reclassification journals

  1. Navigate to Maintenance > Administration > Cloud Scheduler.
  2. Select the Process Flow tab, and click Add.
  3. In the General Information tab, fill in the details for Code, Name, and Email Recipients.
  4. In the Tasks tab, click Add Task. A New Task window appears.
  5. Select the task type Reclassification Journal and fill in all the details.
  6. In the Reclassification Journals option, click Selected or Select All to process the relevant journals.
  7. Click Save.
  8. In the Scheduler tab, define all the fields for the Reclassification journals you want to process monthly.

Predict: Enhanced Signal Sensitivity Adjustment in Dynamic Reports

Use the Signals Range option in Dynamic Reports to effortlessly customize the signals sensitivity of reports on the fly.



With this release, we have enhanced the Signals Sensitivity feature so you can temporarily modify the sensitivity level for a specific report on the fly, affecting only the currently open session in Dynamic Reports. When you refresh or close the session, the sensitivity reverts to the setting configured on the Predict Admin screen. An easy-to-use slider lets you adjust the sensitivity level, so you can view fewer or more signals directly within the report.

Once you have generated signals for the report, you can access the Signals Range option in either of the following ways:

  1. Navigate to the Predict: Signals drop-down menu in the top menu bar and select Adjust Signals Range.

  2. Go to the legend panel and click Signals Range.


Integrations: Introduced Native FTP/SFTP connector in Cloud Services

Introducing the FTP/SFTP option for Cloud Services to elevate your data integration! Users can now load GL data, translations, and transaction data from an FTP/SFTP server to the Planful application.



Until now, users have been able to integrate data from Cloud Services like Box, Google Drive, and NetSuite Connect into the Planful application. With the latest release, we are taking integration a step further by introducing the FTP/SFTP option. This enhancement provides flexibility to load data into Planful.

Note
You must contact the Planful Support team to enable the FTP/SFTP functionality in your application. Also, ensure that Ivy is enabled in your application before you start using the FTP/SFTP connector.

The native FTP/SFTP connector allows Planful to connect to your FTP/SFTP server, fetch any new files, and load the data into the application. Users simply upload files to their server, and the Cloud Scheduler executes the Process Flow, eliminating the need for any third-party services to integrate the data into Planful.

To initiate the integration, users need to configure the FTP/SFTP server with the Planful application. After successful configuration, the system automatically creates three folders (Input, Success, and Failure) on the server. When the user creates an FTP/SFTP Data Load Rule in Planful, new sub-folders with the DLR name are created inside each of these three folders. The data can then be placed in the Input folder, and the Process Flow can be executed manually or on a schedule.

Let's understand this with an example:

Let’s say a company receives sales data from multiple branches, and each branch sends its data via the SFTP server. With the new FTP/SFTP option, the company can create a Data Load Rule for each branch and schedule an automated process execution daily. As per the schedule, the sales data is fetched automatically from the SFTP server each day and integrated into Planful.
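As an illustration of the sending side of this flow, the following Python sketch uses the paramiko library to drop a branch's daily file into its DLR's Input sub-folder over SFTP. The host, credentials, and folder paths are placeholders and this is not a Planful-provided script; Planful's Cloud Scheduler then fetches the file per its schedule.

    # Illustrative only; not a Planful-provided script. Host, credentials, and
    # folder paths are placeholders.
    import paramiko

    HOST, PORT = "sftp.example.com", 22
    USERNAME, PASSWORD = "planful_loader", "********"   # basic auth only
    REMOTE_DIR = "/input/branch_east_sales"              # Input/<dlr_folder_name>

    def upload_daily_file(local_path, remote_name):
        """Drop a data file into the DLR's Input sub-folder for the next scheduled run."""
        transport = paramiko.Transport((HOST, PORT))
        transport.connect(username=USERNAME, password=PASSWORD)
        try:
            sftp = paramiko.SFTPClient.from_transport(transport)
            sftp.put(local_path, f"{REMOTE_DIR}/{remote_name}")
        finally:
            transport.close()

    upload_daily_file("sales_2023_09_30.csv", "sales_2023_09_30.csv")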

In Practice: Integrating SFTP Server Data with Planful

Users can load GL, translations, and transaction data from the SFTP server to the Planful application. To integrate data from the SFTP server into the application:

  1. Connect to your SFTP server within Planful Application.
  2. Create Data Load Rules (DLR) to Load FTP/SFTP File.
  3. Schedule and Execute the DLR with Cloud Scheduler from the server.

Connect to your SFTP server within Planful Application

The initial step to integrate FTP/SFTP involves a one-time configuration on the Cloud Services page. To perform the configuration, please follow these steps:

  1. Navigate to Maintenance > Admin > Configuration Tasks.
  2. Click Cloud Services under Data Integration Configuration on the Configuration Task List page.
  3. On the Cloud Services page, click FTP/SFTP.
    Note:
    The FTP/SFTP edit feature requires activation through the Planful Support team.
  4. Enter your server's IP address in the Host Name field.
  5. Choose the FTP or SFTP radio button to indicate your preferred Protocol.
  6. Enter your FTP/SFTP server User Name and Password to grant the Planful application access to load data.
  7. Click Save.

Upon successful server configuration, the system will automatically generate three folders (Input, Success, and Failure) in the configured server. Each DLR you create in the Planful application with the FTP/SFTP data load type automatically creates new sub-folders with the DLR name within all three (Input, Success, and Failure) folders.

  • Input - Within this folder, place the data file into the corresponding DLR’s folder to execute the Process Flow in the Planful application.
  • Success - If the Process Flow is successfully executed, the data file located in the corresponding DLR folder within the Input folder will be moved to the corresponding DLR folder in the Success folder.
  • Failure - If the Process Flow fails, the data file located in the corresponding DLR folder within the Input folder will be moved to the corresponding DLR folder in the Failure folder.
Note
Currently, we only support basic (username and password) authentication to connect to any FTP/SFTP server.
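For reference, the short Python sketch below (placeholder host, credentials, and folder names; not a Planful utility) shows how you could check which of the three folders a given file ended up in after a Process Flow run.

    # Sketch only, with placeholder paths and credentials; not a Planful utility.
    import paramiko

    def file_outcome(sftp, dlr_folder, file_name):
        """Report whether a file is still in input or was moved to success/failure."""
        for outcome in ("success", "failure", "input"):
            if file_name in sftp.listdir(f"/{outcome}/{dlr_folder}"):
                return outcome
        return "not found"

    transport = paramiko.Transport(("sftp.example.com", 22))
    transport.connect(username="planful_loader", password="********")  # basic auth only
    sftp = paramiko.SFTPClient.from_transport(transport)
    print(file_outcome(sftp, "branch_east_sales", "sales_2023_09_30.csv"))
    transport.close()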

Create Data Load Rules (DLR) to Load FTP/SFTP File

Creating a DLR lets you load data from the FTP/SFTP server. To create a DLR, follow the steps below:

  1. Go to Maintenance > DLR > Data Load Rules.
  2. Click New Data Load Rule.
  3. Enter a Name and, optionally, a Description.
  4. For Load Type, select FTP/SFTP.
  5. For Load Item, select Data.
  6. For Sub Load Item, select either GL Data, Translation Data, or Transaction Data V2.

Complete the fields on the remaining Data Load pages as needed. Refer to Creating New Data Load Rules for detailed instructions. Once the DLR is created, a sub-folder with the DLR’s name will be created in all the input, success, and failure folders on the server.

For instance, if you name a DLR “DLR Data” in the Planful application, DLR folders named “dlr_data” (spaces are replaced with underscores) are created in the input, success, and failure folders on the server. You can then add the data file to the specific DLR folder. You can add up to 20 files to the DLR’s folder, with each file having a maximum size of 1 GB.
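The following Python helper is a hypothetical illustration of the naming and limits described above (it is not a Planful API): spaces in the DLR name become underscores for the server folder, and a DLR folder accepts at most 20 files of up to 1 GB each.

    # Hypothetical helper mirroring the rules stated above; not a Planful API.
    import os

    MAX_FILES = 20
    MAX_BYTES = 1 * 1024**3   # 1 GB per file

    def dlr_folder_name(dlr_name):
        """'DLR Data' -> 'dlr_data' (spaces become underscores)."""
        return dlr_name.lower().replace(" ", "_")

    def can_add_file(local_path, files_already_in_folder):
        """Check the 20-file and 1 GB limits before uploading another file."""
        if files_already_in_folder >= MAX_FILES:
            return False
        return os.path.getsize(local_path) <= MAX_BYTES

    print(dlr_folder_name("DLR Data"))   # dlr_data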

Schedule and Execute the DLR with Cloud Scheduler from the server

Users can execute the Process Flow in Cloud Scheduler once DLR data is added or updated in the DLR’s sub-folder. A Process Flow is a set of tasks executed in parallel or sequentially, and it can be scheduled to run once or on a recurring basis.

For the steps to schedule and run the Process Flow in Cloud Scheduler, refer to the Cloud Scheduler documentation.

After the Process Flow is executed, the status will be either Successful or Failure. If the Process Flow is successfully executed, the data file located in the corresponding DLR folder within the Input folder gets moved to the corresponding DLR folder in the Success folder. If it fails, it gets moved to the corresponding DLR folder within the Failure folder.

Note
If there is more than one file in the DLR folder, the files are executed in alphabetical order of their file names.
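A small Python sketch of this ordering: sorting the file names shows the sequence in which they would be processed, so prefixing names with a sortable date or sequence number controls the order.

    # The files below would be processed in this sorted (alphabetical) order,
    # so date-prefixed names are handled oldest first.
    files = ["sales_2023_10_02.csv", "sales_2023_09_30.csv", "sales_2023_10_01.csv"]
    for name in sorted(files):
        print("processing", name)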
