Data Load Rules
The Planful suite includes built-in data integration functionality, which allows you to import your data.
A Data Load Rule (DLR) tells the Planful system how to handle data values in the data source during a data load. You can create data load rules to load files, copy and paste data, and use web services to load segment hierarchies, attributes, attribute hierarchies, entity hierarchies, users, Workforce data, currency exchange rates, and other data. There are several steps to create a data load rule. These steps vary based on source and target data options.
The first page you will see when you access the Data Load Rules page is the Data Load Rule List. Click New Data Load Rule and the Data Integration wizard will step you through the remainder of the process.
Types of Data Loads
The following methods for data import are available:
Spreadsheet Based Data Load
Using this approach, your source data is first exported to Microsoft Excel and then copied into the Planful browser-based spreadsheet interface using a copy/paste operation.
This approach is user-friendly and the most commonly used; however, when the data volume is large (above 60,000 records per data load), alternative methods are recommended.
File Based Data Load
Using this approach, your source data is first exported to a text, Excel, or CSV file. Then, through load mapping settings, the load file setup is configured in the application. Lastly, the source file is set up to upload to the Planful application database.
If you set up a File based data load rule and later on want to load data via Web Services, instead of re-creating the entire DLR (Data Load Rule), you can simply edit it and change the Load Type to Web Services. This flexibility saves time and ensures consistency in loading data when the same DLR is used. However, it is very important and a best practice to complete all the steps in the DLR to effectively change the DLR type. Otherwise, the DLR will be rendered unusable.
You can switch between the types of DLRs:
- File Load
- Web Services
- Copy – Paste
- NetSuite Connect
For example, to change the load type from File to Web Services, click the DLR. Change the Load Type to Web Services and complete the remaining steps of Select Sample Input File, Define Overall Rule Settings, Manipulate Input File, Define Data mappings and Load Data for the changes to take effect.
Data Load Rules (DLRs) are automatically sorted alphanumerically. Sorting alphanumerically allows you to more quickly identify a Data Load Rule without scrolling through the lists.
If you sort on any column on the Data Load Rule List page, navigate away, and then return to the same page, the sorting is retained.
Template Based Data Load
Using this approach, data is loaded into the Planful suite via data input templates in which data is manually entered. Administrative users can design and publish the data input templates, which are then accessed by end users who input data into the templates.
This approach is recommended when the data needs to be loaded into the Planful suite from several disparate General Ledgers with significant dissimilarities in the chart of accounts and the data translation mappings from the subsidiary general ledger data to the common reporting chart of accounts do not exist or would be hard to create.
Web/Cloud Services
See the Admin Guide to Loading Data to Planful
High Level Steps to Create a Copy-Paste DLR to Load Data
- Access the Data Load Rule page by navigating to Maintenance > Data Integration > Data Load Rules.
- Click the New Data Load Rule tab.
- Select Copy - Paste as the Load Type on the New Data Load Rule page.
- Copy and paste data from your source file to the spreadsheet on the Select Sample Input File page.
- Select parameters, defaults, and segment information to further define the way you want to load data using the Define Overall Rule Settings page.
- Transform/manipulate the data you are loading to best suit your needs on the Manipulate Input File page.
- Map source data columns to targets using the Define Data Mappings page.
- Perform the load using the Load Data page.
High Level Steps to Create a File DLR to Load Data
- Select File as the Load Type on the New Data Load Rule page.
- Load the source file using the Select Sample Input File page.
- Select parameters, defaults, and segment information to further define the way you want to load data using the Define Overall Rule Settings page.
- Transform/manipulate the data you are loading to best suit your needs on the Manipulate Input File page.
- Map source data columns to targets using the Define Data Mappings page.
- Perform the load using the Load Data page.
High Level Steps to Create a Web Services DLR to Load Data
- Access the Data Load Rule page by navigating to Maintenance > Data Integration > Data Load Rules.
- Click the New Data Load Rule tab.
- Select Web Services as the Load Type on the New Data Load Rule page.
- Use an external Data Integration tool (like Boomi) to load data to the Planful system.
- The steps will vary based on the external Data Integration tool used.
Steps to Convert a DLR Load Type
When you set up a Data Load Rule (DLR), you select how you want your source data (such as actuals) loaded to Planful. For example, you might load it in a file format or copy and paste the data. You might also use Boomi Connectors (Web Services). Let’s say you generally use the File load type to load data to Planful from your source system. But, you’ve decided to use connectors via Web Services to load your data. Instead of re-creating the entire DLR (Data Load Rule), you can simply edit it and change the Load Type. This flexibility saves time and ensures consistency in loading data when the same DLR is used.
You can switch between the types of DLRs:
- File Load
- Web Services
- Copy – Paste
- NetSuite Connect
Best Practices:
Complete all the steps in the DLR to effectively change the DLR type. Otherwise, the DLR will be rendered unusable.
In Practice: Changing the Load Type From File to Web Services
- Navigate to Maintenance > Data Integration > Data Load Rules.
- Select a DLR and double-click it
- Change the Load Type to Web Services and complete the remaining steps of Select Sample Input File, Define Overall Rule Settings, Manipulate Input File, Define Data Mappings and Load Data for the changes to take effect.
- To change the DLR back to File Load for Load Type:
- Double-click the DLR from the Data Load Rule List page.
- On the New Data Load Rule page, change the Load Type to File Load and complete the remaining steps of the DLR.
Steps to Convert a DLR with a Load Type of File, Copy Paste, or Web Services to NetSuite Connect
When you set up a Data Load Rule (DLR), you select how you want your source data (such as actuals) loaded to Planful. For example, you might load it in a file format or copy and paste the data. You might also use Boomi Connectors (Web Services). Let’s say you generally use the File load type to load data to Planful from your source system. But, you’ve decided to use NetSuite Connect to load your data. Instead of re-creating the entire DLR (Data Load Rule), you can simply edit it and change the Load Type. This flexibility saves time and ensures consistency in loading data when the same DLR is used.
Best Practices:
Complete all the steps in the DLR to effectively change the DLR type. Otherwise the DLR will be rendered unusable.
In Practice
- Access the Data Load Rules pages by navigating to Maintenance > Data Integration > Data Load Rules.
- Double-click a DLR on the Data Load Rule List page. The Data Load Rule opens.
- Change the load type as needed.
- Complete the remaining steps from the Select Sample Input File to the Load Data step to use the DLR.
To use NetSuite Connect, there are required setup steps. See the NetSuite Connect documentation to learn about NetSuite Connect and how to perform the setup.
Steps to Change a NetSuite Connect DLR Load Type to File Load, Web Services, or Copy Paste
- Access the Data Load Rules pages by navigating to Maintenance > Data Integration > Data Load Rules.
- Check the Usage Report of the NetSuite Connect DLR to find out if the DLR is being used in any Process Flow.
- If the DLR is being used in a Process Flow, edit the Process Flow and remove the Task with that DLR. If there is only one Task in the Process Flow, then you will have to delete the Process Flow to continue changing the DLR type.
- Return to the Data Load Rule List page and double-click a DLR. The Data Load Rule opens.
- Change the load type as needed.
- Complete the remaining steps from the Select Sample Input File to the Load Data step to use the DLR.
High Level Steps to Load Alternate Hierarchy Data Using an XLSX Source File
To load data into an alternate hierarchy:
- Navigate to Maintenance > Data Integration > Data Load Rules.
- Click New Data Load Rule.
- Select the File Load load type, the Segment Hierarchies load item and the Segment load sub item.
- Click Select Sample Input File.
- Browse for the Sample Data File under the File Information section of the screen, then click Next or Define Overall Rule Settings.
- Select Alternate Hierarchy from the Hierarchy Type section. If you set up an elimination company in Hierarchy Management and selected Yes for Elimination, there will be an option to tell the application to obtain a value for the elimination company from the file that contains the data you are loading to Planful.
- Select Yes for Include in Data File for the Elim Attribute to obtain the value from the data you are loading to Planful. This parameter is available for Segment Hierarchy and Alternate Hierarchy data loads. Note: Elimination companies in the main company hierarchy do not require a new code for the alternate hierarchies.
- Define the remaining parameters for the data load (e.g., input file manipulation, data mappings).
- Click Load Data and then click Finish. The DLR for the alternate hierarchy is submitted for processing. A notification will be sent once processing is complete and the data is in the alternate hierarchy.
How to Create a Data Load Rule to Process Workforce Planning Allocations
Complete the steps below to create a data load rule to process allocations in Workforce Planning.
Prerequisites
- Choose Allocation Method by Department when setting up employee allocations. For information on how to do so, see the Understanding Employee Allocations topic in the Workforce Planning Admin Guide.
- Employees should have previously been populated in a Workforce Planning scenario.
- Create a file that defines the allocation by all segments in the Common Chart of Accounts.
- Define the file to include all fiscal years based on the budget input years defined in Configuration Tasks.
- Define the allocation percentages for each month, ensuring that the percentages total 100% for each month within a given fiscal year (a validation sketch follows this list).
- Once loaded, the allocation can be viewed in the home department of the employee.
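As a quick way to verify the 100% requirement before loading, here is a minimal sketch. The row structure (employee, department, fiscal year, month, percentage) is an assumption for illustration only and does not reflect the exact 22-column file layout.

```python
from collections import defaultdict

def validate_allocations(rows):
    """Check that allocation percentages total 100% per employee,
    per month, per fiscal year (hypothetical row structure:
    employee, department, fiscal_year, month, percent)."""
    totals = defaultdict(float)
    for employee, department, fiscal_year, month, percent in rows:
        totals[(employee, fiscal_year, month)] += percent

    errors = []
    for (employee, fiscal_year, month), total in totals.items():
        if abs(total - 100.0) > 0.01:  # small tolerance for rounding
            errors.append(f"{employee} {fiscal_year}-{month}: {total}%")
    return errors

# Example: an employee split 60/40 across two departments for Jan 2018
rows = [
    ("E100", "Sales",     2018, "Jan", 60.0),
    ("E100", "Marketing", 2018, "Jan", 40.0),
]
print(validate_allocations(rows))  # [] -> every month totals 100%
```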
In Practice
Steps to process allocations via a data load rule:
- Define a data load rule with the following parameters:
- Load Type: Use either File Load or Copy-Paste
- Load Item: Choose Workforce Planning
- Load Sub-Item: Choose Workforce – Employee Allocations
- Click Next.
- Select Sample Input File and complete the following fields:
- Row Containing Header – Select 1
- Number of Rows to Skip – Select 1
- Number of Source Fields – Select 22; this is the number of columns in the file format for this Data Load Rule
- Number Format/Thousand Separator – Select Comma (,)
- Number Format/Decimal Separator – Select Dot (.)
- Date Format – Select MM/DD/YYYY
- Data Load – Copy and paste the formatted fields to be used in the DLR from an Excel spreadsheet
- Click Next.
- Define Overall Rule Settings: Define the scenario and time mapping to be used for this data load rule.
- Scenario – Select the Scenario the data will be loaded into.
- Allocation Basis – This is predefined based on the Define Workforce Planning Budgeting Criteria task; in this case, Allocation by Department.
- Time Mapping – Select Across Columns which defines a column for each month that allocations are processed against.
- Click Next.
- Manipulate Input File: Confirm the columns align with the source file.
- Click Next.
- Define Data Mappings – Select the correct field mappings to the columns identified in the source data file.
- Click Next.
- Confirm the scenario data is being loaded and click Finish. Use Verify Data to confirm data loads.
You will receive confirmation that data was loaded successfully, or any errors during the data load will be identified. If errors are found, correct them in the source file and attempt the load again.
High Level Steps to Incrementally Load Workforce Data
To incrementally load Employee Workforce Data, complete the following steps:
- Navigate to Maintenance > Data Integration > Data Load Rules.
- Click the New Data Load Rule tab.
- Select a Load Type for your new data load rule.
- Select Workforce Data for the Load Item.
- Select Employee Data as the Load Sub Item.
- Continue with the load process. New employees will be appended and existing employees will be updated.
Loading SSO Users
You cannot load SSO Authentication users using Data Load Rules. If there are a few users, you can manually add them from the user interface. If there are too many users to manually add, load the users with email and password as Native Authentication. Then, submit a Support case and the Operations group will change the user Authentication mode from Native to SSO.
You can load SSO users with Planful APIs. After creating users with Data Load Rules or the APIs, the authentication type must be updated by the Planful Operations team.
Creating New Data Load Rules - Explanation of All Fields with Examples
Data Load Rule List
This application page contains all data load rules that have been created and modified.
The Status indicates whether the latest data load was completed successfully. Red indicates a failure, green means the load was successful, and a gray hyphen indicates that no data loads have taken place.
Sort the list by clicking a column header to display the sort icon. You can sort the list in ascending or descending order.
Delete: Planful checks for data loaded to the Actual scenario. If data is loaded for a locked period, data deletion is not allowed. Unlock the period and then click this button to perform the deletion.
Clear Data: Delete all data loaded to date for different scenarios using the selected Data Load Rule.
For the Workforce Employee Data, you will receive an alert message if you click the Clear Data button and data exists. The alert message indicates that employee data exists and if you proceed, all employee details and associated data will be deleted. Clear Data deletes data from whichever scenario is selected and saved in the Define Overall Rule Settings step of the data load. If employees are mapped to any other scenario, the data is retained in that scenario.
Copy: Copy a selected data load rule. When the Copy Dataload page is opened, select a new name for the rule you are copying data to, or allow the system to assign a default name.
Once the copy is complete, the system will inform you, and the DLR List is automatically updated to include the new rule.
The following settings will be copied:
- General settings (name, description, etc.)
- Sample input file format information
- Overall rule settings
- Metadata (data mappings and manipulations)
Deleting a Data Load Rule
Once a data load rule is created, you can delete the rule without affecting the loaded data (with the exception of the GL Data type load). For example, if you created a rule to load Segment Hierarchies and then delete the rule after you load the segments and hierarchies, the rule will be removed from the application but the segments and hierarchies you loaded will remain. In some cases, it's a good idea to keep data load rules that you create for reuse purposes. For example, you can reuse a rule to mass load new members for a dimension or update properties.
New Data Load Rule
Access the New Data Load Rule page by navigating to Maintenance > Data Integration > Data Load Rules. Click the New Data Load Rule tab.
Load Type
Load Types represent the way in which you plan to load data to Planful. Select from the following options:
- File Load – Load a file that exists outside of the Planful system. For .csv file types, each column must contain data that is separated by a comma. For .txt files, make sure each column contains data that is separated by a tab.
- Copy-Paste – Copy and paste cells directly from a spreadsheet to the Planful system.
- Web Services - Load data through an external Data Integration tool (like Boomi). If you create a Web Services DLR and want to change the DLR to a File Load type, you can do so. Make sure you complete all the steps on Select Sample Input File, Define Overall Rule Settings, Manipulate Input File, Define Data Mappings, and Load Data for the changes to take effect.
- Cloud File Upload - Load data through Box. The difference between File Load and Cloud File Upload is that data for a Cloud File Upload is loaded to Planful from a cloud-based service provider (i.e., Box), whereas data for a File Load is loaded from a non-cloud-based source (i.e., a hard drive).
- Google Drive File Upload - Load data via Google Drive.
- NetSuite Connect - Load data through a NetSuite integration.
Once a Cloud File Data Load Rule is configured, you can include the rule in a Cloud Scheduler task, job, and process flow. When the process flow is executed, the Planful system automatically connects to your Box account, looks for relevant data, and extracts that data to Planful. The data load is then categorized as a Success or Failure in Box. The status (successes and failures) of Data Load Rule cloud file uploads is recorded within the Detail Log accessed on the Job Manager page.
Best Practices - Specific to Box
- Set the Automatic Cube Refresh at the Define Overall Rule Settings step in the Data Load Rule to No.
- Set up Cloud Scheduler so that the Scenario process is dependent upon the Data Load Rule.
- Process the scenario after loading the Box file and have a separate task for each in Cloud Scheduler.
Load Item
Select from the following load items:
- Segment Hierarchies - Select specific segments to load in the Load Sub Item list box.
- Data - Select detailed data (i.e. Sales Data, GL Data, etc.) in the Load Sub Item list box and load all associated information.
- Attributes - Load the values associated with the attribute. Once loaded, you can view the values on the Attribute Members List page. Important! When loading attributes, the leaf name AND code must be included.
- Attribute Hierarchies - Load values for an attribute in an attribute hierarchy.
- Entity Hierarchies - Load entity hierarchies, like the Budget Entity Hierarchy for example.
- User Information - Load multiple users at once with all associated information.
- Workforce Data - Load Workforce Data, like employee data for example.
- Currency Exchange Rates - Create a data load rule for currency exchange rates.
Load Sub Item
The available options are populated based on the selected Load Item.
Segment Hierarchies: Available segments are listed. The options available are populated based on system-delivered/defined segments and segments you create.
Data: When loading GL Data, Sales Data, Sales Price, Transaction Data, and Translations Data, it is best to perform a sequential load. For example, if you load data for October 2018 after loading data for November 2018, you will need to load both October and November to calculate the MTD/QTD/YTD information correctly for reporting purposes.
When data is loaded, Planful automatically calculates either MTD or YTD values based on the data load rule setting.
YTD setting: MTD=YTD current period - YTD prior period
MTD setting: YTD=MTD current period + YTD prior period
Account Based: If account type is Flow then YTD=MTD current period + YTD prior period. If account type is Balance then MTD=YTD current period - YTD prior period.
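The calculations above can be written out as a short sketch. It only restates the arithmetic described here; the function names and account-type labels are illustrative, not Planful's internal implementation.

```python
def derive_mtd_from_ytd(ytd_current, ytd_prior):
    # YTD setting: MTD = YTD current period - YTD prior period
    return ytd_current - ytd_prior

def derive_ytd_from_mtd(mtd_current, ytd_prior):
    # MTD setting: YTD = MTD current period + YTD prior period
    return mtd_current + ytd_prior

def account_based(account_type, current, prior):
    # Account Based: Flow accounts accumulate to YTD,
    # Balance accounts are decomposed to MTD.
    if account_type == "Flow":
        return derive_ytd_from_mtd(current, prior)   # returns YTD
    if account_type == "Balance":
        return derive_mtd_from_ytd(current, prior)   # returns MTD
    raise ValueError(f"Unknown account type: {account_type}")

# Example: October YTD = 500, November YTD = 620 -> November MTD = 120
print(derive_mtd_from_ytd(620, 500))
```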
A new version of the Transaction Data load, Transaction Data Version 2, has been introduced to the data load process.
Transaction Data V2
Transaction Data Version 2 provides faster and more efficient processing of the Data Load Rule. You can select the Transaction Data V2 option from the Load Sub Item drop-down list on the New Data Load Rule page. The Transaction Data Load Rule process itself is unchanged; only the processing speed has been increased.
Attributes: Available attributes are listed. The options available are populated based on system-delivered/defined attributes and attributes you create using the Attributes page.
Attribute Hierarchies: Available attribute hierarchies are listed. This list is populated based on the attribute hierarchies defined on the Attribute Hierarchy page.
Entity Hierarchies: Budget Entity Hierarchies contain entities based on attributes or dimensions. Select this option to load data to defined budget entities and associate the children of the hierarchy (lowest level members) created using the Budget Hierarchy Entity Setup page.
Workforce Planning: You can load employee data directly to a target scenario other than the Default scenario. Select the target scenario on the Overall Rule Settings page.
- Workforce - Employee Data - Load employee position and attribute information or perform a complete load of employee information including compensations.
- Workforce - Employee Compensation Item - Create a data load rule to update Salary (full-time or hourly wage type compensation items) and Direct Data Input compensation items in the Default Scenario. This functionality allows you to incrementally update Workforce data versus reloading all Workforce data.
The following table reflects the columns that are required for this type of data load and some example row data:
Employee # | Plan Pos./Pos. Desc. | Compensation Code | Rate | Hour | Default Value | Description |
---|---|---|---|---|---|---|
13400 | Default Pos | FT | 0 | 0 | 70000 | Full-time / Annualized employee |
13400 | Default Pos | Commissions | 0 | 0 | 100 | Direct Data Input compensation item |
13400 | Default Pos | PT | 40 | 2000 | 0 | Hourly employee |
- Workforce - Employee Allocations - Load employee allocations (allocation percentages) for all periods of each fiscal year (across cost centers). For example, if the start year is 2018 and the scenario covers a two-year time period, the load should contain allocations for 2018 and 2019. Also, each allocation needs to total 100%. For example, if an employee's expenses relate to two different departments, the compensation must be allocated 100% across both departments.
Currency Exchange Rates
- Load Exchange Rates - Load exchange rate details (conversion calculations, currency types, etc.). Exchange rates are generally calculated based on the exchange rates entered on the Exchange Rate page. Exceptions are not taken into consideration when exchange rates are loaded.
- Exchange Rates with Translations - This option allows you to load exchange rate data and translate the data from a local currency to a common currency.
Translations provide a mechanism for organizations to map the data from the local chart of accounts of the reporting site (subsidiary or a partner company) with the common reporting chart of accounts used in the Planful suite.
Select Sample Input File
Select file requirements or the structure of the data you are loading. If you are performing a Copy Paste or Web Services load, this page allows you to specify parameters for incoming data. Different fields are available based on the Load Type selected on the New Data Load Rule page.
Let's say you loaded data using Data Load Rules with a File load type. Then, you want to add extra data columns, which you want reflected in the load. To make this modification and load data, change the number of columns and browse to the sample input file to change the structure. Lastly, map the new columns on the Define Data Mappings page before you perform the load.
If you are having any difficulties loading the sample data file, make sure the source file does not contain formulas linked to other files on your computer. Files with formula references are not supported.
The format of your data spreadsheet (i.e. Excel) must match the Select Sample Input file specifications.
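As a quick pre-check before loading, the sketch below scans a sample .xlsx for formula cells so you can review them before attempting the load. It assumes the openpyxl package is available; the file name is only a placeholder.

```python
from openpyxl import load_workbook

def find_formula_cells(path):
    """Return coordinates of cells that contain formulas, which can
    cause the sample file load to fail if they reference other files."""
    wb = load_workbook(path, data_only=False)
    hits = []
    for ws in wb.worksheets:
        for row in ws.iter_rows():
            for cell in row:
                # data_type 'f' means the cell holds a formula
                if cell.data_type == "f":
                    hits.append(f"{ws.title}!{cell.coordinate}")
    return hits

print(find_formula_cells("sample_load.xlsx"))  # e.g. ['Sheet1!C5']
```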
File Requirements - These fields are available if you selected a File Load Type
- File Type – Select the type of file you want to load; Excel (xls), text (txt) or csv (comma delimited value).
- Column Delimiter - Available if you select to load from a txt or csv file. A column delimiter separates each column with a tab, semicolon, or comma. xls files already contain column separators and therefore this option is not available.
- Row Identifier - Specify the character that separates rows in the Data Load file. The row delimiter is important because the system needs to know when a row begins and ends. The identifier marks the beginning/ending of rows in your txt and csv files.
- Text Qualifier - Specify the character you want the system to use to enclose values in the file. This function is important because it separates the field data from the delimiter. You may select to separate the two using single quotes or double quotes. If you choose none, there will be no separator to distinguish between the delimiter and field data.
Header
- Row Containing Header - Enter the number of the row that contains the header information. A header contains the heading names for a column.
- Number of Rows to Skip - Enter the number of rows in your spreadsheet to skip. For example, if your spreadsheet has 20 rows, but the first 3 rows contain data that is not important, you can enter 3 in this field. The system will skip the first 3 rows during the data load process.
- Number of Source Fields - Enter a number based on the number of columns you plan to select to copy from your spreadsheet, txt file, or csv file. For example, if your source file contains 10 columns, enter 10 here. For Copy Paste loads, the graph automatically reflects the number you enter.
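As a rough illustration of how these File Requirements and Header settings describe the incoming file, the sketch below reads a delimited source file with Python's csv module. The file name and the default setting values used here (comma delimiter, double-quote text qualifier, header in row 1, one row skipped, 10 source fields) are assumptions for the example only, not Planful behavior.

```python
import csv

def read_source_file(path, delimiter=",", text_qualifier='"',
                     header_row=1, rows_to_skip=1, source_fields=10):
    """Read a csv/txt source file the way the DLR settings describe it:
    a column delimiter, a text qualifier, a header row, a number of
    rows to skip, and a fixed number of source fields."""
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter=delimiter, quotechar=text_qualifier)
        rows = list(reader)

    header = rows[header_row - 1]          # Row Containing Header
    data = rows[rows_to_skip:]             # Number of Rows to Skip (from the top)
    # Keep only the expected Number of Source Fields per record
    return header, [row[:source_fields] for row in data]

# Hypothetical file name, for illustration only
header, records = read_source_file("gl_actuals.csv")
print(header, len(records))
```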
Number Format
Different countries use different ways to represent numbers. For example, 6 euros and 28 cents needs to be loaded as 6,28 whereas 6 dollars and 28 cents should load as 6.28. The Thousand and Decimal Separators allow you to specify the format in which you want to load your numerical data.
- Thousand Separator - Select a period or comma to separate thousands. In general, non-USD based companies will set the thousand separator to period and the decimal separator to comma.
- Decimal Separator - Select a period or comma to separate decimals. In general, USD based companies will set the thousand separator to comma and the decimal separator to period. This is the default setting.
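Continuing the 6,28 versus 6.28 example above, here is a minimal sketch of how the two separator settings determine parsing; the function is illustrative only.

```python
def parse_amount(text, thousand_sep=",", decimal_sep="."):
    """Normalize a number string according to the DLR's
    Thousand Separator and Decimal Separator settings."""
    cleaned = text.replace(thousand_sep, "")       # drop grouping characters
    cleaned = cleaned.replace(decimal_sep, ".")    # standardize the decimal point
    return float(cleaned)

# USD-style settings (default): comma thousands, dot decimals
print(parse_amount("1,234.56"))                                  # 1234.56
# Euro-style settings: dot thousands, comma decimals -> 6,28 is 6.28
print(parse_amount("6,28", thousand_sep=".", decimal_sep=","))   # 6.28
```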
Date Format
Select the format for the date to appear in the resulting load. Different countries use different ways of representing a day within a calendar year. For example, in Hungary the date is represented as Year/Month/Day, while in the U.S. most organizations represent a date as Month/Day/Year. Essentially, there are several ways in which you may choose to represent a date to best coincide with your organization's current practices.
File Information - Available for File Load Types
Sample Data File - Click Browse to search for the sample data file.
Copy - Paste Data
If you select to Copy Paste data, a grid will appear. Copy data from your file and paste it into the provided grid. Paste data into the grid by selecting the whole grid (click on the cell in the upper left hand corner).
Define Overall Rule Settings
Overview
Select parameters, defaults, and segment information to further define the way you want to load data. Different fields are available based on the Load Type. Documentation is grouped based on load types.
Description of Fields for a Segment Hierarchies Load
Hierarchy Type: Select Main Hierarchy unless you are loading data to an Alternate Hierarchy.
Member
- Allow Move: Select to allow parent changes for existing members.
- Allow Property Change: Select to allow Property Changes (i.e. name, currency, or other system properties) for existing members.
Build
- Level: Build a hierarchical level.
- Select Number of Levels: Available if you select the Level option. Select the number of levels you want your hierarchy to consist of.
- Parent Child: Build a parent child based hierarchy.
Parent Child Versus Level Based Hierarchies
A Parent Child hierarchy is one where you have a child field and a parent field. For example, let's say you have a table with employee department details and associated subordinates.
Child | Parent |
---|---|
Matt | ? |
Brian | Matt |
Jim | Matt |
Katy | Brian |
Matt is the ultimate boss of the department. Matt has 2 subordinates: Brian and Jim. Brian's subordinate is Katy.
A Level Based hierarchy has the following structure.
Level 1 | Level 2 | Level 3 |
---|---|---|
Matt | Brian | Katy |
Ragged hierarchies (hierarchies with varying levels of tree depth) are supported when loading Level based segment hierarchies using a Data Load Rule.
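To make the two layouts concrete, the sketch below walks the parent-child table above and prints each member's level path, which is essentially what a level-based layout stores. It is an illustration only, not how Planful builds hierarchies internally.

```python
def path_to_root(member, parent_of):
    """Return the level-based path (root ... member) for one member
    of a parent-child table."""
    path = [member]
    while parent_of.get(member):
        member = parent_of[member]
        path.append(member)
    return list(reversed(path))

# Parent-child table from the example: Matt is the root
parent_of = {"Matt": None, "Brian": "Matt", "Jim": "Matt", "Katy": "Brian"}

for member in parent_of:
    print(path_to_root(member, parent_of))
# ['Matt'], ['Matt', 'Brian'], ['Matt', 'Jim'], ['Matt', 'Brian', 'Katy']
# Jim's branch stops at level 2: a ragged hierarchy has varying depths.
```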
Parameter
The parameters listed are those available to be included in the data file. If an elimination company was set up in Hierarchy Management, you may have a default Elim Attribute parameter. Select Yes for Include in Data File for the Elim Attribute to obtain the value from the data you are loading to Planful. This parameter is available for Segment Hierarchy and Alternate Hierarchy data loads.
Include in Clear Data
Select segments to clear or retain data for. For example, if you load data for Company A and then want to use the same data load rule to load data for Company B, the data loaded for Company A will not be deleted if you do not choose to clear the data.
Include in Data File
Select Yes if a dimension is not defined within the data file and there is a common member for all records.
Select No if you do not want the listed parameters in the data file.
Select
Available if you select No for a parameter. Search for other dimensions to associate with the data file.
Description of Fields for a Data Load
Parameters
- Reporting - Select Common Currency (CC) if you are loading legal entity data in a common currency. Otherwise, select Local Currency (LC).
- Credit/Debit – Select No if account balances are in their natural trial balance format. If not, select Yes. The calculation for Debit and Credit accounts works as follows when Credit/Debit is set to Yes while loading data (a calculation sketch appears after this parameter list):
- For Debit Accounts - Amounts loaded for a debit account are displayed "as is" in reports. For example, if you want to see a negative number for a debit account, load a negative number for that account while uploading data into the system.
- For Credit Accounts - By default, the number being loaded is negated (multiplied by -1) and stored in the system. So, if an amount of 100 is loaded, the number is stored internally as -100, and the Verify Data Report will show -100.
- Data Type – Select YTD format for balance sheet and income statement data loads. If you select MTD, there must be an opening balance for the data. Select Account Based to have the calculation determined by the account type (Flow or Balance).
- Calculate Trial Balance – Select Yes to calculate the trial balance. Credits and Debits must be balanced in order for the data to load. Remember that a Trial Balance is a list of all general ledger accounts contained in an organization's ledger. When the trial balance option is available, the data load file is rejected if the validation check for the trial balance fails.
- Currency Conversion – Select Yes to translate the data from local currency to common currency. Select No to indicate that a translation does not need to be processed.
- Automatic Cube Refresh - Select to automatically have the system refresh the reporting cube. The cube must be processed after the load is complete to view translated data in reports.
- Scenario – Select the default scenario to load data. If you select a Budget Scenario, Forecast Scenario or a Preload Scenario for a GL or Translation and select Month as the Frequency, the Across Columns by Scenario Time Mapping value is available.
- Frequency – Select Monthly for all Financials. For Scorecards you may select Day, Quarter, or Year.
- Time Mapping – Select to indicate whether time periods are defined across the columns or whether one period column is defined with the values stored in the rows. If you select Across Columns by Scenario, the Segments (subject to default settings), Fiscal Months, and Opening Balances are available in data mappings; Fiscal Year is not available. In the DLR default settings, Fiscal Year and Fiscal Month are not available. Across Columns by Scenario Time Mapping is available for Sales Price and Sales Data if a budget scenario is selected while loading Sales Price/Sales Data; in that case, the Fiscal Year and Fiscal Month fields are disabled in Default Settings and Fiscal Year is disabled in Data Mappings.
- Process Employees - Select this check box to process employees using the DLR. If this check box is selected and data is loaded successfully using the DLR, a request to process all employees will be submitted. The default value is No. For existing DLRs, this check box will not be selected.
The check box is available for the following DLRs:
- Workforce - Employee Data
- Workforce - Employee Compensation Item
- Workforce - Employee Allocations
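The following sketch illustrates the Credit/Debit behavior and the debits-equal-credits check described in the parameter list above. Both functions are assumptions written for illustration, not Planful's internal code.

```python
def stored_amount(amount, account_type, credit_debit="Yes"):
    """With Credit/Debit = Yes, credit-account amounts are negated on
    load; debit-account amounts are stored as is."""
    if credit_debit == "Yes" and account_type == "Credit":
        return amount * -1
    return amount

def trial_balance_ok(rows):
    """rows: (account_type, amount) pairs in natural trial balance form.
    Total debits must equal total credits for the load to be accepted."""
    net = sum(amt if acct == "Debit" else -amt for acct, amt in rows)
    return abs(net) < 0.01

print(stored_amount(100, "Credit"))   # -100, as shown in the Verify Data Report
print(stored_amount(100, "Debit"))    # 100, displayed "as is"
print(trial_balance_ok([("Debit", 500.0), ("Credit", 500.0)]))  # True
```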
Default Value for Missing Periods
Define data calculations for missing periods at the DLR level for the GL Data and Translations sub types.
- Enable new Data Calculations - Select this checkbox to enable data calculations for missing periods (selected by default).
- Overwrite Global Settings - Select this checkbox to overwrite values for flow accounts and balance accounts.
- Flow Accounts - Select YTD or MTD to overwrite for flow accounts.
- Balance Accounts - Select YTD or MTD to overwrite for balance accounts.
Email Recipients
By default, the e-mail recipients added to the Cloud Services Configuration Task are carried forward to the Cloud Services Data Load Rule, but you have the flexibility to edit the recipients specific to each Cloud Services Data Load Rule created. You can edit email recipients here.
Clear Data Configuration - Include in Clear Data for GL and Translations Data
For Clear Data Combination, Scenarios are never in the file. They are defined by the DLR settings on the same page. So, the file has nothing to do with which Scenario is updated/cleared. If you don’t select any checkboxes, the system will clear all data for the Time periods and Scenarios found in the load file. If you select a box, for example Company, the system will clear all data for the Time periods and Scenario but only for the companies found in the load file. The more boxes you select, the more granular/precise the clear based upon the file being loaded.
Clear Data Combination functionality can be used in two ways:
- Uncheck all dimensions except the dimension that drives your currency (a legal entity, like Company) - to clear all data loaded to date for the legal entities and periods in the file and replace it with the new file.
- Check all dimensions - to update only the GL combinations present in the file and not clear any other combinations previously loaded using this rule.
If you are unsure or confused on how to use this functionality, you should start with all of the boxes unchecked. This will ensure that every subsequent file load will clear all data previously loaded to that scenario for the fiscal period(s) in the current load file.
The most common situation where you might need to check a box is when you are loading multiple files using the same DLR to the same fiscal period. Here are two examples:
- If you are loading multiple Companies using separate files, you would check the Company box. This would ensure that ONLY the data for the company in the next load file would be cleared. Other companies loaded prior to this file would remain unchanged.
- Sometimes something similar happens with the Account dimension. For example, you might have 1 file that contains trial balance information and another file that contains data for statistical accounts. The files might have the same layout and use the same DLR. In this case, you would check the Account dimension only so that loading the statistical amounts after the trial balance amounts will NOT clear the trial balance amounts.
Clear data combination allows you to clear existing data prior to loading new data based on selected segment combinations.
- Segment - Select segments to update the records in the database for the named data load rule. The Legal Entity dimension that affects the currency in the application will always be grayed out.
- Include in Clear Data - Select this checkbox to clear all records and reload what is in the file for all periods included in the data load rule.
When All Segments are checked, the system will overwrite only the associated ProfitCenter_CostCenter and Account combinations in the file.
When Account is not checked, the system will overwrite data loaded previously using the Data Load Rule (DLR) at the ProfitCenter_CostCenter level. For example, if you are loading data to the file for the ProfitCenter_CostCenter member “0001” only, all account data previously loaded using the DLR for “0001” will be deleted for the period and reloaded based on the file.
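As a rough sketch of the clearing behavior described above (an assumed model, for illustration only): the Scenario comes from the DLR settings, the time periods come from the load file, and each checked segment narrows the clear to the member values present in the file.

```python
def rows_to_clear(existing, incoming, checked_segments, scenario):
    """Return previously loaded rows that would be cleared before the new
    load. Scenario is defined by the DLR settings; fiscal periods come
    from the load file; each checked segment narrows the clear to the
    member values present in the file."""
    keys = ["FiscalPeriod"] + list(checked_segments)
    file_keys = {tuple(row[k] for k in keys) for row in incoming}
    return [row for row in existing
            if row["Scenario"] == scenario
            and tuple(row[k] for k in keys) in file_keys]

existing = [
    {"Scenario": "Actual", "FiscalPeriod": "Jan-22", "Company": "2010", "Account": "1610", "Amount": 161},
    {"Scenario": "Actual", "FiscalPeriod": "Jan-22", "Company": "2100", "Account": "3320", "Amount": 444},
]
incoming = [
    {"FiscalPeriod": "Jan-22", "Company": "2100", "Account": "3320", "Amount": 555},
]

# No segments checked: clear everything for the scenario and periods in the file.
print(len(rows_to_clear(existing, incoming, [], "Actual")))            # 2
# Company checked: only company 2100 is cleared; company 2010 is retained.
print(len(rows_to_clear(existing, incoming, ["Company"], "Actual")))   # 1
```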
1. Loading Data with all Segments Checked in Clear Data Combination
Below is a sample of the original data that had been previously loaded with a data load rule for 2022:
Original Data
Company | Department | Account | ProductLine | CostCenter | BusinessCenter | Geography | ICSegment | Fiscal Year | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|---|
2010 | Default | 1610 | Default | Default | Default | Default | Default | 2022 | 161 | 161 | 161 | 161 |
2010 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 332 | 332 | 332 | 332 |
2100 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 444 | 444 | 444 | 444 |
New data is then loaded using a data load rule, with all segments checked in the Clear Data Combination option:
Data Loaded
Company | Department | Account | ProductLine | CostCenter | BusinessCenter | Geography | ICSegment | Fiscal Year | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|---|
2100 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 555 | 555 | 555 | 555 |
2120 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 212 | 212 | 212 | 212 |
2140 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 214 | 214 | 214 | 214 |
The resulting data contains the previously loaded data along with the newly loaded data, with any updates:
Resulting Data, with all Segments Checked in Clear Data Combination
Account | Company | Department | CostCenter | BusinessCenter | ProductLine | Geography | ICSegment | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|
1610 | 2010 | Default | Default | Default | Default | Default | Default | 161 | 161 | 161 | 161 |
3320 | 2010 | Default | Default | Default | Default | Default | Default | 332 | 332 | 332 | 332 |
3320 | 2100 | Default | Default | Default | Default | Default | Default | 444 | 444 | 444 | 444 |
3320 | 2120 | Default | Default | Default | Default | Default | Default | 212 | 212 | 212 | 212 |
3320 | 2140 | Default | Default | Default | Default | Default | Default | 214 | 214 | 214 | 214 |
Conversely, when all Segments in Clear Data Combination are unchecked, data that was previously loaded by this data load rule and data for the time period included in the data load rule is cleared, and the new data is loaded.
2. Loading Data with all Segments Unchecked in Clear Data Combination
Below is a sample of original data that had been previously loaded with a data load rule for 2022:
Original Data
Company | Department | Account | ProductLine | CostCenter | BusinessCenter | Geography | ICSegment | Fiscal Year | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|---|
2010 | Default | 1610 | Default | Default | Default | Default | Default | 2022 | 161 | 161 | 161 | 161 |
2010 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 332 | 332 | 332 | 332 |
2100 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 444 | 444 | 444 | 444 |
New data is then loaded using a data load rule, having all segments unchecked in the Clear Data Combination option:
Data Loaded
Company | Department | Account | ProductLine | CostCenter | BusinessCenter | Geography | ICSegment | Fiscal Year | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|---|
2100 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 555 | 555 | 555 | 555 |
2120 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 212 | 212 | 212 | 212 |
2140 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 214 | 214 | 214 | 214 |
The resulting data clears data that was previously loaded by this data rule and for the time period included in the data load rule, and loads the new data:
Resulting Data, with all Segments Unchecked in Clear Data Combination
Account | Company | Department | CostCenter | BusinessCenter | ProductLine | Geography | ICSegment | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|
3320 | 2100 | Default | Default | Default | Default | Default | Default | 555 | 555 | 555 | 555 |
3320 | 2120 | Default | Default | Default | Default | Default | Default | 212 | 212 | 212 | 212 |
3320 | 2140 | Default | Default | Default | Default | Default | Default | 214 | 214 | 214 | 214 |
3. Loading Data with Only One Segment Checked in Clear Data Combination
Below is a sample of original data that had been previously loaded with a data load rule for 2022:
Original Data
Company | Department | Account | ProductLine | CostCenter | BusinessCenter | Geography | ICSegment | Fiscal Year | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|---|
2010 | Default | 1610 | Default | Default | Default | Default | Default | 2022 | 161 | 161 | 161 | 161 |
2010 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 332 | 332 | 332 | 332 |
2100 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 444 | 444 | 444 | 444 |
New data is then loaded using a data load rule, having only one segment (Company) checked in the Clear Data Combination option:
Data Loaded
Company | Department | Account | ProductLine | CostCenter | BusinessCenter | Geography | ICSegment | Fiscal Year | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|---|
2100 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 555 | 555 | 555 | 555 |
2120 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 212 | 212 | 212 | 212 |
2140 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 214 | 214 | 214 | 214 |
The resulting data contains the previously loaded data along with the newly loaded data, with any updates:
Resulting Data, with only one Segment (Company) Checked in Clear Data Combination
Account | Company | Department | CostCenter | BusinessCenter | ProductLine | Geography | ICSegment | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|
1610 | 2010 | Default | Default | Default | Default | Default | Default | 161 | 161 | 161 | 161 |
3320 | 2010 | Default | Default | Default | Default | Default | Default | 332 | 332 | 332 | 332 |
3320 | 2100 | Default | Default | Default | Default | Default | Default | 555 | 555 | 555 | 555 |
3320 | 2120 | Default | Default | Default | Default | Default | Default | 212 | 212 | 212 | 212 |
3320 | 2140 | Default | Default | Default | Default | Default | Default | 214 | 214 | 214 | 214 |
4. Loading Data with more than One Segment Checked in Clear Data Combination
Below is a sample of original data that had been previously loaded with a data load rule for 2022:
Original Data
Company | Department | Account | ProductLine | CostCenter | BusinessCenter | Geography | ICSegment | Fiscal Year | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|---|
2010 | Default | 1610 | Default | Default | Default | Default | Default | 2022 | 161 | 161 | 161 | 161 |
2010 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 332 | 332 | 332 | 332 |
2100 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 444 | 444 | 444 | 444 |
New data is then loaded using a data load rule, having two segments (Company and Account) checked in the Clear Data Combination option:
Data Loaded
Company | Department | Account | ProductLine | CostCenter | BusinessCenter | Geography | ICSegment | Fiscal Year | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|---|
2100 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 555 | 555 | 555 | 555 |
2120 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 212 | 212 | 212 | 212 |
2140 | Default | 3320 | Default | Default | Default | Default | Default | 2022 | 214 | 214 | 214 | 214 |
The resulting data contains the previously loaded data along with the newly loaded data, with any updates:
Resulting Data, with two Segments (Company and Account) Checked in Clear Data Combination
Account | Company | Department | CostCenter | BusinessCenter | ProductLine | Geography | ICSegment | Jan-22 | Feb-22 | Mar-22 | Apr-22 |
---|---|---|---|---|---|---|---|---|---|---|---|
1610 | 2010 | Default | Default | Default | Default | Default | Default | 161 | 161 | 161 | 161 |
3320 | 2010 | Default | Default | Default | Default | Default | Default | 332 | 332 | 332 | 332 |
3320 | 2100 | Default | Default | Default | Default | Default | Default | 555 | 555 | 555 | 555 |
3320 | 2120 | Default | Default | Default | Default | Default | Default | 212 | 212 | 212 | 212 |
3320 | 2140 | Default | Default | Default | Default | Default | Default | 214 | 214 | 214 | 214 |
Clear Data Combination- Clear all data prior to new data load for Transaction Data
If the 'Clear all data prior to new data load' box is unchecked, the Transaction Data Load will clear transactions only for the exact GL combination strings contained in the load file.
The system does not allow duplicate transaction data for a GL combination. If you want to update a specific transaction, uncheck the box and upload all transaction data for that specific GL combination.
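Here is a small sketch of that behavior, under the assumption that each transaction row carries its full GL combination string; the combination strings shown are made up for the example.

```python
def apply_transaction_load(existing, incoming, clear_all=False):
    """existing/incoming: lists of (gl_combination, transaction) tuples.
    Unchecked box (clear_all=False): replace only the GL combinations
    present in the incoming file. Checked box: clear everything first."""
    if clear_all:
        return list(incoming)
    incoming_combos = {combo for combo, _ in incoming}
    retained = [row for row in existing if row[0] not in incoming_combos]
    return retained + list(incoming)

# Hypothetical GL combination strings for illustration
existing = [("1943|2091LC|535", {"debit": 15001}),
            ("1943|2091LC|536", {"debit": 16001})]
incoming = [("1943|2091LC|535", {"debit": 25001})]

print(apply_transaction_load(existing, incoming))         # 536 kept, 535 replaced
print(apply_transaction_load(existing, incoming, True))   # only the new load remains
```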
Parameter
The parameters listed are those available to be included in the data file.
Transaction Details
Transaction details can be set up on the Transaction Details Setup page.
Attribute type: The Common COA (Chart of Accounts) segment defaults are displayed based on the setup on the Define Financial Segments page.
Attribute name: The name of the attribute based on the segments defined for the Common COA and attributes setup for those segments. For example, some Common COA names are; Company, Account, and Project.
Attribute data type
- varchar: Associates a data type with a variable character field. Variable character fields have a user-defined length and can hold letters and numbers.
- URL: Load a reference URL. You can click the URL hyperlink when you view the Transactions Drill-Through reports.
Attribute length: The maximum number of characters for the field.
Include in Data File
- Select Yes if a dimension is not defined within the data file and there is a common member for all records.
- Select No if you do not want the listed parameters in the data file.
Default/Select
Available if you select No for a parameter. Search for other dimensions to associate with the data file.
1. Loading Data with the Clear all data prior to new data load Checkbox Checked
Below is a sample of original data that had been previously loaded with a data load rule:
Original Data
Account | Company | Department | CostCenter | BusinessCenter | ProductLine | Geography | ICSegment | Date/Time | Debit Amount | Product Code | Net gms | URL |
---|---|---|---|---|---|---|---|---|---|---|---|---|
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 15-01-2022 | 15001 | P007126 | 150 | http://www.yahoo.com |
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 16-01-2022 | 16001 | P007127 | 160 | www.gmail.com |
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 17-01-2022 | 16301 | P007128 | 170 | http://www.google.com |
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 18-01-2022 | 16401 | P007129 | 180 | http://www.yahoo4.com |
1943-Balance-YTD_V1 | 2091LC | 53 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 19-01-2022 | 16402 | P007130 | 190 | http://www.yahoo4.com |
New data is then loaded using a data load rule with the Clear all data prior to new data load checkbox checked:
Data Loaded
Account | Company | Department | CostCenter | BusinessCenter | ProductLine | Geography | ICSegment | Date/Time | Debit Amount | Product Code | Net gms | URL |
---|---|---|---|---|---|---|---|---|---|---|---|---|
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 15-01-2022 | 25001 | P007126 | 150 | http://www.yahoo.com |
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 16-01-2022 | 25002 | P007127 | 160 | www.gmail.com |
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 17-01-2022 | 25003 | P007128 | 170 | http://www.google.com |
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 18-01-2022 | 25004 | P007129 | 180 | http://www.yahoo4.com |
4018-Flow-MTD_V1 | 2091LC | 53 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 19-01-2022 | 25005 | P007130 | 190 | http://www.yahoo4.com |
The resulting data clears the data previously loaded by this rule (Account 1: 1943-Balance-YTD_V1) and replaces it with the 5 lines loaded below (Account 2: 4018-Flow-MTD_V1).
Resulting Data
Account | Company | Department | CostCenter | BusinessCenter | ProductLine | Geography | ICSegment | Date/Time | Debit Amount | Product Code | Net gms | URL |
---|---|---|---|---|---|---|---|---|---|---|---|---|
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 15-01-2022 | 25001 | P007126 | 150 | http://www.yahoo.com |
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 16-01-2022 | 25002 | P007127 | 160 | www.gmail.com |
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 17-01-2022 | 25003 | P007128 | 170 | http://www.google.com |
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 18-01-2022 | 25004 | P007129 | 180 | http://www.yahoo4.com |
4018-Flow-MTD_V1 | 2091LC | 53 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 19-01-2022 | 25005 | P007130 | 190 | http://www.yahoo4.com |
2. Loading Data with the Clear all data prior to new data load Checkbox Unchecked
Below is a sample of original data that had been previously loaded with a data load rule:
Original Data
Account | Company | Department | CostCenter | BusinessCenter | ProductLine | Geography | ICSegment | Date/Time | Debit Amount | Product Code | Net gms | URL |
---|---|---|---|---|---|---|---|---|---|---|---|---|
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 15-01-2022 | 15001 | P007126 | 150 | http://www.yahoo.com |
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 16-01-2022 | 16001 | P007127 | 160 | www.gmail.com |
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 17-01-2022 | 16301 | P007128 | 170 | http://www.google.com |
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 18-01-2022 | 16401 | P007129 | 180 | http://www.yahoo4.com |
New data is then loaded using a data load rule with the Clear all data prior to new data load checkbox unchecked.
Data Loaded
Account | Company | Department | CostCenter | BusinessCenter | ProductLine | Geography | ICSegment | Date/Time | Debit Amount | Product Code | Net gms | URL |
---|---|---|---|---|---|---|---|---|---|---|---|---|
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 15-01-2022 | 25001 | P007126 | 150 | http://www.yahoo.com |
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 16-01-2022 | 25002 | P007127 | 160 | www.gmail.com |
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 17-01-2022 | 25003 | P007128 | 170 | http://www.google.com |
The resulting data adds the 3 newly loaded lines for Account 4018-Flow-MTD_V1 to the existing data.
Account | Company | Department | CostCenter | BusinessCenter | ProductLine | Geography | ICSegment | Date/Time | Debit Amount | Product Code | Net gms | URL |
---|---|---|---|---|---|---|---|---|---|---|---|---|
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 15-01-2022 | 15001 | P007126 | 150 | http://www.yahoo.com |
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 16-01-2022 | 16001 | P007127 | 160 | www.gmail.com |
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 17-01-2022 | 16301 | P007128 | 170 | http://www.google.com |
1943-Balance-YTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 18-01-2022 | 16401 | P007129 | 180 | http://www.yahoo4.com |
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 15-01-2022 | 25001 | P007126 | 150 | http://www.yahoo.com |
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 16-01-2022 | 25002 | P007127 | 160 | www.gmail.com |
4018-Flow-MTD_V1 | 2091LC | 535 | 10092 | 10091 | PROD 011 | 100-9011 | 1033 | 17-01-2022 | 25003 | P007128 | 170 | http://www.google.com |
Description of Fields for an Attributes and Attribute Hierarchies Load
Important! When loading attributes, the leaf name AND code must be included.
Member
- Allow Move - Select this checkbox while loading segment hierarchies to allow Parent Changes for existing members.
- Allow Property Change - Select this checkbox while loading segment hierarchies to allow Property Changes (i.e. name, currency, or other system properties) for existing members.
Build
- Level- Select to build a hierarchical level.
- Select Number of Levels - This option is available if you select the Level option. Select the number of levels you want your hierarchy to consist of.
- Parent Child - Select to build a parent child based hierarchy.
Description of Fields for Entity Hierarchies
Member
- Allow Move - Select this checkbox while loading segment hierarchies to allow Parent Changes for existing members.
- Allow Property Change - Select this checkbox while loading segment hierarchies to allow Property Changes (i.e. name, currency, or other system properties) for existing members.
Build
- Level- Select to build a hierarchical level.
- Max Number of Levels - This option is available if you select the Level option. Select the number of levels you want your hierarchy to consist of.
- Parent Child - Select to build a parent child based hierarchy.
Parameters
- Entity Type: Entity types may be based on attributes or segments.
Defaults
- Parameter: The parameters listed are those available to be included in the data file.
- Include in Data File:
- Select Yes if a dimension is not defined within the data file and there is a common member for all records.
- Select No if you do not want the listed parameters in the data file.
- Select: Available if you select No for a parameter. Search for other dimensions to associate with the data file.
Description of Fields for a User Load
Defaults
- Parameter: The parameters listed are those available to be included in the data file.
- Include in Data File:
- Select Yes to include role in data file.
- Select No if you do not want the role in the data file.
- Select: Available if you select No for a parameter. Search for other dimensions to associate with the data file.
Description of Fields for Workforce Data Load
Load Columns
Click to open the Load Columns dialog page to select additional columns you want to load.
- Click a row.
- Click the Save button.
- The column will be added to the Data Load Column list.
In addition to the current default data load columns populated, the following required columns are displayed:
- Budget Entity Code
- Budget Entity Segments (COA segments form the Budget Entity)
- Rate per Hour
- Work Hours - Basis (Annual / By Month)
- Salary - Basis (Annual / By Month - the wage Type compensation item)
For Review percentages, the ‘Multiple Reviews’ column is available in the Load columns selector. If selected, the columns related to the effective dates and percentages become available.
Scenario selection in Workforce – Employee Data Load Rule (DLR)
You can retain the Data Load Columns when you change the Scenario in the Define Overall Rule Settings tab for a Workforce – Employee Data DLR.
For example, when you select Scenario 2 after selecting Scenario 1 from the Scenario drop-down list, the Scenario Change pop-up window displays a confirmation message asking whether to retain the columns. If you click:
- Yes: The columns from Scenario 1 will be retained for Scenario 2.
- No: The columns from Scenario 1 will not be retained for Scenario 2. If there are any columns saved against the DLR for Scenario 2 then those columns will be displayed along with the default columns.
To save the column settings for a scenario, click Next.
The image below displays the Scenario Change pop-up window shown when you select the Budget 2017 scenario after selecting the Budget 2013 scenario.
Parameters
- Scenario: Select the scenario you want to load employee data directly to from the list box. The scenarios populated in the list box include those created using the Scenario Setup page.
- Note: For forecast scenarios, when loading allocations with the Across Columns data load format, it is not possible to select only the open periods for the data mappings. We recommend that you use the Down Rows or Across Columns by Scenario options to load allocations to forecast scenarios if there are any closed periods.
If annually varying compensation items are created in the scenario, relevant columns become visible for mapping automatically. They are otherwise hidden.
- Payroll upload Date: Select the date for which you want to upload payroll data.
Setup columns and order
- Data Load Column (Type): Contains a list of columns that will be displayed (by Display Name) in the resulting load. Change the order in which the columns are displayed using the arrows to move the columns up and down the list. There are 5 system delivered/defined columns, which are:
- Employee Number
- Hire Date
- Pay Plan Code
- Employee Name
- Employee Type Code
- Display Name: The name of the Data Load Column.
- Basis: The basis is used to calculate compensation amounts. This field is available for selection for compensation items.
- Annual - Compensations amounts will be calculated on an annual basis.
- By Month - Compensations amounts will be calculated on a monthly basis.
- Default Value: This field is available for Position Name, Employee Name and Pay Plan Code. These three columns are system-defined and delivered and may not be deleted. Enter a default value to populate the column in the resulting load.
Time Mapping
- Down Rows - Selecting this option indicates that time periods are defined down the rows.
- Across Columns - Selecting this option indicates that time periods are defined across the columns. The sketch below contrasts the two layouts.
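The pandas sketch below is only an illustration of the difference between the two Time Mapping layouts; the column names and values are hypothetical and do not represent a Planful file format.

```python
import pandas as pd

# Down Rows: each time period is a separate row.
down_rows = pd.DataFrame({
    "Employee Number": ["E100", "E100", "E100"],
    "Fiscal Month":    ["Jan", "Feb", "Mar"],
    "Amount":          [100, 110, 120],
})

# Across Columns: the same data with one column per period.
across_columns = down_rows.pivot(
    index="Employee Number", columns="Fiscal Month", values="Amount"
).reset_index()
# across_columns now has columns: Employee Number, Feb, Jan, Mar (one Amount column per month)
```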
Parameters for loading Wage Type Compensations
You can load Wage Type compensations (i.e. Salary) for compensation items with a Wage Type or Wage Type without Reviews basis.
The following options are available for the Salary and Wage Hours parameters (a small worked example follows this list):
- Annual Salary - Annual input value.
- Monthly Salary - Monthly input value; the annual amount is calculated by multiplying this value by the number of fiscal periods.
- Annual Wage Hours - Annual input value; the monthly amount is calculated by dividing this value by the number of fiscal periods.
- Monthly Wage Hours - By period input value.
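A small worked example of these conversions, assuming 12 fiscal periods (the numbers themselves are hypothetical):

```python
# Assumed number of fiscal periods in the year.
fiscal_periods = 12

# Monthly Salary input: the annual amount is derived by multiplying by the periods.
monthly_salary = 5_000
annual_from_monthly = monthly_salary * fiscal_periods    # 60,000

# Annual Wage Hours input: the per-period hours are derived by dividing by the periods.
annual_wage_hours = 2_088
hours_per_period = annual_wage_hours / fiscal_periods    # 174.0
```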
Description of Fields for a Currency Exchange Rates Load
Parameters
- Scenario: Select the Scenario for which you want to load exchange rates. Scenarios represent versions of a budget or forecast for a fiscal year and are created on the Scenario Setup page.
- If you select a Budget type scenario, the Time Mapping Across Columns by Scenario option is available.
- Apply Currency Conversion: Automatically converts local currencies to a common currency.
- Time Mapping
- Across Columns by Scenario is available for Currency Exchange Rates if a budget scenario is selected while loading Exchange Rates.
- Fiscal Year and Fiscal Month will be disabled in Default Settings. Fiscal Year will be disabled in Data Mappings.
- Down Rows - Selecting this option indicates that time periods are defined down the rows.
- Across Columns - Selecting this option indicates that time periods are defined across the columns.
Defaults
- Parameter: The parameters listed are those available to be included in the data file.
- Include in Data File:
- Select Yes if a dimension is not defined within the data file and there is a common member for all records.
- Select No if you do not want the listed parameters in the data file.
- Select: Available if you select No for a parameter. Search for other dimensions to associate with the data file.
Selection Translation(s)
- Select currency translations to change the value of the source currency into another (foreign) currency target value.
Manipulate Input File
Manipulate the file or format of the data from which you are loading data. The top pane displays the raw data, and the bottom pane displays the manipulated data. Manipulation options include:
- Join - merges multiple columns into a new column.
- Split - divides a column into multiple columns.
- Move - changes the display order of the columns.
- Sort - categorizes or sorts the data based on a specific column.
- Replace - finds and replaces values in the selected column.
- Create - creates a new column with a user-defined value.
- Headers - allows the creation of column headings for the data.
- Manipulations - provides a view of all the manipulations applied to the data.
Description of Fields for the Manipulate Input File Page
Create: Create a new column on the Create page.
- Column Name - Enter the name of the new column.
- Column Value - Enter the row value for the column.
Copy: Select the column you want to copy and click Copy to open the Copy page where you will create two new columns.
Description of fields on the Copy dialog page:
Source Column - When you copy a column, the Source Column contains information from the column you copied. Enter the name of the new column to which the contents will be copied.
Grid Actions
Sort: Sort the columns.
Manipulations
Join: Join two columns into one column by providing a Join Column Name and Join Operator (see the sketch after this list).
- Join Column Name - Select the columns you want to join by holding the Shift key while selecting them, then enter the name of the new column created by the join. For example, the Join Column Name might be COL3 for joined columns 1 (COL1) and 2 (COL2).
- Join Operator:
- None - No join operator.
- Underscore - Add an underscore. For example, if you add column 6 containing X with column 7 containing Y, the joined column 8 will contain X_Y.
- Space - Add a space between the contents of the two joined columns.
- Dot - Add a dot between the contents of the two joined columns.
- Hyphen - Add a hyphen between the contents of the two joined columns.
- Add New - Add the newly joined column to the spreadsheet on the Manipulate Input File page.
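As an illustration of the Underscore join operator, the sketch below joins two hypothetical columns in pandas; the DataFrame is just an analogy, not a Planful object.

```python
import pandas as pd

df = pd.DataFrame({"COL1": ["X"], "COL2": ["Y"]})

# Join COL1 and COL2 into a new column COL3 using the Underscore operator.
df["COL3"] = df["COL1"] + "_" + df["COL2"]   # COL3 contains "X_Y"
```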
Split: Split one column into two columns by providing Split at, Split Column Name 1, and Split Column Name 2 information (see the sketch after this list).
- Split at - Select the column you want to split. Enter the point where the split will occur within the column.
- Split Column Name 1 - The name of the first column you are creating by performing the split.
- Split Column Name 2 - Enter the name of the second column you are creating by performing the split.
- Add New - Split the column and display the new columns on the spreadsheet on the Manipulate Input File page.
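For illustration, the sketch below splits a hypothetical column at a hyphen into two new columns. It assumes the split point is a delimiter; in the application you enter the split point yourself, and the new column names here are made up.

```python
import pandas as pd

df = pd.DataFrame({"Geography": ["100-9011"]})

# Split the column at the hyphen into two new columns (hypothetical names).
df[["Region", "Location"]] = df["Geography"].str.split("-", n=1, expand=True)
# Region contains "100", Location contains "9011"
```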
Move: Move the position of a column up, down, to the top, or to the bottom of the grid.
Replace: Replace a selected column name on the Replace page.
- Replace/With - Enter the text you want to replace in the Replace field and the replacement text in the With field.
- Replace Entire Sheet - Replace across the entire grid of columns. If this checkbox is not selected, the replace only applies to the selected column. By default, Replace Entire Sheet is deselected.
- Match Entire Cell Content - Select this checkbox to replace only those cells whose contents exactly match the text in the Replace field (full-cell replacement). If you do not select this checkbox, the text is replaced wherever it appears, whether it makes up the whole cell or only part of it. By default, Match Entire Cell Content is deselected. The sketch below illustrates the difference.
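The pandas sketch below illustrates the difference the Match Entire Cell Content checkbox makes, using hypothetical values:

```python
import pandas as pd

col = pd.Series(["PROD 011", "PROD 011X", "011"])

# Match Entire Cell Content selected: only cells equal to "PROD 011" are replaced.
exact = col.where(col != "PROD 011", "PROD 999")
# -> ["PROD 999", "PROD 011X", "011"]

# Match Entire Cell Content deselected: the text is replaced wherever it appears.
partial = col.str.replace("PROD 011", "PROD 999", regex=False)
# -> ["PROD 999", "PROD 999X", "011"]
```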
Data Headers: Edit or create the heading names of the columns on the Headers dialog page. Manipulated Columns provides information on the name of the columns. Header Columns provides the name of the headers, which can be edited.
Data Manipulations: View data manipulations on the Manipulations page.
- Order - Reverse the order of the columns.
- Operation - The operations defined (i.e. Split, Join).
- Join Operator - The join operators created when you performed the Join (underscore, dot, hyphen, or space).
- Source Columns - The columns copied from when you performed the Copy.
- Target Columns - The columns copied to when you performed the Copy.
Grid Action
Turn the row and column headers on or off.
Auto-Size Column Width - Automatically adjust the columns in the Raw Data and Manipulations Data grids to fit the contents.
The columns in the Manipulations grid may also be resized by dragging them to the right as you would in an Excel spreadsheet. The resized column settings are saved when you click Next or Finish.
Define Data Mappings
Map Planful source data columns (Source Column) to target segments (Maps To). This mapping tells the system where and how your data will display in the columns once the load takes place.
You can map up to 100 columns. For example, if the number of Budget Input Years is 5, you may map as many as 60 month columns (5 years × 12 months). However, it is not necessary to map all the months for a scenario; you must map at least one month to load data.
Description of Fields for the Define Data Mappings Page
Source Columns/Map To
View Source Columns and select a Maps To option.
- If Across Columns was selected for Time Mapping on the Define Overall Rule Setting page, monthly values will be provided in columns when the data is loaded through File, Copy Paste or Web Services. There are multiple measures that can be loaded while loading sales data.
- If the Across Columns option was selected instead of individual Measures, the Measure Code is available in the Data Mappings in addition to the other columns. Measure Code is required in the source file.
- Support Role (if not selected on the Define Overall Rule Settings page to be included in the data file) is added to the Data Mappings as a mandatory field.
In the Define Data Mappings step, you need to map certain mandatory fields that vary for each load item and load sub-item. To learn about the mandatory fields for each load item, click here.
Load Data
This is the final step in the Data Load Process. If there were issues with the load, an error log is created. If the Data Load was successful, a message indicating so will appear. It is important to load data in chronological order to avoid discrepancies and multiple loads.
Once data is loaded, a status page displays the target scenario and the load status with any exceptions. If data does not load properly, click the Exception Report button to view the Data Load Exception Report.
Processing of Data Load Rules - Scenario Locking
Available for Shared Mode Applications
This applies to only those applications where Shared Mode is enabled. For information on what Shared Mode is, click here. If you are not sure if your application has Shared Mode, please contact Planful Support.
Overview
To prevent Data Load Rules and Consolidation Processes from running on the same scenario simultaneously, Scenario Lock functionality is in place.
When Data Load Rules and Consolidation Processes are executed, the first one to be initiated acquires a lock at the Scenario level, preventing the second process from executing on the same scenario until the first process is complete. For example, if a second process is initiated before the first one completes, the system will wait 10 minutes before canceling the second process. During those 10 minutes, the second process automatically attempts to acquire the scenario lock every 30 seconds. If the second process acquires the lock (meaning the first process completed within the 10 minutes), it executes and completes. If the lock is not acquired within 10 minutes (meaning the first process takes longer to complete), the second process is canceled and you can retry it later. Information on the status of canceled, completed, and queued processes is available in the Cloud Scheduler.
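The retry behavior can be pictured as the loop below. This is only a conceptual sketch of the timing described above; Planful does not expose the lock as an API, and acquire_lock and run_process are hypothetical placeholders.

```python
import time

RETRY_INTERVAL_SECONDS = 30   # the second process retries every 30 seconds
MAX_WAIT_SECONDS = 10 * 60    # after 10 minutes the queued process is canceled


def run_with_scenario_lock(scenario, acquire_lock, run_process):
    """Attempt to acquire the scenario-level lock before running."""
    waited = 0
    while waited <= MAX_WAIT_SECONDS:
        if acquire_lock(scenario):        # the first process to ask holds the lock
            return run_process(scenario)  # lock acquired: execute and complete
        time.sleep(RETRY_INTERVAL_SECONDS)
        waited += RETRY_INTERVAL_SECONDS
    return "canceled"                     # first process still running: cancel, retry later
```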
Application Behavior
You can run Data Load Rules synchronously and/or asynchronously.
Behavior When Running Asynchronous Data Load Rules
- Asynchronous Data Load Rules run in the cloud and are executed via the Cloud Scheduler.
- Asynchronous Data Load Rules have a limit of one process per application. When two Data Load Rules are executed by the same or different users (on the same or two different scenarios), they are queued in the Cloud Scheduler and executed one after another.
Behavior When Running Synchronous Data Load Rules
- Synchronous Data Load Rules run immediately, and you cannot continue to work in the application until they complete.
- You will not see entries in the Cloud Scheduler; you will see entries in the Data Load History.
- Two different users can run Data Load Rules independently at the same time as long as each is run for a different scenario. If the same scenario is used, the subsequent process will fail and result in an error.
Behavior When Running a Combination of Synchronous and Asynchronous Data Load Rules
- You can run Data Load Rules asynchronously and synchronously within the same application. If an asynchronous Data Load Rule is running and a synchronous Data Load Rule is initiated for a different scenario, both will be executed.
- If an asynchronous Data Load Rule is running and a synchronous Data Load Rule is initiated for the same scenario, the synchronous Data Load Rule will fail immediately.
If a synchronous Data Load Rule is running and an asynchronous Data Load Rule is initiated for the same scenario, the asynchronous Data Load Rule will be queued in the Cloud Scheduler (the Status will be ‘running’ while waiting). The wait time is ten minutes, after which the application cancels the queued process.
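As a summary of the interactions described above, the hypothetical helper below returns the expected outcome for a second Data Load Rule given what is already running. It is a reading aid under the stated rules, not product logic.

```python
def second_dlr_outcome(running_mode: str, new_mode: str, same_scenario: bool) -> str:
    """running_mode / new_mode are 'sync' or 'async' for the running and new DLR."""
    if running_mode == "async" and new_mode == "async":
        # only one asynchronous process per application: always queued
        return "queued in Cloud Scheduler; runs after the first completes"
    if not same_scenario:
        return "both run"              # different scenarios can run in parallel
    if new_mode == "sync":
        return "fails immediately"     # scenario is locked by the running DLR
    return "queued; canceled after 10 minutes if the lock is not released"
```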
Data Load Rules - Required Fields
Users must select specific mandatory fields to complete the Data Load Rule (DLR) creation process during the Define Data Mappings step. A few additional mandatory fields vary depending on the options selected in the following areas:
- Load Type
- Load Sub-Item
- Build Type (as selected in the Define Overall Settings step)
- Time Mapping
- Parameters set to "Yes" or "No" in the Define Overall Settings step
- Direct data input basis
The following is the list of Load Items available in the DLRs:
- Segment Hierarchies
- Data
- Attributes
- Attributes Hierarchies
- Entity Hierarchy
- User Information
- Workforce Planning
- Currency Exchange Rates
- External Source Model
Segment Hierarchies
To complete the Segment Hierarchy or Alternative Hierarchy - DLR creation process, users must map specific fields in the Maps To column in the Define Data Mappings step. The required fields may vary depending on the Build Type selected in the Define Overall Rule Settings step. Also, if the Include In Data File field is set to Yes for any parameter in the Define Overall Rule Settings step, that parameter will become required in the Define Data Mappings step.
Hierarchy Type - Main Hierarchy
The required fields for each combination are detailed in the table below:
- The number of segments depends on your configuration as defined in the Configuration Task > Define Financial Segments.
- The system displays segment names defined in the Configuration Task > Define Financial Segments.
Load Sub Item (New Data Load Rule) | Define Overall Rule Settings | Define Data Mappings (Mandatory Fields)
---|---|---
Segment 1 | |
Segment 2 | |
Segment 3 to Segment 8 | |
Hierarchy Type - Alternate Hierarchies
The required fields for Alternate Hierarchies for each combination are detailed in the table below:
Load Sub Item (New Data Load Rule) | Define Overall Rule Settings | Define Data Mappings (Mandatory Fields)
---|---|---
Segment 1 | Build Type - Parent Child | Note: If Include in Data File is set to No, then the parameter will not be a mandatory field.
Segment 1 | Build Type - Level |
Segment 2 | Build Type - Parent Child | Note: If Include in Data File is set to No for Rollup Operator and Elimination, then those two parameters will not be mandatory fields.
Segment 2 | Build Type - Level | Note: If Include in Data File is set to No for Rollup Operator and Elimination, then those two parameters will not be mandatory fields.
Segment 3 to Segment 8 | Build Type - Parent Child | Note: If Include in Data File is set to No, then the parameter will not be a mandatory field.
Segment 3 to Segment 8 | Build Type - Level |
Data
To complete the Data - DLR creation process, users must map specific fields in the Maps To column in the Define Data Mappings step. The required fields for the Define Data Mappings primarily differ based on the Load Sub Item selected in the New Data Load Rule step. The Load Sub Item can be GL Data, Transaction Data, Transaction Data V2, or Translations Data.
For each Load Sub Item, the required fields differ based on the options chosen for the below parameters on the Define Overall Rule Settings step:
- Time Mapping - Down Rows or Across Columns.
- Include in Data File - set to Yes or No.
- In addition to the above two, users might have additional mandatory fields based on the Data Load Sub Item chosen on the New Data Load Rule step.
The detailed required fields are explained below for each Load Sub Item:
GL Data
Define Overall Rule Settings | Define Data Mappings
---|---
Time Mapping - Down Rows; Include In Data File - Yes | Note: If Include in Data File is set to No for any of the Parameters listed in the Define Overall Rule Settings step, the corresponding field (from 1 to 3) will not be mandatory.
Time Mapping - Across Columns; Include In Data File - Yes |
Time Mapping - Across Columns by Scenario (applicable only for non-Actual Scenarios); Include In Data File - Yes |
Translations Data
Note: If users have selected 5 source columns on the Translation page, then in the Maps to Column users have to select Source Column 1 to Source Column 5.
If users have selected more than one translation on the Define Data Mappings page, then the Mandatory column will be based on the maximum number of source columns.
Define Overall Rule Settings | Define Data Mappings
---|---
Time Mapping - Down Rows; Include In Data File - Yes | Note: If Fiscal Year and Fiscal Month are set to No in Include in Data File in the Define Overall Rule Settings step, then those two parameters will not be mandatory.
Time Mapping - Across Columns; Include In Data File - Yes |
Time Mapping - Across Columns by Scenario (applicable only for non-Actual Scenarios); Include In Data File - Yes |
Transaction Data or Transaction Data V2
The mandatory fields differ based on the Amount Columns and Source Data selected on the Transaction Details Setup page.
- The number of segments might vary based on your configuration and names will be displayed as defined in the Define Financial Segments - Configuration Task.
- Source Column(s) might differ based on the number of Source Columns assigned to the Translations Data Load in the Transaction Details Setup page.
- If users add Attribute Type(s) on the Transaction Details Setup page, those fields will also become mandatory, along with the fields mentioned in the table below.
Source Data & Amount Columns | Define Data Mappings (Mandatory Fields)
---|---
CCOA & Single Value |
CCOA & Debit amount and credit amount |
On Source Segments (with translations) & Single Value |
On Source Segments (with translations) & Debit amount and credit amount |
Attributes
Users can load values associated with attributes using the Attributes load item type. In the Define Data Mappings step, users must map the Name in the Maps To column, which is mandatory. Optionally, users can also map the Description.
Attributes Hierarchies
Users can load values for an attribute in an attribute hierarchy using the Attributes Hierarchies Load Item Type. The Mandatory Fields in the Define Data Mappings step depend on the levels defined in the Finance Hierarchies Setup. Users must map the level's name in the Map To column.
For example, if two levels are defined, such as Cash_Flow and Variable_Flow, the mandatory fields will be Cash_Flow Name and Variable_Flow Name.
Entity Hierarchy
Budget Entity Hierarchy
To complete the Budget Entity Hierarchy - DLR creation process, users must map specific fields in the Maps To column in the Define Data Mappings step. The required fields in the Define Data Mappings step differ based on the Build Type selected in the Define Overall Rule Settings step. Additionally, if the Include in Data File is set to Yes for a parameter, those parameters will become mandatory in the Define Data Mappings step.
Define Overall Rule Settings | Define Data Mappings (Mandatory Fields)
---|---
If Entity Type - Based on Segments - Default Budget Entity is selected, then the following are the mandatory fields: | For the Parent Build Type
| For the Level Build Type. Note: If you have selected Level 1, you must select Leaf Code. If you have selected more than one level, select the n-1 level codes. For example, for Level 3, you need to select Level Code 1 and Level Code 2.
User Information
The required fields for the User Information - DLR creation process in the Define Data Mappings step are:
- First Name
- Login
Additionally, if the user sets the Include In Data File to Yes for the following parameters, these parameters will become mandatory in the Define Data Mappings step:
- Navigation Role
- Reporting Role
- Support Role
Workforce Planning
The required fields for each Workforce Planning - DLR creation process in the Define Data Mappings step are explained below:
Workforce - Employee Data
The following Data Load Column fields on the Define Overall Rule Settings page are required fields:
- Employee Number
- Hire Date
- Pay Plan Code
- Employee Name
- Employee Type Code
- Position Name
- Budget Entity Code
- Rate per Hour
- Work Hours
- Wage Type or Salary
Additionally, if constant financial segments at the WFP budgeting level are excluded on the Define Workforce Planning Budgeting Criteria page, users need to load them using the Load Columns options.
All the above fields must be mapped in the Maps To column on the Define Data Mappings step to complete the Workforce - Employee Data DLR creation process.
Workforce - Employee Actuals
The following fields are required on the Define Data Mappings step to complete the Workforce - Employee Actuals DLR creation process.
- Employee Number
- Employee Name
- Position Description
- Employee Type
- Position Budget Entity
- Home Budget Entity
- Segment 1 to Segment 8
- The number of segments depends on your configuration as defined in the Configuration Task > Define Financial Segments.
- The system displays segment names defined in the Configuration Task > Define Financial Segments.
- FiscalYear
- Month’s Amount (Select any 1 of the months)
- Other than Employee Number, Employee Name, and Position Description, the user can select a Default value on the Define Overall Rule Settings step. When you select a default value, that parameter will not be mandatory in the Define Data Mappings step.
- If users set the Include in Data File to No for the Fiscal Month, the selected month’s amount must be mapped.
Workforce - Employee Compensation Item
The below table explains the required fields based on the selection of Direct data input basis and Time Mapping on the Define Data Mappings step:
Direct data input basis | Define Data Mappings
---|---
|
Change by Period; Time Mapping: Down Rows |
Change by Period; Time Mapping: Across Columns |
Change by Period; Time Mapping: Across Columns by Scenario |
The mentioned required fields must be mapped in the Maps To column to complete the Workforce - Employee Compensation Item DLR creation process.
Workforce - Employee Allocations
The required fields on the Define Data Mappings step differ based on the Time Mapping selected on the Define Overall Rule Settings step. To complete the Workforce - Employee Allocations DLR creation process, the required fields below must be mapped in the Maps To column on the Define Data Mappings step.
Time Mapping | Define Data Mappings
---|---
Down Rows |
Across Columns |
Across Columns by Scenario |
In addition to the fields mentioned in the table, users might find a few required fields based on the following configurations:
- Mapped segment(s) in the Define Financial Budget Entity page become required fields.
- If constant financial segments at the WFP budgeting level are excluded on the Define Workforce Planning Budgeting Criteria page, the user needs to load them from the Load Column option.
Currency Exchange Rates
To complete the Currency Exchange Rates (Load Item) > Load Exchange Rates (Load Sub Item) DLR creation process, users must map specific fields in the Maps To column in the Define Data Mappings step. The required fields differ based on the Time Mapping selected in the Define Overall Rule Settings step. Additionally, if Include in Data File is set to Yes for a parameter in the Define Overall Rule Settings step, that parameter will be mandatory in the Define Data Mappings step.
The Mandatory fields are listed in the below table:
Define Overall Rule Settings | Define Data Mappings
---|---
Time Mapping - Down Rows; Include In Data File - Yes |
Time Mapping - Across Columns; Include In Data File - Yes | Note: If users set the Include in Data File to No in the Define Overall Rule Settings step, those fields will not be mandatory except for the Fiscal Month. The defined Fiscal Month will become a mandatory field.
Time Mapping - Across Columns by Scenario; Include In Data File - Yes |
External Source Model
Users can select a Model from the Load Sub Item list for the External Source Model Load Type. The selected model’s corresponding fields will appear in the Define Overall Rule Settings step. These fields must be mapped in the Maps To column in the Define Data Mappings step to complete the External Source Model DLR creation process.