Fall18 Release Notes

Enhancements to Application Behavior

This section provides a quick overview of features and functionality delivered with this release that may affect the way you use the Host Analytics application. It is not a comprehensive list of all the features delivered with this release, but rather those features that change the way the application behaves.

Planning

  • MyPlan Now Calculates the Impact of a User’s Actions for the Entire Scenario; Not Just the Current Year/Period

  • Workforce Planning Automated Employee Processing Means No Need to Manually Click to Process Employees

Consolidation

  • User Defined Consolidation Process Available to Configure and Personalize Your Close Process

  • Enhancements to Retained Earnings Roll Forward

  • Ability to Make Unused Elimination Sets Inactive

  • Ability to Resubmit a Rejected Journal

Reporting

  • Consolidation Data Available in Dynamic Report Drill Through Report

  • Dynamic Reports Label Update From Nested Label to Repeat Label

  • Order of Tabs in the Dynamic Report Drill Through Report Has Changed

  • Confirmation Message Displayed When Selecting to E-Mail a Dynamic Report to All Users

  • Removal of Information Icon and Text in the Enhanced Drill Through Report

  • Map and Unmap Workforce Default Measures in Attribute Settings

Dashboards

  • Settings Label Update From Data Labels to Axis Labels

  • Ability to Save Column Widths within a Dashboard Table

  • Updated the Order of Display for Legends on Stacked Column and Area Charts When Displayed to the Right of the Chart

Modeling

  • Enhanced External Source Model Functionality

  • Data Locking to Replace Dimension-Based Locking

  • Enhancements to Reports in Spotlight (Web Version)

  • Hide Empty Folders for Views and Reports

  • Client Upgrade to TLS 1.2 Required Starting December 2018

Platform

  • Ability to Change the Data Load Rule Type

  • NetSuite Integration Overhaul

Common: Task Manager (Preview - Opt-In)

Task Manager is NOT available in your application by default. You must contact Host Support to enable it.

The idea for this enhancement came from our Customer Community! Thank you and please keep posting!

Stay organized and collaboratively streamline your planning process with Task Manager. Task Manager provides a personalized checklist of tasks to better manage your planning or consolidation process. You can add tasks and subtasks, assign due dates to tasks and subtasks, and assign tasks and subtasks to other users. In turn, users can comment on tasks and subtasks as well as attach meaningful information in the form of documents. Additionally, you can set up quick Host Analytics navigation links to a task or subtask for direct access anywhere within the application.

Here is an example of the Task Manager interface:

The following topics are discussed:

How to Enable Task Manager for Users

How to Access Task Manager

Exploring the Task Manager Interface

Exploring the All Tasks Page

How to Add a Task

Auditing Task Manager

How to Enable Task Manager for Users

First, users will have to be assigned Navigation Access to Task Manager. Follow the steps below to provide access.

  1. Navigate to Maintenance > Administration > User & Role Management.

  2. Select Navigation Role.

  3. Select the user’s Navigation Role and click Navigate Access.

  4. Select the Task Manager checkbox and click Save.

How to Access Task Manager

Access Task Manager by clicking the Task Manager icon.

When you access Task Manager for the first time, there will be no tasks. You’ll begin by adding tasks. To do so, click the Add Task button. For detailed information on adding tasks, see the How to Add a Task section.

Exploring the Task Manager Interface

The Task Manager interface contains 5 tabs:

  • All Tasks - Lists all tasks assigned to you or created by you (meaning that you are the task owner)

  • Not Started - Lists all tasks that you own or are assigned to you that have not been started

  • Overdue - Lists tasks you own or are assigned to you that are past the task due date

  • In Progress - Lists tasks you own or are assigned to you that are in an In Progress state

  • Completed - Lists tasks you own or are assigned to you that have been completed

At the top of the Task Manager interface is a Search field where you can search for a task by entering a task name or partial task name as shown below.

Click Clear Search to return to the unfiltered view.

Exploring the All Tasks Page

As mentioned above, the All Tasks page provides a list of all tasks assigned to you or created by you (meaning that you are the task owner).

Task Column

Click the Task up or down arrow (shown below) to sort tasks in ascending or descending order. Tasks will sort numerically and then alphabetically.

Priority Column

Each task is assigned a Priority when it is added. You can click the task priority to display the task detail. If you are the task owner, you can change the priority. In the example below, the user that’s checking Priority for the Validation Testing task is not the owner and therefore does not have the ability to change the priority.

Status Column

The Status column provides information on whether the task is Not Started, In Progress, or Completed. When tasks are added, they are defaulted to a Not Started state.

If you change a task from Not Started to In Progress, it will be added to the In Progress tab. Similarly, if you move a task from In Progress to Completed, it will be added to the Completed tab.

Owner Column

Hover over each bubble for the full owner name. If there are more than 3 owners, a bubble with the number of additional owners is shown. In the example above, there are 4 owners. Click the bubble to view the task details with owner information.

Due Date Column

Provides the due date assigned to the task when it was added. Overdue tasks have a due date that appears in a red font.

Navigate Column

Click the link to navigate directly to the associated area of the application for quick access. The navigation link is provided when a task is added.

Action Column

Provides quick access at the task level to see subtasks (connected to the main task) in hierarchy form.

Task owners can delete tasks. Click the trash can to delete a task.

How to Add a Task

  1. Click the Add Task button, which enables a pane on the left side of the screen (shown below).

  2. Enter the Title of the task in the Untitled field.

  3. Select the start and due dates. The timezone defaults to your local time. You can change it if required; the overdue status is tracked based on the selected timezone.

  4. Select a Low, Medium, or High Priority. An example is shown below.

  5. Click the Add button under the Owner topic to map users to the task. Clicking this button launches a page where you can search and select users.

  6. Click the Add button under the Owner topic to assign the task owner or owners. Clicking this button launches a page where you can search and select users.

  7. Add notes or instructions in the Notes field.

  8. Click Save. You’ve just added a task and assigned it to users.

Auditing Task Manager

All Task Manager actions are recorded in the Audit Log located by navigating to Maintenance > Audit > View Audit Log.

Common: NetSuite Connect

We now offer seamless native integration between Host Analytics and NetSuite for those customers with source data and metadata in NetSuite. You will use Data Load Rules (example provided below) or Web Services to extract the data and metadata (Data and Segment Hierarchies) from NetSuite Saved Searches, which you will create. Once created, no technical expertise is needed to modify the saved searches.

Token-based authentication is used to secure the connection between Host and NetSuite. Steps to complete the Host Analytics configuration (set up a Profile) are provided below. However, you must first perform some configuration in the NetSuite application. The instructions to set up this configuration are provided in the Admin Guide to Loading Data to Host Analytics, under the NetSuite menu item.

How to set up a NetSuite Profile in Host:

  1. Navigate to Maintenance > Admin > Configuration Tasks > Cloud Services.

  2. Click the NetSuite Connect tab as shown below.

  3. Click Add to create a NetSuite Profile. You need to create a profile to establish the connection between Host and NetSuite. Data can be pulled from multiple NetSuite databases; you can create a profile for each database. The Configure Profile screen appears.

  4. Enter a name for the profile.

  5. Enter the Token ID and the Token Secret generated and provided by NetSuite. Tokens are used to authenticate and authorize data to transfer over the internet.

  6. Enter the Consumer Key and Consumer Secret. NetSuite provides these values once a new integration record is created. The Consumer Key is the API key associated with NetSuite; it identifies the client.

  7. Enter the RESTlet URL. A router is a RESTlet that associates a URI with the RESTlet or resource that will handle all requests made to that URI. In plain terms, the URL provides the route from Host to NetSuite.

  8. Enter your NetSuite Account ID.

  9. Click Save. Your configuration will look something like the image below.
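For reference, here is a minimal sketch (in Python, purely illustrative; the class and field names are hypothetical and do not reflect Host Analytics internals) of the values a NetSuite Connect profile holds:

```python
from dataclasses import dataclass

@dataclass
class NetSuiteProfile:
    """Conceptual container for the values entered on the Configure Profile screen."""
    name: str             # profile name; create one profile per NetSuite database
    token_id: str         # Token ID generated and provided by NetSuite
    token_secret: str     # Token Secret generated and provided by NetSuite
    consumer_key: str     # API key from the NetSuite integration record; identifies the client
    consumer_secret: str  # secret from the NetSuite integration record
    restlet_url: str      # RESTlet URL that routes requests from Host to NetSuite
    account_id: str       # NetSuite Account ID

# Placeholder values only; real values come from your NetSuite configuration.
profile = NetSuiteProfile(
    name="NS User",
    token_id="TOKEN_ID",
    token_secret="TOKEN_SECRET",
    consumer_key="CONSUMER_KEY",
    consumer_secret="CONSUMER_SECRET",
    restlet_url="https://example.invalid/restlet",
    account_id="ACCOUNT_ID",
)
```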

Now that configuration is complete, use a Data Load Rule (DLR) to load NetSuite Data. Follow the steps below.

  1. Navigate to Maintenance > DLR > Data Load Rules.

  2. Click the New Data Load Rule tab.

  3. Enter a name for the DLR.

  4. For Load Type, select NetSuite Connect as shown below.

  5. Select to load Segment Hierarchies, GL Data or Transaction Data. These are the only supported load type items at this time.

  6. For Connection Profile, select the name of the profile configured above. In this case it was called NS User.

  7. Select a Load Sub Item and click Next.

  8. Select the NetSuite Saved Search on the Select Sample Input File page and then complete all remaining tabs as you would when creating any DLR.

Common: Ability to Change the Data Load Rule Load Type

When you set up a Data Load Rule (DLR) you select how you want your source data (such as actuals) loaded to Host Analytics. For example, you might load it in a file format or copy and paste the data. You might also use Boomi Connectors (Web Services). Let’s say you generally use the File load type to load data to Host Analytics from your source system. But, you’ve decided to use connectors via Web Services to load your data. Instead of re-creating the entire DLR (Data Load Rule), you can simply edit it and change the Load Type. This flexibility saves time and ensures consistency in loading data when the same DLR is used.

You can switch between the three types of DLRs:

  1. File Load
  2. Web Services
  3. Copy – Paste

In the image below, a File type load has been used to load actuals.

To change the Load Type from File to Web Services, double-click the DLR. Change the Load Type to Web Services and complete the remaining steps for the Select Sample Input File, Define Overall Rule Settings, Manipulate Input File, Define Data mappings and Load Data pages for the changes to take effect.

To change the 2017 Actuals DLR back to File Load, double-click the DLR on the Data Load Rule List page. On the New Data Load Rule page, change the Load Type to File Load and complete the remaining steps of the DLR.

Best Practices:

Complete all the steps in the DLR to effectively change the DLR type. Otherwise, the DLR will be rendered unusable.

Reporting: Ability to Resize and Save Groups of Columns in a Dynamic Report

The idea for this enhancement came from our Customer Community! Thank you and please keep posting!

With this enhancement, you can select multiple columns at once in Dynamic Report output using shortcut keys, resize the width for all selected columns and then perform a Save. When you perform a Save (with Save Column Widths selected in the toolbar), all column widths are retained even when the report is closed and then reopened. Prior to this release, only the first resized column could be saved in a Dynamic Report and all other columns reverted to a default width unless you resized each column individually.

Note:
If you reset column widths, they default to the original widths.

In Practice:

  1. Open a Dynamic Report.

  2. Hold the SHIFT key to select contiguous columns to resize, or hold the CTRL key to select non-contiguous columns.

  3. Drag the column line (shown in the image below) to the appropriate size.

  4. Click Save.

Reporting: Code/Name/Label Display in Dynamic Report Drill Through

Now, you have the flexibility to display drilled through Dynamic Report output by member code, name, or label (code+name). In the past, the member label was displayed in Drill Through output with no option to select to display code or name only.

When the drill through report is exported to Excel, the code/name/label configuration is retained across all tabs. You cannot save the code/name/label configuration; it is for viewing and export purposes only. Once you navigate away from the Drill Through screen, the default setting is reapplied.

Note:
Label is the default selection.

In the example below, code is selected on the Operational Planning tab.

When the Capital Planning, Workforce Planning, or GL Data Loads tabs are selected, only the code will display as shown below.

You cannot have one tab display code while the other displays label, for example. All tabs will display based on the latest selection made on any of the tabs.

The table below provides information on the Drill Through tab name and the related columns that the code, name, label selection is applied to.

Reporting: Formatting Enhancements in Dynamic Reports and Report Sets

We’ve added two more formatting options for Dynamic Reports and Report Sets: Copy Format and Text Wrap.

Copy Format

Use Copy Format to copy the formatting of text or cells in your report output.

To use Copy Format:

  1. Open a Dynamic Report.
  2. Select the text or range of cells you want to copy the format of.
  3. In the toolbar, click Copy Format.
  4. Select what you want to paste the formatting onto.
Note:
Copy Format copies both text and data format.

Text Wrap

Use Text Wrap on columns, rows or cells.

To use Text Wrap:

  1. Open a Dynamic Report and click Format.
  2. Select the cells you want to wrap.
  3. Click the Text Wrap icon (shown below).

Note:
Text wrap does not apply to data cells.

Reporting: Capital Budget Analysis Report Update

We’ve added a filter option to the Capital Budget Analysis standard report. Now, you can View report output by Quarter.

  1. Access the Capital Budget Analysis report by navigating to Maintenance > Reports > Standard Reports > Capital Budget Analysis.
  2. Select the Capital Budget Analysis report and click More > Deploy Reports. You must redeploy this report to take advantage of the filter option.
  3. Run the report from the Standard Reports interface or from the location it was deployed to within the File Cabinet.
  4. For View, select By Quarter as shown below.

Here is an example of report output when viewing by quarter.

Reporting: Staffing Summary Report Update

We fixed an issue with the Staffing Summary standard report so that column headers now appear on all pages. Please re-deploy the report by following the steps below:

  1. Navigate to Maintenance > Reports > Standard Reports.
  2. Select the Staffing Summary report and click More > Deploy Reports.
  3. Select the location in the File Cabinet to deploy the report to.

Reporting: Update to Shared Data in Enhanced Drill Through

Now, when Shared Mode is enabled in your application, GL data is displayed under the Operational Planning tab (as shown below) when you’ve drilled through from a Dynamic Report, irrespective of whether the data is posted from the Budget Control Panel or Data Load Rules.

However, when data is loaded using multiple sources within Shared Mode, the data is overridden for a given segment combination, resulting in the display of the latest details. This means that the history data is not maintained, and Planning templates will be updated with the latest data only after running the Simulation Engine or opening and saving each template.

In applications where Shared Mode is enabled, Entity and Doc Ref/Template columns display as Shared Mode in the Operational Planning tab for the applicable Template Types and Scenarios as shown below.

Lines and sublines are displayed if data is posted from the Budget Control Panel. However, the data displayed for each period is pulled from the most recently updated source. If data is uploaded only via Data Load Rules, it is displayed under the Operational Planning tab, but lines and sublines will be empty and hidden from display.

Reporting: Confirmation Message When Send E-Mail to All Users is Selected

Now, when you select to email a report to all users from the Dynamic Report email interface (shown below), a confirmation message is shown. This will ensure that you don't unintentionally send the report to all users.

Reporting: Removal of Information Icon and Text in Enhanced Drill Through Report

We’ve removed the information icon and the text shown below from the Drill Through Report page (accessed by drilling through Dynamic Report output).

Information Icon and Text Prior to This Release:

Without Information Icon and Text:

Reporting: Order of Tabs in Drill Through Report

When you drill through from a Dynamic Report to the Drill Through report, views appear in tab format. With this release, we are reordering the tab display to eliminate clicks for users who like to access Transaction data quickly. The tab order is shown in the image below.

The tab order is: Transactions, Translations, Operational Planning, Workforce Planning, Capital Planning, GL Data Loads, Consolidation.

Reporting: Consolidation Data Available in Enhanced Drill Through

Drill through to source information on Consolidation data is now available in Dynamic Reports. This means that data imported from a general ledger, adjustments made using journals, and transformations of data by the Consolidation process can now be drilled into using Dynamic Report Drill Through.

When you drill through in a Dynamic Report, you will see a tab called Consolidation as shown below.

As shown in the image above, the default columns displayed for Consolidation include the Chart of Accounts segment combination (in this example: Account, Company, Intercompany, Department, Product, Project), Measure, Reporting, Type, Doc Ref, Line, Currency, Month, and Amount.

Cells in the Type column populate from the following: Standard Journals, Dynamic Journals, Non Controlling Interest Calculations, Reclassifications, Eliminations, Consolidation Process, and MSOB (Multiple Set of Books) Calculations.

Cells in the Doc Ref column populate with labels that correspond to the Type column. For example, in the image below, Doc Ref represents the name of a Standard Journal, which is why the Type is Standard Journal.

The Standard Journals in this case are set up against the 1010 - Corporate HQ company as shown below in the Member Selector screen of the Consolidation Control Panel.

Similarly, if the Type is Elimination, then Doc Ref references the Elimination entry for the account and company.

Cells in the Line column populate for Standard Journals, Dynamic Journals, and Non Controlling Interest Types. The cells in this column will be empty for all other Types. If two Lines with the same Line Name/Description are posted from a single Journal for a given segment combination, their values are aggregated in the drill through report.

Turn ON/OFF is enabled for Type and Doc Ref. If turned OFF, data is aggregated for each unique segment and currency combination across all journals and the Consolidation Process, as illustrated in the sketch below.
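To make the aggregation behavior concrete, here is a minimal sketch (Python, with hypothetical column values; not Host Analytics code) of collapsing drill through rows to one amount per unique segment and currency combination when Type and Doc Ref are turned off:

```python
from collections import defaultdict

# Hypothetical drill through rows: (account, company, currency, type, doc_ref, amount).
rows = [
    ("4000", "1010", "USD", "Standard Journal", "JRN-A",  100.0),
    ("4000", "1010", "USD", "Elimination",      "ELIM-1", -40.0),
    ("4000", "1010", "USD", "Standard Journal", "JRN-B",   25.0),
]

# With Type and Doc Ref turned OFF, amounts collapse to one row
# per unique segment + currency combination.
aggregated = defaultdict(float)
for account, company, currency, _type, _doc_ref, amount in rows:
    aggregated[(account, company, currency)] += amount

for key, total in aggregated.items():
    print(key, total)  # ('4000', '1010', 'USD') 85.0
```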

Reporting: Ability to Map/Unmap Workforce Planning Default Measures from the Attribute Settings Screen

All default measures are now enabled for selection on the Attribute Settings page so that you can map and unmap them to and from the reporting area.

This improved flexibility is beneficial as the legacy behavior was such that if you did not want to include annual salary or bonus, for example, in a Dynamic Report, you couldn’t hide them from the Measure dimension.

In Practice:

To Map Workforce Planning Default Measures from Attribute Settings Screen:

  1. Navigate to Maintenance > Reports > Cube Settings.

  2. The Attribute Settings tab is displayed by default when the screen opens.

  3. Select Workforce Attributes from the list-box.

  4. Click the Add button to select measures to map.

  5. You MUST map Amount in CC (Common Currency), Headcount, and one measure each from the Position and Employee dimensions to take advantage of Workforce Reporting. See images below.

  6. Click Add Selected.

  7. Click Save.

Unmapping Measures

Not all measures can be unmapped. You will receive an informational message for measures that cannot be unmapped. Workforce Reporting measures (Amount in CC, Headcount, and one measure each from the Position and Employee dimensions) must remain mapped.

To Unmap Measures:

  1. Click the checkbox next to the measure you want to unmap.

  2. Click the Unmap Measures button shown below.

Reporting: Toolbar Name Change from Nested Labels to Repeat Labels

We’ve changed the name from Nested Labels to Repeat Labels under the More list-box of Dynamic Report output. Now, you will see Repeat Labels Off and Repeat Labels On selections as shown below.

Reporting: Cell Level Formatting Now Available for Report Sets Included in Dynamic Reports

Contact Host Support if you have not already enabled this feature in your application and you would like to.

The cell formatting options within Dynamic Reports are now available in reports that are built using Report Sets. Prior to this release, cell formatting was enabled only in dimension-based reports.

In the Dynamic Report below, a Report Set is applied to the Row axis. Even though a Report Set is used to build the Dynamic Report, you can still apply cell level formatting.

Dashboards: Ability to Save Column Widths within a Dashboard Table

To further customize the look and feel of a Dashboards table, you can resize the column widths and save. Resizing of tables can be done in both Design and Edit modes. Column widths are retained when the Dashboard is accessed in Run or Edit mode, when the Dashboard is copied or duplicated, and when exported.

In Practice:

  1. Access Dashboards.

  2. Open a Dashboard.

  3. Use the arrow to drag the column to the required size.

  4. Save the Dashboard, which saves the column width.

Dashboards: Code/Name/Label Display

You now have the ability to include code or name only in charts on a dashboard. This is beneficial when the display of labels on the charts makes them too busy and the legends too large.

Follow the steps below to include code or name only in charts on a dashboard.

  1. Access a Dashboard.

  2. Select a chart and click Edit or create a new chart.

  3. Click the ADD button to add a dimension or attribute as shown below.

  4. The member selector displays all members (in this case for the Scenario dimension).

  5. Click the Label list-box and select Code or Name for the dimension. This is done once for each dimension.

  6. Save your chart.

Filtering by Code/Name/Label is also available for Dashboard filters as shown in the example below.

When variables are used for the chart title, subtitle, and/or description, the variable code/name/label display is based on what is applied in the member selector. An example is shown below.

If you export a dashboard to PDF, Code/Name/Label display selections on the dimensions are displayed in the output. Selections are also retained when you duplicate a dashboard.

Note:
This functionality is available by default across all applications where Dashboards are enabled and needs no additional configuration setup.

For system-defined dimensions like Time, Reporting, Measures, Position, and Standard and Derived Variables, Label does not exist. In this case, Code/Name is always displayed in the member selector. For Standard and Derived Variables, the chart/dashboard output is displayed retaining the code/name/label configuration.

Dashboards: Ability to Display or Hide Axis Labels

Data Labels (available under the Settings tab) has been replaced with Axis Labels across all chart types, including Line, Bar, Column, Waterfall, Stacked Column, Stacked Bar, Area, Stacked Area, Bubble, and Scatter. The toggle button has been replaced with a selectable list-box (shown in the image below) offering Horizontal, Vertical, and None selections.

Legacy Toggle Button:

List-Box:

For existing Bar and Stacked Bar charts, Horizontal is selected by default.

Vertical is selected by default for all other chart types as shown in the image below for a Line chart.

If Vertical is not selected, the axis displaying numbers/data is hidden for all chart types except Bar and Stacked Bar. For Bar and Stacked Bar charts, this behavior applies to the Horizontal selection instead.

If None is selected, neither the dimension labels nor the data values are displayed on either axis, for all chart types.

For Dual Combination Charts, configuration done on the Vertical axis also applies to the secondary Y axis. For existing Combination Charts, Vertical is always selected by default for Axis Labels.

Note:
Axis Labels are enabled for Pie and Doughnut charts, but there is no horizontal or vertical dropdown.

Here is an example with Horizontal and Vertical Axis Labels selected.

Dashboards: Introducing Single Y Axis Combination Charts

The Single Y Axis Combination chart allows you to represent one column of values on the Y axis. Prior to this release, Dual Axis Combination charts required two columns of values on the Y axis. Examples of Single Y Axis and Dual Axis Combination charts are provided below.

Example Single Y Axis Combination Chart:

Example of Dual Axis Combination Chart:

In Practice

  1. Access Dashboards.

  2. Drag and drop a Combination chart to the canvas.

  3. Select the Settings tab.

  4. For Type, select Single.

  5. Build your chart with dimension members and save.

Planning: Automated Employee Processing (Phase 1)

Note:
In order to use this feature, you must set the Enable Delta Processing option on the Define Detailed HR Budgeting Criteria Configuration Task page to Yes. Navigate to Maintenance > Admin > Configuration Tasks and open the Define Detailed HR Budgeting Criteria task. Select Yes for Enable Delta Processing.

To improve user experience, we are introducing Phase 1 of Automated Employee Processing. Automated Employee Processing is available when you perform a Save from the Employee Add or Edit page, the Employee list screen and Mass Update. This means that you no longer need to manually process employees when you perform a Save from the aforementioned screens. Once the Save action is selected, the system will process employees that have been added or modified. Please make sure you “refresh” the screen to see the updated information.

Additionally, you can no longer process employees when a scenario is locked. Scenarios must be in an unlocked state in order to process employees associated with them. Scenarios can be locked and unlocked from the Scenario List page (accessed by navigating to Maintenance > Admin > Scenario Setup).

In Practice

How to access the Employee list screen and process employees:

  1. Navigate to the Planning Control Panel.

  2. Select a scenario, entity, and Detailed HR template.

  3. Open the template in Input mode. The Employee list screen is displayed (shown below). When you modify employees and perform a Save from this screen, employees are processed for the current budget entity. So, there is no need to click the Process menu item in order to process employees; the Save action now handles it for you.

How to access the Employee Add screen and process employees:

  1. Navigate to the Planning Control Panel.

  2. Select a scenario, entity, and Detailed HR template.

  3. Open the template in Input mode. The Employee list screen is displayed.

  4. Select the Employees list-box and click Add. The Employee Add screen is displayed.

  5. Complete the fields on the Employee Add screen and click Save. When you perform a Save from this screen, employees are processed for the current budget entity. So, there is no need to click the Process menu item and process employees.

How to access the Employee Edit screen and process employees:

  1. Navigate to the Planning Control Panel.

  2. Select a scenario, entity, and Detailed HR template.

  3. Open the template in Input mode. The Employee list screen is displayed.

  4. Select an employee, select the Employees list-box, and click Edit. The Employee Edit screen is displayed.

  5. Perform all edits and click Save. When you perform a Save from this screen, employees are processed for the current budget entity. So, there is no need to click the Process menu item and process employees.

How to access Mass Update and process employees:

  1. Navigate to the Planning Control Panel.

  2. Select a scenario, entity, and Detailed HR template.

  3. Open the template in Input mode. The Employee list screen is displayed.

  4. Select the Employees list-box and select Mass Update.

  5. Complete the fields on the Mass Update page and click Save. When you perform a Save from this screen, employees are processed for the current budget entity. So, there is no need to return to the Employee list screen, click the Process menu item, and process employees.

Important Notes:

  • Autoprocessing applies to employee data modifications made from the Employee Listing screen, the Add/Edit screens, and the Mass Update option. For updates made to employee data from any other source, processing still needs to be performed manually.

  • In the Employee list screen, the Save action automatically processes employees; however, the status will be updated only when the listing page is refreshed.

Sneak Peek at What’s Coming with Phase 2

We will be introducing Phase 2 of Automated Employee Processing with the Winter19 release. With this phase, Data Load Rules will be enhanced so that when you click Save, employees will be processed automatically, which will eliminate the need to perform this action manually. To further streamline the look and feel, the Process button (shown in the example screenshot below) will be moved to the More menu. Additionally, a More menu will be added to the Employee Edit screen to increase usability.

To enhance usability, the ultimate goal is to eliminate the need to perform a Refresh, as this screen and the Employee List screen will be refreshed automatically.

Planning: Ability to Perform an Approve All

Perform an Approve All from the Planning Control Panel to approve all leaf members and the selected parent member rollup in the budget hierarchy. This functionality is beneficial because it eliminates the need to individually approve members, saving time and eliminating clicks.

In the example below, when you perform an Approve All on 1000 - Corporate Mgmt Group, all of its members will be approved as well.

Planning: Budget Entity Status Report

Introducing the Budget Entity Status report, which allows users to quickly retrieve a summary status for all budget entities they have access to. This report provides information directly related to workflow status as well as budget entity status. It also allows users to filter entities based on status, such as Forwarded and Approved.

In Practice:

  1. Navigate to the Planning Control Panel. Select a scenario and a budget hierarchy entity rollup or leaf member.

  2. Locate the Status Report option on the toolbar and click it. You will see the Status Report appear.

  3. Once the Status Report appears, you can click members in the hierarchy to view information. You can even select a different scenario from this screen. In the image below, the BUDGET 2019 scenario is selected and the related hierarchy is shown.

    In this image (below), 1% of the budget is complete and there are a total of 105 entities. You can also see how many have not been started. Click Not Started for details.

    Click Forwarded to view the entities that are in a forwarded state as shown below.

    You can also perform actions on entities and templates from this report; the same actions that can be performed from the Planning Control Panel.

Planning: Seeding Enhancements - Ability to Seed Time Sets of the Same Frequency when Seeding Overlapping Periods

With the Summer18 release, we provided the ability to seed template structures, overlapping periods, and formulas. With this release, we are providing the ability to seed Time Sets when you select the seeding option called Reference Scenario with Overlapping Periods. If a scenario had templates with non-uniform Time Sets, our legacy functionality did not allow seeding. Now this use case is supported. There is no setup required, as this functionality is automatic.

Time period frequencies include Year, Quarter, and Month. A Time Set definition (e.g., Month, Quarter, Quarter, Year, Year, Year) determines the template column structure for allowed budget input periods.

With this release, you can seed data and the structure of templates with Time Sets of the same frequency (e.g., Quarterly, Yearly) when you have overlapping periods. This functionality works for uniform Time Sets only. For example, if a template has Time Sets of All Months, All Quarters, or All Years, then seeding will be allowed. Custom Time Sets (like MM-QQ-YY or MM-QQ) will not be seeded, as sketched below.
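As a rough illustration of the same-frequency rule (a conceptual check only, not the application's actual logic):

```python
def seeding_allowed(time_set):
    """A Time Set qualifies for seeding only when every period uses one frequency."""
    return len(set(time_set)) == 1

print(seeding_allowed(["Month"] * 12))                # True  - All Months
print(seeding_allowed(["Quarter"] * 4))               # True  - All Quarters
print(seeding_allowed(["Month", "Quarter", "Year"]))  # False - custom Time Set, not seeded
```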

To verify that Time Sets were seeded, create a Dynamic Report and include the source and target scenarios. In the example verification report below, time, measure, account, and scenario were included on the row and column axes.

Essentially, Time Sets allow budget input at a selected time period frequency in a budget template for each budget input year. This flexibility allows you to create templates with accurate operational plans and provide a means for long term strategic planning.

MyPlan: Multi-Year Calculations

MyPlan now calculates the impact of a user’s actions for the entire scenario, not just the current year/period. For example, if you set up a 2 year budget scenario (e.g., 2018-2019), modifications made to scenario data in MyPlan will be calculated for the entire 2 years on a period by period basis (i.e., monthly/quarterly).

Prior to this release, if a user selected a 2018 period in the MyPlan dashboard and added an employee named Wally Smith on September 3rd, 2018, the cost calculation would only affect that period and year until the budget was submitted. Now, the cost calculation applies to both 2018 and 2019 instantly.

Calculations are held "in memory" so that the user can see all costs instantly and can choose to submit their actions later. The impact to all years/periods of the scenario is displayed in Activity Manager as shown in the example below.

Now, let’s say an employee is added to a Q4 of 2018 Forecast scenario on November 1st of 2018. MyPlan will calculate the total compensation for the new employee for November and December of 2018 as well as the additional periods/years associated with the Forecast scenario.

The benefit of this feature is that it provides a comprehensive look at how the data entered impacts your company and, as a result, your business decisions. There is no required setup.
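As a rough sketch of the multi-year calculation described above (conceptual only; the salary figure and even monthly proration are assumptions, not the product's formula):

```python
from datetime import date

def monthly_cost_by_period(annual_salary, hire_date, scenario_end):
    """Spread annual salary evenly across months from the hire month through the scenario end."""
    monthly = annual_salary / 12
    costs = {}
    year, month = hire_date.year, hire_date.month
    while (year, month) <= (scenario_end.year, scenario_end.month):
        costs[(year, month)] = monthly
        month += 1
        if month > 12:
            month, year = 1, year + 1
    return costs

# Employee added September 3rd, 2018 to a scenario that runs through December 2019.
costs = monthly_cost_by_period(120_000, date(2018, 9, 3), date(2019, 12, 31))
print(len(costs))                          # 16 periods impacted (Sep 2018 through Dec 2019)
print(costs[(2018, 9)], costs[(2019, 6)])  # 10000.0 for each impacted period
```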

MyPlan: Template Enhancements

We’ve enhanced the interface for templates accessed via MyPlan by adding a menu with consolidated tasks for a better user experience and consistency with templates opened in Contributor Mode.

In Practice:

  1. Access MyPlan and click the Wrench icon to open a template. The template pane opens. In the image below, the Adjust Opex template is shown. The menu is also shown.

  2. On the far left, click the cloud icon with the up arrow to Update the template based on changes or adjustments made to the template.

  3. Click the Refresh icon to refresh the template data.

  4. Click the Wrench icon to perform a Fix It.

  5. For View, select from four options: Template Infobar, Formula Bar, Column Header and Row Header.

    • Selecting Template Infobar displays the Entity, Period and Template Name as shown below.

    • Selecting Formula Bar displays the fx bar where you can enter formulas such as SUM.

    • Select Column or Row Header to enable a more Excel like look and feel.

  6. Click Methods to enable the Methods pane where you can spread data from a total column to period columns. Prior to the addition of this menu item, you would have needed to right-click in an editable cell in the template. Selections are made available based on how the template was set up by your Admin user. Options that might be available to you are shown in the image below, and a description of each is provided below as well (a conceptual sketch of the spread arithmetic follows these steps).

    • Monthly Increase in $ - Apply an increased dollar amount based on historical amounts for each month. Optionally, round the number to the nearest whole number. For example, 67.7 will appear as 68. Click Apply.

    • Monthly Increase in % - Apply an increased percent based on historical amounts for each month. Optionally, round the number to the nearest whole number. For example, 67.7 will appear as 68. Click Apply.

    • Full Year Growth - Apply a total increase percent based on history and use a spread method to distribute the amounts to the monthly periods. The spread method may be based on history or custom-defined spread methods created for the template. Optionally, round the number to the nearest whole number. For example, 67.7 will appear as 68. Click Apply.

    • Clear Applied Method - Used to clear a previously applied method. Select a new method in the Methods pane shown below. When a method is applied, data in the cells is populated based on the method. When a user clears the method, the values are retained but the method logic underneath is removed on the row so that a new method can be applied.

  7. Click the icon on the far right (the circle with the dot in the middle) to open the template in Contributor Mode.
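Here is a conceptual sketch of the arithmetic behind the spread methods above (Python, assuming a simple history-based spread; this is not the application's engine and the function names are illustrative):

```python
def monthly_increase_pct(history, pct, round_result=True):
    """Monthly Increase in %: grow each historical monthly amount by the given percent."""
    grown = [amount * (1 + pct / 100) for amount in history]
    return [round(v) for v in grown] if round_result else grown

def full_year_growth(history, pct, round_result=True):
    """Full Year Growth: grow the annual total, then spread it in proportion to history."""
    total = sum(history) * (1 + pct / 100)
    spread = [total * (amount / sum(history)) for amount in history]
    return [round(v) for v in spread] if round_result else spread

history = [100, 110, 120, 130, 140, 150, 160, 170, 180, 190, 200, 210]  # 12 months
print(monthly_increase_pct(history, 5))  # each month grown by 5%, rounded (e.g. 67.7 -> 68)
print(full_year_growth(history, 5))      # grown annual total spread back across months
```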

MyPlan: Data Trend Analysis

We are excited to release Data Trends for MyPlan. It’s our first step toward what we are calling Data Intelligence. Data Trends allow users to use prior, present, or future data points to help them complete a budget or forecast by looking at data from any given scenario and time period, in order to analyze and benchmark against their working scenario. Users can make actionable decisions by knowing how and what should be Planned, knowing where they are going to ultimately Land, and, most importantly, having a clear picture of where they currently Stand.

The following topics are discussed:

Configuring Data Trends

Displaying Data Trends

Displaying Values in Annualized Mode and Adjusting Factor Explained

Configuring Data Trends

Data Trends setup is completely driven by the application administrator. It is the admin’s job to determine what data points end users should see while they are using MyPlan. The setup instructions are provided below.

In Practice

  1. Navigate to Scenario Setup (Maintenance > Overview > Scenario Setup), select a working scenario and click Edit.

  2. Next, navigate to the Data Trends Setup tab and select Enable Trend. You will see the budget input years appear.

  3. By default, the system auto-populates the compare data columns with the Actual scenario and the prior year. Use the Scenario Name and Fiscal Year dropdowns to choose what other scenarios you would like to use as a reference for end users to compare against. Use the Column Header option to rename the scenario to something more user-friendly (this is only an alias; you are not changing the actual scenario name).

  4. Based on the reference information you would like to show to the end users, the above steps can be completed for all budget input years.

Displaying Data Trends

This section discusses how Data Trends are displayed in the MyPlan interface.

In Practice:

  1. Navigate to MyPlan, select the three vertical dots icon, and then select Show Data Trends. You will see the reference scenario setup that you completed for each Compare Data element displayed. By default, Compare Data 1 will be the first compare scenario shown (see screenshot below).

  2. Select the Showing Data dropdown box to access the other Compare Data options.

Displaying Values in Annualized Mode and Adjusting Factor Explained

When you set up Data Trends and configure a Compare Data option, there is an option called Display Values in Annualized Mode and an option to add an Adjusting Factor (as shown in the image below). A brief description of each option is provided below.

Display Values in Annualized Mode

The Annualized Mode option is used when the Actual scenario, without a full year of data, is used for compare or trend analysis. In such cases, the application will adjust for periods without any data. For example, let’s say a user is generating a Data Trend for the Budget 2019 scenario. The user selects the Actual 2018 scenario for comparative values. However, the application only has Actual data for 9 periods in the Actual 2018 scenario. It is impossible to compare 12 periods of data in the Budget 2019 scenario with 9 periods of data in the Actual 2018 scenario.

If you are in this situation, use the Display Values in Annualized Mode option. When enabled, the application will calculate the average of the 9 periods of Actual data and use the result to populate the remaining 3 periods that were without data.

The calculation to extrapolate the average of 9 periods of data would look like this:

Original Actual 2018 Period Data

P1   P2   P3   P4   P5   P6   P7   P8   P9   P10      P11      P12
100  110  120  130  140  150  160  170  180  No Data  No Data  No Data

Total P1 through P9 = 1260; 1260 / 9 = 140, so 140 is used for P10, P11, and P12.

After Annualized Mode is Applied

P1   P2   P3   P4   P5   P6   P7   P8   P9   P10  P11  P12
100  110  120  130  140  150  160  170  180  140  140  140

Adjusting Factor

The Adjusting Factor is an optional percentage applied on top of the average value calculated by the application. For example, if the Adjusting Factor is 10%, the average value is calculated and an additional 10% is added (i.e., $140 + 10% = $154). If you have 3 periods without data, the average (plus 10%) is used to populate those periods.
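Here is a minimal sketch of the Annualized Mode and Adjusting Factor arithmetic described above (conceptual only, assuming a simple average-and-uplift calculation):

```python
def annualize(loaded_periods, total_periods=12, adjusting_factor_pct=0.0):
    """Fill missing periods with the average of the loaded periods,
    optionally uplifted by the Adjusting Factor percentage."""
    average = sum(loaded_periods) / len(loaded_periods)
    fill_value = average * (1 + adjusting_factor_pct / 100)
    missing = total_periods - len(loaded_periods)
    return list(loaded_periods) + [fill_value] * missing

actual_2018 = [100, 110, 120, 130, 140, 150, 160, 170, 180]  # 9 loaded periods
print(annualize(actual_2018))                           # P10-P12 filled with 140.0
print(annualize(actual_2018, adjusting_factor_pct=10))  # P10-P12 filled with ~154
```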

Consolidation: User Defined Consolidation Process

With the User Defined Consolidation Process feature, you can configure and personalize your Close process. For example, you can balance books in local and common currencies during your Close process and require that interim and reporting currencies be processed after Consolidation is complete. For Budgets and Forecasts, you now have the flexibility to run the Consolidation Process for specific financial years (a period range) of the scenario, versus all years (which was the case prior to this release).

Prior to this release, the Consolidation Process would run conversions and other calculations for common currency, local currency, interim currency, and reporting currency all at once. For example, let’s say you wanted to post Eliminations, Dynamic Journals (in common currency), and Reclassifications. You would have had to run the Consolidation Process, then Eliminations, and then the Consolidation Process again so that interim and reporting currency would reflect the eliminated intercompany transactions. Running these processes consumed a considerable amount of time. Now, you can process local currency and common currency, post eliminations and common currency adjustments (if any), and then consolidate interim and reporting currency.

Follow the In Practice steps below. Alternatively, run the Consolidation process from Cloud Scheduler. You can choose Interim Currency and Reporting Currency by selecting the Configurable Steps option shown below.

In Practice:

  1. Access the Consolidation Control Panel.

  2. Select the Scenario, Period and Company.

  3. Click the Consolidation Process and select the Process Action. The Process Setup page is displayed.

  4. The Configurable Steps tab is displayed by default as shown below.

  5. The Period Selected field is populated based on the Period selected on the Consolidation Control Panel. Select a TO period if you want to run the process for multiple periods.

  6. For E-mail Notification, click the slider, which will change to blue (on) and select the e-mail addresses of those you want the Consolidation Process Status Log sent to.

  7. In the Configurable Steps pane, select the Interim Currency(ies) you want consolidated during the Consolidation Process. The number of available Interim currencies is based on the number of Interim Currencies specified on the Define Currency Parameters Configuration Task.

  8. Select the Reporting Currency(ies) you want consolidated during the Consolidation Process. Reporting Currency is defined in Hierarchy Management. In the image below, you can see how the leaf level members in the Reporting hierarchy match up with the Reporting Currency on the Consolidation Setup page.

  9. Click the System Defined Steps tab. The Consolidation Process calculates both Common and Local Currency by default. The activities under Local Currency and Common Currency are system populated based on what needs to be performed and how your application is configured.

  10. Click the History tab to view information on the period, date and time the last Consolidation Process was run as well as the user that ran the process and the status of the process.

  11. Click the View Setup tab to view the account setup completed on the Consolidation Control Panel > Consolidation Setup.

  12. Click Save my settings so that every time you access the Process Setup page, your selections for Interim and Reporting currency are retained from the previous run. Settings are user specific.

  13. Click Process. You’re done!

Consolidation: Option to Make Elimination Sets Inactive

Prior to this release, Elimination Sets could not be disabled; instead, they had to be deleted. Now, you have the option to disable an Elimination Set by making it “Inactive” (shown in the screen below) and re-enabling it at any time by making it “Active”. You can also select all Elimination Sets by clicking the checkbox to the left of Code and post them all at once. The system has built-in functionality to post only the active Elimination Sets.

In Practice:

How to Make an Elimination Set Inactive:

  1. Access the Consolidation Control Panel.

  2. Select the Eliminations Process.

  3. In the center pane, all eliminations are displayed. Select the elimination you want to make inactive.

  4. Click the Inactivate Action (as shown in the image below).

    When you make an elimination inactive, the only action available will be “Activate” (shown below). This is because you can’t perform other actions (such as post or clear data) on an inactive elimination.

How to Make an Elimination Set Active:

  1. Access the Consolidation Control Panel.

  2. Select the Eliminations Process.

  3. In the center pane, all eliminations are displayed. Select the inactive elimination set you want to make active.

  4. Click the Activate Action.

How to Post Multiple Elimination Sets At Once:

  1. Access the Consolidation Control Panel.

  2. Select the Eliminations Process.

  3. In the center pane, all eliminations are displayed. Select the checkbox to the left of Code as shown below. Notice that only the ACTIVE eliminations are selected to post.

  4. Click Post.

Consolidation: Ability to Return Rejected Journals to an In Process State

Prior to this release, rejected Standard Journals could not be returned to an In Process state. Now, a rejected Standard Journal can complete the full lifecycle and ensure auditability. You can edit rejected journals, return them to an In Process state, and have them approved.

This image shows that there were no actions available for rejected Standard Journals prior to this release.

This image shows the new option to return a rejected Standard Journal to an In Process state.

In Practice:

Follow the steps below to return a rejected Standard Journal to an In Process state.

  1. Access the Consolidation Control Panel.

  2. Select the Standard Journals Process.

  3. In the center pane, all journals are displayed. Select the rejected Standard Journal you want to return to an In Process state.

  4. Click the In Process action.

Consolidation: Enhanced Retained Earnings Roll Forward (Opt-In)

This is an opt-in feature. Contact Host Support to have this feature enabled.

Retained Earnings Roll Forward is a feature that rolls over Retained Earnings across financial years. This happens when Consolidation is run during the first period of any financial year. Prior to this release, the Retained Earnings Roll Forward would carry the balance forward from the last period of the prior financial year to the first period of the current financial year, resulting in Beginning Retained Earnings. The roll forward was calculated using the formula: Retained Earnings YTD balance of the last period of the previous financial year (+) YTD balance of the Beginning Retained Earnings account of the last period of the previous financial year. No adjustments were allowed to the roll forward balance calculated per that formula. With this enhancement, the Fall18 release allows for adjustments.

Once this opt-in feature is enabled, the amount of the Beginning Retained Earnings will be derived from the account rollup configured as the “Source Account for Beginning Retained Earnings Calculation” instead of the formula above. Users will have to structure the Total Retained Earnings Account Rollup (Parent Account) Node under the Equity Rollup to include the Host Current Year Earnings Account, Beginning Retained Earnings Account, and Adjustment to Retained Earnings Account as leaf members. The Standard Journals or Dynamic Journals can only be used to post adjustments to the Adjustment to Retained Earnings Account.

The Account Balance for the Source account rollup for the last period of a financial year will be rolled forward as the Beginning Retained Earnings of the first period of the successive financial year.

In Practice

To specify a beginning retained earnings account, complete the steps below.

  1. Access the Consolidation Control Panel.

  2. Select Consolidation Setup.

  3. Under Retained Earnings Roll Forward, select the Include in the Consolidation Process checkbox. Selecting this checkbox enables the option to select an account for Beginning Retained Earnings.

  4. Under Source Account for Beginning Retained Earnings Calculation, select the rollup/parent account that represents Total Retained Earnings.

  5. Click Save.

Illustration of Use

The image below represents a portion of the financial statements for Financial Years 2016, 2017, and 2018. FY 2016 is the first year of operations. Data for periods 3 through 11 of FY 2017 is not displayed here, and all figures are YTD.

Current Year Retained Earnings (row 2) is a leaf account set up as the Target Retained Earnings Account in Consolidation Setup. Beginning Retained Earnings (row 3) is the leaf account set up as the Retained Earnings Roll Forward in Consolidation Setup.

Dividends (row 4) and Adjustments (row 5) are leaf account members to be used for making adjustments for Retained Earnings in the Consolidation module.

Current Year Retained Earnings, Beginning Retained Earnings, Dividends and Adjustments are all leaf members of Subtotal Retained Earnings (row 6). All these leaf members have (+) as the Rollup Operator. Subtotal Retained Earnings (row 6) is a Rollup Account under Equity. The account balance of Subtotal Retained Earnings for period 12 of FY 2016 (50) is rolled forward as the Beginning Retained Earnings throughout all periods of FY 2017.

Journal entries have been made to Dividends and Adjustments in period 12 of FY 2017 to affect the adjustments for Retained Earnings which need to be rolled forward as the Beginning Retained Earnings.

The account balance of Subtotal Retained Earnings for period 12 of FY 2017 (88) is rolled forward as the Beginning Retained Earnings throughout all periods of FY 2018.

No currency conversion should be applied to the Beginning Retained Earnings account, as the period 12 account balance of a financial year needs to be rolled forward to the subsequent financial year.
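To make the roll forward arithmetic concrete, here is a minimal sketch using the subtotals from the illustration (50 and 88); the individual leaf-level amounts for FY 2017 shown here are hypothetical:

```python
def roll_forward(source_rollup_period_12):
    """Beginning Retained Earnings for the next financial year equals the period 12
    balance of the source account rollup (no currency conversion applied)."""
    return sum(source_rollup_period_12.values())

# FY 2016 period 12 leaf balances under Subtotal Retained Earnings.
fy2016_p12 = {"Current Year Retained Earnings": 50, "Beginning Retained Earnings": 0,
              "Dividends": 0, "Adjustments": 0}
beginning_re_2017 = roll_forward(fy2016_p12)   # 50, used in every period of FY 2017

# FY 2017 period 12 (hypothetical leaf amounts that sum to the illustrated 88).
fy2017_p12 = {"Current Year Retained Earnings": 40,
              "Beginning Retained Earnings": beginning_re_2017,
              "Dividends": -5, "Adjustments": 3}
beginning_re_2018 = roll_forward(fy2017_p12)   # 88, used in every period of FY 2018

print(beginning_re_2017, beginning_re_2018)    # 50 88
```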

Modeling 2.7: Enhanced External Source Model Functionality

In this release, the External Source Model feature is improved significantly. To access this functionality, you must opt in by calling Planful Support.

  • The External Source Model (ESM) feature set is enhanced to support a wider variety of operational modeling use cases like Sales Operations, Marketing Operations, Customer Support Operations, and Professional Services models.

  • We have built a formula engine for ESM that supports multiple function types: arithmetic, logical, date, and text. These functions allow you to perform data transformations and to define formulas on the ESM model.

  • Ability to load data from a wider variety of data sources by using data transformations at data load time, instead of being required to have a perfect data load file. Designers can transform data with formulas and functions (a conceptual sketch follows this list):

    • Support for text functions like Left, Right, Len, Trim, Concatenate, Find

    • Support for arithmetic functions like Sum, Avg, Round, Floor, Ceiling, + - * /

    • Support for date functions Day, Month, Year, DaysCount, DaysFactor, WeekDay, WeekNum

    • Support for logical operations with IF, AND, OR < > =

    • Lookup functions to refer to assumptions from driver models

  • We have enhanced the Map in ESM to support loading the data from ESM to Master model, to filter the data that is being loaded, and to detect and dynamically modify the dimension hierarchy by appending missing metadata members.

  • We have enhanced the Drill Through functionality to support the use case of loading data into a Master model from multiple different ESM models.

  • Users can now define formulas that are dependent on dates (such as start_date and end_date).

  • Users can now manage External Source Models from Model Manager.

  • Users can now use Views on ESM model to interact with the data that is available in ESM models.

  • ESM models are supported in both SpotlightXL and Model Manager.
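The sketch below is a conceptual illustration (in Python, not ESM formula syntax) of the kinds of load-time transformations these text, date, arithmetic, and logical functions enable; the record fields are hypothetical:

```python
from datetime import date

# Hypothetical source record from an external system.
record = {"first_name": "  Wally ", "last_name": "Smith",
          "start_date": "2018-09-03", "amount": "1234.567"}

# Text functions (Trim, Concatenate, Left) as conceptual Python equivalents.
full_name = record["first_name"].strip() + " " + record["last_name"]
prefix = record["last_name"][:2].upper()

# Date functions (Year, Month, Day).
start = date.fromisoformat(record["start_date"])
start_year, start_month = start.year, start.month

# Arithmetic (Round) and logical (IF) functions.
amount = round(float(record["amount"]), 2)
bucket = "Large" if amount > 1000 else "Small"

print(full_name, prefix, start_year, start_month, amount, bucket)
```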

Note:
ESM 2.0 is an opt-in feature. You must call Host Analytics Support if you want to use enhanced External Source Model functionality. If you had existing External Source Models, Host Analytics will migrate your existing implementation to ESM 2.0 functionality.

Contents

ESM Benefits

List of Key ESM Features

ESM Best Practices

ESM Process

Creating an External Source Model using Model Manager

Creating Fields in the External Source Model using Model Manager

Creating Formulas in the External Source Model using Model Manager

Loading Data into the External Source Model

Step-by-Step Use Case for Forecasting Revenue Spread Over Time

External Source Model Benefits

End users can now interact and collaborate on source models that are connected to overall corporate master models. ESM lets users make changes directly to the data in the source model; designers can then reload the data into the corporate model whenever desired.

List of Key External Source Model Features

  • Formula Engine

  • Map Enhancements

  • Model Manager

  • Drill Through

  • Views on ESM model

External Source Model Best Practices

  • 120 fields in one ESM model

  • 5M records in one ESM model

  • Use a single quote to input the dates in SpotlightXL

  • Define the ESM model and required formulas, load some sample data, and then validate the data and formulas. Once everything is working as expected, you can define the map to move the data from ESM to the Master Model. At any point in time, users can update the Formula Expressions; however, if data is loaded into an ESM model or if a map is defined, making structural changes to the ESM model (such as adding a field, changing the type of a field, or deleting a field) is not allowed.

  • When using copy and paste or when using a file to load the data into an external source model, all the existing data in the model will be cleared and the new data loaded.

  • Designer and Reviewer users do not have the option to create a view on an ESM model. Only admin users have the option to create views.

  • To create an ESM view, it is mandatory to select a map from the associated master model and also to select a POV dimension which will serve as a filter in run mode. For designer and reviewer users, the filter dimension from the associated master model will filter the dimension members based on the dimension security defined for the designer and reviewer users.

  • ESM views where saving data is enabled allows users to load up to 10k records. If there are more records, then the recommendation is to leverage filter dimensions to filter the data input into smaller chunks.

External Source Model Process

Using SpotlightXL or Model Manager, you can set up and work with external data sources. The basic flow consists of five steps, which the following sections walk through: create the External Source Model, define its fields and formulas, load data, map the ESM to a Master model, and run a calculation to load the data into the Master model.

Creating an External Source Model using Model Manager

Define a Source Model with fields and types reflecting the structure of the external data source you are copying data from.

  • Your external data source is assumed to be a two-dimensional list of transaction-style records where each row provides a piece of data along with fields that describe that data.

  • Your external data source may contain fields that you do not need; you can indicate that they should be ignored.

  • Your external data source may contain information that is in a different format than you want, so you can transform that data using formulas and expressions.

Admins and Designer users can create an External Source Model from the Model Manager Grid or List layouts. Reviewer users cannot create an ESM model.

  1. Click the + Circle icon from the Grid or List layouts to create a new model.

  2. Click the tabular icon to indicate that you want to create an External Source Model.

  3. Give the new model a name and an optional description.

  4. Click the Save button.

  5. Proceed with adding fields. 

  6. Click the Save button again when done.

Creating Fields in the External Source Model using Model Manager

Before creating new Source Model Fields:

  • You cannot create new fields if a Source Model Map exists because an existing Source Model Map contains mappings from the existing list of fields.

  • You cannot create new fields if the Source Model contains data. You must clear the model first.

  1. Log in to Spotlight.

  2. Select the Model task.

  3. Scroll to find the External Source Model you want to view.

  4. Click the gray text that says External Source Model.

    The list of existing fields, if any, appears.

  5. Click ADD. The Create Fields box appears.

  6. Click Add Fields. A new field is added with a default type Text. Under Label, enter a name for the field. Continue by adding additional fields.

    The table below provides information on the options available in the Create Field box.

    | Option | Description |
    | Type | Select Text, Numeric, Date, Formula, Constant Text, Constant Numeric, or Constant Date, based on the content and purpose of the column in your external data source. |
    | Label | Enter a name for the field. Fields are used in the Source Map to identify data from the external data source (CSV, TXT, or other file) that will become dimensions and dimension members, or data. Best Practice: List the fields in the order that they appear in your external data source to make it easier to load the data. |
    | Format | For Date and ConstantDate types, select the layout that the date or expression is formatted in. |
    | Expression | For ConstantText types, enter the text to place into this field. For ConstantDate types, enter the date to place into this field. For ConstantNumeric types, enter the value to place into this field. |
    | Data Load | Yes indicates that this field comes directly from the data source; No indicates that this field is derived from a formula or constant. |

    If you make a mistake, you can click the trashcan icon to delete a field.

    Note:
    When you add a Formula field, you are adding a placeholder. You will fill in the formula separately.
  7. When you are finished, click Done.

    Now you see the fields you added in the list of fields. You can click each item to make changes to the fields.

    If you have added any fields of type Formula, you must fill in the formula before you can save all the fields. See Creating Formulas in the External Source Model using Model Manager, below.

  8. When done making all changes to fields, click the Save button.
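
To make these options concrete, here is a small illustrative set of field definitions; the field names and values are hypothetical and are not part of the revenue use case later in these notes:

| Field | Type | Format | Expression | Data Load |
| Invoice Number | Text |  |  | Yes |
| Invoice Date | Date | MM/DD/YYYY |  | Yes |
| Amount | Numeric |  |  | Yes |
| Region | ConstantText |  | North America | No |
| Monthly Amount | Formula |  | ROUND([Amount]/12,0) | No |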

Creating Formulas in the External Source Model using Model Manager

If your Source Model contains fields of type Formula, you can create the formulas using Model Manager's interactive formula builder.

  1. Log in to Spotlight.

  2. Select the Model task.

  3. Scroll to find the External Source Model you want to view.

  4. Click the gray text that says External Source Model.

    The list of existing fields appears.

The right pane displays the formula builder options and editor. You can type in a formula where it says "Enter formula here..." or you can select from the drop-down list of functions and operators.

In this example, we will create a unique Customer # by combining other fields together:

Customer # Preface - State - Phone Number

  1. From the Functions list, select Concatenate. The basic syntax for this function is inserted into the editor.

  2. To ensure that the formula will be assigned to the correct field, enter the field name before "Enter Formula."

  3. Select the text Field 1 and press Del. Type @ to see a list of field names to insert. Select Customer # Preface.

  4. Select the text Field 2 and replace it with the second field name to use: State.

  5. Select the ... after Field 2 and replace it with the third field name to use: Phone Number. Clean up excess brackets as needed.

    The formula now contains the Concatenate function with all three field references in place.

  6. To add the dashes between the fields in the formula, insert a comma followed by " - " between each pair of field references.

  7. Click Set Formula.

  8. Click Saved Formulas to see the list of formulas created and saved so far.

  9. When done making all changes to fields and formulas, click the Save button.
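
For reference, here is a sketch of what the finished formula might look like, assuming field references appear in square brackets as they do in the formulas later in these notes (the formula builder may render field references differently):

CONCATENATE([Customer # Preface]," - ",[State]," - ",[Phone Number])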

Loading Data into the External Source Model

You can load data from an external data source into an External Source Model. Data can be imported from an Excel worksheet, a CSV (comma-separated values) file, or a TXT file using copy and paste (Ctrl+C and Ctrl+V). Data can also be loaded using integration tools such as Boomi.

In this release, we now support the ability to create views on External Source Models. You must use SpotlightXL to create the view, and set the Enable Save property to Yes. After it is saved, you can use it on the web in Spotlight. Using a basic Copy and Paste, you can then copy data from your data source into the view and save it.

To open the view in Spotlight:

  1. Log in to Spotlight.

  2. Select the Analyze task.

  3. Scroll to find the External Source Model you want to view and select the Default or saved view.

  4. For saved views, if the Enable Save property was set to Yes, you can then copy and paste data into the grid and click the Save button to load data into the External Source Model.

Step-by-Step Use Case for Forecasting Revenue Spread Over Time

The following use case demonstrates the flexibility and complex capabilities in the External Source Model functionality.

This example shows how ESM can spread a sales opportunity for Annual Recurring Revenue (ARR) over a period of time defined by a Service Start Date and a Term. The resulting summary data is then copied by Sales Segment into a Master model.

Inputs:

  • From the source data: Opportunity Name, Sales Segment, Opportunity Owner, Lead Source, Term, ARR, Service Start

Calculations:

  • Calculate the Service End based on the Term and Service Start

  • Calculate the monthly revenue based on ARR, Service Start, Service End, and Term

Contents

Viewing the Source Data

Defining the External Source Model

Explanation of the Lookup Function

Explanation of the DaysFactor Function

Loading Data into the Opportunity Data External Source Model

Creating a Master Model

Defining a Map from Source Model to Master Model

Using a Calculation to Load Data into the Master Model

Viewing the Data in the Master Model

Using Views to Input Future Opportunities into the Source Model

Viewing the Source Data

Here is the source data, which is an Excel spreadsheet of sales opportunities. Opportunity Name would represent the prospective customers. Opportunity Owner would represent the Sales Rep's name. ARR is the revenue potential over a Term of months beginning on Service Start.

We will calculate the Service End, and spread the revenue for all opportunities across months beginning from a Planning Start Month of January 2018 to a Planning End Month of December 2019.

Defining the External Source Model

  1. To define the source model, go to Model > External Source Model > Source Model.

  2. Give the Source Model a name, such as Opportunity Data.

  3. Fill in the fields and formulas as listed below.

  4. Click Save.

Here are all the fields to create in the External Source Model. The first 7 fields match the 7 columns in the source data.

| Property | Value |
| Source Model | Opportunity Data |

| Field | Type | Format | Expression | Include in Data Load |
| Opportunity Name | Text |  |  | Yes |
| Sales Segment | Text |  |  | Yes |
| Opportunity Owner | Text |  |  | Yes |
| Lead Source | Text |  |  | Yes |
| Term | Numeric |  |  | Yes |
| ARR | Numeric |  |  | Yes |
| Service Start | Date | MM/DD/YYYY |  | Yes |

All remaining fields are calculated or preset with constants. In all of these fields, Include in Data Load is No. Explanations of the Lookup, Round, If, and DaysFactor functions are provided below.

| Field | Type | Format | Expression | Include in Data Load |
| Service End | Formula | MM/DD/YYYY | LOOKUP("Opportunity - Daily Calendar" , "Service End" , {Service Start},{Term}) | No |
| PlanningStart | ConstantDate | MM/DD/YYYY | 01/01/2018 | No |
| PlanningEnd | ConstantDate | MM/DD/YYYY | 12/31/2019 | No |
| Jan-18 through Dec-19 (24 fields, one per month) | Formula |  | ROUND(IF(DAYSFACTOR([PlanningStart],[PlanningEnd],[Service Start],[Service End],"MMM-YYYY")>0.5,[ARR]/12,0),0), where "MMM-YYYY" is that field's month (for example, "Jan-2018" for Jan-18 and "Dec-2019" for Dec-19) | No |
| Scenario | ConstantText |  | Forecast | No |
| Account - Sub Revenue | ConstantText |  | Subscription Revenue | No |

The resulting Source Model appears as follows:

Note:
Here is a look at the same Source Model from the Web interface, Model Manager.

Explanation of the Lookup Function

| Field | Type | Format | Expression | Include in Data Load |
| Service End | Formula | MM/DD/YYYY | LOOKUP("Opportunity - Daily Calendar" , "Service End" , {Service Start},{Term}) | No |

To derive the value for Service End, a Lookup function is used. Although the calculation of the Service Start Date plus a Term of n months is a simple formula in Excel, we are demonstrating how to use the Lookup function to accomplish the same goal.

Lookup uses Key fields, Service Start and Term in this case. It looks for the values of the Key fields in another ESM model, which is Opportunity - Daily Calendar in this case. When Service Start and Term are found in the same transaction in Opportunity - Daily Calendar, the value of Service End is returned to the Opportunity Data ESM model and fills the Service End field (the field with the Lookup function).

Here is the definition of Opportunity - Daily Calendar and its data.

The rows of data continue all the way down to 12/31/2021 with a term of 36 months ending 12/31/2024.

For example, the first row in our opportunity data has a Service Start of 02/14/2019 and a Term of 24. Looking up these Key fields in the Opportunity - Daily Calendar returns a Service End date of 02/14/2021.
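
Written out with the key values of that row substituted in place of the field references (illustrative only; the actual formula always uses the field references shown above), the lookup behaves like this:

LOOKUP("Opportunity - Daily Calendar" , "Service End" , {02/14/2019},{24}) returns 02/14/2021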

Explanation of the DaysFactor Function

| Field | Type | Format | Expression | Include in Data Load |
| Jan-18 |  |  | ROUND(IF(DAYSFACTOR([PlanningStart],[PlanningEnd],[Service Start],[Service End],"Jan-2018")>0.5,[ARR]/12,0),0) | No |
| ... |  |  |  |  |
| Feb-19 |  |  | ROUND(IF(DAYSFACTOR([PlanningStart],[PlanningEnd],[Service Start],[Service End],"Feb-2019")>0.5,[ARR]/12,0),0) | No |

To derive the value for each month's revenue forecast, a DaysFactor function is used. DaysFactor is a function with 5 arguments:

  • [Start Date Field]

  • [End Date Field]

  • [Sub-Start Date Field]

  • [Sub-End Date Field]

  • "MMM-YYYY"

DaysFactor counts the days from the Sub-Start Date to the end of the specified "MMM-YYYY" month, and divides that count by the number of days in that month.

For example, the first row in our opportunity data has a Service Start of 02/14/2019.

The first formula using DaysFactor in the Source Model, Jan-18, is this:

  • DAYSFACTOR([PlanningStart],[PlanningEnd],[Service Start],[Service End],"Jan-2018")

  • With the values filled in, the formula is DAYSFACTOR(01/01/2018, 12/31/2019, 02/14/2019, 02/14/2021, "Jan-2018").

  • DaysFactor counts the days from 02/14/2019 to the month end of Jan-2018, inclusive of the first day, and divides that count by 31 (the number of days in Jan 2018). In this case, Jan-2018 is before 02/14/2019, so the value is 0/31 = 0. The IF function determines what to do with the resulting value of DaysFactor.

  • If DAYSFACTOR() > 0.5, then return a value of [ARR]/12; otherwise return 0. Since DaysFactor for Jan-2018 is 0, IF returns 0.

Skipping ahead in the Source Model to the field for Feb-19, the formula is:

ROUND(IF(DAYSFACTOR([PlanningStart],[PlanningEnd],[Service Start],[Service End],"Feb-2019")>0.5,[ARR]/12,0),0)

To evaluate this function:

  • DAYSFACTOR(01/01/2018, 12/31/2019, 02/14/2019, 02/14/2021, "Feb-2019") returns 15 days divided by 28 = .5357.

  • If DAYSFACTOR() > 0.5, then return a value of [ARR]/12. Since DaysFactor for Feb-2019 is .5357, which is greater than 0.5, IF returns [ARR]/12, or 37,705/12 = 3,142.083.

  • ROUND makes the result of the IF function an integer by rounding off the decimal places. This results in a final number of 3142.
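
Putting these steps together for the first opportunity row (ARR of 37,705, Service Start of 02/14/2019, Service End of 02/14/2021), the Feb-19 formula evaluates roughly as follows, with values substituted in for the field references:

ROUND(IF(DAYSFACTOR(01/01/2018, 12/31/2019, 02/14/2019, 02/14/2021, "Feb-2019")>0.5, 37705/12, 0), 0)

= ROUND(IF(0.5357>0.5, 3142.083, 0), 0)

= ROUND(3142.083, 0)

= 3142

By the same logic, the Jan-18 formula evaluates to ROUND(IF(0>0.5, 3142.083, 0), 0) = 0.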

When the Source Model is loaded with data and the formulas are calculated, you can see that the ARR is spread across the different months, beginning in the month of the Service Start (so long as the Service Start was in the first half of the month).

Loading Data into the Opportunity Data External Source Model

After creating the Source Model and defining all the fields that are derived from formulas, it is time to load data.

  1. Select Model > External Source Model > Source Data.

  2. Select the name of the Source Model from the drop-down.

  3. Open the source data workbook.

  4. Copy and paste the source data into the first 7 columns.

  5. Click Load Data.

  6. Click Refresh. You will then see the derived fields filled in.

Note:
The lookup ESM Model, Opportunity - Daily Calendar, was loaded prior to Opportunity Data being loaded.

Creating a Master Model

You can create a Master model and then copy the data from the ESM Source Model to the Master model. The Source Model is a 2-dimensional table of data whereas the Master model is a multidimensional model, so you will need to map the fields from the Source Model to the dimensions and members in the Master model.

  1. Go to Model > Setup.

  2. Create a new model, such as Forecast Model, of type Master.

  3. Define dimensions, such as Sales Segment, Time, Scenario, and Account.

  4. Save the model.

  5. Add members to the dimensions to match up to the Source Model you will be loading.

  6. Return to Model Setup and click Generate Model.

    Note:
    Here is a look at the same Master model from the Web interface, Model Manager.

Defining a Map from Source Model to Master Model

The Source Model is a 2-dimensional table of data whereas the Master model is a multidimensional model, so you will need to map the fields from the Source Model to the dimensions and members in the Master model.

  1. Go to Model > External Source Model > Source Map.

  2. Enter a name for the map.

  3. Select the Master model from the Model drop-down.

  4. Select the Source Model from the Source Model drop-down.

  5. Select Data for Type.

  6. Optionally set Append Missing Dimension Members to Yes. This option detects leaf members in the Source Model that are not in the Master model and adds them dynamically at runtime.

  7. Now specify the fields from the Source Model that map to a member in the dimension hierarchy of the Master model. Generally, fields that are text are mapped to dimension members, and fields that are numeric are mapped to values. Not all fields must be mapped (see the example mapping after these steps).

  8. Save the map.
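
As an illustrative sketch of such a map for this use case (not a screenshot of the actual Source Map page), the field-to-dimension pairings might look like this:

| Source Model field | Maps to |
| Sales Segment | Sales Segment dimension |
| Scenario | Scenario dimension (member Forecast) |
| Account - Sub Revenue | Account dimension (member Subscription Revenue) |
| Jan-18 through Dec-19 | Time dimension members; the numeric values in these fields are loaded as data |

Fields such as Opportunity Name, Opportunity Owner, and Lead Source can be left unmapped if they are not dimensions in the Master model.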

Using a Calculation to Load Data into the Master Model

To copy data from the Source Model to the Master model, you must use a calculation to run the External Source Map.

  1. Go to Model > Calculation.

  2. Select the Master model name from the Model drop-down.

  3. Give the calculation a name.

  4. Create the first step of the calculation by selecting ExternalSourceMap from the Type drop-down.

  5. Select the name of the map you created in the previous task from the name drop-down.

  6. Optionally add an Aggregation step to the calculation.

  7. Save the calculation.

  8. Run the calculation.
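
A minimal sketch of the resulting calculation, assuming the map defined in the previous task was named Opportunity Map (the name is hypothetical and the exact column layout of the Calculation page may differ):

| Step | Type | Name |
| 1 | ExternalSourceMap | Opportunity Map |
| 2 | Aggregation | (optional rollup of the Master model after the data loads) |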

Viewing the Data in the Master Model

To view the data in the Master model, create a view.

Note:
Here is a look at the same view from the Web interface, Spotlight.

Using Views to Input Future Opportunities into the Source Model

In this release, you have the ability to directly edit data in the Source Model. We have a series of sales opportunities already loaded into the Source Model from an external data source, and now you can create a view to allow the user to add more opportunities directly.

  1. Start Spotlight (Web).

  2. Under Analyze, Data, select the name of the Source Model, Opportunity Data.

  3. Select the Default view. Here is a snapshot of the existing data.

  4. Decide on a field that will be listed as a Point of View lookup on the page axis, such as Sales Segment.

  5. Select Design View.

  6. Optionally, remove excess columns by right-clicking the column and selecting Delete Column. Remember, to calculate all the fields with formulas in this model, the user need only input 7 fields. Leave the fields in row 1.

  7. Save the view.

  8. Select the Properties icon.

  9. To be able to save the view, you must set the Master Model Map property and a POV Dimension.

    Select the Master Model Map from the drop-down.

    Below the POV Dimensions property, add a blank row by right-clicking and selecting Insert Row. Under POV Dimensions, type the name of the field you decided on in step 4.

    Set Enable Save to Yes.

  10. Click Save.

  11. Select Analyze > Data to run the view. Now you have a data entry form you can fill in. Or you can select a Sales Segment from the POV drop-down and view existing data.

  12. To enter a new opportunity, go to the first blank row below the list of opportunities and add another. Fill in the first 7 columns.

  13. Save the data. The screen refreshes to show the values and results of the formulas.

Modeling 2.7: Data Locking to Replace Dimension-Based Locking

In this release, we are providing an enhanced method to manage and enforce data locks. Data locks protect data from being overwritten by data entry, Breakback, formulas, and calculations. Data locking differs from dimension security in the following ways:

  • Data that is locked is visible to the user but cannot be modified by anyone. Data locking is global.

  • Data that is protected with dimension security is not visible to the user and cannot be modified by the user. Dimension security is specific to the users for which it is defined.

In previous releases, Modeling provided Dimension-Based Locking that applied only to Views and Reports. Dimension-Based Locking is renamed to Data Locking. In this implementation, Data Locking can be applied to Views, Reports, and Models.

Note:
Data Locking in Models is an opt-in feature. You must call Host Analytics Support if you want to use Model Data Locking. Model Data Locking prevents any map, formula, calculation, data load, or API call from modifying the data that is locked.

In previous releases, you could lock only one dimension at a time. Now you can lock one or any combination of dimensions and dimension members.

Contents

Dimension-Based Locking User Interface Options Being Replaced

Data Locking User Interface Options

Migrating to Data Locking from Dimension Based 'Locked'

How Data Locking Works

How to Lock Data and Turn Data Locks On or Off

Breakback and Data Locking

Examples of Data Locking

Best Practices with Data Locking

Dimension-Based Locking User Interface Options Being Replaced

| Previous Option | Main Menu Access Point | Behavior Change | Replaced With |
| Dimension Based 'Locked', View Level | Manage > Application Administration > Application Settings | View Level option is removed. | View level locking is no longer supported. In Application Settings, choose from Application Level or None. |
| Dimension Based 'Locked' property | Analyze > Design View, Properties | Option is removed. | View level locking is no longer supported. Previously you could pick only one dimension at a time to enforce locking in a View. Now, you can lock one or any combination of dimensions. |
| Set Dimension Based Properties, Locked column | Model > Dimension, Action menu | Option is removed. | Model > Model Administration > Data Locking |

Data Locking User Interface Options

| Changed Option | Main Menu Access Point | Behavior Change | Description |
| Data Locking setup page | Model > Model Administration > Data Locking | New | Specifies which dimension members and dimension member intersections to lock in a model. Previously you could lock only one dimension at a time. Now you can lock one or any combination of dimensions. |
| Dimension Based 'Locked' | Manage > Application Administration > Application Settings, View Property | Changed | Choose from Application Level or None. This option turns data locking on or off in Views. The default is Application Level. |
| Dimension Based 'Locked' | Manage > Application Administration > Application Settings, Report Property | No change | Choose from Application Level or None. This option turns data locking on or off in Reports. The default is Application Level. |
| Dimension Based 'Locked' | Manage > Application Administration > Application Settings, Model Property | New | Choose from Application Level or None. This option turns data locking on or off in models. When turned on, any map, formula, calculation, data load, or API call that modifies data is affected. The default is None. |
| Data Locking | Manage > Navigation Access > Model Administration property | New | Specifies whether Designer and Reviewer users, or individual users, can modify the Data Locking setup. |

Note:
Data Locking in Models is an opt-in feature. You must call Host Analytics Support if you want to use Model Data Locking.

Migrating to Data Locking from Dimension Based 'Locked'

If you previously used no Dimension Based 'Locked' settings, there is no change for you.

If you previously used Dimension Based 'Locked' in views and reports at the Application Level, there is no change for you.

If you previously used Dimension Based 'Locked' in views at the View Level, your Application Settings will automatically be changed to reflect Data Locking at the Application Level.

How to Lock Data and Turn Data Locks On or Off

Data Locking is a two-step process:

  • Set up the data locking definitions on the Model, Model Administration, Data Locking page. Designers or admins can do this step.

  • Turn on or off all data locking in the application on the Manage, Application Administration, Application Settings page. Only admins can do this step. 

Note:
Data Locking in Models is an opt-in feature. You must call Host Analytics Support if you want to use Model Data Locking. Model Data Locking prevents any map, formula, calculation, data load, or API call from modifying the data that is locked.

To set up Data Locking:

  1. Go to Model, Model Administration, Data Locking.

  2. Select the model from the Model drop-down.

  3. Fill in the table.

    The Data Locking table lists models with the names of their dimensions as column headings.

    When viewing data locks for All Models, you cannot make changes. You must select a particular model to make changes to its data locks.

    Populate the Data Locking table by selecting the member(s) in each dimension that you want to lock. If you select a parent-level member, all leaf level members under that parent will be locked, but the parent-level members will not be locked.

    For example, you can lock all data in member combinations that include any leaf member in the 2014 hierarchy of the Time dimension.

    A second example locks all data in member combinations that include any leaf member in the 2014 hierarchy of the Time dimension and, separately, all data in member combinations that include Actual in the Scenario dimension.

    A third example locks only the data in member combinations that include both a leaf member of the 2014 hierarchy in the Time dimension and Actual in the Scenario dimension. (These examples are sketched after these steps.)

  4. Click Save.

    Data that is locked appears with gray background in views.
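
As a rough sketch of how the three examples above might be entered in the Data Locking table (the columns follow the model's own dimensions, and a blank cell is assumed to mean the dimension is unrestricted):

| Scenario | Time |
|  | 2014 |
| Actual |  |
| Actual | 2014 |

The first example corresponds to the first row alone, the second example uses the first and second rows together, and the third example corresponds to the last row alone.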

How Data Locking Works

Many different operations in Modeling can make changes to data. Data Locking lets you specify which data cannot be modified when data locking is enabled. Application Settings let you enable data locking.

Data Locking can be independently applied to:

  • Views

  • Reports

  • Models. This is an opt-in feature. You must call Host Analytics Support to request this option.

Only leaf dimension members can be locked. However, you can specify to lock a parent-level dimension member as an easy way to automatically lock all of the leaf members under that parent.

| Operation | Which Application Setting Controls This Operation | Description |
| View, the user saves the data they entered | Application Settings > View Properties > Data Locking | Leaf data intersections that are locked are not saved. Leaf data intersections that are unlocked are saved. |
| View, calculation runs after the user clicks Save | Application Settings > Model Properties > Data Locking | See Model Calculation below. |
| View, the user runs Breakback | Application Settings > View Properties > Data Locking | Breakback applies a Hold for locked members and member combinations. |
| Report, the user saves the data they entered | Application Settings > Report Properties > Data Locking | Leaf data intersections that are locked are not saved. Leaf data intersections that are unlocked are saved. |
| Report, calculation runs after the user clicks Save | Application Settings > Model Properties > Data Locking | See Model Calculation below. |
| Report, the user runs Breakback | Application Settings > Report Properties > Data Locking | Breakback applies a Hold for locked members and member combinations. |
| Model, data is loaded using the Data menu item or an API | Application Settings > Model Properties > Data Locking | Leaf data intersections that are locked are not loaded. Leaf data intersections that are unlocked are loaded. |
| Model, data is loaded using the Import menu item or an API | Application Settings > Model Properties > Data Locking | Data intersections that are locked are not loaded. Data intersections that are unlocked are loaded. |
| Model, data is loaded from an External Source Model or an API | Application Settings > Model Properties > Data Locking | Leaf data intersections that are locked are not loaded. Leaf data intersections that are unlocked are loaded. |
| Model Calculation (general) | Application Settings > Model Properties > Data Locking | A calculation with multiple steps will run all the steps that it can run successfully. |
| Model Calculation, ClearLeafData step | Application Settings > Model Properties > Data Locking | Leaf data intersections that are locked are not cleared. Leaf data intersections that are unlocked are cleared. |
| Model Calculation, ClearRollupData step | None | Since only leaf-level member intersections are locked, rollup data will be cleared. |
| Model Calculation, ClearAllData step | Application Settings > Model Properties > Data Locking | Leaf data intersections that are locked are not cleared. Leaf data intersections that are unlocked are cleared. All rollup data is cleared. |
| Model Calculation, Aggregation step | None | Since Aggregation does not modify leaf-level data, and only leaf-level member intersections are locked, Aggregation proceeds normally. |
| Model Calculation, Breakback step | Application Settings > Model Properties > Data Locking | Breakback applies a Hold for locked members and member combinations. |
| Model Calculation, Formula step | Application Settings > Model Properties > Data Locking | Leaf data intersections that are locked are not modified by the formula. Leaf data intersections that are unlocked are modified by the formula. |
| Model Calculation, Map step | Application Settings > Model Properties > Data Locking | Leaf data intersections that are locked are not modified by the map. Leaf data intersections that are unlocked are modified by the map. |
| Model > Clear Model | None | Data Locking does not prevent data from being removed with Clear Model. |
| Model > Generate Model | None | Data Locking does not prevent data from being loaded with Generate Model. |

Breakback and Data Locking

If your application is enabled with Data Locking, Breakback will treat Data Locks as Holds in the following circumstances:

  • Breakback run manually from a View honors the Application Setting, View Property, Data Locking.

  • Breakback run manually from a Report honors the Application Setting, Report Property, Data Locking.

  • Breakback run from a calculation honors the Application Setting, Model Property, Data Locking.

In these cases, if Data Locking is set to Application Level, then Breakback will not modify data in member intersections that are designated as locked. If Data Locking is set to None, then Breakback will proceed without regard for data locking.

Breakback applies a Hold for locked members and member combinations.

Examples of Data Locking

Lock the Actual Scenario

Lock Closed Periods in a Forecast Scenario

Lock Prior Year Historical Values

Lock the Budget Scenario After Data Input is Complete

Lock Specific Accounts from Data Input

Best Practices with Data Locking

  • We recommend setting the data locking property for Views and Reports to Application Level.

  • For Model tasks, such as running a calculation or loading data from the interface or an API, we recommend setting the property to Application Level based on your needs.

  • Once data locking is enabled, the leaf combinations defined in the data locking user interface will not be modified.

  • Once data locking for Model tasks is enabled, you can run an aggregation for a given year even though it may contain some leaf members that are locked. When aggregation runs for that year, the locked leaf members are not modified, but the rollup members are aggregated in the model. This behavior is the same for other steps in the calculation.

  • Users can select the root member of any dimension in the data locking user interface. When the setup is saved, the root member selection is ignored because it is implicit: selecting a root member (or not selecting any member for that dimension) is the same as selecting all of that dimension's leaf members for locking. For example, if Scenario: Actual, Time: All Time is defined in the data locking setup, there is no need to specify the root member 'All Time' on the Time dimension, because data is already locked for all leaf member combinations of the Actual Scenario.

Modeling 2.7: Enhancements to Reports in Spotlight (Web)

In this release, we have enhanced the reporting capabilities to allow users to design reports in Excel and access them from both the Web (Spotlight) and Excel (SpotlightXL). Key additions in this release include:

  • Reports that include Excel formulas are now supported on the web. Previously, these reports were only accessible from Excel.
  • Reports that include Suppress Empty Rows or Columns are now supported on the web.
Note:
This edition of Web Reporting is an opt-in feature. You must call Host Analytics Support if you want to use this feature. No additional configuration is needed once the feature is enabled. Once this feature is enabled, you will see a formula bar above each report.

To access the reports, navigate to the Report area, then select a report from the folder structure. From Model Manager on the web, you can also navigate to Model > Model Manager and click on Reports associated with a particular model.

All the functionality that is currently supported in reports is also supported when reports are accessed from the web.

With the Web Reporting functionality, the limit on cells in a report is 100k. If a report has more than 100k cells, you will see an alert message. We recommend designing reports with fewer than 100k cells. Note that Application Settings has a setting for Web Analysis Cells, but that property applies only to the traditional report functionality, not to the Web Reporting functionality.

The following example shows a report design in Excel with formulas in column F.
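
As a hypothetical illustration of the kind of Excel formulas that now render on the web (the cell references below are not taken from the actual example report), column F might hold a variance calculation such as:

=D5-C5

=IF(C5=0,0,(D5-C5)/C5)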

Here is the same report when rendered in Spotlight with the Web Reporting functionality:

Modeling 2.7: Hide Empty Folders for Views and Reports

In this release, an Application Setting is available that lets you choose whether or not you will see empty folders in the list of views and reports. Only administrators can change this setting, and it takes effect for all models in the application and all users.

For example, if you have a long list of reports and some are organized into folders, you can hide excess empty folders.

To change the setting, admins must do the following steps:

  • Select Manage, Application Administration, Application Settings.

  • Next to Hide Empty Folders for Views and Reports in Run Mode, select Yes or No.

    • When set to Yes, folders with no views or reports in them will be hidden from view in the folder structure.

    • When set to No, all folders will be visible at all times. This is the default.

  • Save your changes.

  • You must log off and log in again to see the effects of the changed setting.

Modeling 2.7: Client Upgrade to TLS 1.2 Required Before December 2018

We are providing an early notification of the need to upgrade to .NET Framework 4.7.2. Transport Layer Security (TLS) protocol is an industry standard designed to protect the privacy of information communicated over the Internet. With the December 2018 release (2.7.1), Host Analytics will support only TLS 1.2. We ask all customers who are using either Spotlight or Modeling to upgrade to the .NET Framework 4.7.2 as early as possible in order to get TLS 1.2.

MS Office Users

If you are using Spotlight for Office (SpotlightXL, PPT, Word) or just SpotlightXL and you do not have .NET Framework 4.7.2 installed, you will be affected by this change. We recommend updating the .NET Framework to 4.7.2 as early as possible. With the December 2018 release, users who do not have .NET Framework 4.7.2 installed will not be able to access the Spotlight Add-Ins.

Browser Users

If you use any of the modern browsers to log in to Host Analytics (Firefox, Chrome, Safari, IE 10 and above), you will NOT be affected by this change. All modern browsers already support TLS 1.2. If you are using an older version of IE that uses TLS 1.0 or TLS 1.1, please upgrade to TLS 1.2.

Upgrading to .NET Framework 4.7.2

  1. Once the November 2018 release (2.7) is available, please download the latest Add-In and navigate to the SpotlightXL tab. Or if you used ClickOnce for automatic upgrades, you can just navigate to the SpotlightXL tab.

  2. If you do not have .NET Framework 4.7.2, then you will see a message at the top ribbon.

  3. If you see the message that the .NET Framework needs to be upgraded, upgrade it to 4.7.2. You need Admin rights on your PC to upgrade the .NET Framework; otherwise, reach out to your IT team to upgrade it.

  4. If .NET Framework 4.7.2 is already installed, then you do not need to do anything and there will not be any impact for you to access the application.

Where to Download .NET Framework 4.7.2

https://www.microsoft.com/net/download/dotnet-framework-runtime

How to Determine Which .NET Framework is Installed

https://docs.microsoft.com/en-us/dotnet/framework/migration-guide/how-to-determine-which-versions-are-installed

For Additional Information, Refer to the Links Below

https://en.wikipedia.org/wiki/Transport_Layer_Security

https://docs.microsoft.com/en-us/windows/desktop/secauthn/transport-layer-security-protocol

https://support.microsoft.com/en-us/help/3140245/update-to-enable-tls-1.1-and-tls-1.2-as-a-default-secure-protocols-in-winhttp-in-windows

