
New features in Reality V15.2

Reality V15.2 contains a number of new features since the release of V15.1, including the introduction of hyper files and significant DataBasic enhancements.

Some of the new features have come from the user feedback that we receive during the life of a release, so please continue to use the 'Comment on this topic' links at the top and bottom of each topic in the Online Documentation, or visit the NPS Reality website, in order to help us to improve your Reality.

Hyper files

A hyper file is a logical view onto a number of discrete data sections held in multiple physical files, including remote files. Accessing a hyper file accesses these data sections (referred to as hyper sections) as if they constituted one file.

Hyper files are primarily intended as a way to split a very large file into more manageable units. Significantly, item IDs must be unique across all hyper sections and must include information (known as the hyper key) that is used to determine the underlying physical data file in which each item belongs.

Applications can choose to access and update the hyper sections that comprise the hyper file either as one logical view or as separate underlying physical data section files. Each data section can use dictionary definitions and indexes as normal, and the hyper file can use all of these for its view across all data sections, provided that they are logically consistent.

For example, sales figures could be updated, processed and analysed as one hyper file opened as SALES, which is actually a view of the separate data section files called SALES_CURRENT, SALES_2014, SALES_2013, and so on. Existing applications could see all these separate data sections as one logical file called SALES.

The SALES hyper file can be used with English, DataBasic and all other Reality features just like any standard file, including updates that use dictionaries and indexes. The same file access is also available to each constituent hyper section.

The historical-year hyper sections could remain fully open to updates, or be restricted by the application to controlled maintenance so that their sizing stays stable. Hyper sections can be stored in the same database or in different databases, including remote ones. They can be updated under application control or reside in non-updating databases. In our example, the SALES_CURRENT hyper section could be the only part of the hyper file in the main live database, growing throughout the year; performance is optimised because all previous years' hyper sections are in another database. When needed, however, all sales figures would still be accessible through the SALES hyper file.
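As an illustration of that access, the following DataBasic sketch treats the SALES hyper file exactly like an ordinary file. The item ID layout (a year prefix acting as the hyper key, followed by an order number) and the attribute usage are assumptions made for this example only; a real hyper file defines its own hyper key rules.

       * Minimal sketch: a hyper file is opened and read like any other file.
       * The ID layout (year*order number) and attribute usage are assumed
       * purely for illustration; the real hyper key rules may differ.
       OPEN 'SALES' TO SALES.FILE ELSE STOP 201, 'SALES'
       ID = '2014*10042'                 ;* assumed hyper key portion = 2014
       READ SALE FROM SALES.FILE, ID THEN
          CRT 'Sale value: ' : SALE<3>   ;* attribute 3 assumed to hold the value
       END ELSE
          CRT ID : ' not on file'
       END

Under that assumed convention, an item written through SALES with this ID would land in the SALES_2014 hyper section, which could equally well be opened and read directly.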

This release provides new commands that operate exclusively on hyper files, and updates some existing commands so that they can work with hyper files.

The following commands apply exclusively to hyper files:

  • HYPER-CREATE

    Creates a hyper file by creating a hyper file definition item (H-pointer) and optionally the hyper sections.

  • HYPER-EDIT

    Edits a hyper file by updating the hyper file definition item (H-pointer).

  • HYPER-ADD

    Adds a new section to an existing hyper file by updating the lookup table in the hyper file definition item (H-pointer).

  • HYPER-CREATE-INDEX

    Creates an index on all hyper sections and then adds it to an existing hyper file.

  • HYPER-ADD-INDEX

    Adds a specified index to an existing hyper file.

  • HYPER-LIST

    Displays the contents of a hyper file definition item (H-pointer) in a formatted manner.

  • HYPER-RECONCILE

    Reconciles a specific section, or all sections, of a hyper file by moving items to their correct sections.

  • HYPER-REMOVE

    Removes a specific section, or all sections, from a hyper file by updating the lookup table in the hyper file definition item (H-pointer) and removing any applicable update locks from the sections; optionally, it deletes the data sections themselves.

  • HYPER-REMOVE-INDEX

    Removes a specific index from a hyper file definition item (H-pointer).

  • HYPER-VALIDATE

    Validates a hyper file by ensuring that all its constituent hyper sections:

    1. Have the same transaction logging status.

    2. Have the same save/restore status.

    3. Have index definitions that match those in the hyper file definition item (H-pointer).

    4. Can be opened.

  • HYPER-VERIFY-INDEX

    Verifies a specified index on all hyper sections in a hyper file.

DataBasic debugger enhancements

This release includes enhancements to the DataBasic debugger, including the following new and modified debugger commands:

  • ?

    Displays help pages.

  • ?M

    Displays the name of the program module currently running.

  • AT

    Enables or disables Tandem operation.

  • B

    Adds an entry to the break point table.

  • CP

    Toggles the switch that enables or disables cursor positioning.

  • CS

    Displays the return stack and its depth at each internal and external subroutine, and external function.

  • D

    Displays the current state of DataBasic debugger controls and options, as set by various debugger commands.

  • DCC

    Toggles the option that causes a forced break whenever descending to a child context.

  • DPC

    Toggles the option that causes a forced break whenever ascending to a parent context.

  • MA

    Causes the debugger to be entered on all active statement lines in a named code module, by adding the module to the list of monitored modules.

  • MD

    Removes monitoring for a named code module, or all code modules, by deleting them from the list of monitored modules.

  • ME

    Causes the debugger to be entered on entry to the start of a named code module, by adding the module to the list of monitored modules.

  • MO

    Causes the debugger to be entered on reaching the line that immediately follows the current call to another module; in other words, it steps over the call.

  • MOO

    Turns off MO stepping over (and MX stepping out) operations.

  • MR

    Causes the debugger to be entered on (re-)entry to a named code module from any other module (that is, after a CALL or RETURN command), by adding the module to the list of monitored modules.

  • MX

    Causes the debugger to be entered immediately on returning from the current module to the calling module; in other words, it steps out of the current module.

  • MXO

    Turns off MX stepping out (and MO stepping over) operations.

  • SL

    Displays the return stack and its depth at each internal and external subroutine, and external function.

A code module can be any single DataBasic code item; that is, a program, an external subroutine or an external function.

The monitored list can contain any number of modules, but the longer the list the greater the potential performance impact.

The monitored list is unique to a single context and is reset on return to TCL.

Note: Some of these enhancements were first announced in the Reality V15.1 Online Documentation (Revision 7) Documentation Note. However, in this release the MA, ME, MR and MD commands are no longer a separately licensed feature of NPS Reality.
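As a hedged illustration of how the monitoring commands fit into a normal session, the fragment below drops into the debugger with a DEBUG statement; the program and subroutine names are assumptions for this example only. From the debugger prompt you could then add the UPDATE.TOTALS module to the monitored list with ME (to break on entry) or MA (to break on every active line), review the current settings with D, and step over or out of calls with MO and MX.

       * Minimal sketch (module and variable names are illustrative only).
       * DEBUG enters the DataBasic debugger before the external call, so
       * that monitoring commands such as ME, MA, MO and MX can be applied.
       TOTAL = 0
       DEBUG                        ;* enter the DataBasic debugger here
       CALL UPDATE.TOTALS(TOTAL)
       CRT 'Total is ' : TOTAL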

DataBasic Email application programming interface

A new DataBasic Email application programming interface (API) provides a mechanism that allows emails with attachments to be sent from within DataBasic.
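The actual calls are described in the DataBasic Email API documentation; the sketch below is purely illustrative, and the SEND.EMAIL subroutine name, its argument order and its return convention are assumptions rather than the real interface. It only shows the general shape of building a message with an attachment in DataBasic.

       * Purely illustrative: SEND.EMAIL and its arguments are hypothetical
       * placeholders, not the actual Email API; see the API documentation.
       RECIPIENT  = 'reports@example.com'
       SUBJECT    = 'Monthly sales summary'
       BODY       = 'Please find the summary attached.'
       ATTACHMENT = '/reports/sales-summary.csv'   ;* assumed path
       CALL SEND.EMAIL(RECIPIENT, SUBJECT, BODY, ATTACHMENT, RESULT)
       IF RESULT # 0 THEN CRT 'Email could not be sent (code ' : RESULT : ')'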

DataBasic application dump

This new feature allows a DataBasic application to dump (save) some information about its state, either automatically or on request.

There are three main situations in which a DataBasic application dump may be required: when a warning or fatal run-time error occurs (producing a WARN or ABORT dump item), when a program executes the BASIC.DUMP statement (a SOFT dump item), and when the DUMP debugger command is used (a DEBUG dump item). The supporting files, options and programs are described below.

BASIC-DUMPS file

A default global BASIC-DUMPS file is supplied, in the SYSFILES account, in which to store DataBasic application dump items.

Warning and fatal run-time errors

Whenever DataBasic generates a warning or fatal message, even if these messages are suppressed, the system creates a WARN or ABORT dump item, as appropriate, and saves it to the BASIC-DUMPS file.

However, multiple WARN and ABORT dump items caused by identical errors occurring on the same port, account, user, program and line number are typically suppressed. Instead, a single WARN or ABORT dump item is created at the first occurrence and thereafter a count on the item is incremented each time the error recurs. These duplicate item counts can effectively be reset by using a new RELOG.DB.DUMPS program. The previous functionality, in which all errors are reported regardless, can be restored by setting the DB.DUMP.MULTIPLE custom environment option as described below (it is unset/clear by default).

BASIC.DUMP statement

A BASIC.DUMP DataBasic statement allows the programmer to write a SOFT dump item to the BASIC-DUMPS file whenever required.
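For instance, a program might capture its state when it detects an unexpected condition. In the minimal sketch below, the file name, item ID and the use of BASIC.DUMP with no arguments are assumptions made for illustration.

       * Minimal sketch (file, ID and argument usage are assumptions):
       * record a SOFT dump item when an expected order item is missing.
       OPEN 'ORDERS' TO ORDERS.FILE ELSE STOP 201, 'ORDERS'
       ORDER.ID = '10042'
       READ ORDER FROM ORDERS.FILE, ORDER.ID ELSE
          BASIC.DUMP                ;* writes a SOFT dump item to BASIC-DUMPS
          CRT 'Order ' : ORDER.ID : ' missing - state dumped for analysis'
       END

The resulting dump item can then be examined with the DISP.DB.DUMPS program described below.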

DUMP.BASIC.VARS custom option

A DUMP.BASIC.VARS custom option is provided which, if set in an environment definition, causes the WARN, ABORT and SOFT dump items to include the program variables and their contents. If the option is clear (unset) these items do not include the program variables. DUMP.BASIC.VARS is clear by default.

DB.DUMP.MULTIPLE custom option

A DB.DUMP.MULTIPLE custom option is provided which, if set in an environment definition, causes the system to create multiple WARN or ABORT dump items even when reporting identical warning or fatal run-time errors occurring on the same port, account, user, program and line number. DB.DUMP.MULTIPLE is clear by default.

DUMP command

A DUMP DataBasic debugger command generates a DEBUG dump item that contains all of the program's variables (regardless of the setting of the DUMP.BASIC.VARS option) plus an optional user-specified message.

DISP.DB.DUMPS program

A DISP.DB.DUMPS cataloged DataBasic program is provided to display DataBasic application dump items from either the default BASIC-DUMPS file or a user-specified alternative.

RELOG.DB.DUMPS program

The RELOG.DB.DUMPS cataloged DataBasic program clears the dump item cache. This effectively resets the duplicate counts that are maintained while the DB.DUMP.MULTIPLE custom environment option is unset (as it is by default).

Default DataBasic compiler specified in environment definition

The SSM (Security System Maintenance) command has an improved SSM Option 4 - Define Environment Settings option, which is shared by the DEFINE-ENVIRONMENT TCL command. In addition to being significantly easier to use, it now includes the ability to define the default DataBasic compiler for an operating environment.

An environment-specific compiler is also useful for maintaining the live "current code" running on end-user systems while testing is carried out to make sure that any MultiValue compatibility changes, or new or updated features, are fully understood.

If an environment setting is not specified, the BASIC*DEFAULT synonym entry in the BASIC-COMPILERS system file in the SYSFILES account is used instead.

The DataBasic SYSTEM(120) function returns the name of the current environment and SYSTEM(121) returns the name of the current default compiler.
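For example, a short DataBasic check (a minimal sketch using only the two functions named above) can confirm which compiler a given environment will use:

       * Display the current environment and its default DataBasic compiler
       CRT 'Environment      : ' : SYSTEM(120)
       CRT 'Default compiler : ' : SYSTEM(121)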

Password definitions

The SSM (Security System Maintenance) command has a new SSM Option 6 - Define Password Definitions option to create and update password definitions for users and accounts.

Password definitions allow you to define the valid composition of passwords including minimum and maximum length; allowed patterns of alphabetic, numeric and special characters; sequences of ascending or descending characters; and so on.

Password definitions are stored as items in a new PW.DEFINITIONS system file in the SYSMAN account. User and account password definitions are distinguished in the file, so that it is possible to have user and account password definitions with the same name (for example, the DEFAULT password definitions).

User password definitions

Each user profile either explicitly references a user password definition or implicitly references the DEFAULT user password definition. Multiple users can share the same password definition.

User password definitions complement existing user profile features that control, for example, how long a password remains valid, how many retries are permitted, and so on. In addition, there are two new features: one to force the user to change their password when they next log on, and one to specify what happens when the password expires.

User passwords specified by using either the PASSWORD command or SSM Option 2 - Define User Profiles must meet the rules of the relevant user password definition, although these can be overridden from the SYSMAN account.

The DataBasic SYSTEM(119) function returns the name of the current user password definition item.
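For example, a minimal sketch (the policy check itself is just an illustration):

       * Warn if this user is still covered by the DEFAULT password definition
       IF SYSTEM(119) = 'DEFAULT' THEN
          CRT 'No site-specific password definition is assigned to this user'
       END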

Account password definitions

Each account either uses the account password definition with the same name or, if none exists, the DEFAULT account password definition. However, an account password definition item can be a synonym to an actual account password definition, so multiple accounts can effectively share the same password definition.

Account passwords specified by using either the PASSWORD (A) or CREATE-ACCOUNT command should meet the rules of the relevant account password definition, although these can be overridden from the SYSMAN account.

Note: These changes mean that the PasswordLength database configuration parameter is now redundant. On upgrading to this release, the current value of PasswordLength (if any) is used to set the Min password length attribute of the DEFAULT user password definition. Similarly, the DEFAULT account password definition is also configured for backwards compatibility.

Preview of the User Interface Framework

The User Interface Framework (UIF) is an extensible feature that is ultimately intended to provide a mechanism for separating the business logic of an application from its presentation or display logic.

For example, it is used internally by the MOUNT-IMAGE command.

Depending on user feedback, this feature will be further enhanced in future releases.

Multiple clean log deletions from tlmenu

Multiple clean log deletions are now possible from tlmenu, simplifying maintenance of the data resilience features Transaction Logging, Shadow, FailSafe and Disaster Recovery.

On using the tlmenu database command:

       Administration Options
       ======================

       1. Routine Maintenance
       2. Configuration and Setup
       3. Database Recovery
       4. Miscellaneous
       5. Disaster Recovery Configuration and Maintenance
       S. Show Logging Status (from any menu)

       Enter option (1-5,S) :

On selecting option 1, Routine Maintenance:

       Routine Maintenance
       ===================

       1.  Switch Clean Log
       2.  List Clean Logs
       3.  Archive Clean Log
       4.  Delete Clean Logs
       5.  Start Transaction Logging Status Monitor
       6.  Stop Transaction Logging Status Monitor
       7.  Save multiple Clean Logs to tape
       8.  Load multiple Clean Logs from tape
       9.  List multiple Clean Logs on tape
       10. Save the Database

       Enter option (1-10) :

On selecting option 4, Delete Clean Logs, the user is prompted:

       This option deletes multiple clean logs from disk

       Do you want to do this (y/n/q) ? :

On entering y, you are shown a numbered list of the clean logs available for deletion, from which you can select individual logs or ranges of logs in any combination. Alternatively, you can enter the name of the particular clean log that you want to delete.

       Select Clean Logs to Delete
       ===========================

       1.  CLOG160119-001
       2.  CLOG160119-002
       3.  CLOG160119-003
       4.  CLOG160119-004
       5.  CLOG160119-990
       6.  CLOG160119-991
       7.  CLOG160119-992
       8.  CLOG160119-993
       9.  CLOG160119-994
       10. CLOG160119-995
       11. CLOG160119-996
       12. CLOG160119-997

       Enter selections (eg 1,2,4-6) or Clean Log file :

Miscellaneous other changes
