Julian Thorrold, MD of IDtek Solutions, addresses some trends and issues organisations face when using
technology for risk and cost reduction.
During the boom years as large enterprises grew their operations and extended their footprints, risk managers began to evaluate the role they had to play in consolidating critical data from a host of risk reduction technologies used throughout the enterprise at various locations.
The current recession has brought risk-related losses and liabilities to light while increasing cost pressures. The risk-related losses that become most prevalent in a recession include:
* Employees stealing inventory/equipment.
* Employees stealing time.
* Retrenched employees becoming resentful and sabotaging operations/system performance.
* Poor morale leading to increased exposure at the level of health and safety.
* Increasing third-party theft.
These factors all point to the need for accurate communication of risk information and for cost saving solutions.
In any large enterprise with geographically distributed facilities, the question arises as to whether risks should be managed on a centralised or a decentralised basis. Both options have advantages and disadvantages.
Centralised risk management
* It ensures standards are in place.
* It offers the cost saving of a central control centre.
* It develops experts in different areas as a resource.
* It sometimes comes at a premium where more sophisticated communication platforms are required.
Decentralised risk management
* Risk managers set their own rules and policies at each of their separate remote locations.
* Employees tend to be generalists because they handle many different responsibilities, which sometimes means there is no active monitoring of the risk reduction solutions.
* The remote office sometimes tries to re-invent the wheel.
* Decisions are often made too quickly.
Many enterprises with distributed operations choose to centralise. This usually means multiple facilities operate to the same standards using the same equipment. This may offer cost savings inherent in utilising the same equipment, and employees travelling between facilities are able to transport their knowledge to different sites.
The negative is that the centralised risk management approach does not always cater for unique local risk issues. The best solution is to mix these two approaches. The mixed risk reduction approach ensures at least minimal standards while allowing for unique considerations at the distributed locations, and sometimes also for quick decision-making closer to the source of the risk. For instance, if the local operation chooses, or already has, specific access control technology on site, this could be left in place provided a breach of access control can be prioritised as an alarm and communicated over the standard platform to the central control centre.
As part of this centralisation, the transmission infrastructure for the risk-related information must be considered. To a large extent the selection will be driven by the amount of information to be transmitted. For example, if a remote control centre is operational only during business hours and alarm monitoring is switched to the central control centre after hours, the bandwidth needed to send this after-hours data is minimal. If video, audio, fire alarms, security alarms and other data are all sent to the central control centre during working hours, larger bandwidth is needed. Video will always require the most bandwidth, followed potentially by data from time and attendance applications.
With camera surveillance, typically one or two channels/streams are provided from each facility to the central control centre. The central control centre must be able to switch to any camera at a remote facility and receive the new video selection over the network. Recording is usually left at the remote facility so that all cameras can be recorded. With multistreaming IP technologies from the likes of Honeywell, Bosch and Pelco, it is even possible to transmit fewer frames per second per camera for viewing over the network, which reduces the bandwidth required while still providing recording of all remote cameras.
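To put figures to the saving, the rough arithmetic below is a sketch using assumed, illustrative values for frame size, frame rates and the number of forwarded streams; it simply compares sending the full recorded frame rate over the network with sending a reduced-rate viewing stream.

```python
# Rough bandwidth arithmetic for remote camera viewing (illustrative figures only).

KBPS_PER_FRAME = 25      # assumed size of one compressed video frame, in kilobits
RECORD_FPS = 25          # full frame rate recorded locally at the remote facility
VIEW_FPS = 4             # reduced frame rate of the viewing stream sent to the control centre
VIEW_STREAMS = 2         # streams forwarded from each facility to the central control centre

full_rate_kbps = KBPS_PER_FRAME * RECORD_FPS * VIEW_STREAMS
reduced_kbps = KBPS_PER_FRAME * VIEW_FPS * VIEW_STREAMS

print(f"Full-rate viewing would need ~{full_rate_kbps} kbps per site")
print(f"Reduced-rate viewing needs  ~{reduced_kbps} kbps per site")
print(f"Saving: ~{100 * (1 - reduced_kbps / full_rate_kbps):.0f}%")
```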
As far as the availability of transmission media for data is concerned, there are several options, each with its benefits and drawbacks. As mentioned, the data required in the central control centre establishes the bandwidth requirements. The media options are as follows (a rough matching sketch follows the list):
* Telephone/analogue – primarily for alarms and access control.
* ISDN lines/digital – primarily for alarms, access control, and non-realtime video.
* Broadband/DSL – speed limited by connection and sharing of services, primarily for alarms, access control, and near realtime video.
* Diginet or leased line – various Telkom packages from Basic to Diginet Plus – primarily for alarms, access control, and realtime video – continuous connection – fairly expensive depending upon distance to ISP.
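As a rough aid to matching payload to medium, the sketch below filters these options by the bandwidth a site needs; the capacity figures are indicative assumptions for illustration, not quoted service specifications.

```python
# Match a site's estimated bandwidth need to candidate transmission media.
# Capacities are indicative assumptions, not quoted service specifications.

MEDIA_CAPACITY_KBPS = {
    "Telephone/analogue": 56,
    "ISDN (single channel)": 64,
    "ISDN (dual channel)": 128,
    "Broadband/DSL (shared)": 512,
    "Diginet/leased line": 2048,
}

def candidate_media(required_kbps: float) -> list[str]:
    """Return media whose assumed capacity covers the required bandwidth."""
    return [name for name, capacity in MEDIA_CAPACITY_KBPS.items()
            if capacity >= required_kbps]

# Example: alarms and access control only (~10 kbps) vs near-realtime video (~200 kbps).
print(candidate_media(10))
print(candidate_media(200))
```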
A seemingly obvious answer to the selection of transmission media is the Internet. However, at this stage the trend in larger enterprises is to avoid using the Internet for the distribution of critical information. For these companies the IT group usually controls access to the Internet, and this can be a problem. If an employee uses the intranet to send e-mail, a delay of a few minutes is not a problem. The same delay is a serious problem for surveillance, fire, intruder and process control alarms, and there are no guarantees of reliability on the Internet or an intranet.
The other data transmission options mentioned earlier also have benefits and failings. It is important to know these media solutions’ pros and cons before you proceed with centralising risk related responsibilities.
IT standards
The next major industry trend is a migration towards IT standards. Information technology experienced revolutionary changes in the early ’90s, as can be seen in the expansion of object-oriented software, PC-based client-server architecture, open architecture and distributed networking, in addition to the Internet explosion. Together these dramatically changed the IT landscape. IT was faced with trying to integrate proprietary systems with these fast-paced technology trends. The need for integration of third-party solutions forced IT groups to impose standards that these solutions had to meet before they could be deployed.
Massive investments in new technology have meant that IT groups have become more involved in evaluating, and even rejecting, risk-based technology solutions. IT managers now have specific criteria for accepting third-party solutions, and they influence product evaluations and specifications.
Some of the recent criteria they use when selecting risk-based products are as follows:
* Open architecture for integration with the existing infrastructure.
* Scalability and system growth without declining performance.
* Current de facto IT standards in database, networking, and operating systems for operation on the existing network.
Unfortunately, many of the traditional risk-related technologies in the security/process control industry have been left behind. Major suppliers of security, fire and building management systems still offer proprietary systems based on dated technology. Characteristics such as limited networking capabilities, lack of robust client-server database architecture, lack of flash memory architecture, old application and operating system software, and old controller architecture with proprietary interface protocols make it very difficult to integrate such products within a corporate framework.
Traditionally, risk management systems were driven by proprietary hardware solutions while the application software was of little consequence. Often application software was provided at no cost. While this strategy was acceptable a few years ago, the reverse is now true.
With hardware becoming a commodity – think analogue and IP cameras, card readers, intrusion detectors and digital and analogue input/output units – software has become critically important. Risk reduction managers are realising that software and system architecture are what differentiate pedigree systems. The quality of software, conformance to standards, and scalability drive the choice. Hence the strong trend towards de facto IT standards.
Risk-reduction software
Modern risk-reduction software must be written utilising at least a 32-bit native API. This offers proven performance, access to a host of APIs and compatibility with other applications. It also future-proofs the system to the extent that, for instance, a 64-bit CPU will run a standard 32-bit program on a 64-bit version of an operating system.
A second trend in risk-reduction technologies began when the software industry underwent an object-oriented revolution. As a result, instead of writing an application as a single, monolithic program – which is difficult to develop, maintain, and enhance – the application is developed as a set of small self-contained components (objects), each of which performs a specific function. This brought with it major benefits such as improved reliability of applications due to re-usability of objects. Product upgrades and the enhancement cycle are now also shortened.
It is now also becoming essential that modern risk-reduction applications be designed and developed as object-oriented applications. Security applications that do not utilise today’s object-oriented architectural tools will be more difficult to integrate.
As part of a total risk-reduction system, the field hardware must support the architecture and should be designed to have the same characteristics as the application software. The emerging trend for field hardware components is the intelligent system controller (ISC), where local decisions are made offline. For IT managers to accept these devices on a corporate network, the devices must meet the following minimum requirements (a sketch of such an acceptance check follows the list):
* A 32-bit bus and CPU architecture.
* TCP/IP protocol support.
* Flash memory for firmware.
* Support for a very large (minimum 250 000) local cardholder database.
* Support for a large number of readers and alarm panels.
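A minimal sketch of such an acceptance check is shown below; the field names, the sample controller values and the thresholds for "a large number" of readers and alarm panels are assumptions for illustration.

```python
# Acceptance checklist for an intelligent system controller (ISC).
# Thresholds follow the minimum requirements listed above; device data is hypothetical.

from dataclasses import dataclass

@dataclass
class ControllerSpec:
    bus_bits: int                  # bus/CPU architecture width
    supports_tcp_ip: bool          # native TCP/IP protocol support
    has_flash_firmware: bool       # firmware held in flash memory
    local_cardholder_capacity: int
    reader_capacity: int
    alarm_panel_capacity: int

def meets_minimum(spec: ControllerSpec) -> bool:
    """True if the controller satisfies the minimum ISC requirements."""
    return (spec.bus_bits >= 32
            and spec.supports_tcp_ip
            and spec.has_flash_firmware
            and spec.local_cardholder_capacity >= 250_000
            and spec.reader_capacity >= 64          # assumed threshold for "large"
            and spec.alarm_panel_capacity >= 64)    # assumed threshold for "large"

print(meets_minimum(ControllerSpec(32, True, True, 300_000, 128, 96)))  # True
```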
Support for open architecture is another trend and one of the most important criteria for IT groups in selecting the right systems in these large facilities. Unfortunately, many vendors claim to have open systems without clearly understanding what the phrase means from an IT point of view. Open system implies that every major component of the system, every communication protocol, and every interface is designed according to industry standards which allow easy integration with other systems and components.
In addition, the most advanced systems should provide Universal I/O – the ability to interface with any third-party system or device through a simple unidirectional or potentially bidirectional ASCII string protocol (fire systems, building/time management systems, alarm systems, CCTV switches, paging, and e-mail systems). Universal data import/export is the ability to move any data, including multimedia (pictures, signatures, fingerprints, voice, video), from/to an external file or database system, with custom business rules applied to the data moved between systems.
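As an illustration of what a simple ASCII string interface might look like, the sketch below builds and parses a delimited alarm message; the message format and field names are hypothetical and do not represent any particular vendor's protocol.

```python
# Illustrative ASCII string interface for a hypothetical third-party device.
# Format: EVENT|<source>|<point>|<priority>|<description>  (assumed, not a real protocol)

def build_message(source: str, point: str, priority: int, description: str) -> str:
    """Encode an outbound event as a single ASCII line."""
    return f"EVENT|{source}|{point}|{priority}|{description}\r\n"

def parse_message(raw: str) -> dict:
    """Decode an inbound ASCII line into its fields."""
    kind, source, point, priority, description = raw.strip().split("|")
    return {"kind": kind, "source": source, "point": point,
            "priority": int(priority), "description": description}

msg = build_message("FIRE-PANEL-01", "ZONE-07", 1, "Smoke detected, warehouse aisle 3")
print(parse_message(msg))
```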
Trend towards scalability
In access control, scalability is measured not by the number of access control doors or cardholders a system can support, but by the number of transactions (events) the system can consistently sustain without any visible degradation of performance. In practical terms, it means that the same system should perform equally well whether supporting two doors or several thousand doors. The number of transactions generated by a security system in the real world can range from several to dozens of transactions per minute for a small solution, to several thousand per minute for larger systems. The application should support a multiprocessor-based database server without any modifications, and a truly scalable system should demonstrate a substantial increase in performance as the number of processors increases.
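To put these figures in context, the short worked calculation below estimates the sustained event rate a large site could generate; the door count, badge rate and events-per-badge figures are assumptions for illustration.

```python
# Estimate sustained transaction load for an access control system.
# Door count and badge rates are assumed, illustrative figures.

doors = 3000                      # access-controlled doors across the enterprise
badges_per_door_per_hour = 20     # average card presentations per door in a busy hour
events_per_badge = 2              # e.g. access granted + door opened

events_per_minute = doors * badges_per_door_per_hour * events_per_badge / 60
print(f"Peak sustained load: ~{events_per_minute:,.0f} transactions per minute")
# ~2,000 per minute: within the "several thousand per minute" band quoted for large systems.
```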
Database integration is probably the most important issue for IT in the evaluation of any system: the ability to integrate with external databases such as HR, payroll, fixed asset management, inventory management and time management. The application and database design must comply with standards and satisfy the following requirements:
1. Be Open Database Connectivity (ODBC) compliant.
2. Support a bidirectional interface with an external database.
3. Download and distribute security-related data to every ISC in the system in real time.
4. Guarantee the delivery of security data to each ISC in the system.
The security application must also support a bidirectional interface: when changes to the data are made in an external database, the modified data is moved to the security database, and vice versa. Changes in the human resources database may apply to the security database, while changes to an employee's access level in the security database may apply to personnel information in the human resources database.
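A minimal sketch of one direction of such a synchronisation appears below; Python's built-in sqlite3 module stands in for the ODBC-connected HR and security databases, and the table and column names are hypothetical.

```python
# One direction of a bidirectional sync: push HR changes into the security database.
# sqlite3 stands in for ODBC connections; table/column names are hypothetical.

import sqlite3

hr = sqlite3.connect(":memory:")
sec = sqlite3.connect(":memory:")
hr.execute("CREATE TABLE employees (badge_id TEXT PRIMARY KEY, name TEXT, active INTEGER)")
sec.execute("CREATE TABLE cardholders (badge_id TEXT PRIMARY KEY, name TEXT, active INTEGER)")
hr.execute("INSERT INTO employees VALUES ('1001', 'A. Dlamini', 1)")

def sync_hr_to_security(hr_db, sec_db):
    """Copy employee records into the security cardholder table, updating existing rows."""
    for row in hr_db.execute("SELECT badge_id, name, active FROM employees"):
        sec_db.execute("INSERT OR REPLACE INTO cardholders VALUES (?, ?, ?)", row)
    sec_db.commit()

sync_hr_to_security(hr, sec)
print(sec.execute("SELECT * FROM cardholders").fetchall())
```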
While distributing security data across the system, the application must detect whether any communication paths are unavailable or whether transmission errors occur. The application must store the undelivered data and continuously monitor the communication lines for availability. As soon as a communication path is restored, the security data must be delivered to the target ISCs. This design guarantees data integrity across the entire system.
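A sketch of this store-and-forward behaviour is shown below: undelivered updates are queued per controller and flushed once the communication path is available again. The controller names, link-status model and send function are hypothetical placeholders for real network delivery.

```python
# Store-and-forward delivery of security data to intelligent system controllers (ISCs).
# Undelivered updates are queued and retried when the communication path returns.

from collections import defaultdict, deque

pending = defaultdict(deque)                # per-ISC queue of undelivered updates
online = {"ISC-A": True, "ISC-B": False}    # hypothetical link status per controller

def try_send(isc: str, update: dict) -> bool:
    """Attempt delivery; in a real system this would be a network call to the ISC."""
    return online[isc]

def distribute(update: dict, targets: list[str]) -> None:
    """Send an update to every target ISC, queueing it where the path is down."""
    for isc in targets:
        if not try_send(isc, update):
            pending[isc].append(update)

def on_path_restored(isc: str) -> None:
    """Flush the queued updates once communications to the ISC are restored."""
    online[isc] = True
    while pending[isc]:
        try_send(isc, pending[isc].popleft())

distribute({"badge_id": "1001", "access_level": "WAREHOUSE"}, ["ISC-A", "ISC-B"])
print(dict(pending))            # update held for ISC-B while its link is down
on_path_restored("ISC-B")
print(dict(pending))            # queue drained after the path is restored
```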
Hardware trends
In addition to these trends in software development, hardware developments have had a significant impact on the industry. For instance, in many large dusty facilities, from a fire risk management point of view, traditional point detectors are being replaced with very early warning aspirating systems.
One such example is the Unilever mother warehouse in Pietermaritzburg, where a 40 000 square metre facility is protected by an aspirating system that offers very early smoke detection. Traditional smoke detectors such as beams and optical devices have always experienced severe shortcomings in their ability to detect a fire in time and to avoid dust-related false alarms. With the emergence of aspirating technologies such as the VESDA devices used by Unilever, these problems have now been overcome.
A second emerging technology is that of biometric verification of identity. Management in these large facilities is using biometric verification to identify workers for access control, health and safety requirements and time recording. However, with the advent of biometrics came a host of unsatisfactory biometric devices from all over the world at ridiculously low prices. The result was very severe negative publicity. It is only through much effort and education that the suppliers of leading technologies are making inroads once again into these large facilities.
The trend has moved now towards integrating proven biometric technologies onto established platforms where all sorts of considerations such as occupational health and safety requirements can be regulated alongside the access control decision.
In summary, the current trend is towards centralised risk management, with a preference for risk-based systems that comply with de facto IT standards. While this does introduce network security considerations, the older proprietary systems are fast losing ground to TCP/IP-based solutions and, more recently, to Web-based visualisation solutions. Typically these better developed solutions allow for the integration of information from various sources into one overall monitoring graphical floor plan. Ideally, information can then be shared through a single-window/Web-browser environment. By using Java as the core software technology, the client environment becomes available as an independent application in a Web browser or on other devices such as Java phones, PDAs and iPhones. The ideal result is zero deployment, where no installed application is required and full functionality is offered simply through browser technology at the Web client level.
There are other technologies that may even secure savings from the date of inception. In the case of biometric devices, these savings include time and attendance savings once overtime fraud is overcome, and potentially savings through automation, such as a reduction in manpower on the ground in guarding and health and safety roles.