Software Technology Past, Present & Future
Since the first faltering steps away from hard-wired logic, the methods
and tools available to the software industry for producing applications
have progressed considerably.
This progress can be charted through a number of distinct generations
of software technology, each with demonstrable benefits over its predecessor
in terms of productivity, and in the reliability and flexibility of the resulting applications.
The first move away from hard-wiring came about through the desire to
lower the cost of adapting the hardware to differing requirements. The
latest software technology available today is a continuation of that fundamental
mission embarked upon all those years ago.
Procedure Oriented / Flat Files : The 3rd Generation
Since the computer hardware itself works by executing a list of instructions,
it was natural that the first generation of application software should
mimic this. The applications of this generation consist of lists of instructions
grouped into procedures, each representing a business process (for example,
adding a new account or modifying an exchange rate). The data is stored in files which
originally mirrored the existing manually kept data, and whose records
are read and updated at the appropriate points in the application procedures.
Much effort was expended by computer manufacturers in providing tools
to increase the efficiency with which such applications could be produced.
The result was the selection of high level language compilers available
today, the so-called 3rd generation, such as Cobol and C.
As a rule, and banks were no exception, the technology was embraced as
a way of reducing costs and increasing efficiency by those institutions
with sufficient volumes of transactions and data to justify its implementation
and the tremendous cost of the hardware.
The cost of developing applications was high. Analysts and programmers
were required to implement even the smallest changes and to produce the simplest reports.
As the pace of change in the banking sector picked up along with the thirst
for information, IT quickly established itself as a major cost item in
the banks' statements of profit and loss. Many millions were invested
in IT to meet the challenges, and applications grew to encompass millions
of lines of code.
A lack of standards meant that compilers, and consequently applications,
were written specifically for particular computers, and millions had to
be spent again by those unfortunate enough to back the wrong hardware
technology, either on re-writes or conversion.
As the complexity of the applications grew rapidly, the shortcomings of
their design became all too apparent. Spending on maintenance (fixing
bugs) outstripped spending on new development. This positively hindered the
development of the systems necessary to support new areas of business,
for example securities and later derivatives, all in a climate of increased
competition among banks, where IT had become a vital tool in the race
to establish business advantage.
In summary, the shortcomings of this generation of application software were:
lack of flexibility;
costs of maintenance;
time taken to bring new products to market;
lack of platform independence.
The amount invested in this generation of legacy applications is evidenced
by their continued widespread use within the banking sector today. The
shortage of Cobol programmers registered in the run up to the new millennium
proved a revealing statistic.
Legacy Systems : The Middleware Solution
The re-development of legacy applications, whilst ultimately and indisputably
desirable, is hindered by the conflict of resources brought about by the
need to concurrently address the requirements of new areas of business.
The middle way adopted by many institutions has been to strategically
purchase or develop new applications using new software methods and technology
and to spend money tactically on middleware to integrate them
with legacy systems. The downside of this approach, exacerbated by the
introduction of different legacy systems inherited through corporate acquisition,
has been the arbitrary spread of data across different platforms, making
it difficult to gain a timely, consolidated and consistent view of the bank's data.
The areas which have been impeded the most by this are clearly those which
most require such broad and immediate access to operational data: risk
management, private banking, cost and profitability analysis and accounting,
not to mention the general efficiencies of coordinated STP workflow
between departments. The middleware approach brings with it considerable
additional overheads of interface and data management, and significantly
increases the cost of incorporating changes which span vertical applications
and of systems operations and testing. The more systems that are added
to accommodate change, the more the overall system (the sum of all of
the systems) tends to resist change.
Relational Databases : Towards the 4th Generation
The next significant development in software technology was that of
the relational database. Instead of disparate files of often duplicated
information whose content and relationships were known only to the programmer,
an organisation's data requirements could be analysed and expressed
as an entity relationship diagram.
Each entity represents the data associated with a component of the business
processes such as a client record or an account. Each has attributes to
describe it and its relationships with the other entities of the data
model. The descriptions of the entities' attributes and relationships are
stored in a data dictionary. The clear advantage over the flat file approach
being that the meaning and significance of the data are stored in the
database with the data as opposed to being buried in the source code written
and understood exclusively by programmers. The benefit of this is that
the data can be accessed, understood, analysed and reported on independently
of the application software using standard tools such as SQL. Properly
implemented, it also greatly simplifies data maintenance by guaranteeing,
for example, that if a client's address is changed, by virtue of
the fact that it only exists in one place, it does not need to be changed
anywhere else. Of course these benefits are limited by the scope of the
applications serviced by the single database. However at the time of their
introduction they represented a major milestone on the journey between
the earlier trend of designing applications to be computer centric and
the current one of designing them to be business centric.
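The single-source-of-truth benefit described above can be sketched in Python using the built-in sqlite3 module. The table and column names (clients, accounts) and the sample data are illustrative assumptions, not taken from any particular banking schema:

```python
import sqlite3

# In-memory relational database: the client's address exists in exactly
# one place (the clients table), and accounts merely reference it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clients (id INTEGER PRIMARY KEY, name TEXT, address TEXT)")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, "
             "client_id INTEGER REFERENCES clients(id), balance REAL)")

conn.execute("INSERT INTO clients VALUES (1, 'A. Smith', '1 Old Road')")
conn.execute("INSERT INTO accounts VALUES (100, 1, 2500.0)")
conn.execute("INSERT INTO accounts VALUES (101, 1, 900.0)")

# One update in one place; no other record needs to change.
conn.execute("UPDATE clients SET address = '2 New Street' WHERE id = 1")

# Standard SQL reporting, independent of any application program:
rows = conn.execute(
    "SELECT accounts.id, clients.address FROM accounts "
    "JOIN clients ON accounts.client_id = clients.id "
    "ORDER BY accounts.id"
).fetchall()
print(rows)
```

Both accounts report the new address, because the address was never duplicated into their records in the first place — exactly the maintenance guarantee the flat-file generation lacked.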
The realisation of the possibilities they offered of a consolidated bank-wide
database updated in real time and covering all products and services led
many in banking IT to set off in search of the so-called 'holy grail'.
This was the mythical package that could process all products and services
and maintain a relational database to supply all of the information requirements
of the bank.
The RDBMS methodology solved the consolidated data access issues so far
as the scope of its implementation allowed, but apart from some data maintenance
and reliability advantages, it did not offer any significant benefits
in terms of the flexibility and time taken to introduce support for new
products and services.
4th Generation Languages
Efforts to resolve these issues focussed on yet more powerful methods
of producing procedural applications known as 4th generation
programming languages. The more sophisticated of these incorporate complete
analysis and design methodologies, the automatic production of relational
data models, and the linking of the generated applications to the resulting
database. The tools are now largely provided by specialised software vendors
rather than by the computer manufacturers and so do overcome to a certain
extent the issue of platform independence. The same issue remains today,
but has now been distilled to a small number of hardware and operating
system suppliers or their tactical alliances.
Towards The Future
While the 4GL tools clearly enable rapid development of applications,
the flexibility of the applications themselves is limited by their design
to the extent that the introduction of new products and services, or of
any other changes, requires a return to the source code, changes to the
data model and re-generation of the application. This is the same cycle
of development as the previous generation, but more efficient.
However efficient they are or become, the 4GL tools cannot compete with
an application which is itself able to accommodate change without the
need for re-generation or for changes to the data model. Such applications
would provide functions to add new functions in the same way as they provide
functions to add new accounts, and with the same immediacy.
Introduced in response to a new requirement, or to replace a legacy system
for which maintenance is no longer possible or economical, such applications
offer the real possibility of starting to reduce the number of systems
and interfaces within an enterprise by enabling the incorporation of new
products and services, or further legacy systems, without the cost of redesigning the application.
This could break the current cycle of application development which banks
seem to be in, where there is a real danger that each application added
is another straw on the camel's back of the bank's overall system
strategy, and at the least another multiplier of the cost of implementing change.
Parameterised and Rules Based Systems : The First Steps
Parameterised and rules based systems represent the first steps towards
building the flexible applications described above whereby flexibility
is built into the applications rather than derived from the tools with
which they are built. Parameterised and rules based systems are at first
glance easy to confuse with one another. Since there are a number of such
systems on the market, it is worth spending a little time to understand the differences.
Parameterised systems are ones in which the path taken by an application
module through a set of procedural instructions depends upon the setting
of some variables. These flags are included in the static
data records of the application and can be set relatively easily without
the need for expensive and time consuming changes to software.
The existence of parameterised systems stems largely from the requirement
of software houses to be able to sell their packages in different countries
and market segments without themselves incurring the costs of producing
and maintaining different versions.
They are implemented via if statements in the application
programs conditioned on variables within the static data tables. An example
might be if <tax_flag> then <calculate_tax>.
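The pattern can be sketched in Python. The product names, the tax_flag variable and the flat 10% rate are all assumptions for illustration; the point is only that the path through the procedure is selected by static data, not by code changes:

```python
# Static data records of the (hypothetical) application: flags set per
# product without any change to the software itself.
STATIC_DATA = {
    "deposit": {"tax_flag": True},
    "loan":    {"tax_flag": False},
}

def calculate_tax(amount):
    return amount * 0.10          # assumed flat 10% rate for illustration

def settle(product, amount):
    total = amount
    # The 'if <tax_flag> then <calculate_tax>' pattern from the text:
    if STATIC_DATA[product]["tax_flag"]:
        total += calculate_tax(amount)
    return total

print(settle("deposit", 100.0))   # flag set: tax is applied
print(settle("loan", 100.0))      # flag clear: no tax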
The ease of use of parameters has been extended to provide a modicum of
control of such packages to the bank without the need to revert to the
software house. This typically includes parameters attached to product
files to allow the bank to choose the characteristics of its products,
for example fixed rate versus floating rate, and can go as far as allowing
the definition of new product types by selecting a number of such parameters.
The first major restriction of these systems is that the parameters merely
select one of a number of predefined paths through the software. Any usage
not thought of by the program's designers is therefore not achievable
by any combination of parameters. In other words, in the above example,
both the flag tax_flag and the procedure calculate_tax
must already exist within the application. The addition of new flags or
procedures requires an expensive re-design and re-build of the package.
The second major limitation is that the data stored in such systems is
equally limited to that foreseen by the application's designers.
In an attempt to overcome this, many such packages include some additional
'spare' data elements which can be used to accommodate unforeseen
data. Use those up, however, and you are in the same boat of having to
go back and re-design the system.
Rules based systems overcome the first limitation of parameterised systems
by breaking procedures down into smaller elements called rules. Facilities
are provided within the applications for combining the rules in different
ways to create new procedures. Provided a sufficient range of these smaller
elements is foreseen by the application's designers, the result is
far greater flexibility than that delivered by simpler parameterised systems.
Rules based systems alone however still do not overcome the second limitation.
That is to say that the introduction of new data attributes and new tables
still requires a re-design of the application and a re-build.
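The composition idea can be sketched as follows. The rule names, the interest and fee figures, and the dictionary-based account record are assumptions; what matters is that new procedures are assembled from existing rule elements at run time, with no recompilation:

```python
# Small rule elements supplied by the (hypothetical) application's
# designers, registered once under a name.
RULES = {
    "apply_interest": lambda acct: {**acct, "balance": acct["balance"] * 1.02},
    "deduct_fee":     lambda acct: {**acct, "balance": acct["balance"] - 5.0},
}

def run_procedure(rule_names, account):
    # A 'procedure' is just an ordered list of rule names, so new
    # procedures are data, not code.
    for name in rule_names:
        account = RULES[name](account)
    return account

# Two different procedures composed from the same elements:
month_end = ["apply_interest", "deduct_fee"]
fee_only  = ["deduct_fee"]

print(run_procedure(month_end, {"balance": 1000.0}))
```

Note that the second limitation remains visible even here: the account record's attributes (just a balance) are fixed by the designers, and a rule cannot conjure up a data element that was never foreseen.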
Object Oriented Systems : Getting There
The next piece in the puzzle is provided by object oriented technology.
It does for procedural computer centric programs what relational databases
did for flat files.
Business objects are described both in terms of their data (attributes)
and the procedures required to process them. An example might be an object
called client which has attributes of name, address and domicile
and with functions to fetch and update client records. These functions
have a well defined programming interface which enables them to be called
in a consistent manner by any other program without the need for the other
program to be compiled with it.
Any changes to the object client, either data or procedural, are contained
within that object, hence considerably reducing the maintenance overheads
of introducing changes.
Time to create applications is also reduced partly because of the ability
for an object to be defined in terms of another already existing one (inheritance).
The standardised nature of the objects' interfaces means that it
is also possible to purchase objects from 3rd party sources
and use them within new applications.
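The client object described above might be sketched in Python as follows. The in-memory store and the CorporateClient subclass are illustrative assumptions, standing in for whatever persistence and class hierarchy a real application would use:

```python
# A business object: data (attributes) and the procedures to process
# them, behind a well-defined interface.
class Client:
    _store = {}                       # stands in for the database

    def __init__(self, name, address, domicile):
        self.name, self.address, self.domicile = name, address, domicile

    # Callers use fetch/save and never touch the storage directly,
    # so changes to the object stay contained within the object.
    def save(self):
        Client._store[self.name] = self

    @classmethod
    def fetch(cls, name):
        return cls._store[name]

# Inheritance: a new object defined in terms of an existing one.
class CorporateClient(Client):
    def __init__(self, name, address, domicile, registration_no):
        super().__init__(name, address, domicile)
        self.registration_no = registration_no

CorporateClient("Acme SA", "Geneva", "CH", "CH-123").save()
c = Client.fetch("Acme SA")
print(c.domicile, c.registration_no)
```

Any program holding the Client interface can fetch and use the corporate client without being compiled against, or even aware of, the subclass — the containment property the text describes.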
OO applications lend themselves particularly well to the presentation
of applications via graphical user interfaces, since each object can be
represented as an individual icon.
Builders of OO applications are still faced with providing a model to
which to attach their objects. In addition, any new objects required must
be generated or built using 3rd or 4th generation
techniques. These factors combine to leave OO applications better placed
than others, but still short of the flexible applications, of the type described
earlier in this article, required to have a significant impact on the current situation.
The best applications combine the best of relational database, rules based
and OO technology. The generated applications deliver a set of object
classes to the application environment and provide facilities for the
online creation of new object types in terms of the supplied classes.
The result is applications where both the processes and the data model are defined
by rules, providing solutions of unparalleled flexibility which
welcome, rather than obstruct, change. If your IT department or supplier
is not thinking along these lines then perhaps it's time to look
again, because these applications are available now.
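The online creation of new object types described above can be sketched as follows. Everything here — the type dictionary, the function names, the 'swap' product and its attributes — is a hypothetical illustration of the idea, not any vendor's actual design:

```python
# A dictionary of object types maintained by the running application:
# defining a new type is an ordinary data operation, with the same
# immediacy as adding a new account.
TYPE_DEFINITIONS = {}

def define_type(type_name, attributes):
    """Register a new business object type while the system is running."""
    TYPE_DEFINITIONS[type_name] = attributes

def create(type_name, **values):
    attrs = TYPE_DEFINITIONS[type_name]
    unknown = set(values) - set(attrs)
    if unknown:
        raise ValueError(f"unknown attributes: {unknown}")
    return {"type": type_name, **values}

# A new 'swap' product type is introduced with no re-generation of the
# application and no change to a compiled data model:
define_type("swap", ["notional", "fixed_rate", "floating_index"])
deal = create("swap", notional=1_000_000, fixed_rate=0.03)
print(deal)
```

The design choice this illustrates is the one the article argues for: both the data model (the attribute lists) and the processes that validate and create instances are driven by rules held as data, so accommodating change requires neither a return to the source code nor a rebuild.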