Computer Security: Part 1 - What Is Computer Security?

Computer security is always a hot topic, but what do we mean by security, and why do systems seem to be forever vulnerable? Comparing hardware with software helps us to understand the vulnerabilities of software security.

When data processing was in its infancy, the number of installations was very small and there was little information exchange between them. Data exchange would typically be carried out by the physical transport of a storage medium. There was little danger of unauthorized data being input to a computer and the biggest concern of the day was system reliability.

Data errors on media threatened reliability, and error correction systems were developed to handle genuine errors caused by noise and media defects. Because they relied on standardized equations that were available to all and sundry, they offered no security against deliberate tampering. That led to the development of encryption, in which data to be stored or transmitted were turned into a form that was essentially meaningless to anyone not having the appropriate key.
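As a minimal sketch of the difference, the Python below first shows a standardized checksum that any tamperer can simply recompute, and then a keyed tag (an HMAC, standing in here for the keyed protection the article describes) that cannot be forged without the secret. The message, key and the attacker's edit are invented for illustration.

```python
# Uses only the Python standard library; all values are hypothetical.
import hashlib
import hmac
import zlib

def crc_check(data: bytes, claimed_crc: int) -> bool:
    # The receiver applies the same public equation as everyone else.
    return zlib.crc32(data) == claimed_crc

message = b"PAY ALICE 100"
crc = zlib.crc32(message)

# An attacker alters the data and simply recomputes the checksum,
# because the equation is available to all and sundry.
forged = b"PAY MALLORY 999"
forged_crc = zlib.crc32(forged)
print(crc_check(forged, forged_crc))          # True: tampering undetected

# A keyed tag (an HMAC) cannot be recomputed without the secret key.
key = b"secret-shared-by-the-endpoints"
tag = hmac.new(key, message, hashlib.sha256).digest()
guess = hmac.new(b"attacker-guess", forged, hashlib.sha256).digest()
print(hmac.compare_digest(tag, guess))        # False: forgery detected
```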

In stark contrast to the progress made in error correction and encryption, the great majority of today's computers are seriously lacking in the security department, and the consequences are all around us. Little time goes by before another hacking attack becomes public, with perhaps the personal details of thousands of company customers or hospital patients being taken. It is not the purpose of this article to catalog and bemoan the losses that have taken place, except to acknowledge that they are serious and need to be addressed.

Unauthorized access to data is only one of the problems an insecure computer can suffer from. Possibly worse is that the computer can, under external influences, be made to do things that are not necessarily beneficial, and that may not even be evident to the user.

There are many reasons why the present regrettable situation prevails, and looking at some of them may help to explain why the problem is not being solved.

It is fundamental that, leaving aside research topics such as homomorphic encryption, no processing can be carried out on encrypted data. However secure the transmission or medium by which the data arrived, and however secure the digital signature guaranteeing where they came from, at some point the data have to be decrypted so that processing can take place. That point needs to be secure, and presently, in most cases, it is not.
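A toy illustration of that point, using a deliberately insecure XOR "cipher" and made-up values: arithmetic performed directly on the ciphertext produces nonsense, so the plaintext must exist somewhere, and that somewhere is exactly where protection is needed.

```python
# A toy illustration only: XOR with a repeating key is not a real cipher,
# and every value here is hypothetical.
def xor_crypt(data: bytes, key: bytes) -> bytes:
    # The same function encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"k3y"
plaintext = (1234).to_bytes(4, "big")
ciphertext = xor_crypt(plaintext, key)

# Arithmetic performed directly on the ciphertext gives nonsense...
wrong = int.from_bytes(ciphertext, "big") + 1

# ...so the data must be decrypted first, and that moment must be secure.
right = int.from_bytes(xor_crypt(ciphertext, key), "big") + 1
print(wrong, right)   # some meaningless number, then 1235
```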

Testing storage and transmission devices for data integrity is relatively straightforward, and as a result errors from such causes are not a serious issue today. In contrast, testing software is extremely difficult because of the huge number of different states computers can get into. So difficult, in fact, that with the exception of safety-critical applications such as fly-by-wire flight controls, very little software is rigorously tested.

Not only is software testing difficult, it is also expensive, and consequently most software available today is riddled with errors, which may be minor irritations or security loopholes for hackers to exploit. The ease with which software can be updated means it is all too tempting to release buggy code and let the unfortunate user find the faults.

As will be seen in due course, numerous techniques are available to prevent the hacking of computers, and un-hackable computers exist, but the chances are that readers of this piece don't have one. Most of today's computers simply don't implement known security techniques, whether because the hardware doesn't implement the necessary logic, because the operating system doesn't support the hardware features that do exist, or because the operating system itself has flaws.

The stable door is wide open and after the horse has bolted, along comes the updated anti-virus software. Anti-virus software is big business and there is no incentive on the part of such businesses to encourage any improvement in the design of computers.

Another problem is that it requires some technical knowledge to assess computer security, and this is frequently lacking, especially where politicians are involved. The need to make a profit often causes security concerns to be brushed aside, with the result that the inevitable disaster wipes out any savings that were made. The strange logic that is applied in politics often means that those who preside over an IT catastrophe simply move on to another job where they can do further damage and break further regulations. 

Fig.1 - Most programming is done in a high-level language, which is converted to machine code by a compiler. The instructions in the machine code are converted into a series of microinstructions that guide the central processor through the stages necessary to execute the instruction.

A related problem is that the gulf between what the average person knows and what happens inside a computer is so great that outrageous claims for the performance of software can be made in the knowledge that they will probably go unchallenged. What the software is supposed to do and what it actually manages are frequently confused.

What the software designers claimed might also be exaggerated by the marketing department, so that something with a low probability of error becomes infallible. That is the case with a well-known brand of speed camera that police officers are told cannot go wrong, when the evidence is that it can.

That is hardly a novel phenomenon. There was a famous ocean liner that pioneered a constructional technique that improved its chances of staying afloat in the event of damage. The marketing machine promptly claimed it was unsinkable. It went down on its maiden voyage.

A further difficulty is that it is impossible to verify claims for software if the code is not published. The usual reasons given for not publishing code are confidentiality or security, but often the real reason code is not published is to prevent people bursting out laughing when they see how bad it is.

Fig.1 shows the hierarchy of a typical computer. The actual computation is performed by a number of logic gates. The execution of an instruction may require a series of different stages, and these are sequenced by microinstructions. Receipt of an instruction selects the appropriate microcode to execute it.
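The sketch below makes the bottom of that hierarchy concrete. The instruction set and micro-operations are entirely invented for illustration; real microcode is far richer, but the principle of one instruction selecting a fixed sequence of steps is the same.

```python
# Invented instruction set and micro-operations, for illustration only.
MICROCODE = {
    "ADD":  ["read_operand_a", "read_operand_b", "alu_add", "write_result"],
    "LOAD": ["compute_address", "read_memory", "write_result"],
}

def execute(instruction: str) -> None:
    # Receipt of an instruction selects the appropriate microcode,
    # which sequences the stages needed to carry it out.
    for micro_op in MICROCODE[instruction]:
        print(f"{instruction}: {micro_op}")

execute("ADD")
execute("LOAD")
```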

An instruction is basically a binary number, and there are quite a lot of different ones in a practical processor. It is perfectly possible to write a computer program by setting down a sequence of instructions, but in most cases this is not done, because most people find it too difficult to write machine code. Instead, programmers work using what is called a high-level language, in which the instructions are written using, for example, English words.

One of the earliest of these was FORTRAN, short for formula translation, which allowed those with mathematical knowledge but no computer skills to write programs. The words typed in by the programmer are converted into a series of instructions by a compiler before they can be executed.
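Python is interpreted rather than compiled to native machine code, but its standard dis module makes the same translation visible: an English-like line of source becomes a sequence of instructions the programmer never normally sees. The function below is hypothetical, and the exact instruction names vary between Python versions.

```python
import dis

def area(width, height):
    return width * height

dis.dis(area)
# Typical output (instruction names vary by Python version):
#   LOAD_FAST    width
#   LOAD_FAST    height
#   BINARY_OP    * (BINARY_MULTIPLY on older versions)
#   RETURN_VALUE
```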

Whilst this works, the result is that the programmer is completely detached from the actual computation. Even if programmers know something about security, the level at which they work prevents any meaningful action being taken.

At one time there was only hardware, which is physical. In electronic logic, hardware consists of things that influence electrical signals carrying information. Things like gates, bistables and shift registers are hardware. One definition of hardware is that unless it is defective, it always does the same thing given the same input. What it does is determined by the components and the way they are interconnected. That also means hardware is relatively easy to test.

If a different action is required, parts have to be changed or reconfigured. In contrast, the term software was coined to describe the ease with which things like instructions could be changed.

But the key difference between hardware and software is that whilst it is possible to make hardware that does not require software to operate, the converse is not true. That leads to a different definition, which is that hardware provides the physical environment in which software can operate, in the same way that the human body provides a physical environment in which there can be consciousness.

Another way of looking at the difference between hardware and software is that hardware is inflexible and it is difficult to make it do something else, whereas software is incredibly flexible and can readily be changed, locally or remotely. The logical conclusion must be that software is the way to go. For the purposes of security, that would be the wrong conclusion.

The reason is very simple. The ease with which software can be changed makes it equally easy for unauthorized changes to take place. The difficulty involved in changing what hardware does means that unauthorized change is equally difficult. Hardware must be at some physical location, and if that location is secure, no unauthorized change is possible.

It is the considered view of this writer that ultimately, computer security can only be achieved by appropriate hardware. That gives us another definition of hardware, which is that in addition to providing the environment in which software can operate it also protects itself from malicious software.

That brings us neatly to the concept of the virtual computer, which appears to the software to be indistinguishable from a real computer, until it tries to do something it shouldn't, whether due to error or malice, and finds that it doesn't work. 
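As a minimal sketch of that idea, assuming nothing about any real hypervisor, the toy "machine" below behaves exactly like a real one until a program touches memory it was never given, at which point the access is trapped rather than allowed to succeed.

```python
# An invented design for illustration, not a real hypervisor.
class VirtualMachine:
    def __init__(self, size: int):
        self.memory = [0] * size

    def store(self, address: int, value: int) -> None:
        if not 0 <= address < len(self.memory):
            # On real hardware this would raise a fault to the monitor.
            raise MemoryError(f"trapped illegal write to address {address}")
        self.memory[address] = value

vm = VirtualMachine(16)
vm.store(3, 42)          # legitimate: indistinguishable from a real machine
try:
    vm.store(999, 42)    # erroneous or malicious: it simply doesn't work
except MemoryError as trap:
    print(trap)
```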