Security is not a one-time solution. It requires constant attention.
John Watkinson* looks at the importance of virus protection and some methods that help prevent cyber-attacks, ranging from proper staff training to adopting unique hardware and software designs.
When it comes to computer security, managers must be consistent across all levels. There is no point in striving to maintain a secure system if you allow users to be sloppy about passwords, or if you have no clear picture of the risk from cyber-attack. You also have to install program and security updates regularly. Security is an all-or-nothing process.
Your staff may be under attack from social engineering, which encourages people to divulge information that they should not. People are trusting and will often allow themselves to be manipulated by external inputs in such a way that they become disgruntled with their employer. It is then only a small step for a hacker to encourage them to harm their employer by divulging damaging information. Computer-dependent businesses need a process in place that makes employees aware of these insidious techniques and keeps them on the lookout for symptoms of trouble.
Maintaining network security must be constant and inclusive. Users need proper training and reminding about procedures and changing passwords. Software needs regular updates along with proper virus protection.
In the early days of computing, when machines were expensive, they had to be time-shared between users. Some users would be developing code that could contain errors, while other users were running experiments. It was vital that one user’s faulty code wouldn’t bring down the whole machine. Combinations of hardware and operating system software were developed that could detect when faulty user code was trying to access something it shouldn’t, like peripherals. Those systems worked, but were neglected once everyone had a computer.
Today’s problem is that we don’t actually have computers of our own. Once connected to the Internet, it’s not your computer any more: you are sharing it with others who are not necessarily interested in your well-being, and you need to act accordingly.
The first rule of computer security has to be that if the function of the machine does not depend on Internet connection, then don’t connect it.
The difference between chaos caused by development code that contains errors and chaos caused by viruses is only one of intent. The techniques needed to make computers secure and the tests needed to prove it all exist. Unfortunately, those solutions are often not sufficiently used or validated out of a combination of misguided penny pinching and ignorance. Things need to change.
Who holds the key?
Once upon a time the number of cars being manufactured grew enormously, and for economy they were fitted with locks that were locks in name only. The inevitable boom in car theft and joy riding led to regulations requiring locks to be more effective. Now it is the turn of computers. The problem is visible and growing and the regulations are likely to follow. We need to encourage, optimise and embrace them for the good of everyone but the outlaws.
Digital Cinema Packages of feature films are typically delivered on hard drives. Trailers and ads can be moved on USB, DVD or Internet downloads. Image courtesy Digital Cinema Mastering.
There have been some security success stories. Digital cinema was never about picture quality; it was about copyright theft and illegal copying of intellectual property. The solution was to make every digital cinema projector in the world unique. Each one has its own key to decode scrambled data. If you open the projector, the key is trashed. A movie is specifically encoded to be decoded by the projectors that are authorised to show it. Anyone intercepting or stealing the data gets nothing but noise.
One of the difficulties we have is that certain security agencies actively don’t want the world to embrace secure communication because it means they can’t eavesdrop on us. Whilst they may have reasons to do that, they have to prove that the benefits to society from their activities outweigh the damage society is suffering due to insecure communications. I haven’t seen that proof.
There have been some spectacular security failures, too, in particular the rapid adoption of the Compact Disc by record companies just as personal computers were becoming popular. Remember the term “CD quality”? The very digital precision that was sold as a quality improvement soon became the format’s downfall: it was easy to duplicate the ones and zeros with everyday computers. Why pay when one can steal?
You’ve got a bug
Today’s computers use what is called von Neumann architecture, after John von Neumann, the mathematician. In such a machine, the memory simply stores binary numbers without being concerned about what they mean. Some of the numbers might be the code that tells the processor what to do. Other data might be text, audio samples or pixel data that the processor is intended to act on, or which it has finished acting on.
The term "computer bug" is typically attributed to Grace Hopper on September 9th, 1945, when a moth short-circuited relay number 70 on Panel F of the Mark II Aiken Relay Calculator at Harvard University. In fact, engineers had used the term "bug" to label equipment problems prior to 1900.
Von Neumann argued that storing instructions and data in the same way was more flexible, giving programs the freedom to do things like modify and improve themselves as they ran. Whilst that is true, the simple fact is that hardly any software like that is used outside of genuine research establishments, and certainly not for broadcast applications.
Viruses can take advantage of that architecture by sneaking malicious code into the machine disguised as data. The processor is then told to run it and doesn’t know any better. So von Neumann architecture computers have too much freedom, freedom that is not needed for most applications.
Not all computers work like that. Some computers use a concept called tagged memory. In this approach the memory words are divided into two fields. The larger field stores the data, and the smaller field holds a bit pattern that describes the data: for example, as executable code, as information to be processed, or as fixed information such as a look-up table. Part of the memory management system is a piece of hardware that constantly compares what the computer is trying to do with the tags describing the memory it is trying to access.
In a system like this, any attempt to write over or copy executable code will result in an error condition that prevents the process from going ahead. Executable code can only be accessed by the processor during an instruction fetch cycle, and anything else is suspicious. Equally, any attempt to write over a look-up table will result in an error condition.
Memory card for modern desktop computers.
The real strength of tagged memory comes from tagging data to be processed as read-only or read/write. That doesn’t have any effect on normal operation, but if any attempt is made to execute read/write data, in other words if the processor attempts an instruction fetch, it won’t be allowed. The system will trap and alert the operating system to a potential threat.
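The tag-checking idea described above can be sketched in a few lines. This is a toy model, not a real architecture: the class and tag names are invented for illustration, and the "hardware trap" is represented by an exception.

```python
# Toy model of tagged memory: every word carries a tag alongside its data,
# and every access is checked against that tag by the "hardware".
CODE, DATA_RO, DATA_RW = "code", "read-only", "read-write"

class TagFault(Exception):
    """Raised when an access violates a word's tag -- the hardware trap."""

class TaggedMemory:
    def __init__(self):
        self.words = {}                      # address -> (tag, value)

    def store(self, addr, value, tag):
        self.words[addr] = (tag, value)      # privileged load, e.g. by the OS

    def write(self, addr, value):
        tag, _ = self.words[addr]
        if tag != DATA_RW:                   # overwriting code or a table: trap
            raise TagFault(f"write to {tag} word at address {addr}")
        self.words[addr] = (tag, value)

    def fetch_instruction(self, addr):
        tag, value = self.words[addr]
        if tag != CODE:                      # executing data: trap
            raise TagFault(f"instruction fetch from {tag} word at address {addr}")
        return value

mem = TaggedMemory()
mem.store(0, "ADD R1, R2", CODE)             # legitimate program code
mem.store(1, 42, DATA_RW)                    # ordinary working data

mem.fetch_instruction(0)                     # allowed: fetching a code word
try:
    mem.fetch_instruction(1)                 # a "virus" smuggled in as data
except TagFault as fault:
    print("trapped:", fault)
```

Note that the virus never runs: the trap fires on the instruction fetch itself, before any malicious work can be done, which is exactly the property the article describes.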
Techniques like this stop viruses in their tracks. Even if they can get into the computer, they can’t run. Cyber criminals simply turn their attention somewhere else.
Viruses look for weakness
Machines that don’t have tagged memory still have hardware that can be used to increase security. All modern computers use memory management, whereby the address generated by the processor when running user code is a virtual address that is mapped to a physical memory address by adding a constant to it. The memory management unit that adds the constant can determine what the user program can and cannot access. For example only the operating system should be able to access hard drives and other peripherals. Pages of memory can be described in a similar way to tagging.
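The relocation-and-protection idea can be modelled simply. The sketch below is hypothetical (the class names and page layout are invented): the MMU adds a base constant to every virtual address, rejects anything outside the user's allotted space, and honours per-page write permissions.

```python
# Toy MMU: virtual address = offset into the user's space; physical address
# is obtained by adding a base constant, after permission checks.
class MMUFault(Exception):
    """Represents the hardware trap on an illegal access."""

class MMU:
    def __init__(self, base, limit, writable_pages, page_size=256):
        self.base = base                  # constant added to virtual addresses
        self.limit = limit                # size of the user's address space
        self.writable = writable_pages    # page numbers the user may write
        self.page_size = page_size

    def translate(self, vaddr, write=False):
        if not 0 <= vaddr < self.limit:   # e.g. trying to reach a peripheral
            raise MMUFault(f"virtual address {vaddr} out of bounds")
        if write and vaddr // self.page_size not in self.writable:
            raise MMUFault(f"write to read-only page {vaddr // self.page_size}")
        return self.base + vaddr          # the physical address

mmu = MMU(base=0x4000, limit=1024, writable_pages={1, 2, 3})
print(hex(mmu.translate(0x10)))           # read from page 0: allowed, 0x4010
```

Because the user program only ever sees virtual addresses, everything outside its mapped pages, including the hard drives and other peripherals the article mentions, simply does not exist from its point of view.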
Intel Core i7 CPU.
The problem comes when the operating system isn’t sufficiently robust and does not shut doors behind it. The content of memory that is no longer in use is left intact instead of being deleted. Viruses exploit those weaknesses.
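"Shutting the door" can be as simple as scrubbing a buffer before the memory is released, so that stale secrets are not left lying around for the next occupant. A minimal sketch (the function name is illustrative):

```python
# Overwrite a buffer in place before its memory is reused, so no stale
# content survives for a later process or virus to read.
def scrub(buffer):
    """Zero every byte of the buffer in place and return it."""
    for i in range(len(buffer)):
        buffer[i] = 0
    return buffer

secret = bytearray(b"session-key-1234")
scrub(secret)
print(secret == bytearray(len(secret)))   # True: nothing left to exploit
```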
Typical computers use the same processor to run the operating system and the user programs. In theory the two are completely isolated and user programs can’t alter the operating system. In practice the operating system’s weaknesses allow that to happen.
With CPUs being so inexpensive, there is a reasonable argument to build computers in which the operating system has its own processor that controls the memory management and the memory of the user processor.
In such a machine, all the user CPU can do is process. It can’t access I/O devices. It has to ask for data to be put in user memory and it has to ask for it to be taken away after processing. In such an environment, a virus can run but it can’t do any damage.
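The split described above can be modelled as follows. This is a toy illustration, not a real design: the class names (`Disk`, `OSProcessor`, `UserCPU`) are invented, and the key point is simply that the user CPU has no reference to the disk at all, only to buffers the OS processor fills and drains.

```python
# Toy model of a split architecture: only the OS processor owns peripherals;
# the user CPU can merely transform buffers it is handed.
class Disk:
    def __init__(self, blocks):
        self.blocks = blocks

class OSProcessor:
    """Owns the peripherals; copies data in and out of user memory on request."""
    def __init__(self, disk):
        self._disk = disk

    def load_into_user_memory(self, block):
        return list(self._disk.blocks[block])    # hand over a copy, never the device

    def store_from_user_memory(self, block, data):
        self._disk.blocks[block] = list(data)

class UserCPU:
    """Can only compute over a buffer; it has no I/O path of any kind."""
    @staticmethod
    def process(buffer):
        return [x * 2 for x in buffer]           # stand-in for real work

disk = Disk({0: [1, 2, 3]})
os_cpu = OSProcessor(disk)
buf = os_cpu.load_into_user_memory(0)            # user CPU asks for data
os_cpu.store_from_user_memory(0, UserCPU.process(buf))
print(disk.blocks[0])                            # [2, 4, 6]
```

Even malicious code running on the user CPU could only corrupt its own buffer; reaching the disk requires going through the OS processor, which enforces the rules.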
Broadcasters can’t be expected to develop their own computers, but what they can do is to ask their suppliers tough questions, educate their staff and lobby their politicians for legislative action.
*John Watkinson is a Member of the British Computer Society and is a Chartered Information Systems Practitioner.