In the previous Cloud Broadcasting article, we looked at the business case for public clouds. In this article, we delve further into Cloud Born systems and go deeper into cloud security.
Cloud security is a partnership between the cloud service provider (CSP) and the broadcaster; a breach on one side will almost certainly result in a breach of the other. CSPs have an obligation to stop intruders from accessing servers and to protect processes and data, and users have an obligation to stop intruders from gaining access to their software.
IT Managers despair at password security, the Achilles' heel of any system, and the frustrating part is that IT Managers can easily deliver authentication systems that are very secure. They can force complex passwords, make users change passwords regularly, and lock accounts after n failed login attempts. However, this is usually frowned upon by users, who see complex security as an unnecessary inconvenience and a hindrance to their work.
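The policies described above can be sketched in a few lines. This is a minimal illustration, not production authentication code: the length, character-class and lockout thresholds are assumptions chosen for the example.

```python
import re

MAX_ATTEMPTS = 3  # illustrative n-lockout threshold

def is_complex(password):
    """Require a minimum length plus mixed character classes."""
    return (len(password) >= 12
            and re.search(r"[a-z]", password) is not None
            and re.search(r"[A-Z]", password) is not None
            and re.search(r"\d", password) is not None
            and re.search(r"[^A-Za-z0-9]", password) is not None)

class Account:
    """Tracks failed logins and locks after MAX_ATTEMPTS in a row."""
    def __init__(self):
        self.failed = 0
        self.locked = False

    def record_attempt(self, success):
        if self.locked:
            return
        if success:
            self.failed = 0   # a good login resets the counter
        else:
            self.failed += 1
            if self.failed >= MAX_ATTEMPTS:
                self.locked = True
```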
Two Factor Authentication
And they do have a point. But with ease of accessibility and use comes potential compromise of security. It's often left to the CEO to negotiate a path between protecting the system and providing ease of use. Recent advances in two-factor authentication make this easier, but it still has its limitations. When logging in, a user enters their username and password in the usual way; the software then sends an authentication code to the user's mobile phone, which they enter after the password. If the code is correct, they are allowed in. Travelling employees with poor mobile phone coverage find this a difficult system to deal with.
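Many two-factor systems sidestep the mobile-coverage problem by deriving codes on the device itself, using time-based one-time passwords (TOTP, RFC 6238) rather than SMS. A minimal sketch using only the Python standard library; the base32 secret below is the RFC test value, not something to reuse.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, timestep=30, digits=6, now=None):
    """Derive a time-based one-time code from a shared secret (RFC 6238)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if now is None else now) // timestep)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# The RFC 6238 test secret; at time 59 seconds it yields the published vector.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", now=59))  # prints 287082
```

Because both the phone and the server derive the same code from the shared secret and the clock, no network coverage is needed at login time.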
Traditional methods of operation work on the principle of providing an employee with a laptop or desktop computer installed with software such as word processors, spreadsheets, email and even high-end graphics programs. Logging on gives users access to large parts of the filesystem through file servers. Granular rights controls help, but they get complicated very quickly, especially when files start being shared.
This opens two areas of vulnerability: first, a potential hacker can quickly gain access to the corporate filesystem by installing malware and viruses; and second, if a laptop is stolen, the thief can easily gain access to the system.
Stop Hackers Storing Files
Public cloud Software-as-a-Service (SaaS) solutions provide a remedy to this. Desktop and laptop computers no longer need to be powerful, as the software is accessed through an internet browser, files are stored in the cloud, and high-demand processing takes place in the cloud too, so the user's computer needs much less memory and hard disk resource. If a hacker cannot store files on a laptop or desktop, it is much more difficult for them to gain access to the network file system.
Malicious hackers often try to gain access to a system through unpublished services running on IP ports. For example, TELNET, a legacy remote-login service used to interrogate a server from the command line, listens on IP port 23.
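The exposure is easy to demonstrate: anything listening on a port is reachable until something filters it. The sketch below is a local illustration only; it binds a throwaway listener on 127.0.0.1 and probes it the way a scanner would probe port 23 on a server.

```python
import socket

def port_open(host, port, timeout=0.5):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Stand-in for an exposed service: a throwaway listener on an ephemeral port.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
_, probe_port = listener.getsockname()
```

While the listener is up, `port_open("127.0.0.1", probe_port)` succeeds; once it is gone, the same probe fails. A VPC achieves the same effect by blocking the port in front of the service, without the service having to be stopped.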
VPC to the Rescue
Cloud systems such as AWS protect servers using their Virtual Private Cloud (VPC) allowing IT Managers to logically isolate individual servers and groups of servers. A range of IP addresses can be defined with subnet masks, route tables and network gateways. IP ports are opened so a specific service can be defined, and all other ports are blocked by default.
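That kind of address planning can be reasoned about with Python's standard ipaddress module. A sketch only: the 10.0.0.0/16 range and the /24 split are illustrative choices, not AWS requirements.

```python
import ipaddress

vpc_range = ipaddress.ip_network("10.0.0.0/16")    # illustrative VPC CIDR block
subnets = list(vpc_range.subnets(new_prefix=24))   # carve into 256 /24 subnets
public_subnet, private_subnet = subnets[0], subnets[1]

def in_subnet(addr, subnet):
    """Route-table style membership test for a single address."""
    return ipaddress.ip_address(addr) in subnet

print(public_subnet, private_subnet)  # prints 10.0.0.0/24 10.0.1.0/24
```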
Network changes are made quickly and on the fly. The VPC would be configured to allow ports 80 (HTTP) and 443 (HTTPS) for normal web server-client operation. When the software team need to update a system or software file, they would typically use SSH, the secure successor to TELNET, which listens on port 22. From the AWS console, they would change the VPC network configuration to open port 22, make their changes over SSH, and then close the port again, terminating the SSH session if they had not already logged out. The system would once again be secure, allowing only HTTP and HTTPS traffic access to the server.
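The open-maintain-close workflow is, in effect, a pair of edits to a default-deny allow-list. A toy model of that behaviour (not the AWS API; in practice the change would be made through the console or a tool such as the AWS CLI):

```python
class SecurityGroup:
    """Toy default-deny port filter: only explicitly opened ports pass."""
    def __init__(self, open_ports=()):
        self.open_ports = set(open_ports)

    def allow(self, port):
        self.open_ports.add(port)

    def revoke(self, port):
        self.open_ports.discard(port)

    def permits(self, port):
        return port in self.open_ports   # everything else is denied

web = SecurityGroup(open_ports={80, 443})  # normal web traffic only
web.allow(22)                              # open SSH for a maintenance window
# ... updates applied over SSH ...
web.revoke(22)                             # close it again; back to HTTP/HTTPS only
```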
If the web server needed to communicate with a database server, the servers would use unreserved ports, and these would need to be opened. Cloud providers such as AWS further enhance their network security by providing public and private facing systems. The public-facing system would restrict access to the HTTP and HTTPS ports, or SSH for maintenance, and the private-facing system would allow access to other servers within the system domain, such as the database and video processing.
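The public/private split can be expressed as two rule functions: the web tier answers HTTP/HTTPS from anywhere, while the database tier answers only from addresses inside the VPC. The 10.0.0.0/16 range and database port 3306 are assumptions for the sketch.

```python
import ipaddress

VPC_RANGE = ipaddress.ip_network("10.0.0.0/16")  # illustrative internal range

def web_permits(source_ip, port):
    """Public-facing rule: HTTP/HTTPS from any source address."""
    return port in (80, 443)

def db_permits(source_ip, port):
    """Private-facing rule: database port reachable only from inside the VPC."""
    return port == 3306 and ipaddress.ip_address(source_ip) in VPC_RANGE
```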
Master Account Access
Another area of potential vulnerability is the cloud provider's master software interface to the corporate cloud infrastructure. From a technical perspective, this is very secure, as providers generally rely on complex passwords and two-factor authentication. But once a user is logged into the cloud portal, they can cause all kinds of damage, from deleting servers to reconfiguring network systems.
System configuration is often left to Dev-Ops engineers, the members of a team who bridge the technology and communication gap between the software developers and the day-to-day users. In the past, Dev-Ops generally reported to the Head of Software (HoS), who in turn would often report to the Chief Engineer in broadcast stations. However, the lines of demarcation are becoming blurred and this reporting structure is not always respected; quite often the HoS now reports directly to the CEO.
Test Accounts Become Master Accounts
When a cloud infrastructure is first established, a master-account is created. The credentials consist of an email address and password. It's not unusual for a well-intentioned member of the team to set up the master-account using their own personal email address for test purposes prior to integration, and as cloud slowly migrates into the business this test-account becomes the master-account.
CEOs must be proactive in understanding the control Dev-Ops and the HoS have over their cloud broadcast system. They must understand how the logon credentials are stored. Who has them? Where are they? How can the CEO instruct somebody else to use them? The software team will have the best interests of the business at heart, but they are not always best placed to make decisions affecting the wider business, a responsibility that falls solely at the feet of the CEO.