Submitted by smart_jackal in Tech (edited)

The recent Pegasus spyware hack has "opened our eyes", or at least most people are pretending it has! However, if people were really serious about privacy being an ethical right of every individual, their actions would look very different from what they are today.

Firstly, many things have changed between the 80s/90s and now in terms of how the software development process itself is perceived and managed. And since software engineering is a new and rapidly evolving discipline (on the overall timeline of humanity), it is wrong to assume that every aspect of present-day processes is objectively better than those of the past.

For one, the extraordinary pressure on users to constantly update their apps and operating system software is quite recent. Ironically, people hardly updated their software with such frequency in the 90s or even the early 2000s, yet still managed to keep their systems far more secure - at least going by the number of hacking incidents in the public domain. Of course, one reason for this is the massive increase in the number of cyber-criminals trying to compromise people's security; the attack ecosystem has evolved a lot in recent years.

But on the other hand, what is the defense ecosystem doing to counter that? Constantly releasing "security updates" and constantly asking users to update their apps isn't the best way to approach this problem. Security shouldn't be an afterthought; it should be built into the project right from the start, and one way to do that is to reduce complexity and feature creep. All software must be designed to be robust and secure from the outset, and security patches should be released only when a vulnerability is actually found (such as the infamous OpenSSL Heartbleed vulnerability).
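
To make the "built in from the start" point concrete, here is a minimal sketch in Python: untrusted input is bound as a query parameter instead of being spliced into the SQL string, so a whole class of injection bugs is ruled out by design rather than patched away later. The table and column names here are hypothetical, purely for illustration.

```python
import sqlite3

def find_user(conn: sqlite3.Connection, username: str):
    # Bolt-on-security-later style (vulnerable to injection):
    #   conn.execute(f"SELECT id, name FROM users WHERE name = '{username}'")
    # Secure by construction: the driver treats username strictly as data.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    ).fetchone()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))
print(find_user(conn, "alice"))                        # (1, 'alice')
print(find_user(conn, "alice'; DROP TABLE users;--"))  # None - just a weird name
```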

Complexity is highly antithetical to privacy and security. The more complex a piece of software's design, the harder it is to test it for vulnerabilities or even audit its code. One way to reduce complexity is to keep components separate, or decouple them, even at the cost of performance: processing power is cheap, but a breach of security isn't (see the sketch below). In this regard, the move from sysvinit to systemd is extremely bad design, as the latter's highly complex "black box" approach demands far more effort from software auditors and testers checking for vulnerabilities than the former did. I'm not saying sysvinit didn't need an upgrade; it certainly did. But systemd was the wrong way to go about it. The more you move from simple to complex, the greater the chance that some shrewd hackers are sitting on zero-day vulnerabilities you aren't aware of.
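
As a toy sketch of what decoupling buys you, consider running a risky parser in a separate worker process: if the parser crashes or is exploited, the main program's memory is never exposed, at the cost of some inter-process overhead. The parser here is hypothetical, and a real sandbox would also drop privileges and restrict syscalls.

```python
import multiprocessing

def parse_untrusted(data: bytes) -> int:
    # Stand-in for a complex, risky parser (think image or font decoding).
    if data.startswith(b"BAD"):
        raise RuntimeError("malformed input")  # simulate a parser blow-up
    return len(data)

def parse_in_worker(data: bytes):
    # Only the short-lived worker process ever touches the raw input;
    # the parent receives a result or an error, never the blast radius.
    with multiprocessing.Pool(processes=1) as pool:
        try:
            return pool.apply(parse_untrusted, (data,))
        except Exception as exc:
            return f"rejected: {exc}"

if __name__ == "__main__":
    print(parse_in_worker(b"hello"))      # 5
    print(parse_in_worker(b"BAD input"))  # rejected: malformed input
```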

Older Windows versions like XP and 7 didn't require the constant updates that Windows 10 does. The same can be said about older vs. newer versions of Ubuntu, Fedora, etc. And Android is an absolute mess when it comes to software design! While AOSP is open source, the actual vendors like Samsung, Xiaomi, Oppo, etc. ship their own Android builds, which are proprietary and closed source. They don't even release regular updates for their software, and when they do, the updates are known to break existing features and introduce even more bugs! If only Android had followed a simple design like that of Windows or even a Linux distro, it would be much more secure today.

Going forward, it's up to the stakeholders of the software defense ecosystem (FOSS developers, testers and auditors, designers, sponsors and advocacy companies like Red Hat, etc.) to design their systems to be transparent and keep them simple rather than complex. Of course, as the number of features grows, some amount of complexity is bound to creep in; it's in the nature of users to keep asking for more and more features they don't need. But as far as possible, developers should only implement features to the extent that they can keep them secure and simple.


Comments


Fool wrote

You seem to be confusing greater reporting with greater issues. Though you have brought to my attention things I had forgotten; I now remember the pushback on moving from sysvinit. (I started on Solaris and later worked on old SCO Unix; by the time the change to Linux occurred, I had forgotten why nobody wanted to migrate earlier.)

10 years ago most computer security was a joke, but a lot less trust was placed in computers. The main thing is that people now understand how much money can be made.

Windows XP did not have any legitimate security; Windows Vista was a major upgrade.

A lot of the vulnerabilities being reported are very old, and often well known to government-backed groups.

Things like software-defined networking and virtual network functions have both good and bad points.

Overall, you want to make choices based on what the device is used for; only connect things to the internet if they really need it.

And if you're going to rely on mobile devices for any security, you should harden them properly.

But there is a point to be made for a phone OS focused on security.

I'm not sure if my comment was coherent.


cyberrose wrote

If KISS is important to you, you should try BSD, I guess. And when it comes to smartphones: you don't even have to hack the phones to get a lot of information about their users. If people care about privacy, they should think twice or three times about using them.


AnarchoDoom wrote

The repeated security updates for Tor are also getting me worried. The problem lies in the update process on Android and in more closed systems like iOS, where you can't even see what's happening beyond choosing whether to accept the updates.

'member how it's Microsoft that started this whole pattern with Windows XP...
