So, the other day, I came across an article about a smart refrigerator. We have already had our dose of smartphones and smart TVs, but now the world was supposed to take a leap ahead. The refrigerator comes with integrated diet-management software and other utilities that tell the user which of its contents need a refill. All in all, it's a great appliance. As we can see, electrical appliances have moved well beyond their basic operation. They now come with added usability on top of their raw value, and in the case of smartphones, it has been quite a revelation. The refrigerator is cool, but you know what the coolest part about it is? It has its own IP address, and just like any other terminal, laptop, smartphone or the latest smart TV, it too is connected to the Internet. We have finally reached the inception of an era in which almost anything that serves some purpose will be connected to the Internet. Ladies and gentlemen, I welcome you to the Completely Connected Universe.
With a craving for more and more features in what we possess, we have been quite successful in remodeling our products and transforming them into their modern connected avatars. So far, we have also been successful in convincing the common buyer that the new technologies are here to stay. In short, we are not very far from the day when your cars, your irons, your geysers, your air conditioners and even your homes themselves will all be connected to you via the Internet. Your home may send a ping to your smartphone about a prospective burglary or a burgeoning fire. Your car may send you an email or a message when someone tries to open its locks. Your geyser may send you a chat message saying that you left it on, that it is unsure whether this was deliberate or a mistake, and asking whether it should shut itself down. All in all, every one of your devices and appliances will stay in touch with you, courtesy of the Internet. The whole prospect appears spectacular, and yet there are intricacies which, if overlooked, can do something so devastating that you will make every attempt to revert to the old times. What is this inherent flaw that can overpower the gargantuan benefits of such astonishing technologies? It is the one thing that has been bothering cyberspace for some time now: cyber security.
No matter how far the Internet has come in terms of network infrastructure and topology, and how much it has advanced in terms of connectivity, it still suffers from a deficiency that makes the world reluctant to rely totally on the Completely Connected paradigm. The progress of the Internet as a mode of communication, and its emergence as a playground for new verticals like social media and e-commerce, has been quite enthralling, yet one concern still makes us wary of it: the amalgamation of all the dark terms associated with it. Crackers, intruders, malware, spam and a plethora of other things have continued, and continue, to taint its image as a completely reliable platform. Banks moved to online accounts, and every single day a multitude of customers lose large sums of money to crackers who break through sophisticated firewalls and ciphers, sneak in and do whatever they want. Spam is so prevalent that it not only causes embarrassment, its most visible ramification, but also wipes out a great deal of computation and storage space across global servers. And then there are DDoS attacks and other ways of breaking into the systems of top-notch organizations like intelligence agencies and the military, theft of tactically critical information from their servers, and trace marks left behind that cause extreme humiliation, and what not. The question is: irrespective of how much cipher and network technologies progress in their own cohorts, will the Internet ever be able to triumph over the evil?
The answer is yes, but what needs to be done is quite a revelation in itself, just like the things we have been doing on the Internet of late. To make the Internet a completely indomitable, invincible, impregnable and extremely reliable medium, we have to take it to the next level: we have to embed intelligence into it, and most importantly, we have to let the Internet develop a conscience of its own. It sounds a little tacky, but the Internet is among the most intriguing of phenomena. The Internet is not merely a set of connected systems and networks across the globe; it is a connection of processes that are extremely varied but still follow a set of protocols. Simply viewing the Internet as a large collection of hardware systems like routers, servers, switches and terminals suggests the notion of a fixed architecture. But if we delve a little into the state of the present Internet, it is more about communication between different processes on different machines, following structured protocols. Now, we can inculcate intelligence in individual processes and systems, but what can actually shield the Internet from the vicissitudes of cyber attacks is vesting decision making and awareness in the Internet as a whole.
As things stand, we have systems that have their own addresses and their own sets of processes guiding them on how to handle packets. There are other, intermediary machines like routers and switches which work specifically on handling and transferring packets across the globe. These processes are very basic in nature and follow protocols and specifications. For example, a firewall knows which applications to block and, at a deeper level, which kinds of network datagrams to block or accept. All in all, these machines essentially work on two types of instructions: one, the framework of a protocol, such as how packets are broken up and transmitted; and two, whatever we explicitly demand of them. We can demand that they block or allow specific application requests, specific IP addresses, specific packet formats and what not. But essentially, the machine still does only what we asked it to do. This is a continuation of what computers have been doing ever since their inception: following instructions. However, in the context of the latest happenings in the field of machine learning, the Internet remains an indispensable utility whose whole visage is rendered dated simply because this philosophy has remained essentially the same. If your machines and even your search engines now have the capability of learning your tastes and preferences and catering to them, why not allow the Internet to learn something too? But the true question is: what should it learn, and from whom?
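To make the distinction concrete, here is a minimal sketch of the second kind of instruction: an explicit, administrator-supplied rule set for a firewall. The rule fields and function name are hypothetical, and real firewalls are far richer, but the point stands — the machine only does what we told it to.

```python
# Explicit firewall rules: the machine exercises exactly what we demanded.
BLOCKED_IPS = {"203.0.113.7", "198.51.100.23"}   # example addresses
BLOCKED_PORTS = {23, 2323}                        # e.g. telnet

def allow_packet(src_ip: str, dst_port: int) -> bool:
    """Return True if the packet passes the explicit rule set."""
    if src_ip in BLOCKED_IPS:
        return False
    if dst_port in BLOCKED_PORTS:
        return False
    return True

print(allow_packet("203.0.113.7", 80))    # False: blocked source address
print(allow_packet("192.0.2.10", 23))     # False: blocked destination port
print(allow_packet("192.0.2.10", 443))    # True: no rule matches
```

Nothing here learns anything; the rule sets change only when a human edits them, which is precisely the limitation the rest of this piece is about.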
The Internet is huge; there is no doubt about it. The amount of information processed and transferred on the Internet per second dwarfs that of any library, and yet the overall modus operandi of the Internet seems somewhat outdated in the context of the systems it connects. If, instead of simply allowing routers to route packets, servers to accept and process requests and firewalls to block access, we allow these systems to learn, to enhance their capabilities and to exercise some decision making, then just imagine how much things might change. Machine learning is present in all modern systems. Operating systems learn about usage patterns and help fine-tune accessibility, search engines learn about preferences, and e-commerce sites learn about your tastes and make offerings accordingly. And all these systems do it very well. The question, then, is: why not allow the Internet to learn?
The Internet's learning cannot be concentrated or centralized; it has to be distributed across the globe. Firewalls, for example, can learn over time how certain suspicious requests keep coming from a specific set of IP addresses, and can subsequently alert the administrator about a prospective intrusion attempt from that address set. Or routers, which already run on routing algorithms, could start learning a little more about the packets themselves: if packets of a specific type are delivered more successfully along a particular route, the router should accord some preference to that route for that kind of packet. In this way, a router would exercise not only its usual logic and the current network status but also the knowledge it has been gathering in the meanwhile. Similarly, servers can learn about the requests for specific resources on a particular website and accordingly suggest modifications to their own caching policies. The possibilities in this realm are endless. But in the larger picture, it is about making the Internet aware of its own presence.
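The firewall example above can be sketched very simply: tally suspicious requests per source address and flag an address once a threshold is crossed. The class name, the threshold, and what counts as "suspicious" are all assumptions for illustration, not a real product's design.

```python
from collections import Counter

class LearningFirewall:
    """Toy firewall that remembers which addresses keep misbehaving."""

    def __init__(self, threshold: int = 3):
        self.threshold = threshold
        self.suspicious_counts = Counter()  # per-source tally
        self.flagged = set()                # addresses worth alerting on

    def observe(self, src_ip: str, suspicious: bool) -> None:
        """Record one request; flag the source once the threshold is hit."""
        if suspicious:
            self.suspicious_counts[src_ip] += 1
            if self.suspicious_counts[src_ip] >= self.threshold:
                self.flagged.add(src_ip)  # here we would alert the admin

fw = LearningFirewall(threshold=3)
for _ in range(3):
    fw.observe("203.0.113.7", suspicious=True)
fw.observe("192.0.2.10", suspicious=False)
print(fw.flagged)  # {'203.0.113.7'}
```

Even this trivial memory is a step beyond the static rule sets firewalls usually run on: the block list grows out of observed behaviour rather than a human editing a configuration file.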
The major motivation for something like this comes from an article on the resemblance between the human brain and the Internet. The article suggests that the two are counterparts in some respects. Just as the human brain is a connected network of networks of neurons, the Internet is a connected network of networks of systems. And just as certain neurons assume higher responsibilities within the brain, certain systems (servers, routers and so on) assume a more central role on the Internet. However, while the human brain is both aware of its existence (the conscious) and unaware of it (the subconscious), the Internet is largely subconscious, and this is the very reason why systems on the Internet are susceptible to attacks.
If, by virtue of ingenious learning methods like cognitive learning and other mechanisms, the Internet's individual systems start learning about themselves, the Internet as a whole will begin to learn too. However, it is more about the dissemination of this knowledge than about the learning itself. A router that learns that packets of a specific pattern get delivered more successfully along certain routes needs to relay this information to the other routers on the path; otherwise, only one packet hop will be affected. Similarly, a credit card payment gateway which blocked certain malicious requests courtesy of its own algorithms should immediately acquaint the routers and firewalls with this knowledge, so that they can block the next intrusion attempt at the perimeter itself. The systems need to learn from patterns, from failures, from human intervention and from preferences as well. All in all, the systems need to learn and to pass the information among themselves. Only then will the Internet get a second level of thinking. The deeper level will hold the usual instructions for packet hopping, packet blocking and all the commonplace things, but above it will be a layer of conscience that allows the Internet to think, to learn and then to act without the need for human beings to guide it. Only then will the proponents of the Internet be in a position to promulgate its reliability, and only then will the Internet truly mimic the human brain and develop its own Conscience.
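The gateway-to-perimeter dissemination described above amounts to a publish/subscribe pattern, which can be sketched in a few lines. Every class and method name here is hypothetical; the point is only the shape of the flow — one system learns, the others inherit the knowledge.

```python
class PerimeterDevice:
    """Toy stand-in for a router or firewall at the network edge."""

    def __init__(self, name: str):
        self.name = name
        self.blocklist = set()

    def learn(self, bad_ip: str) -> None:
        """Adopt knowledge published by another system."""
        self.blocklist.add(bad_ip)

class PaymentGateway:
    """Toy gateway that shares its blocking decisions with subscribers."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, device: PerimeterDevice) -> None:
        self.subscribers.append(device)

    def block(self, bad_ip: str) -> None:
        # Block locally, then disseminate the knowledge to the perimeter
        # so the next attempt is stopped before it ever reaches us.
        for device in self.subscribers:
            device.learn(bad_ip)

edge_router = PerimeterDevice("edge-router")
firewall = PerimeterDevice("firewall")
gateway = PaymentGateway()
gateway.subscribe(edge_router)
gateway.subscribe(firewall)
gateway.block("203.0.113.7")
print(firewall.blocklist)  # {'203.0.113.7'}
```

In a real deployment this sharing would of course need authentication and a common format for threat intelligence, but the sketch captures the essay's point: learning matters most when it propagates.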
Tushar Kumar Singh(@cyberdyne_101)
Writer At – Future Perfect