I envision that, in the future of the Internet, cyber laws, information security, and governance will be very closely based on these ideas. Depending on the point of view or the context, I have termed these principles the Information Chaos, Information Engineering, or Information Sanity Principles.
Bifurcation
This applies to any form of temporary or virtual divergence of information. It would apply mainly to the entry points of the internet, such as search engines, social networks, directories, and e-commerce listings. Although it can be applied to larger sets of data, it can equally be applied to logically related subsets. One example would be to deduce the context of each type of information search and apply it, ensuring a clear demarcation of the types of data actually available. Another aspect would be to use the behavior of the user to infer relevance within a user workflow. This has to be implemented through configuration or through centralized information exchanges between the participating or critical websites. In general, a repository holding every netizen's usage habits and information would be the ideal sub-solution to bifurcation.
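As a minimal sketch of the idea, the example below routes search results into context buckets inferred from the query. All names here (`infer_context`, `bifurcate`, the keyword sets, and the `category` field) are illustrative assumptions, not part of the article; a real system would infer context from user behavior and centralized exchanges rather than keywords.

```python
# Hypothetical sketch: bifurcating search results by inferred query context.
# Keyword sets and category labels are placeholders, not a real taxonomy.

def infer_context(query: str) -> str:
    """Guess a coarse context bucket from keywords in the query."""
    commerce_terms = {"buy", "price", "deal", "shop"}
    academic_terms = {"paper", "journal", "thesis", "study"}
    words = set(query.lower().split())
    if words & commerce_terms:
        return "e-commerce"
    if words & academic_terms:
        return "academic"
    return "general"

def bifurcate(results, query):
    """Keep only results whose declared category matches the inferred context."""
    context = infer_context(query)
    if context == "general":
        return results  # no demarcation possible; pass everything through
    return [r for r in results if r.get("category") == context]
```

The point of the sketch is the demarcation step: once a context is deduced, only data of that type is surfaced, which is the "clear demarcation" the section describes.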
Prevention
There are already mechanisms in place that loosely prevent out-of-track information from being added to a given set. This applies to simple form data, images, documents, videos, music, and many other data forms. With more efficient automated (robotic) analysis, we should be able to prevent content such as terrorism-related material, adult content, religious violence, and racist remarks. These rules should be made available by an efficient regulatory body, depending on the type of site. This would also help reduce information chaos by not allowing unwanted, automated, or out-of-context information to build up, and it would clearly ease the process of bifurcating information. Regulated prevention would also mean that we are able to provide a uniform barrier across the cyber society.
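A toy version of such a prevention gate might look like the following. The rule table and the matching logic are purely illustrative assumptions; in the article's model, the actual rules would be supplied by the regulatory body per site type, and real analysis would be far more sophisticated than keyword matching.

```python
# Hypothetical sketch: a rule-based prevention gate for user submissions.
# BLOCKED_RULES stands in for rules a regulatory body would publish.

BLOCKED_RULES = {
    "profanity": {"damn"},   # placeholder terms only
    "adult": {"xxx"},
}

def check_submission(text):
    """Return (allowed, violated_categories) for a piece of form data."""
    words = set(text.lower().split())
    violations = [cat for cat, terms in BLOCKED_RULES.items() if words & terms]
    return (not violations, violations)
```

Running every submission through such a gate before storage is what keeps out-of-track information from ever entering the set, which in turn simplifies the bifurcation step described earlier.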
Reporting
The simplest and most remedial measure to counter the internet's many information issues is to introduce more effective cyber policing. This must be enforced first by making the information relevant to reporting readily available to the user. An ordinary user must also have ready tools and links for the reporting mechanisms. These mechanisms need to be placed at accessible locations and must allow detailed descriptions to be captured for further analysis. Finally, by creating a new wing of the current cyber reporting and analysis mechanisms, one that acts more swiftly, whether online or on the ground, we remove fear from the users.
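The intake side of such a mechanism could be as small as a report record plus a queue, so that any page can expose a one-click reporting tool that still captures enough detail for later analysis. The `Report` and `ReportQueue` names and fields below are hypothetical, chosen only to mirror the categories the article lists.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: minimal report intake for cyber-policing mechanisms.

@dataclass
class Report:
    url: str
    category: str        # e.g. "bullying", "fraud", "terrorism"
    description: str     # detailed description for further analysis
    filed_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

class ReportQueue:
    def __init__(self):
        self._queue = []

    def file(self, report):
        """Accept a report and return its ticket number."""
        self._queue.append(report)
        return len(self._queue)

    def pending(self, category):
        """List unprocessed reports of one category for the analysis wing."""
        return [r for r in self._queue if r.category == category]
```

The ticket number returned to the user is part of removing fear: the reporter gets immediate, traceable acknowledgment rather than silence.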
Classification
Governments across the world, especially while framing cyber laws for the newer internet, now have to follow stricter registration procedures for the internet. This may include web startups, dotcoms, information sites, social networks, and even academic websites. It will help curb non-genuine sites, fraudsters, scammers, and malware sites. By appointing city, state, zonal, offline, online, and national ombudsmen, we will be able to enforce policing quickly. In addition, a dedicated team of internet information providers tracking live information would let us classify information with much greater ease. As time progresses, we should also stop allowing domain registration and hosting processes that are too easy and bypass the ombudsman. The additional tasks of each ombudsman office would be certification, classification, ranking, and maintaining historical data for its zone.
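A zonal ombudsman's bookkeeping, registration, certification status, and per-domain history, can be sketched as a tiny registry. `SiteRegistry` and its fields are assumptions made for illustration; the article does not prescribe a data model.

```python
# Hypothetical sketch: a zonal site registry for the ombudsman model.
# Tracks registration, certification status, and historical records per domain.

class SiteRegistry:
    def __init__(self, zone):
        self.zone = zone
        self._sites = {}

    def register(self, domain, site_type):
        """Record a new domain; every domain starts out uncertified."""
        self._sites[domain] = {"type": site_type, "certified": False,
                               "history": ["registered"]}

    def certify(self, domain):
        """Mark a domain as certified by the zone's ombudsman office."""
        self._sites[domain]["certified"] = True
        self._sites[domain]["history"].append("certified")

    def is_certified(self, domain):
        return self._sites.get(domain, {}).get("certified", False)
```

The `history` list is the seed of the "historical data of its zone" duty: every ombudsman action on a domain is appended, never overwritten.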
Relevance
Software engineers and architects should use novel practices to create more relevance in the data that is accessible to the user. With stricter laws in place, we should provide only the right type of data operating within a context. The engineers should also be able to maintain a pool of intelligent information within their own applications or software, allowing them to retire data sets that fall out of line; they may also choose to provide this data to other developers. By automating most of the other tasks related to bifurcation, prevention, classification, and reporting, they would be engineering more relevant software and thereby a more relevant internet. Software and internet architects will be required to coordinate and also obtain the relevant certifications for their own properties. They also need to envision a way for implementers to apply all of the required policies with the greatest ease. The software would also need standard integration points to quickly circulate information among common or related websites.
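The pool-and-retire idea can be sketched as a structure that scores data sets and drops those whose relevance decays below a threshold. The class name, the decay model, and the threshold are all assumptions for illustration; the section leaves the actual relevance measure open.

```python
# Hypothetical sketch: an application-level pool that retires data sets
# whose relevance score decays below a threshold.

class RelevancePool:
    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self._pool = {}        # data set name -> relevance score
        self.retired = []      # names retired from the pool

    def add(self, name, score):
        self._pool[name] = score

    def decay(self, factor=0.9):
        """Age every score, then retire anything now below the threshold."""
        for name in list(self._pool):
            self._pool[name] *= factor
            if self._pool[name] < self.threshold:
                self.retired.append(name)
                del self._pool[name]

    def active(self):
        return sorted(self._pool)
```

Retired sets are kept by name rather than deleted outright, matching the idea that engineers may still choose to hand such data on to other developers.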
The problems addressed by each principle can be classified as follows, curbing the information mess and preventing anti-patterns from forming:
Bifurcation: Irrelevance, Non-Context Info, Malware, Adware, Ad-Driven Linking…
Prevention: Underage Access, Adult Content, Unsolicited Requests, Religious Violence, Profanity, Racism…
Reporting: Bullying, Financial Frauds, Misleading Mails, Chain Mails, Terrorism, Frauds…
Classification: Pay-per-Click, Pay-per-Visit, Unsolicited Meetings, Phishing, Scams…
Relevance: Identity Theft, Misconduct, Irrelevance, Racism, Threats…
This article is part of the series [SKP's Novel / Innovative Concepts]. A more detailed research paper is available as a working copy in the National Journal for Computer Science and Technology, Vol. 5, Issue 1 (2014), Narsinbhai Institute of Computer Studies and Management, Gujarat, India.