By Jonathan Agbo
In a press release dated the 13th June, 2022, the National Information Technology Development Agency (NITDA) announced the release of a draft Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries. The aim of the code as announced by NITDA was to “protect the Fundamental Rights of Nigerians and Non-Nigerians living in Nigeria as well as defining guidelines for interacting on the digital ecosystem”.
The draft code contains provisions that, when it comes into effect, will regulate the dissemination of information on internet-enabled platforms. This article reviews the salient provisions of the draft code and interrogates their net effect on the freedom of expression online.
As a starting point, it is relevant to note that the right of Nigerians to express themselves is a constitutionally guaranteed right that must be protected. As such, any law, whether substantive or subsidiary (as the draft code will become upon coming into effect), that derogates from or purports to take away that right in an arbitrary manner will be unconstitutional. The right to freedom of expression includes the right to express oneself online. That this is the case is readily seen in Section 39(2) of the Constitution of the Federal Republic of Nigeria, which provides that every citizen shall be free to establish, own and operate any medium for the dissemination of information, ideas and opinions. It follows that “any medium” includes online mediums facilitated by the internet.
Any law which regulates the dissemination of information must therefore conform to the above right and not detract from it. A corollary is that every Nigerian who uses any medium, whether physical or online, to express himself is simply exercising a constitutionally guaranteed freedom. There is, however, a concomitant duty on persons exercising their freedoms to ensure that, while doing so, they do not violate the freedoms of others. It is in this respect that Section 45(1)(b) of the Constitution allows relevant agencies of government to protect the rights and freedoms of other individuals, but with a caveat: any action that derogates from the constitutionally guaranteed freedoms must be “reasonably justifiable in a democratic society” and must be taken in accordance with laid-down rules of procedure. This principle has been given effect in several judicial pronouncements.
Presumably, the draft code of practice was made pursuant to Section 45(1)(a) and (b) of the Constitution of the Federal Republic of Nigeria, as stated in the objectives of the code. A review of the draft code, however, shows that if it comes into effect as is, it will make easier both a blatant disregard for constitutional safeguards and the stifling of civic engagement online.
The first pointer to the above conclusion is the definition section of the draft code. Amongst other things, it defines harmful content as “content that is not unlawful but harmful”. This definition is an attempt to make a distinction without a difference. By simply stating that harmful content is content that is not unlawful but harmful, the code actually defines nothing, for it is difficult to conceive of an action that harms someone, within the context of the definition in the code, that does not carry with it some legal implication, whether civil or criminal. The implication is that the definition of harmful content in the code is either a deliberate attempt to circumscribe the dissemination of information through online mediums or a default that inadvertently leaves the implementation of the code to the whims and caprices of individuals. In practical terms, if the definition of harmful content in the draft code is allowed to stand, any person who considers a content “harmful”, even when no known law has been violated, may complain under the code. That this is the case is seen in the fact that the draft code draws a distinction by separately defining unlawful content as “any content that violates any existing law in Nigeria” and prohibited materials as “materials that are objectionable on grounds of public interest, morality, order, peace or otherwise prohibited under any applicable Nigerian law”. I am therefore inclined to believe that the definition of harmful content under the code is a deliberate attempt to stifle freedom of expression.
Part One of the draft code spells out the responsibilities of interactive computer service platforms/internet intermediaries. Platforms are required to take down, within 24 hours of receiving a complaint from either a user or a relevant government agency, content that is unlawful or that non-consensually exposes a person’s private parts, sexual acts or revenge porn, amongst others. While the objective is presumably to protect the privacy rights of others, it is my view that the obligations imposed on platforms under this part have the capacity to violate other users’ rights to fair hearing and freedom of expression in the process. The code makes no provision for the owners of such content to make representations in its defence before it is pulled down. It simply imposes an obligation on platforms to take down a content or publication within 24 hours of receiving a complaint from either a user or a relevant government agency. Curiously, the platform is thereafter expected, after taking down the content, to conduct an assessment, in a manner akin to putting the cart before the horse. The above, apart from being a direct contravention of the rights to fair hearing and freedom of expression, also directly contradicts the Nigerian Communications Commission’s (NCC) Internet Code of Practice. Clause 7 of the NCC code, for instance, makes extensive provision for dealing with unlawful content.
Clause 7.1 of the NCC Internet Code of Practice provides that “An Internet Access Service Provider is generally under no obligation to monitor content which it stores or transmits when providing Internet Access Services, nor under any obligation to seek facts or circumstances indicating unlawful activity, except when acting under instruction from the Commission or relevant law enforcement agency”. Clause 7.2 then mandates internet service providers to provide “clear and adequate direction to customers or users for reporting unlawful content to the commission”. The Commission is then required to determine whether the content is indeed unlawful before issuing a take-down notice to the internet service provider. Fundamentally, clause 7.3(b) of that code provides a right of appeal against the Commission’s decision to issue a take-down notice against any content. While it has been argued that even the NCC’s Internet Governance Code does not make adequate provision for fair and equal representation before a take-down notice is issued, it at least requires providers, first, to issue comprehensive directions that are clear and adequate on how complaints against unlawful content should be made. In my view, such directions would take into account the rights of the owners of such content, which are also to be protected. There is then a multi-level procedure that includes making a formal complaint to the Commission, which must determine that the content is indeed unlawful, and a right of appeal against such decisions. Implied in all of this is the acknowledgement that the process takes some time, hence the omission of time obligations from that code.
A better approach under the draft code would have been to follow the pattern established under the NCC Internet Code of Practice, with a possible improvement: allowing the creator of any content complained against to make representations in its defence before a decision is taken. Another option would be for the complainant to approach the court to obtain an order placing a direct legal obligation on a platform to take down content that the court deems unlawful. Generally, before such an order is obtained, the applicant would be expected, at least, to have placed sufficient materials before the court from which the court can determine whether the content in question is indeed unlawful. This route also ensures that the content creator’s right to be heard is protected and preserved.
Part II (2) of the draft code creates additional responsibilities on platforms to inform users not to, amongst others, share, publish or promote any content that infringes the intellectual property rights of others or of which such users are not the lawful owners. Paragraph (e) is, in my view, a repetition of paragraph (d) of the draft code: any unauthorised use of content created by another is a violation of the intellectual property rights of a third party. Again, the requirement to prevent the publication of, or take down, content deemed prohibited under Part IV of the draft code within 24 hours of being notified of its existence is a potential violation of the freedom of expression online, the right to free speech and the right to fair hearing. As earlier posited, there can be no proper take-down notice to a platform without first giving the creator of the content under scrutiny the right to be heard, or at least without the platform reviewing the content in line with its internal standards and community rules before a decision is taken. Part IV of the draft code, for instance, references several laws to which a platform must have recourse in determining whether content is prohibited. Most of the laws referenced in the draft code, however, leave no doubt that judicial or due process respecting the right of individuals to representation must be followed before the sanctions they provide can apply. For example, offences created under the Cybercrimes Act, 2015, referenced in Part IV of the code, can only be punished upon conviction by a competent court of law. Indeed, many provisions of that Act are not only obnoxious but severely restrict the right to free speech and have been misused by authorities to limit press freedom in Nigeria.
The fact that the code, when it becomes operative, will apply to media organizations makes it even more pertinent to effect changes that preserve the right to free speech online before it becomes effective. It is well known that Nigeria does not enjoy full internet freedom and is bedeviled by numerous challenges in that regard. Wholesale application of the code will therefore only serve to constrict the already shrinking civic space and further reduce civic engagement.
It is therefore suggested that the draft code be further reviewed with the particular purpose of protecting the freedom of expression online and engendering respect for the other constitutional safeguards available to internet users in Nigeria.
Jonathan Agbo is a Digital Rights advocate and can be reached at firstname.lastname@example.org
https://techcabal.com/2022/06/14/nigeria-seeks-to-regulate-social-media-with-new-nitda-code-of-practice/ retrieved on 15th June, 2022.
 See EYINNAYA V STATE (2014) LPELR 22924 (CA).
 Part I (3) and (4) of the draft code.
Part II (2) (d) and (e) of the draft code.