
Managing the Promise and Threat of Emerging Technologies

By Tamar Rahamimoff-Honig and Keren Shahar

BESA Center Perspectives Paper No. 2,212, September 12, 2023

EXECUTIVE SUMMARY: Emerging and disruptive technologies (EDTs) are developing at a rapid pace, offering many opportunities while also raising challenges and concerns. Their innovative nature, technological complexity, and wide range of applications require that economic, social, and national security concerns be taken into consideration as they spread. As technology is neutral, states should strive to agree on the ways in which EDTs can be used responsibly and find appropriate means to regulate them.

Emerging and disruptive technologies (EDTs) are crucial for human development. Tremendous opportunities and advantages emanate from them, and they are both a product of and a catalyst for social and economic innovation. They are force multipliers that can assist humanity across a wide range of fields, including food security and climate change. However, the advent of some technologies, such as AI (artificial intelligence), has raised concerns about potentially grave implications for humanity, prompting calls for regulation and oversight.

On March 22, 2023, a group of prominent developers, researchers, and technological experts, among them Elon Musk of SpaceX, Tesla, and X (Twitter), Apple co-founder Steve Wozniak, and Tom Gruber of Siri/Apple and Humanistic AI, published an open letter calling for a six-month pause by all labs that train AI systems more powerful than GPT-4. They expressed concern that AI systems with human-competitive intelligence can pose profound risks to society and humanity by creating digital minds that no human can understand, predict, or reliably control. They suggested that this pause be used to develop safety protocols for advanced AI design that would ensure that these systems and their use are safe. They also recommended that AI governance systems be developed to regulate and oversee AI.

After the publication of this letter, several other initiatives and statements followed, addressing the need to exercise caution and consider possible limitations on AI. A message of concern was voiced by the UN Secretary-General in his Policy Brief No. 9, “A New Agenda for Peace,” published in July 2023. In it, the Secretary-General identifies AI as both an enabling and a disruptive technology that is increasingly being integrated into a broad array of civilian, military, and dual-use applications, and warns that this rapid integration could have unforeseen implications.

Concerns about AI as well as other EDTs are also being voiced in multilateral fora that address arms control, disarmament, non-proliferation, and export control. In these fora, some recommend that states restrict the development and use of EDTs by establishing norms, standards, procedures, or mechanisms that would ensure that EDTs provide opportunities and benefits while minimizing their inherent challenges. Doomsday scenarios have been raised in this context, such as killer robots that carry out lethal attacks devoid of any mitigating human interface.

For the international community to appropriately assess the opportunities and challenges associated with EDTs and decide whether norms, standards, procedures, or mechanisms are necessary, several issues need to be considered.

First, there is no internationally accepted definition of the term EDT. The US, for example, has published a Critical and Emerging Technologies List, which is not identical to NATO’s Strategy on Emerging and Disruptive Technologies. It can be argued that any effort to create a comprehensive list of EDTs is futile, and that any such list would be obsolete the moment it was compiled due to the rapid pace of technological development. While this may be true, the international community must be specific about which technologies it seeks to regulate. For the time being, it is taking a piecemeal approach to the matter of responsible state behavior regarding EDTs such as cyber, AI, and space technologies.

Another challenge is that differentiating between civilian and military applications of EDTs is difficult, if not impossible. For example, facial recognition technology can be used in smartphones and in counter-terrorism applications, and electronic noses can detect cancer as well as explosives. The setting of norms and standards must therefore take into account the wide range of considerations pertaining to both the security and civilian contexts. The challenge in this respect is to establish responsible regulation that does not hinder global social and economic development. Regulation of EDTs that is too stringent could have adverse effects on human prosperity and sustainable development. In this respect, EDT regulation walks a very thin line.

An additional challenge related to the regulation of EDTs pertains to the difficulty of differentiating between malicious and legitimate use. While technology is neutral in and of itself, it can often be used for both good and ill. Given the innovative nature and dual-use character of EDTs, it might not always be possible to identify in real time whether an intended use is benign. In this context, it should be noted that the military/civilian differentiation is not helpful in determining the legitimate use of EDTs, as military applications are not necessarily malign and civilian applications are not necessarily benign. Examples include military EDTs that are used for the verification of arms control treaties and cyber technologies that are used for ransomware.

These challenges are compounded by a basic lack of technological literacy among states in the field of EDTs. The evolving and complex technological nature of EDTs, as well as the unique professional skills required to keep abreast of developments, creates a challenge even for the most technologically savvy states.

While there are no clear-cut responses to EDT-related challenges as yet, there are burgeoning ideas in areas such as ethics, arms control, and export control.

In the ethical sphere, initiatives like the OECD’s Global Partnership on Artificial Intelligence (GPAI) are intended to set ethical standards for the use and development of AI consistent with human rights, fundamental freedoms, and shared democratic values as reflected in the OECD’s recommendations on AI. Another example is the UNESCO Recommendation on the Ethics of Artificial Intelligence, which is based on human rights and dignity and the advancement of principles such as transparency and fairness in the human oversight of AI systems. In addition, there are UN initiatives in the spheres of cyber and outer space that aim to define responsible state behavior, which includes ethical aspects.

With respect to arms control, it should be recognized that for a variety of reasons, including global geopolitics, it is very difficult to reach legally binding international agreements. States have increasingly tried to address arms control challenges through softer measures such as non-legally binding norms, recommendations, and best practices. The issue of EDTs is no exception. The UN Open-Ended Working Group (OEWG) on Security of and in the Use of Information and Communications Technologies, which focuses on cyber, and the UN OEWG on Reducing Space Threats Through Norms, Rules and Principles of Responsible Behaviors are prominent examples. One idea discussed in both fora is the adoption of cooperative measures to address common threats; another concerns what states can expect of one another in terms of information-sharing in these fields. In light of states’ significant national security and economic interests with regard to EDTs, even softer measures on transparency and information-sharing are not a given. It seems that arms control measures concerning EDTs may be premature.

Lastly, the application of export controls to EDTs as a means of preventing their proliferation among irresponsible state and non-state actors is similarly difficult. It is a constant challenge to keep export control lists up to date with technological advancements. Because consensus must be reached, the updating process of multilateral export control regimes is slower than the advancement of emerging technologies, leaving controls behind the curve. The current global export control architecture is insufficient to fully address concerns about the spread of EDTs, and the gap between controls and emerging technologies is likely to grow. States must therefore seek better, more dynamic, and more flexible ways of controlling these items, such as creating relevant catch-all provisions in national legislation and designing appropriate protocols.

The innovative nature, technological complexity, and wide range of applications of EDTs have implications for state economies, societies, and national security. Concerns regarding AI and other EDTs can be lessened if states agree on the right balance between rapid development and regulation. Technology is inherently neutral, and it is important that states not stifle human prosperity and sustainable development with measures that are too stringent. At this time, arms control measures seem to be premature and insufficiently flexible given the rapid pace of technological change.


Tamar Rahamimoff-Honig is Deputy Head of Division for Strategic Affairs and Keren Shahar is Senior Deputy Legal Adviser at the Israeli Ministry of Foreign Affairs. The views expressed herein are the authors’ own.
