Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/271328 
Authors: 
Year of Publication: 
2023
Citation: 
[Journal:] Internet Policy Review [ISSN:] 2197-6775 [Volume:] 12 [Issue:] 1 [Year:] 2023 [Pages:] 1-30
Publisher: 
Alexander von Humboldt Institute for Internet and Society, Berlin
Abstract: 
Privately developed artificial intelligence (AI) systems are frequently used in smart city technologies. The negative effects of such systems on individuals' human rights are increasingly clear, but we still only have a snapshot of their long-term risks to human rights. The central role of AI businesses in smart cities places them in a key position to identify, prevent and mitigate the risks posed by smart city AI systems. The question arises as to how such preventive responsibilities are articulated in international and European governance initiatives on AI and corporate responsibility, respectively. This paper addresses this question in relation to three initiatives: (1) the Organisation for Economic Co-operation and Development's 'Business and Finance Outlook 2021: AI in Business and Finance'; (2) the EU's proposed 'AI Act'; and (3) the EU's 'Proposal for a Directive on corporate sustainability due diligence'. The paper first discusses the role of private AI developers in smart cities and the relevant limitations of applicable legal frameworks (section 1). Section 2 categorises the long-term risks to human rights posed by the private development of smart city AI systems. Section 3 discusses how the preventive responsibilities set out in the three initiatives reflect considerations of long-term risks. Critical observations and recommendations are provided in section 4, and conclusions are drawn in section 5.
Subjects: 
Smart cities
Artificial intelligence
Human rights
AI Act
Corporate sustainability
Persistent Identifier of the first edition: 
Creative Commons License: 
CC BY
Document Type: 
Article

Files in This Item:
File size: 316.99 kB