2023 has been a busy year for the Institute’s Technology Interest Group (TIG). This article reviews the latest guidance notes issued by the TIG looking at the roles of governance professionals in handling a range of tech risks and opportunities.

Highlights

  • governance professionals can add significant value by keeping boards aware of the ethical issues to consider when deploying AI tools
  • cybersecurity risks need to be taken seriously by boards and governance professionals can play a key role in making sure that this issue gets the attention it deserves
  • preventing all cyberattacks is unlikely to be possible, so the focus should also be on resilience, with the aim of achieving ‘zero disruption’ from such attacks

In July 2016, the Institute set up its Interest Groups under the Technical Consultation Panel to produce guidance notes on topics relevant to the work of governance professionals. One of those groups – the TIG – has since published 11 guidance notes, available in the Thought Leadership section of the Institute’s website, on a range of topics to help practitioners stay up to date with technology governance issues. 

This year saw the addition of four guidance notes to the TIG series. This article reviews these latest additions, which look at the deployment of artificial intelligence (AI) tools, the management of cyber risks and the regulation of virtual assets in Hong Kong.

AI and the governance professional

The use of AI tools is already fairly ubiquitous in Hong Kong and globally. Such tools are being used to improve customer interfaces, to increase productivity and operational effectiveness, and to support strategic planning and decision-making.

The most recent guidance note published by the TIG – An Overview of Managing the Risks and Opportunities & Responsible Deployment of AI Tools (11th in the series) – recognises that AI tools are ‘intimately linked to innovation and competitive advantage’. Published in August 2023, the guidance emphasises that the deployment of AI tools carries both risks and opportunities, and that governance professionals have a crucial role in advising directors and executives on how these can be best managed. 

‘The responsibility of governance professionals is to assist directors in understanding the risks and opportunities of deploying AI, creating policies and procedures for proper risk management, and ensuring implementation consistent with the business’s purpose and values,’ the guidance says. 

It goes on to highlight the major risks governance professionals should be aware of. First among these is the potential risk of bias in AI tools. 

‘ChatGPT and other AI systems learn from a massive amount of data, and if that data contains any biases, the AI system may unintentionally reinforce those biases. It is critical to take proactive measures to combat prejudice and implement plans to ensure that decisions are made fairly,’ the guidance says. 

The protection of data privacy is another key risk to consider. Organisations need to be aware, for example, of the growing body of legislation and regulation concerning data privacy protection. 

‘Companies must manage data ethically since AI depends on it. Strong data protection measures are required to protect user information and adhere to privacy laws, such as encryption, anonymisation and secure storage,’ the guidance says. 

It also addresses another, perhaps less well-recognised, area of risk resulting from the potential social impacts of AI tools. The most obvious example of this would be the displacement of, or discrimination against, workers. Governance professionals should not, therefore, see their contribution solely in terms of helping organisations to remain compliant with relevant legislation. This is of course important, but practitioners can add significant value by keeping boards aware of the ethical issues to consider.

‘As a governance professional, your responsibility is to steer businesses towards a future in which AI and human values coexist peacefully, enhancing society and promoting responsible digital transformation,’ the guidance says.

Managing cyber risks and opportunities

In February this year, Ada Chung Lai-ling FCG HKFCG, Privacy Commissioner for Personal Data, Hong Kong, wrote an article in this journal discussing the increasing trend of cyberattack incidents in Hong Kong and globally. She pointed out that cyberattacks are not only increasingly common, but are usually also highly destructive, both in terms of an organisation’s IT system integrity and its reputation.

In this context, governance professionals have a role to play in helping to build effective policies and governance frameworks to defend against, and minimise the disruption caused by, cyberattacks. The 10th issue in the TIG series of guidance notes – An Overview to Facilitate Boards to Manage Cyber Risks – emphasises that cybersecurity risks need to be taken seriously by boards and that governance professionals can play a key role in making sure that this issue gets the attention it deserves.

‘All company boards… must include the management of cyber risk as part of their fiduciary and oversight responsibilities. That is, a reasonable director should be concerned with cyber risk, which consistently is ranked as a top-of-the-agenda risk matter that boards should consider,’ the guidance says. 

This does not just mean keeping directors aware of the cybersecurity protections in place, however. The guidance emphasises that boards need to take on an active oversight role. ‘Board members must assume that cyberattacks are likely and exercise their oversight responsibilities to actively ensure that executives and managers have made adequate preparations to respond to, and recover from, these attacks,’ it says.

Building bridges between management and the board 

The TIG guidance also emphasises the benefit of keeping an open dialogue between management and the board on cybersecurity. ‘The governance professional should facilitate discussions of cybersecurity frequently and actively with management. This is not a “one and done” type of decision, but a constantly shifting and moving target. The more regularly the board is exposed to their organisation’s cyber situation, the more comfortable and knowledgeable they become,’ it says. 

This also means ensuring board members have access to the organisation’s cybersecurity experts. The guidance suggests that, while inviting these cyber executives to report to the board is a good first step, governance professionals should consider other ways to deepen their relationship. ‘The time to build the bridge is not during a cyber incident; this should occur well before difficult conversations are necessary,’ it adds.

Consider resilience in addition to protection

New cyberthreats continue to emerge and the TIG guidance emphasises that, even with the best possible technological safeguards, not all cyberattacks can be thwarted. ‘Therefore, in our view, the ultimate objective of an organisation should be “zero disruption” from a cyber breach. This shifts the emphasis from protection to resilience when designing a cybersecurity programme,’ the guidance says.

Virtual asset regulation in Hong Kong

Hong Kong seeks to position itself as a major market for virtual assets (VAs) and VA-related products. This was made explicit by the Policy Statement on Development of Virtual Assets in Hong Kong published by the Financial Services and the Treasury Bureau in October 2022. 

Nevertheless, the collapse of FTX, one of the world’s largest crypto exchanges, in November 2022, demonstrated the potential risks to investors. The Securities and Futures Commission (SFC), the regulator of virtual asset service providers (VASPs) in Hong Kong, has been issuing statements and press releases to warn investors about the risks involved. 

On 13 December 2022, the SFC issued a statement warning that investors may suffer significant, or even total, losses in the event of fraud or the collapse of a VA platform. More recently, on 7 August 2023, it issued a press release reminding investors to be wary of the risks of trading virtual assets on unregulated VASPs.

Hong Kong’s new regulatory regime

Hong Kong is currently implementing a new regulatory regime to bring VASPs under SFC regulation. The Anti-Money Laundering and Counter-Terrorist Financing Ordinance (AMLO) was amended in December 2022 to introduce a new licensing regime for VASPs to be supervised by the SFC. The amendments also introduced statutory anti-money laundering and counter-financing of terrorism obligations that will be highly relevant to governance professionals working for trust or company service providers (TCSPs).

While the amended AMLO became effective earlier this year, the new licensing regime is still in the process of being implemented. From 1 June 2023 to 29 February 2024, unlicensed VASPs may submit licence applications to the SFC. VASPs operating a VA exchange in Hong Kong that have not applied to the SFC for a licence on or before 29 February 2024, and VASP licence applicants that have been issued a rejection notice, will be required to close down their VA exchange business in Hong Kong by 31 May 2024, or within three months of the issuance of the rejection notice, whichever is later.

What will this mean for governance professionals? 

As part of its commitment to assist governance professionals to stay up to date with these developments, the TIG issued a guidance note in two parts (eighth and ninth in the series) earlier this year, looking at how the new regulatory regime for VASPs will impact governance professionals. 

The expansion of the VA market will impact governance professionals in various sectors, but there are already direct impacts to be considered in the financial sector. For example, financial institutions (including licensed VASPs under the amended AMLO) will need to perform customer due diligence (CDD) measures before carrying out an occasional transaction that is a transfer involving VAs amounting to no less than HK$8,000, whether the transaction is carried out in a single operation or in several operations that appear to be linked. Moreover, in comparison with other financial institutions, licensed VASPs will need to perform CDD measures for a much broader range of occasional transactions. 

The guidance note also highlights the new criminal offences relating to fraud involving VAs. These offences potentially carry severe penalties. The offence involving fraudulent or deceptive devices in VA transactions, for example, carries a fine of up to HK$10 million and 10 years’ imprisonment on conviction on indictment, and a fine of up to HK$1 million and three years’ imprisonment on summary conviction. The offence of fraudulently or recklessly inducing others to invest in VAs carries a fine of up to HK$1 million and seven years’ imprisonment on conviction on indictment, and a fine at level 6 and six months’ imprisonment on summary conviction.

These criminal offences will be applicable to any person, regardless of whether that person is providing a VA service or not, and will also be applicable to overseas VA exchanges that are not licensed by the SFC. 

The guidance notes reviewed in this article are available in the Thought Leadership section of the Institute’s website: www.hkcgi.org.hk.  

SIDEBAR: Credits

The Institute thanks the members of the Technology Interest Group (TIG) and external authors who worked on the guidance notes reviewed in this article. The members of the TIG are: Dylan Williams FCG HKFCG (Chair), Ricky Cheng, Harry Evans, Gabriela Kennedy and Philip Miller FCG HKFCG. Mr Williams authored the 10th issue guidance note on managing cyber risks. Mohan Datwani FCG HKFCG(PE), Institute Deputy Chief Executive and Secretary of the Institute’s Interest Groups, authored the 11th issue guidance note on AI tools. Hannah Cassidy, Partner, Natalie Curtis, Partner, Calvin To, Associate, and Valerie Tao, Professional Support Lawyer, Herbert Smith Freehills, authored the eighth and ninth issue guidance notes on Hong Kong’s new licensing regime for virtual asset service providers. The Institute also thanks April Chan FCG HKFCG, Institute Past President and Chairman of the Institute’s Technical Consultation Panel (which oversees the work of the Institute’s Interest Groups), for her contributions. 

Comments and/or suggestions relating to the Institute’s Interest Groups can be addressed to Mr Datwani at: mohan.datwani@hkcgi.org.hk.
