
Technology Working Group Final Phase 2 Report: Public Interest Accountability of Professional Accountants


A. Public Interest Accountability of PAs

Why the Profession Needs to Act 

  1. Digital technologies and related issues – such as AI, data analytics, robotic process automation, blockchain, cloud computing, and data governance (including cyber-security) – continue to have a transformational impact on organizations, governments, economies, and societies. In particular, the lingering impact of the COVID-19 pandemic, which in 2020 upended many working practices and lifestyles and made remote and hybrid work mainstream, has accelerated the adoption of digital platforms, tools, and techniques.11
  2. Despite this increased implementation and use of technology, a majority of controllers, financial analysts, accountants, and auditors reported not completely trusting the accuracy of their own organizations’ financial data, citing causes such as human error and the vast amount of data flooding their systems.12
  3. Concurrently, the centrality of ethics has become undisputed in a world of repeated corporate and financial crises and transformation. There is increasing pressure from investors and other stakeholders to embed ethics in corporate culture, and a growing recognition of ethics as an essential condition for sustainable business models. As such, there is a shift from a general expectation of ethics towards a more vocal demand for proactive ethical intent and actions. In particular, stakeholders observed that there are strong ties between ethical behavior, the ethical design of technology, and the incentive structures of the individuals involved.
  4. Against this backdrop, ethical decision-making has become more important than ever to reinforce public trust in this semi-virtual, dynamic environment. PAs – with their responsibility to act in the public interest and to adhere to ethics principles and professional standards – are therefore well positioned to enhance this trust through their work and the organizations and clients they support. It is, however, observed that:
  a. Ethics continues to be considered more frequently at the back end of technology development, rather than in the front-end, initial design.13
  b. Stakeholders note that PAs are typically not sufficiently involved in the decision-making process of designing technology products and related services, meaning that they are not in a position to support the ethical, fit-for-purpose development and use of such products and services. Also, even when PAs are involved in the design of technological solutions, technologists and PAs often do not speak the same “language,” as most PAs in accounting and internal control functions within enterprises reportedly lack sufficient competence and experience with emerging technology tools.
  c. Companies are increasingly seeking ‘trust’ services, such as assurance over AI systems, data integrity and governance, and sustainability information.
  d. Despite PAPPs being well positioned to generate this trust through their work and the organizations and clients they support, such assurance is currently provided predominantly by other experts – typically engineering or consulting firms.14 These providers bring specialty technical competence but largely do not operate under codes of ethics with robust objectivity-related provisions, such as those relating to conflicts of interest and independence set out in the Code. This creates public interest concerns around the objectivity of the ‘assurance’ being provided and highlights an area where the profession’s ethics and independence foundations can make a better contribution.
  5. The environment of declining trust and increased demand for ethical decision-making at all levels of an organization, coupled with the current under-representation of PAs in both internal decision-making and external assurance of systems, provides a strong call to action and significant opportunities for the profession to focus on its ethics and independence foundations to deliver more trusted professional services to employers and clients.

Ethical Leadership

  1. Stakeholders observe that audit committees and risk committees are increasingly being asked about their organizations’ consideration of either developing or implementing new technology.15 In addition, PAs are seen to be ethical leaders who have an opportunity to uphold and promote integrity and objectivity as part of the ethical guardrails around innovation during their organizations’ digital transformation. For example, the Working Group believes that PAs can work with data experts, and can help employers and clients understand where to draw the line or what is ethical behavior when facing an ethics “gray zone” (i.e., under circumstances that are not illegal – perhaps because legislation does not yet exist – and are also ethically ambiguous).16 PAs can do so by relying on the skills, values and behavior they bring to the professional activities they undertake,17 including adherence to ethical principles and encouraging an ethics-based culture. It is therefore essential for PAs to be at the decision-making table and to help oversee, or at least participate in, the implementation and ongoing operations related to emerging technologies.
  2. It is, however, observed that many PAs are generally not substantially involved in the decision-making process for selecting technologies to be developed or implemented within their organizations.18 This lack of involvement might be remedied by PAs becoming more appropriately upskilled in emerging and innovative technologies and gaining sufficient data fluency to understand the critical concerns and ask the right questions. This would also help ensure that the ethics impact of technology deployments is considered earlier in the process, rather than only post-implementation and on an ad-hoc basis.
  3. Nevertheless, it is also observed that where PAs are indeed involved in decision-making (for example, in small and medium-sized organizations and practices), they might lack the relevant understanding of the technology with which they are dealing. This in turn might result in misidentification of the risks and controls pertaining to such technology and a lack of professional competence to determine whether the technology (or its outputs) is appropriate or reasonable. The potential for miscommunication with software developers and technologists also increases when PAs are not appropriately skilled.
  4. In order for ethics and compliance with laws and regulations – for example, in relation to data privacy, cybersecurity, etc. – to be more fully considered in strategic decisions when organizations contemplate developing, implementing, or using technology, appropriately skilled PAs should be encouraged to be involved during conceptualization and design. In this regard, the Code contains provisions in relation to having an inquiring mind, exercising professional judgment, being aware of bias, and maintaining an appropriate level of professional competence (i.e., including relevant technology upskilling) to enable PAs to be ethical leaders in this area and have a seat at the decision-making table. PAs should also be aware of, and transparent about, the level of competence they have with different technologies. Accordingly, at the decision-making table, PAs can add value by, for example:
  a. Identifying design needs and specifications that can help the business function so that fit-for-purpose tools are built in an ethical and socially responsible manner;
  b. Proactively considering, during the design process, the potential for unintended consequences;
  c. Questioning assumptions, including bias, in data and in the design of systems and algorithms, and the processes related to creating and/or collecting data;
  d. Ensuring appropriate conditions, policies and procedures, and/or systems of quality management are in place and operating effectively so that issues, such as threats to compliance with the fundamental principles of the Code,19 are identified in a timely manner. This includes having proper documentation requirements so that where an issue arises, it is easier to determine whether it is due to a governance issue where controls need to be strengthened or whether it is symptomatic of a broader ethics issue; and
  e. Being able to determine whether – and to what extent – reliance on technologists is reasonable.
  5. Stakeholders also indicated that the digital age has introduced inherent cybersecurity and data integrity risks within every organization, and expressed the view that a PA’s ethics responsibility should extend to controls over:
  a. Cyberattack20 prevention and response plans to safeguard valuable intellectual property and meet confidentiality and privacy requirements; and
  b. Data governance – along the complete data-to-decision chain, including being able to cull relevant and reliable data and information from what is frequently an ‘overload’ of available sources.
  6. When issues arise, there is an expectation for PAs to take action. In particular, stakeholders stressed the importance of PAs having the moral courage to speak up when there is pressure to breach the fundamental principles in the context of developing, implementing, or using emerging technologies. This includes educating others on ethics issues in technology and fostering a business culture where it is safe to raise issues and concerns. For example, a safe environment should be fostered for others in the organization, such as data scientists, to escalate concerns about any bias or discrimination identified in AI systems without the fear of retaliation.21
  7. Finally, some stakeholders noted the importance of not conflating professional ethics with morality. For example, considering the merits of PAs working for legitimate enterprises in industries that some people might consider objectionable, such as weapons manufacturers or bioengineering companies, was deemed more a question of individual morals than professional ethics. Nonetheless, due care in evaluating the implications of decision-making on professional ethics (i.e., identifying, evaluating, and addressing threats to complying with the fundamental principles) is still expected, regardless of the organization.

Shared Responsibility

  1. In most instances, technology – even related to management processes and financial reporting systems – is not solely under the PA’s control, and consideration is needed to determine how responsibility for such systems should, or can, be shared with other professionals.
  2. For example, when technology is developed by a third-party to help deliver a service, stakeholders have questioned where the liability resides if technology is implemented and fails to detect certain issues in the organization, makes an inappropriate recommendation, or leads to a breach of confidentiality or privacy, etc. In such circumstances, it was questioned whether liability would reside with the technology designer, the PA who accepted the output of the technology, the CFO, or the auditor who provided assurance on the system or its outputs.22
  3. Some stakeholders endorsed the concept of shared responsibility, namely that responsibility rests with everyone involved, including PAs and IT professionals (e.g., data scientists, technologists, and engineers). The degree of responsibility would also be expected to vary with the individual’s position in the organization, commensurate with their authority and role. Stakeholders consider that such shared responsibility is most effectively communicated through the tone at the top and a robust code of conduct, and is implicit in a strong ethical organizational culture. In addition, accountability mechanisms for the technology solution’s output should be defined upfront, whether this relates to the data forming an input to the system, the algorithms being applied to the data, or how the outputs are interpreted and evaluated.
  4. Other stakeholders noted that PAs (e.g., the accounting and finance functions of an organization) are ultimately responsible for all aspects of the related accounting and financial reporting system(s), even if such systems are developed and/or maintained by a third party. For example, where an organization has outsourced its data storage to a third-party provider, and despite there being a joint legal liability for a cyberattack, the audit committee would likely still view the responsibility to be largely on the organization itself (i.e., shared between PAs and the IT department), as opposed to the third-party provider.
  5. Nevertheless, it is noted that effectively considering ethics and potential unintended consequences of the technology development or selection process, and of the operation of such technology, needs to be driven by multidisciplinary teams working together in organizations: technologists with specialist technology, systems, and data expertise, and PAs with deep knowledge of business processes, risks and controls, and a strong code of ethics.23 However, for small and medium enterprises or practitioners (SME/Ps), stakeholders observe that they might not have the resources available to establish multidisciplinary teams, to seek expert advice when relying on or using technology, and to maintain adequate controls over security. This could be problematic and result in systems that are not fit-for-purpose and at risk of data and other cybersecurity breaches.

Sustainability

  1. PAs are viewed as stewards of both financial and non-financial (i.e., environmental, social, and governance (ESG)24) information, and are well placed to perform and report on analyses of such information, as well as provide assurance over the reported information.25 
  2. Sustainability is rapidly becoming a core expectation of organizations and is closely tied to ethical stewardship and good governance. Fueling this core expectation is a major shift in investors’ capital allocation to businesses perceived as more sustainable, viewed through an ESG prism.26 Specifically, sustainable funds are continuing to attract capital at a record pace. For example, in the United States, such funds reached $51 billion in 2020 – more than double the total for 2019 and nearly 10 times more than in 2018, according to Morningstar.27 Investors are now subjecting ESG to the same scrutiny as operational and financial considerations, becoming skeptical of ESG disclosures and commitments, and expecting more litigation as a result of companies not delivering on ESG promises.28
  3. For meaningful progress in sustainability reporting, there is a need for technology to process the massive volume of data in order to track and narrate such information. Accordingly, considerations to enable the effective application of technology for sustainability reporting include:
  a. What data should be measured? Data29 is integral to how an organization collects, tracks, and reports on sustainability. Furthermore, such data collection and tracking need to be conducted in a timely fashion in order for the reporting to be of value.
  b. What is the right set of technology tools to collect and analyze the data? This could include Internet-of-Things devices, cloud computing solutions, AI machine learning, data analytics software tools, etc.
  4. It is observed, however, that there remains a relative lack of uptake of new technologies to support sustainability and mitigate climate change, because the business case remains less tangible or insufficiently understood. There is also a push to better understand sustainability information and the underlying drivers of progress. For example, cryptocurrency mining consumes significant energy, but how this consumption compares with the energy required to support traditional financial markets, and whether and how mining adds value, should be better understood. Cryptocurrency transactions30 and AI applications31 are also resource intensive. The Working Group believes that PAs are well-positioned to play a role in this analysis space.


Endnotes

11 Vargo, Deedra, et al. “Digital technology use during COVID-19 pandemic: A rapid review.” Wiley Online Library, 28 November 2020, https://onlinelibrary.wiley.com/doi/epdf/10.1002/hbe2.242.

12 Ryan, Vincent. “Risk Management: Numbers Don’t Lie, Until They Do.” CFO, 21 March 2019, https://www.cfo.com/accounting-tax/auditing/2019/03/numbers-dont-lie-until-they-do/.

13 See, for example, Ammanath, Beena. “Thinking Through the Ethics of New Tech…Before There’s a Problem.” Harvard Business Review, 9 November 2021, https://hbr.org/2021/11/thinking-through-the-ethics-of-new-techbefore-theres-a-problem.

14 Ho, Soyong. “Who Should Provide ESG Assurance?” Thomson Reuters, 20 August 2021, https://tax.thomsonreuters.com/news/who-should-provideesg-assurance/.

15 Considerations, for example, include how the transformational technology fits into the company’s strategy and its capital expenditures; the appropriateness of the company’s enterprise risk management system; and cyberattack impacts on technology assets, policies, and regulator expectations, as well as appropriate cybersecurity insurance.

16 See, for example:

17 Paragraphs 100.2 and 100.3 of the Code

18 PA involvement in the decision-making process is more significant in smaller entities or in firms, whereas in larger entities, it tends to be the IT department that drives such implementation.

19 Section 110 The Fundamental Principles of the Code

20 See also, for example, “The CPA’s Role in Addressing Cybersecurity Risk.” Center for Audit Quality, 24 May 2017, https://www.thecaq.org/cpas-roleaddressing-cybersecurity-risk/. Note also that the Working Group believes this should now include establishing ransomware policies and having backup IT security teams on standby.

21 See Wong, Julia Carrie. “More than 1,200 Google workers condemn firing of AI scientist Timnit Gebru.” The Guardian, 4 December 2020, https://www.theguardian.com/technology/2020/dec/04/timnit-gebru-google-ai-fired-diversity-ethics. Gebru, the former co-lead of Google’s Ethical AI team, was allegedly fired over a dispute in relation to a research paper she had co-authored. The paper contended that AI systems aimed at mimicking human writing and speech can exacerbate historical gender biases and the use of offensive language.

22 In a US context, see, for example, commentary about potential liability for enforcement actions in this area by the US Federal Trade Commission in Bachman, Allen R. “FTC Issues New Guidance, Warning That Bias in Artificial Intelligence Could Create Potential Liability for Enforcement Actions.” National Law Review, 24 April 2021, https://www.natlawreview.com/article/ftc-issues-new-guidance-warning-bias-artificial-intelligence-could-createpotential.

23 See, for example, Bannister, Catherine, and Sierra, Jessica. “Ethical technology is a team sport: Addressing the ethical impact of technology requires everyone’s participation.” Deloitte, 2021, https://www2.deloitte.com/content/dam/Deloitte/us/Documents/about-deloitte/us-ethical-technology-is-ateam-sport.pdf; Ammanath, Beena. “Thinking Through the Ethics of New Tech…Before There’s a Problem.” Harvard Business Review, 9 November 2021, https://hbr.org/2021/11/thinking-through-the-ethics-of-new-techbefore-theres-a-problem; and Hao, Karen. “When algorithms mess up, the nearest human gets the blame.” MIT Technology Review, 28 May 2019, https://www.technologyreview.com/2019/05/28/65748/ai-algorithmsliability-human-blame/.

24 In this regard, stakeholders also commented that the current lack of globally consistent standards, regulations, guidelines, as well as standardized requirements for service providers hampers the ability of PAs to effectively take on this stewardship role for sustainability reporting.

25 “How CPAs can lead ESG Initiatives.” CPA Canada, 14 January 2021, https://www.cpacanada.ca/en/business-and-accounting-resources/strategy-riskand-goverance/corporate-governance/publications/esg-and-business-resilience.

26 IESBA’s Strategy Survey 2022: Part 1 on “Responding to developments relating to reporting and assurance of sustainability developments”.

27 “Auditors & ESG Information: Lending trust and credibility to ESG information.” Center for Audit Quality, 18 October 2022, https://www.thecaq.org/collections/auditors-and-esg/.

28 “2021 Trust Barometer Special Report: Institutional Investors.” Edelman, 17 November 2021, https://www.edelman.com/trust/2021-trust-barometer/investor-trust.

29 DiGuiseppe, Matt. “The No.1 ESG challenge organizations face: data.” World Economic Forum, 28 October 2021, https://www.weforum.org/agenda/2021/10/no-1-esg-challenge-data-environmental-social-governance-reporting/.

30 See, for example, Hern, Alex. “Waste from one bitcoin transaction ‘like binning two iPhones’.” The Guardian, 17 September 2021, https://www.theguardian.com/technology/2021/sep/17/waste-from-one-bitcoin-transaction-like-binning-two-iphones.

31 Gupta, Abhishek. “Quantifying the Carbon Emissions of Machine Learning.” Montreal AI Ethics Institute, 6 June 2021, https://montrealethics.ai/quantifying-the-carbon-emissions-of-machine-learning/.