The government of Canada has come through with more information on amendments to its proposed privacy and artificial intelligence legislation, although the exact wording in some cases isn’t detailed.
Whether this will be enough to satisfy business and academic critics, as well as opposition MPs, and speed the legislation through Parliament remains to be seen. Committee hearings resume next week.
In response to a demand last week from the House of Commons Industry committee, Innovation Minister François-Philippe Champagne filed a 12-page letter detailing how the government will improve C-27, the proposed bill containing three pieces of legislation: the Consumer Privacy Protection Act (CPPA), which covers federally-regulated private sector firms as well as provinces and territories that don’t have their own private sector privacy law; an act creating a digital privacy tribunal to hear the federal Privacy Commissioner’s recommendations for penalties; and the Artificial Intelligence and Data Act (AIDA).
The opposition majority on the committee demanded Champagne produce details of the government’s proposed amendments before witnesses started testifying. In fact, debate over the demand interrupted Privacy Commissioner Philippe Dufresne after he had given only his opening remarks; he’ll be back later. There was little point, opposition committee members argued, in witnesses complaining about legislation that was about to be changed.
One significant proposal spotted by University of Ottawa (UofO) law professor Michael Geist: AIDA would cover the use of AI systems by search engines and social media platforms to recommend or prioritize results.
“The inclusion of content moderation and discoverability/prioritization [of search engine results] comes as a surprise,” said Geist, who is Canada Research Chair in internet and e-commerce law, “as does equating AI search and discoverability with issues such as bias in hiring or uses by law enforcement. While the government says it is more closely aligning its rules to the EU [European Union], it appears Canada would be an outlier when compared to both the EU and the U.S. on the issue.”
In an email to IT World Canada, Teresa Scassa, UofO law professor and Canada Research Chair in information law and policy, said Champagne’s letter is just “an outline of what the government has in mind,” so the committee and witnesses “will have to be satisfied with that. It is frustrating.”
Imran Ahmad, co-head of the information governance, privacy and cybersecurity practice at the Norton Rose Fulbright law firm, said the proposed changes improve C-27. In fact, he added, they “were required for C-27 to move forward with the broader support of the private sector. On the Artificial Intelligence Data Act front, the creation of categories of ‘high impact systems’ aligns with the EU approach. Clearly, Minister Champagne has been listening to the feedback provided on C-27 by industry since June 2022 when the bill was initially introduced.”
UPDATE: Asked for comment, a Microsoft spokesperson issued this statement: “We are currently reviewing the Minister’s letter to Committee outlining the government’s intended amendments to Bill C-27. We appreciate that Canadian policymakers are looking closely at how AI technology works, and we will continue to engage in these important conversations about the future development of AI and innovation in Canada.”
Here are some details from Champagne’s letter and the changes the government plans to make to the proposed legislation:
On the CPPA
— as Champagne said in his opening remarks to the committee, the right to privacy already stated in the proposed legislation would be changed to say it is a “fundamental right.” Dufresne and other critics have asked for this;
— to strengthen the proposed bill’s attempts to put the protection of children ahead of commercial interests, Champagne’s letter says the preamble “will include a specific reference to the special interests of children with respect to their personal information.”
Wording in a section will also be changed, forcing organizations to consider the special interests of minors when determining whether personal information is being collected, used or disclosed for an appropriate purpose.
Champagne reminded MPs that the CPPA already deems all personal information belonging to a minor as “sensitive.” This means that businesses will generally need to get express consent when collecting, using, or disclosing the information;
— the CPPA would give the Privacy Commissioner the power to reach an agreement with a business to comply with the legislation. But to meet complaints that the Commissioner cannot levy a financial penalty on non-compliant organizations, the CPPA will be changed to say a compliance agreement may also include financial consideration;
On AIDA
— the proposed legislation would force businesses to use “high impact” AI applications responsibly. To meet complaints that “high impact” wasn’t defined, the proposed changes add a schedule to the bill saying the definition would include systems relating to determinations in respect of employment, including recruitment, referral, hiring, remuneration, promotion, training, apprenticeship, transfer or termination; systems that determine whether to provide services to an individual; systems that determine the type or cost of services to be provided to an individual; or systems that prioritize the services to be provided to individuals.
In addition, the schedule would say high-impact AI systems include the use of an artificial intelligence system to process biometric information in matters relating to (a) the identification of an individual, other than if the biometric information is processed with the individual’s consent to authenticate their identity; or (b) an individual’s behaviour or state of mind.
— it would make it clear AIDA applies to the use of artificial intelligence systems that moderate online content, including on search engines and social media platforms, as well as to AI systems used in healthcare and by police;
— the changes will also make it clear that those developing a machine learning model intended for high-impact use have to ensure that appropriate data protection measures are taken before it goes on the market;
— developers of general-purpose AI systems like ChatGPT would have to establish measures to assess and mitigate risks of biased output before making the system live. Managers of general-purpose systems would have to monitor for any use of the system that could result in a risk of harm or biased output;
— the letter also says the government will support suggested amendments to strengthen the powers of the proposed AI and Data Commissioner, who will enforce the act. Some critics complain the AI Commissioner reports to the Industry minister and is not independent like the Privacy Commissioner, who reports to Parliament.
Source: ITBusiness.ca – https://www.itbusiness.ca/news/search-engines-social-media-platforms-to-come-under-canadas-ai-law-says-government/126274