Canada’s privacy commissioner says the government’s proposals to modernize Canada’s federal private sector privacy law are “a step in the right direction,” but must go further to protect fundamental privacy rights.
The statement from Privacy Commissioner Philippe Dufresne came in a written submission on Bill C-27, the Consumer Privacy Protection Act (CPPA), the government’s proposed new private sector privacy law, to the House of Commons standing committee on Industry and Technology.
As part of his submission, Dufresne repeated his office’s call for the legislation to recognize privacy as a fundamental right, and that the law limit organizations’ collection, use and disclosure of personal information to specific and explicit purposes that take into account the relevant context.
C-27 was introduced in Parliament last June. It was recently forwarded to the Industry committee for witness testimony and detailed analysis. No date has yet been set for hearings to begin.
Federal private sector privacy law applies to federally regulated industries and to firms in provinces and territories that don’t have their own law. That includes every jurisdiction except British Columbia, Alberta and Quebec.
While C-27 includes the proposed Artificial Intelligence Data Act (AIDA) for regulating AI, Dufresne’s comments only deal with the CPPA. Some experts hope the government will hive off AIDA from C-27, arguing it needs a separate analysis. Others argue a flawed AI bill is better than none.
Dufresne said the CPPA is an improvement over both the existing law, the Personal Information Protection and Electronic Documents Act (PIPEDA), and an earlier version of the reform bill (known at the time as C-11), which died when the last election was called.
“I welcome and am encouraged by the committee’s upcoming study of Bill C-27,” Dufresne said. “This bill is a step in the right direction, but it can and must go further to protect the fundamental privacy rights of Canadians while supporting the public interest and innovation.”
In his written submission to the committee, Dufresne listed 15 key recommendations to improve and strengthen the proposed law:
recognize privacy as a fundamental right;
protect children’s privacy and the best interests of the child;
limit organizations’ collection, use and disclosure of personal information to specific and explicit purposes that take into account the relevant context;
expand the list of violations qualifying for financial penalties to include, at a minimum, appropriate purposes violations;
provide a right to disposal of personal information even when a retention policy is in place;
create a culture of privacy by requiring organizations to build privacy into the design of products and services and to conduct privacy impact assessments for high-risk initiatives;
strengthen the framework for de-identified and anonymized information;
require organizations to explain, on request, all predictions, recommendations, decisions and profiling made using automated decision systems;
limit the government’s ability to make exceptions to the law by way of regulations;
provide that the exception for disclosure of personal information without consent for research purposes only applies to scholarly research;
allow individuals to use authorized representatives to help advance their privacy rights;
provide greater flexibility in the use of voluntary compliance agreements to help resolve matters without the need for more adversarial processes;
make the complaints process more expeditious and economical by streamlining the review of the Commissioner’s decisions;
amend timelines to ensure that the privacy protection regime is accessible and effective;
expand the Commissioner’s ability to collaborate with domestic organizations in order to ensure greater coordination and efficiencies in dealing with matters raising privacy issues.
Among the improvements C-27 has over C-11, Dufresne said, is the addition of a preamble to offer guidance on the law’s broader objectives; new provisions to help protect the privacy of minors; an expansion of personal information that individuals can request be disposed of; amendments to require that information provided to obtain valid consent be presented in understandable language; and amendments that grant increased discretion to the Office of the Privacy Commissioner, for example, in relation to complaints and investigations.
Other differences between C-27 and the previous version that Dufresne likes include an expanded requirement to ensure that the manner in which personal information is collected, used, and disclosed is appropriate; an amendment to accountability measures requiring organizations to maintain privacy management programs; and a new requirement to authenticate identity as part of security safeguarding requirements.
Businesses may focus their attention on the Commissioner’s insistence that CPPA limit organizations’ collection, use and disclosure of personal information to specific and explicit purposes that take into account the relevant context.
The CPPA, like PIPEDA, sets boundaries for how a firm can collect, use, or disclose personal information, the submission says. However, it adds, under PIPEDA an organization’s purposes for handling personal information must be explicitly specified. This important requirement, that purposes be both explicit and specific, is missing from the CPPA. “Without it,” says Dufresne’s submission, “the door is open to organizations identifying overly broad and ambiguous purposes, such as ‘improving customer experience.’”
Dufresne also said provisions should be added to the CPPA to require organizations to practice privacy by design and to conduct privacy impact assessments for high-risk activities.
His recommendations for changing the CPPA also deal with automated decision-making systems, such as those using machine learning and AI. The CPPA imposes two new obligations on organizations using automated decision-making systems. However, Dufresne says their scope is too limited in areas where there should be increased transparency.
For example, Dufresne’s submission says, unlike the EU’s General Data Protection Regulation (GDPR) and other modern privacy laws in California and Québec, the obligations do not explicitly apply to profiling. As drafted, the obligations would only apply to automated decision systems that make decisions, recommendations, or predictions. Profiling should be added to that list, the submission says.
The CPPA also requires organizations to provide a general account of the use of any automated decision system that makes predictions, recommendations or decisions that could have a “significant impact” on individuals. That qualifier should be removed, the submission says.