Wide-angle photos of living rooms arranged to suggest spaciousness, kitchen counters cleared to a degree no one who actually cooks maintains, and exterior shots taken in the best available light are all common in the real estate listings on Montreal-area property search platforms. Because staging and photography can flatter, buyers have learned to view these pictures with a certain suspicion. Recently, purchasers in Quebec encountered something different: a listing in which AI tools had subtly altered the fence, garage door, windows, and landscaping, creating an image of a home that did not match the real one. The Re/Max agent responsible apologized. So did the agency. And the Quebec courts, which had already been forming their own views on fabrications produced by artificial intelligence, took notice.
The episode fits a larger pattern of AI misuse to which Quebec's legal and regulatory system has been responding faster than that of most Canadian provinces. A $5,000 fine had previously been imposed on a litigant who submitted AI-generated court citations referring to cases that did not exist, the kind of confident hallucination these tools produce when asked for specific legal precedents they do not actually have. The Quebec Superior Court's warnings about relying on AI for accuracy in court filings were pointed enough to suggest the bench had seen enough of the problem to feel compelled to act. The property listing controversy extended that problem into real estate, where the accuracy of information about a physical asset carries its own financial and legal weight.
| Category | Details |
|---|---|
| Topic | AI Misuse in Quebec Real Estate — Listings & Valuation Models |
| Incident | Re/Max Quebec agent used AI to alter property photos (windows, fence, garage, landscaping) |
| Parties Who Apologized | The Re/Max agent and Re/Max agency |
| Court Context | Quebec Superior Court warnings on AI accuracy |
| AI Court Penalty | $5,000 fine for litigant using fake AI-generated court citations |
| Broader Concern | Competition Bureau Canada — AI rental valuation software distorting competition |
| Core Risk | AI “hallucinations” in property data, visuals, and valuations |
| Industry Implication | Legal and ethical exposure for realtors using unverified AI tools |
| Country | Canada (Quebec) |
What makes AI photo manipulation in real estate listings especially concerning is how easy it is to perform and how hard it is for a typical buyer to detect. A buyer studying listing photos may take apparent improvements, such as replaced windows, new fence lines, or better landscaping, as evidence of the seller's recent work rather than as features that were never there in the first place. Fraud typically turns on a discrepancy between what a buyer believes they are evaluating and what actually exists at the property; in the Re/Max Quebec incident, that discrepancy was manufactured with an AI tool rather than conventional photo editing. Both the buyer who relies on the deceptive photographs and the agent who created them face legal risk.
The Competition Bureau of Canada has raised a comparable concern about AI in real estate, one that operates at a different level of the market. Its examination of AI-based rental valuation software asks whether algorithmic pricing tools, when used by multiple landlords or property managers in the same market, can effectively coordinate pricing in ways that distort competition without any explicit agreement among the parties using the same software. The risk that algorithmic systems trained on similar data and optimized for the same outcomes may produce anti-competitive effects even when no human actor intended to collude is a newer kind of antitrust concern. Building a case under current competition law is difficult, and it remains unclear whether Canadian courts will find the existing framework adequate to address it.

As these incidents spread through Quebec's real estate and legal landscape, the sense is of a regulatory environment improvising responses to AI misuse without a comprehensive framework, moving case by case, fine by fine, court warning by court warning, while the tools causing the problems advance faster than the rules meant to govern them. The Re/Max apology is appropriate, and the courts' warnings are necessary, but neither is a long-term answer to the underlying issue: AI tools that can alter images, generate citations, and produce valuations are available to anyone with a subscription. The professional obligations of those who use these tools, the duty to verify, to disclose, and to represent accurately, have not changed, but the temptation and the ability to circumvent those obligations have greatly increased.
What Quebec is experiencing may be the first stage of a reckoning that other real estate markets using AI tools will eventually face. The industry has moved quickly to adopt AI-assisted pricing, automated listing optimization, and visual editing tools without developing the professional ethics frameworks needed to keep pace. Buyers are swayed by images that may bear little resemblance to the homes they depict. Sellers may not fully understand the algorithmic recommendations that set their prices. Courts receive citations for cases that do not exist. By the time an apology arrives, the harm is usually done. The Quebec experience raises the question of whether the sector will build its guardrails before the next incident or wait for the next penalty.