

You would need a way of verifying that the SHA256 is a true copy of the site at the time, though, and not a faked page. You could do something like have a distributed network of archives that coordinate archival at the same time, and then use the SHA256 to see, through some search functionality, which archives fetched exactly the same page at the same moment. If addons are already being used for the crawling then we may be mostly there already, since those addons would just need to certify their archive, after which they could discard the actual copy of the page. You still need a way to validate those workers, though, since a bad actor could just run a whole bunch at the same time to legitimise a fake archival.

The delay makes intuitive sense, especially since it gives the target a chance to mention it to friends and family, who can hopefully step in and stop it from there.
However, I’m not sure it’s worth it. I imagine this would stop exfiltration apps that scan the user’s device for useful data, and maybe passive screenshotting, but that pales in comparison to apps with subscription dark patterns, gambling, and apps that already harvest and sell your data legally. And if it were a case of apps prompting the user to enter sensitive information into a form, they could just use a browser instead.
I don’t know. I think this is a good measure to prevent scams; I’m just uncomfortable with Google’s motivation.