All the methods suggested (and, as stated in the article, none of them are likely to be very effective) focus on protecting information, whether by not giving it out or through automated methods of forgetting.
What about reputational costs for people who overreact to minor and/or outdated pieces of information?
Also, unless I missed something (I admit I skimmed), there was nothing about how standards for behavior change over time, and the article seems to carry a background assumption that every group has the same standards.
Who knows, maybe there could be long-term running scores of the effects of maintaining different sets of standards? Not that such a thing would cause people to agree on which effects are valuable, but it still might be interesting and somewhat useful.
Link thanks to rm.
Some of the underlying premises were inspired by H G Wells' Men Like Gods -- a book about a half dozen or so random British people from the 1920s stranded in a utopia. Among other things, records about everyone are publicly available, though, iirc, they're kept in paper files in central locations. One of the British characters says something like he knows someone who could turn utopia into hell in a few weeks just from having access to those records. It seems to me that you can only do that sort of thing if there's a tremendous amount that people want to keep secret.
One thing I'm hoping for is that having so much information available about what people actually do will lead to reasonable standards for how people can be expected to act. This may be excessively optimistic.