jack: (Default)
[personal profile] jack
I was recently debating in another journal about new suggestions for internet filtering, ostensibly to prevent children seeing child-inappropriate sites. This is normally met with -- imho justified -- cries of doom. However, it does seem likely that there would be ways to approach it which would actually do some good -- if you, as a reasonably technically aware person, were proposing something, what would it be?

Suggestions:

* Not support political censorship
* If it requires a large investment of manpower (e.g. the Great Firewall), be upfront about where that manpower comes from
* Should fulfil the stated purpose of allowing concerned non-technical parents to protect their children from inappropriate content, at least to some extent
* Should not be a massive expensive unworkable pointless joke
* Should be clear about whether it will work a whole country at a time (probably not) or instead be a small but incremental improvement across large classes of website.

Whatever the government is thinking about is almost certainly unworkable. But if there were something NOT ridiculous which could be suggested instead, that would actually be better than just "it doesn't work", or at least make clear to people who DO want a solution that it may be expensive.

It might even have positive side effects if (eg) pure spam domain names were caught in the crossfire.

Date: 2010-12-21 03:37 pm (UTC)
naath: (Default)
From: [personal profile] naath
I think the best you could do would be to have some group draw up a list of child *appropriate* websites, at various age levels; you could then sell restricted internet connections to concerned parents that would allow *only* these designated-appropriate websites (at the age level specified by the parent).

Allowing the ISP to sell such a service means the parents don't have to worry about their offspring being more tech-savvy than them and circumventing solutions run on the home computer.

I think the decisions about what to white-list have to be made by humans to be properly effective. This would cost money, which you could pass on to the people purchasing this service (which would probably deter people from purchasing the restricted service if they don't actually have a requirement for it). But obviously, by having humans do this work, you couldn't even hope to consider *all* of the web, and would end up excluding enormous amounts of stuff just because it's not popular enough to bother to check. Also, the kind of people who demand porn be denied to children old enough to hack the net-nanny software are also likely to demand that other, less obviously "bad" sites (such as sex-ed sites) be banned as well.

I still think, though, that there are problems inherent in the proposition "let us prevent young people from interacting with material we don't like", even if you really do only affect the minor children of parents who actively sign up for your service. Internet resources can be a lifeline for young people whose identities are antithetical to their parents' views, to pick just one aspect.
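
A minimal sketch of what an ISP-side check like this might look like, with invented domain names and age levels (no real ISP's whitelist is being described here):

    # Hypothetical ISP-side whitelist: a request only goes through if the
    # domain is on the human-curated list at or below the child's age level.
    # All domains and levels here are made up for illustration.
    WHITELIST = {
        5:  {"cbeebies.example.org"},
        11: {"cbeebies.example.org", "homework-help.example.co.uk"},
        16: {"cbeebies.example.org", "homework-help.example.co.uk",
             "sexual-health.example.org"},
    }

    def is_allowed(domain, child_age_level):
        """True only if some level <= the chosen age level lists the domain."""
        for level, domains in WHITELIST.items():
            if level <= child_age_level and domain in domains:
                return True
        return False

    print(is_allowed("homework-help.example.co.uk", 11))   # True
    print(is_allowed("small-hobby-blog.example.com", 11))  # False: never checked, so never listed

Note that the second domain is refused not because anyone judged it inappropriate but because nobody has looked at it, which is exactly the "not popular enough to bother to check" problem.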

Date: 2010-12-21 03:45 pm (UTC)
andrewducker: (Default)
From: [personal profile] andrewducker
I agree. A restricted whitelist of websites that pay to be accredited as child-friendly, and promise to adhere to specific standards, is the only way to make this workable.

Date: 2010-12-21 04:21 pm (UTC)
From: [identity profile] robhu.livejournal.com
I'm pretty sure Google don't have a blacklist of bad sites. They use some algorithm to remove sites which look like they may have adult content. There are three levels of protection -- none, moderate, and strict -- which presumably turn up the aggressiveness of the algorithm.

I don't understand why people think an automated system would be so bad. In practice Google Safe Search is very effective, as is the filtering on my mobile internet.
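
To make that concrete, here's a toy sketch of how "none / moderate / strict" could just be different thresholds on the same automatic score. The scoring function is an invented keyword count, not a claim about what Google actually does:

    # Toy illustration: the three protection levels are thresholds on one
    # automatic content score. The scoring is an invented keyword count.
    THRESHOLDS = {"none": float("inf"), "moderate": 2.0, "strict": 0.5}
    SUSPECT_TERMS = {"porn": 1.0, "xxx": 1.0, "nude": 0.5, "adult": 0.3}

    def content_score(page_text):
        return sum(SUSPECT_TERMS.get(word, 0.0) for word in page_text.lower().split())

    def is_blocked(page_text, setting):
        return content_score(page_text) >= THRESHOLDS[setting]

    sample = "nude figure drawing class"
    print(is_blocked(sample, "moderate"))  # False: score 0.5 < 2.0
    print(is_blocked(sample, "strict"))    # True: higher aggressiveness catches it

The same knob that makes it catch more porn is, of course, the one that starts catching the life-drawing class.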

Date: 2010-12-21 04:48 pm (UTC)
naath: (Default)
From: [personal profile] naath
* In order for safe-search-like algorithms to keep away ALL the porn, they inevitably catch non-porn in their net.
* If, therefore, you want to let through all the non-porn, then you are going to end up letting through some porn.
* Unless, that is, you have a large number of humans checking for porn (and assuming everyone agrees about what porn *is*).

At present a lot of anti-porn filters also filter out things like non-pornographic nudity and factual websites concerning genital anatomy or sexual health. If we have a plan to filter EVERYONE'S internet (unless they specifically say "give me the porn"), then people are going to be very upset if they can no longer read articles about breast cancer or buy their underwear on the internet.
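
For a concrete (entirely invented) illustration of that tradeoff, with made-up pages and filter scores:

    # Invented pages and scores, purely to illustrate the tradeoff above.
    pages = [
        ("hot-xxx-videos",     5.0, "porn"),
        ("artistic-nudes",     1.5, "porn"),
        ("underwear-shop",     1.2, "not porn"),
        ("breast-cancer-info", 1.1, "not porn"),
        ("gardening-tips",     0.1, "not porn"),
    ]

    def blocked(threshold):
        return [name for name, score, _ in pages if score >= threshold]

    print(blocked(2.0))  # ['hot-xxx-videos'] -- lets the porn scored 1.5 through
    print(blocked(1.0))  # catches all the porn, and takes the underwear shop
                         # and the breast cancer page down with it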

Date: 2010-12-21 06:31 pm (UTC)
From: [identity profile] robhu.livejournal.com
Yes, there will be a false positive rate with such a system, and if you're tuning it to block a lot of porn then it'll probably block a lot of non-porn as well.

I would expect that the proposed system (which will probably never happen, but anyway...) would allow the customer to opt out entirely if they wanted to (like the mobile broadband system), and might even allow you to override on a site-by-site basis (e.g. "Enter your PIN to unlock hotsexygirls.com for 1 hour").
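
A rough sketch of that PIN-override idea, assuming a made-up PIN and in-memory state (a real system would keep this per connection at the ISP, not in a script):

    import time

    PIN = "1234"          # hypothetical PIN chosen by the account holder
    unlocked_until = {}   # domain -> unix time at which the unlock expires

    def unlock(domain, pin, minutes=60):
        """Unlock a single domain for a while if the PIN is right."""
        if pin != PIN:
            return False
        unlocked_until[domain] = time.time() + minutes * 60
        return True

    def is_unlocked(domain):
        return unlocked_until.get(domain, 0) > time.time()

    unlock("hotsexygirls.com", "1234")
    print(is_unlocked("hotsexygirls.com"))    # True for the next hour
    print(is_unlocked("somewhere-else.com"))  # False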

Date: 2010-12-21 07:52 pm (UTC)
naath: (Default)
From: [personal profile] naath
Well, that depends how it works.

If it is meant to be robust against teenagers who are more tech-savvy than their parents (because if the parents were more tech-savvy they would have solved the problem of blocking sites they don't like from their teen's computers already!) then you can't allow the user to disable it from their computer. I imagine the process for turning it off would probably be quite irritating to manage (or it would be functionally useless).

If you are blocking porn because the user has decided themselves that they don't want to see random porn all over the place then you obviously include a "no no I really want to see this one" button. But if you don't want the user to have that power it's harder.

I admit I worry mostly about what non-porn things will end up classified as "not suitable for children". I'm not going to cry over kids not being able to see porn! But kids not being able to access information about sexuality and anatomy would be depressing.

Date: 2010-12-21 11:03 pm (UTC)
From: [identity profile] robhu.livejournal.com
I think any such system would be primarily targeting young children inadvertently coming across porn online. The figure quoted in the Commons was that 50% of children have seen pornography by the age of 11, and the example given was that "American girls" is both the name of a series of kids' dolls and the name of a porn site that appears on the first page of a Google search for that phrase with moderate SafeSearch on.

I don't think such a system would be very effective at stopping truly determined late teenagers who are technically proficient (although I think it could make things much harder which probably is a good thing).

I agree it would be fairly pointless for preventing access by adults themselves. I suspect a better implementation might show a "This site is blocked because it is an XYZ type of site. To unblock it for your internet connection permanently, or for X minutes, enter your PIN here" page.

I think there would be some miscategorisation -- that's inevitable -- but it could be limited. Sites could optionally self-certify, and such self-certification would take precedence over the algorithm's categorisation. So a sex-ed site could self-certify as being non-pornographic, and access would be allowed. Of course this is open to abuse, but it sounds like a good compromise to me.
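
Something like this is what I mean by self-certification taking precedence (the registry, the stand-in classifier, and the domains are all invented for illustration):

    # A self-declared category, if present, overrides the algorithm's guess.
    SELF_CERTIFIED = {
        "sexual-health-ed.example.org": "non-pornographic",  # sex-ed site opts in
    }

    def algorithm_category(domain):
        # stand-in for the real classifier; assume it gets the sex-ed site wrong
        return "pornographic" if "sex" in domain else "non-pornographic"

    def final_category(domain):
        return SELF_CERTIFIED.get(domain) or algorithm_category(domain)

    print(final_category("sexual-health-ed.example.org"))  # non-pornographic (self-certified)
    print(final_category("sex-videos.example.com"))        # pornographic (algorithm's call)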

Date: 2010-12-21 06:34 pm (UTC)
From: [identity profile] robhu.livejournal.com
I don't think the average person would have the system on. I might, but I imagine most people I know would just have the filtering turned off.

It'd pretty much only be of interest to parents who didn't have the savvy to restrict porn on their end (i.e. most parents) and who were concerned about their kids accessing porn (i.e. almost all parents).

I haven't found that the automated system provided by T-Mobile constantly gets in my way, so I find it odd that people expect the proposed system to be entirely unusable, because my experience of (what is almost certainly) the same sort of thing is that it isn't.

Date: 2010-12-21 06:38 pm (UTC)
From: [identity profile] robhu.livejournal.com
Why do you think it's pointless? If I try to access porn on my mobile phone it is actually quite hard to do so. Such a system would prevent children inadvertently coming across porn, which I think would be quite a good thing. If 50% of children first view pornography at 11, then things we do that move us towards lowering that number are a good thing, especially if the greatest inconvenience such a system poses to non-parents is that they untick a box once.

Date: 2010-12-21 04:19 pm (UTC)
From: [identity profile] robhu.livejournal.com
Hey :-)

What about a link to me in your post body? :)

Date: 2010-12-21 06:36 pm (UTC)
From: [identity profile] robhu.livejournal.com
No problem :-) I just like being more famous!

Actually I think I have an image problem among [personal profile] atreic's friends so I prefer being linked to because it sends a "This guy is saying interesting things" message, although on these sorts of highly controversial issues perhaps the net effect is negative for me.

Date: 2010-12-21 08:50 pm (UTC)
seryn: flowers (Default)
From: [personal profile] seryn
I like the idea: main living area with wireless child-friendly keyboards, a giant LCD TV panel, a wireless mouse/trackball in the cartoonishly large size... and children are only allowed to be online that way. If anyone even glancing in from the next room can see what you're doing, direct observation isn't a problem.

If they need computers for schoolwork, those computers are offline.

Once the kid is old enough to be given limited flexibility, they should share the "family" computer, the history of which is reviewed by the parents. The parents should discuss with children the appropriateness of various choices, how to avoid personal information being taken, and how to avoid nefarious places online. "If there's a link in your twitter feed, don't click it, because you can't tell where it goes. Rickrolling is a mild irritant compared to sites that give the computer viruses even past two firewalls." Or whatever the future equivalent is.

If there are intermittent returns to the big TV version for joint/supervised lessons about finding information online and evaluation of sources, I think that would provide enough training. Then by the time the child is a teenager in actual need of (limited) privacy, they have skills enough to avoid being defrauded or abused more than happens in real life.

I think it's a low-cost solution: it costs for the wireless keyboard and the cables to hook a computer to the TV, and then there are no additional costs. It's implementable at a family level. It provides a means for children to actually learn the skills they need instead of being dropped into boiling water at whatever "magic age". It requires no external input or agreement about what is appropriate for "all" children, because it's done on a case-by-case basis. The only negative is that it requires a massive commitment on the part of the parents to raise a productive, healthy member of our society... which is exactly what they signed up for.

Date: 2010-12-21 10:50 pm (UTC)
ewx: (Default)
From: [personal profile] ewx

Customer activates their own filter, most likely by ticking a box in their ISP's setup software. Perhaps implemented on the customer's ISP-supplied router, but it could (less conveniently) be part of the desktop software.

  • Minimized scope for censorship because everyone chooses their own filtering. (Inappropriate censorship is inevitably possible, but you can at least limit it to the people who've turned on filtering of some sort.)
  • Entirely funded by people buying filtering software/hardware or ISPs and equipment vendors bundling it (i.e. either competing on features rather than price or being big enough to submerge the costs).
  • Plainly at least as workable as any global approach (it can just make the same decisions but implement them more locally) and in some cases more so (any intensive processing, e.g. to mechanically recognize unwanted material, is both more distributed and implemented on hardware that spends most of its time idle; this is an embarrassingly parallel problem).
  • How fine-grained it is, and whether it operates on whitelists, blacklists, pattern recognition or whatever, is a matter of user choice between the different available options, or of local configuration.

Of course, this covers the existing implementations, which plenty of people seem perfectly happy with.
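
A sketch of the "filtering is a local, per-customer choice" shape of this, with the policy names and domains invented; the point is only that unfiltered is the default and the policy lives on the customer's own kit:

    # Illustrative only: the household's router or desktop software loads
    # whichever policy the customer ticked the box for; "off" is the default.
    POLICIES = {
        "off":       lambda domain: True,
        "whitelist": lambda domain: domain in {"homework-help.example.co.uk"},
        "blacklist": lambda domain: domain not in {"porn.example.com"},
    }

    def make_filter(customer_choice="off"):
        return POLICIES[customer_choice]

    allow = make_filter("blacklist")          # this household turned filtering on
    print(allow("porn.example.com"))          # False
    print(allow("wikipedia.example.org"))     # True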

Date: 2010-12-21 11:05 pm (UTC)
From: [identity profile] robhu.livejournal.com
This suggestion is more or less a restatement of the existing system which doesn't work.

Date: 2010-12-22 09:08 am (UTC)
ewx: (Default)
From: [personal profile] ewx
You still haven't produced any evidence for that claim. And since our last encounter started with you making false accusations against me, you're not very credible.