I've looked at a few schemes for blinding the NSA privacy attacks on U.S. citizens. I view their activity as a direct violation of our rights, so making it hard is nothing if not patriotic from a Constitutional point of view. Of course, if you're a fan of this administration, then doing anything that might support the Constitution makes you an extremist and an enemy of the state - and of course you should be locked up in Gitmo (yeah, that place they wanted Obama to shut down and he never did). As long as it can be used to get rid of Enemies of the State, don't hold your breath - it won't be closing any time soon.
So Borepatch has posted several times about circumventing or corrupting the NSA's attempts to violate your rights. I'm going to propose my own scheme. I haven't spent too much time thinking this through, so I'll have missed a few things. But anything that makes their attack on our privacy harder or worth less, I view as a good thing.
Keep in mind - you're 700 times more likely to be killed by mistake as a patient in a hospital than you are to be killed by a terrorist in this country. 65,000 times more likely if we take a 10-year slice of data that doesn't include 9/11. See any huge, expensive programs to lower those numbers? Yeah, there are your tax dollars at work.
While email spam is useful in creating so many false positives that the system is essentially a cost burden with no usable output, I think we're missing a bet.
I'd like to see an open source app that anyone can run that has the following:
A downloadable list of "bad sites" - things that the government wouldn't approve of, and would prefer just went away. Nothing illegal, just stuff they consider the territory of extremists: how to make explosives, assassination techniques, links to web sites that are pro radical Islam, white supremacist sites, etc. It's a long list.
A personal list of sites - if you find something new and juicy, you'll want to be able to add it.
It needs to be able to spoof your web browser - if you use Firefox, it should look like Firefox. You just throw the data away once it's processed for links.
It should keep the data long enough to simulate browsing - following links, with enough time on each page that it might look like you're actually reading the stuff.
It should use search engines to find new and similar sites.
It needs to know when you launch your browser and stop its own browsing, then resume if you're away from the keyboard for a long time - say, long enough to have actually read the page you were on.
It should allow you to upload site URLs to be added to the general list.
In other words - it needs to act like someone is actually browsing.
It runs in the background - 24 hours a day, creating false data.
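The feature list above can be sketched as a background loop. This is a minimal sketch under stated assumptions, not a real implementation: the seed list, the Firefox User-Agent string, and the `user_is_browsing` hook are placeholders, and the dwell-time heuristic (roughly 200 words per minute, ~6 bytes of HTML per word) is my own guess at "looks like actual reading."

```python
import random
import time
import urllib.request

# Placeholder standing in for the downloadable "bad sites" list.
SEED_SITES = ["http://example.com/page1", "http://example.com/page2"]

# Spoof the browser: copy whatever User-Agent your real browser sends,
# so the chaff traffic is indistinguishable from your own browsing.
FIREFOX_UA = "Mozilla/5.0 (Windows NT 10.0; rv:109.0) Gecko/20100101 Firefox/115.0"

def build_request(url: str) -> urllib.request.Request:
    """Build a request that claims to come from Firefox."""
    return urllib.request.Request(url, headers={"User-Agent": FIREFOX_UA})

def reading_delay(page_bytes: int) -> float:
    """Dwell long enough to look like a human read the page.
    Assumption: ~200 words/min, ~6 bytes of HTML per word,
    clamped between 5 seconds and 5 minutes."""
    words = page_bytes / 6
    return max(5.0, min(300.0, (words / 200) * 60))

def chaff_loop(sites, user_is_browsing, fetch=None):
    """Background loop: fake page visits, yielding to the real user.
    `user_is_browsing` is a hypothetical hook into the real browser;
    `fetch` is injectable so the loop can be run without a network."""
    fetch = fetch or (lambda req: urllib.request.urlopen(req).read())
    while True:  # runs 24 hours a day
        if user_is_browsing():
            time.sleep(30)          # pause while the real user is active
            continue
        url = random.choice(sites)  # a real version would also follow links
        body = fetch(build_request(url))
        time.sleep(reading_delay(len(body)))
```

Adding new sites found via search engines, and uploading URLs back to the shared list, would bolt onto the same loop.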
Why? With enough people hitting these sites and adding new ones, there will be no point in tracking it anymore - how useful is data that's 99% false positives?
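To put a rough number on that, here's some back-of-envelope arithmetic. Every figure below is a made-up illustration, not a measurement:

```python
# Hypothetical daily numbers - purely illustrative assumptions.
real_hits = 1_000        # "genuinely interesting" visits per day
chaff_users = 100_000    # people running the chaff app
hits_per_user = 500      # fake visits each copy generates per day

chaff_hits = chaff_users * hits_per_user   # 50,000,000 fake visits/day
precision = real_hits / (real_hits + chaff_hits)
print(f"Fraction of flagged traffic that's real: {precision:.4%}")
# -> roughly 0.002%: the analysts drown in chaff
```

Scale the numbers however you like - as long as chaff hits dwarf real ones, the fraction worth an analyst's time collapses.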
Get everyone who thinks the NSA in particular, and the Government in general, has a bad case of overreach to run this program. Yeah, it will result in more network traffic, which I'd like to avoid, but I'm starting to view it as a necessary evil.
So what happens when, in addition to the email spam, they get web access spam? How useful will their data be? The more false data that they can't differentiate from real data, the better.
Another option? We need a series of VPNs hosted in non-extradition countries. The VPNs should also create false access data. These sites should contain no history except in memory - and the memory should be wiped on a regular basis. Eventually they'll be physically compromised, so just assume it and rebuild everything on a regular basis.
Right now TOR is broken - the NSA has compromised it. VPNs were always vulnerable to metadata collection - once the endpoints are identified, it's just a matter of tracking web site access on one end and users on the inbound side. With sufficient analysis of usage times and endpoint access times, you can start to get a pretty good idea of what sites people are using even though you have no idea what they actually searched for.

How do you fix this? Spam access to sites - for every inbound request, you create a few thousand outbound requests. Unless they've compromised the endpoint server, their job just got a lot harder. You'd need to keep a significant portion of overlap in both the normal sites and the false positive sites; if you make it purely random, a pattern will emerge. In reality, a pattern will emerge sooner or later anyway. You fix that by mirroring a significant percentage of requested sites, so that inbound requests don't correlate to outbound connections. The downside is that, as the end user, you won't know whether you're getting current data or not - not without some additional information, like an add-on for the VPN that shows whether the data came from the web or a mirror, when the last update was, and, if it's out of date, when it will be updated.
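The decoy-generation half of that idea might look something like the sketch below. The batch size, overlap ratio, and site pool are all assumptions on my part; the point is just that a fixed slice of the pool is reused in every batch, so the decoy traffic keeps a stable shape instead of the purely-random pattern that would eventually stand out.

```python
import random

def decoy_batch(real_url, site_pool, k=2000, overlap=0.5, rng=random):
    """For one inbound request, build the outbound batch: the real URL
    hidden among k decoys. A fixed fraction of the decoys is drawn from
    a stable slice of the pool that every batch reuses, so the decoys
    overlap batch to batch rather than looking purely random."""
    stable = site_pool[: max(1, int(len(site_pool) * overlap))]
    n_stable = int(k * overlap)
    decoys = [rng.choice(stable) for _ in range(n_stable)]
    decoys += [rng.choice(site_pool) for _ in range(k - n_stable)]
    batch = decoys + [real_url]
    rng.shuffle(batch)  # real request hidden at a random position
    return batch
```

The mirroring half - serving cached copies so inbound requests don't trigger correlated outbound fetches at all - would sit in front of this, with the freshness indicator exposed to the user.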
I'd like to meet the idiots who thought having the NSA compromise internet encryption was a good idea. Once people believe a system can be hacked, it will be, and once that happens you've just compromised all e-commerce and e-banking - well done guys, you BONE HEADS. If you really think it was worth the economic cost of destroying the world's trust in these operations, you're so screwed up in the head you're hopeless.