X-Git-Url: http://www.privoxy.org/gitweb/?p=privoxy.git;a=blobdiff_plain;f=TODO;h=e1cf90e28f5cc054a94b7fd92a97c3f8409f2835;hp=0e8aaf5fa307a70b8dfeb6b3c0b62b3cc5f020ba;hb=aaff4cd0e076bb9d094f69ac83a4ff5429235991;hpb=db815ca309911a876eac92ebd5cfba5ccba706ff

diff --git a/TODO b/TODO
index 0e8aaf5f..e1cf90e2 100644
--- a/TODO
+++ b/TODO
@@ -1,4 +1,4 @@
-$Id: TODO,v 1.102 2014/05/11 13:17:49 fabiankeil Exp $
+$Id: TODO,v 1.108 2014/05/20 11:52:46 fabiankeil Exp $
 
 Some Privoxy-related tasks, sorted by the time they have been
 added, not by priority.
@@ -120,6 +120,9 @@ http://ijbswa.cvs.sourceforge.net/viewvc/ijbswa/current/TODO
     It would probably also make sense to look into what other
     projects did when migrating away from SF.
 
+    2014-05-13: Work in progress. Hosting wish list at the end
+    of this file.
+
 54) Move away from CVS to a more modern revision control system.
     Find out if there are any objection against going with Git.
     Using Git would also have the advantage that SF now pretends
@@ -127,10 +130,6 @@ http://ijbswa.cvs.sourceforge.net/viewvc/ijbswa/current/TODO
 
 55) Apply for Coverity scans: http://scan.coverity.com/
 
-56) Apply for the "free online access for qualified open-source
-    software projects" for the Co-Advisor HTTP compliance tests:
-    http://coad.measurement-factory.com/details.html#pricing
-
 57) Allow piping into external programs to allow more powerful
     filters and policy decisions.
     Incomplete support available in Fabian's popen branch.
@@ -346,11 +345,65 @@ http://ijbswa.cvs.sourceforge.net/viewvc/ijbswa/current/TODO
 122) Allow customized log messages.
 
-123) Allow to decrypt encrypted traffic using the
-     voluntarily-disclose-session-keys option in Firefox.
-     Depends on #16.
+123) Evaluate if the voluntarily-disclose-session-keys option in Firefox
+     (and other browsers) can be leveraged. Probably depends on #16.
 
 124) Add support for the "lightweight OS capability and sandbox
      framework" Capsicum.
      http://www.cl.cam.ac.uk/research/security/capsicum/
 
 125) Allow clients to HTTPS-encrypt the proxy connection.
+
+126) Run the Co-Advisor HTTP compliance tests, evaluate the results,
+     fix the compliance issues that aren't by design and document
+     the rest.
+     Note that Privoxy developers qualified for free account upgrades:
+     http://coad.measurement-factory.com/details.html#pricing
+
+##########################################################################
+
+Hosting wish list (relevant for #53)
+
+What we need:
+
+- Bug tracker
+- Mailinglists (Mailman with public archives preferred)
+- Webspace (on a Unix-like OS that works with the webserver targets
+  in GNUMakefile)
+- Source code repositories (currently CVS, but migrating away
+  from it is TODO #54 anyway and shouldn't be too much trouble)
+- Commit mails (preferably with unified diffs)
+
+(Unsorted) details to look at when evaluating hosters:
+
+1. Preferably no third-party ads and trackers.
+   External images, CSS and JavaScript may count as trackers
+   but texts like "supported by company XYZ" may be acceptable.
+
+2. JavaScript should be optional or not used at all.
+
+3. Services we don't need shouldn't be enabled anyway.
+   (We currently don't use Web forums, wikis, surveys etc.)
+
+4. It would be preferable if the hoster didn't have a bad track
+   record as far as user experience, security and privacy are
+   concerned and if the terms of service are "reasonable" and
+   haven't changed too often in the past. Updates in the past
+   should have been improvements and not regressions.
+
+5. It would be preferable if most of the server administration
+   is done by a trusted third-party (or at least not a lot of work
+   for us).
+
+6. The server(s) should be located in a country with laws we can
+   understand and follow (or at least not unintentionally violate).
+
+7. A server location in a country with some kind of due process
+   and strong data protection laws (at least on paper) would be
+   preferable.
+
+8. Given that Privoxy is a free software project it would be
+   preferable if the hoster would use free software where possible.
+
+9. Migrating away from the hoster in the future without losing
+   any important data should be possible without writing web
+   scrapers first.