X-Git-Url: https://git.xandkar.net/?a=blobdiff_plain;f=TODO;h=b48bfd808bf8d745037f826e699b008ead6f67cd;hb=7d9f2ab580eb3356275434ca0b70f7fef69a9513;hp=06c36749777ca93dc413c65d9063586367e85b7b;hpb=8cd862edbefa9ae27d78e3b03eb7a6256acfdcf6;p=tt.git

diff --git a/TODO b/TODO
index 06c3674..b48bfd8 100644
--- a/TODO
+++ b/TODO
@@ -10,7 +10,6 @@ Legend:
 
 In-progress
 -----------
-
 - [-] Convert to Typed Racket
   - [x] build executable (otherwise too-slow)
   - [-] add signatures
@@ -18,11 +17,16 @@ In-progress
     - [ ] inner
     - [ ] imports
 - [-] commands:
+  - [x] c | crawl
+    Discover new peers mentioned by known peers.
   - [x] r | read
     - see timeline ops above
   - [ ] w | write
     - arg or stdin
     - nick expand to URI
+    - Watch FIFO for lines, then read, timestamp and append [+ upload].
+      Can be part of a "live" mode, along with background polling and
+      incremental printing. Sort of an ii-like IRC experience.
   - [ ] q | query
     - see timeline ops above
     - see hashtag and channels above
@@ -45,11 +49,15 @@ In-progress
   - [x] mentions from timeline messages
     - [x] @<nick uri>
    - [x] @<uri>
-  - [x] "following" from timeline comments: # following = <nick> <uri>
+  - [ ] "following" from timeline comments: # following = <nick> <uri>
   - [ ] Parse User-Agent web access logs.
-  - Rough sketch from late 2019:
-
-
+  - [-] Update peer ref file(s)
+    - [x] peers-all
+    - [x] peers-mentioned
+    - [ ] peers-followed (by others, parsed from comments)
+    - [ ] peers-down (net errors)
+    - [ ] redirects?
+    Rough sketch from late 2019:
     let read file =
         ...
     let write file peers =
@@ -91,6 +99,31 @@ In-progress
 
 Backlog
 -------
+- [ ] Crawl all cache/objects/*, not given peers.
+      BUT, in order to build A-mentioned-B graph, we need to know the nick
+      associated with the URI whose object we're examining. How to do that?
+- [ ] Crawl downloaded web access logs
+- [ ] download-command hook to grab the access logs
+
+      (define (parse log-line)
+        (match (regexp-match #px"([^/]+)/([^ ]+) +\\(\\+([a-z]+://[^;]+); *@([^\\)]+)\\)" log-line)
+          [(list _ client version uri nick) (cons nick uri)]
+          [_ #f]))
+
+      (list->set (filter-map parse (file->lines "logs/combined-access.log")))
+
+      (filter (λ (p) (equal? 'file (file-or-directory-type p))) (directory-list logs-dir))
+
+- [ ] user-agent file as CLI option - need to run at least the crawler as another user
+- [ ] Support fetching rsync URIs
+- [ ] Check for peer duplicates:
+  - [ ] same nick for N>1 URIs
+  - [ ] same URI for N>1 nicks
+- [ ] Background polling and incremental timeline updates.
+      We can mark which messages have already been printed and print new ones as
+      they come in.
+      REQUIRES: polling
+- [ ] Polling mode/command, where tt periodically polls peer timelines
 - [ ] nick tiebreaker(s)
   - [ ] some sort of a hash of URI?
   - [ ] angry-purple-tiger kind of thingie?
@@ -135,6 +168,11 @@ Backlog
 
 Done
 ----
+- [x] Dedup read-in peers before using them.
+- [x] Prevent redundant downloads
+  - [x] Check ETag
+  - [x] Check Last-Modified if no ETag was provided
+    - [x] Parse rfc2822 timestamps
 - [x] caching (use cache by default, unless explicitly asked for update)
   - [x] value --> cache
   - [x] value <-- cache
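
A possible shape for the new "c | crawl" item and the "mentions from
timeline messages" boxes above: pull @<nick uri> and @<uri> mentions out of
message text with a regexp, then diff the collected URIs against the peers
already known. This is only a sketch in plain Racket under the common twtxt
mention syntax, not code from the repo; the example message and URIs are
made up.

    #lang racket
    ;; Sketch: extract twtxt-style mentions from one line of timeline text.
    ;; Returns (nick . uri) pairs; nick is #f for the bare @<uri> form.
    (define mention-px #px"@<(?:([^ >]+) +)?([a-z]+://[^>]+)>")

    (define (message-mentions text)
      (for/list ([groups (in-list (regexp-match* mention-px text #:match-select cdr))])
        (cons (first groups) (second groups))))

    ;; (message-mentions
    ;;  "hi @<alice https://alice.example/twtxt.txt> and @<https://bob.example/twtxt.txt>")
    ;; => '(("alice" . "https://alice.example/twtxt.txt")
    ;;      (#f . "https://bob.example/twtxt.txt"))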
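
For the "w | write" note about watching a FIFO: a minimal sketch of the
loop, assuming a FIFO at /tmp/tt-fifo and a local twtxt.txt (both paths are
placeholders). Each line read from the FIFO gets an RFC 3339 UTC timestamp
and is appended; an upload step would hang off the same spot.

    #lang racket
    (require racket/date)

    (define fifo-path  "/tmp/tt-fifo")   ; assumed
    (define twtxt-path "twtxt.txt")      ; assumed

    ;; Current time as an RFC 3339 / ISO 8601 UTC timestamp.
    (define (rfc3339-now)
      (parameterize ([date-display-format 'iso-8601])
        (string-append (date->string (seconds->date (current-seconds) #f) #t) "Z")))

    ;; Opening a FIFO for reading blocks until a writer appears, so this
    ;; loop naturally waits; when the writer closes it, reopen and repeat.
    (define (watch-fifo)
      (let loop ()
        (with-input-from-file fifo-path
          (λ ()
            (for ([line (in-lines)])
              (unless (string=? "" (string-trim line))
                (with-output-to-file twtxt-path #:exists 'append
                  (λ () (printf "~a\t~a\n" (rfc3339-now) line)))))))
        (loop)))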
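
For the "Check for peer duplicates" item: one way to catch both kinds of
clash, assuming peers are (nick . uri) pairs, the same shape the access-log
parse sketch in the diff produces; "peers" below is a placeholder list.

    #lang racket
    ;; Sketch: group pairs by key, keep keys mapped to more than one value.
    (define (conflicts pairs)
      (define groups
        (for/fold ([h (hash)]) ([p (in-list pairs)])
          (hash-update h (car p) (λ (vs) (set-add vs (cdr p))) (set))))
      (for/hash ([(k vs) (in-hash groups)] #:when (> (set-count vs) 1))
        (values k vs)))

    ;; same nick claimed for N>1 URIs:
    ;;   (conflicts peers)
    ;; same URI claimed by N>1 nicks (flip the pairs first):
    ;;   (conflicts (map (λ (p) (cons (cdr p) (car p))) peers))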
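
For "Background polling and incremental timeline updates": marking
already-printed messages can be as simple as carrying a set of them between
polls. The sketch assumes a fetch-messages procedure returning a list of
(timestamp nick text) string lists per peer URI; that procedure and the
10-minute interval are placeholders, not tt's actual interface.

    #lang racket
    (define poll-interval 600) ; seconds, arbitrary

    ;; Poll every peer, print only messages not seen on a previous pass.
    (define (poll-loop peer-uris fetch-messages)
      (let loop ([seen (set)])
        (define msgs
          (append* (for/list ([uri (in-list peer-uris)]) (fetch-messages uri))))
        (define fresh (filter (λ (m) (not (set-member? seen m))) msgs))
        (for ([m (in-list (sort fresh string<? #:key first))])
          (displayln (string-join m "  ")))
        (sleep poll-interval)
        (loop (set-union seen (list->set fresh)))))

A "live" mode would run this in a background thread next to the FIFO
watcher sketched above.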
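
For "nick tiebreaker(s) / some sort of a hash of URI?": a cheap version is
to suffix the nick with a short digest of its URI, so two peers who both
call themselves "alice" stay distinguishable. The separator and digest
length below are arbitrary choices for the sketch.

    #lang racket
    (require file/sha1)

    ;; "alice" + its URI -> something like "alice~1a2b3c4"
    (define (disambiguate nick uri)
      (format "~a~~~a" nick (substring (sha1 (open-input-string uri)) 0 7)))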
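
The "Prevent redundant downloads" entries in Done boil down to a
conditional GET: replay a cached ETag as If-None-Match (or a cached
Last-Modified as If-Modified-Since) and treat 304 as "nothing new". A
sketch with net/http-client follows; the host, path and header values are
placeholders and this is not the code that actually landed.

    #lang racket
    (require net/http-client)

    ;; Returns 'not-modified on a 304 response, otherwise the body string.
    (define (conditional-get host path #:etag [etag #f] #:last-modified [lm #f])
      (define headers
        (append (if etag (list (format "If-None-Match: ~a" etag)) '())
                (if lm   (list (format "If-Modified-Since: ~a" lm)) '())))
      (define-values (status resp-headers body-port)
        (http-sendrecv host path #:ssl? #t #:headers headers))
      (if (regexp-match? #rx"304" status)
          'not-modified
          (port->string body-port)))

    ;; e.g. (conditional-get "example.org" "/twtxt.txt" #:etag "\"abc123\"")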
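
And for "Parse rfc2822 timestamps" (the Last-Modified value): racket/date
has no parser for that format, so one self-contained option is a small
regexp plus find-seconds. The sketch ignores the zone suffix and assumes
GMT, which is what HTTP dates use; it is illustrative, not the parser that
was actually written.

    #lang racket
    (require racket/date)

    (define months '("Jan" "Feb" "Mar" "Apr" "May" "Jun"
                     "Jul" "Aug" "Sep" "Oct" "Nov" "Dec"))

    ;; "Tue, 01 Jun 2021 09:15:00 GMT" -> seconds since the epoch, or #f.
    (define (rfc2822->seconds s)
      (match (regexp-match #px"(\\d{1,2}) (\\w{3}) (\\d{4}) (\\d{2}):(\\d{2}):(\\d{2})" s)
        [(list _ d mon y h mi sec)
         (find-seconds (string->number sec) (string->number mi) (string->number h)
                       (string->number d) (add1 (index-of months mon))
                       (string->number y)
                       #f)]  ; #f = interpret the fields as UTC
        [_ #f]))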