X-Git-Url: https://git.xandkar.net/?a=blobdiff_plain;f=TODO;h=69ace9090ff10c91453beb93eea1af28186ccc6f;hb=HEAD;hp=80fff856479c9aca8b92871204fa69e6cb24ba80;hpb=3231d4b5fb17c45440761873ae66a8ad03f7308c;p=tt.git

diff --git a/TODO b/TODO
index 80fff85..69ace90 100644
--- a/TODO
+++ b/TODO
@@ -10,7 +10,13 @@ Legend:
 
 In-progress
 -----------
-
+- [-] timeline limits
+  - [x] by time range
+  - [ ] by msg count
+    - [ ] per peer
+    - [ ] total
+  Not necessary for short format, because we have Unix head/tail,
+  but may be convenient for long format (because msg spans multiple lines).
 - [-] Convert to Typed Racket
   - [x] build executable (otherwise too-slow)
   - [-] add signatures
@@ -18,6 +24,8 @@ In-progress
     - [ ] inner
     - [ ] imports
 - [-] commands:
+  - [x] c | crawl
+        Discover new peers mentioned by known peers.
   - [x] r | read
     - see timeline ops above
   - [ ] w | write
@@ -48,9 +56,19 @@ In-progress
   - [x] mentions from timeline messages
     - [x] @<nick url>
     - [x] @<url>
-  - [x] "following" from timeline comments: # following =
+  - [ ] "following" from timeline comments: # following =
+        1. split file lines into 2 groups: comments and messages
+        2. dispatch message parsing as usual
+        3. dispatch comment parsing for:
+           - # following =
+           - what else?
   - [ ] Parse User-Agent web access logs.
-  - [ ] Update peer ref file(s)
+  - [-] Update peer ref file(s)
+    - [x] peers-all
+    - [x] peers-mentioned
+    - [ ] peers-followed (by others, parsed from comments)
+    - [ ] peers-down (net errors)
+    - [ ] redirects?
 
   Rough sketch from late 2019:
     let read file = ...
@@ -93,6 +111,22 @@ In-progress
 
 Backlog
 -------
+- [ ] Support date without time in timestamps
+- [ ] Associate cached object with nick.
+- [ ] Crawl downloaded web access logs
+- [ ] download-command hook to grab the access logs
+
+      (define (parse log-line)
+        (match (regexp-match #px"([^/]+)/([^ ]+) +\\(\\+([a-z]+://[^;]+); *@([^\\)]+)\\)" log-line)
+          [(list _ client version uri nick) (cons nick uri)]
+          [_ #f]))
+
+      (list->set (filter-map parse (file->lines "logs/combined-access.log")))
+
+      (filter (λ (p) (equal? 'file (file-or-directory-type p))) (directory-list logs-dir))
+
+- [ ] user-agent file as CLI option - need to run at least the crawler as another user
+- [ ] Support fetching rsync URIs
 - [ ] Check for peer duplicates:
   - [ ] same nick for N>1 URIs
   - [ ] same URI for N>1 nicks
@@ -100,7 +134,6 @@ Backlog
   We can mark which messages have already been printed and print new
   ones as they come in.
   REQUIRES: polling
-- [ ] Use ETag to prevent redundant downloads
 - [ ] Polling mode/command, where tt periodically polls peer timelines
 - [ ] nick tiebreaker(s)
   - [ ] some sort of a hash of URI?
@@ -114,10 +147,8 @@ Backlog
   - [ ] download times per peer
 - [ ] Support redirects
   - should permanent redirects update the peer ref somehow?
-- [ ] Support time ranges (i.e. reading the timeline between given time points)
 - [ ] optional text wrap
 - [ ] write
-- [ ] timeline limits
 - [ ] peer refs set operations (perhaps better done externally?)
 - [ ] timeline as a result of a query (peer ref set op + filter expressions)
 - [ ] config files
@@ -146,6 +177,13 @@ Backlog
 
 Done
 ----
+- [x] Crawl all cache/objects/*, not given peers.
+- [x] Support time ranges (i.e. reading the timeline between given time points)
+- [x] Dedup read-in peers before using them.
+- [x] Prevent redundant downloads
+  - [x] Check ETag
+  - [x] Check Last-Modified if no ETag was provided
+  - [x] Parse rfc2822 timestamps
 - [x] caching (use cache by default, unless explicitly asked for update)
   - [x] value --> cache
   - [x] value <-- cache
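
A possible shape for the '"following" from timeline comments' plan above
(steps 1-3 in the diff): split a cached file's lines into comments and
messages, hand the messages to the existing parser, and scan the comments for
a "# following =" prefix. The payload after "=" (assumed here to be a nick
followed by a URI) and the helper names are guesses, not tt's actual format or
API.

    #lang racket

    ;; Assumed comment shape: "# following = nick https://example.net/twtxt.txt"
    (define (comment? line)
      (regexp-match? #rx"^[ \t]*#" line))

    (define (following-in-comment line)
      (match (regexp-match #px"^\\s*#\\s*following\\s*=\\s*(\\S+)\\s+(\\S+)" line)
        [(list _ nick uri) (cons nick uri)]
        [_ #f]))

    ;; 1. split lines into comments and messages,
    ;; 2. return the messages for the usual message parser,
    ;; 3. return (nick . uri) pairs found in "# following =" comments.
    (define (split-and-scan path)
      (define-values (comments messages) (partition comment? (file->lines path)))
      (values (filter-map following-in-comment comments) messages))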
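
A self-contained version of the rough access-log sketch in the Backlog hunk
above ("Crawl downloaded web access logs" / "download-command hook"): same
regexp, wrapped in a module with the pieces it needs. `peers-in-log`,
`log-files`, and the log paths are illustrative names only; `directory-list`
is called with `#:build? #t` so `file-or-directory-type` sees full paths
rather than bare file names.

    #lang racket

    ;; Extract (nick . uri) pairs from twtxt-style User-Agent strings,
    ;; e.g. "twtxt/1.2.3 (+https://example.net/twtxt.txt; @nick)".
    (define (parse log-line)
      (match (regexp-match #px"([^/]+)/([^ ]+) +\\(\\+([a-z]+://[^;]+); *@([^\\)]+)\\)" log-line)
        [(list _ client version uri nick) (cons nick uri)]
        [_ #f]))

    ;; Unique peers mentioned in one combined access log.
    (define (peers-in-log log-path)
      (list->set (filter-map parse (file->lines log-path))))

    ;; Regular files in a logs directory (full paths, so the type check works).
    (define (log-files logs-dir)
      (for/list ([p (directory-list logs-dir #:build? #t)]
                 #:when (eq? 'file (file-or-directory-type p)))
        p))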
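
And a minimal sketch of the "Prevent redundant downloads" behaviour recorded
as done above: send If-None-Match when an ETag was cached, fall back to
If-Modified-Since, and treat a 304 response as "keep the cached object". Only
stock net/url is used; the function names and the way ETag/Last-Modified
values are stored are assumptions, not tt's real API.

    #lang racket

    (require net/url)

    ;; Build conditional-request headers from whatever was cached last time.
    (define (conditional-headers etag last-modified)
      (cond [etag          (list (format "If-None-Match: ~a" etag))]
            [last-modified (list (format "If-Modified-Since: ~a" last-modified))]
            [else          '()]))

    ;; Returns #f on "304 Not Modified" (cache still fresh),
    ;; otherwise (cons response-headers body).
    (define (fetch-if-changed uri-str etag last-modified)
      (define in (get-impure-port (string->url uri-str)
                                  (conditional-headers etag last-modified)))
      (define head (purify-port in))  ; status line + response headers
      (cond [(regexp-match? #rx"^HTTP/[0-9.]+ 304" head)
             (close-input-port in)
             #f]
            [else
             (define body (port->string in))
             (close-input-port in)
             (cons head body)]))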