# TODO

- chore: initial commit
- Deploy first staging version (v1.0.0-staging.1)
- Wikipedia Database Dump
  - Download SQL files
  - Extract SQL files
  - Table structures (`CREATE TABLE`)
    - `page.sql` (`pages` table)
    - `pagelinks.sql` (`internal_links` table)
  - Adapt the downloaded SQL files to this custom schema
    - `page.sql` (`pages` table)
    - `pagelinks.sql` (`internal_links` table)
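
    A minimal sketch of this adapting step, assuming the dump's extended INSERT syntax (the file names, the namespace check, and the tuple layout are assumptions to verify against the real dumps):

    ```ts
    import { createReadStream, createWriteStream } from "node:fs"
    import { createInterface } from "node:readline"

    // Stream `page.sql` line by line and keep only the (id, title) pairs of
    // namespace-0 pages, rewritten as INSERTs into the custom `pages` table.
    const input = createInterface({
      input: createReadStream("./data/page.sql"),
      crlfDelay: Infinity,
    })
    const output = createWriteStream("./data/pages.sql")
    // Hypothetical tuple layout: (page_id, page_namespace, 'page_title', ...).
    const tupleRegex = /\((\d+),(\d+),'((?:[^'\\]|\\.)*)'/g

    for await (const line of input) {
      if (!line.startsWith("INSERT INTO")) continue
      for (const [, id, namespace, title] of line.matchAll(tupleRegex)) {
        if (namespace !== "0") continue
        output.write(`INSERT INTO pages (id, title) VALUES (${id}, '${title}');\n`)
      }
    }
    output.end()
    ```
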
  - Import SQL files
  - Try:

    ```sql
    -- Count of internal links for the 'Linux' page
    SELECT count(*)
    FROM internal_links il
    WHERE il.from_page_id = (SELECT p.id FROM pages p WHERE p.title = 'Linux');
    ```
  - Try:

    ```sql
    -- List the internal links of the 'Node.js' page, with titles
    SELECT il.to_page_id, pl.title
    FROM internal_links il
    JOIN pages pl ON pl.id = il.to_page_id
    WHERE il.from_page_id = (
      SELECT p.id FROM pages p WHERE p.title = 'Node.js'
    );
    ```
  - Move the POC (proof of concept) from the `data` folder to the `apps/cli` folder
  - Document how to use it + the date of the last execution
  - Rewrite the bash script that downloads and extracts the SQL files from the Wikipedia Database Dump in Node.js, for better cross-platform support, easier maintenance, and automation; preferably one Node.js script that generates everything needed to create the database (see the sketch below)
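
    A minimal sketch of the download-and-extract part of that script, assuming the usual URL layout on dumps.wikimedia.org (the dump date and output paths are illustrative):

    ```ts
    import { createWriteStream } from "node:fs"
    import { Readable } from "node:stream"
    import { pipeline } from "node:stream/promises"
    import { createGunzip } from "node:zlib"

    const DUMP_DATE = "20240420"
    const BASE_URL = `https://dumps.wikimedia.org/enwiki/${DUMP_DATE}`

    async function downloadAndExtract(fileName: string): Promise<void> {
      const response = await fetch(`${BASE_URL}/enwiki-${DUMP_DATE}-${fileName}.gz`)
      if (!response.ok || response.body == null) {
        throw new Error(`Failed to download ${fileName}: ${response.status}`)
      }
      // Decompress while streaming to disk, to avoid buffering gigabytes in memory.
      await pipeline(
        Readable.fromWeb(response.body as import("node:stream/web").ReadableStream),
        createGunzip(),
        createWriteStream(`./data/${fileName}`),
      )
    }

    await downloadAndExtract("page.sql")
    await downloadAndExtract("pagelinks.sql")
    ```
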
  - Verify the file content up to (but excluding) the inserts, to check whether it matches the last version, and diff it against that version
  - Update the logic that creates the custom `internal_links` table so it works with the latest Wikipedia dumps (notably the change in `pagelinks.sql`, where the title is no longer included: it now uses `pl_target_id`, a foreign key to `linktarget`); last dump tested and working: 20240420
  - Handle redirects (one possible approach sketched below)
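
    A possible query-time approach, assuming hypothetical `is_redirect` and `redirect_to_page_id` columns on the custom `pages` table (neither is in the schema above; both are illustrative):

    ```ts
    import db from "@adonisjs/lucid/services/db"

    // Follow a chain of redirects until a regular page is reached.
    async function resolveRedirects(pageId: number): Promise<number> {
      let currentId = pageId
      for (let depth = 0; depth < 10; depth++) {
        const page = await db.from("pages").where("id", currentId).first()
        if (page == null || !page.is_redirect) {
          return currentId
        }
        currentId = page.redirect_to_page_id
      }
      throw new Error(`Too many chained redirects starting from page ${pageId}`)
    }
    ```
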
- Implement the REST API (`api`) with JSON responses (AdonisJS) to get the shortest paths between 2 pages
  - Init AdonisJS project
  - Create Lucid models and migrations for the Wikipedia Database Dump: the `pages` and `internal_links` tables (migration sketch below)
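
    A sketch of the `pages` migration (the column types are assumptions; the column names match the SQL queries above):

    ```ts
    import { BaseSchema } from "@adonisjs/lucid/schema"

    export default class extends BaseSchema {
      protected tableName = "pages"

      async up(): Promise<void> {
        this.schema.createTable(this.tableName, (table) => {
          // Reuse the page ids from the Wikipedia dump as primary keys.
          table.bigInteger("id").primary()
          table.text("title").notNullable().unique()
        })
      }

      async down(): Promise<void> {
        this.schema.dropTable(this.tableName)
      }
    }
    ```
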
  - Implement `GET /wikipedia/pages?title=Node.js` to search for a page by title (the title is not necessarily sanitized: search with the user's input to check whether the page exists)
  - Implement `GET /wikipedia/pages/[id]` to get a page and all its internal links by `pageId`
  - Implement `GET /wikipedia/internal-links/paths?fromPageId=id&toPageId=id` to get all the possible paths between 2 pages (a breadth-first search sketch is below)
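
    A minimal breadth-first search sketch for the shortest-path core, finding one shortest path (the endpoint would extend this to collect all of them; at Wikipedia scale the frontier queries would also need batching):

    ```ts
    import db from "@adonisjs/lucid/services/db"

    async function findShortestPath(
      fromPageId: number,
      toPageId: number,
    ): Promise<number[] | null> {
      if (fromPageId === toPageId) {
        return [fromPageId]
      }
      // Map each visited page id to the page id it was reached from.
      const parents = new Map<number, number | null>([[fromPageId, null]])
      let frontier = [fromPageId]

      while (frontier.length > 0) {
        // Expand one level of internal links per iteration.
        const rows = await db
          .from("internal_links")
          .whereIn("from_page_id", frontier)
          .select("from_page_id", "to_page_id")
        const nextFrontier: number[] = []
        for (const row of rows) {
          if (parents.has(row.to_page_id)) continue
          parents.set(row.to_page_id, row.from_page_id)
          if (row.to_page_id === toPageId) {
            // Rebuild the path by walking the parent chain backwards.
            const path: number[] = []
            for (let id: number | null = toPageId; id != null; id = parents.get(id) ?? null) {
              path.unshift(id)
            }
            return path
          }
          nextFrontier.push(row.to_page_id)
        }
        frontier = nextFrontier
      }
      return null // The two pages are not connected.
    }
    ```
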
  - Setup tests with a database + add coverage
  - Setup health checks
  - Setup rate limiting
  - Share VineJS validators between the website and the api (sketch below)
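
    A sketch of one shared validator, placed in a common package both apps can import (the name and shape are assumptions):

    ```ts
    import vine from "@vinejs/vine"

    // Validates the query string of GET /wikipedia/pages?title=...
    export const searchPagesValidator = vine.compile(
      vine.object({
        title: vine.string().trim().minLength(1),
      }),
    )
    ```
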
- Implement Wikipedia Game Solver (`website`)
  - Init Next.js project
  - Try to use https://www.npmjs.com/package/@tuyau/client for API calls (sketch below)
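
    A sketch of what the Tuyau client setup could look like (the import path of the generated `api` definition depends on the monorepo layout and is an assumption):

    ```ts
    import { createTuyau } from "@tuyau/client"
    import { api } from "@repo/api/.adonisjs/api"

    export const tuyau = createTuyau({
      api,
      baseUrl: "http://localhost:3333",
    })

    // Example call against GET /wikipedia/pages?title=...
    const { data, error } = await tuyau.wikipedia.pages.$get({
      query: { title: "Node.js" },
    })
    ```
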
  - Implement a form with inputs and a submit button, listing all the pages to go from one page to the other, or none if it is not possible
  - Add images and links to the pages + good UI/UX
  - Autocomplete page titles (sketch below)
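
    A debounced autocompletion sketch on top of the Tuyau client above (the hook name, the 300 ms delay, and the response shape are assumptions):

    ```ts
    import { useEffect, useState } from "react"
    import { tuyau } from "./tuyau" // wherever the client above lives

    export function usePageTitleSuggestions(input: string): string[] {
      const [suggestions, setSuggestions] = useState<string[]>([])

      useEffect(() => {
        if (input.length === 0) {
          setSuggestions([])
          return
        }
        // Wait 300 ms after the last keystroke before hitting the api.
        const timeout = setTimeout(async () => {
          const { data } = await tuyau.wikipedia.pages.$get({
            query: { title: input },
          })
          const pages = (data ?? []) as Array<{ title: string }>
          setSuggestions(pages.map((page) => page.title))
        }, 300)
        return () => clearTimeout(timeout)
      }, [input])

      return suggestions
    }
    ```
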
  - Implement toast notifications for errors, warnings, and success messages
- Implement CLI (`cli`)
  - Init Clipanion project
  - Implement the `wikipedia-game-solver internal-links --from="Node.js" --to="Linux"` command to get all the possible paths between 2 pages (sketch below)
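
    A minimal Clipanion sketch of that command (the actual call to the api's paths endpoint is left out):

    ```ts
    import { Cli, Command, Option } from "clipanion"

    class InternalLinksCommand extends Command {
      static paths = [["internal-links"]]

      from = Option.String("--from", { required: true })
      to = Option.String("--to", { required: true })

      async execute(): Promise<void> {
        // Here the command would call the api's paths endpoint and print
        // each path, e.g. "Node.js -> Operating system -> Linux".
        this.context.stdout.write(`Searching paths from "${this.from}" to "${this.to}"...\n`)
      }
    }

    const cli = new Cli({
      binaryName: "wikipedia-game-solver",
      binaryLabel: "Wikipedia Game Solver",
      binaryVersion: "1.0.0",
    })
    cli.register(InternalLinksCommand)
    await cli.runExit(process.argv.slice(2))
    ```
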
- Add docs on how to: add a locale/edit translations, create a component, install a dependency in a package, create a new package; and document the technologies used, the architecture, links to where it is deployed, how end users can install/use it, how to update dependencies with `npx taze -l`, etc.
- GitHub Mirror
- Delete this TODO.md file and use issues for the remaining tasks instead