routing

routing (or URL routing) is a way to configure a website so that some (or all) of its URLs are handled by software rather than static files. An IndieWeb site may have no routing (a static site), some routing (e.g. one directory handled by PHP or other software), or all URLs handled by a framework such as Ruby on Rails with its own routing system.

Why
Why use routing? Any kind of dynamic site (anything beyond serving static files) needs some form of routing so that the web server can hand a URL to code that decides what content to serve, rather than simply mapping the URL to the file system.
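In the simplest case this is a front controller: a minimal, hypothetical Apache .htaccess sketch (script name is a placeholder) that hands any URL not matching an existing file to a single PHP script:

```apache
# Hypothetical front-controller rules; adjust the target script to taste
RewriteEngine On
# If the request does not map to a real file on disk...
RewriteCond %{REQUEST_FILENAME} !-f
# ...hand it to index.php, which decides what content to serve
RewriteRule ^ index.php [L]
```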

Tantek
Tantek has used some hybrid routing on his site tantek.com since at least 2010-01-01, when he posted his first "note" that was dynamically handled by PHP. Details:
 * Apache .htaccess is used to route some URL patterns to falcon.php
 * falcon.php (PHP) is used to route those URLs to specific types of pages:
   * home page (using an index.html template)
   * Atom feed (100% generated by code)
   * permalink pages (100% generated by code)
   * archive pages (100% generated by code, for all posts in a day or new month, or all posts of a particular type in a single day)
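The dispatch step above might look roughly like this in PHP — a loose illustration of the idea only, not falcon.php's actual code; the path patterns (e.g. /YYYY/DDD/tN style permalinks) are assumptions:

```php
<?php
// Hypothetical sketch of falcon.php-style dispatch; not the real code.
function dispatch(string $path): string {
    if ($path === '/') {
        return 'home';        // rendered from an index.html template
    }
    if ($path === '/updates.atom') {
        return 'atom-feed';   // 100% generated by code
    }
    if (preg_match('#^/\d{4}/\d{3}/#', $path)) {
        return 'permalink';   // assumed /year/day-of-year/post pattern
    }
    if (preg_match('#^/\d{4}/\d{3}$#', $path)) {
        return 'archive';     // assumed day archive pattern
    }
    return '404';
}
```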

Tools

 * PHP
   * https://github.com/nikic/FastRoute — regular-expression based router
   * the Slim framework includes routing based on FastRoute
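As a rough illustration of how a regular-expression based router works internally — a toy sketch in the spirit of FastRoute, not its actual API (with the real library you would declare routes via its dispatcher and install it with Composer):

```php
<?php
// Toy regex-based dispatch table; a real router compiles routes similarly.
$routes = [
    '#^/contact$#'         => fn() => 'contact page',
    '#^/(\d{4})/(\d{3})$#' => fn($y, $d) => "archive for day $d of $y",
];

function match_route(array $routes, string $path): ?string {
    foreach ($routes as $pattern => $handler) {
        if (preg_match($pattern, $path, $m)) {
            // Pass any captured URL segments to the handler
            return $handler(...array_slice($m, 1));
        }
    }
    return null; // no route matched; caller can respond 404
}
```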

PHP-only routing
Tantek: Since it was shown to me on day two of IndieWebCamp Brighton (2019-10-20) that it is possible to launch a PHP web server for localhost on a Mac, I realized that the routing in my htaccess file was being ignored, and thus untestable, when using a local PHP web server.
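One way around that (a sketch, assuming PHP's built-in development server) is a router script passed to `php -S`: the server calls the script for every request, and returning `false` from it tells the server to serve the requested file as-is. The route table below is a hypothetical placeholder:

```php
<?php
// router.php — hypothetical sketch for use with: php -S localhost:8080 router.php
function route(string $path): ?string {
    // Static passthrough: let the server serve real files (css, images, js) as-is
    if (preg_match('#\.(css|png|jpe?g|js|ico)$#', $path)) {
        return null; // signal "serve from the file system"
    }
    if ($path === '/contact') {
        return 'contact'; // extensionless path handled by code
    }
    return '404';
}

// When run as a router script under the built-in server, hand static files back
if (php_sapi_name() === 'cli-server') {
    $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
    if (route($path) === null) {
        return false; // built-in server serves the file itself
    }
    // ...otherwise dispatch to the matched handler here
}
```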

So to fix that I'm thinking of migrating the routing in my htaccess into PHP somehow with perhaps the following goals:
 * Use htaccess for "server config" type stuff that only needs to be handled "on the open internet", e.g.:
   * subdomains, e.g. permanent redirect from www.tantek. to tantek.
   * HTTPS-only for admin UI
   * defensively blocking abusive IPs, e.g. for excessive non-human requests
   * defensive bot blocking, e.g. for misbehaving bots that ignore robots.txt:
     * AhrefsBot
     * EasouSpider
     * Egress
     * FAST Enterprise Crawler
     * FyberSpider
     * Gigabot
     * ichiro
     * Java
     * Mubidi-bot
     * MJ12bot
     * OutfoxBot
     * SpiderMan
     * Wavefire
     * wume_crawler
 * Use PHP "routing" for anything "content" related, including URL paths:
   * content types, returning the appropriate Content-Type HTTP header for different file extensions
   * static file passthroughs for files that should be returned from the file system as-is, e.g. .css .png .jpeg .jpg .js .ico
   * paths without extensions, like /contact rather than /contact.html
   * 404 redirects to actual/new locations (sometimes repairing others' inbound links)
   * specific shortnames for blog posts, presentations, pages, or redirects like affiliate links or profile accounts (e.g. /github to github.com/tantek) or comms URLs (e.g. /txt to sms:tantek@... )
     * defensive shortname (or path) banning, e.g. requests from bots (presumably) for things I've never had, like /_vti_bin, /MSOffice, /wp-admin. Keeping this blocklist here should help prevent actually putting something useful there.
   * algorithmic shortpaths, e.g. Whistle shortlinks
   * algorithmic permalinks
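The content-type and static-passthrough goals above could be sketched like this — the MIME values are standard, but the function and its fallback behavior are an assumed illustration, not the site's actual code:

```php
<?php
// Hypothetical extension-to-Content-Type map for static passthrough
function content_type(string $path): string {
    $types = [
        'css'  => 'text/css',
        'js'   => 'text/javascript',
        'png'  => 'image/png',
        'jpg'  => 'image/jpeg',
        'jpeg' => 'image/jpeg',
        'ico'  => 'image/x-icon',
    ];
    $ext = strtolower(pathinfo($path, PATHINFO_EXTENSION));
    // Extensionless paths like /contact fall through to HTML
    return $types[$ext] ?? 'text/html';
}
```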