Statistics can be useful, but they can also lead to false conclusions, especially when the underlying data is inaccurate. Web statistics are no exception, particularly when different systems report different numbers. For example, I use the WordPress blogging platform, which provides some traffic statistics. I recently upgraded my hosting plan to include Google Analytics, and the numbers do not match.
As a result, tracking systems will likely move their logic to the application tier, built on a service-oriented architecture that supports both static files (Jamstack) and dynamic content delivery engines such as ASP.NET. Rather than processing metadata from incoming requests inline, the web application tier would asynchronously forward the relevant details to a service that performs robot detection and stores the data for reporting, and likely offers experience management and other features as services, possibly consumed by the same web application tier. Logically, this metadata forwarding would take the form of filters that process all incoming HTTP requests, possibly at the edge.
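To make the idea concrete, here is a minimal sketch of such a filter as WSGI-style middleware in Python. Everything here is hypothetical (the class, the `forward` callable, and the choice of metadata fields are my own illustration, not an existing product's API); the point is simply that the request path copies a few details onto a queue and returns immediately, while a background worker ships them to the tracking service.

```python
import queue
import threading


class TrackingFilter:
    """Hypothetical middleware: forwards request metadata asynchronously."""

    def __init__(self, app, forward):
        self.app = app          # the wrapped WSGI application
        self.forward = forward  # callable that ships metadata to the tracking service
        self.q = queue.Queue()
        # Daemon worker drains the queue off the request path.
        threading.Thread(target=self._drain, daemon=True).start()

    def _drain(self):
        while True:
            item = self.q.get()
            self.forward(item)  # e.g. an HTTP POST to the tracking service
            self.q.task_done()

    def __call__(self, environ, start_response):
        # Copy only the details the tracking service needs, then move on;
        # robot detection and reporting happen in the service, not here.
        self.q.put({
            "path": environ.get("PATH_INFO", ""),
            "user_agent": environ.get("HTTP_USER_AGENT", ""),
            "remote_addr": environ.get("REMOTE_ADDR", ""),
        })
        return self.app(environ, start_response)
```

The same shape translates to ASP.NET middleware or an edge function: the filter never blocks the response on the tracking backend, so tracking outages or slow robot detection cannot degrade page delivery.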
Without second-party and third-party cookies, tracking users across web properties not managed by a single organization would require aggregating this data across tracking services, which should be impossible if the tracking-service vendors enforce security and data privacy for their customers. As a programmer, though, I am familiar with `if` clauses that do not cover all corner cases.
Time, please prove me wrong.