Detecting robots and classifying browsers

We encounter lots of people still relying on the browser classifications despite them being very much out of date. There are a few other tables you should use instead, and I talk about that in this new blog post.


Very interesting @Simon_Rumble - haven’t looked at those enrichments too much.

Do you find it challenging or slow to remove robots by joining those tables onto everything else? Or do you find their IPs/ranges consistent enough that you can filter them out based on their IPs?

I join them all in our data model after each batch load, so it's lightning fast for my queries. Doing it live in queries isn't how I'd recommend doing it.
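A minimal sketch of that pattern, using SQLite with hypothetical table and column names (`events`, `ua_robots`, `events_model` are illustrative; the real enrichment tables and schemas will differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Raw events as they arrive from a batch load (hypothetical schema).
cur.execute("CREATE TABLE events (event_id TEXT, useragent TEXT)")
# Enrichment lookup flagging robot/spider user agents (hypothetical schema).
cur.execute("CREATE TABLE ua_robots (useragent TEXT, is_robot INTEGER)")

cur.executemany("INSERT INTO events VALUES (?, ?)", [
    ("e1", "Mozilla/5.0"),
    ("e2", "Googlebot/2.1"),
])
cur.executemany("INSERT INTO ua_robots VALUES (?, ?)", [
    ("Mozilla/5.0", 0),
    ("Googlebot/2.1", 1),
])

# After each batch load, materialise the join once into the data model,
# so downstream queries read a pre-filtered table instead of joining live.
cur.execute("""
    CREATE TABLE events_model AS
    SELECT e.event_id, e.useragent
    FROM events AS e
    JOIN ua_robots AS r ON r.useragent = e.useragent
    WHERE r.is_robot = 0
""")

# Downstream queries hit events_model directly: no join, no robot traffic.
rows = cur.execute("SELECT event_id FROM events_model").fetchall()
print(rows)
```

The trade-off is paying the join cost once per batch rather than on every query, which is why doing it live tends to feel slow.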