How Much Traffic Do You Get from ChatGPT? Here Are the Numbers and Methods
More and more people are asking: “Does ChatGPT actually drive traffic to websites?” The answer is more nuanced than you might think – and yes, it can be measured. Here, I show how I did it and what the numbers reveal.
This blog post was originally written and published in Norwegian: Link to original
What is utm_source and how does ChatGPT use it?
Starting in fall 2024, ChatGPT began adding `utm_source` to URLs when linking to different sources, making it possible for websites to see this traffic in their analytics.
Quick sidenote: Did you know that `utm` stands for Urchin Tracking Module? The analytics tool Urchin was acquired by Google in 2005 and renamed Google Analytics the following year. Today, `utm_source`, `utm_medium`, and `utm_campaign` are the de facto standard for tracking web traffic, not only in Google Analytics but across most analytics platforms.
This is what ChatGPT adds to some of its outbound links:
https://example.com/my-awesome-content?utm_source=chatgpt.com
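As a sketch of how such referrals can be flagged in raw landing-page data (the helper name and sample URLs are my own illustration, not part of the original setup), you can simply parse the query string and look for the `chatgpt.com` source value:

```python
from urllib.parse import urlparse, parse_qs

def is_chatgpt_referral(url: str) -> bool:
    """Return True if the landing URL carries utm_source=chatgpt.com."""
    params = parse_qs(urlparse(url).query)
    return params.get("utm_source") == ["chatgpt.com"]

# Hypothetical landing URLs from an analytics export
urls = [
    "https://example.com/my-awesome-content?utm_source=chatgpt.com",
    "https://example.com/other-page",
]
print(sum(is_chatgpt_referral(u) for u in urls))  # → 1
```

Any analytics tool that records the full landing URL can apply the same filter, which is all a `utm_source` segment really does.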
For the website I reviewed in April, there were only 18 sessions from October to April recorded as such traffic. So, only 18 times did someone click from a ChatGPT result to my site.
In the same period, the site had about 9,500 sessions, meaning traffic from ChatGPT made up just 0.2% of the total traffic.
Bot Traffic: Lots of Activity – But Not Always Visible
But does this statistic tell the whole story? Is my site practically invisible to AI? Is there really zero activity?
No, and that’s where bots come in.
Bots are an essential part of the Internet as we know it today. This is also something I touched on when I wrote about robots.txt back in 2013.
In 2013, most bots were simply crawling content for search engine indexing, and there were only a few types of bots.
Now, 12 years later, that has changed drastically. There are countless different bots crawling the web with various purposes, such as:
- Crawling page content for search engines like Google and Bing
- Checking ads.txt to verify ad placement legitimacy
- Retrieving metadata to generate previews in tools like Teams or iMessage
- Crawling and categorizing content for ad platforms
- Scraping content for analytics or other uses
A bot works by opening a webpage, reading the content, and performing an action based on that. That action could be sending the content for indexing in a search engine, or using it to train AI models like ChatGPT.
Over a little more than a month, 107 different bots accessed 150,000 web pages.
But what happens when ChatGPT checks a website "live" for you, based on your prompt?
This Is How I Logged ChatGPT's Website Visits
Bot traffic doesn’t show up in Google Analytics or similar tools: it has low value and adds noise, and since bots typically don’t execute JavaScript, script-based analytics can’t track them anyway.
But to dig deeper into how ChatGPT behaves when checking a webpage, I started logging every time a page was loaded by a browser identifying as a bot.
This information can be pulled from server logs, but in my case, I logged this in a separate database every time a page was “served” to a bot. I logged IP address, user agent, URL of the requested page, and timestamp. Not much more info is available than that.
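A minimal sketch of that logging step, assuming an SQLite store and the same four fields (the table and function names are hypothetical; the actual implementation behind this post isn’t shown):

```python
import sqlite3
from datetime import datetime, timezone

# In-memory database for illustration; a real setup would use a file or server DB
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS bot_hits "
    "(ip TEXT, user_agent TEXT, url TEXT, ts TEXT)"
)

def log_if_bot(ip: str, user_agent: str, url: str) -> bool:
    """Record the request if the user agent identifies as a bot;
    return True when a row was logged."""
    if "bot" not in user_agent.lower():
        return False
    conn.execute(
        "INSERT INTO bot_hits VALUES (?, ?, ?, ?)",
        (ip, user_agent, url, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()
    return True
```

The substring check mirrors the filter described below (only user agents containing “bot”), which is also why bots like Perplexity-User, mentioned later, would slip through it.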
Over just a little more than a month, I recorded 107 different bots loading my pages. And these were only bots that included “bot” in their user agent, like this:
Mozilla/5.0 (compatible; Awario
I was able to identify three different bots from OpenAI that visited my pages:
| Bot | Logged user agent |
|---|---|
| ChatGPT-User | Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; ChatGPT-User/1.0; +https://openai.com/ |
| GPTBot | Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; GPT |
| OAI-SearchBot | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36; compatible; OAI-Search |
| OAI-SearchBot | Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; OAI-Search |
It turns out that the first one, ChatGPT-User, is the bot that checks a site for you “live.”
Here’s a prompt asking about the lake Sandtjern in the forest Finnemarka:
“What can you tell me about Sandtjern in Finnemarka? Feel free to check online :)”
What happens behind the scenes when you ask ChatGPT something like that is:
- It first checks an old blog post about Sandtjern
- Then an info page on Sandtjern
- And finally perhaps also the homepage of the site
All of this can be logged, which provides a solid picture of content demand, insight into trending topics, and a view of the differences between human and AI bot page views from ChatGPT.
How Much Human Traffic Comes from ChatGPT?
If you want to take it a step further, you can import this traffic into GA4 so that “human” traffic and ChatGPT traffic appear side-by-side.
What I do is group pageviews within 5-second intervals to create something that resembles sessions or users: If ChatGPT requests 3 pages, like in the Sandtjern example, within 1–2 seconds, it’s fair to assume these 3 views are tied to the same person and prompt on the other end. So, 3 pageviews and 1 user.
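The grouping heuristic can be sketched like this: pageviews whose timestamps fall within 5 seconds of the previous view are counted as one session (this is my interpretation of the interval rule; the function and sample data are illustrative):

```python
from datetime import datetime

def group_into_sessions(timestamps, window_seconds=5):
    """Group sorted pageview timestamps: a view within `window_seconds`
    of the previous view joins that session, otherwise it starts a new one."""
    sessions = []
    for ts in sorted(timestamps):
        if sessions and (ts - sessions[-1][-1]).total_seconds() <= window_seconds:
            sessions[-1].append(ts)
        else:
            sessions.append([ts])
    return sessions

hits = [
    datetime(2025, 5, 1, 12, 0, 0),
    datetime(2025, 5, 1, 12, 0, 1),
    datetime(2025, 5, 1, 12, 0, 2),   # same prompt: 3 views, 1 "user"
    datetime(2025, 5, 1, 14, 30, 0),  # a later, unrelated prompt
]
print(len(hits), len(group_into_sessions(hits)))  # → 4 2
```

So four raw pageviews collapse into two ChatGPT-driven “sessions”, matching the 3-pages-in-1–2-seconds pattern from the Sandtjern example.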
Example of a report used by Liernett:
As the GA4 report shows, the number of views generated by humans and ChatGPT are quite similar and evenly distributed over time.
If I break down the number of users by hour of the day over the last 28 days, the numbers follow each other even more closely:
Website Type and Content Matters
In the Liernett example above, ChatGPT-generated traffic is about equal to regular human traffic. But note that Liernett is a news archive with over 7,000 articles packed with text and info. How much traffic different types of websites get from ChatGPT will likely vary a lot.
For the other websites where I ran the same experiment, the results were a bit different, with pageviews generated by ChatGPT prompts making up around 10% of the total:
Click-Through Rate and What It Really Means
At the start, I asked how many actual visitors a site gets from AI models like ChatGPT.
The answer is that it might be vanishingly few people who end up on your site.
But as I’ve shown, it’s technically possible to measure the demand and how often ChatGPT uses your site as a source.
And once you’ve done that, you can calculate a form of click-through rate: How many times were you listed as a source? How many times did that lead to a (human) visit?
Click-Through Rate = Sessions with utm_source=chatgpt.com / Sessions from the ChatGPT-User bot
If Liernett had 1,200 sessions from the ChatGPT-User bot over 4 weeks, and this led to 6 sessions (clicks) from chatgpt.com, the click-through rate would be 6 / 1,200 = 0.5%.
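The same calculation as a tiny helper (the function name is my own; the inputs are the hypothetical Liernett numbers above):

```python
def chatgpt_ctr(human_sessions: int, bot_sessions: int) -> float:
    """CTR: sessions arriving with utm_source=chatgpt.com divided by
    sessions generated by the ChatGPT-User bot."""
    if bot_sessions == 0:
        return 0.0
    return human_sessions / bot_sessions

print(f"{chatgpt_ctr(6, 1200):.1%}")  # → 0.5%
```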
It’s only when you can measure this over time and optimize for this type of traffic that it becomes truly interesting.
Sources of Error and Limitations You Should Know
Caching: It's worth noting that we don’t fully understand how an LLM (large language model) product like ChatGPT works, despite some documentation about its bots. For example, some content may come from recently cached pages. Still, my tests show that at least one page is typically fetched live when the site is used as a source.
Other AI Models: Many other AI models exist besides ChatGPT, especially Google’s AI Overview. I log those too, but it's been more difficult to determine which bots are tied to user prompts vs. regular crawling. I’ve recorded 24 different user agents from Google, for example.
Missing utm_source: Even though ChatGPT started using utm_source in fall 2024, not all links have included it consistently. There’s a difference between links shown in the main answer, links listed as sources, and those listed under “More.” The latter only recently (April–May 2025) started receiving utm_source.
Missing Bots: There may also be other bots that don’t include “bot” in their user agent, and therefore aren’t logged by me. One example was Perplexity.ai, which I initially missed because its user agent was:
Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Perplexity-User/1.0; +https://perplexity.ai/perplexity-user)
Consent: Regular (human) visitors are only measured in Google Analytics if they’ve given consent. In my numbers, I’ve simply scaled traffic numbers based on a 91% consent rate across all sites. Bot traffic requires no consent since it contains no personal data.
My Main Findings
- ChatGPT often generates traffic – but it’s mostly bot traffic
- Actual human traffic is low (often <1%)
- Logging bot activity gives better insight than Google Analytics alone
- Tracking ChatGPT click-through rates opens new optimization opportunities