Google Search Console for SEO: The Reports That Actually Matter
Search Console shows real clicks, impressions, and positions from Google's index. Not estimates. The reports that drive action and the workflows behind them.
Google Search Console is the only source of real search data from Google. Not estimates from third-party tools. Not keyword difficulty scores calculated from proxy metrics. Actual impressions, clicks, positions, and indexing status, straight from Google's index.
Most SEOs check it once, glance at total clicks for the last 28 days, see a number that looks reasonable, and close the tab. That is like checking your bank balance without looking at individual transactions. The total might look fine while problems grow underneath.
Here are the Search Console reports that drive action, and the workflows that turn them into results.
Performance Report: Where the Real Data Lives
The Performance report shows queries, pages, countries, and devices for organic search. You can filter, compare, and segment the data. This is where most of your time in Search Console should be spent.
The power move: date range comparison
Select "Compare" in the date filter and pick two equivalent periods. Compare the last 28 days against the previous 28 days. Or this quarter against last quarter. The comparison view adds delta columns showing the change in clicks, impressions, CTR, and position for every query and page.
This is how you find problems early. A page that dropped from position 4 to position 9 for its primary keyword might not show up in your total traffic numbers yet, because other pages picked up the slack. But the decline in position is a leading indicator. If you catch it now, you can update the content or fix technical issues before the traffic loss compounds.
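If you export the comparison data (or two separate period exports), the same delta check can be scripted so no drop slips past you. A minimal sketch, assuming each export reduces to (query, average position) pairs; the sample data below is invented for illustration:

```python
def position_deltas(previous, current):
    """Return {query: position_change} for queries present in both periods.

    Positive values mean the query moved DOWN the rankings
    (e.g. position 4 -> 9 is a change of +5), which is the
    leading indicator we want to catch early.
    """
    prev = dict(previous)
    return {
        query: round(pos - prev[query], 1)
        for query, pos in current
        if query in prev
    }

# Hypothetical data from two 28-day exports.
last_period = [("crm software", 4.2), ("crm pricing", 7.1)]
this_period = [("crm software", 9.0), ("crm pricing", 6.8)]

# Flag anything that slid more than 2 positions.
drops = {q: d for q, d in position_deltas(last_period, this_period).items() if d > 2}
print(drops)  # flags "crm software", which slid from ~4 to ~9
```

The same function works at weekly or quarterly granularity; only the exports change.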
Filter by page, then look at queries
Instead of browsing the full query list (which can be thousands of rows), filter by a specific page URL first. This shows you exactly which queries are driving impressions and clicks to that page. You will often find queries you did not expect, which reveal content gaps or opportunities to expand the page.
You might also find that a page ranks for queries that do not match the page's intent. That is a signal to either update the page content to better serve those queries, or create a new page specifically targeting them.
Queries with high impressions but low CTR
Filter for queries where your average position is between 1 and 10 (page one) but your CTR is below 3%. These are queries where your result appears on page one but rarely gets clicked. The usual suspects: a title tag that does not match the query intent, a meta description that does not compel a click, or a SERP with rich results (featured snippets, knowledge panels, "People Also Ask") that push organic results below the fold.
The fix: test updated title tags and meta descriptions. You cannot control SERP features, but you can write better titles.
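This filter is easy to script against a query export too. A sketch, assuming rows shaped like the Performance report's CSV export; the thresholds and sample rows are illustrative, and the impressions floor is an assumption to filter out low-data noise:

```python
def page_one_low_ctr(rows, max_position=10, max_ctr=0.03, min_impressions=100):
    """Queries ranking on page one that rarely get clicked.

    min_impressions (an arbitrary floor) drops queries with too
    little data to judge CTR reliably.
    """
    return [
        r for r in rows
        if r["position"] <= max_position
        and r["ctr"] < max_ctr
        and r["impressions"] >= min_impressions
    ]

# Hypothetical export rows.
rows = [
    {"query": "invoice template", "impressions": 5400, "ctr": 0.012, "position": 6.3},
    {"query": "free invoice maker", "impressions": 80, "ctr": 0.010, "position": 8.0},
    {"query": "invoicing guide", "impressions": 2100, "ctr": 0.055, "position": 3.1},
]

for r in page_one_low_ctr(rows):
    print(r["query"])  # only "invoice template" passes all three filters
```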
Pages Report: Understanding Indexing
The Pages report (under Indexing in the sidebar) shows every URL Google knows about and its indexing status. This is more important than most people realize.
Key statuses to monitor
Crawled, currently not indexed. Google crawled the page but decided not to add it to the index. This usually means Google thinks the content is low quality, duplicate, or not useful enough. If you see important pages here, improve the content quality, add internal links pointing to them, and resubmit via URL Inspection.
Discovered, currently not indexed. Google knows the URL exists (from a sitemap or internal link) but has not crawled it yet. For new pages, this is normal and temporary. For pages that have been in this state for weeks, it could indicate a crawl budget issue: the site has too many URLs for Google to get through efficiently.
Excluded by 'noindex' tag. Intentional if you added the tag. Alarming if you did not. Check for noindex tags that were added during development and never removed, or CMS plugins that apply noindex to categories or tag pages by default.
Duplicate, submitted URL not selected as canonical. Google found multiple versions of the same content and picked a different URL as the canonical than the one you submitted. This happens with HTTP/HTTPS duplicates, www/non-www duplicates, and pages with query parameters. Fix: set proper canonical tags and redirect duplicates.
The indexing audit workflow
Monthly, check the Pages report for unexpected changes. If the number of "not indexed" pages suddenly increases, something changed: a site migration broke URLs, a robots.txt update blocked crawling, or a CMS update added unintended noindex tags. Catching these issues monthly prevents them from snowballing.
Core Web Vitals Report: Real User Data
The Core Web Vitals report shows performance data from real users (Chrome User Experience Report data). This is different from lab tools like Lighthouse or PageSpeed Insights, which test under simulated conditions.
Why this matters more than lab scores
Lab scores test one device on one connection at one point in time. The Core Web Vitals report in Search Console shows how your pages actually perform for real users across all devices and connection speeds. A page might score 95 in Lighthouse on your fast office connection but have poor LCP for users on mobile networks in rural areas.
The report groups URLs into "Good," "Needs improvement," and "Poor" categories for each metric (LCP, INP, CLS). Click into any group to see which specific URLs are affected and what the issue is.
What to prioritize
Fix "Poor" URLs first, then "Needs improvement." Focus on LCP (Largest Contentful Paint) because it has the most direct relationship with user experience and bounce rate. A page that takes four seconds to render its main content loses visitors before they even see what the page offers.
For a detailed breakdown of each Core Web Vitals metric and how to fix them, see our Core Web Vitals guide.
Links Report: Your Backlink Profile from the Source
Third-party backlink tools (Ahrefs, Moz, Semrush) estimate your backlink profile by crawling the web. They are useful but incomplete. Search Console shows the links Google actually knows about.
External links
The "Top linked pages" list shows which of your pages have the most backlinks. The "Top linking sites" list shows which domains link to you most. The "Top linking text" list shows the anchor text other sites use when linking to you.
This data reveals patterns. If your homepage has 500 linking domains but your product pages have 3, you have a distribution problem. If most of your anchor text is your brand name, you might be missing topical authority signals for your target keywords.
Internal links
The internal links section is underused. It shows how many internal links point to each page on your site. Pages with few internal links are harder for Google to discover and tend to rank lower. Your most important pages should have the most internal links.
Practical check: Look at your top 10 revenue pages. How many internal links point to each? If the answer is under 5 for any of them, you have an internal linking gap that is costing you rankings.
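This check can be automated against an export of the internal links report. A sketch, assuming the export reduces to a {url: internal link count} mapping and that you maintain your own list of revenue pages (both hypothetical here):

```python
def internal_link_gaps(link_counts, priority_pages, minimum=5):
    """Return priority pages whose internal link count is below the threshold.

    Pages missing from the export entirely are treated as having
    zero internal links, which is the worst case.
    """
    return {
        url: link_counts.get(url, 0)
        for url in priority_pages
        if link_counts.get(url, 0) < minimum
    }

# Hypothetical data from a Links report export.
link_counts = {"/pricing": 42, "/product/widgets": 3}
revenue_pages = ["/pricing", "/product/widgets", "/product/gadgets"]

print(internal_link_gaps(link_counts, revenue_pages))
# {'/product/widgets': 3, '/product/gadgets': 0}
```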
URL Inspection: How Google Sees Your Page
The URL Inspection tool shows you exactly how Google processes a specific URL. Enter any URL from your site and you get:
Index status: Is the page indexed? If not, why?
Last crawl date: When did Googlebot last visit?
Crawled as: Which user agent (smartphone or desktop)?
Rendered HTML: What the page looks like after JavaScript execution. This is critical for JavaScript-heavy sites where content loads dynamically.
Mobile usability: Any mobile-specific issues?
When to use it
After publishing or updating a page, inspect the URL and request indexing. This does not guarantee immediate indexing, but it adds the URL to Google's priority crawl queue.
When a page drops in rankings, inspect it. Check if Google is seeing the content you expect. JavaScript rendering issues can cause Google to see a blank or partial page even though it looks fine in your browser.
When debugging indexing issues from the Pages report, inspect specific URLs to understand why they are not indexed.
Practical Weekly and Monthly Workflows
Weekly (15 minutes)
Open Performance report. Compare last 7 days vs. previous 7 days.
Sort queries by change in clicks (descending). Note any significant drops.
Filter by pages. Check your top 5 landing pages for position changes.
If any page dropped more than 2 positions, investigate: content freshness, new competitors, technical issues.
Monthly (30 minutes)
Check the Pages report for changes in indexed/not-indexed counts.
Review Core Web Vitals for any pages that moved from "Good" to "Needs improvement."
Review the Links report. Note any new linking domains or lost backlinks.
Cross-reference with your GA4 data. Pages losing search traffic should be priorities for content updates. If you have not connected GA4 to Search Console yet, follow our GA4 setup guide.
Quarterly (1 hour)
Export the full query list. Compare against previous quarter's export.
Identify keyword clusters where position is trending down across multiple queries.
Review internal link distribution. Ensure important pages have adequate internal linking.
Audit the sitemap. Remove URLs that return 404 or redirect.
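The cluster-trend step above can be sketched in a few lines. This uses the first word of each query as a crude cluster key, which is an assumption for illustration; a real workflow might cluster by n-grams or topic instead:

```python
from collections import defaultdict
from statistics import mean

def cluster_position_trend(prev, curr):
    """Average position change per cluster, keyed by the first word
    of each query (a crude proxy for a keyword cluster).

    Positive values mean the cluster is slipping down the SERP.
    """
    deltas = defaultdict(list)
    for query, pos in curr.items():
        if query in prev:
            deltas[query.split()[0]].append(pos - prev[query])
    return {cluster: round(mean(changes), 1) for cluster, changes in deltas.items()}

# Hypothetical query -> average position, from two quarterly exports.
q1 = {"crm pricing": 5.0, "crm reviews": 6.0, "invoice template": 3.0}
q2 = {"crm pricing": 8.0, "crm reviews": 9.4, "invoice template": 2.8}

print(cluster_position_trend(q1, q2))
# {'crm': 3.2, 'invoice': -0.2}  -> the "crm" cluster is trending down
```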
Going Further with Search Console Data
Search Console's interface is functional but limited. You cannot build custom dashboards, combine search data with analytics data, or set up automated alerts for ranking drops. The API offers more flexibility, but requires development resources. For guidance on turning Search Console data into actionable deliverables, see our SEO reporting guide.
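To give a feel for the API, here is a sketch of the JSON body for a Search Analytics query (the `searchanalytics.query` method). Building the body needs no credentials, so it is shown here; actually sending it requires OAuth setup and the google-api-python-client library, which is omitted. The example URL is hypothetical:

```python
def search_analytics_body(start, end, page=None, row_limit=1000):
    """Build the JSON body for a Search Analytics API query.

    Dates are YYYY-MM-DD strings. If `page` is given, results are
    restricted to that landing page, mirroring the "filter by page,
    then look at queries" workflow described above.
    """
    body = {
        "startDate": start,
        "endDate": end,
        "dimensions": ["query"],
        "rowLimit": row_limit,
    }
    if page:
        body["dimensionFilterGroups"] = [{
            "filters": [{
                "dimension": "page",
                "operator": "equals",
                "expression": page,
            }]
        }]
    return body

print(search_analytics_body("2024-01-01", "2024-01-28", page="https://example.com/pricing"))
```

With credentials configured, the body is passed to `service.searchanalytics().query(siteUrl=..., body=...).execute()`, and the response rows carry the same clicks, impressions, CTR, and position metrics as the interface.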
If you want to combine Search Console data with site audits, keyword tracking, and competitor analysis in one workflow, Ooty's free SEO analyzer uses your real search data to surface actionable recommendations. For teams managing multiple sites, Ooty SEO connects directly to Search Console and layers competitive intelligence and AI visibility tracking on top.
Search Console is the foundation. Every other SEO tool is building on top of what Google tells you here. Learn to read it well, and you will spot problems and opportunities that most of your competitors miss entirely.