web-crawler blazor-server-side

How do I conditionally turn off pre-render for 2 Blazor pages


I have a Blazor InteractiveServer application using .NET 8.

The Google web crawler gets just the static HTML for a page; it doesn't wait for Blazor to create the SignalR circuit, and therefore never sees the result of OnInitializedAsync() reading from the DB and populating the page.

I have 2 pages that I need to have indexed with their full content. Each rendering of these pages is unique due to a parameter in the URL (?id=123), and the sitemap.txt lists thousands of URLs for these 2 pages.

My first question is: how do I determine whether the page is being read by a crawler?

And second, how do I then put that page in a render mode where the first response is the fully populated page as static HTML?

It's fine if, after the page is fully rendered as static HTML, it then re-renders as an interactive page with the SignalR circuit set up.

I have found this info on pre-rendering, but it's not clear where it goes or whether this is what I need, and nothing explains how to do it only when a crawler has requested the page.
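For context, in .NET 8 that pre-render setting goes in the page's @rendermode directive. A minimal sketch (the /detail route is a placeholder); note that prerender: false turns the static pass off entirely, which is the opposite of what the crawler case needs:

    @page "/detail"
    @* prerender: false suppresses the static HTML pass; the default
       (prerender: true) keeps the static pass that a crawler would see. *@
    @rendermode @(new InteractiveServerRenderMode(prerender: false))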


Solution

  • Here's the solution:

    1. In your sitemap, append &crawler=true to every URL (the URLs already carry ?id=..., so the extra parameter joins with &).
    2. When the crawler parameter is passed in, fully populate the page during the pre-render call, as shown in the sketch after this list.

    That's it.
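For step 1, the sitemap.txt entries would look something like this (example.com and the /detail route are placeholders):

    https://example.com/detail?id=123&crawler=true
    https://example.com/detail?id=124&crawler=true

For step 2, here is a minimal sketch of such a page. ItemService, Item, and GetItemAsync are hypothetical stand-ins for your own DB access; [SupplyParameterFromQuery] binds the id and crawler query-string values:

    @page "/detail"
    @rendermode InteractiveServer
    @inject ItemService Items

    @if (item is null)
    {
        <p>Loading...</p>
    }
    else
    {
        <h1>@item.Title</h1>
        <p>@item.Description</p>
    }

    @code {
        [SupplyParameterFromQuery] public int Id { get; set; }
        [SupplyParameterFromQuery] public bool Crawler { get; set; }

        private Item? item;

        protected override async Task OnInitializedAsync()
        {
            // This runs during the pre-render pass. When the sitemap's
            // crawler=true flag is present, do the DB read now so the static
            // HTML returned to the crawler is already fully populated.
            if (Crawler)
            {
                item = await Items.GetItemAsync(Id);
            }
        }

        protected override async Task OnAfterRenderAsync(bool firstRender)
        {
            // Ordinary visitors (no crawler flag) load their data once the
            // interactive SignalR circuit is up. This method never runs
            // during the pre-render pass.
            if (firstRender && item is null)
            {
                item = await Items.GetItemAsync(Id);
                StateHasChanged();
            }
        }
    }

After the prerendered static HTML is sent, the interactive render re-runs the lifecycle over SignalR, which matches the "render static first, then go interactive" behavior the question asks for.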