
# RobotsTxtMiddleware

A robots.txt middleware for ASP.NET Core. Why is this needed? Because if you need to insert dynamic values (such as a URL configured in your CMS) into robots.txt, you need some code to generate the file, and this middleware makes that easy (see the configuration-driven sketch under Usage below).

## Installation

### NuGet

    PM> Install-Package RobotsTxtCore

### .NET CLI

    > dotnet add package RobotsTxtCore

https://www.nuget.org/packages/RobotsTxtCore/

## Usage

The fluent interface makes it easy to specify multiple rules.

    app.UseRobotsTxt(builder =>
        builder
            .AddSection(section => 
                section
                    .AddComment("Allow Googlebot")
                    .AddUserAgent("Googlebot")
                    .Allow("/")
                )
            .AddSection(section => 
                section
                    .AddComment("Disallow the rest")
                    .AddUserAgent("*")
                    .AddCrawlDelay(TimeSpan.FromSeconds(10))
                    .Disallow("/")
                )
            .AddSitemap("https://example.com/sitemap.xml")
    );

### Output

    # Allow Googlebot
    User-agent: Googlebot
    Allow: /

    # Disallow the rest
    User-agent: *
    Disallow: /
    Crawl-delay: 10

    Sitemap: https://example.com/sitemap.xml
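
Because the rules are built in code, values can come from configuration at runtime, which is the dynamic-values scenario mentioned in the introduction. Below is a minimal sketch of a Program.cs; the `SiteSettings:SitemapUrl` configuration key is hypothetical and used only for illustration.

    var builder = WebApplication.CreateBuilder(args);
    var app = builder.Build();

    // Read the sitemap URL from configuration (e.g. appsettings.json,
    // environment variables, or a CMS-backed configuration provider).
    // "SiteSettings:SitemapUrl" is a hypothetical key for illustration.
    var sitemapUrl = app.Configuration["SiteSettings:SitemapUrl"]
                     ?? "https://example.com/sitemap.xml";

    app.UseRobotsTxt(robots =>
        robots
            .AddSection(section =>
                section
                    .AddUserAgent("*")
                    .Allow("/"))
            .AddSitemap(sitemapUrl));

    app.Run();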

Or, if you just want to deny everyone:

    app.UseRobotsTxt(builder =>
        builder
            .DenyAll()
    );

### Output

    User-agent: *
    Disallow: /
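
A common use for DenyAll is keeping crawlers out of non-production environments. The sketch below uses the standard ASP.NET Core environment check; the branching is just one way to wire this up and is not part of the package itself.

    if (app.Environment.IsProduction())
    {
        // Serve the full rule set in production.
        app.UseRobotsTxt(robots =>
            robots
                .AddSection(section =>
                    section
                        .AddUserAgent("*")
                        .Allow("/")));
    }
    else
    {
        // Block all crawlers in staging, test, and development.
        app.UseRobotsTxt(robots => robots.DenyAll());
    }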