webls/robots

Types

The policy for a specific robot

pub type Robot {
  Robot(
    user_agent: String,
    allowed_routes: List(String),
    disallowed_routes: List(String),
  )
}

Constructors

  • Robot(
      user_agent: String,
      allowed_routes: List(String),
      disallowed_routes: List(String),
    )

    Arguments

    • user_agent

      The user agent, such as “googlebot”, or “*” to match all crawlers

    • allowed_routes

      The allowed routes such as “/posts/” and “/contact/”

    • disallowed_routes

      The disallowed routes such as “/admin/” and “/private/”
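
    A policy can be built with the record constructor directly. A minimal sketch, reusing the example values above:

    ```gleam
    import webls/robots.{type Robot, Robot}

    // A policy for Googlebot: allow the blog and contact pages,
    // keep crawlers out of admin and private areas.
    pub fn googlebot_policy() -> Robot {
      Robot(
        user_agent: "googlebot",
        allowed_routes: ["/posts/", "/contact/"],
        disallowed_routes: ["/admin/", "/private/"],
      )
    }
    ```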

The configuration for a robots.txt file

pub type RobotsConfig {
  RobotsConfig(sitemap_url: String, robots: List(Robot))
}

Constructors

  • RobotsConfig(sitemap_url: String, robots: List(Robot))

    Arguments

    • sitemap_url

      The URL of the sitemap for crawlers to use

    • robots

      A list of robot policies

Functions

pub fn allowed_routes(
  robot: Robot,
  routes: List(String),
) -> Robot
pub fn disallowed_routes(
  robot: Robot,
  routes: List(String),
) -> Robot
pub fn robot(user_agent: String) -> Robot
pub fn to_string(config: RobotsConfig) -> String
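
The functions above suggest a builder-style pipeline: start from `robot`, attach routes, wrap the policies in a `RobotsConfig`, and render with `to_string`. A sketch, assuming a hypothetical sitemap URL at `https://example.com/sitemap.xml`:

```gleam
import webls/robots.{RobotsConfig}

// Render a robots.txt with a single catch-all policy.
pub fn robots_txt() -> String {
  let policy =
    robots.robot("*")
    |> robots.allowed_routes(["/posts/"])
    |> robots.disallowed_routes(["/admin/"])

  RobotsConfig(
    sitemap_url: "https://example.com/sitemap.xml",
    robots: [policy],
  )
  |> robots.to_string
}
```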