AI Readiness

This page documents the machine-readable infrastructure of websites.blackai.capital for AI agents and audit tools. All data shown here is derived from the actual site implementation.

01 JSON-LD Schema Markup

Embedded in the <head> of every page as a single @graph with four schema.org types:

@type: Organization

@id: https://websites.blackai.capital/#organization

  • name: BlackAI Holding AG
  • legalName: BlackAI Holding AG
  • taxID: CHE-202.737.638
  • url: https://websites.blackai.capital
  • address: c/o MLaw Daniel Villiger, Baarerstrasse 78, 6300 Zug, CH
  • contactPoint: hello@blackai.capital (en, de)
  • founder: Prof. Dr. Walter Kurz, Chairman of the Board

@type: ProfessionalService

@id: https://websites.blackai.capital/#service

  • name: BlackAI Websites
  • serviceType: AI Readiness & Website Optimization
  • areaServed: CH, DE, AT, EU
  • parentOrganization: BlackAI Holding AG
  • Services offered (six-stage AI readiness model):
      • Stage 1 — AI-Readable: Make existing websites visible to AI agents (llms.txt, structured data, schema markup)
      • Stage 2 — AI-Optimized: New websites built from the ground up for humans and AI agents
      • Stage 3 — AI Website: Websites with an integrated AI conversation layer
      • Stage 4 — Enterprise AI: AI systems inside the company (RAG pipelines, vector stores, model serving)
      • Stage 5 — Own AI Model: Fine-tuned AI models trained on company data (QLoRA, full fine-tuning)

@type: WebSite

@id: https://websites.blackai.capital/#website

  • name: BlackAI Websites
  • url: https://websites.blackai.capital
  • inLanguage: en

@type: Person

@id: https://websites.blackai.capital/#founder

  • name: Prof. Dr. Walter Kurz
  • jobTitle: Founder & Chairman
  • url: https://www.linkedin.com/in/drwalterkurz
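
The @graph structure above can be sketched as a plain TypeScript object. This is a minimal illustration of the four-node shape, not the literal production markup; the property subsets shown per node are abbreviated from the lists above.

```typescript
// Minimal sketch of the JSON-LD @graph (abbreviated; values from the lists above).
const graph = {
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "@id": "https://websites.blackai.capital/#organization",
      name: "BlackAI Holding AG",
      taxID: "CHE-202.737.638",
      url: "https://websites.blackai.capital",
    },
    {
      "@type": "ProfessionalService",
      "@id": "https://websites.blackai.capital/#service",
      name: "BlackAI Websites",
      serviceType: "AI Readiness & Website Optimization",
      parentOrganization: { "@id": "https://websites.blackai.capital/#organization" },
    },
    {
      "@type": "WebSite",
      "@id": "https://websites.blackai.capital/#website",
      name: "BlackAI Websites",
      url: "https://websites.blackai.capital",
      inLanguage: "en",
    },
    {
      "@type": "Person",
      "@id": "https://websites.blackai.capital/#founder",
      name: "Prof. Dr. Walter Kurz",
      jobTitle: "Founder & Chairman",
    },
  ],
};

// Serialized, this is what would sit inside a
// <script type="application/ld+json"> tag in the page <head>.
const jsonLd = JSON.stringify(graph);
```

Cross-references between nodes use @id (e.g. parentOrganization pointing at the #organization node), which lets consumers resolve the graph without duplicating data.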

02 robots.txt

Available at /robots.txt. Generated dynamically by Next.js from src/app/robots.ts.

User-Agent: * (all crawlers)

Allow: /

Disallow: /api/

Disallow: /auth/

User-Agent: GPTBot, ChatGPT-User, ClaudeBot, Claude-Web, PerplexityBot, Applebot, Applebot-Extended, GoogleOther, Bingbot, anthropic-ai, cohere-ai, Bytespider, Meta-ExternalAgent

Allow: /

Allow: /llms.txt

Allow: /llms-full.txt

Allow: /llms/

Disallow: /api/

Disallow: /auth/

Sitemap: https://websites.blackai.capital/sitemap.xml
Host: https://websites.blackai.capital
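
A src/app/robots.ts producing the directives above might look roughly like this. The rule objects are assumptions reconstructed from the listed directives, and the Robots type is a local stand-in (the real file would use Next.js's MetadataRoute.Robots), so the sketch stays self-contained.

```typescript
// Local stand-ins for Next.js's MetadataRoute.Robots, so this sketch is self-contained.
type RobotsRule = {
  userAgent: string | string[];
  allow?: string | string[];
  disallow?: string | string[];
};
type Robots = { rules: RobotsRule[]; sitemap?: string; host?: string };

// AI crawlers that get explicit Allow rules for the llms resources.
const aiBots = [
  "GPTBot", "ChatGPT-User", "ClaudeBot", "Claude-Web", "PerplexityBot",
  "Applebot", "Applebot-Extended", "GoogleOther", "Bingbot",
  "anthropic-ai", "cohere-ai", "Bytespider", "Meta-ExternalAgent",
];

export default function robots(): Robots {
  return {
    rules: [
      // All crawlers: full site except API and auth routes.
      { userAgent: "*", allow: "/", disallow: ["/api/", "/auth/"] },
      // AI crawlers: additionally spell out the llms resources.
      {
        userAgent: aiBots,
        allow: ["/", "/llms.txt", "/llms-full.txt", "/llms/"],
        disallow: ["/api/", "/auth/"],
      },
    ],
    sitemap: "https://websites.blackai.capital/sitemap.xml",
    host: "https://websites.blackai.capital",
  };
}
```

Next.js serializes this return value into the /robots.txt response, one User-Agent group per rule object.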

03 Sitemap

Available at /sitemap.xml. Contains all pages plus the per-page LLM markdown files served under /llms/.
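
A corresponding src/app/sitemap.ts could combine page routes and LLM files in one list. This is a hedged sketch: the entry type is a local stand-in for Next.js's MetadataRoute.Sitemap entries, and the path arrays are abbreviated for illustration.

```typescript
// Local stand-in for a MetadataRoute.Sitemap entry, keeping the sketch self-contained.
type SitemapEntry = { url: string; lastModified?: Date };

const BASE = "https://websites.blackai.capital";

// Abbreviated route list (illustrative, not the full set of pages).
const pagePaths = ["", "/services", "/research", "/team", "/contact"];
const llmResources = ["/llms.txt", "/llms-full.txt"];

export default function sitemap(): SitemapEntry[] {
  // One entry per page plus one per machine-readable LLM file.
  return [...pagePaths, ...llmResources].map((path) => ({
    url: `${BASE}${path}`,
    lastModified: new Date(),
  }));
}
```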

04 llms.txt

Two AI agent content files are available:

  • /llms.txt — Site index with the six-stage AI readiness model, service links, and company overview
  • /llms-full.txt — Complete site content in plain text (all pages concatenated)

Per-page markdown files are available under /llms/ for individual content extraction.
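
Since /llms.txt is a markdown-style index, an agent can pull out the linked per-page resources with a simple link scan. The sample content below is invented for illustration and is not the live file:

```typescript
// Invented sample in llms.txt's markdown-like index format (not the live file).
const sample = `# BlackAI Websites

> AI Readiness & Website Optimization

## Services
- [Services](https://websites.blackai.capital/llms/services.md)
- [Team](https://websites.blackai.capital/llms/team.md)
`;

// Extract the URL targets of all markdown links: [label](url)
function extractLinks(llmsTxt: string): string[] {
  const links: string[] = [];
  const re = /\[[^\]]*\]\(([^)]+)\)/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(llmsTxt)) !== null) {
    links.push(m[1]);
  }
  return links;
}
```

An agent would fetch /llms.txt, extract the links, and then fetch the individual /llms/*.md files it needs instead of parsing rendered HTML.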

05 Noscript Fallbacks

Every page includes a <noscript> block with a content summary, service descriptions, and links to machine-readable resources.

Pages with noscript fallbacks:

  • / (Homepage)
  • /ai-websites
  • /ai-integration
  • /enterprise-ai
  • /ai-models
  • /nerds
  • /partners
  • /contact
  • /imprint
  • /privacy-policy
  • /services
  • /research
  • /team
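
The shape of such a fallback can be sketched as an HTML-producing helper. Both the helper name and the wording are illustrative assumptions, not the production component:

```typescript
// Illustrative helper producing a <noscript> fallback block; name and
// wording are assumptions, not the production component.
function noscriptFallback(title: string, summary: string): string {
  return [
    "<noscript>",
    `  <h1>${title}</h1>`,
    `  <p>${summary}</p>`,
    '  <p>Machine-readable resources: <a href="/llms.txt">/llms.txt</a>,',
    '  <a href="/llms-full.txt">/llms-full.txt</a>, <a href="/sitemap.xml">/sitemap.xml</a></p>',
    "</noscript>",
  ].join("\n");
}
```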

06 Server-Side Rendering

This is a Next.js server-rendered website: every page delivers its full HTML from the server, so no JavaScript is required to read page content.
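
An audit tool can verify this claim by inspecting the raw HTML without executing any JavaScript, for example by checking for the embedded JSON-LD script tag. The HTML below is a stub; a real check would run the same test against the body returned by fetching a live page:

```typescript
// Check raw (unexecuted) HTML for an embedded JSON-LD script tag.
function hasJsonLd(html: string): boolean {
  return /<script[^>]*type="application\/ld\+json"/.test(html);
}

// Stub HTML standing in for a fetched page response.
const stub =
  '<html><head><script type="application/ld+json">{"@graph":[]}</script></head>' +
  "<body>content</body></html>";
```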