How we applied our own Layer 0–4 methodology to this site — and what happened.
Collins Tech is a strategic consulting firm serving established businesses in construction, legal services, and healthcare. The firm's managing partner, PT Collins, has 25+ years of experience, a Master's in Psychology, NLP coaching certification, and three published books on AEO and AI strategy.
None of that mattered to AI systems.
When potential clients asked ChatGPT, Claude, or Perplexity for AEO consulting recommendations in the Tampa Bay area, Collins Tech didn't appear. The firm had exceptional credentials and zero AI visibility. The same credential-visibility gap we diagnose in our clients existed in our own business.
We decided to fix it — and document every step.
The first thing we check on any engagement is whether AI crawlers can even reach the site. This is the number one failure mode in AEO — and it's invisible. A misconfigured robots.txt will silently block every AI system from indexing your content, and you'll never know unless you check.
A robots.txt file explicitly allowing all nine major AI crawlers: GPTBot, ChatGPT-User, Google-Extended, ClaudeBot, anthropic-ai, PerplexityBot, Amazonbot, CCBot, and Bytespider. Each gets its own User-agent directive with Allow: / — no ambiguity, no inheritance issues.
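In file form the pattern is simple: one stanza per crawler. A sketch of that configuration (the sitemap URL is a placeholder):

```
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Amazonbot
Allow: /

User-agent: CCBot
Allow: /

User-agent: Bytespider
Allow: /

User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Naming each agent explicitly means no crawler has to fall through to the wildcard rule, which is where inheritance surprises usually hide.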
We also deployed an llms.txt file — a structured plain-text document that gives AI systems a complete, parseable overview of the firm: services, methodology, credentials, proprietary concepts, and publication list. This is the equivalent of handing an AI a well-organized briefing packet instead of making it dig through HTML.
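Per the emerging llms.txt convention, the file is a markdown briefing: an H1 name, a blockquote summary, then linked sections. A sketch drawn from this article (the paths are illustrative, not the site's real URLs):

```
# Collins Tech

> Strategic consulting firm specializing in Answer Engine Optimization (AEO)
> for construction, legal services, and healthcare businesses.

## Services
- [AEO Implementation](/services.html): the 14-day AEO implementation framework

## Credentials
- [About PT Collins](/about.html): 25+ years of experience, Master's in
  Psychology, NLP coaching certification

## Publications
- [Books](/books.html): three published books on AEO and AI strategy
```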
Within 48 hours of deployment, server logs showed crawl activity from GPTBot, ClaudeBot, and PerplexityBot. The door was open.

AI systems need to understand what your business is before they can recommend it. This means structured data — schema markup that defines your entity in a language machines parse natively.
ProfessionalService schema on the homepage with complete organizational data: name, address, service types, price range, founder, sameAs links to LinkedIn, Yelp, and Google Business Profile. Person schema for PT Collins with jobTitle, credentials, and worksFor linkage back to the organization entity.
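A trimmed sketch of what that markup looks like as JSON-LD (names and linkage from this article; the price range and profile URLs are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ProfessionalService",
  "@id": "#org",
  "name": "Collins Tech",
  "priceRange": "$$$",
  "founder": {
    "@type": "Person",
    "name": "PT Collins",
    "jobTitle": "Managing Partner",
    "worksFor": { "@id": "#org" }
  },
  "sameAs": [
    "https://www.linkedin.com/company/EXAMPLE",
    "https://www.yelp.com/biz/EXAMPLE"
  ]
}
</script>
```

The worksFor reference back to the organization's @id is what turns two isolated schema blocks into one connected entity graph.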
FAQPage schema on the FAQ page with nine structured question-answer pairs covering the most common queries about AEO methodology and engagement model.
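FAQPage markup pairs each question with its full answer text, so the answer is extractable verbatim. One pair might look like this (the answer wording here is illustrative, not lifted from the live FAQ):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is AEO?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Answer Engine Optimization (AEO) is the practice of structuring a site so AI answer engines can parse, validate, and cite it."
      }
    }
  ]
}
</script>
```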
Article schema on every content page. Book schema on each publication page with ISBN and ASIN identifiers. ContactPage, CollectionPage, and WebApplication schemas where appropriate.
A Wikidata entity was created to establish Collins Tech in the knowledge graph — a foundational authority signal that AI systems reference when validating whether an entity is real.
18 pages, each with at least one schema block. Every page machine-readable. The entity is no longer a collection of web pages — it's a defined, structured thing that AI systems can parse, validate, and cite.
AI answer engines don't cite pages. They cite answers. Your content has to be structured so that specific, citation-ready passages exist for the queries your prospects are asking.
Seven long-form content pages averaging 2,100+ words each — covering construction market contraction, the credential-visibility gap, healthcare single-location plateaus, law firm billing ceilings, and the 14-day AEO implementation framework. Each structured with clear H2/H3 hierarchy, answer capsules in the opening paragraphs, and specific data points that AI systems can extract.
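In page terms, an "answer capsule" means the opening paragraph under the H1 is itself a complete, quotable answer. A skeletal sketch (headings and text invented for illustration):

```html
<article>
  <h1>The Credential-Visibility Gap</h1>
  <!-- Answer capsule: a self-contained answer AI systems can lift verbatim -->
  <p>The credential-visibility gap is the distance between how qualified a
     firm is and how visible it is to AI answer engines.</p>

  <h2>Why credentials alone don't surface</h2>
  <p>Specific, extractable data points go here, not long preambles.</p>

  <h3>What AI systems actually parse</h3>
  <p>Short, declarative paragraphs under a clear heading hierarchy.</p>
</article>
```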
A comprehensive FAQ page addressing the exact questions prospects ask: What is AEO? How long does it take? What does it cost? What industries do you serve? — formatted for both human readability and machine extraction.
Three individual book pages with full descriptions, cover images, and "Buy on Amazon" CTAs — keeping visitors on the site until they choose to buy, instead of routing them straight to Amazon.
Over 15,000 words of original, structured content across the site. Every page optimized for both human readers and AI extraction. No thin content. No filler pages.
AI systems cross-reference. They check whether the claims on your site are validated by external sources. A business that exists only on its own website looks unverifiable. A business referenced across multiple credible platforms looks real.
Three published books on Amazon with proper metadata, ISBNs, and ASINs — each a distinct authority signal. LinkedIn company page with consistent NAP data. Yelp business listing. Google Business Profile. Medium publications for the "Strategic Intelligence Brief" and "Field Notes." Quora presence seeded with AEO expertise.
sameAs links in the organization schema connecting all of these platforms back to the canonical site — creating a verified cross-reference network that AI systems can trace.
Collins Tech exists across 7+ platforms with consistent data. AI systems can validate the entity from multiple independent sources. This is the difference between "a website that claims things" and "a verified business with corroborating evidence."
AEO isn't a one-time project. It's infrastructure that needs monitoring. We built tools to track our own visibility.
An AEO Visibility Analyzer tool — a free diagnostic that scans any domain for the same infrastructure we implement for clients. It checks robots.txt configuration, schema markup, content structure, and authority signals.
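The Layer 0 portion of a check like that is only a few lines. A minimal sketch of the robots.txt test, using Python's standard-library robotparser (the function name and crawler list are ours, not the analyzer's actual code):

```python
from urllib.robotparser import RobotFileParser

# The nine AI crawlers named in this article.
AI_CRAWLERS = [
    "GPTBot", "ChatGPT-User", "Google-Extended", "ClaudeBot", "anthropic-ai",
    "PerplexityBot", "Amazonbot", "CCBot", "Bytespider",
]

def blocked_ai_crawlers(robots_txt: str) -> list[str]:
    """Return the AI crawlers this robots.txt bars from the site root."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [agent for agent in AI_CRAWLERS if not parser.can_fetch(agent, "/")]

# A config that silently blocks one AI crawler while allowing everyone else.
sample = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""
print(blocked_ai_crawlers(sample))  # → ['GPTBot']
```

Pointing this at a fetched robots.txt is exactly the "check robots.txt first" step: the failure is silent in the browser but obvious in the output.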
Google Search Console integration with an 18-URL sitemap. Manual citation tracking across ChatGPT, Claude, Perplexity, and Google AI Overviews for target queries.
The entire site runs on static HTML deployed to Netlify via API — no CMS, no JavaScript framework, no build system. Each page is a self-contained HTML file with inline CSS. This is a deliberate architectural choice: static HTML loads faster, has fewer failure modes, and is fully readable by every crawler on the internet.
Deployment is done directly via the Netlify API from our working environment. Changes go live in seconds. No Git repository, no CI/CD pipeline, no abstraction layers between the code and the served page.
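Netlify's public API accepts a zipped site directly as a deploy. A sketch of that flow (the site ID and token are placeholders; the firm's exact commands may differ):

```
zip -r site.zip . -i '*.html' 'robots.txt' 'llms.txt' 'sitemap.xml'

curl -X POST "https://api.netlify.com/api/v1/sites/$SITE_ID/deploys" \
     -H "Authorization: Bearer $NETLIFY_TOKEN" \
     -H "Content-Type: application/zip" \
     --data-binary "@site.zip"
```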
Layer 0 is where most businesses fail and never know it. If your robots.txt blocks AI crawlers — which many default configurations do — every other optimization is invisible. Check robots.txt first. Always.
Schema markup is not optional. It's the difference between a page that AI might stumble across and a page that AI can structurally understand. Every page on this site has at least one schema block.
Content structure matters more than content volume. A 2,000-word article with clear H2/H3 hierarchy and answer capsules outperforms a 5,000-word wall of text every time.
Authority is cumulative. Each published book, each platform presence, each sameAs link strengthens the entity. There's no single silver bullet — it's the network effect of consistent, verifiable data across multiple sources.
18 pages live. Schema on every page. AI crawlers explicitly permitted. Sitemap current. Authority signals active across 7+ platforms. Three published books. One proprietary analyzer tool. The methodology works because we use it on ourselves first.