How to Implement and Troubleshoot AI Search in Drupal 10

Alex Rollin
August 16, 2025
Last updated: February 15, 2026

Setting up intelligent search in Drupal 10 can feel overwhelming. With multiple modules, API configurations, and vector databases to consider, knowing where to start isn't always clear. This guide walks you through implementing AI-powered search step by step, from initial setup through troubleshooting common issues.

By the end, you'll have a working AI search implementation that understands user intent, handles typos and synonyms, and delivers relevant results even when users don't know the exact keywords to use.

Prerequisites

Before you begin implementing AI search, make sure you have:

  • Drupal 10.1 or higher installed and running
  • Composer available for module installation
  • Drush installed for command-line operations
  • Admin access to your Drupal site
  • API credentials from at least one provider (OpenAI, Anthropic, or similar)
  • Basic understanding of Drupal's module system and configuration management

We've found that teams starting with cloud-based providers like OpenAI have an easier initial setup, while those with privacy requirements often prefer self-hosted options using Ollama and local vector databases.

You'll also need to decide on your vector database. Popular choices include:

  • Pinecone (cloud-based, easy setup)
  • Milvus (self-hosted or cloud)
  • PostgreSQL with pgvector (if you already use PostgreSQL)
  • Zilliz (managed Milvus cloud service)

Step-by-Step Implementation

Step 1: Install Required Modules

Start by installing the core modules needed for AI search functionality:

composer require drupal/ai drupal/key drupal/search_api

These three modules form the foundation:

  • drupal/ai provides the AI framework and includes the ai_search submodule
  • drupal/key securely stores your API credentials
  • drupal/search_api creates the search infrastructure

Depending on your chosen providers, add specific modules:

# For OpenAI users
composer require drupal/ai_provider_openai

# For Milvus vector database
composer require drupal/ai_vdb_provider_milvus

# For PostgreSQL with pgvector
composer require drupal/ai_vdb_provider_postgres

# For self-hosted Ollama
composer require drupal/ai_provider_ollama

Step 2: Enable the Modules

After installation, enable the modules using Drush:

drush en ai ai_search ai_api_explorer key search_api

Then enable your specific provider modules:

# Example for OpenAI and Pinecone setup
drush en ai_provider_openai ai_vdb_provider_pinecone

Clear the cache to ensure all configurations load properly:

drush cr

Step 3: Configure API Keys

Navigate to Configuration > System > Key in your Drupal admin interface. Create a new key for each API service you're using:

1. Click "Add key"
2. Give it a descriptive name (e.g., "OpenAI API Key")
3. Select "Key type" as "Authentication"
4. Choose "Key provider" (we recommend "Configuration" for development, "File" for production)
5. Enter your API key value
6. Save the configuration

Repeat this process for your vector database API key if using a cloud service.

Step 4: Set Up the AI Provider

Go to Configuration > AI > Provider settings and configure your chosen provider:

For OpenAI:

1. Select "OpenAI" as the provider
2. Choose the API key you created earlier
3. Enter your OpenAI organization ID (use the ID, not the organization name)
4. Select your preferred model (text-embedding-3-small works well for most use cases)
5. Save the configuration

Step 5: Configure Vector Database

Navigate to Configuration > AI > VDB Providers and set up your vector database:

For Milvus:

Host: your-milvus-server.com
Port: 19530
Database: default
Collection: drupal_content
Dimension: 1536 (for OpenAI embeddings)
Similarity Metric: Cosine

For PostgreSQL with pgvector:

Host: localhost
Port: 5432
Database: drupal_vectors
Table: content_embeddings
Dimension: 1536

Test the connection before saving to confirm the host, credentials, and dimension settings are correct.
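The cosine metric configured above scores two embeddings by the angle between them, ignoring magnitude. As a minimal plain-PHP sketch of the calculation (not part of any Drupal module):

```php
<?php

/**
 * Cosine similarity between two equal-length vectors: 1.0 means
 * identical direction, 0.0 means orthogonal (unrelated content).
 */
function cosine_similarity(array $a, array $b): float {
  $dot = 0.0;
  $norm_a = 0.0;
  $norm_b = 0.0;
  foreach ($a as $i => $value) {
    $dot += $value * $b[$i];
    $norm_a += $value * $value;
    $norm_b += $b[$i] * $b[$i];
  }
  return $dot / (sqrt($norm_a) * sqrt($norm_b));
}

echo cosine_similarity([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]) . "\n"; // ~1.0
echo cosine_similarity([1.0, 0.0], [0.0, 1.0]) . "\n"; // ~0.0
```

This is why the Dimension setting must match your embedding model exactly: the vectors being compared have to be the same length.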

Step 6: Create the Search API Server

1. Navigate to Configuration > Search and metadata > Search API
2. Click "Add server"
3. Enter a name like "AI Search Server"
4. For backend, select "AI Search"
5. Configure these settings:

  • Embeddings engine: Your AI provider (e.g., OpenAI)
  • Model: text-embedding-3-small (or your preferred model)
  • Vector Database: Your configured VDB provider
  • Chunk size: 512 tokens (adjust based on your content)
  • Overlap: 50 tokens (helps maintain context between chunks)
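The chunk size and overlap settings above control how the main content field is split before embedding. A minimal sketch of the sliding-window idea, approximating tokens by words (`chunk_text` is a hypothetical helper; the module's own chunker is token-based, and `size` must exceed `overlap`):

```php
<?php

/**
 * Split text into overlapping chunks. Words stand in for tokens
 * here; real tokenizers differ, but the windowing is the same.
 * Assumes $size > $overlap so the window always advances.
 */
function chunk_text(string $text, int $size = 512, int $overlap = 50): array {
  $words = preg_split('/\s+/', trim($text));
  $chunks = [];
  $step = $size - $overlap;
  for ($start = 0; $start < count($words); $start += $step) {
    $chunks[] = implode(' ', array_slice($words, $start, $size));
    // Stop once a chunk reaches the end of the text.
    if ($start + $size >= count($words)) {
      break;
    }
  }
  return $chunks;
}

// 1000 words with size 512 / overlap 50: windows start at 0, 462, 924.
$chunks = chunk_text(implode(' ', array_fill(0, 1000, 'word')));
echo count($chunks) . "\n"; // 3
```

The 50-token overlap means a sentence falling on a chunk boundary still appears whole in at least one chunk, which is what "maintains context" in practice.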

Step 7: Create and Configure the Search Index

1. In Search API, click "Add index"
2. Name it something descriptive like "AI Content Index"
3. Select the content types to index (Articles, Pages, etc.)
4. Choose your AI Search Server
5. Configure fields carefully:

Main content field (this gets chunked and embedded):

  • Select your body field or main content area
  • This will be split into chunks for processing

Contextual fields (added to each chunk for better understanding):

  • Title
  • Summary/teaser
  • Tags or categories
  • Author information

Filterable fields (for faceted search):

  • Content type
  • Publication date
  • Category terms

Step 8: Index Your Content

After configuring the index, you need to process your content:

1. Go to your index's "View" tab
2. Click "Index now" to start processing
3. For sites with lots of content, adjust the batch size:

// In settings.php for large sites: lower the number of items
// processed per batch/cron run via Search API's cron_limit option
// (the index machine name ai_content_index is an example).
$config['search_api.index.ai_content_index']['options']['cron_limit'] = 5;

Monitor the indexing progress. Each piece of content gets:

  • Split into chunks
  • Sent to the AI provider for embedding generation
  • Stored in the vector database with metadata
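Conceptually, each node becomes a set of per-chunk records in the vector database. A plain-PHP sketch of that shape, with a stubbed `fake_embed` standing in for the provider call (real embeddings are 1536-float vectors from your configured model):

```php
<?php

/**
 * Stub embedder: a real provider returns a long float vector per
 * chunk; three dimensions are used here purely for brevity.
 */
function fake_embed(string $chunk): array {
  return [crc32($chunk) / 4294967295, 0.0, 0.0];
}

/**
 * Build the records stored for one node: one row per chunk, each
 * carrying the vector plus filterable metadata.
 */
function build_records(int $nid, string $title, array $chunks): array {
  $records = [];
  foreach ($chunks as $i => $chunk) {
    $records[] = [
      'id' => $nid . ':' . $i,
      'vector' => fake_embed($chunk),
      'metadata' => ['nid' => $nid, 'title' => $title, 'chunk' => $i],
    ];
  }
  return $records;
}

$records = build_records(42, 'My article', ['first chunk', 'second chunk']);
echo count($records) . "\n"; // 2
echo $records[1]['id'] . "\n"; // 42:1
```

The metadata on each record is what makes filterable fields (content type, date, terms) usable at query time without a second lookup.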

Step 9: Test Your Implementation

Use the AI API Explorer to verify everything works:

1. Navigate to /admin/config/ai/explorers/vector_db_generator
2. Enter a test query
3. Check that results return with relevance scores
4. Verify the content matches your expectations

Example test queries to try:

  • A question about your content
  • A misspelled search term
  • A conceptual query using different words than your content

Step 10: Add Search Interface

Finally, add the search interface to your site:

For a search block:

1. Go to Structure > Block layout
2. Click "Place block" in your desired region
3. Search for "AI Search" block
4. Configure the block settings
5. Save and test on your site

For a dedicated search page, create a custom view using the Search API index.

Code Examples with Explanations

Custom Search Implementation

Here's how to programmatically query your AI search:

use Drupal\search_api\Entity\Index;

function custom_ai_search($query_text) {
  // Load your AI search index
  $index = Index::load('ai_content_index');
  
  // Create query
  $query = $index->query();
  
  // Set the search string
  $query->keys($query_text);
  
  // Set result limit
  $query->range(0, 10);
  
  // Execute search
  $results = $query->execute();
  
  // Process results
  $items = [];
  foreach ($results->getResultItems() as $item) {
    $items[] = [
      'title' => $item->getField('title')->getValues()[0] ?? '',
      'excerpt' => $item->getExcerpt(),
      'score' => $item->getScore(),
      // getOriginalObject() returns typed data; getValue() unwraps the entity.
      'url' => $item->getOriginalObject()->getValue()->toUrl()->toString(),
    ];
  }
  
  return $items;
}

Embedding Generation Example

If you need to generate embeddings directly:

use Drupal\ai\OperationType\Embeddings\EmbeddingsInput;

function generate_embedding($text) {
  // Get the AI provider plugin manager (service name per the ai module).
  $ai_manager = \Drupal::service('ai.provider');
  
  // Instantiate your configured provider plugin.
  $provider = $ai_manager->createInstance('openai');
  
  // Generate the embedding for the chosen model.
  $input = new EmbeddingsInput($text);
  $output = $provider->embeddings($input, 'text-embedding-3-small');
  
  // getNormalized() returns the embedding as an array of floats.
  return $output->getNormalized();
}

Vector Database Query

Direct vector database queries for custom implementations:

function search_vectors($embedding, $limit = 10) {
  // Get the VDB provider plugin manager (service name per the ai module).
  $vdb_manager = \Drupal::service('ai.vdb_provider');
  
  // Instantiate your configured vector database plugin.
  $vdb = $vdb_manager->createInstance('milvus');
  
  // Perform a similarity search. The exact method name and parameters
  // vary between VDB provider modules and versions; treat this call as
  // illustrative and check your provider's interface.
  $results = $vdb->search([
    'vector' => $embedding,
    'limit' => $limit,
    'metric' => 'cosine',
  ]);
  
  return $results;
}

Common Mistakes to Avoid

1. Incorrect Field Configuration

The most frequent issue we see is misconfigured fields in the search index. Make sure:

  • Only one field is set as "Main content" (the field that gets chunked)
  • Contextual fields are marked as "Contextual", not "Main"
  • Unnecessary fields are left out, since they bloat your vectors

2. API Key and Organization ID Issues

When using OpenAI, always use the organization ID (starts with "org-"), not the organization name. You can find this in your OpenAI account settings.

3. Oversized Chunks

Keeping chunks too large leads to:

  • Higher API costs
  • Less precise search results
  • Timeout errors during indexing

Start with 512-token chunks and adjust based on your content structure.
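A rough rule of thumb for English text is about four characters per token. A hedged helper for sanity-checking chunk sizes and API costs (`estimate_tokens` is our own, not part of any module; real tokenizers vary):

```php
<?php

/**
 * Rough token estimate for English text (~4 chars/token). Real
 * tokenizers differ; use this only for ballpark cost checks.
 */
function estimate_tokens(string $text): int {
  return (int) ceil(strlen($text) / 4);
}

// A 2048-character body is roughly 512 tokens: one chunk's worth.
echo estimate_tokens(str_repeat('a', 2048)) . "\n"; // 512
```

If your typical body field estimates at several thousand tokens, expect multiple chunks per node and budget API calls accordingly.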

4. Missing Error Handling

Always implement fallbacks:

try {
  $results = custom_ai_search($query);
} catch (\Exception $e) {
  // Fall back to standard search
  \Drupal::logger('ai_search')->error('AI search failed: @message', [
    '@message' => $e->getMessage(),
  ]);
  $results = fallback_keyword_search($query);
}

5. Ignoring Rate Limits

Our experience shows that API rate limits catch many teams off guard. Monitor your usage and implement throttling:

// Add delays between batch operations. The setting name below is
// illustrative; check which throttle options your provider module
// actually exposes before relying on it.
$config['ai.settings']['batch_delay'] = 1000; // milliseconds
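Beyond a fixed delay, retrying with exponential backoff when the API returns a rate-limit error is a common pattern. A minimal plain-PHP sketch (`with_backoff` is our own helper; the exception type and the callable stand in for your real provider call):

```php
<?php

/**
 * Call $fn, retrying with exponential backoff on failure.
 * Delays double each attempt: base, 2x base, 4x base, ...
 */
function with_backoff(callable $fn, int $max_attempts = 4, int $base_ms = 100) {
  for ($attempt = 1; ; $attempt++) {
    try {
      return $fn();
    }
    catch (\RuntimeException $e) {
      if ($attempt >= $max_attempts) {
        throw $e; // Give up after the final attempt.
      }
      usleep($base_ms * (2 ** ($attempt - 1)) * 1000);
    }
  }
}

// Simulated provider call that succeeds on the third attempt.
$calls = 0;
$result = with_backoff(function () use (&$calls) {
  if (++$calls < 3) {
    throw new \RuntimeException('429 Too Many Requests');
  }
  return 'ok';
}, 4, 10);
echo "$result after $calls calls\n"; // ok after 3 calls
```

In production you would catch only retryable errors (HTTP 429 and 5xx) and let authentication or validation failures surface immediately.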

Testing and Verification Steps

1. Verify Module Installation

Run this command to confirm all modules are enabled:

drush pm:list --type=module --status=enabled | grep -E "ai|search_api|key"

2. Test API Connectivity

Use the AI API Explorer at /admin/config/ai/explorers/ai_provider to test:

  • Send a simple text for embedding
  • Verify you receive a vector array back
  • Check the vector dimensions match your configuration

3. Validate Vector Database

Test vector database operations at /admin/config/ai/explorers/vector_db:

1. Create a test collection
2. Insert a sample vector
3. Perform a similarity search
4. Delete the test collection

4. Check Index Status

Monitor your search index at /admin/config/search/search-api:

  • Verify items are being indexed
  • Check for any error messages
  • Review the index log for issues

5. Query Testing Checklist

Test these query types to ensure full functionality:

  • Exact keyword match
  • Synonym search (e.g., "car" finding "automobile")
  • Typo tolerance (e.g., "drupla" finding "Drupal")
  • Question-based search
  • Conceptual search (finding related content without exact terms)

6. Performance Monitoring

Track these metrics:

  • Indexing speed (items per minute)
  • Query response time
  • API usage and costs
  • Vector database storage usage

Troubleshooting Guide

Teams we work with report these issues most frequently:

No Results Returned

1. Check if content is indexed: /admin/config/search/search-api/index/[your_index]
2. Verify embeddings exist in vector database
3. Test with broader queries
4. Check similarity threshold settings

Slow Indexing

  • Reduce batch size to 5 or fewer items
  • Add delays between batches
  • Check API rate limits
  • Consider upgrading API tier

Memory Errors

// Increase memory limit in settings.php
ini_set('memory_limit', '512M');

// Or in .htaccess
php_value memory_limit 512M

Connection Timeouts

Increase timeout values:

// In settings.php
$config['ai.settings']['timeout'] = 60; // seconds
$config['ai.settings']['connect_timeout'] = 10;

Debugging API Calls

Enable verbose logging:

// In settings.local.php for development
$config['ai.settings']['debug'] = TRUE;
$config['system.logging']['error_level'] = 'verbose';

Check logs at /admin/reports/dblog filtered by type "ai" or "search_api".

Conclusion

Implementing AI search in Drupal 10 brings real improvements to how users find content on your site. Following this guide, you've set up intelligent search that understands context, handles variations in how people search, and delivers relevant results even with imperfect queries.

The key points to remember:

  • Start with a clear plan for which providers and databases fit your needs
  • Configure fields thoughtfully - proper chunking and contextual data make a huge difference
  • Test thoroughly at each step before moving forward

Our experience shows that successful AI search implementations require ongoing refinement. Monitor your search analytics, gather user feedback, and adjust your chunking approach and embedding models as you learn what works best for your specific content and audience.

Building AI search for Drupal requires careful attention to API configuration, field mapping, and index settings. If you're planning to implement semantic search for your Drupal site and want help evaluating which embedding models and vector databases would best serve your content structure and search patterns, we can review your requirements and recommend an implementation approach that balances performance, cost, and search quality for your specific use case.
