
Prompt templates in Spring AI — stop hardcoding your prompts

Dev’s support endpoint was working. The system prompt was a Java text block — a multi-line string right there in the @Bean method. When the product manager asked to tweak the wording, Dev had to open the IDE, edit Java code, rebuild, and redeploy.

Then it happened again for a different prompt. And again.

By the third time, Dev realised: prompts are configuration, not code. They should not live in .java files.


The problem with hardcoded prompts

A prompt string embedded in Java has several problems: every wording change forces a compile, rebuild, and redeploy; the text is hard to read and review when it is split across concatenated string fragments; and non-developers cannot touch it at all:

// ❌ Prompt buried in Java — hard to update, hard to review
@Bean
ChatClient chatClient(ChatClient.Builder builder) {
    return builder
            .defaultSystem("You are a helpful support assistant. Answer questions about " +
                           "products and orders. If you cannot help, say you don't know. " +
                           "Keep answers short. Only use provided context.")
            .build();
}

What PromptTemplate provides

PromptTemplate is Spring AI’s solution for externalizing prompts and injecting variables into them. It loads a template from a classpath resource and substitutes named variables at call time.

// ✅ Prompt in a file, variables injected at runtime
PromptTemplate template = new PromptTemplate(
    new ClassPathResource("prompts/product-summary.st")
);

Prompt prompt = template.create(Map.of(
    "productName", "ProX Wireless Headphones",
    "audience", "first-time buyers"
));

The template file lives in src/main/resources/prompts/ and uses {variableName} placeholders:

You are a product specialist for TechGadgets.

Write a concise product description for {audience} about the following product.
Use plain language. Highlight the top 3 benefits. Keep it under 100 words.

Product: {productName}

Tip: Name your prompt files descriptively after the task they perform, not the component that uses them: classify-support-message.st, generate-product-description.st, summarize-customer-feedback.st. This makes the prompts directory self-documenting.

Setting up the prompts directory

Create the directory structure:

src/
  main/
    resources/
      prompts/
        support-system.st          ← the support assistant system prompt
        classify-message.st        ← message intent classification
        generate-summary.st        ← product summary generation

These files are plain text — no special format required beyond the {variable} placeholders.

Using PromptTemplate for system prompts

For the system prompt, load it from a file rather than hardcoding it in the bean:

src/main/resources/prompts/support-system.st:

You are a customer support assistant for TechGadgets, an online electronics store.
Answer only questions about products, orders, shipping, and store policies.
Use only the information provided in the context below — do not invent product details.
If you cannot answer, say: "I don't have that information right now. Please contact support@techgadgets.com."
Keep responses concise — 2 to 4 sentences unless detail is explicitly requested.

AiConfig.java:

@Configuration
class AiConfig {

    @Bean
    ChatClient chatClient(
            ChatClient.Builder builder,
            @Value("classpath:prompts/support-system.st") Resource systemPromptResource
    ) {
        return builder
                .defaultSystem(systemPromptResource)
                .defaultOptions(OpenAiChatOptions.builder()
                        .model("gpt-4o-mini")
                        .temperature(0.2)
                        .maxTokens(400)
                        .build())
                .build();
    }
}

ChatClient.Builder.defaultSystem() accepts a Resource directly. Spring loads the file at startup and uses its text as the system prompt.

Important: The system prompt is loaded once at application startup. If you change the .st file, you still need to restart the application. But the change is a resource edit, not a code change — no compile step, and it is clearly visible in version control as a text file diff.
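To shorten that edit-restart loop during development, one optional setup (not required by Spring AI itself) is Spring Boot DevTools, which restarts the application automatically when classpath files change — including resources under src/main/resources/prompts/ once the IDE or build tool copies them:

```xml
<!-- pom.xml: optional dev-time dependency. Triggers an automatic restart
     when files on the classpath change, so edited .st prompts are picked up
     without a manual restart during development. -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-devtools</artifactId>
    <scope>runtime</scope>
    <optional>true</optional>
</dependency>
```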

Using PromptTemplate for dynamic user prompts

For prompts with variable content — a customer question, a product name, a chunk of retrieved text — use PromptTemplate.create():

src/main/resources/prompts/classify-message.st:

Classify the following customer support message into exactly one of these categories:
PRODUCT_QUESTION, ORDER_STATUS, RETURN_REQUEST, COMPLAINT, COMPLIMENT, OTHER

Respond with only the category name — no explanation, no punctuation.

Message: {message}

ClassificationService.java:

@Service
class ClassificationService {

    private final ChatClient chatClient;
    private final PromptTemplate classifyTemplate;

    ClassificationService(
            ChatClient chatClient,
            @Value("classpath:prompts/classify-message.st") Resource classifyPrompt
    ) {
        this.chatClient = chatClient;
        this.classifyTemplate = new PromptTemplate(classifyPrompt);
    }

    String classify(String customerMessage) {
        Prompt prompt = classifyTemplate.create(Map.of("message", customerMessage));
        return chatClient.prompt(prompt).call().content().strip();
    }
}

PromptTemplate.create(Map) substitutes all {variable} placeholders and returns a Prompt object ready to pass to chatClient.prompt(prompt).

Multi-variable templates

Templates can have multiple variables. The product description generator uses two:

src/main/resources/prompts/generate-product-description.st:

You are a product copywriter for TechGadgets.

Write a product description for {audience} about the product below.
Highlight the top 3 benefits. Keep it under {maxWords} words. Use plain, friendly language.

Product details:
{productDetails}

Calling it from Java (descriptionTemplate is built from the resource the same way as classifyTemplate above):

Prompt prompt = descriptionTemplate.create(Map.of(
    "audience", "budget-conscious shoppers",
    "maxWords", "80",
    "productDetails", productCatalogEntry
));

String description = chatClient.prompt(prompt).call().content();

Tip: Spring AI renders {variable} placeholders with a template engine (StringTemplate by default — hence the .st extension), not plain string concatenation. Be careful with variable values that contain curly braces, since they can collide with the placeholder syntax. For complex formatting, pre-process the data in Java before passing it to the template rather than adding logic to the template itself.
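As a minimal sketch of that pre-processing step (the PromptSanitizer helper and its brace-replacement strategy are an illustration, not part of Spring AI), structured data can be flattened into plain lines and the braces neutralized before the value ever reaches the template:

```java
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical helper: flatten structured product data into plain text lines
// and neutralize curly braces so they cannot collide with {placeholder} syntax.
// Replacing braces with parentheses is one conservative choice; adapt it to
// whatever your template renderer actually requires.
final class PromptSanitizer {

    private PromptSanitizer() {}

    // Replace braces with parentheses so the template engine never sees them.
    static String neutralizeBraces(String value) {
        return value.replace('{', '(').replace('}', ')');
    }

    // Turn a key/value map into "key: value" lines, sanitizing each value.
    static String toPlainLines(Map<String, String> details) {
        return details.entrySet().stream()
                .map(e -> e.getKey() + ": " + neutralizeBraces(e.getValue()))
                .collect(Collectors.joining("\n"));
    }
}
```

The result of toPlainLines(...) can then be passed as the productDetails variable, keeping the template itself free of formatting logic.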

Testing prompt templates in isolation

Because prompts live in files, you can test them directly without making a real LLM call:

@SpringBootTest
class ClassificationServiceTest {

    @Autowired
    ClassificationService classificationService;

    @MockBean
    ChatClient chatClient;

    @Test
    void classifiesComplaintCorrectly() {
        // Stub the fluent chain prompt(...).call().content() with deep stubs,
        // so each intermediate spec returns a further mock automatically.
        ChatClient.ChatClientRequestSpec requestSpec =
                mock(ChatClient.ChatClientRequestSpec.class, RETURNS_DEEP_STUBS);
        when(chatClient.prompt(any(Prompt.class))).thenReturn(requestSpec);
        when(requestSpec.call().content()).thenReturn("COMPLAINT");

        String result = classificationService.classify(
            "My order arrived damaged and I want a replacement."
        );

        assertThat(result).isEqualTo("COMPLAINT");
    }
}

More importantly, you can inspect the generated Prompt directly to verify the template produced the expected text — without calling any external API:

@Test
void templateSubstitutesVariablesCorrectly() {
    PromptTemplate template = new PromptTemplate(
        new ClassPathResource("prompts/classify-message.st")
    );

    Prompt prompt = template.create(Map.of("message", "Where is my order?"));
    String promptText = prompt.getContents();

    assertThat(promptText).contains("Where is my order?");
    assertThat(promptText).contains("ORDER_STATUS");
    assertThat(promptText).doesNotContain("{message}"); // variable was substituted
}

Caution: PromptTemplate does not validate at construction time that all variables in the template have values. A missing variable leaves the {variableName} placeholder literally in the prompt text. The model will try to interpret it, usually producing wrong output. Always test templates with the same variable set the production code uses.
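One way to guard against that failure mode (the PlaceholderCheck helper and its regex are a sketch, not a Spring AI API) is to scan the rendered prompt text for anything that still looks like a placeholder and fail the test if any remain:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical guard: find {placeholder}-style tokens left in a rendered
// prompt. An empty result means every variable was substituted.
final class PlaceholderCheck {

    // Matches {identifier} — the placeholder shape used in the .st templates.
    private static final Pattern PLACEHOLDER =
            Pattern.compile("\\{[a-zA-Z_][a-zA-Z0-9_]*\\}");

    private PlaceholderCheck() {}

    static List<String> unresolved(String renderedPrompt) {
        List<String> found = new ArrayList<>();
        Matcher m = PLACEHOLDER.matcher(renderedPrompt);
        while (m.find()) {
            found.add(m.group());
        }
        return found;
    }
}
```

In a template test like the one above, a single extra assertion — that unresolved(promptText) is empty — catches any variable the production code forgot to supply.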

Checklist for externalizing prompts

  1. Move all system prompt text to src/main/resources/prompts/*.st files.
  2. Use @Value("classpath:prompts/...") to inject Resource references.
  3. Use PromptTemplate + .create(Map) for any prompt with variable content.
  4. Name template files after the task, not the class.
  5. Add a test that loads each template and verifies variables are substituted correctly.
  6. Review prompts as text file diffs in pull requests, separate from code changes.

Note: Externalizing prompts is a discipline, not a technical requirement. Spring AI will work fine with hardcoded strings. The discipline pays off when prompts need to evolve — which they always do once real users start interacting with the system.
