How to Set Up Elasticsearch and Kibana Locally with Docker on Windows WSL 2025

Setting up Elasticsearch locally for development can be challenging on Windows, especially when dealing with Docker and WSL (Windows Subsystem for Linux). This guide walks you through the exact steps to get Elasticsearch and Kibana running on your Windows machine using the official start-local script, avoiding common pitfalls developers encounter.

Why Local Elasticsearch Development Matters

Before diving into setup, understand why this matters. Running Elasticsearch locally lets you:

  • Test full-text search, vector search, and RAG (Retrieval Augmented Generation) implementations without cloud costs
  • Develop APM (Application Performance Monitoring) integrations without affecting production
  • Build log aggregation and metrics pipelines during development
  • Work offline without relying on managed services like Elasticsearch Service on Elastic Cloud

Prerequisites: What You Need Before Starting

Make sure you have these installed on your Windows machine:

1. Docker Desktop for Windows

Download and install Docker Desktop for Windows. This includes the Docker daemon and CLI tools needed to run containers.

Verification:

docker --version
# Expected output: Docker version 24.x.x or higher

2. Windows Subsystem for Linux (WSL 2)

Running Elasticsearch in Docker on Windows requires Docker Desktop's WSL 2 backend, not WSL 1. Install or upgrade to WSL 2 by running PowerShell as Administrator:

wsl --install
wsl --set-default-version 2

This installs WSL with a default Ubuntu distribution, updates the Linux kernel, and sets WSL 2 as the default version for new distributions. Reboot if prompted.

Verification:

wsl --list --verbose
# You should see a * next to your distro with VERSION 2

3. A Terminal with bash

You can use:

  • Windows Terminal (recommended) with WSL 2 profile
  • Git Bash
  • PowerShell with WSL integration
  • Native WSL 2 Ubuntu terminal

Step-by-Step Setup Process

Step 1: Run the start-local Script

The official start-local script handles all Docker configuration automatically. Open your terminal and execute:

curl -fsSL https://elastic.co/start-local | sh

If you're on Windows Terminal with WSL 2 profile, run this command directly. The script will:

  1. Create an elastic-start-local folder in your home directory
  2. Generate Docker Compose configuration files
  3. Start Elasticsearch and Kibana containers
  4. Create an .env file with credentials

What happens during execution:

The script pulls Docker images (approximately 1-2 GB download) and starts two containers:

  • es (Elasticsearch on port 9200)
  • kib (Kibana on port 5601)

Expect 2-3 minutes on first run while Docker downloads and initializes the images.
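To confirm that both containers actually came up, you can list them with Docker. A quick check; the name patterns in the filter are assumptions, since container names vary by script version:

```shell
# List running containers and filter for the Elasticsearch/Kibana entries.
# The name patterns are assumptions; adjust to what `docker ps` shows on your machine.
docker ps --format '{{.Names}}: {{.Status}}' 2>/dev/null \
  | grep -Ei 'es|kib' \
  || echo "containers not found -- is Docker Desktop running?"
```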

Step 2: Retrieve Your Generated Credentials

After the script completes, you'll see output similar to:

✅ Elasticsearch is running on http://localhost:9200
✅ Kibana is running on http://localhost:5601

The generated password for the elastic user:
Sup3rStr0ng!RandomPassword123

API key stored in: /home/youruser/elastic-start-local/.env

The script stores:

  • elastic user password: Used for Kibana login and Basic auth
  • ES_LOCAL_API_KEY: Stored in .env for programmatic access
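The credentials can be pulled into your shell from the .env file. A minimal sketch, assuming the file lives at ~/elastic-start-local/.env as reported by the script; ES_LOCAL_API_KEY is the documented variable name, and other entries may vary by script version:

```shell
# Load the generated credentials into the current shell.
# The path and variable names follow the script output above;
# inspect the .env file to see exactly what was written.
ENV_FILE="$HOME/elastic-start-local/.env"
if [ -f "$ENV_FILE" ]; then
  set -a            # export everything sourced below
  . "$ENV_FILE"
  set +a
fi
echo "API key loaded: ${ES_LOCAL_API_KEY:+yes}"
```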

Step 3: Access Your Local Services

Kibana Dashboard

Open your browser and navigate to:

http://localhost:5601

Login with:

  • Username: elastic
  • Password: The one displayed after script execution

Elasticsearch API Directly

Test your Elasticsearch instance by making a request:

curl -u elastic:YourGeneratedPassword http://localhost:9200

Expected response:

{
  "name" : "instance-0000000001",
  "cluster_name" : "docker-cluster",
  "cluster_uuid" : "...",
  "version" : {
    "number" : "8.x.x",
    "build_flavor" : "default",
    "build_type" : "docker",
    "build_hash" : "...",
    "build_date" : "...",
    "build_snapshot" : false,
    "lucene_version" : "9.x.x",
    "minimum_wire_compatibility_version" : "7.17.0",
    "minimum_index_compatibility_version" : "7.0.0"
  },
  "tagline" : "You Know, for Search"
}
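Beyond the root endpoint, a quick smoke test is to index one document and search for it. A sketch using curl; dev-test is an arbitrary index name chosen here for illustration:

```shell
# Index a single document (refresh=true makes it searchable immediately).
curl -s -u elastic:YourGeneratedPassword -X POST \
  -H 'Content-Type: application/json' \
  'http://localhost:9200/dev-test/_doc?refresh=true' \
  -d '{"title": "hello elasticsearch"}'

# Search for it with a simple query-string query.
curl -s -u elastic:YourGeneratedPassword \
  'http://localhost:9200/dev-test/_search?q=title:hello'
```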

Connecting Your Application to Local Elasticsearch

Once running, connect your development application using a client library.

Using Node.js

const { Client } = require('@elastic/elasticsearch');

const client = new Client({
  node: 'http://localhost:9200',
  auth: {
    username: 'elastic',
    password: 'YourGeneratedPassword'
  }
});

// Test connection (top-level await is not available in CommonJS,
// so use a promise chain or wrap the call in an async function)
client.info()
  .then(() => console.log('Connected to Elasticsearch'))
  .catch(console.error);

Using Python

from elasticsearch import Elasticsearch

es = Elasticsearch(
    ['http://localhost:9200'],
    basic_auth=('elastic', 'YourGeneratedPassword')
)

print(es.info())

Using API Key Authentication

For production-like setup, use the generated API key from .env:

grep ES_LOCAL_API_KEY elastic-start-local/.env

Then in your code:

const client = new Client({
  node: 'http://localhost:9200',
  auth: {
    apiKey: 'YOUR_API_KEY_VALUE'
  }
});
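The same key also works with curl via the Authorization header. A sketch, assuming ES_LOCAL_API_KEY has been exported from the .env file:

```shell
# Authenticate with the generated API key instead of basic auth.
curl -s -H "Authorization: ApiKey $ES_LOCAL_API_KEY" http://localhost:9200
```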

Understanding Your Trial License

The local setup includes a one-month trial license enabling all Elastic features:

  • Advanced security settings
  • Machine learning capabilities
  • Vector search functionality
  • RAG (Retrieval Augmented Generation) support

After 30 days, your license automatically reverts to the Free and Open Basic tier, which includes:

  • Full-text search
  • Basic aggregations
  • Log and metrics storage
  • Open source Elasticsearch core features

This is sufficient for most development scenarios.
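You can check which license is currently active through the license API. A sketch; python3 is used for JSON extraction so nothing extra needs installing:

```shell
# Fetch the active license and print its type ("trial" or "basic").
curl -s -u elastic:YourGeneratedPassword http://localhost:9200/_license \
  | python3 -c 'import json, sys; print(json.loads(sys.stdin.read() or "{}").get("license", {}).get("type", "unknown"))'
```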

Common Windows WSL Issues and Solutions

Issue: "Docker daemon is not running"

Solution: Start Docker Desktop. On Windows, this is a GUI application that must be running for Docker CLI to work.

Issue: "WSL 2 backend not available"

Solution: Enable WSL 2 and set it as default:

wsl --set-default-version 2

Then open Docker Desktop, enable your distro under Settings > Resources > WSL Integration, and restart Docker Desktop.

Issue: Port 9200 or 5601 already in use

Solution: Check what's using the ports (run in PowerShell or Command Prompt):

netstat -ano | findstr :9200

Or modify the Docker Compose file in elastic-start-local/ to use different ports:

services:
  es:
    ports:
      - "9201:9200"  # Expose Elasticsearch on host port 9201 instead

Then restart containers:

cd elastic-start-local
docker compose down
docker compose up -d

Managing Your Local Elasticsearch Deployment

Stop Containers

cd elastic-start-local
docker compose down

Start Containers Again

cd elastic-start-local
docker compose up -d

View Container Logs

docker compose logs -f es

Access Elasticsearch Health

curl -u elastic:YourGeneratedPassword http://localhost:9200/_cluster/health
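In scripts, it helps to block until the cluster is ready rather than polling by hand; the health API supports this directly with the wait_for_status parameter. A sketch:

```shell
# Wait up to 30s for the cluster to reach at least yellow status.
curl -s -u elastic:YourGeneratedPassword \
  'http://localhost:9200/_cluster/health?wait_for_status=yellow&timeout=30s'
```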

Next Steps: What to Build

With Elasticsearch running locally, you can now:

  1. Test Search Features: Implement full-text search on your data
  2. Build RAG Applications: Combine vector search with generative AI
  3. Create Log Dashboards: Aggregate application logs with Kibana
  4. Develop APM Integrations: Monitor application performance
  5. Experiment with Vector Search: Store and query embeddings for ML applications

The local environment closely mirrors production behavior, so most of what you develop will transfer to Elasticsearch Service on Elastic Cloud with minimal changes.

Important Security Notes

Remember: This setup is for local development only.

  • HTTPS is disabled for localhost convenience
  • Basic authentication is simplified
  • Elasticsearch listens only on 127.0.0.1 (not accessible over network)
  • Do NOT expose this to the internet
  • For production, use Elasticsearch Service on Elastic Cloud or properly secured self-hosted deployments

Once you're ready to move to production, Elastic Cloud handles security, scaling, and updates automatically.

Recommended Tools