Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Praesent elementum facilisis leo vel fringilla est ullamcorper eget. At imperdiet dui accumsan sit amet nulla facilities morbi tempus. Praesent elementum facilisis leo vel fringilla. Congue mauris rhoncus aenean vel. Egestas sed tempus urna et pharetra pharetra massa massa ultricies.

Venenatis cras sed felis eget velit. Consectetur libero id faucibus nisl tincidunt. Gravida in fermentum et sollicitudin ac orci phasellus egestas tellus. Volutpat consequat mauris nunc congue nisi vitae. Id aliquet risus feugiat in ante metus dictum at tempor. Sed blandit libero volutpat sed cras. Sed odio morbi quis commodo odio aenean sed adipiscing. Velit euismod in pellentesque massa placerat. Mi bibendum neque egestas congue quisque egestas diam in arcu. Nisi lacus sed viverra tellus in. Nibh cras pulvinar mattis nunc sed. Luctus accumsan tortor posuere ac ut consequat semper viverra. Fringilla ut morbi tincidunt augue interdum velit euismod.

Mathematical Examples

Inline Math

Here's some inline math: \(E = mc^2\) and the quadratic formula \(x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}\).

The area of a circle is \(A = \pi r^2\) where \(r\) is the radius.
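
As a quick worked instance of the quadratic formula above (coefficients chosen purely for illustration), solving \(x^2 - 5x + 6 = 0\) gives:

\[ x = \frac{5 \pm \sqrt{25 - 24}}{2} = \frac{5 \pm 1}{2}, \quad \text{so } x = 3 \text{ or } x = 2. \]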

Display Math

Here's a more complex equation displayed on its own line:

\[ \int_{-\infty}^{\infty} e^{-x^2} dx = \sqrt{\pi} \]
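
One standard way to verify this is to square the integral and switch to polar coordinates:

\[ \left( \int_{-\infty}^{\infty} e^{-x^2}\,dx \right)^2 = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-(x^2+y^2)}\,dx\,dy = \int_0^{2\pi}\!\int_0^{\infty} e^{-r^2}\,r\,dr\,d\theta = 2\pi \cdot \tfrac{1}{2} = \pi \]

Taking the square root of both sides gives the value above.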

Matrix operations are also supported:

\[\begin{bmatrix}a & b \\c & d\end{bmatrix}\begin{bmatrix}x \\y\end{bmatrix}=\begin{bmatrix}ax + by \\cx + dy\end{bmatrix}\]
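
For a concrete instance (the numbers here are picked purely for illustration):

\[\begin{bmatrix}1 & 2 \\ 3 & 4\end{bmatrix}\begin{bmatrix}5 \\ 6\end{bmatrix}=\begin{bmatrix}1\cdot 5 + 2\cdot 6 \\ 3\cdot 5 + 4\cdot 6\end{bmatrix}=\begin{bmatrix}17 \\ 39\end{bmatrix}\]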

The Fourier transform of a function \(f(t)\):

\[ F(\omega) = \int_{-\infty}^{\infty} f(t) e^{-i\omega t} dt \]
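
As a worked example, taking \(f(t) = e^{-t^2}\) (the Gaussian from the integral above) gives another Gaussian:

\[ F(\omega) = \int_{-\infty}^{\infty} e^{-t^2} e^{-i\omega t}\,dt = \sqrt{\pi}\, e^{-\omega^2/4} \]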

Diagram Examples

Process Flow

Here's a simple process flow using Mermaid:

flowchart TD
    A[Start] --> B{Is it working?}
    B -->|Yes| C[Great!]
    B -->|No| D[Debug]
    D --> E[Fix Issue]
    E --> B
    C --> F[End]

System Architecture

graph LR
    A[Client] --> B[Load Balancer]
    B --> C[API Gateway]
    C --> D[Service A]
    C --> E[Service B] 
    D --> F[(Database A)]
    E --> G[(Database B)]
    C --> H[Message Queue]
    H --> I[Background Jobs]

Timeline/Gantt Chart

gantt
    title Development Timeline
    dateFormat YYYY-MM-DD
    section Planning
    Requirements    :done, req, 2024-01-01, 2024-01-15
    Design         :done, design, 2024-01-10, 2024-01-25
    section Development
    Backend API    :active, backend, 2024-01-20, 2024-02-15
    Frontend       :frontend, after backend, 2024-02-20
    Testing        :testing, after frontend, 1w
    section Deployment
    Staging        :staging, after testing, 3d
    Production     :prod, after staging, 1d

Lorem Ipsum

Tristique senectus et netus et malesuada fames ac turpis. Ridiculous mus mauris vitae ultricies leo integer malesuada nunc vel. In mollis nunc sed id semper. Egestas tellus rutrum tellus pellentesque. Phasellus vestibulum lorem sed risus ultricies tristique nulla. Quis blandit turpis cursus in hac habitasse platea dictumst quisque. Eros donec ac odio tempor orci dapibus ultrices. Aliquam sem et tortor consequat id porta nibh. Adipiscing elit duis tristique sollicitudin nibh sit amet commodo nulla. Diam vulputate ut pharetra sit amet. Ut tellus elementum sagittis vitae et leo. Arcu non odio euismod lacinia at quis risus sed vulputate.

This post serves as a test for syntax highlighting capabilities across various programming languages commonly used in distributed systems and web development.

Rust Example

Here's a simplified sketch of a Raft consensus node in Rust (the vote-request RPCs and log replication are left as stubs):

use std::collections::HashMap;
use tokio::time::{sleep, Duration};

#[derive(Debug, Clone)]
pub struct RaftNode {
    id: u64,
    current_term: u64,
    voted_for: Option<u64>,
    log: Vec<LogEntry>,
    state: NodeState,
}

#[derive(Debug, Clone)]
pub enum NodeState {
    Follower,
    Candidate,
    Leader,
}

#[derive(Debug, Clone)]
pub struct LogEntry {
    term: u64,
    index: u64,
    command: String,
}

impl RaftNode {
    pub fn new(id: u64) -> Self {
        Self {
            id,
            current_term: 0,
            voted_for: None,
            log: Vec::new(),
            state: NodeState::Follower,
        }
    }

    pub async fn start_election(&mut self) -> Result<(), Box<dyn std::error::Error>> {
        self.current_term += 1;
        self.state = NodeState::Candidate;
        self.voted_for = Some(self.id);
        
        println!("Node {} starting election for term {}", self.id, self.current_term);
        
        // Request votes from other nodes
        // Implementation would go here...
        
        Ok(())
    }
}

Go Example

A simple HTTP server with middleware:

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "log"
    "net/http"
    "time"

    "github.com/gorilla/mux"
)

type Server struct {
    router *mux.Router
    db     Database
}

type Database interface {
    GetUser(ctx context.Context, id string) (*User, error)
    CreateUser(ctx context.Context, user *User) error
}

type User struct {
    ID       string    `json:"id"`
    Name     string    `json:"name"`
    Email    string    `json:"email"`
    Created  time.Time `json:"created"`
}

// Middleware for logging requests
func loggingMiddleware(next http.Handler) http.Handler {
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        start := time.Now()
        
        next.ServeHTTP(w, r)
        
        log.Printf("%s %s %v", r.Method, r.RequestURI, time.Since(start))
    })
}

// Handler for getting user information
func (s *Server) getUserHandler(w http.ResponseWriter, r *http.Request) {
    vars := mux.Vars(r)
    userID := vars["id"]
    
    ctx, cancel := context.WithTimeout(r.Context(), 5*time.Second)
    defer cancel()
    
    user, err := s.db.GetUser(ctx, userID)
    if err != nil {
        http.Error(w, "User not found", http.StatusNotFound)
        return
    }
    
    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(user)
}

func main() {
    server := &Server{
        router: mux.NewRouter(),
        // db is left nil in this example; wire in a real Database
        // implementation before serving requests, or getUserHandler will panic.
    }
    
    server.router.Use(loggingMiddleware)
    server.router.HandleFunc("/users/{id}", server.getUserHandler).Methods("GET")
    
    fmt.Println("Server starting on :8080")
    log.Fatal(http.ListenAndServe(":8080", server.router))
}

Python Example

Async web scraping with error handling:

import asyncio
import aiohttp
from typing import List, Dict, Optional
from dataclasses import dataclass
from urllib.parse import urljoin, urlparse

@dataclass
class ScrapedData:
    url: str
    title: Optional[str]
    status_code: int
    content_length: int
    error: Optional[str] = None

class WebScraper:
    def __init__(self, max_concurrent: int = 10):
        self.max_concurrent = max_concurrent
        self.session: Optional[aiohttp.ClientSession] = None
        
    async def __aenter__(self):
        connector = aiohttp.TCPConnector(limit=self.max_concurrent)
        timeout = aiohttp.ClientTimeout(total=30)
        self.session = aiohttp.ClientSession(
            connector=connector,
            timeout=timeout
        )
        return self
        
    async def __aexit__(self, exc_type, exc_val, exc_tb):
        if self.session:
            await self.session.close()
    
    async def scrape_url(self, url: str) -> ScrapedData:
        """Scrape a single URL and extract basic information."""
        try:
            async with self.session.get(url) as response:
                content = await response.text()
                title = self._extract_title(content)
                
                return ScrapedData(
                    url=url,
                    title=title,
                    status_code=response.status,
                    content_length=len(content)
                )
        except Exception as e:
            return ScrapedData(
                url=url,
                title=None,
                status_code=-1,
                content_length=0,
                error=str(e)
            )
    
    def _extract_title(self, html: str) -> Optional[str]:
        """Extract title from HTML content."""
        import re
        match = re.search(r'<title[^>]*>([^<]+)</title>', html, re.IGNORECASE)
        return match.group(1).strip() if match else None
    
    async def scrape_urls(self, urls: List[str]) -> List[ScrapedData]:
        """Scrape multiple URLs concurrently."""
        semaphore = asyncio.Semaphore(self.max_concurrent)
        
        async def scrape_with_semaphore(url: str) -> ScrapedData:
            async with semaphore:
                return await self.scrape_url(url)
        
        tasks = [scrape_with_semaphore(url) for url in urls]
        return await asyncio.gather(*tasks)

# Usage example
async def main():
    urls = [
        "https://github.com",
        "https://stackoverflow.com",
        "https://news.ycombinator.com",
    ]
    
    async with WebScraper(max_concurrent=5) as scraper:
        results = await scraper.scrape_urls(urls)
        
        for result in results:
            if result.error:
                print(f"❌ {result.url}: {result.error}")
            else:
                print(f"✅ {result.url}: {result.title} ({result.status_code})")

if __name__ == "__main__":
    asyncio.run(main())

JavaScript/TypeScript Example

Modern React component with hooks and TypeScript:

import React, { useState, useEffect, useCallback } from 'react';
import { debounce } from 'lodash';

interface SearchResult {
  id: string;
  title: string;
  description: string;
  url: string;
  type: 'post' | 'project' | 'page';
  score: number;
}

interface SearchProps {
  onResultSelect: (result: SearchResult) => void;
  placeholder?: string;
  maxResults?: number;
}

interface SearchState {
  query: string;
  results: SearchResult[];
  loading: boolean;
  error: string | null;
}

const useSearch = (maxResults: number = 10) => {
  const [state, setState] = useState<SearchState>({
    query: '',
    results: [],
    loading: false,
    error: null,
  });

  const performSearch = useCallback(async (query: string): Promise<void> => {
    if (!query.trim()) {
      setState(prev => ({ ...prev, results: [], loading: false }));
      return;
    }

    setState(prev => ({ ...prev, loading: true, error: null }));

    try {
      const response = await fetch('/api/search', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ query, maxResults }),
      });

      if (!response.ok) {
        throw new Error(`Search failed: ${response.statusText}`);
      }

      const results: SearchResult[] = await response.json();
      
      setState(prev => ({
        ...prev,
        results,
        loading: false,
      }));
    } catch (error) {
      setState(prev => ({
        ...prev,
        error: error instanceof Error ? error.message : 'Unknown error',
        loading: false,
      }));
    }
  }, [maxResults]);

  const debouncedSearch = useCallback(
    debounce(performSearch, 300),
    [performSearch]
  );

  useEffect(() => {
    debouncedSearch(state.query);
    
    return () => {
      debouncedSearch.cancel();
    };
  }, [state.query, debouncedSearch]);

  const setQuery = useCallback((query: string) => {
    setState(prev => ({ ...prev, query }));
  }, []);

  return { ...state, setQuery };
};

export const SearchComponent: React.FC<SearchProps> = ({ 
  onResultSelect, 
  placeholder = "Search...", 
  maxResults = 10 
}) => {
  const { query, results, loading, error, setQuery } = useSearch(maxResults);
  const [selectedIndex, setSelectedIndex] = useState(-1);

  const handleKeyDown = useCallback((e: React.KeyboardEvent) => {
    switch (e.key) {
      case 'ArrowDown':
        e.preventDefault();
        setSelectedIndex(prev => 
          Math.min(prev + 1, results.length - 1)
        );
        break;
        
      case 'ArrowUp':
        e.preventDefault();
        setSelectedIndex(prev => Math.max(prev - 1, -1));
        break;
        
      case 'Enter':
        e.preventDefault();
        if (selectedIndex >= 0 && results[selectedIndex]) {
          onResultSelect(results[selectedIndex]);
        }
        break;
        
      case 'Escape':
        setQuery('');
        setSelectedIndex(-1);
        break;
    }
  }, [results, selectedIndex, onResultSelect, setQuery]);

  const getResultIcon = (type: SearchResult['type']): string => {
    switch (type) {
      case 'post': return '📝';
      case 'project': return '🚀';
      default: return '📄';
    }
  };

  return (
    <div className="search-container">
      <div className="search-input-wrapper">
        <input
          type="text"
          value={query}
          onChange={(e) => setQuery(e.target.value)}
          onKeyDown={handleKeyDown}
          placeholder={placeholder}
          className="search-input"
          aria-label="Search"
        />
        
        {loading && <div className="search-spinner"></div>}
      </div>

      {error && (
        <div className="search-error" role="alert">
          Error: {error}
        </div>
      )}

      {results.length > 0 && (
        <ul className="search-results" role="listbox">
          {results.map((result, index) => (
            <li
              key={result.id}
              className={`search-result ${index === selectedIndex ? 'selected' : ''}`}
              onClick={() => onResultSelect(result)}
              role="option"
              aria-selected={index === selectedIndex}
            >
              <div className="result-icon">{getResultIcon(result.type)}</div>
              <div className="result-content">
                <h3 className="result-title">{result.title}</h3>
                <p className="result-description">{result.description}</p>
                <span className="result-type">{result.type}</span>
              </div>
            </li>
          ))}
        </ul>
      )}
    </div>
  );
};

export default SearchComponent;

SQL Example

Complex query with CTEs and window functions:

-- Calculate user engagement metrics with rolling averages
WITH user_activity AS (
  SELECT 
    u.id AS user_id,
    u.name,
    u.created_at AS user_created,
    DATE(a.created_at) AS activity_date,
    COUNT(*) AS daily_actions,
    SUM(CASE WHEN a.action_type = 'post' THEN 1 ELSE 0 END) AS posts,
    SUM(CASE WHEN a.action_type = 'comment' THEN 1 ELSE 0 END) AS comments,
    SUM(CASE WHEN a.action_type = 'like' THEN 1 ELSE 0 END) AS likes
  FROM users u
  JOIN user_actions a ON u.id = a.user_id
  WHERE a.created_at >= CURRENT_DATE - INTERVAL '30 days'
  GROUP BY u.id, u.name, u.created_at, DATE(a.created_at)
),
engagement_metrics AS (
  SELECT 
    user_id,
    name,
    user_created,
    activity_date,
    daily_actions,
    posts,
    comments,
    likes,
    -- Rolling 7-day average
    AVG(daily_actions) OVER (
      PARTITION BY user_id 
      ORDER BY activity_date 
      ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
    ) AS rolling_avg_actions,
    -- Cumulative actions
    SUM(daily_actions) OVER (
      PARTITION BY user_id 
      ORDER BY activity_date
    ) AS cumulative_actions,
    -- Rank by daily activity
    DENSE_RANK() OVER (
      PARTITION BY activity_date 
      ORDER BY daily_actions DESC
    ) AS daily_activity_rank
  FROM user_activity
)
SELECT 
  em.user_id,
  em.name,
  COUNT(DISTINCT em.activity_date) AS active_days,
  AVG(em.daily_actions) AS avg_daily_actions,
  MAX(em.daily_actions) AS max_daily_actions,
  SUM(em.posts) AS total_posts,
  SUM(em.comments) AS total_comments,
  SUM(em.likes) AS total_likes,
  AVG(em.rolling_avg_actions) AS overall_rolling_avg,
  MIN(em.daily_activity_rank) AS best_daily_rank,
  CASE 
    WHEN AVG(em.daily_actions) >= 50 THEN 'High'
    WHEN AVG(em.daily_actions) >= 20 THEN 'Medium'
    ELSE 'Low'
  END AS engagement_level
FROM engagement_metrics em
GROUP BY em.user_id, em.name
HAVING COUNT(DISTINCT em.activity_date) >= 5  -- At least 5 active days
ORDER BY avg_daily_actions DESC
LIMIT 100;
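
If you want to run a query like this from application code, here's a minimal Python sketch. It assumes the query above is saved as engagement_metrics.sql, that a PostgreSQL database is reachable via a DATABASE_URL environment variable, and that the psycopg2 driver is installed; those details are assumptions for illustration, not part of the query itself.

import os

import psycopg2

# Assumption: the query shown above is stored next to this script.
QUERY_PATH = "engagement_metrics.sql"


def fetch_engagement_metrics():
    """Run the engagement-metrics query and return rows as dicts."""
    with open(QUERY_PATH) as f:
        sql = f.read()

    # Assumption: DATABASE_URL holds a standard Postgres connection string.
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    try:
        with conn.cursor() as cur:
            cur.execute(sql)
            columns = [desc[0] for desc in cur.description]
            return [dict(zip(columns, row)) for row in cur.fetchall()]
    finally:
        conn.close()


if __name__ == "__main__":
    for row in fetch_engagement_metrics():
        print(row["name"], row["engagement_level"], row["avg_daily_actions"])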

Configuration Examples

YAML configuration for a Kubernetes Deployment:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app
  labels:
    app: web-app
    version: v1.0.0
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app
    spec:
      containers:
      - name: web-app
        image: nginx:1.21-alpine
        ports:
        - containerPort: 80
        env:
        - name: NODE_ENV
          value: "production"
        - name: DATABASE_URL
          valueFrom:
            secretKeyRef:
              name: app-secrets
              key: database-url
        resources:
          requests:
            memory: "128Mi"
            cpu: "100m"
          limits:
            memory: "512Mi"
            cpu: "500m"
        livenessProbe:
          httpGet:
            path: /health
            port: 80
          initialDelaySeconds: 30
          periodSeconds: 10
        readinessProbe:
          httpGet:
            path: /ready
            port: 80
          initialDelaySeconds: 5
          periodSeconds: 5
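
As a small usage sketch in Python (assuming the manifest above is saved as deployment.yaml and the PyYAML package is installed; both are illustration-only assumptions), you can read a few fields back out of it:

import yaml  # PyYAML, assumed to be installed

# Assumption: the manifest above is saved next to this script as deployment.yaml.
with open("deployment.yaml") as f:
    manifest = yaml.safe_load(f)

container = manifest["spec"]["template"]["spec"]["containers"][0]
print("replicas: ", manifest["spec"]["replicas"])              # 3
print("image:    ", container["image"])                        # nginx:1.21-alpine
print("cpu limit:", container["resources"]["limits"]["cpu"])   # 500m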

JSON configuration example:

{
  "name": "distributed-kv",
  "version": "1.0.0",
  "description": "A distributed key-value store with Raft consensus",
  "main": "src/main.rs",
  "scripts": {
    "build": "cargo build --release",
    "test": "cargo test",
    "bench": "cargo bench",
    "fmt": "cargo fmt",
    "clippy": "cargo clippy -- -D warnings"
  },
  "dependencies": {
    "tokio": { "version": "1.0", "features": ["full"] },
    "serde": { "version": "1.0", "features": ["derive"] },
    "clap": { "version": "4.0", "features": ["derive"] }
  },
  "config": {
    "cluster": {
      "nodes": [
        { "id": 1, "addr": "127.0.0.1:8001" },
        { "id": 2, "addr": "127.0.0.1:8002" },
        { "id": 3, "addr": "127.0.0.1:8003" }
      ],
      "election_timeout": 150,
      "heartbeat_interval": 50
    },
    "storage": {
      "data_dir": "./data",
      "log_segment_size": "1GB",
      "compaction_threshold": 0.8
    },
    "api": {
      "bind_addr": "0.0.0.0:8080",
      "max_connections": 1000,
      "request_timeout": 30
    }
  }
}
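
And a matching Python sketch for the JSON config (assuming it's saved as config.json; the final check just encodes the usual Raft rule of thumb that the election timeout should comfortably exceed the heartbeat interval):

import json

# Assumption: the configuration above is saved as config.json.
with open("config.json") as f:
    cfg = json.load(f)

cluster = cfg["config"]["cluster"]
print("cluster nodes:", [node["addr"] for node in cluster["nodes"]])

# Raft sanity check: followers should not time out between normal heartbeats.
if cluster["election_timeout"] <= cluster["heartbeat_interval"]:
    raise ValueError("election_timeout must be greater than heartbeat_interval")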