Vector Search in Angular — Service Pattern
Angular developers usually hit the same search problem: keyword search is easy to ship, but it fails when the user phrase and the document phrase do not overlap. altor-vec solves the retrieval side by running HNSW vector search locally in WebAssembly. That means you can keep query latency close to the browser, eliminate per-query billing, and still expose a familiar framework component API. This guide focuses on implementation details rather than marketing claims.
```sh
npm install altor-vec
```

The example below uses tiny four-dimensional vectors so the code is runnable as-is and easy to understand. In production you would usually replace those manual vectors with embeddings from a model such as Xenova/all-MiniLM-L6-v2 or a build-time embedding job. The retrieval flow stays the same: install, import the WASM package, create an index, optionally add vectors, then query the engine and map result IDs back to metadata.
Step 1: install and understand the runtime boundary
Start with npm install altor-vec. The package exposes a default init() function that loads the WASM module and a WasmSearchEngine class that loads or builds an HNSW index. The important design question in Angular is not whether vector retrieval is possible. It is where initialization should live so the index is created once, memory is released intentionally, and queries do not trigger unnecessary work on each re-render or navigation.
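One subtlety worth handling at that boundary: if two components call an init method concurrently, a plain "already initialized?" check can still build the index twice, because neither caller sees the other's in-flight work. Memoizing the promise avoids that. Below is a minimal, framework-agnostic sketch; `buildEngine` is a hypothetical stand-in for the real `await init(); WasmSearchEngine.from_vectors(...)` sequence, not part of the altor-vec API.

```typescript
// Memoize an async initializer so concurrent callers share one in-flight
// promise and the expensive work (WASM init + index build) runs exactly once.
function once<T>(factory: () => Promise<T>): () => Promise<T> {
  let pending: Promise<T> | null = null;
  return () => {
    if (!pending) {
      // Reset on failure so a transient error (e.g. a fetch hiccup) is retryable.
      pending = factory().catch((err) => {
        pending = null;
        throw err;
      });
    }
    return pending;
  };
}

// Usage sketch: buildEngine is hypothetical; it stands in for WASM init
// plus index construction.
let builds = 0;
const buildEngine = once(async () => {
  builds += 1;
  return { ready: true };
});

// Two concurrent callers share the same build.
Promise.all([buildEngine(), buildEngine()]).then(() => {
  console.log(builds); // 1
});
```

In the Angular service below, storing the engine on the service instance gives you most of this for free; the promise-memoizing wrapper only matters if `init()` can be awaited from several places before the first call resolves.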
Step 2: import the library and create the index
The sample builds an index from a flat Float32Array. That matches the real API from the package README: WasmSearchEngine.from_vectors(flat, dims, m, ef_construction, ef_search). The four HNSW parameters here are conservative defaults for a small browser index. If you precompute a production index offline, you can instead serialize it with to_bytes() and load it using new WasmSearchEngine(bytes).
```typescript
// vector-search.service.ts
import { Injectable } from '@angular/core';
import init, { WasmSearchEngine } from 'altor-vec';

const docs = [
  { title: 'Dependency injection', vector: [1, 0, 0, 0] },
  { title: 'Build optimizer', vector: [0, 1, 0, 0] },
  { title: 'Search architecture', vector: [0, 0, 1, 0] },
];

@Injectable({ providedIn: 'root' })
export class VectorSearchService {
  private engine: WasmSearchEngine | null = null;

  async init() {
    if (this.engine) return; // build the index once per app lifetime
    await init(); // load the WASM module
    const dim = 4;
    const flat = new Float32Array(docs.flatMap((doc) => doc.vector));
    // from_vectors(flat, dims, m, ef_construction, ef_search)
    this.engine = WasmSearchEngine.from_vectors(flat, dim, 16, 200, 50);
    // Incremental update: keep docs and the index in sync, in the same order.
    docs.push({ title: 'Angular hydration guide', vector: [0.95, 0.05, 0, 0] });
    this.engine.add_vectors(new Float32Array([0.95, 0.05, 0, 0]), dim);
  }

  search(query: [number, number, number, number]) {
    if (!this.engine) throw new Error('init() first');
    // search() returns a JSON string of [id, distance] pairs
    const hits: [number, number][] = JSON.parse(this.engine.search(new Float32Array(query), 3));
    return hits.map(([id, distance]) => ({ ...docs[id], distance }));
  }
}
```
```typescript
// search.component.ts
import { Component, OnInit } from '@angular/core';
import { VectorSearchService } from './vector-search.service';

@Component({
  selector: 'app-search',
  template: `
    <button (click)="run()">Search</button>
    <ul><li *ngFor="let hit of hits">{{ hit.title }} — {{ hit.distance | number:'1.2-2' }}</li></ul>
  `,
})
export class SearchComponent implements OnInit {
  hits: Array<{ title: string; distance: number }> = [];

  constructor(private readonly vectorSearch: VectorSearchService) {}

  async ngOnInit() { await this.vectorSearch.init(); }

  run() { this.hits = this.vectorSearch.search([0.94, 0.06, 0, 0]); }
}
```

Step 3: what the code is actually doing
- Install: the project adds altor-vec from npm.
- Import: the code imports init and WasmSearchEngine.
- Create index: manual vectors are flattened into a single Float32Array and passed into from_vectors().
- Add vectors: the example appends one more vector via add_vectors() so you can see incremental updates.
- Query: it converts a query vector into a Float32Array, calls search(), parses the JSON response, and maps IDs back to the in-memory document array.
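The last step in that list, joining the engine's JSON response with metadata, is the one place where a stale docs array can silently mis-label results. It is worth isolating behind a defensive helper. This is a minimal sketch; `parseHits` is a hypothetical name, and the `[id, distance]` JSON shape follows the service code above.

```typescript
interface Doc { title: string; }
interface Hit extends Doc { id: number; distance: number; }

// Join raw engine hits (a JSON string of [id, distance] pairs) with metadata.
// Drop ids that fall outside the metadata array instead of emitting entries
// with undefined titles, which happens when docs and the index drift apart.
function parseHits(json: string, docs: Doc[]): Hit[] {
  const raw: [number, number][] = JSON.parse(json);
  return raw
    .filter(([id]) => id >= 0 && id < docs.length)
    .map(([id, distance]) => ({ id, distance, ...docs[id] }));
}

const sampleDocs = [{ title: 'Dependency injection' }, { title: 'Build optimizer' }];
const hits = parseHits('[[1, 0.12], [0, 0.85], [9, 0.99]]', sampleDocs);
console.log(hits.map((h) => h.title)); // [ 'Build optimizer', 'Dependency injection' ]
```

The out-of-range id 9 is silently dropped here; in a real app you may prefer to log it, since it usually signals that the serialized index and the metadata file were deployed from different builds.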
That pattern is stable across browser frameworks because altor-vec is model-agnostic. The framework concerns are mostly lifecycle-related: where to hold the engine instance, how to debounce query creation, and whether embeddings run on the main thread or in a worker. If you keep those concerns separate, semantic retrieval feels surprisingly ordinary.
Performance notes specific to this framework
- The Angular service boundary is useful because it prevents repeated engine creation as routes and components mount.
- Use RxJS debounceTime before any embedding call. altor-vec search is fast enough that over-eager query streams usually waste model cycles, not HNSW time.
- Keep Zone.js change detection away from high-frequency worker messages if you process live semantic suggestions.
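The debouncing point deserves a concrete shape, because the expensive call is the embedding, not the ANN lookup. If your query source is already an Angular form, the idiomatic version is `valueChanges.pipe(debounceTime(150), switchMap(...))`, which also cancels stale requests. Outside an RxJS stream, a plain debounce gives the same effect; below is a minimal self-contained sketch, where `embedAndSearch` and the 150ms window are illustrative assumptions.

```typescript
// Debounce a callback: only the last call within `waitMs` fires, so rapid
// keystrokes produce one embedding request instead of one per character.
function debounce<A extends unknown[]>(
  fn: (...args: A) => void,
  waitMs: number,
): (...args: A) => void {
  let timer: ReturnType<typeof setTimeout> | null = null;
  return (...args: A) => {
    if (timer !== null) clearTimeout(timer); // cancel the superseded call
    timer = setTimeout(() => fn(...args), waitMs);
  };
}

// Usage sketch: embedAndSearch is a hypothetical wrapper around your
// embedding model followed by the local vector search.
const queries: string[] = [];
const embedAndSearch = (q: string) => queries.push(q);
const debounced = debounce(embedAndSearch, 150);

debounced('an');
debounced('ang');
debounced('angular'); // only this call survives the 150ms window
```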
For reference, a single search completes in about 0.6ms, index load is about 19ms, the raw WASM binary is 117KB, and the gzipped WASM payload is 54KB. In real apps, embedding generation and rendering usually cost much more than the vector lookup itself.

When to use client-side vs server-side in Angular
Client-side: Browser-only Angular works well for internal tools, admin panels, and static docs where the bundle already ships app logic.
Server-side: Introduce server search when enterprise requirements demand authorization, centralized indexing pipelines, or auditability beyond the Angular frontend.
A good rule is simple. If the data is already safe to send to every browser and you mostly care about fast semantic ranking, keep it local. If the search layer also needs to enforce business rules, security boundaries, or complex shared state, put retrieval on the server and let Angular call it as a normal endpoint.
Production checklist
- Cache the WASM and serialized index aggressively with versioned asset names.
- Validate vector dimensions before every search to prevent subtle runtime errors.
- Keep metadata outside the HNSW graph so result rendering stays flexible.
- Measure cold start, repeated search latency, and memory on at least one mid-range mobile device.
- Free the engine explicitly if you unload large indexes on navigation.
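The dimension-validation item in the checklist is cheap to make concrete. A small guard in front of every search call catches mismatches, such as a model swap changing the embedding size, before they surface as confusing distances. This is a sketch, not part of the altor-vec API; `assertDims` is a hypothetical helper and `dims` is whatever value you passed to from_vectors() at build time.

```typescript
// Guard: reject queries whose length does not match the index dimension.
// A silent mismatch otherwise shows up as garbage rankings, which is much
// harder to debug than an error that names the real cause.
function assertDims(query: ArrayLike<number>, dims: number): Float32Array {
  if (query.length !== dims) {
    throw new Error(`query has ${query.length} dims, index expects ${dims}`);
  }
  const out = new Float32Array(query);
  for (let i = 0; i < out.length; i++) {
    // NaN or Infinity in a query distorts every distance comparison.
    if (!Number.isFinite(out[i])) {
      throw new Error(`query component ${i} is not finite`);
    }
  }
  return out;
}

console.log(assertDims([0.94, 0.06, 0, 0], 4).length); // 4
```

Calling this inside the service's search() method keeps the check in one place, so components never hand raw arrays to the engine directly.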
Conclusion
Angular does not require a special semantic-search abstraction. It only needs a clean place to initialize the engine and a disciplined boundary between embedding, retrieval, and UI state. altor-vec gives you a small browser-native ANN core, while the framework handles rendering and ergonomics. If you want a developer-friendly starting point with no backend dependency, this is the shortest path: npm install altor-vec, build or load an index, and search locally.