The Idea
You have a TypeScript agent. Maybe it uses the OpenAI SDK, LangChain.js, or just raw fetch calls. You want it to be a real microservice, with identity, authentication, payments, and a standard protocol. But you don’t want to rewrite infrastructure.
Installation
What Happens When You Call bindufy()
When a message arrives via A2A HTTP, the core calls your handler over gRPC. You process it, return a string, and the core sends it back to the client with a DID signature.
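The handler contract described above can be sketched as follows. The Message shape and the bindufy() options are assumptions (only the name bindufy() comes from the SDK itself), so the types here are local stubs for illustration:

```typescript
// Assumed shapes — the real SDK exports its own types for these.
type Message = { role: "user" | "assistant"; content: string };
type Handler = (messages: Message[]) => Promise<string>;

// The handler receives the conversation and returns the reply that the
// core will sign with the agent's DID and send back over A2A.
const handler: Handler = async (messages) => {
  const last = messages[messages.length - 1];
  return `Echo: ${last.content}`;
};

// In a real project you would register it with the SDK, e.g.:
// import { bindufy } from "<sdk-package>"; // package name not shown here
// await bindufy({ handler });
```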
When you press Ctrl+C, the SDK kills the Python core and exits.
Handler Patterns
Simple response
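A minimal handler just answers the latest message. The Message type and handler signature are assumptions standing in for the SDK's exported types:

```typescript
// Assumed message shape for illustration.
type Message = { role: "user" | "assistant"; content: string };

// Ignore history, answer the most recent message.
async function handler(messages: Message[]): Promise<string> {
  const question = messages[messages.length - 1].content;
  return `You asked: "${question}"`;
}
```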
OpenAI SDK
LangChain.js
Multi-turn conversation
Sometimes your agent needs more information before it can answer. Return an input-required state transition; the user sends a follow-up, and the core calls your handler again with the full conversation history.
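A sketch of that flow. The exact return shape for a state transition is an assumption here — check the SDK's exported types for the real one:

```typescript
// Assumed shapes for illustration.
type Message = { role: "user" | "assistant"; content: string };
type StateTransition = { state: "input-required"; message: string };

async function handler(
  messages: Message[]
): Promise<string | StateTransition> {
  const text = messages.map((m) => m.content).join("\n");
  const city = text.match(/city:\s*(\w+)/i);
  // If the user never named a city, ask for one instead of answering.
  if (!city) {
    return {
      state: "input-required",
      message: "Which city? Reply with city: <name>",
    };
  }
  // On the next call the history contains the follow-up, so this branch runs.
  return `Looking up weather for ${city[1]}...`;
}
```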
Error handling
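For example, a handler that throws on bad input (the signature is an assumption; the point is that you let the error propagate rather than swallowing it):

```typescript
// Assumed message shape for illustration.
type Message = { role: "user" | "assistant"; content: string };

async function handler(messages: Message[]): Promise<string> {
  const last = messages[messages.length - 1];
  if (!last || last.content.trim() === "") {
    // No need to catch this yourself — the SDK turns it into a gRPC error.
    throw new Error("empty message");
  }
  return `ok: ${last.content}`;
}
```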
If your handler throws, the SDK catches it and returns a gRPC error. ManifestWorker marks the task as failed. The user gets an error response.
Configuration
Skills
Define what your agent can do. There are two options:
File-based (recommended)
Create skills/my-skill/skill.yaml or skills/my-skill/SKILL.md:
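A minimal sketch of what such a file might contain. The field names here are illustrative assumptions — consult the SDK documentation for the actual skill.yaml schema:

```yaml
# Illustrative only; field names are assumptions, not the confirmed schema.
id: weather-lookup
name: Weather Lookup
description: Answers questions about current weather in a given city.
tags:
  - weather
examples:
  - "What's the weather in Paris?"
```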
Inline
Define skills directly in code:
Types
The SDK exports the types your handler needs.
Debugging
Check core logs
The Python core’s output is prefixed with [bindu-core] in your terminal.
Test the agent manually
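Since the agent speaks A2A over HTTP, you can poke it with a plain JSON-RPC request. The port, endpoint path, and payload shape below are assumptions modeled on the A2A message/send method — adjust them to match what your agent prints on startup:

```typescript
// Hypothetical smoke test. Port 8030 and the payload details are assumptions.
const payload = {
  jsonrpc: "2.0",
  id: 1,
  method: "message/send",
  params: {
    message: {
      role: "user",
      parts: [{ kind: "text", text: "Hello, agent!" }],
    },
  },
};

async function smokeTest(baseUrl = "http://localhost:8030") {
  const res = await fetch(baseUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  console.log(await res.json());
}
// smokeTest(); // uncomment with the agent running
```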
Port conflicts
Limitations
See full limitations for details.
Examples
OpenAI Agent
Direct OpenAI SDK usage
LangChain Agent
LangChain.js with ChatOpenAI