Pre-submission Checklist
- [x] I have searched existing issues to ensure this bug hasn't been reported already
- [x] I have provided a clear and detailed description of the bug
Bug Description
Issue Summary
The /api/v1/search endpoint hangs and times out after data upload and cognify processing complete. The server is operational, auth works, and the datasets API works, but search queries never return results.
Steps to Reproduce
Deploy Cognee via Docker with vLLM + Ollama config (see Environment below)
Authenticate via /api/v1/auth/login (works ✅)
Create dataset via /api/v1/datasets (works ✅)
Upload 175 text chunks via /api/v1/add using multipart form-data (works ✅)
Run cognify via /api/v1/cognify (appears to process successfully ✅)
Query /api/v1/search:

```bash
curl -X POST http://localhost:8002/api/v1/search \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{"query": "test"}'
```
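For scripted debugging, the same call can be built in Python with a client-side timeout, so the silent hang turns into a visible error. This is a sketch: the base URL matches the report, and the token is a placeholder.

```python
import json
import urllib.request

BASE = "http://localhost:8002"  # local deployment from the report

def build_search_request(query: str, token: str) -> urllib.request.Request:
    # Mirrors the curl command above: POST /api/v1/search with a JSON body.
    return urllib.request.Request(
        BASE + "/api/v1/search",
        data=json.dumps({"query": query}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To send it (needs a running server; the timeout makes a hang fail
# after 10 seconds instead of blocking for 30+):
# with urllib.request.urlopen(build_search_request("test", "<token>"), timeout=10) as r:
#     print(r.read().decode())
```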
Expected Behavior
Search endpoint should return results with scores and matched text.
Actual Behavior
With LLM_PROVIDER=openai config:
Search hangs for 30+ seconds
Logs show the error:

```
[error] Error during brute force search for query: ['test']. Error: Constructor parameter should be str
```

Request times out, returns empty response
With LLM_PROVIDER=custom config (per docs): search still hangs and never returns results.
Environment
Version: 0.5.5-local (Docker image cognee/cognee:latest)
Configuration
Working Config (current)

```env
# LLM Config
LLM_PROVIDER=custom
LLM_MODEL=hosted_vllm/Qwen3-8B
LLM_ENDPOINT=http://host.docker.internal:8000/v1
LLM_API_KEY=.
LLM_INSTRUCTOR_MODE=json_schema_mode

# Embedding Config
EMBEDDING_PROVIDER=ollama
EMBEDDING_MODEL=nomic-embed-text
OLLAMA_BASE_URL=http://host.docker.internal:11434

# Tokenizer
HUGGINGFACE_TOKENIZER=bert-base-uncased

# Vector DB
VECTOR_DB_PROVIDER=lancedb

# Graph DB
GRAPH_DB_PROVIDER=networkx

# Flags
SKIP_MIGRATIONS=true
COGNEE_SKIP_CONNECTION_TEST=true
ENABLE_BACKEND_ACCESS_CONTROL=false
```
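Since brute force search has to embed the query first, one quick check is whether the configured Ollama embedding endpoint answers at all. A sketch, assuming Ollama's public /api/embeddings route and field names; the URL below is the host-side equivalent of host.docker.internal in the config above.

```python
import json
import urllib.request

# host.docker.internal:11434 inside the container == localhost:11434 on the host
OLLAMA_BASE = "http://localhost:11434"

def build_embed_request(prompt: str, model: str = "nomic-embed-text") -> urllib.request.Request:
    # Ollama's embeddings endpoint takes a model name and a prompt.
    return urllib.request.Request(
        OLLAMA_BASE + "/api/embeddings",
        data=json.dumps({"model": model, "prompt": prompt}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Against a live Ollama, a healthy endpoint returns a float vector:
# with urllib.request.urlopen(build_embed_request("test"), timeout=10) as r:
#     print(len(json.loads(r.read())["embedding"]))
```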
Also Tested (earlier attempt)

```env
LLM_PROVIDER=openai
LLM_MODEL=openai/Qwen/Qwen3-8B
LLM_ENDPOINT=http://host.docker.internal:8000/v1
LLM_API_KEY=EMPTY
```
Logs/Error Messages
Additional Context
What We Tried
Changed LLM_PROVIDER from openai → custom with the hosted_vllm/ prefix (per Cognee docs)
Set LLM_INSTRUCTOR_MODE=json_schema_mode (per GitHub issue "[Docs]: Setting up custom LLMs" #1500)
Verified the /api/v1/datasets endpoint (works)
Nothing fixed the search endpoint.
Suspected Root Cause
The error "Constructor parameter should be str" suggests Cognee's search implementation is passing a list (['query']) to something expecting a string ('query'). This appears to be a bug in the search layer, not our configuration.
Request
Can you help us debug why the search endpoint silently hangs after successful data upload and cognify? Are there additional debug flags or logs we can enable to see what's failing?
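To make the suspected mismatch concrete, here is a minimal illustration. This is not Cognee's actual code; the function names are hypothetical, and only the error message mirrors the logs.

```python
def embed_query(query):
    # Hypothetical downstream constructor that only accepts a plain string,
    # mirroring the "Constructor parameter should be str" log message.
    if not isinstance(query, str):
        raise TypeError("Constructor parameter should be str")
    return f"embedded:{query}"

def search_buggy(query_text):
    # Suspected pattern: the search layer wraps the query in a list before
    # handing it to the str-only consumer -> ['test'] instead of 'test'.
    return embed_query([query_text])

def search_fixed(query_text):
    # Passing the bare string through avoids the error.
    return embed_query(query_text)
```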