Jina
This page covers how to use the Jina ecosystem within LangChain. It is broken into two parts: installation and setup, and then references to specific Jina wrappers.
Installation and Setup
- Install the Python SDK with pip install jina
- Get a Jina AI Cloud auth token from here and set it as an environment variable (JINA_AUTH_TOKEN)
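If you prefer to set the token from Python rather than your shell, a minimal sketch looks like this (the token value is a placeholder for the one you copied from Jina AI Cloud):

import os

# Placeholder token; replace with the value from your Jina AI Cloud account.
os.environ["JINA_AUTH_TOKEN"] = "<your-jina-auth-token>"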
Wrappers
Embeddings
There exists a Jina Embeddings wrapper, which you can access with
from langchain.embeddings import JinaEmbeddings
For a more detailed walkthrough of this, see this notebook
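As a quick illustration, the sketch below embeds a query and a couple of documents. The model_name value and the option to construct the wrapper without an explicit token are assumptions about the wrapper's defaults; otherwise the token is read from the JINA_AUTH_TOKEN environment variable.

from langchain.embeddings import JinaEmbeddings

# Assumes JINA_AUTH_TOKEN is set in the environment; the model name is an
# assumption and can be swapped for whichever model you use.
embeddings = JinaEmbeddings(model_name="ViT-B-32::openai")

# Embed a single query string.
query_vector = embeddings.embed_query("What is Jina?")

# Embed a list of documents.
doc_vectors = embeddings.embed_documents([
    "Jina builds multimodal AI infrastructure.",
    "LangChain composes LLM applications.",
])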
Deployment
Langchain-serve, powered by Jina, helps take LangChain apps to production with easy-to-use REST/WebSocket APIs and Slack bots.
Usage
Install the package from PyPI.
pip install langchain-serve
Wrap your LangChain app with the @serving decorator. 
# app.py
from lcserve import serving
@serving
def ask(input: str) -> str:
    from langchain.chains import LLMChain
    from langchain.llms import OpenAI
    from langchain.agents import AgentExecutor, ZeroShotAgent
    
    tools = [...] # list of tools
    prompt = ZeroShotAgent.create_prompt(
        tools, input_variables=["input", "agent_scratchpad"],
    )
    llm_chain = LLMChain(llm=OpenAI(temperature=0), prompt=prompt)
    agent = ZeroShotAgent(
        llm_chain=llm_chain, allowed_tools=[tool.name for tool in tools]
    )
    agent_executor = AgentExecutor.from_agent_and_tools(
        agent=agent, 
        tools=tools, 
        verbose=True,
    )
    return agent_executor.run(input)
Deploy on Jina AI Cloud with lc-serve deploy jcloud app. Once deployed, you can send a POST request to the API endpoint to get a response.
curl -X 'POST' 'https://<your-app>.wolf.jina.ai/ask' \
 -H 'Content-Type: application/json' \
 -d '{
  "input": "Your Question here?",
  "envs": {
     "OPENAI_API_KEY": "sk-***"
  }
}'
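Equivalently, you can call the endpoint from Python. The sketch below uses the requests library and sends the same JSON body as the curl example; <your-app> and the OpenAI key are placeholders.

import requests

# Replace <your-app> with your deployed app's name and supply a real OpenAI key.
response = requests.post(
    "https://<your-app>.wolf.jina.ai/ask",
    json={
        "input": "Your Question here?",
        "envs": {"OPENAI_API_KEY": "sk-***"},
    },
)
print(response.json())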
You can also self-host the app on your infrastructure with Docker-compose or Kubernetes. See here for more details.
Langchain-serve also lets you deploy apps with WebSocket APIs and Slack bots, both on Jina AI Cloud and on self-hosted infrastructure.