⚡ Building applications with LLMs through composability ⚡
This is the Java language implementation of LangChain.
Large language models (LLMs) are emerging as a transformative technology, enabling developers to build applications that they previously could not. But using these LLMs in isolation is often not enough to create a truly powerful app - the real power comes when you can combine them with other sources of computation or knowledge.
This library is aimed at assisting in the development of those types of applications.
Looking for the Python version? Check out LangChain.
This tutorial walks you through building an end-to-end language model application with LangChain.
View the Quickstart Guide on the LangChain official website.
Prerequisites for building:
- Java 17 or later
- Unix-like environment (we use Linux, macOS)
- Maven (we recommend version 3.8.6 and require at least 3.5.4)
```xml
<dependency>
    <groupId>io.github.hamawhitegg</groupId>
    <artifactId>langchain-core</artifactId>
    <version>0.1.5</version>
</dependency>
```
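If you build with Gradle instead of Maven, the same coordinates can be declared as follows (this assumes the artifact resolves from the same repository as the Maven snippet above):

```groovy
implementation 'io.github.hamawhitegg:langchain-core:0.1.5'
```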
Using LangChain will usually require integrations with one or more model providers, data stores, APIs, etc. For this example, we will be using OpenAI's APIs.
We will then need to set the `OPENAI_API_KEY` environment variable:

```shell
export OPENAI_API_KEY=xxx

# If a proxy is needed, set the OPENAI_PROXY environment variable.
export OPENAI_PROXY=http://host:port
```
If you want to set the API key and proxy dynamically, you can use the openaiApiKey and openaiProxy parameters when initializing the OpenAI class.
```java
OpenAI llm = OpenAI.builder()
        .openaiApiKey("xxx")
        .openaiProxy("http://host:port")
        .build()
        .init();
```
The test code for the following examples can be found in QuickStart.java.
The most basic building block of LangChain is calling an LLM on some input. Let’s walk through a simple example of how to do this. For this purpose, let’s pretend we are building a service that generates a company name based on what the company makes.
```java
OpenAI llm = OpenAI.builder()
        .temperature(0.9f)
        .build()
        .init();

String text = "What would be a good company name for a company that makes colorful socks?";
System.out.println(llm.call(text));
```

```
Feetful of Fun
```
Calling an LLM is a great first step, but it’s just the beginning. Normally when you use an LLM in an application, you are not sending user input directly to the LLM. Instead, you are probably taking user input and constructing a prompt, and then sending that to the LLM.
```java
PromptTemplate prompt = new PromptTemplate(List.of("product"),
        "What is a good name for a company that makes {product}?");

System.out.println(prompt.format(Map.of("product", "colorful socks")));
```

```
What is a good name for a company that makes colorful socks?
```
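To make the template mechanics concrete, here is a minimal, self-contained sketch of the `{variable}` substitution a prompt template performs. This is plain Java written for illustration only; the library's actual PromptTemplate also validates its input variables:

```java
import java.util.Map;

public class PromptSketch {
    // Replace each {key} placeholder in the template with its value.
    // Illustrative sketch, not langchain-java's implementation.
    static String format(String template, Map<String, String> values) {
        String result = template;
        for (Map.Entry<String, String> e : values.entrySet()) {
            result = result.replace("{" + e.getKey() + "}", e.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        String template = "What is a good name for a company that makes {product}?";
        System.out.println(format(template, Map.of("product", "colorful socks")));
        // prints: What is a good name for a company that makes colorful socks?
    }
}
```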
Up until now, we’ve worked with the PromptTemplate and LLM primitives by themselves. But of course, a real application is not just one primitive, but rather a combination of them.
A chain in LangChain is made up of links, which can be either primitives like LLMs or other chains.
The most core type of chain is an LLMChain, which consists of a PromptTemplate and an LLM.
```java
OpenAI llm = OpenAI.builder()
        .temperature(0.9f)
        .build()
        .init();

PromptTemplate prompt = new PromptTemplate(List.of("product"),
        "What is a good name for a company that makes {product}?");

Chain chain = new LLMChain(llm, prompt);
System.out.println(chain.run("colorful socks"));
```

```
\n\nSocktastic!
```
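The composition behind an LLMChain can be sketched in a few lines of plain Java: format the prompt from the inputs, then pass the formatted prompt to the model. The stand-in `LLM` function below is an assumption for illustration, not a langchain-java type:

```java
import java.util.Map;
import java.util.function.Function;

public class ChainSketch {
    // Stand-in for a model call; a real chain would call OpenAI here.
    static final Function<String, String> LLM = prompt -> "Socktastic!";

    // An LLMChain is essentially: inputs -> formatted prompt -> model output.
    static String run(String template, Map<String, String> inputs) {
        String prompt = template;
        for (Map.Entry<String, String> e : inputs.entrySet()) {
            prompt = prompt.replace("{" + e.getKey() + "}", e.getValue());
        }
        return LLM.apply(prompt);
    }

    public static void main(String[] args) {
        String template = "What is a good name for a company that makes {product}?";
        System.out.println(run(template, Map.of("product", "colorful socks")));
        // prints: Socktastic!
    }
}
```

The key design point is that the chain owns the glue logic, so callers supply only the raw inputs rather than a finished prompt.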
This example demonstrates the use of the SQLDatabaseChain for answering questions over a database.
```java
SQLDatabase database = SQLDatabase.fromUri("jdbc:mysql://127.0.0.1:3306/demo", "xxx", "xxx");

BaseLanguageModel llm = OpenAI.builder()
        .temperature(0)
        .build()
        .init();

Chain chain = SQLDatabaseChain.fromLLM(llm, database);
System.out.println(chain.run("How many students are there?"));
```

```
There are 6 students.
```
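Conceptually, this kind of chain runs in three steps: ask the LLM to translate the question into SQL, execute that SQL against the database, then ask the LLM to phrase the raw result as an answer. A stubbed sketch of that flow is below; the stub methods are illustrative stand-ins for real LLM and JDBC calls, not SQLDatabaseChain internals:

```java
public class SqlChainSketch {
    // Step 1: question -> SQL (a real chain would prompt the LLM for this).
    static String generateSql(String question) {
        return "SELECT COUNT(*) FROM student";
    }

    // Step 2: execute the SQL (a real chain would run it over JDBC).
    static String executeSql(String sql) {
        return "6";
    }

    // Step 3: raw result -> natural-language answer (again via the LLM).
    static String answer(String question, String result) {
        return "There are " + result + " students.";
    }

    static String run(String question) {
        return answer(question, executeSql(generateSql(question)));
    }

    public static void main(String[] args) {
        System.out.println(run("How many students are there?"));
        // prints: There are 6 students.
    }
}
```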
```shell
git clone https://github.com/HamaWhiteGG/langchain-java.git
cd langchain-java

# export JAVA_HOME=JDK17_INSTALL_HOME && mvn clean test
mvn clean test
```
```shell
cd langchain-java

# export JAVA_HOME=JDK17_INSTALL_HOME && mvn spotless:apply
mvn spotless:apply
```
Don’t hesitate to ask!
Open an issue if you find a bug in langchain-java.
This is an active open-source project. We are always open to people who want to use the system or contribute to it.
Contact me if you are looking for implementation tasks that fit your skills.