GLM4
Langchain
LangChain is an application development framework powered by large language models (LLMs).
Dependencies
The ChatZhipuAI integration streams responses over SSE and authenticates against the Zhipu AI API with a JWT token, hence the httpx, httpx_sse, and PyJWT packages in addition to langchain itself:
pip install langchain langchain_community httpx httpx_sse PyJWT
Demo
import os
from langchain_community.chat_models import ChatZhipuAI
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
from langchain_core.callbacks.manager import CallbackManager
from langchain_core.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

# Fill in your Zhipu AI API key (or export ZHIPUAI_API_KEY before running).
os.environ["ZHIPUAI_API_KEY"] = ""

# GLM-4 chat model; with streaming enabled, the callback handler prints tokens to stdout as they arrive.
chat = ChatZhipuAI(
    model="glm-4",
    temperature=0.5,
    streaming=True,
    callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),
)

message = [
    AIMessage(content="Hi"),
    SystemMessage(content="You are an automated operations engineer"),
    HumanMessage(content="give me a shell script for deploying a java package"),
]

def main():
    response = chat.invoke(message)
    print(response.content)

if __name__ == "__main__":
    main()
Response
#!/bin/bash

# Define source and target directories
SOURCE_DIR="/path/to/source"
TARGET_DIR="/path/to/deployment"
JAR_FILE="MyApp.jar"
JAR_PATH="$SOURCE_DIR/$JAR_FILE"

# Define the Java application's PID file and log file
PID_FILE="/path/to/deployment/MyApp.pid"
LOG_FILE="/path/to/deployment/MyApp.log"

# Stop the currently running instance of the application, if any
if [[ -f $PID_FILE ]]; then
    PID=$(cat $PID_FILE)
    kill $PID
    rm $PID_FILE
    echo "Previous instance of the application stopped."
fi

# Copy the new JAR file to the deployment directory
cp "$JAR_PATH" "$TARGET_DIR"
if [[ $? -eq 0 ]]; then
    echo "JAR file successfully copied to deployment directory."
else
    echo "Error copying JAR file. Deployment failed."
    exit 1
fi

# Start the Java application
java -jar "$TARGET_DIR/$JAR_FILE" > "$LOG_FILE" 2>&1 &
echo $! > $PID_FILE
echo "Application started with PID $(cat $PID_FILE)."
Make sure to give execute permissions to the script:
chmod +x deploy.sh
To run the script:
./deploy.sh
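Since the demo sets streaming=True, the reply can also be consumed chunk by chunk without the stdout callback handler, via LangChain's standard .stream() interface. A minimal sketch, reusing the chat model and message list defined in the Demo above:

# Assumes the `chat` model and `message` list from the Demo section.
for chunk in chat.stream(message):
    # Each chunk is a message chunk; printing its content as it arrives gives token-by-token output.
    print(chunk.content, end="", flush=True)
print()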
From: https://www.cnblogs.com/luyifo/p/18222092