Deploy Opencode Server to Serve Multiple Clients
Previously I mentioned that opencode can be split into server and client components. That means the opencode server can be deployed independently for other opencode clients to use. For example, a machine inside a company can run local models or call model APIs:
```sh
opencode serve --hostname 0.0.0.0 --port 4096 --mdns --mdns-domain opencode.local
# Ollama has deep integration with opencode, so setting up models is easy
# https://docs.ollama.com/integrations/opencode
```

Any client on the intranet can then connect to http://opencode.local:4096, whether it is a browser, a TUI, or an app: https://opencode.ai/zht/download

```sh
# In another terminal, attach the TUI to the running backend
opencode attach http://opencode.local:4096
```

Right now the server hosts and serves its own models; per-client model management isn't available yet. For that, there is an existing service called Zen you can check out, or LiteLLM.
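If the shared machine runs Linux, one way to keep the server alive across reboots is a systemd unit. This is only a minimal sketch: the binary path, the `opencode` user, and the working directory are assumptions about your setup, while the flags are the same ones used above.

```ini
# /etc/systemd/system/opencode-server.service
[Unit]
Description=opencode server for intranet clients
After=network-online.target
Wants=network-online.target

[Service]
# Assumed install path and a dedicated unprivileged user
ExecStart=/usr/local/bin/opencode serve --hostname 0.0.0.0 --port 4096 --mdns --mdns-domain opencode.local
User=opencode
WorkingDirectory=/home/opencode
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with `systemctl enable --now opencode-server`, and check `journalctl -u opencode-server` if clients can't reach it.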