StubbyBackend LanguageModelService-7
| Field | Value |
| --- | --- |
| Status | OK |
| Proto Service Name | cloud.ai.nl.llm.proto.service.LanguageModelService |
| Stub Service Name | LanguageModelService |
| Tools | requestz |
Configuration
Configured with the following CanonicalStubConfiguration proto (bypassing Cornea):
server_name: "LanguageModelService-7"
enable_client_side_throttling: false
smart_service {
  target: "cloud.ai.nl.llm.proto.service.languagemodelservice-staging-us-central1"
  client_channel_options {
    start_reachable: false
    backend_subset: false
    subset_size: 0
    channel_type: BNS_CHANNEL
  }
  wait_sec: 0
}
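For readers unfamiliar with the message shape, a minimal schema sketch can be inferred from the fields above. The actual CanonicalStubConfiguration definition is internal, so the field numbers, enum layout, and nesting below are assumptions reconstructed solely from the textproto dump, not the canonical schema.

```proto
// Illustrative sketch only: reconstructed from the textproto dump above.
// The real CanonicalStubConfiguration is defined elsewhere; field numbers,
// enum values, and nesting here are assumptions, not the canonical schema.
syntax = "proto2";

message CanonicalStubConfiguration {
  optional string server_name = 1;                  // e.g. "LanguageModelService-7"
  optional bool enable_client_side_throttling = 2;  // false in the dump above

  message SmartService {
    optional string target = 1;                     // service target, e.g. the staging us-central1 backend

    message ClientChannelOptions {
      enum ChannelType {
        BNS_CHANNEL = 0;                            // assumed enum layout
      }
      optional bool start_reachable = 1;
      optional bool backend_subset = 2;
      optional int32 subset_size = 3;
      optional ChannelType channel_type = 4;
    }
    optional ClientChannelOptions client_channel_options = 2;
    optional int32 wait_sec = 3;
  }
  optional SmartService smart_service = 3;
}
```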
See /stubz for Extensible Stubs run-time information.