Thursday, July 28, 2022

How to customize TensorFlow Serving


TensorFlow Serving supports many additional features that you can leverage. Wei Wei, Developer Advocate at Google, discusses how you can customize TensorFlow Serving for your needs. He covers its integration with monitoring tools, support for custom ops, how to configure the TF Serving model server, and more. Learn how TF Serving supports basic A/B tests and seamlessly integrates with Docker and Kubernetes to scale with demand.

Resources:
TensorFlow Serving configuration → https://goo.gle/3QxxvIF
Prometheus → https://goo.gle/3N8XCmi
Monitoring configuration → https://goo.gle/3N8XCmi
Use TensorFlow Serving with Kubernetes → https://goo.gle/3zYvX4B
Serving TensorFlow models with custom ops → https://goo.gle/3y7JvsY
TensorFlow Serving model server flags → https://goo.gle/3tQ99Qq
TensorFlow Serving metrics documentation → https://goo.gle/3NeAjY6
Docker Compose documentation → https://goo.gle/3xRuDxw
Subscribe to TensorFlow → https://goo.gle/TensorFlow
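As a rough illustration of the model server configuration discussed in the video (the model name, base path, and version numbers below are placeholders, not taken from the post), a models.config file can pin specific versions of a model and assign them string labels, which is the mechanism behind the basic A/B-style routing mentioned above:

    # models.config (protobuf text format) -- hypothetical model name and paths
    model_config_list {
      config {
        name: "my_model"
        base_path: "/models/my_model"
        model_platform: "tensorflow"
        # Keep two specific versions loaded instead of only the latest one.
        model_version_policy {
          specific {
            versions: 1
            versions: 2
          }
        }
        # Labels let clients request a version by name, e.g. to compare
        # "stable" and "canary" traffic in a simple A/B setup.
        version_labels { key: "stable" value: 1 }
        version_labels { key: "canary" value: 2 }
      }
    }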

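For the Prometheus integration and the model server flags, a minimal sketch might look like the following (file locations and port mappings are assumptions for illustration): a monitoring config enables a metrics endpoint that Prometheus can scrape, and the --model_config_file and --monitoring_config_file flags point the server, here run via the official Docker image, at both files.

    # monitoring_config.txt -- expose a metrics endpoint for Prometheus to scrape
    prometheus_config {
      enable: true
      path: "/monitoring/prometheus/metrics"
    }

    # Run the model server with both config files mounted into the container.
    # Source paths are placeholders for wherever the files live on the host.
    docker run -p 8500:8500 -p 8501:8501 \
      --mount type=bind,source=/path/to/my_model,target=/models/my_model \
      --mount type=bind,source=/path/to/models.config,target=/models/models.config \
      --mount type=bind,source=/path/to/monitoring_config.txt,target=/models/monitoring_config.txt \
      -t tensorflow/serving \
      --model_config_file=/models/models.config \
      --monitoring_config_file=/models/monitoring_config.txt

See the TensorFlow Serving configuration and model server flags links above for the full set of options.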